Providing a pydantic model instead of docstring for tool parameters. #646
Comments
All parameters must be required for strict mode, but they can be nullable.
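For example, something along these lines (untested sketch; `search_orders` is just an illustrative name) keeps the parameter required in the strict schema while typing it as nullable, so the model can pass null when there is no value:

```python
from typing import Optional

from agents import function_tool


@function_tool  # strict JSON schema is the default
def search_orders(customer_id: str, status: Optional[str]) -> str:
    """Look up orders for a customer.

    Args:
        customer_id: The customer to search for.
        status: Status filter; the model may pass null to skip filtering.
    """
    if status is None:
        return f"all orders for {customer_id}"
    return f"orders for {customer_id} with status {status}"
```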
Use agent context for this. Context is available to tools but never leaves the process.
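Rough sketch of that pattern (untested; `AppContext` and `call_backend` are made-up names): keep the token on the run context and read it inside the tool, so it never appears in the prompt or the tool schema:

```python
from dataclasses import dataclass

from agents import Agent, RunContextWrapper, Runner, function_tool


@dataclass
class AppContext:
    auth_token: str  # stays in-process, never sent to the model


@function_tool
def call_backend(ctx: RunContextWrapper[AppContext], endpoint: str) -> str:
    """Call a backend endpoint.

    Args:
        endpoint: The API path to call.
    """
    token = ctx.context.auth_token  # read the token from local context
    # ... attach `token` to the request headers and call the API here ...
    return f"called {endpoint}"


agent = Agent[AppContext](name="Backend agent", tools=[call_backend])
# result = Runner.run_sync(agent, "Fetch my profile", context=AppContext(auth_token="..."))
```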
Hey @pakrym-oai, thanks for the reply. I have another doubt.
Suppose here, when I get the chat history through [code omitted in the original post]. P.S.: I also want to use [code omitted].
But I get this error: [error message omitted in the original post]
Please read this first
Question
@rm-openai What if I want some parameters to be required and others to be optional? I can easily do this via Pydantic, where any field that I have not wrapped in Optional will be required. Is there a way to replicate that here? As far as I understand, if I set strict_mode = False there is a chance, however low, that the LLM might not pass anything, and I want to avoid that. For illustration, in plain Pydantic I would express it like this (just a made-up example model), as shown below.
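```python
from typing import Optional

from pydantic import BaseModel


class OrderQuery(BaseModel):
    customer_id: str              # no default -> required
    status: Optional[str] = None  # wrapped in Optional with a default -> optional
```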
Secondly, suppose my tool calls an API that is linked to my backend, and I need to pass an auth token to that API. I currently do this by adding the token as a parameter on each tool and asking the agent to provide it through its system prompt. Is there a better way to do it?
Edit: I just checked, and there is something called function_schema that you use to extract the parameters and descriptions from the docstring and then convert them into a Pydantic model. Is there a way to directly provide a Pydantic model to this?
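Something along these lines is what I'm hoping for (rough, untested sketch based on my reading of the docs; `OrderQuery` and `run_order_query` are placeholder names):

```python
from typing import Any, Optional

from agents import FunctionTool, RunContextWrapper
from pydantic import BaseModel


class OrderQuery(BaseModel):
    customer_id: str
    status: Optional[str]  # required but nullable, to stay strict-mode friendly


async def run_order_query(ctx: RunContextWrapper[Any], args: str) -> str:
    parsed = OrderQuery.model_validate_json(args)  # validate the model's JSON args
    return f"orders for {parsed.customer_id} (status={parsed.status})"


order_tool = FunctionTool(
    name="search_orders",
    description="Look up orders for a customer.",
    params_json_schema=OrderQuery.model_json_schema(),
    on_invoke_tool=run_order_query,
)
```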