Replies: 10 comments 3 replies
-
Btw, the request to Ollama and its response look like this:
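For readers following along, Ollama's `/api/chat` tool-calling format looks roughly like this, sketched with the weather tool used later in this thread (illustrative values, not the exact payloads that were posted here).

Request:

```json
{
  "model": "llama3.2:1b",
  "messages": [
    { "role": "user", "content": "Could you tell me the weather in NY city please?" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
          "type": "object",
          "properties": {
            "location": { "type": "string", "description": "The city and state, e.g. San Francisco, CA" }
          },
          "required": ["location"]
        }
      }
    }
  ]
}
```

Response (the relevant part is `message.tool_calls`):

```json
{
  "model": "llama3.2:1b",
  "message": {
    "role": "assistant",
    "content": "",
    "tool_calls": [
      { "function": { "name": "get_current_weather", "arguments": { "location": "New York, NY" } } }
    ]
  },
  "done": true
}
```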
-
@eloycoto Thanks for posting this thread. It's a great question; it would mean taking an existing PDL function and extracting its signature to send to the model. We can work on this feature. To help us with it, do you mind sending a version of the first code snippet above that works for you? I am getting a weird error from the model on that. Thanks!
-
Got an error too. Maybe it's because it should translate `functions` into the `tools` dictionary, as said in the doc: https://docs.litellm.ai/docs/completion/function_call. The error says `functions` is not a supported param:
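For reference, the linked LiteLLM page describes the newer `tools` parameter, where each (deprecated) `functions` entry is wrapped in a `"type": "function"` object. A sketch of a single entry:

```json
{
  "type": "function",
  "function": {
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
      "type": "object",
      "properties": {
        "location": { "type": "string", "description": "The city and state, e.g. San Francisco, CA" }
      },
      "required": ["location"]
    }
  }
}
```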
-
Made a quick check. If I do this:
The request and response from the API look good, but tool_usage is not used anywhere, or at least I don't think so!
-
@eloycoto you could directly pass the JSON object in the
-
@eloycoto I see what you mean in the LiteLLM code; it indeed looks like it's dropping
-
Checked with the `tools` parameter instead of the deprecated `functions` one, and I got this: https://github.com/IBM/prompt-declaration-language/blob/main/src/pdl/pdl_ast.py#L206

PDL:

```yaml
description: "Get current stock"
text:
- model: openai/llama3.2:1b
  parameters:
    temperature: 0.3
    tools:
    - { "name": "get_current_weather", "description": "Get the current weather in a given location", "parameters": { "type": "object", "properties": { "location": { "type": "string", "description": "The city and state, e.g. San Francisco, CA" }, "unit": { "type": "string", "enum": ["celsius", "fahrenheit"] } }, "required": ["location"] } }
  input:
    array:
    - role: user
      content: |
        Could you tell me the weather in NY city please?
```

HTTP request & response
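For context, an OpenAI-style completion message with tool calls, as it comes back through LiteLLM, looks roughly like this (sketch; the `id` is illustrative, and `arguments` comes back as a JSON-encoded string):

```json
{
  "role": "assistant",
  "content": null,
  "tool_calls": [
    {
      "id": "call_abc123",
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "arguments": "{\"location\": \"New York, NY\"}"
      }
    }
  ]
}
```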
Looks like the problem is that when the LLM response has tool_calls, PDL does nothing with it. For me, tackling this problem would mean:
-
@eloycoto This has been fixed now in the main branch (#178). Here's an example. The
We are working on making this easy to consume on the PDL side (i.e. making it easy to do the tool call here and then call the model again with the results).
-
@eloycoto We are working on it.
-
@eloycoto PR #184 (just merged) contains a fix to allow richer messages. So the following example now works for obtaining tool calls from the LLM, calling them locally, and returning the responses back to the LLM:
The ContributeValue in the function call allows the user to contribute a rich message containing the function response to the background context. In the future, we can make this more seamless with abstractions to help with the following:
-
There is a way to define functions like this as a model parameter:
Is there any way to define these functions as PDL functions? Like, I have:
Is there any way to say:
This would allow reusing and composing scripts with the same tools/functions available across different PDL scripts. Something like the sketch below is roughly what I have in mind.
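To make the ask concrete, here is a rough sketch, not working PDL: the function definition follows PDL's `function`/`return` style, while referencing it under the model's `tools` (so that its signature would be extracted automatically) is exactly the hypothetical feature being requested; names and the function body are placeholders.

```yaml
description: Reusing a PDL function as a model tool (sketch)
defs:
  get_current_weather:
    # A plain PDL function; the parameter list and body are placeholders.
    function:
      location: str
    return: "It is sunny in ${ location }"
text:
- model: openai/llama3.2:1b
  parameters:
    temperature: 0.3
    # Hypothetical: let PDL derive the tool spec from the function signature.
    tools:
    - ${ get_current_weather }
  input:
    array:
    - role: user
      content: Could you tell me the weather in NY city please?
```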