@@ -90,6 +90,10 @@ that the LLM can decide to call based on the prompt. The function calling
 interface must be used with chat completions and the `gpt-4-0613` or
 `gpt-3.5-turbo-0613` models or later.
 
+> See <https://github.com/leafo/lua-openai/blob/main/examples/example5.lua> for
+> a full example that implements basic math functions to compute the standard
+> deviation of a list of numbers
+
 Here's a quick example of how to use functions in a chat exchange. First you
 will need to create a chat session with the `functions` option containing an
 array of available functions.
@@ -130,17 +134,19 @@ if type(res) == "table" and res.function_call then
   -- Note that res may also include a content field if the LLM produced a textual output as well
 
   local cjson = require "cjson"
+  local name = res.function_call.name
   local arguments = cjson.decode(res.function_call.arguments)
-  call_my_function(res.function_call.name, arguments)
+  -- ... compute the result and send it back ...
 end
 ```
 
-Finally, you can evaluate the function and send the result back to the client
-so it can resume operation:
+You can evaluate the requested function and arguments and send the result back
+to the client so it can resume operation with a `role=function` message object:
 
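The validate-then-reply step described above can be sketched as a small helper. This is a minimal sketch, not library code: `available_functions` and `add_numbers` are hypothetical names introduced here for illustration, and every stage (name lookup, JSON decoding, the call itself) is guarded since the LLM may hallucinate any part of the request:

```lua
local cjson = require "cjson"

-- hypothetical dispatch table mapping the function names advertised to the
-- LLM to real implementations (add_numbers is an illustrative stand-in)
local available_functions = {
  add_numbers = function(args)
    assert(type(args.a) == "number" and type(args.b) == "number",
      "add_numbers expects numeric a and b")
    return args.a + args.b
  end
}

-- validate a (possibly hallucinated) function_call and build the
-- role=function reply message; returns nil, err if any stage fails
local function run_function_call(function_call)
  if type(function_call) ~= "table" or type(function_call.name) ~= "string" then
    return nil, "malformed function_call"
  end

  local fn = available_functions[function_call.name]
  if not fn then
    return nil, "unknown function: " .. function_call.name
  end

  -- the arguments field is LLM-generated text and may not be valid JSON
  local ok, arguments = pcall(cjson.decode, function_call.arguments)
  if not ok or type(arguments) ~= "table" then
    return nil, "malformed JSON arguments"
  end

  local called, result = pcall(fn, arguments)
  if not called then
    return nil, "function failed: " .. tostring(result)
  end

  return {
    role = "function",
    name = function_call.name,
    content = cjson.encode(result)
  }
end
```

On success, the returned message table can be sent back on the chat session to resume the exchange (e.g. `chat:send(msg)`, assuming a `chat` session created earlier as shown above); the error branch can be logged or reported back to the model as your application requires.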
 > Since the LLM can hallucinate every part of the function call, you'll want to
 > do robust type validation to ensure that function name and arguments match
-> what you expect.
+> what you expect. Assume every stage can fail, including receiving malformed
+> JSON for the arguments.
 
 ```lua
 local name, arguments = ... -- the name and arguments extracted from above