Commit 9bed1a8

Add support for o1-preview and o1-mini models (#9)
Adding support for [the new o1-preview and o1-mini models](https://openai.com/index/introducing-openai-o1-preview/). Note that these two models no longer accept the parameter name we translate our `MaxNumTokens` option into (`max_tokens`); they require `max_completion_tokens` instead. Fortunately, the older models also accept the new name, so the translation can simply be switched.

```
================================================================================
Error occurred in topenAIChat/canUseModel(ModelName=o1-preview) and it did not run to completion.
    ---------
    Error ID:
    ---------
    'llms:apiReturnedError'
    --------------
    Error Details:
    --------------
    Error using openAIChat/generate (line 259)
    Server returned error indicating: "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead."

    Error in topenAIChat/canUseModel (line 100)
        testCase.verifyClass(generate(openAIChat(ModelName=ModelName),"hi",MaxNumTokens=1),"string");
================================================================================
```
1 parent bb7a186 commit 9bed1a8
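
As a quick, hedged illustration of the claim above (it requires a valid OpenAI API key configured for the toolbox, and the model names are taken from the list added in this commit), the same `MaxNumTokens` option now works against both the new o1 models and the older ones, because both accept the `max_completion_tokens` field it is translated into:

```matlab
% Sketch only: confirm MaxNumTokens is usable with both a new o1 model
% and an older model after this change. Needs an OpenAI API key.
for model = ["o1-mini" "gpt-3.5-turbo"]
    txt = generate(openAIChat(ModelName=model), "hi", MaxNumTokens=1);
    fprintf("%s -> %s\n", model, class(txt));   % expected class: string
end
```

The parameterized `canUseModel` test added below does the same check across every entry in `llms.openai.models`.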

File tree

4 files changed: +14 −3 lines changed

+llms/+internal/callOpenAIChatAPI.m (+3 −3)

```diff
@@ -135,11 +135,11 @@
     end
 
     if isempty(nvp.StopSequences)
-        parameters = rmfield(parameters,"stop");
+        parameters = rmfield(parameters,dict("StopSequences"));
     end
 
     if nvp.MaxNumTokens == Inf
-        parameters = rmfield(parameters,"max_tokens");
+        parameters = rmfield(parameters,dict("MaxNumTokens"));
     end
 
 end
@@ -150,7 +150,7 @@
 dict("TopP") = "top_p";
 dict("NumCompletions") = "n";
 dict("StopSequences") = "stop";
-dict("MaxNumTokens") = "max_tokens";
+dict("MaxNumTokens") = "max_completion_tokens";
 dict("PresencePenalty") = "presence_penalty";
 dict("FrequencyPenalty ") = "frequency_penalty";
 end
```
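
The dict is the single source of truth for the MATLAB-option-to-API-field translation, so keying `rmfield` off `dict("MaxNumTokens")` rather than the literal `"max_tokens"` means a rename like this one only touches the dict entry. A minimal standalone sketch of that pattern (the `dictionary` type and the small `parameters` struct here are stand-ins for illustration; the real function builds the request from many more options):

```matlab
% Sketch of the dict-keyed removal pattern used above (assumes MATLAB
% R2022b+ for dictionary; names mirror the diff, values are made up).
dict = dictionary();
dict("StopSequences") = "stop";
dict("MaxNumTokens")  = "max_completion_tokens";   % was "max_tokens"

parameters = struct();
parameters.(dict("StopSequences")) = "\n";
parameters.(dict("MaxNumTokens"))  = 1;

% Unset options are removed by their *mapped* names, so renaming the API
% field in the dict cannot leave rmfield pointing at a stale literal.
maxNumTokens = Inf;                                % stand-in for nvp.MaxNumTokens
if maxNumTokens == Inf
    parameters = rmfield(parameters, dict("MaxNumTokens"));
end
disp(fieldnames(parameters))                       % only 'stop' remains
```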

+llms/+openai/models.m (+2)

```diff
@@ -9,5 +9,7 @@
     "gpt-4","gpt-4-0613", ...
     "gpt-3.5-turbo","gpt-3.5-turbo-0125", ...
     "gpt-3.5-turbo-1106",...
+    "o1-preview",...
+    "o1-mini",...
     ];
 end
```
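
A quick way to confirm the new entries are picked up (illustrative; assumes the toolbox is on the MATLAB path):

```matlab
% The parameterized canUseModel test iterates over exactly this list,
% so the new models are exercised automatically once they appear here.
models = llms.openai.models;
assert(all(ismember(["o1-preview" "o1-mini"], models)))
```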

doc/OpenAI.md (+1)

```diff
@@ -6,6 +6,7 @@ To start using the OpenAI APIs, you first need to obtain OpenAI API keys. You ar
 
 Some of the current LLMs supported on OpenAI are:
 - gpt-4o-mini, gpt-4o-mini-2024-07-18
+- o1-preview, o1-mini
 - gpt-3.5-turbo, gpt-3.5-turbo-1106, gpt-3.5-turbo-0125
 - gpt-4o, gpt-4o-2024-05-13 (GPT-4 Omni)
 - gpt-4-turbo, gpt-4-turbo-2024-04-09 (GPT-4 Turbo with Vision)
```

tests/topenAIChat.m (+8)

```diff
@@ -9,6 +9,7 @@
         InvalidGenerateInput = iGetInvalidGenerateInput();
         InvalidValuesSetters = iGetInvalidValuesSetters();
         StringInputs = struct('string',{"hi"},'char',{'hi'},'cellstr',{{'hi'}});
+        ModelName = cellstr(llms.openai.models);
     end
 
     methods(Test)
@@ -95,6 +96,13 @@ function fixedSeedFixesResult(testCase)
         testCase.verifyEqual(result1,result2);
     end
 
+    function canUseModel(testCase,ModelName)
+        testCase.verifyClass(generate(...
+            openAIChat(ModelName=ModelName), ...
+            "hi",MaxNumTokens=1), ...
+            "string");
+    end
+
     function invalidInputsConstructor(testCase, InvalidConstructorInput)
         testCase.verifyError(@()openAIChat(InvalidConstructorInput.Input{:}), InvalidConstructorInput.Error);
     end
```
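
Because `canUseModel` is parameterized over the full `llms.openai.models` list, it can be narrowed to a single model while debugging. A sketch using the standard `matlab.unittest` suite and selector API (the relative file path and the chosen parameter value are assumptions for illustration):

```matlab
% Run only canUseModel, and only for the 'o1-mini' parameter value.
import matlab.unittest.TestSuite
import matlab.unittest.selectors.HasParameter

suite = TestSuite.fromFile("tests/topenAIChat.m", ProcedureName="canUseModel");
suite = suite.selectIf(HasParameter(Property="ModelName", Value='o1-mini'));
results = suite.run;
disp(table(results))
```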
