{openaistreamES} is based on an unofficial SDK for OpenAI, which aimed to
implement all related features as comprehensively as possible. It was
updated every two months based on the actual changes in OpenAI’s
interfaces, but its last update was in June 2024. This fork extends the library to support streaming responses from DeepSeek and from local servers that expose an OpenAI-compatible API.
To install the package, download the source and install it locally:

install.packages("path to openaistreamES", repos = NULL, type = "source")

Before using the API, you need to create a connection object and configure your secret key and the target AI server:
handle_openai <- openai$new(Sys.getenv("OPENAI_KEY", "openai"))
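The fork's DeepSeek and local-server support presumably works by pointing the connection at a different base URL. The sketch below is illustrative only: the base_url argument name and the endpoint addresses are assumptions, not confirmed parts of the package API, so check the constructor's documentation for the actual arguments:

# Hypothetical: point the client at a DeepSeek or local OpenAI-compatible server.
# The base_url argument name and the URLs shown here are assumptions.
handle_deepseek <- openai$new(
  Sys.getenv("DEEPSEEK_KEY"),
  base_url = "https://api.deepseek.com"  # or e.g. "http://localhost:8000/v1" for a local server
)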
If you cannot access OpenAI due to network restrictions, you can configure a valid proxy address and port:

handle_openai$set_proxy("PROXY_IP", PROXY_PORT)

Test the connection by querying the available models:
handle_openai$models$list()

Start a chat session:
streamlg <- handle_openai$chat$create(
  messages = data.frame(
    role = c("system", "user"),
    content = c("You are an assistant.", "How's the weather today?")
  ),
  model = "gpt-3.5-turbo",
  max_tokens = 10,
  n = 3
)

Note: The messages parameter is a data.frame with two columns: the first column is the role identifier, and the second column is the conversation content. The response will be stored in the streamlg variable.
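To read the generated text back from this non-streaming call, something like the following may work, assuming the package returns the parsed OpenAI JSON body as a nested R list (the exact return structure is an assumption; inspect the object first):

# Assumption: streamlg mirrors the OpenAI chat-completion JSON structure.
str(streamlg)                          # inspect the actual structure first
streamlg$choices[[1]]$message$content  # text of the first of the n completions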
Add the stream = TRUE parameter to return a stream object, which includes three methods: get_state() to get the current stream state, next_value to get the next data value, and close() to close the current stream object.
streamlg <- handle_openai$chat$create(
  messages = data.frame(
    role = c("system", "user"),
    content = c("You are an assistant.", "How's the weather today?")
  ),
  model = "gpt-3.5-turbo",
  stream = TRUE,
  max_tokens = 2,
  n = 3
)
streamlg$get_state()
text <- streamlg$next_value
streamlg$close()
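In practice you would usually loop over next_value until the stream finishes before closing it. The following is only a sketch: the state string checked here and the printable form of each chunk are assumptions, so adapt it to the values get_state() actually reports:

# Hypothetical read loop; the "completed" state name and chunk handling are assumptions.
while (streamlg$get_state() != "completed") {
  chunk <- streamlg$next_value  # next streamed fragment
  print(chunk)
}
streamlg$close()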
All API parameters correspond to the OpenAI HTTP interface, and almost all OpenAI parameters are supported; refer to the official OpenAI documentation for the details of each parameter. Apart from messages, which is passed as a data.frame, other object-valued parameters can be passed as lists.
Note: In the following code, stream_options is an object passed as a list in R:
streamlg <- handle_openai$chat$create(
  messages = data.frame(
    role = c("system", "user"),
    content = c("You are an assistant.", "How's the weather today?")
  ),
  model = "gpt-3.5-turbo",
  stream = TRUE,
  stream_options = list(include_usage = TRUE),
  max_tokens = 2,
  n = 3
)

For full API support, see the usage references: chat.R, test_images.R, test_files.R, test_fine_tuning.R, test_vector_stores.R, test_assistants.R, test_run.R, test_embedding.R, test_speech.R, test_batch.R, and test_moderations.R.
