litellm.APIConnectionError: APIConnectionError: OpenAIException - 'async_generator' object has no attribute 'get' #6086
Unanswered · deepakdeore2004 asked this question in Q&A · Replies: 0 comments
litellm gives the error below when `stream` is set in the curl request. Below is the configmap:

litellm version: litellm-database:main-v1.48.16

Calling the backend vLLM model directly returns a streaming response, but the same command fails through litellm.

What can be wrong here?

Here are some debug logs:
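For reference, a minimal litellm proxy configmap entry for an OpenAI-compatible vLLM backend generally looks like the sketch below. All names, the port, and the URL are placeholders chosen for illustration; the poster's actual configmap was not included in the extracted post.

```yaml
model_list:
  - model_name: my-model                      # alias exposed by the litellm proxy (placeholder)
    litellm_params:
      model: openai/my-model                  # route the request via the OpenAI-compatible client
      api_base: http://vllm-service:8000/v1   # placeholder URL of the vLLM server
```

With a config of this shape, the streaming request that triggers the reported error would be a standard chat-completions call with `"stream": true` in the JSON body, sent to the proxy's `/v1/chat/completions` endpoint.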