Premium requests consumed very fast if using litellm #10

@chenyanming

Description

Premium requests are consumed very fast: a single prompt consumes around 30, whereas copilot-api consumes only 1 to 2. May I know why?

That said, copilot-api sometimes returns an error:
⎿  API Error: 466 {"error":{"message":"bad request: the specified API version is no longer supported. You may need to update your client to a newer version.\n","type":"error"}}
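The error body above is plain JSON, so the underlying message can be pulled out programmatically. A minimal sketch, using only the error payload quoted in this report (the surrounding client code and the variable names are hypothetical):

```python
import json

# Error body as returned by copilot-api, copied verbatim from the report above.
body = (
    '{"error":{"message":"bad request: the specified API version is no longer '
    'supported. You may need to update your client to a newer version.\\n",'
    '"type":"error"}}'
)

err = json.loads(body)["error"]
print(err["type"])              # the error type field
print(err["message"].strip())   # the human-readable message
```

Note that 466 is not a standard HTTP status code, so generic clients may not map it to a specific exception; inspecting the JSON body directly, as above, is one way to surface the message.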
