[llm] Roadmap for Data and Serve LLM APIs #51313
Comments
@kouroshHakha Hi, I’m very interested in the part about enabling Ray to support SGLang. Would it be possible for me to work on this?
I think that'd be awesome @Qiaolin-Yu. How about we coordinate on the Ray Slack?
Sure!
Would like to see the ability to serve embedding models with Ray Serve LLM through a dedicated endpoint, as per:
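For context on the request above: a dedicated embeddings endpoint would presumably need to return the OpenAI-compatible `/v1/embeddings` response shape that clients already expect. The sketch below is only an illustration of that response format, not an actual Ray Serve LLM API; the function name and the zeroed token counts are hypothetical.

```python
# Hypothetical sketch: wrap raw embedding vectors in the
# OpenAI-compatible /v1/embeddings response format that an
# embeddings endpoint would be expected to return.
def to_openai_embeddings_response(model: str, vectors: list[list[float]]) -> dict:
    return {
        "object": "list",
        "data": [
            # One entry per input, indexed in request order.
            {"object": "embedding", "index": i, "embedding": vec}
            for i, vec in enumerate(vectors)
        ],
        "model": model,
        # Token accounting would come from the tokenizer; zeroed here
        # as a placeholder for the sketch.
        "usage": {"prompt_tokens": 0, "total_tokens": 0},
    }
```

A server-side handler (e.g. a Ray Serve deployment behind an HTTP route) could compute the vectors with any embedding model and hand them to a helper like this before returning JSON.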
@lk-chen I am very interested in
@yurigorokhov That'd be awesome. Can you join Ray's Slack workspace and the #llm channel (https://ray.slack.com/archives/C08H0M37WLQ)? We can hash out how you can contribute.
This document includes a list of issues and feature requests that we have collected from OSS and other channels. We'll update this list with relevant information from issues as we go. If a feature you care about is not prioritized here, please feel free to open an RFC or feature request, or post on the Slack community channel.
Core features
Serve
Data
CI/CD and release pipeline
Docs and community support