fix(llm): Streaming output generation and update gradio to latest version #197
Conversation
@imbajin:
@Aryankb Thank you for your attempt. We have already implemented a somewhat less elegant asynchronous streaming output by modifying the previous version. Let's keep that implementation for now and have this PR only upgrade the Gradio version. If possible, a short list of the 4.x -> 5.x features that are useful for us would be even better.
@imbajin Okay, so do I just need to list the possible changes (useful features), or do I also need to make changes in Gradio? Also,
Refer to #190 (async function calls were added along the entire path, but some places still have TODOs). After introducing the async workflow framework, we should make all current operator/step calls async. Right now many places are still synchronous, so changing them is quite troublesome.
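A minimal sketch of one way to bridge the gap in the meantime, assuming an operator is a plain Python class with a blocking `run()` method (the names here are hypothetical, not the project's actual API): a synchronous step can be offloaded to a worker thread with `asyncio.to_thread` so it can be awaited from an async workflow until the operator itself is rewritten as async.

```python
import asyncio


class SyncOperator:
    """Stand-in for an existing synchronous operator/step (hypothetical)."""

    def run(self, query: str) -> str:
        # Blocking work would happen here (e.g. an LLM or graph call).
        return f"result for {query}"


async def run_operator_async(op: SyncOperator, query: str) -> str:
    # Offload the blocking call to a worker thread so the event loop stays free.
    return await asyncio.to_thread(op.run, query)


async def main() -> None:
    op = SyncOperator()
    print(await run_operator_async(op, "hello"))


if __name__ == "__main__":
    asyncio.run(main())
```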
Fixes #173
Fixes:
- `rag_block` -> `rag_answer`; the same change can be applied to all other Gradio output boxes.

Modifications:
- Used `gr.update()` in `rag_block` to generate streaming output (a minimal sketch is included after the test steps below).

How to Test:
- Run `python3 -m hugegraph_llm.demo.rag_demo.app` in the CLI.
- Click "Answer question" under "RAG and user functions".
- The output is streamed, although it is not async: the final output is streamed only after all outputs have been collected.
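A minimal sketch of the streaming technique described above, assuming the generator-based event handlers supported by recent Gradio versions; `stream_answer` and the component names are illustrative stand-ins, not the PR's actual `rag_answer` code. The handler collects the full answer first and then yields successive `gr.update(value=...)` calls, mirroring the "streaming after collecting all outputs" behaviour.

```python
import time

import gradio as gr


def stream_answer(question: str):
    # Collect the full answer first, then emit it incrementally via gr.update().
    full_answer = f"Echoing the question: {question}"
    shown = ""
    for token in full_answer.split():
        shown += token + " "
        time.sleep(0.05)  # simulate generation latency
        yield gr.update(value=shown)


with gr.Blocks() as demo:
    question = gr.Textbox(label="Question")
    answer = gr.Textbox(label="Answer")
    gr.Button("Answer question").click(stream_answer, inputs=question, outputs=answer)

if __name__ == "__main__":
    demo.launch()
```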
Screencast from 03-07-2025 01:13:46 PM.webm