This repository has been archived by the owner on Feb 3, 2025. It is now read-only.
Given an LLM response that contains a code symbol (e.g., a class or method name), we should link it to its exact location on GitHub. For instance, if you're chatting with Hugging Face's Transformers library:

Original response: "To define a BERT model using the Hugging Face Transformers library, you can use the BertModel class."

Link-ified response: "To define a BERT model using the Hugging Face Transformers library, you can use the [`BertModel`](https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/modeling_bert.py) class."
This should be done as a post-processing step in chat.py, once the model is done streaming.
Work items:
1. Produce an AST (Abstract Syntax Tree) for every file in the repository. This can be done via the `tree_sitter` library, which we're already using in `chunk.py`.
2. Identify class / method / file names in the model response, potentially using regular expressions.
3. Look up the strings matched by the regular expressions in the per-file ASTs.
4. Map each code symbol to a GitHub URL.
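A minimal sketch of the post-processing step, assuming the tree-sitter pass has already produced a symbol index (here stubbed out as a dict; the file path is real, but the line number and the `linkify` helper are hypothetical):

```python
import re

# Stand-in for the index built by walking tree-sitter ASTs of every
# repository file (as done in chunk.py): symbol name -> (path, line).
# The line number here is a made-up placeholder.
SYMBOL_INDEX = {
    "BertModel": ("src/transformers/models/bert/modeling_bert.py", 900),
}

GITHUB_BLOB_URL = "https://github.com/{repo}/blob/{ref}/{path}#L{line}"

# Matches candidate identifiers (CamelCase or snake_case tokens).
IDENTIFIER = re.compile(r"\b[A-Za-z_][A-Za-z0-9_]*\b")

def linkify(response: str, repo: str = "huggingface/transformers",
            ref: str = "main") -> str:
    """Replace known code symbols in a model response with Markdown links."""
    def repl(match: re.Match) -> str:
        symbol = match.group(0)
        hit = SYMBOL_INDEX.get(symbol)
        if hit is None:
            return symbol  # not a known symbol; leave the text untouched
        path, line = hit
        url = GITHUB_BLOB_URL.format(repo=repo, ref=ref, path=path, line=line)
        return f"[`{symbol}`]({url})"
    return IDENTIFIER.sub(repl, response)
```

Called once after streaming finishes, e.g. `linkify("you can use the BertModel class.")` turns only the indexed token into a link and leaves ordinary words alone.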