# Agents on Hub

This page compiles all the libraries and tools Hugging Face offers for agentic workflows.

## Smolagents

[Smolagents](https://github.com/huggingface/smolagents) is a lightweight library that covers agentic use cases, from code-writing agents to computer use, in just a few lines of code. It is model-agnostic: it supports local models served with Hugging Face Transformers, models offered through [Inference Providers](../inference-providers/index.md), and proprietary model providers.

It offers three agent classes based on the ReAct framework: `CodeAgent` for agents that write their own code, `ToolCallingAgent` for tool-calling agents, and `MultiStepAgent`, the base class the former two build on for multi-step ReAct workflows.

If you want to avoid defining agents yourself, the easiest way to start an agent is through the CLI, with the `smolagent` command.
```bash
smolagent "Plan a trip to Tokyo, Kyoto and Osaka between Mar 28 and Apr 7." \
    --model-type "InferenceClientModel" \
    --model-id "Qwen/Qwen2.5-Coder-32B-Instruct" \
    --imports "pandas numpy" \
    --tools "web_search"
```

Agents can be pushed to the Hugging Face Hub as Spaces. Check out all the cool agents people have built [here](https://huggingface.co/spaces?filter=smolagents&sort=likes).

## huggingface.js mcp-client

Huggingface.js offers an MCP client served with Inference Providers. Getting started is as simple as running `pnpm agent`. You can plug and play different models and providers by setting the `PROVIDER` and `MODEL_ID` environment variables.

```bash
export HF_TOKEN="hf_..."
export MODEL_ID="Qwen/Qwen2.5-72B-Instruct"
export PROVIDER="nebius"
pnpm agent
```

You can get more information about mcp-client [here](https://huggingface.co/docs/huggingface.js/en/mcp-client/README).

## Gradio MCP Server

The Gradio MCP Server feature wraps a Gradio application so that LLMs can use it as a tool.

To make a Gradio application an MCP server, simply pass `mcp_server=True` when launching your demo, as follows.
```python
import gradio as gr

# Placeholder implementation; replace with your own image generation logic.
def generate_images(prompt):
    ...

demo = gr.Interface(
    fn=generate_images,
    inputs="text",
    outputs="image",
    title="Image Generator"
)

demo.launch(mcp_server=True)
```

The server will be available at `http://your-server:port/gradio_api/mcp/sse`, where your application is served. Lastly, add this to the settings of the MCP client of your choice.

```json
{
  "mcpServers": {
    "gradio": {
      "url": "http://your-server:port/gradio_api/mcp/sse"
    }
  }
}
```

This is very powerful because it lets the LLM use any Gradio application as a tool. You can find a variety of them on [Spaces](https://huggingface.co/spaces) and serve them yourself.
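For example, a hosted Space that exposes an MCP server can be plugged into a client directly via its public URL. A sketch of such a configuration, where the Space subdomain is a hypothetical placeholder:

```json
{
  "mcpServers": {
    "image-generator": {
      "url": "https://your-username-image-generator.hf.space/gradio_api/mcp/sse"
    }
  }
}
```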