# Agents on the Hub

This page compiles all the libraries and tools Hugging Face offers for agentic workflows.

## smolagents

[smolagents](https://github.com/huggingface/smolagents) is a lightweight library that covers all agentic use cases, from code-writing agents to computer use, in a few lines of code. It is model-agnostic, supporting local models served with Hugging Face Transformers, models available through [Inference Providers](../inference-providers/index.md), and proprietary model providers.

It offers a unique kind of agent: `CodeAgent`, an agent that writes its actions in Python code.
It also supports the standard agent that writes actions in JSON blobs, as most other agentic frameworks do, called `ToolCallingAgent`.
To learn more about writing actions in code vs. JSON, check out our [short course on DeepLearning.AI](https://www.deeplearning.ai/short-courses/building-code-agents-with-hugging-face-smolagents/).

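To make the distinction concrete, here is a toy sketch (plain Python, not the smolagents API) of the same tool call expressed both ways, with a hypothetical `add` tool:

```python
import json

# A toy "tool" registry; real frameworks attach schemas and validation.
TOOLS = {"add": lambda a, b: a + b}

# ToolCallingAgent-style action: the model emits one JSON blob per tool call.
json_action = '{"name": "add", "arguments": {"a": 2, "b": 3}}'
call = json.loads(json_action)
result_from_json = TOOLS[call["name"]](**call["arguments"])
print(result_from_json)  # 5

# CodeAgent-style action: the model emits Python code, so a single action
# can chain several calls and use control flow.
code_action = "result = add(add(1, 1), 3)"
namespace = dict(TOOLS)
exec(code_action, namespace)
print(namespace["result"])  # 5
```

The JSON form needs one model round-trip per tool call, while the code form can compose several calls in one action, which is the main advantage of `CodeAgent`.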
If you want to avoid defining agents yourself, the easiest way to start an agent is through the CLI, using the `smolagent` command.

```bash
smolagent "Plan a trip to Tokyo, Kyoto and Osaka between Mar 28 and Apr 7." \
  --model-type "InferenceClientModel" \
  --model-id "Qwen/Qwen2.5-Coder-32B-Instruct" \
  --imports "pandas numpy" \
  --tools "web_search"
```

Agents can be pushed to the Hugging Face Hub as Spaces. Check out all the cool agents people have built [here](https://huggingface.co/spaces?filter=smolagents&sort=likes).

smolagents also supports MCP servers as tools, as follows:

```python
import os

from smolagents import CodeAgent, InferenceClientModel, MCPClient
from mcp import StdioServerParameters

model = InferenceClientModel()

server_parameters = StdioServerParameters(
    command="uvx",  # Using uvx ensures dependencies are available
    args=["--quiet", "pubmedmcp@0.1.3"],
    env={"UV_PYTHON": "3.12", **os.environ},
)

with MCPClient(server_parameters) as tools:
    agent = CodeAgent(tools=tools, model=model, add_base_tools=True)
    agent.run("Please find the latest research on COVID-19 treatment.")
```

Learn more [in the documentation](https://huggingface.co/docs/smolagents/tutorials/tools#use-mcp-tools-with-mcpclient-directly).

## huggingface.js mcp-client

Huggingface.js offers an MCP client that can be served with [Inference Providers](https://huggingface.co/docs/inference-providers/en/index) or local LLMs. Getting started is as simple as running `pnpm agent`. You can plug and play different models and providers by setting the `PROVIDER` and `MODEL_ID` environment variables.

```bash
export HF_TOKEN="hf_..."
export MODEL_ID="Qwen/Qwen2.5-72B-Instruct"
export PROVIDER="nebius"
npx @huggingface/mcp-client
```

Or, you can use any local LLM (for example, via LM Studio):

```bash
ENDPOINT_URL=http://localhost:1234/v1 \
MODEL_ID=lmstudio-community/Qwen3-14B-GGUF \
npx @huggingface/mcp-client
```

You can get more information about mcp-client [here](https://huggingface.co/docs/huggingface.js/en/mcp-client/README).

## Gradio MCP Server / Tools

You can build an MCP server in just a few lines of Python with Gradio. If you have an existing Gradio app or Space you'd like to use as an MCP server / tool, it's just a single-line change.

To make a Gradio application an MCP server, simply pass in `mcp_server=True` when launching your demo, as follows:

```python
import gradio as gr

def generate_image(prompt: str):
    """
    Generate an image based on a text prompt

    Args:
        prompt: a text string describing the image to generate
    """
    pass

demo = gr.Interface(
    fn=generate_image,
    inputs="text",
    outputs="image",
    title="Image Generator"
)

demo.launch(mcp_server=True)
```

The MCP server will be available at `http://your-server:port/gradio_api/mcp/sse`, where your application is served. It will have a tool corresponding to each function in your Gradio app, with the tool description automatically generated from the function docstrings.

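As a rough illustration of that mapping (this is not Gradio's actual implementation, which emits richer, MCP-spec-compliant metadata), the exposed tool can be thought of as a schema derived from the function's name, docstring, and signature:

```python
import inspect

def generate_image(prompt: str):
    """Generate an image based on a text prompt"""
    pass

# Hypothetical sketch: derive a tool description from introspection,
# the way the generated MCP tool reflects the function's docstring.
tool_schema = {
    "name": generate_image.__name__,
    "description": inspect.getdoc(generate_image),
    "parameters": {
        name: param.annotation.__name__
        for name, param in inspect.signature(generate_image).parameters.items()
    },
}
print(tool_schema["name"])         # generate_image
print(tool_schema["parameters"])   # {'prompt': 'str'}
```

This is why well-written docstrings and type hints matter: they become the tool documentation the LLM sees.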
Lastly, add this to the settings of the MCP Client of your choice (e.g., Cursor).

```json
{
  "mcpServers": {
    "gradio": {
      "url": "http://your-server:port/gradio_api/mcp/sse"
    }
  }
}
```

This is very powerful because it lets the LLM use any Gradio application as a tool. You can find thousands of them on [Spaces](https://huggingface.co/spaces).