
Commit fae19e4

Add Quickstart page to docs (#1413)
Co-authored-by: Aymeric Roucher <[email protected]>
1 parent f8ca1b9 commit fae19e4

File tree

3 files changed, +102 -10 lines


docs/source/en/_toctree.yml

Lines changed: 2 additions & 2 deletions
@@ -1,9 +1,9 @@
 - title: Get started
   sections:
   - local: index
-    title: 🤗 Agents
+    title: Introduction
   - local: installation
-    title: Installation
+    title: Installation options
   - local: guided_tour
     title: Guided tour
 - title: Tutorials

docs/source/en/index.mdx

Lines changed: 99 additions & 7 deletions
@@ -1,20 +1,112 @@
# `smolagents`

<div class="flex justify-center">
-<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/smolagents/license_to_call.png" width=100%/>
+<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/smolagents/license_to_call.png" style="max-width:700px"/>
</div>

-This library is the simplest framework out there to build powerful agents! By the way, wtf are "agents"? We provide our definition [in this page](conceptual_guides/intro_agents), where you'll also find tips for when to use them or not (spoilers: you'll often be better off without agents).
+## What is smolagents?

-This library offers:
+`smolagents` is an open-source Python library designed to make it extremely easy to build and run agents using just a few lines of code.

-**Simplicity**: the logic for agents fits in ~thousand lines of code. We kept abstractions to their minimal shape above raw code!
+Key features of `smolagents` include:

-🌐 **Support for any LLM**: it supports models hosted on the Hub loaded in their `transformers` version or through [Inference providers](https://huggingface.co/docs/inference-providers/index): Cerebras, Cohere, Fal, Fireworks, Hyperbolic, Nebius, Novita, Replicate, SambaNova, Together, etc. It also supports models from OpenAI, Anthropic... it's really easy to power an agent with any LLM.
+**Simplicity**: The logic for agents fits in ~1,000 lines of code. We kept abstractions to their minimal shape above raw code!

-🧑‍💻 **First-class support for Code Agents**, i.e. agents that write their actions in code (as opposed to "agents being used to write code"), [read more here](tutorials/secure_code_execution).
+🧑‍💻 **First-class support for Code Agents**: [`CodeAgent`](reference/agents#smolagents.CodeAgent) writes its actions in code (as opposed to "agents being used to write code") to invoke tools or perform computations, enabling natural composability (function nesting, loops, conditionals). To keep this secure, we support [executing in a sandboxed environment](tutorials/secure_code_execution) via [E2B](https://e2b.dev/) or via Docker (see the sketch after this feature list).

-🤗 **Hub integrations**: you can share and load Gradio Spaces as tools to/from the Hub, and more is to come!
+📡 **Common Tool-Calling Agent Support**: In addition to CodeAgents, [`ToolCallingAgent`](reference/agents#smolagents.ToolCallingAgent) supports the usual JSON/text-based tool calling for scenarios where that paradigm is preferred (sketch below).

🤗 **Hub integrations**: Seamlessly share and load agents and tools to/from the Hub as Gradio Spaces.

🌐 **Model-agnostic**: Easily integrate any large language model (LLM), whether it's hosted on the Hub via [Inference providers](https://huggingface.co/docs/inference-providers/index), accessed via APIs such as OpenAI or Anthropic (or many others via the LiteLLM integration), or run locally using Transformers or Ollama. Powering an agent with your preferred LLM is straightforward and flexible.

👁️ **Modality-agnostic**: Beyond text, agents can handle vision, video, and audio inputs, broadening the range of possible applications. Check out [this tutorial](examples/web_browser) for vision.

🛠️ **Tool-agnostic**: You can use tools from any [MCP server](reference/tools#smolagents.ToolCollection.from_mcp) or from [LangChain](reference/tools#smolagents.Tool.from_langchain), and you can even use a [Hub Space](reference/tools#smolagents.Tool.from_space) as a tool (sketch below).

💻 **CLI Tools**: Comes with command-line utilities (`smolagent`, `webagent`) for quickly running agents without writing boilerplate code.
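To make the sandboxing option from the Code Agents bullet concrete, here is a minimal sketch, not taken from the page itself: it assumes the `executor_type` parameter described in the secure code execution tutorial, the `e2b` extra, and an E2B API key available in the environment.

```python
from smolagents import CodeAgent, InferenceClientModel

model = InferenceClientModel()

# Assumed setup: pip install "smolagents[e2b]" and an E2B API key exported in
# the environment. Generated Python then runs in a remote sandbox instead of
# the local executor.
agent = CodeAgent(tools=[], model=model, executor_type="e2b")

agent.run("Compute the 20th Fibonacci number")
```

Per the same tutorial, Docker-based sandboxing is the other supported option; check the tutorial for the exact argument values in your installed version.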
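For the tool-calling variant, a minimal sketch mirroring the Quickstart below, with only the agent class swapped, could look like this (the task string is just an example):

```python
from smolagents import ToolCallingAgent, InferenceClientModel

model = InferenceClientModel()

# Same run() interface as CodeAgent, but actions are emitted as JSON tool
# calls rather than as Python snippets. add_base_tools needs the
# smolagents[toolkit] extra for the default web search tool.
agent = ToolCallingAgent(tools=[], model=model, add_base_tools=True)

result = agent.run("What is the tallest mountain in Europe?")
print(result)
```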
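And for the Hub Space option, a sketch along these lines should work; the Space id, tool name, and description are illustrative placeholders rather than a recommendation:

```python
from smolagents import CodeAgent, InferenceClientModel, Tool

# Wrap a Gradio Space from the Hub as a tool (placeholder Space id).
image_tool = Tool.from_space(
    "black-forest-labs/FLUX.1-schnell",
    name="image_generator",
    description="Generates an image from a text prompt",
)

agent = CodeAgent(tools=[image_tool], model=InferenceClientModel())
agent.run("Generate an image of a lighthouse at sunset")
```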
## Quickstart

Get started with smolagents in just a few minutes! This guide will show you how to create and run your first agent.

### Installation

Install smolagents with pip:

```bash
pip install smolagents
```

For additional features, you can install optional dependencies:

```bash
pip install smolagents[toolkit]  # Includes default tools like web search
```

### Create Your First Agent

Here's a minimal example to create and run an agent:

```python
from smolagents import CodeAgent, InferenceClientModel

# Initialize a model (using Hugging Face Inference API)
model = InferenceClientModel()  # Uses a default model

# Create an agent with no tools
agent = CodeAgent(tools=[], model=model)

# Run the agent with a task
result = agent.run("Calculate the sum of numbers from 1 to 10")
print(result)
```

That's it! Your agent will use Python code to solve the task and return the result.

### Adding Tools

Let's make our agent more capable by adding some tools:

```python
from smolagents import CodeAgent, InferenceClientModel

# Initialize with default tools (requires smolagents[toolkit])
model = InferenceClientModel()
agent = CodeAgent(
    tools=[],  # Empty list since we'll use default tools
    model=model,
    add_base_tools=True  # This adds web search and other default tools
)

# Now the agent can search the web!
result = agent.run("What is the current weather in Paris?")
print(result)
```
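The default toolkit covers common needs, but most agents eventually want task-specific tools. As a minimal sketch, not part of the Quickstart above, assuming the `tool` decorator exported by smolagents and using a made-up `get_travel_time` function:

```python
from smolagents import CodeAgent, InferenceClientModel, tool

@tool
def get_travel_time(origin: str, destination: str) -> str:
    """Gives a rough driving-time estimate between two cities.

    Args:
        origin: The city to start from.
        destination: The city to drive to.
    """
    # Placeholder logic; a real tool would call a routing or maps API.
    return f"Driving from {origin} to {destination} takes about 4 hours."

agent = CodeAgent(tools=[get_travel_time], model=InferenceClientModel())
print(agent.run("How long does it take to drive from Paris to Lyon?"))
```

The type hints and the `Args:` section of the docstring are what the library uses to describe the tool to the model, so they are worth writing carefully.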
### Using Different Models

You can use various models with your agent:

```python
from smolagents import InferenceClientModel

# Using a specific model from Hugging Face
model = InferenceClientModel(model_id="meta-llama/Llama-2-70b-chat-hf")

# Using OpenAI/Anthropic (requires smolagents[litellm])
from smolagents import LiteLLMModel
model = LiteLLMModel(model_id="gpt-4")

# Using local models (requires smolagents[transformers])
from smolagents import TransformersModel
model = TransformersModel(model_id="meta-llama/Llama-2-7b-chat-hf")
```

## Next Steps

- Learn how to set up smolagents with various models and tools on the [Installation Options](installation) page
- Check out the [Guided Tour](guided_tour) for more advanced features
- Learn about [building custom tools](tutorials/tools)
- Explore [secure code execution](tutorials/secure_code_execution)
- See how to create [multi-agent systems](tutorials/building_good_agents)

<div class="mt-10">
  <div class="w-full flex flex-col space-y-4 md:space-y-0 md:grid md:grid-cols-2 md:gap-y-4 md:gap-x-5">

docs/source/en/installation.mdx

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-# Installation Guide
+# Installation Options

The `smolagents` library can be installed using pip. Here are the different installation methods and options available.