Rust AI Agent

An autonomous, extensible AI agent built in Rust that can use various tools to accomplish complex goals.

The agent uses a "Reasoning and Acting" (ReAct) pattern, allowing it to think through a problem, decide on an action, execute it using a tool, and observe the result before continuing.
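The think/act/observe cycle can be sketched as follows. Everything in this example (the Action enum, think, run_tool, react) is an illustrative stand-in written synchronously for brevity, not this repository's actual API:

```rust
// Minimal, synchronous sketch of a ReAct-style loop.
// All names here are illustrative, not the crate's real types.

enum Action {
    UseTool { name: String, input: String },
    Finish { answer: String },
}

// Stand-in for an LLM call: decide the next action from the transcript.
fn think(transcript: &str) -> Action {
    if transcript.contains("observation:") {
        Action::Finish { answer: "done".into() }
    } else {
        Action::UseTool {
            name: "calculator".into(),
            input: "25 * 4 + 100".into(),
        }
    }
}

// Stand-in for tool execution.
fn run_tool(name: &str, input: &str) -> String {
    format!("{name}({input}) -> 200")
}

fn react(goal: &str, max_iterations: usize) -> String {
    let mut transcript = format!("goal: {goal}\n");
    for _ in 0..max_iterations {
        match think(&transcript) {
            Action::UseTool { name, input } => {
                // Execute the chosen tool and feed the observation back in.
                let observation = run_tool(&name, &input);
                transcript.push_str(&format!("observation: {observation}\n"));
            }
            Action::Finish { answer } => return answer,
        }
    }
    "max iterations reached".into()
}

fn main() {
    println!("{}", react("What is 25 * 4 + 100?", 5));
}
```

The key design point is that each observation is appended to the transcript before the next reasoning step, so the model always decides its next action with the results of previous actions in context.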

Features

  • Autonomous Goal Pursuit: Set a complex goal, and the agent iterates until it finds a solution or reaches the maximum iteration limit.
  • Multi-Provider Support: Works with OpenAI directly, or with many other providers via OpenRouter.
  • Smart Memory Management: Implements semantic summarization and priority-based memory management to keep the context window efficient.
  • Tool Use Capabilities:
    • Calculator: Evaluate complex mathematical expressions.
    • File Operations: Read, write, list, and manage files in a sandboxed agent_workspace.
    • DateTime: Get current date and time information.
    • Web Search: Search the web for real-time information using Serper.dev.
  • Interactive Chat Mode: A conversational interface for interacting with the agent.
  • Streaming Output: Real-time feedback as the agent "thinks" and performs actions.

Installation

Prerequisites

  • Rust and Cargo (2024 edition recommended)
  • API keys for your preferred AI provider (OpenAI or OpenRouter)
  • Serper.dev API key for web search capabilities

Setup

  1. Clone the repository:

    git clone <repository-url>
    cd rust-agent
  2. Create a .env file in the project root:

    # Provider selection (openai or openrouter)
    PROVIDER=openai
    API_KEY=your_api_key
    
    # Required for web search
    SERPER_API_KEY=your_serper_key
  3. Build the project:

    cargo build --release

Usage

Interactive Chat Mode (Default)

Start a conversation with the agent:

cargo run --release

Available commands in chat:

  • /quit or /exit: End the session.
  • /clear: Clear conversation memory.
  • /memory: Show current memory trace.
  • /help: Show help message.
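The chat commands above could be dispatched with a simple match on the input line; this is a hypothetical sketch, not the repository's actual parser:

```rust
// Illustrative command dispatch for the interactive chat mode.
// The Command enum and parse_command are assumptions, not the real API.

enum Command {
    Quit,
    Clear,
    Memory,
    Help,
    /// Anything that is not a slash command is treated as a goal for the agent.
    Goal(String),
}

fn parse_command(line: &str) -> Command {
    match line.trim() {
        "/quit" | "/exit" => Command::Quit,
        "/clear" => Command::Clear,
        "/memory" => Command::Memory,
        "/help" => Command::Help,
        other => Command::Goal(other.to_string()),
    }
}

fn main() {
    // Non-command input falls through to the agent as a goal.
    assert!(matches!(parse_command("  /exit "), Command::Quit));
    assert!(matches!(parse_command("write a poem"), Command::Goal(_)));
    println!("dispatch ok");
}
```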

Single Goal Mode

Run the agent to accomplish a specific task and exit:

cargo run --release -- --goal "Write a short poem about Rust and save it to poem.txt"

CLI Options

cargo run --release -- --help
Option                   Description                                 Default
-g, --goal <GOAL>        Run a single goal directly                  -
-m, --model <MODEL>      LLM model to use                            openai/gpt-4o-mini
--max-iterations <N>     Maximum reasoning iterations                5
-v, --verbose            Show debug information and memory traces   false
--no-stream              Disable streaming output                    false

Example Usage

# Simple calculation
cargo run --release -- --goal "What is 25 * 4 + 100?"

# File operations
cargo run --release -- --goal "Create a file called notes.txt with a shopping list"

# Web search (requires SERPER_API_KEY)
cargo run --release -- --goal "Search for the latest Rust version and save it to a file"

Project Structure

src/
├── main.rs           # CLI entry point and orchestration
├── agent/            # Core agent logic
│   ├── agent.rs      # ReAct loop and tool execution
│   ├── memory.rs     # Token-aware memory management
│   ├── prompts.rs    # System prompts for the LLM
│   └── action_parser.rs  # Parse LLM output into actions
├── clients/          # API clients
│   ├── openai.rs     # OpenAI API client
│   ├── openrouter.rs # OpenRouter API client
│   └── config.rs     # Provider configuration
├── tools/            # Available tools
│   ├── calculator_tool.rs  # Mathematical expressions
│   ├── file_ops_tool.rs    # Sandboxed file operations
│   ├── datetime_tool.rs    # Date/time information
│   └── web_search.rs       # Web search via Serper.dev
├── error.rs          # Centralized error types
├── http.rs           # HTTP client configuration
└── retry.rs          # Retry logic with exponential backoff

Extending the Agent

Adding new tools is straightforward:

  1. Create a new file in src/tools/ implementing the Tool trait:

    struct MyTool;

    #[async_trait]
    impl Tool for MyTool {
        fn name(&self) -> &str { "my_tool" }
        fn description(&self) -> &str { "Description for the LLM" }
        async fn execute(&self, args: Value) -> anyhow::Result<ToolOutput> {
            // Inspect `args`, do the work, and return Ok(...) with a ToolOutput
            todo!()
        }
    }
  2. Register the tool in src/main.rs within the setup_tools function.
