
feat: add native OpenRouter provider#4208

Open
subtleGradient wants to merge 6 commits into crewAIInc:main from subtleGradient:feat/openrouter-provider

Conversation


@subtleGradient subtleGradient commented Jan 10, 2026

Summary

Adds OpenRouter as a first-class native provider in CrewAI, enabling direct access to 300+ models through a unified API.

Changes

  • Register "openrouter" in SUPPORTED_NATIVE_PROVIDERS
  • Add provider mapping and validation in LLM class
  • Create new OpenRouterCompletion class extending OpenAICompletion
  • Add 25 unit tests covering all functionality
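The registration and routing described above might look roughly like the sketch below. The constant name SUPPORTED_NATIVE_PROVIDERS comes from the PR; the provider list contents and the helper function are simplified assumptions, not the actual crewai code:

```python
# Assumed shape of the provider list; the real one in lib/crewai/src/crewai/llm.py
# contains more entries.
SUPPORTED_NATIVE_PROVIDERS = ["openai", "anthropic", "azure", "openrouter"]


def split_model_string(model: str) -> tuple[str, str]:
    """Illustrative routing: 'openrouter/anthropic/claude-4.5-sonnet' splits
    on the first '/' into a provider prefix and the remaining model id."""
    provider, _, model_id = model.partition("/")
    if provider not in SUPPORTED_NATIVE_PROVIDERS:
        raise ValueError(f"Unsupported native provider: {provider!r}")
    return provider, model_id


print(split_model_string("openrouter/anthropic/claude-4.5-sonnet"))
# → ('openrouter', 'anthropic/claude-4.5-sonnet')
```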

Usage

from crewai import LLM

llm = LLM(model="openrouter/anthropic/claude-4.5-sonnet")

Verified Working Example

This exact script was tested and produces the output shown below:

#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.10"
# dependencies = [
#     "crewai @ git+https://github.com/subtleGradient/crewAI.git@feat/openrouter-provider#subdirectory=lib/crewai",
# ]
# ///
"""Native OpenRouter provider test (PR #4208). Requires OPENROUTER_API_KEY env var."""

from crewai import LLM, Agent, Crew, Task

# Native OpenRouter provider - no base_url needed!
llm = LLM(model="openrouter/anthropic/claude-4.5-sonnet")

agent = Agent(
    role="Assistant", goal="Answer concisely", backstory="Helpful AI.", llm=llm
)
task = Task(
    description="What is 2 + 2? One word.", expected_output="A number", agent=agent
)
result = Crew(agents=[agent], tasks=[task]).kickoff()

print(f"Result: {result}")

Actual Output (verified):

Installed 133 packages in 81ms
Result: Four

Why

While OpenRouter works today via LiteLLM's openrouter/ prefix, a native provider offers:

  • Proper OPENROUTER_API_KEY environment variable support (not OPENAI_API_KEY)
  • OpenRouter-specific headers (HTTP-Referer, X-Title) for attribution
  • Cleaner integration without LiteLLM dependency for simple use cases

Implementation Details

OpenRouterCompletion extends OpenAICompletion (since OpenRouter is OpenAI-compatible) and:

  • Overrides base_url to https://openrouter.ai/api/v1
  • Uses OPENROUTER_API_KEY instead of OPENAI_API_KEY
  • Adds OpenRouter-specific headers for rankings attribution
  • Returns True in model validation (like Azure) since OpenRouter proxies 300+ models
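The four points above can be sketched as a toy subclass. The base class here is a simplified stand-in for crewai's OpenAICompletion, and the parameter names (site_url, site_name) are assumptions based on the description, not the PR's exact signature:

```python
import os


class OpenAICompletion:
    """Simplified stand-in for crewai's OpenAI-compatible base class."""

    def __init__(self, model, api_key=None, base_url=None, default_headers=None, **kwargs):
        self.model = model
        self.api_key = api_key
        self.base_url = base_url or "https://api.openai.com/v1"
        self.default_headers = default_headers or {}


class OpenRouterCompletion(OpenAICompletion):
    """Sketch of the subclass described above; names and defaults assumed."""

    def __init__(self, model, api_key=None, site_url=None, site_name=None, **kwargs):
        headers = {}
        if site_url:
            headers["HTTP-Referer"] = site_url  # OpenRouter rankings attribution
        if site_name:
            headers["X-Title"] = site_name
        super().__init__(
            model=model,
            api_key=api_key or os.environ.get("OPENROUTER_API_KEY"),
            base_url="https://openrouter.ai/api/v1",  # override the OpenAI endpoint
            default_headers=headers,
            **kwargs,
        )

    def _validate_model_in_constants(self, model: str) -> bool:
        # OpenRouter proxies 300+ models, so accept any id (same approach as Azure).
        return True
```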

Testing

  • 25 unit tests covering initialization, API key handling, header injection, and context window
  • Verified working locally with anthropic/claude-4.5-sonnet model

- Add OpenRouterCompletion class extending OpenAICompletion
- Use OpenRouter API endpoint (https://openrouter.ai/api/v1)
- Support OPENROUTER_API_KEY environment variable
- Support OpenRouter attribution headers (HTTP-Referer, X-Title)
- Register provider in SUPPORTED_NATIVE_PROVIDERS
- Add 24 unit tests covering initialization, client params, context window,
  header injection, and provider integration
- Fix bug: pass provider='openrouter' to parent constructor
- Improve error message with link to openrouter.ai/keys
- Add usage examples to class docstring
- Document context window default rationale

Tests cover:
- API key handling (explicit + env var fallback)
- Site URL/name for attribution
- Base URL override behavior
- Header injection logic
- Error handling for missing API key

- Remove unused MagicMock import from tests
- Add Returns section to _get_client_params docstring
- Add Returns section to get_context_window_size docstring
- Add Raises section to _get_client_params docstring
@subtleGradient subtleGradient marked this pull request as ready for review January 10, 2026 01:16
Copilot AI review requested due to automatic review settings January 10, 2026 01:16


Copilot AI left a comment


Pull request overview

This pull request adds OpenRouter as a first-class native provider in CrewAI, enabling direct access to 300+ models through a unified OpenAI-compatible API. The implementation extends the existing OpenAICompletion class and adds proper support for OpenRouter-specific features including API key management and attribution headers.

Key changes:

  • Introduces OpenRouterCompletion class that extends OpenAICompletion with OpenRouter-specific configuration
  • Registers "openrouter" in the supported native providers list and adds routing logic in the LLM factory
  • Adds comprehensive test coverage for initialization, API key handling, header injection, and context window size

Reviewed changes

Copilot reviewed 4 out of 6 changed files in this pull request and generated 3 comments.

File | Description
lib/crewai/src/crewai/llm.py | Registers "openrouter" in SUPPORTED_NATIVE_PROVIDERS and adds provider mapping and routing logic
lib/crewai/src/crewai/llms/providers/openrouter/completion.py | Implements OpenRouterCompletion class with OpenRouter-specific API endpoint, environment variables, and attribution headers
lib/crewai/src/crewai/llms/providers/openrouter/__init__.py | Empty module initialization file for the openrouter provider package
lib/crewai/tests/llms/providers/openrouter/test_completion.py | Comprehensive test suite covering initialization, API key handling, site attribution, header injection, and integration scenarios
lib/crewai/tests/llms/providers/openrouter/__init__.py | Empty test module initialization file
lib/crewai/tests/llms/providers/__init__.py | Empty test module initialization file


When LLM class passes provider='openrouter' explicitly, it was being
duplicated since OpenRouterCompletion also hardcodes provider='openrouter'
in super().__init__(). Now we pop 'provider' from kwargs before calling super.
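A minimal reproduction of that duplicate-keyword fix, using toy classes rather than the PR's actual code:

```python
class Base:
    def __init__(self, provider=None, **kwargs):
        self.provider = provider


class Child(Base):
    def __init__(self, **kwargs):
        # Without this pop, Child(provider="openrouter") would raise:
        #   TypeError: __init__() got multiple values for keyword argument 'provider'
        kwargs.pop("provider", None)
        super().__init__(provider="openrouter", **kwargs)


c = Child(provider="openrouter")  # no longer collides with the hardcoded keyword
print(c.provider)  # → openrouter
```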

@cursor cursor bot left a comment



- Add openrouter case to _validate_model_in_constants (returns True for all models since OpenRouter is a proxy)
- Fix context window docstring to accurately note that 128K default may exceed some models' actual limits
- Store resolved OpenRouter API key before calling super().__init__
- Override self.api_key after super to ensure OPENAI_API_KEY is never used
- Add test to verify OPENAI_API_KEY is never used even when set

Fixes cursor[bot] feedback about wrong API key being used when
OPENAI_API_KEY is set but OPENROUTER_API_KEY is not.
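The resolve-before-super, re-assert-after-super pattern that commit describes can be sketched with stand-in classes (the real base class and its fallback behavior are simplified assumptions here):

```python
import os


class Base:
    """Stand-in for the OpenAI-style base: falls back to OPENAI_API_KEY."""

    def __init__(self, api_key=None, **kwargs):
        self.api_key = api_key or os.environ.get("OPENAI_API_KEY")


class OpenRouterLike(Base):
    def __init__(self, api_key=None, **kwargs):
        # Resolve first, so the base never silently picks up OPENAI_API_KEY...
        resolved = api_key or os.environ.get("OPENROUTER_API_KEY")
        super().__init__(api_key=resolved, **kwargs)
        # ...and re-assert afterwards in case the base overwrote it.
        self.api_key = resolved
```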
@github-actions

This PR is stale because it has been open for 45 days with no activity.

