Support for prompt caching #35

@simonw

Description

https://openrouter.ai/docs/features/prompt-caching

Could be a tiny bit tricky to implement. I expect whatever mechanism I use for llm-anthropic will mostly work here too:
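Per the OpenRouter docs linked above, caching for Anthropic models works the same way as in the Anthropic API: the request marks cacheable content with a `cache_control` breakpoint inside a multi-part message body. A minimal sketch of what such a payload could look like, assuming the standard OpenRouter chat completions endpoint (model name and prompt text are placeholders):

```python
import json

# Illustrative long context that would benefit from caching.
LARGE_CONTEXT = "Imagine this is a very long document... " * 100

payload = {
    "model": "anthropic/claude-3.5-sonnet",
    "messages": [
        {
            "role": "system",
            "content": [
                {"type": "text", "text": "You answer questions about the document."},
                {
                    "type": "text",
                    "text": LARGE_CONTEXT,
                    # Breakpoint: everything up to here is marked cacheable
                    "cache_control": {"type": "ephemeral"},
                },
            ],
        },
        {"role": "user", "content": "Summarize the document."},
    ],
}

# This would be POSTed to https://openrouter.ai/api/v1/chat/completions
# with an "Authorization: Bearer <key>" header (omitted here).
print(json.dumps(payload)[:60])
```

The plugin-side work is presumably mapping an `llm` option (as in llm-anthropic) onto this `cache_control` field when building the request body.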

Metadata
    Labels

    enhancement (New feature or request)