
build an OpenAI-compatible multi-provider chat completions proxy #195

@lukeocodes

Description

Integration: OpenAI-Compatible Multi-Provider Chat Completions Proxy for Voice Agent API

What this should show

A proxy server that sits between the Deepgram Voice Agent API and multiple LLM backends (Amazon Bedrock Agents, OpenAI). The proxy exposes an OpenAI-compatible chat completions endpoint so the Voice Agent can route requests to different providers via configuration, without changing application code. The example should demonstrate:

  • OpenAI-compatible /v1/chat/completions endpoint
  • Provider switching via config (Amazon Bedrock Agents, OpenAI)
  • Integration with Deepgram Voice Agent API as the LLM backend
  • Minimal setup to swap providers
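The routing idea behind the points above can be sketched as a small dispatch layer. This is a hypothetical illustration, not an implementation from the issue: the function and provider names (`chat_completion`, `LLM_PROVIDER`, the stubbed backend calls) are assumptions, and the real backends (OpenAI's API, Bedrock Agents via `boto3`) are stubbed out so only the provider-switching and the OpenAI-compatible response envelope are shown.

```python
import os
import time
import uuid

def call_openai(messages, model):
    # Stub: a real implementation would POST the request to OpenAI's
    # chat completions API and return the assistant message content.
    return "openai-reply"

def call_bedrock_agent(messages, model):
    # Stub: a real implementation would invoke a Bedrock Agent
    # (e.g. via the bedrock-agent-runtime service in boto3).
    return "bedrock-reply"

# Provider registry: adding a backend means adding one entry here.
PROVIDERS = {"openai": call_openai, "bedrock": call_bedrock_agent}

def chat_completion(body, provider=None):
    """Route an OpenAI-style request body to the configured provider
    and wrap the reply in an OpenAI-compatible response envelope."""
    provider = provider or os.environ.get("LLM_PROVIDER", "openai")
    reply = PROVIDERS[provider](body["messages"], body.get("model", ""))
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": body.get("model", provider),
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": reply},
            "finish_reason": "stop",
        }],
    }
```

Because the envelope matches the OpenAI chat completions shape, the Voice Agent (or any OpenAI client) can point at the proxy's `/v1/chat/completions` route unchanged while the `LLM_PROVIDER` setting selects the backend.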

Credentials likely needed

  • DEEPGRAM_API_KEY
  • OPENAI_API_KEY
  • AWS credentials (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION) for Bedrock
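A provider-switching setup for the credentials above might look like the following `.env` sketch. The `LLM_PROVIDER` variable is a hypothetical switch (not named in the original request); the other variable names come from the list above.

```
# Hypothetical .env for the proxy; values are placeholders.
LLM_PROVIDER=bedrock          # assumed switch: "openai" or "bedrock"

DEEPGRAM_API_KEY=dg_...
OPENAI_API_KEY=sk-...

# Only needed when LLM_PROVIDER=bedrock
AWS_ACCESS_KEY_ID=AKIA...
AWS_SECRET_ACCESS_KEY=...
AWS_REGION=us-east-1
```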

Original request:

What's on your mind?

An OpenAI-compatible multi-provider chat completions proxy for the Voice Agent API, supporting Amazon Bedrock Agents and OpenAI with easy provider switching. This enables developers to swap LLM backends behind a Voice Agent deployment without changing application code.

Any extra context? (optional)

No response

Metadata

Assignees

No one assigned

    Labels

    action:generate (Action: ready for code generation)
    priority:user (User-submitted suggestion; builds before bot-queued examples)
    queue:new-example (Queue: build a new example)
