Introducing openai-compat provider #423
OpenAI-Compatible Provider
Objective
Ax should support any OpenAI-compatible endpoint (Groq, Cerebras, Vercel AI Gateway, Fireworks, custom proxies, etc.) without forcing bespoke adapters. Contributors need a reusable transport that:
- Accepts base_url, API key, and optional extra headers/query params required by gateways.
- Surfaces provider quirks (e.g., Groq rejecting logit_bias, Cerebras disallowing certain penalties) so callers can adjust prompts/config.
References
- src/ax/ai/openai/api.ts (serializer)
- src/ax/ai/openrouter/api.ts (shows how to reuse the OpenAI base)
- src/ax/ai/wrap.ts (factory)
Implementation Plan
- Add AxAIOpenAICompatible in src/ax/ai/openai-compatible/api.ts. It extends AxAIOpenAIBase but accepts endpoint, apiKey, optional extraHeaders, an optional supportFor override, and an optional providerName.
- Default config mirrors axAIOpenAIDefaultConfig() but allows overrides.
- Auth: send Authorization: Bearer <apiKey> plus optional gateway headers (HTTP-Referer, etc.).
- Extend the AxAIArgs union + constructor switch (src/ax/ai/wrap.ts) to recognize name: 'openai-compatible'.
- Export the new provider from src/ax/ai/index.ts.
- README.md (and llm.txt via docs) gets a "Using OpenAI-Compatible Providers" section with Groq + Cerebras + Vercel snippets, e.g. ai({ name: 'openai-compatible', endpoint: 'https://api.groq.com/openai/v1', config: { model: 'groq/llama3-70b-8192' } }).
- Add an example src/examples/openai-compatible.ts showing environment variable usage (AI_COMPAT_API_URL, AI_COMPAT_API_KEY).
- Add src/ax/ai/openai-compatible/api.test.ts verifying that requests are wired through apiCall to the configured endpoint and that the supportFor override works.
- Note the change in CHANGELOG.md under "Unreleased" or the upcoming version.
Open Questions / Follow-ups
- Should the adapter validate unsupported params (e.g., logit_bias while targeting Groq)? For scope we'll document the limitations but not enforce them, while leaving room for per-provider presets via models[].
- For streaming, check stream_options handling and confirm compatibility.

Next steps: implement adapter + tests + docs, then update this file with outcomes and verification notes.
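The planned constructor surface and header wiring can be sketched as follows. This is a minimal sketch under stated assumptions: the real adapter extends AxAIOpenAIBase, and the helpers here (buildCompatHeaders, mergeSupportFor) plus the Features shape are hypothetical illustrations, not the actual Ax API.

```typescript
// Hypothetical sketch of the planned adapter surface. The argument names
// (endpoint, apiKey, extraHeaders, supportFor, providerName) come from the
// plan above; the helpers and types are illustrative only.
interface Features {
  functions: boolean;
  streaming: boolean;
}

interface CompatArgs {
  endpoint: string; // e.g. 'https://api.groq.com/openai/v1'
  apiKey: string;
  extraHeaders?: Record<string, string>;
  supportFor?: Partial<Features>; // capability override
  providerName?: string; // used for errors/telemetry
}

// Assumed default: base OpenAI capabilities, patched per provider.
const defaultFeatures: Features = { functions: true, streaming: true };

function mergeSupportFor(override?: Partial<Features>): Features {
  return { ...defaultFeatures, ...override };
}

function buildCompatHeaders(args: CompatArgs): Record<string, string> {
  return {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${args.apiKey}`,
    // Gateway-specific headers (HTTP-Referer, etc.) win on conflict.
    ...args.extraHeaders,
  };
}
```

Spreading extraHeaders last means gateways that require overriding a default header can do so without the adapter special-casing them.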
Implementation Status
✅ OpenAI-Compatible Adapter
- Added AxAIOpenAICompatible in src/ax/ai/openai-compatible/api.ts, reusing the OpenAI serializer but allowing an arbitrary endpoint, headers, providerName, and capability overrides.
- Wired it into the ai() factory + AxAIArgs union (src/ax/ai/wrap.ts) so ai({ name: 'openai-compatible', ... }) just works.
- Ran npm run build:index -w src/ax so AxAIOpenAICompatible is publicly available for direct imports.
✅ Tooling & Config
- Added vitest.config.ts files under src/ax/ and src/tools/ to pin Vitest to the workspace root and avoid leaking a user-level vite.config.ts.
✅ Docs & Examples
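The example script in this section reads its connection details from environment variables. A minimal sketch of that handling (the helper name readCompatEnv is hypothetical; the variable names are the ones the example uses):

```typescript
// Illustrative sketch of the example's env handling: read the endpoint and
// key, fail fast with a clear message if either is missing, and pass the
// optional provider header through. Helper name is hypothetical.
function readCompatEnv(
  env: Record<string, string | undefined>
): { url: string; key: string; providerHeader?: string } {
  const url = env.AI_COMPAT_API_URL;
  const key = env.AI_COMPAT_API_KEY;
  if (!url || !key) {
    throw new Error('Set AI_COMPAT_API_URL and AI_COMPAT_API_KEY');
  }
  return { url, key, providerHeader: env.AI_COMPAT_PROVIDER_HEADER };
}
```

Failing fast here keeps the example's error message actionable instead of surfacing an opaque HTTP 401 later.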
- Added src/examples/openai-compatible.ts, reading environment variables (AI_COMPAT_API_URL, AI_COMPAT_API_KEY, optional AI_COMPAT_PROVIDER_HEADER) and logging the response.
✅ Tests
- Unit tests in src/ax/ai/openai-compatible/api.test.ts ensure endpoint/header wiring + capability overrides.
- Extended src/ax/ai/integration.test.ts to cover the openai-compatible branch of the factory.
- Full suites pass under src/ax and src/tools thanks to the pinned configs.
✅ Changelog
- Entry added (CHANGELOG.md).
Testing Evidence
Commands executed in repo root unless noted:
- npm run test:type-check -w src/examples – ensured the new example type-checks.
- npm run test:unit -w src/ax – ran the full Ax package Vitest suite (69 files / 808 tests).
- npm run test:unit -w src/tools – ran the tools package Vitest suite (2 files / 4 tests).
Follow-ups
- Consider warning when unsupported params (logit_bias, presence_penalty, etc.) are set while targeting a custom model preset.
- Consider per-provider presets via models[] to simplify multi-model routing (not required for this PR).
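The warning follow-up could look roughly like this. The per-provider block-lists are assumptions drawn from the quirks noted earlier in this doc (Groq and logit_bias, Cerebras and penalties), not a vetted compatibility matrix, and the function name is hypothetical:

```typescript
// Sketch of the proposed warning: check a per-provider list of params the
// target is known to reject and surface them without failing the call.
// The lists below are illustrative placeholders, not verified behavior.
const unsupportedByProvider: Record<string, readonly string[]> = {
  groq: ['logit_bias'],
  cerebras: ['presence_penalty', 'frequency_penalty'],
};

function findUnsupportedParams(
  provider: string,
  config: Record<string, unknown>
): string[] {
  const blocked = unsupportedByProvider[provider] ?? [];
  return blocked.filter((p) => config[p] !== undefined);
}

// A caller would log rather than throw, keeping the adapter permissive:
// const bad = findUnsupportedParams('groq', requestConfig);
// if (bad.length) console.warn(`Params likely ignored by groq: ${bad.join(', ')}`);
```

Warning instead of throwing matches the scope decision above: document limitations, don't enforce them.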