
Conversation

@HynoR (Contributor) commented Nov 15, 2025

The GPT-5 and GPT-5.1 series no longer support the parameters listed in the linked docs, so they are now forcibly rewritten to their default values, or cleared to zero values so they are not sent at all.
https://platform.openai.com/docs/guides/latest-model

Summary by CodeRabbit

  • Bug Fixes
    • Enhanced parameter compatibility for GPT-5 models to ensure reliable request processing.

@coderabbitai bot (Contributor) commented Nov 15, 2025

Walkthrough

The ConvertOpenAIRequest function in the OpenAI adaptor is modified to universally zero out unsupported parameters (Temperature, TopP, LogProbs) for gpt-5 family models, replacing a previous conditional approach based on exact model names.
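Concretely, a minimal sketch of the kind of normalization described here, assuming the adaptor detects the family with a gpt-5 prefix check (the struct below is a simplified stand-in with assumed JSON tags, not the real dto.GeneralOpenAIRequest):

package openai

import "strings"

// Simplified stand-in for the fields of dto.GeneralOpenAIRequest that this PR
// touches; the real struct lives in dto/openai_request.go, has many more
// fields, and its JSON tags are not shown in this PR, so the tags here are
// assumptions.
type gpt5Request struct {
	Model       string   `json:"model"`
	Temperature *float64 `json:"temperature,omitempty"`
	TopP        float64  `json:"top_p,omitempty"`
	LogProbs    bool     `json:"logprobs,omitempty"`
}

// normalizeForGPT5 mirrors the behaviour the walkthrough describes: for any
// gpt-5-family model, clear the sampling parameters the API rejects and rely
// on the JSON encoding to drop them from the outgoing request. The prefix
// check is an assumption about how the adaptor detects the model family.
func normalizeForGPT5(req *gpt5Request) {
	if strings.HasPrefix(req.Model, "gpt-5") {
		req.Temperature = nil
		req.TopP = 0
		req.LogProbs = false
	}
}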

Changes

Cohort / File(s): OpenAI parameter normalization for gpt-5 models (relay/channel/openai/adaptor.go)
Change Summary: Updated ConvertOpenAIRequest to unconditionally nullify the unsupported parameters (Temperature, TopP, LogProbs) for gpt-5-series models, removing the prior exact-model-name conditional logic.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

  • Single file with localized logic change to parameter handling
  • Straightforward nullification of incompatible fields for gpt-5 models
  • No exported signature modifications or cross-file dependencies

Poem

🐰 Five's the magic, gpt's the call,
Unsupported params? We zero them all!
TopP to naught, Temperature's bare,
LogProbs dismissed with utmost care—
Compatibility crafted, the adaptor now fair! ✨

Pre-merge checks

✅ Passed checks (3 passed)
  • Description Check: Passed. Check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check: Passed. The title accurately describes the main change: setting unsupported parameters to default/zero values for the gpt-5 model series to enforce proper request handling.
  • Docstring Coverage: Passed. No functions were found in the changed files, so docstring coverage was skipped.


@coderabbitai bot left a comment


Actionable comments posted: 1

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between fd38abd and c6125ec.

📒 Files selected for processing (1)
  • relay/channel/openai/adaptor.go (1 hunks)

Comment on lines +311 to +313
request.Temperature = nil
request.TopP = 0 // OpenAI's top_p defaults to 1.0, but to leave the field out entirely via omitempty we explicitly set it to 0 here
request.LogProbs = false

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Verify parameter omission strategy is correctly implemented.

There's an inconsistency in how unsupported parameters are being cleared:

  • Temperature is set to nil (line 311)
  • TopP is set to 0 (line 312)
  • LogProbs is set to false (line 313)

The comment on line 312 indicates the intent is to omit these parameters from the request via omitempty. However, this only works correctly if:

  1. For TopP = 0: the field's JSON tag must carry omitempty (in which case the zero value is dropped), or the field must be a pointer set to nil. If it is a plain float64 without omitempty, setting it to 0 serializes as "top_p": 0 instead of omitting it. Consider using nil like Temperature if the field is (or becomes) a pointer.
  2. For LogProbs = false: likewise, a plain bool without omitempty serializes as "logprobs": false rather than being omitted, unless the field is a pointer type (*bool) set to nil.

If the goal is to truly omit these parameters (as stated in the PR description and the line 312 comment), make sure the field types and tags support this approach, or set them to nil consistently.
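For reference, a minimal standalone Go sketch (not taken from this repository, with illustrative struct tags) of how encoding/json treats each case; whether a zero value is actually dropped hinges on whether the field's tag carries omitempty, which is what the checks below are meant to confirm:

package main

import (
	"encoding/json"
	"fmt"
)

// Illustrative tags only; not copied from dto/openai_request.go.
type params struct {
	Temperature *float64 `json:"temperature,omitempty"` // pointer + omitempty: nil is dropped
	TopP        float64  `json:"top_p,omitempty"`       // value + omitempty: 0 is dropped
	LogProbs    bool     `json:"logprobs"`              // no omitempty: false is always sent
}

func main() {
	b, _ := json.Marshal(params{Temperature: nil, TopP: 0, LogProbs: false})
	fmt.Println(string(b)) // prints {"logprobs":false}
}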

Run the following script to verify the field types in the request struct:

Additionally, verify the OpenAI documentation to confirm gpt-5 unsupported parameters:


🏁 Script executed:

#!/bin/bash
# Description: Check the field types for Temperature, TopP, and LogProbs in GeneralOpenAIRequest

# Search for the GeneralOpenAIRequest struct definition
ast-grep --pattern $'type GeneralOpenAIRequest struct {
  $$$
}'

# Also search for these specific field definitions
rg -n -A 2 "type GeneralOpenAIRequest struct" dto/
rg -n "Temperature|TopP|LogProbs" dto/ --type go -C 2

Length of output: 6459


🌐 Web query:

What parameters are not supported by OpenAI GPT-5 and GPT-5.1 models according to the latest documentation?

💡 Result:

Short answer — per the latest docs and dev reports, GPT‑5 / GPT‑5.1 no longer accept the old completion-style parameters; specifically they do NOT support:

  • max_tokens (must use max_completion_tokens instead). [1][2]
  • temperature and top_p. [2][3]
  • logit_bias. [1][2]

Other parameters reported as unsupported or limited for the new/optimized models include: n, stream (or streaming true for some optimized variants), user, metadata, modalities/audio-related params, and similar legacy completion fields — check per-model notes because availability varies by model/variant. [1][3][4]

Sources:
[1] Compatibility/parameter notes (community doc) listing unsupported params including logit_bias, max_tokens, n, user, etc.
[2] Multiple GitHub issues and project reports describing GPT‑5 rejecting max_tokens and temperature (error: “Unsupported parameter …”).
[3] OpenAI/structured-model docs and model notes showing optimized models require max_completion_tokens and limit parameters (stream/temperature/top_p restrictions).
[4] Microsoft/Azure Q&A noting reasoning/verbosity parameters are only available for reasoning-enabled GPT‑5 deployments (standard GPT‑5 chat rejects them).

If you want, I can fetch and paste the exact error messages and links to the specific documentation pages (with direct citations).


Fix field types to properly omit unsupported parameters for GPT-5 models.

The verification confirms the original concern. The code attempts to omit unsupported parameters using omitempty, but this approach is flawed for two fields:

Field Type Analysis:

  • Temperature is *float64 — setting to nil correctly omits it ✓
  • TopP is float64 — setting to 0 will serialize as "top_p": 0 and send to API ✗
  • LogProbs is bool — setting to false will serialize as "logprobs": false and send to API ✗

GPT-5 and GPT-5.1 do not support temperature, top_p, and logprobs, so these parameters must be properly omitted from the serialized request. Because a nil pointer is dropped by omitempty unambiguously (whereas a value field is only dropped when its tag has omitempty and its value is the zero value), the TopP and LogProbs fields should be changed to pointer types (*float64 and *bool) in dto/openai_request.go (lines 39 and 56), and the assignments in relay/channel/openai/adaptor.go should be updated to set them to nil instead of 0 and false.
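A rough sketch of the direction this suggests, with assumed field names and tags rather than a verified diff against the repository:

// Sketch of the suggested change only; the real field names, tags, and line
// numbers in dto/openai_request.go may differ, and the adaptor change is
// shown as comments rather than a verified diff.
package dto

type GeneralOpenAIRequest struct {
	Temperature *float64 `json:"temperature,omitempty"`
	TopP        *float64 `json:"top_p,omitempty"`    // was float64
	LogProbs    *bool    `json:"logprobs,omitempty"` // was bool
	// ... remaining fields unchanged
}

// In relay/channel/openai/adaptor.go the gpt-5 branch would then read:
//
//	request.Temperature = nil
//	request.TopP = nil
//	request.LogProbs = nil

One tradeoff to weigh: turning these fields into pointers touches every call site that currently reads request.TopP or request.LogProbs as plain values, so the churn is larger than the three assignments reviewed here.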

🤖 Prompt for AI Agents
In relay/channel/openai/adaptor.go around lines 311-313 and
dto/openai_request.go at lines ~39 and ~56, TopP and LogProbs are non-pointer
types so omitempty cannot drop them; change dto/openai_request.go so TopP is
*float64 and LogProbs is *bool (leave Temperature as *float64), then update
relay/channel/openai/adaptor.go to assign request.TopP = nil and
request.LogProbs = nil (instead of 0 and false) so these fields are omitted from
JSON for GPT-5/5.1.

