fix: Set default to unsupported value for gpt-5 model series requests #2229
base: main
Conversation
Walkthrough
Estimated code review effort🎯 2 (Simple) | ⏱️ ~10 minutes
Pre-merge checks: ✅ 3 passed
```go
request.Temperature = nil
request.TopP = 0 // OpenAI's default top_p is 1.0, but it is explicitly set to 0 here so that omitempty drops the field instead of sending it
request.LogProbs = false
```
🧩 Analysis chain
Verify parameter omission strategy is correctly implemented.
There's an inconsistency in how unsupported parameters are being cleared:
- `Temperature` is set to `nil` (line 311)
- `TopP` is set to `0` (line 312)
- `LogProbs` is set to `false` (line 313)
The comment on line 312 indicates the intent is to omit these parameters from the request via `omitempty`. Whether this works depends on the field types and their JSON tags:

- For `TopP = 0`: if the field is a plain `float64` whose tag carries `omitempty`, a value of `0` is in fact omitted (Go's `omitempty` drops zero values of value types, not only `nil` pointers), but the field can then never carry an explicit `"top_p": 0`. If the tag lacks `omitempty`, setting it to `0` serializes as `"top_p": 0` in JSON rather than omitting it. A pointer type (`*float64`) set to `nil`, like `Temperature`, avoids both pitfalls.
- For `LogProbs = false`: similarly, a plain `bool` without `omitempty` serializes as `"logprobs": false` rather than being omitted, and with `omitempty` an explicit `false` can never be sent; a pointer type (`*bool`) distinguishes the two cases.

If the goal is to truly omit these parameters (as stated in the PR description and the line 312 comment), ensure the field types and tags support this approach, or set the fields to `nil` consistently.
Run the following script to verify the field types in the request struct:
Additionally, verify the OpenAI documentation to confirm gpt-5 unsupported parameters:
🏁 Script executed:

```shell
#!/bin/bash
# Description: Check the field types for Temperature, TopP, and LogProbs in GeneralOpenAIRequest

# Search for the GeneralOpenAIRequest struct definition
ast-grep --pattern $'type GeneralOpenAIRequest struct {
  $$$
}'

# Also search for these specific field definitions
rg -n -A 2 "type GeneralOpenAIRequest struct" dto/
rg -n "Temperature|TopP|LogProbs" dto/ --type go -C 2
```

Length of output: 6459
🌐 Web query:
What parameters are not supported by OpenAI GPT-5 and GPT-5.1 models according to the latest documentation?
💡 Result:
Short answer — per the latest docs and dev reports, GPT‑5 / GPT‑5.1 no longer accept the old completion-style parameters; specifically they do NOT support:
- max_tokens (must use max_completion_tokens instead). [1][2]
- temperature and top_p. [2][3]
- logit_bias. [1][2]
Other parameters reported as unsupported or limited for the new/optimized models include: n, stream (or streaming true for some optimized variants), user, metadata, modalities/audio-related params, and similar legacy completion fields — check per-model notes because availability varies by model/variant. [1][3][4]
Sources:
[1] Compatibility/parameter notes (community doc) listing unsupported params including logit_bias, max_tokens, n, user, etc.
[2] Multiple GitHub issues and project reports describing GPT‑5 rejecting max_tokens and temperature (error: “Unsupported parameter …”).
[3] OpenAI/structured-model docs and model notes showing optimized models require max_completion_tokens and limit parameters (stream/temperature/top_p restrictions).
[4] Microsoft/Azure Q&A noting reasoning/verbosity parameters are only available for reasoning-enabled GPT‑5 deployments (standard GPT‑5 chat rejects them).
If you want, I can fetch and paste the exact error messages and links to the specific documentation pages (with direct citations).
Fix field types to properly omit unsupported parameters for GPT-5 models.
The verification confirms the original concern. The code attempts to omit unsupported parameters via `omitempty`, but this approach is flawed for two fields:
Field Type Analysis:
- `Temperature` is `*float64`: setting it to `nil` correctly omits it ✓
- `TopP` is `float64`: setting it to `0` will serialize as `"top_p": 0` and be sent to the API ✗
- `LogProbs` is `bool`: setting it to `false` will serialize as `"logprobs": false` and be sent to the API ✗
GPT-5 and GPT-5.1 do not support `temperature`, `top_p`, and `logprobs`, so these parameters must be properly omitted. Since only a `nil` pointer both omits the field and keeps an explicit zero value representable, the `TopP` and `LogProbs` fields should be changed to pointer types (`*float64` and `*bool`) in dto/openai_request.go (lines 39 and 56), and the assignments in relay/channel/openai/adaptor.go should be updated to set them to `nil` instead of `0` and `false`.
🤖 Prompt for AI Agents
In relay/channel/openai/adaptor.go around lines 311-313 and
dto/openai_request.go at lines ~39 and ~56, TopP and LogProbs are non-pointer
types so omitempty cannot drop them; change dto/openai_request.go so TopP is
*float64 and LogProbs is *bool (leave Temperature as *float64), then update
relay/channel/openai/adaptor.go to assign request.TopP = nil and
request.LogProbs = nil (instead of 0 and false) so these fields are omitted from
JSON for GPT-5/5.1.
The GPT-5 and GPT-5.1 series no longer support the following parameters, so they are forcibly rewritten to their default values, or set to zero values so that they are not sent.

https://platform.openai.com/docs/guides/latest-model