Merged
7 changes: 4 additions & 3 deletions relay/channel/openai/adaptor.go
Expand Up @@ -306,10 +306,11 @@ func (a *Adaptor) ConvertOpenAIRequest(c *gin.Context, info *relaycommon.RelayIn
 		request.Temperature = nil
 	}
 
+	// GPT-5 series adaptation: zero out parameters that are no longer supported
 	if strings.HasPrefix(info.UpstreamModelName, "gpt-5") {
-		if info.UpstreamModelName != "gpt-5-chat-latest" {
-			request.Temperature = nil
-		}
+		request.Temperature = nil
+		request.TopP = 0 // OpenAI's default top_p is 1.0, but to rely on omitempty and not send it at all, explicitly set it to 0 here
+		request.LogProbs = false
Comment on lines +311 to +313

@coderabbitai coderabbitai bot · Contributor · Nov 15, 2025
⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

Verify parameter omission strategy is correctly implemented.

There's an inconsistency in how unsupported parameters are being cleared:

  • Temperature is set to nil (line 311)
  • TopP is set to 0 (line 312)
  • LogProbs is set to false (line 313)

The comment on line 312 indicates the intent is to omit these parameters from the request using omitempty. However, this only works correctly if:

  1. For TopP = 0: The field must be a pointer type (*float64) for omitempty to omit zero values. If it's a regular float64, setting it to 0 will serialize as "top_p": 0 in JSON, not omit it. Consider using nil like Temperature if it's a pointer.
  2. For LogProbs = false: Similarly, if this is a bool, setting to false will serialize as "logprobs": false rather than omitting it, unless it's a pointer type (*bool).

If the goal is to truly omit these parameters (as stated in the PR description and line 312 comment), ensure the field types support this approach or set them to nil consistently.

Run the following script to verify the field types in the request struct:

Additionally, verify the OpenAI documentation to confirm gpt-5 unsupported parameters:


🏁 Script executed:

#!/bin/bash
# Description: Check the field types for Temperature, TopP, and LogProbs in GeneralOpenAIRequest

# Search for the GeneralOpenAIRequest struct definition
ast-grep --pattern $'type GeneralOpenAIRequest struct {
  $$$
}'

# Also search for these specific field definitions
rg -n -A 2 "type GeneralOpenAIRequest struct" dto/
rg -n "Temperature|TopP|LogProbs" dto/ --type go -C 2

Length of output: 6459


🌐 Web query:

What parameters are not supported by OpenAI GPT-5 and GPT-5.1 models according to the latest documentation?

💡 Result:

Short answer — per the latest docs and dev reports, GPT‑5 / GPT‑5.1 no longer accept the old completion-style parameters; specifically they do NOT support:

  • max_tokens (must use max_completion_tokens instead). [1][2]
  • temperature and top_p. [2][3]
  • logit_bias. [1][2]

Other parameters reported as unsupported or limited for the new/optimized models include: n, stream (or streaming true for some optimized variants), user, metadata, modalities/audio-related params, and similar legacy completion fields — check per-model notes because availability varies by model/variant. [1][3][4]

Sources:
[1] Compatibility/parameter notes (community doc) listing unsupported params including logit_bias, max_tokens, n, user, etc.
[2] Multiple GitHub issues and project reports describing GPT‑5 rejecting max_tokens and temperature (error: “Unsupported parameter …”).
[3] OpenAI/structured-model docs and model notes showing optimized models require max_completion_tokens and limit parameters (stream/temperature/top_p restrictions).
[4] Microsoft/Azure Q&A noting reasoning/verbosity parameters are only available for reasoning-enabled GPT‑5 deployments (standard GPT‑5 chat rejects them).

If you want, I can fetch and paste the exact error messages and links to the specific documentation pages (with direct citations).


Fix field types to properly omit unsupported parameters for GPT-5 models.

The verification confirms the original concern. The code attempts to omit unsupported parameters using omitempty, but this approach is flawed for two fields:

Field Type Analysis:

  • Temperature is *float64 — setting to nil correctly omits it ✓
  • TopP is float64 — setting to 0 will serialize as "top_p": 0 and send to API ✗
  • LogProbs is bool — setting to false will serialize as "logprobs": false and send to API ✗

GPT-5 and GPT-5.1 do not support temperature, top_p, and logprobs, so these parameters must be properly omitted. Since omitempty only works with nil values on pointer types, TopP and LogProbs fields should be changed to pointer types (*float64 and *bool) in dto/openai_request.go (lines 39 and 56), and the assignments in relay/channel/openai/adaptor.go should be updated to set them to nil instead of 0 and false.

🤖 Prompt for AI Agents
In relay/channel/openai/adaptor.go around lines 311-313 and
dto/openai_request.go at lines ~39 and ~56, TopP and LogProbs are non-pointer
types so omitempty cannot drop them; change dto/openai_request.go so TopP is
*float64 and LogProbs is *bool (leave Temperature as *float64), then update
relay/channel/openai/adaptor.go to assign request.TopP = nil and
request.LogProbs = nil (instead of 0 and false) so these fields are omitted from
JSON for GPT-5/5.1.

Collaborator

In Go, a zero value and an omitted field are not equivalent, so I have doubts about whether this will actually take effect.

Contributor
🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Check the actual field types for TopP, LogProbs, and Temperature in GeneralOpenAIRequest
rg -n "type GeneralOpenAIRequest struct" dto/ -A 100 | head -120

Length of output: 7513


@seefs001 Your doubt is exactly right! The current implementation does have a problem.

In Go, the behavior of the omitempty tag depends on the field type.

Current field types:

  • Temperature *float64 (pointer type) - set to nil ✅ the field is omitted
  • TopP float64 (non-pointer type) - set to 0, it serializes as "top_p": 0
  • LogProbs bool (non-pointer type) - set to false, it serializes as "logprobs": false

The problem: for non-pointer types, omitempty does not drop the zero values (0, false); they are still sent to the API. This means the current code sends:

{
  "top_p": 0,
  "logprobs": false
}

instead of omitting these parameters entirely.

The correct fix:

Change the field definitions in dto/openai_request.go to pointer types:

// Line 39
TopP *float64 `json:"top_p,omitempty"`

// Line 56  
LogProbs *bool `json:"logprobs,omitempty"`

Then set them to nil in relay/channel/openai/adaptor.go:

if strings.HasPrefix(info.UpstreamModelName, "gpt-5") {
    request.Temperature = nil
    request.TopP = nil        // instead of 0
    request.LogProbs = nil    // instead of false
}

Only then are these parameters guaranteed not to be sent to the OpenAI API.


🧬 Code Graph Analysis Results
  1. Filename: dto/openai_request.go, lines 14-17
ResponseFormat struct {
	Type       string          `json:"type,omitempty"`
	JsonSchema json.RawMessage `json:"json_schema,omitempty"`
}
  2. Filename: dto/openai_request.go, lines 251-253
StreamOptions struct {
	IncludeUsage bool `json:"include_usage,omitempty"`
}
  3. Filename: relay/channel/openrouter/dto.go, lines 5-11
RequestReasoning struct {
	// One of the following (not both):
	Effort    string `json:"effort,omitempty"`     // Can be "high", "medium", or "low" (OpenAI-style)
	MaxTokens int    `json:"max_tokens,omitempty"` // Specific token limit (Anthropic-style)
	// Optional: Default is false. All models support this.
	Exclude bool `json:"exclude,omitempty"` // Set to true to exclude reasoning tokens from response
}

Contributor Author

> In Go, a zero value and an omitted field are not equivalent, so I have doubts about whether this will actually take effect.

Then should we change those fields to pointer types across the board and pass nil in?

Contributor
Seems like the humans are having a chat. I'll hop back into my burrow for now. If you need me again, just tag @coderabbitai in a new comment, and I'll come hopping out!

}

// Convert the model reasoning-effort suffix