[Agent] Gemini INVALID_ARGUMENT in Agent #26466

@pr-maia

Description

Self Checks

  • I have read the Contributing Guide and Language Policy.
  • This is only for bug report, if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report, otherwise it will be closed.
  • 【Chinese users & Non-English users】Please submit in English, otherwise the issue will be closed :)
  • Please do not modify this template :) and fill in all the required fields.

Dify version

1.9.1

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Gemini Plugin Version: 0.5.5

A workflow that was running just fine until this morning started throwing this error:

Run failed: Failed to transform agent message: req_id: 9d46b2ab18 PluginInvokeError: {"args":{},"error_type":"Exception","message":"read llm model failed: request failed: [google] Error: req_id: 75678b1c8a PluginInvokeError: {"args":{},"error_type":"ClientError","message":"400 INVALID_ARGUMENT. {'error': {'code': 400, 'message': 'The input token count (947054) exceeds the maximum number of tokens allowed (131072).', 'status': 'INVALID_ARGUMENT'}}"}"}

Two things caught my attention:

(1) The input token count (947k) is just below the limit for the Gemini 2.5 Pro model on the paid tier, yet the error cites the 131k limit from the free trial.

(2) The error status is INVALID_ARGUMENT rather than a quota or rate-limit error.

I wrote a small standalone Python script that sends an equivalent number of input tokens directly to the model, and it answered fine with no errors.
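For reference, a minimal sketch of the kind of standalone check described above, assuming the google-genai SDK (`pip install google-genai`), a `GEMINI_API_KEY` environment variable, and the `gemini-2.5-pro` model id; the helper names and the ~4-chars-per-token heuristic are illustrative, not from the original script:

```python
# Standalone check (sketch): send a prompt of comparable size directly to the
# Gemini API, bypassing Dify, to see whether the 131k limit is enforced there.
import os


def build_large_prompt(target_tokens: int = 947_054) -> str:
    # Rough heuristic: ~4 characters per token for plain English text.
    chunk = "The quick brown fox jumps over the lazy dog. "
    approx_chars = target_tokens * 4
    return chunk * (approx_chars // len(chunk) + 1)


def main() -> None:
    from google import genai  # pip install google-genai

    client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])
    prompt = build_large_prompt()

    # count_tokens confirms the payload is really near the paid-tier limit
    # before attempting generation.
    counted = client.models.count_tokens(model="gemini-2.5-pro", contents=prompt)
    print("input tokens:", counted.total_tokens)

    response = client.models.generate_content(
        model="gemini-2.5-pro", contents=prompt
    )
    print(response.text[:200])


if __name__ == "__main__" and "GEMINI_API_KEY" in os.environ:
    main()
```

If this succeeds directly against the API with the same token count, the 131k cap in the error above would point at the Dify plugin layer rather than the Gemini backend.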

Perhaps there were changes in the Gemini API that need to be reflected in the Dify plugin.

✔️ Expected Behavior

The model should answer just as it does when run outside of Dify (via a standalone Python script).

❌ Actual Behavior

Run failed: Failed to transform agent message: req_id: 9d46b2ab18 PluginInvokeError: {"args":{},"error_type":"Exception","message":"read llm model failed: request failed: [google] Error: req_id: 75678b1c8a PluginInvokeError: {"args":{},"error_type":"ClientError","message":"400 INVALID_ARGUMENT. {'error': {'code': 400, 'message': 'The input token count (947054) exceeds the maximum number of tokens allowed (131072).', 'status': 'INVALID_ARGUMENT'}}"}"}
