What happened?
I encountered an error with the token counting API when using it with Claude Code. The error occurs when trying to count tokens for conversations that contain tool use from Claude Code sub-agents.
Steps to reproduce:
- Create a Claude Code sub-agent following the documentation
- Use the sub-agent to complete some tasks (which generates tool_use and tool_result messages)
- Call the token counting API endpoint /v1/messages/count_tokens?beta=true with the conversation data
Expected behavior: The API should successfully count tokens for all message types in the conversation, including those with tool use from Claude Code sub-agents.
Actual behavior: The API returns a 500 Internal Server Error with the message: "Invalid content type: <class 'dict'>. Expected str or dict."
The error seems to occur when processing the tool_result blocks in the messages, specifically when a block's content field is itself a list of content-block dictionaries (as produced by Claude Code sub-agents) rather than a plain string.
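For what it's worth, the traceback below points at litellm's own token counter rather than the proxy layer, so a direct call along these lines should hit the same code path. This is only an illustrative sketch (I did not run it in isolation), using a trimmed-down tool_result message from the sample request further down:

```python
from litellm import token_counter

# One user turn whose content list contains a tool_result block;
# the block's own "content" is itself a list of dicts, which is what
# Claude Code sub-agents produce.
messages = [
    {
        "role": "user",
        "content": [
            {
                "type": "tool_result",
                "tool_use_id": "toolu_01DmpZZhxcZqjyfybtkhFro8",
                "content": [
                    {"type": "text", "text": "## Analysis Summary..."}
                ],
            }
        ],
    }
]

# Expected: a token count.
# Observed via the proxy: ValueError: Invalid content type: <class 'dict'>.
#   Expected str or dict.
print(token_counter(model="claude-sonnet-4-20250514", messages=messages))
```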
Request details:
- Method: POST
- URL: https://DOMAIN/v1/messages/count_tokens?beta=true
Sample request body that triggers the error:
{
  "model": "claude-sonnet-4-20250514",
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "Use the GitHub action debugger sub-agent to check why the workflow at https://github.com/OWNER/REPO/actions/runs/RUNID failed to run."
        }
      ]
    },
    {
      "role": "assistant",
      "content": [
        {
          "type": "text",
          "text": "I'll use the GitHub Action debugger agent to analyze the failed workflow run and identify why it failed."
        },
        {
          "type": "tool_use",
          "id": "toolu_01DmpZZhxcZqjyfybtkhFro8",
          "name": "Task",
          "input": {
            "subagent_type": "github-action-debugger",
            "description": "Debug GitHub workflow failure",
            "prompt": "Please analyze the GitHub Actions workflow run at https://github.com/OWNER/REPO/actions/runs/RUNID to determine why it failed..."
          }
        }
      ]
    },
    {
      "role": "user",
      "content": [
        {
          "tool_use_id": "toolu_01DmpZZhxcZqjyfybtkhFro8",
          "type": "tool_result",
          "content": [
            {
              "type": "text",
              "text": "## Analysis Summary..."
            }
          ]
        }
      ]
    },
    {
      "role": "assistant",
      "content": [
        {
          "type": "text",
          "text": "The reason for the workflow failure is a network connection issue...."
        }
      ]
    }
  ],
  "tools": []
}
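The request was sent by Claude Code, but the error is reproducible with a plain HTTP client. A rough Python equivalent (DOMAIN, the proxy key, and the request.json path are placeholders; the payload is the sample body above):

```python
import json

import requests

# Sample request body from above, saved locally (placeholder path).
with open("request.json") as f:
    payload = json.load(f)

resp = requests.post(
    "https://DOMAIN/v1/messages/count_tokens?beta=true",
    headers={
        "Authorization": "Bearer sk-...",  # placeholder LiteLLM proxy key
        "Content-Type": "application/json",
    },
    json=payload,
)

print(resp.status_code)  # 500
print(resp.text)         # "Invalid content type: <class 'dict'>. Expected str or dict."
```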
Relevant log output
07:12:55 - LiteLLM Proxy:ERROR: endpoints.py:282 - litellm.proxy.anthropic_endpoints.count_tokens(): Exception occurred - Error getting number of tokens from content list: Invalid content type: <class 'dict'>. Expected str or dict., default_token_count=None
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/token_counter.py", line 597, in _count_content_list
raise ValueError(
f"Invalid content type: {type(c)}. Expected str or dict."
)
ValueError: Invalid content type: <class 'dict'>. Expected str or dict.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.13/site-packages/litellm/proxy/anthropic_endpoints/endpoints.py", line 266, in count_tokens
token_response = await internal_token_counter(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<2 lines>...
)
^
File "/usr/lib/python3.13/site-packages/litellm/proxy/proxy_server.py", line 5946, in token_counter
total_tokens = token_counter(
model=model_to_use,
...<2 lines>...
custom_tokenizer=_tokenizer_used, # type: ignore
)
File "/usr/lib/python3.13/site-packages/litellm/utils.py", line 1847, in token_counter
return token_counter_new(
model,
...<7 lines>...
default_token_count,
)
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/token_counter.py", line 397, in token_counter
num_tokens = _count_messages(
params, new_messages, use_default_image_token_count, default_token_count
)
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/token_counter.py", line 458, in _count_messages
num_tokens += _count_content_list(
~~~~~~~~~~~~~~~~~~~^
params.count_function,
^^^^^^^^^^^^^^^^^^^^^^
...<2 lines>...
default_token_count,
^^^^^^^^^^^^^^^^^^^^
)
^
File "/usr/lib/python3.13/site-packages/litellm/litellm_core_utils/token_counter.py", line 604, in _count_content_list
raise ValueError(
f"Error getting number of tokens from content list: {e}, default_token_count={default_token_count}"
)
ValueError: Error getting number of tokens from content list: Invalid content type: <class 'dict'>. Expected str or dict., default_token_count=None
INFO: 10.0.15.176:59794 - "POST /v1/messages/count_tokens?beta=true HTTP/1.1" 500 Internal Server Error
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
v1.76.1.rc.1
Twitter / LinkedIn details
No response