Conversation

@riunyfir
Contributor

  • Description:
    In Python 3.10, an asynchronous generator may not return control to the event loop immediately after a `yield`, so the loop cannot run other tasks in a timely manner, which degrades the real-time behavior of streaming. Python 3.12 optimized asynchronous generator scheduling, so the same code behaves correctly there.
    Concretely, when calling `agent.astream()` or a similar asynchronous streaming method on Python 3.10, you do not receive each token in real time: all tokens arrive at once after the message is fully generated. On Python 3.12, the same code receives tokens one by one as expected.
    The fix is to add `await asyncio.sleep(0)` after each `yield` statement in the asynchronous generator, explicitly yielding control so the event loop can process streaming data promptly.
  • **Issue:** Fixes astream is not sending token in python 3.10 #33903
  • Dependencies: None
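The fix described above can be sketched as follows. This is a minimal illustration with a hypothetical `stream_tokens` generator, not the actual langchain-openai code; it only demonstrates the `await asyncio.sleep(0)` pattern:

```python
import asyncio


async def stream_tokens(tokens):
    """Hypothetical async generator illustrating the fix.

    On Python 3.10, without the sleep(0), the generator may not hand
    control back to the event loop after each yield, so a consumer can
    end up seeing all tokens in one burst instead of one at a time.
    """
    for token in tokens:
        yield token
        # Explicitly yield control so the event loop can run other
        # tasks (e.g. flushing the token to the client) between yields.
        await asyncio.sleep(0)


async def main():
    received = []
    async for token in stream_tokens(["Hello", " ", "world"]):
        received.append(token)
    return received


print(asyncio.run(main()))
```

`asyncio.sleep(0)` is the idiomatic way to cooperatively yield one event-loop iteration without introducing a real delay, which is why it is preferred over a small nonzero sleep here.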

@github-actions github-actions bot added integration Related to a provider partner package integration openai fix labels Nov 10, 2025
Collaborator

@ccurme ccurme left a comment


The originating issue seems to relate to ChatOllama; why are we modifying langchain-openai here?

@parth-dobariya-hdrs

Hi @ccurme, I also observed that this issue (#33907) is not provider-specific. I tried both ChatOllama and ChatGroq, and the issue persisted with both providers.

@jgasparetti

I tested ChatOllama and ChatOpenAI, and both suffer from the same issue.

