Fix magistral streaming to emit reasoning chunks #16434
Merged · +139 −3
Title
Fix magistral streaming to emit reasoning chunks
Relevant issues
Fixes #16272
Pre-Submission checklist
- [x] I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement - see details); a minimal sketch of such a streaming test follows below
- [x] I have added a screenshot of my new test passing locally (pytest output attached in PR description)
- [x] My PR passes all unit tests on make test-unit (pytest litellm/tests/test_litellm/llms/mistral/test_mistral_chat_transformation.py)
- [x] My PR's scope is as isolated as possible: it only solves 1 specific problem
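For context, here is a minimal, self-contained sketch of the kind of streaming test the checklist refers to (not the PR's actual test): it fakes a streamed response and checks that reasoning chunks surface separately from regular content. The reasoning_content field name follows this PR's description; the helper and the chunk shapes are hypothetical.

```python
def collect_stream(chunks):
    """Split a fake chunk stream into reasoning text and answer text."""
    reasoning, content = [], []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        if delta.get("reasoning_content"):
            reasoning.append(delta["reasoning_content"])
        if delta.get("content"):
            content.append(delta["content"])
    return "".join(reasoning), "".join(content)


def test_reasoning_chunks_are_separated():
    fake_chunks = [
        {"choices": [{"delta": {"reasoning_content": "Let me think."}}]},
        {"choices": [{"delta": {"content": "Final answer."}}]},
    ]
    reasoning, content = collect_stream(fake_chunks)
    assert reasoning == "Let me think."
    assert content == "Final answer."
```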
Type
🐛 Bug Fix
Changes
- Added a mistral/magistral-medium-2509 pricing entry in model_prices_and_context_window*.json
- Emit thinking_blocks + reasoning_content (with signatures) in litellm/llms/mistral/chat/transformation.py
- Updated tests/test_litellm/llms/mistral/test_mistral_chat_transformation.py to cover streaming reasoning output
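With this change, streamed reasoning should be consumable roughly as in the sketch below. This is an illustrative usage example assuming the reasoning_content delta field described above, not code from the PR itself.

```python
import litellm

stream = litellm.completion(
    model="mistral/magistral-medium-2509",
    messages=[{"role": "user", "content": "What is 17 * 24?"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta
    # Reasoning tokens arrive on a separate delta field from the final answer.
    reasoning = getattr(delta, "reasoning_content", None)
    if reasoning:
        print(f"[reasoning] {reasoning}", end="", flush=True)
    if delta.content:
        print(delta.content, end="", flush=True)
```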