feat(core): Capture raw provider response #33922
Summary
Implementation of #33884
Design Decisions
1. Configuration Location
`include_raw_response` lives in the model wrapper classes, NOT in the core message classes.

2. Data Structure Flexibility

`raw_response` is typed `dict[str, Any] | list[dict[str, Any]] | None`.

3. Stream Processing Strategy

Raw responses are merged in `add_ai_message_chunks()` only when the `chunk_position="last"` marker is received.

4. Backward Compatibility

`raw_response` defaults to `None`, so existing callers are unaffected.
Key Changes
1. Core Message System (`libs/core/langchain_core/messages/ai.py`)

AIMessage Class

- New field: `raw_response: dict[str, Any] | list[dict[str, Any]] | None = None`
- Updated `__init__()` to accept and store the `raw_response` parameter
- Updated the `dict()` method to include non-None `raw_response` values

AIMessageChunk Class

- New field: `raw_response: dict[str, Any] | None = None`

New Function: `add_ai_message_chunks()`

Aggregates multiple `AIMessageChunk` objects into a single final `AIMessage`. Key feature: `raw_response` merging happens once the `chunk_position="last"` marker is received.
2. Message Utilities (`libs/core/langchain_core/messages/utils.py`)

`message_chunk_to_message()` Function

- Copies `raw_response` from the chunk to the final message

3. Public API Exports (`libs/core/langchain_core/messages/__init__.py`)

- `__all__` list: added `"add_ai_message_chunks"`
- `_dynamic_imports` dictionary: added `"add_ai_message_chunks": "ai"`

4. OpenAI Integration (`libs/partners/openai/langchain_openai/chat_models/base.py`)

BaseChatOpenAI Class

New configuration field (line ~686): `include_raw_response`

Implementation locations:

- `_convert_chunk_to_generation_chunk()` (~line 1071): `if self.include_raw_response: message_chunk.raw_response = chunk`
- `_stream()` (~line 1274): `if self.include_raw_response and chunk_position == "last": ...`
- `_create_chat_result()` (~line 1440): `if self.include_raw_response and isinstance(message, AIMessage): ...`
- `_astream()` (~line 1529)
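The `_create_chat_result()` guard above can be illustrated in isolation. This is a minimal sketch with toy message classes, not the real `langchain_core` types; `attach_raw` is a hypothetical helper invented for the example:

```python
from typing import Any

class ToyAIMessage:
    def __init__(self, content: str) -> None:
        self.content = content
        self.raw_response: dict[str, Any] | None = None

class ToyHumanMessage:
    def __init__(self, content: str) -> None:
        self.content = content

def attach_raw(message: Any, response_dict: dict[str, Any], include_raw_response: bool) -> Any:
    # Mirrors the guard in _create_chat_result(): only AI messages receive
    # the raw provider payload, and only when the flag is enabled.
    if include_raw_response and isinstance(message, ToyAIMessage):
        message.raw_response = response_dict
    return message

ai = attach_raw(ToyAIMessage("hi"), {"id": "r1"}, include_raw_response=True)
assert ai.raw_response == {"id": "r1"}
human = attach_raw(ToyHumanMessage("hi"), {"id": "r1"}, include_raw_response=True)
assert not hasattr(human, "raw_response")
```

The double check matters: the flag keeps the feature opt-in, and the `isinstance` test keeps non-AI messages untouched.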
Scope
Currently, only the OpenAI model class has been adapted.
If this PR is accepted, similar adaptations will be implemented for other partner integrations (Anthropic, Google, Bedrock, etc.) in follow-up PRs.
Testing
Unit tests are added under `tests/unit_tests/core/` and `tests/unit_tests/partners/openai/`.