
langchain updates#207

Open
dixitaniket wants to merge 2 commits into main from ani/langchain-changes

Conversation

@dixitaniket
Collaborator

No description provided.

Contributor

Copilot AI left a comment


Pull request overview

Updates the OpenGradient LangChain adapter to support newer LangChain message/tooling formats, add async + streaming support, and broaden initialization options (client injection, model validation, additional LLM config).

Changes:

  • Added temperature, model string validation (provider/model), optional client injection, and lifecycle management (close/aclose) to the LangChain chat model adapter.
  • Refactored message/tool conversion (developer ChatMessage, safer tool-args parsing/serialization) and added streaming support via _stream/_astream.
  • Expanded test coverage for async generation, tool choice, streaming chunk conversion, and new validation behavior.
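The "model string validation (provider/model)" mentioned above could look roughly like the following minimal sketch. The helper name, return shape, and error message are assumptions for illustration, not the PR's actual code:

```python
def validate_model_string(model: str) -> tuple[str, str]:
    """Split a "provider/model" identifier and validate both parts.

    Hypothetical sketch of the validation described in this PR; the real
    adapter may use different names and stricter rules.
    """
    provider, sep, model_name = model.partition("/")
    if not sep or not provider or not model_name:
        raise ValueError(f"Expected model in 'provider/model' form, got {model!r}.")
    return provider, model_name
```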

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 1 comment.

| File | Description |
| --- | --- |
| `src/opengradient/agents/og_langchain.py` | Major adapter update: initialization options, tool binding via `convert_to_openai_tool`, async + streaming implementations, and message/tool parsing changes. |
| `src/opengradient/agents/__init__.py` | Updates the `langchain_adapter(...)` wrapper to accept new parameters (temperature, model alias, client injection, and LLM connection config). |
| `tests/langchain_adapter_test.py` | Adds/updates tests for new adapter behaviors (async `_agenerate`, tool choice, developer role messages, streaming chunk conversion, model validation). |


Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 3 out of 3 changed files in this pull request and generated 2 comments.



```python
        tool_dicts: List[Dict] = []
        strict = kwargs.get("strict")
        self._tools = [convert_to_openai_tool(tool, strict=strict) for tool in tools]
        self._tool_choice = tool_choice or kwargs.get("tool_choice")
```

Copilot AI Mar 26, 2026


bind_tools accepts tool_choice: Optional[Any] and _build_chat_kwargs forwards it directly to LLM.chat(...), but the SDK client currently types/uses tool_choice as an Optional[str] and serializes it into the request payload. If a caller passes an OpenAI-style dict tool_choice (which LangChain commonly supports), this adapter will send a dict to the backend and likely fail at runtime. Consider validating tool_choice to None | str (raising a clear ValueError otherwise), or explicitly translating supported LangChain/OpenAI dict forms into the string values your backend supports.

Suggested change

```diff
-        self._tool_choice = tool_choice or kwargs.get("tool_choice")
+        # Normalize tool_choice so that downstream clients always receive
+        # the string tool name (or None), matching the SDK expectation.
+        raw_tool_choice = tool_choice if tool_choice is not None else kwargs.get("tool_choice")
+        normalized_tool_choice: Optional[str]
+        if raw_tool_choice is None:
+            normalized_tool_choice = None
+        elif isinstance(raw_tool_choice, str):
+            normalized_tool_choice = raw_tool_choice
+        elif isinstance(raw_tool_choice, dict):
+            # Support OpenAI-style tool_choice dicts, e.g.:
+            # {"type": "function", "function": {"name": "my_tool", ...}}
+            function_block = raw_tool_choice.get("function")
+            if isinstance(function_block, dict):
+                function_name = function_block.get("name")
+                if isinstance(function_name, str):
+                    normalized_tool_choice = function_name
+                else:
+                    raise ValueError(
+                        "Invalid tool_choice dict: 'function.name' must be a string "
+                        f"(got {type(function_name).__name__})."
+                    )
+            else:
+                raise ValueError(
+                    "Invalid tool_choice dict: expected OpenAI-style "
+                    "{'type': 'function', 'function': {'name': '<tool_name>', ...}}."
+                )
+        else:
+            raise ValueError(
+                "tool_choice must be None, a string, or an OpenAI-style dict; "
+                f"got value of type {type(raw_tool_choice).__name__}."
+            )
+        self._tool_choice = normalized_tool_choice
```
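Extracted as a standalone helper with no SDK or adapter dependencies (function name is ours, not the PR's), the suggested normalization behaves like this:

```python
from typing import Any, Optional


def normalize_tool_choice(raw: Any) -> Optional[str]:
    """Reduce a LangChain/OpenAI-style tool_choice to the plain tool name.

    None and strings pass through unchanged; OpenAI-style dicts are reduced
    to their function name; anything else raises ValueError.
    """
    if raw is None:
        return None
    if isinstance(raw, str):
        return raw
    if isinstance(raw, dict):
        function_block = raw.get("function")
        if isinstance(function_block, dict) and isinstance(function_block.get("name"), str):
            return function_block["name"]
        raise ValueError(
            "Invalid tool_choice dict: expected OpenAI-style "
            "{'type': 'function', 'function': {'name': '<tool_name>', ...}}."
        )
    raise ValueError(
        f"tool_choice must be None, a string, or an OpenAI-style dict; got {type(raw).__name__}."
    )
```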

```python
    ) -> Dict[str, Any]:
        x402_settlement_mode = kwargs.get("x402_settlement_mode", self.x402_settlement_mode)
        if isinstance(x402_settlement_mode, str):
            x402_settlement_mode = x402SettlementMode(x402_settlement_mode)
```

Copilot AI Mar 26, 2026


_build_chat_kwargs converts a string x402_settlement_mode via x402SettlementMode(x402_settlement_mode), but the rest of the repo exposes user-facing values like batch-hashed and individual-full (see src/opengradient/cli.py mapping) rather than the enum values (batch/individual). Passing those common strings here will raise a ValueError. Consider accepting the CLI strings as aliases (mapping them to the enum) or tightening the adapter API to only accept x402SettlementMode to avoid surprising runtime errors.

Suggested change

```diff
-            x402_settlement_mode = x402SettlementMode(x402_settlement_mode)
+            # Accept both CLI-facing aliases (e.g. "batch-hashed", "individual-full")
+            # and the raw enum values (e.g. "batch", "individual").
+            normalized_mode = x402_settlement_mode.lower()
+            alias_map = {
+                "batch-hashed": x402SettlementMode.batch,
+                "individual-full": x402SettlementMode.individual,
+            }
+            if normalized_mode in alias_map:
+                x402_settlement_mode = alias_map[normalized_mode]
+            else:
+                # Fall back to the enum constructor for canonical enum string values.
+                x402_settlement_mode = x402SettlementMode(x402_settlement_mode)
```
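The same alias handling can be sketched against a stand-in enum (the real `x402SettlementMode` lives in the SDK, and its members may differ from this illustration):

```python
from enum import Enum


class SettlementMode(Enum):
    # Stand-in for the SDK's x402SettlementMode; actual members may differ.
    batch = "batch"
    individual = "individual"


# CLI-facing aliases mapped onto canonical enum members.
ALIASES = {
    "batch-hashed": SettlementMode.batch,
    "individual-full": SettlementMode.individual,
}


def parse_settlement_mode(value: str) -> SettlementMode:
    """Resolve CLI aliases first, then fall back to canonical enum values."""
    alias = ALIASES.get(value.lower())
    if alias is not None:
        return alias
    return SettlementMode(value)  # raises ValueError for unknown strings
```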

@dixitaniket dixitaniket changed the title [WIP] langchain updates langchain updates Mar 26, 2026
@dixitaniket dixitaniket marked this pull request as ready for review March 26, 2026 13:27