The following code:
```python
from langchain.chat_models import init_chat_model
from langchain_core.tools import tool, InjectedToolArg
from tavily import TavilyClient
from typing_extensions import Annotated, Literal

model = init_chat_model(model="kitsonk/watt-tool-8B", model_provider="ollama", api_key="ollama")

@tool(parse_docstring=True)
def tavily_search(
    query: str,
    max_results: Annotated[int, InjectedToolArg] = 3,
    topic: Annotated[Literal["general", "news", "finance"], InjectedToolArg] = "general",
) -> str:
    """Fetch results from Tavily search API with content summarization.

    Args:
        query: A single search query to execute
        max_results: Maximum number of results to return
        topic: Topic to filter results by ('general', 'news', 'finance')

    Returns:
        Formatted string of search results with summaries
    """
    tavily_client = TavilyClient()
    result = tavily_client.search(
        query,
        max_results=max_results,
        include_raw_content=True,
        topic=topic,
    )
    # Format output for consumption
    return result

model_with_tools = model.bind_tools([tavily_search])
```
gives the following error:
```
model_with_tools = model.bind_tools([tavily_search])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<my venv location>\Lib\site-packages\langchain_core\language_models\chat_models.py", line 1455, in bind_tools
    raise NotImplementedError
NotImplementedError
```
I initially tried the model llama3.1, which gave the same error. Since Ollama explicitly advertises tool support for kitsonk/watt-tool-8B, I tried that model above.

Do chat models in general not support tools with Ollama?
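For context on where the error originates: as the traceback shows, `bind_tools` on langchain_core's base chat model is a stub that raises `NotImplementedError`, so tool binding only works when the concrete provider class resolved for the model overrides it. A minimal, self-contained sketch of that pattern (the class names below are hypothetical stand-ins for illustration, not LangChain APIs):

```python
class BaseChatModelSketch:
    """Stand-in for a base chat model whose bind_tools is an unimplemented stub."""

    def bind_tools(self, tools):
        # Mirrors the stub seen in the traceback above
        raise NotImplementedError


class ToolCapableModelSketch(BaseChatModelSketch):
    """Stand-in for a provider integration that overrides bind_tools."""

    def bind_tools(self, tools):
        # A real integration would translate tools to the provider's schema;
        # here we just record them to show the override is what matters.
        self.tools = list(tools)
        return self


legacy = BaseChatModelSketch()
try:
    legacy.bind_tools([])
except NotImplementedError:
    print("base class: NotImplementedError")  # same failure mode as the traceback

capable = ToolCapableModelSketch().bind_tools([lambda q: q])
print(len(capable.tools))  # 1
```

So the question reduces to which concrete class `init_chat_model` resolves for `model_provider="ollama"` in the installed environment, and whether that class overrides `bind_tools`.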