Changes from all commits (37 commits)
f8d86b0  Remove third-party OpenAI libraries from C# and Rust SDKs (bmehta001, Apr 3, 2026)
92331ac  Update C# samples and docs to use new SDK types (bmehta001, Apr 3, 2026)
2b5106c  Revert enums to strings for Role, Type, and FinishReason (bmehta001, Apr 3, 2026)
94282ad  Fix CI failures and address Copilot review comments (bmehta001, Apr 3, 2026)
afa77f8  Merge branch 'main' into bhamehta/remove-third-party-libs (bmehta001, Apr 4, 2026)
20976e9  Add C# enums matching official .NET OpenAI SDK patterns; add Invalid … (bmehta001, Apr 4, 2026)
1c42d9e  Apply code formatting across C# SDK and Rust SDK (bmehta001, Apr 4, 2026)
2589fb0  Update docs and merge AudioTranscriptionRequest classes (bmehta001, Apr 4, 2026)
9cfa428  Fix stale API docs for ChatClient, AudioClient, and LiveAudioTranscri… (bmehta001, Apr 4, 2026)
dbc5583  Revert unrelated formatting changes to non-SDK files (bmehta001, Apr 4, 2026)
1d73650  Remove stale Id field from LiveAudioTranscriptionResponse table in C#… (bmehta001, Apr 4, 2026)
039fc85  Rename extension classes for clarity (bmehta001, Apr 4, 2026)
976d9fb  Restore 'apply our specific settings' comment in AudioTranscriptionEx… (bmehta001, Apr 4, 2026)
fa30c80  Update Rust api.md re-exported types to match actual exports (bmehta001, Apr 4, 2026)
8a7c84d  Address PR review: nullable Role for streaming deltas, robust ToolCho… (bmehta001, Apr 4, 2026)
b70bfcd  Add FunctionCall to C# FinishReason enum to match Rust and official O… (bmehta001, Apr 4, 2026)
b179f97  Add max_completion_tokens to Rust, Python, and JS SDKs (closes #576) (bmehta001, Apr 4, 2026)
4192909  Fix Rust clippy: remove useless String-to-String conversions (bmehta001, Apr 4, 2026)
5c677d6  Simplify Rust SDK: derive Serialize for settings, deduplicate From impls (bmehta001, Apr 4, 2026)
3f3efb8  Add refusal property to C# ChatMessage for cross-SDK parity (bmehta001, Apr 4, 2026)
b087f62  Make ToolCall.Index and AudioTranscriptionResponse.Duration nullable (bmehta001, Apr 4, 2026)
2d3f659  Rust review fixes: replace unwrap with expect, restore pub(crate) on … (bmehta001, Apr 4, 2026)
e666360  Clarify AudioTranscriptionRequest comment about PascalCase convention (bmehta001, Apr 4, 2026)
4289c9c  Add Developer role to ChatMessageRole in C# and Rust SDKs (bmehta001, Apr 4, 2026)
26f3406  Rename internal extension files for clarity (bmehta001, Apr 4, 2026)
3708b9b  Add token usage details and logprobs to C# and Rust SDKs (bmehta001, Apr 4, 2026)
de24f58  Remove forward-compat fields not yet emitted by Core (bmehta001, Apr 4, 2026)
20dc2aa  Remove refusal from C# and Rust SDKs — Core doesn't emit it (bmehta001, Apr 4, 2026)
f2613ab  Fix PR review comments: wire format alignment and null safety (bmehta001, Apr 4, 2026)
3fa1101  Remove redundant readonly modifiers from auto-properties (bmehta001, Apr 4, 2026)
78a964f  Align ToolChoice with official OpenAI .NET SDK patterns (bmehta001, Apr 4, 2026)
e775fb7  Merge branch 'main' into bhamehta/remove-third-party-libs (bmehta001, Apr 4, 2026)
c5dd389  Rename FinishReason -> ChatFinishReason, ToolType -> ChatToolKind (bmehta001, Apr 5, 2026)
f4b80ab  Rename CreatedAtUnix -> Created, ToolCall.FunctionCall -> ToolCall.Fu… (bmehta001, Apr 5, 2026)
3da3272  Revert ChatFinishReason/ChatToolKind to wire-aligned FinishReason/Too… (bmehta001, Apr 5, 2026)
51828db  Add cross-SDK integration tests for type array tool parameters (#576) (bmehta001, Apr 5, 2026)
ac2168e  Revert "Add cross-SDK integration tests for type array tool parameter… (bmehta001, Apr 6, 2026)
3 changes: 2 additions & 1 deletion README.md
@@ -139,6 +139,7 @@ The Foundry Local SDK makes it easy to integrate local AI models into your appli
2. Use the SDK in your application as follows:
```csharp
using Microsoft.AI.Foundry.Local;
using Microsoft.AI.Foundry.Local.OpenAI;

var config = new Configuration { AppName = "foundry_local_samples" };
await FoundryLocalManager.CreateAsync(config);
@@ -154,7 +155,7 @@ The Foundry Local SDK makes it easy to integrate local AI models into your appli
var chatClient = await model.GetChatClientAsync();
var messages = new List<ChatMessage>
{
new() { Role = "user", Content = "What is the golden ratio?" }
new() { Role = ChatMessageRole.User, Content = "What is the golden ratio?" }
};

await foreach (var chunk in chatClient.CompleteChatStreamingAsync(messages))
3 changes: 1 addition & 2 deletions samples/cs/Directory.Packages.props
@@ -6,8 +6,7 @@
</PropertyGroup>
<ItemGroup>
<PackageVersion Include="Microsoft.AI.Foundry.Local" Version="0.9.0-dev" />
<PackageVersion Include="Microsoft.AI.Foundry.Local.WinML" Version="0.9.0-dev-20260324" />
<PackageVersion Include="Betalgo.Ranul.OpenAI" Version="9.1.1" />
<PackageVersion Include="Microsoft.AI.Foundry.Local.WinML" Version="0.9.0-dev" />
<PackageVersion Include="Microsoft.Extensions.Logging" Version="9.0.10" />
<PackageVersion Include="Microsoft.Extensions.Logging.Console" Version="9.0.10" />
<PackageVersion Include="NAudio" Version="2.2.1" />
4 changes: 2 additions & 2 deletions samples/cs/model-management-example/Program.cs
@@ -1,5 +1,5 @@
using Microsoft.AI.Foundry.Local;
using Betalgo.Ranul.OpenAI.ObjectModels.RequestModels;
using Microsoft.AI.Foundry.Local.OpenAI;
using System.Diagnostics;

CancellationToken ct = new CancellationToken();
@@ -112,7 +112,7 @@
// Create a chat message
List<ChatMessage> messages = new()
{
new ChatMessage { Role = "user", Content = "Why is the sky blue?" }
new ChatMessage { Role = ChatMessageRole.User, Content = "Why is the sky blue?" }
};

// You can adjust settings on the chat client
Expand All @@ -123,7 +123,7 @@
var streamingResponse = chatClient.CompleteChatStreamingAsync(messages, ct);
await foreach (var chunk in streamingResponse)
{
Console.Write(chunk.Choices[0].Message.Content);

    [GitHub Actions warning, cs-samples (windows), line 126, ×4: Dereference of a possibly null reference.]
Console.Out.Flush();
}
Console.WriteLine();
6 changes: 3 additions & 3 deletions samples/cs/native-chat-completions/Program.cs
@@ -1,7 +1,7 @@
// <complete_code>
// <imports>
using Microsoft.AI.Foundry.Local;
using Betalgo.Ranul.OpenAI.ObjectModels.RequestModels;
using Microsoft.AI.Foundry.Local.OpenAI;
// </imports>

// <init>
@@ -90,7 +90,7 @@
// Create a chat message
List<ChatMessage> messages = new()
{
new ChatMessage { Role = "user", Content = "Why is the sky blue?" }
new ChatMessage { Role = ChatMessageRole.User, Content = "Why is the sky blue?" }
};

// Get a streaming chat completion response
@@ -98,7 +98,7 @@
var streamingResponse = chatClient.CompleteChatStreamingAsync(messages, ct);
await foreach (var chunk in streamingResponse)
{
Console.Write(chunk.Choices[0].Message.Content);

    [GitHub Actions warning, cs-samples (windows), line 101, ×4: Dereference of a possibly null reference.]
Console.Out.Flush();
}
Console.WriteLine();
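The streaming samples above draw repeated "Dereference of a possibly null reference" CI warnings on the `Console.Write(chunk.Choices[0].Message.Content)` line. A minimal null-safe variant might look like the sketch below; the null-conditional access is a suggestion on top of the samples' own types, not something this PR implements:

```csharp
// Hedged sketch: same streaming loop as the samples, with guards so a chunk
// that carries no message content (e.g. a finish-reason-only delta) is skipped.
// Assumes System.Linq is in scope for FirstOrDefault().
await foreach (var chunk in chatClient.CompleteChatStreamingAsync(messages, ct))
{
    var content = chunk.Choices?.FirstOrDefault()?.Message?.Content;
    if (!string.IsNullOrEmpty(content))
    {
        Console.Write(content);
        Console.Out.Flush();
    }
}
```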
26 changes: 12 additions & 14 deletions samples/cs/tool-calling-foundry-local-sdk/Program.cs
@@ -1,9 +1,7 @@
// <complete_code>
// <imports>
using Microsoft.AI.Foundry.Local;
using Betalgo.Ranul.OpenAI.ObjectModels.RequestModels;
using Betalgo.Ranul.OpenAI.ObjectModels.ResponseModels;
using Betalgo.Ranul.OpenAI.ObjectModels.SharedModels;
using Microsoft.AI.Foundry.Local.OpenAI;
using System.Text.Json;
// </imports>

@@ -59,14 +57,14 @@

// Get a chat client
var chatClient = await model.GetChatClientAsync();
chatClient.Settings.ToolChoice = ToolChoice.Required; // Force the model to make a tool call
chatClient.Settings.ToolChoice = ToolChoice.CreateRequiredChoice(); // Force the model to make a tool call


// Prepare messages
List<ChatMessage> messages =
[
new ChatMessage { Role = "system", Content = "You are a helpful AI assistant. If necessary, you can use any provided tools to answer the question." },
new ChatMessage { Role = "user", Content = "What is the answer to 7 multiplied by 6?" }
new ChatMessage { Role = ChatMessageRole.System, Content = "You are a helpful AI assistant. If necessary, you can use any provided tools to answer the question." },
new ChatMessage { Role = ChatMessageRole.User, Content = "What is the answer to 7 multiplied by 6?" }
];


@@ -76,7 +74,7 @@
[
new ToolDefinition
{
Type = "function",
Type = ToolType.Function,
Function = new FunctionDefinition()
{
Name = "multiply_numbers",
@@ -99,16 +97,16 @@

// <tool_loop>
// Get a streaming chat completion response
var toolCallResponses = new List<ChatCompletionCreateResponse>();
var toolCallResponses = new List<ChatCompletionResponse>();
Console.WriteLine("Chat completion response:");
var streamingResponse = chatClient.CompleteChatStreamingAsync(messages, tools, ct);
await foreach (var chunk in streamingResponse)
{
var content = chunk.Choices[0].Message.Content;

    [GitHub Actions warning, cs-samples (windows), line 105, ×2: Dereference of a possibly null reference.]
Console.Write(content);
Console.Out.Flush();

if (chunk.Choices[0].FinishReason == "tool_calls")
if (chunk.Choices[0].FinishReason == FinishReason.ToolCalls)
{
toolCallResponses.Add(chunk);
}
@@ -119,7 +117,7 @@
// Invoke tools called and append responses to the chat
foreach (var chunk in toolCallResponses)
{
var call = chunk?.Choices[0].Message.ToolCalls?[0].FunctionCall;
var call = chunk?.Choices[0].Message.ToolCalls?[0].Function;
if (call?.Name == "multiply_numbers")
{
var arguments = JsonSerializer.Deserialize<Dictionary<string, int>>(call.Arguments!)!;
@@ -132,7 +130,7 @@

var response = new ChatMessage
{
Role = "tool",
Role = ChatMessageRole.Tool,
Content = result.ToString(),
};
messages.Add(response);
@@ -142,12 +140,12 @@


// Prompt the model to continue the conversation after the tool call
messages.Add(new ChatMessage { Role = "system", Content = "Respond only with the answer generated by the tool." });
messages.Add(new ChatMessage { Role = ChatMessageRole.System, Content = "Respond only with the answer generated by the tool." });


// Set tool calling back to auto so that the model can decide whether to call
// the tool again or continue the conversation based on the new user prompt
chatClient.Settings.ToolChoice = ToolChoice.Auto;
chatClient.Settings.ToolChoice = ToolChoice.CreateAutoChoice();


// Run the next turn of the conversation
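Taken together, this sample's diff sketches the string-to-typed migration the PR performs. A condensed before/after summary, with member names taken only from the hunks above (this is not the SDK's full surface):

```csharp
// Before (Betalgo.Ranul.OpenAI)          // After (Microsoft.AI.Foundry.Local.OpenAI)
// Role = "user"                          Role = ChatMessageRole.User
// Type = "function"                      Type = ToolType.Function
// ToolChoice.Required                    ToolChoice.CreateRequiredChoice()
// ToolChoice.Auto                        ToolChoice.CreateAutoChoice()
// FinishReason == "tool_calls"           FinishReason == FinishReason.ToolCalls
// ToolCalls?[0].FunctionCall             ToolCalls?[0].Function
// List<ChatCompletionCreateResponse>     List<ChatCompletionResponse>
```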
@@ -1,4 +1,4 @@
// <complete_code>
using Microsoft.AI.Foundry.Local;
using OpenAI;
using OpenAI.Chat;
8 changes: 4 additions & 4 deletions samples/cs/tutorial-chat-assistant/Program.cs
@@ -1,7 +1,7 @@
// <complete_code>
// <imports>
using Microsoft.AI.Foundry.Local;
using Betalgo.Ranul.OpenAI.ObjectModels.RequestModels;
using Microsoft.AI.Foundry.Local.OpenAI;
using Microsoft.Extensions.Logging;
// </imports>

@@ -48,7 +48,7 @@ await model.DownloadAsync(progress =>
{
new ChatMessage
{
Role = "system",
Role = ChatMessageRole.System,
Content = "You are a helpful, friendly assistant. Keep your responses " +
"concise and conversational. If you don't know something, say so."
}
@@ -70,7 +70,7 @@ await model.DownloadAsync(progress =>
}

// Add the user's message to conversation history
messages.Add(new ChatMessage { Role = "user", Content = userInput });
messages.Add(new ChatMessage { Role = ChatMessageRole.User, Content = userInput });

// <streaming>
// Stream the response token by token
@@ -91,7 +91,7 @@ await model.DownloadAsync(progress =>
// </streaming>

// Add the complete response to conversation history
messages.Add(new ChatMessage { Role = "assistant", Content = fullResponse });
messages.Add(new ChatMessage { Role = ChatMessageRole.Assistant, Content = fullResponse });
}
// </conversation_loop>

@@ -42,7 +42,6 @@

<!-- Packages -->
<ItemGroup>
<PackageReference Include="Betalgo.Ranul.OpenAI" />
<PackageReference Include="Microsoft.Extensions.Logging" />
<PackageReference Include="Microsoft.Extensions.Logging.Console" />
</ItemGroup>
6 changes: 3 additions & 3 deletions samples/cs/tutorial-document-summarizer/Program.cs
@@ -1,7 +1,7 @@
// <complete_code>
// <imports>
using Microsoft.AI.Foundry.Local;
using Betalgo.Ranul.OpenAI.ObjectModels.RequestModels;
using Microsoft.AI.Foundry.Local.OpenAI;
using Microsoft.Extensions.Logging;
// </imports>

@@ -75,8 +75,8 @@ async Task SummarizeFileAsync(
var fileContent = await File.ReadAllTextAsync(filePath, token);
var messages = new List<ChatMessage>
{
new ChatMessage { Role = "system", Content = prompt },
new ChatMessage { Role = "user", Content = fileContent }
new ChatMessage { Role = ChatMessageRole.System, Content = prompt },
new ChatMessage { Role = ChatMessageRole.User, Content = fileContent }
};

var response = await client.CompleteChatAsync(messages, token);
@@ -42,7 +42,6 @@

<!-- Packages -->
<ItemGroup>
<PackageReference Include="Betalgo.Ranul.OpenAI" />
<PackageReference Include="Microsoft.Extensions.Logging" />
<PackageReference Include="Microsoft.Extensions.Logging.Console" />
</ItemGroup>
26 changes: 12 additions & 14 deletions samples/cs/tutorial-tool-calling/Program.cs
@@ -2,9 +2,7 @@
// <imports>
using System.Text.Json;
using Microsoft.AI.Foundry.Local;
using Betalgo.Ranul.OpenAI.ObjectModels.RequestModels;
using Betalgo.Ranul.OpenAI.ObjectModels.ResponseModels;
using Betalgo.Ranul.OpenAI.ObjectModels.SharedModels;
using Microsoft.AI.Foundry.Local.OpenAI;
using Microsoft.Extensions.Logging;
// </imports>

@@ -16,7 +14,7 @@
[
new ToolDefinition
{
Type = "function",
Type = ToolType.Function,
Function = new FunctionDefinition()
{
Name = "get_weather",
@@ -35,7 +33,7 @@
},
new ToolDefinition
{
Type = "function",
Type = ToolType.Function,
Function = new FunctionDefinition()
{
Name = "calculate",
@@ -136,13 +134,13 @@
Console.WriteLine("Model loaded and ready.");

var chatClient = await model.GetChatClientAsync();
chatClient.Settings.ToolChoice = ToolChoice.Auto;
chatClient.Settings.ToolChoice = ToolChoice.CreateAutoChoice();

var messages = new List<ChatMessage>
{
new ChatMessage
{
Role = "system",
Role = ChatMessageRole.System,
Content = "You are a helpful assistant with access to tools. " +
"Use them when needed to answer questions accurately."
}
@@ -165,7 +163,7 @@

messages.Add(new ChatMessage
{
Role = "user",
Role = ChatMessageRole.User,
Content = userInput
});

@@ -173,27 +171,27 @@
messages, tools, ct
);

var choice = response.Choices[0].Message;

    [GitHub Actions warning, cs-samples (macos), line 174, ×2: Dereference of a possibly null reference.]

if (choice.ToolCalls is { Count: > 0 })

    [GitHub Actions warning, cs-samples (macos), line 176, ×2: Dereference of a possibly null reference.]
{
messages.Add(choice);

foreach (var toolCall in choice.ToolCalls)
{
var toolArgs = JsonDocument.Parse(
toolCall.FunctionCall.Arguments
toolCall.Function.Arguments

    [GitHub Actions warning, cs-samples (macos), line 183, ×2: Dereference of a possibly null reference.]
    [GitHub Actions warning, cs-samples (macos), line 183: Possible null reference argument for parameter 'json' in 'JsonDocument JsonDocument.Parse(string json, JsonDocumentOptions options = default(JsonDocumentOptions))'.]
).RootElement;
Console.WriteLine(
$" Tool call: {toolCall.FunctionCall.Name}({toolArgs})"
$" Tool call: {toolCall.Function.Name}({toolArgs})"
);

var result = ExecuteTool(
toolCall.FunctionCall.Name, toolArgs
toolCall.Function.Name, toolArgs

    [GitHub Actions warning, cs-samples (macos), line 190: Possible null reference argument for parameter 'functionName' in 'string ExecuteTool(string functionName, JsonElement arguments)'.]
);
messages.Add(new ChatMessage
{
Role = "tool",
Role = ChatMessageRole.Tool,
ToolCallId = toolCall.Id,
Content = result
});
@@ -202,10 +200,10 @@
var finalResponse = await chatClient.CompleteChatAsync(
messages, tools, ct
);
var answer = finalResponse.Choices[0].Message.Content ?? "";

    [GitHub Actions warning, cs-samples (macos), line 203, ×2: Dereference of a possibly null reference.]
messages.Add(new ChatMessage
{
Role = "assistant",
Role = ChatMessageRole.Assistant,
Content = answer
});
Console.WriteLine($"Assistant: {answer}\n");
@@ -215,7 +213,7 @@
var answer = choice.Content ?? "";
messages.Add(new ChatMessage
{
Role = "assistant",
Role = ChatMessageRole.Assistant,
Content = answer
});
Console.WriteLine($"Assistant: {answer}\n");
@@ -42,7 +42,6 @@

<!-- Packages -->
<ItemGroup>
<PackageReference Include="Betalgo.Ranul.OpenAI" />
<PackageReference Include="Microsoft.Extensions.Logging" />
<PackageReference Include="Microsoft.Extensions.Logging.Console" />
</ItemGroup>
6 changes: 3 additions & 3 deletions samples/cs/tutorial-voice-to-text/Program.cs
@@ -1,7 +1,7 @@
// <complete_code>
// <imports>
using Microsoft.AI.Foundry.Local;
using Betalgo.Ranul.OpenAI.ObjectModels.RequestModels;
using Microsoft.AI.Foundry.Local.OpenAI;
using Microsoft.Extensions.Logging;
using System.Text;
// </imports>
@@ -81,14 +81,14 @@ await chatModel.DownloadAsync(progress =>
{
new ChatMessage
{
Role = "system",
Role = ChatMessageRole.System,
Content = "You are a note-taking assistant. Summarize " +
"the following transcription into organized, " +
"concise notes with bullet points."
},
new ChatMessage
{
Role = "user",
Role = ChatMessageRole.User,
Content = transcriptionText.ToString()
}
};
@@ -42,7 +42,6 @@

<!-- Packages -->
<ItemGroup>
<PackageReference Include="Betalgo.Ranul.OpenAI" />
<PackageReference Include="Microsoft.Extensions.Logging" />
<PackageReference Include="Microsoft.Extensions.Logging.Console" />
</ItemGroup>
2 changes: 1 addition & 1 deletion samples/rust/tool-calling-foundry-local/src/main.rs
@@ -189,7 +189,7 @@ async fn main() -> Result<(), Box<dyn std::error::Error>> {
messages.push(assistant_msg);
messages.push(
ChatCompletionRequestToolMessage {
content: result.into(),
content: result,
tool_call_id: tc["id"].as_str().unwrap_or_default().to_string(),
}
.into(),
2 changes: 1 addition & 1 deletion samples/rust/tutorial-tool-calling/src/main.rs
@@ -294,7 +294,7 @@ async fn main() -> anyhow::Result<()> {
execute_tool(function_name, &arguments);
messages.push(
ChatCompletionRequestToolMessage {
content: result.to_string().into(),
content: result.to_string(),
tool_call_id: tool_call.id.clone(),
}
.into(),