Tool call with local model using Ollama and AutoGen.Net


Xiaoyun Zhang

Posted on July 29, 2024


Ollama, starting from version 0.3.0, supports tool calls with popular models like Llama 3.1 and Mistral. This lets an LLM interact with external functions by including tool definitions in its prompt.

In this blog post, we'll demonstrate tool calls with a local model by building a weather report agent using AutoGen.Net and Ollama. The agent will query weather information from a dummy function and generate the final weather report from the tool call result.

Requirements

To run this example, ensure you have:

  • ollama >= 0.3.0
  • dotnet sdk 8.0
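
If you haven't pulled a tool-call-capable model yet, you can verify your Ollama version and download llama3.1 (the model used throughout this post) from the command line:

ollama --version
ollama pull llama3.1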

Step 1: Create a .NET Console App

First, create a new .NET console application:

dotnet new console -n weather-report-agent

Step 2: Add AutoGen.Net Packages

Next, add the necessary AutoGen.Net packages:

dotnet add package AutoGen
dotnet add package AutoGen.SourceGenerator # for type-safe function contract generation

Step 3: Add the WeatherReportTool Class

Create a dummy tool class that fetches the weather report for a given city. The method in this class will be exposed to the LLM through tool calls.

using AutoGen.Core; // for the [Function] attribute

public partial class WeatherReportTool
{
    /// <summary>
    /// Get the weather report for the given city
    /// </summary>
    /// <param name="city">city</param>
    /// <returns>weather report</returns>
    [Function]
    public async Task<string> GetWeatherReport(string city)
    {
        return $$"""
        {
            "city": "{{city}}",
            "temperature": "25°C",
            "weather": "Sunny"
        }
        """;
    }
}

The [Function] attribute tells AutoGen.SourceGenerator to generate a function contract from the method signature and its structured XML comment.

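What the generator emits looks roughly like the following. This is a simplified sketch based on AutoGen's documented conventions, and the exact generated code varies by version: a function contract that describes the method to the LLM, plus a wrapper that deserializes the model's JSON arguments and invokes the real method.

public partial class WeatherReportTool
{
    // Describes the function to the LLM, derived from the XML comment.
    public FunctionContract GetWeatherReportFunctionContract => new FunctionContract
    {
        Name = nameof(GetWeatherReport),
        Description = "Get the weather report for the given city",
        Parameters = new[]
        {
            new FunctionParameterContract
            {
                Name = "city",
                Description = "city",
                ParameterType = typeof(string),
                IsRequired = true,
            },
        },
        ReturnType = typeof(string),
    };

    // Deserializes the JSON arguments produced by the model, then calls the real method.
    public Task<string> GetWeatherReportWrapper(string arguments)
    {
        var args = System.Text.Json.JsonSerializer.Deserialize<Dictionary<string, string>>(arguments);
        return GetWeatherReport(args!["city"]);
    }
}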

Step 4: Create WeatherReportTool and FunctionCallMiddleware

Create a FunctionCallMiddleware with the function contract and function map, then register it with the agent.

using AutoGen.Core;
using AutoGen.OpenAI;
using AutoGen.OpenAI.Extension;
using Azure.AI.OpenAI;
using Azure.Core.Pipeline; // usings assumed for the remaining Program.cs snippets

var weatherReportTool = new WeatherReportTool();
var weatherReportToolMiddleware = new FunctionCallMiddleware(
    functions: new[] { weatherReportTool.GetWeatherReportFunctionContract },
    functionMap: new Dictionary<string, Func<string, Task<string>>>
    {
        // use the generated wrapper, which deserializes the model's JSON arguments
        [nameof(weatherReportTool.GetWeatherReport)] = weatherReportTool.GetWeatherReportWrapper
    });
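
Here, functions advertises the tool's schema to the model, while functionMap maps the tool name to the delegate that executes it (the generated wrapper, which deserializes the model's JSON arguments). If functionMap is omitted, the middleware returns the tool call request to the caller instead of invoking it.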

Step 5: Create OpenAIChatAgent and Connect to Ollama

Since Ollama supports the OpenAI-compatible completion API, we can use OpenAIChatAgent to connect to the Ollama service as a third-party endpoint.

First, create a CustomHttpClientHandler to redirect all requests to the Ollama server:

public sealed class CustomHttpClientHandler : HttpClientHandler
{
    private readonly string _modelServiceUrl;

    public CustomHttpClientHandler(string modelServiceUrl)
    {
        _modelServiceUrl = modelServiceUrl;
    }

    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
    {
        request.RequestUri = new Uri($"{_modelServiceUrl}{request.RequestUri.PathAndQuery}");
        return base.SendAsync(request, cancellationToken);
    }
}

Then create an OpenAIChatAgent and connect it to the local Ollama endpoint (http://localhost:11434 by default). Ollama doesn't validate the API key, so any placeholder value works:

var httpClient = new HttpClient(new CustomHttpClientHandler("http://localhost:11434"));
var openAIClient = new OpenAIClient("api-key", new OpenAIClientOptions
{
    Transport = new HttpClientTransport(httpClient),
});

var model = "llama3.1";
var agent = new OpenAIChatAgent(
    openAIClient: openAIClient,
    name: "assistant",
    modelName: model,
    systemMessage: "You are a weather assistant.",
    seed: 0)
    .RegisterMessageConnector() // convert AutoGen message to OpenAI message
    .RegisterMiddleware(weatherReportToolMiddleware) // register function call middleware
    .RegisterMiddleware(async (msgs, option, agent, ct) => {
        var reply = await agent.GenerateReplyAsync(msgs, option, ct);
        if (reply is ToolCallAggregateMessage toolCallMessage)
        {
            // the reply contains the tool call and its result; send it back to
            // the model so it can compose the final natural-language answer
            var chatHistory = msgs.Append(toolCallMessage);
            return await agent.GenerateReplyAsync(chatHistory, option, ct);
        }

        return reply;
    })
    .RegisterPrintMessage(); // print message to console

Final Step: Ask for a Weather Report!

Once the weather report agent is created, you can ask it questions like "What's the weather in New York?"

var task = """
What is the weather in New York?
""";

await agent.SendAsync(task);

The response will be something like:

--------------------
TextMessage from assistant
--------------------
The current temperature in New York is 25°C (77°F), and it's a sunny day.
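
To try it yourself, run the app from the project directory:

dotnet run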

Conclusion and Further Reading

In this blog, we demonstrated how to enable tool calls using local models with Ollama and AutoGen.Net.

Thanks for reading! You can find the complete sample code in this repository.
