Quick tip: Running OpenAI's Swarm locally using Ollama
Akmal Chaudhri
Posted on November 19, 2024
Abstract
This short article shows how to integrate OpenAI's Swarm with Ollama without requiring direct OpenAI access, enabling efficient, scalable AI workflows using alternative large language models.
The notebook file used in this article is available on GitHub.
Introduction
In a previous article, we used OpenAI's Swarm, but it required an OpenAI API Key and an OpenAI large language model. However, Swarm can easily be configured to use Ollama and alternative large language models instead. In this article, we'll see how.
Installing Ollama is quick and easy. On Linux, for example, we can use the following command:
curl -fsSL https://ollama.com/install.sh | sh
With Ollama installed and running, we'll launch Jupyter locally:
jupyter notebook
Fill out the notebook
We'll then create a new notebook and first install the following:
!pip install git+https://github.com/openai/swarm.git --quiet
!pip install ollama openai --quiet
Next, we'll import the following:
import ollama
from openai import OpenAI
and then download a model using Ollama:
model = "llama3.2:1b"
ollama.pull(model)
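If we want to confirm the download succeeded, Ollama's REST API lists locally available models via its `/api/tags` endpoint (assuming the default port 11434). The `model_names` helper below is our own, just for illustration:

```python
import json
import urllib.request

def model_names(tags_json):
    """Extract model names from the JSON returned by Ollama's /api/tags."""
    return [m["name"] for m in tags_json.get("models", [])]

# With Ollama running, this lists everything pulled so far:
# with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
#     print(model_names(json.load(resp)))  # should include "llama3.2:1b"
```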
Ollama provides OpenAI API compatibility, so we can point the OpenAI client at the local server, as follows:
ollama_client = OpenAI(
base_url = "http://localhost:11434/v1",
api_key = "ollama"
)
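Before wiring Swarm in, it can be worth a quick sanity check that the OpenAI-compatible endpoint answers. Here's a minimal sketch using only the standard library; the `build_chat_request` helper is our own, and the authorization value is arbitrary since Ollama ignores it:

```python
import json
import urllib.request

def build_chat_request(model, prompt, host="http://localhost:11434"):
    """Build a request against Ollama's OpenAI-compatible chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{host}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer ollama"},  # any value is accepted
    )

req = build_chat_request("llama3.2:1b", "Say hello.")
# With Ollama running, send it and print the reply:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```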
As a quick example, we'll slightly modify the Swarm demo provided by OpenAI on GitHub:
from swarm import Swarm, Agent
client = Swarm(ollama_client)
def transfer_to_agent_b():
return agent_b
agent_a = Agent(
name = "Agent A",
model = model,
instructions = "You are a helpful agent.",
functions=[transfer_to_agent_b],
)
agent_b = Agent(
name = "Agent B",
model = model,
instructions = "Only speak in Haikus.",
)
response = client.run(
agent = agent_a,
messages = [{"role": "user", "content": "I want to talk to agent B."}],
)
print(response.messages[-1]["content"])
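The handoff above hinges on one Swarm convention: when a tool function returns an Agent, the conversation transfers to that agent. The standalone toy sketch below mimics that mechanic with canned replies instead of a model; the `Agent` class and `run` function here are illustrative, not Swarm's actual implementation:

```python
# Toy illustration of the handoff pattern: when a tool call
# returns an Agent, control moves to that agent.
class Agent:
    def __init__(self, name, reply, functions=None):
        self.name = name
        self.reply = reply          # canned reply, standing in for the LLM
        self.functions = functions or []

def run(agent, message):
    # If the active agent has a function that returns an Agent,
    # follow the handoff before answering.
    for fn in agent.functions:
        result = fn()
        if isinstance(result, Agent):
            agent = result
    return f"{agent.name}: {agent.reply(message)}"

agent_b = Agent("Agent B", lambda m: "Leaves drift on water")
agent_a = Agent("Agent A", lambda m: "Hello!", functions=[lambda: agent_b])

print(run(agent_a, "I want to talk to agent B."))
# prints: Agent B: Leaves drift on water
```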
In the code, we've:
- Changed client = Swarm() to client = Swarm(ollama_client)
- Added model = model to both agents
And that's it!
Summary
In this short article, we've quickly and easily configured OpenAI's Swarm to run locally using Ollama.