Run OpenAI’s New Agents For Free
Using Ollama and a local LLM
This is a companion piece to a story I recently published on using OpenAI’s new agentic SDK. You’ll probably get more out of this story if you read the other one, so please check it out using the link at the end of this article.
In that article, I showed you how to use the new agentic SDK from OpenAI and walked through a couple of coding examples to show how straightforward it is to use. However, there is one big potential issue with agentic systems, and that is cost.
Running agents that use proprietary and closed-source large language models (LLMs), such as those from OpenAI, Anthropic, and others, can be expensive. Even though the cost of LLM token usage has decreased markedly over the last couple of years, if you run enough agents, your apps can still consume a lot of tokens. And greater token use leads to greater costs.
In this article, I will show you how to use OpenAI’s agentic framework for free.
How can we do this?
In simple terms, OpenAI became so popular after it burst onto the AI scene that many other LLM creators decided it would be useful to design their application programming interfaces (APIs) to be compatible with OpenAI’s API.
This is great news for developers as it means that, in practice, we can use the OpenAI API with non-OpenAI LLMs.
So, if you can find an LLM whose cost is negligible (or zero), and you can interact with it via an OpenAI API-compatible interface, then you’re “quids in”, as we say in the UK. The sketch below shows the basic idea.
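To make that concrete, here is a minimal sketch of the trick using the standard openai Python client. It assumes Ollama (introduced below) is already running locally; Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1 by default, and the model name llama3.2 is just an example of a model you might have pulled.

```python
from openai import OpenAI

# Point the standard OpenAI client at a local, OpenAI-compatible server.
# Ollama serves one at http://localhost:11434/v1 by default.
# The api_key argument is required by the client but ignored by Ollama.
client = OpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",
)

# "llama3.2" is an example model name -- use any model you have pulled locally.
response = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)

print(response.choices[0].message.content)
```

Nothing in the calling code changes compared with hitting OpenAI’s own servers; only the base URL and the model name do. That is the whole trick.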
Now, there is a downside to all this: when using local models, you usually don’t have access to as powerful an LLM as you would with closed systems such as Claude Sonnet or GPT-4o. That’s the compromise you must make, and only you can decide whether it’s worth it.
One of the few ways to run LLM inference for free is to download and run the LLM locally, and I think the best way to do this is with Ollama. If you haven’t heard of Ollama before, I wrote an introductory article on it a while back that you can read using this link. The sketch below shows how the same trick extends to the agents themselves.
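Putting the pieces together, here is a minimal sketch of pointing the Agents SDK at a local Ollama model. It follows the custom-model-provider pattern from the SDK’s documentation, but the model name llama3.2 is an assumption (substitute whatever you have pulled), and tracing is disabled because uploading traces requires a real OpenAI API key.

```python
import asyncio

from agents import Agent, OpenAIChatCompletionsModel, Runner, set_tracing_disabled
from openai import AsyncOpenAI

# Ollama's OpenAI-compatible endpoint; the api_key is required but ignored.
local_client = AsyncOpenAI(
    base_url="http://localhost:11434/v1",
    api_key="ollama",
)

# Tracing uploads to OpenAI's platform, which needs a real API key, so turn it off.
set_tracing_disabled(True)

# "llama3.2" is an example -- substitute any model you have pulled with Ollama.
agent = Agent(
    name="LocalAssistant",
    instructions="You are a helpful assistant.",
    model=OpenAIChatCompletionsModel(
        model="llama3.2",
        openai_client=local_client,
    ),
)

async def main() -> None:
    result = await Runner.run(agent, "Explain in one paragraph what an AI agent is.")
    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())
```

Everything else from the original article (tools, handoffs, and so on) works the same way, as long as the local model supports the features you use; tool calling in particular varies from model to model.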
Prerequisites
- Go to Ollama.com and download the latest…