
OpenAI compatibility · Ollama Blog

Feb 08, 2024 - ollama.ai
The article announces that Ollama now has built-in compatibility with the OpenAI Chat Completions API, allowing more existing tools and applications to be used with Ollama locally. Setup involves downloading Ollama and pulling a model such as Llama 2 or Mistral. Ollama's OpenAI-compatible API endpoint can then be called with the same request format as OpenAI's API, changing only the hostname to the local Ollama server. Examples of usage with cURL, the OpenAI Python library, the OpenAI JavaScript library, the Vercel AI SDK, and Autogen are provided, and a rough Python sketch of the same pattern follows below.
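As a minimal sketch of what calling the endpoint looks like with the OpenAI Python library: the default Ollama port (11434), the placeholder API key, and the exact model tag below are assumptions rather than details from this summary.

```python
# Sketch: calling Ollama's OpenAI-compatible endpoint with the OpenAI
# Python library. The port (11434) and the dummy API key are assumptions;
# Ollama does not validate the key, but the client requires one.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # point the client at the local Ollama server
    api_key="ollama",                      # placeholder, unused by Ollama
)

response = client.chat.completions.create(
    model="llama2",  # any model pulled locally, e.g. via `ollama pull llama2`
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello from a local model."},
    ],
)
print(response.choices[0].message.content)
```

The only change from a stock OpenAI client call is the `base_url`; the rest of the request and response handling stays the same.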

The article also mentions the Vercel AI SDK, an open-source library that helps developers build conversational streaming applications, and Autogen, a popular open-source framework from Microsoft for building multi-agent applications; both can be pointed at Ollama's local endpoint. The article concludes by noting that this is initial, experimental support for the OpenAI API, with future improvements under consideration including the Embeddings API, function calling, vision support, and logprobs.
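For a rough illustration of the Autogen integration, the configuration below points an Autogen agent at the local Ollama endpoint. The package name (pyautogen), the `base_url`, the model tag, and the placeholder API key are assumptions, not details taken from this summary.

```python
# Sketch: using Autogen with Ollama's OpenAI-compatible endpoint.
# The base_url, model tag, and dummy api_key are assumptions; Ollama
# does not validate the key.
from autogen import AssistantAgent, UserProxyAgent

config_list = [
    {
        "model": "llama2",                        # a model pulled locally with Ollama
        "base_url": "http://localhost:11434/v1",  # local Ollama server
        "api_key": "ollama",                      # placeholder, required by the client
    }
]

assistant = AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = UserProxyAgent(
    "user",
    human_input_mode="NEVER",        # run unattended for this sketch
    max_consecutive_auto_reply=1,
    code_execution_config=False,
)

# A simple two-agent exchange served entirely by the local model.
user_proxy.initiate_chat(assistant, message="Tell a short joke about local LLMs.")
```

Because Autogen speaks the OpenAI protocol, no Ollama-specific code is needed; the agent framework just sees another OpenAI-style backend.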

Key takeaways:

  • Ollama now has built-in compatibility with the OpenAI Chat Completions API, so it can be used locally with more existing tools and applications.
  • Users can call Ollama's OpenAI-compatible API endpoint with the same request format as OpenAI's API, changing only the hostname to the local Ollama server.
  • Examples are provided for cURL, the OpenAI Python and JavaScript libraries, the Vercel AI SDK, and Autogen.
  • This is initial, experimental support for the OpenAI API; future improvements under consideration include the Embeddings API, function calling, vision support, and logprobs.