Introducing superopenai

Mar 12, 2024 - news.bensbites.co
The article introduces `superopenai`, a minimal library for logging and caching LLM (large language model) requests and responses during development. The tool aims to increase visibility and speed up the iterative process of building LLM-based software, which otherwise tends to involve heavy debugging and scattered print statements. `superopenai` logs prompts, responses, latency, cost, and token usage, and caches LLM requests and responses when the request is identical and `temperature=0`. It is intended for use during development rather than production, and is particularly useful for local logging when experimenting or building from scratch.
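
As a rough illustration of that behaviour, the sketch below initializes the library and then repeats an identical `temperature=0` request, which the summary says would be served from the local cache. The `init_superopenai` import is an assumption inferred from the summary, not the library's verified API; consult the project README for the actual entry points.

```python
# Hypothetical sketch, assuming an init_superopenai() hook that patches the
# openai client to log and cache requests (name assumed, not confirmed).
from openai import OpenAI
from superopenai import init_superopenai  # assumed import

init_superopenai()  # enable logging and caching for subsequent calls

client = OpenAI()

# With temperature=0, an identical repeated request should hit the cache
# on the second iteration instead of calling the API again.
for _ in range(2):
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        temperature=0,
        messages=[{"role": "user", "content": "Summarize superopenai in one line."}],
    )
    print(response.choices[0].message.content)
```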

The article also provides examples of how to use `superopenai`, including how to initialize it, how to use it with the `openai` client, and how to use the `Logger` object for aggregate statistics. It also discusses caching, compatibility with other libraries, and how to use `superopenai` in the context of building and testing a RAG pipeline with langchain. The article concludes by inviting contributions to the open-source project, suggesting potential areas for development such as porting to other languages, adding retries and detailed function tracing, and integrating with third-party logging services.
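
The summary mentions a `Logger` object that exposes aggregate statistics over captured requests. The sketch below shows how such a context-managed logger might look; `init_logger` and `summary_statistics` are assumed names based on the summary, not confirmed superopenai API.

```python
# Hypothetical sketch of collecting aggregate statistics with a logger.
from openai import OpenAI
from superopenai import init_logger, init_superopenai  # assumed imports

init_superopenai()
client = OpenAI()

with init_logger() as logger:  # capture all requests made inside this block
    client.chat.completions.create(
        model="gpt-3.5-turbo",
        temperature=0,
        messages=[{"role": "user", "content": "What does superopenai log?"}],
    )
    # Aggregate view of prompts, responses, latency, cost, and token usage
    # (method name assumed for illustration).
    print(logger.summary_statistics())
```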

Key takeaways:

  • Superopenai is a minimal library designed for logging and caching LLM requests and responses during development, providing visibility and facilitating rapid iteration.
  • It logs prompts, responses, latency, cost, and token usage, and caches LLM requests and responses when the request is identical and temperature=0.
  • Superopenai is not intended for production apps, but for development, allowing developers to look at their logs locally rather than setting up a remote observability tool.
  • The library is compatible with other third-party libraries like langchain, llama-index, instructor, guidance, DSPy, and more. It also supports streaming and async, and allows for nested loggers.