The creator has future plans for Baa, including turning it into a desktop app with local authentication, storage, and LLMs, removing all cloud dependencies. Other plans include an optional online community for sharing chats, prompts, and functions; support for local LLMs; local storage for chats, prompts, functions, and experiments; and VS Code integration. The creator also discusses the challenges in releasing Baa, including securely managing API keys and billing for API usage.
Key takeaways:
- Baa is a new LLM client for the web and desktop, designed to facilitate working with AI models from OpenAI and Anthropic. It offers features like chat, prompt management, function construction, and running experiments.
- The creator of Baa aims to make it a desktop app with local authentication, storage, and LLMs, removing all cloud dependencies. Future plans also include VS Code integration, two-way audio, and an optional online community for sharing resources.
- Baa integrated Anthropic's Claude 2 API in just 30 minutes, demonstrating its extensibility and its potential to serve as an improved UI for local LLMs.
- The main challenges before releasing Baa are securely managing user-provided API keys and working out how to bill for API usage. For the local version, finding a lightweight document-storage solution is the current blocker.