GitHub - iyaja/llama-fs: A self-organizing file system with llama 3

May 26, 2024 - github.com
LlamaFS is a self-organizing file manager that automatically renames and organizes files based on their content and well-known conventions (for example, time). It supports various file types, including images and audio, and operates in two modes: batch mode and watch mode. In watch mode, LlamaFS starts a daemon that monitors your directory, intercepts all filesystem operations, and uses your most recent edits to predict how you would rename files. It also offers an "incognito mode" for privacy, routing requests through a local Ollama instance instead of Groq.
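
As a rough illustration only (not LlamaFS's actual code), a watch-mode daemon along these lines could be built with Python's watchdog package; `suggest_name` below is a hypothetical stand-in for the LLM call that would summarize a file and propose a new name:

```python
import time
from pathlib import Path

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer


def suggest_name(path: str) -> str:
    """Hypothetical stub: a real version would send the file's contents
    (and recent edits) to Llama 3 and return a proposed name."""
    return Path(path).name  # no-op placeholder


class RenameSuggester(FileSystemEventHandler):
    def on_created(self, event):
        # Ignore directory events; only react to new files.
        if event.is_directory:
            return
        print(f"{event.src_path} -> {suggest_name(event.src_path)}")


if __name__ == "__main__":
    watched = str(Path.home() / "Downloads")  # example directory to monitor
    observer = Observer()
    observer.schedule(RenameSuggester(), path=watched, recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        observer.stop()
    observer.join()
```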

The tool is built on a Python backend that uses the Llama 3 model through Groq for file-content summarization and tree structuring; for local processing, the team integrated Ollama running the same model. The frontend is built with Electron, providing a user-friendly interface that lets users review and adjust the suggested file structure before finalizing changes. Future plans for LlamaFS include finding and removing old or unused files. The application requires Python 3.10 or higher and pip to install, and it can be served locally with FastAPI.
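
As an illustrative sketch of what serving such a backend with FastAPI might look like (the `/organize` route, `OrganizeRequest` model, and `summarize` helper are assumptions, not LlamaFS's actual API), a minimal service could accept a directory and an incognito flag and return per-file summaries in place of a full proposed tree:

```python
from pathlib import Path

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class OrganizeRequest(BaseModel):
    path: str                 # directory to reorganize
    incognito: bool = False   # if True, route to local Ollama instead of Groq


def summarize(file: Path, incognito: bool) -> str:
    """Hypothetical helper: a real version would send the file's contents
    to Llama 3 via Groq (or Ollama when incognito) and return a summary."""
    return f"summary of {file.name}"  # stub in place of the model call


@app.post("/organize")
def organize(req: OrganizeRequest) -> dict:
    files = [p for p in Path(req.path).rglob("*") if p.is_file()]
    summaries = {str(p): summarize(p, req.incognito) for p in files}
    # A full implementation would ask the model to turn these summaries
    # into a proposed directory tree before returning it.
    return {"summaries": summaries}
```

Run locally with, for example, `uvicorn main:app --reload` and POST a JSON body like `{"path": "/tmp/inbox", "incognito": true}` to `/organize`.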

Key takeaways:

  • LlamaFS is a self-organizing file manager that automatically renames and organizes files based on their contents and conventions. It supports many types of files and operates in two modes: batch mode and watch mode.
  • The system is built on a Python backend, using the Llama 3 model through Groq for file content summarization and tree structuring. The frontend is designed with Electron, providing a user-friendly interface.
  • LlamaFS is fast and immediately useful, processing most file operations in less than 500ms in watch mode. It also offers an "incognito mode" for privacy, routing requests through Ollama instead of Groq.
  • Future plans for LlamaFS include the ability to find and remove old/unused files. The system requires Python 3.10 or higher and pip for installation, and it can be served locally using FastAPI.