Local.ai

Local.ai Overview

Local.ai is a free and open-source native app that lets you manage, verify, and experiment with AI models offline and in private. It simplifies local model management and inference and does not require a GPU. Built with a Rust backend, the app is memory efficient and compact. Its features include CPU inferencing, model management, digest verification, and an inferencing server. It is available for Mac (M2), Windows, and Linux (.deb), and its inferencing server can power any AI app, offline or online.
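
Because the app exposes an inferencing server, any client that can make HTTP requests can use the models it hosts. The Rust sketch below shows what such a client could look like; the port, the /completions path, and the request fields are assumptions chosen for illustration, not Local.ai's documented API.

```rust
// Cargo.toml (assumed): reqwest = { version = "0.11", features = ["blocking", "json"] }
//                       serde_json = "1"
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Hypothetical endpoint and payload: the actual port, path, and JSON
    // schema of Local.ai's inferencing server may differ.
    let reply = reqwest::blocking::Client::new()
        .post("http://localhost:8000/completions")
        .json(&json!({
            "prompt": "Explain GGML quantization in one sentence.",
            "max_tokens": 64
        }))
        .send()?
        .text()?;
    println!("{reply}");
    Ok(())
}
```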

Local.ai Highlights

  • Local.ai is a powerful native app with a Rust backend, making it memory efficient and compact. CPU inferencing adapts to the available threads, and GGML quantization is supported at q4, 5.1, 8, and f16.
  • The app provides a centralized location for AI model management. It features a resumable, concurrent downloader and is directory agnostic.
  • Local.ai ensures the integrity of downloaded models by computing BLAKE3 and SHA256 digests; a minimal sketch of this kind of check follows this list. It also provides a known-good model API and license and usage chips.
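
The digest check mentioned in the last highlight is a standard streaming-hash pattern: read the model file in chunks and feed each chunk to both hashers, so even multi-gigabyte models never need to fit in memory. Here is a minimal sketch in Rust, assuming the blake3, sha2, and hex crates; the model path and chunk size are illustrative, not Local.ai's actual implementation.

```rust
// Cargo.toml (assumed): blake3 = "1", sha2 = "0.10", hex = "0.4"
use std::fs::File;
use std::io::{BufReader, Read};

use sha2::{Digest, Sha256};

/// Stream a file through BLAKE3 and SHA256 hashers and return both digests
/// as hex strings, without loading the whole file into memory.
fn digest_file(path: &str) -> std::io::Result<(String, String)> {
    let mut reader = BufReader::new(File::open(path)?);
    let mut blake = blake3::Hasher::new();
    let mut sha = Sha256::new();
    let mut buf = [0u8; 64 * 1024];
    loop {
        let n = reader.read(&mut buf)?;
        if n == 0 {
            break;
        }
        blake.update(&buf[..n]);
        sha.update(&buf[..n]);
    }
    Ok((blake.finalize().to_hex().to_string(), hex::encode(sha.finalize())))
}

fn main() -> std::io::Result<()> {
    // Hypothetical model path; compare the output against the digests
    // published alongside the model to confirm the download is intact.
    let (b3, s256) = digest_file("models/example-7b.q4_0.bin")?;
    println!("BLAKE3: {b3}\nSHA256: {s256}");
    Ok(())
}
```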
