
GitHub - jzhang38/TinyLlama

Sep 04, 2023 - github.com
The TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens using 16 A100-40G GPUs within 90 days. The project, which started on September 1, 2023, uses the same architecture and tokenizer as Llama 2, so it can plug into many open-source projects built upon Llama. Despite having only 1.1B parameters, TinyLlama suits the many applications that demand a small computation and memory footprint.
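The stated budget implies an ideal sustained training throughput, which is easy to work out from the numbers above. This is a back-of-the-envelope calculation only; real training adds data-loading, communication, and checkpointing overhead, so the actual required speed is somewhat higher:

```python
# Ideal sustained throughput implied by the summary's numbers:
# 3 trillion tokens on 16 A100-40G GPUs in 90 days (no overhead assumed).
TOKENS = 3e12
GPUS = 16
DAYS = 90

seconds = DAYS * 24 * 3600                # 90 days in seconds
aggregate_tps = TOKENS / seconds          # tokens/s across all GPUs
per_gpu_tps = aggregate_tps / GPUS        # tokens/s per A100

print(round(aggregate_tps))  # ~385,802 tokens/s aggregate
print(round(per_gpu_tps))    # ~24,113 tokens/s per GPU
```

In other words, the plan requires sustaining roughly 24k tokens per second per GPU, which is why the project emphasizes training-speed optimizations.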

The project plans to release intermediate checkpoints on a set schedule, with the first checkpoint published on September 4, 2023. TinyLlama targets applications such as assisting speculative decoding of larger models, deployment on edge devices for real-time machine translation without an internet connection, and real-time dialogue generation in video games. The project also serves as a reference for anyone interested in pretraining language models under 5 billion parameters.

Key takeaways:

  • The TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens using 16 A100-40G GPUs within 90 days.
  • TinyLlama adopts the same architecture and tokenizer as Llama 2, making it compatible with many open-source projects built upon Llama. Its compact size also suits applications with limited compute and memory budgets.
  • The project provides a detailed training setup and release schedule, and highlights its fast training speed due to various optimizations. It also suggests potential use cases such as assisting speculative decoding of larger models, deployment on edge devices, and real-time dialogue generation in video games.
  • The project is still under active development, with plans to add scripts for pretraining on other datasets, sequence length extrapolation, throughput testing on RTX 3090/4090, and exploring retrieval augmentation, among others.