The article provides data on compression ratio, speed, and memory use for the different models. The smaller RWKV models stand out as a good compromise for text compression: their RNN structure requires less memory while still running at a relatively high speed, and the author accordingly recommends them for text compression tasks.
Key takeaways:
- The `ts_zip` utility can compress text files using large language models at a higher compression ratio than other tools, but decompression requires the exact same language model and GPU.
- The compression ratio varies depending on the model and file, with the best results shown for the `rwkv_7B_q4` model on the `alice29.txt` file.
- Compression speed and memory requirements also depend on the model, with the `rwkv_169M` model showing the highest speed and lowest memory usage on the `book1` file.
- The smaller RWKV models are recommended for text compression due to their balance of memory usage and running speed.
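The requirement that compression and decompression use the identical model can be illustrated with a toy sketch (this is not ts_zip's actual algorithm, and `predict_ranking` is a hypothetical stand-in for an LLM): the encoder asks the model to rank candidate next characters and records each character's rank; the decoder, running the same model on the text reconstructed so far, inverts those ranks. If the decoder's model differed even slightly, the rankings would diverge and decompression would fail.

```python
from collections import Counter

def predict_ranking(context, alphabet):
    # Hypothetical stand-in for an LLM: rank symbols by how often they
    # appeared in the context so far, breaking ties alphabetically.
    counts = Counter(context)
    return sorted(alphabet, key=lambda ch: (-counts[ch], ch))

def encode(text, alphabet):
    # Record each character's rank under the model's prediction.
    ranks = []
    for i, ch in enumerate(text):
        ranking = predict_ranking(text[:i], alphabet)
        ranks.append(ranking.index(ch))
    return ranks

def decode(ranks, alphabet):
    # Replay the *same* model to invert the ranks.
    out = []
    for r in ranks:
        ranking = predict_ranking("".join(out), alphabet)
        out.append(ranking[r])
    return "".join(out)

alphabet = sorted(set("abracadabra"))
ranks = encode("abracadabra", alphabet)
assert decode(ranks, alphabet) == "abracadabra"
```

A better predictive model produces mostly small ranks, which an entropy coder can then store in very few bits; this is, loosely, why a stronger language model yields a higher compression ratio.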