GitHub - imoneoi/openchat: OpenChat: Advancing Open-source Language Models with Imperfect Data

Nov 06, 2023 - github.com
OpenChat is an open-source library of language models fine-tuned with a strategy inspired by offline reinforcement learning. The models learn from mixed-quality data without preference labels and deliver performance comparable to ChatGPT, even at the 7B scale: the 7B model scores 7.81 on MT-bench, outperforming 70B models. The project has released several versions, the latest being OpenChat-3.5-7B.
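The fine-tuning strategy, C-RLFT, treats the source of each training conversation as a coarse, class-level reward rather than requiring per-example preference labels. The sketch below illustrates that idea only in broad strokes; the conditioning tag format, the reward values, and the loss weighting are illustrative assumptions, not the project's exact implementation.

    import torch.nn.functional as F

    # Hypothetical coarse rewards per data source (illustrative values, not the
    # paper's): "expert" conversations (e.g. GPT-4) count more than generic ones.
    COARSE_REWARD = {"expert": 1.0, "generic": 0.1}

    def c_rlft_loss(model, tokenizer, prompt, response, source):
        # Condition generation on the data source via a tag (format is illustrative).
        conditioned_prompt = f"<{source}> {prompt}"
        full_text = conditioned_prompt + " " + response

        prompt_ids = tokenizer(conditioned_prompt, return_tensors="pt").input_ids
        full_ids = tokenizer(full_text, return_tensors="pt").input_ids

        # Standard next-token cross-entropy, supervised only on the response tokens.
        logits = model(full_ids).logits[:, :-1, :]
        targets = full_ids[:, 1:].clone()
        targets[:, : prompt_ids.shape[1] - 1] = -100  # mask the conditioned prompt

        ce = F.cross_entropy(
            logits.reshape(-1, logits.size(-1)),
            targets.reshape(-1),
            ignore_index=-100,
        )
        # Weight the loss by the coarse, source-level reward.
        return COARSE_REWARD[source] * ce

In effect, examples from higher-quality sources pull the model harder than noisy ones, which is how mixed-quality data can be used without pairwise preference labels.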

The OpenChat training system uses padding-free training and the Multipack Sampler, achieving a 3-10x speedup over conventional padded training. Despite its advanced capabilities, OpenChat still has limitations in complex reasoning, mathematical tasks, and programming challenges, and may sometimes generate inaccurate information. The OpenChat 3.5 code and models are distributed under the Apache License 2.0.
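The speedup comes from packing many variable-length conversations into each fixed-size batch instead of padding every sequence to the length of the longest one. A minimal sketch of that packing idea follows; it uses a simple first-fit-decreasing heuristic as a stand-in, whereas the actual Multipack Sampler also handles balancing across distributed workers.

    def pack_first_fit_decreasing(lengths, max_tokens):
        # Pack variable-length sequences into as few fixed-capacity bins as possible,
        # so almost no tokens in a batch are wasted on padding.
        order = sorted(range(len(lengths)), key=lambda i: lengths[i], reverse=True)
        bins, free = [], []  # bins[b]: sequence indices; free[b]: remaining capacity
        for i in order:
            for b, capacity in enumerate(free):
                if lengths[i] <= capacity:
                    bins[b].append(i)
                    free[b] -= lengths[i]
                    break
            else:
                bins.append([i])
                free.append(max_tokens - lengths[i])
        return bins

    # Example: six sequences fit into three 4096-token bins instead of six rows
    # each padded to the longest sequence.
    print(pack_first_fit_decreasing([3800, 2100, 1900, 1200, 800, 300], 4096))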

Key takeaways:

  • OpenChat is an open-source language model library that uses a strategy inspired by offline reinforcement learning to learn from mixed-quality data without preference labels.
  • The OpenChat 3.5 model, fine-tuned with C-RLFT on a Mistral 7B base, has achieved results comparable to ChatGPT and outperformed 33B-parameter Grok models while using only 7 billion parameters.
  • The OpenChat training system utilizes padding-free training and the Multipack Sampler, achieving a 3-10x speedup compared to conventional padded training.
  • Despite its advanced capabilities, OpenChat still has limitations in complex reasoning, mathematical and arithmetic tasks, and programming and coding challenges, and may sometimes generate harmful or inaccurate responses.
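For readers who want to try the released model, a minimal inference sketch with Hugging Face Transformers is below. The model id (openchat/openchat_3.5) and the "GPT4 Correct" conversation template are assumptions based on the project's release as best understood here; check the repository for the authoritative usage and serving instructions.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Model id and chat template are assumptions based on the OpenChat 3.5 release;
    # consult the repository for the exact, up-to-date usage.
    model_id = "openchat/openchat_3.5"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )

    prompt = "GPT4 Correct User: Explain C-RLFT in one sentence.<|end_of_turn|>GPT4 Correct Assistant:"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))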