Introducing Refact Code LLM: 1.6B State-of-the-Art LLM for Code that Reaches 32% HumanEval

Sep 04, 2023 - refact.ai
Refact LLM, a 1.6B-parameter code model, has been introduced, offering real-time code completion and chat capabilities. It outperforms other code models such as StableCode, CodeGen, and ReplitCode on the HumanEval benchmark despite being 10x smaller in size. The model supports 20 programming languages, has a 4,096-token context window, and is pre-trained on permissively licensed code, making it available for commercial use.

The model was trained on a 50:50 mix of code and open text datasets, then fine-tuned on open code instruction-following datasets and a synthetic dataset. It is accessible to everyone and can be used commercially under the BigScience OpenRAIL-M license, and it can be easily integrated into existing developer workflows with an open-source Docker container and VS Code and JetBrains plugins. It is the third model in the family, following CodeContrast 3b and CodeContrast 0.3b.
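As a rough sketch of what using the released weights looks like, the snippet below loads the model with the HuggingFace transformers library and generates a completion for a short prompt. The checkpoint id smallcloud/Refact-1_6B-fim and the trust_remote_code flag are assumptions drawn from the model card, not details stated in this summary.

    # Minimal sketch: load the published Refact weights and complete a prompt.
    # Assumption: the HuggingFace repo id is "smallcloud/Refact-1_6B-fim" (check the model card).
    from transformers import AutoModelForCausalLM, AutoTokenizer

    checkpoint = "smallcloud/Refact-1_6B-fim"
    tokenizer = AutoTokenizer.from_pretrained(checkpoint)
    model = AutoModelForCausalLM.from_pretrained(checkpoint, trust_remote_code=True)

    prompt = "def fibonacci(n):\n"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))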

Key takeaways:

  • Refact LLM is a 1.6B code model that offers real-time code completion and chat capabilities, and it outperforms other code models like StableCode, CodeGen, and ReplitCode on the HumanEval metric.
  • The model was trained on a set of code with permissive licenses and open text datasets, and then fine-tuned with open code instruction-following datasets and a synthetic dataset to improve performance.
  • Refact LLM is designed to be accessible to everyone, with the model released for commercial use under the BigScience OpenRAIL-M license and the weights made available on HuggingFace.
  • The model can be easily integrated into existing developer workflows with an open-source Docker container and VS Code and JetBrains plugins, and its smaller size makes it well suited to real-time code completion, as sketched below.
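For real-time completion, an editor plugin typically sends the code before and after the cursor as a fill-in-the-middle prompt. Continuing from the snippet above, the sketch below assumes StarCoder-style FIM special tokens (<fim_prefix>, <fim_suffix>, <fim_middle>); the exact prompt format is an assumption based on the model card, not something stated in this summary.

    # Hypothetical fill-in-the-middle prompt, reusing the tokenizer and model loaded above.
    # Assumption: StarCoder-style FIM tokens; verify against the model card before use.
    prefix = "def add(a, b):\n    return "
    suffix = "\n\nprint(add(2, 3))\n"
    fim_prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

    inputs = tokenizer(fim_prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=16)
    # Tokens generated after <fim_middle> are the completion to insert at the cursor.
    completion = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
    print(completion)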