Stability AI Launches StableLM to Rival ChatGPT

Posted April 24, 2023 at 10:47pm by iClarified
Stability AI has announced the release of a new open-source language model, StableLM, to rival ChatGPT. The Alpha version of StableLM is available with 3 billion and 7 billion parameters, with 15 billion to 65 billion parameter models to follow. Developers can freely inspect, use, and adapt the StableLM base models for commercial or research purposes, subject to the terms of the CC BY-SA 4.0 license.

In 2022, Stability AI drove the public release of Stable Diffusion, a revolutionary image model that represents a transparent, open, and scalable alternative to proprietary AI. With the launch of the StableLM suite, the company says it is continuing to make foundational AI technology accessible to all. The StableLM models can generate text and code and will power a range of downstream applications, demonstrating how small, efficient models can deliver high performance with appropriate training.

StableLM is trained on a new experimental dataset built on The Pile open-source dataset, but three times larger, with 1.5 trillion tokens of content. The richness of this dataset is said to give StableLM surprisingly high performance in conversational and coding tasks despite its small size of 3 to 7 billion parameters (by comparison, GPT-3 has 175 billion parameters).

Additionally, Stability AI has announced the release of a set of research models that are instruction fine-tuned. Initially, these fine-tuned models will use a combination of five recent open-source datasets for conversational agents: Alpaca, GPT4All, Dolly, ShareGPT, and HH. These fine-tuned models are intended for research use only and are released under a noncommercial CC BY-NC-SA 4.0 license, in line with Stanford's Alpaca license.

Check out a few sample prompts in the screenshots below. The models are now available in the Stability AI GitHub repository.