Open-source GPT-3 is out!

Meet EleutherAI GPT-Neo, a large language model.

Since its release, GPT-3 has been widely praised as a game-changer when it comes to generating text. OpenAI created a truly powerful machine learning model, with one caveat: the code is still not public, and the only way to interact with the model is through a paid API. That might change soon.

Diagram for GPT-Neo from EleutherAI

What is EleutherAI and GPT-Neo?

EleutherAI is an independent group of researchers working to open-source AI models. Founded in July 2020, their flagship project is GPT-Neo, an effort to replicate OpenAI’s massive 175B-parameter language model, GPT-3.

After more than half a year of work, they released GPT-Neo two days ago:

The following is their update from 21.03.2021:

“We’re proud to release two pretrained GPT-Neo models trained on The Pile, the weights and configs can be freely downloaded from



For more information on how to get these set up, see the colab notebook, or read through the rest of the readme.

This repository will be (mostly) archived as we move focus to our GPU training repo, GPT-NeoX.”

If you’re wondering what The Pile is: it’s an 825 GiB diverse, open-source language-modelling dataset consisting of data from 22 high-quality sources, built for benchmarking large language models. Check it out here.

What’s next for GPT-Neo?

It’s still too early to predict the impact of GPT-Neo or to say exactly how it compares with GPT-3, but it’s worth diving in and trying it out.
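One way to try the released checkpoints is through the HuggingFace transformers library, which hosts GPT-Neo weights under the EleutherAI namespace. This is a minimal sketch, not EleutherAI’s official setup (their readme and Colab notebook cover that); the model name and generation parameters here are assumptions, and the small 125M checkpoint is used only to keep the download manageable.

```python
# Minimal sketch: sampling text from a GPT-Neo checkpoint via the
# HuggingFace transformers text-generation pipeline.
from transformers import pipeline

# The 125M checkpoint is far smaller than the announced 1.3B/2.7B models,
# which makes it convenient for a first experiment.
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125M")

def generate(prompt: str, max_length: int = 40) -> str:
    """Return the prompt plus a single sampled continuation."""
    result = generator(prompt, max_length=max_length, do_sample=True)
    return result[0]["generated_text"]

print(generate("EleutherAI released GPT-Neo,"))
```

Swapping in `EleutherAI/gpt-neo-1.3B` or `EleutherAI/gpt-neo-2.7B` gives noticeably better completions at the cost of much larger downloads and memory use.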

Stay tuned for more news!

CEO of Contentyze, the text editor 2.0; PhD in maths; Forbes 30 under 30.
