OpenAI Released Its Largest GPT-2 Model Yet, With 774 Million Parameters

Six months after OpenAI published the paper on GPT-2, its large and powerful generative language model for NLP, the lab announced today that it is releasing the 774 million parameter version of GPT-2.

OpenAI's controversial decision back in February not to release the full GPT-2 model sparked considerable discussion in the machine learning community. The lab was criticized for the decision; in the months that followed it released a small model, then a medium-sized model with 355M parameters in May, and today it is releasing the 774M model, the largest GPT-2 released so far, trained on English.

In the meantime, a number of companies and organizations have tried to replicate GPT-2 to some extent. According to OpenAI, it will be forming partnerships with companies, universities, and research organizations that are interested in analyzing and using GPT-2.

They mention in their blog post that, as part of their release strategy, in a few months they will also release the largest GPT-2 model, which contains more than 1.5 billion parameters. OpenAI opted for this staged release and partnership approach because of the implications and potential for misuse that come with releasing powerful generative models such as GPT-2. The 774M GPT-2 model can be downloaded from OpenAI's gpt-2 repository on GitHub.
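For readers who want to experiment with the released weights, the snippet below is a minimal sketch of sampling text from the 774M model via the Hugging Face transformers library (a third-party wrapper, not OpenAI's own codebase), which publishes this checkpoint under the name gpt2-large; the prompt and the sampling parameters are arbitrary illustrative choices.

# Minimal sketch: sample from the 774M GPT-2 checkpoint ("gpt2-large"
# in the Hugging Face transformers library). Prompt and sampling
# settings are arbitrary examples, not OpenAI's reference setup.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-large")  # 774M model
model = GPT2LMHeadModel.from_pretrained("gpt2-large")

inputs = tokenizer("OpenAI released the 774M GPT-2 model", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=50,                        # prompt + continuation length
    do_sample=True,                       # sample instead of greedy decoding
    top_k=40,                             # restrict sampling to top-40 tokens
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))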

Also today, Facebook released its pre-trained cross-lingual language models (XLM) covering 100 languages. The released models, along with the list of supported languages, are available on GitHub in the facebookresearch/XLM repository.
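As a rough illustration, the sketch below loads the 100-language XLM checkpoint through the Hugging Face transformers library, which mirrors it under the name xlm-mlm-100-1280 (16 layers, 1280-dimensional hidden states); the input sentence is arbitrary, and this is a sketch rather than Facebook's own reference code.

# Minimal sketch: extract contextual embeddings from the 100-language
# XLM checkpoint ("xlm-mlm-100-1280" in the transformers library).
import torch
from transformers import XLMModel, XLMTokenizer

tokenizer = XLMTokenizer.from_pretrained("xlm-mlm-100-1280")
model = XLMModel.from_pretrained("xlm-mlm-100-1280")

inputs = tokenizer("Hello, world!", return_tensors="pt")  # arbitrary sentence
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # shape: (batch, tokens, 1280)
print(hidden.shape)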
