OPT: Open Pre-trained Transformer Language Models
Susan Zhang, et al.,
arXiv,
May 05, 2022
Meta (a.k.a. Facebook) has released "Open Pre-trained Transformers (OPT), a suite of decoder-only pre-trained transformers ranging from 125M to 175B parameters." OPT is similar in scale and capacity to GPT-3, which has been producing astonishing results for OpenAI. The model is available for non-commercial use, and Meta is also releasing its code and its training logbook. As the article (a 30-page PDF) reports, OPT was evaluated on natural language tasks such as hate speech detection, nine categories of bias, and toxicity. "Our goal was to replicate the performance and sizes of the GPT-3 class of models, while also applying the latest best practices in data curation and training efficiency." See also: Meta AI, Hacker News.