
Saturday 22 May 2021

Understanding Transformers, the machine learning model behind GPT-3


You know that expression, "When you have a hammer, everything looks like a nail"? Well, in machine learning, it seems like we really have discovered a magical hammer for which everything is, in fact, a nail, and they're called Transformers. Transformers are models that can be designed to translate text, write poems and op-eds, and even generate computer code. In fact, lots of the amazing research I write about on daleonai.com is built on Transformers, like AlphaFold 2, the model that predicts the structures of proteins from their genetic sequences, as well as powerful natural language processing (NLP) models like GPT-3, BERT, T5, Switch,…

This story continues at The Next Web
