AI Training Method Exceeds GPT-3 Performance with 99.9% Fewer Parameters

A team of scientists at LMU Munich has developed Pattern-Exploiting Training (PET), a deep-learning training technique for natural language processing (NLP) models. Using PET, the team trained a Transformer NLP model with 223M parameters that outperformed the 175B-parameter GPT-3 by over 3 percentage points on the SuperGLUE benchmark.
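
The core idea behind PET is to reformulate a classification task as a cloze-style question and to map each label to a word of the vocabulary (a "verbalizer"), so that a pretrained masked language model can be fine-tuned on very few labeled examples. The snippet below is a minimal illustrative sketch of that pattern-verbalizer idea using the Hugging Face Transformers library; it is not the authors' code, and the model name and the sentiment task are placeholders chosen for illustration.

    # Minimal sketch of a pattern-verbalizer pair, not the authors' implementation.
    import torch
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    model_name = "bert-base-uncased"  # placeholder; PET itself is model-agnostic
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForMaskedLM.from_pretrained(model_name)

    text = "The movie was surprisingly touching."
    # Pattern: rephrase the input as a cloze question with a masked slot.
    prompt = f"{text} All in all, it was {tokenizer.mask_token}."
    # Verbalizer: map each label to a single vocabulary token.
    verbalizer = {"positive": "great", "negative": "terrible"}

    inputs = tokenizer(prompt, return_tensors="pt")
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]

    # Compare the logits of the verbalizer tokens to choose a label.
    scores = {label: logits[tokenizer.convert_tokens_to_ids(word)].item()
              for label, word in verbalizer.items()}
    print(max(scores, key=scores.get))

In the full PET procedure, models fine-tuned on several such patterns are then used to label unlabeled data and train a final classifier; the sketch above only shows how a single pattern-verbalizer pair scores a label.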

Source: www.infoq.com/news/2020/10/training-exceeds-gpt3/

October 13, 2020