Why GPT-3 Matters
Leo Gao,
Jul 21, 2020
I've run a few posts on GPT-3, and it makes sense to include this item to put them into context. First is its size: "it's an entire order of magnitude larger" than the previously largest model. "Loading the entire model's weights in fp16 would take up an absolutely preposterous 300GB of VRAM." Second, and because of that scale, GPT-3 demonstrates that language models are "few-shot learners" - that is, they can "perform a new language task from only a few examples or from simple instructions." That's why it can complete a Shakespearean sonnet after being given only the first few lines - it recognizes what you're trying to do and emulates it. Now we're not quite at the point where artificial intelligence can write new open educational resources (OER) on an as-needed basis - but we're a whole lot closer with GPT-3.
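To make both points concrete, here is a minimal Python sketch (mine, not anything from Gao's post): it assembles a few-shot prompt of the kind the GPT-3 paper demonstrates - worked examples followed by a new query, all in plain text - and checks the fp16 memory arithmetic behind the VRAM figure. Nothing here calls a model or an API.

```python
# 1. Few-shot prompting: the "task description" is just examples in text.
#    These English->French pairs mirror the translation demo in the GPT-3 paper;
#    given this prompt, the model is expected to continue with "fromage".
prompt = (
    "Translate English to French:\n"
    "sea otter => loutre de mer\n"
    "peppermint => menthe poivrée\n"
    "cheese =>"
)
print(prompt)

# 2. Memory footprint: GPT-3 has 175 billion parameters (per the paper),
#    and fp16 stores each weight in 16 bits = 2 bytes.
params = 175e9
total_gb = params * 2 / 1e9
print(f"fp16 weights: ~{total_gb:.0f} GB")  # ~350 GB, the same order as the quoted 300GB
```

The point of the first half is that no retraining or fine-tuning is involved: the "learning" happens entirely within the prompt. The second half shows why the weights alone are far beyond any single GPU.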