Podcast
Questions and Answers
How many words was GPT-3 trained on?
- 1 trillion words
- 100 million words
- 500 billion words (correct)
- 10 quadrillion words
How does the performance of language models scale according to the 2020 OpenAI paper?
- Exponentially with model size and dataset size
- Linearly with model size and dataset size
- As a power-law with model size, dataset size, and compute used for training (correct)
- Logarithmically with model size and dataset size
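The power-law relationship above can be sketched numerically. This is a minimal illustration, assuming the approximate fitted constants reported in Kaplan et al.'s 2020 paper "Scaling Laws for Neural Language Models" (alpha_N ≈ 0.076, N_c ≈ 8.8e13 non-embedding parameters); the function name is a placeholder, not an API from the paper.

```python
# Sketch of the loss-vs-model-size power law from Kaplan et al. (2020).
# Assumed approximate constants from the paper:
#   alpha_n ~ 0.076, n_c ~ 8.8e13 (non-embedding parameters)

def loss_from_params(n_params, n_c=8.8e13, alpha_n=0.076):
    """Predicted test loss as a power law in model size N."""
    return (n_c / n_params) ** alpha_n

# A power law means each doubling of N multiplies the loss by the
# same factor, 2 ** -alpha_n (roughly a 5% reduction) -- unlike
# linear, exponential, or logarithmic scaling.
for n in (1e8, 1e9, 1e10, 1e11):
    print(f"N = {n:.0e}  predicted loss ~ {loss_from_params(n):.3f}")
```

Dataset size and training compute follow analogous power laws in the paper, each with its own fitted exponent.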
How many parameters did GPT-1, OpenAI's first language model, have?
- 117 million parameters (correct)
- 12 million parameters
- 768 million parameters
- 12 billion parameters
Approximately how many words does a typical human child encounter by age 10?
What is one reason given for the surprising performance of large language models like GPT-3?