Questions and Answers
Match the following tasks with the type of language model used:

- Predicting the next word in a sentence = Auto-regressive Models
- Comprehensive understanding and encoding of entire sequences of tokens = Auto-encoding Models
- Natural Language Generation = Auto-regressive Models
- Natural Language Understanding = Auto-encoding Models
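To make the distinction concrete, here is a minimal sketch (not from the quiz itself) of the structural difference behind these answers: an auto-regressive model restricts each position to attend only to earlier positions, while an auto-encoding model lets every position attend to the entire sequence.

```python
def causal_mask(n):
    """Auto-regressive: position i may only see positions 0..i."""
    return [[j <= i for j in range(n)] for i in range(n)]

def full_mask(n):
    """Auto-encoding: every position sees every other position."""
    return [[True] * n for _ in range(n)]

# Lower-triangular pattern vs. all-True pattern for a 3-token sequence.
print(causal_mask(3))
print(full_mask(3))
```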
Match the following language models with their characteristics:

- GPT Family = Predicting a future token given either the past tokens or the future tokens, but not both
- BERT = Learning representations of the entire sequence by predicting tokens given both the past and future tokens
- Auto-regressive Models = Goal is to predict a future token given either the past tokens or the future tokens, but not both
- Auto-encoding Models = Learning representations of the entire sequence by predicting tokens given both the past and future tokens
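The two training objectives above can be sketched on a toy token list (an illustrative example, not taken from the quiz): the auto-regressive objective predicts each token from its predecessors, while the masked (auto-encoding) objective hides a token and predicts it from both sides of its context.

```python
tokens = ["the", "cat", "sat", "on", "the", "mat"]

# Auto-regressive (GPT-style) objective: predict token i from tokens < i.
ar_examples = [(tokens[:i], tokens[i]) for i in range(1, len(tokens))]

# Auto-encoding (BERT-style masked LM) objective: hide a token and
# predict it from both the left and the right context.
def mask_at(seq, i):
    context = seq[:i] + ["[MASK]"] + seq[i + 1:]
    return context, seq[i]

print(ar_examples[2])      # (['the', 'cat', 'sat'], 'on')
print(mask_at(tokens, 2))  # (['the', 'cat', '[MASK]', 'on', 'the', 'mat'], 'sat')
```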
Match the following models with their primary usage:

- GPT Family = Natural Language Generation
- BERT = Natural Language Understanding
- Auto-regressive Models = Predicting the next word in a sentence
- Auto-encoding Models = Comprehensive understanding and encoding of entire sequences of tokens
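The generation usage above can be illustrated with a minimal sketch: auto-regressive generation repeatedly predicts the next token from the tokens produced so far. The bigram table here is a toy stand-in for a real model such as a member of the GPT family, not an actual model.

```python
# Toy next-token table standing in for a trained auto-regressive model.
bigram = {"the": "cat", "cat": "sat", "sat": "on", "on": "mat"}

def generate(start, steps):
    """Append one predicted token per step, feeding each output back in."""
    out = [start]
    for _ in range(steps):
        nxt = bigram.get(out[-1])
        if nxt is None:  # no continuation known for this token
            break
        out.append(nxt)
    return out

print(generate("the", 4))  # ['the', 'cat', 'sat', 'on', 'mat']
```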