From the course: Large Language Models: Text Classification for NLP using BERT


Challenge: BERT model sizes

(upbeat music) - [Instructor] We're going to continue working on model sizes in this challenge. So go ahead to the Colab notebook and select Runtime and Run all. Now, just so you know, a checkpoint includes the model configuration and the pre-trained weights. Two of the checkpoints you'll often use with BERT are BERT base cased and BERT base uncased. BERT base cased means that we distinguish between uppercase and lowercase words, and BERT base uncased means that we don't. So let's head over to the questions. The first question is: how many parameters does the BERT base cased model have? Use the get_model_size function below to help you. And secondly, if you know the number of parameters for a model, how might you determine how much memory is required when running model inference? That is, the amount of space the model needs in RAM. Now, each parameter is represented as a 4-byte floating point.…
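The memory estimate described above can be sketched as a small helper. This is not the course's `get_model_size` function (whose definition is not shown here); it is a minimal illustration assuming float32 weights, i.e. 4 bytes per parameter, and using the commonly cited figure of roughly 110 million parameters for BERT base.

```python
def estimate_inference_memory_bytes(num_parameters: int, bytes_per_param: int = 4) -> int:
    """Rough RAM needed just to hold the weights.

    Assumes each parameter is stored as a 4-byte (float32) value,
    as stated in the video. In PyTorch, num_parameters would come
    from something like: sum(p.numel() for p in model.parameters())
    """
    return num_parameters * bytes_per_param


# BERT base has roughly 110 million parameters (approximate figure).
params = 110_000_000
mem_bytes = estimate_inference_memory_bytes(params)
print(f"~{mem_bytes / (1024 ** 2):.0f} MB just for the weights")
```

Note this is a lower bound: actual inference also needs memory for activations, the tokenizer, and framework overhead.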
