From the course: Large Language Models: Text Classification for NLP using BERT
Solution: BERT model sizes
- So the first question is: how many parameters does the BERT base cased model have? We can use the get_model_size function below to help us, so let's head over to that function. Now, I don't need to change anything, because I've already provided bert-base-cased as the argument to that function. And you can see that the bert-base-cased model has approximately 108 million parameters. Now, if you know the number of parameters for a model, you can estimate how much memory is required when running model inference. Because each of these parameters is represented as a single-precision floating-point number, each one requires four bytes. So if we multiply four by the number of parameters, this gives us an approximate value for the size of the model. So if I take the bert-base-cased model, with its 108 million parameters, I would require approximately 432 megabytes of RAM, which is about…
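The course's exercise files include the actual get_model_size function; as a minimal sketch of what such a helper might look like, assuming the Hugging Face transformers library and PyTorch, you could count parameters and apply the four-bytes-per-parameter rule like this:

```python
# Minimal sketch, assuming Hugging Face transformers with a PyTorch backend.
# The course's own get_model_size helper may differ in its details.
from transformers import AutoModel

def get_model_size(model_name: str) -> int:
    """Return the total number of parameters in a pretrained model."""
    model = AutoModel.from_pretrained(model_name)
    return sum(p.numel() for p in model.parameters())

num_params = get_model_size("bert-base-cased")
print(f"Parameters: {num_params / 1e6:.0f}M")  # roughly 108M for bert-base-cased

# Each parameter is a single-precision (32-bit) float, i.e. 4 bytes,
# so 4 * num_params approximates the memory needed at inference time.
approx_bytes = 4 * num_params
print(f"Approximate size: {approx_bytes / 1e6:.0f} MB")  # roughly 432 MB
```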