From the course: Large Language Models: Text Classification for NLP using BERT
BERT and text classification - Python Tutorial
- [Instructor] We are now in a position where we can fine-tune our pre-trained BERT model for text classification. We'll be using the IMDB dataset, which has two fields, or columns: a text column, which holds the review for a movie, and a label column, where a one means the review is positive and a zero means the review is negative. As part of the pre-training step, when Google trained BERT on the next sentence prediction task, which is itself a text classification task, a linear layer was added on top of the BERT model. The only input fed into that linear layer was the CLS embedding. So in order to perform well, BERT learned that it needed to capture all the required information in the CLS token, shown at the top left of the diagram. This means that when we want to fine-tune BERT, say on movie reviews, all we need to do is add a new linear classification layer on top of the CLS token's output and train it on our labeled reviews.
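To make this concrete, here is a minimal sketch of that fine-tuning recipe using the Hugging Face `datasets` and `transformers` libraries. This is not the instructor's exact code: the `bert-base-uncased` checkpoint, the hyperparameters, and the subset sizes are illustrative assumptions. `AutoModelForSequenceClassification` with `num_labels=2` does exactly what the transcript describes: it places a freshly initialised linear classification head on top of the pooled CLS representation.

```python
# A minimal fine-tuning sketch (assumed setup, not the course's exact code).
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
    Trainer,
)

# IMDB reviews: a "text" column and a "label" column (1 = positive, 0 = negative).
dataset = load_dataset("imdb")

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    # Truncate long reviews to BERT's maximum sequence length.
    return tokenizer(batch["text"], truncation=True, padding="max_length")

tokenized = dataset.map(tokenize, batched=True)

# num_labels=2 attaches a new linear layer on top of the CLS embedding;
# its weights are randomly initialised and learned during fine-tuning.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

training_args = TrainingArguments(
    output_dir="bert-imdb",
    num_train_epochs=1,              # illustrative; tune for your setup
    per_device_train_batch_size=8,
    evaluation_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
    eval_dataset=tokenized["test"].shuffle(seed=42).select(range(500)),
)

trainer.train()
```

The `.select(...)` calls subsample the data so the sketch runs quickly for demonstration; drop them to train and evaluate on the full 25,000-review splits.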