From the course: Introduction to Transformer Models for NLP
Fine-tuning BERT to solve NLP tasks
- Section 5.3, fine-tuning BERT to solve natural language processing tasks. Once BERT has a general idea of how words are used within sentences, learned through the masked language modeling task, and of how sentences relate to one another in a larger document, learned through the next sentence prediction task, we have our pre-trained language model, BERT. With that, we can take BERT and everything it has learned and fine-tune it on a specific task of our choosing. We're going to be focusing on three tasks in this lesson. The first one is sequence classification, and the way the architecture is laid out looks relatively similar to our next sentence prediction task in that we're going to be feeding BERT a sentence. The difference is that we'll only be feeding it one sequence, "Istanbul is a great city," for example. We're not going to have a second sentence coming after the SEP token; we'll simply have one sentence. We will then pass that sentence through our pre-trained BERT…
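
To make the sequence classification setup concrete, here is a minimal sketch of fine-tuning a pre-trained BERT with a classification head using the Hugging Face transformers library. This is not code from the course; the checkpoint name, the two-label setup, the example label, and the hyperparameters are illustrative assumptions.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Load pre-trained BERT weights and attach a fresh classification head.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# A single sequence: [CLS] sentence [SEP], with no second segment.
inputs = tokenizer("Istanbul is a great city", return_tensors="pt")
labels = torch.tensor([1])  # illustrative label, e.g. 1 = positive

# One fine-tuning step: forward pass, cross-entropy loss, backward pass.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()

# At inference time, the classification logits come from the [CLS] representation.
model.eval()
with torch.no_grad():
    predicted_label = model(**inputs).logits.argmax(dim=-1)
print(predicted_label)

In practice you would loop this training step over a labeled dataset rather than a single sentence, but the structure is the same: one sequence in, the pre-trained BERT encoder, and a small task-specific head on top.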