From the course: Learning Graph Neural Networks
Exercise: Training the baseline model
- Let's get started with the training of our neural network. I've set num_epochs to 200, so we'll train for 200 epochs, and I run a for loop over each epoch of training on line three. Because this dataset is really small, we don't need to batch the data before we pass it to the neural network. model.train will set the model in training mode, where gradients will be computed, and I zero out the gradients on the optimizer. On line seven, we make a forward pass through the model and get the raw logit scores at the output. And on line nine, we compute the training loss. When you compute this training loss, it's important that you only take into account the loss on the training nodes. That means you have to index both the outputs and the actual y values using the train_mask on the citeseer_graph, so we compute the loss only on the training nodes. loss.backward will compute gradients, and optimizer.step will update the model's parameters using this backward pass. On line 15, I get the…
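The loop described above can be sketched as follows. This is a minimal, self-contained sketch, not the instructor's exact code: it assumes a PyG-style graph object with `x`, `y`, and a boolean `train_mask`, and it substitutes a tiny fabricated graph and a plain linear layer for the real CiteSeer dataset and GNN so the snippet runs on its own.

```python
import torch
import torch.nn.functional as F

# Hypothetical stand-in for citeseer_graph (tiny random data, not CiteSeer)
class Graph:
    pass

citeseer_graph = Graph()
citeseer_graph.x = torch.randn(10, 8)            # 10 nodes, 8 features each
citeseer_graph.y = torch.randint(0, 3, (10,))    # 3 node classes
citeseer_graph.train_mask = torch.zeros(10, dtype=torch.bool)
citeseer_graph.train_mask[:6] = True             # first 6 nodes are training nodes

model = torch.nn.Linear(8, 3)                    # placeholder for the GNN
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

num_epochs = 200
for epoch in range(num_epochs):
    model.train()                 # training mode: gradients will be computed
    optimizer.zero_grad()         # zero out gradients on the optimizer
    out = model(citeseer_graph.x)  # forward pass -> raw logit scores
    # Index BOTH the outputs and the y values with train_mask,
    # so the loss is computed only on the training nodes
    loss = F.cross_entropy(out[citeseer_graph.train_mask],
                           citeseer_graph.y[citeseer_graph.train_mask])
    loss.backward()               # backward pass: compute gradients
    optimizer.step()              # update the model's parameters
```

Note that a real GNN's forward pass would also take the graph's edge_index, not just the node features; the linear layer here keeps the sketch focused on the training-loop mechanics.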