PYTORCH AND
MACHINE LEARNING
FOR THE MATH IMPAIRED
WHO AM I
▸Tyrel Denison
▸Senior Developer at Field Agent
▸@tyreldenison
DEEP LEARNING, WHAT IS IT?
BUT I SUCK AT MATH!!
LOOK AT THIS NETWORK
CONVOLUTION
MAX POOLING
NETWORK FLOW
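As a rough illustration of what those three slides describe (a minimal sketch; the layer sizes are chosen to match the CIFAR10 network shown later and are not taken from the slides):
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.autograd import Variable

conv = nn.Conv2d(3, 6, 5)      # convolution: slide 5x5 filters over a 3-channel image, producing 6 feature maps
pool = nn.MaxPool2d(2, 2)      # max pooling: keep only the largest value in each 2x2 window

img = Variable(torch.randn(1, 3, 32, 32))   # a fake 32x32 RGB image
features = pool(F.relu(conv(img)))          # the basic network flow: convolution -> ReLU -> pooling
print(features.size())                      # torch.Size([1, 6, 14, 14])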
WHY PYTORCH
▸Imperative vs Symbolic
▸Dynamic vs Static Computation Graph
▸GPU Access (CUDA)
▸Pythonic
▸In and out of NumPy (very similar APIs)
▸Autograd and history tracking (see the sketch below)
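A quick sketch of those last two points, not taken from the slides (using the same pre-0.4 Variable API as the rest of this deck):
import numpy as np
import torch
from torch.autograd import Variable

# NumPy array -> PyTorch tensor and back (the two share memory, so no copying)
a = np.ones((2, 3))
t = torch.from_numpy(a)   # torch.DoubleTensor of size 2x3
b = t.numpy()             # back to a NumPy array

# autograd: wrap a tensor in a Variable and PyTorch records every operation on it
x = Variable(torch.ones(2, 2), requires_grad=True)
y = (x * 3).sum()
y.backward()              # fills x.grad with dy/dx
print(x.grad)             # a 2x2 Variable full of 3s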
WHAT'S A TENSOR
from __future__ import print_function
import torch
x = torch.Tensor(5, 3)  # uninitialized 5x3 matrix: the values below are whatever happened to be in memory
print(x)
>>0.0000 0.0000 0.0001
0.0000 0.0001 0.0000
3.3717 0.0000 3.3717
0.0000 3.8859 0.0000
3.8001 0.0000 27.0173
[torch.FloatTensor of size 5x3]
print(x.size())
>>torch.Size([5, 3])
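Tensors also support the usual NumPy-style operations (a quick sketch, not on the original slides):
x = torch.rand(5, 3)        # random values in [0, 1) instead of uninitialized memory
y = torch.rand(5, 3)
print(x + y)                # element-wise addition
print(torch.add(x, y))      # same operation, functional form
y.add_(x)                   # in-place variant (the trailing underscore mutates y)
print(x[:, 1])              # NumPy-style slicing: the second column of x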
MEDIA TO DATA
▸Pillow and OpenCV for images (see the sketch below)
▸scipy and librosa for audio
▸Python and Cython for text
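For images, the conversion is usually just a couple of lines (a sketch; 'cat.jpg' is a hypothetical file name):
from PIL import Image
import torchvision.transforms as transforms

img = Image.open('cat.jpg')          # load the image with Pillow
x = transforms.ToTensor()(img)       # convert to a CxHxW float tensor scaled to [0, 1]
print(x.size())                      # e.g. torch.Size([3, 480, 640]), depending on the image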
NEURAL NETWORK IN PYTORCH
import torch
from torch.autograd import Variable
import torch.nn as nn
import torch.nn.functional as F
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        # two convolution layers: 3-channel image -> 6 feature maps -> 16 feature maps
        self.conv1 = nn.Conv2d(3, 6, 5)
        self.pool = nn.MaxPool2d(2, 2)
        self.conv2 = nn.Conv2d(6, 16, 5)
        # three fully connected layers reducing down to 10 class scores
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = x.view(-1, 16 * 5 * 5)  # flatten the feature maps for the linear layers
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        return x
net = Net()
input = Variable(torch.randn(1, 3, 32, 32))  # conv1 expects a 3-channel 32x32 image, not a single channel
out = net(input)
print(out)
>>Variable containing:
0.0490 0.0798 0.0189 -0.0231 0.0577 -0.0843 -0.0357 -0.0950 0.0220 -0.1126
[torch.FloatTensor of size 1x10]
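In the same spirit as the pytorch.org tutorial this example follows, you can push an arbitrary gradient back through the toy network to watch autograd fill in parameter gradients (a small sketch; the random vector stands in for a real loss gradient):
net.zero_grad()                    # clear any previously accumulated gradients
out.backward(torch.randn(1, 10))   # backprop an arbitrary gradient signal
print(net.conv1.bias.grad)         # every parameter now has a populated .grad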
CIFAR10 TRAINING DATA
import torch
import torchvision
import torchvision.transforms as transforms
transform = transforms.Compose(
    [transforms.ToTensor(),
     transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))])

trainset = torchvision.datasets.CIFAR10(root='./data', train=True,
                                        download=True, transform=transform)
trainloader = torch.utils.data.DataLoader(trainset, batch_size=4,
                                          shuffle=True, num_workers=2)

testset = torchvision.datasets.CIFAR10(root='./data', train=False,
                                       download=True, transform=transform)
testloader = torch.utils.data.DataLoader(testset, batch_size=4,
                                         shuffle=False, num_workers=2)

classes = ('plane', 'car', 'bird', 'cat',
           'deer', 'dog', 'frog', 'horse', 'ship', 'truck')
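Before training, it is worth sanity-checking what the loader yields (a small sketch, not on the original slides):
dataiter = iter(trainloader)
images, labels = next(dataiter)    # one mini-batch of 4 CIFAR10 images
print(images.size())               # torch.Size([4, 3, 32, 32])
print(' '.join(classes[labels[j]] for j in range(4)))  # the four labels as class names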
TRAINING THE NETWORK
import torch.optim as optim
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)
for epoch in range(2):  # loop over the dataset multiple times
    running_loss = 0.0
    for i, data in enumerate(trainloader, 0):
        # get the inputs
        inputs, labels = data
        # wrap them in Variable
        inputs, labels = Variable(inputs), Variable(labels)
        # zero the parameter gradients
        optimizer.zero_grad()
        # forward + backward + optimize
        outputs = net(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()
        # print statistics
        running_loss += loss.data[0]
        if i % 2000 == 1999:  # print every 2000 mini-batches
            print('[%d, %5d] loss: %.3f' %
                  (epoch + 1, i + 1, running_loss / 2000))
            running_loss = 0.0
print('Finished Training')
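Two epochs take a while on a CPU, so it is worth keeping the learned weights around (a sketch; the file name is arbitrary):
torch.save(net.state_dict(), 'cifar_net.pth')   # persist the trained parameters

# later, rebuild the network and load the weights instead of retraining
net = Net()
net.load_state_dict(torch.load('cifar_net.pth'))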
TESTING THE NETWORK
correct = 0
total = 0
for data in testloader:
    images, labels = data
    outputs = net(Variable(images))
    _, predicted = torch.max(outputs.data, 1)
    total += labels.size(0)
    correct += (predicted == labels).sum()

print('Accuracy of the network on the 10000 test images: %d %%' % (
    100 * correct / total))
>>Accuracy of the network on the 10000 test images: 53 %
class_correct = list(0. for i in range(10))
class_total = list(0. for i in range(10))
for data in testloader:
    images, labels = data
    outputs = net(Variable(images))
    _, predicted = torch.max(outputs.data, 1)
    c = (predicted == labels).squeeze()
    for i in range(4):
        label = labels[i]
        class_correct[label] += c[i]
        class_total[label] += 1

for i in range(10):
    print('Accuracy of %5s : %2d %%' % (
        classes[i], 100 * class_correct[i] / class_total[i]))
>>Accuracy of plane : 43 %
Accuracy of car : 67 %
Accuracy of bird : 27 %
Accuracy of cat : 60 %
Accuracy of deer : 44 %
Accuracy of dog : 36 %
Accuracy of frog : 64 %
Accuracy of horse : 56 %
Accuracy of ship : 55 %
ARTICLES REFERENCED
▸http://pytorch.org/tutorials/
▸https://medium.com/@ageitgey/machine-learning-is-fun-80ea3ec3c471
▸https://hackernoon.com/learning-ai-if-you-suck-at-math-8bdfb4b79037