TensorFlow Tutorial
Detecting Diabetic Retinopathy
Diabetic Retinopathy is the fastest growing cause of blindness in the world. Deep Learning algorithms using TensorFlow are
capable of interpreting signs of Diabetic Retinopathy in retinal photographs.
128,000 retinal images are fed to a neural network for training. Given a new retinal image, the trained network identifies whether the eye is healthy or diseased.
What’s in it for you?
What is Deep Learning?
What is TensorFlow?
Top Deep Learning Libraries
Why use TensorFlow?
Building a Computational Graph
Programming Elements in TensorFlow
Introducing Recurrent Neural Networks
Use case implementation of RNN using TensorFlow
What is Deep Learning?
A subset of Machine Learning, inspired by the structure and function of the human brain
Learns from unstructured data and performs complex computations
Uses a neural network with multiple layers to train an algorithm
(Figure: a deep learning network with an input layer, hidden layers, and an output layer)
Popular libraries for Deep Learning
TensorFlow
Keras
Theano
Torch
Deeplearning4j (DL4J)
Why use TensorFlow?
Provides both C++ and Python APIs, making it easier to work with
Reduces the chances of errors by 55% to 85%
Teams can run TensorFlow at scale on server farms, embedded devices, CPUs, GPUs, TPUs, etc.
Allows you to train models faster, as it has faster compilation times
What is TensorFlow?
Open-source library developed by the Google Brain team and released in 2015
Developed initially to run large sets of numerical computations
Uses data flow graphs to process data and perform computations
Takes data in the form of arrays of potentially higher dimensions and ranks (tensors)
What is a Tensor?
A tensor is a mathematical object represented as a multidimensional array. The arrays of data, of varying dimensions and ranks, that are fed as input to a neural network are called tensors.
Arrays of data of different dimensions are fed as input to the network.
Example: [a, m, k, q, d] is a tensor of dimensions [5].
What is a Tensor?
A tensor is a mathematical object represented as a multidimensional array. The arrays of data, of varying dimensions and ranks, that are fed as input to a neural network are called tensors.
Example: a 5×4 array of numbers, e.g.
[[1, 6, 8, 3],
 [9, 3, 3, 4],
 [1, 7, 4, 9],
 [1, 5, 3, 7],
 [1, 6, 9, 2]]
is a tensor of dimensions [5, 4].
What is a Tensor?
A tensor is a mathematical object represented as a multidimensional array. The arrays of data, of varying dimensions and ranks, that are fed as input to a neural network are called tensors.
Example: a 3×3×3 array of numbers is a tensor of dimensions [3, 3, 3].
Tensor Rank
The number of dimensions used to represent the data is known as its Rank.
S = 10                                              Tensor of Rank 0, or a Scalar
V = [10., 11., 12.]                                 Tensor of Rank 1, or a Vector
M = [[1, 2, 3], [4, 5, 6]]                          Tensor of Rank 2, or a Matrix
T = [[[1],[2],[3]],[[4],[5],[6]],[[7],[8],[9]]]     Tensor of Rank 3, or simply a Tensor
Tensor Data type
In addition to rank and shape, tensors have a data type. The following is a list of the data types:

Data Type   Python Type   Description
DT_FLOAT    tf.float32    32-bit floating point
DT_DOUBLE   tf.float64    64-bit floating point
DT_INT8     tf.int8       8-bit signed integer
DT_INT16    tf.int16      16-bit signed integer
DT_INT32    tf.int32      32-bit signed integer
DT_INT64    tf.int64      64-bit signed integer
DT_UINT8    tf.uint8      8-bit unsigned integer
DT_STRING   tf.string     Variable-length byte arrays
DT_BOOL     tf.bool       Boolean
Building a Computation Graph
Everything in TensorFlow is based on creating a computational graph: a network of nodes, with each node performing an operation such as addition, multiplication, or evaluating a multivariate equation.
(Figure: an example data flow graph in which input nodes feed add and mul nodes, labeled a through e, to produce the final result)
In TensorFlow, a computation is described using a data flow graph.
Nodes represent mathematical operations, and edges represent tensors.
Building a Computation Graph
Let's compute a function F of 3 variables a, b, c: F(a, b, c) = 5(a + bc)
p = bc
q = a + p
F = 5*q
With a = 4, b = 3, c = 5: p = 15, q = 19, F = 95
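The slide shows this graph as a diagram; as a minimal sketch (using the TF 1.x graph-and-session style that the following slides use), the same graph can be built and evaluated like this:

import tensorflow as tf

a = tf.constant(4.0)
b = tf.constant(3.0)
c = tf.constant(5.0)

p = b * c      # p = bc
q = a + p      # q = a + p
F = 5 * q      # F = 5*q

with tf.Session() as sess:
    print(sess.run(F))   # 95.0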
Programming elements in TensorFlow
Constants
Constants are parameters whose value does not change. To define a constant, we use the tf.constant() command.
Example:
a = tf.constant(2.0, tf.float32)
b = tf.constant(3.0)
print(a, b)
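One note on the example above (TF 1.x behaviour): print(a, b) displays the tensor objects, not their values. To see the values, evaluate the constants in a session:

with tf.Session() as sess:
    print(sess.run([a, b]))   # [2.0, 3.0]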
Programming elements in TensorFlow
Variables
Variables allow us to add new trainable parameters to the graph. To define a variable, we use the tf.Variable() command and initialize it before running the graph in a session.
Example:
W = tf.Variable([.3],dtype=tf.float32)
b = tf.Variable([-.3],dtype=tf.float32)
x = tf.placeholder(tf.float32)
linear_model = W*x+b
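A minimal sketch of the initialization step mentioned above (the sample input values are an assumption):

init = tf.global_variables_initializer()
with tf.Session() as sess:
    sess.run(init)
    print(sess.run(linear_model, feed_dict={x: [1., 2., 3., 4.]}))   # approximately [0. 0.3 0.6 0.9]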
Programming elements in TensorFlow
Placeholder
Placeholders allow us to feed data to a TensorFlow model from outside the model. They permit a value to be assigned later. To define a placeholder, we use the tf.placeholder() command.
Example:
a = tf.placeholder(tf.float32)
b = a * 2
with tf.Session() as sess:
    result = sess.run(b, feed_dict={a: 3.0})
    print(result)
feed_dict specifies tensors
that provide concrete values to
the placeholders
Programming elements in TensorFlow
Session
A session is run to evaluate the nodes. This is called the TensorFlow runtime.
Example:
a = tf.constant(2.0)
b = tf.constant(4.0)
c = a+b
# Launch Session
sess = tf.Session()
# Evaluate the tensor c
print(sess.run(c))
Running a Computation Graph: the constants a = 2.0 and b = 4.0 flow into the addition node, producing c = 6.0
Linear Regression using TensorFlow
Let’s work on a regression example to solve a simple equation [y=m*x+b]. We will calculate the slope and the intercept of the
line that best fits our data.
1. Setting up some artificial data for regression
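The slide's code is shown as an image; here is a minimal sketch of what step 1 might look like (the data size and the true slope of 0.5 and intercept of 5 are assumptions chosen for illustration):

import numpy as np

np.random.seed(101)
x_data = np.linspace(0.0, 10.0, 1000000)
noise = np.random.randn(len(x_data))
y_true = (0.5 * x_data) + 5 + noise   # artificial data scattered around y = 0.5*x + 5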
Linear Regression using TensorFlow
2. Plot the Data
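Continuing the sketch, step 2 plots a random sample of the data (sampling 250 points keeps the scatter plot readable; the column names are assumptions):

import pandas as pd
import matplotlib.pyplot as plt

my_data = pd.concat([pd.DataFrame(data=x_data, columns=['X Data']),
                     pd.DataFrame(data=y_true, columns=['Y'])], axis=1)
my_data.sample(n=250).plot(kind='scatter', x='X Data', y='Y')
plt.show()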
Linear Regression using TensorFlow
3. Assign the variables
4. Apply the Cost Function
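Continuing the sketch, steps 3 and 4 define the trainable slope and intercept and a squared-error cost over a small batch (the random initial values and the batch size of 8 are assumptions):

import tensorflow as tf

batch_size = 8

m = tf.Variable(np.random.randn(), dtype=tf.float32)   # slope
b = tf.Variable(np.random.randn(), dtype=tf.float32)   # intercept

xph = tf.placeholder(tf.float32, [batch_size])
yph = tf.placeholder(tf.float32, [batch_size])

y_model = m * xph + b                              # y = m*x + b
error = tf.reduce_sum(tf.square(yph - y_model))    # cost: sum of squared errors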
Linear Regression using TensorFlow
5. Apply the Optimization function
6. Initialize the variables
7. Create the session and run the computation
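Continuing the sketch, steps 5 through 7 minimize the cost with gradient descent (the learning rate and the number of batches are assumptions):

optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.001)
train = optimizer.minimize(error)

init = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init)
    for i in range(1000):                                  # number of training batches
        rand_ind = np.random.randint(len(x_data), size=batch_size)
        feed = {xph: x_data[rand_ind], yph: y_true[rand_ind]}
        sess.run(train, feed_dict=feed)
    final_slope, final_intercept = sess.run([m, b])        # fetch the learned parameters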
Linear Regression using TensorFlow
8. Print the slope and intercept
Linear Regression using TensorFlow
9. Evaluate the results
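Continuing the sketch, steps 8 and 9 print the learned parameters and plot the fitted line against a sample of the data:

print('slope:', final_slope, 'intercept:', final_intercept)

x_test = np.linspace(-1, 11, 10)
y_pred_plot = final_slope * x_test + final_intercept

my_data.sample(n=250).plot(kind='scatter', x='X Data', y='Y')
plt.plot(x_test, y_pred_plot, 'r')
plt.show()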
Introducing Recurrent Neural Networks
In a Feed-Forward Network such as an ANN or CNN, information flows only in the forward direction: from the input nodes, through the hidden layers (if any), to the output nodes. There are no cycles or loops in the network.
(Figure: a feed-forward network whose inputs are features such as a yellow patch, petal length and width, and sepal length and width, and whose output classifies the flower as Iris or Not Iris)
Introducing Recurrent Neural Networks
Recurrent Neural Networks are used to handle sequential time series data because they are able to remember previous inputs. At a particular time step, an RNN accepts the previous output along with the current input to generate the output for that time step.
(Figure: rotate the neural network vertically and compress the layers into a single recurrent unit with input x, hidden state h, output y, and weight w)
What does a Recurrent Neural Network look like?
(Figure: the compact loop with input x, hidden state h, output y, and weight w is unfolded in time into copies with input x_t, hidden state h_t, and output y_t at time steps t-1, t, and t+1, all sharing the same weight w)
h_t = f_w(h_{t-1}, x_t)
where
h_t = new state
f_w = function with parameter w
h_{t-1} = old state
x_t = input vector at time step t
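As a minimal sketch of a single recurrent step (the tanh activation and the split of w into separate recurrent and input weight matrices are common choices, not something the slide specifies):

import numpy as np

hidden_size, input_size = 4, 3
W_hh = np.random.randn(hidden_size, hidden_size)   # recurrent weights
W_xh = np.random.randn(hidden_size, input_size)    # input weights
h_prev = np.zeros(hidden_size)                     # old state h_{t-1}
x_t = np.random.randn(input_size)                  # input vector at time step t

# One step of the recurrence h_t = f_w(h_{t-1}, x_t)
h_t = np.tanh(W_hh @ h_prev + W_xh @ x_t)
print(h_t)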
Types of RNN
While Feed-Forward Networks map one input to one output, Recurrent Neural Networks can map one to many, many to one
and many to many.
One to One One to Many Many to One Many to Many
Types of RNN
While Feed-Forward Networks map one input to one output, Recurrent Neural Networks can map one to many, many to one
and many to many.
One to One One to Many Many to One Many to Many
• Known as the Vanilla Neural Network. Used for regular machine learning problems (1 input, 1 output)
Types of RNN
While Feed-Forward Networks map one input to one output, Recurrent Neural Networks can map one to many, many to one
and many to many.
One to One One to Many Many to One Many to Many
• Used for image captioning. Given an image, it generates a sequence of words captioning the image (1 input, multiple outputs)
Types of RNN
While Feed-Forward Networks map one input to one output, Recurrent Neural Networks can map one to many, many to one
and many to many.
One to One One to Many Many to One Many to Many
• Used to carry out Sentiment Analysis. Given a set of words, it tells you the sentiment present (multiple inputs, 1 output)
Types of RNN
While Feed-Forward Networks map one input to one output, Recurrent Neural Networks can map one to many, many to one
and many to many.
One to One One to Many Many to One Many to Many
• Used in Machine Translation. Given a set of words in one language, it translates them into another (multiple inputs, multiple outputs)
Use case implementation of RNN
Let's look at a use case: predicting the monthly milk production per cow, in pounds, using time series data from Jan 1962 to Dec 1975. How much milk production can we expect in a given month?
Use case implementation of RNN
1. Import the necessary libraries
2. Read the dataset and print its head
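The slide's code appears as an image; a minimal sketch of steps 1 and 2 (the file name monthly-milk-production.csv and the Month index column are assumptions):

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import tensorflow as tf

milk = pd.read_csv('monthly-milk-production.csv', index_col='Month')
print(milk.head())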
Use case implementation of RNN
3. Convert the index to time series
4. Plot the time series data
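Continuing the sketch, steps 3 and 4:

milk.index = pd.to_datetime(milk.index)
milk.plot()
plt.show()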
Use case implementation of RNN
5. Perform the train test split on the data
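Continuing the sketch, step 5 holds out the last 12 months as the test set (the exact split point is an assumption; Jan 1962 to Dec 1975 gives 168 monthly observations):

train_set = milk.head(156)   # Jan 1962 - Dec 1974
test_set = milk.tail(12)     # the final 12 months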
Use case implementation of RNN
6. Scale the data using a standard Machine Learning preprocessing step
7. Apply the batch function
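Continuing the sketch, steps 6 and 7 (the exact form of the batch helper is an assumption; it samples a random window of consecutive months and returns it as input and one-step-ahead target sequences):

from sklearn.preprocessing import MinMaxScaler

scaler = MinMaxScaler()
train_scaled = scaler.fit_transform(train_set)
test_scaled = scaler.transform(test_set)

def next_batch(training_data, steps):
    # Pick a random window of length steps+1, then return
    # (first steps values, last steps values) as input/target sequences
    rand_start = np.random.randint(0, len(training_data) - steps)
    window = np.array(training_data[rand_start:rand_start + steps + 1]).reshape(1, steps + 1)
    return window[:, :-1].reshape(-1, steps, 1), window[:, 1:].reshape(-1, steps, 1)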
Use case implementation of RNN
8. Set up the RNN model
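Continuing the sketch, step 8 builds a single-layer RNN whose output is projected down to one value per time step (the layer size and other constants are assumptions; the API used is TF 1.x tf.contrib.rnn):

num_inputs = 1
num_time_steps = 12
num_neurons = 100
num_outputs = 1
learning_rate = 0.03
num_train_iterations = 4000

cell = tf.contrib.rnn.OutputProjectionWrapper(
    tf.nn.rnn_cell.BasicRNNCell(num_units=num_neurons, activation=tf.nn.relu),
    output_size=num_outputs)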
Use case implementation of RNN
9. Create Placeholders for X and y
10. Apply the loss function and optimizer
11. Initialize the global variables
12. Create an instance of tf.train.Saver()
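Continuing the sketch, steps 9 through 12:

X = tf.placeholder(tf.float32, [None, num_time_steps, num_inputs])
y = tf.placeholder(tf.float32, [None, num_time_steps, num_outputs])

outputs, states = tf.nn.dynamic_rnn(cell, X, dtype=tf.float32)

loss = tf.reduce_mean(tf.square(outputs - y))               # mean squared error
train = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(loss)

init = tf.global_variables_initializer()
saver = tf.train.Saver()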
Use case implementation of RNN
13. Create the session and run it
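Continuing the sketch, step 13 trains the model and saves a checkpoint (the checkpoint path is an assumption):

with tf.Session() as sess:
    sess.run(init)
    for iteration in range(num_train_iterations):
        X_batch, y_batch = next_batch(train_scaled, num_time_steps)
        sess.run(train, feed_dict={X: X_batch, y: y_batch})
        if iteration % 100 == 0:
            mse = loss.eval(feed_dict={X: X_batch, y: y_batch})
            print(iteration, '\tMSE:', mse)
    saver.save(sess, './rnn_time_series_model')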
Use case implementation of RNN
14. Display the Test Data
Use case implementation of RNN
15. Create a seed training instance from the last 12 months of the training data and use it to predict the next 12 months of milk production
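Continuing the sketch, step 15 restores the trained model and generates 12 months one step at a time, feeding each prediction back in as the newest input:

with tf.Session() as sess:
    saver.restore(sess, './rnn_time_series_model')
    train_seed = list(train_scaled[-12:])     # seed: the last 12 months of training data
    for iteration in range(12):
        X_batch = np.array(train_seed[-num_time_steps:]).reshape(1, num_time_steps, 1)
        y_pred = sess.run(outputs, feed_dict={X: X_batch})
        train_seed.append(y_pred[0, -1, 0])   # keep only the newest prediction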
Use case implementation of RNN
16. Display the results of the prediction
Use case implementation of RNN
17. Reshape the results
18. Create a new column on the test data called Generated
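Continuing the sketch, steps 17 and 18 un-scale the generated values back to pounds and attach them to the test dataframe:

results = scaler.inverse_transform(np.array(train_seed[12:]).reshape(12, 1))
test_set['Generated'] = results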
Use case implementation of RNN
19. View the test_set dataframe
Use case implementation of RNN
20. Plot the predicted result and the actual result
Key Takeaways