Machine Learning Overview:
How did we get to here?
Prof. Alan F. Smeaton
Insight @ DCU
WW II, the first modern computers
•  1943 Mauchly and Eckert prepare a proposal for the US
Army to build an Electronic Numerical Integrator
–  calculate a trajectory in 1 second
•  1944 early thoughts on stored program computers by
members of the ENIAC team
•  Spring 1945 ENIAC working well
•  Alan Turing, recruited as a codebreaker,
had the Automatic Computing Engine
(ACE) basic design by Spring, 1946
•  Pilot ACE working, May 10, 1950
•  Full version of ACE at NPL, 1959
The Late 40s
•  1947 the first computer “bug”, when Grace
Murray Hopper found a moth trapped in the jaws
of an electromechanical relay on the Mark II
computer at Harvard
•  1947 – Shockley et al. invent the transistor
•  1949 – Wilkes at Cambridge developed
EDSAC, the first large scale, fully
operational stored program computer
•  1951 – Remington-Rand sold Univac 1 to
the US government for $1,000,000
The 1950s
•  IBM produces a series of computers with Gene
Amdahl as chief architect
•  Memory upgraded to magnetic core memory,
magnetic tapes and disks with movable read/write
heads
•  Fortran was introduced in 1957, still around, used
in large-scale legacy data processing
•  Integrated Circuit was invented in 1958
The 1960s
•  Computers begin to have
business uses
•  1965 the IBM/360 Mainframe
introduced using integrated
circuits
•  In 1965 DEC introduced PDP-8,
first minicomputer
– Bill Gates’ school bought one and
he used it daily
•  In 1969 work began on
ARPAnet (the predecessor of
the internet)
The Early 1970s
•  In 1971 Intel releases the 4004, the
first microprocessor
– Also in 1971, the first floppy disk is introduced
•  In 1973, Xerox invents Ethernet, the common
networking standard
•  In 1975, the first PC, MITS Altair 8800 (no
keyboard, no display, no auxiliary storage)
•  Bill Gates and Paul Allen (high school friends)
wrote software for the Altair that allowed users to
write their first program … became their first
product
The 1980s
•  In 1981 IBM releases the IBM PC based
on an Intel 8088 chip
•  In the same year, Microsoft releases an
operating system called DOS
•  In 1982, Time magazine announces the
computer as the 1982 Man of the Year
•  Portable computers
released in 1982
•  Also in 1982, the first
affordable home computer,
the Commodore 64
•  Apple Macintosh released in 1984
What’s in common ?
•  John von Neumann
(1952) proposed a
model for computers
… that’s still with us
Could computers do AI ?
•  Since the early days, always the question of could
computers do artificial intelligence ?
The Human Brain
•  1.5kg, or 2% of our body weight, made of 86B neurons
(grey matter) connected by trillions of connections
(synapses)
•  Responsible for involuntary functions (heart,
breathing, digestion, etc.) and voluntary movement,
in addition to executive functions like self-control,
planning, reasoning, and abstract thought.
•  This architecture of a huge number of
simple, connected processors is good for
solving very complex problems, like vision
and learning
Human Memory
•  The brain has been
coarsely mapped
•  The architecture is of simple but massively parallel
processing, a form of perceptron, highly connected
graph of nodes and links with signal-passing
How neurons work …
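The signal-passing unit described above can be sketched as a single artificial neuron: a weighted sum of incoming signals fired through a step activation. This is a minimal illustrative sketch; the weights and inputs are made-up values, not anything from the slides.

```python
def neuron(inputs, weights, bias, threshold=0.0):
    """A single artificial neuron: weighted sum of incoming
    signals plus a bias, fired through a step activation."""
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > threshold else 0

# Strong enough incoming signal makes the neuron fire
signal = [0.9, 0.1, 0.4]
print(neuron(signal, [0.5, 0.5, 0.5], bias=-0.5))  # 1 (fires)
print(neuron([0.0, 0.0, 0.0], [0.5, 0.5, 0.5], bias=-0.5))  # 0 (silent)
```

Networks of these units, densely connected so that each neuron's output feeds many others, give the massively parallel graph-of-nodes-and-links architecture the slide describes.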
What’s a BCI?
•  So how do we implement Artificial Intelligence, emulating
the human brain, on the vN architecture ?
•  Can it be implemented directly in hardware ?
Late 1980s, Connection Machine
•  Problem with the CM was it only had thousands of
connected nodes, not enough to simulate “intelligence”
•  So it was used for simple, massively parallel apps.
Raised expectations … AI “Winter”
Classic Gartner Curve !
Neural Network research continued
•  Research into neural networks – computational
implementation of the brain’s structure – continued but at a
slow burn and a bit of an AI backwater !
•  Meanwhile, hand-crafted, rules-driven AI research
continued through 70s, 80s, 90s, even 00s
•  AI applications were
–  Speech
–  Machine Translation
–  Computer Vision
–  Expert Systems
–  Etc.
Here’s how Neural Networks developed …
Meanwhile in a galaxy far far away …
AI and Machine Learning
•  Machine learning evolved as an AI tool
•  With input from mathematics and statistics rather than
any neural network connection
•  Slow evolution of Machine Learning over decades
•  Nourished by increasing availability of huge data volumes
from … internet searching … social media …online
transactions … etc.
•  One application which pushed this was computer vision
What does this mean?
Machine Learning of Semantic Concepts
•  Use Machine Learning to train a classifier to identify
an object
–  Decision tree learning
–  Random forests
–  Genetic programming
–  Support vector machines
•  Given some input data (e.g. SIFT features or colours
or textures or all 3)
•  Given a lot of + and - examples
•  Let the computer figure out how to classify new
examples into + or – clusters … that’s the modus
operandi of machine learning
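That modus can be made concrete with a toy classifier. This is a minimal sketch using a nearest-centroid rule rather than the decision trees, forests, or SVMs named above; the two-feature training points are invented for illustration.

```python
def train_centroids(examples):
    """Learn one centroid (mean vector) per class from labelled
    examples: a list of (feature_vector, label) pairs."""
    sums, counts = {}, {}
    for vec, label in examples:
        counts[label] = counts.get(label, 0) + 1
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
    return {lab: [s / counts[lab] for s in acc] for lab, acc in sums.items()}

def classify(centroids, vec):
    """Assign vec to the class whose centroid is nearest (Euclidean)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lab: dist2(vec, centroids[lab]))

# + and - training examples in a 2-D feature space
training = [([1.0, 1.1], '+'), ([0.9, 1.0], '+'),
            ([-1.0, -0.9], '-'), ([-1.1, -1.0], '-')]
model = train_centroids(training)
print(classify(model, [0.8, 0.9]))  # '+'
```

The input vectors stand in for whatever features are used (SIFT, colours, textures); the learning step only sees numbers and labels.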
Concept Detection
•  We should develop a suite of retrieval techniques that can be used for
such features …
FEATURE DETECTION
0.2 Indoor
0.8 Outdoor
0.7 CityScape
0.3 Landscape
0.1 People
0.0 Face
0.8 Sky
0.2 Vegetation
0.7 Building
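The detector scores above become image tags by thresholding. A minimal sketch, using the scores from this slide; the 0.5 cut-off is an assumption for illustration.

```python
# Per-concept detector confidences, as listed on the slide
scores = {"Indoor": 0.2, "Outdoor": 0.8, "CityScape": 0.7,
          "Landscape": 0.3, "People": 0.1, "Face": 0.0,
          "Sky": 0.8, "Vegetation": 0.2, "Building": 0.7}

def detected(scores, threshold=0.5):
    """Keep only the concepts whose confidence clears the threshold."""
    return sorted(c for c, s in scores.items() if s >= threshold)

print(detected(scores))  # ['Building', 'CityScape', 'Outdoor', 'Sky']
```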
How does “standard” ML work ?
•  Lots of + and – examples as training data
•  Plot each data point onto an n-dimensional space
•  Learn a boundary function which differentiates
•  Train and test, refine, then deploy
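The four steps above can be sketched end to end: plot labelled points in a 2-D space, learn a linear boundary with the classic perceptron update rule, then evaluate on held-out test points. The data is an invented, linearly separable toy set.

```python
def train_boundary(data, epochs=20, lr=0.1):
    """Perceptron rule: learn weights w and bias b so that
    sign(w.x + b) separates the + (1) and - (-1) examples."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
            if pred != y:  # misclassified: nudge the boundary toward the point
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                b += lr * y
    return w, b

def predict(w, b, point):
    return 1 if w[0] * point[0] + w[1] * point[1] + b > 0 else -1

train = [((2.0, 2.0), 1), ((1.5, 2.5), 1),
         ((-2.0, -2.0), -1), ((-2.5, -1.5), -1)]
test = [((3.0, 1.0), 1), ((-1.0, -3.0), -1)]
w, b = train_boundary(train)
accuracy = sum(predict(w, b, p) == y for p, y in test) / len(test)
print(accuracy)  # 1.0 on this separable toy set
```

Real systems do the same thing in many more dimensions, with a refine-and-redeploy loop around the train/test step.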
Visualisation
So how well did it work ?
And in 2013 ?
Standard Machine Learning …
•  So we’re bumbling along, slow and steady
progress in using ML for CV, and ML is being used
elsewhere also … from parole recommendations in
US court cases … to recommending books from
Amazon … to …
One of the pluses of “standard ML” is that
we can conjure up an explanation for a result
•  Another plus is that we can rate the relative
importance of each of the axes, i.e. the features
•  A downside is we have to do feature engineering to
define the axes, and that’s a black art
•  Lots of applications across domains
And then this happened !
In 2012, Krizhevsky et al. “won” the ImageNet large scale visual
recognition challenge with a convolutional neural network
How does this happen ?
In 2012, Krizhevsky et al. “won” the ImageNet large scale visual
recognition challenge with a convolutional neural network
Since then we’re all on that bandwagon .. er .. “following that line”
•  Deep Convolutional Neural Networks are end-to-end
solutions in which both the feature extraction and the
classifier training are performed at once.
•  The first layers extract information which is
similar to the features/descriptors extracted in the classical
approaches.
•  These, called “deep features”, turn out to be significantly
more effective than the classical “engineered” ones, even
when used with classical machine learning for classifier
training
•  Once a model for a concept is built, it can be packaged
and released and easily run in a hosted environment
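What those first, feature-extracting layers compute is essentially convolution. A minimal pure-Python sketch: sliding a small vertical-edge kernel over a toy image, with values invented for illustration (real CNNs learn their kernels rather than hand-picking them).

```python
def convolve2d(image, kernel):
    """Valid-mode 2-D convolution (no padding): the core operation
    of a CNN's early, feature-extracting layers."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical edge: dark (0) on the left, bright (1) on the right
image = [[0, 0, 0, 1, 1, 1]] * 3
edge_kernel = [[-1, 0, 1]] * 3  # responds strongly at vertical edges
print(convolve2d(image, edge_kernel))  # [[0, 3, 3, 0]]
```

The response is large only where the edge sits and zero in flat regions, which is exactly the kind of low-level descriptor classical pipelines used to engineer by hand.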
Online systems now available
•  Imagga, Bulgarian SME, offering up to 6,000 visual tags
Concepts in image search
•  Google+ photos now uses computer vision and machine
learning to identify objects and settings in your uploaded
snapshots
My uploads
How good can a computer get at
automatically annotating images
without using any context at all ?
How to implement CNNs ?
•  It’s definitely not a von Neumann architecture
•  So we kludge it by throwing massive parallelism … what’s
the cheapest massive parallelism … GPUs
•  New role for Nvidia … not necessarily fast, but many of
them !
•  An alternative is to design new chips …
•  Intel deep learning chips, code-named Lake Crest followed
by Knights Crest, under development for a 2017 arrival
•  Samsung handset chips and Qualcomm chips, allow deep
learning on devices, and Movidius (now part of Intel)
specialising in computer vision using deep learning on
silicon.
How to implement CNNs ?
•  Google was starting to use deep networks for NLP
applications like speech, and also to run its Panda website
rating, but needed more horsepower, roughly double!
•  Tensor Processing Unit (TPU) designed from scratch, very
efficient
•  Used for executing neural networks rather than training ..
reportedly saved building an estimated 15 extra data centres, and
can run neural networks up to 30 times faster than a conventional vN chip
•  Downsides … need lots and lots and lots of training data,
lots and lots of compute resources … and no facility for
explaining why a decision or categorisation or output, is
made
What of von Neumann architecture ?
•  We used to have people, writing
algorithms, encoded as
programs which were stored
and run
•  Now we also have data, used to
train networks and develop
models (sets of weights and
network configurations) which
are stored and run
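The contrast on this final slide can be made concrete: in both cases, behaviour is stored data executed by the machine. A minimal illustrative sketch (all names and values invented), where a hand-written algorithm and a learned set of weights compute the same function.

```python
# Classic von Neumann view: a person writes the algorithm as a program
def double_classic(x):
    return 2 * x

# ML view: the behaviour lives in stored weights, produced by training
model = {"weight": 2.0, "bias": 0.0}  # learned from data, not hand-written

def run_model(model, x):
    """A generic runner: the same code executes whatever the weights encode."""
    return model["weight"] * x + model["bias"]

print(double_classic(21), run_model(model, 21))  # both compute 42
```

Either way the stored artefact, program text or weight set, is loaded and run, which is why the von Neumann model is still with us underneath.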
