DBG / Oct 3, 2018 / © 2018 IBM Corporation
RNNs for Recommendation & Personalization
Nick Pentreath
Principal Engineer
@MLnick
About
@MLnick on Twitter & Github
Principal Engineer, IBM
CODAIT - Center for Open-Source Data & AI
Technologies
Machine Learning & AI
Apache Spark committer & PMC
Author of Machine Learning with Spark
Various conferences & meetups
Center for Open Source Data and AI Technologies
CODAIT
codait.org
CODAIT aims to make AI solutions
dramatically easier to create, deploy,
and manage in the enterprise
Relaunch of the Spark Technology
Center (STC) to reflect expanded
mission
Improving Enterprise AI Lifecycle in Open Source
Agenda
Recommender systems overview
Deep learning and RNNs
RNNs for recommendations
Challenges and future directions
Recommender Systems
Users and Items
Recommender Systems
Events
Recommender Systems
Implicit preference data
▪ Online – page view, click, app interaction
▪ Commerce – cart, purchase, return
▪ Media – preview, watch, listen
Explicit preference data
▪ Ratings, reviews
Intent
▪ Search query
Social
▪ Like, share, follow, unfollow, block
Context
Recommender Systems
Prediction
Recommender Systems
Prediction is ranking
– Given a user and context, rank the available items in order of likelihood that the user will interact with them
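In practice this means scoring all candidate items and taking the top k. A minimal numpy sketch, assuming hypothetical precomputed user and item vectors (user_vector and item_matrix are illustrative names, not from any real system):

import numpy as np

rng = np.random.default_rng(0)
user_vector = rng.normal(size=32)           # one user's latent / context vector
item_matrix = rng.normal(size=(1000, 32))   # vectors for 1000 candidate items

scores = item_matrix @ user_vector          # predicted likelihood of interaction per item
top_k = np.argsort(-scores)[:10]            # rank and keep the 10 best items
print(top_k)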
Matrix Factorization
Recommender Systems
The de facto standard model
– Represent user ratings as a user-item matrix
– Find two smaller matrices (called the factor
matrices) that approximate the full matrix
– Minimize the reconstruction error (i.e. rating
prediction / completion)
– Efficient, scalable algorithms
• Gradient Descent
• Alternating Least Squares (ALS)
– Prediction is simple
– Can handle implicit data
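A minimal sketch of this model in plain numpy: two small factor matrices, one SGD pass over observed ratings to reduce reconstruction error, and prediction as a dot product. The toy data and sizes are illustrative only (ALS would alternate closed-form solves instead of these gradient steps):

import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, k = 100, 500, 16
U = 0.1 * rng.normal(size=(n_users, k))     # user factor matrix
V = 0.1 * rng.normal(size=(n_items, k))     # item factor matrix

# Observed (user, item, rating) triples -- random here purely for illustration.
ratings = [(rng.integers(n_users), rng.integers(n_items), rng.uniform(1, 5))
           for _ in range(1000)]

lr, reg = 0.01, 0.1
for u, i, r in ratings:                     # one SGD pass over the observations
    err = r - U[u] @ V[i]                   # reconstruction error for this rating
    U[u] += lr * (err * V[i] - reg * U[u])
    V[i] += lr * (err * U[u] - reg * V[i])

print(U[0] @ V[42])                         # prediction is a simple dot product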
Cold Start
Recommender Systems
New items
– No historical interaction data
– Typically use baselines (e.g. popularity) or item content
New (or unknown) users
– Previously unseen or anonymous users have no user
profile or historical interactions
– Have context data (but possibly very limited)
– Cannot directly use collaborative filtering models
• Item-similarity for current item
• Represent session as aggregation of items (see the sketch below)
• Contextual models can incorporate short-term history
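A minimal sketch of the "session as aggregation of items" idea for anonymous users, assuming item embeddings are available from a previously trained collaborative filtering model (item_emb and the session item IDs are illustrative):

import numpy as np

rng = np.random.default_rng(0)
item_emb = rng.normal(size=(1000, 32))      # item embeddings from a trained CF model

session_items = [12, 87, 301]               # items viewed so far by the anonymous user
session_vec = item_emb[session_items].mean(axis=0)   # aggregate the session

scores = item_emb @ session_vec             # score all items against the session vector
print(np.argsort(-scores)[:10])             # top-10 recommendations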
Deep Learning and Recurrent Neural Networks
Overview
Deep Learning
Original theory from 1940s; computer models
originated around 1960s; fell out of favor in
1980s/90s
Recent resurgence due to
– Bigger (and better) data; standard datasets (e.g. ImageNet)
– Better hardware (GPUs)
– Improvements to algorithms, architectures and optimization
Leading to new state-of-the-art results in computer vision (images and video); speech/text; language translation and more
Source: Wikipedia
Modern Neural Networks
Deep Learning
Deep (multi-layer) networks
Computer vision
– Convolutional neural networks (CNNs)
– Image classification, object detection, segmentation
Sequences and time-series
– Recurrent neural networks (RNNs)
– Machine translation, text generation
– LSTMs, GRUs
Embeddings
– Text, categorical features
Deep learning frameworks
– Flexibility, computation graphs, auto-differentiation, GPUs
Source: Stanford CS231n
Recurrent Neural Networks
Deep Learning
Neural Network on Sequences …
– … sequence of neural network (layers)
– Hidden layers (state) dependent on previous state as well as
current input
– “memory” of what came before
Source: Stanford CS231n
– Share weights across all time steps
– Training using backpropagation through time (BPTT)
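A minimal numpy sketch of the recurrence described above: the same weights are applied at every time step and the hidden state carries the "memory" forward (sizes and data are illustrative):

import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 8, 16, 5
Wx = 0.1 * rng.normal(size=(hidden_dim, input_dim))   # input-to-hidden weights
Wh = 0.1 * rng.normal(size=(hidden_dim, hidden_dim))  # hidden-to-hidden weights
b = np.zeros(hidden_dim)

x = rng.normal(size=(seq_len, input_dim))   # one input sequence
h = np.zeros(hidden_dim)                    # initial hidden state
for t in range(seq_len):                    # same Wx, Wh, b shared at every time step
    h = np.tanh(Wx @ x[t] + Wh @ h + b)     # state depends on current input and previous state
print(h.shape)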
Recurrent Neural Networks
Source: Andrej Karpathy
Deep Learning
Recurrent Neural Networks
Deep Learning
Issues
– Exploding gradients – clip / scale gradients (see the clipping sketch below)
– Vanishing gradients
Solutions
– Truncated BPTT
– Restrict sequence length
– Trade-off: these cannot encode very long-term memory
Source: Stanford CS231n
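A minimal PyTorch sketch of the gradient-clipping remedy for exploding gradients; the model, data and loss here are placeholders purely for illustration, not a recommendation model:

import torch
import torch.nn as nn

model = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x = torch.randn(4, 20, 8)                   # batch of 4 sequences of length 20
out, _ = model(x)
loss = out.pow(2).mean()                    # placeholder loss for illustration
loss.backward()

torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=5.0)   # clip before the update
optimizer.step()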
Recurrent Neural Networks
Deep Learning
Long Short-Term Memory (LSTM)
– Replace the simple RNN layer (activation) with an LSTM cell
– Cell has 3 gates - Input (i), Forget (f), Output (o)
– Activation (g)
– Backpropagation depends only on elementwise operations (no
matrix operations over W)
Gated Recurrent Unit (GRU)
– Effectively a simplified version of the LSTM
– 2 gates instead of 3 – the input and forget gates are combined into an update gate; there is no output gate
GRU has fewer parameters, LSTM may be more
expressive
Source: Stanford CS231n; Hochreiter et al.
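A quick PyTorch check of the parameter-count point: a GRU of the same size has roughly three quarters of the parameters of an LSTM, since it has one fewer gate block (the sizes below are arbitrary):

import torch.nn as nn

lstm = nn.LSTM(input_size=64, hidden_size=128)
gru = nn.GRU(input_size=64, hidden_size=128)

def n_params(m):
    return sum(p.numel() for p in m.parameters())

print("LSTM parameters:", n_params(lstm))   # 4 gate/candidate blocks
print("GRU parameters: ", n_params(gru))    # 3 blocks -- roughly 3/4 of the LSTM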
Recurrent Neural Networks
Deep Learning
Variants
– Multi-layer (deep) RNNs
– Bi-directional
– Deep bi-directional
– Attention
Source: Stanford CS231n; Denny Britz
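Most frameworks expose the multi-layer and bi-directional variants as simple constructor options; a small PyTorch sketch (attention is not shown here, and the sizes are arbitrary):

import torch
import torch.nn as nn

rnn = nn.GRU(input_size=32, hidden_size=64, num_layers=2,
             bidirectional=True, batch_first=True)

x = torch.randn(8, 10, 32)                  # batch of 8 sequences of length 10
out, h = rnn(x)
print(out.shape)                            # (8, 10, 128): forward + backward states
print(h.shape)                              # (4, 8, 64): 2 layers x 2 directions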
RNNs for Recommendations
Deep Learning for Recommenders Overview
RNNs for Recommendations
Most approaches have focused on combining
– Performance of collaborative filtering models
(especially matrix factorization)
• Embeddings with an appropriate loss = MF (see the sketch below)
– Power of deep learning for feature extraction
• CNNs for image content, audio, etc.
• Embeddings for categorical features
• Linear models for interactions
• RNNs for text
Source: Spotify / Sander Dieleman
Google Research
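A minimal PyTorch sketch of the "embeddings with an appropriate loss = MF" point referenced above: two embedding tables scored by a dot product and trained with squared error recover classic matrix factorization (class and variable names are illustrative):

import torch
import torch.nn as nn

class DotProductMF(nn.Module):
    def __init__(self, n_users, n_items, dim=32):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, dim)
        self.item_emb = nn.Embedding(n_items, dim)

    def forward(self, users, items):
        # score = dot product of the user and item embeddings
        return (self.user_emb(users) * self.item_emb(items)).sum(dim=1)

model = DotProductMF(n_users=1000, n_items=5000)
users = torch.tensor([0, 1, 2])
items = torch.tensor([10, 20, 30])
ratings = torch.tensor([4.0, 3.0, 5.0])
loss = nn.functional.mse_loss(model(users, items), ratings)   # squared loss -> classic MF
loss.backward()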
Session-based recommendation
RNNs for Recommendations
Apply the advances in sequence modeling
from deep learning
– RNN architectures trained on the sequence of user
events in a session (e.g. products viewed,
purchased) to predict next item in session
– Adjustments for domain
• Item encoding (1-of-N, weighted average)
• Parallel mini-batch processing
• Ranking losses – BPR , TOP1
• Negative item sampling per mini-batch
– Report 20-30% accuracy gain over baselines
Source: Hidasi, Karatzoglou, Baltrunas, Tikk
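A hedged sketch in the spirit of this approach (not the authors' implementation): a GRU over the session's item sequence, scored against output item embeddings, trained with a BPR loss over sampled negatives. All names, sizes and the sampling scheme below are illustrative:

import torch
import torch.nn as nn

class SessionGRU(nn.Module):
    def __init__(self, n_items, dim=64):
        super().__init__()
        self.item_emb = nn.Embedding(n_items, dim)
        self.gru = nn.GRU(dim, dim, batch_first=True)
        self.out_emb = nn.Embedding(n_items, dim)    # output item representations

    def forward(self, sessions):                     # sessions: (batch, seq_len) item ids
        h, _ = self.gru(self.item_emb(sessions))
        return h[:, -1, :]                           # session state after the last event

n_items = 10000
model = SessionGRU(n_items)
sessions = torch.randint(0, n_items, (32, 5))        # toy mini-batch of sessions
pos = torch.randint(0, n_items, (32,))               # true next items (toy)
neg = torch.randint(0, n_items, (32,))               # sampled negative items

s = model(sessions)
pos_score = (s * model.out_emb(pos)).sum(dim=1)
neg_score = (s * model.out_emb(neg)).sum(dim=1)
bpr_loss = -torch.log(torch.sigmoid(pos_score - neg_score)).mean()   # BPR ranking loss
bpr_loss.backward()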
Contextual Session-based models
RNNs for Recommendations
Add contextual data to the RNN architecture
– Context included time, time since last event, event
type
– Combine context data with input / output layer
– Also combine context with the RNN layers
– About 3-6% improvement (in Recall@10 metric)
over simple RNN baseline
– Importantly, the model is even better at predicting sales (vs. view and add-to-cart events) and at predicting new / fresh items (vs. items the user has already seen)
Source: Smirnova, Vasile
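A hedged sketch of one way to combine context with the input layer, as described above: concatenate a context embedding (here, event type) with the item embedding before the RNN. The feature choice and sizes are illustrative only:

import torch
import torch.nn as nn

n_items, n_event_types, dim = 10000, 4, 64
item_emb = nn.Embedding(n_items, dim)
event_emb = nn.Embedding(n_event_types, 8)           # e.g. view, cart, purchase, ...
gru = nn.GRU(dim + 8, dim, batch_first=True)

items = torch.randint(0, n_items, (32, 5))           # item sequence per session
events = torch.randint(0, n_event_types, (32, 5))    # event type at each step

x = torch.cat([item_emb(items), event_emb(events)], dim=-1)
h, _ = gru(x)                                        # context-conditioned session states
print(h.shape)                                       # (32, 5, 64)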
Content and Session-based models
RNNs for Recommendations
Add content data to the RNN architecture
– Parallel RNN (p-RNN)
– Follows trend in combining DL architectures for
content feature extraction with CF models for
interaction data
• CNN for image data
• BOW for text (alternatives are Word2Vec-style models
and RNN language models)
– Some training tricks
• Alternating – keep one subnet fixed, train other
• Residual – subnets trained on residual error
• Interleaved – alternating training per mini-batch
Source: Hidasi, Quadrana, Karatzoglou, Tikk
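A hedged sketch of the parallel-subnet idea (not the paper's exact p-RNN architecture): one RNN over item IDs, one over content features, combined into a session state, with the "alternating" training trick implemented by freezing one subnet. All sizes and the combination scheme are illustrative:

import torch
import torch.nn as nn

n_items, dim, content_dim = 10000, 64, 128
item_emb = nn.Embedding(n_items, dim)
id_gru = nn.GRU(dim, dim, batch_first=True)                 # subnet over item IDs
content_gru = nn.GRU(content_dim, dim, batch_first=True)    # subnet over content features

items = torch.randint(0, n_items, (16, 5))
content = torch.randn(16, 5, content_dim)            # e.g. image / text features per item

h_id, _ = id_gru(item_emb(items))
h_content, _ = content_gru(content)
session_state = h_id[:, -1, :] + h_content[:, -1, :]  # simple combination of the subnets
print(session_state.shape)

# Alternating training trick: keep the content subnet fixed while training the other.
for p in content_gru.parameters():
    p.requires_grad = False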
3D CNNs for Session-based Recommendation
RNNs for Recommendations
As we’ve seen in text / NLP, CNNs can also be
effective in modeling sequences
– 3D convolutional models have been applied in
video classification
– Potentially faster to train, easier to understand
– Use character-level encoding of IDs and item
features (name, description, categories)
• Compact representation
• No embedding layer
– “ResNet” style architecture
– Show improvement over p-RNN
Source: Tuan, Phuong
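A hedged sketch of a character-level encoding of item IDs / text features: each string becomes a fixed-size matrix of one-hot character vectors, so no embedding table is needed (the alphabet and padding length are illustrative):

import numpy as np

alphabet = "abcdefghijklmnopqrstuvwxyz0123456789 "
char_index = {c: i for i, c in enumerate(alphabet)}
max_len = 16

def encode(text):
    mat = np.zeros((max_len, len(alphabet)), dtype=np.float32)
    for pos, ch in enumerate(text.lower()[:max_len]):
        if ch in char_index:
            mat[pos, char_index[ch]] = 1.0           # one-hot character at this position
    return mat

x = encode("item 42 red shoes")
print(x.shape)                                       # (16, 37): compact, no embedding layer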
Challenges
Challenges particular to recommendation
models
– Data size and dimensionality (input & output)
• Sampling
– Extreme sparsity
• Embeddings & compressed representations
– Wide variety of specialized settings
– Combining session, content, context and
preference data
– Model serving is difficult – ranking, large number of
items, computationally expensive
– Metrics – model accuracy and its relation to real-world outcomes and behaviors
– Need for standard, open, large-scale datasets that have time and session data and are content- and context-rich
• RecSys 2015 Challenge – YooChoose dataset
– Evaluation – watch your baselines!
• When Recurrent Neural Networks meet the
Neighborhood for Session-Based Recommendation
Challenges and Future Directions
Future Directions
Challenges and Future Directions
Most recent and future directions in research
& industry
– Improved RNNs
• Cross-session models (e.g. Hierarchical RNN)
• Further research on contextual models, as well as
content and metadata
• Attention models:
– Attentive Neural Architecture for Music
Recommendation
– Neural Attentive Session-based Recommendation
– Combine sequence and historical models (long- and short-term user profiles)
– Domain-specific applications
• Contextualized Location Sequence Recommender
– RecGAN (yes, GANs and RNNs!)
– Applications at scale
• Dimensionality reduction, compressed encodings
Summary
Challenges and Future Directions
DL for recommendation is just getting started
(again)
– Huge increase in interest, research papers. Already
many new models and approaches
– DL approaches have generally yielded incremental
% gains
• But that can translate to significant $$$
• More pronounced in session-based
– Cold start scenarios benefit from multi-modal
nature of DL models and explicit modeling of
sequences
– Flexibility of DL frameworks helps a lot
– Benefits from advances in DL for images, video,
NLP etc.
– Open-source libraries appearing (e.g. Spotlight)
– Check out the DLRS workshops & tutorials @ RecSys 2016 / 2017, and the upcoming one in Oct 2018
– RecSys challenges
Thank you!
codait.org
twitter.com/MLnick
github.com/MLnick
developer.ibm.com
FfDL
Sign up for IBM Cloud and try Watson Studio!
https://ibm.biz/BdYbTY
https://datascience.ibm.com/
MAX
Links & References
Wikipedia: Perceptron
Stanford CS231n Convolutional Neural Networks for Visual
Recognition
Stanford CS231n – RNN Slides
Recurrent Neural Networks Tutorial
The Unreasonable Effectiveness of Recurrent Neural
Networks
Understanding LSTM Networks
Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
Long short-term memory
Attention and Augmented Recurrent Neural Networks
Links & References
Deep Content-based Music Recommendation
Google’s Wide and Deep Learning Model
Deep Learning for Recommender Systems Workshops @
RecSys
Deep Learning for Recommender Systems Tutorial @
RecSys 2017
Session-based Recommendations with Recurrent Neural
Networks
Recurrent Neural Networks with Top-k Gains for Session-based Recommendations
Sequential User-based Recurrent Neural Network
Recommendations
Links & References
Personalizing Session-based Recommendations with
Hierarchical Recurrent Neural Networks
Parallel Recurrent Neural Network Architectures for
Feature-rich Session-based Recommendations
Contextual Sequence Modeling for Recommendation with
Recurrent Neural Networks
When Recurrent Neural Networks meet the Neighborhood
for Session-Based Recommendation
3D Convolutional Networks for Session-based
Recommendation with Content Features
Spotlight: Recommendation models in PyTorch
RecSys 2015 Challenge – YooChoose Dataset