KDD 2013 Conference Presentation
Multi-label Relational Neighbor Classification
using Social Context Features
Xi Wang and Gita Sukthankar
Department of EECS
University of Central Florida
Motivation
 The conventional relational classification model focuses on the single-label classification problem.
 Real-world relational datasets contain instances associated with multiple labels.
 Connections between instances in multi-label networks are driven by various causal factors.
Example: a scientific collaboration network with overlapping areas such as Machine Learning, Data Mining, and Artificial Intelligence.
Problem Formulation
 Node classification in multi-relational networks
 Input:
 Network structure (i.e., connectivity information)
 Labels of some actors in the network
 Output:
 Labels of the other actors
Classification in Networked Data
 Homophily: nodes with similar labels are more likely to be connected.
 Markov assumption: the label of a node depends on the labels of its immediate neighbors in the graph.
 Relational models are built based on the labels of neighbors.
 Predictions are made using collective inference.
Contribution
 A new multi-label iterative relational neighbor classifier (SCRN)
 Extracts social context features using edge clustering to represent a node’s potential group memberships
 Use of social features boosts classification performance over benchmarks on several real-world collaborative networked datasets
Relational Neighbor Classifier
 The Relational Neighbor (RN) classifier, proposed by Macskassy et al. (MRDM’03), is a simple relational probabilistic model that makes predictions for a given node based solely on the class labels of its neighbors.
Figure: training graph, followed by predictions after iteration 1 and iteration 2.
Relational Neighbor Classifier
 The weighted-vote relational neighbor classifier (wvRN) estimates the prediction probability as:

P(L_i = c \mid N_i) = \frac{1}{z} \sum_{v_j \in N_i} w(v_i, v_j) \, P(L_j = c \mid N_j)

Here z is the usual normalization factor, and w(v_i, v_j) is the weight of the link between nodes v_i and v_j.
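As a concrete illustration, the wvRN vote above can be sketched in a few lines of Python. This is a minimal sketch: the function name `wvrn_predict` and the dictionary-based graph representation are illustrative choices, not the original implementation.

```python
from collections import defaultdict

def wvrn_predict(node, neighbors, weights, label_prob):
    """wvRN estimate: P(L_i = c | N_i) = (1/z) * sum_j w(i,j) * P(L_j = c | N_j),
    where z is the total weight of the node's incident links."""
    scores = defaultdict(float)
    z = 0.0
    for j in neighbors[node]:
        w = weights[(node, j)]
        z += w
        for c, p in label_prob[j].items():
            scores[c] += w * p          # weighted vote from each neighbor
    return {c: s / z for c, s in scores.items()}
```

For example, a node with two unit-weight neighbors, one certainly "red" and one split 50/50, receives P(red) = 0.75.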
Apply RN in Multi-relational Network
Ground truth legend:
 nodes with both labels (red and green)
 nodes with the green label only
 nodes with the red label only
Edge-Based Social Feature Extraction
 Connections in human networks are mainly affiliation-driven.
 Since each connection can often be regarded as principally resulting from one affiliation, links possess a strong correlation with a single affiliation class.
 Edge class information is not readily available in most social media datasets, but an unsupervised clustering algorithm can be applied to partition the edges into disjoint sets (KDD’09, CIKM’09).
Cluster edges using K-Means
 Scalable edge clustering method proposed by Tang et al. (CIKM’09).
 Each edge is represented in a feature-based format, characterized by its adjacent nodes.
 K-means clustering is used to separate the edges into groups, and the social feature (SF) vector is constructed based on edge cluster IDs.
Figure: original network → Step 1: edge representations → Step 2: construct social features.
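The two steps above can be sketched as follows. This is a minimal illustration with a hand-rolled k-means over endpoint-indicator edge features; the cited method uses a scalable k-means variant, and `edge_social_features` is a hypothetical helper name.

```python
import numpy as np

def edge_social_features(edges, n_nodes, k, n_iter=10, seed=0):
    """Cluster edges (each represented by the indicator vector of its two
    endpoints) with a tiny k-means, then build each node's social feature
    vector as the histogram of cluster IDs over its incident edges."""
    rng = np.random.default_rng(seed)
    # Step 1: edge representations (one row per edge)
    X = np.zeros((len(edges), n_nodes))
    for e, (u, v) in enumerate(edges):
        X[e, u] = X[e, v] = 1.0
    # minimal k-means on the edge features
    centers = X[rng.choice(len(edges), size=k, replace=False)]
    assign = np.zeros(len(edges), dtype=int)
    for _ in range(n_iter):
        d = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
        assign = d.argmin(1)
        for c in range(k):
            members = X[assign == c]
            if len(members):
                centers[c] = members.mean(0)
    # Step 2: social features = counts of incident-edge cluster IDs
    sf = np.zeros((n_nodes, k))
    for e, (u, v) in enumerate(edges):
        sf[u, assign[e]] += 1
        sf[v, assign[e]] += 1
    return sf
```

On two disjoint triangles, each node's social feature vector sums to its degree (2), with the mass concentrated on its triangle's edge cluster.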
Edge-Clustering Visualization
Figure: A subset of DBLP with 95 instances. Edges are clustered into 10
groups, with each shown in a different color.
Proposed Method: SCRN
 The initial set of reference features for class c can be defined as the weighted sum of social feature vectors for nodes known to be in class c:

RV(c) = \frac{1}{|V_c^K|} \sum_{v_i \in V_c^K} P(l_i^c = 1) \cdot SF(v_i)

 Node v_i’s class-propagation probability for class c, conditioned on its social features, is then:

P_{CP}(l_i^c \mid SF(v_i)) = \mathrm{sim}(SF(v_i), RV(c))
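A rough rendering of these two definitions, assuming the Generalized Histogram Intersection kernel (mentioned later in the experiment setting) as the similarity function. The function names are illustrative, not from the original code.

```python
import numpy as np

def class_reference_vector(sf, known_idx, known_prob):
    """RV(c): average of the social feature vectors of nodes known to be
    in class c, weighted by their class-membership probability."""
    return (known_prob[:, None] * sf[known_idx]).sum(0) / len(known_idx)

def class_propagation_prob(sf_i, rv, gamma=1.0):
    """sim(SF(v_i), RV(c)) via the Generalized Histogram Intersection
    kernel: K(x, y) = sum_k min(|x_k|^gamma, |y_k|^gamma)."""
    return np.minimum(np.abs(sf_i) ** gamma, np.abs(rv) ** gamma).sum()
```

With gamma = 1 this reduces to the plain histogram intersection of the node's social features with the class reference vector.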
SCRN
 SCRN estimates the class-membership probability of node v_i belonging to class c using the following equation:

P(l_i^c \mid N_i, SF(v_i)) = \frac{1}{z} \, P_{CP}(l_i^c \mid SF(v_i)) \sum_{v_j \in N_i} w(v_i, v_j) \, P(l_j^c \mid N_j)

where P_{CP}(l_i^c \mid SF(v_i)) is the class-propagation probability, w(v_i, v_j) is the similarity between connected nodes (the link weight), and P(l_j^c \mid N_j) is the class probability of neighbor v_j.
SCRN Overview
Input: {G, V, E, C, L^K}, Max_Iter
Output: L^U for the nodes in V^U
1. Construct the nodes’ social feature space
2. Initialize the class reference vectors for each class
3. Calculate the class-propagation probability for each test node
4. Repeat until the number of iterations exceeds Max_Iter or the predictions converge:
 Estimate each test node’s class probability
 Update the test node’s class probability in collective inference
 Update the class reference vectors
 Re-calculate each node’s class-propagation probability
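One possible rendering of the loop above, as a simplified sketch rather than the authors' implementation: it uses an unnormalized histogram-intersection similarity and plain in-place updates instead of full relaxation labeling, and `scrn_iterate` is a hypothetical name.

```python
import numpy as np

def scrn_iterate(sf, adj, w, prob, known_mask, max_iter=15, tol=1e-4):
    """Sketch of the SCRN loop: (1) rebuild class reference vectors from
    the current class probabilities, (2) refresh class-propagation
    probabilities via histogram intersection, (3) run a wvRN-style
    weighted vote modulated by those probabilities.
    prob: (n_nodes, n_classes) matrix; rows of known nodes stay fixed."""
    n, k = prob.shape
    for _ in range(max_iter):
        old = prob.copy()
        # class reference vectors RV(c), weighted by current probabilities
        rv = prob.T @ sf
        rv /= np.maximum(prob.sum(0)[:, None], 1e-12)
        # class-propagation probability: histogram intersection with RV(c)
        pcp = np.minimum(sf[:, None, :], rv[None]).sum(-1)
        for i in range(n):
            if known_mask[i]:
                continue
            votes = np.zeros(k)
            for j in adj[i]:
                votes += w[(i, j)] * prob[j]   # neighbors' class probabilities
            s = pcp[i] * votes                 # modulate the vote by P_CP
            prob[i] = s / max(s.sum(), 1e-12)  # normalize (the 1/z factor)
        if np.abs(prob - old).max() < tol:
            break
    return prob
```

On a tiny chain with one labeled node of each class at the ends, the two unlabeled middle nodes drift toward the class whose social features and labeled neighbor they share.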
SCRN Visualization
Figure: SCRN on synthetic multi-label network with 1000 nodes and 32 classes
(15 iterations).
Datasets
DBLP
 We construct a weighted collaboration network for authors who have published at least 2 papers during the 2000 to 2010 time frame.
 We selected 15 representative conferences in 6 research areas:
Database: ICDE, VLDB, PODS, EDBT
Data Mining: KDD, ICDM, SDM, PAKDD
Artificial Intelligence: IJCAI, AAAI
Information Retrieval: SIGIR, ECIR
Computer Vision: CVPR
Machine Learning: ICML, ECML
Datasets
IMDb
 We extract movies and TV shows released between 2000 and 2010; those directed by the same director are linked together.
 We only retain movies and TV programs with more than 5 links.
 Each movie can be assigned a subset of the 27 candidate movie genres in the database, such as “Drama”, “Comedy”, “Documentary”, and “Action”.
Datasets
YouTube
 A subset (15,000 nodes) of the original YouTube dataset [1], obtained via snowball sampling.
 Each YouTube user can subscribe to different interest groups and add other users as his/her contacts.
 Class labels are 47 interest groups.
[1] http://www.public.asu.edu/~ltang9/social_dimension.html
Comparative Methods
 Edge (EdgeCluster)
 wvRN
 Prior
 Random
Experiment Setting
 Size of social feature space: 1000 for DBLP and YouTube; 10000 for IMDb
 The class-propagation probability is calculated with the Generalized Histogram Intersection kernel.
 Relaxation labeling is used in the collective inference framework for SCRN and wvRN.
 We assume the number of labels for each testing node is known.
Experiment Setting
 We employ the network cross-validation (NCV) method (KAIS’11) to reduce the overlap between test samples.
 Classification performance is evaluated using Micro-F1, Macro-F1, and Hamming loss.
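The three metrics can be computed directly from binary label-indicator matrices. The following sketch (with an illustrative helper name, `multilabel_metrics`) shows the standard definitions used in multi-label evaluation:

```python
import numpy as np

def multilabel_metrics(y_true, y_pred):
    """Micro-F1, Macro-F1, and Hamming loss for 0/1 indicator matrices
    of shape (n_samples, n_labels)."""
    tp = (y_true & y_pred).sum(0).astype(float)        # per-label true positives
    fp = ((1 - y_true) & y_pred).sum(0).astype(float)  # per-label false positives
    fn = (y_true & (1 - y_pred)).sum(0).astype(float)  # per-label false negatives
    # Micro-F1: pool the counts over all labels, then compute F1 once
    micro = 2 * tp.sum() / max(2 * tp.sum() + fp.sum() + fn.sum(), 1e-12)
    # Macro-F1: compute F1 per label, then average
    per_label = 2 * tp / np.maximum(2 * tp + fp + fn, 1e-12)
    macro = per_label.mean()
    # Hamming loss: fraction of label slots predicted incorrectly
    hamming = (y_true != y_pred).mean()
    return micro, macro, hamming
```

Micro-F1 weights frequent labels more heavily, Macro-F1 treats all labels equally, and Hamming loss (lower is better) penalizes every wrong label slot.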
Results (Micro-F1)
DBLP
Figure: Micro-F1 accuracy (%) vs. training data percentage (5%–30%) for SCRN, Edge, wvRN, Prior, and Random.
Results (Macro-F1)
DBLP
Figure: Macro-F1 accuracy (%) vs. training data percentage (5%–30%) for SCRN, Edge, wvRN, Prior, and Random.
22
Results (Hamming Loss)
Figures: Hamming loss on DBLP, IMDb, and YouTube.
Conclusion
 Links in multi-relational networks are heterogeneous.
 SCRN exploits label homophily while simultaneously leveraging social feature similarity through the introduction of class-propagation probabilities.
 SCRN significantly boosts classification performance on multi-label collaboration networks.
 Our open-source implementation of SCRN is available at:
http://code.google.com/p/multilabel-classification-on-social-network/
Reference
 MACSKASSY, S. A., AND PROVOST, F. A simple relational classifier. In
Proceedings of the Second Workshop on Multi-Relational Data Mining (MRDM) at
KDD, 2003, pp. 64–76.
 TANG, L., AND LIU, H. Relational learning via latent social dimensions. In
Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery
and Data Mining (KDD), 2009, pp. 817–826.
 TANG, L., AND LIU, H. Scalable learning of collective behavior based on sparse social dimensions. In Proceedings of the International Conference on Information and Knowledge Management (CIKM), 2009, pp. 1107–1116.
 NEVILLE, J., GALLAGHER, B., ELIASSI-RAD, T., AND WANG, T. Correcting
evaluation bias of relational classifiers with network cross validation. Knowledge
and Information Systems (KAIS), 2011, pp. 1–25.
Thank you!
Editor's Notes
• Slide 6: This class-propagation probability captures the node’s intrinsic likelihood of belonging to each class, and serves as a prior weight for each class when aggregating the neighbors’ class labels in the collective inference procedure. Our classifier extends the relational neighbor classifier by introducing a node class-propagation probability that modulates the amount of propagation in a class-specific way, based on the node’s similarity to each class.
• Slide 22: NCV produces fair comparisons between different within-network classification approaches.
• Slide 23: The results are averaged over 10 cross-validation folds.