Charting the Design and Analytics Agenda of
Learnersourcing Systems
Dr Hassan Khosravi
The University of Queensland
Brisbane, QLD, Australia
h.khosravi@uq.edu.au
Professor Shazia Sadiq
The University of Queensland
Brisbane, QLD, Australia
shazia@itee.uq.edu.au
Professor Dragan Gasevic
Monash University
Clayton, Vic, Australia
Dragan.Gasevic@monash.edu
A/Prof Gianluca Demartini
The University of Queensland
Brisbane, QLD, Australia
g.demartini@uq.edu.au
Challenges in Teaching Large Classes
• Helping students develop creativity, critical thinking, communication and collaboration skills
• Providing timely and authentic feedback
• Providing learning content and instruction that cater to the diverse academic abilities of learners
Learnersourcing is emerging as a viable and pedagogically justified approach to addressing these challenges.
https://httpessaysahmyookhs.wordpress.com/21st-century-skills/
Overview
• Learnersourcing
• The RiPPLE system
• Data-driven reflections and lessons learned from the design, development and deployment of RiPPLE
• Suggestions for developing learnersourcing systems
Learnersourcing
Learnersourcing: a pedagogically supported form of crowdsourcing that mobilises the learner community as experts-in-training to contribute to teaching and learning activities while being engaged in meaningful learning experiences themselves.
Kim, J. (2015). Learnersourcing: improving learning with collective learner activity. Ph.D. Dissertation. Massachusetts Institute of Technology.
Khosravi, H., Demartini, G., Sadiq, S., & Gasevic, D. (2021). Charting the Design and Analytics Agenda of Learnersourcing Systems. LAK '21 (in press).
Learnersourcing in Action
Learnersourcing Benefits and Challenges
Benefits
• Promotes critical thinking, creativity, collaboration and communication.
• Recognises students as partners in learning.
• Helps students develop evaluative judgement.
• Leads to the development of large repositories of study material.
Challenges
• Assessing the quality of contributions.
• Incentivising students to contribute.
• Enabling instructors to provide oversight with minimal additional work.
Problem, Aim and Contribution
Problem
Best practices for the design and development of learnersourcing systems are unknown.
Aim
Contribute to the development of best practices for the implementation of learnersourcing systems.
Contribution
Share data-driven reflections and lessons learned from the design, implementation and deployment of a learnersourcing system.
https://blog.marketo.com/2014/02/how-to-set-goals-you-can-meet.html
https://www.searchenginejournal.com/question-improve-your-problem-solving/327995/#close
Overview
• Learnersourcing
• The RiPPLE system
• Data-driven reflections and lessons learned from the design, development and deployment of RiPPLE
• Suggestions for developing learnersourcing systems
The RiPPLE System
RiPPLE is an open-access educational platform that aims to provide a personalised, active and social learning experience.
Key Features
• Learnersourcing: partnering with students to develop a repository of high-quality study material.
• Adaptive learning: recommending learning resources to each student based on their learning needs.
• Interactive learning activities: engaging students with interactive in-class or self-paced learning activities.
• Peer learning: helping students find effective study partners that suit their availability and learning needs.
Khosravi et al. in the Journal of Learning Analytics (2019)
For more info visit http://ripplelearning.org/
Adoption and Recognition
RiPPLE is increasingly being recognised as an exemplar tool for using AI in education.
• 15 peer-reviewed articles
• 2 best paper nominations
• Featured as an exemplar educational technology in the EDUCAUSE 2019 Horizon Report
• Academics and researchers from over 50 universities have accessed RiPPLE
After six semesters, RiPPLE has been adopted by courses in Medicine, Pharmacy, Psychology, Education, Business, Computer Science and Biosciences.
• 60 course offerings
• 14,000+ users
• 1,000,000+ resources studied
• 30,000+ study resources created
Learnersourcing in RiPPLE
(1) Students author learning resources. (2) Resources are moderated by peers and tutors. (3) Based on these moderations, a decision is made. (4) RiPPLE flags questionable resources for spot checking.
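As an illustration only, the four-step workflow can be sketched as a simple aggregation rule; the thresholds, function name and flagging rule below are assumptions, not RiPPLE's actual algorithm:

```python
# Illustrative sketch of a learnersourcing moderation pipeline.
# Thresholds and the flagging rule are hypothetical, not RiPPLE's real logic.

def decide(ratings, approve_at=3.5, reject_below=2.5):
    """Aggregate peer/tutor moderation ratings (1-5) into a decision."""
    mean = sum(ratings) / len(ratings)
    if mean >= approve_at:
        decision = "approve"
    elif mean < reject_below:
        decision = "reject"
    else:
        decision = "needs instructor review"
    # Flag the resource for spot checking when moderators strongly disagree.
    spread = max(ratings) - min(ratings)
    return decision, spread >= 3

print(decide([5, 4, 4]))     # moderators agree the resource is good
print(decide([5, 1, 4, 2]))  # high disagreement: flagged for spot checking
```

The two return values mirror steps (3) and (4) of the workflow: a moderation decision, plus a flag that routes contested resources to human spot checking.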
Overview
• Learnersourcing
• The RiPPLE system
• Data-driven reflections and lessons learned from the design, development and deployment of RiPPLE
• Suggestions for developing learnersourcing systems
Data-Driven Reflections and Lessons Learned
1. Quality control of student-generated content
2. Incentivising high-quality contributions
3. Empowering instructors with explainable and actionable insights
Can Students Accurately Evaluate the Quality of Learning Resources?
• Dataset comprises 64k moderations from 2.4k learners and 28 instructors on 15k resources.
• Overall, ratings provided by students strongly correlate with ratings from experts.
• When instructors reject a resource, the chance of students also rejecting it is 16%.
• Identifying low-quality resources based on peer review is challenging.
Abdi et al. in IEEE Transactions on Learning Technologies (2021)
1.1 Developing Accurate and Explainable Consensus Approaches
Arrieta, A et al. (2020). Explainable Artificial Intelligence (XAI):
Concepts, taxonomies, opportunities and challenges toward
responsible AI. Information Fusion, 58, 82-115.
How can we accurately, transparently and fairly evaluate student-generated content?
Results on two courses (INFS and NEUR); values are TPR / TNR / AUC.
Baseline Models (Explainability: Very High)
• Majority Vote — INFS: 0.96 / 0.20 / 0.58; NEUR: 0.99 / 0.10 / 0.54
• Mean — INFS: 0.96 / 0.20 / 0.58; NEUR: 1.00 / 0.14 / 0.57
• Median — INFS: 1.00 / 0.20 / 0.60; NEUR: 1.00 / 0.14 / 0.57
Probabilistic Models
• Expectation Maximisation (High) — INFS: 0.94 / 0.40 / 0.67; NEUR: 1.00 / 0.33 / 0.67
• Trust Propagation (Medium) — INFS: 0.81 / 0.62 / 0.71; NEUR: 0.88 / 0.52 / 0.70
Comment-based Models (Low)
• Sentiment-Alignment — INFS: 0.94 / 0.53 / 0.74; NEUR: 1.00 / 0.29 / 0.64
• Relatedness - BERT — INFS: 0.94 / 0.60 / 0.77; NEUR: 0.98 / 0.24 / 0.61
• Length×Relatedness — INFS: 0.92 / 0.73 / 0.83; NEUR: 0.91 / 0.76 / 0.84
Ensemble Models (Very Low)
• Bagged Decision Tree — INFS: 0.89 / 0.53 / 0.71; NEUR: 0.91 / 0.81 / 0.86
• Random Forest — INFS: 0.91 / 0.60 / 0.75; NEUR: 0.91 / 0.67 / 0.79
• Boosting — INFS: 0.87 / 0.60 / 0.73; NEUR: 0.91 / 0.67 / 0.79
• RPART — INFS: 0.79 / 0.87 / 0.83; NEUR: 0.90 / 0.86 / 0.88
Darvishi et al., to appear in the proceedings of the Learning at Scale conference (2020)
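To make the baseline consensus approaches concrete, here is a minimal sketch of majority vote and a trust-weighted variant over binary accept/reject moderations; the trust scores and function names are illustrative assumptions, not the implementations evaluated in the paper:

```python
# Minimal consensus sketch over binary moderation decisions
# (1 = accept, 0 = reject). Illustrative only.

def majority_vote(votes):
    """Accept a resource if more than half of the moderators accepted it."""
    return 1 if sum(votes) * 2 > len(votes) else 0

def weighted_vote(votes, trust):
    """Weight each moderator's vote by a per-moderator trust score."""
    accept = sum(w for v, w in zip(votes, trust) if v == 1)
    reject = sum(w for v, w in zip(votes, trust) if v == 0)
    return 1 if accept > reject else 0

votes = [1, 1, 0, 0, 0]
trust = [0.9, 0.8, 0.3, 0.2, 0.2]   # hypothetical trust scores
print(majority_vote(votes))          # plain majority rejects
print(weighted_vote(votes, trust))   # trusted moderators carry the decision
```

The gap between the two outcomes illustrates why trust-aware approaches such as Trust Propagation can trade some explainability for better true-negative rates than the simple baselines.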
1.2 Employing Human-in-the-loop and Responsible AI
Based on: https://hai.stanford.edu/news/humans-loop-design-interactive-ai-systems
Full Automation: AI as a Big Red Button
Challenges
• Data and algorithmic bias
• Lack of transparency and trust in decision making
Designing with a Human in the Loop
Benefits
• Significant gains in transparency
• Effective incorporation of human judgment
• Less pressure on building "perfect" algorithms
1.2 Employing Human-in-the-loop in RiPPLE
Challenging the outcome of a moderation
1.3 Employing Appropriate Criteria for Evaluating Resources
Criteria set 1: Alignment, Correctness, Clarity
Criteria set 2: Alignment, Correctness + Clarity, Difficulty, Critical thinking
Gyamfi et al. in Assessment & Evaluation in Higher Education (2021)
Data-Driven Reflections and Lessons Learned
1. Quality control of student-generated content
2. Incentivising high-quality contributions
3. Empowering instructors with explainable and actionable insights
Incentivising High-Quality Contributions
Participation inequality: in most online communities, 90% of users are lurkers who never contribute, 9% of users contribute a little, and 1% of users account for almost all the action.
Many streaming, gaming, and social media platforms are designed with the prime intention of increasing engagement, without considering the quality of that engagement.
So how can learnersourcing systems be designed to encourage a large portion of the student population to make high-quality contributions?
Photo by camilo jimenez on Unsplash
Source: https://www.nngroup.com/articles/participation-inequality/
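The 90-9-1 pattern can be checked on contribution logs by bucketing users by contribution count; the cut-offs and data below are hypothetical, following the conventional thresholds from Nielsen's participation-inequality article:

```python
from collections import Counter

def participation_profile(contributions):
    """Bucket users into lurkers (0), occasional (1-9) and heavy (10+)
    contributors and return each bucket's share of the user base.
    The cut-offs are illustrative, not taken from RiPPLE data."""
    def bucket(n):
        return "lurker" if n == 0 else "occasional" if n < 10 else "heavy"
    counts = Counter(bucket(n) for n in contributions)
    total = len(contributions)
    return {k: round(v / total, 2) for k, v in counts.items()}

# Hypothetical class of 20 students: most never contribute.
data = [0] * 16 + [2, 5, 1] + [40]
print(participation_profile(data))
# {'lurker': 0.8, 'occasional': 0.15, 'heavy': 0.05}
```

Running such a profile per course offering is one way to measure whether incentive mechanisms actually shift students out of the lurker bucket.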
2.1 Modelling Students in Learnersourcing Systems
Failing to account for learnersourcing contributions in student models risks:
1. Discouraging learnersourcing contributions.
2. Missing the opportunity to leverage learnersourcing data in modelling students.
3. Advancing the belief that learnersourcing does not contribute to learning.
Abdi et al. in proceedings of AI in Education (2020 and 2021)
2.1 Open Learner Models for Learnersourcing Systems
Abdi et al. in proceedings of AI in Education (2020 and 2021)
2.2 Tying Learnersourcing to Assessment
Using RiPPLE formatively:
• Engagement follows the 90-9-1 rule
• Low amount of high-quality engagement
Using RiPPLE summatively:
• High amount of low-quality engagement
• Requires supporting assessment logistics
2.3 Employing Gamification Mechanisms
Profile page, Leaderboard, Google Analytics
Data-Driven Reflections and Lessons Learned
1. Quality control of student-generated content
2. Incentivising high-quality contributions
3. Empowering instructors with explainable and actionable insights
3.1 The Analytics Toolbox
3.2 Spot Checking in RiPPLE
• Identifying resources that have passed moderation but are likely to be incorrect or ineffective.
• Uses human-driven metrics (e.g., high disagreement among moderation evaluations) and data-driven metrics (e.g., low discrimination) to recommend resources for inspection.
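One plausible way to combine the human-driven and data-driven signals into a single inspection ranking is a weighted score; the weights, metric definitions and data below are assumptions for illustration, not RiPPLE's actual formula:

```python
import statistics

def spot_check_score(ratings, discrimination, w=0.5):
    """Rank approved resources for inspection: high moderator disagreement
    (std. dev. of ratings) plus low item discrimination. Weights are
    illustrative, not RiPPLE's actual parameters."""
    disagreement = statistics.pstdev(ratings)   # human-driven signal
    low_discrimination = 1.0 - discrimination   # data-driven signal
    return w * disagreement + (1 - w) * low_discrimination

resources = {
    "q1": ([4, 4, 5], 0.8),   # agreed-upon and discriminative: low priority
    "q2": ([5, 1, 4], 0.1),   # contested and non-discriminative: high priority
}
ranked = sorted(resources, key=lambda r: spot_check_score(*resources[r]), reverse=True)
print(ranked)  # ['q2', 'q1']
```

Resources at the top of such a ranking are exactly the ones worth an instructor's limited spot-checking time.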
3.2 Explainable Spot Checking in RiPPLE
Overview
• Learnersourcing
• The RiPPLE system
• Data-driven reflections and lessons learned from the design, development and deployment of RiPPLE
• Suggestions for developing learnersourcing systems
Suggestions for Developing Learnersourcing Systems
• Develop accurate and explainable consensus approaches.
• Enable users to raise concerns against automated decisions.
• Reflect on the criteria used in the evaluation of student-generated content.
• Use open learner models to present student contributions.
• Support the logistics needed to tie learnersourcing to assessment.
• Employ various gamification mechanisms.
• Empower instructors with actionable and explainable insights.
• Conduct rigorous empirical studies to evaluate the platform.
References
1. Abdi, S., Khosravi, H., Sadiq, S., & Demartini, G. (2021). Evaluating the Quality of Learning Resources: A Learnersourcing Approach. IEEE Transactions on Learning Technologies, 14(1), 81-92.
2. Abdi, S., Khosravi, H., & Sadiq, S. (2020, July). Modelling Learners in Crowdsourcing Educational Systems. In International Conference on Artificial Intelligence in Education (pp. 3-9). Springer, Cham.
3. Arrieta, A. et al. (2020). Explainable Artificial Intelligence (XAI): Concepts, taxonomies, opportunities and challenges toward responsible AI. Information Fusion, 58, 82-115.
4. Darvishi, A., Khosravi, H., & Sadiq, S. (2020, September). Utilising Learnersourcing to Inform Design Loop Adaptivity. In European Conference on Technology Enhanced Learning (pp. 332-346). Springer, Cham.
5. Gyamfi, G., Hanna, B. E., & Khosravi, H. (2021). The effects of rubrics on evaluative judgement: a randomised controlled experiment. Assessment & Evaluation in Higher Education, 1-18.
6. Jiang, Y., Schlagwein, D., & Benatallah, B. (2018, June). A Review on Crowdsourcing for Education: State of the Art of Literature and Practice. In PACIS (p. 180).
7. Kim, J. (2015). Learnersourcing: improving learning with collective learner activity (Doctoral dissertation, Massachusetts Institute of Technology).
8. Khosravi, H., Kitto, K., & Williams, J. J. (2019). RiPPLE: A Crowdsourced Adaptive Platform for Recommendation of Learning Activities. Journal of Learning Analytics, 6(3), 91-105.