Learning Analytics @ The Open University
JISC Networking Event 11th May 2016
Kevin Mayles, Head of Analytics, The Open University
kevin.mayles@open.ac.uk | @kevinmayles
Where are you from?
● PVC Learning & Teaching
● CIO / IT
● Planning Office
● Student Support
● Faculty
Learning and Teaching Centre, Institute of Educational Technology, Faculties and Schools, Learning and Teaching Solutions, Academic Professional Services, Information Technology, Strategy and Information Office, Academic Services, Marketing, Student Registration and Fees, Business Performance Improvement, Library Services
Image © Transport for London
OU Context (2014/15)
● 174k students
● The average age of our new undergraduate students is 29
● 40% of new undergraduates have one A-level or lower on entry
● Over 21,000 OU students have disabilities
● 868k assessments submitted; 395k phone calls and 176k emails received from students
A clear vision statement was developed to galvanise effort across the
institution on the focused use of analytics
Analytics for student success vision
Vision
To use and apply information strategically (through specified indicators) to retain
students and progress them to complete their study goals
Mission
This needs to be achieved at:
● a macro level to aggregate information about the student learning experience at an
institutional level to inform strategic priorities that will improve student retention and
progression
● a micro level to use analytics to drive short, medium and long-term interventions
Vision in action
The OU recognises that three equally important strengths are required
for the effective deployment of analytics
Analytics enhancement strategy
Adapted from Barton and Court (2012)
Analytics enhancement strategy
● Early alert indicators using predictive analytics
● Policy on the ethical use of student data for learning analytics
● Analytics for action evaluation framework
● Impact of learning design on outcomes
Analytics enhancement strategy: early alert indicators using predictive analytics
Development of early alert indicators
Application of a student number forecasting model to trigger
interventions with vulnerable students
Calvert (2014)
Open University: data + analysis
Statistical modelling: a logistic regression is trained on factors from the 2015 cohort (the ‘training’ dataset) and applied to the same factors for the 2016 cohort, producing an output dataset of predictions.
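The slide presents this as a diagram; as a rough illustration of the same workflow, here is a minimal Python sketch. The factor names, target column and DataFrames are hypothetical placeholders, not the OU's actual variables.

```python
# Minimal sketch of the cohort-to-cohort modelling pipeline described above.
# Factor names, the target column and the DataFrames are hypothetical
# placeholders, not the OU's actual schema.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline, make_pipeline
from sklearn.preprocessing import StandardScaler

FACTORS = ["prev_credits_passed", "prev_withdrawals",
           "credits_being_studied", "latest_assignment_score"]

def train_on_previous_cohort(train_df: pd.DataFrame) -> Pipeline:
    """Fit a logistic regression on the previous cohort (the 'training' dataset)."""
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(train_df[FACTORS], train_df["completed_and_passed"])
    return model

def score_new_cohort(model: Pipeline, new_df: pd.DataFrame) -> pd.DataFrame:
    """Apply the fitted model to the new cohort to produce the output dataset."""
    out = new_df[["student_id"]].copy()
    out["p_success"] = model.predict_proba(new_df[FACTORS])[:, 1]
    return out

# Usage with hypothetical cohort tables:
# model = train_on_previous_cohort(cohort_2015)
# predictions_2016 = score_new_cohort(model, cohort_2016)
```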
Development of early alert indicators
The 30 variables identified as associated with success vary in their importance at each milestone. They fall into five groups:
● Student (demographic)
● Student: previous study and motivation
● Student: progress in previous OU study
● Student: module
● Qualification / module of study
Calvert (2014)
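One way to make "importance varies at each milestone" concrete is to fit a separate model at each milestone using only the data available by that point, then compare standardised coefficients. The sketch below is illustrative only, not Calvert's published specification; the milestone names and feature lists are invented placeholders.

```python
# Illustrative only: one logistic regression per milestone, comparing the
# magnitude of standardised coefficients as a rough proxy for importance.
# Milestones and features are placeholders, not the 30 variables from
# Calvert (2014).
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

MILESTONE_FEATURES = {
    "registration":     ["prev_credits_passed", "credits_being_studied"],
    "module_start":     ["prev_credits_passed", "credits_being_studied", "early_vle_visits"],
    "first_assignment": ["prev_credits_passed", "early_vle_visits", "tma01_score"],
}

def milestone_coefficients(df: pd.DataFrame, target: str = "success") -> pd.DataFrame:
    """Return standardised coefficients for each milestone model."""
    rows = []
    for milestone, features in MILESTONE_FEATURES.items():
        model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
        model.fit(df[features], df[target])
        coefs = model.named_steps["logisticregression"].coef_[0]
        rows += [{"milestone": milestone, "feature": f, "coef": round(c, 3)}
                 for f, c in zip(features, coefs)]
    return pd.DataFrame(rows)
```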
Current indicators
Module probabilities: integrated into the Student Support Intervention Tool; predicts the probability of a student completing and passing the module.
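These probabilities could feed a simple flag in a support workflow; the sketch below is illustrative, with a placeholder threshold rather than the OU's actual rule.

```python
# Illustrative only: turn predicted pass probabilities into an intervention
# flag for support staff. The 0.4 threshold is a placeholder.
import pandas as pd

def flag_for_intervention(predictions: pd.DataFrame, threshold: float = 0.4) -> pd.DataFrame:
    """Mark students whose predicted probability of completing and passing
    the module falls below the threshold, lowest probabilities first."""
    flagged = predictions.copy()
    flagged["intervene"] = flagged["p_success"] < threshold
    return flagged.sort_values("p_success")
```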
OU Analyse
[Figure: weekly student engagement with learning activities over time, contrasting students who pass, fail, or do not submit]
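OU Analyse itself combines several algorithms over demographic and weekly VLE data (Kuzilek et al., 2015); the sketch below is a much-simplified, single-model illustration of the underlying idea, with invented column names.

```python
# Simplified sketch of the idea behind OU Analyse: summarise each student's
# weekly VLE activity and learn which patterns precede non-submission.
# OU Analyse combines several algorithms (Kuzilek et al., 2015); this
# single-model version and the column names are illustrative only.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def weekly_activity_features(clicks: pd.DataFrame, up_to_week: int) -> pd.DataFrame:
    """Pivot a click log (student_id, week, activity_type, n_clicks) into one
    row per student with one column per (week, activity_type) pair."""
    window = clicks[clicks["week"] <= up_to_week]
    features = window.pivot_table(index="student_id",
                                  columns=["week", "activity_type"],
                                  values="n_clicks",
                                  aggfunc="sum",
                                  fill_value=0)
    features.columns = [f"w{w}_{a}" for w, a in features.columns]
    return features

def fit_submission_model(features: pd.DataFrame, submitted: pd.Series) -> RandomForestClassifier:
    """Train a classifier to predict whether the next assignment is submitted."""
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(features, submitted.loc[features.index])
    return model
```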
OU Analyse
[Figure: module ‘fingerprint’ of engagement over time, up to Assessment 1]
Current indicators
OU Analyse: predicts the submission of the next assignment, updated weekly; deployed through the OU Analyse Dashboard.
Outcomes of current pilots
Summary of the interim evaluation of piloting as at March 2016
● The quantitative analysis gives a mixed picture of the impact on withdrawal rates and assignment submissions in the pilot tutor groups (note that tutors were self-selected and expectations to intervene were not consistent across the piloting modules)
● It is a useful tool for understanding students and their participation
● Predictions generally agree with tutors' experience and intuitions about which students might be at risk
● A potential USP of OU Analyse is the information it provides between assignment submissions on students' engagement with learning materials
● Overall, all tutors interviewed were positive about the affordances of OUA and are keen to use it again, for a range of reasons, in their next module
Case studies and vignettes
“I love it it’s brilliant. It brings together things I already do
[…] it’s an easy way to find information without researching
around such as in the forums and look for students to see
what they do when I have no contact with them […] if they
do not answer emails or phones there is not much I can do.
OUA tells me whether they are engaged and gives me an
early indicator rather than waiting for the day they submit”
Analytics enhancement strategy: policy on the ethical use of student data for learning analytics
http://www.open.ac.uk/students/charter/essential-documents/ethical-use-student-data-learning-analytics-policy
Information for students
Analytics enhancement strategy: analytics for action evaluation framework
Scaffolding action
Analytics for Action Evaluation Framework and Toolkit
Real-time progression reports
Supporting module teams
● Module teams work with support staff to identify actions that can be taken for current and future presentations
● The Analytics project has developed a ‘costed’ menu of response actions that can be taken ‘in-presentation’ or during the next presentation (a sketch of such a menu follows below)
● Budgetary considerations
● Resource considerations
Removing ‘Blockers’
[Diagram: response actions enabled by LTS, the module team, SST, ALs and the Library, with supporting evaluation methods]
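To illustrate what a ‘costed’ menu of response actions might look like in code, here is a sketch with invented entries, costs and owners; it is not the Analytics project's actual menu.

```python
# Illustrative sketch of a 'costed' menu of response actions; the entries,
# costs and owners are invented placeholders, not the OU's actual menu.
from dataclasses import dataclass

@dataclass
class ResponseAction:
    name: str
    timing: str          # "in-presentation" or "next presentation"
    owner: str           # e.g. module team, SST, AL, library
    est_cost_gbp: int    # rough budgetary estimate
    est_staff_days: float

MENU = [
    ResponseAction("Targeted tutor phone call to flagged students",
                   "in-presentation", "AL", 0, 0.5),
    ResponseAction("Extra online drop-in session before the next assignment",
                   "in-presentation", "module team", 250, 1.0),
    ResponseAction("Rework an activity identified as a blocker",
                   "next presentation", "module team", 2000, 5.0),
]

# Filter to actions a module team could afford to take right now.
affordable_now = [a for a in MENU
                  if a.timing == "in-presentation" and a.est_cost_gbp <= 500]
```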
Analytics enhancement strategy: impact of learning design on outcomes
Learning design link to success
Analysis of learning designs across 150+ modules identifies four broad types: constructivist, assessment-driven, balanced-variety, and socio-constructivist (Rienties and Toetenel, 2016).
Across the same 150+ modules, these four learning design types, together with the amount of communication activity, are related to weekly VLE engagement (from weeks 1 and 2 through week 30+), and in turn to student satisfaction and student retention (Rienties and Toetenel, 2016).
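Rienties and Toetenel (2016) derive the design types from a cluster analysis of how module study time is mapped to activity types; the sketch below illustrates that kind of grouping. The activity columns, scaling and choice of k are simplifying assumptions, not their exact method.

```python
# Illustrative sketch of clustering modules by learning design, i.e. the
# share of study time mapped to each activity type. Column names, scaling
# and k=4 are assumptions, not the published method.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

ACTIVITY_TYPES = ["assimilative", "finding_info", "communication",
                  "productive", "experiential", "interactive", "assessment"]

def cluster_learning_designs(designs: pd.DataFrame, k: int = 4) -> pd.Series:
    """Group modules (one row per module, one column per activity-type share)
    into k learning-design clusters."""
    scaled = StandardScaler().fit_transform(designs[ACTIVITY_TYPES])
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(scaled)
    return pd.Series(labels, index=designs.index, name="design_cluster")

# Cluster membership can then be joined to weekly VLE engagement, satisfaction
# and retention data to explore the relationships summarised above.
```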
Analytics enhancement strategy (recap): early alert indicators using predictive analytics; policy on the ethical use of student data for learning analytics; analytics for action evaluation framework; impact of learning design on outcomes
Are there any questions?
For further details please contact:
● Kevin Mayles – kevin.mayles@open.ac.uk
● @kevinmayles
● Slideshare: http://www.slideshare.net/KevinMayles
● OU Analyse: https://analyse.kmi.open.ac.uk/
References:
BARTON, D. and COURT, D., 2012. Making advanced analytics work for you. Harvard Business Review, 90(10), pp. 78–83.
CALVERT, C.E., 2014. Developing a model and applications for probabilities of student success: a case study of predictive analytics. Open Learning: The Journal of Open, Distance and e-Learning.
KUZILEK, J., HLOSTA, M., HERRMANNOVA, D., ZDRAHAL, Z. and WOLFF, A., 2015. OU Analyse: analysing at-risk students at The Open University. Learning Analytics Review, no. LAK15-1, March 2015. ISSN 2057-7494.
RIENTIES, B. and TOETENEL, L., 2016. The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, pp. 333–341.
SCHÖN, D.A., 1987. Educating the reflective practitioner: toward a new design for teaching and learning in the professions. San Francisco, CA: Jossey-Bass.
Editor's Notes
  • #3: Belinda: This encapsulates our strategy, which is moving forward on all fronts. Kevin will now demonstrate an operational tool available at scale and one of our latest experimental prototypes.
  • #6: Belinda Analytics is at the heart of the university’s strategic priority to deliver an outstanding student experience. We’ve developed this vision that drives our development of the use of analytics for both short term action and long term strategic decision making.
  • #8: Belinda Our strategy is based around the 3 key underpinning strengths we need to develop as an institution. Each equally important.
  • #12: Needing to improve our longer-term forecasting of student numbers, we are now able to predict a student's chances of success by mapping their situation and characteristics to 30 of the most significant factors associated with student success. These include student characteristics such as previous educational attainment, occupational status, previous study or reasons for study, the number of credits already achieved, the total number of previous passes or withdrawals, the number of credits being studied in the year, latest assignment scores, and what modules or qualification they are studying. We then map this data to historical records: essentially we model one cohort of students based on the activities and outcomes of previous cohorts, which enables us to correlate those student characteristics with likely outcomes. This aggregated data is very accurate, giving us roughly a 2% tolerance, and can be used to predict student progression and therefore income. As well as being applied to a cohort, these predictions can also be used to trigger interventions with individual students who may have a low probability of success.
  • #15: Mention Zdenek Zdrahal and how session… In the second approach we have developed a tool called OU Analyse, which enables us to undertake a week-by-week analysis of student engagement with their learning activities. From this data we can analyse previous student cohorts and correlate learning activities against outcomes, which lets us identify, on a module-by-module basis, which activities carry the greatest value in terms of student success. To note: the predictive model assesses a student using four different algorithms; each algorithm takes into account different factors and the historic data we hold about previously successful students. The model gives a student a vote for each algorithm that shows the student is at risk, so the more votes a student has, the higher the likelihood they are at risk. On this slide, the diagram represents one of the algorithms, which takes into account the learning activities represented by the circles. Blue circles are activities that combine in different ways, such as forum use, interaction with or downloading of resources, or online content. The pink circle shows a week of non-engagement with the VLE. Here we see a stylised pattern of weekly activity of a student who successfully achieved a pass for this module; contrast this with the activity of a student who didn't submit, whose pattern is very different and includes periods of not engaging with the VLE. By modelling the pattern of behaviour of students who pass, fail or don't submit, we are able to predict what successful and unsuccessful VLE engagement looks like and obtain a 'fingerprint' for each module. Armed with this knowledge we can monitor activity within the current cohort and quickly identify individual students who are not engaging with those high-value activities; these students are flagged as 'at risk' and can be offered targeted interventions. The model takes demographic data plus presentation-related data (aggregated VLE data available weekly); for each module it identifies important VLE actions, e.g. accessing a forum, accessing or downloading a resource, or accessing OU content, with learning activities combining these actions. The aim is to identify students at risk of failing the module as early as possible so that OU intervention is efficient and meaningful. Finally, we have also applied analytics to exploring the impact of learning design on learning performance.
  • #16: Based on demographic data; based on VLE activities.
  • #33: Cluster analysis of 40 modules (>19k students) indicates that module teams design four different types of modules: constructivist, assessment-driven, balanced, or socio-constructivist. The LAK paper by Rienties and colleagues indicates that learning design, and learning design activities in particular, strongly influences how students engage in our LMS. VLE engagement is higher in modules with socio-constructivist or balanced-variety learning designs, and lower for constructivist designs. In terms of learning outcomes, learning design seems to have an impact on learning performance: students rate constructivist modules higher and socio-constructivist modules lower, yet in terms of student retention (% of students passed) constructivist modules have lower retention while socio-constructivist modules have higher. In particular, modules with a heavy reliance on content and cognition (assimilative activities) seemed to lead to lower completion and pass rates. Thus, learning design strongly influences behaviour, experience and performance (and we believe we are the first to have mapped this with such a large cohort). In the diagram, red lines represent a negative effect, green lines a positive effect, and blue lines no significant effect. Eventually, with the availability of yet more data, we may be better able to understand the complex relationship between learning design and learning processes and outcomes, with the resultant sharing of best practice evidenced by such analyses; if this could then be mapped to student characteristics, there is the potential to personalise learning designs. (Learning performance was calculated as the number of learners who completed and passed the module relative to the number who registered for each module.)