Analytics4Action Evaluation Framework: A Review of Evidence-Based Learning Analytics Interventions at the Open University UK - Part 2 (Examples and Results)
Results so far (2014-2015 academic year)

Pass rates:
● On 7 modules the pass rates increased
● On 4 modules the pass rates decreased
● No comparable data for 7 modules

Z-scores (see the note below):
● On 8 modules the z-scores increased
● On 2 modules the z-scores decreased
● No impact on z-scores for one module
● No comparable data for 7 modules

Satisfaction:
● On 7 modules the overall satisfaction increased
● On 4 modules the overall satisfaction decreased
● No comparable data for 7 modules
While proving causality is always difficult, the increases in pass rates, z-scores and overall satisfaction can serve as a proxy for engaged and well-supported module team chairs who are willing and able to take action.
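For reference: a z-score standardises a module's observed outcome against an expected benchmark, so values above zero indicate performance above expectation. The deck does not spell out the OU's exact benchmarking calculation, so the following is only the generic formulation:

z = (x - μ) / σ

where x is the module's observed value (e.g. its pass rate), μ the expected (benchmark) value, and σ the corresponding standard deviation.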
A224 Inside music – an example
% of students submitting each TMA, by presentation (– = data not available at the time of analysis):

          2012J   2013J   2014J   2015J
TMA01       92      94      94      97
TMA02       87      88      87      91
TMA03       83      83      84      83
TMA04       74      74      73       –
TMA05       73      71      71       –
TMA06       67      64      67       –

Percentage-point drop-off between consecutive TMAs:

          2012J   2013J   2014J   2015J
TMA 1-2      5       6       7       6
TMA 2-3      4       5       3       8
TMA 3-4      9       9      11       –
TMA 4-5      1       3       2       –
TMA 5-6      6       7       4       –
[Chart: % of TMA submissions for TMA01-TMA06, presentations 2012J-2015J]

[Chart: TMA submission % drop-off between consecutive TMAs, 2012J-2015J]
● Reviewing the TMA data revealed a possible issue between TMAs 03 and 04.
● The data suggested a significant drop in students submitting TMA04 (an 11 percentage-point drop-off on the 2014J presentation).
● The data also suggested retention was an issue between TMA03 and TMA05, with 12% formal withdrawal in this period.
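The drop-off table above follows directly from the submission percentages; a minimal sketch of that calculation (illustrative Python, not the actual Analytics4Action tooling):

```python
# Illustrative sketch (not the actual Analytics4Action tooling):
# reproduce the drop-off table from the TMA submission percentages above.
submission = {  # % of students submitting each TMA, per presentation
    "2012J": [92, 87, 83, 74, 73, 67],
    "2013J": [94, 88, 83, 74, 71, 64],
    "2014J": [94, 87, 84, 73, 71, 67],
    "2015J": [97, 91, 83],  # later TMAs not yet available
}

for presentation, rates in submission.items():
    # percentage-point drop between consecutive TMAs
    drops = [prev - curr for prev, curr in zip(rates, rates[1:])]
    worst = max(drops)
    i = drops.index(worst) + 1
    print(f"{presentation}: drops {drops}; largest {worst} pp "
          f"between TMA0{i} and TMA0{i + 1}")
# 2014J shows an 11 pp drop between TMA03 and TMA04 -- the issue the
# module team flagged.
```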
● Investigation of the data showed low engagement with structured content activities 17.22-17.26.
● Proposed Action: rework the content in these activities.
● Data suggests that roughly a quarter of students registered on A224 15J are studying 120 credits.
● Further investigation is required to track their success.
● Proposed Action: better advise these students at registration.
L192 Bon départ: beginners' French

The Problem
● On previous presentations, over 500 students registered at the start.
● Steady attrition meant that the two previous presentations had:
   ● completion rates of 67% and 69%
   ● pass rates of 63% and 65%
● 30% of students had previous educational qualifications (PEQs) lower than A level.
● High concurrency with other modules (30%).
● Based on feedback, the feeling was that students new to OU study weren't adequately prepared for what it is to study 30/60 credits.
The Solution
● The module website opens 3 weeks before the start. In those 3 weeks students were offered induction sessions on:
   1. Materials
   2. Tutorials
   3. Support
   4. A drop-in session for ad-hoc queries
● Rationale: to set the right expectations in the minds of students about what it is to study 30/60 credits.
● Targeted at students new to the OU.
The outcomes
L192 Bon départ: beginners' French

Session attendance            Number        %
Attended session                 133   20.37%
Did not attend session           520   79.63%
Grand Total                      653  100.00%

Individual session attendance         N        %
Session A - Materials                97   14.85%
Session B - Tutorials and OU Live    85   13.02%
Session C - Support                  53    8.12%
Drop-in session                      31    4.75%

Withdrawals before module start:

                               Attended intro session   Did not attend        Total
Categories                          N        %             N        %        N        %
Remained registered at start      131   98.50%           461   88.65%      592   90.66%
Withdrew before start               2    1.50%            59   11.35%       61    9.34%
Grand Total                       133  100.00%           520  100.00%      653  100.00%
Withdrawals as at 03/03/2016:

                 Attended intro session   Did not attend        Total
Categories            N        %             N        %        N        %
Registered          119   89.47%           384   73.85%      503   77.03%
Withdrawn            14   10.53%           136   26.15%      150   22.97%
Grand Total         133  100.00%           520  100.00%      653  100.00%

Assignment submission rates:

                 Attended intro session   Did not attend        Total
Categories            N        %             N        %        N        %
Submitted TMA1      129   98.47%           411   89.15%      540   91.22%
Submitted TMA2      120   91.60%           370   80.26%      490   82.77%
Submitted TMA3      109   83.21%           295   63.99%      404   68.24%
Grand Total         131  100.00%           460  100.00%      592  100.00%
Average assessment scores:

                          Attended intro session   Did not attend   Total
Average score of TMA 1           87.59                 82.15        83.45
Average score of TMA 2           80.25                 73.98        75.51
Average score of TMA 3           84.59                 78.71        80.29
● Overall, there were significant correlations between attendance and subsequent retention and performance.
● Number of modules in concurrent study: the group that did not attend the induction session had registered on more modules (in terms of both modules and credits) than those who did; it may be a lack of time that caused them not to engage with the induction session.
● Proportion of new/continuing students: the group that did not attend the induction session had a significantly higher proportion of continuing students. On the whole, it looks like new students were the ones who benefited most.
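The claim of significant correlations can be illustrated against the tables above; a minimal sketch of one such check (a chi-square test of independence on the withdrawal table, assuming SciPy is available; not necessarily the analysis the team actually ran):

```python
# Illustrative sketch: chi-square test of independence on the
# "Withdrawals as at 03/03/2016" table above (not necessarily the
# test the module team used).
from scipy.stats import chi2_contingency

#              registered  withdrawn
table = [[119, 14],   # attended introductory session
         [384, 136]]  # did not attend

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4g}")
# A small p-value suggests withdrawal status is not independent of
# session attendance; it does not by itself establish causality.
```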
Evaluating the use of the A4A Framework
Technology Acceptance Model (TAM1)
● Explains why a user accepts or rejects a technology.
● Perceived usefulness and perceived ease of use influence intentions to use and actual behaviour.
● Identifies what factors explain future intentions to use the innovation and actual usage behaviour.

[Figure: The Technology Acceptance Model, version 1 (Davis, Bagozzi & Warshaw, 1989)]
Feedback from Data Source Briefing Workshops
Based on the Technology Acceptance Model (TAM1)

Perceived usefulness (PU)
● Using the data tools will improve the delivery of the module.
● Using the data tools will increase my productivity.
● Using the data tools will enhance the effectiveness of the teaching on the module.

Perceived ease-of-use (PEOU)
● Learning to operate the data tools is easy for me.
● I find it easy to get the data tools to do what I want them to do.
● I find the data tools easy to use.

Perceived training requirement
● I expect most staff will need formal training on the data tools.

Satisfaction with the Workshop
● The instructors were enthusiastic in the data briefing.
● The instructors provided clear instructions on what to do.
● Overall, I am satisfied with the workshop.
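Items like these are typically rated on a Likert scale and averaged into construct scores; a minimal sketch of that scoring (the 5-point scale and example ratings are assumptions, since the deck does not state the response format):

```python
# Illustrative sketch: averaging TAM-style items into construct scores
# (assumes 5-point Likert responses; the deck does not state the scale).
from statistics import mean

respondent = {
    "PU":   [4, 5, 4],  # the three perceived-usefulness items above
    "PEOU": [3, 4, 4],  # the three perceived-ease-of-use items above
}

scores = {construct: mean(ratings) for construct, ratings in respondent.items()}
print(scores)  # e.g. {'PU': 4.33..., 'PEOU': 3.67...}
```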
Feedback from Data Support Meetings
Based on the Technology Acceptance Model (TAM1)

Perceived usefulness (PU)
● Using the data tools from the support meeting will enhance the effectiveness of the teaching on the module.
● Using the data tools from the support meeting will improve the delivery of my module.
● Using the data tools from the support meeting will increase my productivity.

Perceived ease-of-use (PEOU)
● I find it easy to get the data tools used in the support meetings to do what I want them to do.
● I find the tools used in the support meeting easy to use.
● Learning to operate the data tools used in the support meeting is easy for me.

Perceived training requirement
● Based upon my experience with the data tools used in the support meeting, I expect that most staff will need formal training to use these tools.

Satisfaction with the Support Meeting
● The facilitators helped me identify an issue, or an action, that could be taken on my module.
● The facilitators provided a clear interpretation of my module's data.
● The facilitators were enthusiastic in the support meeting.
● Overall, I am satisfied with the support meeting.
Are there any questions?
For further details please contact:
● Avinash Boroowa – avinash.boroowa@open.ac.uk | @nashman11178
● Bart Rienties – bart.rienties@open.ac.uk | @DrBartRienties
● Dr Christothea Herodotou – christothea.herodotu@open.ac.uk
References
Agudo-Peregrina, Á. F., Iglesias-Pradas, S., Conde-González, M. Á., & Hernández-García, Á. (2014). Can we predict success from log data in VLEs? Classification of interactions for learning analytics and their relation with performance in VLE-supported F2F and online learning. Computers in Human Behavior, 31(February), 542-550. doi: 10.1016/j.chb.2013.05.031
Anderson, T., Rourke, L., Garrison, D., & Archer, W. (2001). Assessing teaching presence in a computer conferencing context. Journal of Asynchronous Learning
Networks, 5(2), 1-17.
Arbaugh, J. B. (2014). System, scholar, or students? Which most influences online MBA course effectiveness? Journal of Computer Assisted Learning, 30(4), 349-362. doi: 10.1111/jcal.12048
Arnold, K. E., & Pistilli, M. D. (2012). Course signals at Purdue: using learning analytics to increase student success. Paper presented at the Proceedings of the 2nd
International Conference on Learning Analytics and Knowledge.
Ashby, A. (2004). Monitoring student retention in the Open University: definition, measurement, interpretation and action. Open Learning: The Journal of Open,
Distance and e-Learning, 19(1), 65-77. doi: 10.1080/0268051042000177854
Calvert, C. E. (2014). Developing a model and applications for probabilities of student success: a case study of predictive analytics. Open Learning: The Journal of
Open, Distance and e-Learning, 29(2), 160-173. doi: 10.1080/02680513.2014.931805
Cleveland-Innes, M., & Campbell, P. (2012). Emotional presence, learning, and the online learning environment. The International Review of Research in Open and
Distance Learning, 13(4), 269-292.
Clow, D., Cross, S., Ferguson, R., & Rienties, B. (2014). Evidence Hub Review. Milton Keynes: LACE Project.
Collins, A., Joseph, D., & Bielaczyc, K. (2004). Design Research: Theoretical and Methodological Issues. The journal of the learning sciences, 13(1), 15-42.
Conde, M. Á., & Hernández-García, Á. (2015). Learning analytics for educational decision making. Computers in Human Behavior, 47, 1-3. doi: 10.1016/j.chb.2014.12.034
Conole, G. (2012). Designing for Learning in an Open World. Dordrecht: Springer.
Cross, S., Galley, R., Brasher, A., & Weller, M. (2012). Final Project Report of the OULDI-JISC Project: Challenge and Change in Curriculum Design Process,
Communities, Visualisation and Practice. York: JISC.
Ferguson, R. (2012). Learning analytics: drivers, developments and challenges. International Journal of Technology Enhanced Learning, 4(5), 304-317. doi:
10.1504/ijtel.2012.051816
Ferguson, R., & Buckingham Shum, S. (2012). Social learning analytics: five approaches. Paper presented at the Proceedings of the 2nd International Conference on
Learning Analytics and Knowledge, Vancouver, British Columbia, Canada.
Garrison, D., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2), 87-105. doi: 10.1016/S1096-7516(00)00016-6
Garrison, D., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157-172. doi: 10.1016/j.iheduc.2007.04.001
Gasevic, D., Zouaq, A., & Janzen, R. (2013). “Choose your classmates, your GPA is at stake!”: The association of cross-class social ties and academic performance.
American Behavioral Scientist, 57(10), 1460-1479. doi: 10.1177/0002764213479362
González-Torres, A., García-Peñalvo, F. J., & Therón, R. (2013). Human–computer interaction in evolutionary visual software analytics. Computers in Human
Behavior, 29(2), 486-495. doi: 10.1016/j.chb.2012.01.013
Hattie, J. (2009). Visible Learning: A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.
Hess, F. M., & Saxberg, B. (2013). Breakthrough Leadership in the Digital Age: Using Learning Science to Reboot Schooling. Thousand Oaks: Corwin Press.
Inkelaar, T., & Simpson, O. (2015). Challenging the 'distance education deficit' through 'motivational emails'. Open Learning: The Journal of Open, Distance and e-Learning, 1-12. doi: 10.1080/02680513.2015.1055718
Isherwood, M. C. (2009). Developing on-line Learning Aids for M150 Students COLMSCT Final Report. Milton Keynes, UK: The Open University UK.
Jordan, S. (2014). Computer-marked assessment as learning analytics. Paper presented at the CALRG Annual Conference, Milton Keynes, UK.
McMillan, J. H., & Schumacher, S. (2014). Research in education: Evidence-based inquiry. Harlow: Pearson Higher Ed.
Papamitsiou, Z., & Economides, A. (2014). Learning Analytics and Educational Data Mining in Practice: A Systematic Literature Review of Empirical Evidence.
Educational Technology & Society, 17(4), 49–64.
Richardson, J. T. E. (2012). The attainment of White and ethnic minority students in distance education. Assessment & Evaluation in Higher Education, 37(4), 393-408. doi: 10.1080/02602938.2010.534767
Rienties, B., & Alden Rivers, B. (2014). Measuring and Understanding Learner Emotions: Evidence and Prospects LACE review papers (Vol. 1). Milton Keynes: LACE.
Rienties, B., Cross, S., & Zdrahal, Z. (2016). Implementing a Learning Analytics Intervention and Evaluation Framework: what works? In B. Motidyang & R. Butson (Eds.), Big Data and Learning Analytics in Higher Education: Current Theory and Practice. Heidelberg: Springer.
Rienties, B., Giesbers, S., Lygo-Baker, S., Ma, S., & Rees, R. (2014). Why some teachers easily learn to use a new Virtual Learning Environment: a Technology
Acceptance perspective. Interactive Learning Environments. doi: 10.1080/10494820.2014.881394
Rienties, B., Toetenel, L., & Bryan, A. (2015). “Scaling up” learning design: impact of learning design activities on LMS behavior and performance. Paper presented
at the 5th Learning Analytics Knowledge conference, New York.
Rienties, B., & Townsend, D. (2012). Integrating ICT in business education: using TPACK to reflect on two course redesigns. In P. Van den Bossche, W. H. Gijselaers
& R. G. Milter (Eds.), Learning at the Crossroads of Theory and Practice (Vol. 4, pp. 141-156). Dordrecht: Springer.
Siroker, D., & Koomen, P. (2013). A/B Testing: The Most Powerful Way to Turn Clicks Into Customers. New Jersey: John Wiley & Sons.
Slade, S., & Prinsloo, P. (2013). Learning Analytics: Ethical Issues and Dilemmas. American Behavioral Scientist, 57(10), 1510-1529. doi:
10.1177/0002764213479366
Slavin, R. E. (2002). Evidence-Based Education Policies: Transforming Educational Practice and Research. Educational Researcher, 31(7), 15-21. doi:
10.2307/3594400
Slavin, R. E. (2008). Perspectives on evidence-based research in education—What works? Issues in synthesizing educational program evaluations. Educational Researcher, 37(1), 5-14. doi: 10.3102/0013189X08314117
Stenbom, S., Cleveland-Innes, M., & Hrastinski, S. (2014). Online Coaching as a Relationship of Inquiry: Mathematics, online help, and emotional presence. Paper
presented at the Eden 2014, Oxford.
Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning Analytics in a data-rich context.
Computers in Human Behavior, 47, 157-167. doi: 10.1016/j.chb.2014.05.038
Tingle, R., & Cross, S. (2010). How do 'WP' students differ from others in their engagement with e-learning activities. Paper presented at the OU Widening
Participation Conference, Milton Keynes, UK.
https://latestendeavour.files.wordpress.com/2010/06/wp_students_engagment_with_elearning_poster_tinglecross2010.pdf
Tobarra, L., Robles-Gómez, A., Ros, S., Hernández, R., & Caminero, A. C. (2014). Analyzing the students’ behavior and relevant topics in virtual learning communities.
Computers in Human Behavior, 31(0), 659-669. doi: 10.1016/j.chb.2013.10.001
Torgerson, D. J., & Torgerson, C. (2008). Designing randomised trials in health, education and the social sciences: an introduction. London: Palgrave Macmillan.
Whitelock, D., Richardson, J., Field, D., Van Labeke, N., & Pulman, S. (2014). Designing and Testing Visual representations of Draft Essays for Higher Education Students. Paper presented at the Learning Analytics Knowledge conference 2014, Indianapolis.
Wolff, A., Zdrahal, Z., Herrmannova, D., Kuzilek, J., & Hlosta, M. (2014). Developing predictive models for early detection of at-risk students on distance learning modules. Workshop: Machine Learning and Learning Analytics, paper presented at Learning Analytics and Knowledge (2014), Indianapolis.
Wolff, A., Zdrahal, Z., Nikolov, A., & Pantucek, M. (2013). Improving retention: predicting at-risk students by analysing clicking behaviour in a virtual learning
environment. Paper presented at the Proceedings of the Third International Conference on Learning Analytics and Knowledge.
Editor's Notes

Slide 7 (L192): L192 is a compulsory course in the Certificate in French (C33) and an optional course in the BA/BSc (Honours) European Studies (B10), BA (Honours) Humanities (B03), BA (Honours) Modern Language Studies (B30) and the Certificate of Higher Education in Humanities (C98).

Slide 12 (TAM): The Technology Acceptance Model (TAM) is an information systems theory that models how users come to accept and use a technology. The model suggests that when users are presented with a new technology, a number of factors influence their decision about how and when they will use it, notably: Perceived usefulness (PU), defined by Fred Davis as "the degree to which a person believes that using a particular system would enhance his or her job performance", and Perceived ease-of-use (PEOU), defined by Davis as "the degree to which a person believes that using a particular system would be free from effort" (Davis, 1989).