The challenges of Assessment and Feedback: findings from an HEA project Denise Whitelock [email_address]
Outline e-Assessment Challenge Authentic assessment, e-portfolios Peer assessment MCQs and self-assessment Feedback Advice for Action
Project purpose in conjunction with Southampton University  Consult the academic community on useful references Seminar series Survey Advisors Invited contributors Prioritise evidence-based references Synthesise main points For readers: Academics using technology enhancement for assessment and feedback Learning technologists  Managers of academic departments
The e-Assessment Challenge Constructivist Learning – Push Institutional reliability and accountability – Pull.
www.storiesabout.com www.storiesabout.com/creativepdp [email_address]
Elliott’s characteristics of Assessment 2.0 activities
Authentic: involving real-world knowledge and skills
Personalised: tailored to the knowledge, skills and interests of each student
Negotiated: agreed between the learner and the teacher
Engaging: involving the personal interests of the students
Recognise existing skills: willing to accredit the student’s existing work
Deep: assessing deep knowledge – not memorization
Problem oriented: original tasks requiring genuine problem-solving skills
Collaboratively produced: produced in partnership with fellow students
Peer and self assessed: involving self reflection and peer review
Tool supported: encouraging the use of ICT
Authentic assessments: e-portfolios Electronic NVQ portfolio cover contents page, OCR IT Practitioner, EAIHFE, Robert Wilsdon
Candidate Assessment Records section, OCR IT Practitioner, EAIHFE, Robert Wilsdon
Building e-portfolios on a chef’s course food preparation for e-portfolio, Modern Apprenticeship in Hospitality and Catering,  West Suffolk College, Mike Mulvihill Evidence of food preparation skill for e-portfolio, Modern Apprenticeship in Hospitality and Catering, West Suffolk College, Mike Mulvihill
Sharing e-portfolios: The Netfolio concept Social constructivism Connecting e-portfolios (Barbera, 2009) Share and build upon a joint body of evidence Trialled with 31 PhD students at a virtual university Control group used but Netfolio group obtained higher grades Greater visibility of revision process and peer assessment in the Netfolio system
Peer Assessment and the WebPA Tool Loughborough (Loddington et al, 2009) Self assess and peer assess with given criteria Group mark awarded by tutor Students rated: More timely feedback Reflection Fair rewards for hard work Staff rated: Time savings Administrative gains Automatic calculation Students have faith in the administrative system
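The precise scoring rules belong to the WebPA tool itself (Loddington et al, 2009); purely as an illustration of the general idea of redistributing a tutor-awarded group mark from peer ratings, here is a minimal Python sketch. The names, ratings and scaling are invented, not WebPA's actual algorithm.

```python
def webpa_style_marks(group_mark, ratings):
    """Redistribute a tutor-awarded group mark using peer ratings.

    ratings maps each assessor to the scores they gave every group
    member, e.g. {"Ann": {"Ann": 4, "Ben": 3, "Cal": 1}, ...}.
    This is a simplified, WebPA-style illustration only.
    """
    members = list(ratings)
    weights = {m: 0.0 for m in members}
    # Normalise each assessor's ratings so they sum to 1, then total
    # the normalised scores each member receives.
    for given in ratings.values():
        total = sum(given.values())
        for member, score in given.items():
            weights[member] += score / total
    # Scale so a member rated exactly "average" receives the group mark.
    average = sum(weights.values()) / len(members)
    return {m: round(group_mark * weights[m] / average, 1) for m in members}


ratings = {
    "Ann": {"Ann": 4, "Ben": 3, "Cal": 1},
    "Ben": {"Ann": 4, "Ben": 2, "Cal": 2},
    "Cal": {"Ann": 5, "Ben": 3, "Cal": 2},
}
print(webpa_style_marks(60, ratings))  # {'Ann': 90.0, 'Ben': 55.5, 'Cal': 34.5}
```

In this toy example a member rated above average by peers receives more than the group mark of 60 and a below-average member receives less, which is the behaviour students described as "fair rewards for hard work".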
MCQs: Variation on a theme (1) The question is an example of a COLA assessment used at Reid Kerr College, Paisley. It is a multiple-response question used in one of their modules. The question was developed using Questionmark Perception at the University of Dundee. It is part of a set of formative assessments for medical students.
MCQs: Variation on a theme (2) Example of LAPT Certainty-Based Marking, UK cabinet ministers demo exercise showing feedback, University College London, Tony Gardner-Medwin Drug Chart Errors and Omissions, Medicines Administration Assessment, Chesterfield Royal Hospital
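Certainty-based marking rewards accurate self-assessment: the learner states how certain they are, and a confident wrong answer costs more than a cautious one. The mark table below follows the scheme commonly described for LAPT, but treat it as an assumed illustration rather than the definitive scoring; the code is a sketch, not LAPT's implementation.

```python
# Assumed certainty-based marking table in the style of LAPT: the more
# certain the learner claims to be, the more a correct answer earns and
# the more a wrong answer costs.
CBM_MARKS = {
    1: {"correct": 1, "wrong": 0},   # low certainty
    2: {"correct": 2, "wrong": -2},  # medium certainty
    3: {"correct": 3, "wrong": -6},  # high certainty
}


def cbm_mark(correct: bool, certainty: int) -> int:
    """Mark one answer from its correctness and stated certainty (1-3)."""
    return CBM_MARKS[certainty]["correct" if correct else "wrong"]


print(cbm_mark(True, 3), cbm_mark(False, 3))  # 3 -6: bluffing is expensive
print(cbm_mark(True, 1), cbm_mark(False, 1))  # 1 0: hedging is safe but earns little
```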
Scaffolding and High Stakes assessment Maths for Science Tutorless course Competency led No point in cheating Web home exam Invigilation technologies
Self diagnosis Basic IT skills, first year med students (Sieber, 2009) Competency based testing Repeating tests for revision Enables remedial intervention
Students want more support with assessment More Feedback Quicker Feedback Full Feedback User-friendly Feedback And ... the National Student Survey
Problems with Feedback Ignore feedback Look at the mark only Tells me the correct solution but not what’s wrong with mine Needs decoding Not timely
Gains from Interactivity with Feedback: Formative Assessment Mean effect size on standardised tests between 0.4 and 0.7 (Black & Wiliam, 1998) Particularly effective for students who have not done well at school http://kn.open.ac.uk/document.cfm?docid=10817 Can keep students to timescale and motivate them How can we support our students to become more reflective learners and enter a digital discourse?
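For reference, the effect sizes quoted from Black and Wiliam are standardised mean differences, i.e. the gain expressed in units of the spread of scores; a minimal statement of the usual formula (the symbols are generic, not taken from the original study):

```latex
% Standardised effect size (Cohen's d style), as typically meant by the
% 0.4-0.7 figures attributed to Black and Wiliam (1998):
\[
  d = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{comparison}}}{s_{\text{pooled}}}
\]
% So d between 0.4 and 0.7 means formative assessment shifted average
% test scores by roughly half a standard deviation.
```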
Mobile Technologies and Assessment MCQs, PDAs (Valdivia & Nussbaum, 2009) Polls, instant surveys (Simpson & Oliver, 2007) EVS (Draper, 2009)
Collaborative formative assessment with Global Warming
Global Warming
Global Warming: Simlink Presentation
Next: ‘Yoked’ apps via BuddySpace Student A Student B (‘yoked’, but without full  screen sharing required!)
Free Text Entry and Feedback LISC for languages Open Comment IAT for Science
LISC: Ali Fowler, University of Kent ab-initio Spanish module Large student numbers Skills-based course Provision of sufficient formative assessment meant unmanageable marking loads Impossible to provide immediate feedback, leading to fossilisation of errors
The LISC solution: developed by Ali Fowler A CALL system designed to enable students to: Independently practise sentence translation Receive immediate (and robust) feedback on all errors Attend immediately to the feedback (before fossilisation can occur)
How is the final mark arrived at in the LISC System? The two submissions are unequally weighted Best to give more weight to the first attempt since this ensures that students give careful consideration to the construction of their first answer but can improve their mark by refining the answer The marks ratio can vary (depending on assessment/feedback type): the more information given in the feedback, the lower the weight the second mark should carry
Heuristics for the final mark If the ratio is skewed too far in favour of the first attempt… students are less inclined to try  hard  to correct non-perfect answers If the ratio is skewed too far in favour of the second attempt… students exhibit less care over the construction of their initial answer
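As a minimal sketch of the two-attempt weighting just described (the 70/30 split and the example marks are invented; LISC's actual ratios vary with the amount of feedback given between attempts):

```python
def combined_mark(first, second, first_weight=0.7):
    """Weight two attempts at the same item, favouring the first.

    first, second: marks for the two submissions (here on a 0-1 scale).
    first_weight: the share of the final mark carried by the first
    attempt. The richer the feedback given between attempts, the higher
    this should be, so the assisted second attempt counts for less.
    """
    return first_weight * first + (1 - first_weight) * second


# Example: 60% on the first try, a corrected 95% after feedback.
print(combined_mark(0.60, 0.95))  # 0.705 with a 70/30 split
```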
Open Comment addresses the problem of free text entry Automated formative assessment tool Free text entry for students Automated feedback and guidance Open questions, divergent assessment No marks awarded For use by Arts Faculty
IAT (Jordan & Mitchell, 2009) Marking engine – Web service Authoring tool for marking rules for each question Model answers but free text entry by student Human computer marking comparisons indistinguishable at 1% level for two thirds of questions Problems question writing  Human marking is inconsistent (Conole & Warburton, 2005)
Models of feedback which are open to test How would you instruct a robot to mark as you do?
Stages of analysis by computer of students’ free text entry for Open Comment: advice with respect to content (socio-emotional support, stylised example) STAGE 1a: DETECT ERRORS E.g. incorrect dates, facts. (Incorrect inferences and causality are dealt with below) Instead of concentrating on X, think about Y in order to answer this question Recognise effort (Dweck) and encourage the student to have another go You have done well to start answering this question but perhaps you misunderstood it. Instead of thinking about X, which did not… consider Y
Computer analysis continued STAGE 2a:  REVEAL FIRST OMISSION Consider the role of Z in your answer Praise what is correct and point out what is missing Good but now consider the role X plays in your answer STAGE 2b:  REVEAL SECOND OMISSION Consider the role of P in your answer Praise what is correct and point out what is missing Yes but also consider P. Would it have produced the same result if P is neglected?
Final stages of analysis STAGE 3: REQUEST CLARIFICATION OF KEY POINT 1 STAGE 4: REQUEST FURTHER ANALYSIS OF KEY POINT 1 (Stages 3 and 4 repeated with all the key points) STAGE 5: REQUEST THE INFERENCE FROM THE ANALYSIS OF KEY POINT 1 IF IT IS MISSING STAGE 6: REQUEST THE INFERENCE FROM THE ANALYSIS OF KEY POINT 1 IF IT IS NOT COMPLETE STAGE 7: CHECK THE CAUSALITY STAGE 8: REQUEST THAT ALL THE CAUSAL FACTORS ARE WEIGHTED
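To make the staging above concrete, here is a hedged sketch of how such a pipeline might be sequenced: each stage inspects the answer and either produces a piece of advice or passes control to the next stage. The stage functions, data structures and the history example are invented placeholders, not Open Comment's implementation.

```python
def stage_detect_errors(answer, model):
    """Stage 1: flag factually incorrect statements (dates, facts)."""
    for wrong, hint in model["errors"].items():
        if wrong in answer:
            return ("You have done well to start answering this question, "
                    f"but instead of concentrating on {wrong}, think about {hint}.")
    return None


def stage_reveal_omissions(answer, model):
    """Stages 2a/2b: praise what is there and reveal one missing key point."""
    for point in model["key_points"]:
        if point not in answer:
            return f"Good so far, but now consider the role {point} plays in your answer."
    return None


def open_comment_style_feedback(answer, model):
    """Run the stages in order and return the first piece of advice produced."""
    answer = answer.lower()
    for stage in (stage_detect_errors, stage_reveal_omissions):
        advice = stage(answer, model)
        if advice:
            return advice
    return "Well done. Now check that every causal factor in your answer is weighted."


# Invented question model: the student should discuss the 1832 Reform Act.
model = {"errors": {"1815": "the events of 1832"}, "key_points": ["reform act", "franchise"]}
print(open_comment_style_feedback("The turning point was 1815.", model))
```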
McFeSPA system Supports teaching assistants to mark and give feedback on undergraduate computer programming assignments Support tool for semi-automated marking and scaffolding of feedback Findings suggest the feedback model would be helpful in training tutors, similar to the Open Comment findings
Feedback: Advice for Action Students must decode feedback and then act on it (Boud, 2000) Students must have the opportunity to act on feedback (Sadler, 1989) Gauging efficacy through student action Deep and strategic study approaches are more effective in processing e-feedback (Strang, 2010)
Audio Feedback (Middleton & Nortcliffe, 2010) Timely and meaningful Manageable for tutors to produce and the learner to use Clear in purpose, adequately introduced and pedagogically embedded Technically reliable and not adversely determined by technical constraints or difficulties Targeted at specific students, groups or cohorts, addressing their needs with relevant points in a structured way Produced within the context of local assessment strategies and in combination, if appropriate, with other feedback methods using each medium to good effect Brief, engaging and clearly presented, with emphasis on key points that demand a specified response from the learner Of adequate technical quality to avoid technical interference in the listener’s experience Encouraging, promoting self esteem Formative, challenging and motivational
Elliott’s characteristics of Assessment 2.0 activities: Advice for Action
Authentic: involving real-world knowledge and skills
Personalised: tailored to the knowledge, skills and interests of each student
Negotiated: agreed between the learner and the teacher
Engaging: involving the personal interests of the students
Recognise existing skills: willing to accredit the student’s existing work
Deep: assessing deep knowledge – not memorization
Problem oriented: original tasks requiring genuine problem-solving skills
Collaboratively produced: produced in partnership with fellow students
Peer and self assessed: involving self reflection and peer review
Tool supported: encouraging the use of ICT
Creating teaching and learning dialogues: towards guided learning supported by technology Learning to judge Providing reassurance Providing  a variety of signposted routes to achieve learning goals
Key Messages Effective regular, online testing can encourage student learning and improve their performance in tests (JISC, 2008) Automated marking can be more reliable than human markers and there is no medium effect between paper and computerized exams (Lee and Weerakoon, 2001) The success of assessment and feedback with technology enhancement lies with the pedagogy rather than the technology itself; technology is an enabler (Draper, 2009)
Key Messages 2 Technology-enhanced assessment is not restricted to simple questions with clear-cut right and wrong answers; much more sophisticated questions are also being used (Whitelock & Watt, 2008) The design of appropriate and constructive feedback plays a vital role in the success of assessment, especially assessment for learning (Beaumont, O’Doherty & Shannon, 2008)
Key Messages 3 Staff development is essential to the process (Warburton, 2009) Prepare students to take assessments that use technology enhancement by practising with similar levels of assessment using the same equipment and methods (Shephard et al, 2006) The reports generated by many technology-enhanced assessment systems, including the commercial ones, are very helpful in checking the reliability and validity of each test item and of the test as a whole (McKenna and Bull, 2000)
References
Beaumont, C., O’Doherty, M., and Shannon, L. (2008). Staff and student perceptions of feedback quality in the context of widening participation. Higher Education Academy. Retrieved May 2010 from: http://www.heacademy.ac.uk/assets/York/documents/ourwork/research/Beaumont_Final_Report.pdf
Draper, S. (2009). Catalytic assessment: understanding how MCQs and EVS can foster deep learning. British Journal of Educational Technology, 40(2), 285-293.
JISC, HE Academy, and ALT (2008). Exploring Tangible Benefits of e-Learning. Retrieved May 2010 from http://www.jiscinfonet.ac.uk/publications/info/tangible-benefits-publication
Lee, G. and Weerakoon, P. (2001). The role of computer-aided assessment in health professional education: a comparison of student performance in computer-based and paper-and-pen multiple-choice tests. Medical Teacher, 23(2), 152-157.
McKenna, C. and Bull, J. (2000). Quality assurance of computer-assisted assessment: practical and strategic issues. Quality Assurance in Education, 8(1), 24-31.
References 2
Middleton, A. and Nortcliffe, A. (2010). Audio feedback design: principles and emerging practice. In D. Whitelock and P. Brna (eds), Special Issue ‘Focusing on electronic feedback: feasible progress or just unfulfilled promises?’, Int. J. Continuing Engineering Education and Life-Long Learning, 20(2), 208-223.
Shephard, K., Warburton, B., Maier, P. and Warren, A. (2006). Development and evaluation of computer-assisted assessment in higher education in relation to BS7988. Assessment & Evaluation in Higher Education, 31(5), 583-595.
Strang, K.D. (2010). Measuring self regulated e-feedback, study approach and academic outcome of multicultural university students. In D. Whitelock and P. Brna (eds), Special Issue ‘Focusing on electronic feedback: feasible progress or just unfulfilled promises?’, Int. J. Continuing Engineering Education and Life-Long Learning, 20(2), 239-255.
Warburton, B. (2009). Quick win or slow burn: modelling UK HE CAA uptake. Assessment & Evaluation in Higher Education, 34(3), 257-272.
Whitelock, D. and Watt, S. (2008). Reframing e-assessment: adopting new media and adapting old frameworks. Learning, Media and Technology, 33(3), 153-156.
Three Assessment Special Issues
Brna, P. & Whitelock, D. (Eds.) (2010). Special Issue of the International Journal of Continuing Engineering Education and Life-Long Learning, ‘Focusing on electronic feedback: feasible progress or just unfulfilled promises?’, 20(2).
Whitelock, D. (Ed.) (2009). Special Issue of the British Journal of Educational Technology on e-Assessment: developing new dialogues for the digital age, 40(2).
Whitelock, D. and Watt, S. (Eds.) (2008). Reframing e-assessment: adopting new media and adapting old frameworks. Learning, Media and Technology, 33(3).