Rachel Bury
Academic Liaison Manager –
Quality, Communication
and Marketing
rachel.bury@edgehill.ac.uk
@rachelriding
Helen Jamieson
Customer Services Manager
helen.jamieson@edgehill.ac.uk
@jamiesonhelena
Evaluating the Impact of Academic Skills
Support @ Edge Hill University
Edge Hill University – University of the Year!
• 13,500 FTE
• University staff – 1,298
• University status – 2006
• Large provider of Teacher Education
and Nursing
Learning Services
• Libraries
• Learning Technology
• SpLD support
• Media & ICT Support
• Academic Skills
• Research Support
Presentation aims:
• Drivers for assessing impact of academic
skills
• Use of AMOSSHE value and impact tool
kit for group sessions
• Use of impact survey for 1-2-1 support
• Introduction of a peer review framework
Drivers:
• Customer Service Excellence holders (10 years) – the ‘so what!?’ factor
• What difference does it make to the student experience? Is it adding value?
• Do we do more of the same? Something different?
• Measuring quality of academic skills support – outcomes versus outputs
• Learning from best practice - time for reflection
My project brief: Evaluate the effectiveness of our
Steps to Success programme
• Supplementary academic skills/information literacy
workshops – academic writing, Harvard Referencing,
literature searching….
• Intensive on staff time / low-to-medium take-up
• Series of recommendations
Using a value and impact toolkit (AMOSSHE, 2011) the
project looked at three strands:
• Evaluating satisfaction (whether customers are
happy/satisfied with the experience)
• Evaluating impact (whether a change has taken
place as a result of an intervention)
• Evaluating value for money (using the 3 E’s –
economy, efficiency and effectiveness)
Evaluating satisfaction – feedback forms after the session
• Small number of questions
• Room for free text
comments
Evaluating impact: has a change taken place?
Changes may be:
• Affective: attitudes, perceptions, levels of confidence
• Behavioural: people doing things differently e.g. doing something more or less often,
asking different types of questions, being more critical or more independent
• Knowledge based: e.g. knowing about key sources of relevant information
• Competence based: people doing things more effectively e.g. improved search
techniques, finding more appropriate information
Impact survey
• Sent 4-6 weeks after the intervention
• Looking for changes in practice – affective, behavioural, knowledge based,
competency based
• Self-reported – surrogate impact indicators – need to triangulate
• Follow up – semi structured interviews
Value for money: Cost/benefit analysis using the 3 E’s model
1. Economy: for example, how much has the programme cost to run per person/per session?
£55.77 per session (where there were one or more attendees) and £15.58 per head
Total cost of delivery of attended sessions – £545.30
Preparation time for all sessions – £344.40
Administration – £729.60
Promotion/publicity – £500.00
Total – £2,119.30
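The economy figures above can be sanity-checked with a short script. Note that the slide does not give the number of sessions or attendees: the counts of 38 sessions and 136 attendees below are back-calculated assumptions from the quoted per-session and per-head rates, not figures from the deck.

```python
# Sketch of the "economy" strand of the 3 E's cost/benefit calculation.
# Cost lines are taken from the slide; session and attendee counts are
# ASSUMPTIONS inferred from the quoted rates (£55.77 and £15.58).
costs = {
    "Delivery of attended sessions": 545.30,
    "Preparation time for all sessions": 344.40,
    "Administration": 729.60,
    "Promotion/publicity": 500.00,
}
total = round(sum(costs.values()), 2)  # matches the slide's £2,119.30

attended_sessions = 38   # assumption: total / £55.77 rounded to whole sessions
attendees = 136          # assumption: total / £15.58 rounded to whole attendees

print(f"Total: £{total:,.2f}")
print(f"Per session: £{total / attended_sessions:.2f}")  # ≈ £55.77
print(f"Per head: £{total / attendees:.2f}")             # ≈ £15.58
```

Keeping the cost lines in a dictionary makes it easy to re-run the calculation when a cost category changes, or to compare economy across different programmes.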
2. Efficiency: for example how many students are seen per session, how can this be
improved?
3. Effectiveness: is the programme delivering the intended outcomes?
Have outcomes been clearly articulated at the outset? Difficult to measure if not…
• So what about one to one support?
• 2013/2014 – 1,343 sessions delivered – Academic Skills Support
• 2013/2014 – 1,950 sessions delivered – specialist SpLD support
• Whole teams of staff deliver sessions – demand increasing
• No formal way of either gathering feedback, assessing quality or measuring impact
• Impact survey devised based on principles of AMOSSHE
• How – feedback likely to be too superficial straight after the session
• Now the boring part – administration!
• 2 rounds of surveys completed in 2014/15 – average 30% return rate
• Feedback has been very valuable
Question 4 Could you tell us one change you have made since accessing support?
‘Clearer writing’
‘I changed the structure of my assignments’
‘My grades went up from 65–72%’
‘Being able to keep up and more confidence in my work’
‘I have been able to have an understanding of how to reflect on my work’
• So what have we done with it? Two key outputs
• More reflection and real evidence!
• Feedback used in team meetings – looking at some of the responses in more
detail
• Used information to develop our offer in terms of skills support
• Bringing the quality and impact framework full circle
• Learning Services Peer Support Framework
• 4 teams involved – staff who deliver one to one & group sessions
• Staff development and support
• Boring bits again – the administration and paperwork
• Peer support scheme – the challenges
• Don’t call it peer observation – staff anxiety and fears
• 10 months on – has everyone engaged?
• Next year?
• Try to improve survey return rates and look at other approaches, such as UX research methods
• Encourage more staff to engage in peer support and sharing in meetings
Thank you for listening
Any questions?

New approaches to evaluating impact


Editor's Notes

  • #2: Helen and Rachel
  • #13: Desire to explore the value and impact of group sessions was also
  • #14: Used the same approach and principles, as we want to measure the impact of one-to-ones in the same way as the group sessions. How: it is not really appropriate to interview straight after the session, so the best approach for now is a survey, although we are aware we are asking a number of closed questions. Confidentiality is important – we made the survey totally anonymous unless students wanted to be entered into a draw, and did not ask for information about their actual appointment. Decided to survey twice a year. Two surveys were devised, as the nature of the continuous support provided by SpLD needed different questions. Decided to email all those who have attended and then send a text reminder.
  • #16: Demonstrate impact!
  • #17: Delivery in whatever format is a key part of staff roles – large numbers of staff are working with Edge Hill staff and students to deliver training and skills. Assessing impact is one strand of measuring impact, but we also wanted to develop quality – excellent sessions delivered by enthusiastic and knowledgeable staff will improve the impact of those sessions, whether group or one-to-one. Peer observation is not a new area per se in higher education. A review of the literature found some examples of library teams developing a scheme, but not many. Used examples from higher education where both support staff and academics were involved. Long tradition of developing Learning Services staff's teaching and learning skills – staff can access the Postgraduate Certificate in Teaching and Learning. Approach – a steering group with representation from each of the teams involved, taking the initial ideas and plans back to teams and presenting at team meetings to improve staff awareness. An event in Sep 2014, developed by an external trainer, on giving and receiving feedback – also asking staff to talk about any concerns. There is some paperwork, but 90% of this is confidential between the two people – the admin team and I are just informed it has taken place.
  • #18: Introducing via teams, with a champion in each team, was a good approach – it allows staff to discuss in their teams and hear what others have to say. Some concerns and anxiety – only natural; this would happen with any group of staff at any institution. We call it peer support because it is support. The staff development event allowed staff to discuss their concerns with the trainer without managers present – I addressed these and gave clarification at their next team meetings, which worked: staff like to be able to at least voice any concerns and get an answer. 70% yes – I will work with those who haven't fully completed. Excellent examples of sharing – staff are encouraged to bring examples of good practice back to their team meetings; it is an item on agendas so people can share. Will now undertake a feedback and review exercise with all the staff involved to seek their views and see if improvements can be made, but on the whole staff have said they have found it really useful. Example – a staff member voicing at a team meeting that by working with someone in another team, not only do you see their style and approach and might pick things up, but you learn so much more about the areas those teams are working in.
  • #19: Happy to share the framework and the related paperwork