A rough, ready and rapid guide to TESTA
@solentlearning
@tansyjtweets
Tansy Jessop
Unravelling A&F Symposium
University of Surrey
18 June 2018
Workshop plan
• Brief explanation of TESTA
• Why a programme approach?
• Mock audit
• AEQ data
• Focus groups
• Changing the assessment narrative
Research and change process
[Diagram: the TESTA cycle – Programme Audit → Assessment Experience Questionnaire (AEQ) → Student Focus Groups → Programme Team Meeting]
Why take a programme approach?
1. A modular problem
2. A curriculum problem
3. An alienation problem
4. An engagement solution
The modular degree
IKEA 101: great for flat-pack furniture, but…
Curriculum privileges knowing stuff
A state of alienation?
Image, "Alienation Nightmare" © 1996 by Sabu
Motorways to alienation
• M1: Modules
• M2: Markets
• M3: Metrics
• M4: Mass higher education
TESTA improves students’ perceptions of A&F…
[Chart: average NSS scores (y-axis 60–95%) on Q5–Q9 and overall satisfaction, comparing 32 programmes in 13 universities with sector scores: NSS 2015 scores vs TESTA scores]
…and improves the staff experience
• More engaging formative work
• Less measuring
• Students learning more
• Curriculum less stuffed
Activity One: mock audit
[Diagram: the TESTA cycle – Programme Audit → Assessment Experience Questionnaire (AEQ) → Student Focus Groups → Programme Team Meeting]
The Audit: Caveats
1. Audit is not everything
2. Official discourse
3. Planned curriculum
4. Some better data, some weaker, some gaps
Summary of audit data
• Some context
• Number of summative tasks
• Number of formative tasks
• Varieties of assessment
• Proportion of exams
• Volume of written feedback
• Speed of return of feedback
TESTA definitions
Summative: graded assessment which counts towards the degree.
Formative: a required, ungraded task with feedback; it does not count towards the degree.
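As a toy illustration of these definitions, here is a minimal Python sketch that tallies an audit record into summative and formative totals. The modules, task types, and counts are invented, not TESTA data:

```python
# Hypothetical audit record for one programme, tallied with the TESTA
# definitions above: summative = graded and counts towards the degree;
# formative = required, ungraded, with feedback.

tasks = [
    {"module": "M101", "type": "essay", "graded": True},
    {"module": "M101", "type": "blog post", "graded": False},
    {"module": "M102", "type": "exam", "graded": True},
    {"module": "M102", "type": "draft with feedback", "graded": False},
    {"module": "M103", "type": "presentation", "graded": True},
]

summative = sum(t["graded"] for t in tasks)          # graded tasks only
formative = len(tasks) - summative                    # the ungraded rest
exam_share = sum(t["type"] == "exam" for t in tasks) / summative

print(summative, formative, round(exam_share * 100))  # → 3 2 33
```

A real audit would also record timing, word counts of feedback, and return times per task, but the summative/formative split above is the core distinction.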
Mock Audit
Make sense of audit data
1) What is striking about the data?
2) What surprises or puzzles you?
3) What student learning behaviours might the assessment patterns foster?
4) What do you want to know more about?
Each table looks at:
• 1 x overview data, OR
• 1 x discipline, OR
• 1 x university type
• Briefly discuss in relation to the questions
Typical A&F patterns
73 programmes in 14 unis (Jessop and Tomas 2017)
Characteristic | Low | Medium | High
Volume of summative assessment | Below 33 | 40-48 | More than 48
Volume of formative only | Below 1 | 5-19 | More than 19
% of tasks by examinations | Below 11% | 22-31% | More than 31%
Variety of assessment methods | Below 8 | 11-15 | More than 15
Written feedback in words | Less than 3,800 | 6,000-7,600 | More than 7,600
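Note that the published bands leave small gaps (e.g. 33–40 summative tasks has no label), so any coding of them has to handle in-between cases. A minimal Python sketch, with a made-up `band` helper and metric keys, using the thresholds above:

```python
# Hypothetical helper: place a programme's audit metrics into the
# low/medium/high bands reported by Jessop and Tomas (2017).
# Values falling in the unreported gaps between bands (e.g. 33-40
# summative tasks) are labelled "between bands".

BANDS = {
    "summative": (33, 40, 48),          # low < 33, medium 40-48, high > 48
    "formative": (1, 5, 19),
    "exam_pct": (11, 22, 31),
    "variety": (8, 11, 15),
    "feedback_words": (3800, 6000, 7600),
}

def band(metric, value):
    low_cap, med_lo, med_hi = BANDS[metric]
    if value < low_cap:
        return "low"
    if med_lo <= value <= med_hi:
        return "medium"
    if value > med_hi:
        return "high"
    return "between bands"

print(band("summative", 52))   # → high
print(band("exam_pct", 25))    # → medium
print(band("summative", 36))   # → between bands
```

In a workshop this kind of banding is done by eye; the point of the sketch is only that the thresholds are explicit and comparable across programmes.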
Characteristic | Medium (Research-Intensive) | Medium (Teaching-Intensive) | Mann-Whitney U test results
Summative | 41-79 (Median 50) | 34-41 (Median 35) | RI*
Formative | 1-26 (Median 3) | 3-17 (Median 7) | n.s.
Proportion of examinations | 27-42% (Median 30%) | 5-19% (Median 10%) | RI*
Variety of assessment methods | 8-10 (Median 8) | 12-14 (Median 15) | TI*
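The Mann-Whitney U test in the right-hand column compares the two university types without assuming normality: it counts, over all cross-group pairs, how often one group's value exceeds the other's. A pure-Python sketch on invented summative counts (not the study's data):

```python
# Illustrative sketch of the Mann-Whitney U statistic used in the table
# above, on two small hypothetical samples of summative task counts.

def mann_whitney_u(a, b):
    """Return U for sample a: pairwise wins over b, ties counting 0.5."""
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1
            elif x == y:
                u += 0.5
    return u

research_intensive = [41, 48, 50, 62, 79]   # hypothetical counts
teaching_intensive = [34, 35, 35, 38, 41]

u = mann_whitney_u(research_intensive, teaching_intensive)
print(u)  # → 24.5 out of 25 possible pairwise comparisons
```

A U close to the maximum (n × m pairs) indicates the first group's values are almost uniformly higher, as the table reports for summative volume at research-intensives; turning U into a p-value requires critical-value tables or a normal approximation, omitted here.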
Research Tools 2:
The AEQ
Lies, damned lies and statistics…
• The AEQ constructs
• Filling out the AEQ
• The scales
• Analysing AEQ & audit
data
Why a questionnaire about assessment?
• Weaker NSS scores
• Weak NSS?
• Quick and big data
• Quant/qualitative
AEQ 3.3 (2003)
• Designed to measure ‘conditions under which
assessment supports learning’
• Based on theory and evidence + selected CEQ scales
• Robust enough factor structure and scale coherence
– measures what it’s meant to be measuring?
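Scale coherence of this kind is typically checked with Cronbach's alpha, the internal-consistency statistic for a set of items meant to measure one construct. A minimal pure-Python sketch on invented 1–5 responses (five students, four items on one hypothetical AEQ scale):

```python
# Minimal sketch of Cronbach's alpha, the usual check behind "scale
# coherence": alpha = k/(k-1) * (1 - sum of item variances / variance
# of total scores). Response data are made up for illustration.

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(rows):
    k = len(rows[0])                       # number of items on the scale
    items = list(zip(*rows))               # one tuple of responses per item
    item_var = sum(variance(list(i)) for i in items)
    total_var = variance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - item_var / total_var)

responses = [                              # rows: students; cols: items, 1-5
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [4, 4, 5, 4],
]
print(round(cronbach_alpha(responses), 2))  # → 0.93
```

Values around 0.7 or higher are conventionally read as acceptable internal consistency; in practice alpha would be computed per AEQ scale on the full response set.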
AEQ 4.0 (2018)
• Fill in the AEQ 4.0 from the vantage point of
being a student in one of your classes
• Paper or online?
• https://educ.sphinxonline.net/v4/s/ha2fbs
Comparing Audit and AEQ data
from one programme
In pairs or groups, explore programme
audit and AEQ data from one programme.
Does anything stack up? Are there loose
ends, questions, contradictions?
Research Tools 3:
Focus Groups
Where to find focus group guidance
• How to run a focus group: resources > research toolkits at http://www.testa.ac.uk/index.php
• Focus group schedule for the role play: resources > focus group schedule at www.testa.ac.uk
Main pointers for focus groups
• Questions are broad themes
• Easy to complicated
• Sit in a circle
• It’s the discussion that matters
• Go with the flow
• But steer when off topic, direct, pass the ball
• Troubleshooting
• Ethics
Role play (5 minutes)
Getting students to attend…
• Get the support of lecturers, programme team
• Explore using student researchers
• Use vouchers
• Food
• Between 3 and 8 students for one hour
• Ethics and confidentiality
What the data looks like:
…and the intelligent transcript
• MP3-to-text transcription: https://transcribe.wreally.com/
Coding 101
CODES
• Marker variation
• Written criteria
• Peer feedback
• Subjectivity
• Formative feedback
• Exemplars
• Multi-stage assessment
• Marking exercises
CATEGORY
Internalising standards
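The codes-to-category step can be pictured as a simple lookup: each low-level code rolls up into a broader analytic category. A toy Python sketch (the `categorise` helper and the unmatched example are invented; the code list is the one on the slide):

```python
# Toy sketch of rolling codes up into a category, as in the slide above:
# the eight codes all feed the category "Internalising standards".

CATEGORIES = {
    "Internalising standards": [
        "marker variation", "written criteria", "peer feedback",
        "subjectivity", "formative feedback", "exemplars",
        "multi-stage assessment", "marking exercises",
    ],
}

def categorise(code):
    """Return the category a code belongs to, or flag it as unplaced."""
    for category, codes in CATEGORIES.items():
        if code in codes:
            return category
    return "uncategorised"

print(categorise("exemplars"))   # → Internalising standards
print(categorise("workload"))    # → uncategorised
```

Real qualitative coding is iterative rather than a fixed lookup – codes get merged, split, and re-assigned as themes emerge – but the roll-up structure is the same.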
Raw focus group data
• Read section of full transcript
• Highlight relevant text
• Develop a few codes
• Suggest a few themes
• Devise three headlines
• Identify quotations under each headline
Have a go at triangulating data
• Read through audit, AEQ and focus group data from one
programme
• Quick abstract/bullet points of what seems to be going on
• Discuss with your group/flipchart
a) What are the stand out themes?
b) What jigsaw pieces fit together?
c) What unresolved issues remain?
The tone of the case study
• Build a narrative thread
• Descriptive, non-evaluative tone
• Empathetic
• Surprises, puzzles, contradictions
• Balancing weak and strong features
• Admitting gaps, interpretation, errors
• Not prescriptive, but give a steer & create
options
TESTA is like fly fishing
Negotiating TESTA evidence
with programme teams
How not to do it
What doesn’t work: lessons learned
• Too much information
• Too much negative information
• Lack of soft stuff – food, drinks, chat, humour,
empathy, conducive spaces
• An inquorate team meeting
• Focusing on modules
What has worked and why
• Post-it predictions beforehand
• Trust and confidentiality
• Admitting gaps, listening
• Respect for disciplines
• Team ownership
• One-page notes – “you said”
• Focus on the whole programme
The TESTA effect
• Helps teams to talk about whole programme design
• Acting on evidence and principles
• Formative assessment
• Develops connections within/across modules
• Feedback as dialogue
• Greater knowledge and confidence among teachers about
assessment for learning
• And…improved NSS scores
References
Barlow, A. and Jessop, T. (2016) ‘“You can’t write a load of rubbish”: Why blogging works as formative assessment’, Educational Developments, 17(3), pp. 12-15. SEDA.
Boud, D. and Molloy, E. (2013) ‘Rethinking models of feedback for learning: The challenge of design’, Assessment & Evaluation in Higher Education, 38(6), pp. 698–712.
Gibbs, G. and Simpson, C. (2004) ‘Conditions under which assessment supports students’ learning’, Learning and Teaching in Higher Education, 1(1), pp. 3-31.
Harland, T., McLean, A., Wass, R., Miller, E. and Sim, K. N. (2014) ‘An assessment arms race and its fallout: High-stakes grading and the case for slow scholarship’, Assessment & Evaluation in Higher Education.
Jessop, T. and Tomas, C. (2017) ‘The implications of programme assessment on student learning’, Assessment and Evaluation in Higher Education.
Jessop, T. and Maleckar, B. (2016) ‘The influence of disciplinary assessment patterns on student learning: a comparative study’, Studies in Higher Education. Published online 27 August 2014.
Jessop, T., El Hakim, Y. and Gibbs, G. (2014) ‘The whole is greater than the sum of its parts: a large-scale study of students’ learning in response to different assessment patterns’, Assessment and Evaluation in Higher Education, 39(1), pp. 73-88.
Nicol, D. (2010) ‘From monologue to dialogue: improving written feedback processes in mass higher education’, Assessment & Evaluation in Higher Education, 35(5), pp. 501-517.
O’Donovan, B., Price, M. and Rust, C. (2008) ‘Developing student understanding of assessment standards: a nested hierarchy of approaches’, Teaching in Higher Education, 13(2), pp. 205-217.
Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional Science, 18(2), pp. 119–144.
Tomas, C. and Jessop, T. (2018) ‘Struggling and juggling: A comparison of student assessment loads across research and teaching-intensive universities’, Assessment and Evaluation in Higher Education. 18 April.
Wu, Q. and Jessop, T. (2018) ‘Formative assessment: missing in action in both research-intensive and teaching-focused universities’, Assessment and Evaluation in Higher Education. Published online 15 January.

Editor's Notes

  • #2: Tansy
  • #7: Disconnected: seeing the whole degree in silos – the lecturer’s ‘my module’ perspective (elephant: trunk, ears, tusks etc.) compared to the student’s perspective of the whole huge beast. I realise that what we were saying is two per module.
  • #8: Not so good for complex learning, integrating knowledge, lends itself to disposable curriculum fragmented learning. Amplified summative, less time for formative. Hard to make connections, difficult to see the joins between assessments, much more assessment, much more assessment to accredit each little box. Multiplier effect. Less challenge, less integration. Lots of little neo-liberal tasks. The Assessment Arms Race.
  • #9: Language of ‘covering material’ Should we be surprised?
  • #13: “The TESTA report back of programme findings was by far the most significant meeting I have attended in ten years of sitting through many meetings at this university. For the first time, I felt as though I was a player on the pitch, rather than someone watching from the side-lines. We were discussing real issues.” (Senior Lecturer, Education)
  • #22: Tansy
  • #24: In the UK, assessment and feedback are primary areas of disquiet in the NSS. Provide very little diagnostic information to help course teams adopt more effective assessment strategies. Every year, routine charts red/green/orange – visual representation, accompanied by ritual humiliation of programmes, but not sure why or how to change! AEQ has been used in many countries
  • #25: Looked at in the overview – student effort; intellectual challenge, focused on understanding rather than memorising or ‘sufficing’; clear about goals and stds; feedback is effective – students read it, understand it, use it to improve what they do next. Cronbach’s Alpha – sounds like a disease to me but a test to measure the internal consistency of items – do all items measure the same construct?
  • #28: Tansy
  • #34: Codes look at small units of meaning – a student says – it takes 4 weeks to get it back, so you’ve already handed in the next one, or the tasks are different one after the other, or I never bother at the end of the course because it’s over. All of these are partly to do with timing, and they contribute to the theme of students not using their feedback. Another sub reason is that students don’t use feedback is because they don’t trust it so they say things like If x marks it you’ll get a good grade, if y a bad, or if you get her on a good day, or it’s so subjective etc
  • #35: Please stay on the same data set – as AEQ session. We will work it into a case profile.
  • #36: Use flip chart
  • #37: What does it feel like to be a student? What does it feel like to be a lecturer at the end of this? An empathetic balanced reporting.
  • #39: Tansy
  • #41: Big guns, multiple agendas, using TESTA to get systemic changes done, before we did headlines, lots of themes and literature – there is lit to back up but for team briefing we keep it implicit, pull it out theory informally and illustratively in conversation