A feedback survey for taught postgraduates
Workshop 2: Timing and publication
Catherine Cameron
Senior HE Policy Adviser, HEFCE
Richard Puttock
Head of Data and Management Information, HEFCE
Future Inn, Bristol City Centre
21 September 2017
Session Outline
1. Survey coverage (10 mins)
2. Survey timing (20 mins)
3. Data linking (10 mins)
4. Publication levels (20 mins)
Survey coverage
The starting point was whether there were any groups of PGT students who should be excluded
Our proposal
All students on PGT courses that lead to a qualification
Except those on integrated Masters courses
Questions or comments?
Survey timing
The key consideration is to provide representative and meaningful information
Options
• Single survey window
• Post completion survey
• Multi-point survey
• Rolling survey
Survey timing
End date analysis
Whilst June/July and
September/October cover most
students, there are differences for:
• Particular providers
• Some students on unstructured
courses
• Some students studying particular
subjects
End month    Percentage of students
Jan          3%
Feb          2%
Mar          3%
Apr          2%
May          4%
Jun          12%
Jul          16%
Aug          5%
Sep          34%
Oct          10%
Nov          6%
Dec          3%
Survey timing
Conclusion
• A multi-point or rolling survey is the only way to achieve responses which reflect the majority of a student's academic experience and a sufficiently high response rate.
• Running the survey at four points in the year
should be sufficient to capture a significant
majority of the PGT population close to the
end of their period of study.
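As a rough check on this conclusion, the sketch below (not part of the original slides) uses the end-month distribution from the table above to estimate how quickly four survey points would reach students after their course end. The choice of survey months is an illustrative assumption, not a HEFCE proposal.

```python
# Minimal sketch: how much of the PGT population would four survey points reach,
# and how soon after course end? End-month shares are taken from the table above;
# the survey months themselves are assumed for illustration only.
end_month_share = {
    "Jan": 3, "Feb": 2, "Mar": 3, "Apr": 2, "May": 4, "Jun": 12,
    "Jul": 16, "Aug": 5, "Sep": 34, "Oct": 10, "Nov": 6, "Dec": 3,
}
months = list(end_month_share)                  # Jan..Dec in calendar order
survey_points = {"Mar", "Jun", "Sep", "Dec"}    # hypothetical quarterly survey windows

def wait_until_survey(end_month):
    """Months from course end until the next survey point (0 = same month)."""
    start = months.index(end_month)
    return next(lag for lag in range(12) if months[(start + lag) % 12] in survey_points)

for max_wait in (0, 1, 2):
    covered = sum(share for m, share in end_month_share.items()
                  if wait_until_survey(m) <= max_wait)
    print(f"Surveyed within {max_wait} month(s) of course end: {covered}% of students")
```

With quarterly points, every student is surveyed within two months of their course end, and roughly two-thirds within one month, which is the sense in which four points capture a significant majority close to the end of study.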
Survey timing - questions
Your questions ….
Our questions
a. What are the challenges for institutions in
delivering a multi-point survey?
b. Is there anything we haven’t considered
that we should have?
Data linking
• To analyse and report survey responses by student characteristics, we must be able to link them back to individualised student data
• This will also allow us to link to other datasets, such as the NSS and Graduate Outcomes, for further analysis
• Data Futures will introduce in-year data collection and will potentially allow us to generate target lists
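As a minimal illustration of the linkage described above, responses could be joined back to individualised student records on a common identifier and then analysed by a student characteristic. The field names and values below are invented for the example, not taken from any actual HESA or survey dataset.

```python
import pandas as pd

# Illustrative survey responses and individualised student records;
# 'student_id', 'overall_satisfaction', 'mode_of_study' etc. are assumed names.
responses = pd.DataFrame({
    "student_id": ["A1", "A2", "A3"],
    "overall_satisfaction": [4, 5, 3],
})
student_records = pd.DataFrame({
    "student_id": ["A1", "A2", "A3", "A4"],
    "mode_of_study": ["FT", "PT", "FT", "PT"],
    "subject_cah2": ["Law", "Business", "Law", "Nursing"],
})

# Link responses back to the student record, then break results down
# by a student characteristic (here, mode of study).
linked = responses.merge(student_records, on="student_id", how="left")
print(linked.groupby("mode_of_study")["overall_satisfaction"].mean())
```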
Questions or comments?
Publication
• We intend to consult on some elements of the publication approach now, and on others following successful piloting
• The key things we think we will need to consider are:
• Publication routes
• Publication thresholds
• Publication level
• Breakdown of responses
• Benchmarking of responses
• Detailed approach to presentation of
information
Publication
Publication routes
We envisage publishing outcomes in a similar way to the NSS:
• Publishing responses on the Office for Students (OfS) website
• Publishing on the central information source to support
decision-making being provided by the Office for Students
and the other funding bodies (assuming continuation)
• Publishing an open dataset so that other information
providers can make use of this information
• Making responses available to providers through a secure
system.
Publication
Publication thresholds
We propose to adopt the same thresholds as are used for publication of the NSS:
• At least 10 students
• 50 per cent of sample
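A minimal sketch of how these thresholds would apply to a set of results (an assumed implementation for illustration, not HEFCE's code):

```python
def publishable(respondents: int, sample_size: int) -> bool:
    """NSS-style publication threshold: at least 10 respondents
    who make up at least 50 per cent of the surveyed sample."""
    return sample_size > 0 and respondents >= 10 and respondents / sample_size >= 0.5

print(publishable(12, 20))   # True: 12 respondents, 60% of the sample
print(publishable(9, 12))    # False: fewer than 10 respondents
print(publishable(15, 40))   # False: only 37.5% of the sample
```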
Question
Our analysis is based on these thresholds and we
propose not to consult on them. Are there any concerns
about this?
Publication
Potential coverage
However, gaps are expected for some types of provision, particularly at FECs, due to very small cohort sizes
Publication level    Estimated coverage (70% response rate)    Estimated proportion of students included
Course               19%                                       67%
HECoS CAH level 3    57%                                       94%
HECoS CAH level 2    70%                                       98%
Publication
Breakdown of responses
We also need to consider the course characteristics by which survey responses should be broken down to best meet the survey's purposes, for example mode of study.
But we must be careful not to fragment the data to an extent where we cannot publish it (as the sketch below illustrates).
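A minimal sketch of the fragmentation risk, using invented data: each additional breakdown characteristic shrinks the cells, so more of them fall below the 10-respondent publication threshold.

```python
import pandas as pd

# 30 invented respondents at one provider; every column and value is illustrative.
linked = pd.DataFrame({
    "provider":      ["P1"] * 30,
    "mode_of_study": ["FT"] * 18 + ["PT"] * 12,
    "subject_cah2":  ["Law"] * 9 + ["Business"] * 9 + ["Law"] * 5 + ["Business"] * 7,
})

# Count how many cells drop below the 10-respondent threshold as we
# break the same responses down by more and more characteristics.
for by in (["provider"],
           ["provider", "mode_of_study"],
           ["provider", "mode_of_study", "subject_cah2"]):
    sizes = linked.groupby(by).size()
    print(" x ".join(by), "->", int((sizes < 10).sum()), "of", len(sizes), "cells below threshold")
```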
Questions
• Are there ways in which it would be essential to break down the data to make it meaningful?
• Are there ways in which it would be desirable to break
it down?
Publication
Benchmarking
• We will consider benchmarking as part of the second
stage of developing proposals.
• HEFCE is currently leading a fundamental review of
the approach to benchmarking used in the UK HE
Performance Indicators and the TEF to ensure it
meets best practice.
• This will influence our approach to benchmarking of
survey outcomes.
Other questions or
comments?
How to find out more
Website: www.hefce.ac.uk/lt/PGT/
