Module 4: Monitoring and assessment
Preparing Tutors for Work Based Learning
Developed in the framework of the Erasmus+ Project 2018-1-RO01-KA202-049191
TOTVET - Training of Tutors and VET professionals for high quality in Work
Based Learning and Dual Learning
This publication reflects the views only of the author, and the Commission cannot be held
responsible for any use which may be made of the information contained therein.
Summary
In this Learning Tool you will:
learn
• to implement the monitoring and assessment plan that you
designed and developed in earlier phases of the systematic
approach to training
understand
• training and learning in relation to monitoring and assessment
be able to
• use self-assessment to measure abilities before and after training (skills
improvement and training effectiveness)
Summary
• Main issues:
1. Evaluation of training
2. Measuring improvement using self-assessment
3. Training evaluation research
Evaluation of training
There are two principal factors which need to be
resolved:
• Who is responsible for the validation and evaluation
processes?
• What resources of time, people and money are available
for validation/evaluation purposes? (Within this, consider
the effect of variations in these resources, for instance an
unexpected cut in budget or manpower. In other words, anticipate
such variation and plan contingencies to deal with it.)
Measuring improvement using self-assessment
• The 'revised pre-trained ability' is a reassessment, carried
out after training, of the ability level that existed before
training.
• This will commonly be significantly different from the ability
assessment made before training because, by implication, we do
not fully understand competence and ability in a skill/area
before we are trained in it.
• People commonly over-estimate their ability before training.
After training many people realise that they actually had lower
competence than they first believed (i.e., before receiving the
training).
• It is important to allow for this when attempting to measure real
improvement using self-assessment. This is the reason for
revising (after training) the pre-trained assessment of ability.
• Additionally, in many situations after training, people's ideas of
competence in a particular skill/area can expand hugely. They
realise how big and complex the subject is and they become
more conscious of their real ability and opportunities to
improve.
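As a minimal illustration of this point (using a hypothetical 1-10 self-assessment scale and invented scores), the sketch below shows why real improvement should be measured against the revised pre-trained ability rather than the original, often over-estimated, self-rating:

```python
# Minimal sketch: hypothetical 1-10 self-assessment scale, invented scores.
# Real improvement is measured against the revised pre-trained ability,
# i.e. the learner's post-training reassessment of where they started.

def improvement(pre_self_rating, revised_pre_rating, post_rating):
    """Return (apparent, real) improvement for one learner."""
    apparent = post_rating - pre_self_rating    # naive before/after difference
    real = post_rating - revised_pre_rating     # corrected for over-estimation
    return apparent, real

# A learner rated themselves 7/10 before training, revised that estimate
# down to 4/10 after training, and rated themselves 8/10 afterwards.
apparent, real = improvement(pre_self_rating=7, revised_pre_rating=4, post_rating=8)
print(f"Apparent improvement: {apparent}")  # 1
print(f"Real improvement:     {real}")      # 4
```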
The trainer's overall responsibilities - aside from
training evaluation
• Over the years the trainer's roles have changed, but the basic purpose of the trainer is to
provide efficient and effective training programmes. The following suggests the elements
of the basic role of the trainer, but it must be borne in mind that different circumstances
will require modifications of these activities.
• 1. The basic role of a trainer (or however they may be designated) is to offer and provide
efficient and effective training programmes aimed at enabling the participants to learn
the knowledge, skills and attitudes required of them.
• 2. A trainer plans and designs the training programmes, or otherwise obtains them (for
example, distance learning or e-technology programmes on the Internet or on CD/DVD),
in accordance with the requirements identified from the results of a TNIA (Training
Needs Identification and Analysis - or simply TNA, Training Needs Analysis) for the
relevant staff of an organization or organizations.
• 3. The training programmes cited at (1) and (2) must be completely based on the TNIA,
which has been either: (a) completed by the trainer on behalf of and at the request of the
relevant organization, or (b) determined in some other way by the organization.
• 4. Following discussion with, or direction by, the organization management, who will have
taken into account costs and values (e.g. ROI - Return on Investment in the training), the
trainer will agree with the organization management the most appropriate form and
methods for the training.
 5 . If the appropriate form for satisfying the training need is a direct training course or
workshop, or an Intranet provided programme, the trainer will design this programme
using the most effective approaches, techniques and methods, integrating face-to-face
practices with various forms of e-technology wherever this is possible or desirable.
 6. If the appropriate form for satisfying the training need is some form of open
learning programme or e-technology programme, the trainer, with the support
of the organization management obtain, plan the utilization and be prepared
to support the learner in the use of the relevant materials.
 7. The trainer, following contact with the potential learners, preferably through
their line managers, to seek some pre-programme activity and/or initial
evaluation activities, should provide the appropriate training programme(s) to
the learners provided by their organization(s). During and at the end of the
programme, the trainer should ensure that: (a) an effective form of
training/learning validation is followed (b) the learners complete an action
plan for implementation of their learning when they return to work.
• 8. Having reviewed the validation results, provide the organization
management, as necessary, with an analysis of the changes in the knowledge,
skills and attitudes of the learners, together with any recommendations deemed
necessary. The review would include consideration of the effectiveness of the
content of the programme and the effectiveness of the methods used to enable
learning, that is, whether the programme satisfied its own objectives and those
of the learners.
• 9. Continue to provide effective learning opportunities as required by the
organization.
• 10. Pursue their own CPD (Continuing Professional Development) by all
possible developmental means - training programmes and self-development
methods.
• 11. Arrange and run educative workshops for line managers on the subject of
their fulfilment of their training and evaluation responsibilities.
Training evaluation research
• There are many different ways to assess and evaluate training and
learning.
• Remember that evaluation is for the learner too - evaluation
is not just for the trainer or organisation.
• Feedback and test results help learners know where they are,
and directly affect their confidence and their
determination to continue with the development - in some cases
with their own future personal development altogether.
• Central to improving training and learning is the question
of bringing more meaning and purpose to people's lives,
aside from merely focusing on skills and work-related
development and training courses.
• Learning and training enable positive change and improvement
- for people and employers - when people's work is aligned with
people's lives - their strengths, personal potential, goals and
dreams - outside work as well as at work.
• Evaluation of training can only be effective if the training
itself is effective and appropriate. Testing the wrong
things in the wrong way will give you unhelpful data,
and could be even more unhelpful for learners.
• Consider people's learning styles when evaluating
personal development. Learning styles are essentially a
perspective on people's preferred working, thinking
and communicating styles. Written tests do not enable
all types of people to demonstrate their competence.
• Evaluating retention of knowledge only is a very
limited form of assessment. It will not indicate how
well people apply their learning and development in
practice.
• How do you develop a monitoring and evaluation
system?
• Steps
• Step 1: Identify Program Goals and Objectives.
• Step 2: Define Indicators.
• Step 3: Define Data Collection Methods and Timeline.
• Step 4: Identify M&E Roles and Responsibilities.
• Step 5: Create an Analysis Plan and Reporting Templates.
• Step 6: Plan for Dissemination and Donor Reporting.
Step 1: Identify Program Goals and Objectives
The first step in creating an M&E plan is to
identify the program goals and objectives. If the program
already has a logic model or theory of change, then the
program goals are most likely already defined. If not,
developing the M&E plan is a good opportunity to define
them.
Defining program goals starts with answering
three questions:
What problem is the program trying to solve?
What steps are being taken to solve that problem?
How will program staff know when the program has
been successful in solving the problem?
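The answers to these three questions can be recorded in a simple structure at the front of the M&E plan. The sketch below is purely illustrative; the problem statement, steps and success criterion are invented placeholders, not taken from the source material:

```python
# Illustrative only: a minimal record of program goals, answering the
# three questions above. All values are hypothetical placeholders.
program_goal = {
    "problem": "Low completion rates in work-based learning placements",
    "steps_taken": [
        "Train company tutors in monitoring and assessment",
        "Introduce structured feedback sessions during placements",
    ],
    "success_criterion": "Placement completion rate reaches 90% within two years",
}
```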
Step 2: Define Indicators
Once the program’s goals and objectives are defined, it is
time to define indicators for tracking progress towards
achieving those goals. Program indicators should be a
mix of those that measure process, or what is being done
in the program, and those that measure outcomes.
Process indicators track the progress of the program.
They help to answer the question, “Are activities being
implemented as planned?” Some examples of process
indicators are:
• Number of trainings held with health providers
• Number of outreach activities conducted at youth-
friendly locations
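One simple way to keep track of the indicator mix is sketched below. The process indicators repeat the examples above; the outcome indicators are hypothetical additions for illustration only:

```python
# Hypothetical indicator list mixing process and outcome indicators.
indicators = [
    {"name": "Number of trainings held with health providers",
     "type": "process"},
    {"name": "Number of outreach activities conducted at youth-friendly locations",
     "type": "process"},
    {"name": "Share of trainees whose revised self-assessment improves after training",
     "type": "outcome"},  # invented outcome indicator for illustration
    {"name": "Placement completion rate",
     "type": "outcome"},  # invented outcome indicator for illustration
]
```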
Step 3: Define Data Collection Methods and
Timeline
After creating monitoring indicators, it is time to
decide on methods for gathering data and how
often various data will be recorded to track indicators.
This should be a conversation between program staff,
stakeholders, and donors. These decisions will have
important implications for what data collection methods
will be used and how the results will be reported.
The source of monitoring data depends largely on
what each indicator is trying to measure. The program
will likely need multiple data sources to answer all of the
programming questions. Some examples of what data can
be collected, and how, are sketched below.
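As a hypothetical stand-in for such a table, the sketch below maps each indicator to a data source, a collection frequency and a responsible role (anticipating Step 4); all values are invented for illustration:

```python
# Hypothetical data-collection plan: indicator -> source, frequency, responsible role.
# None of these values come from the source material.
data_collection_plan = {
    "Number of trainings held with health providers": {
        "source": "training attendance records",
        "frequency": "monthly",
        "responsible": "program staff",
    },
    "Placement completion rate": {
        "source": "partner organisation administrative data",
        "frequency": "quarterly",
        "responsible": "M&E staff",
    },
}
```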
Step 4: Identify M&E Roles and Responsibilities
The next element of the M&E plan is a section on
roles and responsibilities. It is important to decide from
the early planning stages who is responsible for
collecting the data for each indicator. This will probably
be a mix of M&E staff, research staff, and program staff.
Everyone will need to work together to get data collected
accurately and in a timely fashion.
Step 5: Create an Analysis Plan and Reporting
Templates
• The M&E plan should include a section with details
about what data will be analyzed and how the results
will be presented. Do research staff need to perform
any statistical tests to get the needed answers?
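If statistical testing is needed, one common (but not prescribed) choice for pre/post training scores is a paired comparison. The sketch below uses SciPy's paired t-test on invented scores; whether such a test is appropriate depends on the actual data and design:

```python
# Illustrative paired t-test on invented pre/post training self-assessment scores.
from scipy import stats

pre_scores = [4, 5, 3, 6, 4, 5]   # revised pre-trained ability ratings (invented)
post_scores = [7, 8, 6, 8, 7, 9]  # post-training ratings (invented)

result = stats.ttest_rel(pre_scores, post_scores)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```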
Step 6: Plan for Dissemination and Donor Reporting
The M&E plan should include plans for internal
dissemination among the program team, as well as wider
dissemination among stakeholders and donors. For example,
a program team may want to review data on a monthly basis
to make programmatic decisions and develop future
workplans, while meetings with the donor to review data and
program progress might occur quarterly or annually.
Dissemination of printed or digital materials might occur at
more frequent intervals. These options should be discussed
with stakeholders and your team to determine reasonable
expectations for data review and to develop plans for
dissemination early in the program. If these plans are in place
from the beginning and become routine for the project,
meetings and other kinds of periodic review have a much
better chance of being productive ones that everyone looks
forward to.
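These agreements can be summarised in a simple dissemination schedule. The cadences below merely restate the example in the text plus invented entries, and should be adjusted to what is actually agreed with stakeholders and donors:

```python
# Hypothetical dissemination and reporting schedule; adjust to the
# review cadences actually agreed with stakeholders and donors.
dissemination_schedule = {
    "internal program team data review": "monthly",
    "donor review meeting": "quarterly or annually",
    "printed/digital materials": "as produced",
}
```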
Conclusion
• After following these 6 steps, the outline of the M&E plan should look
something like this:
• Introduction to program
• Program goals and objectives
• Logic model / Logical framework / Theory of change
• Indicators
• Table with data sources, collection timing, and staff member responsible
• Roles and responsibilities
• Description of each staff member's role in M&E data collection, analysis, and/or
reporting
• Reporting
• Analysis plan
• Reporting template table
• Dissemination plan
• Description of how and when M&E data will be disseminated internally and
externally
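For teams that keep their plans in code or configuration, the outline above can also be held as a machine-readable skeleton, for example to generate reporting templates. The structure below simply mirrors the outline; every value is a placeholder:

```python
# Skeleton mirroring the M&E plan outline above; all values are placeholders.
me_plan = {
    "introduction": "Short description of the program",
    "goals_and_objectives": [],        # Step 1
    "logic_model": None,               # logic model / logical framework / theory of change
    "indicators": [],                  # Step 2, with sources, timing, responsible staff
    "roles_and_responsibilities": {},  # Step 4
    "reporting": {
        "analysis_plan": None,         # Step 5
        "reporting_template": None,
    },
    "dissemination_plan": None,        # Step 6
}
```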
The trainer
• Provision of any necessary pre-programme work, etc., and
programme planning.
• Identification at the start of the programme of the
knowledge and skills level of the trainees/learners.
• Provision of training and learning resources to enable the
learners to learn within the objectives of the programme
and the learners' own objectives.
• Monitoring the learning as the programme progresses.
• At the end of the programme, assessment of and receipt of
reports from the learners of the learning levels achieved.
• Ensuring the production by the learners of an action plan
to reinforce, practise and implement learning.