Introduction to
Research Methods
in Education
Jasarat Ilyas Jokhio
M.Ed.
Preston University
Karachi Campus.
Research Methods
 Why is it important to understand research methods for interdisciplinary researchers?
 Types of research
 How do you measure learning experimentally?
 Within-subjects design
 Between-subjects design
 What are you measuring?
 Important statistical terms
 [other methods]
Research methods as boundary
object in interdisciplinary research
 Study of cross-disciplinary research
collaboration (Mercier, Penuel, Remold, Villalba, Kuhl &
Romo, under review)
 The Bilingual Baby Project
 4-year longitudinal study
 Neuroscience, cognitive science, sociology
 How does growing up in a bilingual environment
influence cognitive development & school
readiness?
 Biggest issue was sample selection (methods)
… [a child] was brought back for me to test, and I was
putting [an ERP] cap on and then his mom said, well,
he’s used to this from going to the neurologist … and
then she told me that he has epilepsy. And I thought
to myself, well, okay, I’m wasting the next two hours
because I’m not going to be able to use the data… A
quantitative researcher would never include a child
that—you wouldn’t even waste the time and
resources.
It’s difficult because you do form close
relationships with the families…. So it was
then much harder for our group to eliminate
them from the pool, which we didn’t. We have
followed up, we’ve gone and interviewed her,
and it would be very interesting to see what
kind of school readiness skills that child has
and what kind of problems that mother has
encountered. I know that they wanted
children who had no other disabilities in order
to focus on their language acquisition, but not
all children are of this type.
Types of Research
Quantitative
 Experimental
 Surveys (usually)
Qualitative
 Biography, phenomenology, grounded theory,
ethnography & case study
Mixed methods
 You can’t account for context with numbers
 The plural of anecdote is not data
What do you want your data &
results to look like?
 Do you want to show learning, engagement,
the process of the activity?
 Do you want to show that your tool works, or
that it is better than an alternative?
 Do you want to describe, code, run statistics,
present a case study?
 How will you design the study to get the type
of results you want to present?
How do you measure
learning experimentally?
Ecology lesson
- Aim: to teach 5-year-olds about complex systems
- Ten 1-hour sessions over 5 weeks
- “Embodied curriculum”
- Technology; dancing; drawing
[Bar chart: score on test (0–10), Pre vs. Post]
No way to know whether it was the curriculum, or just being taught at all, that led to the learning
CLAIM: Embodied curriculum is a
good way to teach complex systems
How do you measure
learning experimentally?
(and know that it’s because of what you did…)
Pre/post test design (within subjects)
 Sequestered problem solving (SPS)
 Preparation for future learning (PFL)
 Free-write/free recall
 Delayed post-test
 Multiple baseline/single case design
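A minimal sketch (not from the slides) of how a pre/post, within-subjects design is often analysed: a paired t-test on each participant’s pre- and post-test scores. The scores, sample size and names below are invented purely for illustration.

```python
# Minimal sketch of analysing a pre/post (within-subjects) design.
# All scores and the sample size are made up for illustration.
from scipy import stats

pre  = [3, 4, 2, 5, 3, 4, 2, 3]   # hypothetical pre-test scores (0-10)
post = [6, 7, 5, 8, 6, 7, 5, 6]   # the same children after the intervention

# Paired t-test: each child is compared with themselves,
# so stable individual differences are controlled for.
res = stats.ttest_rel(post, pre)
print(f"paired t = {res.statistic:.2f}, p = {res.pvalue:.3f}")
```

A delayed post-test would simply add a third set of scores for the same participants, compared in the same way.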
How do you measure
learning experimentally?
(and know that it’s because of what you did…)
Control design (between subjects)
 Only some participants receive the intervention; compare post-test scores between the control & experimental groups
 Compare the ‘experimental’ group with previous groups who did not receive the intervention
 Randomized controlled trials (the medical model)
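Again as an illustration only: a between-subjects comparison of post-test scores is commonly analysed with an independent-samples t-test. The groups and scores below are made up.

```python
# Minimal sketch of a between-subjects comparison on made-up post-test scores.
from scipy import stats

control      = [4, 5, 3, 6, 4, 5, 4, 3, 5, 4]   # did not receive the intervention
experimental = [6, 7, 5, 8, 6, 7, 6, 5, 7, 6]   # received the intervention

# Independent-samples t-test: different participants in each group.
res = stats.ttest_ind(experimental, control)
print(f"t = {res.statistic:.2f}, p = {res.pvalue:.3f}")
```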
2x2 design
(Godden & Baddeley, 1975)

                            Recall condition
                            On land       Under water
Learning      On land          20             20
condition     Under water      20             20
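A sketch of how a 2x2 between-subjects design like this is typically analysed with a two-way ANOVA, which gives the main effect of each IV plus their interaction. The data below are simulated (the interaction is built in by construction) and the variable names are my own assumptions, not the original study’s data.

```python
# Simulated 2x2 between-subjects design in the spirit of Godden & Baddeley (1975).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
rows = []
for learn in ["land", "water"]:
    for recall in ["land", "water"]:
        mean = 13 if learn == recall else 9          # better recall when contexts match
        for words in rng.normal(mean, 3, size=20):   # 20 participants per cell
            rows.append({"learn": learn, "recall": recall, "words": words})
df = pd.DataFrame(rows)

# Two-way ANOVA: main effects of each IV plus their interaction.
model = smf.ols("words ~ C(learn) * C(recall)", data=df).fit()
print(anova_lm(model, typ=2))
```

In the original study the key result was the interaction: divers recalled more words when the learning and recall environments matched.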
But what are you measuring?
Prerequisites for Maths
Course
- Is maths101 necessary to pass stats202?
- Half of students had taken maths101
- All students take stats202
- Contrast outcomes
[Bar chart: mean percent on post-test (0–100), Maths101 vs. No Maths101]
Floor effect: either the post-test didn’t measure the content or very
little was learned from stats202.
CLAIM: No need to take
maths101 before taking stats202
But what are you measuring…
 Is it valid?
 Internal validity (is the effect caused by the IV?)
 External validity (would it replicate beyond the sample?)
 Is it reliable?
 Test-retest reliability
 Split-half reliability
 [piloting]
What sort of learning will you measure?
Important statistical ideas
 Independent variables
 The thing you manipulate/control for
 Main effects & interaction effects between IVs
 Dependent variables
 The outcome measure
 Floor effects & ceiling effects
 Statistical significance
 Usually p < .05 in social science
 Indicates how unlikely the observed effect would be if it were due to chance alone
Important statistical ideas
 Level of measurement
 Nominal
 Ordinal
 Interval (& ratio)
 Population & sample
 Normal distribution
 Descriptive statistics
 Means, standard deviations, standard errors (a brief sketch follows this list)
 Parametric and non-parametric statistics
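For concreteness, a small sketch of the descriptive statistics listed above, computed on invented scores.

```python
# Descriptive statistics on a hypothetical set of test scores.
import numpy as np

scores = np.array([55, 60, 62, 48, 70, 65, 58, 61])
mean = scores.mean()
sd = scores.std(ddof=1)            # sample standard deviation
se = sd / np.sqrt(len(scores))     # standard error of the mean
print(f"mean = {mean:.1f}, SD = {sd:.1f}, SE = {se:.1f}")
```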
Assumptions for parametric statistics
 Level of measurement must be at least
interval
 Sample is drawn from a normally distributed
population
 Homogeneity of Variance
 The variances of the compared samples are not significantly different
 Independence of scores
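A sketch of how these assumptions might be checked in practice, using invented data; the Shapiro–Wilk and Levene tests are common choices, though not the only ones.

```python
# Quick checks of two parametric assumptions on made-up data.
from scipy import stats

group_a = [52, 61, 58, 49, 66, 71, 55, 63]
group_b = [48, 57, 60, 45, 62, 50, 59, 54]

# Normality: Shapiro-Wilk on each sample (low power with small samples).
print(stats.shapiro(group_a))
print(stats.shapiro(group_b))

# Homogeneity of variance: Levene's test across the groups.
print(stats.levene(group_a, group_b))

# If these assumptions clearly fail, a non-parametric alternative
# (e.g. Mann-Whitney U instead of an independent t-test) may be more appropriate.
```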
Questions?
Key things to look for:
 Are the differences between conditions significant?
 t-tests, ANOVA, chi-square
 Is there a relationship between variables?
 Correlations (note: you can’t infer causation from these)
 Pay attention to r values (between −1 and 1)
 Which of the IVs predict the DV?
 Regression analysis (note: needs a very large sample size; controversial technique)
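To make the distinction concrete, a small sketch on invented data: a correlation describes the strength of a relationship, while a simple regression predicts the DV from an IV; neither establishes causation on its own.

```python
# Correlation and simple regression on hypothetical data (hours studied vs. test score).
from scipy import stats

hours  = [1, 2, 2, 3, 4, 5, 6, 7, 8, 9]
scores = [40, 45, 50, 52, 58, 60, 65, 70, 72, 80]

# Correlation: strength of the linear relationship, r between -1 and 1.
r, p = stats.pearsonr(hours, scores)
print(f"r = {r:.2f}, p = {p:.3f}")

# Simple linear regression: predicting the DV (score) from one IV (hours).
fit = stats.linregress(hours, scores)
print(f"score = {fit.intercept:.1f} + {fit.slope:.1f} * hours (approx.)")
```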
Survey design
 In the past 24 hours, did you watch more
than an hour of television programming,
or not? Yes/No
 In the past 24 hours, did you read a daily
newspaper, or not? Yes/No
 On a scale of 1 to 7, please rate: How satisfied were you with what you learned and the usability of the software? (1) Agree strongly … (7) Disagree strongly
Survey design (things to remember)
 Is there only one question in each item?
 Pilot with a number of people – do they read the
question the way it was intended?
 Are all your scales in the same direction? (If not, reverse them before analysis; see the sketch after this list)
 Do the answers match the questions?
 How will you make sense of the answers?
 What sort of analysis can you do on rating, frequency,
open-ended items?
 Are particular answers ‘socially desirable?’
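As an example of reversing scale direction before analysis (see the reminder above), a minimal sketch; the column names and responses are invented.

```python
# Reverse-scoring a 1-7 item so that all scales run in the same direction.
import pandas as pd

responses = pd.DataFrame({
    "q1_satisfaction": [6, 7, 5, 2],   # 7 = very positive
    "q2_frustration":  [2, 1, 3, 6],   # 7 = very negative (runs the other way)
})

# Reverse a 1-7 item: new = (max + min) - old = 8 - old.
responses["q2_frustration_rev"] = 8 - responses["q2_frustration"]
print(responses)
```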