Experimental Research
Experiments
 Begin with a Hypothesis
 Modify Something in a Situation
 Compare Outcomes
 Cases or People are Termed “Subjects”
Random Assignment
 Probability of Equal Selection
 Allows Accurate Prediction
 An Alternative to Random Assignment is Matching
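A minimal sketch of both approaches in Python (the subject IDs and the age covariate are hypothetical; the standard library's random module stands in for whatever tool is actually used):

    import random

    # Hypothetical pool of 20 subject IDs
    subjects = list(range(1, 21))

    # Random assignment: shuffle so every subject has an equal chance
    # of ending up in either group.
    random.seed(42)                 # seed only so the split can be reproduced
    random.shuffle(subjects)
    treatment_group = subjects[:10]
    control_group = subjects[10:]

    # Matching alternative: pair subjects on a covariate (here a made-up
    # age value), then randomly split each matched pair between groups.
    ages = {s: random.randint(18, 65) for s in subjects}
    ordered = sorted(subjects, key=lambda s: ages[s])
    matched_treatment, matched_control = [], []
    for a, b in zip(ordered[::2], ordered[1::2]):
        first, second = random.sample([a, b], 2)
        matched_treatment.append(first)
        matched_control.append(second)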
Parts of the Classic Experiment
 Treatment or Independent Variable
 Dependent Variable
 Pretest
 Posttest
 Experimental Group
 Control Group
 Random Assignment
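To make these parts concrete, here is a hypothetical sketch in Python: subjects are randomly assigned to an experimental or control group, the treatment (independent variable) is applied only to the experimental group, and the pretest and posttest scores stand in for the dependent variable. All values are simulated purely for illustration.

    import random

    random.seed(1)
    groups = ["experimental"] * 4 + ["control"] * 4
    random.shuffle(groups)                              # random assignment

    records = []
    for subject_id, group in enumerate(groups, start=1):
        pretest = random.gauss(50, 10)                  # dependent variable, measured before treatment
        effect = 5 if group == "experimental" else 0    # treatment (independent variable) for experimental group only
        posttest = pretest + effect + random.gauss(0, 3)  # dependent variable, measured after
        records.append({"subject": subject_id, "group": group,
                        "pretest": round(pretest, 1), "posttest": round(posttest, 1)})

    for row in records:
        print(row)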
Variations on Experimental Design
 Pre-experimental Design
 One-shot Case Study
 One-group Pretest-Posttest Design
 Static Group Comparison
 Quasi-Experimental and Special Designs
Types of Validity
 External Validity
 Do the results apply to the broader population?
 Internal Validity
 Is the independent variable responsible for the observed changes in the dependent variable?
Confounding Variables That Threaten Internal Validity
 Maturation
 Changes due to normal growth or predictable changes
 History
 Changes due to an event that occurs during the study, which might have affected the results
Confounding Variables That Threaten Internal Validity
 Instrumentation
 Any change in the calibration of the measuring instrument over the course of the study
 Regression to the Mean
 Tendency for participants selected because of extreme scores to be less extreme on a retest (see the simulation after this list)
 Selection
 Any factor that creates groups that are not equal at the start of the study
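Regression to the mean is easy to see in a simulation. The sketch below (purely illustrative numbers) selects the top scorers on a noisy first test and shows that their average on a second test falls back toward the overall mean.

    import random

    random.seed(0)
    N = 1000
    true_ability = [random.gauss(100, 15) for _ in range(N)]

    # Each observed score = true ability + independent measurement noise
    test1 = [t + random.gauss(0, 10) for t in true_ability]
    test2 = [t + random.gauss(0, 10) for t in true_ability]

    # Select participants because of extreme scores (top 5%) on the first test
    cutoff = sorted(test1)[int(0.95 * N)]
    selected = [i for i in range(N) if test1[i] >= cutoff]

    def mean(xs):
        return sum(xs) / len(xs)

    print("selected group, test 1:", round(mean([test1[i] for i in selected]), 1))
    print("selected group, test 2:", round(mean([test2[i] for i in selected]), 1))  # closer to 100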
Confounding Variables That Threaten Internal Validity
 Attrition
 Loss of participants during a study; are the participants who drop out different from those who continue?
 Diffusion of treatment
 Changes in participants’ behavior in one condition because of information they obtained about the procedures in other conditions
Subject Effects
 Participants are not passive
 They try to understand the study to help them know what they “should do”
 This behavior is termed “subject effects”
 Participants respond to subtle cues about what is expected (termed “demand characteristics”)
 Placebo effect: a treatment effect that is due to expectations that the treatment will work
Experimenter Effects
 Any preconceived idea of the researcher about how the experiment should turn out
 Compensatory effects
Types of Control Procedures
 General control procedures (applicable to virtually all research)
 Control over subject and experimenter effects
 Control through the selection and assignment of participants
 Control through specific experimental design
Principles of Experimental Design
 Control the effects of lurking variables on the response, most simply by comparing two or more treatments
 Randomize
 Replicate
Randomization
 The use of chance to divide experimental units into groups is called randomization.
 Comparison of effects of several treatments is valid only when all treatments are applied to similar groups of experimental units.
How to randomize?
 Flip a coin or draw numbers out of a hat
 Use a random number table
 Use a statistical software package or program
 Minitab
 www.whfreeman.com/ips
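As a software alternative to a coin flip or a random number table (Minitab is one option; the Python standard library works just as well), here is a minimal sketch allocating 24 hypothetical experimental units evenly across three treatments:

    import random

    units = ["unit%02d" % i for i in range(1, 25)]   # 24 hypothetical experimental units
    treatments = ["A", "B", "C"]                     # three treatments, 8 units each

    random.seed(2024)          # seed only so the allocation can be reproduced
    random.shuffle(units)
    allocation = {t: sorted(units[i * 8:(i + 1) * 8]) for i, t in enumerate(treatments)}
    print(allocation)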
Statistical Significance
 An observed effect so large that it would rarely occur by chance is called statistically significant.
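One way to judge whether an observed effect “would rarely occur by chance” is a permutation test (a technique not named on the slide): reshuffle the group labels many times and count how often the difference in means is at least as large as the one observed. The scores below are hypothetical.

    import random

    treatment = [23, 25, 28, 30, 26, 27]   # hypothetical posttest scores
    control = [21, 22, 24, 20, 23, 25]

    def mean(xs):
        return sum(xs) / len(xs)

    observed = mean(treatment) - mean(control)

    random.seed(7)
    pooled = treatment + control
    extreme = 0
    trials = 10_000
    for _ in range(trials):
        random.shuffle(pooled)
        diff = mean(pooled[:6]) - mean(pooled[6:])
        if diff >= observed:               # one-sided: at least as large as observed
            extreme += 1

    print("observed difference:", round(observed, 2))
    print("approximate p-value:", extreme / trials)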
A few more things…
 Double-blind: neither the subjects nor the person administering the treatment knows which treatment any subject receives (a coded-allocation sketch follows this list)
 Lack of realism is a major weakness of experiments. Is it possible to duplicate the conditions that we want?
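One way a double-blind allocation is often set up, sketched here with hypothetical labels: a third party keeps the key that maps each subject to a treatment, while subjects and administrators only ever see a neutral pack code.

    import random

    random.seed(5)
    subjects = ["S%02d" % i for i in range(1, 11)]
    treatments = ["drug"] * 5 + ["placebo"] * 5
    random.shuffle(treatments)

    # The key is held by a third party; neither subjects nor the person
    # administering the treatment ever sees what is behind a code.
    key = {s: t for s, t in zip(subjects, treatments)}
    pack_ids = random.sample(range(100, 1000), len(subjects))   # unique codes
    codes = {s: "pack-%03d" % p for s, p in zip(subjects, pack_ids)}

    # What the administrator sees: subject -> coded pack only
    blind_view = {s: codes[s] for s in subjects}
    print(blind_view)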