Research design: The backbone of academic inquiry
Peter Neff – Doshisha University
Matthew Apple – Nara National College of Technology
David Beglar – Temple University Japan
CUE Forum 2008, Olympic Memorial Youth Center
November 1, 2008, 1:15 - 2:50 p.m.
Introduction: The importance of good research design Approaching the study Developing the study design Break for Q&A Designing the right instrument Implementing your design Conclusion Further Q&A Overview
Introduction: The importance of good research design
Poor design vs. Good design Poorly-designed study Hit on an idea, dive right in No background research Throw together a survey, give to a group of  unwary participants Collect data,  then  ponder how to analyze Run to a colleague for help Fish around for most “interesting” findings Pray to get published
Poor design vs. Good design Well-designed study Hit on an idea, do background research Formulate relevant, specific, practical RQs Consider participants, context, data analysis in advance Decide/develop instrument; pilot and revise it Decide on appropriate pre/post-test instruments Plan stages and structure of data collection Prepare participants adequately … Then carry out the study
The importance of good design A well-designed study provides many benefits: Demonstrates researcher knowledge Ties the study to an underlying philosophy Provides a clear path for the researcher(s) Helps avoid mishaps of previous studies
The importance of good design Other benefits Leads to more concrete results, more definitive conclusions Improves chances of publication Raises the status of SLA as a field of inquiry
A word about mixed methods designs The great quan-qual debate Mixed methods – the “best” of both worlds Add a qualitative component to a quantitatively-oriented study:  Participant interviews Observational, audiovisual data Open-ended survey questions Plan for qualitative analyses (text analysis, response coding)
Part 1 Approaching the Study
Approaching the study Hitting upon research ideas Review of the literature Formulating research questions
Hitting upon research ideas Identify the topic in a few words Reflect on “doability” of research Can I research this? Should I research this? Am I interested in researching this? Review of the literature can help redefine and revise ideas
Identifying the topic:  Hints for starting to narrow Pose a short question using “what” or “how”  Write a short title that consists of one sentence under 12 words Ask a friend or colleague to read your topic and gauge their reactions Draft research questions to see if the topic can be adequately explored
A “researchable” topic “Can I do this in my current situation?” “Does this concern people at other institutions?” “Does this add to the current body of research related to this topic?” “Does this study contribute something from a unique perspective?”
Filtering “probably not so good” ideas:  To boldly go where no research has gone before…  (The “Star Trek” idea) My theory is clearly better than X  (The “Stephen Krashen is so wrong” idea) My classroom is totally unique  (The “I don’t need theory” idea) This is a really cool technology / methodology / textbook  (The “I am primarily a teacher” idea)
Filtering ideas: A few hints Review research designs and statistical techniques Review teaching methods and overall SLA research results Evaluate access to potential study participants Plan time for material creation, study design, and implementation
Review of the literature Relate the study to continuing “dialogue” in current research Finding a “gap” in the literature  Provide a framework for the importance of the study
Review of the Literature: Finding a “gap” in knowledge “We do not know enough about X…” “This way of looking at X has never been done…” “This way of learning about X has not been duplicated in my context…” “Previous research has inadequately explored X…”
Finding literature: Some hints Google Scholar using key words or researcher names Scour recent literature review articles Check for “cited” numbers online Get access to university databases Refer to recently published articles After the year 2000 During the previous 2 to 3 years Examine “outside the field” articles
Finding literature:  Separating the wheat… “Top tier” journal articles Most-often-cited articles Recent articles Research articles (not reviews) Books / Edited book-articles Major international conference papers Dissertations / dissertation abstracts
… from the chaff “In-house” journal articles Articles from “proceedings” books Online journal articles with only .html versions Unedited books from small publishers Newspaper and magazine articles Web pages Anecdotal evidence Your own previous papers for an MA course
Research questions:  A few useful guidelines Naturally flow from the literature review Strongly connected to the topic At least two or three (not one)… … but not five or six or more As specific as possible Directly concern variables in the study Do not contain yes/no question words
RQs: What not to ask “Is X true/false?” “Will X happen if…?” “Does X cause Y?” “What do participants think of X?” “Why does X happen?”
RQs: What to ask “What differences exist between…?” “Compared to X, how does Y…?” “To what degree do X and Y differ…?” “When X is controlled for Y…, how does Z…?” “What are underlying patterns among…?” “To what degree does X predict Y?”
Part 2 Developing the Study Design
Research design Cross-sectional design: A design in which data are collected from a sample at only one point in time. Longitudinal design: A design in which data are collected at more than one point in time.
Randomized Control-Group Pretest-Posttest Design
Experimental Group 1:  T1  Xa (Method a)  T2
Experimental Group 2:  T1  Xb (Method b)  T2
Control Group:  T1  T2
Randomized Control-Group Pretest-Posttest Design Reasonably strong conclusions can be reached about the effects of the treatments. Problem 1: Within session variation (e.g., different teachers or room conditions) may intervene. The solution? Randomly assigning participants, times, and places to the experimental and control conditions.  Problem 2: The pretest may interact with the treatment. This potential problem is dealt with in the next design.
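As a rough illustration of the logistics behind this design, the sketch below (not part of the presentation) shows one way to randomly assign participants to the two treatment conditions and a control group, then run a one-way ANOVA on gain scores. The participant IDs and all score values are placeholders.

```python
# A rough sketch (not from the presentation) of random assignment and a
# gain-score analysis for this design; IDs and scores are placeholders.
import random
from scipy import stats

random.seed(1)
participants = [f"S{i:02d}" for i in range(1, 61)]  # 60 hypothetical learners
random.shuffle(participants)

groups = {                       # random assignment to the three conditions
    "method_a": participants[:20],
    "method_b": participants[20:40],
    "control":  participants[40:],
}

# Placeholder pretest (T1) and posttest (T2) scores keyed by participant ID;
# in a real study these would come from the scored instruments.
pretest = {p: random.gauss(50, 10) for p in participants}
posttest = {p: pretest[p] + random.gauss(5, 5) for p in participants}

gains = {name: [posttest[p] - pretest[p] for p in ids]
         for name, ids in groups.items()}

# One-way ANOVA on gain scores across Method a, Method b, and the control group
f_stat, p_value = stats.f_oneway(gains["method_a"], gains["method_b"], gains["control"])
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```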
Randomized Solomon Six-Group Design
Pretested (Random assignment):  T1  Xa (Method a)  T2
Pretested (Random assignment):  T1  Xb (Method b)  T2
Pretested (Random assignment):  T1  T2
Unpretested (Random assignment):  Xa (Method a)  T2
Unpretested (Random assignment):  Xb (Method b)  T2
Unpretested (Random assignment):  T2
Randomized Solomon Six-Group Design This design amounts to doing the experiment twice: once with and once without pretesting. It is possible to know what effects, if any, are associated with pretesting. If the results of the “two experiments” are consistent, greater confidence can be placed in the findings.
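One hedged way to check the pretesting effect in practice is a factorial ANOVA on posttest scores, with condition and pretested/unpretested as factors; a sketch follows. The data layout and values are hypothetical, offered only as an illustration of the analysis this design makes possible.

```python
# A hypothetical long-format data set: one row per participant, with the
# posttest score, the condition, and whether the group was pretested.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "posttest":  [62, 58, 55, 61, 57, 54, 63, 59, 56, 60, 58, 53],
    "condition": ["a", "b", "control"] * 4,
    "pretested": ["yes"] * 6 + ["no"] * 6,
})

# 3 x 2 factorial ANOVA; a notable pretested term or interaction would
# indicate that pretesting itself influenced the posttest results.
model = smf.ols("posttest ~ C(condition) * C(pretested)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```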
Counterbalanced Design This design is useful when randomization is not possible and intact groups must be used.
Replication   Xa   Xb   Xc   Xd
1             A    B    C    D
2             B    D    A    C
3             C    A    D    B
4             D    C    B    A
Counterbalanced design The counterbalanced design rotates out the participants’ differences (e.g., one group has more aptitude or motivation than the other groups) by exposing each group to all variations of the treatment. Order-of-presentation effects are controlled.  Primary weakness: The possibility of carryover effects from one treatment to the next exists. Allowing time between treatments can alleviate this problem.
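To see the rotation concretely, the sketch below (an illustration, not the presenters' procedure) generates a simple cyclic Latin-square schedule so that each group meets every treatment and each treatment appears once in every position; the group labels are hypothetical.

```python
# Cyclic rotation of four treatments across four intact groups: each treatment
# appears once per group and once in each position of the sequence.
treatments = ["Xa", "Xb", "Xc", "Xd"]
groups = ["Group 1", "Group 2", "Group 3", "Group 4"]   # hypothetical labels

schedule = {g: treatments[i:] + treatments[:i] for i, g in enumerate(groups)}

for group, order in schedule.items():
    print(f"{group}: {' -> '.join(order)}")
# Group 1: Xa -> Xb -> Xc -> Xd
# Group 2: Xb -> Xc -> Xd -> Xa
# ...
```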
Control-Group Time-Series Design
Experimental Group 1:  T1 T2 T3 T4  Xa (Method a)  T5 T6 T7 T8
Experimental Group 2:  T1 T2 T3 T4  Xb (Method b)  T5 T6 T7 T8
Control Group:  T1 T2 T3 T4  T5 T6 T7 T8
Control-Group Time-Series Design This design allows the researcher to determine growth over time, and the effect of an intervention. The presence of a control group increases the trustworthiness of the results because the possibility of a contemporary event causing any gains can be determined.
Control-Group Time-Series Design This design can be extended by exposing the participants to the intervention on multiple occasions. This approach is more sensitive to partial gains in knowledge and tests the strength of the intervention more than once, thus giving the researcher a more accurate understanding of the effectiveness of the intervention.
Experimental Group 1:  T1 T2  Xa  T3 T4  Xa  T5 T6
Experimental Group 2:  T1 T2  Xb  T3 T4  Xb  T5 T6
Control Group:  T1 T2  T3 T4  T5 T6
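One possible way to inspect such a series, assuming eight testing occasions per group and hypothetical scores, is sketched below: compare each group's mean over T1-T4 with its mean over T5-T8.

```python
# Hypothetical scores at eight testing occasions (T1-T8) for each group.
import numpy as np

series = {
    "experimental_a": [50, 51, 52, 52, 60, 61, 62, 63],
    "experimental_b": [49, 50, 51, 51, 56, 57, 58, 58],
    "control":        [50, 51, 51, 52, 52, 53, 53, 54],
}

for group, scores in series.items():
    pre, post = np.mean(scores[:4]), np.mean(scores[4:])
    print(f"{group}: pre-intervention mean {pre:.1f}, "
          f"post-intervention mean {post:.1f}, gain {post - pre:.1f}")
# A gain in the treatment groups that does not appear in the control group is
# the pattern this design is intended to expose.
```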
Q&A Break
Part 3 Designing the Right Instrument
Instrument Design  Commonly used instruments in SLA research Scored tests Rater scores Surveys Interviews Consider your eventual data analysis when developing instruments
Instruments - Scored tests Pluses Quantitative items (M/C, Cloze/C-tests) Simple to score large # of participants Easier to analyze Qualitative items (short answer, timed essays) Good complement to quantitative scores Can provide more in-depth assessment of participants’ abilities Minuses Quantitative items Limited to one type of data Qualitative items Take more time/effort to score Rater bias
Instruments – Performance ratings An assessment of participants’ performance in an assigned task Tasks may include presentations, interviews, written essays Performances can be scored using a Likert-scale, a rubric, or holistically Usually scored by at least two “expert” raters; sometimes also by peers
Performance ratings Rating criteria should be concretely established with little ambiguity Avoid including too many (or too few) criteria for one performance task All raters should undergo a “normative” training session prior to assessment Use models to train raters Avoid single-score holistic ratings
Instruments - Surveys Often used for: Collecting learner history data (L2 study experience, other background info)  Assessing participants’ attitudes towards a predetermined construct (language learning motivation, anxiety using the L2) Determining reactions to an experimental treatment (teaching methods, innovative learning tasks)
Survey making For non-advanced learners –  surveys should be in their L1 Build in redundancy - Include multiple questions for each concept area Questions should be simply worded – avoid negative or confusing wording Depending on the purpose of the survey, 20-40 items/session is a good range to shoot for
Survey making Any survey used in a serious study should be piloted in advance It is acceptable to make adjustments to an existing instrument Likert-scale items should usually have between 4 and 6 choices A few qualitative questions can provide a nice complement to quantitative instruments
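For the piloting step, a common first check is internal consistency. The sketch below computes Cronbach's alpha from a small hypothetical matrix of Likert responses; it is one illustration of a pilot analysis, not a prescribed procedure.

```python
# Cronbach's alpha for a hypothetical pilot questionnaire:
# rows are respondents, columns are Likert items on the same 1-5 scale.
import numpy as np

responses = np.array([
    [4, 5, 4, 4, 3],
    [3, 3, 4, 3, 3],
    [5, 5, 5, 4, 5],
    [2, 3, 2, 3, 2],
    [4, 4, 3, 4, 4],
])

k = responses.shape[1]                               # number of items
item_variances = responses.var(axis=0, ddof=1)       # variance of each item
total_variance = responses.sum(axis=1).var(ddof=1)   # variance of total scores
alpha = (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```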
Instruments - Interviews Interviews can provide an excellent qualitative component to a larger study It is not necessary to interview all participants; a subsample as small as 10-20% can be acceptable Use your best judgment on participants’ language ability For intermediate-and-above learners, L2 interviews are often fine
Conducting interviews Inform students they are being interviewed, obtain consent Record unobtrusively “Warm up” the participants before getting into the heart of the interview Collect more data than you need
Validating Instruments
Instrument Validity The construct = The heart of the matter What construct do you wish to measure?  How do you define the construct? What are its component parts? Do they form a unified whole?
Operationalizing the construct: The items Conceptualize the construct as a continuum: easy—difficult items and less able—more able persons.  How have other researchers measured the construct? Write original or adapted items. Cover the estimated range of your participants. Write 50% more items than you intend to use. This will allow you to “cherry pick” the best items as well as items at various levels of difficulty.
Operationalizing the Construct: The Items [Person-item map: persons, arrayed from less able to more able, plotted alongside items 1-15, arrayed from less difficult to more difficult; the item difficulties span the ability range of the persons.]
Operationalizing the Construct: The Items After piloting the items, statistically analyze the results. Examine dimensionality, item difficulty, and item content. Select the best items to make an efficient, highly reliable instrument.
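As a minimal sketch of what "statistically analyze the results" can look like at the classical-test-theory level, assuming dichotomously scored pilot items, the code below reports item difficulty and corrected item-total correlations. (The person-item map above is in the style of a Rasch/Wright map; these simpler indices are offered only as a starting point, not as the presenters' intended analysis.)

```python
# Classical item analysis on a hypothetical pilot test:
# rows = persons, columns = items, 1 = correct, 0 = incorrect.
import numpy as np

responses = np.array([
    [1, 1, 0, 1, 0, 0],
    [1, 1, 1, 1, 0, 1],
    [1, 0, 0, 0, 0, 0],
    [1, 1, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 0],
    [1, 1, 1, 1, 1, 1],
])

totals = responses.sum(axis=1)
for i in range(responses.shape[1]):
    difficulty = responses[:, i].mean()        # proportion answering correctly
    rest_score = totals - responses[:, i]      # total score excluding item i
    discrimination = np.corrcoef(responses[:, i], rest_score)[0, 1]
    print(f"item {i + 1}: difficulty = {difficulty:.2f}, "
          f"corrected item-total r = {discrimination:.2f}")
```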
Part 4 Implementing Your Design
Implementing the design Including other researchers Practical issues Handling ethical concerns
Including other researchers in the study The nature of the researchers involved Main researcher plus “helpers” One researcher plus “other participants” The nature of the instructors involved Teaching methods Students taught Course goals University program goals
Working with other researchers Work with people you know and trust Establish a schedule early  Define clear roles for each researcher Decide definite research goals prior to data collection  Keep in regular contact The “band practice time” principle
Example research roles Head researcher / contact person Data entry specialist Statistician Interviewer Literature analyst Editor / proofreader
Heading off potential problems Explain study commitments prior to starting the study Agree on “ownership” prior to data collection and data entry Who will keep the data? Whose name comes first, second, etc.? Keep everyone aware of deadlines Include everyone in decision-making processes and data analysis
Things to avoid in group research People you don’t know Research groups larger than four or five members Using someone for language skills Involving others just to get more participants Forgetting to thank others for their assistance
Improving relations with research helpers Write clear instructions Thank profusely for their time and effort Offer to send copies of final research papers and/or results Offer to assist in future research Keep in touch after research ends
Practical Issues Timing of implementation Learning and research context Participant consent Financial considerations
Timing of the implementation Beginning, middle, or end of semester Day of the week Time of day Exams and exam preparation periods “Culture Festivals” or other club-related events “Open classes” or “parents’ day”
Learning and research context Differing course goals (e.g., listening class vs. reading class) Different major field of study Gender, age, year in school Number of class meetings Perception of the value of research by institution heads
Participant consent Always allow for “non-participation” choice from potential participants Write clear instructions for participants asking for their cooperation Ask co-researchers or helpers to briefly inform participants about their choice
Financial considerations Copies for questionnaires, exams, etc. Computer analysis software Mailing costs Interview travel costs Recording equipment Reference books Journal article costs
Heading off lack of cooperation problems Review requirements of the study  How many items in the questionnaire?  How many treatments? Recognize that students are busy, tired, etc. Plan to collect 30-50% more data than you need Try to get more data at a later date
Ethical Considerations
Ethical considerations Students should not be exploited just because they are there In theory, they should derive benefit (directly or indirectly) from the research Explain the basic purpose of your research before collecting data 1-2 sentences should be fine It is also good to briefly explain this at the beginning of any survey instrument
Ethics - Consent Provide students with the chance to opt out of participation Verbal consent is usually acceptable for surveys, ratings, and test scores A written consent form may be necessary for more involved forms of participation (interviews, essay passages) When in doubt, check recent articles in well-known journals for guidance
Ethics – Other considerations The role of this study within your institution Potential gender, proficiency or other issues that may affect your data or conclusions  Have a plan for anonymizing the data (and consider making this conspicuous on your instruments)
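A minimal sketch of one way to build that anonymization plan into the workflow is shown below: replacing a student ID column with salted-hash pseudonyms before the data are shared. The file name "raw_responses.csv" and the "student_id" column are assumptions for illustration only.

```python
# Replace a student ID column with irreversible pseudonyms before sharing data.
# The file name "raw_responses.csv" and the "student_id" column are assumptions.
import csv
import hashlib

SALT = "store-this-salt-separately-from-the-data"

def pseudonym(student_id: str) -> str:
    digest = hashlib.sha256((SALT + student_id).encode("utf-8")).hexdigest()
    return "P" + digest[:8]          # short, stable, non-identifying code

with open("raw_responses.csv", newline="", encoding="utf-8") as src, \
     open("anonymized_responses.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        row["student_id"] = pseudonym(row["student_id"])
        writer.writerow(row)
```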
Conclusion
In conclusion… There are many factors to consider when embarking on a serious study Some points to take away… Most studies should fill a place (a “gap”) within the current academic dialogue  Research questions should reflect continuity with the literature and should be specific Carefully consider the design setup which will work best with the participant groups and the research and analytical goals
In conclusion Further points… Different instruments work better in different circumstances. Choose those which best reflect your aims. Plan your analyses at the same time you are developing your instruments Develop and pilot instruments which can cover responses from a range of participants
In conclusion Even more points… Work with others who will be serious and committed, and then be an organized and conscientious leader Consider how practical, pedagogical, and institutional concerns may affect your study Do not forget current ethical guidelines for carrying out participant research
Good luck with your research! Thank you for listening
Q&A
For a copy of this presentation: http://jaltcue-sig.org
