theories in empirical software engineering

Post on 15-Apr-2017


Theories in Empirical Software Engineering

Roel Wieringa

Sidekicks: Daniel Méndez, Lutz Prechelt

21 October 2015, IASESE

Who are we? Roel Wieringa, University of Twente, Netherlands

http://wwwhome.ewi.utwente.nl/~roelw/


Lutz Prechelt
FU Berlin, http://www.mi.fu-berlin.de/w/Main/LutzPrechelt

Daniel Méndez
TU München, http://www4.in.tum.de/~mendezfe/

Who are you?

Quick round
• Who are you?
• What is your experience in conducting empirical studies?
• What are your expectations?

What do you think?

Why do we need scientific theories in software engineering?

Looking at research from the sky

4. Methodology (the study of research methods)
   a. Notion of a conceptual framework; statements about them
   b. Notion of a generalization; statements about them
3. Theory (statements about many research results)
   a. Conceptual framework
   b. Generalization
2. Research questions (what, how, when, where, …, why) aimed at generalizable knowledge, research method, and research result
1. Practice domain: SW, methods, tools, processes (as-is / to-be)

General knowledge is the gold we are after. Growing knowledge is hard work that starts at the grassroots.

• Everything on the slides in this talk, except the examples, is at level 4.
• The examples on these slides contain explicit level indications.
• The separate example slides report about research that contains levels 2 and 3.
• The reported research studies some aspect of level 1.

Agenda

09:00–10:30  Opening and Introduction
10:30–11:00  Coffee break
11:00–12:30  Inferring Theories from Data
12:30–13:30  Lunch
13:30–15:00  Designing Research Based on Theories
15:00–15:30  Coffee break
15:30–16:30  Hands-on Working Session and Q&A
16:30–17:00  Wrap-up (all)

What is a Scientific Theory?

Scientific theories
• A theory is a belief that there is a pattern in phenomena.
• A scientific theory is a theory that
  – has survived tests against experience
    • observation, measurement
    • possibly experiment, simulation, trials
  – has survived criticism by critical peers
    • anonymous peer review
    • publication
    • replication

Examples (level 3)
• Theory of cognitive dissonance
• Theory of electromagnetism
• The balance theorem in social networks
• Theories X, Y, Z, and W of (project) management
• Technology Acceptance Model

• Hannay et al. "A Systematic Review of Theory Use in Software Engineering Experiments". IEEE Transactions on Software Engineering 33(2), February 2007
• Lim et al. "Theories Used in Information Systems Research: Identifying Theory Networks in Leading IS Journals". ICIS 2009, paper 91

• Non-examples
  – Speculations based on imagination rather than fact: conspiracy theories about who killed John Kennedy
  – Opinions that cannot be refuted: "The Dutch lost the World Championship because they play like prima donnas"

Design theories

• A design theory is a scientific theory about an artifact in a context.

• Vriezekolk: What is a theory
• Méndez: What is a theory

The Structure of Theories

The structure of scientific theories

1. Conceptual framework
   – Constructs used to express beliefs about patterns in phenomena
   – E.g. the concepts of beamforming, of multi-agent planning, of data location compliance (level 3)
2. Generalizations
   – Stated in terms of these concepts; they express beliefs about patterns in phenomena
   – E.g. the relation between angle of incidence and phase difference; a statement about delay reduction at airports (level 3)
• Generalizations have a scope, a.k.a. a target of generalization.

The structure of design theories
1. Conceptual framework
2. Generalizations
   – Artifact specification × context assumptions → effects
   – The effects satisfy a requirement to some extent

Two kinds of conceptual structures

1. Architectural structures: a class of systems, components with capabilities, interactions
   – E.g. entities, (de)composition, taxonomies, cardinality, events, processes, procedures, constraints, … (level 4)
   – Useful for case-based research (observational case studies, case experiments, simulations, technical action research)
   – Typically qualitative
2. Statistical structures: a population, variables with probability distributions, relations among the variables
   – Useful for sample-based research (surveys, statistical difference-making experiments)
   – Typically quantitative

• Prechelt: What is a theory; the structure of theories
• Vriezekolk: The structure of theories
• Méndez: The structure of theories

The Use of Theories

Uses of a conceptual framework
• Frame a problem or artifact: choosing which concepts to use
  – Using the theory of infectious diseases to understand a patient's symptoms
  – Using the concepts of force & energy to understand the behavior of a machine
  – Using the concept of a coordination gatekeeper to understand a distributed SE project
  – (all three examples at level 1)
• Describe a problem or specify an artifact: using the concepts
• Generalize about the problem or artifact
• Analyze a problem or artifact (i.e. analyze the framework)

Functions of generalizations

• Explanation: explain phenomena by identifying causes, mechanisms, or reasons
• Prediction: state what will happen in the future
• Design: use generalizations to justify a design choice

• Prechelt: the use of theories
• Vriezekolk: the use of theories
• Méndez: the use of theories

Usability of theories

• When is a design theory
  Context assumptions × Artifact design → Effects
  usable by a practitioner?
  1. He/she is capable of recognizing the context assumptions,
  2. and of acquiring/building the artifact under the constraints of practice;
  3. the effects will indeed occur, and
  4. he/she can observe this, and
  5. they will contribute to stakeholder goals / satisfy requirements.
• The practitioner has to assess the risk that each of these fails.

• Prechelt: the usability of theories
• Vriezekolk: the usability of theories
• Méndez: the usability of theories


Scientific Inference

Case-based inference
• Descriptive inference: describing observations
• Abductive inference: providing an explanation
• Analogic inference: generalizing to similar cases

[Diagram: Data → (description) → Observations → (abduction) → Explanations → (analogy) → Generalizations. The proposition(s) to generalize and the scope of generalization are attached to the generalization step.]

• An architectural explanation must be the basis of the analogic generalization;
• otherwise, we engage in wishful/magical thinking.
  – You have observed that some small companies did not put a customer representative on-site in an agile project;
  – you explain this as a result of tight resources (level 3);
  – you generalize by analogy that this will happen in (almost) all small companies (level 3).

[Diagram as before: Data → (description) → Observations → (abduction) → Explanations → (analogy) → Generalizations; both the abduction and the analogy are labeled "architectural".]

Sample-based inference
• Descriptive inference: describe sample statistics
• Statistical inference: generalize to population parameters
• Abductive inference: provide an explanation
• Analogic inference: expand the scope of a theory based on similarity

[Diagram: Data → (description) → Observations → (statistical inference) → Generalizations → (abduction) → Explanations → (analogy) → wider generalizations.]

• Causal explanations can be supported by sample-based designs (treatment group / control group).
• Generalization from a population to similar populations must be based on an architectural explanation.
  – In an experiment with a sample of students you observe a difference between the treatment group and the control group;
  – by randomness you generalize to the population of students;
  – your explanation: this difference is caused by the treatment (level 3);
  – in turn explained by cognitive processes of the students (level 3);
  – generalized by analogy to novice software engineers (level 3).
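The statistical step in this example, generalizing a treatment/control difference from the sample to the population, can be sketched as follows. This is a minimal sketch with invented scores, not data from any study discussed here; it computes Welch's t statistic, one common basis for a statistical difference-making inference.

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)   # unbiased sample variances
    se2 = va / na + vb / nb                           # squared standard error of the difference
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Invented task scores of a treatment group and a control group of students
treatment = [72, 85, 78, 90, 81, 77, 88, 84]
control = [65, 70, 62, 74, 68, 71, 66, 69]
t, df = welch_t(treatment, control)
print(f"t = {t:.2f}, df = {df:.1f}")
```

A large |t| licenses only the statistical generalization to the population of students; the further steps (causal explanation, analogy to novice software engineers) are not statistical and need an architectural argument.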


[Diagram as before: Data → (description) → Observations → (statistical inference) → Generalizations → (abduction) → Explanations → (analogy) → wider generalizations; the abduction is labeled "causal & architectural" and the analogy "architectural".]

• Vriezekolk: inferring theories from data
• Méndez: inferring theories from data
• Prechelt: applying/inferring theories to/from data


Research Design

The research setup

• In experiments we are interested in the effect of the treatment on the OoS (object of study).
  – Requires the capability to apply treatment and control.
• In observational studies we are interested in the structure and dynamics of the OoS itself.
  – Only weak support for causality.

[Diagram: a sample of objects of study represents one or more population elements; treatment instruments and measurement instruments act on the sample.]

• Case-based designs
  – provide architectural explanations
  – generalize by architectural analogy
  – nondeterminism across cases is not quantified
• Sample-based designs
  – collect sample statistics
  – infer properties of the distribution over the population
  – may be purely descriptive!
  – possibly a causal explanation
  – to generalize further, an architectural explanation is needed too
  – nondeterminism within the population is quantified, but not across analogous populations

Field versus lab

• If a phenomenon cannot be (re)produced in the lab, it can only be investigated in the field.
• Which of the following designs can be done in a lab?
  – No treatment (observational study):
    • Case-based inference: observational case study
    • Sample-based inference: survey
  – Treatment (experimental study):
    • Case-based inference: single-case mechanism experiment (e.g. simulation, test of an individual OoS), technical action research (e.g. test with a client, pilot project)
    • Sample-based inference: statistical difference-making experiment (treatment group / control group designs)

• Vriezekolk: the research setup
• Méndez: the research setup
• Prechelt: the research setup


Hands-on Working Session

1. What is your research question?
2. Describe a research setup to answer it.
3. What inferences do you plan to base on this setup?

Groups of 3
• 15:30 Each person first drafts a flipchart with his/her answers for their own research.
• 15:45 Each group member comments on the two flipcharts of the others in his/her group, in particular on:
  – Are the answers clear?
  – Are the answers defensible?
• 16:30 Each person finalizes (for now) his/her flipchart.
• 16:31 Paste it to the wall. See what you can learn from other designs.
• 16:45 Plenary wrap-up.

Q&A

You probably can't ask anyway, so ask us!

"Naming the pain in requirements engineering: A design for a global family of surveys and first results from Germany"
Méndez & Wagner, Information & Software Technology, 2015

"Towards Building Knowledge on Causes of Critical Requirements Engineering Problems"
Kalinowski et al., 27th International Conference on Software Engineering and Knowledge Engineering (SEKE 2015), pp. 1–6

• International on-line survey of requirements engineering professionals' opinions about causes and effects of RE problems
• Research questions
  – RQ1: What are the expectations on a good RE?
  – RQ2: How is RE defined, applied, and controlled?
  – RQ3: How is RE continuously improved?
  – RQ4: Which contemporary problems exist in RE, and what implications do they have?
  – RQ5: Are there observable patterns of expectations, status quo, and problems in RE?
• Observational research

What is a theory

• The researchers formulated 34 hypotheses, about
  – RE improvement
    • is beneficial
    • is challenging
  – RE standardization
    • hampers creativity
    • improves quality
    • …
  – company-specific standards
    • …

• This theory (consisting of 34 proposed generalizations) is tested against
  – opinions of professionals, based on their experience
  – critical peer review in the publication process
• The opinions of professionals are themselves theories based on experience,
  – but not subjected to systematic tests,
  – nor to critical peer reviews.

The structure of theories

1. Conceptual framework
   – Requirements, needs, goals, specification, RE skill, etc.
2. Generalizations
   – All of the claims about social mechanisms on the previous slides

Brazilian theory of social mechanisms that lead to incomplete requirements
Artifact: requirements engineering project. Context: software development.
[Diagram: actors customer, project team, and requirements engineer; artifacts product and requirements specification (poorly defined). Mechanism labels: no solution approach, agile approach, no experience, RE considered unimportant, no RE qualification, no time, team too small, different interests, no domain knowledge, no template, poor techniques, no completeness check, no RE skills, unclear needs, unrealistic expectations, no engagement, unclear requirements, frequent changes.]

German theory of social mechanisms that lead to incomplete requirements
[Diagram: as the Brazilian one, with an added business department and a conflict, and additional mechanism labels: no contact person, solution orientation, no company standard, domain complexity.]

• The conceptual structure of the social mechanisms in the previous two slides is architectural:
  – components
  – interactions
• The conceptual structure of the causal theories on the next slides is statistical:
  – variables
  – distribution over a population

• Brazilian respondents' theory about causes and effects of incomplete requirements
• German respondents' theory about causes and effects of incomplete requirements

The use of theories

• "Requirements are incomplete because customers have unclear needs and no RE skills"
  – Frame a phenomenon: requirements can be completely specified
  – Describe it: describe all mechanisms that are responsible for incomplete requirements
  – Specify a treatment: train the customer in RE skills (??)
  – Analyze it: —
  – Generalize about it: claim that this is responsible for incomplete requirements more often / always
  – Predict an effect: predict that it will happen in the next project
  – Explain an effect: explain that incompleteness is due to unclear needs and the absence of RE skills in the customer

Usability of theories

• The theory of 34 hypotheses is not intended to be used by professionals to improve their practice. Consider the theory "improving RE skills reduces requirements incompleteness":
  1. The professional is capable of recognizing the context assumptions
     – Yes: recognizable when there is requirements engineering
  2. Capable of acquiring/building the artifact under the constraints of practice
     – That depends on the available budget (time, money) for RE training
  3. The effects will indeed occur
     – That depends on the training, and on other factors causing RE incompleteness
  4. He/she can observe this
     – Hard to say whether requirements are more complete
  5. They will contribute to stakeholder goals / satisfy requirements
     – Hard to say whether RE completeness will contribute to stakeholder goals

Inferring theories from data

• Description
  – Interpretation of the answers of the respondents
  – Descriptive statistics
• Statistical inference
  – No statistical inference
• Abductive inference
  – The assumed explanation of the respondents' answers is that they base them on experience
• Analogic inference
  – Other professionals will answer similarly, but possibly differently across countries/cultures
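The descriptive step can be illustrated with a minimal sketch; the question wording and the answers below are invented for illustration, not taken from the survey. It simply tallies the frequency distribution of responses to one closed question.

```python
from collections import Counter

# Invented answers of surveyed RE professionals to one closed question,
# e.g. "RE standardization hampers creativity" (agree/neutral/disagree)
answers = ["agree", "agree", "neutral", "disagree", "agree",
           "neutral", "agree", "disagree", "agree", "neutral"]

counts = Counter(answers)
total = len(answers)
for option in ("agree", "neutral", "disagree"):
    print(f"{option}: {counts[option]} ({counts[option] / total:.0%})")
```

Consistent with the slide, this is pure description of the sample; no inference to a population is made.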


The research setup
[Diagram: a sample of objects of study represents one or more population elements; treatment instruments and measurement instruments act on the sample.]
• Population: all RE professionals
• Sample of RE professionals; no treatment
• Measurement instruments: on-line survey tool, questionnaire

"Why Software Repositories Are Not Used For Defect-Insertion Circumstance Analysis More Often: A Case Study"
Lutz Prechelt, Alexander Pepper
Information and Software Technology

• Pepper tried to mine the software repositories of the content management system Fiona, produced by Infopark, in order to identify correlates of defect insertion, hoping that they could be used to improve the software process.
  – Engineering cycle of the client
• Pepper and Prechelt observed this.
  – Case study
• Validation of a community-wide development of MSR (mining software repositories) techniques for DICA (defect-insertion circumstance analysis).
  – Engineering cycle of the research community
• Research question that emerged from the case: why are MSR techniques for DICA not used more often?

What is a theory

• Theory 1, held by the community:
  – MSR can provide information about improvement opportunities of the software process (p. 3, right column)
• Artifact: MSR
• Context: any software development process

Descriptive generalization

• Theory 2, proposed by Prechelt and Pepper based on the case study:
  – R1: …
  – …
  – R5: There is no affordable method to assess the reliability of the results of MSR in DICA
  – R6: The reliability of MSR results in DICA is low
  – R5 and R6 are the major reasons why MSR is not used for DICA
• Artifact: MSR
• Context: organizations that develop web applications over a long period of time, confuse defects with issues, and have no dedicated staff to maintain bug trackers (sect. 8.1)

Descriptive generalizations

Rational explanation of a phenomenon (= architectural explanation, where some components are actors that have goals and may have reasons for actions).

The structure of theories

• Conceptual framework
  – Definitions of change, defect, rework, issue, bug, bug fix, defect insertion, defect correction
  – Difficulty, cost, utility, reliability of a technique
  – NB 1: concepts shared with the OoS
  – NB 2: an architectural framework
• Generalizations
  – Previous slide
  – NB: they are about the effects of a class of artifacts in a class of contexts

The use of theories

• "MSR can provide information about improvement opportunities of the software process"
  – Frame a phenomenon: software improvement is a problem of lack of data about the software process
  – Describe it: describe software repositories
  – Specify a treatment: specify MSR techniques, tools, and steps
  – Analyze it: analyze the meaning of the output of MSR
  – Generalize about it: claim that the outcome will be obtained in all software processes
  – Predict an effect: predict that it will happen in the next project
  – Explain an effect: explain that an improvement has occurred because of removal of a weak spot in the process

Usability of theories

1. The professional is capable of recognizing the context assumptions
   – Yes
2. Capable of acquiring/building the artifact under the constraints of practice
   – Prechelt & Pepper: considerable effort in their case
3. The effects will indeed occur
   – No evidence that reliable information about processes will be produced
4. He/she can observe this
   – No: considerable uncertainty whether the effects have occurred
5. They will contribute to stakeholder goals / satisfy requirements
   – No evidence that process improvements will occur

Applying existing theories to data and inferring new or updated theories from data

• Description
  – Case descriptions of every step
  – Interpretation of every step in terms of R1–R6
• Statistical inference
  – Not possible from a case
  – (but there is one inside this case, to investigate the relation between defect descriptions and issue descriptions)
• Abductive inference
  – Explanation of non-use in terms of R1–R6
  – Rational explanation in terms of reasons of actors
• Analogic inference
  – Descriptions and explanation generalized by analogy
  – Discussion of external validity

How did it happen?
• Existing theory 1 assumed, and falsified.
• New theory 2 emerged from the data and from opinions of actors in the OoS. Or were the propositions R1–R6 specified before the case study was started?

The research setup
[Diagram: a sample of objects of study represents one or more population elements; treatment instruments and measurement instruments act on the sample.]
• Population: other software development organizations and their repositories
• One complex object of study: Infopark and its software repositories
• Treatment: the 4-step procedure listed in sect. 2.3, performed by Pepper at Infopark
• Treatment instruments: MSR tools
• Measurement instruments: MSR tools providing data; Pepper's work notes; Pepper's memory (sect. 8.3)
• Sources of evidence (p. 5): context information, raw data of the version archive and bug tracker, analysis steps taken and not taken, issues and arguments of those steps, data provided by MSR tools, Infopark's interpretation of the outcomes of the steps

"Experimental Validation of a Risk Assessment Method"
Vriezekolk, Etalle & Wieringa
21st Working Conference on Requirements Engineering: Foundation for Software Quality (REFSQ) 2015

• Lab experiment to test the reliability of a method, RASTER, for assessing risks to telecom availability
  – Research question: How reliable is RASTER?
  – Research setup: six groups of three students each had to estimate the likelihood and impact of a list of non-availability risks for an email service, using the RASTER method

What is a theory

• Design theory
  – RASTER × professionals providing services during incidents and disasters → availability risk assessments
• Theory of the experiment
  – Sources of variability in assessment are
    • ambiguity or incompleteness of the method description
    • misunderstanding of the method
    • lack of experience
    • lack of motivation
    • case complexity
    • disturbance from the environment

[Slide annotations mark the artefact and context in both theories, and ask: empirical test, peer review?]

The structure of theories

Design theory
1. Conceptual framework
   – RASTER concepts (infrastructure component, vulnerability, risk, impact, likelihood, …)
2. The design generalization

Theory of the experiment
1. Conceptual framework
   – Risk assessor, team, target of assessment, assessment environment
2. Generalizations
   – Claims about mechanisms that produce variability

The use of theories

• "RASTER × professionals → risk assessments"
  – Frame a phenomenon: risk assessments are made by professionals
  – Describe it: describe the telco infrastructure architecture and its vulnerabilities
  – Specify a treatment: use RASTER to assess risks
  – Analyze it: trace risks to architecture components
  – Generalize about it: claim that other professionals would find the same risks in similar telco architectures
  – Predict an effect: predict that this will happen in the next project
  – Explain an effect: explain assessments in terms of the RASTER method and the ToA (target of assessment)

Usability of theories

1. The professional is capable of recognizing the context assumptions
   – Yes
2. Capable of acquiring/building the artifact under the constraints of practice
   – RASTER requires relatively little training; RA is expensive, but not due to RASTER
3. The effects will indeed occur
   – Has been shown in experiments and pilots
4. He/she can observe this
   – Plain for all to see
5. They will contribute to stakeholder goals / satisfy requirements
   – The goal is to obtain accurate and reliable assessments

Inferring theories from data

• Description
  – Outcome of the RAs on paper
  – Krippendorff's alpha to measure interrater agreement
  – Outcome of exit questionnaires to assess sources of variability
• Statistical inference
  – Sample non-random, and too small.
• Abductive inference
  – Observed variability explained by
    1. lack of expert knowledge,
    2. differences in assumptions,
    3. difficulty of choosing between adjacent ordinal values for likelihood
• Analogic inference
  – 1 and 2 are absent/reduced in the field, so there is less variability there
  – 3 motivates improvement of the method to reduce this phenomenon
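The agreement measure named above can be sketched as follows: a minimal implementation of Krippendorff's alpha for nominal data, applied to invented ratings. The actual study used ordinal rating scales, for which alpha uses a different distance metric; treating the values as nominal labels here is a simplification.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(units):
    """Krippendorff's alpha for nominal ratings.

    `units` is one list of ratings per rated item; items with fewer
    than two ratings carry no agreement information and are skipped.
    """
    coincidences = Counter()                   # coincidence matrix o[c, k]
    for ratings in units:
        m = len(ratings)
        if m < 2:
            continue
        for c, k in permutations(ratings, 2):  # all ordered pairs within a unit
            coincidences[(c, k)] += 1.0 / (m - 1)
    n = sum(coincidences.values())             # total pairable ratings
    marginals = Counter()
    for (c, _), w in coincidences.items():
        marginals[c] += w
    observed = sum(w for (c, k), w in coincidences.items() if c != k) / n
    expected = sum(marginals[c] * marginals[k]
                   for c in marginals for k in marginals if c != k) / (n * (n - 1))
    return 1.0 - observed / expected

# Invented likelihood ratings by three raters on four risks
ratings = [["low", "low", "low"],
           ["high", "high", "medium"],
           ["medium", "medium", "medium"],
           ["low", "medium", "low"]]
print(f"alpha = {krippendorff_alpha_nominal(ratings):.3f}")
```

Alpha is 1 for perfect agreement and near 0 when observed disagreement matches what chance would produce.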


The research setup
[Diagram: a sample of objects of study represents one or more population elements; treatment instruments and measurement instruments act on the sample.]
• Population: RA professionals in telco, doing RA in a quiet room
• Sample: a self-selected sample of students, in a quiet room
• Treatment: application of RASTER to a small case
• Treatment instruments: oral instruction, written case description, and RASTER help
• Measurement instruments: personal observation, exit questionnaire, RASTER forms
• Similarities and dissimilarities! Both are used to reason from the sample to the population.

1. Theory of variability formulated;
2. designed a research setup that minimized the impact of these sources;
3. explained observed variation in terms of this theory;
4. used this to generalize to the population and to improve RASTER.
