Measurement EDRS 6301 Summer 2001 Dr. Kielborn


Page 1

Measurement

EDRS 6301

Summer 2001

Dr. Kielborn

Page 2

Measurement

All measures contain error

Random error leads to unreliability

Systematic error leads to invalidity

Obtained Score = True Score + Random Error
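A short simulation can make the score model concrete. The sketch below is not part of the original slides; the group size, score scale, and error sizes are invented for illustration (Python/NumPy). It draws a set of true scores, adds mean-zero random error to produce obtained scores, and adds a constant systematic error to show why the two kinds of error behave differently.

    import numpy as np

    rng = np.random.default_rng(0)             # fixed seed so the example is repeatable
    n = 200                                    # hypothetical group of examinees

    true_score = rng.normal(70, 10, n)         # the real, unchanging characteristic
    random_error = rng.normal(0, 5, n)         # mean-zero, unsystematic error
    systematic_error = 3.0                     # a constant bias (e.g., a mis-keyed item)

    obtained = true_score + random_error + systematic_error

    # Random error scatters individual scores but averages out across the group,
    # so it lowers the agreement between obtained and true scores (unreliability).
    print(np.corrcoef(true_score, obtained)[0, 1])   # noticeably below 1.0

    # Systematic error does not average out: every obtained score is shifted,
    # so the instrument measures the characteristic inaccurately (invalidity).
    print(obtained.mean() - true_score.mean())       # close to +3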

Page 3

True Scores and Error Scores

A true score is the real and unchanging measure of the human characteristic

The error score is a positive or negative value that results from uncontrolled and unrecognized variability in the measurement

Page 4

Error

Error can be the result of the way we observe or test the individual

Observational (test difficulty; a broad test with only a few items sampling each concept; items unclear to the participant)

Procedural (inconsistent administration, recording, scoring, or interpretation)

Page 5

Error continued

Subject (individuals performing differently; the participant's reaction to the instrument or experiment)

Page 6

Reliability

Reliability is consistency in measurement

Consistency is specific to the group being assessed

If there is consistency, there is confidence in the results

Page 7

Reliability

The consistency of measurement

The extent to which observations or an experimental design can be replicated by another independent researcher

How consistently a data collection process measures whatever it measures
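Consistency can be checked by giving the same simulated group the measure twice and correlating the two administrations. This is only a sketch with invented numbers, not material from the slides, but it shows that the more random error each administration carries, the less the two sets of scores agree.

    import numpy as np

    def test_retest_correlation(error_sd, n=500, seed=1):
        """Correlate two error-laden administrations of the same measure."""
        rng = np.random.default_rng(seed)
        true_score = rng.normal(70, 10, n)                # the stable characteristic
        first = true_score + rng.normal(0, error_sd, n)   # administration 1
        second = true_score + rng.normal(0, error_sd, n)  # administration 2
        return np.corrcoef(first, second)[0, 1]

    # More random error -> lower consistency between the two administrations
    for sd in (2, 5, 10):
        print(sd, round(test_retest_correlation(sd), 2))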

Page 8

Sources of Unreliability

Faulty items and observations (tricky, ambiguous, or confusing questions or format)

Excessively difficult elements of the data collection process (participants guess)

Excessively easy elements of the data collection process

Inadequate number of observations or items

Page 9

Sources of Unreliability

Accidentally focusing on multiple outcomes (test items or interview/survey questions do not all refer to the same characteristic)

Faulty scoring

Characteristics of the respondents (inability to concentrate, mood)

Faulty administration (room may be hot or cold or full of distractions)

Page 10

Validity

Does it measure what we think it is measuring?

High validity - a high degree of accuracy

Page 11

Validity

Establish rapport

Minimize disruptions

Use unobtrusive methods for recording data

Triangulation - confirming results through more than one data source

Page 12

Internal validity

The extent to which the results of a study are supported by the methodology. A well-controlled study is said to have high internal validity and “believable” conclusions.

Page 13

Threats to Internal Validity

History - An event occurring between the pretest and the posttest

Maturation - A change that occurs because a participant has grown older or gained experience

Instrumentation - A change that occurs because the testing procedures are unreliable or have been altered unintentionally

Page 14

Threats to Internal Validity

Testing - A change that occurs because the test has sensitized the participants to the nature of the research

Regression - The tendency of a very low score or a very high score to move toward the mean on retesting
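A small simulation (invented numbers, not from the slides) shows why regression is a threat: participants chosen because of very low pretest scores move back toward the mean on a second testing even when nothing has been done to them, simply because part of their extreme pretest score was random error.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 1000
    true_score = rng.normal(50, 10, n)
    pretest = true_score + rng.normal(0, 10, n)      # obtained = true + random error
    posttest = true_score + rng.normal(0, 10, n)     # same people, fresh random error

    low_scorers = pretest < 35                       # selected for extreme pretest scores
    print(round(pretest[low_scorers].mean(), 1))     # far below the group mean of ~50
    print(round(posttest[low_scorers].mean(), 1))    # closer to 50, with no intervention at all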
