Capturing Elusive Level 3 Data: The Secrets of Survey Design (Session: SU116)


DESCRIPTION

Capturing Elusive Level 3 Data: The Secrets of Survey Design (Session: SU116). Presented by Ken Phillips, Phillips Associates, May 19, 2013. Agenda: review the Kirkpatrick/Phillips 5-level evaluation model, examine Level 3 evaluation facts, analyze survey creation errors in a sample Level 3 evaluation, and apply 12 tips for creating valid, scientifically sound Level 3 evaluations.

TRANSCRIPT

Capturing Elusive Level 3 Data: The Secrets of Survey Design

Session: SU116

Presented by: Ken Phillips, Phillips Associates, May 19, 2013

AGENDA

➤ Review Kirkpatrick/Phillips 5 Level Evaluation Model

➤ Examine Level 3 evaluation facts

➤ Analyze survey creation errors in a sample Level 3 evaluation

➤ Apply 12 tips for creating valid, scientifically sound Level 3 evaluations


KIRKPATRICK/PHILLIPS EVALUATION MODEL

Levels of Evaluation | Measurement Focus | Time Frame
Level 1: Reaction | Participant favorable reaction to a learning program | Conclusion of learning program
Level 2: Learning | Degree to which participants acquired new knowledge, skills or attitudes | Conclusion of learning program, or within 6 to 8 weeks after
Level 3: Behavior | Degree to which participants applied what was learned back on the job | 2 to 12 months
Level 4: Results | Degree to which targeted business outcomes were achieved | 9 to 18 months
Level 5: ROI | Degree to which monetary program benefits exceed program costs | 9 to 18 months

LEVEL 3 EVALUATION FACTS*

55% of organizations evaluate at least some learning programs at Level 3

Organizations that use Level 3s on average evaluate 25% of all programs

75% of organizations view data collected as having high or very high value

*ASTD Research Study, “The Value of Evaluation: Making Training Evaluations More Effective,” 2009

DATA COLLECTION METHODS*

*Donald & James Kirkpatrick, “Evaluating Training Programs: The Four Levels,” 2006.

LEVEL 3 EVALUATION FACTS*

➤ Most common methods for evaluating behavior

• Participant surveys (31%)

• Action planning (27%)

• Performance records monitoring (24%)

• On-the-job observation (24%)

*ASTD Research Study 2009

POSSIBLE SURVEY RESPONDENTS

➤ Learners

➤ Peers/colleagues

➤ Direct reports

➤ Managers

➤ External customers

HOW TO DECIDE

Who has first-hand knowledge of learners’ behavior?

How disruptive/costly is the data collection method?

How credible do results need to be?

What are business executive/stakeholder expectations?

REQUIRED DRIVERS*

➤ Planning to measure behavior? Include required drivers

• Reinforcements – follow-up modules, job aids

• Encouragement – coaching, mentoring

• Rewards – recognition, $

• Monitoring – surveys, focus groups

➤ On-the-job behavior change is both our & management’s responsibility

*Kirkpatrick Four Levels Evaluation Certification Program participant workbook

SAMPLE LEVEL 3 PARTICIPANT SURVEY

WHAT’S WRONG WITH THESE?

8. Before providing employees with feedback about their job performance, my manager considers whether or not he or she is knowledgeable about their job.

25. When giving feedback to an employee my manager considers whether it should be done privately or in the presence of others.


TIP 1 (CONTENT)

Focus on observable behavior, not thoughts or motives.*

*Palmer Morrel-Samuels, “Getting the Truth into Workplace Surveys,” Harvard Business Review, 2002.

WHAT’S WRONG WITH THESE?

14. My manager gives his or her employees feedback just as soon as possible after an event has happened and avoids getting emotional or evaluative.

18. My manager provides employees with regular ongoing feedback about their job performance and speaks in a normal conversational tone or manner when delivering the feedback.


TIP 2 (CONTENT)

Limit each item to a single description of behavior.*

*Palmer Morrel-Samuels, 2002
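As a rough, purely illustrative way to apply this tip across a long draft survey, the Python sketch below flags items whose wording contains conjunctions that often signal two behaviors packed into one statement. The heuristic and helper names are my own simplification, not something from the presentation; the item texts are items 14 and 18 from the sample survey shown above.

```python
# Crude, illustrative heuristic: flag survey items that may describe more
# than one behavior by looking for common conjunctions. A human editor
# still makes the final call; this only surfaces candidates for review.
import re

items = {
    14: "My manager gives his or her employees feedback just as soon as possible "
        "after an event has happened and avoids getting emotional or evaluative.",
    18: "My manager provides employees with regular ongoing feedback about their "
        "job performance and speaks in a normal conversational tone or manner "
        "when delivering the feedback.",
}

DOUBLE_BARREL_HINTS = re.compile(r"\b(and|as well as|while also)\b", re.IGNORECASE)

for item_id, text in items.items():
    if DOUBLE_BARREL_HINTS.search(text):
        print(f"Item {item_id}: may describe more than one behavior -> consider splitting")
```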

WHAT’S WRONG WITH THESE?

2. My manager doesn’t get to know his or her employees as individuals before providing them with feedback about their job performance.

7. When giving employees feedback about their job performance, my manager doesn’t distinguish between patterns of behavior and random one-time events.

TIP 3 (CONTENT)

Word about 1/3 of the survey items so that the desired answer is negative.*

*Palmer Morrel-Samuels, 2002.
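At scoring time, items worded so that the desired answer is negative need to be reverse keyed. The Python sketch below assumes a 7-point scale and an illustrative scoring helper (neither is prescribed by the presentation) to show how such items are flipped so that a higher score always means more of the desired behavior.

```python
# Minimal sketch of reverse scoring, assuming a 7-point scale and a
# per-item "reverse_keyed" flag. The helper is illustrative only.
SCALE_MIN, SCALE_MAX = 1, 7

def score(raw: int, reverse_keyed: bool) -> int:
    """Return the scored value, flipping reverse-keyed items so that a
    higher score always means more of the desired behavior."""
    if not SCALE_MIN <= raw <= SCALE_MAX:
        raise ValueError(f"response {raw} is outside the {SCALE_MIN}-{SCALE_MAX} scale")
    return (SCALE_MAX + SCALE_MIN - raw) if reverse_keyed else raw

# Item 2 in the sample survey ("My manager doesn't get to know...") is worded
# so the desired answer is negative, so it is reverse keyed:
print(score(2, reverse_keyed=True))   # a raw response of 2 scores as 6
print(score(6, reverse_keyed=False))  # positively worded items pass through unchanged
```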

WHAT’S WRONG WITH THESE?

➤ Building Trust

➤ Credibility

➤ Feedback Sign

➤ Feedback Timing

➤ Feedback Frequency

➤ Message Characteristics

TIP 4 (FORMAT)

Keep sections of the survey unlabeled*

*Palmer Morrel-Samuels, 2002.

TIP 5 (FORMAT)

Design sections to contain a similar number of items, and questions to contain a similar number of words.*

*Palmer Morrel-Samuels, 2002.

TIP 6 (FORMAT)

Place questions regarding respondent demographics (e.g., name, title, department, etc.) at the end of the survey, make completion optional, and keep these questions to a minimum.*

*Palmer Morrel-Samuels, 2002.

TIP 7 (MEASUREMENT)

Collect data from multiple observers or from a single observer multiple times.
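To illustrate what pooling data from multiple observers can look like once responses are in, here is a small Python sketch that averages one item’s ratings across several observer roles and reports the spread as a rough disagreement check; the roles and ratings are invented for illustration.

```python
# Minimal sketch: pooling one survey item's ratings from several observers
# (learner, peer, direct report, manager). Roles and values are illustrative.
from statistics import mean, stdev

ratings_by_observer = {
    "learner": 6,
    "peer": 5,
    "direct_report": 4,
    "manager": 5,
}

values = list(ratings_by_observer.values())
print(f"pooled mean rating: {mean(values):.2f}")       # central tendency across raters
print(f"rater spread (std dev): {stdev(values):.2f}")  # a large spread flags disagreement
```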

WHAT’S WRONG WITH THIS?

Strongly Agree (4)   Agree (3)   Disagree (2)   Strongly Disagree (1)   N/A

TIP 8 (MEASUREMENT)

Create a response scale with numbers at regularly spaced intervals and words only at each end.*

*Palmer Morrel-Samuels, 2002.

EXAMPLES

This:
Not at all True (1)   2   3   4   5   6   Completely True (7)

Not This:
Not at all True (1)   Rarely True (2)   Occasionally True (3)   Somewhat True (4)   Mostly True (5)   Frequently True (6)   Completely True (7)

Never This:
Not at all True   Rarely True   Occasionally True   Somewhat True   Mostly True   Frequently True   Completely True

TIP 9 (MEASUREMENT)

Use only one response scale, with an odd number of points (7-, 9-, and 11-point scales are best).*

*Palmer Morrel-Samuels, 2002.
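Tips 8 and 9 translate naturally into a single reusable scale definition. The sketch below uses a hypothetical ResponseScale dataclass (my construction, not the presenter’s) that enforces an odd number of points and carries anchor words only at the two ends.

```python
# Minimal sketch of a response scale following Tips 8 and 9: numbers at
# regularly spaced intervals, words only at the ends, an odd number of
# points. The dataclass and its fields are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class ResponseScale:
    points: int        # odd number of points; 7, 9, or 11 work best (Tip 9)
    low_anchor: str    # word label at the low end only (Tip 8)
    high_anchor: str   # word label at the high end only (Tip 8)

    def __post_init__(self):
        if self.points % 2 == 0:
            raise ValueError("use an odd number of scale points")

# One scale, reused for every item on the survey.
SCALE = ResponseScale(points=7, low_anchor="Not at all True", high_anchor="Completely True")
print(list(range(1, SCALE.points + 1)))  # evenly spaced numeric points: [1, 2, 3, 4, 5, 6, 7]
```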

ODD vs EVEN SCALE

TIP 10 (MEASUREMENT)

Use a response scale that measures frequency, not agreement or effectiveness.*

*Palmer Morrel-Samuels, 2002.

EXAMPLES

This:
Never (1)   2   3   4   5   6   Always (7)

Or this:
Not at all True (1)   2   3   4   5   6   Completely True (7)

TIP 11 (MEASUREMENT)

Place small numbers at the left or low end of the scale and large numbers at the right or high end.

EXAMPLES

This:
Not at all True (1)   2   3   4   5   6   Completely True (7)

Not this:
Completely True (7)   6   5   4   3   2   Not at all True (1)

TIP 12 (MEASUREMENT)

Include a “Not Applicable” or “Did Not Observe” response choice and make it visually distinct from the numbered scale points.*

*Palmer Morrel-Samuels, 2002.

EXAMPLE

Not at all True (1)   2   3   4   5   6   Completely True (7)   NA
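To show why a separate “Not Applicable / Did Not Observe” choice matters at analysis time, here is a short Python sketch that averages one item’s responses while treating NA as missing data rather than as a low score; the response values are invented for illustration.

```python
# Minimal sketch: treating "NA" / "Did Not Observe" as missing data
# instead of as a score of zero. Response values are illustrative.
from statistics import mean

responses = [6, 7, "NA", 5, 6, "NA", 4]  # one item's answers across respondents

numeric = [r for r in responses if r != "NA"]   # drop NA before computing statistics
observed_share = len(numeric) / len(responses)  # how often the behavior could be rated

print(f"item mean (NA excluded): {mean(numeric):.2f}")                       # -> 5.60
print(f"share of raters who observed the behavior: {observed_share:.0%}")    # -> 71%
```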

SUMMARY (CONTENT)

Focus on observable behavior

Limit each item to a single description of behavior

Word about 1/3 of items so the desired answer is negative (reverse scored)

SUMMARY (FORMAT)

Keep survey sections unlabeled

Design sections to contain a similar number of items and questions a similar number of words

Place respondent demographic questions at the end of the survey, make completion optional, and keep them to a minimum

SUMMARY (MEASUREMENT)

Collect data from multiple observers or multiple times

Create a response scale that:

• Has words only at each end
• Has an odd number of points
• Measures frequency
• Has small numbers at the left and large numbers at the right
• Includes a “Not Applicable” choice that is visually distinct

FREE ARTICLES

Phillips, Ken, “Eight Tips on Developing Valid Level 1 Evaluation Forms,” Training Today, Fall 2007, pp. 8 and 14.

Phillips, Ken, “Developing Valid Level 2 Evaluations,” Training Today, Fall 2009.

Phillips, Ken, “Capturing Elusive Level 3 Data: The Secrets of Participant Survey Design,” unpublished article, 2013.

Phillips, Ken, “Do Level 1 Evaluations Have a Role in Organizational Learning Strategy?”, unpublished article, 2013.

YOUR FEEDBACK COUNTS!

Evaluation forms for this session are available now via the mobile app and at the following link:

www.astdconference.org


Ken Phillips

Phillips Associates

34137 N. Wooded Glen Drive

Grayslake, Illinois 60030

(847) 231-6068

www.phillipsassociates.com

ken@phillipsassociates.com
