whitepaper: instructional designer/developer practice ......instructional designer/developer...

33
INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Whitepaper: Instructional Designer/Developer Practice Analysis and Survey Results Sharon L. Gander, CPT, Practice Leader September 2014

Upload: others

Post on 27-May-2020

12 views

Category:

Documents


0 download

TRANSCRIPT

Page 1: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Whitepaper: Instructional Designer/Developer

Practice Analysis and Survey Results

Sharon L. Gander, CPT, Practice Leader

September 2014

Page 2: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Table of Content Whitepaper: Instructional Designer/Developer Practice Analysis and Survey Results ................................. 1

Table of Content ......................................................................................................................................... 2

Background and Overview .......................................................................................................................... 4

Step 1: The Current State of Instructional Design ................................................................................. 4

Market ................................................................................................................................................ 4

Standards ............................................................................................................................................ 5

Theories and Models .......................................................................................................................... 5

Job Descriptions .................................................................................................................................. 5

Changing market dynamics ................................................................................................................ 6

The Need................................................................................................................................................. 7

Step 2: Strategic Decision – Certification vs Microcredentials ............................................................... 7

Step 3: Model-free/Theory-free Standards ............................................................................................ 7

Step 4: Practice Analysis -- Identifying Common Themes ...................................................................... 8

Step 5: Practice Analysis - Validating Domains ....................................................................................... 9

Practice Analysis Survey Results ................................................................................................................ 9

Demographics ......................................................................................................................................... 9

Current Role ...................................................................................................................................... 10

Supporting Experience ...................................................................................................................... 11

Years of Experience .......................................................................................................................... 12

Learning Solutions Developed .......................................................................................................... 13

Standards Domains and Performances .................................................................................................... 16

Importance ....................................................................................................................................... 17

Frequency ............................................................................................................................................. 18

Difficulty............................................................................................................................................ 19

Who does this work? ............................................................................................................................ 21

What Did We Miss? .............................................................................................................................. 22

Summary ................................................................................................................................................... 23

Appendix A: Theories ............................................................................................................................... 24

Page 3: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Appendix B: Standardization of Terms .................................................................................................... 28

Appendix C: Instructional Design Badge Definitions for Solutions Domain Badges ................................. 29

Target Audience:................................................................................................................................... 29

Secondary target audience: .................................................................................................................. 29

Domains with performance measures (i.e., standards) to meet ............................................................. 29

Align Solution ........................................................................................................................................ 29

Assess performance .............................................................................................................................. 30

Ensure context sensitivity ..................................................................................................................... 30

Elicit performance "practice" ............................................................................................................... 30

Engage learner ...................................................................................................................................... 31

Enhance retention and transfer ........................................................................................................... 31

Ensures Relevance ................................................................................................................................ 31

Addresses Sustainability ....................................................................................................................... 32

Collaborates and Partners .................................................................................................................... 32

Appendix D: Learning Solutions ............................................................................................................... 33

Page 4: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

4 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Background and Overview The Institute for Performance Improvement, L3C (TIfPI) sponsors a variety of workforce capability

development credentials and is seeking to develop a credential for instructional designers and

developers. TIfPI commissioned a practice analysis to define an international, theory-free/model-free set

of standards for instructional designers and developers (IDs). One element of that process included a

survey to validate the proposed standards (domains of knowledge, skill, and performance).

This paper discusses the practice analysis process, including survey results that generated nine primary

skill set domains for IDs.

The instructional design and development field is large, international, fragmented, and generally unable

to demonstrate consistent value for work performed. Job and role titles vary. Job preparation varies and

career paths are extremely flexible. Often, individuals in this field struggle to be seen as credible experts

with highly refined skills. Providing opportunities for individuals to earn credentials would be the first

step to building field validity, creating standards that transcend borders, and increasing the value of

instructional design practitioners regardless of position title. A team of TIfPI practice leaders who are

instructional design and development experts tackled the problem. This paper provides an overview of

the current state of ID, followed by an explanation of the method these experts proposed to tackle the

issue of credentialing individuals in this field.

Step 1: The Current State of Instructional Design The US Bureau of Labor and Statistics (BLS) lists only one role within the Professional and Services sector

related to the field of instructional design, the Training and Development Specialist. BLS does not include

a national labor role for Instructional Designer or Instructional Developer or Instructional Technologist,

even though degree programs for these roles exist in colleges and universities across the United States

and around the world. However, BLS indicates that the demand for Training and Development Specialists

is expected to increase by 15% between 2012 and 2020, adding more than 35,000 new jobs (see Table 1).

Table 1: US Bureau of Labor and Statistics (BLS) Quick Facts – Training and Development Specialist.*

(http://www.bls.gov/ooh/business-and-financial/print/training-and-development-specialists.htm#tab-2)

Quick Facts: Training and Development Specialists

2012 Median Pay $55,930 per year $26.89 per hour

Entry-Level Education Bachelor’s degree

Work Experience in a Related Occupation Less than 5 years

On-the-job Training None

Number of Jobs, 2012 228,800

Job Outlook, 2012-22 15% (Faster than average)

Employment Change, 2012-22 35,400

*Note: US BLS does not have a listing for Instructional Designer or Instructional Developer.

Market

Page 5: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

5 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

The employment market is fragmented and diverse. Every sector of business requires instructional

designers and developers. Many employers prefer IDs with experience in their business sector. This

means that subject/content experts with a talent for teaching often move into instructional design using

their field-specific knowledge as the key to open the door to course design and development, but with

little or no formal preparation for quality instructional design and development.

Statistics are not available on the exact percentage of IDs with degrees in instructional design and

development, adult education, or training and development. However, practitioners usually estimate

that around 20% of the field has formal education in the field. The rest move into the field laterally, as

subject/content experts with talent supplemented by workshops, conferences, reading, and trial-and-

error on-the-job experience.

For many lateral movers, their move into instructional design and development constitutes a significant

promotion. Meanwhile, experienced IDs of all types struggle to find effective career paths.

The field is fragmented, diverse, and undifferentiated.

Standards

Few standards exist within the field; those that do exist are implemented across a vast array of solution

sets, complicating adherence to the standards. An additional challenge is that some of the industry

standards refer to models or theories that are open to interpretation. The standards are not consistent.

Theories and Models

A variety of theories and theorists infuse the field with expectations that frequently conflict with one

another. Many theories and models are very effective; few are mutually exclusive. Often, practitioners

build expertise by applying one model or theory. They may even build a professional self-identity related

to it. As a result, differences between models and theories become points of friction that create “camps”

from which experts operate. They also become buzzwords for leaders seeking to hire or promote

learning experts, while not having sufficient personal expertise to decode the differences in skillsets.

The truly expert ID chooses the right set of theories, models, and tools to apply in any given situation.

Often, the expert ID’s choice must be justified to clients and leaders who have minimal understanding of

the impact that various choice may produce. In the real world of ID work, any two designers

approaching the same situation might choose different theories, models, and tools to accomplish the

same goal. Both could be successful. There is no “one right answer” to design.

However, theories and models do not provide sufficient guidance to create field standards that work

across organizational, institutional, geographic, or philosophical boundaries.

Job Descriptions

In 2012-2013, Sharon Gander, CPT and TIfPI Practice Leader/Instructional Designer, tracked job postings

for more than 120 job listings on major job search engines. This was not a formal study with defined

results; just a job-search experience summarized as follows. Many of the instructional design positions

listed in 2012-2013:

Page 6: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

6 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Required knowledge of the ADDIE model, without indicating that the position incumbent would

use this model.

Required expert proficiency in a large number of tools and technologies, but the employer did not

indicate what the incumbent of the position was supposed to do with the mix of tools.

Requested inexplicable and often non-standard degrees (e.g., high school diploma required in

some job descriptions, while very similar ones required Ph.Ds. in organizational or industrial

psychology).

Required 3-5 years of experience (not more and not less).

Offered low pay scales – some barely more than minimum wage, though more frequently started

in the low $30K range.

Provided no career path information, though a few implied a career path by using job titles such

as ID Level 1, ID3, etc.

In effect, the majority of posted positions assumed experience (3-5 years) supplemented by significant

expertise with complex toolsets. However, these posted positions paid the equivalent of a post-college

new-hire position. This makes it difficult to enter the market. (How does one get that first three years of

experience, anyway?) The lack of a career path makes it difficult to attract qualified candidates. The

overall effect is to devalue the role of instructional designer, which impacts not only the ID but the

employer, as well, since the job search does not generate a list of viable candidates.

Changing market dynamics

ID job listings and freelance websites show a distinct devaluation of instructional design and

development professionals. Indicators of changing market dynamics include:

Shifting work off-shore to reduce the costs of instructional development services. This practice

shifts the focus to bid-rates rather than design, quality, or even learner outcomes. This practice

also creates a ‘buyer beware’ market, since the buyer of low-bid ID services has no way to

compare the quality of the ID’s work on any scale other than rate.

Devaluing professional work as highly experienced practitioners compete against entry-level

practitioners and off-shore bidders for work that is based solely on price. The impact is an inability

to compare work results on any basis except dollars. This also occurred during the recession

between 2007 and 2011.

Increasing market demand for specific tools or tool platforms creates a tools emphasis without

addressing learner need. In addition, there is no indication that the market recognizes or

differentiates the technology skill set required for the development of any solution set other than

the elearning ones. Nor does it appear that employers know that other solution sets exist. In

addition, this tool focus tends to idealize today's tools and ignore the fact that other emerging tool

sets will continue to change the way learning is designed and developed.

Increasing emphasis on content, rather than learner needs, devalues the effort required during

the design process to identify and/or create content. This creates a shift toward expert knowledge

embedded in slide decks, blogs, and expert presentations, even as it devalues the work effort

required to develop (write, review, approve) new content. It creates a mindset that instructional

design it "merely" copy-pasting content from a source to a elearning tool.

Page 7: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

7 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Increasing use of technology to deliver training and provide infrastructure also:

o Increases the quantity of content that field experts can produce.

o Decreases the effort required to deliver content in learning formats.

o Focuses attention on tools rather than processes.

o Ignores the need to evaluate learner progress.

o Ignores the need to define exactly what learners need and the structured organization of

content and experience required to meet those need.

The Need Recent changes in the ID roles and the lack of BLS role standards have created:

Role confusion,

Difficulty matching the right skills to the work,

Selection of worker based on wage (hourly cost) rather than on the ability to do the work,

A focus on tools rather than skills or quality, and

Inconsistent results from learning solution products.

The need, then, is to provide a framework that will validate professional instructional design and

development skills as they are practiced by both the formally trained (i.e., by those with college

degrees) and informally trained practitioner.

Step 2: Strategic Decision – Certification vs Microcredentials TIfPI’s first strategic decision was whether to drive out certification for instructional designers that

required field-wide experience (including needs assessment) or to consider approaching the field with

microcredentials (a.k.a. badges). A microcredential validates competency in one or more subsets of a

field, whereas a full certification validates competency across the breadth of the field. Due to the

market fragmentation, role confusion, and the lack of differentiation among skillsets, TIfPI chose to

explore the microcredential path as the opportunity with the most promise for building and validating

capability among instructional designers and developers.

Step 3: Model-free/Theory-free Standards Credentials, even microcredentials, require a set of common standards against which an applicant’s work

can be judged. Assessment of work against standards may occur through knowledge testing, evidence of

performance, or a combination of the two. However, the instructional design and development field is

very unclear about its standards. Often, employers judge work based on an individual’s ability to apply

common models (ADDIE, ISD, SAM, Agile, Lean, etc.) or theories (adult learning theory, behaviorism vs

constructivism, Bloom, Mager, Gagne, Merrill, Rossett, etc.). Different employers accept different

models and theories. A standard is not available based on either a preferred model or a preferred

theory.

Because other organizations, such as American Society for Training and Development (now the

Association for Talent Development), developed credentials based on testing knowledge of learning

theory and because TIfPI promotes evidence-based credentialing, TIfPI chose to take the alternative

route of using an evidence-based, model-free approach to credentialing.

Page 8: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

8 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

In addition, TIfPI’s basic philosophy is to develop credentials that allow all workers to demonstrate

capability – in this case, for instructional designers’ and developers’ to be able to show their ability to get

results. Because the field has no common standards, only models and theories, the proposed credential

needs to define standards that are both model-free and theory-free and are used by expert instructional

designers and developers around the world.

Therefore, the second strategic decision was to look for an international, model-free/theory-free

approach to credentialing instructional designers and developers. This resulted in a practice analysis to

identify these standards. An explanation of the practice analysis and its results follows.

Step 4: Practice Analysis -- Identifying Common Themes TIfPI practice leader, Sharon Gander, CPT, researched nine major learning theories to uncover

commonalities among them (see Appendix A). This resulted in a list of 20 key terms used by multiple

theorists. Ten TIfPI Practice Leaders specializing in instructional design and development ranked these

terms for their value to instructional design, which created a short list of eleven common practices (see

Appendix B) that cross multiple design theories and models. These themes became the basis for

performance practices and potential skill set domains.

The team four of TIfPI’s instructional design and development experts, Andrea Moore, Annette

Wisniewski, Sharon Gander, and Dr. SiatMoy Chong convened to drive out standards; they became the ID

Badges Teams. This team identified nearly three dozen potential badges that would be useful across the

field. Then, they narrowed their focus to the development of learning solutions as the subset of the field

that should be approached first. They also recognized the need for badges front-end analysis, design,

measurement and evaluation, delivery logistics, content development, and project management. These

areas will be addressed in the future. The first practice analysis focused on development of learning

solutions and the standards that IDs use when developing out a given type of learning solution.

TIfPI defines “learning solution development” as the work that is required to build out, test, and

implement a design. It does not include front-end analysis, design documentation, on-going learning

delivery, or post-learning evaluation. Work here focuses on production of new learning events and tools.

The team identified nine practice domains needed by learning solution developers (by whatever role

name their organization defines them):

1. Align solution internally and externally

2. Assess performance

3. Address sustainability

4. Elicit performance practice

5. Ensure context sensitivity

6. Engage the learner

7. Enhance retention and transfer

8. Ensure relevance

9. Collaborate and partner (*) Performance assessment focused on learning only.

Page 9: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

9 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

These domains allow an expert reviewing a learning solution to answer two essential questions:

1. Does the ID provide evidence that the developed learning solution adequately adhered to

common standards?

2. Does the ID provide evidence that the developed solution adequately met the identified need of

the learner and the learner’s organization?

Step 5: Practice Analysis - Validating Domains These nine domains create a new language that is international, theory-free, and model-free. Since it is

new to both instructional designers/developers and to the ID field, TIfPI validated the new terminology.

First, the ID Badge Team shared the new domains with TIfPI’s Practice Leaders/instructional design

experts to identify gaps. Although the language of these domains was non-traditional and somewhat

uncomfortable, these experts ratified the new terminology and the domains. They did not add any

additional domains, but did request additional details in the form of definitions and examples of

expected performances for each domain.

The ID Badge Team defined terms and described performance expectations (see Appendix C). TIfPI then

tested the idea publicly by publishing a survey and promoting that survey to:

TIfPI Practice Leaders

International Society for Performance Improvement (ISPI) conference attendees at the

2014 THE Performance Conference (Indianapolis, IN)

TIfPI Practice Leaders’ personal constituents via email from the Practice Leader

TIfPI webinar and workshop attendees via mass email from TIfPI

Discussion groups in LinkedIn via links provided in discussion groups by Practice Leaders

participating in discussions

The survey was open for 12 weeks, April 10 – June 10, 2014.

Practice Analysis Survey Results Sixty-seven individuals responded to the survey. Twenty-nine responded to the demographics section only and skipped the practice domain standards questions. However, 43 individuals responded to both the demographics and the practice domain questions. These 43 respondents validated the domains and performance.

A strong alignment existed among these respondents; and the respondents appeared to represent the

field well. The following section describes these findings.

Demographics The survey started with four questions designed to focus respondents on instructional design in its varied

facets – design, development, delivery, artistry, project management, and leadership. Respondents were

asked to declare their current role (see Table 1 and Figure A), identify other roles that they had held (see

Table 2), define their length of service, and list the types of learning solutions that they had developed.

The demographics statistics alone were fascinating.

Page 10: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

10 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Current Role

Although TIfPI intended this survey for instructional designers, instructional developers, and instructional

technologist, the survey included related roles to better understand the breadth of experience that

instructional designers and developers bring to the field. To that end, survey participants from multiple

backgrounds were included.

Individuals identifying themselves as currently being instructional designers, developers, or technologists

represented 59% of the respondents. The other 41% represented a range of possible roles in the learning

industry (see Table 2 and Figure A).

Five individuals indicated that they currently held roles other than those provided in the survey. Similar

roles did exist in the list of choices; however, five individuals felt the need to express their specific role.

They were:

Adjunct Faculty at a state university, teaching graduate instructional design classes

Elearning Voiceover talent

Human Capital Executive

Teaching Artist

Curriculum Designer

Table 2: (Q1) How would you classify your current role? (Pick the one that is most appropriate.)

Answer Options Response Percent

Response Count

Instructional Designer 49% 33 Academic – college level research and/or teaching adult education, instructional design or instructional technology 9% 6

Learning Project Manager 9% 6

Instructional Developer 8% 5

Instructor (trainer) 6% 4

Learning Function Executive 6% 4

Learning Function Manager 5% 3

Student in adult education, instructional design, or instructional technology 3% 2

Graphic artist 2% 1

Instructional Technologist 2% 1

Programmer of learning solutions 2% 1

Subject content expert 2% 1

Social Media Expert 0% 0

Videographer 0% 0

Other (please specify)

5

Total

67

Page 11: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

11 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Figure A: Current roles diagram (Q1).

Supporting Experience

Instructional designers often transition between roles. In fact, the industry includes a large percentage of

practitioners that have made lateral movements from subject/content roles, artistic roles, and

technology roles. The role that any given instructional practitioner holds today may or may not be the

same as any previous roles. Therefore, the survey asked respondents to provide additional insight on

other roles that each respondent may have held in the past (see Table 3.)

Table 3: (Q2) What other roles have you held? (Choose all that apply.)

Answer Options Response Percent

Response Count

Instructor or Trainer 63% 42

Instructional Designer 54% 36

Instructional Developer 43% 29

Learning Project Manager 39% 26 Academic – college level research and/or teaching adult education, instructional design or instructional technology 34% 23

Subject content expert 28% 19

Instructional Technologist 25% 17

Learning Function Manager 19% 13

Graphic artist 13% 9

Learning Function Executive 10% 7

Programmer of learning solutions 9% 6

Social Media Expert 9% 6

Videographer 5% 3

Other (please specify)

6

Total

242

Average Roles/Respondent

3.6

Page 12: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

12 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

This question allowed respondents to select multiple options and provided TIfPI with snapshot of

respondents’ work history as compared to their current role (see Table 2). Together, Tables 2 and 3

provided several key insights:

The average respondent held 3.6 training and development roles, in addition to the respondent’s

current role.

Over 60% of respondents had been instructors/trainers in the past.

Over 50% of respondents indicated that they were instructional designers in the past.

30% of the respondents indicated that they currently or previously held the roles of Instructional

developer, learning project manager, or academic, in addition to their current role.

Other roles, such as leadership, technology, and artistic roles showed an increase over the same

roles listed in Question #1 (current role). This shows that these roles are essential to the career

path in this field, but may be flexible with IDs moving into them and out them with high

frequency.

TIfPI concluded from these results that instructional design and development practitioners might hold

multiple roles and functions, ranging from the artistic side to core design and development to leadership,

at various times in their careers. The results also show that individuals move among those roles

throughout their careers – that upward movement (from development to design to leadership) is not the

norm. Individuals in this field move multiple directions during their career in non-traditional moves.

For example, when looking at respondents who listed learning leadership as one of their previous roles,

their current roles where instructional design, developer, technologist, or academic, today. One was a

graphics artist, today.

Individuals with graphics artist as previous roles, were instructional designer, developer, content experts,

academic, and leader, today.

Those who listed instructional design/development at previous roles held leadership, graphic arts,

videography, content expert, and academic roles, as well as instructional design and project management

roles, today.

This mottled career path is to be expected in a fragmented and undifferentiated market.

In addition, responses to this question showed the dramatic breadth and depth of experience that

practitioners bring to their current work roles. Even the novices, with under five years’ experience, had

held several other roles. The average number of roles held for all respondents was 3.6 plus their role at

the time of reporting, or 4.6 overall.

Years of Experience

The large number of roles held by survey respondents might be viewed as a skewed perspective on the

field, if all respondents also identified themselves as having significant experience, as well – if the survey

had captured only the fields’ masters.

Table 4 and Figure B show respondents’ years of experience spread across the range of experience sets.

More than half of the respondents (55.2%) described themselves as having more than 10 years of

experience; these are the masters of the field. Responses for individuals with less than five years’

experience (the novices) and for individuals with five to ten years’ experience (intermediates) were

Page 13: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

13 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

distributed at 22.4% each. While experience groups were not equal, the survey population was random

enough for TIfPI to conclude that this experience-base reflects the current ID field’s experience.

Table 4: (Q3) How many years of experience do you have in the field of learning and instructional design and development? (Choose one.)

Answer Options Response

Percent Response

Count

Less than 5 22.4% 15

5 to 10 22.4% 15

More than 10 55.2% 37

answered question 67

Figure B: Years of Experience in Instructional Design and Development.

Learning Solutions Developed In the fourth question respondents were asked to select the different types of learning solution that they had created. Seventeen distinct learning solutions were available from which respondents could choose (see Table 5). Each learning solution was described so that respondents could distinguish between similar types of solutions). Multiple responses were allowed in order to provide an indication of the breadth and depth of instructional design expertise required.

Page 14: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

14 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Table 5: (Q4) What types of learning solutions have you developed?

Response Percent

Response Count

Asynchronous Elearning: Self-instructional learning solutions presented entirely online without any peer or instructor involvement; learning that is directed by a computer based on learners responses to questions and activities.

76.6% 49

Synchronous Elearning: Presented entirely online in real-time, these learning solutions include peer and instructor involvement through multiple mediums such as discussions, webinars, wikis, project spaces, etc.; learning can be modified by instructor to meet needs of learner.

60.9% 39

mLearning: Asynchronous elearning provided for mobile device such as cellphones and tablets.

25.0% 16

Instructor Led Training: Classroom-based learning led by an instructor or trainer where learning events may include other solution elements, such as media, job aids, electronic performance support, games, etc.

84.4% 54

Simulations: Online or classroom-based learning that recreates essential elements of real world conditions within the learning environment in order to provide safe learning environment supported by feedback while learners engage in mock events similar to those in their real world work or life.

57.8% 37

Learning Games: Uniquely designed game experiences that facilitate authentic learning through interaction with peers, content, processes, and manipulative game pieces or interface; does not repurpose other games especially branded. (Note: branded games, such as Jeopardy, Clue, etc. are considered to be edutainment games, gamification, or practice activities only and will not be evaluated as Serous Learning Games.)

42.2% 27

Electronic Performance Support: Electronic job aids that support work processes where such tools may or may not also be used in training programs or may include learning elements within the learners native workflow directly in the work environment.

42.2% 27

Job Aids: Tools that support learning and recall directly within the work environment where such tools may include charts, diagrams, memory aids, videos, and more.)

75.0% 48

Social Media: Online community for facilitating learning and/or practice, typically among trainees and workers.

29.7% 19

Reusable learning objects (RLOs, a.k.a. micro learning objects): Learning tightly focused to develop mastery of one specific skill or process step.

32.8% 21

Goal-based/problem-based scenarios: Learning environments that mimic common problems or scenarios within the work environment and provide resource rich learning support in order to solve the problem or reach the goal.

54.7% 35

Self-study: Self-directed learning where the direction and timing of learning is guided by the learner themselves through the use of text, media, and online accesses, etc.

46.9% 30

Blended learning: Combinations of learning solutions, particularly learning solution integrated to create a single learning outcome.

71.9% 46

Community of Practice: A peer-to-peer community that addresses key issues or problems within their field of practice.

39.1% 25

Coaching/Mentoring: A formalized program where participants can access coaches or mentors and where coach, mentor and participant have structured roles in order to accomplish organization or role-specific learning.

43.8% 28

Learning Videos: Videos designed to teach skills or processes. 45.3% 29

Informal Learning: Formalized programs that encourage and track work-based learning from peers, superiors, customers, social media, and the environment.

32.8% 21

Other (Please specify.) 1

answered question 64

Average Response per Questions 8.6

Page 15: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

15 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Figure C: Percentage of total respondents who indicated that they had developed each type of learning solution.

The results displayed in Table 5 and Figure C indicate the following:

Page 16: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

16 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

The majority of respondents developed some combination of asynchronous elearning, synchronous elearning, instructor-led training, simulations, goal-based or problem-based scenarios, job aids, and blended learning solutions.

Mobile learning (m-learning) and social media lag with 25% and 30%, respectively, indicating that significantly fewer IDs had the opportunity to worked on these types of learning solutions.

With 552 learning solutions selected, the average number of solutions per respondent was 8.6. This speaks to the breadth of experience expected.

Respondents with fewer than five years of experience listed essentially the same experience set as the overall group. They reported 100 responses, for an average of 6.25 learning solution types in less than five years of experience. This is significant number of learning solution types developed in a very short time period.

We can now answer key questions related to any survey:

Did this survey represent a cross-section of the intended learning and development audience?

Yes. The sampled respondents represent the years of experience, the range of roles, and the

variety of learning solution types typical of the ID field.

Is the survey skewed towards one portion of the population or another?

No, the balance of years of experience and types of roles appear to reflect the current state of the

field. On face validity alone, this ratio is very close to the field’s current experience

representation.

Will the survey results adequately represent the opinions of the field? Yes.

Standards Domains and Performances The majority of the survey was dedicated to questions about the nine domains of standard with

performances. Each standards domain had two questions. The first question had three parts asking

respondents to rate the importance, difficulty, and frequency of the domain on a scale of 1-4 (very low,

low, medium, and high). The second question asked respondents to identify which roles they would

expect to see involved in doing this work.

Domains surveyed were:

Address sustainability (Q23)

Align the learning solution (Q5)

Assess performance (Q8)

Collaborate and partner (Q26)

Elicit performance "practice" (Q14)

Engage the learner (Q20)

Enhance retention and transfer (Q17)

Ensure context sensitivity (Q11) Question numbers in parenthesis indicate the survey question about important, difficulty, and frequency.

Page 17: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

17 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Importance

In the area of domain importance, there was strong agreement among respondents to the survey, across

respondents’ years of experience and between domains (see Table 6). Align the learning solution ranked

as the most important domain. All three experience groups and the all-respondents group ranked this as

most important. Assess performance (during learning) ranked nearly as highly with an all-respondent

ranking of #2. Enhance retention and transfer, and Assess performance ranked either second or third in

all groups. Interestingly, the group with more than 10 years of experience, the masters, considered the

domains of Collaborate and partner and Engage the learner to be more important than did their juniors.

Elicit performance practice and Address sustainability ranked as highly as Enhance retention and transfer

did for those with less than 5 years of experience, and ranked at the bottom of the list for practitioners

with more than 10 years of experience.

More importantly, all nine domains were valued as moderately to highly important, which means that all

domains received mean ratings ranging of at least 3.0 and many received mean ratings of more than 3.5

out of 4.0. Although the three experience groups valued some skill sets more than others, the

differences were statically minor as indicated by a standard deviation (SD) between domains of less than

.2 points for all respondents.

The survey validates that all nine domains are important with Align the solution and Assess performance

ranking as most important.

Table 6: Importance of skill set domains, as rated by years of experience and all respondents.

Novice (<5 Yrs) Intermediate

(5-10 Yrs) Master (10> Yrs) All Respondents

Importance Mean # Rank Mean # Rank Mean # Rank Mean # Rank

Address sustainability (Q23) 3.71 7 3 3.63 8 5 3.67 15 7 3.67 30 7 Align the learning solution (Q5) 3.90 10 1 3.90

10 1 3.96 23 1 3.93 43 1

Assess performance (Q8) 3.78 9 2 3.80

10 3 3.94 18 2 3.86 37 2

Collaborate & partner (Q26) 3.67 6 6 3.63 8 5 3.94 16 2 3.81 32 4 Elicit performance "practice" (Q14) 3.71 7 3 3.78 9 4 3.61 18 8 3.68 34 6 Engage the learner (Q20) 3.57 7 7 3.50 8 7 3.88 16 4 3.71 31 5 Enhance retention & transfer (Q17) 3.71 7 3 3.89 9 2 3.82 17 5 3.82 33 3 Ensure context sensitivity (Q11) 3.00 7 8 3.50 8 7 3.78 18 6 3.55 33 8

Mean Across Domains 3.63 7.5 3.70 8.7 3.82 17.6 3.75 34.1

Standard Deviation

between Domains

0.27 1.31 0.16 0.89 0.13 2.45 0.13 4.16

Page 18: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

18 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Frequency The frequency with which practitioners do tasks and use skill sets (see Table 7) provides a counterbalance

to that skill set’s importance. A skill set domain that is deemed to be “important” may be used

“frequently”, thereby, increasing the demand for that skill. Frequency is an important factor in defining

domains for credentials.

As shown on Table 7, the masters with more than 10 years of experience drove the frequency rankings

for the following four domains:

Enhance retention and transfer

Collaborate and partner

Align the learning solution

Assess performance

In addition, there are clear arcs of increasing frequency across experience groups for:

Enhance retention and transfer (frequency ranked 3-2-1 as experience increases and an all-

respondents average of 1).

Collaborate and partner (frequency ranked 5-3-2 as experience increases and an all-respondents

average of 2).

However, Elicit performance practice, seems to arc frequency in reverse (2-4-7; all-respondents =5). This

reverse arch may be an important indicator of changing scope of work over time. That is, elicting

performance practice is an emphasis earlier in an ID's career. As ID succeed with this, they spend less

time on it but still consider it important. Over time, it may be easier to practice, and is, therefore, less

emphasized.

These mixed ranking and skill arcs by experience group may be indicative of age-and-stage skill set

expectations. The novices appears to be focused on Assess performance and Elicit performance practice;

they may be heavily focused on building solutions that create practice and evaluate actions (e.g.,

elearning modules showing step-by-step actions, skills workshops, etc.). The intermediate group appears

to work on Align solutions, Enhance retention and transfer, and Collaborate and partner. Here, the

intermediate level practitioner may be connecting learning solution to business need and participating in

teams building complex learning solutions. The masters are bringing it all together into a ‘complete

package’ and mentoring others and are doing most task with moderate to high frequency.

Page 19: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

19 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Table 7: Frequency with which a skill set domain is used, as rated by years of experience and all respondents.

Novice (<5 Yrs) Intermediate

(5-10 Yrs) Master (10> Yrs) All Respondents

Frequency Mean # Rank Mean # Rank Mean # Rank Mean # Rank

Address sustainability (Q23) 3.71 7 3 3.00 8 7 3.20 15 8 3.27 30 8

Align the learning solution (Q5) 3.60 10 6 3.70 10 1 3.70 23 3 3.67 43 3

Assess performance (Q8) 3.78 9 1 3.30 10 5 3.67 18 4 3.59 37 4

Collaborate and partner (Q26) 3.67 6 5 3.50 8 3 3.80 15 2 3.69 29 2

Elicit performance "practice"

(Q14) 3.71 7 2 3.44 9 4 3.41 17 7 3.48 33 5

Engage the learner (Q20) 3.57 7 7 2.88 8 8 3.67 15 4 3.43 30 6

Enhance retention and transfer

(Q17) 3.71 7 3 3.67 9 2 3.88 16 1 3.78 32 1

Ensure context sensitivity (Q11) 3.00 7 8 3.25 8 6 3.44 18 6 3.30 33 7

Mean Across Domains 3.59 7.5 3.34 8.8 3.60 17.1 3.53 33.4

Standard Deviation (SD) between

Domains 0.25 1.3 0.30 8.6 0.22 2.7 0.19 4.6

Once again, the alignment across domains and across experience groups is very tight. There is less than

one third of a point in standard deviation between skill set domains for each of the experience groups

and less than one quarter of point deviation between skill set domains for all-respondents.

Overall, with a mean rating of 3.53 by all-respondents, all skills set domains are used with moderately

high frequency and with little variance between domains. This leads to the conclusion that all domains

are used by all instructional designers and developers regularly.

Difficulty

The difficulty of tasks in a domain is a key factor in defining credential domains. For example, some

domains may have a low difficulty rating and be highly important or frequent, while others may

important and infrequent, as well as very difficult.

All nine standards domains rated as moderately difficult with little variance (SD) between them in any of the experience groups or in the all-respondents group (see Table 8). However, Ensures context sensitivity received the highest rating and ranking for the intermediate experience group (5-10 years) and lowest for the novice group (< 5 years). It ranked last among all the domains in the all-respondent group but retained a moderate difficulty rating of 3.27. This may be a factor of inconsistent content availability where some individuals have better quality content than others. Alternatively, it may mean that this is a very subtle skill developed over time as novices discover how difficult it is to generate good content, then (as intermediates) struggle with it, and eventually master it. Regardless, content and context remain important and frequently used skills.

Page 20: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

20 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Table 8: Skill set domain difficulty as rated by years of experience and all respondents.

Novice (<5 Yrs) Intermediate

(5-10 Yrs) Master (10> Yrs) All Respondents

Difficulty Mean # Rank Mean # Rank Mean # Rank Mean # Rank Align the learning solution

(Q5) 3.40 10 4 3.10 10 6 3.30 23 8 3.28 43 7

Assess performance (Q8) 3.44 9 2 3.10 10 8 3.44 18 4 3.35 37 4

Enhance retention and

transfer (Q17) 3.43 7 3 3.33 9 4 3.50 18 3 3.44 34 3

Collaborate and partner

(Q26) 3.33 6 5 3.38 8 2 3.31 16 7 3.33 30 5

Engage the learner (Q20) 3.29 7 6 3.38 8 2 3.63 16 1 3.48 31 1

Elicit performance "practice"

(Q14) 3.14 7 7 3.22 9 5 3.44 18 4 3.32 34 6

Address sustainability (Q23) 3.57 7 1 3.25 8 7 3.53 15 2 3.47 30 2

Ensure context sensitivity

(Q11) 2.86 7 8 3.50 8 1 3.33 18 6 3.27 33 8

Mean Across Domains 3.31 7.5 3.28 8.8 3.44 17.8 3.37 34.0

Standard Deviation between

Domains 0.22 1.3 0.14 0.9 0.11 2.4 0.08 4.3

Align the learning solution received the top ranking for importance (#1) by all-respondents (see Table 6).

It received a moderate ranking (#3) in frequency (see Table 7), but received a lower difficulty ranking (#7)

(see Table 8). However, it retains an all-respondent mean of 3.28 or moderately high difficult. Novices

saw this skill set as slightly more difficult than the intermediate and masters experience groups.

In the all-respondent group, Engage the learner received the top ranking (#1) for difficulty, even though it

has lower importance ranking (#5) and moderate frequency ranking (#3). It also appears to have an arc

of increasing difficulty as practitioners gain experience (6-2-1, all-respondents =1). This may be a

predictive skill set where the quality of learner engagements developed by a novice is predictive of their

success in the field. That is, novices that do not see the difficulty in engaging learners may not be doing

these tasks, while the intermediate and masters level practitioner do these and consider them worth the

extra efforts.

Once again, the alignment across domains and across experience groups is very tight. There is less than

one-tenth of point deviation between domains for all-respondents.

Overall, with a mean rating of 3.37 by all-respondents, all nine standards domains are moderately

difficult. When combined with frequency and importance, all nine standards domains are valid and

valued.

Page 21: Whitepaper: Instructional Designer/Developer Practice ......INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE) Background and Overview The Institute

21 | P a g e © T I f P I , 2 0 1 4

INSTRUCTIONAL DESIGNER/DEVELOPER PRACTICE ANALYSIS AND SURVEY RESULTS (ID BADGES PREMISE)

Who does this work? For each domain, a second question was asked, “Who does this work?” The role options presented were

the same as those presented in the demographics section (Q1 & Q2). Respondents could choose all the

appropriate responses. As Table 9 indicates, the vast majority of respondents expected instructional

designers to apply the entire range of domains. Instructional developers and instructional technologists

were less frequently tapped, but were still expected to do work.

Table 9: Responses by domain regarding who performs the work.

Skill Set Domain

Instructional Designer

Subject Content Expert

Instructional Developer

Instructional Technologist

Social Media Expert

Visual Artist(*)

Instructor/ Trainer

Align solution 88% 49% 24% 27% 7% 0% 32%

Assess performance 83% 47% 42% 25% 8% 0% 53%

Ensure context sensitivity 94% 79% 25% 22% 15% 0% 63%

Elicit performance practice 79% 49% 49% 39% 18% 0% 73%

Enhance retention and transfer 91% 49% 52% 49% 18% 0% 73%

Engage the learner 83% 53% 43% 33% 13% 0% 63%

Address sustainability 73% 47% 40% 27% 10% 0% 40%

Collaborate and partner 77% 57% 60% 43% 17% 0% 50%

Average for Role 83% 54% 42% 33% 13% 0% 56%

Although the survey was focused on domains skills for instructional designers and developers, results also

show that subject/content experts and instructor/trainers were expected to use the domains, as well.

However, social media specialists, visual artists (graphic artists, videographers, etc.) and other

development specialists were not. Academics were also not identified as doing work in any of the

domains.

The purpose of the survey was to understand the work of the instructional designer and developer. Table

9 clearly indicates that instructional designers, instructional developers, instructional technologists,

instructors/trainers, and subject/content experts must use all domains. Because most of the roles are

feeder roles providing career paths into the role of instructional designer, all domains must be

considered important for all of these roles.

However, the subject/content expert as a "doer" of these standards is unexpected. This finding needs further exploration to understand whether it is:

a) A manifestation of the recognized, but not documented, career path from subject/content expert to instructor to instructional designer (a lateral movement into instructional design).

b) A unique subset of skills required of subject/content experts who participate in learning solution development projects (e.g., engaging the learner means demonstrating to IDs how the audience the subject expert represents learns best).


c) An emerging skill set in content development and management that was not identified by this practice analysis and may need to be added to the badge set at a later date.

More study is needed on the role and function of subject/content experts in the instructional design and development process. In the meantime, subject/content experts moving into instructional design and development have much to gain from building capabilities in the nine standards against which IDs measure themselves.

In addition, learning leadership appears to participate in instructional design and development as well. The reasoning behind the ratings in Table 10 may need to be explored, and there are potential opportunities to identify how these domains are used in leadership roles. Leaders are apparently expected to at least collaborate and partner and to address sustainability, since these two domains received the highest ratings for each leadership role (a quick check of this appears after Table 10).

Table 10: Leadership roles in relation to domains: percentage of respondents who associate the domain's work with each role.

Skill Set Domain | Learning Project Manager | Manager of Learning Function | Executive of Learning Function
Align solution | 34% | 24% | 10%
Assess performance | 14% | 36% | 11%
Ensure context sensitivity | 31% | 22% | 9%
Elicit performance practice | 24% | 24% | 15%
Enhance retention and transfer | 18% | 12% | 6%
Engage the learner | 13% | 17% | 13%
Address sustainability | 60% | 43% | 30%
Collaborate and partner | 73% | 53% | 33%
Average for Role | 34% | 29% | 16%
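The claim above can be checked directly against the Table 10 percentages. The short sketch below (data copied from Table 10) lists the two highest-rated domains for each leadership role:

```python
# Table 10 percentages, keyed by domain; the tuple columns are
# (Learning Project Manager, Manager of Learning Function,
#  Executive of Learning Function).
table10 = {
    "Align solution": (34, 24, 10),
    "Assess performance": (14, 36, 11),
    "Ensure context sensitivity": (31, 22, 9),
    "Elicit performance practice": (24, 24, 15),
    "Enhance retention and transfer": (18, 12, 6),
    "Engage the learner": (13, 17, 13),
    "Address sustainability": (60, 43, 30),
    "Collaborate and partner": (73, 53, 33),
}

roles = ("Learning Project Manager", "Manager of Learning Function",
         "Executive of Learning Function")

# For every role, "Collaborate and partner" and "Address sustainability"
# come out on top.
for column, role in enumerate(roles):
    top_two = sorted(table10, key=lambda d: table10[d][column], reverse=True)[:2]
    print(f"{role}: {top_two}")
```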

What Did We Miss?

A final open-ended question asked survey respondents to tell us what skill sets we had missed. This question allowed respondents to voice their opinions on whether the survey had covered the essential skills of the field.

One individual listed a specific theory (failure-based learning). Others indicated that they could not think

of missed elements.


Summary

Question: Do the nine proposed domains address the complete learning solution development skill set?

Answer: Yes.

The survey has validated that the proposed nine domains provide a model-free/theory-free set of

standards for the field of instructional design and development. It can be concluded that individuals in a

variety of roles across the industry and across experience levels understand the domains and

performances. Respondents validated the work done in each domain as moderately-to-highly important,

frequently required, and moderately difficult for instructional designers and developers in a variety of

roles.

Survey respondents indicated that they saw instructional designer/developers, instructor/trainers, and

subject/content experts working in all nine domains.

They saw instructional developers and instructional technologists doing more work in some domains

(Elicit performance practice, Enhance retention and transfer, and Collaborate and partner) than in others,

but participating to some degree in the work of all domains.

Additional research will be needed to understand the differences between instructional designer/developers and subject/content experts involved in course design and development. The overlapping roles here may be indicative of the career movement from subject expert to instructional designer, or they may indicate unique tasks and skill sets for subject experts.

Additional research will also be needed to understand the work of leadership within these domains.


Appendix A: Theories

The comparison covered ten theory columns, each pairing a theorist with a theory and a reference:

Gagné – 9 Events of Instruction – http://citt.ufl.edu/tools/gagnes-9-events-of-instruction/
Schank – Case-based Reasoning (reasoning through analogy) – http://hlwiki.slais.ubc.ca/index.php/Case-Based_Reasoning
Schank – Goal-based Learning – https://sites.google.com/a/nau.edu/learning-theories-etc547-spring-2011/theory/goal-based-scenarios
Merrill – First Principles of Instruction / Pebble-in-the-Pond / Scenario-based – http://mdavidmerrill.com/Papers/firstprinciplesbymerrill.pdf
Pike – 7 Laws of Learning – http://en.wikipedia.org/wiki/Principles_of_learning
Snyder & Wilson – Augmented Learning – http://en.wikipedia.org/wiki/Augmented_learning
Klopfer – Gaming & Social Media – http://henryjenkins.org/2008/07/an_interview_with_eric_klopfer.html
Bruner – Cognitive Scaffolding – http://en.wikipedia.org/wiki/Instructional_scaffolding
Skinner – Mastery Learning – http://en.wikipedia.org/wiki/Mastery_learning
Rossett – Blended – https://sites.google.com/a/nau.edu/learning-theories-etc547-spring-2011/theorist/rossett

The philosophic approaches represented include Behaviorist (e.g., Gagné, Pike, Skinner), Constructivist (e.g., Schank), Constructivist/Social Learning (Snyder & Wilson), and Eclectic/Centrist (e.g., Merrill, Rossett).

Each row below pairs one of Gagné's events (or an emergent theme) with the corresponding concepts from the other theories, listed in the order they appeared in the source table; theories with no corresponding concept are omitted from a row.

Gain attention (need to know why): Attention; Goal; Goal; Describe/retrieve the goal/problem; Present goal; Relevant; Relevant; Relevant; Task/Problem; Law of the Learner (WIIFM); Relevant; Relevant

Stimulate recall of prior knowledge: Recall; Retrieve/adapt/store; Relevant; Activation; Law of the Lesson (from known to unknown)

Present the material to be learned: Present; Relevant to retrieval of analogy (story); Resource rich; Demonstration; Law of the Teacher (bring personal experience)

Provide guidance for learning: Provide guidance; See Zone of Proximal Development; See Zone of Proximal Development; Law of the Language (speak to be understood)

Elicit performance "practice": Practice; Retrieve/adapt/store; Integration; Application; Law of the Teaching Process (involvement); Expert-peer relationships; Expert-peer relationships; Mastery (fluency); Reasoning by analogy; Authentic environment; Law of the Learning Process (learning does not take place until behavior changes); Individuality – can provide unique scaffolding that is customized to the individual's path of investigation

Provide informative feedback: Feedback; Feedback; Feedback; Remediation; Expert-peer relationships; Mastery (fluency)

Assess performance (test/provide progress information): Assess progress; Immediate & related to real work; Immediate & related to real work; Expert-peer relationships; Mastery (fluency)

Enhance retention and transfer (retain): Retain; Retain/store (case/story); Retain; Integration; Law of Review and Application (show real life application); Immediate & related to real work

Just-in-time/real-time: Just-in-time (near the time of need); Just-in-time (near the time of need); On-demand real-time learning; Expert-peer relationships; Personalized mastery; Expert-peer relationships; Expert-peer relationships; Expert-peer relationships; Mastery (fluency)

Social dynamics: Learning is social; Learning is social; Zone of proximal development (learning from those around you); Expert-peer relationships; Expert-peer relationships; Expert-peer relationships; Expert-peer relationships

Portability – can take the computer to different sites and move around within a location: Learning is social

Social interactivity – can exchange data and collaborate with other people face to face: Learning is social

Context sensitivity – can gather data unique to the current location, environment, and time, including both real and simulated data: Relevance

Connectivity – can connect handhelds to data collection devices, other handhelds, and to a common network that creates a true shared environment: Learning is social

Individuality – can provide unique scaffolding that is customized to the individual's path of investigation: Expert-peer relationships


Appendix B: Standardization of Terms

The following list of key terms in instructional design was evaluated by ten TIfPI Practice Leaders/Instructional Designers. The evaluation pared the list down to 11 domains.

Alignment – Both the external alignment with sponsoring organization needs and the internal alignment of instructional elements

Assess performance

Business acumen – Designer shows knowledge of the business drivers behind their learning solution

Context sensitivity

Creativity and Ingenuity in problem solving

Elicit performance "practice"

Engagement – learner is actively engaged in learning process

Enhance retention and transfer

Expert-peer relationships

Feasibility (can it be done)

Feedback

Focus learner’s attention

Individuality

Just-in-time/real-time

Managing risk

Measure learning and impact of learning on the organization

Personalized mastery

Portability (ability to move between platforms)

Recall

Relevant to learner

Social dynamics (interpersonal relationships during learning)

Social interactivity

Sustainability (solution will be around for a while)

Zone of proximal development (learning from those around you)


Appendix C: Instructional Design Badge Definitions for Solutions Domain Badges

Target Audience: Entry- to mid-level professionals (1-5 years) developing solutions

The typical target audience member might be one or more of the following:

Content experts moving into instructional design by developing content and structures for their own

courses.

Recent college graduates with an instructional design or adult learning degree who are working with subject experts to develop content and solutions.

Individuals who have recently taken a workshop in instructional design or other training in adult learning

and who are now responsible for developing solutions.

Members of a large learning project team who bring a unique talent (e.g., technology, programming, technical writing, graphics, etc.) and who are now responsible for developing content and instructional elements.

Secondary target audience: Highly experienced professionals working with a new solution type

Seasoned professionals wanting to highlight their expertise in a given solution

Professionals who want an extra "edge" in the marketplace

Domains with performance measures (i.e., standards) to meet

The nine domains, or standards, for instructional designers and developers are as follows. These standards are all approximately equal in value, and no process order is implied. That is, these standards do not constitute a process model; rather, they are the various lenses through which IDs view their own and other professionals' work.

Each standard includes a brief definition and a list of performances expected in the execution of that standard.

Align Solution

Definition: Creates or changes relationships among parts of the solution (internal to the solution) or between the solution and its parent organization or sponsors (external to the solution).

Performances that demonstrate this domain for a Solution Development Badge:

Maps the instructional elements to defined project and audience requirements.

Sequences learning elements and content appropriately for defined learners.

Modifies planned instructional elements in order to make those elements more effective.

Selects appropriate content for the solution.

Maps content to appropriate instructional elements.


Assess Performance

Definition: Evaluates what the learner does within the learning environment using a specific set of criteria as the measure or standard for the learner's progress.

Performances that demonstrate this standard for a Solution Domain Badge:

Creates metrics or rubrics that guide the assessment of performance within the learning

environment.

Creates effective assessment tools(1) to support the assessment process.

Creates instructions for using the performance tools.

Pilot-tests tools to assure that the tool measures the appropriate performance.

Modifies tools based on feedback from pilot testing.

Ensures that resulting data drives feedback to the learner, to the instructor, to the sponsoring organization, or to the instructional design process for future modification.

(1) Assessment tools may include any technique to observe, track, measure, or record assessment (e.g., polls, surveys, self-assessments, tests, interactive activities in Elearning modules, checklists, observation worksheets, etc.)

Ensure Context Sensitivity

Definition: Considers the conditions and circumstances that are relevant to the learning content, event, process, and outcomes.

Performances that demonstrate this standard for a Solution Domain Badge:

Creates solutions that acknowledge:

o Culture – workplace, learner, language, society, work group, individual’s demographic

benchmarks (education, gender, age, disabilities, etc.)

o Prior experience

o Relationships to work -- the degree to which the learning content and activities reflect “real”

work and work tools (e.g., are we using genericized content designed only for learning

purposes or accessing working content that is maintained for work process purposes)

o Variability in content – that some content is more critical, more frequent, or more difficult.

Verifies that materials reflect the capabilities of the audience (e.g., readability – localization, plain language, global English, physical capabilities).

Maps to other learning opportunities.

Aligns content with learning objectives and desired outcomes.

Elicit Performance "Practice"

Definition: Ensures that the learning environment and practice opportunities reflect the actual environment in which the performance will occur.

Performances that demonstrate this standard for a Solution Domain Badge:

Creates practice opportunities that mimic work tasks and work processes.

Chooses elements of the “real” work environment, tools, and technology to include in the practice

learning environment.

Scripts steps and interactions.

Creates the full spectrum of support materials to ensure that learning occurs.


Engage the Learner

Definition: Captures and keeps the participant's attention and interest through active participation, practice opportunities, feedback, and reflection.

Performances that demonstrate this standard for a Solution Domain Badge:

Uses techniques that gain learner’s attention.

Provides opportunities for the learner to gain confidence through active involvement such as

discussion, practice, self-assessment, group activities, individual activities, etc.

Provides activities at the appropriate level for the audience.

Adjusts activity levels as learner gains skill and confidence.

Provides opportunities for constructive feedback appropriate to audience level.

Provides feedback techniques that give learners performance-specific information.

Provides opportunities for learners to give input on their learning experience, when appropriate.

Enhance Retention and Transfer

Definition: Ensures that the learning environment creates and measures recall, recognition, and replication of desired outcomes.

Performances that demonstrate this standard for a Solution Domain Badge:

Chooses elements of the “real” work environment, tools, and technology to include in the practice

learning environment.

Measures readiness for learning.

Triggers relevant previous experience.

Provides interim self-assessment or skill measurement opportunities.

Incorporates tools for on-the-job performance.

Provides opportunities for learner to integrate changed skills based on feedback.

Provides feedback techniques that give learners information relevant to enhancing performance,

retention, and transfer.

Ensure Relevance

Definition: Creates content and activities that address the learner's background and work experiences.

Performances that demonstrate this standard for a Solution Domain Badge:

Explains the needs of the learning audience and how the proposed solution addresses those needs.

Describes for the learner what the learning process and outcomes will be.

o Objectives

o Schedules

o Course outline

o Module structures such as overview, questions, content, review

Creates activities that connect learner’s previous experience and background to the learning process

and outcomes.

Ensures that feedback opportunities address the learner’s performance.


Address Sustainability

Definition: Considers the best usage of resources (time, money, materials, staffing, technologies, etc.) now and in the future.

Performances that demonstrate this standard for a Solution Domain Badge (one or more of the following):

Selects tools and methods that can be replicated at minimal costs and time.

Builds in techniques that allow subject experts and instructors to modify the learning solution

without requiring the solution to go through a complete revision cycle for each modification.

Recommends tools and techniques that improve the learner’s learning environment and better

match the learner’s needs.

Recommends tools and techniques that improve the learning solution’s cost effectiveness.

Leverages content, solution development processes, and solutions for reuse and lowest cost of

reproduction.

Develops solutions that can be turned over to a different team who will support or teach it over time.

Develops solutions that include planned future review cycles.

Remediates expensive one-time solutions with follow-up that allows learners to access elements of

that learning solution.

Explains improvements to the original learning design where such improvements created savings, improved learning, improved functionality, or generated better data for the sponsors.

Collaborate and Partner

Definition: Works jointly with sponsors and other members of the solution development team to develop the solution.

Performances that demonstrate this standard for a Solution Domain Badge:

Addresses sponsor’s issues and needs by listening to requests for modifications, offering solutions to

modification requests, and reporting progress.

Participates in the project team through:

o Identification of project issues

o Meeting attendance

o Regular reporting

o Generation of ideas to resolve issues, improve sustainability, and enhance the learning solution.

Negotiates changes to solution during development and solution testing.

Plans solution product tests that validate with the sponsor and intended audience that the right

solution elements have been developed.

Executes product tests including reporting results of tests.

Works with content experts to identify content, relevant work processes and procedures, and appropriate feedback and assessment techniques.


Appendix D: Learning Solutions

Consider the following list of solutions.

Asynchronous Elearning: Elearning solutions presented entirely online without any peer or instructor involvement; learning is directed by a computer and may or may not be modified to meet the needs of the learner.

Synchronous Elearning: Elearning solutions presented entirely online that include peer and instructor involvement through multiple mediums such as discussions, webinars, wikis, project spaces, etc.; learning can be modified by the instructor to meet the needs of the learner.

mLearning: Asynchronous Elearning provided for mobile devices such as cellphones and tablets.

Instructor-Led Training: Classroom-based learning led by an instructor or trainer where learning events may include other solution elements such as media, job aids, electronic performance support, games, etc.

Simulations: Online or classroom-based learning based on a scenario that accurately recreates real-world conditions within the learning environment, providing real-world experiences, supported by feedback, within the safety of the learning environment.

Serious Learning Games (aka Serious Games): Workplace and classroom game experiences that facilitate learning through interaction with peers, content, processes, and manipulative game pieces or interfaces. Serious games are designed specifically for learning serious workplace topics, processes, and skills, in contrast to edutainment games that provide rote recall for either children or adults. (Note: branded games such as Jeopardy, Clue, etc. are considered to be edutainment games, gamification, or practice activities only and will not be evaluated as Serious Learning Games.)

Electronic Performance Support Systems (EPSS): Electronic job aids that support work processes, where such tools may or may not also be used in training programs or may include learning elements within the learner's native workflow.

Job Aids: Tools that support learning and recall directly within the work environment, where such tools may include charts, diagrams, memory aids, videos, and more.

Social Media: Online learning facilitated by or delivered through social media sites.

Reusable Learning Objects (RLOs) (also called micro learning objects): Learning tightly focused to develop mastery of one specific skill or process step.

Goal-based/problem-based scenarios: Learning environments that mimic common problems or scenarios

within the work environment and provide resource rich learning support in order to solve the problem or

reach the goal.

Self-study: Learning structured not to use an instructor or facilitator, where the direction and timing of learning are guided by the learner working through text, media, online access, etc.

Blended Learning: Combinations of learning solutions, particularly solutions that incorporate both formal and informal learning or online and offline learning.

Community of Practice: A peer-to-peer community that addresses key issues or problems within its field of practice.

Coaching/Mentoring: A formalized program where participants can access coaches or mentors, and where coach, mentor, and participant have structured roles in order to accomplish organization- or role-specific learning.

Learning Videos: Videos designed to teach skills or processes.

Informal Learning: Formalized programs that encourage and track work-based learning from peers, superiors, customers, and the environment (on-the-job learning).