
USING QUALTRICS FOR ONLINE TRAININGS

Shalin Hai-Jew

Learning objectives

■ Review the features and functionalities in Qualtrics that enable its use in online trainings (“automated” in these cases)
– Think of a software tool in terms of its functionalities, not only its main designed intended usage (e.g., Qualtrics not only as a research suite but as an online training platform)
– Consider the “integrations” between Qualtrics and student information systems and human resources information systems, which enable accurate and large-scale record-keeping
■ Explore some important instructional design elements in online trainings [including for (1) policy compliance, (2) mass-scale trainings, and (3) customized trainings]

Learning objectives (cont.)

■ Review some core elements of online trainings
– Reflect on some real-world considerations when building an online training on Qualtrics
■ Propose some additional features to Qualtrics to enhance this targeted usage

SOME COMMON ONLINE TRAININGS

Online trainings…

■ are educational experiences designed to promote workplace awareness, skills, attitudes, and behaviors
■ are deliverable through web-based platforms to desktop computers, laptops, and mobile devices (“automated” in these cases)
■ include formative and summative assessments
■ are recorded in terms of learners and their performance (ungraded, pass/fail, numerical score, or others)
■ may include an attestation of commitment to certain attitudes, behaviors, or actions (“I attest…”)

Useful “soup to nuts” conceptualization of the online training experience

1. Identification of learners for particular trainings
2. Registration of learners into the correct tracks
3. Delivery of training
4. Assignments
5. Assessments
6. Record-keeping and verifiability (data integrity)
7. Training refreshers
8. On-demand trainings
9. Training updates
10. Digital file redundancy and protection of a “pristine master”

1. Online policy compliance trainings as a “use case”

■ Online policy compliance trainings…
– must accommodate both new hires and continuing workers who are experiencing skills decay (those who need refreshers)
– must accommodate updates based on new standards set by the regulatory agencies
■ These may be in the form of single digital learning objects (DLOs) and short courses
■ These may be longer learning sequences, with multiple elements and sequences

Online policy compliance training from the employer view

■ Employers…
– must ensure the accuracy, availability, and consistency of the respective trainings
– must ensure that the online trainings are accessible (Section 508); do not contravene intellectual property laws; do not infringe on privacy rights; and otherwise are legal
– must maintain accurate records showing who took what training when
■ In workplaces, there are usually pre- and post-training assessments to test for the efficacy of trainings (in the short term, mid-term, and long term)

2. Online mass-scale trainings as a “use case”

■ Online mass-scale trainings…
– must be widely accessible in a variety of distributed contexts and on a variety of technological platforms
– may be on any variety of topics

3. Customized online trainings as a “use case”

■ Customized online trainings…
– must be customizable with various pieces and parts based on learner needs

SOME AFFORDANCES IN QUALTRICS

A partial list of affordances

■ Open and distributed development teams
■ Registration and sign-up
■ Flexible scripting (and tactical uses of automations)
■ Assessment building and threshold setting
■ Additional “hidden” questions
■ Panels
■ Multimedia integration
■ Information system integrations
■ Accessibility features
■ Templating
■ Design features
■ Security features
■ Third-party tool integrations
■ Broad distribution
■ Data collection
■ Data extraction
■ Data analysis
■ Libraries
■ Data archival

Open and distributed development teams

■ “Collaborate” capability to enable virtual collaboration
– Authentication through e-mail-verified invitations
– Ability to control collaborators’ levels of access (edit, view results, activate/deactivate, copy, and distribute)

Registration and sign-up

■ Enablements to create sign-up forms for trainings
■ Ability to channel respective learners to different learning sequences through branching
– By professional role(s) and requirements
– By profile
– By performance
– By selection / choice

Flexible scripting

■ Variety of question types
■ Branching logic
– Respondent’s answer to a question
– Embedded data
– Specific device type used
– Defined quota
– GeoIP location
■ Piped text {a} (for respondent answer re-use, customized address of participants by name, and other customizations)
■ Panel triggers (auto-populated panels based on answers given, technologies used, performance achieved / scores, and others)
■ Email triggers (auto-generated emails on particular conditionals being met)

Flexible scripting (cont.)

■ Display logic (controlling users’ ability to see particular questions or answer choices based on conditionals)
■ Loop-and-merge to enable additional data capture based on user responses to particular questions (also “carry forward”)
■ Quotas to limit the number of responses to a question or a survey
■ Default answers (pre-set answers) to multiple-choice questions
■ Closed panel invitations (by verified email) and unique links for survey access
■ Custom messaging
■ Custom conclusions
■ Coding to track social media platforms (as sources of responses for open-access elicitations)
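To make piped text and email triggers concrete: below is a minimal sketch of an email-trigger message body built from Qualtrics piped-text tokens. The embedded-data field names (FirstName, Score) and the question ID (QID5) are hypothetical placeholders for illustration; Qualtrics itself resolves the ${...} tokens when the trigger condition is met, not the Python code.

# Hedged sketch: an email-trigger message body using Qualtrics piped text.
# FirstName, Score, and QID5 are hypothetical placeholders; Qualtrics
# resolves the ${...} tokens itself when the trigger fires.
trigger_email_body = """\
Dear ${e://Field/FirstName},

Thank you for completing the online training module.

Recorded score (stored as embedded data): ${e://Field/Score}
Your attestation response: ${q://QID5/ChoiceGroup/SelectedChoices}

If your score falls below the passing threshold, you will receive an
invitation to a refresher training.
"""

print(trigger_email_body)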

Assessment building and threshold setting

■ Use of the “Scoring” feature to enable application of points to questions and totaling
■ Ability to set threshold conditionals for learners
■ Ability to capture names, emails, and such about individuals who meet certain score criteria, in order to channel them to particular panels (which may then be contacted for other learning sequences, assessment re-takes, awarding of certificates, and so on)
■ Ability to randomize answers
■ Ability to create branches of different questions for subsets of respondents who respond a particular way to a particular question
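As an illustration of threshold setting applied after the fact: the sketch below reads a hypothetical CSV export of scores and flags learners who fall below a passing score. The column names (Email, Score) and the cutoff of 80 are assumptions for illustration, not Qualtrics defaults.

# Hedged sketch: flag learners below a passing threshold from a CSV export.
# Column names ("Email", "Score") and the cutoff are illustrative assumptions.
import csv

PASSING_SCORE = 80  # assumed threshold for illustration

def needs_refresher(export_path: str) -> list[str]:
    """Return emails of learners whose recorded score falls below threshold."""
    flagged = []
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                score = float(row["Score"])
            except (KeyError, ValueError):
                continue  # skip malformed rows or extra header rows
            if score < PASSING_SCORE:
                flagged.append(row["Email"])
    return flagged

if __name__ == "__main__":
    print(needs_refresher("training_responses.csv"))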

Additional “hidden” question types

■ Timing questions [the amount of time a respondent spent on a particular question, such as a question in which there was an iframed (inline-framed) simulation or game…or in which there was a video]
■ Meta information questions (web browser type, browser version, operating system, screen resolution, Flash version, Java support, user agent)

Panels

■ May be populated with emails from databases, surveys, and other sources (including manual entry)
■ May be populated in an automated way based on answers to particular questions, technology used, geographical location, performance on an assessment, or some other criterion or criteria
– May be used to send emails to particular subsets of respondents to a particular training
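Panels can also be populated programmatically. Below is a hedged sketch that adds one learner as a contact on a Qualtrics mailing list (panel) through the v3 REST API; the datacenter, API token, and mailing-list ID are placeholders, and the endpoint should be verified against current Qualtrics API documentation.

# Hedged sketch: add a learner to a Qualtrics mailing list (panel) via the
# v3 REST API. All credentials and IDs below are placeholders; check the
# current Qualtrics API documentation before relying on this endpoint.
import requests

DATACENTER = "yourdatacenterid"      # placeholder, e.g. your org's datacenter
API_TOKEN = "your-api-token"         # placeholder
MAILING_LIST_ID = "ML_XXXXXXXXXXXX"  # placeholder panel / mailing-list ID

def add_contact(email: str, first_name: str, score: float) -> dict:
    """Create one contact, carrying the training score as embedded data."""
    url = (f"https://{DATACENTER}.qualtrics.com/API/v3/"
           f"mailinglists/{MAILING_LIST_ID}/contacts")
    payload = {
        "email": email,
        "firstName": first_name,
        "embeddedData": {"Score": str(score)},  # usable later as piped text
    }
    resp = requests.post(url, json=payload,
                         headers={"X-API-TOKEN": API_TOKEN})
    resp.raise_for_status()
    return resp.json()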

Multimedia integration

■ Ability to seamlessly integrate links, imagery, audio, video (including through direct embed text linking), and other elements
■ File upload questions (ability to capture feedback in the form of uploaded files)
■ Ability to launch a live poll (with near real-time feedback) on a website

Information system integrations

■ Qualtrics has an application programming interface (API) which enables data exchange (but requires a developer and system administrator skill set to connect the data flows)
– May be linked to student information systems (on campus)
– May be linked to human resources information systems (on campus)
– May be linked to other databases
■ Requires administrative decision-making to actualize the connectivity

Information system integrations (cont.)

■ Enables fast and accurate recording of achieved trainings without “humans in the loop”
– Enables meeting legal standards for asserting that trainings were delivered
– Enables broad-scale summary data about provision of training
– Enables granular levels of drilling down to individual levels of performance (single records)
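Record-keeping pipelines like those above typically pull completed-response data over the REST API and push it into the campus information systems. Below is a minimal sketch of the v3 response-export workflow (start an export job, poll until it completes, download the zipped CSV); the datacenter, API token, and survey ID are placeholders, and the endpoints should be checked against current Qualtrics API documentation.

# Hedged sketch of the v3 response-export workflow: start an export job,
# poll until it completes, then download the zipped CSV. Credentials and
# the survey ID are placeholders; verify endpoints against current docs.
import io
import time
import zipfile
import requests

DATACENTER = "yourdatacenterid"   # placeholder
API_TOKEN = "your-api-token"      # placeholder
SURVEY_ID = "SV_XXXXXXXXXXXX"     # placeholder
BASE = f"https://{DATACENTER}.qualtrics.com/API/v3/surveys/{SURVEY_ID}"
HEADERS = {"X-API-TOKEN": API_TOKEN}

def download_responses(dest_dir: str = ".") -> None:
    # 1. Start the export job.
    start = requests.post(f"{BASE}/export-responses",
                          json={"format": "csv"}, headers=HEADERS)
    start.raise_for_status()
    progress_id = start.json()["result"]["progressId"]

    # 2. Poll until the job reports completion.
    while True:
        check = requests.get(f"{BASE}/export-responses/{progress_id}",
                             headers=HEADERS)
        check.raise_for_status()
        result = check.json()["result"]
        if result["status"] == "complete":
            file_id = result["fileId"]
            break
        time.sleep(2)

    # 3. Download and unzip the exported CSV.
    data = requests.get(f"{BASE}/export-responses/{file_id}/file",
                        headers=HEADERS)
    data.raise_for_status()
    zipfile.ZipFile(io.BytesIO(data.content)).extractall(dest_dir)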

Accessibility features

■ Ability to check the accessibility of a survey (Advanced Options)
– Some questions, because of their technological structure, are inherently inaccessible since they cannot be made coherent by a screen reader
■ Mobile accessibility features
– Visual Preview capability to demo a small-screen view
– Mobile skinning for design
– Intuitive builds, such as stacking related images vertically vs. horizontally
■ Ability to add alt text (alternate-text annotation metadata) to imagery used in a training or survey

Templating

About Templates

■ Elements of an online training may be built into a training template
– Templates may be used and re-used to help structure online trainings
– Templates ensure that trainings are uniform and as comprehensive as possible
■ Templates, like project stylebooks (aka “projects of work”), are generally designed in a group-based, consensual way based on the needs of the project and the varied expertise of the development team members
– Templates require a coherent look-and-feel skinning as part of the template design

Templating (cont.)

Templating in Qualtrics

■ Ability to create templates (reusable forms or patterns) as “blocks” or full “surveys”…and templates may be archived in the Library, from which they may be copied out for use
– Captures the sequencing and scripting as well
– “Panels” (reached by “panel triggers”) do not transfer, though, and will need to be re-created in each new instance of a template-based training

Design features

■ Ability to create a unique and unified look-and-feel (with various skins and customized logo editing)
– Some themes will disable some tools
■ Enables minimized designs for mobile
– Need to stack correctly sized images vertically instead of horizontally
– Need to stack tables vertically instead of horizontally
– Need to size images properly for initial viewing but with sufficient resolution for increased detail with enlargement or zooming in

Security features

■ By-invitation-only access (closed survey offerings by email, or by navigating to a training from a designated webpage / URL)
■ Password protection (using the Text Entry question type)
■ Hiding public surveys from spiders / web crawlers
– The prevention of automated indexing for web findability
■ Enabling context-based memory for users (“Save and Continue”), enabling stopping and re-starting
■ Ability to turn off IP (Internet Protocol) address collection; ability to fully anonymize responses [even from the researcher(s)]
– Irrecoverable anonymization vs. single-blind approaches and researcher maintenance of confidentiality and protection of data

Security features (cont.)

■ Prevention of “ballot box stuffing”
– Tamper-proofing responses based on IP tracking
■ CAPTCHAs (Completely Automated Public Turing test to tell Computers and Humans Apart) to protect against automated ‘bot responses (a common way that people try to “stack the deck” for online surveys)
– Human-readable CAPTCHAs
■ Survey expiration by date, quota completion, or other conditional
■ Ability to control visibility of questions and answers based on user-provided information and / or user behavior (question display logic)

Third-party tool integrations

■ Google Translate for automated slide-by-slide translation
– Ability to re-upload corrected non-English slides
– Highly advisable to have a native speaker review all machine translations for accuracy and to make necessary corrections

Broad distribution

■ Easy pilot-testing features, with both back-end data analytics and direct elicitation of responses from participants
■ Ability to reach a broad range of respondents
– Ability to access for-pay respondents to surveys

Data collection

■ Wide range of possible question types and resulting information
■ Fast automated reports of learner activities
■ Easy download of quantitative, mixed-methods, and qualitative data for analytics

Data extraction

■ Data extraction in varying readable formats (Word, PowerPoint, Excel, and PDF) to enable analysis through other tools
■ Subgroup data extraction (based on demographic data or answers to particular questions)
■ Formatted summary report (with built-in tables)

Data analysis

■ Ability to recode values at any point in the data collection process
■ Built-in item analysis of the assessment
■ Built-in “Reporting / Survey Statistics” feature for analysis of overall interactions of learners with the training
■ Built-in cross-tabulation analysis of special types of responses (variables)
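To make “item analysis” concrete: the two standard statistics are item difficulty (the proportion of learners answering an item correctly) and item discrimination (how well an item separates high scorers from low scorers). The sketch below computes both from a hypothetical 0/1-scored response matrix; it illustrates the statistics themselves, not Qualtrics’ built-in report.

# Hedged sketch of classic item analysis on a 0/1-scored response matrix.
# Rows are learners, columns are items; this illustrates the statistics,
# not Qualtrics' own built-in implementation.
from statistics import mean, pstdev

def item_analysis(responses: list[list[int]]) -> list[dict]:
    totals = [sum(row) for row in responses]
    results = []
    n_items = len(responses[0])
    for j in range(n_items):
        item = [row[j] for row in responses]
        difficulty = mean(item)  # proportion correct (higher = easier)
        # Point-biserial discrimination: correlation of the item with the
        # total score (a crude but standard index).
        sd_item, sd_total = pstdev(item), pstdev(totals)
        if sd_item == 0 or sd_total == 0:
            discrimination = 0.0
        else:
            cov = mean(i * t for i, t in zip(item, totals)) - mean(item) * mean(totals)
            discrimination = cov / (sd_item * sd_total)
        results.append({"item": j + 1,
                        "difficulty": round(difficulty, 2),
                        "discrimination": round(discrimination, 2)})
    return results

# Example: four learners, three items.
print(item_analysis([[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 0]]))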

Libraries

User libraries

■ Ability to store (and retrieve) user-created questions, blocks, and surveys
■ Ability to create one’s own training templates for re-use
■ Ability to access shared resources in shared libraries in Qualtrics

Qualtrics libraries

■ Ability to access Qualtrics’ survey templates for reworking and re-use

Data archival

Offline

■ Ability to download a survey (.qsf file) and its related information (.csv) for reconstitution online later
■ Ability to download auto-created reports (for fast data skimming)
■ Ability to download data tables related to each question (an optimal way to extract data for analytics)

Online

■ May keep a survey with linked data in the active “My Surveys” area
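One way to operationalize “digital file redundancy and protection of a ‘pristine master’” (item 10 in the earlier conceptualization) is to record checksums of the downloaded .qsf and .csv files at archival time and re-verify them later. A minimal sketch, with the archive directory as a placeholder:

# Hedged sketch: record and re-verify SHA-256 checksums of archived
# training files (.qsf survey definition, .csv data) so that corruption
# of the "pristine master" is detectable. Paths are placeholders.
import hashlib
import json
from pathlib import Path

def checksum(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def write_manifest(archive_dir: str, manifest: str = "manifest.json") -> None:
    files = [p for p in Path(archive_dir).iterdir() if p.is_file()]
    digests = {p.name: checksum(p) for p in files if p.name != manifest}
    (Path(archive_dir) / manifest).write_text(json.dumps(digests, indent=2))

def verify_manifest(archive_dir: str, manifest: str = "manifest.json") -> bool:
    recorded = json.loads((Path(archive_dir) / manifest).read_text())
    return all(checksum(Path(archive_dir) / name) == digest
               for name, digest in recorded.items())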

Main Qualtrics affordances for the three “use case” types raised

1. Policy compliance training features: Easy updatability; easy performance recording, with API integrations to connect to databases; information integrity features
2. Mass-scale training features: Easy delivery with a URL (uniform resource locator) and an open-access setting; easy delivery with a closed-access setting; efficient data collection; some built-in data analytics features
3. Customized training features: Ability to use performance in a prior training or assessment to surface particular learning sequences (with scripting capabilities); other customizations (with piped text {a})

FROM AN INSTRUCTIONAL DESIGN POINT-OF-VIEW

Seven general instructional design focuses

1. Legal requirements
2. Pedagogy / andragogy
3. Digital content creation
4. Assignments
5. Assessments
6. Technologies
7. Pilot-testing and revision

Design: 1. Legal requirements

“Authorizing” Regulatory Agency and Related Regulation

■ Who is the “authorizing” regulatory agency, and what are its main responsibilities and areas of concern? (if relevant)
■ What is the regulation and / or policy under which the training is being created? (if relevant)
■ What outcomes does the regulatory agency want to see? (if relevant)

Design: 1. Legal requirements (cont.)

Intellectual Property

■ Who owns copyright to the digital contents (imagery, articles, slideshows, videos, simulations, games, and other elements)?
– If copyright can be traced to an owner (not orphaned works), is it practically possible to acquire copyright releases to enable use of the respective works?
■ Is there proper documentation of such rights releases to the team?
– If the contents are available through a Creative Commons licensure, does the source actually have the rights to release the contents under CC licensure? (Are there other potential owners based on a web search and a TinEye check?)
– Is it possible to link / embed text to digital contents (if the source is sufficiently stable)?

Design: 1. Legal requirements (cont.)

Privacy Protections

■ If original images, audio recordings, video recordings, and such are used, were the correct media releases signed?
■ Were the media releases properly acquired? (No minors. No coercion. No excessive promise of rewards.)
■ Does the team have the records?
■ Is there any other possible sense of “trespass” on others’ rights?

Design: 1. Legal requirements (cont.)

Legal Publication

■ Is any part of the messaging potentially libelous?
■ Is any part of the messaging potentially defamatory?

Design: 1. Legal requirements (cont.)

Accessibility

■ Is the online learning fully accessible?
– Are all images modified with properly informative alt text (readable by screen readers)?
– Are all audio files transcribed (optimally with timed text)?
– Are all video files transcribed (optimally with timed text or closed captioning)?
– Are all data tables properly structured?
– Is color used in an accessible way (augmented by text descriptors)?
– Are digital files all in universal product formats (to be as transcodable as possible)?
– Are text documents tagged in hierarchical formats?
– Is the English simple and clear?
– Are automations and sequenced actions under user control?

Design: 1. Legal requirements (cont.)

Data Handling

■ Have the learners been notified of what data will be collected, stored, accessed, and handled?
■ Will only the necessary data be collected?
■ Will the learner data be stored, accessed, and handled in a way that protects the learners?

Design: 2. Pedagogy

Practice of Teaching

■ What are the main purposes of the training?
■ What are the main learning objectives?
■ Who are the learners?
■ How may the diverse learners’ needs be met?
■ What may be assumed about what the learners know already? (Bayesian knowledge tracing)
– If they may have mistaken information, how may that be addressed? If there are challenging attitudes towards the training topic, how may those be addressed?

Design: 2. Pedagogy (cont.)

Practice of Teaching (cont.)

■ What are the most difficult concepts / practices / attitudes in the training? How may these best be mitigated?
■ How should assignments be created so that they are real-world, practicable, and memorable?
■ How may “negative learning” and misconceptions be avoided in the training? (How will trainers know whether negative learning is happening in the training and information collection process?)
■ What “cognitive scaffolding” may be employed (either statically or dynamically)?
– For novices (those new to the topic)? For amateurs (those who do not plan to go farther in the field)? For experts (those highly trained in the field but still needing to maintain certification)?

Design: 2. Pedagogy (cont.)

Practice of Teaching (cont.)

■ Culturally, what messages might be off-putting or offensive (and therefore to be avoided)? What messages and rationales might be appealing (and therefore to be used)?
■ What is an optimal way to sequence the learning? Linear? Non-linear sequencing?
– What important points should be reinforced? How?
■ What informational graphics may be employed? Maps? Visuals? Audio? Video? Games?
■ What do the learners need to know to successfully apply the information from the training?

Design: 2. Pedagogy (cont.)

Practice of Teaching (cont.)

■ How will they use the information in decision-making in real life?
■ What teaching / training design may be most effective in reaching these learners? What technologies might be most effective? Why? (What are all the practical options?)
■ What learning techniques might be most effective?
■ What level of language should be applied?
■ If the training is offered in multiple languages, which other languages should be used? How will the correctness of that language be checked?

Design: 2. Pedagogy (cont.)

Practice of Teaching (cont.)

■ How should the assessments be designed to best test for knowledge, attitudes, and skills?
■ How much presence should the trainer have in the training, and in what form(s)? Imagery, statements, audio, video? Live interactions?
■ Is there a social learning component to the training? If so, how may that be set up? How should negative learning (introduced by untrained co-learners) be addressed / corrected?
■ How will trainers “surveil” and critique their training to ensure that it is functioning properly and serving the needs of the learners (and other stakeholders)?
■ How often will the training be updated? What criteria will be applied to the choice to update or not?

Design: 2. Andragogy

Adult Learning

■ How may respect for the adult learners be conveyed?
■ What are the grounds for the credibility of the training organization and the trainer(s)?
– How is the credibility of the trainer and training organization conveyed?
■ How may the learners’ differing motivations for learning be harnessed? How are adult learners’ practical needs being met by the training?
■ How may the training be designed to compete for learner attention (which is assumed to be in short supply)?
– How may learners be kept engaged and attentive during the training?
– How are expectations for learner instant gratification met?
■ How may the training convey learner responsibility and agency (and an internal locus of control)?

Design: 2. Andragogy (cont.)

Adult Learning (cont.)

■ How may the training be delivered in the most effective way possible?
– In the most focused and briefest way possible?
■ How may learning preferences be accommodated (with multimodal design and delivery)?
■ What sequence(s) of learning is most effective for the learning?
■ Given people’s strengths in visual processing, how may those be tapped with informative imagery (photos, diagrams, timelines, etc.) and video?
■ Given how people process information (based on research in human attention, perception, cognition, memory, and learning), how may the online training be best offered?
– Empirically, based on pilot testing, what works and what doesn’t?

Design: 2. Andragogy (cont.)

Adult Learning (cont.)

■ How may human cognitive biases be adjusted for?
■ What are the best ways to ensure transferability of the learning to the respective learners’ differing contexts?
– How can that transferability of awareness, learning, and decision-making be observed and recorded?
■ What are ways to ensure that the new learning actually “takes” and may be applied effectively over longer time horizons?

Design: 3. Digital content creation

Training Scope

■ What is the scope of the training? What contents should be covered?
■ How does the training interrelate with other trainings? What is covered in other trainings, and how can learners segue between the various other trainings?

Sourcing

■ Where can the expertise be acquired?
■ What extant learning contents are there?
– What needs to be developed, and from which respected source information? What source citation methods should be used?
– Are there free and open-source contents released under Creative Commons licensure or into the public domain?

Design: 3. Digital content creation (cont.)

Project Stylebook

■ What is the delivery system for the training? What are the features of this technology system? What are its requirements for effective delivery?
■ What types of digital learning objects and other (digital and analog) deliverables will be created in the learning (based on pedagogical reasoning and within the limits of the budget and deadlines)?
– Slideshows, articles, photo albums, serious games, scenarios, podcasts, video, and others
■ What is the look-and-feel of the training? What is the language? How is the training styled? What are points of consistency that have to be followed? What are cultural considerations in the styling?

Design: 3. Digital content creation (cont.)

Project Stylebook (cont.)

■ What standards need to be applied to each of the different types of digital learning objects? How will the dev team know that this threshold has been met?
■ What metadata (information about information) should be included with the digital learning contents?
■ What accessibility mitigations will need to be applied?
■ What technological standards need to be met by the various individual digital objects? The digital learning objects? The learning sequences?

Design: 4. Assignments

■ What work may be assigned to the learners that would enhance their understandings?
– Which may be required? Which may be optional? What are the differences between required and optional assignments in online trainings?
■ How will this assigned work be monitored, if possible? If not, how will learners acquire accurate feedback on their assignment performance?
– How can negative learning / misconceptions be headed off?
■ How may assignments be designed to be practicable and reasonably priced for distributed learners?

Design: 4. Assignments (cont.)

■ Is it possible to create virtual assignments / scenarios / cases / experiences?
– How may these be tracked for further insights to enhance the learning for the learners? The design for the trainers?
■ Are there social elements to the assignments? If so, how may these social elements be integrated into the work?

Design: 5. Assessments

■ What types of assessments would be familiar to the individuals taking the training?
■ What formative assessments will be used? What summative assessments will be used? How will assessments reinforce the training?
■ How do assessments test for the following:
– knowledge acquisition
– attitude or attitude change or attitude maintenance
– knowledge / skill / attitude
– judgment and decision-making
– long-term knowledge or skill or attitude retention
– learner agency

Design: 5. Assessments (cont.)

■ How are the assessments designed not to create excess anxiety (which raises the cognitive load) for the test taker but still convey the seriousness of the assessment?
■ How is performance recorded?
■ How is feedback from the assessments returned to the learners?

Design: 6. Technologies

■ Which technologies will be used? Why?
– Learning management systems, immersive virtual worlds
– Wikis
– Blogs / web logs
– Content-sharing social media platforms
– Authoring tools
– Photo editing tools
– Drawing tools
– Audio editing tools
– Video editing tools
– Screen capture tools
– Screenshot tools
– Animation tools (occasionally)
– Geographical mapping tools
– Data analytics tools
– Data visualization tools, and others

Design: 6. Technologies (cont.)

■ What “raw” files will be used? What “processed” files? What is the sequence of work, and how does that inform what files are used when?
■ How can the digital contents be “future-proofed” as much as possible? How can files be stored in proper archival formats to protect against future inaccessibility (as when file formats go extinct)?
■ How will Section 508 accessibility be ensured in the design and development?

Design: 7. Pilot-testing and revision

■ In an automated learning context, an online learning sequence has to be stand-alone, comprehensive, and understandable for a broad range of learners…
■ Beta (β) testing involves pilot testing with individuals who are outside the development team:
– Which individuals may be tapped to pilot-test the training? How can such individuals be tapped close-in (as a convenience sample)? How can individuals be included from more distant contexts?
– If the training will reach those who do not have English as a first language, would it be valuable to include people from those backgrounds? Will it be important for them to test a different-language version of the training?
– Would it be valuable to include people with known accessibility challenges to test for accessibility?

Design: 7. Pilot-testing and revision (cont.)

■ What basic training assessment questions will be asked of the pilot testers? How will that information be practically collected?
– What non-obvious or implicit testing will there be? How will that be handled?
■ What opinion-based testing will be used in this phase? How will such feedback be incorporated?
■ How will the suggestions be considered and then applied to revisions (to the training)?

SOME BASIC INSTRUCTIONAL DESIGN APPROACHES (and implied tenets)

Some basic instructional design approaches

■ Co-create a project stylebook as a team, based on the project requirements and the expertise of each of the team members
– “I will need images to be sent to me with these parameters…and this metadata…and these naming protocols…”
– Create templates to capture requirements where possible. Update as needed.
■ Assess the learning context. Take an audit of available learning resources in public space.
■ Review specified requirements of the training. Conduct necessary research for the instructional design.
■ Design the learning in depth within the limits of the context (technologies, budget, time, and expertise). Vet the design in depth. Run this by the project PI(s).

Some basic instructional design approaches (cont.)

■ Pre-build the elements that will be used in the online training.
– Photos, diagrams, illustrations, audio, video, slideshows, and others
■ Paper-prototype one of the learning objects (or learning object types). Critique. Revise the stylebook based on evolving insights.
– It is much lower-cost to spend effort on design than on development and then have to change course.
■ Create the training in Qualtrics. Script the sequencing and learning paths. Script the behaviors of the respective segments. Add page breaks. Alt-text the imagery. Add timed text to audio and video files. Test the various behaviors on various browsers. Test the data capture and data downloading and data analytics.

Some basic instructional design approaches (cont.)

■ Alpha-test (α) the training within the dev team for…
– accuracy of facts in the content domain
– clarity of imagery, language, and writing
– technological functioning, for sequencing and learning paths, for scripted behaviors, for file formatting and parameters (and interactivity between technological systems)
– accessibility (behavior with screen readers, etc.)
– legality and rights releases
– meeting defined learning objectives (based on authorizing documents and project PIs) for the broadest range of learners

Some basic instructional design approaches (cont.)

■ Beta-test (β) the training with members outside the team (preferably a small group similar to the real-world learners for the training) for…
– content clarity (in all modes: text, imagery, audio, video, and others)
– optimal learner experiential sequence: priming for the topic -> cognitive scaffolding -> presentation of contents -> formative assessment -> practice -> summative assessment -> downloadables -> attestation -> completion
– learning efficacy (various methods: observation, assessments, debriefing)
■ Based on the feedback, revise and / or version and / or re-version.

Some basic instructional design approaches (cont.)

■ Archive all raw files. Archive project documentation. Archive all processed files.
■ Go for the simple build. There will be plenty of complexity in the contents and in the learners.
– Make sure that the principal investigator (PI) or faculty member or administrator who inherits the training is sufficiently comfortable with the technology to be able to manage it.
■ Back up the training with a master training.
■ Launch the online training. Continue to monitor the training for areas for improvement. Maintain intercommunications with the learners in the training.

GENERAL CORE COMPONENTS OF AN ONLINE TRAINING

General core components of an online training

■ Title
■ Overview of learning objectives
■ Overview of topics covered
■ Authorizing regulation or policy (if relevant)
■ Estimated length of training (time commitment)
■ Content (information): Text, images, stories, audio, and video (with source citations)
■ Decision-making and walk-throughs of real-world scenarios
■ Assignments (actionable)
■ Formative questions and answers
■ Summative assessment (with results to the learner)
■ Attestation
■ Feedback from learners about the training and assessment (for continuing improvement)

General core components of an online training (cont.)

■ Contact information of the training provider
■ Downloadables and takeaways
– Checklists
– Tip sheets
– Posters
■ Learner notes (original, digitally captured)
■ Memory enhancers like mnemonics, images, and stories
■ Refresher contents (post-training)
– Online sites
– Scenarios
– Decision walk-throughs
■ Encouragement to practice-practice-practice
■ Support and direction to continue learning (through other reputable sources)
■ Descriptive metadata (to help relate / tie the current training to other trainings and training sequences)

SOME WANTS RE: QUALTRICS (for online training creation and delivery)

…while I’m at it… :)

Some wants re: Qualtrics

■ Built-in features that enable broad use of Qualtrics for online trainings
– Machine learning analytics
– More sophisticated item analysis (for learning)
– More sophisticated scoring rules
■ Easier packaging and sequencing of trainings
■ Higher thresholds (numbers of participants) for survey delivery without super-user access
■ A branch of an online community focused on using Qualtrics as a teaching and learning platform [without the full functionality of a learning management system (LMS) but with more features than a microblog or wiki site]
■ A way to conduct full transfer and / or sharing of panels to other collaborator accounts

Some wants re: Qualtrics (cont.)

■ A way to randomize questions (as a subset) from a select set of questions
■ A way to enable “email verification” of users to verify identity and related online training participation
■ A way to archive a training with its data online in Qualtrics (instead of taking up space in the active survey area, loaded surveys may be concluded and stored within a user’s Qualtrics Library)
■ A way to store accompanying private / protected files (such as copyright releases, media releases, proposals / statements of work, authorizing documents, raw files, and other backup data) with an online training for proximate record-keeping
■ A set of designed training templates in the Qualtrics Library for broad customer use

Conclusion and contact

■ Dr. Shalin Hai-Jew
– Instructional Designer
– iTAC, Kansas State University
– 212 Hale – Farrell Library
– [email protected]
– 785-532-5262
■ The presenter has no formal tie to Qualtrics.
■ This slideshow was created as part of on-campus work.