TRANSCRIPT
Assessing & Improving Caller Experience
Greg Simsar, VP Speech Services
Eduardo Olvera, VUI Architect
Why We Are Here
• To empower you with the knowledge, confidence & inspiration to improve your company’s caller experience
How We’re Going to Empower You
• Sharing what we know
• Learning by doing
• Getting you fired up!
Agenda
• Caller experience & why it matters
• Benchmarking, best practices & standards
• Methods & tools for assessing & improving caller experience
• “Hands-on” exercises
• “Selling” caller experience
• Takeaways
Caller Experience Defined
• Caller experience – everything the caller experiences ‘over the phone’ when contacting a company for any reason
• Focus for this seminar will be (but not limited to) the caller experience with call center automation – call routing (ACD), self-service (IVR), and computer telephony integration (CTI)
Why Caller Experience Matters
Why Caller Experience Matters
• What callers think about caller experience
• What callers think about speech recognition
• Why it matters
• Good caller experience is good business (have your cake and eat it too)
What Callers Think
[Chart: “Which method for doing business with a company is described by the phrase?” – differences marked where statistically significant]
Speech systems are perceived to deliver a higher level of benefits than keypad entry systems.
What Callers Think
If speech were to replace keypad entry, how would that affect your overall experience?
70% feel their overall experience would be improved if speech technology were used instead of touch-tone menus.
[Chart: Very much improve 23%; Somewhat improve 47%; Not improve 30%]
What Callers Think
How influential is customer service on your perception of the company?
88% reported that the quality of customer service is “very influential” or “influential” on their perception of a company.
[Chart: Very influential 62%; Influential 26%; Somewhat influential 10%; Not very influential 1%]
Why it Matters
76% of customers say they have stopped doing business with a company because of poor customer service.
Why it Matters
Satisfaction is split
In general, how satisfied are you with your interactions with customer service?
[Chart: Completely satisfied 11%; Somewhat satisfied 38%; Neither satisfied nor unsatisfied 12%; Somewhat unsatisfied 29%; Completely unsatisfied 10%]
Satisfied – 49%; Dissatisfied – 39%
Why it Matters
Methods used for contacting customer service most often:
[Chart: Speak with a customer service representative on the telephone 70%; An automated telephone system 16%; Speak with a customer service representative in person 5%; Online support using the company's website 5%; Online support using email 3%; Online support using instant messaging or chat 1%]
86% choose the phone for contacting companies. The phone remains the dominant method of contacting customer service.
Why it Matters
• Caller ‘choice’ and automation usage are not mutually exclusive.
• Customers prefer automated interactions if their hold time for agents exceeds 2 minutes.
• “60 percent of customers favor an automated option for many types of simple interactions; the rest said they didn’t mind being presented with an automated option as long as they could connect with a live agent if they wanted one.” [McKinsey]
Have some cake and eat it too!
Agenda
• Caller experience & why it matters
• Benchmarking, best practices & standards
• Methods & tools for assessing & improving caller experience
• “Hands-on” exercises
• “Selling” caller experience
• Takeaways
Benchmarking, Best Practices & Standards
• Available caller experience benchmarking
• Caller experience best practices
• Caller experience standards
• The gethuman™ standard
• None of these substitute for assessing and improving your company’s caller experience
Caller Experience Benchmarking
• No ‘industry’ benchmark
• Commercial benchmarks exist and are useful: Sterling Audits, BenchmarkPortal, EIG
• Customer experience vs. Voice User Interface (VUI) design benchmarking
Caller Experience Best Practices
• Agreed-upon customer experience ‘best practices’
– Widely accepted by the customer interaction design community
– Basic, common sense
• Best practices vs. ‘conventional wisdom’
• Customer experience (interaction) best practices vs. VUI design best practices
Caller Experience Standards
• Best practices vs. standards
• No true customer experience ‘standards’ exist
• No true VUI design standards exist
• The gethuman™ standard
What is gethuman™?
• “The gethuman project is a consumer movement to improve the quality of phone support in the US.” [gethuman.com]
• Originally “The IVR Cheat Sheet” started by Paul English
• Hosts a website: gethuman.com
• “One million consumers” [gethuman.com]
• The gethuman 500 database - lists companies, grades them, and tells you how to reach a human
What is the gethuman standard?
• “A specification for how customer service phone systems and support should work.” [gethuman.com]
• Version 1.0 published 18 October 2006
• Not really a “standard” – not approved by any formal standards body or industry organization
• “Contributing” organizations: Microsoft, IBM, Avaya, Genesys, Nortel, Syntellect…
• 84.4% of the 500 received an F, and fewer than 2% of the 500 received an A score [and they don’t have any automation]
Should you care?
• After all…
– “It’s not really a standard”
– “Just how big is this movement anyway?”
– “I’ve got plenty of company”
• The real answer is yes – if you care about customer experience
– The “standard” is just “common sense” when it comes to customer experience
– Agrees with “everyone’s” best practices for customer interaction design
• And don’t forget: good caller experience is good business (eating cake)
The gethuman standard
1. The caller must always be able to dial 0 or to say "operator" to queue for a human.
2. An accurate estimated wait-time, based on call traffic statistics at the time of the call, should always be given when the caller arrives in the queue. A revised update should be provided periodically during hold time.
3. Callers should never be asked to repeat any information (name, full account number, description of issue, etc.) provided to a human or an automated system during a call.
4. When a human is not available, callers should be offered the option to be called back. If 24 hour service is not available, the caller should be able to leave a message, including a request for a call back the following business day.
5. Speech applications should provide touch-tone (DTMF) fall-back.
6. Callers should not be forced to listen to long/verbose prompts.
The gethuman standard
7. Callers should be able to interrupt prompts (via dial-through for DTMF applications and/or via barge-in for speech applications) whenever doing so will enable the user to complete his task more efficiently.
8. Do not disconnect for user errors, including when there are no perceived key presses (as the caller might be on a rotary phone); instead queue for a human operator and/or offer the choice for call-back.
9. Default language should be based on consumer demographics for each organization. Primary language should be assumed with the option for the caller to change language. (i.e. English should generally be assumed for the US, with a specified key for Spanish.)
10. All operators/representatives of the organization should be able to communicate clearly with the caller (i.e. accents should not hinder communication; representatives should have excellent diction and enunciation).
Pop quiz
• Name that gethuman standard
• How’s this…
The gethuman “gold” standard
1. While holding, allow callers to disable hold music; remember their selection for future calls.
2. If ads or promotions are played, allow users to disable them.
3. Allow callers, where appropriate, to identify themselves via caller ID and a securely defined PIN, instead of being required to enter long account numbers each call.
4. Default to preferred language based on caller ID.
5. Support and publicize individual toll-free numbers for individual languages.
6. Allow callers to access audio transcriptions of their calls via the organization's website.
7. Call back the caller at the time that he/she specified.
Exercise: How do you measure up?
• You: rate your company
• Group: compare ratings and see how you measure up
Agenda
• Caller experience & why it matters
• Benchmarking, best practices & standards
• Methods & tools for assessing & improving caller experience
• “Hands-on” exercises
• “Selling” caller experience
• Takeaways
Methods & Tools for Assessing & Improving Caller Experience
• Assessment & Improvement lifecycle
• “Be the caller”
• Call Monitoring & Recordings
• Evaluative Usability Studies
• Surveys
• Statistics & Reports
• Speech vs. Touchtone
Assess, Improve & Do It Some More
[Diagram: Assessment & Improvement Lifecycle – Assess → Improve → Implement → Production, and repeat]
“Be The Caller”
• Critical shift from focusing on ‘business’ requirements to ‘caller’ requirements
• The simplest way to do this is just pick up the phone and call your company – “be the caller”
• Good first step:
– It’s “free”
– Shifts your perspective
– Some immediate improvements may surface
– Can get you fired up!
“Be The Caller”
• Self-Survey
– What do you think of the experience?
– Did you accomplish your task?
– Was it easy to do?
– If not, why not?
– What did you like?
– What didn’t you like?
– What would you do to improve the experience?
– What impression of your company was created by the experience?
Call Monitoring & Recordings
• Easy to do if…
– IVR ports are configured on your call recording platform
– IVR ports are configured on your ACD
• And if not…it’s critical to enable one or both capabilities
• Allows you to hear ‘both sides’ of the conversation
• If you monitor the quality of your CSRs, why wouldn’t you want to monitor the quality of your IVRs?
Call Monitoring & Recordings
• Excellent next step:
– In the Pareto (80/20) sweet spot – low cost with very high reward
– Completes your perspective shift
– Will definitely surface areas for improvement
– Will definitely get you fired up!
• As with CSRs, consider monitoring on a periodic basis
Call Monitoring & Recordings
• How to go about it
– Typically one day of monitoring will do
– 100 calls is a good “rule of thumb” sample size – adjust for application complexity and number of user profiles
– If you can monitor, you can record too – tools of the trade revealed
– Take notes, highlighting trouble spots and observations (organize and tabulate later)
– Organize and tabulate calls based on notes & recordings – goal is a prioritized list of issues and opportunities for improvement
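The tabulation step above can be sketched in a few lines of Python. The note format and issue tags here are hypothetical, a minimal sketch of turning raw monitoring notes into the prioritized list of issues the seminar describes:

```python
from collections import Counter

# Hypothetical monitoring notes: one (issue tag, free-form detail) entry
# per observation taken while listening to a sample of calls.
notes = [
    ("misrecognition", "caller said 'agent', system heard 'payment'"),
    ("long_prompt", "main menu runs ~45 seconds before first option"),
    ("misrecognition", "account number rejected three times"),
    ("no_dtmf_fallback", "caller on speakerphone could not use touch-tone"),
    ("long_prompt", "disclosure read in full before every transfer"),
    ("misrecognition", "caller name capture failed, caller hung up"),
]

# Tabulate: count how often each issue appears across the call sample,
# then sort by frequency to produce a prioritized list of improvements.
issue_counts = Counter(tag for tag, _ in notes)
prioritized = issue_counts.most_common()

for issue, count in prioritized:
    print(f"{issue}: observed in {count} call(s)")
```

The most frequently observed issue surfaces at the top of the list, which is exactly the prioritization the tabulation step is after.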
Evaluative Usability Studies
• Getting inside the caller’s head
– Interview callers on their usage patterns and characteristics
– Observe callers using the system
– Interview callers on their experience with attempted tasks
– Survey callers on their overall experience using the system
Evaluative Usability Studies
• Goes beyond observation to investigating and understanding the caller and what they are thinking
• Excellent next step to call monitoring:
– There is an engagement cost
– Generally good value – especially if caller experience issues were identified by call monitoring
– Will not only identify areas for improvement but yield insight into the best way to fix them
Evaluative Usability Studies
• What’s involved
– Define key user profiles, demographics & tasks
– Develop participant ‘screening script’
– Recruit and schedule usability test participants – 10 to 12 per user profile
– Develop usability interview scripts
– Conduct usability sessions
– Compile findings and recommendations
• Options:
– Remote ‘over the phone’ or
– Lab with video for ‘non-verbal’ feedback
Evaluative Usability Studies
• In addition to usability sessions:
– Management & staff interviews
– Review of available statistics & reports
– IVR call monitoring (or recordings)
– CSR service observation & interviews (optional)
• How to go about it – involve an experienced professional
Caller Surveys
• Captures direct caller feedback
• Complement to other assessment methods & tools
– Cost, value & benefit based on type of caller survey
– Best way to measure customer satisfaction
– Broader but shallower feedback vs. usability study
• Caller Survey Types
– Brief post-call survey
– Broad outbound surveys
Brief Post-Call Surveys
• Brief caller survey performed by CSRs about IVR/automation experience
• Complement to other assessment methods & tools:
– Low engagement cost if any – indirect costs need consideration
– Good value
• Measures general caller satisfaction with experience
• Identifies trouble spots and areas for improvement
Brief Post-Call Surveys
• <sample survey>
Broad Out-bound Surveys
• More in-depth caller survey conducted by independent party
• Broadly conducted - random customer sample
• Complement to other assessment methods & tools:
– Engagement cost can be significant
– If you can afford it – do it
– Best measure of general satisfaction with caller experience
– Limited trouble spot & improvement insight
Broad Out-bound Surveys
• <sample survey>
Costs & Benefits
[Chart: benefit vs. cost – Make a call, Call Monitoring, Brief Post-call Survey, Broad Outbound Survey, Usability Studies]
Statistics & Reports
• Useful report types
– Task completion
– Hot spots
• Use reports to identify potential trouble spots and areas for improvement…but not to assess caller experience and define improvements
• Won’t shift perspective or fire you up
• Complement but not a substitute for the other methods we’ve covered here
Statistics & Reports
• “High-end” caller behavior mapping tools
– Clearly map aggregate caller behavior
– Map caller behavior across multiple interactions and multiple channels (e.g. phone, web)
– Focus on caller behavior vs. caller experience
– Useful – but relatively high cost
– Complement but not a substitute for the other methods we’ve covered here
Recognition Analysis & Tuning
• Required step in optimizing speech recognition performance
• Focus on tuning grammars and recognition parameters vs. caller experience and usability
• Can identify caller experience trouble spots and potential areas for improvement
• Complement but not a substitute for the other methods we’ve covered here
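A minimal sketch of the analysis side of a tuning pass, assuming a hypothetical log format of transcribed utterances (the prompt names, log entries, and 80% threshold are illustrative, not part of the seminar material). Per-prompt accuracy highlights where recognition problems may be hurting the caller experience:

```python
# Hypothetical tuning log: (prompt name, human transcription, recognizer result).
# In a real tuning exercise these come from transcribed utterance recordings.
tuning_log = [
    ("main_menu", "billing", "billing"),
    ("main_menu", "agent", "payment"),
    ("main_menu", "billing", "billing"),
    ("get_account", "four five one two", "four five one two"),
    ("get_account", "four five one two", "four nine one two"),
    ("get_account", "four five one two", "four five one two"),
    ("get_account", "four five one two", "four five one two"),
    ("get_account", "four five one two", "four five one two"),
]

def prompt_accuracy(log):
    """Fraction of utterances per prompt where the recognizer matched the transcription."""
    totals, correct = {}, {}
    for prompt, said, heard in log:
        totals[prompt] = totals.get(prompt, 0) + 1
        correct[prompt] = correct.get(prompt, 0) + (said == heard)
    return {p: correct[p] / totals[p] for p in totals}

# Flag prompts below an (arbitrary) 80% accuracy threshold as potential
# caller experience trouble spots worth a closer look.
accuracy = prompt_accuracy(tuning_log)
trouble_spots = [p for p, acc in accuracy.items() if acc < 0.80]
```

As the slide cautions, a flagged prompt only locates a potential trouble spot; understanding why callers struggle there still takes monitoring or usability work.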
Agenda
• Caller experience & why it matters
• Benchmarking, best practices & standards
• Methods & tools for assessing & improving caller experience
• “Hands-on” exercises
• “Selling” caller experience
• Takeaways
“Hands-on” Exercises
• The ‘lucky’ company
• Exercise 1: Pick up the phone and dial
• Exercise 2: “Listening in”
• Exercise 3: “Live” usability session
Exercise 1: Pick up the Phone
• Group: make calls into customer call center – attempt 2-3 self-service tasks & transfer to CSR
• You: take ‘self-survey’
• Group: compare and discuss self-survey feedback
Exercise 2: “Listening In”
• Group: listen to call recordings of actual customer calls
• You: take notes and identify trouble spots and areas for improvement
• Greg & Eduardo: share tabulation spreadsheet for full call sample
• Group: compare and discuss identified trouble spots and areas for improvement
Exercise 3: “Live” Usability Session
• Group: observe “live” usability session
• Small teams: identify and define caller experience improvements based on usability session
• Group: compare and discuss caller experience improvements
Agenda
• Caller experience & why it matters
• Benchmarking, best practices & standards
• Methods & tools for assessing & improving caller experience
• “Hands-on” exercises
• “Selling” caller experience
• Takeaways
“Selling” Caller Experience
• “Getting it” is not enough; you have to “sell it”
• Share what you’ve learned here:
– Why caller experience matters
– Good caller experience is good business (eating cake)
• Make calls & listen to calls and share what you find – ‘shift’ the focus of call center management & staff
Takeaways
• Caller experience matters
• What’s good for the caller is good for business (eating cake)
• Take the first step: make calls & listen to calls…the rest will follow
• Once you’ve started – don’t stop
• Don’t just get it, sell it!