Targets That Work (for the Service Desk), Susan Storey
Uploaded by: service-desk-institute
Posted: 20 Aug 2015

Category: Business

TRANSCRIPT

Slide 1: Targets that Work
8 October 2014
By Susan Storey, SDI Auditor, ITIL Expert, consultant and trainer
www.servicedeskinstitute.com
Slide 2: Agenda
- Why we measure
- Three rules of measurement
- Goals and targets: the difference
- One measurement, many stories
- Measurement affects behaviour
- What service desk managers measure
- 30 measurements of SDC
ssa ltd 2014
Slide 3: We like you, but we're not sure why...
Slide 4: Why we measure
- To understand what's good and what's not
- To provide management information for decision making
- To influence behaviour
- To improve resource management
- To improve speed and productivity
Slide 5: Objectives of metrics
- To support and manage a service, process or activity
- To provide accurate, up-to-date and complete information
- To validate management decisions
- To highlight direction and targets for future activities
- To identify complementary/conflicting priorities
- To ensure adaptability to changing market conditions
Slide 6: Help to connect the dots
Slide 7: Three rules of measurement
- Link to the bottom line
- Link to corporate objectives
- Be forward looking
Slide 8: Key performance indicators
- A KPI is a collection of metrics used to manage a process, service or activity
- Either qualitative or quantitative
- KPI categories: compliance, quality, performance, value
[Diagram: hierarchy from executive vision and mission, through incremental steps of goals, objectives, business drivers and critical success factors, down to KPIs, metrics, measurements and data]
Slide 9: Using goals, objectives and targets
- Goal: an aspiration for where you want to be
- Objective: explains how the goal will be met (SMART)
- Target: tactical, small attainments towards gradually achieving the goal
- Trending: a line of best fit on a graph showing the general direction of progress or regress
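The "trending" idea on this slide, a line of best fit showing the general direction of progress, can be sketched as a plain least-squares fit. A minimal sketch; the monthly incident counts below are hypothetical, not taken from the slides.

```python
# Least-squares line of best fit over a series of monthly counts.
# The incident figures are illustrative assumptions, not slide data.

def trend_line(values):
    """Fit y = slope*x + intercept by ordinary least squares, x = 0..n-1."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

incidents = [1200, 1350, 1280, 1500, 1620, 1580]  # hypothetical monthly totals
slope, intercept = trend_line(incidents)
print("trend per month:", round(slope, 1))  # positive slope = rising workload
```

A positive slope signals regress (growing incident volume) against a reduction goal, which is the kind of direction-of-travel reading the slide's trend line is meant to give.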
Slide 10: Goals, targets and trends
[Chart: "Incidents logged 2013", monthly totals January to December, y-axis 0 to 2,500, with goal, target and trend lines]
Slide 11: Measuring: considerations
- One set of data might be all you need
- Whatever you measure will influence behaviour
- Be clear: are you measuring to drive higher performance, or to check progress towards goals?
- Counting volume and quantity is not a reliable measure of quality or performance
- Find a way to measure quality, including customer satisfaction
Note: a KPI (key performance indicator) is a collection of measurements that show how a CSF is being supported, e.g. the number and percentage of access requests dealt with, following security procedures, in the allocated service time.
Slide 12: One measurement, many stories

Total calls logged:
                        Week 1  Week 2  Week 3  Week 4  Week 5
  SDA 1                     80     113      88      90      88
  SDA 2                     99     211     150     170     149
  SDA 3                    147     234     203     217     187
  Actual calls taken       259     444     430     611     456
  1. Total closed          326     558     441     477     424
  2. Total left open        67     114      11     134      32
  3. % unclosed          20.55   20.43    2.49   28.09    7.54
  4. Total calls in July: 2200
  5. Total closed in July: 2226
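The "% unclosed" row on this slide appears to be computed as calls left open divided by calls closed, times 100 (the week 5 figure works out to 7.55, shown as 7.54 on the slide, which looks like truncation rather than rounding). A minimal sketch of that calculation, using the slide's weekly figures:

```python
# Backlog percentage as implied by the slide's figures:
# open calls as a share of closed calls, per week.
closed = [326, 558, 441, 477, 424]
left_open = [67, 114, 11, 134, 32]

def pct_unclosed(open_calls, closed_calls):
    return round(open_calls / closed_calls * 100, 2)

for week, (o, c) in enumerate(zip(left_open, closed), start=1):
    print(f"week {week}: {pct_unclosed(o, c)}% unclosed")
```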
Slide 13: Measurement affects behaviour
- Percentage of calls open longer than one day (monitoring backlog): unhappy customers who have waited for resolution; breached SLAs have no ultimate time-frame
- Percentage of calls closed: rushed solutions, likely to be reopened and result in a drop in customer satisfaction
- SLA targets successful: forced call closure, using "stop the clock", logging user recalls incorrectly as new calls
- Most logged per SDA per day: chaotic logging pattern and unsustainable activity; poor attention to customer service; mistakes
Slide 14: What Service Desk Managers measure
- ASA (Average Speed to Answer)
- ABA (Average Abandon Before Answer)
- ATT (Average Talk Time)
- Availability to take incoming calls
- Average call time
- FCR (First Contact Resolution)
- FLR (First Level Resolution)
- User recalls (user calls to chase up)
- Calls fixed within SLA
- Calls breaching SLA
Slide 15: Customer satisfaction survey results
[Chart: "Customer satisfaction survey July 2013, 1000 surveys sent, 429 received (target 80% return)"; responses (0 to 250) per question: 1. How long system up; 2. How many times down; 3. How disruptive; 4. Attitude of support; 5. Competence of support; rating scale: poor, quite poor, avg, good, excellent]
Slide 16: Don't be trapped by data formatting!
[Chart: the same July 2013 survey data as the previous slide, presented in a different chart format]
Slide 17: Weighted scoring
[Chart: "Customer satisfaction survey July 2013, 1000 surveys sent, 429 received (target 80% return)"; weighted scores 0 to 5 for system uptime, number of failures, disruption, support attitude and support competence, with response per question plotted against target]
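Weighted scoring collapses the rating counts from the previous slides into a single score per question that can be plotted against a target. A minimal sketch: the poor-to-excellent weights (1 to 5) and the response counts are assumptions for illustration (the counts are only chosen to total 429, matching the surveys received), not slide data.

```python
# Weighted satisfaction score: sum(count * weight) / total responses.
# Weights and response counts are illustrative assumptions, not slide data.
WEIGHTS = {"poor": 1, "quite poor": 2, "avg": 3, "good": 4, "excellent": 5}

def weighted_score(counts):
    """counts: mapping of rating label -> number of responses; returns 0..5."""
    total = sum(counts.values())
    if total == 0:
        return 0.0
    return sum(WEIGHTS[label] * n for label, n in counts.items()) / total

# Hypothetical responses to the "system uptime" question (429 in total).
uptime = {"poor": 20, "quite poor": 40, "avg": 150, "good": 160, "excellent": 59}
print(round(weighted_score(uptime), 2))  # a score out of 5, comparable to a target
```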
Slide 18: Productivity measures
Slide 19: Count sheep whilst tending your flock

Analyst utilisation =
  (avg no. of calls handled by SDA pcm x avg call handle time in mins)
  / (avg no. of days worked pcm x no. of work hrs in day x 60 mins/hr)
  = xx%

Example:
  (500 calls pcm x 10 mins/call)
  / (21 working days pcm x 7.5 work hrs per day x 60 mins/hr)
  = 52.9%
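The utilisation calculation on this slide can be sketched directly in code; the figures below reproduce the slide's worked example (500 calls a month at 10 minutes each, over 21 days of 7.5 hours).

```python
# Analyst utilisation: minutes spent handling calls as a share of
# available working minutes in the month.
def analyst_utilisation(calls_pcm, handle_mins, days_pcm, hours_per_day):
    handled = calls_pcm * handle_mins          # minutes on calls per month
    available = days_pcm * hours_per_day * 60  # working minutes per month
    return handled / available * 100

# Slide example: 5000 handled minutes over 9450 available minutes.
print(round(analyst_utilisation(500, 10, 21, 7.5), 1))  # 52.9
```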
Slide 20: Possible reasons for low SDA utilisation
- The SDA's working day is not wholly allocated to the Service Desk
- A percentage of the work is service requests
- Service requests are not logged
- Service requests are categorised as incidents
- Not all calls are logged
- SDAs are resolvers on 2nd or even 3rd line support
- SDA time includes roll-outs or project work
- The SDA is diverted into training new SDAs
- Desk-side support is included (travel time is not)
Slide 21: SDC measurement requirements
- Every metric needs a SMART target
- All metrics should be trended towards goals over 3, 6 and 12 months
- All targets should be reviewed at least annually
- Reporting activities must be undertaken
- Business-related metrics must be in evidence
Slide 22: +23 more measures for SDC
1. Number of incidents and service requests
2. Re-opened incident rate
3. Backlog management
4. Hierarchic escalations (management)
5. Functional escalations (re-assignment)
6. Average resolution time by priority
7. Average resolution time by incident category and service request type
8. Comparison of overall service level goals to actual results
9. Remote control monitoring measured against goals
10. Self-logging monitoring measured against goals
11. Self-help monitoring measured against goals
12. Knowledge usage
13. Quality of knowledge and its effectiveness
14. Monitoring incidents caused by failed changes measured against goals
15. Total cost of service delivery
16. Average cost per incident and service request (cost per contact)
17. Average cost per incident and service request by channel
18. People satisfaction feedback
19. Staff turnover
20. Unplanned absence days
21. Periodic customer satisfaction measurement
22. Event-based customer satisfaction measurement
23. Complaints, suggestions and compliments
Slide 23: Making sense of data
Slide 24: Thank you
Susan Storey
[email protected]
07831 576615