IWSM 2014: DevOps Measurements (Amir Arooni)


DESCRIPTION

IWSM Presentation

TRANSCRIPT

From dashboard management to an improvement index for the teams

Amir Arooni, October 2014

Head of Global Digital Channels & Payments Services


Introduction

Employed by ING since 2001:

Director ING CB Global Digital Channels & Payments Services

Director ING IT Solution Delivery Centre Channels NL

Senior manager ING IT-Finance, Marketing & Sales support NL

VP IT-Securities, IT-Investment Management, IT-Generic Systems

Before 2001:

Various leadership roles at KPN, TPG, Cap Gemini, Sun Microsystems

Education:

MSc Consulting & Coaching for Change (Oxford/HEC)

MBA, RSM Erasmus University

IT-Business & Information Modeling, Managing Complex Projects

Accountancy, BA, University of Teheran


After 10 years of remodelling, renovation and restoration, the new Rijksmuseum opened on 13 April 2013, at a cost of €375M.

"The IT department is out of sync with our company's needs…"

"We don't understand why we are spending so much on IT…"

"Our IT department is not responsive…"

"It is difficult to align so many IT departments…"

M&A: "Without data, we only have opinions…"


Components: DRTC, AARB, GTRD, SOLL, GREC, TTPC, URTB, SDLI, KLBT, S2BT, MEYT


What are the targets? Which data do we need, and how do we analyze them?



Our vision is to deliver strategic advantage



The superior customer experience is our strategic differentiator


Wow! Wow! Wow!


Time to market




Collaborative community




We use the methods that are most valued by engineers:

• We follow the Agile Manifesto
• We work based on the Scrum framework
• We build, test, and deploy using continuous delivery principles
• We are fully organized in DevOps teams

Common DevOps Measurements

• Cycle Time
  • Time from feature requested to available in production
  • Time from issue reported to fixed in production
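As a hedged illustration of the cycle-time measurements above, the sketch below derives cycle time in days from two timestamps per work item. The field names and dates are illustrative assumptions, not taken from the presentation or from ING's tooling.

```python
from datetime import datetime

def cycle_time_days(requested_at: str, in_production_at: str) -> float:
    """Days between a feature being requested and being available in production.

    Timestamps are ISO-8601 strings, e.g. "2014-09-01T09:00:00".
    """
    start = datetime.fromisoformat(requested_at)
    end = datetime.fromisoformat(in_production_at)
    return (end - start).total_seconds() / 86400.0

# Example: a feature requested on 1 September and live on 12 September.
print(cycle_time_days("2014-09-01T09:00:00", "2014-09-12T17:00:00"))  # ~11.3 days
```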

Common DevOps Measurements

• Team Velocity and Predictability of Sprint

(Chart: team predictability and productivity)
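A minimal sketch, under the assumption that sprint predictability is expressed as the ratio of delivered to committed story points (the presentation does not give an exact definition); the sprint numbers are illustrative.

```python
def sprint_predictability(committed_points: int, delivered_points: int) -> float:
    """Fraction of committed story points actually delivered in a sprint."""
    if committed_points == 0:
        return 0.0
    return delivered_points / committed_points

# Illustrative sprints: velocity is the delivered points, predictability the ratio.
sprints = [(30, 21), (32, 28), (30, 30)]  # (committed, delivered)
for committed, delivered in sprints:
    print(f"velocity={delivered}, predictability={sprint_predictability(committed, delivered):.0%}")
```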

Common DevOps Measurements

• Software Quality
  • Complexity: structure of the code
  • Design: layering of software components
  • Documentation: the right comments and readability
  • Duplication: the same code in multiple places
  • Defects: number of issues during development and production
  • Size: manageable size of components
  • Tests: the level of automation and code coverage
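As a hedged sketch of how such quality indicators might be pulled together per component: the quality_report input and the thresholds below are illustrative assumptions, not the actual static-analysis tooling or limits used at ING.

```python
# Hypothetical per-component output of static-analysis and test tooling.
quality_report = {
    "payments-api": {"complexity": 8.2, "duplication_pct": 3.1,
                     "coverage_pct": 78.0, "open_defects": 4},
    "channels-web": {"complexity": 14.6, "duplication_pct": 9.8,
                     "coverage_pct": 52.0, "open_defects": 11},
}

# Illustrative thresholds; real limits would be agreed per team.
THRESHOLDS = {"complexity": 10.0, "duplication_pct": 5.0,
              "coverage_pct": 70.0, "open_defects": 5}

def quality_flags(metrics: dict) -> list[str]:
    """Return the quality aspects that fall outside the agreed thresholds."""
    flags = []
    if metrics["complexity"] > THRESHOLDS["complexity"]:
        flags.append("complexity")
    if metrics["duplication_pct"] > THRESHOLDS["duplication_pct"]:
        flags.append("duplication")
    if metrics["coverage_pct"] < THRESHOLDS["coverage_pct"]:
        flags.append("coverage")
    if metrics["open_defects"] > THRESHOLDS["open_defects"]:
        flags.append("defects")
    return flags

for component, metrics in quality_report.items():
    print(component, quality_flags(metrics) or "ok")
```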

Operational measurement is done with automated tools: simple dashboards, and automating every repetitive task in metrics gathering.

• Built & maintained by ING
  • Results in agility in metrics
  • Take as needed while maturing
• Data from Product Backlogs
  • Enterprise Product Backlog Management process
• Data from the DevOps Teams
  • Feedback from Scrum Masters
  • Continuous Integration process
  • Test Automation process
  • Continuous Delivery process
  • Incident tracking (HPC)
  • Monitoring

All data for a DevOps Team is gathered by the DevOps Team itself: stay 'lean and mean'.
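As a hedged illustration of "automate every repetitive task in metrics gathering", the sketch below merges per-team data from two exported files into one summary per DevOps team. The file names and column names (team, status) are hypothetical stand-ins for the backlog and incident-tracking sources listed above, not ING's actual tooling.

```python
import csv
import json
from collections import defaultdict

def load_rows(path: str) -> list[dict]:
    """Read one exported CSV file (backlog or incident export) into dicts."""
    with open(path, newline="") as handle:
        return list(csv.DictReader(handle))

def build_dashboard(backlog_csv: str, incidents_csv: str) -> dict:
    """Combine exported data into one simple summary per DevOps team."""
    dashboard: dict = defaultdict(lambda: {"stories_done": 0, "incidents": 0})
    for row in load_rows(backlog_csv):
        if row["status"] == "Done":
            dashboard[row["team"]]["stories_done"] += 1
    for row in load_rows(incidents_csv):
        dashboard[row["team"]]["incidents"] += 1
    return dict(dashboard)

if __name__ == "__main__":
    # Hypothetical exports; real data would come from the backlog and incident tools.
    summary = build_dashboard("backlog_export.csv", "incidents_export.csv")
    print(json.dumps(summary, indent=2))
```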


But these measurements are all about…


There are four valuable areas which we have started to measure in order to improve teams' capabilities.

Operational Excellence
• # Defects per Release per Product: a defect is the root cause of an incident, or a bug discovered by a user
• Cycle Time per Product: from 'Prepare' status to 'Production' (Prepare = the team spends effort on making the user story Ready)
• Indexed Velocity Increase per Team: velocity / velocity of the first sprint, with user story points indexed on a reference user story (see the sketch after this list)

Business Value
• Business Value (EUR): calculated per feature by the Product Owner
• # Outages per Product: per release
• # Open Risks per Product: per release

Customer Orientation
• User Feedback (1-5 stars): simple: hands up at the demo (1-5 fingers up); advanced: a user feedback form

Future Orientation
• People Engagement: Happy / OK / Sad at the end of each sprint
• People Skills: Novice-Expert per individual and skill set; Novice-Expert per team across skill sets
• Simplified Architecture: TCO reduction
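A minimal sketch of two of the metrics above: Indexed Velocity Increase (each sprint's velocity divided by the velocity of the team's first sprint) and defects per release. The team data is illustrative, not from the presentation.

```python
def indexed_velocity(velocities: list[float]) -> list[float]:
    """Index each sprint's velocity on the velocity of the team's first sprint."""
    baseline = velocities[0]
    return [v / baseline for v in velocities]

def defects_per_release(defects: list[int], releases: int) -> float:
    """Defects = root causes of incidents plus bugs discovered by users."""
    return sum(defects) / releases

# Illustrative team data: story points per sprint, defect counts, three releases.
print(indexed_velocity([20, 22, 26, 30]))          # [1.0, 1.1, 1.3, 1.5]
print(defects_per_release([2, 1, 4], releases=3))  # ~2.33 defects per release
```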


We are also measuring the impact of our transition and the resulting improvements of the teams.

1) In our Agile transformation, we want to:
• Inspect whether the intended change is actually taking place
• Inspect whether the change manifests itself in terms of our objectives

2) To give useful and positive feedback to teams for self-improvement about:
• Continuous Delivery practices
• Improving or adding practices


We agree on the need for a simple set of metrics.


• Business Value
• People Engagement
• Service Quality
• Time to market
• Productivity

Why were we not successful in M&A in the past?

• KPI-driven, not improvement-oriented; not from the teams, for the teams
• Metrics did not reflect the business objectives
• Most standard measurements were:
  • Incomplete
  • Inconsistent
  • Infrequent
• Many manual data collection activities
• Differences in objectives led to confusion:
  • Cost, productivity, quality
  • Testable requirements, # of faults found, future


Develop measurements which help the teams and add value to our business…

- Right functionality for the customers

- Skills and capabilities

- Continuous Delivery

- Collaboration and High Performance


The metrics we use must be lightweight, valuable, simple and comparable.

Lightweight: data collection and processing must not slow down or burden DevOps teams or management teams.

Valuable: metrics must have a strong relation to either our objectives or to our change, and must help in making decisions. If we collect metrics without using them – stop!

Simple: collectable by the DevOps Teams in the course of their own work, or generated automatically by continuous delivery tooling.

Comparable: DevOps Team A in Domain X to DevOps Team B in Domain Y. Not as a KPI, but to feed the learning cycle.
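A hedged sketch of the "comparable, but not a KPI" idea: each team's metric history is indexed on its own first measurement, so teams in different domains are compared on their improvement trend rather than on absolute numbers. The figures are illustrative.

```python
def improvement_trend(history: list[float]) -> list[float]:
    """Index a team's metric history on its own first measurement."""
    baseline = history[0]
    return [round(value / baseline, 2) for value in history]

# Illustrative: cycle time in days for two teams in different domains.
team_a = [12.0, 10.0, 9.0]   # Domain X
team_b = [30.0, 24.0, 21.0]  # Domain Y
print(improvement_trend(team_a))  # [1.0, 0.83, 0.75]
print(improvement_trend(team_b))  # [1.0, 0.8, 0.7]
```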


Thank you for your attention! • Amir Arooni • amir.arooni@mail.ing.nl
