
Quantifying Operational Risk: Possibilities and Limitations

Hansjörg Furrer

ETH Zurich

hjfurrer@math.ethz.ch · www.math.ethz.ch/~hjfurrer

(Based on joint work with Paul Embrechts and Roger Kaufmann)

Deutsche Bundesbank Training Centre, Eltville

19-20 March 2004

Managing OpRisk

Contents

A. The New Accord (Basel II)

B. Risk measurement methods for OpRisks

C. Advanced Measurement Approaches (AMA)

D. Conclusions

E. References


A. The New Accord (Basel II)

• 1988: Basel Accord (Basel I): minimum capital requirements

against credit risk. One standardised approach

• 1996: Amendment to Basel I: market risk.

• 1999: First Consultative Paper on the New Accord (Basel II).

• 2003: CP3: Third Consultative Paper on the

New Basel Capital Accord (www.bis.org/bcbs/bcbscp3.htm)

• mid 2004: Revision of CP3

• end of 2006: full implementation of Basel II ([9])


What’s new?

• Rationale for the New Accord: More flexibility and risk sensitivity

• Structure of the New Accord: Three-pillar framework:

1. Pillar 1: minimal capital requirements (risk measurement)

2. Pillar 2: supervisory review of capital adequacy

3. Pillar 3: public disclosure


What’s new? (cont’d)

• Two options for the measurement of credit risk:

- Standard approach

- Internal ratings-based approach (IRB)

• Pillar 1 sets out the minimum capital requirements:

  total amount of capital / risk-weighted assets ≥ 8%

• MRC (minimum regulatory capital) := 8% of risk-weighted assets

• New regulatory capital approach for operational risk (the risk of losses resulting from inadequate or failed internal processes, people and systems, or external events)


What’s new? (cont’d)

• Notation: C_OP: capital charge for operational risk

• Target: C_OP ≈ 12% of MRC

• Estimated total losses in the US (2001): $50b

• Some examples

- 1977: Credit Suisse Chiasso affair

- 1995: Nick Leeson/Barings Bank, £1.3b

- 2001: Enron (largest US bankruptcy so far)

- 2003: Banque Cantonale Vaudoise, KBV Winterthur


B. Risk measurement methods for OpRisks

Pillar 1 regulatory minimal capital requirements for OpRisk: Three

distinct approaches:

1. Basic Indicator Approach

2. Standardised Approach

3. Advanced Measurement Approaches (AMA)


Basic Indicator Approach

• Capital charge:

  C_OP^BIA = α × GI

• C_OP^BIA: capital charge under the Basic Indicator Approach

• GI: average annual gross income over the previous three years

• α = 15% (set by the Committee)
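In code, the Basic Indicator charge is a one-liner; the gross-income figures below are illustrative, not from the talk:

```python
# Basic Indicator Approach: C_OP^BIA = alpha * GI (a minimal sketch;
# the gross-income figures are made-up for illustration)
ALPHA = 0.15                                # alpha = 15%, set by the Committee
gross_income = [120.0, 135.0, 150.0]        # previous three years, in millions

gi = sum(gross_income) / len(gross_income)  # GI: three-year average
c_bia = ALPHA * gi
print(f"C_OP^BIA = {c_bia:.2f}")            # 0.15 * 135.00 = 20.25
```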


Standardised Approach

• Similar to the BIA, but at the level of each business line:

  C_OP^SA = Σ_{i=1}^{8} β_i × GI_i,   β_i ∈ [12%, 18%], i = 1, 2, ..., 8.

• 8 business lines:

- Corporate finance
- Trading & sales
- Retail banking
- Commercial banking
- Payment & settlement
- Agency services
- Asset management
- Retail brokerage
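A sketch of the Standardised charge against the flat Basic Indicator charge; the per-line incomes and the particular β assignment are assumptions of this example:

```python
import numpy as np

# Standardised Approach: C_OP^SA = sum_i beta_i * GI_i over the 8 business
# lines. Per-line incomes and the beta assignment are illustrative assumptions.
gi_by_line = np.array([20.0, 30.0, 25.0, 15.0, 10.0, 10.0, 15.0, 10.0])
betas      = np.array([0.18, 0.18, 0.12, 0.15, 0.18, 0.15, 0.12, 0.12])

c_sa = float(betas @ gi_by_line)

# For comparison: the flat 15% Basic Indicator charge on the same total income
c_bia = 0.15 * float(gi_by_line.sum())
print(f"C_OP^SA = {c_sa:.2f}  vs  C_OP^BIA = {c_bia:.2f}")
```

The SA rewards (or penalises) the mix of business lines: the two charges coincide only when the income-weighted average β equals 15%.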


C. Advanced Measurement Approaches (AMA)

• Allows banks to use their own methods for assessing their exposure

to OpRisk

• Preconditions: banks must meet qualitative and quantitative standards before using the AMA

• Risk mitigation via insurance allowed


Quantitative standards

• The Committee does not stipulate which (analytical) approach

banks must use

• However, banks must demonstrate that their approach captures

severe tail loss events, e.g. the 99.9% VaR

• In essence,

MRC = EL + UL (expected loss + unexpected loss)

If ELs are already captured in a bank’s internal business practices,

then

MRC = UL


Quantitative standards (cont’d)

• Internal data: Banks must record internal loss data

• Internally generated OpRisk measures must be based on a five-year

observation period

• External data: A bank’s OpRisk measurement system must also

include external data


Some internal data

[Figure: yearly internal loss data, 1992-2002, in four panels: loss type 1, loss type 2, loss type 3, and the pooled operational losses]


Modeling issues

• Stylized facts about OP risk losses

- Loss occurrence times are irregularly spaced in time

(selection bias, economic cycles, regulation, management interactions, ...)

- Loss amounts show extremes

• Large losses are of main concern!

• Repetitive vs non-repetitive losses

• Red flag: Are observations in line with modeling assumptions?

• Example: the "iid" assumption implies

- NO structural changes in the data as time evolves

- irrelevance of which loss is denoted X_1, which one X_2, ...


A mathematical (actuarial) model

• OpRisk loss database

  { X_ℓ^(t,k) :  t ∈ {T−n+1, ..., T−1, T} (years),
                 k ∈ {1, 2, ..., 7} (loss event type),
                 ℓ ∈ {1, 2, ..., N^(t,k)} (number of losses) }

• Loss event types: Internal fraud

External fraud

Employment practices and workplace safety

Clients, products & business practices

Damage to physical assets

Business disruption and system failures

Execution, delivery & process management

A mathematical (actuarial) model (cont’d)

• Truncation

  X_ℓ^(t,k) = X_ℓ^(t,k) 1{X_ℓ^(t,k) > d^(t,k)}

  and (random) censoring

• A further index j ∈ {1, ..., 8} indicating the business line can be introduced (suppressed for this talk)

• Estimate a risk measure for F_{L^(T+1,k)}(x) = P[L^(T+1,k) ≤ x], like

- Op-VaR_α^(T+1,k) = F←_{L^(T+1,k)}(α)

- Op-ES_α^(T+1,k) = E[L^(T+1,k) | L^(T+1,k) > Op-VaR_α^(T+1,k)]

where

• L^(T+1,k) = Σ_{ℓ=1}^{N^(T+1,k)} X_ℓ^(T+1,k): OpRisk loss for loss event type k over the period [T, T+1].
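Both risk measures are easy to read off a simulated annual-loss sample. A Monte Carlo sketch for one event type k; Poisson frequency and lognormal severity are illustrative assumptions, not the talk's choices:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate many annual losses L = sum_{l=1}^{N} X_l for one loss event type.
# Poisson(25) frequency and lognormal(1, 1) severity are purely illustrative.
n_years = 20_000
N = rng.poisson(lam=25, size=n_years)                 # N^(T+1,k) per year
L = np.array([rng.lognormal(1.0, 1.0, size=n).sum() for n in N])

alpha = 0.99
op_var = float(np.quantile(L, alpha))                 # Op-VaR_alpha: quantile of F_L
op_es = float(L[L > op_var].mean())                   # Op-ES_alpha: mean loss beyond VaR
print(f"Op-VaR_{alpha} = {op_var:.1f}, Op-ES_{alpha} = {op_es:.1f}")
```

Expected shortfall always sits above the VaR it conditions on, which the sample estimates reproduce.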


A mathematical (actuarial) model (cont’d)

• Discussion: Recall the stylized facts

- X’s are heavy-tailed

- N shows non-stationarity

• Conclusions:

- F_X(x) and F_{L^(t,k)}(x) difficult to estimate

- in-sample estimation of VaR_α (α large) almost impossible!

- actuarial tools may be useful:

∗ Approximation (translated gamma/lognormal)

∗ Inversion methods (FFT)

∗ Recursive methods (Panjer)

∗ Simulation

∗ Extreme Value Theory (EVT)

How accurate are VaR-estimates?

• Make inference about the tail decay of the aggregate loss L^(t,k) via the tail decay of the individual losses X^(t,k):

  L = Σ_{ℓ=1}^{N} X_ℓ,   1 − F_X(x) ∼ x^{−α} h(x),  x → ∞

  ⇒ 1 − F_L(x) ∼ E[N] x^{−α} h(x),  x → ∞.
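The relation is easy to check empirically: far in the tail, the compound loss exceeds x about E[N] times as often as a single loss does. A sketch for a compound Poisson sum of Pareto losses (parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Compound Poisson sum of Pareto losses with 1 - F_X(x) = x^(-theta), x >= 1.
# Compare the empirical 1 - F_L(x) against E[N] * (1 - F_X(x)) far in the tail.
theta, lam, n_sim = 2.0, 5.0, 200_000
N = rng.poisson(lam, size=n_sim)
# inverse-transform sampling: U^(-1/theta) is Pareto(theta) on [1, inf)
L = np.array([(rng.uniform(size=n) ** (-1.0 / theta)).sum() for n in N])

x = 60.0
tail_L = float((L > x).mean())         # empirical 1 - F_L(x)
tail_single = lam * x ** (-theta)      # E[N] * (1 - F_X(x))
print(tail_L, tail_single)             # same order of magnitude
```

The agreement is only asymptotic: at moderate x the empirical tail sits somewhat above the single-loss approximation, because the other N−1 losses shift the sum rightwards.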

• Assumptions: (X_m) iid ∼ F and, for some ξ, β(u) and u large,

  F_u(x) := P[X − u ≤ x | X > u] = G_{ξ,β(u)}(x)

where

  G_{ξ,β}(x) = 1 − (1 + ξx/β)^{−1/ξ},  ξ ≠ 0,
  G_{ξ,β}(x) = 1 − e^{−x/β},           ξ = 0.


How accurate are VaR-estimates? (cont’d)

• Tail and quantile estimates:

  1 − F_X(x) = (N_u/n) (1 + ξ (x − u)/β)^{−1/ξ},  x > u.

  VaR_α = q_α = u − (β/ξ) (1 − (N_u/(n(1−α)))^{ξ})    (1)
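Formula (1) can be sanity-checked in closed form: for a Pareto tail the excess distribution over u is exactly GPD with ξ = 1/θ and β(u) = u/θ, and plugging in the exact tail weight N_u/n = u^(−θ) recovers the true quantile. A minimal sketch (parameters illustrative):

```python
def pot_var(alpha, u, xi, beta, n_u, n):
    """Quantile estimate (1): q_alpha = u - (beta/xi) * (1 - (N_u/(n(1-alpha)))^xi)."""
    return u - (beta / xi) * (1.0 - (n_u / (n * (1.0 - alpha))) ** xi)

# Pareto tail 1 - F_X(x) = x^(-theta): the excess df over u is exactly GPD
# with xi = 1/theta and beta(u) = u/theta, and N_u/n -> u^(-theta)
theta, u, alpha, n = 2.0, 5.0, 0.999, 1_000_000
q_hat = pot_var(alpha, u, xi=1.0 / theta, beta=u / theta,
                n_u=n * u ** (-theta), n=n)

q_true = (1.0 - alpha) ** (-1.0 / theta)   # true Pareto quantile
print(q_hat, q_true)                        # identical with exact plug-ins
```

In practice ξ, β and N_u/n all carry estimation error, which is exactly what the simulation study below quantifies.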

• Idea: Comparison of estimated quantiles with the corresponding

theoretical ones by means of a simulation study ([8]).


How accurate are VaR-estimates? (cont’d)

• Simulation procedure:

1. Choose F and fix α_0 < α < 1, N_u ∈ {25, 50, 100, 200} (N_u: number of data points above u)

2. Calculate u = q_{α_0} and the true value of the quantile q_α

3. Sample N_u independent points of F above u by the rejection method; record the total number n of sampled points this requires

4. Estimate ξ, β by fitting the GPD to the N_u exceedances over u by means of MLE

5. Determine q_α according to (1)

6. Repeat the above N times to arrive at estimates of Bias(q_α) and SE(q_α)
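The steps above can be compressed into a short script. Two simplifications are assumptions of this sketch, not of [8]: the rejection step is collapsed by drawing the exceedances directly from the exact conditional (Pareto) law and generating the number of draws each one needs as a geometric count (distributionally equivalent), and the GPD fit uses the Hill-type estimator ξ̂ = mean log(X/u), β̂ = ξ̂u as a simple stand-in for full MLE:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pareto parent 1 - F(x) = x^(-theta), threshold u = q_{0.9}, target q_{0.99}
theta, q0, alpha, n_u, n_rep = 2.0, 0.9, 0.99, 100, 500
u = (1.0 - q0) ** (-1.0 / theta)            # threshold u = q_{alpha_0}
q_true = (1.0 - alpha) ** (-1.0 / theta)    # true quantile q_alpha

q_hats = np.empty(n_rep)
for j in range(n_rep):
    # rejection method, collapsed: draws needed per exceedance are Geometric(1-q0)
    n = int(rng.geometric(p=1.0 - q0, size=n_u).sum())   # total sampled points
    exceed = u * rng.uniform(size=n_u) ** (-1.0 / theta) # X | X > u is Pareto
    xi_hat = float(np.log(exceed / u).mean())            # Hill-type fit
    beta_hat = xi_hat * u
    q_hats[j] = u - (beta_hat / xi_hat) * (1.0 - (n_u / (n * (1.0 - alpha))) ** xi_hat)

pct_bias = (q_hats.mean() - q_true) / q_true
pct_se = float(np.sqrt(((q_hats - q_true) ** 2).mean())) / q_true
print(f"Percentage Bias = {pct_bias:+.3f}, Percentage SE = {pct_se:.3f}")
```

With N_u = 100 exceedances over q_{0.9} the estimates comfortably meet the accuracy criteria quoted below, consistent with the θ = 2 table.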


How accurate are VaR-estimates? (cont’d)

• Accuracy of the quantile estimate expressed in terms of bias and standard error:

  Bias(q̂_α) = E[q̂_α − q_α],   SE(q̂_α) = E[(q̂_α − q_α)²]^{1/2}

  Bias(q̂_α) = (1/N) Σ_{j=1}^{N} (q̂_α^j − q_α),   SE(q̂_α) = ( (1/N) Σ_{j=1}^{N} (q̂_α^j − q_α)² )^{1/2}

• For comparison purposes (different distributions) introduce

  Percentage Bias := Bias(q̂_α)/q_α,   Percentage SE := SE(q̂_α)/q_α


How accurate are VaR-estimates? (cont’d)

• Criterion for a "good estimate": Percentage Bias and Percentage SE should be small, e.g.

                     α = 0.99    α = 0.999
  Percentage Bias    ≤ 0.05      ≤ 0.10
  Percentage SE      ≤ 0.30      ≤ 0.60


Example: Pareto distribution 1 − F_X(x) = x^{−θ}, θ = 2

u = F←(q), q = 0.7:

- α = 0.99: a minimum of 100 exceedances (corresponding to 333 observations) is required to ensure accuracy w.r.t. bias and standard error.

- α = 0.999: a minimum of 200 exceedances (corresponding to 667 observations) is required to ensure accuracy w.r.t. bias and standard error.

u = F←(q), q = 0.9:

- α = 0.99: full accuracy can be achieved with the minimum number of 25 exceedances (corresponding to 250 observations).

- α = 0.999: a minimum of 100 exceedances (corresponding to 1000 observations) is required to ensure accuracy w.r.t. bias and standard error.


Example: Pareto distribution 1 − F_X(x) = x^{−θ}, θ = 1

u = F←(q), q = 0.7:

- α = 0.99: for all numbers of exceedances up to 200 (corresponding to a minimum of 667 observations), the VaR estimates fail to meet the accuracy criteria.

- α = 0.999: for all numbers of exceedances up to 200 (corresponding to a minimum of 667 observations), the VaR estimates fail to meet the accuracy criteria.

u = F←(q), q = 0.9:

- α = 0.99: a minimum of 100 exceedances (corresponding to 1000 observations) is required to ensure accuracy w.r.t. bias and standard error.

- α = 0.999: a minimum of 200 exceedances (corresponding to 2000 observations) is required to ensure accuracy w.r.t. bias and standard error.


How accurate are VaR-estimates? (cont’d)

• Large number of observations necessary to achieve targeted

accuracy.

• Minimum number of observations increases as the tails become

thicker ([8]).

• Remember: The simulation study was done under idealistic

assumptions. OpRisk losses, however, typically do NOT fulfil

these assumptions.


Pricing risk under incomplete information

• Recall: L^(T+1,k): OpRisk loss for loss event type k over the period [T, T+1]

  L^(T+1,k) = Σ_{ℓ=1}^{N^(T+1,k)} X_ℓ^(T+1,k)

• Question: Suppose we have calculated risk measures ρ_α^(T+1,k), k ∈ {1, ..., 7}, for each loss event type. When can we consider

  Σ_{k=1}^{7} ρ_α^(T+1,k)

  a "good" risk measure for the total loss L^(T+1) = Σ_{k=1}^{7} L^(T+1,k)?


Pricing risk under incomplete information (cont’d)

• Answer: Ingredients

- (non-) coherence of risk measures (Artzner, Delbaen, Eber, Heath

framework)

- optimization problem: given ρ_α^(T+1,k), k ∈ {1, ..., 7}, what is the worst case for the overall risk for L^(T+1)? Solution: via copulas; see [4] and references therein.

- aggregation of banking risks [1]


A ruin-theoretic problem motivated by OpRisk

• OpRisk process:

  V_t^(k) = u^(k) + p^(k)(t) − L_t^(k),  t ≥ 0

  for some initial capital u^(k) and a premium function p^(k)(t) satisfying P[L_t^(k) − p^(k)(t) → −∞] = 1.

• Given ε > 0, calculate u^(k)(ε) such that

  P[ inf_{T≤t≤T+1} (u^(k)(ε) + p^(k)(t) − L_t^(k)) < 0 ] ≤ ε    (2)

• u^(k)(ε) is an (internal) risk capital charge
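A brute-force reading of (2): u^(k)(ε) is the (1−ε)-quantile of the one-year worst shortfall sup_t (L_t − p(t)). A Monte Carlo sketch for a compound Poisson loss process with linear premium p(t) = ct; the model and all parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# u(eps) = (1 - eps)-quantile of sup_{0<=t<=1} (L_t - c*t); for a compound
# Poisson process the infimum of u + c*t - L_t is attained right after a jump.
lam, mu, c, eps, n_sim = 10.0, 2.0, 25.0, 0.01, 20_000   # c > lam*mu (net profit)

worst = np.zeros(n_sim)
for i in range(n_sim):
    n = int(rng.poisson(lam))                    # number of losses in the year
    if n == 0:
        continue
    t = np.sort(rng.uniform(size=n))             # loss occurrence times
    s = np.cumsum(rng.exponential(mu, size=n))   # cumulative loss L_t at jumps
    worst[i] = max(0.0, float((s - c * t).max()))

u_eps = float(np.quantile(worst, 1.0 - eps))     # capital charge u(eps)
ruin_prob = float((worst > u_eps).mean())
print(f"u(eps) ~ {u_eps:.2f}, estimated one-year ruin probability {ruin_prob:.4f}")
```

This light-tailed toy case is the easy one; with heavy-tailed severities the quantile of the running supremum converges slowly and the simulation burden grows accordingly, which is one reason solving (2) is hard.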


Ruin-theoretic problem (cont’d)

• Solving (2) is difficult:

- complicated loss process L_t^(k)

- heavy-tailed case

- finite time horizon [T, T+1]

• Recall the classical Cramér-Lundberg model:

  Y(t) = u + ct − Σ_{k=1}^{N(t)} X_k = u + ct − S_N^X(t)

  Ψ(u) = P[ sup_{t≥0} (S_N^X(t) − ct) > u ]


Ruin-theoretic problem (cont’d)

• In the heavy-tailed case, P[X_1 > x] ∼ x^{−(β+1)} L(x), L slowly varying, implies that

  Ψ(u) ∼ κ u^{−β} L(u),  u → ∞.

• Important: net-profit condition

  P[ lim_{t→∞} (S_N^X(t) − ct) = −∞ ] = 1

• Now assume for some general loss process (S(t))

  P[ lim_{t→∞} (S(t) − ct) = −∞ ] = 1

  Ψ_1(u) = P[ sup_{t≥0} (S(t) − ct) > u ] ∼ u^{−β} L(u),  u → ∞.    (3)


Ruin-theoretic problem (cont’d)

• Question: How much can we change S keeping (3)?

• Solution: use the time change S_Δ(t) = S(Δ(t))

  Ψ_Δ(u) = P[ sup_{t≥0} (S_Δ(t) − ct) > u ]

  Under some technical conditions on Δ and S, general models are given so that

  lim_{u→∞} Ψ_Δ(u)/Ψ_1(u) = 1,

  i.e. ultimate ruin behaves similarly under the time change


Ruin-theoretic problem (cont’d)

• Example:

- start from the homogeneous Poisson case (classical Cramér-Lundberg, heavy-tailed case)

- use ∆ to transform to changes in intensities motivated by

operational risk, see [6].


D. Conclusions

• OP risk ≠ market risk, credit risk

• "Multiplicative structure" of OP risk losses ([7]):

  S × T × M (Selection, Training, Monitoring)

• Standard actuarial methods (including EVT) aiming to derive

capital charges are of limited use due to

- lack of data

- inconsistency of the data with the modeling assumptions

• OpRisk loss databases must grow

• interesting source of mathematical problems


Conclusions (cont’d)

• challenges: choice of risk measures, aggregation of risk measures

• OpRisk charges cannot be based on statistical modeling alone

⇒ Pillar 2 (overall OP risk management, such as analysis of causes, prevention, ...) more important than Pillar 1


E. References

[1] Alexander, C., and Pezier, P. (2003). Assessment and aggregation of banking risks. 9th IFCI Annual Risk Management Round Table, International Financial Risk Institute (IFCI).

[2] Basel Committee on Banking Supervision (2003). The New Basel Capital Accord. April 2003. BIS, Basel, Switzerland, www.bis.org/bcbs.

[3] Embrechts, P., Furrer, H.J., and Kaufmann, R. (2003). Quantifying regulatory capital for operational risk. Derivatives Use, Trading and Regulation, Vol. 9, No. 3, 217-233. Also available on www.bis.org/bcbs/cp3comments.htm.

[4] Embrechts, P., Hoeing, A., and Juri, A. (2003). Using copulae to bound the Value-at-Risk for functions of dependent risks. Finance and Stochastics, Vol. 7, No. 2, 145-167.

[5] Embrechts, P., Kaufmann, R., and Samorodnitsky, G. (2002). Ruin theory revisited: stochastic models for operational risk. Submitted.

[6] Embrechts, P., and Samorodnitsky, G. (2003). Ruin problem and how fast stochastic processes mix. Annals of Applied Probability, Vol. 13, 1-36.

[7] Geiger, H. (2000). Regulating and supervising operational risk for banks. Working paper, University of Zurich.

[8] McNeil, A.J., and Saladin, T. (1997). The peaks over thresholds method for estimating high quantiles of loss distributions. Proceedings of the XXVIIth International ASTIN Colloquium, Cairns, Australia, 23-43.

[9] The Economist (2003). Blockage in Basel. Vol. 369, No. 8344, October 2003.

