
CAN UNCLASSIFIED

Defence Research and Development Canada Scientific Report DRDC-RDDC-2017-R148 November 2017


Cyber security for Commercial Off-The-Shelf (COTS) and open source software systems

A new system-to-function Vulnerability/Impact Assessment Methodology (VI-AM) based on the Common Vulnerabilities Exposures / Common Vulnerability Scoring System (CVE/CVSS) databases (Version 1.0)

Mario Couture DRDC – Valcartier Research Centre


© Her Majesty the Queen in Right of Canada (Department of National Defence), 2017

© Sa Majesté la Reine en droit du Canada (Ministère de la Défense nationale), 2017


IMPORTANT INFORMATIVE STATEMENTS

Disclaimer: Her Majesty the Queen in right of Canada, as represented by the Minister of National Defence ("Canada"), makes no representations or warranties, express or implied, of any kind whatsoever, and assumes no liability for the accuracy, reliability, completeness, currency or usefulness of any information, product, process or material included in this document. Nothing in this document should be interpreted as an endorsement for the specific use of any tool, technique or process examined in it. Any reliance on, or use of, any information, product, process or material included in this document is at the sole risk of the person so using it or relying on it. Canada does not assume any liability in respect of any damages or losses arising out of or in connection with the use of, or reliance on, any information, product, process or material included in this document.

This document was reviewed for Controlled Goods by Defence Research and Development Canada (DRDC) using the Schedule to the Defence Production Act.

Platform-to-Assembly Secured Systems (PASS) project.


Abstract

Operating systems and software applications will likely always contain software flaws that may be exploited by hackers to attack governmental computing infrastructure. Assessing the presence of vulnerabilities in these systems and their potential impacts on military operations is an important task that must be performed to identify and apply the best corrective actions, making these systems and the services they deliver more "cyber secure."

This Scientific Report presents a new methodology, the Vulnerability/Impact Assessment Methodology (VI-AM). VI-AM can be used to assess vulnerabilities in software systems and the effects they may produce at the system and operational function levels. The methodology uses publicly available standardized CVE/CVSS datasets and metrics to express: a) the required maximum impacts that can be tolerated at the operational function level, and b) the actual computed impacts for the same functions (based on vulnerability scans). Gap analyses then generate information that helps prioritize corrective actions on the systems, making operational functions more cyber secure.

Significance to defence and security

The application of the proposed VI-AM methodology will help define the Royal Canadian Navy's (RCN) engineering change documents and contribute to making military systems more cyber secure. Priorities can be established to optimize resource utilization and improve the cyber security of military systems.

This Scientific Report is a deliverable of the Platform-to-Assembly Secured Systems (PASS) project, resulting from work at DRDC – Valcartier Research Centre. The sponsors of this portion of the PASS project are: Directorate of Naval Combat Systems (DNCS), Director General Maritime Equipment Program Management (DGMEPM), Directorate Information Management Security (Dir IM Secur), and DG Cyber.


Résumé

Operating systems and software applications will likely always contain software flaws that can be exploited by hackers to attack government computing infrastructure. Assessing the presence of vulnerabilities in these systems and their potential impacts on military operations are important tasks that must be performed in order to identify and apply the best corrective actions that will make these systems, and the services they deliver, more "cyber secure."

This Scientific Report presents and guides the use of a new methodology, the Vulnerability/Impact Assessment Methodology (VI-AM). VI-AM can be used to assess the vulnerabilities present in software systems and the effects they may produce at the system and operational function levels. The methodology involves the use of publicly available standardized CVE/CVSS datasets and metrics to express: a) the maximum impacts that can be tolerated at the operational function level, and b) the actual computed impacts for the same functions (based on vulnerability scans). Differential impact analyses then generate the information needed to help prioritize the corrective actions that must be taken on the systems to make operational functions more secure.

Importance pour la défense et la sécurité

Applying the VI-AM methodology will help define the RCN's engineering change documents and contribute to making military systems more cyber secure. Priorities can be established to optimize resource utilization and improve the cyber security of military systems.

This Scientific Report is a deliverable of the PASS project. Its content results from work carried out at DRDC – Valcartier Research Centre. The sponsors of the project are: Directorate of Naval Combat Systems (DNCS), Director General Maritime Equipment Program Management (DGMEPM), Directorate Information Management Security (Dir IM Secur), and Director General (DG) Cyber.


Table of contents

Abstract
Significance to defence and security
Résumé
Importance pour la défense et la sécurité
Table of contents
List of figures
List of tables
Acknowledgements
1 Introduction
2 On the need to create yet a new VI-AM
3 The VI-AM
3.1 A multi-layer approach
3.2 The TD part
3.3 The BU part
3.4 Determining CIA metrics values
3.4.1 Propagation of CIA metrics values
3.4.2 Evaluation of TD metrics values
3.4.3 Evaluation of BU metric values
3.5 Gap analysis
3.6 Possible improvements
3.7 Required data and tools
4 Concluding remarks and recommendations
References
List of symbols/abbreviations/acronyms/initialisms


List of figures

Figure 1: Components of the VI-AM.
Figure 2: The TD part of VI-AM.
Figure 3: The BU part of VI-AM.
Figure 4: Hypothetical example of a topology of functions, services and systems.


List of tables

Table 1: List of selected standardized processes for risk assessment/management.
Table 2: Mapping values of CIA effects for VI-AM.
Table 3: Required values for each type of CIA metric in the TD part of VI-AM.
Table 4: CIA metrics for the BU part of VI-AM.
Table 5: Differences between TD and BU metrics values.
Table 6: Mapping Exploitability values for Access, Authorization and Complexity.
Table 7: CIA & E metrics in the BU part of VI-AM.


Acknowledgements

The author would like to thank Dr. François Rhéaume and Mr. Daniel U. Thibault for their help in developing this methodology.


1 Introduction

Modern networked computing infrastructure and systems have reached unprecedented degrees of technological complexity that make their complete debugging and full certification very challenging. Software flaws will likely remain that may suddenly manifest as errors and service failures at runtime. When discovered by hackers or advanced malware, they may become vulnerabilities that can be exploited to launch many forms of cyber-attacks and negatively impact systems, military operations, and the data, information and knowledge that support them. It has become increasingly important to assess the presence of vulnerabilities in these systems and the impacts these may have on the Confidentiality, Integrity and Availability (CIA) of these systems and operations. These assessments should help prioritize corrective measures that will optimize resource utilization and minimize impacts.

This Scientific Report describes the Vulnerability/Impact Assessment Methodology (VI-AM), a methodology for: a) assessing the presence of vulnerabilities in Commercial Off-The-Shelf (COTS) and open source software systems, and b) defining cyber security requirements for operational functions. An example of an operational function would be the search for above-water entities, which is part of the tactical picture compilation operation or action, which in turn is part of many Navy missions.

The methodology has two main parts. First, the Top-Down (TD) part uses standardized CIA-related metrics to define cyber security requirements for selected operational functions and the supporting computing services and systems. Second, the Bottom-Up (BU) part uses databases of known vulnerabilities/impacts to assess the presence of vulnerabilities in these supporting software systems and then deduce the actual impacts that cyber-attacks exploiting them would have at the system and operational levels, in terms of the CIA metrics values associated with every vulnerability that was found.

The results of applying both parts of VI-AM (TD and BU) can then be compared in gap analyses. The differences between the required and actual metrics values, combined with the relative priority of operational functions, will reveal which systems must be improved or corrected first and the types of modifications required.

As shown in this report, the VI-AM methodology:

1. defines cyber security requirements for operational functions in terms of metrics describing the required degree of CIA (the needed CIA values);

2. propagates these cyber security requirement CIA values downward to the chain of supporting software systems that provide the computing services needed by these operational functions;

3. conducts vulnerability scans to: a) evaluate the actual Common Vulnerabilities and Exposures (CVE) vulnerabilities that are present in these supporting software systems, and b) identify the potential impacts these may have (in terms of CIA values);

4. propagates these potential impacts to the chain of provided computing services and the supported operational functions (the actual computed CIA values); and

5. evaluates the differences between the needed CIA values (found in Step 1) and the computed actual CIA values (found in Step 4) for operational functions (gap analysis).


Gap analyses will reveal potential weaknesses at the operational level and help identify and prioritize the computing components that should be the object of corrective measures.
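The five steps can be illustrated with a small end-to-end sketch (Python). The single chain, the 0-10 scale and every value below are invented for illustration; they are not taken from the report.

```python
# Toy walk-through of the five VI-AM steps. The 0-10 scale, the single
# chain, and all values are invented assumptions, not the report's data.

# Step 1: required CIA values for an operational function (TD-L3).
required = {"FnA": {"C": 9, "I": 9, "A": 6}}

# Step 2: propagate the requirement down the supporting chain (TD-L2, TD-L1).
chain = ["FnA", "SvA", "SysA"]                 # function -> service -> system
td = {element: dict(required["FnA"]) for element in chain}

# Step 3: vulnerability scans yield actual CIA values per system (BU-L1);
# a found CVE lowers the affected metrics (values invented here).
bu = {"SysA": {"C": 4, "I": 9, "A": 6}}

# Step 4: propagate the actual values back up the chain (BU-L2, BU-L3).
bu["SvA"] = dict(bu["SysA"])                   # service inherits its system
bu["FnA"] = dict(bu["SvA"])                    # function inherits its service

# Step 5: gap analysis at the function level; a positive gap is a shortfall.
gaps = {m: td["FnA"][m] - bu["FnA"][m] for m in ("C", "I", "A")}
print(gaps)                                    # {'C': 5, 'I': 0, 'A': 0}
```

Here the confidentiality of FnA falls five points short of its requirement, so the corrective work would target the confidentiality weaknesses of SysA first.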


2 On the need to create yet a new VI-AM

The methodology proposed in this document is mainly based on two internationally recognized open source databases from the National Institute of Standards and Technology (NIST) [1] that standardize the way software vulnerabilities and their possible impacts are expressed and used: the Common Vulnerabilities and Exposures (CVE) [2] and the Common Vulnerability Scoring System (CVSS) [3]. The CVE database is a catalogue of known software cyber threats, which can be categorized into vulnerabilities (software weaknesses that can be exploited) and exposures (system configuration issues that can be exploited). The CVSS database contains the information that can be used to assign severity scores to CVE vulnerabilities. As of August 2017, the number of software vulnerabilities in the CVE database exceeded 89,000 instances, covering more than 2,750 software products from 1,250 software vendors [4].

This dataset represents an important source of information for the vulnerability assessment process. Many vulnerability scanning tools can search the database and find all the CVE vulnerabilities (herein called CVE instances) present in COTS and open source software applications and operating systems [5]. Such lists of CVE instances then indicate which components must be updated, provided corrective software patches exist. Regular system updates are important because publicly available exploits (from the Exploit Database [6], for example) can easily be used by hackers to exploit many CVE vulnerabilities.

The identification of CVE vulnerabilities in software systems represents only one aspect of what can be done to build cyber threat pictures for these systems and then identify and impose corrective measures that will reduce risks for military operations. The CVSS database defines and standardizes a number of metrics that can be used to characterize the impact of possible attacks on CIA. In other words, for any specific CVE vulnerability present in a system, it is possible to compute the CIA impacts of attacks exploiting it. The CVE/CVSS databases can be used in two different ways:

1. Required CIA impact values (Section 3.2): CIA metrics can be used to express CIA impact requirements for any operational function. These CIA impact values specify the maximum tolerable effects that the exploitation of any CVE vulnerability could have on operational functions during operations.

2. Actual CIA impact values (Section 3.3): vulnerability scans of the system identify all the CVE vulnerabilities present within it, and the corresponding actual CIA impact values are then obtained from the CVSS database. These CIA impact values can be used to characterize both the supporting computing services and the operational functions.

It then becomes possible to compute the differences between the required CIA values of operational functions and the actual CIA impact values of the same functions. The results of this gap analysis help prioritize corrective measures.
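As an illustration, the C, I and A impact metrics can be read directly off a CVSS v3 vector string. The vector below is a made-up example, and the 0-2 ordinal scale is our own convention for this sketch (the CVSS v3 scoring formulas themselves assign numeric weights to the None/Low/High levels):

```python
# Parse the impact portion of a CVSS v3 vector string. The vector used
# here is a made-up example, and the 0-2 ordinal scale (N < L < H) is an
# assumption of this sketch, not part of the CVSS standard.

LEVELS = {"N": 0, "L": 1, "H": 2}   # None < Low < High

def cia_impacts(vector: str) -> dict:
    """Return {'C': n, 'I': n, 'A': n} from a CVSS v3 vector string."""
    fields = dict(part.split(":") for part in vector.split("/") if ":" in part)
    return {m: LEVELS[fields[m]] for m in ("C", "I", "A")}

vec = "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:L/A:N"   # hypothetical vector
print(cia_impacts(vec))                                # {'C': 2, 'I': 1, 'A': 0}
```

Comparing such parsed values against the tolerances defined for a function gives the per-metric differences that the gap analysis works with.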

To our knowledge, there is no standardized process or methodology that could be used as is to conduct this type of gap analysis for operational functions. The VI-AM methodology described in this report was developed to address this need. It proposes a number of steps that can be applied, along the following low-level chain, to generate the information necessary to compute these gaps in both the TD (downward) and BU (upward) directions:


Operational functions ⇄ Computing services/data ⇄ Computing systems

The definition of some of these steps was inspired by a number of related standardized processes, briefly described in Table 1 (bold text in each row identifies the elements that were considered in building VI-AM).

One observation is that these processes tend to address the whole spectrum of cyber-related problems at all organizational levels (strategic, operational, and tactical), which is a huge and complex undertaking. VI-AM is much simpler and easier to apply: it focuses on the vulnerabilities and impacts found in the CVE/CVSS databases, and it is applied at low levels.

VI-AM deals with the technological details of a software vulnerability assessment. Other processes may deal with bits (0s and 1s), with human factors, or with any other level included in a full assessment of cyber security. Potential links between VI-AM and these processes include:

1. Higher-level cyber security assessment processes specify the operational functions to be studied with VI-AM and their required degree of cyber security in terms of CIA metrics values. The TD part of VI-AM needs this information as input to start evaluating these metrics for computing services and systems.

2. Higher-level cyber security assessment processes will most certainly require an evaluation of the degree of software security against known vulnerabilities of the actual systems and services that support specific operational functions.

3. Higher-level cyber security assessment processes may require multiple applications of the BU part of VI-AM in which specific vulnerabilities present in the systems are considered solved. These evaluations would reveal which vulnerabilities in the systems are more critical (or have greater potential negative effects on operational functions) or, conversely, which corrective measures would provide the greatest improvement in cyber security. For example, a well-connected system's vulnerability could bring down the cyber security rating of multiple services; a single fix would then pay off in multiple places in the overall system.

These are just some preliminary examples; they will be made explicit and clarified through further research, integration with the other parts of the project, and client requirements.
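The repeated BU evaluations of point 3 can be sketched as a simple "what-if" loop. The topology, the vulnerability list, the 0-10 scale and the worst-case (minimum) aggregation below are all illustrative assumptions, including the hypothetical CVE names:

```python
# Rank candidate fixes by how much marking each vulnerability as solved
# would raise the worst-case availability (A) of the services. Topology,
# CVE names and scores are invented for illustration.

systems_of = {"SvA": ["SysA", "SysB", "SysC"], "SvB": ["SysC"]}
# Availability value implied by each open vulnerability (0-10, lower = worse).
vulns = {"SysA": {"CVE-X": 7}, "SysC": {"CVE-Y": 2, "CVE-Z": 5}}

def service_a(service, solved=frozenset()):
    """Worst-case A over the service's systems, ignoring 'solved' CVEs."""
    values = [10]   # a system with no open vulnerability scores full marks
    for name in systems_of[service]:
        for cve, a in vulns.get(name, {}).items():
            if cve not in solved:
                values.append(a)
    return min(values)

baseline = {s: service_a(s) for s in systems_of}
# Re-run the BU evaluation once per vulnerability considered solved.
gains = {}
for sys_vulns in vulns.values():
    for cve in sys_vulns:
        fixed = {s: service_a(s, solved={cve}) for s in systems_of}
        gains[cve] = sum(fixed[s] - baseline[s] for s in systems_of)

print(max(gains, key=gains.get))   # the single most profitable fix
```

In this toy run the hypothetical CVE-Y on SysC is the most profitable fix, because solving it raises the rating of both SvA and SvB at once, which is exactly the "single fix pays off in multiple places" situation described above.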


Table 1: List of selected standardized processes for risk assessment/management.

1. Risk assessment methodology (SP 800-30; SP 800-30r1) [7, 8], NIST (2002; 2012): "A foundation for the development of an effective risk management program, containing both the definitions and the practical guidance necessary for assessing and mitigating risks identified within IT systems." [9]

2. Guide for Applying the Risk Management Framework to Federal Information Systems (SP 800-37) [10], NIST (2010); note: the ITSG-33 [11] standard is similar to this framework: "Guidelines for applying the Risk Management Framework to federal information systems to include conducting the activities of security categorization, security control selection and implementation, security control assessment, information system authorization, and security control monitoring." […] "The guidelines have been developed: To ensure that managing information system-related security risks is consistent with the organization's mission/business objectives and overall risk strategy established by the senior leadership through the risk executive (function); To ensure that information security requirements, including necessary security controls, are integrated into the organization's enterprise architecture and system development life cycle processes; To support consistent, well-informed, and ongoing security authorization decisions (through continuous monitoring), transparency of security and risk management-related information, and reciprocity; and To achieve more secure information and information systems within the federal government through the implementation of appropriate risk mitigation strategies." [10]

3. Risk management process (SP 800-39) [12], NIST (2011): "Introduces a structured approach to risk management in information security. This high level process presents four actions: Frame, Assess, Respond and Monitor." […] "Assess, Respond and Monitor actions are intuitive to understand, while Frame needs explanations. Frame tries to establish how organizations manage risk regarding various aspects such as assumptions, constraints, risk tolerances, and priorities used within organizations for making investment and operational decisions." [9]

4. Information security risk management (ISO/IEC 27005:2011) [13], ISO (2011): "The information risk assessment process consists of Context Establishment, Risk Assessment, Risk Treatment, Risk Acceptance, Risk Communication, and Risk Monitoring." […] "It shows many similarities with the Risk Assessment Process of the NIST SP 800-30 r1 standard." [9]

5. Threat and Risk Assessment methodology (TRA-1) [14], CSE, RCMP (2007): "TRA-1 applies to both physical security and information security. The document provides a detailed description of the five (5) steps that lead to the production of a plan for assessing threats and risks." [9]


3 The VI-AM

This chapter describes the multi-layered VI-AM (Section 3.1). It is made of two complementary sub-methods or parts: the TD part (Section 3.2) and the BU part (Section 3.3). Section 3.4 describes how to apply VI-AM, and Section 3.5 shows how gap analyses can be performed with the results. The rest of Chapter 3 describes potential improvements to VI-AM (Section 3.6) and the data and tools required (Section 3.7).

3.1 A multi-layer approach

Figure 1 illustrates the main components of VI-AM. The methodology merges two complementary sub-processes or parts, TD and BU, that are organized around three logical levels: L1, L2 and L3.

Level 3 (L3) corresponds to the tactical/operational functions, herein called operational functions, that are used to achieve specific operations or actions within the context of military missions. An example of a naval operational function would be the search for above-water entities, which is part of the tactical picture compilation operation or action, which in turn is part of many Navy missions.

Level 2 (L2) corresponds to the computing services that realize the L3 functions. Computing services differ from operational functions in that they are functionalities provided by computing systems that help accomplish operational functions during missions.

Finally, Level 1 (L1) represents the computing systems, both software and hardware, that provide the computing services (L2) during missions.

The TD part of VI-AM defines the cyber security requirements of the selected operational function(s) in terms of CIA metrics at L3, and then propagates these values to supporting elements at L2 and L1. Based on the results of vulnerability and impact analyses, the BU part evaluates the actual degree of cyber security of the computing systems at L1 in terms of the same CIA metrics, and then propagates these values to the elements at L2 and L3. The TD part is akin to risk management and the BU part is akin to risk assessment [7, 8, 12].

In this document, the following initialisms are used to refer to specific parts of the process; for example, VI-AM activities taking place in the TD part at level L3 are called activities at TD-L3. Normally, one applies VI-AM in the following sequence: TD-L3, TD-L2, TD-L1, BU-L1, BU-L2, and finally BU-L3 (Figure 1).

Functions (L3), services (L2) and systems (L1) are called elements in VI-AM. These elements form chains in the sense that they are linked by provider-consumer relationships between layers. For example, a specific computer and its software (at L1) provide computing services (at L2) that are consumed by specific operational functions (at L3). The TD and BU parts of VI-AM are applied to chains of interrelated elements.
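A minimal way to represent such provider-consumer chains is a small tree of elements. The class and field names below are our own conventions for this sketch, not something VI-AM prescribes:

```python
# Minimal data model for VI-AM elements and their provider-consumer links.
# Class and field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Element:
    name: str
    level: int                                     # 1 system, 2 service, 3 function
    providers: list = field(default_factory=list)  # elements one level down

fn = Element("FnA", 3)
sv = Element("SvA", 2)
sys_a, sys_b = Element("SysA", 1), Element("SysB", 1)
sv.providers += [sys_a, sys_b]       # SysA and SysB provide SvA
fn.providers.append(sv)              # SvA is consumed by FnA

def chains(element, prefix=()):
    """Enumerate every function-to-system chain below 'element'."""
    path = prefix + (element.name,)
    if not element.providers:
        return [path]
    return [c for p in element.providers for c in chains(p, path)]

print(chains(fn))   # [('FnA', 'SvA', 'SysA'), ('FnA', 'SvA', 'SysB')]
```

Both the TD and BU traversals then amount to walks over this tree, downward along `providers` for TD and upward from the leaves for BU.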


[Figure 1 shows the TD and BU parts side by side across the three levels. TD part: (TD-L3, operational functions) the focus is military functions; identify TD-L3 metrics and define their values. (TD-L2, computing services) identify needed computing services; identify TD-L2 metrics and deduce their values. (TD-L1, computing systems) identify needed computing systems; identify TD-L1 metrics and deduce their values. BU part: (BU-L1) performance/vulnerability/impact studies are made at this level; BU-L1 metrics are the TD-L1 metrics; compute their values, compute Δ1, and deduce possible impacts at L1. (BU-L2) BU-L2 metrics are the TD-L2 metrics; deduce their values from BU-L1, compute Δ2, and deduce possible impacts at L2. (BU-L3) BU-L3 metrics are the TD-L3 metrics; deduce their values from BU-L2, compute Δ3, and deduce possible impacts at L3.]

Figure 1: Components of the VI-AM.

Two postulates are used in VI-AM:

1. The first postulate applies to the TD part of VI-AM: the CIA metrics values specifying the requirements (at TD-L3) are propagated to the supporting computing services (at TD-L2) and systems (at TD-L1). In other words, all metrics values used to characterize the required degree of cyber security of the selected operational functions are given to the related computing services and systems. For example, a function that needs to be fully cyber secure in terms of CIA should involve computing services and systems that support (and have) the same degree of cyber security.

2. The second postulate applies to the BU part of VI-AM: all the CIA and exploitability (E) metrics values found for the supporting systems are propagated to the computing services they provide (at BU-L2), and then to the operational functions they support (at BU-L3). For example, three vulnerabilities negatively impacting the integrity of one specific system will also affect the integrity of the computing service(s) it provides and of the function(s) it supports; all these L2 and L3 elements should normally be given the same integrity metric value(s) as the L1 system they rely on. The BU propagation of metrics values from one layer to another must be made by a security expert; propagated metrics values may have to be adapted according to the specifics of the layers.
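The second postulate can be sketched as a worst-value-wins upward pass. The topology and numbers are invented, and in a real application the security expert would still adapt the propagated values per layer, as noted above:

```python
# BU propagation sketch: a service or function inherits the worst (lowest)
# CIA values of the elements that support it. Topology and 0-10 values are
# invented; the report requires expert adjustment at each layer.

providers = {"FnA": ["SvA"], "SvA": ["SysA", "SysB"]}
# Actual values computed at BU-L1 from vulnerability scans (invented).
bu = {"SysA": {"C": 3, "I": 9, "A": 8}, "SysB": {"C": 7, "I": 6, "A": 8}}

def propagate_up(element):
    """Derive an element's CIA values as the worst of its providers'."""
    if element not in bu:                          # not a scanned leaf system
        below = [propagate_up(p) for p in providers[element]]
        bu[element] = {m: min(v[m] for v in below) for m in ("C", "I", "A")}
    return bu[element]

print(propagate_up("FnA"))   # {'C': 3, 'I': 6, 'A': 8}
```

FnA ends up with SysA's weak confidentiality and SysB's weak integrity: each metric of a function is only as good as the weakest supporting system for that metric.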


The difference between the CIA metrics values obtained at L3 for TD and BU will reveal how the actual cyber security of the selected operational functions diverges from the requirements. This gap analysis is further discussed in Section 3.5.

Additionally, the application of the TD part will generate important information that may be used by acquisition project teams to help define requirements for a new system. The identified metrics values are good indicators that specify critical aspects of the cyber security needed for elements at the L2 and L1 levels.

3.2 The TD part

This part of VI-AM defines the cyber security requirements for elements at TD-L3, TD-L2 and TD-L1 (Figure 2). The operational functions to be studied are first selected, or given by higher-level assessment processes (TD-L3-S1). The missions in which these functions are utilized may be used to specify their required degree of cyber security in terms of CIA.

The computing services and systems supporting the selected operational functions in the chains are then identified (at TD-L2-S1 and TD-L1-S1). The metrics and values defining the required cyber security in terms of CIA for the functions (TD-L3-S2 and TD-L3-S3) are then used to characterize the supporting computing services and systems (TD-L2-S2 and TD-L1-S2). As shown in Section 3.4, a high degree of required cyber security translates into a numerically large value for the metric.

[Figure 2 steps: TD-L3-S1 select the functions to be studied (scope); TD-L3-S2 identify TD-L3 metrics; TD-L3-S3 deduce values of TD-L3 metrics; TD-L2-S1 deduce needed computing services; TD-L2-S2 deduce values of TD-L2 metrics; TD-L1-S1 deduce needed computing systems; TD-L1-S2 deduce values of TD-L1 metrics.]

Figure 2: The TD part of VI-AM.

3.3 The BU part

This part of VI-AM starts where the TD part ends (Figure 3). The same CIA metrics used at TD-L1 are reused in the BU part to characterize the identified system(s) (BU-L1-S2), service(s) (BU-L2-S1) and operational function(s) (BU-L3-S1) along the chains of elements.

The values of these CIA metrics (BU-L1-S2) are computed using results from the vulnerability and impact analyses (BU-L1-S1) made on all the systems identified at TD-L1. These analyses use information such as the systems' operating system types and versions, software names and versions, and publicly known vulnerabilities from the CVE/CVSS databases [2, 3, 4] to evaluate the values of the CIA metrics for these systems.

These databases also provide information that helps evaluate how easily the vulnerabilities of a system can be exploited, also called their exploitability (Section 3.6). As shown in Section 3.4, known cyber security vulnerabilities lower the values: the more vulnerable a system is, the lower its metrics values.

[Figure 3 steps: BU-L1-S1 perform performance/vulnerability/impact studies on the identified systems; BU-L1-S2 compute values of BU-L1 metrics (the TD-L1 metrics); BU-L1-S3 compute the Δ1 value at L1 for the selected focus; BU-L1-S4 identify possible impacts/risks at L1 for the selected focus; BU-L2-S1 deduce values of BU-L2 metrics (from BU-L1 metrics); BU-L2-S2 compute the Δ2 value at L2 for the selected focus; BU-L2-S3 identify possible impacts/risks at L2 for the selected focus; BU-L3-S1 deduce values of BU-L3 metrics (from BU-L2 metrics); BU-L3-S2 compute the Δ3 value at L3 for the selected focus; BU-L3-S3 identify possible impacts/risks at L3 for the selected focus.]

Figure 3: The BU part of VI-AM.

Both the TD and BU parts generate metrics and associated values characterizing the required and actual degrees of cyber security for the operational functions at L3 (TD-L3 and BU-L3), the computing services at L2 (TD-L2 and BU-L2), and the computing systems at L1 (TD-L1 and BU-L1).

The differences between the two sets of metrics at each logical level (Δ1, Δ2 and Δ3, obtained at BU-L1-S3, BU-L2-S2 and BU-L3-S2, respectively) reveal the possible impacts/risks (BU-L1-S4, BU-L2-S3 and BU-L3-S3), called the gaps (Section 3.5). The next section describes how the CIA and exploitability metrics are obtained.
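At each level, the Δ value reduces to a per-metric difference between the TD (required) and BU (actual) results; a sketch with invented numbers on a 0-10 scale:

```python
# Gap analysis at one level: delta = required (TD) minus actual (BU) per
# CIA metric; a positive delta flags a shortfall to correct. All values
# here are invented for illustration.

td = {"FnA": {"C": 9, "I": 8, "A": 6}}   # required (TD-L3)
bu = {"FnA": {"C": 4, "I": 8, "A": 7}}   # actual   (BU-L3)

delta = {m: td["FnA"][m] - bu["FnA"][m] for m in ("C", "I", "A")}
shortfalls = {m: d for m, d in delta.items() if d > 0}
print(shortfalls)   # {'C': 5}: only confidentiality misses its requirement
```

A metric whose actual value exceeds the requirement (availability here) produces a negative delta and needs no corrective action.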

3.4 Determining CIA metrics values

The TD part of VI-AM aims to capture the cyber security requirements for functions (L3), services (L2) and systems (L1), all of which are called elements. The metrics considered at TD-L3, TD-L2 and TD-L1 should describe the required degree of CIA for all elements at these levels. The CIA metrics defined in the CVE/CVSS databases [2, 3, 4] are used.

10 DRDC-RDDC-2017-R148

3.4.1 Propagation of CIA metrics values

Before defining how CIA metrics values are determined, it is important to mention how VI-AM takes into

account the fact that one or many specific operational function(s) can be supported by one or many simultaneous

computing service(s) and system(s) in chains that may have different topologies and common elements.

Figure 4 illustrates a hypothetical example showing different chains of elements. This example is used

throughout this chapter. Two operational functions are defined as FnA and FnB. They are supported by

computing services (SvA, SvB, SvC), which are provided by different topologies of systems (SysA, SysB,

SysC, SysD, SysE, SysF). Systems SysA and SysB work together in parallel to provide SvA. SysC works

alone and supports services SvB and SvA. SysD works in series with SysF, and these two elements work in

parallel with SysE to support SvC.

[Figure 4 depicts a Type A mission with operational functions FnA and FnB at Level 3 (Functions), computing services SvA, SvB and SvC at Level 2 (Services), and computing systems SysA through SysF at Level 1 (Systems), connected in the topology described above.]

Figure 4: Hypothetical example of a topology of functions, services and systems.

As shown in the next section, both the TD and BU parts of VI-AM require that CIA metrics values be

attributed to all the elements—functions, services, systems—in the chains. The TD chains of elements

that are considered by VI-AM in Figure 4 are the following:

TD chain 1: FnA → SvA → (SysA-SysB)

TD chain 2: FnA → SvA → SysC

TD chain 3: FnB → SvB → SysC

TD chain 4: FnB → SvC → ((SysD → SysF)-SysE)


According to the first postulate (Section 3.1), the strongest CIA metrics values—in terms of cyber security—of operational functions FnA and FnB are attributed to the CIA metrics of both supporting computing services and systems in each chain. One particularity of this example is that TD chains 2 and 3 have the same ending element: system SysC. In this example, the CIA metrics values that are the strongest—between SvA and SvB—are attributed to SysC.

In the BU part, a second set of CIA metrics values are assigned to all computing systems based on results

obtained from vulnerability and impact analyses (such as in [15]). The BU chains of elements that are

considered by VI-AM in this example are the following:

BU chain 1: (SysA-SysB) → SvA → FnA

BU chain 2: SysC → SvA → FnA

BU chain 3: SysC → SvB → FnB

BU chain 4: ((SysD → SysF)-SysE) → SvC → FnB

According to the second postulate, the weakest CIA metrics values—in terms of cyber security—of

computing systems (SysA, SysB, SysC, SysD, SysE, SysF) are attributed to the CIA metrics of computing

services and operational functions in each chain. In this example, the CIA metrics values that are the

weakest between SysA, SysB and SysC are attributed to SvA and, similarly, the CIA metrics values that are

the weakest between SvB and SvC are attributed to FnB.
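The two propagation postulates can be sketched as follows for the example topology of Figure 4. The data structures and helper names below are illustrative assumptions, not part of VI-AM; values follow the VI-AM numerical scale, where higher means stronger cyber security.

```python
# Which systems provide each service, and which services support each
# function, per Figure 4 (illustrative encoding of the example topology).
PROVIDERS = {"SvA": ["SysA", "SysB", "SysC"], "SvB": ["SysC"],
             "SvC": ["SysD", "SysE", "SysF"]}
CONSUMERS = {"FnA": ["SvA"], "FnB": ["SvB", "SvC"]}

def td_propagate(fn_values, consumers, providers):
    """First postulate: push the strongest (highest) required CIA value of
    each function down to its supporting services and systems."""
    sv_values, sys_values = {}, {}
    for fn, svs in consumers.items():
        for sv in svs:
            sv_values[sv] = max(sv_values.get(sv, 0), fn_values[fn])
    for sv, systems in providers.items():
        for s in systems:
            sys_values[s] = max(sys_values.get(s, 0), sv_values[sv])
    return sv_values, sys_values

def bu_propagate(sys_values, providers, consumers):
    """Second postulate: push the weakest (lowest) actual CIA value of each
    system up to the services and functions it supports."""
    sv_values = {sv: min(sys_values[s] for s in systems)
                 for sv, systems in providers.items()}
    fn_values = {fn: min(sv_values[sv] for sv in svs)
                 for fn, svs in consumers.items()}
    return sv_values, fn_values
```

In this sketch, SysC automatically receives the strongest requirement between SvA and SvB in the TD direction, and FnB receives the weakest value between SvB and SvC in the BU direction, as described above.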

Possible CIA values that can be assigned to computing systems (L1), computing services (L2) and

operational functions (L3) (both in VI-AM TD and BU parts) are listed in Table 2. Values are assigned as

a function of the effects the vulnerabilities can achieve—no effect (10), partial effects (5), complete

effects (1). The higher the number, the more secure the system is: a vulnerability that has a weak impact

lowers the value less than a vulnerability that has a serious one.

The choice of numerical values—for example, a value of 1 instead of 2 for complete effects impact in

Table 2—is arbitrary save that they must form a strictly increasing sequence of values. As the same

numerical values are used throughout the TD and BU parts of VI-AM, this mapping induces no bias

errors.

Table 2: Mapping values of CIA effects for VI-AM.

  Metric types      Complete effects   Partial effects   No effect
  Confidentiality   1                  5                 10
  Integrity         1                  5                 10
  Availability      1                  5                 10

As an example (for the TD part), an operational function that can tolerate no degradation in a given CIA metric is assigned the no effect value, while functions that can tolerate partial or total degradation are assigned the partial effects or complete effects value, respectively.
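As a minimal sketch, the Table 2 mapping can be implemented as a lookup from qualitative effect ratings to VI-AM values; the rating labels below are illustrative assumptions.

```python
# Illustrative mapping of qualitative CVE/CVSS-style impact ratings to the
# numerical values of Table 2. The scale is arbitrary, provided it strictly
# increases from complete effects to no effect.
CIA_VALUE = {"COMPLETE": 1, "PARTIAL": 5, "NONE": 10}

def cia_metrics(conf_impact, integ_impact, avail_impact):
    """Return the (C, I, A) values for one element from qualitative ratings."""
    return (CIA_VALUE[conf_impact],
            CIA_VALUE[integ_impact],
            CIA_VALUE[avail_impact])
```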


3.4.2 Evaluation of TD metrics values

Applying the TD part to our example (Figure 4) yields the following results.

CIA metrics at TD-L3 (operational functions FnA and FnB):

The following metrics values are identified for FnA and FnB according to steps TD-L3-S2 and TD-L3-S3

(Figure 2):

Confidentiality, Integrity, Availability (CTD-L3, ITD-L3, ATD-L3): each operational function (FnA and FnB) is attributed the numerical values corresponding to its highest required degree of CIA (Table 2). The missions in which these functions are utilized may help determine these values.

CIA metrics at TD-L2 (computing services SvA, SvB, SvC):

The following metrics values are identified for SvA, SvB and SvC according to the first postulate

(Section 3.1) and steps TD-L2-S1 and TD-L2-S2:

Confidentiality, Integrity, Availability (CTD-L2, ITD-L2, ATD-L2): each service (SvA, SvB, SvC) is normally attributed the numerical values corresponding to the highest required degree of CIA of the related operational functions. In our example, the FnA CIA metric values are attributed to SvA. The FnB CIA metric values are attributed to SvB and SvC.

CIA metrics at TD-L1 (computing systems SysA, SysB, SysC, SysD, SysE, SysF):

The following metrics values are identified for SysA, SysB, SysC, SysD, SysE and SysF according to the

first postulate and steps TD-L1-S1 and TD-L1-S2:

Confidentiality, Integrity, Availability (CTD-L1, ITD-L1, ATD-L1): each system (SysA, SysB, SysC, SysD, SysE and SysF) is normally attributed the numerical values corresponding to the highest required degree of CIA of the related computing services. In our example, the SvA CIA metric values are attributed to SysA and SysB. The highest CIA metric values of SvB and SvA are assigned to SysC, and the CIA metric values of SvC are assigned to SysD, SysE and SysF.

Table 3 shows the metrics for which values must be determined by the TD part of VI-AM for our

example.


Table 3: Required values for each type of CIA metric in the TD part of VI-AM.

               Confidentiality (C)   Integrity (I)   Availability (A)
  Function A   FnA-CTD-L3            FnA-ITD-L3      FnA-ATD-L3
  Function B   FnB-CTD-L3            FnB-ITD-L3      FnB-ATD-L3
  Service A    SvA-CTD-L2            SvA-ITD-L2      SvA-ATD-L2
  Service B    SvB-CTD-L2            SvB-ITD-L2      SvB-ATD-L2
  Service C    SvC-CTD-L2            SvC-ITD-L2      SvC-ATD-L2
  System A     SysA-CTD-L1           SysA-ITD-L1     SysA-ATD-L1
  System B     SysB-CTD-L1           SysB-ITD-L1     SysB-ATD-L1
  System C     SysC-CTD-L1           SysC-ITD-L1     SysC-ATD-L1
  System D     SysD-CTD-L1           SysD-ITD-L1     SysD-ATD-L1
  System E     SysE-CTD-L1           SysE-ITD-L1     SysE-ATD-L1
  System F     SysF-CTD-L1           SysF-ITD-L1     SysF-ATD-L1

3.4.3 Evaluation of BU metric values

The BU part of VI-AM aims to characterize the systems identified at TD-L1-S1 that provide the

computing services consumed by the selected operational functions.

Vulnerability and impact analyses—like the ones made in [15]—must be performed for each identified system (Figure 3; TD-L1-S1) to identify inherent vulnerabilities and their possible impacts on CIA at the three logical levels (BU-L1, BU-L2 and BU-L3). These analyses use information such as the software versions of each system and the publicly known vulnerabilities for this software [2, 3, 4].

CIA metrics at BU-L1 (computing systems SysA, SysB, SysC, SysD, SysE, SysF):

The following metrics values are identified for SysA, SysB, SysC, SysD, SysE and SysF according to steps

BU-L1-S1 and BU-L1-S2 (Figure 3):

Confidentiality, Integrity, Availability (CBU-L1, IBU-L1, ABU-L1): each system (SysA, SysB, SysC, SysD, SysE and SysF) is normally attributed the numerical values corresponding to the degree of CIA found by the vulnerability and impact analyses.

Important note: the vulnerability assessment may find many vulnerabilities (CVE instances) for each system. As specified in the second postulate (Section 3.1), the cyber expert will decide which values for CBU-L1, IBU-L1 and ABU-L1 should be kept for each system, and then propagated to the L2 and L3 levels.
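A conservative default for this choice, keeping the worst case across all CVEs found on a system, could be sketched as follows (the helper name is hypothetical; the expert may override the result):

```python
# When several CVEs affect one system, keep the weakest (lowest) value seen
# for each CIA metric. This is one possible default; VI-AM leaves the final
# choice to the cyber expert.
def worst_case(cve_impacts):
    """cve_impacts: list of (C, I, A) tuples, one per CVE found on a system.
    Returns the per-metric minimum across all CVEs."""
    return tuple(min(values) for values in zip(*cve_impacts))
```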

CIA metrics at BU-L2 (computing services SvA, SvB, SvC):

The following metrics values are identified for SvA, SvB and SvC according to the second postulate and

step BU-L2-S1:

Confidentiality, Integrity, Availability (CBU-L2, IBU-L2, ABU-L2): each computing service (SvA, SvB and SvC) is normally attributed the numerical values corresponding to the weakest (lowest) CIA values afforded by the providing systems. In our example, unless specified otherwise by the cyber expert, the weakest CIA values of SysA, SysB and SysC would normally be attributed to SvA, the weakest CIA values of SysC would be attributed to SvB and the weakest CIA values of SysD, SysE and SysF would be attributed to SvC. Propagated metrics values may have to be adapted by the expert according to the specifics of the L2 layer.

CIA metrics at BU-L3 (operational functions FnA and FnB):

The following metrics values are identified for FnA and FnB according to the second postulate and step

BU-L3-S1:

Confidentiality, Integrity, Availability (CBU-L3, IBU-L3, ABU-L3): each operational function (FnA and FnB) is normally attributed the numerical values corresponding to the weakest (lowest) CIA values afforded by the providing computing services. In our example, unless specified otherwise by the cyber expert, the CIA values of SvA would normally be attributed to FnA and the weakest CIA values of SvB and SvC would be attributed to FnB. Propagated metrics values may have to be adapted by the expert according to the specifics of the L3 layer.

Table 4 shows the metrics for which values must be determined by the BU part of VI-AM in our example.

Table 4: CIA metrics for the BU part of VI-AM.

               Confidentiality (C)   Integrity (I)   Availability (A)
  System A     SysA-CBU-L1           SysA-IBU-L1     SysA-ABU-L1
  System B     SysB-CBU-L1           SysB-IBU-L1     SysB-ABU-L1
  System C     SysC-CBU-L1           SysC-IBU-L1     SysC-ABU-L1
  System D     SysD-CBU-L1           SysD-IBU-L1     SysD-ABU-L1
  System E     SysE-CBU-L1           SysE-IBU-L1     SysE-ABU-L1
  System F     SysF-CBU-L1           SysF-IBU-L1     SysF-ABU-L1
  Service A    SvA-CBU-L2            SvA-IBU-L2      SvA-ABU-L2
  Service B    SvB-CBU-L2            SvB-IBU-L2      SvB-ABU-L2
  Service C    SvC-CBU-L2            SvC-IBU-L2      SvC-ABU-L2
  Function A   FnA-CBU-L3            FnA-IBU-L3      FnA-ABU-L3
  Function B   FnB-CBU-L3            FnB-IBU-L3      FnB-ABU-L3

3.5 Gap analysis

After Tables 3 and 4 are completed, the difference between required (TD) and actual (BU) CIA metric

values (ΔC, ΔI, ΔA; Table 5) can be computed for all the selected operational functions, computing

services and systems in the chains. The cyber expert should decide which CIA metrics values should be

used for this gap analysis.


Table 5: Differences between TD and BU metrics values.

               Confidentiality (C)   Integrity (I)   Availability (A)
  Function A   FnA-ΔC-L3             FnA-ΔI-L3       FnA-ΔA-L3
  Function B   FnB-ΔC-L3             FnB-ΔI-L3       FnB-ΔA-L3
  Service A    SvA-ΔC-L2             SvA-ΔI-L2       SvA-ΔA-L2
  Service B    SvB-ΔC-L2             SvB-ΔI-L2       SvB-ΔA-L2
  Service C    SvC-ΔC-L2             SvC-ΔI-L2       SvC-ΔA-L2
  System A     SysA-ΔC-L1            SysA-ΔI-L1      SysA-ΔA-L1
  System B     SysB-ΔC-L1            SysB-ΔI-L1      SysB-ΔA-L1
  System C     SysC-ΔC-L1            SysC-ΔI-L1      SysC-ΔA-L1
  System D     SysD-ΔC-L1            SysD-ΔI-L1      SysD-ΔA-L1
  System E     SysE-ΔC-L1            SysE-ΔI-L1      SysE-ΔA-L1
  System F     SysF-ΔC-L1            SysF-ΔI-L1      SysF-ΔA-L1

Using the confidentiality of Function A as an example, the difference FnA-ΔC-L3 for this function is

obtained by calculating FnA-CTD-L3 - FnA-CBU-L3. The Fnx-ΔC-L3, Fnx-ΔI-L3 and Fnx-ΔA-L3 values are good

indicators of the gaps between the actual and the required degrees of CIA cyber security for this function.

Using the chains of involved elements, these differences also suggest specific improvements for the

computing system(s) and possibly for the services.
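The gap computation can be sketched as follows, with TD and BU values stored per element as (C, I, A) tuples; this data layout is an illustrative assumption, not prescribed by VI-AM.

```python
# Sketch of the Section 3.5 gap analysis: for each element, the gap is the
# required (TD) value minus the actual (BU) value for each CIA metric.
# A positive gap flags an element whose actual cyber security falls short
# of what the mission requires.
def gaps(td, bu):
    """td and bu map element name -> (C, I, A) tuple; returns per-metric deltas."""
    return {name: tuple(t - b for t, b in zip(td[name], bu[name]))
            for name in td}
```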

A more nuanced evaluation of the threat to components could be obtained by scaling the actual cyber

security with the exploitability of each vulnerability involved (Section 3.6).

The relative degree of criticality of each operational function could also help project teams to prioritize

system improvement efforts by focussing on the most critical problems first.

3.6 Possible improvements

Public vulnerability databases such as CVE and CVSS [3, 1, 4] also provide the information needed to evaluate the ease with which the vulnerabilities present in a system can be exploited by hackers: the exploitability E. Equation (1) shows that E can be calculated using numerical values mapped from the information provided by the databases. Table 6 proposes mapped values for VI-AM; these values increase as exploitation becomes easier.

If exploitability values are computed for all vulnerabilities found in the systems, Table 7 would replace Table 4. The cyber expert would decide which values should be kept for each system, and which values should be propagated to the L2 and L3 levels.

Exploitability of a vulnerability = Access × Authorization × Complexity    (1)


Table 6: Mapping Exploitability values for Access, Authorization and Complexity.

  Access          Local = 1      Adjacent network = 5   Remote = 10
  Authorization   Multiple = 1   Single = 5             None = 10
  Complexity      High = 1       Medium = 5             Low = 10
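Equation (1) combined with the Table 6 mapping could be sketched as follows; the label strings are illustrative.

```python
# Table 6 mappings: each factor grows as exploitation becomes easier.
ACCESS = {"LOCAL": 1, "ADJACENT_NETWORK": 5, "REMOTE": 10}
AUTHORIZATION = {"MULTIPLE": 1, "SINGLE": 5, "NONE": 10}
COMPLEXITY = {"HIGH": 1, "MEDIUM": 5, "LOW": 10}

def exploitability(access, authorization, complexity):
    """Equation (1): E = Access x Authorization x Complexity."""
    return ACCESS[access] * AUTHORIZATION[authorization] * COMPLEXITY[complexity]
```

For instance, a remotely reachable vulnerability requiring no authorization and of low complexity yields the maximum E of 1000, while a local, multiply-authorized, high-complexity one yields the minimum of 1.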

Table 7: CIA & E metrics in the BU part of VI-AM.

               Confidentiality (C)   Integrity (I)   Availability (A)   Exploitability (E)
  System A     SysA-CBU-L1           SysA-IBU-L1     SysA-ABU-L1        SysA-EBU-L1
  System B     SysB-CBU-L1           SysB-IBU-L1     SysB-ABU-L1        SysB-EBU-L1
  System C     SysC-CBU-L1           SysC-IBU-L1     SysC-ABU-L1        SysC-EBU-L1
  System D     SysD-CBU-L1           SysD-IBU-L1     SysD-ABU-L1        SysD-EBU-L1
  System E     SysE-CBU-L1           SysE-IBU-L1     SysE-ABU-L1        SysE-EBU-L1
  System F     SysF-CBU-L1           SysF-IBU-L1     SysF-ABU-L1        SysF-EBU-L1
  Service A    SvA-CBU-L2            SvA-IBU-L2      SvA-ABU-L2         SvA-EBU-L2
  Service B    SvB-CBU-L2            SvB-IBU-L2      SvB-ABU-L2         SvB-EBU-L2
  Service C    SvC-CBU-L2            SvC-IBU-L2      SvC-ABU-L2         SvC-EBU-L2
  Function A   FnA-CBU-L3            FnA-IBU-L3      FnA-ABU-L3         FnA-EBU-L3
  Function B   FnB-CBU-L3            FnB-IBU-L3      FnB-ABU-L3         FnB-EBU-L3

3.7 Required data and tools

The execution of VI-AM requires specific data, information and tools. The following types of information

must be available to apply the VI-AM methodology:

1. The list of operational functions to be studied. This list must include all operational functions,

supporting computing services, and systems.

2. For each operational function, the required value of CIA for the mission being considered.

3. For each system, a list of Operating Systems (OS), software and their version.

4. An updated list of CVE/CVSS-like vulnerability data, including the corresponding access, authorization, complexity and CIA values.
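One possible shape for these inputs is sketched below; it is purely illustrative, as VI-AM does not prescribe a data format.

```python
# Hypothetical records for the four input types listed above (Section 3.7).
from dataclasses import dataclass, field

@dataclass
class SystemRecord:
    name: str                 # e.g. "SysA"
    software: list            # (product, version) pairs, including the OS
    cves: list = field(default_factory=list)  # matched CVE identifiers

@dataclass
class FunctionRecord:
    name: str                 # e.g. "FnA"
    required_cia: tuple       # required (C, I, A) for the mission considered
    services: list            # names of supporting computing services
```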


4 Concluding remarks and recommendations

This report proposes and guides the utilization of a new methodology for assessing vulnerabilities in

software systems and the impacts they may have on CIA at the system and operational levels; it is called

the Vulnerability/Impact Assessment Methodology. VI-AM uses the publicly available standardized

CVE/CVSS databases and metrics to identify vulnerabilities and impacts in COTS and open source

operating systems and software applications.

The VI-AM methodology is made of two parts. The TD part is used to specify the required tolerable

impact values (in terms of CIA) of cyber-attacks on operational functions, supporting computing

services/data and software systems.

The BU part then: a) assesses the presence of CVE vulnerabilities in the same systems through

vulnerability scans, b) identifies actual CIA impact values from identified CVE vulnerabilities and

c) propagates these impact values to computing services and operational functions. Gap analyses compute

the differences between required and actual impact values at the operational function level.

Results from gap analyses can be used to prioritize corrective actions on operating systems and software applications. They help optimize resource utilization and cyber security improvements. Using

VI-AM to guide the prioritization of system fixes and improvements requires the consideration of several

factors which are beyond the scope of this report. In particular, the initial assessment of the cyber security

requirements of any system is entirely dependent on the mission context; commanders will therefore have

to weigh VI-AM’s results over a spectrum of potential missions. For instance, it could be that a particular

system’s cyber security needs to be improved only for a narrow set of missions, while it is sufficient for

the rest of the mission portfolio. Ranking or weighing missions against each other is a complex matter

that reaches all the way up to the strategic planning level.

The following recommendations are made for the future utilization of VI-AM:

1. Automate VI-AM to make it as quick and practical to use as possible.

2. Apply the VI-AM methodology to all systems as part of a complete risk assessment.


References

[1] The National Institute of Standards and Technology (NIST) (online), https://www.nist.gov/ (Access

date: August 2017).

[2] The MITRE Corporation, Common Vulnerabilities and Exposures (CVE) (online),

https://cve.mitre.org/ (Access date: August 2017).

[3] The CVSS Special Interest Group (SIG), Common Vulnerability Scoring System (CVSS) (online),

https://www.first.org/cvss (Access date: August 2017).

[4] Özkan, S., CVE Details (online), http://www.cvedetails.com/ (Access date: August 2017).

[5] The MITRE Corporation, CVE-Compatible Products and Services (Archived) (online),

https://cve.mitre.org/compatible/compatible.html (Access date: August 2017).

[6] Offensive Security, Exploit Database (online), https://www.exploit-db.com/ (Access date:

August 2017).

[7] The National Institute of Standards and Technology (2002), Risk Management Guide for Information Technology Systems, (Standard SP 800-30) NIST. https://csrc.nist.gov/publications/detail/sp/800-30/archive/2002-07-01.

[8] The National Institute of Standards and Technology (2012), Guide for Conducting Risk Assessments, (Standard SP 800-30 Rev. 1) NIST. https://csrc.nist.gov/publications/detail/sp/800-30/rev-1/final.

[9] Boivin, E. (2015), Foundations of the Platform Risk Analysis Process (PRAP), (DRDC-RDDC-2015-R016) Defence Research and Development Canada. http://cradpdf.drdc-rddc.gc.ca/PDFS/unc203/p801260_A1b.pdf.

[10] The National Institute of Standards and Technology (2010), Guide for Applying the Risk

Management Framework to Federal Information Systems: a Security Life Cycle Approach,

(Standard SP 800-37 Rev. 1) NIST. https://csrc.nist.gov/publications/detail/sp/800-37/rev-1/final.

[11] Communications Security Establishment (2012), IT Security Risk Management: A Lifecycle

Approach, (Guidance ITSG-33) CSE, Canada. https://www.cse-cst.gc.ca/en/publication/itsg-33.

[12] The National Institute of Standards and Technology (2011), Managing Information Security Risk:

Organization, Mission, and Information System View, (Standard SP 800-39) NIST.

https://csrc.nist.gov/publications/detail/sp/800-39/final.

[13] International Organization for Standardization (2011), Information Technology – Security

Techniques – Information Security Risk Management, (Standard IEC 27005:2011) ISO.

https://www.iso.org/standard/56742.html.

[14] Communications Security Establishment (2007), Harmonized TRA Methodology, (Methodology

TRA-1) CSE, Canada. https://www.cse-cst.gc.ca/en/publication/tra-1.


[15] Couture, M. (2017), Cyber-security of military systems, (DRDC-RDDC-2017-R030)

Defence Research and Development Canada.


List of symbols/abbreviations/acronyms/initialisms

BU Bottom-Up

CIA Confidentiality, Integrity and Availability

CVE Common Vulnerabilities and Exposures

CVSS Common Vulnerability Scoring System

DGGPEM Directeur général, Gestion du programme d’équipement maritime

DGMEPM Director General Maritime Equipment Program Management

DNCS Directorate of Naval Combat Systems

DRDC Defence Research and Development Canada

DSCN Directeur, Systèmes de combat navals

IT Information Technology

ITSG Information Technology Security Guidance

NIST National Institute of Standards and Technology

PASS Platform-to-Assembly Secure Systems

TD Top-Down

VI-AM Vulnerability/Impact Assessment Methodology


DOCUMENT CONTROL DATA

1. ORIGINATOR: DRDC – Valcartier Research Centre, Defence Research and Development Canada, 2459 route de la Bravoure, Quebec (Quebec) G3J 1X5, Canada

2a. SECURITY MARKING: CAN UNCLASSIFIED

2b. CONTROLLED GOODS: NON-CONTROLLED GOODS, DMC A

3. TITLE: Cyber security for Commercial Off-The-Shelf (COTS) and open source software systems: A new system-to-function Vulnerability/Impact Assessment Methodology (VI-AM) based on the Common Vulnerabilities Exposures / Common Vulnerability Scoring System (CVE/CVSS) databases (Version 1.0)

4. AUTHORS: Couture, M.

5. DATE OF PUBLICATION: November 2017

6a. NO. OF PAGES: 26

6b. NO. OF REFS: 15

7. DESCRIPTIVE NOTES: Scientific Report

8. SPONSORING ACTIVITY: DRDC – Valcartier Research Centre, Defence Research and Development Canada, 2459 route de la Bravoure, Quebec (Quebec) G3J 1X5, Canada

10a. ORIGINATOR'S DOCUMENT NUMBER: DRDC-RDDC-2017-R148

11a. FUTURE DISTRIBUTION: Public release

11b. FUTURE DISTRIBUTION OUTSIDE CANADA: CAN UNCLASSIFIED

12. ABSTRACT:

Operating systems and software applications will likely always contain software flaws that may

be exploited by hackers to attack Governmental computing infrastructure. Assessing the

presence of vulnerabilities in these systems and their potential impacts on military operations

are important tasks that must be performed to identify and apply the best corrective actions that

will make these systems and the services they deliver more “cyber secure.”

This report presents a new methodology, the Vulnerability Impact Assessment Methodology;

VI-AM. VI-AM can be used to assess vulnerabilities in software systems and the effects they

may produce at the system and operational function levels. This methodology uses publicly

available standardized CVE/CVSS datasets and metrics to express: a) required maximum

impacts that can be tolerated at the operational functions level, and b) actual computed impacts

for the same functions (based on vulnerability scans). Gap analyses then generate information to

help prioritize corrective actions on the systems to make operational functions more cyber

secure.


13. KEYWORDS: Cyber-threat; online cyber-surveillance; vulnerability assessment process