
  • Defense Acquisition Research Journal A Publication of the Defense Acquisition University

    Harnessing Innovative Procedures Under

    an Administration IN TRANSITION

    2017 Edward Hirsch Acquisition and Writing Award

    Presented on behalf of DAU by:

    DAU

    April 2017 Vol. 24 No. 2 | ISSUE 81

  • Using Analytical Hierarchy and Analytical Network Processes to Create Cyber Security Metrics George C. Wilamowski, Jason R. Dever, and Steven M. F. Stuban

    The Threat Detection System That Cried Wolf: Reconciling Developers with Operators Shelley M. Cazares

    Army Aviation: Quantifying the Peacetime and Wartime Maintenance Man-Hour Gaps LTC William Bland, USA (Ret.),

    CW5 Donald L. Washabaugh Jr., USA (Ret.),

    and Mel Adams

    Complex Acquisition Requirements Analysis Using a Systems Engineering Approach Col Richard M. Stuckey, USAF (Ret.), Shahram Sarkani, and Thomas A. Mazzuchi

    An Investigation of Nonparametric Data Mining Techniques for Acquisition Cost Estimating Capt Gregory E. Brown, USAF, and Edward D. White

    Online-only Article Critical Success Factors for Crowdsourcing with Virtual Environments to Unlock Innovation Glenn E. Romanczuk, Christopher Willy, and John E. Bischoff

    The Defense Acquisition Professional Reading List Getting Defense Acquisition Right Written and Introduced by the Honorable Frank Kendall

    Article List

    ARJ Extra

  • Research Advisory Board

    Dr. Mary C. Redshaw

    Dwight D. Eisenhower School for National Security and Resource Strategy

    Editorial Board

    Dr. Larrie D. Ferreiro

    Chairman and Executive Editor

    Mr. Richard Altieri Dwight D. Eisenhower School for National Security and Resource Strategy

    Dr. Michelle Bailey Defense Acquisition University

    Dr. Don Birchler Center for Naval Analyses Corporation

    Mr. Kevin Buck The MITRE Corporation

    Mr. John Cannaday Defense Acquisition University

    Dr. John M. Colombi Air Force Institute of Technology

    Dr. Richard Donnelly The George Washington University

    Dr. William T. Eliason Dwight D. Eisenhower School for National Security and Resource Strategy

    Dr. J. Ronald Fox Harvard Business School

    Mr. David Gallop Defense Acquisition University

    Dr. Jacques Gansler University of Maryland

    RADM James Greene, USN (Ret.) Naval Postgraduate School

    Dr. Mike Kotzian Defense Acquisition University

    Dr. Craig Lush Defense Acquisition University

    Dr. Troy J. Mueller The MITRE Corporation

    Dr. Andre Murphy Defense Acquisition University

    Dr. Christopher G. Pernin RAND Corporation

    Dr. Richard Shipe Dwight D. Eisenhower School for National Security and Resource Strategy

    Dr. Keith Snider Naval Postgraduate School

    Dr. John Snoderly Defense Acquisition University

    Ms. Dana Stewart Defense Acquisition University

    Dr. David M. Tate Institute for Defense Analyses

    Dr. Trevor Taylor Cranfield University (UK)

    Mr. Jerry Vandewiele Defense Acquisition University

    Mr. James A. MacStravic Performing the duties of Under Secretary of Defense for Acquisition, Technology, and Logistics

    Mr. James P. Woolsey President, Defense Acquisition University

    ISSN 2156-8391 (print) ISSN 2156-8405 (online) DOI: https://doi.org/10.22594/dau.042017-81.24.02

    The Defense Acquisition Research Journal, formerly the Defense Acquisition Review Journal, is published quarterly by the Defense Acquisition University (DAU) Press and is an official publication of the Department of Defense. Postage is paid at the U.S. Postal facility, Fort Belvoir, VA, and at additional U.S. Postal facilities. Postmaster, send address changes to: Editor, Defense Acquisition Research Journal, DAU Press, 9820 Belvoir Road, Suite 3, Fort Belvoir, VA 22060-5565. The journal-level DOI is: https://doi.org/10.22594/dauARJ.issn.2156-8391. Some photos appearing in this publication may be digitally enhanced.

    Articles represent the views of the authors and do not necessarily reflect the opinion of DAU or the Department of Defense.

  • Director, Visual Arts & Press Randy Weekes

    Managing Editor
    Deputy Director, Visual Arts & Press
    Norene L. Taylor

    Assistant Editor Emily Beliles

    Production Manager, Visual Arts & Press Frances Battle

    Lead Graphic Designer Diane Fleischer
    Graphic Designers Tia Gray, Michael Krukowski

    Graphic Designer, Digital Publications Nina Austin

    Technical Editor Collie J. Johnson

    Associate Editor Michael Shoemaker

    Copy Editor/Circulation Manager Debbie Gonzalez

    Multimedia Assistant Noelia Gamboa

    Editing, Design, and Layout The C3 Group & Schatz Publishing Group

  • CONTENTS | Featured Research. A Publication of the Defense Acquisition University, April 2017, Vol. 24 No. 2, ISSUE 81

    [Badge: Research Paper Competition 2016 1st Place, Defense Acquisition University Alumni Association]

    p. 186 Using Analytical Hierarchy and Analytical Network Processes to Create Cyber Security Metrics George C. Wilamowski, Jason R. Dever, and Steven M. F. Stuban

    This article discusses cyber security controls and a use case that involves decision theory methods to produce a model and independent first-order results using a form-fit-function approach as a generalized application benchmarking framework. The framework combines subjective judgments that are based on a survey of 502 cyber security respondents with quantitative data and identifies key performance drivers in the selection of specific criteria for three communities of interest: local area network, wide area network, and remote users.

    p. 222 The Threat Detection System That Cried Wolf: Reconciling Developers with Operators Shelley M. Cazares

    Threat detection systems that perform well in testing can “cry wolf” during operation, generating many false alarms. The author posits that program managers can still use these systems as part of a tiered system that, overall, exhibits better performance than each individual system alone.

  • Featured Research

    p. 246 Army Aviation: Quantifying the Peacetime and Wartime Maintenance Man-Hour Gaps LTC William Bland, USA (Ret.), CW5 Donald L. Washabaugh Jr., USA (Ret.),

    and Mel Adams

    The Maintenance Man-Hour (MMH) Gap Calculator confirms and quantifies a large, persistent gap in Army aviation maintenance required to support each Combat Aviation Brigade.

    p. 266 Complex Acquisition Requirements Analysis Using a Systems Engineering Approach Col Richard M. Stuckey, USAF (Ret.), Shahram Sarkani, and Thomas A. Mazzuchi

    Programs lack an optimized solution set of requirements attributes. This research provides a set of validated requirements attributes for ultimate program execution success.

  • CONTENTS | Featured ResearchA Publication of the Defense Acquisition University April 2017 Vol. 24 No. 2 ISSUE 81

    p. 302 An Investigation of Nonparametric Data Mining Techniques for Acquisition Cost Estimating Capt Gregory E. Brown, USAF, and Edward D. White

    Given the recent enhancements in acquisition data collection, a meta-analysis reveals that nonparametric data mining techniques may improve the accuracy of future DoD cost estimates.

    Critical Success Factors for Crowdsourcing with Virtual Environments to Unlock Innovation Glenn E. Romanczuk, Christopher Willy, and John E. Bischoff

    Delphi methods were used to discover critical success factors in five areas: virtual environments, Model-Based Systems Engineering (MBSE), crowdsourcing, human systems integration, and the overall process. Results derived from this study present a framework for using virtual environments to crowdsource systems design using warfighters and the greater engineering staff.

    http://www.dau.mil/library/arj

  • CONTENTS | Featured Research

    p. viii From the Chairman and Executive Editor

    p. xii Research Agenda 2017–2018

    p. xvii DAU Alumni Association

    p. 368 Professional Reading List

    Getting Defense Acquisition Right Written and Introduced by the Honorable Frank Kendall

    p. 370 New Research in Defense Acquisition

    A selection of new research curated by the DAU Research Center and the Knowledge Repository.

    p. 376 Defense ARJ Guidelines for Contributors

    The Defense Acquisition Research Journal (ARJ) is a scholarly peer-reviewed journal published by the Defense Acquisition University. All submissions receive a blind review to ensure impartial evaluation.

    p. 381 Call for Authors

    We are currently soliciting articles and subject matter experts for the 2017–2018 Defense ARJ print years.

    p. 384 Defense Acquisition University Website

    Your online access to acquisition research, consulting, information, and course offerings.

  • FROM THE CHAIRMAN AND EXECUTIVE EDITOR

    Dr. Larrie D. Ferreiro

    The theme for this edition of Defense Acquisition Research Journal is “Harnessing Innovative Procedures under an Administration in Transition.” Fiscal Year 2017 will see many changes, not only in a new administration, but also under the National Defense Authorization Act (NDAA). Under this NDAA, by February 2018 the Under Secretary of Defense for Acquisition, Technology, and Logistics (USD[AT&L]) office will be disestablished, and its duties divided between two separate offices. The first office, the Under Secretary of Defense for Research and Engineering (USD[R&E]), will carry out the mission of defense technological innovation. The second office, the Under Secretary of Defense for Acquisition and Sustainment (USD[A&S]), will ensure that sustainment issues are integrated during the acquisition process. The articles in this issue show some of the innovative ways that acquisition can be tailored to these new paradigms.

    The first article is “Using Analytical Hierarchy and Analytical Network Processes to Create Cyber Security Metrics” by George C. Wilamowski, Jason R. Dever, and Steven M. F. Stuban. It was the recipient (from among strong competition) of the DAU Alumni Association (DAUAA) 2017 Edward Hirsch Acquisition and Writing Award, given annually for research papers that best meet the criteria of significance, impact, and readability. The authors discuss cyber security controls and a use case involving decision theory to develop a benchmarking framework that identifies key performance drivers in local area network, wide area network, and remote user communities. Next, the updated and corrected article by Shelley M. Cazares, “The Threat Detection System That Cried Wolf: Reconciling Developers with Operators,” points out that some threat detection systems that perform well in testing can generate many false alarms (“cry wolf”) in operation. One way to mitigate this problem may be to use these systems as part of a tiered system that, overall, exhibits better performance than each individual system alone. The next article, “Army Aviation: Quantifying the Peacetime and Wartime Maintenance Man-Hour Gaps” by William Bland, Donald L. Washabaugh Jr., and Mel Adams, describes the development of a Maintenance Man-Hour Gap Calculator tool that confirmed and quantified a large, persistent gap in Army aviation maintenance. Following this is “Complex Acquisition Requirements Analysis Using a Systems Engineering Approach” by Richard M. Stuckey, Shahram Sarkani, and Thomas A. Mazzuchi. The authors examine prioritized requirement attributes to account for program complexities, and provide a guide to establishing effective requirements needed for informed trade-off decisions. The results indicate that the key attribute for unconstrained systems is "achievable." Then, Gregory E. Brown and Edward D. White, in their article “An Investigation of Nonparametric Data Mining Techniques for Acquisition Cost Estimating,” use a meta-analysis to argue that nonparametric data mining techniques may improve the accuracy of future DoD cost estimates.

    The online-only article, “Critical Success Factors for Crowdsourcing with Virtual Environments to Unlock Innovation” by Glenn E. Romanczuk, Christopher Willy, and John E. Bischoff, explains how to use virtual environments to crowdsource systems design using warfighters and the engineering staff to decrease the cycle time required to produce advanced innovative systems tailored to meet warfighter needs.

    This issue inaugurates a new addition to the Defense Acquisition Research Journal, “New Research in Defense Acquisition.” Here, we bring to the attention of the defense acquisition community a selection of current research that may prove of further interest. These selections are curated by the DAU Research Center and the Knowledge Repository, and in these pages we provide the summaries and links that will allow interested readers to access the full works.


    The featured book in this issue’s Defense Acquisition Professional Reading List is Getting Defense Acquisition Right by former Under Secretary of Defense for Acquisition, Technology, and Logistics Frank Kendall.

    Finally, the entire production and publishing staff of the Defense ARJ now bids a fond farewell to Diane Fleischer, who has been our Graphic Specialist/Lead Designer for this journal since our January 2012 Issue 61, Vol. 19 No. 1. She has also been with the DAU Press for more than 5 years, and has been instrumental in the Defense ARJ team winning two APEX awards for One-of-a-Kind Publications—Government in both 2015 and 2016. Diane is retiring, and she and her family are relocating to Greenville, South Carolina. Diane, we all wish you “fair winds and following seas.”

    Biography

    Ms. Diane Fleischer has been employed as a Visual Information Specialist in graphic design at the Defense Acquisition University (DAU) since November 2011. Prior to her arrival at DAU as a contractor with the Schatz Publishing Group, she worked in a wide variety of commercial graphic positions, both print and web-based. Diane’s graphic arts experience spans more than 38 years, and she holds a BA in Fine Arts from Asbury University in Wilmore, Kentucky.

  • This Research Agenda is intended to make researchers aware of the topics that are, or should be, of particular concern to the broader defense acquisition community within the federal government, academia, and defense industrial sectors. The center compiles the agenda annually, using inputs from subject matter experts across those sectors. Topics are periodically vetted and updated by the DAU Center’s Research Advisory Board to ensure they address current areas of strategic interest.

    The purpose of conducting research in these areas is to provide solid, empirically based findings to create a broad body of knowledge that can inform the development of policies, procedures, and processes in defense acquisition, and to help shape the thought leadership for the acquisition community. Most of these research topics were selected to support the DoD’s Better Buying Power Initiative (see http://bbp.dau.mil). Some questions may cross topics and thus appear in multiple research areas.

    Potential researchers are encouraged to contact the DAU Director of Research ([email protected]) to suggest additional research questions and topics. They are also encouraged to contact the listed Points of Contact (POC), who may be able to provide general guidance as to current areas of interest, potential sources of information, etc.

    DAU CENTER FOR DEFENSE ACQUISITION RESEARCH AGENDA 2017–2018

    Competition POCs
    • John Cannaday, DAU: [email protected]
    • Salvatore Cianci, DAU: [email protected]
    • Frank Kenlon (global market outreach), DAU: [email protected]

    Measuring the Effects of Competition
    • What means are there (or can be developed) to measure the effect on defense acquisition costs of maintaining the defense industrial base in various sectors?

    • What means are there (or can be developed) of measuring the effect of utilizing defense industrial infrastructure for commercial manufacture, and in particular, in growth industries? In other words, can we measure the effect of using defense manufacturing to expand the buyer base?

    • What means are there (or can be developed) to determine the degree of openness that exists in competitive awards?

    • What are the different effects of the two best value source selection processes (trade-off vs. lowest price technically acceptable) on program cost, schedule, and performance?

    Strategic Competition
    • Is there evidence that competition between system portfolios is an effective means of controlling price and costs?

    • Does lack of competition automatically mean higher prices? For example, is there evidence that sole source can result in lower overall administrative costs at both the government and industry levels, to the effect of lowering total costs?

    • What are the long-term historical trends for competition guidance and practice in defense acquisition policies and practices?

    April 2017

    xv

  • • To what extent are contracts being awarded non-competitively by congressional mandate for policy interest reasons? What is the effect on contract price and performance?

    • What means are there (or can be developed) to determine the degree to which competitive program costs are negatively affected by laws and regulations such as the Berry Amendment, Buy American Act, etc.?

    • The DoD should have enormous buying power and the ability to influence supplier prices. Is this the case? Examine the potential change in cost performance due to greater centralization of buying organizations or strategies.

    Effects of Industrial Base
    • What are the effects on program cost, schedule, and performance of having more or fewer competitors? What measures are there to determine these effects?

    • What means are there (or can be developed) to measure the breadth and depth of the industrial base in various sectors that go beyond simple head-count of providers?

    • Has change in the defense industrial base resulted in actual change in output? How is that measured?

    Competitive Contracting
    • Commercial industry often cultivates long-term, exclusive (noncompetitive) supply chain relationships. Does this model have any application to defense acquisition? Under what conditions/circumstances?

    • What is the effect on program cost, schedule, and performance of awards based on varying levels of competition: (a) “Effective” competition (two or more offers); (b) “Ineffective” competition (only one offer received in response to competitive solicitation); (c) split awards vs. winner take all; and (d) sole source.


  • Improve DoD Outreach for Technology and Products from Global Markets

    • How have militaries in the past benefited from global technology development?

    • How/why have militaries missed the largest technological advances?

    • What are the key areas that require the DoD’s focus and attention in the coming years to maintain or enhance the technological advantage of its weapon systems and equipment?

    • What types of efforts should the DoD consider pursuing to increase the breadth and depth of technology push efforts in DoD acquisition programs?

    • How effectively are the DoD’s global science and technology investments transitioned into DoD acquisition programs?

    • Are the DoD’s applied research and development (i.e., acquisition program) investments effectively pursuing and using sources of global technology to affordably meet current and future DoD acquisition program requirements? If not, what steps could the DoD take to improve its performance in these two areas?

    • What are the strengths and weaknesses of the DoD’s global defense technology investment approach as compared to the approaches used by other nations?

    • What are the strengths and weaknesses of the DoD’s global defense technology investment approach as compared to the approaches used by the private sector—both domestic and foreign entities (companies, universities, private-public partnerships, think tanks, etc.)?

    • How does the DoD currently assess the relative benefits and risks associated with global versus U.S. sourcing of key technologies used in DoD acquisition programs? How could the DoD improve its policies and procedures in this area to enhance the benefits of global technology sourcing while minimizing potential risks?


    • How could current DoD/U.S. Technology Security and Foreign Disclosure (TSFD) decision-making policies and processes be improved to help the DoD better balance the benefits and risks associated with potential global sourcing of key technologies used in current and future DoD acquisition programs?

    • How do DoD primes and key subcontractors currently assess the relative benefits and risks associated with global versus U.S. sourcing of key technologies used in DoD acquisition programs? How could they improve their contractor policies and procedures in this area to enhance the benefits of global technology sourcing while minimizing potential risks?

    • How could current U.S. Export Control System decision-making policies and processes be improved to help the DoD better balance the benefits and risks associated with potential global sourcing of key technologies used in current and future DoD acquisition programs?

    Comparative Studies
    • Compare the industrial policies of military acquisition in different nations and the policy impacts on acquisition outcomes.

    • Compare the cost and contract performance of highly regulated public utilities with nonregulated “natural monopolies,” e.g., military satellites, warship building, etc.

    • Compare contracting/competition practices between the DoD and complex, custom-built commercial products (e.g., offshore oil platforms).

    • Compare program cost performance in various market sectors: highly competitive (multiple offerors), limited (two or three offerors), monopoly?

    • Compare the cost and contract performance of military acquisition programs in nations having single “purple” acquisition organizations with those having Service-level acquisition agencies.


  • DAU ALUMNI ASSOCIATION

    Join the Success Network!

    The DAU Alumni Association opens the door to a worldwide network of Defense Acquisition University graduates, faculty, staff members, and defense industry representatives—all ready to share their expertise with you and benefit from yours. Be part of a two-way exchange of information with other acquisition professionals.

    • Stay connected to DAU and link to other professional organizations.
    • Keep up to date on evolving defense acquisition policies and developments through DAUAA newsletters and the DAUAA LinkedIn Group.
    • Attend the DAU Annual Acquisition Training Symposium and bi-monthly hot topic training forums, both supported by the DAUAA, and earn Continuous Learning Points toward DoD continuing education requirements.

    Membership is open to all DAU graduates, faculty, staff, and defense industry members. It’s easy to join right from the DAUAA Website at www.dauaa.org, or scan the following QR code:

    For more information call 703-960-6802 or 800-755-8805, or e-mail [email protected].


  • ISSUE 81 APRIL 2017 VOL. 24 NO. 2

  • We’re on the Web at: http://www.dau.mil/library/arj

  • Image designed by Diane Fleischer

    [Badge: Research Paper Competition 2016 1st Place, Defense Acquisition University Alumni Association]

    Using Analytical Hierarchy and Analytical Network Processes to Create CYBER SECURITY METRICS

    George C. Wilamowski, Jason R. Dever, and Steven M. F. Stuban

    Authentication, authorization, and accounting are key access control measures that decision makers should consider when crafting a defense against cyber attacks. Two decision theory methodologies were compared. Analytical hierarchy and analytical network processes were applied to cyber security-related decisions to derive a measure of effectiveness for risk evaluation. A network/access mobile security use case was employed to develop a generalized application benchmarking framework. Three communities of interest, which include local area network, wide area network, and remote users, were referenced while demonstrating how to prioritize alternatives within weighted rankings. Subjective judgments carry tremendous weight in the minds of cyber security decision makers. An approach that combines these judgments with quantitative data is the key to creating effective defensive strategies.

    DOI: https://doi.org/10.22594/dau.16-760.24.02
    Keywords: Analytical Hierarchy Process (AHP), Analytical Network Process (ANP), Measure of Effectiveness (MOE), Benchmarking, Multi-Criteria Decision Making (MCDM)



    Authentication, authorization, and accounting (AAA) are the last lines of defense among access controls in a defense strategy for safeguarding the privacy of information via security controls and risk exposure (EY, 2014). These controls contribute to the effectiveness of a data network’s system security. The risk exposure is predicated by the number of preventative measures the Trusted Information Provider, or “TIP”—an agnostic term for the organization that is responsible for an organization’s privacy and security—is willing to apply against cyber attacks (National Institute of Standards and Technology [NIST], 2014). Recently, persistent cyber attacks against the data of a given organization have caused multiple data breaches within commercial industries and the U.S. Government. Multiple commercial data networks were breached or compromised in 2014. For example, 76 million households and 7 million small businesses and other commercial businesses had their data compromised at JPMorgan Chase & Co.; Home Depot had 56 million customer accounts compromised; TJ Maxx had 45.6 million customer accounts compromised; and Target had 40 million customer accounts compromised (Weise, 2014). A recent example of a commercial cyber attack was the attack against Anthem, Inc., from January to February 2015, when a sophisticated external attack compromised the data of approximately 80 million customers and employees (McGuire, 2015).

    Consequently, various efforts have been made to combat these increasingly common attacks. For example, on February 13, 2015, at a Summit on Cybersecurity and Consumer Protection at Stanford University in Palo Alto, California, the President of the United States signed an executive order that would enable private firms to share information and access classified information on cyber attacks (Obama, 2015; Superville & Mendoza, 2015). The increasing number of cyber attacks that is currently experienced by many private firms is exacerbated by poorly implemented AAA security controls and risk exposure minimization. These firms do not have a method for measuring the effectiveness of their AAA policies and protocols (EY, 2014). Thus, a systematic process for measuring the effectiveness of defensive strategies in critical cyber systems is urgently needed.

    Literature Review

    A literature review has revealed a wide range of Multi-Criteria Decision Making (MCDM) models for evaluating a set of alternatives against a set of criteria using mathematical methods. These mathematical methods include linear programming, integer programming, design of experiments, influence diagrams, and Bayesian networks, which are used in formulating the MCDM decision tools (Kossiakoff, Sweet, Seymour, & Biemer, 2011). The decision tools include Multi-Attribute Utility Theory (MAUT) (Bedford & Cooke, 1999; Keeney, 1976, 1982); criteria for deriving scores for alternatives; decision trees (Bahnsen, Aouada, & Ottersten, 2015; Kurematsu & Fujita, 2013; Pachghare & Kulkarni, 2011); decisions based on graphical networks and Cost-Benefit Analysis (CBA) (Maisey, 2014; Wei, Frinke, Carter, & Ritter, 2001); simulations for calculating a system’s alternatives per unit cost; and the House of Quality: Quality Function Deployment (QFD) (Chan & Wu, 2002; Zheng & Pulli, 2005), which is a planning matrix that relates what a customer wants to how a firm (that produces the products) is going to satisfy those needs (Kossiakoff et al., 2011).

    The discussion on the usability of decision theory against cyber threats is limited, which indicates the existence of a gap. This study will employ analytical hierarchies and analytical network processes to create AAA cyber security metrics within these well-known MCDM models (Rabbani & Rabbani, 1996; Saaty, 1977, 2001, 2006, 2009, 2010, 2012; Saaty & Alexander, 1989; Saaty & Forman, 1992; Saaty, Kearns, & Vargas, 1991; Saaty & Peniwati, 2012) for cyber security decision-making. Table 1 represents a network/access mobile security use case that employs mathematically based techniques of criteria and alternative pairwise comparisons.


    TABLE 1. CYBER SECURITY DECISION-MAKING USE CASE

    Primary Actor: Cyber Security Manager
    Scope: Maximize; Network Access/Mobility’s Measure of Effectiveness
    Level: Cyber Security Control Decisions
    Stakeholders and Interests: Security Respondents—Organization’s Security Decision Influencers; C-suite—Resource Allocation by Senior Executives
    Precondition: Existing Authentication, Authorization, and Accounting (AAA); Limited to Security Controls Being Evaluated
    Main Success Scenario:
      1. AAA Goal Setting
      2. Decision Theory Model
      3. AAA Security Interfaces/Relationships Design
      4. A/B Survey Questionnaire with 9-Point Likert Scale
      5. Survey Analysis
      6. Survey’s A/B Judgment Dominance
      7. Scorecard Pairwise Data Input into Decision Theory Software
      8. Decision—Priorities and Weighted Rankings
    Extensions:
      1a. Goals into Clusters: Criteria, Subcriteria, and Alternatives
      3a. Selection of AAA Attribute Interfaces
      3b. Definition of Attribute Interfaces
      4a. 9-Point Likert Scale; Equal Importance (1) to Extreme Importance (9)
      5a. Survey’s Margin of Error
      5b. Empirical Analysis
      5c. Normality Testing
      5d. General Linear Model (GLM) Testing
      5e. Anderson-Darling Testing
      5f. Cronbach Alpha Survey Testing for Internal Consistency
      6a. Dominant Geometric Mean Selection
      6b. Dominant Geometric Mean Used for Scorecard Build-Out
      7a. Data Inconsistency Check between 0.10 and 0.20
      7b. Cluster Priority Ranking

    Note. Adapted from Writing Effective Use Cases, by Alistair Cockburn. Copyright 2001 by Addison-Wesley.


    Research

    The objective of this research was to demonstrate a method for assessing measures of effectiveness by means of two decision theory methodologies; the selected MCDM methods were an Analytical Hierarchy Process (AHP) and an Analytical Network Process (ANP). Both models employ numerical scales within a prioritization method that is based on eigenvectors. These methods were applied to cyber security-related decisions to derive a measure of effectiveness for risk evaluation. A network/access mobile security use case, as shown in Table 1, was employed to develop a generalized application benchmarking framework to evaluate cyber security control decisions. The security controls are based on the criteria of AAA (NIST, 2014).

    The Defense Acquisition System initiates a Capabilities Based Assessment (CBA), upon which an Initial Capabilities Document (ICD) is built (AcqNotes, 2016a). Part of creating an ICD is to define a functional area’s (or areas’) Measure of Effectiveness (MOE) (Department of Defense [DoD], 2004, p. 30). MOEs are a direct output from a Functional Area Assessment (AcqNotes, 2016a). The MOE for Cyber Security Controls would be an area that needs to be assessed for acquisition. The term MOE was initially used by Morse and Kimball (1946) in their studies for the U.S. Navy on the effectiveness of weapons systems (Operations Evaluation Group [OEG] Report 58). There has been a plethora of attempts to define MOE, as shown in Table 2. In this study, we adhere to the following definition of MOEs:

    MOEs are measures of mission success stated under specific environmental and operating conditions, from the users’ viewpoint. They relate to the overall operational success criteria (e.g., mission performance, safety, availability, and security)… (MITRE, 2014; Saaty, Kearns, & Vargas, 1991, pp. 14–21)

    [by] a qualitative or quantitative metric of a system’s overall performance that indicates the degree to which it achieves its objectives under specified conditions. (Kossiakoff et al., 2011, p. 157)


    TABLE 2. PANORAMA OF MOE DEFINITIONS

    Definition: The “operational” measures of success that are closely related to the achievement of the mission or operational objective being evaluated, in the intended operational environment under a specified set of conditions; i.e., how well the solution achieves the intended purpose. Adapted from DoDI 5000.02, Defense Acquisition University, and International Council on Systems Engineering.
    Source: (Roedler & Jones, 2005)

    Definition: “… standards against which the capability of a solution to meet the needs of a problem may be judged. The standards are specific properties that any potential solution must exhibit to some extent. MOEs are independent of any solution and do not specify performance or criteria.”
    Source: (Sproles, 2001, p. 254)

    Definition: “A measure of effectiveness is any mutually agreeable parameter of the problem which induces a rank ordering on the perceived set of goals.”
    Source: (Dockery, 1986, p. 174)

    Definition: “A measure of the ability of a system to meet its specified needs (or requirements) from a particular viewpoint(s). This measure may be quantitative or qualitative and it allows comparable systems to be ranked. These effectiveness measures are defined in the problem-space. Implicit in the meeting of problem requirements is that threshold values must be exceeded.”
    Source: (Smith & Clark, 2004, p. 3)

    Definition: “… how effective a task was in doing the right thing.”
    Source: (Masterson, 2004)

    Definition: “A criterion used to assess changes in system behavior, capability, or operational environment that is tied to measuring the attainment of an end state, achievement of an objective, or creation of an effect.”
    Source: (Joint Chiefs of Staff, 2011, p. xxv)

    Definition: “… an MOE may be based on quantitative measures to reflect a trend and show progress toward a measurable threshold.”
    Source: (National Research Council, 2013, p. 166)

    Definition: “… are measures designed to correspond to accomplishment of mission objectives and achievement of desired results. They quantify the results to be obtained by a system and may be expressed as probabilities that the system will perform as required.”
    Source: (AcqNotes, 2016b)


    TABLE 2. PANORAMA OF MOE DEFINITIONS, CONTINUED

    Definition: “The data used to measure the military effect (mission accomplishment) that comes from using the system in its expected environment. That environment includes the system under test and all interrelated systems, that is, the planned or expected environment in terms of weapons, sensors, command and control, and platforms, as appropriate, needed to accomplish an end-to-end mission in combat.”
    Source: (Measures of Effectiveness, 2015)

    Definition: “A quantitative measure that represents the outcome and level of performance to be achieved by a system, product, or service and its level of attainment following a mission.”
    Source: (Wasson, 2015, p. 101)

    The goal of the benchmarking framework that is proposed in this study is to provide a systematic process for evaluating the effectiveness of an organization’s security posture. The proposed framework process and procedures are categorized into the following four functional areas: (a) hierarchical structure; (b) judgment dominance and alternatives; (c) measures; and (d) analysis (Chelst & Canbolat, 2011; Saaty & Alexander, 1989), as shown in Figure 1. We develop a scorecard system that is based on a ubiquitous survey of 502 cyber security Subject Matter Experts (SMEs). The form, fit, and function of the two MCDM models were compared during the development of the scorecard system for each model using the process and procedures shown in Figure 1.

    FIGURE 1. APPLICATION BENCHMARKING FRAMEWORK

    [Figure: Four framework functions grouped under Form, Fit for Purpose, and Function: Function #1, Hierarchical Structure; Function #2, Judgment Dominance and Alternatives; Function #3, Measures; Function #4, Analysis.]


    Form Methodology

    The benchmarking framework shown in Figure 1 is accomplished by considering multiple facets of a problem; the problem is divided into smaller components that can yield qualitative and quantitative priorities from cyber security SME judgments. Each level within the framework affects the levels above and below it. The AHP and ANP facilitate SME knowledge using heuristic judgments throughout the framework (Saaty, 1991). The first action (Function #1) requires mapping out a consistent goal, criteria parameters, and alternatives for each of the models shown in Figures 2 and 3.

    FIGURE 2. AAA IN AHP FORM

    [Figure: A four-level hierarchy.
    Goal: Maximize; Network(s) Access/Mobility Measure of Effectiveness for Trusted Information Providers: AAA.
    Criteria: Authentication (A1), Authorization (A2), Accounting (A3).
    Subcriteria: RADIUS and Diameter (under Authentication); Activity Q&A and User Name/Password (Aging) (under Authorization); Human Log Enforcement and Automated Log Enforcement (under Accounting).
    Alternatives: LAN, WAN, Remote User.]

    Note. AAA = Authentication, Authorization, and Accounting; AHP = Analytical Hierarchy Process; LAN = Local Area Network; Q&A = Question and Answer; WAN = Wide Area Network.


    FIGURE 3. AAA IN ANP FORM

    [Figure: An Analytical Network Process diagram with outer dependencies among four clusters.
    Goal: Maximize; Network(s) Access Controls Measure of Effectiveness for Trusted Information Providers: AAA.
    Identify (1): Authentication; RADIUS; Diameter.
    Access (2): Authorization; Activity Q&A; User Name & Password Aging.
    Activity (3): Accounting; Human Log Enforcement; Automated Log Mgt.
    Alternatives (4): LAN; WAN; Remote User.]

    Note. AAA = Authentication, Authorization, and Accounting; ANP = Analytical Network Process; LAN = Local Area Network; Mgt = Management; Q&A = Question and Answer; WAN = Wide Area Network.

    In this study, the AHP and ANP models were designed with the goal of maximizing the network access and mobility MOEs for the TIP’s AAA. The second action of Function #1 is to divide the goal objectives into clustered groups: criteria, subcriteria, and alternatives. The subcriteria are formed from the criteria cluster (Saaty, 2012), which enables further decomposition of the AAA grouping within each of the models. The third action of Function #1 is the decomposition of the criteria groups, which enables a decision maker to add, change, or modify the depth and breadth of the specificity when making a decision that is based on comparisons within each grouping. The final cluster contains the alternatives, which provide the final weights from the hierarchical components. These weights generate a total ranking priority that constitutes the MOE baseline for the AAA based on the attributes of the criteria.
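To make the weight composition concrete, the sketch below shows how global alternative priorities fall out of the clusters: the global priority of each alternative is the sum, over all branches of the hierarchy, of the products of the local weights along that branch. All names, numeric weights, and the helper function are illustrative assumptions for exposition, not values or code from the study.

```python
# Minimal sketch of hierarchical weight composition (AHP).
# Every numeric weight here is an illustrative placeholder,
# not a value taken from the study's survey data.

criteria = {"Authentication": 0.5, "Authorization": 0.3, "Accounting": 0.2}

# Local subcriteria weights within each criterion (each group sums to 1).
subcriteria = {
    "Authentication": {"RADIUS": 0.4, "Diameter": 0.6},
    "Authorization":  {"Activity Q&A": 0.5, "User Name/Password Aging": 0.5},
    "Accounting":     {"Human Log Enforcement": 0.3,
                       "Automated Log Enforcement": 0.7},
}

# Local alternative weights under each subcriterion; a full model would
# have one entry per (criterion, subcriterion) pair.
alternatives = {
    ("Authentication", "RADIUS"):   {"LAN": 0.5, "WAN": 0.3, "Remote User": 0.2},
    ("Authentication", "Diameter"): {"LAN": 0.2, "WAN": 0.4, "Remote User": 0.4},
}

def global_priorities(criteria, subcriteria, alternatives):
    """Sum the products of local weights along every branch of the tree."""
    totals = {}
    for crit, c_w in criteria.items():
        for sub, s_w in subcriteria[crit].items():
            for alt, a_w in alternatives.get((crit, sub), {}).items():
                totals[alt] = totals.get(alt, 0.0) + c_w * s_w * a_w
    return totals

print(global_priorities(criteria, subcriteria, alternatives))
```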


    The criteria of AAA implement an infrastructure of access control systems (Hu, Ferraiolo, & Kuhn, 2006) in which a server verifies the authentication and authorization of entities that request network access and manages their billing accounts. Each of the criteria has defined structures for application-specific information. Table 3 defines the attributes of the AHP and ANP model criteria, subcriteria, and alternatives; it does not include all of the subcriteria for AAA.

    TABLE 3. AHP/ANP MODEL ATTRIBUTES

    Attribute: Accounting
    Description: “Track of a user’s activity while accessing a network’s resources, including the amount of time spent in the network, the services accessed while there, and the amount of data transferred during the session. Accounting data are used for trend analysis, capacity planning, billing, auditing, and cost allocation.”
    Source: (Accounting, n.d.)

    Attribute: Activity Q&A
    Description: Questions that are used when resetting your password or logging in from a computer that you have not previously authorized.
    Source: (Scarfone & Souppaya, 2009)

    Attribute: Authentication
    Description: “The act of verifying a claimed identity, in the form of a preexisting label from a mutually known name space, as the originator of a message (message authentication) or as the end-point of a channel (entity authentication).”
    Source: (Aboba & Wood, 2003, p. 2)

    Attribute: Authorization
    Description: “The act of determining if a particular right, such as access to some resource, can be granted to the presenter of a particular credential.”
    Source: (Aboba & Wood, 2003, p. 2)

    Attribute: Automatic Log Management
    Description: Automated logs provide firsthand information regarding your network activities. Automated log management ensures that network activity data hidden in the logs are converted to meaningful, actionable security information.
    Source: (Kent & Souppaya, 2006)


    TABLE 3. AHP/ANP MODEL ATTRIBUTES, CONTINUED

    Attribute: Diameter
    Description: Diameter is a newer AAA protocol for applications such as network access and IP mobility. It is the replacement for the RADIUS protocol. It is intended to work in both local and roaming AAA situations.
    Source: (Fajardo, Arkko, Loughney, & Zorn, 2012)

    Attribute: Human Accounting Enforcement
    Description: Human responsibilities for log management for personnel throughout the organization, including establishing log management duties at both the individual system level and the log management infrastructure level.
    Source: (Kent & Souppaya, 2006)

    Attribute: LAN—Local Area Network
    Description: “A short distance data communications network (typically within a building or campus) used to link computers and peripheral devices (such as printers, CD-ROMs, modems) under some form of standard control.”
    Source: (LAN—Local Area Network, 2008, p. 559)

    Attribute: RADIUS
    Description: RADIUS is an older protocol for carrying information related to authentication, authorization, and configuration between a Network Access Server that authenticates its links to a shared Authentication Server.
    Source: (Rigney, Willens, Rubens, & Simpson, 2000)

    Attribute: Remote User
    Description: “In computer networking, remote access technology allows logging into a system as an authorized user without being physically present at its keyboard. Remote access is commonly used on corporate computer networks, but can also be utilized on home networks.”
    Source: (Mitchell, 2016)

    Attribute: User Name & Password Aging
    Description: Users must change their passwords according to a schedule.
    Source: (Scarfone & Souppaya, 2009)

    Attribute: WAN—Wide Area Network
    Description: A public voice or data network that extends beyond the metropolitan area.
    Source: (WAN—Wide Area Network, 2008)


    The relationship between authentication and its two subcriteria—RADIUS (Rigney, Willens, Rubens, & Simpson, 2000) and Diameter (Fajardo, Arkko, Loughney, & Zorn, 2012)—enables the management of network access (Figures 2 and 3). Authorization enables access using Password Activity Question & Answer, which is also known as cognitive passwords (Zviran & Haga, 1990) or User Name & Password Aging (Zeilenga, 2001) (Figures 2 and 3). Accounting (Aboba, Arkko, & Harrington, 2000) can take two forms, which include the Automatic Log Management system or Human Accounting Enforcement (Figures 2 and 3). Our framework enables each TIP to evaluate a given criterion (such as authentication) and its associated subcriteria (such as RADIUS versus Diameter), and determine whether additional resources should be expended to improve the effectiveness of the AAA. After the qualitative AHP and ANP forms were completed, these data were quantitatively formulated using AHP’s hierarchical square matrix and ANP’s feedback super matrix.

    A square matrix is required for the AHP model to obtain numerical values that are based on group judgments, record these values, and derive priorities. Comparisons of n pairs of elements based on their relative weights are described by criteria A1, …, An and by weights w1, …, wn (Saaty, 1991, p. 15).

    A reciprocal matrix was constructed based on the following property: aji = 1/aij, where aii = 1 (Saaty, 1991, p. 15). Multiplying the reciprocal matrix by the transposition of the vector wT = (w1, …, wn) yields the vector nw; thus, Aw = nw (Saaty, 1977, p. 236).
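As a minimal numeric illustration of this prioritization step, the following sketch derives a priority vector as the normalized principal eigenvector of a reciprocal matrix. The 3 x 3 matrix is an assumed example on Saaty's 1-9 scale, not the study's survey data.

```python
import numpy as np

# Hypothetical reciprocal judgment matrix for three criteria:
# a[j][i] = 1 / a[i][j] and a[i][i] = 1, as required for AHP.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The priority vector w is the principal eigenvector of A, normalized to
# sum to 1, so that Aw = lambda_max * w.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)           # index of the largest eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

print("priorities:", np.round(w, 3), " lambda_max:", round(eigvals[k].real, 3))
```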

    To test the degree of matrix inconsistency, a consistency index was generated by adding the columns of the judgment matrix and multiplying the resulting vector by the vector of priorities. This test yielded an eigenvalue that is denoted by λmax (Saaty, 1983), which is the largest eigenvalue of a reciprocal matrix of order n. To measure the deviation from consistency, Saaty developed the following consistency index (Saaty & Vargas, 1991):

    CI = (λmax − n) / (n − 1)

    As stated by Saaty (1983), “this index has been randomly generated for reciprocal matrices of different orders. The averages of the resulting consistency indices (R.I.) are given by” (Saaty & Vargas, 1991, p. 147):

    n      1     2     3      4     5      6      7      8
    R.I.   0     0     0.58   0.9   1.12   1.24   1.32   1.41


    The consistency ratio (CR) is defined as CR = CI/RI, and a CR of 20 percent or less satisfies the consistency criterion (Saaty, 1983).
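Putting the CI formula, the R.I. table, and the 20 percent threshold together, the consistency check can be sketched as follows; the judgment matrix is again a hypothetical example, not the study's data.

```python
import numpy as np

# Saaty's random consistency indices for matrix orders n = 1..8.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.9, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

def consistency_ratio(A):
    """CR = CI / RI, with CI = (lambda_max - n) / (n - 1)."""
    n = A.shape[0]
    lambda_max = max(np.linalg.eigvals(A).real)
    ci = (lambda_max - n) / (n - 1)
    return ci / RI[n]

A = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]], dtype=float)
cr = consistency_ratio(A)
print(f"CR = {cr:.3f} ->", "acceptable" if cr <= 0.20 else "revise judgments")
```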

    The ANP model is a general form of the AHP model, which employs complex relationships among the decision levels. The AHP model formulates a goal at the top of the hierarchy and then deconstructs it to the bottom to achieve its results (Saaty, 1983). Conversely, the ANP model does not adhere to a strict decomposition within its hierarchy; instead, it has feedback relationships among its levels. This feedback within the ANP framework is the primary difference between the two models. The criteria can describe dependence using an undirected arc between the levels of analysis, as shown in Figure 3, or using a looped arc within the same level. The ANP framework uses interdependent relationships that are captured in a super matrix (Saaty & Peniwati, 2012).
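The feedback captured in the supermatrix is conventionally resolved by the standard ANP limit computation: the column-stochastic weighted supermatrix is raised to successive powers until its entries stabilize, and the columns of the limit matrix give the global priorities. The sketch below uses an assumed 3 x 3 supermatrix, not the study's model.

```python
import numpy as np

# Hypothetical weighted supermatrix: each column sums to 1, and the
# off-diagonal structure encodes feedback between clusters.
W = np.array([
    [0.0, 0.2, 0.4],
    [0.6, 0.0, 0.6],
    [0.4, 0.8, 0.0],
])

def limit_matrix(W, tol=1e-9, max_iter=10_000):
    """Raise W to successive powers until the entries stabilize."""
    M = W.copy()
    for _ in range(max_iter):
        nxt = M @ W
        if np.max(np.abs(nxt - M)) < tol:
            return nxt
        M = nxt
    return M

print(np.round(limit_matrix(W), 4))  # columns converge to the global priorities
```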

    Fit-for-Purpose Approach

    We developed a fit-for-purpose approach that includes a procedure for effectively validating the benchmarking of a cyber security MOE. We created an AAA scorecard system by analyzing empirical evidence that introduced MCDM methodologies within the cyber security discipline with the goal of improving an organization’s total security posture.

    The first action of Function #2 is the creation of a survey design. This design, which is shown in Table 3, is the basis of the survey questionnaire. The targeted sample population was composed of SMEs who regularly manage Information Technology (IT) security issues. The group was self-identified in the survey and selected based on their depth of experience and prerequisite knowledge to answer questions regarding this topic (Office of Management and Budget [OMB], 2006). We used the Internet survey-gathering site SurveyMonkey, Inc. (Palo Alto, California, http://www.surveymonkey.com) for data collection. The second activity of Function #2 was questionnaire development; a sample question is shown in Figure 4.



    FIGURE 4. SURVEY SAMPLE QUESTION AND SCALE

    [Figure: A two-part sample survey item.
    * With respect to User Name/Password Aging, what do you find to be more important? (Remote User | WAN)
    * Based on your previous choice, evaluate the following statements. Importance of Selection: Equal Importance, Moderate Importance, Strong Importance, Very Strong Importance, Extreme Importance.]

    The questions were developed using the within-subjects design concept. This concept compels a respondent to view the same question twice, but in a different manner. A within-subjects design reduces the errors that are associated with individual differences by asking the same question in a different way (Epstein, 2013). This process enables a direct comparison of the responses and reduces the number of required respondents (Epstein, 2013).

    The scaling procedure in this study was based on G. A. Miller’s (1956) work and the continued use of Saaty’s hierarchical scaling within the AHP and ANP methodologies (Saaty, 1977, 1991, 2001, 2009, 2010, 2012; Saaty & Alexander, 1989; Saaty & Forman, 1992; Saaty & Peniwati, 2012; Saaty & Vargas, 1985, 1991). The scales within each question were based on the Likert scale; this scale has “equal importance” as the lowest parameter, which is indicated with a numerical value of one, and “extreme importance” as the highest parameter, which is indicated with a numerical value of nine (Figure 4).

    Demographics is the third action of Function #2. Professionals who were SMEs in the field of cyber security were sampled; each SME had an equal probability of being chosen for the survey, which enabled an unbiased representation of the group (Creative Research Systems, 2012; SurveyMonkey, 2015). A sample size of 502 respondents was surveyed in this study. Of the 502 respondents, 278 of the participants completed all of the survey responses. The required margin of error, which is also known as the confidence interval, was ±6%. This statistic is based on the concept of how well the sample population’s answers can be considered to represent the “true value” of the required population (e.g., 100,000+) (Creative Research Systems, 2012; SurveyMonkey, 2015). The confidence level accurately measures the sample size and shows that the population falls within a set margin of error. A 95 percent confidence level was required in this survey.
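As a quick plausibility check of the reported figure (a sketch assuming a simple random sample and a worst-case response proportion p = 0.5), the 278 completed surveys at a 95 percent confidence level reproduce roughly the stated ±6 percent:

```python
import math

z = 1.96   # critical value for a 95 percent confidence level
p = 0.5    # worst-case (most conservative) response proportion
n = 278    # completed survey responses

margin_of_error = z * math.sqrt(p * (1 - p) / n)
print(f"±{margin_of_error * 100:.1f}%")  # about ±5.9%, i.e., the reported ±6%
```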

    Survey

    Age of respondents was used as the primary measurement source for experience, with a sample size of 502 respondents, to correlate against job position (Table 4), company type (Table 5), and company size (Table 6).

    TABLE 4. AGE VS. JOB POSITION

    Age      NetEng  SysAdmin    IA  IT Mgt  Other  Total
    18-24         1         1     4       0      5     11
    25-34         7         2    27       6     28     70
    35-44        22         1    63      21     32    139
    45-54        19         4    70      41     42    176
    55-64        11         1    29      15     26     82
    65 >          0         0     1       2      3      6
    Total        60         9   194      85    136    484

    Skipped: 18

    Note. IA = Information Assurance; IT = Information Technology; NetEng = Network Engineering; SysAdmin = System Administration.


    TABLE 5. AGE VS. COMPANY TYPE

    Age      Mil Uniform  Gov't  Commercial  FFRDC  Other  Total
    18-24              0      0           2      7      2     11
    25-34             14      7          35     10      4     70
    35-44             13     26          69     19     11    138
    45-54             13     42          73     35     13    176
    55-64              7     12          37     22      4     82
    65 >               0      0           0      5      1      6
    Total             47     87         216     98     35    483

    Skipped: 19

    Note. FFRDC = Federally Funded Research and Development Center; Gov’t = Government; Mil = Military.

    TABLE 6. AGE VS. COMPANY SIZE

    Age      1-49  50-999  1K-5,999  6K >  Total
    18-24       2       1         1     7     11
    25-34       8      19         7    36     70
    35-44      16      33        17    72    138
    45-54      19      37        21    99    176
    55-64      11      14        10    46     81
    65 >        2       0         0     4      6
    Total      58     104        56   264    482

    Skipped: 20

    The respondents were usually mature and worked in the commercial sector (45 percent) in organizations that had 6,000+ employees (55 percent) and within the Information Assurance discipline (40 percent). A high number of respondents described their job descriptions as other (28 percent). The other category in Table 4 reflects an extensive range of job titles and job descriptions in the realm of cyber security, which were not categorized in Table 4.

    Descriptive statistical analysis is the fourth action of Function #2. This action summarizes the outcomes of the characteristics in concise quantitative terms to enable statistical inference (Daniel, 1990), as listed in Table 7.

    TABLE 7. CRITERIA DESCRIPTIVE STATISTICS

                     Q13. Which do you like best?        Q12. Which do you like best?
                     (Answered: 344)                     (Answered: 348)
                     A: 26%        B: 74%                A: 42%             B: 58%
                     Diameter      Automated Log         Human Accounting   Diameter
                     Protocol      Management            Enforcement        Protocol
    Likert 1             11            22                    16                 22
    Likert 2              2            17                     8                  7
    Likert 3              9            21                    19                 13
    Likert 4              7            24                    10                 24
    Likert 5             22            66                    41                 53
    Likert 6             15            34                    17                 25
    Likert 7             14            40                    25                 36
    Likert 8              3            12                     4                  9
    Likert 9              6            19                     7                 12
    Mean              5.011         5.082                 4.803              5.065
    Mode              5.000         5.000                 5.000              5.000
    Std Deviation     2.213         2.189                 2.147              2.159
    Variance          4.898         4.792                 4.611              4.661
    Skewness         -0.278        -0.176                -0.161             -0.292
    Kurtosis         -0.489        -0.582                -0.629             -0.446
    n                89.000       255.000               147.000            201.000
    Std Err           0.235         0.137                 0.177              0.152
    Minimum           1.000         1.000                 1.000              1.000
    1st Quartile      4.000         4.000                 3.000              4.000
    Median            5.000         5.000                 5.000              5.000
    3rd Quartile      7.000         7.000                 6.000              7.000
    Maximum           9.000         9.000                 9.000              9.000
    Range             8.000         8.000                 8.000              8.000

    Statistical inference, which is derived from the descriptive analysis, relates the population demographics, data normalization, and data reliability of the survey based on the internal consistency. Inferential statistics enables a sample set to represent the total population due to the impracticality of surveying each member of the total population. The sample set enables a visual interpretation of the statistical inference and is used to calculate the standard deviation, mean, and other categorical distributions, and test the data normality. The Minitab® software was used to perform these analyses, as shown in Figure 5, using the Anderson-Darling testing methodology.
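The same normality check can be reproduced outside Minitab; the sketch below applies SciPy's Anderson-Darling test to simulated stand-in responses (the study's raw answers are not reproduced here).

```python
import numpy as np
from scipy import stats

# Stand-in data: 373 simulated 9-point Likert responses (N matches Figure 5).
rng = np.random.default_rng(0)
responses = rng.integers(1, 10, size=373)

result = stats.anderson(responses, dist="norm")
print("A-D statistic:", round(result.statistic, 3))
for level, critical in zip(result.significance_level, result.critical_values):
    print(f"  {level:>4}% significance level: critical value {critical:.3f}")
# A statistic above a critical value rejects normality at that level.
```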

FIGURE 5. RESULTS OF THE ANDERSON-DARLING TEST
[Normal probability plot of Q9: Mean = 4.839, StDev = 2.138, N = 373, AD = 6.619; percent scale from 0.1 to 99.9, Q9 axis from 0 to 12.]
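The same normality screen can be run outside Minitab. Below is a minimal SciPy sketch, with a synthetic stand-in for the 373 Q9 responses; a large Anderson-Darling statistic (Figure 5 reports AD = 6.619) indicates the raw responses depart from normality, which motivates the residual analysis that follows.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
q9 = rng.integers(1, 10, size=373)  # synthetic stand-in for the 373 Q9 responses

result = stats.anderson(q9, dist="norm")  # Anderson-Darling normality test
print(f"AD statistic: {result.statistic:.3f}")
for crit, sig in zip(result.critical_values, result.significance_level):
    verdict = "reject" if result.statistic > crit else "cannot reject"
    print(f"  {sig}% level (critical {crit:.3f}): {verdict} normality")
```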


Figure 5 indicates that normality cannot be assumed for the raw Q9 responses from the survey questionnaire, so the residuals of a GLM fit were examined instead. The resulting plot (Figure 6) shows normally distributed residuals, which is consistent with the assumption that a General Linear Model (GLM) is adequate for the ANOVA test with categorical demographic predictors (i.e., respondent age, employer type, employer size, and job position).

FIGURE 6. RESIDUAL Q-Q PLOT AND ITS GLM ANALYSIS FOR Q9

Factor Information
Factor     Type    Levels   Values
AGE        Fixed   6        1, 2, 3, 4, 5, 6
SIZE       Fixed   4        1, 2, 3, 4
Type       Fixed   5        1, 2, 3, 4, 5
Position   Fixed   5        1, 2, 3, 4, 5

Analysis of Variance
Source           DF     Adj SS     Adj MS    F-Value   P-Value
AGE               5      32.35      6.470      1.43      0.212
SIZE              3       4.02      1.340      0.30      0.828
Type              4      28.40      7.101      1.57      0.182
Position          4      23.64      5.911      1.31      0.267
Error           353    1596.56      4.523
  Lack-of-Fit   136     633.01      4.654      1.05      0.376
  Pure Error    217     963.55      4.440
Total           369    1690.22

Y = Xβ + ε (Equation 1)

Regression Equation
Q9 = 5.377 - 1.294 AGE_1 - 0.115 AGE_2 - 0.341 AGE_3 - 0.060 AGE_4 + 0.147 AGE_5
     + 1.66 AGE_6 + 0.022 SIZE_1 + 0.027 SIZE_2 + 0.117 SIZE_3 - 0.167 SIZE_4
     - 0.261 Type_1 + 0.385 Type_2 - 0.237 Type_3 - 0.293 Type_4 + 0.406 Type_5
     + 0.085 Position_1 + 0.730 Position_2 - 0.378 Position_3 + 0.038 Position_4
     - 0.476 Position_5

Note. The error vector ε operates in the background of the fitted equation.

β Coefficients
Term        Coef     SE Coef   T-Value   P-Value   VIF
Constant    5.377    0.318     16.92     0.000
AGE
  1        -1.294    0.614     -2.11     0.036     1.07
  2        -0.115    0.366     -0.31     0.754     1.32
  3        -0.341    0.313     -1.09     0.277     1.76
  4        -0.060    0.297     -0.20     0.839     1.82
  5         0.147    0.343      0.43     0.669     1.38
SIZE
  1         0.022    0.272      0.08     0.935     3.02
  2         0.027    0.228      0.12     0.906     2.67
  3         0.117    0.275      0.43     0.670     2.89
Type
  1        -0.261    0.332     -0.79     0.433     1.49
  2         0.385    0.246      1.56     0.119     1.28
  3        -0.237    0.191     -1.24     0.216     1.18
  4        -0.293    0.265     -1.11     0.269     1.40
Position
  1         0.085    0.316      0.27     0.787     3.03
  2         0.730    0.716      1.02     0.309     8.97
  3        -0.378    0.243     -1.55     0.121     3.06
  4         0.038    0.288      0.13     0.896     3.03


FIGURE 6. RESIDUAL Q-Q PLOT AND ITS GLM ANALYSIS FOR Q9, CONTINUED

Q9. What do you like best? Password Activity-Based Q&A or Diameter Protocol
[Normal probability plot of the residuals (response is Q9): percent scale from 0.1 to 99.9, residual scale from -7.5 to 5.0.]

The P-values in Figure 6 show that the responses to Question 9 have minimal sensitivity to respondent age, company size, company type, and job position. Additionally, the lack-of-fit error (ε) has a P-value of 0.376, which indicates insufficient evidence to conclude that the model does not fit. In the GLM formula (Equation 1) used by Minitab®, Y is a vector of survey question responses, β is a vector of parameters (age, job position, company type, and company size), X is the design matrix of constants, and ε is a vector of independent normal random variables (Minitab®, 2015). The equation is as follows:

    Y = Xβ + ε (1)
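Equation (1) with fixed categorical factors can also be fit with open-source tools. The following is a minimal statsmodels sketch, using synthetic data in place of the survey file (the published analysis was run in Minitab; the column names and codings follow Figure 6):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic stand-in for the survey data, coded as in Figure 6:
# AGE 1-6, SIZE 1-4, Type 1-5, Position 1-5, Q9 on a 1-9 scale
rng = np.random.default_rng(1)
n = 370
df = pd.DataFrame({
    "Q9": rng.integers(1, 10, n),
    "AGE": rng.integers(1, 7, n),
    "SIZE": rng.integers(1, 5, n),
    "Type": rng.integers(1, 6, n),
    "Position": rng.integers(1, 6, n),
})

# Fit Y = Xβ + ε with each demographic treated as a fixed categorical factor
model = smf.ols("Q9 ~ C(AGE) + C(SIZE) + C(Type) + C(Position)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # per-factor F- and P-values, as in Figure 6
print(model.params)                     # the β coefficient vector
```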

    Once the data were tested for normality (Figure 6 shows the normally distributed residuals and equation traceability), an additional analysis was conducted to determine the internal consistency of the Likert scale survey questions. This analysis was performed using Cronbach’s alpha (Equation 2). In Equation 2, N is the number of items, c-bar is the average inter-item covariance, and v-bar is the average variance (Institute for Digital Research and Education [IDRE], 2016). The equation is as follows:


α = N·c̄ / (v̄ + (N – 1)·c̄)    (2)

    Cronbach’s alpha determines the reliability of a survey questionnaire based on the internal consistency of a Likert scale question, as shown in Figure 4 (Lehman et al., 2011). Cronbach’s alpha scores that are greater than 0.70 are considered to indicate good performance. The score for the respondent data from the survey was 0.98.
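Equation (2) maps directly to a few lines of NumPy. Below is a minimal sketch, assuming the item responses are arranged one column per Likert question and one row per respondent; the toy data are illustrative only.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Equation (2): items is a respondents-by-questions matrix."""
    n_items = items.shape[1]                           # N
    cov = np.cov(items, rowvar=False)                  # item covariance matrix
    v_bar = np.diag(cov).mean()                        # average item variance (v-bar)
    c_bar = cov[~np.eye(n_items, dtype=bool)].mean()   # average inter-item covariance (c-bar)
    return (n_items * c_bar) / (v_bar + (n_items - 1) * c_bar)

# Toy check: five correlated Likert-style items from 100 respondents;
# the actual survey scored 0.98, well above the 0.70 threshold
rng = np.random.default_rng(2)
latent = rng.normal(size=(100, 1))
items = latent + rng.normal(scale=0.4, size=(100, 5))
print(round(cronbach_alpha(items), 3))
```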

The determination of dominance is the fifth action of Function #2, which converts individual judgments into group decisions for a pairwise comparison between two survey questions (Figure 4). The geometric mean was employed for dominance selection, as shown in Equation (3) (Ishizaka & Nemery, 2013). If the geometric mean identifies a near-tie between answers A (4.9632) and B (4.9365), then expert judgment is used to determine the most significant selection; differences beyond the hundredths place were not considered significant. The equation is as follows:

geometric mean = (x1 · x2 · … · xN)^(1/N)    (3)
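Equation (3) and the dominance rule are easy to make concrete. A short sketch with hypothetical response vectors for one A/B question:

```python
import math

def geometric_mean(xs):
    # Equation (3): the N-th root of the product of the N responses
    return math.prod(xs) ** (1 / len(xs))

# Hypothetical response sets for a single A/B pairwise question
a = [5, 6, 4, 5, 5, 6]
b = [5, 6, 4, 5, 5, 5]
g_a, g_b = geometric_mean(a), geometric_mean(b)
# Near-ties (differences only past the hundredths place, such as the
# 4.9632 vs. 4.9365 case in the text) go to expert judgment instead
print(g_a, g_b, "A dominant" if g_a > g_b else "B dominant")
```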

    The sixth and final action of Function #2 is a pairwise comparison of the selection of alternatives and the creation of the AHP and ANP scorecards. The number of pairwise comparisons is based on the criteria for the interactions shown in Figures 2 and 3—the pairwise comparisons form the AHP and ANP scorecards. The scorecards shown in Figure 7 (AHP) and Figure 8 (ANP) include the pairwise comparisons for each MCDM and depict the dominant A/B survey answers based on the geometric mean shaded in red.
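To picture how a scorecard column becomes a comparison matrix, the sketch below enters each geometric mean on its dominant side and the reciprocal on the other. The three values are read from Figure 7's 12_Diameter node; the reciprocal-matrix construction is standard AHP practice rather than a step the article spells out.

```python
import numpy as np

# Figure 7, node 12_Diameter, alternatives LAN / WAN / Remote User:
# LAN vs. WAN = 3.8394 (LAN side), LAN vs. Remote = 3.9955 (Remote side),
# WAN vs. Remote = 3.9744 (Remote side), as read from the scorecard
judgments = {("LAN", "WAN"): 3.8394,
             ("Remote", "LAN"): 3.9955,
             ("Remote", "WAN"): 3.9744}

alts = ["LAN", "WAN", "Remote"]
A = np.eye(len(alts))
for (winner, loser), value in judgments.items():
    i, j = alts.index(winner), alts.index(loser)
    A[i, j] = value      # dominant side receives the geometric mean...
    A[j, i] = 1 / value  # ...and the other side its reciprocal
print(np.round(A, 4))
```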


FIGURE 7. AHP SCORECARD: A PAIRWISE COMPARISON MATRIX
[Scorecard of pairwise comparisons on a 9-1-9 scale, with the geometric mean of the survey responses marking the dominant side of each comparison: the Goal node compares Authentication, Authorization, and Accounting; each criterion node compares its subcriteria (RADIUS vs. Diameter; Activity Q&A vs. User Name & Password Aging; Human Accounting Enforcement vs. Automated Log Management); and each subcriterion node compares the LAN, WAN, and Remote User alternatives.]


FIGURE 8. ANP SCORECARD: A PAIRWISE COMPARISON MATRIX
[Scorecard of pairwise comparisons on a 9-1-9 scale for the ANP network, including the continuation page: the Goal node compares Authentication, Authorization, and Accounting; subcriteria nodes (RADIUS, Diameter; Activity Q&A, User Name & Password Aging; Human Accounting Enforcement, Automated Log Management) compare the LAN, WAN, and Remote User alternatives; and, reflecting ANP feedback, the alternative nodes compare the subcriteria in return.]




    After the scorecard data were populated, as shown in Figures 7 and 8, the data were transferred into Super Decisions, which is a software package that was employed to complete the final function of the proposed analysis.
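Before walking through the Super Decisions steps, it may help to see the structure being recreated. Below is a minimal sketch of the AHP hierarchy as plain data; the node names follow the scorecards, but the encoding itself is ours, not SD's file format.

```python
# Goal -> AAA criteria -> subcriteria -> alternatives, per Figures 2 and 7
AHP_MODEL = {
    "goal": "Measure of Effectiveness (AAA)",
    "criteria": {
        "Authentication": ["RADIUS", "Diameter"],
        "Authorization": ["Activity Q&A", "User Name & Password Aging"],
        "Accounting": ["Human Accounting Enforcement", "Automated Log Management"],
    },
    "alternatives": ["LAN", "WAN", "Remote User"],
}
```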

Function #3—Measures
To ensure the validity of the data in forming the AHP and ANP models, we used the Super Decisions (SD) software to verify the proposed methodology. The first action of Function #3 is Measures. This action begins by recreating the AHP and ANP models, as shown in Figures 2 and 3, and replicating them in SD. The second action of Function #3 is to incorporate the composite scorecards into the AHP and ANP model designs. The composite data in the scorecards were input into SD to verify that the pairwise comparisons of the AHP and ANP models in the scorecards (Figures 7 and 8) had been mirrored and validated by SD's questionnaire section. During this second action, after the scorecard pairwise criteria comparison section had been completed, immediate feedback was provided to check the data for inconsistencies and to provide a cluster priority ranking for each pair, as shown in Figure 9.

FIGURE 9. AHP SCORECARD INCONSISTENCY CHECK

Comparisons wrt "12_Diameter" node in "4Alternatives" cluster
1_LAN is moderately more important than 2_WAN
1. 1_LAN   >=9.5 9 8 7 6 5 4 3 2 | 2 3 4 5 6 7 8 9 >=9.5 | No comp.   2_WAN
2. 1_LAN   >=9.5 9 8 7 6 5 4 3 2 | 2 3 4 5 6 7 8 9 >=9.5 | No comp.   3_Remote User
3. 2_WAN   >=9.5 9 8 7 6 5 4 3 2 | 2 3 4 5 6 7 8 9 >=9.5 | No comp.   3_Remote User

Inconsistency: 0.13040
1_LAN      0.28083
2_WAN      0.13501
3_Remote   0.58416

    All of the AHP and ANP models satisfied the required inconsistency check, with values between 0.10 and 0.20 (Saaty, 1983). This action concluded the measurement aspect of Function #3. Function #4—Analysis—is the final portion of the application approach to the benchmarking framework for the MOE AAA. This function ranks priorities for the AHP and ANP models. The first action of Function #4 is to review the priorities and weighted rankings of each model, as shown in Figure 10.
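The inconsistency value SD reports can be approximated with Saaty's classic eigenvalue method. Below is a minimal NumPy sketch with an illustrative (not article-supplied) 3x3 judgment matrix; the priorities are the normalized principal eigenvector, and the consistency ratio plays the role of the Figure 9 inconsistency check.

```python
import numpy as np

def priorities_and_inconsistency(A: np.ndarray):
    """Principal-eigenvector priorities and Saaty consistency ratio for a
    reciprocal pairwise matrix, approximating the SD check in Figure 9."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)             # principal eigenvalue, lambda_max
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                            # normalized priority vector
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)    # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]     # Saaty random indices
    return w, ci / ri                       # priorities, consistency ratio

# Illustrative reciprocal matrix for LAN / WAN / Remote User under one node
A = np.array([[1.0,   3.0, 1/2.0],
              [1/3.0, 1.0, 1/4.0],
              [2.0,   4.0, 1.0  ]])
w, cr = priorities_and_inconsistency(A)
print(np.round(w, 5), round(cr, 5))
```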


FIGURE 10. AHP/ANP SECURITY METRICS

                                     AHP        ANP
Authentication
  RADIUS                           0.20000    0.18231
  Diameter                         0.80000    0.81769
  LAN                                 -       0.12950
  WAN                                 -       0.33985
  Remote User                         -       0.53065
Authorization
  Password Activity Q&A            0.20000    0.20000
  User Name & Password Aging       0.80000    0.80000
  LAN                                 -       0.12807
  WAN                                 -       0.22686
  Remote User                         -       0.64507
Accounting
  Human Acct Enforcement           0.20001    0.20000
  Auto Log Mgt                     0.79999    0.80000
  LAN                                 -       0.32109
  WAN                                 -       0.13722
  Remote User                         -       0.54169
Alternative Ranking
  LAN                              0.15873    0.02650
  WAN                              0.24555    0.05710
  Remote User                      0.60172    0.92100

These priorities and weighted rankings are the AAA security control measures that cyber security leaders need in order to make well-informed choices as they create and deploy defensive strategies.
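For readers who want to trace how per-criterion priorities roll up, the sketch below applies the standard weighted synthesis to the ANP per-criterion alternative priorities in Figure 10, with equal criterion weights assumed purely for illustration; the published ANP ranking additionally reflects network feedback through the limit supermatrix, which this simple roll-up omits.

```python
import numpy as np

# Per-criterion alternative priorities (LAN, WAN, Remote User) from Figure 10 (ANP)
alt_by_criterion = np.array([
    [0.12950, 0.33985, 0.53065],   # under Authentication
    [0.12807, 0.22686, 0.64507],   # under Authorization
    [0.32109, 0.13722, 0.54169],   # under Accounting
])
weights = np.full(3, 1 / 3)        # assumed equal criterion weights (illustrative)
overall = weights @ alt_by_criterion
print(dict(zip(["LAN", "WAN", "Remote User"], np.round(overall, 5))))
```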


Summary of Analysis
Using a GLM, the survey data showed normally distributed residuals, which is consistent with the assumption that a GLM is adequate for the ANOVA test with categorical demographic predictors (i.e., respondent age, employer type, employer size, and job position).

Additionally, using Cronbach’s alpha analysis, a score of 0.98 indicated that the reliability of the survey questionnaire was acceptable based on the internal consistency of the Likert scale for each question.

    The subjective results of the survey contradicted the AHP and ANP MCDM model results shown in Figure 10.

The survey indicated that 67 percent (with a ±6 percent margin of error) of the respondents preferred RADIUS to Diameter; conversely, both the AHP model and the ANP model selected Diameter over RADIUS. Within the ANP model, the LAN, WAN, and Remote User communities provided ranking priorities for the subcriteria and a final community ranking based on the model interactions (Figures 3 and 10). The ANP model accounts for interdependencies, outer dependencies, and feedback loops in its rankings, whereas the AHP model is a top-down approach whose community ranking comes last (Figures 2 and 10).

    The preferen