Security Architecture and Design
Dr. Bhavani Thuraisingham, The University of Texas at Dallas (UTD), June 2011


Page 1: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Dr. Bhavani Thuraisingham
The University of Texas at Dallas (UTD)

June 2011

Security Architecture and Design

Page 2: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Domain Agenda• System and Components Security• Architectural Security Concepts and Models• Information Systems Evaluation Models

Page 3: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Domain Agenda• System and Components Security

– Architectural Concepts and Definitions

• Architectural Security Concepts and Models• Information Systems Evaluation Models

Page 4: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Common Security Architecture Terms• Information Security Management System• Information Security Architecture• Best Practice• Architecture• Blueprint• Framework• Infrastructure

Page 5: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Objectives of Enterprise Security Architecture

• Guidance• Strategically aligned business and security decisions• Provide security-related guidance• Apply security best practices• Define security zones

Page 6: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Benefits of an Enterprise Security Architecture

• Consistently manage risk• Reduce the costs of managing risk• Accurate security-related decisions• Promote interoperability, integration and ease-of-access• Provide a frame of reference

Page 7: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Characteristics of a Good Security Architecture

• Strategic• Holistic• Multiple implementations

Page 8: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Effects of Poor Architectural Planning• Inability to efficiently support new business services• Unidentified security vulnerabilities• Increased frequency and visibility of security breaches• Poorly understood or coordinated compliance requirements• Poor understanding of security goals and objectives

Page 9: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Enterprise Security Architecture Components

• Common Architecture Language• Architecture Model• Zachman Framework

Page 10: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Zachman Framework• Complete overview of IT business alignment• Two-dimensional• Intent• Scope• Principles

Page 11: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

SABSA (Sherwood Applied Business Security Architecture)• What are the business requirements?

– Contextual– Conceptual– Logical– Physical– Component

Page 12: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

ISO 7498-2• Part 2 of the OSI reference model (the security architecture)• About secure communications• NOT an implementation

Page 13: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

ISO/IEC 42010:2007• Systems and software engineering• Recommended practice for architectural description of software-intensive systems

Page 14: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

The Open Group Architecture Framework (TOGAF)

• Governance• Business• Application• Data • Technology

Page 15: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Department of Defense Architecture Framework (DoDAF)

• OMB A-130 requirement• All view• Operational view• Systems view• Technical standards view

Page 16: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Which Framework is Right?• Starting place• Culture• Template

Page 17: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

System and Component Security• Components that provide basic security services• Hardware components• Software components

Page 18: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

CPU and Processor Privilege States• Supervisor state• Problem state

Page 19: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

CPU Process States• Running• Ready• Blocked• Masked/interruptible

Page 20: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Common Computer Architecture Layers

• Application programs• Utilities• Operating system• Computer hardware

Page 21: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Common Computer Architecture• Program execution• Access to input/output devices• Controlled access to files and data• Error detection and response• Accounting and tracking• Access for maintenance and troubleshooting

Page 22: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Hardware: Computers• Mainframe• Minicomputer• Desktop / server• Laptop / notebook• Embedded

Page 23: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Hardware: Communication Devices• Modem• Network Interface Card (NIC)

Page 24: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Hardware: Printers• Network-aware• More than output device• Full operating systems

Page 25: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Hardware: Wireless• Network interface card• Access point• Ethernet bridge• Router• Range extender

Page 26: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Input/Output (I/O) Devices• I/O Controller• Managing memory• Hardware• Operating system

Page 27: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Firmware: Pre-programmed Chips• ROMs (Read-only memory)• PROMs (Programmable read-only memory)• EPROMs (Erasable, programmable, read-only memory)• EEPROMs (Electrically erasable, programmable, read-only memory)• Field Programmable Gate Arrays (FPGAs)• Flash chips

Page 28: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Software: Operating System• Hardware control• Hardware abstraction• Resource manager

Page 29: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

CPU and OS Support for Applications• Applications were originally self-contained• OS capable of accommodating more than one application at a time

Page 30: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

CPU and OS Support for Applications - Today

• Today’s applications are portable• Execute multiple process threads• Threads

Page 31: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Operating Systems Support for Applications

• Multi-tasking• Multi-programming• Multi-processing• Multi-processor• Multi-core

Page 32: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Software: Vendor• Commercial off the shelf (COTS)• Function first• Evaluation

Page 33: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Software: Custom• Minimal scripting• Business application• System life cycle

Page 34: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Software: Customer-relationship Management Systems

• Business to customer interactions• Tracking habits

Page 35: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Systems Architecture Approaches• Open• Closed• Dedicated• Single level• Multi-level• Embedded

Page 36: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Architectures: Middleware• Interoperability• Post implementation• Distributed

Page 37: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Types of System Memory Resources• CPU registers• Cache• Main memory• Swap space• Disk storage

Page 38: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Requirements for Memory Management

• Relocation• Protection• Sharing

Page 39: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Three Types of Memory Addressing• Logical• Relative• Physical
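
To make relocation and protection concrete, here is a minimal sketch (hypothetical names, not from the slides) of turning a relative address into a physical one with a base register and a limit register:

```python
def translate(relative_addr: int, base_register: int, limit_register: int) -> int:
    """Relocate a relative (logical) address into a physical address.

    The base register records where the process was loaded; the limit
    register bounds the process so it cannot reference memory outside
    its own region (the protection requirement).
    """
    if not 0 <= relative_addr < limit_register:
        raise MemoryError("address falls outside the process's allocated region")
    return base_register + relative_addr

# A process loaded at physical 0x4000 with a 0x1000-byte region:
print(hex(translate(0x0042, base_register=0x4000, limit_register=0x1000)))  # 0x4042
```

Physical addressing uses the returned value directly; the logical/relative address is what the program sees before this translation.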

Page 40: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Memory Protection Benefits• Memory reference• Different data classes• Users can share access• Users cannot generate addresses

Page 41: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Virtual Memory• Extends apparent memory• Paging includes

– Splitting physical memory– Splitting programs (processes)– Allocating the required number of page frames

• Swapping
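
As a rough illustration of paging (the page size and page-table layout here are assumptions, not from the slides), a virtual address splits into a page number and an offset, and a miss in the page table is where swapping would bring the page in from disk:

```python
PAGE_SIZE = 4096  # assumed page size; real systems vary

def resolve(virtual_addr: int, page_table: dict) -> int:
    """Map a virtual address to a physical one through a page table.

    page_table maps a virtual page number to a physical frame number;
    a missing entry stands in for a page fault, where the OS would
    swap the required page in from disk.
    """
    page_number, offset = divmod(virtual_addr, PAGE_SIZE)
    if page_number not in page_table:
        raise LookupError(f"page fault: page {page_number} is not resident")
    return page_table[page_number] * PAGE_SIZE + offset

# Virtual page 2 happens to live in physical frame 7:
print(hex(resolve(0x2010, {2: 7})))  # 0x7010
```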

Page 42: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Virtual Machines• Mimic the architecture of the actual system• Provided by the operating system

Page 43: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Domain Agenda• System and Components Security• Architectural Security Concepts and Models• Information Systems Evaluation Models

Page 44: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Ring Protection

0. O/S Kernel
1. I/O
2. Utilities
3. User Apps
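
The convention is that a lower ring number means more privilege. A toy check of that rule (illustrative only; real processors enforce it in hardware, with controlled gates for crossing inward):

```python
RINGS = {0: "O/S Kernel", 1: "I/O", 2: "Utilities", 3: "User Apps"}

def may_access(caller_ring: int, target_ring: int) -> bool:
    """A caller may use resources at its own ring or any outer (less
    privileged) ring; reaching an inner ring requires a controlled
    entry point such as a system call gate."""
    return caller_ring <= target_ring

assert may_access(0, 3)       # the kernel may touch user-level resources
assert not may_access(3, 0)   # a user application cannot directly enter ring 0
```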

Page 45: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Layering and Data Hiding• Layering• Data Hiding

Page 46: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Privilege Levels• Identifying, authenticating and authorizing subjects• Subjects of higher trust• Subjects with lower trust

Page 47: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Process Isolation• Object’s integrity• Prevents interaction• Independent states• Process isolation method

Page 48: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Security Architecture• Security critical components of the system• Trusted Computing Base• Reference Monitor and Security Kernel• Security Perimeter• Security Policy• Least Privilege

Page 49: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Trusted Computing Base (TCB)• Trusted Computing Base

– Hardware– Firmware– Software– Processes– Inter-process communications

• Simple and Testable

Page 50: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Trusted Computing Base (TCB)• Enforces security policy• Monitors four basic functions

– Process activation– Execution domain switching– Memory protection– Input/output operations

Page 51: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Trusted Computing Base (TCB)• The trusted computing base (TCB) of a computer system is the set of all hardware, firmware, and/or software components that are critical to its security, in the sense that bugs or vulnerabilities occurring inside the TCB might jeopardize the security properties of the entire system. By contrast, parts of a computer system outside the TCB must not be able to misbehave in a way that would leak any more privileges than are granted to them in accordance with the security policy.

• The careful design and implementation of a system's trusted computing base is paramount to its overall security. Modern operating systems strive to reduce the size of the TCB so that an exhaustive examination of its code base (by means of manual or computer-assisted software audit or program verification) becomes feasible.

Page 52: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Reference Monitor Concept• Abstract machine concept

– Must be tamper-proof– Always invoked– Verifiable

• Security kernel• Subject• Object

Page 53: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Reference Monitor and Security Kernel• In operating systems architecture, a reference monitor is a tamperproof, always-invoked, and small-enough-to-be-fully-tested-and-analyzed module that controls all software access to data objects or devices (verifiable).

• The reference monitor verifies that the request is allowed by the access control policy.

• For example, Windows 3.x and 9x operating systems were not built with a reference monitor, whereas the Windows NT line, which also includes Windows 2000 and Windows XP, was designed to contain a reference monitor, although it is not clear that its properties (tamperproof, etc.) have ever been independently verified, or what level of computer security it was intended to provide.
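
A minimal sketch of the reference monitor idea (the access matrix and names are hypothetical): every request is mediated, the check itself is small enough to analyze, and the policy data is kept where subjects cannot alter it:

```python
# (subject, object) -> set of permitted access modes; illustrative policy data
ACCESS_MATRIX = {
    ("alice", "payroll.db"): {"read"},
    ("backup_svc", "payroll.db"): {"read", "write"},
}

def reference_monitor(subject: str, obj: str, mode: str) -> bool:
    """Mediate every access of a subject to an object: the request is
    allowed only if the access control policy explicitly grants the
    requested mode; everything else is denied."""
    return mode in ACCESS_MATRIX.get((subject, obj), set())

assert reference_monitor("alice", "payroll.db", "read")
assert not reference_monitor("alice", "payroll.db", "write")
```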

Page 54: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Domain Agenda• System and Components Security• Architectural Security Concepts and Models

– Security Models

• Information Systems Evaluation Models

Page 55: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Bell-LaPadula Confidentiality Model• Hierarchical state machine model• Three fundamental modes• Secure state• Defines access rules

Page 56: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Bell and LaPadula• A system state is defined to be "secure" if the only permitted access modes of subjects to objects are in accordance with a security policy. To determine whether a specific access mode is allowed, the clearance of a subject is compared to the classification of the object (more precisely, to the combination of classification and set of compartments, making up the security level) to determine if the subject is authorized for the specific access mode. The clearance/classification scheme is expressed in terms of a lattice. The model defines two mandatory access control (MAC) rules and one discretionary access control (DAC) rule with three security properties:

– The Simple Security Property - a subject at a given security level may not read an object at a higher security level (no read-up).

– The *-property (read "star"-property) - a subject at a given security level must not write to any object at a lower security level (no write-down). The *-property is also known as the Confinement property.

– The Discretionary Security Property - use of an access matrix to specify the discretionary access control.
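
A small sketch of the two mandatory rules over a linear ordering of levels (compartments are omitted for brevity, and the level names are only illustrative):

```python
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}

def blp_read_ok(subject_clearance: str, object_class: str) -> bool:
    """Simple Security Property: no read up."""
    return LEVELS[subject_clearance] >= LEVELS[object_class]

def blp_write_ok(subject_clearance: str, object_class: str) -> bool:
    """*-property: no write down."""
    return LEVELS[subject_clearance] <= LEVELS[object_class]

assert not blp_read_ok("CONFIDENTIAL", "SECRET")   # read up is denied
assert not blp_write_ok("SECRET", "CONFIDENTIAL")  # write down is denied
```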

Page 57: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Biba• In general, preservation of data integrity has three goals:

– Prevent data modification by unauthorized parties– Prevent unauthorized data modification by authorized parties– Maintain internal and external consistency (i.e. data reflects the real world)

• Biba security model is directed toward data integrity (rather than confidentiality) and is characterized by the phrase: "no read down, no write up". This is in contrast to the Bell-LaPadula model which is characterized by the phrase "no write down, no read up".

• The Biba model defines a set of security rules similar to the Bell-LaPadula model. These rules are the reverse of the Bell-LaPadula rules:

– The Simple Integrity Axiom states that a subject at a given level of integrity must not read an object at a lower integrity level (no read down).

– The * (star) Integrity Axiom states that a subject at a given level of integrity must not write to any object at a higher level of integrity (no write up).
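
Reversing the comparisons in the Bell-LaPadula sketch above gives the Biba integrity rules (again with a linear set of levels and illustrative names only):

```python
INTEGRITY = {"LOW": 0, "MEDIUM": 1, "HIGH": 2}

def biba_read_ok(subject_level: str, object_level: str) -> bool:
    """Simple Integrity Axiom: no read down."""
    return INTEGRITY[subject_level] <= INTEGRITY[object_level]

def biba_write_ok(subject_level: str, object_level: str) -> bool:
    """* (star) Integrity Axiom: no write up."""
    return INTEGRITY[subject_level] >= INTEGRITY[object_level]

assert not biba_read_ok("HIGH", "LOW")    # reading low-integrity data is denied
assert not biba_write_ok("LOW", "HIGH")   # writing into high-integrity data is denied
```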

Page 58: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Clark Wilson Model
• The Clark-Wilson integrity model provides a foundation for specifying and analyzing an integrity policy for a computing system.
• The model is primarily concerned with formalizing the notion of information integrity.
• Information integrity is maintained by preventing corruption of data items in a system due to either error or malicious intent.
• An integrity policy describes how the data items in the system should be kept valid from one state of the system to the next and specifies the capabilities of various principals in the system.
• The model defines enforcement rules and certification rules.
• The model’s enforcement and certification rules define data items and processes that provide the basis for an integrity policy. The core of the model is based on the notion of a transaction.

Page 59: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Clark Wilson Model• A well-formed transaction is a series of operations that transition a system from one consistent state to another consistent state.

• In this model the integrity policy addresses the integrity of the transactions.

• The principle of separation of duty requires that the certifier of a transaction and the implementer be different entities.

• The model contains a number of basic constructs that represent both data items and processes that operate on those data items. The key data type in the Clark-Wilson model is a Constrained Data Item (CDI). An Integrity Verification Procedure (IVP) ensures that all CDIs in the system are valid at a certain state. Transactions that enforce the integrity policy are represented by Transformation Procedures (TPs). A TP takes as input a CDI or Unconstrained Data Item (UDI) and produces a CDI. A TP must transition the system from one valid state to another valid state. UDIs represent system input (such as that provided by a user or adversary). A TP must guarantee (via certification) that it transforms all possible values of a UDI to a “safe” CDI.

Page 60: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Clark Wilson Model• At the heart of the model is the notion of a relationship between an authenticated principal (i.e., user) and a set of programs (i.e., TPs) that operate on a set of data items (e.g., UDIs and CDIs). The components of such a relation, taken together, are referred to as a Clark-Wilson triple. The model must also ensure that different entities are responsible for manipulating the relationships between principals, transactions, and data items. As a short example, a user capable of certifying or creating a relation should not be able to execute the programs specified in that relation.

• The model consists of two sets of rules: Certification Rules (C) and Enforcement Rules (E). The nine rules ensure the external and internal integrity of the data items. To paraphrase these:

• C1—When an IVP is executed, it must ensure the CDIs are valid. C2—For some associated set of CDIs, a TP must transform those CDIs from one valid state to another. Since we must make sure that these TPs are certified to operate on a particular CDI, we must have E1 and E2.

Page 61: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Clark Wilson Model• E1—System must maintain a list of certified relations and ensure only TPs certified to run on a CDI change that CDI. E2—System must associate a user with each TP and set of CDIs. The TP may access the CDI on behalf of the user if it is “legal.” This requires keeping track of triples (user, TP, {CDIs}) called “allowed relations.”

• C3—Allowed relations must meet the requirements of “separation of duty.” We need authentication to keep track of this.

• E3—System must authenticate every user attempting a TP. Note that this is per TP request, not per login. For security purposes, a log should be kept.

• C4—All TPs must append to a log enough information to reconstruct the operation. When information enters the system it need not be trusted or constrained (i.e. can be a UDI). We must deal with this appropriately.

• C5—Any TP that takes a UDI as input may only perform valid transactions for all possible values of the UDI. The TP will either accept (convert to CDI) or reject the UDI. Finally, to prevent people from gaining access by changing qualifications of a TP:

• E4—Only the certifier of a TP may change the list of entities associated with that TP.
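
A compact sketch of how the enforcement rules above could look in code: the allowed relations are Clark-Wilson triples, E1/E2 gate which TP may touch which CDIs on whose behalf, E3 authenticates per TP request, and C4 appends to a log. The names and in-memory tables are illustrative, not part of the model's formal statement.

```python
# Allowed relations: Clark-Wilson triples (user, TP, set of CDIs) -- E1/E2
ALLOWED_RELATIONS = {
    ("teller", "post_deposit", frozenset({"accounts"})),
}
AUTHENTICATED_USERS = {"teller"}   # stand-in for real authentication (E3)
AUDIT_LOG = []                     # append-only record of operations (C4)

def run_tp(user: str, tp: str, cdis: frozenset) -> None:
    if user not in AUTHENTICATED_USERS:                # E3: authenticate per TP request
        raise PermissionError("user is not authenticated")
    if (user, tp, cdis) not in ALLOWED_RELATIONS:      # E1/E2: only certified triples
        raise PermissionError("no certified relation for this user/TP/CDI set")
    AUDIT_LOG.append((user, tp, tuple(sorted(cdis))))  # C4: log enough to reconstruct
    # ... the TP body would then move the CDIs from one valid state to another (C2)

run_tp("teller", "post_deposit", frozenset({"accounts"}))
```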

Page 62: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Other Fundamental Models• Information flow model• Non-interference model• State machine• Lattice-based model• Graham-Denning• Harrison-Ruzzo-Ullman result

Page 63: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Domain Agenda• System and Components Security• Architectural Security Concepts and Models• Information Systems Evaluation Models

Page 64: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Evaluation Roles• Buyer/Customer• Seller/Vendor• Lab/Certifier

Page 65: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Documents & Organizations• TCSEC (U.S. DoD)• ITSEC (European Union)• Common Criteria (ISO/IEC 15408)

Page 66: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

TCSEC or Orange Book• DoD-centric• Security and functionality• Product evaluation

Page 67: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Secure System Evaluation: TCSEC
• Trusted Computer System Evaluation Criteria (TCSEC) is a United States Government Department of Defense (DoD) standard that sets basic requirements for assessing the effectiveness of computer security controls built into a computer system. The TCSEC was used to evaluate, classify and select computer systems being considered for the processing, storage and retrieval of sensitive or classified information.
• The TCSEC, frequently referred to as the Orange Book, is the centerpiece of the DoD Rainbow Series publications. It was initially issued in 1983 by the National Computer Security Center (NCSC), an arm of the National Security Agency, and then updated in 1985.

• TCSEC was replaced by the Common Criteria international standard originally published in 2005.

Page 68: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Secure System Evaluation: TCSEC• Policy - The security policy must be explicit, well-defined and enforced by the computer system. There are two basic security policies:

• Mandatory Security Policy - Enforces access control rules based directly on an individual's clearance, authorization for the information and the confidentiality level of the information being sought. Other indirect factors are physical and environmental. This policy must also accurately reflect the laws, general policies and other relevant guidance from which the rules are derived.

– Marking - Systems designed to enforce a mandatory security policy must store and preserve the integrity of access control labels and retain the labels if the object is exported.

• Discretionary Security Policy - Enforces a consistent set of rules for controlling and limiting access based on identified individuals who have been determined to have a need-to-know for the information.

Page 69: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Secure System Evaluation: TCSEC• Accountability - Individual accountability regardless of policy must be enforced. A secure means must exist to ensure the access of an authorized and competent agent which can then evaluate the accountability information within a reasonable amount of time and without undue difficulty. There are three requirements under the accountability objective:
– Identification - The process used to recognize an individual user.

– Authentication - The verification of an individual user's authorization to specific categories of information.

– Auditing - Audit information must be selectively kept and protected so that actions affecting security can be traced to the authenticated individual.

• The TCSEC defines four divisions: D, C, B and A where division A has the highest security. Each division represents a significant difference in the trust an individual or organization can place on the evaluated system. Additionally divisions C, B and A are broken into a series of hierarchical subdivisions called classes: C1, C2, B1, B2, B3 and A1.

Page 70: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Secure System Evaluation: TCSEC• Assurance: The computer system must contain hardware/software mechanisms that can be independently evaluated to provide sufficient assurance that the system enforces the above requirements. By extension, assurance must include a guarantee that the trusted portion of the system works only as intended. To accomplish these objectives, two types of assurance are needed with their respective elements:
– Assurance Mechanisms: Operational Assurance: System Architecture, System Integrity, Covert Channel Analysis, Trusted Facility Management and Trusted Recovery
– Life-cycle Assurance: Security Testing, Design Specification and Verification, Configuration Management and Trusted System Distribution

Page 71: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

ITSEC• International origin• ITSEM• Functionality• Assurance

Page 72: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Secure System Evaluation: ITSEC
• The Information Technology Security Evaluation Criteria (ITSEC) is a structured set of criteria for evaluating computer security within products and systems. The ITSEC was first published in May 1990 in France, Germany, the Netherlands, and the United Kingdom based on existing work in their respective countries. Following extensive international review, Version 1.2 was subsequently published in June 1991 by the Commission of the European Communities for operational use within evaluation and certification schemes.

• Levels E1 – E6

Page 73: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Common Criteria• Origins• ISO• Documents• EAL 1-7• PP• TOE• ST

Page 74: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Secure System Evaluation: Common Criteria

• The Common Criteria for Information Technology Security Evaluation (abbreviated as Common Criteria or CC) is an international standard (ISO/IEC 15408) for computer security certification.

• Common Criteria is a framework in which computer system users can specify their security functional and assurance requirements, vendors can then implement and/or make claims about the security attributes of their products, and testing laboratories can evaluate the products to determine if they actually meet the claims. In other words, Common Criteria provides assurance that the process of specification, implementation and evaluation of a computer security product has been conducted in a rigorous and standard manner.

• Levels: EAL 1 – EAL 7 (Evaluation Assurance Levels)

Page 75: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

EAL = $
In natural language:
1. A user has only one password and is assigned only one role.
2. A valid password is a case-sensitive string of six to eight single-byte alphanumeric characters.
3. A user must set up a password at the first-time login.
4. The system allows the authenticated user to change his password.
5. The new password cannot be the same as the old one.
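
To make the example concrete, here is a straightforward check of requirements 2 and 5 above ("single-byte alphanumeric" is read here as ASCII letters and digits, which is an assumption):

```python
def password_acceptable(new_pw: str, old_pw: str) -> bool:
    """Check a proposed password against requirements 2 and 5 above."""
    if not 6 <= len(new_pw) <= 8:                    # six to eight characters
        return False
    if not (new_pw.isascii() and new_pw.isalnum()):  # single-byte alphanumeric
        return False
    if new_pw == old_pw:                             # rule 5; comparison is case-sensitive
        return False
    return True

assert password_acceptable("Ab3def", "oldpass1")
assert not password_acceptable("Ab3def", "Ab3def")  # same as the old password
assert not password_acceptable("ab3", "oldpass1")   # too short
```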

Page 76: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Comparison of Evaluation Levels

Common Criteria | US TCSEC | European ITSEC
-- | D: Minimal Protection | E0
EAL 1 | -- | --
EAL 2 | C1: Discretionary Security Protection | E1
EAL 3 | C2: Controlled Access Protection | E2
EAL 4 | B1: Labeled Security Protection | E3
EAL 5 | B2: Structured Protection | E4
EAL 6 | B3: Security Domains | E5
EAL 7 | A1: Verified Design | E6

Page 77: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Certification and Accreditation• Certification and Accreditation (C&A) is a process for implementing information security. It is a systematic procedure for evaluating, describing, testing and authorizing systems prior to or after a system is in operation.

• Certification is a comprehensive assessment of the management, operational, and technical security controls in an information system, made in support of security accreditation, to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system.

• Accreditation is the official management decision given by a senior agency official to authorize operation of an information system and to explicitly accept the risk to agency operations (including mission, functions, image, or reputation), agency assets, or individuals, based on the implementation of an agreed-upon set of security controls.

Page 78: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Popular Management Frameworks• ISO 27001• ITIL • COSO• CMMI

Page 79: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

ISO 27001 – Stages in Implementing an ISMS

1. Define information security policy
2. Define scope of ISMS
3. Perform risk assessment
4. Manage risks
5. Select controls
6. Prepare statement of applicability

Page 80: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

IT Infrastructure Library (ITIL)• Focuses on IT services• Seven main sections• Supporting products

Page 81: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Committee of Sponsoring Organizations

• Emphasizes the importance of identifying and managing risks– Process– People– Reasonable assurance– Objectives

Page 82: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Capability Maturity Model• Developed by SEI• Based on TQM concepts• Framework for improving process• Benefits

Page 83: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Open vs. Closed System
• Open systems allow users to reuse, edit, manipulate, and contribute to the system development
– Open source software is an example of Open systems
• Licensed to the public
– Freeware is also an example of Open systems
• Closed systems permit users to use the system as it is

Page 84: Dr. Bhavani Thuraisingham The University of Texas at Dallas (UTD) June 2011

Some Security Threats• Buffer Overflow• Maintenance Hooks• Time of Check / Time of Use (TOC/TOU) attacks
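
As a quick illustration of the time of check / time of use problem: checking a file by name and then opening it by name leaves a window in which an attacker can swap the file (for example, for a symlink). This is a hedged sketch, not a complete mitigation; the safer variant simply inspects the descriptor it already opened instead of re-resolving the path.

```python
import os
import stat

def read_config_racy(path: str) -> bytes:
    # Time of check ...
    if not os.access(path, os.R_OK):
        raise PermissionError(path)
    # ... time of use: the file named by `path` may have been swapped in between.
    with open(path, "rb") as f:
        return f.read()

def read_config_safer(path: str) -> bytes:
    # Open first, then reason about the object that was actually opened.
    flags = os.O_RDONLY | getattr(os, "O_NOFOLLOW", 0)  # refuse symlinks where supported
    fd = os.open(path, flags)
    try:
        st = os.fstat(fd)                 # examine the descriptor, not the name
        if not stat.S_ISREG(st.st_mode):
            raise ValueError("not a regular file")
        return os.read(fd, st.st_size)
    finally:
        os.close(fd)
```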