Secure Design James Walden Northern Kentucky University



Topics

1. Risk Analysis

2. Design Principles

3. Design By Contract

4. Legacy Issues

Risk Analysis

Fix design flaws, not implementation bugs.

Risk analysis steps:

1. Develop an architecture model.

2. Identify threats and possible vulnerabilities.

3. Develop attack scenarios.

4. Rank risks based on probability and impact.

5. Develop mitigation strategy.

6. Report findings.

Meta Principles

1. Simplicity: Fewer components and cases to fail. Fewer possible inconsistencies. Easy to understand.

2. Restriction: Minimize access. Inhibit communication.

Security Design Principles

1. Least Privilege
2. Fail-Safe Defaults
3. Economy of Mechanism
4. Complete Mediation
5. Open Design
6. Separation of Privilege
7. Least Common Mechanism
8. Psychological Acceptability

Least Privilege

A subject should be given only those privileges necessary to complete its task.

- Function, not identity, controls.
- Rights added as needed, discarded after use.
- Minimal protection domain.

Most common violation: Running as administrator or root. Use runas or sudo instead.

Least Privilege Example

Problem: A web server. Serves files under /usr/local/http. Logs connections under /usr/local/http/log. HTTP uses port 80 by default. Only root can open ports < 1024.

Solution: Web server runs as root user.

How does this solution violate the Principle of Least Privilege, and how could we fix it?
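One common fix can be sketched as follows (a Python sketch; the uid 65534 for "nobody" and the function name are assumptions, and port=0 picks an ephemeral port so the sketch also runs without root): acquire the privileged resource first, then permanently drop to an unprivileged account.

```python
import os
import socket

def start_server(port=0, unprivileged_uid=65534):
    # Bind while we still have privileges (port 80 would need root).
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("127.0.0.1", port))
    sock.listen(5)
    # Now discard root rights: function, not identity, controls.
    if os.geteuid() == 0:
        os.setgid(unprivileged_uid)   # drop group first, then user
        os.setuid(unprivileged_uid)   # for a real uid this is irreversible
    return sock

server = start_server()
```

After the setuid() call the server can still serve files and append to its log, but a compromise no longer yields root.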

How do we run with least privilege?

List required resources and special tasks:

- Files
- Network connections
- Change user account
- Backup data

Determine what access you need to resources (Access Control model). Do you need create, read, write, append, etc.?

Fail-Safe Defaults

Default action is to deny access.

When an action fails, the system must be restored to a state as secure as the state it was in when it started the action.

Fail-Safe Defaults Example

Problem: Retail credit card transaction. Card looked up in vendor database to check for stolen cards or suspicious transaction pattern.

What happens if system cannot contact vendor?

Solution: No authentication, but transaction is logged.

How does this system violate the Principle of Fail-Safe Defaults?
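A fail-safe alternative can be sketched as follows (function names are illustrative): when the vendor check cannot complete, the default is denial, not approval.

```python
def approve_transaction(card_number, lookup_stolen):
    # Fail-safe default: if the vendor database cannot be contacted,
    # deny the transaction instead of approving it unchecked.
    try:
        stolen = lookup_stolen(card_number)
    except OSError:
        return False          # cannot verify, so deny (and log in practice)
    return not stolen

def vendor_down(card):
    raise OSError("vendor database unreachable")

denied = approve_transaction("4111000011110000", vendor_down)
```

The retail system described above instead defaults to allow, which is the violation the slide asks about.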

Fail-Safe Defaults Example

Problem: MS Office macro viruses. MS Office files can contain Visual Basic code (macros). MS Office automatically executes certain macros when opening an MS Office file. Users can turn off automatic execution.

Don't mix code and data!

Solution: MS Office XP has automatic execution of macros turned off by default.

While the solution is a fail-safe default, does it follow least privilege too?

Economy of Mechanism

Keep it as simple as possible (KISS). Use the simplest solution that works. Fewer cases and components to fail.

Reuse known secure solutions, i.e., don't write your own cryptography.

Economy of Mechanism Example

Problem: SMB File Sharing Protocol. Used since late 1980s. Newer protocol version protects data integrity by employing a packet signing technique.

What do you do about computers with older versions of the protocol?

Solution: Let client negotiate SMB version to use.

How does this solution violate economy of mechanism?

Complete Mediation

Check every access.

Usually checked once, on first access:

- UNIX: File ACL checked on open(), but not on subsequent accesses to the file.
- If permissions change after initial access, unauthorized access may be permitted.

Bad example: DNS cache poisoning.
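The UNIX behavior can be demonstrated with a short sketch: the permission check happens only at open(), so revoking permissions afterwards does not affect an already-open handle.

```python
import os
import tempfile

fd, path = tempfile.mkstemp()
os.write(fd, b"secret")
os.close(fd)

f = open(path, "rb")      # ACL checked here, once
os.chmod(path, 0o000)     # owner revokes all permissions...
data = f.read()           # ...but the open handle still reads the file
f.close()

os.chmod(path, 0o600)     # clean up
os.remove(path)
```

Complete mediation would require re-checking on every read(), which UNIX omits for performance.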

Open Design

Security should not depend on secrecy of design or implementation.

Popularly misunderstood to mean that source code should be public.

"Security through obscurity" refers to security policy and mechanism, not simple user secrets like passwords and cryptographic keys.

Open Design Example

Problem: MPAA wants control over DVDs. Region coding, unskippable commercials.

Solution: CSS (Content Scrambling System).

- CSS algorithm kept secret.
- DVD players need player key to decrypt disk key on DVD to decrypt movie for playing.
- Encryption uses 40-bit keys.
- People w/o keys can copy but not play DVDs.

What happened next? CSS algorithm reverse engineered. Weakness in algorithm allows disk key to be recovered in an attack of complexity 2^25, which takes only a few seconds.

Closed Source

Security through obscurity. Assumes code in binary can't be read:

- What about disassemblers?
- What about decompilers?
- What about debuggers?
- What about strings, lsof, truss, /proc?

Reverse engineering.

Open Source

Linus’ Law: Given enough eyeballs, all bugs are shallow.

Not so effective for security: More incentives to add features than security. Few people have skills to find security holes.

Having source eliminates a barrier to entry for crackers.

Separation of Privilege

Require multiple conditions to grant access. Separation of duty. Compartmentalization. Defence in depth.

Separation of Duty

Functions are divided so that one entity does not have control over all parts of a transaction.

Examples:

- Different persons must initiate a purchase and authorize a purchase.
- Two different people may be required to arm and fire a nuclear missile.
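The purchase rule can be sketched as a minimal check (names are illustrative): the action requires two distinct parties.

```python
def authorize_purchase(initiated_by, approved_by):
    # Separation of duty: the person who initiates a purchase
    # may not also be the one who authorizes it.
    if initiated_by == approved_by:
        raise PermissionError("initiator cannot approve own purchase")
    return True
```

authorize_purchase("alice", "bob") succeeds; authorize_purchase("alice", "alice") is rejected, so no single insider controls the whole transaction.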

Compartmentalization

Problem: A security violation in one process should not affect others.

Solution: Virtual Memory Each process gets its own address space. In what ways is this solution flawed?

- i.e., how can the compartments communicate?

How could we improve compartmentalization of processes?

Defence in Depth

Diverse defensive strategies:

- Different types of defences: protection, detection, reaction.
- Different implementations of defences.

If one layer is pierced, the next layer may stop the attack. Avoid "crunchy on the outside, chewy on the inside" network security.

Contradicts "Economy of Mechanism." Think hard about more than 2 layers.

Defence in Depth Example

Problem: Bank. How to secure the money?

Solution: Defence in depth.

- Guards inside bank.
- Closed-circuit cameras monitor activity.
- Tellers do not have access to vault.
- Vault has multiple defences: time-release, walls and lock complexity, multiple compartments.

Least Common Mechanism

Mechanisms used to access resources should not be shared. Information can flow along shared channels. Covert channels.

Contradicts Economy of Mechanism?

Least Common Mechanism

Problem: Compromising web server allows attacker access to entire machine.

Solution: Run web server as non-root user. Attacker still gains "other" access to filesystem. Run web server in chroot jail.

Psychological Acceptability

Security mechanisms should not add to the difficulty of accessing a resource. Hide complexity introduced by security mechanisms. Ease of installation, configuration, and use. Human factors are critical here.

Psychological Acceptability

Users will not read documentation. Make system secure in default configuration.

Users will not read dialog boxes. Don't offer complex choices. Example: Mozilla/IE certificate dialogs.

Privacy vs. usability. Example: one-click shopping.

Acceptability Example

Problem: Your workstation is myws, but you log into green every day to do other tasks and don’t want to type your password.

Solution: Let green trust myws. Create ~/.rhosts file on green that lists myws as a trusted host; then rlogin green will allow access without a password.

Does this solution violate other principles? Is there a more secure alternative solution?

Design Principles Questions

Many systems disable an account after a small number of failed accesses. Which principle(s) does this follow? Violate?

A system that invokes a shell exposes itself to command injection attacks. What principle does that violate? How could Least Privilege be used to improve security?

When changing your password, you typically have to enter your old password despite being logged in. How does this make the system more secure? Which principle(s) does it follow?

1. Least Privilege
2. Fail-Safe Defaults
3. Economy of Mechanism
4. Complete Mediation
5. Open Design
6. Separation of Privilege
7. Least Common Mechanism
8. Psychological Acceptability

Design By Contract

Executable contract between class and clients. Client must guarantee certain preconditions before it calls a method. Class guarantees certain properties will hold after the call.

Applications:

- Static analysis to check contracts.
- Dynamic analysis to check at runtime.
- Design technique, similar to TDD.
- Documentation.

Design by Contract (Eiffel)

    put (x: ELEMENT; key: STRING) is
            -- Insert x so that it will be retrievable through key.
        require
            count <= capacity
            not key.empty
        do
            ... Some insertion algorithm ...
        ensure
            has (x)
            item (key) = x
            count = old count + 1
        end

Contract Features

Contract:

- Preconditions (require)
- Invariants (invariant 0 <= count)
- Postconditions (ensure)

Contracts in other languages:

- assert() in C/C++
- Class::Contract in Perl
- Java Modeling Language (JML)
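The same put contract can be approximated with plain assertions, here sketched in Python (a dict stands in for the Eiffel table; note assertions vanish under python -O, so this is a development-time check, not enforcement):

```python
def put(table, key, x, capacity=16):
    # require (preconditions the client must guarantee)
    assert len(table) < capacity
    assert key != "" and key not in table
    old_count = len(table)
    table[key] = x                      # the insertion itself
    # ensure (postconditions the class guarantees)
    assert table[key] is x
    assert len(table) == old_count + 1

t = {}
put(t, "name", "walden")
```

A violated require fails inside the caller's responsibility; a violated ensure signals a bug in the class itself.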

JML Example

    /*@ requires a != null
      @   && (\forall int i;
      @        0 < i && i < a.length;
      @        a[i-1] <= a[i]);
      @*/
    int binarySearch(int[] a, int x) {
        // ...
    }

Legacy Issues

How can you design security into legacy applications without source code?

- Wrappers
- Interposition

What is the best way to fix security flaws in an existing application?

- Code maintenance techniques

Retrofitting: Wrappers

Move existing application to special location. Replace old application with wrapper that:

- Performs access control check.
- Performs input checks.
- Secures environment.
- Logs invocation of application.
- Invokes legacy application from new location.

Example: AusCERT overflow_wrapper

http://www.auscert.org.au/render.html?it=2016
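The wrapper pattern can be sketched as follows (the relocated path and the length limit are hypothetical; the real overflow_wrapper is a C program):

```python
import os

REAL_PROGRAM = "/usr/lib/secure/realprog"   # hypothetical relocated binary
MAX_ARG_LEN = 512

def sanitize(argv, environ):
    # Input check: reject oversized arguments (classic overflow defence).
    for arg in argv:
        if len(arg) > MAX_ARG_LEN:
            raise ValueError("argument too long")
    # Secure the environment: pass only a minimal, known-safe one.
    safe_env = {"PATH": "/usr/bin:/bin"}
    return list(argv), safe_env

def run_wrapped(argv, environ):
    args, env = sanitize(argv, environ)
    # Invoke the legacy application from its new location;
    # execve replaces the wrapper process with the real program.
    os.execve(REAL_PROGRAM, args, env)
```

The legacy binary itself is untouched; all checks live in the wrapper that now sits at its old path.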

Retrofitting: Interposition

Interpose software between two programs we cannot control. Add access control. Filter communication.

Example: Network proxy. Router blocks direct client/server communication. Client talks to proxy, which makes connection to remote server on behalf of client.

- Access control: disallow certain clients/servers.
- Filtering: scan for viruses, worms, etc.
- Auditing: all connections can be logged.

Maintenance: Sun tar flaw

1993: Every tar file produced under Solaris 2.0 contained fragments of the /etc/passwd file.

- Tar reads and writes fixed-size blocks. The last block written has contents of a memory block that were not overwritten by the disk read.
- Tar reads /etc/passwd to obtain user info immediately before it allocates the block read buffer.
- Heap allocation doesn't zero out memory.
- In earlier versions, other memory allocations were between reading passwd and the block read allocation.

Legacy Issues: Maintenance

How can you avoid adding new security flaws when performing code maintenance?

Before looking at a code maintenance procedure, what design principles could have prevented the Sun tar flaw?

Legacy Issues: Maintenance

1. Understand security model and mechanisms already in place.

2. Learn how the program actually works. Read design docs, code, and profile the program.

3. When designing and coding the fix:

- Don't violate the spirit of the design.
- Don't introduce new trust relationships.

References

1. Matt Bishop, Computer Security: Art and Science, Addison-Wesley, 2004.

2. Mark Graff and Kenneth van Wyk, Secure Coding: Principles & Practices, O'Reilly, 2003.

3. Gary McGraw, Software Security, Addison-Wesley, 2006.

4. Gary T. Leavens and Yoonsik Cheon, Design by Contract with JML, http://www.jmlspecs.org/jmldbc.pdf

5. Bertrand Meyer, Building bug-free O-O software: An introduction to Design by Contract, http://archive.eiffel.com/doc/manuals/technology/contract/

6. Jerome H. Saltzer and Michael D. Schroeder, "The Protection of Information in Computer Systems," Proceedings of the IEEE 63, 9 (September 1975), 1278-1308.

7. John Viega and Gary McGraw, Building Secure Software, Addison-Wesley, 2002.