
CS 5950 –Computer Security and Information Assurance

Section 6: Program Security

This is the short version of Section 6. It does not include the OPTIONAL slides that you may SKIP.

OPTIONAL details can be seen in the longer version of this Section.

Dr. Leszek Lilien, Department of Computer Science

Western Michigan University

Slides based on Security in Computing, Third Edition, by Pfleeger and Pfleeger. Using some slides courtesy of:

Prof. Aaron Striegel — course taught at U. of Notre Dame
Prof. Barbara Endicott-Popovsky and Prof. Deborah Frincke (U. Idaho) — taught at U. Washington
Prof. Jussipekka Leiwo — taught at Vrije Universiteit (Free U.), Amsterdam, The Netherlands

Slides not created by the above authors are © 2006 by Leszek T. Lilien. Requests to use original slides for non-profit purposes will be gladly granted upon a written request.


Program Security – Outline (1)

NOTE: Some subsections SKIPped

6.1. Secure Programs – Defining & Testing
a. Introduction
b. Judging S/w Security by Fixing Faults
c. Judging S/w Security by Testing Pgm Behavior
d. Judging S/w Security by Pgm Security Analysis
e. Types of Pgm Flaws

6.2. Nonmalicious Program Errors
a. Buffer overflows
b. Incomplete mediation
c. Time-of-check to time-of-use errors
d. Combinations of nonmalicious program flaws


Program Security – Outline (2)

6.3. Malicious Code
6.3.1. General-Purpose Malicious Code incl. Viruses
a. Introduction
b. Kinds of Malicious Code
c. How Viruses Work
d. Virus Signatures
e. Preventing Virus Infections
f. Seven Truths About Viruses
g. Case Studies
h. Virus Removal and System Recovery After Infection

6.3.2. Targeted Malicious Code
a. Trapdoors
b. Salami attack
c. Covert channels


Program Security – Outline (3)

6.4. Controls for Security
a. Introduction
b. Developmental controls for security
c. Operating system controls for security
d. Administrative controls for security
e. Conclusions


6. Program Security (1)

Program security – The fundamental step in applying security to computing

Protecting programs is the heart of computer security
- All kinds of programs: from apps, via OS and DBMS, to networks

Issues:
- How to keep pgms free from flaws
- How to protect computing resources from pgms with flaws

Issues of trust (not considered here):
- How trustworthy is a pgm you buy?
- How to use it in its most secure way?

Partial answers:
- Third-party evaluations
- Liability and s/w warranties


--SKIPped a slide--

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


6.1. Secure Programs - Defining & Testing

… Continued …

[cf. B. Endicott-Popovsky]


a. Introduction (1)
Pgm is secure if we trust that it provides/enforces:
- Confidentiality
- Integrity
- Availability

What is "program security"? Depends on who you ask:
- user - fit for his task
- programmer - passes all "her" tests
- manager - conformance to all specs

Developmental criteria for program security include:
- Correctness of security & other requirements
- Correctness of implementation
- Correctness of testing


Introduction (2)
Fault tolerance terminology:
- Error - may lead to a fault
- Fault - cause for deviation from intended function
- Failure - system malfunction caused by a fault

Note: [cf. A. Striegel]
- Faults - seen by "insiders" (e.g., programmers)
- Failures - seen by "outsiders" (e.g., independent testers, users)

Error/fault/failure example:
- Programmer's indexing error leads to a buffer overflow fault
- Buffer overflow fault causes a system crash (a failure)

Two categories of faults w.r.t. duration [cf. A. Striegel]:
- Permanent faults
- Transient faults - can be much more difficult to diagnose


Basic approaches to having secure programs:

1) Judging s/w security by fixing pgm faults
- Red Team / Tiger Team tries to crack s/w
- If pgm withstands the attack => security is good

2) Judging s/w security by testing pgm behavior
- Run tests to compare behavior vs. requirements (think testing in s/w engg)
- Important: If a flaw is detected as a failure (an effect), look for the underlying fault (the cause)
  - Recall: fault seen by insiders, failure - by outsiders
  - If possible, detect faults before they become failures
- Any kind of fault/failure can cause a security incident
  => we must consider security consequences for all kinds of detected faults/failures, even inadvertent ones
- Inadvertent faults are the biggest source of security vulnerabilities exploited by attackers
- Testing only increases the probability of eliminating faults

[cf. B. Endicott-Popovsky]


3) Judging s/w security by pgm security analysis
- Best approach to judging s/w security
- Analyze what can go wrong
  - At every stage of program development! From requirement definition to testing
  - After deployment: configurations / policies / practices

[cf. B. Endicott-Popovsky]


--SKIPped-- a few slides

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


e. Types of Pgm Flaws
Taxonomy of pgm flaws:

1) Intentional
   a) Malicious
   b) Nonmalicious

2) Inadvertent
   a) Validation error (incomplete or inconsistent)
      - e.g., incomplete or inconsistent input data
   b) Domain error
      - e.g., using a variable value outside of its domain
   c) Serialization and aliasing
      - serialization - e.g., in DBMSs or OSs
      - aliasing - one variable, or some reference, when changed has an indirect (usually unexpected) effect on some other data
      - Note: 'Aliasing' not in the computer graphics sense!
   d) Inadequate ID and authentication (Section 4 - on OSs)
   e) Boundary condition violation
   f) Other exploitable logic errors   [cf. B. Endicott-Popovsky]


6.2. Nonmalicious Program Errors
Nonmalicious program errors include:
a. Buffer overflows
b. Incomplete mediation
c. Time-of-check to time-of-use errors
d. Combinations of nonmalicious program flaws


a. Buffer Overflows (1)

Buffer overflow flaw — often inadvertent (=> nonmalicious) but with serious security consequences

Many languages require buffer size declaration
- C language declaration: char sample[10];
- Executed statement: sample[i] = 'A'; where i = 10
- Out-of-bounds subscript (valid range 0-9) - buffer overflow occurs
- Some compilers don't check for exceeding bounds
  - C does not perform array bounds checking
- Similar problem caused by pointers
  - No reasonable way to define limits for pointers

(A minimal C sketch of this overflow appears below.)

[cf. B. Endicott-Popovsky]
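For illustration, a minimal C sketch of the out-of-bounds write described above (not from the original slides; what the stray 'A' actually overwrites depends on the compiler's memory layout):

#include <stdio.h>

int main(void) {
    char sample[10];      /* valid subscripts are 0..9 */
    int i = 10;           /* one past the end of the buffer */

    /* Out-of-bounds write: C performs no array bounds checking,
       so this silently overwrites whatever happens to be adjacent. */
    sample[i] = 'A';

    printf("wrote past the end of sample[]\n");
    return 0;
}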


Buffer Overflows (2)

Where does 'A' go? Depends on what is adjacent to sample[10]:
- Affects user's data - overwrites user's data
- Affects user's code - changes user's instruction
- Affects OS data - overwrites OS data
- Affects OS code - changes OS instruction

This is a case of aliasing (cf. Slide 26)

[cf. B. Endicott-Popovsky]


Buffer Overflows (3)

Implications of buffer overflow: attacker can insert malicious data values / instruction codes into the "overflow space"

Suppose buffer overflow affects an OS code area:
- Attacker code executed as if it were OS code
  - Attacker might need to experiment to see what happens when he inserts A into the OS code area
- Can raise attacker's privileges (to OS privilege level)
  - When A is an appropriate instruction
- Attacker can gain full control of OS

[cf. B. Endicott-Popovsky]


Buffer Overflows (4)

Suppose buffer overflow affects a call stack area. A scenario:

- Stack: [data][data][...]
- Pgm executes a subroutine
  => return address pushed onto stack (so subroutine knows where to return control to when finished)
  Stack: [ret_addr][data][data][...]
- Subroutine allocates dynamic buffer char sample[10]
  => buffer (10 empty spaces) pushed onto stack
  Stack: [..........][ret_addr][data][data][...]
- Subroutine executes: sample[i] = 'A' for i = 10
  Stack: [..........][A][data][data][...]
  Note: ret_addr overwritten by A! (Assumed: size of ret_addr is 1 char)


Buffer Overflows (5)
Suppose buffer overflow affects a call stack area — CONT.

- Stack: [..........][A][data][data][...]
- Subroutine finishes
  - Buffer for char sample[10] is deallocated
    Stack: [A][data][data][...]
  - RET operation pops A from stack (considers it the return address)
    Stack: [data][data][...]
- Pgm (which called the subroutine) jumps to A
  => shifts program control to where the attacker wanted

Note: By playing with one's own pgm, an attacker can specify any "return address" for his subroutine
- Upon subroutine return, pgm transfers control to the attacker's chosen address A (even in the OS area)
- Next instruction executed is the one at address A
  - Could be the 1st instruction of a pgm that grants the highest access privileges to its "executor"

(A C sketch of the vulnerable coding pattern follows.)
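A hedged C sketch of the vulnerable coding pattern behind this stack scenario: a subroutine copies caller-supplied input into a fixed-size stack buffer without a length check. Whether the saved return address is actually reachable depends on the stack layout; the function names are illustrative.

#include <string.h>

/* Vulnerable pattern: no length check on the caller-supplied input. */
void subroutine(const char *input) {
    char sample[10];            /* local buffer lives on the call stack */
    strcpy(sample, input);      /* input longer than 9 chars + '\0' overflows
                                   into adjacent stack data, possibly including
                                   the saved return address */
}

/* Safer variant: bound the copy to the buffer size. */
void subroutine_safe(const char *input) {
    char sample[10];
    strncpy(sample, input, sizeof sample - 1);
    sample[sizeof sample - 1] = '\0';
}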


Buffer Overflows (6)
Note: [Wikipedia – aliasing]
- C programming language specifications do not specify how data is to be laid out in memory (incl. stack layout)
- Some implementations of C may leave space between arrays and variables on the stack, for instance, to minimize possible aliasing effects.


Buffer Overflows (7)
Web server attack similar to a buffer overflow attack: pass a very long string to the web server (details: textbook, p. 103)

Buffer overflows are still common
- Used by attackers
  - to crash systems
  - to exploit systems by taking over control
- Large # of vulnerabilities are due to buffer overflows


b. Incomplete Mediation (1)
Incomplete mediation flaw — often inadvertent (=> nonmalicious) but with serious security consequences

Incomplete mediation: sensitive data are in an exposed, uncontrolled condition

Example (a forged URL; a server-side revalidation sketch follows below):
- URL to be generated by the client's browser to access the server, e.g.:
  http://www.things.com/order/final&custID=101&part=555A&qy=20&price=10&ship=boat&shipcost=5&total=205
- Instead, the user edits the URL directly, changing the price and total cost as follows:
  http://www.things.com/order/final&custID=101&part=555A&qy=20&price=1&ship=boat&shipcost=5&total=25
- User uses the forged URL to access the server
- The server takes 25 as the total cost
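A minimal sketch of complete mediation on the server side (an assumed design, not from the slides): the server ignores the client-supplied price and total, looks the price up itself, and recomputes the total. The names lookup_unit_price and SHIP_COST are hypothetical.

#include <stdio.h>

#define SHIP_COST 5

/* Hypothetical lookup of the authoritative unit price for a part. */
static long lookup_unit_price(const char *part) {
    return (part != NULL) ? 10 : -1;       /* e.g., part 555A costs 10 */
}

/* Recompute the total from server-side data; never trust the URL's total. */
long compute_total(const char *part, long qty) {
    long unit_price = lookup_unit_price(part);
    if (unit_price < 0 || qty <= 0)
        return -1;                         /* reject malformed orders */
    return qty * unit_price + SHIP_COST;
}

int main(void) {
    /* client claimed total=25; the server recomputes 20 * 10 + 5 = 205 */
    printf("server-side total: %ld\n", compute_total("555A", 20));
    return 0;
}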


--SKIPped a slide--

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


c. Time-of-check to Time-of-use Errors (1)

Time-of-check to time-of-use flaw — often inadvertent (=> nonmalicious) but with serious security consequences
- A.k.a. synchronization flaw / serialization flaw

TOCTTOU — mediation with "bait and switch" in the middle

Non-computing example:
- Swindler shows buyer a real Rolex watch (bait)
- After buyer pays, switches the real Rolex to a forged one

In computing: change of a resource (e.g., data) between the time access is checked and the time access is used

Q: Any examples of TOCTTOU problems from computing?


Time-of-check to Time-of-use Errors (2)
... TOCTTOU — mediation with "bait and switch" in the middle ...

Q: Any examples of TOCTTOU problems from computing?
A: E.g., DBMS/OS serialization problem:
- pgm1 reads value of X = 10
- pgm1 adds 5: X = X + 5
- pgm2 reads X = 10, adds 3 to X, writes X = 13
- pgm1 writes X = 15
- X ends up with value 15 – should be X = 18
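Another classic computing instance of TOCTTOU, added here for illustration (it is not on the slide): a file-system race between checking access rights and using the file.

#include <stdio.h>
#include <unistd.h>

/* TOCTTOU pattern: the check (access) and the use (fopen) are separate steps,
   so the file named by 'path' can be swapped in between, e.g., replaced with
   a link to a file the attacker could not otherwise write. */
void write_report_racy(const char *path) {
    if (access(path, W_OK) == 0) {         /* time of check */
        /* window: another process may replace 'path' here */
        FILE *f = fopen(path, "w");        /* time of use */
        if (f) {
            fputs("report\n", f);
            fclose(f);
        }
    }
}
/* Mitigation sketch: drop the separate check and let the open itself fail
   under the process's real (reduced) privileges. */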


--SKIPped-- a few slides

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


6.3. Malicious Code
Malicious code, or a rogue pgm, is written to exploit flaws in pgms
- Malicious code can do anything a pgm can
- Malicious code can change:
  - Data
  - Other programs

Malicious code was "officially" defined by Cohen in 1984, but virus behavior known since at least 1970
- Ware's study for the Defense Science Board (classified, made public in 1979)

Outline for this Subsection:
6.3.1. General-Purpose Malicious Code (incl. Viruses)
6.3.2. Targeted Malicious Code


6.3.1. General-Purpose Malicious Code (incl. Viruses)

Outline
a. Introduction
b. Kinds of Malicious Code
c. How Viruses Work
d. Virus Signatures
e. Preventing Virus Infections
f. Seven Truths About Viruses
g. Case Studies

[cf. B. Endicott-Popovsky]


a. Introduction
Viruses are a prominent example of general-purpose malicious code
- Not "targeted" against any particular user
- Attacks anybody with a given app/system/config/...

Viruses:
- Many kinds and varieties
- Benign or harmful
- Transferred even from trusted sources
  - Also from "trusted" sources that are negligent about updating antiviral programs and checking for viruses

[cf. B. Endicott-Popovsky]


--REMIND YOURSELF-- (from Section 1)

b. Kinds of Malicious Code (1)

Trapdoors
Trojan Horses
Bacteria
Logic Bombs
Worms
Viruses

[cf. Barbara Endicott-Popovsky and Deborah Frincke, CSSE592/492, U. Washington]


--REMIND YOURSELF-- (from Section 1)
b. Kinds of Malicious Code (2)

Trojan horse - A computer program that appears to have a useful function, but also has a hidden and potentially malicious function that evades security mechanisms, sometimes by exploiting legitimate authorizations of a system entity that invokes the program

Virus - A hidden, self-replicating section of computer software, usually malicious logic, that propagates by infecting (i.e., inserting a copy of itself into and becoming part of) another program. A virus cannot run by itself; it requires that its host program be run to make the virus active.

Worm - A computer program that can run independently, can propagate a complete working version of itself onto other hosts on a network, and may consume computer resources destructively.


--REMIND YOURSELF-- (from Section 1)
Kinds of Malicious Code (3)

Bacterium - A specialized form of virus which does not attach to a specific file. Usage obscure.

Logic bomb - Malicious [program] logic that activates when specified conditions are met. Usually intended to cause denial of service or otherwise damage system resources.

Time bomb - Activates when a specified time occurs.

Rabbit - A virus or worm that replicates itself without limit to exhaust resources.

Trapdoor / backdoor - A hidden computer flaw known to an intruder, or a hidden computer mechanism (usually software) installed by an intruder, who can activate the trap door to gain access to the computer without being blocked by security services or mechanisms.


--SKIPped a slide--

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


c. How Viruses Work (1)
Pgm containing a virus must be executed to spread the virus or infect other pgms
- Even one pgm execution suffices to spread the virus widely

Virus actions: spread / infect

--SKIP-- Spreading – Example 1: Virus in a pgm on an installation CD
- User activates the pgm containing the virus when she runs INSTALL or SETUP
- Virus installs itself in any/all executing pgms present in memory
- Virus installs itself in pgms on the hard disk
- From now on the virus spreads whenever any of the infected pgms (from memory or hard disk) executes


--SKIPped-- a few slides

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


How Viruses Work (6)

Characteristics of a 'perfect' virus (goals of virus writers):
- Hard to detect
- Not easily destroyed or deactivated
- Spreads infection widely
- Can reinfect programs
- Easy to create
- Machine and OS independent


How Viruses Work (7)

Virus hiding places:

1) In the bootstrap sector – best place for a virus
   - Bec. the virus gains control early in the boot process, before detection tools are active!

2) In memory-resident pgms
   - TSR pgms (TSR = terminate and stay resident)
   - TSR pgms are the most frequently used OS pgms or specialized user pgms
     => good place for viruses (activated very often)

[Figure omitted: program layout before and after virus infection; cf. J. Leiwo & textbook]


How Viruses Work (8)

3) In application pgms
   - Best for viruses: apps with macros (MS Word, MS PowerPoint, MS Excel, MS Access, ...)
   - One macro: the startup macro, executed when the app starts
   - Virus instructions attach to the startup macro and infect document files
     - Bec. doc files can include app macros (commands)
     - E.g., .doc files include macros for MS Word
   - Via data files, infects other startup macros, etc., etc.

4) In libraries
   - Libraries used/shared by many pgms => spread the virus
   - Execution of an infected library pgm infects other pgms

5) In other widely shared pgms
   - Compilers / loaders / linkers
   - Runtime monitors
   - Runtime debuggers
   - Virus control pgms (!)


d. Virus Signatures (1)
A virus hides but can't become invisible – it leaves behind a virus signature, defined by various patterns:
1) Storage patterns: must be stored somewhere/somehow (maybe in pieces)
2) Execution patterns: executes in a particular way
3) Distribution patterns: spreads in a certain way

Virus scanners use virus signatures to detect viruses (in the boot sector, on hard disk, in memory)
- A scanner can use file checksums to detect changes to files (see the checksum sketch below)
- Once the scanner finds a virus, it tries to remove it
  - I.e., tries to remove all pieces of a virus V from target pgm T

The virus scanner and its database of virus signatures must be up-to-date to be effective!
- Update and run daily!
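A minimal sketch of change detection by checksum (illustrative only; real scanners use cryptographic hashes and signature databases, not this simple sum):

#include <stdio.h>

/* Compute a simple rolling checksum over a file's bytes. */
unsigned long file_checksum(const char *path) {
    FILE *f = fopen(path, "rb");
    unsigned long sum = 0;
    int c;
    if (!f)
        return 0;
    while ((c = fgetc(f)) != EOF)
        sum = sum * 31 + (unsigned char)c;
    fclose(f);
    return sum;
}

/* Usage idea: record file_checksum("prog.exe") while the file is known to be
   clean; a different value later means the file was modified. */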


Virus Signatures (2)

Detecting Virus Signatures (1)

Difficulty 1 — in detecting execution patterns:
- Most effects of virus execution (see next page) are "invisible"
  - Bec. they are normal – any legitimate pgm could cause them (hiding in a crowd)
  => can't help in detection


--SKIPped a slide--

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


Virus Signatures (4)
Detecting Virus Signatures (3)

Difficulty 2 — in finding storage patterns:
- Polymorphic viruses: change from one "form" (storage pattern) to another
  - A simple virus is always recognizable by a certain character pattern
  - A polymorphic virus mutates into a variety of storage patterns

Examples of polymorphic virus mutations:
- Randomly repositions all parts of itself and randomly changes all fixed data within its code
  - Repositioning is easy since (infected) files are stored as chains of data blocks, chained with pointers
- Randomly intersperses harmless instructions throughout its code (e.g., add 0, jump to next instruction)
- Encrypting virus: encrypts its object code (each time with a different/random key), decrypts the code to run

... More below ...


--SKIPped-- a few slides

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


e. Preventing Virus Infections

- Use commercial software from trustworthy sources
  - But even this is not an absolute guarantee of virus-free code!
- Test new software on isolated computers
- Open only safe attachments
- Keep a recoverable system image in a safe place
- Back up executable system files
- Use virus scanners often (daily)
- Update virus detectors daily
  - Databases of virus signatures change very often

No absolute guarantees even if you follow all the rules – just much better chances of preventing a virus

[cf. B. Endicott-Popovsky]


f. Seven Truths About Viruses
- Viruses can infect any platform
- Viruses can modify "hidden" / "read only" files
- Viruses can appear anywhere in a system
- Viruses spread anywhere sharing occurs
- Viruses cannot remain in memory after a complete power off / power on reboot
  - But a virus reappears if saved on disk (e.g., in the boot sector)
- Viruses infect software that runs hardware
  - There are firmware viruses (if firmware is writeable by s/w)
- Viruses can be malevolent, benign, or benevolent
  - Hmmm... Would you like a benevolent virus doing good things (like compressing pgms to save storage) but without your knowledge?

[cf. B. Endicott-Popovsky]


--SKIPped-- a few slides

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


h. Virus Removal and System Recovery After Infection

Fixing a system after infection by virus V:
1) Disinfect (remove) viruses (using an antivirus pgm)
   - Can often remove V from an infected file T w/o damaging T
     - if V's code can be separated from T's code and V did not corrupt T
   - Have to delete T if V can't be separated from T's code
2) Recover files:
   - deleted by V
   - modified by V
   - deleted during disinfection (by the antivirus pgm)
   => need file backups!
   - Make sure to have backups of (at least) important files


6.3.2. Targeted Malicious Code

Targeted = written to attack a particular system, a particular application, and for a particular purpose

Many virus techniques apply; some new techniques as well

Outline:
a. Trapdoors
b. Salami attack
c. Covert channels


a. Trapdoors (1)
--SKIP this def.-- Original def.:

Trapdoor / backdoor - A hidden computer flaw known to an intruder, or a hidden computer mechanism (usually software) installed by an intruder, who can activate the trap door to gain access to the computer without being blocked by security services or mechanisms.

A broader definition:
Trapdoor – an undocumented entry point to a module
- Inserted during code development
  - For testing
  - As a hook for future extensions
  - As emergency access in case of s/w failure


Trapdoors (2)

Testing:
- With stubs and drivers for unit testing (Fig. 3-10, p. 138)
- Testing with debugging code inserted into tested modules
  - May allow the programmer to modify internal module variables

Major sources of trapdoors:
- Left-over (purposely or not) stubs, drivers, debugging code
- Poor error checking
  - E.g., allowing unacceptable input that causes buffer overflow
- Undefined opcodes in h/w processors
  - Some were used for testing, some are random

Not all trapdoors are bad
- Some are left purposely, with good intentions — to facilitate system maintenance/audit/testing

(A sketch of a left-over debugging trapdoor follows.)
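A hedged C sketch of how left-over debugging code becomes a trapdoor (illustrative names; not from the slides): a testing shortcut that bypasses the real password check survives into the shipped program.

#include <string.h>

/* Stand-in for the real authentication routine. */
static int check_password(const char *user, const char *pass) {
    (void)user; (void)pass;
    return 0;                      /* illustrative stub */
}

int login(const char *user, const char *pass) {
#ifdef DEBUG
    /* Testing shortcut left in the code = trapdoor: anyone who learns the
       magic value has an undocumented entry point. */
    if (strcmp(pass, "letmein") == 0)
        return 1;
#endif
    return check_password(user, pass);
}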


b. Salami attack
Salami attack - merges bits of seemingly inconsequential data to yield powerful results

Old example: interest calculation in a bank:
- Fractions of 1 ¢ "shaved off" n accounts and deposited in the attacker's account
- Nobody notices/cares if 0.1 ¢ vanishes
- Can accumulate to a large sum

Easy target for salami attacks: computer computations combining large numbers with small numbers
- Require rounding and truncation of numbers
- Relatively small amounts of error from these operations are accepted as unavoidable – not checked unless there is a strong suspicion
- Attacker can hide "salami slices" within the error margin (see the rounding sketch below)
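A minimal C sketch of the rounding margin a salami attack hides in (balances and rate are illustrative): interest is truncated to whole cents, and the discarded fractions add up across accounts.

#include <stdio.h>

int main(void) {
    long balances_cents[] = { 123457, 987654, 250001, 734567 };
    int n = sizeof balances_cents / sizeof balances_cents[0];
    double rate = 0.0125;                /* 1.25% interest, illustrative */
    double shaved = 0.0;                 /* fractions lost to truncation */

    for (int i = 0; i < n; i++) {
        double interest = balances_cents[i] * rate;
        long credited = (long)interest;  /* truncate to whole cents */
        shaved += interest - credited;   /* this margin is what a salami
                                            attack would quietly redirect */
    }
    printf("fractions of a cent left over: %.4f\n", shaved);
    return 0;
}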


c. Covert Channels (CC) (1)

--SKIP-- Outline:
i. Covert Channels - Definition and Examples
ii. Types of Covert Channels
iii. Storage Covert Channels
iv. Timing Covert Channels
v. Identifying Potential Covert Channels
vi. Covert Channels - Conclusions


i. CC – Definition and Examples (1)
So far: we looked at malicious pgms that perform wrong actions
Now: pgms that disclose confidential/secret info
- They violate confidentiality, secrecy, or privacy of info

Covert channels = channels of unwelcome disclosure of info
- Extract/leak data clandestinely

Examples:
1) An old military radio communication network
   - The busiest node is most probably the command center
   - Nobody is so naive nowadays
2) Secret ways spies recognize each other
   - Holding a certain magazine in hand
   - Exchanging a secret gesture when approaching each other
   - ...


Covert Channels – Definition and Examples (2)
How do programmers create covert channels?
- By providing the pgm with a built-in Trojan horse
  - It uses a covert channel to communicate the extracted data

Example: pgm w/ a Trojan horse using a covert channel

Should be:
  Protected Data <------ [ Service Pgm ] ------> Legitimate User

Is:
  Protected Data <------ [ Service Pgm w/ Trojan h. ] ------> Legitimate User
                                   |
                                   | covert channel
                                   v
                                  Spy

(Spy - e.g., the programmer who put the Trojan into the pgm; directly or via a Spy Pgm)


Covert Channels – Definition and Examples (3)

How are covert channels created? I.e., how are leaked data hidden?

Example: leaked data hidden in output reports (or displays)
- Different 'marks' in the report (cf. Fig. 3-12, p. 143):
  - Varying report format
  - Changing line length / changing the number of lines per page
  - Printing or not printing certain values, characters, or headings
- Each 'mark' can convey one bit of info (see the formatting sketch below)
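A minimal sketch of such a formatting-based (storage-style) covert channel, for illustration only: each printed report line leaks one bit of a secret through a punctuation choice a casual reader would not notice.

#include <stdio.h>

/* The visible content is legitimate; only the trailing period carries data. */
static void print_report_line(const char *text, int secret_bit) {
    printf("%s%s\n", text, secret_bit ? "." : "");
}

int main(void) {
    unsigned char secret = 0xA5;         /* 8 bits to leak, illustrative */
    for (int i = 7; i >= 0; i--)
        print_report_line("Inventory item processed", (secret >> i) & 1);
    return 0;
}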


--SKIPped a slide--

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


ii. Types of Covert Channels

- Storage covert channels
  - Convey info by the presence or absence of an object in storage
- Timing covert channels
  - Convey info by varying the speed at which things happen


--SKIPped-- a few slides

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


Covert Channels - Conclusions
- Covert channels are a serious threat to confidentiality, and thus to security ("CIA" = security)
- Any virus / Trojan horse can create a covert channel
- In open systems — no way to prevent covert channels
- Very high security systems require a painstaking and costly design preventing (some) covert channels
  - Analysis must be performed periodically as the high-security system evolves


6.4. Controls for Security
How to control the security of pgms during their development and maintenance

--SKIP-- Outline:
a. Introduction
b. Developmental controls for security
c. Operating system controls for security
d. Administrative controls for security
e. Conclusions


a. Introduction
"Better to prevent than to cure"

Preventing security flaws:
- We have seen a lot of possible security flaws
- How to prevent (some of) them?
- Software engineering concentrates on developing and maintaining high-quality s/w
- We'll take a look at some techniques useful specifically for developing / maintaining secure s/w

Three types of controls for security (against pgm flaws):
1) Developmental controls
2) OS controls
3) Administrative controls


b. Developmental Controls for Security (1)

Nature of s/w development:
- Collaborative effort
- Team of developers, each involved in one of the stages:
  - Requirement specification
    - Regular req. specs: "do X"
    - Security req. specs: "do X and nothing more"
  - Design
  - Implementation
  - Testing
  - Documenting at each stage
  - Reviewing at each stage
  - Managing system development thru all stages
  - Maintaining the deployed system (updates, patches, new versions, etc.)

Both product and process contribute to overall quality — incl. the security dimension of quality


Developmental Controls for Security (2)
Fundamental principles of s/w engineering:
1) Modularity
2) Encapsulation
3) Info hiding

1) Modularity
Modules should be:
- Single-purpose - logically/functionally
- Small - for a human to grasp
- Simple - for a human to grasp
- Independent - high cohesion, low coupling
  - High cohesion - highly focused on a (single) purpose
  - Low coupling - free from interference from other modules

Modularity should improve correctness
- Fewer flaws => better security


Developmental Controls for Security (3)

2) Encapsulation
- Minimizing info sharing with other modules
  => Limited interfaces reduce the # of covert channels
- Well documented interfaces
- "Hiding what should be hidden and showing what should be visible."

3) Information hiding
- Module is a black box
  - Well-defined function and I/O
  - Easy to know what the module does, but not how it does it
- Reduces complexity, interactions, covert channels, ...
  => better security

(A C sketch of encapsulation via an opaque type follows.)
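A C sketch of encapsulation and information hiding (module and function names are illustrative): callers see only an opaque handle and a narrow interface, while the representation stays private to the implementing module.

#include <stdlib.h>

/* --- What other modules would see (the interface) --- */
typedef struct account Account;          /* opaque type: fields are hidden */
Account *account_open(long initial_cents);
int      account_deposit(Account *a, long cents);
long     account_balance(const Account *a);

/* --- Private to the implementing module --- */
struct account { long balance_cents; };

Account *account_open(long initial_cents) {
    Account *a = malloc(sizeof *a);
    if (a)
        a->balance_cents = initial_cents;
    return a;
}

int account_deposit(Account *a, long cents) {
    if (!a || cents < 0)                 /* the interface checks its inputs */
        return -1;
    a->balance_cents += cents;
    return 0;
}

long account_balance(const Account *a) {
    return a ? a->balance_cents : -1;
}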


Developmental Controls for Security (4)

Many techniques for building solid software --SKIP--
1) Peer reviews
2) Hazard analysis
3) Testing
4) Good design
5) Risk prediction & management
6) Static analysis
7) Configuration management
8) Additional developmental controls

--SKIP--> ... Please read on your own ... Also see the slides — all discussed below ...

[cf. B. Endicott-Popovsky]


--SKIPped-- a few slides

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


Developmental Controls for Security (8)

4) Good design
Good design uses:
i. Modularity / encapsulation / info hiding
ii. Fault tolerance
iii. Consistent failure handling policies
iv. Design rationale and history
v. Design patterns

i. Using modularity / encapsulation / info hiding - as discussed above


Developmental Controls for Security (9)

ii. Using fault tolerance for reliability and security
- System tolerates component failures
- System is more reliable than any of its components
  - Different than for security, where a system is only as secure as its weakest component

Fault-tolerant approach:
- Anticipate faults (car: anticipate having a flat tire)
- Active fault detection rather than passive fault detection (e.g., by use of mutual suspicion: active input data checking)
- Use redundancy (car: have a spare tire)
- Isolate damage
- Minimize disruption (car: replace the flat tire, continue your trip)

[cf. B. Endicott-Popovsky]


Developmental Controls for Security (10)

Example 1: Majority voting (using h/w redundancy)
- 3 processors running the same s/w (e.g., in a spaceship)
- Result accepted if the results of 2 processors agree

Example 2: Recovery block (using s/w redundancy)
- Primary code (e.g., Quick Sort - new code, faster)
- Acceptance test on the primary's result
- Secondary code (e.g., Bubble Sort - well-tested code), run if the test rejects the primary's result

(A C sketch of the recovery block follows.)
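A C sketch of the recovery block named on the slide: run the primary (quick sort), apply an acceptance test, and fall back to the well-tested secondary (bubble sort) only if the test rejects the primary's result.

#include <stdlib.h>
#include <string.h>

static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

static void primary_sort(int *a, int n)   { qsort(a, n, sizeof *a, cmp_int); }

static void secondary_sort(int *a, int n) {          /* bubble sort */
    for (int i = 0; i < n - 1; i++)
        for (int j = 0; j < n - 1 - i; j++)
            if (a[j] > a[j + 1]) {
                int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
            }
}

static int acceptance_test(const int *a, int n) {    /* is the array sorted? */
    for (int i = 1; i < n; i++)
        if (a[i - 1] > a[i])
            return 0;
    return 1;
}

void sort_with_recovery_block(int *a, int n) {
    int *copy = malloc((size_t)n * sizeof *copy);     /* checkpoint the input */
    if (!copy) { secondary_sort(a, n); return; }
    memcpy(copy, a, (size_t)n * sizeof *copy);

    primary_sort(a, n);                               /* try the primary      */
    if (!acceptance_test(a, n)) {                     /* result rejected?     */
        memcpy(a, copy, (size_t)n * sizeof *copy);    /* restore the state    */
        secondary_sort(a, n);                         /* run the secondary    */
    }
    free(copy);
}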


--SKIPped-- a few slides

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


Developmental Controls for Security (13)

Value of Good Design:
- Easy maintenance
- Understandability
- Reuse
- Correctness
- Better testing

=> translates into (saving) BIG bucks!

[cf. B. Endicott-Popovsky]


--SKIPped-- a few slides

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


c. Operating System Controls for Security (1)

Developmental controls are not always used
OR: Even if used, they are not foolproof
=> Need other, complementary controls, incl. OS controls

Such OS controls can protect against some pgm flaws


Operating System Controls for Security (2)

Trusted software – code rigorously developed and analyzed so we can trust that it does all and only what the specs say
- Trusted code establishes the foundation upon which untrusted code runs
- Trusted code establishes the security baseline for the whole system
- In particular, the OS can be trusted s/w


Operating System Controls for Security (3)

Key characteristics determining if OS code is trusted:

1) Functional correctness
   - OS code consistent with specs
2) Enforcement of integrity
   - OS keeps the integrity of its data and other resources even if presented with flawed or unauthorized commands
3) Limited privileges
   - OS minimizes access to secure data/resources
   - Trusted pgms must have a "need to access" and proper access rights to use resources protected by the OS
   - Untrusted pgms can't access resources protected by the OS
4) Appropriate confidence level
   - OS code examined and rated at an appropriate trust level


Operating System Controls for Security (4)

Similar criteria used to establish if s/w other than OS can be trusted

Ways of increasing security if untrusted pgms are present:
1) Mutual suspicion
2) Confinement
3) Access log

1) Mutual suspicion between programs
- Distrust other pgms – treat them as if they were incorrect or malicious
- Pgm protects its interface data
  - With data checks, etc. (see the sketch below)
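A small C sketch of mutual suspicion at a module interface (names and limits are illustrative): the routine treats data arriving from another module as potentially wrong and checks it before use.

#include <stddef.h>
#include <string.h>

#define MAX_RECORD 256

/* Accept a record from another (possibly faulty or malicious) module,
   refusing anything that violates the interface contract. */
int accept_record(char *dst, size_t dst_size,
                  const char *src, size_t src_len) {
    if (dst == NULL || src == NULL)
        return -1;                        /* suspicious: missing data */
    if (src_len == 0 || src_len > MAX_RECORD)
        return -1;                        /* out-of-range size */
    if (src_len + 1 > dst_size)
        return -1;                        /* would overflow our buffer */
    memcpy(dst, src, src_len);
    dst[src_len] = '\0';
    return 0;
}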


Operating System Controls for Security (5)

2) Confinement
- OS can confine a suspected pgm's access to resources
- Example 1: strict compartmentalization
  - Pgm can affect data and other pgms only within its compartment
- Example 2: sandbox for untrusted pgms
  - Can limit the spread of viruses


Operating System Controls for Security (6)

3) Audit log / access log
- Records who/when/how (e.g., for how long) accessed/used which objects
- Events logged: logins/logouts, file accesses, pgm executions, device uses, failures, repeated unsuccessful commands (e.g., many repeated failed login attempts can indicate an attack)
- Audit frequently for unusual events and suspicious patterns
- It is a forensic measure, not a protective measure
  - Forensics – investigation to find who broke the law, policies, or rules (a posteriori, not a priori)


d. Administrative Controls for Security (1)

They prohibit or demand certain human behavior via policies, procedures, etc.

They include:
1) Standards of program development
2) Security audits
3) Separation of duties


--SKIPped-- a few slides

You can see all SKIPped slides in the LONG version of Section 6.

(The SKIPped slides are OPTIONAL, not required for exams.)


e. Conclusions (for Controls for Security)

Developmental / OS / administrative controls help produce/maintain higher-quality (also more secure) s/w

Art and science - no "silver bullet" solutions

"A good developer who truly understands security will incorporate security into all phases of development."
[textbook, p. 172]

Summary:

Control           | Purpose                                        | Benefit
------------------|------------------------------------------------|----------------------------------------------------
Developmental     | Limit mistakes; make malicious code difficult  | Produce better software
Operating System  | Limit access to system                         | Promotes safe sharing of info
Administrative    | Limit actions of people                        | Improve usability, reusability and maintainability

[cf. B. Endicott-Popovsky]

The End of Section 6 (Ch. 3): Program Security

OPTIONAL details can be seen in the longer version of this Section (which includes slides that you may SKIP).