1
Information Security & Privacy
HIM-1460 CLINICAL EDUCATION 2
James JoshiAssociate Professor, SIS
May 21, 2010
Topics overview: Security basics; Secure design principles; Access control model overview; Overview of cryptography and network security; Privacy
Lectures 9 – 12 (small breaks); Pizza Lunch 12 – 12:30 PM; Lab 12:30 – 2:00 PM; Closing discussion/test 2:15 – 3:00 PM
2
3
What is Information Security?
Overview of Computer Security
4
Information Systems Security deals with:
Security of (end) systems. Examples?
Security of information in transit over a network. Examples?
Is it enough to have strong security for only one of these?
5
Basic Components of Security
Confidentiality: What do you mean? Prevention or detection? Examples?
Integrity: What do you mean? Data integrity vs origin integrity. Prevention or detection? Examples?
Availability: What do you mean?
CIA
6
CIA-based Model
NSTISSC 4011 Security Model (CNSS 4011)
7
Basic Components of Security: additional components from NIST (National Institute of Standards and Technology)
Accountability
Assurance
Non-repudiation
8
Interdependencies
(Diagram: confidentiality depends on integrity; integrity depends on confidentiality; availability depends on integrity and confidentiality; accountability depends on integrity and confidentiality.)
9
Terminology
(Diagram: Attackers/Intruders/Malfeasors threaten Resources/Assets/Information; Requirements/Policies are realized through Security Models/Mechanisms, which provide Security Features or Services; together these form a Security Architecture.)
10
Attack vs Threat
A threat is a “potential” violation of security. The violation need not actually occur; the fact that the violation might occur makes it a threat. The actual violation of security is called an attack.
11
Common security threats/attacks
Interruption, delay, denial of service
Interception or snooping
Modification or alteration
Fabrication, masquerade, or spoofing
Repudiation of origin
12
Classes of Threats (Shirey)
Disclosure: unauthorized access to information
Deception: acceptance of false data
Disruption: interruption/prevention of correct operation
Usurpation: unauthorized control of a system component
13
Goals of Security
Prevention: to prevent someone from violating a security policy
Detection: to detect activities in violation of a security policy; verify the efficacy of the prevention mechanism
(Response &) Recovery: stop policy violations (attacks); assess and repair damage; ensure availability in the presence of an ongoing attack; fix vulnerabilities to prevent future attacks
14
Information Assurance
What is information assurance?
“Assurance is the basis for confidence that the security measures, both technical and operational, work as intended to protect the system and the information it processes” (NIST)
Assurance indicates “how much” to trust a system and is achieved by ensuring that:
The required functionality is present and correctly implemented
There is sufficient protection against unintentional errors
There is sufficient resistance to intentional penetration or bypass
Specification – design – implementation
15
Operational Issues
Cost-Benefit Analysis
Risk Analysis
Laws and Customs
Operational problems
People problem
16
Secure Design Principles
17
Design Principles for Security
Principles: Least Privilege; Fail-Safe Defaults; Economy of Mechanism; Complete Mediation; Open Design; Separation of Privilege; Least Common Mechanism; Psychological Acceptability
(Two underlying themes: Simplicity and Restriction)
18
Least Privilege
A subject should be given only those privileges necessary to complete its task
Assignment of privileges based on function, identity, … ?
Based on “need to know”; “relevance to situation” …
Examples?
19
Fail-Safe Defaults
What should be the default action? If an action fails, how can we keep the system safe/secure?
When a file is created, what privileges are assigned to it? In Unix? In Windows?
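A minimal sketch of the Unix answer, assuming a POSIX system: a newly created file gets the mode the program requests (commonly 0o666) minus the bits removed by the process umask, so the default errs toward less access rather than more.

```python
import os
import stat
import tempfile

# Fail-safe defaults in Unix: the umask strips permission bits from
# whatever mode the creating program requests.
os.umask(0o022)  # mask out write permission for group and others

path = os.path.join(tempfile.mkdtemp(), "demo.txt")
fd = os.open(path, os.O_CREAT | os.O_WRONLY, 0o666)  # request rw for all
os.close(fd)

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))  # 0o666 & ~0o022 = 0o644: owner rw, group/other read-only
```

The design choice is that a program must ask for extra permissions explicitly; forgetting to set a mode yields a restrictive file, not a world-writable one.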
20
Economy of Mechanism
Design and implementation of security mechanisms: KISS principle (Keep It Simple, Silly!)
Careful design of interfaces and interactions
21
Complete Mediation
Mediate all accesses; no caching of (authorization) information
Why?
How does the Unix read operation work?
Any disadvantage of this principle?
22
Open Design
Security should not depend on secrecy of design or implementation
Source code should be public? “Security through obscurity” ?
23
Separation of Privilege
Restrictive access: use multiple conditions to grant privilege
Equivalent to separation of duty
Example?
24
Least Common Mechanism
Mechanisms should not be shared
What is the problem with shared resources? Covert channels?
Isolation techniques: virtual machines
25
Psychological Acceptability
Security mechanisms should not add to the difficulty of accessing a resource
Hide complexity introduced by security mechanisms
Ease of installation, configuration, use
Human factors critical here (e.g., proper messages)
26
Reference Validation Mechanism
Trusted Computing Base (TCB): hardware and software for enforcing security rules
Reference monitor: part of the TCB; all system calls go through the reference monitor for security checking
A reference validation mechanism must be: 1. Tamperproof 2. Never bypassed 3. Small enough to be subject to analysis and testing, so that its completeness can be assured
(Diagram: a user process in user space makes system calls into the OS kernel in kernel space; the reference monitor sits inside the TCB within the kernel.)
Which principle(s)?
27
Access Control
28
Access Control Matrix Model
The access control matrix model describes the protection state of a system; elements indicate the access rights that subjects have on objects
ACM implementation: what is the disadvantage of maintaining a matrix? Two ways to implement:
Capability based
Access control list
29
Access Control Matrix
(Figure: an example access matrix over subjects s1–s3 and files f1–f6, with rights o (own), r (read), w (write); the same matrix is then shown stored by column as per-object access control lists and by row as per-subject capability lists.)
o: own, r: read, w: write
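The two sparse storage schemes can be sketched in a few lines of Python; the matrix contents below are hypothetical, not the figure's.

```python
# A tiny access control matrix: (subject, object) -> set of rights,
# with o = own, r = read, w = write. Entries are illustrative only.
matrix = {
    ("s1", "f1"): {"o", "r", "w"},
    ("s2", "f1"): {"r"},
    ("s1", "f2"): {"w"},
    ("s2", "f2"): {"o", "r", "w"},
}

def to_acl(m):
    """Access control lists: the matrix stored by column (per object)."""
    acl = {}
    for (s, o), rights in m.items():
        acl.setdefault(o, {})[s] = rights
    return acl

def to_capabilities(m):
    """Capability lists: the matrix stored by row (per subject)."""
    caps = {}
    for (s, o), rights in m.items():
        caps.setdefault(s, {})[o] = rights
    return caps

print(to_acl(matrix)["f1"])           # who holds rights on f1
print(to_capabilities(matrix)["s1"])  # what s1 holds rights on
```

Both avoid storing the many empty cells of the full matrix; an ACL answers "who can access this object?" cheaply, a capability list answers "what can this subject access?" cheaply.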
30
Confidentiality Policy
Also known as information flow policy; integrity is a secondary objective. E.g., a military mission “date”
Bell-LaPadula model: formally models military requirements
Information/objects: classification level
Subjects: clearance level
31
“No Read Up” & “No Write Down” Rules
Information is allowed to flow up, not down
Simple security property: a subject s can read an object o if and only if the clearance of s is higher than or equal to the classification of o
*-property: s can write o if and only if the classification of o is equal to or higher than the clearance of s
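The two rules reduce to two comparisons; a minimal sketch, with the level names from the example that follows:

```python
# Bell-LaPadula checks over a totally ordered set of levels.
LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def can_read(subject_clearance, object_classification):
    # Simple security property: no read up
    return LEVELS[subject_clearance] >= LEVELS[object_classification]

def can_write(subject_clearance, object_classification):
    # *-property: no write down
    return LEVELS[object_classification] >= LEVELS[subject_clearance]

print(can_read("Secret", "Confidential"))   # True: reading down is allowed
print(can_write("Secret", "Confidential"))  # False: writing down could leak
```

Note the asymmetry: reads flow information up to the subject, writes flow it down to the object, and both rules keep the flow pointing upward.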
32
Example
security level | subject | object
Top Secret   | Tamara | Personnel Files
Secret       | Samuel | E-Mail Files
Confidential | Claire | Activity Logs
Unclassified | Ulaley | Telephone Lists
• Tamara can read which objects? And write?
• Claire cannot read which objects? And write?
• Ulaley can read which objects? And write?
33
Categories
A total order of classifications is not flexible enough:
Alice cleared for missiles; Bob cleared for warheads; both cleared for targets
Solution: categories — use a set of compartments
The power set of compartments enforces the “need to know” principle
Security levels become pairs: (security level, category set)
(Top Secret, {Nuc, Eur, Asi}), (Top Secret, {Nuc, Asi})
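With categories, levels are only partially ordered: (L1, C1) dominates (L2, C2) when L1 ≥ L2 and C2 ⊆ C1. A short sketch using the two example levels above:

```python
def dominates(a, b):
    """(level, categories) a dominates b iff a's level is at least b's
    and b's category set is a subset of a's (need to know)."""
    (level_a, cats_a), (level_b, cats_b) = a, b
    return level_a >= level_b and cats_b <= cats_a

TOP_SECRET = 3
ts_nuc_eur_asi = (TOP_SECRET, frozenset({"Nuc", "Eur", "Asi"}))
ts_nuc_asi = (TOP_SECRET, frozenset({"Nuc", "Asi"}))

print(dominates(ts_nuc_eur_asi, ts_nuc_asi))  # True: superset of categories
print(dominates(ts_nuc_asi, ts_nuc_eur_asi))  # False: missing Eur
```

Two levels with the same classification but incomparable category sets dominate neither each other; that incomparability is exactly what "need to know" adds over the total order.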
34
Biba’s Integrity Policy Model
Based on Bell-LaPadula; subjects and objects have integrity levels
Higher levels: more reliable/trustworthy, more accurate
35
Biba’s model
Strict policy (dual of Bell-LaPadula):
s can read o iff i(s) ≤ i(o) (no read-down) — i.e., s can read o if and only if the integrity level of s is less than or equal to the integrity level of o
s can write o iff i(o) ≤ i(s) (no write-up). Why?
s1 can execute s2 iff i(s2) ≤ i(s1). Why?
36
Low-Water-Mark Policy
s can write o iff i(o) ≤ i(s). Why?
When s reads o: i’(s) = min(i(s), i(o)), where i’(s) is the integrity level of s after the read. Why?
s1 can execute s2 iff i(s2) ≤ i(s1)
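The low-water-mark rule can be simulated in a few lines: a subject that reads tainted data is dragged down to the data's integrity level, after which writes to higher-integrity objects are refused.

```python
# Low-water-mark policy: reading lowers the subject's integrity level.
def read(subject_level, object_level):
    """Return the subject's new level after reading the object."""
    return min(subject_level, object_level)

def can_write(subject_level, object_level):
    """No write-up: a subject may only write objects at or below its level."""
    return object_level <= subject_level

s = 3                   # subject starts at high integrity
print(can_write(s, 2))  # True: writing down is fine so far

s = read(s, 1)          # subject reads a low-integrity (untrusted) object
print(s)                # 1: its level has dropped to the "low-water mark"
print(can_write(s, 2))  # False: it can no longer contaminate level-2 objects
```

This is why the rule exists: once a subject has consumed unreliable input, nothing it subsequently writes can masquerade as high-integrity data.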
37
General uses of MAC
What are the benefits? Problems?
38
Role Based Access Control (RBAC)
Access control in organizations is based on “roles that individual users take on as part of the organization”
Access depends on function, not identity. Example?
A role is “a collection of permissions”
39
RBAC
(Figure (a): users u1…un assigned directly to permissions o1…om requires up to n × m assignments. Figure (b): interposing a role r between the same users and permissions requires only n + m assignments. Total number of assignments possible?)
(Figure: an example role hierarchy with roles Employee, Engineer, Administrator, Senior Engineer, Senior Administrator, and Manager.)
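The administrative saving in the figure is simple arithmetic; a quick check with illustrative numbers:

```python
# With n users and m permissions, direct user-to-permission assignment
# needs up to n * m entries; routing everyone through one shared role
# needs only n + m (n user-role plus m role-permission assignments).
n, m = 1000, 50

direct_assignments = n * m
role_assignments = n + m

print(direct_assignments, role_assignments)  # 50000 vs 1050
```

The gap widens as either n or m grows, which is why role-based administration scales where per-user grants do not.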
40
RBAC (NIST Standard)
(Figure: Users —UA— Roles —PA— Permissions (Operations on Objects); Sessions connect users and roles via user_sessions (one-to-many) and role_sessions (many-to-many).)
What model entity would relate to the traditional notion of subject?
Total number of subjects possible?
Role vs Group?
41
RBAC with Role Hierarchy
(Figure: the same NIST RBAC diagram — Users —UA— Roles —PA— Permissions (Operations on Objects), with Sessions linked via user_sessions (one-to-many) and role_sessions (many-to-many) — extended with an RH (role hierarchy) relation on Roles.)
42
Example
(Figure: the role hierarchy — Employee, Engineer, Administrator, Senior Engineer, Senior Administrator, Manager — with users e1–e10 assigned to the roles and permissions p1, p2, pa, pb, pm, pn, po, pp, px, py distributed across them.)
authorized_users(Employee)? authorized_users(Administrator)?
authorized_permissions(Employee)? authorized_permissions(Administrator)?
43
Constrained RBAC
(Figure: the NIST RBAC diagram — Users —UA— Roles —PA— Permissions (Operations on Objects), Sessions via user_sessions (one-to-many), RH (role hierarchy) — extended with Static Separation of Duty constraints on role assignment and Dynamic Separation of Duty constraints on sessions.)
44
Advantages of RBAC
Allows efficient security management: administrative roles, role hierarchy
Principle of least privilege allows minimizing damage
Separation of duty constraints to prevent fraud
Allows grouping of objects/users
Policy-neutral: provides generality; encompasses DAC and MAC policies
45
RBAC’s Benefits
46
47
Overview of Cryptography and network security
48
Secure Information Transmission (network security model)
(Figure: a sender applies a security-related transformation, using secret information, to a message; the resulting secure message crosses an information channel to the receiver, who reverses the transformation with its own secret information. An opponent may attack the channel; a trusted third party may act as arbiter and distributor of secret information.)
49
Brief History
All encryption algorithms from BC until 1976 were secret key algorithms, also called private key algorithms or symmetric key algorithms
Julius Caesar used a substitution cipher
Widespread use in World War II (Enigma)
Public key algorithms were introduced in 1976 by Whitfield Diffie and Martin Hellman
50
Cæsar cipher
Let k = 9, m = “VELVET” (21 4 11 21 4 19)
Ek(m) = (30 13 20 30 13 28) mod 26 = (4 13 20 4 13 2) = “ENUENC”
Dk(c) = (26 + c – k) mod 26 = (21 30 37 21 30 19) mod 26 = (21 4 11 21 4 19) = “VELVET”
A B C D E F G H I J K L M
0 1 2 3 4 5 6 7 8 9 10 11 12
N O P Q R S T U V W X Y Z
13 14 15 16 17 18 19 20 21 22 23 24 25
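The worked example above can be reproduced directly; a minimal sketch using the A=0 … Z=25 table:

```python
def caesar_encrypt(plaintext, k):
    """Shift each letter forward by k positions, mod 26 (A=0 ... Z=25)."""
    return "".join(chr((ord(c) - ord("A") + k) % 26 + ord("A")) for c in plaintext)

def caesar_decrypt(ciphertext, k):
    """Shift each letter back by k positions, mod 26."""
    return "".join(chr((ord(c) - ord("A") - k) % 26 + ord("A")) for c in ciphertext)

print(caesar_encrypt("VELVET", 9))  # ENUENC
print(caesar_decrypt("ENUENC", 9))  # VELVET
```

With only 26 possible keys, the cipher falls to trying all shifts by hand, which is why it survives today only as a teaching example.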
51
Classical Cryptography
(Figure: Alice encrypts plaintext X under secret key K, obtained from a key source over a secure channel, to produce ciphertext Y; Bob decrypts Y back to X with the same key. A cryptanalyst (Oscar/Ed) observing Y tries to recover estimates X’ and K’.)
52
Classical Cryptography
Sender and receiver share a common key
Keys may be the same, or trivial to derive from one another
Sometimes called symmetric cryptography
Two basic types: transposition ciphers and substitution ciphers
Product ciphers: combinations of the two basic types
53
Classical Cryptography
y = Ek(x): ciphertext (encryption)
x = Dk(y): plaintext (decryption)
k = encryption and decryption key
The functions Ek() and Dk() must be inverses of one another: Ek(Dk(y)) = ? Dk(Ek(x)) = ? Ek(Dk(x)) = ?
54
Transposition Cipher
Rearrange letters in plaintext to produce ciphertext
Example (Rail-Fence Cipher): plaintext is “HELLO WORLD”; rearranged as
HLOOL
ELWRD
Ciphertext is HLOOL ELWRD
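The rail-fence rearrangement can be sketched as: write the letters alternately across the rails, then read each rail in turn.

```python
def rail_fence_encrypt(plaintext, rails=2):
    """Zig-zag the text across `rails` rows, then read row by row."""
    rows = ["" for _ in range(rails)]
    rail, step = 0, 1
    for ch in plaintext:
        rows[rail] += ch
        if rail == 0:
            step = 1                  # bounce down from the top rail
        elif rail == rails - 1:
            step = -1                 # bounce up from the bottom rail
        rail += step
    return "".join(rows)

print(rail_fence_encrypt("HELLOWORLD"))  # HLOOLELWRD
```

Note that every plaintext letter survives unchanged; only positions move, which is the defining trait of a transposition cipher (and why letter-frequency statistics are preserved).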
55
Public Key Cryptography
Two keys: a private key known only to the individual, and a public key available to anyone
Idea:
Confidentiality: encipher using the public key, decipher using the private key
Integrity/authentication: encipher using the private key, decipher using the public one
56
Requirements
1. Given the appropriate key, it must be computationally easy to encipher or decipher a message
2. It must be computationally infeasible to derive the private key from the public key
3. It must be computationally infeasible to determine the private key from a chosen plaintext attack
57
Confidentiality using Public Key
(Figure: Alice encrypts message X with key ?? to produce Y; Bob decrypts Y with key ?? to recover X. Which keys from the key source fill the blanks?)
58
Authentication using RSA
(Figure: Alice encrypts message X with key ?? to produce Y; Bob decrypts Y with key ?? to recover X. Which keys fill the blanks?)
59
Confidentiality + Authentication
(Figure: Alice applies two encryptions to X, producing Y and then Z; Bob applies the two corresponding decryptions to recover X. Which keys, and in which order, fill the four ?? blanks?)
60
Digital Certificates
Create a token (message) containing: the identity of a principal (here, Alice); the corresponding public key; a timestamp (when issued); other information (identity of signer)
signed by a trusted authority (here, Cathy): CA = { eA || Alice || T } dC
CA is A’s certificate
61
Digital Signature
A construct that authenticates the origin and contents of a message in a manner provable to a disinterested third party (a “judge”)
The sender cannot deny having sent the message
Limited to technical proofs: the inability to deny that one’s cryptographic key was used to sign
One could claim the cryptographic key was stolen or compromised; legal proofs, etc., probably required
62
Signature
Classical: Alice and Bob share key k; Alice sends m || { m }k to Bob
Does this satisfy the requirement for message authentication? How?
Does this satisfy the requirement for a digital signature?
63
Public Key Digital Signatures (RSA)
Alice’s keys are dAlice, eAlice
Alice sends Bob m || { m }dAlice
In case of dispute, a judge computes { { m }dAlice }eAlice, and if it is m, Alice signed the message: she’s the only one who knows dAlice!
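The judge's check can be sketched with textbook-sized RSA numbers (the classic p = 61, q = 53 example; real keys are thousands of bits, and raw "textbook RSA" is never used without padding):

```python
# Toy RSA: n = 61 * 53 = 3233, phi = 60 * 52 = 3120,
# e = 17 and d = 2753 satisfy e * d = 1 (mod phi).
n, e, d = 3233, 17, 2753

def sign(m):
    """Alice computes { m }dAlice with her private exponent."""
    return pow(m, d, n)

def verify(m, sig):
    """The judge computes { { m }dAlice }eAlice and compares with m."""
    return pow(sig, e, n) == m

sig = sign(65)
print(verify(65, sig))  # True: only the holder of d could have produced sig
print(verify(66, sig))  # False: the signature does not match an altered message
```

In practice one signs a hash of the message rather than the message itself, both for efficiency and to bind the signature to arbitrary-length content.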
64
Cryptographic Checksum or Hash function h : A → B:
1. For any x ∈ A, h(x) is easy to compute
2. For any y ∈ B, it is computationally infeasible to find x ∈ A such that h(x) = y (one-way property)
3. It is computationally infeasible to find x, x′ ∈ A such that x ≠ x′ and h(x) = h(x′) (this property has an alternate, weaker form)
Message digests: MD2, MD4, MD5 (Ronald Rivest); SHA, SHA-1 (Secure Hash Algorithm); SHA-256, SHA-384, SHA-512
MD5(There is $1500 in the blue bo) = f80b3fde8ecbac1b515960b9058de7a1
MD5(There is $1500 in the blue box) = a4a5471a0e019a4a502134d38fb64729
How do you use hash functions to authenticate a message?
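One common answer to the closing question: mix a shared secret key into the digest (an HMAC), so that only key holders can produce a matching tag. A minimal sketch with Python's standard library:

```python
import hashlib
import hmac

key = b"shared-secret"  # known only to sender and receiver
message = b"There is $1500 in the blue box"

# Sender attaches the keyed digest to the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def authentic(msg, received_tag):
    """Receiver recomputes the tag and compares in constant time."""
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_tag)

print(authentic(message, tag))                           # True
print(authentic(b"There is $1500 in the blue bo", tag))  # False: one byte off
```

A plain unkeyed hash would not suffice: an attacker who alters the message can simply recompute the hash, so the secret key is what turns a checksum into an authenticator.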
65
66
Protection Strength
Unconditionally secure: even with unlimited resources and unlimited time, the plaintext CANNOT be recovered from the ciphertext
Computationally secure: the cost of breaking a ciphertext exceeds the value of the hidden information, or the time taken to break the ciphertext exceeds the useful lifetime of the information
67
Average time required for exhaustive key search
Key Size (bits) | Number of Alternative Keys | Time required at 10^6 Decryptions/µs
32  | 2^32 = 4.3 x 10^9   | 2.15 milliseconds
56  | 2^56 = 7.2 x 10^16  | 10 hours
128 | 2^128 = 3.4 x 10^38 | 5.4 x 10^18 years
168 | 2^168 = 3.7 x 10^50 | 5.9 x 10^30 years
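The table's figures follow from one formula: on average an attacker searches half the key space. A quick check of the first two rows:

```python
# 10^6 decryptions per microsecond = 10^12 decryptions per second.
rate = 10**6 * 10**6

ms_32 = (2**32 / 2) / rate * 1000      # 32-bit keys, in milliseconds
hours_56 = (2**56 / 2) / rate / 3600   # 56-bit keys, in hours

print(round(ms_32, 2))     # ~2.15 milliseconds
print(round(hours_56, 1))  # ~10.0 hours
```

Each extra key bit doubles the work, which is why the jump from 56 to 128 bits turns hours into numbers of years far beyond the age of the universe.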
68
What is Authentication?
Authentication: binding the identity of an external entity to a subject
How do we do it?
Entity knows something (secret): passwords, ID numbers
Entity has something: badge, smart card
Entity is something: biometrics, e.g., fingerprints or retinal characteristics
Entity is in someplace: source IP, restricted-area terminal

Authentication Systems
Password issues: brute force, dictionary attacks; password selection and aging
Challenge–response
CAPTCHA
Physically Unclonable Functions: e.g., RFID counterfeit detection
69
70
Authentication Systems: Biometrics
Used for human subject identification based on physical characteristics that are tough to copy
Fingerprint (optical scanning)
Cameras needed (bulky)
Voice: speaker verification (identity) or speaker recognition (info content)
Iris/retina patterns (unique for each person); laser beaming is intrusive
Face recognition: facial features can make this difficult
Keystroke interval/timing/pressure
71
Malicious Code
Trojan Horse What is it?
Virus What is it?
Worm What is it?
72
Example
Perpetrator:
cat >/homes/victim/ls <<eof
cp /bin/sh /tmp/.xxsh
chmod u+s,o+x /tmp/.xxsh
rm ./ls
ls $*
eof
Victim:
ls
What happens?
73
Defense
Clear distinction between data and executable
A virus must write to a program: write only allowed to data
A virus must execute to spread/act: data not allowed to execute
Auditable action required to change data to executable
74
Defense
Information Flow Control
Least Privilege
Sandbox / Virtual Machine
Use of Multi-Level Security Mechanisms
75
What is a Buffer Overflow?
A buffer overflow occurs when data is written outside of the boundaries of the memory allocated to a particular data structure
(Figure: a copy operation moves 16 bytes of data from source memory into a destination with only 12 bytes of allocated memory, spilling into the adjacent "other memory".)
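The figure's 16-bytes-into-12 copy can be simulated safely in Python by treating a bytearray as raw memory; the unchecked loop plays the role of a careless memcpy/strcpy:

```python
# Simulated memory: a 12-byte buffer followed by 12 bytes of "other memory".
memory = bytearray(24)
BUF_START, BUF_SIZE = 0, 12

source = b"AAAAAAAAAAAAAAAA"  # 16 bytes of attacker-chosen data

# Unchecked copy: no comparison of len(source) against BUF_SIZE.
for i, b in enumerate(source):
    memory[BUF_START + i] = b

overflowed = bytes(memory[BUF_SIZE:BUF_SIZE + 4])
print(overflowed)  # b'AAAA': the 4 excess bytes landed in adjacent memory
```

In a real C program that adjacent memory might hold a variable, a function pointer, or a saved return address, which is exactly what the next slides exploit; the fix is the missing bounds check before the copy.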
76
Buffer Overflows
Caused when buffer boundaries are neglected and unchecked
Buffer overflows can be exploited to modify: a variable; a data pointer; a function pointer; the return address on the stack
77
Smashing the Stack
This is an important class of vulnerability because of its frequency and potential consequences.
Occurs when a buffer overflow overwrites data in the memory allocated to the execution stack.
Successful exploits can overwrite the return address on the stack allowing execution of arbitrary code on the targeted machine.
78
Program Stacks
A program stack is used to keep track of program execution and state by storing: the return address into the calling function; arguments to the functions; local variables (temporary)
The stack is modified during function calls, function initialization, and returns from a subroutine
(Figure: process memory layout: Code, Data, Heap, Stack.)
79
Stack Segment
The stack supports nested invocation calls
Information pushed on the stack as a result of a function call is called a frame
A stack frame is created for each subroutine and destroyed upon return
b() { … }
a() { b(); }
main() { a(); }
(Figure: with main() calling a() calling b(), frames for main(), a(), and b() are stacked from high memory toward low memory, with unallocated space beyond.)
80
Intrusion Detection/Response
Systems under attack fail to meet one or more of the following characteristics:
1. Actions of users/processes conform to statistically predictable patterns
2. Actions of users/processes do not include sequences of commands to subvert the security policy
3. Actions of processes conform to specifications describing allowable actions
81
Intrusion Detection
Idea: an attack can be discovered when one of the above is violated
Practical goals of intrusion detection systems:
Detect a wide variety of intrusions (known + unknown)
Detect in a timely fashion
Present the analysis in a useful manner: need to monitor many components; proper interfaces needed
Be (sufficiently) accurate: minimize false positives and false negatives
What is a VPN?
A network that supports a closed community of authorized users
Uses the public Internet as part of the virtual private network
There is traffic isolation: contents, services, and resources are secure
Provides security: confidentiality and integrity of data; user authentication; network access control
Tunneling in VPN
Firewalls
Total isolation of networked systems is undesirable; use firewalls to achieve selective border control
A firewall is a configuration of machines and software that limits network access; it comes “for free” inside many devices
Alternately, a firewall is a host that mediates access to a network, allowing and disallowing certain types of access based on a configured security policy
What Firewalls Can’t Do
They are not a panacea: they only add to defense in depth
They can provide a false sense of security
They cannot prevent insider attacks
Firewalls act at a particular layer
An Example: Screened-Subnet Firewalls (with DMZ)
Privacy
Privacy vs. Confidentiality
Aspects of privacy:
Controlled disclosure: owner-based access control?
Sensitive data
Affected subject: it is not just a person’s concern
87
Privacy Challenges
Privacy policy specification
Active content, e.g., cookies
Multi-domain environments: sharing needs and preferences differ
Anonymity: pseudo identity, partial identity
Inferences, e.g., through use of data mining tools
88
Approaches to Privacy
Privacy-aware access control: “purpose”, “obligation”
Privacy enhancing technologies: web anonymizers, remailers; Platform for Privacy Preferences (P3P)
Privacy assurance (3rd-party verification): impact assessments, commissioners, etc.
Inference control: privacy-preserving data mining, distributed data
89
Privacy Principles and Policies
Principles of Fair Information Practice: collection limitation; data quality; purpose specification; use limitation; security safeguards; openness; individual participation; accountability
90
Privacy Laws
Fair Credit Reporting Act
HIPAA
Gramm-Leach-Bliley Act
Children’s Online Privacy Protection Act
Family Educational Rights and Privacy Act
91
92
Thanks