TRANSCRIPT
© 2000–2005 by Carnegie Mellon University. CERT is a registered service mark of Carnegie Mellon University.
Carnegie Mellon University
Software Engineering Institute
Cyber Security and Control System Survivability: Technical and Policy Challenges
Howard F. Lipson, Ph.D.
CERT, Software Engineering Institute
Pittsburgh, PA
[email protected]
PSERC Tele-Seminar November 1, 2005
Outline
1. A Brief Introduction to Cyber Security
2. Survivability Concepts
3. Control System Survivability – Challenges and Research Issues
Cyber security issues
• The Internet was not designed to resist highly untrustworthy users
  - Example: IP spoofing
• The Internet was never designed for tracking and tracing user behavior
  - billing not based on fine-grained behavior
  - a packet’s source address is untrustworthy
  - high-speed traffic hinders tracking
  - attacks often cross multiple administrative, jurisdictional, and national boundaries
  - anonymizers impede tracking
  - the link between a user and an IP address is tenuous
Cyber security issues (2)
• The current threat and usage environment far exceeds the Internet’s design parameters
  - Severe real-time constraints for control systems take this to a new level
• The expertise of the average system administrator continues to decline
  - Poorly protected hosts (preferably, from the attacker’s perspective, those with high-bandwidth connections) are compromised and used as stepping stones to attack other (possibly better-protected) targets, and to hide the true origin of the attack
Cyber security issues (3)
• Commercial off-the-shelf (COTS) software and public-domain software are ubiquitous, and widely accessible for experimentation to discover vulnerabilities (which are later exploited by malicious adversaries)
• Security is usually an afterthought in the software development life cycle
  - “patch and pray” is not enough
  - need security training & education for developers
  - need to “build security in” from the start (see https://buildsecurityin.us-cert.gov)
Cyber security issues (4)
• Systems designed for use on closed (private) networks were not engineered with the security necessary for today’s Internet
  - Policies and procedures (e.g., who has access to what assets) not planned with cyber security in mind
• “Security through obscurity” often fails
• Cyber attacks are often not recognized by the victim
Information Security Model
[Diagram (cube). Security properties: Confidentiality, Integrity, Availability. Information states: Processing, Storage, Transmission. Countermeasures: Technology; Policy & Procedures; Education, Training & Awareness.]
NSTISSI 4011: National Training Standard for Information Systems Security Professionals, 1994
Examples of Security Vulnerabilities
Often narrowly defined as exploitable software flaws:
• Unchecked buffer bounds (buffer overflow)
• Missing or incorrect validation of input (malformed input interpreted as system commands)
• “Do-it-yourself” crypto
• Insufficient provision of system resources
• Temporary files accessible or modifiable by unauthorized processes

Vulnerabilities in a much broader sense:
• Lack of security training (e.g., social engineering, “phishing”)
• System misconfiguration (e.g., default password)
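The “missing or incorrect validation of input” flaw above can be made concrete with a short sketch (illustrative only, not from the talk; `ping_unsafe`, `ping_safe`, and the allowlist pattern are hypothetical):

```python
import re

# Hypothetical allowlist for host names; anything outside it is rejected.
ALLOWED_HOST = re.compile(r"^[A-Za-z0-9.-]{1,253}$")

def ping_unsafe(host):
    # Vulnerable pattern: untrusted input is spliced into a shell command
    # string, so "example.com; rm -rf /" is interpreted as extra commands.
    return "ping -c 1 " + host

def ping_safe(host):
    # Validate against the allowlist, and build an argument vector instead
    # of a shell string so metacharacters are never interpreted.
    if not ALLOWED_HOST.match(host):
        raise ValueError("malformed host name rejected")
    return ["ping", "-c", "1", host]

print(ping_unsafe("example.com; rm -rf /"))  # attacker extends the command
print(ping_safe("example.com"))              # ['ping', '-c', '1', 'example.com']
```

The fix is twofold: reject input that fails validation, and avoid ever handing untrusted text to a command interpreter.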
Examples of Malware
• Worms (e.g., Morris worm, Code Red, Slammer)
• Viruses
• Trojan horses
• Logic bombs
• Scanners (can be used by attackers or defenders)
• Spyware (e.g., keystroke loggers)
• Exploit tools and toolkits
• Blended threats (e.g., W32/Blaster)
General trend: more targeted (“day-zero”) attacks
Distributed Denial of Service (DDoS)
• Botnet – a collection of compromised machines that can be remotely controlled by an attacker.
• Generally poor state of system security throughout the Internet makes very large botnets possible
• Distributed denial of service attacks are very difficult to defend against or trace. (Active research area.)
• Extortion by means of DDoS threats is today’s cyber incarnation of the “protection racket”
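Why a large botnet overwhelms a victim can be shown with a back-of-the-envelope toy model (not from the talk; `Victim`, the capacities, and the agent counts are invented for illustration):

```python
class Victim:
    """Toy server with a fixed per-tick request capacity."""
    def __init__(self, capacity_per_tick):
        self.capacity = capacity_per_tick
        self.served = 0
        self.dropped = 0

    def tick(self, requests):
        # The victim cannot distinguish flood traffic from legitimate load;
        # it serves up to capacity and drops the rest (including real users).
        self.served += min(requests, self.capacity)
        self.dropped += max(0, requests - self.capacity)

victim = Victim(capacity_per_tick=1000)
for _ in range(10):              # ten ticks of attack
    legit = 200                  # legitimate demand per tick
    flood = 500 * 5              # 500 agents, 5 requests each per tick
    victim.tick(legit + flood)

print(victim.served, victim.dropped)  # 10000 17000
```

Even this modest 500-agent botnet saturates the victim, and most drops are indistinguishable legitimate traffic, which is why filtering at the victim alone is ineffective.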
A Typical DDoS Attack
[Diagram: an intruder and many compromised machines distributed across the Internet]
Step One - Intruder to Handler
[Diagram: the intruder sends commands to a handler]
Step Two - Handler to Agents
[Diagram: the handler (“master”) sends commands to the agents]
Step Three - Agents to Victim
[Diagram: each agent (“zombie”) “independently” sends traffic to the victim]
Why Survivability?
Traditional computer security is not adequate to keep highly distributed systems running in the face of cyber attacks.
Survivability is an emerging discipline –a risk-management-based security paradigm.
The Problem
Large-scale highly distributed systems cannot be totally isolated from potential intruders.
No amount of system “hardening” can guarantee that such systems are invulnerable to attack.
Increasing complexity of systems provides more opportunity for attackers.
Serious consequences if things go wrong.
Attack Sophistication vs. Intruder Technical Knowledge
[Chart, 1990–2003: attack sophistication rises over time while the technical knowledge required of intruders falls. Milestones include: Internet social engineering attacks; packet spoofing; automated probes/scans; hijacking sessions; GUI intruder tools; widespread attacks using NNTP to distribute attack; widespread attacks on DNS infrastructure; automated widespread attacks; executable code attacks (against browsers); techniques to analyze code for vulnerabilities without source code; widespread denial-of-service attacks; Windows-based remote-controllable Trojans (Back Orifice); email propagation of malicious code; increase in wide-scale Trojan horse distribution; “stealth”/advanced scanning techniques; distributed attack tools; DDoS attacks; increase in worms; sophisticated command & control; anti-forensic techniques; home users targeted.]
Copyright 2004 Carnegie Mellon University. Reprinted with permission of the CERT® Coordination Center.
Vulnerability Exploit Cycle
[Diagram, over time: advanced intruders discover a new vulnerability → crude exploit tools appear → crude exploit tools are distributed → novice intruders use crude exploit tools → automated scanning/exploit tools are developed → widespread use of automated scanning/exploit tools → intruders begin using new types of exploits]
Number of Vulnerabilities Reported to the CERT/CC
Total vulnerabilities reported (1995–2Q 2005): 19,600
[Chart data: 1997: 311 | 1998: 262 | 1999: 417 | 2000: 1,090 | 2001: 2,437 | 2002: 4,129 | 2003: 3,784 | 2004: 3,780 | 1st half 2005: 2,874]
http://www.cert.org/stats/
In the beginning . . .
“Can we build DoD systems that will continue to operate despite a successful cyber-attack?”
DARPA (Survivability Program)Late 1995, early 1996
Survivability is the ability of a system to fulfill its mission, in a timely manner, in the presence of attacks, failures, or accidents.
Survivability
[Diagram: in response to an Attack, the system must Resist, Recognize, and Recover, and then Adapt]
3 R’s of Survivability
Resistance – ability of a system to repel attacks
Recognition – ability to recognize attacks and the extent of damage
Recovery – ability to restore essential services during attack, and recover full services after attack
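A minimal sketch of how the three R’s might show up in code (hypothetical: `SurvivableService`, the rate limit, and the blocking policy are illustrative, not from the talk):

```python
class SurvivableService:
    """Toy service illustrating the three R's for one request stream."""
    def __init__(self, rate_limit):
        self.rate_limit = rate_limit   # recognition threshold per source
        self.counts = {}               # recognition: per-source counters
        self.blocked = set()           # sources isolated during recovery

    def handle(self, source):
        if source in self.blocked:
            return "rejected"          # resistance: repel a known-bad source
        self.counts[source] = self.counts.get(source, 0) + 1
        if self.counts[source] > self.rate_limit:
            self.blocked.add(source)   # recognition: flag the attack, then
            return "rejected"          # recovery: isolate it, keep serving others
        return "served"

svc = SurvivableService(rate_limit=3)
results = [svc.handle("attacker") for _ in range(5)] + [svc.handle("operator")]
print(results)  # ['served', 'served', 'served', 'rejected', 'rejected', 'served']
```

The point of the sketch is that essential service to the legitimate source continues even while the attacking source is recognized and isolated.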
For Short-term Survivability
Deal with the effects of a crisis (survivability scenario): a car rounding a sharp curve is about to veer off a cliff.

A guardrail is a “survivability solution”, whether the underlying cause is:
• Ice on the road
• A drunken driver
• Brakes that have been tampered with
For long-term survivability: Do the forensics!
An Analogy Is Becoming Reality
Emerging trend: X-by-Wire replacing mechanical and hydraulic control linkages.
X = { fly, steer, brake, ... }
Today,
• Power steering degrades to difficult but functional manual steering
• Power braking degrades to manual braking
Tomorrow ?
For Long-term Survivability
System adaptation and evolution are essential, because …
• New vulnerabilities are discovered• New attack patterns appear• Continual attacker-defender escalation• Underlying technologies change• Collaborators become competitors• Political, social, legal changes• Missions evolve, or change drastically
Traditional Assumptions for Information Security
• Clearly defined boundaries
• Central administrative control
• Global visibility
• Trustworthy insiders

Together these assumptions define the “Fortress” Model.
Today’s Computing Environment
Everything on the previous slide
The New Computing Environment Changes Everything
• Open, highly distributed systems
• Boundaries are ill-defined (complex physical and logical perimeters)
• No central (or unified) administrative control
• No global visibility
• Untrustworthy insiders
  - includes incomplete and imprecise information about software: COTS, Java applets, ActiveX controls, etc.
• Unknown participants
• Large-scale, distributed, coordinated attacks
• Survival at risk
Unbounded Systems
• No unified administrative control
• No global visibility
• Untrustworthy insiders
• Lack of complete, timely information
Bounded Thinking in an Unbounded World
Survivability is the ability of a system to fulfill its mission, in a timely manner, in the presence of attacks, failures, or accidents.
Survivability
[Diagram: in response to an Attack, the system must Resist, Recognize, and Recover, and then Adapt]
Fundamental Assumption
No individual component of a system is immune to all attacks, accidents, and design errors.
Fundamental Goal
The mission must survive.• Not any individual component• Not even the system itself
Mission
A very high-level statement of context-dependent requirements:
(1) Under normal usage(2) Under stress
. . . graceful degradation
. . . essential services maintained
Example: Mission of the Titanic
Under normal conditions:
Luxurious transatlantic transportation
Under stress:
Buoyancy
Example: Mission of the US Electric Power Industry
Reliably generate and supply electricity wherever and whenever it is needed in North America.

Under deregulation: … and profitably.
Survivability Requirements
Mission-critical functionality
• (alternate sets of) minimum essential services
• graceful degradation of services

Mission-critical software quality attributes
• security, safety, reliability, performance, usability

Requirements for the 3 R’s and evolution
The New Paradigm – Survivability versus Security
Security is a technical specialty that provides generic solutions that are largely independent of the mission being protected.
Survivability is a blend of security and mission-specific risk management.
Survivability solutions require
• participation from all aspects of an organization: technical and business (and other stakeholders)
• intense collaboration between domain experts and security/survivability experts.
Some Techniques & Methods
Security
• Fortress model: firewalls, security policy
• Authentication, access control (insider trust)
• Encryption
• Intrusion detection (recovery secondary)
• Auditing, integrity checking, monitoring
• Success criteria:
  - binary: attack succeeds or fails
  - follow industry standard practices
Survivability Techniques & Methods
• Security techniques where applicable
• Diversity, redundancy
• Trust validation
• Recovery (largely automated)
• Mission-specific risk management
  - includes contingency (disaster) planning
• Emergent algorithms
• Success criterion:
  - mission fulfillment: graceful degradation, essential services maintained
• Solutions can transcend the system
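The role of diversity and redundancy in the list above can be sketched with majority voting over redundant components (illustrative only; `vote` and the readings are hypothetical, not from the talk):

```python
from collections import Counter

def vote(readings):
    # Majority voting over redundant (ideally diverse) component outputs:
    # the mission-level answer survives one corrupted or compromised copy.
    value, count = Counter(readings).most_common(1)[0]
    if count <= len(readings) // 2:
        raise ValueError("no majority: degrade gracefully instead")
    return value

print(vote(["OPEN", "OPEN", "CLOSED"]))  # OPEN survives one bad reading
```

True diversity matters here: three identical components can share a single exploitable flaw, in which case voting provides no protection.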
Characteristics of Survivability
Survivability is an emergent property of a system.
Desired system-wide properties “emerge” from local actions and distributed cooperation.
An emergent property need not be a property of any individual node or link.
Some Survivability Research Approaches
Survivable Systems Analysis Method
Emergent Algorithms
Survivable Systems Simulation
Survivable Systems Analysis
• Understand survivability risks for your system:
  - What system services must survive attacks, accidents, and failures?
  - What architectural elements aid in resistance, recognition, and recovery?
• Identify mitigating strategies:
  - What architecture changes can improve survivability?
  - Which changes have the highest payoff?
Survivable System Analysis Map
[Table: one row per intrusion scenario (Scenario 1 … Scenario n), each with a Current and a Recommended entry. Columns: Intrusion Scenario | Softspot Effects | Architecture Strategies for Resistance, Recognition, and Recovery.]
• Defines survivability strategies for the three R’s based on intrusion softspots
• Relates survivability strategies to the architecture
• Makes recommendations for architecture modifications
• Provides basis for risk analysis, cost-benefit trade-offs
Emergent Algorithms
Simple Local Actions + Simple Near-Neighbor Interactions => Complex Global Properties
Autonomous distributed agentssuch that if sufficiently many act as intended,desired global properties will emerge.
Distributed computationsthat fulfill mission requirements by exploiting the characteristics of unbounded systems.
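A tiny emergent-algorithm example (not from the talk; the ring topology and values are invented): nodes on a ring repeatedly take the maximum over themselves and their two neighbors. No node has global visibility, yet the global maximum emerges at every node.

```python
def gossip_max(values, rounds):
    # Each node sees only its ring neighbors; local max-taking is the
    # "simple local action," global agreement is the emergent property.
    state = list(values)
    n = len(state)
    for _ in range(rounds):
        state = [max(state[(i - 1) % n], state[i], state[(i + 1) % n])
                 for i in range(n)]
    return state

nodes = [3, 1, 4, 1, 5, 9, 2, 6]
print(gossip_max(nodes, rounds=len(nodes)))  # [9, 9, 9, 9, 9, 9, 9, 9]
```

Because the maximum spreads one hop per round in each direction, running for as many rounds as there are nodes is always sufficient for agreement, and the result holds even if some nodes start with stale values.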
Survivable Systems Simulation
Easel — Emergent Algorithm Simulation Environment and Language

Research Goals:
• Advance scientific knowledge of survivable systems
• Improve survivability of mission-critical systems
• Provide tools and methods for survivability engineering
EASEL -- Version 3.0 (Download): http://www.sei.cmu.edu/community/easel/
Survivability – Summary
Security provides generic technical solutions
  - binary “fortress” model
  - leave security to the experts

Survivability is a blend of security and mission-specific risk management
  - graceful degradation
  - essential services maintained
  - all stakeholders must contribute
  - domain experts must be full partners
Internet-connected Control System
[Diagram: Internet, firewall, IT business system, and control system, contrasting “The Concept” with “Reality”]
A further dose of reality ...
[Diagram: Internet, firewall, IT business system, and control system (“Reality”)]
Internet connectivity issues
• Control systems now faced with all the engineering issues associated with Internet connectivity
• Exposed to general Internet malware and attacks
• Subject to targeted attacks (“day-zero” attacks) for which no attack signatures are available
• Subject to probes and vulnerability scans in preparation for attack
• Denial of service attacks, tampering with monitoring results, or injecting malicious control requests can have disastrous consequences
Internet connectivity issues (2)
• Control system devices and protocols designed for a “closed” system environment don’t have the security properties needed in an “open” environment
  - no strong authentication
  - no encryption
• CPU power and storage may be too limited to support needed security tasks (e.g., encryption)
• “Security through obscurity” will often fail
• IT or security staff may be unaware of all Internet access points or other remote access
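One way the “no strong authentication” gap is often addressed, where CPU budgets allow, is a keyed message authentication code (MAC) on each control message. A sketch under stated assumptions (the shared key, the wire format, and the command name are invented; real deployments need key management and protocol integration):

```python
import hashlib
import hmac

KEY = b"example-shared-key"  # hypothetical pre-shared key

def sign(command, seq):
    # The sequence number gives replay resistance; the HMAC tag
    # authenticates both the command and the sequence number.
    msg = command + b"|" + str(seq).encode()
    tag = hmac.new(KEY, msg, hashlib.sha256).hexdigest().encode()
    return msg + b"|" + tag

def verify(wire):
    msg, _, tag = wire.rpartition(b"|")
    expected = hmac.new(KEY, msg, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return msg

wire = sign(b"OPEN_BREAKER_7", seq=42)
print(verify(wire))  # b'OPEN_BREAKER_7|42'
```

An HMAC over SHA-256 is far cheaper than full encryption, which is one reason message authentication is sometimes feasible on constrained control devices even when confidentiality is not.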
Internet connectivity issues (3)
• Blended threats (for example, physical and cyber, or multiple types of cyber attack) are possible
• Security and survivability degrade over time, so continual adaptation / evolution is necessary
• Traditional long replacement / evolution cycle versus the need to react quickly to security advisories
• How can you resolve the need for rapid application of security patches with the necessity for extremely careful testing and evaluation of those patches in a control system environment?
Internet connectivity issues (4)
• As control system devices (components) move from proprietary protocols towards open standards, for use across multiple industries, the vulnerability landscape may begin to resemble that of COTS products on the Internet today.
• Need education and training (operators, managers, software developers), new policies and procedures (including changes to physical security to protect cyber assets)
• Need an industry-wide incident emergency response capability, specialized for control systems
Some Survivability Research Issues
Survivability Research Issues
How do you assess and measure the survivability of control systems?
How do you effectively model, simulate, and visualize survivability in the control system domain?
What are the necessary capabilities of a test bed for control system security and survivability?
• INL’s Control System Security Center: http://controlsystemssecurity.inl.gov/
Survivability Research Issues (2)
What architectural approaches are best?
- context (scenario and domain) dependent
- must be capable of rapid evolution
What control system architectures (and component mix) provide the redundancy and true diversity needed to contribute to a high assurance of survivability?
How can control system devices (components) be designed (what security and survivability properties must they have) so that they can demonstrably contribute to the overall survivability of the composite system (or system of systems)?
Survivability Research Issues (3)
What methodologies could help incorporate survivability into the engineering life cycle for control systems?
How do you manage the risks and tradeoffs to design survivable and affordable control systems?
How do you design control systems that can sustain their survivability in the face of ever-escalating attacker capabilities?
Survivability Research Issues (4)
What are the survivability strategies for dealing with legacy devices versus web-enabled devices?
How can society’s public policy decisions be translated into survivability solutions?
What economic incentives for vendors, or what regulatory-legal environment would lead to enhanced survivability for control systems?
For further reading …

Howard F. Lipson, Tracking and Tracing Cyber-Attacks: Technical Challenges and Global Policy Issues, SEI Special Report, November 2002. http://www.cert.org/archive/pdf/02sr009.pdf

Howard F. Lipson and David A. Fisher, “Survivability—A New Technical and Business Perspective on Security,” Proceedings of the 1999 New Security Paradigms Workshop, Caledon Hills, ON, Sept. 21–24, 1999, Association for Computing Machinery, New York, NY. http://www.cert.org/archive/pdf/busperspec.pdf

Jelena Mirkovic, Sven Dietrich, David Dittrich, and Peter Reiher, Internet Denial of Service: Attack and Defense Mechanisms, Prentice Hall, 2004.

Frederick Sheldon, Tom Potok, Andy Loebl, Axel Krings, and Paul Oman, “Energy Infrastructure Survivability, Inherent Limitations, Obstacles and Mitigation Strategies,” Proceedings of PowerCon 2003, IASTED International Conference, New York, NY.

Joseph Weiss, “Cyber Security Meets Plant Politics – Don’t Neglect Your Control System,” InTech, ISA, July 1, 2005. http://www.isa.org/InTechTemplate.cfm?Section=InTech&template=/ContentManagement/ContentDisplay.cfm&ContentID=45287
More on survivability research is available at: http://www.cert.org/research/
Contact Info
Howard F. Lipson, Ph.D.
Sr. Member of the Technical Staff
CERT, Software Engineering Institute
Pittsburgh, PA
[email protected]
+1-412-268-7237
http://www.cert.org/research

Adjunct Professor
Department of Engineering and Public Policy
Carnegie Mellon University