TRANSCRIPT
Using Contextual Information for IDS Alarm Classification
Francois Gagnon, Ph.D. Student, Carleton University, Canada
[email protected], www.nmai.ca
Frederic Massicotte, Communications Research Centre Canada
Babak Esfandiari, Carleton University, Canada
IDS Context F. Gagnon @ DIMVA’09 1 / 23
• IDSes produce a lot of alarms.
• Administrators are overwhelmed with non-critical alarms.
Outline
• Introduction
• Experiment Setup
• Results
• Conclusion
Introduction
Alarm Classes
Non-critical alarms are not indicative of a plausible threat.
[Figure: Venn diagram of the event space, with overlapping regions for Normal Events, Malicious Events, IDS Alarms, and Successful Attacks, partitioning alarms into TN, FP, FN, TP, NRP, and LFN]

Non-Critical Alarm: a false positive (alarm related to normal background traffic) or a non-relevant positive (alarm related to an unsuccessful attack attempt).
• They pose two problems:
– Distract security officers from real threats.
– Prevent automatically blocking attacks.
Introduction
Using Contextual Information
• An attack succeeds only when several conditions are met.
• As soon as 1 condition is not respected, the attack fails.
• Using the attack context, we can identify some attack attempts that are bound to fail.
• Several types of contextual information:
– Network (topology and protocols)
– Attack side effect (returned messages and log files)
– Vulnerability assessment
– Target configuration (operating system and applications)
Introduction
Objectives
• Potential:
Is target configuration an effective piece of contextual information to classify IDS alarms?
• Current:
Are the existing tools good enough to gather this context automatically?
Outline
• Introduction
• Experiment Setup
• Results
• Conclusion
Experiment Setup
Dataset
• Using a freely available attack dataset from the CRC [2]
• 5,761 traces (1 trace ⇒ 1 attack attempt ⇒ 1 alarm)
• No background traffic
• 92 exploits
– Covering 47 vulnerabilities (BIDs)
– Targeting 18 ports (TCP and UDP)
• 95 targets (34 BSD, 25 Linux, 36 Windows)
• Well-documented
– Target OS and App
– Attack result (success/failure)
– Snort alarms
Experiment Setup
Evaluation Process - Potential
[Figure: evaluation pipeline for the potential experiment.
(1) The IDS Alarm (Target IP, BID) is joined with the Trace Info (Target IP, Target OS, Target App, Attack Outcome: success\failure).
(2) The BID is looked up on Bugtraq to obtain the lists of vulnerable and non-vulnerable products.
(3) Automatic Classification compares the Target OS and Target App against those lists and outputs a class (non-critical\attempt).
(4) Automatic Verification compares the class against the attack outcome and outputs a result (correct\incorrect).]
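The four numbered steps of the pipeline can be sketched in a few lines of Python. The record field names (`target_ip`, `bid`, `outcome`, etc.) are illustrative assumptions, not the paper's actual data format; with no background traffic in the dataset, an alarm is non-critical exactly when its attack failed.

```python
# Sketch of the four-step evaluation loop from the figure.
# Field names and dict-based lookups are illustrative assumptions.
def evaluate(alarms, traces, bugtraq, classify):
    """Count correct/incorrect classifications over a set of alarms."""
    results = {"correct": 0, "incorrect": 0}
    for alarm in alarms:
        trace = traces[alarm["target_ip"]]        # (1) join alarm with trace info
        entry = bugtraq[alarm["bid"]]             # (2) look up the BID on Bugtraq
        cls = classify(trace["os"], trace["app"], entry)  # (3) "NC" or "A"
        # (4) verification: "NC" is correct iff the attack actually failed
        correct = (cls == "NC") == (trace["outcome"] == "failure")
        results["correct" if correct else "incorrect"] += 1
    return results
```

A usage example with two toy alarms and a trivial OS-based classifier:

```python
alarms = [{"target_ip": "10.0.0.1", "bid": 1234},
          {"target_ip": "10.0.0.2", "bid": 1234}]
traces = {"10.0.0.1": {"os": "Linux", "app": "wuftpd", "outcome": "failure"},
          "10.0.0.2": {"os": "Windows 2000", "app": "IIS", "outcome": "success"}}
bugtraq = {1234: {"vulnerable": {"Windows 2000"}}}
classify = lambda os_, app, entry: "A" if os_ in entry["vulnerable"] else "NC"
evaluate(alarms, traces, bugtraq, classify)
```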
Experiment Setup
Evaluation Process - Tools
[Figure: evaluation pipeline for the tools experiment. Same as the previous figure, except that the Target OS and Target App now come from a discovery Tool run against the Target IP rather than from the trace documentation; the Trace Info contributes only the Attack Outcome (success\failure).]
Experiment Setup
Classification Algorithms
ContextOS:
(1) if the target OS is listed as non-vulnerable for this exploit, return NC
(2) if the target OS is not listed as vulnerable for the BID and
(2.1) if all the products listed as vulnerable are OSes, return NC
(3) return A
ContextApp: considers only the App.
ContextOSApp: considers both OS and App.
ContextOSDeduction: considers only the OS and deduces some App info from it (e.g., Microsoft IIS cannot run on a Linux computer).
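The ContextOS rules above can be sketched in Python. The `BugtraqEntry` structure and the `is_os` helper are illustrative assumptions, not part of the original tool; a real implementation would parse the vulnerable/non-vulnerable product lists from SecurityFocus.

```python
# Sketch of the ContextOS classification rules from the slide.
# BugtraqEntry and is_os() are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class BugtraqEntry:
    vulnerable: set = field(default_factory=set)      # products listed as vulnerable
    non_vulnerable: set = field(default_factory=set)  # products listed as non-vulnerable

# Hypothetical helper: decides whether a product name denotes an OS.
OS_NAMES = {"Windows 2000", "Windows XP", "Linux", "FreeBSD", "OpenBSD"}
def is_os(product):
    return product in OS_NAMES

def context_os(target_os, entry):
    """ContextOS: classify an alarm as NC (non-critical) or A (attempt)."""
    # (1) Target OS explicitly listed as non-vulnerable -> non-critical.
    if target_os in entry.non_vulnerable:
        return "NC"
    # (2) Target OS not listed as vulnerable, and (2.1) every product
    # listed as vulnerable is an OS -> the target cannot be affected.
    if target_os not in entry.vulnerable and all(is_os(p) for p in entry.vulnerable):
        return "NC"
    # (3) Otherwise, treat the alarm as a genuine attempt.
    return "A"
```

Note how rule (2.1) keeps the algorithm conservative: if the vulnerable list mixes OSes and applications, knowing only the OS is not enough to rule the attack out, so the alarm stays classified as an attempt.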
Outline
• Introduction
• Experiment Setup
• Results
• Conclusion
Performance Measures
Recall = (# of non-critical alarms classified as NC) / (# of non-critical alarms) = α / (α + γ)

Precision = (# of non-critical alarms classified as NC) / (# of alarms classified as NC) = α / (α + β)

                                Alarm
                     Non-critical   Critical
Classification  NC        α            β
                A         γ            δ
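In code, the two measures are direct ratios over the α, β, γ counts from the table; a minimal sketch:

```python
# Recall and precision over the alarm confusion counts:
# alpha = non-critical alarms classified NC,
# beta  = critical alarms classified NC,
# gamma = non-critical alarms classified A.
def recall(alpha, gamma):
    """Fraction of all non-critical alarms that were classified NC."""
    return alpha / (alpha + gamma)

def precision(alpha, beta):
    """Fraction of NC-classified alarms that really were non-critical."""
    return alpha / (alpha + beta)
```

For example, with hypothetical counts α = 73 and γ = 27, recall is 0.73; with β = 0, precision is 1.0.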
Results
Target Configuration Potential
Assuming we know the exact target configuration (OS and App)
[Figure: bar charts of Recall % (0-80 scale) and Precision % (95-100 scale) for the four information sources: OS, App, OSDeduction, OSApp]
• Errors (the precision decrease) are due to missing entries on SecurityFocus.
Results
OSD Tools
Using the ContextOSDeduction algorithm
[Figure: bar chart of Recall % (0-50 scale) for the OSD tools: p0f (Syn), p0f (RstAck), Nmap, ettercap, p0f (SynAck), Xprobe]
• Current OSD tools are not nearly good enough (13% vs 40%).
Results
AppD Tools
Using the ContextApp algorithm
[Figure: bar charts of Precision % (95-100 scale) and Recall % (0-30 scale) for the AppD tools: ettercap and Nmap]
• How can Nmap be better than the ideal case (27% vs 23%)?
Results
Weird Results
Suppose the target application (Microsoft IIS FTP) is vulnerable to the attack, but the attack fails anyway (thus the alarm is non-critical):
• The alarm is classified A by ContextApp with exact knowledge.
• This means 0/1 for recall.
Suppose Nmap thinks the target application is wuftpd (not vulnerable):
• The alarm is now classified NC by ContextApp with Nmap.
• This means 1/1 for recall.
Those mistakes should result in a decrease of precision for Nmap (a successful attack misclassified as NC), but the dataset does not contain enough successful attacks for this effect to appear.
Outline
• Introduction
• Experiment Setup
• Results
• Conclusion
Conclusion
Discussion
• Target configuration is very useful for IDS context:
– Filtering 73% of non-critical alarms.
– Not filtering critical alarms.
• OSD tools are not adequate to gather the required contextual information (they achieve only 1/3 of the potential).
• There is a possibility for an attacker to manipulate the context by injecting traffic.
Conclusion
Future Work
• Compare the effectiveness of the different IDS context elements (e.g., vulnerability assessment with Nessus vs target configuration vs attack side effect).
• Develop a new OS discovery approach (HOSD, http://hosd.sourceforge.net) [1].
– Detect manipulation attempts on the context.
• Re-run the experiment on another dataset.
References
[1] Francois Gagnon, Babak Esfandiari, and Leopoldo Bertossi. A Hybrid Approach to Operating System Discovery Using Answer Set Programming. Proceedings of the 10th IFIP/IEEE Symposium on Integrated Management (IM'07), pages 391–400, 2007.
[2] Frederic Massicotte, Francois Gagnon, Mathieu Couture, Yvan Labiche, and Lionel Briand. Automatic Evaluation of Intrusion Detection Systems. Proceedings of the 2006 Annual Computer Security Applications Conference (ACSAC'06), 2006.