The Future: Dynamic Log Analysis
Whitepaper

In partnership with Clear Technologies


© Copyright 2009, HAWK Network Defense, Inc. All Rights Reserved.
Sal Mistry

The information in this paper is proprietary and confidential property of HAWK Network Defense, Inc. and may not be republished, redistributed, or modified in any way. The contents of this paper are intended solely for the recipient. To request permission for republishing or redistribution, contact HAWK Network Defense at [email protected] or Clear Technologies at [email protected].


Introduction

Our society has become dependent on computers and network systems. This dependence has continued to grow since the Internet and e-commerce exploded in the 1990s. Exposure to computer systems' vulnerabilities has also grown at an alarming rate as hackers strive to identify and exploit those vulnerabilities. Consequently, computers are attacked and compromised on a daily basis. These attacks steal personal identities, bring down entire networks, disable the online presence of businesses, or destroy sensitive information that is critical for personal or business purposes. Over the past year, Virginia's DHS system, TJX, Heartland Payment Systems, Google and T-Mobile have been adversely affected by breaches. Even Kaspersky, a security software provider, was infiltrated earlier this year. One security survey, for example, noted that in 1997, 37% of respondents reported a breach. A 2009 report by the Ponemon Institute, a privacy management research firm, reported a figure of 85%.

The Internet Crime Complaint Center, a partnership of the FBI, the National White Collar Crime Center, and the Bureau of Justice Assistance, reported that the number of complaints from victims of cyber crime rose by almost a third since 2007. The total number reached 275,284, amounting to $265 million in money lost. According to the Internet Security Threat Report published by Symantec in April 2009, attackers released Trojan horses, viruses, and worms at a record pace in 2008, primarily targeting computer users' confidential information, in particular their online banking account credentials. Specifically, Symantec documented a record 1.6 million instances of malicious code on the Web in 2008, about one million more than in 2007.

As a result, in recent years organizations have paid increasing attention to IT security. This is understandable given the sheer amount of information now in digital form. A recent InformationWeek Analytics survey revealed that 75% of its executive-level respondents stated that information security is among their highest priorities.

One reason is the ever-increasing volume of personal information theft and its accompanying liability costs. In certain cases, these events result in costly lawsuits, with much of the fees being paid to litigation service firms to sift through inaccessible, unorganized volumes of data. In other cases, companies incur the expense of setting up credit monitoring services for customers affected by the breach. According to the Ponemon Institute, U.S. companies paid an average of $202 per exposed record in 2008, up from $197 in 2007. The report further indicated that the total cost per breach for each company was $6.6 million in 2008, up from $6.3 million and $4.7 million in 2007 and 2006, respectively.

Also, the publicized financial violations and scandals, news of sensitive-data leakage and identity theft, and the rise of blended threats have mobilized global legislators and industry to craft laws and guidelines to protect shareholders, customers and citizens. The following industry standards and government regulations require all organizations, both in the public sector and in industry/business, to protect personal data:

- COBIT, DS5.5 Security Testing, Surveillance and Monitoring
- Payment Card Industry Data Security Standard (PCI DSS), Requirement 10: Track and monitor all access to network resources and cardholder data
- ISO 17799:2005, 10.10.2 Monitoring system use
- Federal Information Security Management Act of 2002 (FISMA), AU-6 Audit Monitoring, Analysis, and Reporting
- Health Insurance Portability and Accountability Act of 1996 (HIPAA), 164.308(a)(1) Information System Activity Review
- CMS Acceptable Risk Safeguards, AU-6 Audit Monitoring, Analysis, and Reporting
- Sarbanes-Oxley Act of 2002 (SOX)
- Gramm-Leach-Bliley Act (GLBA)

Additionally, with an increasing number of laid-off and overworked employees, executives are concerned about rogue employee sabotage. Earlier this year, Forbes published an article confirming that a growing number of current and former employees are seeking retribution against their employers. The Identity Theft Resource Center, a San Diego-based nonprofit, found that of the roughly 250 data breaches publicly reported in the United States between January 1 and June 12, 2008, victims blamed the largest share of incidents on theft by employees (18.4 percent).

Lastly, executives are focused on information security in order to preserve brand value. For years, Business Week/InterBrand has published their yearly findings on the top 100 brands. Because stability is one of the factors for determining a brand's value, one can assume that a customer will be doubtful of the stability of a brand that cannot protect their information.


In order to protect these systems, save money, comply with the many regulations and standards, and protect brand value, organizations of all sizes must monitor and analyze their systems on a regular basis. In the early days of computers and networks this task seemed fairly straightforward. The administrator would review the events (something that occurs within a system or network) by accessing the log (a record of the events occurring within an organization's systems and networks). Thereafter, the administrator would analyze the log by studying log entries to identify events of interest or suppress log entries for insignificant events.

However, with the sheer number of computers, routers, servers and devices used in an organization, the number, volume, and variety of computer security logs have increased greatly. This has created the need for computer security log management: the process for generating, transmitting, storing, analyzing, and disposing of computer security log data. In their publication, "Guide to Computer Security Log Management," the National Institute of Standards and Technology identified two major problems with log management. The first problem is that of "balancing a limited quantity of log management resources with a continuous supply of log data." The second is that of "ensuring that security, system and network administrators regularly perform efficient and effective analysis of log data."

Thoroughly addressing these two issues with new and innovative solutions is imperative to solving this crisis. It is not enough to efficiently and effectively analyze log data and follow the current practice of collecting, aggregating, normalizing, correlating and reporting the information. What is needed is a solution that effectively and efficiently detects malicious behavior before the damage is incurred, rather than letting the administrator know after the organizational assets have been compromised.

The early days: Data Collection

From 1950 to 1970, few real threats existed because of the small number of computers and networks. However, the 1970s and 1980s brought remote password attacks, social engineering attacks, the first PC and computer viruses, various 'worms', international espionage, and buffer overflow attacks. From the 1990s until today, various viruses and attacks have been perpetrated upon governments, organizations and individuals. As a result, technology has tried to keep pace with this growing demand for defense.

Log Collection. In the mid-1990s, it became apparent that manual analysis of logs belonging to critical systems (UNIX in particular) was not practical, as the obscure information generated in the typical security log was not easily translated into meaningful information. Further, it was difficult not only to identify when a security breach occurred, but also to pinpoint the root cause and related events. System administrators began to write scripts that would search through megabytes of data for certain events. The practice became standard mainly in the UNIX community, which limited its effectiveness with Windows and other operating systems. Moreover, the strength of this method was limited: each script looked for common events, and even then, many were missed or overlooked. The results of the scripts would be dumped into a file and later reviewed by an IT administrator. In most cases, the results were not available until the next day or many days later.
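A log-scanning script of that era might have looked like the following minimal sketch. The log path, event names, and regular expressions here are illustrative, not drawn from the paper:

```python
import re

# Illustrative patterns an administrator might have flagged;
# the names and expressions are examples only.
PATTERNS = {
    "failed_login": re.compile(r"authentication failure|Failed password"),
    "su_attempt":   re.compile(r"su: (BAD SU|FAILED)"),
}

def scan_log(path):
    """Return a list of (line_number, event_name, line) matches."""
    hits = []
    with open(path, errors="replace") as f:
        for n, line in enumerate(f, 1):
            for name, pat in PATTERNS.items():
                if pat.search(line):
                    hits.append((n, name, line.rstrip()))
    return hits
```

The results would then be dumped to a file for later review, exactly the "after the fact" workflow described below.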

Even if the administrator was able to manually sift through the data and interpret it, the administrator was only able to conduct "after the fact" analysis related to the security breach. Despite its significance in evaluating and enforcing a proper security process, this exercise was impractical for IT security staff seeking something specific, such as insight into security holes in an enterprise network or evidence of security violations.

Next Generation: Data Management

Due to the incredible speed at which computers process data, it is humanly impossible for a system administrator to examine each request for information or each system event. In addition, it is difficult to establish a baseline against which to analyze the information. In effect, the IT administrator's dilemma is: where there is smoke there is usually fire, but not with every security risk.

Log Management. Commercial development of intrusion detection technologies began in the early 1990s. An Intrusion Detection System (IDS) is a device that aims to detect any attempt to penetrate the security perimeter of the network by analyzing data for potential vulnerabilities and attacks in progress. Its underpinnings relied on the work conducted by several engineers, including Dr. Dorothy Denning of SRI, Dr. Todd Heberlein of the University of California at Davis, and several engineers at Lawrence Livermore Laboratories. Haystack Labs was the first commercial vendor of IDS tools, with its Stalker line of host-based products. SAIC was also developing a form of host-based intrusion detection, called the Computer Misuse Detection System (CMDS). The Air Force also developed Automated Security Incident Measurement (ASIM) to monitor network traffic on the US Air Force's network. ASIM made considerable progress in overcoming scalability and portability issues that previously plagued network intrusion detection (NID) products. Additionally, ASIM was the first solution to incorporate both hardware and software in a NID product. In 1994, the ASIM project formed a commercial company, the Wheel Group. Their product, NetRanger, was the first commercially viable network intrusion detection device.

The primary limitation of an IDS solution is false positives. According to IT Security Interviews Exposed, a false positive occurs when an Intrusion Detection and/or Prevention (IDS/IPS) device identifies normal traffic as malicious. According to the Network Security Bible, "most IDS require a human operator to be in the loop. Given the current maturity of IDS technology, the dangers of automated response are significant, and outweigh the preceding advantages. With the frequency of false positives that exist in the current generation of IDSs, the potential for inappropriate response to misdiagnosis is too high."

Intrusion prevention technology is considered by some to be an extension of IDS technology. Early intrusion prevention systems (IPS) were IDS that were able to push prevention commands to firewalls and access control changes to routers. IPS evolved in the late 1990s to resolve ambiguities in passive network monitoring by placing detection systems inline. An intrusion prevention system is a network security device that monitors network and/or system activities for malicious or unwanted behavior and can react in real time to block or prevent those activities. A network-based IPS, for example, will operate inline to monitor all network traffic for malicious code or attacks. When an attack is detected, it can drop the offending packets while still allowing all other traffic to pass.

The challenge facing organizations is that many existing countermeasures often miss sophisticated attacks and privileged user violations, because most traditional security systems are geared to identify failed logins, denied firewall or VPN access, or known intrusion detection system hits. In order to find something specific, administrators turned to the use of manual and basic event correlation techniques, which find relationships between two or more log entries. An event correlation system is the brain of the correlation process. According to IT Security Interviews Exposed, event correlation is the ability to analyze seemingly disparate, unrelated events that make up a concerted attack.
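The idea of relating two or more log entries can be sketched in a few lines. This example, with made-up event shapes and an illustrative time window, flags a source that produced both a scan and a login failure close together in time:

```python
from collections import defaultdict

# Hypothetical event shape: (timestamp_seconds, source_ip, event_type).
# A port scan followed by a login failure from the same source is one
# classic multi-event pattern; the 300-second window is illustrative.
def correlate(events, window=300):
    """Return source IPs that produced both a 'scan' and a
    'login_failure' event within `window` seconds of each other."""
    by_src = defaultdict(list)
    for ts, src, etype in events:
        by_src[src].append((ts, etype))
    suspects = set()
    for src, evs in by_src.items():
        scans = [t for t, e in evs if e == "scan"]
        fails = [t for t, e in evs if e == "login_failure"]
        if any(abs(tf - ts) <= window for ts in scans for tf in fails):
            suspects.add(src)
    return suspects
```

Neither event alone is alarming; it is the relationship between them that a correlation system surfaces.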

Since the mid-1990s, the marketplace has provided some solutions that address some of the problems administrators face in managing and analyzing log data. The first step toward progress on this front occurred with the development of Security Information Management (SIM) solutions. SIM provides reporting and historical analysis to generate security metrics and support security policy compliance management. SIMs collected event data from various sources, including antivirus software, IDS, IPS, file systems, firewalls, routers, servers and switches. The primary goal of SIM was to enable analysis and trending of security data (to support internal investigations) and to reduce the expense of regulatory compliance and security reporting. Its primary benefit was its ability to correlate data from multiple sources, aggregate the data, and 'translate' the event data from various sources into a common format, through XML. However, its primary limitation was its inability to surface suspicious activity in real time, distinguish between benign behavior and real threats, and display the activity visually.

As a result of some of the limitations of traditional SIM solutions, Security Event Management (SEM) was developed. SEM provides the ability to process near-real-time data from security devices and systems to determine when security events of interest have occurred. This capability provides real-time analysis of security data and helps IT security operations personnel be more effective in discovering and managing security events. The primary goal of SEM is to improve the timely resolution of security incidents.

Third Generation: Making sense of data

Log Intelligence. The best of the SIM and SEM products eventually became known as Security Information and Event Management (SIEM) or Security Threat Management (STM) products. SIEM/STM incorporated the functionality of SIM and added the ability to cross-correlate to help administrators discern between real threats and false positives, provide automated incident response, send alerts, and provide a visual console to display the aggregated, correlated information. The benefit of event correlation is simplification of events and the ability to digest large quantities of information (such as error logs). Normalizing and summarizing allow staff to review and analyze more attacks in a given time period. Staff can also take multiple events and perform trend analysis. Most SIEMs embody a myriad of security event correlation algorithms to perform real-time discovery of possible security attacks such as SCAN/DSCAN attacks, DoS/DDoS attacks, worms, flash worms, virus attacks and other intrusion attacks. These correlation algorithms are executed in real time and in parallel to discover threats as quickly as possible. Most of the event data for this purpose is therefore kept in memory for fast access during the correlation process. However, because most SIEMs only 'help' administrators discern false positives through event correlation, their primary limitation is the lack of ability to intelligently discern false positives on their own. That is, the SIEM system is only as good as the accuracy of its correlation algorithms, which in turn eliminate false positives and false negatives. Well-known attack types such as a basic scan or intrusion are easier to detect and block with up to nearly 100% accuracy, whereas sophisticated zero-day attacks or flash worms have a much smaller likelihood of being detected in a timely fashion.

Too often, the average analyst is given the near-superhuman task of perpetually monitoring, identifying and extinguishing these threats. In the same way a CFO may have to sift through worksheets of information all deemed to be 'important', 'relevant' and 'sorted', an analyst must do the same in order to identify threats. Currently, both human and computer resources have to sift through a multitude of information in order not only to identify threats but also to find 'super patterns' that identify antecedents to sophisticated threats. This manual process is seldom effectively managed because of the sheer tedious nature of the exercise. Ironically, this whole process is performed just to assess whether a threat exists and does nothing to actually solve any problems.

A New Era: HAWK’s approach

In an era in which dollars count more than ever, a true solution will enable an organization to more efficiently prevent security breaches and respond to each one appropriately. Security breaches of all types will continue to affect an organization's bottom line. It is no longer sufficient to merely respond to breaches. What is needed is a solution that enables an organization to effectively and efficiently anticipate threats.


Recently, Network World stated it best:

“ ‘Correlation’ has long been the buzzword used around event reduction, and all of the products we tested contained a correlation engine of some sort. The engines vary in complexity, but they all allow for basic comparisons: if the engine sees A and also sees B or C, then it will go do X. Otherwise, file the event away in storage and move onto the next. We'd love to see someone attack the event reduction challenge with something creative like Bayesian filtering, but for now correlation-based event reduction appears to be the de facto standard.”

Quite simply, the current marketplace tools rely on basic Boolean rule sets. Although somewhat effective, their usefulness is only as good as the person analyzing, assessing and monitoring the events to identify potential threats. Sophisticated Bayesian filtering will enable the analyst to identify threats and precursors to threats.
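The "sees A and also sees B or C, then do X" pattern quoted above reduces to a one-line Boolean predicate. The event names here are hypothetical stand-ins, but they show how rigid such a rule is: it fires on exact combinations and nothing else, which is precisely the limitation Bayesian filtering aims to overcome:

```python
# A minimal Boolean correlation rule of the "A and (B or C) -> X" form.
# The event names are illustrative, not from any vendor's rule language.
def rule_fires(seen_events):
    A, B, C = "firewall_deny", "ids_alert", "av_alert"
    return A in seen_events and (B in seen_events or C in seen_events)
```

A rule like this knows nothing about how unusual the combination is for a given network; it either matches or it does not.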

The philosopher of science Thomas Kuhn observed that individuals are unlikely to relinquish an unworkable paradigm, despite many indications that the paradigm is not functioning properly, until a better paradigm can be presented. By utilizing a Naïve-Bayesian Histogram algorithm, Dynamic Log Analysis is the next paradigm shift.

Dynamic Log Analysis. By eliminating events that are not threats, one can identify real threats. Much like a 'super-detective' who is constantly monitoring, learning and adapting to threats, dynamic log analysis is constantly performing predictive activities to eliminate false threats. As a result, the average analyst is transformed into a team of veteran 'super-detectives' with the ability to immediately distinguish a real threat from a minor daily occurrence. This proactive method reduces the probability that a network will be fully infiltrated.

Dynamic Log Analysis enables the average analyst to utilize a team of resources that can differentiate events that are not threats, so that real threats can be identified and prevented.

Dynamic Log Analysis refers to an event-driven solution that iteratively assesses the probability that certain types of events will produce a threat. Using a Naïve-Bayesian Histogram algorithm to assign 'scores' as well as utilizing Boolean rule sets, the system learns and places importance on certain types of correlated events. The system then assigns a 'score' to the threat. Dynamic Log Analysis's scoring technology determines the priority of an 'event' for alerting and responding, and its Multi-Decision Tree Matching Algorithm increases the speed of matching events to rules developed by the administrator. By combining these two processes, the time to identify, respond to and remediate an event is greatly reduced.

Scoring. Just as a team of 'super-detectives' uses their shared experiences to identify and place emphasis on significant threats, the Bayesian Histogram algorithm and Boolean rule set assign a score to define the magnitude of a threat. The 'score' is then placed in the database and the administrator is alerted to the most perilous threats. The unique total score is determined by utilizing the naïve Bayesian learning algorithm, the Boolean rule set, as well as information acquired during the normalization and matching process. All of the gathered information is taken into account before the total score is determined.

In its simplest form, the solution performs the following:

Once the event, which is any user action, log entry, security notification, or performance statistic, has been selected for processing, its contents are inserted into the database. After database insertion, the event goes through the unique multifaceted scoring process, which first includes a determination of the naïve Bayesian score by analyzing the standard deviation. The system is then able to match against those target events that have not been previously identified. In addition, this Naïve-Bayesian algorithm is specifically designed to match against known or trained information. Together, the engine establishes an operating baseline and looks for deviations against this standard norm.
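The baseline-and-deviation idea can be illustrated with a simple z-score. HAWK's Naïve-Bayesian Histogram algorithm itself is proprietary and patent-pending; this sketch only shows the underlying "how far from the learned norm" measurement the paragraph describes:

```python
from statistics import mean, stdev

# Sketch of baseline-and-deviation scoring: learn a normal rate for an
# event type from history, then score a new observation by how many
# standard deviations it sits from that baseline. This is an
# illustration of the concept, not HAWK's actual algorithm.
def deviation_score(history, observed):
    """Return |observed - mean| in units of standard deviation."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return 0.0
    return abs(observed - mu) / sigma
```

A previously unseen event type, or a familiar one arriving at ten times its usual rate, scores far from the baseline even though no hand-written rule names it.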

Next, the Bayesian score is included along with the existing event properties to be processed by the Boolean rule sets, each of which is a list of rules associated with a positive or negative score. Once a Boolean rule set is matched against a provided event, the associated score is added to the existing score, which in most cases is zero. Once all the rules have been compared against the event, a total score is determined, allowing future actions to be taken based upon the pre-configured score threshold.
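A minimal sketch of this accumulation step follows. The rules, scores, and threshold are hypothetical; the point is the mechanism the paragraph describes: each matching rule adds its positive or negative score to the running total, and the final total is compared against a configured threshold:

```python
import re

# Hypothetical rule set: each rule is a (regex, score) pair. Scores
# accumulate onto the event's incoming Bayesian score; an alert fires
# once the total crosses the threshold. All values are illustrative.
RULES = [
    (re.compile(r"Failed password"), 5),
    (re.compile(r"from 10\.0\.0\."), -3),  # internal source: downweight
]

def total_score(event_text, bayes_score=0.0, threshold=4.0):
    """Return (total_score, alert_flag) for one event."""
    score = bayes_score
    for pattern, points in RULES:
        if pattern.search(event_text):
            score += points
    return score, score >= threshold
```

Note how a negative-score rule lets an analyst encode organizational nuance: the same failure from an internal address accumulates a lower total and stays under the alert threshold.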

At this stage, the unique total score applies only to a single event. By assigning each event a unique score, an analyst is able to receive alerts on isolated, specific events that exceed a specified score threshold. In addition, isolating and assigning a unique score to each event enables the analyst to conduct trend analysis and rapidly adjust to changes in overall activity.

HAWK’s Multi-Decision Tree Matching Algorithm. In the same way a team of 'super-detectives' relies on their shared knowledge and experiences in order to quickly match threats to specific, predetermined high-risk behavior, the decision and matching technology matches the provided event to its related 'rule' faster. This technology is designed in three layers.

When an event is received by the dynamic log analysis engine, it converts the received information into a normalized event and matches it against its pre-defined rule set, which is separated into two types: compiled modules and a textual rule set. The textual rule sets are separated into three basic classifications that provide the means for matching against the rule set: triggers, rule groups, and rules. A trigger is a regular expression that must match a threat in order for the rules within the module to continue processing. If it does, the event proceeds to one of the rule groups, and within the rule group, a rule is applied. A rule contains all the information HAWK requires for improved matching, correlation, and scoring. Each rule contains the alert name, category, knowledgebase id, host and network packet information, as well as audit procedure information for compliance monitoring and scoring. The final rule, upon successful match, allows the administrator to assign the specific information to the event's normalized hash table. The final rule allows for multiple matching rules as well as use of the 'not' indicator.
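The trigger/rule-group/rule layering can be sketched as follows. The patterns, rule fields, and event shape are all illustrative stand-ins for HAWK's actual rule language, which this paper does not specify in detail; the sketch only shows how a cheap trigger regex gates the more expensive per-rule matching:

```python
import re

# Sketch of a three-layer textual rule set: a trigger regex gates a
# rule group; rules inside the group attach metadata (alert name,
# category, etc.) to the normalized event. Names are illustrative.
RULESET = {
    "trigger": re.compile(r"sshd"),
    "groups": [
        {
            "match": re.compile(r"Failed password for (\w+)"),
            "rule": {"alert": "ssh_brute_force", "category": "auth"},
        },
    ],
}

def apply_ruleset(raw_line, ruleset=RULESET):
    """Return the normalized event dict, or None if the trigger fails."""
    if not ruleset["trigger"].search(raw_line):
        return None  # trigger layer: skip the whole module cheaply
    event = {"raw": raw_line}
    for group in ruleset["groups"]:
        m = group["match"].search(raw_line)
        if m:
            event.update(group["rule"])   # rule layer: attach metadata
            event["user"] = m.group(1)
    return event
```

Because most events fail the trigger, the bulk of the rule set is never evaluated against them, which is what makes the layered matching fast.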


Once these activities have been completed, the event is passed into the processing queue for archiving, scoring and additional correlation.

HAWK’s Information Event Console. Lastly, in the same manner that a team of 'super-detectives' combines all of their respective experiences and knowledge into one shared, cohesive view to visualize the extent of the threat, the HAWK Information Event Console presents an overall view of the highest and lowest priority alerts, all arranged by severity of correlation. Further, it acts as the management and data retrieval interface with the relational database, provides historical retrieval of logged information, and, over secure encrypted sessions, provides role-based access controls.

In conclusion, HAWK Network Defense has developed this patent-pending technology that transforms the tedious and time-consuming tasks of event logging into a dynamic, powerful experience that proactively mitigates risk. Not only will analysts be able to rely on the experience of the tool to prevent threats, they will also be able to apply their own experience by writing, through regular expressions, rules that place a 'score' on specific inter-organizational nuances which are not a threat.


References

Bagchi, Kallol, and Godwin Udo. "An Analysis of the Growth of Computer and Internet Security Breaches." Communications of the Association for Information Systems 12 (2003): 684. Print.

Burns, Bryan, Dave Killion, Nicolas Beauchesne, Eric Moret, Julien Sobrier, Michael Lynn, Eric Markham, Chris Iezzoni, Philippe Biondi, Jennifer Granick, Steve Manzuik, and Paul Guersch. Security Power Tools. North Mankato: O'Reilly Media, Inc., 2007: 3, 46, 52, 221, 661. Print.

Butler, Chris, Russ Rogers, Mason Ferratt, Greg Miles, Ed Fuller, Chris Hurley, Rob Cameron, and Brian Kirouac. IT Security Interviews Exposed: Secrets to Landing Your Next Information Security Job. New York: Wiley, 2007. Print.

Carvey, Harlan. Windows Forensic Analysis Including DVD Toolkit. Rockland: Syngress, 2007: 71, 196, 254-90. Print.

"Chronicle of Data Breaches." Privacy Rights. Web. <http://www.privacyrights.org/ar/ChronDataBreaches.htm#CP>.

Cole, Eric, James W. Conley, and Ronald L. Krutz. Network Security Bible. New York: John Wiley & Sons, 2009: 727. Print.

Gregory, Peter H. CISSP Guide to Security Essentials. Menlo Park: Thomson Course Technology, 2009: 57, 215, 329, 384-87. Print.

Helm, Burt. "Picking the Best Brands." McGraw-Hill Companies, 18 September 2008. Web. <http://www.businessweek.com/magazine/content/08_39/b4101056103890.htm>.

Hinde, S. "Security Surveys Spring Crop." 21.4 (2002): 310-21. Print.

"IC3 2008 Annual Report on Internet Crime." Internet Crime Complaint Center. Web. <http://www.ic3.gov/media/default.aspx>.

"Internet Security Threat Report." Symantec, April 2009. Web. <http://www.symantec.com/business/theme.jsp?themeid=threatreport>.

Kaeo, Merike. Designing Network Security, Second Edition. New York: Cisco Press, 2003: 93-114, 261-80, 316-21. Print.


McClure, Stuart, and George Kurtz. Hacking Exposed: Network Security Secrets & Solutions, Second Edition. New York: McGraw-Hill/Osborne Media, 2000: 57-8, 406, 464. Print.

The National Institute of Standards and Technology. "Guide to Computer Security Log Management." 2006.

Peikari, Cyrus. Security Warrior. Beijing: O'Reilly, 2004: 460-4. Print.

Ponemon Institute. "2008 Privacy Management Report." 2008. Web. <http://www.ponemon.org/data-security>.

Samuelle, T.J. Mike Meyers' CompTIA Security+ Certification Passport. 2nd ed. New York: The McGraw-Hill Companies, 2009: 64, 124-9, 202, 240, 383. Print.

Santarcangelo, Michael J. Into the Breach: Protect Your Business by Managing People, Information, and Risk. Catalyst Media, 2008. Print.

Shipley, Greg. "SIEM Tools Come Up Short." Network World. Network World, Inc., 30 June 2008. Web. <http://www.networkworld.com/reviews/2008/063008-test-siem.html>.

Sperling, Ed. "Pink Slips and Data Theft." Forbes. Forbes Publications, Inc., 6 April 2009. Print.

Stallings, William. Network and Internetwork Security: Principles and Practice. Upper Saddle River: Prentice-Hall, Inc., 2002: 230-7. Print.

Wadlow, Thomas A. The Process of Network Security: Designing and Managing a Safe Network. New York: Addison-Wesley Professional, 2000: 136-9. Print.

Wilhelm, Thomas. Professional Penetration Testing: Creating and Operating a Formal Hacking Lab. Burlington: Syngress Media, 2009: 391-96. Print.


About HAWK Network Defense

HAWK Network Defense, Inc. is an award-winning global software developer, providing unparalleled expertise in preventing electronic security threats. The firm is client-focused, with a proven methodology that fosters success. HAWK Network Defense, Inc. develops its solutions by understanding each client's goals and scope, easing transitions and implementation. Founded in 2006, HAWK Network Defense has a proven, effective difference. For more information, visit www.hawkdefense.com.

About Clear Technologies

Clear Technologies is an award-winning value-added reseller of comprehensive technology solutions that encompass server, storage, application solutions, services and competitive financing. Founded in 1993, Clear Technologies began by offering information technology solutions and consulting for IBM. For more information, visit www.cleartechnologies.net.