
Joint Sensor: Security Test and Evaluation Embedded in a Production Network Sensor Cloud

Tim Owen, Rob Scott, and Roy Campbell, Ph.D.

Defense Research and Engineering Network,

High Performance Computing Modernization Program, Lorton, Virginia

A great security posture inherently requires that cyber operations employ the latest discoveries in emerging security research to keep in step with trends in attack methodologies. The most trenchant cyber security research to date employs actual network data to ensure sensing algorithms and defense methodologies are effective in real-world scenarios. This approach often requires discernments to be made as temporally close to the observed events as possible to allow rapid adaptability of the security posture upon detection of an anomaly. Traditional security architectures, on the other hand, are static and are managed as a centralized, homogenous, symmetrical framework of visibility and interception. Even though access to the data collected from such an environment provides some accessional improvement to researching new algorithms and detection methods, these incremental offline advancements are vetted in a sterile, non-real-time environment without the benefit of sequent responses or adaptive determinations accoutered by a production environment. The primary goal of the Defense Research and Engineering Network Cyber Security Test Bed is to leverage emerging network protocols and recent distributed computational techniques to create a cloud of sensors built on tractable computer server platforms that enables cutting-edge security to coexist with current security infrastructure directly inside the production network. The transition time of the latest cyber research from theory to practice will be significantly reduced while intrinsically revolutionizing the approach to engineering network security architectures. By creating a true proving ground by which the science of new algorithms and detection methods can interact directly with raw (as opposed to filtered, sensed, or captured) traffic in real or near-real time in a safe and controlled way, the proposed test bed will provide meaningful advances that can appreciably address the ever-changing landscape of cyber attacks.

Key words: Adaptive; cloud; Defense Research and Engineering Network (DREN); detection; live; network; security; sensor; test bed.

ITEA Journal 2010; 31: 499–511. Copyright 2010 by the International Test and Evaluation Association.

It is an easy sell, at least on the surface, to discuss the introduction of new detection and protection techniques into the traditional network-centric security posture. Just beneath the surface, however, it is clear that traditional architecture has been engineered with specific requirements in mind and purposefully uses a relatively simplistic model: make it persistent (or make it static), consistent (centrally manage and control it), homogenous (use the same tools everywhere), and symmetrical (see both sides of all connections) wherever possible. The solution is the result of a steady evolution from deploying detection and protections in key locations on a network boundary in a Draconian fashion based on event history to modernizing by adding layers of automation, such as firewall or intrusion protection system (IPS) signature subscription services and domain policy for maintaining system vulnerability patching for up-to-the-day readiness. Practically, a circumscribed "defensible" perimeter represents both a sphere of visibility and control and a clear demarcation of responsibility, while reducing the complexity of management of enterprise-wide security solutions and systems within the boundary. Therefore, introducing complexities into that equation can easily translate to more manpower and effort as well as the opportunity for errors that can easily hide real problems.


In contrast to traditional defense, today's threat is dynamic, decentralized or distributed, heterogeneous (using various means and methods and attack vectors), and asymmetrical. It connotes a style of using a diverse platform to launch a barrage of threats and even turn the defense mechanisms against themselves to deny service, compromise systems, malign applications, exfiltrate data, and then (as if that were not enough) use those systems to launch subsequent assailment—all without being detected firsthand and often not until long after the event. It then comes as no surprise that efforts are now placed in a variety of avenues to aid in identifying malicious content, anomalous traffic patterns, and even behaviors of the programmers developing malware. The security architecture that takes advantage of all of these developments by delivering more comprehensive and compatible features that improve detection, investigation, and mitigation is far more likely to yield the significant gains required to remain afloat in the face of cyber storms.

It is not, however, as simple as replacing one paradigm with another or one tool with a newer one. It is primarily a philosophical transition from the historical function of security to seek out and categorically block incoming antagonists to a more surgical and focused reaction to maintain as much operational normalcy as possible while defeating specific intrusions. In making that transition, tools cannot necessarily just be upgraded in place or abandoned for newer ones. The nature and function of the newer tools rely heavily on hardware and processor capabilities and fit into more fluid data communications structures not entirely compatible with the existing installed base of components, protocols, or data formats. Likewise, research and development of these new tools and techniques using only data that is captured using the traditional methods may also lack the perspective to uncover new tactics and unleash new response mechanisms. Instead, there must be a way to leverage existing capabilities to create measures of usable data, as well as grant access for the next generation of detection and defense algorithms to be proven within the current architecture with real and real-time data to smooth the transition and transform the defense strategy.

It is precisely this gap that a neoteric sensor platform can help fill by injecting research methodologies and tools into the existing architecture. A replacement for traditional sensing appliances, this solution combines current sensing techniques with an isolated modular space within which to test new tools and strategies on a multipurpose hardware platform directly within the production environment. The goals of such a device are to continue to provide existing capabilities, enhance those capabilities with small doses of new techniques for detection and protection, and significantly reduce the development cycle from research to production for quality tools. These nascent methodologies must be implemented without breaking performance or compromising operations and be directly subject to the same inimical traffic to both better sense the anomalies initially and provide clear value by uncovering threats and activities not previously detected.

Divergence of attack and defense style

Many security professionals feel the cyber defense approaches of the past need to be amended or augmented to find new attacks so that we may continue to "meet mission in the face of cyber warfare." Alternatively, some people go further to identify cyber warfare as a fifth combat arena (The Economist 2010) behind land, sea, air, and space. The latter camp builds an argument by identifying three unsound assumptions of the former camp:

• The boundary is structurally defensible (which does not account for mobility),

• The threat is more readily tractable on the existing dimensions being defended (in the face of multifarious attack),

• Automation is equivalent to readiness (which both relies on an asynchronous mechanical client update system and intrinsically trusts the content and structure of resultant code).

A read of this year's Verizon 2010 Data Breach Investigations Report (Baker et al. 2010) may in a sense reiterate the assumptions and propagate the impression that the most significant issues in cyber security are resolved by better deploying the existing technologies. Their statistics, generated in collaboration with the U.S. Secret Service as a study of existing cases of breaches, indicate that 61% of the cases were discovered by a third party, 85% were not considered difficult to accomplish, and a whopping 96% of studied breaches could have been avoided with the use of some form of low- or median-level mitigation. The conclusions were based on one glaring fact: in 86% of the cases, victims had evidence of the breach in their log files. Taken at face value, this type of study shows the need for bolstering and continuing to emend existing installed solutions that mitigate known vulnerabilities, but it does not necessarily mollify the need for a broader perspective on warfare or keeping even worse from happening. Perhaps more germane to the argument is that the method of the study might indicate a marginal disconnect or divergence between defense and attack style, simply because the data being used to develop conclusions, or in turn, new tools, was uncovered using conventional data and visibility.

Though the Verizon report shows hackers still have open to them several "paths of least resistance," simply closing those paths does not secure against the broader, more organized, mature, and insidious threat that cyber warfare proponents assert. And while the report further catalogs the expanse of the threat, the conclusions are focused on closing the known holes and implementing processes to look for more holes. While this retrospective admonition has merit, efforts based on this distractive construct of integral improvement not only diverge from the attack style but also propagate the limitation to invest in and support development of modernized capabilities. In gist, what is being asked is to fundamentally change how defense is enacted. The real imperative of cyber warfare is to defend against something of which we have no knowledge, arriving as a previously undiscovered zero-day attack, and entering on one or more vectors about which we do not know and over which conventional defense has little control or visibility. Even so, outfitted with the latest weapons on each of the vectors, the strategy focused on finding something new to block may in practice allow the attacker to use defense systems and practices against themselves, whereby an effort to block the traffic in a gross motion may result in a self-inflicted denial of service. Summarily, war in the cyber realm contends for the need to stop just defending, bring the various vectors into a unified interdependent defense model, and have concentrated reactions specific to the attack. Further maturity is then needed in a strategic shift from force protection to surgical response with focus on precision mitigation, low false positives, inoculation or learning where possible to defend against repeat or similar attacks, and effective and immediate recovery of systems that have been attacked.

The mobility and dimensionality notwithstanding, with the perceived necessary central approach to manpower, management and reporting, ease and consistency of tools (i.e., homogeneity), and automation (e.g., updates and filter list managers), the detection and pragmatic block of all possible mechanisms and signatures to avoid any potential known types of attack simply is not scalable. Antivirus powerhouse Symantec announced in its publicly available quarterly report that it created 457,641 new malicious code signatures in the second quarter of 2010, down from 958,585 in the previous 3 months (Symantec 2010). McAfee indicated earlier this year that growth in new malware recorded remains around 40,000 pieces per day (Muttik 2010). (How many signatures can an IPS run before a significant drop in performance occurs?) The signature convention of blocking all possible inroads based on historical attacks is unsustainable. Moreover, these staple products use an asynchronous method of update, whereby the new malware must be identified, submitted to the vendor for processing, and then downloaded, with a periodic client update to be installed on a computer to detect and then possibly mitigate a future infection by the known malware. Considering the incredible capabilities of worms and other attacks like Conficker and Aurora, perhaps a deficiency in the current model that is even more sobering than scale is time.

By the time the press had come out about Google being under attack by Aurora in January 2010, at least 34 other organizations had indicated they had come under the same attack. Ongoing worm research has also indicated that, through similar techniques, as many as 1 million hosts can be compromised by a worm in as little as 0.5 to 1.0 seconds (Stanisford et al. 2004). So even the most robust traditional security just does not scale to protect from initial attack and does not learn fast enough to keep the compromise from spreading. Details now known about the Aurora attack indicate that the attack was so diversified, used encryption and obfuscation techniques of a complexity not seen before, and came from such a spectrum of sources as to avoid traditional detection. In just these few examples, it becomes clear that time is not on the side of traditional processes and even the best of traditional security is not geared to detect, let alone respond to, the changing threat. To wit, after further study, botnet expert David Dagon of Georgia Tech provided a telling rejoinder that beyond the network-based security not preventing the spread, "the network is the infection" (Dagon 2005). Therefore, the defense style must, by virtue of functionality, now be transformed from an irresolute, static, and isochronal response to a dynamic, flexible, and predictive one to detect sooner and more effectively thwart the ever-changing attack.

More to the point, the evidence on all fronts drives home the requirement to bolster the advancement or replacement of traditional tools aimed at dynamically updating host defense and anomaly recognition, prime the ontogenesis of detection and containment techniques, and incite the discovery of new tools and algorithms designed to see new kinds of attacks before infiltration. Those all begin with access to data. Typically, new tools are developed using simulated data, network replay, or analysis of collected sensed data combined with data stores of log files and similar system-specific information. Any systematic approach to innovation by requirement features access to that data, at least as a tenuous first step in progressive ingenuity. However, storage directly affects capability (i.e., you can only store so much for so long) and retrospective analysis directly affects network bandwidth (i.e., all this sensed data must be backhauled to a central location for processing). Therefore, even in this first phase, generally any data collected must be limited by some suspicion threshold that triggers capture before caching locally and then compressing for transmission to the depository. Care must be placed on the right rule sets to balance the amount of data being captured and the bandwidth and storage required to retain it. (At what decay rate does the data collected from such attacks become unusable for improvement or development of new strategies?)
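As a rough illustration of the capture economics described above, the Python sketch below shows a threshold-gated capture loop: score, cache locally, then compress for shipment to the depository. The scoring heuristic, threshold, batch size, and spool path are illustrative assumptions only, not the DREN rule set.

```python
# Illustrative sketch only: a suspicion-threshold gate in front of local capture.
# The score_packet() heuristic, threshold, and spool path are hypothetical.
import gzip
import time

SUSPICION_THRESHOLD = 0.7                    # capture only what the rule set scores above this
SPOOL_PATH = "/var/spool/sensor/capture-{ts}.gz"   # hypothetical local cache location

def score_packet(pkt: bytes) -> float:
    """Placeholder for whatever detection logic assigns a suspicion score."""
    return 1.0 if b"\x90\x90\x90\x90" in pkt else 0.0   # toy heuristic (NOP sled)

def capture_loop(packet_source):
    """Gate, cache locally in batches, then compress for transmission upstream."""
    cached = []
    for pkt in packet_source:
        if score_packet(pkt) >= SUSPICION_THRESHOLD:
            cached.append(pkt)
        if len(cached) >= 1000:              # flush in batches to bound local storage
            path = SPOOL_PATH.format(ts=int(time.time()))
            with gzip.open(path, "wb") as fh:
                fh.write(b"".join(cached))   # the spooled file is later shipped to the depository
            cached.clear()
```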

The other shoe

"Botnets" are not just prevalent today but all the rage, creating a marketplace that is both highly sophisticated and inexpensive. It has been reported widely in recent months that hackers are having what some call a "fire sale," whereby an interested party can buy a "botted" computer for a slice of time for as little as $0.02. That creates an inexpensive attack infrastructure that is not only voluminous and widespread but also highly adaptive and dynamic. Imagine being an upstart hacker needing to test a new algorithm, a distributed denial of service, or a malware solution for data exfiltration from a company or even a federal agency. Being able to rent a large infrastructure of botted machines from around the world for the price of a few hamburgers would surely facilitate a large-enough attack source or malware hosting facility with sufficient obfuscation as to provide immediate results on the validity of the code while avoiding detection of the actual traffic going into and out of the Internet access gateways, let alone tracing of the sources or criminals moving as quickly as the shadow of the cloud under which they hide (Delbert et al. 2010). Further, the high availability and variety of different systems in diverse locations make it possible for hackers to rent the appropriate facility to reduce the hacker test-to-production life cycle for their malicious wares. The proverbial shoe is clearly on the other foot.

In stark contrast, researchers in retrospective-based defense system development rely heavily on large, expensive, summarily classified or otherwise unavailable (and as a result, often quite stale) data sets with rigid posture and limited scope to test their ideas. Trying out new tools is not only complicated and time consuming but also costly and often delayed enough that researchers cannot know whether the ideas will bear results until just before or even after the finished product is sent to market. Even more likely, the research takes so long and costs so much money that the resultant technology is already stale or unusable and is insufficiently up-to-date with the malicious world against which it was being designed to protect. Behavioral heuristics (detection of traffic anomalies), code obfuscation and encryption detection, infection detection and quarantine, and cyber genetics certainly all have value and are being funded and pursued more than ever, but all face this same dilemma. Without early testing of ideas on quality offline data, intermediate validation of budding algorithms using increasingly real-time traffic, or full-scale evaluation of the resultant solution in real time in a real environment, the cycle from idea to production expands to a nearly untenable and mostly unsustainable ambit. It then seems relatively imperative to find new ways to get data to the science or, even better, get the latest science more fugacious access to the data to bring that science to market faster. Further, engaging the mature tool sets in this final stage of access to contextual traffic for evanescent validity, the construct is finally broached for ongoing support of the research, stimulating both evolutionary and revolutionary adaptation to the threats while improving the effectiveness of the installed security base.

Newer defense systems generally begin to abandon signatures that require mechanical updates to a large database running on the appliance, supplanting them with software or even Application-Specific Integrated Circuit (ASIC)-based algorithms that look for particular behavioral patterns in the traffic or anomalies in the data. The certain ways a malicious code will try to behave to avoid detection, phone home, receive remote instruction, or seek to further infiltrate are more and more known by analysts and researchers, to the point of guessing that if that pattern of traffic occurs, it may well be malicious in nature. Some researchers have gone further to indicate that programmers operate with similar behavioral patterns and are more easily detected. The desire to remain anonymous, to fragment traffic to avoid header analysis, and the sheer numbers of resources around the world to enlist as an attack source are all clear indicators of a need for suspicion, and inclusion of these characteristics reduces false positives significantly. In some cases, these kinds of indicators are being built into the tool sets scanning data in retrospect and into IPS and content scrubbers watching traffic patterns, but in other cases, it is a unique research algorithm searching public information for human behaviors and configuration patterns. For example, how a Domain Name Service (DNS) server is configured in support of a domain might indicate the administrator's desire to remain low key. A broad study of publicly available DNS information may uncover a trust model or likeliness of becoming malicious at a later date. Cordoning off traffic to and from those addresses, domains, or network neighborhoods for more intense scrutiny would be more palatable because it reduces the amount of traffic that requires such scrubbing and the hardware and bandwidth required to watch with intensity.

This particular type of discovery is not exactly easy to do but does not necessarily require access to live traffic. Yet information being learned from such research becomes a more critical part of the overall protective mechanism and must be integrated into the comprehensive defense posture. Were the shoe on the other foot, or more precisely, were the shoe in the production environment designed to assimilate or accommodate the other shoe being developed across multiple research communities, the ability to test the capability on the operational foot would provide profound help in bringing both capabilities to bear on protecting the network. Unfortunately, the current architecture is often device based, or even ASIC based, and introduction of new algorithms and tools is not simple. And incorporation of the solutions others have found is even more incoherent (reminiscent, to maintain the allegory, of the intelligence fiasco in The Tall Blond Man with One Black Shoe). Instead, research from various fields and intelligence gained from those fields need to be married in a new approach to dynamic security posture.

In complement, several security vendors, as well as federal institutions and federally funded university institutions, have made significant strides in Internet Protocol (IP) trust models. By using both collected data from a worldwide installed base of sensors and firewalls, as well as security alerts and massive data stores of attack data, researchers have been able to create incredible databases used to categorize bad agents, agents who may work with bad agents (guilt by association), and hosting or network providers supporting bad agents (autonomous systems that provide a safe haven for agents that generally do bad things). Cognate to these approaches, innovations such as Milcord's Botnet Threat Intelligence have been able to identify malicious content providers by rapid changes in DNS information or the hosting of large numbers of domains across a very small number of IP addresses, a behavior known as fast flux (Caglayan et al. 2010). Colorado State University's Border Gateway Protocol (BGP) Monitoring System (Yan, Massey, McCracken, and Wang 2009) has vastly improved the capability of identifying suspect organizations based on route changes found in BGP route tables from around the world. And there have been several ventures into genetic and immunization models for dynamic defense, worm quarantine, and other means to identify and prevent the spread of malicious worms and Trojan-based attacks. These efforts, and others as they arise, need to be verified in the real environment, as well as incorporated into the overall attack strategy of the enterprise.
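As a concrete illustration of the fast flux behavior just described, the following Python sketch flags domains whose answers churn across many short-lived IP addresses or that pile onto IPs already hosting many other domains. The thresholds and the passive-DNS input format are assumptions made for illustration; this is not Milcord's (or anyone else's) actual detector.

```python
# Sketch of a fast-flux heuristic: many short-TTL A records per domain, or domains
# crowded onto IPs that host large numbers of other domains. Thresholds are illustrative.
from collections import defaultdict

def fast_flux_candidates(dns_observations, min_ips=10, max_ttl=300, crowd_limit=100):
    """dns_observations: iterable of (domain, ip, ttl) tuples, e.g., from passive DNS."""
    ips_per_domain = defaultdict(set)
    domains_per_ip = defaultdict(set)
    low_ttl = set()
    for domain, ip, ttl in dns_observations:
        ips_per_domain[domain].add(ip)
        domains_per_ip[ip].add(domain)
        if ttl <= max_ttl:
            low_ttl.add(domain)
    suspects = set()
    for domain, ips in ips_per_domain.items():
        # flag rapid churn across many addresses, or IPs shared with very many other domains
        crowded = any(len(domains_per_ip[ip]) > crowd_limit for ip in ips)
        if (len(ips) >= min_ips and domain in low_ttl) or crowded:
            suspects.add(domain)
    return suspects
```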

The Defense Research and Engineering Network (DREN), being the wide area network service provider for the Department of Defense's Research, Development, Test & Evaluation community, is in a unique position to collaborate with these types of research in pursuit of the development of a robust and dynamic security posture servicing a mature and complicated environment. This solution is principally delineated into five elements:

• Identify (mainly through intelligence) historically or potentially malicious players (some through behavioral analysis), networks, or "neighborhoods" before they attack, and handle traffic to or from those entities appropriately;

• Detect or sense suspect traffic patterns as early as possible to identify and contain events as they are under way;

• Combine near-real-time alerting and response (or countermeasures) with retrospective analysis to get to quick reactions as events occur (or are suspected to be occurring), as well as full details shortly thereafter;

• Use dynamic networking capabilities to create a tiered approach to protection and prevention, creating a mechanism for best-in-class protection being provided by different devices, potentially even in different locations;

• Facilitate the research in all these areas by providing access to data with appropriate controls and platforms so that new solutions can be proven and brought to bear sooner.

Through collaboration on several diverse projects with other federal and federally funded agencies and programs to pursue each of these elements, DREN can readily vouch that significant research has already been done in many areas such as improved analysis, heuristic-based detection, and even behavioral analysis of the attacker, all to provide increased intelligence of who the bad guys are, where they tend to live, and what they look and act like. Those data points and projects have increased and continue to help increase the vital ability to defend the ever-changing environment in pursuit of an ever-evolving mission. Integration of these research elements, by redeveloping the architecture to make it compatible with the influx of intelligence and rethinking the data structures to support analysis from both internal and external sources, provides the pivotal pieces to migrate the traditional architecture in pieces to a more robust and nimble posture. Being a wide area provider, however, puts DREN in a unique position to focus our efforts on the final of these elements, namely, getting data to offline scientific research and getting the results of that science into the environment for proof and innovation. For this reason, the Joint Sensor (JS) project was initiated to concentrate our proficiencies for the betterment of cyber research in pursuit of that grand defense solution.

The DREN JS

The crux of the DREN security architecture that spans all five critical elements of strategic defense is the invention of a new multipurpose platform, the JS (see Figure 1). This platform services both real-time and retrospective analysis tools, will be fed in places by dynamic network redirection of suspicious traffic identified through intelligence feeds and other mechanisms, and will make access to data more readily available for proving research in a real environment. While the main idea being conveyed is to allow testing assets to sit next to or on top of a device already providing production sensing, the JS platform is more broadly valuable in that the platform is capable of supporting some of the more rudimentary requirements posed by a security operation with a dynamically changing deployment strategy—namely, full or partial reimaging, traffic-specific sensing, and functional participation in a cloud-based distributed environment.

The key to the JS project's success over the various historically incompatible uses (operational, data mining, and research algorithms) is its component design as an appropriately sized, multipurpose computing platform with data connectivity allowing for the collection of sensors to act as a fully capable, distributed, node-parallel computing platform. To gain the advantages of this, various methods of data separation and protection must be applied internally to a system and then externally to how they are connected to the network. The system's specifications were also selected as being able to support the more recent advances in virtualization and process isolation, making it possible for the various elements to coexist without treading on one another. Then, in addition to having enough capacity to operate correctly when at full network flow load, the design of the system allows for other uses when the network load is normal and far less than full capacity.

Figure 1. Joint Sensor functional architecture.

Computational techniques and hardware for a given price point have advanced far enough to apply better than simple signature techniques to improve cyber security. Given the multicore, large memory system that is required to deal with a high-bandwidth or denial-of-service attack, a significant amount of processing power would be available at all other times for nonoperational uses. Classic design of sensing requires many sensors located where the network data is. This means that the JS systems across DREN mostly will be available as a large, distributed-processing computing capability. Advances in scheduling, memory use, data mining, and parallel techniques would allow this capacity to perform research and test functions without interfering with operational function as a classic security sensor. A key advantage to this approach is that the same network flows (Rajahalme, Conte, Carpenter, and Deering 2004) that are being examined in the normal ways can be examined, mined, and vectorized by both emerging applications and research sampling methods.

In addition, with the bandwidth available at off-use periods on the high-performance DREN network, both existing and new methods in the area of data cross-correlation can be researched and developed as a stage toward integration with a response mechanism. When combined with tools such as BGP Flow Specification (FLOWSPEC) and other network data copying and redirection techniques, network flows of interest can be sampled or piped through other resources such as those available from the High Performance Computing Modernization Program for even more advanced and intense algorithms. These algorithms, working on live data that is representative of both normal and intrusion-type flows, can lead to new techniques of detection, elimination, and even potential cleansing to deal with the ever-changing threats. Even in the area of existing operational sensing, the JS project can add a new capacity by providing communications and computational frameworks for doing simple distribution and redistribution of signature and threat analysis processing across all nodes, shifting work from heavily used systems to barely used ones. Techniques can be applied to cross-correlate data and findings across all nodes such that, for example, network flows seen in more than one location are only processed or analyzed once in the path. Moreover, signature hits can be multicast to the other nodes to increase monitoring of related flows or to change scheduling in recognition of a high load event spreading across the network.
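One simple way to realize the "process each flow only once" idea is to let every node hash a normalized flow key and defer to whichever node the hash selects. The Python sketch below is a minimal illustration under that assumption; the node names and the bidirectional key normalization are hypothetical, not the JS implementation.

```python
# Sketch: every Joint Sensor node hashes the normalized 5-tuple and only the node
# the hash selects performs deep analysis, so a flow seen at several locations is
# analyzed once. Node names are hypothetical.
import hashlib

NODES = ["js-east", "js-west", "js-central"]     # the sensor cloud membership

def flow_key(src, dst, proto, sport, dport):
    """Normalize so both directions of a conversation map to the same key."""
    a, b = sorted([(src, sport), (dst, dport)])
    return f"{a[0]}:{a[1]}-{b[0]}:{b[1]}-{proto}"

def owner(key, nodes=NODES):
    digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return nodes[digest % len(nodes)]

def should_analyze(my_node, src, dst, proto, sport, dport):
    return owner(flow_key(src, dst, proto, sport, dport)) == my_node

# Example: exactly one of the three nodes returns True for this flow.
print(should_analyze("js-east", "10.1.1.5", "192.0.2.9", "tcp", 52344, 80))
```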

Data separation in the joint sensors will have many facets. Initially, the standard mechanisms are process core binding and data isolation, combined with hardware-supported memory protection. The operating system was selected to be able to take advantage of securing mechanisms available now or being added incrementally, such as security-enhanced Linux, containerization, and full virtualization. Since a large reason for both the operational and the research use of a JS system is to capture the full network traffic at the location, a mechanism will be developed to act as data controller for that network capture. A single capture will be passed to one or more existing modules or methods of analysis. A more controlled and possibly restricted copy of the data would be made available to research algorithms on the system. A sanitized and restricted data set could be made available on the system (most likely to a separate virtual machine on the system) for use by external and affiliated researchers. In addition to all of these methods local to the system, the data controller could send a full or subsampled set of the data using an encrypted path to other resources for further analysis or research algorithm processing. This last method could also be used to make diagnostic captures based on a filter definition fed from a remote, fully authenticated control station. Similarly, controls could be passed into the system, such as to satisfy data requests from law enforcement, which could include instructions for the data controller to apply special encryption to the data and pass multiple copies to distinct and appropriately controlled archival systems.
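The following Python sketch illustrates the data-controller concept just described: one capture fanned out to consumers with different levels of restriction. The class, the toy sanitization rule, and the consumer categories are assumptions used for illustration, not the actual JS data controller.

```python
# Sketch of the data-controller idea: one capture, fanned out with different levels
# of restriction. Consumer categories and sanitization rule are hypothetical.
import hashlib

def sanitize(pkt: bytes) -> bytes:
    """Toy sanitization: keep the first bytes (headers), replace the payload with a
    hash so external researchers see structure but not content."""
    header, payload = pkt[:40], pkt[40:]
    return header + hashlib.sha256(payload).digest()

class DataController:
    def __init__(self, operational, research, external=None, exporter=None):
        self.operational = operational      # full-fidelity, on-box analysis modules
        self.research = research            # restricted copy for local research algorithms
        self.external = external            # sanitized copy for an affiliated-researcher VM
        self.exporter = exporter            # optional encrypted off-box path

    def dispatch(self, pkt: bytes):
        for module in self.operational:
            module(pkt)                     # existing sensing keeps its full view
        for module in self.research:
            module(pkt)                     # could be further filtered per agreement
        if self.external:
            self.external(sanitize(pkt))    # never sees raw payloads
        if self.exporter:
            self.exporter(pkt)              # e.g., encrypted diagnostic capture
```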

All processes will be fully vetted before deployment on the JS. In addition, extra steps can be deployed to ensure proper function even while running a research module. Some such modules can be subjected to additional memory and processing restrictions (core affinity, central processing unit utilization, and memory allocations), as well as techniques such as memory leak monitoring and process destruction, to ensure no deleterious effects on the mainline processing of the system. In addition to these mechanisms within the operating system, new techniques provided by libraries and processor features will add virtualization capabilities to provide further isolation. These methods include containerization, which allows a process to run on the main operating system but with no access except as defined to the operating system and leaves the process unaware and incapable of interaction with other processes on the system. A further step would be to do full virtualization, where a module would exist with its own operating system and copy of the data without interaction with the host operating system, any process on that host system, or any data or process of another guest virtual machine. This technique also allows for any Intel-based operating system or appliance-like package that is completely different or incompatible with the RedHat Enterprise host operating system to run locally and have the captured data set available (using internal host or guest network interfaces).
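For the per-module restrictions mentioned above (core affinity and memory allocations), standard Linux facilities exposed through Python are enough to sketch the idea: pin a research module to spare cores and cap its address space before launching it. The core numbers, memory limit, and module name below are illustrative only.

```python
# Sketch of host-level containment for a research module on a Linux host: pin the
# process to spare cores and cap its address space so a runaway algorithm cannot
# starve the operational sensor. Core numbers and limits are illustrative.
import os
import resource
import subprocess

RESEARCH_CORES = {6, 7}                      # cores not used by mainline sensing
MEMORY_LIMIT = 2 * 1024 ** 3                 # 2 GiB address-space cap

def run_research_module(cmd):
    def restrict():
        os.sched_setaffinity(0, RESEARCH_CORES)
        resource.setrlimit(resource.RLIMIT_AS, (MEMORY_LIMIT, MEMORY_LIMIT))
    # preexec_fn applies the restrictions in the child process before exec
    return subprocess.Popen(cmd, preexec_fn=restrict)

proc = run_research_module(["python3", "research_algorithm.py"])   # hypothetical module
```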

These systems are homed to the DREN network. By its mission, DREN is a high-performance, high-capacity network to transport Department of Defense research and development, science and technology, test and evaluation, and modeling and simulation data. As the transport and security of the data results of the sensor function need to be assured, this is another place where separation and control techniques need to be used. The DREN architecture provides several mechanisms that are useful to this need. First, multiple layer-3 IP virtual private networks can be used with varying amounts of separations to ensure connectivity and monitoring of the JS system; and delivery of its data can be handled in a separate and preferentially queued way. The path of data to CERT operations and internal research systems would use this method. The redirection and cloning of data using BGP FLOWSPEC techniques would also use similar layer 3 methods available. At the next level, DREN can provide isolated layer 2 paths. Using virtual private local area network service in a configuration developed for another project on DREN, the console implementation will use a layer 2–separated path from the operations control points to the sensors while not allowing traffic between the sensors. This console implementation provides connectivity to the onboard integrated Dell Remote Access Controller interface, which implements a full Intelligent Platform Management Interface 2.0 capability and then some. What this means is complete control of the system from a remote location with console functionality—both serial and graphic, as if locally connected—but only from predetermined locations. In addition, this has the capacity to mount a remote image that appears as if a digital video disc was inserted into the system.

Using this combination of tools, a complete system boot from the remote image; preparation, install, and customization of the operating system; and inclusion of all add-on modules can occur across DREN in a private, controlled manner. This can occur remotely in about 35 minutes, in contrast with 25 minutes when using local media in the local installation lab setup. This capacity not only provides full control to ensure that the sensor remains fully functional at its operations mission but also allows it to be adaptable and even completely remoldable without significant shipping costs, travel, distributed manpower, or downtimes that are longer than necessary. Since this is a limited virtual private local area network service deployment of layer 2 connectivity, its separation from any other DREN function is high, and the locations with access to this remote console capability can be tightly controlled to restricted, on-network sites. Using the system's other interfaces, additional paths will be set up to manage the system, collect data for the operational CERT functions, communicate with the other JS systems, and have the capacity to set up temporarily unique captures and data paths over DREN. Having these interfaces and paths allows the use of the features inherent to DREN to provide high-capacity, secure communications where data protection and integrity are ensured.

Expectation engineering

The final piece of the sensor bed puzzle is engineering the willingness to support such an intricate solution. The success of even getting such an emergent test bed deployed within an operational environment boils down to three key elements, at least in terms of bringing to bear the right framework to create and sustain the environment, as well as to provide sufficient verdure to attract willing parties and sustain harmonious living within it. These three elements are as follows:

• Providing access for the researchers to real operational data sets (traffic, data store, central storage, or other appropriate capability, whether on the device or in a controlled shared space), as well as to the test sensor packages in the cloud for managing and making changes to the product;

• Indirect but immediate sharing of algorithms to security operations that provide visibility into attack vectors not otherwise seen using traditional sensing and showing intrinsic value for the arrangement;

• Guaranteeing some level of control but, more importantly, significant levels of visibility for network and security operations into the function of test capabilities and the process whereby the first two key elements are managed and delivered.

In an environment where security is prime, there has historically been a separation and isolation between operations and research, usually upheld in reality by dividing activities on network segments (e.g., demilitarized zones, border zones, and lab networks), as well as temporal separation between live traffic sets and data being offered for research. Simply put, there has been limited access to the live network by anything other than stable and secured applications and devices. To facilitate this test bed and provide benefit to both operations and research that are nearer real time and lasting in effect, some of those conventions need to be broken down, and the research must be meticulously inserted to maintain the original character of security. At the outset, this translates to not only best practices but also sophisticated operational security measures on the joint sensor, as well as in the processes of securing the applications before use. To make network security operations a willing participant, these additional functional requirements for the sensor test bed mean leveraging stable and tried operating systems, middleware, and application configurations in the field. Perhaps far more important to success and willingness, there needs to be visibility access granted to security operations—not just into the additional software but also into the processes and procedures of how those elements are managed and maintained. Controls must be implemented on how information is shared with the software and, in turn, from that software to its management systems, source coders, and stakeholders. But more rudimentarily, this leads to an emphasis on visibility into the process of fielding a package into the sensor test bed so that operational security can inject reviewing hinge points and affect policies at various review stages. An engineering resource internal to the organization that will take the research participant, along with the operations participant, through the process from concept to field trial gives all parties sufficient voice to ensure the solution is engineered within expectation and guidelines.

This progression is also a phased approach, whereby the research participant begins with access to sterile data with which to run algorithms to do a rudimentary proof of concept offline, followed by an experimental initial offering in a lab environment using real but not real-time data, quickly proceeding to a similar scenario where live or nearly real-time streams of operational data are tested for verification of algorithms, as well as constructive processes such as management and alerting. These earlier phases give the researchers the opportunity to test their theories before expenditure of operational man-hours and resources for field deployment and to create a more trusted expectation once field testing is approved to begin. Through this phased process, another of the key elements is awarded participants: algorithms and actual results using live data can supply researchers with validation of the algorithms and demonstrate to operations with evidence that these algorithms are of value. Once in the field, a more intimate relationship between researchers and operations (or at least the output of the test sensor and the input of the operational security mechanism), brokered by the internal engineering capacity, will give the security participants more immediate value by finding issues their tools would not otherwise have found. Conclusively, these algorithms, running in parallel to existing capabilities, provide a number-for-number cross-correlation of results, false positives, and detection rates as all are subject to the same traffic.

Ultimately, the grandiose concept of a cloud of test sensors built on the back of the production sensors requires as much operational nuance as it does technical innovation to ensure environmental policies are enforced, participants have valid expectations, and real results have both immediate and long-term impact. In the DREN JS project, the supporters of the former methodologies have been enlivened by the opportunity to provide input to the building of the sensor and the process whereby the sensor will be managed, and part of that enlivenment was directly created by an enlightenment of seeing new sensor technology find real issues in data that the incumbent technology had not seen. Billing the new technology not as a replacement of the existing methodologies but, instead, as an enhancement to them would not have created willingness without the proof of real data and visibility into how the system would fit into the architecture.

Dynamic network support

In general, the effectiveness of intrusion detection systems, intrusion prevention systems, and even firewalls is like real estate—location, location, location—and in general, prime location is at the enterprise edge facing the wide area network. In the case of DREN, a wide area network service provider, the edge is an asymmetrical collection of geographically diverse network access points connecting the network to a variety of upstream and downstream entities, such as tier 1 and regional Internet service providers, direct and private peers, and of course, customers. This renders the key location to see the most traffic a less-than-optimal location to see both sides of any given conversation. As best current practice for a wide area network is still "hot potato routing" (getting packets off your network as fast as possible via the closest connection along the path to the destination), the capability to synchronize bidirectional conversations is impractical and nearly impossible to manage. However, significant capabilities in network equipment can now facilitate "symmetrizing" predetermined connections or "flows" such that the traffic to or from a particular prefix or protocol of interest can be dynamically redirected to a remote-triggered black hole, sink, or scrub (see Figure 2).

A remote trigger is a means to dynamically tell a network device (usually a router) to redirect traffic with certain parameters (source IP address, destination IP address, IP protocol, and transmission control protocol/user datagram protocol source and destination port) as it hits a filter or access list, sending this traffic to a black hole (a means to drop the traffic), sink (a collector designed to capture the traffic for analysis rather than just drop it), or a scrub network. A scrub is a collection of tools that can be in line at a separate location somewhere in the network cloud used for monitoring or performing stateful inspection, intrusion protection, or content filtering, and allowing valid traffic to proceed unchanged to its original destination.


This scrub network redirection is facilitated by configurations and protocols designed to temporarily modify the path of the traffic within the network without giving any indication to external entities that the redirect is occurring. Even better, because the network can be affected bidirectionally, traffic through the scrub is now symmetrical, and both sides of the suspect conversation can be monitored through a single inspection point.

Because this redirection is now available, it is no longer required that you have all the right tools at every possible location, and tools at specific locations can be specialized to focus on particular protocols or traffic types. For example, consider a subscription service to malicious URLs or an intelligence feed of alerts about suspected "botnet" addresses. It is possible to inject this intelligence into the network so that the boundary devices (facing the ISP and the customers), upon receiving any packets associated with these suspect addresses and/or protocols, can redirect them to the scrub (or sink). There the tools can now see both sides of a connection in order to accurately determine malicious content. For the purposes of a sensor test bed, you no longer are required to have sensors at every site to be of significant value. An algorithm that focuses on malicious web traffic, email scanning, DNS attacks, or any application-specific determinism can now sit in a single or a few locations and achieve complete visibility into all interesting traffic for focused testing.

One particular element, a new type of network layer reachability information in BGP known as FLOWSPEC, allows a system (such as an sFlow collector or analysis tool) to publish "rule sets" in the fashion of particular flow or traffic parameters in a BGP session with a specific action (such as discard or redirect). This gives the security operations the capability of dynamically updating filters on routers across the wide area simultaneously. Used in conjunction with virtual route forwarding tables and protocols like multiprotocol label switching, FLOWSPEC enables dynamic blocking—or better yet, redirection—based on information gained from outside sources, deep packet inspection, or retrospective analysis. In conjunction with a sensor test bed, particular traffic patterns of interest can be "symmetrized" and sent to a set of tools for better isolation and tighter focus of research algorithms. In the context of the phased approach mentioned earlier, dynamic redirect could surgically separate known-suspicious packets and send copies or temporarily divert that traffic through a controlled, laboratory-type environment without having the test software residing in the production environment or touching valid, sensitive data.
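To make the FLOWSPEC idea concrete, the Python sketch below turns an intelligence feed of suspect prefixes into match/action rule sets. The rule rendering is a generic dictionary form, not the syntax of any particular router or BGP speaker; in practice such rules would be announced over a BGP session by a FLOWSPEC-capable speaker and acted on by the routers.

```python
# Sketch: translate an intelligence feed into FLOWSPEC-style match/action rule sets.
# The action label and rule structure are illustrative placeholders.
SCRUB_ACTION = "redirect-to-scrub"            # placeholder for a redirect/discard action

def flowspec_rules(suspect_prefixes, dst_port=None, action=SCRUB_ACTION):
    """Build one match/action rule per suspect prefix from an intelligence feed."""
    rules = []
    for prefix in suspect_prefixes:
        match = {"source-or-destination": prefix, "protocol": "tcp"}
        if dst_port is not None:
            match["destination-port"] = dst_port
        rules.append({"match": match, "then": action})
    return rules

# Example: divert suspected botnet web traffic to the scrub for symmetric inspection.
feed = ["198.51.100.0/24", "203.0.113.7/32"]
for rule in flowspec_rules(feed, dst_port=80):
    print(rule)
```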

Figure 2. Dynamic redirect using Border Gateway Protocol Flow Specification.

As an aside, this capability can change with the perspective of the researcher. As indicated earlier, it is possible to redirect traffic based on an intelligence feed, the outcome of internal analysis, or other such means of identifying peculiar or suspicious traffic. Likewise, just this part of the overall mechanism is a means to research emerging algorithms or tools in that the choice of what traffic is diverted to a particular sensor or scrub center may be determined by the work of a new tool or even an outside research project. Today, feeds from Milcord, security vendors, lookups from the BGP Monitoring System, and others are all possible candidates for a redirect to an algorithm-specific sensor or protection system, making it readily possible to divide and conquer and thus reducing overall performance and bandwidth requirements on any individual system and constructing a defensible boundary one protocol or application at a time. Other methods for identifying things for redirection can also be tested safely, including research such as cyber genetics, man-in-the-middle botnet investigation, and immunity algorithms for identifying bad patterns and other negative characteristics. This thinking goes quite well with the distributed parallel computing capabilities of a collection of joint sensors across DREN.

In the end, dynamic network support is critical to facilitating the next generation of traffic protection, sensing, and enterprise-wide dynamic security architecture with a focused attack response. In the meantime, for the purposes of a test bed, it becomes quite powerful in facilitating the first phases of proof of concept for new algorithms and tools before they are put into the production realm. This extra step in the process provides sufficient ‘‘warm up’’ time for the security operations teams to assess or remove any questions before putting any risk on the network. Similarly, the ability to send only the traffic that needs to be seen by the particular algorithm, selected specifically for its suspicious or known malicious nature, means the tool does not have access to sensitive data but still gets sufficient real and real-time traffic to perform the research. Any anomalies detected or protections proven through this dynamic redirect give the research considerable value and provide security operations tremendous insight into the function of these new tools, without putting them in production.

Next steps

The first opportunity to improve the development cycle of interesting new tools from concept to production is to provide data sets to researchers for early analysis. Once the algorithms that are implemented in the tools are proven and improved through access to production traffic, the logical next step is to develop a means to incorporate what the algorithm detects into the overall, aggregated analysis and response system of security operations. Systems and algorithms developed through research around the world result in new intelligence feeds and alerts that can feed the central aggregate analysis in the production environment, as well as the rule sets indicated in flow data and BGP FLOWSPEC deployments. Likewise, these tools, being developed and possibly deployed as products in other networks and research environments, should then result in new alert feeds made available to this network as production tools. The aggregation of these data streams is critical to the next generation of security architectures.

A project under way at the Naval Research Laboratory in Washington, D.C., is taking this concept and creating a sort of cross-correlation system. Tools such as host-based security systems, firewall and IPS databases, malicious uniform resource locators, and other subscriptions are all being synchronized to create a multidimensional set of target parameters. DREN has a similar function under development, whereby a scripting system is used to indicate whether particular IPs are showing up in multiple intelligence feeds. Certainly, any IP, prefix, or autonomous system number that appears in multiple lists should be regarded as a more serious threat and can be more closely scrutinized. In addition, taking data from multiple solution sets and various vectors can help create a richer attack vector analysis and present analysts and dynamic watch systems with a sort of trust model of dangerous protocols or ‘‘bad neighborhoods.’’ Just as long lists of individual known bad IPs are hard to digest and incorporate into a watch list or analysis tool, too many tiny fragments are that much harder to distribute through protocols on the network. Therefore, being able to combine knowledge from multiple sources and different types of information into a general attack and protection pattern simplifies the architecture and provides a more robust response system.
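A minimal sketch of this scoring idea, assuming hypothetical feed names and a simple appears-in-two-or-more threshold (neither drawn from the NRL or DREN systems), might look like the following Python fragment.

```python
from collections import Counter

# Hypothetical intelligence feeds keyed by source; values are indicator sets.
feeds = {
    "host_based_security": {"198.51.100.7", "203.0.113.9"},
    "firewall_ips_db":     {"198.51.100.7", "192.0.2.55"},
    "malicious_urls":      {"203.0.113.9", "198.51.100.7"},
}

# Count how many independent feeds report each indicator.
hits = Counter(ip for feed in feeds.values() for ip in feed)

# Anything seen in two or more feeds is escalated for closer scrutiny.
escalated = sorted(ip for ip, count in hits.items() if count >= 2)
print(escalated)  # ['198.51.100.7', '203.0.113.9']
```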

As previously indicated, dynamic protection also engenders the focused response as a measurable outcome to be researched during this project. The researchers who are creating the algorithms should be working in concert with the security professionals to develop response mechanisms and methodologies as part of developing the detection and prevention strategies. Mitigation recommendations and progressions for various traffic types or attack intensities, reduction of false positives, inoculation against repeated attacks, and means of recovering from infiltration must be incorporated into the defense strategy as each technique for discovery is pursued. Expecting the vendors and researchers to provide both new alerts and sustainment is critical to the thought processes required in making a marketable product, as well as an integrated tool for use in the environment.


One next step in the sensor test bed project is to develop the mechanisms and processes whereby potentially suitable research initiatives can begin taking advantage of this new solution. These must be finalized and put into a quality management system. The goal of this programmatic development is to better understand the nuances of how to select the most valuable and most mature tools first and get immediate benefit from the program. In parallel, as a measure of effectiveness of the program and the tools it produces, we must also focus on improving the political relationships and creating solutions that are more immediately usable by the broader federal community, even while still in the infant stage. The tools being produced through the test bed should not be limited to use within DREN and should also promote the development of (or compatibility with) community-centric capabilities such as shadow-mirror databases, standardized data structures and formats, reporting templates, and alert communications systems. A tool being developed through the Joint Sensor test bed may bring to bear components where national or vendor-specific collections of attack signatures and threats, like McAfee’s IP Trusted Source or Symantec’s AV database, would benefit the community greatly and much earlier than traditional procurement processes afford. A shadow-mirror is a duplication of a vendor database that receives updates from the vendor system (being populated by submissions and alert feeds from all over the world) but then becomes the internal collection point for new additions discovered in the wild within the community (rather than reporting them to the publicly available system). It is called a shadow because the sensitive information from the environment is kept internal, but the lessons learned (from attack) are available from both inside and the Internet at large. Any tool that is introduced into the test bed would also be required to create a module for this central collection and continue to provide updates for the life of the product, whether it is eventually sold into the federal market or not.
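For illustration only, a bare-bones Python sketch of the shadow-mirror behavior described above might keep the mirrored public signature set separate from indicators discovered inside the community; the class and method names are hypothetical and do not describe any vendor product.

```python
class ShadowMirror:
    """Internal duplicate of a vendor signature database (illustrative only)."""

    def __init__(self):
        self.public_signatures = set()    # mirrored from the vendor feed
        self.internal_signatures = set()  # discovered in the wild, kept internal

    def apply_vendor_update(self, signatures):
        """Merge the vendor's latest public signature set into the mirror."""
        self.public_signatures.update(signatures)

    def report_local_discovery(self, signature):
        """Record a locally discovered signature without publishing it."""
        self.internal_signatures.add(signature)

    def effective_signatures(self):
        """Signatures available to tools inside the community."""
        return self.public_signatures | self.internal_signatures

mirror = ShadowMirror()
mirror.apply_vendor_update({"sig-0001", "sig-0002"})
mirror.report_local_discovery("sig-local-0001")
print(sorted(mirror.effective_signatures()))
```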

Conclusion

The DREN Cyber Security Test Bed will provide a novel environment for testing new cyber security methods. The following active elements will be implemented as individual, well-contained modules: traditional government off-the-shelf intrusion detection software, traditional commercial off-the-shelf intrusion detection software, active network performance software, and experimental cyber security code. The test bed will be embedded in a production network, thereby providing real traffic to all modules to generate in situ results for comparison, contrast, and correlation. This collocation of cutting-edge security with existing security infrastructure (embedded in a production network) will dramatically expedite the transition of posited network and data protection concepts to proven adaptive cyber security algorithms.

The success of the program relies not only on quality technical implementation but also on sound operational and expectation engineering. Creating processes that allow for visibility and interaction from security operations, and providing nearly immediate results back to researchers and operations, will solidify the value of a given tool and of the program as a whole. By bridging the gap between product research and support for the federal environment in the shape of new data feeds, comprehensive aggregate analysis, and response solutions, the goal becomes furthering the overall process, not just the posture of the enterprise security architecture. With so many new attack styles and dimensions, the most valuable outcome of this project will be the new way of approaching the problem.

MR. TIM OWEN received his undergraduate degrees in telecommunications engineering technology from Capitol College in 1989 and physics and science education from North Carolina State University in 1992. He is the chief engineer for WareOnEarth Communications, Inc., and is a key member of the DREN engineering staff. E-mail: [email protected]

MR. ROB SCOTT received his bachelor’s degree in communications media from the University of Maryland in 1986. He is currently the chief technology officer of GeoWireless, Inc., and is a key member of the DREN engineering staff. E-mail: [email protected]

DR. ROY CAMPBELL received his doctorate in electrical engineering from Mississippi State University in 2002. He currently serves as the program manager for the DREN. E-mail: [email protected]

References

Baker, W., et al. 2010. 2010 data breach investigations report. http://www.verizonbusiness.com/resources/reports/rp_2010-data-breach-report_en_xg.pdf (accessed October 13, 2010).

Caglayan, A., Toothaker, M., Drapeau, D., Burke, D., and Eaton, G. 2010. Behavioral patterns of fast flux service networks. Hawaii International Conference on System Sciences (HICSS-43) Cyber Security and Information Intelligence Research Minitrack, Koloa, Kauai, Hawaii, Jan. 5-8, 2010. http://csdl2.computer.org/comp/proceedings/hicss/2010/3869/00/02-02-05.pdf (accessed October 10, 2010).


Dagon, D. 2005. Botnet detection and response: The internet is the infection. OARC Workshop. http://www.caida.org/workshops/dns-oarc/200507/slides/oarc0507-Dagon.pdf (accessed October 13, 2010).

Deibert, R., et al. April 2010. Shadows in the cloud: Investigating cyber espionage 2.0, JR03-2010. http://www.nartv.org/mirror/shadows-in-the-cloud.pdf (accessed October 10, 2010).

Economist. July 2010. War in the fifth domain.

Muttik, I. 2010. Cooperation is key to internet security. McAfee Security Journal, 2010 (issue 6): 20–24. http://www.mcafee.com/us/local_content/misc/threat_center/articles/summer2010/msj_article05_cooperation_is_key_to_internet_security.pdf (accessed October 13, 2010).

Rajahalme, J., Conta, A., Carpenter, B., and Deering, S. 2004. IPv6 flow label specification, Network Working Group Request for Comments, RFC 3697. Reston, VA: The Internet Society. http://www.ietf.org/rfc/rfc3697.txt (accessed October 13, 2010).

Staniford, S., Moore, D., Paxson, V., and Weaver, N. 2004. The top speed of flash worms. In Proceedings of the 2004 ACM Workshop on Rapid Malcode, October 2004. http://www.icir.org/vern/papers/topspeed-worm04.pdf (accessed October 10, 2010).

Symantec. 2010. Symantec Corp. FY2011 2Q Report. http://www.symantec.com/business/theme.jsp?themeid=threatreport (accessed October 13, 2010).

Yan, H., Massey, D., McCracken, E., and Wang, L. 2009. BGPMon and NewViews: Real-time BGP. In Proceedings of the Institute of Electrical and Electronics Engineers International Conference on Computer Communications, April, Rio de Janeiro, Brazil. http://www.ieee-infocom.org/2009/demos/5%20-%20BGP%20Mon.pdf (accessed October 13, 2010).

Zetter, K. 2010. Google hack attack was ultra-sophisticated, new details show. WIRED, January 14, 2010. http://www.wired.com/threatlevel/2010/01/operation-aurora/ (accessed October 13, 2010).
