A Wide Range of Scientific Disciplines Will Require a Common Infrastructure
• Example: Two e-Science Grand Challenges
  – NSF’s EarthScope—USArray
  – NIH’s Biomedical Informatics Research Network
• Common Needs
  – Large Number of Sensors / Instruments
  – Daily Generation of Large Data Sets
  – Data on Multiple Length and Time Scales
  – Automatic Archiving in Distributed Federated Repositories
  – Large Community of End Users
  – Multi-Megapixel and Immersive Visualization
  – Collaborative Analysis From Multiple Sites
  – Complex Simulations Needed to Interpret Data
NSF’s EarthScope—USArray
• Resolution of Crust & Upper Mantle Structure to Tens of km
• Transportable Array
  – Fixed-Design Broadband Array
  – 400 Broadband Seismometers
  – ~70 km Spacing
  – ~1500 × 1500 km Grid
  – ~2-Year Deployments at Each Site
  – Rolling Deployment Over More Than 10 Years
• Permanent Reference Network
  – GSN/NSN-Quality Seismometers
  – Geodetic-Quality GPS Receivers
• All Data to Community in Near Real Time
  – Bandwidth Will Be Driven by Visual Analysis in Federated Repositories
• Rollout Over 14 Years Starting With Existing Broadband Stations
Source: Frank Vernon (IGPP, SIO, UCSD)
Federated Repositories Are Needed to Link Brain Multi-Scale Structure and Function
• Filling Information Gaps with Advanced 3D & 4D Microscopies and New Labeling Technologies
• Leveraging Advances in Computational Capabilities
• Electron Tomography Over Multiple Scales
Source: Mark Ellisman, UCSD
NIH is Funding a National-Scale Grid Federating Multi-Scale Biomedical Data
National Partnership for Advanced Computational Infrastructure
Part of the UCSD CRBS Center for Research on Biological Structure
Biomedical Informatics Research Network (BIRN)
NIH Plans to Expand to Other Organs and Many Laboratories
Similar Needs for Many Other e-Science Community Resources
• ATLAS
• Sloan Digital Sky Survey
• LHC
• ALMA
[Figure: layered e-Science network architecture: Apps and Middleware above a Control Plane that manages Clusters, Dynamically Allocated Lightpaths, Switch Fabrics, and Physical Monitoring]
A LambdaGrid Will Be the Backbone for an e-Science Network
• Metro Area Laboratories Springing Up Worldwide
• Developing GigE and 10GigE Applications and Services
• Testing Optical Switches
• Metro Optical Testbeds: the Next GigaPOP?
Campus Laboratory LambdaGrid “On-Ramps” are Needed to Link to MetroGrid
• TND2 = Datamining Clusters at NU and the UIC Laboratory for Advanced Computing
  – 32 Deerfield processors, each with 10GigE networking; NetRam storage
• TNV2 = Visualization Clusters at NU and UIC EVL
  – 27 Deerfield processors, each with 10GigE networking; 25 screens
• TNC2 = TeraGrid Computing Clusters at EVL
  – 32 Deerfield processors, each with 10GigE networking
[Figure: campus LambdaGrid topology in which the UIC LAC (TND2) and EVL (TNV2, TNC2) clusters connect through routers and O-O-O switches over 10×10GigE links, reaching StarLight/Northwestern via 2×40GigE DWDM links]
Source: Tom DeFanti, EVL, UIC
Research Topics for Building an e-Science LambdaGrid
• Provide Integrated Services in the Tbit/s Range
  – Lambda-Centric Communication & Computing Resource Allocation
  – Middleware Services for Real-Time Distributed Programs
  – Extend Internet QoS Provisioning Over a WDM-Based Network
• Develop a Common Control-Plane Optical Transport Architecture
  – Transport Traffic Over Multiple User Planes With Variable Switching Modes
    – Lambda Switching
    – Burst Switching
    – Inverse Multiplexing (One Application Uses Multiple Lambdas)
  – Extend GMPLS
    – Routing
    – Resource Reservation
    – Restoration
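The inverse-multiplexing item above can be illustrated in miniature: one application stream is striped round-robin across several channels (standing in for lambdas) and reassembled in order at the receiver. This is only a sketch of the concept under assumed parameters; the chunk size and channel count are illustrative, not part of the proposed architecture.

```python
# Sketch of inverse multiplexing: stripe one data stream across
# several "lambda" channels round-robin, then reassemble in order.
# CHUNK and the channel count are illustrative assumptions.

CHUNK = 4  # bytes per chunk (tiny, for illustration only)

def stripe(data: bytes, n_lambdas: int) -> list[list[bytes]]:
    """Split `data` into chunks and deal them round-robin onto n channels."""
    channels = [[] for _ in range(n_lambdas)]
    for i in range(0, len(data), CHUNK):
        channels[(i // CHUNK) % n_lambdas].append(data[i:i + CHUNK])
    return channels

def reassemble(channels: list[list[bytes]]) -> bytes:
    """Interleave the chunks back into their original order."""
    out = []
    for j in range(max(len(c) for c in channels)):
        for c in channels:
            if j < len(c):
                out.append(c[j])
    return b"".join(out)

payload = b"one application, many lambdas"
assert reassemble(stripe(payload, 3)) == payload
```

The aggregate bandwidth seen by the application is the sum of the per-channel rates, which is the point of letting one application use multiple lambdas.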
UCSD, UCI, USC, UIC, & NW
Research Topics for Building an e-Science LambdaGrid (continued)
• Enhance Security Mechanisms
  – End-to-End Integrity Check of Data Streams
  – Access Multiple Locations With Trusted Authentication Mechanisms
  – Use Grid Middleware for Authentication, Authorization, Validation, Encryption, and Forensic Analysis of Multiple Systems and Administrative Domains
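The integrity-check item can be sketched with an incremental digest: the sender hashes the stream chunk by chunk, and the receiver recomputes and compares the digest end to end. This minimal sketch uses SHA-256 from Python's standard library; it stands in for, and is not a description of, whatever mechanism the Grid middleware would actually use.

```python
# Minimal end-to-end stream integrity check via an incremental hash.
# The choice of SHA-256 is an illustrative assumption.
import hashlib
from typing import Iterable

def stream_digest(chunks: Iterable[bytes]) -> str:
    """Incrementally hash a data stream chunk by chunk (sender side)."""
    h = hashlib.sha256()
    for chunk in chunks:
        h.update(chunk)
    return h.hexdigest()

def verify_stream(chunks: Iterable[bytes], expected: str) -> bool:
    """Recompute the digest at the receiver and compare end to end."""
    return stream_digest(chunks) == expected

data = [b"seismic", b"-", b"trace", b"-", b"block"]
tag = stream_digest(data)
assert verify_stream(data, tag)          # intact stream passes
assert not verify_stream([b"x"], tag)    # altered stream fails
```

Because the hash is updated incrementally, the check works regardless of how the stream is chunked in transit: only the concatenated bytes matter.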
• Distribute Storage While Optimizing Storewidth
  – Distribute Massive Pools of Physical RAM (Network Memory)
  – Develop Visual TeraMining Techniques to Mine Petabytes of Data
  – Enable Ultrafast Image Rendering
  – Create, for Optical Storage Area Networks (OSANs):
    – Analysis and Modeling Tools
    – OSAN Control and Data Management Protocols
    – Buffering Strategies and Memory Hierarchies for WDM Optical Networks
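The network-memory item can be pictured as a pool of RAM spread across several sites, with each block placed on a node chosen deterministically from its key. In this toy sketch the node names and the in-process dictionaries standing in for remote RAM are illustrative assumptions, not the actual design.

```python
# Toy model of "network memory": a distributed pool of physical RAM
# in which blocks are placed by hashing their key onto a node.
# Node names and the dict-per-node stand-in are hypothetical.
import hashlib

class NetworkMemoryPool:
    def __init__(self, nodes: list[str]):
        self.nodes = nodes
        self.ram = {n: {} for n in nodes}  # each dict plays one node's RAM

    def _place(self, key: str) -> str:
        """Pick a node deterministically from the key's hash."""
        h = int(hashlib.md5(key.encode()).hexdigest(), 16)
        return self.nodes[h % len(self.nodes)]

    def put(self, key: str, block: bytes) -> str:
        """Store a block on its home node; return the node chosen."""
        node = self._place(key)
        self.ram[node][key] = block
        return node

    def get(self, key: str) -> bytes:
        """Fetch a block from whichever node its key hashes to."""
        return self.ram[self._place(key)][key]

pool = NetworkMemoryPool(["ucsd", "uic", "nw"])
pool.put("volume/slice-042", b"imagery")
assert pool.get("volume/slice-042") == b"imagery"
```

Deterministic placement means readers need no directory lookup, and aggregate "storewidth" scales with the number of nodes serving blocks in parallel.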
A Layered Software Architecture is Needed for Defense and Civilian Applications
www.ndia-sd.org/docs/NDIA_20June00.pdf
SPAWAR Systems Center, San Diego