MWA Overview
25 Mar 2013
Dave Pallot
Quick Overview – On the ground
• Main Goals: detection of the hydrogen signature from the Epoch of Reionisation (EoR), transient detection, all-sky survey, solar science
• Low-frequency interferometer, 60-300 MHz (30 deg FOV)
• 2048 dipole antennas
• 16 elements per tile, making 128 tiles (8128 baselines; see the check after this list)
• Virtual beam steering
• Dense core of short baselines
• Longest baseline 3 km
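(As a quick check on the baseline count above: an N-element array gives N(N-1)/2 baselines, so 128 tiles give 128 × 127 / 2 = 8128.)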
Quick Overview – On the ground
• 16 receivers (8 tile capacity each)
• Filtering and conditioning
• Analog-to-digital conversion and signal processing
• Produces 1.28 MHz coarse frequency channels
• Transmitted via underground fibre to the MRO central processing facility
Quick Overview – Inside processing facility
• Polyphase filter bank (FPGA FFT)
• Converts 1.28 MHz coarse channels to 40 kHz fine channels (all baselines)
• GPU cluster (software-based, NVIDIA Teslas)
• Correlator (multiplier) produces visibility sets every 0.5 s (see the data-rate sketch after this list)
• Integration time and channel width are easy to modify to suit the science case
• Data Capture
• Up to 10 Gb/s
• ~80 TB on-site buffer
• Transmitted to long-term storage at PBStore/Pawsey
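A rough sketch of the visibility data rate the default mode implies. The 24 coarse channels (one 1.28 MHz channel per GPU node), retained autocorrelations, 4 polarisation products and 8-byte complex visibilities are assumptions inferred from elsewhere in this deck, not stated on this slide:

    # Back-of-the-envelope visibility data rate for the default 40 kHz / 0.5 s mode.
    tiles = 128
    coarse_channels = 24                        # assumed: one 1.28 MHz channel per GPU node
    fine_per_coarse = int(1.28e6 / 40e3)        # 32 fine channels per coarse channel
    fine_channels = coarse_channels * fine_per_coarse    # 768
    baselines = tiles * (tiles + 1) // 2        # 8256, including autocorrelations
    pols = 4                                    # assumed polarisation products
    bytes_per_vis = 8                           # assumed complex64 visibilities
    integration_s = 0.5

    rate_bytes_per_s = baselines * fine_channels * pols * bytes_per_vis / integration_s
    print(f"{rate_bytes_per_s * 8 / 1e9:.1f} Gb/s")    # ~3.2 Gb/s, consistent with the 3-4 Gb/s quoted later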
128 Tile Data Flow
(Diagram) The sky (analog RF) → 128 tiles → 16x receivers → 24x GPU correlator nodes → visibility data (default: 40 kHz / 0.5 s, rate 3-4 Gb/s) → staging/buffering (~80 TB) → 10 Gb/s MRO-to-Perth link (currently 1 Gb/s) → long-term archive at Pawsey, Perth. Correlator products + meta-data also feed the RTS (calibration and UV images) on Fornax, Perth, with onward distribution via AARNet to MIT (USA), NZ, Melb, ?
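One implication of the figures in the diagram above (a rough sketch, treating the quoted rates as sustained): the ~80 TB buffer is needed because the current 1 Gb/s link cannot drain the 3-4 Gb/s visibility stream in real time.

    # How long the ~80 TB on-site buffer lasts if visibilities arrive at 4 Gb/s
    # while only 1 Gb/s can be shipped to Perth (figures from the diagram above).
    buffer_bits = 80e12 * 8
    fill_rate_bps = (4.0 - 1.0) * 1e9           # net rate into the buffer
    hours = buffer_bits / fill_rate_bps / 3600
    print(f"~{hours:.0f} hours of continuous observing before the buffer fills")   # ~59 hours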
Supercomputing and Storage
• Pawsey
• Storage Allocation: 6 PB over 2 years
• Dictates the observing schedule
• 5 Gb/s total data rate at 40 kHz / 0.5 s
• 12-hour observing day ≈ 27 TB
• Fornax (96 computing nodes at UWA)
– Current host for the MWA Real Time System (RTS)
– Real-time imaging and calibration of visibilities
– Currently being commissioned (April 2013)
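(Consistency check on the figures above: 5 Gb/s ≈ 0.625 GB/s, and 0.625 GB/s × 43,200 s ≈ 27,000 GB, i.e. the quoted ~27 TB per 12-hour day.)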
Onsite Data Capture
• Visibility Data - Correlator Output
– Currently 24 GPU nodes producing 24 FITS files
– Files split at 1 GB boundaries
– FITS header meta-data – self-describing files
• UTC data time, Observation and Project IDs
• Being added: beamformer delays, frequency, integration time, gains, etc.
– Filenames (see the sketch after this list)
• Observation ID, capture time, originating GPU, file part number
• All meta-data is recorded against the observation ID,
including the file reference.
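A minimal sketch of how such a capture filename might be assembled from those fields. The field order, separators and example values are illustrative assumptions, not the documented MWA naming convention:

    from datetime import datetime, timezone

    def capture_filename(obs_id, gpu_node, part, when=None):
        # Illustrative pattern only: <obsID>_<UTC capture time>_gpu<NN>_<part>.fits
        when = when or datetime.now(timezone.utc)
        return f"{obs_id}_{when:%Y%m%d%H%M%S}_gpu{gpu_node:02d}_{part:02d}.fits"

    # Made-up observation ID and GPU node, purely for illustration:
    print(capture_filename(1048672000, 3, 0))
    # e.g. 1048672000_20130325043000_gpu03_00.fits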
MWA Archive Interfaces
• NGAS (Next Generation Archive System)
– Handles data capture and storage interfaces
• Data staging and archiving
• User API and file delivery via URL and HTTP (see the retrieval sketch after this list)
– Automatic remote archive replication and filtering
• Science Archive and Instrument Meta-data Interfaces
– Python, web-based screens (Django) and PostgreSQL
– Hosted on a VM at PBStore (http://ngas01.ivec.org)
– VO-compliant archive/interface (end of 2013)
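A minimal retrieval sketch against the archive host named above, assuming an NGAS-style RETRIEVE endpoint; the port number and file_id are illustrative assumptions:

    import urllib.request

    # Host from the slide above; port and file_id are assumptions for illustration.
    NGAS_HOST = "http://ngas01.ivec.org:7777"
    FILE_ID = "1048672000_20130325043000_gpu03_00.fits"

    url = f"{NGAS_HOST}/RETRIEVE?file_id={FILE_ID}"
    with urllib.request.urlopen(url) as response, open(FILE_ID, "wb") as out:
        out.write(response.read())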
Project Progress
• Currently commissioning full 128 tile system
• Call for proposals ends May 2013
– 600 hrs of time allocated, July to Dec 2013
• On target for MWA early operations in July 2013
Archive Priorities for Pawsey
• Hardware
– 10 Gb/s interface to the Pawsey front end from the MRO
– 10 Gb/s termination at the MWA racks (CSIRO, May/June 2013)
– 10 Gb/s link to Fairway (ICRAR, May/June 2013)
– HSM front-end access for MWA (100 TB)
– Tape library access for MWA (6 TB)
– Access to Data Mover(s) (NGAS)
• Software
– 4 Virtual Machines (CentOS)
• Science archive
• Databases and web tools