© 2014 Regents of the University of Minnesota. All rights reserved.
Minnesota Supercomputing Institute for Advanced Computational Research
Ceph for Tier-2 Storage at MSI
Benjamin Lynch, Ph.D.
[Diagram: storage architecture — Campus and External Hosts; High-Performance Computing Clusters; Interactive HPC; 2nd-Tier Persistent Storage (Ceph/S3); High-Performance Persistent Storage and Scratch Space (PANFS); Tape Backup]
[Diagram: Ceph / S3 at the center, connected to the MSI Private Cloud, Public Cloud, Web, MSI HPC, Globus, and Hadoop and Spark]
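All of these platforms reach the object store through the same S3-compatible gateway. A minimal sketch of client access using boto3, with a hypothetical endpoint URL and placeholder credentials (none of these values come from the slides):

```python
import boto3

# Hypothetical endpoint and credentials -- substitute your site's values.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example.edu",  # Ceph RADOS Gateway (civetweb)
    region_name="us-east-1",                # RGW largely ignores the region
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# List buckets owned by this user.
for bucket in s3.list_buckets()["Buckets"]:
    print(bucket["Name"])

# Upload and download an object, exactly as with Amazon S3.
s3.upload_file("results.tar.gz", "mybucket", "runs/results.tar.gz")
s3.download_file("mybucket", "runs/results.tar.gz", "results-copy.tar.gz")
```

Because the gateway speaks the standard S3 protocol, the same code works unchanged against a public cloud bucket by swapping the endpoint and credentials.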
• Scalable volume
• Scalable performance
• High availability
Why Ceph?
• Scalable
• Fault-tolerant
• Flexible configuration
• Flexible support
Our Ceph Cluster

• Cluster Specs
  – Ceph 0.94.5 (Hammer)
  – 3 PB raw disk
  – 4+2 erasure coding for most data
  – 4 load-balanced gateways (civetweb)
  – 10GigE network
  – 3 monitors

• Applications
  – Genomics workflows on public & private cloud compute resources
  – Data staging for HPC applications
  – Data staging for Hadoop / Spark cluster
  – Bulk store of stale data
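With 4+2 erasure coding, each object is split into 4 data chunks plus 2 coding chunks, so any 2 chunks can be lost without data loss while usable capacity is 4/6 of raw. A quick sanity check of the capacity math for the numbers above:

```python
# Capacity math for a 4+2 erasure-coded pool (k data chunks, m coding chunks).
k, m = 4, 2
raw_pb = 3.0                       # raw disk in the cluster, in PB

overhead = (k + m) / k             # storage consumed per byte of user data
usable_pb = raw_pb * k / (k + m)   # usable capacity before other overheads

print(f"overhead factor: {overhead:.2f}x")     # 1.50x
print(f"usable capacity: {usable_pb:.1f} PB")  # 2.0 PB of the 3 PB raw
```

Compared with 3-way replication (3x overhead), 4+2 coding tolerates the same two simultaneous failures at half the storage cost, which is why it suits a bulk second tier.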
Acknowledgements

• Luke Burns (System Administrator)
• Matt Mix (Network Administrator)