Introduction to the HPCC (Jim Leikert, System Administrator, High Performance Computing Center)
Post on 12-Jan-2016
TRANSCRIPT
Introduction to the HPCC
Jim Leikert, System Administrator
High Performance Computing Center
HPCC Online Resources
www.hpcc.msu.edu – HPCC home
wiki.hpcc.msu.edu – Public/private wiki
forums.hpcc.msu.edu – User forums
rt.hpcc.msu.edu – Help desk request tracking
mon.hpcc.msu.edu – System monitors
HPCC Cluster Overview
Linux operating system
The primary interface is text based, through Secure Shell (SSH)
All machines in the main cluster are binary compatible (compile once, run anywhere)
Each user has 50 GB of personal hard drive space: /mnt/home/username/
Users have access to 33 TB of shared scratch space: /mnt/scratch/username/
A scheduler is used to manage jobs running on the cluster
A submission script is used to tell the scheduler the resources required and how to run a job
A Module system is used to manage the loading and unloading of software configurations
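A submission script is ordinary shell, with scheduler directives carried in comments. As a minimal sketch for a PBS/TORQUE-style scheduler (the directive values and the stand-in workload below are illustrative, not HPCC defaults):

```shell
#!/bin/bash
# Hypothetical submission script for a PBS-style scheduler.
# Resource values are examples only; adjust for your job.
#PBS -l nodes=1:ppn=4,walltime=02:00:00,mem=8gb
#PBS -N example_job
#PBS -j oe

# The scheduler sets PBS_O_WORKDIR to the directory the
# job was submitted from; fall back to "." when run by hand.
cd "${PBS_O_WORKDIR:-.}"

# Stand-in for the real work, e.g.: ./my_program input.txt > output.txt
LINES=$(seq 1 100 | wc -l)
echo "processed $LINES records"
```

A script like this would typically be handed to the scheduler with `qsub`, after which the scheduler allocates the requested resources and runs it on a compute node.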
Gateway
Access to the HPCC is primarily through the gateway machine:
ssh username@gateway.hpcc.msu.edu
Access to all HPCC services uses your MSU NetID and password. For MSU NetID info, see netid.msu.edu
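A typical first session from a personal machine might look like the following (replace `username` with your NetID; the file name is just an example):

```shell
# Log in to the gateway (you will be prompted for your MSU NetID password):
ssh username@gateway.hpcc.msu.edu

# Copy a local input file into your HPCC home directory:
scp input.txt username@gateway.hpcc.msu.edu:/mnt/home/username/
```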
HPCC System Diagram
Hardware Timeline

Year  Name     Description                          Cores/Node  Memory/Node     Total Cores
2005  green    1.6GHz Itanium2 (very old)           128         576 (shared)    128

Main Cluster
2005  amd05    Dual-core 2.2GHz AMD Opterons        4           8GB             512
2007  intel07  Quad-core 2.3GHz Xeons               8           8GB             1024
2008  intel08  Sun x4450s (fat node)                16          64GB            32
2009  amd09    Sun Fire X4600 Opterons (fat node)   32          256GB           128

Main cluster total: 1696 cores
We currently have two new hardware additions for 2010:
Graphics Processing Unit (GPU) cluster – in house
New general-purpose large cluster – RFP/RFQ stage
Cluster Developer Nodes
Developer nodes are accessible from the gateway and are used for testing:
ssh dev-amd05 – same hardware as amd05
ssh dev-intel07 – same hardware as intel07
ssh dev-amd09 – same hardware as amd09
We periodically have some test boxes. These include:
ssh dev-intel09 – 8-core Intel Xeon with 48GB of memory
ssh gfx-000 – NVIDIA graphics processing node (a permanent dev-gfx node will be available soon)
Jobs running on the developer nodes should be limited to two hours of walltime.
Developer nodes are shared by everyone.
Available Software
Center-supported development software:
Intel compilers, OpenMP, OpenMPI, MVAPICH, TotalView, MKL, PathScale, GNU...
Center-supported research software:
MATLAB, R, Fluent, Abaqus, HEEDS, Amber, BLAST, LS-DYNA, Star-P...
Center-unsupported software (module use.cus):
GROMACS, CMake, CUDA, ImageMagick, Java, OpenMM, SIESTA...
Steps in Using the HPCC
1. Connect to HPCC
2. Transfer required input files and source code
3. Determine required software
4. Compile programs (if needed)
5. Test software/programs on a developer node
6. Write a submission script
7. Submit the job
8. Get your results and write a paper!!
Module System
To maximize the different types of software and system configurations available to users, the HPCC uses a module system.
Key Commands
module avail – show available modules
module list – list currently loaded modules
module load modulename – load a module
module unload modulename – unload a module
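Tying the commands together, a short session on a developer node might look like this (the module and file names are illustrative; check `module avail` for what is actually installed):

```shell
module list                  # see what is loaded by default
module avail                 # browse everything installed
module load gnu              # load a compiler configuration
gcc -O2 -o myprog myprog.c   # compile against the loaded toolchain
module unload gnu            # unload it when done
```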
Getting Help
Documentation and User Manual - wiki.hpcc.msu.edu
User Forums - forums.hpcc.msu.edu
Contact HPCC and iCER Staff for:
Reporting System Problems
HPC program writing/debugging consultation
Help with HPC grant writing
System Requests
Other General Questions
Primary form of contact - www.hpcc.msu.edu/contact
Apply for an account – www.hpcc.msu.edu/request
HPCC Request tracking system – rt.hpcc.msu.edu
HPCC Phone – (517) 353-9309 9am-5pm
HPCC Office – Engineering Building 3200 9am-5pm