HyPER C3 – Community Cloud HPC Cluster

HyPER C3 provides a community high-performance computing (HPC) cluster to support compute-intensive scholarship at Emory. This cloud HPC infrastructure is available to active faculty, staff, and students within the Emory community for the purpose of conducting Emory research and education activities. Initial access will be provided to faculty members and their associated teams who are involved in AI.Humanity initiatives.

Key Features and Benefits

  • Free-Tier Access: HyPER C3 offers faculty groups direct access to scientific computing resources for research and teaching use cases without the cost and burden of administering clusters. Shared base-level usage is offered at no cost to active Emory faculty, academic staff, and students. Usage is governed through community “fair share” configurations and policies built into the cluster to optimize utilization.
  • Dedicated Support and Training: A team of HPC engineers, coordinated with central and school IT teams, manages daily cluster administration and helps investigators with onboarding, training on using the cloud cluster, advice on leveraging the resources for projects, and designing solution sets.
  • Scientific Compute Nodes: A state-of-the-art selection of high-performance compute nodes in AWS, including general-purpose CPU nodes, high-end GPU instances, and high-memory nodes, served through a head/login node and several partitions (or queues) suited to a variety of scientific computing use cases.
  • Specifications and Usage Policies:
    • 1 compute node with 8 A100 GPUs and 320 GB of GPU memory;
    • 1 compute node with 8 V100 GPUs and 128 GB of GPU memory;
    • A selection of 12 A10G GPUs (6 nodes), 4 CPU-optimized nodes, and 5 high-memory nodes;
    • Persistent personal and group storage, plus a scalable Lustre parallel file system as HPC workspace;
    • Specialized software: R, Python, TensorFlow, PyTorch, Anaconda, Go, Apache MXNet, and NVIDIA CUDA.
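As a sketch of how a cluster organized around a login node and partitions is typically used, a batch job script might look like the following. This assumes a Slurm-style scheduler, and the partition name, module name, and script name here are illustrative placeholders, not HyPER C3's actual configuration:

```shell
#!/bin/bash
# Hypothetical batch script -- partition, module, and file names are
# illustrative assumptions, not HyPER C3's actual configuration.
#SBATCH --job-name=train-model
#SBATCH --partition=gpu          # hypothetical GPU partition/queue name
#SBATCH --gres=gpu:1             # request one GPU on the node
#SBATCH --mem=32G                # host memory for the job
#SBATCH --time=04:00:00          # wall-clock limit (HH:MM:SS)
#SBATCH --output=train-%j.log    # %j expands to the job ID

module load python               # hypothetical environment module
python train.py                  # user's own training script
```

In a setup like this, the script would be submitted from the head/login node with `sbatch train.sh`, and the job's position in the queue checked with `squeue`.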