HPC Partners

Advanced Research Computing provides access to research computing solutions for the university community in Maine through collaboration with premier computing centers. These services include technical support and are available at nationally competitive rates. University of Maine faculty can leverage negotiated agreements and competitive rate structures by contacting ARC today.

Preferred service providers:

Ohio Supercomputer Center (OSC)

The Ohio Supercomputer Center enables research development in computational science and the applications of supercomputing. OSC is equipped with research staff specializing in the fields of supercomputing, computational science, data management, biomedical applications, and a host of emerging disciplines.

Computing Hardware

Clusters              Pitzer                         Owens
Machine description   Dell, Intel Xeon Gold 6148     Dell, Intel Xeon E5-2680 v4
# Cores               10,240                         23,392
Memory (minimum)      192 GB per node                128 GB per node
GPU capability        64 NVIDIA Tesla V100           160 NVIDIA Tesla P100

Detailed descriptions of each cluster are available at OSC’s cluster computing webpage.

Software

A broad suite of software is available; see OSC's complete list of current software. Most applications are available at no cost for academic use.

Storage

Home Directory Storage (available to each user account): Each user account includes a home directory with 500 GB of storage and a maximum of 1,000,000 files.

Project Storage (set up as a single repository for a defined group of users): Each project includes 500 GB of storage and a maximum of 1,000,000 files. Additional storage may be added upon request.

For detailed information, please visit OSC's storage documentation webpage or contact ARC.
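The quotas above can be checked from a login shell with standard POSIX tools. The script below is a generic sketch, not OSC's own quota-reporting mechanism (OSC's documentation describes its authoritative quota tools); the 524288000 KB figure simply restates 500 GB in kilobytes (500 × 1024 × 1024).

```shell
#!/bin/sh
# Generic sketch: compare a directory's usage against the
# 500 GB / 1,000,000-file quotas described above, using only
# standard POSIX tools (du, find, awk, wc).

DIR="${1:-$HOME}"

# Total size in kilobytes (du -sk is portable across systems).
used_kb=$(du -sk "$DIR" | awk '{print $1}')

# Number of regular files under the directory.
file_count=$(find "$DIR" -type f | wc -l | tr -d ' ')

echo "Usage for $DIR: ${used_kb} KB, ${file_count} files"
echo "Quota: 524288000 KB (500 GB), 1000000 files"
```

Running the script with no argument reports on your home directory; pass a path to check a project directory instead.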

Interested in Ohio Supercomputer Center (OSC)?

Please contact UMaine Advanced Research Computing at um.arc@maine.edu


 


Texas Advanced Computing Center (TACC)

The Texas Advanced Computing Center (TACC) is a premier center of computational excellence in the U.S. Since 2001, TACC has been enabling discoveries and the advancement of science through the application of advanced research computing technologies.

Computing Hardware

Clusters: STAMPEDE2, MAVERICK2, LONGHORN

STAMPEDE2
Machine description: 4,200 Intel Knights Landing nodes, each with 68 cores; 1,736 Intel Skylake nodes, each with 48 cores
Memory: Knights Landing nodes, 96 GB of DDR4 RAM and 16 GB of MCDRAM; Skylake nodes, 192 GB of RAM per node
GPU capability: N/A

MAVERICK2
Machine description: 24 GTX compute nodes with Intel Xeon E5-2620 v4 CPUs; 4 V100 compute nodes with Intel Xeon Platinum 8160 CPUs; 3 P100 nodes with Intel Xeon Platinum 8160 CPUs
Memory: GTX nodes, 128 GB of RAM; V100 nodes, 192 GB of RAM; P100 nodes, 192 GB of RAM
GPU capability: GTX nodes, 4 NVIDIA GTX 1080 Ti GPUs per node; V100 nodes, 2 NVIDIA V100 GPUs; P100 nodes, 2 NVIDIA P100 GPUs

LONGHORN
Machine description: IBM Power System AC922 nodes with IBM POWER9 processors
Memory: GPU nodes, 256 GB of RAM; GPU large-memory nodes, 512 GB of RAM
GPU capability: 96 V100 nodes with 4 GPUs per node; 8 large-memory V100 nodes, each with 4 GPUs per node

Detailed descriptions of each system are available:

STAMPEDE2: system summary; user guide.

MAVERICK2: user guide.

LONGHORN: system summary; user guide.

Software

A broad suite of software is available; see TACC's complete list of current software. Most applications are available at no cost for academic use.

Storage

Home Directory Storage (available to each user account): Each user account includes a home directory with 500 GB of storage.

Project Data Storage: Corral is available for additional project storage needs. This space is never purged. For users who wish to back up data to more than one system, separate Archival Data Storage is also available. For detailed information, please visit the Corral User Guide. Competitive rates apply to Corral storage, and pricing information is available from ARC.

Archival Data Storage: Ranch is a long-term tape storage system available for archiving project data. It is not intended for active data and is not suitable for system backups. Ranch provides redundant storage for project-related data. For detailed information, please visit the Ranch User Guide. Competitive rates apply to Ranch storage, and pricing information is available from ARC.
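Tape systems like Ranch generally handle a few large files far better than many small ones, so a common practice is to bundle a project directory into a single compressed tarball before transferring it. The sketch below illustrates this with standard tools; the directory name and destination are placeholders, and the exact transfer method for Ranch is described in the Ranch User Guide.

```shell
#!/bin/sh
# Hedged sketch: bundle a project directory into one compressed
# tarball so the tape archive stores a single large file rather
# than many small ones. PROJECT_DIR is a hypothetical example.

PROJECT_DIR="my_project"
ARCHIVE="my_project_$(date +%Y%m%d).tar.gz"

# Create the tarball locally first.
tar -czf "$ARCHIVE" "$PROJECT_DIR"

# Then copy it to the archive system (illustrative destination;
# consult the Ranch User Guide for the supported transfer method):
# scp "$ARCHIVE" username@ranch.tacc.utexas.edu:

echo "Created $ARCHIVE"
```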

Interested in Texas Advanced Computing Center (TACC)?

Please contact UMaine Advanced Research Computing at um.arc@maine.edu