High Performance Computing

The Miami Redhawk cluster is available for use by faculty, staff, and students. Contact the RCS group to arrange a discussion about how the cluster can support your research and teaching efforts.

For larger computing needs, the RCS group can assist researchers with using resources available from the Ohio Supercomputer Center.

Accounts

Faculty can send a request for account activation to rescomp@MiamiOH.edu. Students can ask a faculty member to sponsor an account for them.

Access

There are several ways to access the Redhawk cluster. Note that when you access the cluster, you are connecting to a head (login) node; see the Using the Cluster section below for information about accessing the compute nodes.

Command line access is available using SSH, with SFTP/SCP used to transfer files. 
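
For example, a terminal session might look like the following sketch. The hostname and username are placeholders, not the real connection details, which are provided through the Connection Instruction Request described below.

    # Open an SSH session on a login node (replace the placeholders with
    # the hostname and username from your connection instructions)
    ssh username@redhawk.example.edu

    # Copy a local file to your home directory on the cluster with SCP
    scp results.dat username@redhawk.example.edu:~/

    # Or start an interactive SFTP session to transfer several files
    sftp username@redhawk.example.edu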

Access to the cluster with a full Linux desktop is also available using NoMachine (NX).

To get more information on how to connect to the Redhawk cluster, submit a Connection Instruction Request.

Hardware

Miami's current HPC cluster consists of:

  • 2 login nodes – 24 cores, 384 GB of memory each. Machine names:
    • mualhplp01
    • mualhplp02
  • 26 compute nodes – 24 cores, Intel Xeon Gold 6126 2.6 GHz processors, 96 GB of memory each. Machine names:
    • mualhpcp10.mpi-mualhpcp26.mpi
    • mualhpcp28.mpi-mualhpcp35.mpi
    • mualhpcp37.mpi
  • 2 large memory nodes – 24 cores, Intel Xeon Gold 6126 2.6 GHz processors, 1.5 TB of memory each. Machine names:
    • mualhpcp27.mpi
    • mualhpcp36.mpi
  • 4 GPU nodes – 24 cores, Intel Xeon Gold 6126 2.6 GHz processors, 96 GB of memory, and 2 Nvidia Tesla V100-PCIE-16GB GPUs each. Machine names:
    • mualhpcp38.mpi-mualhpcp41.mpi
  • Shared storage system with approximately 20 TB of capacity, expandable.

Software

Most software on the cluster is managed with the modules tool.
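
As a quick illustration, a typical modules session looks like the sketch below; the package name is a placeholder, and the actual module names are listed in the HPC Software table described next.

    # List the software modules available on the cluster
    module avail

    # Load a package into your environment (package name is illustrative)
    module load gcc

    # Show the modules currently loaded in your session
    module list

    # Remove a single module, or clear everything that is loaded
    module unload gcc
    module purge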

The HPC Software table lists the software installed on the Redhawk cluster. Visit the modules page to learn more about software management on the cluster. Where a module name is not listed in the table, view the package details to learn more about using it.

Contact the RCS group to request installation of additional packages or with other questions about software on the cluster.

Using the Cluster

Cluster usage is broken into two categories – interactive and batch. Interactive use allows the user to work directly on a cluster node, running software and receiving output in real time. In batch usage, work is submitted to the cluster and executes when the needed resources become available, with optional e-mail notifications to the user when the job starts and ends. A sketch of the batch workflow is shown below; contact the RCS group for more details.
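
The document does not name the scheduler in use, so the script below assumes a SLURM scheduler; under a different scheduler (such as PBS/Torque) the directives and submission commands will differ. The resource requests and e-mail address are placeholders.

    #!/bin/bash
    #SBATCH --job-name=example          # name shown in the queue
    #SBATCH --nodes=1                   # run on a single compute node
    #SBATCH --ntasks=24                 # use all 24 cores on the node
    #SBATCH --time=01:00:00             # wall-clock limit of one hour
    #SBATCH --mail-type=BEGIN,END       # e-mail when the job starts and ends
    #SBATCH --mail-user=you@MiamiOH.edu

    # Load the required software, then run the program
    module load gcc
    ./my_program

Under the same assumption, the script is submitted with "sbatch job.sh" and monitored with "squeue -u $USER", and an interactive shell on a compute node can be requested with "srun --pty bash".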

Governance

Governance is provided through the HPC advisory committee, a group of faculty and university administrators. For information about policies, please contact Research Computing Support.