The Argon system is the current primary central HPC resource. The system contains roughly 16,000 processor cores and 300 GPU accelerators. It uses a combination of InfiniBand and Omni-Path high-speed interconnects and is connected to the campus network via multiple 10Gb Ethernet links. Details of the Argon HPC cluster, and information about buying nodes in the HPC clusters, are available on the Research Services website.
In the second quarter of 2019, the HPC community was surveyed to solicit feedback regarding their needs; the survey results are posted on the Research Services website.
In response to that feedback, a request for purchase was issued and an award was granted in July 2019. This allowed us to make the following upgrades to the Argon HPC environment:
Hardware: Argon's compute capacity was increased by 40 percent, and the number of graphics processing units (GPUs) was doubled; up to eight GPUs are now available on a single system. The new hardware came online in October 2019, and its warranty runs through October 2024. For more details, or to purchase a dedicated compute node, visit the Research Services website.
Software: Argon’s software stack is now more flexible and agile than ever before. Fall 2019 updates were delivered through an environment module, which allows new and old application versions to coexist in the stack. Initially, new compiler and OpenMPI versions were added, along with Python 2 and Python 3 updates; additional packages will be added in the future. For instructions on loading the environment module, visit the Research Services website. Questions? Please contact email@example.com.
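As a rough sketch of the environment-modules workflow described above, the commands below use the standard `module` interface common to Lmod and Environment Modules installations; the specific module names and versions shown are hypothetical examples, not Argon's actual catalog, so check `module avail` on the cluster for the real names.

```shell
# List every module (software package/version) available on the cluster
module avail

# Load specific versions (names below are illustrative placeholders)
module load gcc/9.2.0        # hypothetical compiler module
module load openmpi/3.1.4    # hypothetical OpenMPI module
module load python/3.7       # hypothetical Python 3 module

# Show what is currently loaded in this shell session
module list

# Swap one loaded version for another without unloading by hand
module swap python/3.7 python/2.7   # hypothetical version switch

# Unload all modules when finished
module purge
```

Because modules only modify the current shell's environment (PATH, LD_LIBRARY_PATH, and so on), old and new application versions can coexist on the system: each user or job script simply loads the versions it needs.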
UI has partnered with several national computing and scientific organizations to provide additional computing resources to its researchers and staff. ITS - Research Services can help facilitate access to national computing resources, such as those provided by XSEDE, including large-scale computational resources and expertise that may not be available on the UI campus. Additionally, we can help facilitate access to commercial cloud platforms such as AWS and Azure.
For more details, visit the Research Services website. Additional resources are available through the Great Lakes Consortium for Petascale Computation (GLCPC) and the Blue Waters Project.