Center for High Throughput Computing
The Center for High Throughput Computing (CHTC), established in 2006, aims to bring the power of High Throughput Computing (HTC) to all fields of research, and to allow the future of HTC to be shaped by insight from all fields.
High Throughput Computing is a collection of principles and techniques that maximize the effective throughput of computing resources applied to a given problem. In scientific computing, HTC can improve the use of a computing resource, increase automation, and help drive the scientific problem forward.
The team at CHTC develops technologies and services for HTC. CHTC is the home of the HTCondor Software Suite, which embodies over 30 years of experience in tackling HTC problems; it manages shared computing resources for researchers on the UW-Madison campus; and it leads the OSG Consortium, a national-scale environment for distributed HTC.
High-throughput computing as an enabler of black hole science
The stunning new image of a supermassive black hole in the center of the Milky Way was created by eight telescopes, 300 international astronomers and more than 5 million computational tasks. This Morgridge Institute article describes how the Wisconsin-based Open Science Pool helped make sense of it all.
NIAID/ACE - OSG collaboration leads to a successful virtual training session
The U.S. National Institute of Allergy and Infectious Diseases (NIAID) and the African Centers for Excellence in Bioinformatics and Data-Intensive Science (ACE) partnered with the OSG Consortium to host a virtual high throughput computing training session for graduate students from Makerere University and the University of Sciences, Techniques, and Technologies of Bamako (USTTB).
Machine Learning and Image Analyses for Livestock Data
In this presentation from HTCondor Week 2021, Joao Dorea from the Digital Livestock Lab explains how high-throughput computing is used in the field of animal and dairy sciences.
The HTCondor Software Suite (HTCSS) gives sites and users the ability to manage and execute HTC workloads. Whether managing a single laptop or 250,000 cores at CERN, HTCondor helps solve computational problems through the application of HTC principles.
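As an illustration of how HTCSS describes work, a user typically writes a submit description file and hands it to HTCondor with `condor_submit`. The sketch below is a minimal, hypothetical example; the script name, arguments, and resource requests are illustrative assumptions, not taken from CHTC documentation.

```
# Hypothetical HTCondor submit description file (e.g., job.sub)
executable = analyze.sh          # the user's script (hypothetical name)
arguments  = input_0.dat         # illustrative argument

# Files capturing job output and HTCondor's event log
output = job_$(Cluster).out
error  = job_$(Cluster).err
log    = job_$(Cluster).log

# Resources the job requests from the pool
request_cpus   = 1
request_memory = 2GB
request_disk   = 1GB

queue                            # submit one job
```

Running `condor_submit job.sub` on an access point queues the job; HTCondor then matches it to an available machine in the pool and returns the results when it completes.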
CHTC manages over 20,000 cores and dozens of GPUs for the UW-Madison campus; this free, shared resource advances the mission of the University of Wisconsin in support of the Wisconsin Idea. Researchers can place their workloads on an access point at CHTC and utilize resources at CHTC, across campus, and across the nation.
CHTC’s Research Facilitation team empowers researchers to utilize computing to achieve their goals. The Research Facilitation approach emphasizes teaching users the skills and methodologies to manage and automate workloads on resources at CHTC, across campus, and around the world.
As part of its many services to UW-Madison and beyond, the CHTC is home to or supports the following Research Projects and Partners.
The OSG is a consortium of research collaborations, campuses, national laboratories, and software providers dedicated to advancing open science through the practice of distributed High Throughput Computing (dHTC), and to advancing its state of the art. The OSG operates a fabric of dHTC services for the national science and engineering community, and CHTC has been a major force in OSG since its inception in 2005.
The Partnership to Advance Throughput Computing (PATh) is a partnership between CHTC and OSG. Funded through a major investment from NSF, PATh advances HTC at a national level through support for HTCSS and provides a fabric of services for the NSF science and engineering community to access resources across the nation.
The Morgridge Institute for Research is a private, biomedical research institute located on the UW-Madison campus. Morgridge’s Research Computing Theme is a unique partner with CHTC, investing in the vision of HTC and its ability to advance basic research.