High performance computing is essential to supporting all aspects of data-driven research. HPC-related research includes computer architecture, systems software and middleware, networks, parallel and high performance algorithms, programming paradigms, and run-time systems for data science.
Rich Vuduc, Director
Georgia Tech is now an established leader in computational techniques and algorithms for high performance computing and massive data. The center aims to advance the state of the art in massive data and high performance computing technology, and exploit HPC to solve high-impact real-world problems. The inherent complexity of these problems necessitates both advances in high performance computing and breakthroughs in our ability to extract knowledge from and understand massive complex data. The center's focus is primarily on algorithms and applications. Recent big data research has been in the areas of graph analytics, big data analytics for high throughput DNA sequencing, converting electronic health records into clinical phenotypes, and determining composite characteristics of metal alloys. Both IDEaS and the HPC Center will be co-located in the upcoming Coda building in Tech Square.
Tom Conte, Director
Tom Conte has spearheaded the development of the Center for Research into Novel Computing Hierarchies (CRNCH) to address fundamental challenges in the design of computer architectures. In 2015, the IEEE “Rebooting Computing Initiative” posed a grand challenge to create a computer able to adapt to data-driven problems and ultimately emulate computation with the efficiency of the human brain. This challenge is largely motivated by the end of Moore’s Law, the historical trend of computer performance doubling roughly every 18 months, which has now been curtailed by physical limitations. Massive data sets, and the research challenges they pose, make leadership in novel architectures critical to Georgia Tech’s computational leadership.