High Performance Computing: Enabling Complex Decision Making in the Information Era

Ananth Grama
Director, Computational Science and Engineering
Associate Director, Center for Science of Information
Professor of Computer Science, Purdue University

High performance computing has emerged as a critical enabling technology for simulation, design, and prototyping. More recently, large-scale computing platforms have found extensive use in data analytics and decision support. Purdue has significant strengths in all of these areas. This talk summarizes research at Purdue on various aspects of high performance computing. It also highlights closely related efforts in data analytics and a comprehensive graduate program -- the Computational Science and Engineering specialization, one of the top-ranked programs worldwide, with over 150 affiliated faculty and 160 graduate students. The talk will focus on Purdue's presence in the following areas:

High Performance Computing: Methods, Software, Environments, and Platforms. Research at Purdue has resulted in highly scalable numerical techniques for linear and nonlinear system solvers, eigenvalue problems, and multi-body simulations (from atomistic simulations to PPM/SPH/MPM methods), along with their applications. These methods provide the basis for complete simulation environments that integrate uncertainty quantification techniques in support of predictive design. They have been validated in applications ranging from nanoscale devices to large wind turbines. A number of public-domain software libraries have been developed, along with web-enabled services that provide remote access to these software frameworks on large parallel computers at ITaP.

Large-Scale Data Analytics: Algorithms, Statistical Techniques, and Systems Software. Purdue researchers have worked on topics including scalable distributed formulations of a variety of machine learning and data mining kernels and associated statistical frameworks.
In addition, they have developed systems software, ranging from programming models and compilers to runtime systems for distributed platforms. These systems have been validated on applications ranging from supervised/unsupervised learning to information retrieval and recommender systems.

Related Centers and Efforts: A number of centers at Purdue investigate closely related topics, including the Center for Science of Information, a Science and Technology Center of the National Science Foundation. This center explores quantitative approaches to data storage, representation, communication, analysis, valuation, and use in various contexts.

Education Programs: The Computational Science and Engineering program is a multidisciplinary program aimed at training graduate students from different disciplines in the use of computational methods. Core courses in the curriculum include parallel algorithms, programming, numerical methods, and optimization techniques. Optional courses from various departments specialize the skill set for specific disciplines. In addition to training students, this program provides critical expertise to a number of large projects on campus, including the PRISM Center, the Center for Science of Information, and the Network for Computational Nanotechnology. A new Data Science graduate specialization will be initiated in 2014-15.