
Professor Anuran Makur receives IEEE Dissertation Award

07-26-2021

Anuran Makur

Assistant Professor Anuran Makur has received the IEEE Information Theory Society Thomas M. Cover Dissertation Award. Makur’s thesis, “Information Contraction and Decomposition,” was completed in June 2019 in the EECS Department at MIT under the joint supervision of Lizhong Zheng and Yury Polyanskiy. Makur holds a joint appointment in the Department of Computer Science and the School of Electrical and Computer Engineering at Purdue University.

Makur received a B.S. degree with highest honors (summa cum laude) from the Department of Electrical Engineering and Computer Sciences at the University of California, Berkeley (UC Berkeley), CA, USA, in 2013, and the S.M. and Sc.D. degrees from the Department of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology (MIT), Cambridge, MA, USA, in 2015 and 2019, respectively. He completed his postdoctoral research at the Laboratory for Information and Decision Systems at MIT. His research interests include theoretical statistics and machine learning, information theory, and other areas in applied probability. He was a recipient of the Arthur M. Hopkin Award from UC Berkeley in 2013, the Irwin Mark Jacobs and Joan Klein Jacobs Presidential Fellowship from MIT in 2013, the Ernst A. Guillemin Master's Thesis Award from MIT in 2015, and the Jin Au Kong Doctoral Thesis Award from MIT in 2020.

The IEEE Information Theory Society Thomas M. Cover Dissertation Award, established in 2013, is awarded annually to the author of an outstanding doctoral dissertation contributing to the mathematical foundations of any of the information sciences within the purview of the Society including, but not limited to, Shannon theory, source and channel coding theory, data compression, learning theory, quantum information theory and computing, complexity theory, and applications of information theory in probability and statistics.


Thesis summary
Information contraction is one of the most fundamental concepts in information theory as evidenced by the numerous classical converse theorems that utilize it. This dissertation studies several problems aimed at better understanding this notion, broadly construed, within the intertwined realms of information theory, statistics, and discrete probability theory.

In information theory, the contraction of f-divergences, such as Kullback-Leibler (KL) divergence, χ²-divergence, and total variation (TV) distance, through channels (or the contraction of mutual f-information along Markov chains) is quantitatively captured by the well-known data processing inequalities. These inequalities can be tightened to produce “strong” data processing inequalities (SDPIs), which are obtained by introducing appropriate channel-dependent or source-channel-dependent “contraction coefficients.”
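Stated concretely (a sketch in standard notation rather than notation drawn from the thesis itself: W denotes the channel P_{Y|X}, P and Q are input distributions, and η_f is the contraction coefficient):

% Data processing inequality for an f-divergence through a channel W:
\[
  D_f(PW \,\|\, QW) \;\le\; D_f(P \,\|\, Q).
\]
% Strong data processing inequality (SDPI), with channel-dependent
% contraction coefficient:
\[
  D_f(PW \,\|\, QW) \;\le\; \eta_f(W)\, D_f(P \,\|\, Q),
  \qquad
  \eta_f(W) \;:=\; \sup_{P \neq Q} \frac{D_f(PW \,\|\, QW)}{D_f(P \,\|\, Q)} \;\le\; 1.
\]
% The source-channel-dependent variant fixes the input distribution P_X:
\[
  \eta_f(P_X, W) \;:=\; \sup_{Q \neq P_X} \frac{D_f(QW \,\|\, P_X W)}{D_f(Q \,\|\, P_X)}.
\]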

The thesis first proves various properties of contraction coefficients of source-channel pairs, and derives linear bounds on specific classes of such contraction coefficients in terms of the contraction coefficient for χ²-divergence (or the Hirschfeld-Gebelein-Rényi maximal correlation). Next, the thesis adopts a more statistical and machine learning perspective in elucidating the elegant geometry of SDPIs for χ²-divergence by developing modal decompositions of bivariate distributions based on singular value decompositions of conditional expectation operators.
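To make the χ²-divergence case concrete, here is a minimal numerical sketch (the binary symmetric channel example and all variable names are illustrative assumptions, not code from the thesis). For a finite bivariate distribution, the Hirschfeld-Gebelein-Rényi maximal correlation is the second-largest singular value of the matrix with entries P(x,y)/√(P(x)P(y)), and its square is the contraction coefficient for χ²-divergence of the source-channel pair:

import numpy as np

# Illustrative joint distribution P(X, Y): a uniform bit X sent through a
# binary symmetric channel with crossover probability delta (assumed values).
delta = 0.1
P = np.array([[(1 - delta) / 2, delta / 2],
              [delta / 2, (1 - delta) / 2]])

px = P.sum(axis=1)  # marginal of X
py = P.sum(axis=0)  # marginal of Y

# Matrix B[x, y] = P(x, y) / sqrt(P(x) * P(y)); its singular values are
# 1 = sigma_1 >= sigma_2 >= ..., and sigma_2 is the Hirschfeld-Gebelein-Renyi
# maximal correlation rho(X; Y).
B = P / np.sqrt(np.outer(px, py))
sigma = np.linalg.svd(B, compute_uv=False)
rho = sigma[1]

print(f"maximal correlation rho     = {rho:.4f}")      # 0.8 = 1 - 2*delta
print(f"chi^2 contraction coefficient = {rho**2:.4f}")  # 0.64 = (1 - 2*delta)^2

For this channel the script prints ρ = 0.8 = 1 − 2δ and a contraction coefficient of 0.64, matching the known closed form for the binary symmetric channel with uniform input. The singular vectors of the same matrix furnish the kind of modal decomposition described above.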

