
Phone: (808) 956-9741
Office: POST 205F


My work covers theoretical and practical topics at the intersection of statistical learning and information theory. The focus is on high-dimensional and complex problems that are not amenable to traditional statistical methods and guarantees. At the same time, we try to understand the fundamental limits on when learning is possible, and how to characterize non-uniform learning. We also explore complexity in sampling, such as from slow-mixing Markov processes (for example, obtaining news on highly polarized topics on the Internet), and study how to interpret data from such sources. A minimal sketch of why slow mixing matters appears below.
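To make the slow-mixing point concrete, here is a toy sketch of my own (with made-up parameters, not code from the papers below): a two-state chain that switches state with probability eps mixes in roughly 1/eps steps, so sample paths much shorter than that look far more one-sided than the uniform stationary distribution.

    import random

    def sample_chain(eps, n, seed=0):
        # Two-state chain: flip state with probability eps each step.
        # The stationary distribution is uniform (1/2, 1/2), but the
        # mixing time scales like 1/eps, so for n << 1/eps the empirical
        # frequencies are dominated by the initial state.
        rng = random.Random(seed)
        state, path = 0, []
        for _ in range(n):
            if rng.random() < eps:
                state = 1 - state
            path.append(state)
        return path

    for eps in (0.5, 0.01, 0.0001):
        path = sample_chain(eps, n=1000)
        print(f"eps={eps}: empirical P(state=1) = {sum(path)/len(path):.3f}")
    # eps=0.5 gives an estimate near the true 0.5; eps=0.0001 keeps the
    # chain in its initial state for almost the whole run, so the
    # estimate is near 0 despite 1000 "samples".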

My research program is currently funded by grants from the National Science Foundation, and my latest work tries to reshape the way we think about some statistical problems. At a high level, consider the nature of scientific discovery: we keep refining theories as we see more data. The natural question, then, is whether the refinements will ever end. Will there come a point at which we can make up our minds and say with confidence that our inference is good enough, and that no further data will change it substantially? A toy version of this question is sketched below.
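As a toy illustration (my own example with assumed parameters, not a result from the papers below): when estimating the bias of a coin, the running estimate converges almost surely, so if we only revise a published value when the estimate moves by more than a fixed tolerance, only finitely many revisions occur. The substantive questions are when such stabilization happens, and whether it can be detected, in much richer model classes.

    import random

    def revision_times(p=0.7, n=100_000, tol=0.05, seed=1):
        # Record the times at which the running estimate of the coin's
        # bias drifts more than tol from the last published value.
        # Since the estimate converges almost surely, revisions stop.
        rng = random.Random(seed)
        heads, published, revisions = 0, None, []
        for t in range(1, n + 1):
            heads += rng.random() < p
            estimate = heads / t
            if published is None or abs(estimate - published) > tol:
                published = estimate
                revisions.append(t)
        return revisions

    print(revision_times())  # all revisions cluster in the early steps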

There are a number of nuances to this broad perspective, and our recent publications listed below summarize some of our results. A more complete list of publications is available via my CV here.

[1] M. Hosseini and N. Santhanam. Tail redundancy and its characterization of compression of memoryless sources. IEEE Journal on Selected Areas in Information Theory, 2023. Available online here.

[2] N. Santhanam, V. Anantharam, and W. Szpankowski. Data driven weak universal compression. Journal of Machine Learning Research, 2022. Available online here.

[3] C. Wu and N. Santhanam. Non-uniform consistency of online learning with random sampling. In Proceedings of the 32nd International Conference on Algorithmic Learning Theory, 2021. Available online here.

[4] C. Wu and N. Santhanam. Prediction with finitely many errors almost surely. In Proceedings of the 24th International Conference on Artificial Intelligence and Statistics, 2021. Available online here.

[5] C. Wu and N. Santhanam. Entropy property testing with finitely many errors. In Proceedings of the IEEE International Symposium on Information Theory (virtual conference due to COVID-19), 2020.

[6] C. Wu and N. Santhanam. Almost uniform sampling from neural networks. In Proceedings of the 54th Annual Conference on Information Sciences and Systems, 2020.

Semester        Number      Title                                 Times                        Location
Fall 2021       EE 342      Probability and Statistics            MWF 10:30-11:20              Virtual
Fall 2021       EE 345/lab  Linear Algebra and Machine Learning   MWF 8:30-9:30, Thu 9-11:45   Virtual
Every semester  EE x96      Design Project                        Mon 10:30am, Tue 4:30pm      POST 205F
Spring 2022     EE 646      Information Theory                    MW 9-10:30                   Sakamaki B301