COM3-02-57
6516 1179

www.comp.nus.edu.sg/~scarlett

Jonathan SCARLETT

Associate Professor
Assistant Dean, Graduate Studies
Department of Computer Science, National University of Singapore
Department of Mathematics, National University of Singapore
Institute of Data Science, National University of Singapore

  • Ph.D. (Information Engineering, University of Cambridge, 2014)
  • B.Eng. (Electrical Engineering, University of Melbourne, 2010)
  • B.Sci. (Computer Science, University of Melbourne, 2010)

Jonathan is an associate professor jointly in the Department of Computer Science, Department of Mathematics, and Institute of Data Science, National University of Singapore. His research interests are in the areas of information theory, machine learning, and high-dimensional statistics. In 2010, Jonathan received the B.Eng. degree in electrical engineering and the B.Sci. degree in computer science from the University of Melbourne, Australia. From October 2011 to August 2014, he was a Ph.D. student in the Signal Processing and Communications Group at the University of Cambridge, United Kingdom. From September 2014 to September 2017, he was a post-doctoral researcher with the Laboratory for Information and Inference Systems at the École Polytechnique Fédérale de Lausanne (EPFL), Switzerland. He is a recipient of the Singapore National Research Foundation (NRF) Fellowship and the NUS Presidential Young Professorship.

RESEARCH INTERESTS

  • Machine Learning

  • Information Theory

  • High-Dimensional Statistics

  • Bayesian Optimization

  • Group Testing

RESEARCH PROJECTS

Information-theoretic limits of statistical inference and learning problems

The field of information theory was introduced to understand the fundamental limits of data compression and transmission, and has shaped the design of practical communication systems for decades. This project pursues the emerging perspective that information theory is not only a theory of communication, but a far-reaching theory of data benefiting diverse inference and learning problems.
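As a toy illustration of the "fundamental limits" this project builds on (an illustrative sketch only, not code from the project), the short Python snippet below computes two classical quantities: the Shannon entropy of a source, which lower-bounds the average number of bits per symbol needed for lossless compression, and the capacity of a binary symmetric channel, which upper-bounds the rate of reliable transmission.

import math

def entropy(probs):
    # Shannon entropy (in bits) of a discrete distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    # Capacity (bits per channel use) of a binary symmetric channel
    # with crossover probability p: C = 1 - H2(p).
    return 1.0 - entropy([p, 1.0 - p])

# A biased binary source is compressible to about 0.469 bits per symbol on average.
print(f"H(0.1, 0.9)      = {entropy([0.1, 0.9]):.3f} bits")
# A channel flipping each bit with probability 0.11 has capacity about 0.5.
print(f"C(BSC, p = 0.11) = {bsc_capacity(0.11):.3f} bits per channel use")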


Modern methods for high-dimensional estimation and learning

Extensive research has led to a variety of powerful techniques for high-dimensional learning, with the prevailing approaches assuming low-dimensional structure such as sparsity and low-rankness. This project pursues a paradigm shift towards data-driven techniques, including the replacement of explicit modeling assumptions by implicit generative models based on deep neural networks.
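The classical sparsity-based approach mentioned above can be illustrated with a small sketch: recovering a sparse high-dimensional vector from far fewer random linear measurements than the ambient dimension via l1-regularised least squares (the Lasso). The use of scikit-learn and all parameter choices below are illustrative assumptions, not the project's methods.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, d, k = 80, 200, 5                              # measurements, dimension, sparsity

x_true = np.zeros(d)                              # sparse ground-truth signal
support = rng.choice(d, size=k, replace=False)
x_true[support] = rng.normal(size=k)

A = rng.normal(0, 1.0 / np.sqrt(n), size=(n, d))  # random Gaussian sensing matrix
y = A @ x_true + 0.01 * rng.normal(size=n)        # noisy linear measurements

lasso = Lasso(alpha=0.01, max_iter=10000)         # l1-regularised least squares
lasso.fit(A, y)
x_hat = lasso.coef_

print("true support:     ", sorted(support.tolist()))
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.05).tolist())
print("relative error:   ", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))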


Theory and algorithms for group testing

Group testing is a classical sparse estimation problem that seeks to identify "defective" items by testing groups of items in pools, with applications including database systems, communication protocols, and COVID-19 testing. This project seeks to push recent advances further towards settings that better account for crucial practical phenomena, including noisy outcomes and prior information.
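A toy simulation of the pooled-testing idea described above (an illustrative sketch only, not the project's code): items are placed into random pools, and the simple COMP decoder rules out any item that appears in a negative test. In the noiseless setting COMP never misses a true defective, but it can return false positives when too few tests are used.

import numpy as np

rng = np.random.default_rng(1)
n, k, T = 500, 5, 100                          # items, defectives, number of tests

defective = np.zeros(n, dtype=bool)
defective[rng.choice(n, size=k, replace=False)] = True

# Bernoulli design: each item joins each pool independently with probability ~ln(2)/k,
# so that roughly half of the tests come out positive.
X = rng.random((T, n)) < np.log(2) / k         # T x n pooling matrix
outcomes = (X.astype(int) @ defective.astype(int)) > 0   # positive iff the pool contains a defective

# COMP decoder: any item appearing in at least one negative test is non-defective;
# everything else is declared (possibly) defective.
in_negative_test = (X & ~outcomes[:, None]).any(axis=0)
declared = ~in_negative_test

print("true defectives:    ", np.flatnonzero(defective).tolist())
print("declared defectives:", np.flatnonzero(declared).tolist())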


Theory and algorithms for Bayesian optimization

Bayesian optimization has emerged as a versatile tool for optimizing black-box functions, with particular success in automating machine learning algorithms (e.g., in the famous AlphaGo program). This project seeks to advance the state-of-the-art theory and algorithms, with an emphasis on practical variations that remain lesser-understood, including adversarial corruptions and high dimensionality.
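As a concrete sketch of Bayesian optimization, the snippet below runs the standard GP-UCB rule on a one-dimensional toy objective: fit a Gaussian process to the queries made so far, then query the point maximizing the upper confidence bound mu(x) + beta * sigma(x). The use of scikit-learn and all parameter values are illustrative assumptions, not the project's algorithms.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):
    # Unknown black-box objective; only point evaluations are available.
    return np.sin(3 * x) + 0.5 * np.cos(5 * x)

rng = np.random.default_rng(0)
grid = np.linspace(0, 2, 400).reshape(-1, 1)     # candidate query points in [0, 2]
X = [x for x in rng.uniform(0, 2, size=(2, 1))]  # two random initial queries
y = [f(x).item() for x in X]
beta = 2.0                                       # exploration parameter

for t in range(15):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), alpha=1e-4)
    gp.fit(np.array(X), np.array(y))
    mu, sigma = gp.predict(grid, return_std=True)
    x_next = grid[np.argmax(mu + beta * sigma)]  # UCB acquisition: mean + beta * std
    X.append(x_next)
    y.append(f(x_next).item())

best = int(np.argmax(y))
print("best point found:", X[best].item(), "with value", y[best])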


Robustness considerations in machine learning

Robustness requirements pose many of the most important unsolved challenges in modern machine learning, arising from sources of uncertainty such as mismatched models, corrupted data, and adversaries. This project seeks to better understand some of the most practically pertinent sources of uncertainty and develop new algorithms that are robust in the face of this uncertainty.


Safety and reliability in black-box optimization

This project seeks to enhance safety, reliability, and robustness in black-box optimization, exploring new function structures and addressing the limitations of existing approaches. This includes extending decision-making frameworks to grey-box settings and multi-agent learning, using a methodology that blends theoretical analysis with algorithm development.


RESEARCH GROUPS

Information Theory and Statistical Learning Group

Our group performs research at the intersection of information theory, machine learning, and high-dimensional statistics, with ongoing areas of interest including information-theoretic limits of learning, adaptive decision-making under uncertainty, scalable algorithms for large-scale inference and learning, and robustness considerations in machine learning.



SELECTED PUBLICATIONS

  • Matthew Aldridge, Oliver Johnson, and Jonathan Scarlett, "Group testing: An information theory perspective," Foundations and Trends in Communications and Information Theory, Volume 15, Issue 3-4, pp. 196-392, Dec. 2019.
  • Jonathan Scarlett, "Tight regret bounds for Bayesian optimization in one dimension," International Conference on Machine Learning (ICML), 2018.
  • Ilija Bogunovic, Jonathan Scarlett, Stefanie Jegelka, and Volkan Cevher, "Adversarially robust optimization with Gaussian processes," Conference on Neural Information Processing Systems (NeurIPS), 2018.
  • Jonathan Scarlett, Ilija Bogunovic, and Volkan Cevher, "Lower bounds on regret for noisy Gaussian process bandit optimization," Conference on Learning Theory (COLT), Amsterdam, 2017.
  • Jonathan Scarlett, "Noisy adaptive group testing: Bounds and algorithms," IEEE Transactions on Information Theory, Volume 65, Issue 6, pp. 3646-3661, June 2019.

AWARDS & HONOURS

  • Singapore National Research Foundation (NRF) Fellowship

  • NUS Early Career Research Award

MODULES TAUGHT

CS5275
The Algorithm Designer's Toolkit

In the News

3 November 2021
Assistant Professor Jonathan Scarlett has been named in this year’s ‘Innovators Under 35’ Asia Pacific List ...
5 February 2021
Twenty-two research papers by NUS Computing faculty and students are featured in the 35th AAAI Conference ...

Knowledge@Computing

2 October 2019
If you’ve ever had an MRI done, you would know that it’s not the most comfortable experience. They can make ...