FJ-LMI Seminar


Organizer(s) Toshiyuki Kobayashi, Michael Pevzner

Next seminar

2026/01/09

10:00-12:00   Room #056 (Graduate School of Math. Sci. Bldg.)
Thomas Karam (Shanghai Jiao Tong University)
Contributions of information theory to pure mathematics (English)
[ Abstract ]
Information theory, founded by Shannon (1948), was originally motivated by communications engineering and has since grown to occupy a key role in several major approaches to artificial intelligence, including machine learning and neural networks. Lecture 1 shall discuss the origins and the definition of Shannon entropy, as well as two approaches naturally leading to that definition. Lecture 2 shall then cover the definitions of the central information-theoretic quantities beyond the Shannon entropies of random variables, and the main identities and inequalities that they satisfy. Lecture 3 shall then specialise these results to recover many of the standard identities and inequalities involving dimensions of groups, dimensions of linear spaces and sizes of sets.
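
As a brief point of reference for Lectures 1 to 3 (in standard notation, not fixed by the abstract): for a discrete random variable X taking the value x with probability p(x), the Shannon entropy is

H(X) = - \sum_{x} p(x) \log p(x),

and the uniform distribution on a finite set S has entropy \log |S|, which is the basic link between entropy inequalities and statements about sizes of sets.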

After that, Lectures 4, 5, 6 and 7 shall each illustrate a way in which basic information theory has provided tools enabling first proofs, or new enlightening proofs, of several results in pure mathematics that have simple and accessible formulations and are central to their respective areas. In probability, we shall highlight an entropy proof of the central limit theorem and the underlying analogy between Shannon entropy and thermodynamic entropy. In geometry, we shall explore applications of entropy to higher-dimensional geometry, in particular through Shearer’s lemma (1986) and the resulting control of the size of a set by its projections. In pure combinatorics, we shall focus on a breakthrough of Gilmer (2022) on the infamous conjecture of Frankl (1979) on union-closed families of sets. In combinatorial number theory, we shall outline the solution by Gowers, Green, Manners and Tao (2024) to Marton’s conjecture, one of the central problems of the area.
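
To give a flavour of the geometric application (stated here in standard notation, as one common special case of Shearer’s lemma rather than necessarily the form taken in the lectures): for any finite set A in three-dimensional space, with A_{xy}, A_{yz}, A_{xz} denoting its projections onto the three coordinate planes,

|A|^2 \le |A_{xy}| \, |A_{yz}| \, |A_{xz}|,

the discrete Loomis-Whitney inequality, so that the size of A is controlled by the sizes of its three projections.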

Finally, Lecture 8 will be devoted to a brief glimpse of the mathematically beautiful theory of information geometry, recognised last year (2025) by the award of the Kyoto Prize to its founder Amari, and will conclude with some of its practical applications – to neural networks – as Shannon himself presumably would have done.
[ Reference URL ]
https://fj-lmi.cnrs.fr/seminars/