Spring school 

Convex geometry and random matrices in high dimensions

14-18 June 2021, Paris (France) - Amphithéâtre Hermite - Institut Henri Poincaré

Abstracts

Ramon van Handel (Princeton University)

Title: Around the Alexandrov-Fenchel inequality

Abstract: The isoperimetric theorem states that the ball minimizes surface area among all bodies of the same volume. For convex bodies, however, volume and surface area are merely two examples of a large family of natural geometric parameters called mixed volumes that arise as coefficients of the volume polynomial. Mixed volumes were discovered by Minkowski in a seminal 1903 paper that laid much of the foundation for modern convex geometry. In particular, Minkowski, Alexandrov and Fenchel discovered a remarkable set of quadratic inequalities between mixed volumes that constitute a far-reaching generalization of the classical isoperimetric theorem. The theory of these inequalities and their applications is characterized by unexpected connections with various questions in geometry, analysis, algebra, and combinatorics, and features some long-standing open problems. My aim in these lectures is to introduce some of the problems, connections, and recent progress on this topic.
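
For orientation (this sketch is not part of the abstract), the objects mentioned above can be written down as follows; the normalization is one common convention.

% Minkowski's theorem: for convex bodies K_1,...,K_m in R^n and lambda_i >= 0,
% the volume of the Minkowski sum is a homogeneous polynomial of degree n.
\[
  \mathrm{vol}_n(\lambda_1 K_1 + \cdots + \lambda_m K_m)
  = \sum_{i_1,\dots,i_n = 1}^{m}
    V(K_{i_1},\dots,K_{i_n})\, \lambda_{i_1} \cdots \lambda_{i_n},
\]
% where the symmetric coefficients V(K_{i_1},...,K_{i_n}) are the mixed volumes.
% The Alexandrov-Fenchel inequality is the quadratic relation
\[
  V(K, L, C_3, \dots, C_n)^2
  \;\ge\; V(K, K, C_3, \dots, C_n)\, V(L, L, C_3, \dots, C_n)
\]
% for convex bodies K, L, C_3,...,C_n in R^n. Specializing to L = B (the
% Euclidean unit ball) and C_3 = ... = C_n = K, and iterating, yields the
% classical isoperimetric inequality, since the surface area of K is
% n V(K,...,K,B) and vol_n(K) = V(K,...,K).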

 

Mark Rudelson (University of Michigan)

Title: On the delocalization of the eigenvectors of random matrices

Abstract: Consider a random matrix with i.i.d. normal entries. Since its distribution is invariant under rotations, any normalized eigenvector is uniformly distributed over the unit sphere. For a general distribution of the entries, this is no longer true. Yet, if the size of the matrix is large, the eigenvectors are distributed approximately uniformly. This property, called delocalization, can be quantified in various senses. In these lectures, we will discuss recent results on delocalization for general random matrices.
We will consider two notions of delocalization. First, we will discuss sup-norm delocalization. It is easy to see that all coordinates of a vector uniformly distributed over the unit sphere are small with probability close to 1. We strive to extend this fact to the eigenvectors of general random matrices. We will also consider a notion of "no-gaps" delocalization. Namely, we will show that with high probability, any relatively large set of coordinates carries a non-negligible portion of the norm of the eigenvector. These two notions are complementary to each other: while sup-norm delocalization rules out large coordinates of the eigenvector, no-gaps delocalization rules out the small ones. If time allows, we will consider applications of delocalization to random graphs in which these two properties work in tandem.
Our approach to establishing delocalization will rely in large part on ideas from high-dimensional convex geometry and measure concentration. The lectures will be self-contained, and all necessary tools will be introduced along the way.
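
As a rough guide (not part of the abstract; the constants and polylogarithmic exponents below are indicative only and vary between results), the two notions can be stated for a unit eigenvector $v = (v_1,\dots,v_n)$ of an $n \times n$ random matrix as follows.

% Sup-norm delocalization: no single coordinate carries too much mass.
\[
  \|v\|_\infty \;=\; \max_{1 \le i \le n} |v_i|
  \;\le\; \frac{C (\log n)^{c}}{\sqrt{n}}
  \quad \text{with high probability,}
\]
% which matches, up to the polylogarithmic factor, the behaviour of a vector
% uniformly distributed on the unit sphere, whose sup-norm is of order
% sqrt(log n / n).
%
% No-gaps delocalization: every large set of coordinates carries mass.
\[
  \sum_{i \in S} |v_i|^2 \;\ge\; \delta(\varepsilon)
  \quad \text{for every } S \subseteq \{1,\dots,n\} \text{ with } |S| \ge \varepsilon n,
\]
% again with high probability, where \delta(\varepsilon) > 0 depends only on
% \varepsilon (the precise dependence differs across results).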
 

