Analysis Seminar

Homogenization questions inspired by machine learning and the semi-supervised learning problem

Speaker: Raghavendra Venkatraman, Courant Institute

Location: Warren Weaver Hall 1302

Date: Thursday, December 8, 2022, 11 a.m.


This talk comprises two parts. The first half (joint with Dejan Slepcev (CMU)) revisits the problem of pointwise semi-supervised learning (SSL). Working on random geometric graphs (a.k.a. point clouds) with a few "labeled points", our task is to propagate these labels to the rest of the point cloud. Algorithms based on the graph Laplacian are often found to perform poorly in such pointwise learning tasks, since minimizers develop localized spikes near the labeled data and are essentially constant elsewhere. We introduce a class of graph-based higher-order fractional Sobolev spaces (H^s), study their consistency in the large-data limit, and give applications to the SSL problem. The mathematical essence of the question is the continuity of the pointwise-evaluation functional, uniformly in the number of data points. A crucial tool is recent results on convergence of the spectrum of the graph Laplacian to that of its continuum counterpart.
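As a rough illustration of the setup (a hypothetical sketch, not the speaker's code or definitions): on a random geometric graph, label propagation with the graph Laplacian minimizes the Dirichlet energy subject to the label constraints, and one natural graph analogue of an H^s energy replaces L by its spectral power L^s. All names and parameter choices below are illustrative.

```python
# Hypothetical sketch of graph-based SSL with Laplacian powers; parameters
# (bandwidth, penalties, exponents) are illustrative, not from the talk.
import numpy as np

rng = np.random.default_rng(0)
n = 300
X = rng.uniform(size=(n, 2))              # random point cloud in the unit square

# Weighted geometric graph with Gaussian weights at bandwidth eps.
eps = 0.15
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
W = np.exp(-D2 / eps**2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W            # unnormalized graph Laplacian

labeled = np.array([0, 1])                # two labeled points
g = np.array([+1.0, -1.0])                # their labels
U = np.setdiff1d(np.arange(n), labeled)   # unlabeled indices

def propagate(s):
    """Minimize <u, L^s u> subject to u = g on the labeled points."""
    lam, V = np.linalg.eigh(L)
    lam = np.clip(lam, 0.0, None)         # clip tiny negative round-off
    Ls = (V * lam**s) @ V.T               # spectral fractional power L^s
    u = np.zeros(n)
    u[labeled] = g
    # First-order optimality on the unlabeled set: Ls_UU u_U = -Ls_UL g.
    u[U] = np.linalg.solve(Ls[np.ix_(U, U)], -Ls[np.ix_(U, labeled)] @ g)
    return u

u1 = propagate(s=1.0)   # classical graph-Laplacian (harmonic) extension
u4 = propagate(s=4.0)   # higher-order H^s-type energy
```

For s = 1 the constrained minimizer obeys a discrete maximum principle, and in the large-data regime the talk studies it degenerates into spikes at the labeled points; taking a larger exponent s penalizes such spikes, which is the motivation for the higher-order spaces.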

Obtaining optimal convergence rates for the spectrum of the graph Laplacian on point clouds is an open question in stochastic homogenization. In the second part of the talk, we consider such a question in the simpler context of periodic homogenization. For a Schrödinger equation on all of space with a confining potential and periodic coefficients, we obtain optimal convergence rates and high-order asymptotic expansions for the eigenvalues and eigenfunctions. Our results are also optimal in how high up in the spectrum we can go. This part is joint work with Scott Armstrong (Courant).
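The phenomenon can be illustrated in one dimension (a toy sketch, not the speaker's setting, which is on all of space in general dimension and concerns optimal rates): for -(a(x/ε) u')' + x² u with a periodic coefficient a and confining potential x², the eigenvalues converge as ε → 0 to those of the homogenized operator -a* u'' + x² u, where in 1D a* is the harmonic mean of a. The discretization below is illustrative.

```python
# Toy 1D illustration of eigenvalue homogenization for a Schroedinger operator
# with periodic coefficient a(x/eps) and confining potential x^2; the grid,
# domain truncation, and coefficient are illustrative choices.
import numpy as np

def ground_state_eigenvalue(eps, L=6.0, N=2000):
    """Smallest Dirichlet eigenvalue of -(a(x/eps)u')' + x^2 u on [-L, L]."""
    x = np.linspace(-L, L, N + 2)[1:-1]        # interior grid points
    h = x[1] - x[0]
    a = lambda y: 1.0 + 0.5 * np.cos(2 * np.pi * y)
    am = a((x[:-1] + 0.5 * h) / eps)           # coefficient at interior midpoints
    A = np.zeros((N, N))
    i = np.arange(N - 1)
    A[i, i + 1] = A[i + 1, i] = -am / h**2     # symmetric stiffness off-diagonals
    diag = np.zeros(N)
    diag[:-1] += am / h**2
    diag[1:] += am / h**2
    # Dirichlet boundary faces: coefficient at the two outermost midpoints.
    diag[0] += a((x[0] - 0.5 * h) / eps) / h**2
    diag[-1] += a((x[-1] + 0.5 * h) / eps) / h**2
    A[np.arange(N), np.arange(N)] = diag + x**2
    return np.linalg.eigvalsh(A)[0]

a_star = np.sqrt(1.0 - 0.5**2)        # harmonic mean of 1 + 0.5*cos(2*pi*y)
lam_star = np.sqrt(a_star)            # ground state of -a* u'' + x^2 u

# Error against the homogenized eigenvalue shrinks as eps decreases.
errs = {e: abs(ground_state_eigenvalue(e) - lam_star) for e in (0.4, 0.1)}
```

The talk's results quantify this convergence sharply, with rates and expansions that remain optimal high up in the spectrum; the sketch only exhibits the leading-order limit for the ground state.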