EE 6112: Topics in Random Processes and Concentrations

Piazza signup link


In Spring 2022, I am offering an advanced graduate-level course on probability theory, focusing on its applications to problems arising in theoretical machine learning. The course will tentatively cover the following topics.

A. Review of Basic Probability Theory

  • Measure Spaces, Sigma-Algebras, Random Variables
  • Expectation, Convergence Theorems
  • Conditional Expectation, Filtrations, $\mathcal{L}^2$ theory
  • Martingales

B. Concentration Inequalities

  • Martingale concentration (Azuma-Hoeffding, Doob's martingale method, median concentration), entropy methods
  • Logarithmic Sobolev inequality
  • Talagrand's inequality
  • Dudley's entropy integral, Sudakov's lower bound
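To give a feel for the inequalities in this unit, here is a minimal numerical sketch (illustrative only, not course material) comparing the empirical tail of the mean of i.i.d. uniform [0,1] samples against the Hoeffding bound P(mean - 1/2 >= t) <= exp(-2nt^2):

```python
import math
import random

def empirical_tail(n=200, t=0.1, trials=10000, seed=0):
    """Monte Carlo estimate of P(sample mean - 1/2 >= t) for n i.i.d. Uniform[0,1] draws."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(trials):
        mean = sum(rng.random() for _ in range(n)) / n
        if mean - 0.5 >= t:
            exceed += 1
    return exceed / trials

n, t = 200, 0.1
emp = empirical_tail(n=n, t=t)
hoeffding_bound = math.exp(-2 * n * t**2)  # exp(-4), roughly 0.018
# The empirical frequency stays (well) below the Hoeffding bound.
```

In this regime the true tail is far smaller than the bound, which is typical: Hoeffding uses only boundedness, not the variance, a gap that sharper inequalities covered in the course (e.g. Bernstein-type bounds) address.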

C. Application to Learning Theory

  • Generalization bounds, symmetrization, and concentration
  • Complexity of function classes
  • Information-theoretic lower bounds on estimation/testing
  • Online learning and Random Processes
  • Topics in High-Dimensional Probability
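As a preview of the complexity measures in this unit, the following sketch (illustrative only; the function class and point set are chosen for simplicity) estimates the empirical Rademacher complexity of one-sided threshold functions f_a(x) = 1{x <= a}. For this class the supremum over a reduces to a maximum over prefix sums of the signs, taken in sorted order of the data:

```python
import random

def empirical_rademacher(xs, trials=2000, seed=1):
    """Monte Carlo estimate of the empirical Rademacher complexity of
    F = { x -> 1{x <= a} : a real } on the sample xs."""
    xs = sorted(xs)  # thresholds only depend on the ordering of the points
    n = len(xs)
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        sigmas = [rng.choice([-1, 1]) for _ in range(n)]
        # sup over a of (1/n) * sum_{i: xs[i] <= a} sigma_i = max prefix sum / n
        best = prefix = 0.0
        for s in sigmas:
            prefix += s
            best = max(best, prefix)
        total += best / n
    return total / trials

rad_100 = empirical_rademacher(list(range(100)))
# Decays on the order of 1/sqrt(n), as the theory for VC-dimension-1 classes predicts.
```

The observed O(1/sqrt(n)) decay is exactly what plugs into the generalization bounds via symmetrization covered in this unit.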


Prerequisites

A solid background in probability theory and sufficient mathematical maturity, along with some exposure to machine learning and statistics.

Evaluations

Problem-solving will be our primary vehicle for learning the material. We will have bi-weekly problem sets, a mid-term exam, and a final project. The final grade will be a weighted average of these three components as detailed below:
  • Problem Sets (50%)
  • Mid-term Exam (20%)
  • Final Project (30%)


Problem Sets

References

We will not follow any one particular source. However, the following references will be useful: