# GRandMa: Random Graphs in Machine Learning

Graphs have become popular objects to represent many kinds of structured and relational data. As a consequence, the field of *Graph Machine Learning* (ML) has grown exponentially in the last few decades, with popular tools such as graph kernels, graph signal processing, and Graph Neural Networks (GNN). Despite this, classical ML analyses such as generalization or optimization guarantees, and the associated sample complexities and rates, remain relatively limited for graph data (with some notable exceptions such as spectral clustering and unsupervised learning). We argue that this is due to a relative lack of *statistical modelling*, in particular of large graphs, without which notions like generalization are ill-defined. On the other hand, *Random Graphs* (RG) represent a vast field in Statistics and Graph Theory, with a long history, but they have been largely overlooked in modern Graph ML.

**GRandMa (2022 - 2026) aims to fill this gap by incorporating Random Graphs into modern Graph ML algorithms. By improving our theoretical understanding of these algorithms and their limitations on large graphs, we hope to develop new, efficient Graph ML algorithms.**

## Jobs

**Spring 2022: Master 2 Internship + PhD** (filled): Random Graphs in Machine Learning. The core topic of GRandMa: the use of random graphs to study generalization and sample complexities in graph machine learning. With Simon Barthelmé and Yohann De Castro.

## Associated papers

- N. Keriven. **Not too little, not too much: a theoretical analysis of graph (over)smoothing**. *NeurIPS* 2022. Pdf
- N. Keriven. **Entropic Optimal Transport on Random Graphs**. *Preprint* 2022. Pdf
- N. Keriven, A. Bietti, S. Vaiter. **On the Universality of Graph Neural Networks on Large Random Graphs**. *NeurIPS* 2021. Pdf
- N. Keriven, A. Bietti, S. Vaiter. **Convergence and Stability of Graph Convolutional Networks on Large Random Graphs**. *NeurIPS (Spotlight)* 2020. Pdf