Work in progress with Yizhe Zhu and Ludovic Stefan: extension for regular graphs, regular digraphs and non-backtracking matrices.
We show that scattering transforms on graphs are continuous with respect to the local-weak distance: as a consequence, these graph descriptors are transferable among network models sharing the same local properties, and they exhibit a remarkable degree of stability, even on very sparse graph models. From an experimental perspective, we examine how these non-learned transforms characterize graph models and graph signals through moment-constrained sampling.
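To give a flavor of what a graph scattering transform computes, here is a minimal numerical sketch of one common construction (diffusion-based "geometric" scattering, first order only): wavelets are built as differences of lazy-random-walk diffusions at dyadic scales, a modulus nonlinearity is applied, and node-averaged moments are collected. This is an illustrative variant under my own assumptions, not necessarily the exact transform studied in the paper.

```python
import numpy as np

def scattering_moments(A, x, scales=(1, 2, 4), q_max=2):
    """Node-averaged moments of a wavelet-modulus cascade (first order only).

    P is the lazy random walk on the graph; the wavelet at scale j is the
    difference of diffusions P^j - P^{2j}, as in geometric scattering.
    """
    n = len(A)
    d = A.sum(axis=1)
    P = 0.5 * (np.eye(n) + A / d[:, None])           # lazy random walk operator
    feats = [np.mean(np.abs(x) ** q) for q in range(1, q_max + 1)]  # zeroth order
    for j in scales:
        wavelet = np.linalg.matrix_power(P, j) - np.linalg.matrix_power(P, 2 * j)
        u = np.abs(wavelet @ x)                      # modulus nonlinearity
        feats += [np.mean(u ** q) for q in range(1, q_max + 1)]
    return np.array(feats)

# Example: a smooth signal on the 6-cycle.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
x = np.cos(2 * np.pi * np.arange(n) / n)
print(scattering_moments(A, x))
```

Because the moments are averaged over nodes, the resulting descriptor is invariant under relabeling of the vertices, which is what makes comparisons across different graphs (and graph models) meaningful.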
Work in progress with Bartek B. and Bharatt Chowdhuri.
In parallel, I'm interested in the rigidities of random point processes, such as number-rigidity, fluctuation reduction, hyperuniformity, and the possible links between these notions. There are different ways in which a point process can exhibit a stronger order than the totally chaotic Poisson process; hyperuniformity is when the (random) number of points falling in a large domain of radius R has a reduced variance, that is, when this variance is o(R^d), negligible with respect to the volume of the domain.
In this survey, I try to give a mathematical overview of this rich domain. Topics: the Fourier characterization of hyperuniformity, the fluctuation scale, the links with number-rigidity and maximal rigidity for stealthy processes, the example of perturbed lattices.
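The perturbed-lattice example can be illustrated by a quick simulation: the variance of the number of points in a disk of radius R grows like the area (R^2) for a Poisson process, but only like the perimeter (R) for a Gaussian-perturbed Z^2 lattice. A minimal Monte-Carlo sketch, with window size, intensity and perturbation strength chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def number_variance_poisson(R, L=120.0, trials=100):
    """Monte-Carlo variance of the point count in a disk of radius R
    for a unit-intensity Poisson process (variance grows like the area)."""
    center = np.array([L / 2, L / 2])
    counts = []
    for _ in range(trials):
        n = rng.poisson(L * L)
        pts = rng.uniform(0.0, L, size=(n, 2))
        counts.append(np.sum(np.linalg.norm(pts - center, axis=1) < R))
    return np.var(counts)

def number_variance_perturbed_lattice(R, L=120, sigma=0.3, trials=100):
    """Same statistic for Z^2 perturbed by iid Gaussians, a classical
    hyperuniform model (variance grows like the perimeter only)."""
    xs, ys = np.meshgrid(np.arange(L), np.arange(L))
    lattice = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    center = np.array([L / 2, L / 2])
    counts = []
    for _ in range(trials):
        pts = lattice + sigma * rng.normal(size=lattice.shape)
        counts.append(np.sum(np.linalg.norm(pts - center, axis=1) < R))
    return np.var(counts)

for R in (10, 20, 40):
    print(R, number_variance_poisson(R), number_variance_perturbed_lattice(R))
```

Only the lattice points near the boundary of the disk contribute to the fluctuations in the perturbed model, which is the intuition behind the surface-order variance.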
Here is a version of this survey; it is still a work in progress.
Hyperuniformity survey (June 2021: added a paragraph on JLM laws)
Many stationary point processes have recently been shown to be rigid, that is, the number of points of the process inside a disk is a measurable function of the point configuration outside the disk. However, most of these functions are limits of linear statistics of the point process, and they frequently have an exponential radius of stabilization, making it nearly impossible to effectively recover the number of points in a small disk from the observation of the configuration in a large window. Can we construct more explicit reconstruction functions? From a deep learning perspective, one can try to train invariant neural networks to recover this number and evaluate the complexity of the solutions.
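The kind of network one would train here must be invariant under permutations of the points. A minimal sketch of such an architecture (a DeepSets-style map, with hypothetical placeholder weights; in the actual project the weights would be trained to predict the hidden point count):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical DeepSets-style descriptor: rho(sum_i phi(x_i)).
# The weights below are random placeholders, not trained parameters.
W_phi = rng.normal(size=(2, 16)) / 4
W_rho = rng.normal(size=(16, 1)) / 4

def invariant_net(points):
    """Permutation-invariant map from a point configuration (n, 2) to a scalar."""
    h = np.tanh(points @ W_phi)            # phi, applied point by point
    pooled = h.sum(axis=0)                 # symmetric pooling: order is forgotten
    return float(np.tanh(pooled) @ W_rho)  # rho

config = rng.uniform(-1.0, 1.0, size=(50, 2))
print(invariant_net(config))
```

The symmetric pooling step is what guarantees that relabeling the points of the configuration leaves the output unchanged, a prerequisite for any reconstruction function of a point process.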
Work in progress with Antoine Brochard.
Joint work with Charles Bordenave.
Arxiv link – Published in Journal of Combinatorial Theory, Series B.
This is a short note on a generalization of the Erdős–Gallai theorem on graphical sequences.