Title: Proof of the Contiguity Conjecture and Lognormal Limit for the Symmetric Perceptron
Abstract: We consider the symmetric binary perceptron model, a simple model of neural networks that has attracted significant attention in the statistical physics, information theory, and probability theory communities, with recent connections made to the performance of learning algorithms. We establish that the partition function of this model, normalized by its expected value, converges to a lognormal distribution. As a consequence, we establish the contiguity conjecture between the planted and unplanted models in the satisfiable regime, along with other properties of the structure of the solution space. Our proof technique relies on a dense counterpart of the small graph conditioning method, which was developed for sparse models in the celebrated work of Robinson and Wormald.
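The lognormal limit concerns the ratio Z/E[Z], where Z counts solutions of the symmetric binary perceptron. A minimal numerical sketch is given below, under the standard formulation (an assumption here, not spelled out in the abstract): x ranges over {-1,+1}^n, each of m = alpha*n constraints is a Gaussian vector g_a, and x is a solution when |<g_a, x>| <= kappa*sqrt(n). For a fixed x the quantity <g_a, x>/sqrt(n) is standard Gaussian, so E[Z] = 2^n * P(|N(0,1)| <= kappa)^m exactly; the sketch samples Z/E[Z] by brute-force enumeration at a small, purely illustrative n. The parameter values are arbitrary choices, not from the paper.

```python
import itertools
import math
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not taken from the paper).
n, alpha, kappa = 14, 0.4, 1.0
m = int(alpha * n)  # number of constraints

# Per-constraint satisfaction probability p = P(|N(0,1)| <= kappa),
# giving the exact first moment E[Z] = 2^n * p^m.
p = math.erf(kappa / math.sqrt(2))
EZ = 2**n * p**m

def partition_function(G):
    """Count x in {-1,+1}^n with |<g_a, x>| <= kappa*sqrt(n) for all rows g_a of G."""
    X = np.array(list(itertools.product([-1, 1], repeat=n)))  # all 2^n sign vectors
    S = X @ G.T  # inner products, shape (2^n, m)
    return int(np.all(np.abs(S) <= kappa * math.sqrt(n), axis=1).sum())

# Sample the normalized partition function Z/E[Z] over independent disorders.
ratios = [partition_function(rng.standard_normal((m, n))) / EZ for _ in range(50)]
print(np.mean(ratios))  # sample mean of Z/E[Z]; its expectation is exactly 1
```

The point of the normalization is that E[Z/E[Z]] = 1 identically, yet the ratio does not concentrate at 1: the paper's result is that its limiting fluctuations are lognormal rather than vanishing, which is what makes the planted and unplanted models mutually contiguous in the satisfiable regime.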