When Sinkhorn-Knopp Meets Marchenko-Pastur: Diagonal Scaling Reveals the Rank of a Count Matrix
A longstanding question when applying PCA is how to choose the number of principal components. Random matrix theory provides useful insights into this question by assuming a "signal+noise" model, where the goal is to estimate the rank of the underlying signal matrix. If the noise is homoskedastic, i.e., the noise variance is identical across all entries, the spectrum of the noise follows the celebrated Marchenko-Pastur (MP) law, providing a simple method for rank estimation. However, in many practical situations, such as single-cell RNA sequencing (scRNA-seq), the noise is far from homoskedastic. In this talk, focusing on a Poisson data model, I will present a simple procedure termed biwhitening, which enforces the MP law by appropriately scaling the rows and columns of the data matrix. Beyond the Poisson distribution, the procedure extends to families of distributions with a quadratic variance function. I will demonstrate this approach on both simulated and experimental data, showcasing accurate rank estimation in simulations and excellent fits to the MP law for real scRNA-seq datasets.
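The idea can be sketched in a few lines of NumPy. This is a hedged illustration, not the speaker's reference implementation: for Poisson data the entrywise variance equals the mean, which we estimate by the counts themselves; Sinkhorn-Knopp iterations then find diagonal row/column scalings that equalize the noise variances, after which singular values exceeding the MP upper edge are attributed to signal. Function names (`sinkhorn_scaling`, `biwhiten_rank`), the iteration count, and the absence of a finite-size safety margin at the MP edge are all choices made here for brevity; the matrix is assumed to have no all-zero rows or columns.

```python
import numpy as np

def sinkhorn_scaling(V, n_iter=100):
    # Sinkhorn-Knopp: find positive vectors u, v such that diag(u) V diag(v)
    # has row sums equal to n (number of columns) and column sums equal to m.
    m, n = V.shape
    u = np.ones(m)
    v = np.ones(n)
    for _ in range(n_iter):
        u = n / (V @ v)
        v = m / (V.T @ u)
    return u, v

def biwhiten_rank(X, n_iter=100):
    # For a Poisson count matrix X, Var(X_ij) = E[X_ij], so X itself serves as
    # a plug-in estimate of the variance matrix to be balanced.
    m, n = X.shape
    u, v = sinkhorn_scaling(X.astype(float), n_iter)
    # Scaling the data by the square roots of u and v scales the variances
    # by u and v, making the average noise variance per entry equal to 1.
    Y = np.sqrt(u)[:, None] * X * np.sqrt(v)[None, :]
    # MP upper edge for squared singular values of Y / sqrt(n)
    # when the noise is (bi)whitened to unit average variance.
    edge = (1.0 + np.sqrt(m / n)) ** 2
    svals = np.linalg.svd(Y, compute_uv=False)
    return int(np.sum(svals**2 / n > edge))
```

As a sanity check, a Poisson matrix whose mean is a strong rank-1 outer product should yield an estimated rank of 1: the biwhitened mean becomes approximately the all-ones matrix, whose single large singular value sits far above the MP edge, while the remaining spectrum stays below it.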
Boris Landa is a Gibbs Assistant Professor in the Program in Applied Mathematics at Yale University. Previously, he completed his Ph.D. in applied mathematics at Tel Aviv University under the guidance of Prof. Yoel Shkolnisky. Boris's research focuses on theory and methods for processing large datasets corrupted by noise and deformations, with applications in the biological sciences.