# Publications

“Universality Laws for High-Dimensional Learning with Random Features,” Submitted. arXiv:2009.07669 [cs.IT]

“Householder Dice: A Matrix-Free Algorithm for Simulating Dynamics on Gaussian and Random Orthogonal Ensembles,” IEEE Transactions on Information Theory, 2021. arXiv:2101.07464 [cs.IT]

“Construction of Optimal Spectral Methods in Phase Retrieval,” in Mathematical and Scientific Machine Learning, 2021. arXiv:2012.04524 [cs.IT]

“On the Inherent Regularization Effects of Noise Injection During Training,” in International Conference on Machine Learning (ICML), 2021. arXiv:2102.07379 [cs.LG]

“Phase Transitions in Transfer Learning for High-Dimensional Perceptrons,” Entropy, Special Issue “The Role of Signal Processing and Information Theory in Modern Machine Learning,” vol. 23, no. 4, 2021. arXiv:2101.01918 [cs.LG]

“The Limiting Poisson Law of Massive MIMO Detection with Box Relaxation,” IEEE Journal on Selected Areas in Information Theory, vol. 1, no. 3, pp. 695-704, 2020. arXiv:2006.08416 [cs.IT]

“Phase Transitions of Spectral Initialization for High-Dimensional Nonconvex Estimation,” Information and Inference: A Journal of the IMA, vol. 9, no. 3, pp. 507-541, 2020. arXiv:1702.06435 [cs.IT]

“Generalization Error in High-Dimensional Perceptrons: Approaching Bayes Error with Convex Optimization,” in Conference on Neural Information Processing Systems (NeurIPS), 2020. arXiv:2006.06560 [stat.ML]

“The Role of Regularization in Classification of High-Dimensional Noisy Gaussian Mixture,” in International Conference on Machine Learning (ICML), 2020. arXiv:2002.11544 [stat.ML]

“Optimal Spectral Initialization for Signal Recovery with Applications to Phase Retrieval,” IEEE Transactions on Signal Processing, vol. 67, no. 9, pp. 2347-2356, 2019. arXiv:1811.04420 [cs.IT]

“Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview,” IEEE Transactions on Signal Processing, vol. 67, no. 20, pp. 5239-5269, 2019. arXiv:1809.09573 [cs.LG]

“The Scaling Limit of High-Dimensional Online Independent Component Analysis,” Journal of Statistical Mechanics (Special Issue on Machine Learning), vol. 2019, 2019.

“A Solvable High-Dimensional Model of GAN,” in Proc. Thirty-Third Conference on Neural Information Processing Systems (NeurIPS), 2019. arXiv:1805.08349 [cs.LG]

“Generalized Approximate Survey Propagation for High-Dimensional Estimation,” in Proc. International Conference on Machine Learning (ICML), 2019. arXiv:1905.05313 [cond-mat.dis-nn]

“Asymptotics and Optimal Designs of SLOPE for Sparse Linear Regression,” in Proc. International Symposium on Information Theory (ISIT), 2019. A longer version with full technical details is available.

“Super Resolution Phase Retrieval for Sparse Signals,” IEEE Transactions on Signal Processing, vol. 67, no. 18, 2019. arXiv:1808.01961 [cs.IT]

“MMSE Approximation for Sparse Coding Algorithms Using Stochastic Resonance,” IEEE Transactions on Signal Processing, vol. 67, no. 17, 2019. arXiv:1806.10171 [eess.SP]