5. Infinite width limit of Neural Networks


Papers: [RW06], [PAP+23], [Nea96], [LBN+17], [JGH20]

5.3. Cited references:

[JGH20]

Arthur Jacot, Franck Gabriel, and Clément Hongler. Neural tangent kernel: convergence and generalization in neural networks. arXiv preprint arXiv:1806.07572, 2020. URL: https://arxiv.org/abs/1806.07572.

[LBN+17]

Jaehoon Lee, Yasaman Bahri, Roman Novak, Samuel S. Schoenholz, Jeffrey Pennington, and Jascha Sohl-Dickstein. Deep neural networks as Gaussian processes. arXiv preprint arXiv:1711.00165, 2017.

[Nea96]

Radford M. Neal. Bayesian Learning for Neural Networks. Volume 118 of Lecture Notes in Statistics. Springer, 1996. doi:10.1007/978-1-4612-0745-0.

[PAP+23]

R. Pacelli, S. Ariosto, M. Pastore, F. Ginelli, M. Gherardi, and P. Rotondo. A statistical mechanics framework for Bayesian deep neural networks beyond the infinite-width limit. Nature Machine Intelligence, 5(12):1497–1507, December 2023. doi:10.1038/s42256-023-00767-6.

[RW06]

Carl Edward Rasmussen and Christopher K. I. Williams. Gaussian Processes for Machine Learning. The MIT Press, 2006.