Abstract: The loss landscape of neural networks is a critical aspect of their behavior, and understanding its properties is essential for improving their performance. In this paper, we investigate how the loss surface changes when the sample size increases, a previously unexplored issue. We theoretically analyze the convergence of the loss landscape in a fully connected neural network and derive upper bounds for the difference in loss function values when adding a new object to the sample. Our empirical study confirms these results on various datasets, demonstrating the convergence of the loss function surface for image classification tasks. Our findings provide insights into the local geometry of neural loss landscapes and have implications for the development of sample size determination techniques.
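The quantity at the heart of the paper is how the loss landscape changes when one more training object is added to the sample. Below is a minimal illustrative sketch of measuring this gap pointwise, assuming a PyTorch classifier with a mean-reduced loss; `loss_landscape_gap` and its arguments are hypothetical names for illustration, not the repository's actual API:

```python
import torch

def loss_landscape_gap(model, loss_fn, xs, ys, x_new, y_new):
    """Pointwise gap |L_{k+1}(w) - L_k(w)| at the current weights w.

    For a mean loss L_k(w) = (1/k) * sum_i loss(w; x_i, y_i), adding one
    object (x_new, y_new) changes the landscape by

        L_{k+1}(w) - L_k(w) = (loss(w; x_new, y_new) - L_k(w)) / (k + 1),

    so the pointwise gap shrinks as the sample size k grows.
    """
    model.eval()
    with torch.no_grad():
        k = xs.shape[0]
        loss_k = loss_fn(model(xs), ys)          # L_k(w): mean loss on k objects
        loss_new = loss_fn(model(x_new), y_new)  # loss on the added object
        return (loss_new - loss_k).abs() / (k + 1)
```

Here `xs, ys` hold the current sample of size `k`, and `x_new, y_new` are batches of size one (e.g. `x.unsqueeze(0)`). The paper's theoretical bounds control this gap uniformly over a neighborhood of the weights, not just at a single point.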
- [2025/10/10] Presentation slides were added.
- [2025/03/22] Paper was published in the Doklady Mathematics journal (🤗 Open Access).
- [2024/09/27] Paper was accepted for publication in the Doklady Mathematics journal.
- [2024/09/18] Preprint was added to arXiv.
- [2024/08/16] Paper was accepted to the AI Journey 2024 conference.
- [2024/08/13] Paper and code were released.
This repository is structured as follows:
- `code`: the computational experiments code, with its own `README.md`.
- `paper`: the preprint `main.pdf`, with its LaTeX source `main.tex`.
- `slides`: the presentation slides `main.pdf`, with their LaTeX source `main.tex`.
If you find our paper useful for your research or applications, please cite us:
@article{kiselev2025unraveling,
    title={Unraveling the Hessian: A Key to Smooth Convergence in Loss Function Landscapes},
    author={Kiselev, Nikita and Grabovoy, Andrey},
    journal={Doklady Mathematics},
    volume={110},
    number={1},
    pages={S49--S61},
    year={2024}
}
@article{kiselev2024unraveling,
    title={Unraveling the Hessian: A Key to Smooth Convergence in Loss Function Landscapes},
    author={Kiselev, Nikita and Grabovoy, Andrey},
    journal={arXiv preprint arXiv:2409.11995},
    year={2024}
}
We would also appreciate it if you could give this repository a star ⭐. Thanks a lot!