
Sahu, SK, Java, A, Shaikh, A and Kilcher, Y (2021)

Rethinking Neural Networks With Benford's Law

Preprint arXiv:2102.03313 [cs.LG]; last accessed October 12, 2024.

ISSN/ISBN: Not available at this time. DOI: 10.48550/arXiv.2102.03313.



Abstract: Benford's Law (BL), or the Significant Digit Law, defines the probability distribution of the first digit of numerical values in a data sample. This law is observed in many naturally occurring datasets. It can be seen as a measure of the naturalness of a given distribution and finds application in areas like anomaly and fraud detection. In this work, we address the following question: Is the distribution of the Neural Network parameters related to the network's generalization capability? To that end, we first define a metric, MLH (Model Enthalpy), that measures the closeness of a set of numbers to Benford's Law, and we show empirically that it is a strong predictor of Validation Accuracy. Second, we use MLH as an alternative to Validation Accuracy for Early Stopping, removing the need for a Validation set. We provide experimental evidence that even if the optimal size of the validation set is known beforehand, the peak test accuracy attained is lower than when not using a validation set at all. Finally, we investigate the connection of BL to the Free Energy Principle and the First Law of Thermodynamics, showing that MLH is a component of the internal energy of the learning system, and framing optimization as an analogy to minimizing the total energy to attain equilibrium.
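
For reference, the first-digit distribution the abstract refers to has a standard closed form (the textbook statement of Benford's Law, not a formula quoted from this paper):

    P(d) = \log_{10}\left(1 + \frac{1}{d}\right), \quad d \in \{1, 2, \dots, 9\},

so that, for example, P(1) = \log_{10} 2 \approx 0.301 while P(9) \approx 0.046.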
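
Below is a minimal sketch of how a Benford-closeness score for a network's parameters might be computed and used as an early-stopping signal, in the spirit of the abstract. The function names and the choice of KL divergence here are illustrative assumptions; the paper's exact MLH definition differs and should be taken from the preprint itself.

    import numpy as np

    # Benford's first-digit probabilities: P(d) = log10(1 + 1/d) for d = 1..9.
    BENFORD = np.log10(1.0 + 1.0 / np.arange(1, 10))

    def first_digit_histogram(values):
        # Empirical distribution of leading significant digits of nonzero values.
        v = np.abs(values[values != 0])
        digits = (v / 10.0 ** np.floor(np.log10(v))).astype(int)  # leading digit, 1..9
        counts = np.bincount(digits, minlength=10)[1:10]
        return counts / counts.sum()

    def benford_closeness(params):
        # KL divergence from the empirical first-digit distribution to Benford's.
        # Smaller means "more Benford"; an illustrative stand-in for the paper's MLH.
        p = first_digit_histogram(params)
        eps = 1e-12  # guard against log(0) for digits that never occur
        return float(np.sum((p + eps) * np.log((p + eps) / BENFORD)))

    # Hypothetical early-stopping use: track this score on the flattened weights
    # instead of validation accuracy, stopping when it stops improving.
    # score = benford_closeness(np.concatenate([w.ravel() for w in weight_arrays]))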


Bibtex:
@misc{sahu2021rethinking,
  title={Rethinking Neural Networks With Benford's Law},
  author={Surya Kant Sahu and Abhinav Java and Arshad Shaikh and Yannic Kilcher},
  year={2021},
  eprint={2102.03313},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2102.03313},
}


Reference Type: Preprint

Subject Area(s): Computer Science