Proceedings of International Congress on Information and Communication Technology, pp. 509–528.
ISSN/ISBN: Not available at this time. DOI: 10.1007/978-981-97-3305-7_41
Abstract: Context: Benford’s Law describes the distribution of leading digits in naturally occurring sets of numbers. When such numbers are divided into nine categories based on their first digit, the largest category consists of numbers that start with 1, followed by those starting with 2, and so on. Objective: Each neuron in a neural network (NN) holds a mathematical value, often referred to as a weight, which is updated during training. This study explores the Degree of Benford’s Law Existence (DBLE) within Convolutional Neural Networks (CNNs) and investigates the correlation between the DBLE and the NN’s accuracy. Methods: A CNN is tested 15 times with various datasets and hyperparameters. The DBLE is calculated for each CNN variation, and the correlation between the CNN’s performance and the DBLE is examined. To further explore the presence of Benford’s Law in CNN models, nine transfer-learning models are also tested. Results: The experiment suggests that (1) Benford’s Law is observed in the weights of neural networks, and in most cases the DBLE increases as training progresses; and (2) models with significant differences in performance tend to show relatively high divergence in DBLE.
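The first-digit distribution the abstract refers to can be sketched in a few lines of Python. The `benford_prob` helper and the powers-of-2 sample population are illustrative assumptions, not taken from the paper; powers of 2 are simply a well-known example of a sequence whose leading digits follow Benford’s Law.

```python
import math
from collections import Counter

def benford_prob(d):
    """Benford's Law: probability that the leading digit of a number is d."""
    return math.log10(1 + 1 / d)

def first_digit(x):
    """Leading digit of a positive number, via scientific notation."""
    return int(f"{abs(x):e}"[0])

# Illustrative "natural" population: powers of 2, whose first digits
# are known to follow Benford's Law in the limit.
values = [2 ** n for n in range(1, 1001)]
counts = Counter(first_digit(v) for v in values)
observed = {d: counts[d] / len(values) for d in range(1, 10)}

for d in range(1, 10):
    print(d, round(benford_prob(d), 3), round(observed[d], 3))
```

In the same spirit, one could replace `values` with the flattened weights of a trained CNN and compare the observed first-digit frequencies against `benford_prob` to estimate how closely the weights follow the law.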
Bibtex:
@incollection{toosi2024benford,
author = {Farshad Ghassemi Toosi},
title = {The Relationship Between the Distribution of Neural Network Weights and Model Accuracy: A Benford’s Law Perspective},
publisher = {Springer Nature},
year = {2024},
pages = {509--528},
month = {aug},
doi = {10.1007/978-981-97-3305-7_41},
booktitle = {Proceedings of Ninth International Congress on Information and Communication Technology},
}
Reference Type: Book Chapter
Subject Area(s): Computer Science