Sparse deep neural networks for modeling physical and biological systems
Authors
Zahn, Olivia Tessa
Abstract
Characterizing the relationship between a network's performance and its parameters is an active area of investigation in the fields of deep learning and complex network science. In this thesis, sparse deep neural networks (DNNs) are explored as tools for modeling physical and biological systems, and the functional complexity of trained, sparse DNNs is characterized using tools from complex network theory, namely network motif theory. Neural network pruning is used to find a sparse computational model for controlling a biological motor task. Using a sequential, magnitude-based pruning algorithm, as many as 93% of network parameters can be removed from a DNN without compromising performance. By quantifying the distribution of network motifs in the remaining sparse network, we visualize the change in network complexity throughout the pruning process and across networks. We find that, despite the random initialization of network parameters before training, enforced sparsity causes DNNs to converge to similar non-random connectivity patterns, as characterized by their network motif significance. Furthermore, we find that network motifs become more significant as sparsification proceeds. In summary, a simple selection process for determining parameter importance can dramatically reduce the number of parameters in a trained DNN model while producing a non-trivial network topology.
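The sequential, magnitude-based pruning described above can be sketched as follows. This is a minimal illustration on a single weight matrix, not the thesis's implementation: the function names, the step schedule, and the 64x64 test matrix are assumptions, and the retraining (fine-tuning) that would normally happen between pruning steps is omitted.

```python
import numpy as np

def magnitude_prune(weights, fraction):
    """Zero out the smallest-magnitude entries of a weight matrix.

    `fraction` is the cumulative fraction of entries to set to zero.
    """
    w = weights.copy()
    k = int(fraction * w.size)  # number of weights to remove
    if k == 0:
        return w
    # Threshold = k-th smallest absolute value across all weights.
    thresh = np.sort(np.abs(w), axis=None)[k - 1]
    w[np.abs(w) <= thresh] = 0.0
    return w

def sequential_prune(weights, target_sparsity=0.93, steps=10):
    """Prune gradually toward the target sparsity over several rounds.

    In practice the network would be retrained between rounds to
    recover performance; that fine-tuning loop is omitted here.
    """
    w = weights
    for step in range(1, steps + 1):
        frac = target_sparsity * step / steps
        w = magnitude_prune(w, frac)
    return w

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))          # stand-in for one trained layer
pruned = sequential_prune(w, target_sparsity=0.93)
sparsity = np.mean(pruned == 0.0)
print(f"sparsity: {sparsity:.2f}")
```

Because previously zeroed weights have the smallest magnitude, each round re-selects them first, so the schedule only removes new weights incrementally, mirroring the sequential character of the algorithm.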
Description
Thesis (Ph.D.)--University of Washington, 2023
