Department of Business Information Systems, Pantheon-Sorbonne University, 12 Pl. du Panthéon, 75231 Paris, France
Mini Review
Exploring the Role of Sparsity in Deep Neural Networks for Improved Performance
Author(s): Mark Daniel*
Deep Neural Networks (DNNs) have achieved remarkable success in various domains, ranging from computer vision to natural language
processing. However, their increasing complexity poses challenges in terms of model size, memory requirements, and computational costs.
To address these issues, researchers have turned their attention to sparsity, a technique that introduces structural zeros into the network,
thereby reducing redundancy and improving efficiency. This mini review explores the role of sparsity in DNNs and its impact on performance
improvement. We review the existing literature, discuss sparsity-inducing methods, and analyze the benefits and trade-offs associated with sparse
networks. Furthermore, we present experimental results that demonstrate the effectiveness of sparsity in improving performance metrics such as
accuracy, memory footprint, and computational cost.
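As a concrete illustration of the idea described above, the sketch below shows unstructured magnitude pruning, one common sparsity-inducing method: the smallest-magnitude weights of a layer are set to zero until a target sparsity level is reached. This is a generic example for illustration only, not the specific method reviewed or evaluated in the article.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a copy of `weights` with the smallest-magnitude entries zeroed.

    `sparsity` is the fraction of entries to set to zero (0.0 to 1.0).
    Illustrative unstructured magnitude pruning, not the article's method.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of entries to zero out
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

# Example: prune half the weights of a random 4x4 layer
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
sparse_w = magnitude_prune(w, 0.5)
print(np.mean(sparse_w == 0.0))  # fraction of zeroed weights
```

Introducing structural zeros this way shrinks the effective parameter count; with a sparse storage format or sparsity-aware kernels, the zeroed weights need neither memory nor multiply-accumulate operations, which is the efficiency gain the abstract refers to.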
DOI: 10.37421/0974-7230.2023.16.462
Journal of Computer Science & Systems Biology