Citation
Bulín, M.; Šmídl, L.; Švec, J.: Towards Network Simplification for Low-Cost Devices by Removing Synapses. International Conference on Speech and Computer, pp. 58–67, Springer, Cham, 2018.
Additional information
Abstract
Deploying robust neural-network-based models on low-cost devices runs into hardware constraints such as a limited memory footprint and limited computing power. This work presents a general method for rapidly reducing the number of parameters (by 80–90%) in a trained DNN or LSTM network by removing its redundant synapses, without significantly hurting classification accuracy. The massive reduction of parameters leads to a notable decrease in the model's size and in the actual prediction time of on-board classifiers. We show the pruning results on a simple speech recognition task; however, the method is applicable to any classification data.
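As a rough illustration of the idea described in the abstract (not the authors' exact algorithm, which is given in the paper itself), the Python sketch below removes the lowest-magnitude synapses from a dense weight matrix. The `prune_synapses` helper and the 85% pruning fraction are assumptions chosen only to mirror the 80–90% reduction quoted above.

```python
import numpy as np

def prune_synapses(weights: np.ndarray, fraction: float = 0.85) -> np.ndarray:
    """Zero out the given fraction of synapses (weights) with the smallest magnitude."""
    flat = np.abs(weights).ravel()
    k = int(fraction * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]   # k-th smallest |weight|
    mask = np.abs(weights) > threshold             # keep only the stronger synapses
    return weights * mask

# Example: prune 85% of a hypothetical 256x128 dense layer.
rng = np.random.default_rng(0)
W = rng.normal(size=(256, 128))
W_pruned = prune_synapses(W, fraction=0.85)
print(f"remaining synapses: {np.count_nonzero(W_pruned)} / {W.size}")
```

The zeroed weights can then be stored in a sparse format (or the corresponding units removed entirely), which is where the reported savings in model size and prediction time would come from.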
Detail of publication
| Title: | Towards Network Simplification for Low-Cost Devices by Removing Synapses |
|---|---|
| Author: | M. Bulín; L. Šmídl; J. Švec |
| Language: | English |
| Date of publication: | 18 Sep 2018 |
| Year: | 2018 |
| Type of publication: | Papers in proceedings of reviewed conferences |
| Title of journal or book: | International Conference on Speech and Computer |
| Pages: | 58–67 |
| DOI: | https://doi.org/10.1007/978-3-319-99579-3_7 |
| ISBN: | 978-3-319-99579-3 |
| Publisher: | Springer, Cham |
| Conference dates: | 18 Sep 2018 - 22 Sep 2018 |
Keywords
Pruning synapses, Network simplification, Minimal network structure, Low-cost devices, Speech recognition
BibTeX
@INPROCEEDINGS{MBulin_2018_TowardsNetwork,
author = {M. Bul\'{i}n and L. \v{S}m\'{i}dl and J. \v{S}vec},
title = {Towards Network Simplification for Low-Cost Devices by Removing Synapses},
year = {2018},
publisher = {Springer, Cham},
booktitle = {International Conference on Speech and Computer},
pages = {58--67},
isbn = {978-3-319-99579-3},
doi = {10.1007/978-3-319-99579-3_7},
url = {http://www.kky.zcu.cz/en/publications/MBulin_2018_TowardsNetwork},
}

