

Citation

Soutner, D. and Müller, L.: Application of LSTM Neural Networks in Language Modelling. Lecture Notes in Computer Science, pp. 105-112, 2013.


Abstract

Artificial neural networks have become state-of-the-art in the task of language modelling on small corpora. While feed-forward networks can take into account only a fixed context length to predict the next word, recurrent neural networks (RNNs) can take advantage of all previous words. Because RNNs are difficult to train, one remedy is the Long Short-Term Memory (LSTM) neural network architecture. In this work, we show an application of the LSTM network, with extensions, to a language modelling task on Czech spontaneous phone calls. Experiments show considerable improvements in perplexity, and in the WER of the recognition system, over an n-gram baseline.
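The abstract's contrast between a fixed n-gram window and the LSTM's gated memory can be illustrated with a single LSTM cell step. The sketch below is not the paper's implementation; it is a minimal scalar (1-dimensional input and state) version of the standard LSTM equations, with made-up parameter values, showing how the cell state carries information across arbitrarily many time steps:

```python
import math

def sigmoid(z):
    """Logistic squashing used by the LSTM gates (values in (0, 1))."""
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, p):
    """One LSTM time step for scalar input and state.

    p maps each gate name to an illustrative (w_x, w_h, b) triple:
    input gate (i), forget gate (f), output gate (o), candidate (g).
    """
    i = sigmoid(p["i"][0] * x + p["i"][1] * h_prev + p["i"][2])
    f = sigmoid(p["f"][0] * x + p["f"][1] * h_prev + p["f"][2])
    o = sigmoid(p["o"][0] * x + p["o"][1] * h_prev + p["o"][2])
    g = math.tanh(p["g"][0] * x + p["g"][1] * h_prev + p["g"][2])
    c = f * c_prev + i * g   # cell state: gated blend of old memory and new input
    h = o * math.tanh(c)     # hidden state exposed to the next layer / softmax
    return h, c

# Toy parameters (hypothetical, not trained) and a short input sequence.
params = {k: (0.5, 0.5, 0.0) for k in ("i", "f", "o", "g")}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_step(x, h, c, params)
```

In a language model, `x` would be a word embedding and `h` would feed a softmax over the vocabulary; the forget gate `f` is what lets the cell retain context well beyond the fixed window of an n-gram or feed-forward model.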

Detail of publication

Title: Application of LSTM Neural Networks in Language Modelling
Author: Soutner, D. ; Müller, L.
Language: English
Date of publication: 1 Sep 2013
Year: 2013
Type of publication: Papers in proceedings of reviewed conferences
Series: Lecture Notes in Computer Science
Pages: 105-112
DOI: 10.1007/978-3-642-40585-3_14

Keywords

language modelling, recurrent neural networks, LSTM neural networks

BibTeX

@ARTICLE{SoutnerD_2013_ApplicationofLSTM,
 author = {Soutner, D. and M\"{u}ller, L.},
 title = {Application of LSTM Neural Networks in Language Modelling},
 year = {2013},
 pages = {105-112},
 series = {Lecture Notes in Computer Science},
 doi = {10.1007/978-3-642-40585-3_14},
 url = {http://www.kky.zcu.cz/en/publications/SoutnerD_2013_ApplicationofLSTM},
}