
Detail of publication

Citation

Soutner D. and Müller L.: On Continuous Space Word Representations as Input of LSTM Language Model. Statistical Language and Speech Processing, 2015.


Abstract

Artificial neural networks have become the state of the art in language modelling, with Long Short-Term Memory (LSTM) networks proving an especially effective architecture. The continuous skip-gram and continuous bag-of-words (CBOW) models are algorithms for learning high-quality distributed vector representations that capture a large number of syntactic and semantic word relationships. In this paper, we carry out experiments with a combination of these powerful models: continuous word representations trained with the skip-gram, CBOW, or GloVe method, together with a word cache expressed as a vector via latent Dirichlet allocation (LDA). All of these are used as the input of an LSTM network in place of the 1-of-N coding traditionally used in language models. The proposed models are evaluated on the Penn Treebank and the MALACH corpus.
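
To make the input scheme concrete, below is a minimal sketch in PyTorch (an illustration, not the authors' implementation) of an LSTM language model fed with fixed pretrained word vectors instead of 1-of-N coding. The random matrix is a hypothetical stand-in for vectors trained with skip-gram, CBOW, or GloVe, and the LDA cache component described in the abstract is omitted for brevity.

# Minimal sketch: an LSTM language model whose input is pretrained
# continuous word vectors rather than 1-of-N (one-hot) coding.
# The "pretrained" matrix below is random -- a stand-in for vectors
# trained with skip-gram, CBOW, or GloVe; the LDA word-cache vector
# from the paper is not modelled here.
import torch
import torch.nn as nn

vocab_size, emb_dim, hidden_dim = 10000, 100, 256

# Stand-in for pretrained skip-gram/CBOW/GloVe vectors (hypothetical values).
pretrained = torch.randn(vocab_size, emb_dim)

class LSTMLanguageModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Frozen embedding layer: the continuous representations replace
        # 1-of-N coding and are not updated during training.
        self.embed = nn.Embedding.from_pretrained(pretrained, freeze=True)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.proj = nn.Linear(hidden_dim, vocab_size)  # next-word logits

    def forward(self, token_ids, state=None):
        x = self.embed(token_ids)          # (batch, seq) -> (batch, seq, emb_dim)
        out, state = self.lstm(x, state)   # hidden state at each position
        return self.proj(out), state       # distribution over the vocabulary

# Usage: next-word logits for a toy batch of token ids.
model = LSTMLanguageModel()
tokens = torch.randint(0, vocab_size, (2, 8))
logits, _ = model(tokens)
print(logits.shape)  # torch.Size([2, 8, 10000])

Freezing the embedding layer mirrors the setup the abstract describes: the network consumes precomputed continuous representations rather than learning them from one-hot input.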

Detail of publication

Title: On Continuous Space Word Representations as Input of LSTM Language Model
Authors: Soutner D.; Müller L.
Language: English
Date of publication: 17 Nov 2015
Year: 2015
Type of publication: Papers in proceedings of reviewed conferences
Book title: Statistical Language and Speech Processing
DOI: 10.1007/978-3-319-25789-1_25

BibTeX

@INPROCEEDINGS{SoutnerD_2015_OnContinuousSpace,
 author = {Soutner D. and M\"{u}ller L.},
 title = {On Continuous Space Word Representations as Input of LSTM Language Model},
 year = {2015},
 booktitle = {Statistical Language and Speech Processing},
 doi = {10.1007/978-3-319-25789-1_25},
 url = {http://www.kky.zcu.cz/en/publications/SoutnerD_2015_OnContinuousSpace},
}