A tagger-aided language model with a stack decoder

Ruiqiang Zhang, Ezra Black, Andrew Finch, Yoshinori Sagisaka

Research output: Chapter in Book/Report/Conference proceedingConference contribution

Abstract

The contribution of this paper is to investigate the utility of exploiting words and predicted detailed semantic tags in the long history to enhance a standard trigram language model. The paper builds on earlier work in the field that also used words and tags in the long history, but offers a cleaner and ultimately much more accurate system by integrating the application of these new features directly into the decoding algorithm. The features used in our models are derived using a set of complex questions about the tags and words in the history, written by a linguist. Maximum entropy modelling techniques are then used to combine these features with a standard trigram language model. We evaluate the technique in terms of word error rate, on Wall Street Journal test data.
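The abstract describes combining a standard trigram model with long-history word/tag features via maximum entropy modelling. The sketch below shows the general shape of such a log-linear combination; the feature functions, tag names, and numbers are illustrative assumptions only (the paper's actual features come from linguist-written questions about the history):

```python
import math

# Minimal sketch of a log-linear (maximum entropy) language model that
# combines a trigram probability with a long-history semantic-tag feature.
# All names, feature functions, and values are illustrative, not the
# paper's actual feature set.

def f_trigram_logprob(word, history, trigram):
    """Feature 1: log P_trigram(word | last two words) from a toy table."""
    return math.log(trigram.get((tuple(history[-2:]), word), 1e-6))

def f_tag_triggers_word(word, history_tags, tag="MONEY",
                        word_class=frozenset({"cash", "dollars"})):
    """Feature 2: fires when a semantic tag anywhere in the long history
    predicts a word class (a hypothetical stand-in for a linguist-written
    question about tags in the history)."""
    return 1.0 if tag in history_tags and word in word_class else 0.0

def maxent_prob(word, vocab, history, history_tags, trigram, weights):
    """P(w | h) = exp(sum_i lambda_i * f_i(w, h)) / Z(h)."""
    def score(w):
        return (weights[0] * f_trigram_logprob(w, history, trigram)
                + weights[1] * f_tag_triggers_word(w, history_tags))
    z = sum(math.exp(score(w)) for w in vocab)  # normaliser over the vocabulary
    return math.exp(score(word)) / z
```

With a positive weight on the tag feature, a MONEY tag observed earlier in the history raises the probability of money-related words relative to the plain trigram estimate, which is the kind of long-distance effect the paper's features are designed to capture.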

Original language: English
Title of host publication: 6th International Conference on Spoken Language Processing, ICSLP 2000
Publisher: International Speech Communication Association
ISBN (Electronic): 7801501144, 9787801501141
Publication status: Published - 2000
Externally published: Yes
Event: 6th International Conference on Spoken Language Processing, ICSLP 2000 - Beijing, China
Duration: 2000 Oct 16 - 2000 Oct 20

Other

Other: 6th International Conference on Spoken Language Processing, ICSLP 2000
Country: China
City: Beijing
Period: 00/10/16 - 00/10/20

ASJC Scopus subject areas

  • Linguistics and Language
  • Language and Linguistics


Cite this

Zhang, R., Black, E., Finch, A., & Sagisaka, Y. (2000). A tagger-aided language model with a stack decoder. In 6th International Conference on Spoken Language Processing, ICSLP 2000. International Speech Communication Association.