A tagger-aided language model with a stack decoder

Ruiqiang Zhang, Ezra Black, Andrew Finch, Yoshinori Sagisaka

Research output: Conference contribution

Abstract

The contribution of this paper is to investigate the utility of exploiting words and predicted detailed semantic tags in the long history to enhance a standard trigram language model. The paper builds on earlier work in the field that also used words and tags in the long history, but offers a cleaner and ultimately much more accurate system by integrating the application of these new features directly into the decoding algorithm. The features used in our models are derived using a set of complex questions about the tags and words in the history, written by a linguist. Maximum entropy modelling techniques are then used to combine these features with a standard trigram language model. We evaluate the technique in terms of word error rate on Wall Street Journal test data.
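The abstract does not give implementation details, but as a rough illustration of the modelling idea, the Python sketch below shows how a trigram score can be combined log-linearly with binary features that ask questions about the words and predicted tags in the long history. The feature questions, weights, toy trigram table, and all function names here are invented for illustration and are not taken from the paper.

```python
# A minimal sketch (not the authors' implementation) of a maximum-entropy
# style combination of a trigram language model with binary "question"
# features over words and tags in the long history.  The questions, the
# weights, and the toy trigram table used here are illustrative assumptions.
import math

def trigram_logprob(w, history, trigram_lm):
    """Log P(w | last two history words), with a crude floor for unseen trigrams."""
    padded = ["<s>", "<s>"] + history
    w2, w1 = padded[-2], padded[-1]
    return math.log(trigram_lm.get((w2, w1, w), 1e-6))

def history_features(w, history, tags):
    """Binary features asking questions about the long history (invented examples)."""
    feats = {}
    prev_tag = tags[-1] if tags else "NONE"
    feats["prev_tag=%s&w=%s" % (prev_tag, w)] = 1.0
    if "MONEY" in tags[-10:]:            # was a money-like tag predicted recently?
        feats["money_in_history&w=%s" % w] = 1.0
    return feats

def loglinear_score(w, history, tags, trigram_lm, weights):
    """Unnormalized log-linear score: weighted trigram log-prob plus feature weights."""
    score = weights.get("trigram", 1.0) * trigram_logprob(w, history, trigram_lm)
    for name, value in history_features(w, history, tags).items():
        score += weights.get(name, 0.0) * value
    return score

def next_word_distribution(history, tags, vocab, trigram_lm, weights):
    """Softmax-normalize scores over the vocabulary to get P(w | history, tags)."""
    scores = {w: loglinear_score(w, history, tags, trigram_lm, weights) for w in vocab}
    m = max(scores.values())
    exps = {w: math.exp(s - m) for w, s in scores.items()}
    z = sum(exps.values())
    return {w: e / z for w, e in exps.items()}

if __name__ == "__main__":
    vocab = ["dollars", "today"]
    trigram_lm = {("five", "million", "dollars"): 0.6, ("five", "million", "today"): 0.1}
    weights = {"trigram": 1.0, "money_in_history&w=dollars": 0.8}
    dist = next_word_distribution(["five", "million"], ["MONEY", "NUM"], vocab, trigram_lm, weights)
    print(dist)
```

In the paper the feature set comes from complex, linguist-written questions and the weights are trained by maximum entropy estimation; here both are simply hard-coded to keep the sketch self-contained.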

Original language: English
Title of host publication: 6th International Conference on Spoken Language Processing, ICSLP 2000
Publisher: International Speech Communication Association
ISBN (electronic): 7801501144, 9787801501141
Publication status: Published - 2000
Externally published: Yes
Event: 6th International Conference on Spoken Language Processing, ICSLP 2000 - Beijing, China
Duration: 16 Oct 2000 - 20 Oct 2000

Publication series

Name: 6th International Conference on Spoken Language Processing, ICSLP 2000

ASJC Scopus subject areas

  • Linguistics and Language
  • Language and Linguistics
