A tagger-aided language model with a stack decoder

Ruiqiang Zhang, Ezra Black, Andrew Finch, Yoshinori Sagisaka

Research output: Conference contribution

Abstract

The contribution of this paper is to investigate the utility of exploiting words and predicted detailed semantic tags in the long history to enhance a standard trigram language model. The paper builds on earlier work in the field that also used words and tags in the long history, but offers a cleaner, and ultimately much more accurate system by integrating the application of these new features directly into the decoding algorithm. The features used in our models are derived using a set of complex questions about the tags and words in the history, written by a linguist. Maximum entropy modelling techniques are then used to combine these features with a standard trigram language model. We evaluate the technique in terms of word error rate, on Wall Street Journal test data.
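The log-linear combination the abstract describes can be sketched in miniature. Everything below is invented for illustration (a toy vocabulary, a single hand-written feature question, a made-up weight); the actual models use many linguist-authored questions over the long history, with weights trained by maximum entropy estimation.

```python
import math

# Toy vocabulary and a stand-in trigram model (uniform), purely illustrative.
VOCAB = ["the", "bank", "approved", "loan", "rates"]

def trigram_prob(w, history):
    # A real system would use an n-gram model here; uniform is a placeholder.
    return 1.0 / len(VOCAB)

# A hand-written "question" about tags in the long history, analogous in
# spirit to the linguist-authored features; this particular one is invented.
def f_finance_topic(w, history, tags):
    return 1.0 if "MONEY-TAG" in tags and w in ("loan", "rates") else 0.0

FEATURES = [f_finance_topic]
LAMBDAS = [0.8]  # feature weights, normally fit by maximum entropy training

def maxent_prob(w, history, tags):
    """Log-linear combination of the trigram model with long-history features."""
    def score(v):
        s = math.log(trigram_prob(v, history))
        for lam, f in zip(LAMBDAS, FEATURES):
            s += lam * f(v, history, tags)
        return math.exp(s)
    z = sum(score(v) for v in VOCAB)  # normalise over the vocabulary
    return score(w) / z
```

When the history carries the (hypothetical) MONEY-TAG, the feature fires and the model shifts probability toward topically consistent words; with no active features it falls back to the trigram distribution.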

Original language: English
Host publication title: 6th International Conference on Spoken Language Processing, ICSLP 2000
Publisher: International Speech Communication Association
ISBN (electronic): 7801501144, 9787801501141
Publication status: Published - 2000
Externally published: Yes
Event: 6th International Conference on Spoken Language Processing, ICSLP 2000 - Beijing, China
Duration: 16 Oct 2000 to 20 Oct 2000

Other

Other: 6th International Conference on Spoken Language Processing, ICSLP 2000
Country: China
City: Beijing
Period: 00/10/16 to 00/10/20

ASJC Scopus subject areas

  • Linguistics and Language
  • Language and Linguistics

Cite this

Zhang, R., Black, E., Finch, A., & Sagisaka, Y. (2000). A tagger-aided language model with a stack decoder. In 6th International Conference on Spoken Language Processing, ICSLP 2000. International Speech Communication Association.

A tagger-aided language model with a stack decoder. / Zhang, Ruiqiang; Black, Ezra; Finch, Andrew; Sagisaka, Yoshinori.

6th International Conference on Spoken Language Processing, ICSLP 2000. International Speech Communication Association, 2000.

Research output: Conference contribution

Zhang, R, Black, E, Finch, A & Sagisaka, Y 2000, A tagger-aided language model with a stack decoder. in 6th International Conference on Spoken Language Processing, ICSLP 2000. International Speech Communication Association, 6th International Conference on Spoken Language Processing, ICSLP 2000, Beijing, China, 00/10/16.
Zhang R, Black E, Finch A, Sagisaka Y. A tagger-aided language model with a stack decoder. In: 6th International Conference on Spoken Language Processing, ICSLP 2000. International Speech Communication Association. 2000
Zhang, Ruiqiang ; Black, Ezra ; Finch, Andrew ; Sagisaka, Yoshinori. / A tagger-aided language model with a stack decoder. 6th International Conference on Spoken Language Processing, ICSLP 2000. International Speech Communication Association, 2000.
@inproceedings{34a8bc6c36a1449b98793d8e6b90985c,
title = "A tagger-aided language model with a stack decoder",
abstract = "The contribution of this paper is to investigate the utility of exploiting words and predicted detailed semantic tags in the long history to enhance a standard trigram language model. The paper builds on earlier work in the field that also used words and tags in the long history, but offers a cleaner, and ultimately much more accurate system by integrating the application of these new features directly into the decoding algorithm. The features used in our models are derived using a set of complex questions about the tags and words in the history, written by a linguist. Maximum entropy modelling techniques are then used to combine these features with a standard trigram language model. We evaluate the technique in terms of word error rate, on Wall Street Journal test data.",
author = "Ruiqiang Zhang and Ezra Black and Andrew Finch and Yoshinori Sagisaka",
year = "2000",
language = "English",
booktitle = "6th International Conference on Spoken Language Processing, ICSLP 2000",
publisher = "International Speech Communication Association",

}

TY - GEN

T1 - A tagger-aided language model with a stack decoder

AU - Zhang, Ruiqiang

AU - Black, Ezra

AU - Finch, Andrew

AU - Sagisaka, Yoshinori

PY - 2000

Y1 - 2000

N2 - The contribution of this paper is to investigate the utility of exploiting words and predicted detailed semantic tags in the long history to enhance a standard trigram language model. The paper builds on earlier work in the field that also used words and tags in the long history, but offers a cleaner, and ultimately much more accurate system by integrating the application of these new features directly into the decoding algorithm. The features used in our models are derived using a set of complex questions about the tags and words in the history, written by a linguist. Maximum entropy modelling techniques are then used to combine these features with a standard trigram language model. We evaluate the technique in terms of word error rate, on Wall Street Journal test data.

AB - The contribution of this paper is to investigate the utility of exploiting words and predicted detailed semantic tags in the long history to enhance a standard trigram language model. The paper builds on earlier work in the field that also used words and tags in the long history, but offers a cleaner, and ultimately much more accurate system by integrating the application of these new features directly into the decoding algorithm. The features used in our models are derived using a set of complex questions about the tags and words in the history, written by a linguist. Maximum entropy modelling techniques are then used to combine these features with a standard trigram language model. We evaluate the technique in terms of word error rate, on Wall Street Journal test data.

UR - http://www.scopus.com/inward/record.url?scp=85009070549&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85009070549&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:85009070549

BT - 6th International Conference on Spoken Language Processing, ICSLP 2000

PB - International Speech Communication Association

ER -