Purest ever example-based machine translation

Detailed presentation and assessment

Yves Lepage, Etienne Denoual

Research output: Contribution to journal › Article

41 Citations (Scopus)

Abstract

We have designed, implemented and assessed an EBMT system that can be dubbed the "purest ever built": it strictly does not make any use of variables, templates or patterns, does not have any explicit transfer component, and does not require any preprocessing or training of the aligned examples. It uses only a specific operation, proportional analogy, that implicitly neutralizes divergences between languages and captures lexical and syntactic variations along the paradigmatic and syntagmatic axes without explicitly decomposing sentences into fragments. Exactly the same genuine implementation of such a core engine was evaluated on different tasks and language pairs. To begin with, we compared our system on two tasks of a previous MT evaluation campaign to rank it among other current state-of-the-art systems. Then, we illustrated the "universality" of our system by participating in a recent MT evaluation campaign, with exactly the same core engine, for a wide variety of language pairs. Finally, we studied the influence of extra data like dictionaries and paraphrases on the system performance.
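To make the core operation concrete: a proportional analogy between strings is an equation A : B :: C : D ("A is to B as C is to D"), and translation by analogy amounts to solving such equations for D. The sketch below is a deliberately simplified illustration, not the algorithm of the paper (which resolves analogies between arbitrary strings via an edit-distance-like procedure); the function name `solve_analogy` and the single-edit restriction are assumptions made for this toy example.

```python
def solve_analogy(a: str, b: str, c: str):
    """Solve the proportional analogy a : b :: c : D for D, in the
    simple case where the difference between a and b is one contiguous
    edit that can be re-applied at the start or end of c.
    Returns None when this restricted scheme does not resolve."""
    # Longest common prefix of a and b.
    p = 0
    while p < min(len(a), len(b)) and a[p] == b[p]:
        p += 1
    # Longest common suffix of what remains after the prefix.
    s = 0
    while s < min(len(a), len(b)) - p and a[len(a) - 1 - s] == b[len(b) - 1 - s]:
        s += 1
    prefix = a[:p]
    suffix = a[len(a) - s:] if s else ""
    u = a[p:len(a) - s] if s else a[p:]   # the part of a that varies
    v = b[p:len(b) - s] if s else b[p:]   # what it varies into in b
    # Suffix-anchored edit: ...u+suffix  ->  ...v+suffix
    tail = u + suffix
    if c.endswith(tail):
        return c[:len(c) - len(tail)] + v + suffix
    # Prefix-anchored edit: prefix+u...  ->  prefix+v...
    head = prefix + u
    if c.startswith(head):
        return prefix + v + c[len(head):]
    return None
```

For instance, `solve_analogy("walk", "walked", "talk")` yields `"talked"`, and `solve_analogy("unhappy", "happy", "unlock")` yields `"lock"`: the same mechanism captures both suffixal and prefixal variation, which hints at how analogy can cover paradigmatic variation without templates or explicit decomposition.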

Original language: English
Pages (from-to): 251-282
Number of pages: 32
Journal: Machine Translation
Volume: 19
Issue number: 3-4
DOI: 10.1007/s10590-006-9010-x
Publication status: Published - December 2005
Externally published: Yes


Keywords

  • Divergences across languages
  • Example-based machine translation
  • Proportional analogies

ASJC Scopus subject areas

  • Hardware and Architecture
  • Software

Cite this

Purest ever example-based machine translation: Detailed presentation and assessment. / Lepage, Yves; Denoual, Etienne.

In: Machine Translation, Vol. 19, No. 3-4, 12.2005, p. 251-282.


@article{82075827959e44839b5e352bcc192e69,
title = "Purest ever example-based machine translation: Detailed presentation and assessment",
keywords = "Divergences across languages, Example-based machine translation, Proportional analogies",
author = "Yves Lepage and Etienne Denoual",
year = "2005",
month = "12",
doi = "10.1007/s10590-006-9010-x",
language = "English",
volume = "19",
pages = "251--282",
journal = "Machine Translation",
issn = "0922-6567",
publisher = "Springer Netherlands",
number = "3-4",

}

TY - JOUR

T1 - Purest ever example-based machine translation

T2 - Detailed presentation and assessment

AU - Lepage, Yves

AU - Denoual, Etienne

PY - 2005/12

Y1 - 2005/12


KW - Divergences across languages

KW - Example-based machine translation

KW - Proportional analogies

UR - http://www.scopus.com/inward/record.url?scp=33847301037&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=33847301037&partnerID=8YFLogxK

U2 - 10.1007/s10590-006-9010-x

DO - 10.1007/s10590-006-9010-x

M3 - Article

VL - 19

SP - 251

EP - 282

JO - Machine Translation

JF - Machine Translation

SN - 0922-6567

IS - 3-4

ER -