Recognition and generation of sentences through self-organizing linguistic hierarchy using MTRNN

Wataru Hinoshita, Hiroaki Arie, Jun Tani, Tetsuya Ogata, Hiroshi G. Okuno

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We show that a Multiple Timescale Recurrent Neural Network (MTRNN) can acquire the capabilities of recognizing and generating sentences by self-organizing a hierarchical linguistic structure. There have been many studies aimed at finding whether a neural system such as the brain can acquire language without innate linguistic faculties. These studies have found that some kinds of recurrent neural networks can learn grammar. However, these models could not acquire the capability of deterministically generating various sentences, which is an essential part of language function. In addition, the existing models require a word set in advance to learn the grammar. Learning language without prior knowledge about words requires the capability of hierarchical composition, such as characters into words and words into sentences, which is the essence of the rich expressiveness of language. In our experiment, we trained our model to learn language using only a sentence set, without any prior knowledge about words or grammar. Our experimental results demonstrated that the model could acquire the capabilities of recognizing and deterministically generating grammatical sentences even when they had not been learned. The analysis of neural activations in our model revealed that the MTRNN had self-organized the linguistic structure hierarchically by taking advantage of differences in the time scale among its neurons: the neurons that changed the fastest represented "characters," those that changed more slowly represented "words," and those that changed the slowest represented "sentences."
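The hierarchy described in the abstract rests on a leaky-integrator (CTRNN-style) update in which each neuron's potential tracks its synaptic input at a rate set by a per-neuron time constant, so fast units forget quickly while slow units retain context. The sketch below illustrates only that timescale mechanism; the group sizes, time-constant values, and zeroed weights are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

def mtrnn_step(u, x, W, tau):
    # Leaky-integrator update: each unit's potential u_i moves toward its
    # synaptic input at a rate 1/tau_i; larger tau means slower change.
    u_new = (1.0 - 1.0 / tau) * u + (1.0 / tau) * (W @ x)
    return u_new, np.tanh(u_new)

# Three groups with increasing time constants, loosely corresponding to the
# "character" / "word" / "sentence" levels (values are illustrative).
tau = np.concatenate([np.full(4, 2.0), np.full(4, 5.0), np.full(4, 70.0)])
n = tau.size
W = np.zeros((n, n))   # no coupling here, to expose the per-group decay rates
u = np.ones(n)         # start all units from the same potential
x = np.tanh(u)
for _ in range(20):
    u, x = mtrnn_step(u, x, W, tau)
# After 20 steps the fast units have almost entirely forgotten their initial
# state, while the slow units still retain most of it.
```

With coupling weights restored and trained (the paper uses error backpropagation through time), the same timescale asymmetry is what lets the slow units carry sentence-level context over the fast character-level dynamics.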

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 42-51
Number of pages: 10
Volume: 6098 LNAI
Edition: PART 3
DOI: 10.1007/978-3-642-13033-5_5
Publication status: Published - 2010
Externally published: Yes
Event: 23rd International Conference on Industrial Engineering and Other Applications of Applied Intelligence Systems, IEA/AIE 2010 - Cordoba
Duration: 2010 Jun 1 - 2010 Jun 4

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 3
Volume: 6098 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 23rd International Conference on Industrial Engineering and Other Applications of Applied Intelligence Systems, IEA/AIE 2010
City: Cordoba
Period: 2010/6/1 - 2010/6/4

Fingerprint

Multiple Time Scales
Recurrent Neural Networks
Self-organizing
Linguistics
Grammar
Neurons
Model
Expressiveness
Activation
Brain
Hierarchy
Language
Experimental Results
Experiments

ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Hinoshita, W., Arie, H., Tani, J., Ogata, T., & Okuno, H. G. (2010). Recognition and generation of sentences through self-organizing linguistic hierarchy using MTRNN. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (PART 3 ed., Vol. 6098 LNAI, pp. 42-51). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 6098 LNAI, No. PART 3). https://doi.org/10.1007/978-3-642-13033-5_5

Recognition and generation of sentences through self-organizing linguistic hierarchy using MTRNN. / Hinoshita, Wataru; Arie, Hiroaki; Tani, Jun; Ogata, Tetsuya; Okuno, Hiroshi G.

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). Vol. 6098 LNAI PART 3. ed. 2010. p. 42-51 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 6098 LNAI, No. PART 3).


Hinoshita, W, Arie, H, Tani, J, Ogata, T & Okuno, HG 2010, Recognition and generation of sentences through self-organizing linguistic hierarchy using MTRNN. in Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). PART 3 edn, vol. 6098 LNAI, Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), no. PART 3, vol. 6098 LNAI, pp. 42-51, 23rd International Conference on Industrial Engineering and Other Applications of Applied Intelligence Systems, IEA/AIE 2010, Cordoba, 10/6/1. https://doi.org/10.1007/978-3-642-13033-5_5
Hinoshita W, Arie H, Tani J, Ogata T, Okuno HG. Recognition and generation of sentences through self-organizing linguistic hierarchy using MTRNN. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). PART 3 ed. Vol. 6098 LNAI. 2010. p. 42-51. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); PART 3). https://doi.org/10.1007/978-3-642-13033-5_5
Hinoshita, Wataru ; Arie, Hiroaki ; Tani, Jun ; Ogata, Tetsuya ; Okuno, Hiroshi G. / Recognition and generation of sentences through self-organizing linguistic hierarchy using MTRNN. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). Vol. 6098 LNAI PART 3. ed. 2010. pp. 42-51 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); PART 3).
@inproceedings{d13d60e3b14d4f97bd2b95c201df75ee,
title = "Recognition and generation of sentences through self-organizing linguistic hierarchy using MTRNN",
abstract = "We show that a Multiple Timescale Recurrent Neural Network (MTRNN) can acquire the capabilities of recognizing and generating sentences by self-organizing a hierarchical linguistic structure. There have been many studies aimed at finding whether a neural system such as the brain can acquire languages without innate linguistic faculties. These studies have found that some kinds of recurrent neural networks could learn grammar. However, these models could not acquire the capability of deterministically generating various sentences, which is an essential part of language functions. In addition, the existing models require a word set in advance to learn the grammar. Learning languages without previous knowledge about words requires the capability of hierarchical composition such as characters to words and words to sentences, which is the essence of the rich expressiveness of languages. In our experiment, we trained our model to learn language using only a sentence set without any previous knowledge about words or grammar. Our experimental results demonstrated that the model could acquire the capabilities of recognizing and deterministically generating grammatical sentences even if they were not learned. The analysis of neural activations in our model revealed that the MTRNN had self-organized the linguistic structure hierarchically by taking advantage of differences in the time scale among its neurons, more concretely, neurons that change the fastest represented {"}characters,{"} those that change more slowly represented {"}words,{"} and those that change the slowest represented {"}sentences.{"}",
author = "Wataru Hinoshita and Hiroaki Arie and Jun Tani and Tetsuya Ogata and Okuno, {Hiroshi G.}",
year = "2010",
doi = "10.1007/978-3-642-13033-5_5",
language = "English",
isbn = "3642130321",
volume = "6098 LNAI",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
number = "PART 3",
pages = "42--51",
booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
edition = "PART 3",

}

TY - GEN

T1 - Recognition and generation of sentences through self-organizing linguistic hierarchy using MTRNN

AU - Hinoshita, Wataru

AU - Arie, Hiroaki

AU - Tani, Jun

AU - Ogata, Tetsuya

AU - Okuno, Hiroshi G.

PY - 2010

Y1 - 2010

N2 - We show that a Multiple Timescale Recurrent Neural Network (MTRNN) can acquire the capabilities of recognizing and generating sentences by self-organizing a hierarchical linguistic structure. There have been many studies aimed at finding whether a neural system such as the brain can acquire languages without innate linguistic faculties. These studies have found that some kinds of recurrent neural networks could learn grammar. However, these models could not acquire the capability of deterministically generating various sentences, which is an essential part of language functions. In addition, the existing models require a word set in advance to learn the grammar. Learning languages without previous knowledge about words requires the capability of hierarchical composition such as characters to words and words to sentences, which is the essence of the rich expressiveness of languages. In our experiment, we trained our model to learn language using only a sentence set without any previous knowledge about words or grammar. Our experimental results demonstrated that the model could acquire the capabilities of recognizing and deterministically generating grammatical sentences even if they were not learned. The analysis of neural activations in our model revealed that the MTRNN had self-organized the linguistic structure hierarchically by taking advantage of differences in the time scale among its neurons, more concretely, neurons that change the fastest represented "characters," those that change more slowly represented "words," and those that change the slowest represented "sentences."

AB - We show that a Multiple Timescale Recurrent Neural Network (MTRNN) can acquire the capabilities of recognizing and generating sentences by self-organizing a hierarchical linguistic structure. There have been many studies aimed at finding whether a neural system such as the brain can acquire languages without innate linguistic faculties. These studies have found that some kinds of recurrent neural networks could learn grammar. However, these models could not acquire the capability of deterministically generating various sentences, which is an essential part of language functions. In addition, the existing models require a word set in advance to learn the grammar. Learning languages without previous knowledge about words requires the capability of hierarchical composition such as characters to words and words to sentences, which is the essence of the rich expressiveness of languages. In our experiment, we trained our model to learn language using only a sentence set without any previous knowledge about words or grammar. Our experimental results demonstrated that the model could acquire the capabilities of recognizing and deterministically generating grammatical sentences even if they were not learned. The analysis of neural activations in our model revealed that the MTRNN had self-organized the linguistic structure hierarchically by taking advantage of differences in the time scale among its neurons, more concretely, neurons that change the fastest represented "characters," those that change more slowly represented "words," and those that change the slowest represented "sentences."

UR - http://www.scopus.com/inward/record.url?scp=79551563700&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=79551563700&partnerID=8YFLogxK

U2 - 10.1007/978-3-642-13033-5_5

DO - 10.1007/978-3-642-13033-5_5

M3 - Conference contribution

AN - SCOPUS:79551563700

SN - 3642130321

SN - 9783642130328

VL - 6098 LNAI

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 42

EP - 51

BT - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

ER -