Decoding syllables from human fMRI activity

Yohei Otaka, Rieko Osu, Mitsuo Kawato, Meigen Liu, Satoshi Murata, Yukiyasu Kamitani

Research output: Conference contribution

2 Citations (Scopus)

Abstract

Language plays an essential role in human cognition and social communication; therefore, technology for reading out speech from non-invasively measured brain activity would have both scientific and clinical merits. Here, we examined whether individual syllables can be decoded from human fMRI activity. Four healthy subjects participated in the experiments. In each decoding session, the subjects repeatedly uttered a syllable presented on a screen at 3 Hz for a 12-s block. Nine different syllables were presented in a single experimental run, which was repeated 8 times. We also identified the voxels that showed articulation-related activity during utterance of all the syllables in Japanese phonology in a conventional task-rest sequence. We then used either all of these voxels, or the subset lying within anatomically specified ROIs (M1, cerebellum), as data samples for training and testing a decoder (a linear support vector machine) that classifies brain activity patterns for different syllables. To evaluate decoding performance, we performed cross-validation, testing the samples of one decoding session with a decoder trained on the samples of the remaining sessions. As a result, syllables were correctly decoded at above-chance levels. The results suggest the possibility of using non-invasively measured brain activity to read out the intended speech of patients with impaired speech motor control.
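The decoding scheme described in the abstract can be sketched as follows. This is not the authors' code: it is a minimal illustration using scikit-learn, with synthetic "voxel" patterns standing in for real fMRI data, and hypothetical values for the number of voxels and noise level. It reproduces only the evaluation structure: a linear SVM classifying 9 syllables, cross-validated by holding out one of 8 sessions at a time.

```python
# Illustrative sketch (not the authors' code): leave-one-session-out
# cross-validation of a linear SVM syllable decoder on synthetic data.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

n_sessions = 8    # experimental runs (each run repeated 8 times)
n_syllables = 9   # distinct syllables per run
n_voxels = 200    # hypothetical count of articulation-related voxels

# One activity pattern per syllable per session: a syllable-specific
# template plus per-session noise, so patterns are decodable above chance.
templates = rng.normal(size=(n_syllables, n_voxels))
X = np.vstack([
    templates + rng.normal(scale=1.0, size=(n_syllables, n_voxels))
    for _ in range(n_sessions)
])
y = np.tile(np.arange(n_syllables), n_sessions)          # syllable labels
groups = np.repeat(np.arange(n_sessions), n_syllables)   # session labels

# Test each session with a decoder trained on the remaining sessions.
scores = cross_val_score(LinearSVC(), X, y,
                         groups=groups, cv=LeaveOneGroupOut())
print(f"mean accuracy: {scores.mean():.2f} (chance = {1/n_syllables:.2f})")
```

With real fMRI data, `X` would instead hold the measured activity of the voxels selected in the task-rest localizer (all of them, or only those within the M1 or cerebellum ROIs).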

Original language: English
Host publication title: Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers
Pages: 979-986
Number of pages: 8
Volume: 4985 LNCS
Edition: PART 2
DOI: 10.1007/978-3-540-69162-4_102
Publication status: Published - 2008
Externally published: Yes
Event: 14th International Conference on Neural Information Processing, ICONIP 2007 - Kitakyushu
Duration: 13 Nov 2007 - 16 Nov 2007

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 2
Volume: 4985 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


ASJC Scopus subject areas

  • Computer Science (all)
  • Theoretical Computer Science

Cite this

Otaka, Y., Osu, R., Kawato, M., Liu, M., Murata, S., & Kamitani, Y. (2008). Decoding syllables from human fMRI activity. In Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers (PART 2 ed., Vol. 4985 LNCS, pp. 979-986). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 4985 LNCS, No. PART 2). https://doi.org/10.1007/978-3-540-69162-4_102

@inproceedings{bd133c16edad44f080fe8ecc4694744b,
title = "Decoding syllables from human fMRI activity",
abstract = "Language plays essential roles in human cognition and social communication, and therefore technology of reading out speech using non-invasively measured brain activity will have both scientific and clinical merits. Here, we examined whether it is possible to decode each syllable from human fMRI activity. Four healthy subjects participated in the experiments. In a decoding session, the subjects repeatedly uttered a syllable presented on a screen at 3Hz for a 12-s block. Nine different syllables are presented in a single experimental run which was repeated 8 times. We also specified the voxels which showed articulation-related activities by utterance of all the syllables in Japanese phonology in a conventional task-rest sequence. Then, we used either all of these voxels or a part of these voxels that exist in anatomically specified ROIs (M1, cerebellum) during decoding sessions as data samples for training and testing a decoder (linear support vector machine) that classifies brain activity patterns for different syllables. To evaluate decoding performance, we performed cross-validation by testing the sample of one decoding session using a decoder trained with the samples of the remaining sessions. As a result, syllables were correctly decoded at above-chance levels. The results suggest the possibility of using non-invasively measured brain activity to read out the intended speech of disabled patients in speech motor control.",
keywords = "Brain machine interface, Decoding, Functional Magnetic Resonance Imaging (fMRI), Rehabilitation, Speech, Syllable",
author = "Yohei Otaka and Rieko Osu and Mitsuo Kawato and Meigen Liu and Satoshi Murata and Yukiyasu Kamitani",
year = "2008",
doi = "10.1007/978-3-540-69162-4_102",
language = "English",
isbn = "3540691596",
volume = "4985 LNCS",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
number = "PART 2",
pages = "979--986",
booktitle = "Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers",
edition = "PART 2",

}

TY - GEN

T1 - Decoding syllables from human fMRI activity

AU - Otaka, Yohei

AU - Osu, Rieko

AU - Kawato, Mitsuo

AU - Liu, Meigen

AU - Murata, Satoshi

AU - Kamitani, Yukiyasu

PY - 2008

Y1 - 2008

N2 - Language plays essential roles in human cognition and social communication, and therefore technology of reading out speech using non-invasively measured brain activity will have both scientific and clinical merits. Here, we examined whether it is possible to decode each syllable from human fMRI activity. Four healthy subjects participated in the experiments. In a decoding session, the subjects repeatedly uttered a syllable presented on a screen at 3Hz for a 12-s block. Nine different syllables are presented in a single experimental run which was repeated 8 times. We also specified the voxels which showed articulation-related activities by utterance of all the syllables in Japanese phonology in a conventional task-rest sequence. Then, we used either all of these voxels or a part of these voxels that exist in anatomically specified ROIs (M1, cerebellum) during decoding sessions as data samples for training and testing a decoder (linear support vector machine) that classifies brain activity patterns for different syllables. To evaluate decoding performance, we performed cross-validation by testing the sample of one decoding session using a decoder trained with the samples of the remaining sessions. As a result, syllables were correctly decoded at above-chance levels. The results suggest the possibility of using non-invasively measured brain activity to read out the intended speech of disabled patients in speech motor control.

AB - Language plays essential roles in human cognition and social communication, and therefore technology of reading out speech using non-invasively measured brain activity will have both scientific and clinical merits. Here, we examined whether it is possible to decode each syllable from human fMRI activity. Four healthy subjects participated in the experiments. In a decoding session, the subjects repeatedly uttered a syllable presented on a screen at 3Hz for a 12-s block. Nine different syllables are presented in a single experimental run which was repeated 8 times. We also specified the voxels which showed articulation-related activities by utterance of all the syllables in Japanese phonology in a conventional task-rest sequence. Then, we used either all of these voxels or a part of these voxels that exist in anatomically specified ROIs (M1, cerebellum) during decoding sessions as data samples for training and testing a decoder (linear support vector machine) that classifies brain activity patterns for different syllables. To evaluate decoding performance, we performed cross-validation by testing the sample of one decoding session using a decoder trained with the samples of the remaining sessions. As a result, syllables were correctly decoded at above-chance levels. The results suggest the possibility of using non-invasively measured brain activity to read out the intended speech of disabled patients in speech motor control.

KW - Brain machine interface

KW - Decoding

KW - Functional Magnetic Resonance Imaging (fMRI)

KW - Rehabilitation

KW - Speech

KW - Syllable

UR - http://www.scopus.com/inward/record.url?scp=54049120292&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=54049120292&partnerID=8YFLogxK

U2 - 10.1007/978-3-540-69162-4_102

DO - 10.1007/978-3-540-69162-4_102

M3 - Conference contribution

AN - SCOPUS:54049120292

SN - 3540691596

SN - 9783540691594

VL - 4985 LNCS

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 979

EP - 986

BT - Neural Information Processing - 14th International Conference, ICONIP 2007, Revised Selected Papers

ER -