Hybrid voice conversion of unit selection and generation using prosody dependent HMM

Tadashi Okubo*, Ryo Mochizuki, Tetsunori Kobayashi

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)

Abstract

We propose a hybrid voice conversion method that combines HMM-based unit selection and spectrum generation. In the proposed method, when candidate units for the required phoneme context exist in the target speaker's corpus, HMM-based unit selection chooses the most likely unit. Unit selection is performed based on the sequence of spectral probability distributions obtained from the adapted HMMs. On the other hand, when no target unit exists in the corpus, the target waveform is generated from the adapted HMM sequence by maximizing the spectral likelihood. The proposed method also employs HMMs in which the spectral probability distributions are adjusted to the target prosody using weights defined by the prosodic probability of each distribution. To show the effectiveness of the proposed method, sound quality and speaker individuality tests were conducted. The results revealed that the proposed method produces high-quality speech and that the individuality of the synthesized speech is closer to the target speaker than that of conventional methods.
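The select-or-generate logic described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all names (convert_segment, corpus layout, the assumption that candidate frames are already aligned to HMM states) are hypothetical, and the generation branch uses only static features, whereas a real system would apply maximum-likelihood parameter generation with dynamic features.

```python
import numpy as np

def log_gaussian(x, mean, var):
    """Diagonal-covariance Gaussian log-likelihood of one spectral frame."""
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def spectral_log_likelihood(unit_frames, state_means, state_vars):
    """Score a candidate unit's spectral frames against the sequence of
    state output distributions from the adapted HMMs (frames assumed
    already aligned to states)."""
    return sum(log_gaussian(x, m, v)
               for x, m, v in zip(unit_frames, state_means, state_vars))

def convert_segment(phoneme_context, corpus, state_means, state_vars):
    """Hypothetical hybrid step: select the most likely target-speaker unit
    when candidates exist; otherwise generate spectra from the adapted HMMs."""
    candidates = corpus.get(phoneme_context, [])
    if candidates:
        # Unit selection: pick the candidate whose spectra best fit the
        # adapted HMM spectral distributions.
        return max(candidates,
                   key=lambda u: spectral_log_likelihood(u, state_means, state_vars))
    # Generation fallback: with static features only, the ML spectral
    # trajectory is just the state-mean sequence; MLPG with delta features
    # would be used in practice.
    return np.asarray(state_means)
```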

Original language: English
Pages (from-to): 2775-2782
Number of pages: 8
Journal: IEICE Transactions on Information and Systems
Volume: E89-D
Issue number: 11
DOIs
Publication status: Published - 2006 Nov

Keywords

  • HMM
  • MLLR
  • Speech synthesis
  • Unit selection
  • Voice conversion

ASJC Scopus subject areas

  • Software
  • Hardware and Architecture
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
  • Artificial Intelligence
