The Alpha-HMM Estimation Algorithm

Prior Cycle Guides Fast Paths

Yasuo Matsuyama

    Research output: Contribution to journal › Review article

    1 Citation (Scopus)

    Abstract

    The estimation of generative structures for sequences is becoming increasingly important for preventing such data sources from becoming a flood of disorganized information. Obtaining hidden Markov models (HMMs) has been a central method for structuring such data. However, users have been aware of the slow speed of this algorithm. In this study, we devise generalized and fast estimation methods for HMMs by employing a geometric information measure that is associated with a function called the alpha-logarithm. Using the alpha-logarithmic likelihood ratio, we exploit prior iterations to guide rapid convergence. The parameter alpha is used to adjust the utilization of previous information. A fixed-point approach using a causal shift and a series expansion is responsible for this gain. For software implementations, we present probability scaling to avoid underflow, where we generalize flaw corrections to the de facto standard. For the update mechanism, we begin with a method called shotgun surrogates, in relation to the parameter alpha. Then, we obtain a dynamic version that employs the controlling and undoing of alpha. Experiments on biological sequences and brain signals for practical state models demonstrate that a significant speedup is achieved compared to the Baum-Welch method. The effects of restricting the state models are also reported.
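    The abstract mentions probability scaling to avoid underflow during HMM estimation. As background for that point, here is a minimal sketch of the standard scaled forward recursion (Rabiner-style scaling) on which such implementations are built; this is a generic illustration, not the paper's generalized correction or its alpha-based update. The function name `forward_scaled` and the toy model below are illustrative only.

    ```python
    import numpy as np

    def forward_scaled(A, B, pi, obs):
        """Scaled forward algorithm for a discrete-emission HMM.

        A   : (N, N) state-transition matrix
        B   : (N, M) emission probability matrix
        pi  : (N,)   initial state distribution
        obs : (T,)   observation symbol indices
        Returns log P(obs | model) without underflow, by renormalizing
        the forward variables at every step and accumulating the logs
        of the scaling factors.
        """
        alpha = pi * B[:, obs[0]]      # unnormalized forward variables at t = 0
        c = alpha.sum()                # scaling factor keeps values in range
        alpha = alpha / c
        log_lik = np.log(c)
        for t in range(1, len(obs)):
            alpha = (alpha @ A) * B[:, obs[t]]
            c = alpha.sum()
            alpha = alpha / c          # renormalize to avoid underflow
            log_lik += np.log(c)       # accumulate log of scaling factors
        return log_lik
    ```

    On long sequences the unscaled forward variables shrink geometrically toward zero, so the per-step renormalization above is what makes likelihood evaluation (and hence Baum-Welch-style reestimation) numerically feasible.
    
    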

    Original language: English
    Article number: 7895145
    Pages (from-to): 3446-3461
    Number of pages: 16
    Journal: IEEE Transactions on Signal Processing
    Volume: 65
    Issue number: 13
    DOI: 10.1109/TSP.2017.2692724
    Publication status: Published - 2017 Jul 1


    Keywords

    • Alpha-hidden Markov model estimation
    • convergence speedup
    • dynamic surrogate
    • message passing
    • shotgun surrogates

    ASJC Scopus subject areas

    • Signal Processing
    • Electrical and Electronic Engineering

    Cite this

    The Alpha-HMM Estimation Algorithm: Prior Cycle Guides Fast Paths. / Matsuyama, Yasuo.

    In: IEEE Transactions on Signal Processing, Vol. 65, No. 13, 7895145, 01.07.2017, p. 3446-3461.

    @article{768ed3eed67f4751815a70663309360d,
    title = "The Alpha-HMM Estimation Algorithm: Prior Cycle Guides Fast Paths",
    keywords = "Alpha-hidden Markov model estimation, convergence speedup, dynamic surrogate, message passing, shotgun surrogates",
    author = "Yasuo Matsuyama",
    year = "2017",
    month = "7",
    day = "1",
    doi = "10.1109/TSP.2017.2692724",
    language = "English",
    volume = "65",
    pages = "3446--3461",
    journal = "IEEE Transactions on Signal Processing",
    issn = "1053-587X",
    publisher = "Institute of Electrical and Electronics Engineers Inc.",
    number = "13",
    }
