Almost sure and mean convergence of extended stochastic complexity

Masayuki Goto, Toshiyasu Matsushima, Shigeichi Hirasawa

    Research output: Contribution to journal › Article

    1 Citation (Scopus)

    Abstract

    We analyze the extended stochastic complexity (ESC) proposed by K. Yamanishi. ESC can be applied to learning algorithms in both on-line prediction and batch-learning settings. Yamanishi derived an upper bound on ESC that holds uniformly for all data sequences, as well as an upper bound on its asymptotic expectation. However, since Yamanishi concentrated mainly on worst-case performance, a lower bound had not been derived. In this paper, we show that ESC has properties analogous to those of Bayesian statistics: the Bayes rule and asymptotic normality. Using these properties, we then derive an asymptotic formula for ESC, in the sense of both almost sure and mean convergence, within an error of o(1).
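
    For intuition on the quantity being analyzed (this sketch is not from the paper): in the log-loss case, stochastic complexity is the code length of the Bayes mixture over a parametric family, and its asymptotic expansion has the familiar (d/2) log n correction; ESC generalizes the log loss to other loss functions. A minimal illustration for a Bernoulli model with a uniform prior, where the mixture integral has a closed form (the function names here are hypothetical):

    ```python
    import math

    def mixture_code_length(n, k):
        # Code length (bits) of the Bayes mixture for a binary sequence of
        # length n with k ones, under a uniform prior on the Bernoulli
        # parameter: the mixture probability is the Beta integral
        # B(k+1, n-k+1), computed here via log-gamma for stability.
        log_mix = math.lgamma(k + 1) + math.lgamma(n - k + 1) - math.lgamma(n + 2)
        return -log_mix / math.log(2)

    def asymptotic_formula(n, k):
        # Leading terms of the asymptotic expansion: maximum-likelihood code
        # length plus (d/2) log2 n with d = 1 parameter; the remaining gap to
        # the exact mixture code length is bounded as n grows.
        theta = k / n
        if theta in (0.0, 1.0):
            ml = 0.0
        else:
            ml = -(k * math.log2(theta) + (n - k) * math.log2(1 - theta))
        return ml + 0.5 * math.log2(n)
    ```

    For n = 100 and k = 30 the two values agree to within a bit; the paper's results concern expansions of this kind for ESC, accurate to o(1), holding almost surely and in mean.
    
    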

    Original language: English
    Pages (from-to): 2129-2134
    Number of pages: 6
    Journal: IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
    Volume: E82-A
    Issue number: 10
    Publication status: Published - 1999

    Keywords

    • Asymptotic normality
    • Bayesian statistics
    • Extended stochastic complexity
    • Stochastic complexity

    ASJC Scopus subject areas

    • Electrical and Electronic Engineering
    • Hardware and Architecture
    • Information Systems

    Cite this

    @article{78bd09cd63334a96b657d82e2c793a6c,
    title = "Almost sure and mean convergence of extended stochastic complexity",
    abstract = "We analyze the extended stochastic complexity (ESC) proposed by K. Yamanishi. ESC can be applied to learning algorithms in both on-line prediction and batch-learning settings. Yamanishi derived an upper bound on ESC that holds uniformly for all data sequences, as well as an upper bound on its asymptotic expectation. However, since Yamanishi concentrated mainly on worst-case performance, a lower bound had not been derived. In this paper, we show that ESC has properties analogous to those of Bayesian statistics: the Bayes rule and asymptotic normality. Using these properties, we then derive an asymptotic formula for ESC, in the sense of both almost sure and mean convergence, within an error of o(1).",
    keywords = "Asymptotic normality, Bayesian statistics, Extended stochastic complexity, Stochastic complexity",
    author = "Masayuki Goto and Toshiyasu Matsushima and Shigeichi Hirasawa",
    year = "1999",
    language = "English",
    volume = "E82-A",
    pages = "2129--2134",
    journal = "IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences",
    issn = "0916-8508",
    publisher = "Maruzen Co., Ltd/Maruzen Kabushikikaisha",
    number = "10",

    }
