Model selection and information criterion

Noboru Murata, Hyeyoung Park

    Research output: Chapter in Book/Report/Conference proceeding › Chapter

    2 Citations (Scopus)

    Abstract

    In this chapter, the problem of estimating model parameters from observed data, as in regression and function approximation, is considered, and a method of evaluating the goodness of a model is introduced. Starting from so-called leave-one-out cross-validation and investigating the asymptotic statistical properties of the estimated parameters, a generalized Akaike's information criterion (AIC) is derived for selecting an appropriate model from several candidates. Beyond model selection, the concept of information criteria provides an assessment of the goodness of a model in various situations. Finally, an optimization method using regularization is presented as an example.
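
    As a concrete illustration of the kind of criterion described in the abstract, the sketch below compares polynomial regression models of increasing degree using the classical AIC under a Gaussian-noise assumption. The data, the AIC form n*log(RSS/n) + 2k, and all names are illustrative assumptions, not material taken from the chapter.

    # Minimal illustrative sketch (not code from the chapter): selecting among
    # polynomial regression candidates by AIC, assuming Gaussian noise so that,
    # up to an additive constant, AIC = n*log(RSS/n) + 2k for k free parameters.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    x = np.linspace(-1.0, 1.0, n)
    y = 1.0 + 2.0 * x - 1.5 * x**2 + 0.3 * rng.standard_normal(n)  # true model: quadratic

    def rss_of_polynomial_fit(x, y, degree):
        """Least-squares fit of the given degree; returns the residual sum of squares."""
        coeffs = np.polyfit(x, y, degree)
        residuals = y - np.polyval(coeffs, x)
        return float(np.sum(residuals ** 2))

    def aic_gaussian(rss, n, k):
        """AIC for a Gaussian regression model, dropping model-independent constants."""
        return n * np.log(rss / n) + 2 * k

    for degree in range(1, 7):
        rss = rss_of_polynomial_fit(x, y, degree)
        k = degree + 1  # number of polynomial coefficients (noise variance omitted)
        print(f"degree {degree}: AIC = {aic_gaussian(rss, n, k):.2f}")
    # The candidate minimizing AIC (here typically degree 2) would be selected.

    Leave-one-out cross-validation, the starting point mentioned in the abstract, would instead refit each candidate n times and compare held-out errors; AIC can be viewed as an asymptotic shortcut to that assessment that avoids the repeated refitting.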

    Original language: English
    Title of host publication: Information Theory and Statistical Learning
    Publisher: Springer US
    Pages: 333-354
    Number of pages: 22
    ISBN (Print): 9780387848150
    DOI: 10.1007/978-0-387-84816-7_14
    Publication status: Published - 2009

    ASJC Scopus subject areas

    • Computer Science (all)

    Cite this

    Murata, N., & Park, H. (2009). Model selection and information criterion. In Information Theory and Statistical Learning (pp. 333-354). Springer US. https://doi.org/10.1007/978-0-387-84816-7_14
