A musical mood trajectory estimation method using lyrics and acoustic features

Naoki Nishikawa, Katsutoshi Itoyama, Hiromasa Fujihara, Masataka Goto, Tetsuya Ogata, Hiroshi G. Okuno

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

In this paper, we present a new method that represents the overall time-varying impression of a song by a pair of mood trajectories estimated from its lyrics and audio signals. The mood trajectory of the lyrics is obtained by using probabilistic latent semantic analysis (PLSA) to estimate topics (representing impressions) from the words in the lyrics. The mood trajectory of the audio signals is estimated from acoustic features by using multiple linear regression analysis. In our experiments, the mood trajectories of 100 songs in Last.fm's Best of 2010 were estimated. A detailed analysis of the 100 songs confirms that acoustic features provide more accurate mood trajectories and that 21% of the resulting mood trajectories match the actual musical moods available on Last.fm.
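The audio-side step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes mood is represented as 2-D coordinates (e.g. a valence-arousal plane), and the feature dimensions and synthetic data are hypothetical, chosen only to show multiple linear regression producing a per-segment mood trajectory.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not the paper's actual setup):
n_train, n_feats, n_moods = 200, 8, 2   # segments, acoustic features, mood axes

# Synthetic training data: acoustic features per segment with known
# ground-truth mood coordinates generated from a hidden linear mapping.
X_train = rng.normal(size=(n_train, n_feats))
true_W = rng.normal(size=(n_feats + 1, n_moods))     # hidden mapping (for simulation)
X_aug = np.hstack([X_train, np.ones((n_train, 1))])  # append intercept column
Y_train = X_aug @ true_W + 0.1 * rng.normal(size=(n_train, n_moods))

# Fit the regression weights by ordinary least squares.
W, *_ = np.linalg.lstsq(X_aug, Y_train, rcond=None)

# Predict a mood trajectory for a new song: one mood point per segment,
# so plotting the rows in order traces the song's time-varying impression.
n_segments = 30
X_song = rng.normal(size=(n_segments, n_feats))
X_song_aug = np.hstack([X_song, np.ones((n_segments, 1))])
trajectory = X_song_aug @ W   # shape (n_segments, n_moods)
print(trajectory.shape)       # (30, 2)
```

The lyrics side would analogously assign each lyric segment a topic distribution via PLSA and map topics to mood coordinates, yielding a second trajectory over the same timeline.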

Original language: English
Title of host publication: MM'11 - Proceedings of the 2011 ACM Multimedia Conference and Co-Located Workshops - MIRUM 2011 Workshop, MIRUM'11
Pages: 51-56
Number of pages: 6
DOIs: https://doi.org/10.1145/2072529.2072543
Publication status: Published - 2011 Dec 1
Externally published: Yes
Event: 2011 ACM Multimedia Conference, MM'11 and Co-Located Workshops - 1st International ACM Workshop on Music Information Retrieval with User-Centered and Multimodal Strategies, MIRUM'11 - Scottsdale, AZ, United States
Duration: 2011 Nov 28 - 2011 Dec 1

Publication series

Name: MM'11 - Proceedings of the 2011 ACM Multimedia Conference and Co-Located Workshops - MIRUM 2011 Workshop, MIRUM'11

Conference

Conference: 2011 ACM Multimedia Conference, MM'11 and Co-Located Workshops - 1st International ACM Workshop on Music Information Retrieval with User-Centered and Multimodal Strategies, MIRUM'11
Country: United States
City: Scottsdale, AZ
Period: 11/11/28 - 11/12/1

Keywords

  • Lyrics and audio signals
  • Music information retrieval
  • Mood trajectory
  • Musical mood estimation
  • Musical mood representation
  • Time-varying impression

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Human-Computer Interaction


Cite this

    Nishikawa, N., Itoyama, K., Fujihara, H., Goto, M., Ogata, T., & Okuno, H. G. (2011). A musical mood trajectory estimation method using lyrics and acoustic features. In MM'11 - Proceedings of the 2011 ACM Multimedia Conference and Co-Located Workshops - MIRUM 2011 Workshop, MIRUM'11 (pp. 51-56). (MM'11 - Proceedings of the 2011 ACM Multimedia Conference and Co-Located Workshops - MIRUM 2011 Workshop, MIRUM'11). https://doi.org/10.1145/2072529.2072543