Affective music recommendation system using input images

Shoto Sasaki, Tatsunori Hirai, Hayato Ohya, Shigeo Morishima

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

Music that matches our current mood can leave a deep impression, and this is the music we usually want to hear when we listen. However, we do not know in advance which songs best match our present mood; we would have to listen to each song in turn to find one. Because selecting music manually in this way is difficult, we need a recommendation system that operates affectively. Most recommendation methods, such as collaborative filtering or content-similarity approaches, do not target a specific mood. In addition, there may be no word that exactly specifies a given mood, so textual retrieval is not effective. In this paper, we assume that a relationship exists between mood and images, because visual information affects our mood when we listen to music. We present an affective music recommendation system that uses an input image rather than textual information.
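The abstract's core idea — connecting an input image and candidate songs through a shared notion of mood — can be illustrated with a minimal sketch. Everything below (the colour-to-mood heuristic, the valence/arousal coordinates, the example song list) is an assumption for illustration only, not the authors' actual method:

```python
# Hypothetical sketch: embed both the input image and each candidate song
# in a shared 2-D mood space (valence, arousal) and recommend the songs
# whose coordinates lie closest to the image's. All mappings are assumed.
import math

def image_mood(mean_rgb):
    """Map a mean RGB colour (0-255 per channel) to an assumed
    (valence, arousal) pair in [-1, 1]^2: bright, warm colours push
    valence up; saturated colours push arousal up."""
    r, g, b = mean_rgb
    brightness = (r + g + b) / (3 * 255)            # 0..1
    warmth = (r - b) / 255                          # -1..1
    valence = max(-1.0, min(1.0, 2 * brightness - 1 + 0.5 * warmth))
    arousal = max(-1.0, min(1.0, (max(r, g, b) - min(r, g, b)) / 255 * 2 - 1))
    return (valence, arousal)

def recommend(mean_rgb, songs, k=2):
    """Return the k songs whose (valence, arousal) coordinates are
    nearest (Euclidean) to the input image's mood estimate."""
    v, a = image_mood(mean_rgb)
    return sorted(songs, key=lambda s: math.hypot(s[1] - v, s[2] - a))[:k]

# Toy song catalogue: (title, valence, arousal) — invented for illustration.
songs = [
    ("calm ballad",   -0.2, -0.8),
    ("upbeat pop",     0.8,  0.6),
    ("dark ambient",  -0.7, -0.3),
    ("energetic rock", 0.3,  0.9),
]

# A bright, warm image (e.g. a sunny scene) maps to high valence.
print([s[0] for s in recommend((240, 200, 120), songs)])
# → ['upbeat pop', 'energetic rock']
```

The design choice to compare in a shared mood space (rather than matching raw image and audio features directly) mirrors the abstract's premise that mood is the common link between the two modalities.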

Original language: English
Title of host publication: ACM SIGGRAPH 2013 Posters, SIGGRAPH 2013
DOIs
Publication status: Published - 2013
Event: ACM Special Interest Group on Computer Graphics and Interactive Techniques Conference, SIGGRAPH 2013 - Anaheim, CA, United States
Duration: 2013 Jul 21 – 2013 Jul 25

Publication series

Name: ACM SIGGRAPH 2013 Posters, SIGGRAPH 2013

Conference

Conference: ACM Special Interest Group on Computer Graphics and Interactive Techniques Conference, SIGGRAPH 2013
Country/Territory: United States
City: Anaheim, CA
Period: 13/7/21 – 13/7/25

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Vision and Pattern Recognition
  • Software
