The aim of this project is to develop software which can automatically classify the emotional content of any piece of music. The growing popularity of large personal digital music collections and online music retailers has created a need for new, interactive access methods. Current interfaces to large music databases only allow users to search by genre, artist, or similarity to other items, and very little research addresses automatically extracting the emotional content of music.
Existing methods have tended to rely on a fairly limited set of musical features. A variety of features, such as mode (a given series of musical intervals), tonality (the relationship between pitches in the music), and pitch (perceived frequency), are key to expressing emotion in the composition of music. This project aims to develop software that uses these features to identify the emotional content of a piece of music more accurately.
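As a concrete illustration of one such feature, pitch can be estimated from an audio signal by locating the dominant peak in its autocorrelation. The sketch below is a minimal, hypothetical example (not the project's actual implementation): it synthesizes a pure 440 Hz tone and recovers its fundamental frequency.

```python
import numpy as np

def estimate_pitch(signal, sample_rate, fmin=50.0, fmax=1000.0):
    """Estimate the fundamental frequency of a mono signal via autocorrelation."""
    sig = signal - np.mean(signal)
    corr = np.correlate(sig, sig, mode="full")
    corr = corr[len(corr) // 2:]  # keep non-negative lags only
    # Restrict the peak search to lags corresponding to plausible pitches
    min_lag = int(sample_rate / fmax)
    max_lag = int(sample_rate / fmin)
    lag = min_lag + np.argmax(corr[min_lag:max_lag])
    return sample_rate / lag

# Synthesize half a second of a 440 Hz sine tone and recover its pitch
sr = 22050
t = np.arange(int(0.5 * sr)) / sr
tone = np.sin(2 * np.pi * 440.0 * t)
print(estimate_pitch(tone, sr))  # close to 440 Hz
```

Real music is polyphonic, so a production system would need a more robust pitch tracker, but the same principle (searching for periodicity in the signal) underlies many of them.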
Allowing the user to browse for music by its emotional effect represents a step toward addressing the needs and preferences of the user in music information retrieval. This technology will allow the user to easily select a subset of their own personal music database to suit their current mood. Software like this is a good example of how technology can seamlessly merge into our daily activities.
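Once tracks have been classified, mood-based browsing reduces to filtering the library on the assigned label. The sketch below is purely illustrative (the track titles and mood labels are invented assumptions, not part of the project):

```python
# A personal library where each track has already been tagged with a mood
# label by the classifier; titles and labels here are hypothetical.
library = [
    {"title": "Track A", "mood": "happy"},
    {"title": "Track B", "mood": "sad"},
    {"title": "Track C", "mood": "happy"},
    {"title": "Track D", "mood": "calm"},
]

def browse_by_mood(tracks, mood):
    """Return the titles in a personal library matching the requested mood."""
    return [t["title"] for t in tracks if t["mood"] == mood]

print(browse_by_mood(library, "happy"))  # ['Track A', 'Track C']
```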