
Aimpathy

Much research has been done to better understand the emotional experience of music, drawing on philosophical, artistic, psychological, and statistical approaches. In this research we conduct a cross-domain experiment grounded in those four disciplines to further understand the factors that influence the emotional perception of music, and in particular the difference between the artist's emotional conception and the audience's perception. In the experiment we train a novel Artificial Neural Network model to predict the perceived emotion of a short musical phrase. We then feed the machine curated input that simulates artistic choices, to explore which factors are most significant in determining the perceived emotions. In the conclusion we describe the results, as well as possible follow-ups to the experiment, such as an emotional expression training tool for musicians.
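As a rough illustration of the prediction task (not the actual model used in the experiment), the sketch below shows a minimal network that maps a short musical phrase to an emotion estimate. The per-note input features (pitch, duration, velocity), the valence/arousal output, the `PhraseEmotionNet` name, and the use of PyTorch are all assumptions made for this example only.

```python
# Hypothetical sketch: map a short musical phrase to a perceived-emotion estimate.
# The encoding and output space are illustrative assumptions, not the project's design.
import torch
import torch.nn as nn


class PhraseEmotionNet(nn.Module):
    def __init__(self, note_features: int = 3, hidden: int = 64):
        super().__init__()
        # A GRU summarises the phrase's note sequence into a single hidden state.
        self.encoder = nn.GRU(note_features, hidden, batch_first=True)
        # Two outputs: assumed valence and arousal scores, squashed into [-1, 1].
        self.head = nn.Sequential(
            nn.Linear(hidden, 32),
            nn.ReLU(),
            nn.Linear(32, 2),
            nn.Tanh(),
        )

    def forward(self, phrase: torch.Tensor) -> torch.Tensor:
        # phrase: (batch, notes, note_features)
        _, h = self.encoder(phrase)
        return self.head(h[-1])


if __name__ == "__main__":
    model = PhraseEmotionNet()
    # A fake batch of 4 phrases, each 16 notes long, just to show the shapes.
    phrases = torch.rand(4, 16, 3)
    print(model(phrases).shape)  # torch.Size([4, 2])
```

Curated inputs that simulate artistic choices (for example, the same phrase played louder, slower, or in a different register) could then be fed through such a model to see which changes move the predicted emotion the most.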
