AI can predict if your musical taste is more Ice T … or Vanilla Ice

Do you get down to the Jackson 5, or is Stravinsky more your style? Artificial intelligence (AI) that predicts taste in music might seem stranger than fiction, but researchers at Jönköping University in Sweden and Maastricht University in the Netherlands believe they’ve cracked the code.
In a paper published on the preprint server arXiv.org, the team described a system that considers a person’s listening behaviors and, using machine learning algorithms and psychological models, infers their “musical sophistication.”
“Psychological models are increasingly being used to explain … behavioral traces,” they wrote. “The use of domain dependent psychological models allows for more fine-grained identification of behaviors [like music listening] and provides a deeper understanding behind the occurrence of those behaviors.”
In this context, musical sophistication refers to “musical skills, expertise, achievements, and related behaviors across a range of facets.” Studies have shown, the researchers pointed out, that people with a higher degree of musical sophistication are more musically skilled, and in general tend to engage in more “musical behaviors,” like practicing an instrument or listening to a variety of musical genres.
They collected data through an app that tapped Spotify’s API, allowing them to retrieve users’ playlists and audio features like liveness, energy, danceability, tempo, time signature, loudness, track popularity, and artist popularity. They also had participants answer questions from the Goldsmiths Musical Sophistication Index (Gold-MSI) — specifically, questions related to active engagement (how much time and money a person spends on music) and emotions (behaviors related to emotional responses to music).
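For readers curious what that data-collection step looks like in practice, here is a minimal Python sketch using the spotipy client for the Spotify Web API. It is not the authors’ actual app: the playlist-walking loop and the exact feature list are assumptions based on the description above, and credentials are expected in the standard SPOTIPY_CLIENT_ID / SPOTIPY_CLIENT_SECRET / SPOTIPY_REDIRECT_URI environment variables.

```python
# Hypothetical sketch: collect per-track audio features of a user's playlists.
import spotipy
from spotipy.oauth2 import SpotifyOAuth

# Authenticate with permission to read the user's private playlists.
sp = spotipy.Spotify(auth_manager=SpotifyOAuth(scope="playlist-read-private"))

rows = []
for playlist in sp.current_user_playlists(limit=20)["items"]:
    items = sp.playlist_items(playlist["id"], limit=100)["items"]
    tracks = [it["track"] for it in items if it.get("track") and it["track"].get("id")]

    # Batch-fetch the audio features Spotify exposes for each track.
    features = sp.audio_features([t["id"] for t in tracks])

    for track, feat in zip(tracks, features):
        if feat is None:  # some tracks have no audio analysis available
            continue
        rows.append({
            "danceability": feat["danceability"],
            "energy": feat["energy"],
            "liveness": feat["liveness"],
            "loudness": feat["loudness"],
            "tempo": feat["tempo"],
            "time_signature": feat["time_signature"],
            "track_popularity": track["popularity"],
            "artist_popularity": sp.artist(track["artists"][0]["id"])["popularity"],
        })
```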
The deluge of data was fed into a neural network — an AI system consisting of processing nodes that model neurons in the human brain — that predicted 61 study subjects’ emotions and active musical engagement with a high degree of accuracy. Compared to the baseline, it was 95 percent accurate at predicting the former and 93 percent accurate at predicting the latter.
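The article doesn’t spell out the network architecture, so the following is only a minimal sketch, assuming a small feedforward regressor (scikit-learn’s MLPRegressor) that maps per-listener aggregated audio features to a Gold-MSI subscale score. The data shapes and placeholder values are invented purely for illustration.

```python
# Hypothetical sketch: predict a Gold-MSI subscale score from listening features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: one row per listener -- aggregated audio features of their listening history.
# y: that listener's Gold-MSI "active engagement" score (same idea for "emotions").
rng = np.random.default_rng(0)
X = rng.normal(size=(61, 8))        # 8 audio features, 61 listeners (placeholder data)
y = rng.uniform(1, 7, size=61)      # Gold-MSI subscales are Likert-style scores

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# A small feedforward network; standardizing inputs helps it converge.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("R^2 on held-out listeners:", model.score(X_test, y_test))
```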
In the future, the team plans to conduct additional, larger studies and explore the prediction of other subscales of the Gold-MSI, including singing abilities, perceptual abilities, and musical training.
“Our results show that music listening behavior can be used to infer the musical sophistication of users,” the researchers wrote.
It’s not the first time data scientists have attempted to predict musical preferences and tastes with machine learning.
At the Amsterdam Dance Event Tech conference in 2017, a team presented Hitwizard, a system trained to forecast popular songs. By taking into account features like beats per minute, valence, and tempo and comparing them against data sourced from Spotify charts and Dutch radio stations, it was able to predict hit tracks with 66 percent accuracy (and flops with 93 percent accuracy).
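Hitwizard’s actual model isn’t described here, but the general recipe — per-track audio features paired with a charted-or-not label — can be sketched roughly as below. This trains a plain logistic-regression classifier on invented placeholder data; the feature set, label construction, and choice of classifier are all assumptions, not the Hitwizard implementation.

```python
# Hypothetical sketch: classify tracks as "hit" (charted) vs. "flop" from audio features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder data: one row per track -- tempo (BPM), valence, energy, danceability.
rng = np.random.default_rng(1)
X = np.column_stack([
    rng.uniform(60, 180, 500),   # tempo in BPM
    rng.uniform(0, 1, 500),      # valence
    rng.uniform(0, 1, 500),      # energy
    rng.uniform(0, 1, 500),      # danceability
])
y = rng.integers(0, 2, 500)      # 1 = charted ("hit"), 0 = did not ("flop")

clf = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```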
This year, engineers at Amazon leveraged AI to predict users’ musical tastes based on playback duration.
Source: VentureBeat
