Gracenote announced the launch of a new music descriptor system called Sonic Style on Wednesday (May 23) that, for the first time, classifies the “styles” of musical recordings across massive catalogs.
Sonic Style takes industry-standard artist-level genre categories a step further by homing in on each song’s musical style rather than grouping it with the artist’s catalog as a whole. The system can recognize when a pop artist creates a track that leans toward a different style or subgenre, such as Taylor Swift’s “Look What You Made Me Do,” which skews specifically toward pop electronica, dance pop and electroclash.
The intent is to help create better playlists. With each track categorized by its unique musical style, streaming music providers will be able to further personalize playlists for listeners based on taste, while smart speaker device makers will be better equipped to accommodate users’ specific requests. This will be especially helpful for material by artists whose careers span multiple decades and multiple genres. Algorithms and human curators alike will be able to pick out which songs best match certain moods.
“Now that playlists are the new albums, music curators are clamoring for deeper insights into individual recordings for better discovery and personalization,” said Gracenote’s general manager of music and auto Brian Hamilton in a statement. “To achieve scale, Sonic Style applies neural network-powered machine learning to the world’s music catalogs, enabling Gracenote to deliver granular views of musical styles across complete music catalogs. These new turbo-charged style descriptors will revolutionize how the world’s music is organized and curated, ultimately delivering the freshest, most personalized playlists to keep fans listening.”
Record labels and music publishers will also benefit, gaining a wider understanding of which styles of music are driving worldwide listening trends.
Gracenote’s core Global Music Data, which now includes Sonic Style, has nearly 450 descriptor values and a weighted system that helps to accurately categorize the “style profile” of each musical recording. This includes editorially assigned descriptors, such as artist genre, era and origin, as well as machine learning descriptors like tempo and mood.
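For illustration, a track-level “style profile” built from weighted descriptors might be modeled like the minimal Python sketch below. Gracenote has not published its actual schema, so the descriptor names, weights, and `StyleProfile` structure here are hypothetical, not real Sonic Style data:

```python
# Hypothetical sketch of a weighted "style profile" for one recording.
# All descriptor names and weight values are illustrative assumptions,
# not actual Gracenote Sonic Style data.

from dataclasses import dataclass, field


@dataclass
class StyleProfile:
    track: str
    # Descriptor name -> weight in [0.0, 1.0]; higher means a stronger match.
    weights: dict[str, float] = field(default_factory=dict)

    def top_styles(self, n: int = 3) -> list[str]:
        """Return the n most heavily weighted style descriptors."""
        return sorted(self.weights, key=self.weights.get, reverse=True)[:n]


profile = StyleProfile(
    track="Look What You Made Me Do",
    weights={
        "pop electronica": 0.9,
        "dance pop": 0.8,
        "electroclash": 0.7,
        # Low weight: common in the artist's wider catalog, not this track.
        "country pop": 0.1,
    },
)

print(profile.top_styles())  # → ['pop electronica', 'dance pop', 'electroclash']
```

A weighted scheme like this is what lets a curator or algorithm distinguish a one-off electronica track from the artist-level genre label attached to the rest of the catalog.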