To achieve this, five volunteers underwent fMRI scans while listening to a selection of 500 songs spanning 10 different genres.
The scientists fed the scans of the volunteers’ brain activity into an artificial intelligence model called Brain2Music, which processes the information in the scans to generate songs inspired by what the volunteers were listening to.
The new AI-generated songs were produced with MusicLM, Google’s text-to-music model, which can create tracks from text descriptions.
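For readers curious about the mechanics, a minimal conceptual sketch of this kind of pipeline is shown below. It assumes the general approach of decoding brain activity into a music-embedding space and then letting a generator such as MusicLM synthesise audio from that embedding; the array sizes, variable names, and the final generation call are illustrative assumptions, not the authors’ actual code.

```python
# Conceptual sketch of an fMRI-to-music decoding pipeline (not the study's code).
# Assumed sizes and the final generation step are placeholders for illustration.
import numpy as np
from sklearn.linear_model import Ridge

n_trials, n_voxels, embed_dim = 480, 60000, 128  # hypothetical dimensions

# fMRI responses recorded while subjects listened to music clips
fmri = np.random.randn(n_trials, n_voxels)
# embeddings of the same clips in a shared text/music space
music_embeddings = np.random.randn(n_trials, embed_dim)

# learn a regularised linear map from brain activity to the embedding space
decoder = Ridge(alpha=1.0).fit(fmri, music_embeddings)

# for a new scan, predict an embedding and hand it to a text-to-music generator
new_scan = np.random.randn(1, n_voxels)
predicted_embedding = decoder.predict(new_scan)
# audio = musiclm.generate(conditioning=predicted_embedding)  # pseudocode step
```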
In the published study, the researchers describe the AI-created music as similar to “musical stimuli experienced by human subjects, with respect to semantic properties such as genre, instrumentation, and mood.”
They also noted that this exercise allowed them to study how “brain regions represent information derived from purely textual descriptions of musical stimuli.”
Meanwhile, Instagram is working on a feature to distinguish AI-generated posts.