neurosciencestuff:

(Image caption: Brain areas best predicting musicianship. Red: left/right anterior cingulate gyrus; Green: right inferior frontal gyrus; Blue: right superior temporal gyrus; Gray: caudate nucleus, middle frontal gyrus, inferior frontal gyrus)

Your brain responses to music reveal if you’re a musician or not

How your brain responds to music listening can reveal whether you
have received musical training, according to new Nordic research
conducted in Finland (University of Jyväskylä and AMI Center) and
Denmark (Aarhus University). By applying methods of computational music
analysis and machine learning to brain imaging data collected during
music listening, the researchers were able to predict with significant
accuracy whether the listeners were musicians or not.

These results emphasize the striking impact of musical training on
our neural responses to music, to the extent that musicians' brains can
be distinguished from non-musicians' brains despite independent factors
such as musical preference and familiarity. The research also revealed
that the brain areas that best predict musicianship lie predominantly in
the frontal and temporal areas of the brain's right hemisphere. These
findings are consistent with previous work on how the brain processes
certain acoustic characteristics of music as well as intonation in
speech. The paper was published on January 15 in the journal Scientific
Reports.

The study utilized functional magnetic resonance imaging (fMRI) brain
data collected by Prof. Elvira Brattico’s team (previously at
University of Helsinki and currently at Aarhus University) from 18
musicians and 18 non-musicians while they attentively listened to music
of different genres. Computational algorithms were applied to extract
musical features from the presented music.
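To illustrate what this kind of computational feature extraction can look like, here is a minimal sketch that pulls frame-wise acoustic descriptors (timbre, tonality, loudness) from an audio file using the librosa library. The file name, feature set, and sampling settings are illustrative assumptions, not necessarily the toolchain or features used in the study.

```python
# Hypothetical sketch: extracting acoustic features from a music excerpt.
# The file name and feature choices are illustrative assumptions.
import numpy as np
import librosa

def extract_features(path, sr=22050, hop_length=512):
    """Return a (frames x features) matrix of frame-wise acoustic descriptors."""
    y, sr = librosa.load(path, sr=sr)

    # Timbre: Mel-frequency cepstral coefficients
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13, hop_length=hop_length)

    # Tonality: chroma energy per pitch class
    chroma = librosa.feature.chroma_stft(y=y, sr=sr, hop_length=hop_length)

    # Loudness proxy: root-mean-square energy
    rms = librosa.feature.rms(y=y, hop_length=hop_length)

    # Stack into one frame-wise feature matrix (frames x dimensions)
    return np.vstack([mfcc, chroma, rms]).T

features = extract_features("excerpt.wav")  # "excerpt.wav" is a placeholder path
print(features.shape)
```

In practice, such frame-wise features would be downsampled or averaged to match the temporal resolution of the fMRI scans before any modelling of brain responses.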

“A novel feature of our approach was that, instead of relying on
static representations of brain activity, we modelled how music is
processed in the brain over time,” explains Pasi Saari,
post-doctoral researcher at the University of Jyväskylä and the main
author of the study. “Taking the temporal dynamics into account was
found to improve the results remarkably.” As the last step of modelling,
the researchers used machine learning to form a model that predicts
musicianship from a combination of brain regions.
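One common way to account for the temporal dynamics of neural processing in naturalistic fMRI is to convolve the stimulus feature time course with a canonical hemodynamic response function before relating it to regional activity. The sketch below shows this idea; the HRF parameters, repetition time, and placeholder data are assumptions, not the published model.

```python
# Minimal sketch: relating a stimulus feature to a regional fMRI signal
# over time. TR, HRF shape, and all data here are illustrative assumptions.
import numpy as np
from scipy.stats import gamma

def canonical_hrf(tr=2.0, duration=30.0):
    """Double-gamma hemodynamic response function sampled at the scan rate."""
    t = np.arange(0, duration, tr)
    peak = gamma.pdf(t, 6)           # positive response peaking around 6 s
    undershoot = gamma.pdf(t, 16)    # later post-stimulus undershoot
    hrf = peak - 0.35 * undershoot
    return hrf / hrf.sum()

def hemodynamic_feature(feature_timecourse, tr=2.0):
    """Convolve a scan-aligned feature time course with the HRF."""
    hrf = canonical_hrf(tr)
    return np.convolve(feature_timecourse, hrf)[: len(feature_timecourse)]

# Example: correlate the convolved feature with one brain region's signal.
rng = np.random.default_rng(0)
feature = rng.random(200)        # placeholder feature, one value per scan
region_signal = rng.random(200)  # placeholder regional BOLD time series
predictor = hemodynamic_feature(feature)
r = np.corrcoef(predictor, region_signal)[0, 1]
print(f"feature-to-region correlation: {r:.2f}")
```

Region-wise measures of this kind, computed per participant, are the sort of input a classifier can then use to predict group membership.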

The machine learning model was able to predict the listeners'
musicianship with 77% accuracy, a result on a par with similar
participant-classification studies involving, for example, clinical
populations of brain-damaged patients. The areas where music processing
best predicted musicianship resided mostly in the right hemisphere and
included areas previously found to be associated with engagement and
attention, processing of musical conventions, and processing of
music-related sound features (e.g. pitch and tonality).
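To give a sense of how such a participant-level accuracy might be estimated, here is a hedged sketch of leave-one-subject-out cross-validation with a linear classifier. The data are random placeholders, and the classifier choice and feature dimensionality are assumptions rather than the study's reported setup.

```python
# Hypothetical sketch: participant classification with
# leave-one-subject-out cross-validation. Data are random placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n_subjects, n_features = 36, 50                     # 18 musicians + 18 non-musicians
X = rng.standard_normal((n_subjects, n_features))   # per-subject brain-derived features
y = np.array([1] * 18 + [0] * 18)                   # 1 = musician, 0 = non-musician

clf = SVC(kernel="linear", C=1.0)                   # simple linear classifier
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.0%}")
```

Leave-one-subject-out testing ensures that the model is always evaluated on a participant it has never seen, which is what makes an accuracy figure like 77% meaningful for new listeners.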

“These areas can be regarded as core structures in music processing
which are most affected by intensive, lifelong musical training,” states
Iballa Burunat, a co-author of the study. In these
areas, the processing of higher-level features such as tonality and
pulse was the best predictor of musicianship, suggesting that musical
training particularly affects the processing of these aspects of music.

“The novelty of our approach is the integration of computational
acoustic feature extraction with functional neuroimaging measures,
obtained in a realistic music-listening environment, and taking into
account the dynamics of neural processing. It represents a significant
contribution that complements recent brain-reading methods which decode
participant information from brain activity in realistic conditions,”
concludes Petri Toiviainen, the senior author of the study. The research was funded by the Academy of Finland and the Danish National Research Foundation.
