A team of researchers from Brazil, India, Germany and Finland, based at Brazil's D'Or Institute for Research and Education, conducted a study to read the minds of participants. A magnetic resonance (MR) machine was used to determine which song participants were listening to by decoding their brain activity. The technique achieved 85 per cent accuracy.
The study, published in the journal Scientific Reports, refined the technique and paved the way for new research on the reconstruction of auditory imagination and inner speech. In the experiment, six volunteers listened to 40 pieces of music spanning classical, rock, pop, jazz and other genres. The neural fingerprint of each song on the participants' brains was captured by the MR machine while a computer learned to identify the brain patterns elicited by each musical piece.
Musical features such as tonality, dynamics, rhythm and timbre were taken into account by the computer. The computer identified the correct song with up to 85 per cent accuracy, a strong performance compared with previous studies. In the future, studies on brain decoding and machine learning could open up ways of communicating that are independent of any written or spoken language.
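The idea described above, matching a measured neural fingerprint against stored fingerprints for each song, can be sketched in code. This is a minimal illustration only, not the study's actual pipeline: the feature count, the synthetic fingerprints, and the nearest-centroid matching rule are all assumptions made for the example.

```python
# Illustrative sketch only: the study's real method is not reproduced here.
# Each of 40 songs gets a synthetic "neural fingerprint" (a feature vector
# standing in for tonality, dynamics, rhythm, timbre, etc.). A noisy
# measurement is identified as the song with the nearest stored fingerprint.
import numpy as np

rng = np.random.default_rng(0)
n_songs, n_features = 40, 16  # 40 pieces; feature count is hypothetical

# One "true" fingerprint per song (randomly generated for this sketch).
fingerprints = rng.normal(size=(n_songs, n_features))

def identify(measurement: np.ndarray) -> int:
    """Return the index of the song whose fingerprint is nearest."""
    distances = np.linalg.norm(fingerprints - measurement, axis=1)
    return int(np.argmin(distances))

# Simulate noisy brain readings for every song and score the matcher.
trials, correct = 10, 0
for _ in range(trials):
    for song in range(n_songs):
        noisy = fingerprints[song] + rng.normal(scale=0.5, size=n_features)
        correct += identify(noisy) == song

accuracy = correct / (trials * n_songs)
print(f"accuracy: {accuracy:.0%}")
```

With well-separated fingerprints and modest noise, this toy matcher scores highly; real fMRI decoding is far harder, which is why the reported 85 per cent figure is notable.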
"Machines will be able to translate our musical thoughts into songs," said Sebastian Hoefle from D'Or Institute and Ph.D student from the Federal University of Rio de Janeiro, Brazil. In the future, Hoefle expects to find answers for questions like "what musical features make some people love a song while others don't? Is our brain adapted to prefer a specific kind of music"? In the clinical domain, this technology could be used to enhance brain-computer interfaces in order to establish communication with locked-in syndrome patients.
(With IANS inputs)