The complete E-Prime experiment, including code and media files, suitable for performing synchronized EEG data recordings according to the experimental design and descriptions in "Neural processing of Musical Mode through EEG signals", is provided here.
In this manuscript, the use of electroencephalography (EEG) signals for musical mode detection is explored. To this end, pianists' EEG signals are recorded while they play different scales in two modes (major and minor). The signals are then processed to extract energy-related features, which are used to train and test Machine Learning (ML) and Neural Network (NN) models for the classification of musical mode.
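As an illustration of this kind of pipeline, the sketch below computes band-power ("energy-related") features from EEG epochs and trains a simple classifier for major versus minor mode. It is a minimal, hypothetical example: the sampling rate, band limits, epoch shapes, and the choice of a wide single-hidden-layer network as a stand-in for the Wide Neural Network model are assumptions, not the exact pipeline used in the manuscript.

import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

FS = 256  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power_features(epoch):
    """epoch: array (n_channels, n_samples) -> band power per channel and band."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2, axis=-1)
    feats = []
    for lo, hi in BANDS.values():
        mask = (freqs >= lo) & (freqs < hi)
        # integrate the PSD over the band to obtain an energy-related feature
        feats.append(trapezoid(psd[:, mask], freqs[mask], axis=-1))
    return np.concatenate(feats)  # length: n_channels * n_bands

# Placeholder data: X_epochs (n_trials, n_channels, n_samples); y: 0 = major, 1 = minor
rng = np.random.default_rng(0)
X_epochs = rng.standard_normal((120, 32, FS * 4))
y = rng.integers(0, 2, size=120)

X = np.stack([band_power_features(e) for e in X_epochs])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# A wide, single-hidden-layer network as an illustrative "wide NN" classifier
clf = MLPClassifier(hidden_layer_sizes=(256,), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))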
The tests reveal that the Wide Neural Network achieved the highest accuracy, reaching up to 98% in both intra- and inter-subject scenarios, regardless of participants' handedness. In the schemes considered, the beta and gamma bands were shown to encompass musical mode processing information similarly to the theta and alpha bands. The scalp power distribution of subjects showed predominant activity over the frontal and parieto-occipital regions, with the central region playing a key role in musical mode processing despite its lower power values.
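The following sketch shows one way such regional scalp-power summaries can be obtained: averaging relative band power over frontal, central, and parieto-occipital channel groups. The channel names, region grouping, sampling rate, and relative-power definition are assumptions for illustration only and may differ from the analysis in the manuscript.

import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

FS = 256  # assumed sampling rate in Hz
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
REGIONS = {  # assumed 10-20 system channel grouping
    "frontal": ["Fp1", "Fp2", "F3", "F4", "Fz"],
    "central": ["C3", "C4", "Cz"],
    "parieto-occipital": ["P3", "P4", "Pz", "O1", "O2"],
}
CHANNELS = [ch for chans in REGIONS.values() for ch in chans]

def regional_relative_power(eeg):
    """eeg: (n_channels, n_samples), rows ordered as CHANNELS -> {region: {band: relative power}}."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2, axis=-1)
    total = trapezoid(psd, freqs, axis=-1)  # broadband power per channel
    idx = {ch: i for i, ch in enumerate(CHANNELS)}
    out = {}
    for region, chans in REGIONS.items():
        rows = [idx[ch] for ch in chans]
        out[region] = {}
        for band, (lo, hi) in BANDS.items():
            mask = (freqs >= lo) & (freqs < hi)
            bp = trapezoid(psd[rows][:, mask], freqs[mask], axis=-1)
            out[region][band] = float(np.mean(bp / total[rows]))  # mean relative band power
    return out

# Placeholder recording for demonstration
eeg = np.random.default_rng(1).standard_normal((len(CHANNELS), FS * 10))
print(regional_relative_power(eeg))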