300 Hits in 7.5 sec

Mind the beat: detecting audio onsets from EEG recordings of music listening [article]

Ashvala Vinay, Alexander Lerch, Grace Leslie
2021 arXiv   pre-print
We propose a deep learning approach to predicting audio event onsets in electroencephalogram (EEG) recorded from users as they listen to music.  ...  EEG to predict one-second windows of onsets in the audio.  ...  EXPERIMENTAL SETUP We extract onsets in music from the EEG signals recorded during music listening sessions.  ... 
arXiv:2102.06393v1 fatcat:rtpbvno4q5euveqqekpgmg7x64

Decoding the infant mind: Multivariate pattern analysis (MVPA) using fNIRS

Lauren L. Emberson, Benjamin D. Zinszer, Rajeev D. S. Raizada, Richard N. Aslin, Suliann Ben Hamed
2017 PLoS ONE  
fNIRS is a neuroimaging modality that records the same physiological signal as fMRI but without the constraints of MRI, and with better spatial localization than EEG.  ...  The current paper presents a method of multivariate pattern analysis for fNIRS that allows the authors to decode the infant mind (a key fNIRS population).  ...  and the rest of the research assistants in the Rochester Baby Lab.  ... 
doi:10.1371/journal.pone.0172500 pmid:28426802 pmcid:PMC5398514 fatcat:55oz5bgoljf7fhixqspoussigy

Decoding the Infant Mind: Multichannel Pattern Analysis (MCPA) using fNIRS [article]

Lauren L. Emberson, Benjamin D. Zinszer, Rajeev D. S. Raizada, Richard N. Aslin
2016 bioRxiv   pre-print
fNIRS is a neuroimaging modality that records the same physiological signal as fMRI but without the constraints of MRI, and with better spatial localization than EEG.  ...  The current paper presents a method of multivariate pattern analysis for fNIRS that allows the authors to decode the infant mind (a key fNIRS population).  ...  and the rest of the research assistants in the Rochester Baby Lab.  ... 
doi:10.1101/061234 fatcat:5tln7oebsjdxpd7iq5qr26lwba

Possible Effect of Binaural Beat Combined With Autonomous Sensory Meridian Response for Inducing Sleep

Minji Lee, Chae-Bin Song, Gi-Hwan Shin, Seong-Whan Lee
2019 Frontiers in Human Neuroscience  
When we listen to acoustic beats of two tones in each ear simultaneously, a binaural beat is generated, which induces brain signals at a specific desired frequency.  ...  In addition, the "ASMR triggers" that cause ASMR were presented from natural sound as the sensory stimuli.  ...  Thus, the total score for one factor is a maximum of 16 (4 mood descriptors × 4 points). EEG Acquisition and Analysis EEG Recording The EEG data were recorded at a sampling rate of 500 Hz.  ... 
doi:10.3389/fnhum.2019.00425 pmid:31849629 pmcid:PMC6900908 fatcat:ovmaqpqbtzeopa5zo3hejt63ha

Look at the Beat, Feel the Meter: Top–Down Effects of Meter Induction on Auditory and Visual Modalities

Alexandre Celma-Miralles, Robert F. de Menezes, Juan M. Toro
2016 Frontiers in Human Neuroscience  
In the present study, we aim to assess whether the projection of meter on auditory beats is also present in the visual domain.  ...  A frequency analysis of the elicited steady-state evoked potentials allowed us to compare the frequencies of the beat (2.4 Hz), its first harmonic (4.8 Hz), the binary subharmonic (1.2 Hz), and the ternary  ...  the analysis of the data and the writing of the article.  ... 
doi:10.3389/fnhum.2016.00108 pmid:27047358 pmcid:PMC4803728 fatcat:mf4kvdoslvgghjpzttu5jv6d7u

Name that tune: Decoding music from the listening brain

Rebecca S. Schaefer, Jason Farquhar, Yvonne Blokland, Makiko Sadakata, Peter Desain
2011 NeuroImage  
In the current study we use electroencephalography (EEG) to detect heard music from the brain signal, hypothesizing that the time structure in music makes it especially suitable for decoding perception  ...  from EEG signals.  ...  Acknowledgments The authors gratefully acknowledge the support of the BrainGain Smart Mix Programme of the Netherlands Ministry of Economic Affairs and the Netherlands Ministry of Education, Culture and  ... 
doi:10.1016/j.neuroimage.2010.05.084 pmid:20541612 fatcat:waahpbyh4rffte46gydqfg22ky

Neural Entrainment to Polyrhythms: A Comparison of Musicians and Non-musicians

Jan Stupacher, Guilherme Wood, Matthias Witte
2017 Frontiers in Neuroscience  
By combining EEG and behavioral measures, the current study provides evidence illustrating the importance of ongoing neural oscillations at beat-related frequencies (i.e., neural entrainment) for tracking  ...  Participants (13 musicians and 13 non-musicians) listened to drum rhythms that switched from a quadruple rhythm to a 3-over-4 polyrhythm.  ...  ACKNOWLEDGMENTS JS was supported by a DOC fellowship of the Austrian Academy of Sciences at the Department of Psychology, University of Graz.  ... 
doi:10.3389/fnins.2017.00208 pmid:28446864 pmcid:PMC5388767 fatcat:5icqqti5xrhlrjuhzlud7rs2bi

Toward Studying Music Cognition with Information Retrieval Techniques: Lessons Learned from the OpenMIIR Initiative

Sebastian Stober
2017 Frontiers in Psychology  
As an emerging sub-field of music information retrieval (MIR), music imagery information retrieval (MIIR) aims to retrieve information from brain activity recorded during music cognition, such as listening  ...  for music analysis like fingerprinting, beat tracking or tempo estimation on this new kind of data.  ...  ACKNOWLEDGMENTS The author would like to thank all collaborators who were involved in the work presented here: Avital Sternin, Jessica A. Grahn, Adrian M. Owen, Thomas Prätzlich, and Meinard Müller.  ... 
doi:10.3389/fpsyg.2017.01255 pmid:28824478 pmcid:PMC5541010 fatcat:6kjo4ctogbdwtc6ysln7jxv5ju

Dynamic Music Representations for Real-Time Performance: From Sound to Symbol and Back

Grigore Burloiu
2016 Zenodo  
This thesis contributes to several facets of real-time computer music, using the construct of dynamic music representations as a platform towards the online retrieval and processing of information from  ...  Finally, we harness the potential of musical sound for data exploration and performance mediation, by implementing an EEG sonification system as part of an interactive theatre piece.  ...  The examples in code listing 3.2 show the two kinds of temporal dynamics: On the left, the run-time value of $x determines the delay interval (in beats, starting from the detected onset of NOTE C4) until  ... 
doi:10.5281/zenodo.4280757 fatcat:pnxntdy6hzbchhfdwuplpft2nu

GuessTheMusic: Song Identification from Electroencephalography response [article]

Dhananjay Sonawane, Krishna Prasad Miyapuram, Bharatesh RS, Derek J. Lomas
2020 arXiv   pre-print
We recorded the EEG signals from a group of 20 participants while listening to a set of 12 song clips, each of approximately 2 minutes, that were presented in random order.  ...  The observed performance supports the notion that listening to a song creates specific patterns in the brain, and that these patterns vary from person to person.  ...  It includes response data of 10 subjects who listened to 12 music fragments with duration ranging from 7 s to 16 s coming from popular musical pieces.  ... 
arXiv:2009.08793v1 fatcat:dudoxn4btvblhj2k7zptm5bdma

Early auditory processing in musicians and dancers during a contemporary dance piece

Hanna Poikonen, Petri Toiviainen, Mari Tervaniemi
2016 Scientific Reports  
during continuous music listening.  ...  Further, the comparison of dancers and musicians may help in defining whether these changes are influenced by personal history in intense listening of music or in active music-making.  ...  We would like to thank Miika Leminen, Tommi Makkonen, Niia Virtanen and Johanna Tuomisto for their assistance during the EEG recordings and data processing, Prof.  ... 
doi:10.1038/srep33056 pmid:27611929 pmcid:PMC5017142 fatcat:5cncvbbkujdrvffxtvnbo7xsbq

The Space Between Us: Evaluating a multi-user affective brain-computer music interface

Joel Eaton, Duncan Williams, Eduardo Miranda
2015 Brain-Computer Interfaces  
, derived from arousal and valence recorded in EEG.  ...  An Affective Jukebox, the work of a previous study, validates the method used to read emotions across two dimensions in EEG in response to music.  ...  The music composition for The Space Between Us was developed in collaboration with the composer Miss Weiwei Jin. Disclosure statement No potential conflict of interest was reported by the authors.  ... 
doi:10.1080/2326263x.2015.1101922 fatcat:qvjf5ro6ifahdprxioyf6dg7wy

The man who feels two hearts: the different pathways of interoception

Blas Couto, Alejo Salles, Lucas Sedeño, Margarita Peradejordi, Pablo Barttfeld, Andrés Canales-Johnson, Yamil Vidal Dos Santos, David Huepe, Tristán Bekinschtein, Mariano Sigman, Roberto Favaloro, Facundo Manes (+1 others)
2013 Social Cognitive and Affective Neuroscience  
The patient's performance on the interoception task (heartbeat detection) seemed to be guided by signals from the artificial LVAD, which provides a somatosensory beat, rather than by his endogenous heart  ...  The patient accurately performed several cognitive tasks, except for interoception-related social cognition domains (empathy, theory of mind and decision making).  ...  EEG data were segmented from 200 ms prior to the R-wave-EKG onset to 800 ms after its onset. All segments with eye movement contamination were removed from further analysis using a visual procedure.  ... 
doi:10.1093/scan/nst108 pmid:23887813 pmcid:PMC4158360 fatcat:biw73edtknghvmndasofauo6am

The Berlin Brain-Computer Interface: Progress Beyond Communication and Control

Benjamin Blankertz, Laura Acqualagna, Sven Dähne, Stefan Haufe, Matthias Schultze-Kraft, Irene Sturm, Marija Ušćumlić, Markus A. Wenzel, Gabriel Curio, Klaus-Robert Müller
2016 Frontiers in Neuroscience  
The combined effect of fundamental results about neurocognitive processes and advancements in decoding mental states from ongoing brain signals has brought forth a whole range of potential neurotechnological  ...  In this article, we discuss the reasons why, in some of the prospective application domains, considerable effort is still required to make the systems ready to deal with the full complexity of the real  ...  Moreover, we would like to thank our coauthors for allowing us to use materials from former joint publications.  ... 
doi:10.3389/fnins.2016.00530 pmid:27917107 pmcid:PMC5116473 fatcat:noknjgpzm5hlxbbe43g42wvyci

Effects of instructed timing on electric guitar and bass sound in groove performance

Guilherme Schmidt Câmara, Kristian Nymoen, Olivier Lartillot, Anne Danielsen
2020 Journal of the Acoustical Society of America  
This paper reports on two experiments that investigated the expressive means through which musicians well versed in groove-based music signal the intended timing of a rhythmic event.  ...  Data were collected from 21 expert electric guitarists and 21 bassists, who were instructed to perform a simple rhythmic pattern in three different timing styles: "laid-back," "on-the-beat," and "pushed  ...  This work was partially supported by the Research Council of Norway through its Centers of Excellence scheme, Project No. 262762 and the TIME (Timing and Sound in Musical Microrhythm) project, Grant No  ... 
doi:10.1121/10.0000724 pmid:32113267 fatcat:aondmtg2kzegzjdi3qmzr72mhm
Showing results 1 — 15 out of 300 results