BrainBERT: An Approach to Language Model Capturing the Correspondence between Brain Activities and Language
BrainBERT:脳活動とテキストの対応関係を捉えた言語モデル構築への取り組み

Ying Luo, Ichiro Kobayashi
2021
In recent years, numerous attempts have been made to decode brain activity. In particular, a great deal of work has focused on capturing the general correspondence between language and brain activity. In this paper, we investigate the mapping between text and brain activity data for brain decoding using BERT [1], a generic language model proposed in the field of Natural Language Processing (NLP). At the same time, we investigate the effect of brain activity on NLP tasks with BrainBERT, a novel model proposed by our team. We also compare and validate different models to obtain the best biases and weights for the autoencoder and the BrainBERT model so that brain features can be extracted more effectively.
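The abstract describes three ingredients: sentence representations from a pretrained BERT, an autoencoder that compresses brain activity into lower-dimensional features, and a learned mapping between the two. The sketch below illustrates that general pipeline under stated assumptions; the layer sizes, the ridge-regression mapping, and the synthetic brain data are placeholders for illustration and are not the authors' BrainBERT implementation.

```python
# Hypothetical sketch of a text-to-brain mapping pipeline: BERT sentence embeddings
# are regressed onto low-dimensional brain features produced by a small autoencoder.
# All dimensions and data below are illustrative assumptions, not the paper's setup.
import torch
import torch.nn as nn
from transformers import BertTokenizer, BertModel
from sklearn.linear_model import Ridge

# --- 1. Sentence embeddings from a generic pretrained BERT -------------------
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased").eval()

sentences = ["The dog chased the ball.", "She read a long novel."]  # placeholder stimuli
with torch.no_grad():
    enc = tokenizer(sentences, return_tensors="pt", padding=True, truncation=True)
    hidden = bert(**enc).last_hidden_state              # (n_sent, n_tok, 768)
    mask = enc["attention_mask"].unsqueeze(-1)          # ignore padding tokens
    text_feat = (hidden * mask).sum(1) / mask.sum(1)    # mean-pooled sentence vectors

# --- 2. Autoencoder compressing (synthetic) brain activity vectors -----------
n_voxels, latent_dim = 1000, 64                         # assumed dimensions
brain = torch.randn(len(sentences), n_voxels)           # stand-in for recorded brain data

encoder = nn.Sequential(nn.Linear(n_voxels, 256), nn.ReLU(), nn.Linear(256, latent_dim))
decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, n_voxels))
opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(100):                                    # short reconstruction training loop
    opt.zero_grad()
    loss = loss_fn(decoder(encoder(brain)), brain)
    loss.backward()
    opt.step()

# --- 3. Linear mapping from text embeddings to brain latent features ---------
with torch.no_grad():
    brain_latent = encoder(brain).numpy()
mapper = Ridge(alpha=1.0).fit(text_feat.numpy(), brain_latent)
pred = mapper.predict(text_feat.numpy())                # predicted brain features per sentence
```

In this kind of setup, how well `pred` matches the encoded brain features on held-out sentences is one way to measure whether the language model captures the text-brain correspondence; the abstract's comparison of models and of autoencoder weights and biases fits naturally into steps 2 and 3.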
doi:10.14864/fss.37.0_376 fatcat:7iavxr6dsrg65ctpvqv5b7w67a