Towards the automated localisation of targets in rapid image-sifting by collaborative brain-computer interfaces

Ana Matran-Fernandez, Riccardo Poli, Dewen Hu
2017-05-31 · PLoS ONE (Public Library of Science)
The N2pc is a lateralised Event-Related Potential (ERP) that signals a shift of attention towards the location of a potential object of interest. We propose a single-trial target-localisation collaborative Brain-Computer Interface (cBCI) that exploits this ERP to automatically approximate the horizontal position of targets in aerial images. Images were presented by means of the rapid serial visual presentation technique at rates of 5, 6 and 10 Hz. We created three different cBCIs and tested a participant selection method in which groups are formed according to the similarity of participants' performance.
The N2pc that is elicited in our experiments contains information about the position of the target along the horizontal axis. Moreover, combining information from multiple participants provides absolute median improvements in the area under the receiver operating characteristic curve of up to 21% (for groups of size 3) with respect to single-user BCIs. These improvements are bigger when groups are formed by participants with similar individual performance, and much of this effect can be explained using simple theoretical models. Our results suggest that BCIs for automated triaging can be improved by integrating two classification systems: one devoted to target detection and another to detect the attentional shifts associated with lateral targets.

One of the new systems for human augmentation focuses on cortically-coupled vision: the use of a BCI for triaging imagery in order to speed up the detection of images of interest amongst a series of distractors [7, 13-15]. If the ratio of targets to non-targets is sufficiently low (i.e., around 10%), a P300 Event-Related Potential (ERP) is elicited in response to targets [16], and its detection allows for the classification of images into one of these two categories. Research in this area of application has shown that the rapid presentation of images in the same spatial location (a protocol called Rapid Serial Visual Presentation, or RSVP) [17], combined with BCIs, can speed up the process of reviewing the images of interest (i.e., reduce triage time) with respect to traditional manual search, without detrimental effects on target detection accuracy [18-28]. In the future, these systems could be very useful, for instance, in areas where large amounts of time-sensitive images need to be reviewed in search of possible targets, as is the case for intelligence analysts. Accurate and rapid target detection, however, is often only a prerequisite to more sophisticated processing. For instance, techniques such as the one we present in this manuscript, which allows targets to be located automatically within the images (a task that cannot be achieved using the P300 ERP alone), could be very beneficial for triage systems.

In previous work [29, 30], we showed that the N2pc is elicited under the conditions of the RSVP paradigm with real aerial images, and that it can be used to discriminate targets depending on the side of the image on which they are located in single-user BCIs. The N2pc is a negative component that generally appears within 170-300 ms of stimulus onset. It can be detected at electrode sites located on the side opposite to the visual field where the target is found, with maximum amplitudes at electrode sites P7/8 and PO7/8 of the international 10-20 system. The maximum amplitudes of the N2pc are around 2-3 μV [31-33]. This ERP, which has been widely studied in the literature on attention, is known to be elicited when participants look for a given template in a search display that contains at least one non-target item in addition to the target [31].
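To make the lateralisation just described concrete, the sketch below (Python/NumPy, not the authors' code) shows how a contralateral-minus-ipsilateral difference feature could be extracted at PO7/PO8 in the 170-300 ms window. The variable names, array shapes and sampling rate are assumptions for illustration only.

```python
# Minimal sketch (assumed data layout, not the paper's pipeline): quantify the
# N2pc as a contralateral-minus-ipsilateral difference at PO7/PO8.
import numpy as np

def n2pc_feature(epochs, ch_names, fs, window=(0.170, 0.300)):
    """Mean PO8-minus-PO7 amplitude in the N2pc window, one value per trial.

    epochs   : array of shape (n_trials, n_channels, n_samples), time-locked
               to stimulus onset and baseline-corrected.
    ch_names : list of channel labels matching the channel axis of `epochs`.
    fs       : sampling rate in Hz.
    """
    po7 = ch_names.index("PO7")            # left posterior-occipital site
    po8 = ch_names.index("PO8")            # right posterior-occipital site
    start, stop = (int(round(t * fs)) for t in window)

    # The N2pc is a negativity contralateral to the target: for a target in the
    # LEFT hemifield, PO8 becomes more negative than PO7 (difference < 0), and
    # vice versa for a RIGHT-hemifield target, so the sign of this feature can
    # act as a simple indicator of the target's horizontal side.
    diff = epochs[:, po8, start:stop] - epochs[:, po7, start:stop]
    return diff.mean(axis=1)
```

A threshold or a linear classifier on such a feature (possibly combined with P7/P8) would then yield a left/right decision; the features and classifiers actually used in the study are described in the full text.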
The most similar work to the one we report here is that of Putze and collaborators [34], who used EEG data to detect targets and eye tracking to locate them (participants were asked to fixate their eyes on the targets). However, this was not done on an RSVP task with real-world stimuli, but rather on a series of simple stimuli (a number of circles arranged within a larger circle) that were sequentially and randomly flashed for 2 seconds each. While this technique could, in principle, be extended to the real-world stimuli used in this paper, it is unlikely that it could work when images are presented at the high speeds used in RSVP, as there are previous reports of saccades being suppressed at such rates [35, 36].

EEG signals are highly contaminated by noise and artefacts. The usual approach for increasing the signal-to-noise ratio in BCIs, and thus improving their performance, is to average the ERPs recorded over a number of repetitions of each stimulus [37-40]. For example, in their N2pc-driven BCI (which is, to the best of our knowledge, the only system that has used this ERP to control a BCI), Awni and collaborators [41] performed 3 repetitions of each stimulus and averaged across them prior to classifying each trial. One of the drawbacks of this approach is that the increase in performance is obtained by sacrificing speed, which makes this technique impractical for some applications, especially those designed for able-bodied users. Moreover, averaging across multiple trials is not always possible (e.g., exposing an observer to the same stimulus repeatedly can alter their neural response to it [42, 43]). Combining signals from a number of users via cBCIs in these situations, however, has proven to be very useful (e.g., [5, 44]).

The information from multiple participants can be fused at different levels in order to create a collaborative BCI. The simplest method consists of averaging the raw EEG recordings from single trials across users and using the averaged data to train a unique classifier.
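As a rough illustration of this raw-signal fusion (a sketch under assumed data layouts, not the implementation evaluated in the paper), the snippet below averages trial-aligned single-trial epochs across the members of a group, trains a single linear classifier on the result, and scores it with the area under the ROC curve, the figure of merit quoted in the abstract.

```python
# Minimal sketch of raw-signal fusion for a collaborative BCI (assumed data
# layout; illustrative only). Each element of `epochs_per_user` is an array of
# shape (n_trials, n_channels, n_samples) recorded while that participant
# watched the same stimulus sequence, so trials are aligned across users.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def group_auc(epochs_per_user, labels, seed=0):
    # 1. Fuse at the signal level: element-wise average of corresponding
    #    single trials across the group members.
    fused = np.mean(np.stack(epochs_per_user, axis=0), axis=0)

    # 2. Flatten each averaged trial into a feature vector (channels x samples)
    #    and train ONE classifier for the whole group.
    X = fused.reshape(fused.shape[0], -1)
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, labels, test_size=0.25, stratify=labels, random_state=seed)
    clf = LinearDiscriminantAnalysis().fit(X_tr, y_tr)

    # 3. Evaluate with the area under the ROC curve.
    return roc_auc_score(y_te, clf.decision_function(X_te))
```

Linear discriminant analysis is used here purely as a simple, common baseline for ERP classification; the classifiers and fusion strategies actually evaluated in the study may differ.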
doi:10.1371/journal.pone.0178498 · pmid:28562664 · pmcid:PMC5451058 · fatcat:676nkbtqgndmbo3ufena2ah2la
Fulltext PDF (Web Archive): https://web.archive.org/web/20170628083727/http://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0178498&type=printable