78,098 Hits in 11.5 sec

Deep Networks with Internal Selective Attention through Feedback Connections [article]

Marijn Stollenga, Jonathan Masci, Faustino Gomez, Juergen Schmidhuber
2014 arXiv   pre-print
So does our Deep Attention Selective Network (dasNet) architecture. DasNet's feedback structure can dynamically alter its convolutional filter sensitivities during classification.  ...  It harnesses the power of sequential processing to improve classification performance, by allowing the network to iteratively focus its internal attention on some of its convolutional filters.  ...  Conclusion DasNet is a deep neural network with feedback connections that are learned through reinforcement learning to direct selective internal attention to certain features extracted from images.  ...
arXiv:1407.3068v2 fatcat:ct5xhzbhubhctc3zpuhrfcuiga
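
The mechanism this snippet describes, re-weighting convolutional feature maps with a learned attention signal, can be illustrated with a minimal PyTorch-style gate. The class and variable names below are hypothetical, and the random gate merely stands in for dasNet's reinforcement-learned policy output.

    import torch
    import torch.nn as nn

    class GatedConvBlock(nn.Module):
        """Convolution whose output maps are re-scaled by a per-filter gate."""
        def __init__(self, in_ch, out_ch):
            super().__init__()
            self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)

        def forward(self, x, gate):
            # gate: (out_ch,) per-filter sensitivity from an attention policy
            h = torch.relu(self.conv(x))
            return h * gate.view(1, -1, 1, 1)  # multiplicative re-weighting

    block = GatedConvBlock(3, 16)
    x = torch.randn(2, 3, 32, 32)
    gate = torch.sigmoid(torch.randn(16))  # stand-in for an RL-learned policy output
    out = block(x, gate)                   # shape (2, 16, 32, 32)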

Object Based Attention Through Internal Gating [article]

Jordan Lei, Ari S. Benjamin, Konrad P. Kording
2021 arXiv   pre-print
For example, attention in the brain is known to depend on top-down processing, whereas self-attention in deep learning does not.  ...  Here, we propose an artificial neural network model of object-based attention that captures the way in which attention is both top-down and recurrent.  ...  Similarly, Deep Attention Selective Networks [61] take fully-trained convolutional neural networks and learn to re-weight internal feature maps, modeling gating and multiplicative scaling.  ... 
arXiv:2106.04540v1 fatcat:cdmoov222rcphj3wawxfqy5qvy

Deep-BCN: Deep Networks Meet Biased Competition to Create a Brain-Inspired Model of Attention Control

Hossein Adeli, Gregory Zelinsky
2018 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)  
Our work advances this theory by making it computationally explicit as a deep neural network (DNN) model, thereby enabling predictions of goal-directed attention control using real-world stimuli.  ...  With Deep-BCN, a DNN implementation of BCT now exists, which can be used to predict the neural and behavioral responses of an attention control mechanism as it mediates a goal-directed behavior, in our study  ...  Blue arrows indicate feedforward connections and red arrows indicate feedback connections.  ...
doi:10.1109/cvprw.2018.00259 dblp:conf/cvpr/AdeliZ18 fatcat:v7r6m6cqdjeabc6pqeue3myrle

Self-attention Negative Feedback Network for Real-time Image Super-Resolution

Xiangbin Liu, Shuqi Chen, Liping Song, Marcin Woźniak, Shuai Liu
2021 Journal of King Saud University: Computer and Information Sciences  
The network model constrains the image mapping space and selects the key information of the image through the self-attention negative feedback model, so that higher quality images can be generated to meet  ...  Therefore, this paper proposes a self-attention negative feedback network (SRAFBN) for real-time image SR.  ...  The network model selects the key information of images through the self-attention negative feedback mechanism, so that the obtained images are of higher quality.  ...
doi:10.1016/j.jksuci.2021.07.014 fatcat:v733gwxrrzdldktupldau7jpge
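
The phrase "selects the key information of the image" suggests some form of self-attention over feature maps; a generic spatial self-attention block of that kind is sketched below. This is an assumption about the mechanism, not SRAFBN's actual architecture, and the names are illustrative.

    import torch
    import torch.nn as nn

    class SpatialSelfAttention(nn.Module):
        """Generic non-local self-attention over a feature map."""
        def __init__(self, ch):
            super().__init__()
            self.q = nn.Conv2d(ch, ch // 8, 1)
            self.k = nn.Conv2d(ch, ch // 8, 1)
            self.v = nn.Conv2d(ch, ch, 1)

        def forward(self, x):
            b, c, h, w = x.shape
            q = self.q(x).flatten(2).transpose(1, 2)              # (b, hw, c/8)
            k = self.k(x).flatten(2)                               # (b, c/8, hw)
            attn = torch.softmax(q @ k / (k.shape[1] ** 0.5), dim=-1)
            v = self.v(x).flatten(2).transpose(1, 2)               # (b, hw, c)
            out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
            return x + out                                         # residual output

    y = SpatialSelfAttention(32)(torch.randn(1, 32, 24, 24))       # same shape as input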

Lightweight Feedback Convolution Neural Network for Remote Sensing Images Super-Resolution

Jin Wang, Yiming Wu, Liu Wang, Lei Wang, Osama Alfarraj, Amr Tolba
2021 IEEE Access  
To save costs, we propose the feedback ghost residual dense network (FGRDN), which uses a feedback mechanism as its framework to refine low-level features with high-level information.  ...  with the network depth.  ...  Next, the connected feature map is convolved with a 1×1 kernel. Through this convolution operation, the number of feature layers is compressed from 64 × 2 layers to 64 layers.  ...
doi:10.1109/access.2021.3052946 fatcat:zndked7xizfrrjdxwe7fihthnm
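
The last snippet gives one concrete detail: feature maps from two 64-channel sources are joined (presumably by concatenation) and compressed back to 64 channels with a 1×1 convolution. A minimal illustration of that fusion step, with made-up tensor names:

    import torch
    import torch.nn as nn

    fuse = nn.Conv2d(in_channels=128, out_channels=64, kernel_size=1)  # 64 x 2 -> 64 layers

    a = torch.randn(1, 64, 48, 48)          # e.g. fed-back high-level features
    b = torch.randn(1, 64, 48, 48)          # e.g. current low-level features
    fused = fuse(torch.cat([a, b], dim=1))  # (1, 64, 48, 48)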

Deep Predictive Coding Network with Local Recurrent Processing for Object Recognition [article]

Kuan Han, Haiguang Wen, Yizhen Zhang, Di Fu, Eugenio Culurciello, Zhongming Liu
2018 arXiv   pre-print
deep network.  ...  Unlike feedforward-only convolutional neural networks, PCN includes both feedback connections, which carry top-down predictions, and feedforward connections, which carry bottom-up errors of prediction.  ...  Deep residual networks with shared weights can be strictly reformulated as a shallow RNN [37] .  ... 
arXiv:1805.07526v2 fatcat:jgscgoiqsvb6nonuuz2ifr54xu
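
The division of labour described here, feedback connections carrying top-down predictions and feedforward connections carrying bottom-up prediction errors, can be written as a small iterative update. The sketch below is a generic predictive-coding step with illustrative shapes and learning rate, not the PCN authors' exact formulation.

    import torch
    import torch.nn as nn

    class PCLayer(nn.Module):
        def __init__(self, lower_ch, higher_ch):
            super().__init__()
            self.feedforward = nn.Conv2d(lower_ch, higher_ch, 3, padding=1)  # carries errors up
            self.feedback = nn.Conv2d(higher_ch, lower_ch, 3, padding=1)     # carries predictions down

        def step(self, lower, higher, lr=0.1):
            prediction = self.feedback(higher)               # top-down prediction of the lower layer
            error = lower - prediction                       # bottom-up prediction error
            higher = higher + lr * self.feedforward(error)   # refine the higher representation
            return higher, error

    layer = PCLayer(16, 32)
    lower = torch.randn(1, 16, 32, 32)
    higher = torch.zeros(1, 32, 32, 32)
    for _ in range(4):                                       # local recurrent processing
        higher, err = layer.step(lower, higher)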

Medical Image Segmentation Algorithm Based on Feedback Mechanism CNN

Feng-Ping An, Zhi-Wen Liu
2019 Contrast Media & Molecular Imaging  
So, a medical image segmentation algorithm based on a feedback mechanism convolutional neural network is proposed.  ...  A new feedback convolutional neural network algorithm based on neuron screening and neuron visual information recovery is constructed.  ...  through the entire network.  ... 
doi:10.1155/2019/6134942 pmid:31481851 pmcid:PMC6701432 fatcat:5npfrwcsu5dm7p3wmu4l6yu6ya

Hashtag Healthcare: From Tweets to Mental Health Journals Using Deep Transfer Learning [article]

Benjamin Shickel, Martin Heesacker, Sherry Benton, Parisa Rashidi
2017 arXiv   pre-print
However, less attention has been paid to analyzing users' internalized thoughts and emotions from a mental health perspective.  ...  We will use deep transfer learning techniques for analyzing the semantic gap between the two domains.  ...  Acknowledgements We thank TAO Connect, Inc. for access and assistance with retrieving online therapy logs.  ...
arXiv:1708.01372v1 fatcat:s5uzyvw4vrfynkmpdbwwqyk3ni

Episodic CAMN: Contextual Attention-Based Memory Networks with Iterative Feedback for Scene Labeling

Abrar H. Abdulnabi, Bing Shuai, Stefan Winkler, Gang Wang
2017 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)  
We present a unified framework that mainly consists of a Convolutional Neural Network (CNN), specifically a Fully Convolutional Network (FCN), and an attention-based memory module with feedback connections  ...  In this paper, we introduce an episodic attention-based memory network to achieve the goal.  ...  Engaging the recurrent feedback connections can be viewed as cascading multiple attention layers that can form a deep attention model.  ...
doi:10.1109/cvpr.2017.665 dblp:conf/cvpr/AbdulnabiSW017 fatcat:7zs4lkqiabccldwwrjd5eduiiq
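
The observation that unrolling the recurrent feedback is equivalent to cascading several attention layers can be shown with a few lines of generic dot-product attention over a set of memory vectors; this is a schematic reading of the idea, not the CAMN model itself.

    import torch

    def iterative_attention(query, memory, steps=3):
        # query: (d,), memory: (n, d); each pass re-reads the memory and the
        # updated query acts as the feedback signal for the next pass.
        for _ in range(steps):
            scores = torch.softmax(memory @ query, dim=0)  # (n,) attention weights
            read = scores @ memory                         # (d,) attended context
            query = query + read                           # feed the result back
        return query

    ctx = iterative_attention(torch.randn(64), torch.randn(10, 64))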

Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions

Alberto Testolin, Marco Zorzi
2016 Frontiers in Computational Neuroscience  
These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account  ...  top-down, predictive processing supported by feedback loops.  ...  Multiple Connection Pathways and Multimodal Learning Deep learning architectures can also be used to simulate selective damage to specific connection pathways.  ... 
doi:10.3389/fncom.2016.00073 pmid:27468262 pmcid:PMC4943066 fatcat:vlybqbpt4re5dmbc2fsg5c2opu

Visual Sensation and Perception Computational Models for Deep Learning: State of the art, Challenges and Prospects [article]

Bing Wei, Yudi Zhao, Kuangrong Hao, Lei Gao
2021 arXiv   pre-print
This survey will provide a comprehensive reference for research in this direction.  ...  In this paper, deep-learning-oriented computational models of visual perception are investigated systematically from the standpoint of biological visual mechanisms and computational vision theory.  ...  [40] aimed at developing a short-term memory (STM) based deep brain learning network (DBLN) with two error feedback loops for shape reconstruction.  ...
arXiv:2109.03391v1 fatcat:xtgda2x6azd2laun45tqfj77gi

Look and Think Twice: Capturing Top-Down Visual Attention with Feedback Convolutional Neural Networks

Chunshui Cao, Xianming Liu, Yi Yang, Yinan Yu, Jiang Wang, Zilei Wang, Yongzhen Huang, Liang Wang, Chang Huang, Wei Xu, Deva Ramanan, Thomas S. Huang
2015 2015 IEEE International Conference on Computer Vision (ICCV)  
The feedback networks help better visualize and understand how deep neural networks work, and capture visual attention on expected objects, even in images with cluttered background and multiple objects  ...  connections.  ...  Acknowledgement We gratefully acknowledge the support of NVIDIA Corporation with the donation of the Tesla K40 GPUs used in the prototyping stage of this research.  ... 
doi:10.1109/iccv.2015.338 dblp:conf/iccv/CaoLYYWWHWHXRH15 fatcat:qrmtsnsurfbstauhttoevpnr7a

Deep Learning Models Based on Image Classification: A Review

Kavi B. Obaid, Subhi R. M. Zeebaree, Omar M. Ahmed
2020 Zenodo  
With the development of the big data age, deep learning has come to have a more complex network structure and more powerful feature learning and feature expression abilities than traditional  ...  This paper first introduces deep learning, and then the latest models that have been used for image classification by deep learning are reviewed.  ...  Stollenga et al. (2014) proposed DasNet, which is a deep neural network with feedback connections that are learned through reinforcement learning to direct selective internal attention to certain  ...
doi:10.5281/zenodo.4108433 fatcat:boa4clckbvcepjze6et6vsfjpq

How Deactivating an Inhibitor Causes Absence Epilepsy: Validation of a Noble Lie

Martin J. Gallagher
2013 Epilepsy Currents  
These deep cortical neurons make two important excitatory connections to TC and nRT neurons: The excitation of the gamma-aminobutyric acid (GABA)-containing nRT neurons provides a critical feedforward  ...  What would happen if excitatory transmission were selectively reduced in the nRT?  ...  the nRT, VB, and the internal capsule that carries the afferent and efferent fibers connecting the thalamus with the cortex.  ...
doi:10.5698/1535-7511-13.1.38 pmid:23447739 pmcid:PMC3577086 fatcat:av6vfvpkhzautdf43cctc2oipa

Attention-Mechanism-Containing Neural Networks for High-Resolution Remote Sensing Image Classification

Rudong Xu, Yiting Tao, Zhongyuan Lu, Yanfei Zhong
2018 Remote Sensing  
Thus, a deep neural network can be equipped with an attention mechanism to perform pixel-wise classification for very high-resolution remote sensing (VHRRS) images.  ...  In this study, we propose a novel neural network that incorporates two kinds of attention mechanisms in its mask and trunk branches; i.e., control gate (soft) and feedback attention mechanisms, respectively  ...  the proposed method with networks without internal classifiers, without mask attention, and without feedback attention.  ... 
doi:10.3390/rs10101602 fatcat:vwkv2mpu5ferzoiiaroe5jbnoi
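
A common realisation of a soft "control gate" in a mask branch modulating a trunk branch is the residual-attention form sketched below; the branch contents here are placeholders, since the snippet does not spell out the paper's exact layers.

    import torch
    import torch.nn as nn

    class MaskTrunkAttention(nn.Module):
        def __init__(self, ch):
            super().__init__()
            self.trunk = nn.Sequential(nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU())
            self.mask = nn.Sequential(nn.Conv2d(ch, ch, 3, padding=1), nn.Sigmoid())

        def forward(self, x):
            t = self.trunk(x)
            m = self.mask(x)          # soft gate in (0, 1)
            return (1 + m) * t        # keep the trunk signal, emphasise attended regions

    y = MaskTrunkAttention(32)(torch.randn(1, 32, 64, 64))
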
Showing results 1 — 15 out of 78,098 results