
Learn To Pay Attention [article]

Saumya Jetley, Nicholas A. Lord, Namhoon Lee, Philip H.S. Torr
2018 arXiv   pre-print
Our experimental observations provide clear evidence to this effect: the learned attention maps neatly highlight the regions of interest while suppressing background clutter.  ...  We propose an end-to-end-trainable attention module for convolutional neural network (CNN) architectures built for image classification.  ...  The network is thus forced to learn a pattern of attention relevant to solving the task at hand.  ... 
arXiv:1804.02391v2 fatcat:knfckby4h5b6tf5ovzini2mbki

Learning to Pay Attention

Rachel Jones
2007 PLoS Biology  
Learning to Pay Attention reflect the allocation of resources to the target.  ...  Our sensory system is constantly bombarded with inputs, but owing to the brain's finite processing power, we are forced to pay attention to only a tiny proportion of these inputs at any given time.  ... 
doi:10.1371/journal.pbio.0050166 pmid:20076676 pmcid:PMC1865567 fatcat:xbuekuaqdnbfnm7zwkuty3r5ia

Learning To Pay Attention To Mistakes [article]

Mou-Cheng Xu and Neil P. Oxtoby and Daniel C. Alexander and Joseph Jacob
2020 arXiv   pre-print
This leads to high false negative detection rates. In this paper, we propose a novel attention mechanism to directly address such high false negative rates, called Paying Attention to Mistakes.  ...  We compared our methods with state-of-the-art attention mechanisms in medical imaging, including self-attention, spatial-attention and spatial-channel mixed attention.  ...  Our solution is to "implicitly" Pay Attention to Mistakes.  ... 
arXiv:2007.15131v3 fatcat:yocfewpm5rdvfoke7guax5xb2i

Paying attention to attention: New economies for learning

Suzanne de Castell, Jennifer Jenson
2004 Educational Theory  
Children quite literally pay their attention to new multimodal tools designed for them.  ...  By paying attention to attention, we might better identify and develop forms of productive engagement in which dynamic, multimodal learning environments are animated by students' deliberate and sustained  ... 
doi:10.1111/j.0013-2004.2004.00026.x fatcat:maebkaddezhabedp6z7jmyhare

On Engagement: Learning to Pay Attention

R. Lisle Baker, Daniel P. Brown
2013 Social Science Research Network  
Far from being immutable, engaged attention can be learned.  ...  In an age of electronic and mental distraction, the ability to pay attention is a fundamental legal skill increasingly important for law students and the lawyers and judges they will become, not only for  ...  on how to learn how to pay attention more successfully.  ... 
doi:10.2139/ssrn.2269726 fatcat:3lplbppsgzhr7gzluqmawgk2o4

Saliency Learning: Teaching the Model Where to Pay Attention [article]

Reza Ghaeini, Xiaoli Z. Fern, Hamed Shahbazi, Prasad Tadepalli
2019 arXiv   pre-print
Deep learning has emerged as a compelling solution to many NLP tasks with remarkable performances. However, due to their opacity, such models are hard to interpret and trust.  ...  In this paper, we aim to teach the model to make the right prediction for the right reason by providing explanation training and ensuring the alignment of the model's explanation with the ground truth  ...  1 and pays attention to the contributory words.  ... 
arXiv:1902.08649v3 fatcat:7g46fhwi4vdj5d4glauqsu2e5a

Reproduction Report on "Learn to Pay Attention" [article]

Levan Shugliashvili, Davit Soselia, Shota Amashukeli, Irakli Koberidze
2018 arXiv   pre-print
We have successfully implemented the "Learn to Pay Attention" model of attention mechanism in convolutional neural networks, and have replicated the results of the original paper in the categories of image  ...  Table 1 describes the hyperparameters of the Learn to Pay Attention model.  ...  Introduction The model proposed in the "Learn to Pay Attention" paper introduced a novel way to generate a trainable attention module for convolutional neural networks.  ... 
arXiv:1812.04650v1 fatcat:g3rbxveirnc7bazhgwmkzeieri

Pay Attention to Evolution: Time Series Forecasting with Deep Graph-Evolution Learning [article]

Gabriel Spadon, Shenda Hong, Bruno Brandoli, Stan Matwin, Jose F. Rodrigues-Jr, Jimeng Sun
2021 arXiv   pre-print
attention to how multiple multivariate data synchronously evolve.  ...  A still open gap in that literature is that statistical and ensemble learning approaches systematically present lower predictive performance than deep learning methods.  ... 
arXiv:2008.12833v3 fatcat:dp3ytrfcfnhc3b23rkrmsyeffi

Pay Attention to Features, Transfer Learn Faster CNNs

Kafeng Wang, Xitong Gao, Yiren Zhao, Xingjian Li, Dejing Dou, Cheng-Zhong Xu
2020 International Conference on Learning Representations  
Transfer learning offers the chance for CNNs to learn with limited data samples by transferring knowledge from models pretrained on large datasets.  ...  In this paper, we propose attentive feature distillation and selection (AFDS), which not only adjusts the strength of transfer learning regularization but also dynamically determines the important features  ...  Instead of constraining the parameter search space, Li et al. (2019) showed that it is often more effective to regularize feature maps during fine-tuning, and further learns which features to pay attention  ... 
dblp:conf/iclr/WangGZ0D020 fatcat:32h3ra2e6recrlhgitm3ircjna

Paying attention to payoffs in analogy-based learning

Topi Miettinen
2010 Economic Theory  
may lead to discrimination.  ...  This paper introduces the payoff-confirming analogy-based expectation equilibrium (PCABEE) as a way to refine the set of analogy-based equilibria and the associated admissible analogy partitions.  ...  types of others, but fail to pay attention to correlations as in the cursed equilibrium.  ... 
doi:10.1007/s00199-010-0565-7 fatcat:ysjtqh2gxjhrfne63s6l6kvize

It's Time to Pay Attention to Attention in Aging

J. McGaughy
2002 Learning & memory (Cold Spring Harbor, N.Y.)  
However, in this issue of Learning & Memory, Barense et al. (2002) show that one aspect of attentional function may be highly sensitive to the effects of aging, and they suggest a neuroanatomical basis  ...  In monkeys, the lateral prefrontal cortex is required for switching attention to the alternate stimulus dimension, the EDS, but not the initial discrimination, IDS, or within-dimension reversal learning  ... 
doi:10.1101/lm.52902 pmid:12177227 fatcat:vmi5igpaqzgupoedectrpsrhne

TextAdaIN: Paying Attention to Shortcut Learning in Text Recognizers [article]

Oren Nuriel, Sharon Fogel, Ron Litman
2022 arXiv   pre-print
Recent work has termed this "shortcut learning" and addressed its presence in multiple domains.  ...  It generalizes to multiple architectures and to the domain of scene text recognition.  ...  As one can tell from the attention maps, the model fails to do so and is unable to correctly decode the image.  ... 
arXiv:2105.03906v3 fatcat:7wquhvzntbdvrcgkiqmuavra44

Conditionally Learn to Pay Attention for Sequential Visual Task [article]

Jun He, Quan-Jie Cao, Lei Zhang
2019 arXiv   pre-print
A sequential visual task usually requires paying attention to the object of current interest, conditional on previous observations.  ...  features align to the conditional global feature.  ...  In particular, for query-based tasks, [10] introduced a novel learn-to-pay-attention method which directly uses a learned global feature to query images, different from previous methods that perform the query by  ... 
arXiv:1911.04365v1 fatcat:mdgofg362ngwdp7lznisdwbvz4

Should We Pay More Attention to South-North Learning?

David Lewis
2017 Human service organizations, management, leadership & governance  
Guest editorial: Should we pay more attention to South-North learning?  ...  In particular, much attention is now given within the worlds of international development policy and practice to the importance of promoting the idea of "South-South" cooperation, learning and exchange  ... 
doi:10.1080/23303131.2017.1366222 fatcat:wvzuhrtbzfci3krak54yst7zhm

Pay Attention to Those Sets! Learning Quantification from Images [article]

Ionut Sorodoc, Sandro Pezzelle, Aurélie Herbelot, Mariella Dimiccoli, Raffaella Bernardi
2017 arXiv   pre-print
A case in point is given by their ability to learn quantifiers, i.e. expressions like 'few', 'some' and 'all'.  ...  We show that state-of-the-art attention mechanisms coupled with a traditional linguistic formalisation of quantifiers gives best performance on the task.  ...  The model is supposed to pay particular attention to the image regions that are relevant to the query via the attention layer.  ... 
arXiv:1704.02923v1 fatcat:qek6zhgl4vc6lgizxbyucxu2hy
Showing results 1 — 15 out of 1,031,431 results