966 Hits in 2.4 sec

Capsules with Inverted Dot-Product Attention Routing [article]

Yao-Hung Hubert Tsai, Nitish Srivastava, Hanlin Goh, Ruslan Salakhutdinov
2020 arXiv   pre-print
The new mechanism 1) performs routing via inverted dot-product attention; 2) uses Layer Normalization as its normalization; and 3) replaces sequential iterative routing with concurrent iterative routing.  ...  Our code is publicly available at: https://github.com/apple/ml-capsules-inverted-attention-routing; an alternative implementation is available at: https://github.com/yaohungt/Capsules-Inverted-Attention-Routing  ...  Inverted Dot-Product Attention-B denotes our routing approach with sequential routing.  ...
arXiv:2002.04764v2 fatcat:r5muuzptevfpdftp7absihoxvi
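The routing recipe summarized in this snippet (dot-product agreement, Layer Normalization, concurrent updates) can be sketched in a few lines of NumPy. The sketch below is an illustration only, with assumed shapes, initialization, and iteration count; the authors' actual implementation lives in the repositories linked above.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize each capsule vector to zero mean and unit variance.
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def inverted_dot_product_routing(votes, n_iters=3):
    # votes: (n_in, n_out, d) -- predictions cast by lower-level capsules.
    upper = layer_norm(votes.mean(axis=0))              # (n_out, d) initial upper capsules
    for _ in range(n_iters):
        # Agreement: dot product between every vote and every upper capsule.
        logits = np.einsum('iod,od->io', votes, upper)  # (n_in, n_out)
        # "Inverted" attention: each lower capsule distributes its vote over
        # the upper capsules (softmax taken over the output axis).
        r = softmax(logits, axis=1)
        # All upper capsules are updated concurrently, then layer-normalized.
        upper = layer_norm(np.einsum('io,iod->od', r, votes))
    return upper

# Example: 32 lower capsules voting for 10 upper capsules of dimension 16.
out = inverted_dot_product_routing(np.random.randn(32, 10, 16))  # -> shape (10, 16)
```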

WideCaps: A Wide Attention based Capsule Network for Image Classification [article]

S J Pawan, Rishi Sharma, Hemanth Sai Ram Reddy, M Vani, Jeny Rajan
2021 arXiv   pre-print
This paper proposes a new design strategy for capsule network architectures to deal efficiently with complex images.  ...  The capsule network has attained unprecedented success on image classification tasks with datasets such as MNIST and affNIST by encoding the characteristic features into the capsules and building the  ...  Capsules with inverted dot-product attention routing. arXiv preprint arXiv:2002.04764, 2020. [36] Yann LeCun, Fu Jie Huang, and Leon Bottou.  ...
arXiv:2108.03627v2 fatcat:mkmnpws4tncb7dhxywx2s3qylu

Information Aggregation for Multi-Head Attention with Routing-by-Agreement

Jian Li, Baosong Yang, Zi-Yi Dou, Xing Wang, Michael R. Lyu, Zhaopeng Tu
2019 Proceedings of the 2019 Conference of the North  
In this work, we propose to improve the information aggregation for multi-head attention with a more powerful routing-by-agreement algorithm.  ...  Multi-head attention is appealing for its ability to jointly extract different types of information from multiple representation subspaces.  ...  In this work, we use scaled dot-product attention (Luong et al., 2015), which achieves performance similar to its additive counterpart while being much faster and more space-efficient in practice (Vaswani  ...
doi:10.18653/v1/n19-1359 dblp:conf/naacl/LiYDWLT19 fatcat:qfcehfxhkzaojjwkop3vofczcu
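The attention variant named in this snippet is the standard scaled dot-product form; a generic NumPy sketch is shown below for reference. Names and shapes are illustrative and are not taken from the paper's code.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q: (n_q, d_k) queries, K: (n_k, d_k) keys, V: (n_k, d_v) values.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                              # weighted sum of values
```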

Training Deep Capsule Networks with Residual Connections [article]

Josef Gugglberger, David Peer, Antonio Rodriguez-Sanchez
2021 arXiv   pre-print
The connections between capsules encode part-whole relationships between objects through routing algorithms, which route the output of capsules from lower-level layers to upper-level layers.  ...  However, most capsule network implementations use two to three capsule layers, which limits their applicability, since expressivity grows exponentially with depth.  ...  Other routing algorithms worth mentioning are scaled distance agreement routing [15], inverted dot-product attention routing [24], and routing via variational Bayes [21], to name a few.  ...
arXiv:2104.07393v1 fatcat:3qutr2c5incrrgglyxyevgywoy
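As a rough illustration of wrapping capsule layers in residual (skip) connections, the sketch below assumes a routing layer that preserves both the number and the dimensionality of capsules; it is a simplification for intuition, not the architecture proposed in the paper.

```python
import numpy as np

def residual_capsule_layer(caps_in, capsule_layer):
    # caps_in: (n_caps, d); capsule_layer: any map (n_caps, d) -> (n_caps, d),
    # e.g. one routing step. The identity shortcut eases the optimization of
    # deeper stacks of capsule layers.
    return capsule_layer(caps_in) + caps_in

def deep_capsule_stack(caps, layers):
    # Chain several capsule layers, each wrapped in a residual connection.
    for layer in layers:
        caps = residual_capsule_layer(caps, layer)
    return caps
```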

Information Aggregation for Multi-Head Attention with Routing-by-Agreement [article]

Jian Li and Baosong Yang and Zi-Yi Dou and Xing Wang and Michael R. Lyu and Zhaopeng Tu
2019 arXiv   pre-print
In this work, we propose to improve the information aggregation for multi-head attention with a more powerful routing-by-agreement algorithm.  ...  Multi-head attention is appealing for its ability to jointly extract different types of information from multiple representation subspaces.  ...  In this work, we use scaled dot-product attention (Luong et al., 2015), which achieves performance similar to its additive counterpart while being much faster and more space-efficient in practice (Vaswani  ...
arXiv:1904.03100v1 fatcat:kp2rspir6vcstirh7frs4v4q4i

Efficient-CapsNet: Capsule Network with Self-Attention Routing [article]

Vittorio Mazzia, Francesco Salvetti, Marcello Chiaberge
2021 arXiv   pre-print
Moreover, we replace dynamic routing with a novel non-iterative, highly parallelizable routing algorithm that can easily cope with a reduced number of capsules.  ...  Nevertheless, little attention has been given to this relevant aspect.  ...  Acknowledgements This work has been developed with the contribution of the Politecnico di Torino Interdepartmental Centre for Service Robotics PIC4SeR 1 and SmartData@Polito 2 .  ... 
arXiv:2101.12491v2 fatcat:oti5yoq7czb6hm32ecznwbp5li
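The snippet mentions a non-iterative, highly parallelizable routing algorithm. A hedged sketch of what a single-pass, attention-style routing step could look like is given below; the vote tensor, the pairwise-agreement scoring, and the softmax over input capsules are assumptions for illustration and do not reproduce the Efficient-CapsNet algorithm.

```python
import numpy as np

def single_pass_attention_routing(votes):
    # votes: (n_out, n_in, d) -- one vote per (output capsule, input capsule) pair.
    n_out, n_in, d = votes.shape
    # Pairwise agreement among the votes targeting the same output capsule.
    attn = np.einsum('oid,ojd->oij', votes, votes) / np.sqrt(d)  # (n_out, n_in, n_in)
    # A vote that agrees with many other votes receives a high total score.
    scores = attn.sum(axis=-1)                                   # (n_out, n_in)
    c = np.exp(scores - scores.max(axis=-1, keepdims=True))
    c /= c.sum(axis=-1, keepdims=True)                           # softmax over inputs
    # One pass, no iterative refinement: weighted sum of the votes.
    return np.einsum('oi,oid->od', c, votes)                     # (n_out, d)
```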

Generative Adversarial Network Architectures For Image Synthesis Using Capsule Networks [article]

Yash Upadhyay, Paul Schrater
2018 arXiv   pre-print
In this paper, we propose Generative Adversarial Network (GAN) architectures that use Capsule Networks for image synthesis.  ...  Based on the principle of positional equivariance of features, the Capsule Network's ability to encode spatial relationships between the features of the image helps it become a more powerful critic in comparison  ...  The agreement, a_ij, is calculated as the dot product of the output v_j and the incoming vector u_ji.  ...
arXiv:1806.03796v4 fatcat:qpm4uccua5dmxo3vfrxtj3em64
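The agreement a_ij described here is the dot-product term of classic routing-by-agreement (Sabour et al., 2017). A compact NumPy sketch of that update, with illustrative shapes and function names, is:

```python
import numpy as np

def squash(s, eps=1e-8):
    # Maps vector length into (0, 1) while preserving orientation.
    norm2 = (s * s).sum(axis=-1, keepdims=True)
    return (norm2 / (1.0 + norm2)) * s / np.sqrt(norm2 + eps)

def routing_by_agreement(u_hat, n_iters=3):
    # u_hat: (n_in, n_out, d) -- prediction vectors u_{j|i} from lower capsules.
    n_in, n_out, _ = u_hat.shape
    b = np.zeros((n_in, n_out))                       # routing logits
    for _ in range(n_iters):
        c = np.exp(b - b.max(axis=1, keepdims=True))
        c /= c.sum(axis=1, keepdims=True)             # coupling coefficients
        s = np.einsum('ij,ijd->jd', c, u_hat)         # weighted sum per output capsule
        v = squash(s)                                 # output capsules v_j
        b += np.einsum('jd,ijd->ij', v, u_hat)        # agreement a_ij = <v_j, u_{j|i}>
    return v
```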

An Algorithm for Routing Capsules in All Domains [article]

Franz A. Heinsen
2020 arXiv   pre-print
Building on recent work on capsule networks, we propose a new, general-purpose form of "routing by agreement" that activates output capsules in a layer as a function of their net benefit to use and net  ...  To illustrate the usefulness of our routing algorithm, we present two capsule networks that apply it in different domains: vision and language.  ...  Vaswani et al. (2017) proposed transformer models using query-key-value dot-product attention, and showed that such models can be more effective than prior methods for modeling sequences.  ... 
arXiv:1911.00792v6 fatcat:i5hw6vkfdvc57cegf6de7kat6y

Evaluating Generalization Ability of Convolutional Neural Networks and Capsule Networks for Image Classification via Top-2 Classification [article]

Hao Ren, Jianlin Su, Hong Lu
2022 arXiv   pre-print
Recently, Capsule Networks (CapsNets) have been proposed, which can help eliminate this limitation.  ...  To reduce the number of parameters, we introduce a Parameter-Sharing (PS) mechanism between capsules.  ...  We use the dot product value to measure the similarity between each low-level capsule and high-level capsule, because the dot product has lower computational complexity than cosine similarity.  ...
arXiv:1901.10112v3 fatcat:uno26w3r4rgqnbj3yhnf7hi7v4
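The cost argument in this snippet can be seen directly: a dot product is a single pass of multiply-adds, while cosine similarity additionally needs the two vector norms and a division. A tiny illustration (the vector length is chosen arbitrarily):

```python
import numpy as np

u = np.random.randn(16)   # low-level capsule vector
v = np.random.randn(16)   # high-level capsule vector

dot = u @ v                                           # one pass of multiply-adds
cos = dot / (np.linalg.norm(u) * np.linalg.norm(v))   # needs two extra norms and a division
```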

A Quaternion-embedded Capsule Network Model for Knowledge Graph Completion

Heng Chen, Weimei Wang, Guanyu Li, Yimin Shi
2020 IEEE Access  
In this paper, we present a novel capsule network method for link prediction that takes advantage of quaternions.  ...  More specifically, we explore two methods, a relational rotation model called QuaR and a deep capsule neural model called CapS-QuaR, to encode the semantics of factual triples.  ...  These capsules are eventually generalized into one capsule in the second layer, which produces a vector for a dot product operation with the weight vector W ∈ H^{d×1}, and the value of the dot product is used  ...
doi:10.1109/access.2020.2997177 fatcat:tmmbgi7cmvfk7pdguklrqmcwoq

iCaps-Dfake: An Integrated Capsule-Based Model for Deepfake Image and Video Detection

Samar Samir Khalil, Sherin M. Youssef, Sherine Nagy Saleh
2021 Future Internet  
... of capsule neural networks (CapsNets) implementing a concurrent routing technique.  ...  Now anyone can create manipulated, unethical media, defame or humiliate others, or even scam them out of their money with the click of a button.  ...  The proposed model used the inverted dot-product attention routing algorithm [69], which applies a matrix-structured pose in each capsule.  ...
doi:10.3390/fi13040093 fatcat:5bqlhhgztzcn3dyxol34s6vt3e

MobileCaps: A Lightweight Model for Screening and Severity Analysis of COVID-19 Chest X-Ray Images [article]

S J Pawan, Rahul Sankar, Amithash M Prabhudev, P A Mahesh, K Prakashini, Sudha Kiran Das, Jeny Rajan
2021 arXiv   pre-print
MobileCaps is trained and evaluated on the publicly available dataset with model ensembling and Bayesian optimization strategies to efficiently classify CXR images of patients with COVID-19 from non-COVID  ...  We utilize the MobileNetV2 architecture as the feature extractor and integrate it into Capsule Networks to construct a fully automated and lightweight model termed MobileCaps.  ...  The dot product, i.e. a measure of how much the predicted output agrees with the actual output, is computed, and the routing coefficient for each capsule u_i is updated as per Eq. 4 and Eq. 5 [43].  ...
arXiv:2108.08775v1 fatcat:jjftpembencbnjd66yphtpdlzi
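The update referenced here (Eq. 4 and Eq. 5 follow the paper's own numbering) reads like the standard dynamic-routing coefficient update; under that assumption, in the usual notation it is:

```latex
c_{ij} = \frac{\exp(b_{ij})}{\sum_{k}\exp(b_{ik})},
\qquad
b_{ij} \leftarrow b_{ij} + \hat{\mathbf{u}}_{j|i} \cdot \mathbf{v}_{j}
```

Here c_ij are the routing coefficients, b_ij the routing logits, û_{j|i} the prediction of lower capsule u_i for output capsule j, and v_j the output capsule; whether these correspond exactly to the paper's Eq. 4 and Eq. 5 is an assumption.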

A COMPLETE AND UPDATED REVIEW ON VARIOUS TYPES OF DRUG DELIVERY SYSTEMS

RAJEEV GARG, SHARANPREET KAUR, RITIKA, SHEHNAZ KHATOON, NAINA, HITESH VERMA
2020 International Journal of Applied Pharmaceutics  
When UV light illuminates the quantum dots, a quantum dot electron can be excited to a higher energy state.  ...  Parenteral administration routes usually have a faster onset of action than other administration routes [149, 150].  ...
doi:10.22159/ijap.2020v12i4.37508 fatcat:tkj2jkg7gzds7gwy26uzioa5mu

In Vitro and In Vivo Evaluation of a Water-in-Oil Microemulsion System for Enhanced Peptide Intestinal Delivery

Dongyun Liu, Taku Kobayashi, Steven Russo, Fengling Li, Scott E. Plevy, Todd M. Gambling, Johnny L. Carson, Russell J. Mumper
2012 AAPS Journal  
The treatment with TAMRA-TAT microemulsion after oral administration resulted in greater fluorescence intensity in all intestine sections (duodenum, jejunum, ileum, and colon) compared to TAMRA-TAT solution  ...  Zhen Hu of the Division of Pharmacotherapy and Experimental Therapeutics in the UNC Eshelman School of Pharmacy at the University of North Carolina at Chapel Hill for his assistance with the statistical  ...  The green dots represent clear samples, the red dots represent turbid samples, and the shaded area is the microemulsion window. Fig. 2: The mapping (a), rheological property (b), and viscosity (c) of 11  ...
doi:10.1208/s12248-012-9441-7 pmid:23196806 pmcid:PMC3535102 fatcat:3ppvhgxdxvhbpiibefygrkorui

Object-Centric Learning with Slot Attention [article]

Francesco Locatello, Dirk Weissenborn, Thomas Unterthiner, Aravindh Mahendran, Georg Heigold, Jakob Uszkoreit, Alexey Dosovitskiy, Thomas Kipf
2020 arXiv   pre-print
In this paper, we present the Slot Attention module, an architectural component that interfaces with perceptual representations such as the output of a convolutional neural network and produces a set of  ...  These slots are exchangeable and can bind to any object in the input by specializing through a competitive procedure over multiple rounds of attention.  ...  The closest such variant is inverted dot-product attention routing [62] which similarly uses a dot product attention mechanism to obtain assignment coefficients between representations.  ... 
arXiv:2006.15055v2 fatcat:2uvpbz754nftbnizjin6zeiyai
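For context on the mechanism this entry describes, below is a stripped-down NumPy sketch of the competitive, dot-product attention step in which slots compete for inputs (softmax over slots rather than over inputs). It omits the learned projections, GRU update, and MLP of the published Slot Attention module, so it is an approximation rather than the authors' method.

```python
import numpy as np

def softmax(x, axis):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def slot_attention_sketch(inputs, n_slots=4, n_iters=3, seed=0):
    # inputs: (n_inputs, d) perceptual features (e.g. flattened CNN output).
    rng = np.random.default_rng(seed)
    n, d = inputs.shape
    slots = rng.normal(size=(n_slots, d))             # randomly initialized slots
    for _ in range(n_iters):
        # Dot-product attention with the softmax taken over *slots*, so slots
        # compete for each input (the normalization the snippet notes is
        # similar to inverted dot-product attention routing).
        logits = inputs @ slots.T / np.sqrt(d)        # (n, n_slots)
        attn = softmax(logits, axis=1)
        # Weighted mean of the inputs assigned to each slot.
        w = attn / (attn.sum(axis=0, keepdims=True) + 1e-8)
        slots = w.T @ inputs                          # (n_slots, d)
    return slots
```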
Showing results 1 — 15 out of 966