198,852 Hits in 3.4 sec

Better Set Representations For Relational Reasoning [article]

Qian Huang, Horace He, Abhay Singh, Yan Zhang, Ser-Nam Lim, Austin Benson
2020 arXiv   pre-print
Existing end-to-end approaches typically extract entities from inputs by directly interpreting the latent feature representations as a set.  ...  One defining trait of relational reasoning is that it operates on a set of entities, as opposed to standard vector representations.  ...  The set representations should contain as much information about the original image as possible and the latent structure can be easily visualized as a set of objects.  ... 
arXiv:2003.04448v2 fatcat:mfrdthl2qjfrjow67a7fyrfdme

Probabilistic Generative Deep Learning for Molecular Design [article]

Daniel T. Chang
2019 arXiv   pre-print
We discuss the major components of probabilistic generative deep learning for molecular design, which include molecular structure, molecular representations, deep generative models, molecular latent representations  ...  and latent space, molecular structure-property and structure-activity relationships, molecular similarity and molecular design.  ...  Molecular Graph The molecular graph [12] [13] encodes molecular structure by a graph G = (V, E, µ, υ) where the set of nodes V encodes the set of atoms and the set of edges E encodes the set of bonds  ... 
arXiv:1902.05148v1 fatcat:zrugq5rcgfgs5ivqonkksq63mq
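
The snippet above defines the molecular graph G = (V, E, µ, υ) with atoms as nodes and bonds as edges. A minimal sketch of that data structure (field names and the ethanol example are illustrative, not taken from the paper):

```python
# Minimal sketch of the labelled molecular graph G = (V, E, mu, upsilon):
# V holds the atoms, E the bonds, mu labels nodes with atom types, and
# upsilon labels edges with bond orders. Names are illustrative.
from dataclasses import dataclass, field

@dataclass
class MolecularGraph:
    atoms: list                                        # V: atom symbols, indexed 0..n-1
    bonds: list                                        # E: (i, j) index pairs
    atom_labels: dict = field(default_factory=dict)    # mu: node index -> atom type
    bond_labels: dict = field(default_factory=dict)    # upsilon: edge -> bond order

# Ethanol (heavy atoms only) as a toy example: C-C-O with two single bonds.
ethanol = MolecularGraph(
    atoms=["C", "C", "O"],
    bonds=[(0, 1), (1, 2)],
    atom_labels={0: "C", 1: "C", 2: "O"},
    bond_labels={(0, 1): 1, (1, 2): 1},
)
print(ethanol.bond_labels[(0, 1)])
```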

Inducing Interpretable Representations with Variational Autoencoders [article]

N. Siddharth and Brooks Paige and Alban Desmaison and Jan-Willem Van de Meent and Frank Wood and Noah D. Goodman and Pushmeet Kohli and Philip H.S. Torr
2016 arXiv   pre-print
We develop a framework for incorporating structured graphical models in the encoders of variational autoencoders (VAEs) that allows us to induce interpretable representations through approximate variational  ...  This allows us to both perform reasoning (e.g. classification) under the structural constraints of a given graphical model, and use deep generative models to deal with messy, high-dimensional domains where  ...  For the purposes of this manuscript, we refer to latent representations that are disentangled as structured and latent representations that are entangled as unstructured.  ... 
arXiv:1611.07492v1 fatcat:vo45oyg6efebvjo6p6hnfetcji
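
The entry above contrasts structured (interpretable) and unstructured latent variables in a VAE. A minimal sketch of that split, assuming a discrete class factor plus a continuous remainder; random vectors stand in for encoder outputs and nothing here reproduces the paper's graphical-model machinery:

```python
# Hedged sketch: the latent is split into an interpretable discrete part
# (a class label drawn from the encoder's categorical head) and an
# unstructured continuous part with the usual Gaussian reparameterization.
import numpy as np

rng = np.random.default_rng(10)
logits = rng.standard_normal(10)                                  # categorical head q(y|x)
mu, log_var = rng.standard_normal(32), rng.standard_normal(32)    # Gaussian head q(z|x)

probs = np.exp(logits - logits.max()); probs /= probs.sum()
y = rng.choice(10, p=probs)                                       # structured factor
z = mu + np.exp(0.5 * log_var) * rng.standard_normal(32)          # unstructured factor
print("discrete factor:", y, "continuous latent dim:", z.shape[0])
```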

Discrete Embedding for Latent Networks

Hong Yang, Ling Chen, Minglong Lei, Lingfeng Niu, Chuan Zhou, Peng Zhang
2020 Proceedings of the Twenty-Ninth International Joint Conference on Artificial Intelligence  
To address this issue, we present an end-to-end discrete network embedding model for latent networks (DELN) that can learn binary representations from underlying information cascades.  ...  binary node representation constraint.  ...  The learning function can be formulated by factorizing the latent Weisfeiler-Lehman matrix P under the constraints that the representations B are discrete and the best latent structure W can be inferred  ... 
doi:10.24963/ijcai.2020/170 dblp:conf/ijcai/YangCLNZZ20 fatcat:44bbpl6dwjawdga5rg6btycgle
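
The snippet above describes factorizing a latent Weisfeiler-Lehman matrix P under a discrete constraint on the codes B. An illustrative baseline under those assumptions (alternating least squares with a sign projection), not DELN's actual solver:

```python
# Hedged sketch: learn binary node codes B in {-1,+1} by factorizing a given
# matrix P ~ B W, alternating a least-squares update of W with a sign
# projection of B. Sizes are assumptions; P is a random stand-in.
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 100, 64, 16               # nodes, input dim, code length (assumed)
P = rng.standard_normal((n, d))     # stands in for the latent Weisfeiler-Lehman matrix

B = np.sign(rng.standard_normal((n, k)))
for _ in range(20):
    W, *_ = np.linalg.lstsq(B, P, rcond=None)   # continuous update of W given B
    B = np.sign(P @ np.linalg.pinv(W))          # project the codes back to {-1,+1}
    B[B == 0] = 1
W, *_ = np.linalg.lstsq(B, P, rcond=None)       # refit W to the final binary codes
print("reconstruction error:", np.linalg.norm(P - B @ W))
```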

Training Heterogeneous Features in Sequence to Sequence Tasks: Latent Enhanced Multi-filter Seq2Seq Model [article]

Yunhao Yang, Zhaokun Xue
2022 arXiv   pre-print
The representations are extracted from the final hidden state of the encoder and lie in the latent space. A latent space transformation is applied to enhance the quality of the representations.  ...  Built upon the encoder-decoder architecture, we design a latent-enhanced multi-filter seq2seq model (LEMS) that analyzes the input representations by introducing a latent space transformation and clustering  ...  Some research has explored hierarchical structures on the latent space to obtain richer representations [3, 11].  ... 
arXiv:2105.08840v2 fatcat:qbxgfoiwx5bxjitaqlwv2hpahi
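
The entry above combines a latent-space transformation with clustering of the encoder's final hidden states. A minimal sketch of that flow, assuming a linear transformation and a toy k-means; the arrays are random placeholders, not trained LEMS components:

```python
# Hedged sketch: encoder final hidden states are mapped through a latent
# transformation, then clustered so each cluster could be routed to its own
# decoder ("filter"). Sizes, the tanh nonlinearity, and k are assumptions.
import numpy as np

rng = np.random.default_rng(1)
H = rng.standard_normal((32, 128))      # final encoder hidden states (batch, dim)
T = rng.standard_normal((128, 64))      # assumed linear latent-space transformation
Z = np.tanh(H @ T)                      # enhanced latent representations

k = 3                                   # number of filters/decoders (assumed)
centers = Z[rng.choice(len(Z), k, replace=False)]
for _ in range(10):
    assign = np.argmin(((Z[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.stack([Z[assign == c].mean(0) if np.any(assign == c) else centers[c]
                        for c in range(k)])
print("cluster sizes:", np.bincount(assign, minlength=k))
```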

Tiered Latent Representations and Latent Spaces for Molecular Graphs [article]

Daniel T. Chang
2019 arXiv   pre-print
We also briefly discuss the usage and exploration of tiered latent spaces. The tiered approach is applicable to other types of structured graphs similar in nature to molecular graphs.  ...  Flat latent representations (node embeddings or graph embeddings) fail to represent, and support the use of, groups.  ...  low-dimensional continuous latent representations summarize their graph position and the structure of their local graph neighborhood.  ... 
arXiv:1904.02653v1 fatcat:wr3axkv3wfbsdp2fc54cd57isi
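
The entry above argues for group-level representations between node and graph embeddings. A minimal sketch of a tiered representation under assumed groupings (mean pooling per tier; the partition and vectors are placeholders):

```python
# Hedged sketch of the tiered idea: node embeddings are pooled into group
# (e.g. functional-group) embeddings, which are pooled again into a single
# graph embedding, so each tier has its own latent representation.
import numpy as np

rng = np.random.default_rng(11)
node_emb = rng.standard_normal((9, 8))          # 9 atoms, 8-dim node tier
groups = [[0, 1, 2], [3, 4], [5, 6, 7, 8]]      # assumed functional-group partition

group_emb = np.stack([node_emb[idx].mean(axis=0) for idx in groups])   # group tier
graph_emb = group_emb.mean(axis=0)                                     # graph tier
print(group_emb.shape, graph_emb.shape)
```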

Distribution-induced Bidirectional Generative Adversarial Network for Graph Representation Learning [article]

Shuai Zheng, Zhenfeng Zhu, Xingxing Zhang, Zhizhe Liu, Jian Cheng, Yao Zhao
2020 arXiv   pre-print
Instead of the widely used normal distribution assumption, the prior distribution of latent representation in our DBGAN is estimated in a structure-aware way, which implicitly bridges the graph and feature  ...  Thus discriminative and robust representations are generated for all nodes.  ...  We follow the settings in [17] to get the reconstructed adjacency matrix Ã from the latent representation, and here Ã should be similar to the real adjacency matrix A.  ... 
arXiv:1912.01899v3 fatcat:abryk4tnbzdujhsl5iox5m2yae
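
The snippet above reconstructs an adjacency matrix Ã from the latent node representations and compares it with the real A. A sketch using an inner-product decoder, a common choice for this step; it is not necessarily the exact setting DBGAN borrows from its reference [17]:

```python
# Hedged sketch: reconstruct edge probabilities as A_tilde = sigmoid(Z Z^T)
# from latent node vectors Z, then score them against the real adjacency A
# with binary cross-entropy. Sizes and the toy graph are assumptions.
import numpy as np

rng = np.random.default_rng(2)
n, d = 6, 4
Z = rng.standard_normal((n, d))                 # latent node representations
A_tilde = 1.0 / (1.0 + np.exp(-(Z @ Z.T)))      # reconstructed edge probabilities

A = (rng.random((n, n)) < 0.3).astype(float)    # toy "real" adjacency
A = np.triu(A, 1); A = A + A.T                  # symmetric, no self-loops
bce = -(A * np.log(A_tilde + 1e-9) + (1 - A) * np.log(1 - A_tilde + 1e-9)).mean()
print("reconstruction BCE:", bce)
```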

Automatic Chemical Design Using a Data-Driven Continuous Representation of Molecules

Rafael Gómez-Bombarelli, Jennifer N. Wei, David Duvenaud, José Miguel Hernández-Lobato, Benjamín Sánchez-Lengeling, Dennis Sheberla, Jorge Aguilera-Iparraguirre, Timothy D. Hirzel, Ryan P. Adams, Alán Aspuru-Guzik
2018 ACS Central Science  
Continuous representations allow us to automatically generate novel chemical structures by performing simple operations in the latent space, such as decoding random vectors, perturbing known chemical structures  ...  The predictor estimates chemical properties from the latent continuous vector representation of the molecule.  ...  The latent space representations for the QM9 and ZINC data sets had 156 dimensions and 196 dimensions, respectively.  ... 
doi:10.1021/acscentsci.7b00572 pmid:29532027 pmcid:PMC5833007 fatcat:eun57eul2vcpjfikuaowor3wtu
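
The entry above lists the latent-space operations used for generation: decoding random vectors and perturbing the codes of known molecules. A minimal sketch of those operations; decode() is a placeholder for the trained decoder, and only the 196-dimensional ZINC latent size is taken from the snippet:

```python
# Hedged sketch of latent-space generation: sample a random latent vector and
# perturb the latent code of a known molecule, then hand both to the decoder.
# decode() is a stand-in; the noise scale 0.1 is an assumption.
import numpy as np

rng = np.random.default_rng(3)
LATENT_DIM = 196                                     # ZINC setting quoted above

def decode(z):
    """Placeholder for the trained VAE decoder (latent vector -> SMILES)."""
    return f"<decoded molecule, latent norm {np.linalg.norm(z):.2f}>"

z_random = rng.standard_normal(LATENT_DIM)           # decode a random vector
z_known = rng.standard_normal(LATENT_DIM)            # stands in for an encoded molecule
z_perturbed = z_known + 0.1 * rng.standard_normal(LATENT_DIM)   # local perturbation

print(decode(z_random))
print(decode(z_perturbed))
```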

Representation Learning Through Latent Canonicalizations [article]

Or Litany, Ari Morcos, Srinath Sridhar, Leonidas Guibas, Judy Hoffman
2020 arXiv   pre-print
In this work, we seek the generalization power of disentangled representations, but relax the requirement of explicit latent disentanglement and instead encourage linearity of individual factors of variation  ...  Many prior approaches to this problem have focused on learning "disentangled" representations so that as individual factors vary in a new domain, only a portion of the representation need be updated.  ...  In this work, we additionally constrain the structure of the learned latent space using a set of latent canonicalization losses.  ... 
arXiv:2002.11829v1 fatcat:sqvupdu2bvgl7azfjtsfw5kdxe
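
The entry above encourages linearity of individual factors of variation in the latent space. A minimal sketch of a latent canonicalizer under that reading: a linear map per factor, applied (and composed) directly on the latent code. The matrices are random stand-ins, not trained canonicalizers:

```python
# Hedged sketch: each factor of variation gets a learned *linear* map on the
# latent code that pushes that factor to a canonical value; linear maps
# compose by matrix multiplication. Dimensions and factor names are assumed.
import numpy as np

rng = np.random.default_rng(9)
d = 16
z = rng.standard_normal(d)                                 # latent code of some input
C_rot = np.eye(d) + 0.05 * rng.standard_normal((d, d))     # canonicalizer for "rotation"
C_light = np.eye(d) + 0.05 * rng.standard_normal((d, d))   # canonicalizer for "lighting"

z_canonical = C_light @ (C_rot @ z)    # apply both canonicalizations in sequence
print(z_canonical.shape)
```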

Combining Latent Space and Structured Kernels for Bayesian Optimization over Combinatorial Spaces [article]

Aryan Deshwal, Janardhan Rao Doppa
2021 arXiv   pre-print
The key idea is to define a novel structure-coupled kernel that explicitly integrates the structural information from decoded structures with the learned latent space representation for better surrogate  ...  A recent BO approach for combinatorial spaces is through a reduction to BO over continuous spaces by learning a latent representation of structures using deep generative models (DGMs).  ...  in connecting the rich structural information of each structure in the combinatorial space with its corresponding latent space representation.  ... 
arXiv:2111.01186v1 fatcat:ifs75e5vejd6jgfaw5bwb4sa64
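
The entry above couples the learned latent representation with the decoded structure inside one kernel. A sketch of that coupling under simple assumptions: an RBF kernel on latent vectors multiplied by a toy token-overlap kernel on decoded strings; the paper's actual structure-coupled kernel is more elaborate:

```python
# Hedged sketch of a structure-coupled kernel: product of an RBF kernel on
# latent vectors and a Jaccard bigram-overlap kernel on decoded structures.
# The string kernel and the product form are illustrative choices.
import numpy as np

def rbf(z1, z2, lengthscale=1.0):
    return np.exp(-np.sum((z1 - z2) ** 2) / (2 * lengthscale ** 2))

def structure_kernel(s1, s2):
    """Toy structural similarity: Jaccard overlap of character bigrams."""
    g1 = {s1[i:i + 2] for i in range(len(s1) - 1)}
    g2 = {s2[i:i + 2] for i in range(len(s2) - 1)}
    return len(g1 & g2) / max(len(g1 | g2), 1)

def coupled_kernel(z1, s1, z2, s2):
    return rbf(z1, z2) * structure_kernel(s1, s2)

z_a, z_b = np.zeros(8), 0.5 * np.ones(8)
print(coupled_kernel(z_a, "CCO", z_b, "CCN"))
```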

Diagnosis of Coronavirus Disease 2019 (COVID-19) with Structured Latent Multi-View Representation Learning

Hengyuan Kang, Liming Xia, Fuhua Yan, Zhibin Wan, Feng Shi, Huan Yuan, Huiting Jiang, Dijia Wu, He Sui, Changqing Zhang, Dinggang Shen
2020 IEEE Transactions on Medical Imaging  
To fully explore multiple features describing CT images from different views, a unified latent representation is learned which can completely encode information from different aspects of features and is  ...  endowed with promising class structure for separability.  ...  Step-2: Learning Projection From Original Features to Latent Representation. In Step-1, one low-dimensional latent representation h is obtained for each subject in the training set.  ... 
doi:10.1109/tmi.2020.2992546 pmid:32386147 fatcat:3c4cp5hnajd5tflorltvxcwj4a
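
The entry above ties each view's features to one shared latent vector h per subject through view-specific projections. A minimal sketch of that structure, assuming linear projections X_v ≈ h W_v and recovering h by stacked least squares; all matrices here are synthetic:

```python
# Hedged sketch: three views of one subject are generated from a shared latent
# h via per-view projections, and h is recovered jointly from all views.
# Dimensions, noise level, and the linear form are assumptions.
import numpy as np

rng = np.random.default_rng(4)
dims = [10, 7, 5]                   # feature dimensions of three CT-derived views
k = 4                               # latent dimension (assumed)
W = [rng.standard_normal((k, d)) for d in dims]     # per-view projections
h_true = rng.standard_normal(k)
X = [h_true @ Wv + 0.01 * rng.standard_normal(d) for Wv, d in zip(W, dims)]

W_stacked = np.hstack(W)            # (k, sum of view dims)
x_stacked = np.hstack(X)
h_hat, *_ = np.linalg.lstsq(W_stacked.T, x_stacked, rcond=None)
print("latent recovery error:", np.linalg.norm(h_hat - h_true))
```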

Latent Fingerprint Enhancement via Time Variance Gabor and Sparse Representation

Namrata D Lokaksha
2019 International Journal for Research in Applied Science and Engineering Technology  
Dictionaries are constructed with a set of Gabor elementary functions to capture the characteristics, features and properties of fingerprint ridge structure, and multiscale patch-based sparse representation  ...  Latent fingerprint images are usually of poor quality with unclear ridge structure.  ...  Latent fingerprints consist of fingerprint images with structured noise.  ... 
doi:10.22214/ijraset.2019.1121 fatcat:65ek3gub2jho7kr7h7zutlem5u
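
The entry above builds dictionaries from Gabor elementary functions to capture ridge structure. A small sketch of such a dictionary over a few orientations and frequencies; the sparse-coding step is only hinted at by a single greedy atom pick, and all parameter values are assumptions:

```python
# Hedged sketch: a dictionary of normalized Gabor atoms (8 orientations x 3
# frequencies) over 16x16 patches, plus a one-step greedy match of a patch
# against the dictionary. Patch data is random; sigma and freqs are assumed.
import numpy as np

def gabor_atom(size, theta, freq, sigma=3.0):
    ax = np.arange(size) - size // 2
    x, y = np.meshgrid(ax, ax)
    xr = x * np.cos(theta) + y * np.sin(theta)
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)
    return (g / np.linalg.norm(g)).ravel()

size = 16
atoms = [gabor_atom(size, t, f)
         for t in np.linspace(0, np.pi, 8, endpoint=False)
         for f in (0.1, 0.2, 0.3)]
D = np.stack(atoms, axis=1)                        # dictionary: (256 pixels, 24 atoms)

patch = np.random.default_rng(5).standard_normal(size * size)
best_atom = int(np.argmax(np.abs(D.T @ patch)))    # greedy pick of the best-matching atom
print("dictionary shape:", D.shape, "best atom:", best_atom)
```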

Operator Autoencoders: Learning Physical Operations on Encoded Molecular Graphs [article]

Willis Hoke, Daniel Shea, Stephen Casey
2021 arXiv   pre-print
In this work, we develop a pipeline for establishing graph-structured representations of time-series volumetric data from molecular dynamics simulations.  ...  We then train an autoencoder to find nonlinear mappings to a latent space where future timesteps can be predicted through application of a linear operator trained in tandem with the autoencoder.  ...  Graph autoencoders are neural networks trained to represent the structural features of a set of graph adjacency matrices as a vector of latent space variables [21]-[38].  ... 
arXiv:2105.12295v1 fatcat:bzm56fwinjckhayoiplo2emxzi
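
The entry above predicts future timesteps by applying a linear operator in the latent space. A minimal sketch of that operator step, assuming the autoencoder has already produced per-timestep latents (synthetic here) and fitting the operator by least squares:

```python
# Hedged sketch: given latent states z_t, fit a linear operator A so that
# z_{t+1} ~ A z_t and predict the next step by applying it. The latent
# trajectory is synthetic; the encoder/decoder are not modelled.
import numpy as np

rng = np.random.default_rng(6)
d, T = 8, 50
A_true = np.linalg.qr(rng.standard_normal((d, d)))[0] * 0.95   # stable toy dynamics
Z = np.zeros((T, d)); Z[0] = rng.standard_normal(d)
for t in range(T - 1):
    Z[t + 1] = A_true @ Z[t] + 0.01 * rng.standard_normal(d)

# Least-squares fit of the latent-space operator from consecutive pairs:
# Z[1:] ~ Z[:-1] @ A_hat, i.e. A_hat approximates A_true transposed.
A_hat, *_ = np.linalg.lstsq(Z[:-1], Z[1:], rcond=None)
z_next_pred = Z[-1] @ A_hat
print("one-step prediction error:", np.linalg.norm(z_next_pred - A_true @ Z[-1]))
```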

Locally Embedding Autoencoders: A Semi-Supervised Manifold Learning Approach of Document Representation

Chao Wei, Senlin Luo, Xincheng Ma, Hao Ren, Ji Zhang, Limin Pan, Zhaohong Deng
2016 PLoS ONE  
To address this problem, we propose a semi-supervised manifold-inspired autoencoder to extract meaningful latent representations of documents, taking the local perspective that the latent representation  ...  We first determine the discriminative neighbors set with Euclidean distance in observation spaces.  ...  We obtained the latent representation of the 5 shuffled test sets and randomly divided it into 2 equal parts.  ... 
doi:10.1371/journal.pone.0146672 pmid:26784692 pmcid:PMC4718658 fatcat:3ihhkndpw5hitapelxkkwglfx4
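
The entry above selects a "discriminative neighbors set" with Euclidean distance in the observation space. A sketch of one plausible reading of that step: take nearest neighbours by Euclidean distance and keep those sharing the query's label; data, labels, and k are placeholders:

```python
# Hedged sketch: k nearest neighbours per document by Euclidean distance,
# filtered to same-label neighbours. The same-label filter is an assumption
# about what makes the set "discriminative", not a claim about the paper.
import numpy as np

rng = np.random.default_rng(7)
X = rng.standard_normal((20, 50))        # document features in observation space
y = rng.integers(0, 2, size=20)          # labels for the semi-supervised subset
k = 5

dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
np.fill_diagonal(dists, np.inf)          # exclude each document from its own neighbours
neighbors = np.argsort(dists, axis=1)[:, :k]
discriminative = [[j for j in neighbors[i] if y[j] == y[i]] for i in range(len(X))]
print("neighbours of doc 0:", discriminative[0])
```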

Latent semantic learning with structured sparse representation for human action recognition

Zhiwu Lu, Yuxin Peng
2013 Pattern Recognition  
More importantly, we construct the L1-graph with structured sparse representation, which can be obtained by structured sparse coding with its structured sparsity ensured by novel L1-norm hypergraph regularization  ...  structured sparse representation, which can help to bridge the semantic gap in the challenging task of human action recognition.  ...  Section III proposes a latent semantic learning method based on structured sparse representation.  ... 
doi:10.1016/j.patcog.2012.09.027 fatcat:tkjtzttzc5gfjhxvuirai4oz5m
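
The entry above builds an L1-graph by structured sparse coding. A sketch of the base L1 step only (each sample coded as a sparse combination of the others via plain ISTA); the paper's hypergraph-regularized structured sparsity is not reproduced here:

```python
# Hedged sketch of L1-graph construction: code sample i over the remaining
# samples with a Lasso solved by iterative soft-thresholding (ISTA).
# Data, lambda, and the step count are placeholders.
import numpy as np

def ista_lasso(D, x, lam=0.1, steps=200):
    """min_c 0.5*||x - D c||^2 + lam*||c||_1 via iterative soft-thresholding."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    c = np.zeros(D.shape[1])
    for _ in range(steps):
        g = D.T @ (D @ c - x)
        c = c - g / L
        c = np.sign(c) * np.maximum(np.abs(c) - lam / L, 0.0)
    return c

rng = np.random.default_rng(8)
X = rng.standard_normal((30, 60))            # 30 action feature vectors (rows)
i = 0
D = np.delete(X, i, axis=0).T                # dictionary: all other samples as columns
code = ista_lasso(D, X[i])
print("nonzero weights in the L1-graph row:", np.count_nonzero(np.abs(code) > 1e-3))
```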
Showing results 1-15 out of 198,852 results