153,657 Hits in 13.3 sec

Preserved Structure Across Vector Space Representations [article]

Andrei Amatuni, Estelle He, Elika Bergelson
2018 arXiv   pre-print
First, the pairwise item similarities derived within image-space and word-space are correlated, suggesting preserved structure among these extremely different representational formats.  ...  We conclude that this approach, which does not rely on human ratings of similarity, may nevertheless reflect stable within-class structure across these two spaces.  ...  vector space means that the global inter-object structure is preserved across this mapping.  ... 
arXiv:1802.00840v2 fatcat:eeiaju4r6zfgnpr5z7rpib3by4
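
The comparison this abstract describes, correlating pairwise item similarities computed separately in image-space and word-space, is essentially representational similarity analysis. A minimal NumPy sketch, using synthetic stand-ins for the two embedding spaces (the data, dimensions, and shared-signal construction are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins for item embeddings in two representational
# formats (e.g. image-space and word-space); 20 items, different dims,
# with a shared underlying signal so some structure is preserved.
image_vecs = rng.normal(size=(20, 64))
word_vecs = image_vecs[:, :32] + 0.1 * rng.normal(size=(20, 32))

def pairwise_cosine(X):
    """Cosine similarity between every pair of rows of X."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

def similarity_correlation(A, B):
    """Pearson correlation of the upper triangles of two similarity matrices."""
    iu = np.triu_indices(A.shape[0], k=1)
    return np.corrcoef(A[iu], B[iu])[0, 1]

r = similarity_correlation(pairwise_cosine(image_vecs),
                           pairwise_cosine(word_vecs))
print(f"structure correlation: {r:.2f}")
```

A high correlation indicates that inter-item structure is preserved across the two spaces, with no human similarity ratings involved.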

Hyperalignment: Modeling shared information encoded in idiosyncratic cortical topographies

James V Haxby, J Swaroop Guntupalli, Samuel A Nastase, Ma Feilong
2020 eLife  
Individual transformation matrices project information from individual anatomical spaces into the common model information space, preserving the geometry of pairwise dissimilarities between pattern vectors  ...  The fundamental property of brain function that is preserved across brains is information content, rather than the functional properties of local features that support that content.  ...  Thus, while response hyperalignment aligns population responses across brains, preserving vector geometry of representation, connectivity hyperalignment aligns population connectivity vectors across brains  ... 
doi:10.7554/elife.56601 pmid:32484439 fatcat:dbikkgynz5fx3hpxyjdmal5bn4
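
The individual transformation matrices described here can be illustrated with an orthogonal Procrustes fit, which by construction preserves the pairwise distance geometry of the pattern vectors. A NumPy sketch on synthetic data (the rotated-patterns-plus-noise setup is an illustrative assumption, not the paper's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic response patterns: 50 stimuli x 30 features in a reference
# space, and a second subject whose patterns are an unknown rotation of
# the reference plus a little noise (illustrative assumption).
ref = rng.normal(size=(50, 30))
Q, _ = np.linalg.qr(rng.normal(size=(30, 30)))
subj = ref @ Q + 0.01 * rng.normal(size=(50, 30))

def procrustes_rotation(source, target):
    """Orthogonal R minimizing ||source @ R - target||_F."""
    U, _, Vt = np.linalg.svd(source.T @ target)
    return U @ Vt

def pairwise_dist(X):
    """Euclidean distance between every pair of rows of X."""
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

R = procrustes_rotation(subj, ref)
aligned = subj @ R

# The orthogonal map preserves pairwise dissimilarities between
# pattern vectors exactly (up to floating-point error).
assert np.allclose(pairwise_dist(aligned), pairwise_dist(subj))
print(f"alignment error: {np.linalg.norm(aligned - ref):.3f}")
```

The key design point is the orthogonality constraint on `R`: an unconstrained least-squares map could fit the target better but would distort the within-subject geometry that hyperalignment is meant to preserve.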

On Representation Knowledge Distillation for Graph Neural Networks [article]

Chaitanya K. Joshi, Fayao Liu, Xu Xun, Jie Lin, Chuan-Sheng Foo
2022 arXiv   pre-print
in a shared representation space.  ...  An analysis of the representational similarity among teacher and student embedding spaces reveals that G-CRD balances preserving local and global relationships, while structure preserving approaches are  ...  However, independent MLPs on node feature vectors are 'structure-agnostic' [22] -they cannot adapt the shared representation space to the graph structure of the underlying data.  ... 
arXiv:2111.04964v2 fatcat:bbqsfbu2mfa75hw5kk2zpj3auq

Interpretable Privacy Preservation of Text Representations Using Vector Steganography [article]

Geetanjali Bihani
2021 arXiv   pre-print
To this end, I aim to study and develop methods to incorporate steganographic modifications within the vector geometry to obfuscate underlying spurious associations and preserve the distributional semantic  ...  Thus, the goal of my doctoral research is to develop interpretable approaches towards privacy preservation of text representations that retain data utility while guaranteeing privacy.  ...  Vector space steganography (VSS).  ... 
arXiv:2112.02557v2 fatcat:te6uijjvczcctklcrf6mrqufay

Unsupervised Object Matching for Relational Data [article]

Tomoharu Iwata, Naonori Ueda
2018 arXiv   pre-print
Then, the proposed method linearly projects the latent vectors for all the datasets onto a common latent space shared across all datasets by matching the distributions while preserving the structural information  ...  The structural information encoded in the latent vectors is preserved by using the orthogonality regularizer.  ...  Then, we project the representations onto a common latent space shared across all datasets by matching the distributions while preserving the encoded structural information.  ... 
arXiv:1810.03770v3 fatcat:svdjjzcdxbfdtclbca266srknu

Non-translational Alignment for Multi-relational Networks

Shengnan Li, Xin Li, Rui Ye, Mingzhong Wang, Haiping Su, Yingzi Ou
2018 Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence  
on anchors to project each network onto the same vector space during the process of learning the representation of individual networks.  ...  However, they cannot address triangular or other structural properties effectively.  ...  Specifically, the objective function is optimized to preserve network structures as well as to locate the correspondence across networks.  ... 
doi:10.24963/ijcai.2018/581 dblp:conf/ijcai/LiLYWSO18 fatcat:w3youzngtfdmllinpsdsylvcpe

EmbeddingVis: A Visual Analytics Approach to Comparative Network Embedding Inspection [article]

Quan Li, Kristanto Sean Njotoprawiro, Hammad Haleem, Qiaoan Chen, Chris Yi, Xiaojuan Ma
2018 arXiv   pre-print
Although the existing visualization methods allow simple examination of the structure of embedding space, they cannot support in-depth exploration of the embedding vectors.  ...  To be more specific, it facilitates comparison of what and how node metrics are preserved across different embedding models and investigation of relationships between node metrics and selected embedding  ...  E.4 stated that "it is practical to use embedding vectors to describe structural information since the trend of a cluster's 'average distance vector' curve is similar across models."  ... 
arXiv:1808.09074v1 fatcat:4gnar2eovnb2xj6lleuxktrkye

CoPHE: A Count-Preserving Hierarchical Evaluation Metric in Large-Scale Multi-Label Text Classification [article]

Matúš Falis, Hang Dong, Alexandra Birch, Beatrice Alex
2021 arXiv   pre-print
With the example of the ICD-9 ontology we describe a structural issue in the representation of the structured label space in prior art, and propose an alternative representation based on the depth of the  ...  Large-Scale Multi-Label Text Classification (LMTC) includes tasks with hierarchical label spaces, such as automatic assignment of ICD-9 codes to discharge summaries.  ...  This is likely due to a lack of representation of the label space structure within the explored models.  ... 
arXiv:2109.04853v1 fatcat:yhpcmmw5cjb7tebknmpbxxkuu4

Clustering with UMAP: Why and How Connectivity Matters [article]

Ayush Dalmia, Suzanna Sia
2021 arXiv   pre-print
Given that the initial topological structure is a precursor to the success of the algorithm, this naturally raises the question: What makes a "good" topological structure for dimensionality reduction?  ...  refined notion of connectivity (mutual k-Nearest Neighbors with minimum spanning tree) together with a flexible method of constructing the local neighborhood (Path Neighbors), can achieve a much better representation  ...  preserves the local structure by reducing the number of points randomly scattered in the vector space.  ... 
arXiv:2108.05525v2 fatcat:iop7cigrqrc3xgzxbmcieguqxu
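
The "mutual k-Nearest Neighbors" notion of connectivity in this abstract is easy to state concretely: an edge (i, j) survives only if each point is among the other's k nearest neighbours. A self-contained sketch (the synthetic clusters and the choice of k are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two well-separated Gaussian clusters in 2-D (illustrative data).
pts = np.vstack([rng.normal(0.0, 0.5, size=(30, 2)),
                 rng.normal(5.0, 0.5, size=(30, 2))])

def knn_indices(X, k):
    """Indices of each point's k nearest neighbours (self excluded)."""
    d = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]

def mutual_knn_edges(X, k):
    """Keep (i, j) only if i is in j's k-NN list and j is in i's."""
    nn = knn_indices(X, k)
    return {(min(i, j), max(i, j))
            for i in range(len(X)) for j in map(int, nn[i])
            if i in nn[j]}

edges = mutual_knn_edges(pts, k=5)
# Mutual agreement prunes asymmetric edges, so no edge should bridge
# the two clusters in this well-separated example.
cross = [e for e in edges if (e[0] < 30) != (e[1] < 30)]
print(len(edges), len(cross))
```

Mutual k-NN graphs can fall apart into disconnected components; the paper's full construction additionally reconnects them (e.g. via a minimum spanning tree), a step omitted in this sketch.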

Functional representation of vector lattices

Isidore Fleischer
1990 Proceedings of the American Mathematical Society  
and subadditive; restricted to the solid vector sublattice without infinitesimals, it preserves the full structure (including any existing infinite lattice extrema) and is faithful.  ...  This is used to give simple proofs of Freudenthal's spectral theorem and Kakutani's structure theorem for L-spaces.  ...  Restricted to the solid vector sublattice of vectors without infinitesimals, the representation preserves additive inverse, thus the group, as well as the scalar and lattice, structure.  ... 
doi:10.1090/s0002-9939-1990-0993750-3 fatcat:oojshpj43rgrlafgwyppfa2bju

Multimodal Deep Network Embedding with Integrated Structure and Attribute Information [article]

Conghui Zheng, Li Pan, Peng Wu
2019 arXiv   pre-print
Network embedding is the process of learning low-dimensional representations for nodes in a network, while preserving node features.  ...  We employ both structural proximity and attribute proximity in the loss function to preserve the respective features and the representations are obtained by minimizing the loss function.  ...  Thus, the learned representations preserve both the structural and attribute features of nodes in the embedding space.  ... 
arXiv:1903.12019v1 fatcat:p5h3ubyzvjf6lg35rj2ca22bei

Functional Representation of Vector Lattices

Isidore Fleischer
1990 Proceedings of the American Mathematical Society  
and subadditive; restricted to the solid vector sublattice without infinitesimals, it preserves the full structure (including any existing infinite lattice extrema) and is faithful.  ...  This is used to give simple proofs of Freudenthal's spectral theorem and Kakutani's structure theorem for L-spaces.  ...  Restricted to the solid vector sublattice of vectors without infinitesimals, the representation preserves additive inverse, thus the group, as well as the scalar and lattice, structure.  ... 
doi:10.2307/2048297 fatcat:yapprbairjdgrd3r6dabc4yn3q

EmbeddingVis: A Visual Analytics Approach to Comparative Network Embedding Inspection

Quan Li, Kristanto Sean Njotoprawiro, Hammad Haleem, Qiaoan Chen, Chris Yi, Xiaojuan Ma
2018 2018 IEEE Conference on Visual Analytics Science and Technology (VAST)  
E.4 stated that "it is practical to use embedding vectors to describe structural information since the trend of a cluster's 'average distance vector' curve is similar across models."  ...  Network embedding represents a graph in a low-dimensional vector space while preserving as much graph information as possible [5] .  ... 
doi:10.1109/vast.2018.8802454 dblp:conf/ieeevast/LiNHCYM18 fatcat:5solfjf62ra3dbc7dqcchqn53y

Avoiding Conflict: When Speaker Coordination Does Not Require Conceptual Agreement

Alexandre Kabbach, Aurélie Herbelot
2021 Frontiers in Artificial Intelligence  
Our results underline the specific way in which linguistic information is spread across singular vectors, and highlight the need to distinguish agreement from mere compatibility in alignment-based notions  ...  allows for exploiting alignment-based similarity metrics to measure inter-subject alignment over an entire semantic space, rather than a set of limited entries.  ...  from conceptual representations to conceptual spaces.  ... 
doi:10.3389/frai.2020.523920 pmid:33733196 pmcid:PMC7861244 fatcat:oekrwivqyrbijcntezf5qqgihy

Network Representation Learning: A Survey [article]

Daokun Zhang, Jie Yin, Xingquan Zhu, Chengqi Zhang
2018 arXiv   pre-print
Network representation learning has been recently proposed as a new learning paradigm to embed network vertices into a low-dimensional vector space, by preserving network topology structure, vertex content  ...  This facilitates the original network to be easily handled in the new vector space for further analysis.  ...  a latent, low-dimensional structure preserving space.  ... 
arXiv:1801.05852v3 fatcat:ploedafa4jhyxlvii5l42yw2bq
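
Many of the embedding methods this survey covers reduce to factorizing a matrix that encodes network structure. A minimal illustration of the idea: a truncated SVD of the adjacency matrix of a toy graph (the graph and embedding dimensionality are made up for illustration, not an algorithm from the survey):

```python
import numpy as np

# Toy graph: two triangles joined by a bridge edge (2, 3).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Truncated SVD gives each vertex a low-dimensional vector whose
# geometry reflects coarse network topology.
U, s, _ = np.linalg.svd(A)
Z = U[:, :3] * s[:3]  # 3-D embedding, one row per vertex

dist = np.linalg.norm(Z[:, None] - Z[None, :], axis=-1)
same_triangle, cross_triangle = dist[0, 1], dist[0, 5]
print(same_triangle < cross_triangle)
```

Vertices in the same triangle land much closer together than vertices on opposite sides of the bridge, so even this crude factorization preserves the graph's community structure in the vector space.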