Limitations of Learning via Embeddings in Euclidean Half-Spaces
[chapter]
2001
Lecture Notes in Computer Science
The notion of embedding a class of dichotomies in a class of linear half spaces is central to the support vector machines paradigm. ...
We show that an overwhelming majority of the family of finite concept classes of any constant VC dimension cannot be embedded in low-dimensional half spaces. ...
Acknowledgments This work has been supported in part by the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2, No. 27150. ...
doi:10.1007/3-540-44581-1_25
fatcat:mkqw3m4fnrhx7ajmtq6srjlbsy
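To make the central notion concrete: a finite class of dichotomies over instances x_1, ..., x_n embeds in d-dimensional half spaces if one can pick points X in R^d and a weight vector per concept so that each labeling is realized by the sign of an inner product. A minimal NumPy sketch of that realizability check (the matrix M, points X, and weights W below are hypothetical illustrations, not data from the paper):

```python
import numpy as np

def realizes(M, X, W, b=None):
    """Check whether points X (one row per instance) and weights W (one row
    per concept) realize the +/-1 dichotomy matrix M via half spaces, i.e.
    sign(<w_c, x_i> + b_c) == M[c, i] for every concept c and instance i.
    A valid embedding keeps every margin strictly nonzero."""
    b = np.zeros(len(W)) if b is None else b
    return bool(np.all(np.sign(W @ X.T + b[:, None]) == M))
```

The paper's result says that for most classes of constant VC dimension, no such X and W exist unless the dimension d is large.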
Large-Margin Classification in Hyperbolic Space
[article]
2018
arXiv
pre-print
Our work allows analytic pipelines that take the inherent hyperbolic geometry of the data into account in an end-to-end fashion without resorting to ill-fitting tools developed for Euclidean space. ...
With the goal of enabling accurate classification of points in hyperbolic space while respecting their hyperbolic geometry, we introduce hyperbolic SVM, a hyperbolic formulation of support vector machine ...
Review of Hyperbolic Space Models While hyperbolic space cannot be isometrically embedded in Euclidean space, there are several useful models of hyperbolic geometry formulated as a subset of Euclidean ...
arXiv:1806.00437v1
fatcat:zvertfmbvzgl7ei72cj56hjt7e
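The review snippet above refers to models of hyperbolic geometry realized as subsets of Euclidean space. As a hedged illustration (these are the standard textbook maps between two such models, not code from the paper), the conversion between the hyperboloid model and the Poincaré ball:

```python
import numpy as np

def hyperboloid_to_poincare(x):
    """Project a point x = (x0, x1, ..., xn) on the hyperboloid
    {-x0^2 + sum_i xi^2 = -1, x0 > 0} into the open unit (Poincare) ball."""
    return x[1:] / (1.0 + x[0])

def poincare_to_hyperboloid(p):
    """Inverse map from the open unit ball back to the hyperboloid."""
    sq = np.dot(p, p)  # |p|^2 < 1 inside the ball
    return np.concatenate([[(1.0 + sq) / (1.0 - sq)], 2.0 * p / (1.0 - sq)])
```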
Poincaré Embeddings for Learning Hierarchical Representations
[article]
2017
arXiv
pre-print
However, while complex symbolic datasets often exhibit a latent hierarchical structure, state-of-the-art methods typically learn embeddings in Euclidean vector spaces, which do not account for this property ...
latent hierarchies, both in terms of representation capacity and in terms of generalization ability. ...
We then learn embeddings of all symbols in D such that related objects are close in the embedding space. ...
arXiv:1705.08039v2
fatcat:ywwlc2um4bg3nccz6gk2wqfwma
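The snippet describes learning embeddings so that related symbols are close. In the Poincaré ball, "close" means the hyperbolic geodesic distance, which grows rapidly near the boundary and thereby makes room for tree-like hierarchies. A sketch of that standard formula (the paper's training loop is omitted):

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """d(u, v) = arcosh(1 + 2|u - v|^2 / ((1 - |u|^2)(1 - |v|^2)))
    for points u, v inside the open unit ball."""
    duv = np.dot(u - v, u - v)
    denom = (1.0 - np.dot(u, u)) * (1.0 - np.dot(v, v))
    return float(np.arccosh(1.0 + 2.0 * duv / max(denom, eps)))
```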
Extracting Event Temporal Relations via Hyperbolic Geometry
[article]
2021
arXiv
pre-print
However, embeddings in the Euclidean space cannot capture richer asymmetric relations such as event temporal relations. ...
Recent neural approaches to event temporal relation extraction typically map events to embeddings in the Euclidean space and train a classifier to detect temporal relations between event pairs. ...
Acknowledgements This work was funded in part by the UK Engineering and Physical Sciences Research Council (grant no. EP/V048597/1, EP/T017112/1). ...
arXiv:2109.05527v1
fatcat:kddtvp4agzfq5h5bv4javqdzm4
Page 3135 of Psychological Abstracts Vol. 90, Issue 9
[page]
2003
Psychological Abstracts
(Dept of Computer Science, Technion, Haifa, Israel) Limitations of learning via embeddings in Euclidean half spaces. Journal of Machine Learning Research, 2003(Apr), Vol 3(3), 441-461. ...
We show that an overwhelming majority of the family of finite concept classes of any constant VC dimension cannot be embedded in low-dimensional half spaces. ...
Hyperbolic Deep Neural Networks: A Survey
[article]
2021
arXiv
pre-print
Recently, there has been a rising surge of momentum for deep representation learning in hyperbolic spaces due to their high capacity for modeling data like knowledge graphs or synonym hierarchies, possessing ...
Such a hyperbolic neural architecture potentially leads to a drastically compact model with much more physical interpretability than its counterpart in Euclidean space. ...
We also want to thank Emile Mathieu, from University of Oxford, for the explanation regarding the gyroplane layer in their Poincaré Variational Auto-Encoder. ...
arXiv:2101.04562v3
fatcat:yqj4zohrqjbplpsdy5f5uglnbu
Poincaré GloVe: Hyperbolic Word Embeddings
[article]
2018
arXiv
pre-print
Moreover, we adapt the well-known GloVe algorithm to learn unsupervised word embeddings in this type of Riemannian manifold. ...
In this paper, justified by the notion of delta-hyperbolicity or tree-likeliness of a space, we propose to embed words in a Cartesian product of hyperbolic spaces which we theoretically connect to the ...
Figures 5, 6, 7, 8 show the three steps (centering, rotation, isometric mapping to the half-plane) for 20D embeddings in (D^2)^10, i.e., each of these steps in each of the 10 corresponding 2D spaces. ...
arXiv:1810.06546v2
fatcat:cjexwnqrazdi7abxaearldpfli
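Distances in a product such as (D^2)^10 are typically combined across factors with an l2 rule; whether the paper uses exactly this combination is an assumption here. A self-contained sketch:

```python
import numpy as np

def disk_distance(u, v):
    """Geodesic distance inside a single Poincare disk D^2."""
    num = 2.0 * np.dot(u - v, u - v)
    den = (1.0 - np.dot(u, u)) * (1.0 - np.dot(v, v))
    return np.arccosh(1.0 + num / den)

def product_distance(x, y, factor_dim=2):
    """l2-combined distance in (D^2)^k, e.g. a 20D embedding split into
    k = 10 two-dimensional disk factors, as in the figures described above."""
    xs, ys = x.reshape(-1, factor_dim), y.reshape(-1, factor_dim)
    return float(np.sqrt(sum(disk_distance(a, b) ** 2 for a, b in zip(xs, ys))))
```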
Hyperbolic Deep Neural Networks: A Survey
2021
IEEE Transactions on Pattern Analysis and Machine Intelligence
The promising results demonstrate its superior capability, significant compactness of the model, and a substantially better physical interpretability than its counterpart in the Euclidean space. ...
Recently, hyperbolic deep neural networks (HDNNs) have been gaining momentum as the deep representations in the hyperbolic space provide high fidelity embeddings with few dimensions, especially for data ...
doi:10.1109/tpami.2021.3136921
pmid:34932472
fatcat:ccpfqsjlevgyxpetwd73kvbf4y
Browser-based Hyperbolic Visualization of Graphs
[article]
2022
arXiv
pre-print
A comparison with Euclidean MDS shows that H-MDS produces embeddings with lower distortion for several types of networks. ...
However, current hyperbolic network visualization approaches are limited to special types of networks and do not scale to large datasets. ...
pairs of nodes) in the embedding (e.g., realized distances between pairs of nodes in the non-Euclidean space). ...
arXiv:2205.08028v1
fatcat:w3c4fhpknrhrna7unxisdn74p4
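The comparison with Euclidean MDS is stated in terms of distortion. One common definition (an assumption here; the paper may use a different variant) is the mean relative error between the input graph distances and the distances realized in the embedding:

```python
import numpy as np

def average_distortion(d_graph, d_emb):
    """Mean of |d_emb - d_graph| / d_graph over all unordered node pairs,
    given two square distance matrices of the same shape."""
    iu = np.triu_indices_from(d_graph, k=1)
    return float(np.mean(np.abs(d_emb[iu] - d_graph[iu]) / d_graph[iu]))
```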
Prototypical Networks for Few-shot Learning
[article]
2017
arXiv
pre-print
Prototypical networks learn a metric space in which classification can be performed by computing distances to prototype representations of each class. ...
Compared to recent approaches for few-shot learning, they reflect a simpler inductive bias that is beneficial in this limited-data regime, and achieve excellent results. ...
All of our models were trained via SGD with Adam [11]. We used an initial learning rate of 10^-3 and cut the learning rate in half every 2000 episodes. ...
arXiv:1703.05175v2
fatcat:uhozztpvfvcgjaj7arv5mgtgna
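The two snippets pin down both the classification rule (distances to per-class prototypes) and the optimizer schedule. A minimal NumPy sketch of the prototype rule with a softmax over negative squared Euclidean distances, plus the stated learning-rate schedule:

```python
import numpy as np

def prototypes(support_emb, support_labels, n_classes):
    """Class prototypes = mean embedding of each class's support points."""
    return np.stack([support_emb[support_labels == k].mean(axis=0)
                     for k in range(n_classes)])

def classify(query_emb, protos):
    """Softmax over negative squared distances to the prototypes."""
    d2 = ((query_emb[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    logits = -d2 - (-d2).max(axis=1, keepdims=True)  # stabilize the softmax
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def lr_at(episode):
    """Initial rate 10^-3, halved every 2000 episodes, as stated above."""
    return 1e-3 * 0.5 ** (episode // 2000)
```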
Manifold Learning with Geodesic Minimal Spanning Trees
[article]
2003
arXiv
pre-print
In the manifold learning problem one seeks to discover a smooth low dimensional surface, i.e., a manifold embedded in a higher dimensional linear vector space, based on a set of measured sample points ...
In this paper we consider the closely related problem of estimating the manifold's intrinsic dimension and the intrinsic entropy of the sample points. ...
The authors can be contacted by email via jcosta,hero@eecs.umich.edu. ...
arXiv:cs/0307038v1
fatcat:q65muqhtorgzvavrrx2oiyni2e
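The GMST idea: on an m-dimensional manifold, the total length L_n of a geodesic minimal spanning tree over n samples grows like n^((m-1)/m), so a log-log slope estimate recovers m. The sketch below is a rough illustration of that principle, not the authors' exact procedure; the k-NN graph construction and the subsample sizes are assumptions:

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import minimum_spanning_tree

def gmst_length(X, k=8):
    """Total edge length of the MST of a k-NN graph (a geodesic proxy)."""
    G = kneighbors_graph(X, n_neighbors=k, mode="distance")
    return minimum_spanning_tree(G.maximum(G.T)).sum()

def estimate_intrinsic_dim(X, sizes=(200, 400, 800, 1600), k=8, seed=0):
    """Fit the slope a of log L_n vs. log n; then m = 1 / (1 - a)."""
    rng = np.random.default_rng(seed)
    lengths = [gmst_length(X[rng.choice(len(X), n, replace=False)], k=k)
               for n in sizes]
    a, _ = np.polyfit(np.log(sizes), np.log(lengths), 1)
    return 1.0 / (1.0 - a)
```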
Page 4863 of Mathematical Reviews Vol. , Issue 2004f
[page]
2004
Mathematical Reviews
Limitations of learning via embeddings in Euclidean half spaces. J. Mach. Learn. Res. 3 (2002), Spec. Issue Comput. Learn. Theory, 441-461. ...
Embeddings in Euclidean half spaces are the crucial step for learning with kernels, as for example with support vector machines. ...
Auto-encoded Latent Representations of White Matter Streamlines for Quantitative Distance Analysis
[article]
2021
bioRxiv
pre-print
vectors, namely, streamline embeddings, and enabled tractogram parcellation via unsupervised clustering in the latent space. ...
To resolve these issues, we proposed a novel atlas-free method that learnt a latent space using a deep recurrent autoencoder, which efficiently embedded streamlines of any length into fixed-size feature ...
Supervised learning approaches are limited to predefined streamline bundle types and require a large number of annotated streamlines via manual or semi-automated annotations. ...
doi:10.1101/2021.10.06.463445
fatcat:2j5oj46llbdy7h4vkqizvpsko4
Semi-supervised learning and graph neural networks for fake news detection
2019
Proceedings of the 2019 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining
To this end, we opted for semi-supervised learning approaches. In particular, our work proposes a graph-based semi-supervised fake news detection method based on graph neural networks. ...
The main challenge here stems from the fact that the number of labeled data is limited; very few articles can be examined and annotated as fake. ...
in the embedding space. ...
doi:10.1145/3341161.3342958
dblp:conf/asunam/BenamiraDLRSM19
fatcat:llv66xw5bvcjbfb7n7ijb4s6ce
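As a generic illustration of the propagation step such graph-based methods rely on (not the authors' specific architecture), one symmetric-normalized graph convolution layer:

```python
import numpy as np

def gcn_layer(A, H, W):
    """H' = ReLU(A_hat @ H @ W) with A_hat = D^{-1/2} (A + I) D^{-1/2},
    where A is the adjacency matrix and H the node feature matrix."""
    A_hat = A + np.eye(len(A))
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)
```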
On the Expressive Power of Kernel Methods and the Efficiency of Kernel Learning by Association Schemes
[article]
2019
arXiv
pre-print
Specifically, we define Euclidean kernels, a diverse class that includes most, if not all, families of kernels studied in literature such as polynomial kernels and radial basis functions. ...
Our structural results allow us to prove meaningful limitations on the expressive power of the class as well as derive several efficient algorithms for learning kernels over different domains. ...
Limitations on the success of kernel methods and embeddings in linear half spaces have also been studied, both for specific kernels [17] and in more general settings [33, 5]. ...
arXiv:1902.04782v1
fatcat:cwjvhxmcnfeq7cohejg6gmiytq
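The "Euclidean kernels" the abstract refers to include kernels depending only on inner products or on Euclidean distances; polynomial and radial basis function kernels are the canonical members it names. A small Gram-matrix sketch (illustrative, not the paper's constructions):

```python
import numpy as np

def polynomial_gram(X, degree=3, c=1.0):
    """k(x, y) = (<x, y> + c)^degree, a function of inner products."""
    return (X @ X.T + c) ** degree

def rbf_gram(X, gamma=0.5):
    """k(x, y) = exp(-gamma |x - y|^2), a function of Euclidean distance."""
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * (X @ X.T), 0.0)
    return np.exp(-gamma * d2)
```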