13,887 Hits in 8.2 sec

Limitations of Learning via Embeddings in Euclidean Half-Spaces [chapter]

Shai Ben-David, Nadav Eiron, Hans Ulrich Simon
2001 Lecture Notes in Computer Science  
The notion of embedding a class of dichotomies in a class of linear half spaces is central to the support vector machines paradigm.  ...  We show that an overwhelming majority of the family of finite concept classes of any constant VC dimension cannot be embedded in low-dimensional half spaces.  ...  Acknowledgments This work has been supported in part by the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2, No. 27150.  ... 
doi:10.1007/3-540-44581-1_25 fatcat:mkqw3m4fnrhx7ajmtq6srjlbsy
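The notion of embedding a class of dichotomies into Euclidean half spaces can be made concrete: each point maps to a vector, each concept to a half space {x : w·x + b ≥ 0}, so membership becomes the sign of a linear function. Below is a minimal sketch of checking whether a given dichotomy is realizable by a half space, using a plain perceptron (the AND/XOR examples and the epoch budget are illustrative choices, not from the paper):

```python
import numpy as np

def half_space_realizable(X, y, epochs=1000):
    """Check whether a dichotomy y over points X is realizable by a
    Euclidean half space {x : w.x + b >= 0}, via a simple perceptron."""
    X = np.hstack([X, np.ones((len(X), 1))])   # absorb the bias b into w
    w = np.zeros(X.shape[1])
    s = np.where(y > 0, 1.0, -1.0)
    for _ in range(epochs):
        mistakes = 0
        for xi, si in zip(X, s):
            if si * (w @ xi) <= 0:             # misclassified (or on boundary)
                w += si * xi                   # perceptron update
                mistakes += 1
        if mistakes == 0:
            return True      # converged: a separating half space exists
    return False             # never converged: likely not realizable

pts = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
print(half_space_realizable(pts, np.array([0, 0, 0, 1])))  # AND: True
print(half_space_realizable(pts, np.array([0, 1, 1, 0])))  # XOR: False
```

The paper's result concerns how large the half-space dimension must be: most finite classes of constant VC dimension admit no low-dimensional embedding of this kind.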

Large-Margin Classification in Hyperbolic Space [article]

Hyunghoon Cho, Benjamin DeMeo, Jian Peng, Bonnie Berger
2018 arXiv   pre-print
Our work allows analytic pipelines that take the inherent hyperbolic geometry of the data into account in an end-to-end fashion without resorting to ill-fitting tools developed for Euclidean space.  ...  With the goal of enabling accurate classification of points in hyperbolic space while respecting their hyperbolic geometry, we introduce hyperbolic SVM, a hyperbolic formulation of support vector machine  ...  Review of Hyperbolic Space Models While hyperbolic space cannot be isometrically embedded in Euclidean space, there are several useful models of hyperbolic geometry formulated as a subset of Euclidean  ... 
arXiv:1806.00437v1 fatcat:zvertfmbvzgl7ei72cj56hjt7e
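The snippet's "models of hyperbolic geometry formulated as a subset of Euclidean space" can be illustrated with the Poincaré ball model, where points live in the open unit ball and geodesic distance follows the standard formula below (offered as a general illustration; the paper's hyperbolic SVM is built on such models, but this is not its code):

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Geodesic distance in the Poincare ball model of hyperbolic space.
    u, v are points with Euclidean norm < 1."""
    uu = np.dot(u, u)
    vv = np.dot(v, v)
    duv = np.dot(u - v, u - v)
    # d(u, v) = arccosh(1 + 2|u-v|^2 / ((1-|u|^2)(1-|v|^2)))
    arg = 1.0 + 2.0 * duv / max((1.0 - uu) * (1.0 - vv), eps)
    return np.arccosh(arg)

# Distances blow up near the boundary: hyperbolic volume grows exponentially.
print(poincare_distance(np.array([0.0, 0.0]), np.array([0.5, 0.0])))   # ~1.10
print(poincare_distance(np.array([0.0, 0.0]), np.array([0.99, 0.0])))  # ~5.29
```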

Poincaré Embeddings for Learning Hierarchical Representations [article]

Maximilian Nickel, Douwe Kiela
2017 arXiv   pre-print
However, while complex symbolic datasets often exhibit a latent hierarchical structure, state-of-the-art methods typically learn embeddings in Euclidean vector spaces, which do not account for this property  ...  latent hierarchies, both in terms of representation capacity and in terms of generalization ability.  ...  We then learn embeddings of all symbols in D such that related objects are close in the embedding space.  ... 
arXiv:1705.08039v2 fatcat:ywwlc2um4bg3nccz6gk2wqfwma
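Nickel and Kiela optimize these embeddings with Riemannian SGD in the Poincaré ball: the Euclidean gradient is rescaled by the inverse of the conformal metric factor, and updates are retracted back into the open ball. A sketch of that update step (the learning rate and retraction epsilon below are illustrative defaults):

```python
import numpy as np

def riemannian_sgd_step(theta, euclidean_grad, lr=0.01, eps=1e-5):
    """One Riemannian SGD update in the Poincare ball.
    The metric is conformal, g_theta = (2 / (1 - |theta|^2))^2 * I, so the
    Riemannian gradient is the Euclidean gradient scaled by g^{-1}."""
    scale = (1.0 - np.dot(theta, theta)) ** 2 / 4.0   # inverse metric factor
    theta = theta - lr * scale * euclidean_grad
    norm = np.linalg.norm(theta)
    if norm >= 1.0:                        # retract back into the open ball
        theta = theta * (1.0 - eps) / norm
    return theta

theta = np.array([0.3, 0.4])
print(riemannian_sgd_step(theta, np.array([1.0, -2.0])))
```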

Extracting Event Temporal Relations via Hyperbolic Geometry [article]

Xingwei Tan, Gabriele Pergola, Yulan He
2021 arXiv   pre-print
However, embeddings in the Euclidean space cannot capture richer asymmetric relations such as event temporal relations.  ...  Recent neural approaches to event temporal relation extraction typically map events to embeddings in the Euclidean space and train a classifier to detect temporal relations between event pairs.  ...  Acknowledgements This work was funded in part by the UK Engineering and Physical Sciences Research Council (grant no. EP/V048597/1, EP/T017112/1).  ... 
arXiv:2109.05527v1 fatcat:kddtvp4agzfq5h5bv4javqdzm4

Page 3135 of Psychological Abstracts Vol. 90, Issue 9 [page]

2003 Psychological Abstracts  
(Dept of Computer Science, Technion, Haifa, Israel) Limitations of learning via embeddings in Euclidean half spaces. Journal of Machine Learning Research, 2003(Apr), Vol 3(3), 441-461.  ...  We show that an overwhelming majority of the family of finite concept classes of any constant VC dimension cannot be embedded in low-dimensional half spaces.  ... 

Hyperbolic Deep Neural Networks: A Survey [article]

Wei Peng, Tuomas Varanka, Abdelrahman Mostafa, Henglin Shi, Guoying Zhao
2021 arXiv   pre-print
Recently, there has been a rising surge of momentum for deep representation learning in hyperbolic spaces due to their high capacity of modeling data like knowledge graphs or synonym hierarchies, possessing  ...  Such a hyperbolic neural architecture potentially leads to a drastically compact model with much more physical interpretability than its counterpart in Euclidean space.  ...  We also want to thank Emile Mathieu, from University of Oxford, for the explanation regarding the gyroplane layer in their Poincaré Variational Auto-Encoder.  ... 
arXiv:2101.04562v3 fatcat:yqj4zohrqjbplpsdy5f5uglnbu

Poincaré GloVe: Hyperbolic Word Embeddings [article]

Alexandru Tifrea, Gary Bécigneul, Octavian-Eugen Ganea
2018 arXiv   pre-print
Moreover, we adapt the well-known GloVe algorithm to learn unsupervised word embeddings in this type of Riemannian manifold.  ...  In this paper, justified by the notion of delta-hyperbolicity or tree-likeliness of a space, we propose to embed words in a Cartesian product of hyperbolic spaces which we theoretically connect to the  ...  Figures 5, 6, 7, 8 show the three steps (centering, rotation, isometric mapping to half-plane) for 20D embeddings in (D^2)^10, i.e. each of these steps in each of the 10 corresponding 2D spaces.  ... 
arXiv:1810.06546v2 fatcat:cjexwnqrazdi7abxaearldpfli
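The "isometric mapping to half-plane" step mentioned in the snippet can be illustrated with the standard Cayley transform, an isometry from the Poincaré disk onto the upper half-plane (a textbook map, given here as an assumed stand-in for the authors' exact procedure):

```python
# Cayley transform: an isometry from the Poincare disk {|w| < 1}
# to the upper half-plane {Im(z) > 0}, using complex coordinates.

def disk_to_half_plane(w: complex) -> complex:
    return 1j * (1 + w) / (1 - w)

print(disk_to_half_plane(0 + 0j))     # center of the disk -> i
print(disk_to_half_plane(0.5 + 0j))   # interior point stays in the half-plane
print(disk_to_half_plane(-1 + 0j))    # boundary -> boundary (0 on the real axis)
```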

Prototypical Networks for Few-shot Learning [article]

Jake Snell, Kevin Swersky, Richard S. Zemel
2017 arXiv   pre-print
Prototypical networks learn a metric space in which classification can be performed by computing distances to prototype representations of each class.  ...  Compared to recent approaches for few-shot learning, they reflect a simpler inductive bias that is beneficial in this limited-data regime, and achieve excellent results.  ...  All of our models were trained via SGD with Adam [11] . We used an initial learning rate of 10 −3 and cut the learning rate in half every 2000 episodes.  ... 
arXiv:1703.05175v2 fatcat:uhozztpvfvcgjaj7arv5mgtgna
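The mechanism described in the abstract is direct enough to sketch: a class prototype is the mean of its embedded support examples, and a query is classified by a softmax over negative squared distances to the prototypes. A minimal numpy version (the embedding network is assumed given; the identity embedding below is purely illustrative):

```python
import numpy as np

def prototypes(support_emb, support_labels, n_classes):
    """Class prototype = mean of the embedded support points of that class."""
    return np.stack([support_emb[support_labels == c].mean(axis=0)
                     for c in range(n_classes)])

def classify(query_emb, protos):
    """Softmax over negative squared Euclidean distances to prototypes."""
    d2 = ((query_emb[:, None, :] - protos[None, :, :]) ** 2).sum(-1)
    logits = -d2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    return p / p.sum(axis=1, keepdims=True)

# Toy episode: 2 classes, 3 support points each, identity embedding.
support = np.array([[0., 0.], [0., 1.], [1., 0.],    # class 0
                    [5., 5.], [5., 6.], [6., 5.]])   # class 1
labels = np.array([0, 0, 0, 1, 1, 1])
protos = prototypes(support, labels, 2)
print(classify(np.array([[0.5, 0.5], [5.5, 5.5]]), protos))
```

The quoted training detail corresponds to a step schedule such as lr = 1e-3 * 0.5 ** (episode // 2000).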

Manifold Learning with Geodesic Minimal Spanning Trees [article]

Jose Costa, Alfred Hero
2003 arXiv   pre-print
In the manifold learning problem one seeks to discover a smooth low dimensional surface, i.e., a manifold embedded in a higher dimensional linear vector space, based on a set of measured sample points  ...  In this paper we consider the closely related problem of estimating the manifold's intrinsic dimension and the intrinsic entropy of the sample points.  ...  The authors can be contacted by email via jcosta,hero@eecs.umich.edu.  ... 
arXiv:cs/0307038v1 fatcat:q65muqhtorgzvavrrx2oiyni2e
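The GMST estimator rests on an asymptotic fact: over n samples from a d-dimensional manifold, the total length L_n of a geodesic minimal spanning tree grows roughly as n^((d-1)/d), so d can be read off the slope of log L_n versus log n. A rough sketch of that recipe (using the k-NN graph as a geodesic proxy; the subsample grid and k=8 are illustrative assumptions, not the paper's settings):

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import minimum_spanning_tree

def gmst_length(X, k=8):
    """Total edge length of the MST of the k-NN graph (geodesic proxy)."""
    g = kneighbors_graph(X, n_neighbors=k, mode='distance')
    return minimum_spanning_tree(g).sum()

def intrinsic_dim(X, sizes=(200, 400, 800, 1600)):
    """Fit slope a of log L_n vs log n; a = (d-1)/d, so d = 1/(1-a)."""
    rng = np.random.default_rng(0)
    lens = [gmst_length(X[rng.choice(len(X), n, replace=False)]) for n in sizes]
    a, _ = np.polyfit(np.log(sizes), np.log(lens), 1)
    return 1.0 / (1.0 - a)

# Swiss-roll-like data: a 2D manifold embedded in 3D.
t = np.random.default_rng(1).uniform(0, 4 * np.pi, 2000)
h = np.random.default_rng(2).uniform(0, 10, 2000)
X = np.column_stack([t * np.cos(t), h, t * np.sin(t)])
print(intrinsic_dim(X))   # should come out near 2
```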

Page 4863 of Mathematical Reviews Vol. , Issue 2004f [page]

2004 Mathematical Reviews  
of learning via embeddings in Euclidean half spaces.  ...  Learn. Res. 3 (2002), Spec. Issue Comput. Learn. Theory, 441-461. Embeddings in Euclidean half spaces are the crucial step for learning with kernels, as for example with support vector machines.  ... 

Semi-supervised learning and graph neural networks for fake news detection

Adrien Benamira, Benjamin Devillers, Etienne Lesot, Ayush K. Ray, Manal Saadi, Fragkiskos D. Malliaros
2019 Proceedings of the 2019 IEEE/ACM International Conference on Advances in Social Networks Analysis and Mining  
To this end, we opted for semi-supervised learning approaches. In particular, our work proposes a graph-based semi-supervised fake news detection method, based on graph neural networks.  ...  The main challenge here stems from the fact that the amount of labeled data is limited; very few articles can be examined and annotated as fake.  ...  in the embedding space.  ... 
doi:10.1145/3341161.3342958 dblp:conf/asunam/BenamiraDLRSM19 fatcat:llv66xw5bvcjbfb7n7ijb4s6ce
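The method builds on graph neural networks; the snippet does not show the exact architecture, but a single normalized graph-convolution layer of the kind such models stack (the propagation rule of Kipf and Welling, used here as a generic illustration, not this paper's specific network) looks like:

```python
import numpy as np

def gcn_layer(A, X, W):
    """One graph-convolution layer: H = ReLU(D^{-1/2} (A+I) D^{-1/2} X W)."""
    A_hat = A + np.eye(len(A))                 # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))     # symmetric degree normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ X @ W, 0.0)

# Toy graph of 4 articles with 2 features each, projected to 3 dimensions.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 2))
W = np.random.default_rng(1).normal(size=(2, 3))
print(gcn_layer(A, X, W).shape)   # (4, 3)
```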

On the Expressive Power of Kernel Methods and the Efficiency of Kernel Learning by Association Schemes [article]

Pravesh K. Kothari, Roi Livni
2019 arXiv   pre-print
Specifically, we define Euclidean kernels, a diverse class that includes most, if not all, families of kernels studied in the literature, such as polynomial kernels and radial basis functions.  ...  Our structural results allow us to prove meaningful limitations on the expressive power of the class as well as derive several efficient algorithms for learning kernels over different domains.  ...  Limitations on the success of kernel methods and embeddings in linear half spaces have also been studied, both for specific kernels [17] and in more general settings [33, 5].  ... 
arXiv:1902.04782v1 fatcat:cwjvhxmcnfeq7cohejg6gmiytq
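The two kernel families the abstract names are easy to state concretely; a brief sketch of the polynomial and RBF (Gaussian) kernels (the degree and bandwidth values are arbitrary illustrations):

```python
import numpy as np

def polynomial_kernel(X, Y, degree=3, c=1.0):
    """k(x, y) = (x.y + c)^degree"""
    return (X @ Y.T + c) ** degree

def rbf_kernel(X, Y, gamma=0.5):
    """k(x, y) = exp(-gamma * |x - y|^2), a radial basis function."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

X = np.random.default_rng(0).normal(size=(5, 3))
K = rbf_kernel(X, X)
# A valid kernel matrix is positive semi-definite:
print(np.linalg.eigvalsh(K).min() >= -1e-10)
```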

HAKG: Hierarchy-Aware Knowledge Gated Network for Recommendation [article]

Yuntao Du, Xinjun Zhu, Lu Chen, Baihua Zheng, Yunjun Gao
2022 arXiv   pre-print
Meanwhile, we introduce a novel angle constraint to preserve characteristics of items in the embedding space.  ...  Further analyses on the learned hyperbolic embeddings confirm that HAKG offers meaningful insights into the hierarchies of data.  ...  Euclidean space, we aim to capture the non-Euclidean latent anatomy of data.  ... 
arXiv:2204.04959v1 fatcat:xjj3a7e2z5dyxfaim24m24hnha

Directed Graph Embeddings in Pseudo-Riemannian Manifolds [article]

Aaron Sim, Maciej Wiatrak, Angus Brayne, Páidí Creed, Saee Paliwal
2021 arXiv   pre-print
The inductive biases of graph representation learning algorithms are often encoded in the background geometry of their embedding space.  ...  , and a unique likelihood function that explicitly incorporates a preferred direction in embedding space.  ...  In graph representation learning, the embedding space geometry itself encodes many such inductive biases, even in the simplest of spaces.  ... 
arXiv:2106.08678v1 fatcat:ccldikzizvcn7pm2nvkrgond2y
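What distinguishes a pseudo-Riemannian ambient space is an indefinite metric. The simplest instance is the Lorentzian inner product, whose sign structure singles out a preferred (time-like) direction, which is what makes directed, asymmetric relationships representable. A small sketch (illustrative; not the paper's specific manifold or likelihood function):

```python
import numpy as np

def lorentz_inner(x, y):
    """Indefinite inner product <x, y> = -x_0 y_0 + sum_i x_i y_i.
    Its sign classifies directions: negative (time-like), zero (null),
    positive (space-like) -- unlike a Euclidean inner product."""
    return -x[0] * y[0] + x[1:] @ y[1:]

x = np.array([2.0, 1.0, 0.0])
print(lorentz_inner(x, x))                       # -3.0: time-like
print(lorentz_inner(np.array([1.0, 1.0, 0.0]),
                    np.array([1.0, 1.0, 0.0])))  # 0.0: null (light-like)
```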

Overlapping Spaces for Compact Graph Representations [article]

Kirill Shevkunov, Liudmila Prokhorenkova
2022 arXiv   pre-print
The main idea is to allow subsets of coordinates to be shared between spaces of different types (Euclidean, hyperbolic, spherical).  ...  Our experiments confirm that overlapping spaces outperform the competitors in graph embedding tasks.  ...  If k = 1, we get a standard Euclidean, spherical, or hyperbolic space. In [9], it is proposed to simultaneously learn an embedding and scale coefficients w_i.  ... 
arXiv:2007.02445v3 fatcat:ilscwwmcp5grfgm6hl6taxagea
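A product-space distance with learned scale coefficients w_i, as in the quoted description of [9], combines per-factor distances; overlapping spaces additionally let the factor blocks share coordinates. A sketch of the non-overlapping product case (the coordinate split and weights are illustrative assumptions):

```python
import numpy as np

def spherical_dist(u, v):
    """Great-circle distance between unit vectors."""
    return np.arccos(np.clip(np.dot(u, v), -1.0, 1.0))

def poincare_dist(u, v):
    """Geodesic distance in the Poincare ball."""
    num = 2.0 * np.dot(u - v, u - v)
    den = (1.0 - np.dot(u, u)) * (1.0 - np.dot(v, v))
    return np.arccosh(1.0 + num / den)

def product_dist(x, y, w):
    """Weighted product-space metric: coordinates 0-1 Euclidean,
    2-3 spherical (normalized to unit vectors), 4-5 hyperbolic (Poincare
    ball). In overlapping spaces, these blocks could share entries."""
    d_e = np.linalg.norm(x[0:2] - y[0:2])
    d_s = spherical_dist(x[2:4] / np.linalg.norm(x[2:4]),
                         y[2:4] / np.linalg.norm(y[2:4]))
    d_h = poincare_dist(x[4:6], y[4:6])
    return np.sqrt(w[0] * d_e**2 + w[1] * d_s**2 + w[2] * d_h**2)

x = np.array([1.0, 0.0, 1.0, 0.0, 0.1, 0.2])
y = np.array([0.0, 1.0, 0.0, 1.0, 0.3, 0.1])
print(product_dist(x, y, w=np.array([1.0, 0.5, 2.0])))
```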
Showing results 1 — 15 out of 13,887 results