20,595 Hits in 3.2 sec

Hilbert Space Embeddings of Predictive State Representations [article]

Byron Boots, Geoffrey Gordon, Arthur Gretton
2013 arXiv   pre-print
Predictive State Representations (PSRs) are an expressive class of models for controlled stochastic processes. PSRs represent state as a set of predictions of future observable events.  ...  The essence is to represent the state as a nonparametric conditional embedding operator in a Reproducing Kernel Hilbert Space (RKHS) and leverage recent work in kernel methods to estimate, predict, and  ...  We propose a novel and highly expressive model: Hilbert space embeddings of predictive state representations. This model extends discrete linear PSRs to large and continuous-valued dynamical systems.  ... 
arXiv:1309.6819v1 fatcat:qooafkqekzdnfc4yqoeyd3byim

Hilbert Space Embeddings of Predictive State Representations

Byron Boots, Arthur Gretton, Geoffrey J. Gordon
2018
Predictive State Representations (PSRs) are an expressive class of models for controlled stochastic processes. PSRs represent state as a set of predictions of future observable events.  ...  The essence is to represent the state as one or more nonparametric conditional embedding operators in a Reproducing Kernel Hilbert Space (RKHS) and leverage recent work in kernel methods to estimate, predict  ...  We propose a novel and highly expressive model: Hilbert space embeddings of predictive state representations. This model extends discrete linear PSRs to large and continuous-valued dynamical systems.  ... 
doi:10.1184/r1/6475775 fatcat:hr6gcdsbgbgtdizrv5n7mpywtq

Hilbert Space Embeddings of Hidden Markov Models

Le Song, Byron Boots, Sajid M. Siddiqi, Geoffrey J. Gordon, Alexander J. Smola
2010 International Conference on Machine Learning  
We apply our method to robot vision data, slot car inertial sensor data and audio event classification data, and show that in these applications, embedded HMMs exceed the previous state-of-the-art performance  ...  However, they are restricted to discrete latent states, and are largely restricted to Gaussian and discrete observations.  ...  Hilbert space observable representation: We will focus on the embedding μ_{X_{t+1}|x_{1:t}} for the predictive density P(X_{t+1}|x_{1:t}) of an HMM.  ... 
dblp:conf/icml/SongSGS10 fatcat:3hw2bbfsyrga5lpywzwgskykvq

Empirical Mode Modeling: A data-driven approach to recover and forecast nonlinear dynamics from noisy data [article]

Joseph Park, Gerald M Pao, Erik Stabenau, George Sugihara, Thomas Lorimer
2021 arXiv   pre-print
Here, we evaluate the synthesis of empirical mode decomposition with empirical dynamic modeling, which we term empirical mode modeling, to increase the information content of state-space representations  ...  data-driven, model-free, state-space analysis in the presence of noise.  ...  the fitness of state-space representations derived from traditional time-delay embeddings and from IMFs of observed variables.  ... 
arXiv:2103.07281v1 fatcat:ugeircur6neutlkf6usqfw7enu

A possible mathematics for the unification of quantum mechanics and general relativity

A. Kryukov
2010 Journal of Mathematical Physics  
The framework is based on Hilbert spaces H of functions of four space-time variables x,t, furnished with an additional indefinite inner product invariant under Poincaré transformations, and isomorphisms  ...  Simultaneously, the Minkowski space-time is isometrically embedded into H, Poincaré transformations have unique extensions to isomorphisms of H and the embedding commutes with Poincaré transformations  ...  Then the observer will not be able to use the functional content of the Hilbert space H ′ of representation or the representation itself to determine the state of motion of the frame K ′ .  ... 
doi:10.1063/1.3298678 fatcat:66six4ntnvcbhekxsjjckt32pq

Topology, Convergence, and Reconstruction of Predictive States [article]

Samuel P. Loomis, James P. Crutchfield
2021 arXiv   pre-print
Moreover, predictive states may be represented in Hilbert spaces that replicate the weak topology.  ...  We mathematically explain how these representations are particularly beneficial when reconstructing high-memory processes and connect them to reproducing kernel Hilbert spaces.  ...  Embedding predictions in a Hilbert space: The space K(μ) of predictive states is a subspace of P(X^N), the probability measures over X^N.  ... 
arXiv:2109.09203v1 fatcat:ujxwog73nvgojbtzpjyisgs5iy

Quantum embeddings for machine learning [article]

Seth Lloyd, Maria Schuld, Aroosa Ijaz, Josh Izaac, Nathan Killoran
2020 arXiv   pre-print
The first part of the circuit implements a quantum feature map that encodes classical inputs into quantum states, embedding the data in a high-dimensional Hilbert space; the second part of the circuit  ...  We propose to instead train the first part of the circuit---the embedding---with the objective of maximally separating data classes in Hilbert space, a strategy we call quantum metric learning.  ...  The goal of the embedding process is to find a representation of the data such that the known metric of the Hilbert space faithfully reproduces the unknown metric of the original data, for example, the  ... 
arXiv:2001.03622v2 fatcat:chqcqcxa7bbj5di2ax7q23oqry
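The quantum feature map described in this entry can be illustrated classically. Below is a minimal numpy sketch (the rotation-based encoding and all function names are our assumptions, not the paper's API): each input coordinate sets a single-qubit rotation, the classical vector is embedded as a tensor-product state in Hilbert space, and the squared state overlap acts as a kernel between inputs.

```python
import numpy as np

def embed(x):
    """Embed a classical vector x as |psi(x)> = ⊗_i RY(x_i)|0> (angle encoding)."""
    state = np.array([1.0 + 0j])
    for xi in x:
        qubit = np.array([np.cos(xi / 2), np.sin(xi / 2)], dtype=complex)
        state = np.kron(state, qubit)  # tensor product grows the Hilbert space
    return state

def quantum_kernel(x, y):
    """Overlap |<psi(x)|psi(y)>|^2 — the kernel induced by the embedding."""
    return abs(np.vdot(embed(x), embed(y))) ** 2

x = np.array([0.1, 0.7])
print(quantum_kernel(x, x))  # identical inputs give overlap 1 (up to rounding)
```

Metric learning in the paper's sense would then train the embedding parameters so that this overlap is large within a class and small across classes.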

Quantum-inspired Multimodal Fusion for Video Sentiment Analysis [article]

Qiuchi Li, Dimitris Gkoumas, Christina Lioma, Massimo Melucci
2021 arXiv   pre-print
The complex-valued neural network implementation of the framework achieves comparable results to state-of-the-art systems on two benchmarking video sentiment analysis datasets.  ...  We tackle the crucial challenge of fusing different modalities of features for multimodal sentiment analysis.  ...  Hence, in our framework, the Hilbert space is a composition of unimodal Hilbert spaces for single modalities, and is therefore referred to as the Multimodal Hilbert Space H_mm.  ... 
arXiv:2103.10572v2 fatcat:hldyp5i35jhwrdtc7gvneb353m

Kernel Mean Embedding of Distributions: A Review and Beyond

Krikamol Muandet, Kenji Fukumizu, Bharath Sriperumbudur, Bernhard Schölkopf
2017 Foundations and Trends® in Machine Learning  
A Hilbert space embedding of a distribution---in short, a kernel mean embedding---has recently emerged as a powerful tool for machine learning and inference.  ...  The survey begins with a brief introduction to the RKHS and positive definite kernels which forms the backbone of this survey, followed by a thorough discussion of the Hilbert space embedding of marginal  ...  Hilbert space embeddings of predictive state representations. In Proceedings of the 29th International Conference on Uncertainty in Artificial Intelligence, pages 92-101, 2013. K. Borgwardt, A.  ... 
doi:10.1561/2200000060 fatcat:vgmsbodozngltpzy6c2idxnx34
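The kernel mean embedding surveyed in this entry admits a compact empirical sketch (a generic illustration with an RBF kernel; the function names are ours): a sample is embedded as the average of its kernel features, μ̂_P = (1/n) Σ_i k(x_i, ·), and the squared RKHS distance between two such embeddings is the (biased) Maximum Mean Discrepancy, computable entirely via the kernel trick.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    """RBF Gram matrix k(x, y) = exp(-gamma * ||x - y||^2)."""
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=1.0):
    """Biased estimate of ||mu_P - mu_Q||^2 in the RKHS (always >= 0)."""
    return (rbf(X, X, gamma).mean() + rbf(Y, Y, gamma).mean()
            - 2 * rbf(X, Y, gamma).mean())

rng = np.random.default_rng(0)
same = mmd2(rng.normal(size=(200, 2)), rng.normal(size=(200, 2)))
diff = mmd2(rng.normal(size=(200, 2)), rng.normal(3.0, 1.0, size=(200, 2)))
print(diff > same)  # shifted samples sit farther apart in the RKHS
```

This two-sample distance is one of the core applications the review covers; embeddings of conditional distributions extend the same averaging idea with operator-valued regression.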

Spatio-Temporal Hilbert Maps for Continuous Occupancy Representation in Dynamic Environments

Ransalu Senanayake, Lionel Ott, Simon Timothy O'Callaghan, Fabio Tozeto Ramos
2016 Neural Information Processing Systems  
The main benefit of this approach is that it can directly predict the occupancy state of the map in the future from past observations, being a valuable tool for robot trajectory planning under uncertainty  ...  We consider the problem of building continuous occupancy representations in dynamic environments for robotics applications.  ...  Static Hilbert maps (SHMs): A static Hilbert map (SHM) [3] is a continuous probabilistic occupancy representation of the space, given a collection of range sensor measurements.  ... 
dblp:conf/nips/SenanayakeOOR16 fatcat:e2qq2i3jxzc5ndqjzc2csghfjm

CHilEnPred: CNN Model With Hilbert Curve Representation of DNA Sequence For Enhancer Prediction [article]

Md. Monowar Anjum, Ibrahim Asadullah Tahmid, M. Sohel Rahman
2019 bioRxiv   pre-print
Results: In this study, we develop the predictor model CHilEnPred, trained on a visual representation of DNA sequences given by the Hilbert curve.  ...  We report our computational prediction result on the FANTOM5 dataset, where CHilEnPred achieves an accuracy of 94.97% and AUC of 0.987 on test data.  ...  Our method takes advantage of the better representation provided by the Hilbert curve and uses a CNN to predict enhancers.  ... 
doi:10.1101/552141 fatcat:w6u75qi4tne3jk4znpyla5s6be
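The Hilbert-curve image representation this entry relies on can be sketched with the standard index-to-coordinate conversion (our own illustration, not the CHilEnPred code): position d along a sequence is sent to pixel (x, y) of a 2^order × 2^order image so that sequence neighbours stay spatially adjacent, which is the locality a CNN can exploit.

```python
def hilbert_d2xy(order, d):
    """Map index d on an order-n Hilbert curve to (x, y) pixel coordinates."""
    x = y = 0
    t = d
    s = 1
    while s < 2 ** order:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:  # rotate/flip the quadrant to keep the curve continuous
            if rx == 1:
                x, y = s - 1 - x, s - 1 - y
            x, y = y, x
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y

# Consecutive sequence positions land on adjacent pixels:
path = [hilbert_d2xy(2, d) for d in range(16)]
```

Writing one-hot base encodings into the pixels visited by `path` would yield the kind of image such a model trains on.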

Quantum Embedding Search for Quantum Machine Learning [article]

Nam Nguyen, Kwang-Cheng Chen
2021 arXiv   pre-print
First, we establish the connection between the structures of quantum embedding and the representations of directed multi-graphs, enabling a well-defined search space.  ...  Second, we instigate the entanglement level to reduce the cardinality of the search space to a feasible size for practical implementations.  ...  Similar to kernel methods, quantum embedding transforms observations in classical data space into a quantum Hilbert space of quantum states, in which the inner product of quantum representations can represent  ... 
arXiv:2105.11853v2 fatcat:yx4cig5px5e6hedwwzdpugnmfq

Quantum Embedding Search for Quantum Machine Learning

Nam Nguyen, Kwang-Cheng Chen
2022 IEEE Access  
First, we establish the connection between the structures of entanglement using CNOT gates and the representations of directed multi-graphs, enabling a well-defined search space.  ...  by QES outperforms manual designs in terms of predictive performance.  ...  Similar to kernel methods, quantum embedding transforms observations in classical data space into a quantum Hilbert space of quantum states, in which the inner product of quantum representations can represent  ... 
doi:10.1109/access.2022.3167398 fatcat:fde64j6sl5cdrlhyinbo764tva

Learning and Inference in Hilbert Space with Quantum Graphical Models [article]

Siddarth Srinivasan, Carlton Downey, Byron Boots
2018 arXiv   pre-print
Unlike classical graphical models, QGMs represent uncertainty with density matrices in complex Hilbert spaces. Hilbert space embeddings (HSEs) also generalize Bayesian inference in Hilbert spaces.  ...  We show that these operations can be kernelized, and use these insights to propose a Hilbert Space Embedding of Hidden Quantum Markov Models (HSE-HQMM) to model dynamics.  ...  PSRNNs: Predictive State Recurrent Neural Networks (PSRNNs) [Downey et al., 2017] are a recent state-of-the-art model developed by embedding a Predictive State Representation  ... 
arXiv:1810.12369v1 fatcat:tuxpe4s3gvbo7ahrwtibeeqigq

Quantum Language Model with Entanglement Embedding for Question Answering [article]

Yiwei Chen, Yu Pan, Daoyi Dong
2020 arXiv   pre-print
We propose a neural network model with a novel Entanglement Embedding (EE) module, whose function is to transform the word sequences into entangled pure states of many-body quantum systems.  ...  Nevertheless, in the current literature word sequences are basically modelled as a classical mixture of word states, which cannot fully exploit the potential of a quantum probabilistic description.  ...  In line with the previous works, a word is modelled as a quantum pure state in a Hilbert space H_w and the complex-valued word embedding module is used to transform the one-hot encoded vector representation  ... 
arXiv:2008.09943v1 fatcat:sgfb5nrkwrfkbjp4njfyv5o2rq
Showing results 1 — 15 out of 20,595 results