
Mutually Regressive Point Processes

Ifigeneia Apostolopoulou, Scott Linderman, Kyle Miller, Artur Dubrawski
2019 Neural Information Processing Systems  
Hawkes processes admit many efficient inference algorithms, but are limited to mutually excitatory effects.  ...  They can be modeled naturally as realizations of a point process.  ...  Mutually Regressive Point Process: a generalization of the Hawkes Process. The intensity function λ_n(t), for events of type n occurring at times t_i^n, of a Mutually Regressive Point Process (MR-PP) is  ... 
dblp:conf/nips/ApostolopoulouL19 fatcat:dc2mumrnwzaynikr2iegfhq7cq
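The MR-PP above generalizes the Hawkes conditional intensity. For orientation, here is a minimal sketch of the classic univariate Hawkes intensity with an exponential kernel; the parameter values μ, α, β are illustrative defaults, not taken from the paper:

```python
import numpy as np

def hawkes_intensity(t, event_times, mu=0.5, alpha=0.8, beta=1.5):
    """Conditional intensity of a univariate Hawkes process:
    lambda(t) = mu + sum over past events t_i < t of alpha * exp(-beta * (t - t_i)).
    mu is the baseline rate; each past event adds an exponentially decaying bump."""
    past = np.asarray(event_times, dtype=float)
    past = past[past < t]  # only events strictly before t contribute
    return mu + float(np.sum(alpha * np.exp(-beta * (t - past))))

# With no past events the intensity equals the baseline rate mu.
print(hawkes_intensity(1.0, []))  # -> 0.5
```

Each event raises the intensity, which is the "mutually excitatory" limitation the snippet mentions; the MR-PP extends this to allow inhibitory effects as well.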

Bayesian Kernel and Mutual k-Nearest Neighbor Regression [article]

Hyun-Chul Kim
2016 arXiv   pre-print
We propose Bayesian extensions of two nonparametric regression methods which are kernel and mutual k-nearest neighbor regression methods.  ...  Derived based on Gaussian process models for regression, the extensions provide distributions for target value estimates and the framework to select the hyperparameters.  ...  BAYESIAN KERNEL AND MUTUAL k-NN REGRESSION VIA GAUSSIAN PROCESSES A.  ... 
arXiv:1608.01410v1 fatcat:udtaxw5q6jeufdrysf7xtgfpgq
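For context, a plain (non-Bayesian) mutual k-NN regression step can be sketched as follows: average the targets of training points that are among the query's k nearest neighbours and that reciprocally count the query among their own k nearest. This is a generic illustration of the MkNN idea, not the authors' Gaussian-process formulation:

```python
import numpy as np

def mutual_knn_predict(X, y, x_query, k=2):
    """Predict by averaging targets of the query's *mutual* k nearest neighbours.
    Falls back to ordinary k-NN if no mutual neighbours exist."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    d_q = np.linalg.norm(X - x_query, axis=1)
    knn_of_query = np.argsort(d_q)[:k]          # query's k nearest training points
    mutual = []
    for j in knn_of_query:
        d_j = np.linalg.norm(X - X[j], axis=1)  # j's distances to other training points
        d_j[j] = np.inf                         # exclude j itself
        cand = np.append(d_j, np.linalg.norm(X[j] - x_query))
        if len(X) in np.argsort(cand)[:k]:      # is the query among j's k nearest?
            mutual.append(j)
    neighbours = mutual if mutual else list(knn_of_query)
    return float(np.mean(y[neighbours]))
```

The Bayesian extension in the paper replaces this point estimate with a Gaussian-process-derived predictive distribution.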

Bayesian Model Selection Methods for Mutual and Symmetric k-Nearest Neighbor Classification [article]

Hyun-Chul Kim
2016 arXiv   pre-print
Bayesian mutual and symmetric k-NN regression methods are based on Gaussian process models, and it turns out that they can do MkNN and SkNN classification with new encodings of target values (class labels  ...  We propose ways in which MkNN and SkNN classification can be performed based on Bayesian mutual and symmetric k-NN regression methods, with selection schemes for the parameter k.  ...  Bayesian Mutual and Symmetric k-NN Regression via Gaussian Processes 1) Gaussian Process Regression: Assume that we have a data set D of data points x_i with continuous target values y_i: D = {(x_i  ... 
arXiv:1608.04063v1 fatcat:7yjh2qyxxjdftn5qj6x3w6dutm

Mutual information based dimensionality reduction with application to non-linear regression

Lev Faivishevsky, Jacob Goldberger
2010 2010 IEEE International Workshop on Machine Learning for Signal Processing  
Next we provide a nonlinear regression algorithm based on the proposed dimensionality reduction approach.  ...  In this paper we introduce a supervised linear dimensionality reduction algorithm which is based on finding a projected input space that maximizes mutual information between input and output values.  ...  (x_i^c − x_j^c)^⊤ A (x_i^c − x_j^c)  ...  Experimental results: First, we compared MIPR performance with other possibilities for the regression in the resulting subspace, such as Gaussian Process  ... 
doi:10.1109/mlsp.2010.5589176 fatcat:7qhylegog5g2nndimvlg7evafu
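The mutual-information objective this method maximizes can be illustrated with a crude plug-in histogram estimator. The paper itself uses a smooth nonparametric (kNN-based) estimator; the histogram version below is only for intuition about why an informative projection scores higher than a noise direction:

```python
import numpy as np

def mi_histogram(u, v, bins=8):
    """Plug-in mutual information estimate I(U;V) from a 2-D histogram:
    I = sum p(u,v) * log( p(u,v) / (p(u) p(v)) ) over non-empty bins."""
    p_uv, _, _ = np.histogram2d(u, v, bins=bins)
    p_uv = p_uv / p_uv.sum()
    p_u = p_uv.sum(axis=1, keepdims=True)   # marginal of U, shape (bins, 1)
    p_v = p_uv.sum(axis=0, keepdims=True)   # marginal of V, shape (1, bins)
    nz = p_uv > 0
    return float(np.sum(p_uv[nz] * np.log(p_uv[nz] / (p_u @ p_v)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = x + 0.1 * rng.normal(size=2000)   # projection aligned with the output
noise = rng.normal(size=2000)         # independent noise direction
print(mi_histogram(x, y) > mi_histogram(noise, y))  # -> True
```

A supervised dimensionality reduction in this spirit would search over projection directions for the one maximizing such an estimate against the target.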

Page 3371 of Journal of the Atmospheric Sciences Vol. 62, Issue 9 [page]

2005 Journal of the Atmospheric Sciences  
By the fundamental data processing theorem in information theory (Cover and Thomas 1991, chapter 2), the mutual information between the above variables satisfies the inequality I(F; O) ≤ I(F; V). (6)  ...  It can be shown that the average predictability of the regression forecast distribution is the mutual information between F and V, denoted I(F; V); I(F; V) will be called the predictability of the regression  ... 

Dimensionality reduction based on non-parametric mutual information

Lev Faivishevsky, Jacob Goldberger
2012 Neurocomputing  
Next we provide a nonlinear regression algorithm based on the proposed dimensionality reduction approach.  ...  In this paper we introduce a supervised linear dimensionality reduction algorithm which finds a projected input space that maximizes the mutual information between input and output values.  ...  Experimental results First, we compared MIPR performance with other alternatives for the regression in the resulting subspace, such as Gaussian process regression, considered today to be the state-of-the-art  ... 
doi:10.1016/j.neucom.2011.07.028 fatcat:cuv43q63kjfthhk33vfv3bgb2e

Prediction of Lung Function in Adolescence Using Epigenetic Aging: A Machine Learning Approach

Md Adnan Arefeen, Sumaiya Tabassum Nimi, M. Sohel Rahman, S. Hasan Arshad, John W. Holloway, Faisal I. Rezwan
2020 Methods and Protocols  
years from feature selected predictor variables (based on mutual information) and AA changes between the two time points.  ...  The best models were ridge regression (R2 = 75.21% ± 7.42%; RMSE = 0.3768 ± 0.0653) and elastic net regression (R2 = 75.38% ± 6.98%; RMSE = 0.445 ± 0.069) for FEV1 and FVC, respectively.  ...  Lung development is a continuous process from childhood to adolescence [18].  ... 
doi:10.3390/mps3040077 pmid:33182250 fatcat:7wjbij62rrcmlpvwmar55gpmra
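The ridge step of such a pipeline has a simple closed form. The sketch below shows only that generic step — not the paper's mutual-information feature selection or cross-validation — and the regularization strength `lam` is a placeholder:

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X^T X + lam * I)^(-1) X^T y.
    lam > 0 shrinks the coefficients and stabilizes the solve when
    predictors (e.g. many methylation features) are correlated."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Orthonormal toy design: X^T X = I, so w = (1 + lam)^(-1) X^T y.
w = ridge_fit([[1.0, 0.0], [0.0, 1.0]], [1.0, 2.0], lam=1.0)
print(w)  # -> [0.5 1. ]
```

In a full pipeline, features would first be ranked by mutual information with the target and `lam` chosen by cross-validation, as the abstract describes.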

Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection [article]

Heng Zhang, Elisa Fromont, Sébastien Lefevre, Bruno Avignon
2020 arXiv   pre-print
the simplicity of the proposed method, our experiments with different state-of-the-art deep learning architectures on PASCAL VOC and MS COCO datasets demonstrate the effectiveness and generality of our Mutual  ...  Dealing with such contradictory labels, as we do with Mutual Guidance, does not harm the training process.  ...  On each pixel of feature maps, it classifies the category of this sample point and regresses the four distances to the target bounding box borders.  ... 
arXiv:2009.14085v1 fatcat:lc6mo6wgbzdtdbnqe3lekznj7a

Mutual Information Pre-processing Based Broken-stick Linear Regression Technique for Web User Behaviour Pattern Mining

Gokulapriya Raman (CHRIST (Deemed to be University)), Ganesh Raj (CHRIST (Deemed to be University))
2021 International Journal of Intelligent Engineering and Systems  
A Mutual Information Pre-processing based Broken-Stick Linear Regression (MIP-BSLR) technique is proposed for refining the performance of web user behaviour pattern mining with higher accuracy.  ...  Then, the Mutual Information based Pre-processing (MI-P) method is applied to compute the mutual dependence between the two web patterns.  ...  This is owing to the application of Mutual Information based Pre-processing and Broken-Stick Linear Regression Analysis in the proposed MIP-BSLR technique.  ... 
doi:10.22266/ijies2021.0228.24 fatcat:zl7ikga6z5g6nifzwhhy2ubwra

Wind speed forecast using random forest learning method [article]

G. V. Drisya, Valsaraj P., K. Asokan, K. Satheesh Kumar
2022 arXiv   pre-print
The computed values of mutual information and auto-correlation show that wind speed values depend on the past data up to 12 hours.  ...  In this paper, we suggest the time series machine learning approach called random forest regression for predicting wind speed variations.  ...  As can be seen from Fig. 1, mutual information computed for various delays vanishes by 72 data points (12 hours).  ... 
arXiv:2203.14909v1 fatcat:yia7ps2dmveb3muv5p627eah3y
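The lag analysis described — finding how far back past values still inform a forecast — can be sketched with a sample autocorrelation function. The paper uses mutual information analogously; white noise is used below purely as a sanity check, not as a wind-speed model:

```python
import numpy as np

def autocorr(x, max_lag):
    """Sample autocorrelation for lags 0..max_lag. The lag at which it
    decays toward zero bounds how many past values are worth feeding
    to a forecaster such as a random forest."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:len(x) - k], x[k:]) / denom
                     for k in range(max_lag + 1)])

# White noise decorrelates immediately; acf[0] is always exactly 1.
rng = np.random.default_rng(1)
acf = autocorr(rng.normal(size=5000), 5)
```

For wind-speed data the authors report that the analogous mutual-information curve vanishes only after 72 samples (12 hours), which fixes the input window length.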

Active learning for sparse bayesian multilabel classification

Deepak Vasisht, Andreas Damianou, Manik Varma, Ashish Kapoor
2014 Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '14  
We focus on the real-world scenario where the average number of positive (relevant) labels per data point is small leading to positive label sparsity.  ...  The benefit of this alternate inference scheme is that it enables a natural approximation of the mutual information objective.  ...  Furthermore, by performing inference in a sparse Gaussian Process regression model, we can leverage the mutual information guarantees to provably show that the entire subset of data points selected for  ... 
doi:10.1145/2623330.2623759 dblp:conf/kdd/VasishtDVK14 fatcat:rzzhiyg3pjcsneii5azvrnxyam

Decoding of Factorial Experimental Design Models Implemented in Production Process

Adham Mohammed Alnadish, Mohamad Yusri Aman, Herda Yati Binti Katman, Mohd Rasdan Ibrahim
2022 Computers Materials & Continua  
After calculating the regression model, its variables must be returned to their original values for the model to be easily recognized and represented.  ...  Models without and with the mutual influence of independent variables differ. The encoding and decoding procedure on a model with two independent first-order parameters is presented in detail.  ...  Table 2: Regression coefficients for the proposed models (first-order model without mutual influence; first-order model with mutual influence; second-order model with mutual influence)  ... 
doi:10.32604/cmc.2022.021642 fatcat:wyad2oymbfgoxgbk77fzosn3mi

Non-Methane Hydrocarbons Emission During The Photocopying Process

Kiurski S. Jelena, Aksentijević M. Snežana, Kecić S. Vesna, Oros B. Ivana
2015 Zenodo  
The obtained regression equations make it possible to measure the quantitative agreement between the variables and thus obtain more accurate knowledge of their mutual relations.  ...  Using a multiple linear regression model and the software package STATISTICA 10, the concentrations of occupational hazards and microclimate parameters were mutually correlated.  ...  A multiple linear regression analysis was performed to predict and determine the mutual correlation between the dependent variable (non-methane hydrocarbons concentration) and independent variables (microclimate  ... 
doi:10.5281/zenodo.1106248 fatcat:o6zr7s4yajggth3k3xkbj4fmle

Investigating the utility of clinical outcome-guided mutual information network in network-based Cox regression

Hyun-hwan Jeong, So Kim, Kyubum Wee, Kyung-Ah Sohn
2015 BMC Systems Biology  
Conclusions: We found that the alternative outcome-guided mutual information network further improved the prediction power of the network-based Cox regression.  ...  We measure the strength of this association using mutual information between the gene pair and the clinical outcome.  ...  [table of enriched GO biological-process terms: chitin catabolic process; chitin metabolic process (GO:0006030); glucosamine-containing compound catabolic process (GO:1901072)]  ... 
doi:10.1186/1752-0509-9-s1-s8 pmid:25708115 pmcid:PMC4331683 fatcat:w622fi6ryrcu3cro4ktw7uyfaq

Sparse Model Identification Using a Forward Orthogonal Regression Algorithm Aided by Mutual Information

Stephen A. Billings, Hua-Liang Wei
2007 IEEE Transactions on Neural Networks  
A new forward orthogonal regression algorithm, with mutual information interference, is proposed for sparse model selection and parameter estimation.  ...  A sparse representation, with satisfactory approximation accuracy, is usually desirable in any nonlinear system identification and signal processing problem.  ...  approximation and signal processing.  ... 
doi:10.1109/tnn.2006.886356 pmid:17278482 fatcat:7zwtrqcisngmldmzwsdo7kasg4
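The greedy selection loop underlying forward regression can be sketched as below. This is a matching-pursuit-style simplification: at each step, pick the candidate term whose least-squares fit most reduces the residual energy, then deflate the residual. The paper's algorithm additionally orthogonalizes the selected terms and uses a mutual-information criterion, which this sketch omits:

```python
import numpy as np

def forward_select(P, y, n_terms):
    """Greedy forward term selection over the columns of a candidate
    regressor matrix P. Returns the indices of the chosen terms."""
    P = np.asarray(P, dtype=float)
    residual = np.asarray(y, dtype=float).copy()
    selected = []
    for _ in range(n_terms):
        best_j, best_red = -1, -1.0
        for j in range(P.shape[1]):
            if j in selected:
                continue
            p = P[:, j]
            g = (p @ residual) / (p @ p)   # least-squares coefficient for this term
            red = g * g * (p @ p)          # residual-energy reduction it achieves
            if red > best_red:
                best_j, best_red = j, red
        p = P[:, best_j]
        residual = residual - ((p @ residual) / (p @ p)) * p  # deflate residual
        selected.append(best_j)
    return selected
```

Because each step is a one-dimensional least-squares fit, the loop yields a sparse model term-by-term, which is the property the abstract emphasizes for nonlinear system identification.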