26,240 Hits in 3.4 sec

Robust Procedures for Estimating and Testing in the Framework of Divergence Measures

Leandro Pardo, Nirian Martín
2021 Entropy  
The approach for estimating and testing based on divergence measures has become, in the last 30 years, a very popular technique not only in the field of statistics, but also in other areas, such as machine  ...  Some recent proposals for robust model selection are criteria based on divergences and minimum divergence estimators.  ...  In "Robust Model Selection Criteria Based on Pseudodistances" by Toma et al. [34], a new class of robust model selection criteria is introduced.  ... 
doi:10.3390/e23040430 pmid:33917362 fatcat:ud3szi5mbnhznodwwvog245bva
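
For orientation, the minimum divergence estimators surveyed in entries like this one share a common form. As a generic sketch (notation below is not taken from the survey itself): given the empirical distribution, or a smoothed density estimate, \hat{g} and a parametric family \{f_\theta\}, the estimator is

\hat{\theta} = \arg\min_{\theta \in \Theta} d(\hat{g}, f_\theta),

where d(\cdot,\cdot) is the chosen statistical divergence; the trade-off between robustness and efficiency is governed by the choice of d.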

The Minimum Density Power Divergence Approach in Building Robust Regression Models

Alessandra Durio, Ennio Davide Isaia
2011 Informatica  
in the data and therefore to choose the best tuning constant for the Minimum Density Power Divergence estimators.  ...  In this paper we investigate the use of the Minimum Density Power Divergence criterion as a practical tool for parametric regression models building.  ...  Authors are indebted to the coordinating editor and to the anonymous referees for carefully reading the manuscript and for their many important remarks and suggestions.  ... 
doi:10.15388/informatica.2011.313 fatcat:baxbge5oabdpvgymzcqmhm22fu
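
As background for the criterion named in this entry, the density power divergence of Basu, Harris, Hjort and Jones (1998) between densities g and f_\theta is, in its standard form (generic notation, not copied from the paper),

d_\alpha(g, f_\theta) = \int \Big\{ f_\theta^{1+\alpha}(x) - \big(1 + \tfrac{1}{\alpha}\big)\, g(x)\, f_\theta^{\alpha}(x) + \tfrac{1}{\alpha}\, g^{1+\alpha}(x) \Big\}\, dx, \qquad \alpha > 0,

and the minimum density power divergence estimator minimizes the empirical objective

H_n(\theta) = \int f_\theta^{1+\alpha}(x)\, dx - \big(1 + \tfrac{1}{\alpha}\big) \frac{1}{n} \sum_{i=1}^{n} f_\theta^{\alpha}(X_i),

which recovers the maximum likelihood estimator as the tuning constant \alpha \to 0 and becomes increasingly resistant to outliers as \alpha grows.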

Robust statistical modeling of monthly rainfall: The minimum density power divergence approach [article]

Arnab Hazra, Abhik Ghosh
2022 arXiv   pre-print
In this paper, we discuss a robust parameter estimation approach based on the minimum density power divergence estimators (MDPDEs) which provides a class of estimates through a tuning parameter including  ...  The underlying tuning parameter controls the trade-offs between efficiency and robustness of the resulting inference; we also discuss a procedure for data-driven optimal selection of this tuning parameter  ...  One downside of the RIC is that it depends on the tuning parameter α.  ... 
arXiv:1909.08035v2 fatcat:fod4xu2utvapxbhuit6mvvoasq

Robust estimation in the normal mixture model

Hironori Fujisawa, Shinto Eguchi
2006 Journal of Statistical Planning and Inference  
A relationship between robustness and efficiency is investigated and an adaptive method for selecting the tuning parameter of the modified likelihood is suggested, based on the robust model selection criterion  ...  Numerical studies are presented to evaluate the performance. The robust method is applied to single nucleotide polymorphism typing for the purpose of outlier detection and clustering.  ...  This work was supported by Grant-in-Aid for Scientific Research of the Ministry of Education, Culture, Sports, Science and Technology.  ... 
doi:10.1016/j.jspi.2005.03.008 fatcat:zhnluyrn7vafzdoxokkasfpobm

Information Complexity Criterion for Model Selection in Robust Regression Using A New Robust Penalty Term [article]

Esra Pamukçu, Mehmet Niyazi Çankaya
2020 arXiv   pre-print
In this paper, we derive a new tool for model selection in robust regression. We introduce a new definition of relative entropy based on objective functions.  ...  The log-likelihood function, as the lack-of-fit term, and a specified penalty term are used as the two parts of a model selection criterion.  ...  Hamparsum Bozdogan for his constructive comments.  ... 
arXiv:2012.02468v1 fatcat:audblercknen3al3ykcsycr4re
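
To make the two-part structure mentioned in the snippet concrete, a generic information criterion takes the schematic form (a template only; the paper's new robust penalty term is not reproduced here)

IC(M) = -2 \log L(\hat{\theta}_M) + C(M),

where the first term measures the lack of fit of model M and C(M) penalizes model complexity; the criterion discussed in this entry pairs the log-likelihood lack-of-fit term with a penalty term built to be robust.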

Adaptively Robust Geographically Weighted Regression [article]

Shonosuke Sugasawa, Daisuke Murakami
2021 arXiv   pre-print
We embed the standard geographically weighted regression in a robust objective function based on the γ-divergence.  ...  A novel feature of the proposed approach is that two tuning parameters that control robustness and spatial smoothness are automatically tuned in a data-dependent manner.  ...  Acknowledgements This work was supported by the Japan Society for the Promotion of Science (KAKENHI) Grant Numbers 18H03628, 20H00080, and 21H00699.  ... 
arXiv:2106.15811v3 fatcat:raalvqginfbjfomvi7ck3tdmaa
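
For reference, one common i.i.d. form of the γ-divergence (γ-cross-entropy) objective of Fujisawa and Eguchi is (a sketch in standard notation; the geographically weighted version in this paper additionally carries spatial kernel weights, which are not shown)

\hat{\theta}_\gamma = \arg\min_{\theta} \left[ -\frac{1}{\gamma} \log\Big\{ \frac{1}{n} \sum_{i=1}^{n} f_\theta(X_i)^{\gamma} \Big\} + \frac{1}{1+\gamma} \log \int f_\theta(x)^{1+\gamma}\, dx \right],

where the tuning parameter \gamma > 0 controls how strongly observations with low model density are downweighted.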

Robust fitting for generalized additive models for location, scale and shape

William H. Aeberhard, Eva Cantoni, Giampiero Marra, Rosalba Radice
2021 Statistics and computing  
The validity of estimation and smoothing parameter selection for the wide class of generalized additive models for location, scale and shape (GAMLSS) relies on the correct specification of a likelihood  ...  The latter allows for automatic smoothing parameter selection and is particularly advantageous in applications with multiple smoothing parameters.  ...  Acknowledgements The authors thank three anonymous reviewers for relevant questions and suggestions which improved the quality of the paper.  ... 
doi:10.1007/s11222-020-09979-x fatcat:3mjmesdsf5cojeeesoz736gqj4

A comparison of related density-based minimum divergence estimators

M. C. Jones
2001 Biometrika  
This paper compares the minimum divergence estimator of Basu, Harris, Hjort and Jones (1998) to a competing minimum divergence estimator which turns out to be equivalent to a method proposed from a different  ...  Both methods can be applied for any parametric model, contain maximum likelihood as a special case, and can be extended to the context of regression situations.  ...  ACKNOWLEDGEMENTS The authors would like to thank Michele Basseville for references and Brent Burch and Roy St Laurent for helpful conversations.  ... 
doi:10.1093/biomet/88.3.865 fatcat:optppc7y7zdpno66h3pdgcyvrm

Model Selection in a Composite Likelihood Framework Based on Density Power Divergence

Elena Castilla, Nirian Martín, Leandro Pardo, Konstantinos Zografos
2020 Entropy  
This paper presents a model selection criterion in a composite likelihood framework based on density power divergence measures and on the composite minimum density power divergence estimators, which depends on a tuning parameter α.  ...  Acknowledgments: The authors would like to thank the Editor and Reviewers for taking their precious time to make several valuable comments on the manuscript.  ... 
doi:10.3390/e22030270 pmid:33286044 fatcat:ynd4hmm5xvfobjsiz6wacnhfsu
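
For context, a composite likelihood combines likelihood contributions of low-dimensional marginal or conditional events, typically as (standard definition, not a formula taken from the paper)

c\ell(\theta; y) = \sum_{k=1}^{K} w_k \log L_k(\theta; y), \qquad w_k \ge 0,

and, roughly speaking, the composite minimum density power divergence estimators referred to above are obtained by minimizing a density power divergence built from the composite density rather than the full likelihood, indexed by the tuning parameter \alpha.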

Robustly Leveraging Prior Knowledge in Text Classification [article]

Biao Liu, Minlie Huang
2015 arXiv   pre-print
In this paper, we propose three regularization terms on top of generalized expectation criteria, and conduct extensive experiments to justify the robustness of the proposed methods.  ...  Many approaches have been proposed to formalise a variety of knowledge; however, whether the proposed approach is robust or sensitive to the knowledge supplied to the model has rarely been discussed.  ...  The Influence of λ: We present the influence of λ on the method that incorporates KL divergence in this section. Since we simply set λ = β|K|, we just tune β here.  ... 
arXiv:1503.00841v1 fatcat:qel3cb3745cdzfc7sp76g5zk6q

Assessment of an L-Kurtosis-Based Criterion for Quantile Estimation

M. D. Pandey, P. H. A. J. M. van Gelder, J. K. Vrijling
2001 Journal of hydrologic engineering  
Simulation results indicate that the L-kurtosis criterion can provide quantile estimates that are in good agreement with benchmark estimates obtained from other robust criteria.  ...  The divergence is a comprehensive measure of probabilistic distance used in modern information theory for signal analysis and pattern recognition.  ...  criteria for a wide range of distribution parameters.  ... 
doi:10.1061/(asce)1084-0699(2001)6:4(284) fatcat:5tciiz2lyzgjncfxzkhmrnwi3u
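
As a reminder of the quantity this criterion is built on: if \lambda_r denotes the r-th L-moment of a distribution, the L-kurtosis is the ratio (standard L-moment notation, not taken from the paper)

\tau_4 = \lambda_4 / \lambda_2,

a bounded, comparatively outlier-resistant analogue of conventional kurtosis, which makes it a natural summary for comparing candidate distributions fitted to hydrologic extremes.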

Page 7359 of Mathematical Reviews Vol. , Issue 2001J [page]

2001 Mathematical Reviews  
Such estimated risks then yield new model selection criteria.  ...  Summary: "A sequential probability ratio test is developed for the problem of selecting the best of k multinomial parameter estimation procedures when only one observation per the k estimation procedures  ... 

Domain Adaptation Using Factorized Hidden Layer for Robust Automatic Speech Recognition

Khe Chai Sim, Arun Narayanan, Ananya Misra, Anshuman Tripathi, Golan Pundak, Tara Sainath, Parisa Haghani, Bo Li, Michiel Bacchiani
2018 Interspeech 2018  
In this paper, we consider speech data collected for different applications as separate domains and investigate the robustness of acoustic models trained on multidomain data on unseen domains.  ...  Experimental results on two unseen domains show that FHL is a more effective adaptation method compared to selectively fine-tuning part of the network, without dramatically increasing the model parameters  ...  For selective fine-tuning, we adjust the adaptation complexity by updating only the parameters of the first n LSTM layers.  ... 
doi:10.21437/interspeech.2018-2246 dblp:conf/interspeech/SimNMTPSHLB18 fatcat:e4pm64d74zduzhvkr5lm7t74se

Extreme Precipitation Frequency Analysis Using a Minimum Density Power Divergence Estimator

Yongwon Seo, Junshik Hwang, Byungsoo Kim
2017 Water  
The minimum density power divergence estimator (MDPDE) with the optimal value of a tuning parameter, α, was suggested as an alternative to the maximum likelihood estimator (MLE); its  ...  On the other hand, the MDPDE of the GEV distribution with a positive shape parameter, ξ, does not show its advantage conditionally because the GEV distribution has a heavier right tail than the GUM distribution  ...  Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/w9020081 fatcat:vhqux5ke7vgsze6sfv44w5coee
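
A minimal, self-contained Python sketch of the kind of fit discussed in this entry: minimizing the density power divergence objective for a GEV model at a fixed tuning parameter α. It is an illustration only; the simulated data, starting values, α = 0.3, and optimizer choice are assumptions, not the paper's settings.

import numpy as np
from scipy import stats, integrate, optimize

def dpd_objective(params, x, alpha):
    # Empirical DPD objective: integral of f^(1+alpha) minus (1 + 1/alpha) * mean(f(X_i)^alpha).
    shape, loc, scale = params
    if scale <= 0:
        return np.inf
    dist = stats.genextreme(shape, loc=loc, scale=scale)
    lo, hi = dist.ppf(1e-10), dist.ppf(1.0 - 1e-10)   # near-full support of the candidate GEV
    if not np.all(np.isfinite([lo, hi])):
        return np.inf
    integral, _ = integrate.quad(lambda t: dist.pdf(t) ** (1.0 + alpha), lo, hi)
    return integral - (1.0 + 1.0 / alpha) * np.mean(dist.pdf(x) ** alpha)

rng = np.random.default_rng(0)
sample = stats.genextreme.rvs(-0.1, loc=50.0, scale=10.0, size=200, random_state=rng)
sample = np.append(sample, [500.0, 600.0])            # two gross outliers contaminating the sample
start = (-0.1, np.median(sample), sample.std())       # crude starting values
fit = optimize.minimize(dpd_objective, start, args=(sample, 0.3), method="Nelder-Mead")
print("MDPDE (shape, loc, scale) at alpha = 0.3:", fit.x)

Setting alpha close to 0 in this sketch approaches the maximum likelihood fit, which is far more sensitive to the two injected outliers.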

An Empirical Evaluation of Various Information Gain Criteria for Active Tactile Action Selection for Pose Estimation [article]

Prajval Kumar Murali, Ravinder Dahiya, Mohsen Kaboli
2022 arXiv   pre-print
In this paper, we empirically evaluate various information gain criteria for action selection in the context of object pose estimation.  ...  We find similar performance in terms of pose accuracy with sparse measurements across all the selected criteria.  ...  The active touch action selection is started from the 4th touch onwards. In particular for Rényi divergence, we used an α = 0.3 that was empirically tuned.  ... 
arXiv:2205.04697v1 fatcat:yw2a44vwqnccbac5odefhdouma
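
Since the snippet reports an empirically tuned order α for the Rényi divergence, recall its standard definition (generic form, not specific to this paper):

D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx, \qquad \alpha > 0,\ \alpha \neq 1,

which recovers the Kullback–Leibler divergence in the limit \alpha \to 1.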
Showing results 1 — 15 out of 26,240 results