Fused inverse regression with multi-dimensional responses
2021
Communications for Statistical Applications and Methods
Sufficient dimension reduction provides effective tools for reducing the dimension of the predictors, but there are few sufficient dimension reduction methodologies for multivariate regression. ...
The proposed approaches are robust to the number of clusters or slices and improve the estimation results over existing methods by fusing many kernel matrices. ...
In the regression of Y ∈ R^u | X ∈ R^p, sufficient dimension reduction (SDR) seeks to replace the original p-dimensional predictor X with a lower-dimensional predictor η^T X without loss of information on ...
doi:10.29220/csam.2021.28.3.267
fatcat:taf6ty4ln5defjjd47uwu4e4d4
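The entry above fuses kernel matrices built from several slicings of a multivariate response. The paper's estimator is not reproduced here; the following is a minimal sketch, assuming slices are formed by k-means on Y and that classical sliced inverse regression kernels are simply averaged over several cluster counts (the function name fused_sir and the cluster counts are illustrative).

    import numpy as np
    from sklearn.cluster import KMeans

    def fused_sir(X, Y, d=2, n_slices_list=(3, 5, 8), random_state=0):
        """Fuse SIR kernel matrices built from several k-means slicings of a
        multivariate response Y."""
        n, p = X.shape
        Xc = X - X.mean(axis=0)
        Sigma = np.cov(Xc, rowvar=False)
        # Inverse square root of the predictor covariance (standardization).
        evals, evecs = np.linalg.eigh(Sigma)
        Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
        Z = Xc @ Sigma_inv_sqrt

        M_fused = np.zeros((p, p))
        for H in n_slices_list:
            labels = KMeans(n_clusters=H, n_init=10,
                            random_state=random_state).fit_predict(Y)
            M = np.zeros((p, p))
            for h in range(H):
                idx = labels == h
                ph = idx.mean()                  # slice proportion
                mh = Z[idx].mean(axis=0)         # slice mean of standardized X
                M += ph * np.outer(mh, mh)
            M_fused += M / len(n_slices_list)

        # Leading eigenvectors of the fused kernel, mapped back to the X scale.
        w, v = np.linalg.eigh(M_fused)
        return Sigma_inv_sqrt @ v[:, ::-1][:, :d]   # columns span the estimate

Averaging over several slicings is what makes such an estimate less sensitive to any single choice of the number of clusters.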
Clustering Algorithm for Time Series with Similar Shapes
2018
KSII Transactions on Internet and Information Systems
The reason for such a problem is that the existing algorithms do not take into account limits on the size of the generated clusters and use a dimension-reduction method whose information loss is ...
In the clustering step, we use density-based spatial clustering of applications with noise (DBSCAN) to create a precluster. ...
[1] used symbolic aggregate approximation (SAX) [7] as a dimension-reduction method for time series data with high-dimensional characteristics. ...
doi:10.3837/tiis.2018.07.008
fatcat:i4xhadnfdvaolha2unqqx66nmm
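The entry above pre-clusters time series with DBSCAN after reducing them with SAX. Below is a minimal sketch of such a pipeline, assuming a plain PAA-based SAX with Gaussian breakpoints and Euclidean DBSCAN over the symbol vectors; the eps and min_samples values are illustrative and this is not the paper's algorithm.

    import numpy as np
    from scipy.stats import norm
    from sklearn.cluster import DBSCAN

    def sax_transform(series, n_segments=8, alphabet_size=4):
        """Piecewise aggregate approximation followed by SAX symbolization."""
        x = (series - series.mean()) / (series.std() + 1e-12)   # z-normalize
        paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])
        # Breakpoints splitting the standard normal into equiprobable bins.
        breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
        return np.digitize(paa, breakpoints)                    # integer symbols

    def precluster(time_series, eps=1.5, min_samples=3):
        """Reduce each series with SAX, then pre-cluster with DBSCAN."""
        sax_codes = np.array([sax_transform(s) for s in time_series])
        return DBSCAN(eps=eps, min_samples=min_samples).fit_predict(sax_codes)

    # Example: three noisy sine waves and three noisy ramps.
    rng = np.random.default_rng(0)
    t = np.linspace(0, 4 * np.pi, 200)
    data = [np.sin(t) + 0.1 * rng.standard_normal(t.size) for _ in range(3)] + \
           [np.linspace(0, 1, t.size) + 0.1 * rng.standard_normal(t.size) for _ in range(3)]
    print(precluster(data))   # label -1 marks DBSCAN noise points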
On hierarchical clustering in sufficient dimension reduction
2020
Communications for Statistical Applications and Methods
The K-means clustering algorithm has had successful application in sufficient dimension reduction. ...
The hierarchical clustering algorithm has not yet been studied in a sufficient dimension reduction context. ...
Acknowledgements For Chaeyeon Yoo, ...
doi:10.29220/csam.2020.27.4.431
fatcat:kanywqfyujey3ajxik7543yiqy
A minimum discrepancy approach to multivariate dimension reduction via k-means inverse regression
2009
Statistics and its Interface
We proposed a new method to estimate the intra-cluster adjusted central subspace for regressions with multivariate responses. ...
Our method was designed to recover the intra-cluster information and outperformed the previous method with respect to estimation accuracy on both the central subspace and its dimension. ...
sufficient predictors from the four marginal dimension reduction models. ...
doi:10.4310/sii.2009.v2.n4.a11
fatcat:ekgytmy3sfbw7o6v2w6dbyn64e
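The entry above reports estimation accuracy on the central subspace. One common way to quantify such accuracy (not necessarily the paper's exact criterion) is the Frobenius distance between the projection matrices onto the true and estimated subspaces; a minimal sketch:

    import numpy as np

    def projection_distance(B1, B2):
        """Frobenius distance between the projections onto the column spaces
        of B1 and B2; 0 means the subspaces coincide."""
        Q1, _ = np.linalg.qr(B1)
        Q2, _ = np.linalg.qr(B2)
        return np.linalg.norm(Q1 @ Q1.T - Q2 @ Q2.T, ord="fro")

    # Example: compare an estimated basis with a true basis in R^6.
    rng = np.random.default_rng(1)
    B_true = np.eye(6)[:, :2]                      # span of first two coordinates
    B_hat = B_true + 0.05 * rng.standard_normal((6, 2))
    print(projection_distance(B_true, B_hat))      # small value: close subspaces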
Comparing K-Value Estimation for Categorical and Numeric Data Clustering
2010
International Journal of Computer Applications
The Gmeans algorithm is based on a statistical test for the hypothesis that a subset of data follows a Gaussian distribution. ...
We used an improved algorithm for learning k while clustering categorical data. The Gaussian-means (G-means) clustering algorithm is applied within the k-means paradigm and works well for categorical features. ...
Both of these tests are one-dimensional. Since we have a high-dimensional dataset, we first reduce the dimensions with a dimension-reduction method, learning the true dimension via PCA. ...
doi:10.5120/1565-1875
fatcat:zsuibcmoebafpcrttnhkvgcdz4
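The G-means idea mentioned above tests whether each cluster looks Gaussian before deciding to split it. The sketch below illustrates that test, assuming a one-dimensional Anderson-Darling check along each cluster's first principal component; the helper names and thresholds are illustrative rather than the paper's implementation.

    import numpy as np
    from scipy.stats import anderson
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    def looks_gaussian(cluster_points, significance_index=2):
        """G-means style check: project a cluster onto its first principal
        component and run an Anderson-Darling test for normality."""
        if len(cluster_points) < 8:
            return True                      # too small to split reliably
        projected = PCA(n_components=1).fit_transform(cluster_points).ravel()
        result = anderson(projected, dist="norm")
        # Compare the statistic to a critical value (index 2 is the 5% level).
        return result.statistic < result.critical_values[significance_index]

    def gmeans_step(X, labels):
        """One refinement pass: split every cluster that fails the test."""
        new_labels = labels.copy()
        next_id = labels.max() + 1
        for c in np.unique(labels):
            pts = X[labels == c]
            if not looks_gaussian(pts):
                sub = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pts)
                new_labels[np.flatnonzero(labels == c)[sub == 1]] = next_id
                next_id += 1
        return new_labels

Repeating such a pass until every cluster passes the test is what lets the number of clusters k be learned rather than fixed in advance.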
Page 3241 of Mathematical Reviews Vol. , Issue 2004d
[page]
2004
Mathematical Reviews
Globerson, Amir (IL-HEBR-CSE; Jerusalem);
Tishby, Naftali (IL-HEBR-INC; Jerusalem)
Sufficient dimensionality reduction. ...
In this paper we introduce an information theoretic nonlinear method for finding a most informative such dimension reduction. ...
Nonlinear Signal Sources Estimation Based on Nonlinear Dimension Reduction
2018
DEStech Transactions on Engineering and Technology Research
In this paper, we propose an estimation method by combining the ICA and nonlinear dimension reduction. ...
Our experimental results show that the proposed algorithm outperforms the comparison methods under nonlinear conditions. ...
Dimensionality reduction aims to transform high-dimensional data to a low-dimensional feature space. ...
doi:10.12783/dtetr/apop2017/18734
fatcat:mx3y6zqhpras5oewkpc2eadi7u
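The entry above combines ICA with nonlinear dimension reduction. A minimal sketch of that combination follows, assuming off-the-shelf kernel PCA followed by FastICA on a toy nonlinearly mixed signal; the mixing matrix, kernel choice, and gamma are illustrative and this is not the paper's estimator.

    import numpy as np
    from sklearn.decomposition import FastICA, KernelPCA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)

    # Two latent sources, mixed and then passed through a mild nonlinearity.
    S = np.column_stack([np.sin(3 * t), np.sign(np.sin(5 * t))])
    A = np.array([[1.0, 0.6], [0.4, 1.0], [0.7, 0.3]])
    X = np.tanh(S @ A.T) + 0.02 * rng.standard_normal((t.size, 3))

    # Step 1: nonlinear dimension reduction back to the latent dimensionality.
    Z = KernelPCA(n_components=2, kernel="rbf", gamma=2.0).fit_transform(X)

    # Step 2: ICA on the reduced representation to separate the sources.
    S_hat = FastICA(n_components=2, random_state=0).fit_transform(Z)
    print(S_hat.shape)   # (2000, 2) estimated sources, up to scale and order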
Sufficient Component Analysis for Supervised Dimension Reduction
[article]
2011
arXiv
pre-print
The purpose of sufficient dimension reduction (SDR) is to find the low-dimensional subspace of input features that is sufficient for predicting output values. ...
Through large-scale experiments on real-world image classification and audio tagging problems, the proposed method is shown to compare favorably with existing dimension reduction approaches. ...
Kenji Fukumizu for providing us the KDR code and Prof. Taiji Suzuki for his valuable comments. MY was supported by the JST PRESTO program. GN was supported by the MEXT scholarship. ...
arXiv:1103.4998v1
fatcat:dwgj5szbzncxxlvwkv5wgtgkoe
Comparisons of Non-Gaussian Statistical Models in DNA Methylation Analysis
2014
International Journal of Molecular Sciences
Comparisons and analysis of different dimension reduction strategies and unsupervised clustering methods are presented. ...
Afterwards, non-Gaussian statistical model-based unsupervised clustering strategies are applied to cluster the data. ...
Acknowledgments The authors would like to thank the editor for organizing the review process and thank the anonymous reviewers for their efforts in reviewing this manuscript and providing fruitful suggestions ...
doi:10.3390/ijms150610835
pmid:24937687
pmcid:PMC4100184
fatcat:mjukdxenc5an7a37uzruntfjs4
Speaker diarization of broadcast streams using two-stage clustering based on i-vectors and cosine distance scoring
2012
2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
In this paper we present our system for speaker diarization of broadcast news based on recent advances in the speaker recognition field. ...
Finally, two-stage clustering employing BIC-based clustering to pre-cluster segments in the first stage is examined and shown to yield further performance improvement. ...
Fig. 4(a) shows the effect of different LDA dimension reductions for systems operating with total variability spaces of dimensions of 300 and 400. ...
doi:10.1109/icassp.2012.6288843
dblp:conf/icassp/SilovskyP12
fatcat:vmj2pkrlw5fj7jxbzsu5qie6cq
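The diarization system above reduces i-vectors with LDA and scores segments by cosine distance before clustering. The sketch below illustrates that second stage only, assuming hypothetical i-vector arrays and an LDA projection trained on labeled development data; it is not the paper's system.

    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.metrics.pairwise import cosine_similarity

    def cluster_segments(ivectors, dev_ivectors, dev_speakers, n_speakers, lda_dim=10):
        """LDA-reduce segment i-vectors, then cluster them by cosine distance."""
        # lda_dim must be smaller than the number of development speakers.
        lda = LinearDiscriminantAnalysis(n_components=lda_dim)
        reduced = lda.fit(dev_ivectors, dev_speakers).transform(ivectors)
        dist = 1.0 - cosine_similarity(reduced)          # cosine distance matrix
        Z = linkage(squareform(dist, checks=False), method="average")
        return fcluster(Z, t=n_speakers, criterion="maxclust")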
Understanding Reuse, Performance, and Hardware Cost of DNN Dataflows: A Data-Centric Approach Using MAESTRO
[article]
2020
arXiv
pre-print
execution time and energy efficiency for a DNN model and hardware configuration. ...
We codify this analysis into an analytical cost model, MAESTRO (Modeling Accelerator Efficiency via Spatio-Temporal Reuse and Occupancy), that estimates various cost-benefit tradeoffs of a dataflow including ...
ACKNOWLEDGEMENT We thank Joel Emer for insightful advice and constructive comments to improve this work; Vivienne Sze and Yu-Hsin Chen for their insights and taxonomy that motivated this work. ...
arXiv:1805.02566v6
fatcat:3656k7gkcbfbxewgcebz7v2wrq
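MAESTRO itself is a detailed analytical model; the toy sketch below is not its cost model, but it illustrates the kind of arithmetic such a model performs, assuming a stride-1 convolution layer and two extreme reuse scenarios (the conv_costs helper and all numbers are illustrative).

    def conv_costs(N, K, C, R, S, P, Q, bytes_per_element=2):
        """MAC count and two memory-traffic bounds (bytes) for a conv layer with
        N inputs, K filters, C channels, RxS kernels and PxQ outputs."""
        macs = N * K * C * R * S * P * Q
        weights = K * C * R * S
        inputs = N * C * (P + R - 1) * (Q + S - 1)   # stride 1, no padding
        outputs = N * K * P * Q
        # Best case: every operand is fetched from DRAM exactly once.
        traffic_full_reuse = (weights + inputs + outputs) * bytes_per_element
        # Worst case: no on-chip reuse, every MAC refetches its operands.
        traffic_no_reuse = 3 * macs * bytes_per_element
        return macs, traffic_full_reuse, traffic_no_reuse

    macs, best, worst = conv_costs(N=1, K=64, C=64, R=3, S=3, P=56, Q=56)
    print(f"MACs: {macs:,}  traffic best: {best:,} B  worst: {worst:,} B")

The gap between the two traffic bounds is exactly what a dataflow (and a model like MAESTRO that analyzes its reuse) determines.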
Cause-Effect Deep Information Bottleneck For Systematically Missing Covariates
[article]
2020
arXiv
pre-print
Based on the sufficiently reduced covariate, we transfer the relevant information to cases where data is missing at test time, allowing us to reliably and accurately estimate the effects of an intervention ...
Estimating the causal effects of an intervention from high-dimensional observational data is difficult due to the presence of confounding. ...
Overall, we can perform a sufficient reduction of the high-dimensional covariate information to between 4 and 6 dimensions while accurately estimating Y. ...
arXiv:1807.02326v3
fatcat:o64pbwr4kbenlbpb3f5xtfrnhq
Evaluating Dimensionality Reduction Techniques For Visual Category Recognition Using Rényi Entropy
2011
Zenodo
The value of the lower dimension is based on estimation of intrinsic dimensionality using three methods for all categories. ...
INTRINSIC DIMENSIONALITY ESTIMATION: The choice of the dimension of the lower-dimensional subspace is based on the intrinsic dimensionality of the visual category. ...
doi:10.5281/zenodo.42618
fatcat:leygi75v2jgqvkv723uikhrd4e
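The entry above chooses the target dimension from an estimate of the intrinsic dimensionality. The paper evaluates Rényi-entropy-based estimators; the sketch below uses a much simpler PCA explained-variance proxy purely to illustrate the idea (the 95% threshold is an assumption).

    import numpy as np
    from sklearn.decomposition import PCA

    def pca_intrinsic_dim(X, variance_threshold=0.95):
        """Smallest number of principal components whose cumulative explained
        variance reaches the threshold; a crude global intrinsic-dimension proxy."""
        ratios = PCA().fit(X).explained_variance_ratio_
        return int(np.searchsorted(np.cumsum(ratios), variance_threshold) + 1)

    # Example: 3-dimensional data embedded in a 20-dimensional space plus noise.
    rng = np.random.default_rng(0)
    latent = rng.standard_normal((500, 3))
    embed = rng.standard_normal((3, 20))
    X = latent @ embed + 0.01 * rng.standard_normal((500, 20))
    print(pca_intrinsic_dim(X))   # typically prints 3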
Poisson factor models with applications to non-normalized microRNA profiling
2013
Computer applications in the biosciences : CABIOS
The method is shown to outperform several other normalization and dimension reduction methods in a simulation study. ...
We develop an efficient algorithm for estimating the Poisson factor model, entitled Poisson Singular Value Decomposition with Offset (PSVDOS). ...
See, among others, Anders and Huber (2010) and Robinson and Oshlack (2010) in the context of supervised clustering. ...
doi:10.1093/bioinformatics/btt091
pmid:23428639
pmcid:PMC3634185
fatcat:omt3povndbgfhgvrkh5w7okf6y
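The entry above fits a Poisson factor model with per-sample offsets (PSVDOS). That algorithm is not reproduced here; the sketch below uses KL-divergence NMF, the standard off-the-shelf Poisson-likelihood factorization, with a crude library-size rescaling standing in for the paper's explicit offset term.

    import numpy as np
    from sklearn.decomposition import NMF

    # Simulated count matrix (samples x miRNAs) with sample-specific offsets
    # (sequencing depths) that a normalization step has to absorb.
    rng = np.random.default_rng(0)
    W_true = rng.gamma(2.0, 1.0, size=(60, 3))
    H_true = rng.gamma(2.0, 1.0, size=(3, 200))
    depth = rng.uniform(0.5, 2.0, size=(60, 1))          # per-sample offset
    counts = rng.poisson(depth * (W_true @ H_true))

    # Crude offset handling: rescale rows by their relative library size.
    scaled = counts / (counts.sum(axis=1, keepdims=True) / counts.sum(axis=1).mean())

    # KL-divergence NMF corresponds to a Poisson-likelihood factorization.
    nmf = NMF(n_components=3, beta_loss="kullback-leibler", solver="mu",
              max_iter=500, random_state=0)
    W_hat = nmf.fit_transform(scaled)
    print(W_hat.shape)   # (60, 3) low-dimensional sample representation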
On Local Intrinsic Dimension Estimation and Its Applications
2010
IEEE Transactions on Signal Processing
There has been much work done on estimating the global dimension of a data set, typically for the purposes of dimensionality reduction. ...
In this paper, we present multiple novel applications for local intrinsic dimension estimation. ...
Finn and the Department of Pathology, University of Michigan, for the cytometry data and diagnoses. They thank the reviewers of this paper for their significant contributions. ...
doi:10.1109/tsp.2009.2031722
fatcat:t7bfczytszcohivvjsvwjrwhny
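The entry above estimates intrinsic dimension locally rather than globally. A minimal sketch of one common local estimator, the Levina-Bickel nearest-neighbor MLE, follows; the neighborhood size k and the toy manifold are assumptions, and the paper's applications are not reproduced.

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def local_intrinsic_dim(X, k=10):
        """Levina-Bickel style MLE of the intrinsic dimension around each point,
        computed from ratios of k-nearest-neighbor distances."""
        nbrs = NearestNeighbors(n_neighbors=k + 1).fit(X)
        dist, _ = nbrs.kneighbors(X)
        dist = dist[:, 1:]                          # drop the zero self-distance
        log_ratios = np.log(dist[:, -1:] / dist[:, :-1])
        return (k - 1) / log_ratios.sum(axis=1)     # one estimate per point

    # Example: points on a 2-D surface embedded in 10 dimensions.
    rng = np.random.default_rng(0)
    u, v = rng.uniform(size=(2, 1000))
    X = np.column_stack([u, v, np.sin(u), u * v] + [np.zeros(1000)] * 6)
    print(local_intrinsic_dim(X).mean())            # close to 2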