6,765 Hits in 6.4 sec

AUCseg: An Automatically Unsupervised Clustering Toolbox for 3D-Segmentation of High-Grade Gliomas in Multi-Parametric MR Images

Botao Zhao, Yan Ren, Ziqi Yu, Jinhua Yu, Tingying Peng, Xiao-Yong Zhang
2021 Frontiers in Oncology  
In this work, we propose an automatically unsupervised segmentation toolbox based on the clustering algorithm and morphological processing, named AUCseg.  ...  We performed a multi-faceted evaluation of our toolbox on the BraTS2018 dataset and demonstrated that the whole tumor, tumor core, and enhancing tumor can be automatically segmented using default hyper-parameters  ...  RESULTS Comparison of Clustering Algorithms Using the same default hyper-parameters, we compared tumor segmentation results across different clustering methods.  ... 
doi:10.3389/fonc.2021.679952 fatcat:sjei7br2orepvnmvpw7s6eq3ue
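The snippet pairs a clustering step with morphological post-processing. As an illustration of the latter only (not AUCseg's actual pipeline), here is a minimal NumPy-only binary opening, the kind of operation used to remove small spurious components from a clustered tumour mask; the 4-neighbour structuring element and the toy mask are assumptions:

```python
import numpy as np

def binary_opening(mask, it=1):
    """Minimal 4-neighbour morphological opening (erosion then dilation)."""
    def shift_stack(m):
        p = np.pad(m, 1)  # pad with False so edges erode correctly
        return np.stack([p[1:-1, 1:-1],            # self
                         p[:-2, 1:-1], p[2:, 1:-1],  # up / down neighbours
                         p[1:-1, :-2], p[1:-1, 2:]]) # left / right neighbours
    for _ in range(it):
        mask = shift_stack(mask).all(axis=0)   # erosion
        mask = shift_stack(mask).any(axis=0)   # dilation
    return mask

mask = np.zeros((8, 8), dtype=bool)
mask[2:6, 2:6] = True   # a solid blob survives opening
mask[0, 7] = True       # an isolated pixel is removed
cleaned = binary_opening(mask)
```

Opening preserves the core of large connected regions while deleting speckle noise, which is why it is a common post-processing step after per-voxel clustering.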

Detecting British Columbia Coastal Rainfall Patterns by Clustering Gaussian Processes [article]

Forrest Paton, Paul D. McNicholas
2020 arXiv   pre-print
An approach is developed for clustering multiple processes observed on a comparable interval, based on how similar their underlying covariance kernel is.  ...  Gaussian processes are a generalization of the multivariate normal distribution to function space and, in this paper, they are used to shed light on coastal rainfall patterns in British Columbia (BC).  ...  Ten were generated from an SE covariance kernel with hyper-parameters l = 1 and σ = 1. The remaining 10 GPs were generated from a covariance kernel with hyper-parameters l = 2 and σ = 2.  ... 
arXiv:1812.09758v2 fatcat:heaty3442jfu3onjd577t73xny
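The simulation described in the snippet is easy to reproduce in outline. The sketch below (NumPy, assuming a zero-mean GP on a 1-D grid and a small jitter term for numerical stability) draws the two groups of sample paths from squared-exponential kernels with the stated hyper-parameters:

```python
import numpy as np

def se_kernel(x, l=1.0, sigma=1.0):
    """Squared-exponential (SE) covariance matrix for 1-D inputs x."""
    d = x[:, None] - x[None, :]
    return sigma**2 * np.exp(-0.5 * (d / l) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)

# Two groups of GP draws, as in the snippet's simulation:
# 10 paths with (l=1, sigma=1) and 10 paths with (l=2, sigma=2).
K1 = se_kernel(x, l=1.0, sigma=1.0)
K2 = se_kernel(x, l=2.0, sigma=2.0)
jitter = 1e-8 * np.eye(len(x))  # numerical stabiliser for sampling
group1 = rng.multivariate_normal(np.zeros(len(x)), K1 + jitter, size=10)
group2 = rng.multivariate_normal(np.zeros(len(x)), K2 + jitter, size=10)
```

Clustering the processes then amounts to deciding, for each observed path, which covariance kernel most plausibly generated it.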

An Efficient Technique for Disease Prediction by Using Enhanced Machine Learning Algorithms for Categorical Medical Dataset

Veera Anusuya, V Gomathi
2021 Information Technology and Control  
of Multi-Objective based Ant Colony Optimization (MO-ACO) from the extracted features for improving classification and clustering.  ...  Among the various machine learning procedures, classification and clustering methods play a significant role.  ...  We can also use the Radial Basis Function with Automatic Relevance Determination as a kernel function of Gaussian processes.  ... 
doi:10.5755/j01.itc.50.1.25349 fatcat:wnedbm7p55bupjuwpnef73uvo4
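The snippet mentions an RBF kernel with Automatic Relevance Determination (ARD). A minimal sketch of such a kernel, with one lengthscale per input dimension so that a very large lengthscale effectively switches a feature off (the data and lengthscale values are illustrative assumptions):

```python
import numpy as np

def rbf_ard(A, B, lengthscales):
    """SE/RBF kernel with Automatic Relevance Determination:
    each input dimension is scaled by its own lengthscale."""
    a = A / lengthscales
    b = B / lengthscales
    # ||a_i - b_j||^2 via the usual expansion; clip guards tiny negatives
    sq = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2 * a @ b.T
    return np.exp(-0.5 * np.clip(sq, 0.0, None))

X = np.random.default_rng(0).normal(size=(10, 3))
# Dimension 2 gets a huge lengthscale, so it barely affects similarity.
K = rbf_ard(X, X, np.array([1.0, 1.0, 1e6]))
```

Because the lengthscales are learnable hyper-parameters, fitting them reveals which features are relevant, which is the "automatic relevance determination" the snippet refers to.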

Bayesian optimization assisted unsupervised learning for efficient intra-tumor partitioning in MRI and survival prediction for glioblastoma patients [article]

Yifan Li, Chao Li, Stephen Price, Carola-Bibiane Schönlieb, Xi Chen
2021 arXiv   pre-print
Here we propose a machine learning framework to semi-automatically fine-tune clustering algorithms and quantitatively identify stable sub-regions for reliable clinical survival prediction.  ...  Hyper-parameters are automatically determined at the global minimum of the trained Gaussian Process (GP) surrogate model through Bayesian optimization (BO), alleviating the difficulty of tuning parameters  ...  A powerful GP algorithm requires a carefully designed kernel function and its associated hyper-parameters. See [12] for an overview of GPs and kernel functions.  ... 
arXiv:2012.03115v2 fatcat:obo7vixoszervetboxl6elll4a
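The BO-over-a-GP-surrogate loop the abstract describes can be sketched in a few lines. The toy implementation below is illustrative, not the authors' framework; the SE kernel with fixed lengthscale, the lower-confidence-bound acquisition evaluated on a grid, and the quadratic stand-in for a clustering-quality objective are all assumptions:

```python
import numpy as np

def se(a, b, l=0.3):
    """1-D squared-exponential kernel, unit signal variance."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / l) ** 2)

def gp_posterior(x_obs, y_obs, x_new, noise=1e-6):
    """GP posterior mean/std at x_new given observations (x_obs, y_obs)."""
    K = se(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = se(x_obs, x_new)
    mean = Ks.T @ np.linalg.solve(K, y_obs)
    v = np.linalg.solve(K, Ks)
    var = np.clip(1.0 - np.sum(Ks * v, axis=0), 1e-12, None)
    return mean, np.sqrt(var)

def bayes_opt(f, n_init=3, n_iter=15, seed=0):
    """Minimise f on [0, 1] with a GP surrogate + lower-confidence-bound."""
    rng = np.random.default_rng(seed)
    xs = list(rng.uniform(0, 1, n_init))
    ys = [f(x) for x in xs]
    grid = np.linspace(0, 1, 201)
    for _ in range(n_iter):
        mean, std = gp_posterior(np.array(xs), np.array(ys), grid)
        x_next = grid[np.argmin(mean - 2.0 * std)]  # LCB acquisition
        xs.append(x_next)
        ys.append(f(x_next))
    i = int(np.argmin(ys))
    return xs[i], ys[i]

# A toy objective with its minimum at x = 0.3 stands in for the
# clustering-quality score tuned in the paper.
x_best, y_best = bayes_opt(lambda x: (x - 0.3) ** 2)
```

The surrogate's global minimum over the candidate grid plays the role of the "global minimum of the trained GP surrogate" in the abstract.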

Applying the Possibilistic c-Means Algorithm in Kernel-Induced Spaces

Maurizio Filippone, Francesco Masulli, Stefano Rovetta
2010 IEEE transactions on fuzzy systems  
In this paper, we study a kernel extension of classic possibilistic clustering.  ...  In this new space, we model the mapped data by means of the Possibilistic Clustering algorithm.  ...  We used a Gaussian kernel with three different values of σ: 0.5, 1, and 5; the regularization parameter η was set automatically using Eq. 27, where we set the value of γ to 1.  ... 
doi:10.1109/tfuzz.2010.2043440 fatcat:ap5ouwgxgvhf5dyjg3dugjph3u
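The Gaussian kernel that induces the feature space is standard; a minimal sketch of computing it for the three σ values quoted (the random 2-D data is an assumption, and this is not the paper's possibilistic clustering implementation):

```python
import numpy as np

def gaussian_kernel(X, sigma):
    """K[i, j] = exp(-||x_i - x_j||^2 / (2 * sigma^2))."""
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2.0 * sigma**2))

X = np.random.default_rng(1).normal(size=(20, 2))
kernels = {s: gaussian_kernel(X, s) for s in (0.5, 1.0, 5.0)}
# Each point has unit similarity to itself, so every diagonal is 1;
# a larger sigma makes all off-diagonal similarities larger.
```

In the kernel-induced space the clustering algorithm only ever touches these pairwise similarities, never explicit feature vectors.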

EXPLORING EFFICIENT KERNEL FUNCTIONS FOR SUPPORT VECTOR CLUSTERING

Furkan Burak BAĞCI, Ömer KARAL
2020 Mugla Journal of Science and Technology  
nonlinear boundaries based on the Gaussian kernel parameter.  ...  To overcome these problems, kernel-based clustering methods have been developed in recent years, which automatically determine the number and boundaries of clusters.  ...  With this process, called the kernel trick, kernel-based algorithms achieve a higher accuracy rate than conventional algorithms without incurring high computational complexity, by performing all operations  ... 
doi:10.22531/muglajsci.703790 fatcat:rzpsunwd3fgc5ixujrcxkw2m2q
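How the Gaussian kernel width governs the number of clusters can be shown with a crude stand-in for support vector clustering's assignment step: link points whose kernel similarity exceeds a threshold and take connected components as clusters. The threshold, data, and σ values below are assumptions, not the paper's method:

```python
import numpy as np

def cluster_labels(X, sigma, thresh=0.5):
    """Connected components of the graph linking points whose Gaussian
    kernel similarity exceeds `thresh` (a simplification of SVC)."""
    sq = np.sum((X[:, None] - X[None]) ** 2, axis=-1)
    adj = np.exp(-sq / (2 * sigma**2)) > thresh
    labels = -np.ones(len(X), dtype=int)
    c = 0
    for s in range(len(X)):
        if labels[s] >= 0:
            continue
        stack, labels[s] = [s], c          # depth-first flood fill
        while stack:
            u = stack.pop()
            for v in np.flatnonzero(adj[u] & (labels < 0)):
                labels[v] = c
                stack.append(v)
        c += 1
    return labels

# Two well-separated blobs: a small sigma yields two components,
# a very large sigma merges everything into one.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
small = cluster_labels(X, sigma=0.5)
large = cluster_labels(X, sigma=50.0)
```

This is the sense in which the kernel parameter "automatically" determines cluster count: no k is specified anywhere.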

Surpassing Human-Level Face Verification Performance on LFW with GaussianFace [article]

Chaochao Lu, Xiaoou Tang
2014 arXiv   pre-print
This paper proposes a principled multi-task learning approach based on Discriminative Gaussian Process Latent Variable Model, named GaussianFace, to enrich the diversity of training data.  ...  Face verification remains a challenging problem in very complex conditions with large variations such as pose, illumination, expression, and occlusions.  ...  This work is partially supported by "CUHK Computer Vision Cooperation" grant from Huawei, and by the General Research Fund sponsored  ... 
arXiv:1404.3840v3 fatcat:3s7wevwavfe7tdtg7s36ha5coy

On Hyperparameter Optimization of Machine Learning Algorithms: Theory and Practice

Li Yang, Abdallah Shami
2020 Neurocomputing  
Machine learning algorithms have been used widely in various applications and areas. To fit a machine learning model into different problems, its hyper-parameters must be tuned.  ...  Selecting the best hyper-parameter configuration for machine learning models has a direct impact on the model's performance.  ...  FAR-HO FAR-HO [120] is a hyper-parameter optimization package that employs gradient-based algorithms with TensorFlow.  ... 
doi:10.1016/j.neucom.2020.07.061 fatcat:xrizwqtr7retpbs5xfnkqqyia4
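At its simplest, hyper-parameter selection of the kind this survey covers is an exhaustive search over a configuration grid. A minimal sketch with a toy objective (the search space and the loss function are invented for illustration):

```python
import itertools

def grid_search(objective, space):
    """Exhaustive search over every hyper-parameter combination,
    keeping the configuration with the lowest validation score."""
    names = list(space)
    best_cfg, best_score = None, float("inf")
    for values in itertools.product(*(space[n] for n in names)):
        cfg = dict(zip(names, values))
        score = objective(cfg)
        if score < best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

# Toy "validation loss" with a known optimum at lr=0.1, depth=4.
space = {"lr": [0.001, 0.01, 0.1, 1.0], "depth": [2, 4, 8]}
loss = lambda cfg: abs(cfg["lr"] - 0.1) + abs(cfg["depth"] - 4)
best, score = grid_search(loss, space)
```

Grid search's cost grows exponentially with the number of hyper-parameters, which is what motivates the random-search, gradient-based, and Bayesian methods the survey compares.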

kernlab – An S4 Package for Kernel Methods in R

Alexandros Karatzoglou, Alex Smola, Kurt Hornik, Achim Zeileis
2004 Journal of Statistical Software  
a spectral clustering algorithm.  ...  kernlab is an extensible package for kernel-based machine learning methods in R. It takes advantage of R's new S4 object model and provides a framework for creating and using kernel-based algorithms.  ...  In case a Gaussian RBF kernel is being used, a model selection process can be used to determine the optimal value of the σ hyper-parameter.  ... 
doi:10.18637/jss.v011.i09 fatcat:y5ix3p6ed5hipo2ual37hsgvfu
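kernlab's sigest heuristic estimates the RBF σ from quantiles of squared distances between sampled point pairs. A rough Python analogue in that spirit, assuming kernlab's parameterisation k(x, x') = exp(-σ‖x - x'‖²), so that σ is the inverse of a typical squared distance (the sample size and data are assumptions):

```python
import numpy as np

def sigma_heuristic(X, n_pairs=500, seed=0):
    """Median-style heuristic for the RBF sigma: estimate a typical
    squared distance between random point pairs and invert it."""
    rng = np.random.default_rng(seed)
    i = rng.integers(0, len(X), n_pairs)
    j = rng.integers(0, len(X), n_pairs)
    sq = np.sum((X[i] - X[j]) ** 2, axis=1)
    return 1.0 / np.median(sq[sq > 0])  # drop zero self-pairs

X = np.random.default_rng(1).normal(size=(200, 5))
sigma = sigma_heuristic(X)
```

Such a data-driven default is what lets kernel methods run sensibly before any explicit model selection is performed.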

SPEECH/MUSIC CLASSIFICATION USING WAVELET BASED FEATURE EXTRACTION TECHNIQUES

Thiruvengatanadhan Ramalingam, P. Dhanalakshmi
2014 Journal of Computer Science  
The system shows significant results with an accuracy of 94.5%.  ...  Then the proposed method extends the application of Gaussian Mixture Models (GMM) to estimate the probability density function using maximum likelihood decision methods.  ...  Based on calculated features, a clustering algorithm is applied to structure the music content.  ... 
doi:10.3844/jcssp.2014.34.44 fatcat:3dbftyyt7jevhdzwi7zcxzqd6a

A review of automatic selection methods for machine learning algorithms and hyper-parameter values

Gang Luo
2016 Network Modeling Analysis in Health Informatics and Bioinformatics  
These findings establish a foundation for future research on automatically selecting algorithms and hyper-parameter values for analyzing big biomedical data.  ...  To make machine learning accessible to layman users with limited computing expertise, computer science researchers have proposed various automatic selection methods for algorithms and/or hyperparameter  ...  computing on a cluster of commodity computers.  ... 
doi:10.1007/s13721-016-0125-6 dblp:journals/netmahib/Luo16 fatcat:qkxv4gp64rfvvhgw7g3xygrapm

Deep Clustered Convolutional Kernels [article]

Minyoung Kim, Luca Rigazio
2015 arXiv   pre-print
We view this as a limitation and propose a novel training algorithm that automatically optimizes network architecture, by progressively increasing model complexity and then eliminating model redundancy  ...  For convolutional neural networks, our method relies on iterative split/merge clustering of convolutional kernels interleaved by stochastic gradient descent.  ...  In our algorithm we use k-means clustering to merge kernels since, naturally, k-means cluster distortion under the defined distortion measure (we employ L2 norm to compute cluster distortion).  ... 
arXiv:1503.01824v1 fatcat:stcvp6pmnnhftpx2xn2mv6pdma
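The merge step the snippet describes relies on plain k-means over flattened kernel weights with L2 distortion. A minimal Lloyd's-iteration sketch (the kernel bank, its shape, and k are illustrative; this is not the authors' training loop):

```python
import numpy as np

def kmeans(X, k, n_iter=20, seed=0):
    """Plain Lloyd's k-means with L2 distortion, as in the snippet."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]  # random init
    for _ in range(n_iter):
        d = np.linalg.norm(X[:, None] - centers[None], axis=-1)
        labels = d.argmin(axis=1)          # assign each kernel to a centroid
        for j in range(k):
            if np.any(labels == j):        # keep empty clusters unchanged
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Hypothetical bank of 32 3x3 conv kernels, flattened to 9-D vectors;
# merging reduces them to k = 8 representative kernels.
kernels = np.random.default_rng(1).normal(size=(32, 3, 3)).reshape(32, -1)
labels, merged = kmeans(kernels, k=8)
```

The centroids become the merged kernels, and redundant filters that cluster together are eliminated, which is the "eliminating model redundancy" step in the abstract.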

KNN-kernel density-based clustering for high-dimensional multivariate data

Thanh N. Tran, Ron Wehrens, Lutgarde M.C. Buydens
2006 Computational Statistics & Data Analysis  
Density-based clustering algorithms for multivariate data often have difficulties with high-dimensional data and clusters of very different densities.  ...  Moreover, the number of clusters is identified automatically by the algorithm.  ...  The number of clusters is automatically determined by the algorithm upon convergence.  ... 
doi:10.1016/j.csda.2005.10.001 fatcat:wkvo2qvmnzb4pggctuh3bqpwqi
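A kNN-based density score of the kind this family of methods builds on can be sketched as the inverse distance to the k-th nearest neighbour; the exact estimator in the paper may differ, and k and the toy data below are assumptions:

```python
import numpy as np

def knn_density(X, k=5):
    """Simple kNN density score: inverse of the distance to the k-th
    nearest neighbour (larger = denser region)."""
    d = np.linalg.norm(X[:, None] - X[None], axis=-1)
    d.sort(axis=1)                     # row-wise sorted distances
    return 1.0 / (d[:, k] + 1e-12)     # column 0 is the point itself

rng = np.random.default_rng(0)
dense = rng.normal(0.0, 0.2, size=(50, 2))   # tight cluster
sparse = rng.normal(5.0, 2.0, size=(50, 2))  # diffuse cluster
rho = knn_density(np.vstack([dense, sparse]))
# Points in the tight cluster score higher on average.
```

Because the scale adapts to each point's neighbourhood, such estimators can handle clusters of very different densities, the failure case the abstract highlights for fixed-bandwidth methods.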

Parametric Gaussian Process Regression for Big Data [article]

Maziar Raissi
2017 arXiv   pre-print
The proposed methodology circumvents the well-established need for stochastic variational inference, a scalable algorithm for approximating posterior distributions.  ...  The effectiveness of the proposed approach is demonstrated using an illustrative example with simulated data and a benchmark dataset in the airline industry with approximately 6 million records.  ...  For instance, when applying a Gaussian process to a dataset of size N , exact inference has computational complexity O(N 3 ) with storage demands of O(N 2 ).  ... 
arXiv:1704.03144v2 fatcat:mecy5xomqbgkfdqiujn5jcww4a
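The O(N³) time and O(N²) storage costs quoted for exact GP inference come from building and factorising the N × N kernel matrix. A minimal 1-D sketch making those costs visible (the SE kernel, lengthscale, and noise level are assumptions):

```python
import numpy as np

def gp_fit_predict(X, y, X_star, l=1.0, sigma_n=0.1):
    """Exact GP regression: the Cholesky of the N x N kernel matrix is
    the O(N^3) step; storing K is the O(N^2) memory cost."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / l**2)
    K = k(X, X) + sigma_n**2 * np.eye(len(X))   # O(N^2) storage
    L = np.linalg.cholesky(K)                   # O(N^3) time
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    return k(X_star, X).dot(alpha)              # posterior mean

X = np.linspace(0, 2 * np.pi, 40)
y = np.sin(X)
mean = gp_fit_predict(X, y, X)
```

It is exactly this cubic scaling that makes N ≈ 6 million records infeasible for exact inference and motivates the parametric approximation the paper proposes.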

Least Absolute Shrinkage is Equivalent to Quadratic Penalization [chapter]

Yves Grandvalet
1998 ICANN 98  
Thanks to this link, the EM algorithm of section 4 computes the lasso solution. A series of its possible applications is given in section 5.  ...  We finally present a series of applications of this type of algorithm in regression problems: kernel regression, additive modeling and neural net training.  ...  Mixtures of Gaussians may be used to cluster different sets of covariates. Several models have been proposed, with data-dependent clusters [4], or classes defined a priori [5].  ... 
doi:10.1007/978-1-4471-1599-1_27 fatcat:txwnpqabkrbc3lajzbrm4lfjka
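The equivalence in the title can be sketched in standard notation (this is the usual variational statement, not quoted from the paper itself):

```latex
% For a single coefficient $w$, the absolute-value penalty has a
% variational quadratic form,
\[
  |w| \;=\; \min_{\eta > 0} \, \frac{1}{2}\!\left(\frac{w^2}{\eta} + \eta\right),
\]
% with the minimum attained at $\eta = |w|$ (by the AM-GM inequality).
% Hence the lasso objective equals a quadratic (adaptive-ridge) penalty
% jointly minimised over auxiliary scales $\eta_j$:
\[
  \min_{w} \; \|y - Xw\|^2 + \lambda \sum_j |w_j|
  \;=\;
  \min_{w,\;\eta_j > 0} \; \|y - Xw\|^2
  + \lambda \sum_j \frac{1}{2}\!\left(\frac{w_j^2}{\eta_j} + \eta_j\right).
\]
```

An EM-style algorithm can then alternate between solving the weighted ridge problem in $w$ and updating $\eta_j = |w_j|$, which is how a quadratic-penalty solver recovers the lasso solution.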
Showing results 1–15 of 6,765