3,182 Hits in 4.8 sec

Multi-layer Kernel Ridge Regression for One-class Classification [article]

Chandan Gautam, Aruna Tiwari, Sundaram Suresh, Alexandros Iosifidis
2018 arXiv   pre-print
In this paper, a multi-layer architecture (in a hierarchical fashion) built by stacking various Kernel Ridge Regression (KRR) based Auto-Encoders for one-class classification is proposed, referred to as MKOC  ...  MKOC has many layers of Auto-Encoders to project the input features into a new feature space, and the last layer is a regression-based one-class classifier.  ...  Multi-Layer KRR based One-Class Classifier In this section, a Multi-layer KRR-based architecture for One-class Classification (MKOC) is described.  ... 
arXiv:1805.07808v2 fatcat:23vhicno45cevnkz5fxwowam7y
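The entry above describes stacking KRR-based auto-encoders. As a rough illustration only (not the paper's implementation; the function names, kernel choice, and hyperparameters here are invented), a single KRR auto-encoder layer has a closed-form solution that can be sketched in numpy:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared Euclidean distances, then the RBF kernel.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_autoencoder(X, lam=1e-2, gamma=1.0):
    # Kernel ridge regression with the inputs as their own targets:
    # alpha = (K + lam*I)^(-1) X, reconstruction = K @ alpha.
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), X)
    return K @ alpha

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
X_hat = krr_autoencoder(X)  # reconstructed features, same shape as X
```

In a stacked architecture of the kind the abstract sketches, the projected features of one such layer would feed the next layer, with a one-class regressor on top.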

Graph-Embedded Multi-layer Kernel Extreme Learning Machine for One-class Classification or (Graph-Embedded Multi-layer Kernel Ridge Regression for One-class Classification) [article]

Chandan Gautam, Aruna Tiwari, M. Tanveer
2019 arXiv   pre-print
In this paper, a multi-layer architecture for OCC is proposed by stacking various Graph-Embedded Kernel Ridge Regression (KRR) based Auto-Encoders in a hierarchical fashion.  ...  The last layer of this proposed architecture is a Graph-Embedded regression-based one-class classifier.  ...  One is referred to as Local variance based Graph-Embedded Multi-layer KRR for One-class Classification (LM KOC), and the other is referred to as Global variance based Graph-Embedded Multi-layer KRR for One-class  ... 
arXiv:1904.06491v1 fatcat:k3j6jfyjrzeozi4nlv4wm6scju

NFT-K: Non-Fungible Tangent Kernels [article]

Sina Alemohammad, Hossein Babaei, CJ Barberan, Naiming Liu, Lorenzo Luzi, Blake Mason, Richard G. Baraniuk
2021 arXiv   pre-print
To further contribute to interpretability with respect to classification and the layers, we develop a new network as a combination of multiple neural tangent kernels, one to model each layer of the deep neural  ...  We demonstrate the interpretability of this model on two datasets, showing that the multiple-kernels model elucidates the interplay between the layers and predictions.  ...  An MLP trained with infinitesimal step size from a Gaussian-distributed initialization θ₀ using the loss min_θ L(θ) + λ‖θ − θ₀‖₂² (5) is equivalent to a kernel ridge regression predictor with ridge hyperparameter  ... 
arXiv:2110.04945v1 fatcat:h7froeung5fjjamyfxryxfk7ry
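The snippet above states the standard equivalence between training with a distance-to-initialization penalty and a kernel ridge regression predictor. As a minimal numerical sketch of the closed-form kernel ridge predictor itself (a linear kernel stands in for the NTK here; the setup and names are illustrative, not from the paper):

```python
import numpy as np

def kernel_ridge_predict(K_train, y, K_test_train, lam):
    # Closed-form kernel ridge regression:
    # f(x) = k(x, X) @ (K + lam*I)^(-1) y
    alpha = np.linalg.solve(K_train + lam * np.eye(len(y)), y)
    return K_test_train @ alpha

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true                      # noiseless linear targets
Xt = rng.standard_normal((5, 3))    # test points
# Linear kernel K(x, x') = x . x'; tiny ridge recovers the linear fit.
pred = kernel_ridge_predict(X @ X.T, y, Xt @ X.T, lam=1e-6)
```

With a vanishing ridge the predictions match the underlying linear function; a larger `lam` plays the role of the ridge hyperparameter the abstract refers to.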

An Insight into Extreme Learning Machines: Random Neurons, Random Features and Kernels

Guang-Bin Huang
2014 Cognitive Computation  
This paper also shows that, in theory, ELMs (with the same kernels) tend to outperform support vector machines and their variants in both regression and classification applications, with much easier implementation  ...  the conventional SVM [1] and one of its main variants, LS-SVM [2].  ...  valid in regression and multi-class classification, the same ELM solutions can be used in regression, binary, and multi-class classification applications.  ... 
doi:10.1007/s12559-014-9255-2 fatcat:k66cuxdi3rervdnyfa6yqlgxlm

Applications of Machine Learning Methods to Genomic Selection in Breeding Wheat for Rust Resistance

Juan Manuel González-Camacho, Leonardo Ornella, Paulino Pérez-Rodríguez, Daniel Gianola, Susanne Dreisigacker, José Crossa
2018 The Plant Genome  
New methods and algorithms are being developed for predicting untested phenotypes in schemes commonly used in genomic selection (GS).  ...  resistance (APR) genes, although epistasis has been found in some populations; b) rust resistance requires effective combinations of major and minor genes; and c) disease resistance is commonly measured based on  ...  We compared the performance of several regression/classification models against some parametric models (BL, ridge regression, etc.) on SR and YR.  ... 
doi:10.3835/plantgenome2017.11.0104 pmid:30025028 fatcat:tsx5lutpd5ftdcfke3kuci7fmy

Kernel machines with two layers and multiple kernel learning [article]

Francesco Dinuzzo
2010 arXiv   pre-print
First, a representer theorem for two-layer networks is presented, showing that finite linear combinations of kernels on each layer are optimal architectures whenever the corresponding functions solve suitable  ...  Finally, a simple and effective multiple kernel learning method called RLS2 (regularized least squares with two layers) is introduced, and its performance on several learning problems is extensively  ...  RLS2: multi-class classification of microarray data RLS2 can be applied to multi-class classification problems by solving several binary classification problems and combining their outcomes.  ... 
arXiv:1001.2709v1 fatcat:qqvqdnh6wbb6zc2emivqfxbkki
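The last snippet above describes the standard one-vs-rest reduction: solve one binary problem per class and combine the outcomes. A generic sketch of that reduction using plain ridge-regression scorers (not RLS2 itself; the functions and data here are made up for illustration):

```python
import numpy as np

def one_vs_rest_fit(X, y, n_classes, lam=1e-2):
    # One ridge-regression scorer per class, trained on +/-1 targets.
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append bias column
    A = Xb.T @ Xb + lam * np.eye(Xb.shape[1])
    W = np.zeros((n_classes, Xb.shape[1]))
    for c in range(n_classes):
        t = np.where(y == c, 1.0, -1.0)         # binary problem for class c
        W[c] = np.linalg.solve(A, Xb.T @ t)
    return W

def one_vs_rest_predict(W, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ W.T).argmax(axis=1)            # highest score wins

# Three well-separated Gaussian blobs, 30 points each.
rng = np.random.default_rng(2)
centers = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
X = np.vstack([c + 0.3 * rng.standard_normal((30, 2)) for c in centers])
y = np.repeat(np.arange(3), 30)
W = one_vs_rest_fit(X, y, 3)
acc = (one_vs_rest_predict(W, X) == y).mean()
```

RLS2 would replace the linear scorer with its two-layer kernel machine, but the combining step is the same argmax over per-class scores.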

Classification and evaluation of quality grades of organic green teas using an electronic nose based on machine learning algorithms

Huixiang Liu, Dongbing Yu, Yu Gu
2019 IEEE Access  
models (partial least squares regression, kernel ridge regression, and support vector regression).  ...  A multi-task model based on the back-propagation neural network (MBPNN) is proposed for the simultaneous performance of the classification task (grade classification of tea) and the regression task (quality  ...  multi-class classification.  ... 
doi:10.1109/access.2019.2957112 fatcat:whf5e74mbffmjjguqgvqd5yk3u

Classification of Student's Confusion Level in E-Learning using Machine Learning

2019 Volume 8, Issue 10, August 2019, Regular Issue  
Getting confused while watching online videos is one of the root causes of poor student performance.  ...  for predicting the confusion level of students, which can improve the quality of the online education system.  ...  Table I shows the features that we have considered to classify students as 'confused' or 'not confused'.  ... 
doi:10.35940/ijitee.b1092.1292s19 fatcat:dusufgj6h5e5hhcjlrgwxeucdu

Simple and Effective Regularization Methods for Training on Noisily Labeled Data with Generalization Guarantee [article]

Wei Hu, Zhiyuan Li, Dingli Yu
2020 arXiv   pre-print
Our generalization analysis relies on the connection between wide neural networks and the neural tangent kernel (NTK).  ...  simple and intuitive regularization methods: (i) regularization by the distance from the network parameters to their initialization, and (ii) adding a trainable auxiliary variable to the network output for  ...  Then with probability at least 1 − δ, the classification error of f* on the clean distribution D satisfies  ...  Theorem 5.3 (Multi-class classification).  ... 
arXiv:1905.11368v4 fatcat:qqrzhcvmvvajtjpojoblzaddra

Towards end-to-end pulsed eddy current classification and regression with CNN [article]

Xin Fu, Chengkai Zhang, Xiang Peng, Lihua Jian, Zheng Liu
2019 arXiv   pre-print
multi-layer structures.  ...  Extensive experiments demonstrate that our model is capable of handling both classification and regression tasks on PEC data.  ...  Specifically, for Specimen A, we configure our model as follows: the first two 1D convolutional layers both have 128 × 3 × 1 kernels and the second two have 64 × 3 × 1 kernels.  ... 
arXiv:1902.08553v1 fatcat:tq6lql5rrvb3fmx2znm7tirop4

Genome-enabled prediction models for black tea (Camellia sinensis) quality and drought tolerance traits [article]

Robert. K. Koech, Pelly M. Malebe, Christopher Nyarukowa, Richard Mose, Samson M. Kamunya, Theodor Loots, Zeno Apostolides
2019 bioRxiv   pre-print
The findings have, for the first time, opened up a new avenue for the future application of genomic selection in tea breeding.  ...  catechin, astringency, brightness, briskness, and colour based on putative QTLs + annotated proteins + KEGG pathway approach. The percent variable importance of putatively annotated proteins and KEGG  ...  The Multi-layer Perceptron Network prediction models based on annotated proteins for the black tea colour trait in both datasets (Table 8).  ... 
doi:10.1101/850792 fatcat:nyupue3rabcutfxkezyfgkhkre

Mitigating Malicious Insider Attacks in the Internet of Things using Supervised Machine Learning Techniques

Mir Shahnawaz Ahmad, Shahid Mehraj Shah
2021 Scalable Computing : Practice and Experience  
In recent times, IoT has emerged as a prime field for solving diverse real-life problems by providing smart and affordable solutions.  ...  We have also applied various supervised machine learning techniques to an available IoT dataset to deduce which among them is best suited to detect malicious insider attacks in the IoT network.  ...  We would like to thank TEQIP-III and MITS, Gwalior for supporting this research.  ... 
doi:10.12694/scpe.v22i1.1818 fatcat:3vdip37zsbbxbdpo5yscm6gofa

MeltpoolNet: Melt pool Characteristic Prediction in Metal Additive Manufacturing Using Machine Learning [article]

Parand Akbari, Francis Ogoke, Ning-Yu Kao, Kazem Meidani, Chun-Yu Yeh, William Lee, Amir Barati Farimani
2022 arXiv   pre-print
Predicting melt pool flaws based on process parameters and powder material is difficult due to the complex nature of the MAM process.  ...  In this work, we introduced a comprehensive framework for benchmarking ML for melt pool characterization.  ...  For multi-class classification, we introduced two methods to evaluate every class at the same time: macro-average and micro-average.  ... 
arXiv:2201.11662v1 fatcat:jfopaagdc5hvxdllnjcllwk5ea

Application of Machine Learning Models for Survival Prognosis in Breast Cancer Studies

Iliyan Mihaylov, Maria Nisheva, Dimitar Vassilev
2019 Information  
The analysis of our experiments shows an advantage of the linear Support Vector Regression, Lasso regression, Kernel Ridge regression, K-neighborhood regression, and Decision Tree regression—these models  ...  The cross-validation for accuracy demonstrates the best performance of the same models on the studied breast cancer data.  ...  for Analysis and Automated Semantic Enhancement of Big Heterogeneous Data Collections" Project, Contract DN02/9/2016.  ... 
doi:10.3390/info10030093 fatcat:yowotvaov5evzmv7szxz2s6iga

Extreme Learning Machine with Elastic Net Regularization

Lihua Guo
2020 Intelligent Automation and Soft Computing  
During the experiments, the label is either 1 or −1 for binary classification, and the label is 1, 2, ..., N for multi-class classification, where N is the number of classes.  ...  They are six binary classification datasets and four multi-class classification datasets. Details are summarized in Table 2.  ... 
doi:10.32604/iasc.2020.013918 fatcat:yfsdqnufinfd7jatbnyqnakt5m