A class of neural networks for independent component analysis
1997
IEEE Transactions on Neural Networks
We consider learning algorithms for each layer, and modify our previous nonlinear PCA type algorithms so that their separation capabilities are greatly improved. ...
The basic ICA network consists of whitening, separation, and basis vector estimation layers. It can be used for both blind source separation and estimation of the basis vectors of ICA. ...
ACKNOWLEDGMENT The authors are grateful to the reviewers for their detailed and useful comments. ...
doi:10.1109/72.572090
pmid:18255654
fatcat:7ycqgkg5yvdhflruafe75wfaay
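The layered architecture described in the entry above begins with a whitening stage. As a point of reference only, here is a minimal numpy sketch of PCA whitening; the function name and the batch eigendecomposition approach are illustrative assumptions, not the adaptive whitening layer used in the paper.

```python
import numpy as np

def whiten(X):
    """PCA-whiten a data matrix X of shape (n_samples, n_features).

    Returns the whitened data Z (approximately unit covariance) and the
    whitening matrix V, so that Z = (X - mean) @ V.T.
    """
    Xc = X - X.mean(axis=0)                     # center each feature
    cov = np.cov(Xc, rowvar=False)              # sample covariance
    eigvals, eigvecs = np.linalg.eigh(cov)      # symmetric eigendecomposition
    V = np.diag(1.0 / np.sqrt(eigvals + 1e-12)) @ eigvecs.T
    return Xc @ V.T, V
```

After whitening, the covariance of the transformed data is roughly the identity, so the subsequent separation layer only has to estimate an orthogonal rotation.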
Automatic damage type classification and severity quantification using signal based and nonlinear model based damage sensitive features
2018
Journal of Process Control
Two types of features are used as inputs to the SVM algorithm: Signal Based Features (SBF) and Nonlinear Model Based Features (NMBF). ...
In [10], damage classification is performed using time-frequency representations and the AdaBoost machine learning algorithm. ...
doi:10.1016/j.jprocont.2018.08.002
fatcat:ykst3cuwinanbooffot5suelxu
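The entry above feeds signal based and nonlinear model based features into an SVM classifier. A hedged scikit-learn sketch of that kind of pipeline is shown below; the toy feature definitions, the RBF kernel choice, and the synthetic data are assumptions for illustration, not the paper's SBF/NMBF definitions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def signal_based_features(window):
    """Toy signal-based features (RMS, peak, kurtosis-like moment) for one
    response window; the actual SBF/NMBF definitions are given in the paper."""
    rms = np.sqrt(np.mean(window ** 2))
    peak = np.max(np.abs(window))
    centered = window - window.mean()
    kurt = np.mean(centered ** 4) / (np.mean(centered ** 2) ** 2 + 1e-12)
    return np.array([rms, peak, kurt])

# Hypothetical data: response windows and damage-type labels
X_windows = np.random.randn(40, 256)
y = np.random.randint(0, 3, size=40)

X = np.vstack([signal_based_features(w) for w in X_windows])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
```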
Cross Signal Identification for PSG Applications
2008
Zenodo
An alternative representation of the respiratory events by means of Kohonen type neural network is discussed. ...
Our computed analysis includes a learning phase based on cross signal PSG annotation. ...
We tried sigmoid type nonlinearities for the representation of ECG with NLPCA in a previous work [2], and also a hysteresis type nonlinearity, which showed poor convergence to the eigen coordinate vectors ...
doi:10.5281/zenodo.1334641
fatcat:4l2aicpezzbifnyvz36abjtrlu
Harmonic Elimination of Inverters using Blind Signal Separation
2005
American Journal of Applied Sciences
The harmonic separation process is implemented with a processor and achieves low THD using blind signal separation. It is mostly used in medical instrumentation and medical applications such as ECG and EEG. ...
The main objective of this study is to eliminate harmonics and reduce the power loss in inverters. ...
However, this kind of representation often characterizes the fundamental properties of the data better than standard PCA, for example in blind signal separation of the original source signals. ...
doi:10.3844/ajassp.2005.1434.1437
fatcat:wocuz6r6efczfo67rla4vdpsca
Page 5268 of Psychological Abstracts Vol. 81, Issue 11
[page]
1994
Psychological Abstracts
(Helsinki U of Technology, Lab of Computer & Information Science, Finland) Representation and separation of signals using nonlinear PCA type learning. Neural Networks, 1994, Vol 7(1), 113-127. ...
Derives a new class of nonlinear principal component analysis (PCA) type learning algorithms by minimizing a general statistical signal representation error. ...
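The abstract above refers to nonlinear PCA type learning algorithms derived from a signal representation error. A minimal numpy sketch of one commonly cited form of such a rule, the nonlinear PCA subspace update, follows; the tanh nonlinearity, step size, and training loop are illustrative assumptions rather than the exact algorithm of the cited paper.

```python
import numpy as np

def nonlinear_pca_subspace(X, n_components, lr=0.01, n_epochs=20, g=np.tanh, seed=0):
    """Sketch of a nonlinear PCA subspace learning rule:
    for each sample x, y = W x and W += lr * g(y) (x - W.T g(y))^T.
    With a linear g this reduces to Oja's subspace rule."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    W = rng.standard_normal((n_components, n_features)) * 0.1
    for _ in range(n_epochs):
        for x in X:
            y = W @ x                      # layer outputs
            e = x - W.T @ g(y)             # representation error
            W += lr * np.outer(g(y), e)    # nonlinear PCA subspace update
    return W
```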
Principal manifolds and Bayesian subspaces for visual recognition
1999
Proceedings of the Seventh IEEE International Conference on Computer Vision
We investigate the use of linear and nonlinear principal manifolds for learning low-dimensional representations for visual recognition. ...
Since the Bayesian similarity method's learning stage requires two separate PCAs, its complexity is essentially twice that of PCA. ...
doi:10.1109/iccv.1999.790407
dblp:conf/iccv/Moghaddam99
fatcat:uew4mkxlvbeh7bdtjwu4fxg2b4
Automatic Damage Quantification Using Signal Based And Nonlinear Model Based Damage Sensitive Features
2017
IFAC-PapersOnLine
Two types of features are used as inputs to the SVM algorithm: Signal Based Features (SBF) and Nonlinear Model Based Features (NMBF). ...
SBF are rooted in a direct use of response signals and do not consider any underlying model of the test structure. ...
The SVM learning technique is used for the classification step. ...
doi:10.1016/j.ifacol.2017.08.994
fatcat:acaz6htrajbcxahxhxoa3rwace
Neural Network Implementations for PCA and Its Extensions
2012
ISRN Artificial Intelligence
These methods are useful in adaptive signal processing, blind signal separation (BSS), pattern recognition, and information compression. ...
PCA is a statistical method that is directly related to EVD and SVD. Minor component analysis (MCA) is a variant of PCA, which is useful for solving total least squares (TLSs) problems. ...
PCA is often used to select inputs, but it is not always useful, since for non-Gaussian signals the variance of a signal is not always related to its importance. ...
doi:10.5402/2012/847305
fatcat:5v5l5v56ozg7lkxfktm5t7cgle
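The entry above contrasts PCA with minor component analysis (MCA). As a non-neural point of reference, the sketch below extracts the principal and minor directions directly from the sample covariance; the adaptive neural-network rules surveyed in the paper estimate these quantities iteratively instead of via an explicit eigendecomposition.

```python
import numpy as np

def principal_and_minor_components(X):
    """Illustrative (non-neural) extraction of the principal and minor
    components from the sample covariance of X (n_samples, n_features)."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    principal = eigvecs[:, -1]               # direction of maximum variance (PCA)
    minor = eigvecs[:, 0]                    # direction of minimum variance (MCA / TLS)
    return principal, minor
```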
Time-lagged autoencoders: Deep learning of slow collective variables for molecular kinetics
2018
Journal of Chemical Physics
Inspired by the success of deep learning techniques in the physical and chemical sciences, we apply a modification of an autoencoder type deep neural network to the task of dimension reduction of molecular ...
the capabilities of linear dimension reduction techniques. ...
European Commission (ERC StG 307494 "pcCell") and Deutsche Forschungsgemeinschaft (SFB 1114/A04). ...
doi:10.1063/1.5011399
pmid:29960344
fatcat:4lthaotfqfblfkh5uyaxmpgwza
Page 747 of Neural Computation Vol. 6, Issue 4
[page]
1994
Neural Computation
Karhunen, J., and Joutsensalo, J. 1993. Representation and separation of signals using nonlinear PCA type learning. Neural Networks, in press.
Oja, E., and Karhunen, J. 1985. ...
Stability of Oja’s PCA Subspace Rule
Acknowledgment
The author is grateful to Jyrki Joutsensalo for providing the simulation example.
References
Hertz, J., Krogh, A., and Palmer, R. G. 1991. ...
Comparison of dimensionality reduction techniques for the fault diagnosis of mono block centrifugal pump using vibration signals
2014
Engineering Science and Technology, an International Journal
In this paper dimensionality reduction is performed using traditional dimensionality reduction techniques and nonlinear dimensionality reduction techniques. ...
This is achieved by the extraction of features from the measured data and employing data mining approaches to explore the structural information hidden in the signals acquired. ...
From Tables 1–4, among the nonlinear dimensionality reduction techniques and PCA, PCA outperforms when using the decision tree, Bayes Net, Naïve Bayes, and kNN classifiers. ...
doi:10.1016/j.jestch.2014.02.005
fatcat:puldrr4n7vb63cbc2umaweoq64
Classification of the myoelectric signal using time-frequency based representations
1999
Medical Engineering and Physics
An accurate and computationally efficient means of classifying surface myoelectric signal patterns has been the subject of considerable research effort in recent years. ...
Effective feature extraction is crucial to reliable classification and, in the quest to improve the accuracy of transient myoelectric signal pattern classification, an ensemble of time-frequency based ...
Acknowledgements The authors acknowledge the assistance of the Natural Sciences and Engineering Research Council of Canada and the Whitaker Foundation. ...
doi:10.1016/s1350-4533(99)00066-1
pmid:10624739
fatcat:75ae6ty3rnhwdeknbjfiqwfvqe
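For the time-frequency based feature ensemble mentioned above, a simple illustration is a short-time Fourier spectrogram flattened into a feature vector; the sampling rate, window length, and log scaling below are assumptions, and the specific representations evaluated in the paper are not reproduced here.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0                      # assumed sampling rate in Hz
emg = np.random.randn(1024)      # placeholder for one myoelectric signal segment

# Short-time Fourier spectrogram as one possible time-frequency representation
freqs, times, Sxx = spectrogram(emg, fs=fs, nperseg=128, noverlap=64)

# Flatten the (optionally log-scaled) spectrogram into a feature vector for a classifier
features = np.log(Sxx + 1e-12).ravel()
```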
LEARNING MULTIPLE CAUSES BY COMPETITION ENHANCED LEAST MEAN SQUARE ERROR RECONSTRUCTION
1996
International Journal of Neural Systems
Our learning scheme provides a way of balancing the cooperation and competition necessary for the self-organization process, thus realizing the multiple causes model, which accounts for an observed data ...
Compared with previous probability theory based multiple causes models, our scheme is much easier to implement and quite reliable. ...
Sometimes they are treated as nonlinear extensions of some PCA learning rules, or termed nonlinear PCA [8]-[11]. ...
doi:10.1142/s0129065796000208
fatcat:mxrdvepc4zgxtehkbqsrhznxd4
Dimensionality Reduction and Anomaly Detection for CPPS Data using Autoencoder
2019
2019 IEEE International Conference on Industrial Technology (ICIT)
A closely related concern is dimensionality reduction (DR), which is: 1) often used as a preprocessing step in an AD solution, and 2) a sort of AD, if a measure of observation conformity to the learned data ...
Moreover, the approach outperforms state-of-the-art techniques, alongside a relatively simple and straightforward application. ...
An autoencoder neural network or autoencoder [9] is a special type of deep feed-forward neural network, typically used for representation learning and dimensionality reduction. ...
doi:10.1109/icit.2019.8755116
dblp:conf/icit2/EiteneuerHN19
fatcat:pfmvi7am3fecpkxp2lqmignntm
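The entry above uses an autoencoder for dimensionality reduction and treats conformity to the learned data, via reconstruction, as an anomaly signal. A minimal PyTorch sketch of that idea follows; the layer widths, optimizer settings, and synthetic training data are assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    """Small fully connected autoencoder; layer sizes are illustrative."""
    def __init__(self, n_features, n_latent):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                     nn.Linear(32, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent, 32), nn.ReLU(),
                                     nn.Linear(32, n_features))

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Train on normal-operation data only (X_train is hypothetical synthetic data)
X_train = torch.randn(256, 20)
model = Autoencoder(n_features=20, n_latent=3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X_train), X_train)
    loss.backward()
    opt.step()

# Reconstruction error as anomaly score: a large error means poor conformity
with torch.no_grad():
    score = ((model(X_train) - X_train) ** 2).mean(dim=1)
```

Observations that the trained network reconstructs poorly receive a high score and can be flagged as anomalies.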
Finding Clusters and Components by Unsupervised Learning
[chapter]
2004
Lecture Notes in Computer Science
In statistical PR, there are two classical categories for unsupervised learning methods and models: first, variations of Principal Component Analysis and Factor Analysis, and second, learning vector coding ...
This approach is also reviewed, with examples such as linear and nonlinear independent component analysis and topological maps. ...
doi:10.1007/978-3-540-27868-9_1
fatcat:e2yzfehty5bylca47aolqcnqnm