A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
The file type is application/pdf.
Causal Feature Selection via Orthogonal Search
[article]
2022
arXiv
pre-print
...-the-rest feature selection approach to discover the direct causal parent of the response. ...
The problem of inferring the direct causal parents of a response variable among a large set of explanatory variables is of high practical importance in many disciplines. ...
An unbiased estimator of the causal effect parameter θ₀ can be obtained via the orthogonalization approach of Chernozhukov et al. (2018a), which relies on the "Neyman Orthogonality ...
arXiv:2007.02938v2
fatcat:pows2gkebvdxja7sfymhsjkmym
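The orthogonalization idea this snippet mentions can be made concrete with a rough sketch of partialling-out in the linear case (this is an illustration, not the paper's actual algorithm; all variable names and the synthetic data are invented here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic partially linear model: y = theta0*d + x@b + noise,
# where the treatment d is confounded by the covariates x.
n, p, theta0 = 2000, 5, 1.5
x = rng.normal(size=(n, p))
d = x @ rng.normal(size=p) + rng.normal(size=n)
y = theta0 * d + x @ rng.normal(size=p) + rng.normal(size=n)

def ols_residual(target, design):
    """Residual of a least-squares fit of target on design."""
    coef, *_ = np.linalg.lstsq(design, target, rcond=None)
    return target - design @ coef

# Neyman-orthogonal (partialling-out) estimator: residualize both
# y and d on the controls x, then regress residual on residual.
ry = ols_residual(y, x)
rd = ols_residual(d, x)
theta_hat = (rd @ ry) / (rd @ rd)
```

Because both nuisance regressions are partialled out, small errors in either one perturb `theta_hat` only to second order, which is the point of the orthogonality condition.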
A new Mendelian Randomization method to estimate causal effects of multivariable brain imaging exposures
[article]
2021
bioRxiv
pre-print
We propose a new Mendelian Randomization framework to jointly select instrumental variables and imaging exposures, and then estimate the causal effect of multivariable imaging data on the outcome variable ...
assessing the causal effect of multiple exposures without dimension reduction. ...
This algorithm searches for the solution of the objective function in an iterative-residual fashion, capturing the most informative features of the data matrix (W) that are of potential causal effect ...
doi:10.1101/2021.10.01.462221
fatcat:d2ysv5m4cbcetdluzywdlno5ku
Machine Learning with Squared-Loss Mutual Information
2012
Entropy
estimation and SMI-based machine learning techniques such as independence testing, dimensionality reduction, canonical dependency analysis, independent component analysis, object matching, clustering, and causal ...
Supervised Feature Selection Next, we show how the SMI estimator can be used for supervised feature selection. ...
Thus, for whitened data, the search space of V can be restricted to the orthogonal group without loss of generality. ...
doi:10.3390/e15010080
fatcat:6b3w7mkd3rcfzh3sqio6vvfpxa
Approaches to dimensionality reduction in proteomic biomarker studies
2007
Briefings in Bioinformatics
It then focuses on the problem of selecting the most appropriate method for a specific task or dataset, and proposes method combination as a potential alternative to single-method selection. ...
A major data-analytical problem involved is the extremely high dimensionality (i.e. number of features or variables) of proteomic data, in particular when the sample size is small. ...
A preliminary theoretical analysis of causality-based feature selection has refined the concept of feature relevance in the framework of causal Bayesian networks. ...
doi:10.1093/bib/bbn005
pmid:18310106
fatcat:57agd2jsajgezlieq7hz4r26dm
Orthogonal Structure Search for Efficient Causal Discovery from Observational Data
[article]
2020
arXiv
pre-print
The problem of inferring the direct causal parents of a response variable among a large set of explanatory variables is of high practical importance in many disciplines. ...
Recent work exploits stability of regression coefficients or invariance properties of models across different experimental conditions for reconstructing the full causal graph. ...
For the proposed orthogonal structure search (OSS), the distributions for the linear parameters of the true causal parents are easily distinguishable from those of the non-parents; see Figure 6. ...
arXiv:1903.02456v2
fatcat:3dulhzoqtbbkncjmewcdwjul24
Using machine learning for linking spatiotemporal information to biological processes in the ocean – A case study for North Sea cod recruitment
2021
Marine Ecology Progress Series
Feature selection: search algorithms. Feature subset selection is a way of removing redundant and uninformative variables (i.e. features) in an ML model. ...
Unsupervised feature extractions via empirical orthogonal functions (EOFs) or self-organising maps (SOMs) are demonstrated as a way to summarize spatiotemporal fields for inclusion in predictive models ...
doi:10.3354/meps13689
fatcat:im3z45uupjdxfjbna3zivcuaxm
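As background for the empirical orthogonal functions (EOFs) mentioned in this snippet: EOF analysis of a spatiotemporal field is essentially PCA via an SVD of the time-by-space anomaly matrix. A generic sketch with made-up dimensions (not the study's actual data):

```python
import numpy as np

rng = np.random.default_rng(2)
field = rng.normal(size=(120, 50))    # 120 time steps, 50 grid cells

# Remove the time mean at each grid cell to form the anomaly matrix.
anom = field - field.mean(axis=0)

# SVD: rows of vt are the spatial patterns (EOFs); u*s gives the
# principal-component time series that weight each pattern over time.
u, s, vt = np.linalg.svd(anom, full_matrices=False)
eofs = vt
pcs = u * s
var_frac = s**2 / np.sum(s**2)        # variance explained by each EOF
```

Keeping only the leading EOFs/PCs is how such fields are typically summarized into a handful of predictors for a downstream model.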
GLANCE: Global to Local Architecture-Neutral Concept-based Explanations
[article]
2022
arXiv
pre-print
These aligned features form semantically meaningful concepts that are used for extracting a causal graph depicting the 'perceived' data-generating process, describing the inter- and intra-feature interactions ...
This causal graph serves as a global model from which local explanations of different forms can be extracted. ...
We pick features that are seemingly orthogonal to one another, because that allows us to assume the ground-truth causal structure follows a naive Bayes structure with all the selected features. ...
arXiv:2207.01917v1
fatcat:2g5vp2uyund5lh423uva5cgbve
Block Variable Selection in Multivariate Regression and High-dimensional Causal Inference
2010
Neural Information Processing Systems
To efficiently address such problems, we propose a variable selection method, Multivariate Group Orthogonal Matching Pursuit, which extends the standard Orthogonal Matching Pursuit technique. ...
Within this framework, we then formulate the problem of inferring causal relationships over a collection of high-dimensional time series variables. ...
This capacity control may be implemented in various ways, e.g., via dimensionality reduction, input variable selection or regularized risk minimization. ...
dblp:conf/nips/LozanoS10
fatcat:m7tkihwy4ze6vdubh4ff2euhn4
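The snippet says the proposed method extends standard Orthogonal Matching Pursuit. A minimal sketch of that standard OMP baseline on synthetic data (illustrative only; this is not the paper's multivariate group variant):

```python
import numpy as np

def omp(X, y, n_nonzero):
    """Standard Orthogonal Matching Pursuit: greedily pick the column
    most correlated with the current residual, then refit the selected
    columns by least squares before computing the next residual."""
    n, p = X.shape
    support, residual = [], y.copy()
    coef = np.zeros(p)
    for _ in range(n_nonzero):
        corr = X.T @ residual
        corr[support] = 0.0                      # never reselect a column
        support.append(int(np.argmax(np.abs(corr))))
        sol, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        residual = y - X[:, support] @ sol
    coef[support] = sol
    return coef

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 20))
true = np.zeros(20)
true[[3, 7]] = [2.0, -1.5]
y = X @ true + 0.01 * rng.normal(size=200)
est = omp(X, y, n_nonzero=2)
```

The orthogonal refit at each step is what distinguishes OMP from plain matching pursuit: previously selected columns cannot leave spurious correlation in the residual.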
MERLiN: Mixture Effect Recovery in Linear Networks
2016
IEEE Journal on Selected Topics in Signal Processing
We introduce MERLiN (Mixture Effect Recovery in Linear Networks), a family of causal inference algorithms that implement a novel means of constructing causal variables from non-causal variables. ...
Given an observed linear mixture, the algorithms can recover a causal variable that is a linear effect of another given variable. ...
We therefore assume that A is an orthogonal d × d matrix and restrict the search to v⊥. ...
doi:10.1109/jstsp.2016.2601144
fatcat:ql43hlti4jh67f44fs3ojn4us4
Stable Prediction via Leveraging Seed Variable
[article]
2020
arXiv
pre-print
By assuming the independence between causal and non-causal variables, we show, both theoretically and with empirical experiments, that our algorithm can precisely separate causal and non-causal variables ...
causal variables with a seed variable as a prior, and adopt them for stable prediction. ...
(ii) The performance of PC-simple is similar to the correlation-based method, since it is hard to find the optimal solution for PC-simple via naive random search; moreover, it relies on the causal sufficiency ...
arXiv:2006.05076v1
fatcat:jhkxzgfx7bfsxdeowfq4pwjlwa
A scalable framework for large time series prediction
2021
Knowledge and Information Systems
The proposed feature selection algorithm shows promising results compared to widely known algorithms, such as classic and kernel principal component analysis, factor analysis, and the fast correlation-based ...
for identifying time series predictors, which analyses the dependencies between time series using the mutual reinforcement principle between Hubs and Authorities of HITS (Hyperlink-Induced Topic Search ...
Filter feature selection includes two classes of algorithms: univariate algorithms and subset search algorithms [35]. ...
doi:10.1007/s10115-021-01544-w
fatcat:c4qyao25nbfrxk6u32ekki6ytm
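To illustrate the univariate class of filter methods distinguished in the snippet, here is a minimal, entirely generic sketch (correlation ranking; not the paper's algorithm, and all names are invented):

```python
import numpy as np

def univariate_filter(X, y, k):
    """Rank features by absolute Pearson correlation with the target
    and keep the top k -- the simplest univariate filter method."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    corr = (Xc * yc[:, None]).sum(axis=0) / (
        np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc) + 1e-12)
    return np.argsort(-np.abs(corr))[:k]

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 10))
y = 3.0 * X[:, 2] - 2.0 * X[:, 5] + 0.1 * rng.normal(size=300)
selected = sorted(int(i) for i in univariate_filter(X, y, k=2))
```

Univariate filters score each feature in isolation; subset search algorithms instead evaluate candidate feature sets jointly, which can catch redundancy and interactions this sketch cannot.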
Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science)
2018
Bioinformatics and Biology Insights
the multilayer associations and causal relations between omics features in integrated omics studies. ...
Random walk kernel is used for smoothing and a permutation test is used to select features of each data set. ...
Integration occurs at the molecular level: the input data are IDs of genes, proteins, and metabolites, merged by these IDs; results are derived using public databases (i.e., pathway enrichment analysis via ...
doi:10.1177/1177932218759292
pmid:29497285
pmcid:PMC5824897
fatcat:nbknjl4qq5awrldy7natmg3h6y
Extreme Learning Machines [Trends & Controversies]
2013
IEEE Intelligent Systems
However, feature engineering requires domain knowledge and human ingenuity to generate appropriate features. ...
A machine learning algorithm's generalization capability depends on the dataset, which is why engineering a dataset's features to represent the data's salient structure is important. ...
Because H is the projected feature space of X squashed via a sigmoid function, we hypothesize that ELM-AE's output weight b will learn to represent the features of the input data via singular values. ...
doi:10.1109/mis.2013.140
fatcat:xaj2ynxqrnfy5ivzkpkdxxaiia
Boosted Convolutional Neural Networks for Motor Imagery EEG Decoding with Multiwavelet-based Time-Frequency Conditional Granger Causality Analysis
[article]
2018
arXiv
pre-print
Specifically, multiwavelet basis functions are first combined with Geweke spectral measure to obtain high-resolution TF-conditional Granger causality (CGC) representations, where a regularized orthogonal ...
Further constructed boosted ConvNets using spatio-temporal convolutions, together with advances in deep learning including cropping and boosting methods, to extract discriminative causality features and ...
Orthogonally decomposing the selected regression matrix Φ (which has full column rank) as the product of a matrix with orthogonal columns and a unit upper triangular matrix. ...
arXiv:1810.10353v1
fatcat:qavwmnwdorhqri3cpdahfoqmiu
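The decomposition described in the snippet above (a full-column-rank regression matrix factored into a matrix with orthogonal columns times a unit upper triangular matrix) can be sketched via numpy's QR factorization. The symbols W and A below are our own labels, since the snippet's notation is truncated away:

```python
import numpy as np

rng = np.random.default_rng(3)
Phi = rng.normal(size=(30, 4))          # full-column-rank regression matrix

# np.linalg.qr gives Q with orthonormal columns and upper triangular R;
# rescaling each column of Q by R's diagonal turns R into a factor with
# a unit diagonal, giving Phi = W @ A in the form the snippet describes.
Q, R = np.linalg.qr(Phi)
d = np.diag(R)
W = Q * d                               # orthogonal (not orthonormal) columns
A = R / d[:, None]                      # unit upper triangular
```

This is the factorization underlying orthogonal least squares: regressing on the orthogonal columns of W decouples the contribution of each selected term.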
Global Selection of Saccadic Target Features by Neurons in Area V4
2014
Journal of Neuroscience
This feature-dependent modulation occurred in the absence of any feature-attention task. ...
Presaccadic suppression was absent when the features of the saccadic target matched the features preferred by individual V4 neurons. ...
relative to trials in which the search target is not comprised of preferred features. ...
doi:10.1523/jneurosci.0867-13.2014
pmid:24806696
pmcid:PMC4012320
fatcat:co7z3ej4yzdmtdcxnim5tsssh4
Showing results 1 — 15 out of 13,652 results