Capturing heterogeneous group differences using mixture-of-experts: Application to a study of aging
2016
NeuroImage
In this paper, we present a method that explicitly models and captures heterogeneous patterns of change in the affected group relative to a reference group of controls. ...
In the case of patient/control comparisons, each such pattern aims to capture a different dimension of a disease, and hence to identify patient subgroups. ...
Acknowledgments This study was supported in part by NIH grant AG014971, the Intramural Research Program, National Institute on Aging, and NIH contract HHSN2712013000284P by the NIA to UPenn. ...
doi:10.1016/j.neuroimage.2015.10.045
pmid:26525656
pmcid:PMC5460911
fatcat:knqmxjzurrfkde7vl73qgvoelu
Mixtures of Experts Models
[article]
2018
arXiv
pre-print
Given their mixture model foundation, mixtures of experts models possess a diverse range of analytic uses, from clustering observations to capturing parameter heterogeneity in cross-sectional data. ...
Mixtures of experts models provide a framework in which covariates may be included in mixture models. ...
Their demonstrated use to cluster observations, and to appropriately capture heterogeneity in cross-sectional data, provides only a glimpse of their potential flexibility and utility in a wide range of ...
arXiv:1806.08200v1
fatcat:amaewljxlveu5hkpdaeujyqlbu
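As context for the survey snippet above: the textbook mixture-of-experts formulation lets the mixture weights depend on covariates through a gating function, with a softmax (multinomial logit) gate being the common choice. A minimal statement of that standard form (generic notation, not necessarily this paper's):

```latex
% Mixture-of-experts density: the gating functions g_k turn the
% mixture weights into functions of the covariates x.
f(y \mid x) \;=\; \sum_{k=1}^{K} g_k(x;\alpha)\, f_k(y \mid x;\theta_k),
\qquad
g_k(x;\alpha) \;=\; \frac{\exp(\alpha_k^\top x)}{\sum_{j=1}^{K}\exp(\alpha_j^\top x)} .
```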
Learning to Adapt Clinical Sequences with Residual Mixture of Experts
[article]
2022
arXiv
pre-print
In this work, we aim to alleviate this limitation by refining a one-fits-all model using a Mixture-of-Experts (MoE) architecture. ...
In this way, the mixture of experts can provide flexible adaptation to the (limited) predictive power of the single base RNN model. ...
The key idea is to specialize the Mixture-of-Experts to learn the residual that δ_base cannot capture. ...
arXiv:2204.02687v1
fatcat:yjhkfsxc7jcyldroeb4tsri3ce
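The snippet above describes learning a gated residual on top of a frozen one-fits-all model. A minimal NumPy sketch of that idea, with linear stand-ins for the base RNN and the experts (the weights w_base, W_exp, W_gate are hypothetical placectholders, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

d, k = 8, 3                       # feature dim, number of experts
w_base = rng.normal(size=(d,))    # frozen "one-fits-all" model weights
W_exp  = rng.normal(size=(k, d))  # each expert predicts a residual
W_gate = rng.normal(size=(d, k))  # gating network weights

def base_model(x):
    # Hypothetical pretrained base model (stand-in for the base RNN).
    return x @ w_base

def predict(x):
    """Base prediction plus a gated sum of expert residuals."""
    gate = softmax(x @ W_gate)        # (n, k) mixture weights
    residuals = x @ W_exp.T           # (n, k) per-expert residual terms
    return base_model(x) + (gate * residuals).sum(axis=-1)

print(predict(rng.normal(size=(5, d))))
```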
M3E2: Multi-gate Mixture-of-experts for Multi-treatment Effect Estimation
[article]
2022
arXiv
pre-print
This work proposes the M3E2, a multi-task learning neural network model to estimate the effect of multiple treatments. ...
In contrast to existing methods, M3E2 can handle multiple treatment effects applied simultaneously to the same unit, continuous and binary treatments, and many covariates. ...
In a multi-gate mixture-of-experts (MMoE) architecture [15], a hard-parameter sharing network can be interpreted as a single expert model. ...
arXiv:2112.07574v2
fatcat:nwflmz2bffcgve2co4n4agfeoa
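For reference, the multi-gate mixture-of-experts pattern mentioned in the snippet gives every task its own gate over a shared pool of experts. A minimal NumPy sketch under assumed shapes (linear experts and output towers; an illustration of MMoE, not the M3E2 implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

d, h, n_experts, n_tasks = 8, 4, 3, 2
W_experts = rng.normal(size=(n_experts, d, h))        # shared experts
W_gates   = rng.normal(size=(n_tasks, d, n_experts))  # one gate per task
W_towers  = rng.normal(size=(n_tasks, h))             # per-task output heads

def mmoe(x):
    """Multi-gate MoE: every task mixes the same experts differently."""
    expert_out = np.einsum('nd,kdh->nkh', x, W_experts)    # (n, K, h)
    outputs = []
    for t in range(n_tasks):
        gate = softmax(x @ W_gates[t])                     # (n, K)
        mixed = np.einsum('nk,nkh->nh', gate, expert_out)  # (n, h)
        outputs.append(mixed @ W_towers[t])                # (n,)
    return np.stack(outputs, axis=1)                       # (n, n_tasks)

print(mmoe(rng.normal(size=(5, d))).shape)  # (5, 2)
```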
Anchoring to Exemplars for Training Mixture-of-Expert Cell Embeddings
[article]
2021
arXiv
pre-print
We propose Treatment ExemplArs with Mixture-of-experts (TEAMs), an embedding learning approach that learns a set of experts that are specialized in capturing technical variations in our training set and the equipment used to collect microscopy images. ...
In contrast, we use a mixture-of-experts approach that is obtained using a single linear projection. ...
arXiv:2112.03208v1
fatcat:dem557uqwjavboekwtwimk66hi
Scenario Adaptive Mixture-of-Experts for Promotion-Aware Click-Through Rate Prediction
[article]
2022
arXiv
pre-print
Technically, it follows the idea of Mixture-of-Experts by adopting multiple experts to learn feature representations, which are modulated by a Feature Gated Network (FGN) via an attention mechanism. ...
In this work, we propose Scenario Adaptive Mixture-of-Experts (SAME), a simple yet effective model that serves both promotion and normal scenarios. ...
Partially inspired by these prior works, we borrow the idea of Mixture-of-Experts. ...
arXiv:2112.13747v2
fatcat:qu7qtomqfbc47j4wrwbb2d7quq
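The SAME snippet describes experts that produce feature representations, re-weighted by a gating network through an attention mechanism. A hedged NumPy sketch of that pattern (the scenario features, weight matrices, and scaled-dot-product gate below are illustrative assumptions, not the paper's FGN):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

d, h, k = 16, 8, 4
W_experts = rng.normal(size=(k, d, h))  # experts producing representations
W_query   = rng.normal(size=(d, h))     # scenario features -> attention query

def gated_representation(x, scenario):
    """Experts' representations, re-weighted by attention computed
    from scenario features (promotion vs. normal, say)."""
    reps = np.einsum('nd,kdh->nkh', x, W_experts)             # (n, k, h)
    query = scenario @ W_query                                # (n, h)
    scores = np.einsum('nh,nkh->nk', query, reps) / np.sqrt(h)
    attn = softmax(scores)                                    # (n, k) gates
    return np.einsum('nk,nkh->nh', attn, reps)                # (n, h)

x = rng.normal(size=(5, d))
scenario = rng.normal(size=(5, d))
print(gated_representation(x, scenario).shape)  # (5, 8)
```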
MECATS: Mixture-of-Experts for Quantile Forecasts of Aggregated Time Series
[article]
2021
arXiv
pre-print
We introduce a mixture of heterogeneous experts framework called MECATS, which simultaneously forecasts the values of a set of time series that are related through an aggregation hierarchy. ...
Different types of forecasting models can be employed as individual experts so that the form of each model can be tailored to the nature of the corresponding time series. MECATS learns hierarchical relationships ...
(Figure caption) Left: point prediction generated by mixture-of-experts; L_recon is used to train the gating network NN_g. ...
arXiv:2112.11669v1
fatcat:iagqb6az4jfy7k7ktlqalvkjzy
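The MECATS snippet emphasizes that the experts can be heterogeneous model types combined by a learned gate. A toy sketch of that idea for point forecasts (the three forecasters and the hand-rolled gate features are hypothetical, not the paper's components):

```python
import numpy as np

rng = np.random.default_rng(0)

# Heterogeneous experts: any forecaster mapping a history to the next value.
experts = [
    lambda y: y[-1],                    # naive last-value forecast
    lambda y: y.mean(),                 # mean forecast
    lambda y: y[-1] + (y[-1] - y[-2]),  # drift forecast
]

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

W_gate = rng.normal(size=(3, 3))  # gating net over simple summary features

def forecast(y):
    """Point forecast as a gated combination of heterogeneous experts."""
    feats = np.array([y[-1], y.mean(), y.std()])  # gate inputs (assumed)
    weights = softmax(W_gate @ feats)             # one weight per expert
    preds = np.array([e(y) for e in experts])
    return weights @ preds

print(forecast(np.array([1.0, 1.2, 1.1, 1.4, 1.6])))
```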
Analysing plant closure effects using time-varying mixture-of-experts Markov chain clustering
2018
Annals of Applied Statistics
In addition, a mixture-of-experts approach allows us to model the probability of belonging to a certain cluster as depending on a set of covariates via a multinomial logit model. ...
In particular, we follow the careers of workers who experience a job displacement due to plant closure and observe, over a period of forty quarters, whether these workers manage to return to a steady career ...
Acknowledgements The research was funded by the Austrian Science Fund (FWF): S10309-G16 (NRN "The Austrian Center for Labor Economics and the Analysis of the Welfare State") and the CD Laboratory "Ageing ...
doi:10.1214/17-aoas1132
fatcat:tqduem3h75h6bj3jss27b4bqrm
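The mixture-of-experts ingredient in this paper is the multinomial logit gate on covariates. A minimal sketch of how such a gate turns covariates into cluster-membership probabilities (the coefficients beta and the covariates are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

def cluster_probs(x, beta):
    """Multinomial logit prior: P(cluster k | covariates x).

    beta holds one coefficient row per non-baseline cluster; the first
    cluster is the baseline with its linear predictor fixed at zero.
    """
    eta = np.concatenate([[0.0], beta @ x])  # baseline + (K-1) logits
    e = np.exp(eta - eta.max())
    return e / e.sum()

beta = rng.normal(size=(2, 3))   # 3 clusters, 3 covariates (hypothetical)
x = np.array([1.0, 0.5, -1.0])   # e.g. intercept, age, tenure
print(cluster_probs(x, beta))    # sums to 1
```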
Labor market entry and earnings dynamics: Bayesian inference using mixtures-of-experts Markov chain clustering
2011
Journal of Applied Econometrics
The statistical challenge in our application comes from the difficulty in extending distance-based clustering approaches to the problem of identifying groups of similar time series in a panel of discrete-valued ...
In order to analyze group membership we present an extension to this approach by formulating a probabilistic model for the latent group indicators within the Bayesian classification rule using a multinomial ...
Acknowledgements This research is supported by the Austrian Science Foundation (FWF) under the grant S 10309-G14 (NRN "The Austrian Center for Labor Economics and the Analysis of the Welfare State", Subproject ...
doi:10.1002/jae.1249
fatcat:ark6c2kct5gpvnqadzbvgricuu
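This entry combines a covariate-driven prior over groups with cluster-specific Markov chain likelihoods. A small sketch of the resulting Bayesian classification rule for one sequence (the two transition matrices and the state coding are invented for illustration):

```python
import numpy as np

def sequence_loglik(seq, trans):
    """Log-likelihood of a discrete-valued sequence under one cluster's
    Markov transition matrix."""
    return sum(np.log(trans[a, b]) for a, b in zip(seq[:-1], seq[1:]))

def posterior_group_probs(seq, trans_by_cluster, prior):
    """Bayesian classification rule: prior (e.g. from a multinomial logit
    on covariates) times the Markov-chain likelihood, normalised."""
    logp = np.log(prior) + np.array(
        [sequence_loglik(seq, T) for T in trans_by_cluster])
    logp -= logp.max()
    p = np.exp(logp)
    return p / p.sum()

# Two hypothetical clusters over 2 labour-market states (0 = employed).
T_stable  = np.array([[0.95, 0.05], [0.30, 0.70]])
T_fragile = np.array([[0.70, 0.30], [0.10, 0.90]])
seq = [0, 0, 1, 1, 1, 0, 1, 1]
print(posterior_group_probs(seq, [T_stable, T_fragile], prior=[0.5, 0.5]))
```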
Steered Mixture-of-Experts for Light Field Images and Video: Representation and Coding
2019
IEEE Transactions on Multimedia
We propose a novel coding framework for higher-dimensional image modalities, called Steered Mixture-of-Experts (SMoE). ...
Index Terms: Mixture of experts, light fields, mixture models, sparse representation, Bayesian modeling. ...
... of a Mixture-of-Experts with one layer for regression. ...
doi:10.1109/tmm.2019.2932614
fatcat:2nuvguaeorguxlkhqir6yzi27e
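In SMoE-style image models, the gates are normalized kernels over pixel coordinates and the experts are simple regressors (e.g. planes) on those coordinates. A 2-D NumPy sketch of that construction, with isotropic kernels standing in for learned steering matrices:

```python
import numpy as np

rng = np.random.default_rng(0)

K = 4  # number of kernels/experts
centers = rng.uniform(0, 1, size=(K, 2))     # kernel centres in the plane
precisions = np.stack([np.eye(2) * 50] * K)  # steering matrices (isotropic here)
planes = rng.normal(size=(K, 3))             # expert: intensity = a*x + b*y + c

def smoe_reconstruct(coords):
    """Soft-gated combination of linear experts; the gates are normalised
    Gaussian kernels over pixel coordinates (a 2-D SMoE sketch)."""
    diff = coords[:, None, :] - centers[None, :, :]            # (n, K, 2)
    quad = np.einsum('nki,kij,nkj->nk', diff, precisions, diff)
    gates = np.exp(-0.5 * quad)
    gates /= gates.sum(axis=1, keepdims=True)                  # (n, K)
    ones = np.ones((coords.shape[0], 1))
    experts = np.hstack([coords, ones]) @ planes.T             # (n, K)
    return (gates * experts).sum(axis=1)

grid = np.stack(np.meshgrid(np.linspace(0, 1, 8),
                            np.linspace(0, 1, 8)), -1).reshape(-1, 2)
print(smoe_reconstruct(grid).shape)  # (64,)
```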
Development in Normal Mixture and Mixture of Experts Modeling
unpublished
Simulation study: To simulate data with m variables per group, we first need to obtain the joint distribution of the m variables. ...
As we mentioned in the application section, a two-component normal mixture is not a good fit for the presumably correlated Z-values, which motivates us to expand the method to testing 2 versus 3 component ...
Then, according to the definition in van der Vaart [2000], we have two bracketing functions l and u with finite L(P)-norms. ...
fatcat:f7yhtjs7cncbxjqftrpidh3vne
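For orientation, the 2-versus-3-component test mentioned in the snippet can be written as a hypothesis test on the mixture order; a standard statement (generic notation, not necessarily the dissertation's):

```latex
% Testing the number of components in a univariate normal mixture:
H_0 : f(z) = \sum_{k=1}^{2} \pi_k\, \phi(z;\mu_k,\sigma_k^2)
\quad\text{vs.}\quad
H_1 : f(z) = \sum_{k=1}^{3} \pi_k\, \phi(z;\mu_k,\sigma_k^2),
\qquad \sum_k \pi_k = 1,\; \pi_k \ge 0 .
```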
Transfer Learning from Well-Curated to Less-Resourced Populations with HIV
2020
Machine Learning in Health Care
We demonstrate its utility for optimising treatments for the first time in a set of HIV patients in Africa, and note how this approach may be applicable to many other scenarios where a variable is measured ...
In this work, we present a novel mixture based approach that uses a deep information bottleneck to transfer patterns learned from European HIV cohorts (where genomic data is readily available) to African ...
In this paper, we adapt this framework to a multi-treatment setting, and incorporate this knowledge into a mixture-of-experts model for reasoning about treatment effects over heterogeneous patient groups ...
dblp:conf/mlhc/ParbhooW0D20
fatcat:3fub3ihtpbhrxjkmwymafufwke
Concordance of Alzheimer's Disease Subtypes Produced from Different Representative Morphological Measures: A Comparative Study
2022
Brain Sciences
However, how the two measures affect the definition of AD subtypes remains unclear. Methods: A total of 180 AD patients from the ADNI database were used to identify AD subgroups. ...
This study provides a valuable reference for selecting features in future studies of AD subtypes. ...
Conflicts of Interest: The authors have no competing interests to declare. ...
doi:10.3390/brainsci12020187
pmid:35203950
pmcid:PMC8869952
fatcat:uc2nc7fyvrcerkinazvhit7p4i
A Generalizable Speech Emotion Recognition Model Reveals Depression and Remission
[article]
2021
bioRxiv
pre-print
Methods: A Mixture-of-Experts machine learning model was trained to infer happy/sad emotional state using three publicly available emotional speech corpora. ...
This study investigated a generalizable approach to aid clinical evaluation of depression and remission from voice. ...
A gradient boosted decision tree model was trained on each dataset separately to predict the probability of sounding happy or sad using CatBoost [28] and combined in a Mixture of Experts (MoE) architecture ...
doi:10.1101/2021.09.01.458536
fatcat:ub6jhmyugrdcbeh3wsjsaclng4
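The snippet describes training one boosted-tree expert per corpus and combining them in an MoE. A hedged sketch of that combination step, using scikit-learn's GradientBoostingClassifier in place of CatBoost and a fixed uniform weighting rather than a learned gate (data is synthetic):

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Three hypothetical emotional speech corpora (stand-ins for the public
# datasets in the paper); one boosted-tree expert trained per corpus.
corpora = [(rng.normal(size=(200, 10)), rng.integers(0, 2, 200))
           for _ in range(3)]
experts = [GradientBoostingClassifier().fit(X, y) for X, y in corpora]

def moe_predict_proba(X, weights=None):
    """MoE combination: weighted average of each expert's P(happy)."""
    if weights is None:
        weights = np.full(len(experts), 1.0 / len(experts))
    probs = np.stack([m.predict_proba(X)[:, 1] for m in experts], axis=1)
    return probs @ weights

print(moe_predict_proba(rng.normal(size=(4, 10))))
```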
Deep Mixed Effect Model using Gaussian Processes: A Personalized and Reliable Prediction for Healthcare
[article]
2019
arXiv
pre-print
To this end, we propose a composite model of a deep neural network to learn complex global trends from the large number of patients, and Gaussian Processes (GP) to probabilistically model individual time-series ...
... that captures global trend across diverse patients and ii) a patient-specific component that models idiosyncratic variability for each patient. ...
For example, the patients can be clustered into groups with different age ranges or different regions of residence. ...
arXiv:1806.01551v3
fatcat:ggydqdhq25dvzcewkz7mb4efwe
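The composite model in this entry pairs a global trend learner with a per-patient probabilistic component. A small scikit-learn sketch under that reading: an MLP fits the pooled trend and a Gaussian process models one patient's residuals (the data and kernel choices are illustrative, not the paper's setup):

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Pooled data from many patients -> global trend network.
t_all = rng.uniform(0, 10, size=(500, 1))
y_all = np.sin(t_all).ravel() + 0.1 * rng.normal(size=500)
global_model = MLPRegressor(hidden_layer_sizes=(32,),
                            max_iter=2000).fit(t_all, y_all)

# One patient's sparse series with an idiosyncratic offset the
# global model misses; a GP models the residual.
t_p = np.array([[1.0], [3.0], [5.0], [7.0]])
y_p = np.sin(t_p).ravel() + 0.5
residual_gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel())
residual_gp.fit(t_p, y_p - global_model.predict(t_p))

def personalized_predict(t):
    """Global trend plus the patient-specific GP correction."""
    return global_model.predict(t) + residual_gp.predict(t)

print(personalized_predict(np.array([[4.0], [6.0]])))
```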
Showing results 1–15 of 167.