Beyond Fine Tuning: A Modular Approach to Learning on Small Data
[article]
2016
arXiv
pre-print
The central impact of using a modular approach comes from adding new representations to a network, as opposed to replacing representations via fine-tuning. ...
In this paper we present a technique to train neural network models on small amounts of data. ...
This allows the modular approach to more robustly handle small data sets than naive fine-tuning. ...
arXiv:1611.01714v1
fatcat:ag3oajbzu5ddvkkb36z6v3qev4
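To make the modular idea in this entry concrete, here is a minimal sketch, assuming a frozen pretrained extractor that returns a flat feature vector; the module shapes, names, and concatenation scheme are illustrative assumptions, not the paper's actual architecture.

```python
# Sketch: keep the pretrained representation frozen and learn a small *added*
# module whose features are concatenated with the frozen ones, instead of
# overwriting them via fine-tuning. Dimensions are placeholders.
import torch
import torch.nn as nn

class ModularHead(nn.Module):
    def __init__(self, pretrained: nn.Module, feat_dim: int, new_dim: int, n_classes: int):
        super().__init__()
        self.pretrained = pretrained              # frozen, provides the existing representation
        for p in self.pretrained.parameters():
            p.requires_grad = False
        self.new_module = nn.Sequential(          # new representation learned on the small dataset
            nn.Linear(feat_dim, new_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim + new_dim, n_classes)

    def forward(self, x):
        with torch.no_grad():
            h = self.pretrained(x)                # existing features, left unchanged
        h_new = self.new_module(h)                # added features
        return self.classifier(torch.cat([h, h_new], dim=-1))
```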
Beyond Fine-tuning: Few-Sample Sentence Embedding Transfer
[article]
2020
arXiv
pre-print
Fine-tuning (FT) pre-trained sentence embedding models on small datasets has been shown to have limitations. ...
We perform evaluation on seven small datasets from NLP tasks and show that our approach with end-to-end training outperforms FT with negligible computational overhead. ...
The authors would also like to acknowledge the support provided by the University of Wisconsin-Madison Office of the Vice Chancellor for Research and Graduate Education with funding from the Wisconsin ...
arXiv:2004.05119v2
fatcat:vyt2nlsryjhdnhjgemikmmgehi
Lightweight Adapter Tuning for Multilingual Speech Translation
[article]
2021
arXiv
pre-print
Starting from different pre-trained models (a multilingual ST trained on parallel data or a multilingual BART (mBART) trained on non-parallel multilingual data), we show that adapters can be used to: ( ...
Adapter modules were recently introduced as an efficient alternative to fine-tuning in NLP. ...
Acknowledgments This work was supported by a Facebook AI SRA grant, and was granted access to the HPC resources of IDRIS under the allocation 2020-AD011011695 made by GENCI. ...
arXiv:2106.01463v2
fatcat:754ss6gtzbhpjbpcn2ijspmvae
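As a hedged illustration of the adapter modules this entry refers to: a small bottleneck block inserted into a frozen pretrained network, with only the adapter trained. The dimensions below are assumptions for illustration, not the paper's configuration.

```python
# Minimal adapter sketch: down-project, nonlinearity, up-project, residual.
# The surrounding pretrained model is assumed frozen; only this block trains.
import torch.nn as nn

class Adapter(nn.Module):
    def __init__(self, d_model: int = 1024, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)
        self.act = nn.ReLU()

    def forward(self, hidden):
        # The residual path keeps the frozen model's behaviour recoverable
        # when the adapter is initialised near zero.
        return hidden + self.up(self.act(self.down(hidden)))
```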
Differentially Private Fine-tuning of Language Models
[article]
2021
arXiv
pre-print
We propose a meta-framework for this problem, inspired by the recent success of highly parameter-efficient methods for fine-tuning. ...
All our experiments suggest that larger models are better suited for private fine-tuning: while they are well known to achieve superior accuracy non-privately, we find that they also better maintain their ...
Janardhan Kulkarni would like to thank Edward Hu for sharing many ideas on fine-tuning. ...
arXiv:2110.06500v1
fatcat:p5kk4zuodbdftlx7jf3yhj45dm
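For orientation, the combination this entry describes (differential privacy plus parameter-efficient fine-tuning) is usually realised with DP-SGD over only the trainable weights. The sketch below is a simplified, assumption-laden version: per-example loops instead of vectorised per-sample gradients, placeholder clip norm, noise multiplier, and learning rate, and the assumption that every trainable parameter receives a gradient.

```python
# Sketch of one DP-SGD step over the trainable (parameter-efficient) weights:
# clip each example's gradient, average, add Gaussian noise, then step.
import torch

def dp_sgd_step(model, loss_fn, batch_x, batch_y,
                clip_norm=1.0, noise_multiplier=1.0, lr=1e-3):
    params = [p for p in model.parameters() if p.requires_grad]
    accum = [torch.zeros_like(p) for p in params]
    for x, y in zip(batch_x, batch_y):                    # per-example gradients
        model.zero_grad()
        loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0)).backward()
        grads = [p.grad.detach().clone() if p.grad is not None
                 else torch.zeros_like(p) for p in params]
        total = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = torch.clamp(clip_norm / (total + 1e-12), max=1.0)  # clip to clip_norm
        for a, g in zip(accum, grads):
            a += g * scale
    n = len(batch_x)
    with torch.no_grad():
        for p, a in zip(params, accum):
            noise = torch.randn_like(p) * noise_multiplier * clip_norm
            p -= lr * (a + noise) / n                     # noisy averaged gradient
```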
Prefix-Tuning: Optimizing Continuous Prompts for Generation
[article]
2021
arXiv
pre-print
We find that by learning only 0.1% of the parameters, prefix-tuning obtains comparable performance in the full data setting, outperforms fine-tuning in low-data settings, and extrapolates better to examples ...
In this paper, we propose prefix-tuning, a lightweight alternative to fine-tuning for natural language generation tasks, which keeps language model parameters frozen, but optimizes a small continuous task-specific ...
A natural approach to this problem is lightweight fine-tuning, which freezes most of the pretrained parameters and augments the model with small trainable modules. ...
arXiv:2101.00190v1
fatcat:bdhj3qnsufcxjml24cndpx43s4
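The mechanism this entry describes, a frozen language model plus a small trained continuous prefix, can be sketched at the embedding level as below. This is a simplification under assumptions: the wrapped model is taken to accept an `inputs_embeds` keyword (as HuggingFace-style models do), and the prefix length and initialisation scale are arbitrary; the paper's actual method conditions every layer, not just the input.

```python
# Sketch of embedding-level prefix-tuning: only `prefix` is trained.
import torch
import torch.nn as nn

class PrefixTuned(nn.Module):
    def __init__(self, frozen_lm: nn.Module, embed: nn.Embedding, prefix_len: int = 10):
        super().__init__()
        self.lm, self.embed = frozen_lm, embed
        for p in list(self.lm.parameters()) + list(self.embed.parameters()):
            p.requires_grad = False                       # base model stays frozen
        d = embed.embedding_dim
        self.prefix = nn.Parameter(torch.randn(prefix_len, d) * 0.02)  # trained "virtual tokens"

    def forward(self, input_ids):
        tok = self.embed(input_ids)                       # (batch, seq, d)
        pre = self.prefix.unsqueeze(0).expand(tok.size(0), -1, -1)
        # Assumes the frozen model accepts precomputed input embeddings.
        return self.lm(inputs_embeds=torch.cat([pre, tok], dim=1))
```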
BayesAdapter: Being Bayesian, Inexpensively and Reliably, via Bayesian Fine-tuning
[article]
2021
arXiv
pre-print
In particular, we propose to adapt the pre-trained deterministic NNs to be BNNs via cost-effective Bayesian fine-tuning. ...
To make BayesAdapter more practical, we technically contribute 1) a modularized, user-friendly implementation for the learning of variational BNNs under two representative variational distributions, 2) ...
As a solution, we opt to explicitly regularize the variational BNNs to behave uncertainly on a collection of OOD data during Bayesian fine-tuning. ...
arXiv:2010.01979v4
fatcat:nhiu3neenjb7johs5hc5husyiq
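A loose sketch of the "Bayesian fine-tuning" idea for a single linear layer: the deterministic pretrained weight becomes the mean of a mean-field Gaussian posterior, and a short variational fine-tuning phase learns the variance via the reparameterisation trick. The prior scale, initial log-variance, and the assumption that the layer has a bias are all illustrative choices, not the paper's recipe.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesianLinear(nn.Module):
    def __init__(self, pretrained: nn.Linear):
        super().__init__()
        self.mu = nn.Parameter(pretrained.weight.detach().clone())      # init at pretrained weights
        self.log_sigma = nn.Parameter(torch.full_like(self.mu, -5.0))   # small initial uncertainty
        self.bias = nn.Parameter(pretrained.bias.detach().clone())

    def forward(self, x):
        sigma = self.log_sigma.exp()
        w = self.mu + sigma * torch.randn_like(sigma)     # reparameterised weight sample
        return F.linear(x, w, self.bias)

    def kl(self, prior_sigma: float = 0.1):
        # KL( N(mu, sigma^2) || N(0, prior_sigma^2) ), summed over weights;
        # during fine-tuning this would be added (suitably scaled) to the data loss.
        sigma = self.log_sigma.exp()
        return (torch.log(prior_sigma / sigma)
                + (sigma ** 2 + self.mu ** 2) / (2 * prior_sigma ** 2) - 0.5).sum()
```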
Enabling Technology for Remote Prosthetic Alignment Tuning
2021
Military medicine
Target alignments were calculated by the machine learning algorithm in the ProSAT software, based on input of kinetic data samples representing the precondition and where a real prosthetic misalignment ...
The sensor has been cross-validated against kinetic measurement in a gait laboratory, and bench testing was performed to validate the performance of the tool while adjusting a prosthetic socket based on ...
We also thank Jonathon Maier, Jung Kim, and Stephen Silverstein for the work on engineering the software foundation that enables this teleprosthetics mobile application. ...
doi:10.1093/milmed/usaa453
pmid:33499549
fatcat:uojdvtrv4rbyto7a6njfx6rhxq
Milepost GCC: Machine Learning Enabled Self-tuning Compiler
2011
International journal of parallel programming
Our approach is to develop a modular, extensible, self-tuning optimization infrastructure to automatically learn the best optimizations across multiple programs and architectures based on the correlation ...
We developed machine learning plugins based on probabilistic and transductive approaches to predict good combinations of optimizations. ...
Using machine learning to predict good optimization passes The Milepost approach to learning optimizations across programs is based on the observation that programs may exhibit similar behavior for a similar ...
doi:10.1007/s10766-010-0161-2
fatcat:r6s7qcgunzf5hcgllcorvkir4m
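A toy illustration of the correlation-based idea described in this entry (not Milepost's actual models): represent each program as a feature vector and reuse the best-known optimisation flags of its nearest neighbour among previously tuned programs.

```python
import numpy as np

def predict_flags(new_features, known_features, known_best_flags):
    """known_features: (n, d) array; known_best_flags: flag string per known program."""
    dists = np.linalg.norm(known_features - new_features, axis=1)
    return known_best_flags[int(np.argmin(dists))]

# Example with made-up feature vectors and flag sets:
feats = np.array([[0.1, 3.0, 7.0], [0.9, 1.0, 2.0]])
flags = ["-O3 -funroll-loops", "-O2 -fno-inline"]
print(predict_flags(np.array([0.2, 2.8, 6.5]), feats, flags))
```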
Scalable Auto-Tuning of Synthesis Parameters for Optimizing High-Performance Processors
2016
Proceedings of the 2016 International Symposium on Low Power Electronics and Design - ISLPED '16
In this paper we present a novel learning-based algorithm for synthesis parameter optimization. ...
Architecture of the STS process, which employs a parallel and iterative tuning process to optimize macros [2]. ...
Moreover, to better estimate the cost based on non-trivial contributing scenarios (i.e., scenarios comprising more than one primitive), the Learning algorithm includes a fine-grained cost estimation (see ...
doi:10.1145/2934583.2934620
dblp:conf/islped/ZieglerLC16
fatcat:f6kj64s4orbuhfiarlotr5lcze
Hominin interbreeding and language evolution: fine-tuning the details
2013
Journal of Anthropological Sciences
A second key concern raised by Bruner is that we must rely on all the evidence if we intend to reach hypotheses that are probable (and eventually, to be able to falsify them). ...
At present we are in a position to try to answer this question by putting together all the archaeological, paleo-neurobiological, genetic, and even molecular data available to date. ...
In doing so they go beyond Premo's position and explicitly argue for a computational approach to the problem that focuses on the procedural system required for planning and executing the involved motor ...
doi:10.4436/jass.91020
pmid:24334493
fatcat:tx7wxbomxffuxhmgeg6l4xy4lm
DBA bandits: Self-driving index tuning under ad-hoc, analytical workloads with safety guarantees
[article]
2020
arXiv
pre-print
Our comprehensive empirical results demonstrate up to 75% speed-up on shifting and ad-hoc workloads and 28% speed-up on static workloads compared against a state-of-the-art commercial tuning tool. ...
We propose a self-driving approach to online index selection that eschews the DBA and query optimiser, and instead learns the benefits of viable structures through strategic exploration and direct performance ...
Learning approaches to optimisation and tuning. Recent years have witnessed new machine learning approaches to automate decision-making processes within databases. ...
arXiv:2010.09208v2
fatcat:567uvwlv45da3b5a2353bmnvby
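In the spirit of the bandit formulation this entry mentions, here is a simplified UCB-style selector over candidate indexes; the arm definition and the speed-up reward are illustrative, and the paper's contextual arms and safety guarantees are richer than this.

```python
import math
import random

class IndexBandit:
    def __init__(self, candidate_indexes):
        self.arms = list(candidate_indexes)
        self.counts = {a: 0 for a in self.arms}
        self.values = {a: 0.0 for a in self.arms}   # running mean of observed speed-up
        self.t = 0

    def select(self):
        self.t += 1
        untried = [a for a in self.arms if self.counts[a] == 0]
        if untried:
            return random.choice(untried)           # try every index at least once
        return max(self.arms, key=lambda a: self.values[a]
                   + math.sqrt(2 * math.log(self.t) / self.counts[a]))  # UCB1 bonus

    def update(self, arm, speedup):
        self.counts[arm] += 1
        self.values[arm] += (speedup - self.values[arm]) / self.counts[arm]
```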
Extensive Parameterization And Tuning of Architecture-Sensitive Optimizations
2011
Procedia Computer Science
We present a framework to support fine-grained parameterization of these optimizations and flexible tuning of their configuration space. ...
We then use a transformation-aware (TA) search algorithm to support flexible tuning of the parameterized transformation scripts to achieve portable high performance. ...
In particular, gemm is compute-bound as it reuses every data item a large number of times during evaluation; gemv is memory bound as only a small fraction of data are reused; ger is severely memory-bound ...
doi:10.1016/j.procs.2011.04.236
fatcat:bxovcxilibhl5n6ntdh4wsoqdy
Autonomous tuning and charge state detection of gate defined quantum dots
[article]
2019
arXiv
pre-print
With growing device complexity and increasing number of functional devices required for measurements, a manual approach to finding suitable gate voltages to confine electrons electrostatically is impractical ...
Here, we implement a two-stage device characterization and dot-tuning process which first determines whether devices are functional and then attempts to tune the functional devices to the single or double ...
Tuning approach Quantum dots are systems confining electrons or holes in regions small enough to make their quantum mechanical energy levels observable. ...
arXiv:1911.10709v2
fatcat:zifyol7vqvfejjljowa34thnmm
Fuzzy Controlled Architecture for Performance Tuning of Database Management System
2012
International Journal of Computer Applications
In this paper, a new tuning architecture based on fuzzy logic is presented, where in, the control action is expressed in linguistic terms. ...
Database tuning is complicated due to the fact that several conflicting tuning parameters have to be adjusted simultaneously for a variety of workload types and the highly unpredictable traffic patterns ...
Our thanks are also due to our esteemed Management for their support. Our Sincere thanks to Computer Center Head Prof. S.R.Mangalwede for providing us with the computing facilities. ...
doi:10.5120/4813-7050
fatcat:xesmapqvhzaydixrowmbljk76i
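A tiny illustration of the fuzzy-control idea this entry describes, where a tuning action follows from linguistic rules over fuzzified load measurements; the membership function, rule, and defuzzification scale are invented for illustration.

```python
def membership_high(x, low=0.5, high=0.9):
    """Degree (0..1) to which a utilisation value x counts as 'high'."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def recommend_buffer_increase(cpu_util, cache_miss_rate):
    # Rule: IF cpu is high AND cache misses are high THEN increase the buffer 'a lot'
    firing = min(membership_high(cpu_util), membership_high(cache_miss_rate))
    return firing * 256  # defuzzified adjustment in MB (scale is illustrative)

print(recommend_buffer_increase(0.85, 0.7))
```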
Automatic Database Management System Tuning Through Large-scale Machine Learning
2017
Proceedings of the 2017 ACM International Conference on Management of Data - SIGMOD '17
To overcome these challenges, we present an automated approach that leverages past experience and collects new information to tune DBMS configurations: we use a combination of supervised and unsupervised ...
Database management system (DBMS) configuration tuning is an essential aspect of any data-intensive application effort. ...
Similar to MySQL, Postgres has a small number of knobs that have a large impact on the performance. ...
doi:10.1145/3035918.3064029
dblp:conf/sigmod/AkenPGZ17
fatcat:skhqczxchnb5zlvxit6dmbbfem
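To sketch the "learn from past configurations, then recommend the next one" loop this entry summarises, the toy example below fits a Gaussian process to previously observed knob settings and their throughput and picks the candidate with the best optimistic estimate. The knob names, numbers, and UCB-like acquisition are assumptions for illustration, not the paper's pipeline.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Past observations: [buffer_pool_MB, max_connections] -> throughput (txn/s)
X_seen = np.array([[512, 100], [1024, 200], [2048, 100], [4096, 400]], dtype=float)
y_seen = np.array([900.0, 1500.0, 1700.0, 1600.0])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True).fit(X_seen, y_seen)

# Candidate knob settings to evaluate next (grid kept tiny for illustration)
candidates = np.array([[b, c] for b in (1024, 2048, 4096, 8192)
                               for c in (100, 200, 400)], dtype=float)
mean, std = gp.predict(candidates, return_std=True)
best = candidates[np.argmax(mean + std)]          # optimistic (UCB-like) choice
print("next configuration to try:", best)
```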
Showing results 1 — 15 out of 11,248 results