A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit <a rel="external noopener" href="https://arxiv.org/pdf/2205.00305v1.pdf">the original URL</a>. The file type is <code>application/pdf</code>.
AdapterBias: Parameter-efficient Token-dependent Representation Shift for Adapters in NLP Tasks
[article]
<span title="2022-04-30">2022</span>
<i>arXiv</i>
<span class="release-stage">pre-print</span>
Transformer-based pre-trained models with millions of parameters require large storage. Recent approaches tackle this shortcoming by training adapters, but these approaches still require a relatively large number of parameters. In this study, AdapterBias, a surprisingly simple yet effective adapter architecture, is proposed. AdapterBias adds a token-dependent shift to the hidden output of transformer layers, adapting to downstream tasks with only a vector and a linear layer. Extensive experiments are conducted to demonstrate the effectiveness of AdapterBias. The experiments show that our proposed method dramatically reduces the number of trainable parameters compared with previous works, with only a minimal decrease in task performance relative to fine-tuned pre-trained models. We further find that AdapterBias automatically learns to assign more significant representation shifts to the tokens related to the task in consideration.
<span class="external-identifiers">
<a target="_blank" rel="external noopener" href="https://arxiv.org/abs/2205.00305v1">arXiv:2205.00305v1</a>
<a target="_blank" rel="external noopener" href="https://fatcat.wiki/release/xgb2jjg2onavrox7mocz4duayy">fatcat:xgb2jjg2onavrox7mocz4duayy</a>
</span>
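The abstract describes the core mechanism as a token-dependent shift built from only a vector and a linear layer. A minimal sketch of that idea in NumPy follows; the names (<code>v</code>, <code>W_alpha</code>, <code>adapter_bias_shift</code>) and the exact composition (one scalar weight per token scaling a shared vector) are assumptions for illustration, not the paper's reference implementation.

```python
import numpy as np

# Sketch of the AdapterBias idea from the abstract (assumed formulation):
# for each token, a linear layer maps its hidden state to a scalar weight
# alpha_i, and the shift added to that token is alpha_i * v, where v is a
# single trainable vector shared across tokens.
rng = np.random.default_rng(0)
d = 8          # hidden size
seq_len = 4    # number of tokens

# trainable parameters: one vector (d values) and one linear layer (d + 1 values)
v = rng.normal(size=d)
W_alpha = rng.normal(size=d)
b_alpha = 0.0

def adapter_bias_shift(hidden):
    """hidden: (seq_len, d) -> token-dependent shift of shape (seq_len, d)."""
    alpha = hidden @ W_alpha + b_alpha   # (seq_len,): one scalar per token
    return alpha[:, None] * v            # token i receives alpha_i * v

hidden = rng.normal(size=(seq_len, d))
shifted = hidden + adapter_bias_shift(hidden)
print(shifted.shape)  # (4, 8)
```

Because every token's shift is a scalar multiple of the same vector <code>v</code>, the shift matrix has rank one, which is what keeps the trainable-parameter count so small relative to bottleneck-style adapters.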
<a target="_blank" rel="noopener" href="https://web.archive.org/web/20220504194629/https://arxiv.org/pdf/2205.00305v1.pdf" title="fulltext PDF download">Web Archive [PDF]</a>
<a target="_blank" rel="external noopener" href="https://arxiv.org/abs/2205.00305v1" title="arxiv.org access">arxiv.org</a>