Transformer-based Language Models for Factoid Question Answering at BioASQ9b
[article] 2021, arXiv pre-print
We have focused on finding the ideal answers and investigated multi-task fine-tuning and gradual unfreezing techniques on transformer-based language models. ...
For factoid questions, our ALBERT-based systems ranked first in test batch 1 and fourth in test batch 2. ...
In addition, we investigated the effect of gradual unfreezing on transformer-based language models using the BioASQ9b dataset. ...
arXiv:2109.07185v1
fatcat:3ffcwp4j7vbtrpgynq7bik35g4
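The first entry's excerpt mentions gradual unfreezing during fine-tuning of transformer-based QA models. Below is a minimal sketch of the idea, assuming a BERT extractive-QA model from the Huggingface transformers library and an illustrative one-layer-per-epoch schedule; the paper's ALBERT-based systems and exact schedule may differ.

```python
# Minimal sketch of gradual unfreezing for a transformer QA model.
# The checkpoint name and the one-layer-per-epoch schedule are
# illustrative assumptions, not the paper's exact configuration.
from transformers import AutoModelForQuestionAnswering

# The QA head (qa_outputs) is randomly initialized for this checkpoint.
model = AutoModelForQuestionAnswering.from_pretrained("bert-base-uncased")

def set_trainable_layers(model, num_unfrozen: int) -> None:
    """Freeze everything, then unfreeze the QA head and the top
    `num_unfrozen` encoder layers."""
    for param in model.parameters():
        param.requires_grad = False
    for param in model.qa_outputs.parameters():
        param.requires_grad = True
    encoder_layers = model.bert.encoder.layer  # ModuleList of 12 layers
    for layer in encoder_layers[len(encoder_layers) - num_unfrozen:]:
        for param in layer.parameters():
            param.requires_grad = True

# Gradual unfreezing: one additional encoder layer per epoch.
num_epochs = 3
for epoch in range(num_epochs):
    set_trainable_layers(model, num_unfrozen=epoch + 1)
    # ... run one epoch of fine-tuning on BioASQ factoid data here ...
```

The intuition behind such a schedule is that the task-specific head and upper layers adapt first, while the lower, more general layers stay frozen until later epochs, which can help avoid catastrophic forgetting on a small biomedical dataset.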
Query-Focused Extractive Summarisation for Finding Ideal Answers to Biomedical and COVID-19 Questions
[article] 2021, arXiv pre-print
For phase B of the BioASQ9b task, the relevant documents and snippets were already included in the test data. ...
The Synergy Task is an end-to-end question answering task on COVID-19 where systems are required to return relevant documents, snippets, and answers to a given question. ...
All of these variants were based on models made available by the Huggingface transformers repository. BERT: We used Huggingface's model "bert-base-uncased". ...
arXiv:2108.12189v2
fatcat:3u7dyoyilfbtralfnuj4nmrcpy
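The second entry notes that its BERT variant loads Huggingface's "bert-base-uncased" checkpoint. The sketch below shows how that checkpoint is loaded with the transformers library and, as an illustrative stand-in for the paper's trained sentence scorer, ranks candidate snippet sentences against a question by mean-pooled embedding similarity; the question and sentences are made up for the example.

```python
# Minimal sketch: load "bert-base-uncased" from the Huggingface transformers
# library and rank candidate snippet sentences against a question by
# embedding similarity. The similarity-based ranking is an illustrative
# stand-in, not the scoring method described in the paper.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pooled last-hidden-state embedding of a single text."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

# Made-up BioASQ-style question and candidate sentences for illustration.
question = "What is the mode of inheritance of Wilson's disease?"
sentences = [
    "Wilson's disease is an autosomal recessive disorder of copper metabolism.",
    "The study enrolled 120 patients across three hospitals.",
]

q_vec = embed(question)
scores = [torch.cosine_similarity(q_vec, embed(s), dim=0).item() for s in sentences]
for score, sentence in sorted(zip(scores, sentences), reverse=True):
    print(f"{score:.3f}  {sentence}")
```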