
Transformer-based Language Models for Factoid Question Answering at BioASQ9b [article]

Urvashi Khanna, Diego Mollá
2021 arXiv   pre-print
We have focused on finding the ideal answers and investigated multi-task fine-tuning and gradual unfreezing techniques on transformer-based language models. ... For factoid questions, our ALBERT-based systems ranked first in test batch 1 and fourth in test batch 2. ... In addition, we investigated the effect of gradual unfreezing on transformer-based language models using the BioASQ9b dataset. ...
arXiv:2109.07185v1
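The gradual-unfreezing technique mentioned in the abstract can be illustrated with a small schedule function. This is a generic sketch of the technique, not the paper's implementation: the layer names and the one-layer-per-epoch schedule are assumptions for illustration.

```python
# Hypothetical layer groups, top of the network last. The paper's actual
# architecture and unfreezing schedule are not given in the snippet above.
LAYERS = ["embeddings", "encoder.0", "encoder.1", "encoder.2", "classifier"]

def trainable_layers(epoch, layers=LAYERS):
    """Return the layer groups that are unfrozen at a given epoch (0-indexed).

    Gradual unfreezing: epoch 0 trains only the top layer; each later
    epoch unfreezes one additional group, top to bottom, until the whole
    model is trainable. This limits catastrophic forgetting of the
    pre-trained lower layers early in fine-tuning.
    """
    n_unfrozen = min(epoch + 1, len(layers))
    return layers[-n_unfrozen:]

if __name__ == "__main__":
    for epoch in range(3):
        print(epoch, trainable_layers(epoch))
```

In a real fine-tuning loop, the returned names would be used to set `requires_grad` on the matching parameter groups before each epoch.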

Query-Focused Extractive Summarisation for Finding Ideal Answers to Biomedical and COVID-19 Questions [article]

Diego Mollá, Urvashi Khanna, Dima Galat, Vincent Nguyen, Maciej Rybinski
2021 arXiv   pre-print
For phase B of the BioASQ9b task, the relevant documents and snippets were already included in the test data. ... The Synergy Task is an end-to-end question answering task on COVID-19 where systems are required to return relevant documents, snippets, and answers to a given question. ... All of these variants were based on models made available by the Hugging Face transformers repository. BERT: We used Hugging Face's model "bert-base-uncased". ...
arXiv:2108.12189v2
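Query-focused extractive summarisation, as named in this paper's title, selects the snippet sentences most relevant to the question. The sketch below uses bag-of-words cosine similarity purely as an illustration of the extractive setup; the systems described above rank sentences with transformer-based representations, not raw word counts.

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def summarise(question: str, sentences: list, k: int = 1) -> list:
    """Return the k sentences most similar to the question.

    Extractive, query-focused: the "ideal answer" is assembled from
    existing sentences rather than generated.
    """
    q = Counter(question.lower().split())
    scored = sorted(sentences,
                    key=lambda s: cosine(q, Counter(s.lower().split())),
                    reverse=True)
    return scored[:k]
```

Swapping the word-count vectors for sentence embeddings from a pre-trained transformer turns this into the kind of neural ranker the paper's systems use.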