Word-Level Embeddings for Cross-Task Transfer Learning in Speech Processing
2021 29th European Signal Processing Conference (EUSIPCO)
Recent breakthroughs in deep learning often rely on representation learning and knowledge transfer. In recent years, unsupervised and self-supervised techniques for learning speech representations have been developed to foster automatic speech recognition. To date, most of these approaches are task-specific and designed for within-task transfer learning between different datasets or setups of a particular task. In turn, learning task-independent representations of speech and cross-task applications …
doi:10.23919/eusipco54536.2021.9616254
fatcat:ioqudb3t7vfirg33jinjg6qoju