SN Applied Sciences
Transfer learning can replace long and costly data collection, labeling, and training sessions with effective and efficient pretrained representations. BERT, trained by Google, is a language representation generator that is general enough to effectively capture the representations of natural language, encoding grammatical structures and the inter-dependencies of linguistic attributes numerically. In this work, we introduce a recurrent BERT network and a singular BERT network.

doi:10.1007/s42452-019-1765-9