RBN: enhancement in language attribute prediction using global representation of natural language transfer learning technology like Google BERT

Chiranjib Sur
2019 SN Applied Sciences  
Transfer learning can replace long and costly data collection, labeling, and training with effective and efficient representations. BERT, trained by Google, is a language-representation generator that is global enough to effectively determine representations of natural languages and to produce numerical encodings of grammatical structures and the inter-dependencies of language attributes. In this work, we introduce the recurrent BERT network and the singular BERT network and demonstrate an effective way of utilizing BERT for applications such as parts-of-speech tagging and phrase tagging, which are integral parts of understanding language structure and interpreting messages. We achieve very high prediction accuracy with these models and present a comparative study on different datasets, aimed at applications related to sentence generation. We obtained an accuracy of 96.65% for parts-of-speech detection and 95.24% for phrase prediction on Penn Treebank sentences, and 99.64% and 97.94% respectively on MSCOCO dataset sentences. Evaluating sentences of different origins shows that, while human-generated sentences are complex and error-prone and therefore hard to tag with high accuracy, provision for attribute detection on machine-generated sentences must be kept open for progress in language understanding.
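The tagging setup the abstract describes — contextual token representations from a BERT-style encoder, classified per token into tags — can be sketched minimally as below. This is an illustrative assumption, not the paper's actual RBN/SBN architecture: `fake_bert_encode`, the toy 8-dimensional vectors, and the four-tag set are all stand-ins (a real pipeline would use a pretrained BERT producing 768-dimensional vectors and a trained classifier head).

```python
import random

random.seed(0)  # deterministic toy weights

TAGS = ["NOUN", "VERB", "DET", "ADJ"]  # toy tag set; Penn Treebank uses 45 POS tags
DIM = 8  # toy embedding width; BERT-base outputs 768-dim vectors

def fake_bert_encode(tokens):
    """Stand-in for a BERT forward pass: one contextual vector per token."""
    return [[random.random() for _ in range(DIM)] for _ in tokens]

def tag_tokens(tokens, head_weights):
    """Per-token linear classifier head over encoder output, argmax decode."""
    vectors = fake_bert_encode(tokens)
    tags = []
    for vec in vectors:
        scores = [sum(w * x for w, x in zip(row, vec)) for row in head_weights]
        tags.append(TAGS[scores.index(max(scores))])
    return tags

# Untrained random head -- this only illustrates the per-token input/output
# shape of the task, not the trained accuracy reported in the paper.
head = [[random.random() - 0.5 for _ in range(DIM)] for _ in TAGS]
tokens = ["the", "cat", "sat"]
predicted = tag_tokens(tokens, head)
print(predicted)  # one tag per token
```

The key structural point is that the sequence length is preserved: every token receives exactly one tag, regardless of how the encoder computes its representations.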
doi:10.1007/s42452-019-1765-9