A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2019.
This paper integrates Google's current most powerful NLP transfer-learning model, BERT, with the traditional state-of-the-art BiLSTM-CRF model to solve the problem of named entity recognition. A bi-directional LSTM can consider an effectively unlimited amount of context on both sides of a word, eliminating the limited-context problem inherent in any feed-forward model. Google's model applies a feed-forward neural network, which weakens its performance; we seek to address these limitations.

doi:10.1088/1742-6596/1267/1/012017
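In a BiLSTM-CRF tagger, the CRF layer sits on top of the BiLSTM and picks the globally best tag sequence rather than choosing each tag independently, which is done with Viterbi decoding. As a minimal illustrative sketch (not the paper's implementation), here is Viterbi decoding in pure Python; the tag set, emission scores, and transition scores below are invented for the example:

```python
def viterbi_decode(emissions, transitions, tags):
    """Return the highest-scoring tag sequence.

    emissions:   list of dicts, one per token, mapping tag -> emission score
                 (in a BiLSTM-CRF these come from the BiLSTM outputs)
    transitions: dict mapping (prev_tag, cur_tag) -> transition score
                 (learned by the CRF layer)
    tags:        list of possible tags
    """
    # best[t] = (best score of any path ending in tag t, that path)
    best = {t: (emissions[0][t], [t]) for t in tags}
    for em in emissions[1:]:
        new = {}
        for cur in tags:
            score, path = max(
                (best[prev][0] + transitions[(prev, cur)] + em[cur],
                 best[prev][1])
                for prev in tags
            )
            new[cur] = (score, path + [cur])
        best = new
    score, path = max(best.values())
    return path


# Toy example: two tokens, two tags. Scores are made up for illustration.
tags = ["B", "O"]
transitions = {("B", "B"): -1.0, ("B", "O"): 0.0,
               ("O", "B"): 0.0, ("O", "O"): 0.0}
emissions = [{"B": 2.0, "O": 0.0},   # token 1 strongly looks like "B"
             {"B": 1.0, "O": 0.5}]   # token 2 is ambiguous
best_path = viterbi_decode(emissions, transitions, tags)
```

Because the transition score penalizes "B" followed by "B", the decoder prefers the sequence ["B", "O"] even though token 2's emission score slightly favors "B" in isolation; this global view over the whole sequence is what the CRF layer contributes.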