Resources [chapter]

Zhiyuan Liu, Yankai Lin, Maosong Sun
2020 Representation Learning for Natural Language Processing  
Deep learning has proven to be a powerful method for a variety of artificial intelligence tasks, including many critical tasks in NLP. However, training a deep neural network is usually very time-intensive and requires a substantial amount of code to build the related models. To alleviate these issues, several deep learning frameworks have been developed and released. They incorporate the arithmetic operators needed to construct neural networks, and they exploit hardware features such as multi-core CPUs and many-core GPUs to shorten training time. Each framework has its advantages and disadvantages. In this chapter, we aim to exhibit the features and running performance of these frameworks so that users can select an appropriate framework for their usage.
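To make the role of such frameworks concrete, the following is a minimal sketch, assuming PyTorch as the framework (not discussed at this point in the text), of how built-in operators and automatic hardware placement replace hand-written numerical code:

```python
# Minimal illustrative sketch: a framework supplies ready-made operators
# and handles CPU/GPU placement; PyTorch is used here only as an example.
import torch
import torch.nn as nn

# Built-in operators (Linear, ReLU) stand in for hand-written matrix code.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Exploit available hardware: use a GPU if present, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

# One training step on dummy data; gradients are computed automatically.
x = torch.randn(32, 784, device=device)         # dummy mini-batch of inputs
y = torch.randint(0, 10, (32,), device=device)  # dummy labels
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
```

The same model definition runs unchanged on CPU or GPU, which is the kind of hardware abstraction the frameworks compared in this chapter provide.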
doi:10.1007/978-981-15-5573-2_10