Knowledge Distillation of Japanese Morphological Analyzer
日本語形態素解析器の知識蒸留
2021
In this study, we apply knowledge distillation to the Japanese morphological analyzer rakkyo and evaluate whether the method compresses its model size and whether training converges on smaller datasets. Japanese morphological analyzers have recently achieved high performance in both accuracy and speed. From the viewpoint of practical use, however, it is preferable to reduce the model size. The rakkyo model, among others, succeeded in significantly reducing its model size by using only …
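For reference, knowledge distillation in its standard form (Hinton et al., 2015) trains a small student model to match a large teacher's softened output distribution instead of (or in addition to) the gold labels. The sketch below is a generic PyTorch formulation of that loss, not the specific setup used for rakkyo, which the preserved abstract does not detail; the temperature `T` and mixing weight `alpha` are illustrative hyperparameters.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      T: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Weighted sum of (a) KL divergence between the temperature-softened
    teacher and student distributions and (b) ordinary cross-entropy
    against the gold labels. Hyperparameters here are illustrative."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale by T^2 so gradient magnitudes match the hard loss
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In a distillation setup like the one the abstract describes, the teacher's logits would come from the full-size analyzer run over the training text, while the student is a smaller network trained with this combined objective; shrinking the student, and the amount of teacher-labeled data, is what the paper evaluates.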
doi:10.11517/pjsai.jsai2021.0_4j1gs6d02
fatcat:r3oy6oegsngf5kgd6v7x4z4vhi