Cross-entropy and estimation of probabilistic context-free grammars

Anna Corazza, Giorgio Satta
Proceedings of the Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics, Main Conference, 2006
We investigate the problem of training probabilistic context-free grammars on the basis of a distribution defined over an infinite set of trees, by minimizing the cross-entropy. This problem can be seen as a generalization of the well-known maximum likelihood estimator on (finite) tree banks. We prove an unexpected theoretical property of grammars that are trained in this way, namely, we show that the derivational entropy of the grammar takes the same value as the cross-entropy between the input distribution and the grammar itself. We show that the result also holds for the widely applied maximum likelihood estimator on tree banks.
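The treebank case of the theorem can be checked numerically. The sketch below, on a hypothetical toy treebank (the trees, nonterminal names, and helper functions are illustrative, not from the paper), estimates rule probabilities by relative frequency (the maximum likelihood estimator), then computes the cross-entropy between the empirical tree distribution and the grammar alongside the derivational entropy of the grammar; a non-recursive grammar is used so that the derivational entropy is a finite sum over all generated trees.

```python
import math
from collections import Counter
from itertools import product

# Hypothetical toy treebank: trees are nested tuples (nonterminal, children...).
t1 = ("S", ("A", "a"), ("B", "b"))
t2 = ("S", ("A", "c"), ("B", "b"))
t3 = ("S", ("A", "a"), ("B", "d"))
treebank = [t1, t1, t2, t3]

def rules(tree):
    """Yield every (lhs, rhs) rule occurrence in a tree."""
    lhs, *children = tree
    rhs = tuple(c[0] if isinstance(c, tuple) else c for c in children)
    yield (lhs, rhs)
    for c in children:
        if isinstance(c, tuple):
            yield from rules(c)

# Maximum likelihood estimation: relative frequency of each rule
# among all rules sharing the same left-hand side.
counts = Counter(r for t in treebank for r in rules(t))
lhs_totals = Counter()
for (lhs, rhs), n in counts.items():
    lhs_totals[lhs] += n
prob = {r: n / lhs_totals[r[0]] for r, n in counts.items()}

def tree_prob(tree):
    p = 1.0
    for r in rules(tree):
        p *= prob[r]
    return p

# Enumerate all trees generated by this (finite-language) grammar.
by_lhs = {}
for (lhs, rhs) in prob:
    by_lhs.setdefault(lhs, []).append(rhs)

def expand(sym):
    if sym not in by_lhs:            # terminal symbol
        yield sym
    else:
        for rhs in by_lhs[sym]:
            for kids in product(*(expand(s) for s in rhs)):
                yield (sym,) + kids

all_trees = list(expand("S"))

# Cross-entropy (in nats) between the empirical distribution and the grammar.
N = len(treebank)
cross_ent = -sum(math.log(tree_prob(t)) for t in treebank) / N

# Derivational entropy (in nats) of the estimated grammar.
deriv_ent = -sum(tree_prob(t) * math.log(tree_prob(t)) for t in all_trees)

print(f"cross-entropy:        {cross_ent:.6f}")
print(f"derivational entropy: {deriv_ent:.6f}")
```

On this toy instance the two quantities coincide even though the grammar assigns positive probability to a tree (A rewriting to c together with B rewriting to d) that never occurs in the treebank, illustrating the non-obvious nature of the identity.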
doi:10.3115/1220835.1220878