This paper introduces two approximations of the Kullback-Leibler divergence for hidden Markov models (HMMs). The first is a generalization of an approximation originally presented for HMMs with discrete observation densities; in that case, the HMMs are assumed to be ergodic and to have similar topologies. The second is a modification of the first: the HMM topologies are assumed to be left-to-right with no skips, but the models may have different numbers of states, unlike in the first.

doi:10.1109/icassp.2002.5743946 dblp:conf/icassp/ViholaHSSS02
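The abstract does not spell out the closed-form approximations themselves. As background, the standard baseline that such approximations replace is the Monte Carlo estimate of the KL divergence rate between two HMMs (sample observation sequences from the first model, score each sequence under both, and average the log-likelihood difference per observation). The sketch below, assuming discrete-output HMMs given as `(pi, A, B)` tuples (all names hypothetical, not from the paper), illustrates that baseline:

```python
import numpy as np

def sample_hmm(pi, A, B, T, rng):
    # Draw an observation sequence of length T from a discrete-output HMM
    # with initial distribution pi, transitions A, and emission matrix B.
    s = rng.choice(len(pi), p=pi)
    obs = np.empty(T, dtype=int)
    for t in range(T):
        obs[t] = rng.choice(B.shape[1], p=B[s])
        s = rng.choice(A.shape[1], p=A[s])
    return obs

def log_likelihood(pi, A, B, obs):
    # Scaled forward algorithm: accumulate log of the per-step
    # normalization constants to get log P(obs | model).
    alpha = pi * B[:, obs[0]]
    c = alpha.sum(); alpha = alpha / c; ll = np.log(c)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum(); alpha = alpha / c; ll += np.log(c)
    return ll

def mc_kl(hmm_p, hmm_q, T=200, n_seq=100, seed=0):
    # Monte Carlo estimate of the KL divergence rate D(P || Q),
    # normalized per observation symbol.
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_seq):
        obs = sample_hmm(*hmm_p, T, rng)
        total += log_likelihood(*hmm_p, obs) - log_likelihood(*hmm_q, obs)
    return total / (n_seq * T)
```

The estimate is zero when both arguments are the same model and positive (in expectation) otherwise; its cost grows with the number and length of sampled sequences, which is the practical motivation for cheaper closed-form approximations such as those the paper proposes.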