A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2017; you can also visit the original URL.
File type: application/pdf.
Learning Conditional Independence Tree for Ranking
Fourth IEEE International Conference on Data Mining (ICDM'04)
Accurate ranking is desired in many real-world data mining applications. Traditional learning algorithms, however, aim only at high classification accuracy. It has been observed that both traditional decision trees and naive Bayes produce good classification accuracy but poor probability estimates. In this paper, we use a new model, the conditional independence tree (CITree), which combines decision trees and naive Bayes and is both more suitable for ranking and more learnable in practice.
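For readers of this record, a minimal sketch of the idea described in the abstract follows: a shallow decision tree whose leaves each hold a naive Bayes model, so that the probability estimates used for ranking come from naive Bayes rather than raw leaf class frequencies. This is an illustrative reconstruction under that assumption, not the authors' exact CITree algorithm; the class name NaiveBayesTree and all parameters here are hypothetical.

# Sketch of a decision tree with naive Bayes models at the leaves,
# in the spirit of the CITree abstract above (an assumption, not the
# paper's exact construction).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

class NaiveBayesTree:
    def __init__(self, max_depth=3):
        self.tree = DecisionTreeClassifier(max_depth=max_depth)
        self.leaf_models = {}

    def fit(self, X, y):
        self.tree.fit(X, y)
        self.classes_ = self.tree.classes_
        leaves = self.tree.apply(X)            # leaf index for each sample
        for leaf in np.unique(leaves):
            mask = leaves == leaf
            if len(np.unique(y[mask])) > 1:    # naive Bayes needs >= 2 classes
                self.leaf_models[leaf] = GaussianNB().fit(X[mask], y[mask])
        return self

    def predict_proba(self, X):
        proba = np.zeros((len(X), len(self.classes_)))
        leaves = self.tree.apply(X)
        for i, leaf in enumerate(leaves):
            nb = self.leaf_models.get(leaf)
            if nb is None:                     # pure leaf: fall back to the tree
                proba[i] = self.tree.predict_proba(X[i:i + 1])[0]
            else:
                # Map the leaf model's class order onto the tree's class order.
                for cls, p in zip(nb.classes_, nb.predict_proba(X[i:i + 1])[0]):
                    proba[i, np.searchsorted(self.classes_, cls)] = p
        return proba

Ranking quality for such models is commonly measured with AUC, e.g. sklearn.metrics.roc_auc_score applied to the scores returned by predict_proba.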
doi:10.1109/icdm.2004.10021
dblp:conf/icdm/SuZ04
fatcat:3jihkfhnajacbkju2oaxsip57q