Representational Gradient Boosting: Backpropagation in the Space of Functions

Gilmer Valdes, Jerome Friedman, Fei Jiang, Efstathios Gennatas
2021 IEEE Transactions on Pattern Analysis and Machine Intelligence  
The estimation of nested functions (i.e., functions of functions) is one of the central reasons for the success and popularity of machine learning. Today, artificial neural networks are the predominant class of algorithms in this area, known as representational learning. Here, we introduce Representational Gradient Boosting (RGB), a nonparametric algorithm that estimates functions with multi-layer architectures obtained using backpropagation in the space of functions. RGB does not need to assume a functional form in the nodes or output (e.g., linear models or rectified linear units), but rather estimates these transformations. RGB can be seen as an optimized stacking procedure in which a meta-algorithm learns how to combine different classes of functions (e.g., Neural Networks (NN) and Gradient Boosting (GB)) while building and optimizing them jointly, so that they compensate for each other's weaknesses. This is a stark difference from current approaches to meta-learning, which combine models only after they have been built independently. We show that this optimized stacking is one of the main advantages of RGB over current approaches. Additionally, owing to the nested nature of RGB, we show that it improves over GB on problems with several high-order interactions.
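
The abstract gives no pseudocode, and RGB itself operates directly in function space. As a loose illustration of the "build and optimize jointly" idea it describes, the toy sketch below alternates between refitting two scikit-learn base learners (an NN and a GB model) toward functional-gradient-corrected targets and refitting a linear meta-combiner on their outputs. Every name, update rule, and hyperparameter here is my assumption for illustration, not the authors' algorithm.

```python
# Toy sketch of jointly optimized stacking (an assumption-laden stand-in,
# NOT the RGB algorithm from the paper).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
# Toy target with an interaction term, where joint fitting could help.
y = np.sin(X[:, 0]) * X[:, 1] + 0.1 * rng.normal(size=500)

nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
gb = GradientBoostingRegressor(n_estimators=50, random_state=0)
meta = LinearRegression()

nn.fit(X, y)
gb.fit(X, y)
lr = 0.5  # step size for the functional-gradient-style updates
for _ in range(3):
    Z = np.column_stack([nn.predict(X), gb.predict(X)])
    meta.fit(Z, y)                 # meta-algorithm learns the combination
    resid = y - meta.predict(Z)
    # For squared error, the negative functional gradient w.r.t. each base
    # learner's output is its meta weight times the residual (chain rule),
    # so each learner is refit toward a gradient-corrected target.
    nn.fit(X, nn.predict(X) + lr * meta.coef_[0] * resid)
    gb.fit(X, gb.predict(X) + lr * meta.coef_[1] * resid)

Z = np.column_stack([nn.predict(X), gb.predict(X)])
print("train MSE:", float(np.mean((y - meta.predict(Z)) ** 2)))
```

The contrast with conventional stacking is that the base learners above are retrained under the current combination rather than frozen after an independent fit, which is the distinction the abstract emphasizes.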
doi:10.1109/tpami.2021.3137715 pmid:34941500