Accelerated Gradient Boosting - Université Rennes 2
Journal article. Machine Learning, 2019.

Accelerated Gradient Boosting

Abstract

Gradient tree boosting is a prediction algorithm that sequentially produces a model in the form of linear combinations of decision trees, by solving an infinite-dimensional optimization problem. We combine gradient boosting and Nesterov's accelerated descent to design a new algorithm, which we call AGB (for Accelerated Gradient Boosting). Substantial numerical evidence is provided on both synthetic and real-life data sets to assess the excellent performance of the method in a large variety of prediction problems. It is empirically shown that AGB is much less sensitive to the shrinkage parameter and outputs predictors that are considerably more sparse in the number of trees, while retaining the exceptional performance of gradient boosting.
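The combination described above can be illustrated with a minimal sketch: plain gradient boosting fits each new tree to the residuals of the current model, while the accelerated variant evaluates the residuals at a Nesterov-style extrapolated model and applies a momentum step after each tree. The sketch below assumes squared-error loss, shallow scikit-learn regression trees, and the classical Nesterov lambda sequence; the function name `agb_fit` and all parameter defaults are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def agb_fit(X, y, n_rounds=50, shrinkage=0.1, max_depth=2):
    """Sketch of accelerated gradient boosting (squared-error loss).

    Maintains two prediction sequences on the training set:
      f -- the main boosted model,
      g -- the extrapolated (momentum) model at which residuals are taken.
    Each round fits a small tree to y - g, then updates g with a
    Nesterov momentum step. Hypothetical illustration only.
    """
    n = len(y)
    f = np.zeros(n)          # current model predictions
    g = np.zeros(n)          # extrapolated predictions
    lam = 1.0                # Nesterov lambda sequence, lambda_1 = 1
    trees = []               # kept so the ensemble could score new points
    for _ in range(n_rounds):
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, y - g)                       # pseudo-residuals at g
        f_new = g + shrinkage * tree.predict(X)  # shrunken tree update
        lam_new = (1.0 + np.sqrt(1.0 + 4.0 * lam ** 2)) / 2.0
        momentum = (lam - 1.0) / lam_new         # 0 on the first round
        g = f_new + momentum * (f_new - f)       # Nesterov extrapolation
        f, lam = f_new, lam_new
        trees.append(tree)
    return f, trees
```

With `momentum = 0` throughout, the loop reduces to ordinary gradient tree boosting; the extrapolation step is the only change, which is why the method can reuse standard tree-fitting machinery unchanged.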
Main file: biau-cadre-rouviere.pdf (459.64 KB)
Origin: files produced by the author(s)

Dates and versions

hal-01723843, version 1 (05-03-2018)
hal-01723843, version 2 (05-03-2019)

Cite

Gérard Biau, Benoît Cadre, Laurent Rouviere. Accelerated Gradient Boosting. Machine Learning, In press, 22 p. ⟨10.1007/s10994-019-05787-1⟩. ⟨hal-01723843v1⟩

Collections

ENS-RENNES
369 views
244 downloads
