Title

Distributional Gradient Boosting Machines

Authors

Alexander März, Thomas Kneib

Abstract

We present a unified probabilistic gradient boosting framework for regression tasks that models and predicts the entire conditional distribution of a univariate response variable as a function of covariates. Our likelihood-based approach allows us to either model all conditional moments of a parametric distribution, or to approximate the conditional cumulative distribution function via Normalizing Flows. As underlying computational backbones, our framework is based on XGBoost and LightGBM. Modelling and predicting the entire conditional distribution greatly enhances existing tree-based gradient boosting implementations, as it allows the creation of probabilistic forecasts from which prediction intervals and quantiles of interest can be derived. Empirical results show that our framework achieves state-of-the-art forecast accuracy.
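
The abstract describes fitting all parameters of a parametric response distribution with gradient-boosted trees by minimizing the negative log-likelihood, and then deriving prediction intervals and quantiles from the resulting probabilistic forecast. The sketch below is a minimal illustration of that idea for a Gaussian response, assuming scikit-learn regression trees as base learners; it is not the authors' XGBoost/LightGBM-based implementation and omits the Normalizing-Flows variant.

# Illustrative sketch (not the paper's implementation): distributional gradient
# boosting for a Gaussian response. Both the conditional mean mu(x) and the
# conditional log-scale eta(x) = log sigma(x) are boosted tree ensembles fitted
# on the negative gradients of the Gaussian negative log-likelihood (NLL).
import numpy as np
from scipy.stats import norm
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic heteroscedastic data: both location and scale depend on x.
n = 2000
X = rng.uniform(-3, 3, size=(n, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.2 + 0.3 * np.abs(X[:, 0]))

n_rounds, lr = 200, 0.05
mu = np.full(n, y.mean())          # initial location estimate
eta = np.full(n, np.log(y.std()))  # initial log-scale estimate
trees_mu, trees_eta = [], []

for _ in range(n_rounds):
    sigma2 = np.exp(2.0 * eta)
    # Negative gradients of NLL = log sigma + (y - mu)^2 / (2 sigma^2) + const,
    # taken with respect to mu and eta = log sigma.
    grad_mu = (y - mu) / sigma2              # -dNLL/dmu
    grad_eta = (y - mu) ** 2 / sigma2 - 1.0  # -dNLL/deta

    t_mu = DecisionTreeRegressor(max_depth=3).fit(X, grad_mu)
    t_eta = DecisionTreeRegressor(max_depth=3).fit(X, grad_eta)
    mu += lr * t_mu.predict(X)
    eta += lr * t_eta.predict(X)
    trees_mu.append(t_mu)
    trees_eta.append(t_eta)

def predict_distribution(X_new):
    """Return the predicted conditional mean and standard deviation."""
    mu_new = np.full(len(X_new), y.mean())
    eta_new = np.full(len(X_new), np.log(y.std()))
    for t_mu, t_eta in zip(trees_mu, trees_eta):
        mu_new += lr * t_mu.predict(X_new)
        eta_new += lr * t_eta.predict(X_new)
    return mu_new, np.exp(eta_new)

# Probabilistic forecast: prediction intervals from the fitted conditional distribution.
X_test = np.array([[-2.0], [0.0], [2.0]])
mu_hat, sigma_hat = predict_distribution(X_test)
lower = norm.ppf(0.05, loc=mu_hat, scale=sigma_hat)
upper = norm.ppf(0.95, loc=mu_hat, scale=sigma_hat)
print("90% prediction intervals:", list(zip(lower.round(2), upper.round(2))))

Because the whole conditional distribution is predicted, any quantile or interval of interest can be read off the fitted distribution, which is the property the abstract highlights over point-forecast gradient boosting.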
