fabsig / GPBoost

Submitted 2021-06-23 12:30:02

GPBoost is a software library for combining tree-boosting with Gaussian process and mixed effects models. It also allows for independently doing tree-boosting as well as inference and prediction for Gaussian process and mixed effects models. The GPBoost library is predominantly written in C++, and there exist both a Python package and an R package.

Both tree-boosting and Gaussian processes are techniques that achieve state-of-the-art predictive accuracy. In addition, tree-boosting offers several practical advantages.

For the GPBoost algorithm, it is assumed that the response variable (aka label) y is the sum of a potentially non-linear mean function F(X) and random effects Zb:

y = F(X) + Zb
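A minimal NumPy sketch of data simulated from this model may make the notation concrete. Here the random effects are a single grouping variable, and the independent error term e, the non-linear F, and all sizes are illustrative assumptions, not taken from the library:

```python
# Simulate y = F(X) + Zb + e with one grouped random effect.
# All names and sizes below are illustrative choices for this sketch.
import numpy as np

rng = np.random.default_rng(1)
n, n_groups = 500, 25
X = rng.uniform(-1, 1, size=(n, 1))
F_X = np.sin(3 * X[:, 0])                    # potentially non-linear mean function F(X)
groups = rng.integers(0, n_groups, size=n)
Z = np.eye(n_groups)[groups]                 # incidence matrix mapping samples to groups
b = rng.normal(0.0, 1.0, size=n_groups)      # random effects, b ~ N(0, sigma_b^2 I)
e = rng.normal(0.0, 0.1, size=n)             # independent error term (assumed here)
y = F_X + Z @ b + e
```

Each row of Z has exactly one nonzero entry, so Zb simply adds the effect of a sample's group to its mean.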

The model is trained using the GPBoost algorithm, where training means learning the covariance parameters (aka hyperparameters) of the random effects and the predictor function F(X) using a tree ensemble. In brief, the GPBoost algorithm is a boosting algorithm that iteratively learns the covariance parameters and adds a tree to the ensemble of trees using a gradient and/or a Newton boosting step. In the GPBoost library, covariance parameters can be learned using (Nesterov accelerated) gradient descent or Fisher scoring (aka natural gradient descent). Further, trees are learned using the LightGBM library. See Sigrist (2020) and Sigrist (2021) for more details.
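The alternating scheme described above can be sketched end-to-end in NumPy. This is a deliberately simplified toy version under stated assumptions: one grouped random effect with a Gaussian likelihood, depth-1 stumps standing in for LightGBM trees, plain (non-accelerated) gradient descent with a numerical gradient for the covariance parameters, and dense n-by-n linear algebra that the real library avoids:

```python
# Toy sketch of the GPBoost algorithm: alternate (1) a gradient-descent step on
# the covariance parameters of the random effects and (2) a gradient-boosting
# step that fits a tree to Psi^{-1}(y - F). Simplifications: depth-1 "trees",
# numerical gradients, dense solves; not the library's actual implementation.
import numpy as np

rng = np.random.default_rng(0)
n, n_groups = 200, 20
X = rng.uniform(-1, 1, size=(n, 1))
groups = rng.integers(0, n_groups, size=n)
Z = np.eye(n_groups)[groups]                       # incidence matrix of the random effects
y = (np.where(X[:, 0] > 0, 2.0, -2.0)              # true mean function F(X)
     + Z @ rng.normal(0.0, 1.0, size=n_groups)     # grouped random effects Zb
     + rng.normal(0.0, 0.5, size=n))               # independent error term

def neg_log_lik(log_params, resid):
    """Average negative marginal log-likelihood of the covariance parameters."""
    sigma2_b, sigma2 = np.exp(log_params)          # optimize on the log scale
    Psi = sigma2_b * Z @ Z.T + sigma2 * np.eye(n)  # marginal covariance of y - F(X)
    _, logdet = np.linalg.slogdet(Psi)
    return 0.5 * (logdet + resid @ np.linalg.solve(Psi, resid)) / n

def fit_stump(x, target):
    """Fit a depth-1 regression tree (best single split, squared error)."""
    best_err, best = np.inf, None
    for s in np.quantile(x, np.linspace(0.1, 0.9, 9)):
        left = x <= s
        if left.all() or not left.any():
            continue
        ml, mr = target[left].mean(), target[~left].mean()
        err = np.sum((target - np.where(left, ml, mr)) ** 2)
        if err < best_err:
            best_err, best = err, (s, ml, mr)
    s, ml, mr = best
    return lambda xq: np.where(xq <= s, ml, mr)

F = np.zeros(n)                                    # boosted ensemble, starts at zero
log_params = np.log([1.0, 1.0])                    # [sigma_b^2, sigma^2], log scale
lr_cov, lr_tree, eps = 0.1, 0.2, 1e-4
for _ in range(50):
    resid = y - F
    # (1) covariance-parameter step: gradient descent on the marginal
    #     negative log-likelihood (numerical central differences for brevity)
    grad = np.array([(neg_log_lik(log_params + eps * e, resid)
                      - neg_log_lik(log_params - eps * e, resid)) / (2 * eps)
                     for e in np.eye(2)])
    log_params = np.clip(log_params - lr_cov * grad, -4.0, 4.0)
    # (2) boosting step: fit a stump to the negative gradient Psi^{-1}(y - F)
    sigma2_b, sigma2 = np.exp(log_params)
    Psi = sigma2_b * Z @ Z.T + sigma2 * np.eye(n)
    tree = fit_stump(X[:, 0], np.linalg.solve(Psi, resid))
    F += lr_tree * tree(X[:, 0])
```

After training, F should recover the step in the mean function while the covariance parameters absorb the grouped and residual variation; the pseudo-residuals Psi^{-1}(y - F) are what distinguishes this from ordinary gradient boosting on raw residuals.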
