lightgbm: Light Gradient Boosting Machine
Tree-based algorithms can be improved by introducing boosting frameworks. 'LightGBM' is one such framework, based on
Ke, Guolin et al. (2017)
<https://papers.nips.cc/paper/6907-lightgbm-a-highly-efficient-gradient-boosting-decision>.
This package offers an R interface to work with it. It is
designed to be distributed and efficient with the following
advantages: 1. Faster training speed and higher efficiency. 2.
Lower memory usage. 3. Better accuracy. 4. Parallel learning
supported. 5. Capable of handling large-scale data. In
recognition of these advantages, 'LightGBM' has been
widely used in many winning solutions of machine learning
competitions. Comparison experiments on public datasets suggest
that 'LightGBM' can outperform existing boosting frameworks on
both efficiency and accuracy, with significantly lower memory
consumption. In addition, parallel experiments suggest that in
certain circumstances, 'LightGBM' can achieve a linear speed-up
in training time by using multiple machines.