LightGBM classifier with GPU

model = lgbm.LGBMClassifier(n_estimators=1250, num_leaves=128, learning_rate=0.009, verbose=1)

I am using the LGBM classifier.
Is there a way to train it on the GPU these days?



Solution 1:[1]

First, you need to build LightGBM with GPU support, like:

git clone --recursive https://github.com/Microsoft/LightGBM
cd LightGBM && mkdir build && cd build
cmake -DUSE_GPU=1 ..    # GPU build requires OpenCL and Boost headers
make -j4
pip uninstall lightgbm  # remove any CPU-only package first
cd ../python-package/ && python setup.py install
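
If the build succeeds, you can quickly verify that the GPU build actually works by fitting a tiny model with device="gpu" (a minimal sketch, not part of the original answer; LightGBM raises an error at fit time if the package was built without GPU support):

import numpy as np
import lightgbm as lgbm

# Tiny random dataset, just enough to exercise the GPU code path
X = np.random.rand(1000, 10)
y = np.random.randint(0, 2, size=1000)

try:
    lgbm.LGBMClassifier(device="gpu", n_estimators=10).fit(X, y)
    print("GPU build works")
except Exception as exc:  # typically lightgbm.basic.LightGBMError if GPU support is missing
    print("GPU training failed:", exc)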

After that, you can pass device="gpu" in the parameters to train your model on the GPU, like:

lgbm.train(params={'device': 'gpu'}, ...)

or

lgbm.LGBMClassifier(device='gpu')
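
Here is a slightly fuller sketch with the native train API (illustrative only; the objective and round count are assumptions, not taken from the original answer):

import lightgbm as lgbm
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=100000, n_features=20)
dtrain = lgbm.Dataset(X, label=y)

params = {
    "device": "gpu",        # move histogram construction to the GPU
    "objective": "binary",  # assumed binary classification task
}
booster = lgbm.train(params, dtrain, num_boost_round=100)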

And the resulting speed-up on a largish dataset:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
import lightgbm as lgbm

X, y = make_classification(n_samples=10000000, n_features=100, n_classes=2)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)

%%timeit
model = lgbm.LGBMClassifier(device="gpu")
model.fit(X_train, y_train)
19.9 s ± 163 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)

%%timeit
model = lgbm.LGBMClassifier(device="cpu")
model.fit(X_train, y_train)
1min 23s ± 46.4 s per loop (mean ± std. dev. of 7 runs, 1 loop each)
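
Applied to the classifier from the question, the only change needed is the device parameter (the other hyperparameters are the asker's own):

model = lgbm.LGBMClassifier(
    n_estimators=1250,
    num_leaves=128,
    learning_rate=0.009,
    verbose=1,
    device="gpu",  # train on the GPU instead of the CPU
)
model.fit(X_train, y_train)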

Solution 2:[2]

The LightGBM on the GPU blog post provides comprehensive instructions for installing LightGBM with GPU support. It describes several errors that may occur during installation and the steps to take when Anaconda is used.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

[1] Solution 1
[2] Solution 2: Sergey Khutornoy