
xgboost

Here are 1,026 public repositories matching this topic...

StrikerRUS commented Oct 18, 2019

I'm sorry if I missed this functionality, but the CLI version certainly doesn't have it (I saw the related code only in generate_code_examples.py). I think it would be very useful for eliminating the copy-paste phase, especially for large models.

Of course, piping is one solution, but not for development in a Jupyter Notebook, for example.

awesome-decision-tree-papers
awesome-gradient-boosting-papers
mljar-supervised
pplonski commented Sep 11, 2020

There can be a situation in which all features are dropped during feature selection. This case needs to be handled, perhaps by throwing an exception or raising a warning.

Code to reproduce:

import numpy as np
from supervised import AutoML

X = np.random.uniform(size=(1000, 31))
y = np.random.randint(0, 2, size=(1000,))

automl = AutoML(
    algorithms=["CatBoost", "Xgboost", "LightGBM"],
    model_time_limit=60,  # original snippet truncated at "model_t"; parameter name and value are an assumption
)
automl.fit(X, y)
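The failure mode described above can be guarded against with an explicit check after the selection step. A minimal sketch, assuming importance-threshold selection; the function name and threshold logic are hypothetical illustrations, not mljar-supervised's actual implementation:

```python
import numpy as np

def select_features(X, importances, threshold=0.01):
    """Return X restricted to features whose importance exceeds threshold."""
    keep = np.asarray(importances) > threshold
    if not keep.any():
        # Guard: fail loudly instead of silently passing an empty
        # feature matrix to the downstream model.
        raise ValueError(
            "Feature selection dropped all features; "
            "lower the threshold or skip the selection step."
        )
    return X[:, keep]
```

Raising here (rather than only warning) makes the empty-feature state impossible to miss; a library could equally emit a warning and fall back to the full feature set.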
