
hyperparameter-tuning

Here are 447 public repositories matching this topic...

nni
shenoynikhil98
shenoynikhil98 commented Mar 23, 2022

https://github.com/microsoft/nni/blob/8d5f643c64580bb26a7b10a3c4c9accf617f65b1/nni/compression/pytorch/speedup/jit_translate.py#L382

While trying to speed up my single-shot detector, the following error comes up. Is there any way to fix this?

/usr/local/lib/python3.8/dist-packages/nni/compression/pytorch/speedup/jit_translate.py in forward(self, *args)
    363 
    364         def forward(self, *

Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models

  • Updated Feb 7, 2022
  • Jupyter Notebook
Neuraxle
evalml
chukarsten
chukarsten commented Feb 15, 2022

In #3324, we had to mark some tests as expected to fail because XGBoost was throwing a FutureWarning. The warning has since been addressed in XGBoost, so we're just waiting for the merged PR to be included in a release. This is discussed further in issue #3275.

evalml/tests/component_tests/test_xgboost_classifier.py needs to have the @pytest.mark.xfail removed f

Labels: testing, good first issue, tech debt
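The xfail pattern described in the issue above can be sketched as follows. This is a minimal, hypothetical example, not evalml's actual test code: `fit_model` is a stand-in for the XGBoost call that emitted the FutureWarning, and the reason string is assumed.

```python
import warnings

import pytest


def fit_model():
    # Hypothetical stand-in for the XGBoost call that emitted a FutureWarning.
    warnings.warn("this parameter is deprecated", FutureWarning)


# Until the fixed XGBoost release ships, the test is marked as expected to fail;
# once the release is out, the marker is removed and the test must pass cleanly.
@pytest.mark.xfail(reason="XGBoost emits a FutureWarning until the fix is released")
def test_classifier_has_no_future_warning():
    with warnings.catch_warnings():
        # Escalate FutureWarning to an error so the test fails while the bug exists.
        warnings.simplefilter("error", FutureWarning)
        fit_model()
```

With the marker in place, pytest reports the test as XFAIL rather than a failure; removing the decorator (the change the issue asks for) makes any remaining FutureWarning fail the suite again.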
OCTIS
