mxnet

Here are 593 public repositories matching this topic...

DNXie
DNXie commented Aug 24, 2020

Description

This is a documentation bug. The parameters of the API mxnet.test_utils.check_numeric_gradient are not consistent between the signature and the Parameters section: a parameter check_eps is listed in the Parameters section, but it does not appear in the signature.

Link to document: https://mxnet.apache.org/versions/1.6/api/python/docs/api/mxnet/test_utils/index.html#mxnet.test_utils.check_numeric_gra
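A quick way to confirm the mismatch is to inspect the live signature. The following is a minimal sketch (assuming MXNet 1.6 is installed) that simply prints the parameter names the function actually accepts:

import inspect
from mxnet import test_utils

sig = inspect.signature(test_utils.check_numeric_gradient)
print(sorted(sig.parameters))           # the parameter names present in the code
print("check_eps" in sig.parameters)    # expected to be False per this report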

gluon-cv
JiaMingLin
JiaMingLin commented Aug 3, 2021

Hi,
I need to download the something-to-something and jester datasets, but the 20bn website "https://20bn.com" has been unavailable for weeks; the error message is "503 Service Temporarily Unavailable".

I have already downloaded the video data of something-to-something v2, and I need the label dataset. For Jester, I need both the video and label data. Can someone share the

good first issue
gluon-ts
karthickgopalswamy
karthickgopalswamy commented Jul 7, 2022

Description

The StudentT distribution from torch enforces a positivity (x > 0) constraint on the scale and df parameters. The current implementation maps the raw input through softplus(input), but softplus underflows to zero for large negative inputs, so softplus(-120) > 0 evaluates to False.

To Reproduce

(Please provide a minimal code snippet that reproduces the error. For existing examples, please provide a link.)

from gluonts.torch.modules.distribut
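
As a rough illustration of the underflow (a sketch using plain PyTorch, not the gluon-ts module above): softplus collapses to exactly 0.0 in float32 for large negative inputs, which then violates the positivity constraint that torch.distributions.StudentT places on scale:

import torch
import torch.nn.functional as F

scale = F.softplus(torch.tensor(-120.0))   # underflows to 0.0 in float32
print(scale.item(), (scale > 0).item())    # 0.0 False

# With argument validation enabled (the default in recent PyTorch versions),
# a zero scale is rejected by the positivity constraint.
try:
    torch.distributions.StudentT(df=3.0, loc=0.0, scale=scale)
except ValueError as err:
    print(err)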
bug good first issue
djl
zachgk
zachgk commented Apr 20, 2022

Description

This issue is to create the TabNet model and add it to the basic model zoo. TabNet is a good example of a deep learning model that works with the tabular modality. It can then be trained or tested with an implementation of the CsvDataset such as AirfoilRandomAccess or AmesRandomAccess.

References

  • Paper: [TabNet: Attentive Interpretable Tabular Learning](htt
enhancement good first issue Call for Contribution
gluon-nlp
preeyank5
preeyank5 commented Dec 3, 2020

Description

When using tokenizers.create with the model and vocab files for a custom corpus, the code throws an error and fails to generate the BERT vocab file.

Error Message

ValueError: Mismatch vocabulary! All special tokens specified must be control tokens in the sentencepiece vocabulary.

To Reproduce

from gluonnlp.data import tokenizers
tokenizers.create('spm', model_p
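
The error message above indicates that the special tokens were not marked as control tokens when the sentencepiece model was built. As a hedged workaround sketch (this uses the sentencepiece package directly; corpus.txt, the model prefix, and the vocab size are placeholders, not values from the report), the BERT special tokens can be registered as control symbols at training time:

import sentencepiece as spm

# Train a sentencepiece model whose vocabulary contains the BERT special
# tokens as control tokens, so the resulting .model/.vocab pair can be
# passed to tokenizers.create('spm', ...).
spm.SentencePieceTrainer.train(
    input="corpus.txt",          # placeholder: path to the custom corpus
    model_prefix="custom_spm",   # placeholder output prefix
    vocab_size=8000,             # placeholder vocabulary size
    control_symbols=["[CLS]", "[SEP]", "[MASK]"],
)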

enhancement good first issue
