
mxnet

Here are 577 public repositories matching this topic...

DNXie
DNXie commented Aug 24, 2020

Description

This is a documentation bug. The parameters of the API mxnet.test_utils.check_numeric_gradient are not consistent between the signature and the Parameters section: a parameter check_eps appears in the Parameters section but is not in the signature.

Link to document: https://mxnet.apache.org/versions/1.6/api/python/docs/api/mxnet/test_utils/index.html#mxnet.test_utils.check_numeric_gra
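Mismatches like this can be caught mechanically by comparing a function's real signature against the names its docstring documents. A minimal sketch, using only the standard library (the toy check_numeric_gradient below is a stand-in, not the real mxnet function):

```python
import inspect

def docstring_params_in_signature(func, documented_params):
    """Return documented parameter names missing from func's signature.

    `documented_params` would in practice be parsed from the docstring's
    Parameters section; here it is passed in directly.
    """
    sig_params = set(inspect.signature(func).parameters)
    return sorted(set(documented_params) - sig_params)

# Toy stand-in for mxnet.test_utils.check_numeric_gradient:
def check_numeric_gradient(sym, location, eps=1e-4):
    pass

# 'check_eps' is documented but absent from the signature:
missing = docstring_params_in_signature(
    check_numeric_gradient, ["sym", "location", "eps", "check_eps"]
)
# missing == ['check_eps']
```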

askhade
askhade commented Jan 19, 2021

Add a new API for converting a model to external data. Today the conversion happens in two steps:

external_data_helper.convert_model_to_external_data(<model>, <all_tensors_to_one_file>, <size_threshold>)
save_model(model, output_path)

We want to add another API that combines the two steps:

save_model_to_external_data(, <output_
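The proposed helper would essentially be a thin wrapper over the two existing calls. A minimal sketch, assuming the name save_model_to_external_data and its keyword arguments from the proposal text (this is not an existing onnx API):

```python
def save_model_to_external_data(model, output_path, *,
                                all_tensors_to_one_file=True,
                                size_threshold=1024):
    """Hypothetical one-step helper combining conversion and saving.

    Wraps the two-step flow described in the issue: first move large
    tensors to external data, then serialize the model proto to disk.
    """
    # Lazy imports keep the sketch importable without onnx installed.
    from onnx import save_model
    from onnx.external_data_helper import convert_model_to_external_data

    convert_model_to_external_data(
        model,
        all_tensors_to_one_file=all_tensors_to_one_file,
        size_threshold=size_threshold,
    )
    save_model(model, output_path)
```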

gluon-cv
yxchng
yxchng commented Dec 31, 2020

Many links in Kinetics have expired. As a result, not everyone may be using the same Kinetics dataset. As a reference, the statistics of the Kinetics dataset used in PySlowFast can be found here: https://github.com/facebookresearch/video-nonlocal-net/blob/master/DATASET.md. However, I cannot seem to find similar information for gluoncv. Will you guys be sharing the statistics and

briandesilva
briandesilva commented Jan 20, 2021

Issue

I am getting a segmentation fault when running the following import statement

from autogluon.tabular import TabularPrediction

and segmentation fault: 11 when running

from autogluon.text import TextPrediction

I am using a recent branch of autogluon, installed in a clean virtual environment: autogluon version 0.0.16b20210120
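For crashes like this, the standard-library faulthandler module can print a Python-level traceback at the point of the segfault, which helps locate the native extension responsible. A minimal sketch (the crashing autogluon import is left commented out so the snippet runs anywhere):

```python
# Enable faulthandler BEFORE the crashing import so a segfault dumps a
# Python traceback to stderr instead of dying silently.
import faulthandler

faulthandler.enable()

# The import from the report above would go here:
# from autogluon.tabular import TabularPrediction
```

The same effect is available without editing code via `python -X faulthandler your_script.py`.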

Environment

gluon-nlp
preeyank5
preeyank5 commented Dec 3, 2020

Description

While using tokenizers.create with the model and vocab file for a custom corpus, the code throws an error and fails to generate the BERT vocab file.

Error Message

ValueError: Mismatch vocabulary! All special tokens specified must be control tokens in the sentencepiece vocabulary.

To Reproduce

from gluonnlp.data import tokenizers
tokenizers.create('spm', model_p
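The ValueError indicates that every special token passed in must already exist as a control token in the sentencepiece vocabulary. A toy, pure-Python sketch of that membership check (in practice the control tokens come from the sentencepiece model file, not a list):

```python
def missing_control_tokens(special_tokens, control_tokens):
    """Return special tokens absent from the vocab's control tokens.

    A simplified model of the validation behind the ValueError above.
    """
    return sorted(set(special_tokens) - set(control_tokens))

# e.g. a custom sentencepiece model trained without BERT's [CLS]/[SEP]:
missing_control_tokens(["<unk>", "[CLS]", "[SEP]"],
                       ["<unk>", "<s>", "</s>"])
# → ['[CLS]', '[SEP]']
```

If the list is non-empty, the fix is to retrain the sentencepiece model with those tokens declared as control symbols.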

gluon-ts
