Distributed (Deep) Machine Learning Community
Repositories
- xgboost: Scalable, portable, and distributed gradient boosting (GBDT, GBRT, GBM) library for Python, R, Java, Scala, C++, and more. Runs on a single machine, Hadoop, Spark, Dask, Flink, and DataFlow.
- dgl: Python package built to ease deep learning on graphs, on top of existing DL frameworks.
- gluon-nlp: NLP made easy.
- gluon-cv: Gluon CV Toolkit.
- decord: An efficient video loader for deep learning, with smart shuffling and a simple interface.
- web-data: Repository hosting web data, including images, for documentation in DMLC projects.
- treelite: Model compiler for decision tree ensembles.
- rabit: Reliable Allreduce and Broadcast Interface for distributed machine learning.
- ps-lite: A lightweight parameter server interface.
- dmlc-core: A common bricks library for building scalable and portable distributed machine learning.
- dlpack: RFC for a common in-memory tensor structure and operator interface for deep learning systems.
- mxnet.js: MXNetJS, a JavaScript package for deep learning in the browser (no server required).
- HalideIR: Symbolic expression and statement module for new DSLs.
- XGBoost.jl: XGBoost Julia package.
- tensorboard (archived): Standalone TensorBoard for visualization in deep learning.
- mshadow (archived): Matrix Shadow, a lightweight CPU/GPU matrix and tensor template library in C++/CUDA for (deep) machine learning.
- MXNet.jl: MXNet Julia package for flexible and efficient deep learning in Julia.
- minerva (archived): A fast and flexible tool for deep learning on multiple GPUs. It provides an ndarray programming interface, just like NumPy, with both Python and C++ bindings; the resulting code runs on CPU or GPU, and multi-GPU support is easy.
- keras (forked from keras-team/keras): Deep learning library for Python. Convnets, recurrent neural networks, and more. Runs on MXNet, Theano, or TensorFlow.
- minpy (archived): NumPy interface with mixed backend execution.
- mxnet-notebooks: Notebooks for MXNet.
- drat: Drat repository for DMLC R packages.
- mxnet-memonger: Sublinear memory optimization for deep learning; reduces GPU memory cost to train deeper nets.