# inference
Here are 617 public repositories matching this topic...
ncnn is a high-performance neural network inference framework optimized for the mobile platform.
android, ios, caffe, deep-learning, neural-network, mxnet, tensorflow, vulkan, keras, inference, pytorch, artificial-intelligence, simd, darknet, arm-neon, high-performance, ncnn, onnx, mlir
Updated Apr 26, 2021 - C++
Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.
machine-learning, embedded, caffe, computer-vision, deep-learning, robotics, inference, nvidia, digits, image-recognition, segmentation, object-detection, jetson-tx1, jetson, tensorrt, jetson-tx2, video-analytics, jetson-xavier, jetson-nano, jetson-xavier-nx
Updated Apr 22, 2021 - C++
Runtime type system for IO decoding/encoding.
Updated Apr 26, 2021 - TypeScript
Grakn Core: The Knowledge Graph
database, graph, graph-algorithms, logic, inference, datalog, knowledge-graph, graph-theory, graph-database, graphdb, knowledge-base, query-language, graph-visualisation, knowledge-representation, reasoning, knowledge-engineering, enterprise-knowledge-graph, grakn, graql, hyper-relational
Updated Apr 23, 2021 - Java
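The datalog and reasoning tags above refer to deriving implicit facts from explicit ones via rules. A toy forward-chaining sketch in plain Python (illustrative only, not Grakn's Graql API):

```python
# Illustrative forward-chaining inference over binary facts, sketching
# the kind of rule-based reasoning a knowledge-graph engine performs.
# Facts are (relation, subject, object) triples; names are made up.

facts = {("parent", "alice", "bob"), ("parent", "bob", "carol")}

def infer_ancestors(facts):
    """Derive ('ancestor', x, y) facts until a fixed point is reached."""
    derived = set(facts)
    # Base rule: every parent is an ancestor.
    derived |= {("ancestor", a, b) for (rel, a, b) in facts if rel == "parent"}
    changed = True
    while changed:
        changed = False
        # Transitive rule: ancestor(x, y) and parent(y, z) => ancestor(x, z).
        for (r1, x, y) in list(derived):
            for (r2, y2, z) in list(derived):
                if r1 == "ancestor" and r2 == "parent" and y == y2:
                    if ("ancestor", x, z) not in derived:
                        derived.add(("ancestor", x, z))
                        changed = True
    return derived

result = infer_ancestors(facts)
```

Production reasoners evaluate such rules lazily at query time and with far better algorithms, but the fixed-point idea is the same.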
An easy-to-use PyTorch-to-TensorRT converter.
Updated Apr 26, 2021 - Python
The Triton Inference Server provides an optimized cloud and edge inferencing solution.
Updated Apr 26, 2021 - C++
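Clients talk to an inference server like Triton over HTTP using the KServe v2 predict protocol. The sketch below only assembles a request body of that general shape; the model name, tensor name, and shape are illustrative, and the field layout is my reading of the v2 protocol rather than output from a real deployment:

```python
import json

# Sketch of a request body for a v2-protocol inference call
# (POST /v2/models/<model>/infer). All concrete names are illustrative.

def build_infer_request(tensor_name, data, shape, datatype="FP32"):
    """Assemble the JSON body for a v2 /infer call."""
    return json.dumps({
        "inputs": [{
            "name": tensor_name,
            "shape": shape,
            "datatype": datatype,
            "data": data,
        }]
    })

body = build_infer_request("input__0", [1.0, 2.0, 3.0], [1, 3])
```

A real client would POST this body to the server and read the `outputs` array from the response; consult the server's protocol documentation for the exact fields your version expects.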
OpenVINO™ Toolkit repository.
Updated Apr 26, 2021 - C++
TensorFlow template application for deep learning.
machine-learning, csv, deep-learning, tensorflow, inference, cnn, lstm, tensorboard, mlp, libsvm, tfrecords, wide-and-deep, serving
Updated Jan 3, 2019 - Python
Acceleration package for neural networks on multi-core CPUs.
cpu, neural-network, high-performance, inference, multithreading, simd, matrix-multiplication, neural-networks, high-performance-computing, convolutional-layers, fast-fourier-transform, winograd-transform
Updated Mar 12, 2021 - C
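The fast-fourier-transform and winograd-transform tags point at the operation being accelerated: convolution. A naive pure-Python version of the sliding-window filtering that deep-learning frameworks call "convolution" (technically cross-correlation) shows the O(n·k) inner loop that FFT- and Winograd-based kernels replace with cheaper arithmetic:

```python
# Direct 1-D "valid" filtering in pure Python: the baseline operation
# that CPU acceleration packages speed up with FFT or Winograd
# transforms. Inputs below are illustrative.

def conv1d_valid(signal, kernel):
    """'Valid' mode: output length = len(signal) - len(kernel) + 1."""
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(len(signal) - k + 1)
    ]

out = conv1d_valid([1, 2, 3, 4], [1, 0, -1])  # -> [-2, -2]
```

For a kernel of length k over n samples this does n·k multiplies; transform-based methods trade those multiplies for transforms whose cost grows much more slowly with k.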
DELTA is a deep-learning-based natural language and speech processing platform.
nlp, front-end, ops, deep-learning, text-classification, tensorflow, nlu, speech, inference, text-generation, speech-recognition, seq2seq, sequence-to-sequence, speaker-verification, asr, tensorflow-serving, emotion-recognition, custom-ops, serving, tensorflow-lite
Updated Apr 16, 2021 - Python
Deploy an ML inference service on a budget in less than 10 lines of code.
Updated Apr 14, 2021 - Python
typescript, matching, pattern, pattern-matching, inference, ts, conditions, type-inference, exhaustive, typescript-pattern-matching
Updated Apr 7, 2021 - TypeScript
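The pattern-matching and exhaustive tags above describe matching a value against the cases of a tagged union, with a guarantee that no case is forgotten. A rough Python transposition of the idea (the `area` helper and shape dictionaries are illustrative; the TypeScript library does this at the type level):

```python
# Sketch of exhaustive matching over a tagged union in plain Python.
# The final `raise` is the runtime analogue of a compile-time
# exhaustiveness check: an unhandled tag cannot pass silently.

def area(shape: dict) -> float:
    """Dispatch on the 'kind' tag of a shape record."""
    kind = shape["kind"]
    if kind == "circle":
        return 3.14159 * shape["radius"] ** 2
    if kind == "rect":
        return shape["w"] * shape["h"]
    raise ValueError(f"unhandled shape kind: {kind}")

rect_area = area({"kind": "rect", "w": 3, "h": 4})  # -> 12
```

In a language with static type inference, adding a new variant to the union turns the missing case into a compile error instead of a runtime exception.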
Pytorch-Named-Entity-Recognition-with-BERT
curl, inference, pytorch, cpp11, named-entity-recognition, postman, pretrained-models, bert, conll-2003, bert-ner
Updated Jan 24, 2020 - Python
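The conll-2003 tag refers to the CoNLL BIO tagging scheme: an NER model such as a BERT-based tagger emits one tag per token, and a decoding step turns those tags into entity spans. A self-contained sketch of that decoding step (tokens and tags below are illustrative):

```python
# Decode CoNLL-style BIO tags into (entity_text, entity_type) spans.
# B-X opens an entity of type X, I-X continues it, O is outside.

def decode_bio(tokens, tags):
    entities, current, etype = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                entities.append((" ".join(current), etype))
            current, etype = [token], tag[2:]
        elif tag.startswith("I-") and current and tag[2:] == etype:
            current.append(token)
        else:
            if current:
                entities.append((" ".join(current), etype))
            current, etype = [], None
    if current:
        entities.append((" ".join(current), etype))
    return entities

ents = decode_bio(
    ["Ada", "Lovelace", "lived", "in", "London"],
    ["B-PER", "I-PER", "O", "O", "B-LOC"],
)
```

Real pipelines also have to realign these per-token tags with BERT's subword pieces, which this sketch omits.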
High-efficiency floating-point neural network inference operators for mobile, server, and Web.
cpu, neural-network, inference, multithreading, simd, matrix-multiplication, neural-networks, convolutional-neural-networks, convolutional-neural-network, inference-optimization, mobile-inference
Updated Apr 25, 2021 - C
Lua Language Server written in Lua.
Updated Apr 26, 2021 - Lua
TensorFlow models accelerated with NVIDIA TensorRT.
neural-network, tensorflow, models, realtime, inference, optimize, nvidia, image-classification, object-detection, train, tx1, jetson, tensorrt, tx2
Updated Feb 14, 2021 - Python
LightSeq: A High-Performance Inference Library for Sequence Processing and Generation.
Updated Apr 16, 2021 - Cuda
Embedded and mobile deep learning research resources.
deep-neural-networks, deep-learning, inference, pruning, quantization, neural-network-compression, mobile-deep-learning, embedded-ai, efficient-neural-networks, mobile-ai, mobile-inference
Updated Apr 20, 2021
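The quantization tag above refers to the main compression trick in mobile inference: storing weights as small integers plus a scale and an offset, then reconstructing approximate floats on the fly. A minimal uniform (asymmetric) quantization sketch:

```python
# Uniform affine quantization: map floats in [lo, hi] onto 8-bit
# integers, keeping only (q, scale, lo) instead of full floats.
# Values below are illustrative.

def quantize(values, num_bits=8):
    lo, hi = min(values), max(values)
    qmax = 2 ** num_bits - 1
    scale = (hi - lo) / qmax if hi != lo else 1.0
    q = [round((v - lo) / scale) for v in values]
    return q, scale, lo

def dequantize(q, scale, offset):
    return [v * scale + offset for v in q]

q, scale, offset = quantize([-1.0, 0.0, 0.5, 1.0])
approx = dequantize(q, scale, offset)
```

Eight bits per weight instead of 32 cuts model size by about 4x; the price is the small reconstruction error visible in `approx`, which pruning and quantization-aware training aim to keep harmless.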
Bolt is a deep learning library with high performance and heterogeneous flexibility.
Updated Apr 13, 2021 - C++
Shape and dimension inference (Keras-like) for PyTorch layers and neural networks.
Updated Oct 6, 2020 - Python
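Shape inference for convolutional layers is just arithmetic over the layer's hyperparameters: per spatial dimension, out = floor((in + 2·padding - kernel) / stride) + 1 (ignoring dilation). A self-contained sketch of that computation, with illustrative ResNet-like numbers:

```python
# Compute the spatial output shape of a 2-D convolution, the core
# calculation behind Keras-like shape inference for PyTorch modules.
# Dilation is omitted for simplicity.

def conv2d_output_shape(h, w, kernel, stride=1, padding=0):
    out_h = (h + 2 * padding - kernel) // stride + 1
    out_w = (w + 2 * padding - kernel) // stride + 1
    return out_h, out_w

# A 7x7 stride-2 stem conv on a 224x224 input halves the resolution.
stem = conv2d_output_shape(224, 224, kernel=7, stride=2, padding=3)
```

Chaining this function layer by layer is enough to predict feature-map sizes through a whole convolutional network without running it.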
Package for causal inference in graphs and in the pairwise setting. Tools for graph-structure recovery and dependency analysis are included.
python, machine-learning, algorithm, graph, inference, toolbox, causality, causal-inference, causal-models, graph-structure-recovery, causal-discovery
Updated Apr 6, 2021 - Python
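Graph-structure recovery starts from pairwise dependence tests between variables. A toy building block, Pearson correlation computed from scratch (the data is illustrative; real causal discovery layers conditional-independence tests and edge-orientation rules on top of this):

```python
# Pearson correlation as a toy pairwise-dependence measure, the kind
# of statistic causal-discovery pipelines build skeletons from.

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

xs = [1.0, 2.0, 3.0, 4.0]
r_linear = pearson(xs, [2.0, 4.0, 6.0, 8.0])   # perfect linear relation
r_none = pearson(xs, [1.0, -1.0, -1.0, 1.0])   # no linear relation
```

Dependence alone cannot orient an edge (correlation is symmetric), which is exactly why such toolboxes add pairwise causal-direction methods and graph-level algorithms.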
Train a state-of-the-art YOLOv3 object detector from scratch!
python, deep-learning, gpu, keras, inference, tf2, detector, yolo, object-detection, transfer-learning, deep-learning-tutorial, keras-models, google-colab, yolov3, tensorflow2, wandb, weights-and-biases, annotating-images, custom-yolo, trainyourownyolo
Updated Apr 24, 2021 - Jupyter Notebook
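Object detectors like YOLOv3 are trained and evaluated with intersection-over-union (IoU), the overlap ratio between a predicted box and a ground-truth box; it also drives non-maximum suppression of duplicate detections. A self-contained sketch with illustrative boxes:

```python
# Intersection-over-union for axis-aligned boxes given as
# (x1, y1, x2, y2): area of overlap divided by area of union.

def iou(a, b):
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# Two 10x10 boxes sharing half their width overlap by 50/150 = 1/3.
overlap = iou((0, 0, 10, 10), (5, 0, 15, 10))
```

During evaluation a prediction typically counts as correct when its IoU with a ground-truth box exceeds a threshold such as 0.5.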