OpenVINO™ Toolkit repository
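For context, here is a minimal sketch of what inference with the OpenVINO Python runtime typically looks like; the model path, device name, and input shape are placeholders for this example, not taken from the repository:

```python
# Minimal OpenVINO inference sketch (2022.x-style Core API).
# "model.xml" and the input shape are placeholders for illustration.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")               # IR model (placeholder path)
compiled_model = core.compile_model(model, "CPU")  # target device, e.g. "CPU" or "GPU"

input_data = np.random.rand(1, 3, 224, 224).astype(np.float32)  # dummy input tensor
results = compiled_model([input_data])             # synchronous inference
output = results[compiled_model.output(0)]         # fetch the first output
print(output.shape)
```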
FedML - The federated learning and analytics library enabling secure and collaborative machine learning on decentralized data anywhere, at any scale. It supports large-scale cross-silo federated learning, cross-device federated learning on smartphones/IoT devices, and research simulation. MLOps and an App Marketplace are also included (https://open.fedml.ai).
Rule engine implementation in Golang
FeatherCNN is a high-performance inference engine for convolutional neural networks.
Implement a high-performance deep learning inference library from scratch, step by step.
Paddle.js is a web project for Baidu PaddlePaddle, an open-source deep learning framework that runs in the browser. Paddle.js can either load a pre-trained model or transform a model from paddle-hub with the model-transformation tools provided by Paddle.js. It can run in any browser that supports WebGL/WebGPU/WebAssembly. It can also…
Adlik: Toolkit for Accelerating Deep Learning Inference
High-performance cross-platform inference engine; you can run Anakin on x86 CPU, ARM, NVIDIA GPU, AMD GPU, Bitmain, and Cambricon devices.
A library for high-performance deep learning inference on NVIDIA GPUs.
A Machine Learning System for Data Enrichment.
A common base representation of Python source code for pylint and other projects
PyKnow: Expert Systems for Python
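As a quick illustration, a minimal sketch of a PyKnow knowledge engine, assuming the package's CLIPS-style Fact/Rule/KnowledgeEngine API; the engine class and fact fields below are invented for the example:

```python
# Minimal PyKnow sketch: declare facts, then run forward-chaining rules.
from pyknow import KnowledgeEngine, Rule, Fact, MATCH

class Greetings(KnowledgeEngine):
    # Fires when both an action fact and a name fact are present.
    @Rule(Fact(action='greet'), Fact(name=MATCH.name))
    def say_hello(self, name):
        print(f"Hello, {name}!")

engine = Greetings()
engine.reset()                                      # prepare working memory
engine.declare(Fact(action='greet'), Fact(name='world'))
engine.run()                                        # -> "Hello, world!"
```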
Docs for search systems and AI infrastructure
This is a repository for an object detection inference API using the TensorFlow framework.
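For illustration, a generic sketch of object-detection inference against a TensorFlow SavedModel (not this repository's REST interface); the model path and the output dictionary keys are assumptions that depend on how the model was exported:

```python
# Generic TensorFlow SavedModel object-detection inference sketch.
# The path and output keys follow the TF Object Detection API export convention.
import numpy as np
import tensorflow as tf

detect_fn = tf.saved_model.load("exported_model/saved_model")  # placeholder path

image = np.zeros((1, 480, 640, 3), dtype=np.uint8)             # dummy uint8 image batch
detections = detect_fn(tf.constant(image))                     # run inference

boxes = detections["detection_boxes"].numpy()                  # [1, N, 4], normalized coords
scores = detections["detection_scores"].numpy()                # [1, N] confidence scores
classes = detections["detection_classes"].numpy().astype(int)  # [1, N] class ids
print(boxes.shape, scores.shape, classes.shape)
```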
MIVisionX toolkit is a set of comprehensive computer vision and machine intelligence libraries, utilities, and applications bundled into a single toolkit. AMD MIVisionX also delivers a highly optimized open-source implementation of the Khronos OpenVX™ and OpenVX™ Extensions.
A quick view of high-performance convolutional neural network (CNN) inference engines on mobile devices.
Expert Systems for Python
Python Computer Vision & Video Analytics Framework With Batteries Included