Port of OpenAI's Whisper model in C/C++
High-performance neural network inference framework for mobile
MNN is a blazing fast, lightweight deep learning framework
A scalable inference server for models optimized with OpenVINO
A GPU-accelerated library containing highly optimized building blocks
OpenMLDB is an open-source machine learning database
Easy-to-use deep learning framework with 3 key features
Deep Learning API and Server in C++14 with support for Caffe and PyTorch
Open standard for machine learning interoperability
C++ library for high-performance inference on NVIDIA GPUs
Serving system for machine learning models
Lightweight C++ inference library for ONNX files
Deep learning inference framework optimized for mobile platforms
Uniform deep learning inference framework for mobile