
ncnn - Tencent Open Source Mobile Neural Network Inference Framework

Project Overview

ncnn is a high-performance neural network forward-computation (inference) framework optimized for mobile devices. It is designed from the ground up with mobile deployment and usage in mind: it has no third-party dependencies, is cross-platform, and runs faster on mobile CPUs than all known open-source frameworks.

Project Address: https://github.com/Tencent/ncnn

Development Team: Tencent Open Source Project

Core Features

1. Extreme Mobile Optimization

  • Born for Mobile Platforms: Specifically optimized for mobile devices from the initial design.
  • No Third-Party Dependencies: Does not rely on any other computing frameworks such as BLAS or NNPACK.
  • Pure C++ Implementation: Ensures cross-platform compatibility and high performance.

2. Excellent Performance

  • ARM NEON Assembly-Level Optimization: Employs meticulous assembly-level optimization for extremely fast computation speeds.
  • Fine-Grained Memory Management: Extremely low memory footprint, suitable for resource-constrained mobile devices.
  • Multi-Core Parallel Computing: Supports multi-threaded inference with ARM big.LITTLE-aware CPU scheduling (see the sketch after this list).
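
To make the threading controls concrete, here is a minimal sketch against ncnn's public C++ API (ncnn::set_cpu_powersave and ncnn::Option::num_threads); the model file names are placeholders:

```cpp
#include "net.h"  // ncnn::Net, ncnn::Option
#include "cpu.h"  // ncnn::set_cpu_powersave, ncnn::get_big_cpu_count

int main()
{
    // Powersave mode: 0 = all cores, 1 = little cores only, 2 = big cores only
    ncnn::set_cpu_powersave(2);

    ncnn::Net net;
    net.opt.num_threads = ncnn::get_big_cpu_count(); // one worker thread per big core
    net.opt.lightmode = true;                        // release intermediate blobs early

    net.load_param("model.param"); // placeholder file names
    net.load_model("model.bin");
    return 0;
}
```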

3. Broad Platform Support

  • Cross-Platform: Supports multiple platforms including Android, iOS, Linux, Windows, and macOS.
  • Multi-Architecture Support: Supports different CPU architectures such as ARM and x86.
  • GPU Acceleration: Supports GPU acceleration based on the Vulkan API (see the sketch after this list).
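
A minimal sketch of the Vulkan path, assuming ncnn was built with Vulkan support enabled (the model file names are placeholders):

```cpp
#include "net.h"  // ncnn::Net
#include "gpu.h"  // ncnn::get_gpu_count

int main()
{
    ncnn::Net net;

    // Opt into Vulkan compute only when a usable GPU is present;
    // otherwise ncnn keeps using its CPU path.
    if (ncnn::get_gpu_count() > 0)
        net.opt.use_vulkan_compute = true;

    // The option must be set before the model is loaded.
    net.load_param("model.param"); // placeholder file names
    net.load_model("model.bin");
    return 0;
}
```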

4. Rich Model Support

  • Multi-Framework Model Import: Can import models from mainstream frameworks such as Caffe, PyTorch, MXNet, ONNX, Darknet, Keras, and TensorFlow.
  • Quantization Support: Supports 8-bit quantization and half-precision floating-point storage.
  • Direct Memory Loading: Supports zero-copy, reference-based loading of network models (see the sketch after this list).
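
The following sketch shows the typical load-and-infer flow with the ncnn::Net / ncnn::Extractor API. The file names and the blob names "data" and "prob" are placeholders that depend on the converted model, and the commented-out lines indicate the load-from-memory overloads mentioned above:

```cpp
#include "net.h"  // ncnn::Net, ncnn::Extractor, ncnn::Mat

int main()
{
    ncnn::Net net;

    // Load a converted model from disk; an int8-quantized or fp16 model
    // produced by the ncnn tools is loaded the same way.
    net.load_param("model.param");
    net.load_model("model.bin");

    // Zero-copy alternative (sketch): the overloads taking a const unsigned char*
    // keep referencing the caller-owned buffers instead of copying them.
    // net.load_param(param_bin_buffer);
    // net.load_model(weight_buffer);

    // Run a forward pass; blob names depend on the converted model.
    ncnn::Mat in(224, 224, 3); // placeholder input
    in.fill(0.5f);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in);

    ncnn::Mat out;
    ex.extract("prob", out);
    return 0;
}
```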

Technical Architecture

Supported Network Types

  • Classic CNN Networks: VGG, AlexNet, GoogLeNet, Inception, etc.
  • Practical CNN Networks: ResNet, DenseNet, SENet, FPN, etc.
  • Lightweight CNNs: SqueezeNet, MobileNet series, ShuffleNet series, MnasNet, etc.
  • Face Detection: MTCNN, RetinaFace, SCRFD, etc.
  • Object Detection: YOLO series, SSD series, Faster-RCNN, etc.
  • Image Segmentation: FCN, PSPNet, UNet, YOLACT, etc.
  • Pose Estimation: SimplePose, etc.

Platform Compatibility Matrix

Platform/Hardware | Windows | Linux | Android | macOS | iOS
Intel CPU         | ✔️      | ✔️    | ❔      | ✔️    | /
Intel GPU         | ✔️      | ✔️    | ❔      | ❔    | /
AMD CPU           | ✔️      | ✔️    | ❔      | ✔️    | /
AMD GPU           | ✔️      | ✔️    | ❔      | ❔    | /
NVIDIA GPU        | ✔️      | ✔️    | ❔      | ❔    | /
Qualcomm          | ❔      | ✔️    | ✅      | /     | /
ARM CPU           | ❔      | ❔    | ✅      | /     | /
Apple CPU         | /       | /     | /       | ✔️    | ✅

✅ = Known to run and perform excellently; ✔️ = Known to run; ❔ = Theoretically feasible but not confirmed; / = Not applicable

Real-World Applications

ncnn is currently used in several core Tencent applications, including:

  • QQ
  • QZone
  • WeChat
  • Pitu
  • Other Tencent Applications

Development Ecosystem

Example Projects

  • Android Application Examples:
    • SqueezeNet Image Classification
    • Style Transfer Application
    • MobileNet-SSD Object Detection
    • MTCNN Face Detection
    • YOLOv5/YOLOv7 Object Detection
    • SCRFD Face Detection

Tool Support

  • Model Visualization: Supports model visualization using Netron.
  • Custom Layers: Supports registering and implementing custom layers (see the sketch after this list).
  • Quantization Tools: Provides model quantization tools.
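
As an illustration of the custom-layer mechanism, here is a sketch following the registration pattern from the ncnn documentation; the layer type "MyRelu" is a hypothetical in-place ReLU used only to show the mechanics, and the model files are placeholders:

```cpp
#include "net.h"   // ncnn::Net
#include "layer.h" // ncnn::Layer, DEFINE_LAYER_CREATOR

// Hypothetical custom layer: an in-place ReLU.
class MyRelu : public ncnn::Layer
{
public:
    MyRelu()
    {
        one_blob_only = true;   // single input, single output
        support_inplace = true; // may modify the blob in place
    }

    virtual int forward_inplace(ncnn::Mat& bottom_top_blob, const ncnn::Option& opt) const
    {
        const int channels = bottom_top_blob.c;
        const int size = bottom_top_blob.w * bottom_top_blob.h;

        #pragma omp parallel for num_threads(opt.num_threads)
        for (int q = 0; q < channels; q++)
        {
            float* ptr = bottom_top_blob.channel(q);
            for (int i = 0; i < size; i++)
            {
                if (ptr[i] < 0.f)
                    ptr[i] = 0.f;
            }
        }
        return 0;
    }
};

DEFINE_LAYER_CREATOR(MyRelu) // generates MyRelu_layer_creator

int main()
{
    ncnn::Net net;

    // Register before loading a param file that references the "MyRelu" layer type.
    net.register_custom_layer("MyRelu", MyRelu_layer_creator);

    net.load_param("model_with_myrelu.param"); // placeholder file names
    net.load_model("model_with_myrelu.bin");
    return 0;
}
```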

Community Support

  • Technical Exchange QQ Groups: 637093648, 677104663, 818998520
  • Telegram Group and Discord Channel
  • Detailed Documentation: Complete wiki documentation and API reference.

Getting Started

Build Support

ncnn supports building on the following platforms:

  • Linux / Windows / macOS
  • Raspberry Pi 3/4
  • Android
  • iOS
  • WebAssembly
  • NVIDIA Jetson
  • Allwinner D1
  • Loongson 2K1000

Quick Start

It is recommended to start with the Using ncnn with AlexNet tutorial, which provides detailed step-by-step instructions and is especially suitable for beginners.
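
In the spirit of that tutorial, a classification pass typically looks like the sketch below. The file names, the blob names "data" and "prob", the 227x227 input size, and the mean values follow the converted Caffe AlexNet used in the wiki and should be treated as assumptions:

```cpp
#include "net.h"  // ncnn::Net, ncnn::Extractor, ncnn::Mat

#include <vector>

// Run one AlexNet-style classification pass over a BGR image buffer.
int classify(const unsigned char* bgr_pixels, int img_w, int img_h,
             std::vector<float>& cls_scores)
{
    ncnn::Net alexnet;
    alexnet.load_param("alexnet.param"); // placeholder converted model files
    alexnet.load_model("alexnet.bin");

    // Resize the image to the network input size and subtract the channel means.
    ncnn::Mat in = ncnn::Mat::from_pixels_resize(bgr_pixels, ncnn::Mat::PIXEL_BGR,
                                                 img_w, img_h, 227, 227);
    const float mean_vals[3] = {104.f, 117.f, 123.f};
    in.substract_mean_normalize(mean_vals, 0); // ncnn's API spells it "substract"

    ncnn::Extractor ex = alexnet.create_extractor();
    ex.input("data", in);

    ncnn::Mat out;
    ex.extract("prob", out);

    // Flatten the output and copy the class scores.
    ncnn::Mat flat = out.reshape(out.w * out.h * out.c);
    cls_scores.resize(flat.w);
    for (int j = 0; j < flat.w; j++)
        cls_scores[j] = flat[j];

    return 0;
}
```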

Project Advantages

  1. Excellent Performance: Runs faster on mobile CPUs than all known open-source frameworks.
  2. Resource-Friendly: Extremely low memory footprint, suitable for resource-constrained environments.
  3. Easy to Integrate: No third-party dependencies, simple integration.
  4. Production-Verified: Validated in multiple Tencent applications with hundreds of millions of users.
  5. Continuously Maintained: Active open-source community and continuous version updates.
  6. Broad Compatibility: Supports model import from mainstream deep learning frameworks.

ncnn is an ideal choice for mobile AI application development, especially for developers and enterprises that need to deploy deep learning models on mobile devices.

Star History Chart