
---
description: NCNN export utilities for converting PyTorch YOLO models to NCNN format using PNNX. Optimized for mobile and embedded platforms with support for FP16 inference on ARM architectures.
keywords: Ultralytics, NCNN, model export, PyTorch to NCNN, PNNX, mobile deployment, ARM, embedded systems, FP16, lightweight inference, Tencent NCNN, edge AI
---

# Reference for `ultralytics/utils/export/ncnn.py`

!!! success "Improvements"

    This page is sourced from [https://github.com/ultralytics/ultralytics/blob/main/ultralytics/utils/export/ncnn.py](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/utils/export/ncnn.py). Have an improvement or example to add? Open a [Pull Request](https://docs.ultralytics.com/help/contributing/) — thank you! 🙏

::: ultralytics.utils.export.ncnn.torch2ncnn