ncnn

Jul 20, 2023

High-performance neural network inference framework

ncnn is a high-performance neural network inference computing framework optimized for mobile platforms. ncnn was designed with mobile deployment in mind from the start, has no third-party dependencies, and is cross-platform; on mobile phone CPUs it runs faster than any other known open-source framework. Using the efficient ncnn implementation, developers can easily deploy deep learning models to mobile platforms and build intelligent apps, bringing artificial intelligence to users' fingertips. ncnn is currently used in many Tencent applications, such as QQ, Qzone, WeChat, and Pitu.
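As an illustration of that deployment workflow, the following is a minimal C++ sketch of loading a converted model and running one inference with ncnn's public API. The file names (`model.param`, `model.bin`), input size, and the blob names `"data"`/`"output"` are placeholders; a real model converted with ncnn's tools is required.

```cpp
#include "net.h"  // ncnn public header (ncnn::Net, ncnn::Mat, ncnn::Extractor)

int main()
{
    ncnn::Net net;

    // Load the network definition and weights produced by ncnn's model
    // converters (file names here are placeholders).
    if (net.load_param("model.param") != 0 || net.load_model("model.bin") != 0)
        return -1;

    // Wrap input data in an ncnn::Mat; here a dummy 224x224 3-channel
    // tensor filled with a constant value stands in for a real image.
    ncnn::Mat in(224, 224, 3);
    in.fill(0.5f);

    // Create an extractor and run inference. "data" and "output" are
    // assumed blob names taken from the converted network definition.
    ncnn::Extractor ex = net.create_extractor();
    ex.input("data", in);

    ncnn::Mat out;
    ex.extract("output", out);  // blocking; fills `out` with the result

    return 0;
}
```

In a mobile app this code would typically run off the UI thread; ncnn also offers options such as thread count and Vulkan GPU compute that can be configured on the `Net` before loading.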


