ncnn is a high-performance neural network inference computing framework optimized for mobile platforms. More details can be found in the ncnn project itself.
This project offers tools for building ncnn on Windows x64, along with C-style wrappers that make the library easier to use.
- Visual Studio 2015 or 2017
- CMake (>3.2)
Open the appropriate Command Prompt from the Start menu.
For example, the VS2015 x64 Native Tools Command Prompt:
C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC>
Change to your working directory:
cd /to/project/dir
If the cmake command is not available from the Command Prompt, add it to the system PATH variable:
set PATH=%PATH%;C:\Program Files (x86)\CMake\bin
build-windows.bat
If the build succeeds, the results will be in the install directory and the Visual Studio solution will be in the sln directory.
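Putting the Windows steps together, a typical session looks like the following (the working directory path is illustrative; adjust it to your machine):

```
:: Run from a VS2015/VS2017 x64 Native Tools Command Prompt.
cd C:\work\this-project

:: Make cmake available if it is not already on PATH.
set PATH=%PATH%;C:\Program Files (x86)\CMake\bin

:: Build; results land in install\, the solution in sln\.
build-windows.bat
```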
- android-ndk (< 16)
set ANDROID_NDK=/your/ndk/path
build.bat
Build files will be in build-android\install.
Instructions for using the original ncnn APIs can be found here. Our C-style wrapper aims to make the library easier to use. These wrappers were originally developed for caffe, and we keep the caffe_ prefix for compatibility.
Method | Code | Note |
---|---|---|
Load a net | handle = caffe_net_load(const char* prototxt, const char* weights) | |
Set input | caffe_net_setBlob(handle, "data", input_blob) | * |
Forward | caffe_net_forward(handle) | |
Get output | out_blob = caffe_net_getBlob(handle, "feat") | * |
Get layer names | std::vector<std::string> names = caffe_net_getLayerNames(handle) | |
Clean up | input_blob.destroy(); out_blob.destroy() | destroy blobs |
Release handle | caffe_net_release(handle) | |
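To show the wrapper flow end to end, here is a hedged sketch of one inference pass. The header name caffe_net.h, the Blob type, and the model file names are assumptions for illustration; check the project's actual headers for the real names.

```cpp
// Sketch of a full inference pass through the C-style wrapper.
// "caffe_net.h" and "Blob" are placeholder names, not confirmed API.
#include <cstdio>
#include <string>
#include <vector>
#include "caffe_net.h"

int main()
{
    // Load the network from a prototxt + weights pair.
    void* handle = caffe_net_load("model.prototxt", "model.bin");

    // The input blob is assumed to be filled with preprocessed
    // data beforehand (see the * note in the table above).
    Blob input_blob = /* fill with preprocessed input data */;
    caffe_net_setBlob(handle, "data", input_blob);

    // Run inference, then fetch the output blob by layer name.
    caffe_net_forward(handle);
    Blob out_blob = caffe_net_getBlob(handle, "feat");

    // Layer names can be enumerated, e.g. for debugging.
    std::vector<std::string> names = caffe_net_getLayerNames(handle);
    for (const std::string& n : names)
        std::printf("layer: %s\n", n.c_str());

    // Blobs must be destroyed explicitly, then the handle released.
    input_blob.destroy();
    out_blob.destroy();
    caffe_net_release(handle);
    return 0;
}
```

Note that setBlob and getBlob are marked * in the table: the caller owns the input and output blobs and is responsible for destroying them before releasing the net handle.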