[Building MNN] After the build completed and PyMNN was installed in a virtual Python environment, importing MNN fails #3053
Labels: question (Further information is requested)
Comments
It looks like the libc++ used at build time and the libc++ used at runtime are inconsistent.
Yes, I built in the local (system) environment but run inside the virtual environment. Is there any way to fix this?
Is it the OpenCL .so that fails to load?
Did you enable MNN_OPENCL at build time?
Yes, they were all enabled.
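A minimal way to check the suspected mismatch is to compare the C++ runtime the extension resolves at run time with the one available at build time (a sketch only: the extension path is taken from the traceback below, while /home/orangepi/archiconda3/envs/mnn/lib/libstdc++.so.6 and /usr/lib/aarch64-linux-gnu/libstdc++.so.6 are assumed locations for the conda-provided and system copies on this board):
# Which libstdc++ does the extension pick up when loaded from the virtual environment?
ldd /home/orangepi/archiconda3/envs/mnn/lib/python3.6/site-packages/MNN_OPENCL-2.9.6-py3.6-linux-aarch64.egg/_mnncengine.cpython-36m-aarch64-linux-gnu.so | grep libstdc++
# Compare the symbol versions exported by the conda copy and the system copy
strings /home/orangepi/archiconda3/envs/mnn/lib/libstdc++.so.6 | grep GLIBCXX | tail -n 3
strings /usr/lib/aarch64-linux-gnu/libstdc++.so.6 | grep GLIBCXX | tail -n 3
If the conda copy stops at an older GLIBCXX version than the system copy, the build-time and runtime C++ libraries are indeed inconsistent.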
Platform (if cross-compiling, please also include the cross-compilation target platform):
RK3588, native build
gcc version: 11.4.0
g++ version:
Ubuntu 22.04
GitHub version:
git clone https://github.com/alibaba/MNN.git
Build steps:
cmake ..
make -j4
python build_deps.py opencl
python setup.py install --version 2.9.6 --deps opencl
Build log:
Only OpenCL was enabled during the build.
The cmake configuration is as follows:
option(MNN_USE_SYSTEM_LIB "For opencl and vulkan, use system lib or use dlopen" OFF)
option(MNN_BUILD_HARD "Build -mfloat-abi=hard or not" OFF)
option(MNN_BUILD_SHARED_LIBS "MNN build shared or static lib" ON)
option(MNN_WIN_RUNTIME_MT "MNN use /MT on Windows dll" OFF)
option(MNN_FORBID_MULTI_THREAD "Disable Multi Thread" OFF)
option(MNN_OPENMP "Use OpenMP's thread pool implementation. Does not work on iOS or Mac OS" OFF)
option(MNN_USE_THREAD_POOL "Use MNN's own thread pool implementation" ON)
option(MNN_BUILD_TRAIN "Build MNN's training framework" OFF)
option(MNN_BUILD_DEMO "Build demo/exec or not" OFF)
option(MNN_BUILD_TOOLS "Build tools/cpp or not" ON)
option(MNN_BUILD_QUANTOOLS "Build Quantized Tools or not" ON)
option(MNN_EVALUATION "Build Evaluation Tools or not" ON)
option(MNN_BUILD_CONVERTER "Build Converter" ON)
option(MNN_SUPPORT_DEPRECATED_OP "Enable MNN's tflite quantized op" ON)
option(MNN_DEBUG_MEMORY "MNN Debug Memory Access" OFF)
option(MNN_DEBUG_TENSOR_SIZE "Enable Tensor Size" OFF)
option(MNN_GPU_TRACE "Enable MNN Gpu Debug" OFF)
option(MNN_SUPPORT_RENDER "Enable MNN Render Ops" OFF)
option(MNN_SUPPORT_TRANSFORMER_FUSE "Enable MNN transformer Fuse Ops" OFF)
option(MNN_PORTABLE_BUILD "Link the static version of third party libraries where possible to improve the portability of built executables" OFF)
option(MNN_SEP_BUILD "Build MNN Backends and expression separately. Only works with MNN_BUILD_SHARED_LIBS=ON" ON)
option(NATIVE_LIBRARY_OUTPUT "Native Library Path" OFF)
option(NATIVE_INCLUDE_OUTPUT "Native Include Path" OFF)
option(MNN_AAPL_FMWK "Build MNN.framework instead of traditional .a/.dylib" OFF)
option(MNN_WITH_PLUGIN "Build with plugin op support." OFF)
option(MNN_BUILD_MINI "Build MNN-MINI that just supports fixed shape models." OFF)
option(MNN_USE_SSE "Use SSE optimization for x86 if possiable" ON)
option(MNN_BUILD_CODEGEN "Build with codegen" OFF)
option(MNN_ENABLE_COVERAGE "Build with coverage enable" OFF)
option(MNN_BUILD_PROTOBUFFER "Build with protobuffer in MNN" ON)
option(MNN_BUILD_OPENCV "Build OpenCV api in MNN." OFF)
option(MNN_BUILD_LLM "Build llm library based MNN." OFF)
option(MNN_BUILD_DIFFUSION "Build diffusion demo based MNN." OFF)
option(MNN_INTERNAL "Build with MNN internal features, such as model authentication, metrics logging" OFF)
option(MNN_JNI "Build MNN Jni for java to use" OFF)
option(MNN_SUPPORT_BF16 "Enable MNN's bf16 op" OFF)
option(MNN_LOW_MEMORY "Build MNN support low memory for weight quant model." OFF)
option(MNN_CPU_WEIGHT_DEQUANT_GEMM "Build MNN CPU weight dequant related gemm kernels." OFF)
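Note that MNN_OPENCL itself does not appear in the option list above; in this report it is only enabled indirectly through build_deps.py opencl. For reference, a configure step that turns it on explicitly would look roughly like the following (a sketch, not the exact command used in this report):
cmake .. -DMNN_OPENCL=ON
make -j4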
When importing MNN in Python, the following error occurs:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/orangepi/archiconda3/envs/mnn/lib/python3.6/site-packages/MNN_OPENCL-2.9.6-py3.6-linux-aarch64.egg/MNN/__init__.py", line 4, in <module>
    from _mnncengine import *
ImportError: /home/orangepi/archiconda3/envs/mnn/lib/python3.6/site-packages/MNN_OPENCL-2.9.6-py3.6-linux-aarch64.egg/_mnncengine.cpython-36m-aarch64-linux-gnu.so: undefined symbol: _ZSt28__throw_bad_array_new_lengthv
Could an engineer please help resolve this? Thank you!
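_ZSt28__throw_bad_array_new_lengthv demangles to std::__throw_bad_array_new_length(), a symbol that libstdc++ exports only from version GLIBCXX_3.4.29 (GCC 11) onward, so an extension built with the system gcc 11.4.0 but resolved against an older libstdc++ bundled in the conda environment would fail exactly like this. Two commonly suggested workarounds, sketched under the assumption that the environment ships its own libstdc++ in /home/orangepi/archiconda3/envs/mnn/lib and that the system copy lives at /usr/lib/aarch64-linux-gnu/libstdc++.so.6:
# Option 1: upgrade the C++ runtime inside the conda environment
conda install -c conda-forge libstdcxx-ng
# Option 2: force the loader to use the system libstdc++ that matches the compiler
LD_PRELOAD=/usr/lib/aarch64-linux-gnu/libstdc++.so.6 python -c "import MNN"
Rebuilding MNN from inside the virtual environment, so that the build-time and runtime libraries match, would avoid the problem as well.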