It works well:
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
Found face: x=120, y=186, w=62, h=79, score=0.8306506276130676
Found face: x=311, y=125, w=66, h=81, score=0.8148472309112549
Found face: x=424, y=173, w=64, h=86, score=0.8093323111534119
Inference took (on average): 32.0ms. per image
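The "Found face" lines above follow a simple key=value format, so they are easy to post-process. A minimal parsing sketch (purely illustrative; the LOG string is just the lines from the output above pasted in):

```python
import re

LOG = """\
Found face: x=120, y=186, w=62, h=79, score=0.8306506276130676
Found face: x=311, y=125, w=66, h=81, score=0.8148472309112549
Found face: x=424, y=173, w=64, h=86, score=0.8093323111534119
"""

# Pull the numeric fields out of each "Found face" line.
pattern = re.compile(
    r"Found face: x=(\d+), y=(\d+), w=(\d+), h=(\d+), score=([\d.]+)"
)

faces = [
    {"x": int(x), "y": int(y), "w": int(w), "h": int(h), "score": float(s)}
    for x, y, w, h, s in pattern.findall(LOG)
]

for face in faces:
    print(face)
```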
With --use-qnn, there are a lot of logs:
Initializing HtpProvider
Specified config SOC, ignoring on real target
/prj/qct/webtech_scratch20/mlg_user_admin/qaisw_source_repo/rel/qairt-2.39.0/release/snpe_src/avante-tools/prebuilt/dsp/hexagon-sdk-5.4.0/ipc/fastrpc/rpcmem/src/rpcmem_android.c:38:dummy call to rpcmem_init, rpcmem APIs will be used from libxdsprpc
[1] [4294967295] has incorrect Value 1895298384, expected equal to 0.
QnnBackend_validateOpConfig failed 3110
Failed to validate op node_id_0_op_type_Conv2d_op_count_0 with error 0xc26
Logs will be sent to the system's default channel
[1] [4294967295] has incorrect Value 1895298384, expected equal to 0.
QnnBackend_validateOpConfig failed 3110
Failed to validate op node_id_0_op_type_Conv2d_op_count_0 with error 0xc26
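One thing worth noticing in the logs: 0xc26 and 3110 are the same number in hex and decimal, so QnnBackend_validateOpConfig and the Conv2d op-validation failure appear to be reporting a single underlying QNN status code, not two separate errors. A quick check:

```python
# 0xc26 (from "Failed to validate op ... with error 0xc26") equals
# 3110 (from "QnnBackend_validateOpConfig failed 3110").
assert 0xc26 == 3110
print(hex(3110))  # → 0xc26
```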
It's not working even though I just followed the tutorial.
Does anyone know what's going on?
(.venv) ubuntu@ubuntu:~/yolov11$ qnn-platform-validator --backend dsp --testBackend
PF_VALIDATOR: DEBUG: Calling PlatformValidator->setBackend
PF_VALIDATOR: DEBUG: Calling PlatformValidator->isBackendHardwarePresent
PF_VALIDATOR: DEBUG: Calling PlatformValidator->isBackendAvailable
PF_VALIDATOR: DEBUG: Should be able to access atleast one of libraries from : libc.so.6
PF_VALIDATOR: DEBUG: dlOpen successfull for library : libc.so.6
PF_VALIDATOR: DEBUG: Should be able to access atleast one of libraries from : libcdsprpc.so
PF_VALIDATOR: DEBUG: dlOpen successfull for library : libcdsprpc.so
Backend DSP Prerequisites: Present.
PF_VALIDATOR: DEBUG: Calling PlatformValidator->backendCheck
PF_VALIDATOR: DEBUG: Should be able to access atleast one of libraries from : libc.so.6
PF_VALIDATOR: DEBUG: dlOpen successfull for library : libc.so.6
PF_VALIDATOR: DEBUG: Should be able to access atleast one of libraries from : libcdsprpc.so
PF_VALIDATOR: DEBUG: dlOpen successfull for library : libcdsprpc.so
PF_VALIDATOR: DEBUG: Starting calculator test
PF_VALIDATOR: DEBUG: Loading sample stub: libQnnHtpV68CalculatorStub.so
PF_VALIDATOR: DEBUG: Successfully loaded DSP library - 'libQnnHtpV68CalculatorStub.so'. Setting up pointers.
PF_VALIDATOR: DEBUG: Success in executing the sum function
Unit Test on the backend DSP: Passed.
QNN is supported for backend DSP on the device.
*********** Results Summary ***********
Backend = DSP
{
Backend Hardware : Supported
Backend Libraries : Found
Library Version : Not Queried
Core Version : Not Queried
Unit Test : Passed
}
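The validator's dlOpen checks can be reproduced outside the tool with a short ctypes sketch. This is an assumption-laden illustration, not part of qnn-platform-validator; note that libcdsprpc.so is only expected on Qualcomm targets with the FastRPC stack, so on an ordinary dev machine it will usually report missing:

```python
import ctypes

def can_dlopen(name: str) -> bool:
    """Return True if the shared library can be loaded via dlopen."""
    try:
        ctypes.CDLL(name)
        return True
    except OSError:
        return False

# Libraries the platform validator probes, per the log above.
for lib in ("libc.so.6", "libcdsprpc.so"):
    print(lib, "->", "loadable" if can_dlopen(lib) else "missing")
```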
We have tested that the model runs successfully on our side. Could you kindly share the contents of the ~/aihub-npu directory on your device?
May I also ask which Python version you are using when running the model?
May I confirm whether you followed the steps in the link below?
If not, please share the exact commands you ran, or screenshots of the document you used for testing.
Alternatively, you may flash the latest release and repeat the test.
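To answer the Python-version question precisely, something like the following could be run inside the same .venv used for the model (a trivial sketch, not a required step):

```python
import sys

# Report the interpreter version (useful when filing QNN / AI Hub issues).
print(sys.version)
print("major.minor:", f"{sys.version_info.major}.{sys.version_info.minor}")
```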