Is it possible to use the Neural Compute Stick 2 (OpenVINO)?

Am I able to run mimic3 and/or mycroft-precise using the Neural Compute Stick 2 (OpenVINO)? It has a built-in way to convert TensorFlow models to the format it uses.
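For reference, that conversion step goes through OpenVINO's Model Optimizer, which produces the IR format the stick runs. Below is only a minimal sketch, assuming a recent OpenVINO (2022.1+) with the dev tools installed and a TensorFlow SavedModel at a placeholder path; older releases like the 2021.2 docs linked below expose the same conversion through the `mo` command-line tool instead.

```python
# Sketch: convert a TensorFlow SavedModel to OpenVINO IR (.xml + .bin).
# Assumes `pip install openvino-dev`; "saved_model_dir" is a placeholder path.
from openvino.tools.mo import convert_model
from openvino.runtime import serialize

# Convert the TensorFlow model in memory.
ov_model = convert_model("saved_model_dir")

# Write the IR files next to the script so an OpenVINO device can load them.
serialize(ov_model, "model.xml")
```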

https://docs.openvino.ai/2021.2/omz_models_public_mozilla_deepspeech_0_8_2_mozilla_deepspeech_0_8_2.html

From the onnxruntime link above:
OpenVINO™ Execution Provider with Onnx Runtime on Linux installed from PyPi.org comes with prebuilt OpenVINO™ libs and supports the flag CXX11_ABI=0, so there is no need to install OpenVINO™ separately.

To enable the CXX11_ABI=1 flag, build the Onnx Runtime Python wheel packages from source. For build instructions, please see the [BUILD page](https://onnxruntime.ai/docs/build/eps.html#openvino). OpenVINO™ Execution Provider wheels on Linux built from source will not have prebuilt OpenVINO™ libs, so we must set the OpenVINO™ environment variables using the full installer package of OpenVINO™:

C:\<openvino_install_directory>\setupvars.bat
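If that route works out, targeting the stick from Python looks roughly like the sketch below. This is only an assumption-laden example: it assumes the onnxruntime-openvino wheel is installed and that the NCS2 is visible to OpenVINO as the MYRIAD device; the model path, input name, and input shape are placeholders, not mimic3 specifics.

```python
# Sketch: run an ONNX model on the NCS2 via ONNX Runtime's OpenVINO EP.
# Assumes the onnxruntime-openvino package; "model.onnx" and "input" are placeholders.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=["OpenVINOExecutionProvider"],
    provider_options=[{"device_type": "MYRIAD_FP16"}],  # NCS2 = MYRIAD plugin
)

# Dummy input just to exercise the session; the real shape depends on the model.
dummy = np.zeros((1, 80), dtype=np.float32)
outputs = session.run(None, {"input": dummy})
print([o.shape for o in outputs])
```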

Also, from here it looks like it is a possible combination.

What do you want to use it for? It’s got limited model compatibility, so it’s not generally applicable to all tf uses.

I want to run the ONNX models on the Raspberry Pi to see if it would speed up response times.

Probably better off asking on the OpenVINO forums, tbh.

The ONNX models for mimic 3.

See above; also maybe inquire with the ONNX folks?

Perhaps this developer can help you out:

He has a lot of converters and, by the looks of it, has experience with OpenVINO, TF, and ONNX.


Thank you! I will check it out