Am I able to run mimic3 and/or mycroft-precise using the Neural Compute Stick 2 (OpenVINO)? It has a built-in way to convert TensorFlow models to the format it uses.
From the onnxruntime link above: the OpenVINO™ Execution Provider for ONNX Runtime on Linux, installed from PyPI, comes with prebuilt OpenVINO™ libs and supports the flag CXX11_ABI=0, so there is no need to install OpenVINO™ separately. To enable the CXX11_ABI=1 flag, build the ONNX Runtime Python wheel packages from source; for build instructions, see the [BUILD page](https://onnxruntime.ai/docs/build/eps.html#openvino). OpenVINO™ Execution Provider wheels built from source on Linux will not have prebuilt OpenVINO™ libs, so the OpenVINO™ environment variables must be set using the full OpenVINO™ installer package.
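For the from-source case, setting the environment usually means sourcing the `setupvars.sh` script shipped with the OpenVINO™ installer. A minimal sketch, assuming a default install location (the path and version shown are illustrative and will differ per system):

```shell
# Hypothetical default install path; adjust to wherever the OpenVINO™
# installer actually placed the toolkit on your system.
source /opt/intel/openvino_2022/setupvars.sh
```

The PyPI-wheel case, by contrast, needs no extra environment setup beyond `pip install onnxruntime-openvino`.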
Also, from here it looks like this is a possible combination.