
Getting an error when trying to use the OpenVINO Execution Provider #473

@NickM-27

Description


Describe the issue

We run a Python service in a Docker container; the service runs ONNX models, including support for the OpenVINO execution provider, and is started and supervised by S6. We have found that running the model with

import onnxruntime as ort

ort.InferenceSession(
    "/config/model_cache/jinaai/jina-clip-v1/vision_model_fp16.onnx",
    providers=["OpenVINOExecutionProvider", "CPUExecutionProvider"],
    provider_options=[{"cache_dir": "/config/model_cache/openvino/ort", "device_type": "GPU"}, {}],
)

results in the error:

EP Error /onnxruntime/onnxruntime/core/session/provider_bridge_ort.cc:1637 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_openvino.so with error: /usr/local/lib/python3.9/dist-packages/onnxruntime/capi/libopenvino_onnx_frontend.so.2430: undefined symbol: _ZN2ov3Any4Base9to_stringB5cxx11Ev
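For what it's worth, the mangled symbol in the error can be demangled with c++filt (part of binutils) to see what the loader failed to resolve. It appears to be part of the OpenVINO C++ runtime, which would suggest the libopenvino that gets picked up at load time does not match the version the ONNX frontend was built against:

```shell
# Demangle the missing symbol from the error message
c++filt _ZN2ov3Any4Base9to_stringB5cxx11Ev
# -> ov::Any::Base::to_string[abi:cxx11]()
```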

However, when running the exact same code in an interactive python3 shell (as opposed to in the main service process), it works correctly. I am hoping to understand whether there is any significance to this error and whether it might indicate what is going wrong.
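Since the failure only occurs under the supervisor, one possibility is that the two processes resolve shared libraries differently. A minimal diagnostic sketch, using only the standard library, that could be run both inside the S6-started service and in the interactive shell so the outputs can be diffed (which environment variables actually matter here is an assumption, not something confirmed in this issue):

```python
import os
import sys

# Dump the interpreter and dynamic-loader context that commonly differs
# between a supervised service process and an interactive shell.
print("executable:      ", sys.executable)
print("version:         ", sys.version)
print("LD_LIBRARY_PATH: ", os.environ.get("LD_LIBRARY_PATH", "<unset>"))
print("LD_PRELOAD:      ", os.environ.get("LD_PRELOAD", "<unset>"))
print("sys.path:        ", sys.path)
```

If LD_LIBRARY_PATH or sys.path differ between the two runs, a different libopenvino may be loaded first in the failing process.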

To reproduce

import onnxruntime as ort

ort.InferenceSession(
    "/config/model_cache/jinaai/jina-clip-v1/vision_model_fp16.onnx",
    providers=["OpenVINOExecutionProvider", "CPUExecutionProvider"],
    provider_options=[{"cache_dir": "/config/model_cache/openvino/ort", "device_type": "GPU"}, {}],
)

Urgency

No response

Platform

Linux

OS Version

Debian

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

onnxruntime-openvino 1.19.*

ONNX Runtime API

Python

Architecture

X64

Execution Provider

OpenVINO

Execution Provider Library Version

openvino 2024.3.*
