[Detector Support]: onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument #19308
### Describe the problem you are having

I have updated my Frigate to 0.16-rc1. Since the system no longer allows TensorRT models, I used the YOLOv9 command to generate the ONNX file (changing the 320 input to 640), changed the detector from `tensorrt` to `onnx` in the config, and added the new model using the config below. When I try to start the container, the config validates successfully, but then the detector fails with the error shown in the log output.

### Version

0.16-rc1

### Frigate config file

```yaml
detectors:
  onnx:
    type: onnx

model:
  path: /config/model_cache/onnx/yolov9-m-640.onnx
  model_type: yolo-generic
  width: 640
  height: 640
  input_tensor: nhwc
  labelmap_path: /labelmap/coco-80.txt
```

### docker-compose file or Docker CLI command

```yaml
services:
  frigate:
    container_name: frigate
    privileged: true # this may not be necessary for all setups
    restart: unless-stopped
    image: ghcr.io/blakeblackshear/frigate:0.16.0-rc1-tensorrt
    shm_size: "1024mb" # update for your cameras based on calculation above
    devices:
      - /dev/bus/usb:/dev/bus/usb # passes the USB Coral, needs to be modified for other versions
      - /dev/apex_0:/dev/apex_0 # passes a PCIe Coral, follow driver instructions here https://coral.ai/docs/m2/get-started/#2a-on-linux
      - /dev/dri/renderD128 # for intel hwaccel, needs to be updated for your hardware
    volumes:
      - /etc/localtime:/etc/localtime:ro
      - ./config:/config
      - ./media:/media/frigate
      - type: tmpfs # Optional: 1GB of memory, reduces SSD/SD Card wear
        target: /tmp/cache
        tmpfs:
          size: 1000000000
    ports:
      - "5000:5000"
      - "8554:8554" # RTSP feeds
      - "8555:8555/tcp" # WebRTC over tcp
      - "8555:8555/udp" # WebRTC over udp
    environment:
      CUDA_MODULE_LOADING: "LAZY"
      YOLO_MODELS: "yolov9-m-640"
    deploy: # <------------- Add this section
      resources:
        reservations:
          devices:
            - driver: nvidia
              device_ids: ['0'] # this is only needed when using multiple GPUs
              #count: 1 # number of GPUs
              capabilities: [gpu]
```

### Relevant Frigate log output

```
frigate | 2025-07-28 22:11:33.534139171 [2025-07-28 22:11:33] frigate.detectors.plugins.onnx INFO : ONNX: /config/model_cache/onnx/yolov9-m-640.onnx loaded
frigate | 2025-07-28 22:11:33.573330368 [INFO] Starting go2rtc healthcheck service...
frigate | 2025-07-28 22:11:33.667383556 2025/07/28 22:11:33 [error] 246#246: *8 connect() failed (111: Connection refused) while connecting to upstream, client: 127.0.0.1, server: , request: "GET /api/version HTTP/1.1", subrequest: "/auth", upstream: "http://127.0.0.1:5001/auth", host: "127.0.0.1:5000"
frigate | 2025-07-28 22:11:33.667399316 2025/07/28 22:11:33 [error] 246#246: *8 auth request unexpected status: 502 while sending to client, client: 127.0.0.1, server: , request: "GET /api/version HTTP/1.1", host: "127.0.0.1:5000"
frigate | 2025-07-28 22:11:33.675180515 [2025-07-28 22:11:33] frigate.api.fastapi_app INFO : Starting FastAPI app
frigate | 2025-07-28 22:11:34.831404432 [2025-07-28 22:11:34] frigate.api.fastapi_app INFO : FastAPI started
frigate | 2025-07-28 22:11:46.048875777 Process detector:onnx:
frigate | 2025-07-28 22:11:46.048879534 Traceback (most recent call last):
frigate | 2025-07-28 22:11:46.048880967   File "/usr/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
frigate | 2025-07-28 22:11:46.048881849     self.run()
frigate | 2025-07-28 22:11:46.048882921   File "/opt/frigate/frigate/util/process.py", line 41, in run_wrapper
frigate | 2025-07-28 22:11:46.048885235     return run(*args, **kwargs)
frigate | 2025-07-28 22:11:46.048902608            ^^^^^^^^^^^^^^^^^^^^
frigate | 2025-07-28 22:11:46.048903800   File "/usr/lib/python3.11/multiprocessing/process.py", line 108, in run
frigate | 2025-07-28 22:11:46.048935000     self._target(*self._args, **self._kwargs)
frigate | 2025-07-28 22:11:46.048936262   File "/opt/frigate/frigate/object_detection/base.py", line 136, in run_detector
frigate | 2025-07-28 22:11:46.048937354     detections = object_detector.detect_raw(input_frame)
frigate | 2025-07-28 22:11:46.048938356                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
frigate | 2025-07-28 22:11:46.048960328   File "/opt/frigate/frigate/object_detection/base.py", line 86, in detect_raw
frigate | 2025-07-28 22:11:46.048961430     return self.detect_api.detect_raw(tensor_input=tensor_input)
frigate | 2025-07-28 22:11:46.048962903            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
frigate | 2025-07-28 22:11:46.048964105   File "/opt/frigate/frigate/detectors/plugins/onnx.py", line 81, in detect_raw
frigate | 2025-07-28 22:11:46.048985476     tensor_output = self.model.run(None, {model_input_name: tensor_input})
frigate | 2025-07-28 22:11:46.048986688                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
frigate | 2025-07-28 22:11:46.048988221   File "/usr/local/lib/python3.11/dist-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 266, in run
frigate | 2025-07-28 22:11:46.048989313     return self._sess.run(output_names, input_feed, run_options)
frigate | 2025-07-28 22:11:46.049003450            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
frigate | 2025-07-28 22:11:46.049005704 onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Actual: (tensor(uint8)) , expected: (tensor(float))
frigate | 2025-07-28 22:11:53.643399877 [2025-07-28 22:11:53] frigate.watchdog INFO : Detection appears to have stopped. Exiting Frigate...
```

### Install method

Docker Compose

### Object Detector

Other

### Screenshots of the Frigate UI's System metrics pages

The system does not start, so the metrics are not available.

### Any other information that may be helpful

No response
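The dtype mismatch in the traceback can be reproduced without a GPU or a model file. Below is a minimal sketch, where `run_model` is a hypothetical stand-in for onnxruntime's `session.run()` (not Frigate's actual code), showing why a `uint8` feed is rejected and why casting with `astype(np.float32)` resolves it:

```python
import numpy as np

def run_model(feed):
    # Hypothetical stand-in for onnxruntime's session.run(): a real
    # session rejects a feed whose dtype differs from the graph's
    # declared input type, raising INVALID_ARGUMENT as in the log.
    if feed.dtype != np.float32:
        raise ValueError(
            f"Unexpected input data type. Actual: (tensor({feed.dtype})), "
            "expected: (tensor(float))"
        )
    return float(feed.mean())

frame = np.zeros((1, 640, 640, 3), dtype=np.uint8)  # NHWC frame, as configured

try:
    run_model(frame)  # uint8 feed -> rejected, mirroring the traceback
except ValueError as err:
    print(err)

run_model(frame.astype(np.float32))  # cast first -> accepted
```

The cast is exactly the shape of fix discussed in the reply below: make the fed tensor's dtype match the model's declared input type.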
The error

```
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Actual: (tensor(uint8)) , expected: (tensor(float))
```

means the model's input signature declares `tensor(float)`, but the detector fed it a `uint8` tensor. The official Frigate ONNX plugin is expected to convert the input to float before inference. In the documented ONNX detector implementation, the input tensor is explicitly converted:

```python
tensor_output = self.model.run(None, {model_input_name: tensor_input.astype(np.float32)})
```

If you are seeing this error, it is likely that your Frigate version or build is not applying this conversion, or that your model's input signature is not compatible with what Frigate expects. To resolve this: if you built the ONNX model yourself, verify the input node's expected data type. For further details about supported ONNX models and detector setup, see the Supported Hardware documentation.
That worked! Thank you very much, @hawkeye217 and @NickM-27!
See the ONNX models section above where you are looking:
https://deploy-preview-16390--frigate-docs.netlify.app/configuration/object_detectors/#yolo-v3-v4-v7-v9-1