[Detector Support]: Frigate Plus ONNX model on Jetson Orin Nano and Jetpack 6 is using cpu instead of gpu #19350
Answered by jshumaker · asked in Detector Support · Aug 1, 2025
Replies: 1 comment · 10 replies
Are you running nvidia-smi inside of the Docker container? It will not show any processes there. Based on the inference speed, the CPU does not appear to be in use, as inference would likely be much slower if it were.
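On Jetson boards, a rough way to confirm the GPU is actually doing the work is to watch `tegrastats` on the host while Frigate is running. A sketch of the usage (the exact output fields vary by JetPack version, and the example line below is illustrative, not captured from this device):

```shell
# Run on the Jetson host (not inside the container); sample stats once per second.
# The GR3D_FREQ field reports GPU load and should climb when detection is active.
sudo tegrastats --interval 1000

# Illustrative output line (fields differ across JetPack releases):
# RAM 3456/7620MB ... GR3D_FREQ 62% ...
```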

When running 2 cameras, the detection process regularly freezes up, so I've dropped the resolution down to 320x320, which seems far more stable on this device. Thanks again.
To summarize:
- The jp6 image is properly using the GPU on the Jetson Orin Nano; however, nvidia-smi is not the appropriate tool on this device to check GPU usage.
- `tegrastats` is built in and a better option. `jtop` can be installed and is better still, but it currently does not support JetPack 6.2.1 and requires manual modification to run.
- The Jetson Orin Nano may be able to handle 640x640 detection for one camera, but it is unstable with 2. 320x320 works well for 2 cameras: with 1 detector, 30-40 ms inference time; with 2, 50-60 ms; and with 3, 40-65 ms. I am s…
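For reference, a minimal config sketch matching the setup described above (320x320 Frigate+ ONNX model). The `plus://` model ID is a placeholder, and key names should be checked against the Frigate documentation for your version:

```yaml
# Sketch only: assumes Frigate's ONNX detector on the Jetson JP6 image.
detectors:
  onnx:
    type: onnx

model:
  # Placeholder Frigate+ model ID; substitute your own.
  path: plus://<your-model-id>
  width: 320
  height: 320
```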