[HW Accel Support]: Most Optimized ONNX Model for Nvidia GPU on Frigate 16 #19333
Replies: 4 comments 31 replies
The YOLO-NAS model will be considerably more accurate than the yolov7 model the tensorrt detector used to run. Also, the inference time you are seeing is generally plenty fast, especially since two detectors are running at the same time. As long as you don't see any skipped fps, nothing is lost at that inference speed, and the increased accuracy is a benefit.
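For reference, a YOLO-NAS setup on the ONNX detector looks roughly like this in the Frigate config. This is a sketch based on the Frigate documentation; the model path and the 320x320 input size are assumptions for a `yolo_nas_s` export and must match whatever you actually exported:

```yaml
detectors:
  onnx:
    type: onnx

model:
  model_type: yolonas
  width: 320                        # must match the exported model's input size
  height: 320
  input_pixel_format: bgr
  input_tensor: nchw
  path: /config/yolo_nas_s.onnx     # assumed location of your exported model
  labelmap_path: /labelmap/coco-80.txt
```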
I am using YOLO-NAS right now, but it bogs down when a lot of movement is detected; I've seen some skipped fps at times, unfortunately, along with GPU usage spiking.
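If the detector can't keep up during motion bursts, a common mitigation is to keep the detect fps modest and mask regions that trigger constant motion. A hedged sketch of the relevant camera config (the camera name, resolution, and mask polygon are placeholders, not values from this thread):

```yaml
cameras:
  front_door:                        # placeholder camera name
    detect:
      fps: 5                         # Frigate's recommended default; higher values multiply detector load
      width: 1280                    # placeholder detect resolution
      height: 720
    motion:
      mask:
        - 0,0,200,0,200,100,0,100    # placeholder polygon covering a noisy area (e.g. a timestamp overlay)
```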
Describe the problem you are having
As some already have, I've moved from TensorRT-supported models to ONNX going into the latest 0.16 beta and now Release Candidate 1.
One thing that's been keeping me up at night is trying to get an optimized model that works well with the ONNX detector on an NVIDIA GPU.
An A4000 is no slouch, but I'm starting to observe a performance hit moving away from the directly supported TensorRT detector: from ~3 ms to ~10 ms on ONNX (depending on the model).
I'm not a developer or a coder, and I truly want to acknowledge the amazing support that's been provided to this project.
So a big thank you for keeping up with all the questions!
I do have a big ask: does anyone know a way to get me back closer to those faster inference speeds?
I'm sure I'm not the only one invested in NVIDIA GPUs who is now struggling to adapt to the new ONNX detector.
I like things fast and light. Is there a way to optimize yolov7-tiny or another model to run on the ONNX detector, optimized for my card?
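When comparing detector latencies across models, it helps to measure mean and tail latency rather than eyeballing a single number. This is a generic, stdlib-only timing harness, not Frigate code; you would wrap whatever inference call you want to compare in the `infer` callable:

```python
import statistics
import time


def benchmark(infer, warmup=10, runs=100):
    """Time a zero-argument callable that performs one inference.

    Returns (mean_ms, p95_ms) over `runs` timed invocations,
    after `warmup` untimed invocations to let caches settle.
    """
    for _ in range(warmup):
        infer()

    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        infer()
        samples.append((time.perf_counter() - t0) * 1000.0)  # seconds -> ms

    samples.sort()
    p95 = samples[int(0.95 * (len(samples) - 1))]
    return statistics.mean(samples), p95
```

Running two models through the same harness on the same input gives a like-for-like comparison of the ~3 ms vs ~10 ms numbers quoted above.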
Version
0.16.0-0b7a33d
Frigate config file
docker-compose file or Docker CLI command
Relevant Frigate log output
Relevant go2rtc log output
FFprobe output from your camera
root@4cf93339850f:/opt/frigate# ffprobe rtsp://127.0.0.1:8554/Front-Door
bash: ffprobe: command not found
Install method
Docker Compose
Object Detector
TensorRT
Network connection
Wired
Camera make and model
hivision
Screenshots of the Frigate UI's System metrics pages
Any other information that may be helpful
Need for Speed :