Jetson Orin Nano JetPack 6: When generating the model yolov7-320-dla I get a generation error #21045
Replies: 1 comment
Based on the error logs, the DLA (Deep Learning Accelerator) model generation is failing due to TensorRT engine build errors. The logs show several DLA compatibility issues:
Solution: The YOLOv7-320 model architecture has layers that are incompatible with the Jetson Orin Nano's DLA. Instead of using the -dla variant, configure the plain GPU model:
In your docker-compose.yaml:

```yaml
environment:
  - YOLO_MODELS=yolov7-320
  - USE_FP16=true
```

And in your Frigate config:

```yaml
model:
  path: /config/model_cache/tensorrt/yolov7-320.trt
```
The DLA is designed for specific layer types and has dimensional constraints that the YOLOv7-320 architecture exceeds. Running on the GPU will still provide hardware acceleration without the DLA-specific limitations.

Have another question? Just tag @inkeep.
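For completeness, here is a minimal sketch of the relevant Frigate config sections when running this model on the GPU via the TensorRT detector. The detector name, device index, and model dimensions are illustrative assumptions; adjust them to your setup:

```yaml
# Sketch only — a Frigate config using the TensorRT detector on the GPU
# (note: no -dla suffix on the model). Values are illustrative.
detectors:
  tensorrt:
    type: tensorrt
    device: 0          # first (and only) GPU on the Orin Nano

model:
  path: /config/model_cache/tensorrt/yolov7-320.trt
  input_tensor: nchw
  input_pixel_format: rgb
  width: 320
  height: 320
```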
Describe the problem you are having
I run a Jetson Orin Nano 8GB. Following the Frigate documentation, I created a docker-compose.yaml and appended the -dla suffix to the model yolov7-320, giving Model=yolov7-320-dla. The build process reports an error and later exits with code 2 because the *.trt files cannot be found. If I build without -dla, the build completes successfully.
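One quick way to confirm the symptom (the engine file never being written to the model cache) is a small shell check. This is a sketch; the cache path `/config/model_cache/tensorrt` and the `check_engine` helper name are assumptions based on the model path shown in the config above:

```shell
# check_engine: returns 0 if the TensorRT engine file for a model exists
# in the given cache directory, 1 otherwise.
check_engine() {
  cache="$1"
  model="$2"
  if [ -f "$cache/$model.trt" ]; then
    echo "engine found: $cache/$model.trt"
    return 0
  else
    echo "engine missing: $cache/$model.trt (build likely failed)" >&2
    return 1
  fi
}

# Example usage (run inside the Frigate container):
# check_engine /config/model_cache/tensorrt yolov7-320
# check_engine /config/model_cache/tensorrt yolov7-320-dla
```

If the plain `yolov7-320.trt` exists but `yolov7-320-dla.trt` does not, the failure is specific to the DLA build step rather than the model download.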
Version
0.16.2-4d58206
Frigate config file
docker-compose file or Docker CLI command
Relevant Frigate log output
Install method
Docker Compose
Object Detector
TensorRT
Screenshots of the Frigate UI's System metrics pages
The server is not starting up.
Any other information that may be helpful
No response