[None][feat] ModelOpt - Align quantization_config with llm-compressor format #28472

Triggered via pull request November 18, 2025 15:12
Status: Success
Total duration: 11m 0s

precommit-check.yml

on: pull_request
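
For context on the PR title: llm-compressor serializes its quantization settings under a `quantization_config` block in a model's `config.json`, and this PR aligns ModelOpt's output with that format. Below is a minimal sketch, expressed as a Python dict, of what such a block can look like; the field names (`quant_method`, `config_groups`, `ignore`) and values are illustrative assumptions and are not taken from this PR's diff.

```python
# Illustrative sketch only: an llm-compressor-style quantization_config as it
# might appear in a model's config.json. Keys and values below are assumptions
# for illustration; the actual schema aligned by this PR is defined in the diff.
import json

quantization_config = {
    "quant_method": "compressed-tensors",  # assumed method identifier
    "config_groups": {
        "group_0": {
            "targets": ["Linear"],  # module types this group applies to
            "weights": {
                "num_bits": 8,
                "type": "float",
                "symmetric": True,
                "strategy": "tensor",
                "dynamic": False,
            },
            "input_activations": {
                "num_bits": 8,
                "type": "float",
                "symmetric": True,
                "strategy": "tensor",
                "dynamic": False,
            },
        }
    },
    "ignore": ["lm_head"],  # modules excluded from quantization
}

print(json.dumps(quantization_config, indent=2))
```

A loader that already understands the llm-compressor layout could then consume a ModelOpt-exported checkpoint without a separate conversion step, which is the practical point of aligning the two formats.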