Zedong Wang1, Siyuan Li2, Dan Xu1
1The Hong Kong University of Science and Technology, 2Zhejiang University

(a) Most existing MTL optimization methods focus on addressing conflicts in parameter updates. (b) Rep-MTL instead leverages task saliency in shared representation space to explicitly facilitate cross-task information sharing while preserving task-specific signals via regularization, without modifications to either the underlying optimizers or model architectures.
Rep-MTL is a representation-level regularization method for multi-task learning. It introduces task-saliency-based objectives that encourage inter-task complementarity via Cross-task Saliency Alignment (CSA) while mitigating negative transfer across tasks via Task-specific Saliency Regulation (TSR).
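To make the two objectives concrete, here is a minimal, hypothetical sketch of representation-level saliency regularizers. It is not the paper's implementation: the saliency definition (per-task gradients over the shared representation), the pairwise-cosine form of the alignment term, and the entropy form of the regulation term are all illustrative assumptions.

```python
import numpy as np

EPS = 1e-8


def cosine(u, v):
    """Cosine similarity between two flattened saliency vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + EPS))


def csa_penalty(saliencies):
    """Cross-task Saliency Alignment (sketch): penalize disagreement
    between per-task saliency vectors over the shared representation,
    so tasks are encouraged to attend to complementary, shared evidence."""
    n = len(saliencies)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += 1.0 - cosine(saliencies[i], saliencies[j])
            pairs += 1
    return total / max(pairs, 1)


def tsr_penalty(saliencies):
    """Task-specific Saliency Regulation (sketch): keep each task's
    saliency distribution peaked (low entropy) so task-specific signals
    are not washed out by the alignment term."""
    total = 0.0
    for s in saliencies:
        p = np.abs(s) / (np.abs(s).sum() + EPS)
        total += -(p * np.log(p + EPS)).sum()
    return total / len(saliencies)
```

In a training loop, these penalties would be weighted and added to the sum of task losses; identical saliency maps give a CSA penalty of zero, while orthogonal ones are maximally penalized.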
We evaluate Rep-MTL on several challenging MTL benchmarks spanning diverse computer vision scenarios:
| Dataset | Tasks | Scenario | Download |
|---|---|---|---|
| NYUv2 | SemSeg + Depth Est. + Surface Normal Pred. | Indoor Dense Prediction | Link |
| CityScapes | SemSeg + Depth Est. | Outdoor Dense Prediction | Link |
| Office-31 | Image Classification (31 classes) | Domain Adaptation | Link |
| Office-Home | Image Classification (65 classes) | Domain Adaptation | Link |
- [July 24, 2025] 🎉 Rep-MTL was selected as an ICCV 2025 Highlight! We are working on cleaning and organizing our codebase. Stay tuned!
- [June 26, 2025] Rep-MTL was accepted to ICCV 2025, with final ratings: 5/5/6 (out of 6).
For questions or research discussions, please contact Zedong Wang at [email protected].
@inproceedings{iccv2025repmtl,
  title={Rep-MTL: Unleashing the Power of Representation-level Task Saliency for Multi-Task Learning},
  author={Wang, Zedong and Li, Siyuan and Xu, Dan},
  booktitle={IEEE/CVF International Conference on Computer Vision (ICCV)},
  year={2025}
}
We thank the following great repositories that facilitated our research: LibMTL, CAGrad, MTAN, FAMO, and Nash-MTL. We also extend our appreciation to many other studies in the community for their foundational contributions that inspired this work.