Integrating Task-Specific and Universal Adapters for Pre-Trained Model-based Class-Incremental Learning

State Key Laboratory for Novel Software Technology, Nanjing University

The code repository for "Integrating Task-Specific and Universal Adapters for Pre-Trained Model-based Class-Incremental Learning" (ICCV 2025) in PyTorch. If you use any content of this repo for your work, please cite the following bib entry:

@inproceedings{wang2025integrating,
  title={Integrating Task-Specific and Universal Adapters for Pre-Trained Model-based Class-Incremental Learning},
  author={Yan Wang and Da-Wei Zhou and Han-Jia Ye},
  booktitle={ICCV},
  year={2025}
}

📢 Updates

[08/2025] Code has been released.

[08/2025] arXiv paper has been released.

[06/2025] Accepted to ICCV 2025.

📝 Introduction

Class-Incremental Learning (CIL) requires a learning system to continually learn new classes without forgetting. Existing pre-trained model-based CIL methods often freeze the pre-trained network and adapt to incremental tasks using additional lightweight modules such as adapters. However, incorrect module selection during inference hurts performance, and task-specific modules often overlook shared general knowledge, leading to errors in distinguishing similar classes across tasks. To address these challenges, we propose integrating Task-Specific and Universal Adapters (TUNA) in this paper. Specifically, we train task-specific adapters to capture the most crucial features relevant to their respective tasks and introduce an entropy-based selection mechanism to choose the most suitable adapter. Furthermore, we leverage an adapter fusion strategy to construct a universal adapter, which encodes the most discriminative features shared across tasks. We combine task-specific and universal adapter predictions to harness both specialized and general knowledge during inference. Extensive experiments on various benchmark datasets demonstrate the state-of-the-art performance of our approach.
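As a minimal sketch (not the repository's actual API), the inference procedure above can be written as follows; the names task_adapters, universal_adapter, and the mixing weight alpha are illustrative assumptions:

import torch
import torch.nn.functional as F

def prediction_entropy(logits):
    # Shannon entropy of the softmax prediction; lower entropy = more confident.
    probs = F.softmax(logits, dim=-1)
    return -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)

@torch.no_grad()
def tuna_inference(x, task_adapters, universal_adapter, alpha=0.5):
    # Logits from every task-specific adapter: [num_adapters, batch, classes].
    task_logits = torch.stack([adapter(x) for adapter in task_adapters])
    # Entropy-based selection: per sample, keep the most confident adapter.
    ent = prediction_entropy(task_logits)                # [num_adapters, batch]
    best = ent.argmin(dim=0)                             # [batch]
    idx = best.view(1, -1, 1).expand_as(task_logits[:1])
    chosen = task_logits.gather(0, idx).squeeze(0)       # [batch, classes]
    # Combine specialized (task-specific) and general (universal) knowledge.
    fused = alpha * chosen + (1 - alpha) * universal_adapter(x)
    return fused.argmax(dim=-1)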

Requirements

🗂️ Environment

  1. torch 2.0.1
  2. torchvision 0.15.2
  3. timm 0.6.12
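Assuming a standard pip setup, these pinned versions can be installed with:

pip install torch==2.0.1 torchvision==0.15.2 timm==0.6.12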

🔎 Dataset

We provide the processed datasets as follows:

  • CIFAR100: will be automatically downloaded by the code.
  • ImageNet-R: Google Drive: link or OneDrive: link
  • ImageNet-A: Google Drive: link or OneDrive: link
  • ObjectNet: OneDrive: link. You can also refer to the filelist if the file is too large to download.

You need to modify the dataset paths in ./utils/data.py to match your local setup.
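For illustration, the change usually amounts to pointing the dataset directories at your local copies; the snippet below is a hypothetical excerpt, and the actual variable names in ./utils/data.py may differ:

# Hypothetical excerpt of ./utils/data.py; real variable names may differ.
train_dir = "/your/path/imagenet-r/train/"  # local ImageNet-R training split
test_dir = "/your/path/imagenet-r/test/"    # local ImageNet-R test split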

These datasets are referenced in the Aper repository.

🔑 Running scripts

Please follow the settings in the exps folder to prepare the JSON config files, and then run:

python main.py --config ./exps/[filename].json
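For reference, a config file under exps might look like the hypothetical sketch below; the exact keys and values are assumptions, so consult the JSON files shipped in exps for the real schema:

{
  "dataset": "cifar100",
  "init_cls": 10,
  "increment": 10,
  "model_name": "tuna",
  "device": ["0"]
}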

👨‍🏫 Acknowledgment

We would like to express our gratitude to the repositories that offered valuable components and functions used in our work.
