
[OPEN teaching project] Transfer learning code for understanding and teaching: boosting for transfer learning with single or multiple sources


TrAdaBoost: Boosting for Transfer Learning

🤝🤝🤝 Please star ⭐️ this project to support open-source research and development 🌍! Thank you!

This is a teaching and research-oriented project that implements transfer learning using boosting strategies, developed during my stay at Zhejiang Lab (March 1 – August 31, 2023). If you have any questions or need assistance, feel free to reach out!


🔬 Overview

Transfer learning aims to leverage knowledge from one or more source domains to improve performance on a target domain with limited data. This project focuses on instance-based methods, particularly variants of the TrAdaBoost algorithm for both classification and regression tasks.
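As a concrete illustration of the instance-based idea, single-source TrAdaBoost (Dai et al., 2007) down-weights source instances the current weak learner misclassifies and up-weights misclassified target instances, so later rounds focus on the target domain. The sketch below is a minimal teaching version using scikit-learn decision stumps, not this repository's implementation; `tradaboost_fit` and `tradaboost_predict` are illustrative names.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def tradaboost_fit(X_src, y_src, X_tgt, y_tgt, n_iter=10):
    """Minimal single-source TrAdaBoost sketch; labels must be in {0, 1}."""
    n, m = len(X_src), len(X_tgt)
    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([y_src, y_tgt])
    w = np.ones(n + m) / (n + m)                    # uniform initial weights
    beta_src = 1.0 / (1.0 + np.sqrt(2.0 * np.log(n) / n_iter))
    learners, betas = [], []
    for _ in range(n_iter):
        p = w / w.sum()
        h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=p)
        err = np.abs(h.predict(X) - y)              # 0/1 error per instance
        # weighted error measured on the target portion only
        eps = np.sum(p[n:] * err[n:]) / p[n:].sum()
        eps = min(max(eps, 1e-10), 0.49)            # keep beta_t well defined
        beta_t = eps / (1.0 - eps)
        # misclassified source instances are down-weighted,
        # misclassified target instances are up-weighted
        w[:n] *= beta_src ** err[:n]
        w[n:] *= beta_t ** (-err[n:])
        learners.append(h)
        betas.append(beta_t)
    return learners, betas

def tradaboost_predict(X, learners, betas):
    """Weighted vote over the second half of the rounds, as in the paper."""
    half = len(learners) // 2
    score = np.zeros(len(X))
    for h, b in zip(learners[half:], betas[half:]):
        score += -np.log(b) * h.predict(X)
    thresh = 0.5 * sum(-np.log(b) for b in betas[half:])
    return (score >= thresh).astype(int)
```

Only the later rounds vote at prediction time because early rounds are still dominated by source-domain weights.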



📦 Models Included

🔹 Classification

  • TrAdaBoost: single-source boosting for classification (Dai et al., 2007)

  • MultiSourceTrAdaBoost: boosting with multiple source domains (Yao & Doretto, 2010)

🔸 Regression

  • TrAdaBoost.R2 and two-stage TrAdaBoost.R2: boosting for regression transfer (Pardoe & Stone, 2010)

Implemented in Python, supporting Windows, Linux, and macOS platforms.


📚 Tutorial




📌 Chinese-language introduction (continuously updated)


📎 Citation

If you use this code in your research, please cite:

Cao Bin, Zhang Tong-yi, Xiong Jie, Zhang Qian, Sun Sheng. Package of Boosting-based transfer learning [2023SR0525555], 2023, Software Copyright. GitHub: github.com/Bin-Cao/TrAdaboost


🔧 Package Info

author_email='[email protected]'
maintainer='CaoBin'
maintainer_email='[email protected]'
license='MIT License'
url='https://github.com/Bin-Cao/TrAdaboost'
python_requires='>=3.7'

📚 References

  1. Dai, W., Yang, Q., et al. (2007). Boosting for Transfer Learning. ICML.
  2. Yao, Y., & Doretto, G. (2010). Boosting for Transfer Learning with Multiple Sources. CVPR.
  3. Rettinger, A., et al. (2006). Boosting Expert Ensembles for Rapid Concept Recall. AAAI.
  4. Pardoe, D., & Stone, P. (2010). Boosting for Regression Transfer. ICML.

💡 Related Transfer Learning Methods

1️⃣ Instance-based Transfer Learning

  • Instance Selection (same marginal, different conditional distributions): TrAdaBoost

  • Instance Re-weighting (same conditional, different marginal distributions): KMM
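KMM (Kernel Mean Matching) chooses per-instance source weights so that the weighted source mean in a kernel feature space matches the target mean. The full method (Huang et al.) solves a constrained quadratic program; the closed-form sketch below drops the box and sum constraints for brevity, and `kmm_weights` is an illustrative name, not part of this repository.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    # pairwise squared distances via ||a - b||^2 = a.a + b.b - 2 a.b
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def kmm_weights(X_src, X_tgt, gamma=0.5, ridge=1e-3):
    """Unconstrained KMM sketch: minimize the distance between the
    weighted source mean embedding and the target mean embedding."""
    n_s, n_t = len(X_src), len(X_tgt)
    K = rbf_kernel(X_src, X_src, gamma)                  # n_s x n_s Gram matrix
    kappa = (n_s / n_t) * rbf_kernel(X_src, X_tgt, gamma).sum(axis=1)
    beta = np.linalg.solve(K + ridge * np.eye(n_s), kappa)
    return np.clip(beta, 0.0, None)                      # weights must be >= 0
```

When source and target samples come from the same distribution, the recovered weights are close to 1, i.e. no re-weighting is needed.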

2️⃣ Feature-based Transfer Learning

  • Explicit Distance-based

  • Implicit Distance-based

    • DANN
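Explicit distance-based methods align domains by minimizing a distribution distance computed directly on features; a common choice (an example, not named in this repository) is the Maximum Mean Discrepancy (MMD). A minimal biased estimate of squared MMD in NumPy, with `mmd2` as an illustrative name:

```python
import numpy as np

def _rbf(A, B, gamma=0.5):
    # pairwise squared distances via ||a - b||^2 = a.a + b.b - 2 a.b
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def mmd2(X, Y, gamma=0.5):
    """Biased estimate of squared Maximum Mean Discrepancy between two
    samples: zero when the two empirical distributions coincide."""
    return (_rbf(X, X, gamma).mean() + _rbf(Y, Y, gamma).mean()
            - 2.0 * _rbf(X, Y, gamma).mean())
```

A feature-based method would then learn a representation that drives this quantity toward zero across the two domains.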

3️⃣ Parameter-based Transfer Learning

  • Pretraining + Fine-tuning
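The pretraining + fine-tuning pattern can be sketched even without a deep network: fit a model on the abundant source domain, then continue optimizing its parameters on the small target set instead of training from scratch. The linear-regression sketch below is only an illustration of the idea; `pretrain_finetune` is a hypothetical name.

```python
import numpy as np

def pretrain_finetune(X_src, y_src, X_tgt, y_tgt, lr=0.1, steps=50):
    """Parameter-based transfer sketch: pretrain a linear model on the
    source domain, then fine-tune its weights on the target domain."""
    # pretraining: closed-form least squares on the source domain
    Xs = np.hstack([X_src, np.ones((len(X_src), 1))])   # append bias column
    w, *_ = np.linalg.lstsq(Xs, y_src, rcond=None)
    # fine-tuning: a few gradient steps on the target loss, warm-started at w
    Xt = np.hstack([X_tgt, np.ones((len(X_tgt), 1))])
    for _ in range(steps):
        grad = 2.0 * Xt.T @ (Xt @ w - y_tgt) / len(Xt)
        w -= lr * grad
    return w
```

The warm start matters when the target set is too small to estimate the parameters well on its own.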

For any inquiries or assistance, feel free to contact Mr. CAO Bin at:
📧 Email: [email protected]

Cao Bin is a PhD candidate at the Hong Kong University of Science and Technology (Guangzhou), under the supervision of Professor Zhang Tong-Yi. His research focuses on AI for science, especially intelligent crystal-structure analysis and discovery. Learn more about his work on his homepage.
