---
language:
- en
license: mit
tags:
- continual-learning
- general-continual-learning
- online-learning
- vision
- image-classification
- vit
- prompt-tuning
library_name: pytorch
pipeline_tag: image-classification
inference: false
---
# FlyGCL Checkpoints (FlyPrompt & ViT Baselines)
This repository provides research checkpoints for FlyGCL, a lightweight framework for General Continual Learning (GCL) / online class-incremental learning in the Si-Blurry setting.
It is designed to be used together with the FlyGCL codebase:

- Code: https://github.com/AnAppleCore/FlyGCL
- Paper (arXiv): https://www.arxiv.org/abs/2602.01976
- OpenReview: https://openreview.net/forum?id=8pi1rP71qv
## What is included
This model repo may contain:

- Backbone checkpoints (ViT-B/16 variants) referenced by FlyGCL via `--backbone`.
- Prompt checkpoints (optional) for DualPrompt/MISA-style prompts: `g_prompt.pt`, `e_prompt.pt`
For the exact filename mapping and where to place these files in FlyGCL, see the code repository README:
https://github.com/AnAppleCore/FlyGCL/blob/main/README.md
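As a quick sanity check after downloading, you can inspect a prompt checkpoint's contents with plain `torch.load`. The sketch below writes a stand-in file first so it is self-contained; the key name and tensor shape are illustrative assumptions, not the actual FlyGCL format, so inspect the real `g_prompt.pt` / `e_prompt.pt` before wiring them into the code.

```python
import torch

# Stand-in prompt checkpoint so this sketch runs on its own; with real
# files, skip this line and point torch.load at g_prompt.pt/e_prompt.pt
# from this repo. Key name and shape here are assumptions.
torch.save({"g_prompt": torch.zeros(2, 5, 768)}, "g_prompt.pt")

# Load on CPU and list the stored tensors before use.
ckpt = torch.load("g_prompt.pt", map_location="cpu")
for name, tensor in ckpt.items():
    print(name, tuple(tensor.shape))
```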
## Model details
- Architecture family: Vision Transformer (ViT-B/16) backbones + prompt-based continual learning heads.
- Framework: PyTorch.
- Training setting: online GCL / Si-Blurry (see paper and code for details).
## Intended use
These checkpoints are released for:
- Research / reproducibility of the FlyGCL paper and baselines
- Benchmarking continual learning methods in comparable settings
Not intended for:
- Safety-critical or medical/diagnostic use
- Deployment without careful evaluation in your target environment
## Limitations and biases
- Continual learning performance depends on data ordering, hyperparameters, and backbone initialization.
- Backbones pretrained on large-scale datasets may encode biases from their pretraining data.
- Prompt checkpoints may not transfer to datasets/settings different from those used during training.
## License
- Code license: MIT (see the FlyGCL `LICENSE` file).
- Checkpoint licensing may depend on upstream sources (e.g., DINO/iBOT/MoCo pretrained backbones). If you redistribute upstream-derived weights here, ensure the redistribution terms are compatible and include required notices.
## Citation
If you use FlyGCL or these checkpoints in your research, please cite:
```bibtex
@inproceedings{flyprompt2026,
  title={FlyPrompt: Brain-Inspired Random-Expanded Routing with Temporal-Ensemble Experts for General Continual Learning},
  author={Yan, Hongwei and Sun, Guanglong and Zhou, Kanglei and Li, Qian and Wang, Liyuan and Zhong, Yi},
  booktitle={ICLR},
  year={2026}
}
```
## Contact
- Maintainer: Hongwei Yan (yanhw22@mails.tsinghua.edu.cn)