
Open Access

Dual Variational Knowledge Attention for Class Incremental Vision Transformer

Lookup NU author(s): Dr Haoran Duan, Dr Varun Ojha, Dr Tejal Shah, Professor Raj Ranjan

Licence

This work is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).


Abstract

Class incremental learning (CIL) strives to emulate the human cognitive process of continuously learning and adapting to new tasks while retaining knowledge from past experiences. Despite significant advancements in this field, Transformer-based models have not fully leveraged the potential of attention mechanisms to balance transferable knowledge between tokens and the associated information. This paper addresses this gap with a dual variational knowledge attention (DVKA) mechanism within a Transformer-based encoder-decoder framework tailored for CIL. The DVKA mechanism manages the information flow through the attention maps, ensuring a balanced representation of all classes and mitigating the risk of information dilution as new classes are incrementally introduced. Leveraging the information bottleneck and mutual information principles, the method selectively filters less relevant information, directing the model's focus towards the most significant details for each class. DVKA comprises two distinct attention mechanisms: one operating at the feature level and the other along the token dimension. The feature-focused attention purifies the complex representations arising from diverse classification tasks, ensuring a comprehensive representation of both old and new tasks. The token-focused attention highlights specific tokens, facilitating local discrimination among disparate patches and fostering global coordination across a spectrum of task tokens. Our work is a major stride towards improving Transformer models for class incremental learning, presenting a theoretical rationale and strong experimental results on three widely used datasets.
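The abstract describes two stochastic attention gates, one over the feature dimension and one over the token dimension, trained under an information-bottleneck penalty. The following is a minimal PyTorch sketch of that idea; the module name, the sigmoid-gate formulation, the layer shapes, and the KL weighting are assumptions inferred from the abstract, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class DualVariationalKnowledgeAttention(nn.Module):
    """Hypothetical sketch of a dual variational attention module.

    Two variational bottleneck branches gate a token sequence: one along
    the feature (channel) dimension, one along the token dimension. All
    names and shapes are assumptions based on the paper's abstract.
    """

    def __init__(self, dim: int):
        super().__init__()
        # Feature-level branch: per-channel mean and log-variance.
        self.feat_mu = nn.Linear(dim, dim)
        self.feat_logvar = nn.Linear(dim, dim)
        # Token-level branch: per-token mean and log-variance.
        self.tok_mu = nn.Linear(dim, 1)
        self.tok_logvar = nn.Linear(dim, 1)

    @staticmethod
    def reparameterize(mu, logvar):
        # Standard reparameterization trick so the gates stay differentiable.
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def forward(self, x):
        # x: (batch, num_tokens, dim)
        # Feature-focused attention: stochastic gate over channels.
        f_mu, f_logvar = self.feat_mu(x), self.feat_logvar(x)
        feat_gate = torch.sigmoid(self.reparameterize(f_mu, f_logvar))
        # Token-focused attention: stochastic gate over tokens.
        t_mu, t_logvar = self.tok_mu(x), self.tok_logvar(x)
        tok_gate = torch.sigmoid(self.reparameterize(t_mu, t_logvar))
        out = x * feat_gate * tok_gate
        # KL terms against a standard normal act as the bottleneck penalty,
        # discouraging the gates from passing irrelevant information.
        kl = -0.5 * (1 + f_logvar - f_mu.pow(2) - f_logvar.exp()).mean() \
             - 0.5 * (1 + t_logvar - t_mu.pow(2) - t_logvar.exp()).mean()
        return out, kl

if __name__ == "__main__":
    dvka = DualVariationalKnowledgeAttention(dim=768)
    tokens = torch.randn(4, 197, 768)   # e.g. a ViT-Base token sequence
    gated, kl = dvka(tokens)
    print(gated.shape, kl.item())       # torch.Size([4, 197, 768])
```

In a CIL training loop, the KL term would be added to the task loss with a small weight, so the gates are encouraged to retain only class-relevant information as new tasks arrive; the exact loss composition used in the paper is not specified in this abstract.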


Publication metadata

Author(s): Duan H, Rui S, Ojha V, Shah T, Huang Z, Ouyang Z, Huang Y, Long Y, Ranjan R

Publication type: Conference Proceedings (inc. Abstract)

Publication status: Published

Conference Name: International Joint Conference on Neural Networks (IJCNN 2024)

Year of Conference: 2024

Pages: 1-8

Online publication date: 09/09/2024

Acceptance date: 15/03/2024

Date deposited: 18/11/2024

ISSN: 2161-4407

Publisher: IEEE

URL: https://doi.org/10.1109/IJCNN60899.2024.10650317

DOI: 10.1109/IJCNN60899.2024.10650317

ePrints DOI: 10.57711/2rs5-0509


ISBN: 9798350359312
