NU author(s): Dr Rebeen Hamad, Dr Wai Lok Woo, Dr Bo Wei
This work is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).
Deep learning models have demonstrated significant efficacy across various domains, including healthcare, security, ubiquitous computing, and pervasive intelligence. Long short-term memory and convolutional neural networks, in particular, have revolutionized human activity recognition systems by effectively extracting semantic information directly from raw sensor data. However, the acquisition of large, well-annotated human activity datasets remains a considerable challenge due to privacy concerns, as well as the time and cost associated with labeling. To address this, unsupervised semantic representation learning, which bypasses the need for manual annotations, is crucial for utilizing the vast pool of unlabeled sensor data. In this study, we present a novel self-supervised learning network that extracts meaningful representations from sensor data without relying on predefined semantic labels. To address class imbalances, we enhance the synthetic minority oversampling technique, introducing an improved version called iSMOTE, which accurately labels synthetic samples. Additionally, we propose a random masking technique that removes identity mapping from input data, enabling the construction of generic semantic representations for downstream tasks. By utilizing balanced unlabeled data, our network significantly improves human activity recognition while reducing reliance on annotated data. We conducted extensive evaluations and comparisons of our proposed network against various supervised and self-supervised models using datasets from smart environments and wearable sensors. Remarkably, our network outperforms several existing methods across 12 public datasets. For instance, with just 25% of labeled data, our self-supervised network achieved an accuracy of 93.87% on the Ordonez Home A dataset, surpassing the performance of fully supervised methods such as DeepConvLSTM + Attention (84.97%), HAR + Attention (88.55%), and DCC + MSA (90.78%).
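The random-masking idea the abstract mentions (hiding parts of the input so the encoder cannot learn a trivial identity mapping) can be illustrated with a minimal sketch. This is not the paper's implementation; the function name, the per-timestep masking granularity, and the 25% mask ratio are all assumptions for illustration only.

```python
import numpy as np

def random_mask(windows, mask_ratio=0.25, rng=None):
    """Zero out a random fraction of timesteps in each sensor window.

    Hypothetical helper, not the authors' code: hiding timesteps forces a
    reconstruction-style encoder to infer context rather than copy its input.

    windows: array of shape (batch, timesteps, channels)
    Returns the masked copy and the boolean mask that was applied.
    """
    rng = np.random.default_rng(rng)
    batch, timesteps, _ = windows.shape
    # True marks timesteps hidden from the encoder
    mask = rng.random((batch, timesteps)) < mask_ratio
    masked = windows.copy()
    masked[mask] = 0.0  # masked timesteps are zeroed across all channels
    return masked, mask

# Two windows of 100 timesteps with 3 sensor channels each
x = np.ones((2, 100, 3))
x_masked, m = random_mask(x, mask_ratio=0.25, rng=0)
```

A self-supervised model would then be trained to reconstruct the original `windows` from `x_masked`, and the learned encoder reused for the downstream activity-recognition task.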
Author(s): Hamad Rebeen, Woo Wai, Wei Bo, Yang Longzhi
Publication type: Article
Publication status: Published
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence
Year: 2025
Pages: epub ahead of print
Online publication date: 27/01/2025
Acceptance date: 04/11/2024
Date deposited: 07/02/2025
ISSN (print): 2471-285X
ISSN (electronic): 2471-285X
Publisher: IEEE
URL: https://doi.org/10.1109/TETCI.2025.3526504
DOI: 10.1109/TETCI.2025.3526504
ePrints DOI: 10.57711/mc48-xw31