NU author(s): Dr Lei Shi (ORCiD)
This work is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).
Abstract: Sign language enables effective communication between hearing and deaf-mute people. Despite years of extensive pedagogical research, learning sign language remains a formidable task. Most current systems rely heavily on online learning resources and presume that users will access them regularly, yet this approach can feel monotonous and repetitive. Gamification has recently been proposed as a solution; however, existing research focuses on game design rather than user experience design. In this work, we present a system of user-defined interaction for learning static American Sign Language (ASL). It supports gesture recognition for user experience design, enabling users to learn actively through involvement with user-defined gestures rather than passively absorbing knowledge. Early findings from a questionnaire-based survey show that users are more motivated to learn static ASL through user-defined interactions.
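The abstract does not detail how the system recognises user-defined static gestures. As a purely illustrative sketch, assuming a landmark-based hand tracker that emits 21 (x, y) points per hand (the format produced by common trackers such as MediaPipe Hands), the Python below shows one standard way to support user-defined gestures: nearest-template matching over normalised landmarks. The class and method names here are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch: user-defined static-gesture matching via nearest
# template over normalised hand landmarks. Not the authors' implementation.
import numpy as np

N_LANDMARKS = 21  # assumed: one (x, y) point per hand joint


def normalize(landmarks: np.ndarray) -> np.ndarray:
    """Translate to the wrist point and scale to unit size, so matching is
    invariant to where the hand is and how far it is from the camera."""
    pts = landmarks - landmarks[0]               # wrist-relative coordinates
    scale = np.linalg.norm(pts, axis=1).max()    # largest joint distance
    return pts / (scale + 1e-8)


class UserGestureRecognizer:
    """Matches incoming hand poses against gestures the learner defined."""

    def __init__(self) -> None:
        self.templates: dict[str, np.ndarray] = {}

    def define(self, label: str, landmarks: np.ndarray) -> None:
        """Store one user-defined gesture, e.g. the learner's sign for 'A'."""
        self.templates[label] = normalize(landmarks)

    def recognize(self, landmarks: np.ndarray, threshold: float = 0.5):
        """Return the closest stored gesture, or None if nothing is close."""
        query = normalize(landmarks)
        best_label, best_dist = None, np.inf
        for label, template in self.templates.items():
            dist = np.linalg.norm(query - template)
            if dist < best_dist:
                best_label, best_dist = label, dist
        return best_label if best_dist < threshold else None


# Usage with placeholder data (in practice, landmarks come from a tracker):
recognizer = UserGestureRecognizer()
sample = np.random.rand(N_LANDMARKS, 2)
recognizer.define("A", sample)
print(recognizer.recognize(sample))  # -> "A"
```

Template matching of this kind needs no training data beyond the learner's own recordings, which is what makes user-defined gesture sets practical in a learning tool; the distance threshold would need tuning against real tracker output.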
Author(s): Wang Jindi, Ivrissimtzis Ioannis, Li Zhaoxing, Zhou Yunzhan, Shi Lei
Editor(s): Frasson C; Mylonas P; Troussas C
Publication type: Conference Proceedings (inc. Abstract)
Publication status: Published
Conference Name: Augmented Intelligence and Intelligent Tutoring Systems, 19th International Conference, ITS 2023
Year of Conference: 2023
Pages: 479–490
Print publication date: 16/05/2023
Online publication date: 22/05/2023
Acceptance date: 10/03/2023
Date deposited: 27/05/2023
ISSN: 0302-9743
Publisher: Springer
URL: https://doi.org/10.1007/978-3-031-32883-1_43
DOI: 10.1007/978-3-031-32883-1_43
Series Title: Lecture Notes in Computer Science
ISBN: 9783031328824