
Open Access

Convolutional Tsetlin Machine-based Training and Inference Accelerator for 2-D Pattern Classification

Lookup NU author(s): Svein Tunheim, Professor Rishad Shafik, Professor Alex Yakovlev



Licence

This work is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).


Abstract

© 2023 The Authors. The Tsetlin Machine (TM) is a machine learning algorithm based on an ensemble of Tsetlin Automata (TAs) that learns propositional logic expressions from Boolean input features. In this paper, the design and implementation of a Field Programmable Gate Array (FPGA) accelerator based on the Convolutional Tsetlin Machine (CTM) are presented. The accelerator classifies two pattern classes in 4 × 4 Boolean images using a 2 × 2 convolution window. Specifically, there are two separate TMs, one per class. Each TM comprises 40 propositional logic formulas, denoted as clauses, which are conjunctions of literals. Include/exclude actions from the TAs determine which literals are included in each clause. The accelerator supports full training, including random patch selection during convolution based on parallel reservoir sampling across all clauses. The design is implemented on a Xilinx Zynq XC7Z020 FPGA platform. Operating at a clock frequency of 40 MHz, the accelerator achieves a classification rate of 4.4 million images per second with an energy per classification of 0.6 μJ. The mean test accuracy is 99.9% when trained on the 2-dimensional Noisy XOR dataset with 40% noise in the training labels. To achieve this performance, which is on par with the original software implementation, Linear Feedback Shift Register (LFSR) random number generators of at least 16 bits are required. The solution demonstrates the core principles of a CTM and can be scaled to multi-class systems and larger images.
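As a minimal sketch of the two mechanisms the abstract names — a clause as a conjunction of included literals, and random patch selection via reservoir sampling driven by a 16-bit LFSR — the following Python fragment may help. It is illustrative only, not the paper's FPGA design; all function names, the tap constant, and the odd/even vote-polarity convention are assumptions.

```python
# Illustrative sketch only -- not the accelerator's RTL.

def lfsr16(state):
    """One step of a maximal-length 16-bit Galois LFSR (tap mask 0xB400)."""
    lsb = state & 1
    state >>= 1
    if lsb:
        state ^= 0xB400
    return state

def clause_output(features, include_pos, include_neg):
    """Evaluate one clause: an AND over the included literals.
    A literal is either an input bit or its negation; which literals are
    included would be decided by the Tsetlin Automata's include/exclude
    actions (here the include sets are simply given)."""
    if any(features[k] == 0 for k in include_pos):
        return 0
    if any(features[k] == 1 for k in include_neg):
        return 0
    return 1

def class_sum(features, clauses):
    """Sum clause votes for one class's TM; a common convention lets
    odd-indexed clauses vote negatively. With one TM per class, the
    class whose sum is larger wins."""
    total = 0
    for i, (pos, neg) in enumerate(clauses):
        vote = clause_output(features, pos, neg)
        total += vote if i % 2 == 0 else -vote
    return total

def select_patch(n_patches, state):
    """Reservoir-sample one patch index from the convolution's patches:
    patch t replaces the current choice with probability ~1/t, using
    fresh LFSR draws, so the final choice is ~uniform."""
    chosen = 0
    for t in range(1, n_patches + 1):
        state = lfsr16(state)
        if state % t == 0:
            chosen = t - 1
    return chosen, state
```

In the hardware described by the abstract, this sampling runs in parallel for all clauses, so each clause may train on a different randomly selected patch of the image.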


Publication metadata

Author(s): Tunheim SA, Jiao L, Shafik R, Yakovlev A, Granmo O-C

Publication type: Article

Publication status: Published

Journal: Microprocessors and Microsystems

Year: 2023

Volume: 103

Print publication date: 01/11/2023

Online publication date: 07/10/2023

Acceptance date: 04/10/2023

Date deposited: 26/10/2023

ISSN (print): 0141-9331

ISSN (electronic): 1872-9436

Publisher: Elsevier BV

URL: https://doi.org/10.1016/j.micpro.2023.104949

DOI: 10.1016/j.micpro.2023.104949

Data Access Statement: Data will be made available on request.

