Posted by aub_admin on June 20, 2023

Deep Learning with OBH for Real-Time Rotation-Invariant Signs Detection

Shamim Akhter*

Ahsanullah University of Science & Technology (AUST), Dhaka, Bangladesh.
Email: shamimakhter@gmail.com

Shah Jafor Sadeek Quaderi

Asian University of Bangladesh, Dhaka, Bangladesh.
Email: sjsquaderi11@gmail.com

Saleh Ud-Din Ahmad

AISIP laboratory, Dhaka, Bangladesh.
Email: saleh.ahmed@brotecs.com

Abstract

Numerous studies are being undertaken to provide answers for sign language recognition and classification. Deep learning-based models have higher accuracy (90%-98%); however, they require more runtime memory and processing, in terms of both computational power and execution time (1 hour 20 minutes), for feature extraction and training images. Besides, deep learning models are not entirely insensitive to translation, rotation, and scaling unless the training data includes rotated, translated, or scaled signs. However, Orientation-Based Hashcode (OBH) completes gesture recognition in a significantly shorter length of time (5 minutes) and with reasonable accuracy (80%-85%). In addition, OBH is not affected by translation, rotation, scaling, or occlusion. As a result, a new intermediary model is developed to detect sign language and perform classification with a reasonable processing time (6 minutes), like OBH, while providing attractive accuracy (90%-96%) and invariance qualities. This paper presents a coupled and completely networked autonomous system comprised of OBH and Gabor features with machine learning models. The proposed model is evaluated with 576 sign alphabet images (RGB and depth) from 24 distinct categories, and the results are compared to those obtained using traditional machine learning methodologies. The proposed methodology is 95.8% accurate against a randomly selected test dataset and 93.85% accurate after 9-fold validation.
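The abstract couples OBH with Gabor features feeding machine learning models. As a rough illustration of the Gabor side only, the NumPy-only sketch below builds oriented Gabor kernels and summarizes their filter responses into a fixed-length feature vector. The kernel size, sigma, wavelength, and eight orientations are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel at orientation theta (illustrative parameters)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates by theta, then modulate a Gaussian with a cosine carrier.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / lam + psi)
    return envelope * carrier

def gabor_features(img, n_orientations=8):
    """Mean/std of Gabor responses at several orientations -> fixed-length vector."""
    feats = []
    for k in range(n_orientations):
        theta = k * np.pi / n_orientations
        kern = gabor_kernel(15, sigma=3.0, theta=theta, lam=8.0)
        # FFT-based circular convolution: a quick proxy for the filter response.
        resp = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(kern, img.shape)))
        feats.extend([resp.mean(), resp.std()])
    return np.array(feats)
```

With eight orientations this yields a 16-dimensional descriptor per image, which could then be concatenated with OBH codes before classification.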

CCS CONCEPTS: Computing Methodologies, Machine learning, Neural Networks

KEYWORDS: Deep Learning, Orientation Based Hashcode, Gabor Filter, Sequential Neural Network

(ICSCA 2023), February 23–25, 2023, Kuantan, Malaysia. ACM, New York, NY, USA.
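The 93.85% figure above comes from 9-fold validation over the 576-image dataset. A minimal sketch of such fold-index splitting, assuming a simple random permutation (the paper's exact validation protocol is not described here):

```python
import numpy as np

def kfold_indices(n_samples, k=9, seed=0):
    """Yield (train, test) index arrays for k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, k)  # k nearly equal-sized folds
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test
```

For 576 samples and k=9, each test fold holds exactly 64 images, and every image appears in a test fold exactly once across the nine runs.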