Embedded deployment of traffic sign detection and recognition systems

Imane Taouqi, Mohamed Lamane, Abdessamad Klilou, Assia Arsalane, Kebir Chaji

Abstract


Traffic sign (TS) detection and recognition are essential components of advanced driver assistance systems (ADAS), contributing to safer and more reliable driving. However, deploying deep learning–based vision models on embedded platforms is challenging due to constraints on computational power and energy consumption. In this work, a comparative deployment of the you only look once version 7 (YOLOv7) and YOLOv7-tiny deep learning algorithms is conducted on embedded NVIDIA platforms, namely the Jetson Nano and Jetson Xavier NX, to evaluate their suitability for real-time TS detection. Following the detection stage, a convolutional neural network (CNN) is integrated to perform TS recognition, enabling a complete detection–recognition pipeline. Experimental results show that YOLOv7-tiny achieves a higher detection precision of 97% while offering better speed and lower computational cost on resource-constrained devices, reaching 18.8 frames per second (FPS) on the Jetson Nano and 43 FPS on the Jetson Xavier NX. The integrated CNN model ensures reliable classification of detected TSs with an accuracy of 99.54%. This work highlights the trade-offs among precision, speed, and power consumption and provides practical guidance for selecting detection and recognition architectures for embedded ADAS applications.


Keywords


Advanced driver assistance systems; Computer vision; Detection; Embedded systems; You only look once version 7



DOI: https://doi.org/10.11591/eei.v15i2.11215



This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.


Bulletin of Electrical Engineering and Informatics (BEEI)
ISSN: 2089-3191, e-ISSN: 2302-9285
This journal is published by the Institute of Advanced Engineering and Science (IAES) in collaboration with Intelektual Pustaka Media Utama (IPMU).