ISSN 0134-2452 · EISSN 2412-6179
Languages: ru · en

Article: Comparative analysis of neural network models performance on low-power devices for a real-time object detection task (2024)

Computer vision based real-time object detection on low-power devices is an economically attractive yet technically challenging task. The paper presents benchmark results for popular deep neural network models that are often used for this task. The experimental results provide insights into the trade-offs between accuracy, speed and computational efficiency of the MobileNetV2 SSD, CenterNet MobileNetV2 FPN, EfficientDet, YoloV5, YoloV7, YoloV7 Tiny and YoloV8 neural network models on the Raspberry Pi 4B, Raspberry Pi 3B and NVIDIA Jetson Nano with TensorFlow Lite. We fine-tuned the models on our custom dataset prior to benchmarking and used post-training quantization (PTQ) and quantization-aware training (QAT) to optimize the models' size and speed. The experiments demonstrated that an appropriate algorithm selection depends on the task requirements. We recommend quantized EfficientDet Lite 512×512 or YoloV7 Tiny for tasks that require around 2 FPS, quantized EfficientDet Lite 320×320 or SSD MobileNet V2 320×320 for tasks that require over 10 FPS, and EfficientDet Lite 320×320 or YoloV5 320×320 with QAT for tasks with intermediate FPS requirements.
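As a brief illustration of the post-training quantization step mentioned in the abstract, the sketch below converts a detector to an int8 TensorFlow Lite model with the standard TFLite converter. The saved-model path, the 320×320 input size and the random representative-data generator are illustrative assumptions, not the authors' actual pipeline.

# Minimal sketch of TensorFlow Lite post-training quantization (PTQ).
# The model path and calibration data below are placeholders.
import tensorflow as tf

def representative_data_gen():
    # Yield a small number of calibration samples shaped like the
    # detector's input (here assumed 1x320x320x3, float32 in [0, 1]).
    for _ in range(100):
        yield [tf.random.uniform((1, 320, 320, 3), dtype=tf.float32)]

converter = tf.lite.TFLiteConverter.from_saved_model("detector_saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
# Full integer quantization; inputs and outputs become int8 as well.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("detector_int8.tflite", "wb") as f:
    f.write(tflite_model)

In practice the representative dataset would be drawn from the custom training data rather than random tensors, since calibration quality directly affects the accuracy of the quantized detector.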

Keywords: computer vision, image analysis, object detection, deep learning, benchmarking, optimization techniques, edge devices.
Author(s): Zagitov Artur Rausovich, Chebotareva Elvira Valerievna, Toschev Alexander, Magid Evgeni Arkadievich
Journal: Computer Optics (Компьютерная оптика)

Identifiers and classifiers

UDC
535. Optics
GRNTI
28.23.15. Pattern recognition. Image processing
DOI
10.18287/2412-6179-CO-1343
For citation:
Zagitov A.R., Chebotareva E.V., Toschev A., Magid E.A. Comparative analysis of neural network models performance on low-power devices for a real-time object detection task // Computer Optics (Компьютерная оптика). 2024. Vol. 48, No. 2 (March-April).