Visual and Thermal Data for Pedestrian and Cyclist Detection

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding


Abstract

With the continued advancement of autonomous vehicles and their deployment on public roads, accurate detection of vulnerable road users (VRUs) is vital for ensuring safety. To provide higher levels of safety for these VRUs, an effective detection system should be employed that can correctly identify VRUs across all types of environments (e.g. varied VRU appearance, crowded scenes) and conditions (e.g. fog, rain, night-time). This paper presents optimal methods of sensor fusion for pedestrian and cyclist detection using Deep Neural Networks (DNNs) for higher levels of feature abstraction. Typically, visible-light sensors have been utilized for this purpose. More recently, thermal sensors, or a combination of visual and thermal sensors, have been employed for pedestrian detection together with advanced detection algorithms. DNNs have provided promising results for improving the accuracy of pedestrian and cyclist detection, because they are able to extract features at higher levels of abstraction than typical hand-crafted detectors. Previous studies have shown that, amongst the several sensor fusion techniques that exist, Halfway Fusion has provided the best results in terms of accuracy and robustness. Although sensor fusion and DNN implementations have been used for pedestrian detection, considerably less research has been undertaken for cyclist detection.
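The Halfway Fusion scheme referred to in the abstract combines the two modalities at the feature level: each sensor stream is processed by its own early layers, and the resulting mid-level feature maps are concatenated and passed through shared layers. The following is a minimal sketch of that idea, assuming PyTorch; the class name HalfwayFusionDetector, the channel counts, and the simple whole-image classification head are illustrative assumptions and do not reproduce the architecture evaluated in the paper.

# Minimal sketch of halfway (feature-level) fusion of visible and thermal
# inputs, assuming PyTorch. Layer sizes and the classification head are
# placeholders, not the architecture from the paper.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions followed by 2x spatial downsampling.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.MaxPool2d(2),
    )

class HalfwayFusionDetector(nn.Module):
    def __init__(self, num_classes=2):  # e.g. pedestrian, cyclist
        super().__init__()
        # Separate early feature extractors per modality.
        self.visible_branch = nn.Sequential(conv_block(3, 32), conv_block(32, 64))
        self.thermal_branch = nn.Sequential(conv_block(1, 32), conv_block(32, 64))
        # Fusion happens "halfway": mid-level feature maps are concatenated
        # channel-wise and processed by shared layers.
        self.shared = nn.Sequential(conv_block(128, 128), conv_block(128, 256))
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(256, num_classes)
        )

    def forward(self, visible, thermal):
        v = self.visible_branch(visible)   # (N, 64, H/4, W/4)
        t = self.thermal_branch(thermal)   # (N, 64, H/4, W/4)
        fused = torch.cat([v, t], dim=1)   # channel-wise concatenation
        return self.classifier(self.shared(fused))

# Example forward pass on dummy RGB and single-channel thermal frames.
if __name__ == "__main__":
    model = HalfwayFusionDetector()
    rgb = torch.randn(2, 3, 128, 128)
    ir = torch.randn(2, 1, 128, 128)
    print(model(rgb, ir).shape)  # torch.Size([2, 2])

In a full detector, the shared layers would typically feed a detection head (region proposals and bounding-box regression) rather than the whole-image classifier used here for brevity.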

Original language: English
Title of host publication: Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings
Subtitle of host publication: 20th Annual Conference, TAROS 2019, London, UK, July 3–5, 2019, Proceedings, Part II
Editors: Kaspar Althoefer, Jelizaveta Konstantinova, Ketao Zhang
Publisher: Springer
Pages: 223-234
Number of pages: 12
ISBN (Electronic): 978-3-030-25332-5
ISBN (Print): 978-3-030-25331-8
DOI: https://doi.org/10.1007/978-3-030-25332-5_20
Publication status: Published - 2019
Event: 20th Towards Autonomous Robotic Systems Conference - London, United Kingdom
Duration: 3 Jul 2019 – 5 Jul 2019
Conference number: 20
https://www.qmul.ac.uk/robotics/events/taros2019/

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11650 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 20th Towards Autonomous Robotic Systems Conference
Abbreviated title: TAROS 2019
Country: United Kingdom
City: London
Period: 3/07/19 – 5/07/19
Internet address: https://www.qmul.ac.uk/robotics/events/taros2019/

Keywords

  • Cyclist detection
  • Deep Neural Networks
  • Pedestrian detection
  • Sensor fusion

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)


Cite this

Ahmed, S., Huda, M. N., Rajbhandari, S., Saha, C., Elshaw, M., & Kanarachos, S. (2019). Visual and Thermal Data for Pedestrian and Cyclist Detection. In K. Althoefer, J. Konstantinova, & K. Zhang (Eds.), Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings: 20th Annual Conference, TAROS 2019, London, UK, July 3–5, 2019, Proceedings, Part II (pp. 223-234). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11650 LNAI). Springer. https://doi.org/10.1007/978-3-030-25332-5_20