Visual and Thermal Data for Pedestrian and Cyclist Detection

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding

Abstract

With the continued advancement of autonomous vehicles and their deployment on public roads, accurate detection of vulnerable road users (VRUs) is vital for ensuring safety. To provide higher levels of safety for these VRUs, an effective detection system should be employed that can correctly identify VRUs across all types of environments (e.g. varied VRU appearance, crowded scenes) and conditions (e.g. fog, rain, night-time). This paper presents optimal methods of sensor fusion for pedestrian and cyclist detection using Deep Neural Networks (DNNs) for higher levels of feature abstraction. Typically, visible-light sensors have been utilized for this purpose. Recently, thermal sensor systems, or combinations of visual and thermal sensors, have been employed for pedestrian detection together with advanced detection algorithms. DNNs have provided promising results for improving the accuracy of pedestrian and cyclist detection, because they are able to extract features at higher levels of abstraction than typical hand-crafted detectors. Previous studies have shown that, amongst the several sensor fusion techniques that exist, Halfway Fusion provides the best results in terms of accuracy and robustness. Although sensor fusion and DNNs have been applied to pedestrian detection, considerably less research has been undertaken on cyclist detection.
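The Halfway Fusion scheme favoured in the abstract combines the two modalities at an intermediate feature level, rather than at the raw-input ("early") or detection-score ("late") stage. The sketch below illustrates that idea only; the layer sizes, random stand-in weights, and function names are hypothetical and do not reflect the authors' actual network architecture.

```python
import numpy as np

# Halfway Fusion sketch: each modality has its own early layers; their
# intermediate features are concatenated and passed through shared later
# layers. All sizes and weights here are hypothetical stand-ins.

def extract_features(x, w):
    """Stand-in for one modality's early convolutional stream."""
    return np.maximum(0.0, x @ w)  # linear projection + ReLU

rng = np.random.default_rng(0)
visual = rng.standard_normal((1, 64))    # flattened visual patch
thermal = rng.standard_normal((1, 64))   # flattened thermal patch

w_vis = rng.standard_normal((64, 32)) * 0.1
w_thr = rng.standard_normal((64, 32)) * 0.1

# Fuse at the intermediate feature level -- not input-level "early" fusion,
# not score-level "late" fusion.
f_vis = extract_features(visual, w_vis)
f_thr = extract_features(thermal, w_thr)
fused = np.concatenate([f_vis, f_thr], axis=1)   # shape (1, 64)

# A shared post-fusion head produces the detection score.
w_head = rng.standard_normal((64, 1)) * 0.1
score = 1.0 / (1.0 + np.exp(-(fused @ w_head)))  # sigmoid VRU score
print(fused.shape, score[0, 0])
```

The design choice being illustrated: because fusion happens after each stream has extracted modality-specific features but before the classification head, the shared layers can learn cross-modal interactions that input-level concatenation or score averaging cannot.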

Original language: English
Title of host publication: Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings
Subtitle of host publication: 20th Annual Conference, TAROS 2019, London, UK, July 3–5, 2019, Proceedings, Part II
Editors: Kaspar Althoefer, Jelizaveta Konstantinova, Ketao Zhang
Publisher: Springer
Pages: 223-234
Number of pages: 12
ISBN (Electronic): 978-3-030-25332-5
ISBN (Print): 978-3-030-25331-8
DOI: 10.1007/978-3-030-25332-5_20
Publication status: Published - 2019
Event: 20th Towards Autonomous Robotic Systems Conference - London, United Kingdom
Duration: 3 Jul 2019 – 5 Jul 2019
Conference number: 20
https://www.qmul.ac.uk/robotics/events/taros2019/

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11650 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 20th Towards Autonomous Robotic Systems Conference
Abbreviated title: TAROS 2019
Country: United Kingdom
City: London
Period: 3/07/19 – 5/07/19
Internet address: https://www.qmul.ac.uk/robotics/events/taros2019/

Keywords

  • Cyclist detection
  • Deep Neural Networks
  • Pedestrian detection
  • Sensor fusion

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)

Cite this

Ahmed, S., Huda, M. N., Rajbhandari, S., Saha, C., Elshaw, M., & Kanarachos, S. (2019). Visual and Thermal Data for Pedestrian and Cyclist Detection. In K. Althoefer, J. Konstantinova, & K. Zhang (Eds.), Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings: 20th Annual Conference, TAROS 2019, London, UK, July 3–5, 2019, Proceedings, Part II (pp. 223-234). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11650 LNAI). Springer. https://doi.org/10.1007/978-3-030-25332-5_20

Visual and Thermal Data for Pedestrian and Cyclist Detection. / Ahmed, Sarfraz; Huda, M. Nazmul; Rajbhandari, Sujan; Saha, Chitta; Elshaw, Mark; Kanarachos, Stratis.

Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings: 20th Annual Conference, TAROS 2019, London, UK, July 3–5, 2019, Proceedings, Part II. ed. / Kaspar Althoefer; Jelizaveta Konstantinova; Ketao Zhang. Springer, 2019. p. 223-234 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11650 LNAI).

Ahmed, S, Huda, MN, Rajbhandari, S, Saha, C, Elshaw, M & Kanarachos, S 2019, Visual and Thermal Data for Pedestrian and Cyclist Detection. in K Althoefer, J Konstantinova & K Zhang (eds), Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings: 20th Annual Conference, TAROS 2019, London, UK, July 3–5, 2019, Proceedings, Part II. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 11650 LNAI, Springer, pp. 223-234, 20th Towards Autonomous Robotic Systems Conference, London, United Kingdom, 3/07/19. https://doi.org/10.1007/978-3-030-25332-5_20
Ahmed S, Huda MN, Rajbhandari S, Saha C, Elshaw M, Kanarachos S. Visual and Thermal Data for Pedestrian and Cyclist Detection. In Althoefer K, Konstantinova J, Zhang K, editors, Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings: 20th Annual Conference, TAROS 2019, London, UK, July 3–5, 2019, Proceedings, Part II. Springer. 2019. p. 223-234. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). https://doi.org/10.1007/978-3-030-25332-5_20
Ahmed, Sarfraz ; Huda, M. Nazmul ; Rajbhandari, Sujan ; Saha, Chitta ; Elshaw, Mark ; Kanarachos, Stratis. / Visual and Thermal Data for Pedestrian and Cyclist Detection. Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings: 20th Annual Conference, TAROS 2019, London, UK, July 3–5, 2019, Proceedings, Part II. editor / Kaspar Althoefer ; Jelizaveta Konstantinova ; Ketao Zhang. Springer, 2019. pp. 223-234 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)).
@inproceedings{2894300cf5f44f7497f9e0f78ac2414c,
title = "Visual and Thermal Data for Pedestrian and Cyclist Detection",
abstract = "With the continued advancement of autonomous vehicles and their implementation in public roads, accurate detection of vulnerable road users (VRUs) is vital for ensuring safety. To provide higher levels of safety for these VRUs, an effective detection system should be employed that can correctly identify VRUs in all types of environments (e.g. VRU appearance, crowded scenes) and conditions (e.g. fog, rain, night-time). This paper presents optimal methods of sensor fusion for pedestrian and cyclist detection using Deep Neural Networks (DNNs) for higher levels of feature abstraction. Typically, visible sensors have been utilized for this purpose. Recently, thermal sensors system or combination of visual and thermal sensors have been employed for pedestrian detection with advanced detection algorithm. DNNs have provided promising results for improving the accuracy of pedestrian and cyclist detection. This is because they are able to extract features at higher levels than typical hand-crafted detectors. Previous studies have shown that amongst the several sensor fusion techniques that exist, Halfway Fusion has provided the best results in terms of accuracy and robustness. Although sensor fusion and DNN implementation have been used for pedestrian detection, there is considerably less research undertaken for cyclist detection.",
keywords = "Cyclist detection, Deep Neural Networks, Pedestrian detection, Sensor fusion",
author = "Sarfraz Ahmed and Huda, {M. Nazmul} and Sujan Rajbhandari and Chitta Saha and Mark Elshaw and Stratis Kanarachos",
year = "2019",
doi = "10.1007/978-3-030-25332-5_20",
language = "English",
isbn = "978-3-030-25331-8",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer",
pages = "223--234",
editor = "Kaspar Althoefer and Jelizaveta Konstantinova and Ketao Zhang",
booktitle = "Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings",
address = "United Kingdom",
}

TY - GEN

T1 - Visual and Thermal Data for Pedestrian and Cyclist Detection

AU - Ahmed, Sarfraz

AU - Huda, M. Nazmul

AU - Rajbhandari, Sujan

AU - Saha, Chitta

AU - Elshaw, Mark

AU - Kanarachos, Stratis

PY - 2019

Y1 - 2019

N2 - With the continued advancement of autonomous vehicles and their implementation in public roads, accurate detection of vulnerable road users (VRUs) is vital for ensuring safety. To provide higher levels of safety for these VRUs, an effective detection system should be employed that can correctly identify VRUs in all types of environments (e.g. VRU appearance, crowded scenes) and conditions (e.g. fog, rain, night-time). This paper presents optimal methods of sensor fusion for pedestrian and cyclist detection using Deep Neural Networks (DNNs) for higher levels of feature abstraction. Typically, visible sensors have been utilized for this purpose. Recently, thermal sensors system or combination of visual and thermal sensors have been employed for pedestrian detection with advanced detection algorithm. DNNs have provided promising results for improving the accuracy of pedestrian and cyclist detection. This is because they are able to extract features at higher levels than typical hand-crafted detectors. Previous studies have shown that amongst the several sensor fusion techniques that exist, Halfway Fusion has provided the best results in terms of accuracy and robustness. Although sensor fusion and DNN implementation have been used for pedestrian detection, there is considerably less research undertaken for cyclist detection.

AB - With the continued advancement of autonomous vehicles and their implementation in public roads, accurate detection of vulnerable road users (VRUs) is vital for ensuring safety. To provide higher levels of safety for these VRUs, an effective detection system should be employed that can correctly identify VRUs in all types of environments (e.g. VRU appearance, crowded scenes) and conditions (e.g. fog, rain, night-time). This paper presents optimal methods of sensor fusion for pedestrian and cyclist detection using Deep Neural Networks (DNNs) for higher levels of feature abstraction. Typically, visible sensors have been utilized for this purpose. Recently, thermal sensors system or combination of visual and thermal sensors have been employed for pedestrian detection with advanced detection algorithm. DNNs have provided promising results for improving the accuracy of pedestrian and cyclist detection. This is because they are able to extract features at higher levels than typical hand-crafted detectors. Previous studies have shown that amongst the several sensor fusion techniques that exist, Halfway Fusion has provided the best results in terms of accuracy and robustness. Although sensor fusion and DNN implementation have been used for pedestrian detection, there is considerably less research undertaken for cyclist detection.

KW - Cyclist detection

KW - Deep Neural Networks

KW - Pedestrian detection

KW - Sensor fusion

UR - http://www.scopus.com/inward/record.url?scp=85073914676&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-25332-5_20

DO - 10.1007/978-3-030-25332-5_20

M3 - Conference proceeding

SN - 978-3-030-25331-8

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 223

EP - 234

BT - Towards Autonomous Robotic Systems - 20th Annual Conference, TAROS 2019, Proceedings

A2 - Althoefer, Kaspar

A2 - Konstantinova, Jelizaveta

A2 - Zhang, Ketao

PB - Springer

ER -