Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References
Deep Radar Perception for
Autonomous Driving
Datasets, Methods, and Challenges
Yi Zhou
zhouyi1023@tju.edu.cn
Institute of Deep Perception Technology
Jiangsu Industrial Technology Research Institute
June 8, 2022
Yi Zhou JITRI Deep Radar Perception June 8, 2022 1 / 41
Table of Contents
1 Introduction
2 Radar Datasets
3 Low Level Tasks
4 Object Detection
5 Object Tracking
6 Challenges
Introduction
Citation
This presentation is adapted from the review article "Towards
Deep Radar Perception for Autonomous Driving: Datasets,
Methods, and Challenges".
For more details, please see the original article.
https://www.mdpi.com/1424-8220/22/11/4208
In addition, I maintain a GitHub repository with further updates.
https://github.com/ZHOUYI1023/awesome-radar-perception
Industrial Trends
4D Radar, possibly with imaging capability
12Tx-16Rx radar (4-chip cascade and FPGA)
Arbe 48Tx-48Rx radar (RF chipset and radar processing unit (RPU))
CAN (500 kb/s) -> 100BASE-T Ethernet (11.75 MB/s)
Radar vs Lidar
Parameter | Radar | Lidar
Wavelength | 3.8 mm | 905 nm and 1550 nm
Directional element | Antenna | Optics
Illumination approach | Wide beam -> entire object | Narrow beam -> portion of the object
Transmitted signal | Diffusive propagation | Direct propagation
Two-way attenuation | 1/R^4 | 1/R^2
Major propagation effects | Multipath | Absorption, diffusion, speckle
Interaction with objects | Specular: isolated detections | Diffusive: "image-like"
Received echo | Superposition of multiple reflectors | From a single reflector
Output | Radar cube -> point cloud | Point cloud
Classification information | Statistics of point distribution, RCS, velocity | Object shape
Figure is from [1].
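The two-way attenuation row can be made concrete with a quick back-of-the-envelope calculation (the ranges and the function name are illustrative, not from [1]):

```python
import math

def received_power_ratio(r1, r2, exponent):
    # Relative received power when the range changes from r1 to r2,
    # for two-way attenuation proportional to 1/R^exponent.
    return (r1 / r2) ** exponent

# Doubling the range from 50 m to 100 m (illustrative numbers):
radar_db = 10 * math.log10(received_power_ratio(50, 100, 4))  # radar: 1/R^4
lidar_db = 10 * math.log10(received_power_ratio(50, 100, 2))  # lidar: 1/R^2
```

Doubling the range costs about 12 dB for radar but only about 6 dB for lidar, which is why radar link budgets are so sensitive to range.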
Research Trends
Radar perception has drawn increasing attention from researchers in
computer vision, robotics, and intelligent vehicles.
Big Picture
Radar can be used in many applications:
Military applications, remote sensing
Traffic surveillance and security: concealed object detection
Industry monitoring
Automotive: object detection and robust Odometry/ SLAM
Gait/gesture recognition -> healthcare, HCI
Vital sign monitoring -> healthcare, sleep monitoring, in-cabin
monitoring
For autonomous driving, learning-based methods can be used for:
Radar depth estimation
Radar velocity estimation
Radar object detection
Sensor fusion for autonomous driving
Radar Datasets
Radar Datasets
Many radar datasets have been released in the past three years.
Radar Datasets (Cont.)
Radar Datasets for Autonomous Driving
nuScenes, DENSE and Pixset are designed for sensor fusion, but do not particularly address the role of radar.
RadarScenes provides point-wise annotations for radar point clouds, but has no other modalities.
Pointillism uses 2 radars with overlapping views.
Zendar uses SAR for vehicle detection.
RADIATE uses a spinning radar and addresses adverse weather effects.
Radar Datasets (Cont.)
4D Radar Datasets
Astyx is small
VoD focuses on VRU classification
RADIal’s annotation is coarse but provides raw data
TJ4D features long-range detection
Pre-CFAR Datasets
CARRADA is too simple
CRUW uses RA maps
RADDet provides annotations for RAD tensor
RADICaL provides raw ADC data and signal processing toolboxes
GhentVRU is designed for VRU detection
Radar Datasets (Cont.)
Odometry and Localization Datasets
Oxford Radar RobotCar, RADIATE, MulRan and Boreas all use spinning radar
MulRan, Boreas and EU Long-term are for place recognition and long-term SLAM
Endeavour Radar Dataset uses 5 Conti ARS 430 radars for odometry
ColoRadar uses TI AWR2243 Cascade + AWR1843 with
overlapped FoV for odometry
USVInland is for SLAM in inland waterways and water
segmentation
Radar Datasets (Cont.)
Radar Datasets for Specific Tasks
HawkEye is a SAR dataset for static vehicle classification
PREVENTION is a dataset for trajectory prediction
SCORP is for open space segmentation
Ghost is for mirrored ghost detection
DopNet is for gesture classification
Radar signatures of human activities and Solinteraction Data use Google Soli
FloW dataset is for floating waste detection
Radar Calibration
Calibration targets need to be observed simultaneously by the different modalities
Vertical misalignment occurs if the elevation resolution is insufficient
Radar Labelling
Cross-modality labelling
Project radar points onto visual images
Use Lidar bounding boxes as ground truth
Labelling quality is not guaranteed
Occlusion problems; the FoVs are not necessarily shared
Bounding boxes often contain data points which appear to be part of an object but are actually caused by a different reflector, e.g. ground or multi-path detections
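The projection step in cross-modality labelling can be sketched with a standard pinhole model. The intrinsics K and the radar-to-camera transform below are hypothetical placeholders:

```python
import numpy as np

# Hypothetical camera intrinsics for illustration.
K = np.array([[800., 0., 320.],
              [0., 800., 240.],
              [0., 0., 1.]])

def project_radar_to_image(points_radar, T_cam_radar, K):
    # Project Nx3 radar points (radar frame) into pixel coordinates.
    # T_cam_radar is a 4x4 homogeneous transform from radar to camera frame.
    n = points_radar.shape[0]
    pts_h = np.hstack([points_radar, np.ones((n, 1))])   # Nx4 homogeneous
    pts_cam = (T_cam_radar @ pts_h.T).T[:, :3]           # Nx3 in camera frame
    in_front = pts_cam[:, 2] > 0                         # drop points behind the camera
    uv = (K @ pts_cam[in_front].T).T
    uv = uv[:, :2] / uv[:, 2:3]                          # perspective divide
    return uv, in_front
```

Occlusion is exactly what this sketch ignores: a radar point can project into the image yet be invisible to the camera.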
Generative Model
Lidar to radar
Scene configuration to radar
Radar to image
Figures are from [2, 3, 4].
Low Level Tasks
Radar Depth Estimation
Depth completion: radar-camera fusion, with Lidar as
supervision
Monodepth: with radar as supervision
Two-stage coarse-to-fine architecture
Further expand sparse radar detections for better alignment:
Extend radar detections in height (simple but coarse)
Maintain multiple hypotheses by building a probabilistic map
Apply strict filtering according to the bounding box, where only detections corresponding to the frontal surface are retained
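The "extend in height" option can be sketched as below. This is a naive illustration; the column length extend_px is an arbitrary choice, not a value from the article:

```python
import numpy as np

def extend_radar_in_height(depth_map, us, vs, depths, extend_px=40):
    # Naive height extension: radar has almost no elevation resolution,
    # so copy each sparse radar depth upwards along its image column.
    for u, v, d in zip(us, vs, depths):
        top = max(0, v - extend_px)
        col = depth_map[top:v + 1, u]
        mask = (col == 0) | (col > d)   # fill empty pixels, keep nearer depths
        col[mask] = d
    return depth_map
```

This gives the depth-completion network denser (if coarse) radar guidance than the raw sparse projections.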
Radar Depth Estimation
Depth in BEV is similar to
Occupancy grid mapping
Temporal accumulation
Better for slow speed applications, such as parking
Semantic scene understanding
Open space segmentation
Single measurement, real-time
Figures are from [5, 6].
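The temporal-accumulation idea behind BEV occupancy grids can be sketched with a standard log-odds update. The increment and clamp limits below are illustrative, not values from the cited works:

```python
import numpy as np

def update_log_odds(grid, hit_cells, l_hit=0.85, l_min=-5.0, l_max=5.0):
    # One temporal-accumulation step: add a positive log-odds increment to
    # every cell containing a radar detection this frame, clamped to avoid
    # saturating the grid.
    for r, c in hit_cells:
        grid[r, c] = np.clip(grid[r, c] + l_hit, l_min, l_max)
    return grid

def occupancy_prob(grid):
    # Convert log-odds back to occupancy probability via the sigmoid.
    return 1.0 / (1.0 + np.exp(-grid))
```

Repeated hits drive a cell's probability towards 1 while unobserved cells stay at 0.5, which is why accumulation works best at low speeds such as parking.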
Radar Velocity Estimation
Radar measures radial velocity
To recover the full velocity, we need to add constraints, such as:
Multiple detections observed per object
Other modalities seeing the same detection
Temporal relationships between multiple frames, i.e. scene flow
Figures are from [7, 8].
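The multiple-detections-per-object constraint reduces to a small least-squares problem. A sketch, assuming a rigid, purely translating object in 2D:

```python
import numpy as np

def recover_full_velocity(azimuths, radial_speeds):
    # Each detection i of a rigid, purely translating object satisfies
    #   vr_i = vx * cos(az_i) + vy * sin(az_i),
    # so two or more detections at different azimuths determine (vx, vy)
    # in the least-squares sense.
    A = np.stack([np.cos(azimuths), np.sin(azimuths)], axis=1)
    v, *_ = np.linalg.lstsq(A, radial_speeds, rcond=None)
    return v
```

With rotation (yaw rate) the model gains extra unknowns, but the same linear least-squares structure carries over.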
Object Detection
Traditional Radar Detection Pipeline
RD map -> CFAR -> DOA -> DBSCAN
Deep learning can be applied to replace CFAR and DOA estimation
Figures are from [9, 10].
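As a reference point for what the learned detectors replace, here is a minimal cell-averaging CFAR on a 1D power profile (the guard/training sizes and threshold scale are illustrative):

```python
import numpy as np

def ca_cfar_1d(power, guard=2, train=8, scale=3.0):
    # Cell-averaging CFAR: a cell is declared a detection if it exceeds
    # scale * (mean of the training cells), where the training window
    # excludes guard cells on both sides of the cell under test.
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    for i in range(train + guard, n - train - guard):
        left = power[i - guard - train:i - guard]
        right = power[i + guard + 1:i + guard + 1 + train]
        noise = np.mean(np.concatenate([left, right]))
        detections[i] = power[i] > scale * noise
    return detections
```

The locally estimated noise floor keeps the false-alarm rate roughly constant; DL-CFAR-style methods [9] learn this thresholding instead.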
End-to-End Radar Detection
Different radar representations + temporal accumulation
Raw ADC data: replace MIMO waveform separation
RAD encoder: 3D tensor
3D convolution is time-consuming; use CFAR to crop small cubes
Multi-view architecture, then 2D convolution
RA map: close-field high-resolution imaging (USRR mode)
Point cloud
PointNet/PointConv, etc.
Accumulate into a grid map, then YOLO
Sensor Fusion for Detection
Input fusion needs a lightweight pre-processing to explicitly
handle radar position imprecision.
Cascaded ROI fusion is not robust to sensor failures, while parallel ROI fusion improves on this
Sensor Fusion for Detection (Cont.)
Feature map fusion provides the network with greater flexibility to
combine radar and visual semantics
Require dynamic training techniques
Modality-wise dropout
Weight freezing
Require dynamic inference capability
Scene classifier + knowledge-based gating
Self attention + cross attention
Entropy-based (input) weighted average
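The entropy-based weighted average can be sketched as follows, assuming each branch outputs a class posterior. The inverse-entropy weighting is one plausible instantiation, not necessarily the exact scheme in the literature:

```python
import numpy as np

def entropy(p, eps=1e-12):
    # Shannon entropy of a categorical distribution (natural log).
    p = np.clip(p, eps, 1.0)
    return -np.sum(p * np.log(p))

def entropy_weighted_fusion(pred_a, pred_b):
    # Weight each modality's class posterior by the inverse of its entropy,
    # so the more confident (lower-entropy) branch dominates the average.
    wa = 1.0 / (entropy(pred_a) + 1e-6)
    wb = 1.0 / (entropy(pred_b) + 1e-6)
    fused = (wa * pred_a + wb * pred_b) / (wa + wb)
    return fused / fused.sum()
```

A confident branch (peaked posterior) thus outweighs an uncertain one (near-uniform posterior) without any learned gating.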
Sensor Fusion for Detection (Cont.)
Decision fusion takes advantage of modal redundancy and is therefore popular in real-world applications
Location information is noisy
Track-to-track fusion (temporal filtering)
Use visual semantics for better association
Category information is difficult to handle
Bayesian inference: inherent problem with modelling ignorance
Set-based: evidence theory
Object Tracking
Extended Object Tracking
Extended object: multiple detections per object
Cluster the sensor data to feed conventional trackers, e.g. an EKF with a single detection per object
Random Finite Set (RFS) representation
First-order approximation: GMM-PHD or SMC-PHD filtering
Track-by-detection
Detection -> association -> simple tracking
End-to-end: extract features from two consecutive radar frames and learn the temporal consistency between them
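The "cluster, then track" step can be sketched with a simple single-linkage grouping (a stand-in for DBSCAN; the gate eps is an arbitrary choice):

```python
import numpy as np

def cluster_detections(points, eps=1.5):
    # Greedy single-linkage clustering of radar detections: group points
    # closer than eps, then reduce each cluster to a centroid so that a
    # conventional single-point tracker (e.g. an EKF) can consume it.
    n = len(points)
    labels = -np.ones(n, dtype=int)
    current = 0
    for i in range(n):
        if labels[i] != -1:
            continue
        labels[i] = current
        stack = [i]
        while stack:                       # grow the cluster transitively
            j = stack.pop()
            d = np.linalg.norm(points - points[j], axis=1)
            for k in np.where((d < eps) & (labels == -1))[0]:
                labels[k] = current
                stack.append(k)
        current += 1
    centroids = np.array([points[labels == c].mean(axis=0)
                          for c in range(current)])
    return labels, centroids
```

Collapsing each extended object to a centroid is exactly the information loss that RFS-based extended-object trackers try to avoid.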
Radar Signature
Most detections are located in the proximity of the contour (0.1-0.2 m inside)
The wheel houses (even on the opposite side) reflect EM waves especially well and cause a substantial number of detections
Detection probability on the contour is heavily dependent on the
orientation
Figures are from [11].
Radar Measurement Model
Function of individual measurement likelihoods
Surface-volume model is more reasonable
Figures are from [12].
Challenges
Multi-Path Effect
If the target reflections and the multi-path reflections occupy the
same RD cell, the performance of DOA is affected.
If they occupy different cells, it can produce ghost targets in
multi-path directions.
Reflection between ego-vehicle and targets
Underbody reflection, e.g. under a truck, also known as the look-through effect
Mirrored ghost detections caused by the reflective surface, such as
concrete walls, guardrails etc.
Figures are from [13].
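The geometry of a mirrored ghost is just a reflection of the true target across the reflective surface. A minimal sketch:

```python
import numpy as np

def mirror_ghost(target, wall_point, wall_normal):
    # Position of the mirrored ghost detection produced by a flat
    # reflective surface (e.g. a guardrail): the true target reflected
    # across the plane through wall_point with normal wall_normal.
    n = wall_normal / np.linalg.norm(wall_normal)
    d = np.dot(target - wall_point, n)     # signed distance to the surface
    return target - 2.0 * d * n
```

Rule-based clutter detectors such as [13] exploit exactly this geometry: a candidate detection mirrored back across a known surface should coincide with a real target.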
Mutual Interference
Coherent interference occurs when the same chirp configuration
is used and leads to ghost detections
Incoherent interference is caused by different types of chirps,
resulting in a significantly increased noise floor, masked weak
target, and thus, reduced probability of detection
In reality, partially coherent interference is more widely seen
where the interferer has a slightly different chirp configuration.
Figures are from [14].
Fusion in Adverse Weather
Attenuation decreases the received power of the signal.
Backscattering increases the interference at the receiver.
The diameter of fog particles ranges from 1 to 100 µm; raindrop
diameters range from 0.5 to 5 mm
Lidar signals, with a wavelength (905 nm or 1550 nm) shorter than
the diameter of these particles, are highly scattered
Radar signal will have minor attenuation and backscattering
Figures are from [15].
Future Research Directions
Building high-quality datasets
Point-wise annotations are better (clutter and multi-path effects)
Data diversity (weather, scenarios) and class imbalance (VRUs)
Orientation and full-velocity
Incorporating radar domain knowledge
Radar datasets cannot guarantee generalization
Identify possible deficiencies, such as multi-path, interference etc.
Uncertainty quantification
Due to the low SNR of radar data and the small size of radar
datasets, both high data and model uncertainties are expected for
CNN-based radar detectors.
Motion forecasting
Doppler velocity is valuable for motion forecasting
Reference I
[1] Igal Bilik. "Comparative Analysis of Radar and Lidar Technologies for Automotive Applications". In: IEEE Intelligent Transportation Systems Magazine (2022).
[2] Carsten Ditzel and Klaus Dietmayer. "GenRadar: Self-Supervised Probabilistic Camera Synthesis Based on Radar Frequencies". In: IEEE Access 9 (2021), pp. 148994–149042.
[3] Vladimir Lekic and Zdenka Babic. "Automotive radar and camera fusion using generative adversarial networks". In: Computer Vision and Image Understanding 184 (2019), pp. 1–8.
[4] Tim A Wheeler et al. "Deep stochastic radar models". In: 2017 IEEE Intelligent Vehicles Symposium (IV). IEEE. 2017, pp. 47–53.
[5] Robert Prophet et al. "Semantic segmentation on 3D occupancy grids for automotive radar". In: IEEE Access 8 (2020), pp. 197917–197930.
Reference II
[6] Julien Rebut et al. "Raw High-Definition Radar for Multi-Task Learning". In: arXiv preprint arXiv:2112.10646 (2021).
[7] Fangqiang Ding et al. "Self-Supervised Scene Flow Estimation with 4D Automotive Radar". In: arXiv preprint arXiv:2203.01137 (2022).
[8] Yunfei Long et al. "Full-Velocity Radar Returns by Radar-Camera Fusion". In: Proceedings of the IEEE/CVF International Conference on Computer Vision. 2021, pp. 16198–16207.
[9] Chia-Hung Lin et al. "DL-CFAR: A Novel CFAR target detection method based on deep learning". In: 2019 IEEE 90th Vehicular Technology Conference (VTC2019-Fall). IEEE. 2019, pp. 1–6.
[10] Jonas Fuchs et al. "A Machine Learning Perspective on Automotive Radar Direction of Arrival Estimation". In: IEEE Access (2022).
Reference III
[11] Philipp Berthold et al. "Radar reflection characteristics of vehicles for contour and feature estimation". In: 2017 Sensor Data Fusion: Trends, Solutions, Applications (SDF). IEEE. 2017, pp. 1–6.
[12] Yuxuan Xia et al. "Extended object tracking using hierarchical truncation measurement model with automotive radar". In: ICASSP 2020-2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE. 2020, pp. 4900–4904.
[13] Johannes Kopp et al. "Fast Rule-Based Clutter Detection in Automotive Radar Data". In: 2021 IEEE International Intelligent Transportation Systems Conference (ITSC). IEEE. 2021, pp. 3010–3017.
Reference IV
[14] Canan Aydogdu et al. "Radar interference mitigation for automated driving: Exploring proactive strategies". In: IEEE Signal Processing Magazine 37.4 (2020), pp. 72–84.
[15] You Li et al. "What happens for a ToF LiDAR in fog?" In: IEEE Transactions on Intelligent Transportation Systems 22.11 (2020), pp. 6670–6681.
Thank You !

More Related Content

PDF
[기초개념] Graph Convolutional Network (GCN)
PPTX
Batch normalization presentation
PDF
HSB Marine | Floating Solutions and Building On Water
PDF
Floating Solar PV - An Introduction
PPTX
Q1 Memory Fabric Forum: Building Fast and Secure Chips with CXL IP
PPTX
What is quantum computing
PPTX
Introduction to RAG (Retrieval Augmented Generation) and its application
PDF
Orbital mechanics
[기초개념] Graph Convolutional Network (GCN)
Batch normalization presentation
HSB Marine | Floating Solutions and Building On Water
Floating Solar PV - An Introduction
Q1 Memory Fabric Forum: Building Fast and Secure Chips with CXL IP
What is quantum computing
Introduction to RAG (Retrieval Augmented Generation) and its application
Orbital mechanics

What's hot (20)

PDF
Fabry laser ppt
PDF
XR and the Future of Immersive Technology
DOCX
IoT Design Principles
PPTX
Rf and mw radiation hazards
PDF
S-matrix analysis of waveguide components
PPTX
Creator IoT Framework
PPT
Virtual retinal-display ppt
PDF
Eccm in radar
PPTX
bel-radar
PDF
Radar 2009 a 19 electronic counter measures
PPT
Microstrip Patch Antenna Design
PPTX
Augmented Reality
PDF
Project “The Interceptor”: Owning anti-drone systems with nanodrones
PDF
A survey on Device-to-Device Communication
PPT
Wireless Sensor Network based Crop Field Monitoring for Marginal Farming: Per...
PDF
Extended Reality.pdf
PDF
microstrip transmission lines explained.pdf
PPT
Satellite Bands
PPTX
Smart Antenna for mobile communication
PDF
Introduction to IoT Architectures and Protocols
Fabry laser ppt
XR and the Future of Immersive Technology
IoT Design Principles
Rf and mw radiation hazards
S-matrix analysis of waveguide components
Creator IoT Framework
Virtual retinal-display ppt
Eccm in radar
bel-radar
Radar 2009 a 19 electronic counter measures
Microstrip Patch Antenna Design
Augmented Reality
Project “The Interceptor”: Owning anti-drone systems with nanodrones
A survey on Device-to-Device Communication
Wireless Sensor Network based Crop Field Monitoring for Marginal Farming: Per...
Extended Reality.pdf
microstrip transmission lines explained.pdf
Satellite Bands
Smart Antenna for mobile communication
Introduction to IoT Architectures and Protocols
Ad

Similar to Slides_Deep_Radar_Perception_for_Autonomous_Driving.pdf (20)

DOC
Mmpaper draft10
DOC
Mmpaper draft10
PPTX
sensor fusion presentation iit kanpur ashish
PDF
3D LiDAR in Action: Enhancing Object Detection Across Industries
PDF
Research on Ship Detection in Visible Remote Sensing Images
PPT
Ieee gold 2010 resta
PDF
Real-time 3D Object Detection on LIDAR Point Cloud using Complex- YOLO V4
PDF
DSNet Joint Semantic Learning for Object Detection in Inclement Weather Condi...
PPTX
Lidar and sensing
PDF
Visual and light detection and ranging-based simultaneous localization and m...
PPTX
PDF
Mobile robot localization using visual odometry in indoor environments with T...
PDF
You only look once model-based object identification in computer vision
PDF
“Future Radar Technologies and Applications,” a Presentation from IDTechEx
PPTX
Lidar technology and it’s applications
PDF
10.1109@ICCMC48092.2020.ICCMC-000167.pdf
PPT
Template_for_Presentation_january25 (1).ppt
PDF
3-d interpretation from single 2-d image IV
PPTX
Seminar -I PPT Vivek RT-Object Detection.pptx
Mmpaper draft10
Mmpaper draft10
sensor fusion presentation iit kanpur ashish
3D LiDAR in Action: Enhancing Object Detection Across Industries
Research on Ship Detection in Visible Remote Sensing Images
Ieee gold 2010 resta
Real-time 3D Object Detection on LIDAR Point Cloud using Complex- YOLO V4
DSNet Joint Semantic Learning for Object Detection in Inclement Weather Condi...
Lidar and sensing
Visual and light detection and ranging-based simultaneous localization and m...
Mobile robot localization using visual odometry in indoor environments with T...
You only look once model-based object identification in computer vision
“Future Radar Technologies and Applications,” a Presentation from IDTechEx
Lidar technology and it’s applications
10.1109@ICCMC48092.2020.ICCMC-000167.pdf
Template_for_Presentation_january25 (1).ppt
3-d interpretation from single 2-d image IV
Seminar -I PPT Vivek RT-Object Detection.pptx
Ad

Recently uploaded (20)

PPTX
Applications of SAP S4HANA in Mechanical by Sidhant Vohra (SET23A24040166).pptx
PDF
Engine Volvo EC55 Compact Excavator Service Repair Manual.pdf
PDF
Caterpillar Cat 324E LN Excavator (Prefix LDG) Service Repair Manual Instant ...
PPT
Main/Core Business Application User Manual
PPTX
368455847-Relibility RJS-Relibility-PPT-1.pptx
PDF
Life Cycle Analysis of Electric and Internal Combustion Engine Vehicles
PDF
Volvo EC55 Compact Excavator Service Repair Manual Instant Download.pdf
PDF
Transmission John Deere 370E 410E 460E Technical Manual.pdf
PDF
Articulated Dump Truck John Deere 370E 410E 460E Technical Manual.pdf
PDF
book-slidefsdljflsk fdslkfjslf sflgs.pdf
PPTX
Training Material_Verification Station.pptx
PPT
Introduction to Hybrid Electric Vehicles
PDF
System Diagrams John Deere 370E 410E 460E Repair Manual.pdf
PDF
harrier-ev-brochure___________________.pdf
PPTX
Money and credit.pptx from economice class IX
PDF
Cylinder head Volvo EC55 Service Repair Manual.pdf
PPTX
description of motor equipments and its process.pptx
PDF
Lubrication system for Automotive technologies
PDF
MES Chapter 3 Combined UNIVERSITY OF VISVESHWARAYA
PPTX
Cloud_Computing_ppt[1].pptx132EQ342RRRRR1
Applications of SAP S4HANA in Mechanical by Sidhant Vohra (SET23A24040166).pptx
Engine Volvo EC55 Compact Excavator Service Repair Manual.pdf
Caterpillar Cat 324E LN Excavator (Prefix LDG) Service Repair Manual Instant ...
Main/Core Business Application User Manual
368455847-Relibility RJS-Relibility-PPT-1.pptx
Life Cycle Analysis of Electric and Internal Combustion Engine Vehicles
Volvo EC55 Compact Excavator Service Repair Manual Instant Download.pdf
Transmission John Deere 370E 410E 460E Technical Manual.pdf
Articulated Dump Truck John Deere 370E 410E 460E Technical Manual.pdf
book-slidefsdljflsk fdslkfjslf sflgs.pdf
Training Material_Verification Station.pptx
Introduction to Hybrid Electric Vehicles
System Diagrams John Deere 370E 410E 460E Repair Manual.pdf
harrier-ev-brochure___________________.pdf
Money and credit.pptx from economice class IX
Cylinder head Volvo EC55 Service Repair Manual.pdf
description of motor equipments and its process.pptx
Lubrication system for Automotive technologies
MES Chapter 3 Combined UNIVERSITY OF VISVESHWARAYA
Cloud_Computing_ppt[1].pptx132EQ342RRRRR1

Slides_Deep_Radar_Perception_for_Autonomous_Driving.pdf

  • 1. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Deep Radar Perception for Autonomous Driving Datasets, Methods, and Challenges Yi Zhou zhouyi1023@tju.edu.cn Institute of Deep Perception Technology Jiangsu Industrial Technology Research Institute June 8, 2022 Yi Zhou JITRI Deep Radar Perception June 8, 2022 1 / 41
  • 2. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Table of Contents 1 Introduction 2 Radar Datasets 3 Low Level Tasks 4 Object Detection 5 Object Tracking 6 Challenges Yi Zhou JITRI Deep Radar Perception June 8, 2022 2 / 41
  • 3. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Introduction Yi Zhou JITRI Deep Radar Perception June 8, 2022 3 / 41
  • 4. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Citation This presentation is adapted from the review article "Towards Deep Radar Perception for Autonomous Driving: Datasets, Methods, and Challenges". For more details, please see the original article. https://www.mdpi.com/1424-8220/22/11/4208 In addition, I maintain a github repository for further updating. https://github.com/ZHOUYI1023/ awesome-radar-perception Yi Zhou JITRI Deep Radar Perception June 8, 2022 4 / 41
  • 5. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Industrial Trends 4D Radar, possibly with imaging capability 12Tx-16Rx radar (4 chip cascade and FPGA) Arbe 48Tx-48Rx radar (RF chipset and radar processing unit (RPU)) CAN (500 kb/s) -> 100BASE-T Ethernet (11.75 MB/s) Yi Zhou JITRI Deep Radar Perception June 8, 2022 5 / 41
  • 6. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Radar vs Lidar Parameter Radar Lidar Wavelength 3.8 mm 905 nm and 1,550 nm Directional element Antenna Optics Illumination approach Wide beam ->entire object Narrow beam ->portion of the object Transmitted signal Diffusive propagation Direct propagation Two-way attenuation 1/R^4 1/R^2 Major propagation effects Multipath Absorption, Diffusion, Speckle Interaction with objects Specular: isolated detections Diffusive: “image like” Received echo Superposition of multiple reflectors From a single reflector Output Radar cube- >Point cloud Point cloud Classification information Statistics of point distritution, RCS, velocity Object shape Figure is from [1]. Yi Zhou JITRI Deep Radar Perception June 8, 2022 6 / 41
  • 7. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Research Trends Radar perception has drawn increasing attention from researchers in computer vision, robotics, and intelligent vehicles. Yi Zhou JITRI Deep Radar Perception June 8, 2022 7 / 41
  • 8. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Big Picture Radar can be used in many applications: Military applications, remote sensing Traffic surveillance and security: concealed object detection Industry monitoring Automotive: object detection and robust Odometry/ SLAM Gait/gesture recognition -> healthcare, HCI Vital sign monitoring -> healthcare, sleep monitoring, in-cabin monitoring For autonomous driving, learning-based methods can be used for: Radar depth estimation Radar velocity estimation Radar object detection Sensor fusion for autonomous driving Yi Zhou JITRI Deep Radar Perception June 8, 2022 8 / 41
  • 9. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Radar Datasets Yi Zhou JITRI Deep Radar Perception June 8, 2022 9 / 41
  • 10. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Radar Datasets Many radar datasets were released within these three years. Yi Zhou JITRI Deep Radar Perception June 8, 2022 10 / 41
  • 11. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Radar Datasets (Cont.) Radar Datasets for Autonomous Driving nuScenes, DENSE and Pixset are for sensor fusion, but do not particularly address the role of radar. Radar scenes provides point-wise annotations for radar point cloud, but has no other modalities. Pointillism uses 2 radars with overlapped view. Zendar uses SAR for vehicle detection. RADIATE uses spinning radar and addresses adverse weather effect. Yi Zhou JITRI Deep Radar Perception June 8, 2022 11 / 41
  • 12. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Radar Datasets (Cont.) 4D Radar Datasets Astyx is small VoD focuses on VRU classification RADIal’s annotation is coarse but provides raw data TJ4D features for its long range detection Pre-CFAR Datasets CARRADA is too simple CRUW uses RA maps RADDet provides annotations for RAD tensor RADICaL provides raw ADC data and signal processing toolboxes GhentVRU is designed for VRU detection Yi Zhou JITRI Deep Radar Perception June 8, 2022 12 / 41
  • 13. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Radar Datasets (Cont.) Odometry and Localization Datasets Oxford Radar Robocar, RADIATE, MULRAN and Boreas all use spinning radar MulRan, Boreas and EU Long-term are for place recognition and long-term SLAM Endeavour Radar Dataset uses 5 Conti ARS 430 for odometry ColoRadar uses TI AWR2243 Cascade + AWR1843 with overlapped FoV for odometry USVInland is for SLAM in inland waterways and water segmentation Yi Zhou JITRI Deep Radar Perception June 8, 2022 13 / 41
  • 14. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Radar Datasets (Cont.) Radar Datasets for Specific Tasks HawkEye is a SAR datasets for tatic vehicle classification PREVENTION is a datasets for trajectory prediction SCORP is for open space segmentation Ghost is for mirrored ghost detection DopNet is for gesture classification Radar signatures of human activities and Solinteraction Data use Google soli FloW dataset is for floating waste detection Yi Zhou JITRI Deep Radar Perception June 8, 2022 14 / 41
  • 15. Introduction Radar Datasets Low Level Tasks Object Detection Object Tracking Challenges References Radar Calibration Calibration targets to be observed simultaneously by different modalities Vertical misalignment problem if no sufficient elevation resolution Yi Zhou JITRI Deep Radar Perception June 8, 2022 15 / 41
  • 16. Radar Labelling
    Cross-modality labelling
    - Project radar points onto visual images
    - Use Lidar bounding boxes as ground truth
    Labelling quality is not guaranteed
    - Occlusion problems; the sensors do not necessarily share a FoV
    - A bounding box often contains points that appear to belong to the object but are actually caused by a different reflector, e.g. the ground or multi-path detections
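Cross-modality labelling by projecting radar points onto an image can be sketched with a pinhole camera model. This is a minimal illustration only: the intrinsics `K` and the radar-to-camera extrinsics `R`, `t` below are made-up values; real datasets ship calibrated parameters per sensor.

```python
import numpy as np

# Hypothetical calibration for illustration only.
K = np.array([[800.0,   0.0, 320.0],   # fx,  0, cx
              [  0.0, 800.0, 240.0],   #  0, fy, cy
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # radar-to-camera rotation (assumed identity)
t = np.array([0.0, 0.2, 0.0])          # radar-to-camera translation in metres

def project_radar_to_image(points_radar):
    """Project Nx3 radar points into pixel coordinates.

    Points are first transformed into the camera frame (x right,
    y down, z forward), then mapped through the intrinsics and
    divided by depth (perspective divide)."""
    pts_cam = points_radar @ R.T + t      # radar frame -> camera frame
    pts_img = pts_cam @ K.T               # apply intrinsics
    return pts_img[:, :2] / pts_img[:, 2:3]

# A detection 10 m straight ahead lands near the principal point.
uv = project_radar_to_image(np.array([[0.0, 0.0, 10.0]]))
```

With these assumed parameters the point projects to pixel (320, 256), i.e. the principal point shifted down by the mounting offset.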
  • 17. Generative Model
    - Lidar to radar
    - Scene configuration to radar
    - Radar to image
    Figures are from [2, 3, 4].
  • 18. Low Level Tasks
  • 19. Radar Depth Estimation
    - Depth completion: radar-camera fusion, with Lidar as supervision
    - Monodepth: with radar as supervision
    - Two-stage coarse-to-fine architecture
    - Further expand the sparse radar detections for better alignment:
      - Extend radar detections in height (simple but coarse)
      - Maintain multiple hypotheses by building a probabilistic map
      - Apply strict filtering according to the bounding box, where only detections corresponding to the frontal surface are retained
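The "extend in height" heuristic above can be sketched in a few lines: each projected radar detection is smeared into a vertical column of pixels that all inherit the radar depth. The focal length and the assumed metric height range are illustrative values, not from the paper.

```python
import numpy as np

def extend_in_height(u, v, depth, fy=800.0, h_range=(0.25, 2.0)):
    """Expand one projected radar detection at pixel (u, v) into a
    vertical column of (u, row, depth) samples.

    h_range is the assumed metric extent above the detection; a height
    h at distance `depth` spans h * fy / depth pixels. Image rows grow
    downward, so "up" means smaller row indices."""
    px_lo = int(round(h_range[0] * fy / depth))
    px_hi = int(round(h_range[1] * fy / depth))
    rows = np.arange(v - px_hi, v - px_lo + 1)
    return [(u, int(r), depth) for r in rows]

# A detection at pixel (320, 256), 10 m away, becomes a column of
# depth samples covering 0.25-2.0 m above the detection.
column = extend_in_height(320, 256, 10.0)
```

Note how the column automatically shrinks for distant targets: the pixel span scales with 1/depth, which is what makes this simple heuristic produce a roughly metric-consistent prior.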
  • 20. Radar Depth Estimation
    Depth in BEV is similar to occupancy grid mapping
    - Temporal accumulation
    - Better for low-speed applications, such as parking
    - Semantic scene understanding
    Open-space segmentation
    - Single measurement, real-time
    Figures are from [5, 6].
  • 21. Radar Velocity Estimation
    Radar measures only radial velocity. To recover the full velocity, we need additional constraints, such as:
    - Observing multiple detections per object
    - Other modalities seeing the same detection
    - Temporal relationships between multiple frames, i.e. scene flow
    Figures are from [7, 8].
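The first constraint above (multiple detections per rigid object) reduces to a small least-squares problem: each detection i at azimuth az_i contributes one equation vr_i = vx*cos(az_i) + vy*sin(az_i). A minimal NumPy sketch with synthetic values:

```python
import numpy as np

def recover_full_velocity(azimuths, radial_speeds):
    """Least-squares recovery of a rigid target's (vx, vy) from two or
    more detections, using vr_i = vx*cos(az_i) + vy*sin(az_i)."""
    A = np.stack([np.cos(azimuths), np.sin(azimuths)], axis=1)
    v, *_ = np.linalg.lstsq(A, radial_speeds, rcond=None)
    return v

# Synthetic check: a target moving at (10, 2) m/s seen at three azimuths.
az = np.array([-0.3, 0.0, 0.4])
vr = 10.0 * np.cos(az) + 2.0 * np.sin(az)
v_est = recover_full_velocity(az, vr)   # close to [10, 2]
```

In practice the detections must first be associated to one rigid object, and outliers (e.g. rotating wheels) are usually rejected with RANSAC before the least-squares step.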
  • 22. Object Detection
  • 23. Traditional Radar Detection Pipeline
    - RD map -> CFAR -> DOA -> DBSCAN
    - Deep learning can replace the CFAR and DOA estimation steps
    Figures are from [9, 10].
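The CFAR step in this pipeline can be illustrated with a minimal cell-averaging (CA-CFAR) detector over a 1-D power profile. The window sizes and threshold scale below are illustrative choices, not values from the paper.

```python
import numpy as np

def ca_cfar_1d(power, num_train=8, num_guard=2, scale=3.0):
    """Cell-averaging CFAR on a 1-D power profile.

    For each cell under test, the noise level is estimated as the mean
    of `num_train` training cells split across both sides, skipping
    `num_guard` guard cells next to the cell under test. The cell is a
    detection if it exceeds `scale` times that estimate. Edge cells
    without a full window are skipped."""
    n = len(power)
    half = num_train // 2 + num_guard
    detections = []
    for i in range(half, n - half):
        left = power[i - half : i - num_guard]
        right = power[i + num_guard + 1 : i + half + 1]
        noise = np.mean(np.concatenate([left, right]))
        if power[i] > scale * noise:
            detections.append(i)
    return detections

# A single strong target in flat noise is found at its cell index.
profile = np.ones(40)
profile[20] = 100.0
hits = ca_cfar_1d(profile)
```

Real automotive detectors usually run OS-CFAR or 2-D CFAR on the RD map; the sliding-window thresholding logic is the same.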
  • 24. End-to-End Radar Detection
    Different radar representations + temporal accumulation
    - Raw ADC data: replace MIMO waveform separation
    - RAD tensor: 3D conv is time-consuming; use CFAR to crop small cubes, or use a multi-view architecture followed by 2D conv
    - RA map: near-field high-resolution imaging (USRR mode)
    - Point cloud: PointNet/PointConv etc., or accumulate into a grid map and then apply YOLO
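The multi-view idea for the RAD tensor can be sketched by marginalizing one axis at a time, turning the 3-D tensor into three 2-D maps that cheap 2-D convolutions can consume. The tensor below is a random toy with arbitrary shape; real tensors come out of the radar signal-processing chain.

```python
import numpy as np

# Toy Range-Azimuth-Doppler tensor: (range, azimuth, doppler) bins.
rad = np.random.default_rng(0).random((64, 32, 16))

# Sum (max-pooling is another common choice) over one axis per view,
# so an expensive 3-D backbone is replaced by three 2-D branches.
ra_view = rad.sum(axis=2)   # range-azimuth map
rd_view = rad.sum(axis=1)   # range-doppler map
ad_view = rad.sum(axis=0)   # azimuth-doppler map
```

Each view preserves the total energy of the tensor while discarding one dimension, which is why multi-view networks typically fuse the three branch features back together before the detection head.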
  • 25. Sensor Fusion for Detection
    - Input fusion needs lightweight pre-processing to explicitly handle radar position imprecision
    - Cascaded ROI fusion is not robust to sensor failures; parallel ROI fusion improves on this
  • 26. Sensor Fusion for Detection (Cont.)
    Feature-map fusion gives the network greater flexibility to combine radar and visual semantics
    - Requires dynamic training techniques
      - Modality-wise dropout
      - Weight freezing
    - Requires dynamic inference capability
      - Scene classifier + knowledge-based gating
      - Self-attention + cross-attention
      - Entropy-based (input-)weighted average
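Modality-wise dropout can be sketched in a few lines: during training, one whole modality branch is occasionally zeroed so the network cannot over-rely on either sensor. The drop probability and the NumPy arrays standing in for real feature tensors are illustrative assumptions.

```python
import numpy as np

def modality_dropout(radar_feat, image_feat, p_drop=0.3, rng=None):
    """Randomly zero out one entire modality's feature map.

    With probability p_drop/2 the radar branch is dropped, with
    probability p_drop/2 the image branch is dropped; otherwise both
    pass through. At most one modality is dropped per call, so the
    network always sees at least one input."""
    rng = rng or np.random.default_rng()
    u = rng.random()
    if u < p_drop / 2:
        radar_feat = np.zeros_like(radar_feat)
    elif u < p_drop:
        image_feat = np.zeros_like(image_feat)
    return radar_feat, image_feat
```

At inference time the dropout is disabled, but a network trained this way degrades gracefully when one sensor actually fails, which is the point of the dynamic-training techniques listed above.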
  • 27. Sensor Fusion for Detection (Cont.)
    Decision fusion exploits modal redundancy and is therefore popular in real-world applications
    - Location information is noisy
      - Track-to-track fusion (temporal filtering)
      - Use visual semantics for better association
    - Category information is difficult to handle
      - Bayesian inference: inherent problems with modelling ignorance
      - Set-based: evidence theory
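The set-based (evidence-theory) option handles ignorance by letting a sensor assign mass to the full frame of discernment rather than forcing a committed class probability. A minimal sketch of Dempster's rule of combination, with hypothetical radar and camera mass functions:

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Focal elements are frozensets of class labels; mass on the full
    frame explicitly models ignorance. Mass landing on an empty
    intersection is the conflict, which is normalized away."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

frame = frozenset({"car", "pedestrian"})
radar = {frozenset({"car"}): 0.5, frame: 0.5}    # radar is unsure
camera = {frozenset({"car"}): 0.8, frame: 0.2}   # camera is confident
fused = dempster_combine(radar, camera)
```

Here the fused belief in "car" rises to 0.9 while 0.1 mass remains on the full frame; a Bayesian fusion would have had to smuggle the radar's ignorance into a prior instead.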
  • 28. Object Tracking
  • 29. Extended Object Tracking
    Extended object: multiple detections per object
    - Cluster the sensor data to feed conventional trackers, e.g. an EKF with a single detection per object
    - Random Finite Set (RFS) representation; first-order approximation: GMM-PHD or SMC-PHD filtering
    - Track-by-detection: detection -> association -> simple tracking
    - End-to-end: extract features from two consecutive radar frames and learn the temporal consistency between them
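The "cluster, then track one point per object" route can be sketched with a greedy single-linkage grouping that stands in for DBSCAN; each cluster centroid then becomes the single measurement a conventional point tracker consumes. The distance threshold and the greedy linkage are simplifications of what a real DBSCAN would do.

```python
import numpy as np

def cluster_detections(points, eps=1.0):
    """Greedy single-linkage grouping of 2-D detections.

    A point joins the first existing cluster containing a member
    within `eps`; otherwise it starts a new cluster. Returns one
    centroid per cluster, i.e. one pseudo-measurement per object.
    (Unlike DBSCAN, this sketch has no min-points rule and never
    merges clusters, so it is only a stand-in.)"""
    clusters = []
    for p in points:
        placed = False
        for c in clusters:
            if any(np.linalg.norm(p - q) <= eps for q in c):
                c.append(p)
                placed = True
                break
        if not placed:
            clusters.append([p])
    return [np.mean(c, axis=0) for c in clusters]

# Two well-separated groups of detections collapse to two centroids.
dets = np.array([[0.0, 0.0], [0.5, 0.0], [10.0, 10.0], [10.2, 10.1]])
centroids = cluster_detections(dets)
```

The centroids can then be fed to an EKF with a point-target measurement model, which is exactly the approximation that extended-object trackers (RFS/PHD methods) avoid by modelling the full detection set.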
  • 30. Radar Signature
    - Most detections are located near the contour (0.1-0.2 m inside)
    - The wheel houses (even on the opposite side) reflect EM waves especially well and cause a substantial number of detections
    - The detection probability on the contour depends heavily on the orientation
    Figures are from [11].
  • 31. Radar Measurement Model
    - A function of individual measurement likelihoods
    - The surface-volume model is more reasonable
    Figures are from [12].
  • 32. Challenges
  • 33. Multi-Path Effect
    - If the target reflections and the multi-path reflections occupy the same RD cell, DOA performance degrades; if they occupy different cells, ghost targets appear in the multi-path directions
    - Reflections between the ego-vehicle and targets
    - Underbody reflections, e.g. under a truck, also known as the look-through effect
    - Mirrored ghost detections caused by reflective surfaces, such as concrete walls, guardrails, etc.
    Figures are from [13].
  • 34. Mutual Interference
    - Coherent interference occurs when the interferer uses the same chirp configuration, and leads to ghost detections
    - Incoherent interference is caused by different chirp types, resulting in a significantly raised noise floor, masked weak targets, and thus a reduced probability of detection
    - In reality, partially coherent interference, where the interferer has a slightly different chirp configuration, is the most common case
    Figures are from [14].
  • 35. Fusion in Adverse Weather
    - Attenuation decreases the received signal power; backscattering increases the interference at the receiver
    - Fog particle diameters range from about 1 to 100 µm; raindrop diameters range from 0.5 to 5 mm
    - A Lidar signal, whose wavelength (905 nm or 1550 nm) is shorter than these particle diameters, is strongly scattered
    - The radar signal suffers only minor attenuation and backscattering
    Figures are from [15].
  • 36. Future Research Directions
    Building high-quality datasets
    - Point-wise annotations are better (clutter and multi-path effects)
    - Data diversity (weather, scenarios) and class imbalance (VRUs)
    - Orientation and full-velocity annotations
    Incorporating radar domain knowledge
    - Radar datasets cannot guarantee generalization
    - Identify possible deficiencies, such as multi-path, interference, etc.
    Uncertainty quantification
    - Due to the low SNR of radar data and the small size of radar datasets, both high data and model uncertainties are expected for CNN-based radar detectors
    Motion forecasting
    - Doppler velocity is valuable for motion forecasting
  • 37. Reference I
    [1] Igal Bilik. "Comparative Analysis of Radar and Lidar Technologies for Automotive Applications". IEEE Intelligent Transportation Systems Magazine (2022).
    [2] Carsten Ditzel and Klaus Dietmayer. "GenRadar: Self-Supervised Probabilistic Camera Synthesis Based on Radar Frequencies". IEEE Access 9 (2021), pp. 148994-149042.
    [3] Vladimir Lekic and Zdenka Babic. "Automotive radar and camera fusion using generative adversarial networks". Computer Vision and Image Understanding 184 (2019), pp. 1-8.
    [4] Tim A. Wheeler et al. "Deep stochastic radar models". 2017 IEEE Intelligent Vehicles Symposium (IV). IEEE, 2017, pp. 47-53.
    [5] Robert Prophet et al. "Semantic segmentation on 3D occupancy grids for automotive radar". IEEE Access 8 (2020), pp. 197917-197930.
  • 38. Reference II
    [6] Julien Rebut et al. "Raw High-Definition Radar for Multi-Task Learning". arXiv preprint arXiv:2112.10646 (2021).
    [7] Fangqiang Ding et al. "Self-Supervised Scene Flow Estimation with 4D Automotive Radar". arXiv preprint arXiv:2203.01137 (2022).
    [8] Yunfei Long et al. "Full-Velocity Radar Returns by Radar-Camera Fusion". Proceedings of the IEEE/CVF International Conference on Computer Vision. 2021, pp. 16198-16207.
    [9] Chia-Hung Lin et al. "DL-CFAR: A Novel CFAR Target Detection Method Based on Deep Learning". 2019 IEEE 90th Vehicular Technology Conference (VTC2019-Fall). IEEE, 2019, pp. 1-6.
    [10] Jonas Fuchs et al. "A Machine Learning Perspective on Automotive Radar Direction of Arrival Estimation". IEEE Access (2022).
  • 39. Reference III
    [11] Philipp Berthold et al. "Radar reflection characteristics of vehicles for contour and feature estimation". 2017 Sensor Data Fusion: Trends, Solutions, Applications (SDF). IEEE, 2017, pp. 1-6.
    [12] Yuxuan Xia et al. "Extended object tracking using hierarchical truncation measurement model with automotive radar". ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). IEEE, 2020, pp. 4900-4904.
    [13] Johannes Kopp et al. "Fast Rule-Based Clutter Detection in Automotive Radar Data". 2021 IEEE International Intelligent Transportation Systems Conference (ITSC). IEEE, 2021, pp. 3010-3017.
  • 40. Reference IV
    [14] Canan Aydogdu et al. "Radar interference mitigation for automated driving: Exploring proactive strategies". IEEE Signal Processing Magazine 37.4 (2020), pp. 72-84.
    [15] You Li et al. "What happens for a ToF LiDAR in fog?" IEEE Transactions on Intelligent Transportation Systems 22.11 (2020), pp. 6670-6681.
  • 41. Thank You!