Multi-Sensor Data Fusion for Checking Plausibility of V2V Communications by Vision-based Multiple-Object Tracking
Marcus Obst (BASELABS GmbH), Laurens Hobert (Hitachi Europe), Pierre Reisdorf (Technische Universität Chemnitz)
IEEE VNC 2014, Paderborn
2
Project General Information
Project full title: Networked Automated Driving by 2030
Coordinator: Andras Kovacs / BroadBit
Project major partners: CRF, Volvo Technology, Hitachi, BASELABS, EPFL, ICCS, TU Dresden, Armines, BroadBit
Starting date: November 1, 2013
Ending date: October 31, 2016
Total budget / funding: 4.6 MEUR / 3.3 MEUR
Type of project: European S/M collaborative project
3
Development of automated driving technology is a major current challenge
Motivation and Objectives
4
Source: BMW AG
5
Development of automated driving technology is a major current challenge
How can we make the best use of the emerging 5.9 GHz 802.11p technology in the service of automated driving?
How can sensing, control and V2X communications be integrated into a cost-effective on-board system for automated driving?
Motivation and Objectives
6
Straightforward Integration of V2V Communications
[Diagram: a V2V entity sends CAMs and DENMs to the ego vehicle's ITS-G5 wireless unit, which feeds an application/function (e.g. Intersection Movement Assist, Blind Spot Assist).]
7
Straightforward Integration of V2V Communications
[Same diagram: V2V entity → CAMs, DENMs → ITS-G5 wireless unit in the ego vehicle → application/function (e.g. Intersection Movement Assist, Blind Spot Assist).]
Do we trust this entity?
8
Plausibility Checking of V2V Communications
[Diagram: several V2V entities send messages to the ego vehicle's ITS-G5 wireless unit; a plausibility-checking component sits between the wireless unit and the application/function (e.g. Intersection Movement Assist, Blind Spot Assist).]
Cross-correlation with on-board perception
9
Approach of this work
1. Take standard consumer-grade perception and communication sensors
2. Apply Bayesian multi-sensor data fusion and generate a common perception
3. Derive a measure to decide whether a sensor is sending valid information, i.e. perform plausibility checking
10
Take standard consumer-grade perception and communication sensors
MobilEye camera: range, angle, width, velocity; 15 Hz fixed rate
ITS-G5 unit (Atheros-based): CAMs (position, velocity, heading, dimensions, time); 1-10 Hz variable rate
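As a rough illustration of the two interfaces, the sketch below defines possible containers for camera objects and received CAMs; field names, units, and types are assumptions for this sketch and do not reflect the actual MobilEye or ITS-G5 stack interfaces.

```python
# Illustrative sketch only: possible containers for the two measurement sources.
from dataclasses import dataclass

@dataclass
class CameraObject:
    """MobilEye object measurement, delivered at a fixed rate of ~15 Hz."""
    timestamp: float      # reception time in seconds (receiver clock)
    range_m: float        # distance to the object
    angle_rad: float      # bearing relative to the ego heading
    width_m: float
    velocity_mps: float

@dataclass
class CamMessage:
    """ETSI Cooperative Awareness Message, received at a variable 1-10 Hz rate."""
    timestamp: float      # generation time carried in the message
    station_id: int
    latitude_deg: float
    longitude_deg: float
    speed_mps: float
    heading_rad: float
    length_m: float
    width_m: float
```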
11
Approach of this work
1. Take standard consumer-grade perception and communication sensors
2. Apply Bayesian multi-sensor data fusion and generate a common perception
12
Exemplary challenges in the data fusion development process
1. Data/measurement synchronization (see the sketch below)
2. Sensor fields of view (FOV) and handover
3. Occluded objects
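A minimal sketch of one way to handle the first challenge, assuming a fixed latency budget (the value below is arbitrary): measurements from the asynchronous sensors are buffered and released to the fusion filter strictly in timestamp order.

```python
# Sketch, not the authors' implementation: a timestamp-ordered measurement buffer.
import heapq

class MeasurementBuffer:
    def __init__(self, latency_budget_s=0.1):
        self.latency_budget_s = latency_budget_s   # how long to wait for late sensors
        self._heap = []                            # (timestamp, tie-breaker, measurement)
        self._counter = 0

    def push(self, timestamp, measurement):
        heapq.heappush(self._heap, (timestamp, self._counter, measurement))
        self._counter += 1

    def pop_ready(self, current_time):
        """Return all buffered measurements older than the latency budget, in time order."""
        ready = []
        while self._heap and self._heap[0][0] <= current_time - self.latency_budget_s:
            timestamp, _, measurement = heapq.heappop(self._heap)
            ready.append((timestamp, measurement))
        return ready
```

Measurements arriving later than the latency budget would still be out of sequence and would need retrodiction or have to be dropped.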
13
Development effort increases with the number of sensors
Sensor fields of view and handover
14
Sensor fields of view and handover
Identified objects have to be tracked and handed over to other sensors
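A simple geometric sketch of what such a handover check could look like, assuming a sector-shaped FOV model; the ranges and opening angles below are made-up example values, not the parameters of the sensors used in this work.

```python
import math

class SectorFov:
    """Sector-shaped field of view: maximum range plus a symmetric opening angle."""
    def __init__(self, max_range_m, opening_angle_rad):
        self.max_range_m = max_range_m
        self.opening_angle_rad = opening_angle_rad

    def contains(self, x_m, y_m):
        """x/y in the sensor frame: x forward, y to the left."""
        distance = math.hypot(x_m, y_m)
        bearing = math.atan2(y_m, x_m)
        return distance <= self.max_range_m and abs(bearing) <= self.opening_angle_rad / 2

# Example FOVs (made-up values): a tracked object is handed over to whichever
# sensor currently covers its predicted position.
camera_fov = SectorFov(max_range_m=80.0, opening_angle_rad=math.radians(38))
radar_fov = SectorFov(max_range_m=200.0, opening_angle_rad=math.radians(17))

def covering_sensors(x_m, y_m):
    return [name for name, fov in (("camera", camera_fov), ("radar", radar_fov))
            if fov.contains(x_m, y_m)]
```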
15
Occlusion
Relevant objects may not be visible to the sensor(s)
16
V2V communication is introduced to increase the visibility…
17
… and to increase the range of the system!
18
Case study: Handling occluded vehicles in the AutoNet2030 project for 360° perception
Equipment: CAN bus; low-cost GPS (u-blox LEA-6T); MobilEye front camera; ITS-G5 equipment for C2C (Atheros AR5414A-B2B); front radar ARS 308; GNSS reference sensors for highly reliable ground truth
19
20
Hide & Seek
21
V2V allows tracking of occluded objects
[Comparison figure: MobilEye only vs. MobilEye + C2C communication]
22
Approach of this work
1. Take standard consumer-grade perception and communication sensors
2. Apply Bayesian multi-sensor data fusion and generate a common perception
3. Derive a measure to decide whether a sensor is sending valid information, i.e. perform plausibility checking
23
Can we do plausibility checking?
24
Plausibility Checking Results
Neutral: object not visible to on-board perception
Valid: V2V information complies with on-board perception
Invalid: V2V information is not consistent with on-board observations
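Read as a decision rule, the three categories could be approximated as in the sketch below; the gating distance and the nearest-neighbour matching are simplifying assumptions, since the actual decision in this work is derived from the probabilistic track score introduced on the next slide.

```python
import math

def _distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def classify_v2v_object(cam_position, onboard_detections, onboard_fov, gate_m=3.0):
    """cam_position: (x, y) reported via CAM, transformed into the ego frame.
    onboard_fov: any object with a contains(x, y) method (cf. the FOV sketch above)."""
    if not onboard_fov.contains(*cam_position):
        return "neutral"    # object not visible to on-board perception
    if any(_distance(cam_position, det) <= gate_m for det in onboard_detections):
        return "valid"      # V2V information complies with on-board perception
    return "invalid"        # V2V not consistent with on-board observations
```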
25
Plausibility Checking by Track Score
Probabilistic confidence measure (existence probability)
Computed over time
Considers sensor characteristics (FOV, detection probability P_D and false-alarm probability P_F)
Naturally extends to the multi-sensor scenario
Sequential Probability Ratio Test (SPRT)
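A minimal sketch of an SPRT-style track score in its textbook form (Wald thresholds, per-update likelihood ratios); the exact score formulation and threshold choices of the presented system may differ.

```python
import math

class TrackScore:
    """Log-likelihood-ratio track score with Wald-style SPRT thresholds."""
    def __init__(self, p_d, p_f, alpha=0.01, beta=0.01):
        self.p_d = p_d                               # sensor detection probability
        self.p_f = p_f                               # sensor false-alarm probability
        self.score = 0.0                             # accumulated log-likelihood ratio
        self.upper = math.log((1.0 - beta) / alpha)  # accept "object really exists"
        self.lower = math.log(beta / (1.0 - alpha))  # accept "object does not exist"

    def update(self, detected):
        """Call once per scan in which the object lies inside the sensor's FOV."""
        if detected:
            self.score += math.log(self.p_d / self.p_f)
        else:
            self.score += math.log((1.0 - self.p_d) / (1.0 - self.p_f))

    def decision(self):
        if self.score >= self.upper:
            return "valid"
        if self.score <= self.lower:
            return "invalid"
        return "neutral"   # test not yet decided, or object outside all FOVs
```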
26
Including Packet Reception Rate (PRR)
F. Martelli, M. E. Renda, G. Resta, and P. Santi, "A measurement-based study of beaconing performance in IEEE 802.11p vehicular networks," in Proc. IEEE INFOCOM, 2012, pp. 1503–1511.
27
Including Packet Reception Rate (PRR)
F. A. Teixeira, V. F. e Silva, J. L. Leoni, D. F. Macedo, and J. M. Nogueira, "Vehicular networks using the IEEE 802.11p standard: An experimental analysis," Vehicular Communications, vol. 1, no. 2, pp. 91–96, 2014.
28
Including Packet Reception Rate (PRR)
Empirical PRR from the measurement data of the presented work
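One way the empirical PRR could enter the model is sketched below: for the V2V "sensor", the effective detection probability is the nominal detection probability scaled by a distance-dependent PRR. The lookup table holds purely illustrative numbers and would have to be replaced by the empirically measured curve.

```python
import bisect

# Illustrative values only, NOT the measured PRR curve of this work.
PRR_TABLE = [(50.0, 0.98), (100.0, 0.95), (200.0, 0.85), (300.0, 0.65), (400.0, 0.40)]

def prr_at(distance_m):
    """Piecewise-constant PRR lookup for a given distance to the sending vehicle."""
    distances = [d for d, _ in PRR_TABLE]
    index = min(bisect.bisect_left(distances, distance_m), len(PRR_TABLE) - 1)
    return PRR_TABLE[index][1]

def effective_p_d(nominal_p_d, distance_m):
    """Detection probability of the V2V 'sensor' after accounting for packet loss."""
    return nominal_p_d * prr_at(distance_m)
```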
29
Results
30
Plausibility Checking: Valid Scenario
The ego vehicle follows the V2V vehicle, which finally performs a left-turn maneuver.
31
Plausibility Checking: Valid Scenario
32
Results of Plausibility Checking: Valid Scenario
CAMs only, baseline solution
[Plot: plausibility classification remains neutral]
33
Results of Plausibility Checking: Valid Scenario
CAMs + MobilEye
[Plot: plausibility classification shows neutral and valid phases]
34
Plausibility Checking: Attacker Scenario
A ghost vehicle overtakes from the left and enters the FOV of the on-board perception.
35
Results of Plausibility Checking: Attacker Scenario
CAMs + MobilEye
[Plot: plausibility classification shows neutral and invalid phases]
36
What about the efficiency?
Efficient design of the sensor data fusion for ADAS and
automated driving with BASELABS Connect and Create
37
The selected tools allow the developer to spend their time on the differentiating parts of the system.
[Chart: development time split (70% / 20% / 10%) and lines of code (200 / 900 / 520 / 2450) across system integration, model development (= performance), and application and data fusion]
38
Conclusion and Outlook
Bayesian multi-sensor data fusion approaches can be successfully applied to plausibility checking
Standard components can be easily integrated with available tools
Approach naturally extends to other on-board perception sensors such as radars
Development time should be spent on designing and tuning models
Open questions and next steps:
Perform a full centralized raw-sensor data fusion including raw GNSS signals
What about implementing such a system directly inside a wireless unit as an application (e.g. based on Linux/ARM)?
40
[Screenshot: data fusion component developed with BASELABS Create — bird's-eye visualization, sensor data input (from real sensors or recorded data), and sensor calibration information]