
Project Practice: Vision-Based Forward Collision Warning for Autonomous Driving (MATLAB Code)


Chapter 1 Introduction

1.1 Introduction

Autonomous driving means equipping a vehicle with intelligent software and a variety of sensing devices, including on-board sensors, radar, GPS, and cameras, so that the vehicle can drive itself and reach its destination safely and efficiently, with the ultimate goal of eliminating traffic accidents. The US National Highway Traffic Safety Administration (NHTSA) defines five levels of vehicle automation. Level 0: the driver controls the vehicle. Level 1: the vehicle has at least one automated control function (such as adaptive cruise control, ACC, or lane keeping). Level 2: the vehicle itself carries out several driving functions. Level 3: when vehicle-led driving is no longer feasible, the system prompts the driver to switch back to manual driving. Level 4: the vehicle can drive completely unmanned. More and more companies are entering this field, applying cutting-edge IT technologies to the automotive domain in the hope of giving consumers a better driving experience. McKinsey predicts that by 2025 autonomous vehicles could generate 200 billion to 1.9 trillion US dollars in output value; the market research firm IHS predicts that annual sales of Level 4 fully autonomous vehicles could reach 4.8 million by 2035. For any industry, that is a tempting market.

1.2 Research Status

In October, Tesla announced that its future models would be equipped with hardware sufficient for full self-driving capability, rather than the current Autopilot. Shortly before that, Tesla had upgraded the Autopilot software in response to the first fatal crash involving the feature.

In January, GM invested 500 million US dollars in Lyft, the second-largest ride-hailing company in the US, to develop autonomous driving, and later announced that it would test self-driving taxis on public roads within the year. GM also spent 1 billion US dollars to acquire the self-driving startup Cruise Automation. GM is reportedly developing an automated cruising system called Super Cruise, similar to Tesla's Autopilot.

Ford has announced that it will bring fully autonomous vehicles into large-scale commercial use. Ford is reportedly cooperating with, or has already invested in, four startups working on autonomous driving technology, and has recently been expanding its autonomous driving R&D team in Silicon Valley.

Chrysler has announced a partnership with Google's parent company Alphabet to apply the latter's autonomous driving technology to its minivans. Alphabet operates a joint R&D laboratory at Chrysler's headquarters in Michigan.

Honda's Acura brand has demonstrated several autonomous vehicles at the GoMentum Station test site in California. Patent filings show that Honda is also studying the use of augmented reality (AR) to let drivers see objects hidden behind obstacles.

Volvo has started an autonomous driving retrofit program for its XC90 model in Gothenburg, Sweden. Volvo is rumored to be planning further autonomous driving tests in London and in China, and it has partnered with Uber on joint autonomous driving development in Pittsburgh.

Google has announced that its autonomous driving project is being spun out of the X lab to operate independently under the name Waymo. Analysts see this as a sign that Google's years of research have passed the laboratory stage and entered large-scale commercialization.

Apple's car project has always been shrouded in secrecy. According to people familiar with the matter, Apple suspended its autonomous driving project, Titan, and carried out large-scale layoffs. Soon afterwards, however, media reports claimed that Apple was in talks with the British sports car maker McLaren about an investment to jointly advance autonomous driving technology, and that Apple was also considering acquiring Lit Motors, a San Francisco startup building self-balancing motorcycles.

Intel, Mobileye, and BMW have announced a partnership to jointly develop autonomous driving technology, with plans to bring it to commercial use. Mobileye, headquartered in Israel, is a well-known maker of in-vehicle lidar and camera hardware and software; it holds a dominant position in the global ADAS (advanced driver assistance systems) OEM market and has world-leading technology in automotive image detection and processing. Many well-known automakers, including Tesla, Mercedes-Benz, BMW, and Volvo, are Mobileye customers.

nuTonomy is a startup spun out of MIT that focuses on self-driving taxis. Ahead of Uber, nuTonomy carried out the world's first public test of self-driving taxis, in Singapore.

In September, the US Department of Transportation released the Federal Automated Vehicles Policy, which lays down clear rules for the development and manufacture of autonomous vehicles. The department also issued a new rule requiring all new cars and trucks to be equipped with communication devices that report their speed, location, and destination in real time.

At the end of the year, the University of Michigan approved a "safe driving" project worth more than 6 million US dollars to build facilities for testing autonomous vehicles. Working with the automotive industry and government agencies, the university plans to deliver a fleet of autonomous vehicles to Ann Arbor, all of them connected to a network. In July, the University of Michigan announced that it would take over the Willow Run complex formerly belonging to General Motors (Ford built B-24 bombers there during World War II) and turn it into the largest autonomous vehicle test site in the United States, to be named the American Center for Mobility.

Researchers at the Free University of Berlin have unveiled an autonomous car named "Made in Germany" that can stop automatically to pick up passengers. With this experimental vehicle, a passenger sends a message from a smart device; the car locates the passenger via GPS, determines the best route, and drives there as quickly as possible to pick them up.

The UK government has allocated 1.5 million pounds for road trials of autonomous vehicles in Milton Keynes, a town north of London. These self-driving "pods" travel at 19 km/h and will carry passengers to destinations around the town centre on dedicated lanes.

1.3 Main Work of This Paper

This paper fuses data from vision and radar sensors to obtain a list of tracks, that is, estimates of the positions and velocities of the objects in front of the ego vehicle. Forward collision warning is performed by tracking the objects ahead of the vehicle according to the tracks and the FCW criterion. The tracking and warning results are shown on the video frames and in a bird's-eye plot.

Chapter 2 Overview of Key Technologies for Autonomous Driving

2.1 Overview

An autonomous (intelligent) vehicle is an integrated system that combines environment perception, planning and decision-making, and multi-level driver assistance. It brings together computing, modern sensing, information fusion, communication, artificial intelligence, and automatic control technologies, and is a typical high-tech system. The key technologies of autonomous driving can be divided into four parts: environment perception, behavior decision-making, path planning, and motion control.

Figure 1 Key technologies of autonomous driving

2.2 Environment Perception

The first step of autonomous driving is the acquisition and processing of environmental and in-vehicle information, which is the foundation and prerequisite for autonomous driving. Acquiring information about the surroundings involves technologies such as road boundary detection, vehicle detection, and pedestrian detection, i.e., sensor technology; the sensors typically used include laser rangefinders, video cameras, automotive radar, and speed and acceleration sensors. Because each sensor has its own design limitations, no single sensor can perceive accurately under all operating conditions. For the vehicle to run reliably in all environments, multi-sensor fusion is required; fusion is the key technology within environment perception, and it is also where the main gap between domestic and international work currently lies.

2.3 Behavior Decision-Making

In driver assistance or autonomous driving, once perception is complete, the system uses the information obtained by the perception system to make decisions, determine the appropriate operating mode, and formulate the corresponding control strategy. For example, systems such as lane keeping, lane departure warning, headway keeping, and obstacle warning need to predict the states of the ego vehicle, other vehicles, lanes, and pedestrians over a future time horizon. Advanced decision-making approaches include fuzzy inference, reinforcement learning, neural networks, and Bayesian networks.

2.4 Path Planning

Path planning for an autonomous vehicle means finding a traversable path with a search algorithm, on the basis of environment perception and the vehicle's estimated position in the environment, so as to achieve autonomous navigation. Depending on how complete the available environment information is, path planning methods fall into two categories:

Global path planning based on complete environment information. For example, there are many routes from Shanghai to Beijing; planning one of them as the driving route is global planning. Static path planning algorithms of this kind include the grid method, the visibility graph method, the topological method, the free-space method, and the neural network method (a minimal sketch of grid-based search is given after these two categories).

Local path planning based on environment information acquired in real time by the sensors. For example, on the globally planned route from Shanghai to Beijing there will be other vehicles or obstacles; avoiding them by steering and changing lanes is local path planning. Local path planning methods include dynamic algorithms such as the artificial potential field method, the vector field histogram method, the virtual force field method, and genetic algorithms.
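For illustration only, the following is a minimal sketch of grid-based global planning using a breadth-first search over a 4-connected occupancy grid. It is not part of this project's code, and the function name, inputs, and parameters are assumptions made purely for illustration.

function path = gridPlan(grid, startRC, goalRC)
    % Illustrative sketch (not project code): breadth-first search on a grid.
    % grid    : logical matrix, true = occupied cell
    % startRC : [row col] of the start cell
    % goalRC  : [row col] of the goal cell
    [nR, nC] = size(grid);
    prev = zeros(nR, nC);               % linear index of each cell's predecessor
    visited = grid;                     % treat occupied cells as already visited
    queue = sub2ind([nR nC], startRC(1), startRC(2));
    visited(queue) = true;
    moves = [1 0; -1 0; 0 1; 0 -1];     % 4-connected neighbors
    while ~isempty(queue)
        cur = queue(1); queue(1) = [];
        [r, c] = ind2sub([nR nC], cur);
        if r == goalRC(1) && c == goalRC(2), break; end
        for k = 1:4
            nr = r + moves(k,1); nc = c + moves(k,2);
            if nr >= 1 && nr <= nR && nc >= 1 && nc <= nC && ~visited(nr,nc)
                nxt = sub2ind([nR nC], nr, nc);
                visited(nr,nc) = true;
                prev(nxt) = cur;
                queue(end+1) = nxt; %#ok<AGROW>
            end
        end
    end
    % Walk back from the goal to recover the path as [row col] pairs
    path = goalRC;
    cur = sub2ind([nR nC], goalRC(1), goalRC(2));
    while prev(cur) ~= 0
        cur = prev(cur);
        [r, c] = ind2sub([nR nC], cur);
        path = [r c; path]; %#ok<AGROW>
    end
end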

2.5 Motion Control

Once the driving path has been planned, the vehicle must be controlled to follow the desired trajectory; this is the task of the motion control part. Motion control consists of lateral control and longitudinal control: simply put, lateral control is steering control and longitudinal control is speed control.

Most current research focuses on lateral control, using methods such as sliding mode control, fuzzy control, neural network control, optimal control, adaptive control, and pure pursuit control. In lateral control, given a speed, the steering is controlled so that the vehicle follows the planned trajectory; the purpose of longitudinal control is to satisfy the speed requirements during driving, and it often has to work together with lateral control so that, while tracking the trajectory, the vehicle also meets requirements on safety, stability, and comfort. Because the vehicle is a highly complex system with coupling among the lateral, longitudinal, and vertical dynamics, intelligent vehicles require coordinated lateral-longitudinal, or even lateral-longitudinal-vertical, control. Owing to the complexity of this coupling, coordinated control is the main technical difficulty of this part.
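As an illustration of lateral control, the following is a minimal pure pursuit steering sketch. It is not part of this project's code; the function name, inputs, and parameter choices are assumptions made only for illustration.

function delta = purePursuitSteer(pose, pathXY, lookahead, wheelbase)
    % Illustrative sketch (not project code): pure pursuit steering angle.
    % pose      : [x, y, heading] of the ego vehicle (m, m, rad)
    % pathXY    : N-by-2 list of reference path points (m)
    % lookahead : lookahead distance (m)
    % wheelbase : vehicle wheelbase (m)

    % Find the first path point at least one lookahead distance away
    d = hypot(pathXY(:,1) - pose(1), pathXY(:,2) - pose(2));
    idx = find(d >= lookahead, 1, 'first');
    if isempty(idx), idx = size(pathXY, 1); end
    goal = pathXY(idx, :);

    % Angle of the goal point relative to the vehicle heading
    alpha = atan2(goal(2) - pose(2), goal(1) - pose(1)) - pose(3);

    % Pure pursuit law: curvature = 2*sin(alpha)/lookahead,
    % steering angle = atan(curvature * wheelbase)
    delta = atan(2 * wheelbase * sin(alpha) / lookahead);
end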

Chapter 3 Design and Implementation of Forward Collision Warning for Autonomous Driving

3.1 Overview

Forward collision warning (FCW) is an important function in driver assistance and autonomous driving systems. Its goal is to give the driver a correct, timely, and reliable warning before an impending collision with the vehicle ahead. A survey study found that nearly 80% of traffic collisions are caused by driver inattention in the 3 seconds before the event; if the driver receives a hazard warning 1.5 seconds in advance, about 90% of rear-end collisions can be avoided, and with 2 seconds of warning almost all collisions can be prevented. Forward collision warning is therefore extremely important for autonomous driving. In this paper, FCW is performed by fusing data from vision and radar sensors to track the objects in front of the vehicle.

3.2 Sensor Data Loading and Fusion

Figure 2 Sensor layout

An autonomous vehicle typically has the following four main components. Lidar: it rotates continuously while operating and uses laser beams to build 360-degree imagery of the environment around the car. Cameras: multi-camera parallax is used to estimate the distance between surrounding objects and the car; cameras are also used to recognize traffic lights and traffic signs, as well as pedestrians, cyclists, and other objects nearby. Radio (radar) sensors: working together with the cameras, they use radio signals to directly measure the precise distance from the car to nearby objects. Main control unit: usually located in the trunk, it processes the signals from the various sensors and compares them against map data stored in the system to assess the vehicle's real-time state. To implement forward collision warning, data from the vehicle's front-facing vision and radar sensors must be collected; to raise the probability of correct warnings and lower the probability of false warnings, the sensor data must be fused. This paper uses a test (ego) vehicle equipped with these sensors, whose outputs were recorded.

The sensors used in this paper are:

Vision sensor: it provides a list of observed objects, together with their classifications and information about the lane boundaries. The object list is reported 10 times per second; the lane boundaries are reported 20 times per second.

Radar sensor in medium- and long-range mode: it provides a list of unclassified observed objects, reported 20 times per second.

IMU: it reports the ego vehicle's speed and yaw rate, 20 times per second.

Camera: it records a video clip of the scene in front of the car. The video is not used by the tracker; it is only used to overlay the tracking results for verification.

After the data are obtained from the sensors, they are fused to produce a list of tracks, i.e., estimates of the positions and velocities of the objects in front of the car, and warnings are issued according to the tracks and the FCW criterion. The FCW criterion is based on the Euro NCAP AEB test procedure and takes into account the relative distance and relative velocity of the object in front of the car.

3.3 Tracker Creation

The multi-object tracker tracks the objects around the ego vehicle based on the object lists reported by the vision and radar sensors. Setting the tracker up properly is important for stable operation of the system and involves the following parameters. Assignment threshold: how far a detection may fall from a track and still be assigned to it; increase this value if detections that should be assigned to a track are left unassigned, and decrease it if detections are being assigned to tracks that are too far away. Number of coasting updates: how many times a track may be coasted before it is deleted, where coasting means updating a track without an assigned detection (by prediction only). Confirmation parameters: every unassigned detection initializes a new track, and some of these detections may be false, so all tracks start as "tentative"; to be confirmed, a track must be detected at least M times in N tracker updates. The choice of M and N depends on the visibility of the objects. This paper uses the default of 2 detections out of 3 updates.
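The corresponding configuration call, reproduced from the full listing at the end of this chapter with brief comments added:

tracker = multiObjectTracker(...
    'FilterInitializationFcn', @initConstantAccelerationFilter, ...
    'AssignmentThreshold', 35, ...        % how far a detection may fall from a track
    'ConfirmationParameters', [2 3], ...  % confirm a track after 2 detections in 3 updates
    'NumCoastingUpdates', 5);             % delete a track after 5 coasted (undetected) updates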

In addition, the core of this part is designing a suitable Kalman filter initialization function to describe the motion model of the detected objects. First the motion model is defined; this paper uses a constant acceleration model. Next, the process noise is defined in terms of the sampling time and the rate of change of acceleration. Then the measurement model, i.e., the variables expected as output, is defined. Finally, the state and state covariance are initialized; because the constant acceleration model is linear, the required Kalman filter can be created as a KF.
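These steps correspond to the following excerpt from the filter initialization function given in full at the end of this chapter; here detection is the objectDetection passed to the function, and the constant acceleration state is [x;vx;ax;y;vy;ay]:

dt = 0.05;   % Known timestep size
sigma = 1;   % Magnitude of the unknown acceleration change rate
% Process noise of the constant acceleration model along one dimension
Q1d = [dt^4/4, dt^3/2, dt^2/2; dt^3/2, dt^2, dt; dt^2/2, dt, 1] * sigma^2;
Q = blkdiag(Q1d, Q1d);  % 2-D process noise
% The sensors measure [x;vx;y;vy]; the unmeasured accelerations start at zero
state = [detection.Measurement(1); detection.Measurement(2); 0; ...
         detection.Measurement(3); detection.Measurement(4); 0];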

3.4 Collision Warning

After the data have been loaded and the tracker created, each video frame is processed in a loop; the main steps are creating the tracked-object detections, updating the tracks, computing the key object, and updating the detection results.
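In outline, this per-frame processing corresponds to the following loop, condensed from the full listing at the end of this chapter (display and snapshot code omitted):

while currentStep < numSteps
    currentStep = currentStep + 1;
    time = time + timeStep;
    % Format the raw sensor reports as objectDetection inputs to the tracker
    [detections, laneBoundaries, egoLane] = processDetections(...
        visionObjects(currentStep), radarObjects(currentStep), ...
        inertialMeasurementUnit(currentStep), laneReports(currentStep), egoLane, time);
    % Update the multi-object tracker and keep the confirmed tracks
    confirmedTracks = updateTracks(tracker, detections, time);
    % Select the most important object (MIO) and evaluate the FCW criterion
    mostImportantObject = findMostImportantObject(confirmedTracks, egoLane, ...
        positionSelector, velocitySelector);
end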

Creating tracked objects:

In a real system, the radar and vision sensors detect many moving objects, most of which do not affect driving safety, such as objects far away from the car or behind it. The recorded information must therefore be processed and formatted before it can be used by the tracker. In particular, unnecessary clutter must be removed from the radar detections. The radar reports many objects that correspond to stationary things such as guardrails, the road median, and traffic signs; if these detections were used for tracking, they would create false tracks of stationary objects along the road edges, so they must be removed before the tracker is called. Radar objects are considered non-clutter if they are stationary in front of the car or moving near it. The velocity and position measurements of the remaining objects are then formatted as tracker inputs.
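The core of the clutter-removal step, taken from findNonClutterRadarObjects in the listing at the end of this chapter, keeps a radar object if it is moving inside the zone of interest or lies inside the ego lane:

% A radar object is kept if it is moving inside the zone of interest,
% or if it lies inside the ego lane in front of the car
realRadarObjectsIdx = union(...
    intersect(find(normVs > minV), find(inZone == 1)), ...
    find(inLane == 1));
realRadarObjects = radarObject(realRadarObjectsIdx);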

Updating tracked objects:

After the motion model and the velocity and position measurements of the tracked objects have been specified, the Kalman filter is updated to determine whether each tracked object is a moving object that meets the requirements.

Extracting the key object:

In this paper, the key object, i.e., the most important object (MIO), is defined as the track that lies in the ego lane and is closest in front of the car, that is, the target with the smallest X coordinate. To reduce the probability of false alarms, only confirmed tracks are considered. Once the MIO has been found, the relative velocity between the car and the MIO is computed; the relative distance and relative velocity together determine the forward collision warning.
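In the implementation (see findMostImportantObject in the listing at the end of this chapter), the in-lane test and the MIO selection are carried out as follows for each confirmed track at longitudinal position x and lateral position y:

yleftLane  = polyval(egoLane.left,  x);   % left boundary lateral position at range x
yrightLane = polyval(egoLane.right, x);   % right boundary lateral position at range x
if (yrightLane <= y) && (y <= yleftLane)  % the track lies inside the ego lane
    maxX = x;                             % keep the closest in-lane track as the MIO
    MIO = confirmedTracks(i).TrackID;
end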

There are three FCW situations. Safe (green): there is no vehicle in the ego lane (no MIO), the MIO is moving away from the car, or the distance to it remains constant. Caution (yellow): the MIO is getting closer to the car, but is still farther away than the FCW distance. Warn (red): the MIO is getting closer to the car and its distance is less than the FCW distance. The FCW distance is computed according to the Euro NCAP AEB test protocol; it grows with the relative speed between the MIO and the car, i.e., the higher the closing speed, the larger the distance. The Euro NCAP AEB test protocol defines the distance as

d_{FCW} = 1.2 \, v_{rel} + \frac{v_{rel}^2}{2 a_{max}}

where d_{FCW} is the forward collision warning distance, v_{rel} is the relative velocity between the two vehicles, and a_{max} is the maximum deceleration, defined as 40% of the gravitational acceleration.
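As a worked example of this formula (the numbers below are chosen only for illustration), with a closing speed of 10 m/s and a_max = 0.4 * 9.8 m/s^2, the warning distance is about 24.8 m:

relSpeed = -10;                    % m/s, negative: the MIO is getting closer
maxDeceleration = 0.4 * 9.8;       % Euro NCAP AEB maximum deceleration, m/s^2
delayTime = 1.2;                   % driver reaction delay, s
dFCW = abs(relSpeed) * delayTime + relSpeed^2 / (2 * maxDeceleration)
% dFCW = 12 + 100/7.84, approximately 24.8 m; a red warning is raised when x <= dFCW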

Updating the detection results:

During operation, the tracker is updated iteratively, the external environment and the vehicle state are perceived and evaluated, and the corresponding decisions are made.

3.5 Experimental Results and Analysis

Figure 3 Initialization of the video and bird's-eye views

Figure 4 Detection results at frame 50

Figure 5 Detection results at frame 100

Figure 6 Detection results at frame 120

Figure 7 Detection results at frame 150

Figure 8 Detection results at frame 180

Figure 9 Detection results at frame 200

Safe (green): there is no MIO, the MIO is moving away from the car, or the distance to it remains constant.

Caution (yellow): the MIO is getting closer to the car, but is still farther away than the FCW distance.

Warn (red): the MIO is getting closer to the car and its distance is less than the FCW distance.

The experimental results show that the forward collision warning system can effectively detect the lane markings and the vehicles and pedestrians near them, and that it makes decisions and issues the corresponding warnings based on the ego vehicle's own data. However, it still detects some roadside objects that are irrelevant to forward collision warning.

[videoReader, videoDisplayHandle, bepPlotters, sensor] = helperCreateFCWDemoDisplay('01_city_c2s_fcw_10s.mp4', 'SensorConfigurationData.mat');

% Read the recorded detections file
[visionObjects, radarObjects, inertialMeasurementUnit, laneReports, ...
    timeStep, numSteps] = readSensorRecordingsFile('01_city_c2s_fcw_10s_sensor.mat');

laneWidth = 3.6; % meters
egoLane = struct('left', [0 0 laneWidth/2], 'right', [0 0 -laneWidth/2]);

% Prepare some time variables
time = 0;           % Time since the beginning of the recording
currentStep = 0;    % Current timestep
snapTime = 9.3;     % The time to capture a snapshot of the display

% Initialize the tracker
[tracker, positionSelector, velocitySelector] = setupTracker();

while currentStep < numSteps && ishghandle(videoDisplayHandle)
    % Update scenario counters
    currentStep = currentStep + 1;
    time = time + timeStep;

    % Process the sensor detections as objectDetection inputs to the tracker
    [detections, laneBoundaries, egoLane] = processDetections(...
        visionObjects(currentStep), radarObjects(currentStep), ...
        inertialMeasurementUnit(currentStep), laneReports(currentStep), ...
        egoLane, time);

    % Using the list of objectDetections, return the tracks, updated to time
    confirmedTracks = updateTracks(tracker, detections, time);

    % Find the most important object and calculate the forward collision warning
    mostImportantObject = findMostImportantObject(confirmedTracks, egoLane, positionSelector, velocitySelector);

    % Update video and birds-eye plot displays
    frame = readFrame(videoReader);     % Read video frame
    helperUpdateFCWDemoDisplay(frame, videoDisplayHandle, bepPlotters, ...
        laneBoundaries, sensor, confirmedTracks, mostImportantObject, positionSelector, ...
        velocitySelector, visionObjects(currentStep), radarObjects(currentStep));

    % Capture a snapshot
    if time >= snapTime && time < snapTime + timeStep
        snapnow;
    end
end
displayEndOfDemoMessage(mfilename)

function [tracker, positionSelector, velocitySelector] = setupTracker()
    tracker = multiObjectTracker(...
        'FilterInitializationFcn', @initConstantAccelerationFilter, ...
        'AssignmentThreshold', 35, 'ConfirmationParameters', [2 3], ...
        'NumCoastingUpdates', 5);

    % The State vector is:
    %   In constant velocity:     State = [x;vx;y;vy]
    %   In constant acceleration: State = [x;vx;ax;y;vy;ay]

    % Define which part of the State is the position. For example:
    %   In constant velocity:     [x;y] = [1 0 0 0; 0 0 1 0] * State
    %   In constant acceleration: [x;y] = [1 0 0 0 0 0; 0 0 0 1 0 0] * State
    positionSelector = [1 0 0 0 0 0; 0 0 0 1 0 0];

    % Define which part of the State is the velocity. For example:
    %   In constant velocity:     [x;y] = [0 1 0 0; 0 0 0 1] * State
    %   In constant acceleration: [x;y] = [0 1 0 0 0 0; 0 0 0 0 1 0] * State
    velocitySelector = [0 1 0 0 0 0; 0 0 0 0 1 0];
end

function filter = initConstantAccelerationFilter(detection)
% This function shows how to configure a constant acceleration filter. The
% input is an objectDetection and the output is a tracking filter.
% For clarity, this function shows how to configure a trackingKF,
% trackingEKF, or trackingUKF for constant acceleration.
%
% Steps for creating a filter:
%   1. Define the motion model and state
%   2. Define the process noise
%   3. Define the measurement model
%   4. Initialize the state vector based on the measurement
%   5. Initialize the state covariance based on the measurement noise
%   6. Create the correct filter

    % Step 1: Define the motion model and state
    % This example uses a constant acceleration model, so:
    STF = @constacc;     % State-transition function, for EKF and UKF
    STFJ = @constaccjac; % State-transition function Jacobian, only for EKF
    % The motion model implies that the state is [x;vx;ax;y;vy;ay]
    % You can also use constvel and constveljac to set up a constant
    % velocity model, constturn and constturnjac to set up a constant turn
    % rate model, or write your own models.

    % Step 2: Define the process noise
    dt = 0.05;  % Known timestep size
    sigma = 1;  % Magnitude of the unknown acceleration change rate
    % The process noise along one dimension
    Q1d = [dt^4/4, dt^3/2, dt^2/2; dt^3/2, dt^2, dt; dt^2/2, dt, 1] * sigma^2;
    Q = blkdiag(Q1d, Q1d); % 2-D process noise

    % Step 3: Define the measurement model
    MF = @fcwmeas;      % Measurement function, for EKF and UKF
    MJF = @fcwmeasjac;  % Measurement Jacobian function, only for EKF

    % Step 4: Initialize a state vector based on the measurement
    % The sensors measure [x;vx;y;vy] and the constant acceleration model's
    % state is [x;vx;ax;y;vy;ay], so the third and sixth elements of the
    % state vector are initialized to zero.
    state = [detection.Measurement(1); detection.Measurement(2); 0; detection.Measurement(3); detection.Measurement(4); 0];

    % Step 5: Initialize the state covariance based on the measurement
    % noise. The parts of the state that are not directly measured are
    % assigned a large measurement noise value to account for that.
    L = 100; % A large number relative to the measurement noise
    stateCov = blkdiag(detection.MeasurementNoise(1:2,1:2), L, detection.MeasurementNoise(3:4,3:4), L);

    % Step 6: Create the correct filter.
    % Use 'KF' for trackingKF, 'EKF' for trackingEKF, or 'UKF' for trackingUKF
    FilterType = 'EKF';

    % Creating the filter:
    switch FilterType
        case 'EKF'
            filter = trackingEKF(STF, MF, state,...
                'StateCovariance', stateCov, ...
                'MeasurementNoise', detection.MeasurementNoise(1:4,1:4), ...
                'StateTransitionJacobianFcn', STFJ, ...
                'MeasurementJacobianFcn', MJF, ...
                'ProcessNoise', Q ...
                );
        case 'UKF'
            filter = trackingUKF(STF, MF, state, ...
                'StateCovariance', stateCov, ...
                'MeasurementNoise', detection.MeasurementNoise(1:4,1:4), ...
                'Alpha', 1e-1, ...
                'ProcessNoise', Q ...
                );
        case 'KF' % The ConstantAcceleration model is linear and KF can be used
            % Define the measurement model: measurement = H * state
            % In this case:
            %   measurement = [x;vx;y;vy] = H * [x;vx;ax;y;vy;ay]
            %   So, H = [1 0 0 0 0 0; 0 1 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 1 0]
            %
            % Note that ProcessNoise is automatically calculated by the
            % ConstantAcceleration motion model
            H = [1 0 0 0 0 0; 0 1 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 1 0];
            filter = trackingKF('MotionModel', '2D Constant Acceleration', ...
                'MeasurementModel', H, 'State', state, ...
                'MeasurementNoise', detection.MeasurementNoise(1:4,1:4), ...
                'StateCovariance', stateCov);
    end
end

function [detections, laneBoundaries, egoLane] = processDetections...
        (visionFrame, radarFrame, IMUFrame, laneFrame, egoLane, time)
    % Inputs:
    %   visionFrame - objects reported by the vision sensor for this time frame
    %   radarFrame  - objects reported by the radar sensor for this time frame
    %   IMUFrame    - inertial measurement unit data for this time frame
    %   laneFrame   - lane reports for this time frame
    %   egoLane     - the estimated ego lane
    %   time        - the time corresponding to the time frame

    % Remove clutter radar objects
    [laneBoundaries, egoLane] = processLanes(laneFrame, egoLane);
    realRadarObjects = findNonClutterRadarObjects(radarFrame.object,...
        radarFrame.numObjects, IMUFrame.velocity, laneBoundaries);

    % Return an empty list if no objects are reported
    % Counting the total number of objects
    detections = {};
    if (visionFrame.numObjects + numel(realRadarObjects)) == 0
        return;
    end

    % Process the remaining radar objects
    detections = processRadar(detections, realRadarObjects, time);

    % Process video objects
    detections = processVideo(detections, visionFrame, time);
end

function mostImportantObject = findMostImportantObject(confirmedTracks,egoLane,positionSelector,velocitySelector)

    % Initialize outputs and parameters
    MIO = [];               % By default, there is no MIO
    trackID = [];           % By default, there is no trackID associated with an MIO
    FCW = 3;                % By default, if there is no MIO, then FCW is 'safe'
    threatColor = 'green';  % By default, the threat color is green
    maxX = 1000;  % Far enough forward so that no track is expected to exceed this distance
    gAccel = 9.8; % Constant gravity acceleration, in m/s^2
    maxDeceleration = 0.4 * gAccel; % Euro NCAP AEB definition
    delayTime = 1.2; % Delay time for a driver before starting to brake, in seconds

    positions = getTrackPositions(confirmedTracks, positionSelector);
    velocities = getTrackVelocities(confirmedTracks, velocitySelector);

    for i = 1:numel(confirmedTracks)
        x = positions(i,1);
        y = positions(i,2);

        relSpeed = velocities(i,1); % The relative speed between the cars, along the lane

        if x < maxX && x > 0 % No point checking otherwise
            yleftLane = polyval(egoLane.left, x);
            yrightLane = polyval(egoLane.right, x);
            if (yrightLane <= y) && (y <= yleftLane)
                maxX = x;
                trackID = i;
                MIO = confirmedTracks(i).TrackID;
                if relSpeed < 0 % Relative speed indicates object is getting closer
                    % Calculate expected braking distance according to
                    % Euro NCAP AEB Test Protocol
                    d = abs(relSpeed) * delayTime + relSpeed^2 / 2 / maxDeceleration;
                    if x <= d % 'warn'
                        FCW = 1;
                        threatColor = 'red';
                    else % 'caution'
                        FCW = 2;
                        threatColor = 'yellow';
                    end
                end
            end
        end
    end
    mostImportantObject = struct('ObjectID', MIO, 'TrackIndex', trackID, 'Warning', FCW, 'ThreatColor', threatColor);
end

function [visionObjects, radarObjects, inertialMeasurementUnit, laneReports, ...
    timeStep, numSteps] = readSensorRecordingsFile(sensorRecordingFileName)
% Read Sensor Recordings
% The |ReadDetectionsFile| function reads the recorded sensor data file.
% The recorded data is a single structure that is divided into the
% following substructures:
%
% # |inertialMeasurementUnit|, a struct array with fields: timeStamp,
%   velocity, and yawRate. Each element of the array corresponds to a
%   different timestep.
% # |laneReports|, a struct array with fields: left and right. Each element
%   of the array corresponds to a different timestep.
%   Both left and right are structures with fields: isValid, confidence,
%   boundaryType, offset, headingAngle, and curvature.
% # |radarObjects|, a struct array with fields: timeStamp (see below),
%   numObjects (integer) and object (struct). Each element of the array
%   corresponds to a different timestep.
%   |object| is a struct array, where each element is a separate object,
%   with the fields: id, status, position(x;y;z), velocity(vx,vy,vz),
%   amplitude, and rangeMode.
%   Note: z is always constant and vz=0.
% # |visionObjects|, a struct array with fields: timeStamp (see below),
%   numObjects (integer) and object (struct). Each element of the array
%   corresponds to a different timestep.
%   |object| is a struct array, where each element is a separate object,
%   with the fields: id, classification, position (x;y;z),
%   velocity(vx;vy;vz), size(dx;dy;dz). Note: z=vy=vz=dx=dz=0
%
% The timeStamp for recorded vision and radar objects is a uint64 variable
% holding microseconds since the Unix epoch. Timestamps are recorded about
% 50 milliseconds apart. There is a complete synchronization between the
% recordings of vision and radar detections, therefore the timestamps are
% not used in further calculations.

A = load(sensorRecordingFileName);
visionObjects = A.vision;
radarObjects = A.radar;
laneReports = A.lane;
inertialMeasurementUnit = A.inertialMeasurementUnit;
timeStep = 0.05;                 % Data is provided every 50 milliseconds
numSteps = numel(visionObjects); % Number of recorded timesteps
end

function [laneBoundaries, egoLane] = processLanes(laneReports, egoLane)
% Lane boundaries are updated based on the laneReports from the recordings.
% Since some laneReports contain invalid (isValid = false) reports or
% impossible parameter values (-1e9), these lane reports are ignored and
% the previous lane boundary is used.
leftLane = laneReports.left;
rightLane = laneReports.right;

% Check the validity of the reported left lane
cond = (leftLane.isValid && leftLane.confidence) && ...
    ~(leftLane.headingAngle == -1e9 || leftLane.curvature == -1e9);
if cond
    egoLane.left = cast([leftLane.curvature, leftLane.headingAngle, leftLane.offset], 'double');
end

% Update the left lane boundary parameters or use the previous ones
leftParams = egoLane.left;
leftBoundaries = parabolicLaneBoundary(leftParams);
leftBoundaries.Strength = 1;

% Check the validity of the reported right lane
cond = (rightLane.isValid && rightLane.confidence) && ...
    ~(rightLane.headingAngle == -1e9 || rightLane.curvature == -1e9);
if cond
    egoLane.right = cast([rightLane.curvature, rightLane.headingAngle, rightLane.offset], 'double');
end

% Update the right lane boundary parameters or use the previous ones
rightParams = egoLane.right;
rightBoundaries = parabolicLaneBoundary(rightParams);
rightBoundaries.Strength = 1;

laneBoundaries = [leftBoundaries, rightBoundaries];
end

function realRadarObjects = findNonClutterRadarObjects(radarObject, numRadarObjects, egoSpeed, laneBoundaries)
% The radar objects include many objects that belong to the clutter.
% Clutter is defined as a stationary object that is not in front of the
% car. The following types of objects pass as nonclutter:
%
% # Any object in front of the car
% # Any moving object in the area of interest around the car, including
%   objects that move at a lateral speed around the car

    % Allocate memory
    normVs = zeros(numRadarObjects, 1);
    inLane = zeros(numRadarObjects, 1);
    inZone = zeros(numRadarObjects, 1);

    % Parameters
    LaneWidth = 3.6;            % What is considered in front of the car
    ZoneWidth = 1.7*LaneWidth;  % A wider area of interest
    minV = 1;                   % Any object that moves slower than minV is considered stationary

    for j = 1:numRadarObjects
        [vx, vy] = calculateGroundSpeed(radarObject(j).velocity(1),radarObject(j).velocity(2),egoSpeed);
        normVs(j) = norm([vx,vy]);
        laneBoundariesAtObject = computeBoundaryModel(laneBoundaries, radarObject(j).position(1));
        laneCenter = mean(laneBoundariesAtObject);
        inLane(j) = (abs(radarObject(j).position(2) - laneCenter) <= LaneWidth/2);
        inZone(j) = (abs(radarObject(j).position(2) - laneCenter) <= max(abs(vy)*2, ZoneWidth));
    end

    realRadarObjectsIdx = union(...
        intersect(find(normVs > minV), find(inZone == 1)), ...
        find(inLane == 1));

    realRadarObjects = radarObject(realRadarObjectsIdx);
end

function [Vx,Vy] = calculateGroundSpeed(Vxi,Vyi,egoSpeed)
% Inputs
%   (Vxi,Vyi) : relative object speed
%   egoSpeed  : ego vehicle speed
% Outputs
%   [Vx,Vy]   : ground object speed

    Vx = Vxi + egoSpeed;    % Calculate longitudinal ground speed
    theta = atan2(Vyi,Vxi); % Calculate heading angle
    Vy = Vx * tan(theta);   % Calculate lateral ground speed
end

function postProcessedDetections = processVideo(postProcessedDetections, visionFrame, t)
% Process the video objects into objectDetection objects
numRadarObjects = numel(postProcessedDetections);
numVisionObjects = visionFrame.numObjects;
if numVisionObjects
    classToUse = class(visionFrame.object(1).position);
    visionMeasCov = cast(diag([2,2,2,100]), classToUse);
    % Process Vision Objects:
    for i=1:numVisionObjects
        object = visionFrame.object(i);
        postProcessedDetections{numRadarObjects+i} = objectDetection(t,...
            [object.position(1); object.velocity(1); object.position(2); 0], ...
            'SensorIndex', 1, 'MeasurementNoise', visionMeasCov, ...
            'MeasurementParameters', {1}, ...
            'ObjectClassID', object.classification, ...
            'ObjectAttributes', {object.id, object.size});
    end
end
end

function postProcessedDetections = processRadar(postProcessedDetections, realRadarObjects, t)
% Process the radar objects into objectDetection objects
numRadarObjects = numel(realRadarObjects);
if numRadarObjects
    classToUse = class(realRadarObjects(1).position);
    radarMeasCov = cast(diag([2,2,2,100]), classToUse);
    % Process Radar Objects:
    for i=1:numRadarObjects
        object = realRadarObjects(i);
        postProcessedDetections{i} = objectDetection(t, ...
            [object.position(1); object.velocity(1); object.position(2); object.velocity(2)], ...
            'SensorIndex', 2, 'MeasurementNoise', radarMeasCov, ...
            'MeasurementParameters', {2}, ...
            'ObjectAttributes', {object.id, object.status, object.amplitude, object.rangeMode});
    end
end
end

function measurement = fcwmeas(state, sensorID)
% The example measurements depend on the sensor type, which is reported by
% the MeasurementParameters property of the objectDetection. The following
% two sensorID values are used:
%   sensorID=1: video objects, the measurement is [x;vx;y].
%   sensorID=2: radar objects, the measurement is [x;vx;y;vy].
% The state is:
%   Constant velocity       state = [x;vx;y;vy]
%   Constant turn           state = [x;vx;y;vy;omega]
%   Constant acceleration   state = [x;vx;ax;y;vy;ay]

    if numel(state) < 6 % Constant turn or constant velocity
        switch sensorID
            case 1 % video
                measurement = [state(1:3); 0];
            case 2 % radar
                measurement = state(1:4);
        end
    else % Constant acceleration
        switch sensorID
            case 1 % video
                measurement = [state(1:2); state(4); 0];
            case 2 % radar
                measurement = [state(1:2); state(4:5)];
        end
    end
end

function jacobian = fcwmeasjac(state, sensorID)
% The example measurements depend on the sensor type, which is reported by
% the MeasurementParameters property of the objectDetection. We choose
% sensorID=1 for video objects and sensorID=2 for radar objects. The
% following two sensorID values are used:
%   sensorID=1: video objects, the measurement is [x;vx;y].
%   sensorID=2: radar objects, the measurement is [x;vx;y;vy].
% The state is:
%   Constant velocity       state = [x;vx;y;vy]
%   Constant turn           state = [x;vx;y;vy;omega]
%   Constant acceleration   state = [x;vx;ax;y;vy;ay]

    numStates = numel(state);
    jacobian = zeros(4, numStates);

    if numel(state) < 6 % Constant turn or constant velocity
        switch sensorID
            case 1 % video
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,3) = 1;
            case 2 % radar
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,3) = 1;
                jacobian(4,4) = 1;
        end
    else % Constant acceleration
        switch sensorID
            case 1 % video
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,4) = 1;
            case 2 % radar
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,4) = 1;
                jacobian(4,5) = 1;
        end
    end
end

