In chrono::sensor::ChRadarSensor, synthetic radar data is generated via GPU-based ray tracing, leveraging the hardware-accelerated, headless rendering capabilities of the NVIDIA OptiX library. The field of view and maximum distance of the radar define the space in which objects may be detected. To model this, the space is partitioned and sampled with rays that are traced from the sensor out into the environment. If a ray collision is detected, the ray is endowed with the velocity of the detection, the intensity of the return, the object ID, and the distance to the detection. The sampled rays are then used to approximate the full radar return. Since radar is transmitted as a single, continuous wave, the movement of the scene during a scan is negligible, in contrast to lidar. The intensity returned by the radar samples is based on a diffuse reflectance model.
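To make the sampling and reflectance ideas above concrete, the sketch below shows one way a ray direction could be drawn from the field of view and a diffuse (Lambertian) return intensity computed for a hit. The names Vec3, SampleRayDirection, and DiffuseReturnIntensity are illustrative only; they are not the sensor's actual OptiX kernels or API.

#include <cmath>

// Illustrative sketch only, not the sensor's actual GPU kernels: each ray sample in the
// radar's field of view is traced into the scene and, on a hit, carries back distance,
// velocity, object ID, and a diffuse return intensity.
struct Vec3 { float x, y, z; };

inline float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Rays partition the horizontal/vertical field of view; (i, j) index the samples.
// Assumes hsamples, vsamples > 1; angles are in radians.
inline Vec3 SampleRayDirection(int i, int j, int hsamples, int vsamples, float hfov, float vfov) {
    float azimuth   = (i / (float)(hsamples - 1) - 0.5f) * hfov;
    float elevation = (j / (float)(vsamples - 1) - 0.5f) * vfov;
    return {std::cos(elevation) * std::cos(azimuth),
            std::cos(elevation) * std::sin(azimuth),
            std::sin(elevation)};
}

// Diffuse (Lambertian) reflectance: the return intensity scales with the cosine of the
// angle between the incoming ray and the surface normal at the detection.
inline float DiffuseReturnIntensity(const Vec3& ray_dir, const Vec3& surface_normal) {
    return std::abs(Dot(ray_dir, surface_normal));
}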
Creating a Radar
auto radar = chrono_types::make_shared<ChRadarSensor>(parent_body,         // body the radar is attached to
                                                      update_rate,         // scanning rate in Hz
                                                      offset_pose,         // offset pose relative to the parent body
                                                      horizontal_samples,  // number of horizontal samples (rays)
                                                      vertical_samples,    // number of vertical samples (rays)
                                                      horizontal_fov,      // horizontal field of view
                                                      vertical_fov,        // vertical field of view
                                                      100);                // maximum detection distance
radar->SetName("Radar Sensor");
radar->SetLag(lag);
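The parameters above are placeholders. Purely for illustration, and noting that my_mesh_body is a hypothetical body already in the system and that exact constants and types may differ between Chrono versions, they might be defined as follows:

auto parent_body = my_mesh_body;                               // any ChBody already added to the system
float update_rate = 5.f;                                       // scans per second
chrono::ChFrame<double> offset_pose({0, 0, 1}, {1, 0, 0, 0});  // 1 m above the body, no rotation
unsigned int horizontal_samples = 100;                         // rays across the horizontal FOV
unsigned int vertical_samples = 30;                            // rays across the vertical FOV
float horizontal_fov = (float)CH_C_PI / 3;                     // 60 deg horizontal FOV, in radians
float vertical_fov = (float)CH_C_PI / 12;                      // 15 deg vertical FOV, in radians
float lag = 0.f;                                               // lag between data collection and availability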
Radar Filter Graph
// Provides host access to the raw radar return buffer
radar->PushFilter(chrono_types::make_shared<ChFilterRadarAccess>());
// Converts the raw returns into Cartesian (XYZ) points with velocity data
radar->PushFilter(chrono_types::make_shared<ChFilterRadarXYZReturn>());
// Provides host access to the XYZ point buffer
radar->PushFilter(chrono_types::make_shared<ChFilterRadarXYZAccess>());
// Renders the resulting point cloud in a 640x480 window
radar->PushFilter(chrono_types::make_shared<ChFilterRadarXYZVisualize>(640, 480, 2, "Radar Point Cloud"));
// Register the radar with the sensor manager
manager->AddSensor(radar);
Radar Data Access
UserRadarXYZBufferPtr data_ptr;
while (true) {  // stand-in for the simulation loop condition
    // Retrieve the most recent processed (XYZ) radar buffer
    data_ptr = radar->GetMostRecentBuffer<UserRadarXYZBufferPtr>();
    if (data_ptr->Buffer) {
        // Inspect the first point: position, velocity, and amplitude are floating-point values
        RadarXYZReturn first_point = data_ptr->Buffer[0];
        std::cout << "First Point: [ " << first_point.x << ", " << first_point.y << ", "
                  << first_point.z << ", " << first_point.vel_x << ", " << first_point.vel_y
                  << ", " << first_point.vel_z << ", " << first_point.amplitude << " ]"
                  << std::endl;
    }
}
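Beyond inspecting a single point, the buffer can also be traversed in full. The sketch below assumes the Width and Height members common to the Chrono sensor buffers and, for simplicity, treats every entry as a valid detection (how invalid entries are marked is not shown here); it scans the buffer for the strongest return.

// Sketch: find the return with the largest amplitude in the current XYZ buffer.
RadarXYZReturn strongest = {};
for (unsigned int i = 0; i < data_ptr->Width * data_ptr->Height; i++) {
    RadarXYZReturn point = data_ptr->Buffer[i];
    if (point.amplitude > strongest.amplitude) {
        strongest = point;
    }
    // The per-point velocity (vel_x, vel_y, vel_z) could be used here as well,
    // e.g. to separate moving detections from static ones.
}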