In chrono::sensor::ChLidarSensor, synthetic data is generated via GPU-based ray tracing, leveraging the hardware-accelerated, headless rendering capabilities of the NVIDIA OptiX library. Each lidar beam is sampled by a group of traced rays; the number of samples, along with the beam divergence angle, is set by the user. The entire frame/scan of the lidar is processed in a single render step. To account for the time difference of rays across the scan, keyframes and motion-blur techniques are used: each beam in the scan traces the scene at a specific time, reproducing the motion of both the objects and the lidar. The intensity returned by the lidar beams is based on diffuse reflectance.
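For intuition, the following is a minimal sketch of a Lambertian diffuse-return model of the kind described above. It is an illustration only, not Chrono's internal OptiX shader code; all names here are assumptions.

struct Vec3 { float x, y, z; };

inline float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Lambertian return: intensity scales with the surface's diffuse reflectance
// ('albedo', in [0,1]) and the cosine of the beam's incidence angle.
// 'ray_dir' is the unit direction from the lidar to the hit point; 'normal'
// is the unit surface normal at the hit point.
float DiffuseReturnIntensity(const Vec3& ray_dir, const Vec3& normal, float albedo) {
    float cos_incidence = -Dot(ray_dir, normal);  // grazing hits return less energy
    return cos_incidence > 0.f ? albedo * cos_incidence : 0.f;
}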
Creating a Lidar
auto lidar = chrono_types::make_shared<ChLidarSensor>(
    parent_body,            // body to which the lidar is attached
    update_rate,            // scanning rate in Hz
    offset_pose,            // offset pose of the lidar relative to the parent body
    horizontal_samples,     // number of horizontal samples per scan
    vertical_channels,      // number of vertical channels (laser beams)
    horizontal_fov,         // horizontal field of view of the scan
    max_vert_angle,         // maximum vertical angle of the scan
    min_vert_angle,         // minimum vertical angle of the scan
    max_distance,           // maximum sensing range
    beam_shape,             // cross-sectional shape of the beam
    sample_radius,          // radius of ray samples used per beam
    vert_divergence_angle,  // vertical divergence angle of the beam
    hori_divergence_angle,  // horizontal divergence angle of the beam
    return_mode,            // return mode (e.g. strongest return)
    clip_near               // near clipping distance
);
lidar->SetName("Lidar Sensor");
lidar->SetLag(lag);                           // time between data collection and data availability
lidar->SetCollectionWindow(collection_time);  // time over which a full scan is accumulated
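For reference, the following illustrative values could be used for the parameters above. These are assumptions chosen for this example (loosely modeled on a 16-channel spinning lidar), not defaults prescribed by Chrono::Sensor.

// Illustrative parameter values (assumptions, not library defaults)
float update_rate = 10.f;                   // 10 full scans per second
unsigned int horizontal_samples = 2048;     // points per horizontal revolution
unsigned int vertical_channels = 16;        // 16 vertical channels
float horizontal_fov = 2.f * 3.14159265f;   // 360-degree horizontal scan (radians)
float max_vert_angle = 0.26f;               // about +15 degrees (radians)
float min_vert_angle = -0.26f;              // about -15 degrees (radians)
float max_distance = 100.f;                 // maximum range of 100 m
float lag = 0.f;                            // data available as soon as collected
float collection_time = 1.f / update_rate;  // scan accumulated over one full period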
Lidar Filter Graph
// Provides host-side access to the raw depth/intensity (DI) data
lidar->PushFilter(chrono_types::make_shared<ChFilterDIAccess>());
// Converts the depth data into an XYZI point cloud
lidar->PushFilter(chrono_types::make_shared<ChFilterPCfromDepth>());
// Adds noise to the point cloud (range, angular, and intensity perturbations)
lidar->PushFilter(chrono_types::make_shared<ChFilterLidarNoiseXYZI>(0.01f, 0.001f, 0.001f, 0.01f));
// Provides host-side access to the XYZI point cloud
lidar->PushFilter(chrono_types::make_shared<ChFilterXYZIAccess>());
// Renders the point cloud in a 640x480 visualization window
lidar->PushFilter(chrono_types::make_shared<ChFilterVisualizePointCloud>(640, 480, 2, "Lidar Point Cloud"));
// Register the lidar with the sensor manager
manager->AddSensor(lidar);
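Filters are applied in the order they are pushed, so the point-cloud filters must follow ChFilterPCfromDepth, and an access filter must precede any host-side read of the corresponding buffer. If the point cloud should also be written to disk, a save filter can be appended; the sketch below assumes Chrono::Sensor's ChFilterSavePtCloud and an illustrative output directory.

// Optionally save each completed point cloud to disk (directory name is an assumption)
lidar->PushFilter(chrono_types::make_shared<ChFilterSavePtCloud>("SENSOR_OUTPUT/lidar/"));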
Lidar Data Access
UserXYZIBufferPtr xyzi_ptr;
while (sys.GetChTime() < end_time) {  // simulation loop; 'sys', 'end_time', and 'step_size' are set elsewhere
    manager->Update();                // update the sensor manager (performs render steps as needed)
    sys.DoStepDynamics(step_size);    // advance the dynamics

    // Retrieve the most recent point cloud produced by ChFilterXYZIAccess
    xyzi_ptr = lidar->GetMostRecentBuffer<UserXYZIBufferPtr>();
    if (xyzi_ptr->Buffer) {
        PixelXYZI first_point = xyzi_ptr->Buffer[0];
        std::cout << "First Point: [" << first_point.x << ", " << first_point.y << ", "
                  << first_point.z << ", " << first_point.intensity << "]" << std::endl;
    }
}
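The raw depth/intensity data exposed by ChFilterDIAccess can be retrieved in the same manner. A minimal sketch, assuming the filter graph above (PixelDI holds the range and intensity of a single beam):

// Access the raw depth/intensity buffer made available by ChFilterDIAccess
UserDIBufferPtr di_ptr = lidar->GetMostRecentBuffer<UserDIBufferPtr>();
if (di_ptr->Buffer) {
    PixelDI first_beam = di_ptr->Buffer[0];
    std::cout << "First Beam: range = " << first_beam.range
              << ", intensity = " << first_beam.intensity << std::endl;
}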