In chrono::sensor::ChLidarSensor, the synthetic data is generated via GPU-based ray tracing, leveraging the hardware-accelerated, headless rendering capabilities provided by the NVIDIA OptiX library. For each lidar beam, a group of rays is traced to sample that beam; the number of samples and the beam divergence angle are set by the user. The entire frame/scan of the lidar is processed in a single render step. To account for the time difference of rays across the scan, keyframes and motion-blur techniques are used: with these keyframes, each beam in the scan traces the scene at a specific time, reproducing the motion of both the objects and the lidar. The intensity returned by the lidar beams is based on diffuse reflectance.
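As a sketch of what this means (assuming the standard Lambertian diffuse model; the exact shading used should be checked against the installed Chrono version), the returned intensity scales as

I ∝ ρ · cos(θ)

where ρ is the surface's diffuse reflectance and θ is the angle of incidence between the beam and the surface normal.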
Creating a Lidar
auto lidar = chrono_types::make_shared<ChLidarSensor>(
    my_body,             // body to which the lidar is attached
    update_rate,         // scanning rate in Hz
    offset_pose,         // offset pose relative to the body
    horizontal_samples,  // number of horizontal samples per scan
    vertical_channels,   // number of vertical channels
    horizontal_fov,      // horizontal field of view
    max_vert_angle,      // maximum vertical angle of the scan
    min_vert_angle,      // minimum vertical angle of the scan
    100.0f);             // maximum lidar range
lidar->SetName("Lidar Sensor");
lidar->SetLag(lag);                           // time between data collection and data availability
lidar->SetCollectionWindow(collection_time);  // time over which a full scan is collected
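The per-beam sampling described above can be configured through optional constructor arguments. The sketch below follows the lidar demo shipped with recent Chrono versions; the beam shape, sample radius, divergence angles, and return mode shown here are assumptions that should be checked against the installed API:

auto multi_sample_lidar = chrono_types::make_shared<ChLidarSensor>(
    my_body, update_rate, offset_pose,
    horizontal_samples, vertical_channels,
    horizontal_fov, max_vert_angle, min_vert_angle,
    100.0f,                              // maximum lidar range
    LidarBeamShape::RECTANGULAR,         // cross-sectional shape of each beam
    2,                                   // sample radius: number of ray samples across the beam
    0.003f,                              // vertical divergence angle (radians)
    0.003f,                              // horizontal divergence angle (radians)
    LidarReturnMode::STRONGEST_RETURN);  // which sampled return to report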
Lidar Filter Graph
// Provide access to the raw depth/intensity (DI) data
lidar->PushFilter(chrono_types::make_shared<ChFilterDIAccess>());
// Convert the depth data to an XYZI point cloud
lidar->PushFilter(chrono_types::make_shared<ChFilterPCfromDepth>());
// Add noise to the point cloud (stdev of range, vertical angle, horizontal angle, intensity)
lidar->PushFilter(chrono_types::make_shared<ChFilterLidarNoiseXYZI>(0.01f, 0.001f, 0.001f, 0.01f));
// Provide access to the XYZI point cloud data
lidar->PushFilter(chrono_types::make_shared<ChFilterXYZIAccess>());
// Visualize the point cloud in a 640x480 window
lidar->PushFilter(chrono_types::make_shared<ChFilterVisualizePointCloud>(640, 480, 2, "Lidar Point Cloud"));
manager->AddSensor(lidar);
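Additional filters can be pushed in the same way. For example, assuming the ChFilterSavePtCloud filter provided by Chrono::Sensor (the output path here is illustrative), each completed scan can be written to disk:

// Save each point cloud scan to disk (path is illustrative)
lidar->PushFilter(chrono_types::make_shared<ChFilterSavePtCloud>("SENSOR_OUTPUT/lidar/"));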
Lidar Data Access
UserXYZIBufferPtr xyzi_ptr;
while (true) {  // simulation loop (replace with the application's run condition)
    // Retrieve the most recent point cloud buffer from the lidar
    xyzi_ptr = lidar->GetMostRecentBuffer<UserXYZIBufferPtr>();
    if (xyzi_ptr->Buffer) {
        // Print the first point in the point cloud
        PixelXYZI first_point = xyzi_ptr->Buffer[0];
        std::cout << "First Point: [ " << first_point.x << ", " << first_point.y << ", "
                  << first_point.z << ", " << first_point.intensity << " ]" << std::endl;
    }
}
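Beyond the first point, the entire scan can be traversed through the buffer's dimensions. The sketch below assumes the Width and Height members defined on Chrono::Sensor user buffers and requires <cmath> for std::sqrt:

// Compute the mean range over all points in the most recent scan
if (xyzi_ptr->Buffer) {
    unsigned int num_points = xyzi_ptr->Width * xyzi_ptr->Height;
    float mean_range = 0.f;
    for (unsigned int i = 0; i < num_points; i++) {
        PixelXYZI p = xyzi_ptr->Buffer[i];
        mean_range += std::sqrt(p.x * p.x + p.y * p.y + p.z * p.z) / num_points;
    }
    std::cout << "Mean range: " << mean_range << std::endl;
}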