Lidar For Environment Sensing In RoboTaxi

True autonomous driving requires human-like sensing of the environment so that a vehicle can operate with little or no human input, and the safety of passengers and passersby is the foremost priority. To that end, RoboSense delivers a comprehensive intelligent environment-perception lidar solution spanning chips, FPGAs, lidar hardware and artificial intelligence (AI) algorithms. Its lidar solution, RS-Fusion-P5, has been deployed in application scenarios such as self-driving logistics vehicles, buses and passenger cars, making autonomous driving for RoboTaxi safer and more efficient.

RoboSense RS-Fusion-P5 combines RS-Ruby and RS-BPearl sensors to provide a sensing range of more than 200 metres. A single RS-Ruby on top of the vehicle has a vertical field-of-view (FOV) from -25° to +15°, which leaves a small blind zone, less than 0.4 metres wide, around the vehicle body. Four RS-BPearl units are therefore embedded sideways around the vehicle to cover these blind zones and guarantee a complete 360-degree surround view.
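The need for the side-mounted units follows from simple trigonometry: the roof sensor's lowest beam leaves it at 25° below horizontal, so everything inside the circle where that beam first meets the ground is invisible to it. A minimal sketch, assuming a hypothetical mount height of 1.8 m (not stated in the article):

```python
import math

def ground_blind_spot_radius(mount_height_m: float, lower_fov_deg: float) -> float:
    """Radius on the ground that a roof-mounted lidar cannot see.

    The lowest beam leaves the sensor at `lower_fov_deg` below horizontal;
    everything inside the circle where that beam reaches the ground is unseen.
    """
    return mount_height_m / math.tan(math.radians(lower_fov_deg))

# Assumed 1.8 m mount height with the RS-Ruby's -25° lower FOV limit:
radius = ground_blind_spot_radius(1.8, 25.0)
print(f"{radius:.2f} m")  # → 3.86 m, measured from the sensor axis
```

The exact residual blind zone around the body depends on mount position and vehicle geometry; the point of the sketch is only that a roof-mounted lidar alone always leaves a near-field gap for the side-mounted RS-BPearl units to cover.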

RS-Fusion-P5 fuses the point clouds from RS-Ruby and RS-BPearl in real time, generating more than 4,600,000 laser points per second to identify obstacles all around the vehicle and position them precisely. Through advanced AI perception algorithms, synchronisation interfaces and multi-sensor fusion, it achieves a high level of safety and reliability.
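Mechanically, fusing the sensors' outputs means expressing every point in a common vehicle frame before concatenating the clouds. RoboSense's actual pipeline is proprietary; the sketch below only illustrates the idea, with made-up extrinsic parameters (a yaw angle plus a translation) for each sensor:

```python
import math

def transform_points(points, yaw_deg, tx, ty, tz):
    """Rotate points about the z-axis by yaw, then translate:
    sensor frame -> vehicle frame."""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    return [(c * x - s * y + tx, s * x + c * y + ty, z + tz)
            for x, y, z in points]

def fuse(clouds_with_extrinsics):
    """Concatenate per-sensor clouds after mapping each into the vehicle frame."""
    fused = []
    for points, (yaw, tx, ty, tz) in clouds_with_extrinsics:
        fused.extend(transform_points(points, yaw, tx, ty, tz))
    return fused

# Hypothetical extrinsics: a roof-mounted RS-Ruby and one side RS-BPearl.
ruby_cloud = [(10.0, 0.0, -1.5)]    # one point, in the roof sensor's frame
bpearl_cloud = [(0.5, 0.0, 0.0)]    # one point, in the side sensor's frame
fused = fuse([
    (ruby_cloud, (0.0, 0.0, 0.0, 1.8)),     # roof, 1.8 m up, no yaw
    (bpearl_cloud, (90.0, 0.0, 1.0, 0.5)),  # left side, yawed 90 degrees
])
```

A production system would additionally timestamp and motion-compensate each sweep before merging, but the frame transformation above is the core of multi-lidar fusion.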
RS-Ruby has 128 beams with a vertical resolution of 0.1°, targeting L4 and L5 autonomous driving, and fully meets the demands of high-speed operation. It not only satisfies the automotive-grade requirement of working from -40°C to 60°C but also achieves breakthrough performance in all-weather conditions. It resists mutual interference between multiple lidars by using special laser encryption technology to filter out interference signals, and it is unaffected by strong direct sunlight. Its lidar reflection intensity reaches a balance of consistency and distinction.
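To put the 0.1° figure in perspective, angular resolution translates into a physical spacing between adjacent beams that grows with range. A quick back-of-the-envelope calculation (assumed values, small-angle approximation):

```python
import math

def spot_spacing_m(range_m: float, angular_res_deg: float) -> float:
    """Linear separation between adjacent beams at a given range
    (arc length, valid for small angles)."""
    return range_m * math.radians(angular_res_deg)

# At the claimed 200 m sensing range, 0.1° channel spacing gives:
print(f"{spot_spacing_m(200.0, 0.1):.2f} m")  # → 0.35 m between beams

# Note: spread uniformly across the 40° vertical FOV (-25° to +15°),
# 128 channels would average 40 / 127 ≈ 0.31° per step, so the 0.1°
# figure presumably refers to a denser central band of beams.
```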

RS-BPearl is a short-range blind-spot lidar with a super-wide 360°×90° FOV for fine near-field detection. It can precisely identify objects around the vehicle body, such as pedestrians, pets, vehicles and roadbeds, as well as other details of the near-field ground area. It can also detect actual height information in particular scenarios, such as bridges, tunnels and culverts, to improve driving decisions and vehicle safety.

RoboSense also offers other solutions: RS-P1, a mature and reliable lidar perception solution for low-speed autonomous driving; RS-P2, a lidar perception solution for medium- to high-speed autonomous driving; and RS-Fusion-P3, a full-stack lidar perception solution for L3+ autonomous driving.

Dr Leilei Shinohara, vice president – R&D, RoboSense, says, “Driving safety is the most critical challenge for autonomous driving vehicles. Perception is the first sense with which autonomous vehicles perceive the surrounding environment.” The cold-resistant lidar maps the 3D world by emitting and receiving laser pulses, and with point cloud algorithms it can accurately recognise obstacles even in snow and ice. Lidar is therefore the most important sensor for bringing autonomous driving technology to extremely cold environments. According to RoboSense, its point cloud quality on rainy days also outperforms state-of-the-art alternatives, helping it cope with harsh environments and extreme conditions.