News > i3

LiDAR Lights the Way


You can't navigate the world without perceiving what's around you, through sight or some other means, and the same is true for cars. Besides high-definition maps that help determine precisely where a vehicle is, automakers are pursuing new ways a self-driving car can sense and react to the people and objects along its path. With enough supplier support, some of these systems could be on the road next year.

The newest tech in this realm is the LiDAR (light detection and ranging) sensor, and its utility isn’t limited to self-driving cars. It will soon be in vehicles that offer ADAS (advanced driver assistance systems) features such as pedestrian detection with automatic emergency braking. LiDAR is a complement to the radars and cameras found in ADAS-equipped cars today. 
 

Capabilities Matter, Along with Price

“It’s not about one sensor, it’s about combining all these sensors,” says Frederic Bruneteau, managing director and founder of Ptolemus Consulting Group in Brussels, Belgium. Together they form the right mix “that solves all the issues that you want to solve,” he adds.

“All three of the sensors have different strengths and weaknesses,” explains Shiv Patel, research analyst for smart mobility and automotive at ABI Research in Wellingborough, England. “Radar has got great range but poor resolution. The camera has poor range and great resolution but struggles to interpret distances to other vehicles, and also struggles in poor lighting conditions. And LiDAR sits in between the two. The range is not as good as the radar and the resolution is not as good as the camera, but it’s better than the camera in adverse conditions.” He says, “We are going to need all three sensor types in a self-driving car.”
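Patel’s three-way comparison lends itself to a simple scoring sketch. The ratings below are a rough, illustrative encoding of his remarks on a 1-to-3 scale, not measured specifications:

```python
# Rough encoding of Patel's comparison, scored 1 (poor) to 3 (great).
# The numbers are illustrative only, not measured sensor specifications.
SENSORS = {
    "radar":  {"range": 3, "resolution": 1, "adverse_conditions": 3},
    "camera": {"range": 1, "resolution": 3, "adverse_conditions": 1},
    "lidar":  {"range": 2, "resolution": 2, "adverse_conditions": 2},
}

def best_sensor(criterion):
    """Return the sensor that scores highest on a single criterion."""
    return max(SENSORS, key=lambda name: SENSORS[name][criterion])

print(best_sensor("range"))       # radar
print(best_sensor("resolution"))  # camera
```

No single row dominates every column, which is the quantitative version of Patel’s point that a self-driving car will need all three sensor types.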

LiDAR’s main challenge is pricing. Yet Patel says price drops are inevitable, noting that anti-lock braking systems (ABS) cost $8,000 apiece when they were introduced decades ago and now cost $50 each. LiDAR prices once ranged above $75,000 per unit; today they run from below $1,000 at the low end to the low tens of thousands of dollars at the high end. The current industrywide goal is $500 or less, and Patel expects prices for individual LiDARs to drop to $200 by 2022.

Of course, rising production of self-driving cars will also help bring down LiDAR prices through economies of scale.

According to ABI Research, “shipments” of self-driving cars (SAE Levels 3-5) are expected to surge between now and the end of the next decade: from 87,000 vehicles in 2018 to 270,350 in 2020, 7.76 million in 2025 and 37.744 million in 2030.

ABI forecasts the LiDAR market to surge over the same timeframe. From a negligible base this year, worldwide LiDAR shipments will reach 310,000 in 2020 and grow to 36.27 million in 2025. By 2030, ABI says, 170.97 million LiDAR sensors will ship globally.
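A quick back-of-the-envelope check shows how steep that forecast is. Applying the compound annual growth rate (CAGR) formula to ABI’s own figures, in millions of units:

```python
# Implied compound annual growth rates from ABI Research's forecast figures.
def cagr(start, end, years):
    """Compound annual growth rate between two values `years` apart."""
    return (end / start) ** (1 / years) - 1

# ABI's worldwide LiDAR shipment forecast, in millions of units
print(f"2020 to 2025: {cagr(0.31, 36.27, 5):.0%} per year")
print(f"2025 to 2030: {cagr(36.27, 170.97, 5):.0%} per year")
```

The implied growth rate is roughly 159 percent a year through 2025, moderating to about 36 percent a year in the back half of the decade.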

Partnerships among automakers, tier 1 (or primary) suppliers and sensor makers (tier 2 suppliers) will play a role in LiDAR proliferation and price declines, too. Patel points to Samsung’s DRVLINE autonomous driving and ADAS platform, which debuted at CES 2018. DRVLINE presents a choice of LiDAR sensors from Samsung’s partners (Quanergy, Innoviz, TetraVue and Oculii) in addition to a new forward-facing ADAS camera system created by Samsung and Harman.

 

Evolution to Revolution

“For any type of vehicle automation, the first thing you need to do is build a perception module,” says Dan Galves of Mobileye, an Intel unit. The module enables the vehicle to react to shapes and textures around it, including those of people, vehicles, obstructions, lane markings and traffic light colors. Cameras can detect both shapes and textures, but radar and LiDAR recognize only shapes.

Galves says, “We use cameras as the primary source of information for perception, and then use LiDAR and radar as a redundancy for shape. For texture, we use a high-definition map for redundancy.” All combined, “you can build a very accurate model of the environment around the vehicle,” he says.

To date, Mobileye has integrated LiDAR in demonstration vehicles exclusively, including one it displayed at CES 2018. But Galves says next year a “household name” automaker will launch the first production vehicle with a Mobileye LiDAR-based system for Level 3 autonomy.

Moreover, because the driver is always in control of the vehicle at Levels 1 and 2, “it limits the amount of redundancy [needed],” he says. “We don’t really believe that LiDAR is necessary for anything below Level 3.”

Despite the advances, it could be another five years before most LiDAR companies are ready for the mass market, says Dr. Christoph Schroeder, director of vehicle intelligence in the autonomous driving team at Mercedes-Benz Research & Development North America (MBRDNA) in Sunnyvale, CA. Suppliers are challenged to economically produce the internal chips needed to operate LiDARs, Schroeder says. A fully self-driving car requires at least four LiDARs to see all 360 degrees, he says.
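Schroeder’s figure of at least four LiDARs follows from simple field-of-view arithmetic. The sketch below is idealized: it ignores the overlap a real installation adds for redundancy, which is why the practical count runs higher than the geometric minimum for wide-angle units.

```python
import math

# Idealized minimum number of LiDARs needed to tile a full 360-degree view.
# Real installations overlap sensors for redundancy, so actual counts are higher.
def units_for_full_coverage(fov_degrees):
    return math.ceil(360 / fov_degrees)

print(units_for_full_coverage(90))   # 4 units at 90 degrees apiece
print(units_for_full_coverage(120))  # 3 ideal units at 120 degrees apiece
```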

Car vision and perception have advanced quickly since the auto industry first began working on self-driving cars 10 years ago. With cameras, it’s now possible to distinguish pedestrians from one another, and that’s “a huge step forward,” Schroeder says. Similar breakthroughs occurred with radars. Yet there’s still room for improvement with all three sensor types (camera, radar and LiDAR) and the deep learning software that works with them. MBRDNA and other automakers have dedicated teams working on car perception technologies “to make it robust and able to handle more situations,” he says.

Daimler AG and MBRDNA are pursuing parallel paths, Schroeder says. The first is evolutionary and leads to improved ADAS on vehicles that are in or nearing production. The second is revolutionary, leading towards fully self-driving cars. The flagship S-Class is an example of the former; a new generation introduced last year uses map and navigation data to enhance camera and radar sensing to better automate driving behavior in curves. Regarding the latter, Schroeder mentions MBRDNA’s joint venture with Bosch, announced last year.

Toyota is opening an automated vehicle test facility in Ottawa Lake, MI, in October. Under the auspices of the Toyota Research Institute, the facility is designed to “safely replicate demanding ‘edge case’ driving scenarios, too dangerous to perform on public roads.” But its focus is on ADAS, not autonomy.

BMW opened a new Autonomous Driving Center in Munich, Germany this year that houses 1,800 workers, including some BMW partners such as Intel and Mobileye.

Puzzle Pieces: Mechanical, Solid State & Hybrid

There are basically two kinds of LiDAR available to automakers: solid-state, which contain no moving parts, and mechanical, which contain rotating mirrors. But a third type is also on the market from one supplier — a solid-state “hybrid” LiDAR that spins on a ball bearing.

MBRDNA’s Schroeder contends that mechanical and hybrid LiDARs are useful in near-term applications because they’ve been available the longest but are vulnerable to “wear and tear.” So the future belongs to solid-state, which “probably will work in 10 years as it does now,” he says.

That’s a sentiment echoed by other insiders. Mechanical LiDARs offer the range, resolution and field of view needed to get driverless vehicles on the road soonest, and in the mobility-as-a-service (MaaS) fleet market that will matter more than high unit prices of $4,000 and up, asserts ABI Research’s Patel. He expects Waymo to be among the earliest adopters of mechanical LiDARs, in a robotaxi service launching this year.

Conversely, Patel expects solid-state LiDARs priced below $500 to roll out en masse in personally-owned semi-autonomous and autonomous (SAE Levels 3-5) vehicles in the 2020-2025 timeframe. He notes, however, that one LiDAR-equipped Level-3 car is already available: the 2018 Audi A8, which incorporates a mechanical unit made by Valeo. (The Valeo SCALA laser scanner was demonstrated at CES 2015.)

ABI Research calculates combined worldwide shipments of mechanical and solid-state LiDARs will be negligible this year. But it predicts shipments of solid-state LiDARs will shoot from 240,000 in 2020 to 35.24 million in 2025 and 158.94 million in 2030. By comparison, ABI Research forecasts shipments of mechanical LiDARs of 70,000 in 2020, 1.03 million in 2025 and 12.03 million in 2030.

In May, LiDAR startup Innoviz announced it will furnish its proprietary solid-state sensors to the BMW Group via a partnership with Magna, one of the world’s largest automotive technology suppliers. The automaker will use a Magna autonomous driving technology platform that comprises LiDAR, radar and other sensors in self-driving cars it plans to sell to consumers in 2021. The InnovizOne LiDARs are small enough to be integrated into a car’s front grille, and are mated to Innoviz’s own computer vision software stack and algorithms for object detection and classification.

Besides cost and size, a big challenge for LiDAR makers has been performance in bright sunlight conditions and near other LiDARs’ light beams, says Omer David Keilaf, CEO of Innoviz, based in Israel. He says Innoviz has overcome this through its proprietary optical design, microelectromechanical systems (MEMS), detectors and signal processing.

Two ways to distinguish one LiDAR from another are the wavelength of its laser beam (sub-1,000 nanometers or 1,550 nanometers) and the way that beam is steered (mechanically, or via the solid-state alternatives MEMS, Flash and OPA, or optical phased array).
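Those two axes form a small taxonomy that every LiDAR in this article fits into. A minimal sketch, where the 905 nm value is a common sub-1,000 nm design wavelength used purely as an illustration:

```python
# Sketch of the two-axis taxonomy described in the text: wavelength band
# and beam-steering class. Category names follow the article.
STEERING_TYPES = {"mechanical", "MEMS", "Flash", "OPA"}

def classify(wavelength_nm, steering):
    """Place a LiDAR design on both axes of the taxonomy."""
    if steering not in STEERING_TYPES:
        raise ValueError(f"unknown steering type: {steering}")
    band = "sub-1000 nm" if wavelength_nm < 1000 else "1550 nm"
    kind = "mechanical" if steering == "mechanical" else "solid-state"
    return band, kind

print(classify(905, "MEMS"))         # ('sub-1000 nm', 'solid-state')
print(classify(1550, "mechanical"))  # ('1550 nm', 'mechanical')
```

By this scheme, the InnovizOne and Velodyne’s Velarray land in the sub-1,000 nm solid-state cell, while Valeo’s SCALA in the Audi A8 is mechanical.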

Innoviz pursued a sub-1,000 nm wavelength because it doesn’t require a cooling mechanism to be built into the LiDAR and can be built with standard silicon chips, which yields major cost reductions compared with 1,550 nm LiDARs, Keilaf says. The company went with MEMS because it is a mature solid-state technology and the least expensive to produce at mass scale.

Velodyne LiDAR Inc., based in San Jose, has been in the LiDAR field for 10 years, says CTO Anand Gopalan, and it, too, has chosen sub-1,000 nm MEMS to meet automotive requirements at competitive price points with the ability to scale to large volumes. But for one of its two LiDAR designs, named Puck, Velodyne patented placing the solid-state sensor on a rotating spindle to achieve a unique 360-degree field of view. For the other, named Velarray, the company devised a “frictionless beam steering technology” that produces a narrower 120-degree field of view.

While the Puck is optimal for self-driving cars, the Velarray is oriented to ADAS applications like automatic emergency braking and blind spot monitoring. Gopalan says there are four Velodyne LiDAR models, dubbed 16, 32, 64 and 128 for the number of lasers they contain. The 16 and 32 can also address ADAS functions such as automatic emergency braking, blind spot detection and collision avoidance, on top of self-driving at less than 35 miles per hour.

Gopalan anticipates the first Level 3 production cars with Velarray LiDARs, and the Puck 16 for ADAS, to be available in 2020 or 2021, along with Puck 32-, 64- and 128-equipped Level 4 or 5 cars.
 

Are LiDARs Needed?

Jada Tapley, vice president of advanced engineering at Aptiv PLC, a tier 1 supplier of automotive active safety technologies, says there’s no match for LiDAR’s accuracy in identifying objects like a truck’s tire tread on the highway. “That’s why we are a firm believer in the need for LiDAR,” Tapley adds.

But while Aptiv makes its own cameras and radars for automakers, it has invested in three LiDAR manufacturers to source the product: Innoviz and LeddarTech in 2017, and Quanergy in 2016.

“What you’re seeing is an optimization puzzle,” states John Buszek, director of ADAS and autonomous driving for Renesas Electronics America, an automotive semiconductor maker based in Farmington Hills, MI. It’s founded on “taking raw sensor data and figuring out what the world around the vehicle is, and where that data really surges in is at that perception level,” Buszek says. “This autonomous driving business is not short of areas of innovation. It’s all over the place.”

Robert E. Calem
