Most car crashes are due to human error. Remove drivers, and the roads get safer.
Self-driving vehicles have a lot to master, though. They must “see” and “think” their way through real-world road situations, reacting to their surroundings. That might mean swerving around a child who suddenly runs into the street, or gauging whether to cross an intersection on a yellow light. And despite claims that roads will be less dangerous with bots in charge, polls show that people are nervous about riding in a driverless vehicle. But ready or not, driverless cars are coming soon to a street near you.
Seeing . . .
Take a glance at the primary sensors that enable driverless vehicles to know their surroundings.
LIDAR (light detection and ranging): LIDAR emits lasers that bounce off objects to measure distance. It’s especially good for long-range detection; some systems sense objects nearly 1,000 feet away. And these aren’t blurry images. LIDAR creates high-resolution 3-D maps of shapes, including pedestrians, curbs and other vehicles.
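The distance measurement behind LIDAR is simple time-of-flight arithmetic: a laser pulse travels out, bounces back, and the distance is half the round trip at the speed of light. A minimal sketch of that math (the function name and example timing are illustrative, not from any real LIDAR spec):

```python
# Time-of-flight sketch: distance is half the round trip at light speed.
SPEED_OF_LIGHT_FT_PER_S = 983_571_056  # ~3.0e8 m/s, expressed in feet

def lidar_distance_feet(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, in feet (hypothetical helper)."""
    return SPEED_OF_LIGHT_FT_PER_S * round_trip_seconds / 2

# A pulse returning after about 2 microseconds came from roughly 984 feet
# away -- near the ~1,000-foot range mentioned above.
print(round(lidar_distance_feet(2e-6)))  # prints 984
```

The takeaway: at light speed, even a 1,000-foot measurement completes in about two millionths of a second, which is why a spinning LIDAR unit can fire and resolve many thousands of pulses per second to build its 3-D map.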
Cameras: Cameras are the most cost-effective of the three sensor types. They have decent range and are good at spotting lane markings and traffic lights, but they’re less accurate than radar and LIDAR.
Radar: Radar uses radio waves to detect the position and speed of other objects. Police officers use it to catch speeding drivers, and air traffic controllers use it to track planes. As adapted for robo-vehicles, the technology can detect objects from a few feet away to hundreds of feet away. But radar can’t determine shapes.
Thinking . . .
Self-driving vehicles must quickly understand and interpret what they see. Software and algorithms make this happen. One way is through artificial neural networks, which are computer systems that mimic the human brain. These neural networks let robot cars understand, react and learn in real time from sensor data. This artificial intelligence is the mind of the self-driving vehicle.
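To make the neural-network idea concrete, here is a toy single-neuron example, not anything resembling a real driving stack: sensor readings are multiplied by weights, summed with a bias, and squashed into a brake/don’t-brake probability. In a real system, millions of such units exist and the weights are learned from data; the weights, function names, and numbers below are entirely made up for illustration.

```python
import math

def sigmoid(x: float) -> float:
    """Squash any number into the 0-to-1 range, like a neuron 'firing'."""
    return 1 / (1 + math.exp(-x))

def brake_probability(distance_ft: float, closing_speed_fps: float) -> float:
    """One artificial neuron: weighted sensor inputs -> braking decision.
    Weights and bias are hypothetical; a real network learns them."""
    w_distance, w_speed, bias = -0.05, 0.15, 1.0
    # Closer objects (small distance) and faster closing speeds push
    # the weighted sum, and thus the output, toward 1 ("brake").
    return sigmoid(w_distance * distance_ft + w_speed * closing_speed_fps + bias)

print(brake_probability(20, 30) > 0.5)   # close and closing fast: prints True
print(brake_probability(200, 10) > 0.5)  # far and slow: prints False
```

Training replaces the hand-picked weights here with values tuned across millions of examples, which is what lets the car “learn” rather than follow fixed rules.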
Levels of Automation
In 2014, SAE International, a professional association of engineers and technical experts, created the standard classification for vehicle autonomy levels. They range from a human in full control to the car in full control. Levels 1 and 2 include advanced driver assistance systems (ADAS), which offer a helping hand, like watching your blind spots. Levels 3 to 5 involve automated driving systems (ADS), upping the ante of bot control.
Level 0: You’re driving.
Level 1: You’re driving, but ADAS helps with acceleration and steering.
Level 2: You’re driving, but ADAS can fully control acceleration, braking and steering under some circumstances.
Level 3: ADS fully controls the car in some circumstances, but you’re ready to take over.
Level 4: ADS performs all driving functions in certain situations, and you don’t need to pay attention.
Level 5: The vehicle drives itself under all conditions; you’re simply a passenger.
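The six levels above amount to a simple classification, which can be restated as a lookup table. A sketch (the dictionary and helper function are illustrative, not part of the SAE standard):

```python
# The SAE autonomy levels, restated as a lookup table.
SAE_LEVELS = {
    0: "Human drives; no automation.",
    1: "Human drives; ADAS assists with acceleration or steering.",
    2: "Human drives; ADAS can control speed, braking and steering at times.",
    3: "ADS drives in some circumstances; human must be ready to take over.",
    4: "ADS drives in certain situations; human attention not required.",
    5: "ADS drives under all conditions; humans are passengers only.",
}

def requires_attentive_human(level: int) -> bool:
    """Through Level 3, a person must still be ready to drive."""
    return level <= 3

print(requires_attentive_human(2))  # prints True
print(requires_attentive_human(4))  # prints False
```

The key dividing line the table captures is between Levels 3 and 4: only at Level 4 and above can the human stop paying attention.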
Race to Market
By 2040, annual sales of self-driving cars could exceed 33 million. Currently, only cars up to Level 2 are available to the public. By 2021, BMW says it will be selling Level 3, and Ford says it will be selling Level 4 to consumers.
That’s all great, but regulations will need to keep pace. Since 2011, 21 states have passed autonomous vehicle legislation, and five others have governor-issued executive orders. In general, the laws limit the vehicles on public roads to Level 2, though some states have opened roads to researchers to test cars at Level 3 and higher. At the federal level, existing laws were written with human drivers in mind, but the U.S. Senate is considering a bill that would provide some regulatory framework for autonomous cars.
Antsy? Ride-sharing companies Lyft and Uber offer autonomous-car trips on the streets of Boston and Pittsburgh — with a driver at the wheel in case something goes wrong.
In the meantime, save your pennies. The first generation of truly driverless vehicles is expected to cost hundreds of thousands of dollars.
Would you be comfortable riding in a driverless car? Fifty-six percent of Americans say no, citing lack of trust and safety concerns, according to a 2017 Pew Research Center survey. In March, an Uber self-driving vehicle in Tempe, Arizona, hit and killed a woman crossing a street, further stoking public fear. Another study, using machine learning to analyze the language used in more than 1 trillion social media posts, found that Americans are most afraid of autonomous cars being hijacked by hackers.