Tesla isn’t backing away from its radical plans for self-driving cars.
CEO Elon Musk and other Tesla executives explained to investors Monday their plans to have more than a million full self-driving Teslas on roads next year.
The optimistic rhetoric contrasts with the rest of the autonomous-vehicle industry, which increasingly cautions that getting such cars onto streets en masse will be a long, hard road. Tesla has already forged ahead with its Autopilot software, which has been deployed in some vehicles since 2015 and handles minor driving tasks. Tesla views Autopilot as an incremental step toward full self-driving vehicles, and the software can be upgraded through over-the-air updates.
Tesla also stands out for its approach to building a safe self-driving car.
The company won’t rely on high-definition maps to guide its vehicles, there will be no geofenced boundaries that restrict where the cars can drive, and it’s not using a sensor nearly everyone else views as essential. That sensor, called LIDAR, is prized for its ability to tell a vehicle exactly how far away nearby objects are, whether a pedestrian, another car or an intersection.
Instead, Tesla will rely on cameras, radar and ultrasonic sensors to understand a car’s surroundings.
“LIDAR is a fool’s errand. Anyone relying on LIDAR is doomed,” said Musk, who described LIDAR as expensive and unnecessary.
LIDAR typically bulges from the roof of a self-driving car, resembling a spinning bucket of fried chicken. Early LIDAR sensors cost $75,000, but costs have come down as the industry develops and the sensors are made in larger quantities.
Tesla’s approach is controversial and has drawn public criticism from competitors.
“These vehicles that don’t have LIDAR and don’t have advanced radar, that haven’t captured a 3D map are not self-driving vehicles,” said Ken Washington, Ford vice president of research and advanced engineering, at an event earlier this month. “Let me just really emphasize that. They are consumer vehicles with a really good driver-assist technology.”
Tesla argues that if humans can drive with two eyes, machines should be able to drive with cameras.
Andrej Karpathy, Tesla senior director of artificial intelligence, described LIDAR Monday as a shortcut that gives self-driving car companies a false sense of progress.
“Is that person distracted and on their phone? Are they going to walk into your lane?” Karpathy said. “Those answers are only found in vision.”
He said Tesla is training an artificial-intelligence system to understand what’s happening on roads from camera images captured by Tesla’s fleet. He described how Tesla took an image of a bicycle mounted on the back of a vehicle and used it to search for similar images across the fleet. The search turned up many photos of bikes on the backs of cars, which Tesla then used to train its AI to distinguish between a bicycle being ridden on a street and a bicycle mounted to the back of a car. Understanding such distinctions is critical before cars can operate without human drivers.
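Tesla has not published how that fleet search works, but the general idea Karpathy described, using one seed image to pull visually similar images out of a large pool and turn them into labeled training data, can be sketched with off-the-shelf tools. The snippet below is only an illustration under that assumption: a generic pretrained network stands in for Tesla's own models, and the file and folder names are hypothetical.

```python
# A minimal sketch of mining training images by visual similarity.
# Assumptions: a generic pretrained CNN as the embedding model and a local
# folder of images; Tesla's actual pipeline and data are not public.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image
from pathlib import Path

# Pretrained backbone used purely as an image-embedding function.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # drop the classifier head, keep features
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def embed(path: Path) -> torch.Tensor:
    """Map one image to a unit-length feature vector."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        vec = backbone(img).squeeze(0)
    return vec / vec.norm()

# Query: the seed photo of a bicycle mounted on the back of a car (hypothetical file).
query = embed(Path("seed_bike_on_car.jpg"))

# Rank pool images by cosine similarity to the query and keep the closest
# ones as candidate examples for the "mounted bike" class (hypothetical folder).
pool = Path("fleet_images")
scored = [(float(query @ embed(p)), p) for p in pool.glob("*.jpg")]
candidates = [p for score, p in sorted(scored, reverse=True) if score > 0.8]
print(f"{len(candidates)} candidate images found for labeling")
```

In practice such candidates would still be reviewed and labeled before being folded into the training set; the sketch only shows the retrieval step.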
Experts say the technique is state of the art.
“It’s remarkable. It’s unbelievable technologically,” said Bryan Reimer, research scientist in the MIT AgeLab and the associate director of the New England University Transportation Center at MIT. “I don’t think anyone else in the industry is capable of doing that in near real-time.”
Reimer also cautioned that looming limitations could slow Tesla’s progress.
“Everybody else is betting the other direction,” Reimer said. “At some point we’ll know whether Musk is right or wrong, but it could be decades.”