When a dozen small children crossed in front of our Tesla with “full self-driving,” I had good reason to be nervous.
I’d spent my morning so far in the backseat of the Model 3 using “full self-driving,” the system that Tesla says will change the world by enabling safe and reliable autonomous vehicles. I’d watched the software nearly crash into a construction site, try to turn into a stopped truck and attempt to drive down the wrong side of the road. Angry drivers blared their horns as the system hesitated, sometimes right in the middle of an intersection. (We had an attentive human driver behind the wheel during all of our tests, to take full control when needed.)
The Model 3’s “full self-driving” needed plenty of human interventions to protect us and everyone else on the road. Sometimes that meant tapping the brake to turn off the software, so that it wouldn’t try to drive around a car in front of us. Other times we quickly jerked the wheel to avoid a crash. (Tesla tells drivers to pay constant attention to the road and be prepared to act immediately.)
I hoped the car wouldn’t make any more stupid mistakes. After what felt like an eternity the kids finished crossing. I exhaled.
We were clear to make our turn. At first the car seemed overly hesitant, but then I noticed a bicyclist approaching from our left. We waited.
Once the bicyclist crossed the intersection, the car pulled up and made a smooth turn.
Over the past year I’ve watched more than a hundred videos of Tesla owners using “full self-driving” technology, and I’ve spoken to many of them about their experiences.
“Full self-driving” is a $10,000 driver-assist feature offered by Tesla. All new Teslas have the hardware to run the “full self-driving” software, but buyers must pay for the costly add-on to access it. The software is still in beta and is currently available to only a select group of Tesla owners, though CEO Elon Musk has claimed that a wider rollout is imminent. Musk has promised that in the near future, “full self-driving” will be fully capable of getting a car to its destination.
But it doesn’t do that. Far from it.
Tesla owners have described the technology as impressive but also flawed. One moment it’s driving perfectly, the next moment it nearly crashes into something.
Jason Tallman, a Tesla owner who documents his “full self-driving” trips on YouTube, offered to let me experience it first-hand.
We asked Jason to meet us on Brooklyn’s Flatbush Avenue. It’s an urban artery that funnels thousands of cars, trucks, cyclists and pedestrians into Manhattan. Even for experienced human drivers, it can be a challenge.
City driving is chaotic, with vehicles running red lights and pedestrians on nearly every block. It’s a far cry from the suburban neighborhoods and predictable highways around Tesla’s California offices, or the broad streets of Arizona, where Alphabet’s Waymo operates fully autonomous vehicles.
Cruise, GM’s self-driving company, recently completed its first fully autonomous rides in San Francisco. But they were conducted after 11 p.m., when traffic is light and few pedestrians or cyclists are present.
Brooklyn offered us a chance to see how close Tesla’s autonomous driving software was to replacing human drivers. It’s the sort of place where people drive because they have to, not a testing ground chosen for its proximity to a corporate headquarters. It’s where self-driving cars might have the biggest impact.
At one point we were cruising along in the right lane of Flatbush. A construction site loomed ahead. The car continued full speed ahead toward a row of metal fencing.
I felt deja vu as I recalled a video in which a Tesla owner slammed on the brakes after his car appeared set on crashing headlong into a construction site.
But this time I was sitting in the back seat. I instinctively threw up my right arm like the Heisman Trophy, as if to protect myself in a collision.
That was a moment when I wished “full self-driving” had been quicker to change lanes. At other times, I wished it would chill out on its aggressive turns.
“Full self-driving” sometimes makes jerky turns. The wheel starts to turn, then shifts back, before again turning in its intended direction. The staggered turns generally aren’t a bother on sweeping suburban curves, but in a dense city largely built before cars, they’re uncomfortable.
There’s also the braking, which can feel random. At one point, a car nearly rear-ended us after braking that caught me by surprise. Getting honked at was common. I never quite felt like I knew what “full self-driving” would do next. Asking “full self-driving” to navigate Brooklyn felt like asking a student driver to take a road test they weren’t ready for.
What “full self-driving” could do well was impressive, but the experience was ultimately unnerving. I can’t imagine using “full self-driving” regularly in a city. I noticed I was reluctant to ever look down at the Model 3’s dashboard, even to check our speed, because I didn’t want to take my eyes off the road.
Tesla owners routinely tell me how Autopilot, the highway-focused predecessor to “full self-driving,” makes their trips less stressful. They arrive at destinations feeling less fatigued. Some have told me they’re more likely to go on long road trips because of Autopilot.
But “full self-driving” felt like the opposite: I had to stay constantly on guard to keep the car from doing something wrong.
Ultimately, seeing “full self-driving” in Brooklyn reminded me of the importance of the finer points of driving, which are tough for an artificial intelligence-powered car to master. Things like pulling slightly into the intersection on a narrow road to make a left turn, so traffic behind you has room to pull around. “Full self-driving” just sat in place as frustrated drivers behind us honked.
For now, “full self-driving” seems closer to a party trick to show friends than a must-have feature.