
Even Elon Musk Abuses Tesla’s Autopilot

Reported by WIRED:

“Do you feel safe?” Lesley Stahl asked Elon Musk on Sunday’s episode of 60 Minutes, as the scene showed her riding on the freeway with Musk in a red Tesla Model 3. “Yeah,” the CEO answered, settling back into the driver’s seat, his hands clasped together over his stomach, after turning on the car’s semiautonomous driving system. “Now you’re not driving at all,” Stahl said, incredulously, looking over at his feet.

Musk went on to demonstrate the car’s new Navigate on Autopilot feature, which lets it change lanes by itself. Stahl’s wowed reaction—“Oh my goodness”—matches that of many people when they first see the Tesla take control of its steering and speed. But her questioning, trying to gauge Musk’s involvement in the driving process, highlights a significant issue Tesla faces as it rolls out ever more advanced Autopilot features.

A growing body of evidence makes clear that many drivers are confused about what the car can and can’t do. Tesla has repeatedly insisted—with spokesperson statements, driver manuals, and on-screen warnings in the car—that Autopilot is not an autonomous system. It doesn’t even see stopped firetrucks. The human is always responsible, and should keep their hands on the wheel. Yet, on one of the country’s most popular news programs, Musk risked compounding the confusion by clearly not even touching the steering wheel, and agreeing that he wasn’t driving. As he put it: “I’m not doing anything.”

Meanwhile, Musk continues to talk up Tesla’s goal of making its cars drive themselves in situations far beyond the highway, with no human oversight or involvement. And so he risks widening the gap between what the car seems to do and what it actually does.

Outfits like Waymo and GM’s Cruise are going straight for a fully autonomous system, testing their tech with trained safety drivers in carefully prescribed situations. Tesla has been adding abilities steadily via over-the-air software updates, telling its customers these features are in beta, and letting them have at it. The approach has its merits—they’re clever assistance features that could make driving safer if they’re used properly—but also relies on a driving public that understands the system’s limitations. That’s what makes Musk’s on-camera willingness to let go of the wheel look so unfortunate. (Tesla did not reply to a series of questions about the 60 Minutes interview.)

“His board of directors needs to slap him upside the head,” says Missy Cummings, who researches human and autonomous vehicle interaction at Duke University. “One of the biggest problems with Tesla is something called mode confusion—people don’t realize when a car is in one automated mode versus not.”

Ironically, part of the problem is the high quality of Tesla’s software. Over the weekend, I drove about 60 miles on the freeway in a Model 3, using Navigate on Autopilot. The new feature lets the car change lanes to pass slower vehicles and merge into the correct lane to take an exit. The car’s central display shows other cars around the Tesla in a cartoonish graphical representation. (Once, on surface streets, the car spotted a man on an electric scooter moving into a blind spot, noting him as a stick figure pedestrian.) The car requires the driver’s approval before changing lanes, indicating where it wants to go with a gray line on the screen. Once I signaled my approval with a tap of the gear or indicator stalk, the computer put on the blinker, waited for a gap in the traffic, moved over, and canceled the signal. It felt particularly futuristic when I reached my freeway exit (per the destination I entered in the navigation system). The car signaled, moved into the exit lane, slowed down, and made three “bong” sounds to tell me Navigate on Autopilot was turning off, all without my involvement.

Because it was a new feature, I stayed hyper alert, keeping at least one hand on the wheel and watching the mirrors to make sure the car stayed safe—just like you’re supposed to. But I know from experience with previous versions of Autopilot that such vigilance wears off quickly. When the car drives so capably, it’s easy to be lulled into a sense that the computer doesn’t need any help or supervision. It’s fine. It’s OK to glance away from the road for a moment, or a minute, or a few minutes.

“This is why it’s so dangerous,” says Cummings. “One of the things we know for sure is humans will immediately start not paying attention as soon as the car is doing a good enough job.”

The National Transportation Safety Board cited this sort of overconfidence in the automation as a factor in the first Autopilot fatality, which killed Josh Brown in Florida in May 2016. It has also been implicated in more recent deaths, and in the at least three incidents in 2018 alone in which Tesla drivers slammed into stopped fire trucks. And with the booming sales of the Model 3, more and more regular drivers, with little experience of automated systems and no training, will be acting as Tesla’s safety drivers. “The public doesn’t understand issues surrounding the technical limitations,” Cummings says.

That’s not holding Tesla back. Also on Sunday, Musk tweeted that the company is testing more Autopilot features in development software, including the ability to handle traffic lights and roundabouts.

For the foreseeable future, though, Tesla’s cars will require driver oversight, if not input. But despite Tesla’s recent efforts to make it harder for drivers to zone out, the man leading the charge doesn’t seem to have gotten the message.

Source: WIRED
