Finally, a Real-World Grading System for Autopilot Tech

Reported by WIRED:

You can’t buy an autonomous car today. You won’t be able to buy one tomorrow, or next month, or next year. Yes, self-driving tech is in development (and in the news), but nobody’s close to delivering a product that can take humans anywhere they want to go. Not even Tesla.

That may come as a surprise if you’ve browsed websites or glossy marketing materials filled with claims of cars driving themselves, relieving the driver of the mundane tasks of steering and braking. Or if you’ve heard Tesla CEO Elon Musk promise that with Version 9 of Tesla’s software, “We will begin to enable full self-driving features.” Or if you’ve seen recent news stories about a spate of crashes among cars using semi-autonomous features, some of them deadly.

It’s that kind of surprise that’s worrying the folks at the UK’s Thatcham Research, an influential nonprofit that assesses vehicle safety, similar to the IIHS in the US. With a new paper, “Assisted and Automated Driving Definition and Assessment,” Thatcham is urging carmakers to be more transparent about what their systems can and can’t do. And it’s making moves to prod them into doing so.

This summer, Thatcham will begin rating driver-assistance features like Tesla’s Autopilot. It will study how well they function and which situations give them trouble. But it will also consider how clearly they indicate when they’re active, how they monitor their human drivers, and how they are marketed. “It’s the Wild West out there, so we’re saying, let’s get some rules around this, because people’s lives are at stake,” says Matthew Avery, Thatcham’s head of research.

This is more than just hand-wringing. Thatcham’s rankings have teeth, with manufacturers pushing hard to meet its standards and earn the highest safety marks. Ratings influence consumer buying choices and the premiums that insurers charge. A bad rating can make a car much less attractive to buy.

Part of the problem, Thatcham says, is that these features come with names that don’t make clear what they are or what they do. Tesla calls its system Autopilot. Nissan offers ProPilot Assist, and Mercedes has Drive Pilot. Audi is gearing up to launch Traffic Jam Pilot (just not in the US). Cadillac has Super Cruise, BMW has Active Driving Assistant, and Volvo offers Adaptive Cruise Control with Steer Assist.

Tesla’s branding may be the catchiest, but Volvo’s is the most accurate, or at least the easiest to understand. That’s why Thatcham will start judging automakers on whether their system names are likely to confuse drivers. Names built on “pilot,” which implies the car is in control, will get marked down. Names built on “assist” will likely earn a higher grade, because they remind drivers that these systems are a helping hand, not an excuse to browse Instagram.

Elon Musk has said that among his customers who crashed while using Autopilot, confusion is less of a problem than complacency: Over time, they put too much trust in a system that needs their supervision—because it can’t, for example, spot a stopped fire truck. Avery says that’s probably not the whole picture. “There may also be customers who genuinely think the car is more capable than it actually is, and that’s the scary thing.” He says he wants manufacturers to “come clean” and sell their tech as assisted driving, rather than oversell it as any kind of autonomy.

Facing scrutiny of Autopilot, Tesla has always pointed out that its system requires the driver to accept that it’s in beta and agree to stay in control via an onscreen checkbox. It also gives a warning to “keep your hands on the wheel” every time it’s engaged. Over the past week, the company has started rolling out an over-the-air software update that cuts the time a driver can keep their hands off the wheel before a warning pops up, from three or four minutes to 30 seconds. When a driver complained on Twitter that the change makes the system annoying to use, Musk replied, “This is crux of matter: can’t make system too annoying or people won’t use it, negatively affecting safety, but also can’t allow people to get too complacent or safety again suffers.” (Tesla has long touted Autopilot as a safety feature, but no one has produced independent research showing the system saves lives.)

Thatcham says that many of these systems do make driving safer by limiting driver fatigue and reducing rear-end collisions. “We want people to use these systems, because we think they’re a safety benefit,” Avery says. But drivers need to understand what they’re using, how it works, and when it doesn’t.

