Car News

Uber’s self-driving cars had a major flaw: They weren’t programmed to stop for jaywalkers

"Even the most junior human driver knows to expect that people sometimes walk outside of crosswalks."

An Uber self-driving car sits on display ahead of an Uber products launch event in San Francisco, California on September 26, 2019. Philip Pacheco/AFP/Getty Images

Software detected the woman almost six seconds before Uber’s self-driving car hit her, investigators say, in the crash that would lead to her death and prompt the ride-share giant to slam the brakes on its autonomous vehicle testing.

But the SUV didn’t try to stop until about a second before impact. One big reason: It wasn’t designed to recognize a pedestrian outside a crosswalk, according to documents released this week by the National Transportation Safety Board after a 20-month investigation.

Revelations that Uber failed to account for jaywalkers – with deadly results in Tempe, Arizona, in March 2018 – fuel long-standing objections from critics who accuse companies like Uber of rushing to deploy vehicles not ready for public streets. They’re skeptical that automakers eager to lead on industry-transforming technology are doing enough to avoid another tragedy as they continue to test cars out in the real world.


“Even the most junior human driver knows to expect that people sometimes walk outside of crosswalks,” said Jason Levine, executive director of the Center for Auto Safety, a District of Columbia-based nonprofit. “When we have public road testing that has programming that essentially chooses to ignore the realities of how people interact with public infrastructure, it’s really dangerous.”

The bigger question for Levine: Could similar serious oversights plague other models across the self-driving industry?

“The answer is, we don’t know, because there’s no requirements around how you program this technology before you put it on the road,” he said. He wants to see federal regulation.

Uber has made “critical program improvements” in the wake of Elaine Herzberg’s death, spokeswoman Sarah Abboud said in a statement. The company’s system is now able to handle scenarios like jaywalking in which people or bicyclists are not following road rules, she added, though human drivers may still need to intervene at times.

She declined to say how long Uber had been aware of the failure to recognize a pedestrian not in a crosswalk, and said the company is not commenting on specifics of the investigation because it is ongoing.

“We deeply value the thoroughness of the NTSB’s investigation into the crash and look forward to reviewing their recommendations once issued after the NTSB’s board meeting later this month,” Abboud said in a statement.


The crash that spurred the inquiry is thought to be the first death related to testing of autonomous vehicles. A human was in the driver’s seat but didn’t keep the car from plowing into 49-year-old Herzberg near a busy intersection, according to authorities. Hit as she crossed the street, Herzberg would die in a hospital.

Multiple issues contributed to the crash, the NTSB’s documents indicate. Herzberg would likely be alive if Uber hadn’t blocked its car from using a built-in automatic emergency brake, the board found, though it will not issue its decision on what caused the death until its meeting later in November.

But another major problem was software’s inability to identify a person in the car’s sights, and its resulting failure to predict how that person would move into the vehicle’s path. Uber’s system perceived Herzberg as a vehicle, a bicycle and an “other” object in the seconds before collision – but not a human being.

The death halted Uber’s testing as scores of companies, from Google to General Motors, explored self-driving technology. It drew strong reactions from many experts who warned of a need to hold vehicles deployed in public to stricter standards. But others argued that cars can’t be fine-tuned without real-world driving experience.


Nine months later, Uber resumed its test drives in Pittsburgh. The company hopes to bring its self-driving cars back to other cities such as San Francisco, Abboud said Wednesday, but is focusing right now on using human drivers to collect data on those particular locations that it can incorporate into its testing in controlled environments.

Arizona was particularly hospitable to companies looking to test their young autonomous tech in the years leading up to Herzberg’s death. Before that crash, Gov. Doug Ducey, a Republican, said he welcomed Uber’s self-driving cars “with open arms and wide open roads” while criticizing California for stifling such testing with “more bureaucracy and more regulation.”

Difficulties recognizing pedestrians aren’t unique to Uber’s self-driving cars, said Jamie Court, who as president of the group Consumer Watchdog has been critical of companies’ willingness to deploy the technology. Across the board, he said, many autonomous vehicles struggle to successfully navigate more than 10 or 20 miles without human intervention, though some standouts like Google’s car can make it much farther.

But the NTSB’s findings on Uber’s tech are especially alarming, he said.

“Robot cars would do well driving in a world of robots, but not on roads and crosswalks where human beings have the right of way,” Court said.
