San Francisco (CNN) The self-driving car industry has an irresistible sales pitch: Its technology can prevent car crashes and save thousands of lives. But there's one nagging problem. No one has proven the utopian promise will come true.
Still, the self-driving car industry continues to push forward. Waymo, the self-driving arm of Google's parent company, already has vans in Chandler, Arizona, that operate without a human behind the wheel. General Motors talks of doing the same in 2019. Ford is planning for a similar approach in 2021.
But experts believe the industry must change course and adopt the safety practices of another sector: aviation. Today, self-driving companies work independently and with little input from competitors or governments.
Silicon Valley self-driving car startup Zoox is looking toward aviation's collaborative model for inspiration. Government regulators, including the Federal Aviation Administration and NASA, work with aviation industry groups and companies such as Boeing and Airbus to share anonymous flight data and identify hazards before they turn deadly. They focus on risks like turbulence, engine failures and planes skidding off runways.
The aviation industry also has an intense certification process to prove new technologies are safe. The FAA and aviation companies work together to agree on plans to test technologies. The entire process can last up to eight years.
The cautious approach is working. No commercial passenger jets crashed in 2017, and airline crashes have declined since the early '90s even as more people fly.
The US government has long relied on the auto industry to certify that its vehicles meet safety standards, but Mark Rosekind, Zoox's safety innovation officer, is among the experts who say that approach doesn't work for cars that drive themselves. Autonomous vehicles will tackle even harder technical challenges than planes, such as identifying and avoiding a child standing at a street corner who may run into the road. An airplane's autopilot system doesn't have to steer around unpredictable pedestrians when flying at 30,000 feet.
"Nobody's done this before," said Rosekind of proving self-driving cars are safe. "Yeah, stuff's been done in aviation and stuff's been done in artificial intelligence, but nobody's brought all of this together for the solution we're looking for."
Before joining Zoox, Rosekind was the top official at the National Highway Traffic Safety Administration, tasked with protecting Americans on the roads. He was drawn to Zoox and self-driving cars because he views the technology as the first new tool to make a significant difference in auto safety in 100 years.
"People have to be both understanding and tolerant that there's a huge opportunity here," Rosekind told CNN Business at Zoox's Foster City, California, offices. "We have to figure out the right way to get there. It's not like it's going to be from zero to one -- it's horrible, but now it's perfect."
Zoox, which was founded in 2014, is designing its own self-driving car from scratch rather than retrofitting existing cars and trucks like its competitors Waymo, GM and Ford.
Self-driving car advocates will likely recall a popular stat: 94% of car crashes are because of human error. Consider the improvements once humans aren't driving.
"People love to say, we're going to get rid of that [94%]. Prove it," Rosekind said. "How does a company show that its sensors and artificial intelligence will eliminate those crashes?"
Some in the industry like to point to how many miles they've driven in self-driving cars. In October, Waymo announced it hit the 10 million mile milestone. But Rosekind believes the industry needs a better test. Driving several million miles with an old version of the software, he says, may not reveal much about how safe the current version is.
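A rough calculation shows why raw mileage is such weak evidence. The sketch below is illustrative only, assuming a Poisson failure model and a US human-driver fatality rate of roughly 1.1 per 100 million miles; neither figure comes from the article.

```python
import math

# Assumed human-driver fatality rate: ~1.1 deaths per 100M miles
# (a commonly cited US figure; an assumption, not from the article).
HUMAN_FATALITY_RATE = 1.1e-8  # fatalities per mile
CONFIDENCE = 0.95

# If fatalities follow a Poisson process at the human rate, the chance
# of seeing zero fatalities in n miles is exp(-rate * n).  To claim, at
# 95% confidence, that an AV is at least as safe as a human driver, we
# need exp(-rate * n) <= 1 - CONFIDENCE, i.e. n >= ln(20) / rate.
required_miles = math.log(1 / (1 - CONFIDENCE)) / HUMAN_FATALITY_RATE
print(f"Fatality-free miles needed: {required_miles / 1e6:.0f} million")
```

Under these assumptions, a fleet would need on the order of 270 million fatality-free miles just to match the human baseline statistically, far beyond the 10 million Waymo announced, which is one way to read Rosekind's skepticism about mileage as a metric.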
The challenge of proving self-driving cars are safe is so significant that even people outside the industry are getting involved. DARPA, the Defense Department research agency, is now sponsoring research to determine how to certify that self-driving vehicles, and all autonomous systems, are really safe.
"The methods for validating correctness just doesn't exist yet. That's what we're changing," Sandeep Neema, who leads the Assured Autonomy program said.
His program began this year and will run for four years. By that time, self-driving cars may already be somewhat common.
Neema's researchers are working to find solutions as quickly as possible. One researcher, Matt Carrico, a fellow at the aviation company Rockwell Collins, believes the auto industry must follow in aviation's footsteps and establish measures for safety.
Like Rosekind, he's skeptical of today's popular metrics, such as counting how many miles a self-driving car has covered.
"You can drive million of miles. But what does that tell you about unexpected behaviors? What actually happened in those millions of miles? Were they boring? Or were you actually looking at all the [unusual] cases and pushing the envelope to explore what might go wrong," Carrico said.
Aviation safety experts say their field has been so safe because of how closely it works with competitors and regulators.
"Boeing works with Airbus and Gulfstream and Bombardier all the time. They meet, they talk about issues. they're on industry working groups, all focused on safety," said David Silver, vice president for civil aviation at Aerospace Industries Association. "We understand our whole system is based on public confidence."
Teamwork isn't the norm among autonomous vehicle developers, but companies like Zoox seek to change that. Rosekind has called for his former home, NHTSA, to be a gathering place for the industry to talk and work together on safety, just as the FAA is for the aviation industry. But so far that hasn't happened.
Safety is critical because the self-driving industry risks a backlash if things go wrong. Autonomous cars have another thing in common with planes: Passengers don't have control, so the public is less accepting of deaths.
Any crash may change how people feel about self-driving vehicles. Public comfort with autonomous vehicles dipped after a self-driving Uber struck and killed a pedestrian in Arizona last year.
"It's not necessarily fair," Rosekind said. "It's people judging. I have control and I'm a really great driver, but I don't know about that robot."