ANN ARBOR, Mich. — Amid rising concerns regarding the safety of autonomous vehicles, a prominent specialist advocates for the establishment of a national driver’s test that these vehicles must pass before being allowed on public roadways.
Such a test would establish baseline standards to ensure that these vehicles exhibit fundamental skills and abilities when interacting with various traffic scenarios, according to Henry Liu, director of the University of Michigan’s autonomous vehicle testing facility.
“Ensuring safety is crucial for consumers, the manufacturers of autonomous vehicles, and the federal government as well,” Liu stated in an interview. “It is the federal government’s role to help determine minimum standards and provide guidance on safety assessments.”
In recent years, self-driving vehicles have been linked to notable crashes, and public surveys reveal widespread unease about their safety. Liu said effective testing of a vehicle's performance across diverse traffic scenarios would help bolster public trust in the technology.
While Liu acknowledged that further research is necessary before these vehicles can be safely deployed across the country, he concurred with manufacturers that self-driving technology has the potential to enhance safety and improve the efficiency of the nation’s transportation systems in the long run.
Currently, self-driving vehicles are not subject to any specific federal regulations, and only a handful of states have imposed their own requirements. The National Highway Traffic Safety Administration (NHTSA), a division of the Department of Transportation, has been collecting data on crashes involving autonomous vehicles, but has published only voluntary recommendations that do not include a practical driver testing component.
Messages seeking comment were left with the Transportation Department. Self-driving vehicles must still comply with the federal safety standards that apply to all passenger cars, a framework under which the government typically investigates only after serious crashes have occurred.
“Our existing vehicle safety regulations are largely reactive, relying on self-regulation,” Liu explained.
At the University of Michigan testing facility, Liu oversees a simulated city known as Mcity, which features various elements like traffic lights and roundabouts that are utilized for testing self-driving vehicles by companies and government agencies.
Liu emphasized the necessity of having either a formal regulation or a recommended test, stating, “We want to avoid creating potential dangers for the public.” On Tuesday, he also announced that Mcity can now be accessed by researchers remotely.
He proposed that a driver’s test should evaluate if a self-driving vehicle can execute a left turn at an intersection without the aid of a traffic light displaying a green arrow. Furthermore, it should confirm that the vehicle comes to a complete stop at a stop sign and is capable of recognizing and yielding to a small pedestrian crossing the street.
According to Liu, such a test would ensure poorly functioning autonomous vehicles are not allowed to operate in public spaces, similar to how human drivers must pass a driving test to be deemed competent. However, he acknowledged that no assessment could eliminate all crashes involving self-driving technology.
The implementation of such driving tests, he argued, would also help developers of autonomous vehicles by reducing opposition from cities when companies seek to deploy their technology in new areas.
Tesla CEO Elon Musk has frequently criticized federal regulations, claiming they stifle innovation. Tesla is developing its “Full Self-Driving” robotaxi system, though those vehicles cannot drive themselves without a human ready to intervene at all times.
Liu posited that establishing basic driving standards would actually stimulate innovation and facilitate the rollout of autonomous vehicles. If companies are confident enough to deploy their technology on a large scale, he suggested, passing a basic competency test should be relatively easy for them.
“Why should this be seen as an obstacle to deployment?” he inquired.
Liu also noted that Europe and China already conduct basic tests that subject autonomous vehicles to third-party evaluations, while the U.S. continues to rely on self-certification from manufacturers.
Liu is advocating for the implementation of a driver’s test now because advancements in “machine learning” technology are enabling autonomous vehicles to make better decisions on the road. He anticipates that these vehicles will be more widely adopted in the U.S. within the next five to ten years.
“Large-scale deployment is on the horizon, and that’s why immediate action from the federal government is essential,” Liu asserted.
Currently, companies like Waymo, a subsidiary of Alphabet Inc., are transporting passengers in vehicles without human drivers in Phoenix and other areas. General Motors’ Cruise division had been operating robotaxis in San Francisco until its recent crashes halted the service.
Furthermore, Aurora Innovation plans to commence freight transport using fully autonomous trucks on Texas highways by the end of this year, while another autonomous cargo company, Gatik, aims for autonomous freight operations by the close of 2025.
Among the notable incidents involving self-driving vehicles are the 2018 crash of an Uber autonomous SUV with a human backup driver that killed a pedestrian in Arizona, and a serious-injury crash in which a Cruise autonomous vehicle struck a pedestrian who had first been hit by another car.