
Attorney Amy Witherite Applauds Return of Human Drivers to Autonomous Trucks on Texas Highways
Attorney and traffic safety advocate Amy Witherite is welcoming what she sees as a long-overdue injection of common sense into the ongoing rollout of autonomous trucking technology on Texas highways. In a surprising reversal, Aurora Innovation—one of the leading developers of autonomous trucking systems—announced that it would resume placing human drivers inside its self-driving trucks operating between Dallas and Houston. The decision came less than three weeks after the company revealed plans to allow its vehicles to travel at speeds of up to 75 miles per hour on public highways without anyone inside the cab.
“Common sense has finally prevailed,” said Witherite, a prominent personal injury attorney and recognized expert on traffic safety. “We cannot ignore the real dangers these driverless trucks pose—especially on highways as busy and unpredictable as those in Texas.”
Witherite has spent months raising concerns about the rapid, and often unregulated, advancement of autonomous trucking technology. While companies like Aurora Innovation, Waymo, and TuSimple tout the safety, efficiency, and cost-saving benefits of driverless trucks, Witherite argues that the push to remove human oversight from the equation is premature and dangerous. Her central concern: the technology is being tested and deployed in real time—on roads shared by unsuspecting motorists—without a proper regulatory framework or safety assurances in place.
A Regulatory Void
One of the key issues, Witherite contends, is the lack of up-to-date federal safety regulations governing autonomous vehicle operations. “There has been nowhere near the amount of testing required to show these trucks can operate safely in the challenging environment on Texas highways,” she said. “With billions of dollars at stake and little or no government regulation, federal and state officials are letting the fox guard the henhouse by allowing for-profit companies to determine whether their technology is safe.”
Indeed, the current rules are outdated. Federal oversight of motor vehicle safety is primarily managed by the National Highway Traffic Safety Administration (NHTSA), but the agency’s regulations—the Federal Motor Vehicle Safety Standards (FMVSS)—were developed decades ago and are largely tailored to traditional vehicles operated by human drivers. As a result, there is a significant regulatory gap when it comes to vehicles equipped with self-driving systems.
Even the Autonomous Vehicle Industry Association, a trade group representing the companies behind this new technology, has acknowledged the need for clearer national standards. In a public statement, the organization noted, “Only the federal government can uniformly regulate the design, construction, and performance of the vehicle.” That admission underscores a growing consensus—even within the autonomous vehicle industry—that more oversight is needed.
However, the agencies tasked with providing that oversight appear ill-equipped to do so. According to reporting from The Washington Post, the specialized team at NHTSA dedicated to autonomous vehicle policy has seen a significant reduction in staffing, much of it attributed to federal budget cuts implemented during the Trump administration. The result is a shortage of resources and expertise at the very moment the autonomous vehicle sector is expanding rapidly.
The Stakes Are Higher for Trucks
The need for caution becomes even more urgent when the conversation shifts from self-driving passenger cars to heavy-duty trucks. “The amount of damage that can be caused by an 80,000-pound tractor-trailer far exceeds the severity of accidents involving cars and SUVs,” said Witherite. “Texas highways are often crowded and full of complex and unpredictable situations—from sudden weather changes and lane closures to ongoing construction zones and erratic drivers.”
Witherite argues that human drivers bring a level of judgment and adaptability that artificial intelligence has not yet been able to match. Even the most sophisticated sensors and algorithms may struggle to respond appropriately in rapidly evolving traffic conditions, especially when unexpected elements like debris on the road, confused pedestrians, or erratic vehicle behavior are involved.
While autonomous driving systems rely on a combination of radar, lidar, GPS, and machine learning models to interpret their surroundings and make driving decisions, they still lack the intuitive reasoning and ethical decision-making capabilities of a trained human driver. And while automation can potentially reduce certain types of collisions—like those caused by drowsy or distracted driving—it also introduces new categories of risk, such as software bugs, sensor malfunctions, and cybersecurity threats.
Industry Pushback
What ultimately led Aurora Innovation to reverse its decision wasn’t government intervention, but rather pushback from its own industry partners. PACCAR Inc., the global truck manufacturing giant that supplies many of the vehicles used in Aurora’s fleet, reportedly insisted that human operators remain inside the trucks during testing and commercial operations. This move was widely seen as a rebuke to the idea that driverless trucks are ready for full autonomy on open highways.
“With so little regulation, we’ve had to rely on the manufacturers themselves to step in and put the brakes on some of the more reckless decisions,” said Witherite. “It shouldn’t take corporate pressure to do what’s clearly in the public’s best interest.”
PACCAR’s insistence on having safety drivers in the cab marked a pivotal moment in a broader debate about the pace and direction of autonomous vehicle development. It suggests that even some of the most invested stakeholders are not yet fully confident in letting these vehicles operate independently.
Public Accountability and Legal Support
Witherite hopes this recent reversal becomes a turning point—one that shifts the conversation away from hype and toward meaningful oversight and public accountability. She calls for greater transparency from companies testing autonomous vehicles, more rigorous safety validation protocols, and a clearly defined path to regulatory approval.
“Technology can be a powerful tool for improving safety and efficiency,” Witherite said. “But we cannot put innovation ahead of accountability. We need thoughtful, enforceable regulations that prioritize the well-being of everyone on the road.”
Her law firm, Witherite Law Group, has long advocated for victims of serious vehicle crashes, including those involving commercial trucks. As autonomous technology becomes more prevalent, the firm is increasingly taking on cases that raise questions about software failures, lapses in operator oversight, and unclear liability. Witherite emphasizes that people involved in collisions with autonomous vehicles face unique legal challenges—especially when it is unclear whether fault lies with the software developer, the fleet operator, or the vehicle manufacturer.
“We are entering a new legal frontier,” she noted. “And the victims of these accidents shouldn’t be left to figure it out on their own. They deserve support, answers, and justice.”
As autonomous trucking technology continues to evolve, the conversation around its deployment on public roads is far from over. For advocates like Amy Witherite, the return of human drivers to Aurora’s trucks is a step in the right direction—but it’s only the beginning.
“We can only hope that regulators, lawmakers, and the public continue to demand this commonsense approach as the technology develops,” she said. “There’s too much at stake to get this wrong.”
Until there are clearer rules, stricter testing protocols, and broader consensus on how to safely integrate autonomous trucks into the transportation ecosystem, Witherite believes the best path forward is a cautious one—with human eyes still on the road and hands close to the wheel.