
Attorney Amy Witherite Raises Alarm Over First Responders’ Concerns About Driverless Cars
As autonomous vehicle technology continues to advance, a growing number of public safety officials and legal experts are voicing serious concerns about the risks posed by driverless ride-hailing services. One of the most vocal critics is Amy Witherite, a prominent attorney and traffic safety expert, who warns that the rapid deployment of these vehicles without adequate regulation could have life-threatening consequences.
Witherite, founder of the Witherite Law Group, points to a recently released report by fire, police, and transportation departments in San Francisco that outlines numerous hazards associated with autonomous vehicle operations, particularly those operated by Cruise and Waymo. These two companies, which are among the leading players in the self-driving car industry, have been expanding their fleets in California and now plan to bring their driverless ride-hailing services to Atlanta by summer 2025.
The San Francisco report paints a sobering picture. It states unequivocally: “Giving Cruise & Waymo authority to expand at their discretion does not serve public safety.” This conclusion comes after a growing list of incidents involving autonomous vehicles interfering with emergency operations, putting both responders and civilians at risk.
“In an industry where billions of dollars are on the line and the race for technological dominance is fierce, it’s crucial that cities and states take the lead in setting clear, enforceable standards for autonomous vehicles,” said Witherite. “This is not a space where companies should be allowed to regulate themselves. The report makes it clear that these vehicles, while innovative, can become dangerous obstacles in real-world emergency situations.”
Documented Incidents Highlight the Risks
According to San Francisco’s public safety departments, the presence of autonomous vehicles on city streets has already led to hundreds of disruptive incidents. In 2023 alone, the San Francisco Police Department documented at least 50 written reports involving Waymo or Cruise vehicles. Between 2022 and 2023, officials tracked nearly 600 separate occurrences in which these vehicles made unexpected stops—often in the middle of the street, in intersections, or in front of emergency facilities.
These unexpected, often inexplicable stops have created major challenges for first responders. The report highlights several real-world examples of driverless cars interfering with emergency response efforts:
- Blocking entrances to fire stations, preventing fire engines from exiting quickly during emergencies
- Obstructing the paths of ambulances and police vehicles responding to urgent calls
- Making physical contact or near-contact with emergency personnel and equipment
- Interrupting ongoing police operations with erratic behavior
- Failing to recognize and properly react to the presence of active emergency zones
These disruptions are not only frustrating but potentially fatal. As Witherite explains, “Emergency response is all about time. Whether it’s a fire, a heart attack, or a serious accident, every second counts. If an autonomous vehicle blocks an ambulance or delays a fire truck by even a minute, that can mean the difference between life and death.”
Federal Scrutiny Adds to Mounting Concerns
These local incidents have also drawn the attention of federal regulators. The National Highway Traffic Safety Administration (NHTSA) recently launched an investigation into Waymo’s self-driving systems after receiving multiple reports of concerning behavior. The NHTSA’s summary of the reports is alarming: “Reports include collisions with stationary and semi-stationary objects such as gates and chains, collisions with parked vehicles, and instances in which the ADS [Automated Driving System] appeared to disobey traffic safety control devices.”
The agency also noted that some of these incidents involved collisions shortly after the ADS exhibited strange or unpredictable actions near stop signs, traffic signals, or construction zones—behavior that would raise red flags for any human driver.
This federal investigation adds weight to the arguments made by Witherite and San Francisco officials. They argue that expanding the footprint of driverless vehicles without addressing these issues only increases the chances of a catastrophic failure—one that could cost lives and strain already overburdened emergency services.
No Standards, No Training Consistency
Another major issue highlighted in the San Francisco report is the lack of standardization across different autonomous vehicle systems. Public safety agencies report that each company’s vehicles operate with different protocols, and first responders are not trained to handle all the possible variations. In critical situations, this knowledge gap can create confusion, slow response times, and expose both emergency personnel and the public to unnecessary danger.
Most autonomous vehicles are connected to a remote operations center where human advisors can take control or issue instructions, but this arrangement creates its own challenges. As the report states, “Human traffic control officers cannot safely leave an intersection to talk with remote advisors about just one vehicle.” This inefficiency could further bog down efforts to manage chaotic or dangerous situations.
“It’s unrealistic to expect first responders to learn and manage multiple proprietary systems developed by tech companies,” said Witherite. “There needs to be a unified set of standards and protocols so emergency personnel can safely and quickly interact with any autonomous vehicle they encounter.”
Waiting for Tragedy?
Despite the growing body of evidence, there is a perception among safety advocates that policymakers are moving too slowly. Witherite warns that governments often act only after a tragedy has occurred—and that such inaction in the face of mounting risks is unacceptable.
“It often takes a tragedy before government agencies will act,” she said. “We appear to be repeating that same mistake with this new technology. We need proactive regulation, not reactive damage control.”
The Role of the Witherite Law Group
As driverless vehicles become more common on public roads, legal experts like Witherite are preparing to play a greater role in helping individuals understand their rights and seek justice in the event of an accident involving autonomous technology. Her firm, the Witherite Law Group, specializes in motor vehicle accident cases, and she emphasizes the unique legal and technical challenges posed by crashes involving self-driving cars.
“With human drivers, liability can often be determined through well-established methods,” she explained. “But with autonomous vehicles, we are entering a complex legal territory involving software decisions, machine learning systems, sensor failures, and remote oversight. It’s a rapidly evolving field, and people need experienced legal advocates who understand both the technology and the law.”
Her firm offers legal representation and support for individuals who have been injured in accidents involving autonomous vehicles, including cases where these vehicles have malfunctioned, disobeyed traffic signals, or caused delays in emergency response.
A Call for Responsible Innovation
With the planned expansion of driverless vehicle services in cities like Atlanta and beyond, Witherite is urging state and local governments to implement stricter regulations, invest in first responder training, and demand transparency from autonomous vehicle operators.
“This technology has the potential to reshape how we travel,” she said. “But safety must be the priority. We can’t let the excitement over innovation outpace our responsibility to protect the public.”
As more communities weigh the benefits and risks of driverless mobility, voices like Witherite’s—and those of first responders on the front lines—are demanding that we proceed with caution, common sense, and a clear commitment to safety.