
Waymo Still Struggles to Stop for School Buses: A Revealing Limitation of Autonomous Driving

Taha Abbasi · 6 min read

Waymo has had a strong start to 2026, with expanding service areas, improving safety metrics, and growing public acceptance of its robotaxi service. But a persistent problem keeps surfacing: the vehicles still struggle with school buses. Reports from Austin, Texas, this week highlighted a Waymo robotaxi that briefly blocked an ambulance responding to a shooting, adding to a pattern of edge cases that expose the limitations of current autonomous driving technology. Taha Abbasi examines why school buses remain such a challenge and what it tells us about the state of autonomous driving.

The School Bus Challenge

Stopping for school buses is one of the most legally and ethically important driving tasks. In every US state, drivers must stop for a school bus that has its red lights flashing and stop sign extended. The penalties for violations are severe, and for good reason: children exiting a school bus are among the most vulnerable road users.

For autonomous vehicles, the challenge is multi-layered. First, the system must reliably detect school buses, which come in various sizes, colors (not all are the classic yellow), and configurations. Second, it must detect the state of the bus’s lights and stop sign, which requires reading specific visual signals from a distance. Third, it must understand the legal requirements, which vary by state and even by road configuration (some states require all lanes to stop, others only the lanes traveling in the same direction).
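The third layer, the state-dependent legal rule, can be sketched as a simple lookup. This is a purely illustrative simplification, not Waymo's actual logic, and the rule values below are placeholders rather than legal data:

```python
# Hypothetical rule table: whether opposite-direction traffic must also stop.
# Real statutes vary by state; these entries are illustrative, not legal data.
OPPOSITE_MUST_STOP = {
    "same_roadway": True,     # undivided road: all lanes stop in most states
    "divided_median": False,  # many states exempt traffic across a physical median
}

def must_stop(signals_active: bool, same_direction: bool, divided_median: bool) -> bool:
    """Return True if this vehicle is legally required to stop for the bus."""
    if not signals_active:   # lights off, sign retracted: proceed normally
        return False
    if same_direction:       # traffic traveling with the bus always stops
        return True
    key = "divided_median" if divided_median else "same_roadway"
    return OPPOSITE_MUST_STOP[key]

# Oncoming traffic on an undivided road must stop; across a median it may not.
assert must_stop(True, same_direction=False, divided_median=False) is True
assert must_stop(True, same_direction=False, divided_median=True) is False
```

Even this toy version shows why the problem is hard: the correct answer depends not just on what the perception system sees, but on where the vehicle is and how the road is configured.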

Waymo’s vehicles use a combination of LIDAR, cameras, and radar to perceive their environment, and they have access to detailed maps that include school zone locations. Despite these advantages, the system has repeatedly shown hesitancy or error in school bus scenarios. In some cases, vehicles have slowed but not fully stopped. In others, they have stopped but then proceeded before the bus’s stop sign was retracted.

Why This Edge Case Is So Hard

As Taha Abbasi has written about extensively, autonomous driving excels at handling the roughly 95% of driving situations that are routine and predictable. Highway cruising, traffic light management, lane keeping, and even most intersection navigation are well within current capabilities. The challenge lies in the remaining few percent of situations that involve unusual objects, ambiguous signals, or complex social interactions.

School buses represent a particularly tricky combination of challenges. The stop sign on a school bus is a mechanical device that extends from the side of the vehicle. It is much smaller than a standard road sign and can be partially obscured by other vehicles or roadside objects. The flashing red lights must be distinguished from other red lights in the environment, including traffic signals, emergency vehicles, and commercial signage.

Furthermore, the behavior required when encountering a stopped school bus is unusual in the context of normal driving. You must come to a complete stop and remain stopped until the bus moves or retracts its signals, even if there are no visible pedestrians. This is counterintuitive from a machine learning perspective: systems learn that stopping is triggered by obstacles or pedestrians in the path, not by a specific vehicle displaying a specific signal pattern.
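That "hold until the signal clears" rule can be expressed as a tiny state machine keyed to the bus's signals rather than to detected pedestrians. Again, this is an illustrative sketch, not Waymo's planner:

```python
from enum import Enum, auto

class BusStopState(Enum):
    DRIVING = auto()
    STOPPED_FOR_BUS = auto()

def next_state(state: BusStopState, signals_active: bool,
               pedestrians_seen: bool) -> BusStopState:
    """One planner tick. The hold condition keys off the bus's signals alone;
    pedestrians_seen is deliberately ignored, mirroring the legal rule: stay
    stopped until the lights stop flashing and the sign retracts."""
    if signals_active:
        return BusStopState.STOPPED_FOR_BUS
    return BusStopState.DRIVING

# Even with no pedestrians in view, the vehicle must hold its stop.
state = next_state(BusStopState.DRIVING, signals_active=True,
                   pedestrians_seen=False)
assert state is BusStopState.STOPPED_FOR_BUS
# It may resume only once the bus retracts its signals.
assert next_state(state, signals_active=False,
                  pedestrians_seen=False) is BusStopState.DRIVING
```

The point of the sketch is the ignored parameter: a learned policy that conditions stopping on visible hazards will get exactly this case wrong.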

The Ambulance Incident

The Austin incident, where a Waymo vehicle briefly blocked an ambulance, highlights a related challenge: emergency vehicle interaction. Autonomous vehicles must detect approaching emergency vehicles (by sight and sound), determine the correct yielding behavior, and execute it quickly. The fact that Waymo vehicles still occasionally fail at this task is concerning, given that the company has been operating commercially for years.

To be fair, human drivers also sometimes fail to yield properly to emergency vehicles. But the standard for autonomous vehicles is necessarily higher. Waymo has marketed its technology as safer than human driving, and incidents that involve blocking emergency responders undermine that narrative, even if the overall statistical safety record is strong.

Taha Abbasi notes that these edge cases are precisely why Tesla’s approach to autonomy, which keeps a human driver in the loop through FSD (Supervised), may be the more prudent path during this transitional period. A human driver would instantly recognize a stopped school bus or an approaching ambulance and respond appropriately. The supervised model provides the benefits of automation for routine driving while maintaining human judgment for unusual situations.

Waymo’s Response and Improvements

Waymo has acknowledged the school bus detection issue and says it is actively improving its models. The company notes that its vehicles have completed millions of miles in cities across the United States without a single school-bus-related accident. This is true and important, but it does not address the behavioral failures that have been documented. A vehicle that slows but does not fully stop for a school bus has not caused an accident, but it has violated the law and potentially endangered children.

The company’s engineering approach to solving this problem involves collecting more training data from school bus encounters, refining the detection models for the bus’s stop sign and lights, and programming more conservative default behavior when a school bus is detected. These are reasonable steps, but they highlight a fundamental limitation of the current approach: every new edge case requires specific engineering attention.

The Broader Autonomy Lesson

The school bus problem illustrates why full autonomy remains harder than many predicted. The driving environment is filled with situations that humans handle easily but that challenge AI systems: hand gestures from construction workers, unusual road configurations during detours, pedestrians behaving unpredictably, and vehicles displaying non-standard signals.

Each of these situations requires not just perception (seeing what is happening) but understanding (knowing what it means) and judgment (deciding what to do). Current autonomous systems are increasingly good at perception but still struggle with understanding and judgment, particularly in novel situations.

As Taha Abbasi has argued, this is not a reason to give up on autonomous driving. The safety potential is enormous, and companies like Waymo are making genuine progress. But it is a reason to be honest about the timeline. True Level 5 autonomy, where a vehicle can handle any situation without human intervention, remains years away. In the meantime, approaches that combine automation with human oversight, whether through supervised FSD or remote human operators, offer the best balance of safety and capability.

What This Means for the Robotaxi Industry

The school bus issue matters for the robotaxi industry beyond Waymo. As companies like Tesla prepare to launch their own robotaxi services, they will face the same edge cases. The question is whether each company must independently solve every edge case, or whether industry-wide data sharing and standards can accelerate the process.

Currently, each autonomous driving company treats its training data and detection models as proprietary competitive advantages. This makes business sense but slows progress on safety-critical edge cases that affect everyone. A school bus detection standard that all autonomous vehicles must meet, similar to the crash safety standards enforced by NHTSA, could ensure that no company’s vehicles fail at this fundamental task.

Until then, as Taha Abbasi notes, the school bus stops will keep coming, and autonomous vehicles will need to get them right every single time. There is no acceptable failure rate when children’s safety is at stake.

Read more from Taha Abbasi at tahaabbasi.com


About the Author: Taha Abbasi is a technology executive, CTO, and applied frontier tech builder. Read more on Grokpedia | YouTube: The Brown Cowboy | tahaabbasi.com


Engineer by trade. Builder by instinct. Explorer by choice.
