Key Highlights
- Waymo’s robotaxi struck a child near a Santa Monica school, prompting investigations by the NHTSA and NTSB.
- The incident occurred during busy drop-off hours, with the child running into the street from behind a double-parked SUV.
- The vehicle braked from 17 mph to under 6 mph before contact, a speed reduction that is a key factor in the investigations.
- Waymo claims its automated system may have prevented more severe injuries than a human driver would have caused in a similar situation.
- The incident raises concerns about public trust and regulatory scrutiny as Waymo plans to expand its robotaxi services.
In a shocking incident that left parents and officials on edge, a Waymo robotaxi struck a child near a Santa Monica elementary school during the busy morning drop-off on January 23, 2026. The collision happened just a couple of blocks from the school, right in the thick of school traffic.
Imagine this: a child dashes unexpectedly into the street from behind a double-parked SUV. Every parent's worst nightmare, right? The Waymo vehicle was heading toward the campus when the impact occurred, and here's the kicker: there was no human safety operator behind the wheel.
Waymo's fifth-generation automated driving system was at the helm. It's designed to handle tough situations, but it had to brake hard, slowing from 17 mph to under 6 mph before contact. A close call, if you ask anyone.
Sure, Waymo claimed that its system helped prevent more serious injuries, but hey, a child got hit. That's not a great headline for anyone. The National Highway Traffic Safety Administration (NHTSA) has opened an inquiry, focusing on how the vehicle behaved around schools, kids, and the typical drop-off chaos, with the speed reduction before impact a crucial factor. Waymo also argues that a human driver would likely have struck the child at a higher speed, a comparison investigators will have to weigh as they judge how well autonomous driving systems actually perform around children.
The child, fortunately, sustained only minor injuries and managed to stand up and walk to the sidewalk after the collision. Talk about resilience. Witnesses were quick to call 911, and local parents expressed relief that things didn’t turn out worse.
Still, the Santa Monica-Malibu Unified School District is ramping up safety messaging, which feels like a good idea given the circumstances. Meanwhile, the National Transportation Safety Board (NTSB) has launched its own investigation, putting even more government eyes on Waymo.
Investigators are examining this crash in a real-world setting: kids running around, crossing guards doing their best, parents trying to juggle it all. Just last year, Waymo had to recall thousands of vehicles for improperly passing stopped school buses. Not exactly a flawless track record.
With the robotaxi expansion slated for cities like Washington, D.C. in 2026, one has to wonder: can Waymo get this safety thing sorted out? After all, nobody wants to hear about another incident like this, especially when the stakes involve children. Much as renters insurance companies treat claims as risk flags that stay on record for years, each autonomous vehicle incident can erode public trust and complicate regulatory approval for the technology's future expansion.
The scrutiny may just be getting started, and it’s clear that Waymo has a lot to prove in the autonomous vehicle game.