I don’t drive. So I am basing this observation on being an (infrequent) passenger and a (more frequent) observer: when drivers are lost or confused, or detect malfunctions (in their cars or their passengers), they pull over at the nearest place that looks safe.
That is, if the driver is human. If the car’s driver is itself (Does that sound creepy or what?), it won’t pull over. And, if it’s lost or confused, it won’t go to a therapist or spiritual counselor.
Rather, it will “brick.” No, it won’t build a wall, at least not literally. (Even Donald Trump and Greg Abbott have difficulty doing that!) Instead, said non-human driver will stop dead wherever it happens to be, even in the middle of an intersection.
In San Francisco, which is probably denser with ride-sharing services and autonomous vehicles than any other city, Waymo and Cruise self-driving cars were involved in 215 crashes during the first four months of this year.
I could not find reports of injuries caused by those collisions. The city’s transportation authority says that self-driving vehicles “will improve safety” but admits that the technology “isn’t fully developed yet.” One commenter asked, “Can we end this experiment now or does someone need to be killed first?”
He posted a video that illustrated his concerns: a number of human drivers steered into the Valencia Street bike lane, in the city’s Mission District, to avoid a self-driving Waymo vehicle that had “bricked.”
Twofer! Let's take a look at how well the posts are working on Valencia @SFMTA_Muni. Can we end this experiment now or does someone need to be killed first? Also did @waymo report this incident to @californiapuc? pic.twitter.com/BSBvYUw5HK
— Andy (@Shenanigans_ATL) August 7, 2023