The company Cruise only recently began allowing San Francisco residents to ride in its driverless robotaxis, and one of its cars has already had a run-in with police.
To say that San Francisco police were left confused after pulling over one of the robotaxis for driving without headlights at night and finding no one inside is an understatement. What could possibly be going on?
The Western Journal offered its thoughts about Big Tech:
Tech has serious implications for every part of our lives going forward — including driving on the highway. And yet, Big Tech loves to downplay the risks. Here at The Western Journal, we believe society needs to be cautious and we’ll point out the potential pitfalls in Big Tech’s plan.
The chief executive of Cruise, Kyle Vogt, previously discussed the challenges for autonomous vehicles and one of the biggest red flags is how they are meant to interact with humans.
The company blamed human error for the car driving without headlights and also shared that it works closely with police on how best to interact with the autonomous vehicles, as well as offering a dedicated phone number for police to call if needed. In this specific situation, Cruise relocated the car to the nearest safe location after police were clear of the vehicle.
At the moment, Cruise is only allowing a handful of autonomous vehicles in the city to offer rides to the public free of charge during the evenings. The hope is that after the company receives the last regulatory approval, it will expand to launch a commercial driverless service in another densely populated city.
Ironically enough, Cruise CEO Vogt had previously stated that any scenario where a car is pulled over by police has to go smoothly.
When San Francisco police first stopped the car, a Chevy Bolt, for not having its headlights on at night, officers walked up only to find no one inside, and the car pulled away shortly afterward. The police chased after the vehicle, lights flashing, as it crossed an intersection and pulled up in front of a Chinese restaurant.
Per Cruise’s statement, the car pulled away in order to drive “to the nearest safe location for the traffic stop as intended.”
Cop in San Francisco pulls over a car. Cop goes to drivers door to speak to driver. But there is no driver as it’s a driverless car! Car then drives itself away. Cops give chase. Surreal. pic.twitter.com/h6qjIjphj5
— Paddy Cosgrave (@paddycosgrave) April 11, 2022
In the video above, you can see three officers walking around the vehicle to inspect it for a driver inside. Once they discovered it was fully autonomous, they contacted Cruise, which quickly switched the car from autonomous to remote control so it was drivable.
No ticket was issued.
The Chronicle reported:
“Experts said the incident showed that autonomous-car companies still have a way to go in figuring out human-robot interactions — although some shortfalls could have been remedied with basic common sense.”
“Situations like this are bound to be more common now that both Cruise, a spin-off from General Motors, and Waymo, the self-driving unit of Google parent Alphabet, are operating autonomous cars on California public roads with no one behind the wheel.”
The Verge added more details:
Cruise, a subsidiary of General Motors, uses LIDAR technology to power its vehicles’ self-driving capabilities. The company has been using the cars to shuttle around its San Francisco-based employees since 2017, but only just opened a waiting list to taxi the city’s general population.
We still don’t know what exactly caused the Cruise vehicle to operate without its headlights. Perhaps the car’s automatic headlights feature was disabled or failed to detect the darkness around it. Either way, it is a bit concerning. Cruise vehicles are only authorized to drive from 10PM to 6AM, which obviously makes headlights pretty important.
In 2018, a self-driving Uber vehicle struck and killed a pedestrian walking her bike across the road in Tempe, Arizona. Subsequent investigations from the National Transportation Safety Board (NTSB) found that Uber turned off Volvo’s factory emergency braking system to prevent any interaction with Uber’s self-driving software, but it’s unclear whether that contributed to the crash.