It’s unclear how the Model S driver managed to misuse Autopilot in the way that they did. The incident occurred before Tesla updated the system to detect speed limit signs using a vehicle’s cameras. However, as The Verge notes, Tesla has said Autopilot will only work when it detects that the driver has their hands on the steering wheel. If that’s not the case, the car will try to get the driver’s attention with visual and audio warnings before disabling Autopilot.
But the fact that drivers can disengage from the task of driving while using Autopilot is something that the National Transportation Safety Board (NTSB) in the US has criticized Tesla over repeatedly. In March, the agency published a report that said a Model 3 driver’s overreliance on the system — in a situation it wasn’t designed to handle — led to a deadly crash in Delray Beach, Florida in 2019.
In this latest incident, the RCMP similarly warned against overrelying on Autopilot. “Although manufacturers of new vehicles have built in safeguards to prevent drivers from taking advantage of the new safety systems in vehicles, those systems are just that — supplemental safety systems,” said Superintendent Gary Graham of Alberta RCMP Traffic Services. “They are not self-driving systems, they still come with the responsibility of driving.”