Researchers made Tesla’s autopilot work without a driver


Tesla’s Autopilot system can easily be tricked into driving a car with no one behind the wheel. Experts from the non-profit organization Consumer Reports investigated the issue after a fatal crash in Texas in which, reportedly, no one was in the driver’s seat of a Tesla.

By hanging a weighted chain on the steering wheel to simulate the pressure of a driver’s hands, two Consumer Reports safety researchers were able to engage Autopilot on a Tesla Model Y and raise the vehicle’s speed. The car drove several laps on the test track while the researcher sat in the passenger seat.

“The car repeatedly drove down the lane of our route and never noticed that no one was in the driver’s seat. It was a bit frightening to realize how easy it was to bypass safety measures that were clearly insufficient,” the experts noted.

The experts warned others against attempting to trick Tesla’s Autopilot in this way, stressing that the experiment should only be carried out by trained professionals.

The experiment does not offer a definitive explanation of the Texas accident, but safety advocates and CR researchers say it does show that driver monitoring systems need to do more to prevent drivers from using these systems in predictably dangerous ways.

“In our evaluation, the system not only failed to make sure the driver was paying attention, but it also couldn’t tell if there was a driver there at all. Tesla is falling behind other automakers like GM and Ford that, on models with advanced driver assist systems, use technology to make sure the driver is looking at the road,” says Jake Fisher, CR’s senior director of auto testing, who conducted the experiment.

BMW, Ford, GM, Subaru, and others use camera-based systems that can track eye movements or the position of the driver’s head to make sure they are looking at the road.

Certain vehicles, including those equipped with GM’s Super Cruise system, may automatically slow to a stop if the driver ignores repeated warnings to look at the road.

As a reminder, we also wrote that an information security researcher found that the Tesla Model 3 interface is vulnerable to DoS attacks.

By Vladimir Krasnogolovy

Vladimir is a technical specialist who loves giving qualified advice and tips on GridinSoft's products. He's available 24/7 to assist you with any question regarding internet security.
