Amazon has pulled listings for ‘steering wheel weights’ that allow Tesla drivers to trick the vehicle into thinking their hands are on the wheel in self-driving mode.
The Seattle-based firm was joined by Chinese e-commerce giant Alibaba in purging the products from their stores after they were linked to accidents.
The gadgets clamp onto the steering wheel and trick the car’s sensors into thinking that they are a pair of driver’s hands. Autopilot mode in a Tesla still requires a driver to be holding the wheel for safety reasons.
In March, a Tesla struck a teenager without slowing as he was getting off a school bus in North Carolina, leaving him seriously injured. In December, a driver in Germany fell asleep at the wheel and led police down a highway at 70mph.
Carnegie Mellon University professor Philip Koopman pointed the blame squarely at Tesla founder Elon Musk for the confusion concerning Autopilot mode.
‘Elon Musk’s saying it’s supposed to drive itself. That’s what they’re going to hear. How do you think they’re going to behave?’ Koopman told The Washington Post.
The Post reports that two different brands of wheel weight were recently listed as the top two new releases in Amazon’s ‘automotive steering wheel’ section. They retail for between $30 and $75, and more than 100 were sold in the last month.
‘Products intended to circumvent driver assistance systems or other car safety features, including counterweight rings and autopilot nag reduction devices, are prohibited,’ Amazon spokesperson Samantha Boyd told the newspaper.
‘The products in question were evasively listed, have been removed, and we are taking corrective action. Amazon does not tolerate illegal or evasive behavior, and we take action against bad actors that make factual misrepresentations to customers,’ she added.
In January, a Tesla owner, David Alford, told the New York Times Magazine that he knew of people who used wheel weights in order to drink while driving and avoid the need for an Uber.
‘I know a couple of people with Teslas that have FSD beta, and they have it to drink and drive instead of having to call an Uber,’ he said.
Earlier this month, US automobile safety regulators began zeroing in on changes that Tesla has made to its Autopilot partially automated driving system, including how it makes sure drivers pay attention and how it detects and responds to objects.
The National Highway Traffic Safety Administration asked for details of changes made to all versions of the system including dates and detailed descriptions, according to a post on its site Thursday.
The request is part of a larger investigation into why Teslas operating on Autopilot have struck emergency vehicles that are stopped along highways while they are responding to other incidents.
It covers all versions of the automated driving system, including ‘Full Self-Driving,’ which is being tested on public roads by Tesla owners.
The agency has been investigating crashes involving Teslas using the system since August of 2021. Investigators have sent teams to probe more than 30 crashes since 2016 that have caused at least 14 deaths.
In a letter dated January 3 and posted on the agency’s website, NHTSA asks Tesla to describe all changes to the systems in the ‘design, material composition, manufacture, quality control, supply, function, or installation of the subject system, from the start of production to date.’
Tesla must respond to the request by July 19, or it could face civil penalties, the letter states. It asks the electric vehicle maker to update a previous response dated September 19 of last year.
A message was left early Thursday seeking comment from Tesla. The company says on its website that neither Autopilot nor ‘Full Self-Driving’ can drive themselves and that drivers must be ready to intervene at all times.
The letter also asks which Tesla vehicles have cameras installed in the cabin to monitor drivers, and whether the system uses ‘Tesla Vision,’ which relies only on cameras to view the road and does not use radar.
NHTSA’s request includes changes that Tesla made as part of a February recall of ‘Full Self-Driving’ software. NHTSA pressured Tesla into recalling nearly 363,000 vehicles with the software because the system can break traffic laws.
NHTSA said in recall documents that the system can make unsafe actions such as traveling straight through an intersection from a turn-only lane, going through a yellow traffic light without proper caution or failing to respond to speed limit changes.
The documents said that Tesla does not agree with the agency’s analysis of the problem. The company was to fix the problems with an online software update.
Musk for his part has said he expects to have fully autonomous vehicles this year, a pledge he has made for several years. ‘The trend is very clearly toward full self-driving,’ Musk said in April. ‘And I hesitate to say this, but I think we’ll do it this year.’
The system is being tested on public roads by as many as 400,000 Tesla owners.
NHTSA also has opened investigations during the past three years into Teslas braking suddenly for no reason, suspension problems and other issues.