Two incidents involving smart cars in less than a week should give prospective buyers something to think about: a server-side blunder locked Nissan EV owners out of key vehicle functionality, while a Tesla on autopilot allegedly hit an innocent robot on the street.
Nissan EV app password reset
Nissan’s UK arm migrated user data to a new system over the weekend without giving Nissan EV owners a heads-up. Then, a snafu during the migration prompted every Nissan EV app user to reset their passwords, leading many to believe hackers had breached Nissan and stolen their data.
Nissan soon clarified that this was not the case in a statement to The Register, as well as in a series of tweets replying to concerned Nissan EV owners.
“There has been no data breach,” said the carmaker. “The data was simply migrated over to a new computer system and therefore customers have been asked to reset their passwords as a security protocol.”
The outage affected taxi drivers as well as regular drivers, who could not access data such as battery status and driving range.
“People will understandably be suspicious of a hack, but it’s probably just bad handling from Nissan if we give them the benefit of the doubt,” security researcher Scott Helme told The Register. “Upgrades like this really do not need users to reset passwords if done right. They also could have communicated this better to avoid people assuming something bad has happened.”
Tesla allegedly hits and “kills” smart robot
A slightly more controversial smart car story, about a Tesla hitting a poor robot on the street, got even more mileage in the press and on social media last week. As the story goes, a Promobot was on its way to the 2019 Consumer Electronics Show (CES) in Las Vegas when a Tesla Model S in “full self-driving mode” hit it, “killed” it, and fled the scene. Video below:
The story is controversial not so much because Elon Musk’s brainchild engaged in a hit-and-run as because the story itself appears fabricated. After watching the footage, an army of disgruntled Twitter users pointed out that the vehicle doesn’t even appear to touch the robot, and that the camera, conveniently placed right where the incident took place, even appears to capture a rope tied to the robot. These observations, along with other inconsistencies, have led many to speculate that the Russian company behind the Promobot staged the entire incident for promotional purposes. Furthermore, Tesla vehicles do not (yet) have a fully autonomous driving mode. Full story at archpaper.com.
Tesla’s CEO has gone on record several times claiming that competitors will stop at nothing to tarnish the image of his electric cars, but it remains to be seen whether this was a sad attempt at just such a stunt or a factual account. It could, of course, be a long-shot coincidence that the robot happened to topple over just as a Tesla was driving by. Robots like these are known to take untimely dives. One Twitter commenter using the handle Blacksmith (@IHPIP) jokingly speculated that the machine may have had Neymar Jr. firmware installed (the footballer is known for his theatrical dives).
In related news, the Pwn2Own hacking contest is now open, with the promise of awarding a million dollars to whoever can compromise a Tesla Model 3.
Filip has 15 years of experience in technology journalism. In recent years, he has turned his focus to cybersecurity in his role as Information Security Analyst at Bitdefender.