Meta’s security researchers have identified a wave of malware using the name, design and even some of the functionality of ChatGPT to trick people into downloading and installing malicious software.
It’s no secret that malicious campaigns follow popular or well-known societal events, and it’s difficult to find something more popular than ChatGPT. This technology took the world by storm, and criminals immediately saw a new way of tricking people.
Using the ChatGPT image, name and other attributes is a sure way to target people who have heard of the technology but have yet to interact with it. And when a "developer" promises a browser shortcut or even direct integration with ChatGPT, some people take them up on the offer, not realizing they've become victims.
"Since March alone, our security analysts have found around 10 malware families posing as ChatGPT and similar tools to compromise accounts across the internet," Meta explained in its Q1 2023 Security Report.
“For example, we’ve seen threat actors create malicious browser extensions available in official web stores that claim to offer ChatGPT-related tools. In fact, some of these malicious extensions did include working ChatGPT functionality alongside the malware.”
The fact that some of the apps actually offer some limited interaction with the ChatGPT tool helps criminals sell the illusion of a functional app or browser extension.
This falls in line with our own research, which showed criminals using ChatGPT’s likeness in a phishing campaign only a couple of months ago, right when the AI tool started to show up more and more in the media.
The best way to avoid exposing your device and data to these types of attacks is to access the AI tool directly through the official website and to use a security solution, whether on a mobile device or a PC.
Silviu is a seasoned writer who has followed the technology world for almost two decades, covering topics ranging from software to hardware and everything in between.