NextFin News - Maryland Attorney General Anthony G. Brown issued a formal consumer alert on Monday, warning that fraudulent investment schemes are proliferating across Meta Platforms’ ecosystem, including Facebook, Instagram, and WhatsApp. The alert highlights a sophisticated evolution in digital fraud, where scammers are now deploying "deepfake" technology to impersonate high-profile financial figures and lure retail investors into high-stakes "pump and dump" operations and cryptocurrency scams.
The warning from the Maryland Office of the Attorney General identifies a specific three-step "bait, shift, and hook" methodology. Fraudsters typically begin by purchasing advertisements on Facebook or Instagram that feature AI-generated likenesses of recognizable figures such as Ark Invest’s Cathie Wood, CNBC’s Joe Kernen, or "Shark Tank" star Kevin O’Leary. These ads, placed without the individuals' permission, promise "guaranteed" returns or exclusive "insider" memberships. Once a user engages, scammers pressure them to move the conversation to encrypted messaging apps such as WhatsApp or Telegram, which places the exchange beyond the reach of Meta’s internal moderation tools. From there, victims are funneled into group chats where they are coerced into buying specific stocks or digital assets.
This regulatory escalation follows a period of intense scrutiny for Meta Platforms. Internal documents and reports from consumer advocacy groups, including a February 2026 letter from the National Consumers League, suggest that the scale of the problem is vast. According to those documents, Meta generated an estimated $49 million in revenue from deepfake video ads featuring public figures like U.S. President Trump and Elon Musk in late 2025. The data further indicates that up to 70% of newly active advertisers on the platform may be promoting low-quality or fraudulent products, with some accounts allowed to accumulate dozens of "strikes" before facing suspension.
Meta has pushed back against the narrative of negligence, recently initiating its own legal actions against international scam operations. The company maintains that it employs a "multi-layered approach" to fraud, which includes disabling payment methods, blocking domain names, and issuing cease-and-desist letters to former "Business Partners" who allegedly sold services to help scammers evade enforcement. However, the persistence of these ads suggests a structural gap between Meta’s automated defenses and the rapid iteration of AI-driven fraud.
For the broader market, the Maryland alert serves as a reminder of the "trust deficit" currently plaguing social media-driven retail investing. While the 2021 "meme stock" era demonstrated the power of decentralized investment communities, the 2026 landscape is increasingly defined by synthetic deception. Attorney General Brown noted that reputable broker-dealers and investment advisors rarely, if ever, post specific investment advice on social media, urging citizens to report suspicious activity to the state’s Securities Division.
The financial impact on victims is often total. In the "pump and dump" scenarios described by the Attorney General, scammers hype a low-priced asset to inflate its price, then liquidate their own holdings at the peak, leaving retail followers with worthless positions. As deepfake technology becomes more accessible, the cost of creating convincing "financial advice" has plummeted, shifting the burden of verification entirely onto the individual consumer. The Maryland alert may be a precursor to more coordinated state-level or federal actions as regulators grapple with the liability of platforms that profit from the placement of these fraudulent advertisements.
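To make the mechanics concrete, the cycle the Attorney General describes can be sketched in a few lines of Python. The figures below are hypothetical illustrations, not numbers drawn from the alert: the operator accumulates a thinly traded asset cheaply, hype drives the price to a peak, the operator liquidates, and late-arriving followers absorb the collapse.

```python
# Stylized "pump and dump" arithmetic. All prices and share counts are
# hypothetical; the alert describes the mechanism but cites no figures.

def pump_and_dump(operator_shares: int, entry_price: float,
                  peak_price: float, collapse_price: float,
                  victim_shares: int) -> tuple[float, float]:
    """Return (operator_profit, victim_loss) for one stylized cycle."""
    # The operator accumulates cheaply before promotion begins.
    operator_cost = operator_shares * entry_price
    # Deepfake ads and group-chat hype inflate the price; the operator
    # sells its entire position into that demand at the peak.
    operator_profit = operator_shares * peak_price - operator_cost

    # Victims recruited in the group chats buy at or near the peak;
    # once the operator liquidates, the price collapses.
    victim_cost = victim_shares * peak_price
    victim_loss = victim_cost - victim_shares * collapse_price
    return operator_profit, victim_loss

profit, loss = pump_and_dump(operator_shares=100_000, entry_price=0.10,
                             peak_price=1.50, collapse_price=0.02,
                             victim_shares=100_000)
print(f"Operator profit: ${profit:,.0f}")   # $140,000
print(f"Victim loss:     ${loss:,.0f}")     # $148,000
```

The asymmetry is the point: the operator's profit is funded almost entirely by the followers' losses, which is why the Attorney General characterizes the impact on victims as often total.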

