
Maryland Attorney General Warns of Deepfake Investment Scams Proliferating on Meta Platforms

Summarized by NextFin AI
  • Maryland Attorney General Anthony G. Brown issued a consumer alert about rising fraudulent investment schemes on Meta Platforms that use deepfake technology to impersonate financial figures.
  • Fraudsters employ a three-step "bait, shift, and hook" methodology, using AI-generated ads to lure victims before moving them into scams on encrypted apps like WhatsApp.
  • Meta Platforms faces scrutiny, with reports indicating it generated an estimated $49 million from deepfake ads in late 2025, and that up to 70% of newly active advertisers may be promoting low-quality or fraudulent products.
  • The Maryland alert highlights a trust deficit in social media-driven investing, urging consumers to report suspicious activities as deepfake technology complicates investment verification.

NextFin News - Maryland Attorney General Anthony G. Brown issued a formal consumer alert on Monday, warning that fraudulent investment schemes are proliferating across Meta Platforms’ ecosystem, including Facebook, Instagram, and WhatsApp. The alert highlights a sophisticated evolution in digital fraud, where scammers are now deploying "deepfake" technology to impersonate high-profile financial figures and lure retail investors into high-stakes "pump and dump" operations and cryptocurrency scams.

The warning from the Maryland Office of the Attorney General identifies a specific three-step "bait, shift, and hook" methodology. Fraudsters typically begin by purchasing advertisements on Facebook or Instagram that feature AI-generated likenesses of recognizable figures such as Ark Invest’s Cathie Wood, CNBC’s Joe Kernen, or "Shark Tank" star Kevin O’Leary. These ads, placed without the individuals' permission, promise "guaranteed" returns or exclusive "insider" memberships. Once a user engages, they are pressured to move the conversation to encrypted messaging apps like WhatsApp or Telegram, effectively bypassing Meta’s internal moderation tools before being funneled into group chats where they are coerced into buying specific stocks or digital assets.

This regulatory escalation follows a period of intense scrutiny for Meta Platforms. Internal documents and reports from consumer advocacy groups, including a February 2026 letter from the National Consumers League, suggest that the scale of the problem is vast. According to those documents, Meta generated an estimated $49 million in revenue from deepfake video ads featuring public figures like U.S. President Trump and Elon Musk in late 2025. The data further indicates that up to 70% of newly active advertisers on the platform may be promoting low-quality or fraudulent products, with some accounts allowed to accumulate dozens of "strikes" before facing suspension.

Meta has pushed back against the narrative of negligence, recently initiating its own legal actions against international scam operations. The company maintains that it employs a "multi-layered approach" to fraud, which includes disabling payment methods, blocking domain names, and issuing cease-and-desist letters to former "Business Partners" who allegedly sold services to help scammers evade enforcement. However, the persistence of these ads suggests a structural gap between Meta’s automated defenses and the rapid iteration of AI-driven fraud.

For the broader market, the Maryland alert serves as a reminder of the "trust deficit" currently plaguing social media-driven retail investing. While the 2021 "meme stock" era demonstrated the power of decentralized investment communities, the 2026 landscape is increasingly defined by synthetic deception. Attorney General Brown noted that reputable broker-dealers and investment advisors rarely, if ever, post specific investment advice on social media, urging citizens to report suspicious activity to the state’s Securities Division.

The financial impact on victims is often total. In the "pump and dump" scenarios described by the Attorney General, scammers hype a low-priced asset to inflate its price, then liquidate their own holdings at the peak, leaving retail followers with worthless positions. As deepfake technology becomes more accessible, the cost of creating convincing "financial advice" has plummeted, shifting the burden of verification entirely onto the individual consumer. The Maryland alert may be a precursor to more coordinated state-level or federal actions as regulators grapple with the liability of platforms that profit from the placement of these fraudulent advertisements.

Explore more exclusive insights at nextfin.ai.

Insights

What are deepfake technologies, and how do they operate?

What historical events contributed to the rise of deepfake investment scams?

What is the current market situation regarding investment scams on Meta Platforms?

How have users responded to the prevalence of deepfake scams on social media?

What recent updates have been made by Meta Platforms to combat these scams?

What policy changes might be expected from regulators in response to deepfake scams?

What possible future developments could arise from the increasing use of deepfake technology in scams?

What long-term impacts could deepfake investment scams have on consumer trust in social media?

What are the core challenges faced by regulators in addressing deepfake investment scams?

What are the main controversies surrounding Meta's responsibility in preventing deepfake scams?

How do deepfake investment scams compare to traditional investment fraud schemes?

What examples exist of successful deepfake scams and their consequences?

What measures can individuals take to protect themselves from deepfake investment scams?

How does the 'pump and dump' tactic work within the context of deepfake scams?

What insights can be drawn from the Maryland Attorney General's alert regarding deepfake scams?

What role do encrypted messaging apps play in facilitating deepfake investment scams?

How might the legal landscape evolve in response to deepfake technology used in scams?
