NextFin

Heather Humphreys' Team Urges Removal of Deepfake Video Used in Investment Scam

NextFin News: Heather Humphreys' team publicly warned on Thursday about a deepfake video circulating online that falsely depicted the Irish presidential candidate endorsing a high-return investment scam. The video, which used artificial intelligence to manipulate Humphreys' image and voice, was designed to lure viewers into a fraudulent scheme.

The incident occurred in Dublin, Ireland, where Humphreys is a well-known political figure and a candidate in the Áras (presidential) election. Her team emphasized that the video was entirely fake and urged social media platforms to remove it immediately to prevent further deception.

Following the report from Humphreys' team and Fine Gael party representatives, Meta, the parent company of Facebook and Instagram, took action and removed the page hosting the deepfake video on Thursday. Meta confirmed the removal as part of its efforts to combat misinformation and scams on its platforms.

The use of deepfake AI technology in scams has raised concerns about the misuse of public figures' likenesses to deceive the public. Authorities and political teams are increasingly vigilant in monitoring such content and reporting it to social media companies for swift removal.

The warning from Humphreys' team serves as a reminder to the public to be cautious about investment offers seen on social media and to verify the authenticity of endorsements, especially those involving high returns that seem too good to be true.

This event highlights ongoing challenges in digital security and the importance of cooperation between political figures, social media companies, and the public to prevent fraud and protect reputations.
