Sexy MAGA internet celebrity supports Trump! Her real face is an AI created by an Indian man, earning an estimated several thousand dollars a month

An Indian medical student used AI to create a sexy MAGA influencer targeting conservative American men, combining political and erotic content to harvest traffic and earn several thousand dollars per month. Experts worry that virtual influencers like this could become tools of information warfare and trigger a trust crisis.

Sexy MAGA influencer supports Trump, powered by AI

Sexy influencer Emily Hart often shares glamorous lifestyle photos on social media. She is a loyal MAGA fan of Trump: she opposes abortion, "woke culture," and immigration. But her true identity turns out to be an AI created by a man.

Using the pseudonym Sam, a 22-year-old Indian medical student recently told the media outlet Wired that he used AI tools to create Emily Hart to raise money for his medical licensing exam and a planned move to the United States. He spends only 30 to 50 minutes a day managing the account, and each short video reaches between 3 million and 10 million views.

Within just one month, Emily Hart's Instagram account accumulated more than 10,000 followers. Fans even pay to subscribe to her adult content on the platform Fanvue, or buy clothing printed with political slogans.

Sam estimates that this model can easily earn him several thousand dollars each month. But the good times did not last: in February, Emily Hart's Instagram account was banned, though her Facebook account remains active.

Image source: The Independent (UK). Sexy influencer Emily Hart is a pro-Trump MAGA figure, but is actually AI-generated.

The MAGA AI girl's management strategy

Emily Hart's success is largely due to Sam following recommendations from AI tools: target older conservative American men, who have higher discretionary income and stronger loyalty, as the primary audience, and promote the "Make America Great Again" (MAGA) line while supporting Trump.

These AI-generated girls follow a specific template. They are typically portrayed as blonde white women working as first responders such as nurses, police officers, or firefighters. They wear bikinis printed with the U.S. flag and post far-right remarks supporting gun ownership, opposing abortion, or opposing immigration.

Sam revealed that because social media algorithms favor controversial content, these posts not only attract conservative supporters but also draw liberals in to comment and criticize, greatly boosting engagement.

This is an attention-harvesting strategy that combines patriotism with soft porn: creators attract attention through political fervor, then steer followers toward paid platforms for monetization.

However, because the well-known adult platform OnlyFans strictly requires creators to be real humans, these AI creators usually direct their followers to Fanvue, a platform that accepts AI-generated content.

From traffic monetization to information warfare: concerns as virtual influencers proliferate

Before Wired reported on Emily Hart, The Washington Post reported in March on the AI virtual soldier Jessica Foster, who had "posed" for photos with Trump and Russian President Vladimir Putin. Within four months, the account attracted more than 1 million followers.

Image source: Jessica Foster / AI virtual influencer. The AI virtual soldier Jessica Foster's account attracted more than 1 million followers within four months.

Although Jessica Foster's Instagram account has been banned, these MAGA AI girls continue to raise concerns among experts.

Valerie Wirtschafter, a researcher at the Brookings Institution, said that many fans do not actually care whether these influencers are real; they only care whether the content matches their own political identity. Joan Donovan, an assistant professor at Boston University, warned that accounts like these are easy to set up and have clear profit incentives.

Ultimately, the biggest risk of these AI accounts is that they could be converted into tools of information warfare: bot networks spreading political propaganda and misinformation, creating an unprecedented trust crisis and social problems for online communities.

Further reading:
  • The Classic Race: "AI photo rumors with false claims: garbage everywhere at Tokyo Dome, circulated as AI images; rumor-spreaders already listed as foreign-influence accounts"
  • "Viral posts trigger misreporting by Taiwanese media: claim that the photographer who climbed Taipei 101 is Jin Guowei; media literacy in the AI era faces challenges"
