X announced on Wednesday that it will suspend creators from its revenue-sharing program for 90 days if they post AI-generated war videos without clearly labeling them as synthetic. The policy marks a shift toward financial penalties rather than account bans to combat misinformation, striking at the monetary incentives that drive viral combat footage during geopolitical crises.
The update was disclosed by Nikita Bier, Head of Product at X, who said the rule takes effect immediately. According to Bier, creators who fail to disclose that war-related footage was generated or altered using artificial intelligence will lose access to X creator revenue for three months. Repeat offenders will face permanent removal from monetization eligibility.
The decision comes as generative AI tools become increasingly capable of producing hyper-realistic combat footage—raising alarms across the technology and national security sectors about misinformation during periods of geopolitical instability.
New Monetization Rules Target Financial Incentives
Rather than imposing outright account bans, X is focusing its enforcement on creator payouts. The move signals a strategic effort to curb the spread of misleading content by cutting off the financial rewards tied to engagement and virality.
“During times of war, access to authentic information is critical,” Bier wrote in a post on X. He added that advances in AI have made it “trivial” to fabricate convincing battlefield scenes that can mislead audiences and distort public perception.
Under the revised guidelines, monetization penalties will be triggered if a post receives a Community Note identifying it as AI-generated, or if metadata and technical indicators suggest the use of generative AI tools. Community Notes is X’s crowd-sourced fact-checking feature, which allows users to append contextual information to potentially misleading posts.
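X has not published the internals of its detection pipeline, so the following is only an illustrative sketch of one public signal such a system could check: the IPTC `DigitalSourceType` value that some generative-AI tools embed in a file’s XMP metadata. The function name and approach are hypothetical, not X’s actual implementation, and absence of the marker proves nothing, since metadata is trivially stripped.

```python
# IPTC NewsCodes value that identifies media produced by a generative model.
# Some AI image/video tools write this into the file's XMP metadata packet.
AI_SOURCE_TYPE = (
    "http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia"
)

def looks_ai_generated(path: str) -> bool:
    """Return True if the file's embedded metadata declares an AI
    digital source type. This is a weak, one-way signal: the marker's
    absence says nothing, because metadata is easily removed or forged.
    """
    with open(path, "rb") as f:
        data = f.read()
    # XMP is stored as plain UTF-8 XML inside the container, so a raw
    # byte scan is enough for this sketch (a real system would parse
    # the XMP packet and C2PA manifest properly).
    return AI_SOURCE_TYPE.encode("utf-8") in data
```

In practice, platforms treat signals like this as one input among many, alongside provenance manifests (e.g. C2PA), classifier scores, and crowd-sourced reports such as Community Notes.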
The company said it will continue improving its detection systems to protect the integrity of its revenue-sharing program while maintaining open expression on the platform.
Industry analysts say targeting monetization instead of speech reflects a calculated approach. “When platforms tie income to engagement, they create strong incentives for sensational content,” said Aviv Ovadya, a misinformation researcher formerly affiliated with major social platforms. “Adjusting X creator revenue eligibility changes that incentive structure without necessarily restricting user accounts.”
Rising Pressure Over AI War Footage
The crackdown arrives amid heightened global tensions and a surge in AI-generated imagery and video across social media. Civil society groups and governments have warned that deepfakes and synthetic battlefield clips could inflame conflicts or spread panic.
The World Economic Forum has repeatedly identified misinformation and disinformation as among the top global risks in recent years, particularly as generative AI tools become more accessible. Meanwhile, policymakers in the United States and Europe have pressed platforms to implement clearer labeling standards for synthetic media.
Generative AI systems can now simulate explosions, troop movements, and urban warfare scenarios with striking realism. Without disclosure, such content can appear indistinguishable from genuine footage captured in conflict zones.
By tightening its revenue-sharing rules, the company is attempting to send a message: financial gain should not come from content that obscures reality during wartime.
“This is about trust,” Bier emphasized. “If people can’t rely on what they’re seeing during a crisis, the platform fails its core purpose.”
The focus on payouts reflects a broader industry shift toward demonetization as a content-moderation tool. Platforms have increasingly used advertising restrictions and revenue suspensions to discourage harmful or misleading posts.
Balancing Free Expression and Platform Integrity
The changes highlight the delicate balance social platforms must strike between safeguarding free expression and preventing abuse of powerful AI technologies.
Since launching its revenue-sharing initiative, X has sought to attract high-profile creators by offering payouts tied to engagement metrics and advertising impressions. For many influencers and independent journalists, X creator revenue represents a meaningful income stream.
However, critics argue that monetization programs can unintentionally amplify extreme or sensational material. By narrowing monetization eligibility for undisclosed synthetic war footage, the company is attempting to recalibrate that dynamic.
“Monetization policies are one of the strongest levers platforms have,” said Katie Harbath, a former public policy executive in the social media sector. “If you want to discourage certain behaviors quickly, impacting X creator revenue is often more effective than removing posts after they’ve already gone viral.”
Importantly, the new policy does not automatically suspend accounts that post AI war videos. Instead, it restricts access to X creator revenue, signaling that the company views financial deterrence as a proportionate response.
Still, some free speech advocates caution that enforcement mechanisms must remain transparent. Determining whether content is AI-generated can be technically complex, and false positives could unfairly cut off legitimate creators’ income.
X has said it will rely on a combination of Community Notes, metadata analysis, and internal detection tools. The company also indicated that its moderation systems will evolve as AI technologies become more sophisticated.
A Turning Point for AI Accountability
The decision to tighten monetization standards marks another chapter in the tech industry’s ongoing struggle with generative AI. As tools capable of producing convincing synthetic media proliferate, platforms are under growing scrutiny to prevent misuse—especially during armed conflicts.
By zeroing in on payouts, the company is betting that financial disincentives can help slow the spread of deceptive war content without undermining broader user participation.
Whether the approach proves effective remains to be seen. But in an era where AI can fabricate scenes of destruction in seconds, X’s latest policy signals that monetization frameworks will play a central role in shaping online information ecosystems.
As geopolitical tensions persist and generative AI continues to evolve, the credibility of X’s creator program may hinge on how successfully the platform can distinguish authentic reporting from manufactured spectacle—and ensure that trust, not deception, drives engagement.