Italian Prime Minister Giorgia Meloni has been targeted by viral AI-generated deepfake videos falsely claiming she severed all diplomatic ties with Israel.

The clips represent a significant escalation in the use of synthetic media to distort international relations. By fabricating visual evidence of a diplomatic rift, the videos attempt to mislead the public regarding Italy's actual foreign policy and its relationship with the Israeli government.

Several viral deepfake clips have spread online with misleading claims about Italy’s relationship with Israel, according to Euronews [3]. The fabricated footage portrays Meloni refusing to shake hands with Prime Minister Benjamin Netanyahu during a United Nations event in New York. Some clips further depict Meloni wearing a Palestinian flag, an image that does not reflect reality [2].

Meloni addressed the misinformation on social media in May 2026. "I have been targeted by deepfake photos," she said [2].

The videos claim that Italy has terminated all agreements with Israel. The diplomatic record says otherwise: the only verified action on bilateral agreements was the suspension of a defence agreement in April 2026 [1].

The AFP Fact-Check team analyzed the footage and concluded that the videos are AI-generated and do not reflect reality [1]. The clips appear designed to create a perception of a strained relationship that does not exist between the two nations [3].

The synthetic videos surfaced in May 2026, coinciding with Meloni's public statements denying the authenticity of the images [2]. Their spread highlights the growing challenge government leaders face in combating sophisticated digital misinformation that can destabilize international relations.

"I have been targeted by deepfake photos."

The emergence of these deepfakes illustrates how generative AI can be weaponized to simulate diplomatic crises. By blending a grain of truth, such as the April 2026 suspension of a specific defence pact, with outright fabrications, bad actors can construct a plausible but false narrative of geopolitical instability. The case underscores the growing need for real-time digital forensics to protect international diplomacy from synthetic interference.