Video shows Ukrainian President Zelenskyy surrendering. It’s a deepfake
A poorly edited video purporting to show Ukrainian President Volodymyr Zelenskyy publicly capitulating to Russian demands drew widespread ridicule Wednesday, but experts said it could be a harbinger of more sophisticated deceptions to come.
The video appeared to show an ashen-faced Zelenskyy speaking from the presidential lectern and urging his countrymen to lay down their weapons in the face of the Russian invaders. It is not clear whether anyone was convinced.
Internet users immediately flagged the mismatch between the skin tones of Zelenskyy’s face and neck, the odd accent in the video, and the pixelation around his head. A Facebook official later said the company was removing the footage from its platform.
1/ Earlier today, our teams identified and removed a deepfake video claiming to show President Zelensky issuing a statement he never did. It appeared on a reportedly compromised website and then started showing across the internet.
— Nathaniel Gleicher (@ngleicher) March 16, 2022
Nina Schick, the author of Deepfakes, said the video looked like “an absolutely terrible faceswap,” referring to programs that can digitally graft one person’s face onto another’s body, part of a wider family of computer techniques that can create hyperrealistic forgeries known as “deepfakes.”
Television station Ukraine24 said in a Facebook post that the video was broadcast by “enemy hackers” and was “FAKE! FAKE!”
The station could not immediately be reached for further details, and Ukraine’s cyber watchdog agency did not immediately return messages seeking comment. But Ukraine’s Ministry of Defense later released a video of the real Zelenskyy apparently dismissing the footage as a “childish provocation.”
🔊 “We are at home and we are defending Ukraine. We are not going to lay down any weapons. Until our victory,” – President of Ukraine @ZelenskyyUa pic.twitter.com/IkfDxLzqne
— Defence of Ukraine (@DefenceU) March 16, 2022
“We are not going to lay down any weapons until our victory,” he said.
Ukrainian officials have been warning of the danger of deepfakes, especially after Moscow’s forces were denied a quick victory on the battlefield following their February 24 invasion. Two weeks ago, Ukraine’s military intelligence agency put out a short video alerting the country to the danger of deepfakes, alleging that the Kremlin was preparing a stunt involving one.
You have all probably heard of deepfake technology (a combination of the words “deep learning” and “fake”), a method of synthesizing a person’s image based on artificial intelligence. A provocation by the Russian Federation is being prepared. https://t.co/XYyS9WsPkK
— Defence intelligence of Ukraine (@DI_Ukraine) March 2, 2022
The Russian Embassy in Washington did not immediately return a message seeking comment.
Schick called the fake Zelenskyy video “very crude,” but warned that it was only a matter of time before the technology became more accessible. “Expect fakes like this to become easier to produce while appearing highly authentic,” she said.