November 23, 2024.
On Thursday, Russian President Vladimir Putin delivered a nationally televised address from Moscow, sparking a wave of speculation. In the video, Putin sits at his desk and confirms the use of a hypersonic missile in a strike on the Ukrainian city of Dnipro; the fact that he appears almost motionless throughout has raised concerns about his health.
Some have even gone as far as to suggest that the video is a 'deepfake', a digitally manipulated clip created to conceal any issues with the leader's well-being. The controversy has emerged at a crucial moment for Putin as he navigates the tense situation with Ukraine, which has recently acquired missiles capable of reaching deep into Russian territory.
Despite lacking concrete evidence, the 'deepfake' claims have gained traction, leading to further scrutiny of the video. Anton Gerashchenko, an adviser to Ukraine's Ministry of Internal Affairs, was one of the first to question the clip's authenticity, pointing to discrepancies in Putin's hand movements and lip-syncing.
But what exactly are deepfakes, and how do they work? A deepfake is a realistic manipulation of video, audio, or images produced with artificial intelligence. Neural networks are trained on a large dataset of the subject, covering many angles, expressions, and lighting conditions, until they can mimic the person's voice and facial features convincingly enough to make a fabricated event look real.
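To make the idea concrete, the sketch below shows the shared-encoder, two-decoder autoencoder design used by classic face-swap deepfake tools: one encoder learns a compact representation of faces in general, each decoder learns to reconstruct one specific person, and swapping decoders at inference time produces the fake. This is a minimal illustration written in PyTorch; the network sizes, names, and training loop are assumptions for clarity, not any particular tool's implementation.

```python
# Minimal sketch of the classic face-swap idea: a shared encoder plus one
# decoder per person. All sizes and names are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 64, 16, 16)
        return self.net(x)

encoder = Encoder()
decoder_a = Decoder()   # trained only on faces of person A
decoder_b = Decoder()   # trained only on faces of person B
loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)

def train_step(faces_a, faces_b):
    """One reconstruction step: each decoder learns its own person's faces."""
    recon_a = decoder_a(encoder(faces_a))
    recon_b = decoder_b(encoder(faces_b))
    loss = loss_fn(recon_a, faces_a) + loss_fn(recon_b, faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# After training, encoding a frame of person A but decoding with decoder_b
# renders person B's face with A's pose and expression -- the face swap.
with torch.no_grad():
    frame_a = torch.rand(1, 3, 64, 64)     # stand-in for a real video frame
    swapped = decoder_b(encoder(frame_a))
```

The key design point is that because the encoder is shared, pose, expression, and lighting are captured in a person-independent way, which is exactly why so much varied footage of the subject is needed for a convincing result.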
Christopher Shoebridge, a documentary filmmaker, has also weighed in on the controversy, calling the video a 'totally faked' production, with someone else's hands composited in and unnatural head movements that suggest the use of AI. The speculation is further fueled by reports that Putin had not been seen in public for two weeks following an appearance at a conference in Sochi.
However, even AI experts are struggling to determine with certainty whether the video is a deepfake. Shweta Singh, an assistant professor of information systems and management, said it is difficult to reach a definitive conclusion on the basis of speculation alone. She also noted that Putin is known for his animated hand gestures in other speeches, which are conspicuously absent from this particular address.
AI detection tools have not definitively labeled the video as a deepfake, but they do flag some indicators of potential tampering, amounting to a moderate level of suspicion: parts of the video may well be genuine, yet certain anomalies raise concern. Without further evidence, the video's authenticity is difficult to establish.
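One family of cues such tools examine is temporal consistency, which is also what observers mean when they call the footage "almost motionless". The toy sketch below, using OpenCV, simply measures how much the detected face region changes from frame to frame; it is not a real deepfake detector, and the file name and thresholds are illustrative assumptions, but it shows the kind of low-level signal that feeds more sophisticated classifiers.

```python
# Toy temporal-consistency check (not a production deepfake detector):
# track frame-to-frame change inside the face region of a video.
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def face_motion_scores(video_path):
    """Return per-frame mean pixel change inside the first detected face box."""
    cap = cv2.VideoCapture(video_path)
    scores, prev_face = [], None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(boxes) == 0:
            continue
        x, y, w, h = boxes[0]
        face = cv2.resize(gray[y:y + h, x:x + w], (128, 128))
        if prev_face is not None:
            scores.append(float(np.mean(cv2.absdiff(face, prev_face))))
        prev_face = face
    cap.release()
    return scores

scores = face_motion_scores("address_clip.mp4")  # hypothetical file name
if scores:
    print(f"mean face motion: {np.mean(scores):.2f}, "
          f"frame-to-frame variance: {np.var(scores):.2f}")
```

Unusually low motion, or motion that is smooth in the face but jerky in the hands, would be flagged as an anomaly, but on its own it proves nothing, which is why experts remain cautious about declaring the clip fake.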