
The world needs deepfake experts to stem this chaos


Recently, Myanmar's military coup government added serious corruption allegations to the raft of spurious cases pending against Burmese leader Aung San Suu Kyi. These new accusations build on statements by a prominent detained politician that were first released in a March video that many in Myanmar suspect is a deepfake.

In the video, the political prisoner's voice and face appear distorted and unnatural as he makes a detailed claim about providing gold and cash to Aung San Suu Kyi. Social media users and journalists in Myanmar immediately questioned whether the statement was real. This incident illustrates a problem that will only get worse. As real deepfakes improve, people's willingness to dismiss genuine footage as fake will also grow. What tools and skills will be available to investigate both kinds of claims, and who will use them?

In the video, Phyo Min Thein, the former chief minister of Yangon, Myanmar's largest city, sits in a bare room, apparently reading from a statement. His speech sounds odd and unlike his natural voice, his face is static, and in the low-quality version that first circulated, his lips seem out of sync with his words. Seemingly everyone wanted to believe it was fake. Screenshots from an online deepfake detector spread rapidly across the internet, showing a red box around the politician's face and an assertion, with over 90% confidence, that the confession was a deepfake. Burmese journalists lacked the forensic skills to make a judgment, and past and present military actions reinforced the cause for suspicion: official spokespeople have shared staged images targeting the Rohingya ethnic group, while the organizers of the military coup have denied that social media evidence of their killings could be real.

But was the prisoner's "confession" really fake? Along with deepfake researcher Henry Ajder, I consulted deepfake creators and media forensics specialists. Some noted that the video was of low enough quality that the mouth glitches people saw were as likely to be compression artifacts as evidence of deepfakery. Detection algorithms are also unreliable on low-quality compressed video. His unnatural-sounding voice could be the result of reading a statement under extreme stress. If it is a fake, it is a very good one, because his throat and chest move in sync with his words at key moments. Researchers and creators were generally skeptical that it was a deepfake, though not certain. At this point, it is more likely to be something human rights activists like me know well: a coerced confession on camera. Moreover, given the circumstances of the military coup, the substance of the allegations should not be trusted unless there is a legitimate judicial process.

Why does this matter? Regardless of whether the video is a forced confession or a deepfake, the result is most likely the same: words put in the prisoner's mouth, digitally or physically, by the coup government. And while the use of deepfake technology to create non-consensual sexual images currently far outstrips political cases, deepfake and synthetic media technology is rapidly improving, proliferating, and commercializing, expanding the potential for malicious uses. The case in Myanmar illustrates the growing gap between the capability to make deepfakes, the opportunities to claim a real video is a deepfake, and our ability to challenge either claim.

It also illustrates the challenges of having the public rely on free online detectors without understanding the strengths and limitations of detection or how to second-guess a misleading result. Deepfake detection is still an emerging technology, and a detection tool that works on one generation method often fails on another. We must also be wary of counter-forensics, where someone deliberately takes steps to confuse a detection approach. And it is not always possible to know which detection tools to trust.

How do we avoid conflicts and crises around the world being blindsided by deepfakes and supposed deepfakes?

We should not be turning ordinary people into deepfake spotters, parsing pixels to distinguish truth from falsehood. Most people will do better relying on simpler approaches to media literacy, such as the SIFT method, which emphasizes checking other sources and tracing the original context of videos. In fact, encouraging people to act as amateur forensics experts can send them down a conspiracy rabbit hole of distrust in images.




