Secret Light

Welcome

Article from Issue 299/2025
Author(s): Joe Casad


Dear Reader,

I guess we all know the world is getting scary: ideological news bubbles, Internet trackers, social media's "emotional manipulation for dollars" business model. One of the things that scares me the most is the rise of deepfake videos. Digital images reach deep inside our brains. We are hard-wired to believe that if we can see it, it is real. The problem is that digital images are also embedded deeply in our culture: our news media, our Internet activities, and, scariest of all, our legal systems. A picture or a video recording was once considered a source of truth – you could present a photo or a surveillance video at a trial, and the jury could be confident of its authenticity. Early manipulations were so hokey they were easy to spot with the naked eye, and more sophisticated fakes were easy enough for an expert to identify.

Unfortunately, the rise of AI has changed the value of digital imagery as a source of truth. Deepfake videos can make a sober person appear drunk. They can show politicians saying things they never said – or even giving speeches they never gave. Actors' unions are even worried about movie studios deepfaking their members into movies they never acted in and were never paid for. It would be trivial for a prosecutor to put a defendant into a video showing a crime that never happened – or for a defense attorney to take the defendant out of a video of a crime that did happen. Deepfakes make it hard to trust what you see. And this means that a defendant or a public figure caught on video saying or doing something embarrassing or illegal can simply denounce the video as a deepfake.

I really don't like to be known as a bearer of dark tidings, so you can probably guess I wouldn't be telling you this unless I had some good or at least promising news. Researchers at Cornell University recently unveiled a new technique that could once again make digital video a source of truth in certain controlled contexts. The technique, which the developers call "Noise-Coded Illumination for Forensic and Photometric Video Analysis," was presented at the SIGGRAPH 2025 conference in Vancouver.

Recent attempts at protecting the authenticity of video evidence have centered on applying a digital watermark to the video itself. Although this approach might work in certain situations, the problem is that whoever is manipulating the video might also be able to manipulate the watermark. The solution offered by the Cornell team is instead to watermark the light that illuminates the scene. By embedding coded fluctuations into the natural random variance of the light source, the researchers create a unique signature in the light that is extremely difficult to fake. If someone manipulates or replaces part of the video, a forensic analyst can identify the change. Adding multiple light sources makes the video even harder to fake. With the addition of a small chip, any programmable light source can emit the watermarked light.
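
To get a feel for the principle, Listing 1 shows a toy Python simulation. To be clear, this is not the Cornell team's actual algorithm – just a minimal sketch of the general idea under invented parameters (frame rate, flicker depth, correlation window): a lamp's brightness is modulated with a secret pseudorandom code, and a verifier later correlates the per-frame brightness of the footage against that code.

Listing 1: Toy simulation of noise-coded illumination (Python)

import numpy as np

rng = np.random.default_rng(42)       # secret seed shared with the verifier

fps, seconds = 30, 10                 # hypothetical 10-second clip at 30 fps
n = fps * seconds

# The lamp's brightness carries a +/-2% pseudorandom flicker - too subtle
# to notice, but recoverable from the recorded footage
code = rng.choice([-1.0, 1.0], size=n)
emitted = 1.0 + 0.02 * code

# The camera's per-frame average brightness tracks the lamp, plus sensor noise
recorded = emitted + rng.normal(0.0, 0.01, size=n)

# A forger splices in two seconds of footage lit by an ordinary, uncoded lamp
tampered = recorded.copy()
tampered[120:180] = 1.0 + rng.normal(0.0, 0.01, size=60)

def window_scores(signal, code, window=30):
    # Correlate each one-second window of brightness against the secret code
    scores = []
    for start in range(0, len(signal), window):
        seg = signal[start:start + window]
        seg = seg - seg.mean()        # drop the DC component
        scores.append(float(np.dot(seg, code[start:start + window]) / window))
    return scores

print(window_scores(recorded, code))  # every window scores about +0.02
print(window_scores(tampered, code))  # windows 4 and 5 collapse to about 0

In the genuine clip, every one-second window correlates strongly with the secret code; in the spliced span, the correlation collapses, flagging exactly which frames were replaced.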

Clearly this solution won't work for every situation – you can't watermark sunlight, and the random little videos people make with their Androids and iPhones won't qualify unless all the light sources in the space are equipped for watermarking. But in certain important contexts, such as video confessions, surveillance videos, and public events like press conferences, it could restore some confidence in video evidence – at least for now.

Joe Casad, Editor in Chief

Infos

  1. "Hiding Secret Codes in Light Protects Against Fake Videos," Cornell University Center for Data Science for Enterprise and Society: https://news.cornell.edu/stories/2025/07/hiding-secret-codes-light-protects-against-fake-videos
