A team of data scientists at York St John University has rolled out Pixelator v2, a tool for detecting deepfake images used to spread misinformation. Developed with partners at the University of Essex and Colchester-based Nosh Technologies, the tool combines several techniques to catch tiny discrepancies in images and can identify changes as small as a single pixel.
The creators of Pixelator v2 see it as a game changer for fields that rely heavily on image accuracy, especially cyber security specialists, analysts, and researchers. Lead researcher Somdip Dey emphasized the urgency of understanding visual authenticity in a world where images are paramount to communication.
Typical tools for uncovering fake images often overlook subtle yet significant alterations. Pixelator v2 takes a different approach, combining LAB colour space analysis with Sobel edge detection. The LAB colour space is designed to approximate human colour perception, helping the tool surface differences that might elude an observer's eye, while Sobel edge detection highlights structural changes, capturing nearly invisible shifts in edges and boundaries.
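The article names the two techniques but does not describe Pixelator v2's actual algorithm. The sketch below is an illustrative assumption of how the two ideas can be combined: each pixel is converted from sRGB to CIELAB, a CIE76 ΔE colour difference is computed between the two images, and Sobel gradient maps on the lightness channel are compared to catch structural shifts. The thresholds and the overall structure are hypothetical, not the tool's implementation.

```python
import math

def rgb_to_lab(r, g, b):
    # sRGB (0-255) -> CIE L*a*b* under a D65 white point (standard formulas)
    def lin(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = lin(r), lin(g), lin(b)
    x = (0.4124 * rl + 0.3576 * gl + 0.1805 * bl) / 0.95047
    y = (0.2126 * rl + 0.7152 * gl + 0.0722 * bl)
    z = (0.0193 * rl + 0.1192 * gl + 0.9505 * bl) / 1.08883
    def f(t):
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    fx, fy, fz = f(x), f(y), f(z)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def sobel_magnitude(gray):
    # gray: 2-D list of lightness values; returns a gradient-magnitude map
    h, w = len(gray), len(gray[0])
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * gray[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * gray[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = math.hypot(gx, gy)
    return out

def compare(img_a, img_b, de_thresh=2.3, edge_thresh=50.0):
    # img_*: equally sized 2-D lists of (r, g, b) tuples.
    # Thresholds are illustrative guesses: ~2.3 is a commonly cited
    # just-noticeable CIE76 difference; edge_thresh is arbitrary.
    lab_a = [[rgb_to_lab(*p) for p in row] for row in img_a]
    lab_b = [[rgb_to_lab(*p) for p in row] for row in img_b]
    edges_a = sobel_magnitude([[lab[0] for lab in row] for row in lab_a])
    edges_b = sobel_magnitude([[lab[0] for lab in row] for row in lab_b])
    flagged = []
    for y in range(len(img_a)):
        for x in range(len(img_a[0])):
            de = math.dist(lab_a[y][x], lab_b[y][x])       # CIE76 delta-E
            edge_shift = abs(edges_a[y][x] - edges_b[y][x])
            if de > de_thresh or edge_shift > edge_thresh:
                flagged.append((x, y, round(de, 2)))
    return flagged
```

Flagging a pixel when either signal trips mirrors the article's claim: the colour test catches perceptual changes a viewer would miss, while the edge test catches structural tampering even where colours barely change. For example, comparing a white 5×5 image against a copy with one pixel blacked out flags that pixel (and its Sobel neighbourhood).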
This combination makes Pixelator v2 especially useful in cyber security. Quick and precise image comparisons are crucial for tasks like tamper detection and authentication. The team found that Pixelator v2 outperformed other existing methods in spotting perceptual and structural differences, boosting security by preventing minor variations from going unnoticed.
With the rise of generative AI capable of creating realistic images, the team recognizes the increasing difficulty in telling real from AI-generated content. Dey believes Pixelator v2 is a vital step in tackling this challenge by enhancing our understanding of perceptual differences in images, setting the stage for future tools aimed at detecting AI-generated content.
The researchers are already looking to expand Pixelator v2’s functions to tackle images produced by generative AI, a pressing need as bad actors exploit these technologies for misinformation. A recent incident in Pennsylvania highlighted the issue when a teenager created deepfake nude images of classmates.
For those interested in the research, the findings appeared in MDPI's open-access journal Electronics, and the Pixelator v2 tool is available for download on GitHub.