Crowds at Kamala Harris rallies are not AI generated. Here’s how you can find out

Suffice it to say that this mountain of evidence from direct sources outweighs flagged images from conservative commentators such as Chuck Calesto and Dinesh D’Souza, both of whom have been caught spreading campaign disinformation in the past.

When it comes to accusations of AI falsification, the more diverse sources of information you have, the better. Although a single source can easily generate a believable-looking image of an event, multiple independent sources showing the same event from different angles are much less likely to engage in the same hoax. Photos that pair with video evidence are even better, especially since creating convincing long-term videos of people or complex scenes remains a challenge for many AI tools.

It’s also important to track down the original source of any alleged AI imagery you view. It’s incredibly easy for a social media user to create an AI-generated image, claim it came from a news report or live footage of an event, then use obvious flaws in that fake image as “proof” that the event itself was faked. Links to original images on the original source’s own website or verified account are much more reliable than screenshots, which could have originated from anywhere (and/or been modified by anyone).

Telltale signs

While tracking down original and/or corroborating sources is useful for major news events like a presidential rally, confirming the authenticity of images and videos from a single source can be more difficult. Tools like Winston AI Image Detector or IsItAI.com claim to use machine learning models to determine whether an image is AI-generated. But while detection techniques continue to evolve, these kinds of tools rest on detection theories that haven’t been shown reliable in any extensive study, making false positives and false negatives a real risk.

Writing on LinkedIn, UC Berkeley professor Hany Farid cited two GetReal Labs models that showed “no evidence of AI generation” in the Harris rally photos released by Trump. Farid went on to cite specific parts of the image that point to its authenticity.

“The text on the characters and the plane show none of the usual signs of generative AI,” Farid wrote. “Although the lack of evidence of tampering is not proof that the image is real. We find no evidence that this image was AI-generated or digitally altered.”

And even when parts of a photo seem like nonsensical signs of AI manipulation (à la deformed hands in some AI image models), consider that there may be a simple explanation for an apparent optical illusion. The BBC notes that the lack of crowd reflections on the plane in some of Harris’s rally photos may be due to a large, empty area of tarmac between the plane and the crowd, as shown in reverse angles of the scene. Simply circling odd-looking things in a photo with a red marker isn’t conclusive evidence of AI manipulation by itself.
