A few weeks ago, a Google search for “deepfake nudes jennifer aniston” yielded at least seven highly ranked results that claimed to contain explicit, AI-generated images of the actress. Now they are gone.
Google product manager Emma Higham says new adjustments to how the company ranks results, rolled out this year, have already reduced exposure to fake explicit images by more than 70 percent on searches seeking that content for a specific person. Where problematic results once may have appeared, Google’s algorithms now aim to promote news articles and other non-explicit content. A search for Aniston now returns articles such as “How Taylor Swift’s Deepfake AI Porn Is a Threat,” along with links like an Ohio attorney general’s warning about “celebrity endorsement deepfake scams” targeting consumers.
“With these changes, people can read about the impact deepfakes have on society instead of seeing pages with actual nonconsensual fake images,” Higham wrote in a company blog post on Wednesday.
Although Google has made it easier to request the removal of objectionable explicit content, victims and their advocates have called for more proactive steps. But the company has tried to avoid becoming too much of a regulator of the internet or harming access to legitimate porn. In response, a Google spokesperson said that multiple teams were working to strengthen safeguards against what the company calls non-consensual explicit imagery (NCEI).
The increasing availability of AI image generators, including some with few restrictions on their use, has led to an uptick in NCEI, according to victim advocates. These tools have made it easy for almost anyone to create fake, explicit images of any person’s face, whether a high school classmate or a mega-celebrity.
As part of Google’s new crackdown, Higham says, the company will begin applying three of the measures it uses to reduce the discoverability of real but unwanted explicit images to content that is synthetic and unwanted. After Google honors a takedown request for a sexualized deepfake, it will try to keep duplicates out of results. It will also filter explicit images out of results for queries similar to the one cited in the takedown request. Finally, websites subject to a “high volume” of successful takedown requests will be demoted in search results.
“These efforts are designed to give people additional peace of mind, especially if they are concerned about similar content about them appearing in the future,” Higham wrote.
Google has acknowledged that the measures aren’t working perfectly, and former employees and victim advocates have said they could go much further. The search engine clearly warns people in the US who search for nude images of children that such content is illegal. The warning’s effectiveness is unclear, but it is a potential deterrent supported by advocates. Yet, despite laws against sharing NCEI, no such warnings appear for searches seeking sexualized deepfakes of adults. A Google spokesperson confirmed that this will not change.