Google has rolled out updates to Search intended to make explicit deepfakes as hard to find as possible. As part of its ongoing fight against realistic-looking manipulated images, the company is making it easier for people to remove non-consensual fake images featuring them from Search.
Users have long been able to request the removal of such images under Google’s policies. Now, whenever Google grants a removal request, it will also filter all explicit results out of similar searches about that person. The company’s systems will look for duplicates of the offending image and remove those as well. This should help alleviate victims’ fears that the same image will resurface on other websites.
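Google hasn’t said how its duplicate detection works, but a common technique for spotting near-identical copies of an image is perceptual hashing. The sketch below uses a simple “difference hash” (dHash) for illustration; the threshold value and the downstream filtering step are assumptions, not Google’s actual pipeline.

# Illustrative sketch of near-duplicate image detection via perceptual
# hashing (dHash). Google's real system is not public; this only shows
# the general idea of matching copies of a removed image.
from PIL import Image

def dhash(path, hash_size=8):
    """Reduce the image to grayscale gradients and pack them into bits."""
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def is_duplicate(hash_a, hash_b, threshold=5):
    """Small Hamming distance between hashes suggests the same image."""
    return bin(hash_a ^ hash_b).count("1") <= threshold

# Usage: compare a newly indexed image against one already removed.
# if is_duplicate(dhash("removed.jpg"), dhash("candidate.jpg")):
#     ...  # filter the candidate from results (hypothetical step)

Perceptual hashes are robust to resizing and light re-encoding, which is why they are widely used for this kind of matching, though a production system would likely combine several signals.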
Additionally, Google has updated its ranking systems so that when a user specifically searches for explicit deepfakes of a person by name, the results will surface “high-quality, non-explicit content” instead. If there are news articles about that person, for example, those will appear. Google’s announcement also suggests it plans to educate users searching for deepfakes by showing results that analyze their impact on society.
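Conceptually, this behaves like a query-time re-ranker: once a query is classified as seeking explicit deepfakes of a named person, explicit results are suppressed and news or analysis coverage is promoted. Here is a minimal sketch of that idea; the result fields and the classifier flag are hypothetical, since Google hasn’t published how these calls are made.

# Hypothetical query-time re-ranking for deepfake-seeking queries.
# The is_explicit / is_news fields and the query classification are
# assumptions used only to illustrate the described behavior.
from dataclasses import dataclass

@dataclass
class Result:
    url: str
    score: float
    is_explicit: bool
    is_news: bool

def rerank(results, query_seeks_deepfakes):
    if not query_seeks_deepfakes:
        return sorted(results, key=lambda r: r.score, reverse=True)
    # Drop explicit results entirely, then boost news/analysis coverage.
    safe = [r for r in results if not r.is_explicit]
    return sorted(safe, key=lambda r: (r.is_news, r.score), reverse=True)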
However, Google doesn’t want to remove results for legitimate content, such as an actor’s nude scene, in its attempt to banish deepfakes from its results page. It admits it still has a lot of work to do in separating legitimate explicit images from fake ones. While that remains a work in progress, one solution it has already implemented is demoting sites in Search that have received a high volume of takedown requests for manipulated images. That’s “a pretty strong signal that it’s not a high-quality site,” Google explains, adding that the approach has worked well against other types of harmful content in the past.
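A demotion signal like this could be as simple as scaling a site’s ranking score by its takedown history. The sketch below is invented for illustration: the decay formula, the half-life parameter, and the example counts are all assumptions, not Google’s disclosed method.

# Illustrative demotion signal: sites accumulating confirmed takedowns
# for manipulated images have their ranking scores scaled down.
# The formula and parameters are invented; Google hasn't shared specifics.
def demotion_factor(takedown_count, half_life=20):
    """Score multiplier that halves every `half_life` confirmed takedowns."""
    return 0.5 ** (takedown_count / half_life)

def adjusted_score(base_score, takedown_count):
    return base_score * demotion_factor(takedown_count)

# A site with no takedowns keeps its score; one with 100 is heavily demoted.
print(adjusted_score(1.0, 0))    # 1.0
print(adjusted_score(1.0, 100))  # ~0.03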