In 2024, the increased adoption of biometric surveillance systems, such as AI facial recognition in public places and for access to government services, will drive both biometric identity theft and anti-surveillance innovation. Individuals aiming to steal biometric identities to commit fraud or gain unauthorized access to data will be aided by generative AI tools and the abundance of facial and voice data published online.
Voice clones are already being used for fraud. Take, for example, Jennifer DeStefano, a mother in Arizona who answered a call from an unknown number and heard her daughter’s panicked voice crying “Mom, these bad men got me!” The fraudster demanded money. DeStefano was eventually able to confirm that her daughter was safe. This scam is a precursor to more sophisticated biometric scams that will prey on our deepest fears, using the images and sounds of our loved ones to coerce us into doing the bidding of whoever wields these tools.
In 2024, some governments are likely to adopt biometric mimicry to support psychological torture. In the past, a person of interest could be told false information with little evidence to support the claims other than the interrogator’s words. Today, a person being questioned may have been arrested because of a false facial recognition match. Black men in the United States, including Robert Williams, Michael Oliver, Nijeer Parks, and Randal Reid, have been wrongfully arrested, detained, and jailed for crimes they did not commit because of mistaken identity. They are among the groups, also including the elderly, people of color, and gender-nonconforming individuals, at higher risk of facial misidentification.
Generative AI tools also give intelligence agencies the ability to fabricate evidence, such as a video recording of an alleged co-conspirator confessing to a crime. Perhaps just as distressing, the power to create digital doppelgangers will not be limited to organizations with big budgets. Open-source generative AI systems that can produce human voices and fake videos will increase the spread of revenge porn, child sexual abuse material, and more on the dark web.
Heading into 2024, we will have a growing number of “excoded” communities and people — those whose life chances have been negatively altered by AI systems. At the Algorithmic Justice League, we have received hundreds of reports of compromised biometric rights. In response, we will witness the rise of the faceless: those committed to keeping their biometric identities hidden from view.
Because biometric rights will vary around the world, fashion choices will reflect regional biometric regimes. Face coverings, such as those worn for religious purposes or medical masks used to prevent the spread of viruses, will be adopted both as fashion statements and as anti-surveillance garments where permitted. In 2019, after protesters in Hong Kong began destroying surveillance equipment while concealing their appearance, the city’s government banned face masks.
In 2024 we will start to see a bifurcation between mass-surveillance territories and free-person territories, the latter governed by laws like the provision in the proposed EU AI Act that bans the use of live biometrics in public places. Even in free-person territories, the anti-surveillance fashion will flourish; after all, facial recognition can still be applied retroactively to video feeds. Parents will fight to protect children’s right to be “biometrically naïve”, meaning that none of their biometric data, such as their faceprint, voice, or iris pattern, is scanned and stored by government agencies, schools, or religious institutions. New eyewear companies will offer lenses that distort cameras’ ability to easily capture eye biometrics, and pairs of glasses will come with prosthetic extensions to change the shape of the nose and cheeks. 3D-printing tools will be used to make facial prostheses at home, though depending on where you are in the world, this may be prohibited. In a world where the face is the last frontier of privacy, the sight of another’s unaltered face will be a rare intimacy.