London CNN — 

The second you enter the United States pavilion at London Design Biennale, you feel you’re being watched. But spend a little time in this room in London’s palatial Somerset House and its creators hope you’ll become aware that you are – and always will be – under the eye of powerful systems beyond your control.

The installation, titled “Face Values” and curated by the Cooper Hewitt, Smithsonian Design Museum in New York, brings together unsettling examples of technology designed to read our faces, understand our emotions and intuit our identities.

Sit in either of two chairs and cameras equipped with facial detection technology will scan the slightest curve of your brow or wrinkle of your nose for signs of emotion. Everything captured returns as various dubious bits of feedback – like inaccurate assessments of your age or inscrutable “happiness” scores. Caroline Baumann, director of Cooper Hewitt, considers it playful but “charged.”

“I think it can be heartbreaking to think we are being turned into sets of data or facts. But that’s what it is,” she said in a phone interview.

The installation is one of the 40 pavilions at London Design Biennale, an international showcase of design, with a theme of “Emotional States.”

Courtesy David Levene
Installation view of "Face Values," the United States' pavilion at the London Design Biennale.

“Face Values,” which won an award for the most inspiring interpretation of the theme, is the latest in a string of creative projects confronting the invasion of networked digital technology into our normally private inner worlds.

The show reflects a time when the public’s trust in the future of design and technology is crashing, said Ellen Lupton, senior curator of contemporary design at Cooper Hewitt: “It’s definitely changed, this assumption that technology is a liberating force.”

The two chairs at the center of “Face Values” sit in front of installations by American designers Zachary Lieberman and R. Luke DuBois. Lieberman’s reveals the workings of facial detection, scanning visitors to show how the technology turns a person’s appearance into a set of data points. It then playfully superimposes similar-looking body parts captured from other visitors onto your face. The result is a kind of police composite photo, mocked up live on-screen, as you contort and skew your face.
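For a sense of what “turning a face into data points” means in practice, here is a minimal sketch using OpenCV’s stock Haar-cascade detector – a generic illustration of the technique, not the code behind Lieberman’s piece. The input filename is hypothetical.

```python
# A generic sketch of face detection, not Lieberman's actual code:
# OpenCV's bundled Haar-cascade model finds faces in an image and
# reduces each one to a handful of numbers.
import cv2

# Load OpenCV's stock frontal-face detector.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

image = cv2.imread("visitor.jpg")  # hypothetical input image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Each detected face becomes a tuple of data points: (x, y, width, height).
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    print(f"face at ({x}, {y}), {w}x{h} pixels")
```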

Courtesy David Levene
A visitor interacts with Zachary Lieberman's Expression Mirror in Cooper Hewitt's "Face Values" installation at the 2018 London Design Biennale.

But DuBois’ display is more unnerving. Visitors are recorded as they are asked to perform a specific emotion, like joy or calmness, and rated for their efforts by an artificial intelligence program. At the same time, the program feeds back information it has gleaned – much of it inaccurate – about their age, sex and ethnicity.

For visitors, it’s disconcerting to sit grinning to max out your “joy” rating while being told you look old. It gives a taste of how such systems can influence and subdue us.
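To see how a percentage-style “joy” score might be produced, consider this toy sketch: an emotion classifier emits one raw score per label, and a softmax converts those scores into probabilities. The label set and numbers here are invented for illustration; this is a generic technique, not DuBois’ system.

```python
# Toy illustration (not DuBois' system): a softmax turns an emotion
# classifier's raw outputs into probability-style scores. Labels and
# logits below are invented.
import numpy as np

EMOTIONS = ["joy", "calm", "anger", "surprise"]  # hypothetical labels

def emotion_scores(logits: np.ndarray) -> dict:
    """Convert raw classifier outputs into per-emotion probabilities."""
    exp = np.exp(logits - logits.max())  # shift by max for stability
    probs = exp / exp.sum()
    return {label: round(float(p), 3) for label, p in zip(EMOTIONS, probs)}

# Logits a model might emit for a broadly grinning visitor.
print(emotion_scores(np.array([2.1, 0.3, -1.0, 0.2])))
# {'joy': 0.735, 'calm': 0.122, 'anger': 0.033, 'surprise': 0.11}
```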

“People know (this technology) exists, but they don’t know the numbers of governments and marketing companies that are collecting this information,” says Baumann. “We do want people to exit realizing that the technology is not A+.”

Courtesy David Levene
R. Luke DuBois photographed with his work in Cooper Hewitt's "Face Values" installation.

A global cloak

We are on the brink of facial recognition taking over our daily lives, predicts Pam Dixon, an expert on biometrics and executive director of the World Privacy Forum.

“My best guess is that we’ll see the entire world cloaked in facial recognition technology and other biometric identification technologies within five years,” Dixon said in a phone interview. “It’s going to be very difficult to miss.”

Forms of facial recognition technology have already been rolled out on Apple’s iPhone X, where it’s used to unlock the screen, and across Facebook’s user-uploaded photo albums to identify you in pictures.

But police and security forces have long been among the biggest backers of the technology, though its use has been far from foolproof. This year, London’s Metropolitan Police ended its use of facial recognition software at Notting Hill Carnival, a celebration of British-Caribbean culture and the country’s biggest street festival. For the previous two years, the Met had scanned crowds for faces of suspected criminals but repeatedly misrecognized people, according to independent legal observers.

Bloomberg via Getty Images
A customer sets up facial recognition on an iPhone X.

In the light-hearted setting of the Biennale, visitors react loudest to the shock of being mis-aged. In the real world, where wrong readings by law enforcement can have dire consequences, facial recognition technology’s dismal record on race and gender parity has been its most consistent and potentially catastrophic failing.

Facial recognition apps made by Microsoft, IBM and Chinese start-up Megvii are more than 99% accurate in determining the gender of light-skinned men, researchers from the Massachusetts Institute of Technology and Stanford University found in February. But in the same tests on darker-skinned women, the systems erred up to 35% of the time.
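The disparity only becomes visible when accuracy is measured per demographic group rather than as a single aggregate figure. A minimal sketch of that kind of audit, on made-up data:

```python
# Sketch of a per-group accuracy audit, in the spirit of the
# MIT/Stanford study. The data below is entirely made up.
import pandas as pd

results = pd.DataFrame({
    "group":   ["lighter-skinned men"] * 4 + ["darker-skinned women"] * 4,
    "correct": [True, True, True, True, True, False, True, False],
})

# A single overall figure hides the gap...
print(f"overall error rate: {1 - results['correct'].mean():.0%}")

# ...which appears only when results are broken out by group.
print(1 - results.groupby("group")["correct"].mean())
```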

In a recent test by the American Civil Liberties Union (ACLU), Amazon’s facial recognition system – which the company has licensed to law enforcement agencies and private sector companies – scanned the faces of all 535 members of the US Congress against 25,000 police mugshots and generated 28 false matches. The test also found that African-American and Latino members of Congress were disproportionately misidentified as criminals.
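One reason such searches go wrong at scale: face systems typically reduce each photo to an embedding vector and flag any gallery face whose similarity to the probe exceeds a threshold, so even a tiny per-comparison false-match rate adds up over 25,000 comparisons. The sketch below uses random vectors and an arbitrary threshold purely to illustrate that effect; none of it reflects Amazon’s system.

```python
# Illustrative only: random "embeddings" and an arbitrary threshold,
# not Amazon's system. With 25,000 comparisons, even a tiny
# per-comparison false-match rate produces spurious hits.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 128-dimensional embeddings: one lawmaker's photo (the
# probe) versus a gallery of 25,000 mugshots of different people.
probe = rng.normal(size=128)
gallery = rng.normal(size=(25_000, 128))

# Cosine similarity of the probe against every gallery face.
sims = gallery @ probe / (
    np.linalg.norm(gallery, axis=1) * np.linalg.norm(probe)
)

# Every score above the threshold is a false match, since no gallery
# face is actually the probe; lowering the threshold returns more.
THRESHOLD = 0.30
print(f"false matches: {(sims > THRESHOLD).sum()} of {len(gallery)}")
```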

History of persecution

“It’s not a brand-new thing,” explains Lupton, linking the algorithms to the Victorian vogue for measuring everything from the size of the nose to the distance between the eyes, in the belief that such figures would explain a person’s character.

“It relates to the origins of modern policing and psychology. The first mugshots are all about measuring the face and looking for statistical similarities between criminals.”

But now is a “fertile time” to explore these ideas, Baumann chimes in. In roughly the last two years, since the shock election of President Donald Trump and the UK’s referendum decision to leave the European Union, trust in technology has plummeted as investigations have revealed the role of companies like Cambridge Analytica in using sophisticated data to sway votes, manipulate sentiment and spread disinformation.

For examples of how dystopic things can get, Lupton points to China. In the Xinjiang region, on China’s westernmost edge, black-box systems feed authorities data from untold thousands of surveillance cameras, along with data ranging from “consumer habits to banking activity, health status and indeed the DNA profile of every single inhabitant of Xinjiang,” according to reports by German news magazine Der Spiegel.

Algorithms trawl this data for any irregular activity, and thousands have been arrested. Similar surveillance is set to be rolled out across the 1.4 billion-person state in the form of an all-powerful system that will give every person a so-called social credit score, dictating access to everything from housing to dating apps.

Amid all this fear, Baumann is confident that designers and museums have a role to play in educating visitors, and perhaps even spurring some to action.

“As more and more people become aware of this, I think that there will be dissent and change.”

“Face Values” is on view at Somerset House as part of the London Design Biennale until Sept. 23, 2018.