Facial Recognition: For Business and Society, What Is the Promise and Peril?

By John Surico | July 25, 2019

Facial recognition: those two words can conjure up extremes in our imagination. One is a future where your specific smile will effortlessly open doors and devices for you. The other is where your every movement is tracked by cameras, with the data routed to unknown companies—and authorities.

In an increasingly connected world, facial-recognition software could potentially supercharge the rise of surveillance capitalism, in which our words and actions are harvested to feed the algorithms of advertising and marketing. As a result, it represents one of the next great debates about technology and privacy.

In other words: now that iPhones can be unlocked by scanning your face, imagine what comes next.

Corporations have clearly taken notice of the opportunities, charging full steam ahead with software that will monitor and influence our everyday interactions, both as citizens and consumers. So how can this technology be used in smart and strategic ways, with benefits to individuals and society? And what are the insidious risks? Those questions were up for debate Wednesday in a panel discussion at NYU’s Center for Urban Science and Progress, organized by the Downtown Brooklyn Partnership.

The most visible manifestations of the facial-recognition revolution are the security cameras bristling from our buildings and lamp posts. New York City reportedly has more than 18,000 cameras watching the city’s 8.5 million residents and tens of millions of visitors.

Imagery from those cameras helps the New York City Police Department (NYPD) track down alleged criminals in an “investigative-driven” format, said Assistant Chief Jason Wilcox, who works in the department’s Detective Bureau. If photographs are taken by the victim or by a surveillance camera—say, during an assault on the subway—the department will try to match the facial features with photos already in the NYPD’s massive database of prior arrests.

“When they [run] the technology, and get a match, and it comes back to a person that has been arrested,” he explained, it’s a lead to be used in building a case, but not yet probable cause for arrest. “So we give it back to the investigator we got it from, and now we say, ‘Okay, now you have to go do your work, and make a proper identification, and make an arrest.’”
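To make the distinction between a lead and an identification concrete, here is a minimal sketch of how one-to-many face matching is commonly built: a probe image’s embedding is compared against a gallery of enrolled embeddings, and anything above a similarity threshold is surfaced for a human investigator to verify. This is a generic illustration with hypothetical names, not the NYPD’s actual system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two face embeddings, in [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def candidate_leads(probe: np.ndarray,
                    gallery: dict[str, np.ndarray],
                    threshold: float = 0.6,
                    top_k: int = 5) -> list[tuple[str, float]]:
    """Compare a probe embedding (e.g., from a surveillance still)
    against a gallery of prior-arrest photo embeddings. Anything
    returned is a lead for an investigator, not an identification."""
    scored = [(pid, cosine_similarity(probe, emb))
              for pid, emb in gallery.items()]
    strong = [s for s in scored if s[1] >= threshold]
    return sorted(strong, key=lambda s: s[1], reverse=True)[:top_k]
```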

Speakers at the panel discussion in Brooklyn, moderated by Tyler Woods, at far right (Photo courtesy of Downtown Brooklyn Partnership)

Wilcox asserted that the department does not use cameras en masse, a la Minority Report, to pinpoint who is wanted on the streets of New York City. A reason to investigate must first be in place before the department activates the software. (Research has found that the technology has allowed detectives to make at least 2,900 arrests in over five years of use.)

In terms of pro-and-con impacts, “The bottom-line pro, the way we apply it, is to make New York City safer. And we do it fairly, and responsibly,” Wilcox said. “The cons, the concerns, are the things that we steer away from: the mass viewing, the people walking down the street, trying to identify, profiling, or anything like that. That is not what we use it for.”

Policing aside, the software has dramatic implications for the future of marketing, said Noah Levenson, an artist and technologist in residence at the Mozilla Foundation. “Smart ads” have already popped up in Europe, he said. People passing a billboard in Oslo, Norway, were scanned to identify their gender; women were shown an ad for a salad, while men were shown one for a sausage pizza.
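The mechanics behind such a billboard are easy to caricature: detect a face, run an attribute classifier on it, and swap the creative accordingly. The sketch below shows the shape of that logic; the names and the stub classifier are entirely invented, not the Oslo system’s actual code.

```python
from typing import Callable

# A caricature of the Oslo billboard's logic. All names are invented;
# no real vendor's API is being shown.
AD_BY_GENDER = {"female": "salad.jpg", "male": "sausage_pizza.jpg"}
DEFAULT_AD = "generic.jpg"

def pick_ad(face_crop, gender_model: Callable[[object], str]) -> str:
    """Run a face-attribute classifier on a detected face and pick a
    creative; fall back to a neutral ad for any unrecognized label."""
    label = gender_model(face_crop)
    return AD_BY_GENDER.get(label, DEFAULT_AD)

# Usage with a stub in place of a trained classifier:
print(pick_ad(None, lambda crop: "female"))  # salad.jpg
```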

But it goes beyond mere identification: emotion recognition, he said, is the next frontier for researchers and companies. “So is this person happy? Are they angry? Are they drunk? Are they on illegal drugs? Are they mentally ill? Will we eventually be able to learn something about the human face such that we can predict when someone's about to commit a crime, without knowing anything else about them?” he asked. “These are some of the things that are coming next. And I mostly look at how this stuff is going to wind up in consumer products, and household applications.”

Levenson cited Snapchat as an example. In 2018, the U.S. government approved a patent from the social-media platform for using facial-recognition software to detect mood at certain places. What does that mean for users? That out of the bazillions of selfies taken per day, Snapchat will soon be able to match your expression to your geolocation, thereby understanding how you feel at a particular place and time. “Then they're going to sell that data to the organizers of public events, concert promoters, or organizers of political rallies, talks, meetups,” he said. “And whoever else, we don't know.”
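The value in such a patent, as Levenson describes it, comes from joining an expression label to a time and place and then aggregating. A hedged sketch of that bookkeeping, with invented names rather than anything from the actual filing:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class MoodPing:
    user_id: str
    place_id: str      # e.g., a concert venue or rally site
    emotion: str       # output of an expression classifier
    timestamp: float

def mood_by_place(pings: list[MoodPing]) -> dict[str, dict[str, int]]:
    """Tally detected emotions per place -- the kind of aggregate that
    could be sold to event organizers."""
    counts = defaultdict(lambda: defaultdict(int))
    for p in pings:
        counts[p.place_id][p.emotion] += 1
    return {place: dict(c) for place, c in counts.items()}
```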

And this software can be used on the back end of capitalism, too, Levenson said. Major companies like Unilever and Goldman Sachs have begun to incorporate facial-recognition software into employee recruitment. By analyzing features like body language, tone, and keywords that the company enters into the system, AI can help do the work of finding the “ideal candidate,” or so the thinking goes.

But, of course, all this potential is not without peril, according to those who worry about the technology overstepping its bounds. Jonathan Stribling-Uss, a technologist fellow at the New York City chapter of the American Civil Liberties Union (ACLU), flagged his organization’s particular worries about the dragnet capabilities of the NYPD and other government agencies. He mentioned a recent story about how the police, tracking a suspected beer thief, ran a photo of Woody Harrelson through their software after the fuzzy security-camera image of the suspect came up with no match; to human eyes, the suspect looked a lot like the actor. (Wilcox contended that this was not standard protocol.)

In the future, would the faces of an audience like the one at Wednesday’s panel discussion be scanned to find out names, addresses—and emotions? (Photo courtesy of Downtown Brooklyn Partnership)

“I think it's important for people to understand that this is something that's happening right now,” Stribling-Uss said. He then asked the audience to raise their hand if they were taught not to give out their name to strangers when they were children. “Unfortunately that's what facial recognition dragnets do: they give out name and address by default ... and this is what we're seeing happening with law enforcement.” This concern, he said, has led to bans of facial recognition software in such cities as Oakland, Calif., and Somerville, Mass.

Stribling-Uss said the scenario that keeps privacy advocates up at night is something akin to the social-credit system arising in China, where citizens are tracked and punished with bans on travel or other activities for transgressions like not paying their debts. “I think we're coming closer to that,” he argued. Prospectively, “people can be tracked and banned on the basis of characteristics they can't change.”

Facial recognition is also not without its vulnerabilities. While machine learning is constantly improving, it’s important to be cognizant of its limitations, said Nasir Memon, a professor of computer science at NYU Tandon School of Engineering. Sure, you might one day have your front door open for you, or buy a coffee at your local cafe without pulling out your wallet, but things can go wrong.

Memon outlined four ways that pattern recognition—which is what the software ultimately is, he said—can be hacked: through a mask, or spoofed appearance; through understanding the model’s loopholes; through a picture; or through a “master face,” like the universal key that has been used to fool fingerprint-scanning software. “So what has to counter all of these possibilities? And they'll always be there: security is a cat-and-mouse game,” he explained. “But the danger, I think, is to anonymity.”
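The “master face” risk in particular is easy to illustrate numerically. In the toy model below (a sketch, not an attack on any deployed system), every enrolled embedding shares a common component, loosely as real face embeddings do; a single vector aligned with that component then clears a loose acceptance threshold for most of the gallery, even though ordinary impostor pairs score far lower.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_users = 128, 1000

# Toy model: every enrolled embedding shares a common component,
# loosely mimicking the fact that all human faces resemble one another.
common = rng.standard_normal(d)
common /= np.linalg.norm(common)

def toy_embedding() -> np.ndarray:
    noise = rng.standard_normal(d)
    noise /= np.linalg.norm(noise)
    e = 0.5 * common + 0.87 * noise
    return e / np.linalg.norm(e)

gallery = np.stack([toy_embedding() for _ in range(n_users)])
master = common          # a vector aligned with the shared component
threshold = 0.40         # a deliberately loose acceptance threshold

scores = gallery @ master            # cosine similarity (unit vectors)
print(f"master vector accepted for {np.mean(scores > threshold):.0%} of users")
print(f"typical impostor score: {gallery[0] @ gallery[1]:.2f}")
```

Tightening the threshold closes that particular hole at the cost of more false rejections, which is exactly the cat-and-mouse trade-off Memon describes.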

A citizen should have the right to stay anonymous if they choose to, said Memon, whether it’s Snapchat scanning our selfies for marketing or governments using cameras for surveillance. Without that ability, dissent can be discouraged, he added, harming both individual freedom and the health of society overall.

“It's how we evolve our society,” he said. “And if everybody is recognized everywhere all the time, you have a problem.”

John Surico is a freelance journalist and researcher based in New York City. His reporting has appeared in the New York Times, VICE, and a number of other local and national publications.