Most of the threats posed by AI are decidedly uncertain. The threat posed by the singularity — the point at which robots gain general intelligence and potentially topple their human overlords — remains unclear, its likelihood far from certain. Artificial intelligence may lead to massive job losses. Then again, it may not. After all, every previous technological transformation eventually created more total jobs, not fewer. Autonomous warfare seems to pose very real risks. Yet much of what future AI-based warfare will look like is still guesswork, premised on weapons that don't yet exist.
But then there's facial recognition. This technology is already here and is already being applied in ways that evoke 1984, most notably in China (and Russia). It's not yet clear whether our own tech industry will take the United States down a similar path.
Facial recognition is at the heart of China's sweeping new plans for surveillance. The country already has 200 million surveillance cameras — four times as many as the U.S. — with plans to double that number by 2020. And Chinese authorities can trumpet some notable accomplishments. During the annual beer festival in Qingdao, AI-powered cameras allowed the police to snatch two dozen criminal suspects. In Wuhu, a fugitive murder suspect was identified by camera while buying food from a street vendor. And in Zhengzhou, facial recognition glasses allowed a police officer to spot a heroin smuggler at a train station.
Of course, apprehending criminals is just one facet of a much more ambitious agenda. Over the last handful of years, China has been rolling out a social credit score for assessing the trustworthiness of each of its citizens, based on bank accounts, court records, internet-search histories, consumer and social media habits, political persuasions — essentially any and every behavior for which the government can collect data. For now, the scores are voluntary, but they will be mandatory for all citizens come 2020.
But it is facial recognition that connects all this online surveillance to the real world — making it possible for authorities to develop comprehensive profiles of where each citizen goes and with whom they associate. Which means activities like attending a political rally, carrying a protest sign, or even being spotted in the company of someone deemed suspicious are all likely to lower one's score. And a lowered score may mean, for example, that one no longer qualifies for high-speed internet, or that certain jobs, or colleges, or neighborhoods, are now off-limits . . . or that the police show up at your door.
The implications are stark. “This is potentially a totally new way for the government to manage the economy and society,” said Martin Chorzempa, a fellow at the Peterson Institute for International Economics. “The goal is algorithmic governance.”
Maya Wang, a senior researcher for the Asia division of Human Rights Watch, adds, “People in China don’t know 99.99 percent of what’s going on in terms of state surveillance. Most people think they can say what they want and live freely without being monitored, but that’s largely an illusion.”
Interestingly, Seattle finds itself at the heart of facial recognition developments and debate in the U.S. Our two largest local tech companies, Amazon and Microsoft, are both leaders in the field. And two of the other biggest players, Alphabet (Google) and Facebook, have major campuses in Seattle as well — to ensure they get their fair share of the AI/computer science talent emanating from UW’s acclaimed Data Science department.
But it's Amazon and Microsoft that most starkly showcase opposite sides of the facial recognition debate. Amazon evidently takes the "it's just business" view. The company is busy encouraging police departments across the country to deploy its Rekognition software. Despite formal pushback from more than three dozen civil rights organizations, Amazon's only concern seems to be ensuring that Rekognition's market share continues to grow. It offers free consulting to government customers, and has solicited feedback on new product features for law enforcement. Amazon imposes no meaningful restrictions on how governments can use Rekognition. Instead, apparently in earnest, it argues that facial recognition "must be good since it was used at the royal wedding."
Earlier this month, Microsoft took a very different stance. In a lengthy blog post by president Brad Smith, the company became the first tech giant to call for government-imposed limits on the use of facial recognition technology. Smith wrote, “Facial recognition technology raises issues that go to the heart of fundamental human rights protections like privacy and freedom of expression.” He added, “These issues heighten responsibility for tech companies that create these products.”
Smith then made a comparison to products like medicines and cars that, for public safety reasons, are highly regulated — and called for a similar approach with facial recognition. He added, “If we move too fast with facial recognition, we may find that people’s fundamental rights are being broken.”
Smith's blog post drew considerable media coverage, including from the New York Times and the Washington Post (owned by Amazon's Jeff Bezos), which wrote, "When technology companies such as Microsoft acknowledge that their software comes with risks, it is time to sit up and take notice. Congress should step in and find ways to balance the public benefits of facial recognition with the obvious privacy concerns."
People of faith are especially sensitive to this issue. Not because we are more averse to apprehending criminals than anyone else, but because governments — totalitarian ones, certainly, but sometimes democratic ones as well — too often equate those who are criminal with those who are religious.
Two years ago China’s State Administration of Religious Affairs (SARA) ordered the installation of surveillance cameras in major churches, mosques, and temples. The effort has been particularly vigorous in Zhejiang province, known for high Christian concentrations in cities like Wenzhou and Ningbo. Pastors at the larger churches say their congregations include prominent Communist party and business leaders. But since President Xi Jinping requires that all Communist Party members be atheists, facial recognition poses an obvious threat.
In western China, mass-surveillance software is being installed specifically to track members of the Uighur Muslim minority and map their relations with friends and family. Even in the U.S., our current administration certainly gives Muslims concern as to whether the government is interested in spying on their activities and associations.
Freedom of religion and freedom of association are essential human rights. Widespread deployment of facial recognition technology poses a grave threat to these fundamental freedoms. Unless we are very careful, we risk bringing upon ourselves exactly the outcome about which George Orwell warned. He just didn’t realize Big Business would be giving Big Brother such a big assist.