‘An Invisible Cage’: How China Is Policing the Future

Three people with criminal records check into the same hotel in southeast China. An automated system is designed to alert the police.

A man with a history of political protest buys a train ticket to Beijing. The system could flag the activity as suspicious and tell the police to investigate.

A woman with mental illness in Fujian leaves her home. A camera installed by her house records her movements so the police can track her.

Across China, the police are buying technology that harnesses vast surveillance data to predict crime and protest before they happen. The systems and software are targeting people whose behavior or characteristics are suspicious in the eyes of an algorithm and the Chinese authorities, even if they’ve done nothing wrong.

The more than 1.4 billion people living in China are constantly watched. They are recorded by police cameras that are everywhere, on street corners and subway ceilings, in hotel lobbies and apartment buildings. Their phones are tracked, their purchases are monitored, and their online chats are censored.

Now, even their future is under surveillance.

The latest generation of technology digs through the vast amounts of data collected on their daily activities to find patterns and aberrations, promising to predict crimes or protests before they happen. They target potential troublemakers in the eyes of the Chinese government — not only those with a criminal past but also vulnerable groups, including ethnic minorities, migrant workers and those with a history of mental illness.

They can warn the police if a victim of a fraud tries to travel to Beijing to petition the government for payment or a drug user makes too many calls to the same number. They can signal officers each time a person with a history of mental illness gets near a school.

It takes extensive evasive maneuvers to avoid the digital tripwires. In the past, Zhang Yuqiao, a 74-year-old man who has been petitioning the government for most of his adult life, could simply stay off the main highways to dodge the authorities and make his way to Beijing to fight for compensation over the torture of his parents during the Cultural Revolution. Now, he turns off his phones, pays in cash and buys multiple train tickets to false destinations.

While largely unproven, the new Chinese technologies, detailed in procurement and other documents reviewed by The New York Times, further extend the boundaries of social and political controls and integrate them ever deeper into people’s lives. At their most basic, they justify suffocating surveillance and violate privacy, while in the extreme they risk automating systemic discrimination and political repression.

Surveillance cameras set up in April at a residential compound in Mudanjiang, Heilongjiang Province.
Credit: China Daily, via Reuters

For the government, social stability is paramount and any threat to it must be eliminated. During his decade as China’s top leader, Xi Jinping has hardened and centralized the security state, unleashing techno-authoritarian policies to quell ethnic unrest in the western region of Xinjiang and enforce some of the world’s most severe coronavirus lockdowns. The space for dissent, always limited, is rapidly disappearing.

“Big data should be used as an engine to power the innovative development of public security work and a new growth point for nurturing combat capabilities,” Mr. Xi said in 2019 at a national public security work meeting.

The algorithms, which would prove controversial in other countries, are often trumpeted as triumphs.

In 2020, the authorities in southern China denied a woman’s request to move to Hong Kong to be with her husband after software alerted them that the marriage was suspicious, the local police reported. An ensuing investigation revealed that the two were not often in the same place at the same time and had not spent the Spring Festival holiday together. The police concluded that the marriage had been faked to obtain a migration permit.

The same year in northern China, an automated alert about a man’s frequent entry into a residential compound with different companions prompted the police to investigate. They discovered that he was a part of a pyramid scheme, according to state media.

The details of these emerging security technologies are described in police research papers, surveillance contractor patents and presentations, as well as hundreds of public procurement documents reviewed and confirmed by The Times. Many of the procurement documents were shared by ChinaFile, an online magazine published by the Asia Society, which has systematically gathered years of records on government websites. Another set, describing software bought by the authorities in the port city of Tianjin to stop petitioners from going to neighboring Beijing, was provided by IPVM, a surveillance industry publication.

China’s Ministry of Public Security did not respond to requests for comment faxed to its headquarters in Beijing and six local departments across the country.

The new approach to surveillance is partly based on data-driven policing software from the United States and Europe, technology that rights groups say has encoded racism into decisions like which neighborhoods are most heavily policed and which prisoners get parole. China takes it to the extreme, tapping nationwide reservoirs of data that allow the police to operate with opacity and impunity.

Video: “China’s Surveillance State Is Growing. These Documents Reveal How.” (14:27)
A New York Times analysis of over 100,000 government bidding documents found that China’s ambition to collect digital and biological data from its citizens is more expansive and invasive than previously known.

Often people don’t know they’re being watched. The police face little outside scrutiny of the effectiveness of the technology or the actions they prompt. The Chinese authorities require no warrants to collect personal information.

At their most advanced, the systems raise a perennial science-fiction conundrum: How is it possible to know the future has been accurately predicted if the police intervene before it happens?

Even when the software fails to deduce human behavior, it can be considered successful since the surveillance itself inhibits unrest and crime, experts say.

“This is an invisible cage of technology imposed on society,” said Maya Wang, a senior China researcher with Human Rights Watch, “the disproportionate brunt of it being felt by groups of people that are already severely discriminated against in Chinese society.”

Products from Megvii, an artificial intelligence start-up, on display at a tech industry exhibition center in Beijing.
Credit: Florence Lo/Reuters

‘Nowhere to Hide’

In 2017, one of China’s best-known entrepreneurs had a bold vision for the future: a computer system that could predict crimes.

The entrepreneur, Yin Qi, who founded Megvii, an artificial intelligence start-up, told Chinese state media that the surveillance system could give the police a search engine for crime, analyzing huge amounts of video footage to intuit patterns and warn the authorities about suspicious behavior. He explained that if cameras detected a person spending too much time at a train station, the system could flag a possible pickpocket.

“It would be scary if there were actually people watching behind the camera, but behind it is a system,” Mr. Yin said. “It’s like the search engine we use every day to surf the internet — it’s very neutral. It’s supposed to be a benevolent thing.”

He added that with such surveillance, “the bad guys have nowhere to hide.”

Five years later, his vision is slowly becoming reality. Internal Megvii presentations reviewed by The Times show how the start-up’s products assemble full digital dossiers for the police.

“Build a multidimensional database that stores faces, photos, cars, cases and incident records,” reads a description of one product, called “intelligent search.” The software analyzes the data to “dig out ordinary people who seem innocent” to “stifle illegal acts in the cradle.”

A Megvii spokesman said in an emailed statement that the company was committed to the responsible development of artificial intelligence, and that it was concerned about making life more safe and convenient and “not about monitoring any particular group or individual.”

An internal presentation slide for Megvii’s “intelligent search” product. Bar charts sort groups of monitored people by category.

Similar technologies are already being put into use. In 2022, the police in Tianjin bought software made by a Megvii competitor, Hikvision, that aims to predict protests. The system collects data on legions of Chinese petitioners, a general term in China that describes people who try to file complaints about local officials with higher authorities.

It then scores petitioners on the likelihood that they will travel to Beijing. In the future, the data will be used to train machine-learning models, according to a procurement document.

Local officials want to prevent such trips to avoid political embarrassment or exposure of wrongdoing. And the central government doesn’t want groups of disgruntled citizens gathering in the capital.

A Hikvision representative declined to comment on the system.

Under Mr. Xi, official efforts to control petitioners have grown increasingly invasive. Zekun Wang, a 32-year-old member of a group that for years sought redress over a real estate fraud, said the authorities in 2017 had intercepted fellow petitioners in Shanghai before they could even buy tickets to Beijing. He suspected that the authorities were watching their communications on the social media app WeChat.

The Hikvision system in Tianjin, which is run in cooperation with the police in nearby Beijing and Hebei Province, is more sophisticated.

The platform analyzes individuals’ likelihood to petition based on their social and family relationships, past trips and personal situations, according to the procurement document. It helps the police create a profile of each, with fields for officers to describe the temperament of the protester, including “paranoid,” “meticulous” and “short tempered.”

Many people who petition do so over government mishandling of a tragic accident or neglect in the case — all of which goes into the algorithm. “Increase a person’s early-warning risk level if they have low social status or went through a major tragedy,” reads the procurement document.
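The scoring logic the procurement document describes can be pictured as a simple additive rule set. The sketch below is purely illustrative: the field names, weights, and point values are hypothetical assumptions, since the document discloses the inputs (trips, relationships, social status, tragedy, temperament labels) but not the actual formula.

```python
# Hypothetical sketch of a rule-based early-warning score of the kind the
# Tianjin procurement document describes. All weights and thresholds here
# are invented for illustration; the real values are not public.
from dataclasses import dataclass, field

@dataclass
class PetitionerProfile:
    past_trips_to_beijing: int = 0
    petitioning_relatives: int = 0          # "social and family relationships"
    low_social_status: bool = False
    major_tragedy: bool = False             # e.g. a mishandled accident
    temperament_flags: list = field(default_factory=list)  # "paranoid", ...

def early_warning_risk(p: PetitionerProfile) -> int:
    """Return a hypothetical early-warning score; higher means flagged sooner."""
    score = 0
    score += 10 * p.past_trips_to_beijing
    score += 5 * p.petitioning_relatives
    # The document's own stated rule: increase the risk level for people
    # with low social status or who went through a major tragedy.
    if p.low_social_status:
        score += 15
    if p.major_tragedy:
        score += 15
    # Each temperament label an officer attaches adds to the score.
    score += 5 * len(p.temperament_flags)
    return score

profile = PetitionerProfile(past_trips_to_beijing=2, major_tragedy=True,
                            temperament_flags=["short tempered"])
print(early_warning_risk(profile))  # 20 + 15 + 5 = 40
```

The point of the sketch is how mechanical such a system can be: attributes a person cannot change, like having suffered a tragedy, directly raise the score that summons the police.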

A police patrol in Xichang, Sichuan Province. Software allows Chinese authorities to target individuals according to preconceived ideas about their traits.
Credit: Costfoto/Future Publishing, via Getty Images

When the police in Zhouning, a rural county in Fujian Province, bought a new set of 439 cameras in 2018, they listed coordinates where each would go. Some hung above intersections and others near schools, according to a procurement document.

Nine were installed outside the homes of people with something in common: mental illness.

While some software tries to use data to uncover new threats, a more common type is based on the preconceived notions of the police. In over a hundred procurement documents reviewed by The Times, the surveillance targeted blacklists of “key persons.”

These people, according to some of the procurement documents, included those with mental illness, convicted criminals, fugitives, drug users, petitioners, suspected terrorists, political agitators and threats to social stability. Other systems targeted migrant workers, idle youths (teenagers without school or a job), ethnic minorities, foreigners and those infected with H.I.V.

The authorities decide who goes on the lists, and there is often no process to notify people when they do. Once individuals are in a database, they are rarely removed, said experts, who worried that the new technologies reinforce disparities within China, imposing surveillance on the least fortunate parts of its population.

In many cases the software goes further than simply targeting a population, allowing the authorities to set up digital tripwires that indicate a possible threat. In one Megvii presentation detailing a rival product by Yitu, the system’s interface allowed the police to devise their own early warnings.

With a simple fill-in-the-blank menu, the police can base alarms on specific parameters, including where a blacklisted person appears, when the person moves around, whether he or she meets with other blacklisted people and the frequency of certain activities. The police could set the system to send a warning each time two people with a history of drug use check into the same hotel or when four people with a history of protest enter the same park.
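A fill-in-the-blank rule of this kind amounts to a parameterized query over co-presence events. The sketch below is an assumption-laden illustration, not Yitu’s interface: the function names and event format are invented, but the rule it encodes is one the article describes, an alert when a set number of blacklisted people appear at the same kind of place.

```python
# Illustrative sketch of a fill-in-the-blank "tripwire" like the Yitu
# interface described above. An officer picks a blacklist, a place type,
# and a headcount threshold; the system alerts when sightings satisfy the
# rule. Names and data shapes here are hypothetical.
from collections import defaultdict

def build_tripwire(blacklist: set, place_type: str, threshold: int):
    """Return a rule that flags places where `threshold` or more
    blacklisted people were sighted at a place of the given type."""
    def check(sightings):
        # sightings: list of (person_id, place_id, place_type) events
        seen = defaultdict(set)
        for person, place, ptype in sightings:
            if person in blacklist and ptype == place_type:
                seen[place].add(person)
        return [place for place, people in seen.items()
                if len(people) >= threshold]
    return check

# Example rule: alert when two people on a drug-use blacklist
# check into the same hotel.
rule = build_tripwire(blacklist={"p1", "p2", "p3"},
                      place_type="hotel", threshold=2)
events = [("p1", "hotel_a", "hotel"),
          ("p2", "hotel_a", "hotel"),
          ("p3", "park_b", "park")]
print(rule(events))  # ['hotel_a']
```

Note that nothing in such a rule requires any individual to have done anything: mere co-location of people who share a label is enough to trigger a warning.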

Yitu did not respond to emailed requests for comment.

An interface from a Yitu product that lets the police set parameters to receive alerts on suspicious behavior.
Credit: The New York Times

In 2020 in the city of Nanning, the police bought software that could look for “more than three key people checking into the same or nearby hotels” and “a drug user calling a new out-of-town number frequently,” according to a bidding document. In Yangshuo, a tourist town famous for its otherworldly karst mountains, the authorities bought a system to alert them if a foreigner without a work permit spent too much time hanging around foreign-language schools or bars, an apparent effort to catch people overstaying their visas or working illegally.

In Shanghai, one party-run publication described how the authorities used software to identify those who exceeded normal water and electricity use. The system would send a “digital whistle” to the police when it found suspicious consumption patterns.

The tactic was likely designed to detect migrant workers, who often live together in close quarters to save money. In some places, the police consider them an elusive, and often impoverished, group who can bring crime into communities.
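A “digital whistle” on utility usage is, at bottom, an anomaly test on meter readings. The sketch below is an assumption: the publication does not say what statistic the Shanghai system uses, so this illustrates one crude possibility, flagging households that use more than a multiple of the neighborhood median.

```python
# Illustrative sketch of a "digital whistle" on utility consumption.
# The flag-above-2x-median rule is an invented stand-in; the actual
# statistics used by the Shanghai system are not public.
from statistics import median

def digital_whistle(usage_by_household: dict, factor: float = 2.0):
    """Return household IDs whose metered usage exceeds `factor` times
    the median for the neighborhood -- a crude proxy for overcrowding."""
    med = median(usage_by_household.values())
    return [hid for hid, usage in usage_by_household.items()
            if usage > factor * med]

monthly_kwh = {"unit_101": 210, "unit_102": 195, "unit_103": 205,
               "unit_104": 198, "unit_105": 202, "unit_106": 900}
print(digital_whistle(monthly_kwh))  # ['unit_106']
```

The weakness of such a proxy is visible even in this toy example: high consumption flags a crowded apartment of migrant workers and a large family alike, which is how a statistical shortcut becomes profiling.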

The automated alerts don’t result in the same level of police response. Often, the police give priority to warnings that point to political problems, like protests or other threats to social stability, said Suzanne E. Scoggins, a professor at Clark University who studies China’s policing.

At times, the police have stated outright the need to profile people. “Through the application of big data, we paint a picture of people and give them labels with different attributes,” Li Wei, a researcher at China’s national police university, said in a 2016 speech. “For those who receive one or more types of labels, we infer their identities and behavior, and then carry out targeted pre-emptive security measures.”

Mr. Zhang first started petitioning the government for compensation over the torture of his family during the Cultural Revolution. He has since petitioned over what he says is police targeting of his family.

As China has built out its techno-authoritarian tools, he has had to use spy movie tactics to circumvent surveillance that, he said, has become “high tech and Nazified.”

Surveillance cameras within 100 meters of Zhang Yuqiao’s home. There are no cameras in other places in his village, he said.
Credit: Zhang Yuqiao

When he traveled to Beijing in January from his village in Shandong Province, he turned off his phone and paid for transportation in cash to minimize his digital footprint. He bought train tickets to the wrong destination to foil police tracking. He hired private drivers to get around checkpoints where his identification card would set off an alarm.

The system in Tianjin has a special feature for people like him who have “a certain awareness of anti-reconnaissance” and regularly change vehicles to evade detection, according to the police procurement document.

Whether or not he triggered the system, Mr. Zhang has noticed a change. Whenever he turns off his phone, he said, officers show up at his house to check that he hasn’t left on a new trip to Beijing.

The authorities “do whatever it takes to silence the people who raise the problems,” Mr. Zhang said.
Credit: Zhang Yuqiao

Even if police systems cannot accurately predict behavior, the authorities may consider them successful because of the threat, said Noam Yuchtman, an economics professor at the London School of Economics who has studied the impact of surveillance in China.

“In a context where there isn’t real political accountability,” having a surveillance system that frequently sends police officers “can work pretty well” at discouraging unrest, he said.

Once the metrics are set and the warnings are triggered, police officers have little flexibility, centralizing control. They are evaluated for their responsiveness to automated alarms and effectiveness at preventing protests, according to experts and public police reports.

The technology has encoded power imbalances. Some bidding documents refer to a “red list” of people whom the surveillance system must ignore.

One national procurement document said the function was for “people who need privacy protection or V.I.P. protection.” Another, from Guangdong Province, got more specific, stipulating that the red list was for government officials.

Mr. Zhang expressed frustration at the ways technology had cut off those in political power from regular people.

“The authorities do not seriously solve problems but do whatever it takes to silence the people who raise the problems,” he said. “This is a big step backward for society.”

Mr. Zhang said that he still believed in the power of technology to do good, but that in the wrong hands it could be a “scourge and a shackle.”

“In the past if you left your home and took to the countryside, all roads led to Beijing,” he said. “Now, the entire country is a net.”

Surveillance cameras on a lamppost in Beijing.
Credit: Roman Pilipey/EPA, via Shutterstock


