‘An Invisible Cage’: How China Is Policing the Future

Three people with a criminal record check into the same hotel in southeast China. An automated system is designed to alert the police.

A man with a history of political protest buys a train ticket to Beijing. The system could flag the activity as suspicious and tell the police to investigate.

A woman with mental illness in Fujian leaves her home. A camera installed by her house records her movements so the police can track her.

Across China, the police are buying technology that harnesses vast surveillance data to predict crime and protest before they happen. The systems and software are targeting people whose behavior or characteristics are suspicious in the eyes of an algorithm and the Chinese authorities, even if they’ve done nothing wrong.

The more than 1.4 billion people living in China are constantly watched. They are recorded by police cameras that are everywhere, on street corners and subway ceilings, in hotel lobbies and apartment buildings. Their phones are tracked, their purchases are monitored, and their online chats are censored.

Now, even their future is under surveillance.

The latest generation of technology digs through the vast amounts of data collected on their daily activities to find patterns and aberrations, promising to predict crimes or protests before they happen. They target potential troublemakers in the eyes of the Chinese government — not only those with a criminal past but also vulnerable groups, including ethnic minorities, migrant workers and those with a history of mental illness.

They can warn the police if a victim of a fraud tries to travel to Beijing to petition the government for payment or a drug user makes too many calls to the same number. They can signal officers each time a person with a history of mental illness gets near a school.

It takes extensive evasive maneuvers to avoid the digital tripwires. In the past, Zhang Yuqiao, a 74-year-old man who has been petitioning the government for most of his adult life, could simply stay off the main highways to dodge the authorities and make his way to Beijing to fight for compensation over the torture of his parents during the Cultural Revolution. Now, he turns off his phones, pays in cash and buys multiple train tickets to false destinations.

While largely unproven, the new Chinese technologies, detailed in procurement and other documents reviewed by The New York Times, further extend the boundaries of social and political controls and integrate them ever deeper into people’s lives. At their most basic, they justify suffocating surveillance and violate privacy, while in the extreme they risk automating systemic discrimination and political repression.

Surveillance cameras set up in April at a residential compound in Mudanjiang, Heilongjiang Province.
Credit…China Daily/Via Reuters

For the government, social stability is paramount and any threat to it must be eliminated. During his decade as China’s top leader, Xi Jinping has hardened and centralized the security state, unleashing techno-authoritarian policies to quell ethnic unrest in the western region of Xinjiang and enforce some of the world’s most severe coronavirus lockdowns. The space for dissent, always limited, is rapidly disappearing.

“Big data should be used as an engine to power the innovative development of public security work and a new growth point for nurturing combat capabilities,” Mr. Xi said in 2019 at a national public security work meeting.

The algorithms, which would prove controversial in other countries, are often trumpeted as triumphs.

In 2020, the authorities in southern China denied a woman’s request to move to Hong Kong to be with her husband after software alerted them that the marriage was suspicious, the local police reported. An ensuing investigation revealed that the two were not often in the same place at the same time and had not spent the Spring Festival holiday together. The police concluded that the marriage had been faked to obtain a migration permit.

The same year in northern China, an automated alert about a man’s frequent entry into a residential compound with different companions prompted the police to investigate. They discovered that he was a part of a pyramid scheme, according to state media.

The details of these emerging security technologies are described in police research papers, surveillance contractor patents and presentations, as well as hundreds of public procurement documents reviewed and confirmed by The Times. Many of the procurement documents were shared by ChinaFile, an online magazine published by the Asia Society, which has systematically gathered years of records on government websites. Another set, describing software bought by the authorities in the port city of Tianjin to stop petitioners from going to neighboring Beijing, was provided by IPVM, a surveillance industry publication.

China’s Ministry of Public Security did not respond to requests for comment faxed to its headquarters in Beijing and six local departments across the country.

The new approach to surveillance is partly based on data-driven policing software from the United States and Europe, technology that rights groups say has encoded racism into decisions like which neighborhoods are most heavily policed and which prisoners get parole. China takes it to the extreme, tapping nationwide reservoirs of data that allow the police to operate with opacity and impunity.

Video

China’s Surveillance State Is Growing. These Documents Reveal How. (14:27)
A New York Times analysis of over 100,000 government bidding documents found that China’s ambition to collect digital and biological data from its citizens is more expansive and invasive than previously known.

Often people don’t know they’re being watched. The police face little outside scrutiny of the effectiveness of the technology or the actions they prompt. The Chinese authorities require no warrants to collect personal information.

At the most bleeding edge, the systems raise perennial science-fiction conundrums: How is it possible to know the future has been accurately predicted if the police intervene before it happens?

Even when the software fails to deduce human behavior, it can be considered successful since the surveillance itself inhibits unrest and crime, experts say.

“This is an invisible cage of technology imposed on society,” said Maya Wang, a senior China researcher with Human Rights Watch, “the disproportionate brunt of it being felt by groups of people that are already severely discriminated against in Chinese society.”

Image
Products from Megvii, an artificial intelligence start-up, on display at a tech industry exhibition center in Beijing.
Credit…Florence Lo/Reuters

‘Nowhere to Hide’

In 2017, one of China’s best-known entrepreneurs had a bold vision for the future: a computer system that could predict crimes.

The entrepreneur, Yin Qi, who founded Megvii, an artificial intelligence start-up, told Chinese state media that the surveillance system could give the police a search engine for crime, analyzing huge amounts of video footage to intuit patterns and warn the authorities about suspicious behavior. He explained that if cameras detected a person spending too much time at a train station, the system could flag a possible pickpocket.

“It would be scary if there were actually people watching behind the camera, but behind it is a system,” Mr. Yin said. “It’s like the search engine we use every day to surf the internet — it’s very neutral. It’s supposed to be a benevolent thing.”

He added that with such surveillance, “the bad guys have nowhere to hide.”

Five years later, his vision is slowly becoming reality. Internal Megvii presentations reviewed by The Times show how the start-up’s products assemble full digital dossiers for the police.

“Build a multidimensional database that stores faces, photos, cars, cases and incident records,” reads a description of one product, called “intelligent search.” The software analyzes the data to “dig out ordinary people who seem innocent” to “stifle illegal acts in the cradle.”

A Megvii spokesman said in an emailed statement that the company was committed to the responsible development of artificial intelligence, and that it was concerned about making life more safe and convenient and “not about monitoring any particular group or individual.”

Video
Cinemagraph
An internal presentation slide for Megvii’s “intelligent search” product. Bar charts sort groups of monitored people by category.

Similar technologies are already being put into use. In 2022, the police in Tianjin bought software made by a Megvii competitor, Hikvision, that aims to predict protests. The system collects data on legions of Chinese petitioners, a general term in China that describes people who try to file complaints about local officials with higher authorities.

It then scores petitioners on the likelihood that they will travel to Beijing. In the future, the data will be used to train machine-learning models, according to a procurement document.

Local officials want to prevent such trips to avoid political embarrassment or exposure of wrongdoing. And the central government doesn’t want groups of disgruntled citizens gathering in the capital.

A Hikvision representative declined to comment on the system.

Under Mr. Xi, official efforts to control petitioners have grown increasingly invasive. Zekun Wang, a 32-year-old member of a group that for years sought redress over a real estate fraud, said the authorities in 2017 had intercepted fellow petitioners in Shanghai before they could even buy tickets to Beijing. He suspected that the authorities were watching their communications on the social media app WeChat.

The Hikvision system in Tianjin, which is run in cooperation with the police in nearby Beijing and Hebei Province, is more sophisticated.

The platform analyzes individuals’ likelihood to petition based on their social and family relationships, past trips and personal situations, according to the procurement document. It helps the police create a profile of each, with fields for officers to describe the temperament of the protester, including “paranoid,” “meticulous” and “short tempered.”

Many people who petition do so over government mishandling of a tragic accident or neglect in the case — all of which goes into the algorithm. “Increase a person’s early-warning risk level if they have low social status or went through a major tragedy,” reads the procurement document.
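The instruction quoted from the procurement document reads like an additive scoring heuristic. A hedged sketch of what such scoring could look like, with every field name and weight invented for illustration (the real system's features and weights are not public):

```python
# Hypothetical sketch of attribute-based risk scoring: a base score raised
# by pre-listed attributes. All field names and weights are invented; none
# come from the actual Tianjin system.
def early_warning_score(profile):
    score = 0
    # Repeat petitioners weighted by how often they have petitioned.
    score += 2 * profile.get("past_petitions", 0)
    # "Increase a person's early-warning risk level if they have low social
    # status or went through a major tragedy," per the procurement document.
    if profile.get("low_social_status"):
        score += 3
    if profile.get("major_tragedy"):
        score += 3
    return score

profile = {"past_petitions": 2, "low_social_status": True, "major_tragedy": False}
score = early_warning_score(profile)  # 2*2 + 3 = 7
```

The point of the sketch is only that such a system needs no machine learning at all: a handful of hand-picked attributes and weights is enough to rank an entire population.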

Image
A police patrol in Xichang, Sichuan Province. Software allows Chinese authorities to target individuals according to preconceived ideas about their traits.
Credit…Costfoto/Future Publishing via Getty Images

When the police in Zhouning, a rural county in Fujian Province, bought a new set of 439 cameras in 2018, they listed coordinates where each would go. Some hung above intersections and others near schools, according to a procurement document.

Nine were installed outside the homes of people with something in common: mental illness.

While some software tries to use data to uncover new threats, a more common type is based on the preconceived notions of the police. In over a hundred procurement documents reviewed by The Times, the surveillance targeted blacklists of “key persons.”

These people, according to some of the procurement documents, included those with mental illness, convicted criminals, fugitives, drug users, petitioners, suspected terrorists, political agitators and threats to social stability. Other systems targeted migrant workers, idle youths (teenagers without school or a job), ethnic minorities, foreigners and those infected with H.I.V.

The authorities decide who goes on the lists, and there is often no process to notify people when they do. Once individuals are in a database, they are rarely removed, said experts, who worried that the new technologies reinforce disparities within China, imposing surveillance on the least fortunate parts of its population.

In many cases the software goes further than simply targeting a population, allowing the authorities to set up digital tripwires that indicate a possible threat. In one Megvii presentation detailing a rival product by Yitu, the system’s interface allowed the police to devise their own early warnings.

With a simple fill-in-the-blank menu, the police can base alarms on specific parameters, including where a blacklisted person appears, when the person moves around, whether he or she meets with other blacklisted people and the frequency of certain activities. The police could set the system to send a warning each time two people with a history of drug use check into the same hotel or when four people with a history of protest enter the same park.
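The fill-in-the-blank rules described above amount to a simple event-matching engine. A minimal sketch of the idea in Python, with every name, field and threshold invented for illustration (the actual Yitu interface is not public):

```python
from collections import defaultdict

# Hypothetical sketch of a "tripwire" rule: alert when at least `min_count`
# people from one watchlist category appear at the same location of a given
# type. All names and data are invented, not drawn from the Yitu product.
def check_tripwire(events, watchlist, category, location_type, min_count):
    seen = defaultdict(set)  # location -> flagged persons observed there
    for e in events:
        if e["location_type"] == location_type and watchlist.get(e["person"]) == category:
            seen[e["location"]].add(e["person"])
    # Keep only locations where the co-occurrence threshold is met.
    return {loc: people for loc, people in seen.items() if len(people) >= min_count}

watchlist = {"A": "drug_history", "B": "drug_history", "C": "protest_history"}
events = [
    {"person": "A", "location": "Hotel 1", "location_type": "hotel"},
    {"person": "B", "location": "Hotel 1", "location_type": "hotel"},
    {"person": "C", "location": "Hotel 1", "location_type": "hotel"},
]
# "A warning each time two people with a history of drug use check into
# the same hotel":
alerts = check_tripwire(events, watchlist, "drug_history", "hotel", 2)
```

Swapping the category, location type or count would express the paragraph's other example, four people with a history of protest entering the same park, which is what makes the fill-in-the-blank menu so flexible.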

Yitu did not respond to emailed requests for comment.

Video
Cinemagraph
An interface from a Yitu product that lets the police set parameters to receive alerts on suspicious behavior.
Credit…The New York Times

In 2020 in the city of Nanning, the police bought software that could look for “more than three key people checking into the same or nearby hotels” and “a drug user calling a new out-of-town number frequently,” according to a bidding document. In Yangshuo, a tourist town famous for its otherworldly karst mountains, the authorities bought a system to alert them if a foreigner without a work permit spent too much time hanging around foreign-language schools or bars, an apparent effort to catch people overstaying their visas or working illegally.

In Shanghai, one party-run publication described how the authorities used software to identify those who exceeded normal water and electricity use. The system would send a “digital whistle” to the police when it found suspicious consumption patterns.

The tactic was likely designed to detect migrant workers, who often live together in close quarters to save money. In some places, the police consider them an elusive, and often impoverished, group who can bring crime into communities.
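At its core, the “digital whistle” is an outlier rule on consumption data. A hedged sketch of one plausible version, assuming a median-ratio comparison that the source does not actually specify, with all numbers invented:

```python
import statistics

# Hypothetical sketch of flagging abnormal utility use: report households
# whose consumption far exceeds the local norm. The real comparison method
# and threshold are not public; a simple median-ratio rule is a stand-in.
def flag_abnormal_usage(readings, ratio=3.0):
    """readings: household -> monthly consumption. Returns flagged households."""
    median = statistics.median(readings.values())
    return sorted(h for h, v in readings.items() if v > ratio * median)

monthly_kwh = {"a": 100, "b": 110, "c": 105, "d": 95, "e": 400}
flagged = flag_abnormal_usage(monthly_kwh)  # household "e" exceeds 3x the median
```

A median rather than a mean keeps a single crowded household from dragging the baseline upward, though whichever rule is used, it flags a living arrangement, not a crime.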

The automated alerts don’t result in the same level of police response. Often, the police give priority to warnings that point to political problems, like protests or other threats to social stability, said Suzanne E. Scoggins, a professor at Clark University who studies China’s policing.

At times, the police have stated outright the need to profile people. “Through the application of big data, we paint a picture of people and give them labels with different attributes,” Li Wei, a researcher at China’s national police university, said in a 2016 speech. “For those who receive one or more types of labels, we infer their identities and behavior, and then carry out targeted pre-emptive security measures.”

Mr. Zhang first started petitioning the government for compensation over the torture of his family during the Cultural Revolution. He has since petitioned over what he says is police targeting of his family.

As China has built out its techno-authoritarian tools, he has had to use spy movie tactics to circumvent surveillance that, he said, has become “high tech and Nazified.”

Surveillance cameras within 100 meters of Zhang Yuqiao’s home. There are no cameras in other places in his village, he said.
Credit…Zhang Yuqiao

When he traveled to Beijing in January from his village in Shandong Province, he turned off his phone and paid for transportation in cash to minimize his digital footprint. He bought train tickets to the wrong destination to foil police tracking. He hired private drivers to get around checkpoints where his identification card would set off an alarm.

The system in Tianjin has a special feature for people like him who have “a certain awareness of anti-reconnaissance” and regularly change vehicles to evade detection, according to the police procurement document.

Whether or not he triggered the system, Mr. Zhang has noticed a change. Whenever he turns off his phone, he said, officers show up at his house to check that he hasn’t left on a new trip to Beijing.

Image
The authorities “do whatever it takes to silence the people who raise the problems,” Mr. Zhang said.
Credit…Zhang Yuqiao

Even if police systems cannot accurately predict behavior, the authorities may consider them successful because of the threat, said Noam Yuchtman, an economics professor at the London School of Economics who has studied the impact of surveillance in China.

“In a context where there isn’t real political accountability,” having a surveillance system that frequently sends police officers “can work pretty well” at discouraging unrest, he said.

Once the metrics are set and the warnings are triggered, police officers have little flexibility, centralizing control. They are evaluated for their responsiveness to automated alarms and effectiveness at preventing protests, according to experts and public police reports.

The technology has encoded power imbalances. Some bidding documents refer to a “red list” of people whom the surveillance system must ignore.

One national procurement document said the function was for “people who need privacy protection or V.I.P. protection.” Another, from Guangdong Province, got more specific, stipulating that the red list was for government officials.

Mr. Zhang expressed frustration at the ways technology had cut off those in political power from regular people.

“The authorities do not seriously solve problems but do whatever it takes to silence the people who raise the problems,” he said. “This is a big step backward for society.”

Mr. Zhang said that he still believed in the power of technology to do good, but that in the wrong hands it could be a “scourge and a shackle.”

“In the past if you left your home and took to the countryside, all roads led to Beijing,” he said. “Now, the entire country is a net.”

Image
Surveillance cameras on a lamppost in Beijing.
Credit…Roman Pilipey/EPA, via Shutterstock

Cops arrested him for filming in public, but things took a bizarre turn when the case went to court


The arrest of a Texas cop watcher for filming in public is the most recent chilling example of how law enforcement across the country is attempting to roll back auditors’ First Amendment rights. Jack Miller, also known as Texas Sheepdog, was filming outside the Olmos Park, Texas, City Hall when police arrested and charged him with multiple crimes. The ensuing five-day trial and jury verdict reveal that citizens’ ability to film in public is facing new obstacles and concerted pushback from the government.



Why The Media Demonizes Leftists

Right-wing propaganda network Newsmax recently interviewed Hungary’s Minister of Foreign Affairs and Trade, who spewed terrible anti-LGBTQIA+ rhetoric and defended Hungary’s ban on LGBTQIA+ content in schools. The rhetoric was received with open arms and faced no pushback from the Newsmax host, which points to a larger issue with U.S. media: sympathy for authoritarians, demonization for leftists.

“Newsmax just mainstreaming Hungary’s horrific anti-LGBTQ law into American TV screens with absolutely zero pushback”

Demagogues Don’t Debate: The GOP Mimics Putin and Orbán

Refusing to participate in presidential debates is a symptom of the Republicans’ embrace of authoritarianism

“This is insane. Why.” Rep. Adam Kinzinger (R-Ill.) tweeted about the news that the Republican National Committee will require candidates for America’s highest office to pledge to not participate in debates run by the Commission on Presidential Debates.

The nonprofit commission, which has organized the debates for 30 years, is bipartisan. That’s a big problem for the GOP, which is now an authoritarian party that has withdrawn its support for bipartisan rule and democratic institutions. In vacating the debate stage, the Republicans are mimicking autocratic heads of state like Russian President Vladimir Putin and Hungarian Prime Minister Viktor Orbán.

Debates between presidential candidates enact the democratic principle of mutual tolerance: the notion that those who don’t share your political views have a right to free expression. The public hears an exchange of views by two individuals who are on equal footing and bound by the same rules, which are enforced by an impartial arbiter.

This is anathema to the authoritarian mindset. Personality cults posit the leader as a man above all others, and the egalitarian staging and format of debates make them dangerous to his brand. Since authoritarians sustain their power through disinformation, threat, and corruption (including fixing elections) who knows what might be exposed if they submit to spontaneous questioning by a rival or a third party?

Putin, who came from an intelligence background into politics, understands this well. He set the tone for 21st century illiberal rulers when he refused to debate his opponents during a 1999-2000 presidential race marked by the resumption of Russia’s war with Chechnya and a series of apartment building explosions that were devastating for Russians but conveniently timed for the emergence of his strongman persona.

Avoiding debates became a feature of Putin’s rule over the next 20 years as he built a kleptocracy founded on secrecy and the silencing of rivals. In 2012, his spokesman Dmitry Peskov claimed that taking time for debate would “impede his ability to carry out his duties,” which is true given that the main goal of Putinism is not governance but thievery.

Russian opposition parties have unsuccessfully lobbied for years to change election laws to require all candidates for parliament or the presidency to participate in debates. Instead, Putin offers the public yearly live call-in shows in which he answers scripted questions.

Putin gets ready for his annual call-in show, Moscow, 2016. Mikhail Svetlov/Getty Images.

Orbán, who is a Putin client, has followed Moscow’s lead. Sixteen years have passed since the Hungarian leader last agreed to debate a competitor. A poor performance in 2006, which contributed to his defeat to the incumbent Hungarian Socialist Party, soured him on the experience. A 2018 attempt by opposition politicians to amend election law to require presidential debates did not succeed.

Four years later, Orbán is under more pressure. He may be the darling of the American far right, but he faces a challenge in the April parliamentary elections from a newly unified opposition. Six parties have come together to defeat what coalition leader Peter Márki-Zay calls a “corrupt dictatorship.”

Still, Orbán rejected the proposal by Márki-Zay, the opposition’s candidate for prime minister, to hold a debate. “To his followers, Orbán must always appear invincible,” autocracy expert Kim Lane Scheppele says, and the Hungarian leader feels he can rely on his system of electoral autocracy, in which substantial control of the voting apparatus and the judiciary by the incumbent and his cronies helps produce the outcome needed to stay in office.

These autocratic actions offer context for evaluating the GOP’s rejection of presidential debates. Trump signaled his break with American presidential debate customs early on: after Megyn Kelly grilled him during an August 2015 debate among contenders for the GOP nomination, he boycotted the next one, so as not to show weakness. And who can forget him shadowing Hillary Clinton in a menacing way during their October 2016 debate? “My skin crawled,” Clinton later wrote of that occasion.

Trump and Clinton at their October 2016 debate, St. Louis, MO. Paul J. Richards/AFP via Getty Images.

By 2020, with four years’ experience in autocratic leadership, Trump was ready to use the debate experience to try to take down his adversary in a different way: He attended the Sept. 29 debate with Joe Biden knowing that he had recently tested positive for Covid-19, making his health status public only several days later. Putin, who silences critics with poison, likely approved of Trump turning a democratic ritual into an experiment in biological warfare.

With Trump as their cult leader, a coup attempt in their recent past (57 GOP officials participated in the rally that preceded the Jan. 6 takeover effort), and a mission to create an Orbán-like election subversion system, it’s no surprise that Republicans are abandoning the debate stage. When a party decides to rely on lies, corruption, and violence, it has too many secrets to face public questioning.

Autocrats see debates as risky, which is why they refuse to participate in them. It’s another sign of the GOP’s authoritarian turn that it’s decided to follow suit.

Where was the first place that you felt truly at home?

It’s kind of odd, but I thought I felt at home in sunny Cali since I lived there from 2 years old. I didn’t realize that I didn’t really know what being home felt like until I went to boot camp for the Marines at age 20. I remember our first PFT, straggling in slow, when the senior DI called a few of us out, saying we weren’t going to make it. For the first time, I felt really at home and good because we were all the same in every way. It was strangely comforting. The senior DI kept asking for me every week to see if I was there. I made it, dropped 58–60 lbs, scored a 300 PFT, and was yoked out. Senior DI Chavez gave me the firmest handshake at graduation and wished me well.

Fast forward to the age of 44. I lived a great life and experienced cool things all over as an American. I felt an even bigger feeling of being “home” once I visited S. Korea for a short vacation. I rushed home, sold everything, quit everything, and severed most ties. I moved as quickly as I could, and I’m 48 (Korean age) today! Boot camp was the first, but this move was the most powerful. Everything and every part of me wanted to move back. It’s been fulfilling, amazing in timing, and I’m a new person. I consider this move to be the first, since it was so moving and meaningful for me. The Corps still has a part of my heart and loyalty, but the rest is with and in S. Korea. I even enjoy hospital visits now.

Florida Republicans Introduce Bill To Spy On Public School Teachers

Republicans in Florida have proposed legislation that would put cameras into the classrooms of public school teachers. Not only would this be a massive invasion of privacy for teachers and students, but it would also address a problem that doesn’t actually exist. While the Republicans claim the bill is to make sure there are no “incidents” of violence from students or teachers, what is really happening is that they want to regulate what teachers can and cannot say during lessons. Ring of Fire’s Farron Cousins explains the sinister motives of these Republicans.