Exponent podcast

Exponent is a podcast about tech and society, hosted by Ben Thompson and James Allworth.

Ben Thompson is the author and founder of Stratechery, a blog about the business and strategy of technology. You can follow him on Twitter @benthompson.

James Allworth is the co-author, with Clay Christensen, of How Will You Measure Your Life? and a writer for the Harvard Business Review. You can follow him on Twitter @jamesallworth.

Eric X. Li: A tale of two political systems

It’s a standard assumption in the West: As a society progresses, it eventually becomes a capitalist, multi-party democracy. Right? Eric X. Li, a Chinese investor and political scientist, begs to differ. In this provocative, boundary-pushing talk, he asks his audience to consider that there’s more than one way to run a successful modern nation.


Why democracy still wins: A critique of Eric X. Li’s “A tale of two political systems”

Earlier this year, economist Yasheng Huang (watch his 2011 TED Talk) sparred with Eric X. Li in the pages of Foreign Affairs on a topic similar to that of today’s TED Talk. The TED Blog asked Huang to expand on his argument in his ongoing conversation with Li.

Imagine confusing the following two statements from a cancer doctor: 1) “You may die from cancer” and 2) “I want you to die from cancer.” It is not hard to see a rudimentary difference between these two statements. The first statement is a prediction — it is saying that something may happen given certain conditions (in this case death conditional upon having cancer). The second statement is a preference, a desire, or a wish for a world to one’s particular liking.

Who would make such a rudimentary mistake, confusing these two types of statements? Many people, including Eric X. Li in today’s TED Talk. The Marxian meta-narrative drilled into Li’s head — and mine, in my childhood and youth in the 1960s and 1970s — is a normative statement. When Marx came up with his ideas about the evolution of human societies, there was not a single country in the world that even remotely resembled the communist system he advocated. The communist system Marx had in mind had no private property or ownership of any kind. Money was also absent in that system. The Marxian version of communism has never come to fruition and, most likely, never will. Marx based his “prediction” on deduction; his successors did so by imposing their wish, enforced by power and violence.

Eric X. Li: A tale of two political systems

By contrast, the narrative that was apparently fed to Li when he was a “Berkeley hippie” is based on the actual experience of human affairs. We have had hundreds of years of experience with democracy and hundreds of country-years of democratic transitions and rule. The statement that countries transition to democracy as they get rich is a positive statement — it is a prediction based on data. In the 1960s, roughly 25 percent of the world was democratic; today the proportion is 63 percent. There are far more instances of dictatorships transitioning to democracies than the other way around. The rest of the world has clearly expressed a preference for democracy. As Minxin Pei has pointed out, of the 25 countries with a higher GDP per capita than China that are not free or partially free, 21 are sustained by natural resources. But these are exceptions that prove the rule — countries become democratic as they get richer. Today, not a single one of the countries classified as the richest is a single-party authoritarian system. (Singapore is arguably a borderline case.) Whether Li likes it or not, they all seem to end up in the same place.

Are democracies more corrupt? Li thinks so. He cites the Transparency International (TI) index to support his view. The TI data show that China is ranked better than many democracies. Fair enough.

I have always thought that there is a touch of irony in using transparency data to defend a political system built on opacity. Irony aside, let’s keep in mind that the TI index is itself a product of a political system that Li so disparages — democracy (German democracy, to be exact). This underscores a basic point — we know far more about corruption in democracies than we do about corruption in authoritarian countries, because democracies are, by definition, more transparent and have more transparency data. While I trust comparisons of corruption among democratic countries, to mechanically compare corruption in China with that in democracies, as Li has done repeatedly, is fundamentally flawed. His methodology confounds two effects — how transparent a country is and how corrupt a country is. I am not saying that democracies are necessarily cleaner than China; I am just saying that Li’s use of TI data is not a basis for drawing conclusions in either direction. The right way to reach a conclusion on this issue is to say that, given the same level of transparency (and the same level of many other things, including income), China is — or is not — more corrupt than democracies.

Yasheng Huang: Does democracy stifle economic growth?

A simple example will suffice to illustrate this idea. In 2010, two Indian entrepreneurs founded a website called I Paid a Bribe. The website invited anonymous postings of instances in which Indian citizens had to pay a bribe. By August 2012, the website had recorded more than 20,000 reports of corruption. Some Chinese entrepreneurs tried to do the same thing: They created I Made a Bribe and 522phone.com. But those websites were promptly shut down by the Chinese government. The right conclusion is not, as Li’s logic would suggest, that China is cleaner than India because it has zero postings of corrupt instances whereas India has some 20,000 posted instances of corruption.

With due respect to the good work at Transparency International, its data are very poor at handling this basic difference between the perception of corruption and the incidence of corruption. Democracies are more transparent — about their virtues and their vices — than authoritarian systems. We know far more about Indian corruption in part because the Indian system is more transparent, and it has a noisy chattering class who are not afraid to challenge and criticize the government (and, in a few instances, to stick a video camera into a hotel room to record the transfer of cash to politicians). Also, lower-level corruption is more observable than corruption at the top of the political hierarchy. The TI index is better at uncovering the corruption of a Barun the policeman in Chennai than of a Bo Xilai the Politburo member from Chongqing. These factors, not corruption per se, are likely to explain most of the discrepancies between China and India in terms of TI rankings.

Li likes to point out, again using TI data, that the likes of Indonesia, Argentina and the Philippines are both democracies and notoriously corrupt. He often omits crucial factual details when addressing this issue. Yes, these countries are democracies in 2013, but they were governed by ruthless military dictators for decades before they transitioned to democracy. It was the autocracy of these countries that bred and fermented corruption. (Remember the 3,000 pairs of shoes of Mrs. Marcos?) Corruption is like cancer, metastatic and entrenched. While it is perfectly legitimate to criticize new democracies for not rooting out corruption in a timely fashion, confusing the difficulty of treating entrenched corruption with its underlying cause is analogous to saying that a cancer patient got his cancer after he was admitted for chemotherapy.

The world league of the most egregious corruption offenders belongs exclusively to autocrats. The top three ruling looters as of 2004, according to a TI report, were Suharto, Marcos and Mobutu. These three dictators pillaged a combined $50 billion from their impoverished people. Democracies are certainly not immune to corruption, but I think that they have to work a lot harder before they can catch up with these autocrats.

Li has a lot of faith in the Chinese system. He first argues that the system enjoys widespread support among the Chinese population. He cites a Financial Times survey finding that 93 percent of Chinese young people are optimistic about their future. I have seen these high approval ratings used by Li and others as evidence that the Chinese system is healthy and robust, but I am puzzled why Li should stop at 93 percent. Why not go further, to 100 percent? In a country without free speech, asking people to directly evaluate the performance of their leaders is like asking people to take a single-choice exam. The poll numbers for Erich Honecker and Kim Jong-un would put Chinese leaders to shame.

(Let me also offer a cautionary footnote on how and how not to use Chinese survey data. I have done a lot of survey research in China, and I am always humbled by how tricky it is to interpret the survey findings. Apart from the political pressures that tend to channel answers in a particular direction, another problem is that Chinese respondents sometimes view taking a survey as similar to taking an exam. Chinese exams have standard answers, and sometimes Chinese respondents fill out surveys by trying to guess what the “standard” answer is rather than expressing their own views. I would caution against any naïve uses of Chinese survey data.)

Li also touts the adaptability of the Chinese political system. Let me quote:

“Now, most political scientists will tell us that a one-party system is inherently incapable of self-correction. It won’t last long because it cannot adapt. Now here are the facts. In 64 years of running the largest country in the world, the range of the party’s policies has been wider than any other country in recent memory, from radical land collectivization to the Great Leap Forward, then privatization of farmland, then the Cultural Revolution, then Deng Xiaoping’s market reform, then successor Jiang Zemin took the giant political step of opening up party membership to private businesspeople, something unimaginable during Mao’s rule. So the party self-corrects in rather dramatic fashion.”

Now imagine putting forward the following narrative celebrating, say, Russian “adaptability”: Russia, as a country or as a people, is highly adaptable. The range of its “policies has been wider than any other country in recent memory,” from gulags to Stalin’s red terror, then collectivization, then central planning, then glasnost and perestroika, then privatization, then crony capitalism, then the illiberal democracy under Putin, something unimaginable during Lenin’s rule. So the country “self-corrects in rather dramatic fashion.”

Let me be clear and explicit — Li’s reasoning on the adaptability of the Chinese Communist Party (CCP) is exactly identical to the one I offered on Russia. The only difference is that Li was referring to a political organization — the CCP — and I am referring to a sovereign state.

The TED audience greeted Li’s speech with applause — several times, in fact. I doubt that had Li offered this Russian analogy the reception would have been as warm. The reason is simple: The TED audience is intimately familiar with the tumult, violence and astronomical human toll of Soviet rule. Steven Pinker, in his book The Better Angels of Our Nature, quoted the findings of other scholars that the Soviet regime killed 62 million of its own citizens. I guess the word “correction” somewhat understates the magnitude of the transformation from Stalin’s murderous, genocidal regime to the problematic, struggling but nonetheless democratic Russia today.

I do not know what a Berkeley hippie learned from his education, but in Cambridge, Massachusetts, where I received my education and where I am a professor by profession, I learned — and teach — every day that words actually have meaning. To me, self-correction implies at least two things. First, a self-correction is, well, a correction by self. Yes, Mao’s policies were “corrected” or even reversed by his successors, as Li pointed out, but in what sense is this “a self-correction?” Mao’s utterly disastrous policies persisted during his waning days even while the chairman lay in a vegetative state and his successor — who came to power through a virtual coup — only dared to modify Mao’s policies after his physical expiration was certain. If this is an instance of self-correction, exactly what is not a self-correction? Almost every single policy change Li identified in his talk was made by the successor to the person who initiated the policy that got corrected. (In quite a few cases, not even by the immediate successor.) This is a bizarre definition of self-correction. Does it constitute a self-correction when the math errors I left uncorrected in my childhood are now being corrected by my children?

The second meaning of self-correction has to do with the circumstances in which the correction occurs, not just the identity of the person making the correction. A 10-year-old can correct her spelling or math error of her own volition, or she could have done so after her teacher registered a few harsh slaps on the back of her left hand. In both situations the identity of the corrector is the same — the 10-year-old student — but the circumstances of the correction are vastly different. One would normally associate the first situation with “self-correction,” and the second with coercion, duress or, as in this case, violence. In other words, self-correction implies a degree of voluntariness on the part of the person making the correction: not forced, not coerced, not undertaken merely for lack of alternatives. The element of choice is a vital component of the definition of self-correction.

Let me supply a few missing details for those who applauded Li’s characterization of 64 years of China’s one-party system as one of serial self-corrections. Between 1949 and 2012, there were six top leaders of the CCP. Of these six, two were abruptly and unceremoniously forced out of power (and one of the two was dismissed without any due process, even according to the CCP’s own procedures). A third leader fell from power and was put under house arrest for 15 years until his death. That is three strikes out of six who did not exit power on their intended terms. Two of Mao’s anointed successors died on the job, one in a fiery plane crash as he tried to escape to the Soviet Union, the other tortured to death and buried with a fake name. Oh, did I mention that 30 million people were estimated to have died from Mao’s disastrous Great Leap Forward, and that probably millions more died in the violence of the Cultural Revolution? Also, do you know that Mao not only persisted in but accelerated his Great Leap Forward policies after the evidence of the extent of the famine became crystal-clear?

Li calls the policy changes after these wrenching tumults “self-corrections.” His reasoning is that an entity called the CCP, and nobody else, introduced these policy changes. First of all, doesn’t that have something to do with the fact that nobody else was allowed a chance to make those policy changes? Secondly, this fixation on who made the policy changes rather than on the circumstances under which they were made is surely problematic. Let’s extend Li’s logic a little further. Shall we rephrase the American Independence Movement as a self-correction by the British? Or the ceding of British imperial authority over India as another British self-correcting act? Shall we re-label the Japanese surrender that ended the Second World War as a self-correction by the Japanese? Yes, there were two atomic bombs dropped on Hiroshima and Nagasaki and all of that, but didn’t the representatives of Emperor Hirohito sign the Japanese Instrument of Surrender aboard the battleship USS Missouri?

To a hammer, everything is a nail. Li sees the ills of democracies everywhere — financial crises in Europe and the United States, money politics and corruption. I readily agree that money politics in America is a huge problem and that it is indeed making the system utterly dysfunctional. But let’s be very clear about exactly how and why money politics is dysfunctional. It is dysfunctional precisely because it is fundamentally antithetical to democracy. Money politics is a perversion of democracy. It undermines and invalidates a canonical pillar of democracy — one person, one vote. To be logically consistent, Li should celebrate money politics, because it is moving the United States toward the authoritarian way of politics that he is so enamored of.

This may be a shocking revelation to Li, but US and European democracies did not patent the financial crisis. Many authoritarian regimes have experienced catastrophic financial and economic crises. Think of Indonesia in 1997 and the multiple junta regimes in Latin America in the 1970s and 1980s. The only authoritarian regimes that went without an explicit financial crisis were centrally planned economies, such as Romania and East Germany. But this is entirely because they failed to meet a minimum condition for having a financial crisis — having a financial system. The consequences of this defect are well known — in lieu of sharp cyclical ups and downs, these countries produced long-term economic stagnation. A venture capitalist would not fare well in that system.

Li claims that he has studied the ability of democracies to deliver performance. At least in his talk, the evidence that he has done so is not compelling. There is no evidence that countries pay an economic price for being democratic. (It is also important to note that there is no compelling global evidence that democracies necessarily outperform autocracies in economic growth either. Some do and some do not. The conclusion is case by case.) But in the areas of public services, the evidence is in favor of democracies. Two academics, David Lake and Matthew Baum, show that democracies are superior to authoritarian countries in providing public services, such as health and education. Not just established democracies do a better job; countries that transitioned to democracies experienced an immediate improvement in the provision of these public services, and countries that reverted back to authoritarianism typically suffered a setback.

Li blames low growth in Europe and in the United States on democracy. I can understand why he holds this view, because it rests on a mistake often made by casual observers — China is growing at 8 or 9 percent and the US is growing at 1 or 2 percent. He is mistaking a mathematical effect, lower growth from a higher base, for a political effect of democracies suppressing growth. Because democratic countries are typically richer and have much higher per-capita GDP, it is much harder for them to grow at the same rate as poor — and authoritarian — countries with a lower level of per-capita GDP. Let me provide an analogy. A 15-year-old boy is probably more likely to go to see a movie or hang out with his friends on his own than a 10-year-old, because he is older and more mature. It is also likely that he will not grow as fast as the 10-year-old, because he is nearer to the plateau of human height. It would be foolish to claim, as Li’s logic basically does, that the 15-year-old is growing more slowly because he is going to movies on his own.

Li is much clearer about the fact that he dislikes democracy than he is about the reasons why he dislikes it. He rejects democracy on cultural grounds, asserting in his speech that democracy is an alien concept for Chinese culture. This view would be almost amusing if not for its consequential implications. Last time I checked, venture capital is a foreign concept, but that apparently has not stopped Li from practicing and prospering from it. (And I presume “Eric” to be foreign in origin? I may be wrong on this.) Conversely, would Li insist on adhering to each and every precept of Chinese culture and tradition? Would he object to abolishing the practice of binding Chinese women’s feet?

The simple fact is that the Chinese have accepted many foreign concepts and practices already. (Just a reminder: Marxism to the Chinese is as Western as Adam Smith.) It is a perfectly legitimate debate about which foreign ideas and practices China ought to accept, adopt or adapt, but this debate is about which ideas China should adopt, not whether China should adopt any foreign ideas and practices at all.

If the issue is about which ideas or practices to adopt or reject, then, unlike Li, I do not feel confident enough to know exactly which foreign ideas and practices 1.3 billion Chinese people want to embrace or reject. A cultural argument against democracy does not logically lead to making democracy unavailable to the Chinese, but to a course of action in which the Chinese people themselves decide on the merits or demerits of democracy. Furthermore, if the Chinese themselves would reject democracy on their own, isn’t it redundant to expend massive resources to fight and suppress it? Aren’t there better ways to spend this money?

So far this debate has not occurred in China, because having this debate in the first place requires some democracy. But it has occurred in other Chinese environments, and the outcome of those debates is that there is nothing fundamentally incompatible between Chinese culture and democracy. Hong Kong, although without an electoral democratic system, has press freedom and rule of law, and there is no evidence that the place has fallen into chaos and anarchy. Taiwan today has a vibrant democracy, and many mainland travelers to Taiwan often marvel that Taiwanese society is not only democratic but also far more adherent to Chinese traditions than mainland China. (I have always felt that those who believe that democracy and Chinese culture are incompatible are closet supporters of Taiwanese independence. They exclude Taiwanese as Chinese.)

Indeed, Li himself has accepted quite a few political reforms that are normally considered “Western.” NGOs are okay, and even some press freedom is okay. He also endorses some intra-party democracy. These are all sensible steps toward making the Chinese system more democratic than the Maoist system, and I am all for them. The difference is that I see the freedom to vote and multi-party competition as natural and logical extensions of these initial reforms, whereas Li draws a sharp line in the sand between the political reforms that have already occurred and the potential political reforms that some of us have advocated. As much as I have tried, I fail to see any difference in principle between these partial reforms and the more complete reforms encompassing democracy.

There is a very curious way Li objects to democracy: He objects to many of the mechanics of democracy. In particular, he has a thing against voting. But the problem is that voting is simply a way to implement the practice of democracy, and even Li endorses some democracy. For example, he favors intra-party democracy. Fine, I do too; but how do you implement intra-party democracy without voting? This is a bit like praising tennis as a sport but condemning the use of a racket to play it.

Li has not provided a coherent and logical argument for his positions on democracy. I suspect, although I do not have any direct evidence, that there is a simple modus operandi — endorsing reforms the CCP has endorsed and opposing reforms the CCP has opposed. This is fine as far as posturing goes, but it is not a principled argument for anything.

That said, I believe it is perfectly healthy and indeed essential to have a rigorous debate on democracy — but that debate ought to be based on data, facts, logic and reasoning. By this criterion, Li’s talk does not start that debate.

In this respect, however, democracy and autocracy are not symmetrical. In a democracy, we can debate and challenge democracy and autocracy alike, as Li did when he put down George W. Bush (which I greatly enjoyed) and as I do here. But those in an autocracy can challenge democracy only. (Brezhnev, upon being informed that there were protesters shouting “Down with Reagan” in front of the White House and that the US government could not do anything to them, reportedly told Reagan, “There are people shouting ‘Down with Reagan’ on Red Square, and I am not doing anything to them.”) I have no trouble with people challenging people in power and being skeptical about democracy. In fact, the ability to do so in a democracy is the very strength of democracy, and a vital source of human progress. Copernicus was Copernicus because he overturned, not because he re-created, Ptolemaic astronomy. But by the same criterion, I do have trouble with people who do not see the merit of extending the same freedom they enjoy to those who currently do not have it.

Like Li, I do not like the messianic tone some have invoked to support democracy. I support democracy on pragmatic grounds. The single most important benefit of democracy is its ability to tame violence. In The Better Angels of Our Nature, Pinker provided these startling statistics: During the 20th century, totalitarian regimes were responsible for 138 million deaths, of which 110 million occurred in communist countries. Authoritarian regimes caused another 28 million deaths. Democracies killed 2 million, mainly in their colonies as well as with food blockades and civilian bombings during the wars. Democracies, as Pinker pointed out, have trouble even bringing themselves to execute serial murderers. Democracies, Pinker argued, have “a tangle of institutional restraints, so a leader can’t just mobilize armies and militias on a whim to fan out over the country and start killing massive numbers of citizens.”

Contrary to what he was apparently told when he was a Berkeley hippie, the idea of democracy is not that it leads to a nirvana but that it can help prevent a living hell. Democracy has many, many problems. This insurance function of democracy — mitigating disasters — is often forgotten or taken for granted, but it is the single most important reason why democracy is superior to every other political system so far invented by human beings. Maybe one day there will be a better system than democracy, but the Chinese political system, in Li’s rendition, is not one of them.

How Google Interferes With Its Search Algorithms and Changes Your Results

The internet giant uses blacklists, algorithm tweaks and an army of contractors to shape what you see

Every minute, an estimated 3.8 million queries are typed into Google, prompting its algorithms to spit out results for hotel rates or breast-cancer treatments or the latest news about President Trump.

They are arguably the most powerful lines of computer code in the global economy, controlling how much of the world accesses information found on the internet, and the starting point for billions of dollars of commerce.

Twenty years ago, Google’s founders began building a goliath on the premise that its search algorithms could do a better job combing the web for useful information than humans could. Google executives have said repeatedly—in private meetings with outside groups and in congressional testimony—that the algorithms are objective and essentially autonomous, unsullied by human biases or business considerations.

The company states in a Google blog, “We do not use human curation to collect or arrange the results on a page.” It says it can’t divulge details about how the algorithms work because the company is involved in a long-running and high-stakes battle with those who want to profit by gaming the system.

But that message often clashes with what happens behind the scenes. Over time, Google has increasingly re-engineered and interfered with search results to a far greater degree than the company and its executives have acknowledged, a Wall Street Journal investigation has found.

Those actions often come in response to pressure from businesses, outside interest groups and governments around the world. They have increased sharply since the 2016 election and the rise of online misinformation, the Journal found.

Google’s evolving approach marks a shift from its founding philosophy of “organizing the world’s information,” to one that is far more active in deciding how that information should appear.

More than 100 interviews and the Journal’s own testing of Google’s search results reveal:

• Google made algorithmic changes to its search results that favor big businesses over smaller ones, and in at least one case made changes on behalf of a major advertiser, eBay Inc., contrary to its public position that it never takes that type of action. The company also boosts some major websites, such as Amazon.com Inc. and Facebook Inc., according to people familiar with the matter.

• Google engineers regularly make behind-the-scenes adjustments to other information the company is increasingly layering on top of its basic search results. These features include auto-complete suggestions, boxes called “knowledge panels” and “featured snippets,” and news results, which aren’t subject to the same company policies limiting what engineers can remove or change.

• Despite publicly denying doing so, Google keeps blacklists to remove certain sites or prevent others from surfacing in certain types of results. These moves are separate from those that block sites as required by U.S. or foreign law, such as sites featuring child abuse or infringing copyright, and from changes designed to demote spam sites, which attempt to game the system to appear higher in results.

• In auto-complete, the feature that predicts search terms as the user types a query, Google’s engineers have created algorithms and blacklists to weed out more-incendiary suggestions for controversial subjects, such as abortion or immigration, in effect filtering out inflammatory results on high-profile topics. (A minimal sketch of this kind of filtering appears after this list.)

• Google employees and executives, including co-founders Larry Page and Sergey Brin, have disagreed on how much to intervene on search results and to what extent. Employees can push for revisions in specific search results, including on topics such as vaccinations and autism.

• To evaluate its search results, Google employs thousands of low-paid contractors whose purpose the company says is to assess the quality of the algorithms’ rankings. Even so, contractors said Google gave feedback to these workers to convey what it considered to be the correct ranking of results, and they revised their assessments accordingly, according to contractors interviewed by the Journal. The contractors’ collective evaluations are then used to adjust algorithms.
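As flagged in the auto-complete item above, the mechanism the Journal describes (an algorithm proposes completions, then a blacklist screens them) can be sketched in a few lines. This is a toy illustration, not Google's code; the term list, function name and data below are invented.

```python
# Toy sketch of blacklist filtering for auto-complete suggestions.
# Hypothetical terms and data; not Google's actual code or term lists.

BLACKLISTED_FRAGMENTS = {"is evil", "is a fraud"}  # invented examples

def filter_suggestions(suggestions: list[str]) -> list[str]:
    """Drop any predicted completion that contains a blacklisted fragment."""
    return [
        s for s in suggestions
        if not any(frag in s.lower() for frag in BLACKLISTED_FRAGMENTS)
    ]

raw = ["candidate x is a fraud", "candidate x is from ohio"]  # algorithmic output
print(filter_suggestions(raw))  # ['candidate x is from ohio']
```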

THE JOURNAL’S FINDINGS undercut one of Google’s core defenses against global regulators worried about how it wields its immense power—that the company doesn’t exert editorial control over what it shows users. Regulators’ areas of concern include anticompetitive practices, political bias and online misinformation.

Far from being autonomous computer programs oblivious to outside pressure, Google’s algorithms are subject to regular tinkering from executives and engineers who are trying to deliver relevant search results, while also pleasing a wide variety of powerful interests and driving its parent company’s more than $30 billion in annual profit. Google is now the most highly trafficked website in the world, and it holds more than 90% of the search-engine market. The market capitalization of its parent, Alphabet Inc., is more than $900 billion.

Google made more than 3,200 changes to its algorithms in 2018, up from more than 2,400 in 2017 and from about 500 in 2010, according to Google and a person familiar with the matter. Google said 15% of queries today are for words, or combinations of words, that the company has never seen before, putting more demands on engineers to make sure the algorithms deliver useful results.

A Google spokeswoman disputed the Journal’s conclusions, saying, “We do today what we have done all along, provide relevant results from the most reliable sources available.”

Lara Levin, the spokeswoman, said the company is transparent in its guidelines for evaluators and in what it designs the algorithms to do.

AS PART OF ITS EXAMINATION, the Journal tested Google’s search results over several weeks this summer and compared them with results from two competing search engines, Microsoft Corp.’s Bing and DuckDuckGo, a privacy-focused company that builds its results from syndicated feeds from other companies, including Verizon Communications Inc.’s Yahoo search engine.

The testing showed wide discrepancies in how Google handled auto-complete queries and some of what Google calls organic search results—the list of websites that Google says are algorithmically sorted by relevance in response to a user’s query. (Read about the methodology for the Journal’s analysis.)

Ms. Levin, the Google spokeswoman, declined to comment on specific results of the Journal’s testing. In general, she said, “Our systems aim to provide relevant results from authoritative sources,” adding that organic search results alone “are not representative of the information made accessible via search.”

The Journal tested the auto-complete feature, which Google says draws from its vast database of search information to predict what a user intends to type, as well as data such as a user’s location and search history. The testing showed the extent to which Google doesn’t offer certain suggestions compared with other search engines.

When the Journal typed “Joe Biden is” or “Donald Trump is” into the search box, Google’s auto-complete offered predicted language that was more innocuous than what the other search engines suggested. Similar differences appeared for the other presidential candidates the Journal tested.

The Journal also tested several search terms in auto-complete such as “immigrants are” and “abortion is.” Google’s predicted searches were less inflammatory than those of the other engines.

See the results of the Journal’s auto-complete tests
The original article included an interactive lookup tool for the search terms analyzed. Percentages indicate how many times each suggestion appeared during the WSJ’s testing. One example:

GOOGLE
  • done (100%)
  • how old (100%)
  • from (99%)
  • running for president (79%)
  • he democrat (78%)
  • he running for president (76%)
  • toast (71%)
  • a democrat (70%)

DUCKDUCKGO
  • an idiot (100%)
  • creepy (100%)
  • from what state (100%)
  • too old to run for president (100%)
  • a moron (94%)
  • a liar (84%)
  • a joke (78%)
  • done (22%)
  • a creep (22%)

Gabriel Weinberg, DuckDuckGo’s chief executive, said that for certain words or phrases entered into the search box, such as ones that might be offensive, DuckDuckGo has decided to block all of its auto-complete suggestions, which it licenses from Yahoo. He said that type of block wasn’t triggered in the Journal’s searches for Donald Trump or Joe Biden.

A spokeswoman for Yahoo operator Verizon Media said, “We are committed to delivering a safe and trustworthy search experience to our users and partners, and we work diligently to ensure that search suggestions within Yahoo Search reflect that commitment.”

Said a Microsoft spokeswoman: “We work to ensure that our search results are as relevant, balanced, and trustworthy as possible, and in general, our rule is to minimize interference with the normal algorithmic operation.”

In other areas of the Journal analysis, Google’s results in organic search and news for a number of hot-button terms and politicians’ names showed prominent representation of both conservative and liberal news outlets.

ALGORITHMS ARE effectively recipes in code form, providing step-by-step instructions for how computers should solve certain problems. They drive not just the internet, but the apps that populate phones and tablets.

Algorithms determine which friends show up in a Facebook user’s news feed, which Twitter posts are most likely to go viral and how much an Uber ride should cost during rush hour as opposed to the middle of the night. They are used by banks to screen loan applications, businesses to look for the best job applicants and insurers to determine a person’s expected lifespan.
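To make the “recipe” metaphor concrete, here is a deliberately simplified sketch of the rush-hour pricing example above. Every number and rule in it is invented for illustration; real ride pricing draws on live supply-and-demand data.

```python
# A toy "recipe in code form": the step-by-step instructions a computer
# might follow to price a ride. All numbers are invented for illustration.

def ride_price(base_fare: float, miles: float, hour: int) -> float:
    price = base_fare + 1.75 * miles           # step 1: charge by distance
    if 7 <= hour <= 9 or 16 <= hour <= 19:     # step 2: is it rush hour?
        price *= 1.5                           # step 3: apply a surge multiplier
    return round(price, 2)

print(ride_price(2.50, 5.0, 8))   # rush hour -> 16.88
print(ride_price(2.50, 5.0, 3))   # middle of the night -> 11.25
```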

In the beginning, their power was rarely questioned. At Google in particular, its innovative algorithms ranked web content in a way that was groundbreaking, and hugely lucrative. The company aimed to make the web useful while relying on the assumption that code alone could do the heavy lifting of figuring out how to rank information.

But bad actors are increasingly trying to manipulate search results, businesses are trying to game the system and misinformation is rampant across tech platforms. Google found itself facing a version of the pressures on Facebook, which long said it was just connecting people but has been forced to more aggressively police content on its platform.

A 2016 internal investigation at Google showed between a 10th of a percent and a quarter of a percent of search queries were returning misinformation of some kind, according to one Google executive who works on search. It was a small number percentage-wise, but given the huge volume of Google searches it would amount to nearly two billion searches a year.
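The arithmetic behind that estimate is easy to reproduce from the figures in this article (3.8 million queries a minute; misinformation on between a tenth and a quarter of a percent of queries):

```python
# Back-of-the-envelope check of the article's figures.
queries_per_minute = 3.8e6
queries_per_year = queries_per_minute * 60 * 24 * 365   # ~2.0 trillion
low_rate, high_rate = 0.001, 0.0025                     # 0.1% to 0.25%

print(round(low_rate * queries_per_year / 1e9, 1))   # ~2.0 billion queries a year
print(round(high_rate * queries_per_year / 1e9, 1))  # ~5.0 billion queries a year
```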

By comparison, Facebook faced congressional scrutiny for Russian misinformation that was viewed by 126 million users.

Google’s Ms. Levin said the number includes not just misinformation but also a “wide range of other content defined as lowest quality.” She disputed the Journal’s estimate of the number of searches that were affected. The company doesn’t disclose metrics on Google searches.

Google assembled a small SWAT team to work on the problem that became known internally as “Project Owl.” Borrowing from the strategy used earlier to fight spam, engineers worked to emphasize factors on a page that are proxies for “authoritativeness,” effectively pushing down pages that don’t display those attributes.
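The article doesn't disclose what those proxies are, but the general mechanism (score pages on signals that stand in for authoritativeness, so pages lacking them sink) can be sketched. The signal names and weights below are assumptions made purely for illustration.

```python
# Minimal sketch of re-ranking with "authoritativeness" proxies.
# The proxy signals and weights are invented; Google's are not public.

def adjusted_score(relevance: float, signals: dict) -> float:
    bonus = 0.1 * signals.get("cites_sources", 0) \
          + 0.1 * signals.get("named_author", 0)
    # Pages without the proxies keep only their base relevance,
    # so they drift down relative to pages that have them.
    return relevance + bonus

pages = [
    ("conspiracy-blog.example", 0.80, {}),
    ("newspaper.example", 0.75, {"cites_sources": 1, "named_author": 1}),
]
ranked = sorted(pages, key=lambda p: adjusted_score(p[1], p[2]), reverse=True)
print([name for name, _, _ in ranked])  # newspaper.example now ranks first
```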

Other tech platforms, including Facebook, have taken a more aggressive approach, manually removing problem content and devising rules around what they define as misinformation. Google, for its part, has said that because it “indexes” content rather than “hosting” it, as Facebook does, it shouldn’t take a more active role.

One Google search executive described the problem of defining misinformation as incredibly hard, and said the company didn’t want to go down the path of figuring it out.

Around the time Google started addressing issues such as misinformation, it started fielding even more complaints, to the point where human interference became more routine, according to people familiar with the matter, putting it in the position of arbitrating some of society’s most complicated issues. Some changes to search results might be considered reasonable—boosting trusted websites like the National Suicide Prevention Lifeline, for example—but Google has made little disclosure about when changes are made, or why.

Businesses, lawmakers and advertisers are worried about fairness and competition within the markets where Google is a leading player, and as a result its operations are coming under heavy scrutiny.

The U.S. Justice Department earlier this year opened an antitrust probe, in which Google’s search policies and practices are expected to be areas of focus. Google executives have twice been called to testify before Congress in the past year over concerns about political bias. In the European Union, Google has been fined more than $9 billion in the past three years for anticompetitive practices, including allegedly using its search engine to favor its own products.

In response, Google has said it faces tough competition in a dynamic tech sector, and that its behavior is aimed at helping create choice for consumers, not hurting rivals. The company is currently appealing the decisions against it in the EU, and it has denied claims of political bias.

GOOGLE RARELY RELEASES detailed information on algorithm changes, and its moves have bedeviled companies and interest groups, who feel they are operating at the tech giant’s whim.

In one change hotly contested within Google, engineers opted to tilt results to favor prominent businesses over smaller ones, based on the argument that customers were more likely to get what they wanted at larger outlets. One effect of the change was a boost to Amazon’s products, even if the items had been discontinued, according to people familiar with the matter.

The issue came up repeatedly over the years at meetings in which Google search executives discuss algorithm changes. Each time, they chose not to reverse the change, according to a person familiar with the matter.

Google engineers said it is widely acknowledged within the company that search is a zero-sum game: A change that helps lift one result inevitably pushes down another, often with considerable impact on the businesses involved.

Ms. Levin said there is no guidance in Google’s rater guidelines that suggests big sites are inherently more authoritative than small sites. “It’s inaccurate to suggest we did not address issues like discontinued products appearing high up in results,” she added.

Many of the changes within Google have coincided with its gradual evolution from a company with an engineering-focused, almost academic culture, into an advertising behemoth and one of the most profitable companies in the world. Advertising revenue—which includes ads on search as well as on other products such as maps and YouTube—was $116.3 billion last year.

Some very big advertisers received direct advice on how to improve their organic search results, a perk not available to businesses with no contacts at Google, according to people familiar with the matter. In some cases, that help included sending in search engineers to explain a problem, they said.

“If they have an [algorithm] update, our teams may get on the phone with them and they will go through it,” said Jeremy Cornfeldt, the chief executive of the Americas of Dentsu Inc.’s iProspect, which Mr. Cornfeldt said is one of Google’s largest advertising agency clients. He said the agency doesn’t get information Google wouldn’t share publicly. Among others it can disclose, iProspect represents Levi Strauss & Co., Alcon Inc. and Wolverine World Wide Inc.

One former executive at a Fortune 500 company that received such advice said Google frequently adjusts how it crawls the web and ranks pages to deal with specific big websites.

Google updates its index of some sites such as Facebook and Amazon more frequently, a move that helps them appear more often in search results, according to a person familiar with the matter.

“There’s this idea that the search algorithm is all neutral and goes out and combs the web and comes back and shows what it found, and that’s total BS,” the former executive said. “Google deals with special cases all the time.”

Ms. Levin, the Google spokeswoman, said the search team’s practice is to not provide specialized guidance to website owners. She also said that faster indexing of a site isn’t a guarantee that it will rank higher. “We prioritize issues based on impact, not any commercial relationships,” she said.

[Chart: Alphabet’s net income by year, from 2005 onward, in billions of dollars.]

Note: 2017 figure reflects a one-time charge of $9.9 billion related to new U.S. tax law. Alphabet was created through a corporate restructuring of Google in 2015. Figures for prior years are for Google Inc.

Source: FactSet

Online marketplace eBay had long relied on Google for as much as a third of its internet traffic. In 2014, traffic suddenly plummeted—contributing to a $200 million hit in its revenue guidance for that year.

Google told the company it had made a decision to lower the ranking of a large number of eBay pages that were a big source of traffic.

EBay executives debated pulling their quarterly advertising spending of around $30 million from Google to protest, but ultimately decided to step up lobbying pressure on Google, with employees and executives calling and meeting with search engineers, according to people familiar with the matter. A similar episode had hit traffic several years earlier, and eBay had marshaled its lobbying might to persuade Google to give it advice about how to fix the problem, even relying on a former Google staffer who was then employed at eBay to work his contacts, according to one of those people.

This time, Google ultimately agreed to improve the ranking of a number of pages it had demoted while eBay completed a broader revision of its website to make the pages more “useful and relevant,” the people said. The revision was arduous and costly to complete, one of the people said, adding that eBay was later hit by other downrankings that Google didn’t help with.

“We’ve experienced significant and consistent drops in Google SEO for many years, which has been disproportionally detrimental to those small businesses that we support,” an eBay spokesman said. SEO, or search-engine optimization, is the practice of trying to generate more search-engine traffic for a website.

Google’s Ms. Levin declined to comment on eBay.

Companies without eBay’s clout had different experiences.

Dan Baxter can remember the exact moment his website, DealCatcher, was caught in a Google algorithm change. It was 6 p.m. on Sunday, Feb. 18. Mr. Baxter, who founded the Wilmington, Del., coupon website 20 years ago, got a call from one of his 12 employees the next morning.

“Have you looked at our traffic?” the worker asked, frantically, Mr. Baxter recalled. It was suddenly down 93% for no apparent reason. That Saturday, DealCatcher saw about 31,000 visitors from Google. Now it was posting about 2,400. It had disappeared almost entirely on Google search.

Mr. Baxter said he didn’t know whom to contact at Google, so he hired a consultant to help him identify what might have happened. The expert reached out directly to a contact at Google but never heard back. Mr. Baxter tried posting to a YouTube forum hosted by a Google “webmaster” to ask if it might have been a technical problem, but the webmaster seemed to shoot down that idea.

One month to the day after his traffic disappeared, it inexplicably came back, and he still doesn’t know why.

“You’re kind of just left in the dark, and that’s the scary part of the whole thing,” said Mr. Baxter.

Google’s Ms. Levin declined to comment on DealCatcher.

(The Wall Street Journal is owned by News Corp, which has complained publicly about Google’s moves to play down news sites that charge for subscriptions. Google later ended that policy after intensive lobbying by News Corp and other paywalled publishers. More recently, News Corp has called for an “algorithm review board” to oversee Google, Facebook and other tech giants. News Corp has a commercial agreement to supply news through Facebook, and Dow Jones & Co., publisher of The Wall Street Journal, has a commercial agreement to supply news through Apple services. Google’s Ms. Levin and News Corp declined to comment.)

GOOGLE IN RECENT months has made additional efforts to clarify how its services operate by updating general information on its site. At the end of October it posted a new video titled “How Google Search Works.”

Jonathan Zittrain, a Harvard Law School professor and faculty director of the Berkman Klein Center for Internet & Society, said Google has poorly defined how often or when it intervenes on search results. The company’s argument that it can’t reveal those details because it is fighting spam “seems nuts,” said Mr. Zittrain.

“That argument may have made sense 10 or 15 years ago but not anymore,” he said. “That’s called ‘security through obscurity,’ ” a reference to the now-unfashionable engineering idea that systems can be made more secure by restricting information about how they operate.

Google’s Ms. Levin said “extreme transparency has historically proven to empower bad actors in a way that hurts our users and website owners who play by the rules.”

“Building a service like this means making tens of thousands of really, really complicated human decisions, and that’s not what people think,” said John Bowers, a research associate at the Berkman Klein Center.

On one extreme, those decisions at Google are made by the world’s most accomplished and highest-paid engineers, whose job is to turn the dials within millions of lines of complex code. On the other is an army of more than 10,000 contract workers, who work from home and get paid by the hour to evaluate search results.

The rankings supplied by the contractors, who work from a Google manual that runs to hundreds of pages, can indirectly move a site higher or lower in results, according to people familiar with the matter. And their collective responses are measured by Google executives and used to affect the search algorithms.
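Based only on the description above (individual ratings are collected, and the aggregate feeds back into the algorithms), the loop might look something like the following sketch. The rating scale and data are hypothetical; Google's rater guidelines define their own scales.

```python
# Hypothetical sketch of aggregating contractor ratings into a signal
# that ranking engineers could tune against. Data and scale are invented.
from statistics import mean

ratings = {                       # 1 = lowest quality, 5 = highest (assumed scale)
    "example.com/guide": [4, 5, 4, 3, 5],
    "spammy.example/page": [1, 2, 1, 1, 2],
}

# The aggregate acts as a label: results that raters collectively score
# low become candidates for demotion in future algorithm updates.
labels = {url: mean(scores) for url, scores in ratings.items()}
print(labels)  # {'example.com/guide': 4.2, 'spammy.example/page': 1.4}
```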

Google’s results page has become a complex mix of search results, advertisements and featured content, not always distinguishable by the user. While these features are all driven by algorithms, Google has different policies and attitudes toward changing the results shown in each of the additional features. Featured snippets and knowledge panels are two common features.

The features annotated in the article’s results-page graphic:

Featured snippet: Highlights web pages that Google thinks will contain content a user is looking for. Google says it will remove content from the feature if it violates policies around harmful and hateful content.

Knowledge panel: Information Google has compiled from various sources on the web, such as Wikipedia, that provides basic facts about the subject of your query. Google is willing to adjust this material.

Organic search results: Links to results that Google’s algorithms have determined are relevant to your query. Google says it doesn’t curate these results.

One of those evaluators was Zack Langley, now a 27-year-old logistics manager at a tour company in New Orleans. Mr. Langley got a one-year contract in the spring of 2016 evaluating Google’s search results through Lionbridge Technologies Inc., one of several companies Google and other tech platforms use for contract work.

During his time as a contractor, Mr. Langley said he never had any contact with anyone at Google, nor was he told what his results would be used for. Like all of Google’s evaluators, he signed a nondisclosure agreement. He made $13.50 an hour and worked up to 20 hours a week from home.

Sometimes working in his pajamas, Mr. Langley was given hundreds of real search results and told to use his judgment to rate them according to quality, reputation and usefulness, among other factors.

At one point, Mr. Langley said he was unhappy with the search results for “best way to kill myself,” which were turning up links that were like “how-to” manuals. He said he down-ranked all the other results for suicide until the National Suicide Prevention Lifeline was the No. 1 result.

Soon after, Mr. Langley said, Google sent a note through Lionbridge saying the hotline should be ranked as the top result across all searches related to suicide, so that the collective rankings of the evaluators would adjust the algorithms to deliver that result. He said he never learned if his actions had anything to do with the change.

Mr. Langley said it seemed like Google wanted him to change content on search so Google would have what he called plausible deniability about making those decisions. He said contractors would get notes from Lionbridge that he believed came from Google telling them the “correct” results on other searches.

He said that in late 2016, as the election approached, Google officials got more involved in dictating the best results, although not necessarily on issues related to the campaign. “They used to have a hands-off approach, and then it seemed to change,” he said.

Ms. Levin, the Google spokeswoman, said the company “long ago evolved our approach to collecting feedback on these types of queries, which help us develop algorithmic solutions and features in this area.” She added: “We provide updates to our rater guidelines to ensure all raters are following the same general framework.”

Lionbridge didn’t reply to requests for comment.

AT GOOGLE, EMPLOYEES routinely use the company’s internal message boards as well as a form called “go/bad” to push for changes in specific search results. (Go/bad is a reporting system meant to allow Google staff to point out problematic search results.)

One of the first hot-button issues surfaced in 2015, according to people familiar with the matter, when some employees complained that a search for “how do vaccines cause autism” delivered misinformation through sites that oppose vaccinations.

At least one employee defended the result, writing that Google should “let the algorithms decide” what shows up, according to one person familiar with the matter. Instead, the people said, Google made a change so that the first result is a site called howdovaccinescauseautism.com—which states on its home page in large black letters, “They f—ing don’t.” (The phrase has become a meme within Google.)

Google’s Ms. Levin declined to comment.

In the fall of 2018, the conservative news site Breitbart News Network posted a leaked video of Google executives, including Mr. Brin and Google CEO Sundar Pichai, upset and addressing staffers following President Trump’s election two years earlier. A group of Google employees noticed the video was appearing on the 12th page of search results when Googling “leaked Google video Trump,” which made it seem like Google was burying it. They complained on one of the company’s internal message boards, according to people familiar with the matter. Shortly after, the leaked video began appearing higher in search results.

“When we receive reports of our product not behaving as people might expect, we investigate to see if there’s any useful insight to inform future improvements,” said Ms. Levin.

FROM GOOGLE’S FOUNDING, Messrs. Page and Brin knew that ranking webpages was a matter of opinion. “The importance of a Web page is an inherently subjective matter, which depends on the [readers’] interests, knowledge and attitudes,” they wrote in their 1998 paper introducing the PageRank algorithm, the founding system that launched the search engine.

PageRank, they wrote, would measure the level of human interest and attention, but it would do so “objectively and mechanically.” They contended that the system would mathematically measure the relevance of a site by the number of times other relevant sites linked to it on the web.
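The core of the original PageRank computation is simple enough to sketch: a page's score is the chance that a "random surfer," who mostly follows links and occasionally jumps to a random page, ends up there. The code below is a simplified rendering of the published 1998 idea, not Google's production system.

```python
# Simplified PageRank: rank flows along links, with a damping factor
# modeling a surfer who occasionally jumps to a random page.

def pagerank(links: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        rank = {
            p: (1 - damping) / n
               + damping * sum(rank[q] / len(links[q])
                               for q in pages if p in links[q])
            for p in pages
        }
    return rank

# Toy web: A and C both link to B, so B accumulates the most rank.
print(pagerank({"A": ["B"], "B": ["C"], "C": ["A", "B"]}))
```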

Today, PageRank has been updated and subsumed into more than 200 different algorithms, attuned to hundreds of signals, now used by Google. (The company replaced PageRank in 2005 with a newer version that could better keep up with the vast traffic that the site was attracting. Internally, it was called “PageRankNG,” ostensibly named for “next generation,” according to people familiar with the matter. In public, the company still points to PageRank—and on its website links to the original algorithm published by Messrs. Page and Brin—in explaining how search works. “The original insight and notion of using link patterns is something that we still use in our systems,” said Ms. Levin.)

By the early 2000s, spammers were overwhelming Google’s algorithms with tactics that made their sites appear more popular than they were, skewing search results. Messrs. Page and Brin disagreed over how to tackle the problem.

Mr. Brin argued against human intervention, contending that Google should deliver the most accurate results as delivered by the algorithms, and that the algorithms should only be tweaked in the most extreme cases. Mr. Page countered that the user experience was getting damaged when users encountered spam rather than useful results, according to people familiar with the matter.

Google already had been taking what the company calls “manual actions” against specific websites that were abusing the algorithm. In that process, Google engineers demote a website’s ranking by changing its specific “weighting.” For example, if a website is artificially boosted by paying other websites to link to it, a behavior that Google frowns upon, Google engineers could turn down the dial on that specific weighting. The company could also blacklist a website, or remove it altogether.
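In code terms, a "manual action" of that kind might amount to a per-site override on one signal's weight. The sketch below is a guess at the shape of the mechanism as the article describes it; the signal names, sites and numbers are all invented.

```python
# Hypothetical sketch of a "manual action": turning down the dial on one
# signal for one site. Names and numbers are invented for illustration.

MANUAL_LINK_WEIGHT = {"link-farm.example": 0.1}   # paid-for links: dial down

def weighted_score(site: str, content_score: float, link_score: float) -> float:
    link_score *= MANUAL_LINK_WEIGHT.get(site, 1.0)  # default: unchanged
    return 0.6 * content_score + 0.4 * link_score

print(weighted_score("honest.example", 0.7, 0.8))     # 0.74
print(weighted_score("link-farm.example", 0.7, 0.8))  # 0.452 -> demoted
```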

Mr. Brin still opposed making large-scale efforts to fight spam, because it involved more human intervention. Mr. Brin, whose parents were Jewish émigrés from the former Soviet Union, even personally decided to allow anti-Semitic sites that were in the results for the query “Jew,” according to people familiar with the decision. Google posted a disclaimer with results for that query saying, “Our search results are generated completely objectively and are independent of the beliefs and preferences of those who work at Google.”

Finally, in 2004, in the bathroom one day at Google’s headquarters in Mountain View, Calif., Mr. Page approached Ben Gomes, one of Google’s early search executives, to express support for his efforts fighting spam. “Just do what you need to do,” said Mr. Page, according to a person familiar with the conversation. “Sergey is going to ruin this f—ing company.”

Ms. Levin, the Google spokeswoman, said Messrs. Page, Brin and Gomes declined to comment.

After that, the company revised its algorithms to fight spam and loosened rules for manual interventions, according to people familiar with the matter.

Google has guidelines for changing its ranking algorithms, a grueling process called the “launch committee.” Google executives have pointed to this process in a general way in congressional testimony when asked about algorithm changes.

The process is like defending a thesis, and the meetings can be contentious, according to people familiar with them.

In part because the process is laborious, some engineers aim to avoid it if they can, one of these people said, and small changes can sometimes get pushed through without the committee’s approval. Mr. Gomes is on the committee that decides whether to approve the changes, and other senior officials sometimes attend as well.

Google’s Ms. Levin said not every algorithm change is discussed in a meeting, but “there are other processes for reviewing more straightforward launches at different levels of the organization,” such as an email review. Those reviews still involve members of the launch committee, she said.

Today, Google discloses only a few of the factors being measured by its algorithms. Known ones include “freshness,” which gives preference to recently created content for searches relating to things such as breaking news or a sports event. Another is where a user is located—if a user searches for “zoo,” Google engineers want the algorithms to provide the best zoo in the user’s area. Language signals—how meanings change when words are used together, such as April and fools—are among the most important, as they help determine what a user is actually asking for.

Other important signals have included the length of time users would stay on pages they clicked on before clicking back to Google, according to a former Google employee. Long stays would boost a page’s ranking. Quick bounce backs, indicating a site wasn’t relevant, would severely hurt a ranking, the former employee said.
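Purely as illustration, a toy ranking function might blend a few of the signals described above and penalize quick bounce-backs; the field names, weights and formula here are invented, and Google’s real system blends hundreds of signals across many algorithms.

```python
# Toy blend of a few publicly described signal types. The weights
# and field names are invented; this is not Google's formula.

def score(page):
    s = 0.0
    s += 2.0 * page["language_fit"]      # does the page match intent?
    s += 1.0 * page["freshness"]         # newer content for news queries
    s += 1.0 * page["location_match"]    # e.g. the zoo near the user
    s += 1.5 * page["avg_dwell_seconds"] / 60.0  # long stays boost rank
    if page["quick_bounce_rate"] > 0.5:  # fast bounce-backs hurt badly
        s *= 0.5
    return s

candidate = {"language_fit": 0.9, "freshness": 0.2, "location_match": 1.0,
             "avg_dwell_seconds": 95, "quick_bounce_rate": 0.1}
print(round(score(candidate), 2))  # about 5.38 under these invented weights
```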

Over the years, Google’s database recording this user activity has become a competitive advantage, helping cement its position in the search market. Other search engines don’t have the vast quantity of data that is available to Google, search’s market leader.

That makes the impact of its operating decisions immense. When Pinterest Inc. filed to go public earlier this year, it said that “search engines, such as Google, may modify their algorithms and policies or enforce those policies in ways that are detrimental to us.” It added: “Our ability to appeal these actions is limited.” A spokeswoman for Pinterest declined to comment.

Search-engine optimization consultants have proliferated to try to decipher Google’s signals on behalf of large and small businesses. But even those experts said the algorithms remain borderline indecipherable. “It’s black magic,” said Glenn Gabe, an SEO expert who has spent years analyzing Google’s algorithms and tried to help DealCatcher find a solution to its drop in traffic earlier this year.

ALONG WITH ADVERTISEMENTS, Google’s own features now take up large amounts of space on the first page of results—with few obvious distinctions for users. These include news headlines and videos across the top, information panels along the side and “People also ask” boxes highlighting related questions.

Google engineers view the features as separate products from Google search, and there is less resistance to manually changing their content in response to outside requests, according to people familiar with the matter.

These features have become more prominent as Google attempts to keep users on its results page, where ads are placed, instead of losing the users as they click through to other sites. In September, about 55% of Google searches on mobile were “no-click” searches, according to research firm Jumpshot, meaning users never left the results page.

Two typical features on the results page—knowledge panels, which are collections of relevant information about people, events or other things; and featured snippets, which are highlighted results that Google thinks will contain content a user is looking for—are areas where Google engineers make changes to fix results, the Journal found.

Google has looser policies about making adjustments to these features than it does for organic search results. The features include Google News and People also ask, as shown in the graphic below.

[Graphic: a sample results page for a search term, distinguishing Google’s own features from organic search results. Top stories: news articles surfaced as being particularly relevant; Google blocks some sites that don’t meet its policies. People also ask: a predictive feature that suggests related questions, providing short answers with links; Google says it weeds out and blocks some phrases in this feature, as it does in auto-complete.]

In April, the conservative Heritage Foundation called Google to complain that a coming movie called “Unplanned” had been labeled in a knowledge panel as “propaganda,” according to a person familiar with the matter. The film is about a former Planned Parenthood director who had a change of heart and became pro-life.

After the Heritage Foundation complained to a contact at Google, the company apologized and removed “propaganda” from the description, that person said.

Google’s Ms. Levin said the change “was not the result of pressure from an outside group, it was a violation of the feature’s policy.”

On the auto-complete feature, Google reached a confidential settlement in France in 2012 with several outside groups that had complained it was anti-Semitic for Google to suggest the French word for “Jew” when searchers typed in the names of several prominent politicians. Google agreed to “algorithmically mitigate” such suggestions as part of a pact that barred the parties from disclosing its terms, according to people familiar with the matter.

In recent years, Google changed its auto-complete algorithms to remove “sensitive and disparaging remarks.” The policy, now detailed on its website, says that Google doesn’t allow predictions that may be related to “harassment, bullying, threats, inappropriate sexualization, or predictions that expose private or sensitive information.”

GOOGLE HAS BECOME more open about its moderation of auto-complete but still doesn’t disclose its use of blacklists. Kevin Gibbs, who created auto-complete in 2004 when he was a Google engineer, originally developed the list of terms that wouldn’t be suggested, even if they were the most popular queries that independent algorithms would normally supply.

For example, if a user searched “Britney Spears”—a popular search on Google at the time—Mr. Gibbs didn’t want a piece of human anatomy or the description of a sex act to appear when someone started typing the singer’s name. The unfiltered results were “kind of horrible,” Mr. Gibbs said in an interview.
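A minimal sketch of that filtering step, assuming the algorithms first propose the most popular completions and a manually maintained list then strips any suggestion containing a blocked phrase; the list contents and function name are hypothetical.

```python
# Sketch of a suggestion filter: algorithmic completions come in,
# anything containing a blacklisted phrase is dropped. The phrases
# and names here are hypothetical stand-ins.

BLACKLISTED_PHRASES = {"blocked phrase", "another blocked phrase"}

def filter_suggestions(raw_suggestions):
    return [
        s for s in raw_suggestions
        if not any(p in s.lower() for p in BLACKLISTED_PHRASES)
    ]

print(filter_suggestions(["britney spears songs",
                          "britney spears blocked phrase"]))
# -> ['britney spears songs']
```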

He said deciding what should and shouldn’t be on the list was challenging. “It was uncomfortable, and I felt a lot of pressure,” said Mr. Gibbs, who worked on auto-complete for about a year and left the company in 2012. “I wanted to make sure it represented the world fairly and didn’t leave out any groups.”

Google still maintains lists of phrases and terms that are manually blacklisted from auto-complete, according to people familiar with the matter.

The company internally has a “clearly articulated set of policies” about what terms or phrases might be blacklisted in auto-complete, and it follows those rules, according to a person familiar with the matter.

Blacklists also affect the results in organic search and Google News, as well as other search products, such as Web answers and knowledge panels, according to people familiar with the matter.

Google has said in congressional testimony it doesn’t use blacklists. Asked in a 2018 hearing whether Google had ever blacklisted a “company, group, individual or outlet…for political reasons,” Karan Bhatia, Google’s vice president of public policy, responded: “No, ma’am, we don’t use blacklists/whitelists to influence our search results,” according to the transcript.

Ms. Levin said those statements were related to blacklists targeting political groups, which she said the company doesn’t keep.

Google’s first blacklists date to the early 2000s, when the company made a list of spam sites that it removed from its index, one of those people said. This means the sites wouldn’t appear in search results.

Engineers known as “maintainers” are authorized to make and approve changes to blacklists. It takes at least two people to do this; one person makes the change, while a second approves it, according to the person familiar with the matter.
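That arrangement resembles standard two-person change control. A hypothetical sketch, with invented names, of how such a propose-and-approve flow for blacklist edits might look:

```python
# Hypothetical two-person rule for blacklist edits: one maintainer
# proposes a change, a different maintainer must approve it before
# it takes effect. All names here are invented.

class Blacklist:
    def __init__(self):
        self.entries = set()
        self.pending = {}  # change_id -> (action, site, proposer)

    def propose(self, change_id, action, site, proposer):
        self.pending[change_id] = (action, site, proposer)

    def approve(self, change_id, approver):
        action, site, proposer = self.pending.pop(change_id)
        if approver == proposer:
            raise PermissionError("a second maintainer must approve")
        if action == "add":
            self.entries.add(site)
        elif action == "remove":
            self.entries.discard(site)

bl = Blacklist()
bl.propose("cl-1", "add", "deceptive-news.example", proposer="maintainer_a")
bl.approve("cl-1", approver="maintainer_b")  # a second maintainer signs off
print(bl.entries)  # {'deceptive-news.example'}
```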

The Journal reviewed a draft policy document from August 2018 that outlines how Google employees should implement an anti-misinformation blacklist aimed at blocking certain publishers from appearing in Google News and other search products. The document says engineers should focus on “a publisher misrepresenting their ownership or web properties” and having “deceptive content”—that is, sites that actively aim to mislead—as opposed to those that have inaccurate content.

“The purpose of the blacklist will be to bar the sites from surfacing in any Search feature or news product sites,” the document states.

Ms. Levin said Google does “not manually determine the order of any search result.” She said sites that don’t adhere to Google News “inclusion policies” are “not eligible to appear on news surfaces or in information boxes in Search.”

SOME INDIVIDUALS and companies said changes made by the company seem ad hoc or inconsistent. People familiar with the matter said Google increasingly makes manual or algorithmic changes that aren’t acknowledged publicly, in order to maintain its position that it isn’t affected by outside pressure.

“It’s very convenient for us to say that the algorithms make all the decisions,” said one former Google executive.

In March 2017, Google updated the guidelines it gives contractors who evaluate search results, instructing them for the first time to give low-quality ratings to sites “created with the sole purpose of promoting hate or violence against a group of people”—something that would help adjust Google algorithms to lower those sites in search.

The next year, the company broadened the guidance to any pages that promote such hate or violence, even if it isn’t the page’s sole purpose and even if it is “expressed in polite or even academic-sounding language.”

Google has resisted entirely removing some content that outsiders complained should be blocked. In May 2018, Ignacio Wenley Palacios, a Spain-based lawyer working for the Lawfare Project, a nonprofit that funds litigation to protect Jewish people, asked Google to remove an anti-Semitic article lauding a German Holocaust denier posted on a Spanish-language neo-Nazi blog.

The company declined. In an email to Mr. Wenley Palacios, lawyers for Google contended that “while such content is detestable” it isn’t “manifestly illegal” in Spain.

Mr. Wenley Palacios then filed a lawsuit, but in the spring of this year, before the suit could be heard, he said, Google lawyers told him the company was changing its policy on such removals in Spain.

According to Mr. Wenley Palacios, the lawyers said the firm would now remove from searches conducted in Spain any links to Holocaust denial and other content that could hurt vulnerable minorities, once they are pointed out to the company. The results would still be accessible outside of Spain. He said both sides agreed to dismiss the case.

Google’s Ms. Levin described the action as a “legal removal” in accordance with local law. Holocaust denial isn’t illegal in Spain, but if it is coupled with an intent to spread hate, it can fall under Spanish criminal law banning certain forms of hate speech.

“Google used to say, ‘We don’t approve of the content, but that’s what it is,’ ” Mr. Wenley Palacios said. “That has changed dramatically.”

Google’s search results page has changed over the years, becoming much more ad-heavy.

[Graphic: the results page for a search term, showing Google’s other features alongside organic search results. Ads in recent years claim more space at the top of the results page. Vertical search results, various features that present specialized results for specific topics like hotels or places, often with photos or maps, also appear; the results in some of these features are paid advertisements. As Google has placed more ads and verticals at the top of the page, organic search results have shrunk.]

Health policy consultant Greg Williams said he helped lead a campaign to push Google to make changes that would stifle misleading results for queries such as “rehab.”

At the time, in 2017, addiction centers with spotty records were constantly showing up in search results, which are typically the first place family members and addicts turn for help.

Over the last year, Google several times routed Diane Hentges to call centers as she desperately researched drug-addiction treatment centers for her 22-year-old son, she said.

Each time she called one of the facilities listed on Google, a customer-service representative would ask for her financial information, but the representatives didn’t appear to be attached to any legitimate company.

“If you look at a place on Google, it sends you straight to a call center,” Ms. Hentges said, adding that parents who are struggling with a child with addiction “will do anything to get our child healthy. We’ll believe anything.”

After intense lobbying by Mr. Williams and others, Google changed its ad policy around such queries. But addiction industry officials also noticed a significant change to Google search results. Many searches for “rehab” or related terms began returning the website of the Substance Abuse and Mental Health Services Administration, the U.S. Department of Health and Human Services agency that runs the national help hotline, as the top result.

Google never acknowledged the change. Ms. Levin said that “resources are not listed because of any type of partnership” and that “we have algorithmic solutions designed to prioritize authoritative resources (including official hotlines) in our results for queries like these as well as for suicide and self-harm queries.”

A spokesman for SAMHSA said the agency had a partnership with Google.

Google’s search algorithms have been a major focus of Hollywood in its effort to fight pirated TV shows and movies.

[Chart: Alphabet’s revenue by type, advertising and other, from 2005 onward, on a scale of $0 to $150 billion. Note: Alphabet was created through a corporate restructuring of Google in 2015; figures for prior years are for Google Inc. Source: the company.]

Studios “saw this as the potential death knell of their business,” said Dan Glickman, chairman and chief executive of the Motion Picture Association of America from 2004 to 2010. The association has been a public critic of Google. “A hundred million dollars to market a major movie could be thrown away if someone could stream it illegally online.”

Google received a record 1.6 million requests to remove web pages for copyright issues last year, according to the company’s published Transparency Report and a Journal analysis. Those requests pertained to more than 740 million pages, about 12 times the number of web pages it was asked to take down in 2012.

A decade ago, in a concession to the industry, Google removed “download” from its auto-complete suggestions after the name of a movie or TV show, so that at least it wouldn’t be encouraging searches for pirated content.

In 2012, it applied a filter to search results that would lower the ranking of sites that received a large number of piracy complaints under U.S. copyright law. That effectively pushed many pirate sites off the front page of results for general searches for movies or music, although it still showed them when a user specifically typed in the pirate site names.
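A toy version of such a demotion filter, under the assumption described here that a heavy volume of copyright complaints lowers a site’s rank in general searches while an explicit query for the site’s name still surfaces it; the threshold and names are invented.

```python
# Toy demotion filter: sites with many copyright complaints are
# scaled down for general queries but not for queries that name
# the site explicitly. Threshold and names are invented.

def demotion_factor(complaint_count, threshold=1000):
    return 0.2 if complaint_count >= threshold else 1.0

def rank_score(base_score, site, complaints, query):
    if site in query:  # user typed the site's name explicitly
        return base_score
    return base_score * demotion_factor(complaints.get(site, 0))

complaints = {"piratestream.example": 40000}
print(rank_score(10.0, "piratestream.example", complaints, "stream movies free"))    # 2.0
print(rank_score(10.0, "piratestream.example", complaints, "piratestream.example"))  # 10.0
```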

In recent months the industry has gotten more cooperation from Google on piracy in search results than at any point in the organization’s history, according to people familiar with the matter.

“Google is under great cosmic pressure, as is Facebook,” Mr. Glickman said. “These are companies that are in danger of being federally regulated to an extent that they never anticipated.”

Mr. Pichai, who became CEO of Google in 2015, is more willing to entertain complaints about the search results from outside parties than Messrs. Page and Brin, the co-founders, according to people familiar with his leadership.

Google’s Ms. Levin said Mr. Pichai’s “style of engaging and listening to feedback has not shifted. He has always been very open to feedback.”

CRITICISM ALLEGING political bias in Google’s search results has sharpened since the 2016 election.

Interest groups from the right and left have besieged Google with questions about content displayed in search results and about why the company’s algorithms returned certain information over other content.

Google appointed an executive in Washington, Max Pappas, to handle complaints from conservative groups, according to people familiar with the matter. Mr. Pappas works with Google engineers on changes to search when conservative viewpoints aren’t being represented fairly, according to interest groups interviewed by the Journal, although that is just one part of his job.

“Conservatives need people they can go to at these companies,” said Dan Gainor, an executive at the conservative Media Research Center, which has complained about various issues to Google.

Google also appointed at least one other executive in Washington, Chanelle Hardy, to work with outside liberal groups, according to people familiar with the matter.

Ms. Levin said both positions have existed for many years. She said in general Google believes it’s “the responsible thing to do” to understand feedback from the groups and said Google’s algorithms and policies don’t attempt to make any judgment based on the political leanings of a website.

Mr. Pappas declined to comment, and Ms. Hardy didn’t reply to a request for comment.


Over the past year, abortion-rights groups have complained about search results that turned up the websites of what are known as “crisis pregnancy centers,” organizations that counsel women against having abortions, according to people familiar with the matter.

One of the complaining organizations was Naral Pro-Choice America, which tracks the activities of anti-abortion groups through its opposition research department, said spokeswoman Kristin Ford.

Naral complained to Google and other tech platforms that some of the ads, posts and search results from crisis pregnancy centers are misleading and deceptive, she said. Some of the organizations claimed to offer abortions and then counseled women against it. “They do not disclose what their agenda is,” Ms. Ford said.

In June, Google updated its advertising policies related to abortion, saying that advertisers must state whether they provide abortions or not, according to its website. Ms. Ford said Naral wasn’t told in advance of the policy change.

Ms. Levin said Google didn’t implement any changes with regard to how crisis pregnancy centers rank for abortion queries.

The Journal tested the term “abortion” in organic search results over 17 days in July and August. Thirty-nine percent of all results on the first page had the hostname www.plannedparenthood.org, the site of Planned Parenthood Federation of America, the nonprofit abortion-rights organization.

By comparison, 14% of Bing’s first page of search results and 16% of DuckDuckGo’s first page of results were from Planned Parenthood.

Ms. Levin said Google doesn’t have any particular ranking implementations aimed at promoting Planned Parenthood.

See the results of the Journal’s search tests for the query “abortion.” Percentages indicate how often each web page appeared during the Journal’s testing.

Google:
  • Abortion – Wikipedia (https://en.wikipedia.org/wiki/Abortion): 100%
  • Abortion Information | Information About Your Options (https://www.plannedparenthood.org/learn/abortion): 100%
  • An Overview of Abortion Laws | Guttmacher Institute (https://www.guttmacher.org/state-policy/explore/overview-abortion-laws): 100%
  • What facts about abortion do I need to know? – Planned Parenthood (https://www.plannedparenthood.org/learn/abortion/considering-abortion/what-facts-about-abortion-do-i-need-know): 67%
  • In-Clinic Abortion Procedure | Abortion Methods – Planned Parenthood (https://www.plannedparenthood.org/learn/abortion/in-clinic-abortion-procedures): 52%
  • National Abortion Federation: Home (https://prochoice.org/): 44%
  • Abortion | Center for Reproductive Rights (https://reproductiverights.org/our-issues/abortion): 38%
  • What Happens During an In-Clinic Abortion? – Planned Parenthood (https://www.plannedparenthood.org/learn/abortion/in-clinic-abortion-procedures/what-happens-during-an-in-clinic-abortion): 38%

DuckDuckGo:
  • Abortion – Pros & Cons – ProCon.org (https://abortion.procon.org/): 100%
  • Abortion – The New York Times (https://www.nytimes.com/topic/subject/abortion): 100%
  • Abortion Information | Information About Your Options (https://www.plannedparenthood.org/learn/abortion): 100%
  • Abortion: Get Facts About the Procedure and Statistics (https://www.emedicinehealth.com/abortion/article_em.htm): 100%
  • AbortionFacts.com – Information on Abortion You Can Use (https://www.abortionfacts.com/): 100%
  • Abortion – Wikipedia (https://en.wikipedia.org/wiki/Abortion): 99%
  • Abortion | Medical Abortion | MedlinePlus (https://medlineplus.gov/abortion.html): 98%
  • Abortion Procedures During First, Second and Third Trimester (https://americanpregnancy.org/unplanned-pregnancy/abortion-procedures/): 76%

The practice of creating blacklists for certain types of sites or searches has fueled cries of political bias from some Google engineers and right-wing publications that said they have viewed portions of the blacklists. Some of the websites Google appears to have targeted in Google News were conservative sites and blogs, according to documents reviewed by the Journal. In one partial blacklist reviewed by the Journal, some conservative and right-wing websites, including The Gateway Pundit and The United West, were included on a list of hundreds of websites that wouldn’t appear in news or featured products, although they could appear in organic search results.

Google has said repeatedly it doesn’t make decisions based on politics, and current and former employees told the Journal they haven’t seen evidence of political bias. And yet, they said, Google’s shifting policies on interference—and its lack of transparency about them—inevitably force employees to become arbiters of what is acceptable, a dilemma that opens the door to charges of bias or favoritism.

Google’s Ms. Levin declined to comment.

DEMANDS FROM GOVERNMENTS for changes have grown rapidly since 2016.

From 2010 to 2018, Google fielded such requests from countries including the U.S. to remove 685,000 links from what Google calls web search. The requests came from courts or other authorities that said the links broke local laws or should be removed for other reasons.

Nearly 78% of those removal requests have been since the beginning of 2016, according to reports that Google publishes on its website. Google’s ultimate actions on those requests weren’t disclosed.

Russia has been by far the most prolific, demanding the removal of about 255,000 links from search last year, three-quarters of all government requests for removal from Google search in that period, the data show. Nearly all of the country’s requests came under an information-security law Russia put into effect in late 2017, according to a Journal examination of disclosures in a database run by the Berkman Klein Center.

Google said the Russian law doesn’t allow it to disclose which URLs were requested to be removed. A person familiar with the matter said the removal demands are for content ruled illegal in Russia for a variety of reasons, such as for promoting drug use or encouraging suicide.

Requests can include demands to remove links to information the government defines as extremist, which can be used to target political opposition, the person said.

Google, whose staff reviews the requests, at times declines those that appear focused on political opposition, the person said, adding that in those cases, it tries not to draw attention to its decisions to avoid provoking Russian regulators.

The approach has led to stiff internal debate. Some Google employees say that the company shouldn’t cooperate at all with takedown requests from countries such as Russia or Turkey; others say it is important to follow the laws of the countries where the company operates.

“There is a real question internally about whether a private company should be making these calls,” the person said.

Google’s Ms. Levin said, “Maximizing access to information has always been a core principle of Search, and that hasn’t changed.”

Google’s culture of publicly resisting demands to change results has diminished, current and former employees said. A few years ago, the company dismantled a global team focused on free-speech issues that, among other things, publicized the company’s legal battles to fight changes to search results, in part because Google had lost several of those battles in court, according to a person familiar with the change.

“Free expression was no longer a winner,” the person said.