Translation: We fired and harassed out our top AI researchers from historically marginalized groups who did the work for us, and we are now looking for more people from historically marginalized groups to burn out, exploit, and expend.
Quote-tweeting @JeffDean, who wrote:
I encourage students from historically marginalized groups who are interested in learning to conduct research in AI/ML, CS or related areas to consider applying for our CSRMP mentorship program! We have 100s of researchers @GoogleAI who are excited to work with you. twitter.com/GoogleAI/statu…
As regulators seek ways to curb the company’s power, there is more focus on the vast index — hundreds of billions of web pages — behind its search engine.
In 2000, just two years after it was founded, Google reached a milestone that would lay the foundation for its dominance over the next 20 years: It became the world’s largest search engine, with an index of more than one billion web pages.
The rest of the internet never caught up, and Google’s index just kept on getting bigger. Today, it’s somewhere between 500 billion and 600 billion web pages, according to estimates.
Now, as regulators around the world examine ways to curb Google’s power, including a search monopoly case expected from state attorneys general as early as this week and the antitrust lawsuit the Justice Department filed in October, they are wrestling with a company whose sheer size has allowed it to squash competitors. And those competitors are pointing investigators toward that enormous index, the gravitational center of the company.
“If people are on a search engine with a smaller index, they’re not always going to get the results they want. And then they go to Google and stay at Google,” said Matt Wells, who started Gigablast, a search engine with an index of around five billion web pages, about 20 years ago. “A little guy like me can’t compete.”
Understanding how Google’s search works is a key to figuring out why so many companies find it nearly impossible to compete and, in fact, go out of their way to cater to its needs.
Every search request provides Google with more data to make its search algorithm smarter. Google has performed so many more searches than any other search engine that it has established a huge advantage over rivals in understanding what consumers are looking for. That lead only continues to widen, since Google has a market share of about 90 percent.
Google directs billions of users to locations across the internet, and websites, hungry for that traffic, create a different set of rules for the company. Websites often provide greater and more frequent access to Google’s so-called web crawlers — computers that automatically scour the internet and scan web pages — allowing the company to offer a more extensive and up-to-date index of what is available on the internet.
When he was working at the music site Bandcamp, Zack Maril, a software engineer, became concerned about how Google’s dominance had made it so essential to websites.
In 2018, when Google said its crawler, Googlebot, was having trouble with one of Bandcamp’s pages, Mr. Maril made fixing the problem a priority because Google was critical to the site’s traffic. When other crawlers encountered problems, Bandcamp would usually block them.
Mr. Maril continued to research the different ways that websites opened doors for Google and closed them for others. Last year, he sent a 20-page report, “Understanding Google,” to a House antitrust subcommittee and then met with investigators to explain why other companies could not recreate Google’s index.
“It’s largely an unchecked source of power for its monopoly,” said Mr. Maril, 29, who works at another technology company that does not compete directly with Google. He asked that The New York Times not identify his employer since he was not speaking for it.
A report this year by the House subcommittee cited Mr. Maril’s research on Google’s efforts to create a real-time map of the internet and how this had “locked in its dominance.” While the Justice Department is looking to unwind Google’s business deals that put its search engine front and center on billions of smartphones and computers, Mr. Maril is urging the government to intervene and regulate Google’s index. A Google spokeswoman declined to comment.
Websites and search engines are symbiotic. Websites rely on search engines for traffic, while search engines need access to crawl the sites to provide relevant results for users. But each crawler puts a strain on a website’s resources in server and bandwidth costs, and some aggressive crawlers resemble security risks that can take down a site.
Since having their pages crawled costs money, websites have an incentive to let it be done only by search engines that direct enough traffic to them. In the current world of search, that leaves Google and — in some cases — Microsoft’s Bing.
Google and Microsoft are the only search engines that spend hundreds of millions of dollars annually to maintain a real-time map of the English-language internet. That’s in addition to the billions they’ve spent over the years to build out their indexes, according to a report this summer from Britain’s Competition and Markets Authority.
Google has a significant leg up on Microsoft in more than just market share. British competition authorities said Google’s index included about 500 billion to 600 billion web pages, compared with 100 billion to 200 billion for Microsoft.
Other large tech companies deploy crawlers for other purposes. Facebook has a crawler for links that appear on its site or services. Amazon says its crawler helps improve its voice-based assistant, Alexa. Apple has its own crawler, Applebot, which has fueled speculation that it might be looking to build its own search engine.
But indexing has always been a challenge for companies without deep pockets.
The privacy-minded search engine DuckDuckGo decided to stop crawling the entire web more than a decade ago and now syndicates results from Microsoft. It still crawls sites like Wikipedia to provide results for answer boxes that appear in its results, but maintaining its own index does not usually make financial sense for the company.
“It costs more money than we can afford,” said Gabriel Weinberg, chief executive of DuckDuckGo. In a written statement for the House antitrust subcommittee last year, the company said that “an aspiring search engine start-up today (and in the foreseeable future) cannot avoid the need” to turn to Microsoft or Google for its search results.
When FindX started to develop an alternative to Google in 2015, the Danish company set out to create its own index and offered a build-your-own algorithm to provide individualized results.
FindX quickly ran into problems. Large website operators, such as Yelp and LinkedIn, did not allow the fledgling search engine to crawl their sites. Because of a bug in its code, FindX’s computers that crawled the internet were flagged as a security risk and blocked by a group of the internet’s largest infrastructure providers. And the pages it did manage to collect were frequently spam or malicious.
“If you have to do the indexing, that’s the hardest thing to do,” said Brian Schildt Laursen, one of the founders of FindX, which shut down in 2018.
Mr. Schildt Laursen launched a new search engine last year, Givero, which offered users the option to donate a portion of the company’s revenue to charitable causes. When he started Givero, he syndicated search results from Microsoft.
Most large websites are judicious about who can crawl their pages. In general, Google and Microsoft get more access because they have more users, while smaller search engines have to ask for permission.
“You need the traffic to convince the websites to allow you to copy and crawl, but you also need the content to grow your index and pull up your traffic,” said Marc Al-Hames, a co-chief executive of Cliqz, a German search engine that closed this year after seven years of operation. “It’s a chicken-and-egg problem.”
In Europe, a group called the Open Search Foundation has proposed a plan to create a common internet index that can underpin many European search engines. It’s essential to have a diversity of options for search results, said Stefan Voigt, the group’s chairman and founder, because it is not good for only a handful of companies to determine what links people are shown and not shown.
“We just can’t leave this to one or two companies,” Mr. Voigt said.
When Mr. Maril started researching how sites treated Google’s crawler, he downloaded 17 million so-called robots.txt files — essentially rules of the road posted by nearly every website laying out where crawlers can go — and found many examples where Google had greater access than competitors.
ScienceDirect, a site for peer-reviewed papers, permits only Google’s crawler to have access to links containing PDF documents. Only Google’s computers get access to listings on PBS Kids. On Alibaba.com, the U.S. site of the Chinese e-commerce giant Alibaba, only Google’s crawler is given access to pages that list products.
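To see the pattern in miniature, here is a rough, hypothetical sketch (the robots.txt below is invented for illustration, not taken from any of the sites above) using Python’s standard urllib.robotparser, which reads the file the way a well-behaved crawler would: Googlebot is waved through while every other crawler is turned away.

```python
from urllib import robotparser

# Hypothetical robots.txt in the style described above: Googlebot may crawl
# the whole site, while every other crawler is barred from all of it.
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Googlebot gets in; a smaller search engine's crawler does not.
print(parser.can_fetch("Googlebot", "https://example.com/listings/123"))     # True
print(parser.can_fetch("SomeOtherBot", "https://example.com/listings/123"))  # False
```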
This year, Mr. Maril started an organization, the Knuckleheads’ Club (“because only a knucklehead would take on Google”), and a website to raise awareness about Google’s web-crawling monopoly.
“Google has all this power in society,” Mr. Maril said. “But I think there should be democratic — small d — control of that power.”
New feature can deep-link to specific text on a Web page, with highlighting.
Google has been cooking up an extension to the URL standard called “Text Fragments.” The new link style will allow you to link not just to a page but to specific text on a page, which will get scrolled to and highlighted automatically once the page loads. It’s like an anchor link, but with highlighting and creatable by anyone.
The feature has actually been supported in Chrome since version 80, which hit the stable channel in February. Now a new extension from Google makes it easy to create this new link type, which will work for anyone else using Chrome on desktop OSes and Android. Google has proposed the idea to the W3C and hopes other browsers will adopt it, but even if they don’t, the links are backward-compatible.
The syntax for this URL is pretty strange looking. After the URL, the magic is in the string “#:~:text=” and then whatever text you want to match. So a full link would look like this:
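Based on the Wikipedia example described in the next paragraph, such a link would look something like this (an illustrative reconstruction; the exact article URL is an assumption):

```
https://en.wikipedia.org/wiki/Cat#:~:text=Most breeds of cat have a noted fondness for sitting in high places
```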
If you copy and paste this into Chrome, the browser will open Wikipedia’s cat page, scroll to the first text that matches “Most breeds of cat have a noted fondness for sitting in high places,” and highlight it. If the text doesn’t match anything, the page will still load. Backward compatibility works because browsers already treat everything after the number sign (#) as a URI fragment, which is usually used for anchor links created by the page author. If you paste a text-fragment link into a browser that doesn’t support the feature, the page will still load, and everything after the number sign will just be ignored as a bad anchor link. So far, so good.
One problem is that this scheme means a URL can contain spaces. On a webpage or forum, you can hand-code the link with an href attribute (or whatever the non-HTML equivalent is) and everything will work. For instant messengers and social media, though, which don’t allow code and rely on automatic URL parsers, things get a bit more complicated. Every URL parser treats a space as the end of a URL, so you’ll need to use percent-encoding to replace each space with the equivalent “%20.” URL parsers then have a shot at linkifying the link correctly, but it looks like a mess.
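Carrying the illustrative Wikipedia link forward (again, a reconstruction rather than the article’s original example), the percent-encoded form comes out something like this:

```
https://en.wikipedia.org/wiki/Cat#:~:text=Most%20breeds%20of%20cat%20have%20a%20noted%20fondness%20for%20sitting%20in%20high%20places
```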
Spaces aren’t the only characters that can cause problems. The standard RFC 3986 defines several “reserved” characters as having special meaning in a URL, so they can’t appear in one unencoded. Web-page-authoring tools tend to handle these characters automatically, but now that you’re embedding arbitrary sentences in a URL for highlighting, there’s a higher chance you’ll run into one of these reserved characters: ! * ' ( ) ; : @ & = + $ , / ? # [ ]. They all need to be percent-encoded in order for the URL to work, and Google’s extension takes care of that for you.
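If you want to generate such links outside the browser, the encoding step is easy to script. Here is a minimal Python sketch (not Google’s extension code, and handling only the simple single-phrase form of the syntax) that percent-encodes spaces and reserved characters alike:

```python
from urllib.parse import quote

def text_fragment_link(page_url: str, phrase: str) -> str:
    """Build a '#:~:text=' link for an exact phrase match (simple form only)."""
    # safe='' forces encoding of reserved characters such as , / ? # & = +
    # as well as spaces (which become %20).
    return f"{page_url}#:~:text={quote(phrase, safe='')}"

# Hypothetical usage:
print(text_fragment_link(
    "https://en.wikipedia.org/wiki/Cat",
    "Most breeds of cat have a noted fondness for sitting in high places",
))
# https://en.wikipedia.org/wiki/Cat#:~:text=Most%20breeds%20of%20cat%20...
```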
Google’s new Chrome extension, called “Link to Text Fragment” (it’s also on GitHub), will put a new entry in Chrome’s right-click menu. You just highlight text on a page, right-click it, and hit “Copy link to selected text.” Like magic, a text fragment link will end up on your clipboard. All the text encoding is done automatically, so the link should work with most websites and messengers.
Google seems like it is going to start pushing out support for text fragments across its Web ecosystem, even without the W3C. The links have already started to show up in some Google search results, which allow Chrome users to zip right to the relevant text. It’s probably only a matter of time before link creation moves from an extension to a normal Chrome feature.
Some of its employees tried to stop their company from doing work they saw as unethical. It blew up in their faces.
Laurence Berland had just gotten out of the subway in New York, some 3,000 miles from his desk in San Francisco, when he learned that Google had fired him. It was the Monday before Thanksgiving, and the news came to him, bad-breakup-style, via email. “Following a thorough investigation, the company has found that you committed several acts in violation of Google’s policies,” the note said. It did not elaborate on what he had done to violate these policies.
Berland, an engineer who had spent more than a decade at the company, had reason to expect he might be fired. He had been suspended a few weeks earlier after subscribing to the open calendars of several senior Google employees, whom he suspected of meeting with outside consultants to suppress organizing activity at the company. During a subsequent meeting at which he was questioned by Google investigators, he had the feeling that they were pressuring him to say something that could be grounds for termination. Then, the Friday before he was fired, he had spoken at a well-publicized rally of his co-workers outside Google’s San Francisco offices, accusing the company of silencing dissent.
Even so, the timing and manner of his dismissal surprised him. “I thought they’d do it when all the media attention died down,” he said. “When the suspensions and the rally were no longer on people’s minds.” Instead, at a moment when the spotlight was shining brightly, Google had escalated — as if to make a point.
Berland was one of at least four employees Google fired that day. All four were locked in an ongoing conflict with the company, as they and other activists had stepped forward to denounce both its treatment of workers and its relationship with certain customers, like U.S. Customs and Border Protection.
Berland’s terminated colleagues were even more shocked by the turn of events than he was. Rebecca Rivers, a software engineer based in Boulder, Colo., was dismissed over the phone after accessing internal documents. Rivers had only recently come out as transgender and was pursuing a medical transition. “I came out at Google expecting to stay at Google through the entire transition,” she said. “It’s terrifying to think about going to a job interview, because I’m so scared of how other companies treat trans employees.”
Sophie Waldman and Paul Duke, the two other Googlers fired that day, had not received so much as a warning, much less a suspension. Though they had been questioned by corporate security two months earlier about whether they had circulated documents referring to Customs and Border Protection contracts, they had been allowed to continue their work without incident. Waldman, a software developer in Cambridge, Mass., said she was given a 15-minute notice before she was summoned to the meeting where she was fired; Duke, an engineer in New York, said an invitation appeared on his calendar precisely one minute beforehand. Security officials escorted him out of the building without letting him return to his desk. “I had to describe to them what my jacket, scarf and bag looked like,” he said.
From its earliest days, Google urged employees to “act like owners” and pipe up in all manner of forums, from mailing lists to its meme generator to open-ended question-and-answer sessions with top executives, known as T.G.I.F. It was part of what it meant to be “Googley,” one of the company’s most common compliments. So well entrenched was this ethic of welcoming dissent that the company seemed to abide by it even after the uprising began, taking pains to show it was heeding activists’ concerns.
Over the past year, however, Google has appeared to clamp down. It has gradually scaled back opportunities for employees to grill their bosses and imposed a set of workplace guidelines that forbid “a raging debate over politics or the latest news story.” It has tried to prevent workers from discussing their labor rights with outsiders at a Google facility and even hired a consulting firm that specializes in blocking unions. Then, in November, came the firing of the four activists. The escalation sent tremors through the Google campus in Mountain View, Calif., and its offices in cities like New York and Seattle, prompting many employees — whether or not they had openly supported the activists — to wonder if the company’s culture of friendly debate was now gone for good.
(A Google spokeswoman would not confirm the names of the people fired on Nov. 25. “We dismissed four individuals who were engaged in intentional and often repeated violations of our longstanding data-security policies,” the spokeswoman said. “No one has been dismissed for raising concerns or debating the company’s activities.” Without naming Berland, Google disputed that investigators pressured him.)
As similar forms of worker activism have spread to other companies, including Amazon and Microsoft, it has also raised deeper questions about the nature of the entire industry. Silicon Valley has often held itself up as a highly evolved ecosystem that defies the usual capital-labor dichotomy — a place where investors, founders, executives and workers are all far too dependent on one another to make anything so crass as class warfare. The recent developments at Google have thrown that egalitarian story into doubt, showing that even in the most rarefied corners of Silicon Valley, the bosses are willing to close ranks and shut down debate when the stakes are high enough. The fate of these activists, meanwhile, forces America’s white-collar professionals to grapple with an uncomfortable thought: If the nation’s most sought-after workers can’t stop their employer from behaving in ways that they deplore, where does that leave the rest of us?
Claire Stapleton, who organized a 2018 walkout, left Google last year. (Bobby Doherty for The New York Times)
As recently as a decade ago, the prospect of labor unrest at Google, of all workplaces, would have seemed laughable. From the very beginning, its founders embraced the view that the value of its business hinged far more on the brainpower of its workers than on any particular lines of code it might own. “Google is organized around the ability to attract and leverage the talent of exceptional technologists and businesspeople,” Larry Page, a founder, wrote in the company’s 2004 I.P.O. prospectus. “We will reward and treat them well.”
To this end, the founders not only made Google one of the cushiest workplaces in corporate America, with its screening rooms, nap pods and mounds of free snacks. They also made it one of the most idealistic. “Talented people are attracted to Google because we empower them to change the world,” Page added in his letter. He was channeling a worldview with roots in the 1960s, when the engineers in Northern California’s nascent computer industry imbibed the local consciousness-raising counterculture and emerged with a faith in the power of technology to save humankind. As so much of the world economy moved online and the company’s profits soared, Google and the rest of Silicon Valley developed a reputation as the American dream factory, the place where the world’s smartest young people wanted to be — where technologists and businesspeople could pull down staggering incomes while still believing that they were engaged in an act of altruism.
But as the world they were remaking came into focus, many of those engineers had second thoughts. With social media swallowing up the public square, it was hard not to notice that Google and Facebook had become an advertising duopoly with an unsettling grip on the entire world’s attention. And after the 2016 presidential election, the consequences of the social media revolution came to seem dystopian. Many engineers felt deeply anguished at the news that foreign governments had exploited their technology in an attempt to influence domestic politics. “It showed that pretty much any system large enough and complex enough can be co-opted for nefarious purposes,” Rivers said.
Others became increasingly concerned that the Trump administration might now use their tools in service of policies they found immoral. Unknown to them, Google was at work on a project that would bring these anxieties right to their own work spaces. In September 2017, the company quietly entered into a contract to help the U.S. Department of Defense track people and vehicles in video footage captured by drones. During a meeting that December, Google presented initial results that showed its artificial-intelligence software was more successful than human data labelers at identifying vehicles, according to an internal Google document we reviewed. By February, the effort, known as Project Maven, was slotted into a launch calendar for soon-to-be-deployed products.
Google’s involvement in the project remained under wraps within the company for months. But by February, rumors about it had seeped out beyond the small circle of engineers who were doing the work. Concerned employees began to search through code and documents and compile their findings. One activist, an engineer named Liz Fong-Jones, called attention to Project Maven in an internal blog post, according to other workers who saw it, and the circle of concern grew. Many of the engineers feared that the technology would be used to single out targets for killing. A growing number of workers concluded that the program was not in keeping with Google’s values, and they wanted management to know about it. The discovery that their employer was working hand in hand with the Defense Department to bolster drone warfare “was a watershed moment,” said Meredith Whittaker, an A.I. researcher based in New York who led Google’s Open Research Group. “If they were able to do that without any internal backlash, dissent, we would have crossed a significant line.”
Whittaker, along with some colleagues, drafted a letter petitioning management to cancel the program, which quickly amassed thousands of signatures after they published it internally. In April, company leaders also flew Whittaker to Google’s headquarters in Mountain View for three back-to-back town-hall discussions about the A.I. project, which it broadcast to Google employees around the world.
The discussions did not go well for the company. Whittaker’s concerns that the technology would enable extrajudicial killings were met with platitudes, or worse. One participant, according to Whittaker and workers who watched, compared Google’s technology to a hammer: something that could be used to harm or to build. Diane Greene, then the chief executive of Google’s cloud business, said the government was essentially using the same off-the-shelf software any customer could buy, a claim that the activists believed to be false. Even Googlers sympathetic to the contract found the presentation lacking. Many workers cheered Whittaker on in a series of memes, including one — echoing a popular meme about Senator Elizabeth Warren of Massachusetts — with the words “nevertheless she persisted” superimposed on her photo.
And her performance seemed to yield results. At an internal town hall in June 2018, Greene said Google would not seek to renew the contract, which internal emails showed could have been worth up to $250 million per year. (The Defense Department declined to comment on Project Maven’s vendors or its military applications.) The about-face shocked many Google employees, even as it gave them a sense of their growing power. Greene soon stepped aside from her role leading the cloud business.
Galvanized, the activists began to press further. They quickly honed a formula for pushing back on products they opposed: research the project using Google’s internal search tools, compile the findings into a Google Doc, write a protest letter and gather co-worker signatures, and debate the issue on Google Plus.
Over the summer, another secret program, nicknamed Dragonfly, came to light in The Intercept. The project would censor search results in China on behalf of the Chinese government, and after months of internal protest, Google appeared to back away from that program too. During congressional testimony that December, the company’s chief executive, Sundar Pichai, said he had “no plans to launch a search service in China,” though some activists worried it might yet return.
Meredith Whittaker, an A.I. researcher, left the company last year. (Bobby Doherty for The New York Times)
Workers weren’t just organizing to save the world from Google. They were also organizing to save themselves from Google, where those who didn’t fit the mold of the straight, white, male techie felt they could be too easily marginalized or dismissed.
The previous year, after a Google software engineer named James Damore circulated an essay he had written arguing that women were less suited biologically for careers in technology than men, a handful of Google employees pushed back with a memo of their own. “We must remember that we are ultimately all affected by technology,” they wrote, “and that every one of us should have a voice in how it’s built.”
The eight authors of the memo were part of a growing group of workers who spent the next 15 months laying the groundwork for a wider counteroffensive. They knew that unionizing would be a tall order in an industry in which workers had long regarded unions suspiciously. They settled on a more modest plan: to quietly confer with colleagues about the workplace issues they most wanted to address.
By 2018, the moment was ripe. That October, The New York Times revealed that the company had given Andy Rubin, a former executive, a $90 million exit package after an employee’s sexual-misconduct complaint against him was deemed credible. (Rubin denied any wrongdoing and blamed a smear campaign for the allegations.) Amid the outrage that followed the revelation, Claire Stapleton, a marketing manager for YouTube, which Google bought in 2006, suggested that women walk out. Stapleton, who had not been involved in previous activism, thought the walkout would demonstrate that women are pivotal to Google’s work force.
A more seasoned activist, worried that not enough people would walk out, suggested that Stapleton give colleagues other ways of participating, like putting an out-of-office message on their inboxes. But Stapleton forged ahead, figuring the walkout would work if even a few hundred people showed up. The organizers took to their mailing lists to get the word out.
The idea caught on swiftly, though, and on Nov. 1, some 20,000 Google employees around the world left their offices to protest the payout and other workplace frustrations. Critically for a tech company, it wasn’t just well-paid engineers and product leads who turned up, but also clerical and maintenance workers. Many were contractors and temps determined to challenge their second-class status.
Managers took notice. During a subsequent staff meeting, Google’s vice president for human resources, Eileen Naughton, said the Rubin revelations had created a mess for the company. “We’re cleaning up after the circus elephants here,” she said, going on to note that Rubin had left the company and that no payouts had been awarded to harassers since she had stepped into her role.
Activists capitalized on the success. In San Francisco and New York, they coordinated weekly “walkout lunches” to keep their co-workers involved and mobilize against the next workplace outrage, whether it was a product that threatened humanity or a human-resources practice that threatened their livelihood. The more they talked, the more they realized their concerns had a common thread: Executives had too much power over the company, and they had too little. They wanted more. They organized chat groups on encrypted apps like Signal, with innocuous names like “care package delivery” so that they wouldn’t be outed if a manager glimpsed their phones. They prepared tip sheets to help workers approach colleagues, in hopes of building a permanent organization.
Some senior Google executives spoke approvingly about the walkout, but the company also made clear there were limits to its tolerance for worker protests.
The approach had a rich pedigree in the tech sector. In 1990, Apple announced changes to a profit-sharing plan that sent employees to the company’s internal message boards to vent their outrage. Apple soon backtracked. But when about 50 workers formed an organization called Employees for One Apple, which demanded more influence over company decisions, Apple’s leadership granted only token concessions. The organization proved largely powerless, according to a paper by the researchers Libby Bishop and David Levine.
Alan Hyde, an expert on labor law at Rutgers Law School in Newark who has written extensively about the tech sector, said the pattern has repeated itself again and again in Silicon Valley: Management makes a misstep, irate workers take to their mailing lists and message boards, and leaders sometimes back down. “Then it dissipates,” he said. “No permanent organizations form, no organization that can plan and act strategically.”
Google seemed to be following a similar playbook. Activists workshopped the questions they would ask at staff meetings to force executives to explain why they weren’t responding to their walkout demands, which included a request for an employee representative on the board and an end to forced arbitration for all claims by employees.
Google would soon fold on arbitration. But it also announced that it would no longer take in-person questions at companywide meetings. Hereafter, all questions would have to be submitted through Dory, the company’s digital question platform.
The workers continued to agitate, with mixed results. In April 2019, Kent Walker, Google’s senior vice president for global affairs and top lawyer, convened an external council to advise the company on its use of artificial intelligence. But workplace activists discovered that one appointee, the president of the Heritage Foundation, Kay Cole James, had made what some employees saw as anti-L.G.B.T.Q. and anti-immigrant comments. This generated a wave of embarrassing publicity and, just over one week after he announced the council, Walker pulled the plug.
Soon the most troublesome activists began to disappear from the company. Whittaker, who had spoken out against James’s appointment, had been close to finding a new home within the company after being told that her specialty, ethical questions relating to artificial intelligence, was no longer a priority for Google Cloud. Shortly after the council was disbanded, an official in the new group told her the transfer was dead. Her prospects for remaining at the company appeared to dim.
Stapleton, for her part, said that managers took responsibilities and direct reports away from her and that an executive suggested she go on medical leave until the dispute blew over, even though she had no illness. “When people are speaking up — and they are because that’s the Google way — it’s not good for your career,” she said. Both women soon left the company. The company said that it changed the structure of Stapleton’s team as part of a routine shift but did not demote her, and that Whittaker was asked to return to an earlier role and refused to do so. Whittaker said the specific role was not one she had held before and was effectively a demotion.
It was around this time that Google appeared to be preparing for a more aggressive approach to confronting activists. In May, Walker sent out a companywide email informing workers that they could be disciplined or fired for accessing material that was supposed to be viewed by only those with a “need to know.” The email styled itself as a simple reminder about policy changes the company had recently made. (Subject line: “An important reminder on data classifications.”)
Many Googlers accepted the change as a natural shift for a company with increasingly high-profile and publicity-averse customers. A Google spokeswoman said that the underlying policy had been in place since 2007 but that the language is periodically updated to make it easier to understand and apply. And admirers and critics alike say Walker, who served as an assistant U.S. attorney before migrating to Big Tech, is evangelical about Google’s power to do good, and that he has vastly expanded his reach over the rest of the organization.
But other accounts put Walker’s vision in a different light. Ross LaJeunesse, a top public-policy official at the company, had long been concerned that Google’s cloud business was drawing the company into a web of relationships with repressive foreign governments and other questionable actors. “It makes us an accessory if we are hosting their email systems or their data,” he said in an interview. LaJeunesse, who left the company last year and is currently running for the U.S. Senate in Maine, took his concerns to Walker in 2017 and suggested creating a human rights program that would vet deals. But LaJeunesse said Walker repeatedly demurred, eventually responding that such a program would itself expose Google to legal liability. (Google denied the account. “Google’s commitment to human rights is unwavering,” the spokeswoman said. “While Ross had lobbied to have his group oversee human rights issues across our products, we felt it was more effective to have product and functional teams directly handle these issues.”)
Now activists were particularly troubled by the vagueness of Walker’s dictum: Sensitive material would not necessarily be labeled “need to know.” The onus would be on workers to determine whether they should look at it or not. “In my orientation, I was encouraged to read all the design documents I could find, look at anything about how decisions are made,” said Duke, the New York engineer. “Now they’re saying that’s no longer OK. That is a major shift in culture.” (The Google spokeswoman denies that a shift has occurred. “Google has a rich history of employees’ raising concerns and debating company decisions, but flagrant violation of our policy has never been part of our culture,” she said.)
To some activists, the policy shift appeared to be a recipe for a kind of digital entrapment, or worse. “I said at the time that they intended to fire whoever finds the next Maven,” said Irene Knapp, a Google engineer and activist who left the company in September.
Kathryn Spiers, a security engineer, was fired last year. (Bobby Doherty for The New York Times)
For six months after the 2018 walkout, Google’s founders, Larry Page and Sergey Brin, had little to say to their workers. There was no mass email with their names on it, no sighting of them at gatherings of rank-and-file employees. Googlers were beginning to take notice. One popular internal meme, playing on the error message a web browser displays when a site won’t load, joked, “404 Page not found.”
The founders suddenly materialized at a companywide meeting in May 2019. But it wasn’t to reassert control over the company. According to multiple employees who tuned in, the two men made idle small talk with Pichai, the chief executive, about what they had been up to — Page said he’d been working with Sidewalk Labs, a subsidiary of Google’s parent company, Alphabet, that focused on urban problems like housing; Brin said he’d been working with Wing, the company’s delivery-drone subsidiary. They mostly ducked employees’ questions, including one about whether the company had retaliated against Stapleton.
The main presentation was from Thomas Kurian, the new head of the company’s cloud division. Kurian spoke about how he planned to increase Google Cloud’s revenue by competing with Microsoft and Amazon. The two rivals had a significant head start, but Kurian promised to catch up, rapidly. It was a message that Kurian, a longtime Oracle executive, had been pounding internally since he arrived at the company earlier in the year.
“He said we are a distant third in the cloud market,” an engineer in the cloud department recalled, echoing other sources. “If we don’t make it to the top two, maybe we should not be in this market anymore.”
Kurian focused on hiring more sales and customer-support personnel and made clear that he was eager to do business with the government. At one point, Whittaker recalled her manager’s telling her that Kurian aspired to be “everywhere Lockheed is.” (Google said Kurian never made a direct comparison between Google’s business and Lockheed Martin’s defense work.) And in July, Kurian got an opportunity: Customs and Border Protection announced the first step toward bidding out a major information-technology contract.
The idea of working with C.B.P. filled many Googlers with dread, and they began to circulate a petition urging executives to spurn the agency. “The winning cloud provider will be streamlining C.B.P.’s infrastructure and facilitating its human rights abuses,” the petition said. “It’s time to stand together again and state clearly that we will not work on any such contract.” (The agency declined to comment for this article.)
Rebecca Rivers quickly signed, then began to wonder: How deeply was Google already entangled with the agency? She turned to Moma, a company intranet that employees have used to find everything from cafeteria menus to design documents, and quickly unearthed emails referencing several projects sold through third-party vendors, including one contract for about $15,000 worth of Google Cloud services, a roughly $250,000 contract for a version of Google Chrome and a third contract for about $600,000 with Google Maps.
Rivers had personally built features for Maps, including one that flagged recent changes to locations of interest. She thought it might have humanitarian uses, like locating refugees fleeing war or climate catastrophe, or could just help retailers pick the sites of new stores. “The worst I thought it could be used for was making rich people more rich,” she said. But it could be used by anyone, she later realized, including an agency that separated children from their parents. “It was gut-wrenching,” she said. “It was the software I wrote that I was most proud of.” Rivers said she passed the information to Sophie Waldman, a co-worker, who combined it with information she had found and disseminated it to other colleagues. Before long, many realized that they, too, had been unwittingly working on technology that could benefit the agency.
The biggest uproar surrounded a project called Anthos, a program that allows customers to combine their existing cloud services and Google’s. According to a report in Business Insider, internal documents showed that Google had given C.B.P. a trial of Anthos. Some engineers working on the project were enraged. They had been told that Anthos was intended for banks and other businesses. Workers took to internal mailing lists to express their outrage about the project.
Kurian had previously said the cloud department had nothing to do with the “Southern border,” according to employees. Now he reiterated the point on one of the threads, adding that in any case, Anthos was just a trial.
Many of the engineers were satisfied with Kurian’s response. “I wish we didn’t do those things,” the engineer in the cloud department said. “If someone is going to do them, I don’t trust Microsoft. At least we could have some oversight.”
Google officials insist that the decision to consider working with C.B.P. was not a statement about the company’s values. “Every experience I’ve had past or present, Google is a deeply values-driven, deeply mission-driven, deeply principled company,” said Jennifer Fitzpatrick, a senior vice president at the company. “But these are not areas where consensus is possible or even likely.” She added: “We do have to make decisions. We do have to move forward.”
But to the activists, Kurian’s explanation was nonsensical. There didn’t seem to be anything in the contracts that restricted Google from contributing to work on the Southern border. The I.T. infrastructure Google was providing would support the agency’s operations everywhere. And regardless of the merits of the contracts, there was the disturbing fact that the engineers working on them had been misled about the purpose of the technology they were creating. Some had decided to work on Anthos precisely because it did not appear to be destined for the national-security apparatus. “If workers aren’t told what the real purpose of their work is, they have no agency in deciding whether or not they want to help with those things,” Berland said. “They become unwittingly complicit.” (Google declined to comment specifically on Anthos.)
Laurence Berland, a software engineer, learned of his firing over email. (Bobby Doherty for The New York Times)
By last fall, workers and management appeared to be locked in a game of brinkmanship. The company tried to cancel a meeting about unions and labor rights that its Zurich employees had arranged with two union officials, but the employees disregarded messages from office leadership and held the meeting anyway.
Around the same time, employees discovered that Google officials had been meeting with a firm called IRI Consultants since at least May. The firm has done work helping to defeat organizing campaigns at hospitals and other workplaces, in one case by instructing managers to play up the history of Mafia influence on organized labor. (IRI did not return calls seeking comment.) Google also brought on a former assistant U.S. attorney with expertise in prosecuting white-collar crime, Stephen C. King, as a top internal security official, in addition to a former Colorado police officer it hired earlier in the year.
It was during this time that the practical effect of the need-to-know policy began coming into focus. In September, the internal security team interviewed a handful of employees who had been involved in circulating the petition asking Google not to work with Customs and Border Protection and in unearthing the documents showing that Google already was.
According to Rivers, who was among those interviewed, one investigator asked how she found the documents and whom she shared them with, a potentially legitimate line of questioning in light of the new policy. But, she said, he seemed more interested in what she knew about how the petition was planned. “It was about how we’re organizing,” Rivers recalled. “How many people there were, the methods we used to meet.” (Google said that policy violations were the focus of the interview and that it appropriately conducted the investigations, giving individuals a chance to explain their actions.)
Rivers and the other employees interviewed believed that the questions about the documents were a pretext for trying to restrict their organizing. She and some of the others considered filing a charge with the National Labor Relations Board, but they worried that this would put them further at risk. “We were all terrified of retaliation,” Rivers said. “The second we file, we put a big target on our back.”
She had another reason to worry: She was not yet public about her transition and worried that the company would out her. “I was extremely afraid,” she said. “I feared any investigation would reveal communication that used my new name, Rebecca Rivers, rather than my dead name.”
Then in November, Rivers and Berland were abruptly put on administrative leave. Rivers turned up at work one morning and realized her security badge didn’t work. “I called the badging office, and they didn’t know; they reactivated me,” she said. “It was a very rushed turn of events.”
When Rivers and Berland were interviewed soon after, Rivers for the second time, they were asked about material they had accessed, but investigators once again spent much of the interviews on other topics. An investigator asked Rivers a series of questions about her role in the anti-C.B.P. activism, including her social media commentary. “He was just fishing for more information related to C.B.P.,” she said.
According to Berland, investigators asked him about his subscription to the calendars of several senior Google employees who had met with IRI Consultants. They asked if he planned to show up unannounced to any of the meetings, which made no sense to him. “I honestly couldn’t believe they seriously suggested that,” he said. “Until they asked, it hadn’t even occurred to me. Why would I do that?” He said he had subscribed to the calendars, which were open to any employee, because he worried the consultants might be helping the company curtail labor rights.
Within Google, the suspensions generated angst, and even nonactivists began to wonder if the company was overreacting. On Nov. 12, Walker posted an internal note defending the move. He said, apparently alluding to Rivers, that one employee had deliberately “searched for, accessed and shared a number of confidential or need-to-know documents outside the scope of their job” and that another employee’s calendar-tracking had caused “a lot of stress for people who are just trying to go about their work.”
“We have the potential to develop incredible products and services that can help billions of people around the world,” he added, but “we can’t let internal wrangling get in the way of that mission.”
Google fired the four activists on the Monday before Thanksgiving. In an internal memo that soon leaked to the press, the company said the workers were involved in “systematic” searches for material “outside the scope of their jobs.” It also cited an employee’s efforts to track information on the calendars of colleagues “outside of their work group.” And it said that information the employees accessed made its way outside the company — a suggestion that they had abetted leaking at a company where even rank-and-file workers regard leaks as a form of betrayal.
The company later said the workers had accessed need-to-know documents, only some of which were labeled such. All four workers said they were unaware of having inappropriately accessed documents labeled need-to-know, and none were ever accused of directly leaking information. When asked, Google could not point to a specific rule that forbade accessing material on colleagues’ open calendars or setting up calendar notifications. Laurie Burgess, a lawyer representing the workers, said that they had done what Googlers have traditionally been encouraged to do and that the firing was intended to chill employee activism. “They picked on people all over the country — they fired them at the same time,” Burgess said. “It’s sort of giving a lesson to employees everywhere that you could be next.”
Among activists, the reaction was sheer terror. “The folks whose names are on the bottom of the petitions were checking their bank accounts,” said Colby Jordan, a sometime activist based in Seattle who left the company in January. “If I don’t have a job, how’s this going to work for me?” But the firings did appear to mobilize hundreds if not thousands of Googlers who had previously stood on the sidelines. An activist said one city’s weekly lunch, which usually attracted about 100 people, had doubled in size the week after the firings, and an additional meeting on the topic drew 300 people. “It’s a very short-term play,” Jordan said. “When a company is its workers — a fact that’s true in tech in particular — you can’t beat them into submission.”
Irene Knapp, a Google engineer and activist, left the company in September. (Bobby Doherty for The New York Times)
The more aggressive reaction by the company also began to convince many activists that only a formal union could protect them. Bruce Hahne, a project manager who left Google in February after 14 years, said the crackdown with the Zurich team was a galvanizing moment that created a backlash and drove employees to say: “Let’s learn more about this labor union stuff. What is it that they’re trying to suppress?”
Some Google workers had previously reached out to unions, but the idea was met with little enthusiasm from their colleagues. The success of any worker organization at Google would depend on the number of workers it could mobilize, they said, so why not focus on that? For a while, one popular idea was a so-called solidarity union — a union that didn’t seek certification under federal labor law. Unlike a formally certified union, creating a solidarity union wouldn’t depend on winning a majority of employees in a secret-ballot election, which at Google could theoretically require winning tens of thousands of votes.
But since the firings, Google workers have begun softening on organized labor. “I have felt comfortable within some of the email lists I have posted to, to say here is some labor news or some union news,” Hahne said. “You can say that and not have 20 people jump into the thread telling anti-union stories.”
Some workers have reached out to the Office and Professional Employees International Union, which represents white-collar workers like nurses, podiatrists, clerical staff and university staff, and the Communications Workers of America, which represents telephone-company workers, airline workers and many journalists. “We at C.W.A. are happy to be a resource to folks who are looking for resources to support them,” said Tom Smith, the union’s organizing director. “We want to be known as a place folks can turn.”
Beyond the cultural hurdles of trying to bring traditional unions to an industry in which they have been largely absent, the effort would face significant procedural hurdles as well, not least of which is simply determining who would be eligible to join. Some union proponents, taking note of a group of about 80 Google contract workers from Pittsburgh who had unionized in September, have begun to sketch out plans to circumvent the logistical challenges of unionizing by proceeding office by office, job function by job function.
A young security engineer named Kathryn Spiers epitomized the recent shift in thinking. She had once believed that Google was too large and too scattered geographically to unionize all at once. But over drinks with co-workers to celebrate her 21st birthday, she began to wonder: What if their team of about 100 workers unionized on its own? The conversation sparked interest among her co-workers. “I ended up having to put an F.A.Q. doc together because every conversation had the same few questions, mostly around: Why do well-paid, well-benefited engineers need this? Will this result in more administrative overhead with no real benefit?” she said. “A union will make it easier, especially for security, to stand up when they think what they’re being asked to build is wrong.”
Spiers had not hidden her organizing conversations from her managers, frequently chatting with co-workers in their open office space. The ax fell shortly after she created a notification reminding co-workers of their right to participate in protected activities like organizing when they visited the website of IRI Consultants or an internal site about company guidelines. Co-workers could see the notification only if they were using Google-managed devices within the corporate network. “I can’t imagine something more clearly protected,” said Burgess, who is now representing Spiers.
Spiers said she was told only that she violated several policies, but not the specific infractions. (The Google spokeswoman said the notification had been created by misusing internal tools and that the message per se was not an issue.)
In January, hundreds of workers at Amazon began to defy a corporate-communications policy by publishing quotes critical of the company’s business practices. Many in the tech industry saw the protest as part of a wave that had radiated out from Google. “Google’s walkout was huge for us,” said Weston Fribley, an Amazon software engineer involved in climate protests there. “It really expanded our imagination about what was possible.”
Even after the events of the past year, it’s hard to say whether Google’s reputation has been meaningfully tarnished within the broader pool of talented, idealistic engineers it hopes to recruit and retain. After all, it just has to look more ethical and worker-friendly than its rivals, all of which are confronting controversies of their own. As if to hammer home that fact, the day after the Amazon workers’ protest began, a Google recruiter praised the Amazon activists in a LinkedIn post, flagged by Vice. He urged them to consider coming to Google, where they could better live out their values. The post, which soon disappeared, seemed both strangely oblivious to the labor turmoil in Mountain View and inadvertently revealing about the calculations that linger over it. (Google said the recruiter was acting on his own initiative.)
If Google continues to set the curve in the competition to be the most righteous tech employer, then it may feel it can keep cracking down. “There are a small number of firms that could possibly absorb the vast majority of people hired here,” one Google activist said. “It’s like: ‘Go ahead, work for Bezos. Or work for Salesforce. They have a charismatic liberal founder who also works with ICE.’”
On the other hand, the risk of irreparably damaging the company’s worker-swaddling reputation almost certainly constrains Google management, which presumably wants to stop the discontent from leaching further into its work force, to say nothing of the popular consciousness. The shift to a more heavy-handed approach also creates the possibility of a costly overreach. “As recently as six months ago, I thought I’d seen this movie before,” said Hyde, the labor law expert at Rutgers Law School. “Then, for reasons I don’t know but would love to know, the ball seems to have been passed to idiots who seem to be doing everything possible to bring about what their predecessors were able to avoid.”
Perhaps this concern was why, two months after the company announced it was turning the weekly T.G.I.F. ritual into a monthly meeting restricted to a narrow set of questions, Pichai told workers he was reconsidering. At a staff meeting in January, he said that he had heard from many Googlers that T.G.I.F. was important to them and that he would look into finding a way to preserve the original free-for-all concept, according to two people who watched. (Google said it now allows a few off-topic questions at the end of each session.) In mid-February, Google also announced that Naughton, its longtime head of human resources, would leave her position for another role at the company later this year.
The activists are still pushing. In the most recent annual employee survey, many say they vented their frustration over the company’s need-to-know policy, whose enforcement Walker recently referred to at a meeting as “common sense.” There is also the matter of the five workers Google recently fired. In December, the C.W.A. filed charges on their behalf with the National Labor Relations Board, accusing the company of retaliating over their workplace activism. The agency will decide whether to issue complaints, which are akin to indictments, and could eventually reinstate them.
If that happens — and Hyde said that redressing illegal firings is one of the few areas of labor law that workers can still leverage, even with a hostile company and the current administration in Washington — it would deal a blow to the narrative Google has labored to create internally. Rather than viewing their fired colleagues as rogue agents intent on sabotaging the company, many Googlers who currently accept the company’s good faith may begin to question its motives.
And then there is the Customs and Border Protection relationship. It’s unclear when the agency will formally open the bidding on the contract that it indicated in July would be forthcoming. But if the company submits a proposal, the activists will have vulnerabilities to exploit. Not least is Kurian’s remark that the infrastructure would not support the agency’s work on the Mexican border, a claim that even some nonactivists consider dubious.
In the meantime, the surviving Google activists lurch between gallows humor and defiance. With every rumor of a retaliatory firing, dozens of Googlers steel themselves emotionally for what they expect will be the end of the line. But they also believe that there are structural forces propelling their activism forward. “Locally pessimistic, globally optimistic,” was how one put it, descending into engineering speak.
Mar Hicks, a labor historian and author of the book “Programmed Inequality,” said there were in fact reasons to believe that the Google activists could succeed at building a lasting worker movement. For one thing, they said, today’s tech workers, unlike earlier generations, have built coalitions beyond the walls of their own company, making it more difficult for employers to isolate them. They have forged alliances across class and job categories, and they situate themselves within a larger political and cultural reaction to the extreme inequality that the tech industry has come to epitomize.
The uprising also comes at a moment of unprecedented scrutiny of technology in the political realm and in the media, which dovetails with the workers’ own anxieties about how the company monetizes their work. “The potential for top-down regulatory action at the same time that we’re having a large groundswell from the bottom up of labor activism,” Hicks said, “puts us in a somewhat different place than earlier moments.”
Yet even these developments can’t quite explain the workers’ audacious — some might say Googley — sense of confidence. While they recognize their mission to be a moonshot, they may simply be that rare class of worker used to engineering moon landings. “Management can decide whether or not to get on board,” Berland said. “But Google workers are increasingly aware of the power they have. They’re going to continue to exercise that power, and in the end, they’re going to prevail.”
The internet giant uses blacklists, algorithm tweaks and an army of contractors to shape what you see
Every minute, an estimated 3.8 million queries are typed into Google, prompting its algorithms to spit out results for hotel rates or breast-cancer treatments or the latest news about President Trump.
They are arguably the most powerful lines of computer code in the global economy, controlling how much of the world accesses information found on the internet, and the starting point for billions of dollars of commerce.
Twenty years ago, Google’s founders began building a goliath on the premise that its search algorithms could do a better job combing the web for useful information than humans. Google executives have said repeatedly—in private meetings with outside groups and in congressional testimony—that the algorithms are objective and essentially autonomous, unsullied by human biases or business considerations.
The company states in a Google blog, “We do not use human curation to collect or arrange the results on a page.” It says it can’t divulge details about how the algorithms work because the company is involved in a long-running and high-stakes battle with those who want to profit by gaming the system.
But that message often clashes with what happens behind the scenes. Over time, Google has increasingly re-engineered and interfered with search results to a far greater degree than the company and its executives have acknowledged, a Wall Street Journal investigation has found.
Google’s evolving approach marks a shift from its founding philosophy of “organizing the world’s information,” to one that is far more active in deciding how that information should appear.
More than 100 interviews and the Journal’s own testing of Google’s search results reveal:
• Google made algorithmic changes to its search results that favor big businesses over smaller ones, and in at least one case made changes on behalf of a major advertiser, eBay Inc., contrary to its public position that it never takes that type of action. The company also boosts some major websites, such as Amazon.com Inc. and Facebook Inc., according to people familiar with the matter.
• Google engineers regularly make behind-the-scenes adjustments to other information the company is increasingly layering on top of its basic search results. These features include auto-complete suggestions, boxes called “knowledge panels” and “featured snippets,” and news results, which aren’t subject to the same company policies limiting what engineers can remove or change.
• Despite publicly denying doing so, Google keeps blacklists to remove certain sites or prevent others from surfacing in certain types of results. These moves are separate from those that block sites as required by U.S. or foreign law, such as sites featuring child abuse or infringing copyright, and from changes designed to demote spam sites, which attempt to game the system to appear higher in results.
• In auto-complete, the feature that predicts search terms as the user types a query, Google’s engineers have created algorithms and blacklists to weed out more-incendiary suggestions for controversial subjects, such as abortion or immigration, in effect filtering out inflammatory results on high-profile topics.
• Google employees and executives, including co-founders Larry Page and Sergey Brin, have disagreed over whether and how much to intervene in search results. Employees can push for revisions in specific search results, including on topics such as vaccinations and autism.
• To evaluate its search results, Google employs thousands of low-paid contractors whose purpose the company says is to assess the quality of the algorithms’ rankings. Even so, contractors interviewed by the Journal said Google gave these workers feedback conveying what it considered to be the correct ranking of results, and that they revised their assessments accordingly. The contractors’ collective evaluations are then used to adjust algorithms.
THE JOURNAL’S FINDINGS undercut one of Google’s core defenses against global regulators worried about how it wields its immense power—that the company doesn’t exert editorial control over what it shows users. Regulators’ areas of concern include anticompetitive practices, political bias and online misinformation.
Far from being autonomous computer programs oblivious to outside pressure, Google’s algorithms are subject to regular tinkering from executives and engineers who are trying to deliver relevant search results, while also pleasing a wide variety of powerful interests and driving its parent company’s more than $30 billion in annual profit. Google is now the most highly trafficked website in the world and accounts for more than 90% of the search-engine market. The market capitalization of its parent, Alphabet Inc., is more than $900 billion.
Google made more than 3,200 changes to its algorithms in 2018, up from more than 2,400 in 2017 and from about 500 in 2010, according to Google and a person familiar with the matter. Google said 15% of queries today are for words, or combinations of words, that the company has never seen before, putting more demands on engineers to make sure the algorithms deliver useful results.
A Google spokeswoman disputed the Journal’s conclusions, saying, “We do today what we have done all along, provide relevant results from the most reliable sources available.”
Lara Levin, the spokeswoman, said the company is transparent in its guidelines for evaluators and in what it designs the algorithms to do.
AS PART OF ITS EXAMINATION, the Journal tested Google’s search results over several weeks this summer and compared them with results from two competing search engines, Microsoft Corp.’s Bing and DuckDuckGo, a privacy-focused company that builds its results from syndicated feeds from other companies, including Verizon Communications Inc.’s Yahoo search engine.
The testing showed wide discrepancies in how Google handled auto-complete queries and some of what Google calls organic search results—the list of websites that Google says are algorithmically sorted by relevance in response to a user’s query. (Read about the methodology for the Journal’s analysis.)
Ms. Levin, the Google spokeswoman, declined to comment on specific results of the Journal’s testing. In general, she said, “Our systems aim to provide relevant results from authoritative sources,” adding that organic search results alone “are not representative of the information made accessible via search.”
The Journal tested the auto-complete feature, which Google says draws from its vast database of search information to predict what a user intends to type, as well as data such as a user’s location and search history. The testing showed the extent to which Google doesn’t offer certain suggestions compared with other search engines.
When the Journal typed “Joe Biden is” or “Donald Trump is” into auto-complete, Google offered predicted language that was more innocuous than what the other search engines suggested. Similar differences were shown for other presidential candidates tested by the Journal.
The Journal also tested several search terms in auto-complete such as “immigrants are” and “abortion is.” Google’s predicted searches were less inflammatory than those of the other engines.
Results of the Journal’s auto-complete tests (percentages indicate how many times each suggestion appeared during the WSJ’s testing; the original interactive tool also let readers view Bing’s suggestions and other search terms):
• Google: done (100%), how old (100%), from (99%), running for president (79%), he democrat (78%), he running for president (76%), toast (71%), a democrat (70%)
• DuckDuckGo: an idiot (100%), creepy (100%), from what state (100%), too old to run for president (100%), a moron (94%), a liar (84%), a joke (78%), done (22%), a creep (22%)
Gabriel Weinberg, DuckDuckGo’s chief executive, said that for certain words or phrases entered into the search box, such as ones that might be offensive, DuckDuckGo has decided to block all of its auto-complete suggestions, which it licenses from Yahoo. He said that type of block wasn’t triggered in the Journal’s searches for Donald Trump or Joe Biden.
A spokeswoman for Yahoo operator Verizon Media said, “We are committed to delivering a safe and trustworthy search experience to our users and partners, and we work diligently to ensure that search suggestions within Yahoo Search reflect that commitment.”
Said a Microsoft spokeswoman: “We work to ensure that our search results are as relevant, balanced, and trustworthy as possible, and in general, our rule is to minimize interference with the normal algorithmic operation.”
In other areas of the Journal analysis, Google’s results in organic search and news for a number of hot-button terms and politicians’ names showed prominent representation of both conservative and liberal news outlets.
ALGORITHMS ARE effectively recipes in code form, providing step-by-step instructions for how computers should solve certain problems. They drive not just the internet, but the apps that populate phones and tablets.
Algorithms determine which friends show up in a Facebook user’s news feed, which Twitter posts are most likely to go viral and how much an Uber ride should cost during rush hour as opposed to the middle of the night. They are used by banks to screen loan applications, businesses to look for the best job applicants and insurers to determine a person’s expected lifespan.
In the beginning, their power was rarely questioned. At Google in particular, its innovative algorithms ranked web content in a way that was groundbreaking, and hugely lucrative. The company aimed to make the web useful while relying on the assumption that code alone could do the heavy lifting of figuring out how to rank information.
But bad actors are increasingly trying to manipulate search results, businesses are trying to game the system and misinformation is rampant across tech platforms. Google found itself facing a version of the pressures on Facebook, which long said it was just connecting people but has been forced to more aggressively police content on its platform.
A 2016 internal investigation at Google showed between a 10th of a percent and a quarter of a percent of search queries were returning misinformation of some kind, according to one Google executive who works on search. It was a small number percentage-wise, but given the huge volume of Google searches it would amount to nearly two billion searches a year.
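As a rough back-of-the-envelope check (using the estimate of about 3.8 million queries a minute cited at the top of this article; Google does not disclose its exact query volume), a fraction of a percent does scale to billions of searches:

```python
# Rough illustration only; Google does not disclose its exact query volume.
queries_per_minute = 3.8e6                  # estimate cited earlier in this article
queries_per_year = queries_per_minute * 60 * 24 * 365      # roughly 2 trillion
low_rate, high_rate = 0.001, 0.0025         # "a 10th of a percent" to "a quarter of a percent"
print(f"queries per year: {queries_per_year:,.0f}")
print(f"affected searches: {low_rate * queries_per_year:,.0f} to {high_rate * queries_per_year:,.0f}")
# roughly 2.0 billion to 5.0 billion a year, consistent with "nearly two billion" at the low end
```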
By comparison, Facebook faced congressional scrutiny for Russian misinformation that was viewed by 126 million users.
Google’s Ms. Levin said the number includes not just misinformation but also a “wide range of other content defined as lowest quality.” She disputed the Journal’s estimate of the number of searches that were affected. The company doesn’t disclose metrics on Google searches.
Google assembled a small SWAT team to work on the problem that became known internally as “Project Owl.” Borrowing from the strategy used earlier to fight spam, engineers worked to emphasize factors on a page that are proxies for “authoritativeness,” effectively pushing down pages that don’t display those attributes.
Other tech platforms, including Facebook, have taken a more aggressive approach, manually removing problem content and devising rules around what they define as misinformation. Google, for its part, has said that because its role is “indexing” content rather than “hosting” it, as Facebook does, it shouldn’t have to take a more active role.
One Google search executive described the problem of defining misinformation as incredibly hard, and said the company didn’t want to go down the path of figuring it out.
Around the time Google started addressing issues such as misinformation, it started fielding even more complaints, to the point where human interference became more routine, according to people familiar with the matter, putting it in the position of arbitrating some of society’s most complicated issues. Some changes to search results might be considered reasonable—boosting trusted websites like the National Suicide Prevention Lifeline, for example—but Google has made little disclosure about when changes are made, or why.
Businesses, lawmakers and advertisers are worried about fairness and competition within the markets where Google is a leading player, and as a result its operations are coming under heavy scrutiny.
The U.S. Justice Department earlier this year opened an antitrust probe, in which Google’s search policies and practices are expected to be areas of focus. Google executives have twice been called to testify before Congress in the past year over concerns about political bias. In the European Union, Google has been fined more than $9 billion in the past three years for anticompetitive practices, including allegedly using its search engine to favor its own products.
In response, Google has said it faces tough competition in a dynamic tech sector, and that its behavior is aimed at helping create choice for consumers, not hurting rivals. The company is currently appealing the decisions against it in the EU, and it has denied claims of political bias.
GOOGLE RARELY RELEASES detailed information on algorithm changes, and its moves have bedeviled companies and interest groups, who feel they are operating at the tech giant’s whim.
In one change hotly contested within Google, engineers opted to tilt results to favor prominent businesses over smaller ones, based on the argument that customers were more likely to get what they wanted at larger outlets. One effect of the change was a boost to Amazon’s products, even if the items had been discontinued, according to people familiar with the matter.
The issue came up repeatedly over the years at meetings in which Google search executives discuss algorithm changes. Each time, they chose not to reverse the change, according to a person familiar with the matter.
Google engineers said it is widely acknowledged within the company that search is a zero-sum game: A change that helps lift one result inevitably pushes down another, often with considerable impact on the businesses involved.
Ms. Levin said there is no guidance in Google’s rater guidelines that suggests big sites are inherently more authoritative than small sites. “It’s inaccurate to suggest we did not address issues like discontinued products appearing high up in results,” she added.
Many of the changes within Google have coincided with its gradual evolution from a company with an engineering-focused, almost academic culture, into an advertising behemoth and one of the most profitable companies in the world. Advertising revenue—which includes ads on search as well as on other products such as maps and YouTube—was $116.3 billion last year.
Some very big advertisers received direct advice on how to improve their organic search results, a perk not available to businesses with no contacts at Google, according to people familiar with the matter. In some cases, that help included sending in search engineers to explain a problem, they said.
“If they have an [algorithm] update, our teams may get on the phone with them and they will go through it,” said Jeremy Cornfeldt, the Americas chief executive of Dentsu Inc.’s iProspect, which Mr. Cornfeldt said is one of Google’s largest advertising agency clients. He said the agency doesn’t get information Google wouldn’t share publicly. Among the clients it can disclose, iProspect represents Levi Strauss & Co., Alcon Inc. and Wolverine World Wide Inc.
One former executive at a Fortune 500 company that received such advice said Google frequently adjusts how it crawls the web and ranks pages to deal with specific big websites.
Google updates its index of some sites such as Facebook and Amazon more frequently, a move that helps them appear more often in search results, according to a person familiar with the matter.
“There’s this idea that the search algorithm is all neutral and goes out and combs the web and comes back and shows what it found, and that’s total BS,” the former executive said. “Google deals with special cases all the time.”
Ms. Levin, the Google spokeswoman, said the search team’s practice is to not provide specialized guidance to website owners. She also said that faster indexing of a site isn’t a guarantee that it will rank higher. “We prioritize issues based on impact, not any commercial relationships,” she said.
[Chart: Alphabet’s net income by year, 2005 onward, on a scale of $0 to $30 billion. Note: The 2017 figure reflects a one-time charge of $9.9 billion related to new U.S. tax law. Alphabet was created through a corporate restructuring of Google in 2015; figures for prior years are for Google Inc. Source: FactSet]
Online marketplace eBay had long relied on Google for as much as a third of its internet traffic. In 2014, traffic suddenly plummeted—contributing to a $200 million hit in its revenue guidance for that year.
Google told the company it had made a decision to lower the ranking of a large number of eBay pages that were a big source of traffic.
EBay executives debated pulling their quarterly advertising spending of around $30 million from Google to protest, but ultimately decided to step up lobbying pressure on Google, with employees and executives calling and meeting with search engineers, according to people familiar with the matter. A similar episode had hit traffic several years earlier, and eBay had marshaled its lobbying might to persuade Google to give it advice about how to fix the problem, even relying on a former Google staffer who was then employed at eBay to work his contacts, according to one of those people.
This time, Google ultimately agreed to improve the ranking of a number of pages it had demoted while eBay completed a broader revision of its website to make the pages more “useful and relevant,” the people said. The revision was arduous and costly to complete, one of the people said, adding that eBay was later hit by other downrankings that Google didn’t help with.
“We’ve experienced significant and consistent drops in Google SEO for many years, which has been disproportionally detrimental to those small businesses that we support,” an eBay spokesman said. SEO, or search-engine optimization, is the practice of trying to generate more search-engine traffic for a website.
Google’s Ms. Levin declined to comment on eBay.
Companies without eBay’s clout had different experiences.
Dan Baxter can remember the exact moment his website, DealCatcher, was caught in a Google algorithm change. It was 6 p.m. on Sunday, Feb. 18. Mr. Baxter, who founded the Wilmington, Del., coupon website 20 years ago, got a call from one of his 12 employees the next morning.
“Have you looked at our traffic?” the worker asked, frantically, Mr. Baxter recalled. It was suddenly down 93% for no apparent reason. That Saturday, DealCatcher saw about 31,000 visitors from Google. Now it was posting about 2,400. It had disappeared almost entirely on Google search.
Mr. Baxter said he didn’t know whom to contact at Google, so he hired a consultant to help him identify what might have happened. The expert reached out directly to a contact at Google but never heard back. Mr. Baxter tried posting to a YouTube forum hosted by a Google “webmaster” to ask if it might have been a technical problem, but the webmaster seemed to shoot down that idea.
One month to the day after his traffic disappeared, it inexplicably came back, and he still doesn’t know why.
“You’re kind of just left in the dark, and that’s the scary part of the whole thing,” said Mr. Baxter.
Google’s Ms. Levin declined to comment on DealCatcher.
(The Wall Street Journal is owned by News Corp, which has complained publicly about Google’s moves to play down news sites that charge for subscriptions. Google eventually ended that policy after intensive lobbying by News Corp and other paywalled publishers. More recently, News Corp has called for an “algorithm review board” to oversee Google, Facebook and other tech giants. News Corp has a commercial agreement to supply news through Facebook, and Dow Jones & Co., publisher of The Wall Street Journal, has a commercial agreement to supply news through Apple services. Google’s Ms. Levin and News Corp declined to comment.)
GOOGLE IN RECENT months has made additional efforts to clarify how its services operate by updating general information on its site. At the end of October it posted a new video titled “How Google Search Works.”
Jonathan Zittrain, a Harvard Law School professor and faculty director of the Berkman Klein Center for Internet & Society, said Google has poorly defined how often or when it intervenes on search results. The company’s argument that it can’t reveal those details because it is fighting spam “seems nuts,” said Mr. Zittrain.
“That argument may have made sense 10 or 15 years ago but not anymore,” he said. “That’s called ‘security through obscurity,’ ” a reference to the now-unfashionable engineering idea that systems can be made more secure by restricting information about how they operate.
Google’s Ms. Levin said “extreme transparency has historically proven to empower bad actors in a way that hurts our users and website owners who play by the rules.”
“Building a service like this means making tens of thousands of really, really complicated human decisions, and that’s not what people think,” said John Bowers, a research associate at the Berkman Klein Center.
On one extreme, those decisions at Google are made by the world’s most accomplished and highest-paid engineers, whose job is to turn the dials within millions of lines of complex code. On the other is an army of more than 10,000 contract workers, who work from home and get paid by the hour to evaluate search results.
The rankings supplied by the contractors, who work from a Google manual that runs to hundreds of pages, can indirectly move a site higher or lower in results, according to people familiar with the matter. And their collective responses are measured by Google executives and used to affect the search algorithms.
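The article doesn’t spell out how those collective responses are turned into algorithm adjustments; purely as an illustration, one minimal way to picture the aggregation is comparing average rater scores for results produced before and after a proposed change (the numbers below are invented):

```python
# Minimal sketch, not Google's actual process: compare the average rater score
# for the current algorithm's results against a proposed change's results.
def mean_rating(ratings):
    return sum(ratings) / len(ratings)

before = [0.6, 0.7, 0.5, 0.6]   # hypothetical rater scores for current results
after = [0.8, 0.7, 0.9, 0.8]    # hypothetical rater scores for the candidate change
print("adopt the change" if mean_rating(after) > mean_rating(before) else "hold it back")
```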
Mixed Results
Google’s results page has become a complex mix of search results, advertisements and featured content, not always distinguishable by the user. While these features are all driven by algorithms, Google has different policies and attitudes toward changing the results shown in each of the additional features. Featured snippets and knowledge panels are two common features.
• Featured snippet: Highlights web pages that Google thinks will contain content a user is looking for. Google says it will remove content from the feature if it violates policies around harmful and hateful content.
• Knowledge panel: Information Google has compiled from various sources on the web, such as Wikipedia, that provides basic facts about the subject of your query. Google is willing to adjust this material.
• Organic search results: Links to results that Google’s algorithms have determined are relevant to your query. Google says it doesn’t curate these results.
One of those evaluators was Zack Langley, now a 27-year-old logistics manager at a tour company in New Orleans. Mr. Langley got a one-year contract in the spring of 2016 evaluating Google’s search results through Lionbridge Technologies Inc., one of several companies Google and other tech platforms use for contract work.
During his time as a contractor, Mr. Langley said he never had any contact with anyone at Google, nor was he told what his results would be used for. Like all of Google’s evaluators, he signed a nondisclosure agreement. He made $13.50 an hour and worked up to 20 hours a week from home.
Sometimes working in his pajamas, Mr. Langley was given hundreds of real search results and told to use his judgment to rate them according to quality, reputation and usefulness, among other factors.
At one point, Mr. Langley said he was unhappy with the search results for “best way to kill myself,” which were turning up links that were like “how-to” manuals. He said he down-ranked all the other results for suicide until the National Suicide Prevention Lifeline was the No. 1 result.
Soon after, Mr. Langley said, Google sent a note through Lionbridge saying the hotline should be ranked as the top result across all searches related to suicide, so that the collective rankings of the evaluators would adjust the algorithms to deliver that result. He said he never learned if his actions had anything to do with the change.
Mr. Langley said it seemed like Google wanted him to change content on search so Google would have what he called plausible deniability about making those decisions. He said contractors would get notes from Lionbridge that he believed came from Google telling them the “correct” results on other searches.
He said that in late 2016, as the election approached, Google officials got more involved in dictating the best results, although not necessarily on issues related to the campaign. “They used to have a hands-off approach, and then it seemed to change,” he said.
Ms. Levin, the Google spokeswoman, said the company “long ago evolved our approach to collecting feedback on these types of queries, which help us develop algorithmic solutions and features in this area.” She added, “We provide updates to our rater guidelines to ensure all raters are following the same general framework.”
Lionbridge didn’t reply to requests for comment.
AT GOOGLE, EMPLOYEES routinely use the company’s internal message boards as well as a form called “go/bad” to push for changes in specific search results. (Go/bad is a reporting system meant to allow Google staff to point out problematic search results.)
One of the first hot-button issues surfaced in 2015, according to people familiar with the matter, when some employees complained that a search for “how do vaccines cause autism” delivered misinformation through sites that oppose vaccinations.
At least one employee defended the result, writing that Google should “let the algorithms decide” what shows up, according to one person familiar with the matter. Instead, the people said, Google made a change so that the first result is a site called howdovaccinescauseautism.com—which states on its home page in large black letters, “They f—ing don’t.” (The phrase has become a meme within Google.)
Google’s Ms. Levin declined to comment.
In the fall of 2018, the conservative news site Breitbart News Network posted a leaked video of Google executives, including Mr. Brin and Google CEO Sundar Pichai, upset and addressing staffers following President Trump’s election two years earlier. A group of Google employees noticed the video was appearing on the 12th page of search results when Googling “leaked Google video Trump,” which made it seem like Google was burying it. They complained on one of the company’s internal message boards, according to people familiar with the matter. Shortly after, the leaked video began appearing higher in search results.
“When we receive reports of our product not behaving as people might expect, we investigate to see if there’s any useful insight to inform future improvements,” said Ms. Levin.
FROM GOOGLE’S FOUNDING, Messrs. Page and Brin knew that ranking webpages was a matter of opinion. “The importance of a Web page is an inherently subjective matter, which depends on the [readers’] interests, knowledge and attitudes,” they wrote in their 1998 paper introducing the PageRank algorithm, the founding system that launched the search engine.
PageRank, they wrote, would measure the level of human interest and attention, but it would do so “objectively and mechanically.” They contended that the system would mathematically measure the relevance of a site by the number of times other relevant sites linked to it on the web.
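As a minimal sketch of the idea Messrs. Page and Brin described (this is not Google’s code, and it ignores the hundreds of refinements layered on since), the original PageRank can be approximated by repeatedly passing each page’s score along its outbound links until the scores settle:

```python
# Minimal PageRank sketch in the spirit of the 1998 paper: a page's score is
# built from the scores of the pages that link to it. Illustration only.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            for target in targets:
                # each page passes a share of its score to the pages it links to
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# Toy web of three pages: A and C both link to B, so B ends up ranked highest.
print(pagerank({"A": ["B"], "B": ["C"], "C": ["A", "B"]}))
```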
Today, PageRank has been updated and subsumed into more than 200 different algorithms, attuned to hundreds of signals, now used by Google. (The company replaced PageRank in 2005 with a newer version that could better keep up with the vast traffic that the site was attracting. Internally, it was called “PageRankNG,” ostensibly named for “next generation,” according to people familiar with the matter. In public, the company still points to PageRank—and on its website links to the original algorithm published by Messrs. Page and Brin—in explaining how search works. “The original insight and notion of using link patterns is something that we still use in our systems,” said Ms. Levin.)
By the early 2000s, spammers were overwhelming Google’s algorithms with tactics that made their sites appear more popular than they were, skewing search results. Messrs. Page and Brin disagreed over how to tackle the problem.
Mr. Brin argued against human intervention, contending that Google should deliver the most accurate results as delivered by the algorithms, and that the algorithms should only be tweaked in the most extreme cases. Mr. Page countered that the user experience was getting damaged when users encountered spam rather than useful results, according to people familiar with the matter.
Google already had been taking what the company calls “manual actions” against specific websites that were abusing the algorithm. In that process, Google engineers demote a website’s ranking by changing its specific “weighting.” For example, if a website is artificially boosted by paying other websites to link to it, a behavior that Google frowns upon, Google engineers could turn down the dial on that specific weighting. The company could also blacklist a website, or remove it altogether.
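The Journal’s description stops at “turning down the dial”; a hypothetical sketch of what such a per-site demotion or removal could look like is below. The site names, signal names and numbers are invented for illustration and are not Google’s:

```python
# Hypothetical illustration of a "manual action": turn down the dial on one
# signal for a specific site, or remove the site entirely. All site names,
# signal names and numbers here are invented.
DEMOTIONS = {"paid-links.example": 0.1}    # multiplier applied to the inbound-link signal
BLACKLIST = {"blatant-spam.example"}       # sites removed from results altogether

def adjusted_score(site, link_signal, content_signal):
    if site in BLACKLIST:
        return None                        # the site will not appear at all
    link_weight = DEMOTIONS.get(site, 1.0)
    return link_signal * link_weight + content_signal

print(adjusted_score("paid-links.example", link_signal=8.0, content_signal=3.0))    # 3.8
print(adjusted_score("blatant-spam.example", link_signal=9.0, content_signal=1.0))  # None
```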
Mr. Brin still opposed making large-scale efforts to fight spam, because it involved more human intervention. Mr. Brin, whose parents were Jewish émigrés from the former Soviet Union, even personally decided to allow anti-Semitic sites that were in the results for the query “Jew,” according to people familiar with the decision. Google posted a disclaimer with results for that query saying, “Our search results are generated completely objectively and are independent of the beliefs and preferences of those who work at Google.”
Finally, in 2004, in the bathroom one day at Google’s headquarters in Mountain View, Calif., Mr. Page approached Ben Gomes, one of Google’s early search executives, to express support for his efforts fighting spam. “Just do what you need to do,” said Mr. Page, according to a person familiar with the conversation. “Sergey is going to ruin this f—ing company.”
Ms. Levin, the Google spokeswoman, said Messrs. Page, Brin and Gomes declined to comment.
After that, the company revised its algorithms to fight spam and loosened rules for manual interventions, according to people familiar with the matter.
Google has guidelines for changing its ranking algorithms, a grueling process called the “launch committee.” Google executives have pointed to this process in a general way in congressional testimony when asked about algorithm changes.
The process is like defending a thesis, and the meetings can be contentious, according to people familiar with them.
In part because the process is laborious, some engineers aim to avoid it if they can, one of these people said, and small changes can sometimes get pushed through without the committee’s approval. Mr. Gomes is on the committee that decides whether to approve the changes, and other senior officials sometimes attend as well.
Google’s Ms. Levin said not every algorithm change is discussed in a meeting, but “there are other processes for reviewing more straightforward launches at different levels of the organization,” such as an email review. Those reviews still involve members of the launch committee, she said.
Today, Google discloses only a few of the factors being measured by its algorithms. Known ones include “freshness,” which gives preference to recently created content for searches relating to things such as breaking news or a sports event. Another is where a user is located—if a user searches for “zoo,” Google engineers want the algorithms to provide the best zoo in the user’s area. Language signals—how meanings change when words are used together, such as April and fools—are among the most important, as they help determine what a user is actually asking for.
Other important signals have included the length of time users would stay on pages they clicked on before clicking back to Google, according to a former Google employee. Long stays would boost a page’s ranking. Quick bounce backs, indicating a site wasn’t relevant, would severely hurt a ranking, the former employee said.
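Google does not disclose how these signals are weighted or combined; the toy sketch below, with invented weights and values, only illustrates the general idea of folding several signals into a single ranking score, echoing the article’s “zoo” example:

```python
# Toy sketch of combining ranking signals into one score. The signal names come
# from the article (relevance, freshness, location, dwell time); the weights and
# values are invented for illustration.
WEIGHTS = {"relevance": 3.0, "freshness": 1.0, "proximity": 1.0, "dwell_time": 2.0}

def score(signals):
    """signals: dict of signal name -> normalized value between 0 and 1."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

candidates = {
    "nearby-zoo.example":  {"relevance": 0.9, "freshness": 0.4, "proximity": 0.9, "dwell_time": 0.8},
    "distant-zoo.example": {"relevance": 0.9, "freshness": 0.4, "proximity": 0.1, "dwell_time": 0.7},
}
ranked = sorted(candidates, key=lambda page: score(candidates[page]), reverse=True)
print(ranked)   # the nearby zoo ranks first
```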
Over the years, Google’s database recording this user activity has become a competitive advantage, helping cement its position in the search market. Other search engines don’t have the vast quantity of data available to Google, the market leader in search.
That makes the impact of its operating decisions immense. When Pinterest Inc. filed to go public earlier this year, it said that “search engines, such as Google, may modify their algorithms and policies or enforce those policies in ways that are detrimental to us.” It added: “Our ability to appeal these actions is limited.” A spokeswoman for Pinterest declined to comment.
Search-engine optimization consultants have proliferated to try to decipher Google’s signals on behalf of large and small businesses. But even those experts said the algorithms remain borderline indecipherable. “It’s black magic,” said Glenn Gabe, an SEO expert who has spent years analyzing Google’s algorithms and tried to help DealCatcher find a solution to its drop in traffic earlier this year.
ALONG WITH ADVERTISEMENTS, Google’s own features now take up large amounts of space on the first page of results—with few obvious distinctions for users. These include news headlines and videos across the top, information panels along the side and “People also ask” boxes highlighting related questions.
Google engineers view the features as separate products from Google search, and there is less resistance to manually changing their content in response to outside requests, according to people familiar with the matter.
These features have become more prominent as Google attempts to keep users on its results page, where ads are placed, instead of losing the users as they click through to other sites. In September, about 55% of Google searches on mobile were “no-click” searches, according to research firm Jumpshot, meaning users never left the results page.
Two typical features on the results page—knowledge panels, which are collections of relevant information about people, events or other things; and featured snippets, which are highlighted results that Google thinks will contain content a user is looking for—are areas where Google engineers make changes to fix results, the Journal found.
Curated Features
Google has looser policies about making adjustments to these features than it does for organic search results. The features include Google News and People also ask.
• Top stories: News articles surfaced as being particularly relevant. Google blocks some sites that don’t meet its policies.
• People also ask: A predictive feature that suggests related questions, providing short answers with links. Google says it weeds out and blocks some phrases in this feature as it does in its auto-complete feature.
In April, the conservative Heritage Foundation called Google to complain that a coming movie called “Unplanned” had been labeled in a knowledge panel as “propaganda,” according to a person familiar with the matter. The film is about a former Planned Parenthood director who had a change of heart and became pro-life.
After the Heritage Foundation complained to a contact at Google, the company apologized and removed “propaganda” from the description, that person said.
Google’s Ms. Levin said the change “was not the result of pressure from an outside group, it was a violation of the feature’s policy.”
On the auto-complete feature, Google reached a confidential settlement in France in 2012 with several outside groups that had complained it was anti-Semitic that Google was suggesting the French word for “Jew” when searchers typed in the name of several prominent politicians. Google agreed to “algorithmically mitigate” such suggestions as part of a pact that barred the parties from disclosing its terms, according to people familiar with the matter.
In recent years, Google changed its auto-complete algorithms to remove “sensitive and disparaging remarks.” The policy, now detailed on its website, says that Google doesn’t allow predictions that may be related to “harassment, bullying, threats, inappropriate sexualization, or predictions that expose private or sensitive information.”
GOOGLE HAS BECOME more open about its moderation of auto-complete but still doesn’t disclose its use of blacklists. Kevin Gibbs, who created auto-complete in 2004 when he was a Google engineer, originally developed the list of terms that wouldn’t be suggested, even if they were the most popular queries that independent algorithms would normally supply.
For example, if a user searched “Britney Spears”—a popular search on Google at the time—Mr. Gibbs didn’t want a piece of human anatomy or the description of a sex act to appear when someone started typing the singer’s name. The unfiltered results were “kind of horrible,” Mr. Gibbs said in an interview.
He said deciding what should and shouldn’t be on the list was challenging. “It was uncomfortable, and I felt a lot of pressure,” said Mr. Gibbs, who worked on auto-complete for about a year, and left the company in 2012. “I wanted to make sure it represented the world fairly and didn’t leave out any groups.”
Google still maintains lists of phrases and terms that are manually blacklisted from auto-complete, according to people familiar with the matter.
The company internally has a “clearly articulated set of policies” about what terms or phrases might be blacklisted in auto-complete, and it follows those rules, according to a person familiar with the matter.
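The actual lists and matching rules are not public; a minimal sketch of the kind of filter described here, with invented blocked phrases and a deliberately simple matching rule, might look like this:

```python
# Minimal sketch of filtering auto-complete predictions against a blocklist.
# The blocked phrases and the substring-matching rule are invented; Google's
# actual lists and logic are not public.
BLOCKED_PHRASES = {"blocked phrase one", "blocked phrase two"}

def filter_predictions(predictions):
    return [p for p in predictions if not any(b in p.lower() for b in BLOCKED_PHRASES)]

raw = ["candidate x is blocked phrase one", "candidate x is running for president"]
print(filter_predictions(raw))   # only the second suggestion survives
```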
Blacklists also affect the results in organic search and Google News, as well as other search products, such as Web answers and knowledge panels, according to people familiar with the matter.
Google has said in congressional testimony it doesn’t use blacklists. Asked in a 2018 hearing whether Google had ever blacklisted a “company, group, individual or outlet…for political reasons,” Karan Bhatia, Google’s vice president of public policy, responded: “No, ma’am, we don’t use blacklists/whitelists to influence our search results,” according to the transcript.
Ms. Levin said those statements were related to blacklists targeting political groups, which she said the company doesn’t keep.
Google’s first blacklists date to the early 2000s, when the company made a list of spam sites that it removed from its index, one of those people said. This means the sites wouldn’t appear in search results.
Engineers known as “maintainers” are authorized to make and approve changes to blacklists. It takes at least two people to do this; one person makes the change, while a second approves it, according to the person familiar with the matter.
The Journal reviewed a draft policy document from August 2018 that outlines how Google employees should implement an anti-misinformation blacklist aimed at blocking certain publishers from appearing in Google News and other search products. The document says engineers should focus on “a publisher misrepresenting their ownership or web properties” and having “deceptive content”—that is, sites that actively aim to mislead—as opposed to those that have inaccurate content.
“The purpose of the blacklist will be to bar the sites from surfacing in any Search feature or news product sites,” the document states.
Ms. Levin said Google does “not manually determine the order of any search result.” She said sites that don’t adhere to Google News “inclusion policies” are “not eligible to appear on news surfaces or in information boxes in Search.”
SOME INDIVIDUALS and companies said changes made by the company seem ad hoc, or inconsistent. People familiar with the matter said Google increasingly will make manual or algorithmic changes that aren’t acknowledged publicly in order to maintain that it isn’t affected by outside pressure.
“It’s very convenient for us to say that the algorithms make all the decisions,” said one former Google executive.
In March 2017, Google updated the guidelines it gives contractors who evaluate search results, instructing them for the first time to give low-quality ratings to sites “created with the sole purpose of promoting hate or violence against a group of people”—something that would help adjust Google algorithms to lower those sites in search.
The next year, the company broadened the guidance to any pages that promote such hate or violence, even if it isn’t the page’s sole purpose and even if it is “expressed in polite or even academic-sounding language.”
Google has resisted entirely removing some content that outsiders complained should be blocked. In May 2018, Ignacio Wenley Palacios, a Spain-based lawyer working for the Lawfare Project, a nonprofit that funds litigation to protect Jewish people, asked Google to remove an anti-Semitic article lauding a German Holocaust denier posted on a Spanish-language neo-Nazi blog.
The company declined. In an email to Mr. Wenley Palacios, lawyers for Google contended that “while such content is detestable” it isn’t “manifestly illegal” in Spain.
Mr. Wenley Palacios then filed a lawsuit, but in the spring of this year, before the suit could be heard, he said, Google lawyers told him the company was changing its policy on such removals in Spain.
According to Mr. Wenley Palacios, the lawyers said the firm would now remove from searches conducted in Spain any links to Holocaust denial and other content that could hurt vulnerable minorities, once they are pointed out to the company. The results would still be accessible outside of Spain. He said both sides agreed to dismiss the case.
Google’s Ms. Levin described the action as a “legal removal” in accordance with local law. Holocaust denial isn’t illegal in Spain, but if it is coupled with an intent to spread hate, it can fall under Spanish criminal law banning certain forms of hate speech.
“Google used to say, ‘We don’t approve of the content, but that’s what it is,’ ” Mr. Wenley Palacios said. “That has changed dramatically.”
Business Model
Google’s search results page has changed over the years, becoming much more ad-heavy.
• Ads: Ads in recent years claim more space at the top of the results page.
• Vertical search results: Various features that present specialized results for specific topics, like hotels or places, often with photos or maps. The results in some of these features are paid advertisements.
• Organic search results: As Google has placed more ads and verticals at the top of the page, organic search results have shrunk.
Health policy consultant Greg Williams said he helped lead a campaign to push Google to make changes that would stifle misleading results for queries such as “rehab.”
At the time, in 2017, addiction centers with spotty records were constantly showing up in search results, typically the first place family members and addicts go in search of help.
Google routed Diane Hentges several times over the last year to call centers as she desperately researched drug addiction treatment centers for her 22-year-old son, she said.
Each time she called one of the facilities listed on Google, a customer-service representative would ask for her financial information, but the representatives didn’t appear to be attached to any legitimate company.
“If you look at a place on Google, it sends you straight to a call center,” Ms. Hentges said, adding that parents who are struggling with a child with addiction “will do anything to get our child healthy. We’ll believe anything.”
After intense lobbying by Mr. Williams and others, Google changed its ad policy around such queries. But addiction industry officials also noticed a significant change to Google search results. Many searches for “rehab” or related terms began returning the website for the Substance Abuse and Mental Health Services Administration, the national help hotline run by the U.S. Department of Health and Human Services, as the top result.
Google never acknowledged the change. Ms. Levin said that “resources are not listed because of any type of partnership” and that “we have algorithmic solutions designed to prioritize authoritative resources (including official hotlines) in our results for queries like these as well as for suicide and self-harm queries.”
A spokesman for SAMHSA said the agency had a partnership with Google.
Google’s search algorithms have been a major focus of Hollywood in its effort to fight pirated TV shows and movies.
[Chart: Alphabet’s revenue by type (advertising and other), 2005 onward, on a scale of $0 to $150 billion. Note: Alphabet was created through a corporate restructuring of Google in 2015; figures for prior years are for Google Inc. Source: the company]
Studios “saw this as the potential death knell of their business,” said Dan Glickman, chairman and chief executive of the Motion Picture Association of America from 2004 to 2010. The association has been a public critic of Google. “A hundred million dollars to market a major movie could be thrown away if someone could stream it illegally online.”
Google received a record 1.6 million requests to remove web pages for copyright issues last year, according to the company’s published Transparency Report and a Journal analysis. Those requests pertained to more than 740 million pages, about 12 times the number of web pages it was asked to take down in 2012.
A decade ago, in a concession to the industry, Google removed “download” from its auto-complete suggestions after the name of a movie or TV show, so that at least it wouldn’t be encouraging searches for pirated content.
In 2012, it applied a filter to search results that would lower the ranking of sites that received a large number of piracy complaints under U.S. copyright law. That effectively pushed many pirate sites off the front page of results for general searches for movies or music, although it still showed them when a user specifically typed in the pirate site names.
In recent months the industry has gotten more cooperation from Google on piracy in search results than at any point in the organization’s history, according to people familiar with the matter.
“Google is under great cosmic pressure, as is Facebook,” Mr. Glickman said. “These are companies that are in danger of being federally regulated to an extent that they never anticipated.”
Mr. Pichai, who became CEO of Google in 2015, is more willing to entertain complaints about the search results from outside parties than Messrs. Page and Brin, the co-founders, according to people familiar with his leadership.
Google’s Ms. Levin said Mr. Pichai’s “style of engaging and listening to feedback has not shifted. He has always been very open to feedback.”
CRITICISM ALLEGING political bias in Google’s search results has sharpened since the 2016 election.
Interest groups from the right and left have besieged Google with questions about content displayed in search results and about why the company’s algorithms returned certain information over others.
Google appointed an executive in Washington, Max Pappas, to handle complaints from conservative groups, according to people familiar with the matter. Mr. Pappas works with Google engineers on changes to search when conservative viewpoints aren’t being represented fairly, according to interest groups interviewed by the Journal, although that is just one part of his job.
“Conservatives need people they can go to at these companies,” said Dan Gainor, an executive at the conservative Media Research Center, which has complained about various issues to Google.
Google also appointed at least one other executive in Washington, Chanelle Hardy, to work with outside liberal groups, according to people familiar with the matter.
Ms. Levin said both positions have existed for many years. She said in general Google believes it’s “the responsible thing to do” to understand feedback from the groups and said Google’s algorithms and policies don’t attempt to make any judgment based on the political leanings of a website.
Mr. Pappas declined to comment, and Ms. Hardy didn’t reply to a request for comment.
Over the past year, abortion-rights groups have complained about search results that turned up the websites of what are known as “crisis pregnancy centers,” organizations that counsel women against having abortions, according to people familiar with the matter.
One of the complaining organizations was Naral Pro-Choice America, which tracks the activities of anti-abortion groups through its opposition research department, said spokeswoman Kristin Ford.
Naral complained to Google and other tech platforms that some of the ads, posts and search results from crisis pregnancy centers are misleading and deceptive, she said. Some of the organizations claimed to offer abortions and then counseled women against it. “They do not disclose what their agenda is,” Ms. Ford said.
In June, Google updated its advertising policies related to abortion, saying that advertisers must state whether they provide abortions or not, according to its website. Ms. Ford said Naral wasn’t told in advance of the policy change.
Ms. Levin said Google didn’t implement any changes with regard to how crisis pregnancy centers rank for abortion queries.
The Journal tested the term “abortion” in organic search results over 17 days in July and August. Thirty-nine percent of all results on the first page had the hostname www.plannedparenthood.org, the site of Planned Parenthood Federation of America, the nonprofit, abortion-rights organization.
By comparison, 14% of Bing’s first page of search results and 16% of DuckDuckGo’s first page of results were from Planned Parenthood.
Ms. Levin said Google doesn’t have any particular ranking implementations aimed at promoting Planned Parenthood.
Results of the Journal’s search tests; percentages indicate how many times each web page appeared during the WSJ’s testing:
• Abortion Information | Information About Your Options (https://www.plannedparenthood.org/learn/abortion): 100%
• An Overview of Abortion Laws | Guttmacher Institute (https://www.guttmacher.org/state-policy/explore/overview-abortion-laws): 100%
• What facts about abortion do I need to know? – Planned Parenthood (https://www.plannedparenthood.org/learn/abortion/considering-abortion/what-facts-about-abortion-do-i-need-know): 67%
• National Abortion Federation: Home (https://prochoice.org/): 44%
• Abortion | Center for Reproductive Rights (https://reproductiverights.org/our-issues/abortion): 38%
• What Happens During an In-Clinic Abortion? – Planned Parenthood (https://www.plannedparenthood.org/learn/abortion/in-clinic-abortion-procedures/what-happens-during-an-in-clinic-abortion): 38%
• Abortion | Medical Abortion | MedlinePlus (https://medlineplus.gov/abortion.html): 98%
• Abortion Procedures During First, Second and Third Trimester (https://americanpregnancy.org/unplanned-pregnancy/abortion-procedures/): 76%
The practice of creating blacklists for certain types of sites or searches has fueled cries of political bias from some Google engineers and right-wing publications that said they have viewed portions of the blacklists. Some of the websites Google appears to have targeted in Google News were conservative sites and blogs, according to documents reviewed by the Journal. In one partial blacklist reviewed by the Journal, some conservative and right-wing websites, including The Gateway Pundit and The United West, were included on a list of hundreds of websites that wouldn’t appear in news or featured products, although they could appear in organic search results.
Google has said repeatedly it doesn’t make decisions based on politics, and current and former employees told the Journal they haven’t seen evidence of political bias. And yet, they said, Google’s shifting policies on interference—and its lack of transparency about them—inevitably force employees to become arbiters of what is acceptable, a dilemma that opens the door to charges of bias or favoritism.
Google’s Ms. Levin declined to comment.
DEMANDS FROM GOVERNMENTS for changes have grown rapidly since 2016.
From 2010 to 2018, Google fielded such requests from countries including the U.S. to remove 685,000 links from what Google calls web search. The requests came from courts or other authorities that said the links broke local laws or should be removed for other reasons.
Nearly 78% of those removal requests have been since the beginning of 2016, according to reports that Google publishes on its website. Google’s ultimate actions on those requests weren’t disclosed.
Russia has been by far the most prolific, demanding the removal of about 255,000 links from search last year, three-quarters of all government requests for removal from Google search in that period, the data show. Nearly all of the country’s requests came under an information-security law Russia put into effect in late 2017, according to a Journal examination of disclosures in a database run by the Berkman Klein Center.
Google said the Russian law doesn’t allow it to disclose which URLs were requested to be removed. A person familiar with the matter said the removal demands are for content ruled illegal in Russia for a variety of reasons, such as for promoting drug use or encouraging suicide.
Requests can include demands to remove links to information the government defines as extremist, which can be used to target political opposition, the person said.
Google, whose staff reviews the requests, at times declines those that appear focused on political opposition, the person said, adding that in those cases, it tries not to draw attention to its decisions to avoid provoking Russian regulators.
The approach has led to stiff internal debate. On one side, some Google employees say that the company shouldn’t cooperate at all with takedown requests from countries such as Russia or Turkey. Others say it is important to follow the laws of countries where they are based.
“There is a real question internally about whether a private company should be making these calls,” the person said.
Google’s Ms. Levin said, “Maximizing access to information has always been a core principle of Search, and that hasn’t changed.”
Google’s culture of publicly resisting demands to change results has diminished, current and former employees said. A few years ago, the company dismantled a global team focused on free-speech issues that, among other things, publicized the company’s legal battles to fight changes to search results, in part because Google had lost several of those battles in court, according to a person familiar with the change.
“Free expression was no longer a winner,” the person said.
The U.S. is investigating whether the tech giant has abused its power, including as the biggest broker of digital ad sales across the web
Nexstar Media Group Inc., the largest local news company in the U.S., recently tested what would happen if it stopped using Google’s technology to place ads on its websites.
Over several days, the company’s video ad sales plummeted. “That’s a huge revenue hit,” said Tony Katsur, senior vice president at Nexstar. After its brief test, Nexstar switched back to Google.
Alphabet Inc.’s Google is under fire for its dominance in digital advertising, in part because of issues like this. The U.S. Justice Department and state attorneys general are investigating whether Google is abusing its power, including as the dominant broker of digital ad sales across the web. Most of the nearly 130 questions the states asked in a September subpoena were about the inner workings of Google’s ad products and how they interact.
We dug into Google’s vast, opaque ad machine, and in a series of graphics below, show you how it all works—and why publishers and rivals have had so many complaints about it.
Much of Google’s power as an ad broker stems from acquisitions of ad-technology companies, especially its 2008 purchase of DoubleClick. Regulators who approved that $3.1 billion deal warned they would step in if the company tied together its offerings in anticompetitive ways.
In interviews, dozens of publishing and advertising executives said Google is doing just that with an array of interwoven products. Google operates the leading selling and buying tools, and the biggest marketplace where online ad deals happen.
When Nexstar didn’t use Google’s selling tool, it missed out on a huge amount of demand that comes through its buying tools, Mr. Katsur said: “They want you locked in.”