Very excited to introduce Rana. Rana Foroohar is the global business columnist at the Financial Times and CNN's global economic analyst. Previously she was the assistant managing editor in charge of business and economics at Time, as well as the magazine's economic columnist, and spent 13 years at Newsweek as an economic and foreign affairs editor and correspondent. In her new book, Don't Be Evil, which I think is a great title, Rana chronicles how far big tech has fallen from its original vision of free information and digital democracy. Drawing on nearly 30 years of experience reporting on the technology sector, Rana traces the evolution of companies such as Google, Facebook, Apple, and Amazon into behemoths that monetize people's data, spread misinformation and hate speech, and threaten citizens' privacy. She also shows how we can fight back by creating a framework that both fosters innovation and protects us from the threats posed by digital technology. Her book is already garnering widespread praise, with the Guardian calling it "a masterly critique of the internet pioneers who now dominate our world." So without further ado, please help me in welcoming Rana Foroohar to Politics and Prose.

Thank you, I am so honored to be here. It's really a pleasure. This is one of my favorite bookstores, probably my favorite bookstore in Washington, so it's just a huge pleasure. I thought I would start by talking a little bit about how I got the idea to write this book. It's actually my second book; my first book, Makers and Takers, was a look at the financial sector and how it no longer serves business. So I like to take on these big, industry-wide, maybe "takedown" is too strong a word, but to look at an ecosystem, an economic ecosystem, and see how it's working or not working. I got the idea for this book probably
two months into my new job at the Financial Times. I was hired in 2017 to be the chief business commentary writer, so my job was to look at the world's top business and economic stories and try to make sense of them in commentary. When I do that, I tend to try to follow the money in order to narrow the funnel of where to put my focus, and I had come across a really interesting statistic: 80% of the world's corporate wealth was living in 10% of companies, and these were the companies that had the most data, personal data, and intellectual property. The biggest of those were the big tech platforms that my book tries to make icons of, we're using all the candy colors here, the FAANGs: Facebook, Amazon, Apple, Netflix, Google. So that was a pretty stunning statistic, and it was interesting because I was thinking about how wealth since 2008 had transferred from the financial sector into the big tech sector, and that had happened really quietly, without a whole lot of commentary in the press.

Now, at the same time I was starting to dig into this story, something else happened, a much more personal episode. I came home one day and there was a credit card bill waiting for me, and I opened it up and started looking through, and there were all these tiny charges in the amount of $1.99, $3, $5, whatever, and I noticed that they were all from the App Store. I thought, oh my gosh, I must have been hacked. And then I thought: who else has my password? My ten-year-old son, Alex. I see nods from parents and others. So I go downstairs and I find Alex on the couch with his phone, which is his usual after-school position, and I say, you know, what's up, do you know anything about this? And he's sort of stunned, and: oh, yes, oh, that, yeah.
And it turns out Alex had gotten very fond of a game called FIFA Mobile, which is an online soccer game, and it's one of these games that you can download for free, but once you get into the game and start playing, you have to buy stuff. In-app purchases, they're called, or loot boxes is another name. So if you want to move up the rankings and do well in the game, you have to buy a virtual Ronaldo or some new shoes for your player, and nine hundred dollars and one month later, Alex was at the top of the rankings. But I was horrified. I was actually horrified and fascinated. As a mother, I was horrified: his phone was immediately confiscated, passwords were changed, limitations were put into place. By the way, he now officially is allowed only one hour a day on his phone. He's 13 years old; the national average for that age is seven hours a day. Now, he sneaks in extra time, I think he probably gets about 90 minutes, because I can't police him all the time on the way to school. But to me, that is a stunning fact, that the average American 13-year-old spends seven hours a day on their phone. Anyway, I was horrified as a parent, but I was fascinated as a business writer, because I thought: this is the most amazing business model I have ever seen, and I have to learn everything about it.

And right about that time, someone had come to see me, a man named Tristan Harris, who's one of the characters in my book. Tristan is a really interesting guy. He was formerly the chief ethics officer at Google, and he was trying to bring goodness and not evil to the company, and make sure that all the products and services were functioning in sort of a humane interest. Then he realized he was not having any luck doing that within the company, so he decided to go outside
and start something called the Center for Humane Technology. Tristan had become really worried about the core business model that is particularly relevant for Google and Facebook, but is also a big part of Amazon's model, and it's really the model that another author, Shoshana Zuboff, who recently wrote a wonderful book on this topic, would call surveillance capitalism. So it's the idea of companies coming in and tracking everything you are doing online, and increasingly offline. If you have an Android phone, it might know where you are in the grocery store; if you're in a car with smart technology, your location coordinates can be tracked. All of this serves to build a picture of you that is then sold to advertisers, and then you can be targeted with what's called hyper-targeted advertising, which is essentially why, for example, if I go online to look for a hotel in California, I might get a certain price, but someone else might get a different price. So this is a really important thing: we are looking at different internets. Right? There are subtle differences, but they're there, and this data profile that is being built up is splitting us as individual consumers, but I would argue that it's also splitting us as citizens, and when I get to the readings I'll flesh that out a bit more.

But Tristan kind of turned me on to this business model, and he also helped me connect the dots between this business model and what had happened to my son, because it turns out that these technologies, these sorts of nudges that take you down a game, or that bring you to certain places on Amazon, or that give you a certain kind of search result or purchasing option on Google, are part of an entire field called captology, which is kind of
an Orwellian word, and these technologies actually come largely out of something called the Stanford Persuasive Technology Lab. So there is an entire industry that is designed to track your behavior, pull in things like behavioral psychology and casino gaming techniques, and then layer those onto apps that will push you towards making purchasing decisions, or perhaps even other kinds of decisions, political decisions, that might be good for certain actors.

And it's interesting, because when I started to think about all this, one of the things I really wanted to do in this book was to try to create a single narrative arc to take folks through the 20-year evolution of this industry, from the mid-1990s, which is really when the consumer internet was born, until now. At the time I was writing, and probably still today, you could argue that Facebook was the company getting the most negative attention for a lot of the economic and political ramifications of its business model, but if you go back to the very beginning, Google is the most interesting way to track this, because Google really invented the targeted advertising business model. They really invented surveillance capitalism. And one of the things that is fascinating, and sometimes I'm asked, what's the most surprising thing that you found when writing this book? Really, the most surprising thing is that it was all hiding in plain sight. If you go back to the original paper that Larry Page and Sergey Brin, the founders of Google, wrote in 1998 while at Stanford as graduate students, they actually lay out what a giant search engine would look like, how it would function, and then how you might pay for it. And if you go down to page 33, there is a section in the appendix on advertising and its discontents,
and it essentially says that if you monetize a search engine in this way, with hyper-targeted advertising, the interests of the users and the interests of the advertisers, be they companies or who knows what public entities, are eventually going to come into conflict. And so they actually recommend that there be some kind of academic search engine, an open search engine in the public interest. So this, to me, is first of all fascinating that it was just there all along, and fascinating that very few people have read that entire paper, even those that write about it, which in some ways goes to the point that in the last 20 years we all do a lot less reading. Not folks here, but in general we do less reading. There was actually a fascinating study that came out recently from Common Sense Media, which is Jim Steyer's group in California that tracks children's behaviors online: among teenagers, only one-third read for pleasure more than once a month. Long-form articles, and it doesn't matter if you're reading on an e-book or a device, but long-form articles and books, only once a month for pleasure. So our entire world has been changed: economically, these companies have huge monopoly power; politically, we're all living with the ramifications of this new world of social media, disinformation, fake news; and cognitively, our brains are changing, our behaviors are changing. Connecting all of those things was really what I was trying to get at in this book.

And so I'm going to read two or three short excerpts, and then we can leave a lot of time for questions, so that people can dive into as much of this as they want. I'll start perhaps with my very first meeting with the Googlers, Larry Page and Sergey Brin, whom I met not in Silicon Valley but in Davos, the Swiss gathering spot of the global power elite, where they
had taken over a small chalet to meet with a select group of media. The year was 2007; the company had just purchased YouTube a few months back, and it seemed eager to convince skeptical journalists that this acquisition wasn't yet another death blow to copyright, paid content creation, and the viability of the news publications for which we worked. Unlike the buttoned-up consulting types or the suited executives from the old-guard multinational corporations that roamed the promenades of Davos, their tasseled loafers slipping on the icy paths, the Googlers were a cool bunch. They wore fashionable sneakers, and their chalet was sleek, white, and stark, with giant cubes masquerading as chairs, in a space that looked as though it had been repurposed that morning by designers flown in from the Valley. In fact, it may have been, and if so, Google would not have been alone in such excess. I remember attending a party once in Davos, hosted by Napster founder and former Facebook president Sean Parker, that featured giant taxidermy bears and a musical performance by John Legend. Back in the Google chalet, Brin and Page projected a youthful earnestness as they explained the company's involvement in authoritarian China and insisted they'd never be like Microsoft, which was considered the corporate bully and monopolist at the time. What about the future of news, we wanted to know. After admitting that Page read only free news online, whereas Brin often bought the Sunday New York Times in print ("It's nice," he said cheerfully), the duo affirmed exactly what we journalists wanted to hear. Google, they assured us, would never threaten our livelihoods. Yes, advertisers were indeed migrating en masse from our publications to the web, where they could target consumers with a level of precision that the print world could
barely imagine, but not to worry: Google would generously retool our business models so we too could thrive in the new digital world. I was much younger then, and not the admittedly cynical business journalist that I have since become, and yet I listened skeptically to that happy future-of-news lecture. Whether Google actually intended to develop some brilliant new revenue model or not, what alarmed me was that none of us were asking a far more important question. Sitting towards the back of the room, somewhat conscious of my relatively junior status, I hesitated, waiting until the final moments of the meeting before raising my hand. "Excuse me," I said, "we're talking about all this like journalism is the only thing that matters, but isn't this really about democracy? If newspapers and magazines are all driven out of business by Google or companies like it," I asked, "how are people going to find out what's going on?" Larry Page looked at me with an odd expression, as if he were surprised that someone should be asking such a naive question. "Oh yes, we've got a lot of people thinking about that." Not to worry, his tone seemed to say; Google had the engineers working on that little democracy problem. Next question.

I read that because I'm kind of amazed that there is still a real lack of understanding, I think, in the Valley about some of the real negative externalities of what have been, let's face it, amazing technologies. I mean, where would we be without search and our smartphones? We're all carrying around the power of a mainframe in our pockets. But as a journalist, I think there's really been an inability of these companies to own up to some of the bad stuff that they have wrought, and I think that still continues to be the case. One of the other points
that I try to make in the book is that the problems I'm talking about have actually moved outside of just the big four platform firms, that we're moving into a world in which surveillance capitalism is going to be part of the healthcare system and the financial system, and really every kind of business is now using this as its model. So, for example, if you buy coffee at Starbucks, Starbucks knows a lot about you. Johnson & Johnson knows a lot about you. There are firms watching you all the time, and so we're really at a pivot point, I think, where we have to ask as a society: what are the deeper implications of this, and are we okay with them? And so I would like to read another excerpt, where I look at how this model is moving into the insurance sector and what that means.

So far, data has been obtained via computers and mobile devices, but now, with the rise of personal digital assistants like Amazon's Alexa, Google's Home Mini, and Apple's Siri, now in a third of American homes, with triple-digit sales growth a year, the human voice is the new gold. While reports of Alexa and Siri listening in on conversations and phone calls are disputed, there's no question that they can hear every word you say, and from there it's a short step to them using that knowledge to direct your purchasing decisions. It isn't much of a longer step to see the political implications: already, some researchers worry that digital assistants will become even more powerful tools than social media for election manipulation. Certainly none of us will be unaffected. Consider that homeowner... oops, sorry, I'm reading from the wrong part, I think. Apologies, somehow picked the wrong section here. Anyway, I'm going to talk you through this example, because it's something that
is already out there. I had a conversation a couple of years ago with an executive from Zurich Financial, which is a big financial company; they do insurance in many parts of the world. They will now, if you'd like them to, put sensors in your home or in your car, and if, for example, as I do, you live in a 1901 townhouse, let's say you're upgrading your pipes, you get a positive mark, and you may see your insurance premium go down. But let's say your kid is smoking a joint in their bedroom and the sensor picks up on that: you then get a black mark, and your premium may go up. Same again in your car: if you're speeding, your insurance company will know, and so on and so forth. Now, you can either like this or not, depending on where you sit in the socio-economic spectrum, but what's very interesting is that the entire business model, a pooled-risk business model, which is what insurance is, has now been completely disaggregated, so you can be targeted and split. So this is no longer about society pooling risk; this is about individuals having to own the risk. If you take that to its natural conclusion, you can imagine an elite up here that has access to special pricing and all kinds of great products, but you can also imagine an uninsurable group of people at the bottom, and then who is going to pick up that risk? The public sector? Maybe there'll be a junk bond market for insurance. Either way, you have a split in society that didn't exist before. And that was always the business model here. You go back and read some of the early work of someone like Hal Varian, for example, who was the chief economist at Google: splitting pricing down to the individual was always the point of platform technology firms like Google or Facebook or Amazon,
splitting individuals out so they could be targeted in different ways. But that not only splits pricing, it splits society, and that's really the core issue I want to get at here.

I think I'll maybe read just one more excerpt, and then, do we have time? Yeah, and then we'll open it up for questions after that. My first book, just to mention again, was about the financial industry, and one of the things that strikes me is that big tech companies have in some ways become the new too-big-to-fail entities. Not only are they holding more wealth and power than the largest banks, but in some ways they function like banks: they have a tremendous amount of money, and they use it to buy up corporate debt. If that debt were to go bad, that could actually be the beginnings of another financial crisis, and that's a part of this story that really hasn't gotten out there. So let me read just two or three more pages for you on that topic.

The late, great management guru Peter Drucker once said that in every major economic downturn in US history, the villains have been the heroes during the preceding boom. I can't help but wonder if that might be the case over the next few years, as the United States, and possibly the world, heads towards its next big slowdown. Downturns historically come about once every decade, and it's been more than that since the 2008 financial crisis. Back then, banks were the too-big-to-fail institutions responsible for our falling stock portfolios, home prices, and salaries. Technology companies, by contrast, have led the market upswing over the past decade. But this time around, it's the big tech firms that could play the spoiler role. You wouldn't think that it could be so when you look at the biggest and richest tech firms today. Take Apple, for example:
Warren Buffett says he wished he owned even more Apple stock. Goldman Sachs is launching a new credit card with the tech titan, which became the world's first trillion-dollar-market-cap company in 2018. But hidden within these bullish headlines are a number of disturbing economic trends, of which Apple is already an exemplar. Study this one company and you begin to understand how big tech companies, the new too-big-to-fail institutions, could indeed sow the seeds of the next financial crisis. The first thing to consider is the financial engineering done by such firms. Like most of the largest and most profitable multinational companies, Apple has loads of cash, about $300 billion, as well as plenty of debt, close to $122 billion. That's because, like nearly every other large, rich company, it has parked most of its spare cash in offshore bond portfolios over the last ten years. At the same time, since the 2008 crisis, it has issued cheap debt at low rates in order to do record amounts of share buybacks and dividends. Apple is responsible for about a quarter of the $407 billion in buybacks announced since the Trump tax bill was passed in December of 2017, but buybacks have bolstered mainly the top 10% of the US population that owns 84% of all stock. The fact that share buybacks have become the biggest single use of corporate cash for over a decade now has buoyed markets, but it's also increased the wealth divide, which many economists believe is not only the single biggest factor in slower-than-historic-trend growth but is also driving the political populism that threatens the system itself. That phenomenon has been put on steroids by the rise of yet another trend epitomized by Apple: intangibles, such as intellectual property and
brands, now make up a much larger share of wealth in the global economy. The digital economy has a tendency to create superstars, since software and internet services are so scalable and they enjoy network effects. But as software and internet services become a bigger part of the economy, they reduce investment across the economy as a whole, and that's not only because banks are reluctant to lend to businesses whose intangible assets may simply disappear if they go belly-up, but because of the winner-take-all effect that a handful of companies, including Apple, Amazon, and Google, enjoy. So, to sum this up in plain English: as this handful of companies has gotten bigger and more powerful, investment in the overall economy has declined, and the number of jobs that they're creating relative to their market size is much lower than in the past. So you have the superstar economy that has become kind of a winner-take-all game.

I think that we're probably going to see some kind of a market correction in the next couple of years. It's going to be very interesting at that point to see whether tech leads the markets down, and whether you might then see a kind of Occupy Silicon Valley sentiment, as you did in 2008 with Occupy Wall Street. I think that's really quite possible. We can delve more into that if you'd like, but I think I want to stop here and be respectful of question time, and if there are parts that you guys want to hear more about, or particular areas that I could read more from, you can let me know. Go ahead.

Sure, because we don't get to speak very often, you and I. One is: you've doubtless read about Bloomberg's decision recently to forbid its reporters from covering Michael Bloomberg, yet the Washington Post has no problem investigating Bezos. Do you see, is
that a problem for you? Have you thought about that? Is that an inconsistency that should bother financial journalists? And the second question is: how important for any solution to the problems you raise would a revival of antitrust be, as we see on the continent, where it's more aggressive, and among some of the Democratic candidates for president?

Well, so let me take the antitrust question first. That's actually an important part of the book; there's an entire chapter on antitrust, and I think we probably are going to see some shifts. As folks may know, from the 1980s onward, antitrust in America has basically been predicated on price: as long as consumer prices were falling, it was perceived that companies could be as big as they wanted, that it wasn't a problem. But one of the things I look at in the book is this shift to a world in which transactions are being done not in dollars but in data. That's really a barter transaction, and one of the things that's so interesting, and this is another way in which Silicon Valley is similar to Wall Street, is that the transaction is really opaque. You don't know, essentially, how much you're paying for the supposedly free service that you're receiving. That is a very difficult market to create fairness within, and it probably makes the Chicago School notion, that consumer prices going down means no problem, irrelevant, I think. And so there are two ways in which that's being dealt with. You have the rise of this new Brandeis school of thinking, in which maybe this is really about power; maybe we should think about the big tech firms the way we do the nineteenth-century railroads. Right? You had, at one point, railroad titans that would come in and build tracks, and then own the
cars, and then own the things that were in the cars, and eventually that became a zero-sum game. And as folks probably know, we're in a period in which there's as much concentration of wealth and power as there was in the Gilded Age. So I could very easily imagine a scenario in which you could justify Amazon, say, being the platform for e-commerce, but not being able to compete in specific areas like fashion, or whatever else they're selling, against their own customers. And in fact, that's already the case in the financial sector: big companies that trade, let's say, aluminum, as Goldman Sachs did, this is what it ran into a few years ago, that it was both owning all the aluminum and trading it, and that's anti-competitive, and so that became an issue for the Fed. So I think we probably are going to see that kind of ruling.

As for the Post and journalism, you know, it's funny, I have some friends who are quite influential at the Post, and they say that Bezos is pretty hands-off. I mean, I can't vouch for that. One thing I will say is that Amazon did put this book on its top 20 nonfiction list this month, so I don't know if that's a ploy to make me think that they're being really fair, but probably Jeff Bezos is not thinking that much about this book or me. But anyway, next question, go ahead.

So it seems like some of the major decisions that these big tech companies are making are in regard to fake news, and how they're moderating fake news, or the lack of it. So have you seen maybe an approach by any current social media platform, or any proposed plans in place, that you think would be best for moderating fake news?

That's such a good question. So, just to pull back, the two points
of view on that are: hey, look, the platform tech companies are essentially giant media and advertising firms. Right? I mean, if you look at the business model of a Google or a Facebook, it's essentially just like the Financial Times or CNN, it's just much more effective, and it can be targeted to the individual. That means that these firms have taken 85 to 90 percent of the new digital advertising pie in the last few years. Now, given that they function as media companies, should they not be liable for disinformation in the way that a media company would be? If I print something incorrect at the FT, it's the paper's, and also my, hide on the line there. I think that we should actually think about rolling back some of those loopholes that these firms have enjoyed since the mid-1990s onwards. I think they are going to have to take some responsibility. Now, the question is: do we want Mark Zuckerberg being the minister of truth? And that's a really tough question. What I would prefer is for democratically elected governments to come up with rules about what is and isn't appropriate, and not to have individual companies making those choices. I think we're in a period right now where you've got Twitter, and you've got Google to a certain extent, coming out saying, okay, we recognize we need to do things differently. That's putting pressure on Facebook. But at the end of the day, we're going to have to have, I think, an entirely new framework, not just in this area but also in taxation and in antitrust, which we've already talked about. This shift that we're going through is, I think, the new Industrial Revolution. It's a 70-year transition, and it's going to require a lot of different frameworks relative to what we
already have. So the answer is no, I don't see any particular company that has come up with the right framework yet. Any other questions?

Yeah, I'd like to go back to antitrust for a minute. The Washington Post put up an article just this afternoon about how Apple is changing its business model. As you know, it's differentiated itself in the market by saying they care about privacy. Well, now they are moving from a device company to a services company, according to the article, and they are using privacy as a lever to provide services that other, smaller companies, like Tile, which is the example the article uses, have used to create a market for themselves. Right? And it says in the article that the feds are considering looking at antitrust measures against Apple, but I think it raises a bigger question that you pointed to, which is that the models of antitrust don't work anymore. So, in terms of privacy, lots of people have talked about monetizing privacy, getting paid for data, but how do you think, from an economic point of view, we as a society need to look at the role of privacy and the role of antitrust together, to somehow change the way we think about these companies? Because, in addition, we've got consolidation in the marketplace, so there's no longer fair competition. You can't easily become another Amazon, because the players are big and there are so few of them in each part of the economy.

Right, so there's a lot in what you've just said. For starters, I think you're hitting on something really important, which I get at in my solutions chapter: this is such a huge shift, and it's touching so many different areas. We've talked about privacy, we've talked about antitrust, we haven't even gotten into national
security you32:17know civil liberties I mean there there32:19are so many different areas and when you32:21one of the things I noticed when I sat32:23down to write the solution sections you32:25know when you do a think book you always32:26have to have the solutions section and32:28you know the publisher wants like that32:29Silver Bullet thing and you look at this32:32and you notice that when you pull a32:33lever here it effects something in this32:35other areas so I think that’s one reason32:38why we should have a national committee32:41to actually look at what are all the32:43questions it’s when I speak to folks32:45particularly in DC policymakers there’s32:47you know the antitrust camp here the32:49privacy camp here the security folks32:50there that conversation needs to be32:52happening in a 360 way and it is32:54happening much more so that way in32:57Europe I will say I just came off of two32:59weeks of book touring in Europe and the33:02conversation there I think is much more33:04developed and they seem to be to go to33:06your point about the ecosystem and how33:08share it one of the things that seems to33:11be folks seem to be headed towards is a33:13public digital Commons a kind of a33:16database let’s say alright if you decide33:19as you know the cat seems to be out of33:21the bag that we’re gonna allow33:22surveillance capitalism I mean there33:24there are certain folks like Shoshanna33:26would love to see the dial turned back33:28I’m not sure if that’s possible let’s33:30have a public database in which not just33:33one corporation or a handful of33:35corporations but multiple sized players33:37as well as the public sector as well as33:40individual citizens who’s you know after33:43all it’s our data being harvested33:45everybody gets access and then you can33:47figure out how you want to share the pie33:49and one interesting example recently is33:51the Google sidewalk project in Toronto33:54it sounds like you’re up on these issues33:56so you’re probably 
aware but Google had33:59taken over sort of twelve acres on the34:01Toronto Waterfront and put sensors34:04everywhere and the idea was to create a34:06smart city in which you’d be able to34:08manage traffic patterns and energy usage34:10and things like that but until recently34:12Google was going to own all that data34:14and have access to and finally the34:16Toronto government got a clue and said34:18well actually you know what let’s put34:19this in a public database so other34:22smaller or midsize local firms can come34:25in and be part of that economic34:26ecosystem but also as a public sector we34:30can decide well maybe we want to share34:32data for energy issues or for health34:36issues but maybe we don’t want to share34:37it for certain other kinds of things and34:40perhaps there would be some way in which34:42individuals could take back some of that34:44value so California is thinking about a34:47digital dividend payment from the big34:49tech companies there’s also been talk of34:51a digital sovereign wealth fund if you34:53think about kind of data as the new oil34:56whatever the value is judged to be it34:59would be putting the sovereign wealth35:01fund in the same way that Alaska or35:02Wyoming give back payments or use that35:05for the the public sector that could be35:08done with data too so I think something35:10like that is probably going to be the35:12best solution I’ll tell you I have many35:14examples in the book of ways in which35:16the bigger players have been able to35:18squash small and mid-sized firms and35:20that35:21a major issue and a lot of venture35:23capitalists that I speak to are actually35:26becoming concerned about that because35:27they say that there’s sort of black35:29zones of innovation where if Amazon is35:33there or Google is there you really35:34can’t start a business there’s just been35:36too much that’s been been written35:38ring-fenced question over here35:40yes while your book may be the the best35:43one on the subject 
they’ve certainly35:44been other books before talking about35:46individuals privacy and their their data35:49and everything about them why is it that35:52you think people are so unconcerned35:56about handing over all of their data to35:58these companies when they are perhaps36:00very concerned about handing it over to36:02the government why why do they feel36:04these guys are the good guys and the36:06government is necessarily the bad guys36:08yeah it’s such an interesting question36:11and that really varies from country to36:13country I find that that’s sort of an36:15interesting cultural dynamic that can36:17shift depending on what market you’re in36:19I have really been puzzled as to why36:23people are so first of all why everybody36:25just clicks the box and says no problem36:27I think part of that is is the opacity36:29of the market I mean if you kind of go36:31back to Adam Smith basic economics you36:35need three things to make a market36:36function property properly that would be36:38equal access to data transparency in the36:41transaction and a shared moral framework36:44and you could argue that none of those36:46things are in place so when we’re making36:48these transactions I think as that’s36:51that very fact becomes better explained36:56and people begin to kind of understand36:58that narrative like the insurance36:59example I just gave that all right37:02you’re getting something but you’re37:03giving up a lot I’m beginning to see37:07pushback already and I suspect in recent37:10weeks as some of the big players have37:11moved into healthcare you know into into37:14the commercial banking business I just37:17think that we are going to begin to see37:18more people being reluctant to give up37:23that much value for what they’re getting37:25you’re also interestingly seeing when37:28there are other options people will go37:31elsewhere so Jimmy Wales who started37:33Wikipedia just I think37:34the weeks ago came up with a new social37:36networking site he’s 
already got 300,00037:38users there and it’s an odd37:41they don’t do targeted advertising it’s37:42run on the wiki model where you can37:44donate if you want I think once the37:47antitrust piece is in place and you37:49actually have space for new competitors37:52to come in and to offer up different37:54kinds of services that perhaps are more37:56respectful of privacy that you you know37:58you could see a shift there but I’m38:00curious actually can I pull the audience38:02for a minute because I want to ask how38:04many people think that in the next five38:07years individuals are going to become38:09more worried about giving up information38:12that’s going to change their behavior38:13online so like two-thirds but not yeah38:19that’s interesting okay oh go ahead38:23sorry we’re sheep we’re cheap oh my god38:26that was a different book curious if you38:30see the administration’s38:32suggestion that it the California can’t38:35set its own rules for gas mileage and so38:41on and emissions as having a parallel in38:44this area you know I hadn’t thought38:48about that question before I always38:50think about California as really being38:53very leading what is eventually going to38:55become the national standard and I think38:58in data I feel like that is gonna happen39:01you know even the Europeans in fact are39:04saying that the California model is39:06probably the better model for data data39:08protection and privacy and sharing of39:11value so the Europeans have GDP are you39:13know which was kind of the first step in39:15the privacy direction but it doesn’t39:17take into account that economic39:18ecosystem so perversely you have the big39:21companies maybe being able to do better39:23with the GDP our model and smaller ones39:26getting cut out of the loop because they39:27don’t have the legal muscle to kind of39:29deal with all the rules so I do think39:31the California model is going to become39:32a de facto standard we also haven’t39:34talked about China which is 
of course39:36going its own way and I have it I have a39:38chapter in the book where I look at that39:40I look at the current trade war tech war39:43kind of through the lens of surveillance39:44capitalism and39:46that’s gonna be very interesting I think39:48one of the big probably the biggest mid39:52to long-term economic question for me is39:54are we going to see a transatlantic39:56alliance around digital trade and coming39:59up with some standards because China is40:01going its own direction it’s going to40:02develop its own ecosystem it has its own40:04big players the u.s. is in another40:07category but where is Europe gonna be is40:09it going to be a tri polar world is it40:11going to be a bipolar world in terms of40:13how all this works that that’s a major40:15ik you cannot make an actually foreign40:16policy question I think hey thanks for40:21coming and thinking um I’m wondering we40:25have like a Department of Agriculture we40:27have a Department of Energy will there40:29be a Department of Technology ever in40:30the US and which other countries already40:33have that kind of thing going yeah40:35England is talking about that actually I40:37think kind of an FDA of Technology is40:40probably a very good idea you know I see40:44going back to the example about my son40:46there there40:46the research is nascent and causality is40:49is difficult to prove but there there’s40:51you know a new body of research since40:542011 2012 when smartphones really became40:57ubiquitous showing that levels of40:59anxiety and depression and younger41:01people arising you know they’re there41:04they’re issues of self harm sometimes41:07when people you know use these41:08technologies addictively so I think that41:11that’s that’s a big issue to me it’s41:12very similar to cigarettes you know41:14those were regulated there was a41:16different narrative and then behaviors41:17changed and I think I think that that’s41:20one area to consider policy wise there41:25may be time for one or 
two more41:26questions41:27okay sorry over here and then over here41:29hi41:30I’m kind of curious what you think about41:33the fact that most of these41:36conversations around technology or even41:38democracy tends to focus on institutions41:41and systems and structures which is41:44great because they are so powerful and41:47ubiquitous my background is in teaching41:51critical thinking and in conflict41:54management and I what I worry41:58that so little attention is being paid42:01to the intelligence and maturity of the42:05citizenry I’m from India after 70 years42:10of democracy we’ve lost it I think it’s42:15simplistic to blame the right-wing42:18leaders and the government I believe we42:22as a people have not developed the42:24maturity to be effective intelligent42:30citizens we don’t have the values we are42:34still feudal we are still extremely42:37hierarchical we don’t have the42:40democratic values in India and we didn’t42:43cultivated over 70 years I see a42:46parallel to being susceptible to the42:52seductions of Technology whether it be42:56free news or the click baiting or43:00anything that the big companies seduce43:04us with that even as we need as you said43:08they an FDA kind of for technology we43:13seem to be observing ourselves of the43:17responsibility of being you know of43:20waking up and no se pians I hear I hear43:24what you’re saying and it’s interesting43:25two things come to mind first of all as43:28I say I just got back from Europe the43:29debate is much more nuanced there and43:33and further along and I think that’s in43:36part because there was not quite as much43:39pendulum shift in the last 40 or 5043:41years from the public sector to the43:43private sector as there was here I think43:46I’m not quite sure if I agree entirely43:49with your point about institutions I43:50think in some ways part of the problem43:53one of the reasons why we have43:54concentration levels that are same as43:57they were in the 19th century is that44:00you 
know we have a generation of44:03business leaders that grew up in the 80s44:04thinking that the government was only44:06good for cutting taxes and there’s hyper44:09individualism that’s that’s44:12the entire economic model and in some44:14ways I think that you know Facebook is44:16maybe the apex of the neoliberal44:19economic model if you think about the44:22problems of globalization were that cap44:25but you know it was supposed to be44:27globalization was supposed to be about44:28capital goods and people crossing44:30borders well it turned out the capital44:31could cross a lot faster than either44:33goods or people if you take that into44:36the world of data that’s even more true44:38and so I think that you have a group of44:41companies now that have really44:44turbocharged a lot of the problems that44:46have given us the politics that we have44:49now and and a company like Facebook I44:51mean I think it every time Zuckerberg is44:52on the hill it’s like there’s this44:54attitude that they are supranational you44:56know and kind of flying 35,000 feet44:59above national concerns and I think that45:02that’s part of a larger shift and45:04probably going to be a big part of the45:052020 debate right are we gonna now have45:08a pendulum shift back away from private45:12power to some public power some45:14different sharing of that which is a45:16values question which I think gets at45:18some of what you’re talking about45:20long-winded answer anyway I think we45:22have time for maybe one more question45:23yeah – quick question okay one is some45:27of the tech companies especially the45:29platform companies have you know why45:32should we not consider looking at them45:35as utility companies yeah I mean we’ve45:39had phone companies and as far as I know45:41they don’t data mine our conversations45:43and maybe mistaken a bit right I mean45:46right that’s they could easily right45:49right yes it’s different different45:50business model yeah yeah so so that was45:52one the 
other thing is you mentioned45:54that eventually we need tech policy45:56around this and the issue at least my45:59issue is that the people who make these46:01decisions the the policy makers they46:05just most to them don’t have the46:07technical background right to properly46:10assess the different choices and make46:12those decisions I mean I think one of46:15them Zuckerberg or someone testified the46:17questioning was just awful I mean they46:20just ignore our tech support was46:22terrible46:23yeah exactly so I know anyway whatever46:27thoughts you have no that’s a great and46:29that’s like maybe a great place to sort46:31of wrap up I think the utility model is46:34completely viable and it’s interesting46:36one of the bits of pushback that you’ll46:38sometimes get from folks in the valley46:40about that is well if we’re if we’re46:42split in this way or if the the capacity46:46to innovate is sort of you know46:47compressed as the profit margins would46:49be compressed in a utility model that’ll46:52be bad for innovation not really I mean46:54there’s the statistics show for starters46:56that companies innovate more when46:58they’re smaller they tend to innovate47:00more when they’re private and breakups47:03in the past you can argue have actually47:05created more innovation so a lot of47:07academics would say that even the the47:10the the antitrust just the threat of47:13antitrust action against Microsoft was47:15one of the reasons that Google was47:16allowed to to blossom as it did you can47:19go back to the breakup of the bells and47:22say maybe that created space for the47:25cellphone industry to to move ahead so I47:28think there’s a lot of examples that a47:31more decentralized model is actually a47:34good thing and I think that that is47:36actually going to be a really important47:37thing because right now there’s this I47:40think very perverse debate in the u.s.47:42that is bringing together parts of the47:45far right and parts the far left that47:47all 
right we need these companies to47:48stay big because they’re the national47:50champions and the the becoming war with47:52China that is a complete bunk that is47:56not shown out first of all I mean these47:58companies would love to be in China if47:59they could get into China you know I48:02think decentralized is the advantage in48:06all respects in the US economically so48:09yeah I’m have no problems with a utility48:12model anyway I think my time is up but48:15I’d be happy to sign books and answer48:17any other questions here at the table48:18and thanks so much for your attention48:19[Applause]48:34you
How Google Interferes With Its Search Algorithms and Changes Your Results
The internet giant uses blacklists, algorithm tweaks and an army of contractors to shape what you see
Every minute, an estimated 3.8 million queries are typed into Google, prompting its algorithms to spit out results for hotel rates or breast-cancer treatments or the latest news about President Trump.
They are arguably the most powerful lines of computer code in the global economy, controlling how much of the world accesses information found on the internet, and the starting point for billions of dollars of commerce.
Twenty years ago, Google founders began building a goliath on the premise that its search algorithms could do a better job combing the web for useful information than humans. Google executives have said repeatedly—in private meetings with outside groups and in congressional testimony—that the algorithms are objective and essentially autonomous, unsullied by human biases or business considerations.
The company states in a Google blog, “We do not use human curation to collect or arrange the results on a page.” It says it can’t divulge details about how the algorithms work because the company is involved in a long-running and high-stakes battle with those who want to profit by gaming the system.
But that message often clashes with what happens behind the scenes. Over time, Google has increasingly re-engineered and interfered with search results to a far greater degree than the company and its executives have acknowledged, a Wall Street Journal investigation has found.
Those actions often come in response to pressure from businesses, outside interest groups and governments around the world. They have increased sharply since the 2016 election and the rise of online misinformation, the Journal found.
Google’s evolving approach marks a shift from its founding philosophy of “organizing the world’s information,” to one that is far more active in deciding how that information should appear.
More than 100 interviews and the Journal’s own testing of Google’s search results reveal:
• Google made algorithmic changes to its search results that favor big businesses over smaller ones, and in at least one case made changes on behalf of a major advertiser, eBay Inc., contrary to its public position that it never takes that type of action. The company also boosts some major websites, such as Amazon.com Inc. and Facebook Inc., according to people familiar with the matter.
• Google engineers regularly make behind-the-scenes adjustments to other information the company is increasingly layering on top of its basic search results. These features include auto-complete suggestions, boxes called “knowledge panels” and “featured snippets,” and news results, which aren’t subject to the same company policies limiting what engineers can remove or change.
• Despite publicly denying doing so, Google keeps blacklists to remove certain sites or prevent others from surfacing in certain types of results. These moves are separate from those that block sites as required by U.S. or foreign law, such as those featuring child abuse or with copyright infringement, and from changes designed to demote spam sites, which attempt to game the system to appear higher in results.
• In auto-complete, the feature that predicts search terms as the user types a query, Google’s engineers have created algorithms and blacklists to weed out more-incendiary suggestions for controversial subjects, such as abortion or immigration, in effect filtering out inflammatory results on high-profile topics.
• Google employees and executives, including co-founders Larry Page and Sergey Brin, have disagreed on how much to intervene on search results and to what extent. Employees can push for revisions in specific search results, including on topics such as vaccinations and autism.
• To evaluate its search results, Google employs thousands of low-paid contractors whose purpose the company says is to assess the quality of the algorithms’ rankings. Even so, contractors said Google gave feedback to these workers to convey what it considered to be the correct ranking of results, and they revised their assessments accordingly, according to contractors interviewed by the Journal. The contractors’ collective evaluations are then used to adjust algorithms.
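The blacklist mechanism described in the findings above — dropping certain completions or results before they reach the user — can be sketched in a few lines. This is a purely illustrative toy, assuming a simple substring deny-list; the patterns, function names, and data here are all hypothetical, and Google's actual systems are not public.

```python
# Hypothetical sketch of deny-list filtering applied to auto-complete
# suggestions. All patterns and suggestions below are illustrative.

BLOCKED_PATTERNS = {"are evil", "is a hoax"}  # assumed deny-list entries

def filter_suggestions(suggestions):
    """Drop any predicted completion containing a blocked pattern."""
    return [
        s for s in suggestions
        if not any(p in s.lower() for p in BLOCKED_PATTERNS)
    ]

raw = [
    "immigrants are hardworking",
    "immigrants are evil",
    "climate change is a hoax",
]
print(filter_suggestions(raw))  # only the first suggestion survives
```

The key property the Journal's findings point to is that this step is invisible to users: the filtered list looks like an ordinary algorithmic output, with no indication that anything was removed.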
THE JOURNAL’S FINDINGS undercut one of Google’s core defenses against global regulators worried about how it wields its immense power—that the company doesn’t exert editorial control over what it shows users. Regulators’ areas of concern include anticompetitive practices, political bias and online misinformation.
Far from being autonomous computer programs oblivious to outside pressure, Google’s algorithms are subject to regular tinkering from executives and engineers who are trying to deliver relevant search results, while also pleasing a wide variety of powerful interests and driving its parent company’s more than $30 billion in annual profit. Google is now the most highly trafficked website in the world, and it commands more than 90% of the market for all search engines. The market capitalization of its parent, Alphabet Inc., is more than $900 billion.
Google made more than 3,200 changes to its algorithms in 2018, up from more than 2,400 in 2017 and from about 500 in 2010, according to Google and a person familiar with the matter. Google said 15% of queries today are for words, or combinations of words, that the company has never seen before, putting more demands on engineers to make sure the algorithms deliver useful results.
A Google spokeswoman disputed the Journal’s conclusions, saying, “We do today what we have done all along, provide relevant results from the most reliable sources available.”
Lara Levin, the spokeswoman, said the company is transparent in its guidelines for evaluators and in what it designs the algorithms to do.
AS PART OF ITS EXAMINATION, the Journal tested Google’s search results over several weeks this summer and compared them with results from two competing search engines: Microsoft Corp.’s Bing and DuckDuckGo, a privacy-focused company that builds its results from syndicated feeds from other companies, including Verizon Communications Inc.’s Yahoo search engine.
The testing showed wide discrepancies in how Google handled auto-complete queries and some of what Google calls organic search results—the list of websites that Google says are algorithmically sorted by relevance in response to a user’s query. (Read about the methodology for the Journal’s analysis.)
Ms. Levin, the Google spokeswoman, declined to comment on specific results of the Journal’s testing. In general, she said, “Our systems aim to provide relevant results from authoritative sources,” adding that organic search results alone “are not representative of the information made accessible via search.”
The Journal tested the auto-complete feature, which Google says draws from its vast database of search information to predict what a user intends to type, as well as data such as a user’s location and search history. The testing showed the extent to which Google doesn’t offer certain suggestions compared with other search engines.
Typing “Joe Biden is” or “Donald Trump is” in auto-complete, Google offered predicted language that was more innocuous than the other search engines. Similar differences were shown for other presidential candidates tested by the Journal.
The Journal also tested several search terms in auto-complete such as “immigrants are” and “abortion is.” Google’s predicted searches were less inflammatory than those of the other engines.
[Interactive: results of the Journal’s auto-complete tests. Percentages indicate how many times each suggestion appeared during the WSJ’s testing.]

Gabriel Weinberg, DuckDuckGo’s chief executive, said that for certain words or phrases entered into the search box, such as ones that might be offensive, DuckDuckGo has decided to block all of its auto-complete suggestions, which it licenses from Yahoo. He said that type of block wasn’t triggered in the Journal’s searches for Donald Trump or Joe Biden.
A spokeswoman for Yahoo operator Verizon Media said, “We are committed to delivering a safe and trustworthy search experience to our users and partners, and we work diligently to ensure that search suggestions within Yahoo Search reflect that commitment.”
Said a Microsoft spokeswoman: “We work to ensure that our search results are as relevant, balanced, and trustworthy as possible, and in general, our rule is to minimize interference with the normal algorithmic operation.”
In other areas of the Journal analysis, Google’s results in organic search and news for a number of hot-button terms and politicians’ names showed prominent representation of both conservative and liberal news outlets.
ALGORITHMS ARE effectively recipes in code form, providing step-by-step instructions for how computers should solve certain problems. They drive not just the internet, but the apps that populate phones and tablets.
Algorithms determine which friends show up in a Facebook user’s news feed, which Twitter posts are most likely to go viral and how much an Uber ride should cost during rush hour as opposed to the middle of the night. They are used by banks to screen loan applications, businesses to look for the best job applicants and insurers to determine a person’s expected lifespan.
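The "recipe in code form" idea can be made concrete with a minimal example: score each candidate item against a query, then sort by score. This is a didactic sketch only — no real search engine ranks pages this naively — but it shows the step-by-step character the article describes.

```python
# Toy illustration of an algorithm as a recipe: rank pages by a
# simple term-frequency relevance score. Didactic sketch only.

def score(page, query):
    """Count how many times each query word appears in the page text."""
    text = page["text"].lower()
    return sum(text.count(word) for word in query.lower().split())

def rank(pages, query):
    """The 'recipe': score every page, then sort highest-first."""
    return sorted(pages, key=lambda p: score(p, query), reverse=True)

pages = [
    {"url": "a.example", "text": "hotel rates and hotel reviews"},
    {"url": "b.example", "text": "breast-cancer treatments"},
]
print(rank(pages, "hotel rates")[0]["url"])  # a.example
```

Even in this toy, a one-line change to `score` reorders every result — which is why the thousands of annual algorithm changes the article describes matter so much to the businesses downstream.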
In the beginning, their power was rarely questioned. At Google in particular, its innovative algorithms ranked web content in a way that was groundbreaking, and hugely lucrative. The company aimed to make the web useful while relying on the assumption that code alone could do the heavy lifting of figuring out how to rank information.
But bad actors are increasingly trying to manipulate search results, businesses are trying to game the system and misinformation is rampant across tech platforms. Google found itself facing a version of the pressures on Facebook, which long said it was just connecting people but has been forced to more aggressively police content on its platform.
A 2016 internal investigation at Google showed between a 10th of a percent and a quarter of a percent of search queries were returning misinformation of some kind, according to one Google executive who works on search. It was a small number percentage-wise, but given the huge volume of Google searches it would amount to nearly two billion searches a year.
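The arithmetic behind that estimate is worth making explicit. Using the 3.8 million queries per minute cited at the top of the article, a 0.1% rate works out to roughly two billion searches a year:

```python
# Back-of-envelope check of the misinformation estimate:
# ~3.8 million queries/minute, with 0.1%-0.25% affected.

queries_per_minute = 3.8e6
queries_per_year = queries_per_minute * 60 * 24 * 365  # ~2 trillion

low = queries_per_year * 0.001    # a 10th of a percent
high = queries_per_year * 0.0025  # a quarter of a percent

print(f"{queries_per_year:.2e} queries per year")
print(f"{low:.2e} to {high:.2e} affected searches per year")
```

The low end of the range lands at about 2 billion searches a year, consistent with the figure above; the quarter-percent end would be closer to 5 billion.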
By comparison, Facebook faced congressional scrutiny for Russian misinformation that was viewed by 126 million users.
Google’s Ms. Levin said the number includes not just misinformation but also a “wide range of other content defined as lowest quality.” She disputed the Journal’s estimate of the number of searches that were affected. The company doesn’t disclose metrics on Google searches.
Google assembled a small SWAT team to work on the problem that became known internally as “Project Owl.” Borrowing from the strategy used earlier to fight spam, engineers worked to emphasize factors on a page that are proxies for “authoritativeness,” effectively pushing down pages that don’t display those attributes.
Other tech platforms, including Facebook, have taken a more aggressive approach, manually removing problem content and devising rules around what they define as misinformation. Google, for its part, said that because its role is “indexing” content, rather than “hosting” content as Facebook does, it shouldn’t take a more active role.
One Google search executive described the problem of defining misinformation as incredibly hard, and said the company didn’t want to go down the path of figuring it out.
Around the time Google started addressing issues such as misinformation, it started fielding even more complaints, to the point where human interference became more routine, according to people familiar with the matter, putting it in the position of arbitrating some of society’s most complicated issues. Some changes to search results might be considered reasonable—boosting trusted websites like the National Suicide Prevention Lifeline, for example—but Google has made little disclosure about when changes are made, or why.
Businesses, lawmakers and advertisers are worried about fairness and competition within the markets where Google is a leading player, and as a result its operations are coming under heavy scrutiny.
The U.S. Justice Department earlier this year opened an antitrust probe, in which Google’s search policies and practices are expected to be areas of focus. Google executives have twice been called to testify before Congress in the past year over concerns about political bias. In the European Union, Google has been fined more than $9 billion in the past three years for anticompetitive practices, including allegedly using its search engine to favor its own products.
In response, Google has said it faces tough competition in a dynamic tech sector, and that its behavior is aimed at helping create choice for consumers, not hurting rivals. The company is currently appealing the decisions against it in the EU, and it has denied claims of political bias.
GOOGLE RARELY RELEASES detailed information on algorithm changes, and its moves have bedeviled companies and interest groups, who feel they are operating at the tech giant’s whim.
In one change hotly contested within Google, engineers opted to tilt results to favor prominent businesses over smaller ones, based on the argument that customers were more likely to get what they wanted at larger outlets. One effect of the change was a boost to Amazon’s products, even if the items had been discontinued, according to people familiar with the matter.
The issue came up repeatedly over the years at meetings in which Google search executives discuss algorithm changes. Each time, they chose not to reverse the change, according to a person familiar with the matter.
Google engineers said it is widely acknowledged within the company that search is a zero-sum game: A change that helps lift one result inevitably pushes down another, often with considerable impact on the businesses involved.
Ms. Levin said there is no guidance in Google’s rater guidelines that suggests big sites are inherently more authoritative than small sites. “It’s inaccurate to suggest we did not address issues like discontinued products appearing high up in results,” she added.
Many of the changes within Google have coincided with its gradual evolution from a company with an engineering-focused, almost academic culture, into an advertising behemoth and one of the most profitable companies in the world. Advertising revenue—which includes ads on search as well as on other products such as maps and YouTube—was $116.3 billion last year.
Some very big advertisers received direct advice on how to improve their organic search results, a perk not available to businesses with no contacts at Google, according to people familiar with the matter. In some cases, that help included sending in search engineers to explain a problem, they said.
“If they have an [algorithm] update, our teams may get on the phone with them and they will go through it,” said Jeremy Cornfeldt, the chief executive of the Americas of Dentsu Inc.’s iProspect, which Mr. Cornfeldt said is one of Google’s largest advertising agency clients. He said the agency doesn’t get information Google wouldn’t share publicly. Among others it can disclose, iProspect represents Levi Strauss & Co., Alcon Inc. and Wolverine World Wide Inc.
One former executive at a Fortune 500 company that received such advice said Google frequently adjusts how it crawls the web and ranks pages to deal with specific big websites.
Google updates its index of some sites such as Facebook and Amazon more frequently, a move that helps them appear more often in search results, according to a person familiar with the matter.
“There’s this idea that the search algorithm is all neutral and goes out and combs the web and comes back and shows what it found, and that’s total BS,” the former executive said. “Google deals with special cases all the time.”
Ms. Levin, the Google spokeswoman, said the search team’s practice is to not provide specialized guidance to website owners. She also said that faster indexing of a site isn’t a guarantee that it will rank higher. “We prioritize issues based on impact, not any commercial relationships,” she said.
[Chart: Alphabet’s net income by year, from 2005 onward, rising to roughly $30 billion. Note: The 2017 figure reflects a one-time charge of $9.9 billion related to new U.S. tax law. Alphabet was created through a corporate restructuring of Google in 2015. Figures for prior years are for Google Inc. Source: FactSet]
Online marketplace eBay had long relied on Google for as much as a third of its internet traffic. In 2014, traffic suddenly plummeted—contributing to a $200 million hit in its revenue guidance for that year.
Google told the company it had made a decision to lower the ranking of a large number of eBay pages that were a big source of traffic.
EBay executives debated pulling their quarterly advertising spending of around $30 million from Google to protest, but ultimately decided to step up lobbying pressure on Google, with employees and executives calling and meeting with search engineers, according to people familiar with the matter. A similar episode had hit traffic several years earlier, and eBay had marshaled its lobbying might to persuade Google to give it advice about how to fix the problem, even relying on a former Google staffer who was then employed at eBay to work his contacts, according to one of those people.
This time, Google ultimately agreed to improve the ranking of a number of pages it had demoted while eBay completed a broader revision of its website to make the pages more “useful and relevant,” the people said. The revision was arduous and costly to complete, one of the people said, adding that eBay was later hit by other downrankings that Google didn’t help with.
“We’ve experienced significant and consistent drops in Google SEO for many years, which has been disproportionally detrimental to those small businesses that we support,” an eBay spokesman said. SEO, or search-engine optimization, is the practice of trying to generate more search-engine traffic for a website.
Google’s Ms. Levin declined to comment on eBay.
Companies without eBay’s clout had different experiences.
Dan Baxter can remember the exact moment his website, DealCatcher, was caught in a Google algorithm change. It was 6 p.m. on Sunday, Feb. 18. Mr. Baxter, who founded the Wilmington, Del., coupon website 20 years ago, got a call from one of his 12 employees the next morning.
“Have you looked at our traffic?” the worker asked, frantically, Mr. Baxter recalled. It was suddenly down 93% for no apparent reason. That Saturday, DealCatcher saw about 31,000 visitors from Google. Now it was posting about 2,400. It had disappeared almost entirely on Google search.
Mr. Baxter said he didn’t know whom to contact at Google, so he hired a consultant to help him identify what might have happened. The expert reached out directly to a contact at Google but never heard back. Mr. Baxter tried posting to a YouTube forum hosted by a Google “webmaster” to ask if it might have been a technical problem, but the webmaster seemed to shoot down that idea.
One month to the day after his traffic disappeared, it inexplicably came back, and he still doesn’t know why.
“You’re kind of just left in the dark, and that’s the scary part of the whole thing,” said Mr. Baxter.
Google’s Ms. Levin declined to comment on DealCatcher.
(The Wall Street Journal is owned by News Corp, which has complained publicly about Google’s moves to play down news sites that charge for subscriptions. Google ended the policy after intensive lobbying by News Corp and other paywalled publishers. More recently, News Corp has called for an “algorithm review board” to oversee Google, Facebook and other tech giants. News Corp has a commercial agreement to supply news through Facebook, and Dow Jones & Co., publisher of The Wall Street Journal, has a commercial agreement to supply news through Apple services. Google’s Ms. Levin and News Corp declined to comment.)
GOOGLE IN RECENT months has made additional efforts to clarify how its services operate by updating general information on its site. At the end of October it posted a new video titled “How Google Search Works.”
Jonathan Zittrain, a Harvard Law School professor and faculty director of the Berkman Klein Center for Internet & Society, said Google has poorly defined how often or when it intervenes on search results. The company’s argument that it can’t reveal those details because it is fighting spam “seems nuts,” said Mr. Zittrain.
“That argument may have made sense 10 or 15 years ago but not anymore,” he said. “That’s called ‘security through obscurity,’ ” a reference to the now-unfashionable engineering idea that systems can be made more secure by restricting information about how they operate.
Google’s Ms. Levin said “extreme transparency has historically proven to empower bad actors in a way that hurts our users and website owners who play by the rules.”
“Building a service like this means making tens of thousands of really, really complicated human decisions, and that’s not what people think,” said John Bowers, a research associate at the Berkman Klein Center.
On one extreme, those decisions at Google are made by the world’s most accomplished and highest-paid engineers, whose job is to turn the dials within millions of lines of complex code. On the other is an army of more than 10,000 contract workers, who work from home and get paid by the hour to evaluate search results.
The rankings supplied by the contractors, who work from a Google manual that runs to hundreds of pages, can indirectly move a site higher or lower in results, according to people familiar with the matter. And their collective responses are measured by Google executives and used to affect the search algorithms.
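As a rough illustration of the feedback loop described above, here is a hypothetical Python sketch: individual contractor ratings never move a result directly, but scores aggregated per URL could serve as an evaluation signal for the algorithms. All names, scales and thresholds here are invented for illustration; the article does not describe Google's actual aggregation method.

```python
# Hypothetical sketch: aggregate contractor ratings into a per-URL
# quality signal. Requiring several raters per URL models the idea
# that no single contractor's judgment changes a result on its own.
from statistics import mean

def aggregate_ratings(ratings):
    """ratings: list of (rater_id, url, score) tuples, score in 1..5.
    Returns the mean score per URL, only for URLs rated by enough people."""
    MIN_RATERS = 3  # invented threshold
    by_url = {}
    for rater_id, url, score in ratings:
        by_url.setdefault(url, []).append(score)
    return {url: mean(scores)
            for url, scores in by_url.items()
            if len(scores) >= MIN_RATERS}

ratings = [
    ("r1", "example.com/a", 5), ("r2", "example.com/a", 4),
    ("r3", "example.com/a", 5),
    ("r1", "example.com/b", 2),  # only one rater: excluded from the signal
]
agg = aggregate_ratings(ratings)
assert "example.com/b" not in agg
```

A signal like this could then be compared before and after a proposed algorithm change, which matches the article's description of collective responses being "measured by Google executives."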
Mixed Results
Google’s results page has become a complex mix of search results, advertisements and featured content, not always distinguishable by the user. While these features are all driven by algorithms, Google has different policies and attitudes toward changing the results shown in each of the additional features. Featured snippets and knowledge panels are two common features.
Featured snippet: Highlights web pages that Google thinks will contain content a user is looking for. Google says it will remove content from the feature if it violates policies around harmful and hateful content.
Knowledge panel: Information Google has compiled from various sources on the web, such as Wikipedia, that provides basic facts about the subject of your query. Google is willing to adjust this material.
Organic search results: Links to results that Google’s algorithms have determined are relevant to your query. Google says it doesn’t curate these results.
One of those evaluators was Zack Langley, now a 27-year-old logistics manager at a tour company in New Orleans. Mr. Langley got a one-year contract in the spring of 2016 evaluating Google’s search results through Lionbridge Technologies Inc., one of several companies Google and other tech platforms use for contract work.
During his time as a contractor, Mr. Langley said he never had any contact with anyone at Google, nor was he told what his results would be used for. Like all of Google’s evaluators, he signed a nondisclosure agreement. He made $13.50 an hour and worked up to 20 hours a week from home.
Sometimes working in his pajamas, Mr. Langley was given hundreds of real search results and told to use his judgment to rate them according to quality, reputation and usefulness, among other factors.
At one point, Mr. Langley said he was unhappy with the search results for “best way to kill myself,” which were turning up links that were like “how-to” manuals. He said he down-ranked all the other results for suicide until the National Suicide Prevention Lifeline was the No. 1 result.
Soon after, Mr. Langley said, Google sent a note through Lionbridge saying the hotline should be ranked as the top result across all searches related to suicide, so that the collective rankings of the evaluators would adjust the algorithms to deliver that result. He said he never learned if his actions had anything to do with the change.
Mr. Langley said it seemed like Google wanted him to change content on search so Google would have what he called plausible deniability about making those decisions. He said contractors would get notes from Lionbridge that he believed came from Google telling them the “correct” results on other searches.
He said that in late 2016, as the election approached, Google officials got more involved in dictating the best results, although not necessarily on issues related to the campaign. “They used to have a hands-off approach, and then it seemed to change,” he said.
Ms. Levin, the Google spokeswoman, said the company “long ago evolved our approach to collecting feedback on these types of queries, which help us develop algorithmic solutions and features in this area.” She added that, “we provide updates to our rater guidelines to ensure all raters are following the same general framework.”
Lionbridge didn’t reply to requests for comment.
AT GOOGLE, EMPLOYEES routinely use the company’s internal message boards as well as a form called “go/bad” to push for changes in specific search results. (Go/bad is a reporting system meant to allow Google staff to point out problematic search results.)
One of the first hot-button issues surfaced in 2015, according to people familiar with the matter, when some employees complained that a search for “how do vaccines cause autism” delivered misinformation through sites that oppose vaccinations.
At least one employee defended the result, writing that Google should “let the algorithms decide” what shows up, according to one person familiar with the matter. Instead, the people said, Google made a change so that the first result is a site called howdovaccinescauseautism.com—which states on its home page in large black letters, “They f—ing don’t.” (The phrase has become a meme within Google.)
Google’s Ms. Levin declined to comment.
In the fall of 2018, the conservative news site Breitbart News Network posted a leaked video of Google executives, including Mr. Brin and Google CEO Sundar Pichai, upset and addressing staffers following President Trump’s election two years earlier. A group of Google employees noticed the video was appearing on the 12th page of search results when Googling “leaked Google video Trump,” which made it seem like Google was burying it. They complained on one of the company’s internal message boards, according to people familiar with the matter. Shortly after, the leaked video began appearing higher in search results.
“When we receive reports of our product not behaving as people might expect, we investigate to see if there’s any useful insight to inform future improvements,” said Ms. Levin.
FROM GOOGLE’S FOUNDING, Messrs. Page and Brin knew that ranking webpages was a matter of opinion. “The importance of a Web page is an inherently subjective matter, which depends on the [readers’] interests, knowledge and attitudes,” they wrote in their 1998 paper introducing the PageRank algorithm, the founding system that launched the search engine.
PageRank, they wrote, would measure the level of human interest and attention, but it would do so “objectively and mechanically.” They contended that the system would mathematically measure the relevance of a site by the number of times other relevant sites linked to it on the web.
Today, PageRank has been updated and subsumed into more than 200 different algorithms, attuned to hundreds of signals, now used by Google. (The company replaced PageRank in 2005 with a newer version that could better keep up with the vast traffic that the site was attracting. Internally, it was called “PageRankNG,” ostensibly named for “next generation,” according to people familiar with the matter. In public, the company still points to PageRank—and on its website links to the original algorithm published by Messrs. Page and Brin—in explaining how search works. “The original insight and notion of using link patterns is something that we still use in our systems,” said Ms. Levin.)
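The link-counting idea from the 1998 paper can be shown in a short Python sketch of the published PageRank recurrence: a page's score is the sum of shares of the scores of pages linking to it, plus a small "teleport" term. This is the original academic formulation, not Google's current production system.

```python
# Toy PageRank (1998 formulation): a page's importance is derived
# from the importance of the pages that link to it.
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly across all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# C is linked to by both A and B, so it ends up with the highest score.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
scores = pagerank(graph)
assert max(scores, key=scores.get) == "C"
```

This "objective and mechanical" counting is exactly what spammers later learned to game with artificial link networks, which is where the disputes over human intervention described below began.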
By the early 2000s, spammers were overwhelming Google’s algorithms with tactics that made their sites appear more popular than they were, skewing search results. Messrs. Page and Brin disagreed over how to tackle the problem.
Mr. Brin argued against human intervention, contending that Google should deliver the most accurate results as delivered by the algorithms, and that the algorithms should only be tweaked in the most extreme cases. Mr. Page countered that the user experience was getting damaged when users encountered spam rather than useful results, according to people familiar with the matter.
Google already had been taking what the company calls “manual actions” against specific websites that were abusing the algorithm. In that process, Google engineers demote a website’s ranking by changing its specific “weighting.” For example, if a website is artificially boosted by paying other websites to link to it, a behavior that Google frowns upon, Google engineers could turn down the dial on that specific weighting. The company could also blacklist a website, or remove it altogether.
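The "manual action" mechanism described above amounts to a site-specific multiplier applied on top of the algorithmic score, plus outright removal for blacklisted sites. The sketch below is entirely hypothetical code; only the concepts (per-site weighting, blacklisting, full removal) come from the text.

```python
# Hypothetical sketch of a manual action: engineers "turn down the dial"
# on a specific site's weighting, or remove a site from results entirely.
def final_score(url, algorithmic_score, site_weights, blacklist):
    """Apply a site-specific multiplier to the algorithmic score;
    blacklisted sites are dropped from results altogether."""
    site = url.split("/")[0]
    if site in blacklist:
        return None  # removed: never shown
    return algorithmic_score * site_weights.get(site, 1.0)

site_weights = {"linkfarm.example": 0.1}  # demoted for paid-link schemes
blacklist = {"spam.example"}

assert final_score("spam.example/page", 0.9, site_weights, blacklist) is None
assert final_score("normal.example/page", 0.7, site_weights, blacklist) == 0.7
```

The design point is that a demotion like this is targeted and reversible, which is why the dispute between the founders centered on how often such dials should be touched at all.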
Mr. Brin still opposed making large-scale efforts to fight spam, because it involved more human intervention. Mr. Brin, whose parents were Jewish émigrés from the former Soviet Union, even personally decided to allow anti-Semitic sites that were in the results for the query “Jew,” according to people familiar with the decision. Google posted a disclaimer with results for that query saying, “Our search results are generated completely objectively and are independent of the beliefs and preferences of those who work at Google.”
Finally, in 2004, in the bathroom one day at Google’s headquarters in Mountain View, Calif., Mr. Page approached Ben Gomes, one of Google’s early search executives, to express support for his efforts fighting spam. “Just do what you need to do,” said Mr. Page, according to a person familiar with the conversation. “Sergey is going to ruin this f—ing company.”
Ms. Levin, the Google spokeswoman, said Messrs. Page, Brin and Gomes declined to comment.
After that, the company revised its algorithms to fight spam and loosened rules for manual interventions, according to people familiar with the matter.
Google has guidelines for changing its ranking algorithms, a grueling process called the “launch committee.” Google executives have pointed to this process in a general way in congressional testimony when asked about algorithm changes.
The process is like defending a thesis, and the meetings can be contentious, according to people familiar with them.
In part because the process is laborious, some engineers aim to avoid it if they can, one of these people said, and small changes can sometimes get pushed through without the committee’s approval. Mr. Gomes is on the committee that decides whether to approve the changes, and other senior officials sometimes attend as well.
Google’s Ms. Levin said not every algorithm change is discussed in a meeting, but “there are other processes for reviewing more straightforward launches at different levels of the organization,” such as an email review. Those reviews still involve members of the launch committee, she said.
Today, Google discloses only a few of the factors being measured by its algorithms. Known ones include “freshness,” which gives preference to recently created content for searches relating to things such as breaking news or a sports event. Another is where a user is located—if a user searches for “zoo,” Google engineers want the algorithms to provide the best zoo in the user’s area. Language signals—how meanings change when words are used together, such as April and fools—are among the most important, as they help determine what a user is actually asking for.
Other important signals have included the length of time users would stay on pages they clicked on before clicking back to Google, according to a former Google employee. Long stays would boost a page’s ranking. Quick bounce backs, indicating a site wasn’t relevant, would severely hurt a ranking, the former employee said.
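The dwell-time signal the former employee describes can be sketched simply: long stays on a clicked result suggest relevance, quick "bounce backs" the opposite. The thresholds and scoring values below are invented for illustration; the article does not give Google's actual numbers.

```python
# Hypothetical dwell-time signal: map the time between clicking a result
# and returning to Google into a ranking nudge. Cutoffs are invented.
def dwell_signal(seconds_on_page):
    if seconds_on_page < 10:
        return -1.0   # quick bounce back: strong negative signal
    if seconds_on_page < 60:
        return 0.0    # ambiguous stay: no adjustment
    return 1.0        # long stay: positive signal

assert dwell_signal(3) == -1.0    # bounced almost immediately
assert dwell_signal(300) == 1.0   # read the page at length
```

Because only Google observes this click-and-return behavior at scale, a signal like this is one concrete reason its activity database is described as a competitive advantage.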
Over the years, Google’s database recording this user activity has become a competitive advantage, helping cement its position in the search market. Other search engines don’t have the vast quantity of data available to Google, the market leader in search.
That makes the impact of its operating decisions immense. When Pinterest Inc. filed to go public earlier this year, it said that “search engines, such as Google, may modify their algorithms and policies or enforce those policies in ways that are detrimental to us.” It added: “Our ability to appeal these actions is limited.” A spokeswoman for Pinterest declined to comment.
Search-engine optimization consultants have proliferated to try to decipher Google’s signals on behalf of large and small businesses. But even those experts said the algorithms remain borderline indecipherable. “It’s black magic,” said Glenn Gabe, an SEO expert who has spent years analyzing Google’s algorithms and tried to help DealCatcher find a solution to its drop in traffic earlier this year.
ALONG WITH ADVERTISEMENTS, Google’s own features now take up large amounts of space on the first page of results—with few obvious distinctions for users. These include news headlines and videos across the top, information panels along the side and “People also ask” boxes highlighting related questions.
Google engineers view the features as separate products from Google search, and there is less resistance to manually changing their content in response to outside requests, according to people familiar with the matter.
These features have become more prominent as Google attempts to keep users on its results page, where ads are placed, instead of losing the users as they click through to other sites. In September, about 55% of Google searches on mobile were “no-click” searches, according to research firm Jumpshot, meaning users never left the results page.
Two typical features on the results page—knowledge panels, which are collections of relevant information about people, events or other things; and featured snippets, which are highlighted results that Google thinks will contain content a user is looking for—are areas where Google engineers make changes to fix results, the Journal found.
Curated Features
Google has looser policies about making adjustments to these features than organic search results. The features include Google News and People also ask.
Top stories: News articles surfaced as being particularly relevant. Google blocks some sites that don’t meet its policies.
People also ask: A predictive feature that suggests related questions, providing short answers with links. Google says it weeds out and blocks some phrases in this feature as it does in its auto-complete feature.
In April, the conservative Heritage Foundation called Google to complain that a coming movie called “Unplanned” had been labeled in a knowledge panel as “propaganda,” according to a person familiar with the matter. The film is about a former Planned Parenthood director who had a change of heart and became pro-life.
After the Heritage Foundation complained to a contact at Google, the company apologized and removed “propaganda” from the description, that person said.
Google’s Ms. Levin said the change “was not the result of pressure from an outside group, it was a violation of the feature’s policy.”
On the auto-complete feature, Google reached a confidential settlement in France in 2012 with several outside groups that had complained it was anti-Semitic that Google was suggesting the French word for “Jew” when searchers typed in the name of several prominent politicians. Google agreed to “algorithmically mitigate” such suggestions as part of a pact that barred the parties from disclosing its terms, according to people familiar with the matter.
In recent years, Google changed its auto-complete algorithms to remove “sensitive and disparaging remarks.” The policy, now detailed on its website, says that Google doesn’t allow predictions that may be related to “harassment, bullying, threats, inappropriate sexualization, or predictions that expose private or sensitive information.”
GOOGLE HAS BECOME more open about its moderation of auto-complete but still doesn’t disclose its use of blacklists. Kevin Gibbs, who created auto-complete in 2004 when he was a Google engineer, originally developed the list of terms that wouldn’t be suggested, even if they were the most popular queries that independent algorithms would normally supply.
For example, if a user searched “Britney Spears”—a popular search on Google at the time—Mr. Gibbs didn’t want a piece of human anatomy or the description of a sex act to appear when someone started typing the singer’s name. The unfiltered results were “kind of horrible,” Mr. Gibbs said in an interview.
He said deciding what should and shouldn’t be on the list was challenging. “It was uncomfortable, and I felt a lot of pressure,” said Mr. Gibbs, who worked on auto-complete for about a year, and left the company in 2012. “I wanted to make sure it represented the world fairly and didn’t leave out any groups.”
Google still maintains lists of phrases and terms that are manually blacklisted from auto-complete, according to people familiar with the matter.
The company internally has a “clearly articulated set of policies” about what terms or phrases might be blacklisted in auto-complete, and it follows those rules, according to a person familiar with the matter.
Blacklists also affect the results in organic search and Google News, as well as other search products, such as Web answers and knowledge panels, according to people familiar with the matter.
Google has said in congressional testimony it doesn’t use blacklists. Asked in a 2018 hearing whether Google had ever blacklisted a “company, group, individual or outlet…for political reasons,” Karan Bhatia, Google’s vice president of public policy, responded: “No, ma’am, we don’t use blacklists/whitelists to influence our search results,” according to the transcript.
Ms. Levin said those statements were related to blacklists targeting political groups, which she said the company doesn’t keep.
Google’s first blacklists date to the early 2000s, when the company made a list of spam sites that it removed from its index, one of those people said. This means the sites wouldn’t appear in search results.
Engineers known as “maintainers” are authorized to make and approve changes to blacklists. It takes at least two people to do this; one person makes the change, while a second approves it, according to the person familiar with the matter.
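The two-person rule described above can be sketched as a simple approval workflow: a proposed blacklist change only takes effect once a second, different maintainer signs off. The class and field names below are invented; only the rule itself comes from the text.

```python
# Hypothetical sketch of the two-person "maintainer" rule for blacklist
# changes: one person proposes, a different person must approve.
class BlacklistChange:
    def __init__(self, entry, proposed_by):
        self.entry = entry
        self.proposed_by = proposed_by
        self.approved_by = None

    def approve(self, maintainer):
        if maintainer == self.proposed_by:
            raise ValueError("approver must differ from proposer")
        self.approved_by = maintainer

    @property
    def effective(self):
        return self.approved_by is not None

change = BlacklistChange("spam-site.example", proposed_by="alice")
assert not change.effective   # one maintainer alone is not enough
change.approve("bob")         # a second maintainer signs off
assert change.effective
```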
The Journal reviewed a draft policy document from August 2018 that outlines how Google employees should implement an anti-misinformation blacklist aimed at blocking certain publishers from appearing in Google News and other search products. The document says engineers should focus on “a publisher misrepresenting their ownership or web properties” and having “deceptive content”—that is, sites that actively aim to mislead—as opposed to those that have inaccurate content.
“The purpose of the blacklist will be to bar the sites from surfacing in any Search feature or news product sites,” the document states.
Ms. Levin said Google does “not manually determine the order of any search result.” She said sites that don’t adhere to Google News “inclusion policies” are “not eligible to appear on news surfaces or in information boxes in Search.”
SOME INDIVIDUALS and companies said changes made by the company seem ad hoc, or inconsistent. People familiar with the matter said Google increasingly will make manual or algorithmic changes that aren’t acknowledged publicly in order to maintain that it isn’t affected by outside pressure.
“It’s very convenient for us to say that the algorithms make all the decisions,” said one former Google executive.
In March 2017, Google updated the guidelines it gives contractors who evaluate search results, instructing them for the first time to give low-quality ratings to sites “created with the sole purpose of promoting hate or violence against a group of people”—something that would help adjust Google algorithms to lower those sites in search.
The next year, the company broadened the guidance to any pages that promote such hate or violence, even if it isn’t the page’s sole purpose and even if it is “expressed in polite or even academic-sounding language.”
Google has resisted entirely removing some content that outsiders complained should be blocked. In May 2018, Ignacio Wenley Palacios, a Spain-based lawyer working for the Lawfare Project, a nonprofit that funds litigation to protect Jewish people, asked Google to remove an anti-Semitic article lauding a German Holocaust denier posted on a Spanish-language neo-Nazi blog.
The company declined. In an email to Mr. Wenley Palacios, lawyers for Google contended that “while such content is detestable” it isn’t “manifestly illegal” in Spain.
Mr. Wenley Palacios then filed a lawsuit, but in the spring of this year, before the suit could be heard, he said, Google lawyers told him the company was changing its policy on such removals in Spain.
According to Mr. Wenley Palacios, the lawyers said the firm would now remove from searches conducted in Spain any links to Holocaust denial and other content that could hurt vulnerable minorities, once they are pointed out to the company. The results would still be accessible outside of Spain. He said both sides agreed to dismiss the case.
Google’s Ms. Levin described the action as a “legal removal” in accordance with local law. Holocaust denial isn’t illegal in Spain, but if it is coupled with an intent to spread hate, it can fall under Spanish criminal law banning certain forms of hate speech.
“Google used to say, ‘We don’t approve of the content, but that’s what it is,’ ” Mr. Wenley Palacios said. “That has changed dramatically.”
Business Model
Google’s search results page has changed over the years, becoming much more ad-heavy.
Ads: Ads in recent years claim more space at the top of the results page.
Vertical search results: Various features that present specialized results for specific topics, like hotels or places, often with photos or maps. The results in some of these features are paid advertisements.
Organic search results: As Google has placed more ads and verticals at the top of the page, organic search results have shrunk.
Health policy consultant Greg Williams said he helped lead a campaign to push Google to make changes that would stifle misleading results for queries such as “rehab.”
At the time, in 2017, addiction centers with spotty records were constantly showing up in search results, typically the first place family members and addicts go in search of help.
Google routed Diane Hentges several times over the last year to call centers as she desperately researched drug addiction treatment centers for her 22-year-old son, she said.
Each time she called one of the facilities listed on Google, a customer-service representative would ask for her financial information, but the representatives seemingly weren’t attached to any legitimate company.
“If you look at a place on Google, it sends you straight to a call center,” Ms. Hentges said, adding that parents who are struggling with a child with addiction “will do anything to get our child healthy. We’ll believe anything.”
After intense lobbying by Mr. Williams and others, Google changed its ad policy around such queries. But addiction industry officials also noticed a significant change to Google search results. Many searches for “rehab” or related terms began returning the website for the Substance Abuse and Mental Health Services Administration, the national help hotline run by the U.S. Department of Health and Human Services, as the top result.
Google never acknowledged the change. Ms. Levin said that “resources are not listed because of any type of partnership” and that “we have algorithmic solutions designed to prioritize authoritative resources (including official hotlines) in our results for queries like these as well as for suicide and self-harm queries.”
A spokesman for SAMHSA said the agency had a partnership with Google.
Google’s search algorithms have been a major focus of Hollywood in its effort to fight pirated TV shows and movies.
[Chart: Alphabet’s revenue by type, advertising vs. other, from 2005 onward, rising to roughly $150 billion, with advertising the large majority. Note: Alphabet was created through a corporate restructuring of Google in 2015. Figures for prior years are for Google Inc. Source: the company]
Studios “saw this as the potential death knell of their business,” said Dan Glickman, chairman and chief executive of the Motion Picture Association of America from 2004 to 2010. The association has been a public critic of Google. “A hundred million dollars to market a major movie could be thrown away if someone could stream it illegally online.”
Google received a record 1.6 million requests to remove web pages for copyright issues last year, according to the company’s published Transparency Report and a Journal analysis. Those requests pertained to more than 740 million pages, about 12 times the number of web pages it was asked to take down in 2012.
A decade ago, in concession to the industry, Google removed “download” from its auto-complete suggestions after the name of a movie or TV show, so that at least it wouldn’t be encouraging searches for pirated content.
In 2012, it applied a filter to search results that would lower the ranking of sites that received a large number of piracy complaints under U.S. copyright law. That effectively pushed many pirate sites off the front page of results for general searches for movies or music, although it still showed them when a user specifically typed in the pirate site names.
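The 2012 filter described above has two parts: sites with many copyright complaints get pushed down for generic queries, but still surface when the user names the site directly. The sketch below is hypothetical; the threshold and demotion factor are invented, and only the two-part behavior comes from the text.

```python
# Hypothetical sketch of the piracy demotion filter: demote
# high-complaint sites for generic queries, but not when the
# user explicitly names the site.
COMPLAINT_THRESHOLD = 1000  # invented cutoff

def demote_pirates(results, complaints, query):
    """results: list of (site, score) pairs; complaints: site -> complaint count.
    Returns results re-sorted with high-complaint sites pushed down,
    unless the query names the site."""
    def adjusted(site, score):
        named_in_query = site.split(".")[0] in query.lower()
        if complaints.get(site, 0) > COMPLAINT_THRESHOLD and not named_in_query:
            return score * 0.1  # invented demotion factor
        return score
    return sorted(results, key=lambda r: -adjusted(*r))

complaints = {"pirate.example": 5000}
results = [("pirate.example", 0.9), ("store.example", 0.5)]

# Generic query: the high-complaint site drops below legitimate results.
assert demote_pirates(results, complaints, "watch movies free")[0][0] == "store.example"
# Naming the site directly: it still surfaces first.
assert demote_pirates(results, complaints, "pirate streaming")[0][0] == "pirate.example"
```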
In recent months the industry has gotten more cooperation from Google on piracy in search results than at any point in the organization’s history, according to people familiar with the matter.
“Google is under great cosmic pressure, as is Facebook,” Mr. Glickman said. “These are companies that are in danger of being federally regulated to an extent that they never anticipated.”
Mr. Pichai, who became CEO of Google in 2015, is more willing to entertain complaints about the search results from outside parties than Messrs. Page and Brin, the co-founders, according to people familiar with his leadership.
Google’s Ms. Levin said Mr. Pichai’s “style of engaging and listening to feedback has not shifted. He has always been very open to feedback.”
CRITICISM ALLEGING political bias in Google’s search results has sharpened since the 2016 election.
Interest groups from the right and left have besieged Google with questions about content displayed in search results and about why the company’s algorithms returned certain information over others.
Google appointed an executive in Washington, Max Pappas, to handle complaints from conservative groups, according to people familiar with the matter. Mr. Pappas works with Google engineers on changes to search when conservative viewpoints aren’t being represented fairly, according to interest groups interviewed by the Journal, although that is just one part of his job.
“Conservatives need people they can go to at these companies,” said Dan Gainor, an executive at the conservative Media Research Center, which has complained about various issues to Google.
Google also appointed at least one other executive in Washington, Chanelle Hardy, to work with outside liberal groups, according to people familiar with the matter.
Ms. Levin said both positions have existed for many years. She said in general Google believes it’s “the responsible thing to do” to understand feedback from the groups and said Google’s algorithms and policies don’t attempt to make any judgment based on the political leanings of a website.
Mr. Pappas declined to comment, and Ms. Hardy didn’t reply to a request for comment.
Over the past year, abortion-rights groups have complained about search results that turned up the websites of what are known as “crisis pregnancy centers,” organizations that counsel women against having abortions, according to people familiar with the matter.
One of the complaining organizations was Naral Pro-Choice America, which tracks the activities of anti-abortion groups through its opposition research department, said spokeswoman Kristin Ford.
Naral complained to Google and other tech platforms that some of the ads, posts and search results from crisis pregnancy centers are misleading and deceptive, she said. Some of the organizations claimed to offer abortions and then counseled women against it. “They do not disclose what their agenda is,” Ms. Ford said.
In June, Google updated its advertising policies related to abortion, saying that advertisers must state whether they provide abortions or not, according to its website. Ms. Ford said Naral wasn’t told in advance of the policy change.
Ms. Levin said Google didn’t implement any changes with regard to how crisis pregnancy centers rank for abortion queries.
The Journal tested the term “abortion” in organic search results over 17 days in July and August. Thirty-nine percent of all results on the first page had the hostname www.plannedparenthood.org, the site of Planned Parenthood Federation of America, the nonprofit, abortion-rights organization.
By comparison, 14% of Bing’s first page of search results and 16% of DuckDuckGo’s first page of results were from Planned Parenthood.
Ms. Levin said Google doesn’t have any particular ranking implementations aimed at promoting Planned Parenthood.
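The Journal's 39% figure is a straightforward hostname tally over first-page results. A minimal sketch of that computation, using made-up stand-in URLs rather than the Journal's actual data:

```python
# Sketch of tallying what share of first-page results belong to one
# hostname, in the spirit of the Journal's test. The URL list below is
# invented for illustration.
from collections import Counter
from urllib.parse import urlparse

def hostname_share(urls: list[str], host: str) -> float:
    """Fraction of `urls` whose hostname equals `host`."""
    counts = Counter(urlparse(u).netloc for u in urls)
    return counts[host] / len(urls)

first_page = [
    "https://www.plannedparenthood.org/learn",
    "https://www.plannedparenthood.org/health-center",
    "https://en.wikipedia.org/wiki/Abortion",
    "https://www.webmd.com/example",
    "https://www.plannedparenthood.org/abortion",
]

share = hostname_share(first_page, "www.plannedparenthood.org")  # 0.6 here
```

In the Journal's actual test, the tally was repeated over 17 days of result pages, which is how a stable percentage like 39% emerges rather than a single day's snapshot.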
See the results of the Journal’s search tests: percentages indicate how many times each web page appeared during the WSJ’s testing.

The practice of creating blacklists for certain types of sites or searches has fueled cries of political bias from some Google engineers and right-wing publications that said they have viewed portions of the blacklists. Some of the websites Google appears to have targeted in Google News were conservative sites and blogs, according to documents reviewed by the Journal. In one partial blacklist reviewed by the Journal, some conservative and right-wing websites, including The Gateway Pundit and The United West, were included on a list of hundreds of websites that wouldn’t appear in news or featured products, although they could appear in organic search results.
Google has said repeatedly it doesn’t make decisions based on politics, and current and former employees told the Journal they haven’t seen evidence of political bias. And yet, they said, Google’s shifting policies on interference—and its lack of transparency about them—inevitably force employees to become arbiters of what is acceptable, a dilemma that opens the door to charges of bias or favoritism.
Google’s Ms. Levin declined to comment.
DEMANDS FROM GOVERNMENTS for changes have grown rapidly since 2016.
From 2010 to 2018, Google fielded such requests from countries including the U.S. to remove 685,000 links from what Google calls web search. The requests came from courts or other authorities that said the links broke local laws or should be removed for other reasons.
Nearly 78% of those removal requests have come since the beginning of 2016, according to reports that Google publishes on its website. Google’s ultimate actions on those requests weren’t disclosed.
Russia has been by far the most prolific, demanding the removal of about 255,000 links from search last year, three-quarters of all government requests for removal from Google search in that period, the data show. Nearly all of the country’s requests came under an information-security law Russia put into effect in late 2017, according to a Journal examination of disclosures in a database run by the Berkman Klein Center.
Google said the Russian law doesn’t allow it to disclose which URLs were requested to be removed. A person familiar with the matter said the removal demands are for content ruled illegal in Russia for a variety of reasons, such as for promoting drug use or encouraging suicide.
Requests can include demands to remove links to information the government defines as extremist, which can be used to target political opposition, the person said.
Google, whose staff reviews the requests, at times declines those that appear focused on political opposition, the person said, adding that in those cases, it tries not to draw attention to its decisions to avoid provoking Russian regulators.
The approach has led to stiff internal debate. On one side, some Google employees say that the company shouldn’t cooperate at all with takedown requests from countries such as Russia or Turkey. Others say it is important to follow the laws of the countries where Google operates.
“There is a real question internally about whether a private company should be making these calls,” the person said.
Google’s Ms. Levin said, “Maximizing access to information has always been a core principle of Search, and that hasn’t changed.”
Google’s culture of publicly resisting demands to change results has diminished, current and former employees said. A few years ago, the company dismantled a global team focused on free-speech issues that, among other things, publicized the company’s legal battles to fight changes to search results, in part because Google had lost several of those battles in court, according to a person familiar with the change.
“Free expression was no longer a winner,” the person said.
How Google Edged Out Rivals and Built the World’s Dominant Ad Machine: A Visual Guide
The U.S. is investigating whether the tech giant has abused its power, including as the biggest broker of digital ad sales across the web
Nexstar Media Group Inc., the largest local news company in the U.S., recently tested what would happen if it stopped using Google’s technology to place ads on its websites.
Over several days, the company’s video ad sales plummeted. “That’s a huge revenue hit,” said Tony Katsur, senior vice president at Nexstar. After its brief test, Nexstar switched back to Google.
Alphabet Inc.’s Google is under fire for its dominance in digital advertising, in part because of issues like this. The U.S. Justice Department and state attorneys general are investigating whether Google is abusing its power, including as the dominant broker of digital ad sales across the web. Most of the nearly 130 questions the states asked in a September subpoena were about the inner workings of Google’s ad products and how they interact.
We dug into Google’s vast, opaque ad machine, and in a series of graphics below, show you how it all works—and why publishers and rivals have had so many complaints about it.
Much of Google’s power as an ad broker stems from acquisitions of ad-technology companies, especially its 2008 purchase of DoubleClick. Regulators who approved that $3.1 billion deal warned they would step in if the company tied together its offerings in anticompetitive ways.
In interviews, dozens of publishing and advertising executives said Google is doing just that with an array of interwoven products. Google operates the leading selling and buying tools, and the biggest marketplace where online ad deals happen.
When Nexstar didn’t use Google’s selling tool, it missed out on a huge amount of demand that comes through its buying tools, Mr. Katsur said: “They want you locked in.”
What Ever Happened to Google Books?
It was the most ambitious library project of our time—a plan to scan all of the world’s books and make them available to the public online. “We think that we can do it all inside of ten years,” Marissa Mayer, who was then a vice-president at Google, said to this magazine in 2007, when Google Books was in its beta stage. “It’s mind-boggling to me, how close it is.”
Today, the project sits in a kind of limbo. On one hand, Google has scanned an impressive thirty million volumes, putting it in a league with the world’s larger libraries (the Library of Congress has around thirty-seven million books). That is a serious accomplishment. But while the corpus is impressive, most of it remains inaccessible. Searches of out-of-print books often yield mere snippets of the text—there is no way to gain access to the whole book. The thrilling thing about Google Books, it seemed to me, was not just the opportunity to read a line here or there; it was the possibility of exploring the full text of millions of out-of-print books and periodicals that had no real commercial value but nonetheless represented a treasure trove for the public. In other words, it would be the world’s first online library worthy of that name. And yet the attainment of that goal has been stymied, despite Google having at its disposal an unusual combination of technological means, the agreement of many authors and publishers, and enough money to compensate just about everyone who needs it.
The problems began with a classic culture clash when, in 2002, Google simply began scanning books, either hoping that the idealism of the project would win everyone over or following the mantra that it is easier to ask forgiveness than permission. That approach didn’t go over well with authors and publishers, who sued for copyright infringement. Two years of insults, ill will, and litigation ensued. Nonetheless, by 2008, representatives of authors, publishers, and Google did manage to reach a settlement to make the full library available to the public, for pay, and to institutions; the agreement also provided for terminals in libraries, though that provision was never realized. But the settlement then came under further attack from a whole new set of critics, including the author Ursula K. Le Guin, who called it a "deal with the devil." Others argued that the settlement could create a monopoly in online, out-of-print books.
Four years ago, a federal judge sided with the critics and threw out the 2008 settlement, adding that aspects of the copyright issue would be more appropriately decided by the legislature. "Sounds like a job for Congress," James Grimmelmann, a law professor at the University of Maryland and one of the settlement’s more vocal antagonists, said at the time. But, of course, leaving things to Congress has become a synonym for doing nothing, and, predictably, four years after the court decision was first announced, we’re still waiting.
There are plenty of ways to attribute blame in this situation. If Google was, in truth, motivated by the highest ideals of service to the public, then it should have declared the project a non-profit from the beginning, thereby extinguishing any fears that the company wanted to somehow make a profit from other people’s work. Unfortunately, Google made the mistake it often makes, which is to assume that people will trust it just because it’s Google. For their part, authors and publishers, even if they did eventually settle, were difficult and conspiracy-minded, particularly when it came to weighing abstract and mainly worthless rights against the public’s interest in gaining access to obscure works. Finally, the outside critics and the courts were entirely too sanguine about killing, as opposed to improving, a settlement that took so many years to put together, effectively setting the project back a decade if not longer.
In the past few years, the Authors Guild has usefully proposed a solution known as an “extended collective licensing” system. Using a complex mechanism, it would allow the owners of libraries of scanned, out-of-print works, such as Google or actual non-profits like the HathiTrust library, to make a limited set of them available with payouts to authors. The United States Copyright Office supports this plan. I have a simpler suggestion, nicknamed the Big Bang license. Congress should allow anyone with a scanned library to pay some price—say, a hundred and twenty-five million dollars—to gain a license, subject to any opt-outs, allowing them to make those scanned works available to institutional or individual subscribers. That money would be divided equally among all the rights holders who came forward to claim it in a three-year window—split fifty-fifty between authors and publishers. It is, admittedly, a crude, one-time solution to the problem, but it would do the job, and it might just mean that the world would gain access to the first real online library within this lifetime.
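The arithmetic of the proposed Big Bang license is simple to work through. The claimant counts below are invented purely to illustrate the split; only the fifty-fifty division of the license fee comes from the proposal itself.

```python
# Back-of-the-envelope arithmetic for the "Big Bang license": a fixed
# fee is split fifty-fifty between authors and publishers, and each
# pool is divided equally among the rights holders who claim within
# the window. Claimant counts are hypothetical.

def payouts(fee: float, n_authors: int, n_publishers: int):
    """Per-claimant payout for authors and publishers."""
    author_pool = fee / 2       # half the fee goes to authors
    publisher_pool = fee / 2    # the other half to publishers
    return author_pool / n_authors, publisher_pool / n_publishers

# E.g., a $125M fee with 50,000 author claimants and 2,500 publishers
per_author, per_publisher = payouts(125_000_000, 50_000, 2_500)
# per_author -> 1250.0, per_publisher -> 25000.0
```

The crudeness the author concedes is visible in the math: every claimant in a pool receives the same amount regardless of how many works they hold or how often those works are read.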