It’s Time to Break Up Facebook (Chris Hughes)

Mark’s influence is staggering, far beyond that of anyone else in the private sector or in government. He controls three core communications platforms — Facebook, Instagram and WhatsApp — that billions of people use every day. Facebook’s board works more like an advisory committee than an overseer, because Mark controls around 60 percent of voting shares. Mark alone can decide how to configure Facebook’s algorithms to determine what people see in their News Feeds, what privacy settings they can use and even which messages get delivered. He sets the rules for how to distinguish violent and incendiary speech from the merely offensive, and he can choose to shut down a competitor by acquiring, blocking or copying it.

Mark is a good, kind person. But I’m angry that his focus on growth led him to sacrifice security and civility for clicks. I’m disappointed in myself and the early Facebook team for not thinking more about how the News Feed algorithm could change our culture, influence elections and empower nationalist leaders. And I’m worried that Mark has surrounded himself with a team that reinforces his beliefs instead of challenging them.

The government must hold Mark accountable. For too long, lawmakers have marveled at Facebook’s explosive growth and overlooked their responsibility to ensure that Americans are protected and markets are competitive. Any day now, the Federal Trade Commission is expected to impose a $5 billion fine on the company, but that is not enough; nor is Facebook’s offer to appoint some kind of privacy czar. After Mark’s congressional testimony last year, there should have been calls for him to truly reckon with his mistakes. Instead the legislators who questioned him were derided as too old and out of touch to understand how tech works. That’s the impression Mark wanted Americans to have, because it means little will change.

Facebook’s dominance is not an accident of history. The company’s strategy was to beat every competitor in plain view, and regulators and the government tacitly — and at times explicitly — approved. In one of the government’s few attempts to rein in the company, the F.T.C. in 2011 issued a consent decree requiring that Facebook not share any private information beyond what users had already agreed to. Facebook largely ignored the decree. Last month, the day after the company predicted in an earnings call that it would need to pay up to $5 billion as a penalty for its negligence — a slap on the wrist — Facebook’s shares surged 7 percent, adding $30 billion to its value, six times the size of the fine.

The F.T.C.’s biggest mistake was to allow Facebook to acquire Instagram and WhatsApp. In 2012, the newer platforms were nipping at Facebook’s heels because they had been built for the smartphone, where Facebook was still struggling to gain traction. Mark responded by buying them, and the F.T.C. approved.

Neither Instagram nor WhatsApp had any meaningful revenue, but both were incredibly popular. The Instagram acquisition guaranteed Facebook would preserve its dominance in photo networking, and WhatsApp gave it a new entry into mobile real-time messaging. Now, the founders of Instagram and WhatsApp have left the company after clashing with Mark over his management of their platforms. But their former properties remain Facebook’s, driving much of its recent growth.

.. When it hasn’t acquired its way to dominance, Facebook has used its monopoly position to shut out competing companies or has copied their technology.

The News Feed algorithm reportedly prioritized videos created through Facebook over videos from competitors, like YouTube and Vimeo. In 2012, Twitter introduced a video network called Vine that featured six-second videos. That same day, Facebook blocked Vine from hosting a tool that let its users search for their Facebook friends while on the new network. The decision hobbled Vine, which shut down four years later.

Snapchat posed a different threat. Snapchat’s Stories and impermanent messaging options made it an attractive alternative to Facebook and Instagram. And unlike Vine, Snapchat wasn’t interfacing with the Facebook ecosystem; there was no obvious way to handicap the company or shut it out. So Facebook simply copied it.

Facebook’s version of Snapchat’s stories and disappearing messages proved wildly successful, at Snapchat’s expense. At an all-hands meeting in 2016, Mark told Facebook employees not to let their pride get in the way of giving users what they want. According to Wired magazine, “Zuckerberg’s message became an informal slogan at Facebook: ‘Don’t be too proud to copy.’”

(There is little regulators can do about this tactic: Snapchat patented its “ephemeral message galleries,” but copyright law does not extend to the abstract concept itself.)

As a result of all this, would-be competitors can’t raise the money to take on Facebook. Investors realize that if a company gets traction, Facebook will copy its innovations, shut it down or acquire it for a relatively modest sum. So despite an extended economic expansion, increasing interest in high-tech start-ups, an explosion of venture capital and growing public distaste for Facebook, no major social networking company has been founded since the fall of 2011.

As markets become more concentrated, the number of new start-up businesses declines. This holds true in other high-tech areas dominated by single companies, like search (controlled by Google) and e-commerce (taken over by Amazon). Meanwhile, there has been plenty of innovation in areas where there is no monopolistic domination, such as in workplace productivity (Slack, Trello, Asana), urban transportation (Lyft, Uber, Lime, Bird) and cryptocurrency exchanges (Ripple, Coinbase, Circle).

I don’t blame Mark for his quest for domination. He has demonstrated nothing more nefarious than the virtuous hustle of a talented entrepreneur. Yet he has created a leviathan that crowds out entrepreneurship and restricts consumer choice. It’s on our government to ensure that we never lose the magic of the invisible hand. How did we allow this to happen?

Since the 1970s, courts have become increasingly hesitant to break up companies or block mergers unless consumers are paying inflated prices that would be lower in a competitive market. But a narrow reliance on whether or not consumers have experienced price gouging fails to take into account the full cost of market domination. It doesn’t recognize that we also want markets to be competitive to encourage innovation and to hold power in check. And it is out of step with the history of antitrust law. Two of the last major antitrust suits, against AT&T and IBM in the 1980s, were grounded in the argument that they had used their size to stifle innovation and crush competition.

As the Columbia law professor Tim Wu writes, “It is a disservice to the laws and their intent to retain such a laserlike focus on price effects as the measure of all that antitrust was meant to do.”

Facebook is the perfect case on which to reverse course, precisely because Facebook makes its money from targeted advertising, meaning users do not pay to use the service. But it is not actually free, and it certainly isn’t harmless.

Facebook’s business model is built on capturing as much of our attention as possible to encourage people to create and share more information about who they are and who they want to be. We pay for Facebook with our data and our attention, and by either measure it doesn’t come cheap.

I was on the original News Feed team (my name is on the patent), and that product now gets billions of hours of attention and pulls in unknowable amounts of data each year. The average Facebook user spends an hour a day on the platform; Instagram users spend 53 minutes a day scrolling through pictures and videos. They create immense amounts of data — not just likes and dislikes, but how many seconds they watch a particular video — that Facebook uses to refine its targeted advertising. Facebook also collects data from partner companies and apps, without most users knowing about it, according to testing by The Wall Street Journal.

Some days, lying on the floor next to my 1-year-old son as he plays with his dinosaurs, I catch myself scrolling through Instagram, waiting to see if the next image will be more beautiful than the last. What am I doing? I know it’s not good for me, or for my son, and yet I do it anyway.

The choice is mine, but it doesn’t feel like a choice. Facebook seeps into every corner of our lives to capture as much of our attention and data as possible and, without any alternative, we make the trade.

The vibrant marketplace that once drove Facebook and other social media companies to compete to come up with better products has virtually disappeared. This means there’s less chance of start-ups developing healthier, less exploitative social media platforms. It also means less accountability on issues like privacy.

Just last month, Facebook seemingly tried to bury news that it had stored tens of millions of user passwords in plain text format, which thousands of Facebook employees could see. Competition alone wouldn’t necessarily spur privacy protection — regulation is required to ensure accountability — but Facebook’s lock on the market guarantees that users can’t protest by moving to alternative platforms.

The most problematic aspect of Facebook’s power is Mark’s unilateral control over speech. There is no precedent for his ability to monitor, organize and even censor the conversations of two billion people.

Facebook engineers write algorithms that select which users’ comments or experiences end up displayed in the News Feeds of friends and family. These rules are proprietary and so complex that many Facebook employees themselves don’t understand them.

In 2014, the rules favored curiosity-inducing “clickbait” headlines. In 2016, they enabled the spread of fringe political views and fake news, which made it easier for Russian actors to manipulate the American electorate. In January 2018, Mark announced that the algorithms would favor non-news content shared by friends and news from “trustworthy” sources, which his engineers interpreted — to the confusion of many — as a boost for anything in the category of “politics, crime, tragedy.”

Facebook has responded to many of the criticisms of how it manages speech by hiring thousands of contractors to enforce the rules that Mark and senior executives develop. After a few weeks of training, these contractors decide which videos count as hate speech or free speech, which images are erotic and which are simply artistic, and which live streams are too violent to be broadcast. (The Verge reported that some of these moderators, working through a vendor in Arizona, were paid $28,800 a year, got limited breaks and faced significant mental health risks.)

As if Facebook’s opaque algorithms weren’t enough, last year we learned that Facebook executives had permanently deleted their own messages from the platform, erasing them from the inboxes of recipients; the justification was corporate security concerns. When I look at my years of Facebook messages with Mark now, it’s just a long stream of my own light-blue comments, clearly written in response to words he had once sent me. (Facebook now offers a limited version of this feature to all users.)

The most extreme example of Facebook manipulating speech happened in Myanmar in late 2017. Mark said in a Vox interview that he personally made the decision to delete the private messages of Facebook users who were encouraging genocide there. “I remember, one Saturday morning, I got a phone call,” he said, “and we detected that people were trying to spread sensational messages through — it was Facebook Messenger in this case — to each side of the conflict, basically telling the Muslims, ‘Hey, there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place.’ And then the same thing on the other side.”

Mark made a call: “We stop those messages from going through.” Most people would agree with his decision, but it’s deeply troubling that he made it with no accountability to any independent authority or government. Facebook could, in theory, delete en masse the messages of Americans, too, if its leadership decided it didn’t like them.

Mark used to insist that Facebook was just a “social utility,” a neutral platform for people to communicate what they wished. Now he recognizes that Facebook is both a platform and a publisher and that it is inevitably making decisions about values. The company’s own lawyers have argued in court that Facebook is a publisher and thus entitled to First Amendment protection.

No one at Facebook headquarters is choosing what single news story everyone in America wakes up to, of course. But they do decide whether it will be an article from a reputable outlet or a clip from “The Daily Show,” a photo from a friend’s wedding or an incendiary call to kill others.

Mark knows that this is too much power and is pursuing a twofold strategy to mitigate it.

  1. He is pivoting Facebook’s focus toward encouraging more private, encrypted messaging that Facebook’s employees can’t see, let alone control.
  2. He is hoping for friendly oversight from regulators and other industry executives.

Late last year, he proposed an independent commission to handle difficult content moderation decisions by social media platforms. It would afford an independent check, Mark argued, on Facebook’s decisions, and users could appeal to it if they disagreed. But its decisions would not have the force of law, since companies would voluntarily participate.

In an op-ed essay in The Washington Post in March, he wrote, “Lawmakers often tell me we have too much power over speech, and I agree.” And he went even further than before, calling for more government regulation — not just on speech, but also on privacy and interoperability, the ability of consumers to seamlessly leave one network and transfer their profiles, friend connections, photos and other data to another.

I don’t think these proposals were made in bad faith. But I do think they’re an attempt to head off the argument that regulators need to go further and break up the company. Facebook isn’t afraid of a few more rules. It’s afraid of an antitrust case and of the kind of accountability that real government oversight would bring.

We don’t expect calcified rules or voluntary commissions to regulate drug companies, health care companies, car manufacturers or credit card providers. Agencies oversee these industries to ensure that the private market works for the public good. In these cases, we all understand that government isn’t an external force meddling in an organic market; it’s what makes a dynamic and fair market possible in the first place. This should be just as true for social networking as it is for air travel or pharmaceuticals.

In the summer of 2006, Yahoo offered us $1 billion for Facebook. I desperately wanted Mark to say yes. Even my small slice of the company would have made me a millionaire several times over. For a 22-year-old scholarship kid from small-town North Carolina, that kind of money was unimaginable. I wasn’t alone — just about every other person at the company wanted the same.

It was taboo to talk about it openly, but I finally asked Mark when we had a moment alone, “How are you feeling about Yahoo?” I got a shrug and a one-line answer: “I just don’t know if I want to work for Terry Semel,” Yahoo’s chief executive.

Outside of a couple of gigs in college, Mark had never had a real boss and seemed entirely uninterested in the prospect. I didn’t like the idea much myself, but I would have traded having a boss for several million dollars any day of the week. Mark’s drive was infinitely stronger. Domination meant domination, and the hustle was just too delicious.

Mark may never have a boss, but he needs to have some check on his power. The American government needs to do two things: break up Facebook’s monopoly and regulate the company to make it more accountable to the American people.

First, Facebook should be separated into multiple companies. The F.T.C., in conjunction with the Justice Department, should enforce antitrust laws by undoing the Instagram and WhatsApp acquisitions and banning future acquisitions for several years. The F.T.C. should have blocked these mergers, but it’s not too late to act. There is precedent for correcting bad decisions — as recently as 2009, Whole Foods settled antitrust complaints by selling off the Wild Oats brand and stores that it had bought a few years earlier.

There is some evidence that we may be headed in this direction. Senator Elizabeth Warren has called for reversing the Facebook mergers, and in February, the F.T.C. announced the creation of a task force to monitor competition among tech companies and review previous mergers.

How would a breakup work? Facebook would have a brief period to spin off the Instagram and WhatsApp businesses, and the three would become distinct companies, most likely publicly traded. Facebook shareholders would initially hold stock in the new companies, although Mark and other executives would probably be required to divest their management shares.

Until recently, WhatsApp and Instagram were administered as independent platforms inside the parent company, so that should make the process easier. But time is of the essence: Facebook is working quickly to integrate the three, which would make it harder for the F.T.C. to split them up.

Some economists are skeptical that breaking up Facebook would spur that much competition, because Facebook, they say, is a “natural” monopoly. Natural monopolies have emerged in areas like water systems and the electrical grid, where the price of entering the business is very high — because you have to lay pipes or electrical lines — but it gets cheaper and cheaper to add each additional customer. In other words, the monopoly arises naturally from the circumstances of the business, rather than a company’s illegal maneuvering. In addition, defenders of natural monopolies often make the case that they benefit consumers because they are able to provide services more cheaply than anyone else.

Facebook is indeed more valuable when there are more people on it: There are more connections for a user to make and more content to be shared. But the cost of entering the social network business is not that high. And unlike with pipes and electricity, there is no good argument that the country benefits from having only one dominant social networking company.

Still others worry that the breakup of Facebook or other American tech companies could be a national security problem. Because advancements in artificial intelligence require immense amounts of data and computing power, only large companies like Facebook, Google and Amazon can afford these investments, they say. If American companies become smaller, the Chinese will outpace us.

While serious, these concerns do not justify inaction. Even after a breakup, Facebook would be a hugely profitable business with billions to invest in new technologies — and a more competitive market would only encourage those investments. If the Chinese did pull ahead, our government could invest in research and development and pursue tactical trade policy, just as it is doing today to hold China’s 5G technology at bay.

The cost of breaking up Facebook would be next to zero for the government, and lots of people stand to gain economically. A ban on short-term acquisitions would ensure that competitors, and the investors who take a bet on them, would have the space to flourish. Digital advertisers would suddenly have multiple companies vying for their dollars.

Even Facebook shareholders would probably benefit, as shareholders often do in the years after a company’s split. The value of the companies that made up Standard Oil doubled within a year of its being dismantled and had increased fivefold a few years later. Ten years after the 1984 breakup of AT&T, the value of its successor companies had tripled.

But the biggest winners would be the American people. Imagine a competitive market in which they could choose among one network that offered higher privacy standards, another that cost a fee to join but had little advertising and another that would allow users to customize and tweak their feeds as they saw fit.

No one knows exactly what Facebook’s competitors would offer to differentiate themselves. That’s exactly the point.

The Justice Department faced similar questions of social costs and benefits with AT&T in the 1950s. AT&T had a monopoly on phone services and telecommunications equipment. The government filed suit under antitrust laws, and the case ended with a consent decree that required AT&T to release its patents and refrain from expanding into the nascent computer industry. This resulted in an explosion of innovation, greatly increasing follow-on patents and leading to the development of the semiconductor and modern computing. We would most likely not have iPhones or laptops without the competitive markets that antitrust action ushered in.

Adam Smith was right: Competition spurs growth and innovation.

Just breaking up Facebook is not enough. We need a new agency, empowered by Congress to regulate tech companies. Its first mandate should be to protect privacy.

The Europeans have made headway on privacy with the General Data Protection Regulation, a law that guarantees users a minimal level of protection. A landmark privacy bill in the United States should specify exactly what control Americans have over their digital information, require clearer disclosure to users and provide enough flexibility to the agency to exercise effective oversight over time. The agency should also be charged with guaranteeing basic interoperability across platforms.

Finally, the agency should create guidelines for acceptable speech on social media. This idea may seem un-American — we would never stand for a government agency censoring speech. But we already have limits on yelling “fire” in a crowded theater, child pornography, speech intended to provoke violence and false statements to manipulate stock prices.

We will have to create similar standards that tech companies can use. These standards should of course be subject to the review of the courts, just as any other limits on speech are. But there is no constitutional right to harass others or live-stream violence.

These are difficult challenges. I worry that government regulators will not be able to keep up with the pace of digital innovation. I worry that more competition in social networking might lead to a conservative Facebook and a liberal one, or that newer social networks might be less secure if government regulation is weak. But sticking with the status quo would be worse: If we don’t have public servants shaping these policies, corporations will.

Some people doubt that an effort to break up Facebook would win in the courts, given the hostility on the federal bench to antitrust action, or that this divided Congress would ever be able to muster enough consensus to create a regulatory agency for social media.

But even if breakup and regulation aren’t immediately successful, simply pushing for them will bring more oversight. The government’s case against Microsoft — that it illegally used its market power in operating systems to force its customers to use its web browser, Internet Explorer — ended in 2001 when George W. Bush’s administration abandoned its effort to break up the company. Yet that prosecution helped rein in Microsoft’s ambitions to dominate the early web.

Similarly, the Justice Department’s 1970s suit accusing IBM of illegally maintaining its monopoly on computer sales ended in a stalemate. But along the way, IBM changed many of its behaviors. It stopped bundling its hardware and software, chose an extremely open design for the operating system in its personal computers and did not exercise undue control over its suppliers. Professor Wu has written that this “policeman at the elbow” led IBM to steer clear “of anything close to anticompetitive conduct, for fear of adding to the case against it.”

We can expect the same from even an unsuccessful suit against Facebook.

Finally, an aggressive case against Facebook would persuade other behemoths like Google and Amazon to think twice about stifling competition in their own sectors, out of fear that they could be next. If the government were to use this moment to resurrect an effective competition standard that takes a broader view of the full cost of “free” products, it could affect a whole host of industries.

The alternative is bleak. If we do not take action, Facebook’s monopoly will become even more entrenched. With much of the world’s personal communications in hand, it can mine that data for patterns and trends, giving it an advantage over competitors for decades to come.

I take responsibility for not sounding the alarm earlier. Don Graham, a former Facebook board member, has accused those who criticize the company now of having “all the courage of the last man leaping on the pile at a football game.” The financial rewards I reaped from working at Facebook radically changed the trajectory of my life, and even after I cashed out, I watched in awe as the company grew. It took the 2016 election fallout and Cambridge Analytica to awaken me to the dangers of Facebook’s monopoly. But anyone suggesting that Facebook is akin to a pinned football player misrepresents its resilience and power.

An era of accountability for Facebook and other monopolies may be beginning. Collective anger is growing, and a new cohort of leaders has begun to emerge. On Capitol Hill, Representative David Cicilline has taken a special interest in checking the power of monopolies, and Senators Amy Klobuchar and Ted Cruz have joined Senator Warren in calling for more oversight. Economists like Jason Furman, a former chairman of the Council of Economic Advisers, are speaking out about monopolies, and a host of legal scholars like Lina Khan, Barry Lynn and Ganesh Sitaraman are plotting a way forward.

This movement of public servants, scholars and activists deserves our support. Mark Zuckerberg cannot fix Facebook, but our government can.

Zuckerberg’s new privacy essay shows why Facebook needs to be broken up

Mark Zuckerberg doesn’t understand what privacy means — he can’t be trusted to define it for the rest of us.
by Konstantin Kakaes, March 7, 2019

In a letter published when his company went public in 2012, Mark Zuckerberg championed Facebook’s mission of making the world “more open and connected.” Businesses would become more authentic, human relationships stronger, and government more accountable. “A more open world is a better world,” he wrote.

Facebook’s CEO now claims to have had a major change of heart.

In “A Privacy-Focused Vision for Social Networking,” a 3,200-word essay that Zuckerberg posted to Facebook on March 6, he says he wants to “build a simpler platform that’s focused on privacy first.” In apparent surprise, he writes: “People increasingly also want to connect privately in the digital equivalent of the living room.”

Zuckerberg’s essay is a power grab disguised as an act of contrition. Read it carefully, and it’s impossible to escape the conclusion that if privacy is to be protected in any meaningful way, Facebook must be broken up.

Facebook grew so big, so quickly that it defies categorization. It is a newspaper. It is a post office and a telephone exchange. It is a forum for political debate, and it is a sports broadcaster. It’s a birthday-reminder service and a collective photo album. It is all of these things—and many others—combined, and so it is none of them.

Zuckerberg describes Facebook as a town square. It isn’t. Facebook is a company that brought in more than $55 billion in advertising revenue last year, with a 45% profit margin. This makes it one of the most profitable business ventures in human history. It must be understood as such.

Facebook has minted money because it has figured out how to commoditize privacy on a scale never before seen. A diminishment of privacy is its core product. Zuckerberg has made his money by performing a sort of arbitrage between how much privacy Facebook’s 2 billion users think they are giving up and how much he has been able to sell to advertisers. He says nothing of substance in his long essay about how he intends to keep his firm profitable in this supposed new era. That’s one reason to treat his Damascene moment with healthy skepticism.

“Frankly we don’t currently have a strong reputation for building privacy protective services,” Zuckerberg writes. But Facebook’s reputation is not the salient question: its business model is. If Facebook were to implement strong privacy protections across the board, it would have little left to sell to advertisers aside from the sheer size of its audience. Facebook might still make a lot of money, but it would make a lot less of it.

Zuckerberg’s proposal is a bait-and-switch. What he’s proposing is essentially a beefed-up version of WhatsApp. Some of the improvements might be worthwhile. Stronger encryption can indeed be useful, and a commitment to not building data centers in repressive countries is laudable, as far as it goes. Other principles that Zuckerberg puts forth would concentrate his monopoly power in worrisome ways. The new “platforms for private sharing” are not instead of Facebook’s current offering: they’re in addition to it. “Public social networks will continue to be very important in people’s lives,” he writes, an assertion he never squares with the vague claim that “interacting with your friends and family across the Facebook network will become a fundamentally more private experience.”

By narrowly construing privacy to be almost exclusively about end-to-end encryption that would prevent a would-be eavesdropper from intercepting communications, he manages to avoid having to think about Facebook’s weaknesses and missteps. Privacy is not just about keeping secrets. It’s also about how flows of information shape us as individuals and as a society. What we say to whom and why is a function of context. Social networks change that context, and in so doing they change the nature of privacy, in ways that are both good and bad.

Russian propagandists used Facebook to sway the 2016 American election, perhaps decisively. Myanmarese military leaders used Facebook to incite an anti-Rohingya genocide. These are consequences of the ways in which Facebook has diminished privacy. They are not the result of failures of encryption.

“Privacy,” Zuckerberg writes, “gives people the freedom to be themselves.” This is true, but it is also incomplete. The self evolves over time. Privacy is important not simply because it allows us to be, but because it gives us space to become. As Georgetown University law professor Julie Cohen has written: “Conditions of diminished privacy also impair the capacity to innovate … Innovation requires room to tinker, and therefore thrives most fully in an environment that values and preserves spaces for tinkering.” If Facebook is constantly sending you push notifications, it diminishes the mental space you have available for tinkering and coming up with your own ideas. If Facebook bombards the gullible with misinformation, this too is an invasion of privacy. What has happened to privacy in the last couple of decades, and how to value it properly, are questions that are apparently beyond Zuckerberg’s ken.

He says Facebook is “committed to consulting with experts and discussing the best way forward,” and that it will make decisions “as openly and collaboratively as we can” because “many of these issues affect different parts of society.” But the flaw here is the centralized decision-making process. Even if Zuckerberg gets all the best advice that his billions can buy, the result is still deeply troubling. If his plan succeeds, it would mean that private communication between two individuals will be possible when Mark Zuckerberg decides that it ought to be, and impossible when he decides it ought not to be.

If that sounds alarmist, consider the principles that Zuckerberg laid out for Facebook’s new privacy focus. The most problematic of them is the way he discusses “interoperability.” Zuckerberg allows that people should have a choice between messaging services: some want to use Facebook Messenger, some prefer WhatsApp, and others like Instagram. It’s a hassle to use all of these, he says, so you should be able to send messages from one to the other.

But allowing communications that are outside Facebook’s control, he says, would be dangerous if users were allowed to send messages not subject to surveillance by Facebook’s “safety and security systems.” Which is to say we should be allowed to use any messaging service we like, so long as it’s controlled by Facebook for our protection. Zuckerberg is arguing for tighter and tighter integration of Facebook’s various properties.

Monopoly power is problematic even for companies that just make a lot of money selling widgets: it allows them to exert undue influence on regulators and to rip off consumers. But it’s particularly worrisome for a company like Facebook, whose product is information.

This is why it should be broken up. This wouldn’t answer every difficult question that Facebook’s existence raises. It isn’t easy to figure out how to protect free speech while limiting hate speech and deliberate misinformation campaigns, for example. But breaking up Facebook would provide space to come up with solutions that make sense to society as a whole, rather than to Zuckerberg and Facebook’s other shareholders.

At a minimum, splitting WhatsApp and Instagram from Facebook is a necessary first step. This makes the company smaller, and therefore less powerful when it comes to negotiating with other businesses and with regulators. Monopolies, as Louis Brandeis pointed out a century ago, and as Columbia University law professor Tim Wu, journalist Franklin Foer, and others have underscored more recently, simply accrue too much political and economic power to allow for the democratic process to find a balance in how to tackle issues like privacy.

Tellingly, Zuckerberg’s power has grown so great that he feels no need to hide his ambitions. “We can,” he writes, “create platforms for private sharing that could be even more important to people than the platforms we’ve already built to help people share and connect more openly.”

Only if we let him.

Be Outraged by America’s Role in Yemen’s Misery

The United States is not directly bombing civilians in Yemen, but it is providing arms, intelligence and aerial refueling to assist Saudi Arabia and the United Arab Emirates as they hammer Yemen with airstrikes, destroy its economy and starve its people. The Saudi aim is to crush Houthi rebels who have seized Yemen’s capital and are allied with Iran.

That’s sophisticated realpolitik for you: Because we dislike Iran’s ayatollahs, we are willing to starve Yemeni schoolchildren.

.. To their credit, some members of Congress are trying to stop these atrocities. A bipartisan effort this year, led by Senators Mike Lee, Chris Murphy and Bernie Sanders, tried to limit U.S. support for the Yemen war, and it did surprisingly well, winning 44 votes. New efforts are underway as well.

.. World leaders are gathered for the United Nations General Assembly, making pious statements about global goals for a better world, but the Assembly is infused with hypocrisy. Russia is up to its elbows in crimes against humanity in Syria, China is detaining perhaps one million Uighurs while also shielding Myanmar from accountability for probable genocide, and the United States and Britain are helping Saudi Arabia commit war crimes in Yemen.

That’s pathetic: Four of the five permanent members of the U.N. Security Council are complicit in crimes against humanity.

Stateless and Poor, Some Boys in Thai Cave Had Already Beaten Long Odds

It was Adul, the stateless descendant of a Wa ethnic tribal branch once known for headhunting, who played a critical role in the rescue, acting as interpreter for the British divers.

.. Proficient in English, Thai, Burmese, Mandarin and Wa, Adul politely communicated to the British divers his squad’s greatest needs: food and clarity on just how long they had stayed alive.

.. the Wild Boars’ 18-day ordeal came to an end. In a three-day rescue mission, Adul and 12 others were safely extracted from the cave by a team of dozens of divers, doctors and support staff.

.. Located not far from where Thailand, Myanmar and Laos meet in the Golden Triangle

.. The Golden Triangle is a smuggling center, and a sanctuary for members of various ethnic militias that have spent decades pushing for autonomy from a government in Myanmar that routinely represses them.

.. Three of the trapped soccer players, as well as their coach, Ekkapol Chantawong, are stateless ethnic minorities, accustomed to slipping across the border to Myanmar one day and returning for a soccer game in Thailand the next.

.. Their presence undercuts a Thai sense of nationhood that is girded by a triumvirate of institutions: the military, the monarchy and the Buddhist monastery.

.. With the English he used to communicate with the British divers on July 2, Adul was crucial in ensuring the safety of the Wild Boars. He is the top student in his class at the Ban Wiang Phan School in Mae Sai. His academic record and sporting prowess have earned him free tuition and daily lunch.

.. “Stateless children have a fighting spirit that makes them want to excel,” he said. “Adul is the best of the best.”

.. At least 440,000 stateless people live in Thailand, many of whom are victims of Myanmar’s long years of ethnic strife, according to the United Nations refugee agency. Human rights groups say the true number could be as high as 3 million — in a nation of nearly 70 million. The Thai government has refused to ratify the United Nations convention guaranteeing rights for refugees.

.. A stateless member of the ethnic Shan minority, Mr. Ekkapol has long experience caring for children. After his parents died in Myanmar when he was a young boy, he entered the Buddhist monkhood in Thailand for nearly a decade, a common option for orphans untethered from financial support.