YouTube, the Great Radicalizer

Human beings have many natural tendencies that need to be vigilantly monitored in the context of modern life. For example, our craving for fat, salt and sugar, which served us well when food was scarce, can lead us astray in an environment in which fat, salt and sugar are all too plentiful and heavily marketed to us. So, too, our natural curiosity about the unknown can lead us astray on a website that leads us too much in the direction of lies, hoaxes and misinformation.

In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted about this by the health department and concerned citizens, the restaurant managers reply that they are merely serving us what we want.

… There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.

Initial Better Ads Standards: Least preferred ad experiences for desktop web and mobile web

The Coalition for Better Ads has developed initial Better Ads Standards for desktop web and mobile web for North America and Europe, based on comprehensive research involving more than 25,000 consumers.

The Coalition’s research identifies the ad experiences that rank lowest across a range of user experience factors, and that are most highly correlated with an increased propensity for consumers to adopt ad blockers. These results define initial Better Ads Standards that identify the ad experiences that fall beneath a threshold of consumer acceptability. Four types of desktop web ads (six tested ad experiences) and eight types of mobile web ads (twelve tested ad experiences) fell beneath this threshold. A summary of these types of ad experiences is presented below.

The Case Against Google (nytimes.com)

Content recommendation algorithms reward engagement metrics. One of the metrics they reward is getting a user’s attention, briefly. In the real world, someone can get my attention by screaming that there is a fire. Belief that there is a fire and interest in fire are not necessary for my attention to be grabbed by a warning of fire. All that is needed is a desire for self-preservation and a degree of trust in the source of the knowledge.

Compounding the problem, since false information improves engagement and videos earn money, there is an incentive in place encouraging the proliferation of attention-grabbing false information.

In a better world, this behavior would not be incentivized. In a better world, reputation metrics would allow a person to realize that the boy who cried wolf was the one who had posted the attention grabbing video. Humanity has known for a long time that there are consequences for repeated lying. We have fables about that, warning liars away from lying.

I don’t think it would be unreasonable to make those consequences explicit, as they are in many real-world cases of lying publicly in the most attention-attracting way possible.
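As a rough sketch of the reputation idea (every name, weight, and rule here is hypothetical, not any platform's actual algorithm), a recommender could discount raw engagement by an uploader's track record of confirmed falsehoods, so the boy who cried wolf steadily loses reach:

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    engagement: float      # e.g. expected watch time (hypothetical metric)
    uploader_strikes: int  # times the uploader was flagged for confirmed falsehoods

def reputation_weight(strikes: int, penalty: float = 0.5) -> float:
    # Hypothetical rule: each confirmed falsehood halves the uploader's weight.
    return penalty ** strikes

def rank(videos: list[Video]) -> list[Video]:
    # Rank by engagement discounted by uploader reputation,
    # instead of by raw engagement alone.
    return sorted(
        videos,
        key=lambda v: v.engagement * reputation_weight(v.uploader_strikes),
        reverse=True,
    )

videos = [
    Video("Provocative hoax", engagement=10.0, uploader_strikes=3),
    Video("Unsurprising truth", engagement=4.0, uploader_strikes=0),
]
print([v.title for v in rank(videos)])
```

Under these made-up numbers, three strikes cut the hoax's effective score to 1.25, so the less stimulating but honest video ranks first; a pure-engagement ranker would do the opposite.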

… Google recommends that stuff to me, and I don’t believe in it or watch it. Watch math videos, get flat earth recommendations. Watch a few videos about the migration of Germanic tribes in Europe during the decline of the Roman Empire, get white supremacist recommendations.
My best guess? They want you to sit and watch YouTube for hours, so they recommend stuff watched by people who sit and watch YouTube for hours.
This stuff reminds me of the word “excitotoxins,” which is based on a silly idea yet seems to capture the addictive effect of stimulation. People are stimulated by things that seem novel, controversial, and dangerous. People craving stimulation will prefer provocative junk over unsurprising truth.

Xi: an editor for the next 20 years

Abstract: Xi is a project to build a modern text editor with uncompromising performance. Its thoroughly async, loosely coupled design promises performance and rich extensibility, but creates interesting engineering challenges, requiring advanced algorithms and data structures. In addition to pushing the state of computer science for text handling, the project also seeks to build an open-source community for teaching and learning, and working together to create a joyful editing experience.

About Raph: Raph is a software engineer at Google, currently on the Fuchsia team working on text and Rust infrastructure, and holds a PhD in Computer Science from UC Berkeley, where his thesis topic was tools for interactive font design. He has been active in the open source community for over 25 years, with contributions in text, 2D graphics, fonts, and other areas. Raph is also a Recurse Center alum, from the Fall 1, 2017 batch.