Spot on Beau! Years ago in the Army we had training on Soviet disinformation/misinformation tactics and you laid it out beautifully. Thank you!
DUBAI, June 22 (Reuters) – The U.S. Justice Department said on Tuesday it seized 36 Iranian-linked websites, many of them associated with either disinformation activities or violent organizations, taking them offline for violating U.S. sanctions. Several of the sites were back online within hours with new domain addresses. “Today, pursuant to court orders, the United States seized 33 websites used by the Iranian Islamic Radio and Television Union (IRTVU) and three websites operated by Kata’ib Hizballah (KH), in violation of U.S. sanctions,” the department said in a statement. https://www.reuters.com/world/middle-…
“But to the purveyors of the big lie—Republicans like Greg Abbott and his friends on Fox News—this very real and acute suffering is just a vehicle for their political objectives,” says Chris Hayes of the GOP blaming the Green New Deal as millions of Texans freeze. Aired on 02/17/2021.
The Hunter Biden story is a fascinating piece of disinformation and agitprop.
I’ve been following it closely, partly because it’s an interesting political story and partly because I’m an avid infosec enthusiast and I actually know how emails work.
I was, to be honest, surprised when the “Hunter laptop” scam first began that anyone actually believed it. If you know even a little bit about how email works, it’s plain as snow in December that the story was fake. The supposed “Hunter Biden emails” were released as PDFs, with no headers, from email domains that do not exist. I was like “Really? People are falling for this? God damn conservatives sure are gullible.”
But in the time I’ve been in Florida helping to care for my mother, I’ve realized that isn’t fair.
My dad is older (he’s 82). He uses email every single day, but he has not the slightest clue how it works, and he cannot define the word “domain,” much less explain what a domain is or spot an invalid email domain.
And honestly, a lot of people, on the left and right, are that way. It’s not that conservatives are stupid, it’s that to most people, email is magic. And they don’t ever see email headers, so it doesn’t look suspicious if someone shows them a fake email with no headers. And, like my dad, they don’t know what a domain is, so if you show them an email with a clearly bogus “from” address they don’t even blink.
In that sense, what looks like a crude, hamfisted attempt at second-rate disinformation is actually pretty savvy. It’s propaganda aimed at a very specific audience: an audience that is not technically savvy, but—and this is the important part—has also been indoctrinated to distrust “elitist experts” who think they know better. So this audience (1) doesn’t have the technical skill to see through even a very crude, simplistic scam and (2) will automatically respond to anyone who points out the scam with “neener neener I don’t believe you!!!”
It’s been a very interesting lesson in 21st-century propaganda. Forging a real email is hard. Email is trackable and traceable. It passes through many computers and it leaves traces in every one. You cannot easily forge a realistic, believable email even if you have nearly unlimited resources…
…but you don’t have to.
If the target of your propaganda is people with the limited technical knowledge of my father who have also been told to distrust experts, it’s not necessary.
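For the curious, here is what that header evidence actually looks like, as a minimal sketch using Python’s standard email module. The message, names, and domains below are invented for illustration.

```python
import email
from email import policy

# A toy raw message. A genuine email that crossed the internet carries
# "Received:" headers stamped by every server it passed through; a faked,
# screenshot-style "email" typically has none.
raw = """\
Received: from mail.example.com (mail.example.com [203.0.113.5])
\tby mx.recipient.org with ESMTP id abc123;
\tThu, 1 Oct 2020 09:00:00 -0400
From: Alice <alice@example.com>
To: Bob <bob@recipient.org>
Subject: Hello
Message-ID: <20201001.abc123@mail.example.com>

Hi Bob.
"""

msg = email.message_from_string(raw, policy=policy.default)

# Each hop in transit prepends a Received header, so a delivered message
# should have at least one; count them.
received = msg.get_all("Received") or []
print(f"Received headers: {len(received)}")

# The From address should use a domain that actually exists. Verifying
# that requires a DNS lookup; here we just extract it for inspection.
domain = msg["From"].addresses[0].domain
print(f"Sender domain: {domain}")
```

On a real message you would also check that the Received chain is internally consistent and that the sender’s domain resolves (e.g. has MX records), exactly the kind of evidence a headerless PDF cannot provide.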
Now, having said that, parts of the scam are very sophisticated.
This is Martin Aspen.
Martin Aspen wrote a 64-page dossier documenting corruption in Hunter Biden’s business dealings in China, which was released by security firm Typhoon Investigations.
Martin Aspen does not exist.
Typhoon Investigations does not exist.
The photograph of the person you see here is not a picture of a person. It was created by a GAN—a type of deepfake machine learning computer program.
If you’re not familiar with these, I recommend you visit the site This Person Does Not Exist:
Every time you refresh the browser, you will see a photograph of a different person. None of the photographs are real. None of the people exist. The photographs are all created by machine learning deepfake programs.
Anyway, back to Martin Aspen.
Martin Aspen does not exist. His photo is a computer-generated deepfake. His resume lists companies he’s never worked for, universities that have no record of him, and security firms that don’t exist. The entire dossier was faked.
Fifteen, twenty, thirty years ago, this level of fakery would require the concerted effort of a nation-state’s intelligence team to do. Today, a single reasonably skilled person can do it. I can do it. You can do it.
And the thing that’s most fascinating about all of this, besides the fact that it shows how fragile and easily manipulated public perception is? Proving that the document was fake will not change a single mind.
The Hunter Biden saga has revealed two things:
- Small groups of individuals, even a single person sitting in a bedroom, can create agitprop and disinformation campaigns that would only a short time ago have been the envy of entire government intelligence teams.
- People want to believe. They’re simply looking for an excuse. Modern propaganda does not need to be subtle. It doesn’t need to be well-done. It doesn’t need to stand up to any scrutiny. It merely needs to give people an excuse to believe what they already want to believe.
On a sunny summer morning in June, professor Jonathan Zittrain is hosting Sir Tim Berners-Lee in a Harvard Law School classroom. The audience is a smattering of visiting scholars at the Berkman Klein Center for Internet and Society and a few local techies involved with open source software development. I’d come to the room half an hour early to snag a seat, but I needn’t have bothered, as the crowd to see the man who invented the World Wide Web is attentive, but thin.
Jonathan Zittrain, one of the world’s leading scholars of creativity in an internet-connected universe, points out that Sir Tim’s current work is an attempt to make a second correction in the arc of the internet. His first innovation, thirty years ago, was “the conceptualization and the runaway success of the World Wide Web.” Sir Tim’s current idea is a protocol — Solid — and a company — Inrupt — which aim to make the Web as it is now significantly better. Just what are Solid and Inrupt? That’s what a smattering of us are here to find out.
Sir Tim draws an arc on the chalkboard behind him. “People talk about the meteoric rise of the web — of course, meteors go down.” Referencing internet disinformation expert Joan Donovan, sitting in the audience, he notes, “If you study the bad things on the web, there’s hundreds and thousands to study.” Almost apologetically, he explains that “there was a time when you could see things that were new [online], but not the ways they were bad.” For Sir Tim, the days of blogs were pretty good ones. “When you made a blog, you tried to make it high quality, and you tried to make your links to high quality blogs. You as a blogger were motivated by your reading counter, which led to a virtuous system based on custodianship as well as authorship.” Wistfully, he notes, “You could be forgiven for being fairly utopian in those days.”
What came out of this moment in the web’s evolution was a “true scale-free network, based on HTTP and HTML.” (Scale-free networks follow a Pareto distribution, with a small number of highly connected nodes and a “long tail” of less-connected nodes.) “It was extraordinary to discover that when you connect humanity, they form scale-free networks at all different levels. We put out HTTP and HTML and ended up with humanity forming scale-free networks on a planetary — okay, a tenth of a planet — scale.”
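For readers who want to see how such a network emerges, here is a small sketch (not anything Sir Tim showed, just an illustration) of the standard preferential-attachment process, where each newcomer links to an existing node with probability proportional to its degree:

```python
import random

random.seed(42)

# Preferential attachment ("rich get richer"): picking a random endpoint
# of a random edge is equivalent to picking a node with probability
# proportional to its degree.
edges = [(0, 1)]          # seed graph: two connected nodes
degree = {0: 1, 1: 1}

for new in range(2, 10_000):
    target = random.choice(random.choice(edges))
    edges.append((new, target))
    degree[new] = 1
    degree[target] += 1

top = sorted(degree.values(), reverse=True)
print("Top 5 degrees:", top[:5])
print("Median degree:", top[len(top) // 2])
```

The result is exactly the shape described above: a handful of hub nodes with very high degree and a long tail of nodes with degree one.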
Sir Tim noted that much of what was most interesting about the web was in the long tail, the less connected and less popular nodes. Zittrain invokes philosopher David Weinberger’s maxim, “In the future, everyone will be famous for 15 people” to acknowledge this idea, and Sir Tim pushes back: “That’s not scale free. What’s possible is that for n people on the planet, we might have root-n groups. We’re not trying to make one network for everyone, not trying to design something for Justin Bieber tweeting.”
So why doesn’t the blogosphere still work? Sir Tim blames the Facebook algorithms that determine what you read, breaking network effects and leading to a huge amount of consolidation. Zittrain wonders whether Facebook’s power is really all that new — didn’t Google’s search algorithm have similar effects? Sir Tim demurs — “Google just looks at all links and takes an eigenvector — it’s still using the web to search.” There’s a fascinating parenthetical where Sir Tim explains that he never thought search engines were possible. “Originally, we thought no one would be able to crawl the entire web — you would need so much storage, it wouldn’t be possible. We hadn’t realized that disk space would become ridiculously cheap.” Jonathan Zittrain likens the moment when Google came into being to a science fiction moment, in which our ability to comprehend a universe limited by the speed of light suddenly allows us to transcend those barriers — prior to search, we might only know our local quadrant of the web, while search suddenly made it possible to encounter any content, anywhere.
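Sir Tim’s “takes an eigenvector” remark refers to PageRank, which can be sketched as power iteration over a link graph. The four-page graph below is invented for illustration:

```python
# PageRank by power iteration: the rank vector converges to the dominant
# eigenvector of the (damped) link matrix.
links = {            # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
damping = 0.85
rank = {p: 1 / len(pages) for p in pages}

for _ in range(50):  # iterate until the rank vector stabilizes
    new = {p: (1 - damping) / len(pages) for p in pages}
    for p, outs in links.items():
        for q in outs:
            new[q] += damping * rank[p] / len(outs)
    rank = new

best = max(rank, key=rank.get)
print(best, round(rank[best], 3))
```

Page C is linked to by A, B, and D, so it ends up with the highest rank; notice that the scores come purely from the link structure of the web itself, which is Sir Tim’s point.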
Sir Tim brings us back to earth by discussing clickbait. “Blogging was driven by excitement around readership. But eventually ads come into play — if I am writing, I should have recompense.” What follows is content written specifically to generate money, like the fake news content written by Macedonian bloggers that might have influenced US elections. Zittrain generously references my “The Internet’s Original Sin” article, and Sir Tim notes that “some people argue that if you start off with advertising, you’re never going to have a successful web.”
The consequence of a monetized web, Sir Tim believes, is consolidation, designed to give advertisers larger audiences to reach. That consolidation leads to silos: “My photos are on Flickr, but my colleagues are all on LinkedIn? How do I share them? Do I have to persuade all my friends to move over to the platform I’m on?”
Zittrain offers two possible solutions to the problem: interoperability, where everything shares some common data models and can exchange data, or dramatic consolidation, where LinkedIn, for instance, just runs everything. Sir Tim isn’t overly optimistic about either, noting that totalitarian societies might be able to demand deep interop, but that it seems unlikely in our market democracy. And while consolidation is easier to work within, “consolidation is also incredibly frustrating. If you want to make a Facebook app, you need to work within not only the Facebook API, but the Facebook paradigm, with users, groups, and likes. Silos are very bad for innovation.”
Returning to the arc he’s drawn on the blackboard, Sir Tim notes that the meteor is crashing into earth. “We don’t need to imagine future web dystopias. We’ve got a television show where every single episode illustrates a different form of dysfunction.” The arc of the Web is long and it leads towards Black Mirror.
In March of this year, Sir Tim launched the #ForTheWeb campaign to celebrate the thirtieth anniversary of the Web. For Sir Tim, the campaign was meant to feature the web worth saving, not to demand that either governments or Facebook fix it for us. “We need to fix networks and communities all at once, because it’s a sociotechnical system,” he explains. “We need to work inside the companies and inside the government. Some things are simple to fix — net neutrality, cheaper broadband, those were relatively simple. This isn’t simple. Free speech and hate speech are complicated and need complex social processes around them.” And while #ForTheWeb is a space for articulating the key values we want to support for a future direction of the web, that new direction needs a technical component as well. We need a course correction — what’s the White Mirror scenario?
Sir Tim pushes up the blackboard featuring the web as a meteor crashing back to earth. On the board below it, he starts drawing a set of cylinders. Solid is based around the idea of pods, personal data stores that could live in the cloud or which you could control directly. “Solid is web technology reapplied,” Sir Tim explains. “You use apps and web apps, but they don’t store your data at all.”
Returning to his photo sharing scenario, Sir Tim imagines uploading photos taken from a digital camera. The camera asks where you want to store the data. “You have a Solid pod at home, and one at work — you decide where to put them based on what context you want to use them in. Solid is a protocol, like the web. Pods are Solid-compatible personal clouds. Apps can talk to your pod.” So sharing photos is no longer about making LinkedIn and Flickr talk to each other — it’s simply about both of them talking to your pod, which you control.
“The web was all about interoperability — this is a solution for interoperability,” explains Sir Tim. “You choose where to store your information and the pods do access control. There’s a single sign-on that leads to a WebID. Those WebIDs plus access controls are a common language across the Solid world.” These WebIDs support groups as well as individuals… and groups have pages where you can see who belongs to them. Apps look up the group and deliver information accordingly. The content delivery mechanism underneath Solid is WebDAV, a versioning and authoring protocol that Sir Tim has supported from very early on as a way of returning the Web to its read/write roots, though he notes that Solid plans on running on protocols that will be much faster.
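As a rough illustration of what those access controls look like, here is a sketch of a Web Access Control (WAC) document, one ACL format used in the Solid ecosystem. The WebIDs and resource URL are made up for the example:

```python
# Building a WAC (Web Access Control) document as a Turtle string.
# The acl: vocabulary is real; the identities below are hypothetical.
WEBID = "https://alice.example/profile/card#me"       # pod owner's WebID
DOCTOR = "https://doctor.example/profile/card#me"     # the doctor's WebID
RESOURCE = "https://alice.example/photos/ankle.jpg"   # a file in the pod

acl = f"""@prefix acl: <http://www.w3.org/ns/auth/acl#> .

<#owner>
    a acl:Authorization ;
    acl:agent <{WEBID}> ;
    acl:accessTo <{RESOURCE}> ;
    acl:mode acl:Read, acl:Write, acl:Control .

<#doctor>
    a acl:Authorization ;
    acl:agent <{DOCTOR}> ;
    acl:accessTo <{RESOURCE}> ;
    acl:mode acl:Read .
"""
print(acl)
```

The owner keeps full control of the photo while the doctor’s WebID is granted read-only access, which is the sharing scenario Sir Tim describes later with his ankle photo.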
Zittrain picks up the legal implications of this new paradigm: “Right now, each web app or service has custody of the data it uses — LinkedIn has a proprietary data store behind it. But there might also be some regulations that govern what LinkedIn can do with that data — how does that work in a Solid world?”
Ducking the legal question, Sir Tim looks into ways we might bootstrap personal data pods. “Because of GDPR, the major platforms have been forced to create a way for people to export their content. You’d expect that Google, Facebook and others would fight this tooth and nail — instead they’re cooperating.” Specifically, they’re developing the Data Transfer Project, a common standard for data export that allows you not only to export your data, but to import it into a different platform. “They’ve gone to the trouble of designing common data models, which is brilliant from the Solid point of view.”
Zittrain suggests that we can think of Solid’s development in stages. In Step 0, you might be able to retrieve your data from a platform, possibly from the API, possibly by scraping it, and you might get sued in the process. In Step 1, you can get your data through a Data Transfer dump. In Step 2, companies might begin making the data available regularly through Solid-compatible APIs. In Step 3, the Solid apps start working off the data that’s been migrated into personal pods.
Sir Tim notes that exciting things start to happen in Step 3. “My relationship with a bank is just a set of transactions and files. I can get a static copy of how the bank thinks of my current relationships. What I would like is for all those changes to be streamed to my Solid pod.” He concedes, “I probably don’t want to have the only copy.” Much of what’s interesting about Solid comes from the idea that pods can mirror each other in different ways — we might want to have a public debate in which all conversations are on the record and recorded, or an entirely ephemeral interaction, where all we say to one another disappears. This is one of many reasons, Sir Tim explains, “Solid does not use Blockchain. At all.”
Zittrain persists in identifying some of the challenges of this new model, referencing the Cambridge Analytica scandal that affected Facebook. “If the problem is privacy, specifically an API that made it easy to get not only my data, but my friends’ data, how does Solid help with this? Doesn’t there need to be someone minding controls of the access lists?”
Solid, Sir Tim explains, is not primarily about privacy. Initially, people worried about their personal data leaking, a compromising photo that was supposed to be private becoming public. Now we worry about how our data is aggregated and used. The response shouldn’t be to compensate people for that data usage. Instead, we need to help combat the manipulation. “Data is not oil. It doesn’t work that way, it’s not about owning it.” One of Sir Tim’s core concerns is that people offer valuable services, like free internet, in exchange for access to people’s datastream.
Zittrain points out that the idea that you own your own data — which is meant to be empowering — includes a deeply disempowering possibility. You now have the alienable right of giving away your own data.
Sir Tim is more excited about the upsides: “In a Solid world, my doctor has a Solid ID and I can choose the family photo that has a picture of my ankle and send it to the doctor for diagnosis. And I can access my medical data and share it with my cousin, if I choose.” Financial software interoperates smoothly, giving you access to your full financial picture. “All your fitness stuff is in your Solid Pod, and data from your friends if they want to share it so you can compete.” He imagines a record of purchases you’ve made on different sites, not just Amazon, and the possibility of running your own AI on top of it to make recommendations on what to buy next.
A member of the audience asks whether it’s really realistic for individuals to make decisions about how to share their data — we may not know what data it is unsafe to share, once it gets collected and aggregated. Can Solid really prevent data misuse?
“The Solid protocol doesn’t tell you whether these services spy on you, but the spirit of Solid is that they don’t,” offers Sir Tim. Apps are agents acting on your behalf. Not all Solid apps will be beneficent, he notes, but we can train certified developers to make beneficent apps, and offer a store of such apps. Zittrain, who wrote a terrific book about the ways in which app stores can strangle innovation, is visibly uncomfortable and suggests that people may need help knowing whom to trust in a Solid world. “Imagine a party able to be designated as a helper with respect to privacy. Maybe a grandchild is a helper for a grandmother. Maybe we need a new role in society — a fiduciary whose responsibility is to help you make trust decisions.” Zittrain’s question links Sir Tim’s ideas about Solid to an idea he’s been developing with Jack Balkin about information fiduciaries, the idea that platforms like Facebook might be required to treat our personal data with the legal respect that doctors, lawyers and accountants are forced to apply to personal data.
Another question wonders who will provide the hardware for Solid pods. Zittrain points out that Solid could run on Eben Moglen’s “Freedom Box”, a long-promised personal web server designed to put control of data back into users’ hands. Sir Tim suggests that your cable or ISP router might run a Pod in the future.
My question for Sir Tim focuses on adoption. Accepting for the moment the desirability of a Solid future — and, for the most part, I like Sir Tim’s vision a great deal — how do we get from here to there? For the foreseeable future, billions of people are using proprietary social networks that surveil their users and cling to their data. When Sir Tim last disrupted the Internet, it was an academic curiosity, not an industry worth hundreds of billions. How do we get from here to there?
Sir Tim remembers the advent of the web as a struggle. “Remember when Gopher was taking off exponentially, and the web was growing really slowly? Remember that things that take off fast can drop off fast.” Gopher wasn’t free, and its proprietary nature led it to die quickly; “People seem locked into Facebook — one of the rules of Solid is not to disturb them.” People who adopt Solid will work around them, and when people begin using Solid, that group could explode exponentially. “The billion people on Facebook don’t affect the people using a Solid community.”
Returning to the web’s early days, Sir Tim notes that it was difficult for the Web to take off — there were lots of non-internet documentation systems that seemed like they might win. What happened was that CERN’s telephone directory was put on the web, and everyone got a web browser to access that directory. It took a while before people realized that they might want to put other information on top of the directory.
“We don’t want everyone using Facebook to switch to Solid tomorrow — we couldn’t handle the scale.” Instead, Sir Tim offers, “We want people who are passionate about it to work within it. The reward is being part of another revolution.”
There’s something very surreal about a moment in which thousands of researchers and pundits are studying what’s wrong with social media and the Web, while surprisingly few are working on new models we can use to move forward. The man who built the web in the first place is now working on alternative models to save us from the Black Mirror universe, and the broader academic and professional world seems… surprisingly uninterested.
I can certainly see problems with Solid apps — your Pod will become a honeypot of private information that’s a great target for hackers. Apps will develop to collect as much of your Pod data as possible, unless they’re both regulated and technically prevented from doing so. Unless Pods are mostly on very fast cloud services, apps that draw from multiple pods will be significantly slower than the web as it operates today.
But there’s so much to like in Sir Tim’s vision. My lab and I are working now on the idea that what the world needs is not a better Facebook, but thousands of social networks, with different rules, purposes and community standards. Like Sir Tim, we’re not looking to replace Facebook but to create new communities for groups of 5 to 50,000, self-governing and capable of behaviors that communities with hundreds of millions of users and central corporate governance are not. There’s no reason why the networks we’re imagining couldn’t live atop Solid.
It’s hard to remember how small and strange an experiment the web was in 1989, or even in 1994. I remember dropping out of graduate school to work on a web startup. My motivation wasn’t that I might make a lot of money — that seemed extraordinarily unlikely. It was that someone was willing to pay me to work on something that seemed… right. Like a plausible and desirable future. And for me, at least, Solid seems plausible and desirable in much the same way. It also seems roughly as hard to love as the Web was in 1994, with its grey backgrounds and BLINK tag — Solid.Community allows you to register an ID, which at present doesn’t seem to let you do anything, though you can read the Github repository and see how you might create a chat app atop Solid.
Can Sir Tim revolutionize the Internet again? I have no idea. But someone needs to, because a web that crashes to earth is a Black Mirror episode I don’t want to see.
A columnist for The Washington Post and author of the Pulitzer-winning Gulag, Applebaum has been writing about Russia since the 1990s. Her fifth book is a detailed study of Stalin’s 1929 policy of agricultural collectivization, which set off the worst famine in European history. Some five million people died between 1931 and 1933 in the USSR. Of these, roughly three million were Ukrainians, and Applebaum definitively shows that they died due to deliberate government policy. Drawing on newly opened archives and personal accounts not previously translated, Applebaum substantiates the stories that Stalin suppressed Ukrainian uprisings by closing the borders, stopping food shipments, and letting the rebellious peasants starve.
(32 min): wrote about the Ukrainianism of American politics with Paul Manafort
Search out far left and far right. They don’t invent, but they do fund.
Question: how do we divide people?
“[Stalin], writing in private (you know what he writes to Kaganovich and these other sidekicks), believes his ideology. And one of the things that’s important about the Bolsheviks is that they believed Marxism wasn’t just some kind of theory; they believed that it was a science and it was true. And because it’s science and it’s true, and we define what it is, that means that whatever we’ve said is true, and this is how things are going to be. And if it doesn’t work out in reality the way we thought it was going to, then somebody else is responsible. And who’s responsible? Saboteurs, wreckers, kulaks, enemies of the people, enemies of the state. And I actually believe now that a lot of the violence, the cycles of violence you have in the Soviet Union (in 1932 and ’33 you had the famine; a few years later you had the purges of 1937), that cyclical violence is almost always a response to policy failure. It hasn’t worked, the revolution hasn’t brought prosperity and made us happy, so there has to be a reason for it. Okay, let’s find the parasites who are sucking the blood of the revolution and get rid of them. And so your logical point, ‘okay, look, this agricultural policy hasn’t worked, let’s change it,’ that’s not how they thought. It wasn’t ‘let’s change it.’ It’s not our policy that needs to change; it’s the people, and reality, that have to adjust to our way of thinking. And anyway, as I said…”