DUBAI, June 22 (Reuters) – The U.S. Justice Department said on Tuesday it seized 36 Iranian-linked websites, many of them associated with either disinformation activities or violent organizations, taking them offline for violating U.S. sanctions. Several of the sites were back online within hours with new domain addresses. “Today, pursuant to court orders, the United States seized 33 websites used by the Iranian Islamic Radio and Television Union (IRTVU) and three websites operated by Kata’ib Hizballah (KH), in violation of U.S. sanctions,” the department said in a statement. https://www.reuters.com/world/middle-…
“But to the purveyors of the big lie—Republicans like Greg Abbott and his friends on Fox News—this very real and acute suffering is just a vehicle for their political objectives,” says Chris Hayes of the GOP blaming the Green New Deal as millions of Texans freeze. Aired on 02/17/2021.
The Hunter Biden story is a fascinating piece of disinformation and agitprop.
I’ve been following it closely, partly because it’s an interesting political story and partly because I’m an avid infosec enthusiast and I actually know how emails work.
I was, to be honest, surprised when the “Hunter laptop” scam first began that anyone actually believed it. If you know even a little bit about how email works, it’s plain as snow in December that the story was fake. The supposed “Hunter Biden emails” were released as PDFs, with no headers, from email domains that do not exist. I was like “Really? People are falling for this? God damn conservatives sure are gullible.”
But in the time I’ve been in Florida helping to care for my mother, I’ve realized that isn’t fair.
My dad is older (he’s 82). He uses email every single day, but he has not the slightest clue how it works, and he cannot define the word “domain,” much less spot an invalid email domain.
And honestly, a lot of people, on the left and right, are that way. It’s not that conservatives are stupid, it’s that to most people, email is magic. And they don’t ever see email headers, so it doesn’t look suspicious if someone shows them a fake email with no headers. And, like my dad, they don’t know what a domain is, so if you show them an email with a clearly bogus “from” address they don’t even blink.
In that sense, what looks like a crude, hamfisted attempt at second-rate disinformation is actually pretty savvy. It’s propaganda aimed at a very specific audience: an audience that is not technically savvy, but—and this is the important part—has also been indoctrinated to distrust “elitist experts” who think they know better. So this audience (1) doesn’t have the technical skill to see through even a very crude, simplistic scam and (2) will automatically respond to anyone who points out the scam with “neener neener I don’t believe you!!!”
It’s been a very interesting lesson in 21st-century propaganda. Forging a real email is hard. Email is trackable and traceable. It passes through many computers and it leaves traces in every one. You can not easily forge a realistic, believable email even if you have nearly unlimited resources…
…but you don’t have to.
If the target of your propaganda is people with the limited technical knowledge of my father who have also been told to distrust experts, it’s not necessary.
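To make the point concrete: the “traces” mentioned above live mostly in the Received headers that each relaying mail server prepends to a message. A minimal sketch with Python’s standard email module — the message, hostnames, and addresses here are all invented for illustration:

```python
from email import message_from_string
from email.utils import parseaddr

# A toy raw email as it might arrive, with the Received headers that
# each relaying server prepends. (All hosts/addresses are invented.)
raw = """\
Received: from mx.example.net (mx.example.net [203.0.113.7])
        by mail.recipient.org with ESMTP id abc123;
        Tue, 20 Oct 2020 10:00:00 -0400
Received: from sender-laptop (unknown [198.51.100.23])
        by mx.example.net with ESMTPSA id xyz789;
        Tue, 20 Oct 2020 09:59:58 -0400
From: Someone <someone@example.net>
To: you@recipient.org
Subject: Hello

Body text.
"""

msg = message_from_string(raw)

# Each relay prepends its own Received header, so reading them
# top-to-bottom walks the delivery path from recipient back to origin.
hops = msg.get_all("Received")
for i, hop in enumerate(hops):
    print(f"hop {i}: {' '.join(hop.split())}")

# The From header is trivially forgeable, but the claimed domain
# should at least exist and resolve. A screenshot or PDF with no
# headers at all offers nothing to check.
name, addr = parseaddr(msg["From"])
domain = addr.rsplit("@", 1)[-1]
print("claimed sender domain:", domain)
```

A real forensic check would go further (DKIM signatures, SPF records, DNS lookups on each hop), but even this much is impossible when all you’re handed is a headerless PDF.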
Now, having said that, parts of the scam are very sophisticated.
This is Martin Aspen.
Martin Aspen wrote a 64-page dossier documenting corruption in Hunter Biden’s business dealings in China, which was released by security firm Typhoon Investigations.
Martin Aspen does not exist.
Typhoon Investigations does not exist.
The photograph of the person you see here is not a picture of a person. It was created by a GAN (generative adversarial network), a type of machine-learning model used to make deepfakes.
If you’re not familiar with these, I recommend you visit the site This Person Does Not Exist:
Every time you refresh the browser, you will see a photograph of a different person. None of the photographs are real. None of the people exist. The photographs are all created by machine learning deepfake programs.
Anyway, back to Martin Aspen.
Martin Aspen does not exist. His photo is a computer-generated deepfake. His resume lists companies he’s never worked for, universities that have no record of him, and security firms that don’t exist. The entire dossier was faked.
Fifteen, twenty, thirty years ago, this level of fakery would have required the concerted effort of a nation-state’s intelligence team. Today, a single reasonably skilled person can do it. I can do it. You can do it.
And the thing that’s most fascinating about all of this, besides the fact that it shows how fragile and easily manipulated public perception is? Proving that the document was fake will not change a single mind.
The Hunter Biden saga has revealed two things:
- Small groups of individuals, even a single person sitting in a bedroom, can create agitprop and disinformation campaigns that would only a short time ago have been the envy of entire government intelligence teams.
- People want to believe. They’re simply looking for an excuse. Modern propaganda does not need to be subtle. It doesn’t need to be well-done. It doesn’t need to stand up to any scrutiny. It merely needs to give people an excuse to believe what they already want to believe.
On a sunny summer morning in June, professor Jonathan Zittrain is hosting Sir Tim Berners-Lee in a Harvard Law School classroom. The audience is a smattering of visiting scholars at the Berkman Klein Center for Internet and Society and a few local techies involved with open source software development. I’d come to the room half an hour early to snag a seat, but I needn’t have bothered, as the crowd to see the man who invented the World Wide Web is attentive, but thin.
Jonathan Zittrain, one of the world’s leading scholars of creativity in an internet-connected universe, points out that Sir Tim’s current work is attempting to make a second correction in the arc of the internet. His first innovation, thirty years ago, was “the conceptualization and the runaway success of the World Wide Web.” Sir Tim’s current idea is a protocol — Solid — and a company — Inrupt — which want to make the Web as it is now significantly better. Just what are Solid and Inrupt? That’s what a smattering of us are here to find out.
Sir Tim draws an arc on the chalkboard behind him. “People talk about the meteoric rise of the web — of course, meteors go down.” Referencing internet disinformation expert Joan Donovan, sitting in the audience, he notes “If you study the bad things on the web, there’s hundreds and thousands to study.” Almost apologetically, he explains that “there was a time when you could see things that were new [online], but not the ways they were bad.” For Sir Tim, the days of blogs were pretty good ones. “When you made a blog, you tried to make it high quality, and you tried to make your links to high quality blogs. You as a blogger were motivated by your reading counter, which led to a virtuous system based on custodianship as well as authorship.” Wistfully, he noted, “You could be forgiven for being fairly utopian in those days.”
What came out of this moment in the web’s evolution was a “true scale-free network, based on HTTP and HTML.” (Scale-free networks follow a Pareto distribution, with a small number of highly connected nodes and a “long tail” of less-connected nodes.) “It was extraordinary to discover that when you connect humanity, they form scale-free networks at all different levels. We put out HTTP and HTML and ended up with humanity forming scale-free networks on a planetary — okay, a tenth of a planet — scale.”
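The scale-free shape Sir Tim describes emerges from preferential attachment: new nodes tend to link to nodes that are already well linked. A small simulation of the Barabási–Albert mechanism (my own sketch, not anything from the talk) shows the hub-and-long-tail degree distribution:

```python
import random
from collections import Counter

random.seed(42)

def preferential_attachment(n, m=2):
    """Grow a graph where each new node links to m existing nodes,
    chosen with probability proportional to their current degree."""
    # Start with a small fully connected seed graph.
    edges = [(0, 1), (0, 2), (1, 2)]
    # 'targets' holds one entry per edge endpoint, so a uniform draw
    # from it picks nodes in proportion to their degree.
    targets = [u for e in edges for u in e]
    for new in range(3, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(random.choice(targets))
        for old in chosen:
            edges.append((new, old))
            targets.extend([new, old])
    return edges

edges = preferential_attachment(2000)
degree = Counter(u for e in edges for u in e)
degs = sorted(degree.values(), reverse=True)

# A handful of heavily connected hubs, a long tail of barely
# connected nodes: the Pareto shape described above.
print("top 5 degrees:", degs[:5])
print("median degree:", degs[len(degs) // 2])
```

The hubs dwarf the median node, which is exactly the distribution Sir Tim says humanity spontaneously formed on top of HTTP and HTML.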
Sir Tim noted that much of what was most interesting about the web was in the long tail, the less connected and less popular nodes. Zittrain invokes philosopher David Weinberger’s maxim, “In the future, everyone will be famous for 15 people” to acknowledge this idea, and Sir Tim pushes back: “That’s not scale free. What’s possible is that for n people on the planet, we might have root-n groups. We’re not trying to make one network for everyone, not trying to design something for Justin Bieber tweeting.”
So why doesn’t the blogosphere still work? Sir Tim blames the Facebook algorithms which determine what you read, breaking network effects and leading to a huge amount of consolidation. Zittrain wonders whether Facebook’s power is really all that new — didn’t Google’s search algorithm have similar effects? Sir Tim demurs — “Google just looks at all links and takes an eigenvector — it’s still using the web to search.” There’s a fascinating parenthetical where Sir Tim explains that he never thought search engines were possible. “Originally, we thought no one would be able to crawl the entire web — you would need so much storage, it wouldn’t be possible. We hadn’t realized that disk space would become ridiculously cheap.” Jonathan Zittrain likens the moment when Google came into being to a science fiction moment, where our ability to comprehend the universe as limited by the speed of light suddenly allows us to transcend those barriers — prior to search, we might only know our local quadrant of the web, while search suddenly made it possible to encounter any content, anywhere.
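Sir Tim’s “takes an eigenvector” aside refers to PageRank: the ranking is the principal eigenvector of a damped link matrix, computable by power iteration. A toy sketch (the link graph here is invented for illustration):

```python
def pagerank(links, damping=0.85, iters=50):
    """Power iteration over a link graph given as {page: [pages it
    links to]}. Converges to the principal eigenvector of the damped
    link matrix -- the 'eigenvector' Sir Tim mentions."""
    pages = sorted(set(links) | {q for outs in links.values() for q in outs})
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # Every page gets a baseline share (the 'random surfer' jump).
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:
                continue
            # A page splits its damped rank evenly among its out-links.
            share = damping * rank[p] / len(outs)
            for q in outs:
                new[q] += share
        # Rank mass lost to dangling pages is spread uniformly.
        lost = 1.0 - sum(new.values())
        for p in pages:
            new[p] += lost / n
        rank = new
    return rank

# A toy web: everyone links to 'hub', which links back to one page.
toy = {"a": ["hub"], "b": ["hub"], "c": ["hub", "a"], "hub": ["a"]}
ranks = pagerank(toy)
for page, r in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {r:.3f}")
```

The heavily linked-to page ends up with the most rank, which is the whole point: the algorithm reads the web’s existing link structure rather than replacing it, as Sir Tim notes.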
Sir Tim brings us back to earth by discussing clickbait. “Blogging was driven by excitement around readership. But eventually ads come into play — if I am writing, I should have recompense.” What follows is content written specifically to generate money, like the fake news content written by Macedonian bloggers that might have influenced US elections. Zittrain generously references my “The Internet’s Original Sin” article, and Sir Tim notes that “some people argue that if you start off with advertising, you’re never going to have a successful web.”
The consequence of a monetized web, Sir Tim believes, is consolidation, designed to give advertisers larger audiences to reach. That consolidation leads to silos: “My photos are on Flickr, but my colleagues are all on LinkedIn? How do I share them? Do I have to persuade all my friends to move over to the platform I’m on?”
Zittrain offers two possible solutions to the problem: interoperability, where everything shares some common data models and can exchange data, or dramatic consolidation, where LinkedIn, for instance, just runs everything. Sir Tim isn’t overly optimistic about either, noting that totalitarian societies might be able to demand deep interop, but that it seems unlikely in our market democracy. And while consolidation is easier to work within, “consolidation is also incredibly frustrating. If you want to make a Facebook app, you need to work within not only the Facebook API, but the Facebook paradigm, with users, groups, and likes. Silos are very bad for innovation.”
Returning to the arc he’s drawn on the blackboard, Sir Tim notes that the meteor is crashing into earth. “We don’t need to imagine future web dystopias. We’ve got a television show where every single episode illustrates a different form of dysfunction.” The arc of the Web is long and it leads towards Black Mirror.
In March of this year, Sir Tim launched the #ForTheWeb campaign to celebrate the thirtieth anniversary of the Web. For Tim, the campaign was meant to feature the web worth saving, not to demand that either governments or Facebook fix it for us. “We need to fix networks and communities all at once, because it’s a sociotechnical system,” he explains. “We need to work inside the companies and inside the government. Some things are simple to fix — net neutrality, cheaper broadband, those were relatively simple. This isn’t simple. Free speech and hate speech are complicated and need complex social processes around them.” And while #ForTheWeb is a space for articulating the key values we want to support for a future direction of the web, that new direction needs a technical component as well. We need a course correction — what’s the White Mirror scenario?
Sir Tim pushes up the blackboard featuring the web as a meteor crashing back to earth. On the board below it, he starts drawing a set of cylinders. Solid is based around the idea of pods, personal data stores that could live in the cloud or which you could control directly. “Solid is web technology reapplied,” Sir Tim explains. “You use apps and web apps, but they don’t store your data at all.”
Returning to his photo sharing scenario, Sir Tim imagines uploading photos taken from a digital camera. The camera asks where you want to store the data. “You have a Solid pod at home, and one at work — you decide where to put them based on what context you want to use them in. Solid is a protocol, like the web. Pods are Solid-compatible personal clouds. Apps can talk to your pod.” So sharing photos is no longer about making LinkedIn and Flickr talk to each other — it’s simply about both of them talking to your pod, which you control.
“The web was all about interoperability — this is a solution for interoperability,” explains Sir Tim. “You choose where to store your information and the pods do access control. There’s a single sign-on that leads to a WebID. Those WebIDs plus access controls are a common language across the Solid world.” These WebIDs support groups as well as individuals… and groups have pages where you can see who belongs to them. Apps look up the group and deliver information accordingly. The content delivery mechanism underneath Solid is WebDAV, a versioning and authoring protocol that Sir Tim has supported from very early on as a way of returning the Web to its read/write roots, though he notes that Solid plans on running on protocols that will be much faster.
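As a thought experiment only, the pod/WebID/access-control arrangement Sir Tim describes can be caricatured in a few lines of Python. Every class, method, and URL below is invented for illustration; the real Solid specification expresses access controls as RDF documents using the Web Access Control vocabulary rather than anything like this:

```python
class Pod:
    """A toy 'personal data store': resources live with the user,
    and access is granted per resource to WebIDs or groups of WebIDs."""

    def __init__(self, owner_webid):
        self.owner = owner_webid
        self.resources = {}   # path -> content
        self.acl = {}         # path -> set of agents allowed to read
        self.groups = {}      # group id -> set of member WebIDs

    def put(self, path, content):
        self.resources[path] = content
        # By default only the owner can read a new resource.
        self.acl[path] = {self.owner}

    def grant(self, path, agent):
        """Grant read access to a WebID or a group id."""
        self.acl[path].add(agent)

    def get(self, path, webid):
        allowed = self.acl[path]
        # An agent may be allowed directly, or via any group it belongs to.
        if webid in allowed or any(
            webid in self.groups.get(g, ()) for g in allowed
        ):
            return self.resources[path]
        raise PermissionError(f"{webid} may not read {path}")

pod = Pod("https://alice.example/profile#me")
pod.groups["https://alice.example/groups#family"] = {
    "https://cousin.example/profile#me",
}
pod.put("/photos/ankle.jpg", b"...jpeg bytes...")
pod.grant("/photos/ankle.jpg", "https://doctor.example/profile#me")
pod.grant("/photos/ankle.jpg", "https://alice.example/groups#family")

# The doctor and the cousin can read the photo; a stranger cannot.
print(pod.get("/photos/ankle.jpg", "https://doctor.example/profile#me"))
print(pod.get("/photos/ankle.jpg", "https://cousin.example/profile#me"))
```

The point of the caricature is the inversion Sir Tim describes: apps (the doctor’s, the fitness tracker’s) come to the user’s store and ask, rather than each keeping its own silo of the user’s data.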
Zittrain picks up the legal implications of this new paradigm: “Right now, each web app or service has custody of the data it uses — LinkedIn has a proprietary data store behind it. But there might also be some regulations that govern what LinkedIn can do with that data — how does that work in a Solid world?”
Ducking the legal question, Sir Tim looks into ways we might bootstrap personal data pods. “Because of GDPR, the major platforms have been forced to create a way for people to export their content. You’d expect that Google, Facebook and others would fight this tooth and nail — instead they’re cooperating.” Specifically, they’re developing the Data Transfer Project, a common standard for data export that allows you not only to export your data, but to import it into a different platform. “They’ve gone to the trouble of designing common data models, which is brilliant from the Solid point of view.”
Zittrain suggests that we can think of Solid’s development in steps. In Step 0, you might be able to retrieve your data from a platform, possibly from the API, possibly by scraping it, and you might get sued in the process. In Step 1, you can get your data through a Data Transfer dump. In Step 2, companies might begin making the data available regularly through Solid-compatible APIs. In Step 3, the Solid apps start working off the data that’s been migrated into personal pods.
Sir Tim notes that exciting things start to happen in Step 3. “My relationship with a bank is just a set of transactions and files. I can get a static copy of how the bank thinks of my current relationships. What I would like is for all those changes to be streamed to my Solid pod.” He concedes, “I probably don’t want to have the only copy.” Much of what’s interesting about Solid comes from the idea that pods can mirror each other in different ways — we might want to have a public debate in which all conversations are on the record and recorded, or an entirely ephemeral interaction, where all we say to one another disappears. This is one of many reasons, Sir Tim explains, “Solid does not use Blockchain. At all.”
Zittrain persists in identifying some of the challenges of this new model, referencing the Cambridge Analytica scandal that affected Facebook. “If the problem is privacy, specifically an API that made it easy to get not only my data, but my friends’ data, how does Solid help with this? Doesn’t there need to be someone minding controls of the access lists?”
Solid, Sir Tim explains, is not primarily about privacy. Initially, people worried about their personal data leaking, a compromising photo that was supposed to be private becoming public. Now we worry about how our data is aggregated and used. The response shouldn’t be to compensate people for that data usage. Instead, we need to help combat the manipulation. “Data is not oil. It doesn’t work that way, it’s not about owning it.” One of Sir Tim’s core concerns is the practice of offering valuable services, like free internet, in exchange for access to people’s datastreams.
Zittrain points out that the idea that you own your own data — which is meant to be empowering — includes a deeply disempowering possibility. You now have the alienable right of giving away your own data.
Sir Tim is more excited about the upsides: “In a Solid world, my doctor has a Solid ID and I can choose the family photo that has a picture of my ankle and send it to the doctor for diagnosis. And I can access my medical data and share it with my cousin, if I choose.” Financial software interoperates smoothly, giving you access to your full financial picture. “All your fitness stuff is in your Solid Pod, and data from your friends if they want to share it so you can compete.” He imagines a record of purchases you’ve made on different sites, not just Amazon, and the possibility of running your own AI on top of it to make recommendations on what to buy next.
A member of the audience asks whether it’s really realistic for individuals to make decisions about how to share their data — we may not know what data it is unsafe to share, once it gets collected and aggregated. Can Solid really prevent data misuse?
“The Solid protocol doesn’t tell you whether these services spy on you, but the spirit of Solid is that they don’t,” offers Sir Tim. Apps are agents acting on your behalf. Not all Solid apps will be beneficent, he notes, but we can train certified developers to make beneficent apps, and offer a store of such apps. Zittrain, who wrote a terrific book about the ways in which app stores can strangle innovation, is visibly uncomfortable and suggests that people may need help knowing who to trust in a Solid world. “Imagine a party able to be designated as a helper with respect to privacy. Maybe a grandchild is a helper for a grandmother. Maybe we need a new role in society — a fiduciary whose responsibility is to help you make trust decisions.” Zittrain’s question links Sir Tim’s ideas about Solid to an idea he’s been developing with Jack Balkin about information fiduciaries, the idea that platforms like Facebook might be required to treat our personal data with the legal respect that doctors, lawyers and accountants are forced to apply to personal data.
Another question wonders who will provide the hardware for Solid pods. Zittrain points out that Solid could run on Eben Moglen’s “Freedom Box”, a long-promised personal web server designed to put control of data back into users’ hands. Sir Tim suggests that your cable or ISP router might run a Pod in the future.
My question for Sir Tim focuses on adoption. Accepting for the moment the desirability of a Solid future — and, for the most part, I like Sir Tim’s vision a great deal — how do we get from here to there? For the foreseeable future, billions of people are using proprietary social networks that surveil their users and cling to their data. When Sir Tim last disrupted the Internet, it was an academic curiosity, not an industry worth hundreds of billions. How do we get from here to there?
Sir Tim remembers the advent of the web as a struggle. “Remember when Gopher was taking off exponentially, and the web was growing really slowly? Remember that things that take off fast can drop off fast.” Gopher wasn’t free, and its proprietary nature led it to die quickly; “People seem locked into Facebook — one of the rules of Solid is not to disturb them.” People who will adopt Solid will work around them, and when people begin using Solid, that group could explode exponentially. “The billion people on Facebook don’t affect the people using a Solid community.”
Returning to the web’s earliest days, Sir Tim notes that it was difficult for the Web to take off — there were lots of non-internet documentation systems that seemed like they might win. What happened was that CERN’s telephone directory was put on the web, and everyone got a web browser to access that directory. It took a while before people realized that they might want to put other information on top of the directory.
“We don’t want everyone using Facebook to switch to Solid tomorrow — we couldn’t handle the scale.” Instead, Sir Tim offers, “We want people who are passionate about it to work within it. The reward is being part of another revolution.”
There’s something very surreal about a moment in which thousands of researchers and pundits are studying what’s wrong with social media and the Web, and surprisingly few are working on new models we can use to move forward. The man who built the web in the first place is now working on alternative models to save us from the Black Mirror universe, and the broader academic and professional world seems… surprisingly uninterested.
I can certainly see problems with Solid apps — your Pod will become a honeypot of private information that’s a great target for hackers. Apps will develop to collect as much of your Pod data as possible, unless they’re both regulated and technically prevented from doing so. Unless Pods are mostly on very fast cloud services, apps that draw from multiple pods will be significantly slower than the web as it operates today.
But there’s so much to like in Sir Tim’s vision. My lab and I are working now on the idea that what the world needs now is not a better Facebook, but thousands of social networks, with different rules, purposes and community standards. Like Sir Tim, we’re not looking to replace Facebook but to create new communities for groups of 5 to 50,000, self-governing and capable of different behaviors than the communities with hundreds of millions of users and central corporate governance are capable of. There’s no reason why the networks we’re imagining couldn’t live atop Solid.
It’s hard to remember how small and strange an experiment the web was in 1989, or even in 1994. I remember dropping out of graduate school to work on a web startup. My motivation wasn’t that I might make a lot of money — that seemed extraordinarily unlikely. It was that someone was willing to pay me to work on something that seemed… right. Like a plausible and desirable future. And for me, at least, Solid seems plausible and desirable in much the same way. It also seems roughly as hard to love as the Web was in 1994, with its grey backgrounds and BLINK tag — Solid.Community allows you to register an ID, which at present doesn’t seem to let you do anything, though you can read the Github repository and see how you might create a chat app atop Solid.
Can Sir Tim revolutionize the Internet again? I have no idea. But someone needs to, because a web that crashes to earth is a Black Mirror episode I don’t want to see.
A columnist for The Washington Post and author of the Pulitzer-winning Gulag, Applebaum has been writing about Russia since the 1990s. Her fifth book is a detailed study of Stalin’s 1929 policy of agricultural collectivization, which set off the worst famine in European history. Some five million people died between 1931 and 1933 in the USSR. Of these, roughly three million were Ukrainians, and Applebaum definitively shows that they died due to deliberate government policy. Drawing on newly opened archives and personal accounts not previously translated, Applebaum substantiates the stories that Stalin suppressed Ukrainian uprisings by closing the borders, stopping food shipments, and letting the rebellious peasants starve.
(32 min): wrote about the Ukrainianism of American politics with Paul Manafort
Search out far left and far right. They don’t invent, but they do fund.
Question: how do we divide people?
[Stalin], writing in private — you know what he writes to Kaganovich and these other sidekicks — believes his ideology. One of the things that’s important about the Bolsheviks is that they believed Marxism wasn’t just some kind of theory; they believed that it was a science, and that it was true. And because it’s science and it’s true, we define what it is, and that means that whatever we’ve said is true, and this is how things are going to be. And if it doesn’t work out in reality the way we thought it was going to, then somebody else is responsible. And who’s responsible? Saboteurs, wreckers, kulaks, enemies of the people, enemies of the state.

And I actually believe now that a lot of the violence — the kinds of cycles of violence you have in the Soviet Union; in 1932 and ’33 you had the famine, a few years later you had the purges of 1937 — that cyclical violence is almost always a response to policy failure. It hasn’t worked; the revolution hasn’t brought prosperity and made us happy; there has to be a reason for it. Okay, let’s find the parasites who are sucking the blood of the revolution and get rid of them.

And so your logical point — okay, well, look, this agricultural policy hasn’t worked, let’s change it — that’s not how they thought. It wasn’t “let’s change it.” It’s not our policy that needs to change; it’s the people, and reality, that have to adjust to our way of thinking.
Yet in anonymously trying to exploit the fissures within the Democratic ranks — fissures that ran through this past week’s debates — Mr. Mauldin’s website hews far closer to the disinformation spread by Russian trolls in 2016 than typical political messaging. With nothing to indicate its creator’s motives or employer, the website offers a preview of what election experts and national security officials say Americans can expect to be bombarded with for the next year and a half: anonymous and hard-to-trace digital messaging spread by sophisticated political operatives whose aim is to sow discord through deceit. Trolling, that is, as a political strategy.
Mr. Mauldin, who has not been previously identified as the creator of the website, said he had built and paid for it on his own, and not for the Trump campaign. But the campaign knows about the websites, raising the prospect that the president’s re-election effort condoned what is, in essence, a disinformation operation run by one of its own.
“We appreciate their efforts in their own time with parodies like this that help the cause,” he added.
Inside the campaign, Mr. Mauldin, 30, is seen as a rising star, prized for his mischievous sense of humor and digital know-how, according to two people familiar with the operation. He also appears to be very much on point in his choice of targets: Mr. Biden is the Democrat polling strongest against Mr. Trump and has been repeatedly singled out on Twitter by the president.
Mr. Biden’s campaign knew about the fake website for months, but had not been aware of who was behind it, said T.J. Ducklo, a campaign spokesman. “Imagine our surprise that a site full of obvious disinformation,” he said, “is the handiwork of an operative tied to the Trump campaign.”
Mr. Ducklo sought to place the website firmly in the context of Mr. Trump’s own social media habits — such as tweeting doctored videos — and what he said was the president’s lack of interest in measures to ensure the integrity of American elections.
In addition to Mr. Biden, Mr. Mauldin has anonymously set up faux campaign websites for at least three other Democratic front-runners. “Millionaire Bernie” seeks to tar Mr. Sanders as a greedy socialist; “Elizabeth Warren for Chief” mocks her claim of Native American ancestry; and “Kamala Harris for Arresting the People” highlights her work as a prosecutor who, the site says, “put parents in jail for children skipping school — and laughed about it.”
None, though, has proved as successful as the Biden website. Mr. Mauldin boasted in the interview that he had fooled people into thinking his Biden website was the real campaign page. Some offered to donate money, he said, and others wanted to volunteer.
Mr. Mauldin insisted there was nothing duplicitous about it. “I don’t make any claims on the site to lean one way or the other,” he said, adding, “Facts are not partisan.”
It is buyer beware, and not just for unwitting Democrats. In 2017, a group of Democrats took a page out of the Russian playbook and posed as conservatives to try to divide Republicans in Alabama’s special Senate election, a race narrowly won by a Democrat. And as the 2020 campaign gets underway, election experts say they see signs that Americans from both sides of the political divide are getting ready to do the same. National security officials are also warning that Russia will again try to disrupt the election by spreading disinformation.
Meddling by foreigners is illegal. But trolling or disinformation spread by American citizens is protected by the First Amendment, and if Mr. Mauldin’s work is any guide, Americans may well do a far better job deceiving one another than any Russian troll could hope for.
A Viral Hit
Unlike much of the Russian disinformation, which often has been crude and off-key — remember the Facebook ad promoting Mr. Sanders as a gay-rights superhero? — the faux Biden site has been a viral hit. Mr. Mauldin even started selling mock Biden 2020 T-shirts through the website to capitalize on its success.
From mid-March, when Mr. Mauldin first began promoting the website on Reddit, through the end of May, it had more than 390,000 unique visitors, according to data compiled by SimilarWeb, a firm that analyzes web traffic. Mr. Biden’s official campaign website had about 310,000.
Of the people who found the websites through search engines, 83 percent landed on Mr. Mauldin’s page, according to SimilarWeb. None of it was paid traffic.
The website’s success was not accidental. Mr. Mauldin put it up well before Mr. Biden’s official website and aggressively pushed it out on Reddit, getting clicks and links and exposure. It had a big boost in May when a handful of media outlets — The Daily Caller and CNET, among others — wrote stories about the fake page beating Mr. Biden’s and linked to it. Links from established media websites are weighted heavily by search engines. The New York Times is not linking to Mr. Mauldin’s websites to avoid further boosting them in search rankings.

The Trump consultant, Patrick Mauldin, has built websites featuring a number of candidates, including Senator Elizabeth Warren.
In recent weeks, as search companies have become aware that Mr. Mauldin’s website is fake, it has fallen below the real Biden page. But it remains among the top results, and it already appears to have fooled people.
“I know a lot of Biden supporters were furious when they saw that website,” said David Goldstein, the chief executive of Tovo Labs, a Democratic digital consulting firm in New York. “They suspected other Dem candidates were behind it.”
Then there were the less politically astute. In late April, Mr. Mauldin anonymously took to Reddit to boast that people were confusing his website for the real one. He posted in r/The_Donald, a popular spot for right-wing trolls to trade tips and show off, using the handle NPC_12345.
“How many Democrats can we red pill with my fake Joe Biden site?” Mr. Mauldin wrote in one post.
Another post included messages from duped Democrats. One person wanted Mr. Biden to speak at her son’s school. Another suggested the former vice president look to an old soul group, the Fifth Dimension, for his campaign song.
There were even messages asking Mr. Biden not to criticize other Democrats, Mr. Mauldin said in the interview. “They want it to be all ‘Kumbaya’ with the Democrats.”
He was not having it. “It’s important for everyone to realize aspects of their own side or candidate that maybe they don’t know about or don’t want to look at,” he said.
By “their own side,” Mr. Mauldin meant Democrats. He is not trolling any Republicans.
For decades, conventional wisdom in politics held that trying to undermine your opponent’s base would only motivate that group to vote against you. But in 2016, Russian disinformation and the Trump team’s own targeting of disenchanted Democrats led many campaign veterans on the left and the right to conclude that sowing dissent inside an opponent’s ranks could work. It worked especially well if the criticism appeared to come from their own side.
[Photo caption: Mr. Mauldin posted on Reddit about his fake websites, helping to drive traffic to them.]
With websites like the faux Biden page, “essentially you’re trying to sow chaos and you’re trying to basically do voter suppression,” said Mr. Goldstein, the Democratic consultant.
“You want their supporters to get sad, to get angry, to get turned off from their chosen candidate,” he continued. “The way voters tend to work: They don’t turn off from a candidate and pick up someone else; they turn off from a candidate and turn off politics.”
Mr. Goldstein’s firm, Tovo, tried to prove as much during Alabama’s special Senate election in 2017. With targeted ads, Tovo led conservative Republicans to a website featuring articles by conservatives who opposed the far-right candidate, Roy Moore. Moderate Republicans were directed to a site that suggested they write in a different candidate. The effort relied only on genuine content from conservatives, and it was entirely separate from the Democrats who used Facebook to pose as conservatives.
Tovo later published its findings. It claimed to have driven down moderate Republican turnout by 2.5 percent, and conservative Republican turnout by 4.4 percent.
Unlike Tovo, Mr. Mauldin makes no claims of trying to prove any concepts, and he had no intention of outing himself. When approached by The Times, he argued that he should not be identified because he had not sought the spotlight, and because he feared threats and harassment. He preferred “to work behind the scenes,” he wrote in an email.
Mr. Mauldin registered the Biden site privately so that his name and contact details would not appear in any public searches. But The Times was able to confirm Mr. Mauldin’s identity because the Biden page shared the same Google Analytics tags with a number of other active and defunct websites, including the ones he has made for the three other Democratic candidates. Some of those sites that shared the Google tags were registered under Mr. Mauldin’s name.
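This attribution technique — linking anonymously registered sites through a shared analytics account — is a staple of open-source investigation. A Google Analytics property ID of the era looked like UA-XXXXXXXX-N, where the first number identifies the account and the suffix a property within it, so two domains embedding the same account number are likely run by the same operator. A minimal sketch of the idea, using entirely made-up domains and IDs (not the actual tags The Times found):

```python
import re
from collections import defaultdict

# Hypothetical page sources for illustration only; the domains and
# UA- IDs below are invented, not the ones from the investigation.
pages = {
    "fake-candidate-a.example": '<script>ga("create", "UA-55555555-1", "auto");</script>',
    "fake-candidate-b.example": "<script>ga('create', 'UA-55555555-2', 'auto');</script>",
    "unrelated.example": '<script>ga("create", "UA-99999999-1", "auto");</script>',
}

# Capture only the account number (the part shared across an operator's sites).
GA_ID = re.compile(r"UA-(\d{4,10})-\d+")

def group_by_account(pages):
    """Map each Google Analytics account number to the domains embedding it."""
    accounts = defaultdict(set)
    for domain, html in pages.items():
        for account in GA_ID.findall(html):
            accounts[account].add(domain)
    return accounts

# Accounts appearing on more than one domain suggest a common operator.
linked = {acct: doms for acct, doms in group_by_account(pages).items() if len(doms) > 1}
print(linked)
```

In this toy data, the two fake-candidate domains surface together under account 55555555, while the unrelated site does not; in practice an investigator would fetch the live (or archived) HTML rather than hard-code it.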
Sipping a Crown Royal and Coke at a bar in downtown Austin, Mr. Mauldin bore little resemblance to the boasting troll he played on Reddit. He is slight, and has boyish features. He wore his shirt neatly tucked into jeans, and paused to consider questions before answering. When he did not want to answer, he quietly said, “I don’t know” or “I don’t remember” — even when asked about things it was hard to imagine he had forgotten, like what he told the Trump campaign about his websites.
[Photo caption: Mr. Mauldin works on President Trump’s re-election campaign, which kicked off this past month in Orlando, Fla. Credit: Erin Schaff/The New York Times]
Mr. Mauldin grew up in eastern Texas, and described his political views as “closest to libertarian.” He studied marketing at Texas A&M, and taught himself digital design skills, building on a childhood love of drawing.
He and his brother founded Vici after helping a family friend win a state representative race. Their big break came in June 2016, when the Trump campaign’s digital operation, short of manpower and scrambling, hired Vici.
Mr. Mauldin quickly impressed. His specialty was making the kind of viral videos that riffed on pop culture and were relentlessly pumped out on social media by the Trump campaign. One came after Hillary Clinton dropped a reference to the augmented-reality game Pokémon Go into a speech, urging voters to “Pokémon Go to the polls.” Mr. Mauldin responded with a video that featured Mrs. Clinton as a Pokémon creature players had to catch, providing the kind of tit for tat needed to feed a day of news stories.
In a testimonial on Vici’s website, Brad Parscale, Mr. Trump’s 2016 digital director and now his campaign manager, called Mr. Mauldin “an indispensable part of our digital operation” in the president’s first campaign.
People with ties to the re-election campaign, all of whom spoke on the condition of anonymity because of nondisclosure agreements, said that Mr. Mauldin was brought back on retainer for the 2020 race.
Mr. Mauldin would not discuss specifics of his role with the campaign, citing his own nondisclosure agreement. He was only slightly more talkative about his websites.
Pressed on whether he thought they were deceptive, Mr. Mauldin complained that people put too much emphasis on identity “instead of examining the facts themselves.” He brushed off a question about whether GIFs of Mr. Biden touching women, devoid of any context, represented facts.
The point, Mr. Mauldin said, was to help Democrats see their candidates for who they were — warts and all — and not try to pretend that they all agreed and were in lock step on every issue.
As he sees it now, “there’s a party line and you either toe it or you’re a traitor,” he said, adding that this applied to both Democrats and Republicans.
But weren’t his sites encouraging Democrats to look for traitors?
“I mean, they could do it themselves,” Mr. Mauldin said with a laugh. “But they’re not. That’s the problem.”