Renée DiResta is a social media researcher, and the author of Invisible Rulers: The People Who Turn Lies into Reality. She studies adversarial abuse online, ranging from state actors running influence operations, to spammers and scammers, to issues related to child safety. Renée talked with us about how disinformation such as anti-vaccine fear-mongering can spread so widely, why she anticipated the current U.S. administration’s weaponized propaganda campaigns, and what our own roles can be in countering untruths both on social media and in our personal lives.
Thinking Person’s Guide to Autism (TPGA): What kicked off your work in studying and countering mis- and disinformation?
Renée DiResta: I got interested in understanding how narratives spread online, and how people come to be deeply committed to propagating them, back in 2015, when I was a mom activist in California around the time of the Disneyland measles outbreak. I helped start Vaccinate California, and as we tried to build a pro-vaccine parent movement, I realized how far ahead the anti-vaccine movement was at leveraging social media. Public health institutions didn’t prioritize communicating on social media, and there was an asymmetry of passion among parents—most of us vaccinate our kids and never really think about it again. I did some network mapping with a data scientist friend, looking at anti-vaccine influencer accounts and message themes; there was nothing comparable on the pro-vaccine side.
I decided to write about this experience and these findings for Wired; it was one of the first things I wrote for the public. In a rather unexpected turn of events, that piece and some subsequent public talks about the experience were seen by some people I knew from my day job in tech in Silicon Valley who’d gone to work for the Obama Administration. That led to my being asked to participate in a similar analysis of ISIS networks on social media—there was a serious problem with terrorist propaganda at the time as well. While working on that, we became aware of Russia’s Internet Research Agency using social media to propagandize—but very subversively, with accounts that were pretending to be other things entirely: Russian trolls pretending to be Texas secessionists, that sort of thing. I came to work on a few of these investigations, and by late 2017 or so it became my job.

TPGA: Can you explain your concepts of “invisible rulers” and “bespoke realities”?
Renée DiResta: “Invisible rulers” is a reference to a phrase by Edward Bernays, actually. He was the nephew of Sigmund Freud, a propagandist during World War I, and the father of modern PR. He wrote a 1928 book, Propaganda, which argued that in modern democratic societies, public opinion is not spontaneously formed but is instead deliberately shaped by a small group of unseen influencers (advertisers, public relations professionals, and media elites), and that these figures are capable of manipulating social attitudes, desires, and behaviors to produce consent and guide mass behavior. “There are invisible rulers who control the destinies of millions…” he says, before describing the mechanics of persuasion. “Those who manipulate this unseen mechanism of society constitute an invisible government which is the true ruling power of our country.”
I wanted to write a book about modern propaganda and influence, after looking at it in so many different case studies in my own work over the last decade or so, and to connect the dots to the most influential figures of today, unique to the infrastructure of social media, who are often quite adept at selling both products and ideologies: influencers.
I think you can also make a case that platform algorithms are a sort of invisible ruler today—they are often unnoticed by users, and they subtly influence what we see and think, determining how information spreads, what’s prioritized, and what fades away. They shape our perceptions without explicit visibility or accountability.
“Bespoke realities” is a term I use to describe the highly personalized, choose-your-own-adventure informational worlds constructed for each of us by social media and recommendation algorithms. These customized realities reinforce personal preferences, behaviors, and biases, often leading to fundamentally different understandings of what’s true, important, or even real.
TPGA: How is it that groups like anti-vaxxers, which hold minority viewpoints, are able to make their opinions seem more dominant than they actually are?
Renée DiResta: Groups like anti-vaxxers amplify their viewpoints by strategically investing time and resources into building a visible presence on social media platforms. They are adept at leveraging the algorithms that curate our information feeds, and they collaboratively work to elevate influential figures within their networks and foster strong connections among supporters. The online movement is very passionate, and serves as an amplifier for the content—they consider themselves part of the mission, and are committed to evangelizing as well.
Together, these tactics can create a “majority illusion,” making their perspectives appear far more widespread and dominant than they actually are. People who simply vaccinate their kids do not often go on to evangelize about it unless there is something that incites them into speaking up—usually something like a preventable disease outbreak in their community that outrages them. Measles or whooping cough. Then they get angry, but they often don’t really know what to do with that energy, and there’s no obvious movement infrastructure for them to plug into.
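The “majority illusion” is a measurable network effect: when the people who hold a minority view occupy highly connected positions, most accounts see that view overrepresented among their own connections. As a rough illustration, here is a toy Python simulation; the network sizes and connection counts are invented for the sketch, not drawn from any real platform data.

```python
import random

# Toy simulation of the "majority illusion": a small group holding a
# minority view sits in highly connected network positions, so many
# other accounts see the view overrepresented among their neighbors.
# All numbers here are invented for illustration.

random.seed(42)

N = 1000        # total accounts
N_LOUD = 50     # 5% of accounts hold the minority view

edges = set()

# The minority accounts act as hubs, each linking to many others.
for hub in range(N_LOUD):
    for _ in range(100):
        edges.add((hub, random.randrange(N_LOUD, N)))

# Everyone else connects sparsely among themselves.
for node in range(N_LOUD, N):
    for _ in range(2):
        other = random.randrange(N_LOUD, N)
        if other != node:
            edges.add((node, other))

# Build an undirected adjacency structure.
neighbors = {i: set() for i in range(N)}
for a, b in edges:
    neighbors[a].add(b)
    neighbors[b].add(a)

def holds_view(i):
    return i < N_LOUD

# Count accounts for whom the minority view makes up at least half
# of what they see from their own connections.
fooled = sum(
    1 for i in range(N)
    if neighbors[i]
    and sum(holds_view(n) for n in neighbors[i]) >= len(neighbors[i]) / 2
)

print(f"Accounts actually holding the view: {N_LOUD / N:.0%}")
print(f"Accounts who see it as >=50% of their network: {fooled / N:.0%}")
```

With these made-up numbers, only 5% of the simulated accounts hold the view, yet for far more than 5% of accounts it appears to be at least half of what their own network believes; that gap is the illusion at work.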
TPGA: Can you talk about how influencers gain traction in areas outside their niche or expertise, and why they can feel obligated to continue in those veins?
Renée DiResta: It’s all about incentives. Influencers initially build credibility and grow a following within specific niches—maybe fitness, lifestyle, or parenting—but may find themselves pulled into broader or more controversial issues, because they’re incentivized to somehow tie their niche into major topics of the day. Algorithms curate a creator’s content into user feeds when it seems salient to what people want to see—and influencers need their content to get views in order to keep growing their audience and making money.
Sometimes, it’s also the follower base who asks an influencer to post about something—“why aren’t you using your platform to talk about XYZ?!” That may lead an influencer to get more extreme over time, or to shift their themes to fit what the audience demands, which is called audience capture. Media coverage talks about how algorithms influence us and how influencers influence us, but we the audience have pretty real power over influencers too in many cases. They get attention because we give it to them.
The influencer feels that pressure. If they don’t say the thing their audience wants to hear, well, another influencer very well may. Once they’ve gained traction outside their niche, the audience composition may even change. For example, they may start out posting about wellness, but then they weigh in on the COVID lockdowns, and all of a sudden they have a bunch of new followers who are very interested in libertarian politics, and now they feel pressured to maintain attention and engagement by continuing to speak on topics that new, very engaged libertarian audience wants to hear about. They are really incentivized to keep up their engagement so that their sponsorships continue, or their revenue trends continue. So they’re being driven by audience expectations or by algorithmic and financial incentives, rather than by actual expertise or even their own sincere beliefs in some cases.
TPGA: You assert that we no longer get to be bystanders if we are on social media; why is this?
Renée DiResta: Social media platforms have fundamentally shifted the nature of public discourse: every like, share, retweet, or comment contributes to amplifying or shaping narratives. Even silence—failing to push back—can inadvertently allow misleading narratives to flourish. Not everyone has to participate actively at all times, but it’s important to be aware that liking and sharing are a form of collective behavior, and to understand how participating in certain ways influences norms. And I think if you are someone who has deep expertise, particularly in a contested space, your voice can be really valuable. If there’s no counter-speech, no pushback against rumors or misleading claims, all the algorithms have available to surface is the nonsense.
TPGA: Obviously, your book came out before the current US administration took office. Did you think matters would be quite this awful, in terms of misinformation-mongering being sanctioned at and deployed from the highest government levels?
Renée DiResta: Yes. There is an entire chapter in the book chronicling the Big Lie and the work my team and I did exposing it. I spent a huge chunk of 2020 studying the propaganda machine that this president built, which tried to claim that the 2020 election was stolen. (It was not.) After the House flipped in 2022, Jim Jordan and his Select Subcommittee on the Weaponization of the Federal Government subpoenaed me and my colleagues in retaliation. The House Homeland Security Committee followed with a second subpoena. Nutjobs in fringe and right-wing media spun crazy lies that we had somehow censored 22 million tweets about the election (a truly insane theory, presented without any evidence) and accused us of being deep state agents running a “Censorship Industrial Complex.”
Elon Musk boosted this bullshit, ensuring that millions saw it and inciting a lot of harassment. Even after we turned over all of our emails and documents to the Congressional committees, and they showed nothing of the sort, Jordan et al. continued to lie about this—you can never exonerate yourself if the accusation is convenient for the politically powerful. Stephen Miller and America First Legal simultaneously and vexatiously sued us in May of 2023, in a case that is still ongoing—the intent was to impose millions of dollars of costs in legal fees and to chill our speech. It’s like dealing with the House Un-American Activities Committee (HUAC), and it came for us two years ago.
So yes, this was what I expected, which is why I spent time in the book laying out what happened as a cautionary tale.
TPGA: Can you help our readers understand a few effective and non-effective strategies for countering misinformation in their own online lives?
Renée DiResta: Generally people are looking for ways to talk to friends and family in their online lives, so the most effective strategy is empathetic correction—respectfully addressing the underlying fears or concerns behind false beliefs. “Hey, I saw you shared this, and I imagine you’re concerned about XYZ…” Then you can move into sharing more accurate information. You can also proactively share good sources on your own accounts, or be part of amplifying trusted voices. There are more and more medical content creators, for example, who are out there debunking bad information while also putting out their own entertaining content—it’s not just boring institutional fact sheets anymore. Finally, participating in community notes programs, when platforms offer them, can help get good info out there.
On the flip side, certain approaches tend to be ineffective or counterproductive. Public ridicule or shaming triggers defensiveness rather than fostering openness to new information, and it adds to the overall online toxicity.
In October 2024, Renée DiResta joined the Georgetown University McCourt School of Public Policy as an Associate Research Professor. Prior to that, she was the Technical Research Manager at the Stanford Internet Observatory.