#btconf Düsseldorf, Germany 02 - 03 May 2022

Sacha Judd

Sacha is the CEO of the Hoku Group, a family office in Aotearoa New Zealand combining private investments, early-stage tech ventures and a non-profit foundation. She founded Refactor (a series of events around diversity in technology), and Flounders’ Club (a network for early-stage company founders). She writes and speaks on fandom and online communities, diversity & inclusion in the tech sector, and how fans will transform the world.


Everything Breaks at Scale

The internet's power to bring people together is a gift. But it can be a curse too: from celebrity gossip to QAnon to COVID denialism, conspiracy thinking has become a hallmark of online public life. It’s up to us to dig out the toxicity built into its very foundations, and put the focus back where it belongs: on people and community.

Sacha Judd starts from her 2016 beyond tellerrand talk on boy bands and diversity in tech and takes us even further down the rabbit hole of conspiracy thinking to help us understand its appeal. And by looking both back to the past and ahead to the future, she gives today's tech designers, developers and leaders the tools to find our way back into the light.

Transcription

[Music]

Sacha Judd: Thank you so much for that incredibly kind introduction. I wish you would set expectations a little lower.

The last time I introduced myself at Beyond Tellerrand, I said I like really strange rabbit holes on the Internet. The stranger the better.

I have always loved a mystery. I grew up reading Trixie Belden and Encyclopedia Brown. Dinosaurs held my attention because archaeologists pieced their lives together from evidence they dug out of the earth like a giant puzzle.

I devoured books about Bigfoot and the Loch Ness Monster and UFOs. There was nothing I loved more than something that was unexplained.

By the time I got to university, that fascination really hadn’t let up. I dug into the classics, reading books about the Kennedy assassination that put forward the sort of fringe theories that never withstand close scrutiny but are still so fascinating that you find yourself telling other people about them: theories about faked autopsy results and second coffins.

This was all before the Internet (because I’m extremely old) when you couldn’t Google anything. You’d tell people stories like this over drinks in a bar, and there was no way to prove or disprove anything in that moment. It was almost an oral tradition.

There was certainly no way for those ideas to spread exponentially. You had to go looking for this stuff, reading out of print, second-hand books and putting things on reserve in the library.

Eventually, I think I grew out of most of it. I started watching the X-Files on television, and I stopped wondering whether people who thought they were alien abductees were telling the truth.

Two things happened in the 2000s that really changed the way that I thought about this. The first was discovering 9/11 truthers, and the second was discovering Lord of the Rings tin hats. At the time, I didn’t realize how connected those things would turn out to be.

I don’t recall now how I first stumbled across the infamous Loose Change documentary that kickstarted conspiracy theories about the horrific tragedy on 9/11. It was before YouTube, so I can only assume I came across it through one of the community websites that I frequented a lot at the time, like MetaFilter.

But I do remember, as I watched it, just being absolutely transfixed, and it caused me to seek out weirder and weirder theories that people believed about what might have happened that day.

Now, to be absolutely clear, that wasn’t because I believed those theories, but because the more outlandish the claim, the more I desperately wanted to understand how anyone could. How you could watch all of the video footage we have of that day (zoomed in and slowed down) and reach the conclusion that people weren’t fighting for their lives. That instead, they were puppets or holograms or government actors. And having chosen to believe something so divorced from reality, how you could then choose to try and tell other people about it and try and convince them of the same.

Meanwhile, back in New Zealand, the Lord of the Rings movie trilogy was being filmed while I was working overseas. And so, it became a bit of a personal rite of passage every Christmas when a new film was released, to go and sit and watch it in the cinema and just feel deeply homesick for my country.

I didn’t really participate in the Lord of the Rings fandom, although it was huge and active at the time, until I started to pay attention to their RPF community. RPF stands for real person fandom, or real person fanfiction. As the name suggests, it’s fandom or fanfiction stories about the actors in a TV show or movie rather than the characters they play, about sports people or musicians - real people.

Rather than writing stories about Mulder and Scully and their adventures fighting monsters, an RPF fan might write stories about David Duchovny and Gillian Anderson and their adventures in their trailer when shooting wrapped for the day.

When I first started hanging out in fandom spaces, there was a pretty serious taboo about RPF. People saw it as disrespectful to the performers, breaking the fourth wall, and so on. But as more and more behind-the-scenes content began to be deliberately marketed to fans, and social media led to actors’ and musicians’ day-to-day lives being consumed as content in and of itself, those taboos started to break down.

The Lord of the Rings trilogy was absolutely ripe for RPF because the cast had spent so much time together far away from home, and the behind-the-scenes content made it look like they had this incredible summer camp time of it goofing off and making movies together.

Fans loved to see it. Some fans liked to imagine that they might have been having an extra good time together.

And a small group of fans took that a step further. It wasn’t just that they thought the actors like Dominic Monaghan and Elijah Wood might have made a cute couple. It was that they were convinced that they were in fact a couple, a secret couple, doomed to keep their great love from the public at the insistence of the mysterious powers that be.

Now, if you’ve watched my first Beyond Tellerrand talk, you’ll be starting to sense a theme. You’ll recall last time I talked about the fans of former boy banders Harry Styles and Louis Tomlinson and the tin hats who believe in their secret love.

But where does this term come from? Well, these fans in the Lord of the Rings fandom weren’t content to just appreciate some cute boys who might have had a bromance offscreen. They constructed elaborate theories and timelines. They attributed evil motives to corporate overlords. And they looked for signs and symbols that the actors saw them and appreciated their support.

Plagued by their extreme behavior, the rest of the Lord of the Rings fandom gave these people a name: tin hats, short for the tinfoil hats we’d already come to picture on people scared of aliens and government conspiracies, worn to protect themselves against all manner of unprovable outside threats.

From that point on, conspiracy theorists in every fandom have been referred to in the same way. Unfortunately, tin hats pop up in almost every fandom.

Convinced that Benedict Cumberbatch’s marriage is a sham and his children are fake, or that the stars of Outlander are secretly living together, or that Canadian Olympic ice dancers have great chemistry on the ice because of their secret love off of it.

Looking back now, I thought that tin hats were amusing. It was part of why using Harry and Louis as a hook for that first talk was entertaining. The confusing part for people wasn’t that they might have made a cute couple. It was, “Wait. What?! People think this is real?” They do.

They still do seven years after One Direction broke up, six years after Harry and Louis were last seen in the same place, after Louis had a child and got back together with his long-term girlfriend. They still believe it. They gather online every day, and they post theories, and they speculate about evidence, and they rail against the mysterious powers and ironclad contracts that must be holding these two wealthy white men back from declaring their secret love.

What I’ve realized is that tin hats are conspiracy theorists. They’re no different than the 9/11 truthers. If there’s anything that the last two years have taught us, it’s that conspiracy thinking isn’t amusing and it isn’t harmless.

I’ve spent some time thinking about the ways in which people are drawn to these conspiracy theories and what causes them to stay. I’ve started to think about all of us, the people in this room who build for the Internet, and the role we play, and how we might think about changing direction.

Conspiracy thinking isn’t new. You can read rafts of work about the ways in which people have believed things to be other than what they’ve seemed, going all the way back to speculation about what Nero was doing while Rome burned and whether he set the fire himself.

Part of the problem with conspiracy thinking is that there is always (at its heart) at least a little kernel of truth. These people aren’t completely detached from reality. History has taught us over and over that, yes, politicians and leaders do lie. That powerful people do attempt to cover up wrongdoing. That some paparazzi shots do seem pretty staged.

Distrust in authority isn’t necessarily irrational. It’s too easy to dismiss these people as crazy. But when you look at the reasons why people come to believe ideas that are almost laughably false, the reasons are actually extremely normal.

Conspiracy thinking isn’t new, and we all know that what is new is the way that the Internet has empowered and amplified these theories, made the spread of misinformation commonplace, and provided platforms and homes for these people and their ideas. We know how people get exposed to conspiracy thinking. You’ve all read 1,000 articles about algorithms and amplification and how social media is rotting our brains.

I’m more interested today in a different question. Why do people stay in conspiracy thinking? I think there are three things at the heart of this, three deep-seated human needs that are being met by conspiracy theories: community, purpose, and dopamine.

The first one is probably the most obvious. As human beings, we all crave community, and never has that been more obvious than over the two years of the pandemic when we’ve been separated from friends and loved ones and so dependent on the response of the people around us.

As I’ve spent time thinking about what draws people into embracing celebrity conspiracy theories, what strikes me over and over is what these people are really enjoying is being part of a community. Sure, they come online each day to share their tidbits and news, but also because the people they’re sharing them with have become friends, comrades in arms, people that they believe care about them.

There’s a wealth of literature that’s been written about the ways (over the last few centuries) our real-world community structures have broken down as we moved from villages to cities, as we moved out of traditional religious structures, parishes, as we moved further away from extended family.

It’s been 20 years since Putnam wrote the book Bowling Alone in which he argued we’ve become increasingly disconnected from our friends, our neighbors, our democratic structures. That changes to work-life, suburban life, families, television, computers have all led to a decline in our civic participation.

Membership in voluntary associations, things like bowling leagues, he argued, was an indicator of the kind of trust and reciprocity that we need for healthy communities. And a decline in those memberships was a bad sign.

In the two decades since he wrote that book, his theories have been debated really widely, like, are memberships really declining or have our tastes in social activities just changed? Are we signing fewer petitions because we care less about politics or because we’re participating in politics in other ways?

Regardless, it remains true that we crave community, and when we can’t find it around us because of economic circumstance or isolation or a global pandemic, we look for it online. And increasing numbers of people are finding it online surrounded by conspiracy theorists.

As these people convince themselves of ideas that are more and more outlandish and socially unacceptable, they further weaken the ties with the people around them in their offline lives.

Writing about flat earthers recently in The Atlantic, Kelly Weill said this loss foregrounds almost every conversation at flat earth meetups, so common that they’ve started to use the language of persecuted minorities. Announcing one’s belief in a flat earth is referred to as “coming out,” a term more commonly used by the LGBT community. Separated from their loved ones, these people wind up trapped inside a theory with the only other people who will believe them. Eventually, your conspiracy community becomes your only community.

I think the second thing that causes people to stay in these environments is a sense of purpose, unified in a common cause, standing for something you think is really important or against something that you think is reprehensible. You’re not sitting alone in front of your computer looking at photos of celebrities. You’re fighting for a young gay couple in love closeted by their evil managers, and you’re making signs and taking them to their concerts to let them know that they have your support.

You know, there’s a kind of rationality in that, too. If, as conspiracists increasingly believe, children were being trafficked by satanists, we’d all have a duty to act. If the vaccines were really killing millions of people, we’d feel compelled to do something about that too. For many of these people, the present moment is maybe the first time that they’ve experienced a lack of control, whether through economic inequality, government pandemic responses, mandates, and lockdowns.

Anne Helen Peterson writing about this recently said for white straight people with American passports, this is the first time that some of them are experiencing a kind of social precarity that has long been the norm for people without those privileges. Put differently, it’s the first time that they feel like they’re suffering without cause, for reasons outside their control, in a way that feels unfair.

Conspiracy theories seem to flourish in times when we feel especially helpless to direct world events, reaching for a sense of purpose when we feel powerless or overwhelmed. Conspiracy theorists believe they’re on the side of good, and that doesn’t make what they’re doing okay, and it doesn’t mean we should tolerate it. But we do have to do the work to understand it and to understand how we got here.

I think the third thing -- I’m going too fast -- the third thing that keeps us in these environments is something we’re all very familiar with: dopamine. We all know that our social infrastructure online is constructed around our need for this brain chemistry reward. Facebook’s founders have admitted that they built the app around this kind of engagement, and every other platform has followed suit to keep us distracted and hooked.

Look, I get it. I write original fiction, and I write fanfiction. I tell you what. An unpublished original novel is nowhere near as satisfying as the daily emails I get from AO3 telling me that someone has liked and reviewed my fan works.

And so, if you’re part of a community that has that dopamine hit baked in, you can see why it causes people to stay. For people in conspiracy circles, very quickly being part of those groups can lead to having a sort of insider influence. When somebody likes the Facebook post that you wrote or the YouTube video that you filmed, it gives you a feedback loop that you want more of. It’s human nature to be delighted that our in-group likes what we have to say.

I think there’s also a dopamine hit that comes from trying to solve a mystery and looking for answers and signs and symbols. Again, I get it. I played one of the first online alternate reality games.

In 2001, I was working in London at a job that paid me quite a lot to do not very much, and so you can imagine how quickly I got swept up in a mystery that started to unfold when the credits for the trailer for the new Spielberg movie A.I. had a reference to a sentient machine therapist, one Jeanine Salla.

Now, why a movie about robots would need a robot therapist was unclear. When you went searching for her name online, it led you down a rabbit hole of websites set in the future where a man named Evan Chan had been murdered under mysterious circumstances.

What unfolded next was one of the most complex and successful alternate reality games ever staged, eventually known as The Beast. Totally immersive. Dozens of websites. Clues that led to phone numbers. Sometimes, the game called you. Sometimes, it sent you faxes. At one point, live actors were involved.

But here’s the crux of why I loved it. You couldn’t play it on your own. The puzzles were too complex, too disparate. Number puzzles, language puzzles, puzzles that needed to be brute-forced. You needed the benefit of the community, including, famously, one puzzle that turned out to have been written in lute music, so you can imagine how excited the sole lute player involved in the game was that day.

We love to solve a mystery, and so you can see why that same mystery-solving breadcrumb tracking mindset is at the heart of so much online conspiracy thinking.

Adrian Hon, a creator of alternate reality games, wrote recently that QAnon is not an ARG. It’s a dangerous conspiracy theory, but it pushes many of the same buttons, whether by coincidence or intention. The call to do your research leads a curious onlooker to a cornucopia of brain-tingling information.

These lines keep blurring. Crazy Days & Nights is a celebrity gossip site that posts blind items. A blind item is an item of gossip in which the name and identifying characteristics have been obscured.

For example: this British-born Z-list actor left his fiancée for an up-and-coming starlet on set. And readers of the site think of this like a game. In the comments, they post their guesses about who they think the Blind is referring to.

But lately, the site has been plagued by a range of QAnon-adjacent Blinds, speculation about which celebrities are trafficking children and so on. And again, to quote Peterson writing about this, for these people, celebrity gossip is a puzzle they can solve. There’s a pleasure in that analytical puzzle-solving that tracks really easily from save the children in Q to something like Crazy Days & Nights.

It’s the same thing that keeps people hooked on whether the color of Louis Tomlinson’s T-shirt tells you anything about his relationship with Harry Styles. And it’s the same thing that’s led at least one Q adherent away from the conspiracy and back to her family through the daily word puzzle game Wordle.

I love this story. Her mother went from sending her a steady stream of conspiracist content to sharing her Wordle tactics instead, so share the game with your conspiracist loved ones today.

We know how people get exposed to conspiracy thinking and we understand a little bit more about why they stay. But how does it all go so horribly wrong and what’s our role in all of this?

Well, my thesis is that all of these things are fine at a manageable size, but everything - everything breaks at scale.

Let’s go back through each of them in turn.

Dopamine breaks at scale. An allegedly harmless reward structure that’s designed to make us feel good about being online has given birth to whole industries of influencers and clout chasers, a whole category of people performing for the Internet.

Even after a two-year experiment with hiding likes from their users to try and reduce that dopamine hit, Instagram found that their users hated the change and rolled it back. So now publicly viewable likes are the default.

The problem is that this little reward nudge morphs and grows until influence is everything. In conspiracy circles, influence doesn’t come from rehashing the same old content and ideas. As human beings, we crave novelty. So, if you want more of that dopamine hit, you’re forced to come up with new content, increasingly outlandish claims, and speculation, which is how we start to get ideas spreading like this.

Our sense of purpose breaks at scale. People are drawn together at the start in a unified purpose, but they come with widely differing sets of beliefs and goals, and very quickly that shared sense of purpose starts to break down. Anything that grows quickly -- and we’ll talk more about that speed in a moment -- doesn’t have time to develop a cohesive set of norms and values. Instead, that shared sense of purpose is really quickly lost and everyone is there for their own purpose.

The problem with conspiracy thinking is that there aren’t any boundaries, no separation between one idea and the next. Join a Telegram group because you’re concerned about vaccinating your child, and very quickly you’re going to be awash in ideas about 5G and voter fraud and human trafficking and sovereign citizens. You might start out with a clear sense of purpose, but you certainly won’t stay that way.

The third one, I think, is probably the most interesting. Community breaks at scale. We crave community, but we can’t seem to sustain it over a certain size.

We often talk about Dunbar’s number, the cognitive limit to the number of stable social relationships we can have (about 150). I don’t know if that’s the right number, but I do know that we don’t seem to be capable of building really enormous organizations that are great communities without very hard work.

I think, in part, that’s because of how we’re building them. The way we build companies that in turn create these communities has changed really dramatically over the last 30 years.

The Silicon Valley promise was that we could abandon traditional corporate structures and flatten our hierarchies and change our job titles in favor of unlimited leave and foosball tables. The promise of community, something more than an office job, something that felt like your family. But the problem is, these companies are growing at such extraordinary rates to truly incredible sizes that they seem almost fully unsustainable.

Lately, I’ve been seeing a lot of nostalgia for the early Web, and the sentiment seems to be, “Wouldn’t it be cool if we could go back to when everybody had a blog and we all just shared photos on Flickr and had our diaries on LiveJournal?”

Audience member: Yes!

Sacha: [Laughter] Yeah. That sentiment is playing out in a really interesting way as we see the rise of advocates for what’s loosely being described as Web3. To hear a Web3 advocate tell it, Web2, as they call it, was all about extractive capitalism. Everything that’s wrong with walled gardens, with Facebook and Twitter, with not owning your own content, and not being paid for your creative endeavors.

But for those of us who were around in the Web 2.0 era, that’s a really depressing way to characterize it. It’s certainly not where the idealism of Web 2.0 started, in any event, which was all about interoperability and user-owned data and open standards and protocols. All the same things, ironically, that Web3 zealots are arguing for.

But increasingly, I think what we’re seeing is not just a nostalgia for the tools and the platforms and the idealism of the early Web. What we’re seeing is nostalgia for a smaller scale, a longing for a time when there were just fewer people online.

And so, the way we’re meeting these three human needs (community, purpose, dopamine), I think all of those ways are fine when they’re at a manageable, human size. But all of them collapse at scale.

And there’s a fourth piece of this puzzle that falls apart at scale, and that’s the myth of neutrality. Despite the fact that some of them now have billions of users, companies across the tech sector are still clinging to this idea that the technology they build can be apolitical.

Some organizations like Coinbase and Basecamp are going a step further to claim that the companies themselves can be apolitical. That it’s possible to have a mission-focused workplace. As if the mission can exist separate from the people carrying it out. It can’t.

New players like Clubhouse and Substack rise to prominence on a wave of investor funding and an absolute absence of moderation or user safety. “We’re platforms, not publishers,” everyone continues to say.

Now we have the Metaverse. Facebook wants us all to move into its budget Second Life knockoff, and hang out and have meetings and whatever. Again, no mention of moderation.

It’s been 30 years since I last logged into LambdaMOO, 30 years since Julian Dibbell wrote his seminal article, “A Rape in Cyberspace,” about what happened in that metaverse. And in that time, what have we learned? Because we’ve changed absolutely nothing.

Hany Farid, a Dartmouth College computer science professor, said in an interview, “You know if a biologist said, ‘Here’s a really cool virus. Let’s see what happens when the public gets their hands on it,’ that wouldn’t be acceptable, and yet that’s what Silicon Valley does all the time.

“It’s indicative of a very immature industry. We have to understand the harm and slow down in how we deploy technology like this.”

But instead, our approach seems to be that literally everything is a good idea. So, it’s only when the real-world outcomes are so extreme -- a violent insurrection threatening one of the world’s largest democracies, for example -- that it finally becomes unpalatable for private companies to be associated with conspiracy theorists, which is why you see Trump finally losing his social platforms at the last possible minute, why Parler loses its hosting.

And we’re seeing the same thing unfold now in relation to the war in Ukraine. Faced with cause to take action, companies like Facebook, TikTok, DuckDuckGo finally admit that, yes, they can come down off the fence and do something. That, no, they’re not neutral.

Look. No, AWS shouldn’t be a lone deciding voice in all of this, but we’ve let it be one because AWS itself is too big. We’re stuck with what Katie Notopoulos has dubbed “moderation by capitalism.” Yes, there should be a market of alternatives. Yes, regulation should work the way it’s intended.

No one wants Bezos or Zuckerberg to have all this power. But we’ve let them amass it, and so we need to be the ones to take it away. We need to get away from the Valley ethos that says it’s possible to build in the absence of politics.

Fred Turner said back in 2017, you know, when you take away bureaucracy and hierarchy and politics, you’re taking away the ability to negotiate the distribution of resources on explicit terms. What you’re left with is charisma, with cool, with shared but unspoken perceptions of power. You’re left with the cultural forces that guide our behavior in the absence of politics.

We’re seeing the impact of that everywhere, the unspoken power imbalances, the voices that are amplified and the ones that are silenced, the critiques that are listened to, and the ones that are ignored. Silicon Valley removed the hierarchies and changed the job titles and flattened the company structures and put the focus on the technology, and it hasn’t worked.

Where does this obsession with scale come from anyway? Well, in some ways, that’s the easiest question to answer. It comes from the money, the Silicon Valley funding model, venture capitalist-driven that powers these companies.

For the better part of 30 years, we’ve been sold the idea that the kind of hypergrowth that turns out unicorns is the only metric of success for a tech company. But we don’t often stop to think about why that is, and the reason has entirely to do with the way these companies are funded.

The venture capital funding model relies on you burning an outrageous amount of capital to achieve hockey stick growth in a fixed timeframe in order to achieve a 10x return for your investors so they can in turn pay out their investors. It’s absolutely artificial, and the investor class that funnels billions of dollars into funding these companies isn’t interested in addressing any of the things I’m talking about.

Venture capitalists are, by and large, complicit and silent because controversy doesn’t hit their bottom line. If anything, it improves it. Silicon Valley hypergrowth is a lie.

I keep coming back to this piece Lane Becker wrote six years ago called “All human systems are trash fires,” in which he says every human system -- your company, your organization, your church, your campaign, your band, your political movement, your dinner party, your revolution -- at some point, you’re going to look up, notice everything around you has been torched, and you’re going to say to yourself, “Holy shit. This is an enormous fucking trash fire.”

And he says, you know, realizing that can be revelatory because then you start to ask better questions about your current trash fire, like, am I doing everything I can to contain this trash fire even though I know it will never go out? And, more importantly, am I surrounded by a team of firefighters or a team of arsonists?

I think I’m more optimistic than that. I think we can build systems that aren’t trash fires. But I am fascinated by that central question. Are we firefighters or are we arsonists?

What is it that we could do to change direction? Because it’s dehumanizing (at the moment) being online. And so, people are looking for alternatives to the noise and the crowd and the hate. They’re pulling back into smaller, private groups, social Discords and Slacks, group chats, Substack commenting threads.

Ask someone where their favorite place is on the Internet, and it’s going to be some obscure, private Facebook group about a podcast that they like.

People are returning to older platforms like Tumblr because it’s chronological. There’s no algorithm, no influencers, barely any current events.

But you know, there’s a danger in this, in all of us peeling back off into our own private worlds. Those worlds can lack diversity. We don’t come across ideas we might not otherwise encounter. They can be echo chambers.

Recently, in Aotearoa, we had some significant anti-government protests connected with our government’s pandemic response (like many countries around the world). Protesters occupied the grounds outside our parliament for several weeks. Yes, some of them wore tin hats. No, I don’t know if they were joking.

[Audience laughs]

Sacha: The protesters occupied the grounds for several weeks and they enjoyed very little in the way of public support. Our country’s eligible population is 95% vaccinated.

But more than that, there was an almost constant pressure on the government and the police to move these people on. As a nation, we refused to cede our public square.

And I kept thinking about what it would be like if we felt the same way about our online spaces. If we refused to give up on our public squares. If we refused to give up on the dream of the Internet. What could we do?

The first one is the most straightforward, and I know it’s anathema to the Internet utopians in the room. But the reality is, every time that humanity has invented new forms of media -- radio, television, telecommunications -- we’ve had to have regulation follow. It’s never been the case that we’ve said, “Sure. Share your message with hundreds of millions of people unimpeded.” We’ve always had standards and licensing and rules.

We need effective antitrust legislation to stop the concentration of monopoly power in the hands of a tiny few private companies that we’ve come to rely on for almost everything.

We need law enforcement agencies who understand the ways in which technology is being weaponized, coupled with rules around hate speech and online harassment.

As artificial intelligence and algorithms play an increasing role in our lives, we need careful, thought-out rules around their transparent and ethical deployment. Companies shouldn’t be able to refuse that transparency, like the egregious example where Facebook took down its most widely viewed page for breaking its own rules but then refused to tell anyone what the page was.

These are really complex areas of law reform and policy development, and they can’t be knee-jerk reactions to extreme events. We need to start that work now.

The next thing we need to do is work out how to teach defense against the dark arts. What does media literacy look like in the TikTok era? We try to tell people to be careful online and to check their sources and so on, but there’s something called the Illusory Truth Effect that supercharges propaganda on social media.

It holds that the more times you see a piece of information repeated, the more true it seems. It works on 85% of people and it works whether or not the information seems plausible and whether or not you know better.

The only thing that guards against the Illusory Truth Effect is to train yourself to check every fact the first time you see it. How many of us do that?

Couple that with the way that new platforms are essentially a firehose of content that you can’t curate, like TikTok’s For You page, and young people are being absolutely bombarded with misinformation.

It’s easy to dismiss this, and you see a lot of really lazy takes that amount to, “Well, nobody should be going to TikTok for their news.” The point is nobody is going to TikTok for their news. They’re already on TikTok, and this is what they’re seeing.

Recently, Ryan Broderick drew what he called The Reverse Idiot Funnel to describe how any theory from a conspiracy thinker now passes along an incredibly predictable path of sources to mainstream spread. And as he notes, the Reverse Idiot Funnel works equally well in reverse.

Now we’ve got both Twitter and TikTok attempting to label Russian state media. Recently, the UK government shared its SHARE checklist to try and encourage people to think more about the information that they’re passing on.

These things are a start, but I think we need to start thinking about how we teach people to defend themselves. It’s incredibly confusing if our advice is, “Question everything. Do your own research. Don’t assume anything is legitimate,” when that’s exactly the path that conspiracists head down.

I guess you’re not seeing that slide. That slide.

The next thing we need to do is kick the Nazis out of the bar. This is a reference to--

[Audience applauds]

Sacha: This is a reference to a viral Twitter thread that many of you will have seen, in which someone describes seeing a bartender kick out a patron for wearing Nazi memorabilia. As the bartender explains, it’s fine to have one polite guy in your bar, but soon he’s bringing his friends and, before you know it, you look up and your bar is a Nazi bar.

It’s been ten years since Anil Dash wrote a post entitled “If your website’s full of assholes, it’s your fault.” In it, he argued that by learning from things like zoning regulations, urban planning, and crowd control, we could come up with a set of principles to eliminate the worst behavior on the Internet.

In the ten years since he wrote that article, what have we learned?

Well, we’ve learned, first of all, that moderation works. A tiny number of hostile users are responsible for most of the bad behavior on the Internet, and it turns out that removing them does in fact work.

More than that, we used to think that there was something inherent in the anonymous online environment that caused people to behave badly. But recent studies into the Mismatch Hypothesis (the idea that people who are nice in person will suddenly turn nasty online behind a veil of anonymity) have found that’s simply not the case. In actual fact, people who engage in aggressive, status-seeking behavior online are like that in real life, and are choosing to be jerks online as part of a deliberate strategy.

Is real human moderation easy? No, but it turns out we can focus on a really small subset of the problem and have an outsized impact.

When Telegram was banned in Brazil recently, one of the changes it made to have the ban lifted was to have employees just focus on the top 100 channels, which they found were responsible for spreading 95% of the public posts in their country.

And similarly, we can choose to focus just on the content that’s going viral at any point in time, throttling it to slow down the spread.

We need strong, enforced community guidelines covering behavior both inside the community and elsewhere, because every community is going to attract bad actors, no matter how well-intentioned.

Recently, both Twitch and Discord have moved to change their terms of service to cover off-platform behavior. Discord says it’s going to take into account things like real-world association with hate groups. Twitch similarly says, “We recognize that toxicity and abuse can spread to our platform from other places in a way that’s detrimental to our users.”

And we’ve learned that dismantling anonymity is actually not as important as creating an environment in which people are invested in their personas. A recent study looked at three phases of commenting on the Huffington Post: disposable anonymity; a move to durable pseudonyms, where you could be anonymous to other users but not to the platform, meaning a ban for bad behavior was more likely to stick; and then a shift to real names through Facebook authentication.

What the researchers found was a great improvement in the quality of commenting from disposable anonymity to durable pseudonyms. But instead of getting better with real names, it got worse. It turns out what’s important to us is not whether or not we’re anonymous, but whether we have an online persona that we’re invested in and whether we’re accountable for its behavior.

What would it look like to try and build a great neighborhood instead of trying to build a city? The saying goes, “Think globally but act locally.”

I think people are returning to Tumblr, a broken hell site that I love with my whole heart even though it barely works, because it’s the antithesis of so many of the things we’re talking about. There’s no algorithm. Sure, you can like a post, but it does nothing. The anonymity is baked in. Good luck finding a post again once you’ve seen it once.

Tumblr has just introduced sponsored posts, a kind of paid advertising. But in typical Tumblr fashion, there’s no way to target them and no guaranteed impressions, just the ability to pay ten American dollars to put a photo of your lizard on some random people’s dashboards.

As a venture capital investment, it’s a disaster. But as a place to hang out, it can be a delight. What if we had more neighborhoods like that?

What about the hot, new app BeReal, which gives you two minutes to post a photo of wherever you are and whatever you’re doing when the notification comes through on your phone? No time to stage anything for an audience.

When you download the app, you get a warning that says, “BeReal will not make you famous. If you want to be an influencer, stay on TikTok and Instagram.” What if we had more apps like that?

What if we funded and listened to the people who deeply and genuinely understand the problems we’re facing? Tracy Chou founded Block Party (after suffering years of online harassment), a set of tools that could and should have been baked into Twitter from the very beginning.

We shouldn’t need third-party apps to do this stuff. We should be building these things into the very foundations.

What would it look like to build slowly instead of at a breakneck pace?

A few years ago, my dear friend Tash Lampard spoke at Webstock about an onsen in Japan that has been operating for over 1,300 years, the oldest company in the world; 52 generations of the same family.

She reflected on our venture capital-funded focus on exit strategies, and she wondered if the owners of the onsen had ever felt the same way. She said, “I imagine they focus not on an exit strategy but on an exist strategy, a strategy built on sticking around. Not for a buyout but for a handing down, a passing along.”

As Tash said, we can choose to remain small. We can choose to devote ourselves to something and to the people that we serve. And we can choose to do our small things in small ways and, over time, those build on themselves.

What would it look like to take a digital walk with friends, to share the things we love online, not for likes or influence or clout or clicks? Sharing your favorite things is a love language. Sitting around in front of the television, showing each other your favorite clips on YouTube. Our social experience should feel like that.

Sometimes the best strategy might be not to build anything at all. Sometimes the greatest contribution that we might make might be choosing not to do something.

One of the most striking findings coming out of the pandemic was that the countries that fared best were the ones with the highest levels of trust, in government and in one another. The Lancet recently undertook a survey of 177 countries and found exactly that.

They tested everything for predictive power: GDP, population density, altitude, age, obesity, smoking, air pollution -- the lot. What they found was that moving every country to the 75th percentile of trust in government (which is where Denmark sits) would have eliminated 13% of global infections, and moving every country to the 75th percentile of trust in fellow citizens (where South Korea sits) would have eliminated 40% of global infections.

That trust and reciprocity brings us full circle to what Putnam was talking about that we lost when we started bowling alone. There’s no magic wand that will make people trust one another, work together, be inclusive of newcomers.

But I keep coming back to this idea that none of us is part of just one community. We’re all part of rock climbing groups, improv troupes, choirs, writing workshops, gardening groups, and dog parks.

I firmly believe that the networks and spaces we’re part of that are happy and healthy and inclusive have to start showing up at places that are broken and toxic so that people won’t want to be part of them anymore. I believe that the good neighborhoods that we build online can influence the bad.

Ayesha Siddiqi wrote recently about the things we gain from social media, saying, “I think of the people who made it possible for sexual violence to have social consequence. Students using Discord to organize school walkouts. People getting help from therapists and coaches on TikTok and Instagram. Some of the best culture writers whose voices we only got through Twitter and Tumblr.”

I keep thinking about how we meet these needs of knowledge production and expression and connection and entertainment. How do we rise to that challenge and choose to build not for money or for power or for influence or for scale, but in the public interest?

The thing I love most about the Internet is its ability to bring people together. Vibrant communities have engaged people, connecting around the things that they love, and I’m passionate about fans and fandom because, at its best, it represents a way for people to find companionship and support and share their creativity and their work.

And the tools and the platforms and the organizations that we build make that possible, so it’s up to us to save it: to dig out the toxicity built into the foundations, to put the focus back on people and away from the engineering solutions, to stop leading people down rabbit holes, to stop leading them anywhere at all, really, and to find a way to celebrate being in community with one another all over again.

Thank you.

[Audience applauds and cheers]
