What was Twitter?

(Flickr: Buzz)

The science fiction novel Ender’s Game is best remembered for its primary plot, about a genius child who leads Earth’s forces to genocidal victory against aliens; but it also has a secondary plot line that seems, in retrospect, terrifyingly prescient. While the protagonist Ender is learning to become the greatest space-general in history, his near-equally-gifted pre-teen brother and sister, left behind on Earth, take up politics. Peter and Valentine set up pseudonyms on a global online message board and spar theatrically, building competing followings and eventually real-world political influence. By the novel’s end Peter has leveraged his online clout into the leadership of a worldwide government.

I read Ender’s Game in the mid-1990s, when it was truly science fictional to imagine the whole world connected in a single messaging system, much less using devices portable enough to carry in a backpack. By 2004, my final year of undergrad, I had acquired a bulky Dell laptop which was, most excitingly, capable of using the wifi network that had just been installed in my campus apartment complex — and I’d already gone from a hand-coded HTML personal website to a series of blogs hosted on the most obvious choice, Blogger.com. Several of those blogs were social affairs, shared with friends, but their connection to people elsewhere on the Internet was entirely mediated by individual “<a href=” hyperlinks. Midway through graduate school, I accepted responsibility for building a website for a conference to be hosted by my home department, and decided to try embedding a feed from a new messaging platform I’d heard about: Twitter.

Counting the profile I set up for myself, that conference Twitter feed attracted all of 93 subscribers, not even 10% of in-person attendance — but that was only the beginning. Within a couple years, the same conference was planning and promoting its own hashtag in advance, and Nature was running how-to guides on the use of the platform to maximize networking at scholarly events of all sorts. I live-tweeted conferences, eventually used Twitter to recruit participants for a side project that’s now one of my most-cited papers, found my second postdoctoral job on Twitter, somehow got my account verified, tweeted through two years of frankly spirit-breaking hunting for a faculty position, even at one point did a presentation in a Twitter seminar, and managed to keep the same account fully public and land that faculty job and make it all the way to tenure. All this in a sort of giant unstructured chat room that included not just my actual colleagues in evolutionary biology but journalists and cultural critics I was a bit agog to interact with and weird pseudo-human personas representing every commercial brand on the planet and politicians being occasionally shockingly human and, for an increasingly horrific period starting in roughly late 2015 (but really going back farther), a hateful, unhinged quasi-billionaire who (we have all agreed this is what happened and it still seems incredible to write it) became President of the United States not for any real qualification he had to do the job but because of the power of his following on Twitter.

And now, a few increasingly ridiculous years after it seemed like things could not possibly get more ridiculous, the richest man in the world has purchased the giant unstructured chat room and is intent on remaking it for his own dubiously helpful purposes.


This is where it would be nice to say that, back when I read Ender’s Game, I recognized that any world in which two unusually bright children with chat room pseudonyms could end up ruling the planet would probably be pretty terrible.

I did not.

That’s partly because I was myself a child, and partly because the whole thing seemed very far removed from plausible reality to begin with. (Remember, this was a book mostly about child soldiers doing space-genocide.) Similarly, it’s a bit mind-stretching to think about how Twitter became what it was by the time Elon Musk completed his purchase late last week.

I knew Twitter first as Science Twitter — the interacting profiles of people in the Science Blogosphere, mostly Internet-happy grad students and faculty members but also geeky non-academics who had set their sights on science journalism and outreach, and set up blogs to write about their own research and the papers they were reading. Before Twitter, there were organized attempts to cross-link our blogs, like the “Carnival of Evolution”: people would submit posts to a rotating host, who’d build the links into an organized list, maybe with a theme or clever narrative to nominally tie them together. There was an attempt to automate the aggregation of blog posts about scientific articles, linked to the articles’ DOIs; there were group blogs and blog networks organized on single sites, like ScienceBlogs.com (which has its own, er, interesting history).

As science bloggers set up on Twitter, though, interactions that had played out in point-counterpoint blog posts and their comments sections migrated more or less wholesale onto Twitter, mostly because Twitter was where the audience was. In a blog post’s comments, you had a conversation with the handful of folks who read the post and then kept reading below it; on Twitter, you got to tell all your followers what you thought about the wrong and possibly dangerous thing That Guy had just posted. Somewhere along the way we stopped responding to (say) the latest evolutionary psychology silliness by writing a blog post that dismantled it point by point and triumphantly tweeting the link; instead we quote-tweeted the author of the silliness, maybe with a long trailing thread of numbered arguments. This very clearly also happened in the politics blogosphere, and the history blogosphere, and other networks of more or less independent websites on which people posted writing out of love and/or obsession, and in the professional equivalents of those places, like the New York Times Op-Ed section.

This is really what Twitter’s appeal boiled down to: immediate attention. On Twitter you could demand, and often win, the attention of people with much higher non-Twitter profiles — as things progressed there got to be ways to limit this, but early on you could literally make an unwary celebrity’s phone chirp with a notification by @-tagging their handle. Even if you only had a couple hundred followers yourself, your tweets went to the timelines of more people than you could reasonably expect to read a whole blog post on a separate website — and that’s before you start factoring in retweets and the later-introduced algorithmic recommendations that started showing popular tweets to people who weren’t following the original tweeter.

I’ve only gone really properly viral once — by happening to be one of the first people to spot and tweet about some guerrilla Star Trek fanart in my neighborhood park — and my reward was spending most of a weekend with my phone constantly buzzing until I muted notifications from the post. I fully believe that Twitter users with more direct followers than mine — I have ridden my early-adopter priority effect over the years to about 8,400 — find their notifications barely tolerable. I did, however, find the profile that Twitter gave me — in the sense of social visibility, not the page with my contact information — useful in career-relevant ways. Twitter let me promote my research to people who might not have otherwise spotted it, and it got me onto other platforms I might not have otherwise reached: I’ve been quoted in The Atlantic mostly because Ed Yong spotted my tweet of amazement at the discovery of nitrogen-fixing maize; I landed two pieces in Slate because I saw an open call for pitches to a special Pride Month series.

All the attention was really possible because the site became so central to a certain kind of conversation — at least in the U.S.-centric, Anglophone world — among cultural critics and public intellectuals and journalists and politicians and activists. It’s not so much that the giant unstructured chat room was designed to sustain that usage as that all those people brought their conversations to it — the migration of the Science Blogosphere to Science Twitter being just one example. And while it’s been exciting and occasionally quite useful having all those conversations in that one online space, it’s also frequently been terrible, because there are no real boundaries between these conversations.

Even without the algorithm meddling in your feed, opening Twitter will often deluge you in people’s reactions to everything from cute pet videos to mass shootings to the latest memes to natural disasters. You might have joined Twitter to talk with other scientists about, I don’t know, newt social behavior, but you’ll rapidly end up faced with a choice to either (1) keep tweeting about your one little topic while (it seems) everyone else is organizing to help hurricane victims or (2) try to find a useful way to contribute to the Hurricane Discourse or (3) wait for things to die down so you can resume tweeting about your one little topic. Options (1) and (2) are tempting in different ways but offer very similar risks of making you look like a jerk (bad) or out of touch (worse). (There’s also Option 2.5, which is to try making newts relevant to Hurricane Discourse; this is almost always a bad idea.) Option (3) is safer, but still requires you to correctly judge when it’s socially safe to get back to the newts again. It’s exhausting!

And yet somehow very hard to leave.

In its 2022 incarnation, Twitter contains many people I deeply love and appreciate (some of my best friends are people I met on Twitter, and I am not even saying that in the ironic way) and conversations I want to follow for personal and professional reasons — but it’s also an endless wash of other concerns vying for my attention. I never even followed the Twitter President at any point in his rise or rule or eventual ouster, but I was always aware of his hateful outbursts by virtue of watching my timeline collectively react to them. Even in the post-Twitter President era, I’ll open Twitter over breakfast with the idea that it’s equivalent to reading the morning news, but it’s really more like having three different talk radio hosts simultaneously telling me nine different reasons to be worried and/or angry while I try to hear Sir David Attenborough rhapsodizing about the agility of flying squirrels. I can come away from five minutes on Twitter with a link to an interesting new research paper or chuckling at a stupid joke about Sondheim lyrics or newly convinced that the world is going to Hell and it’s my fault.

It’s worth noting that all of this is in the context of a comparatively charmed Twitter existence! With some careful choices of privacy settings and the benefit of a comparatively small following for a blue-check-haver — and, likely, the benefits of being a visibly white cis dude — I have never, in my time in the giant unstructured chat room, managed to attract the attention of the troll-mobs who make it even less usable for people they don’t like, mostly non-white, non-straight, non-cis-men. I’ve gotten good at reporting hair-raising things tweeted at my less fortunate friends and acquaintances, but even at the peak of Twitter’s content moderation (i.e., immediately Before Elon) it was a game of whack-a-mole with disposable anonymous accounts.

So anyway, yes, fully three decades wiser, I can say: Ender’s Game depicts a horrifying dystopia, and not just because of the space-genocide.


I started looking for Twitter alternatives back during the worst of the Twitter Presidency, and so when Elon Musk announced his intent to purchase the giant unstructured chat room, I had a profile already set up on the largest “instance” of Mastodon, a genuinely interesting attempt to port the Twitter experience to something that is definitely not Twitter. I responded to that first announcement of Musk’s plans by going more or less cold-turkey; then came back maybe too quickly when he seemed to realize he’d made a huge mistake; then, when the deal closed late last week, I repeated the procedure. It feels like this time it’s going to stick.

Part of that is because of what I saw when I opened up Mastodon again: people I recognized from Twitter, and not just fellow early adopters. Within a couple hours last Friday, I spotted not just a recognizable personal account, but a meme account, Star Trek Minus Context. That’s not an account created to read the news and gossip; that’s an account that wants an audience. Whoever it is who blesses the timeline with aptly chosen screenshots from my favorite space opera had decided the audience was on the move.

I spent much of my last week on Mastodon watching familiar faces find me, but the server was sufficiently bogged down with new traffic that I couldn’t post properly myself. This is (sort of) a feature of Mastodon, not a bug — it’s not a single unstructured chat room, but a protocol and software package that runs on independent servers, set up so that if I have a profile on server A, I can still subscribe to posts from profiles on servers B, C, D, and so on. Anyone can set up a server and subscribe to posts from other servers using the protocol, much like anyone can set up a website running WordPress. Individual servers set their own codes of conduct — which may be more or less strict than whatever moderation Twitter has in place, pre- or post-Elon. Servers each have their own “local timeline” of posts from local users, and you can also view a “federated timeline” of posts from servers using the protocol and participating in open-broadcast posting. The timelines are, generally, just chronological; no one is trying (yet) to come up with some way of auto-curating and/or manipulating the set of Mastodon posts a user sees (officially they’re called “toots”, but I refuse to adopt this), though in principle someone could do so. There’s no central moderator to this “Fediverse”, which means there’s no single gate to lock out trolls and bad actors. Users can mute and block each other, though, and even block whole servers — and servers can levy similar sanctions. So any given Mastodon server can maintain whatever community standards its members want and are willing and able to enforce.
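
If it helps to see that architecture concretely, here is a minimal Python sketch of the federation model described above: independent servers, cross-server follows, plain chronological local and federated timelines, and server-level blocks. It’s a toy model under those assumptions, not how Mastodon or ActivityPub actually work under the hood; the account names below are made up, and the server domains are just the two mentioned in this post.

```python
# A toy model of Mastodon-style federation: independent servers, cross-server
# follows, chronological timelines, and server-level blocks. Purely illustrative;
# real servers speak ActivityPub over HTTP and do far more than this.
from dataclasses import dataclass, field
from itertools import count

_clock = count()  # a shared tick, so timelines can stay strictly chronological

@dataclass
class Post:
    author: str  # full handle, e.g. "someone@example.social"
    text: str
    timestamp: int = field(default_factory=lambda: next(_clock))

class Server:
    def __init__(self, domain: str):
        self.domain = domain
        self.local_posts: list[Post] = []       # everything posted by local users
        self.remote_posts: list[Post] = []      # posts federated in from elsewhere
        self.follows: dict[str, set[str]] = {}  # local handle -> handles they follow
        self.blocked_domains: set[str] = set()  # server-level "defederation"

    def register(self, username: str) -> str:
        handle = f"{username}@{self.domain}"
        self.follows[handle] = set()
        return handle

    def follow(self, local_handle: str, remote_handle: str) -> None:
        self.follows[local_handle].add(remote_handle)

    def post(self, handle: str, text: str, network: dict[str, "Server"]) -> None:
        """Store a post locally, then push copies out to the other servers."""
        p = Post(author=handle, text=text)
        self.local_posts.append(p)
        for other in network.values():
            if other is not self:
                other.receive(p)

    def receive(self, post: Post) -> None:
        """Accept a federated post unless its home server is blocked here."""
        origin = post.author.split("@", 1)[1]
        if origin not in self.blocked_domains:
            self.remote_posts.append(post)

    def local_timeline(self) -> list[Post]:
        """Posts from this server's own users, newest first."""
        return sorted(self.local_posts, key=lambda p: p.timestamp, reverse=True)

    def federated_timeline(self) -> list[Post]:
        """Local posts plus everything federated in, newest first; no curation."""
        return sorted(self.local_posts + self.remote_posts,
                      key=lambda p: p.timestamp, reverse=True)

    def home_timeline(self, handle: str) -> list[Post]:
        """Only the accounts this user follows, in plain reverse-chronological order."""
        followed = self.follows[handle]
        return [p for p in self.federated_timeline() if p.author in followed]

# A two-server "Fediverse": a profile on one server following an account on another.
network = {d: Server(d) for d in ("ecoevo.social", "mastodon.social")}
me = network["ecoevo.social"].register("newt_watcher")
memes = network["mastodon.social"].register("trek_minus_context")
network["ecoevo.social"].follow(me, memes)
network["mastodon.social"].post(memes, "A starship, minus context.", network)
print([p.text for p in network["ecoevo.social"].home_timeline(me)])
# -> ['A starship, minus context.']
```

The point of the sketch is the timeline methods: they are just filters and sorts over whatever the servers have exchanged, with nothing deciding for you what deserves attention.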

By the middle of this week, mastodon.social, the biggest server and the one where I had a profile, was catching up with the influx of new users; a couple days ago I discovered that Alexis Simon, a postdoc at UC Davis, had set up a new server at ecoevo.social, and a quick look through the early membership told me it’d make sense to join. I ported my profile over and I’ve spent the last day watching familiar faces show up in my notifications as new followers. The American Naturalist was there before I was (with, delightfully, the ligature in its screen name) and I found the journal Evolution not too long after. There’s a crowdfunding platform to pay for server capacity, and (as I think I’ve seen multiple users suggest) it doesn’t seem crazy to think ecoevo.social will end up supported by one or more of the scholarly societies for evolution and ecology, in much the same way the societies support journals and conferences. In some regards that could simplify a core problem of managing a Mastodon server, namely defining and enforcing a code of conduct for users — scholarly societies generally have explicit codes of conduct for conferences and other contexts, which would cover a lot of the necessary ground.


Beyond the several hundred biologists chattering on ecoevo.social, Mastodon feels, as Science Twitter veteran Sheril Kirschenbaum pointed out, more than a bit like the Old Days. This isn’t just because many of my early follows and followers have been people I know from the pre-Twitter Science Blogosphere. The network isn’t overwhelmed with waves of breaking news in the same way Twitter has been on a daily basis for most of a decade. There’s a sense of people finding each other again in a new context, like lifeboats regrouping on a quiet shore after the ship goes down, and circling back to our core identities and interests now that we don’t all feel like we have to react to what everyone is saying about that dad with the grandiose ideas about teaching his kid how to use a can opener.

I don’t know what happens after this. Twitter really has reshaped the public space in the U.S. and much of the rest of the world, creating an expectation that people who want to contribute to the world will have a public online presence that announces not just their identity and interests but how many people care enough about them to “follow” those interests. The public servers of the Mastodon network are hosting a lot more of those public online presences than they did two weeks ago, but still a tiny fraction of the people who were part of the giant unstructured chat room Before Elon. The Mastodon network is more fragmented — or, if you like, better structured — from the start, and while this may be helpful for my personal attention span, it might also reduce its utility for people who need the audience. We haven’t yet seen a protest movement organize on Mastodon (though it certainly could), and we haven’t yet seen the Mastodon network cope with a concerted, coordinated disinformation campaign (though we probably will sooner or later). There’s flexibility and resiliency in disaggregating, but there’s also greater difficulty in coordinating.

Maybe Elon’s interest in Twitter will flame out and somehow not take Twitter down with it, and maybe we’ll all come back to some new incarnation of the giant unstructured chat room. It’s hard to see how that happens, but a couple decades ago it was also hard to see how we’d get to something plausibly like the online public space depicted in Ender’s Game and other late-twentieth-century scifi in the first place. You don’t just need the tech to make a network of interactions like we had on Twitter — you need people to actually come and interact.

Right now the people — at least many of the ones who kept me on Twitter — are logging off, eyeing the exits, leaving the party. We’re also, in many cases, figuring out how to find each other on the outside. We may not have anything quite like Twitter ever again; but leaving the site behind doesn’t mean leaving the people who made it worth logging on.