Unknown, but not hidden

A Joshua tree in the desert, with low cliffs in the background, and a cloud-chased sky above. (jby, Flickr)

Word is that Twitter is selling out to Elon Musk, whose (speculated) plans for the platform are not especially encouraging. On the one hand, Twitter privately owned by a “free-speech absolutist” may not be appreciably less pleasant for a person like me than Twitter as a publicly traded company with some nominal interest in the experience of users besides Elon Musk. On the other hand, this is as good an excuse as any to take a step back and see if I can, finally, log off.

I’m not deleting my account — not yet — but I’m going to see if I can’t get back to something like my online behavior from the era before Twitter was my first social login of the day. Way back in the Obama administration, I posted to this blog (actually, its incarnation on, yikes, Blogger) multiple times a week. I didn’t break my thoughts up into pithy little snippets, or plan longer discussions in strings of 280-character sentences. I just … wrote.


#FBexit? Is that a thing?

My #FBexit statement, as posted to That Site.

Facebook is a problem. It’s become the only way I’m in contact with a lot of folks, including far-flung family and friends accumulated over a decade of the Academic Nomad life. But it’s also absolutely awful at moderating the news or stopping the spread of falsehoods, and it continues to seek new ways to do unsavory things with the data we put on its servers even as it fails to secure that data. So I’m trying to cut as much of my life out of Facebook as I can, paring my profile there to a point of contact and not much else.

I’ve downloaded my data and done my best to clear out past postings — so many old photos! — and I’m going to use the holiday season to spread the things I used to do over Facebook to a variety of other platforms, which are at least nominally separate entities. I’ve put up a list of those platforms and profiles as my last Facebook post: my Flickr account, which needs something like eight months of updating (!); my Twitter profile; my e-mail and phone number for messaging; and this very blog for longer-form stuff. None of these are perfect solutions; some of them are entangled with corporations very nearly as unlikable as Facebook. But I hope I can use them together to achieve what a Facebook profile does with more control over the negative externalities of life online.

Also, it probably wouldn’t hurt for me to do more quick writing in a space like this one. We’ll see how this goes in the new year.

Chronicle Vitae: On advice

Truth in advertising. (Flickr: Alexander)

I’m back in Vitae this week, ruminating on the usefulness of personal advice — or rather, its frequent lack of usefulness.

The challenge with receiving and applying advice is to distinguish real, general principles from what may simply amount to another person’s recollection of a series of events that ended well. … Certainly in academia, as in any career, there are habits and choices that improve the odds of survival from graduate school to tenure. But simply making it to a particular stage doesn’t actually mean that you had all the right habits or made all the right choices — or even know which habits and choices will work for most other people.

In keeping with my established approach to these columns, I actually do circle back around to a way in which you can learn from other folks’ personal experiences, but you’ll need to read the whole thing to find out how.

Today, in statistics that lose almost all meaning when you really think about them

Someone in my Twitter stream passed along a Washington Post WonkBlog item, which, drawing from WaPo’s important and impressive tracking of shootings by U.S. police, estimates that as of the posting date (June 1), police were responsible for 1 in 13 gun deaths in the U.S. There’s a graphic, for extra share-ability:

One in 13! ONE in THIRTEEN! ONE in THIRTEEN!

… Is that a lot?

Actually, we know it’s an estimated 385 deaths, because the numbers are right there on the shareable graphic. (Good practice, that, well done!) But the first thing that occurred to me, as I looked at the graphic, was that the “one in 13” ratio is only alarming to me because I knew, even before I saw the raw numbers, that Americans shoot each other a lot. According to the Post’s data, there were 5,099 shooting deaths in the first five months of 2015! If we had the per-capita shooting death rate of a civilized nation, the police could shoot exactly as many people and account for a much larger share of gun deaths, but would that be proportionally more alarming?
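For what it’s worth, the arithmetic behind the graphic is easy to verify. A minimal sketch, using the two numbers from the graphic (the “hypothetical” lower death toll below is purely illustrative, not a real statistic):

```python
# Sanity-check the "1 in 13" figure: an estimated 385 deaths from
# police shootings against 5,099 total shooting deaths in the first
# five months of 2015.
police_deaths = 385
total_deaths = 5_099

print(f"1 in {total_deaths / police_deaths:.1f} gun deaths")  # 1 in 13.2

# The ratio depends as much on the denominator as on police conduct:
# with a hypothetical, much lower overall death toll (illustrative
# number only), the same 385 police shootings would make up a far
# larger share of gun deaths.
hypothetical_total = 1_000
print(f"1 in {hypothetical_total / police_deaths:.1f} gun deaths")  # 1 in 2.6
```

Which is just to say that “1 in 13” rounds off exactly as advertised, and that the ratio moves with total gun deaths, not only with police behavior.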

And then the second thing that occurred to me was that, actually, I can picture a scenario in which I’d prefer for the ratio to be higher — if I could trust the police to shoot people only when necessary, unencumbered by systematic biases and a proclivity to use maximal force. Heck, in a world where fully trustworthy police were responsible for 100% of gun deaths, that’d mean no gun deaths resulting from four-year-olds rummaging in their parents’ nightstands, and no gun deaths by paranoid old white dudes who hate rap music. I’d actually quite like to live in that world.

Really, all of the underlying understanding that makes the info-graphic stat alarming and newsworthy and share-able is more depressing and infuriating than the statistic itself: we live in a country where guns are used to kill far too many people, and we don’t trust the police to treat their fellow citizens fairly. Happy day-after-Independence-Day!

(As expounded previously on Twitter.)

No, E.O. Wilson didn’t invent the term “evolutionary biology”

Screenshot of the OED entry for “evolutionary biology.”

This View of Life, the evolution-centric online magazine, has a long “conversation” with myrmecologist E.O. Wilson, one of the most prominent evolutionary biologists of the era following the “Modern Synthesis” in the second half of the Twentieth Century, and still one of the leading popularizers of evolution. It’s a long ramble, but worth your reading time, I dare say. Though, to be honest, I only found out about it because of this aside that TVOL highlighted in a tweet:

I can’t say Jim [Watson] and I were friends because I was the only younger professor in what came to be known as evolutionary biology—a term I invented, incidentally—as I started here in Harvard, and it was Jim Watson’s wish that I and other old fashioned biologists not leave the university but find a place elsewhere than the biological laboratories. So we were not on friendly terms. [Emphasis added.]

Wow! No one was using the term evolutionary biology before E.O. Wilson? That would be pretty nifty, but it’s also easy to fact-check. I did it by looking up the phrase in the online Oxford English Dictionary over breakfast. And I found a citation to this, on page 140 of St. George Mivart’s book Contemporary Evolution, an Essay on Some Recent Social Changes, published in 1876:

The second instance is that of the apparent conflict between evolutionary biology and Christian dogma, and indeed, no better test question as to the effect of scientific progress on Christianity could well be devised. [Emphasis added.]

The OED also has a citation from 1920, nine years before Wilson was born, which refers to work by T.H. Huxley, one of the contributors to the Modern Synthesis. [Correction: Whoops, nope, Thomas Henry Huxley isn’t the Modern Synthesis guy; that’s his grandson Julian. I SHOULD HAVE KNOWN THIS.] So, I’d go so far as to say that it looks like evolutionary biology pre-dates Wilson considerably, and was probably even in common use by the time he joined the faculty at Harvard.

Update: Following from Dave Harris’s response on Twitter, I see that evolutionary biology, as a fraction of all mentions of biology in Google’s Ngrams text database, does start climbing upward in the mid-1960s, coincident with Wilson’s early career. Wilson’s work surely contributed to that increase in the use of the term, though I think it’s quite unlikely he’s solely responsible.

Okay, I’m ready to stop the march of progress now

As Douglas Adams famously and incisively put it,

“There’s a set of rules that anything that was in the world when you were born is normal and natural. Anything invented between when you were 15 and 35 is new and revolutionary and exciting, and you’ll probably get a career in it. Anything invented after you’re 35 is against the natural order of things.”

I think I’ve found my first post-35 technology about five years early, and it’s Google’s glasses.

Figure a few years for this to make it into general use, and we’re bang on my age of transition, I guess. I can think of nothing I’d like less than having my field of vision partially obscured by whatever G-mail thinks is most worthy of my attention—new messages, helpful advice, or the inevitable location-based text ads. It is just not that difficult for me to reach into a pocket and check my phone when I want to see my e-mail.

Also: notice that none of the people encountered by the Google glasses-wearing fellow in the video are wearing Google glasses? That’s because even people who do want G-mail alerts directly in their eyes don’t want to live among the kind of socially stunted cyborgs we’d all become if we wore these things, talking to the air and pointing at things that exist only in cyberspace.

In short, ugh. Ugh, ugh, ugh.◼