The Washington Post “climate advice” columnist, Michael Coren, has a great article up today [gift link] about his experience trying out apps, like iNaturalist or Merlin, that identify plants and wildlife from smartphone images or audio recordings. It’s clear from Coren’s description that being able to put names to the living things in his neighborhood gave him a new connection to that urban biodiversity:
I’m not a master naturalist, but I have one in my pocket. Thanks to artificial intelligence trained on millions of observations, anyone with a smartphone can snap a picture or record a sound to identify tens of thousands of species, from field bluebells to native bumblebees.
If I’m honest, it’s the kind of thing I would normally miss while walking or pedaling to work. Birdsong might be gorgeous but I’d barely hear it. I’d note “pine tree” as a catchall for conifers.
That has changed. I’m now on a first-name basis with most of my wild neighbors. It has reconnected me to a natural world I love, yet never studied deeply enough to know all its characters and settings.
This is very much the experience I hope students have in my undergraduate plant systematics course, and I’m delighted that smartphone apps are making it more accessible. (This year I actually started providing my plant systematics class with explicit guidance in using iNaturalist as one resource for plant identification, in concert with formal botanical keys like the Jepson Manual.) But Coren also illustrates the article with images of plants he’d identified using apps and … they’re not very good?
On the left, here’s a “possible Monterey cypress” and on the right a “borage”, as identified, I think accurately, using iNaturalist. Neither of these images includes a clear view of what I’d try to capture in an image I’d use for identification. The cypress is mostly trunk, with a few obscured views of the small scale-like leaves I’d expect to see on that species. The borage is viewed from above, and because the flowers are nodding downward we see the blue petals and fuzzy sepals but none of the interior whorls of the flower. Compare to this image of Borago officinalis from Harald Henkel on Flickr:
This angle shows us the stamens and pistil of the flower, really distinctive details for this species, in addition to a bit of leaf shape and the ubiquitous hairs, which are also a common feature of species in this family. When I’m using iNaturalist, this is the kind of image I try to take, because I may want it for followup examination, and because it might look nice in my Flickr collection of plant images. But I’m a botanist — or, well, a plant-focused biologist — and I know enough of the anatomy of flowering plants to look for these features. Coren, by contrast, is an enthusiast with a smartphone app that can make an educated guess: when it “sees” fuzzy hairs and green leaves and those spectacular blue petals in an image geo-tagged near a place where other people have logged observations of Borago officinalis, the image is probably also depicting a borage.
To be very clear, my point here isn’t to mock Coren’s photography skills! Rather, it’s that his column got me thinking about the differences between an experience of nature mediated by iNaturalist or other AI-powered apps and one informed by a little more directed study of the species to be identified and named. I don’t think the former experience is necessarily a lesser one — learning to name the biodiversity around you really does give you a valuable new connection to that diversity, even if it’s not a technically informed connection.
Nevertheless, knowing how to look at plants, and what to look for, is its own skill set apart from recognizing species by name, and it’s one I watch students acquire in that plant systematics class over the course of a semester. That skill set is what lets us make identifications beyond what apps can achieve (for now), and what helps a fledgling botanist draw connections between a new-to-them species and the diversity they already know and recognize. I can imagine a version of iNaturalist that introduced users to plant anatomy and guided them to capture images of specific features, to better inform AI-powered identifications — but for now, at least, cultivating that way of seeing the green world remains a part of my job as a biological educator that can’t be replaced in the app store.