And another thing …

Regarding that adaptive fairytale about the “runner’s high”—over at Distributed Ecology, Ted Hart points out that it doesn’t make much sense in phylogenetic context, either.

What would be really interesting is to see where this trait maps across the phylogeny. Is it a conserved trait that was selected for in some ancestor? That would point to the fact that maybe it has nothing to do with running. The authors are mute about phylogeny, but eCB’s could alternatively be the ancestral character state, and really the interesting question is why did ferrets evolve the loss of this state? On the other hand maybe the trait evolved multiple times, and that also is really interesting to ask how that happened. But either phylogenetic scenario undermines the central thesis of Raichlen.

You’ll want to read the whole thing, natch.◼

Dr. Pangloss runs a marathon

Runners in the 2009 New York Marathon. Photo by Whiskeygonebad.

This just came over Twitter (hat tip to @DLiancourt): NPR is running a story claiming that the “runner’s high” some of us feel after a good workout is an adaptation to prompt us to keep fit, or something.

When people exercise aerobically, their bodies can actually make drugs—cannabinoids, the same kind of chemicals in marijuana. [University of Arizona anthropologist David] Raichlen wondered if other distance-running animals also produced those drugs. If so, maybe runner’s high is not some peculiar thing with humans. Maybe it’s an evolutionary payoff for doing something hard and painful, that also helps them survive better, be healthier, hunt better or have more offspring.

So, in a study [$a] published in The Journal of Experimental Biology, Raichlen tested this adaptive hypothesis by comparing the levels of these “endogenous cannabinoids” in the blood of humans, dogs, and ferrets after running on a treadmill. The idea being that the ancestors of dogs, like ours, made a living by running—chasing down prey—while the ancestors of ferrets didn’t.

So it’s kind of nice to see that Raichlen and his coauthors did, indeed, find that humans and dogs both had higher levels of endogenous cannabinoids in their blood after a run, and the ferrets didn’t. That’s a useful evolutionary data point: it suggests that whatever physiological system prompts endogenous cannabinoid production in connection with exercise dates to (at least) the common ancestor of dogs and humans, and that its preservation in both species may be linked to our shared ability to run long distances.

But it really doesn’t show that this cannabinoid response is an adaptation to reward us for putting in our daily miles.

To really show that the runner’s high is an adaptation, of course, we’d need data showing (1) that observed variation in the runner’s high response has a genetic basis, and (2) that people who get stronger runner’s highs have more babies. But even apart from that, what’s good for us today—getting off our butts and going for a run—isn’t particularly well connected to the lives of our proto-human ancestors. Does Raichlen really think that early humans (or dogs, or any other animal that chases down prey) would just sit around and go hungry if we didn’t have a cannabinoid payoff at the end of the hunt?

And then, in the text of the very same NPR article, an orthopedic surgeon is quoted saying that the “runner’s high” can actually be a problem:

[Dr. Christina] Morganti treats runners for injuries and she says they’re the worst patients. “The treatment is to stop running,” she says. “They won’t. They don’t want to. A lot of the behavior is not unlike the patients we have who are seeking drugs. It’s really similar. It’s an addiction.”

So … a physiological response that prompts some of us to run even when running is likely to exacerbate injury is a good thing? How, exactly, would giving yourself shin splints lead to greater reproductive fitness if you’re making a living hunting gazelles on the savannah? I’m going to go out on a limb and say it wouldn’t.

Here’s an alternative hypothesis, which I freely admit is no better supported by the available data: the endogenous cannabinoid response isn’t a “reward” for running. Instead, it helped our ancestors tolerate the stress of running when they needed to, by letting them ignore minor pains and press on after that one elusive, tasty antelope. For our ancestors, dinner was the reward for running, not the cannabinoids. In the modern world, where we don’t run for our dinners, we’ve re-purposed the pleasant persistence of those cannabinoids as a motivation to replace that original life-or-death need.

Whatever the actual evolutionary origins of the “runner’s high,” the idea that it’s an adaptive reward for exercise is nothing more than adaptive storytelling filtered through the lens of our modern, very unnatural, lives. Don’t get me wrong—I love to run, and in fact I’m a month away from my fourth marathon. But I’m not going to pretend that I’ll be running those 26.2 miles because natural selection wants me to.◼

Reference

Raichlen, D., Foster, A., Gerdeman, G., Seillier, A., & Giuffrida, A. (2012). Wired to run: exercise-induced endocannabinoid signaling in humans and cursorial mammals with implications for the ‘runner’s high’. Journal of Experimental Biology, 215 (8), 1331-6 DOI: 10.1242/jeb.063677

It’s that time of year again!

The Portland Marathon two years ago. Looks fun, right?

This weekend I’m flying out to Portland for the 2011 Portland Marathon, my third. It’s been a bit tricky keeping up with my training on top of moving to a new town and starting up a postdoc with a whole new study system, but I think I’ll be ready. While I pack, why not check out my post on the occasion of last year’s Seattle Marathon, in which I discuss what I’ve learned over a couple years of long runs and leg cramps. It all still applies.

I can make it through even a half-marathon on a good breakfast and carefully-judged pre-race hydration, but to go much longer I need more food (and water) mid-run. The long-term exercise involved in a long race is fueled by a combination of fat reserves and glycogen stored in the liver and muscle tissue. Glycogen is the more efficient fuel, so as exercise intensity increases, muscles draw on it more heavily.

For far more detail on evidence-based endurance training approaches, I suggest Dave Munger’s great Science-Based Running. See you in 26.2 miles! ◼

No, I will not run the Seattle Marathon barefoot

I’m spending a significant chunk of my Thanksgiving break in Seattle, for the purpose of running what will be my second marathon this weekend. Running, like cooking, is helping to keep me sane in the midst of teaching labs, finishing my dissertation research, writing said research up for publication, and trying to sort out what happens after my committee decides I’ve earned a handful of extra letters after my name.

Me at about mile 17 in last year’s Portland Marathon. I’m not quite dead yet.

My first marathon was last year’s Portland Marathon. Prior to 2009, I’d never run a race longer than five miles, but then that spring I let friends talk me into a half-marathon, and after running more than 13 miles, 26.2 suddenly didn’t seem quite so insane. Even so, training up for Portland was more than enough to make me realize that running what was (for me) a 3-hour, 45-minute course is not really the same thing as running eight or nine 5Ks in a row.

Feed me!

I can make it through even a half-marathon on a good breakfast and carefully-judged pre-race hydration, but to go much longer I need more food (and water) mid-run. The long-term exercise involved in a long race is fueled by a combination of fat reserves and glycogen stored in the liver and muscle tissue. Glycogen is the more efficient fuel, so as exercise intensity increases, muscles draw on it more heavily.

If his muscles run out of glycogen, a runner “hits the wall,” and may be forced to stop running altogether. I’ve done this a few times on long training runs, and it’s not pleasant—I’d end up all but walking the last couple painful miles. How long I can go before I hit the wall depends on my glycogen reserves, which in turn depend on the muscle mass in my legs—but it also depends on how fast I’m running, since glycogen use increases with effort. A computational study of the interactions between exercise intensity and glycogen consumption suggests that my first marathon time, 3:45, was close to the upper limit of glycogen consumption for a “trained endurance athlete”—and I probably don’t really qualify as “trained,” in the sense the study uses. So to survive a marathon, I have to take on supplementary energy mid-race, for which I will carry tubes of disgusting sugar syrup.

Supplementary sugar. Nasty but necessary. Photo by size8jeans.
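That glycogen budget lends itself to back-of-envelope arithmetic. The sketch below uses illustrative textbook-style assumptions, not values from Rapoport’s model: glycogen yields roughly 4 kcal per gram, running costs roughly 1 kcal per kilogram of body mass per kilometer, and the share of that cost paid from glycogen (versus fat) rises with intensity.

```python
# Back-of-envelope glycogen budget for a marathon runner.
# All constants are illustrative assumptions, not figures from
# Rapoport (2010).

KCAL_PER_G_GLYCOGEN = 4.0   # energy density of carbohydrate
COST_KCAL_PER_KG_KM = 1.0   # rough net energy cost of running
MARATHON_KM = 42.195

def glycogen_limited_km(body_mass_kg, glycogen_g, carb_fraction):
    """Distance runnable before glycogen stores are exhausted.

    carb_fraction: share of energy drawn from glycogen; it rises
    with intensity, and the remainder comes from fat.
    """
    total_kcal = glycogen_g * KCAL_PER_G_GLYCOGEN
    kcal_per_km = body_mass_kg * COST_KCAL_PER_KG_KM * carb_fraction
    return total_kcal / kcal_per_km

# A 70 kg runner with ~450 g of liver-plus-muscle glycogen, burning
# ~80% carbohydrate at marathon pace:
print(glycogen_limited_km(70, 450, 0.8))  # ~32 km: "the wall"
```

Under these (made-up but plausible) numbers, the wall arrives around 32 km—short of the 42.195 km distance, which is exactly why mid-race sugar syrup matters; slowing down lowers carb_fraction and pushes the wall farther out.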

Shoes matter. Who knew?

Before I started training for Portland, I didn’t pay much attention to the state of my running shoes—I bought new ones when the holes in the uppers got too obvious. That’s okay when the longest run I do is about eight miles—once my weekly schedule started including longer distances, I noticed more post-run pain when my shoes’ insoles deteriorated. I began investing in gel insole inserts and actually paying attention to how much mileage my shoes had accumulated.

The funny thing about shoes, though, is that familiarity is almost as important as adequate support. Last year I bought new shoes about a month out from the marathon—and ran some truly miserable long runs in them. Lesson learned. It turns out that a new pair of shoes takes some breaking in, especially if you switch brands, as I had. I ended up running the marathon in the shoes I’d considered shot (with new insole inserts), and felt better at the end than I had on a fifteen-mile run in the new ones. I now stick to one brand of shoes, with the same inserts if possible, and I don’t wear new shoes on a long run until I’ve worn them on a number of short ones.

Barefoot running, without the bare feet. Photo by Steven Erdmanczyk.

Part of the reason that my running comfort is so sensitive to the quality of my shoes may be that human feet didn’t evolve to run in running shoes. Running on two legs sets humans apart from our closest evolutionary relatives, and we’ve probably been doing it for millions of years—but highly padded running shoes are a very recent invention. This is the central argument in favor of a recent fad for barefoot running [PDF]—that, once you build up some necessary calluses, running without the artificial support and padding of a running shoe is less stressful. A biomechanical comparison of barefoot and shod runners provided some of the first data to support this hypothesis earlier this year. Essentially, barefoot runners tend to land each step toes- or mid-foot-first [PDF], which absorbs the force of a foot-strike more effectively than the heel-first tread of shod runners.

I did see a few of my fellow marathon runners wearing nothing but “barefoot” running shoes like the ones pictured here, which provide protection against rough pavement but no artificial padding. I’m not going to be doing that any time soon. But maybe I’ll try to add some barefoot workouts into my training routine, if I survive Seattle and decide to run a third marathon.

I intend no endorsement of any products pictured or linked to in this post. Thanks to Conor O’Brien, who pointed me to the PLoS Computational Biology article cited above.

References

Jungers, W. (2010). Biomechanics: Barefoot running strikes back. Nature, 463 (7280), 433-4 DOI: 10.1038/463433a

Lieberman, D., Venkadesan, M., Werbel, W., Daoud, A., D’Andrea, S., Davis, I., Mang’Eni, R., & Pitsiladis, Y. (2010). Foot strike patterns and collision forces in habitually barefoot versus shod runners. Nature, 463 (7280), 531-5 DOI: 10.1038/nature08723

Rapoport, B. (2010). Metabolic factors limiting performance in marathon runners. PLoS Computational Biology, 6 (10) DOI: 10.1371/journal.pcbi.1000960