Barry Morisse

Longtermism

Talk of the coronavirus is dominating the global conversation at present, and rightly so. It represents the most significant pandemic risk we’ve seen in a long time. Even though I’m frustrated by the amount of misinformation circulating, I’ve found it sociologically interesting to observe how quickly it has enraptured everyone around the world. So I won’t discuss the virus itself here; there’s plenty you can read about it online (but please use reputable sources and read widely). Instead, I want to explore a related thought.

The reason it has become a global conversation so quickly and so completely is that it represents a serious immediate threat. The window we have to limit its spread is short, so it’s rational to raise the red flag and risk some widespread panic, because doing so communicates the seriousness of the issue far and wide. It’s risk management at its core. We’re especially effective as a species at responding to immediate risk.

When the risk lies further in the future, though, we see a very different response. Look, for example, at how we have responded to the risk of climate change as a species: we simply haven’t seen a significant effort to manage it from the key decision-makers around the world. Of course, it’s a more complex issue, the underlying science is hotly debated and it is very difficult to measure - but I think the major factor in our relative nonchalance is that the consequences lie far in the future, beyond any of our lifetimes. We seem to perform a risk triage in our heads and give little weight to long-term risks that aren’t staring us directly in the face.

I think this is a mistake.

I’m well aware that altering this inbuilt survival instinct is not tractable on an individual level, but surely we have the capability to build institutions and set policies that override our immediate instincts and are instead mandated to act in the best interests of our descendants. Some might even argue that we have a moral obligation to do so. I’ve seen murmurings of this way of thinking in population ethics, in long-term consequentialist philosophy and, more specifically, in the effective altruism movement. If we strive to act impartially, it seems uncontroversial to suggest that we have an ethical obligation to take longtermism very seriously.

What stands in our way are the economic, psychological and political incentives to focus solely on short-term risk. However, we’ve seen pockets of communities push against this - those advocating for change in nuclear policy, climate policy, advanced artificial intelligence and so on. I’m emboldened by the progress they have made and I hope we’ll see that trend continue.

It doesn’t have to be everyone, and it doesn’t have to subsume all or even a majority of our resources. But I think we are compelled to build institutions with the self-imposed structures and constraints that give long-term risk its due. We won’t be thanked for it, and it won’t be rewarding for us. But it’s a worthy investment in the future of life on this earth and wherever we go next.