The impossibility of long-term forecasting


Lucy Kellaway wrote an article a while back poking fun at the McKinsey Global Institute's new set of long-term forecasts, which looks 50 years into the future. As part of her brilliant takedown of the report, she makes the very astute observation that the trends MGI identifies are not trends of the future; they are trends of the present. It has become clearer lately just how hard it is for people to forecast significant change. We can see linear change, but as soon as the curve is not linear but instead exponential or broken, our foresight breaks down. We are okay with change, we even like it, as long as it is nice incremental change and not "superchange". Our brains are programmed to enjoy inertia and protest if things seem too foreign.

I have lately been enjoying Nick Bostrom's Superintelligence. He seems to be one of the few people who is comfortable with the idea of superchange. In the book, he presents a chart showing the outcomes of superhuman machine intelligence as foreseen by experts in the field. Even among these people, who are the most knowledgeable on the subject, a majority think superhuman machine intelligence will most likely have moderately good outcomes. Only a very small minority (<10%) foresee extremely negative outcomes. This feels like an extremely short-sighted assumption.

It used to be the case that we could learn about the future by looking at the past. Now it seems this is no longer true, since today's world may in fact be more complex and non-linear than in times past. However, even if we cannot learn about the content of the future, we can surely learn about the speed and magnitude of change. It is undeniable that someone looking 20 years forward in 1994 could not have foreseen the things we take for granted today. This ranges from obvious examples, such as the powerful computers in our pockets that we know as cell phones, to the inescapability of climate change. It is therefore extremely presumptuous of us to assume that we can forecast 2034 with anything remotely approaching certainty. It seems a statistical impossibility that we would not experience superchange at the same rate as in the past. It is actually even more likely than before, given the combinatorial nature of inventions, as outlined by Erik Brynjolfsson and Andrew McAfee in The Second Machine Age.

We have developed tools to forecast the future and imagine change based on current irreversible trends, but now we need to invent tools to imagine superchange. Otherwise, we are proceeding blindly into the future.
