
The Future is Not a Story
For a story to be interesting to humans, it has to feature interesting content occurring at the human level. A story about the interaction of worms and rabbits is not very interesting to humans unless the characters are entirely anthropomorphic. Conversely, humans cannot write meaningful stories about content above the human level, because we lack the cognitive complexity to imagine such things.

The future is not a story to entertain you. Brought up on Star Wars, Star Trek, The Terminator, and many other stories, many geeks view the future almost exclusively through the lens of overly-beloved fiction. Though these geeks may not consciously think, "the future is basically Star Trek coming into existence", their initial reflex when confronted with a cool new piece of technology is to make a fictional reference. For example, look at the comments on practically any "futuristic technology" headline that appears on Digg. No one can resist.
However, the universe does not care that we find only stories at the human level interesting. The vast majority of natural phenomena are dictated by structures far smaller or far larger than humans, and often far less complex. In the near future, as humans create transhumans, the script of history will start to be written in a more sophisticated font that we lack the cognitive wherewithal to make direct sense of.
This “interestingness bias” causes futurists to come up with showy stories to get attention. One particularly flagrant example is BT futurologist Ian Pearson, who in 2002 predicted notebook computer screens with contrast as good as paper by 2003, mobile phone location used in traffic management systems by 2004, the first organism brought back to life in 2006, anti-noise technology built into homes by 2010, and the highest-earning celebrity being synthetic by 2010. All of these predictions have either failed or are on their way to failing, and seem to have been made more for show than out of seriousness.
If I were in charge of a futurist seminar, one of the first things I would probably do is discourage anyone from mentioning any fictional story whatsoever. I believe fiction has something to teach us about future possibilities, but the bias towards interesting stories is so overwhelmingly strong that most casual thinking about the future is thoroughly contaminated by it. No narrative can predict the future, because the future is a blur of uncertainties from our perspective, and will only appear like a narrative in retrospect.
This bias towards interesting future stories is particularly worrisome in the context of unFriendly AI. People anthropomorphize advanced AI and come up with a thousand interesting and semi-ironic stories for why it wouldn’t be a threat to us: for instance, AIs might find humans “boring” and blast off into space, or create their own basement universe, or figure out so much knowledge about the world that they desiccate from existential ennui. These scenarios all strike me as B-grade science fiction. More likely, when confronted by a recursively self-improving unFriendly AI with abstract mathematical goals unrelated to human concerns, the simple outcome is death. No robot wars, no citizenship battles, no epic historical dialogue between the President of the United States and the AI leader. Just defeat. How’s that for your interesting story?
