This was written in 2013.
Over the decades it has become apparent that simply throwing more processor cycles at the problem of true artificial intelligence isn’t going to cut it. A brain is orders of magnitude more complex than any AI system developed thus far, but some are getting closer.
And in the seven years since this post, we now have GPT-3, capable of writing essays at least as good as a college graduate's, on just about any topic that interests you, whether that's cryptocurrencies or the history of feudalism or the chemistry of marijuana.
I’m not trying to single out this article. It’s just a powerful reminder of how difficult it is for anybody to properly assess current technology, because of our inability to fully grasp the effects of exponential growth and the surprises of non-linear innovation.
We keep trying to peer into the future, and we keep being surprised when the future actually arrives.
What else are we getting wrong about our technologies of today? What other technologies have the skills or experience of a four-year-old?
A few come to mind: virtual reality. Blockchains. Robots (although that is becoming less true every year).