This week, I was diverted from discussing a tangible here-and-now technology for my target audience and instead posted a speculative piece on the reasons for Sam Altman’s firing, a piece whose relevance is fading with the half-life of the news cycle. By the time I finish writing this, I expect that drama will have taken yet another twist best left to other authors …
All kidding aside, it turns out I was pretty much dead-on in my speculation, and that leads me to make another speculative prediction. So while this is still, in a way, a continuation of a diversion outside my wheelhouse, this speculation is much more relevant. So here you go, my best effort at predicting the future:
Artificial General Intelligence (AGI) will persist as a concept and never as a reality.
Below, I will explain why I think this. But for my target audience of school leaders and educational practitioners who just want to cut to the chase and get a simple explanation of how this might impact them, here is what it means:
Generative Artificial Intelligence isn’t going to get a whole lot smarter than it is now. There won’t be another ChatGPT-level event within the next few years.
That’s not to say that we won’t have some advancements to look forward to that will make the technology more seamless to the user, and therefore appear better, faster, and “smarter” … I mean that I don’t think we can expect another AI event of ChatGPT’s magnitude anytime soon:
GenAI technologies will proliferate and become integrated, forming a core layer of the internet, but they will not replace human intelligence.
For the title artwork I chose for this piece, I asked DALL-E to weave dragon figures as constellations against a night sky, because a computer possessing the general intelligence of a human strikes me as mythological: readily imagined, but not a thing of this world.
But why include constellations in the title artwork?
Astronomy is a strong interest of mine, and from that discipline there is a devastating argument that cuts across the UFO debate. We can be confident that aliens have not visited Earth, and it has everything to do with the vast distances involved and how much energy crossing them would take. In other words, just by knowing physics, we can say that spaceships vrooming across the stars simply aren’t plausible.
When it comes to generative artificial intelligence, compute is like the vast distances of astronomy. There is only so much compute that is possible or available. It didn’t escape my notice that compute was front-and-center in the OpenAI saga, after all.
OpenAI would not have been able to build ChatGPT without the compute power available from Microsoft, which meant that the academic exercise of trying to build GenAI was met head-on with the economics of the project. To progress towards AGI (and no, GPT tech is not AGI), they had to deploy vast data centers to make it viable. Enter Microsoft.
So, as a thought experiment, I reckon that if we were able to sit down and draw up a way to figure out how much electricity would be needed to create HAL 9000 (we’d first have to quantify intelligence; is that even possible?) …
If there were a way to calculate how much compute we’d need for AGI, we would discover that we would need data centers the collective size of the solar system, or some other such implausible scale.
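To make the flavor of that thought experiment concrete, here is a minimal back-of-envelope sketch. Every number in it is a labeled placeholder, not a measurement: the assumed compute of a current frontier model, the assumed multiplier AGI might demand, and the assumed hardware efficiency are all hypothetical inputs chosen only to show how quickly the energy bill becomes implausible.

```python
def energy_for_training(total_flops: float, flops_per_joule: float) -> float:
    """Energy in joules to run a training job of `total_flops`
    on hardware delivering `flops_per_joule` of useful compute."""
    return total_flops / flops_per_joule

# Hypothetical inputs (placeholders, not real figures):
frontier_flops = 1e25            # assumed compute of a current large model
agi_multiplier = 1e6             # suppose AGI demands a million times more
efficiency = 1e10                # assumed useful FLOPs per joule of electricity

joules = energy_for_training(frontier_flops * agi_multiplier, efficiency)
terawatt_hours = joules / 3.6e15  # 1 TWh = 3.6e15 joules

# Under these made-up assumptions, the answer lands in the hundreds of
# thousands of TWh -- roughly ten times annual world electricity use.
print(f"{terawatt_hours:,.0f} TWh")
```

The exact numbers don’t matter; the point is that any multiplicative gap between today’s models and “general intelligence” gets paid for in electricity, and multiplicative gaps close fast on absurdity.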
Thus, dragons against the night sky, dancing in our dreams.
