Chuck Wendig has a long list of things he wants AI to do for him, juxtaposing it against the stuff that the large language models currently being promoted as AI are actually doing.
A lot of the stuff on his list sounds pretty good, as do many of the fantasy scenarios suggested by promoters of AI. The problem is that most of the things it would be nice for AI to do will never make any money for the companies building these platforms.
Even putting aside the need or desire to turn a profit, systems like ChatGPT and other LLMs consume a huge amount of computing power, which requires a lot of servers, storage, cooling, and power. “In the cloud” doesn’t mean this stuff is off magically running somewhere in the ether; it means it is running in someone else’s data center, probably on a ton of hardware.
All of that stuff costs money and that money has to come from somewhere.
For that matter, the same can be said for all the other services for which we rely upon the cloud. No one seems to want to pay for search or social—most people would laugh off the very idea—but then we turn around and wonder why these platforms are infested with advertising.
I’m not trying to defend the companies that build and run most of these platforms and services. They are largely terrible—owned and run by awful people. But I feel like if we want the technology we use to work differently than how it works right now, we need to be realistic about how we got here.