For this kind of thinking to work in practice you would need to kill the people that AI makes redundant. And that's apart from the fact that right now we're at a choke point where it's much more important to generate less CO2 than it is to write scientific simulation code a little quicker (and most people are using AI for far less necessary stuff, like marketing).
> For this kind of thinking to work in practice you would need to kill the people that AI makes redundant.
That's a logical leap I'm certainly not making. AI doesn't make anybody redundant, any more than mechanized farming did. It just frees people up to do more productive things.
Now consider whether LLMs will ultimately speed up the technological advancements needed to reduce CO2. It's certainly plausible.
Think about how much cloud computing and open source changed things: suddenly you could launch a startup with 3 engineers instead of 20. What happened? An explosion of startups, since there were so many more engineers to go around. The engineers weren't delivering pizzas instead.
The same thing is happening with anything that needs more art -- the potential for video games here is extraordinary. As the tools mature, a trained artist leveraging AI is far more effective and can handle 10x the output. Now you get 10x more video games, or 10x more complex/larger worlds, or whatever it is the market ends up wanting.
Except in reality they're not. If you want to argue the contrary, show statistics that unemployment among digital artists is rising.
So many people make this mistake when new technologies come out, thinking they'll replace workers. They just make workers more productive. Sometimes people do end up shifting to different fields, but there's so much commercial demand for art assets across so many products that the labor market for digital artists isn't shrinking right now.