This morning, the news broke that Larian Studios, developer of Baldur's Gate 3 and the upcoming, just-announced Divinity, is apparently using generative AI behind the scenes. The backlash has been swift, and now Larian founder and game director Swen Vincke is responding to clarify his remarks.
There are AIs that are ethically trained. There are AIs that run on local hardware. We’ll eventually need AI ratings to distinguish use types, I suppose.
It’s even more complicated than that: “AI” is not even a well-defined term. Back when Quake 3 was still in beta (“the demo”), id Software held a competition to develop “bot AIs” that could be added to a server so players would have something to play against while they waited for more people to join (or so you could run players-vs-bots matches).
That was over 25 years ago. What kind of “AI” do you think was used back then? 🤣
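For context, shooter-bot “AI” of that era was typically just hand-written rules evaluated against the game state, with nothing learned at all. A minimal sketch in Python, where every name (Bot, the action strings, the thresholds) is illustrative and not taken from any real engine:

```python
class Bot:
    """A toy, rule-based 'bot AI' in the spirit of late-90s shooter bots."""

    def __init__(self, name):
        self.name = name
        self.health = 100

    def choose_action(self, enemy_visible, enemy_distance, low_on_ammo):
        # Simple priority rules over the game state; this kind of
        # hand-scripted logic is what people called "AI" back then.
        if self.health < 25:
            return "retreat_to_health_pack"
        if enemy_visible and low_on_ammo:
            return "switch_weapon"
        if enemy_visible and enemy_distance < 300:
            return "strafe_and_shoot"
        if enemy_visible:
            return "close_distance"
        return "patrol_waypoints"


bot = Bot("CrashBot")
print(bot.choose_action(enemy_visible=True, enemy_distance=150, low_on_ammo=False))
# -> strafe_and_shoot
```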
The AI hater extremists seem to be in two camps:
Data center haters
AI-is-killing-jobs
The data center haters are the strangest to me, because there’s this default assumption that data centers can never be powered by renewable energy and that AI will never improve to the point where it can all be run locally on people’s PCs (and other personal hardware).
Yet every day there’s news suggesting that local AI is performing better and better. It seems inevitable—to me—that “big AI” will go the same route as mainframes.
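For what it’s worth, “local AI” is already easy to poke at today. Here’s a minimal sketch assuming Ollama (or something like it) is installed and serving on its default port, with a small model already pulled; the model name and prompt are placeholders:

```python
import json
import urllib.request


def ask_local_model(prompt, model="llama3.2"):
    # Assumes an Ollama server on localhost:11434 and that `model`
    # has already been pulled; everything runs on the local machine.
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(ask_local_model("Summarize the mainframe-to-PC transition in one sentence."))
```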
Colloquially, most people today mean generative AI like LLMs when they say “AI”, for brevity.
Because there’s this default assumption that data centers can never be powered by renewable energy
That’s not the point at all. The point is that even before AI, our energy needs were outpacing our ability/willingness to switch to green energy. Even then we were using more fossil fuels than at any point in the history of the world. Now AI is just adding a whole other layer of energy demand on top of that.
Sure, maybe eventually we will power everything with green energy, but… we aren’t actually doing that, and we don’t have time to put off the transition. Every bit longer we wait adds to the negative effects on our climate and ecosystems.
There are AIs that are ethically trained. There are AIs that run on local hardware. We’ll eventually need AI ratings to distinguish use types, I suppose.
Can you please share examples and criteria?