The biggest problem about AI is not intrinsic to AI. It’s to do with the fact that it’s owned by the same few people, and I have less and less interest in what those people think, and more and more criticisms of what the effect of their work has been.
I’m on board with Eno here. There are issues with the current wave of generative AI, but the main reason they are such huge problems is that the people pushing these technologies have zero concern for the social harms they could cause or exacerbate.
I’m very much in agreement with Eno here, actually. I could easily imagine a world in which LLMs and image generators didn’t just “have use cases,” but were actually revolutionary in more than a few of those cases. A world in which they were used well, for good things.
But we don’t live in that world. We live in one where they were almost entirely born under and shaped by megacorps. That’s never healthy for anything, be it new tech or the people using it. The circumstances in which LLMs and generative models were developed were such that nobody should be surprised we got what we did.
I think that in a better world, image generation could’ve been used for prototyping, for fun, or for enabling art by people without the time to dedicate to a craft. It could’ve been a tool like any other. LLMs could’ve carried better warnings about their hallucinations, or simply been used less for serious things for lack of incentive, leaving only the harmless cases. Some issues would still exist – I think training a model on small artists’ work without consent is still wrong, for example – but we wouldn’t face so much in the way of intense electricity usage or de facto corporate bandwagon-jumping and con-artistry, and the issues that remained wouldn’t be happening at quite such an industrial scale.
It reminds me of how, before the “AI boom” hit, there was a fair amount of critique of copyright from leftists and FOSS advocates. There still is, to be sure; but it’s been muddied now by artists and website owners who, rightly, want these companies not to steal their work. These two attitudes aren’t incompatible, but it shows a disconnect all the same. And in that disconnect I think we would do well to remember an alternate chain of events in which such a dissonance might never have occurred to begin with.
No. It’s one of the many, many problems. Even if it weren’t shovelling money into rich white dudes’ mouths and trained on stolen data, it would still use absurd amounts of energy, could be used for propaganda and further erode what is true, make creative people lose their jobs, make normal people rely on machines that sell hallucinations as facts, etc. etc. This shit is mostly terrible.
They don’t even use absurd amounts of energy. I literally have one running on my computer at home.
The biggest problem with AI is copyright.
Something we desperately need to abolish.
I disagree. Copyright is not bad.
Not without throwing a whole lot of other concepts in the trash along with it (like capitalism and paywalling basic human necessities).