[-] ogmios@sh.itjust.works 13 points 2 months ago

AI is never going to get worse than it is now

Is that just a wild assumption, or...? One phenomenon that has already been witnessed with AI is that it does in fact get worse if it trains on its own output.

[-] FaceDeer@fedia.io 3 points 2 months ago

Given that I have locally-run AIs sitting on my home computer that I have no plan to delete (until something better comes along), then yeah, it's never going to get worse. If all else fails I can just use the existing AI for as long as I want. It doesn't "wear out."

[-] ogmios@sh.itjust.works -1 points 2 months ago

It doesn’t “wear out.”

The physical components will, and compatible parts for older systems keep getting harder to come by. Computers are not immortal entities. Maintenance of older machines will only become more labour- and cost-intensive over time.

[-] knightly@pawb.social 5 points 2 months ago

The models are digital, making copies for safekeeping is easy.

The hardware is a computer, and computers are general-purpose. The kind that run AI models well at infrastructure scale are rather high end, but are still available off-the-shelf.
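The "copies are easy" point is worth spelling out: model weights are just files, so a backup can be verified bit-for-bit with a checksum. A minimal sketch in Python (the `model.bin` file here is a hypothetical stand-in for real weights):

```python
import hashlib
import os
import shutil
import tempfile

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, streaming in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo with a placeholder "model" file; real weights work the same way.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "model.bin")
dst = os.path.join(workdir, "model_backup.bin")
with open(src, "wb") as f:
    f.write(os.urandom(1024))  # stand-in for model weights

shutil.copy2(src, dst)
# The backup is bit-identical if and only if the digests match.
assert sha256_of(src) == sha256_of(dst)
print("copy verified")
```

Unlike a physical machine, nothing here degrades: every copy is exact, and the check costs one pass over the file.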

[-] FaceDeer@fedia.io 4 points 2 months ago

Computers are general-purpose machines. You can run a computer program on any computer, it may just be faster or slower depending on the computer's capabilities.

The AIs I run locally are also open-source, so if future computers lose compatibility with existing programs they can be recompiled for the new architecture.

I suppose we could lose the ability to build computers entirely, but that strikes me as a much bigger and more general issue than just this AI thing.

[-] ogmios@sh.itjust.works -1 points 2 months ago

You can run a computer program on any computer

Incorrect. Certain programs depend on specific hardware architectures and standards. There are already lots of old programs that can't be run natively on modern machines, and using software to emulate a compatible environment can impact performance in more ways than just speed.

[-] FaceDeer@fedia.io 1 points 2 months ago

You're wildly wrong about the fundamentals of computer science here. I'd be starting from first principles trying to explain further. I recommend reading up on Turing machines, or perhaps getting ChatGPT to explain it to you.
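The idea being gestured at is Church-Turing universality: any general-purpose computer can simulate any other, with only speed differing. A toy illustration of the concept, a few-line Turing machine simulator in Python (the example machine, which flips bits until it hits a blank, is made up for demonstration):

```python
def run_turing_machine(transitions, tape, state="start", pos=0, max_steps=1000):
    """Simulate a one-tape Turing machine.

    transitions maps (state, symbol) -> (new_state, new_symbol, move),
    where move is +1 (right) or -1 (left). The machine halts when no
    transition exists for the current (state, symbol) pair.
    """
    cells = dict(enumerate(tape))  # sparse tape; unwritten cells are blank "_"
    for _ in range(max_steps):
        symbol = cells.get(pos, "_")
        if (state, symbol) not in transitions:
            break
        state, cells[pos], move = transitions[(state, symbol)]
        pos += move
    return "".join(cells[i] for i in sorted(cells))

# Example machine: walk right, flipping 0 <-> 1, until the blank at the end.
flip = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}
print(run_turing_machine(flip, "0110"))  # -> 1001
```

Any table of transitions can be fed to this one simulator, which is the sense in which a computer is general-purpose: emulation may be slow, but it is always possible given enough time and memory.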

[-] ogmios@sh.itjust.works 1 points 2 months ago

I actually happen to know a lot about computers, and can even build them from raw materials. They are not eternal, no matter how they're treated in popular culture, and data retention as hardware and standards evolve is a serious concern that is getting a fair bit of attention in research. One of the more interesting avenues being explored is encoding data in DNA, because humans will always have a reason to want to be capable of reading DNA, but that's still just theory at this point.

this post was submitted on 01 Apr 2024
190 points (98.0% liked)

Futurology
