this post was submitted on 11 Jul 2023
64 points (100.0% liked)
Technology
I think the whole thing about megacorps being the problem here is a bit shortsighted. I don't think it will be too much longer before anyone can spin up their own LLM; it doesn't exactly take Google levels of resources. I'm as happy to shit on megacorps as the next person here, but IP law as it stands is BS.
More likely than not, any changes made will benefit large corporations at the expense of individuals and competition. I'm imagining a world where copyright law has made it so that only big corporations can afford to pay for LLM training data, as if individuals had to pay library book prices for a personal copy just to train their personal LLM. This desire to "cash in" may just play right into the megacorporations' hands.
I agree that cashing in is at least an important part of this. As I understand it, however, past a certain point creating and using LLMs is in fact extremely expensive; that's why GPT-4 limits user interactions, for example. I also think that the more restricted these tools are in general, the better for everyone. It's absolutely possible to use them in positive ways, but as it stands they are mostly just flooding the internet with garbage and killing low-level content jobs.
We're already heading in a direction that mainly benefits those who are already in power. The real impact of these lawsuits appears to favor corporations and copyright holders, without sufficient thought to how they might limit individuals like us. People are already anxious about AI taking their jobs, right? But if we keep creating laws that continuously favor the same powerful few, it shouldn't shock us when the average person can't keep up. Just to give you an idea: instead of being able to use LLMs to make my work easier, I may be forced to abandon this tech entirely because of this kind of shortsightedness. LLMs should be a tool available to ALL of us, not just those at the top.