[–] remotelove@lemmy.ca 27 points 5 months ago (1 children)

It's been around for a while. It's the fluff and the parlor tricks that need to die. AI has never been magic and it's still a long way off before it's actually intelligent.

[–] frog@beehaw.org 22 points 5 months ago* (last edited 5 months ago) (1 children)

The other thing that needs to die is hoovering up all data to train AIs without the consent of, and compensation to, the owners of that data. Most of the more frivolous uses of AI would disappear at that point, because they would be financially non-viable.

[–] Even_Adder@lemmy.dbzer0.com 7 points 5 months ago* (last edited 5 months ago) (1 children)

Cory Doctorow wrote a good article about this a little while back.

[–] frog@beehaw.org 6 points 5 months ago (1 children)

I remember reading that a little while back. I definitely agree that the solution isn't extending copyright, but extending labour laws on a sector-wide basis. Because this is the ultimate problem with AI: the economic benefits are only going to a small handful, while everybody else loses out because of increased financial and employment insecurity.

So the question that comes to mind is exactly how, on a practical level, it would work to make sure that when a company scrapes data, trains an AI, and then makes billions of dollars, the thousands or millions of people who created the data all get a cut after the fact. Because particularly in the creative sector, a lot of people are freelancers who don't have a specific employer they can go after. From a purely practical perspective, paying artists before the data is used makes sure all those freelancers get paid. Waiting until the company makes a profit, taxing it out of them, and then distributing it to artists doesn't seem practical to me.

[–] Even_Adder@lemmy.dbzer0.com 2 points 5 months ago* (last edited 5 months ago) (1 children)

The point is that it's not an activity you can force someone to pay for. Everyone who can run models on their own can benefit, and that group can expand over time as research makes it feasible on more devices. But that can never come to pass if we destroy the rights that allow us to make observations and analyze data.

> Counting words and measuring pixels are not activities that you should need permission to perform, with or without a computer, even if the person whose words or pixels you're counting doesn't want you to. You should be able to look as hard as you want at the pixels in Kate Middleton's family photos, or track the rise and fall of the Oxford comma, and you shouldn't need anyone's permission to do so.

> Creating an individual bargainable copyright over training will not improve the material conditions of artists' lives – all it will do is change the relative shares of the value we create, shifting some of that value from tech companies that hate us and want us to starve to entertainment companies that hate us and want us to starve.

[–] frog@beehaw.org 6 points 5 months ago* (last edited 5 months ago) (1 children)

Creating same-y pieces with AI will not improve the material conditions of artists' lives, either. All that does is drag everyone down in a race to the bottom on who can churn out the most dreck the most quickly. "If we advance the technology enough, everybody can have it on their device and make as much AI-generated crap as they like" does not secure stable futures for artists.

[–] Even_Adder@lemmy.dbzer0.com 1 points 5 months ago (1 children)

> Creating same-y pieces with AI will not improve the material conditions of artists' lives, either. All that does is drag everyone down in a race to the bottom on who can churn out the most dreck the most quickly. "If we advance the technology enough, everybody can have it on their device and make as much AI-generated crap as they like" does not secure stable futures for artists.

If you're worried about labor issues, use labor law to improve your conditions. Don't deny regular people access to a competitive, corporate-independent tool for creativity, education, entertainment, and social mobility for your monetary gain.

Art ain't just a good; it's self-expression, communication, inspiration, joy – rights that belong to every human being. The kind of people who want to relegate such a significant part of the human experience to a domain where only the few can benefit aren't the kind of people who want things to get better. They want to become the proverbial boot. The more people can participate in these conversations, the more we can all learn.

I understand that you are passionate about this topic, and that you have strong opinions. However, insults and derisive language aren't helping this discussion. They only create hostility and resentment, and undermine your credibility. If you're interested, we can continue our discussion in good faith, but if your next comment is like this one, I won't be replying.

[–] frog@beehaw.org 3 points 5 months ago (1 children)

I did actually specify that I think the solution is extending labour laws to cover the entire sector, although it seems that you accidentally missed that in your enthusiasm to insist that the solution is having AI on more devices. However, so far I haven't seen any practical solutions as to how to extend labour laws to protect freelancers who will lose business to AI but don't have a specific employer that the labour laws will apply to. Retroactively assigning profits from AI to freelancers who have lost out during the process doesn't seem practical.

[–] Even_Adder@lemmy.dbzer0.com 1 points 5 months ago (1 children)

> So the question that comes to mind is exactly how, on a practical level, it would work to make sure that when a company scrapes data, trains an AI, and then makes billions of dollars, the thousands or millions of people who created the data all get a cut after the fact. Because particularly in the creative sector, a lot of people are freelancers who don't have a specific employer they can go after. From a purely practical perspective, paying artists before the data is used makes sure all those freelancers get paid. Waiting until the company makes a profit, taxing it out of them, and then distributing it to artists doesn't seem practical to me.

This isn't labor law.

[–] frog@beehaw.org 3 points 5 months ago* (last edited 5 months ago) (1 children)

Labour law alone, governing the terms under which people are employed and how they are paid, does not protect freelancers from the scenario that you, and so many others, advocate for: a multitude of individuals all training their own AIs. No AI advocate has ever proposed a viable and practical solution for the large number of artists who aren't directly employed by a company but are still exposed to all the downsides of unregulated AI.

The reality is that artists need to be paid for their work. That needs to happen at some point in the process. If AI companies (or individuals setting up their own customised AIs) don't want to pay in advance to obtain the training data, then they're going to have to pay from the profits generated by the AI. Continuing the status quo, where AIs can use artists' labour without paying them at all, is not an acceptable or viable long-term plan.

[–] Even_Adder@lemmy.dbzer0.com 1 points 5 months ago (1 children)

I don't think they have to; the point is to fight against the regression of public rights for the benefit of the few.

[–] frog@beehaw.org 3 points 5 months ago (1 children)

Destroying the rights of artists to the benefit of AI owners doesn't achieve that goal. Outside of the extremely wealthy who can produce art for art's sake, art is a form of skilled labour that is a livelihood for a great many people, particularly in the forms of art that are most at risk from AI: graphic design, illustration, concept art, etc. Most of the people in these roles are freelancers who aren't in salaried jobs that can be regulated with labour laws. They are typically commissioned to produce specific pieces of art. I really don't think AI enthusiasts have any idea how rare stable, long-term jobs in art actually are. The vast majority of artists are freelancers: it's essentially a gig economy.

Changes to labour laws protect artists who are employees - and we should absolutely make those changes, so that companies can't simply employ artists, train AI on their work, then fire them all. That absolutely needs to happen. But that doesn't protect freelancers from companies that say "we'll buy a few pieces from that artist, then train an AI on their work so we never have to commission them again". It is incredibly complex to redefine commissions as waged employment in such a way that the company can use the work for AI training while the artist is ensured future employment. And then there's the issue of the companies that say "we'll just download their portfolio, then train an AI on the portfolio so we never have to pay them anything". All of the AI companies in existence fall into this category at present - they are making billions on the backs of labour they have never paid for, and have no intention of ever paying for. There seems to be no rush to say that they were actually employing those millions of artists, who are now owed back-pay for years' worth of labour and all the other rights that workers protected by labour laws should have.

[–] Even_Adder@lemmy.dbzer0.com 2 points 5 months ago* (last edited 5 months ago) (1 children)

I'm not fighting for the extremely wealthy; I'm fighting for the existence of competitive open source models, something that can't happen under what you've proposed. That would just hand corporations a monopoly on a public technology by making it prohibitively expensive for regular people to keep up with the megacorporations that already own vast troves of data and can afford to buy even more.

This article by Katherine Klosek, the director of information policy and federal relations at the Association of Research Libraries, does a good job of explaining what I'm talking about.

[–] frog@beehaw.org 2 points 5 months ago (1 children)

Taking artists' work without consent or compensation goes against the spirit of open source, though, doesn't it? The concept of open source relies upon the fact that everyone involved is knowingly and voluntarily contributing towards a project that is open for all to use. It has never, ever been the case that if someone doesn't volunteer their contributions, their work should simply be appropriated for the project without their consent. Just look at open source software: it is created and maintained by volunteers, and others contribute to it voluntarily. It has never, ever been okay for an open source dev to simply grab whatever they want to use if the creator hasn't explicitly released it under an applicable licence.

If the open source AI movement wants to be seen as anything but an enemy to artists, then it cannot just stomp on artists' rights in exactly the same way the corporate AIs have. Open source AIs need to have a conversation about consent and informed participation in the project. If an artist chooses to release all their work under an open source licence, then of course open source AIs should be free to use it. But simply taking art without consent or compensation with the claim that it's fine because the corporate AIs are doing it too is not a good look and goes against the spirit of what open source is. Destroying artists' livelihoods while claiming they are saving them from someone else destroying their livelihoods will never inspire the kind of enthusiasm from artists that open source AI proponents weirdly feel entitled to.

This is ultimately my problem with the proponents of AI. The open source community is, largely, an amazing group of people whose work I really respect and admire. But genuine proponents of open source aren't so entitled that they think anyone who doesn't voluntarily agree to participate in their project should be compelled to do so, yet that entitlement is at the centre of the open source AI community. Open source AI proponents want to have all the data for free, just like the corporate AIs and their tech bro CEOs do, cloaking it in the words of open source while undermining everything that is amazing about open source. I really can't understand why you don't see that forcing artists to work for open source projects for free is just as unethical as corporations doing it, and the more AI proponents argue that it's fine because it's not evil when they do it, the more artists will see them as being just as evil as the corporations. You cannot force someone to volunteer.

[–] Even_Adder@lemmy.dbzer0.com 1 points 5 months ago* (last edited 5 months ago) (1 children)

> Taking artists' work without consent or compensation goes against the spirit of open source, though, doesn't it?

It doesn't. Making observations about others' works is a well-established tool for researchers, reviewers, and people creating new works, and it's a concept that fits perfectly within the open source framework. That's all these models are: original analysis of the works in their training set in comparison with one another. Because it's a step one must necessarily take when doing anything, it doesn't require anyone's permission and is itself a right we all have.

[–] frog@beehaw.org 2 points 5 months ago (1 children)

When the purpose of gathering the data is to create a tool that destroys someone's livelihood, the act of training an AI is not merely "observation". The AIs cannot exist without using content created by other people, and the spirit of open source doesn't include appropriating content without consent - especially when it is not for research or educational purposes, but to create a tool that will be used commercially, which open source ones inevitably will be, given the stated purpose is to compete with corporate models.

No argument you can make will convince me that what open source AI proponents are doing is any less unethical or exploitative than what the corporate ones are. Both feel entitled to artists' labour in exchange for no compensation, and have absolutely no regard for the negative impacts of their projects. The only difference between CEO AI tech bros and open source AI tech bros is the level of wealth. The arrogant entitlement is just the same in both.

[–] Even_Adder@lemmy.dbzer0.com 1 points 5 months ago (1 children)

Giving all people a tool to help them more effectively communicate, express themselves, learn, and come together is something everyone should get behind.

I firmly believe in the public's right to access and use information, while acknowledging artists should retain specific rights over their creations. I also accept that the rights they don't retain have always enabled ethical self-expression and productive dialogue.

Imagine if copyright owners had the power to simply remove whatever wasn't profitable for them from existence. We'd be hindering critical functions such as critique, investigation, reverse engineering, and even the simple cataloging of knowledge. In place of all that good, we'd have an ideal world for those with money, tyrants, and all those who seek control, along with the undermining of the free exchange of ideas.

[–] frog@beehaw.org 1 points 5 months ago (1 children)

The problem is that undermining artists by dispersing open source AI to everyone, without a fundamental change in copyright law that removes power from the corporations as well as individual artists, and a fundamental change in labour law, wealth distribution, and literally everything else, just screws artists over. Proceeding with open source AI, without any other plans or even a realistic path to a complete change in our social and economic structure, is basically just saying "yeah, we'll sort out the problems later, but right now we're entitled to do whatever we want, and fuck everybody else". And that is the mindset of the tech bros, the fossil fuel industry, and so, so many others.

AI should be regulated into oblivion until such a time as our social and economic structures can handle it, i.e., when all the power and wealth has been redistributed away from the 1% and evenly into the hands of everyone. Open source AI will not change the power that corporations hold. We know this because open source software hasn't meaningfully changed the power they hold.

I'm also sick of the excuse that AI helps people express themselves, like artistic expression has always been behind some impenetrable wall, with some gatekeeper only allowing a chosen few access. Every single artist had to work incredibly hard to learn the skill. It's not some innate talent that is gifted to a lucky few. It takes hard work and dedication, just like any other skill. Nothing has ever stopped anyone learning that except the willingness to put the effort in. I don't think people who tried one doodle and gave up because it was hard are a justifiable reason to destroy workers' livelihoods.

[–] Even_Adder@lemmy.dbzer0.com 2 points 5 months ago

This isn't undermining artists; it's expanding access and knowledge, enabling individuals to take control of their own destinies. Open-source AI will empower artists, both existing artists and the newly active or returning artists who give this new medium a shot, by giving them new tools that will push the frontiers of self-expression and redefine creativity this decade.

100 years ago, photographers and filmmakers significantly disrupted the careers of most illustrators, storytellers, and theater companies of the time. Despite this, storytelling and image making exploded, entering a new golden age. Musicians panicked over the use of synthesizers in the '80s too, often refusing to work with people involved with synthesizers. As a result, there are fewer drummers today than in 1970, but out of that came hip hop and house. Suppressing that tool would have been a huge cultural loss. Generative art hasn't found its Marley Marl or Frankie Knuckles yet, but they're out there, and they're going to do stuff that will blow our minds. Cutting-edge tools and techniques have always propelled art and artists forward. Every advancement has been a leap forward, leaving behind constraints and enabling more people to pursue their creative aspirations.

That reminds me of a presentation I saw a little while back.

If you want to fight against people's right to freely communicate and express themselves, be my guest, but it's not a fight you can win.