this post was submitted on 20 Sep 2023
555 points (95.6% liked)

Technology

[–] AbouBenAdhem@lemmy.world 54 points 1 year ago (20 children)

The authors added that OpenAI’s LLMs could result in derivative work “that is based on, mimics, summarizes, or paraphrases” their books, which could harm their market.

Ok, so why not wait until those hypothetical violations occur and then sue?

[–] NotAPenguin@kbin.social 31 points 1 year ago (1 children)

People can do that too, are they gonna sue all people?

[–] Noumena@kbin.social 30 points 1 year ago

I have nipples Greg, could you sue me?

[–] snooggums@kbin.social 0 points 1 year ago (3 children)

Because suing first is meant to address the potential outcome of what OpenAI is doing right now. Kind of like how safety regulations are intended to prevent future problems based on what has happened previously, but expanded to cover similar potential dangers instead of waiting for each exact scenario to happen.

[–] c0mbatbag3l@lemmy.world 23 points 1 year ago* (last edited 1 year ago) (9 children)

The difference is that you're trying to sue someone based on what could happen. That's like suing some random author because they read your book and could potentially write a story that copies it.

LLMs are trained on writings in the language and learn how to structure sentences based on their training data. Do AI models plagiarize any more than someone using their understanding of the English language is plagiarizing when they construct a brand-new sentence? After all, we learn how to write by reading the language and learning the rules; is the training data we read as kids being infringed whenever we write about similar topics?

When someone uses AI to plagiarize, sue them into eternity for all I care, but no one seems concerned with the implications of trying to sue someone/something because they trained an intelligence by letting it read publicly available written works. Reading and learning isn't allowed because you could maybe one day possibly use that information to break copyright law.

[–] Very_Bad_Janet@kbin.social -2 points 1 year ago (1 children)

I see this more like suing a musician for using a sample of your recording, or a certain number of notes or lyrics from your song, without your consent. The musician created a new work, but it was based on your previous songs. I'm sure if a publisher asked ChatGPT to produce a GRRM-like novel, it would create a plagiarism-lite mash-up of his works that were used as writing samples, using pieces of his plots and characters, maybe even quoting directly. Sampling GRRM's writing, in other words.

[–] Fosheze@lemmy.world 7 points 1 year ago (3 children)

Except doing all of that is perfectly legal. With music it's called a remix or a cover. With stories it's called fanfic.

If the AI were exactly replicating an artist's works, that would be copyright infringement without a doubt. But the AI isn't doing that, and it likely isn't even capable of doing that.

[–] uriel238@lemmy.blahaj.zone 23 points 1 year ago (4 children)

But if OpenAI cannot legally be inspired by your work, the implication is humans can't either.

It's not how copyright works. Transformative work is transformative.

[–] radix@lemmy.world 11 points 1 year ago (2 children)

The way I've heard it described: If I check out a home repair book and use that knowledge to do some handy-man work on the side, do I owe the publisher a cut of my profits?

[–] admin@lemmy.my-box.dev 5 points 1 year ago

If one person, without asking for permission, used my work to learn from and taught themselves to replicate it, I'd be honoured. If somebody were teaching a class full of people to do that, I'd have objections. So when a company is training a machine to do that very same thing, and will be able to do it thousands of times per second, again without asking for permission first, I'd be pissed.

[–] agent_flounder@lemmy.one 0 points 1 year ago (1 children)

That's a terrible analogy.

Reading a book designed to instruct you how to do tasks is not the same thing as training generative AI with novels, say, to write a novel for you.

The user of the AI benefits from the work and talent of the authors with little effort of their own.

[–] Honytawk@lemmy.zip 5 points 1 year ago (1 children)

So how about someone who loves to read, wants to become a writer, and uses the plot twists, characters, environments, and writing style of books they've already read?

Does that fall under copyright?

[–] agent_flounder@lemmy.one 1 points 1 year ago

Depends on how close it is... But at least they are doing the effort of writing vs merely coming up with prompts for the AI.

[–] kibiz0r@midwest.social 5 points 1 year ago (2 children)

How is that the implication?

Inspiration is something we do through conscious experience. Just because some statistical analysis of a word cloud can produce sentences that trick a casual observer into thinking a person wrote them doesn’t make it a creative process.
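What the comment calls "statistical analysis of a word cloud" can be illustrated with a toy sketch: a bigram Markov chain that only ever recombines word transitions it saw in training. (This is a vastly simplified stand-in for an actual LLM; the corpus and variable names here are made up for illustration.)

```python
import random

# Toy "language model": count which word follows which in a tiny
# training corpus, then sample new text from those counts.
corpus = "the cat sat on the mat the dog sat on the rug".split()

follows = {}
for a, b in zip(corpus, corpus[1:]):
    follows.setdefault(a, []).append(b)

random.seed(0)  # deterministic sampling for the sketch
word = "the"
out = [word]
for _ in range(5):
    # Fall back to the whole corpus if a word was never seen mid-sentence.
    word = random.choice(follows.get(word, corpus))
    out.append(word)

print(" ".join(out))
```

Every word the sketch emits was already in the training data; it can produce sentences the corpus never contained, but only by reshuffling observed transitions, which is the gist of the "word calculator" objection (real LLMs are far more sophisticated, but the debate is over whether that difference is one of kind or only of degree).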

In fact, I can prove to you that (so-called) AI can never be creative.

To get an AI to do anything, we have to establish a goal to measure against. You have to quantify it.

If you tell a human being “this is what it means to be creative; we have an objective measure of it”, do you know what they tend to do? They say “fuck your definition” and try to make something that breaks the rules in an interesting way. That’s the entire history of art.

You can even see that playing out with respect to AI. Artists going “You say AI art can’t be art, so I’m gonna enter AI pieces and see if you can even tell.”

That’s a creative act. But it’s not creative because of what the AI is doing. Much like Duchamp’s urinal wasn’t a creative object, but the act of signing it R Mutt and submitting it to a show was.

The kinds of AIs we design right now will never have a transformative R Mutt moment, because they are fundamentally bounded by their training. They would have to be trained to use novel input to dismantle and question their training (and have that change stick around), but even that training would then become another method of imitation that they could not escape. They can’t question quantification itself, because they are just quantitative processes — nothing more than word calculators.

[–] uriel238@lemmy.blahaj.zone 4 points 1 year ago (3 children)

Those rules or objectives exist for human artists too. They're just fluid, and human artists try to break them, or test the limits of stated rules to find the edges of the envelope of what counts as art. And more often than not (95%, according to Theodore Sturgeon) they fail to sell, which could be from exceeding the boundaries of the expected art threshold, or just from doing it poorly.

Now you could argue (and I think you might be arguing) that creative acts and inspiration are both properties of personhood: that which we regard as a person can do art. If it's done by nature, by a non-person animal (e.g. the Monkey Selfie), or by a mechanical process, it doesn't count as a creative act, as inspiration, or as art. I get it, just as someone who uses a toaster to warm up Pop-Tarts is not regarded as actually cooking. That said:

a) You'd have to make that assertion by fiat. And your definition doesn't count for anyone else unless you pass a law or convince art-defining social groups to adhere to it.

b) Capitalist interests don't care. If it's cheaper to have AI design their website or edit their film, and it does an adequate job compared to hiring an artist, they're going to do it. Even if we make it illegal to use some works to train AI, that won't stop those works from leaking through via information-technology services that scrape the web. Similarly, ALPR companies use traffic cameras to track you in your car, determine your driving habits, and sell that information to law enforcement, who are totally violating your Fourth Amendment rights when they use it; that doesn't stop them, and the information is used in court to secure convictions.

c) It's not artists who control intellectual property but publishing companies, and they've already been looking to churn out content as product, the results of which we've seen in most blockbuster cinema offerings. The question is not whether Fast & Furious XXIII is art but whether people will pay to watch it. And IP law has long been rotted to deny the public a robust public domain, so we can expect they'll lobby our representatives until they can still copyright content that is awash with AI elements.

Ultimately the problem is also not whether artists get paid for their work doing art. It's that most of us are desperate to get paid for anything, and so it's a right privilege when that anything is doing something arty. The strikes, the lawsuits: these are survival precarity talking. If we didn't have to worry about that (say, in an alternate reality where we had a robust UBI program), AI replacing artists would be a non-issue. People would continue to create for the sake of creation, as we saw during the epidemic lockdown of 2020 and the consequent Great Resignation.

Generative AI is not at the magical level that managers, OG artists, and capitalists think it is, but short of war, a food crisis, or our servers getting overrun by compound foul weather, it's going to get better, and eventually AI will outpace Theodore Sturgeon's ratio of quality material to crap. That isn't what's going to confine human-crafted content to the fan-art section. It's that our shareholder-primacy-minded capitalist masters are looking to replace anyone they pay with a cheaper robot, and will at the first opportunity. That's the problem we have to face right now.

[–] kibiz0r@midwest.social 2 points 1 year ago

I'm not the same person as @snooggums@kbin.social, but it did look like they were replying on my behalf, so I understand the assumption. No worries there.

I agree with what you're saying.

I would just wanna clarify that you're primarily talking about "art as a marketable commodity" and the societal problems with how that interacts with AI development, where I was talking primarily about "art as a cultural message" and the fundamental inability of AI to cross the threshold from "art as a product" to "art as a message" because the model itself has nothing to message about. (With the caveat that a person may use the AI's product as a message, but then the meaning comes from the person, not the AI.) I think we agree with each other here.

Btw, and you probably already know this, Cory Doctorow has some really sharp insights and recommendations when it comes to the past, present, and future of IP law and how we might be able to protect creators going forward.

I do wanna respond to something that wasn't really directed at me, just cuz it overlaps with my original comment and I think it's kind of interesting:

Again, you can say by fiat an AI has the personhood of a toaster, but that doesn't make the content it creates lower quality or less real. And given how often in the past we've disparaged art for being made by women, by non-whites, by Jews, we as a social collective have demonstrated that our opinion is easily biased to arbitrarily favor the sources we like.

You’re not going to find any way to objectively justify including only human beings as qualified to make art.

You're right that, without an objective measure of what counts as an artistic endeavor, we're permitted to be as discriminatory as we feel like being. Which seems... not great, right?

But I don't think you ever can make an objective measure of what counts as art, because art is like the observable physical effect of something that's going on in our consciousness -- an immaterial world that can't directly map 1:1 with the physical world.

So I think art is always destined to be this amorphous thing that you can't exactly pin down. It's maybe more of a verb than a noun. Like I can't look at an inert object sitting on a table and figure out that it's art. But if someone tells me that this is the last sculpture their aunt made before she died and she started it when she felt fine, but by the end she could barely hold her hands still, and she never finished it... Well, suddenly I catch a glimpse of the conscious experience of that person. And it's not that her conscious experience was baked into the object, but that I can imagine being in her place and I can feel the frustration of the half-finished angles and the resignation of staring at it after touching it for the last time.

Yes, there is a real history of people saying "Those savages aren't conscious", or that they are technically conscious but a "lower" kind of consciousness. And I know it makes us uncomfortable to think we might do that again, and so I think some of us have developed a reflex to say we need to make an objective rational view of the world so that human subjectivity doesn't come into it and poison things... But I don't think it's possible, as long as the nature of consciousness remains a mystery to us.

And I also think if we do come to agree on a rationalist framework for living, we will have lost something. Once you have rules and measures, there's no room for... well, for lack of a better word, "soul". I'm an atheist, but I'm also conscious. And I don't think that the totality of my conscious experience is somehow quantifiable, or especially that if we could replay those exact quantities then it's just as good as consciousness. Like, I am experiencing something here, and there's no good reason to think that matter precedes consciousness and not the other way around.

I'm rambling now, but you get what I mean?

[–] archomrade@midwest.social 2 points 1 year ago

Ultimately the problem is also not whether artists get paid for their work doing art. It's that most of us are desperate to get paid for anything, and so it's a right privilege when that anything is doing something arty. The strikes, the lawsuits: these are survival precarity talking. If we didn't have to worry about that (say, in an alternate reality where we had a robust UBI program), AI replacing artists would be a non-issue. People would continue to create for the sake of creation, as we saw during the epidemic lockdown of 2020 and the consequent Great Resignation.

This is a perfect framing for this discussion. I think people are pissed that AI disrupts this economic model of compensating creators, but the problem isn't AI; it's the economic model.

I think this is also the conversation people like Altman were hoping to have around AI (sorry if that's too much benefit of the doubt for him), I think enthusiasts hope AI can help transition us to a more equitable economy. People are (rightly) concerned that instead of bringing about economic change, AI will further consolidate economic forces and make life even more miserable for everyone. Throwing copyright law at the problem to me seems like a desperate attempt to keep the boat afloat.

[–] snooggums@kbin.social 0 points 1 year ago (1 children)

I am saying AI won't have biological living experiences, only abstract concepts of biological living experiences that are fed into it.

You are reading way more into my point than my actual point. Another way of saying it: we can try to understand a dog and explain why dogs do what they do, but we are not actual dogs and cannot draw on the actual experience of being a dog when creating art. Or, someone will never know the exact experience of a person of another race, even though they can understand the concepts of the differences. Experience is different from understanding an abstract.

[–] uriel238@lemmy.blahaj.zone 3 points 1 year ago* (last edited 1 year ago) (1 children)

Firstly, is @snooggums@kbin.social the same as @kibiz0r@midwest.social? I was responding to the latter, so when you say "I am saying" (the implicit format being: "when I said X, I meant Y"), I don't know which part of which reply fulfills X, unless you just mean to be emphatic (e.g. "He's mad! Mad, I tell you!"). So my thread context is lost.

Secondly, the AI's lack of human experience seems irrelevant. Human artists commonly guess at what dogs think and feel, what it is to be a racial minority, another sex, or whatever it is to not be themselves. And we're not great at it. AI guessing at what it is to be human doesn't have a high bar to overcome. We depend on abstracts and third-party information all the time to create empathizable characters.

For that matter, among those empathizable characters, synthetic beings are included. The whole point of Blade Runner 2049 is that everyone, synthetic or otherwise, is valid, is deserving of personhood.

Again, you can say by fiat an AI has the personhood of a toaster, but that doesn't make the content it creates lower quality or less real. And given how often in the past we've disparaged art for being made by women, by non-whites, by Jews, we as a social collective have demonstrated that our opinion is easily biased to arbitrarily favor the sources we like.

You're not going to find any way to objectively justify including only human beings as qualified to make art.

[–] snooggums@kbin.social 1 points 1 year ago

Well, I am not saying that only humans can make art. I think a lot of other animals are fully capable of making art, even if we frequently call it instinct. Hell, bird mating rituals are better displays of physical dancing than humans in a lot of cases!

I am saying that what we currently call AI, which is just mashing together existing art rather than creating anything new or with any kind of complex emotion, will make technically competent art that lacks the depth and background commonly associated with art.

[–] gmtom@lemmy.world 1 points 1 year ago (1 children)

I really wish you lot would educate yourselves on AI and the history of AI creativity and art before convincing yourselves you know what you're talking about and giving everyone your Hot Take.

[–] kibiz0r@midwest.social 2 points 1 year ago

Can you elaborate? "AI and the history of AI creativity and art" is a pretty broad scope, so I'm sure I have some massive blind spots within it, and I'd love some links or summaries of the areas I might be missing.

[–] AbouBenAdhem@lemmy.world 6 points 1 year ago (1 children)

Safety regulations are created by regulatory agencies empowered by Congress, not private parties suing each other over hypotheticals.

[–] snooggums@kbin.social -1 points 1 year ago (1 children)

It was a comparison about preventing future issues, not a literally equivalent legal situation.

[–] AbouBenAdhem@lemmy.world 3 points 1 year ago* (last edited 1 year ago) (1 children)

The difference is that, to sue someone, you have to demonstrate that they were acting outside of existing laws and caused you real harm. Case law was never intended to proactively address hypothetical future scenarios—that’s what lawmakers and regulators are for.

[–] snooggums@kbin.social 0 points 1 year ago

In this case they are suing based on current copyright infringement by OpenAI, justified by predictable outcomes. Like how you can sue someone who is violating zoning ordinances, using predictable negative outcomes from similar cases to justify the urgency of making them stop now, instead of just trying to recover money after things get even worse.
