this post was submitted on 11 Jul 2023

Asklemmy

[–] kromem@lemmy.world 1 points 1 year ago (1 children)

You think tracing for educational use, which is then never distributed such that it could not have a negative impact on market value, is infringement?

What the generative AI field needs moving forward is a copyright discriminator that identifies infringing production of new images.
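As a toy illustration of what such a discriminator might build on (my own hypothetical sketch, not an existing tool): a perceptual hash can flag outputs that are near-duplicates of known works. A real system would need learned embeddings, a database of protected works, and legally defensible thresholds; this only shows the core comparison step.

```python
# Hypothetical sketch: flag a generated image as potentially infringing if
# its perceptual "average hash" is very close to a reference work's hash.
# All names and thresholds here are illustrative, not from any real system.

def average_hash(pixels):
    """Hash a grayscale image (2D list of 0-255 values) into a bit list."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Count the bit positions where two hashes differ."""
    return sum(a != b for a, b in zip(h1, h2))

def looks_infringing(generated, reference, threshold=0.1):
    """Flag the generated image if its hash differs from the reference
    hash in at most `threshold` fraction of bits."""
    h1, h2 = average_hash(generated), average_hash(reference)
    return hamming_distance(h1, h2) / len(h1) <= threshold

# Tiny 4x4 "images": an exact copy is flagged, a very different one is not.
original = [[10, 200, 10, 200]] * 4
copy = [[10, 200, 10, 200]] * 4
unrelated = [[200, 10, 200, 10]] * 4
print(looks_infringing(copy, original))       # True
print(looks_infringing(unrelated, original))  # False
```

Of course, the hard part is exactly what this sketch dodges: deciding where "too close" legally begins.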

But I'll be surprised if cases claiming infringement on training in and of itself end up successful under current laws.

And yeah, most of the discussion around this revolves around US laws. If we put aside any jurisdiction, then there is no conversation to be had. Or we could choose arbitrary jurisdictions to support a position, for example Israel and Japan, which have already said training is fair use.

[–] Atemu@lemmy.ml 1 points 1 year ago (1 children)

> You think tracing for educational use, which is then never distributed such that it could not have a negative impact on market value, is infringement?

That's not what I think, that's what the law says.

I said what I think in the second paragraph. Sorry if I wasn't being extra clear on that.

> What the generative AI field needs moving forward is a copyright discriminator that identifies infringing production of new images.

Good luck with that.

> But I'll be surprised if cases claiming infringement on training in and of itself end up successful under current laws.

Depends. If the imitative AI imitates its source material too closely, that could absolutely be laid out as a distribution of copyrighted material.
Think about it like this: if I distributed a tarball of copyrighted material, that would be infringement, even though you'd need tar to unpack it. Whether you need a transformer or tar to access the material should make no difference in my layman's interpretation.
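To make the tarball half of that analogy concrete (a sketch of the lossless round trip, not a claim about how models work): the archive format is just an intermediate container, and the exact same bytes come out that went in.

```python
# Sketch: packing bytes into a tarball and unpacking them returns identical
# content, so the intermediate format does not change what is distributed.
# The file name and payload here are purely illustrative.
import io
import tarfile

payload = b"some copyrighted text"

# Pack the payload into an in-memory gzipped tarball.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tar:
    info = tarfile.TarInfo(name="work.txt")
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))

# Unpack it again: the bytes are exactly what went in.
buf.seek(0)
with tarfile.open(fileobj=buf, mode="r:gz") as tar:
    extracted = tar.extractfile("work.txt").read()

print(extracted == payload)  # True
```

Whether a trained model preserves its inputs this losslessly is exactly the disputed question, though.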

[–] kromem@lemmy.world 1 points 1 year ago

> That's not what I think, that's what the law says.

No, it doesn't. The scenario outlined squarely falls under fair use, particularly because of the non-distribution combined with research/education use. Fair use is not infringement.

> Good luck with that.

We'll see.

> Depends. If the imitative AI imitates its source material too closely, that could absolutely be laid out as a distribution of copyrighted material.

I mean, if we're talking about hypothetical models that only produce infringing material, you might be right.

But if we're talking about current models, which cannot reproduce their entire training set and can only reproduce individual training images in limited edge cases with extensive prompt effort, then I stand by being surprised (and by calling your tar metaphor a poor and misleading one).

If we're going with poor metaphors, I could offer the alternative that distributing or offering a cloud-based Photoshop isn't infringement even though it can be used to reproduce copyrighted material. And much like diffusion-based models, and unlike a tarball, Photoshop requires creative input and effort from the user in order to produce infringing material.