this post was submitted on 14 Jul 2023

Technology


No, this is not a Black Mirror episode.

[–] effingjoe@kbin.social 1 points 1 year ago (2 children)

the clients (no offense) tend to be less professional

I don't know what you mean by this precisely, but the "pretty good" end result I mentioned had a hand that melted into the sword-- so if you meant "low standard" then yeah, guilty as charged, haha. However, more interesting to me is that I would never in 1000 years have paid someone to do that for me-- I just would have been low-level annoyed that my character and the avatar looked different the entire game.

I find the "they didn't have permission to train from" argument is complete bunk. That's not a right granted by intellectual property laws; there is no "right to control who learns from a work".

What needs to happen is society (especially US society) needs to stop linking "working" and "enjoying a comfortable life". Technology is coming for all our jobs, and the sooner we accept that and prepare for it, the better we'll be when it happens.

[–] nicetriangle@kbin.social 2 points 1 year ago* (last edited 1 year ago) (1 children)

I don't know what you mean by this precisely

By that I mean that some dude looking for a game avatar or whatever isn't as likely to be someone used to contracting people to do professional creative work for them. Professional clients who are accustomed to hiring creatives to do work for them are more likely to:

  • be quick to provide feedback and respond to emails
  • have feedback that is clear and actionable
  • communicate professionally
  • pay on time
  • be willing to pay a down payment and sign a contract
  • comprehend the hard work that goes into this stuff and value my time accordingly
  • not try to push a project (either intentionally or otherwise) into what is known as "scope creep", wherein they jockey for additional work outside of the initially agreed-upon scope of work

And lots of other little things like that.

Am I saying that you are like that? No. But having done creative work for north of 15 years this is my informed opinion based on a lot of experience in this field.

I find the "they didn't have permission to train from" argument is complete bunk. That's not a right granted by intellectual property laws; there is no "right to control who learns from a work".

That's your opinion and you might feel differently if you had spent years working hard to achieve something in this specific field.

What needs to happen is society (especially US society) needs to stop linking "working" and "enjoying a comfortable life". Technology is coming for all our jobs, and the sooner we accept that and prepare for it, the better we'll be when it happens.

This I fully agree with. And I wouldn't even necessarily have a problem with AI destroying creative jobs if it meant I was now more free to pursue a life of spending time doing things that I was passionate about because some kind of UBI or whatever was making that possible for me and others.

Like I kinda mentioned earlier, I don't think society is in a good place to fight for this, at least in the short term. Basically not until things get really bad. What I expect will happen for now is that almost all of the windfalls from automation will be siphoned up to the upper class and corporations and wages will continue to stagnate for the working class and income inequality will continue to skyrocket.

[–] effingjoe@kbin.social 2 points 1 year ago (1 children)

That's your opinion and you might feel differently if you had spent years working hard to achieve something in this specific field.

It's not really an opinion; it's just not a right granted by IP laws. I know that people that are financially dependent on this type of work really wish they had this right-- and I fully accept that if I were in the same boat, I would probably also wish I had this right, but that doesn't magically add it to the law.

All the lawsuits you see popping up are Hail Marys; they'll very likely all lose.

some kind of UBI or whatever was making that possible for me and others.

Something like this, set at a level that allowed a comfortable life (versus an austere one) would totally flip the whole employment dynamic. The pay for the worst jobs would skyrocket, because no one wants to do those jobs-- they only do them now to stave off starvation and homelessness.

siphoned up to the upper class and corporations and wages will continue to stagnate for the working class and income inequality will continue to skyrocket

I can't help but agree, with sorrow. I imagine it won't get better (in the US, at least) until it impacts the wealthy-- as in, there aren't enough people getting paid to buy the stuff that is getting created by automation. Capitalism needs money flowing to the bottom (traditionally, a wage) to sustain itself. If that flow of money dries up, the whole system collapses. We can either fix it by abandoning capitalism, or by patching capitalism by finding a way for money to flow down other than by wages. (A UBI, for example)

[–] nicetriangle@kbin.social 1 points 1 year ago* (last edited 1 year ago) (1 children)

In my earlier comment in this thread I said my dilemma with using AI for my work was an ethical one, not a legal one. Ethics/morals inform laws for sure, but I think you'd agree that not everything that's technically legal is also ethical. Especially so in a country like the US.

I think a lot of people would also agree that ethics are to some extent individual. Meaning that what I find ethical or not is going to differ from others. So whether or not this is all legal doesn't mean that it's going to jive with my personal view of what is ethical.

That dilemma is my own. Whether or not congresspeople who have a weak grasp of both technology and the arts think one way or another on the matter is a poor yardstick for one's own moral code of conduct, in my book.

In any case, good chat. I appreciate that while we don't agree on everything we kept it civil. Now back to work for me (before it gets taken by a robot).

[–] effingjoe@kbin.social 3 points 1 year ago (1 children)

I understand that you may not reply because you feel the discussion has run its course, but I wanted to clarify that I was, indeed, not following that you were speaking from a personal morality standpoint. Sorry about that.

[–] Ragnell@kbin.social 1 points 1 year ago* (last edited 1 year ago) (1 children)

I find the "they didn't have permission to train from" argument is complete bunk. That's not a right granted by intellectual property laws; there is no "right to control who learns from a work".

Yeah, but is an AI LEGALLY learning? Or is it just a machine that spits out output based on its inputs? In that case, use of the work as machine input isn't allowed under the copyright, which only covers the work being used by reading it.

All these comparisons between what an AI is doing and what a human does when reading/learning/etc are not a given in a court of law. We don't have any rulings yet that an AI is actually "learning" like a human when it is "trained."

"Training" an AI is building a tool. A tool that can be used to profit. Can artistic works be used to build a for-profit tool without permission?

This is something that needs to be decided, and it will be decided in a way such that whatever the rules are for AI can't be applied to a human. Meaning that if permission is required for use in machine learning, that won't change the fact that a human can still learn from the work. So the comparison is pointless, because there is no way the courts are going to rule that these things are legally indistinguishable from people.

In the meantime, back to the original point: there ARE precedents for the use of a performance via recordings. That's why the studios wanted that in the contract; they KNOW they cannot manipulate a person's performance through AI without their express written permission. Is it REALLY so hard to believe this can be applied to writing or art-- that they can't use writing or art without the artist's express permission?

We may see a new kind of copyright soon that specifically disallows use for AI, and another that is open for use with AI. Something to replace Creative Commons on the internet.

[–] effingjoe@kbin.social 1 points 1 year ago (1 children)

There simply isn't a right to control even training. That's just not a thing. It would need a change to the law.

[–] Ragnell@kbin.social 1 points 1 year ago (1 children)

All right, I am not a lawyer, but I've been around the internet long enough to know there is arguably a right to control learning and training. The Fair Use doctrine SPECIFICALLY allows for educational use, which means the default is that, otherwise, such use would not be allowed.

A judge could easily rule that AI training is not covered under Fair Use, as it is being used to create a profitable tool.

[–] effingjoe@kbin.social 1 points 1 year ago* (last edited 1 year ago) (1 children)

That's a right to make copies and distribute them for educational purposes. This specifically doesn't involve distribution of any kind. Arguably copyright law doesn't even apply, but even under the broader term of "intellectual property", it doesn't hold up, even without trying to make a comparison between humans learning and AI training (which is more of an analogy).

Edit: and to be fair, I'm not a lawyer either, but IP law (especially regarding how terrible it has become) is kind of a hobby of mine. But I can't claim to be any type of authority on it.

[–] Ragnell@kbin.social 2 points 1 year ago (1 children)

Okay, well my hobby is ethics.

And the thing is, if they are using works written by others to build an AI for profit without permission, that's exploitation. Copyright law is horrible and exploited by corporations constantly. That doesn't mean we shouldn't cheer on the little guy when they try to use it to defend against exploitation by corporations. Because the big tech companies are exploiting creatives in their drive to build and sell this tool. They are exploiting creatives to make their replacements. So I'm going to push back on any comparison analogy.

Whatever the actual basis of the lawsuits against the AI companies, actual lawyers do think there's a basis in IP law to sue, given that a few high-profile lawsuits have been filed. And clearly there is some legal basis to sue if they use AI to create from performances, or this contract would not have been proposed.

[–] effingjoe@kbin.social 1 points 1 year ago (1 children)

If you're leaning on morality, then the comparison to humans becomes relevant again.

Lawyers taking a high profile case is not any indication to go by.

I could be off base here, but are you financially impacted if AI starts making commercial art? Like, is that how you make income, too?

[–] Ragnell@kbin.social 1 points 1 year ago* (last edited 1 year ago) (1 children)

I have skills besides technical writing, but it's one of the things I rely on to get hired. So yeah, I'm partially on the chopping block-- ahead of creative writers, even. And it's a serious problem that all the writing I've done on the internet is being used to train AI.

But the thing about the comparison to humans in morality is there'll be a line that gets crossed. And once that line is crossed, you can't OWN an AI anymore, and you certainly can't sell it. Up until then, you have to treat it as a tool.

The end solution is going to be something along the lines of a Creative Commons license where you specify whether your work can be used to train AI, can't be used at all, or can only be used to train non-profit AI.

[–] effingjoe@kbin.social 1 points 1 year ago (1 children)

I don't follow why calling it a tool matters. If a python script renders someone's job redundant (hypothetically; this is unlikely in reality) does it matter if the script was written by a human or an LLM?

[–] Ragnell@kbin.social 1 points 1 year ago (1 children)

@effingjoe I imagine it matters to the person who wrote it. Were THEY paid for this?

I mean, it's a shitty thing that consultants and such eliminate jobs, but at the very least the exploitation there is only on one side-- the poor guy kicked out. If an LLM is removing someone's job, then the people whose work was used to train the LLM are getting exploited too.

Plus, a certain amount of the law is for deterrence. We don't want the companies replacing creatives with AI. It would be beneficial to discourage that. We DO want things like fruit-picking and weeding and other backbreaking manual labor replaced by AI, so we can push for laws that encourage THAT. But right now they are trying to replace the wrong end.

[–] effingjoe@kbin.social 1 points 1 year ago (1 children)

You're going to need to strictly define "exploited", I think. I don't know what you mean when you use that term.

If I read a book on Python and write a script to replace someone's job, did I exploit the person who wrote the book? What about the people that created and/or maintain python?

Why don't we want companies replacing creatives with AI? Should we roll back other technological advances that resulted in fewer humans being employed? No human routes phone calls anymore, but they used to. Should their jobs be protected, too? What about people that used to carve ice out of mountain lakes and deliver it to businesses? Should refrigeration technology be held back by the law to protect those jobs? If not, why artists? What makes them more deserving of being protected?

[–] Ragnell@kbin.social 1 points 1 year ago (1 children)

Intent is a big deal in this one. Your Python book writer intended for people to read it to learn Python.

A romance book writer does not intend for an AI to use it to learn to generate sentences. But because there was no obvious barrier and they could get away with it, the companies grabbed the romance book and used it. That's an exploit.

And again, you're ignoring the quality of labor. Back-breaking jobs that hurt people's health should be improved with technology. A migrant worker might lose his job to a mechanical fruit picker, but he's likely bilingual and eligible for a translator job. Unless that job-- which is better for health and longevity, and allows someone to stay in one place-- is taken by an AI.

The promise of automation was that it would RAISE the quality of human life. Taking away the jobs of creatives lowers the quality of human life. Using automation to carve ice out of mountain lakes raises the quality of human life. Things are not neutral here.

The large companies want to keep manual labor in human hands and put creative work and decision making in AI hands. This is going to make life worse for everyone.

[–] effingjoe@kbin.social 1 points 1 year ago (1 children)

Intent is a big deal in this one. Your Python book writer intended for people to read it to learn Python.

I really don't see where intent falls into this, still-- but feel free to change the hypothetical to looking at other people's python code to learn how to use python. It still doesn't change the equation. Did I exploit the people who wrote the python code that I learned from? Does my source of learning matter when it comes to what I produce? Do you really believe that artists create new art in a total vacuum, without drawing inspiration from prior art?

Back-breaking jobs that hurt people's health should be improved with technology. A migrant worker might lost his job to a mechanical fruitpicker but he's likely bilingual and eligible for a translator job. Unless that job, which is better for health and longevity, and allows someone to stay in one place, is taken by an AI.

I am somewhat stunned by the obvious bias you seem to have against manual labor. You really think having an active job is less healthy than sitting in a chair, looking at a screen all day? (Please note: 90% of my job is sitting in a chair, staring at a screen all day.)

There was no "promise of automation". Technology was always going to take everyone's jobs-- the only change is the order it has taken it in. It was assumed that human creativity was some special thing that was so difficult to define in software that it would be towards the end when it came to getting replaced, but it turns out that we're a lot more like computers than we believe, and you can train software-- with relative ease-- to figure out how to achieve an end result without explicitly defining how.

Large companies want to reduce overhead, increase productivity, and maximize profit. I assure you there's no bias as to what kind of jobs get replaced when it comes to those goals. It just happens that creative jobs seem to be easily replaced.

Do you really, honestly, think that it's even possible to hold back a technological advance using legislation? You can already host your own LLMs and train them on whatever material you desire, to better tailor their output. That's today. Even if we assume, for the sake of argument, that the law decides people have a "right" to control how their art is consumed (again, very unlikely imo), that won't even slow down the people spinning up their own instances. And even if they follow the rules, how much worse do you really think the models would be using only public domain and open-source training materials?

[–] Ragnell@kbin.social 1 points 1 year ago* (last edited 1 year ago) (1 children)

I don't know, but there is no good reason to just sit there while the rich replace all the creatives with machines built using their work.

Hell, even if your point about manual labor being replaced is just as bad, I can think of a company that makes you BUILD the robot that will replace you before you get fired.

But honestly, there's a reason people aspire to music and art and not to moving boxes at the dock, and it's not because moving boxes is low-class work but because it is backbreaking, unpleasant labor where you don't get to express yourself. Music, art, and writing are forms of self-expression, and some of the few kinds of work where self-expression is possible. So those jobs should be preserved.

Maybe we can't get some sort of justice in the system, but people should at least try.

[–] effingjoe@kbin.social 1 points 1 year ago* (last edited 1 year ago) (1 children)

You still seem to have a very specific idea of what manual labor is. You may not think you'd find manual labor jobs fulfilling or expressive, but that doesn't mean no one does.

Could it be that you care more about creative jobs because you have one, and if you had a manual labor job you'd be arguing the opposite?

Edit: what, specifically, does "justice" in the system look like to you?

[–] Ragnell@kbin.social 1 points 1 year ago (1 children)

Maybe.

Edit: An end to large tech companies scraping the internet and collecting everyone's data with impunity.

[–] effingjoe@kbin.social 1 points 1 year ago* (last edited 1 year ago) (1 children)

I caution you from repeating phrases you've read but don't fully understand. "Scraping the internet and collecting everyone's data" is just how the internet works. It's certainly how every single search engine works. (even privacy focused ones, like duckduckgo). If you don't want something to be read or viewed on the internet, you shouldn't put it on the internet.

[–] Ragnell@kbin.social 1 points 1 year ago (1 children)

I don't mind people reading what I put on the internet, I mind them selling it and I mind them making machines to replace my work.

The problem of course, is that we don't actually have control over what goes on the internet, do we? I can write a book, and it could be put online. I put a picture online, the metadata shows my location. This may be "how the internet works" but it doesn't have to. There could be laws protecting us.

There could even be laws protecting us that exist right now, but are being ignored.

So here's my question for you. Corporations like OpenAI and Facebook/Meta and Google put millions into lobbying and public relations to discourage us from even looking into registering our problems with what they do. Why argue their side for free?

[–] effingjoe@kbin.social 1 points 1 year ago (1 children)

You are trying to muddy the water, but I don't see why.

We aren't talking about location exif data on pictures, and people can and should strip that off (there are tools to do so) before posting online, but that has nothing to do with LLMs and their like. Privacy violations are certainly fair game for legislation-- but as you are finding out, you don't get a say in how people consume your work. I could buy your work and burn it, or read it to my dog, or put it on a shelf, or study it daily to better learn how to make similar works. Once you make it available for public consumption, the public can consume it, even if that consumption eventually hurts you financially.
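For what it's worth, stripping that location data takes only a few lines. Here's a minimal sketch using the third-party Pillow library (assuming it's installed; `strip_metadata` is just an illustrative name) that rebuilds the image from its raw pixel data, so the EXIF/GPS tags get left behind:

```python
from PIL import Image  # third-party: pip install Pillow

def strip_metadata(src_path: str, dst_path: str) -> None:
    """Copy only the pixel data into a fresh image, dropping EXIF/GPS tags."""
    with Image.open(src_path) as img:
        # A brand-new image starts with no metadata at all.
        clean = Image.new(img.mode, img.size)
        clean.putdata(list(img.getdata()))
        clean.save(dst_path)
```

Rebuilding from pixel data is blunt but reliable; dedicated tools like exiftool offer finer-grained control if you want to keep some tags.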

One of the many problems with IP laws is that it is so ingrained in our society that people who benefit from it directly forget that it's not all encompassing, nor is it a law of nature. For instance, I am free to make a drawing of the main characters in Stranger Things, drawn in the style of The Simpsons. That violates no IP laws. If a computer learns a specific style of painting from a specific artist and can recreate that style on command, there is still no violation of IP laws, just as it would be if a human did it. And it's plausible (though, unlikely) that someone could learn a specific style of animation (like, the simpsons) and then go on to replace the originator of that style in the show. Styles aren't copyrightable.

Your job is very likely to be replaced, and there's very likely nothing you can do about it. That's the bottom line. Mine may be as well-- I am in the field of Software QA right now, for military robots. I feel like my time to be replaced isn't quite here, but I can't imagine it's that far off. Acknowledging this is just prudent.

[–] Ragnell@kbin.social 1 points 1 year ago (1 children)

Dude, the key to your freedom to draw Stranger Things characters in the style of the Simpsons is that you don't make a shitton of money doing that and you don't compete with the actual Simpsons production.

But an AI is making a shitton of money for companies AND competing with the writers it was trained on. So the stuff that makes it legal for you to draw Stranger Things characters in the style of the Simpsons does not apply to AI.

Yeah, even with copyright protections they will probably deploy AI for tech writing, and my job will become editing all the mistakes in the output. But prudence doesn't mean just letting someone take advantage of you, and it doesn't mean you have to like it, and it doesn't mean you have to stop bitching about it, and it doesn't mean you have to accept rosy-lensed pro-AI arguments on the internet.

[–] effingjoe@kbin.social 2 points 1 year ago (1 children)

You have taken a turn towards anger, if I'm reading the context right, but I'm just the messenger. I also wasn't expecting you to like it, and I guess you can complain all you'd like, but I was hoping to help you accept it. To be sure, whether or not you accept it won't change whether it happens.

Anyway, thanks for the discussion.

[–] Ragnell@kbin.social 1 points 1 year ago* (last edited 1 year ago)

It's not anything about you, I'm just a passionate person. I appreciate the discussion too, but I think you have identified the point where it's become unproductive.