this post was submitted on 08 Dec 2024
211 points (96.9% liked)

Technology

all 42 comments
[–] Max_P@lemmy.max-p.me 111 points 8 months ago (3 children)

They'd get sued whether they do it or not, really. If they don't, they get sued by those who want privacy-invasive scanning. If they do, they'll get sued when they inevitably land someone in hot water for taking pictures of their naked child for the doctor.

Protecting children is important, but it can't come at the cost of violating everyone's privacy and making everyone guilty until proven innocent.

Meanwhile, children just keep getting shot at school and nobody wants to do anything about it, but oh no, we can't do anything about that because muh gun rights.

[–] 0x0@programming.dev 14 points 8 months ago

Makes me wonder if the lawsuit is legit, or if it's some "But think of the children"™ institution using some rando as cover.

because muh gun rights.

I think it's a bit more complicated. These are worth a watch at least once:
Let's talk about guns, gun control, school shooting, and "law abiding gun owners" (Part 1)
Let's talk about guns, gun control, school shooting, and "law abiding gun owners" (Part 2)
Let's talk about guns, gun control, school shooting, and "law abiding gun owners" (Part 3)

[–] john89@lemmy.ca 4 points 8 months ago* (last edited 8 months ago)

If people really cared about protecting children, we could always raise taxes on the wealthy or cut military spending to fund new task forces to combat the production and spread of child pornography!

Heck, the money spent on this lawsuit could be spent catching people producing CSAM instead.

[–] schizo@forum.uncomfortable.business 108 points 8 months ago* (last edited 8 months ago) (2 children)

First: I'm not in any way intending to cast any negative light on the horrible shit the people suing went through.

But it also kinda feels like a lawyer convinced a victim they could get paid if they sued Apple, because Apple has lots of money.

If you were really serious about suing to force change, you've literally got:

  1. X, who has reinstated the accounts of people posting CSAM
  2. Google/Youtube, who take zero action on people posting both horrible videos AND comments on said videos routinely
  3. Instagram/Facebook, which have much the same problem as X with slow or limited action on reported content

Apple, at least, will take immediate action if you report a user to them. So maybe they should reconsider who the best target is, if their intent really is to remove content, and spend some time on all the other giant corpos that are either literally actively doing the wrong thing, doing nothing, or sitting there going 'well, akshully' at reports.

[–] Chozo@fedia.io 44 points 8 months ago (2 children)

Google/Youtube, who take zero action on people posting both horrible videos AND comments on said videos routinely

I used to share an office with YouTube's content review team at a previous job and have chatted with a bunch of them, so I can give a little insight on this side. For what it's worth, YT does take action on CSAM and other abusive materials. The problem is that it's just a numbers game. Those types of reports are human-reviewed. And for obvious reasons, it's not exactly easy to keep a department like that staffed (turns out you really can't pay people enough to watch child abuse for 8 hours a day), so the content quickly outnumbers the reviewers. Different types of infractions will have different priority levels, and there's pretty much always a consistent backlog of content to review.
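
For what it's worth, here's a toy sketch of that "numbers game" (purely illustrative; not YouTube's actual tooling, and the categories and priorities are made up): reports land in a priority queue by severity, and a fixed pool of human reviewers drains it more slowly than it fills.

```python
# Purely illustrative sketch of a prioritized review backlog, not YouTube's
# actual tooling: reports are queued by severity, and human reviewers drain
# the queue more slowly than new reports arrive.
import heapq

PRIORITY = {"csam": 0, "violence": 1, "harassment": 2, "spam": 3}  # lower = reviewed first

backlog: list[tuple[int, int, str]] = []  # (priority, report_id, category)

def file_report(report_id: int, category: str) -> None:
    heapq.heappush(backlog, (PRIORITY[category], report_id, category))

def review_next() -> tuple[int, str] | None:
    """A human reviewer takes the most urgent report, if any."""
    if not backlog:
        return None
    _, report_id, category = heapq.heappop(backlog)
    return report_id, category

# If reports arrive faster than reviewers can pop them, lower-priority items
# sit in the backlog indefinitely -- the "consistent backlog" described above.
```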

While this article talks about Facebook, specifically, it's very similar to what I saw with YouTube's team, as well: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

[–] 0x0@programming.dev 11 points 8 months ago (1 children)

you really can’t pay people enough to watch child abuse

I wonder what the package was, besides the salary. And the hiring requirements.

[–] Chozo@fedia.io 15 points 8 months ago

I don't know all the details, but I know they had basically unlimited break time, as well as free therapy/counseling. The pay was also pretty decent, especially for a job that didn't require physical labor or a specialized background.

They did have a pretty strict vetting process, because it was apparently not uncommon at all for people to apply for the job because they were either eager to see abusive content directly, or had an agenda and might try to improperly influence what content got seen. Apparently they did social media deep dives that you had to consent to in order to apply.

For YouTube, I was very much talking specifically about how long it took them to act, and how little action they took, on the kids-doing-gymnastics videos, even when it became abundantly clear that the target market was pedophiles, and that the parents who kept posting these videos were, at the very least, complicit, if not explicitly pimping their children out.

(If you have not seen and/or read up on this, save yourself the misery and skip it: it's gross.)

It took them a VERY long time to take any meaningful action, even after it was clear that neither the intent nor the audience had anything to do with gymnastics, and the content stayed up for literal years.

Like, I have done anti-CSAM work and have lots and lots of sympathy for it, because it's fucking awful; but if you've got videos of children - clothed or not - and the comment section is entirely creeps and perverts and you just kinda do nothing, I have shockingly limited sympathy.

Seriously - the comment section should have been used by the FBI to launch raids, because I 100% guarantee you every single person involved has piles and piles of CSAM sitting around, and they were just ignored because this wasn't explicit CSAM.

Just... gross, and poorly handled.

[–] john89@lemmy.ca 2 points 8 months ago* (last edited 8 months ago)

But it also kinda feels like a lawyer convinced a victim they could get paid if they sued Apple, because Apple has lots of money.

Yep. All the money being wasted on this lawsuit could be spent catching actual producers and distributors of child porn.

Always follow the money. It shows what people's true intentions are.

[–] conciselyverbose@sh.itjust.works 39 points 8 months ago

I thought the way they intended to handle it was pretty reasonable, but the idea that there is an actual obligation to scan content is disgusting.

[–] paraphrand@lemmy.world 36 points 8 months ago* (last edited 8 months ago) (1 children)

“People like to joke about how we don’t listen to users/feedback. About how we just assert our vision and do things how we wish. Like our mouse. It drives people absolutely bonkers! But this time we listened to the pushback. And now they sue us?”

[–] lurklurk@lemmy.world 23 points 8 months ago (1 children)

Is iCloud a file-sharing service or a social network in some way? If it isn't, comparing it with such services makes no sense.

[–] ILikeBoobies@lemmy.ca 4 points 8 months ago

file sharing service

Yes

[–] interdimensionalmeme@lemmy.ml 10 points 8 months ago

Children should be made illegal, this is a self resolving problem.

[–] john89@lemmy.ca 2 points 8 months ago

Is this a free system, by the way?

Is Apple essentially getting sued for not giving another company money?

[–] Lutra@lemmy.world 1 points 8 months ago

I just read up on it, and I hadn't realized this is not so much about stopping new images as it is about restitution for continued damages.

The plaintiffs are "victims of the Misty Series and Jessica of the Jessica Series" (be careful with your googling): https://www.casemine.com/judgement/us/5914e81dadd7b0493491c7d7

Correct me if I'm wrong, but the plaintiffs' logic is: "The existence of these files is damaging to us. Anyone ever found in possession of one of these files is required by law to pay damages. Any company that stores files for others must search every file for one of these 100 files, and report that file's owner to the court."

I thought it was more about protecting the innocent, present and future, but it seems to be more about compensating those already hurt.

Am I missing something?

[–] lepinkainen@lemmy.world -2 points 8 months ago* (last edited 8 months ago) (3 children)

The irony is that Apple's CSAM detection system was about as good as such a system could be made at the time, with multiple steps to protect people from accidental false positives.

But, as usual, I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.

[–] lurklurk@lemmy.world 28 points 8 months ago (1 children)

You should have, though. This type of scanning is the thin end of the wedge to complete surveillance. If it's added, next year it's extended to cover terrorism. Then to look for missing people. Then "illegal content" in general.

The reason most people seem to disagree with you in this case is that you're wrong.

[–] lepinkainen@lemmy.world -2 points 8 months ago (1 children)

We could've burned that bridge when we got to it. If Apple had been allowed to implement on-device scanning, they could've done proper E2E "we don't have the keys, officer, we can't unlock it" encryption for iCloud.

Instead, what we have now is what EVERY SINGLE other cloud provider does: they scan your shit in the cloud all the time, unless you specifically upload only locally-encrypted content, which 99.9999% of people will never bother to do.
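
For the curious, here's a minimal sketch of what "locally-encrypted content" means in practice: encrypt the file yourself before any sync client sees it, so the provider only ever stores ciphertext it can't scan. This uses the third-party Python "cryptography" package, and the file names are illustrative; it's not anything Apple or the other providers ship.

```python
# Minimal sketch of "locally-encrypted content": encrypt a file before the
# cloud sync client ever sees it, so the provider only stores ciphertext.
# Requires the third-party "cryptography" package; file names are illustrative.
from cryptography.fernet import Fernet

def encrypt_for_upload(path: str, key: bytes) -> str:
    """Write an encrypted copy of `path` and return the new file's path."""
    with open(path, "rb") as f:
        plaintext = f.read()
    with open(path + ".enc", "wb") as f:
        f.write(Fernet(key).encrypt(plaintext))
    return path + ".enc"

key = Fernet.generate_key()              # lose this key and the data is unrecoverable
with open("vacation.jpg", "wb") as f:    # stand-in file just for the demo
    f.write(b"not really a photo")
upload_me = encrypt_for_upload("vacation.jpg", key)
# hand `upload_me` to the sync client instead of the original file
```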

[–] AlphaAutist@lemmy.world 3 points 8 months ago (1 children)
[–] lepinkainen@lemmy.world 0 points 8 months ago

It does now; it didn't at the time.

[–] Petter1@lemm.ee 2 points 8 months ago (1 children)

😆 Yeah, especially after I learned that most cloud services (Amazon, Google, Dropbox) were already doing CSAM scans on their servers 🤭

[–] lepinkainen@lemmy.world 3 points 8 months ago (1 children)

Yep, it's a legal "think of the children" requirement. They've been doing CSAM scanning for decades already and nobody cared.

When Apple built a system that required MULTIPLE HUMAN-VERIFIED matches of actual CP before even a hint would be sent to the authorities, it was somehow the slippery slope to a surveillance state.

The stupidest ones were the ones who went "a-ha! I can create a false match with this utter gibberish image!" Yes, you can do that. Now you've inconvenienced a human checker for 3 seconds, and only after the threshold of locally matching images has been reached. Nobody would EVER have gotten swatted by your false matches.
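
A toy illustration of that threshold point (this is NOT Apple's actual protocol, which used NeuralHash perceptual hashes, private set intersection, and threshold secret sharing; the numbers and names below are made up for the sketch):

```python
# Toy illustration only -- NOT Apple's actual protocol (which used NeuralHash
# perceptual hashes, private set intersection, and threshold secret sharing).
# The point: a single forced hash collision surfaces nothing, because no
# human review happens until an account crosses the match threshold.
MATCH_THRESHOLD = 30  # illustrative; the design required many matches, not one

def needs_human_review(photo_hashes: list[str], known_hashes: set[str]) -> bool:
    """True only if enough of an account's photos match the hash database."""
    matches = sum(1 for h in photo_hashes if h in known_hashes)
    return matches >= MATCH_THRESHOLD

# One adversarial collision: no review, no report, nobody gets swatted.
assert not needs_human_review(["feedface"], {"feedface"})
# Only a large pile of matches would ever reach a human checker.
assert needs_human_review(["feedface"] * 30, {"feedface"})
```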

Can people say the same for Google stuff? People get accounts taken down by "AI" or "Machine learning" crap with zero recourse, and that's not a surveillance state?

[–] Petter1@lemm.ee 3 points 8 months ago

😅why do we get downvoted?

I guess somebody doesn’t like reality 💁🏻