this post was submitted on 14 Jun 2024
184 points (88.3% liked)

Asklemmy


A loosely moderated place to ask open-ended questions


If your post meets the following criteria, it's welcome here!

  1. Open-ended question
  2. Not offensive: at this point, we do not have the bandwidth to moderate overtly political discussions. Assume best intent and be excellent to each other.
  3. Not regarding using or support for Lemmy: context, see the list of support communities and tools for finding communities below
  4. Not ad nauseam inducing: please make sure it is a question that would be new to most members
  5. An actual topic of discussion


I've watched the keynote and read some things online, and I found this video of someone discussing the new update (I'm linking it here because if you didn't see the keynote, it's probably enough to catch up).

Is it just me, or... does no one address that Apple is pulling a Microsoft move by basically scanning everything on every machine and feeding it into their LLM?

top 50 comments
[–] pavnilschanda@lemmy.world 88 points 5 months ago (2 children)

As far as I know, Apple's implementation of LLMs is completely opt-in

[–] HEXN3T@lemmy.blahaj.zone 42 points 5 months ago (2 children)

And local. Probably. Maybe. Perchance.

[–] baduhai@sopuli.xyz 27 points 5 months ago

It's all local, except when it isn't.

[–] mamotromico@lemmy.ml 4 points 5 months ago (2 children)

It’s both local and remote, according to their page. There are some activities that run on a “private cloud”. I’d imagine that image creation is one of those.

[–] Technoguyfication@sh.itjust.works 23 points 5 months ago (2 children)

Apple also has a MUCH better track record on user privacy than pretty much every other big tech company.

[–] teawrecks@sopuli.xyz 17 points 5 months ago

On the contrary, Apple's approach to data collection is deliberately obtuse and uses dark patterns to make it as difficult as possible not to upload your info to them.

From the article:

the user is given the option to enable Siri, but “enabling” only refers to whether you use Siri's voice control. “Siri collects data in the background from other apps you use, regardless of your choice, unless you understand how to go into the settings and specifically change that,”...“In practice, protecting privacy on an Apple device requires persistent and expert clicking on each app individually"...the steps required are “scattered in different places.”

Apple devices might be arguably more secure than other vendors, but security and privacy are not the same thing.

[–] kalleboo@lemmy.world 73 points 5 months ago* (last edited 5 months ago) (2 children)

Microsoft's thing takes a screenshot of everything on your screen and saves and indexes it. Opened up your password manager and revealed a password? Saved. Opened a porn site in a private tab in any browser aside from Edge? Saved. Opened up a private encrypted chat to try to get away from your abusive partner/parents? Saved and indexed. Logged into a portal at work showing HIPAA information? Saved and indexed.

Apple's thing is basically a better search feature of all the data you already have saved, that apps have already opted-in to sharing. It runs on device, and Apple has promised they do not send the data back to train the models. They also have some generic ChatGPT-like tool to help rewrite your documents, but that's 100% opt-in so nobody really cares about it, it's easy to just not use.

[–] aStonedSanta@lemm.ee 32 points 5 months ago (1 children)

Came here to say this much less eloquently. Left this comment cause I wanted to use eloquent 😌

[–] IbnLemmy@feddit.uk 19 points 5 months ago

Well eloquented, bravo

[–] QuarterSwede@lemmy.world 6 points 5 months ago* (last edited 5 months ago) (2 children)

It’s not ChatGPT-like, it is ChatGPT, specifically GPT-4o.

See the press release and the versioning at the bottom of the page.

https://www.apple.com/newsroom/2024/06/introducing-apple-intelligence-for-iphone-ipad-and-mac/

So, Apple is using the latest version of the best LLM out there, and it's opt-in. That’s a hugely different approach, as you mentioned.

[–] Rexios@lemm.ee 8 points 5 months ago (1 children)

Siri can ask ChatGPT but only if you explicitly allow it. They specifically said ChatGPT is only used if you asked for something that needs world knowledge, and even then there is a prompt asking for permission to send data to ChatGPT every time. Apple even said in interviews that you can completely disable the ChatGPT integration if you want.
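To make that flow concrete, here is a minimal sketch of the opt-in gating described above, assuming a global toggle plus a per-request permission prompt. All names and types are illustrative placeholders, not Apple's actual API.

```swift
import Foundation

// Hypothetical routing decision: queries stay on device unless the ChatGPT
// integration is enabled AND the user approves this specific request.
enum Backend {
    case onDevice
    case chatGPT
}

struct AssistantSettings {
    var chatGPTIntegrationEnabled: Bool   // global opt-in toggle (off by default)
}

func route(query: String,
           settings: AssistantSettings,
           needsWorldKnowledge: Bool,
           askPermission: (String) -> Bool) -> Backend {
    // Everything defaults to local handling.
    guard needsWorldKnowledge, settings.chatGPTIntegrationEnabled else {
        return .onDevice
    }
    // Even with the integration enabled, every request prompts for consent.
    return askPermission("Send this request to ChatGPT?") ? .chatGPT : .onDevice
}
```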

[–] jaykay@lemmy.zip 71 points 5 months ago* (last edited 5 months ago)

I saw a comment somewhere that said: “people have been burnt by Microsoft too many times, while Apple still has a benefit of the doubt for many people in regards to privacy”. People still have some trust in Apple, compared to MS.

Edit: Found the comment by @deweydecibel@lemmy.world

If Apple announced Recall? Apple wouldn’t announce Recall, that’s the whole point. Apple wouldn’t be so brazen and stupid to push a tool that is so obviously invasive and so poorly implemented. Apple earned its trust by not making those mistakes.

But if they did decide to say fuck it and implement something like Recall, of course people would trust them. That’s what trust means: consumers take them at their word. But if it’s as bad as Microsoft’s Recall, Apple would burn all that trust when people found out.

People don’t believe Microsoft because they have long since burned any trust and goodwill with most of their consumers. They have proven time and time again that they don’t give a shit about users’ wants or needs, and users have felt that. So when they announce Recall, they have no earned trust. No one believes them. There’s no good faith to cushion this. And it turns out everyone was right not to grant them that trust.

[–] helpImTrappedOnline@lemmy.world 63 points 5 months ago (2 children)

I'm going to copy-paste a reply I left somewhere else. This was about AI on iOS; I'm unsure what the implementation for macOS is. If they are scanning everything, then I do not support it.


From what I saw,

MS Recall is a 24/7 AI monitoring system that captures everything you look at and saves it for later. They didn't even do the bare minimum to protect the data; it was just dumped in an unencrypted folder where anyone could get wholesale access to it. All trust has been lost.

Apple is using AI as a tool to improve specific tasks/features that a user invokes, things like assistant queries and the new calculator. They have said some promising things regarding privacy, specifically with the use of ChatGPT: any inquiry sent to ChatGPT will ask the user for permission first and obscure their IP. This shows they care enough to try; they have not lost our trust, but we remain skeptical.


If Apple tries the same thing by scanning everything wholesale, then that's getting overshadowed by the promises made by the implementation on the much more popular iOS.

[–] ripcord@lemmy.world 4 points 5 months ago

OP will never acknowledge this.

[–] arxdat@lemmy.ml 26 points 5 months ago (1 children)

Apple at least talks about privacy and security. Windows just dumped that shit right on you and is planning on storing it in unencrypted databases... like, I would expect there to be enough brainpower at M$ to write an application and then secure it... Just use Linux, and when Ubuntu and Fedora decide they want to implement those features... OpenBSD it is :D

[–] CaptPretentious@lemmy.world 4 points 5 months ago

https://www.malwarebytes.com/blog/news/2024/05/deleted-iphone-photos-show-up-again-after-ios-update

So photos were stored forever in a database... How's that privacy and security? This isn't even just another iCloud leak... these were things people thought were deleted coming back.

Apple likes to talk because talk is cheap. Just like Apple used to say that Apple products didn't get viruses. Or how Google said "don't be evil." Talk is cheap.

[–] chonglibloodsport@lemmy.world 22 points 5 months ago (2 children)

It’s really simple: Microsoft is a business solutions company. Microsoft helps your boss spy on you at work. Your boss is their customer, not you.

Apple is a consumer products company. You are their customer. They market their products on privacy and security. Betraying that marketing message by spying on users is shooting themselves in the foot, so they’re incentivized not to do that.

Neither company is trustworthy. Economic incentives are the trustworthy concept here. Barring screwups, we can trust both companies to do what is profitable to them. Microsoft profits by spying on users, Apple does not (not right now anyway).

[–] isolatedscotch@discuss.tchncs.de 21 points 5 months ago (4 children)

They definitely do spy on their users and sell their data, but they are very clever at marketing their products as fashionable, and people fall for it.

[–] cyberpunk007@lemmy.ca 6 points 5 months ago

My employer runs macOS, so I'd argue the Mac is still a business solution, just not as common as Windows. Tools exist for managing Macs at scale as well.

[–] Yggnar@lemmy.world 22 points 5 months ago

If you're already willing to put up with all the other bullshit Apple does, I don't see why you'd care about them doing this.

[–] exanime 20 points 5 months ago
  1. MS has a horrible track record on privacy, or even on caring about their customers; Apple, deservedly or not, enjoys a higher level of consumer trust

  2. MS brought out features too obviously ripe for abuse, built in the most insecure way, lied about it, and was quickly proven a liar. Apple says they built their AI in a secure way; their fans believe them and have not challenged that claim

  3. Undoubtedly, there is still a huge Apple fanbase that would tell you Apple's turds smell nice. So there's a portion of that in the mix as well

[–] phoneymouse@lemmy.world 19 points 5 months ago* (last edited 5 months ago)

If you watched WWDC, they shared how it works. They have a private cloud that does not persist data, it only processes it. It's also audited by a third party, and there is a cryptographic mechanism that will not allow your request to be accepted unless the server software has been publicly signed by the auditor. At least, this is my best understanding from what I remember.

Also, in the same presentation they announced that you can now lock and hide your apps, which will keep their data out of the OS search results. I am fairly certain this also means they're opted out of ML/AI processing, given that any LLM would rely on the same search index.
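For illustration only: a minimal sketch of the kind of signed-software check described in the comment above, assuming the server presents a measurement of its software image signed by an auditor. The types, names, and trust model here are hypothetical placeholders, not Apple's actual Private Cloud Compute protocol.

```swift
import Foundation

// Hypothetical attestation: a hash of the server's software image plus an
// auditor's signature over that hash. Not Apple's real data format.
struct Attestation {
    let softwareMeasurement: Data   // hash of the server software image
    let auditorSignature: Data      // auditor's signature over the measurement
}

// The client refuses to send a request unless the server's software matches a
// publicly released, auditor-signed build.
func shouldSendRequest(to attestation: Attestation,
                       trustedMeasurements: Set<Data>,
                       verifySignature: (Data, Data) -> Bool) -> Bool {
    // 1. The measurement must match a build the auditor has published.
    guard trustedMeasurements.contains(attestation.softwareMeasurement) else {
        return false
    }
    // 2. The auditor's signature over that measurement must be valid.
    return verifySignature(attestation.softwareMeasurement,
                           attestation.auditorSignature)
}
```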

[–] macabrett@lemmy.ml 16 points 5 months ago

I think it's because Apple has a "fandom", whereas when's the last time you heard someone being a weird fan of Microsoft outside of Xbox? It just doesn't really exist. The people with Apple devices are often "fans" of Apple, not simply people who bought a product. I think it's that simple.

[–] umbrella@lemmy.ml 13 points 5 months ago (2 children)

Apple can get its consumers into a cult-like state, it seems.

Their marketing and PR is scary good.

[–] bloodfart@lemmy.ml 13 points 5 months ago

You can look at security failures as mistakes or conspiracies.

It’s very easy to see the Microsoft failures as conspiracies the more you learn about them because Microsoft’s material interests are aligned with the failures. To steal someone’s turn of phrase: “Microsoft gives you a foot gun for free but charges for bulletproof shoes”.

It’s very easy to see Apple's security failures as mistakes because the more you learn about them, the more you see how Apple's material interests aren't aligned with the failures. If I had to make a similar one-liner: “Apple sells you designer shoes with drop-rated toe boxes. They might not be bulletproof, but you also don’t have a foot gun.”

[–] Kronusdark@lemmy.world 8 points 5 months ago (1 children)

I think it all remains to be seen. Apple was very specific in their wording about privacy, probably BECAUSE they saw what happened to Microsoft. We didn’t see any live demos, and I am still a bit skeptical that it will work that well.

A key difference in how Apple is doing it, though, is that it only exposes the necessary data as context to an LLM request, whereas Microsoft was capturing and training on everything.

I don’t have an iPhone 14, so luckily I can’t test this on day one. I will wait for reviews and for security researchers to look it over.

[–] CoggyMcFee@lemmy.world 6 points 5 months ago

I think Apple’s emphasis on the privacy and security stuff would have happened anyway, because they’ve been positioning themselves as privacy focused for several years now.

[–] Aggravationstation@feddit.uk 8 points 5 months ago

Not true. I hate them both for this and a litany of other reasons. Holding back humanity's development and being the chief cause of e-waste are at the top of the pile.

[–] tiredofsametab@kbin.run 7 points 5 months ago

Because I don't use Apple products and don't keep up with the news? My work laptop is a Mac, but that's work's problem (I hate that thing).

[–] nutsack@lemmy.world 7 points 5 months ago

because Windows is a piece of shit that people in here are forced to use to play video games and Apple is just kind of doing its own thing being a piece of shit

[–] FrostyCaveman@lemm.ee 7 points 5 months ago

Microsoft and Apple are both privacy-disregarding monopolistic megacorporations. The difference is Microsoft is slowly degrading in competence and their PR machine is no longer able to compensate

[–] Brkdncr@lemmy.world 7 points 5 months ago

Apple has always had a better PR team.

[–] jet@hackertalks.com 6 points 5 months ago* (last edited 5 months ago)

Opt-in; Respecting Agency; Explicit Consent.

Microsoft has every intention of SHOVING this down your throat, and only corporate group policy will be exempted. They will use every nag screen, dark pattern, accidental enabling with updates, and randomized install to make it happen. Look at what they do with Edge, for example. MS absolutely does not respect consent. #MS-MeToo

Apple, for all its faults, respects people when they say no, and if they say it's opt-in, they have a track record to back that up. Apple says "Hey, look at this cool new feature you can use," and I think: hooray, more choice.

Skimming all the comments, I didn't see this mentioned explicitly.

[–] Num10ck@lemmy.world 5 points 5 months ago (1 children)

https://youtu.be/J7al_Gpolb8

Start watching at 1hr29sec for a great explanation of the vastly different approaches to privacy with AI between Apple and Microsoft.

[–] orcrist@lemm.ee 5 points 5 months ago (1 children)

Maybe you're making an apples-to-oranges comparison. But anyway, nobody I know thinks Apple has good intentions with regard to their data.

[–] PlushySD@lemmy.world 4 points 5 months ago

I saw the Apple Intelligence presentation where it reads user emails and SMS, like it reads everything, and categorizes which is more important to you... and people accept that?

[–] Raxiel@lemmy.world 4 points 5 months ago

I saw the thumbnail and just thought, macOS for maco Monday?

[–] Jeom@lemmy.world 3 points 5 months ago (1 children)

Read this as "TACOS WILL NEVER BE THE SAME"
