this post was submitted on 02 Dec 2024
236 points (91.3% liked)

Showerthoughts

31967 readers
886 users here now

A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The most popular seem to be lighthearted clever little truths, hidden in daily life.


Rules

  1. All posts must be showerthoughts
  2. The entire showerthought must be in the title
  3. No politics
    • If your topic is in a grey area, please phrase it to emphasize the fascinating aspects, not the dramatic aspects. You can do this by avoiding overly politicized terms such as "capitalism" and "communism". If you must make comparisons, you can say something is different without saying something is better/worse.
    • A good place for politics is c/politicaldiscussion
  4. Posts must be original/unique
  5. Adhere to Lemmy's Code of Conduct and the TOS

If you made it this far, Showerthoughts is accepting new mods. This community is generally tame, so it's not a lot of work, but having a few more mods would help reports get addressed a little sooner.

What's it like to be a mod? Reports just show up as messages in your Lemmy inbox, and if a different mod has already addressed the report, the message goes away and you never have to worry about it.

founded 2 years ago

What do you think, ChatGPT? If it can create almost perfect summaries from a prompt, why wouldn't it work in reverse? AI built into Windows could flag potentially subversive thoughts typed into Notepad or Word, as well as flag "problematic" clicks and compare them to previously profiled behavior. AI built into your GPU could build a behavioral profile based on your interactions with your hentai Sonic the Hedgehog game.
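For illustration, here's a toy sketch of the kind of local flagging described above. The watched terms, scoring, and names are all made up for the example; no real Windows or Office API is involved.

```python
# Toy illustration only: a hypothetical local "flagger" that scans typed text
# against a watchlist and accumulates a behavioral profile. Nothing here is a
# real OS API; the terms are placeholders.
from collections import Counter

WATCHED_TERMS = {"protest", "leak", "strike"}  # hypothetical watchlist


class BehaviorProfile:
    def __init__(self) -> None:
        self.hits = Counter()

    def observe(self, text: str) -> list[str]:
        """Record which watched terms appear in a piece of typed text."""
        found = [t for t in WATCHED_TERMS if t in text.lower()]
        self.hits.update(found)
        return found


profile = BehaviorProfile()
print(profile.observe("Drafting notes about the protest in Notepad"))  # ['protest']
print(profile.hits)  # Counter({'protest': 1})
```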

top 30 comments
[–] slazer2au@lemmy.world 45 points 3 months ago (3 children)

Don't need AI for any of this. It already happens with OS and Application telemetry.

[–] Tetsuo@jlai.lu 18 points 3 months ago

Hello, I'm NVIDIA. I send every app you use home as telemetry. But you know, it's only so I know which apps your driver crashes in, of course. I would never send that data when it doesn't crash. Right?

[–] BearOfaTime@lemm.ee 12 points 3 months ago

And it's been escalated with AI

[–] sp3tr4l@lemmy.zip 10 points 3 months ago* (last edited 3 months ago) (1 children)

True, you don't need AI for security problems...

...but it is introducing tons of them, for little to no benefit.

About a month ago I saw a post for an MSFT-led AI security conference.

None of it, absolutely none of it, was about how to, say, leverage LLMs to aid in heuristic scanning for malware, or anything like that.

Literally every talk and booth at the conference was all about all the security flaws with LLMs and how to mitigate them.

I'll come back and edit my post with the link to what I'm talking about.

EDIT: Found it.

https://www.microsoft.com/en-us/security/blog/2024/09/19/join-us-at-microsoft-ignite-2024-and-learn-to-build-a-security-first-culture-with-ai/

Unless I am missing something, literally every talk/panel here is about how to mitigate the security risks to your system/db which are introduced by LLM AI.

[–] Paradachshund 2 points 3 months ago

Sorry, what was that? "BUY BUY BUY"?

[–] AlecSadler@sh.itjust.works 28 points 3 months ago (1 children)

I think it was during the Cambridge Analytica days, but I read an article that said the average person is tracked by over 5,000 data points. So we're already kinda f'd.

[–] Scolding7300@lemmy.world 18 points 3 months ago (1 children)

Defeatism plays into their advantage; you can always minimize the tracking. E.g. https://www.goodreads.com/book/show/54033555-extreme-privacy

[–] AlecSadler@sh.itjust.works 8 points 3 months ago (1 children)

Ah, yeah, sorry, didn't mean to come off defeatist. I see that now.

As someone who recently ditched Alexa, blocks his smart TVs, and runs everything through PiHole and a VPN, I'm definitely...sorta trying.

[–] ComradeMiao@lemmy.world 3 points 3 months ago

If you don’t start limiting your house's electrical hours, are you even trying?

[–] ogmios@sh.itjust.works 26 points 3 months ago (1 children)
[–] ricdeh@lemmy.world 13 points 3 months ago

Initiating countermeasures

[–] JoMiran@lemmy.ml 24 points 3 months ago (3 children)

Have a few friends over and have them all sit around a table. Have everyone place their smartphones on the table (turned on, of course), and proceed to discuss something like the merits of drills from Harbor Freight versus Ryobi, Milwaukee, and DeWalt, ideally with one person speaking at a time. Wait about a week, then ask your friends if any of them noticed an uptick in ads for drills or power tools in general.

[–] usualsuspect191@lemmy.ca 24 points 3 months ago (5 children)

Hasn't this been proven to be false? People have monitored the network traffic and phones don't listen like this; it's just not practical.

Instead, they keep track of your browsing, location, contacts, etc and build a profile well enough they don't need to listen to you.

[–] vinnymac@lemmy.world 16 points 3 months ago* (last edited 3 months ago) (1 children)

It’ll vary by the software you have and the phone you have. Many companies, Google, Meta, and Amazon among them, have been caught capturing microphone recordings over the years.

It also depends on the appliances you own, and how you have them configured. TVs, Alexa, hell we even have refrigerators that have live mics on them now.

I have worked in tech my whole life; this is table stakes for these organizations, ethics be damned.

[–] usualsuspect191@lemmy.ca 9 points 3 months ago (2 children)

My understanding is the mics aren't "live" until the activation phrase is said, then they record and send that data for processing. If someone has proven otherwise I'd love to see their methods.

The scary thing isn't that they're listening, it's that they collect so much other data that they don't have to.

[–] vala@lemmy.world 5 points 3 months ago (1 children)

How are they listening for the activation phrase then?

[–] usualsuspect191@lemmy.ca 3 points 3 months ago (1 children)

I'm sure you'll find some good explanations online, but there's an "activation" circuit on the device "listening" that then engages the rest of the system when it's triggered. So there's no recording or sending of data until the activation phrase has been said, and the activation phrase detection is done locally on the device.
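Roughly, the gating works like this; a minimal sketch where the detector function is just a stand-in for the on-device wake-word model, not any real vendor API:

```python
# Minimal sketch of wake-word gating: a small always-on detector runs locally,
# and nothing is recorded or sent until it fires. detect_wake_word() is a
# placeholder for the low-power, on-device activation model.
import random


def capture_frame() -> bytes:
    return b"\x00" * 320  # pretend this is ~20 ms of microphone audio


def detect_wake_word(frame: bytes) -> bool:
    return random.random() > 0.999  # stand-in for the local activation model


def assistant_loop(max_frames: int = 100_000) -> None:
    for _ in range(max_frames):
        frame = capture_frame()
        if detect_wake_word(frame):
            # Only now would the device start buffering audio and streaming it upstream.
            print("activation phrase detected -> start recording / sending")
            return
        # Otherwise the frame is discarded immediately; nothing leaves the device.


assistant_loop()
```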

[–] vala@lemmy.world 2 points 3 months ago

This makes sense for devices like Google home where there is only one activation phrase but I don't understand how an IC could exist that can respond to custom activation phrases.

Also are you saying that cellphones have this circuit too? I'm pretty darn sure that's all software based.

[–] WhyJiffie@sh.itjust.works 3 points 3 months ago

the "it doesnt record you until the software decides so" argument is such a bullshit. does not mke any difference. it listens when it wants, and you cant even verify it

[–] JoMiran@lemmy.ml 12 points 3 months ago (1 children)

Run your own experiments. That's all I am suggesting.

[–] Sludgehammer@lemmy.world 8 points 3 months ago* (last edited 3 months ago)

It'd be very easy to take some LLM text about some product, run it through a text-to-speech converter, then quietly expose the phone to it (like putting an earbud up to the mic). That way you could easily create a blind or even double-blind test: you don't know which product the setup has been rambling about into the phone for the past twelve hours, and you have to pick it out from the ads you're served.
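Something like this rough sketch would do it, assuming the pyttsx3 package for offline text-to-speech; the product blurbs are hard-coded stand-ins for the LLM output, and the tester only learns the chosen product after checking their ads:

```python
# Rough sketch of the blind test described above. Assumes `pip install pyttsx3`
# for offline text-to-speech; the blurbs below stand in for generated LLM text.
import random

import pyttsx3

PRODUCT_BLURBS = {
    "cordless drill": "I've been comparing cordless drills all week: torque, batteries, chucks.",
    "espresso machine": "Thinking about finally upgrading to a proper espresso machine at home.",
}


def ramble_about(product: str, repeats: int = 50) -> None:
    """Play the blurb repeatedly near the phone's microphone (e.g. via an earbud)."""
    engine = pyttsx3.init()
    for _ in range(repeats):
        engine.say(PRODUCT_BLURBS[product])
    engine.runAndWait()


secret_product = random.choice(list(PRODUCT_BLURBS))  # the tester doesn't see this
ramble_about(secret_product)
# A week later: did the ads give the secret product away?
```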

[–] vala@lemmy.world 3 points 3 months ago

You don't need to transmit the recording. Maybe not even a transcript. Just the keywords.
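For scale, a toy sketch of what a keywords-only payload might look like; the interest list and device ID are made up:

```python
# Toy sketch: extract only keyword hits from a local transcript and ship a tiny
# JSON payload instead of audio. The keyword list and device ID are invented.
import json

AD_KEYWORDS = {"drill", "mortgage", "vacation", "espresso"}


def keyword_hits(transcript: str) -> list[str]:
    words = {w.strip(".,!?").lower() for w in transcript.split()}
    return sorted(words & AD_KEYWORDS)


payload = json.dumps({
    "device_id": "abc123",  # made-up identifier
    "hits": keyword_hits("We were arguing about which drill to buy for the vacation house"),
})
print(payload, f"({len(payload)} characters)")  # a few dozen bytes, not an audio stream
```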

[–] voracitude@lemmy.world 2 points 3 months ago* (last edited 3 months ago)
[–] BearOfaTime@lemm.ee 12 points 3 months ago

I saw this within minutes after a conversation in a car with 2 people and 2 phones.

And it was for a subject which was waaaaay out in left field for us both, something neither of us had ever even thought about before.

[–] TranquilTurbulence@lemmy.zip 3 points 3 months ago (1 children)

Ads? You mean those stickers on a bus?

Seriously though, use DNS, VPN and other means to block ads and telemetry, so thoughts like that don’t even occur to you.

[–] lord_ryvan@ttrpg.network 2 points 3 months ago

A VPN doesn't necessarily block telemetry, and some providers, like NordVPN, have tons of telemetry in their clients alone. Even if they advertise telemetry blocking in their VPN, I guess they want to be the only ones hoarding your data.

Use tracker blockers/firewalls. TrackerControl is a good open-source app for this on Android, and a Pi-hole can block a lot of tracking traffic as well.
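If you want to sanity-check that the blocking actually works, a quick sketch like this can help; Pi-hole answers 0.0.0.0 for blocked names by default, and the domains here are placeholders rather than a vetted blocklist:

```python
# Quick sketch: check whether known tracker domains are being sunk by a DNS
# blocker such as Pi-hole (which returns 0.0.0.0 for blocked names by default).
# The domain list is illustrative only.
import socket

TRACKER_DOMAINS = ["telemetry.example.com", "ads.example.net"]  # placeholders

for domain in TRACKER_DOMAINS:
    try:
        addr = socket.gethostbyname(domain)
        status = "blocked (sinkholed)" if addr in ("0.0.0.0", "127.0.0.1") else f"resolves to {addr}"
    except socket.gaierror:
        status = "no answer (blocked or nonexistent)"
    print(f"{domain}: {status}")
```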

[–] masterspace@lemmy.ca 12 points 3 months ago (1 children)

While it could, and I have no doubt that someone will try to do this, it's not the reason it's being shoehorned into everything.

It's partly because it's the tech thing that's 'so hot right now', so every tech enthusiast and hustler thinks it can be used everywhere to solve everything, and partly because it's a legitimately huge advancement in what computers are capable of doing, one with a lot of room for growth and improvement, and it can be legitimately useful in places like Notepad.

[–] lordnikon@lemmy.world 5 points 3 months ago

Yeah, gen AI is the perfect demo tech: it looks amazing if you don't look too close. Plus it's the perfect bullshitting machine; no wonder CEOs love it, it talks like they do. AI has its uses, and it's doing good work in the fields you don't hear much about. But there are way more pets.coms out there right now that will go bust soon, and the viable businesses will float to the surface. Hell, we're going through that right now, as Web 2.0 companies move out of the growth phase and into the enshittification phase.

[–] rumba@lemmy.zip 5 points 3 months ago

They're just sending every query home right now. Actual training is still resource-intensive and very expensive. I suspect they're just grabbing as much data as they can get their hands on from everyone, with unique identifiers, and storing it for later training. Once the data they have is worth more than the cost to train on it, they'll go ahead and run a giant model of everyone.

At that point they'll sell query time to corporations. "How many people would pay $400 for trainers with OLED screens on the sides". "Oh really? Yes, I'd like to buy ads for all of those people"