this post was submitted on 08 Oct 2024
202 points (100.0% liked)

Privacy

[–] OpenPassageways@lemmy.zip 18 points 1 month ago (3 children)

I've seen this advertised as a fraud detection and prevention service, even before ChatGPT. I'm assuming there's a standard disclosure that the call may be recorded for training purposes; it's only recently that "training" has come to include "training AI".

[–] lolola@lemmy.blahaj.zone 18 points 1 month ago (1 children)

Training then: Making all the new hires sit down and listen to a recording of you getting increasingly frustrated with their dumbass coworker.

Training now: Burning through a neighborhood's worth of power for data processing to ensure that the phone tree understands you with absolute certainty when you say "speak to a representative", yet continues to ignore you anyway.

[–] Flocklesscrow@lemm.ee 2 points 1 month ago

"You have selected an invalid option. Goodbye." Click

[–] Fredselfish@lemmy.world 6 points 1 month ago (2 children)

Yeah, but it should still be illegal. I mean, this AI is listening and gathering information about the customer while they discuss private banking matters.

[–] Flocklesscrow@lemm.ee 2 points 1 month ago

100%

And how long before the same systems transition to healthcare (as if they haven't already)? AI wants to know all the salacious details, baby.

[–] Jakeroxs@sh.itjust.works 2 points 1 month ago

It really depends on how it's being stored and used. Like the other commenter mentioned, it's standard practice in the banking/brokerage industry to record all calls for training, litigation, and coaching purposes.

[–] Pandemanium@lemm.ee 3 points 1 month ago

It doesn't prevent any fraud when anyone on the internet can now easily recreate a person's voice using AI. Banks should know better.