submitted 4 months ago by L4s@lemmy.world to c/technology@lemmy.world

University vending machine error reveals use of secret facial recognition | A malfunctioning vending machine at a Canadian university has inadvertently revealed that a number of them have been usin...::Snack dispenser at University of Waterloo shows facial recognition message on screen despite no prior indication

[-] nicerdicer2@sh.itjust.works 85 points 4 months ago

The worst part of all is that no one would suspect a vending machine of performing facial recognition, because a vending machine is generally assumed to be a mechanical device, as it was in the past. There is no user benefit in it whatsoever.

I researched the manufacturer, and the brochure for a similar vending machine (see page 6) reveals what data can be processed:

Among the worst data sets are:

  • product demographics
  • measuring of foot traffic
  • gender/ age/ etc.

Bonus: on page 7 of the product brochure they introduce an app which allows the customer to make purchases directly from their smartphone, with features like

  • consumer engagement through gamification, interactive marketing, gifting, scratch-and-win receipts, product sampling and cross selling

"What do customers get?"

  • a fun and engaging payment process

Finally! I always thought that payment is not fun enough. What a time to be alive.

[-] homesweethomeMrL@lemmy.world 24 points 4 months ago

gender/age/etc.

The etc. is doing a lot of work there

[-] 9point6@lemmy.world 16 points 4 months ago

Well this absolutely wouldn't fly in the EU with GDPR

Can you lot in the states do something about your weird corpocracy, it's looking a bit dystopian

[-] nicerdicer2@sh.itjust.works 11 points 4 months ago* (last edited 4 months ago)

Bad news: the manufacturer is located in Switzerland and, as stated in the brochure, they advertise their product as "Made in EU". Probably to imply that any data collected and processed falls under the terms of GDPR.

I haven't looked up the terms of GDPR in detail, but I assume that their data collection is somewhat "compliant" with GDPR, which does not necessarily mean anything. It can simply mean that data is not stored locally, though it is still sent to the manufacturer (probably encrypted). However, under GDPR you can exercise your right to have the collected data deleted - that is, if you know that data about you has been collected.

What makes this issue so severe is that the data collection and processing would never have been detected if it weren't for a malfunction.

Edit: grammar, spelling

[-] fatalError@lemmy.sdf.org 10 points 4 months ago

Switzerland is not in the EU. And even if it were, it's not illegal to design/manufacture solutions that don't comply with GDPR. They just can't be sold in the EU.

Also, data collection absolutely requires consent - that's why cookie popups exist on every website.

[-] nicerdicer2@sh.itjust.works 5 points 4 months ago

That is correct. Switzerland is not part of the European Union. The manufacturer, Invenda, is headquartered in Switzerland. It's possible that their vending machines are produced within the EU (in a country where production costs are lower), and that these specific models (the ones that offer data collection) are designed for markets outside the EU.

They advertise their product as "Made in EU" (see brochure). This could be deliberate, to imply that their data collection meets GDPR requirements and lead customers to believe that everything is compliant with the law.

[-] 9point6@lemmy.world 2 points 4 months ago

Under GDPR you also need explicit consent for data collection, right?

[-] nicerdicer2@sh.itjust.works 1 points 4 months ago

Correct. The vending machine in question was collecting data without users' consent. And because it was facial recognition data, the collected data can be tied to an individual.

It would have been different if the collected data were just a counter indicating the number of users of that machine. That kind of data could not be tied to a specific individual.

[-] seaweedsheep@literature.cafe 8 points 4 months ago

No. But also, this is Ontario, well-known for being outside US jurisdiction.

[-] can@sh.itjust.works 11 points 4 months ago

Scariest part is we'd never have known if the facial recognition software hadn't encountered an error. At least not until someone curious enough looked up the machine.

[-] Couldbealeotard@lemmy.world 7 points 4 months ago

This reminds me of the bit in Minority Report where Tom Cruise has to get his eyes surgically replaced so the shopping centre kiosks can't track him

[-] kalkulat@lemmy.world 16 points 4 months ago* (last edited 4 months ago)

AND they might have had miniature cameras in them for the past 20 years.

(The laws against this stuff are almost non-existent. The option left for those of us creeped out by constant surveillance: don't leave home, unplug that webcam. Demand privacy or lose it.)

[-] homesweethomeMrL@lemmy.world 15 points 4 months ago

Hey hey - y'all quit hanging around the vending machines! You're going to be late for the two minutes of hate!

[-] autotldr@lemmings.world 14 points 4 months ago

This is the best summary I could come up with:


A malfunctioning vending machine at a Canadian university has inadvertently revealed that a number of them have been using facial recognition technology in secret.

Invenda, the company that produces the machines, advertises its use of “demographic detection software”, which it says can determine gender and age of customers.

It claims the technology is compliant with GDPR, the European Union’s privacy standards, but it is unclear whether it meets Canadian equivalents.

In April, the national retailer Canadian Tire ran afoul of privacy laws in British Columbia after it used facial recognition technology without notifying customers.

The government’s privacy commissioner said that even if the stores had obtained permission, the company failed to show a reasonable purpose for collecting facial information.

The University of Waterloo pledged in a statement to remove the Invenda machines “as soon as possible”, and that in the interim, it had “asked that the software be disabled”.


The original article contains 258 words, the summary contains 149 words. Saved 42%. I'm a bot and I'm open source!

this post was submitted on 24 Feb 2024
373 points (98.4% liked)
