this post was submitted on 29 Dec 2023
100 points (95.5% liked)

Technology


"Like so many applications of AI, this new power is likely to be a double-edged sword: It may help people identify the locations of old snapshots from relatives, or allow field biologists to conduct rapid surveys of entire regions for invasive plant species, to name but a few of many likely beneficial applications.

"But it also could be used to expose information about individuals that they never intended to share, says Jay Stanley, a senior policy analyst at the American Civil Liberties Union who studies technology. Stanley worries that similar technology, which he feels will almost certainly become widely available, could be used for government surveillance, corporate tracking or even stalking."

all 15 comments
[–] paraphrand@lemmy.world 45 points 9 months ago (1 children)

There are humans that can do this too. It’s pretty wild.

[–] PoopMonster@lemmy.world 10 points 9 months ago

OpenStreetMap lets you write some insanely precise queries. There was a company whose plan was to team up with governments to pinpoint mass shooters while they were livestreaming (as one use case).

So say it's clear from the video that they're in X city, and you see things in the video like a McDonald's, a Starbucks, a fenced-in playground, a church, what have you: you can give the query a bounding box with all that info and very quickly narrow down where the video could have been taken.

I think there were also some people who would pinpoint images from mountain outlines as a game. Kind of like GeoGuessr on steroids.
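The bounding-box trick described above can be sketched with Overpass QL, the query language for OpenStreetMap's Overpass API. This is a hypothetical illustration: the helper name `build_overpass_query`, the specific tag filters, and the 200 m radius are all assumptions for the example, though the `around.<set>:<radius>` chaining is standard Overpass QL syntax.

```python
def build_overpass_query(bbox, landmarks, radius=200):
    """Build an Overpass QL query that finds spots where each landmark
    type in `landmarks` occurs within `radius` metres of the previous one,
    inside `bbox` = (south, west, north, east).

    `landmarks` is a list of landmark descriptions, each a list of
    (key, value) OSM tag filters.
    """
    south, west, north, east = bbox
    lines = ["[out:json][timeout:60];"]
    prev = None
    for i, tags in enumerate(landmarks):
        filt = "".join(f'["{k}"="{v}"]' for k, v in tags)
        name = f"s{i}"
        if prev is None:
            # First landmark: search the whole bounding box.
            lines.append(f"node{filt}({south},{west},{north},{east})->.{name};")
        else:
            # Later landmarks: only keep ones near the previous result set.
            lines.append(f"node(around.{prev}:{radius}){filt}->.{name};")
        prev = name
    lines.append(f".{prev} out center;")
    return "\n".join(lines)


# Example: narrow a rough Manhattan bounding box down to places where a
# McDonald's, a Starbucks, and a church sit within ~200 m of each other.
query = build_overpass_query(
    (40.70, -74.02, 40.80, -73.93),
    [
        [("amenity", "fast_food"), ("brand", "McDonald's")],
        [("amenity", "cafe"), ("brand", "Starbucks")],
        [("amenity", "place_of_worship")],
    ],
)
# POST `query` to an Overpass endpoint such as
# https://overpass-api.de/api/interpreter to actually run it.
```

Each `around` step intersects the candidate set with the next landmark's neighborhood, so a handful of visible landmarks can shrink an entire city to a few candidate blocks.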

[–] skydivekingair@lemmy.world 26 points 9 months ago

This isn’t unique to AI; like most AI programs, it’s just doing the same thing faster and at a larger scale. Personally, I think if you want privacy you should limit what you post to what you’re okay with being out there, and form habits such as waiting until you're home from vacation to post pictures.

[–] afraid_of_zombies@lemmy.world 16 points 9 months ago (1 children)

Yes, and people like me have continued to point out that this problem stems from a flawed view of the expectation of privacy.

A non-famous person has a reasonable expectation of privacy on public property. If you take a photo and a non-famous person's face is in it, you should have written consent for only that photo or blur it out. If Disney can own an image of a mouse for 95 fucking years I can own my own image.

Don't take pictures of people or their property without consent. Just because technology allows you to be a disgusting creep doesn't mean you should. If you want jerk off material just use the internet like the rest of us.

[–] DessertStorms@kbin.social 10 points 9 months ago (1 children)

If you want jerk off material just use the internet like the rest of us.

The kind of thing this can be used for is about ten stages past jerking off, and into stalker territory. A person already using the internet for jerking off can now pinpoint exactly where the person they're jerking off to lives, potentially turn up at their house, and escalate from there. This is beyond just creepy (and exploitative, in the case of corporations using the info); it's potentially putting lives at risk.

[–] afraid_of_zombies@lemmy.world 0 points 9 months ago (1 children)

Ok I don't know what I am supposed to do about that. Let's just work on the problem we can solve for now.

[–] DessertStorms@kbin.social 3 points 9 months ago

I never asked you to do anything? Just pointing out that things are much more serious than your comment makes out. I also don't see how what you said is a problem we can solve now and okay to focus on, but what I added somehow isn't.

[–] CaptainBasculin@lemmy.ml 14 points 9 months ago (1 children)

New geoguessr cheat just dropped

[–] AFKBRBChocolate@lemmy.world 10 points 9 months ago

To get that kind of accuracy from a student project with such a small sample set is pretty remarkable and pretty frightening. Yes, there are people who are good at this, but (1) this AI just beat one of the most skilled humans and (2) having it in an AI brings the capability to anyone, regardless of their motives.

Plus, with an AI you can incorporate more heuristics than any human could reasonably master. The article mentions types of foliage, which is a good example. An AI could incorporate thousands of things like that easily. Seems like a tool that's ripe for abuse, but I don't know what you could do about it.

[–] superminerJG@lemmy.world 7 points 9 months ago

so they turned rainbolttwo into an AI

[–] crsu@lemmy.world 2 points 9 months ago

Machine learning gets creepier and creepier

[–] autotldr@lemmings.world 1 points 9 months ago (1 children)

This is the best summary I could come up with:


The project, known as Predicting Image Geolocations (or PIGEON, for short) was designed by three Stanford graduate students in order to identify locations on Google Street View.

But it also could be used to expose information about individuals that they never intended to share, says Jay Stanley, a senior policy analyst at the American Civil Liberties Union who studies technology.

The system builds on a neural network program that can learn about visual images just by reading text about them, built by OpenAI, the same company that makes ChatGPT.

Rainbolt is a legend in geoguessing circles — he recently geolocated a photo of a random tree in Illinois, just for kicks — but he met his match with PIGEON.

And it guessed that a picture of the Snake River Canyon in Idaho was of the Kawarau Gorge in New Zealand (in fairness, the two landscapes look remarkably similar).

They've written a paper on their technique, co-authored with their professor, Chelsea Finn — but they've held back from making their full model publicly available, precisely because of these concerns, they say.


The original article contains 1,049 words, the summary contains 181 words. Saved 83%. I'm a bot and I'm open source!

[–] deadcade@lemmy.deadca.de 2 points 9 months ago

Saved 83%

And 100% of the quality/context.

[–] moon@lemmy.cafe 0 points 9 months ago

4chan did it first