This post was submitted on 29 Sep 2024
19 points (77.1% liked)

Technology

all 30 comments
[–] nxn@biglemmowski.win 2 points 14 minutes ago

Every passing day we delve deeper into this hole that is a cold, technology-driven world. Instead, we really should be taking the time to share our outbreaks with friends and family.

[–] gedaliyah@lemmy.world 1 points 5 minutes ago

Short answer, yes.

Finding complex patterns in noisy data is an application AI is actually well suited for. It still requires human follow-up, but human experts make mistakes in these areas as well. There is a good chance that a well-designed AI could be more accurate.
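
For anyone curious what that kind of single-purpose model looks like in practice, here's a minimal sketch: a pretrained CNN fine-tuned as a two-class image classifier. The model choice, folder layout, labels, and hyperparameters are illustrative assumptions on my part, not how any real product is built.

```python
# Minimal sketch: fine-tune a pretrained CNN as a binary "looks fine / get checked" classifier.
# All names, paths, and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Assumes a hypothetical ImageFolder layout: data/train/{healthy,suspicious}/*.jpg
train_set = datasets.ImageFolder("data/train", transform=transform)
loader = torch.utils.data.DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)  # two output classes

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```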

[–] Imgonnatrythis@sh.itjust.works 1 points 37 minutes ago

Depends on the specificity and sensitivity of the test. It would have to be damn close to the gold standard to justify. The company providing the tech would need to be heavily regulated. It could be promising tech for sex workers if sensitivity were decent, but by the time skin manifestations are present, most of these infections are fairly far along.
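
To put rough numbers on that (purely illustrative, not from the article): because the share of users photographing an actual visible STI is low, even a test that sounds accurate flags mostly healthy people.

```python
# Back-of-the-envelope check on why the accuracy bar is so high.
# Sensitivity, specificity, and prevalence below are made-up illustrative values.
def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    """Positive predictive value: P(actually infected | app says infected)."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# A 95%-sensitive, 95%-specific checker with 2% prevalence among users:
print(f"{ppv(0.95, 0.95, 0.02):.0%}")  # ~28%, i.e. roughly 7 in 10 positives are false alarms
```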

[–] Randomgal@lemmy.ca 10 points 3 hours ago (1 children)

Honestly? I've leaked pics of those voluntarily, so, curiously enough, I'd be a-okay with this one.

[–] Nougat@fedia.io 7 points 3 hours ago (1 children)

... leaked ...

Well, there's your problem.

[–] Randomgal@lemmy.ca 5 points 3 hours ago (1 children)

No no. There's no problem. That's what I'm saying Lol.

[–] Nougat@fedia.io 9 points 3 hours ago

I was trying to do an "it's not supposed to leak, that's probably an STI" joke.

[–] aisteru@lemmy.aisteru.ch 50 points 6 hours ago (1 children)

Honestly? Before the AI craze, I'd have said yes, because I believe AIs tailored to do one specific thing can outperform humans. Today? I'd rather not, as I could not let go of the thought that it might be some shitty model quickly put together by the nephew of the CEO...

[–] AbidanYre@lemmy.world 1 points 31 minutes ago (1 children)

Equally likely, they're collecting data for their porn-generating AI bot.

[–] gsfraley@lemmy.world 3 points 21 minutes ago

😬 I'm not sure how I'd feel about porn generated from a data set of potential STIs.

[–] pixeltree@lemmy.blahaj.zone 24 points 5 hours ago (1 children)

Would I trust the accuracy of the output? No, but it might be a decent warning to get tested to make sure. Would I trust a company with pictures of my genitals attached to my identity? Certainly not an AI company.

[–] SkaveRat@discuss.tchncs.de 13 points 4 hours ago (1 children)

but it might be a decent warning to get tested to make sure

Just show "better get checked by a professional" as the only result. No AI needed.

[–] slacktoid@lemmy.ml 4 points 4 hours ago (1 children)

Great app idea to get pics of genitals!

[–] SkaveRat@discuss.tchncs.de 3 points 4 hours ago (1 children)

Just create a Twitter account with a model as the avatar and you'll get the same, with a small chance of fewer diseased pics.

[–] slacktoid@lemmy.ml 1 points 3 hours ago

Hilarious and depressing

[–] solrize@lemmy.world 5 points 4 hours ago

I dunno, maybe the diagnosis is fine, but the companies that run it are sure to save copies. I can just see the data breaches now: "5 million stolen dick pics uploaded to the dark web". Complete with labelling of which ones are diseased, though, so that's a help.

[–] Num10ck@lemmy.world 11 points 5 hours ago (1 children)

Just post your junk on Bluesky and crowdsource it.

[–] simplejack@lemmy.world 2 points 1 hour ago

Twitter is mostly verified dicks these days. That might be the better platform.

[–] pennomi@lemmy.world 13 points 5 hours ago

Locally run AI, yes. Hosted AI, no.

[–] kokesh@lemmy.world 2 points 3 hours ago

You just need Ann to check those. Ask Joe from Sewage.

[–] Death_Equity@lemmy.world 7 points 5 hours ago

I wouldn't trust it to tell me if something is or isn't a banana.

[–] lemmyng@lemmy.ca 5 points 5 hours ago

AI trained to do that job? Sure, yeah. LLM AI? Fuck no.

[–] SatansMaggotyCumFart@lemmy.world 8 points 6 hours ago (2 children)

I’d welcome it.

I could probably teach it a thing or two.

[–] Grimy@lemmy.world 4 points 5 hours ago (1 children)

And love.

Like that movie where Joaquin Phoenix gives Scarlett Johansson an STI.

[–] rigatti@lemmy.world 3 points 5 hours ago

Would you teach it how to make creepy comments on the internet?

[–] hendrik@palaver.p3x.de 2 points 4 hours ago* (last edited 4 hours ago)

I don't think the app in the picture is driven by AI. It seems like a catalogue of questions, probably to assess the situation by some standard procedure. I'd trust that. Regarding the AI apps mentioned below: I wouldn't trust them at all. If my private parts start itching and I can't make sense of it, I'd go to the doctor, at least if it's serious. Or use Dr. Google if it's not too bad.

[–] UraniumBlazer@lemm.ee 1 points 4 hours ago

If it is approved by the FDA? HECK YEAH BAYBEEE

[–] sunzu2@thebrainbin.org 1 points 5 hours ago

Only if it is hosted by Google.