this post was submitted on 07 Mar 2024
178 points (97.3% liked)


Employers are letting artificial intelligence conduct job interviews. Candidates are trying to beat the system.

"And when they got on the phone, Ty assumed the recruiter, who introduced herself as Jaime, was human. But things got robotic."

[–] deweydecibel@lemmy.world 134 points 8 months ago (5 children)

And when they got on the phone, Ty assumed the recruiter, who introduced herself as Jaime, was human. But things got robotic.

If regulators are trying to come up with AI regulations, this is where you start.

It should be a law that no LLM/"AI" is allowed to pass itself off as human. They must always state, up front, what they are. No exceptions.
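
One way such a disclosure rule could be enforced is at the application layer rather than in the model's prompt, so the bot cannot "forget" to disclose or be talked out of it. A minimal sketch, assuming a hypothetical `RecruiterBot` wrapper (the class, the wording, and "Acme Corp" are invented for illustration; only the name "Jaime" comes from the article):

```python
# Hypothetical sketch: an AI recruiter bot that always discloses what it is
# before any conversation content, independent of what the language model says.

DISCLOSURE = (
    "Hi, I'm Jaime, an automated AI recruiting assistant for Acme Corp. "
    "I am not a human. A human recruiter will review anything we discuss."
)

class RecruiterBot:
    def __init__(self, model_reply):
        # model_reply: a callable that turns a candidate message into a reply
        # (e.g. a wrapper around an LLM API); injected so this sketch stays
        # self-contained and runnable.
        self.model_reply = model_reply
        self.disclosed = False

    def respond(self, candidate_message: str) -> str:
        # The disclosure is prepended by application code, not generated by
        # the model, so prompt tricks can't suppress it.
        if not self.disclosed:
            self.disclosed = True
            return DISCLOSURE + "\n\n" + self.model_reply(candidate_message)
        return self.model_reply(candidate_message)

# Usage with a stand-in model:
bot = RecruiterBot(lambda msg: f"Thanks! Could you tell me more about: {msg}")
print(bot.respond("I'm interested in the data analyst role."))
```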

[–] admin@lemmy.my-box.dev 35 points 8 months ago (1 children)

If I'm not mistaken, this is one of the core tenets of the EU AI act.

[–] maynarkh@feddit.nl 10 points 8 months ago (1 children)

Since the GDPR, companies have been required to give you a detailed breakdown of why an AI rejected you if the final decision rests with the AI. I'm not sure how many companies are complying, though; it's hard to enforce.

[–] admin@lemmy.my-box.dev 7 points 8 months ago* (last edited 8 months ago) (1 children)

Huh? GDPR is about your rights to your personal data, not the algorithms that act upon them. And the EU AI act has not been put into law yet, AFAIK.

[–] maynarkh@feddit.nl 11 points 8 months ago

Article 22 GDPR:

The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her. [...]

There is a carve-out if the decision "is necessary for entering into, or performance of, a contract between the data subject and a data controller", though nobody seems sure what that means, and it has not been tested in court.

[–] PM_Your_Nudes_Please@lemmy.world 14 points 8 months ago

I would argue that AI also shouldn’t be allowed to make legally binding decisions, like deciding who to hire. Since a computer can’t be held accountable for its decisions, there’s nothing stopping it from blatantly discriminating.

[–] flumph@programming.dev 10 points 8 months ago

It should be illegal to use an AI in the hiring process if it can't accurately explain its decisions. There's too much risk of bias in the training data to empower a black-box system. ChatGPT can lie, so anything powered by it is out.
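
For contrast, here is what "explaining its decisions" can look like with an interpretable model instead of a black box: a plain linear score over hand-defined features can report exactly how much each feature pushed a candidate toward rejection, the kind of breakdown mentioned further up in the thread. A minimal sketch, with feature names and weights invented purely for illustration (not taken from any real screening system):

```python
# Hypothetical sketch: a linear screening score whose per-feature
# contributions can be reported back to a rejected candidate.

import math

# Invented feature weights, for illustration only.
WEIGHTS = {
    "years_experience": 0.40,
    "relevant_skills_matched": 0.90,
    "assessment_score": 0.05,
}
BIAS = -4.0
THRESHOLD = 0.5

def screen(candidate: dict) -> tuple[bool, dict]:
    """Return (advance?, per-feature contribution breakdown)."""
    contributions = {
        name: WEIGHTS[name] * candidate.get(name, 0.0) for name in WEIGHTS
    }
    score = 1 / (1 + math.exp(-(BIAS + sum(contributions.values()))))
    return score >= THRESHOLD, contributions

advance, breakdown = screen(
    {"years_experience": 2, "relevant_skills_matched": 1, "assessment_score": 30}
)
print("advance:", advance)
for feature, contribution in sorted(breakdown.items(), key=lambda kv: kv[1]):
    print(f"  {feature}: {contribution:+.2f}")
```

With an LLM in the loop there is no equivalent guarantee: a post-hoc explanation generated by the model may simply be made up, which is the point about ChatGPT being able to lie.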

[–] NoRodent@lemmy.world 9 points 8 months ago

They also should not harm a human being or, through inaction, allow a human being to come to harm.

[–] Fiivemacs@lemmy.ca 3 points 8 months ago

Yes. I assume anyone from a company is a bot right out of the gate.