[–] Septimaeus@infosec.pub 2 points 10 months ago

I hadn’t heard about the dispute in Germany, but I found some articles about it. If I’m reading correctly, I would say practices were definitely not more responsible in the US, but the history of disputes here may go back a bit further, along with a slew of regulatory reforms we benefit from today: the FCRA, TILA, and BSA (1968–1970), the ECOA and FCBA (1974), the RFPA (1978), and FACTA (2003).

> For them to see things like discrimination based on names, they first would need to develop a sense to question their own prejudices.

I definitely agree that blindness to such biases is universal. I suppose automated credit decisions (based purely on scoring models) might have a better shot at eliminating the implicit biases of human agents. Even so, there is a lot of debate over here about how best to filter data that reflects those biases, and about what data is currently being ignored because of them, since any algorithm that solves these issues, in part or in whole, allows better value capture and increased revenue.
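
To make that concrete, here’s a rough sketch of what a purely score-based decision with a crude disparity check could look like. The feature names, weights, and cutoffs are all made up for illustration; no real lender’s model works exactly like this:

```python
import math
from dataclasses import dataclass

@dataclass
class Applicant:
    income: float          # annual income
    debt_ratio: float      # monthly debt payments / monthly income
    missed_payments: int   # count over the last 24 months
    name: str              # deliberately ignored by the model
    group: str             # protected attribute, kept only for auditing

def score(a: Applicant) -> float:
    """Logistic-style score computed from non-protected features only."""
    z = 0.00003 * a.income - 2.0 * a.debt_ratio - 0.8 * a.missed_payments
    return 1.0 / (1.0 + math.exp(-z))

def decide(a: Applicant, cutoff: float = 0.6) -> bool:
    return score(a) >= cutoff

def approval_rate_gap(applicants: list[Applicant]) -> float:
    """Gap in approval rates between groups -- one crude check for disparate outcomes."""
    by_group: dict[str, list[bool]] = {}
    for a in applicants:
        by_group.setdefault(a.group, []).append(decide(a))
    rates = [sum(v) / len(v) for v in by_group.values()]
    return max(rates) - min(rates)
```

The catch, of course, is that dropping the name or group from the model doesn’t remove bias baked into the historical data the weights were fit on; it only removes the most direct channel.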

> The banks don’t see customers they lose because of an overly restrictive scoring model.

Perhaps, but even if their stakeholders don’t notice or care about discrimination at all, they do take notice when a competitor scoops up a portion of the market they failed to capture due to biased or inferior analysis. After all, the original goal of credit scoring was to increase objectivity and predictive accuracy by reducing bias, for the sake of better (more profitable) lending decisions.

To your question re: the credit reporting system in the US, it works a bit differently here. There are credit bureaus (sometimes called “reporting agencies”) such as Experian, TransUnion, and Equifax (“the big three”). Historically they functioned as lending-history aggregators, but now they also develop and sell scoring models of their own. There are also companies that specialize in scoring models but rely on data from the credit bureaus, often tailoring their models to specific markets.
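
Roughly, the division of labor looks something like this (a simplified sketch; the class names and fields are mine, not any bureau’s actual schema):

```python
from dataclasses import dataclass, field

@dataclass
class Tradeline:
    """One account a lender reports to a bureau (simplified)."""
    lender: str
    opened: str          # ISO date, e.g. "2019-06-01"
    balance: float
    payment_status: str  # e.g. "current", "30d late"

@dataclass
class CreditFile:
    """A bureau's aggregated lending history for one consumer."""
    bureau: str                                          # e.g. "Experian"
    tradelines: list[Tradeline] = field(default_factory=list)
    inquiries: list[str] = field(default_factory=list)   # lenders who pulled the file

class ScoringModel:
    """A model vendor's product: consumes bureau data, emits a score.
    Real vendors tailor these to specific markets (auto, mortgage, cards)."""
    def score(self, file: CreditFile) -> int:
        late = sum(t.payment_status != "current" for t in file.tradelines)
        return max(300, 850 - 40 * late - 5 * len(file.inquiries))  # toy formula
```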

In general, if you apply for any form of credit, the lender formally submits a requisition (a “hard inquiry”) for your file (a “credit report”). Only very specific information is allowed in that report, and it must be made available to you on request (free annually, otherwise for a nominal fee). In other situations a credit score can be ordered as a “soft inquiry” without the full report, subject to its own rules and restrictions, but the lending history contained in your credit report is what banks and lenders routinely use for applications, even automated/instant credit decisions.
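
Continuing the toy classes from the sketch above, an automated/instant decision is roughly this sequence (the function names and the 660 cutoff are illustrative, not how any specific lender does it):

```python
def hard_inquiry(file: CreditFile, lender: str) -> CreditFile:
    """The lender's formal requisition: it pulls the report and is itself recorded on it."""
    file.inquiries.append(lender)
    return file

def instant_decision(file: CreditFile, lender: str,
                     model: ScoringModel, cutoff: int = 660) -> bool:
    report = hard_inquiry(file, lender)   # hard pull: full report, noted on the file
    return model.score(report) >= cutoff

# A soft inquiry (e.g. prescreening) would fetch a score without the full report
# and without being recorded as a hard pull -- so no call to hard_inquiry here.
```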