this post was submitted on 23 Oct 2023
1279 points (96.0% liked)

4chan


Greentexts, memes, everything 4chan.

founded 1 year ago
[–] [email protected] 9 points 1 year ago (2 children)

We don't need more discrimination in loan approval. A few years ago, Amazon built an AI that would look at resumes and rate how likely the candidate was to be hired. The AI trained itself to recognize female-sounding resumes (went to a women's-only college, is involved in women's organizations, doesn't use "manly" enough language) and flag them as undesirable.

https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G

[–] [email protected] 1 points 1 year ago (1 children)

Jesus christ that's dystopian

[–] [email protected] 2 points 1 year ago (2 children)

It's not so much dystopian as it is just buggy software

[–] [email protected] 2 points 1 year ago (3 children)

Ah ok. I don't know much about it, but I've heard that AI could sometimes be negative toward commonly discriminated against groups because the data that it's trained with is. (Side note: is that true? someone pls correct me if it's not). I jumped to the conclusion that this was the same thing. My bad

[–] [email protected] 3 points 1 year ago

That is both true and pivotal to this story

It's a major hurdle in some uses of AI

[–] [email protected] 3 points 1 year ago

What it did was expose just how much inherent bias there is in hiring, even from name and gender alone.

[–] [email protected] 1 points 1 year ago

An AI is only as good as its training data. If the data is biased, then the AI will have the same bias. The fact that going to a women's college was counted as a negative (rather than simply marked down as an education of unknown quality) is evidence against an idea that many in STEM (myself included) have held: that there is a lack of qualified female candidates, but not an active bias against them.
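The "biased data in, biased model out" point above can be sketched in a few lines. Everything here is made up for illustration: a toy dataset where a skill-irrelevant feature (attended a women's college) correlates with rejection in the historical labels, and a naive correlation-based scorer that dutifully learns that bias.

```python
# Sketch with hypothetical data: a naive model trained on biased hiring
# decisions reproduces the bias. Feature 0 ("women's college") says nothing
# about skill, but the historical labels penalize it anyway.
from collections import defaultdict

# Each row: ((womens_college, strong_resume), hired). Biased history:
# strong resumes from women's colleges were often rejected regardless.
training = [
    ((0, 1), 1), ((0, 1), 1), ((0, 0), 0), ((0, 0), 0),
    ((1, 1), 0), ((1, 1), 0), ((1, 1), 1), ((1, 0), 0),
]

def feature_weights(data):
    """Score each feature: hire rate when present, minus the base rate."""
    hired = defaultdict(int)
    seen = defaultdict(int)
    for features, label in data:
        for i, value in enumerate(features):
            if value:
                seen[i] += 1
                hired[i] += label
    base = sum(label for _, label in data) / len(data)
    return {i: hired[i] / seen[i] - base for i in seen}

weights = feature_weights(training)
print(weights)
# Feature 0 (women's college) ends up with a negative weight: the model
# has "learned" the historical bias, not anything about candidate quality.
```

Nothing in the model is sexist per se; it just faithfully compresses a sexist record of past decisions, which is exactly the failure mode the Amazon system hit.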

[–] [email protected] 1 points 1 year ago (1 children)

When buggy software is used by unreasonably powerful entities to practise (and defend) discrimination, that's dystopian...

[–] [email protected] 2 points 1 year ago

Except it wasn't actually launched, and they didn't defend its discrimination but rather ended the project.

[–] [email protected] 1 points 1 year ago

We don’t need it, but we’re going to get it!!!!