this post was submitted on 08 Mar 2025
803 points (98.2% liked)

Technology

[–] [email protected] 5 points 16 hours ago (2 children)

I would rather have AI deciding it than bank account balances.

[–] [email protected] 1 point 8 hours ago* (last edited 8 hours ago)

A lot of the systems we've already built are super fucked up. That's true. A lot of them were designed to fuck shit up, to be generally evil. We do that sometimes.

These systems only serve to magnify that. See, there's been a massive marketing push to call these things "artificial intelligence". They're not. They tell you it's all too complex to explain, but type something on your phone. No, really, do it. A sentence or two. Anything.

You just used the small, easily comprehensible version of a large (thing) model. The problem is that as you try to scale up the complexity, the compute required balloons, because it's essentially the same kind of algorithm your software keyboard's autocorrect uses, but with a bunch of recursion bolted on and vastly larger samples to reference every time someone hits a key.
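To make the comparison concrete: the kind of statistical next-word model the comment is describing can be sketched, at its absolute smallest, as a bigram counter. This is a toy illustration with a made-up corpus, not the actual algorithm any particular keyboard ships, but the principle (predict the statistically most frequent follower) is the same:

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; a real keyboard model trains on far more text.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word (a bigram model).
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def suggest(word):
    """Return the statistically most likely next word, or None if unseen."""
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(suggest("the"))  # "cat" — it follows "the" more often than "mat" or "fish"
```

Scaling that idea up to billions of parameters changes the cost enormously, but not the fundamental nature of the exercise: it's still correlation over whatever text you fed it.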

There are some philosophical implications to this!

See, there is no neutral. There is no such thing as a view from nowhere, which means these systems aren't neutral either. They need to be trained on something. You don't just enter axioms; that would be actual AI, and again, this isn't that. These are tools for making statistical correlations.

There's no way to do this that is 'neutral' or 'objective'. So what data do you think these tools get fed? Let's say you're a bank. Let's say you're Wells Fargo, and you want to build a large home-loan-assessment model. So you feed it all the data your institution has, going back to the day the company was founded, back in stagecoach-and-horse times.

So you have applicants' names, and house statistics, and geographic locations, and all sorts of variables to correlate and weigh in deciding who gets a home loan.

Which is great if your last name is, for example, Habsburg. Less good if your last name is, for example, Freeman. And you can try to find ways to compensate, if you want to, keeping in mind that the people who built the system may actively want to stop you. But it's possible. The catch is that these systems are very, very good at finding secret little correlations. They're fucking amazing at it. It's kind of their shit; it's the one thing they're actually good at. So you'll end up with weird new incomprehensibly cryptic markers for how to be a racist piece of shit, all of which stay inside the black box and get used to entrench historical financial bigotry.

Death is the great equalizer, but this system can be backed up indefinitely. It will not die unless somebody kills it, which could be really hard. People can learn to be less shit, at least in theory: we can have experiences off the job that wake us up to ways we used to suck. This system can't. People can be audited, but short of rebuilding the whole damn thing, you can't really do maintenance on these models. The webs of connections are too complicated, maybe on purpose, and we can't know what changing an already-trained large (whatever) model will do.

So these systems are literally incapable of being better than us. They are designed to be worse. They are designed to justify our worst impulses, to excuse the most vile shit we always wanted to do. They are forged from the Jungian shadow of our society, forged from the sins, and only the sins, of our ancestors, forged with the intent of severing our connection to material reality and forcing all people to surrender, to lay down the one weapon, truth, the great titan that has always stood between regressive agendas and their thousand-year reich.

So please stop shilling for this Neon-Genesis-Evangelion-ass fuckery.
