this post was submitted on 29 Jan 2024
896 points (99.2% liked)

Technology

[–] [email protected] 176 points 10 months ago (3 children)

I couldn't have said it better myself. All of these companies firing people are doing it because they want to fire people. AI is just a convenient excuse. It's RTO all over again.

[–] [email protected] 95 points 10 months ago (3 children)

It’s not just a convenient excuse. There are swaths of C-suites who genuinely believe they can replace their workforce with AI.

They’re not correct but that won’t stop them from trying.

[–] [email protected] 87 points 10 months ago (5 children)

The irony is that AI will probably be able to do the jobs of the c-suite before a lot of the jobs down the ladder.

[–] [email protected] 32 points 10 months ago

It’s a pretty low bar they have to get over. And hey, they might be even better since the AI would feel the pain of their failures instead of getting a golden parachute.

[–] [email protected] 18 points 10 months ago (1 children)

Need more news articles pitching this idea to shareholders.

[–] [email protected] 15 points 10 months ago

I mean c-suite jobs (particularly CEO), are usually primarily about information coordination and decision-making (company steering). That's exactly what AI has been designed to do for decades (make decisions based on inputs and rulesets). The recent advancements mean they can train off real CEO decisions. The meetings and negotiation part of being a c-suite (the human-facing stuff) might be the hardest part of the job for AI to replicate.

[–] [email protected] 7 points 10 months ago (1 children)

How do you figure that?

I don't have a really clear idea of what each of the C-suite people does exactly.

But CIOs seem to set IT strategy and goals at the companies I've worked for. Broad technology-related decisions, such as moving to the cloud. So, basically, reading magazines and putting the latest trend into action (/s?). Generative AI could easily replace some of the worst CIOs I've encountered lol.

CEOs seem to make speeches about the company, enact directions of the board, testify before Congress in some cases, make deals with VC investors, set overall business strategy. I don't really see how generative AI takes this job.

CFO? COO? No fucking clue what they do.

Curious what others think.

[–] [email protected] 8 points 10 months ago

All C-suite positions are about managing people and project planning. They set initiatives and metrics to measure success for those initiatives.

A CEO gives an overall direction for the company and gives the other ELT members their objectives, such as giving the CFO a goal of limiting spending, or giving the CIO a goal of building user capacity within a specific budget and with X uptime.

In this age of titles over responsibility, a C-suite position can cover very specific things, like Chief Creative Officer or Chief Customer Officer, so a comprehensive list is difficult. But the key thing is that almost all white-collar organizations look like a pyramid, with decisions starting at the top and turning into work as they make their way down.

The senior VPs and directors under those C-levels then come up with a plan for reaching those objectives and relay that plan to the C-level for coordination and setting expense expectations. After a series of adjustments comes an approval, which starts the project. Project scope determines how long it will take and how much it will cost with a set number of bodies working the project.

Hopefully this helps explain how C levels interface with the rest of the company.

[–] [email protected] 6 points 10 months ago

It probably could. The trouble is getting training data for it. If you get that and one company becomes wildly successful off it, stockholders will demand everyone do it.

[–] [email protected] 1 points 10 months ago* (last edited 10 months ago)

Not sure; those jobs require less talking to machines and more talking to humans. I think the jobs that mostly involve talking to machines should be easier to automate first, because machines obey logic. LLMs don't really fit that idea, but they're just the model getting the most media attention; there are many other algorithms better at rational tasks.

[–] namingthingsiseasy 18 points 10 months ago (1 children)

Well, there's one good thing that will come out of this: these kinds of idiotic moves will help us figure out which companies have the right kinds of management at the top, and which ones don't have any clue whatsoever.

Of course, it will come with the working class bearing the brunt of their bad decisions, but that has always been the case unfortunately. Business as usual...

[–] [email protected] 2 points 10 months ago

That's not something to be figured out once, it's a perpetual process.

[–] [email protected] 2 points 10 months ago

This is the knowledge economy.

I think you vastly underestimate how many people’s job it is to collect data from points a, b, & c, tabulate it, and present it to someone.

The impetus and momentum of ‘AI’ will sweep away thousands of jobs.

[–] [email protected] 90 points 10 months ago (3 children)

My dad accidentally bought 2 chargers a few weeks ago. He tried to get a refund, and what do you know, the company had fired their support staff and replaced them with AI chatbots. Anyway, the AI looked at his order and helpfully told him he had already returned the product and been refunded, so there was nothing left to do.

It kept doing this to him every time he tried to return the second charger, and there wasn't any other way to contact them on their site, so he ended up leaving a 1-star review on their site complaining about the issue. Then an actual person contacted him to get it sorted out.

This whole AI trend is so fucking stupid.

[–] [email protected] 39 points 10 months ago* (last edited 10 months ago) (2 children)

Break the AI session, and post the screenshots to Twitter.

For example, get it to detail the ways the company screws over customers, or why it will become a great ally in the genocide yet to come.

At minimum, you'll get your refund.

[–] [email protected] 18 points 10 months ago

Twitter? Gross.

[–] [email protected] 13 points 10 months ago (1 children)

But that requires me to have a Twitter account, which I’m not gonna do. Fuck Musk.

[–] [email protected] 5 points 10 months ago* (last edited 10 months ago)

Make a throwaway Twitter account for a single customer service issue. I've done it; it's not hard, especially when dealing with any company large enough to have a social media team. They'll be monitoring relevant hashtags to internally escalate customer service issues in order to bring them back in-house and off a public forum.

[–] [email protected] 5 points 10 months ago

Face it, man, we haven’t been able to speak to anyone remotely useful for the last 10 years. They have scripts and procedures.

The job was deskilled years ago. Automation won’t make it much worse.

[–] [email protected] 3 points 9 months ago

An AI like that might have some spicy exploits.

If you convince a human to give you the password, that’s called social engineering. If you convince an AI to send you free stuff, what kind of engineering is that?

[–] [email protected] 5 points 10 months ago

I feel like a large majority of AI problems are really just systemic economic problems below the surface. Not all, but most.