

It's not the first time a language/tool has been lost to the annals of the job market, e.g. VB6 or FoxPro. Previously, though, such transitions happened gradually, giving most people enough time to adapt to the changes.

I wonder what it's going to be like this time, now that the machine (with the help of humans, of course) can accomplish an otherwise risky, multi-month corporate project much faster. What happens to all those COBOL developer jobs?

Pray share your thoughts, especially if you're a COBOL professional and have more context around the implications of this announcement 🙏

[–] [email protected] 4 points 1 year ago (2 children)

Even if it only converts half of the codebase, that’s still a huge improvement.

The problem is it'll convert 100% of the code base, but (you hope) only 50% of it will actually be correct. Which 50%? That's left as an exercise for the reader. There's no human, no plan, and not necessarily any logic to how it was converted, so code like that can be very difficult to understand, and you can't ask the person who wrote it why things are a certain way.

Understanding large, complex codebases one didn't write is a difficult task even under pretty ideal conditions.
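To make that concrete, here's a hypothetical example of the kind of drift I mean (mine, nothing to do with IBM's actual output): COBOL PIC fields are fixed-point and truncate intermediate results, while a careless Java port to `double` rounds. Both versions compile, run cleanly, and disagree on the cents.

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class ConversionDrift {
    public static void main(String[] args) {
        // Legacy semantics: COMPUTE TAX = AMOUNT * RATE, with the result
        // stored in a PIC 9(7)V99 field, i.e. truncated to two decimals.
        BigDecimal amount = new BigDecimal("19.99");
        BigDecimal rate = new BigDecimal("0.0825");
        BigDecimal faithful = amount.multiply(rate)
                                    .setScale(2, RoundingMode.DOWN);

        // A plausible machine translation: binary floating point, rounded.
        double naive = Math.round(19.99 * 0.0825 * 100.0) / 100.0;

        System.out.println(faithful); // 1.64 (what the mainframe billed)
        System.out.println(naive);    // 1.65 (what the port bills)
    }
}
```

One cent per transaction, invisible to any smoke test, multiplied by however many million records the nightly batch touches.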

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago)

First, odds are only half the code is actually used, and within that half, 20% has bugs that the system design obscures. It's that 20% that tends to take the lion's share of the modernization effort.

It wasn't a bug then, even though it was already there, but it is a bug now.
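A toy illustration of that (hypothetical, not from the article): the legacy routine is off by one, and downstream code has been silently compensating for decades, so the defect stays inert until somebody "fixes" it.

```java
public class CompensatingBug {
    // Legacy behavior, faithfully ported: the count includes the header
    // record. Arguably a bug, but every caller already accounts for it.
    static int recordCount(int dataRows) {
        return dataRows + 1;
    }

    public static void main(String[] args) {
        int rows = 100;
        // Downstream code written against the quirky behavior:
        int billable = recordCount(rows) - 1; // compensates; prints 100
        System.out.println(billable);

        // If a rewrite "corrects" recordCount to return dataRows, this
        // caller now under-counts by one. The system design hid the bug;
        // the modernization is what arms it.
    }
}
```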

[–] [email protected] 0 points 1 year ago (1 children)

The problem is it’ll convert 100% of the code base

Please go read the article. They specifically say they aren't doing this.

[–] [email protected] 3 points 1 year ago (1 children)

I was speaking generally. In other words, the LLM will convert 100% of what you tell it to, but only part of the result will be correct. That's the problem.

[–] [email protected] 0 points 1 year ago (1 children)

And in this case they're not doing that:

“IBM built the Code Assistant for IBM Z to be able to mix and match COBOL and Java services,” Puri said. “If the ‘understand’ and ‘refactor’ capabilities of the system recommend that a given sub-service of the application needs to stay in COBOL, it’ll be kept that way, and the other sub-services will be transformed into Java.”

So you might feed it your COBOL code and find it only converts 40%.
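In other words, each sub-service presumably sits behind a stable boundary, so a kept-in-COBOL unit and a converted-to-Java unit stay interchangeable to their callers. Roughly this shape (purely a sketch; the names are invented and this is not IBM's API):

```java
// Common boundary both implementations satisfy.
interface InterestService {
    long monthlyInterestCents(long principalCents);
}

// Sub-service the tool recommends keeping in COBOL; callers reach it
// through whatever bridge the platform provides (not shown here).
class CobolBackedInterestService implements InterestService {
    public long monthlyInterestCents(long principalCents) {
        throw new UnsupportedOperationException("bridge to COBOL not shown");
    }
}

// Sub-service the tool converted to Java.
class JavaInterestService implements InterestService {
    public long monthlyInterestCents(long principalCents) {
        return principalCents * 5 / 1200; // 5% APR, truncating like the original
    }
}
```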

[–] [email protected] 3 points 1 year ago (1 children)

So you might feed it your COBOL code and find it only converts 40%.

I'm afraid you're completely missing my point.

The system gives you a recommendation, and that recommendation has a 50% chance of being correct.

Let's say the system recommends converting 40% of the code base.

The system converts 40% of the code base, and 50% of the converted result is correct. So only 20% of the original codebase has actually been ported correctly, and another 20% is converted but silently wrong.

50% is a random number picked out of thin air. The point is that what you end up with has a good chance of being incorrect, and all the problems I mentioned originally apply.

[–] [email protected] 1 points 1 year ago (2 children)

One would hope that IBM's selling a product that has a higher success rate than a coinflip, but the real question is long-term project cost. Given the example of a $700 million project, how much does the AI need to convert successfully before it pays for itself? (Rough numbers below.) If we end up with 20% of the original project successfully done by AI, that's massive savings.

The software is only going to get better, and despite how lucrative a COBOL career can be, we don't exactly see a sharp increase in COBOL devs coming out of schools. Either we start coming up with viable ways to move on from this language, or we admit it's too essential to ever be forgotten and mandate that every CompSci student learn it before graduating.
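Rough numbers, purely illustrative (the success rate and review overhead are assumptions, not figures from the article):

```java
public class BreakEven {
    public static void main(String[] args) {
        double manualCost = 700_000_000; // the quoted all-manual project cost
        double aiSuccess  = 0.20;        // fraction AI converts correctly (assumed)
        double reviewRate = 0.10;        // cost of validating AI output, as a
                                         // fraction of total project cost (assumed)

        double residual = manualCost * (1 - aiSuccess); // still done by hand
        double savings  = manualCost - residual - manualCost * reviewRate;

        System.out.printf("Net savings: $%,.0f%n", savings); // $70,000,000
    }
}
```

Even a modest success rate pays for a lot of tooling; the catch, as said above, is that "converted successfully" is exactly the part that's expensive to verify.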

[–] [email protected] 2 points 1 year ago

One would hope that IBM’s selling a product that has a higher success rate than a coinflip

Again, my point really doesn't have anything to do with specific percentages. The point is that if some percentage of it is broken, you aren't going to know exactly which parts. Sure, some problems might be obvious, but some might be very rare edge cases.

If 99% of my program works, the remaining 1% might be enough to make the program not only useless but actively harmful.

Evaluating which parts are broken is also not easy. I mean, if there were already someone who understood the whole system intimately, you wouldn't really need to rely on AI to port it.

Anyway, I'm not saying it's impossible, or necessarily not going to be worth it, just that it's not an easy thing to turn into an overall net benefit. Issues like "some 1-in-100,000 edge case didn't get handled correctly" are also very hard to quantify: you don't know about those problems in advance, they aren't apparent, and the effects can be subtle and surface much later (one common way to hunt for them is sketched below).

Kind of like burning petroleum. Free energy, sounds great! Just as long as you don't count all the side effects of extracting, refining, and burning it.
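For what it's worth, the standard way to probe this (my sketch, not anything IBM described) is differential testing: run the legacy routine and its converted replacement on the same inputs and flag disagreements. It tells you that something diverges, but a 1-in-100,000 edge case still needs a lot of lucky draws to surface.

```java
import java.util.Random;

public class DifferentialTest {
    // Stand-ins for the two implementations under comparison.
    static long legacy(long cents)    { return cents * 825 / 10_000; }               // truncates
    static long converted(long cents) { return Math.round(cents * 825 / 10_000.0); } // rounds

    public static void main(String[] args) {
        Random rng = new Random(42);
        for (int i = 0; i < 1_000_000; i++) {
            long input = rng.nextInt(10_000_000) + 1;
            if (legacy(input) != converted(input)) {
                System.out.printf("Mismatch at %d: legacy=%d converted=%d%n",
                        input, legacy(input), converted(input));
                return;
            }
        }
        // Passing a million random inputs is evidence, not proof.
        System.out.println("No mismatch found in 1,000,000 samples");
    }
}
```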

[–] [email protected] 1 points 1 year ago

A random outcome isn't flipping a coin; it's rolling dice.