COBOL is the Asbestos of Programming Languages
Posted by Interesting_Pack_483@reddit | programming | View on Reddit | 261 comments
Salamok@reddit
People always underestimate how much it costs to change something in IT.
jbergens@reddit
Yes and no. I know some banks where I live did spend a lot of money to rewrite their old systems. It didn't really work, but they were prepared to pay for it.
My guess is that there were no good specifications, and COBOL is often written without unit tests.
HumansNeedNotApply1@reddit
It's not just the money, but also time. Someone may want to do it, but the work is complex and time-consuming, and while that's happening you're still having to maintain the old system, which also impacts the rewrite attempt. So they spend millions for a few months and give up.
edgmnt_net@reddit
And if you pile up enough random crap, there's little that can rescue you. The base assumption here is that the rewrite must be a drop in replacement, which is also tricky. But it becomes more reasonable if you shift the burden to the business side, i.e. what makes sense and what can we rework/drop? Essentially, being willing to start from a blank slate.
lonelypenguin20@reddit
was unit testing even a defined concept back then...?
jbergens@reddit
No, not when they started to build the systems. But they were very slow to adapt it, not sure they ever did.
I worked with an integration to a Cobol system years ago and they took a LONG time testing things.
Prestigious_Boat_386@reddit
The crazy thing is that we've had COBOL-to-C transpilers for a while. You'd think that being able to convert all of the code would let someone do it, but apparently that's not the issue.
Ok-Scheme-913@reddit
What would be the benefit of the same code being in C? That's not at all more maintainable.
Prestigious_Boat_386@reddit
C has 5 more coders than COBOL does.
Ameisen@reddit
Transpiled code is generally less readable than the original. Often significantly so.
foxsimile@reddit
People always underestimate the cost to make it in the first place.
edgmnt_net@reddit
People underestimate the cost to make it and maintain it. Framed as such, it's a tremendous amount of debt once you consider that you may need to migrate the system and keep it updated. That little feature you're making for a customer may have costs in excess of mid-term subscription revenues, over time.
praptak@reddit
You don't rewrite the huge scary programs consisting of millions of lines of legacy code. This almost inevitably leads to failure. The way to keep them going is to find a way to let them run unmodified, then gradually improve the integration "at the edges".
Here's a success story of this approach where the big scary blob was an incredibly complex game written in Power Basic where some pretty strong companies with big budgets tried to rewrite it and failed:
https://www.wallstreetraider.com/story.html
_A_Nun_Mouse_@reddit
Just in time for its release. I swear, the marketing is getting subliminal.
donald_cheese@reddit
Thanks for sharing. I really enjoyed reading that.
jeebus87@reddit
The analogy works better than most people think. The problem isn't that COBOL is bad at what it does, it's that the people who understood the systems are retiring faster than the systems can be replaced. Same as asbestos, it's not dangerous sitting there undisturbed, but the second you need to touch it you need a specialist.
programming-ModTeam@reddit
No content written mostly by an LLM. If you don't want to write it, we don't want to read it.
Physical-Compote4594@reddit
Exactly
Interesting_Level943@reddit
High Salary Recruitment: We need to solve the risk control issues related to the batch production and login of the APP.
TomKavees@reddit
COBOL itself isn't the problem, the environment around it is.
A decent programmer could learn the language relatively fast, but to actually do anything they would also have to learn a fair bit of JCL, detangle 30+ years of spaghetti code and relationships between programs/macros, the bizarro architecture of mainframes (at least from the perspective of somebody who has only seen x86), and probably learn a bit of mainframe assembly and C (because of course people were mixing languages).
IME the only real way out is to apply the constrictor pattern and rewrite the system piece by piece looking at the observable behavior, not specific language constructs... which takes time and costs money, which businesses don't want to spend, which is how we got into this mess in the first place.
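The piece-by-piece rewrite described above is usually known as the strangler fig pattern. A minimal Python sketch of the idea, with hypothetical function names standing in for real legacy and replacement services:

```python
# Minimal strangler-fig sketch: a facade routes each business function to
# its new implementation once that piece has been migrated and verified
# against the legacy system's observable behavior, and falls back to the
# legacy call otherwise. All names here are hypothetical stand-ins.

def legacy_get_balance(account_id: str) -> int:
    # stand-in for a call into the old COBOL/mainframe system
    return 100

def new_get_balance(account_id: str) -> int:
    # rewritten service, expected to match the legacy output exactly
    return 100

MIGRATED = {"get_balance"}  # grows one function at a time

ROUTES = {"get_balance": (legacy_get_balance, new_get_balance)}

def call(func_name: str, *args):
    legacy_fn, new_fn = ROUTES[func_name]
    return (new_fn if func_name in MIGRATED else legacy_fn)(*args)
```

Cutting a function over is then a one-line change, and rolling back is equally cheap.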
PerkyPangolin@reddit
Every bank in a nutshell: let's rewrite (part of) the COBOL/DB2 chimera in X.
3-5 years later: we now have both the old and the new system that are wildly incompatible and everybody's forced to use both. And they both still break.
But now there's a new ~~kid on the block~~ CTO in charge and they like Go/Rust/Zig/TS/Kotlin/Haskell/whatever, so let's rewrite in that. Who needs legacy? Oh, and let's hire a team of juniors and let them at it with no oversight. How hard can it be? It's just banking and cross-country regulations.
Acceptable_Pear_6802@reddit
More like: 10 years later: X got deprecated, COBOL/DB2 still going strong
recycled_ideas@reddit
Except it's not still going strong. It's still going, but banking has new requirements that it didn't have in the 70s when this shit was written, and integrating those requirements into an ancient COBOL codebase is a disaster.
Pretending that the answer to this problem is just to keep the Cobol code running is delusional. Eventually the banks are just going to have to accept some risk and actually turn off the old modules. The strangler pattern doesn't work if you don't kill off the old tree.
MichiganDogJudge@reddit
Our COBOL Medicaid Management Information System was working well for almost 40 years. Then management decided that meeting HIPAA requirements would need a major rewrite, and elected to implement it in Java Enterprise Edition.
recycled_ideas@reddit
If it didn't meet HIPAA requirements it wasn't working well.
This is the problem.
If you have a working COBOL implementation AND YOUR REQUIREMENTS HAVE NOT CHANGED your COBOL implementation is fine. If they have changed, it's extremely unlikely that the existing code can be safely modified.
Haster@reddit
The bank is going to have to accept some risks? You mind if we use your bank first?
recycled_ideas@reddit
The bank continuing to run two incompatible systems is a much bigger risk for you as a customer and the bank would be responsible for these risks.
The basic problem is that the banks won't take any risk and so they're running both systems.
frog51@reddit
Banks are 100% about taking risks. That's literally the point of them. So much effort is put in to decide how much risk should be taken, how much must be taken and how much is actually being taken... A challenge as all three of these are different numbers :-)
recycled_ideas@reddit
Banks are about minimising risks, that's their whole schtick. They're extremely conservative in almost every way. Sometimes they get a little high on some novel financial instrument, but mostly they take as little risk as possible.
frog51@reddit
Just not true. Minimising risk is not profitable. Managing risk, yes, but not minimising it.
recycled_ideas@reddit
Banks are profitable because their business model is profitable. You "borrow" money from customers or the reserve bank and then you lend that money to other customers at a higher interest rate than you borrowed it for.
These days you can even charge the people who are lending you the money for the privilege.
Banks don't take risks if they can possibly avoid it.
smeyn@reddit
I've worked on a good number of such rewrites. If you are lucky, you will have completed release 1 of 3 (or more) releases and spent the budget for everything. You now have a new release that does everything the old version did, but now it has bugs. That was always the end.
The only time I have seen successful modernisation was when the old version (which was bug-free) was retained as the system of reference and outfitted with an API that exposed the business functions (which in 3270 systems tend to be pretty simple: search, read and update). Then the new functionality was built around it. I have been involved in a few of these, and the speed of delivery is so much better and the cost so much less.
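The system-of-reference setup amounts to a parallel run: the old system stays authoritative while the new implementation is exercised in shadow mode and any divergence is recorded rather than shipped. A hedged Python sketch; all names are hypothetical:

```python
# Parallel-run sketch: the old system remains the system of reference,
# while the rewritten function is exercised in shadow mode and any
# divergence is recorded instead of shipped. Hypothetical names throughout.

mismatches = []

def old_search(query: str) -> list:
    # stand-in for the 3270-era "search" business function
    return ["alice", "bob"]

def new_search(query: str) -> list:
    # rewritten version still under validation
    return ["alice", "bob"]

def search(query: str) -> list:
    reference = old_search(query)   # authoritative answer
    candidate = new_search(query)   # shadow call; result is never returned
    if candidate != reference:
        mismatches.append((query, reference, candidate))
    return reference
```

Once the mismatch log stays empty for long enough, the shadow implementation can be promoted.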
PerkyPangolin@reddit
As long as the CTO responsible for X got the golden parachute, does it matter? Nobody's there long enough to mourn X anyway. Now it's all about Y.
workShrimp@reddit
These days the kids at work laugh at the crappy 6 months old LLM.
The hype continues, and the perfect solution to all problems is still crap a year later, just as it used to be.
Shawnj2@reddit
There are plenty of languages that have actually stood the test of time and could feasibly replace COBOL; C# and C++ are relatively uncontroversial picks for a garbage-collected and a non-garbage-collected language. The issue is really that rewriting everything is a non-starter.
constant_void@reddit
LLM is a perfect use-case for COBOL replacement, tbh.
PerkyPangolin@reddit
Where are all the LLM-based rewrite success stories from banking?
constant_void@reddit
You mean where are all the stories of successful ransomware attacks because COBOL is still being used?
Oh my, quite a question, all these archaic systems lacking encryption and hardening, indeed. And edge cases about how it could be done are a lot different from the vast majority where it is not done.
How many $B have banks paid out? Odd these stories aren't public, you are correct.
As banks don't report on failures, how many are reporting on successes?
Feel free to google ...
Zestyclose_Ad8420@reddit
From the banks I've seen, it was never the COBOL/mainframe and not even the CICS stuff that was the issue. Can you share some links?
constant_void@reddit
IB using AI to quit MF/COBOL like the bad heroin habit that it is? All of them, I'd guess. Feel free to poke around - who is trying to get out of COBOL?
Everyone.
Who is trying to stay? Nobody.
We're already at the tipping point where COBOL is shipped off mainframes into cloud-based emulators... MF processing isn't all that... not these days.
Has IBM screwed the pooch? Nobody talks about why these platforms are so awful. Few can admit why it sucks so bad. Instead, people tend to point at the one thing they do... that, oh by the way, other things can also do.
Does google use mainframes? No?
Why?
Few to NOBODY talks about the staggering cost to operate a Cobol / mainframe platform, because most people who post online, on a good day, are operators at best, far removed from the stack of invoices, or the limits these systems have.
The dirty secret is in the T+C. Hint: It ain't nothing.
Compare the monthly outlay to cloud or even bare metal. HA isn't cheap, I'll grant anyone that. In many ways, some of what these systems do isn't IMPORTANT enough to 86.
I'm used to the downvotes...it is vital that people don't romanticize COBOL or Mainframes. They represent IBM's vertical stranglehold on a market so deep, IBM choked itself out.
Without IBM, would we have Gnu and the FOSS movement? Who knows.
Zestyclose_Ad8420@reddit
Kid, I asked you about ransomware attacks caused by COBOL/mainframes, and I got a rant about Google not using mainframes. Do you have links about ransomware attacks happening because of COBOL/mainframes?
constant_void@reddit
of course I do.
First, if you can't remember, or aren't tracking, the HUGE hits, that is a YOU problem...if you are in or of that industry...I am worried for it.
Second, why do you keep pressing...what am I, your personal search engine? You can't be bothered to look yourself? Go digging yourself - find out how many businesses are actively trying to put out their dumpster fire of a legacy, and how many systems got hax0red to the b0ne. Hint: it's not small guys and fries, it's the BIG HITTERS and SHITTERS.
if you aren't bothered by how vulnerable the stack is, don't see how big a problem it really is, then that is a you problem.
Third - if you don't know how to search, or can't figure it out - if you need help, just holler here - I will help you, for all to see ... or even better, try putting in the minimal amount of effort and share your results here.
Zestyclose_Ad8420@reddit
post the links.
jefflance10@reddit
Wrong, wrong, wrong. Mainframes are more secure than open systems.
constant_void@reddit
LOL. how's that cleartext password treating you...SSL, what's that?
oh wait, you meant running linux on a mainframe...in which case....
A more correct phrase would be - running inherently insecure devices on an isolated LAN that is highly and heavily firewalled is the only sane way to proceed ... yes.
but that isn't because your device is a boat anchor of technical debt aka a mainframe - that is security DESPITE still using a mainframe.
A $25 Raspberry Pi is more secure.
jefflance10@reddit
RACF is all I will say. All RBAC security is based upon it. You seem young and ready to damn anything you don't understand. What we have today is so because of mainframes and other big iron.
constant_void@reddit
😄 Few are willing to enumerate the deep platform weaknesses; many focus only on the marginal benefits that have been outstripped by modern solutions. It's ok... I'm used to it. And for its time, it was good enough.
Zestyclose_Ad8420@reddit
LoL. You don't know what you're talking about, kid.
PerkyPangolin@reddit
Not sure where you got the idea that I was defending COBOL from. And I don't think your comment answered my question.
constant_void@reddit
Try googling
There are your success stories
LLM is an accelerant, not a silver bullet.
siromega37@reddit
They don’t exist because you can’t fit the code base into a single session. We have this problem at work, and tech leaders know it but don’t want to tell anyone else because they don’t want to shut down the hype parade. Reminds me of when machine learning was going to solve all the problems circa 2015.
cake-day-on-feb-29@reddit
I think the funniest part about this is how LLMs are inherently designed to generate code/text. So in the end, any LLM system will fill its own context with its own code and begin to break down.
Abject-Kitchen3198@reddit
3 months away.
PerkyPangolin@reddit
Any sprint now.
ZBlackmore@reddit
Probably either on the way or the people involved moved to the next thing. COBOL job security is absolutely about to be gone due to LLMs.
edgmnt_net@reddit
That's largely a matter of scope creep and lack of commitment. Rewriting is less of an issue if management commits to changing things, but that usually needs to include discarding some stuff or at least rethinking it. Maybe large amounts of regulation are a legitimate part of scope, but that can be analyzed too.
This isn't really limited to banking; plenty of companies do half-assed rewrites but expect to have a shiny new system and keep all their legacy cruft without moving a finger. It also goes to show that piling up random features is expensive, so the question is "will revenue also cover future maintenance costs?". Software makes it easier to deal with complexity, but if you're just piling up incidental complexity, it's still going to be rather expensive. This and actual tech debt are basically debt in a nutshell: you're going to have to pay it eventually, and it compounds.
Jump-Zero@reddit
The will to start a language migration is infinitely stronger than the will to complete one.
DigThatData@reddit
the requisite level of effort is dropping by the minute with LLMs becoming so capable, though. I wouldn't trust one with migrating something as critical as bank software, but they're getting pretty capable, and translation is what motivated the dominant paradigm to begin with.
josephjnk@reddit
I want this as a cross-stitch on my wall. For system migrations in general, not just cross-language ones.
Shogobg@reddit
The will to complete is there; it's just that management said we should ship new features as soon as the previous one is half working, and no one gets promoted for good code, just for new projects.
constant_void@reddit
LLM's are a decent rescue.
The problem with COBOL is that it split declaration from instantiation from logic, so changing data structures on day 1, much less day 101, is a daunting task. Where do you declare? Where do you instantiate? Where do you utilize? It's mission impossible, the multiple touch points spiraling into regression cycles.
As a result, data structures and procedures resemble less what they are named and are more generic entities that contain data. A name field might be a name to one procedure, a description in another, and two pass-through dates in a third, with context-sensitive meaning and logic spiraling beyond human comprehension.
But an LLM that ruthlessly tracks that FRANK-BANK-DATE-TIME-STR contains the color RED when the current date and time is between 0100-0200 ET, and is thus a signal to halt JCL, when in other procedures it's the ACCOUNTING-YEAR that gets printed on an invoice, or check, or US Treasury extract... no biggie.
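The context-sensitive field described here is what COBOL's REDEFINES clause makes routine: one region of storage, read differently by different procedures. A rough Python illustration of the same hazard; the layout and names are invented:

```python
# One shared 8-byte record field, read differently by different
# "procedures" - the hazard COBOL's REDEFINES clause makes routine.
# Layout and names are invented for illustration.

raw = b"20260101"  # stored once; its meaning depends on the reader

def as_accounting_year(field: bytes) -> str:
    # procedure A treats the first four bytes as a year
    return field[:4].decode("ascii")

def as_invoice_date(field: bytes) -> str:
    # procedure B treats the whole field as YYYYMMDD
    return f"{field[:4].decode()}-{field[4:6].decode()}-{field[6:8].decode()}"

def as_halt_signal(field: bytes) -> bool:
    # procedure C treats two bytes as an unrelated control flag
    return field[6:8] == b"01"
```

Nothing in the storage itself records which interpretation is intended, which is exactly why tracing every reader matters.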
asoap@reddit
I'll second that an LLM can be useful, especially with spaghetti code. It's really good at analyzing code and explaining how it works.
verrius@reddit
They're very good at confidently explaining what's going on. It's a fucking roulette wheel whether they're right or not, though, and usually you're using it because you don't have the skills to determine one way or another.
asoap@reddit
I dunno. My experience has been a lot different.
Like, if you know your code base, it's pretty easy to determine when it's wrong. I rarely see that, though.
Like last week I was building caching into something and had the model look at all of the ways this thing changes. It was really good at pointing that out and even figured out some things that weren't on my mind but that I needed to take into account. I find they work best when taking the time and planning things out. Then make it show you all of the proposed code changes so that you can scrutinize them. That's where I have the most friction, where it will do things I don't like.
SessionIndependent17@reddit
Well, there's the rub. The devs one might hire to replace such a legacy system regularly _don't_ understand the existing code base or the domain well enough to do such an analysis competently. It's not that they are necessarily incapable, but that they aren't given the time to do so before being expected to "produce". That's something that's largely independent of the language in which something may be built.
asoap@reddit
That's kinda what I'm getting at. Like, if you're working on a code base you know, it's easy to see where it screwed up. Again, it's rare that I've seen it screw up when looking at code. But that's really an issue for writing code.
It's still a good tool to use to understand code. You can do a prompt like "The user does this, what other parts of code does that touch" and it will go map it out really well and summarize everything. You can then go in and follow those traces and verify what the model told you.
dumasymptote@reddit
Me whenever I want to start a new project vs finish one.
Abject-Kitchen3198@reddit
Like most things in life
rrrrarelyused@reddit
So true
CoderDevo@reddit
They don't pick languages. CTOs pick their favorite vendors and integrators who dictate the rest.
MouleFrites78@reddit
The new leader trying to make their mark by building from scratch instead of fixing the old is so true.
constant_void@reddit
COBOL truly is 100% of the problem. The unmanageable verbosity is a feature, not a bug, led AND fed by the desire to increase billable hours for IBM... or other vertical integrators/contractors; which means that COBOL itself is impossible to maintain... even in the 70s and 80s.
A great deal of COBOL does very little. However, because it was so hard to update, people took shortcuts everywhere, and when those people died, were fired, or retired, that tribal knowledge went with them.
But, bit by bit, one can replace mainframe systems with modern equivalents that run more cheaply, process data more quickly and are far easier to sustain and maintain.
warhead71@reddit
I would call that bullsh@t - other languages are/would be replaced after decades of use too, since they can't be maintained either. The main problem (bar none) is that no one knows how these large legacy systems should work. Change it to Java and it becomes even more unreadable, and people still don't know how it should work. You'd need to rewrite it like it's 1965 - but in 1965, bankers and the like actually knew what a system should do. Now they barely know the basics, and tons of business logic have been added.
constant_void@reddit
C code written at Bell Labs in the 70s is in use today on every modern computing device because the language is understandable and portable.
COBOL code is PURE BARF because the grammar is BARF SQUARED. I state this as fact - the sun is the center around which Earth orbits; gravity is 32 ft per second per second; COBOL grammar is BARF SQUARED.
The OP is spot on by calling COBOL asbestos.
imo, one doesn't need to travel back in time. A lot of the stuff is smelly, but its underwear is just dirty - it's very hacky, like long division, which can be super scary when looked at at a macro level but elementary when sliced up. One can all but guarantee huge swaths of a given code base haven't seen execution in decades.
LLM is a perfect fit for 'WTF does THIS do and WHEN would it execute' -> sometimes the answer is never, if there is a conditional that says "if today is before 1980, do these 20 things".
Remember, this stuff is all hierarchical - it's by its nature obtuse, but also relatively simple to turn on and off, like a breaker or a mains switch.
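The "never executes" case is easy to picture; a toy Python version of the kind of stale date guard legacy code accumulates (the cutoff and the actions are invented):

```python
from datetime import date

# A stale guard of the kind legacy code accumulates: on any modern run
# the first branch is dead, yet it still has to be read and maintained.
# The cutoff and the actions are invented for illustration.

def legacy_step(today: date) -> str:
    if today < date(1980, 1, 1):
        return "do the 20 archaic things"  # unreachable after 1979
    return "normal processing"
```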
LIGHTNINGBOLT23@reddit
Pre-ANSI (actual first edition K&R style) C code running anywhere is extremely unlikely now.
constant_void@reddit
Fair - a bit of hyperbole on my end. I am pretty sure I saw, somewhere back in the usenet days, that there was a lineage and some Bell UNIX -> GNU code "exists", but unlikely in 1:1 form, or so trivial as to be of little consequence.
Def C from the 80s compiles very well with but a nudge...and is maintained by the masses.
LIGHTNINGBOLT23@reddit
GCC for a long time has remained somewhat compatible via the `-traditional` flag. There's not really a single "pre-ANSI C" (hence why the C89 standard exists), so it's hard to tell. That said, the differences between ANSI C and all the C code before it aren't too big. It's mostly simple changes.
constant_void@reddit
bringing me back to the dialect days, when ANSI wasn't quite ANSI
Sun C was very nice; nicer from my POV than Intel, MSFT, etc.
now, to the OP's claim... if they REALLY wanted to start a rage-baiting flame war, repost the EXACT article, but with C++ instead of COBOL lol
WillieLee@reddit
Hyperbole seems to be the only angle you have.
PerkyPangolin@reddit
LOL. C was a different language before C89. Stuff from the 70s wouldn't compile today.
InsaneOstrich@reddit
You can't honestly believe that COBOL is easier to read than Java or any other C derived language. It's not even close
warhead71@reddit
No - that’s really not my point or what I am writing. Old Java or the like just becomes unmanageable - 3rd-party problems/patches/upgrades, whatever - and I am not being hyperbolic - this is what has happened over the last decades.
ptoki@reddit
No, very much no.
Cobol is no problem. AT ALL.
The problem is that the business logic is either unknown to anyone or documented poorly or exists only in the code and scattered in many modules because it spans across teams, divisions, etc.
You can write anything in COBOL because it is a really simple and straightforward language. You can't do that if you have no clue what the code is supposed to do, nobody can tell you, and you only learn it's broken at the end of the quarter - 2 months from now - when someone raises the alarm that his data is way wrong.
COBOL is the least of the problem. And the problem is not IT/coding or language.
How do I know? I have seen a lot.
constant_void@reddit
That is a fair assessment of the bigger problem.
COBOL is a problem. It is a failure of philosophy: it lowered the gate enough to let some really bad ideas in, but kept the gate too high to let just anyone in. "We were this close to greatness"
A COBOL "developer" could fail out of other platforms, be a complete dumb ass and still get something done in the IBM environment, so you have this dumbing effect.
However, the role still required some degree of technical understanding, so the business was still locked out; as the business evolved, they in turn just locked their dumber COBOL crews in closets... stole their money (resources) and then forgot just what they had asked for. In came the offshore crews, and now who is watching the watchmen?
HP...IBM...propagated the negligence and incompetence as it was too easy an income stream...get it wrong, and the finance suites get fired for breaking the unbroken, so in comes the hired guns to "fix".
IBM and COBOL aren't the ONLY guilty IT party here, just fun to pick on but there are plenty of examples where "everyman IT" wasn't. Blaming business for not doing the thing that is never done (relevant and recent documentation) doesn't solve the problem of humanity.
ptoki@reddit
I agree but have some remarks.
COBOL, Excel, Python (and a ton of other technologies) unlocked unqualified people to do impactful things.
This is two-sided. We have progress and improvements, but it also comes with the technological debt caused by morons making things which then can't be changed.
Also, the fact that business does not care about elegance, robustness, extensibility, or maintenance of what they asked for, AND they often see IT as dead weight if they try, makes COBOL the problem it is now.
BUT! again, if cobol is so awful then it should be easy to replace it! Right? Again, yes and no. Old cobol is not replaced. But nobody makes new systems in cobol.
So in the end it is being phased out. Not in the way we would initially assume.
constant_void@reddit
Hmm. You may be on to something. Mainframes are, by definition, limited.
It is this inherent limitation that makes it relatively trivial to put into a box, and use it as a calculator of sorts. so you do have a point - the limitations of COBOL and mainframe DO limit the stupid. Like - it can't get dumber, right? It takes real work to be that bad. The damage is already there, and contained.
However, COBOL's limits and the inherent insecurity of its platform are what lead it to directly contribute to MASSIVE breaches. And these limits, honestly, come with costs of varying degrees.
I have only ever advocated chipping away at the bad. "What piece of bad do we kill this year?" And then, before one knows it - the entire thing can be dumped into a volcano, and forgotten.
TheVenetianMask@reddit
COBOL is the Space Shuttle of programming. We are still strapping pieces of it to stuff in 2026 to get people to the Moon.
ILikeBumblebees@reddit
Bad analogy, since the Space Shuttle has been out of service for 15 years.
TheVenetianMask@reddit
Artemis II launched on a frankenshuttle rocket.
appmanga@reddit
If this were true, I never would have had a career. COBOL is not impossible to maintain, and I'm not sure where you got the idea that much of "billions of lines of code" does very little.
constant_void@reddit
Hey, congratulations. Work is work, and I am not going to demean people who are a product of a time, or who have carved out a niche of that time.
However - I am sure you know as well I do, that there are two sources:
1) COBOL is very redundant and verbose, with repetition - much COBOL doesn't actually "do" anything, as the grammar itself promotes SLOC explosion. The below one-liner simply does not exist:
let person = find_person("u/appmanga")
2) In aging mainframe systems - as in, all of them - ancient and out-of-date capabilities are simply turned off. That code still counts, as it is not removed, since the removal of the code isn't worth the cost. Think of data transport which is simply farmed out to other systems, with tunnels crafted to and from a collection of mainframe endpoints.
Look at the few US air travel systems, which do not understand special characters - a basic data construct, such as a person's name, is apparently handled outside of the travel system itself.
appmanga@reddit
People blaming COBOL, as opposed to bad code, which happens in every language I've come across, continues to amuse me. And it wouldn't have something like your code example because it's not an object-oriented language.
darkon@reddit
I've heard of an extension of COBOL that calls itself object-oriented COBOL. I've never used COBOL, much less that extension, so I know nothing else about it except the name.
shit-trapper@reddit
Not impossible. Much of the airline booking industry is likely still running on code written in 1969. I was talking to someone in upper-middle management at Unisys, and that's all they do - maintain that codebase. Still.
If you've ever seen an identifier on a ticket that starts with "R" or "PR" followed by a number chances are good that ticket and seat were booked on that COBOL codebase.
I've done some stuff in COBOL and SCOP, and years after that I was working at a place that had a large travel dept - 300 or so people. They were SABRE customers, and SABRE rolled out their new Java-based ticket booking system. It ran in Java in Internet Explorer, and it was . . . a Uniscope terminal emulator running a SCOP frontend.
SCOP was used to build database frontends for COBOL-run databases, and it's an XML-style templating language. Having seen my share of it running on Uniscope terminals, the Java window in Explorer was immediately recognizable.
So no it's not 'impossible' to maintain - just very hard. The only thing harder would be to port it to another language.
PerkyPangolin@reddit
Piecemeal rewrites sound good on paper. In practice, COBOL is still everywhere and all these rewrites seem to go nowhere - precisely due to tribal knowledge being gone.
constant_void@reddit
Mindset.
Those that succeed fare better than those that believe in, and rely on, hopelessness.
Have hope, fair traveller!
PerkyPangolin@reddit
If anything, I think it's hope that drives all those rewrites that go nowhere. That, or hubris.
constant_void@reddit
Why do you say that?
When there is a will there is a way... I have seen more COBOL decommissioned than COBOL retained. It's really not that hard to pick away at it if there is institutional will to shore up the perimeter, the core, the bridges toward the boundary... over time - decades even - what is left... is nothing.
And this is all before LLMs accelerate results... sure, it's not a straight line, nor an easy one. But giving up before one has begun -> that's the road to disruption and decline into the bottom half of the F1000.
Gambrinus@reddit
Writing COBOL was my first job out of college (in 2009, so not like ancient times) and the language itself was simple and easy to learn (though painful to write in as someone that was used to modern language constructs). Like you said though, JCL and wrapping my head around mainframe concepts is what was really the hard part. I only did this for maybe a year or so, but I don’t think I ever got the hang of JCL.
sinnsro@reddit
Honest question: what is so different in Mainframes as compared to a regular box at home?
Quiet-Dream7302@reddit
And then there's CICS pseudoconversational processing with BMS maps. The prehistoric web page emulator. It's a horror show.
yee_mon@reddit
You do sort-of have a "normal file directory", except the file names use "." to separate parts where the systems we're used to use "/" or "\". And the concept of a directory (without any files in it) does not exist. Imagine everything in your company is in one big S3 bucket. In fact, a lot of mainframe concepts that seem alien make more sense when you compare them to cloud services than PC architecture.
Much of the tooling was built around the concept of punch cards, with 80 columns per line. This is fairly obvious in COBOL where certain columns have special meanings. The various visual editors on the system were built with this in mind; the one I used most was a sort of "what if vi, but you enter commands in the sidebar instead of prefixing them with ESC".
You can't (or shouldn't) run any programs directly by invoking the executable; you write a JCL job. Their main purpose is to ensure that everything gets billed correctly and doesn't interfere with production. They are sort-of like shell scripts, but more clumsy and way more powerful. For example, in a unix shell you often use 3 file descriptors that almost every program has (input, output, error); in JCL you can define as many as you want and map them to files, and this leads to another oddity, which is that in your programs you don't access files directly, just the streams that you define in your JCL. They have a block where you can declare things like "this job is low priority and can run after office hours and should take no more than 30 minutes, bill the CPU time to the IT department, project 12345".
The programming model has various ways of accessing files; most of the time you don't use them as streams (like you do in unix), but define their record layout. They are expected to map directly to the in-memory layout of your COBOL data structures.
...which are also weird: The data types are not what you expect, at all. Strings are often not ASCII or UTF-8 but in a completely different encoding from the dark ages of computing. You don't think about this until you exchange files with a programmer in the modern world and all they see is garbage. There is a decimal floating point data type that internally stores 2 decimal digits in each byte, which some of the available CPUs can work with natively.
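The two oddities above (EBCDIC text and packed decimal) can be sketched in a few lines of Python. This is a rough illustration, not production mainframe tooling: `cp037` is one common EBCDIC code page (shops may use cp500, cp1047, etc.), and the sign-nibble conventions shown are the usual ones (0xC/0xF positive, 0xD negative).

```python
# Hypothetical helpers illustrating the mainframe data formats described above.

def decode_ebcdic(raw, codepage="cp037"):
    """Decode an EBCDIC byte string into a normal Python string.
    cp037 is one common EBCDIC code page; your shop may use another."""
    return raw.decode(codepage)

def decode_packed_decimal(raw, scale=0):
    """Decode IBM packed decimal (COMP-3): two decimal digits per byte,
    with the final nibble holding the sign (0xC/0xF = +, 0xD = -)."""
    nibbles = []
    for byte in raw:
        nibbles.append(byte >> 4)
        nibbles.append(byte & 0x0F)
    sign = nibbles.pop()          # last nibble is the sign, not a digit
    value = 0
    for digit in nibbles:
        value = value * 10 + digit
    if sign == 0x0D:
        value = -value
    return value / (10 ** scale) if scale else value

# 'HELLO' encoded in EBCDIC cp037:
print(decode_ebcdic(b"\xC8\xC5\xD3\xD3\xD6"))            # HELLO
# 12345 as packed decimal with a positive sign nibble:
print(decode_packed_decimal(b"\x12\x34\x5C"))            # 12345
# Same digits with sign nibble 0xD and one implied decimal place:
print(decode_packed_decimal(b"\x12\x34\x5D", scale=1))   # -1234.5
```

This is also why "all they see is garbage" happens: the bytes are perfectly valid data, just in an encoding and layout the receiving side never expects.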
There are hierarchical databases that feel distantly-related to some modern nosql databases (but without there being any direct links between them).
IBM's DB2 supports all of these oddities with various extensions that you won't see in any other SQL database implementation. But that is genuinely the easiest way to deal with all of this.
I'm sure there is way more but it's been more than 20 years...
Izacus@reddit
In many ways, this seems like having a locally deployed AWS Serverless service where you don't really write programs so much as deploy processing jobs that pull data from proprietary databases and put it into other databases.
ShinyHappyREM@reddit
Sounds like fixed-point BCD. The 6502 CPU, which was extremely tuned for space savings, had a decimal mode flag that switched the add and subtract instructions to BCD; there was even a patent for circuitry that allowed it to run BCD without any extra cycles.
valarauca14@reddit
Yup, a modern IBM Z mainframe supports vector/SIMD BCD.
bzbub2@reddit
hell ya that's the weird stuff
evilteach@reddit
Faster IO.
Zestyclose_Ad8420@reddit
Much faster I/O; it's offloaded to dedicated boards that have a PowerPC with 32GB of RAM on board, and they read and write directly to memory (like DMA), but they are invoked at a very low level; there's a preprocessor before the actual CPU.
Zestyclose_Ad8420@reddit
How deep do you wanna go? I've worked on mainframes, modern ones, and have gotten a fair share of understanding of them. Low level they are monsters, everything is accelerated in hardware. So you have your CPU (recent ones have Telum chips), the architecture is S390(x). Everything sits on a 128-bit addressing space, shared by multiple chips, not just the CPU.
Your application wants to read a file, at the kernel level ultimately there's a syscall to read into a piece of memory from the disk (I'm simplifying here).
In the x86 world it would be the CPU that does the heavy lifting of talking with the disk controller. On a mainframe the disk controller is a 4-core PowerPC with 32GB of RAM (usually fibre channel); the syscall doesn't even reach the CPU. There's a preprocessor that goes "this is an instruction for the disk controller, let me pass it to that PowerPC thing", and it goes out to the disk, does the read, writes the result to the correct memory address, and notifies the preprocessor that the instruction has been executed.
Do this times a thousand for everything, they still have a mathematical coprocessor and accelerators for encryption. So your 5Ghz CPU core doesn't even handle those things, it just runs your code.
RAM is in what amounts to be RAID.
The S390x architecture has instructions for the Java garbage collector.
Every instruction that actually runs on the CPU is executed on three different CPUs; if a cosmic ray flips a bit you'll catch it and still get the same result, because the other two would be identical. The one that gave a different result is put offline and an IBM guy comes in to change it, and yes, it's hot swappable, the CPU...
jimgagnon@reddit
Mainframes are communication monsters. Built before the internet or any kind of general purpose network, they had to solve the problem of communicating with thousands of mostly dumb terminals all over the world.
Mainframes came from a batch oriented processing world. The article referenced the lack of parameterization in COBOL, so the mainframe JCL (Job Control Language) attempts to solve that problem through file name mapping, a real batch processing throwback.
Also, mainframe hardware and software were often the first to attempt to solve computing problems. This oftentimes leads to non-optimal solutions.
Gambrinus@reddit
It’s hard to remember specifics now, but I think it was just that everything seemed so foreign. The IBM mainframe world evolved completely separately from the Unix world, so a lot of the Unix concepts you’re used to just don’t exist. Like I don’t even remember having access to a normal file directory, and you had to use a proprietary IBM text editor.
liltingly@reddit
Silly question, but wouldn't this be a type of task LLMs might eventually be good at? Enumerating and stress testing behaviors at scale? Or is this work smaller in nature and so a "blast it from all angles and see what shape emerges" approach wouldn't work?
Patriark@reddit
A guy I know literally flies around the world to fix 40-year-old COBOL code. Quite a young bloke, but the kind of guy who buys computer relics and hacks on them for fun. Brilliant dude and was making a fortune long before he turned 25.
jmerlinb@reddit
when COBOL gets mentioned in this sub, I swear I always read a story like this in the comments haha
jaynoj@reddit
I love these cobol post comment threads!
jmerlinb@reddit
good on you bro! yeah never worked with COBOL either - currently working on contracts though!
warhead71@reddit
Probably knows assembler and is also a system programmer - most COBOL programmers only do programming. System programmers are really hard to find - and are often beyond retirement age.
ali-hussain@reddit
Huh? People are still writing compilers and operating systems, and making new processors and other hardware that needs drivers. Lots of system programmers around.
warhead71@reddit
Not that many know IMS/CICS beyond what a programmer needs - and it costs a lot to educate one - never mind getting experience if it’s legacy systems that only get changed on a need-to basis.
LIGHTNINGBOLT23@reddit
They probably mean someone who is familiar with mainframes, not just any systems programmer. That's quite rare these days.
Patriark@reddit
Yeah, correct. He is a computer wizard. Can program quite well in a lot of languages.
Smartest dude I ever met.
PTSDaway@reddit
Can confirm the same for FORTRAN. It is stupid easy to get good at compared to low level languages and demand for experts is only going up - at a monumental rate right now.
chat-lu@reddit
In the last bank I worked at, it’s the first thing they decommissioned. They compiled the old COBOL code to .NET bytecode. I heard of other banks doing the same thing to Java bytecode.
If you manage that step, not only is hosting that system much easier, but you can start replacing it piece by piece without having two completely distinct systems running.
Marwheel@reddit
Or, i think the bizzarro aspect would apply to someone who only had worked with microcomputer architectures (Which applies also to the modern ARM world too…).
Vanhooger@reddit
Constrictor pattern is way cooler than constructor pattern, I would use it
shitty_mcfucklestick@reddit
I thought Big Balls was gonna rewrite the social security system in like a month
netgizmo@reddit
All languages have the same tangled mess surrounding the language, tooling, OS and the system(s) architecture(s) it ends up executing on.
The Cobol dev has the same issue going to JS/web stack as the JS dev has going to Cobol/Mainframe.
patrixxxx@reddit
Can confirm. Have been working professionally with Assembler, COBOL, C/C++, Lotus Notes, .NET/C#, Java and currently JS/Node React.
All have pros and cons, but I don't regret having been curious and open to switch tech despite the hard work involved with getting into a new ecosystem.
KagakuNinja@reddit
Or have an AI agent rewrite the code for you! What could go wrong?
AlSweigart@reddit
Nothing, as long as you add "No mistakes" to the prompt.
chucker23n@reddit
Now you have two codebases nobody understands!
AlSweigart@reddit
Programming as Theory Building is a 1985 essay by Peter Naur (of Backus-Naur form) that has been making the rounds again because of vibe coding. Chris Neugebauer gave a talk on this at North Bay Python 2026 a few weeks ago, and it's worth a watch.
Inevitable_Eagle2130@reddit
This is the only thing I’ve seen that works. We stopped looking at the code and decided to let go of the need to handle 100% of edge cases.
Gnome_0@reddit
also don't forget there is a spreadsheet somewhere that accounts for a 1-cent imbalance like every 3 months
emperorOfTheUniverse@reddit
Yup, cobol is easy. The reason it persists is because of how much it was used. It made programming a lot easier in its time. So a lot of stuff got made, and then features upon features were stacked on top of that stuff, and at every stack additional business logic was added and rarely documented well.
Can anyone think of something that currently makes programming easy and increases the amount of code that exists?
Document, version control, and manage the prompts y'all.
appmanga@reddit
From my experience, I don't think there's a necessity to know either Assembler or C to program in COBOL. Mainframe languages and tools like JCL, DB2 or Oracle SQL, and mainframe utilities to perform sorts and manage/manipulate files might be needed, but understanding the MVS/OS architecture at the very highest level is basically a nice-to-have.
TripleFreeErr@reddit
also: Archived and under documented standards for the systems being supported.
jonahbenton@reddit
I hate pieces like this. Yes, it is/was good for what it does and it is very brittle and hard to change BUT it is not killing anyone.
I think also cobol is a good counter metaphor for much of the "code doesn't matter anymore when the llms write it" narratives. That's the "code is now a black box" story and all we have to do is change the entry and exit and behavior specification conditions around the box if we need the box to have new internal characteristics or behavior profiles. Well- that's where things are with cobol. And it is still bloody difficult and incredibly risky to change those boxes in any automatic fashion.
GregBahm@reddit
This is the most interesting point of contention. Because "erasing old COBOL" systems seems like a prime AI scenario here in the year 2026. One of the clearest scenarios out of all the AI scenarios that exist.
If you asked me "here's an ancient cobol system. Here's the entry and exit behavior we have and here's the entry and exit behavior we want," I would feel quite frightened of this task.
The reason I'd be frightened is because the GOTO statements lead to very cryptic side effects that could have catastrophic consequences.
I'm confident I could rewrite the COBOL system to have no side effects, by leveraging the abundant computation availability and sacrificing the now-useless hyper-optimization. But the old system probably has a bunch of unintended side effects that I'd actually need to preserve. And these cryptic side effects wouldn't be in the official entry and exit behavior, but their removal could break some other system for some unknown reason.
If there's a different reason that refactoring COBOL is hard, I'm not seeing it. The side effects are the "deadly asbestos." They're the whole reason we don't program with GOTO anymore.
But I don't see that risk with AI code. With AI code, I can demand encapsulation, and enforce it with levels of certainty that were completely infeasible 50 years ago. Once encapsulation is achieved, if the "black box" needs to be changed, I'm free to just throw out the old black box completely and generate a new one. That's the central value proposition of encapsulation. It's great!
The only risk is if I wanted to abandon encapsulation in favor of hyper-optimization or something. In that scenario, I think I'd be a good programmer by saying "Let's just not do that." It's easier to add more hardware than rewrite spaghetti code. Maybe there's some scenario where the spaghetti code approach is more appropriate (like perhaps in some exotic new device.)
But even then, I'd be very suspicious that the programmer wants to write a bunch of hyper-optimized spaghetti code by hand, simply out of programmer vanity. This used to be a big source of pride among programmers, and it will be a while before programmers start taking pride in intelligently arranging "many AI generated black boxes."
appmanga@reddit
It's a sign of incompetence that any COBOL developer used GOTO after 1990.
constant_void@reddit
COBOL was on its way out as an in-demand skill in 1990...that is part of the problem.
Talent follows demand (salary), so what remained was somewhere between unskilled labor and problematic employees who bounced out of the 'upward' path (web, remember b2b and all that) and latched onto whatever path they could.
Maybe they dug in and developed skill, but far more likely ... it didn't happen. In-house labor was often farmed out to consultant groups, who may have had different standards from original authors.
The ship looks less like one of the Theseus variety (the best case) and more like Homer Simpson's dream car, an odd grab bag / aggregation of philosophies and effort that might not have all worked together all the time.
COBOL is not taught in schools today because it is unhelpful, clutters the mind and offers little insight into the abstract nature of computation - this is because of the grammar of the language and the very platform itself.
appmanga@reddit
That is far from true. Why do people just say stuff that is simply not true, including:
"Unhelpful", but still being the basis of many active applications that many find helpful, including banking. In fact, that whole sentence appears to be plagiarized, but, these days, you have folks who think if they had the idea and AI gives them the answer, it's their work.
SessionIndependent17@reddit
Guy vibe-wrote those comments same as he imagines he'd vibe code bank transaction processing replacements.
constant_void@reddit
Life is vibe code, man
appmanga@reddit
How 2025. Good on you.
bendermcbender2@reddit
Yeah and the really bleak part is now every dude with a prompt thinks he can replace that mess with an even buggier black box.
constant_void@reddit
What is your truth?
In 1990 - could you find a gig? For a while, yes. But the COBOL demand and its pay scale was on the downward trend. It was SO easy to poach good people who could see the writing on the wall. People who retired out in 1989, some just weren't replaced. And that trend only accelerated.
Capital investments in TCP/IP-based stacks (MIT, Stanford, various supercomputer centers) spiked up in the late eighties and nineties...preceding the HTTP revolution.
Let us know ... what did you see?
appmanga@reddit
I saw huge numbers of people being hired to do COBOL programming and lots of consultants and consulting firms working projects, and this was years before Y2K. I even led a team of six people for four years from 2000 - 2004. My last assignment using COBOL was less than five years ago, and I'd worked in the language since 1984. So, no, not only wasn't the language "on the way out", it was the one primarily being used by businesses of all types. The huge desire to migrate away from it has happened in the last 20 years in places where I worked in favor of languages like VB, Java, C#, and even T-SQL.
That's my truth.
mindfulnessman14@reddit
I think that’s exactly the trap, because the scary part is not rewriting the logic cleanly, it’s all the undocumented weirdness around it that some other system quietly depends on, and AI does not magically know any of that.
GregBahm@reddit
There are two different scenarios here getting mixed up.
Scenario A:
"You're going to replace this ancient COBOL system with any new system."
"Okay. I'm probably going to break something when we push the changes though. Anyone who guarantees everything will work perfectly from the drop is lying."
AI doesn't really change the game here. Maybe in theory a "superintelligent" AI can guarantee everything will work perfectly in a way a human can't guarantee, but such a "superintelligent" AI doesn't exist yet (unless Claude's Mythic lives up to its own hype, which is doubtful.)
Scenario B:
"We're going to use AI to code up a new system. The team is going to be lazy about reviewing all the AIs code, and instead just let it be a black box."
"Okay. The team can let the AI be a black box as long as the team diligently ensures the black box is an encapsulated black box. Architect the black box in a way such that its inputs and outputs are locked down, through various modern architectural conventions like containerization."
In this scenario, the AI team isn't at risk of the COBOL scenario, unless they fuck up the encapsulation. That will be the work for the human engineers. If the human engineers do their work, the risk of black box AI is mitigated because the black box becomes trivially disposable and replaceable in the event of a problem.
pcbeard@reddit
To tell the truth, the COBOL of today is JavaScript. It is ubiquitous, difficult to maintain past a certain line count, and used inappropriately far too often (e.g. Node.js for giant applications). Easy to learn but a poor fit for most domains.
I refuse to let LLMs write JavaScript, because I like to read code too. And maintain it.
Agent_03@reddit
It's actually a REALLY GOOD metaphor, because asbestos is harmless in buildings as long as it isn't disturbed. There's a truly ridiculous amount of asbestos out there in old buildings, and often the best course of action is just to leave it in place and avoid exposing it.
The problem with asbestos comes about when someone breaks into the asbestos insulation in old buildings, such as during routine maintenance or removal. Once the insulation is broken and stirred up, the fibers become tiny airborne fragments that drift everywhere like deadly glitter and cause serious health problems or cancer years after inhalation.
COBOL is harmless as long as you never have to disturb it... such as making changes or replacing it.
(I agree that the challenges with COBOL really show the falsehood of some of the overly optimistic claims about LLMs.)
constant_void@reddit
COBOL is a disaster of a language. That's the reason to abandon it.
tms10000@reddit
I'm a COBOL developer. Pieces like this make me laugh. AMA.
pyabo@reddit
It feels like this story gets dusted off once every couple years. I can find at least half a dozen "COBOL programmers will be in huge demand!" articles from the last ten years (half of them on wired.com), but then you go to Indeed and search for COBOL jobs and there are... let's see... oh, exactly 0 hits.
Kaspur78@reddit
I just went to Indeed. Pages and pages of COBOL jobs.
pyabo@reddit
Huh? Go to Indeed.com, search for "COBOL".... I don't see a single instance of the word "COBOL" in search results. Unless you are getting wildly different search results than me, which I suppose is possible, you are just incorrect here.
Kaspur78@reddit
I know how to search the internet and sites. I even switched countries using VPN, to make sure my results are the same from different locations. I searched the USA site and my own country:
First result: Mainframe Developer (COBOL) at CAI. Second result: IBM Mainframe Programmer DOS/VSE (from the page: The application development languages supported include COBOL)
pyabo@reddit
What's your source country? Not seeing the same results.
Kaspur78@reddit
NL. I can find dozens of COBOL jobs there too. But, switching to the US site gives me even more jobs, like the 2 I mentioned.
IAmAnAnonymousCoward@reddit
Are you from the Philippines?
me_again@reddit
"Of the 300 billion lines of code that had been written by the year 2000, 80 percent of them were in COBOL" - sounds wild to me. Anyone aware of a source?
jet_heller@reddit
And I read, "it takes 4 times as many lines of code to write the same logic in COBOL". Which, based on what I've seen is 100% true.
Salamok@reddit
To be fair though cobol has a very rigid syntax that supports readability. I'd rather try to figure out what 1000 lines of cobol does than 250 lines of perl.
pcbeard@reddit
In fairness to Perl, that’s true of almost any other language != Perl.
gimpwiz@reddit
Unlike the other guy, I like perl plenty. Fine language. Excellent at dealing with large amounts of text. Good for sysadmin type stuff.
Nothing says perl needs to be hard to read. Unfortunately some people delight in making it so, or are just wildly uncaring, and the language allows a lot of hilariously intractable stuff. But good maintainable perl is just fine, same as any other language written by someone who gives a shit.
monedula@reddit
Absolutely. I think I would go beyond saying the language allows unreadable stuff - it unfortunately seems to encourage it. But I totally agree that it is not difficult to write readable perl.
gimpwiz@reddit
You're right, there are a few things the language encourages that make it hard to read, especially for people who don't use it regularly. Like putting default stuff into `$_` and then allowing you to simply skip the variable entirely for multiple uses. It's not cleaner, it's just less readable. I always use the longform for stuff like that, explicitly say the regex is for `$_`. In my experience the thing that really gets unreadable is the constant regex, so I always comment in my code what it's doing, if it's not, yknow, simple regex. Admittedly if you read it and you don't know regex at all, it won't help a ton.
fractalife@reddit
Better yet, that one line of LISP that is a functional LISP compiler.
trannus_aran@reddit
The Lisp 1.5 definitions for eval and apply that fit on one page? Or is this something else?
didzisk@reddit
(loop (print (eval (read))))
Technically true.
SirClueless@reddit
I bet a lot of programmers know the term "REPL" without knowing it's literally these four instructions read from the inside-out.
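For illustration, the same four steps translate almost word for word into Python. This is strictly a toy sketch: `eval()` handles expressions only and is unsafe on untrusted input, and the driver names here are hypothetical.

```python
# A rough Python analogue of (loop (print (eval (read)))) -- Read, Eval,
# Print, Loop, read from the inside out. Toy only: eval() is unsafe on
# untrusted input and only handles expressions.

def repl(read_line, print_line):
    while True:                            # Loop
        try:
            line = read_line()             # Read
        except (EOFError, StopIteration):  # out of input: leave the loop
            break
        print_line(repr(eval(line)))       # Eval, then Print

# Drive it with canned input instead of stdin so the behavior is visible:
inputs = iter(["1 + 1", "sorted([3, 1, 2])"])
outputs = []
repl(lambda: next(inputs), outputs.append)
print(outputs)  # ['2', '[1, 2, 3]']
```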
Abject-Kitchen3198@reddit
I still dream about learning it and achieving singularity through LISP code.
jet_heller@reddit
Yea. You can't pick the worst offender of language syntax suckiness as your goalpost.
Try python, java or C#.
Salamok@reddit
When the article is comparing something to asbestos I assumed we were looking for the worst thing. Today Perl may seem esoteric, but there was a period of time when it was frequently referred to as the glue that stitched together the internet.
sshwifty@reddit
My first ETL job was Perl.
Nightmares to this day lol.
Larry Wall is super nice though.
tankerdudeucsc@reddit
Perl, the write only language. Seriously…. Yowza.
captainrv@reddit
As a former Perl coder, I couldn't agree more. Eventually one learns to read and troubleshoot perl code, but let's just say it's a character building experience.
I changed to Python a dozen years ago and never looked back.
pohart@reddit
I have always found the verbosity of Cobol makes it very hard to understand what it's doing. You can't see what's happening without reading each line, at least I can't.
I think perl isn't a reasonable comparison because even at its peak it was widely recognized as fit only for small or single-use scripts.
Yes we wrote more in it, but mostly understood that it was risky and bad practice.
constant_void@reddit
yes, perl is far more readable. excellent point.
Smallpaul@reddit
Perl is an extreme case.
KokopelliOnABike@reddit
Hello World is about 70 lines of code, kidding.
Not just JCL, also DCL for VAX/VMS machines.
I also learned assembly and C to support systems wrapped around COBOL back in the late 90s.
COBOL still stands as one of the most readable languages along with the most spaghetti code I've ever had to read, noodle out and patch.
yee_mon@reddit
COBOL is only readable if you need to figure out a single line. It's very hard to see very much of what you are looking at because each one of those lines does very little, and at least older dialects had only global variables and very short names. I remember lots of times looking at code that in python would have been `actual_comments_not_needed()` that instead said `CALL P-103-CLC-BSNS-VAL` (it gets much worse when you are passing variables around because you define the structure several screens away twice and then use a global variable to pass your data around).
It is basically impossible to not write spaghetti code in it.
SplendidPunkinButter@reddit
Tech reporting: It takes 4 times as many lines of code to write the same logic in COBOL, which is clearly bad.
Also tech reporting: AI produces LOTS of lines of code really fast, which is clearly good.
jet_heller@reddit
And, every single time I've seen ANYTHING written by AI my comments are immediately: My god, WTF happened here? Did AI write this shit.
Sooooo, yea, if that's your judgement, then cobol is clearly lots of bad code.
gilgoomesh@reddit
This claim comes from a Gartner study in the 1990s trying to estimate how many lines of code would need to be audited for the Y2K bug.
me_again@reddit
Thanks! A much more interesting article than the original one 😄
asdasci@reddit
Journalists and numbers pulled out of (someone else's) ass (to maintain "journalistic integrity"). Name a more iconic duo.
Nervous-Cockroach541@reddit
If you knew COBOL you would understand more. Let's just say it's a very verbose language.
WeirdIndividualGuy@reddit
No source because no one was actively tracking such data back then. Those numbers are all just guesses out the ass
Character-Education3@reddit
There were a lot of rewrites and updates to systems that used COBOL right before 1/1/00.
Far-Dragonfly7240@reddit
So, how well does Rust (or whatever) handle 36 digit decimal arithmetic with about 14 different kinds of rounding? Just asking.
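It's a fair question: fixed-point decimal is native in COBOL, while most modern languages reach for a library. As one sketch of what "wide decimal plus explicit rounding" looks like elsewhere, Python's standard `decimal` module handles 36 significant digits and eight rounding modes (still short of the ~14 the comment mentions):

```python
# Sketch: 36-digit decimal arithmetic with selectable rounding, using
# Python's standard decimal module (8 rounding modes, not 14).
from decimal import Decimal, getcontext, ROUND_HALF_UP, ROUND_HALF_EVEN

getcontext().prec = 36  # 36 significant digits for arithmetic

a = Decimal("123456789012345678901234567890.123456")  # 36 digits
b = Decimal("0.000001")
print(a + b)  # exact to all 36 digits, no binary-float drift

# The rounding mode matters right at the boundary:
half = Decimal("2.5")
print(half.quantize(Decimal("1"), rounding=ROUND_HALF_UP))    # 3
print(half.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN))  # 2 (banker's)
```

Rust has no built-in decimal type, so the honest answer there is "with a third-party crate, and you'd have to check its rounding-mode coverage against what your COBOL code actually uses."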
frog51@reddit
I'm still surprised at the hatred COBOL gets. Having worked my way up over the last 50-odd years from writing machine-code, assembler, C++ and many many others, I still far prefer COBOL to a lot of other languages because it is so readable, easy to understand and review, and stable.
And of course if you are good with COBOL the charge-out rate for consultancy is many times that for other languages :-)
Conscious_Support176@reddit
The charge-out rate is higher because COBOL is unreadable. I shudder at the memory of fixing code where the programmer clearly did not understand what they wrote, because you need to keep too much context in your head to understand what a line of COBOL is really doing.
It’s essentially a verbose assembly language for a very powerful notional CPU.
frog51@reddit
The rate is higher because all the original folks are dead...
Conscious_Support176@reddit
I guess in another few years, the rates for Smalltalk, V and SQL will start shooting up.
lefty_is_so_good@reddit
except nobody died from being exposed to cobol
appmanga@reddit
And that paragraph is the bottom line. Agnostically, COBOL is no worse than any other programming language, all of which can be (and are) badly used by people who do programming, as opposed to being real programmers. There was a consulting company, very well-known at the time, that would take anyone with any kind of Bachelors degree and turn them into COBOL "programmers" in six weeks. These folks were responsible for hundreds of millions of these billions of lines of code that the author decries, and offshoring made the issues even worse. Enterprises who ignorantly hired these consulting firms and consultants in order to throw the huge number of bodies they wouldn't, or couldn't, actually hire deserve more of the blame for the bad code than the language does.
With that in mind, you'd think the entities that still rely on COBOL would be more judicious about whom they bring in to maintain and develop it, but no, so the lessons that led to the trainwreck go unheeded. When you hire $25/hour talent, you get what you pay for.
I studied the language and became an expert in it, and it remains the only true expertise I have despite having some proficiency and competence with other languages. There's a reason why COBOL-based applications continue to live on: the language is a very good fit for what it does -- heavy batch and "real-time" processing. And it tends to work with many types of front-end solutions that make the back-end transparent to the user.
chucker23n@reddit
That's… not the reason.
warhead71@reddit
COBOL is still alive because it still works. The environment barely evolves and the code has a minimum of libraries/dependencies/3rd-party products.
A COBOL program checking that account info is valid and updating it on a mainframe can run without changes for decades.
I find COBOL to be more like a dumb Duplo brick than asbestos. Since it works, companies usually find other more urgent things to do than replacing COBOL.
chucker23n@reddit
That is evidently true, but it does not follow that this is a good approach to new software architecture, just that it's good enough to keep it that way for another decade.
Exactly.
Which is my point. I don't agree that it's "a very good fit for what it does". It just happens to get the job done.
appmanga@reddit
Then I guess it'll be going away soon.
Intelligent-Sir8144@reddit
Have a listen (or read) https://www.npr.org/transcripts/844230915
pyabo@reddit
I just got zero hits on Indeed.com for "COBOL", so this isn't looking like a gold mine just yet.
appmanga@reddit
Then the premise of the article appears to be faulty.
madman1969@reddit
Yep, COBOL and Perl are the two programming languages I never admit to knowing unless threatened with physical violence.
mikkolukas@reddit
Vibe code is the new asbestos we will face the consequences of in a few years 🙄
pjpartridge@reddit
We find ourselves in the same position again.
I have serious doubts that all COBOL code can simply be replaced. There is clearly a real issue: experienced COBOL developers are retiring, and too few new people are being brought in to replace them.
However, the companies facing this problem also bear some responsibility. In many cases, their hiring criteria are simply unrealistic.
They say they urgently need COBOL developers because the current generation is retiring, yet they continue to ask for 20+ years of COBOL experience. That approach makes little sense.
A capable developer can learn a programming language. The greater challenge is usually understanding the tooling, the environment, and the business domain. Unfortunately, many companies fail to recognise this.
It is similar to rejecting a skilled painter because they have never worked with green paint before. Most people would immediately see how unreasonable that is.
YourLizardOverlord@reddit
Sure, and COBOL isn't the most difficult, apart from the abomination that's CICS. I'm not sure that many developers would want to take on an ecosystem that doesn't have much future though.
BobQuixote@reddit
I'm shocked it's gone this long without them adapting - higher pay, paid workshops to learn COBOL, etc. Any programmer would have told them how to handle this.
MSMSMS2@reddit
All the tech bros can show how great their products are - Claude Mythos should be able to rewrite all COBOL in whatever the flavor of the moment is overnight. Since that is what they always try to claim in their sound bites.
FalseRegister@reddit
It is not the Asbestos, it is the Roman concrete
srpulga@reddit
The accomplishment I'm most proud of was shutting down a whole System z mainframe. The COBOL code was migrated to... COBOL.
The problem is not the language, it's the hundreds or thousands of edge cases in behaviour. You'll never rewrite that shit in a different language: you'll end up writing a slow as fuck COBOL emulator.
If you're a bank and want to move away from COBOL the cheapest option is to start a new bank. In fact every bank already has a plan for this, they've been planning for it for decades.
protomyth@reddit
Which COBOL did you migrate to and on what platform?
srpulga@reddit
microfocus COBOL always. To redhat, hp-ux, and even windows server.
syklemil@reddit
Though we can also see the opposite happen. There was a young, pretty popular bank around here that got bought up by a dinosaur bank. Shortly after, the dinosaur bank sought more mainframe coders, and the stuff about the young bank that had attracted people in the first place started going away. Now it just exists as a "concept by $dinosaurBank".
the_ai_wizard@reddit
sounds like a job for LLMs
BobQuixote@reddit
Specifically, a contractor team with at least one person fluent in COBOL, and the whole team familiar with how to keep LLMs productive.
I'd start on that if I didn't have projects already lined up. Learning COBOL doesn't seem like that big of an ask.
garyk1968@reddit
Stupid analogy.
Asbestos is dangerous to health if left in situ, which is why it is removed, and the removal itself is dangerous. COBOL works, and hopefully no one has ever died, or ever will, from handling it.
Article is generic "COBOL is now causing a host of problems" and "By one rough calculation, COBOL’s inefficiencies cost the US GDP..." ok, so no specifics and one rough calculation? come on.
I have no skin in the COBOL game, never used it but what a shit article.
Physical-Compote4594@reddit
Like asbestos, COBOL is really good at what it does. Also like asbestos, it is really difficult to get rid of safely. It’s a pretty perfect metaphor, and I’m jealous I didn’t think of it myself.
mats_o42@reddit
I agree but it's also very wrong at the same time
asbestos kills but it seems impossible to kill cobol
One of my customers tried. A 1990s mainframe with a custom-written LOB app. The plan was to rewrite in Java, scale it by multiple parallel workers and so on. They knew that it would be way slower per clock cycle, but 25 years of CPU development should be enough, right?
Nope. It could be scaled out to the same performance but it would cost way more than a new mainframe
brool@reddit
That's a really interesting scenario, was it written up anywhere?
OkPosition4563@reddit
I think it's a universal experience. I used to work in financial markets infrastructure and we did slice out part after part from COBOL into Java, and it was the same. The same performance could only be achieved with significantly more expensive hardware, reliability was terrible, and speed of development honestly wasn't much faster.
mats_o42@reddit
Don't think so
zxyzyxz@reddit
You're mixing up the subject and the object in the metaphor. Asbestos kills (subject is asbestos doing the killing) and Cobol can't be killed (object is Cobol). It should either be, asbestos/Cobol kills or asbestos/Cobol are killed.
Lil_slimy_woim@reddit
The original metaphor works perfectly too, because in a lot of industrial construction asbestos is still used because of how very well it works if handled properly
fallenfunk@reddit
This is the root of the problem in many cases. The government has tried to do the same but the IBM and Unisys mainframes haven’t gone anywhere. It’s still cheaper to keep what’s there than the years of development, parallel operation in transition, and all the associated switching costs for a workforce in the tens of thousands.
MooseBoys@reddit
That makes it an even better metaphor. Asbestos kills precisely because, unlike most foreign matter that gets inhaled, asbestos crystals can't be killed by white blood cells. This leads to long-lived inflammation, cytokine storms, and eventually cancer. Asbestos kills because your body can't kill it.
Plank_With_A_Nail_In@reddit
We still don't actually know the exact process by which asbestos kills.
FlukyS@reddit
Well, Java isn't all that fast either; the best choice for a COBOL replacement would be Golang, with maybe a friendlier language like Python as a backend for the juggling. The rewrite and hardware cost would be more than basic maintenance, but opening up the system for new features and having hardware closer to the average deployment make it worth it.
rrrrarelyused@reddit
It is really good. Love it
HighRising2711@reddit
It's like asbestos in that you don't want to touch it because it keeps half our infrastructure stable
I started my career migrating COBOL from mainframes and ancient mid range machines to Unix.
JCL became shell scripts, COBOL stayed as COBOL
Then I moved to java and have spent many years replacing COBOL or similar systems with Java systems of various architectures. Working IT systems are very expensive to replace, so unless there's a very good reason (regulatory requirements or licencing costs going through the roof) there's never much incentive to fund these migrations to completion, so you often end up with 80% java and 20% COBOL that's too expensive/difficult to replace
RScrewed@reddit
Is this written by a 22 year old webdev by any chance?
nitrinu@reddit
Would be just as likely if it was a rust fan tbh.
max123246@reddit
The software developer who is led by hype rather than first principles. Doesn't matter what language it is, they are everywhere.
Anytime someone says "skill issue" or "it's so simple", it tells you they have misunderstood the cataclysmic complexity of writing software that is performant, stable, correct, and is easy to change that lasts for years or even decades.
yeslikethedrink@reddit
In my experience, LLMs had already eclipsed such people by GPT-4.
Smallpaul@reddit
What about this article makes you think so?
PerkyPangolin@reddit
https://www.wired.com/author/zeb-larson/ I don't know if this is better or worse.
Substantial_Lake_542@reddit
The irony is that the people who need to replace it most are the same ones who can't afford the downtime to replace it. Banking systems running COBOL process something like $3 trillion daily — you don't just migrate that on a weekend sprint. It's less asbestos and more load-bearing wall that everyone's too scared to touch.
DensitYnz@reddit
Odd way of spelling Javascript.
pieeatingchamp@reddit
I felt the same way about our Classic ASP code, which we finally migrated around 98% of recently.
SessionIndependent17@reddit
I imagine it could be useful for writing thorough automated data-driven regression tests for the existing system, upon which one could build components to progressively replace more isolated subsystems before attempting to attack the central logic.
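A minimal sketch of what such a data-driven regression suite might look like: replay recorded inputs through the legacy system and the rewrite, and report divergences. Everything here is hypothetical; `run_legacy` and `run_rewrite` are stand-in adapters that differ only in rounding mode, which is exactly the kind of silent drift this harness is meant to catch.

```python
def run_legacy(record):
    # Stand-in for the legacy system: half-up rounding to whole cents.
    return {"cents": int(record["amount"] * 100 + 0.5)}

def run_rewrite(record):
    # Stand-in for the rewrite: accidentally uses Python's banker's rounding.
    return {"cents": round(record["amount"] * 100)}

def regression_suite(records):
    """Return every recorded input where the rewrite diverges from legacy."""
    return [r for r in records if run_legacy(r) != run_rewrite(r)]

# "Golden" inputs recorded from production traffic (illustrative values).
golden = [{"amount": 0.125}, {"amount": 0.375}, {"amount": 1.00}]
failures = regression_suite(golden)  # only the 12.5-cent case diverges
```

The point is that divergences surface as data, file by file and record by record, before anyone attempts to touch the central logic.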
Wodanaz_Odinn@reddit
COBOL is so straight forward to use, we won't need programmers any more. Business analysts will be able to do everything, really exciting!
pyabo@reddit
~~COBOL~~ Vibe coding is so straight forward to use, we won't need programmers any more. Business analysts will be able to do everything, really exciting!
protomyth@reddit
I'm having extreme CASE tools flashbacks.
MundaneWiley@reddit
COBOL the original tech debt.
clumma@reddit
Really bad rep but actually useful?
0xAERG@reddit
This is the best thing I’ve read on Reddit this week
tclbuzz@reddit
to any and all, let's find something that even matters
Chunky_cold_mandala@reddit
The real issue is 50 years of accumulated structural density and spaghetti logic.
The article rightly fears "JOBOL" (Java-flavored COBOL). When standard AI or legacy AST parsers try to migrate these systems, they do a blind 1:1 syntax swap. You don't get modern Java; you just get 1970s technical debt translated into a new language.
We have to stop treating migration as a translation problem and start treating it as a physics problem.
By bypassing the AST entirely and mapping the code's keyword regex counts, you can extract a surprisingly useful amount of info. Using K-Means clustering, you mathematically isolate the dense monolithic files from the simple pipelines. Once you untangle that structural blueprint, you can cleanly refactor the logic into modern COBOL or native Java.
COBOL isn't asbestos. It’s just heavily compacted architectural debt.
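A minimal sketch of the idea, assuming regex keyword counts and a hand-rolled 1-D k-means with k=2; the keyword list and file contents are illustrative assumptions, not a real migration tool:

```python
import re

# Keywords whose frequency serves as a crude proxy for structural density.
KEYWORDS = ("PERFORM", "GO TO", "MOVE", "IF", "CALL", "EVALUATE")

def density(source):
    """Total keyword hits in one COBOL source file."""
    return sum(len(re.findall(r"\b%s\b" % re.escape(k), source)) for k in KEYWORDS)

def split_two_clusters(scores, iters=25):
    """Tiny 1-D k-means (k=2): partition file names into (simple, dense) sets."""
    lo, hi = min(scores.values()), max(scores.values())
    for _ in range(iters):
        simple = {n for n, s in scores.items() if abs(s - lo) <= abs(s - hi)}
        dense = set(scores) - simple
        lo = sum(scores[n] for n in simple) / max(len(simple), 1)
        hi = sum(scores[n] for n in dense) / max(len(dense), 1)
    return simple, dense

# Made-up sources: one simple pipeline, one keyword-dense monolith.
files = {
    "daily_report.cbl": "MOVE WS-A TO WS-B.\nCALL 'PRINTER'.",
    "core_ledger.cbl": "IF BAL-NEG PERFORM ADJUST.\n" * 40 + "GO TO WRAP-UP.\n" * 20,
}
scores = {name: density(src) for name, src in files.items()}
simple, dense = split_two_clusters(scores)
```

The dense cluster is where the untangling effort goes first; the simple cluster can often be migrated mechanically.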
giantsparklerobot@reddit
The code isn't the core challenge. Far too many people seem to conflate COBOL the language with the operating environment in which COBOL lives.
It's the business processes, regulations, and other idiosyncrasies the code represents that is the hardest part about displacing COBOL systems. In large COBOL systems those processes are often older than any of the engineers working on the project. There's all sorts of nonsensical "π = 3" laws/regulations/contracts that don't make logical or mathematical sense but are required for proper operation/accounting. At some point in 1972 someone implemented the "π = 3" code and never documented why it was done. But the output has been correct since then.
A big rewrite has to preserve all of that idiosyncratic shit with no questions asked. Not just "π = 3" sort of things, but rounding fractional values the exact same way at every stage of the original (undocumented) code. Today's audit of yesterday's books needs to match yesterday's audit or someone(s) is getting fired or going to jail.
Only when you've got a full map of everything the old code does can you effectively port it to a new system. That takes a lot of time and money and usually outlives the will or patience of whatever executive ordered the migration. Even then you won't necessarily know why the code does everything. So you might have all your COBOL migrated to Java but still need tight control around all changes since there will be black box logic that no one understands. Understanding all those black boxes takes even more time and money to document and research.
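The rounding point is easy to illustrate. COBOL's ROUNDED phrase rounds halves away from zero, while many modern runtimes default to banker's rounding; a rewrite that silently swaps modes makes today's audit disagree with yesterday's. A hedged sketch (the amount is made up; the rounding modes are real `decimal` module constants):

```python
from decimal import Decimal, ROUND_HALF_UP, ROUND_HALF_EVEN

amount = Decimal("2.345")

# COBOL-style ROUNDED: halves go away from zero.
legacy = amount.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# Banker's rounding, the default in many modern environments: halves go to even.
modern = amount.quantize(Decimal("0.01"), rounding=ROUND_HALF_EVEN)
```

One cent of difference per transaction, compounded over decades of books, is exactly the kind of black-box behavior that has to be mapped before migration.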
rescuemod@reddit
And AI-generated code is the asbestos of the future 🫣
wnoise@reddit
Extremely useful with no replacement covering all use-cases?
pjmlp@reddit
Let's not forget C is only 12 years younger.
Kickstart68@reddit
Good luck with replacing mainframe code (whether COBOL, PL/1, etc.) with a modern language without introducing a load of new bugs (plus revealing long-standing bugs).
Fixed decimal arithmetic. Good traceability through intermediate files used in JCL, generation data sets, etc.
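Fixed decimal arithmetic is worth spelling out: COBOL PICTURE fields like `PIC 9(7)V99` hold exact decimal values, while binary floats accumulate representation error that a naive rewrite inherits. A small illustration (Python's `decimal` standing in for COBOL fixed-point):

```python
from decimal import Decimal

# Binary floating point drifts: 0.10 has no exact base-2 representation.
float_total = sum([0.10] * 3)          # 0.30000000000000004

# Fixed decimal, as in a COBOL PIC 9V99 field, stays exact.
fixed_total = sum([Decimal("0.10")] * 3, Decimal("0"))  # Decimal('0.30')
```

A ledger rebuilt on floats will not reproduce the old totals bit-for-bit, which is precisely why these migrations are so fraught.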
evilteach@reddit
indexed files.
wunderkit@reddit
I started writing COBOL in college. Also Fortran and assembler. When I was in the Air Force I was at the Joint Strategic Target Planning Staff (JSTPS) as an analyst for SIOP (Single Integrated Operational Plan, the nuclear war plan) programs. That's right, the nuclear war plan was written in COBOL (with some Fortran for the trig parts). This was done on 8 meg IBM mainframes. Don't know if COBOL is still used but it wouldn't surprise me.
captain_obvious_here@reddit
Even AIs seem to have a hard time rewriting old Cobol apps. Not that the language is hard, because it is not, but because the old mainframe architectures don't have much in common with the recent x86-derived ones.
ShacoinaBox@reddit
thanks for the input on cobol author zeb, whose career is "historian" and not actually anything to do with computers, let alone mainframes!
cobol's fine, great for what it does, does its job properly. places do not want to be shackled by IBM fees, whenever u hear "aww man we can't find cobolers in 2kXX ;( aww MAN... oh well, gotta spend $30quadrillion on a java conversion project..." simply remember that they would REALLY rather not pay IBM money for use of machines n services and it'll all make sense.
constant_void@reddit
AMEN
Headpuncher@reddit
big up the junglist massive