
I've spent some time working in finance, so I've actually worked with COBOL developers personally in multiple different roles. In all those cases they were maintaining legacy applications that ran on IBM mainframes. Why would a company choose to use COBOL if they weren't restricted to what ran on IBM's mainframe infrastructure? Serious question, not an attack.

I get that many of these legacy applications are some of the most battle-tested things in existence, and do what they do very well. I've seen them in action personally. However, I'm also under the impression that COBOL is just not that amazing compared with modern alternatives: it's not easy to write or maintain, and (as far as I understand) the things that make it 'fast' have more to do with the platform than with COBOL itself.

I'd love to know more about why someone would choose COBOL today, if anyone can fill me in.



> I'd love to know more about why someone would choose COBOL today, if anyone can fill me in.

I doubt anybody is choosing to start a new greenfield system in COBOL in 2024.

But, if you have an existing COBOL code base, and the business is asking for new features, you have two basic choices: (1) write new modules for the existing system in COBOL, or (2) write the new modules in a more mainstream language (Java, C#, whatever) and have the existing COBOL modules integrate with those new modules (e.g. using REST)

Both options have their pros and cons. If you aren't already doing (2), then (in the short-term at least) (1) is the easier path.
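Option (2) in practice often means a small REST service in a mainstream language that the COBOL side calls as an HTTP client. A minimal sketch of the non-COBOL side in Python (stdlib only; the /fx-rate endpoint and payload shape are invented purely for illustration):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class RateHandler(BaseHTTPRequestHandler):
    """New business logic lives here, in a mainstream language."""

    def do_POST(self):
        if self.path != "/fx-rate":
            self.send_error(404)
            return
        length = int(self.headers["Content-Length"])
        body = json.loads(self.rfile.read(length))
        result = {"pair": body["pair"], "rate": 1.08}  # hypothetical lookup
        payload = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):  # silence per-request logging
        pass

def start_server(port=0):
    """Start the service on a background thread; returns (server, port)."""
    server = HTTPServer(("127.0.0.1", port), RateHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]
```

The existing COBOL modules would then invoke this as an HTTP client, via whatever web-services support the particular COBOL environment provides.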


TIL, COBOL has been able to make REST calls for almost 20 years: https://stackoverflow.com/questions/52136482/how-can-i-call-...


That link is about COBOL running on IBM CICS. It doesn’t apply to COBOL in other environments (both non-CICS IBM and non-IBM)

But, pretty much every COBOL implementation can do this nowadays, even if the details differ between implementations.

COBOL can call C code, so if you have a C library that does something, you can use it from COBOL


You're right. I think I misread the original post, and assumed that since they were talking about running COBOL on more modern systems that they were using it to solve new business problems, hence my confusion.


> I think I misread the original post, and assumed that since they were talking about running COBOL on more modern systems that they were using it to solve new business problems, hence my confusion.

Sometimes, a new business problem can be addressed (whether in whole or in part) by changes or enhancements to an existing IT system; in that sense, some people use COBOL for new business problems even in 2024.


We actually rely upon two different operating environments for COBOL that do not originate from IBM.

The first is OS2200. Our final major application on this platform was completed by 1970, and it links COBOL into assembler that accesses the hierarchical DMS database. The first SMP port of UNIX was to this hardware:

https://en.m.wikipedia.org/wiki/OS_2200

The second is VMS, specifically the VAX variety. VMS bundled a COBOL compiler, which we used to write ACMS applications.

https://en.m.wikipedia.org/wiki/OpenVMS

We continue to produce new COBOL code for both.


Are you able to find COBOL programmers to hire, or is it like Rust where a lot of jobs are "know C++ and we'll train you"?


My buddy's wife has a business degree, and her first company had her spend a couple of months learning COBOL (she had never programmed before). I'm not sure where that went after the training, but apparently it's still a thing (this was only a few years ago).


We have several. Hiring is possible.

Hiring people with experience of the respective OS is more difficult.


C++ is not that similar to COBOL. COBOL is closer to Crystal Reports or SAP (ABAP). There are plenty of people with experience in those domains around today. Probably more than C++.


Is your org considering transitioning to OpenVMS on x86?


No, TDMS wasn't ported to the Alpha, so migration was never an option.


> No, TDMS wasn't ported to the Alpha, so migration was never an option.

VSI has TDMS for both Alpha and Integrity: https://vmssoftware.com/products/tdms/

I assume they'll probably come out with an x86-64 version at some point


What's the long-term plan then? Rehosting on a different OS?


> Why would a company choose to use COBOL if they weren't restricted to what ran on IBM's mainframe infrastructure?

COBOL has nothing to do with just IBM mainframes, even though that is what it mostly runs on. The second big platform that runs COBOL programmes (heh, not apps!) is OpenVMS (whatever hardware it runs on today), although the number of OpenVMS installations has seriously dwindled over the past decade.

The reason why companies no longer choose COBOL is the mostly dead ecosystem and, as a consequence of that, the lack of fresh blood on the job market.

If we imagine a parallel reality where COBOL is still thriving, many companies would almost certainly choose COBOL for new projects, if only because of its safe memory model: there is no memory corruption, no buffer overflows, no null pointer exceptions. If there are bugs in a COBOL programme, they are related either to the logic or to the data handling.

With most programming languages today, you have all of those bugs too, plus compounding memory-related issues, even in languages considered «safe» and garbage-collected. The business generally does not care about the art of fine programming unless it provides a substantial ROI (e.g. vastly reduced running and operational costs, or higher revenue), so the business would absolutely choose (or at least consider) COBOL in such a parallel reality.


COBOL is very easy to learn and write if you use it for what it was intended. If you try to make a CRUD app or read from a web service, you are going to have a bad time. Trust me, I know. It can be done, but it is not pretty and is kind of shoe-horned in.

We have been trying to get rid of all of our COBOL and get off of the mainframe (many apps have been) for the past 20+ years. We are getting closer but priorities keep changing which keeps delaying that goal.

I love doing backend and db work so I would gladly switch to Go for my new projects. But, I would still need to use DB2, which (surprise surprise) is on the mainframe.


Db2 actually implements SQL/PSM, which is more than can be said for Sybase/Microsoft SQL Server.

I'm not sure if this extends to all three Db2 variants, namely Mainframe, AS/400 derivatives, and UDB for Linux and Windows (I hope so).

https://en.m.wikipedia.org/wiki/SQL/PSM

IBM supposedly got the PSM code from EnterpriseDB/Postgres.

https://www.enterprisedb.com/news/enterprisedb-and-ibmr-coll...

https://www.internetnews.com/blog/ibm-gets-compatible-with-o...


What they got from EnterpriseDB is the Oracle (PL/SQL) compatibility. SQL/PSM was built by IBM themselves.


This is actually the first time I have heard of SQL/PSM. From the Wikipedia article, it looks like it is for stored procedures. We do not do stored procedures at all on our DB2 instances. Heck, we don't even use triggers. All our mainframe/DB2 business logic is in COBOL.


Wow! I've worked on an SMS-sending gateway where the cofounders could not resist the temptation to put the logic in stored procedures (we use Postgres, with the rest of the logic in Perl). Because of this, in recent years our installations have been limited by HDD performance, but fortunately, after the 2008 crisis, we haven't needed much scaling.


A lot of COBOL in finance runs on x86 servers running MicroFocus Cobol on RHEL.


And on Xenix earlier, and on SVR4 and SVR3.


I suspected System V, and there was definitely a lot on AIX, if only because CICS for AIX was available (maybe it still is? Paradoxically, it was easier for me to get the mainframe version running... than any AIX system...)

MicroFocus on RHEL is stuff I've seen live recently, including new development, and I think there are still companies buying new releases of COBOL programs.

Similarly, MUMPS is very much alive in the financial market, with completely new development being done, even if mostly as a small component of bigger, usually Java, apps


Interesting about MUMPS.


I cannot speak for all the COBOL implementations, but what I've seen is that a company will purchase and implement a new COTS ("Commercial Off The Shelf") back-office application such as ERP - Financials, HR, CRM, EPM, etc. Something that's boring to techies but critical bread and butter for a company.

That application may well have some COBOL in it. It's sold as new, but such big applications will have bits that were written 3 months ago and bits that were written 20 years ago.

Now the company has a shiny new application that has some COBOL.

ERPs are living applications. Legislation changes, markets change, companies' needs change, so ERPs are customized, updated, and expanded. So you need some COBOL expertise, depending on how active you want to be about it.

(note: a typical techie, including myself a decade ago, will typically see insanity in this and think something like "Payroll???? How hard can THAT be??". Like being a parent or living through a civil war, nobody can truly explain this to another human being who hasn't been through it :-)


> COBOL is just not that amazing compared with modern alternatives

To be honest, I'm new to the field of mainframes; I've only been playing with Hercules for a few weeks.

But after more than 20 years in the industry at the micro level (I've spent nearly all my life with all sorts of x86 and networking hardware, plus a few years with Macs and with microcontrollers), I must say the only thing comparable to mainframes in reliability is Erlang.

And if I take your words and just apply s/COBOL/Erlang/i, most of them become true: Erlang systems have much lower LOC counts, but it seems the cost of massive reliability is the same, namely not easy to write or maintain (it is just too different from the classic modern way of programming), and Erlang is not very fast either (experienced people say it is close to Perl).

All of this holds even though Erlang's syntax is directly derived from Prolog, nearly the best syntax in the modern world!

BTW, if you want, I'm open to talking about how to make Erlang better today, or even about possible ways to combine COBOL with Erlang on modern hardware.


BTW, you could consider me a guy from the Philippe Kahn world (I began with Turbo C and Turbo Pascal, then came Delphi, and eventually Modula, the other pillar of the reliable world).


Nobody chooses COBOL as a language for creating new systems today. It is maintenance of existing systems, primarily but not exclusively, running on IBM mainframe and midrange servers.


I understand that old programs are extremely stable, but I don't understand why they are relevant. The world changes so much, why are these programs still useful?


That's a very techie view (and I speak, largely, as one :)

Core accounting principles don't change so much. Core payroll principles don't change so much. Core finance principles don't change so much.

We have a technology-oriented view that "world changes so much", but there are domains where that's true, and domains where that's not true.

Several times in my life I had the luck to have a mentor ask me, when I propose some obvious technological change and improvement, "what's the business advantage? If I let you spend X amount of hours over Y days to accomplish technical change Z, in which way will the business be better?"

In the world of startups and ___-tech, technology stacks matter and change, and it's a whirlwind world of frameworks and languages and webapp stuff.

But in the core, back-office, cost-centre business of the company... the rules are complicated and numerous but in a way stable enough that solid stable complex software from 30 years ago still has value.


The world changes little-by-little, not all at once.

Let's say your application (100k LoC of Cobol) handles payroll. Your government introduces some new rules about how paternity leave should be handled. You have two options, rewrite the 100K lines into Java and add the new rule, or just add the new rule in the Cobol app. The latter is cheaper, at least in the long term, so that's what you do. Now you have 101k lines of Cobol.

20 years, four tax reforms, three mergers, five international expansions and fifty lawsuits later, you have migrated 100k lines of Cobol to Java, but your app grew and is now 3 million lines.


I've worked at organisations which have mainframe applications that have been running for almost 50 years. One of the funnier reasons I've heard about why these '100k lines of COBOL' programs never get rewritten is that no one really knows the full extent of what they do, and even if they do know the full extent of what they do, no one really knows the full extent of why. The business requirements that these programs cover are long since lost, but the program still remains. Even if this isn't a 100% true anecdote, it's an amusing one.


It is a 100% true anecdote at any number of businesses. It's philosophically enlightening in a way, and related to the increasingly profound adage that "the purpose of a system is what it does" (which can be applied to IT, political, social, school, and other systems through various lenses).

There are any number of systems out there which are "self-documenting" in the weirder sense that what they do is what the rules are, rather than the other way around. There's no external document or business-process ruleset that describes what they do at the level of detail and accuracy that the code does. There are myriad edge cases built in which are important and relevant and likely still true (until somebody actively decides to change some rules) but not well documented or understood. I'm working on a government payroll system, and it is mind-bogglingly more complex than I could ever have imagined; it still blows my mind every day. We have tens of thousands of time & labour rules due to hundreds of unions making thousands of negotiations over decades, on top of complex baseline payroll and taxation rules across governments and provinces and levels.

Attempts to rewrite such software almost always, almost inevitably, take much longer and cost much more, not just because of technical and architectural challenges, but because of, well, hubris: the assumption that just because we can migrate webserver code across platforms, we can migrate a payroll system across platforms as well. It turns out one is orders of magnitude more complex than the other; we are just not used to thinking that way :).


> We have tens of thousands of time & labour rules due to hundred unions making thousands of negotiations over decades, on top of complex baseline payroll and taxation rules across governments and provinces and levels.

Working in healthcare/insurance right now and I can tell it's a very similar tangle of rules about which insurance plan covers what. The mess that's US healthcare ties up at least a trillion dollars a year in servicing its complexities. While it pays my bills, it pains me that a country can self-inflict such penalties just to not have some sort of government provided universal healthcare system.


Heck, I've got code I wrote ten years ago and no idea why it was done that way except for a vague memory of a request that I implemented but was never really used.


One plausible way to rewrite such a program would be to make it more modular. It's already likely all those rules are separated in modules and it is (allegedly) possible to call routines in different languages from COBOL code. You can, then, start writing those rules in Java, Python, C, or Rust (there is a Rust for z/OS!).

OTOH, you now have a more complicated and, perhaps, more brittle business-critical application you need to tend to.


> It's already likely all those rules are separated in modules

Have you seen much COBOL code? I had some dealings with a public school budgeting and payroll system back in the 80s, and I can tell you, it was a complete mess, with each program in its own source file (except some COPYs for record layouts). Here's just a simple example:

COBOL has statements divided into paragraphs, each with a paragraph name. You can have for example, PARA-A down through PARA-D, one after the other. You can say PERFORM PARA-A and it's somewhat like calling a function, where it executes the paragraph and then comes back. Or you can say PERFORM PARA-A THROUGH PARA-C, and that's like calling a function made up of those 3 paragraphs. Or you can fall into PARA-A, and it executes the paragraphs one after the other until a GOTO occurs.

Another completely annoying thing is looping: PERFORM PARA-A VARYING IX FROM 1 BY 1 UNTIL IX > 10 is a for loop. The thing is, the code you execute is somewhere else, maybe 10 pages away. Nowadays there are probably some fancy code editors to help with these kinds of issues, but "back in the day", they were annoying as hell, and how good of a COBOL programmer you were depended on the depth of your mental execution stack.
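To make that control flow concrete, here is a loose Python analogy (the paragraph names are invented, and real COBOL semantics differ in details): paragraphs behave like functions listed in source order, and PERFORM ... THROUGH executes a contiguous slice of them before returning.

```python
# log records which paragraphs ran, in order
log = []

def para_a(): log.append("A")
def para_b(): log.append("B")
def para_c(): log.append("C")
def para_d(): log.append("D")

# Paragraphs in source order, as they would appear in the PROCEDURE DIVISION.
PARAGRAPHS = [para_a, para_b, para_c, para_d]

def perform(start, thru=None):
    """PERFORM start [THROUGH thru]: run the contiguous range, then return."""
    i = PARAGRAPHS.index(start)
    j = PARAGRAPHS.index(thru) if thru is not None else i
    for para in PARAGRAPHS[i:j + 1]:
        para()

perform(para_a)          # PERFORM PARA-A              -> runs A only
perform(para_a, para_c)  # PERFORM PARA-A THROUGH PARA-C -> runs A, B, C

# PERFORM PARA-A VARYING IX FROM 1 BY 1 UNTIL IX > 10 -> runs A ten times
for ix in range(1, 11):
    perform(para_a)
```

The "10 pages away" complaint maps onto this directly: the loop body is not where the loop is, so the reader has to go find the paragraph elsewhere in the source.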

The main advantages I think COBOL had/has is:

- no memory allocations; you spelled it all out in the DATA DIVISION.

- no various sizes of integers, floats, etc. You spelled it out with PICTUREs, like S99V99 is a signed number with 2 digits to the right and left of the decimal point. That's what it was on every machine.

- no buffer overflows; if you have PIC X(25), the thing is 25 characters, period. You try to move a PIC X(30) to a PIC X(25), and it just chops off the last 5 bytes. You move a PIC X(10) to a PIC X(25) and it pads with spaces. Very simple rules.

Combined, these fairly unique features make COBOL extremely portable.
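The fixed-size MOVE behaviour described in the last bullet can be sketched in Python (an analogy only, for alphanumeric PIC X fields; numeric PICTUREs follow different rules):

```python
def move(value: str, size: int) -> str:
    """Mimic MOVE into a PIC X(size) field: a value longer than the field
    is truncated, a shorter one is padded with trailing spaces."""
    return value[:size].ljust(size)

# MOVE a PIC X(30) value TO a PIC X(25) field: last 5 characters chopped off
assert move("A" * 30, 25) == "A" * 25
# MOVE a PIC X(10) value TO a PIC X(25) field: padded out with spaces
assert move("HELLO     ", 25) == "HELLO" + " " * 20
```

Whatever the inputs, the receiving field is always exactly `size` characters, which is why buffer overflows simply cannot happen here.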


> The world changes so much

Yes and no. Really, the developed world has very stable laws; this is one of the pillars of its power.

So new companies, like Uber, Apple (the App Store), Microsoft (when it created the Xbox), etc., surely have to create new infrastructure from scratch, because their new vision is their competitive advantage.

But for old businesses, like finance, energy, and some others, the most important thing is to use every measure to cut transaction costs, and they achieve this by reusing old information infrastructure (not only this, but it is an important part of the equation). Unfortunately for the old big businesses, their business models were created when only mainframes could withstand their scale (clouds built from micros became mature enough, I think, in the 1990s; minis were just never considered a replacement for the big machines), so big companies carry a large heritage on mainframes.

I know a few cases where a large company moved its infrastructure to the cloud, but as far as I know, they mostly prefer to stay with the old but reliable mainframes if possible.

At the end of my speech I could add that I myself could be considered a hunter from the micro world, one who hunts for people who wish to switch (to micros and clouds), so I really know how much it costs :)

But to be honest, I have romantic feelings for mainframes; they are an important part of history, and I think we should preserve them for future generations.


Most of these systems are likely software ships of Theseus.

Very few lines of code will be unmodified from their original state, but at any given point in time the changes required will only have been small.

Hence it’s always been a pragmatic choice to do small modifications rather than a ground-up reimplementation.


The old programs do their job. They work and they're stable.

But I guess you can try to change that if you want.



