It's like the Y2K bug, but twenty years on

Remember the Y2K bug? New Jersey is looking for COBOL programmers.

Funnily enough, one of our finance systems at my workplace is going through a Y2K20 update. It’s all to do with date windowing. “We won’t be using this in twenty years’ time”, they said, in 1999.

3 Likes

Something similar was mentioned at the Trump daily briefing the other day: stimulus checks were likely to be delayed because of ‘40-year-old computers’?

Certainly could be related. The “fix” for old code was to assume any two-digit year between 0 and some arbitrary pivot number is year 20xx; going beyond that number reverts the year to 19xx. So if they are issuing cheques for 2020 and the system is not expecting it, the cheques could have the wrong dates on them.
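To make the windowing idea concrete, here is a minimal sketch in Python (the systems in question are COBOL, so this is purely illustrative); the function name and the pivot value of 20 are made up for the example, since every real system chose its own cut-off.

```python
def expand_two_digit_year(yy: int, pivot: int = 20) -> int:
    """Date-windowing sketch: two-digit years at or below the pivot are
    treated as 20xx, anything above it falls back to 19xx.
    The pivot of 20 is an arbitrary example; real systems picked their own."""
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return 2000 + yy if yy <= pivot else 1900 + yy

# A system whose pivot was set too low mis-dates anything from 2020 onwards:
print(expand_two_digit_year(20, pivot=19))  # -> 1920, not 2020
print(expand_two_digit_year(20, pivot=30))  # -> 2020
```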

Man… cheques… wouldn’t direct deposit be easier?

2 Likes

I think when they say ‘checks’ they actually intend to do transfers wherever possible. The way it was described, the federal govt can get the money to all the States, but some States will then struggle to make the payments. Perhaps a check in the post is easier, as long as you don’t need to visit a bank to cash it.

Right, that makes sense. The states’ finance systems may all be different. I just thought the federal government would be managing payments.

As I understand it most banks in the US can deposit cheques via banking apps now by taking a photo of the cheque.

1 Like

That’s pretty sad that they are dealing with this…

I remember being paid to flowchart programs on paper and write them in COBOL. Luckily we had green screens rather than punch cards.

1 Like

We are in the process of replacing our ERP system. The old one is written in COBOL (but is still actively maintained).

The problem is, these old systems are so large, complex, sprawling and patched with new features over the last 40 years that it just isn’t economical to replace them. They probably cost well into 6 or 7 figures to create back in the 70s and 80s and have had new functionality tacked onto them over the decades. You would be well into 6 figures just analysing exactly what the current system does; then you’ll need to add what it should be doing, rationalise that, and come up with the specification for a replacement system.

Implementing that would probably run into high 6 or low 7 figures. Multiply that up for all the states still using old systems, plus old federal systems, and you are talking a hefty budget (several billion, probably) to replace all of those old legacy systems that are still chugging along reliably in the background.

Usually the old system is left alone as much as possible, and web frontends or GUIs are written to interface with those old systems, usually over a transaction processing layer like IBM MQSeries.
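To show the shape of that pattern, here is a rough sketch in Python. It uses RabbitMQ via the pika client purely as a stand-in for a commercial messaging layer like MQSeries, and the queue name and message fields are invented for the example: the new frontend never touches the COBOL system directly, it just drops a transaction message on a queue that a bridge process on the legacy side consumes.

```python
import json
import pika  # RabbitMQ client, standing in for the MQ layer in this sketch

# Connect to the broker and declare the queue the legacy-side bridge listens on.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="legacy.orders", durable=True)

# The frontend publishes a transaction; the legacy bridge picks it up,
# feeds it to the old system and (typically) replies on another queue.
order = {"type": "CREATE_ORDER", "customer": "12345", "amount": "199.99"}
channel.basic_publish(
    exchange="",
    routing_key="legacy.orders",
    body=json.dumps(order),
    properties=pika.BasicProperties(delivery_mode=2),  # persist the message
)
connection.close()
```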

I learned COBOL and FORTRAN using punch cards in college. I liked FORTRAN much better. That was the same year the IBM PC came out.

What needs to be asked is how much it costs to maintain such an old system:

  • Old systems tend to cost more to support (both users and IT infrastructure)
  • Old systems tend to cost more to secure or bring into compliance
  • Old systems tend to be slow and clunky, which costs employee productivity

I have worked for companies that refused to look into this because they knew they would not like the answer. Any talk of replacements needs to take into account the cost/risk of doing nothing. Usually doing nothing is more expensive/risky than initially assumed.

Of course, fancy new systems also need to be properly investigated and scoped, or you could end up with a fancy, expensive new system that works worse than the old clunker it replaced (see the Phoenix pay system saga in Canada for a sad example).

It depends. We are currently replacing our old ERP software. It was written in the 80s in COBOL on a minicomputer and currently runs under POSIX on Windows using Micro Focus COBOL. The biggest complaint with the new system, while it has lists and better reporting, is that it is much slower and clunkier to use when entering data, which is what most users spend their time doing.

The problem is, the old system was built around a green screen terminal, so you could quickly tab through all the fields entering data. The new system is much more complicated: it uses various tabs to categorise the information, so you can no longer just enter the data with the keyboard. You have to enter 2 or 3 fields, break, grab the mouse, move to a new tab, click on the first field, move back to the keyboard, enter a couple of fields, break, go back to the mouse…

For those writing the report templates, the new system is a vast improvement. For those doing analysis or searches, it is more flexible. For those doing the actual data entry, the majority, it is a lot more difficult.

The support costs are the same for the old and the new system. No change there.

The old systems may be more difficult to secure against external intruders, but they have good user security and they are only available on a separate network. If they need to interface with new systems, there will usually be a single bridged computer with more modern interface software that keeps the old system secure and allows wider access through a more modern GUI.

The old system: 150 users on a terminal server, 8 cores, 32GB RAM, 50GB hard drive space, runs very smoothly.

The new system: 8 cores, 64GB RAM and 100GB of disk for the database server; 12 cores, 32GB RAM and 50GB of disk per 50 users for the application server; and 2 cores and 4GB RAM per user for the terminal server.

And the new system feels much slower… But it is written using the latest SQL Server and Java…

1 Like