Ok, let me just say at work I’ve spent the last 4 years working on a product that interfaces with COBOL. In-house we:
- Generate native Windows interfaces with COBOL, to the point where only a few lines of code create a control like a data grid, button or text entry with rich functionality. You’d never know these snappy native apps, with features like visual data validation and asynchronous menus/context popups, are COBOL.
- Render rich web pages with COBOL using an in-house DSL; the pages automatically become somewhat responsive between devices. The presentation information streams over a web socket, so it’s as snappy and responsive as the desktop application above.
- Perform complex queries and rich data validation. A few lines of code can automatically be linked to a grid view that has HEAPS of built-in functionality: export, data side-view (link applications/side panels to the currently selected row), row highlighting and styling, auto pagination with range expand, column sorting, reordering, filtering, conditional highlighting and more.
So yeah, COBOL isn’t dead. It’s actually the coolest damn thing I’ve seen in a while. Because it’s pretty high-level, we can build whatever we need on top as a runtime, and the same apps can run on the web, native desktop, handheld tablet or on an old VT100 over SSH.
Is this really all the same COBOL application? I'm truly impressed! Is there some kind of GUI toolkit for COBOL or the COBOL in this case only the logic behind the application and the GUI here is another language/framework?
I ask because your parent comment certainly rings true if this is all COBOL!
The programs are very uniform. There aren't multiple ways of doing things (at a presentation level). So you get to use hundreds of thousands of existing apps, written over the last N decades, in your modern web application.
You just couldn’t write them all over again; there isn’t a team on earth that could, even with infinite money. You’d be talking, I think, high double-digit millions of LOC.
JavaScript, by comparison... today it’s React/Angular, but it is very fluid. Not much code will be practically reusable 5 years from now.
> There aren't multiple ways of doing things (at a presentation level).
Love the insight. Established businesses like banks look for this kind of stability. And the upstarts who build from the ground up to challenge the status quo go on to figure out ways to win with ES6 and friends.
> Not much code will be practically reusable 5 years from now.
Is that really true? If you look at a lot of JS from 5, 10 or even 20 years ago it still works. The main things that stop it working are security changes, which is arguably a good thing. I don't see why many of today's JS libs will stop working in the future unless there's a good reason why they should.
It's not that they won't work, but that nobody will be doing anything new in JS in 10 years. It will become forgotten, mysterious, scary -- like COBOL is today.
I’m not sure I get your joke. Every system—JavaScript, COBOL, C, Java, etc.—has this behavior. Decimal fractions, when converted to binary, are a rather rough thing for a system to handle. For example, 0.1 in binary is a repeating fraction. Basic operations on those binary values are very fast but not always 100% accurate, due to the binary representation of decimals. There are packages that allow for very precise decimal calculations but cause performance degradation compared to the basic operations. It all depends on what you need.
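The classic demonstration of this, here in Python but equally reproducible in JavaScript or C:

```python
# 0.1 has no exact binary representation, so the stored values
# carry tiny errors that surface after arithmetic.
a = 0.1 + 0.2
print(a)         # 0.30000000000000004 — not 0.3
print(a == 0.3)  # False

# The error is minuscule but real:
print(abs(a - 0.3))  # on the order of 1e-17
```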
COBOL uses fixed-point decimal arithmetic, not binary floating-point. You'd declare these variables as PIC V99 and then get a nice neat .10×.10=.01 (or as PIC 99V999 and get 00.100×00.100=00.010, or whatever). This is because it's a Business-Oriented Language, and businesses don't want floating point rounding errors in their finances.
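A rough analogue of that PIC V99 behavior, sketched in Python with the `decimal` module (the quantize-after-multiply step is my approximation of how a two-digit fixed-point field behaves, not a literal model of a COBOL compiler):

```python
from decimal import Decimal, ROUND_HALF_UP

# PIC V99 holds exactly two decimal digits after the (implied)
# decimal point; quantizing each result mimics that constraint.
TWO_PLACES = Decimal("0.01")

a = Decimal("0.10")
b = Decimal("0.10")
product = (a * b).quantize(TWO_PLACES, rounding=ROUND_HALF_UP)
print(product)  # 0.01 — exact, no binary rounding error
```

Because every value is decimal all the way down, .10 × .10 really is .01, which is exactly what a ledger needs.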
Incidentally, this was actually faster than binary floating-point on the machines for which COBOL was designed, which used EBCDIC and had special circuitry for arithmetic on binary-coded-decimal numbers. You could operate directly on the decimal representation without having to do any conversion for I/O.
Nit: the COBOL spec dates to 1960; EBCDIC came later, ~1963. In 1960, character codes were specific to the manufacturer, and sometimes to the machine. IBM had a 6-bit BCD code that was mostly consistent across its line, but Univac had its own which was quite different.
I used to do a lot of BCD in embedded POS stuff. No one will miss it. You can do anything you can do in BCD with regular integers if you have any understanding of integer arithmetic. Yeah, it is going to take a little more time to convert to/from decimal digits you can display/input but processors these days have fast integer division so that doesn't matter.
A current issue is that a lot of programmers are bad at integer arithmetic. It isn't really taught in school.
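The integer-arithmetic conversion being described is just repeated division by ten; a minimal sketch (function names are mine, for illustration):

```python
def to_decimal_digits(n: int) -> list[int]:
    """Split a non-negative integer into its decimal digits using
    plain integer division — the conversion BCD hardware avoided."""
    if n == 0:
        return [0]
    digits = []
    while n:
        n, d = divmod(n, 10)   # peel off the lowest digit
        digits.append(d)
    return digits[::-1]        # most significant digit first

def from_decimal_digits(digits: list[int]) -> int:
    """Rebuild the integer from displayed/input decimal digits."""
    n = 0
    for d in digits:
        n = n * 10 + d
    return n

print(to_decimal_digits(4095))            # [4, 0, 9, 5]
print(from_decimal_digits([4, 0, 9, 5]))  # 4095
```

On a modern CPU with fast integer division, this to/from conversion at the I/O boundary is cheap, which is the point above: binary integers plus a little arithmetic cover everything BCD did.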