Software Development

I’m having another go at learning Visual Basic, not that I’ve had any problems with it in the past. It’s just that every time I get ready to try my hand, my employer has said “Never Mind” and everything on the subject that I’ve managed to stuff into my wee sma’ brain escapes. My current part-time gig uses VB & ASP.NET extensively, so I’ve signed up for a course and am getting ready to do it one more time.

Anyway, I’m reading a book called Learning Visual Basic .NET and I can tell that the author is somewhat younger than I am. I found the following footnote:

Remember, the Y2K problem was caused by programmers who couldn’t imagine needing to reference a year later than 1999.

A fine example of youth and inexperience, that is. The Y2K problem really began in the early days of computing, when memory was tight. The machine I worked on after graduating from Wesleyan had only 16K of memory; there was a mainframe on site that had an incredible 256K! With so little memory to go around, the coding for an application had to be tight and every byte of storage had to be conserved. Why waste a byte for “1984” when “84” would work just as well? The Y2K mess arose because there was no remediation of data and applications once hardware began sporting larger amounts of memory, well before the turn of the century.
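
To make that concrete, here’s a minimal sketch, in the VB.NET I’m busy relearning, of the naive expansion at the heart of the mess. The values are made up; the arithmetic is the point.

    ' A two-digit year saves storage but bakes the century into the code.
    Module TwoDigitYear
        Sub Main()
            Dim storedYear As String = "84"             ' the legacy record keeps only "84"
            Console.WriteLine(1900 + CInt(storedYear))  ' 1984 - works fine
            storedYear = "00"                           ' then January 1, 2000 arrives
            Console.WriteLine(1900 + CInt(storedYear))  ' 1900, not 2000 - the Y2K bug
        End Sub
    End Module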

I saw the problem coming back in 1981. The general assumption was that these older applications would be superseded by ones using more modern programming techniques as time went by, and thus the problem would simply go away. Trouble is, it didn’t. While it became standard to use “1984” instead of “84” in an application, a lot of the old data – and the systems that depended on it – were still using the old method for designating a year. Add to that the fact that significant amounts of code were still in use a quarter century after being written and you have Y2K. Q.E.D.

The year 2000 dawned and there was – overall – very little disruption due to the world’s odometer ticking over. There followed a hue and cry from the general public that the whole Y2K problem had been overblown and was essentially a case of crying “Wolf”. Rest assured that the problem and the threat were very real. Things went smoothly because a bunch of software engineers – including yours truly – spent a lot of time remediating both code and data to ensure that disruptions would be kept to a bare minimum.
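
For the curious, one common remediation technique of the era was date windowing: rather than widening every stored year, a pivot year decides which century a two-digit value belongs to. The sketch below is illustrative only, with an assumed pivot of 30; real projects picked pivots to suit their own data.

    ' Illustrative date windowing with an assumed pivot year of 30:
    ' two-digit values below the pivot are read as 20xx, the rest as 19xx.
    Module DateWindowing
        Function ExpandYear(ByVal twoDigit As Integer) As Integer
            Const Pivot As Integer = 30
            If twoDigit < Pivot Then
                Return 2000 + twoDigit
            Else
                Return 1900 + twoDigit
            End If
        End Function

        Sub Main()
            Console.WriteLine(ExpandYear(84)) ' 1984
            Console.WriteLine(ExpandYear(5))  ' 2005
        End Sub
    End Module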

And now you know the rest of the story.

If you haven’t paid a visit to my résumé, I’m a senior software engineer. That’s a fancy-pants name for a programmer, a profession that I’ve been involved in since the late ’70s. Over the years, I’ve heard more times than I care to think about that this programming language is “dead” or that programming language is on its way out. A glance at the List of programming languages by type on Wikipedia will reveal scads of programming languages that were in use at one time or another. Many of them still are. Among those that are truly dead is my favourite: NCR’s NEAT/3 and NEAT/VS. Another dead – or nearly so – language is the one I developed: TOOL – Text Oriented Object Language.

Neither of these languages showed up on the list of 10 development technologies that refuse to die published by TechRepublic. The ones that did make the list are interesting. I won’t go into detail about all of them, but here are a few examples:

  • COBOL – This is the language developed by the mother of modern programming, Admiral Grace Murray Hopper, and for decades it powered much of the world’s business applications. One famous though unattributed quote ran something like “I don’t know what programming language will be used at the start of the 21st century, but its name will be COBOL”. There are countless COBOL-based applications still doing work for banks, insurance companies, etc. For what it’s worth, I’d love to write COBOL again, especially since many of today’s younger programmers are frightened by the thought of COBOL and are scared to death of JCL.
  • C – This language had been sailing into the sunset of application development and looking forward to an active retirement powering hardware drivers and operating systems development. All that changed with the release of the iPhone and iPad, both of which use Apple’s iOS. Objective-C – a superset of C – is being used to develop countless cutting-edge applications to be run under iOS. Hmmmm… maybe it’s time to pull my C manuals out of storage.
  • FORTRAN – I never wrote much code in this language but I did teach my wife to use it when she was in graduate school. FORTRAN code still runs things in certain industries/sectors and is hard at work today doing weather prediction.
  • JAVA – No, JAVA is not going away anytime soon. Despite its flaws, JAVA is still a powerful language and is to the first half of the 21st century what COBOL was to the second half of the 20th century.

And for the people who write all this code? Old programmers never die, they just run to E-O-J.

Programming, coding, software development – call it what you will – is not unlike writing a fantasy novel.

  1. Come up with a premise
  2. Define requirements, restrictions and other general parameters
  3. Turn the above into code or prose
  4. Beat on what you have written until it works

Simple.

