Applying "Good" Programming To Old BASIC

On one of my classic computing Facebook groups there was a post quoting Edsger Dijkstra: “It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.”  It’s actually part of a much larger document in which he condemns pretty much every high-level language of the day, IBM, the cliquish nature of the computing industry, and so on. Most of it is the equivalent of a Twitter rant (in fact each line is almost made for tweet-sized bites), but there are some legitimate gems in there. One relevant to this topic: “The tools we use have a profound (and devious!) influence on our thinking habits, and, therefore, on our thinking abilities.”  No, I don’t agree that starting with BASIC, or any other language, permanently breaks someone forever, but if the tools we use drive our thinking, then they can certainly leave us with bad habits to unlearn.  Yet has anyone actually tried to write BASIC, as in the BASIC dialects of the 60s, 70s, and early 80s, with modern design principles?  Fortunately/unfortunately, I tried a while ago, with some interesting results.

My very first programming experiences were in Applesoft BASIC, the standard BASIC that shipped with every Apple II computer.  After a few years of slogging away at BASIC in grade school I started taking “real” computer science classes in my teens, at high school and at the local university’s continuing education program.  As time went on so too did my experience, until I became a full-fledged professional developer.  Along the way came lots of tools that we now take for granted but that weren’t even concepts in most developers’ eyes when I started this saga in the 80s: source code management systems, standardized automated test harnesses, code linters, etc.  Interestingly, when I decided to write my first few lines of BASIC code after 25 years of not even seeing the stuff, I approached it as I would a modern program written in C or something to that effect.  I wanted to use the standard best practices.  The fact is I couldn’t.  In fact those practices led to huge errors.

Yes, I used GOSUBs, not GOTOs, which gave me subroutines.  But subroutines are a different concept from functions in C or methods in your favorite language.  There really is no concept of local variables in these old BASICs; everything is a global.  With entire memory footprints measured in thousands of bytes and limited call stack sizes, it’s not exactly surprising that memory was treated as one giant blob of data that you could do whatever you wanted with.  In fact, if you wanted any speed out of your program you would leverage this directly (actually you’d probably write it in assembly language, but that’s another matter).  So you didn’t fret too much about having too many global variables; they were all global variables.  You had more to worry about making sure you had enough “free” line gaps in your programs.  Oh yes, the line numbers mattered.
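To make that concrete, here is a minimal sketch (my own contrived example, not from any real program) of how an Applesoft-style “subroutine” communicates: no parameters, no return value, just globals the GOSUB target reads and writes.

10 A = 3
20 B = 4
30 GOSUB 100
40 PRINT R
50 END
100 REM "ADD" SUBROUTINE: READS GLOBALS A AND B, WRITES GLOBAL R
110 R = A + B
120 RETURN

Every caller has to know which globals the subroutine consumes and clobbers; there is nothing the language does to protect you.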

10 FOR X=1 TO 10
11 PRINT X
12 NEXT X

This is a horrible program! What were you thinking?  What happens if you want to add another line inside your FOR loop?  Guess what, you get to re-number your lines…all your lines…and keep track of everywhere you GOSUB’d or GOTO’d.  By the mid-80s BASICs had largely overcome this, but back in the “Good Old Days”™ this was a problem you worried about immensely.  I remembered enough to remember that when I wrote that first program years later, but for my own little program I hadn’t properly reserved enough space, so that required a little reshuffling at the end.  Keep in mind line numbers were relatively limited too, topping out around 32K or 64K depending on the dialect (Applesoft stopped at 63999), so you didn’t have as much flexibility as you’d need for large programs; then again, large programs written in BASIC were probably not the biggest need for many people.
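The standard defensive habit (again, my own illustrative snippet) was to number by tens, leaving gaps so a new line could be dropped in without renumbering anything:

10 FOR X = 1 TO 10
20 PRINT X
30 NEXT X

Now adding a line 15 inside the loop costs nothing.  Run out of gap, though, and you are back to renumbering by hand.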

In our modern programming era a best practice is to name variables well, creating self-documenting code. So this pseudo-codeish sequence of database library calls:

var c = D.c("")
var d = c.d("MyDatabase")
var t = d.t("MyDataTable")
var a = t.g({column1:"TX"})

Would fail any code review.  What the hell is going on here?  Better would be something like:

var client = DatabaseEngine.createClient("")
var db = client.getDatabase("MyDatabase")
var table = db.getTable("MyDataTable")
var data = table.get({column1:"TX"})

The above is much better (not great, but it’s pseudo-code to prove my point, so give me a break).  So when you go to write some good old BASIC code using modern practices, you want to do things like give your variables real names.  Given this stupid program, where I keep a running summation of numbers, in a modern world using full variable names you’d write something akin to (again, it’s pseudo-code to prove a point, so don’t dissect the names too much):

10 CURRENTSUM = 0
20 FOR CURNUM = 1 TO 10
30 CURRENTSUM = CURRENTSUM + CURNUM
40 NEXT CURNUM
50 PRINT CURRENTSUM

Which you would expect to print 55, the sum of the numbers 1 through 10, but instead you get 15.  Why?  Because in Applesoft BASIC, and many dialects like it, only the first two characters of a variable name are significant to the interpreter.  The rest are just for the humans, so to the interpreter this program reads:

10 CU = 0
20 FOR CU = 1 TO 10
30 CU = CU + CU
40 NEXT CU
50 PRINT CU
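The same collision bites any two names that share their first two characters.  This little example is mine, not from the original incident, but if I remember the Applesoft rules right, both names below are just CO to the interpreter, so the second assignment silently overwrites the first:

10 COUNT = 1
20 COINS = 2
30 PRINT COUNT

And that PRINT gives you 2, not the 1 you thought you stored.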

Better to just stick with simple names and avoid implying more than the interpreter will honor.  It’s odd, isn’t it, that giving your variables reasonably verbose names can make your code less clear by masking what the interpreter is actually doing:

10 S = 0
20 FOR I = 1 TO 10
30 S = S + I
40 NEXT I
50 PRINT S

Let’s not even get started on the fact that PEEKing and POKEing memory anywhere in the entire address space of the machine isn’t just acceptable form; it’s literally the way to get things done!
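For a taste, here is the kind of thing every Apple II programmer did routinely (the addresses below are quoted from memory, so verify them against a reference before trusting them):

10 POKE 1024,193 : REM WRITE AN "A" DIRECTLY INTO TEXT SCREEN MEMORY AT $0400
20 K = PEEK(-16384) : REM READ THE KEYBOARD HARDWARE REGISTER DIRECTLY
30 X = PEEK(-16336) : REM MERELY READING THIS ADDRESS CLICKS THE SPEAKER

No API, no driver, no permission check: the screen, the keyboard, and the speaker are just addresses, and your BASIC program owns all of them.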

It was an odd tour of a parallel universe of best practices, to be sure.  Yes, at the time there were what we would consider real programming languages that didn’t restrict you as much, but the machines necessary to run them were relatively large and expensive in the day.  There was a reason “real” languages ultimately became the main way of doing things.  But back to Dijkstra’s original comment: I imagine that if I had started my career programming in BASIC, dealing with all the shortcuts these systems required, I would have had a lot of unlearning to do once I moved to languages like C, Lisp, or Smalltalk back in the day.  While I did start my programming experience learning BASIC, by the time I decided to do it seriously I was being indoctrinated with Pascal and, a few years later, C/C++ and FORTRAN.  I had some lessons to unlearn, but I was still a malleable enough young developer (not even old enough to drive) to have not hardwired many bad practices.  So while his comment is probably mostly unfair, there is definitely a ring of truth to it too.