Edsger Dijkstra was absolutely right when he said, “Programming in BASIC causes brain damage.” (Lacking a source for that quote, I found an even better quote that has a source: “It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.”)
When I reflect on my (not-too-distant) programming infancy, I often think about what I might have done differently, like what technologies I could have learned, which ones I should have avoided, or what algorithms I could have used in old software I had written, and so on.
But there’s one thing that really stands out more than anything else: starting out with BASIC was the worst thing I could have done.
I’m not sure how useful it is to talk about this now, since BASIC has pretty much gone extinct, and rightly so, but it feels good to get it off my chest anyway.
My parents and I immigrated to the U.S. in 1991, when I was 10 years old, and I had never laid eyes on a personal computer before that time. During my family’s first few months in the States, we acquired an 80286 IBM PC, probably donated to us by an acquaintance (the 80386 architecture was already prevalent at that time, and the 80486 was the cutting edge).
I also happened to come across a book called BASIC and the Personal Computer by Dwyer and Critchfield. I was instantly fascinated by the prospect of programming the computer, and the boundless possibilities that computer software could provide.
However, I made a critical error that would hinder my programming development for at least a year: I reached the erroneous conclusion that BASIC was the only language there was!
I had no idea that BASIC was an interpreted language, or indeed what the difference was between an interpreted and a compiled language. I thought that all software (including the games I played, Windows 3.0, WordPerfect, etc.) was written in BASIC! This unfortunately led me down an ill-fated path of self-study, which took an even greater effort to undo.
I learned all there was to know about BASIC programming in a few months (starting with GW-BASIC, then moving to QuickBASIC), and then I started to notice certain things about the software I was trying to write.
No matter how I tried, I couldn’t make my programs run as fast as other software I used, and I couldn’t understand why. Also, the graphics routines in BASIC were virtually nonexistent, so I was baffled by how anyone could write games with elaborate graphics, scrolling, and responsive controls. I was eager to start developing games that would rival my favorites at the time, like Prince of Persia, Crystal Caves, and Commander Keen. But the graphics and responsiveness of those games were orders of magnitude beyond what I could achieve with my BASIC programs.
With all this frustration on my mind, I was determined to find the reason why my programs were so limited. I soon found a solution, but once again it was the wrong one! I stumbled upon some example BASIC code that used assembly language subroutines (encoded as DATA lines in the BASIC program), as well as INTERRUPT routines that took advantage of the underlying DOS and BIOS services.
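For readers who never saw this trick, it looked roughly like the following in QuickBASIC. (I’m reconstructing this from memory, so treat it as an illustrative sketch rather than working code: the bytes here are just a single far-return stub, where a real routine would do actual work before returning to BASIC.)

```basic
' Machine-code bytes are stored in DATA statements, READ into an
' integer array, then executed directly with CALL ABSOLUTE.
DIM code%(0)
DATA &HCB                        ' CBh = RETF (far return): a do-nothing stub
READ code%(0)
DEF SEG = VARSEG(code%(0))       ' point BASIC's current segment at the array
CALL ABSOLUTE(VARPTR(code%(0)))  ' far-call into the machine code
DEF SEG                          ' restore the default segment
```

A longer routine meant more DATA lines of hand-assembled hex bytes, READ in a loop — which is exactly why changing anything required reassembling by hand.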
This led me down the path of learning Intel 286 assembly language (another few months of studying), and encoding it into my BASIC programs! This solved the issue of responsiveness, but there was still the issue of graphics, or lack thereof. Fortunately, I found a book at the local public library about VGA graphics programming. Even more fortunately, the book contained sample source code, using a language they called C….
And my eyes were opened!
It hit me like a freight train. I almost burst out laughing right there at the library. I realized that I had been learning the wrong things all along! (Of course learning assembly language was sort of right, but my application of it was still misguided.)
Learning C and C++ from that point forward wasn’t particularly difficult, but I still feel like it would have been a lot easier if my mind hadn’t been polluted by the programming style and structure that I learned from BASIC. It makes me wonder how things might have been different, had I accidentally picked up a book on C++ instead of a book on BASIC during my earliest exploits with computers.
In all fairness, I’m sure I learned some rudimentary programming principles from BASIC, but I’m not sure that this redeems BASIC as a learning tool. There were just too many moments where, while learning C++, I thought, “So that’s the way it really works!” And I’m sure it’s also my fault for trying to learn everything on my own, instead of seeking guidance from someone else who might have told me, “You’re doing it wrong.”
All of this makes me wonder what programming language would be appropriate for teaching today’s generation of young programmers. Based on my comically tragic experience with BASIC, my gut instinct is to advise aspiring developers to stay away from interpreted languages (such as Python), or at the very least understand that the interpreted language they’re learning is useless for developing actual software. I don’t think there’s any harm in diving right into a compiled language (such as C++), and learning how it hugs the underlying hardware in a way that no interpreted language ever could.
That being said, I don’t wish any of this to reflect negatively on Dwyer and Critchfield’s BASIC and the Personal Computer. It’s a solid book, and I still own the original copy. There’s no denying that it was one of the first books that got me interested in programming, and for that I’m thankful. However, sometimes I regret that I didn’t find Stroustrup’s The C++ Programming Language at the same garage sale where I found BASIC and the Personal Computer. Or, alternatively, perhaps Dwyer and Critchfield could have included the following disclaimer in large bold letters: This is not the way actual software is written! But perhaps it’s time to let it go. I didn’t turn out so bad, right?