If I had to name one winner amongst the thousands of programming languages that have been created over the last 60 years, the obvious choice would be C. Developed by Dennis Ritchie in the early 1970s as the foundation of the Unix operating system, C remains one of the most commonly used languages even today; the Linux kernel, for example, is implemented in C. Yet that only tells part of the story. Dozens of other languages have borrowed the basic syntax of C while adding bells and whistles of their own. This group includes the most commonly used languages in computing, such as Java, C++, and Perl; quickly growing upstarts like C# and Objective-C; and plenty of more esoteric domain-specific languages, like the interactive-fiction development system TADS 3. For a whole generation of programmers, C’s syntax, so cryptic and off-putting to newcomers with its parentheses, curly braces, and general preference for mathematical symbols in lieu of words, has become a sort of comfort food. “This new language can’t be that bad,” we think. “After all, it’s really just C with…” (“these new things called classes that hold functions as well as variables”; “a bunch of libraries to make text-adventure development easy”; etc.). Small wonder that “C-like syntax” always seems to be near the top of the feature list of any new language that has it. (And for those that don’t: congratulations on sticking to your aesthetic guns, but you’ve chosen a much harder road to acceptance. Good luck!)
When we jump back 30 years, however, we find in this domain of computing, as in so many others, a very different situation. At that time C was the standard language of the fast-growing institutional operating system Unix, but it had yet to really escape the Unix ghetto to join the top tier of languages in the computing world at large. Microcomputers boasted only a few experimental and/or stripped-down C compilers, and the language was seldom even granted a mention when magazines like Byte did one of their periodic surveys of the state of programming. The biggest buzz in Byte went instead to Niklaus Wirth’s Pascal, named after the 17th-century scientist, inventor, and philosopher who invented an early mechanical calculating machine. Even after C arrived on PCs in strength, Pascal, pushed along by Borland’s magnificent Turbo Pascal development environment, would compete with and often even overshadow it as the language of choice for serious programmers. Only in the mid-1990s did C finally and definitively win the war and become the inescapable standard we all know today.
While I was researching this post I came across an article by Chip Weems of Oregon State University. I found it kind of fascinating, so much so that I’m going to quote from it at some length.
In the early days of the computer industry, the most expensive part of owning a computer was the machine itself. Of all the components in such a machine, the memory was the most costly because of the number of parts it contained. Early computer memories were thus small: 16 K was considered large and 64 K could only be found in supercomputers. All of this meant that programs had to take advantage of what little space was available.
On the other hand, programs had to be written to run as quickly as possible in order to make the most efficient use of the large computers. Of course these two goals almost always contradicted each other, which led to the concept of the speed versus space tradeoff. Programmers were prized for the ability to write tricky, efficient code which took advantage of special idiosyncrasies in the machine. Supercoders were in vogue.
Fortunately, hardware evolved and became less expensive. Large memories and high speed became common features of most systems. Suddenly people discovered that speed and space were no longer important. In fact roles had reversed and hardware had become the least expensive part of owning a computer.
The costliest part of owning a computer today is programming it. With the advent of less expensive hardware, the emphasis has shifted from speed versus space to a new tradeoff: programmer cost versus machine cost. The new goal is to make the most efficient use of a programmer’s time, and program efficiency has become less important — it’s easier to add more hardware.
If you know something about the history of the PC, you’re probably nodding along right now, as we’re seemingly on very familiar ground. If you’re a crotchety old timer, you may even be mulling over a rant about programmers today who solve all their problems just by throwing more hardware at them. (When old programmers talk about the metaphorical equivalent of having to walk both ways uphill in the snow to school every morning, they’re actually pretty much telling the truth…) Early Apple II magazines featured fawning profiles of fast-graphics programming maestros like Nasir Gebelli (so famous everyone just knew him by his first name), Bill Budge, and Ken Williams, the very picture of Weems’s “supercoders” who wrote “tricky, efficient code which took advantage of special idiosyncrasies in the machine.” If no one, including themselves after a few weeks, could quite understand how their programs did their magic, well, so be it. It certainly added to the mystique.
Yet here’s the surprising thing: Weems is not describing PC history at all. In fact, the article predates the fame of the aforementioned three wizards. It appeared in the August, 1978, issue of Byte, and is describing the evolution of programming to that point on the big institutional systems. Which leads us to the realization that the history of the PC is in many ways a repeat of the history of institutional computing. The earliest PCs being far too primitive to support the relatively sophisticated programming languages and operating systems of the institutional world, early microcomputer aficionados were thrown back into a much earlier era, the same one Weems is bidding a not-very-fond farewell to above. Like the punk-rock movement that was exploding just as the trinity of 1977 hit the market, they ripped it up and started again, only here by necessity rather than choice. This explains the reaction, somewhere between bemused contempt and horror, that so many in the institutional world had to the tiny new machines. (Remember the unofficial motto of MIT’s Dynamic Modeling Group: “We hate micros!”) It also explains the fact that I’m constantly forced to go delving into the history of computing on the big machines to explain developments there that belatedly made it to PCs. In fact, I’m going to do that again, and just very quickly look at how institutional programming got to the relatively sophisticated place at which it had arrived by the time the PC entered the scene.
The processor at the heart of any computer can ultimately understand only the most simplistic of instructions. Said instructions, known as “opcodes,” do such things as moving a single number from memory into a register of the processor; or adding a number already stored in a register to another; or putting the result from an operation back into memory. Each opcode is identified by a unique sequence of bits, or on/off switches. Thus the first programmers were literally bit flippers, laboriously entering long sequences of 1s and 0s by hand. (If they were lucky, that is; some early machines could only be programmed by physically rewiring their internals.) Assemblers were soon developed, which allowed programmers to replace 1s and 0s with unique textual identifiers: “STO” to store a number in memory, “ADD” to do the obvious, etc. After writing her program using this system of mnemonics, the programmer just had to pass it through the assembler to generate the 1s and 0s the computer needed. That was certainly an improvement, but still, programming a computer at the processor level is very time consuming. Sure, it’s efficient in that the computer does what you tell it to and only what you tell it to, but it’s also extremely tedious. It’s very difficult to write a program of real complexity from so far down in the weeds, hard to keep track of the forest of what you’re trying to accomplish when surrounded by trees made up of endless low-level STOs and ADDs. And even if you’re a supercoder who’s up to the task, good luck figuring out what you’ve done after you’ve slept on it. And as for others figuring it out… forget about it.
And so people started to develop high-level languages that would let them program at a much greater level of abstraction from the hardware, to focus more on the logic of what they were trying to achieve and less on which byte they’d stuck where 2000 opcodes ago. The first really complete example of such a language arrived in 1954. We’ve actually met it before on this blog: FORTRAN, the language Will Crowther chose to code the original Adventure more than 20 years later. LISP, the ancestor of MIT’s MDL and Infocom’s ZIL, arrived in 1958. COBOL, language of a million dull-but-necessary IBM mainframe business programs, appeared in 1959. And they just kept coming from there, right up until the present.
As the 1960s wore on, increasing numbers of people who were not engineers or programmers were beginning to make use of computers, often logging on to time-sharing systems where they could work interactively in lieu of the older batch-processing model, in which the computer was fed some data, did its magic, and output some result at the other end without ever interacting with the user in between. While they certainly represented a huge step above assembly language, the early high-level languages were still somewhat difficult for the novice to pick up. In addition, they were compiled languages, meaning that the programmer wrote and saved them as plain text files, then passed them through another program called a compiler which, much like an assembler, turned them into native code. That was all well and good for the professionals, but what about the students and other amateurs who also deserved a chance to experience the wonder of having a machine do their bidding? For them, a group of computer scientists at Dartmouth College led by John Kemeny and Thomas Kurtz developed the Beginner’s All-Purpose Symbolic Instruction Code: BASIC. It first appeared on Dartmouth’s systems in 1964.
As its name would imply, BASIC was designed to be easy for the beginner to pick up. Another aspect, somewhat less recognized, is that it was designed for the new generation of time-sharing systems: BASIC was interactive. In fact, it wasn’t just a standalone language, but rather a complete computing environment which the would-be programmer logged into. Within this environment, there was no separation between statements used to accomplish something immediately, like LISTing a program or LOADing one, and those used within the program itself. Entering “PRINT ‘JIMMY'” prints “JIMMY” to the screen immediately; put a line number in front of it (“10 PRINT ‘JIMMY'”) and it’s part of a program. BASIC gave the programmer a chance to play. Rather than having to type in and save a complete program, then run it through a compiler hoping she hadn’t made any typos, and finally run the result, she could tinker with a line or two, run her program to see what happened, ad infinitum. Heck, if she wasn’t sure how a given statement worked or whether it was valid, she could just type it in by itself and see what happened. Because BASIC programs were interpreted at run-time rather than compiled beforehand into native code, they necessarily ran much, much slower than programs written in other languages. But still, for the simple experiments BASIC was designed to facilitate that wasn’t really so awful. It’s not like anyone was going to try to program anything all that elaborate in BASIC… was it?
Well, here’s where it all starts to get problematic. For very simple programs, BASIC is pretty straightforward and readable, easy to understand and fun to just play with. Take everybody’s first program:
10 PRINT "JIMMY RULES!"
20 GOTO 10
It’s pretty obvious even to someone who’s never seen a line of code before what that does, it took me about 15 seconds to type it in and run it, and in response I get to watch it fill the screen with my propaganda for as long as I care to look at it. Compared to any other contemporary language, the effort-to-reward ratio is off the charts. The trouble only starts if we try to implement something really substantial. By way of example, let’s jump to a much later time and have a look at the first few lines of the dungeon-delving program in Richard Garriott’s Ultima:
0 ONERR GOTO 9900
10 POKE 105, PEEK (30720): POKE 106, PEEK (30721): POKE 107, PEEK (30722): POKE 108, PEEK (30723): POKE 109, PEEK (30724): POKE 110, PEEK (30725): POKE 111, PEEK (30726): POKE 112, PEEK (30727)
20 PRINT "BLOAD SET"; INT (IN / 2 + .6)
30 T1 = 0:T2 = 0:T3 = 0:T4 = 0:T5 = 0:T6 = 0:T7 = 0:T8 = 0:T9 = 0: POKE - 16301,0: POKE - 16297,0: POKE - 16300,0: POKE - 16304,0: SCALE= 1: ROT= 0: HCOLOR= 3: DEF FN PN(RA) = DNG%(PX + DX * RA,PY + DY * RA)
152 DEF FN MX(MN) = DN%(MX(MN) + XX,MY(MN)): DEF FN MY(MN) = DN%(MX(MN),MY(MN) + YY): DEF FN L(RA) = DNG%(PX + DX * RA + DY,PY + DY * RA - DX) - INT (DN%(PX + DX * RA + DY,PY + DY * RA - DX) / 100) * 100: DEF FN R(RA) = DNG%(PX + DX * RA - DY,PY + DY * RA + DX) - INT (DN%(PX + DX * RA - DY,PY + DY * RA + DX) / 100) * 100
190 IF PX = 0 OR PY = 0 THEN PX = 1:PY = 1:DX = 0:DY = 1:HP = 0: GOSUB 500
195 GOSUB 600: GOSUB 300: GOTO 1000
300 HGR :DIS = 0: HCOLOR= 3
Yes, given the entire program so that you could figure out where all those line-number references actually lead, you could theoretically find the relatively simple logic veiled behind all this tangled syntax, but would you really want to? It’s not much fun trying to sort out where all those GOTOs and GOSUBs actually get you, nor what all those cryptic one- and two-letter variables refer to. And because BASIC is interpreted, comments use precious memory, meaning that a program of real complexity like the one above will probably have to dispense with even this aid. (Granted, Garriott was also likely not interested in advertising to his competition how his program’s logic worked…)
Now, everyone can probably agree that BASIC was often stretched by programmers like Garriott beyond its ostensible purpose, resulting in near gibberish like the above. When you have a choice between BASIC and assembly language, and you don’t know assembly language, necessity becomes the mother of invention. Yet even if we take BASIC at its word and assume it was intended as a beginner’s language, to let a student play around with this programming thing and get an idea of how it works and whether it’s for her, opinions are divided about its worth. One school of thought says that, yes, BASIC’s deficiencies for more complex programming tasks are obvious, but if used as a primer or taster of sorts for programming it has its place. Another is not only not convinced by that argument but downright outraged by BASIC, seeing it as an incubator of generations of awful programmers.
Niklaus Wirth was an early member of the latter group. Indeed, it was largely in reaction to BASIC’s deficiencies that he developed Pascal between 1968 and 1970. He never mentions BASIC by name, but his justification for Pascal in the Pascal User Manual and Report makes it pretty obvious of which language he’s thinking.
The desire for a new language for the purpose of teaching programming is due to my dissatisfaction with the presently used major languages whose features and constructs too often cannot be explained logically and convincingly and which too often defy systematic reasoning. Along with this dissatisfaction goes my conviction that the language in which the student is taught to express his ideas profoundly influences his habits of thought and invention, and that the disorder governing these languages imposes itself into the programming style of the students.
There is of course plenty of reason to be cautious with the introduction of yet another programming language, and the objection against teaching programming in a language which is not widely used and accepted has undoubtedly some justification, at least based on short-term commercial reasoning. However, the choice of a language for teaching based on its widespread acceptance and availability, together with the fact that the language most taught is thereafter going to be the one most widely used, forms the safest recipe for stagnation in a subject of such profound pedagogical influence. I consider it therefore well worthwhile to make an effort to break this vicious cycle.
If BASIC, at least once a program gets beyond a certain level of complexity, seems to actively resist every effort to make one’s code readable and maintainable, Pascal swings hard in the opposite direction. “You’re going to structure your code properly,” it tells the programmer, “or I’m just not going to let you compile it at all.” (Yes, Pascal, unlike BASIC, is generally a compiled language.) Okay, that’s not quite true; it’s possible to write ugly code in any language, just as it’s at least theoretically possible to write well-structured BASIC. But certainly Pascal works hard to enforce what Wirth sees as proper programming habits. The opinions of others on Wirth’s approach have, inevitably, varied, some seeing Pascal and its descendants as to this day the only really elegant programming languages ever created and others seeing them as straitjackets that enforce a certain inflexible structural vision that just isn’t appropriate for every program or programmer.
For my part, I don’t agree with Wirth and so many others that BASIC automatically ruins every programmer who comes into contact with it; people are more flexible than that, I think. And I see a bit of both sides of the Pascal argument, finding myself alternately awed by its structural rigorousness and infuriated by it every time I’ve dabbled in the language. Since I seem to be fond of music analogies today: Pascal will let you write a beautiful programming symphony, but it won’t let you swing or improvise. Still, when compared to a typical BASIC listing or, God forbid, an assembly-language program, Pascal’s clarity is enchanting. Considering the alternatives, which mostly consisted of BASIC, assembly, and (on some platforms) creaky old FORTRAN, it’s not hard to see why Byte and many others in the early PC world saw it as the next big thing, a possible successor to BASIC as the lingua franca of the microcomputer world. Here’s the heart of a roulette game implemented in Pascal, taken from another article in that August 1978 issue:
begin
  askhowmany (players);
  for player := 1 to players do
    getname (player, playerlist);
  askif (yes);
  if yes then printinstructions;
  playersleft := true;
  while playersleft do
    begin
      for player := 1 to players do
        repeat
          getbet (player, playerlist);
          scanbet (player, playerlist);
          checkbet (player, playerlist, valid);
        until valid;
      determine (winningnumber);
      for player := 1 to players do
        begin
          if quit (player, playerlist) then
            processquit (player, playerlist, players, playersleft);
          if pass (player, playerlist) then
            processpass (player, playerlist);
          if bet (player, playerlist) then
            processbet (player, playerlist, winningnumber)
        end
    end
end.
Wirth’s ideal was to create a programming language capable of supporting self-commenting code: code so clean and readable that comments became superfluous, that the code itself was little more difficult to follow than a simple textual description of the program’s logic. He perhaps didn’t quite get there, but the program above is nevertheless surprisingly understandable even if you’ve never seen Pascal before. Just to make the comparison clear, here’s the pseudocode summary which the code extract above used as its model:
Begin program.
  Ask how many players.
  For as many players as there are,
    Get each player's name.
  Ask if instructions are needed.
  If yes, output the instructions.
  While there are still any players left,
    For as many players as there are,
      Repeat until a valid bet is obtained:
        Get the player's bet.
        Scan the bet.
        Check bet for validity.
    Determine the winning number.
    For as many players as there are,
      If player quit, process the quit.
      If player passed, process the pass.
      If player bet,
        Determine whether player won or lost.
        Process this accordingly.
End program.
Yet Pascal’s readability and by extension maintainability was only part of the reason that Byte was so excited. We’ll look at the other next time… and yes, this tangent will eventually lead us back to games.
March 14, 2012 at 5:54 am
Time for me to swan off on a chain of anecdotes, no doubt of interest to nobody outside my own skull…
Pascal was the accepted pedagogical language in the mid-80s, which is when I started taking real programming courses (summer camp and self-taught — my high school didn’t teach programming). So that’s the “real” language I knew, although I had of course messed around with BASIC for years.
When I got to college in ’88 and looked at CMU’s Unix setup, one of my first questions was: “Is there a Pascal compiler on this thing?” And the reply — which I remember as a chorus echoing back from the entire computer cluster, although I’m sure memory exaggerates — was: “Use C, kid.”
It turned out that CMU’s intro-programming course was taught with a (universally reviled) Pascal IDE, which was so structured that it *didn’t let you type your code*. You had to select structures (IF statements, WHILE statements, etc) from a pull-down menu, and then fill in the blanks with variable names (or more menu options).
Fortunately, the self-paced coursework I’d done in high school enabled me to place out of that, so I went straight into the 200-level course. Which was, of course, all in C.
March 14, 2012 at 8:43 am
My first exposure to Pascal was also in a programming course, I believe my freshman year in high school. I didn’t really remark this at the time, but now I realize my little suburban high school was surprisingly progressive. We worked in a computer lab full of early IBM PCs (or maybe clones) equipped with Turbo Pascal, and the course was surprisingly rigorous. It was my first real exposure to the idea that there was a right way and a wrong way to structure code, and that code could actually be beautiful and readable in addition to being functional.
At the time I loathed everything about the IBM world, and thought the Amiga system I was lusting after was God’s gift to computing (still do, in a way; former Amiga people tend to be like that). But even through my prejudice I couldn’t help admitting that working in Turbo Pascal was, well, pretty damn great. It deserves a lot of credit as the first real IDE as we’ve come to know them today, the ancestor of Visual Studio and Eclipse and dozens of others.
Soon after, I finally got that Amiga and started learning C, and I never did more than dabble in Pascal thereafter. But for many years I pined for some equivalent to the Turbo Pascal IDE on my platform of choice. Never got it until I surrendered to the inevitable and bought a clone in the early 1990s.
All of which says more about Borland’s implementation of Pascal than Pascal itself, but, hey, anecdotes are like that…
March 14, 2012 at 2:07 pm
Which are the two languages of the title? Seems like it’s about three.
March 14, 2012 at 2:20 pm
Well, it was really meant to be about Pascal and BASIC, but you kind of have a point, don’t you? What the hell, I’ll just change the title…
March 14, 2012 at 5:28 pm
When I was first learning BASIC by typing code from books, I thought it was my lack of experience that made the code seem unreadable. And some of those books explained the code very well indeed!
But BASIC has a rather subtle strong point: it educates programmers as to how the machine works. You know the practice of setting up certain “reserved” variables before doing a gosub? That’s more or less what C function calls compile to. While loops compile to the equivalent of if… then… goto, and so on. Going the BASIC -> Pascal -> C route, I always kept that in mind. But nowadays, they start with Java (if in school) or PHP (if on their own), and it shows.
March 15, 2012 at 8:20 am
On a related note, I think the strong point of C is the way it lets you decide how close you want to get to the hardware. You can use a bunch of libraries and essentially treat C like Pascal, dealing only in the abstract logic of your program. Or you can, assuming the OS will let you, use it to bang right on the hardware, without terribly much more overhead than coding in pure assembler.
March 19, 2012 at 12:27 am
I had an Apple ][c — portable! — it had a handle! — that introduced me to BASIC. Also, at some point, to LOGO, which the young’uns out there can think of as a scripted toy language, a kind of useless graphical version of Python.
In high school, I put together an IBM clone and a friend of mine set me up with Pascal. I have to say, the language put real restraints on you, but even so it was a nice language. It made some sense.
In college (starting ’92), the CS department taught in Pascal. Later I wanted to make the shift to C / C++ and found the syntax gave me a little trouble, but really troublesome was that the language expected me to go about things in ways that were far outside of my experience.
I was able to write a few programs, in an assortment of languages, more through stubbornness than anything else. I made a Markov chain generator that read in Shakespeare and spat out gibberish (in Prolog). Did knowing a little Pascal help with such projects? Well, it didn’t hurt.
TADS 3 is very Pascal-like, which is one of the things I appreciate about it. Pascal was a very nice language, a friendly language, but in terms of actually making things happen with the computer, you really had to do things the long way.
As much as I like Pascal — and I really do — I have to say my experience has led me not to value languages that are meant to teach the programmer to think a certain way.
If I’m going to clock the time to learn the syntax and the basic building blocks of a language, I simply want the language to be effective. I want to be able to do things with it. Languages are better formed by the kind of task they’re meant for, than to shape the programmer’s thinking.
March 19, 2012 at 10:15 am
Yes, you’ve really pinpointed the weakness of Pascal, and the thing that turned a lot of people away from it. It’s very rigidly bound to a certain ideology of programming. Even extensions added to the language by the UCSD folks and Borland were decried by Wirth as corruptions of his perfect construct, steps down the road to programming anarchy. In the long run, C, a language constructed with no more ideological underpinning than a desire to let hackers get things done as quickly and easily as possible, won out. Much as I appreciate Pascal for its elegance and its role in computer history, I’d say the better language ultimately won.
January 25, 2013 at 9:05 am
It’s funny to see Objective-C characterized as an upstart when it was developed in 1981.
Excellent blog by the way. I don’t care about early game history but your articles about early computer hardware and software are fantastic. I’m reading backwards through your archive now.
March 8, 2013 at 11:37 am
Great article. It’s always hard for me to hear negative things about Pascal – I have a real soft spot for the language. I found the move from Object Pascal (Delphi) to C# very easy a few years ago, thanks to the similarities in the concepts, probably due to Anders Hejlsberg’s influence. Still learning to love the .NET Framework though…
June 30, 2013 at 7:27 am
It’s interesting to look at that Pascal sample now and compare it to a modern BASIC – anything from Visual BASIC on – and see how familiar it looks. It seems like Pascal never really died, it just donated its constructs to other languages.
January 22, 2017 at 1:12 pm
Even late-80s QuickBasic didn’t look all that different from that Pascal example, if you used it as intended (you could also program in it like “traditional” BASICs).
October 18, 2014 at 8:39 pm
I must have missed this article when it was first posted. It definitely brings me back. Basic was my first language on an HP3000 mini, then I got to use Pascal and C in college. In my view both are good languages and have a philosophy to them from the original authors. C makes you work a little harder on understanding memory than I think makes sense for most people, but over the years I think that’s true of almost any language. Or put differently, if you want to reduce bugs in programs, don’t require programmers to allocate and deallocate memory for dynamic structures.
Later on I worked at Borland with Anders Hejlsberg and others to bring some of the later versions of Turbo Pascal and then Delphi to market. I think Borland did a good job making Pascal more accommodating than the official standard, bringing it into the modern era of Object-Oriented Programming, the event-driven model for Windows etc.
It was interesting to see how C evolved into C++ which in many ways is so complex that few people really understand what’s going on at runtime. To a certain extent I think Visual Basic and then Java were an attempt to make mainstream programming much easier than it had become. And arguably newer languages like Go are doing this once again.
Perhaps “back to basics” programming is just another one of those cycles that repeats every 20 or so years.
November 4, 2014 at 10:26 am
I loved Pascal! My high school computer science class used Turbo Pascal 3 and later 6, and it just felt perfect to me, most of the same concepts as C but much more readable. Later versions of BASIC such as Amiga BASIC and QBASIC were better structured than the 8-bit versions and finally did lose the requirement for line numbers, and of course C++ and Java are more popular today, but my best memories were of Turbo Pascal and I still prefer its readability. These days the closest popular equivalent would probably be Python, but it’s an interpreted language, and if I feel the need to compile a stand-alone executable and I don’t feel like struggling with C, there is now the open-source Free Pascal compiler, whose IDE still looks practically identical to the Borland one I grew up with.
November 26, 2015 at 2:19 pm
Looks like I started out a good bit earlier than everyone else here, programming in BASIC in my senior year in high school (1974, on some form of HP mini). I graduated with a B.S.E.E. from my university in 1978. Computer science did not exist as a discipline/major back then. IIRC, my uni also had a HP3000, and I did most of my course related programming in BASIC with the odd excursion into FORTRAN.
It is possible to write well-structured programs in any language, IMO, as long as you put chunks of code you are going to use more than once (and sometimes only once, if they are complex) into subroutines/methods/whatever, and as long as you put meaningful comments in the code that explain what you are doing, rather than just being pedantic.
In fact, a lot of my company’s early code was written in Intel x86 assembly language, where you could use ‘structures’ (though, I think that was made possible via Microsoft’s MASM assembler). These are essentially overlays on a chunk of memory that function as primitive objects. Coupled with well thought-out comments, you have ‘structured assembly language’ :-)
My company currently builds middleware for large e-commerce platforms, and we have been using Java since it first came out in 1996. I remember it being a big paradigm shift for us old hands that were using C at the time, but we eventually got comfortable with it. And it’s far superior to C++, IMO, which had lots of grey areas depending on which compiler you were using.
December 28, 2015 at 2:41 am
Here’s my BASIC programming story: When I was 12 years old my dad bought me a kids book about programming in BASIC because I always felt like an idiot using my dad’s new Apple //+ that my 17-year-old brother seemed to be able to do anything with (in reality, he wasn’t much more advanced than I was, it just seemed like it at the time). Anyway I took to this book for some reason, so by the time I entered 9th grade and was allowed to take 11th grade Computer Science in high school, I was easily the best BASIC programmer in the class, and luckily that’s the language we used on the PRIME mainframe housed at a school across town, which we accessed with terminals. This didn’t last long as I was soon kicked out of the class and not allowed to take computer science again after I stupidly gave my password to some guy from another school claiming to be starting a ‘hacking group’ called the PRIMOS Resistance. Anyway a few years later I convinced my dad to buy an Apple-Cat modem to replace the acoustic coupler my brother got from somewhere. I was really into BBSing and soon found Telecat BBS, a program almost entirely written in BASIC, except for the modem routines which I didn’t know how to write anyway. So here they were already written for me, and I could modify the BBS to my heart’s desire. Unfortunately I never got to run my BBS because a couple years later I got my own computer, a Vtech Laser 128ex, with expansion box for the Apple-Cat. The Laser 128 was the only Apple //e clone that wasn’t sued out of existence by Apple because Vtech reverse-engineered the ROM. But they did a poor job of it, because the advanced BASIC I was writing for my BBS wouldn’t work on the Laser. I ended up having a nervous breakdown before finally calling Vtech and finding out that I needed a new ROM to fix the bugs, which the bastards actually charged me $30 to get. 
By this time the shine of programming was wearing off; I was discovering girls, got a car, etc., and I was always on the phone anyway, so there was no point in running a BBS. I still have fond memories of my BBS programming, though. I gutted Telecat and re-wrote it from scratch. I was so proud when I changed the routine that made you enter your name in all caps to automatically capitalize the first letter of your first and last name, with the rest in lower-case. I ended up getting obsessed with variable arrays. I realized I could store everything in variable arrays and have the data available at all times: message headers, usernames and passwords, even dungeon and combat stats. Unfortunately, defining huge dimensioned arrays took up so much memory there was hardly any room for code. No problem, because my latest obsession was the CHAIN command. The one thing that always bothered me was my inability to figure out how to implement RWTS (read/write track-sector) for my message base in BASIC. A local guy a couple of years older than me had written a famous BBS for the Commodore 64 that was all in assembly; it used RWTS for its message base and was lightning fast, way faster than my BBS, and I was jealous. At the time I didn’t realize that this was all possible because the C64 floppy drive was far more sophisticated than the Apple // floppy.
Oh well, that’s my story. BTW, your blog is awesome… lots of fun to read.
Joe Mc Swiney
July 9, 2017 at 5:20 pm
I studied electronics engineering back in the early ’80s. When I returned to hobbyist coding recently I had to learn C. In hindsight, I could not understand why our curriculum in the ’80s focused on Fortran and Pascal for engineers who were clearly going to be operating at the machine level much of the time. Thanks to this article I now understand and the college is forgiven :-). BTW our “mainframe” was a Prime 850 mini and we spent a lot of time hacking in CPL script. Zero structure and way more fun than Pascal.
November 29, 2019 at 6:09 am
and other seeing them as straitjackets
November 29, 2019 at 5:03 pm
September 29, 2021 at 10:20 pm
A few points of correction:
“The first really complete example of such a language arrived in 1954. We’ve actually met it before on this blog: FORTRAN,…”
Probably worth noting that an actual working FORTRAN compiler wasn’t available until 1957. The specification of the FORTRAN language was in place by 1954, but not a working implementation.
“LISP, the ancestor of MIT’s MDL and Infocom’s ZIL, arrived in 1958.”
The first known working interpreter for LISP was actually released in May 1959.
“COBOL, language of a million dull-but-necessary IBM mainframe business programs, appeared in 1959.”
The first COBOL program actually ran in August 1960.
Obviously the overall time frame is still very much accurate, so I’m not trying to be overly pedantic, but one thing that’s worth nothing about some of these languages is that they had a gestation period before they actually saw practical (or even any) use. And I think that’s interesting because by the time they actually became usable, they went through a period of specification and design and thought that was reflective of how people were considering what it meant to “program” at the time.
September 30, 2021 at 9:20 am
Correction: “worth NOTING” in my previous post. :)
“Worth nothing” is about the worst typo I could have made there.
September 30, 2021 at 9:30 am
“…he developed Pascal between 1968 and 1970.”
An interesting side note to that range: while the first Pascal compiler was created in 1969, it actually didn’t work. It wasn’t until 1970 that a working compiler was developed and made available.
February 17, 2022 at 9:10 am
I remember once reading a comment about Pascal, that it was “… a great improvement on all its successors.”