
Tag Archives: wizardry

Playing Wizardry

Writing about Ultima earlier, I described that game as the first to really feel like a CRPG as we would come to know the genre over the course of the rest of the 1980s. Yet now I find myself wanting to say the same thing about Wizardry, which was released just a few months after Ultima. That’s because these two games stand as the archetypes for two broad approaches to the CRPG that would mark the genre over the next decade and, arguably, even right up to the present. The Ultima approach emphasizes the fictional context: exploration, discovery, setting, and, eventually, story. Combat, although never far from center stage, is relatively deemphasized, at least in comparison with the Wizardry approach, which focuses on the process of adventuring above all else. Like their forefather, Wizardry-inspired games often take place in a single dungeon, seldom feature more than the stub of a story, and largely replace the charms of exploration, discovery, and setting with those of tactics and strategy. The Ultima strand is often mechanically a bit loose — or more than a bit, if we take Ultima itself, with its hit points as a purchasable commodity and its concept of character level as a function of time served, as an example. The Wizardry strand is largely about its mechanics, so it had better get them right. (As I wrote in my last post about Wizardry, Richard Garriott refined and balanced Ultima by playing it a bit himself and soliciting the opinions of a few buddies; Andrew Greenberg and Robert Woodhead put Wizardry through rigorous balancing and playtesting that consumed almost a year.) This bifurcation parallels the dueling approaches to tabletop Dungeons and Dragons, as either a system for interactive storytelling enjoyed by “artful thespians” or a single-unit tactical wargame.

Wizardry, then, isn’t much concerned with niceties of setting or story. The manual, unusually lengthy and professional as it is, says nothing about where we are or just why we choose to spend our time delving deeper and deeper into the game’s 10-level dungeon. If a dungeon exists in a fantasy world, it must be delved, right? That’s simply a matter of faith. Only when we reach the 4th level of the dungeon do we learn the real purpose of it all, when we fight our way through a gauntlet of monsters to enter a special room.

CONGRATULATIONS, MY LOYAL AND WORTHY SUBJECTS. TODAY YOU HAVE SERVED ME WELL AND TRULY PROVEN YOURSELF WORTHY OF THE QUEST YOU ARE NOW TO UNDERTAKE. SEVERAL YEARS AGO, AN AMULET WAS STOLEN FROM THE TREASURY BY AN EVIL WIZARD WHO IS PURPORTED TO BE IN THE DUNGEON IMMEDIATELY BELOW WHERE YOU NOW STAND. THIS AMULET HAS POWERS WHICH WE ARE NOW IN DIRE NEED OF. IT IS YOUR QUEST TO FIND THIS AMULET AND RETRIEVE IT FROM THIS WIZARD. IN RECOGNITION OF YOUR GREAT DEEDS TODAY, I WILL GIVE YOU A BLUE RIBBON, WHICH MAY BE USED TO ACCESS THE LEVEL TRANSPORTER [otherwise known as an “elevator”] ON THIS FLOOR. WITHOUT IT, THE PARTY WOULD BE UNABLE TO ENTER THE ROOM IN WHICH IT LIES. GO NOW, AND GOD SPEED IN YOUR QUEST!

And that’s the last we hear about that, until we make it to the 10th dungeon level and the climax.

What Wizardry lacks in fictional context, it makes up for in mechanical depth. Nothing that predates it on microcomputers offers a shadow of its complexity. Like Ultima, Wizardry features the standard, archetypical D&D attributes, races, and classes, renamed a bit here and there for protection from Mr. Gygax’s legal team. Wizardry, however, lets us build a proper adventuring party with up to six members in lieu of the single adventurer of Ultima, with all the added tactical possibilities managing a team of adventurers implies. Also on offer here are four special classes in addition to the basic four, to which we can change characters when they become skilled enough at their basic professions. (In other words, Wizardry is already offering what the kids today call “prestige classes.”) Most impressive of all is the aspect that gave Wizardry its name: priests eventually have 29 separate spells to call upon, mages 21, each divided into 7 spell levels to be learned slowly as the character advances. Ultima‘s handful of purchasable scrolls, which had previously marked the state of the art in CRPG magic systems, pales in comparison. Most of the depth of Wizardry arises one way or another from its magic system. It’s not just a matter of learning which spells are most effective against which monsters, but also of husbanding one’s magic resources: deciding when one’s spell casters are depleted enough that it’s time to leave the dungeon, deciding whether the powerful spell is good enough against that demon or whether it’s time to use the really powerful one, etc. It’s been said that a good game is one that confronts players with interesting, non-obvious — read, difficult — decisions. By that metric, magic is largely what makes Wizardry a good game.
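
The bookkeeping behind all that husbanding is simple to sketch. The following toy Pascal program is purely illustrative (the slot counts and names are invented, not taken from Wizardry’s actual code), but it shows the kind of per-spell-level accounting the game asks the player to reason about:

PROGRAM SpellBook;
VAR
  Slots: ARRAY[1..7] OF INTEGER;  { castings remaining per spell level }
  L, Remaining: INTEGER;
BEGIN
  { a mid-career mage: plenty of weak spells, one big one, nothing above }
  Slots[1] := 9; Slots[2] := 5; Slots[3] := 3; Slots[4] := 1;
  Slots[5] := 0; Slots[6] := 0; Slots[7] := 0;

  IF Slots[4] > 0 THEN            { spend the big spell on the demon... }
    Slots[4] := Slots[4] - 1;     { ...and it is gone until we rest }

  Remaining := 0;
  FOR L := 1 TO 7 DO
    Remaining := Remaining + Slots[L];
  IF Remaining < 5 THEN
    WRITELN('Running dry -- time to head back to the castle.')
END.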

Of course, Wizardry‘s mechanics, from its selection of classes and races to its attribute scores that max out at 18 to its armor-class score that starts at 10 and moves downward for no apparent reason, are steeped in D&D. There’s even a suggestion in the manual that one could play Wizardry with one’s D&D group, with each player controlling a single character — not that that sounds very compelling or practical. The game also tries, not very successfully, to shoehorn in D&D‘s mechanic of alignment, a silly concept even on the tabletop. On the computer, good, evil, and neutral are just a set of arbitrary restrictions: good and evil cannot be in the same party, thieves cannot be good.

Sometimes you meet “friendly” monsters in the dungeon. If good characters kill them anyway, or evil characters let them go, there’s a chance that their alignments will change — which can in turn play the obvious havoc with party composition. (In an amusing example of unintended emergent behavior, it’s also possible for the “evil” mage at the end of the game to be… friendly. Now doesn’t that present a dilemma for a “good” adventurer, particularly since not killing him means not getting the amulet that the party needs to get out of his lair?)

So, Greenberg and Woodhead were to some extent just porting to the computer an experience that had already proven compelling as hell to many players, albeit doing a much more complete job of it than anyone had managed before. But there’s also much that’s original here. Indeed, so much that would become standard in later CRPGs has its origin here that it’s hard to know where to begin to describe it all. Wizardry is almost comparable to Adventure in defining a whole mode of play that would persist for many years and countless games. For those few of you who haven’t played an early Wizardry game, or one of its spiritual successors (read: slavish imitators) like The Bard’s Tale or Might and Magic, I’ll take you on a very brief guided tour of a few highlights. Sorry about my blasphemous adventurer names; I’ve been reading the Old Testament lately, and it seems I got somewhat carried away with it all.

Wizardry is divided into two sections: the castle (shown below), where we do all of the housekeeping chores like making characters, leveling up, putting together our party, shopping for equipment, etc.; and the dungeon, where the meat of the game takes place.

When we enter the dungeon, we start in “camp.” We are free to camp again at any time in the dungeon, as long as we aren’t in the middle of a fight. Camping gives us an opportunity to tinker with our characters and the party as a whole without needing to worry about monsters. We can also cast spells. Here I’ve just cast MAPORFIC, a very useful spell which reduces the armor class of the entire party by two for the duration of our stay in the dungeon. All spells have similar made-up names; casting one requires looking it up in the manual and entering its name.

Once we leave camp, we’re greeted with the standard traveling view: a first-person wireframe-3D view of our surroundings occupies the top left, with the rest of the screen given over to various textual status information and a command menu that’s really rather wasteful of screen space. (I suspect Greenberg and Woodhead used it because it gave them something with which to fill up screen space that they didn’t have to spend computing resources dynamically updating.)

I was just saying that Wizardry manages to be its own thing, separate from D&D. That becomes clear when we consider the player’s biggest challenge: mapping. It’s absolutely essential that she keep a meticulous map of her explorations. Getting lost and not knowing how to return to the stairs or elevator is almost invariably fatal. While tabletop D&D players are often also expected to keep rough maps of their journeys, few dungeon masters are as unforgiving as Wizardry. In addition to all the challenges of keeping track of lots of samey-looking corridors and rooms, the game soon begins to throw other mapping challenges at the player: teleporters that suddenly throw the party somewhere else entirely; spinners that spin them in place so quickly it’s easy not to realize it’s happened; passages that wrap around from one side of the dungeon to the other; dark areas that force one to map by trial and error, literally by bashing one’s head against the walls.

On the player’s side are an essential mage spell, DUMAPIC, that tells her exactly where she is in relation to the bottom-left corner of the dungeon level; and the knowledge that all dungeon levels are exactly 20 spaces by 20 spaces in size. Mapping is such a key part of Wizardry that Sir-tech even provided a special pad of graph paper for the purpose in the box, sized 20 X 20.
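
Implementing those tricks is far easier than surviving them. With every level guaranteed to be 20 spaces on a side, a wrap-around passage amounts to nothing more than modular arithmetic on the party’s coordinates. A minimal Pascal sketch, with invented names and coordinates:

PROGRAM WrapAround;
CONST
  Size = 20;                      { every dungeon level is 20 X 20 }
VAR
  X, Y: INTEGER;                  { 0..19, as DUMAPIC might report them }
BEGIN
  X := 19; Y := 7;                { standing against the east edge }
  X := (X + 1) MOD Size;          { one step east, "off" the map... }
  WRITELN('Party is now at ', X, ',', Y)  { ...and back at the west edge: 0,7 }
END.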

The necessity of mapping for yourself is easily the most immediately off-putting aspect of a game like Wizardry for a modern player. While games before Wizardry certainly had dungeons, it was the first to really require such methodical mapping. The dungeons in Akalabeth and Ultima, for instance, don’t contain anything other than randomized monsters to fight and randomized treasure to collect. The general approach in those games becomes to use “Ladder Down” spells to quickly move down to a level with monsters of about the right strength for one’s character, to wander around at random fighting monsters until satisfied and/or exhausted, then to use “Ladder Up” spells to make an escape. There’s nothing unique to really be found down there. Wizardry changed all that; its dungeon levels may be 99% empty rooms, corridors, and randomized monster encounters, but there’s just enough unique content to make exploring and mapping every nook and cranny feel essential. If that’s not motivation enough, there’s also the lack of a magic equivalent to “Ladder Up” and “Ladder Down” until one’s mage has reached level 13 or higher. Map-making is essential to survival in Wizardry, and for many years to follow laborious map-making would be a standard part of the CRPG experience. It’s an odd thing: I have little patience for mazes in text adventures, yet find something almost soothing about slowly building up a picture of a Wizardry dungeon on graph paper. Your mileage, inevitably, will vary.

In general Wizardry is all too happy to kill you, but it does offer some kindnesses here and there in addition to DUMAPIC and dungeon levels guaranteed to be 20 X 20 spaces. These proving grounds are, for example, one of the few fantasy dungeons to be equipped with a system of elevators. They let us bypass most of the levels to quickly get to the one we want. Here we’re about to go from level 1 to level 4.

From level 4 we can take another elevator all the way down to level 9. But, as you can see below, entering that second elevator is allowed for “authorized users only.”

Wizardry doesn’t have the ability to save any real world state at all. Only characters can be saved, and only from the castle. Each dungeon level is reset entirely the moment we enter it again (or, more accurately, reset when we leave it, when it gets dumped from memory to be replaced by whatever comes next). Amongst other things, this makes it possible to kill Werdna, the evil mage of level 10, and thus “win the game” over and over again. One way the game does manage to work around this state of affairs is through checks like the one illustrated above. We can only enter the second elevator if we have the blue ribbon — and we can only get that through the fellow who enlisted our services in another part of level 4 (see the quotation above). By tying progress through the plot (such as it is) to objects in this way, Greenberg and Woodhead manage to preserve at least a semblance of game state. The blue ribbon is of course an object which we carry around with us, and which is preserved when we save our characters back at the castle. It therefore gives the game a way of “knowing” whether we’ve completed the first stage of our quest, and thus whether it should allow us into the lower levels. It’s quite clever in its way, and, again, would become standard operating procedure in many other RPGs for years to come. The mimesis breaker is that, just as we can kill Werdna over and over, we can also acquire an infinite number of these blue ribbons by reentering that special room on level 4 again and again.
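
Seen from the programmer’s side, such a lock-and-key check needs no saved dungeon state at all; a scan of the party’s saved inventory suffices. Here is a hypothetical Pascal sketch of the idea (the item number and messages are invented for illustration, not drawn from the game’s actual code):

PROGRAM RibbonGate;
CONST
  BlueRibbon = 42;                { invented item number }
  PackSize = 8;
VAR
  Pack: ARRAY[1..PackSize] OF INTEGER;  { as loaded from the character disk }
  I: INTEGER;
  HasRibbon: BOOLEAN;
BEGIN
  FOR I := 1 TO PackSize DO Pack[I] := 0;
  Pack[3] := BlueRibbon;          { earned in the special room on level 4 }

  HasRibbon := FALSE;
  FOR I := 1 TO PackSize DO
    IF Pack[I] = BlueRibbon THEN HasRibbon := TRUE;

  IF HasRibbon THEN
    WRITELN('The elevator doors slide open.')
  ELSE
    WRITELN('AUTHORIZED USERS ONLY!')
END.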

There’s a surprising amount of unique content in the first 4 levels: not only our quest-giver and the restricted elevator, but also some special rooms with their own atmospheric descriptions and a few other lock-and-key-style puzzles similar to, although less critical than, the second-elevator puzzle. In levels 5 through 9, however, such content is entirely absent. These levels hold nothing but empty corridors and rooms. I believe the reason for this is down to disk capacity. Wizardry shipped on two disks, but the first serves only to host the opening animation and some utilities. The game proper lives entirely on a second disk, as must all of the characters that players create. This disk is stuffed right to the gills, and probably would not allow for any more text or “special” areas. Presumably Greenberg and Woodhead realized this the hard way, when the first four levels were already built with quite a bit of unique detail.

We start to see more unique content again only on level 10, the lair of Werdna himself. There’s this, for instance:

From context we can conclude that Trebor must be the quest giver that we met back on level 4. “Werdna” and “Trebor” are also, of course, “Andrew” and “Robert” spelled backward. Wizardry might like to describe itself using some pretty high-minded rhetoric sometimes and might sport a very serious-looking dragon on its box cover, but Greenberg and Woodhead weren’t above indulging in some silly fun in the game proper. When mapped, level 8 spells out Woodhead’s initials; ditto level 9 for Greenberg’s.

In the midst of all this exploration and mapping we’re fighting a steady stream of monsters. Some of these fights are trivial, but others are less so, particularly as our characters advance in level and learn more magic and the monsters we face also get more diverse and much more dangerous, with more special capabilities of their own.

The screenshot above illustrates a pretty typical combat dilemma. In an extra little touch of cruelty most of its successors would abandon, Wizardry often decides not to immediately tell us just what kind of monsters we’re facing. The “unseen entities” above could be Murphy’s ghosts, which are pretty much harmless, or nightstalkers, a downright sadistic addition that drains a level every time one successfully hits a character. (Exceeded in cruelty only by the vampire, which drains two levels.) So, we are left wondering whether we need to throw every piece of high-level magic we have at these things in the hopes of killing them before they can make an attack, or whether we can take it easy and preserve our precious spells. As frustrating as it can be to waste one’s best spells, it usually pays to err on the side of caution in these situations; once a character is up to level 9 or so, each experience level represents hours of grinding. Indeed, if there’s anything Wizardry in general teaches, it’s the value of caution.

I won’t belabor the details of play any more here, but rather point you to the CRPG Addict’s posts on Wizardry for an entertaining description of the experience. Do note as you read, however, that he’s playing a somewhat later MS-DOS port of the Apple II original.

The Wizardry series today has the reputation of being the cruelest of all the early CRPGs. That’s by no means unearned, but I’d still like to offer something of a defense of the Wizardry approach. In Dungeons and Desktops, Matt Barton states that “CRPGs teach players how to be good risk-takers and decision-makers, managers and leaders,” on the way to making the, shall we say, bold claim that CRPGs are “possibly the best learning tool ever designed.” I’m not going to touch the latter claim, but there is something to his earlier statement, at least in the context of an old-school game of Wizardry.

For all its legendary difficulty, Wizardry requires no deductive or inductive brilliance or leaps of logical (or illogical) reasoning. It rewards patience, a willingness to experiment and learn from mistakes, attention to detail, and a dedication to doing things the right way. It does you no favors, but simply lays out its world before you and lets you sink or swim as you will. Once you have a feel for the game and understand what it demands from you, it’s usually only in the moment that you get sloppy, the moment you start to take shortcuts, that you die. And dying here has consequences; it’s not possible to save inside the dungeon, and if your party is killed they are dead, immediately. Do-overs exist only in the sense that you may be able to build up another party and send it down to retrieve the bodies for resurrection. This approach is probably down at least as much to the technical restrictions Greenberg and Woodhead were dealing with — saving the state of a whole dungeon is complicated — as to a deliberate design choice, but once enshrined it became one of Wizardry‘s calling cards.

Now, this is very possibly not the sort of game you want to play. (Feel free to insert your “I play games to have fun, not to…” statements here.) Unlike some “hardcore” chest-thumpers you’ll meet elsewhere on the Internet, I don’t think that makes you any stupider, more immature, or less manly than me. Hell, often I don’t want to play this sort of game either. But, you know, sometimes I do.

My wife and I played through one of the critical darlings of last year, L.A. Noire, recently. We were generally pretty disappointed with the experience. Leaving aside the sub-Law and Order plotting, the typically dodgy videogame writing, and the most uninteresting and unlikable hero I’ve seen in a long time, our prime source of frustration was that there was just no way to fuck this up. The player is reduced to stepping through endless series of rote tasks on the way to the next cut scene. The story is hard-coded as a series of death-defying cliffhangers, everything always happening at the last possible second in the most (melo-)dramatic way possible, and the game is quite happy to throw out everything you as the player have, you know, actually done to make sure it plays out that way. In the end, we were left feeling like bit players in someone else’s movie. Which might not have been too terrible, except it wasn’t even a very good movie.

In Wizardry, though, if you stagger out of the dungeon with two characters left alive with less than 10 hit points each, that experience is yours. It wasn’t scripted by a hack videogame writer; you own it. And if you slowly and methodically build up an ace party of characters, then take them down and stomp all over Werdna without any problems at all, there’s no need to bemoan the anticlimax. The satisfaction of a job well and thoroughly done is a reward of its own. After all, that’s pretty much how the good guys won World War II. To return to Barton’s thesis, it’s also the way you make a good life for yourself here in the real world; the people constantly scrambling out of metaphorical dungeons in the nick of time are usually not the happy and successful ones. If you’re in the right frame of mind, Wizardry, with its wire-frame graphics and its 10 K or so of total text, can feel more immersive and compelling than L.A. Noire, with all its polygons and voice actors, because Wizardry steps back and lets you make your own way through its world. (It also, of course, lets you fuck it up. Oh, boy, does it let you fuck it up.)

That’s one way to look at it. But then sometimes you’re surprised by six arch-mages and three dragons who proceed to blast you with spells that destroy your whole 15th-level party before anyone has a chance to do a thing in response, and you wish someone had at least thought to make sure that sort of thing couldn’t happen. Ah, well, sometimes life is like that too. Wizardry, like reality, can be a cruel mistress.

I’m making the Apple II version and its manual available for you to download, in case you’d like to live (or relive) the experience for yourself. You’ll need to remove write permissions from the first disk image before you boot with it. As part of its copy protection, Wizardry checks to see if the disk is write protected, and refuses to start if not. (If you’re using an un-write-protected disk, it assumes you must be a nasty pirate.)

Next time I’ll finish up with Wizardry by looking at what Softline magazine called the “Wizardry phenomenon” that followed its release.


Making Wizardry

When we left off, Robert Woodhead had just completed Galactic Attack and, as he and Norman Sirotek waited for the Apple Pascal run-time system that would let them release it, was already considering what game to do next. Once again he turned to the lively culture of PLATO for inspiration. As I described in an earlier post, PLATO had been home to the very first computerized adaptations of Dungeons and Dragons, and still housed the most sophisticated examples of the emerging CRPG form. Microcomputers in 1980 had nothing to compare with PLATO games like Moria, Oubliette, and Avatar, games that foreshadowed not only the PC-based single-player CRPGs soon to come but also the online social dynamics of more modern MMORPGs like World of Warcraft. Looking around at a microcomputer scene that offered only much less sophisticated games like Temple of Apshai, Woodhead began considering how he might bring some modicum of the PLATO CRPG experience to PCs. He tentatively named his new project Paladin.

Coincidentally, a computer-science graduate student at Cornell, Andrew Greenberg, had been working on the same idea for quite a long time already. During spring-break week, 1978, Greenberg, still an engineering undergraduate at the time, was lazing around with his friends, playing chess, Scrabble, and cards. From the first issue of the short-lived newsletter WiziNews:

After a couple of days, he [Greenberg] says that, “I was getting tired of these same games. I was bored and complained about my boredom.” A friend suggested offhand that he go put Dungeons and Dragons on a computer.

Greenberg worked on the idea in fits and starts over the months that followed, constantly expanding the game — which he had dubbed Wizardry — on his dorm-room Apple II. He could sense he had the germ of something good, especially when his friends started visiting to play the game on his computer and ended up staying all night. Like so many would-be game makers, however, Greenberg found bringing all of his ideas to fruition in the face of limitations — both his own and those of his hardware — to be a challenge. He had coded the game in BASIC, the only language other than assembly to which he had access on his Apple II. It was slow. Painfully slow. And as it got bigger, dealing with all the frustrations and limitations of BASIC became a bigger and bigger problem.

Meanwhile, Greenberg was working in the university’s PLATO computer lab, where one of his duties was to keep the hordes of gamers from monopolizing terminals ostensibly intended for education. PLATO-addict Woodhead was, naturally, one of his biggest problem children. The two engaged in a constant battle of wits, Greenberg devising new schemes to lock down the gaming files and Woodhead always finding ways around his roadblocks. “He was one of those people who just seemed to live to make my life miserable,” says Greenberg.

But then his nemesis, who had played one of the copies of his game that were being passed around campus, came to Greenberg with a proposition. Greenberg had — or at least was well on the way to having — an innovative, original design, but was having problems realizing it technically; Woodhead had gotten very good at programming the Apple II in Pascal, but had only the sketch of a design for his game. Further, Woodhead had, through his connections with the Sirotek family, the resources to get a game published and marketed. Greenberg hadn’t previously thought along these lines, having envisioned his game as just a fun project for his “buds,” but he certainly wasn’t averse to the idea. The match was obvious, and a partnership was born. The two sat down to discuss just what the new game should be. Rather than just make a clone of the PLATO CRPGs, they had some original ideas of their own to include.

Another popular genre on PLATO was the “maze runners,” in which players had to find their way out of a labyrinth shown to them in a three-dimensional, first-person perspective. (I’ve had occasion to mention them before on this blog; they were also the inspiration, by way of Silas Warner’s port of one to the Apple II, for the dungeon-delving section of Richard Garriott’s Akalabeth.) Greenberg and Woodhead wondered if it might be possible to build a CRPG from that perspective, putting the player right into the world, as it were, rather than making her view the action from on high. The two were also very fond of the party dynamics of tabletop D&D sessions, in which every player controlled an avatar with different tactical strengths and weaknesses, forcing the players to work together to devise an optimum strategy that made the best use of them all. Being built on an online network, many of the PLATO CRPGs also let players team up to explore and fight together. This sort of thing just wasn’t possible on an Apple II given the state of telecommunications of the time, but as a next-best thing they thought to give the player control over an entire party of adventurers rather than a single character. What she lost in not being able to bond with a single character that definitively represented her would presumably be more than made up for by the tactical depth this configuration would allow.

Greenberg today frankly characterizes the months that followed, months of designing, implementing, testing, and revising what would become Wizardry, as “the most wondrous of my life.” The general role played by each was precisely the opposite of what you might expect: Greenberg, the budding computer scientist, designed the game system and the dungeons to be explored, while Woodhead, the psychology major, did most of the programming and technical work. Partly this division of labor came down to practicalities. Woodhead, still suspended from classes, had a lot more time to work on thorny technical issues than Greenberg, immersed in the first year of an intensive PhD program. Nor were the two exclusively confined to these roles. Greenberg, for instance, had already created many of the algorithms and data structures that would persist into the final game by the time he turned his earlier game’s code over to Woodhead.

Almost from the start, the two envisioned Wizardry as not just a game but a game system. In best D&D (and Eamon) fashion, the player would carry her adventurers from scenario to scenario — or, in D&D parlance, from module to module. The first release, which Greenberg and Woodhead planned to call Dungeons of Despair, would only be the beginning. Woodhead therefore devoted a lot of attention to their tools, crafting a whole framework for making future Wizardry scenarios as cleanly and easily as possible. Greenberg characterizes the final product as “layers upon layers of interpreters,” with the P-Machine interpreter itself at the bottom of the stack. And in addition to the game engine itself, Woodhead also coded a scenario editor that Greenberg — and, it was hoped, eventually other designers — could use to lay out the dungeons, treasures, and monsters.

Apple Pascal’s unique capabilities were key to fitting such an ambitious design into the Apple II. One of the most important was the concept of code segments. Segments allowed a programmer to break up a large program into a collection of smaller pieces. The Pascal library needed to load only the currently active segment into memory. When execution branched to another segment, the previous segment was dumped and the new one loaded in its place. This scheme allowed the programmer to write, relatively painlessly, a single program much larger than the physical memory of the Apple II would seem to allow. It was, in other words, another early form of virtual memory. While it was possible to chain BASIC programs together to create a superficially similar effect, as evidenced by Eamon, Ultima, and plenty of others, the process was a bit of a kludge, and preserving the state of the game across programs that the computer saw as essentially unrelated was a constant headache.
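
The feature shows up directly in the source code: any routine declared with the SEGMENT keyword is kept on disk and swapped into memory only while it executes. A minimal sketch, with invented procedure names:

PROGRAM Overlays;

  SEGMENT PROCEDURE CastleMode;
  BEGIN
    WRITELN('Rosters, shops, leveling up...')
  END;

  SEGMENT PROCEDURE DungeonMode;
  BEGIN
    WRITELN('Exploration and combat...')
  END;

BEGIN
  CastleMode;   { castle code swapped into memory, then dumped on return }
  DungeonMode   { dungeon code loaded into the space just vacated }
END.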

Another remarkable and important aspect of Apple Pascal was its graphics system, which went far beyond the capabilities of Applesoft BASIC. It had the ability to print text anywhere on the bitmapped hi-res screen with a few simple statements. This sequence, for instance, prints an “X” in the center of the hi-res screen:

PENCOLOR (NONE);   { lift the pen, so the move leaves no line }
MOVETO (137,90);   { position the invisible pen near the center of the screen }
WCHAR ('X');       { write the character at the pen position }

Developers working in BASIC or assembly who wished to blend text with hi-res graphics had to either use the Apple II’s dual graphics/text mode, which restricted text to the bottom 4 lines of the screen, or invest considerable time and energy into rolling their own hi-res-mode text-generation system, as Muse Software did. By comparison, Wizardry‘s standard screen, full of text as it was, was painless to create.

Another hidden bonus of Apple Pascal was that it acted as a sort of copy-protection system. Because the system used its own disk format, Wizardry disks would be automatically uncopyable for anyone who didn’t themselves own Pascal, or at least didn’t have access to special software tools like a deep copier.

Greenberg and Woodhead got a prototype version of the game working in late September of 1980. They showed it to the public for the first time two months later, at the New York Personal Computer Expo. People were entranced, many asking to buy a copy on the spot. That, however, was not possible, as Apple still hadn’t come through with the promised run-time system. A second Siro-tech product was stuck in limbo, even as Apple continued to promise the run-time “real soon now.”

Yet that was not as bad as it might seem. With the luxury of time, Greenberg enlisted a collection of friends and fellow D&D fans to put the game through its paces. In addition to finding bugs, they helped Greenberg to balance the game: “I began with an algorithmic model to balance experience, monsters, treasure, and the like, and then tweaked and fine-tuned it by collecting data from the game players.” Their contributions were so significant that Woodhead states that “it would not be unfair to credit them as the third author of the game.” To appreciate how unusual this methodical approach to development was, consider this exchange about Richard Garriott’s early games from Warren Spector’s interview with him:

WS: At this point, did you have any concept of play-testing? Did you have your friends play it? Did California Pacific have any testing? Or was it just, “Hey, this is kind of cool, let’s put it out there!”

RG: Pretty much the latter. Of course my friends were playing it, and I was playing it. I was showing it off to friends. But we didn’t have any process, like, “Hey, you know, we’re about to go manufacture a thousand, so let’s please make sure there’s no bugs and go through a testing process.” There was nothing like that.

I don’t write this to criticize Garriott; his modus operandi was that of the early industry as a whole, and his early games are much more playable than their development process would seem to imply. I do it rather to point out how unusually sophisticated Greenberg and Woodhead’s approach was, perhaps comparable only to Infocom’s. One could quibble about exactly what level of difficulty should count as “balanced” (as Rob Hall wrote in The Computist #40, “If these games are really balanced, those dungeon monsters sure weigh a lot”), but the effort Greenberg and Woodhead put into getting there was well-nigh unprecedented.

The long-awaited run-time system finally arrived in early 1981, as Greenberg and Woodhead were still obsessively testing and tweaking Wizardry. Because it dispensed with the need to hold the development tools in memory, it allowed an ordinary 48 K Apple II to run most programs written and compiled with Apple Pascal. From a room above his father’s spoon factory, Norman Sirotek began duplicating and packaging Siro-tech’s first two products, the comparatively ancient Info-Tree and Galactic Attack, and selling them directly to customers via a few magazine advertisements. It was a very modest beginning. Info-Tree in particular was already showing its age, and it became obvious as the phone began to ring that the quickly-written documentation was inadequate. In fact, that ringing phone posed something of a problem. “Siro-tech” was awfully close to the family name of the Siroteks, so close that customers in need of support started to look the name up in the phone book and call the Sirotek family home. In Woodhead’s words: “After about the fourth phone call at the Sirotek home around four in the morning, we dropped the ‘o’ to become ‘Sir-tech’ and made sure the company phone number was in prominent places on the manual and packaging.”

About this time Norman’s older brother Robert joined him at the new company. He had been working as a computer programmer for a large company before, “tired of the bureaucracy,” deciding to take a flyer on this new venture. Robert turned out to be a vital ally for Greenberg and Woodhead amongst the other Siroteks, who were not at all thrilled with the idea of publishing games and kept pressuring the two to just finish with Wizardry already so everyone could move on to some sort of proper business application. Frederick Sirotek, from Softalk‘s August 1982 feature on Sir-tech:

“The boys thought it was a great game,” Sir-tech’s top adviser confirms. “But as far as I was concerned, computers were business machines. They weren’t fun machines. You do things with them that you need. I certainly did not realize that there is such a relatively large segment of the population that has the computer only or mostly for pleasure.”

Robert, on the other hand, was much more familiar with typical uses of computers and “got” Wizardry the first time he played it; he thought it “fantastic,” well worth the time and labor.

To drum up some publicity, Sir-tech took the game to the June 1981 AppleFest in Boston (the same show where Chuck Benton had his fateful meeting with Ken Williams and On-Line Systems). There they sold a demonstration version of the game, which included just the first three dungeon levels. The reception was very positive indeed. Slowly, a buzz was building about the game outside of Sir-tech and Cornell. And then TSR stepped in.

One of the less attractive sides of Gary Gygax and his company was their fondness for using the legal system as a bludgeon. This was, remember, the company that had threatened to sue MIT because an alternate name for Zork, Dungeon, was the same as that of TSR’s Dungeon! board game. It now seemed that Gygax and his company considered the double-Ds of Dungeons of Despair too close to those of Dungeons and Dragons. (One wonders just how TSR, a profoundly un-tech-savvy company almost unbelievably tardy in getting its own products onto computers, kept finding out about all these alleged violations in the first place…) Like the Zork team before them, the Sir-tech folks scoffed a bit at TSR’s chutzpah, but ultimately decided this wasn’t a fight worth having. Dungeons of Despair became Proving Grounds of the Mad Overlord — a better name in my book anyway. (If you’re going to go the purple-prose route, might as well go all out.) In a wonderful display of karmic justice, Gygax himself in the early 1990s was sued by his old company when he tried to market a new game of his own under the name Dangerous Dimensions, and had to change it to Dangerous Journeys.

Sir-tech spent the rest of that summer of 1981 making final preparations to release Wizardry at last. Here Frederick Sirotek made a vital contribution. Realizing from his own business experience how important an appearance of professionalism was and all too aware of the inadequate Info-Tree documentation, he insisted that Sir-tech put together a solid, attractive package for the game and make sure the manual “was readable by people without computer backgrounds.” From the embossed cover to the unusually lengthy, professionally-edited-and-typeset manual found within, Wizardry looked a class act, standing out dramatically from the Ziploc bags and amateurish artwork of the competition. Wizardry looked like something major.

The first pages of the manual reinforced the impression, even if their idea of what constitutes a huge, time-consuming game-development project sounds laughable today:

Wizardry is unlike any other game you have played on your Apple II computer. Using all the power and sophistication of the Pascal language, we have been able to create the most challenging fantasy war game available for any personal computer.

Wizardry is a huge program — in fact, at 14,000 lines of code, it may be the largest single microcomputer game ever created. The entire Wizardry game system, including the programs used to create the extensive Wizardry databases, comprises almost 25,000 lines of code, and is the result of over one man year of intensive effort.

The result is a game that simply could not have been written in BASIC. Wizardry has so many options and is so flexible that the only limits to the game are your imagination and ingenuity.

In something of a coup, they were able to hire one Will McLean, who had done cartoons for Dragon magazine and The Dungeon Master’s Guide, to illustrate the manual.

McLean’s work gave Wizardry more than a whiff of the house style of TSR itself, a quality sure to be attractive to all of the tabletop D&D fans likely to play it. (Remarkably, TSR didn’t try to sue them for that one…)

At the end of September, Sir-tech began shipping Wizardry at last. All of the Siroteks’ doubts were answered almost immediately; Wizardry became a sensation, the biggest release of the year in Apple II gaming. “Two months after Wizardry came out,” said Norman, “I was ready to eat my hat! I’m glad I wasn’t more convincing with my arguments.” We’ll chart its impact in a future post, but before we do that we’ll take a closer look at the game itself.


The Roots of Sir-tech

The story of Sir-tech, the software publisher that brought the Wizardry franchise to the world, is inseparable from the story of the family that founded it. To properly trace the company’s roots, we have to go to a time and place far removed from the dawning American microcomputer industry: to Czechoslovakia during the interwar period. Appropriately enough, a castle figures prominently.

Czechoslovakia was patched together from scraps of the Austro-Hungarian Empire at the end of World War I. Composed of three separate and not always friendly ethnolinguistic groups — Czechs, Slovaks, and Germans — the new country had a somewhat fractious start. Within a few years, however, things stabilized nicely, and there followed an all-too-brief happy time in the country’s short and generally vexed history. Having inherited much of the old Austro-Hungarian Empire’s industrial heartland and being possessed of an unusually well-educated population, Czechoslovakia became one of the top ten economies in the world. With business booming, a prosperous populace eager to buy homes, and a burgeoning national reputation for innovative architecture, it was a good time to be a talented and industrious Czech builder. That’s exactly what Bedrich Sirotek was, and he prospered accordingly.

The good times ended for Czechoslovakia in 1938 with the Munich Agreement, in which the country’s alleged allies conspired with Nazi Germany to strip it of its border defenses, of 3.5 million of its citizens, of many of its most valuable natural resources, and of its dignity as a sovereign nation. Sirotek was as proud a Czech as anyone, but he was also a pragmatic businessman. The uncertainty — in some sectors, verging on panic — that followed the loss of the Sudetenland led to a drastic decline in property values. Sirotek started methodically buying up land, hedging against the time when peace and prosperity would return again. Sadly, that would be a long, long time in coming for Czechoslovakia.

One of the properties Sirotek bought was special: a 12th-century Romanesque castle in the village of Stráž nad Nežárkou. It had sat empty for almost a decade following the death of its previous owner, the ill-starred opera diva Emmy Destinn, who in her time had sung with the likes of Enrico Caruso. Decrepit as it was, Sirotek envisioned the castle as the perfect seat of the business dynasty he was building. He moved in right away with his wife, son, and daughter, and started making renovation plans. But within weeks the Germans arrived to gobble up the rest of the helpless country. Sirotek’s son, Bedrich Jr., describes the scene:

“Aside from a garage door falling on me when I was 7 in Smichov, my first real memory is as a 9-year-old boy on March 15, 1939. My sister Miluska and I started out to school, but the streetcars weren’t running and there were strange-looking guys in strange-looking uniforms and strange-looking vehicles driving on the wrong side of the street. [Prewar Czechoslovakia used to have British-style left-hand driving until it became a “protectorate” of right-driving Nazi Germany.] So we went home and found my father listening to the radio. And he took us both aside and said: ‘Now hear this. The Germans have arrived. From here on out, nothing you hear in the family gets repeated.'”

Sirotek’s family continued living in the castle, which he strove to make as livable as he could given the privations of life under the Nazis. Sirotek himself, however, spent much of his time in Prague, where he became heavily involved with the resistance. On several occasions the Gestapo seemed on to him and the game seemed to be up, but, unlike virtually all of Czechoslovakia’s Jewish population, Sirotek was lucky. He survived to see the country liberated by the Soviets.

For a time it looked like Czechoslovakia might be allowed to become again the happy, prosperous little country it had been before the war, as the Soviets stepped back and allowed the people to conduct elections and form a new republic. Sirotek returned to his business interests with gusto, and finally began the extensive renovations of the family castle he had been planning to do so many years before. Bedrich Jr. names his happiest memory there as his sister’s wedding on New Year’s Eve, 1947, when he was 17. But less than two months later, the Czech Communist Party, with the encouragement and support of the Soviets, executed a coup d’état to seize absolute control of the country. Sirotek, well known for his opposition to the Communists, was in danger once again. I’ll let Bedrich Jr. tell the rest of the story, which reads like an episode from a John le Carré novel:

“One weekend soon after the commies seized power, my dad got a call from his bank manager, who’d joined the party to protect himself – and, I guess, his clients. He said: ‘Mr. Sirotek, I’d advise you to leave before dawn on Monday because that’s when they’re coming to pick you up.’ So we loaded up our Tatra and headed out to Frantiskovy Lazne, the spa nearest the West German border. My dad still had contacts from his underground days and had been negotiating with a people-smuggler even before he got the warning.

“We checked into a good hotel and, a day or two later, my mother and father and sister and I got our marching orders to go to a station nearer the frontier; my sister’s husband was already in Geneva on business.

“The smuggler wasn’t there to meet our train. It was market day, so my mother and sister just melted into the crowd of women going to shop. But my father and I stood out like sore thumbs in that closely watched station, so some cops took us in to meet the chief of police himself.

“The chief asked what we were there for, and my father said we wanted to look at the local carpet factory. But he advised us it had been closed for several years. Now he asked if we had any weapons. My father reached into his pocket and came up with a .45-caliber revolver. The chief emptied the bullets and pocketed them. Then he asked my father if he had a permit. Dad produced one.

“The chief was very polite. ‘But, Mr. Sirotek,’ he said. ‘This permit is for a .38, not a .45. Do you happen to have the .38 with you?’

“My father reached into his other pocket and produced the .38. I thought for sure we would leave that room only in handcuffs. But the chief then called our hotel to verify whether we were registered there and had we checked out? We hadn’t – and the manager told him, wrongly, that my mother and sister were still there. So the chief said: ‘Mr. Sirotek, I’m going to keep your weapons. There’s a train back to your family in an hour and I want you both to be on it.’

“We said we would and then headed for the town pub, where my mother and sister and the smuggler were waiting and worrying. By train time, we were hiding in an unused chicken coop, waiting for darkness. It was right on the Iron Curtain; we could hear the guards talking and sometimes there were gunshots. But that night we walked out of the lion’s cage and clear of the zoo.”

The Sirotek family arrived in Canada with little more than the proverbial clothes on their backs; their entire fortune, castle included, was left to the Communists back in Czechoslovakia. Undaunted, Sirotek started over. Both he and his son changed their first names to the more English-friendly Frederick, and by 1951 they had formed their own home-building business. Once again they were on hand for a great economic moment, the prosperity of the 1950s in which a generation of ex-soldiers found good jobs, married, and started buying houses. The company moved on from home-building to gas stations to major commercial projects all over eastern Canada and the northeastern United States, including such prestige projects as a wind tunnel for Ottawa Airport and a linear accelerator and ion lab for the Canadian National Research Council. Frederick Jr., now married and with three children of his own, took over complete control of the family’s numerous business concerns after his father died in 1974.

Those concerns had by this point diversified far beyond construction. The family had, for example, for many years owned a factory manufacturing those little souvenir spoons sold in gift shops. During the mid-1970s, Sirotek became aware of a small industrial-resin manufacturer in Ogdensburg, New York, looking for an outside partner to invest. The owner of the company was a woman named Janice Woodhead, a British émigré to the United States by way of Canada. The husband with whom she had founded the business had recently died, and she needed a partner to continue. Sirotek, who saw an opportunity to acquire the resin his spoon-factory needed at a much cheaper price, signed on.

The partnership eased one link in his chain of supply, but there was still a problem further up the line. The base of the resin manufactured by Woodhead’s company was ordinary sand. That might seem a cheap and plentiful commodity, but its price was anything but stable: it kept changing from week to week, largely in response to changing railroad-shipping rates. Every time that happened, Woodhead would have to recalculate manufacturing costs and pricing by hand. Sirotek didn’t really know anything about computers, but he did know enough to wonder aloud one day whether it might not be possible to program one to do all of this for them, and to do it much more quickly.

As it happened, Janice had a son named Robert who knew a thing or two about computers. Robert was attending Cornell University, allegedly majoring in psychology, but making very slow progress. The reason: Janice had been unwise enough to send Robert to a university on the PLATO network. Like an alarming number of other students, Robert became totally and helplessly addicted, cutting classes and neglecting his assignments in favor of endless hours of online socializing, games, and hacking. As he later said, “PLATO was like crack for computer nerds.” To make the situation even worse, Robert had recently acquired another dangerously addictive device: a TRS-80. Robert had already begun an alternate career in computers, working in a Computerland, programming business applications on contract, even making programs for his own university’s School of Hotel Administration.

At Janice’s suggestion, Sirotek talked to Robert about their problem. Robert’s programming resume and immediately positive response impressed him enough that Sirotek went out and paid $7000 for a top-of-the-line Apple II system to be shared by the two companies. Robert made the program as promised. As a bonus, he also implemented a mailing-list database to help the spoon manufacturer stay in contact with its suppliers and distributors. Wonderful, money well spent, time to move on, etc. Except now the wheels were beginning to turn in Sirotek’s head. His family hadn’t gotten to where it was without a keen business instinct and a nose for opportunity. Certainly lots of other businesses must have similar software needs, and Robert was a smart, personable kid he felt happy to help. As an experiment, they polished up the in-house mailing-list program, named it Info-Tree, and put some packaging together. They agreed that Robert would take the $7000 Apple II system along with the program to the Trenton Computer Festival of April 1979. (The keynote that year was delivered by Wayne Green, and had the perfect theme: “Remarkable Opportunities for Hobbyists.”)

But there was a problem: Sirotek wasn’t willing to ship his expensive computer by air, and Robert didn’t drive. Sirotek therefore decided to ask one of his sons, Norman, if he would be willing to drive Robert out to New Jersey for the show. At the time, Norman was having a bit of trouble deciding what he wanted to do with his life. After high school he’d enrolled in a business-management program at Clarkson College, only to decide it wasn’t for him after two years. He’d tried engineering for a time, but dropped out of that program as well. Recently he’d been managing construction jobs for his father’s companies while taking some engineering-drafting courses on the side. Norman had no particular interest in computers, and wasn’t thrilled about spending a weekend at a trade show for the things. However, his father was able to convince him by mentioning that Trenton was very close to the casinos and nightlife of Atlantic City.

Norman did spend some time that weekend in Atlantic City, but he also spent much more time than expected with Robert at the show. In fact, he was fascinated by what he saw there. On the drive home, he proposed to Robert that they officially go into the software business together: he would market the programs using his family’s wealth and connections, and Robert would write them. “Siro-tech” Software was born. The proposal came at a perfect time for Robert, who had just been suspended from university for a full year due to his poor grades.

The senior Sirotek officially took the role of president of the new company, but was happy to largely let the young men run with their ideas on their own, figuring the venture would if nothing else make a good learning experience:

“It was a good starter for the boys, learning from the ground up,” Fred Sirotek observes. “Neither Robert Woodhead nor Norman had too much business experience. I guess they both had some credits from the university on the subject, but in terms of hands-on experience they didn’t have any. So Norman would come to me for help — you know, ‘What do I do with this, Dad?’ I’d either produce a suggestion or direct him to what he needed.”

Robert and Norman had a long discussion about what they should do for their second product, after Info-Tree. Robert told Norman that — as if it hadn’t been obvious from the software on display at the show — games were hot. And they certainly sounded a lot more fun to write and market than business software. Norman was not, however, initially thrilled with the idea of selling games:

“I remember late one evening telling Bob Woodhead to forget the new game and put his efforts into something worthwhile, like a business package. I said nobody needs or wants the game. Bob looked straight at me and said I was wrong and went back to work.”

And so, over Norman’s mild objections, the die was cast. Siro-tech would try to make its name as a games publisher.

One of the most popular games on PLATO at the time (and one of the system’s legendary titles even today) was a space wargame called Empire. It’s a game we’ve brushed up against before on this blog: Silas Warner helped its designer, John Daleske, with its early development, and later developed a variant of his own. Robert believed it would be possible to write a somewhat stripped-down version of the game for the Apple II. Progress was slow at first, but after a few months Robert bought the brand-new Apple Pascal and fell in love with it. He designed and programmed Galactic Attack in Pascal during the latter half of 1979. Demonstrating that blissful ignorance of copyright that marked the early software industry, he not only swiped the design pretty much whole-cloth from Daleske but made his alien enemies the Kzinti, a warlike race from Larry Niven’s Known Space books.

The game was complete, but now the would-be company had a problem, a big one: they had no way to release it. Apple had promised upon the release of Apple Pascal that a “run-time system” — a way to allow ordinary Apple IIs without the Apple Pascal software or the language card to run programs written in Pascal — would be coming shortly. (The run-time system would be, in other words, a standalone P-Machine interpreter.) Robert had taken them at their word, figuring the run-time would be available by the time Galactic Attack was ready. Now it was, and the run-time wasn’t. Apple continued to promise that it was in the works, but for now Siro-tech was stuck with a game they couldn’t distribute. All they could do was wait, pester Apple from time to time, and have faith. Luckily, the deep pockets of the Sirotek family gave them that luxury. In fact, they showed quite a lot of faith: Robert was such a fan of Pascal that, in spite of all the uncertainty, he plunged into a new Pascal project even as Galactic Attack sat on the shelf. This one would be bigger, more ambitious, and more original. We’ll see where that led next time.

But before we do that, know that the Sirotek family did eventually get their castle back. It was officially returned to Frederick by the Czech government as part of its restitution for the Communist years in the early 1990s.

(In addition to the links embedded above, this article is based heavily upon articles in the March 1982 Softline, August 1982 Softalk, and December 1992 Computer Gaming World.)


Pascal and the P-Machine

Working with a small team of assistants, Niklaus Wirth designed Pascal between 1968 and 1970 at the Swiss Federal Institute of Technology in Zürich. His specification was implemented for the first time on the university’s CDC Cyber mainframe in mid-1970, and the system was finally considered complete and robust enough to introduce in beginning programming classes there in 1972. With his language essentially complete and with a working proof of concept in daily use, Wirth now shifted roles, from design and implementation to the equally daunting task of convincing computer-science departments around the world to give up their old languages and give his new one a shot. Like the PC industry of a decade later, the world of institutional computing was full of incompatible systems that often had trouble even exchanging data, much less programs. And yet Pascal needed to be available on all or most of these machines — or at least the ones commonly chosen by computer-science departments for pedagogical use — to have a chance of realizing Wirth’s goal of Pascal serving as an antidote to the deadly virus of BASIC. Porting the compiler by hand to all of those disparate architectures looked to be a daunting task indeed.

Wirth’s next epiphany should sound familiar if you read my earlier posts about Infocom: working closely with a graduate student, Urs Amman, he created a virtual machine, named the P-Machine, that could be hosted on all of these physical machines. They rewrote the Pascal compiler to output P-Code that could run under the P-Machine, just as Infocom later did in designing ZIL and the Z-Machine. (That’s of course no big surprise, as the P-Machine was the inspiration for the Z-Machine. If you’ve been reading these posts chronologically, I’m afraid we’ve rather put the cart before the horse.) Wirth, however, went one step further: he rewrote the Pascal compiler and other development tools themselves in P-Code, thus completing the circle. Once a P-Machine interpreter was written for any given platform, that platform could not only run the whole universe of already extant Pascal software, but also run the compiler, allowing users to create more software that could not only run on that platform but on all others for which P-Machine interpreters had been written. Similarly, updates to Pascal could be made instantly available on every platform hosting the language. Neat trick, no?

Beginning in 1973, Wirth offered a "P-Kit" to anyone who wanted one. It consisted of the P-Code Pascal compiler and the source code, itself written in Pascal, for a P-Machine interpreter. The recipient need only (?) translate this source into a program runnable on their platform, working in assembly or any other available language, to get a complete Pascal environment up and running. To further encourage as many implementations as possible, Wirth published the specifications for the P-Machine in his book Algorithms + Data Structures = Programs, published in German in 1975 and in English the following year. The P-Machine did its job: by the mid-1970s universities were increasingly adopting Pascal as their standard introductory teaching language in lieu of comparative dinosaurs like BASIC and FORTRAN.

Meanwhile, the PC revolution was beginning, a development of which Wirth remained virtually unaware. He was after all firmly entrenched in the established institutional computing culture, and, further, he was working from Europe, where microcomputer technology was oddly slow in arriving. It would therefore be someone else, Ken Bowles of the University of California San Diego, who would spearhead a drive to bring Pascal and the P-Machine to microcomputers.

Bowles was an angry, frustrated man when he received his P-Kit in 1974. A devotee of interactive, time-shared computing over the old batch-processing model, Bowles had ascended to director of UCSD's computer center in 1968. One of his first actions had been to replace the mainframe at the core of the center, an aged, batch-processing-bound Control Data system, with a state-of-the-art Burroughs capable of timesharing. Incredibly, however, while on a lecturing stint in Oxford, England, in mid-1974, Bowles got word that the university's administrators had decided, without even consulting him, to replace the Burroughs system with another big, traditional, batch-processing IBM mainframe. Even better, the news came not from the university but from contacts at Burroughs, who called to ask why UCSD was pulling its contract. Bowles resigned his position as director in protest, going back to being just an ordinary professor, but could only watch helplessly as the trucks arrived to cart away the Burroughs system that had been essential to so much of his and his students' research. Worse, his programming classes would now have to be taught in the old way once again: instead of being able to write a program, compile it, and instantly see the result, students would have to type it out onto punched cards, deliver it to the computer center, then return the next day — if they were lucky — to see if it had actually worked. And rinse and repeat, ad nauseam.

Bowles saw the P-Kit as a possible solution to his woes, a chance to get a proper development environment back into the hands of his students. He would let the administrators have their mainframe, and try to get Pascal running on smaller, cheaper machines. Unlike his colleague in Switzerland, Bowles could even in 1974 see where the new generation of microchip technology was leading; he realized that desktop computers were on the horizon. While he would initially implement his P-Machine on a PDP-11 minicomputer, he could already envision the day when every student would have her own private computer to program. Thus the portability of the P-Machine was key to his project.

By mid-1976, Bowles and a small group of students had already come a long way, with a working PDP-11 Pascal environment that they had begun using to teach introduction-to-programming classes. (It replaced, not without controversy from traditionalists, the older FORTRAN-based curriculum.) And they had not just created a clone of Wirth's compiler but had gone far beyond it. They had expanded greatly upon Wirth's relatively stripped-down language, adding everyday conveniences such as better string handling and easier file access. Around it they had built what amounted to an entire Pascal operating system, all running in virtualized P-Code, similar to the interactive BASIC environments of the time but better; the text editor, for instance, was something of a marvel for the era. When UCSD Pascal began to spread, their tinkering with the language raised a fair amount of ire from some quarters, not least from Wirth himself, a pedantic sort who regarded the language in its original form as perfect, with everything it needed and nothing it didn't. Still, UCSD Pascal would soon supersede Wirth's own implementation as the standard, most notably inspiring what became the commercial juggernaut Turbo Pascal. And whatever his misgivings at the time, Wirth has since come to acknowledge the enormous role UCSD Pascal played in popularizing his design in the PC world.
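
To give just a taste of those everyday conveniences: Wirth's original Pascal had no string type at all, only fixed-length packed arrays of characters, while UCSD Pascal added proper variable-length strings with built-ins like concat and length. Something along these lines (a sketch written in the later dialects descended from UCSD's, so the exact spellings are illustrative rather than historical) would have been unthinkable in the original language:

program Greeting;
{ a sketch of UCSD-style string conveniences }
var
  name, greeting : string;
begin
  name := 'UCSD';
  greeting := concat('Hello from ', name, ' Pascal!');
  writeln(greeting);
  writeln('(', length(greeting), ' characters)')
end.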

In July of 1976, Bowles and his students brought their Pascal up for the first time on a microcomputer, a Z80-based system built from a kit. He describes this moment as a “revelation”; all of the software his team had created for the PDP-11 version just worked, immediately, with no changes whatsoever.

Bowles had begun his project to provide a better tool for his students, but it was soon obvious that UCSD Pascal had commercial potential outside the university. The first partnership was with a tiny startup called Terak, who had developed a workstation called the 8510/a that was basically a stripped-down, semi-compatible clone of the PDP-11 minicomputer with added bitmapped graphics capabilities that were stunning for their time. Having been first implemented on a PDP-11, UCSD Pascal was of course a natural fit there. Bowles went on the road with Terak to demonstrate the system; the programming environment, combined with the machine's display capabilities, inspired "gasps of amazement." Terak machines soon became the standard platform for running UCSD Pascal at UCSD itself.

The greenest pastures, however, beckoned from the burgeoning PC market. Microcomputer users and programmers were already as early as 1977 trying to reckon with the incompatible machines on the market: the TRS-80, Apple II, and Commodore PET, not to mention the dozens of kit and boutique computers, were all incompatible with one another, fragmenting an already tiny software market. Yes, these machines all ran BASIC, but each hosted a subtly different version of the language, crafted in response to the hardware’s capabilities and the whims of the machine’s manufacturer, enough to guarantee that all but the simplest BASIC programs would need some translation to move from platform to platform.

Every programmer had to deal with this reality, whether by coding in BASIC and translating as necessary (as did the general-purpose magazines, who often published type-in listings footnoted with the changes needed to run the program on platforms X, Y, and Z), developing some sort of portable game engine (as did Scott Adams, Automated Simulations, and Infocom), or just focusing on a single platform and hoping it was enough to sustain a business (as did the Apple II-specific supercoders I mentioned in my last post). The UCSD system offered another solution. Beginning in 1978, Bowles and his students started a quasi-business selling versions of the system for S-100-bus PCs for $15 to anyone who asked. Those machines, descendants of the original Altair and generally either built from kits or provided by boutique manufacturers, inhabited a somewhat different ecosystem than the friendlier, more mass-market trinity of 1977, being the domain of the hardcore technical set that made up the core of Byte magazine's readership and, increasingly, business users. (Tellingly, games, which dominated early software on the trinity of 1977, were few and far between on these machines.) For all that, however, there were quite a lot of them out there, and quite a lot of their owners were eager to experiment with UCSD Pascal in lieu of their normal operating system of choice, Digital Research's CP/M.

Bowles first met Steve Jobs and Steve Wozniak at the very West Coast Computer Faire at which they unveiled the Apple II. Jobs was already eyeing the education market, eager to forge "respectable" ties for Apple and to bring professional-level software to the platform, and so the two men remained in intermittent contact. The relationship was given a boost the following year when Bill Atkinson, a UCSD alum, came to work for Apple. Atkinson, a computer engineer whose word held a great deal of sway with the non-technical Jobs, was greatly enamored of UCSD Pascal, convinced it would be a great boon for the Apple II. Still, that remained a problematic proposition at this point. Although UCSD Pascal had been designed to run on tiny machines in comparison to its inspiration, there were inevitable limits: the system required a machine with at least 64 K of memory, while the first Apple IIs could be purchased with as little as 4 K and seldom exceeded 16 K. It was an obvious nonstarter. And so the relationship between Apple and UCSD remained just talk for the moment.

In mid-1979 Apple introduced the dramatically improved Apple II Plus, which generally sold with what was taken at the time to be the machine's maximum possible memory of 48 K; the 6502 CPU at the heart of the Apple II can only address 64 K at a time, 16 K of which was claimed by the ROM that hosted the machine's BASIC-based operating system. They were getting close, but an Apple II version of UCSD Pascal still seemed out of reach. As it turned out, however, they were close enough that some clever hacking could get the job done.

The UCSD system would by design completely take over the machine, which meant that the 16 K of BASIC ROM would be superfluous while it was running. Apple therefore came up with a new expansion card (reason to bless Woz's insistence on having all those slots again!) containing 16 K of RAM. The user could choose whether the CPU addressed this RAM (for running UCSD Pascal) or the standard 16 K of ROM (for running other software). Just like that, they had their 64 K machine.
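
Here's a toy model of the bank-switching idea, in Pascal of course. It follows the simplified 48 K + 16 K picture above rather than the real Apple II memory map, and all the names are mine; the real hardware selected banks with memory-mapped "soft switches" rather than anything so polite as a variable.

program LanguageCard;
{ a toy model only: one 16 K region of a 64 K address space backed
  by either ROM or card RAM, selected by a switch }
const
  bankSize = 16384;
  bankBase = 65536 - bankSize;
var
  rom, cardRam : array[0..bankSize - 1] of integer;
  useCardRam   : boolean;

function fetch(addr : integer) : integer;
begin
  if addr < bankBase then
    fetch := 0  { the ordinary 48 K of RAM, omitted from this sketch }
  else if useCardRam then
    fetch := cardRam[addr - bankBase]  { Pascal lives here }
  else
    fetch := rom[addr - bankBase]      { BASIC lives here }
end;

begin
  rom[0] := 111;
  cardRam[0] := 222;
  useCardRam := false;
  writeln('ROM bank selected: ', fetch(bankBase));      { prints 111 }
  useCardRam := true;
  writeln('Card RAM selected: ', fetch(bankBase))       { prints 222 }
end.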

The UCSD Pascal software, renamed Apple Pascal, was sold as a single package along with this "Language Card" for about $500, beginning shortly after the arrival of the Apple II Plus. It transformed just about everything about the Apple II; even its disks used their own format, unreadable under the normal Apple II environment. It would not be an exaggeration to say that an Apple II equipped with Apple Pascal was a completely new and different machine from Woz's original creation, with a personality all its own. The inability to exchange programs and data with users who hadn't purchased the system was, undeniably, a drawback. On the plus side, however, the user got easily the most advanced development environment available on any microcomputer of this era. Not only did she have access to the Pascal language in lieu of BASIC, but Apple and UCSD worked in quite a lot of extensions to take advantage of the Apple II's unique bitmapped graphics capabilities, borrowing from the older Terak implementation. I'll come back to that a couple of posts from now, when I demonstrate a concrete example of Apple Pascal in action. And we'll start on the story that will lead to that next time.


A Tale of Three Languages

If I had to name one winner amongst the thousands of programming languages that have been created over the last 60 years, the obvious choice would be C. Developed by Dennis Ritchie from 1969 as the foundation of the Unix operating system, C remains one of the most commonly used languages even today; the Linux kernel, for example, is implemented in C. Yet that only tells part of the story. Dozens of other languages have borrowed the basic syntax of C while adding bells and whistles of their own. This group includes the most commonly used languages in computing, such as Java, C++, and Perl; quickly growing upstarts like C# and Objective-C; and plenty of more esoteric domain-specific languages, like the interactive-fiction development system TADS 3. For a whole generation of programmers, C's syntax, so cryptic and off-putting to newcomers with its parentheses, curly braces, and general preference for mathematical symbols in lieu of words, has become a sort of comfort food. "This new language can't be that bad," we think. "After all, it's really just C with…" ("these new things called classes that hold functions as well as variables"; "a bunch of libraries to make text-adventure development easy"; etc.). For obvious reasons, "C-like syntax" always seems to be near the top of the feature list of new languages that have it. (And for those who don't: congratulations on sticking to your aesthetic guns, but you've chosen a much harder road to acceptance. Good luck!)

When we jump back 30 years, however, we find in this domain of computing, as in so many others, a very different situation. At this time C was the standard language of the fast-growing institutional operating system Unix, but it had yet to really escape the Unix ghetto to join the top tier of languages in the computing world at large. Microcomputers boasted only a few experimental and/or stripped-down C compilers, and the language was seldom even granted a mention when magazines like Byte did one of their periodic surveys of the state of programming. The biggest buzz in Byte went instead to Niklaus Wirth's Pascal, named after the 17th-century scientist, inventor, and philosopher who invented an early mechanical calculating machine. Even after C arrived on PCs in strength, Pascal, pushed along by Borland's magnificent Turbo Pascal development environment, would compete with and often even overshadow it as the language of choice for serious programmers. Only in the mid-1990s did C finally and definitively win the war and become the inescapable standard we all know today.

While I was researching this post I came across an article by Chip Weems of Oregon State University. I found it kind of fascinating, so much so that I'm going to quote from it at some length.

In the early days of the computer industry, the most expensive part of owning a computer was the machine itself. Of all the components in such a machine, the memory was the most costly because of the number of parts it contained. Early computer memories were thus small: 16 K was considered large and 64 K could only be found in supercomputers. All of this meant that programs had to take advantage of what little space was available.

On the other hand, programs had to be written to run as quickly as possible in order to make the most efficient use of the large computers. Of course these two goals almost always contradicted each other, which led to the concept of the speed versus space tradeoff. Programmers were prized for the ability to write tricky, efficient code which took advantage of special idiosyncrasies in the machine. Supercoders were in vogue.

Fortunately, hardware evolved and became less expensive. Large memories and high speed became common features of most systems. Suddenly people discovered that speed and space were no longer important. In fact roles had reversed and hardware had become the least expensive part of owning a computer.

The costliest part of owning a computer today is programming it. With the advent of less expensive hardware, the emphasis has shifted from speed versus space to a new tradeoff: programmer cost versus machine cost. The new goal is to make the most efficient use of a programmer’s time, and program efficiency has become less important — it’s easier to add more hardware.

If you know something about the history of the PC, you’re probably nodding along right now, as we’re seemingly on very familiar ground. If you’re a crotchety old timer, you may even be mulling over a rant about programmers today who solve all their problems just by throwing more hardware at them. (When old programmers talk about the metaphorical equivalent of having to walk both ways uphill in the snow to school every morning, they’re actually pretty much telling the truth…) Early Apple II magazines featured fawning profiles of fast-graphics programming maestros like Nasir Gebelli (so famous everyone just knew him by his first name), Bill Budge, and Ken Williams, the very picture of Weems’s “supercoders” who wrote “tricky, efficient code which took advantage of special idiosyncrasies in the machine.” If no one, including themselves after a few weeks, could quite understand how their programs did their magic, well, so be it. It certainly added to the mystique.

Yet here’s the surprising thing: Weems is not describing PC history at all. In fact, the article predates the fame of the aforementioned three wizards. It appeared in the August, 1978, issue of Byte, and is describing the evolution of programming to that point on the big institutional systems. Which leads us to the realization that the history of the PC is in many ways a repeat of the history of institutional computing. The earliest PCs being far too primitive to support the relatively sophisticated programming languages and operating systems of the institutional world, early microcomputer afficionados were thrown back into a much earlier era, the same that Weems is bidding a not-very-fond farewell to above. Like the punk-rock movement that was exploding just as the trinity of 1977 hit the market, they ripped it up and started again, only here by necessity rather than choice. This explains the reaction, somewhere between bemused contempt and horror, that so many in the institutional world had to the tiny new machines. (Remember the unofficial motto of MIT’s Dynamic Modeling Group: “We hate micros!”) It also explains the fact that I’m constantly forced to go delving into the history of computing on the big machines to explain developments there that belatedly made it to PCs. In fact, I’m going to do that again, and just very quickly look at how institutional programming got to the relatively sophisticated place at which it had arrived by the time the PC entered the scene.

The processor at the heart of any computer can ultimately understand only the simplest of instructions. Said instructions, known as "opcodes," do such things as moving a single number from memory into a register of the processor; or adding a number already stored in a register to another; or putting the result from an operation back into memory. Each opcode is identified by a unique sequence of bits, or on/off switches. Thus the first programmers were literally bit flippers, laboriously entering long sequences of 1s and 0s by hand. (If they were lucky, that is; some early machines could only be programmed by physically rewiring their internals.) Assemblers were soon developed, which allowed programmers to replace the 1s and 0s with unique textual identifiers: "STO" to store a number in memory, "ADD" to do the obvious, etc. After writing her program using this system of mnemonics, the programmer just had to pass it through the assembler to generate the 1s and 0s the computer needed. That was certainly an improvement, but still, programming a computer at the processor level is very time-consuming. Sure, it's efficient in that the computer does what you tell it to and only what you tell it to, but it's also extremely tedious. It's very difficult to write a program of real complexity from so far down in the weeds, hard to keep track of the forest of what you're trying to accomplish when surrounded by trees made up of endless low-level STOs and ADDs. And even if you're a supercoder who's up to the task, good luck figuring out what you've done after you've slept on it. As for others figuring it out… forget about it.
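
At bottom, in other words, an assembler is little more than a table lookup. Here's a toy sketch (in Pascal, and with an invented three-instruction repertoire rather than any real processor's; real assemblers must also handle operands, labels, and symbol tables):

program ToyAssembler;
{ a sketch of the essence of an assembler:
  mnemonics in, numeric opcodes out }
type
  mnemonic = packed array[1..3] of char;
var
  names  : array[0..2] of mnemonic;
  source : array[1..3] of mnemonic;
  i, op  : integer;
begin
  names[0] := 'LOD'; names[1] := 'ADD'; names[2] := 'STO';
  { three "lines" of source: load a number, add another, store the result }
  source[1] := 'LOD'; source[2] := 'ADD'; source[3] := 'STO';
  for i := 1 to 3 do
    for op := 0 to 2 do
      if source[i] = names[op] then
        writeln(source[i], ' assembles to opcode ', op)
end.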

And so people started to develop high-level languages that would let them program at a much greater level of abstraction from the hardware, to focus more on the logic of what they were trying to achieve and less on which byte they’d stuck where 2000 opcodes ago. The first really complete example of such a language arrived in 1954. We’ve actually met it before on this blog: FORTRAN, the language Will Crowther chose to code the original Adventure more than 20 years later. LISP, the ancestor of MIT’s MDL and Infocom’s ZIL, arrived in 1958. COBOL, language of a million dull-but-necessary IBM mainframe business programs, appeared in 1959. And they just kept coming from there, right up until the present.

As the 1960s wore on, increasing numbers of people who were not engineers or programmers were beginning to make use of computers, often logging on to timesharing systems where they could work interactively in lieu of the older batch-processing model, in which the computer was fed some data, did its magic, and output some result at the other end without ever interacting with the user in between. While they certainly represented a huge step above assembly language, the early high-level languages were still somewhat difficult for the novice to pick up. In addition, they were compiled languages, meaning that the programmer wrote and saved her program as a plain text file, then passed it through another program called a compiler, which, much like an assembler, turned it into native code. That was all well and good for the professionals, but what about the students and other amateurs who also deserved a chance to experience the wonder of having a machine do their bidding? For them, a group of computer scientists at Dartmouth College led by John Kemeny and Thomas Kurtz developed the Beginner's All-Purpose Symbolic Instruction Code: BASIC. It first appeared on Dartmouth's systems in 1964.

As its name would imply, BASIC was designed to be easy for the beginner to pick up. Another aspect, somewhat less recognized, is that it was designed for the new generation of time-sharing systems: BASIC was interactive. In fact, it wasn't just a standalone language, but rather a complete computing environment which the would-be programmer logged into. Within this environment, there was no separation between statements used to accomplish something immediately, like LISTing a program or LOADing one, and those used within the program itself. Entering PRINT "JIMMY" prints "JIMMY" to the screen immediately; put a line number in front of it (10 PRINT "JIMMY") and it's part of a program. BASIC gave the programmer a chance to play. Rather than having to type in and save a complete program, then run it through a compiler hoping she hadn't made any typos, and finally run the result, she could tinker with a line or two, run her program to see what happened, ad infinitum. Heck, if she wasn't sure how a given statement worked or whether it was valid, she could just type it in by itself and see what happened. Because BASIC programs were interpreted at run-time rather than compiled beforehand into native code, they necessarily ran much, much slower than programs written in other languages. But still, for the simple experiments BASIC was designed to facilitate that wasn't really so awful. It's not like anyone was going to try to program anything all that elaborate in BASIC… was it?

Well, here’s where it all starts to get problematic. For very simple programs, BASIC is pretty straightforward and readable, easy to understand and fun to just play with. Take everybody’s first program:

10 PRINT "JIMMY RULES!"
20 GOTO 10

It’s pretty obvious even to someone who’s never seen a line of code before what that does, it took me about 15 seconds to type it in and run it, and in response I get to watch it fill the screen with my propaganda for as long as I care to look at it. Compared to any other contemporary language, the effort-to-reward ratio is off the charts. The trouble only starts if we try to implement something really substantial. By way of example, let’s jump to a much later time and have a look at the first few lines of the dungeon-delving program in Richard Garriott’s Ultima:

0 ONERR GOTO 9900
10 POKE 105, PEEK (30720): POKE 106, PEEK (30721): POKE 107, PEEK (30722): POKE 108, PEEK (30723): POKE 109, PEEK (30724): POKE 110, PEEK (30725): POKE 111, PEEK (30726): POKE 112, PEEK (30727)
20 PRINT "BLOAD SET"; INT (IN / 2 + .6)
30 T1 = 0:T2 = 0:T3 = 0:T4 = 0:T5 = 0:T6 = 0:T7 = 0:T8 = 0:T9 = 0: POKE - 16301,0: POKE - 16297,0: POKE - 16300,0: POKE - 16304,0: SCALE= 1: ROT= 0: HCOLOR= 3: DEF FN PN(RA) = DNG%(PX + DX * RA,PY + DY * RA)
152 DEF FN MX(MN) = DN%(MX(MN) + XX,MY(MN)): DEF FN MY(MN) = DN%(MX(MN),MY(MN) + YY): DEF FN L(RA) = DNG%(PX + DX * RA + DY,PY + DY * RA - DX) - INT (DN%(PX + DX * RA + DY,PY + DY * RA - DX) / 100) * 100: DEF FN R(RA) = DNG%(PX + DX * RA - DY,PY + DY * RA + DX) - INT (DN%(PX + DX * RA - DY,PY + DY * RA + DX) / 100) * 100
190 IF PX = 0 OR PY = 0 THEN PX = 1:PY = 1:DX = 0:DY = 1:HP = 0: GOSUB 500
195 GOSUB 600: GOSUB 300: GOTO 1000
300 HGR :DIS = 0: HCOLOR= 3

Yes, given the entire program so that you could figure out where all those line-number references actually lead, you could theoretically find the relatively simple logic veiled behind all this tangled syntax, but would you really want to? It’s not much fun trying to sort out where all those GOTOs and GOSUBs actually get you, nor what all those cryptic one- and two-letter variables refer to. And because BASIC is interpreted, comments use precious memory, meaning that a program of real complexity like the one above will probably have to dispense with even this aid. (Granted, Garriott was also likely not interested in advertising to his competition how his program’s logic worked…)

Now, everyone can probably agree that BASIC was often stretched by programmers like Garriott beyond its ostensible purpose, resulting in near gibberish like the above. When you have a choice between BASIC and assembly language, and you don’t know assembly language, necessity becomes the mother of invention. Yet even if we take BASIC at its word and assume it was intended as a beginner’s language, to let a student play around with this programming thing and get an idea of how it works and whether it’s for her, opinions are divided about its worth. One school of thought says that, yes, BASIC’s deficiencies for more complex programming tasks are obvious, but if used as a primer or taster of sorts for programming it has its place. Another is not only not convinced by that argument but downright outraged by BASIC, seeing it as an incubator of generations of awful programmers.

Niklaus Wirth was an early member of the latter group. Indeed, it was largely in reaction to BASIC’s deficiencies that he developed Pascal between 1968 and 1970. He never mentions BASIC by name, but his justification for Pascal in the Pascal User Manual and Report makes it pretty obvious of which language he’s thinking.

The desire for a new language for the purpose of teaching programming is due to my dissatisfaction with the presently used major languages whose features and constructs too often cannot be explained logically and convincingly and which too often defy systematic reasoning. Along with this dissatisfaction goes my conviction that the language in which the student is taught to express his ideas profoundly influences his habits of thought and invention, and that the disorder governing these languages imposes itself into the programming style of the students.

There is of course plenty of reason to be cautious with the introduction of yet another programming language, and the objection against teaching programming in a language which is not widely used and accepted has undoubtedly some justification, at least based on short-term commercial reasoning. However, the choice of a language for teaching based on its widespread acceptance and availability, together with the fact that the language most taught is thereafter going to be the one most widely used, forms the safest recipe for stagnation in a subject of such profound pedagogical influence. I consider it therefore well worthwhile to make an effort to break this vicious cycle.

If BASIC, at least once a program gets beyond a certain level of complexity, seems to actively resist every effort to make one’s code readable and maintainable, Pascal swings hard in the opposite direction. “You’re going to structure your code properly,” it tells the programmer, “or I’m just not going to let you compile it at all.” (Yes, Pascal, unlike BASIC, is generally a compiled language.) Okay, that’s not quite true; it’s possible to write ugly code in any language, just as it’s at least theoretically possible to write well-structured BASIC. But certainly Pascal works hard to enforce what Wirth sees as proper programming habits. The opinions of others on Wirth’s approach have, inevitably, varied, some seeing Pascal and its descendants as to this day the only really elegant programming languages ever created and others seeing them as straitjackets that enforce a certain inflexible structural vision that just isn’t appropriate for every program or programmer.

For my part, I don’t agree with Wirth and so many others that BASIC automatically ruins every programmer who comes into contact with it; people are more flexible than that, I think. And I see a bit of both sides of the Pascal argument, finding myself alternately awed by its structural rigorousness and infuriated by it every time I’ve dabbled in the language. Since I seem to be fond of music analogies today: Pascal will let you write a beautiful programming symphony, but it won’t let you swing or improvise. Still, when compared to a typical BASIC listing or, God forbid, an assembly-language program, Pascal’s clarity is enchanting. Considering the alternatives, which mostly consisted of BASIC, assembly, and (on some platforms) creaky old FORTRAN, it’s not hard to see why Byte and many others in the early PC world saw it as the next big thing, a possible successor to BASIC as the lingua franca of the microcomputer world. Here’s the heart of a roulette game implemented in Pascal, taken from another article in that August 1978 issue:

begin
     askhowmany (players);
     for player := 1 to players do
          getname (player, playerlist);
     askif (yes);
     if yes then printinstructions;
     playersleft := true;
     while playersleft do
          begin
          for player := 1 to players do
          repeat
               getbet (player, playerlist);
               scanbet (player, playerlist);
               checkbet (player, playerlist, valid);
          until valid;
          determine (winningnumber);
          for player := 1 to players do
               begin
               if quit (player, playerlist)
                    then processquit (player, playerlist, players, playersleft);
               if pass (player, playerlist)
                    then processpass (player, playerlist);
               if bet (player, playerlist)
                    then processbet (player, playerlist, winningnumber)
               end
          end
end.

Wirth's ideal was to create a programming language capable of supporting self-commenting code: code so clean and readable that comments became superfluous, that the code itself was little more difficult to follow than a simple textual description of the program's logic. He perhaps didn't quite get there, but the program above is nevertheless surprisingly understandable even if you've never seen Pascal before. Just to make it clear, here's the pseudocode summary which the code extract above used as its model:

Begin program.
     Ask how many players.
     For as many players as there are,
          Get each player's name.
     Ask if instructions are needed.
     If yes, output the instructions.
     While there are still any players left,
          For as many players as there are,
               Repeat until a valid bet is obtained:
                    Get the player's bet.
                         Scan the bet.
                         Check bet for validity.
          Determine the winning number.
          For as many players as there are,
               If player quit, process the quit.
               If player passed, process the pass.
               If player bet,
                    Determine whether player won or lost.
                    Process this accordingly.
End program.

Yet Pascal’s readability and by extension maintainability was only part of the reason that Byte was so excited. We’ll look at the other next time… and yes, this tangent will eventually lead us back to games.
