
The Dawn of Multimedia

Cover illustration from Byte, June 1982

Unless you’re an extremely patient and/or nostalgic sort, most of the games I’ve been writing about on this blog for over two years now are a hard sell as something to just pick up and play for fun. There have been occasional exceptions: M.U.L.E., probably my favorite of any game I’ve written about so far, remains as magical and accessible as it was the day it was made; many if not most of the Infocom titles remain fresh and entertaining as both fictions and games. Still, there’s an aspirational quality to even some of the most remarkable examples of gaming in this era. Colorful boxes and grandiose claims of epic tales of adventure often far exceeded the minimalist content of the disks themselves. In another era we might levy accusations of false advertising, but that doesn’t feel quite like what’s going on here. Rather, players and developers entered into a sort of partnership, a shared recognition that there were sharp limits to what developers could do with these simple computers, but that players could fill in many of the missing pieces with determined imaginings of what they could someday actually be getting on those disks.

Which didn’t mean that developers weren’t positively salivating after technological advances that could turn more of their aspirations into realities. And progress did come. Between the Trinity of 1977 and 1983, the year we’ve reached as I write this, typical memory sizes on the relatively inexpensive 8-bit machines found in homes increased from as little as 4 K to 48 K, with 64 K set to become the accepted minimum by 1984. The arrival of the Atari 400 and 800 in 1979 and the Commodore 64 in 1982 each brought major advances in audiovisual capabilities. Faster, more convenient disks replaced cassettes as the accepted standard storage medium, at least in North America. But other parts of the technological equation remained frozen, perhaps surprisingly so given the modern accepted wisdom about the pace of advancement in computing. Home machines in 1983 were still mostly based around one of the two CPUs found in the Trinity of 1977, the Zilog Z80 and the MOS 6502, and these chips were still clocked at roughly the same speeds as in 1977. Thus, Moore’s Law notwithstanding, the processing potential that programmers had to work with remained for the moment frozen in place.

To find movement in this most fundamental part of a microcomputer we have to look to the more expensive machines. The IBM PC heralded the beginning of 16-bit microcomputing in 1981. The Apple Lisa of 1983 became the first mass-produced PC to use the state-of-the-art Motorola 68000, a chip which would have a major role to play in computing for the rest of the decade and beyond. Both the Lisa and an upgraded model of the IBM PC introduced in 1983, the PC/XT, also sported hard drives, which let them store several megabytes of data in constantly accessible form, and to retrieve it much more quickly and reliably than could be done from floppy disks. Still, these machines carried huge disadvantages to offset their technical advancements. The IBM PC and especially the PC/XT were, as noted, expensive, and had fairly atrocious graphics and sound even by the standards of 1983. The Lisa was really, really expensive, lacked color and sound, and was consciously designed to be as inaccessible to the hackers and bedroom coders who built the games industry as the Apple II was wide open. The advancements of the IBM PC and the Lisa would eventually be packaged into forms more useful to gamers and game developers, but for now for most gamers it was 8 bits, floppy disks, and (at best) 64 K.

Developers and engineers — and, I should note, by no means just those in the games industry and by no means just those working with the 8-bit machines — were always on the lookout for a secret weapon that might let them leapfrog some steps in what must have sometimes seemed a plodding pace of technological change, something that might let them get to that aspirational future faster. They found one that looked like it might just have potential in a surprising place: in the world of ordinary consumer electronics. Or perhaps by 1983 it was not so surprising, for by then they had already been waiting for, speculating about, and occasionally tinkering with the technology in question for quite some years.

At the end of the 1960s, with the home-videocassette boom still years away, the American media conglomerate MCA and the Dutch electronics giant Philips coincidentally each began working separately on a technology to encode video onto album-like discs using optical storage. The video would be recorded as a series of tiny pits in the plastic surface of the disc, which could be read by the exotic technology of a laser beam scanning it as the disc was rotated. The two companies decided to combine their efforts after learning of one another’s existence a few years later, and by the mid-1970s they were holding regular joint demonstrations of the new technology, to which they gave the perfect name for the era: DiscoVision.

A DiscoVision prototype in action


Yet laser discs, as they came to be more commonly called, were painfully slow to reach the market. A few pilots and prototype programs aside, the first consumer-grade players didn’t reach stores in numbers until late 1980.

The Pioneer VP-1000, most popular of the early consumer-grade laser-disc players


By that time VCRs were selling in huge numbers. Laser discs offered video and audio far superior to videotape, but, at least from the standpoint of most consumers, had enough disadvantages to more than outweigh that. For starters, they were much more expensive. And they could only hold about 30 minutes of video on a side; thus the viewer had to get up and flip or change the disc, album-style, three times over the course of a typical movie. This was a hard sell indeed to a couch-loving nation who were falling in love with their new remote controls as quickly as their VCRs. Yet it was likely the very thing that the movie and television industry found most pleasing about the laser disc that really turned away consumers: the discs were read only, meaning it was impossible to use them to record from the television, or to copy and swap movies and other programs with friends. Some (admittedly anecdotal) reports claim that up to half of the laser-disc players sold in the early years of the format were returned when their purchasers realized they couldn’t use them to record.

Thus the laser-disc format settled into a long half-life in which it never quite performed up to expectations but never flopped so badly as to disappear entirely. It became the domain of the serious cineastes and home-theater buffs who were willing to put up with its disadvantages in return for the best video and audio quality you could get in the home prior to the arrival of the DVD. Criterion appeared on the scene in 1984 to serve this market with a series of elaborate special editions of classic films loaded with the sorts of extras that other publishers wouldn’t begin to offer until the DVD era: cast and crew interviews, “making of” documentaries, alternate cuts, unused footage, and of course the ubiquitous commentary track (like DVDs, laser discs had the ability to swap and mix audio streams). Even after DVDs began to replace VCRs en masse and thus to change home video forever circa 2000, a substratum of laser-disc loyalists soldiered on, some unwilling to give up on libraries they’d spent many years acquiring, others convinced, like so many vinyl-album boosters, that laser discs simply looked better than the “colder” digital images from DVDs or Blu-ray discs. (Although all of these mediums store data using the same basic optical techniques, in a laser disc the data is analog, and is processed using analog rather than digital circuitry.) Pioneer, who despite having nothing to do with the format’s development became its most consistent champion — they were responsible for more than half of all players sold — surprised those who already thought the format long dead in January of 2009 when they announced that they were discontinuing the last player still available for purchase.

The technology developed for the laser disc first impacted the lives of those of us who didn’t subscribe to Sound and Vision in a different, more tangential way. Even as DiscoVision shambled slowly toward completion during the late 1970s, a parallel product was initiated at Philips to adapt optical-storage technology to audio only. Once again Philips soon discovered another company working on the same thing, this time Sony of Japan, and once again the two elected to join forces. Debuting in early 1983, the new compact disc was first a hit mainly with the same sorts of technophiles and culture vultures who were likely to purchase laser-disc players. Unlike the laser disc, however, the CD’s trajectory didn’t stop there. By 1988, 400 million CDs were being pressed each year, by which time the format was on the verge of its real explosion in popularity; nine years later that number was 5 billion, close to one CD for every person on the planet.

But now let’s back up and relate this new optical audiovisual technology to the computer technologies we’re more accustomed to spending our time with around these parts. Many engineers and programmers have a specific sort of epiphany after working with electronics in general or computers in particular for a certain amount of time. Data, they realize, is ultimately just data, whether it represents an audio recording, video, text, or computer code. To a computer in particular it’s all just a stream of manipulable numbers. The corollary to this fact is that a medium developed for the storage of one sort of data can be re-purposed to store something else. Microcomputers in particular already had quite a tradition of doing just that even in 1983. The first common storage format for these machines was ordinary cassette tapes, playing on ordinary cassette players wired up to Altairs, TRS-80s, or Apple IIs. The data stored on these tapes, which when played back for human ears just sounded like a stream of discordant noise, could be interpreted by the computer as the stream of 0s and 1s which it encoded. It wasn’t the most efficient of storage methods, but it worked — and it worked with a piece of cheap equipment found lying around in virtually every household, a critical advantage in those do-it-yourself days of hobbyist hackers.
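The scheme is easy to sketch in modern terms. Most cassette formats represented each bit as a short burst of audio tone; in the Kansas City standard, for instance, a 0 was a burst of 1200 Hz and a 1 a burst of 2400 Hz at 300 bits per second (the Apple II and others used their own variations). Here’s a minimal illustration in Python; the sample rate and the crude zero-crossing demodulator are my own choices for the sake of the example, not any machine’s actual firmware:

```python
import math

SAMPLE_RATE = 8000          # samples per second; arbitrary for this sketch
BAUD = 300                  # bits per second, as in the Kansas City standard
FREQ = {0: 1200, 1: 2400}   # Hz per bit value, Kansas City style

def encode_bits(bits):
    """Turn a bit sequence into a list of audio samples (floats in -1..1)."""
    samples = []
    samples_per_bit = SAMPLE_RATE // BAUD
    for bit in bits:
        f = FREQ[bit]
        for n in range(samples_per_bit):
            t = n / SAMPLE_RATE
            samples.append(math.sin(2 * math.pi * f * t))
    return samples

def decode_bit(samples):
    """Recover one bit by counting zero crossings (a crude demodulator)."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    # More crossings in the window means the higher of the two tones.
    freq = crossings * SAMPLE_RATE / (2 * len(samples))
    return 1 if abs(freq - FREQ[1]) < abs(freq - FREQ[0]) else 0

bits = [0, 1, 1, 0, 1]
audio = encode_bits(bits)
spb = SAMPLE_RATE // BAUD
decoded = [decode_bit(audio[i * spb:(i + 1) * spb]) for i in range(len(bits))]
```

Round-tripped through the “tape,” `decoded` comes back equal to `bits` — exactly the trick the cheap cassette player performed for the hobbyist hackers.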

If a cassette could be used to store a program, so could a laser disc. Doing so had one big disadvantage compared to other storage methods, the very same that kept so many consumers away from the format: unless you could afford the complicated, specialized equipment needed to write to them yourself, discs had to be stamped out at a special factory complete with their contents, which afterward could only be read, not altered. But the upside… oh, what an upside! A single laser-disc side may have been good for only about 30 minutes of analog video, but could store about 1 to 1.5 GB of digital computer code or data. The possibility of so much storage required a major adjustment of the scale of one’s thinking; articles even in hardcore magazines like Byte that published the figure had to include a footnote explaining what a gigabyte actually was.
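Some back-of-the-envelope arithmetic (my own numbers, using the low end of that estimate) shows why the footnote was needed:

```python
DISC_SIDE_BYTES = 1_000_000_000   # ~1 GB per laser-disc side (low end of the estimate)
FLOPPY_BYTES = 140 * 1024         # ~140 KB, one side of an Apple II 5.25" floppy
APPLE_II_RAM = 48 * 1024          # 48 K, a well-equipped home machine of the day

floppies_per_side = DISC_SIDE_BYTES // FLOPPY_BYTES   # 6975 floppy disks
memory_loads = DISC_SIDE_BYTES // APPLE_II_RAM        # 20345 full loads of RAM
```

One side of a disc, in other words, held as much as a stack of nearly seven thousand floppies — and more than twenty thousand times what the computer could hold in memory at once.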

Various companies initiated programs in the wake of the laser disc’s debut to adapt the technology to computers, resulting in a plethora of incompatible media and players. Edward Rothchild wrote in Byte in March of 1983 that “discs are being made now in 12- and 14-inch diameters, with 8-, 5 1/4-, 3-, and possibly 2-inch discs likely in the near future.”

The Toshiba DF-2000, a typically elaborate optical-storage-based institutional archiving system of the 1980s


Others moved beyond discs entirely to try cards, slides, even optical versions of old-fashioned reel-to-reel or cassette tapes. Some of the ideas that swirled around were compelling enough that you have to wonder why they never took off. A company called Drexler came out with the Drexon Laser Card, a card the size of a driver’s license or credit card with a strip on the back that was optical rather than magnetic and could store some 2 MB of data. They anticipated PCs of the future being equipped with a little slot for reading the cards. Among other possibilities, a complete operating system could be installed on a card, taking the place of ROM chips or an operating system loaded from disk. Updates would become almost trivial; the cards were cheap and easy to manufacture, and the end user would need only swap the new card for the old to “install” the new operating system. Others anticipated Laser Cards becoming personal identification cards, with everything anyone could need to know about you, from citizenship status to credit rating, right there on the optical strip, a boon indeed in the much less interconnected world of the early 1980s. (From the department of things that never change: the privacy concerns raised by such a scheme were generally glossed over or ignored.)

The Drexon Laser Card


Some of these new technologies — the Laser Card alas not among them — did end up living on for quite some years. Optical storage was ideal for large, static databases like public records, especially in institutions that could afford the technology needed to create the discs as well as read them. IBM and others who served the institutional-computing market therefore came out with various products for this purpose, some of which persist to this day. In the world of PCs, however, progress was slow. It could be a bit hard to say what all that storage might actually be good for on a machine like, say, an Apple II. Today we fill our CDs and DVDs mostly with graphics and sound resources, but if you’ve seen the Apple II screenshots that litter this blog you know that home computers just didn’t have the video (or audio) hardware to make much direct use of such assets. Nor could they manipulate more than the most minuscule chunk of the laser disc’s cavernous capacity; connecting an Apple II to optical mass storage would be like trying to fill the family cat’s water bowl with a high-pressure fire hose. Optical media as a data-storage medium therefore came to PCs only slowly. When it did, it piggybacked not on the laser disc but on the newer, more successful format of the audio CD. The so-called “High Sierra” standard for the storage of data on CDs — named after the casino near Lake Tahoe where it was hashed out by a consortium of major industry players — was devised in 1985, accompanied by the first trickle of pioneering disc-housed encyclopedias and the like, and Microsoft hosted the first big conference devoted to CD-ROM in March of 1986. It took several more years to really catch on with the mass market, but by the early years of the 1990s CD-ROM was one of the key technologies at the heart of the “multimedia PC” boom. By this time processor speeds, memory sizes, and video and sound hardware had caught up and were able to make practical use of all that storage at last.

Still, even in the very early 1980s laser discs were not useless to even the most modest of 8-bit PCs. They could in fact be used with some effectiveness in a way that hewed much closer to their original intended purpose. Considered strictly as a video format, the most important property of the laser disc to understand beyond the upgrade in quality it represented over videotape is that it was a random-access medium. Videocassettes and all other, older mediums for film and video were by contrast linear formats. One could only unspool their contents sequentially; finding a given bit of content could only be accomplished via lots of tedious rewinding and fast forwarding. But with a laser disc one could jump to any scene, any frame, immediately; freeze a frame on the screen; advance forward or backward frame by frame or at any speed desired. The soundtrack could be similarly manipulated. This raised the possibility of a new generation of interactive video, which could be controlled by a computer as cheap and common as an Apple II or TRS-80. After all, all the computer had to do was issue commands to the player. All of the work of displaying the video on the screen, so far beyond the capabilities of any extant computer’s graphics hardware, was neatly sidestepped. For certain applications at least it really did feel like leapfrogging about ten years of slow technological progress. By manipulating a laser-disc player, a computer could deliver graphics and sound that it wouldn’t be able to produce natively until the next decade.

The people who worked on the DiscoVision project were not blind to the potential here. Well before the laser disc became widely available to consumers in 1980 they were already making available pre-release and industrial-grade models to various technology companies and research institutions. These were used for the occasional showcase, such as the exhibition at Chicago’s Museum of Science and Industry in 1979 which let visitors pull up an image of the front page of any issue of the Chicago Tribune ever published. Various companies continued to offer throughout the 1980s pricey professional-level laser-disc setups that came equipped with a CPU and a modest amount of RAM. These could take instructions from their controlling computers and talk back to them as well: reporting what frame was currently playing, notifying the host when a particular snippet was finished, etc. The host computer could even load a simple program into the player’s memory and let it run unattended. Consumer-grade devices were more limited, but virtually all did come equipped with one key feature: a remote-control sensor, which could be re-purposed to let a computer control the player. Such control was more limited than was possible with the more expensive players — no talking back on the part of the player was possible. Still, it was intriguing stuff. Magazines like Byte and Creative Computing started publishing schematics and software to let home users take control of their shiny new laser-disc player just months after the devices started becoming available to purchase in the first place. But, given all of the complications and the need to shoot video as well as write code and hack hardware to really create something, much of the most interesting work with interactive video was done by larger institutions. Consider, for example, the work done by the American Heart Association’s Advanced Technology Development Division.
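The host computer’s side of the arrangement amounts to sequencing commands by frame number. The sketch below is strictly illustrative: the command mnemonics and the in-memory stand-in for the serial link are invented for the example, since real players (Pioneer’s, Sony’s, and the rest) each spoke their own incompatible protocols.

```python
class LaserDiscPlayer:
    """Stands in for the serial link to the player; records what was sent."""
    def __init__(self):
        self.log = []
        self.frame = 0

    def send(self, command):
        self.log.append(command)
        if command.startswith("SEEK "):
            self.frame = int(command.split()[1])

class VideoController:
    """Plays named clips by frame number, the way an interactive-video
    lesson like the AHA's CPR trainer might sequence its footage."""
    def __init__(self, player, clips):
        self.player = player
        self.clips = clips          # name -> (first_frame, last_frame)

    def play_clip(self, name):
        first, last = self.clips[name]
        self.player.send(f"SEEK {first}")
        self.player.send(f"PLAY UNTIL {last}")

    def freeze(self, frame):
        self.player.send(f"SEEK {frame}")
        self.player.send("STILL")

player = LaserDiscPlayer()
lesson = VideoController(player, {
    "intro": (100, 4500),
    "compressions": (4501, 9000),
})
lesson.play_clip("compressions")
lesson.freeze(4501)
```

All the computer contributes is this thin stream of instructions; the player does the heavy audiovisual lifting, which is exactly why an Apple II was up to the job.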

The AHA was eager to find a way to increase the quantity and quality of CPR training in the United States. They had a very good reason for doing so: in 1980 it was estimated that an American stricken with a sudden heart attack faced odds of 18 to 1 against there being someone on hand who could use CPR to save her life. Yet CPR training from a human instructor is logistically complicated and expensive. Many small-town governments and/or hospitals simply couldn’t manage to provide it. David Hon of the AHA believed that interactive video could provide the answer. The system his research group developed consisted of an Apple II interfaced to a laser-disc player as well as a mannequin equipped with a variety of sensors. An onscreen instructor taught the techniques of CPR step by step. After each step the system quizzed the student on what she had just learned; she could select her answers by touching the screen with a light pen. It then let her try out her technique on the mannequin until she had it down. The climax of the program came with a simulation of an actual cardiac emergency, complete with video and audio, explicitly designed to be exciting and dramatic. Hon:

We had learned something from games like Space Invaders: if you design a computer-based system in such a way that people know the difference between winning and losing, virtually anyone will jump in and try to win. Saving a life is a big victory and a big incentive. We were sure that if we could build a system that was easy to use and engaging, trainees would use it and learn from it willingly.

The American Heart Association's CPR training system

The trainee’s “coach” provides instruction and encouragement on the left monitor; the right shows the subject’s vital signs as the simulation runs


At a cost of about $15,000 per portable system, the scheme turned out to be a big success, and probably saved more than a few lives.

One limitation of many early implementations of interactive video like this was the fact that the computer controller and the laser disc itself each had its own video display, with no way to mix the two on one screen, as you can clearly see in the photos above. In time, however, engineers developed the genlock, a piece of hardware which allowed a computer to overlay its own output onto an external video signal. How might this be useful? Well, consider the very simple case of an educational game which quizzes children on geography. The computer could play some looping video associated with a given country from the laser disc, while asking the player what country is being shown in text generated by the computer. Once the player answers, more text could be generated telling whether she got it right or not. Yet many saw even this scenario as representing the merest shadow of interactive video’s real potential. A group at the University of Nebraska developed a flight-training system which helped train prospective pilots by combining video and audio from actual flights with textual quizzes asking, “What’s happening here?” or “What should be done next?” or “What do these instruments seem to indicate?”
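The geography-quiz scenario boils down to a simple control loop. In this sketch the disc playback and the genlocked text overlay are stubbed out as functions passed in by the caller; the country list and frame numbers are invented for illustration:

```python
QUIZ = [
    # (looping clip on the disc as (first_frame, last_frame), correct answer)
    ((1000, 1300), "France"),
    ((1301, 1650), "Japan"),
]

def run_quiz(play_loop, overlay_text, read_answer):
    """Drive the player and the overlay; return the number answered correctly."""
    score = 0
    for clip, country in QUIZ:
        play_loop(clip)                        # the disc supplies the imagery
        overlay_text("What country is this?")  # the computer supplies the text
        answer = read_answer()
        if answer.strip().lower() == country.lower():
            overlay_text("Right!")
            score += 1
        else:
            overlay_text(f"No - this is {country}.")
    return score
```

The division of labor mirrors the hardware: the laser disc provides imagery no 1983 home computer could generate, while the computer provides only the logic and the genlocked text.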

Flight Training

Another University of Nebraska group developed a series of educational games meant to teach problem-solving to hearing-impaired children. They apparently played much like the full-motion-video adventure games of a decade later, combining video footage of real actors with puzzles and conversation menus to let the child find her own way through the story and solve the case.

Think It Through


The Minnesota Educational Computing Consortium (the same organization that distributed The Oregon Trail) developed a high-school economics course:

Three types of media are used in each session. A booklet introduces the lesson and directs the student to use the other pieces of equipment. At the same time, it provides space for note taking and record keeping. A microcomputer [an Apple II] contributes tutorial, drill, and practice dimensions to the lesson. And a videodisc player presents information, shows examples, and develops concepts which involve graphics or motion.

Apple built commands for controlling laser-disc players into their SuperPILOT programming language, a rival to BASIC designed specifically for use in schools.

There was a widespread sense among these experimenters that they were pioneering a new paradigm of education and of computing, even if they themselves couldn’t quite put their fingers on what it was, what it meant, or what it should be called. In March of 1976, an amazingly early date when laser discs existed merely as crude prototypes, Alfred M. Bork envisioned what laser discs could someday mean to educational computing in an article that reads like a dispatch from the future:

I envision that each disc will contain a complete multimedia teaching package. Thus, a particular disc might be an elaborate teaching sequence for physics, having on the disc the computer code for that sequence (including possible microcode to make the stand-alone system emulate the particular machine that material was originally developed for), slides, audio messages, and video sequences of arbitrary length, all of these in many different segments. Thus, a teaching dialog stored on a videodisc would have full capability of handling very complex computer logic, and making sizable calculations, but it also could, at an appropriate point, show video sequences of arbitrary length or slides, or present audio messages. Another videodisc might have on it a complete language, such as APL, including a full multimedia course for learning APL interactively. Another might have relatively little logic, but very large numbers of slides in connection with an art-history or anatomy course. For the first time control of all the important audiovisual media would be with the student. The inflexibility of current film and video systems could be overcome too, because some videodiscs might have on them simply nothing but a series of film clips, with the logic for students to pick which ones they wanted to see at a particular time.

Bork uses a critical word in his first sentence above, possibly for the first time in relation to computing: “multimedia.” Certainly it’s a word that wouldn’t become commonplace until many years after Bork wrote this passage. Tony Feldman provided perhaps the most workable and elegant definition in 1994: “[Multimedia is] a method of designing and integrating computer technologies on a single platform that enables the end user to input, create, manipulate, and output text, graphics, audio, and video utilizing a single user interface.” This new paradigm of multimedia computing is key to almost all of the transformations that computers have made in people’s everyday lives in the thirty years that have passed since the pioneering experiments I just described. The ability to play back, catalog, combine, and transform various types of media, many or most of them sourced from the external world rather than being generated within the computer itself, is the bedrock of the World Wide Web, of your iPod and iPhone and iPad (or equivalent). Computers today can manipulate all of that media internally, with no need for the kludgy plumbing together of disparate devices that marks these early experiments, but the transformative nature of the concept itself remains. With these experiments with laser-disc-enabled interactive video we see the beginning of the end of the old analog world of solid-state electronics, to be superseded by a digital world of smart, programmable media devices. That, much more than gigabytes of storage, is the real legacy of DiscoVision.

But of course these early experiments were just that, institutional initiatives seen by very few. There simply weren’t enough people with laser-disc players wired to their PCs for a real commercial market to develop. The process of getting a multimedia-computing setup working in the home was just too expensive and convoluted. It would be six or seven more years before “multimedia” would become the buzzword of the age — only to be quickly replaced in the public’s imagination by the World Wide Web, that further advance that multimedia enabled.

In the meantime, most people of the early 1980s had their first experience with this new paradigm of computing outside the home, in the form of — what else? — a game. We’ll talk about it next time.

(The most important sources for this article were: Byte magazines from June 1982, March 1983, and October 1984; Creative Computing from March 1976 and January 1982; Multimedia, a book by Tony Feldman; Interactive Video, a volume from 1989 in The Educational Technology Anthology Series; and various laser-disc-enthusiast sites on the Internet. I also lifted some of these ideas from my own book about the Amiga, The Future Was Here. The lovely picture that begins this article was on the cover of the June 1982 Byte. All of the other images were also taken from the various magazines listed above.)

 


Ultima III in Pictures

Ultima III

There’s a lot of interesting stuff to talk about in Ultima III, to the extent that I wasn’t quite sure how to wedge it all into a conventional review. So I decided to try this approach, to balance my usual telling with quite a bit of showing. Or something like that. Anyway, I found it fun to do.

If you’re inspired to play Ultima III yourself, know that Good Old Games is selling it in a collection which also contains Ultima I and II. Less legitimately, there are the usual abandonware sites and ROM collections where you can find the original Apple II version that I play here, but you’re on your own there. Some spoilers do follow, although Ultima III is tricky enough that you may just welcome whatever little bit of guidance you glean from this post.


Ultima III

Garriott was really proud of his game’s subtitle, Exodus, to the extent that in the game itself and most early advertising it’s actually more prominent than the Ultima name. He draws no connection to its meaning as an English noun or to the Bible. It’s simply a cool-sounding word that he takes as the name of his latest evil wizard, the love child of his two previous evil wizards, Mondain from Ultima I and Minax from Ultima II. Roe R. Adams III did make a somewhat strained attempt to draw a connection to the expected implications of the word in the manual via a recasting of an old seafaring mystery:

One possible clue as to the identity of thy nemesis has been discovered. A derelict merchant ship was recently towed into port. No crewmen were aboard, alive or dead. Everyone had vanished, as if plucked by some evil force off the boat. The only thing found was a word written in blood on the deck: EXODUS.

I never hear anything about this ghost ship in the game itself. Also left unexplained, as it was in Ultima II, is why Mondain was on Garriott’s fantasy world of Sosaria and Minax was on our own Earth. This time I’m stuck back on Sosaria again. Garriott would finally get more serious about making an Ultima mythos that makes some kind of sense with the next game, but for now… let’s just say I won’t be spending much more time discussing the plotting or the worldbuilding.


Ultima III

In Ultima III I get to create and control a full party of four adventurers rather than a single avatar. This is actually the only Ultima that works quite this way. Later games would use the code Garriott first developed here to allow players to have more than one person in their parties, but would start them off with a single avatar. Finding other adventurers in the game world itself and convincing them to join would become part of the experience of play and an important component of those games’ much richer plots.


Ultima II

Ultima III

With my party created, I’m dumped into Sosaria, right outside the town of Britain and the castle of Lord British, in what by Ultima III has already become a time-honored tradition.

One of the fascinating aspects of playing through the Ultima games in order is seeing which pieces are reused from earlier games and which are replaced. Programming often really is a game of interchangeable parts. On the left above is Ultima II, on the right Ultima III. The same old tile engine that dates back to Ultima I is still in place in both games, but Ultima III changes the screen layout considerably and makes everything a bit more attractive and ornate within the considerable limitations of the Apple II. It no longer uses the Apple II’s mixed display mode that displays text rather than graphics on the bottom four lines of the screen. Instead the whole screen is now given over to a graphics display, with a character generator, once an exotic piece of technology but by 1983 commonplace, used to put words anywhere on the screen.


Ultima III

When I enter a town for the first time another of Ultima III’s additions to the old tile-graphics engine becomes clear: a line-of-sight algorithm now prevents me from seeing through walls. This adds an extra dimension of realism, but proves to be a mixed blessing. We’ll talk about why that is in just a little bit.
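This is not Garriott’s actual code, which I haven’t examined, but one common way to get the effect on a tile grid is to trace a line (Bresenham’s algorithm) from the viewer toward each tile and hide the tile if an opaque tile lies in between:

```python
def visible(grid, x0, y0, x1, y1):
    """True if tile (x1, y1) can be seen from (x0, y0).
    grid[y][x] is True where a tile blocks sight (a wall)."""
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x1 > x0 else -1
    sy = 1 if y1 > y0 else -1
    err = dx - dy
    x, y = x0, y0
    while (x, y) != (x1, y1):
        # Step one tile along the line (Bresenham's algorithm).
        e2 = 2 * err
        if e2 > -dy:
            err -= dy
            x += sx
        if e2 < dx:
            err += dx
            y += sy
        # Walls themselves are drawn; only *intervening* walls block.
        if (x, y) != (x1, y1) and grid[y][x]:
            return False
    return True

# A tiny 5x3 "town" map with a single wall tile in the middle row:
town = [
    [False, False, False, False, False],
    [False, False, True,  False, False],
    [False, False, False, False, False],
]
```

From (0, 1), the tile at (4, 1) is hidden by the wall at (2, 1), while the wall tile itself remains visible — matching what you see on the town screen above, where everything behind a wall simply goes blank.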


Ultima II

Ultima III

And when I run into a couple of wandering orcs for the first time I see another big addition: a separate strategic-combat screen that pops up when a fight begins. You can see that on the right above; the old Ultima II system of flailing in place on the map screen is on the left. The earlier system would obviously be unworkable with a party of four. Unlike with Wizardry, combat has never been the heart of Ultima‘s appeal, but that doesn’t mean you don’t spend a lot of time — maybe too much time — in Ultima III engaging in it. The new system does add some welcome interest to the old formula. I can now move each character about individually, use missile weapons (a highly recommended strategy that lets me take out many monsters before they can get close enough to damage me), and cast quite a variety of offensive and defensive spells. Less wonderfully, all those random encounters with orcs and cutthroats now take much more time to resolve, which is one of the things that can turn Ultima III into quite the slog by the time all is said and done. Also contributing to the tedium: in a harbinger of certain modern CRPGs, random encounters are balanced to suit the general potency of my party, thus guaranteeing that they will still take some time even once I have quite a powerful group of characters.


Ultima III

As part of a general tightening of the game’s mechanics likely prompted by unfavorable comparisons of previous Ultimas to previous Wizardries, the strange system of hit points as a commodity purchasable from Lord British has finally been overhauled. Now healing works as you might expect: each character has a maximum number of hit points which Lord British raises by 100 every time I visit him after gaining a level. Alas, this works only until level 25 and 2500 hit points. At least I don’t have to pay him for his trouble anymore. In the screenshot above his “Experience more!” means that I haven’t yet gained a level for him to boost my hit-point total; small wonder, as all my characters are still level 1.


Ultima III

Ultima III

Having gotten the initial lay of the land, I settle into the rhythm of building my characters, exploring the world map, and talking to everyone I can find in the towns. The latter process, like so much in Ultima III, is equal parts frustrating and gratifying. The good citizens of Sosaria insist on speaking in the most cryptic of riddles. And here we see the darker side of Garriott’s new line-of-sight system: many of the most vital clue-givers are tucked away in the most obscure possible corners of the towns, like the fellow shown in the screenshot above and left. I have to scour every town square by tedious square to be absolutely certain I haven’t missed a vital clue, a vital link in a chain of tasks required to win that is much more complicated than those found in the earlier games. On the other hand, the gratification that comes when another piece of the puzzle falls into place is considerable. Ultima has always been better at delivering that thrill of exploration than just about any other CRPG.

Ultima III does scatter some small kindnesses about, elements that, once I figure out how they work, can make things easier. In the screenshot to the right I’m using a magic gem, purchasable from thieves guilds in a couple of the towns, to get a bird’s-eye view of the town I’m currently in. Ferreting out these secrets and hidden mechanics contributes to another thing Ultima always does well: making you feel smart.


Ultima III

Ultima III

Still, it’s possible to take this whole discovery thing too far. In one of the more astonishing design decisions in Ultima III, Garriott has consciously engineered into his hotkey-driven interface an element of guess the verb. After all, why should text adventurers have all the fun? There’s a mysterious OTHER command this time, which lets me enter new verbs. Divining what these are depends on my sussing that words surrounded by “<>” in characters’ speech refer to new verbs. (“<SEARCH> the shrines.”) A very strange design choice, which does a good job of illustrating the gulf in player expectations between now and then, when guess the verb was still trumpeted by many as an essential element of adventure games rather than just a byproduct of their technical limitations. Given that, why not try to engineer it into Ultima, a series which always tried to offer more, more, more? Thankfully, it would disappear again from Ultima IV, in what could be read as another reflection of changing player expectations.

In the screenshot at left above I’ve just used the hidden verb “BRIBE” to convince a guard who just a second before was standing right next to me to go away for the modest fee of 100 gold. Now I can go into the shop and steal with relative impunity. (Ultima III is, as we’ll continue to see, very much an amoral world, the last Ultima about which that can be said.) Bribing is merely useful; other hidden verbs are vital.

For instance, the second screenshot above shows me gathering a piece of important information using the hidden verb “PRAY” inside a temple. This is actually quite an interesting sequence. PRAYing yields the information that I must YELL — YELL being one of the standard hotkey-based commands — “EVOCARE” at a certain place. It’s perilously close to two guess-the-verb — or at least guess-the-word — puzzles joined together.


Ultima III

Ultima III

We see an interesting re-purposing of previous Ultima technology in the form of the eight moon gates which wink in and out of existence in a set pattern on the world map. In Ultima II, you may recall, these supposedly allowed me to travel through time, although effectively they just provided access to different world maps; nothing I did in one time could have any direct effect on any of the others. Here they’re renamed and used more honestly, as ways to move quickly from place to place on the primary world map. (There are only two world maps this time, the primary one and an alternate world called Ambrosia which we’ll get to shortly.) They also allow me to reach a few places that are otherwise completely inaccessible, as the screenshot at right above illustrates. Well, okay… I could also get there with a ship, an element we’ll talk about later. But that’s not always the case; there’s at least one vital location that can be visited only via moon gate. Thus understanding the logic of the moon gates and charting their patterns is another critical aspect of cracking the puzzle of Ultima III. Moon gates would continue to be a fixture in the Ultimas to come.
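The game’s actual schedule is driven by the phases of Sosaria’s moons, and I won’t reproduce it here. But as a toy illustration only (the sites, coordinates, and timing below are all invented), the kind of fixed rotation a patient player could chart on graph paper might look like:

```python
# Toy model of an Ultima III-style moon-gate rotation. The real game
# ties gate activity to the moons' phases; this invented version just
# shows how a "set pattern" of winking gates can be charted and exploited.

GATE_SITES = [(10, 4), (52, 8), (24, 30), (40, 44),
              (6, 50), (58, 22), (30, 12), (18, 40)]  # made-up map coordinates

def open_gate(turn, phase_length=4):
    """Return the coordinates of the one gate open on a given turn."""
    phase = (turn // phase_length) % len(GATE_SITES)
    return GATE_SITES[phase]

# Charting the pattern, as a player would: step into the gate at the
# right moment and you emerge wherever the *next* phase's gate sits.
for turn in range(0, 32, 4):
    print(turn, open_gate(turn))
```

The point of charting such a cycle in play is timing: standing on a gate site when its phase comes around is the only way to reach those otherwise inaccessible locations.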


Ultima III

Ultima III

Garriott had completely rewritten his dungeon-delving engine for Ultima II, transforming what had been the slowest and most painful part of Ultima I by swapping its wire-frame portrait of the surroundings for glorious filled-in color. It’s easily the most impressive and appreciated improvement in that game. But then, like so much else in Ultima II, he squandered it by giving his players no reason to go there. Thus Ultima III almost feels like the new dungeon engine’s real debut. Not only can I harvest a lot of desperately needed gold from the dungeons, but I must also explore them to find five vital “marks” that give special abilities which are in turn key to solving the game. And at the bottom of the Dungeon of Time I meet the Time Lord. (Garriott’s Time Bandits fixation had apparently not yet completely run its course — or are we now dealing with a Doctor Who obsession?) He gives a portentous clue that will be vital to the end-game.


Ultima III

Ultima III

Sosaria is still a world where might makes right. Lord British, the supposedly benevolent monarch, has a dirty little secret, an ugly torture chamber hidden in the depths of his castle. It’s almost enough to make you ask who’s really the evil one here. The manual talks a good game about Exodus, but he doesn’t actually do anything at all in the game itself, just hangs out in his castle and waits for us to come kill him. Meanwhile Lord British has torture chambers, and his lands are beset with monsters trying to kill me, and he seems utterly uninterested in helping me beyond boosting my hit points from time to time. Nor am I exactly morally pure: my own mission in the torture chamber is not to save the fellow who’s been thrown into a lake of fire, merely to extract some information from him.

The screenshot at the right shows an even more morally questionable episode, albeit one that requires a bit more explanation. I’m the one on the horse. Each of the three clerics next to me has a critical clue to convey. However, I can’t interact on a diagonal, meaning that the one at bottom right is inaccessible to me — unless I open up a lane by killing one of his companions in cold blood, that is. I want to emphasize here that the clue the inaccessible cleric has to offer is absolutely necessary; he tells where to dig for some special weapons and armor that provide the only realistic way to survive the end-game in Exodus’s castle. Thus the only way forward is, literally, murder, and it’s a conscious design choice on Garriott’s part. Of course, he didn’t think of it quite that way. He just saw it as an interesting mechanic for a puzzle, having not yet made the leap himself from mechanics to experiential fiction. Again, all of that would change with Ultima IV.


Ultima III

Ultima III

Speaking of horses: given Garriott’s newfound willingness to edit, the vehicles available to me in Ultima III are neither so plentiful nor so outrageous as they were in Ultima II. The ridiculous and ridiculously cool airplane, for instance, is gone.

I can buy horses for my party in a couple of towns. These let me move overland a bit faster, using less food and avoiding many of the wandering monsters and the endless combats they bring which can test the patience of the hardiest of players. A ship can be acquired only by taking it from one of the roving bands of pirates that haunt the coastline. There aren’t actually a lot of pirates about, which can get very frustrating; a ship is required to visit several important areas of the game, and finding one can be tough. In the right-hand screenshot above I’ve sailed to an island, where, following the lead of the cleric whose companion I killed in cold blood, I’ve dug up the aforementioned special weapons that are required to harm Exodus’s innermost circle of minions.


Ultima III

Ultima III

I also need a ship to get to the alternate world of Ambrosia, which I can manage only by the counter-intuitive step of sailing into a whirlpool. Here I find shrines to each of the four abilities, the only ways to raise my scores above their starting values. Doing so is vital; in Ultima III‘s still somewhat strange system, ability scores have much more effect on my performance in combat and other situations than my character level. For instance, the number and power of spells I can cast has nothing to do with my level, only with my intelligence (wizard spells) or wisdom (cleric spells).

The explicitly Christian imagery in these shrines, and occasionally in other places in the game, is worth noting. It’s doubtless a somewhat thoughtless result of Garriott’s SCA activities and his accompanying fascination with real medieval culture, but it could certainly be read as disrespectful, a trivializing of religious belief. It’s the sort of thing that TSR, creators of Dungeons and Dragons, were always smart enough to stay well away from (not that it always helped them to avoid controversy). Similarly, you definitely will never see crosses in a big-budget modern fantasy CRPG.


Ultima III

Ready at last, I piece together a string of clues and sail to the “Silver Snake”. There I yell the password “EVOCARE” to enter Exodus’s private grotto. The Silver Snake itself provides a good illustration of just how intertwined the early Ultima games were with Garriott’s own life. And the anecdote that explains its presence here also shows some of the difficulties of trying to pin down the facts about Garriott’s life and career.

Growing up in Houston in the mid-1970s, Garriott was one of the few people to see the infamously awful adventure film Doc Savage: The Man of Bronze. Members of the lost Central American tribe that Savage battles in the movie all bear a tattoo on their chest of the Mayan god Kukulkan, about whom little is known today apart from his symbol: a serpent.

Kukulkan

Young Richard thought the symbol so cool-looking that he went to his mother’s silversmithing workshop in that room above his family’s garage that would one day house Origin Systems and made the design — or as close an approximation as he could manage — for himself. He put his new amulet on a chain made from one of his mother’s belts. He told Shay Addams about it circa 1990:

“And this chain now resides around my neck 365 days a year, 24 hours a day — it has essentially remained there for the rest of my life ever since the day I put it on. There is no way to remove it without taking a screwdriver to it and prying open one of the links. For the first couple of years that I wore it, I actually had a link that I used to open and close a little bit. After I realized I was wearing out something by doing that, I quit doing it, so this necklace has remained here ever since. It literally never comes off. The chain was gold-colored when I first put it on. As it wears off, the colors keep changing, and now it rusts on my neck. I mean literally, every day. When I go, I may die of rust poisoning or something.”

Shortly after finishing Ultima III, Garriott loaned the original to his father Owen to carry with him on his second and final trip into space. It went into space again with Richard himself in 2008, and it seems that he still wears it frequently if not constantly. For what it’s worth, the color now seems to be a dull silver, almost a pewter shade.

But… wait. A close look at the early portrait of Origin Systems I published earlier shows that he doesn’t seem to be wearing it there, although Ken Arnold is using either the original or a duplicate as a key ring. Various other contemporary photos show no evidence of a chain or amulet, at least not of the construction and bulk of the one he wears to public appearances in recent years. Now, you could say that to even question this is petty, and in a very real sense you’d be right. Really what does it matter whether he never takes the serpent medallion off or whether it’s merely a precious link to his past that he wears on special occasions? I mention it here only because it points to how slippery everything involving Garriott can be, how much the man often seems to prefer SCA-style legend over the messier world of historical facts, and by extension how eager his interviewers and chroniclers often are to mythologize rather than document. That in turn forces me to spend far more time than I’d like to debunking or at least double-checking everything he says and much of what is said about him. But we’ve moved far afield from Ultima III now, so enough beating of this particular dead horse.


Ultima III

Ultima III

As I’ve mentioned before, Garriott excised most of the anachronistic science-fiction elements from Ultima III to focus on fantasy. But notice that I said “most.” When I get to the grand climax at last, I learn that Exodus apparently is in fact… a giant deranged computer in the tradition of Star Trek. The four magic cards I quested for were apparently punched cards — Exodus is an old-fashioned evil computer — that I need to use to shut him down or change his programming or… something. Of course, none of this makes a lick of sense — how did Mondain and Minax manage to breed a computer child? But I dutifully insert the cards and shut him down, and am left to “speculation” about Ultima IV.

In that spirit, let’s note that Garriott himself sees the Ultimas through Ultima III as essentially technical exercises, written “to satisfy my personal interest in seeing how much better a game I could put together with the skills I’d acquired while creating the previous game.” While his technology would continue to improve, with Ultima III it reached a certain point of fruition at which it was capable of delivering more than an exercise in rote mechanics, was capable of sustaining real experiential fictions. Garriott didn’t entirely realize that at the time he was writing Ultima III, and thus the game takes only the most modest of steps in that direction. When he started on the next one, however, it would all come home. In a way, it’s with that game that Ultima really became Ultima as we remember it today. We have much else to talk about before we get there, but I hope you’ll still be around when we do. With Ultima III Garriott had his foundation in place. Next would come the cathedral.


Origin Systems

Early days in the garage at Origin. Top row, from left: Ken Arnold, Mike Ward, Laurie Thatcher, James Van Artsdalen, Helen Garriott, John Van Artsdalen. Bottom row: Richard Garriott, Robert Garriott, Chuck Bueche.


When we last checked in with Richard Garriott, he had just released Ultima II under the imprint of Sierra Online. Despite all of the pain and tension of its extended development process and the manifold design flaws that resulted from that, Ultima II proved to be a hit, selling over 50,000 copies within the first year or so and eventually approaching sales of 100,000. Contemporary reviews were uniformly stellar. In contrast to Ultima II‘s modern reputation as the black sheep of the Ultima family, reviewers of the era seemed so entranced by the scope and vision of the game, so much grander than anything else out there, that they were willing to overlook all of the useless spinning gears that didn’t connect with anything else and the many things that just didn’t make sense even by the generous standards of CRPG storytelling. Only one review that I’ve seen takes note of Ultima II‘s strangely disconnected design elements at all, James A. McPherson’s piece for Computer Gaming World. Even he bends over backwards to put the best possible interpretation on it:

My only thought as I finished the game was that very little of this enormous work was really being utilized as being required to finish the game. It was almost as if this was only a small initial quest to give you the lay of the land and that additional scenarios would be released, each one using more of the game until the “Ultimate” quest was finished.

No “additional scenarios” would have a chance to appear even if Garriott or someone at Sierra had read this review and thought it a good idea. As McPherson wrote those words Garriott’s relationship with Sierra was falling to pieces.

As I described in my earlier article, the relationship had been full of tension for months before the release of Ultima II. Big, blustery Ken Williams of Sierra took pretty good care of his people and was beloved by most of them for it, but he never let it be forgot that he considered them his people; he always made it clear who was ultimately in charge. Richard Garriott, younger and quieter than Ken though he may have been, had just as strong a will. He just wasn’t going to be the junior partner in anything. In fact, he even had a small entourage of his own, some of his old running buddies from high school who assisted with his projects in various ways. Most prominent amongst this group were Ken Arnold, Keith Zabalaoui, and Chuck Bueche (immortalized as “Chuckles the Jester” in many an Ultima), the latter two of whom also spent time in Oakhurst at the Sierra offices. Throw in a serious culture clash between the free-spirited California lifestyle of Sierra and the conservatism of Garriott’s suburban Texas upbringing and a final blow-up was probably inevitable. It came just weeks after Ultima II‘s release.

Through much of 1982 Sierra was essentially a two-platform shop. Most of their games were developed on the Apple II, and then those that were successful would be ported to the Atari 8-bit line. (A minority, such as the works of Atari stalwart John Harris, went in the opposite direction.) Accordingly, immediately upon signing Garriott Sierra had not only re-released Ultima I, whose rights they recovered from the now defunct California Pacific as part of the deal, but also funded a port of that game to the Atari machines. Ultima II‘s Atari port was done, by prior agreement, by Chuck Bueche for a piece of Garriott’s generous royalties. By this time, however, it was becoming clear that Sierra would need to support more than just these two platforms if they wished to remain a major player in the exploding software industry. They therefore funded an additional port of Ultima II, without Garriott’s direct oversight, to the IBM PC. (Another unsupervised port, to the Commodore 64, would follow later in 1983.) The contract he had signed not only allowed Sierra to choose where and when to port Ultima II, but also allowed them to pay Garriott a considerably lower royalty for ports with which he and his entourage were not involved. Effectively he would be paid as the designer only, not as the designer and the programmer. Garriott, who had apparently overlooked this aspect of the contract, felt like he was being swindled even though Sierra remained well within the letter of the law. You can choose to see all of this as you like, as Ken Williams slyly manipulating contract law to put one over on his naive young signee or as a simple failure of due diligence on Garriott’s part.

Regardless, Garriott had consciously or subconsciously been looking for a reason to split with Sierra for some time. Now he had a suitable grievance. Luckily, he had been wise enough to retain the rights to the Ultima name. Even Ultima I and II were given exclusively to Sierra only for a few years before reverting to their creator. There was thus nothing stopping him from continuing the Ultima series elsewhere.

But where? He certainly had no shortage of suitors, among them Trip Hawkins, who pitched hard for Garriott to become one of his electronic artists. Still, Richard wasn’t sure that he wanted to get in bed with yet another publisher at all. He talked it over with his business adviser, his older brother Robert, who in the best over-educated tradition of the Garriott family was just finishing his second Master’s degree at MIT with the thesis “Cross Elasticity Demand for Computer Games.” Robert proposed that they start their own publisher, with him managing the business side and Richard and his buddy Chuck Bueche the technical and creative. And so Origin Systems was born. It would be a little while before they came up with their brilliant slogan — “We Create Worlds” — but just the company name itself was pretty great. It probably owed something to the Origins Games Fair, one of the two most prominent North American conventions for tabletop gamers of all types. Richard, who had played Dungeons and Dragons obsessively in high school and had become an intimate of Steve Jackson Games while at university in Austin, had deep roots in that culture. Richard, Robert, their father Owen, and Chuck Bueche all put up money — with the lion’s share naturally coming from the relatively flush Richard — to become the founders of a new games publisher.

Everything about the young (literally; look at their picture above!) Origin Systems was bizarre, even by startup standards. They set up shop in Richard’s personal playhouse, a space above the Garriott family’s three-car garage which had once served as an art studio for his mother but had been commandeered by Richard and his friends years before for their D&D games. It was a big room scattered with desks, chairs, and even cots. Here Richard and his friends set up their various computers. A little cubbyhole at one end served as Robert’s business office. Robert himself was still officially living in Massachusetts with his wife, who had quite a career of her own going as a manager at Bell Labs and thus couldn’t move. Robert, however, was a pilot with a little Cessna at his disposal. He spent three weeks of each month in Houston, then flew back to spend the last with his wife in Massachusetts.

Together Chuck Bueche and Richard worked feverishly on the games that would become Origin Systems’s first two products. Chuck’s was an action game called Caverns of Callisto; Richard’s was of course the big one upon which they were all depending to get Origin properly off the ground, Ultima III.

Given its flagship status, Garriott felt compelled to try to remedy some of the shortcomings of his earlier games. In particular, he was obviously eyeing the Wizardry series; for all of the Ultima series’s stellar reviews and sales, the first two Wizardry games had garnered even better and more of both. Much of what’s new in Ultima III is there in the name of addressing his series’s real or perceived failings in comparison with Wizardry. Thus he replaced the single adventurer of the early games with a full party which the player must manage; added a new strategic combat screen to make fights more interesting; added a full magic system with 32 separate spells to cast to replace the simplistic system (which the player could easily and safely ignore entirely) of his previous games; added many new class and race options from which to build characters; made some effort to bring some Wizardry-style rigorousness to the loosey-goosey rules of play that marked his earlier games.

Notably, however, Ultima III is also the first Garriott design that doesn’t simply try to pile on more stuff than the game before. Whether because he knew that, what with his family and friends all counting on him, this game needed to be both good and finished quickly or just because he was maturing as a designer, with Ultima III he for the first time showed an ability to edit. Garriott was never going to be a minimalist, but Ultima III is nevertheless only some 60% of the geographical size of Ultima II, the only example of the series shrinking between installments prior to everything going off the rails many years later with Ultima VIII. Also gone entirely is the weird sub-game of space travel, as well as — for the most part — the painful stabs at humor. Yet it’s safe to say that Ultima III will take the average player much longer to finish, because instead of leaving huge swathes of game — entire planets! — dangling uselessly in the wind Garriott this time wove everything together with an intricate quest structure that gives a reason to explore all those dungeons. In fact, there’s a reason to visit every significant area in the game.

Viewed from the vantage point of today, Ultima III is perched on a slightly uncomfortable border, right between the simple early Ultimas that predate it and the deeper, richer works that make up the heart of Ultima‘s (and Richard Garriott’s) legacy today. I don’t know if any other game in the series sparks as much diversity of opinion. To some it’s just a long, boring grind, while a small but notable minority actually name it as their favorite in the entire series. Personally, I can appreciate its advances but take issue with many aspects of its design, which strike me as cruel and rather exhausting. My favorite of the early Ultimas, the one that strikes me as most playable today, remains Ultima I. But I’ll talk about Ultima III at much greater length in a future post. For now let’s just note that it gave CRPG players of 1983 exactly what they wanted — a big, convoluted, epic experience that pushed the technology even further than had the previous game — without the bugs and other issues that had plagued Ultima II.

Having dropped out of even a part-time university schedule and now largely living right there in that garage loft, Richard wrote Ultima III quickly, almost inconceivably so given its technical advancements. It was done in about six months, barely one-third the time invested into Ultima II and considerably less time than it would take many a player to finish it. As usual, the game itself was essentially a one-man effort, but as it came together he recruited family and friends to help with numerous ancillary matters. Ken Arnold, his old buddy from the ComputerLand days, wrote and programmed a lovely soundtrack for the game, playable by those who had purchased one of the new Mockingboard sound cards for their Apple II. A huge advance over the bleeps and farts of the previous games, it was the first of three Arnold-composed soundtracks that have become a core part of Ultima nostalgia for a generation of players, especially once ported to the Commodore 64, where they sounded even better on the magnificent SID chip.

Ultima III

But most of the outside effort went into the package. Origin may have literally been a garage startup, but Richard was determined that their products should not look the part. He wanted to outdo Sierra’s efforts for Ultima II; he succeeded handily. Denis Loubet, whom Richard had met back when he did the original cover art for the California Pacific Akalabeth, now drew a striking demon for the Ultima III cover which might not have had anything obviously to do with the contents of the disks but sure looked cool. (Maybe too cool; lots of overzealous Christian parents would take one look and start sending Garriott letters accusing him of Satanism.) Loubet also provided pictures for the manuals, as did Richard’s mother Helen, who drew up another mysterious cloth map complete with arcane runes along the borders; such maps were about to become another of the series’s trademarks. And did you notice I said “manuals”? That wasn’t a typo. Ultima III included three: a main game manual along with two more booklets containing elaborate faux-medieval descriptions and illustrations for each wizard and cleric spell. Said faux-medieval writing is a bit more tolerable this time because Richard, no wordsmith, didn’t write it himself. The spell descriptions were done by Margaret Weigers, a local friend, while Roe R. Adams III, who was quickly parlaying his reputation as the king of adventure-game players into a career in game development (he would soon sign on to design Wizardry IV for Sir-Tech), doused the main manual in copious quantities of suitably purple prose (yet another Ultima trademark).

As July of 1983 faded into August the game was already largely finished and the various hardcopy pieces were beginning to come in from the printers. Showing that he could challenge even Ken Williams in the charisma department when he wanted to, Richard convinced Mary Fenton and Jeff Hillhouse, two Sierra employees he’d met during his time in Oakhurst, to come join Origin. Fenton would become Origin’s first customer-service person; Hillhouse, who had learned how the industry worked at Sierra, would handle logistics and distribution. When he made contact with distributors and announced Ultima III, everyone was astonished when initial orders totaled no less than 10,000 units. Richard and Robert now kicked their long-suffering parents’ vehicles out of their own garage to make room for a big shrink-wrap machine — their biggest capital investment yet — and a workbench of computers to use for disk duplication. By now Origin had rented a tiny office in Houston to serve as the front that they presented to the world, but the real heart of the company remained there in the garage. For several months evenings in front of the television at the Garriott household would be spent folding together lurid demon-painted boxes.


Origin Systems’s first advertisement, for their first two products

Ultima III began shipping in late August for the Apple II. Versions for the Atari 8-bit line and the Commodore 64 soon followed. Both ports were done by Chuck Bueche, whose role as a creative and technical force with Origin during these early days was almost as significant as Richard’s. The game was a huge hit across all platforms; Ultima III became the first Ultima to top 100,000 units in sales, a mark that all of the following titles would surpass with ease. Indeed, this moment marks the point where Ultima pulled ahead of the Wizardry series once and for all to become simply the premiere CRPG series of its era. Despite the occasional worthy competitor like the Bard’s Tale series, it would not be really, seriously challenged in that position until the arrival of the officially licensed D&D games that SSI would start releasing at the end of the decade. Happily, Ultima and Richard Garriott would prove worthy of their status; the next Ultima in particular would be downright inspiring.

But for now we still have some business for 1983 and Ultima III. I want to take a closer look at the game, which planted the seeds of much that would follow. First, however, we’ll take a little detour to set the record straight about another one of those persistent myths that dog fan histories of Ultima.

(Richard Garriott’s career has of course been very well documented. The two most in-depth histories are The Official Book of Ultima and Dungeons and Dreamers, even if a distinct whiff of hagiography makes both rather insufferable at times. And of course he’s all over contemporary magazines, not to mention the modern Internet. A particular gem of an article for students of this period in his career is in the November/December 1983 Softline. That’s where I found the wonderful picture at the beginning of this article.)

 


Dan Bunten and M.U.L.E.

Dan Bunten

As Electronic Arts got off the ground, Trip Hawkins hired three veterans from his time at Apple — Dave Evans, Pat Marriott, and Joe Ybarra — to become the first people with the job title of “producer” at EA. Their new careers began with a mock draft: Hawkins had them draw lots to determine the order in which they would get to pick the developers they would be working with. Naturally, the three experienced developers all went in the first round, and in the order of their status within established gaming circles. Evans picked first, and chose Bill Budge, the first and arguably still the greatest of the Apple II’s superstar game developers, with name recognition within that community that could be matched by very few others. Marriott chose next, and picked Free Fall Associates, whose Jon Freeman had been responsible for the landmark CRPG hit Temple of Apshai and the Dunjonquest line of sequels and spinoffs that had followed it from Automated Simulations. That left Ybarra with Dan Bunten and his new team Ozark Softscape.

Unlike the others, Bunten had no hits on his resume; his biggest game to date had sold all of 6000 copies. He had previously published through Strategic Simulations, Incorporated, which was the antithesis of Hawkins’s vision of casual consumer software, having been founded by a grognard named Joel Billings to release a series of almost aggressively off-putting computer wargames in the hardcore tabletop tradition. Still, Hawkins had fallen in love with one of Bunten’s SSI games, a business simulation called Cartels and Cutthroats. He had first tried to buy it outright from Billings. When his overtures were rejected, he turned to Bunten himself to ask if he would like to make a game kind of like it for EA. Thus the presence of this B-lister on EA’s rolls, complete with generous royalty and advance. To make things even worse, Ozark was located, as the name would imply, deep inside flyover country: Little Rock, Arkansas. Ybarra certainly didn’t relish the many trips he would have to make there. Little did he realize that the relationship would turn into one of the most rewarding of his career, or that the first game he would develop with Ozark, M.U.L.E., would become the most beloved of all the early titles inside the company, or that it would go on to be remembered as one of the greatest of the all-time classic greats.

Dan Bunten was an idealist from an early age. At university he protested the Vietnam War, and also started a bicycle shop, not to make money but to help save the world. According to his friend Jim Simmons, Bunten’s logic was simple: “If more people rode bikes, the world would be a better place.” When he watched Westerns, Bunten was an “Indian sympathizer”: “It just seems like such a neat, romantic culture, in tune with the earth.” A staunch anti-materialist, he drove a dented and battered old Volkswagen for years after he could afford better. “I felt like I sold out when I bought a 25-inch color TV,” he said. That 1960s idealism, almost quaint as it now can sound, became the defining side of Bunten the game designer. He campaigned relentlessly for videogames that brought people together rather than isolating them. As his most famous quote, delivered at an early Game Developers Conference, went, “No one on their death bed ever said, ‘I wish I had spent more time alone with my computer!'” M.U.L.E. positively oozes that idealistic sentiment. As such, it’s an easy game to fall in love with. Certainly your humble blogger here must confess to being a rabid fanboy.

The seeds of M.U.L.E. were planted back in 1978 when Bunten bought his first Apple II. Educated as an industrial engineer, at that time he was 29, married with a daughter, and seemingly already settled into running a consulting firm doing city planning under a National Science Foundation grant in Little Rock. The eldest of six children, Bunten and his siblings had played lots of board games growing up: “When I was a kid the only times my family spent together that weren’t totally dysfunctional were when we were playing games.” In fact, some of his fondest childhood memories had taken place around a Monopoly board. Dan and his brother Bill had also delved into the world of wargames; when the former was twelve and the latter ten they had designed a complete naval wargame of their own, drawing the map directly onto the basement floor. During a gig working at the National Science Foundation, he had spent some of his time tinkering on their Varian minicomputer with an elaborate football simulation he imagined might eventually become the heart of a Master’s thesis in systems simulation. Now he started working on a game for the Apple II. Right from the beginning his approach to game design was different from that of just about everyone else.

Bunten loved more than anything the social potential of gaming. Setting a precedent that would endure for the rest of his career, he determined to bring some of that magic to the computer. Working in BASIC with only 16 K, he wrote a simple four-player auction game called Wheeler Dealers. He designed a simple hardware gadget to let all four players bid at once. (The details of how this worked, as well as the game software, unfortunately appear to be lost to history.) Then he found a tiny Canadian mom-and-pop publisher called Speakeasy Software to sell the game and the gadget for $54. (Speakeasy’s founder Brian Beninger: “Dan called out of the blue one day and spoke to Toni [Brian’s wife]. She had never experienced an accent from the southern United States and had trouble understanding him…”) Legend has it that Wheeler Dealers was the first computer game ever sold in a box, a move necessitated by the inclusion of the hardware gadget. However, such a claim is difficult to substantiate, as other games, such as Temple of Apshai and Microsoft Adventure, were also beginning to appear in boxes in the same time frame. What is certain is that Bunten and Speakeasy took a bath on the project, managing to sell just 50 to 150 (sources vary) of the 500 they had optimistically produced. In retrospect that’s unsurprising given the game’s price and the limited reach of its tiny publisher, not to mention the necessity of gathering four people to play it, but it did set another, unfortunate precedent: Wheeler Dealers would not be the last Bunten game to commercially disappoint.

Computer Quarterback, in its 1981 incarnation

Still, Bunten had caught the design bug. For his next project, he dusted off the FORTRAN source to his old football simulation. As would befit a Master’s thesis project, that game was the “most thoroughly mathematically modeled” that he would ever do, the deepest he would ever delve into pure simulation. It was, in other words, a great fit for the hardcore grognards at SSI, who released Computer Quarterback as one of their first titles in an all-text version in 1980, followed by a graphical update that took advantage of the Apple II’s hi-res mode in 1981. Typically for SSI, the manual determinedly touts Bunten’s professional credentials in an attempt to justify him as a designer of “adult games.” There is even affixed his seal as a “State of Arkansas Registered Professional Engineer”:

By affixing my seal hereto, I certify that this product was developed in accord with all currently accepted techniques in the fields of operations research, systems simulation, and engineering design, and I further accept full responsibility for the professional work represented here.

It all seems a bit dreary, and an especially odd sentiment from a fellow who would become known for championing easy accessibility to everyday people in his designs. Yet simulation of the real world was in fact a deep, abiding fascination of Bunten, albeit one that would be more obscured by his other design tendencies in his later, mature games. In the meantime, SSI’s audience of the hardcore was big enough to make Computer Quarterback Bunten’s bestselling game prior to his signing with EA, the one that convinced him to quit his day job in city planning and dive into game development full time. Indeed, the aforementioned figure of 6000 sold at the time of EA’s founding would continue to increase afterward; SSI would continue to sell updated versions well into the late 1980s.

Cartels and Cutthroats

Bunten’s next game was the one that caught Hawkins’s eye, Cartels and Cutthroats. Like Hawkins of the “Strategy and Applied Game Theory” degree, Bunten was fascinated by economic simulations. For help with the modeling of Cartels, an oddly abstracted simulation of the business world — you are told in the beginning only that your company produces either “luxury,” “mixed,” or “necessity goods” — he turned to his little brother Bill, who had recently finished his MBA. Apparently few other gamers of the time shared Hawkins’s and Bunten’s interest in economic simulation; Cartels did not even manage the sales that Computer Quarterback had. Bunten later wryly noted that “evidently folks interested in playing with the stock market or business, do it in real-life instead.” That may to some extent be true, but in my opinion the game’s abstractions do it no favors; it’s hard to get excited about your role as producer of a “luxury good.” Cartels today reads as a step on the road to M.U.L.E.. The later game would continue the economics focus while grounding itself in a much more specific context that the player can really get her hands around.

If these early SSI games can seem slightly anomalous compared to Bunten’s mature work in their unabashed focus on simulation, one thing did stay consistent: they were conceived primarily as multi-player affairs. SSI had to cajole him into putting together a rudimentary opponent AI and single-player mode for Computer Quarterback as a condition of acceptance for publication. Bunten named the computer’s team “The Robots,” which perhaps shows about how seriously he took them. Cartels and Cutthroats offers a number of ways for up to six people to play together, the most verisimilitudinous of which employs a printer to let each player grab her stock reports off the “teletype.” Here computer players, while once more optionally present, still don’t get no respect: now they are called “dummies.”

Cytron Masters

Bunten’s final game for SSI was a marked departure. Released on SSI’s short-lived Rapid Fire line of action-oriented titles, Cytron Masters plays like a prototype of the real-time strategy games that would become popular a decade later. Two players — the two-player mode was again the main focus; the computer opponent’s AI was predictably atrocious — face off on a battlefield of the future in real time, spawning and issuing orders to six types of units. Each player can have up to fifty units onscreen at once, all moving about semi-autonomously. Bunten’s first game to use large amounts of assembly-language code as opposed to BASIC, it was by far his most challenging programming project yet. Cytron had to juggle animations and sound effects while also running the simple AI routines for up to a hundred on-screen units and accepting input from two players, all without becoming so slow as to lose its status as an “action-strategy” game. This presented a huge challenge on the minimalist, aging hardware of the Apple II. As Bunten wrote in a Computer Gaming World article about the experience, “the Apple can’t do two things without a lot of effort (you have to time your clicks of the speaker with your graphic draw routine so that they take turns). It was a tough program to write [emphasis original].”
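Bunten’s description of making the speaker and the graphics routines “take turns” is the classic cooperative-multiplexing pattern of hardware without sound chips or timer interrupts. Here is a minimal sketch of that pattern in Python (purely illustrative; the actual game was 6502 assembly, and every name below is hypothetical):

```python
def game_tick(units, sound_queue, draw_queue):
    """One frame of a cooperative loop multiplexing sound, drawing, and unit AI.

    On the Apple II there was no sound processor: a tone was produced by
    toggling the speaker at regular intervals on the main CPU, so speaker
    clicks had to be interleaved between drawing operations by hand.
    """
    frame_log = []
    # Alternate between drawing and clicking so neither starves the other.
    while draw_queue or sound_queue:
        if draw_queue:
            frame_log.append(("draw", draw_queue.pop(0)))
        if sound_queue:
            frame_log.append(("click", sound_queue.pop(0)))
    # With sound and graphics done, run the simple per-unit AI/movement.
    for u in units:
        u["x"] += u["dx"]
    return frame_log
```

Scale this up to a hundred semi-autonomous units, two players’ worth of input, and a 1 MHz processor, and it is easy to see why Bunten called it “a tough program to write.”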

By this time the Atari 800 was almost three years old, and Bunten had had one “collecting dust” for a pretty good portion of that time. He had remained committed to the Apple II as both the machine with the healthiest software market and the one he knew how to make “sing.” But now he decided to have a go at porting Cytron Masters to the 800. The experience proved to be something of a revelation. At first Bunten expected to just duplicate the game on the Atari. But when he showed the first version to Atari users, they scoffed. “It’s a neat game, but where’s the color? And what are those little noises?” they asked in response to the explosions.

Needless to say, I decided that if the program was to do well as an Atari version, it would have to use a few of the features of that machine. But, during the conversion, I discovered that all the sophisticated hardware features of the Atari are useful! Cytron Masters uses the separate sound processor and four voices to make truly impressive sound effects (at least compared to the Apple); it uses the display list and display-list interrupts to change colors on the fly, and have character graphics, four-color text as well as hi-res graphics on one screen; it uses player/missile graphics for additional colors and fast animation; and most useful of all, it uses vertical-blank interrupts to allow two programs to (apparently) run at once!

Bunten became the latest of a long line of programmers to fall for the elegance of Jay Miner’s Atari 8-bit design, an elegance which the often developer-hostile antics of Atari itself could obscure but never quite negate. He would never develop another game on the Apple II, and the company he was already in the process of forming, Ozark Softscape, would be an Atari shop. (M.U.L.E. never even got a port to the Apple II.)

Cytron Masters was another relative commercial disappointment for Bunten and SSI. “Rather than appealing to both action gamers and strategy gamers,” he later said, “it seemed to fall in the crack between them.” But then, just as Bunten was finishing up the Atari port, Trip Hawkins came calling asking for that sequel to Cartels and Cutthroats and promising that EA could find him the commercial success that had largely eluded his SSI games.

By this point Bunten was already in the process of taking what seemed to him the next logical step in his new career, going from a lone-wolf developer and programmer to the head of a design studio. In a sense, Ozark Softscape was just a formalizing of roles that already existed. Of the three employees that now joined him in the new company, his little brother Bill had already helped a great deal with the design of Cartels and Cutthroats while also serving as a business adviser; Jim Rushing, a friend of Bill’s from graduate school, had offered testing and occasional programming input since the same time; and Alan Watson, formerly a salesman at a local stereo shop, had helped him with the technical intricacies of Cytron Masters and contributed his talents for Atari graphics programming to the port. Now the three came to Ozark largely in the roles they had already carved out. Bill Bunten, the only one to keep his day job (as a director of parks for the city of Little Rock) and the only non-programmer, would handle the administrative vagaries of running a business. Rushing would program, as would Watson in addition to serving as in-house artist. All three would offer considerable design input as well, but they all would ultimately defer to Dan, the reason they were all here. As Rushing later said, “We all knew Dan was a genius.” They were just happy to be along for the ride.

With their EA advance they rented a big house across the street from the University of Arkansas to serve as office, studio, and clubhouse. Each took a bedroom as an office, and they filled the living room and den with couches, beanbag chairs, and of course more computers, making of them ideal spaces for brainstorming and playing. They filled the huge refrigerator in the kitchen with beer, which helped to lure in a crowd of outsiders to play and offer feedback virtually every evening. These were drawn mostly from the biggest local computer club, the Apple Addicts, of which Dan had been the first president back in the days of Wheeler Dealers. He may have defected to the Atari camp since, but no one seemed to mind; at least one or two were inspired by what they saw in the house to buy Ataris of their own. When they grew tired of creating and playing, the house’s regular inhabitants as well as the visitors could exit the back door to walk around an idyllic fourteen-acre lake, to sit under the trees talking or skip rocks across the water. The house and its environs made a wonderful space for creation as well as an ideal laboratory for Dan’s ideas about games as social endeavors to bring people together. It was here that Dan and his colleagues took M.U.L.E. from the germ of a concept to a shipping game in less than nine months.

Said germ was to create a game similar to the rather dryly presented, text-based Cartels and Cutthroats, only more presentable and more accessible, in line with Trip Hawkins’s credo of “simple, hot, and deep” consumer software. They would be writing for the Atari 8-bit line, which in addition to excellent sound and graphics offered them one entirely unique affordance: these machines offered four joystick ports rather than the two (or none) found on other brands. Dan thus saw a way to offer in practical form at last the vision that had caused him to get involved with game design in the first place back in the days of Wheeler Dealers. Four people could gather around the living room, each with her own controller, and really play together, in real time; no need for taking turns in front of the computer or any of the other machinations that had marked his earlier games. This would allow him to create something much breezier than Cartels and Cutthroats — a game to replace the old board-game standbys on family game nights, a game for parties and social occasions. With the opportunity to do those Wheeler Dealers real-time auctions right at last, Dan dusted the old idea off and made it the centerpiece of the new design.

Given their intention to create a family board game for the next generation, Dan and his colleagues started to look at the classic designs for other ideas with which to surround the auctions. The obvious place to look for inspiration for a game with an economic theme was the game that is still pretty much the board game as far as the masses are concerned: Monopoly. Monopoly gets a lot of grief amongst hardcore gamers these days for a multitude of very real flaws, from an over-reliance on luck in the early stages to the way it often goes on forever after it becomes totally obvious who is going to win to the way it can leave bankrupted players sitting around with nothing to do for an hour or more while everyone else finishes. Yet there’s something compelling about it as well, something more than sheer cultural inertia behind its evergreen appeal. The team now tried to tease out what those qualities were. Bill Bunten said, half facetiously, that his favorite thing about Monopoly was the iconic metal tokens representing each player — the battleship, the car, the top hat, the shoe, etc. Everyone laughed, but the input became an important part of the new game’s charm: every player in it gets to pick the avatar she “most resembles.”

M.U.L.E.

Looking more deeply for the source of Monopoly‘s appeal, the team realized that it was socially- rather than rules-driven. Unlike most board games, which reward the analytical thinker able to maximize the interactions of a rules set, Monopoly — at least if you’re playing it right — rewards the softer arts of negotiation and diplomacy. The personalities of the players and the relationships among them have as much effect on the way play proceeds as do the rolls of the dice. In the Bunten family, Mom would always let you out of paying rent if you couldn’t afford it; Bill would force you to mortgage a property if you came up a dollar short on your rent. Alliances and partnerships would form and shift as a result. The team decided that they wanted that human element in their game. It had never been seen in a computer game before, for the very simple reason that it was beyond the scope of possibility for an AI opponent living in 48 K of memory. But in their game, conceived primarily as a multi-player experience, it should be possible.

And yet more elements were drawn from Monopoly. Play would center around a “board” of properties which would be gradually acquired by the players, through a land grant that began each turn or through auctions or trades. They also built in equivalents to Monopoly‘s Chance and Community Chest cards to keep play from getting too comfortable and predictable. In keeping with Dan’s roots in simulation, however, the game would attempt to model real economic principles, making its theme more than just the window-dressing it largely was in Monopoly. Producing the same good in two adjacent plots would let the player take advantage of economies of scale to produce more; having three plots in total producing the same good would also result in more production, thanks to the learning curve theory of production. In general, the computer allowed for a deeper, more fine-grained game model than was possible in dice and cardboard. For instance, normalized probability curves could be used in place of a six-sided die, and the huge sums of money the players would eventually accumulate could be tracked down to the dollar. It all would result in something more than just a computerized board game. It would be a real, functioning economy, a modest little virtual world where the rules of supply and demand played out transparently, effortlessly from the players’ perspective, just as they do in the real world.

But what should be the fictional premise behind the world? For obvious commercial reasons — Star Wars and Star Trek were huge in the early 1980s — they decided early on to give the game a science-fiction theme. Dan and Bill had both read Time Enough for Love by Robert Heinlein. Dating from the early stages of Heinlein’s dirty-old-man phase, there’s not much to recommend the book if you aren’t a fan of lots and lots of incestuous sex written in that uniquely clunky way of aging science-fiction writers who look around to realize that something called the Sexual Revolution has happened while they were hanging out at science-fiction conventions. Still, the brothers were inspired by one section of the book, “The Tale of the Adopted Daughter,” about a colony that settles on a distant planet. Provided with only the most meager materials for subsistence, they must struggle to survive and build a functioning economy and society by the time the colony ship returns years later to deliver more colonists and, more importantly, haul away the goods they produced to make a profit for everyone back in the more civilized parts of the galaxy. Sounds like a beautiful setup for a game, doesn’t it? To add a realistic wrinkle, the team decided that each of the four players would not only be working for herself, but must balance self-interest with the need to make the colony as a whole successful by the time the ship returned. Thus entered the balancing act people working in real economies must come to terms with, between self-interest and the needs of the society around them. A player who gets too cutthroat in her attempts to wring every bit of profit out of the others can bring the whole economy crashing down around her ears. (Perhaps some banking executives of recent years should have played more M.U.L.E. as youngsters.)

Among the most valuable tools that Heinlein’s colonists bring with them is a herd of genetically modified mules that are not only possessed of unusual strength and endurance but also so intelligent that they can engage in simple speech. The fact that the mules are nevertheless bought and sold like livestock makes this just one more squicky aspect of a very squicky book; it feels uncomfortably like slavery. Obviously that wouldn’t quite do for the game. Then one day Alan Watson’s son came in with a toy model of an AT-AT Walker from The Empire Strikes Back. It only took the removal of the guns and the addition of a listlessly shambling gait to go from Imperial killing machine to cute mascot. A hasty backronym was created: mules were now M.U.L.E.s, Multi-Use Labor Elements programmable to perform any of several different roles. They provided the pivot around which the whole experience would revolve.

He [a M.U.L.E.] was born — if you can call it that — in an underground lab in the Pacific Northwest. A major defense contractor had gone out of its way to get the job and they were stoked.

Stoked, that is, until the detailing robots went on strike. Costs ran over. Senators screamed. And when the dust had cleared, the job was finished by a restaurant supply firm, a maker of preschool furniture, and the manufacturers of a popular electric toaster.

It shows.

The game itself was quickly renamed from the underwhelming Planet Pioneers to M.U.L.E., albeit not without some conflict with EA, who pushed for the name Moguls from Mars. Thankfully, M.U.L.E. won the day in the end.

AT-AT Walkers and a M.U.L.E.

Combined with the Monopoly-inspired player avatars, the M.U.L.E.s anchored the game in a concrete reality, offering it an escape from the abstraction that had limited the appeal of Cartels and Cutthroats. Now the player could be embodied in the economic simulation. She didn’t just assign one of her plots to produce, say, smithore (the colony’s main cash crop, which requires food and energy to produce) from some textual display. No, she had to walk into the village at the center of the colony, buy a M.U.L.E., outfit it for the right sort of work, then lead it back to her plot. And now auctions could be implemented as a unique combination of footrace and game of chicken involving all of the players’ avatars. All of this is done entirely with the joystick, forming a GUI of sorts perfectly in line with Trip Hawkins’s vision of a new generation of friendly consumer software. The new “visual, tactile relationship” (producer Joe Ybarra’s words) between player and game also allowed some modest action elements to keep players on their toes: they had only a limited amount of time to try to accomplish everything they needed to — buying M.U.L.E.s, reequipping and rearranging them to suit current production needs, etc. — during their turn. Running out of time or misplacing a M.U.L.E. (thus causing it to run off) could be disastrous; conversely, working quickly and efficiently, and thus finishing early, gave time to earn some extra money by gambling in the pub or, in an homage to Gregory Yob’s classic, to go hunting for the “mountain wampus.” The latter was just one of many elements of whimsy the team found room to include, one more drop in M.U.L.E.‘s bucket of charm.

A land auction in progress.

About to buy a M.U.L.E. in the village.

Leading a M.U.L.E. from the village at the center of the game board for placement in an empty plot (denoted by the house symbol) at far left.

Hunt the “Wampus”

With the core ideas and mechanics now in place, Dan Bunten and his colleagues had the makings of one hell of a game on their hands. But as any good game designer, whether she works in cardboard or silicon, will tell you, even the most genius of designs must be relentlessly tested, endlessly tweaked. Ozark Softscape and EA devoted literally months to this task, gradually refining the design. Land had originally all been sold through auctions, but this soon became obviously problematic: once a player got fairly well ahead, she would be able to buy up every plot that became available, putting her economy in a different league from everyone else’s and making the outcome a foregone conclusion as early as halfway through the game. They solved this by automatically granting one plot of land to each player on every turn, only supplementing those grants with the occasional plot that came up for auction. They also added several other little tweaks designed to keep anyone from completely running away with the game. For instance, a bad random event can never happen to the player in last place, while a good one can never happen to the player in first. In case of ties in auctions or land grants — two or more players arriving somewhere or pressing their buttons at the same time — priority always goes to the player furthest behind.
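Those catch-up rules reduce to a couple of one-liners. A sketch (in Python, with hypothetical names and a made-up scoring scheme; the real game tracked players by joystick port and net worth):

```python
def break_tie(tied_players, scores):
    """Award a contested land grant or auction tie to the player furthest behind."""
    return min(tied_players, key=lambda p: scores[p])

def eligible_for_event(players, scores, good_event):
    """Good random events skip the leader; bad ones skip the last-place player."""
    ranked = sorted(players, key=lambda p: scores[p])  # worst to best
    return ranked[:-1] if good_event else ranked[1:]
```

The effect of rules like these is to compress the standings just enough that no one is ever mathematically out of the game, without ever handing the win to the laggard outright.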

And then of course the economy itself — the exact relationship between supply and demand, the prices of the different commodities and the ways they fluctuated — required a multitude of adjustments to find the perfect balance.

The game was designed to always have four players, with the seats of any absent humans being filled by computer opponents. This required the development of AI. While obviously not the main point of M.U.L.E., the team to their credit did a pretty good job with that; the computer often makes smarter moves than you might expect. Single-player M.U.L.E. is a pale shadow of multi-player M.U.L.E., but it’s hardly a disaster. (As Dan later wrote, “Single-player M.U.L.E. is considerably better than single-player Monopoly!”) It’s even possible to let four computer opponents play while you sit back and watch, something that stores looking to feature the game in their sales windows must have greatly appreciated.

Ozark relied for all of the exhaustive and exhausting testing required to get everything right not only on the endless stream of eager players who visited their house each night but also on others back at EA. Both Hawkins and Ybarra made considerable contributions to the design. Hawkins pushed always to make M.U.L.E. as realistic an economic simulation as its premise and the need for accessibility — not to mention the limited capabilities of the Atari 800 — would allow. Later he wrote the manual himself; like the game, it’s a model of concise, friendly accessibility, designed to get the player playing with an absolute minimum of tedious reading. As for Ybarra… well, here’s his level of dedication to a project of which he had started out so skeptical:

Right about the mid-point of the product, when we were starting to get [the] first playable [builds], that was when I started my several-hundred hour journey of testing this game. I can remember many nights I would come home from work and fire up the Atari 800 and sit down with my, at the time, two-year-old daughter on my lap holding the joystick that didn’t work, while I was holding the joystick that did work, testing this game. And I’d probably get eight or ten games in at night, and I would do that for two or three or four months actually, trying to work out all the kinks in the product.

By the way, at that time in the history of EA, we had no testers. In fact we had no assistance—we didn’t have anything! So producers had to do everything. I tested my own products; I built my own masters; I did all the disk-duplication work; I did all the copy-protection; I did the whole nine yards! If it was associated with getting the product manufactured, the producers did all the work. I remember a lot of nights there staying up until one or two o’clock in the morning playing M.U.L.E. and thinking, “Wow, this game is good!” It was a lot of fun. And then thinking to myself, “Gee, I wish the AI would do this.” So I took notes and took them along to Dan, and said “If you do these kinds of things at this point in the game, this is what happens.” He would take parts of those notes, and a couple of days later I’d get a new build and be back in that main chair back with my daughter on my lap, once again testing this thing and checking to see if it worked. More often than not, it did. That was a really special time.

As the game neared completion just in time for EA’s own launch as a publisher, the EA PR folks went to work. Hewing always to the “software artists” dictum, they cast Ozark Softscape as a group of hip back-country savants, sort of the gaming equivalent of the Allman Brothers Band. Their portrait on the inner sleeve of M.U.L.E. even bears a certain passing resemblance to the Allmans’ iconic At Fillmore East cover.

The Allman Brothers Band At Fillmore East

Seated from left: Bill Bunten, Jim Rushing, Alan Watson, Dan Bunten

Like all of this software-artists stuff, it was a bit contrived. The girl Bill Bunten is apparently ogling like a real rock star on the prowl is actually his sister, hastily recruited to add an element of additional interest to the picture.

Heartbreakingly, the image-making and advertising didn’t get the job done. Despite all the love lavished on M.U.L.E. by Ozark Softscape and EA and despite deservedly stellar reviews, it was a commercial disappointment. M.U.L.E. sold only about 30,000 copies over its lifetime. By way of comparison, consider that Pinball Construction Set, another launch title, shifted over 300,000 units. Some of the disappointment may be down to M.U.L.E.’s debuting on a relative minority platform, the Atari 8-bit line. Although it was later ported to the juggernaut Commodore 64, it was kludgier away from the Atari and its four joystick ports. Even the latest iteration of the Atari 8-bit line, the 1200XL, couldn’t play M.U.L.E. properly, thanks to Atari’s decision to reduce the four joystick ports to two in the name of cost reduction. Out of all the disappointments engendered by that very disappointing machine, this was perhaps the most painful. Thus M.U.L.E., the Atari 8-bit’s finest gaming hour, plays properly only on a subset of the line.

But likely even more significant was a fact that was slowly becoming clear, to Dan Bunten’s immense frustration: multi-player games just didn’t sell that well. It really did seem that most of the people buying computer games preferred to spend their time alone with them. Reluctantly recognizing this, even he would soon be forced by commercial concerns to switch to the single-player model, at least for a couple of games.

Yet we can take comfort in the fact that M.U.L.E.’s reputation has grown to far exceed its commercial performance. Indeed, it’s better remembered and better loved today than all but a handful of the contemporaries that trounced it so thoroughly in the marketplace back in the day. And deservedly so, because playing M.U.L.E. with a group of friends is a sublime experience that stands up as well today as it did thirty years ago. The world is a better place because it has M.U.L.E. in it, and every time I think about it I feel just a little bit happier than I was before. Just a few notes of its theme music (written by a Little Rock buddy of the Buntens, Roy Glover) put a smile on my face. If the reasons for that aren’t clear from all the words that have preceded these, that may be down to my failings as a writer. But it may just also be down to the way that it transcends labels and descriptions. If ever a game was more than the sum of its parts, it’s this one. I could tell you at this point how such gaming luminaries as Sid Meier, Will Wright, and Warren Spector speak about M.U.L.E. with stars in their eyes, but instead I’ll just ask you to please go play it.

There are modern re-creations on offer, but purists like me still prefer the original. In that spirit, here’s the manual and Atari disk image, which you can load into an emulator if, like most of us, you don’t have an old Atari 800 lying around. Pick up some old-time digital joysticks as well and then hook a laptop up to your television to really do the experience right. That’s the way that M.U.L.E. should be played — gathered around the living room with good friends and the snacks and beverages of your choice. At some point during the evening remember to look around and remind yourself in best beer-commercial fashion that gaming doesn’t get any better than this. And maybe drink a toast to the late, great Dan Bunten while you’re at it.

Update, August 1, 2023: The Dan Bunten of this article began to live as the woman Dani Bunten Berry in 1992; she died in 1998. I knew less about transgenderism at the time that I wrote this article than I do now, and would certainly have written it differently today. Which doesn’t, of course, mean that my handling of it would satisfy everybody. These are complicated issues, balancing fidelity to history against the rights of individuals to determine their own gender identities, potentially even retroactively. As such, reasonable people of good faith can and do disagree about them. For a fairly long-winded description of my current opinions and editorial policy on these matters, thought through in a way they sadly weren’t at the time I wrote this article, see a comment I wrote elsewhere on this site in 2018.

(Sources: Dan wrote a column for Computer Gaming World from the July/August 1982 issue through the September/October 1985 issue. Those are a gold mine for anyone interested in understanding his design process. Particularly wonderful is his detailed history of M.U.L.E.’s development in the April/May 1984 issue. Other interesting articles and interviews were in the June 1984 Compute!’s Gazette, the November 1984 Electronic Games, and the January 1985 Antic. Online, you’ll find a ton of historical information on World of M.U.L.E. Salon also published a good article about him ten years ago. Finally, see the site of the (apparently stalled) remake Alpha Colony for some nice — albeit somewhat buried — historical tidbits. And sorry this article runs so long. M.U.L.E. is… special. I really wanted to do it justice.)

 

Posted on February 12, 2013 in Digital Antiquaria, Interactive Fiction

 


The Pinball Wizard

Bill Budge in Electronic Arts software artist pose

The name of Bill Budge has already come up from time to time on this blog. Mentioning him has been almost unavoidable, for he was one of the titans amongst early Apple II programmers, worshiped for his graphical wizardry by virtually everyone who played games. As you may remember, his name carried such weight that when Richard Garriott was first contacted by Al Remmers of California Pacific to propose that he allow CP to publish Akalabeth, Garriott’s first reaction was a sort of “I’m not worthy” sense of shock at the prospect of sharing a publisher with the great Budge. Now that we’ve arrived at the birth of Electronic Arts and of Budge’s masterpiece, Pinball Construction Set, this seems a good moment to take a step back and look at what made Budge such a star.

Budge was always a tinkerer, always fascinated by the idea of construction sets. As a young kid, he played with blocks, Tinkertoys, and Erector sets. As an older kid, he moved on to fiddling with telescopes and model rockets. (“It’s amazing we didn’t kill ourselves.”) After moving about the country constantly when Budge was younger, his family finally ended up in the San Francisco Bay area by the time Budge began high school in the late 1960s. It was a fortuitous move. With the heart of the burgeoning Silicon Valley easily accessible, Budge’s family had found the perfect spot for a boy who liked to tinker with technology. A teacher at his school named Harriet Hungate started a class in “computer math” soon after Budge arrived. The students wrote their programs out by hand, then sent them off to a local company that had agreed to donate some time on their IBM 1401 mainframe. They then got to learn whether their programs had worked from a printout sent back to the school. It was a primitive way of working, but Budge was immediately smitten. He calls the moment he discovered what a loop is one of the “transcendent moments” in his life. He “just programmed all the time” during his last two years of high school. Hungate was eventually able to finagle a deal with another local business to get a terminal installed at the school with a connection to an HP 2100 machine hosting HP Time-Shared BASIC. Budge spent hours writing computer versions of Tic-tac-toe, checkers, and Go.

But then high school was over. Without the ready access to computers that his high school had afforded him, Budge tried to put his programming behind him. He entered the University of California Santa Cruz as an English major, with vague aspirations toward becoming a novelist. Yet in the end the pull of programming proved too strong. After two years he transferred to Berkeley as a computer-science major. He got his Bachelor’s there, then stayed on to study for a PhD. He was still working on that in late 1978 when the Apple II first entered his life.

As you might expect, the arrival of the Trinity of 1977 had prompted considerable discussion within Berkeley’s computer-science department. Budge dithered for a time about whether to buy one, and if so which one. At last friend and fellow graduate student Andy Hertzfeld convinced him to go with the local product of nearby Apple Computer. It wasn’t an easy decision to make; the Commodore PET and the TRS-80 were both much cheaper (a major consideration for a starving student), and the TRS-80 had a vastly larger installed base of users and much more software available. Still, Budge decided that the Apple II was worth the extra money when he saw the Disk II system and the feature that would make his career, the bitmapped hi-res graphics mode. He spent half of his annual income on an Apple II of his own. It was so precious that he would carefully stow the machine away back in its box, securely swaddled in its original protective plastic, whenever he finished using it.

As he explored the possibilities of his treasure, Budge kept coming back again and again to hi-res mode. He worked to divine everything about how it worked and what he might do with it. His first big programming project became to rewrite much of Steve Wozniak’s original game of Breakout which shipped with every early Apple II. He replaced Woz’s graphics code with his own routines to make the game play faster and smoother, more like its arcade inspiration. When he had taken that as far as he could, he started thinking about writing a game of his own. He was well-acquainted with Pong from a machine installed at the local pizza parlor. Now he recreated the experience on the Apple II. He names “getting my first Pong ball bouncing around on the screen” as another of his life’s transcendent moments: “When I finished my version of Pong, it was kind of a magical moment for me. It was night, and I turned the lights off in my apartment and watched the trailing of the ball on the phosphors of my eighty-dollar black and white TV.” He added a number of optional obstacle layouts to the basic template for variety, then submitted the game, which he named Penny Arcade, to Apple themselves. They agreed to trade him a printer for it, and earmarked it for The Apple Tapes, a cassette full of “introductory programs” to be included with every specimen of the new Apple II Plus model they were about to release. In the manual for the collection they misattributed the game to “Bob Budge,” but it mattered little. Soon enough everyone would know his name.

Penny Arcade

With his very first game shipping with every Apple II Plus, Budge was naturally happy to continue with his new hobby. He started hanging around the local arcades, observing and taking careful notes on the latest games. Then he would go home and clone them. Budge had little interest in playing the games, and even less in playing the role of game designer. For him, the thrill — the real game, if you will — was in finding ways to make his little Apple II produce the same visuals and gameplay as the arcade machines, or at least as close as he could possibly get. In a few years Atari would be suing people for doing what Budge was doing, but right now the software industry was small and obscure enough that he could get away with it.

Budge’s big breakthrough came when a friend of his introduced him to a traveling sales rep named Al Remmers, who went from store to store selling 8″ floppy disk drives. He and Budge made a deal: Remmers would package the games up in Ziploc baggies and sell them to the stores he visited on his treks, and they would split the profits fifty-fifty. Budge was shocked to earn $7000 for the first month, more than his previous annual income. From this relationship was born Remmers’s brief-lived but significant software-publishing company, California Pacific, as well as Budge’s reputation as the dean of Apple II graphics programmers. His games may not have been original, but they looked and played better than just about anything else out there. To help others who dreamed of doing what he did, he packaged some of his routines together as Bill Budge’s 3-D Graphics System. His reputation was such that this package sold almost as well as his games. This was how easily fame and fortune could come to a really hot programmer for a brief window of a few years, when word traveled quickly in a small community aching for more and better software for their machines.

In fact, his reputation soared so quickly that Apple themselves came calling. Budge, who had been putting less and less effort into his studies as his income from his games increased, dropped out of Berkeley to join his old buddy Andy Hertzfeld in Cupertino. He was made — what else? — a graphics specialist working in the ill-fated Apple III division. He ended up spending only about a year at Apple during 1980 and 1981, but two experiences there would have a huge impact on his future work, and by extension on the field of computer gaming.

While Budge was working at Apple much of the engineering team, including Hertzfeld and even Woz himself, were going through a hardcore pinball phase: “They were students of the game, talking about catches, and how to pass the ball from flipper to flipper, and they really got into it.” Flush with cash as they were after the IPO, many at Apple started filling their houses with pinball tables.

Budge’s first pinball game, from Trilogy of Games

Budge didn’t find pinball intrinsically all that much more interesting than he did purely electronic arcade games. Still, one of the first games Budge sold through Remmers had been a simple pinball game, which was later included in his very successful Trilogy of Games package published by California Pacific. Pinball was after all a fairly natural expansion of the simple Pong variants he started with. Now, witnessing the engineers’ enthusiasm led him to consider whether he could do the game better justice, create something on the Apple II that played and felt like real pinball, with the realistic physics that are so key to the game. It was a daunting proposition in some ways, but unusually susceptible to computer simulation in others. A game of pinball is all about physics, with no need to implement an opponent AI. And the action is all centered around that single moving ball while everything else remains relatively static, meaning it should be possible to do on the Apple II despite that machine’s lack of hardware sprites. (This lack made the Apple II less suited for many action games than the likes of the Atari 8-bit computers or even the Atari VCS.) After some months of work at home and after hours, Budge had finished Raster Blaster.
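To give a sense of what that physics simulation entails, here is a minimal sketch in modern Python of the kind of update loop at the heart of any pinball game: gravity accelerates a single ball each frame, and collisions with the table's boundaries reflect its velocity while bleeding off a little energy. This is strictly illustrative — Raster Blaster itself was hand-tuned 6502 assembly, and none of these names or constants come from Budge's actual code.

```python
# Illustrative sketch of per-frame ball physics for a pinball simulation.
# (Hypothetical names and values; not from Raster Blaster's real code.)

GRAVITY = 0.3        # downward acceleration per frame, arbitrary units
RESTITUTION = 0.8    # fraction of speed retained after a bounce

class Ball:
    def __init__(self, x, y, vx=0.0, vy=0.0):
        self.x, self.y = x, y
        self.vx, self.vy = vx, vy

    def step(self, width, height):
        # Integrate: gravity changes velocity, velocity changes position.
        self.vy += GRAVITY
        self.x += self.vx
        self.y += self.vy
        # Bounce off the side walls, losing a little energy each time.
        if self.x < 0 or self.x > width:
            self.vx = -self.vx * RESTITUTION
            self.x = min(max(self.x, 0), width)
        # Bounce off the bottom of the table (the drain, in a real game).
        if self.y > height:
            self.vy = -self.vy * RESTITUTION
            self.y = height

# Launch a ball and simulate 100 frames.
ball = Ball(x=50, y=0, vx=2.0)
for _ in range(100):
    ball.step(width=100, height=150)
```

The appeal for a programmer like Budge is visible even in this toy version: the whole game state is one ball and a handful of arithmetic operations per frame, which is exactly what made convincing pinball feasible on a machine with no hardware sprites.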

Raster Blaster

Raster Blaster was the best thing Budge had yet done — so good that he decided he didn’t want to give it to California Pacific. Budge felt that Remmers wasn’t really doing much for him by this point, just shoveling disks into his homemade-looking packaging, shipping them off to the distributor SoftSel, and collecting 50% of the money that came back. The games practically sold themselves on the basis of Budge’s name, not California Pacific’s. Budge was a deeply conflict-averse personality, but his father pushed him to cut his ties with California Pacific, to go out on his own and thereby double his potential earnings. And anyway, he was getting bored in his job at Apple. So he quit, and along with his sister formed BudgeCo. He would write the games, just as he always had, and she would handle the business side of things. Raster Blaster got BudgeCo off the ground in fine form. It garnered rave reviews, and became a huge hit in the rapidly growing Apple II community, Budge’s biggest game yet by far. Small wonder — it was the first computer pinball game that actually felt like pinball, and also one of the most graphically impressive games yet seen on the Apple II.

But next came the question of what to do for a follow-up. It was now 1982, and it was no longer legally advisable to blatantly clone established arcade games. Budge struggled for weeks to come up with an idea for an original game, but he got nowhere. Not only did he have no innate talent for game design, he had no real interest in it either. Out of this frustration came the brilliant conceptual leap that would make his legacy.

Above I mentioned that two aspects of Budge’s brief time at Apple would be important. The other was the Lisa project. Budge did not directly work on or with the Lisa team, but he was fascinated by their work, and observed their progress closely. Like any good computer-science graduate student, he had been aware of the work going on at Xerox PARC. Yet he had known the Alto’s interface only as a set of ideas and presentations. When he could actually play with a real GUI on the Lisa prototypes, it made a strong impression. Now it provided a way out of his creative dilemma. He was uninterested in games and game design; what interested him was the technology used to make games. Therefore, why not give people who actually did want to become designers a set of tools to let them do that? Since these people might be no more interested in programming than he was in design, he would not just give them a library of code like the old 3-D Graphics System he had published through California Pacific. No, he would give them a visual design tool to make their own pinball tables, with a GUI inspired by the work of the Lisa team.

The original Pinball Construction Set box art, featuring pieces of the pinball machine that Budge disassembled to plan the program

Budge had resisted buying a pinball table of his own while at Apple, but now he bought a used model from a local thrift shop. He took it apart carefully, cataloging the pieces that made up the playfield. Just as the Lisa’s interface used a desktop as its metaphor, his program would let the user build a pinball machine from a bin of iconographic spare parts. The project was hugely more ambitious than anything he had tackled before, even if some of the components, such as a simple paint program that let the user customize the look of her table, he had already written for his personal use in developing Raster Blaster. Budge was determined to give his would-be creator as much scope as he possibly could. That meant fifteen different components that she could drag and drop anywhere on the surface of her table. It meant letting her alter gravity or the other laws of physics if she liked. It meant letting her make custom scoring combinations, so that bumping this followed by that gave double points. And, because every creator wants to share her work, it meant letting the user save her custom table as a separate program that her friends could load and play just like they did Budge’s own Raster Blaster. That Budge accomplished all of this, and in just 48 K of memory, makes Pinball Construction Set one of the great feats of Apple II programming. None other than Steve Wozniak has called it “the greatest program ever written for an 8-bit machine.”
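The key design idea in the paragraph above — a table is just data describing part placements, physics settings, and scoring rules, which a generic engine can load and run — can be sketched in a few lines of modern Python. Every name and field here is invented for illustration; PCS's real saved tables were, of course, a custom binary format on an Apple II disk, not JSON.

```python
# Hypothetical sketch of the "table as data" idea behind a construction
# set: the player's creation is a plain data structure, and "saving" it
# is just serialization. (Invented format; not PCS's actual file layout.)
import json

table = {
    "gravity": 0.3,            # user-adjustable physics
    "elasticity": 0.8,
    "parts": [                 # drag-and-drop part placements
        {"kind": "flipper", "x": 40, "y": 150, "side": "left"},
        {"kind": "bumper",  "x": 60, "y": 40,  "points": 100},
        {"kind": "bumper",  "x": 80, "y": 60,  "points": 100},
    ],
    # Custom scoring combos: hit these parts in order for a bonus.
    "combos": [{"sequence": [1, 2], "multiplier": 2}],
}

# Saving and sharing a finished table is just writing the data out;
# any copy of the player engine can reload it and play.
saved = json.dumps(table)
loaded = json.loads(saved)
assert loaded == table
```

Seen this way, the genuinely hard part of PCS was not the data model but squeezing the editor, the physics engine, the paint program, and the save-as-runnable-game machinery together into 48 K.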

Pinball Construction Set

Amazing as it was, when BudgeCo released Pinball Construction Set at the end of 1982 its sales were disappointing. It garnered nowhere near the attention of Raster Blaster. The software industry had changed dramatically over the previous year. A tiny operation like BudgeCo could no longer just put a game out — even a great, groundbreaking game like PCS — and wait for sales. It was getting more expensive to advertise, harder to get reviews and get noticed in general. Yet when Trip Hawkins came to him a few months later asking to re-release PCS through his new publisher Electronic Arts, Budge was reluctant, nervous of the slick young Hawkins and his slick young company. But Hawkins just wouldn’t take no for an answer; he said he would make Budge and his program stars, said that only he could find PCS the audience its brilliance deserved — and he offered one hell of a nice advance and royalty rate to boot. And EA did have Woz himself on the board of directors, and Woz said he thought signing up would be a smart move. Budge agreed at last; thus BudgeCo passed into history less than two years after its formation.

As good as PCS was, it’s very possible that Hawkins had another, ulterior motive in pursuing Budge with such vigor. To understand why, we need to understand something about what Budge was like personally. Given the résumé I’ve been outlining — spent his best years of high school poring over computer code; regarded his Apple II as his most precious possession; had his most transcendent moments programming it; etc. — you’ve probably already formulated a shorthand picture. If the Budge of that picture is, shall we say, a little bit on the nerdy, introverted side, you can be forgiven. The thing was, however, the real Budge was nothing like what you might expect; as he himself put it, he “didn’t quite fit the mold.” He had a tall, rangy build and handsome features beneath a luxurious head of hair, with striking eyes that a teenage girl might call dreamy. At 29 (although he looked perhaps 22), he was comfortable in his own skin in a way that some people never manage, with an easy grace about him that made others as glad to talk to him as they were to listen. His overall persona smacked more of enlightened California beach bum than hardcore programmer. And he took a great picture. If there was one person amongst Hawkins’s initial crew of developers who could actually pull off the rock star/software artist role, it was Budge; he might even attract a groupie or two. He was a dream come true for the PR firm Hawkins had inherited from his time at Apple, Regis McKenna, Inc. Thus the EA version of PCS was designed to look even more like a contemporary rock album than any of the other games. The name of Bill Budge, the man EA planned to make their very own rock star, was far larger on it than the name of his game.

EA’s version of Pinball Construction Set

The down-to-earth Budge himself was rather bemused by EA’s approach, but he shrugged his shoulders and went along with it in his usual easygoing manner. When EA arranged for rock photographer Norman Seeff to do the famous “software artists” photo shoot, they asked that the subjects all wear appropriately bohemian dark clothing to the set. Budge went one better: he showed up with a single studded leather glove he’d bought for dressing up as a punk rocker for a party thrown by the Apple Macintosh team. He brought it simply as a joke, a bit of fun poked at all this rock-star noise. Imagine, then, how shocked he was when Seeff and the others demanded that he actually wear it. Thus Budge in his leather glove became the standout figure from that iconic image. As he later sheepishly admitted, “That’s not really me.” Soon after he got a software-artist photo shoot and advertisement all to himself, filled with vague profundities that may or may not have actually passed his lips beforehand. (“Programming for a microcomputer is like writing a poem using a 600-word vocabulary.”)

EA booked Budge into every gig they could find for him. He did a lengthy one-on-one interview with Charlie Rose for CBS News Nightwatch (“He knew absolutely nothing. He seemed like your typical blow-dried guy without a lot of substance. But I guess I was wrong about him.”); he demonstrated PCS alongside Hawkins on the influential show Computer Chronicles; he featured in a big segment on Japanese television, at a time when that country’s own designers were toiling in obscurity for their parent corporations; he had his photo on the cover of The Wall Street Journal; he was featured alongside the likes of Steve Jobs in an Esquire article on visionaries under the age of forty.

With his album out and the photo shoots done and the promotional spots lined up, it still remained for EA’s rock star to hit the road — to tour. If the highs just described were pretty intoxicating for a computer-game programmer, this part of the process kept him most assuredly grounded. Budge and EA’s Bing Gordon went on a series of what were billed as “Software Artists Tours,” sometimes accompanied by other designers, sometimes alone. The idea was something like a book tour, a chance to sign autographs and meet the adoring fans. Determined to break beyond the ghetto of traditional computer culture, EA booked them not only into computer stores but also into places like Macy’s in New York City, where they were greeted with confusion and bemusement. Even the computer stores sometimes seemed surprised to see them. Whether because of communications problems or flat disinterest, actual fans were often rare or nonexistent at the events. Hawkins’s dream of hordes of fans clutching their EA albums, fighting for an autograph… well, it didn’t happen, even though PCS became a major hit in its new EA duds (it would eventually sell over 300,000 copies across all platforms, a huge figure in those days). Often there seemed to be more press people eager to score an interview than actual fans at the appearances, and often the stores themselves didn’t quite know what to do with their software artists. One manager first demanded that Budge buy himself a new outfit (he was under-dressed in the manager’s opinion to be “working” in his store), then asked him if he could make himself useful by going behind the register and ringing up some customers. “That’s when I realized maybe I wouldn’t be a rock star,” a laconic Budge later said.

Budge wasn’t idle in the down-times between PR junkets. Privileged with one of the few Macintosh prototypes allowed outside of Apple, he used its bundled MacPaint application as the model for MousePaint, a paint program that Apple bundled with the first mouse for the Apple II. He also ported PCS to the Mac. Still, the fans and press were expecting something big, something as revolutionary as PCS itself had been — and small wonder, given the way that EA had hyped him as a visionary.

One of the most gratifying aspects of PCS had been the unexpected things people found to do with it, things that often had little obvious relationship to the game of pinball. Children loved to fill the entire space with bumpers, then watch the ball bouncing about among them like a piece of multimedia art. Others liked to just use the program’s painting subsections to make pictures, scattering the ostensible pinball components here and there not for their practical functions but for aesthetic purposes. If people could make such creative use of a pinball kit, what might they do with something more generalized? As uninterested as ever in designing a game in the traditional sense, Budge began to think about how he could take the concept of the construction set to the next step. He imagined a Construction Set Construction Set, a completely visual programming environment that would let the user build anything she liked — something like ThingLab, an older and admittedly somewhat obtuse stab at the idea that existed at Xerox PARC. His ideas about Construction Set Construction Set were, to say the least, ambitious:

“I could build anything from Pac-Man to Missile Command to a very, very powerful programming language. It’s the kind of a program that has a very wide application. A physics teacher, for example, could build all kinds of simulations, of little micro-worlds, set up different labs and provide dynamic little worlds that aren’t really videogames.”

It turned out to be a bridge too far. Budge tinkered with the idea for a couple of years, but never could figure out how to begin to really implement it. (Nor has anyone else in the years since.) In fact, he never made a proper follow-up to PCS at all. Ironically, Budge, EA’s software artist who best looked the part, was one of the least able to play the role in the long term. As becomes clear upon reading any interview with Budge, old or new, his passion is not for games; it’s for code. In the early days of computer gaming the very different disciplines of programming and game design had been conflated into one due to the fact that most of the people who owned computers and were interested in making games for them were programmers. During the mid-1980s, however, the two roles began to pull apart as the people who used computers and the way games were developed changed. Budge fell smack into the chasm that opened up in the middle. Lauded as a brilliant designer, he was in reality a brilliant programmer. People expected from him something he didn’t quite know how to give them, although he tried mightily with his Construction Set Construction Set idea.

Budge at home in early 1985, the beginning of his “years in the wilderness”

So, he finally just gave up. After 1984 the interviews and appearances and celebrity petered out. His continuing royalties from PCS made work unnecessary for some years, so he all but gave up programming, spending most of his time wind-surfing instead (a sport that Bing Gordon, perhaps to his regret, had taught him). Most people would have a problem going back to obscurity after appearing on television and in newspaper features and even having their own magazine column (in Softalk), but it seemed to affect Budge not at all: “I’m kind of glad when I don’t have anything new out and people forget about me.” Eventually EA quietly accepted that they weren’t going to get another game from him and quit calling. Budge refers to this period as his “years in the wilderness.” By 1990 the name of Bill Budge, such a superstar in his day, came up only when the old-timers started asking each other, “Whatever happened to….?”

In the early 1990s, Budge, now married and more settled, decided to return to the games industry, working first on yet another pinball game, Virtual Pinball for the Sega Genesis console. Without the pressure of star billing to live up to, and with a more mature industry that had a place for his talents as a pure programmer’s programmer, he decided to continue his career at last. He’s remained in the industry ever since, unknown to the public but respected immensely by his peers within the companies for which he’s worked. For Budge, one of those people who has a sort of innate genius for taking life as it comes, that seems more than enough. Appropriately enough, he’s spent most of his revived career as what’s known as a tools programmer, making utilities that others then use to make actual games. In that sense his career, bizarre as its trajectory has been, does have a certain consistency.

PCS, his one towering achievement as a software artist, deserves to be remembered for at least a couple of reasons. First of all there is of course its status as the first really elegant tool to let anyone make a game she could be proud of. It spawned a whole swathe of other “construction set” software, from EA and others, all aimed at fostering this most creative sort of play. That’s a beautiful legacy to have. Yet its historical importance is greater than even that would imply. PCS represents to my knowledge the first application of the ideas that began at Xerox PARC to an ordinary piece of software which ordinary people could buy at an ordinary price and run on ordinary computers. It proved that you didn’t need an expensive workstation-class machine like the Apple Lisa to make friendlier, more intuitive software; you could do it on a 48 K Apple II. No mouse available? Don’t let that stop you; use a joystick or a set of paddles or just the arrow keys. Thousands and thousands of people first saw a GUI interface in the form of Pinball Construction Set. Just as significantly, thousands of other designers saw its elegance and started implementing similar interfaces in their own games. The floating, disembodied hand of PCS, so odd when the game first appeared, would be seen everywhere in games within a couple of years of its release. And game manuals soon wouldn’t need to carefully define “icon,” as the PCS manual did. PCS is a surprising legacy for the Lisa project to have; certainly the likes of it weren’t anything anyone involved with Lisa was expecting or aiming for. But sometimes legacies are like that.

Next time we’ll look at another of those seminal early EA games. If you’d like to try to make something of your own in the meantime, here’s the Apple II disk image and manual for Pinball Construction Set.

(What with his celebrity in Apple II circles between 1979 and 1985, there’s a lot of good information on Budge available in primary-source documents from the period. In particular, see the November 1982 Softline, the December 1985 Compute!’s Gazette, the March 1985 Electronic Games, the March 1985 Enter, the September 1984 Creative Computing, and Budge’s own column in later issues of Softalk. Budge is also featured in Halcyon Days, and Wired magazine published a retrospective on his career when he was given a Pioneer Award by the Academy of Interactive Arts and Sciences in 2011. Budge’s interview at the awards ceremony was videotaped and is available for viewing online.)

 
 
