
The Road to V

Ultima V

It’s not easy having a software superstar for a little brother. That’s something that Robert Garriott, president of Origin Systems, had more and more cause to realize as the 1980s wore on. Whilst Richard Garriott quite literally lived out his fantasies, it was Robert who was left to deal with all the mundanities of running a small game developer in an industry that was becoming an ever more precarious place. Whilst Richard wrote the games and gave all the interviews and reveled in his Lord British persona, it was Robert who dealt with the sort of people who might not be terribly impressed by a wispy 25-year-old who liked to affect the personality and the dress code of a Medieval monarch. It was Robert who negotiated the business deals, Robert who represented Origin’s interests with the Software Publishers Association, Robert who put a sober, businesslike face on a company that to a lot of outsiders looked like little more than a bunch of nerds with too much time on their hands. And sometimes it was Robert who found himself trapped between the practicalities of running a business and the desires of a famous younger brother who was just slightly full of himself — what young man wouldn’t be slightly full of himself in his situation? — and was used to having things his own way.

Honestly, now... would you feel comfortable investing in a company run by this guy?


The most dangerous of these conflicts was the great sibling squabble over just where Origin Systems should be located. Back at the end of 1983, you may remember, Robert had been able to convince Richard to move the company from their parents’ garage in Houston, Texas, up to New Hampshire, where his wife Marcy had found a fine position of her own working for Bell Labs. The deal was that they would remain there for at least three years. Robert, who had spent the months before the move commuting cross-country in his private plane, hoped that during the three years something might change: Marcy might get a transfer, or Richard might decide he actually liked New England and wanted to stay there. Well, at the end of 1986 the three years were up, and neither of those things had happened. Richard, who persists to this day in describing his exile in the “frozen wasteland” of New Hampshire in terms lifted straight out of Ethan Frome, figured he had fulfilled his side of the deal, had done his three years as he’d said he would. Now he wanted to move. And he knew exactly where he wanted to move to: back to warm, sunny Austin, the city that had felt like the only place he wanted to make his home almost from the day he arrived to attend university there back in 1979.

A deal being a deal notwithstanding, Robert tried to nix the move, at least for the time being. In addition to his own marriage — he and Marcy certainly didn’t relish going back to commuting cross-country — there were the other Origin employees to think about. Sure, most of the technical staff remained the same group of youngsters that had trooped up north with the Garriotts three years before; they were almost one and all in agreement with Richard that it was time to be southbound again. But there were also the support personnel to think of, New Englanders hired in New England who had been doing good work for the company for quite some time. Robert proposed that they put Origin’s future location to a simple company-wide vote.

That proposal really pissed Richard off. New Englanders now well outnumbered Texas transplants, meaning the outcome of any vote must be a foreordained conclusion — which was, Richard believed, exactly why Robert was asking for one. The two had screaming rows that spilled out of their offices into the hallways of a suddenly very tense suite of offices, while the occupants of those offices, northerners and southerners crammed together under one roof for years, now felt free to let loose on each other with all of the frustrations they’d been keeping under wraps for so long. It was civil war — the staid New Englanders who were loyal to Robert against Lord British’s merry band of anarchists. In a fit of pique and homesickness, Richard’s right-hand man Chuck Bueche, music composer and programmer for the Ultima games, porting expert, and designer of games in his own right, announced he’d had enough and headed for Texas on his own. Richard and Robert each threatened to break with the other, to do his own thing with his own splinter of the company.

Such threats were ridiculous. Richard and his crew were no more capable of taking full responsibility for a company than Robert and his were of writing the next Ultima. These two needed each other for more reasons than just the ties of blood. It was finally left to older and cooler heads, in the form of the brothers’ parents, to broker a compromise. Richard would move back to Austin with most of the technical team, to set up a small studio there that would make the games; Robert would remain in New Hampshire with Marcy, a couple of programmers working on ports and ancillary projects, and the larger support staff that was responsible for packaging and marketing the games and running the business as a whole.

Thus Richard and company, reunited again with Bueche, found themselves a minimalist office in Austin in early 1987, fifteen desks ranged along a single long hallway. And Richard himself, now becoming a very wealthy young man indeed thanks to the huge success of Ultima III and IV, started work on Britannia Manor, a custom-built house-cum-castle worthy of Lord British; it came complete with secret passageways, a cave, a wine cellar, and a stellar observatory. It was pretty clear he wasn’t planning to go anywhere else anytime soon.

Carried out though it was for very personal reasons, Richard’s return to Austin would prove the single best business move Origin ever made. Eastern Texas may not have had as sexy a reputation as Silicon Valley, but there was plenty of high technology in the environs of Dallas, Houston, and Austin, along with a booming economy and low taxes to boot. Austin itself, in addition to being home to a prestigious university boasting almost 50,000 students of diverse talents, was something of the cultural as well as government capital of the state. Along with a lively music scene and tattoo parlors and all the other attributes of a thriving college town, Weird Austin boasted a diverse tapestry of nerdier culture, including Richard’s beloved SCA chapter and the hugely influential tabletop-game publisher Steve Jackson Games. What Austin, and Texas in general for that matter, rather oddly lacked was any notable presence in the computer-games industry. Richard himself was shocked at the hungry talent that washed up unbidden at Origin’s doorstep almost as soon as they hung their shingle, all eager to work for the house that Ultima had built. “Austin as a location was fundamental to the success of Origin,” remembers Richard, “because there was so much talent here in this town.” The atmosphere inside Origin’s new Austin office was soon so exciting, so positively bursting with possibility, that Robert had to admit defeat. More and more of Origin’s operations steadily moved south. Within a couple of years, Robert would convince Marcy to make the move with him, and Origin’s operations in New Hampshire would come to an end.

But hardly was the great Texas/New Hampshire crisis resolved than another raised its head. This time the dispute wasn’t intra-family or even intra-company. It rather involved Electronic Arts, a much bigger publisher with which little Origin would have quite the love-hate relationship over the years.

The origin of Origin’s EA problem dated back to August of 1985, about a month before the release of Ultima IV. By this point distribution was starting to become a real issue for a little publisher like Origin, as the few really big publishers, small enough in number to count on one hand, were taking advantage of their size and clout to squeeze the little guys off of store shelves. Knowing he had a hugely anticipated game on his hands with Ultima IV, one that with the proper care and handling should easily exceed the considerable-in-its-own-right success of Ultima III, Robert also knew he needed excellent distribution to realize its potential. He therefore turned to EA, one of the biggest of the big boys of the industry.

The agreement that resulted was quite the coup for EA as well as Origin. Thanks to it, they would enjoy a big share of the profits not just from The Bard’s Tale, the hit CRPG they had just released under their own imprint, but also from Origin’s Ultima IV. Together these two games came to dominate the CRPG field of the mid-1980s, each selling well over 200,000 copies. For a company that had never had much of anything to do with this genre of games before, it made for one hell of a double whammy to start things off.

While it’s been vaguely understood for years that Origin and EA had a mid-1980s distribution agreement that broke down in discord, the details have never been aired. I’m happy to say that I can shed a lot more light on just what happened thanks to documents housed in the Strong Museum of Play’s collection of Brøderbund papers. (The reason I was able to find them in a Brøderbund archive will become clear shortly.) I unfortunately can’t make these documents publicly available, but I can summarize and quote extracts from them. I do want to look at the contract that EA and Origin signed and the dispute that would eventually result from it in some detail, both because it’s so very illustrative of how the industry was changing as it entered the second half of the 1980s and because it provides a great example of one of the most dangerous of the potential traps that awaited the small fry who still tried to survive as independents. Origin would escape the trap, but many another small publisher/developer would not.

At first glance the distribution contract might seem more generous to Origin than to EA. Origin is obligated to remain the distributee only as long as EA has bought product from them totaling a stipulated amount over the course of a rolling calendar. By the end of the contract’s first year, which comes on September 1, 1986, EA must have bought $3.3 million worth of Origin games. The goal for the second year of the contract doubles; EA must have bought games worth $9.3 million in total from Origin by September 1, 1987, in order for the latter company to be obligated to honor the third and final year of their distribution contract. That’s a very ambitious sales goal for a little company like Origin whose entire reason for existence was a single series of games with a sporadic release schedule. (Origin had already released some non-Ultima titles and would continue to do so, but it would be years yet before any of them would make an impact on their bottom line to even begin to rival that of Ultima.)

All went well between Origin and EA for the first eighteen months. The trouble started shortly after Richard’s move back to Austin, when he got word of EA’s plans to release a rather undistinguished CRPG called Deathlord that was even more derivative of Ultima than was the norm. As Strategic Simulations, Incorporated, had learned to their chagrin a few years earlier in the case of their own Ultima clone Questron, Richard didn’t take kindly to games that copied his own work too blatantly. When EA refused to nix their game, and also proved uninterested in negotiating to license the “game structure and style” as SSI had done, Richard was incensed enough to blow up the whole distribution deal.

Richard and Robert believed that Origin would be on firm legal ground in withdrawing from the distribution agreement at the onset of the third year because EA was projected to have purchased just $6.6 million worth of product from Origin by September 1, 1987, way short of the goal of $9.3 million. Origin informed EA of their intentions and commenced negotiating a new distribution agreement with another of the big boys, Brøderbund, currently riding even higher than EA on the strength of The Print Shop and Carmen Sandiego.

The notice was greeted with shock and outrage by EA, who felt, and by no means entirely without reason, that it was hardly their fault that they were so far from the goal. That goal had been predicated on not just one but two or three or possibly even four new Ultima games being released during those first two years. Foreshadowing the way that Origin would handle Ultima VII years later, Richard’s plan at the time the contract was signed had been to release an Ultima IV Part 2 that would reuse the same engine in relatively short order, and only then to turn to Ultima V. But those plans had fallen by the wayside, undone by Richard’s idealistic need to make each Ultima clearly, comprehensively better than its predecessor. And now Ultima V was taking even longer than had Ultima IV. Having long since missed the original target of Christmas 1986, it now looked almost certain to miss Christmas 1987 as well; it still looked to be a good six months away from release as of mid-1987.

Yet it was the Ultima I situation that most ruffled EA’s feathers. When the rights to the first game of the series, having passed through the hands of the long-defunct California Pacific and then Sierra, reverted back to Richard in 1986, Origin assigned several programmers to rewrite it from scratch in assembly language rather than BASIC, adding graphical upgrades and interface enhancements along the way to bring it at least nominally up to date. Already a semi-legendary game, long out of print on the Apple II and never before available at all on the Commodore 64 or MS-DOS, the new and improved Ultima I carried with it reasonably high commercial hopes. While not the new Ultima, it was a new Ultima for the vast majority of Lord British fans, and should ease some of the disappointment of not being able to get Ultima V out that year. But in the wake of the Deathlord dust-up it became clear to EA that Origin was deliberately holding Ultima I back, wanting to tempt their prospective next distributor with it rather than give EA their fair share of its earnings. This… well, this pissed EA right the hell off. And, then as now, pissing off EA wasn’t usually a very good idea.

EA’s lawyers went through the contract carefully, looking for anywhere where they might knock a few dollars off the requirement of $9.3 million in orders inside two years.

The original goal for 9/1/87 was stated in Exhibit A as $9,300,000. This amount “is reduced by $40,000 for every month in which any of the software products listed in Exhibit B are not available according to the schedules set forth in Exhibit B.” Moebius/Apple was listed as being available in September 1985, and was not available until November 1985, a slip of two months. Ogre/Apple was listed as being available in November 1985 and was not available until June 1986, a slip of seven months. Moebius/C64 was listed as being available in November 1985 and was not available until October 1986, a slip of eleven months. Taking into account only those titles listed in Exhibit B, a total of 22 months are applicable to the $40,000 provision, equaling a deduction of $880,000 from the $9,300,000 goal mentioned earlier, leaving a net goal of $8,420,000 for 9/1/87.

The adjusted goal of $8.4 million still left EA $1.8 million short. No problem. They attached to the same letter a purchase order for a random hodgepodge of Origin products totaling the full $1.8 million. EA didn’t really care what Origin shipped them, as long as they billed them $1.8 million for it: “If Origin is unable to ship any of the products in the quantities stated on the purchase orders, please consider this an order for a similar dollar volume of any of your products that can be shipped in sufficient quantities to meet our 9/1/87 objectives.”
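The arithmetic EA's lawyers performed can be verified with a few lines of code. This is a sketch only; the dollar amounts are the figures quoted above from EA's letter and the contract, and the 22-month total of schedule slippage is the figure EA's letter itself claims.

```python
# Sanity-checking the figures quoted from EA's letter. All amounts
# come from the documents quoted in the article; nothing here goes
# beyond what the letter itself states.

ORIGINAL_GOAL = 9_300_000     # Exhibit A goal for 9/1/87
PENALTY_PER_MONTH = 40_000    # deduction per month a title slipped
MONTHS_LATE = 22              # total slippage claimed in EA's letter

adjusted_goal = ORIGINAL_GOAL - PENALTY_PER_MONTH * MONTHS_LATE
print(adjusted_goal)          # the letter's "net goal" of $8,420,000

purchased_by_target = 6_600_000   # projected EA purchases by 9/1/87
shortfall = adjusted_goal - purchased_by_target
print(shortfall)              # the roughly $1.8 million purchase order
```

The shortfall of $1.82 million is exactly the size of the "random hodgepodge" purchase order EA attached to its letter, which is what made the maneuver so pointed.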

You’re probably wondering what on earth EA is thinking in throwing away almost $2 million on any old anything at all just to retain Origin as a distributee. Far from cutting off their nose to spite their face, they’re playing hardball here; what they’ve just done is far more dangerous for Origin than it is for them. To understand why requires an understanding of “overstock adjustments,” better known as returns. It’s right there in the original contract: “Vendor [Origin] agrees to issue credit to EA based on the original purchase price for the return of resalable overstock made any time beyond 90 days of original receipt.” This provision gives EA the ability to crush Origin, accidentally or on purpose, by over-ordering. Origin can honor the order, only to have it all come back to them along with a bill big enough to bury them when EA doesn’t sell it on. Or Origin can refuse to honor the order and get buried under a nasty breach-of-contract lawsuit. Or they can come back to EA hat in hand and ask nicely if both parties can just forget the whole thing ever happened and continue that third year of their agreement as was once planned.

Many small publishers like Origin were becoming more and more angry and/or terrified by the logistics of distribution by the latter half of the 1980s. This is why. Nevertheless, with the big publishers squeezing out any other means of getting their games onto store shelves, most of the small companies were forced to get in bed with one of the big boys against their better judgment. Although several other big publishers had affiliate distribution programs, Activision and EA became the most aggressive of the bunch, both in recruiting and, if things didn’t work out, destroying affiliated labels by returning hundreds of thousands or millions of dollars worth of product along with a bill for same. The battlefield of the industry’s history is fairly littered with the corpses of companies who signed distribution deals much like Origin’s with EA.

Origin, however, was lucky. In rushing to become a distributee of Brøderbund, they’d found shelter with a company with the resources to go toe-to-toe with EA; Doug Carlston, founder and president of Brøderbund, was himself a lawyer. Brøderbund took Origin’s cause as their own, and a settlement agreement presumably entailing the payment of some sort of penalty from Origin and/or Brøderbund to EA was reached in fairly short order. (The actual settlement agreement is unfortunately not included in the Strong’s collection.) Origin signed a two-year distribution contract with Brøderbund, and all of EA’s worst suspicions were confirmed when the revamped Ultima I shipped on the very first day of the new agreement. And that wasn’t even Origin’s last laugh: Deathlord, the match that had lit the whole powder keg, got mediocre reviews and flopped. True to his tradition of adding references to his contemporary personal life into each Ultima, Richard added the words “Electronic Arts” to the in-progress Ultima V’s list of forbidden swear words (“With language like that, how didst thou become an Avatar?”) and added unflattering caricatures of EA’s president named “Hawkins” or “Pirt Snikwah” to the next few Ultimas. Just for good measure, he also built a mausoleum for “Pirt Snikwah” on the grounds of Britannia Manor. Like most monarchs, Lord British apparently didn’t forget a slight quickly.

The Garriotts were still living charmed lives. Much as so many love to romanticize Trip Hawkins’s “electronic artists” of the 1980s, complete with crying computers and all the rest, EA has always been a rough customer when it gets down to the brass tacks (knuckles?) of doing business. Few others have tangled with them like Origin did and lived to tell the tale.

Behind all this drama there lurked always the real point of the whole endeavor that was Origin Systems: Ultima, specifically Ultima V. Just like all the other games in the series, it was well on the way to dwarfing its predecessor in terms of scale and technical ambition, with all the birthing pains that must imply.

Beginnings and endings can be tricky things for an historian to come to grips with. Certainly the middle period of the eventual nine-game Ultima series is full of them. There’s the beginning marked by the great conceptual leap that is Ultima IV, when the series became about more than killing monsters, collecting loot, and leveling up — a leap that changed the series’s character to such an extent that plenty of fans will tell you that you needn’t even bother with anything that came before, that the real Ultima starts right here. And there’s the ending that is Ultima VI, the first Ultima not built on the code base of its predecessor, the first not developed and released first and foremost for the Apple II, the first for which Richard did none of the programming.

In between the two lies Ultima V, a crossroads game if ever there was one. It marks the end of the line for the 8-bit Ultimas, the basic structure that began with Akalabeth pushed to a complex extreme that would have been unthinkable back in 1980. How extraordinary to think that this game runs on essentially the same computer as Akalabeth, plus only 16 K of memory here or an extra disk drive there. The series’s glorious last hurrah on the Apple II, it also marks the beginning of a radically different development methodology that would carry forward into the era of the MS-DOS-developed Ultimas. Starting with Ultima V, new Ultimas would no longer be the end result of Richard Garriott toiling alone in front of a single Apple II for months or years until he emerged with bleary eyes and disk in hand. From now on, Richard would direct, design, and supervise, while other people did most of the grunt work.

It was an obviously necessary step from the perspective of even the most minimally informed outsider. Ultima IV had taken him two years, twice as long as originally planned, and had nearly killed him in the process. If the series was to continue to grow in scale and ambition, as he himself always demanded it should, something had to give. Yet Richard resisted the obvious for quite some time. He struggled alone, first with the abortive Ultima IV Part 2 and then with Ultima V, for almost a year while everyone fretted at the lack of progress. He genuinely loved programming, took pride in knowing each new Ultima was truly his personal expression, top to bottom. But at last he accepted that he needed help — an acceptance that would change everything about the way that Ultimas got made forevermore.

The process started with two new programmers, Steve Meuse and John Miles. The former started writing tools to make it easier to create the world, to put a friendly interface on all of the tasks that Richard normally managed by hand using nothing more than a hex editor. Meuse’s “Ultima Creation Package” would grow into something that, according to Richard, “almost anyone could use.” Meanwhile Miles took over most of the actual game-programming tasks from Richard; more than half of the code that shipped in the finished game would be his. “The transition of doing it all yourself to doing it as a team was very painful,” Richard says of this landmark change of late 1986 that marked the abrupt end of his days as a working programmer. “However, once you had a team in place, and especially once you were no longer sharing the duties of both doing it and managing it, the pain went away.”

Richard’s team only continued to expand after the move to Austin, as all of that pent-up Texas talent began arriving on Origin’s doorstep. The finished game credits no fewer than six programmers in addition to Richard himself. With so many more people involved, this Ultima needed a project manager — the role also commonly referred to as “producer” — for the first time as well. That role went to Dallas Snell, late of Penguin Software, who, nobody being too specialized yet at this stage, did some of the programming as well. Snell lobbied for months for the hiring of a full-time artist, but Richard remained skeptical of the need for one until quite some time after the move to Austin. But at last Denis Loubet, an Austin artist who had been doing cover art for Richard’s games since the days of Akalabeth, joined the Origin staff to do all of the art for Ultima V, whether the media be paper or cardboard or pixels. Loubet’s work, blessedly free of the chainmail bikinis and other cheesecake tendencies that make most vintage CRPG art so cringe-worthy, would now become even more integral to the series, helping to maintain its aura of having just a little more class than the standard CRPG fare. Finally, and also largely thanks to Snell’s determination to professionalize the process of making Ultimas, there are fourteen people — fourteen! — credited solely for play-testing Ultima V, more than enough to ensure that there wouldn’t be any more blatant screw-ups like the vital clue that was left out of Ultima IV.

Denis Loubet on the job at Origin.


Freed from the pressure of programming, Richard could make Ultima V a much more consciously designed game than its predecessors. From an interview conducted almost a year before the game was published:

In previous Ultimas the combat systems were not designed out on paper ahead of time. I kind of ranked weapons in order of strength… the higher up the list of weapons you got, the better the weapon. Now I’ve actually designed an entire gaming system, including magic and combat, that is just as good to play on paper as on the computer. It’s extremely well-balanced, both [sic.] the weapons, armor, and magic, and we’ve been balancing the costs and uses of those things for six months — essentially by playing Ultima on paper.

Origin was so proud of this system of rules that they planned for some time to make an Ultima tabletop RPG out of them. That project fell by the wayside, but just the fact that Richard was thinking this way represented a huge step forward for a series whose mechanics had always felt ad hoc in comparison to those of its original rival, Wizardry. “I can tell you in numbers the probabilities of your being able to do something,” said Richard, “whereas in previous Ultimas I probably wouldn’t be able to do so. I just kind of did it until it looked right.”

While all of the extra care and thought that was going into this Ultima was welcome, it was also time-consuming. A series of release dates spouted by an overoptimistic Richard in interview after interview fell by the wayside, as subscribers to adventurer-catering magazines like Questbusters read for a year and a half of a game that was perpetually just a few months away. Still, the game they kept reading about sounded better with every mention: it would fill no less than eight Apple II disk sides; it would offer twice as much territory as Ultima IV to explore; each non-player character would have three times as much to say; non-player characters would have realistic day-and-night schedules that they followed; just about every single thing in the world, from table and chairs to torches and even a harpsichord, would be a discrete, manipulable object.

An early public preview of Ultima V at Dragon Con, October 1987.


More philosophically-minded fans wondered about a subject on which there was less concrete information available: what would the new Ultima be about? After the great conceptual leap that had been Ultima IV, would Lord British be content to return to monster-killing and evil-wizard-bashing, or would there be another — or perhaps the same? — message on offer this time out?

All of their questions were answered on March 18, 1988, when Origin released Ultima V: Warriors of Destiny for the Apple II; versions for MS-DOS and the Commodore 64 followed in July and October respectively, with ports to a handful of other platforms trickling out over the following year or so. We’ll dive into the virtual world that awaited Ultima V‘s army of 200,000-plus eager buyers next time.

(Sources for this article and the next: Questbusters of June 1987, July 1987, August 1987, March 1988, July 1988; Game Developer of September 1994; Computer Gaming World of March 1986, December 1987, July 1988, January 1989, November 1991, November 1992. The books The Official Book of Ultima by Shay Addams; Dungeons and Dreamers by Brad King and John Borland; Ultima: The Avatar Adventures by Rusel DeMaria and Caroline Spector. See also Richard Garriott’s extended interview with Warren Spector. And of course the Strong’s collection; my thanks to Jon-Paul Dyson and his colleagues for hosting me there for a very productive week!

Ultima V is available from GOG.com in a collection with its prequel and sequel.)

 

Posted by on February 5, 2016 in Digital Antiquaria, Interactive Fiction

 


A Pirate’s Life for Me, Part 3: Case Studies in Copy Protection

Copy-protection schemes, whether effected through software, a combination of software and hardware, or hardware alone, can and do provide a modicum of software protection. But such schemes alone are no better forms of security than locks. One with the appropriate tools can pick any lock. Locks only project the illusion of protection, to both the owner and the prospective thief.

Our focus on copy protection is the primary reason why our industry’s software-protection effort has come under skeptical scrutiny and intense attack. Many users now consider the copy-protection scheme to be just an obstacle to be overcome en route to their Congressionally- and self-granted right to the backup copy.

Dale A. Hillman
President, XOR Software
1985

An impregnable copy-protection scheme is a fantasy. With sufficient time and effort, any form of copy protection can be broken. If game publishers didn’t understand this reality at the dawn of their industry, they were given plenty of proof of its veracity almost as soon as they began applying copy protection to their products and legions of mostly teenage crackers began to build their lives around breaking it.

Given the unattainability of the dream of absolute protection, the next best thing must be protection so tough that the end result of a cracked, copyable disk simply isn’t worth the tremendous effort required to get there. When even this level of security proved difficult if not impossible to achieve, some publishers — arguably the wisest — scaled back their expectations yet further, settling for fairly rudimentary schemes that would be sufficient to deter casual would-be pirates but that would hardly slow down the real pros. Their games, so the reasoning went, were bound to get cracked anyway, so why compound the loss by pouring money into ever more elaborate protection schemes? Couldn’t that money be better used to make the games themselves better?

Others, however, doubled down on the quixotic dream of the game that would never be cracked, escalating a war between the copy-protection designers who developed ever more devious schemes and the intrepid crackers of the scene, the elite of the elite who staked their reputations on their ability to crack any game ever made. In the long term, the crackers won every single battle of this war, as even many of the publishers who waged it realized was all but inevitable. The best the publishers could point to was a handful of successful delaying actions that bought their games a few weeks or months before they were spread all over the world for free. And even those relative successes, it must be emphasized, were extremely rare. Few schemes stood up much more than a day or two under the onslaught of the scene’s brigade of talented and highly motivated crackers.

Just as so many crackers found the copy-protection wars to be the greatest game of all, far more intriguing and addictive than the actual contents of the disks being cracked, the art of copy protection — or, as it’s more euphemistically called today, digital-rights management or DRM — remains an almost endlessly fascinating study for those of a certain turn of mind. Back in the day, as now, cracking was a black art. Both sides in the war had strong motivations to keep it so: the publishers because information on how their schemes worked meant the power to crack them, and the crackers because their individual reputations hinged on being the first and preferably the only to crack and spread that latest hot game. Thus information in print on copy protection, while not entirely unheard of, was often hard to find. Only long after that wild and woolly first decade of the games industry did much detailed information on how the most elaborate schemes worked become widely available, thanks to initiatives like The Floppy Disk Preservation Project.

This article will offer just a glimpse of how copy protection began and how it evolved over its first decade, as seen through the schemes that were applied to four historically significant games that we’ve already met in other articles: Microsoft Adventure for the TRS-80, Ultima III for the Apple II, Pirates! for the Commodore 64, and Dungeon Master for the Atari ST. Sit back, then, and join me on a little tour through the dawn of DRM.

Microsoft Adventure box art

The release of Microsoft Adventure in late 1979 for the Radio Shack TRS-80 marks quite a number of interrelated firsts for the games industry. It was the first faithful port of Will Crowther and Don Woods’s perennial Adventure, itself one of the most important computer games ever written, to a home computer. It accomplished this feat by taking advantage of the capabilities of the floppy disk, becoming in the process the first major game to be released on disk only, as opposed to the cassettes that still dominated the industry. And to keep those disks from being copied, normally a trivially easy thing to do in comparison to copying a cassette, Microsoft applied one of the earliest notable instances of physical copy protection to the disk, a development novel enough to attract considerable attention in its own right in the trade press. Byte magazine, for instance, declared the game “a gold mine for the enthusiast and a nightmare for the software pirate.”

Floppy Disk

The core of a 5¼-inch floppy disk, the type used by the TRS-80 and most other early microcomputers, is a platter made of a flexible material such as Mylar — thus the “floppy” — with a magnetic coating made of ferric oxide or a similar material, capable of recording the long sequences of ones and zeroes (or ons and offs) that are used to store all computer code and data. The platter is housed within a plastic casing that exposes just enough of it to give the read/write head of the disk drive access as the platter is spun.

The floppy disk is what’s known as a random-access storage medium. Unlike a cassette drive, a floppy drive can access any of its contents at any time at a simple request from the computer to which it’s attached. To allow this random access, there needs to be an organizing scheme to the disk, a way for the drive to know what lies where and, conversely, what spaces are still free for writing new files. A program known as a “formatter,” which must be run on every new disk before it can be used, writes an initially empty framework to the disk to keep track of what it contains and where it all lives on the disk’s surface.

In the case of the TRS-80, said surface is divided into 35 concentric rings, known as “tracks,” numbered from 0 to 34, with track 0 lying at the outer margin of the disk and track 34 closest to the inner ring. Each track is subdivided along its length into 10 equal-sized sectors, each capable of storing 256 bytes of data. Thus the theoretical maximum capacity of an entire disk is 256 bytes × 10 sectors × 35 tracks = 89,600 bytes, or about 87.5 K.
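The capacity figure is easy enough to check for yourself. A quick sketch in Python (nothing TRS-80-specific here, just the arithmetic):

```python
# Capacity of the TRS-80 disk layout described above:
# 35 tracks x 10 sectors x 256 bytes per sector.
BYTES_PER_SECTOR = 256
SECTORS_PER_TRACK = 10
TRACKS = 35

total_bytes = BYTES_PER_SECTOR * SECTORS_PER_TRACK * TRACKS
print(total_bytes, total_bytes / 1024)  # 89600 bytes, 87.5 K
```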

Figure 1

Figure 1 (click to expand)

Figure 1 shows the general organization of the tracks on a TRS-80 disk. Much of this is specific to the TRS-80’s operating system and thus further down in the weeds than we really need to go, but a couple of details are very relevant to our purposes. Notice track 18, the “system directory.” It’s just what its name would imply. The entire track is reserved to be the disk’s directory service, a list of all the files it contains along with the track and sector numbers where each begins. The directory is placed in the middle of the disk for efficiency’s sake. Because it must be read from every time a file is requested, having it here minimizes the distance the head must travel both to read from the directory and, later, to access the file in question. For the same reason, most floppy-disk systems try to fill disks outward from the directory track, using the farthest-flung regions only if the disk is otherwise full.

The one exception to this rule in the case of the TRS-80 as well as many other computers is the “boot sector”: track 0, sector 0. It contains code, stored outside the filesystem described in the directory, which the computer will always try to access and execute on boot-up. This “bootstrap” code tells the computer how to get started loading the operating system and generally getting on with things. There isn’t much space here — only a single sector’s worth, 256 bytes — but it’s enough to set the larger process in motion.

Figure 2

Figure 2

Figure 2 shows the layout of an individual disk sector. This diagram presumes a newly formatted disk, so the “dummy data” represents the sector’s 256 bytes of available storage, waiting to be filled. Note the considerable amount of organizing and housekeeping information surrounding the actual data, used to keep the drive on track and aware at all times of just where it is. Again, there’s much more here than we need to dig into today. Relevant for our purposes are the track and sector numbers stored near the beginning of each sector. These amount to the sector’s home address, its index in the directory listing.

Microsoft Adventure introduces a seeming corruption into the disk’s scheme. Beginning with track 1 — track 0 must be left alone so the system can find the boot sector and get started — the tracks are numbered not from 1 to 34 but from 127 to 61, in downward increments of 2. The game’s bootstrap inserts a patch into the normal disk-access routines that tells them how to deal with these weirdly numbered tracks. But, absent the patch, the normal TRS-80 operating system has no idea what to make of it. Even a so-called “deep” copier, which tries to copy a disk sector by sector rather than file by file to create a truly exact mirror image of the original, fails because it can’t figure out where the sectors begin and end.
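The renumbering is regular enough to capture in a formula. Here’s a sketch reconstructed from the endpoints given above; the function names and the formula itself are my own, not Microsoft’s actual code:

```python
# Microsoft Adventure labels physical tracks 1 through 34 as
# 127, 125, ..., 61 -- downward steps of 2. Reconstructed mapping:
def stored_label(physical_track):
    return 129 - 2 * physical_track

def physical_track(stored):
    return (129 - stored) // 2

assert stored_label(1) == 127 and stored_label(34) == 61
assert physical_track(125) == 2
```

A patched disk routine need only apply something like `physical_track()` to every track label it reads to make sense of the disk, which is essentially what the game’s bootstrap patch had to do.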

If one wants to make a copy of a protected program, whether for the legal purpose of backing it up or the illegal one of software piracy, one can take either of two approaches. The first is to find a way to exactly duplicate the disk, copy protection and all, so that there’s no way for the program it contains to know it isn’t running on an original. The other is to crack it, to strip out or ignore the protection and modify the program itself to run correctly without it.

One of the first if not the first to find a way to duplicate Microsoft Adventure and then to crack it to boot was an Australian teenager named Nick Andrew (right from the beginning, before the scene even existed, cracking already seemed an avocation for the young). After analyzing the disk to work out how it was “corrupted,” he rewrote the TRS-80’s usual disk formatter to format disks with the alternate track-numbering system. Then he rewrote the standard copier to read and write to the same system. After “about two days,” he had a working duplicate of the original disk.

But he wasn’t quite done yet. After going through all the work of duplicating the disk, the realization dawned that he could easily go one step further and crack it, turning it into just another everyday disk copyable with everyday tools. To do so, he wouldn’t need his modified disk formatter at all. He needed only to modify his customized copier to read from a disk with the alternate track-numbering system but write to a normal one. Remove the custom bootstrap to make Adventure boot like any other disk, and he was done. This first “nightmare for the software pirate” was defanged.

Ultima III

Released in 1983, Ultima III was already the fourth commercial CRPG to be written by the 22-year-old Richard Garriott, but the first of them to be published by his own new company, Origin Systems. With the company’s future riding on its sales, he and his youthful colleagues put considerable effort into devising as tough a copy-protection scheme as possible. It provides a good illustration of the increasing sophistication of copy protection in general by this point, four years after Microsoft Adventure.

Apple II floppy-disk drives function much like their TRS-80 equivalents, with largely only practical variations brought on by specific engineering choices. The most obvious of the differences is the fact that the Apple II writes its data more densely to the disk, giving it 16 256-byte sectors on each of its 35 tracks rather than the 10 of the TRS-80. This change increases each disk’s capacity to 256 bytes × 16 sectors × 35 tracks = 143,360 bytes, or 140 K.

Ultima III shipped on two disks, one used to boot the game and the other to load in data and to save state as needed during play. The latter is a completely normal Apple II disk, allowing the player to make copies as she will in the name of being able to start a fresh game with a new character at any time. The former, however, is a different story.

The game’s first nasty trick is to make the boot disk less than half a disk. Only tracks 0 through 16 are formatted at all. Like the TRS-80, the Apple II expects the disk’s directory to reside in the middle of the disk, albeit on track 17 rather than 18. In this case, though, track 17 literally doesn’t exist.

But how, you might be wondering, can even a copy-protected disk function at all without a directory? Well, it really can’t, or at least it doesn’t in this case. Again like the TRS-80, the beginning of an Apple II disk is reserved for a boot block. The Ultima III bootstrap substitutes alternative code for a standard operating-system routine called the “Read Write Track Sector” routine, or, more commonly, the “RWTS.” It’s this routine that programs call when they need to access a disk file or to do just about any other operation to a disk. Ultima III provides an RWTS that knows to look for the directory listing not on track 17 but rather on track 7, right in the middle of its half-a-disk. Thus it knows how to find its files, but no one else does.

Ultima III‘s other trick is similar to the approach taken by Microsoft Adventure in theory, but far more gnarly in execution. To understand it, we need to have a look at the structure of an Apple II sector. As on the TRS-80, each sector is divided into an “address field,” whose purpose is to keep the drive on track and help it to locate what it’s looking for, and a “data field” containing the actual data written there. Figures 3 and 4 show the structure of each respectively.

Figure 3

Figure 3

Figure 4

Figure 4

Don’t worry too much about the fact that our supposed 256 bytes of data have suddenly grown to 342. This transformation is down to some nitty-gritty details of the hardware that mean that 256 logical bytes can’t actually be packed into 256 bytes of physical space, that the drive needs some extra breathing room. A special encoding process, known as Group Code Recording (GCR) on the Apple II, converts the 256 bytes into 342 that are easily manipulable by the drive and back again. If we were really serious about learning to create copy protection or how to crack it, we’d need to know a lot more about this. But it’s not necessary to understand if you’re just dipping your toes into that world, as we’re doing today.
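That said, the origin of the number 342 is simple enough to sketch: under the Apple II’s “6-and-2” GCR scheme, each byte actually written to disk can carry only 6 bits of payload, so 256 logical bytes require more physical ones.

```python
# 256 logical bytes = 2048 bits, but each "disk byte" under 6-and-2
# GCR carries only 6 bits of payload, so more disk bytes are needed.
import math

data_bits = 256 * 8                    # 2048 bits to store
disk_bytes = math.ceil(data_bits / 6)  # 342 disk bytes needed
print(disk_bytes)
```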

Of more immediate interest are the “prologues” and “epilogues” that precede and trail both the address and data fields. On a normal disk these are fixed runs of numbers, which you see shown in hexadecimal notation in Figures 3 and 4. (If you don’t know what that means, again, don’t worry too much about it. Just trust me that they’re fixed numbers.) Like so much else here, they serve to keep the drive on track and to reassure it that everything is kosher.

Ultima III, however, chooses other numbers to place in these spaces. Further, it doesn’t just choose a new set of fixed numbers — that would be far too easy — but rather varies the expected numbers from track to track and even sector to sector according to a table only it has access to, housed in its custom RWTS. Thus what looks like random garbage to the computer normally suddenly becomes madness with a method behind it when the computer has been booted from the Ultima III disk. If any of these fields don’t match what they should be — i.e., if someone is trying to use an imperfect copy — the game loads no further.
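In outline, the scheme amounts to a secret lookup table keyed by track and sector. The toy sketch below is purely illustrative: the byte values and names are invented, not lifted from Origin’s actual RWTS.

```python
# Hypothetical per-sector prologue table. The custom RWTS expects
# these address-field bytes, where a normal Apple II disk uses a
# fixed D5 AA 96 everywhere. Values here are invented for illustration.
PROLOGUE_TABLE = {
    (0, 0): bytes([0xB5, 0xAA, 0x96]),
    (0, 1): bytes([0xD5, 0xAB, 0x97]),
    (7, 0): bytes([0xD4, 0xBB, 0x96]),
    # ... in the real scheme, one entry per sector on tracks 0-16
}

def sector_ok(track, sector, prologue_read_from_disk):
    # An imperfect copy, written out with the standard fixed prologue,
    # fails this comparison, and loading stops.
    return prologue_read_from_disk == PROLOGUE_TABLE.get((track, sector))
```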

It’s a tough scheme, particularly for its relatively early date, but far from an unbreakable one. There are a couple of significant points of vulnerability. The first is the fact that Ultima III doesn’t need to read and write only protected disks. There is, you’ll remember, also that second disk in a standard format. The modified RWTS needs to be able to fall back to the standard routine when using that second disk, which is no more readable by the modified routine than the protected disk is by the standard. It relies on the disk’s volume number to decide which routine to use: volume 1 is the first, protected disk; volume 2 the second, unprotected (if the volume number is anything else, it knows somebody must be up to some sort of funny business and just stops entirely). Thus if we can just get a copy of the first disk in an everyday disk format and set its volume number to 2, Ultima III will happily accept it and read from it.

But that “just” is, of course, a tricky proposition. We would seemingly need to write a program of our own to read from a disk — or rather from half of a disk — with all those ever-changing prologue and epilogue fields. That, anyway, is one approach. But, if we’re really clever, we won’t have to. Instead of working harder, we can work smarter, using Ultima III‘s own code to crack it.

One thing that legions of hackers and crackers came to love about the Apple II was its integrated machine-language monitor, which can be used to pause and break into a running program at almost any point. We can use it now to pause Ultima III during its own boot process and look up the address of its customized RWTS in memory; because all disk operations use the RWTS, it is easily locatable via a global system pointer. Once we know where the new RWTS lives, we can save that block of memory to disk for later use.

Next we need only boot back into the normal system, load up the customized RWTS we saved to disk, and redirect the system pointer to it rather than the standard RWTS. Remember that the custom RWTS is already written to assume that disks with a volume number of 1 are in the protected format, those with a volume number of 2 in the normal format. So, if we now use an everyday copy program to copy from the original, which has a volume number of 1, to a blank disk which we’ve formatted with a volume number of 2, Ultima III essentially cracks itself. The copy operation, like all disk operations, simply follows the modified system pointer to the new RWTS, and is never any the wiser that it’s been modified. Pretty neat, no? Elegant tricks like this warm any hacker’s heart, and are much of the reason that vintage cracking remains to this day such an intriguing hobby.
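The essence of the trick is that every disk operation funnels through one global pointer, so whoever controls the pointer controls all disk I/O. In miniature (the names and structure here are mine, standing in for the Apple II’s actual memory-resident vector):

```python
# A toy model of the RWTS vector swap. On a real Apple II the pointer
# lives at a known memory location; here a one-element list stands in.
def standard_rwts(request):
    return "standard read: " + request

def custom_rwts(request):
    return "protected-format read: " + request

rwts_vector = [standard_rwts]   # the global pointer

def disk_op(request):           # every disk operation goes through here
    return rwts_vector[0](request)

rwts_vector[0] = custom_rwts    # the cracker's redirection
print(disk_op("track 7, sector 0"))  # now uses the protected-format reader
```

The stand-in copier (here, any caller of `disk_op`) never knows the routine beneath it has been swapped out, which is exactly why the original disk can be read with everyday tools once the pointer is redirected.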

Pirates!

Ultima III‘s copy protection was clever enough in its day, but trivial compared to what would start to appear just a year or so later as the art reached a certain level of maturity. As the industry itself got more and more cutthroat, many of the protection schemes also got just plain nasty. The shadowy war between publisher and pirate was getting ever more personal.

A landmark moment in the piracy wars was the 1984 founding of the Software Publishers Association. It was the brainchild of a well-connected Washington, D.C., lawyer named Ken Wasch who decided that what the industry really needed was a D.C.-based advocacy group and that he, having no previous entanglements within it, was just the neutral party to start it. The SPA had a broad agenda, from gathering data on sales trends from and for its members to presenting awards for “software excellence,” but, from the perspective of the outsider at any rate, seemed to concern itself with the piracy problem above all else. Its rhetoric was often strident to the point of shrillness, while some of its proposed remedies smacked of using a hydrogen bomb to dig a posthole. For instance, the SPA at one point protested to Commodore that multitasking shouldn’t be a feature of the revolutionary new Amiga because it would make it too easy for crackers to break into programs. And Wasch lobbied Congress to abolish the user’s right to make backup copies of their software for personal archival purposes, a key part of the 1980 Software Copyright Act that he deemed a “legal loophole” because it permitted the existence of programs capable of copying many forms of copy-protected software — a small semi-underground corner of the software industry that the SPA was absolutely desperate to eliminate rather than advocate for. The SPA also did its best to convince the FBI and other legal authorities to investigate the bulletin-board systems of the cracking scene, with mixed success at best.

Meanwhile copy protection was becoming a business in its own right, the flip side to the business of making copying programs. In place of the home-grown protection schemes of our first two case studies, which amounted to whatever the developers themselves could devise in whatever time they had available, third-party turnkey protection systems, the products of an emerging cottage industry, became increasingly common as the 1980s wore on. The tiny companies that created the systems weren’t terribly far removed demographically from the crackers that tried to break them; they were typically made up of one to three young men with an encyclopedic knowledge of their chosen platforms and no small store of swagger of their own. Their systems, sporting names like RapidLok and PirateBusters, were multifaceted and complex, full of multiple failsafes, misdirections, encryptions, and honey pots. Copy-protection authors took to sneaking taunting messages into their code, evincing a braggadocio that wouldn’t have felt out of place in the scene: “Nine out of ten pirates go blind trying to copy our software. The other gets committed!”

Protection schemes of this later era are far too complex for me to describe in any real detail in an accessible article like this one, much less explain how people went about cracking them. I would, however, like to very briefly introduce RapidLok, the most popular of the turnkey systems on the Commodore 64. It was the product of a small company called the Dane Final Agency, and was used in its various versions by quite a number of prominent publishers from early 1986 on, including MicroProse. You’ll find it on that first bona fide Sid Meier classic, the ironically-titled-for-our-purposes Pirates!, along with all of their other later Commodore 64 games.

The protection schemes we’ve already seen have modified their platforms’ standard disk formats to confuse copy programs. RapidLok goes to the next level by implementing its own custom format from scratch. A standard Commodore 64 disk has 17 to 21 sectors per track, depending on where the track is located; a RapidLok disk has 11 or 12 much larger sectors, with the details of how those sectors organize their data likewise re-imagined. RapidLok also adds a track to the standard 35, shoved off past the part of the disk that is normally read from or written to. This 36th track serves as an encrypted checksum store for all of the other tracks. If any track fails the checksum check — indicating it’s been modified from the original — the system immediately halts.
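The checksum-track idea can be sketched in a few lines. This is only the concept: RapidLok’s real checksums and encryption were nothing like a modern hash, and all the details below are invented for illustration.

```python
# Concept sketch of a per-track checksum store like RapidLok's hidden
# 36th track. A modern hash stands in for whatever RapidLok really used.
import hashlib

def track_checksum(track_data: bytes) -> bytes:
    return hashlib.sha256(track_data).digest()[:4]

def verify_disk(tracks, checksum_store):
    return all(track_checksum(t) == c for t, c in zip(tracks, checksum_store))

tracks = [bytes([n]) * 4096 for n in range(35)]  # 35 data tracks
store = [track_checksum(t) for t in tracks]      # the hidden checksum track

assert verify_disk(tracks, store)        # untouched disk passes
tracks[10] = b"\x00" * 4096              # patch a single track...
assert not verify_disk(tracks, store)    # ...and the check halts the game
```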

Like any protection scheme, RapidLok must provide a gate to its walled garden, an area of the disk formatted normally so that the computer can boot the game in the first place. Further, writing to RapidLok-formatted tracks isn’t practical. The computer would need to recalculate the checksum for the track as a whole, encrypt it, and rewrite that portion to the checksum store out past the normally accessible part of the disk — a far too demanding task for a little Commodore 64. For these reasons, RapidLok disks are hybrids, partially formatted as standard disks and partially in the protected format. Figure 5 below shows the first disk of Pirates! viewed with a contemporary copying utility.

Figure 5

Figure 5

As the existence of such a tool will attest, techniques did exist to analyze and copy RapidLok disks in their heyday. Among the crackers, Mitch of Eagle Soft was known as the RapidLok master; it’s his vintage crack of Pirates! and many other RapidLok-protected games that you’ll find floating around the disk-image archives today. Yet even those cracks, masterful as they were, were forced to strip out a real advantage that RapidLok gave to the ordinary player, that was in fact the source of the first part of its name: its custom disk format was much faster to read from than the standard, by a factor of five or six. Pirates who chose to do their plundering via Mitch’s cracked version of Pirates! would have to be very patient criminals.

But balanced against the one great advantage of RapidLok for the legitimate user was at least one major disadvantage beyond even the obvious one of not being able to make a backup copy. In manipulating the Commodore 64 disk drive in ways its designers had never intended, RapidLok put a lot of stress on the hardware. Drives that were presumably just slightly out of adjustment, but that nevertheless did everything else with aplomb, proved unable to load RapidLok disks, or, almost worse, failed intermittently in the middle of game sessions (seemingly always just after you’d scored that big Silver Train robbery in the case of Pirates!, of course). And, still worse from the standpoint of MicroProse’s customer relations, a persistent if unproven belief arose that RapidLok was actually damaging disk drives, throwing them out of alignment through its radical operations. It certainly didn’t sound good in action, producing a chattering and general caterwauling and shaking the drive so badly one wondered if it was going to walk right off the desktop one day.

The belief, quite probably unfounded though it was, that MicroProse and other publishers were casually destroying their customers’ expensive hardware in the name of protecting their own interests only fueled the flames of mistrust between publisher and consumer that so much of the SPA’s rhetoric had done so much to ignite. RapidLok undoubtedly did its job in preventing a good number of people from copying MicroProse games. A fair number of them probably even went out and bought the games for themselves as an alternative. Whether those sales were worth the damage it did to MicroProse’s relations with their loyal customers is a question with a less certain answer.

Dungeon Master

No discussion of copy protection in the 1980s could be complete without mentioning Dungeon Master. Like everything else about FTL’s landmark real-time CRPG, its copy protection was innovative and technically masterful, so much so that it became a veritable legend in its time. FTL wasn’t the sort of company to be content with any turnkey copy-protection solution, no matter how comprehensive. What they came up with instead is easily as devious as any dungeon level in the game proper. As Atari ST and Amiga crackers spent much of 1988 learning, every time you think you have it beat it turns the tables on you again. Let’s have a closer look at the protection used on the very first release of Dungeon Master, the one that shipped for the ST on December 15, 1987.

3 1/2 inch floppy disk

With the ST and its 68000-based companions, we’ve moved into the era of the 3½-inch disk, a format that can pack more data onto a smaller disk and also do so more reliably; the fragile magnetic platter is now protected beneath a rigid plastic case and a metal shield that only pulls away to expose it when the disk is actually inserted into a drive. The principles of the 3½-inch disk’s operation are, however, the same as those of the 5¼-inch, so we need not belabor the subject here.

Although most 3½-inch drives wrote to both sides of the disk, early STs used just one, in a format that consisted of 80 tracks, each with 9 512-byte sectors, for a total of 512 bytes × 9 sectors × 80 tracks = 368,640 bytes, or 360 K of storage capacity. The ST uses a more flexible filesystem than was the norm on the 8-bit machines we’ve discussed so far, one known as FAT, for File Allocation Table. The FAT filesystem dates back to the late 1970s, was adopted by Microsoft for MS-DOS in 1981, and is still in common use today in a form known as FAT32; the ST uses FAT12. The numerical suffix refers to the number of bits allocated to each file’s home address on the disk, which in turn dictates the maximum possible capacity of the disk itself. FAT is designed to accommodate a wide range of floppy and hard disks, and thus allows the number of tracks and sectors to be specified at the beginning of the disk itself. Thanks to FAT’s flexibility, Dungeon Master can easily bump the number of sectors per track from 9 to 10, a number still well within the capabilities of the ST’s drive. That change increases the disk’s storage capacity to 512 bytes × 10 sectors × 80 tracks = 409,600 bytes, or 400 K. It was only this modification, more a response to a need for just a bit more disk space than an earnest attempt at copy protection, that allowed FTL to pack the entirety of Dungeon Master onto a single disk.
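Because FAT reads the geometry from the disk’s own boot sector rather than hard-coding it, FTL’s change was just a different parameter value at formatting time. The capacity payoff, using the geometry figures above:

```python
# ST floppy capacity under FAT's parameterized geometry.
def st_capacity_k(sectors_per_track, bytes_per_sector=512, tracks=80):
    return bytes_per_sector * sectors_per_track * tracks // 1024

print(st_capacity_k(9))    # 360 (the standard single-sided format)
print(st_capacity_k(10))   # 400 (Dungeon Master's format)
```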

Dungeon Master‘s real protection is a very subtle affair, which is one of the keys to its success. At first glance one doesn’t realize that the disk is protected at all — a far cry from the radical filesystem overhaul of RapidLok. The disk’s contents can be listed like those of any other, its individual files even read in and examined. The disk really is a completely normal one — except for track 0, sectors 7 and 8.

Let’s recall again the two basic methods of overcoming copy protection: by duplicating the protection on the copy or by cracking the original, making it so that you don’t need to duplicate the protection. Even with a scheme as advanced as RapidLok, duplication often remained an option. Increasingly by the era of Dungeon Master, though, we see the advent of schemes that are physically impossible for the disk drives on the target machines to duplicate under any circumstances, that rely on capabilities unique to industrial-scale disk duplicators. Nate Lawson, a reader of this blog who was hugely helpful to me in preparing this article, describes good copy protection as taking advantage of “asymmetry”: “the difference between the environment where the code is executed versus where it was produced.” The ultimate form of asymmetry must be a machine on the production side that can write data in a format that the machine on the execution side physically cannot.

Because FTL duplicated their own disks in-house rather than using an outside service like most publishers, they had a great deal of control over the process used to create them. They used their in-house disk duplicator to write an invalid sector number to a single sector: track 0, sector 8 is labeled sector 247. At first blush, this hardly seems special; Microsoft Adventure, that granddaddy of copy-protected games, had after all used the same technique eight years earlier. But there’s something special about this sector 247: due to limitations of the ST’s drive hardware that we won’t get into here, the machine physically can’t write that particular sector number. Any disk with a sector labeled 247 has to have come from something other than an ST disk drive.

Track 0, sector 7, relies on the same idea of hardware asymmetry, but adds another huge wrinkle sufficient to warm the heart of any quantum physicist. Remember that the data stored on a disk boils down to a series of 1s and 0s, magnetized or demagnetized areas that are definitively in one state or the other. But what if it was possible to create a “fuzzy” bit, one that capriciously varies between states on each successive read? Well, it wasn’t possible to do anything like that on an ST disk drive or even most industrial disk duplicators. But FTL, technology-driven company that they were, modified their own disk duplicator to be able to do just that. By cramming a lot of “flux reversals,” or transitions between magnetized and demagnetized states, into a space far smaller than the read resolution of the ST disk drive, they could create bits that lived in a perpetually in-between state — bits that the drive would randomly read sometimes as on and sometimes as off.

Dungeon Master has one of these fuzzy bits on track 0, sector 7. When the disk is copied, the copy will contain not a fuzzy bit but a normal bit, on or off according to the quantum vagaries of the read process that created it.

Figure 6

Figure 6

As illustrated in Figure 6, Dungeon Master‘s copy-protection routines read the ostensible fuzzy bit over and over, waiting for a discordant result. When that comes, it can assume that it’s running from an original disk and continue. If it tries many times, always getting the same result, it assumes it’s running from a copy and behaves accordingly.
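The check is easy to model: read the bit many times and look for any disagreement. A sketch, with the retry count and all names invented rather than taken from FTL’s code:

```python
import random

def is_original(read_bit, tries=64):
    # Any disagreement between reads proves a genuinely fuzzy bit;
    # a copy's frozen bit returns the same value every single time.
    first = read_bit()
    return any(read_bit() != first for _ in range(tries))

fuzzy_bit = lambda: random.randrange(2)  # an original's in-between bit
copied_bit = lambda: 1                   # a copy froze it into one state

print(is_original(copied_bit))           # False: behaves like a copy
# is_original(fuzzy_bit) returns True except with probability 2**-64
```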

FTL’s scheme was so original that they applied for and were granted a patent on it, one that’s been cited many times in subsequent filings. It represents a milestone in the emerging art and science of DRM. Ironically, the most influential aspect of Dungeon Master, a hugely influential game on its own terms, might just be its fuzzy-bit copy protection. Various forms of optical media continue to use the same approach to this day.

With duplication a complete non-starter in the case of both this sector numbered 247 and the fuzzy bit, the only way to pirate Dungeon Master must be to crack it. Doing so must entail diving into the game’s actual code, looking for the protection check and modifying it to always return a positive response. In itself, that wasn’t usually too horrible; crackers had long ago learned to root through code to disable look-up-a-word-in-the-manual and code-wheel-based “soft” protection schemes. But FTL, as usual, had a few tricks up their sleeves to make it much harder: they made the protection checks multitudinous and their results non-obvious.

Instead of checking the copy protection just once, Dungeon Master does it over and over, from half-a-dozen or so different places in its code, turning the cracker’s job into a game of whack-a-mole. Every time he thinks he’s got it at last, up pops another check. The most devious of all the checks is the one that’s hidden inside a file called “graphics.dat,” the game’s graphics store. Who would think to look for executable code there?

Compounding the problem of finding the checks is the fact that even on failure they don’t obviously do anything. The game simply continues, only to become unstable and start spitting out error messages minutes later. For this reason, it’s extremely hard to know when and whether the game is finally fully cracked. It was the perfect trap for the young crackers of the scene, who weren’t exactly known for their patience. The pirate boards were flooded with crack after crack of Dungeon Master, all of which turned out to be broken after one had actually played a while. In a perverse way, it amounted to a masterful feat of advertising. Many an habitual pirate got so frustrated with not being able to properly play this paradigm-shattering game that he made Dungeon Master the only original disk in his collection. Publishers had for years already been embedding their protection checks some distance into their games, both to make life harder for crackers and to turn the copies themselves into a sort of demo version that unwitting would-be pirates distributed for them for free. But Dungeon Master used the technique to unprecedented success in terms of pirated copies that turned into sold originals.

Dungeon Master still stands as one of copy protection’s — or, if you like, DRM’s — relatively few absolutely clear, unblemished success stories. It took crackers more than a year, an extraordinary amount of time by their usual standards, to wrap their heads around the idea of a fuzzy bit and to find all of the checks scattered willy-nilly through the code (and, in the case of “graphics.dat,” out of it). After that amount of time the sales window for any computer game, even one as extraordinary as Dungeon Master, must be closing anyway. Writing about the copy protection twenty years later, Doug Bell of FTL couldn’t resist a bit of crowing.

Dungeon Master exposed the fallacy in the claims of both the pirates and the crackers. The pirates who would never have paid for the game if they could steal it did pay for it. Despite a steadily growing bounty of fame and notoriety for cracking the game, the protection lasted more than a year. And the paying customer was rewarded with not just a minimally invasive copy-protection scheme, but, just as importantly, with the satisfaction of not feeling like a schmuck for paying for something that most people were stealing.

As the developer of both Dungeon Master and the software portion of its copy protection, I knew that eventually the copy protection would be broken, but that the longer it held out the less damage we would suffer when it was broken.

Dungeon Master had a greater than 50-percent market penetration on the Atari ST—that is, more than one copy of Dungeon Master was sold for each two Atari ST computers sold. That’s easily ten times the penetration of any other game of the time on any other platform.

So what’s the lesson? That piracy does take significant money out of the pocket of the developer and that secure anti-piracy schemes are viable.

Whether we do indeed choose to view Dungeon Master as proof of the potential effectiveness of well-crafted DRM as a whole or, as I tend to, as something of an historical aberration produced by a unique combination of personalities and circumstances, it does remain a legend among old sceners, respected as perhaps the worthiest of all the wily opponents they encountered over the years — not just technically brilliant but conceptually and even psychologically so. By its very nature, the long war between the publishers and the crackers could only be a series of delaying actions on the part of the former. For once, the delay created by Dungeon Master‘s copy protection was more than long enough.
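The fuzzy bit that so confounded the crackers can also be illustrated in miniature. The idea: on a genuine disk the protected bit is written so marginally that successive reads of it return different values, while any ordinary copy stores it as a solid, stable bit. The check therefore reads the bit several times and fails if every read agrees. The following Python toy model simulates that logic; the function names and parameters are invented for illustration, not taken from Dungeon Master itself.

```python
import random

def read_fuzzy_bit(disk_is_genuine, rng):
    # On a genuine disk the flux transition is deliberately marginal, so
    # the drive returns 0 or 1 essentially at random on each read. A copy
    # made by a normal drive stores an ordinary, stable bit.
    return rng.randint(0, 1) if disk_is_genuine else 1

def passes_protection(disk_is_genuine, reads=32, seed=42):
    rng = random.Random(seed)  # fixed seed keeps the simulation reproducible
    samples = {read_fuzzy_bit(disk_is_genuine, rng) for _ in range(reads)}
    # A genuine disk shows BOTH values across repeated reads; a copy,
    # reading the same stable value every time, never does.
    return samples == {0, 1}

print(passes_protection(True))   # genuine disk: reads disagree, check passes
print(passes_protection(False))  # copy: reads all agree, check fails
```

Note the inversion that made the scheme so effective: consistency, normally the whole point of a disk drive, is here the telltale sign of a copy, and no bit-for-bit copier of the era could reproduce inconsistency.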

And on that note we’ll have to conclude this modest little peek behind the curtain of 1980s copy protection. Like so many seemingly narrow and esoteric topics, it only expands and flowers the deeper you go into it. People continue to crack vintage games and other software to this day, and often document their findings in far more detail than I can here. Apple II fans may want to have a look at the work of one “a2_4am” on Twitter, while those of you who want to know more about RapidLok may want to look into the C64 Preservation Project‘s detailed RapidLok Handbook, which is several times the length of this article. And if all that’s far, far more information than you want — and no, I really don’t blame you — I hope this article, cursory as it’s been, has instilled some respect for the minds on both sides of the grand software-piracy wars of the 1980s.

(Sources: Beneath Apple DOS by Don Worth and Pieter Lechner; The Anatomy of the 1541 Disk Drive by Lothar Englisch and Norbert Szczepanowski; Inside Commodore DOS by Richard Immers and Gerald G. Neufeld; The Kracker Jax Revealed Trilogy; Commodore Power Play of August/September 1985; Kilobaud of July 1982; New Zealand Bits and Bytes of May 1984; Games Machine of June 1988; Transactor 5.3; 80 Microcomputing of November 1980; Byte of December 1980; Hardcore Computist #9 and #11; Midnite Software Gazette of April 1986. Online sources include Nick Andrew’s home page, the aforementioned C64 Preservation Project, and The Dungeon Master Encyclopedia. See also Jean Louis-Guérin’s paper “Atari Floppy Disk Copy Protection.” Information on the SPA’s activities comes from the archive of SPA-related material donated to the Strong Museum of Play by Doug Carlston, first fruit of my research here in Rochester.

My huge thanks to Nate Lawson for doing something of a peer review of this article prior to publication!)

 
 


Send in the Clones

In computer parlance, a clone is Company B’s copycat version of Company A’s computer that strains to be as software and hardware compatible with its inspiration as possible. For a platform to make an attractive target for cloning, it needs to meet a few criteria. The inspiration needs to be simple and/or well-documented enough that it’s practical for another company — and generally a smaller company at that, with far fewer resources at its disposal — to create a compatible knock-off in the first place. Then the inspiration needs to be successful enough that it’s spawned an attractive ecosystem that lots of people want to be a part of. And finally, there needs to be something preventing said people from joining said ecosystem by, you know, simply buying the machine that’s about to be cloned. Perhaps Company A, believing it has a lock on the market, keeps the price above what many otherwise interested people are willing or able to pay; perhaps Company A has simply neglected to do business in a certain part of the world filled with eager would-be buyers.

Clones have been with us almost from the moment that the trinity of 1977 kicked off the PC revolution in earnest. The TRS-80 was the big early winner of the trio thanks to its relatively low price and wide distribution through thousands of Radio Shack stores, outselling the Apple II in its first months by margins of at least twenty to one (as for the Commodore PET, it was the Bigfoot of the three, occasionally glimpsed in its natural habitat of trade-show booths but never available in a form you could actually put your hands on until well into 1978). The first vibrant, non-business-focused commercial software market in history sprang up around the little Trash 80. Cobbled together on an extreme budget out of generic parts that were literally just lying around at Radio Shack — the “monitor,” for instance, was just a cheap Radio Shack television re-purposed for the role — the TRS-80 was eminently cloneable. Cloning it didn’t make a whole lot of sense in North America, where Radio Shack’s volume manufacturing and distribution system would be hard advantages to overcome. But Radio Shack had virtually no presence outside of North America, where there were nevertheless plenty of enthusiasts eager to join the revolution.

EACA shindig in Hong Kong

A shindig for EACA distributors in Hong Kong. Shortly after this photo was taken, Eric Chung, third from right in front, would abscond with $10 million and that would be that for EACA.

The most prominent of the TRS-80 cloners that had sprung up by 1980 was a rather shady Hong Kong-based company called EACA, who made cheap clones for any region of the world with distributors willing to buy them. Their knock-offs popped up in Europe under the name “The Video Genie”; in Australasia as the “Dick Smith System 80,” distributed under the auspices of Dick Smith Electronics, the region’s closest equivalent to Radio Shack; even in North America as the “Personal Micro Computers PMC-80.” EACA ended in dramatic fashion in 1983 when founder Eric Chung absconded to Taiwan with all of his company’s assets that he could liquidate, $10 million worth, stuffed into his briefcase. He or his descendants are presumably still living the high life there today.

By the time of those events, the TRS-80’s heyday was already well past, its position as the most active and exciting PC platform long since having been assumed by the Apple II, which had begun a surge to the fore in the wake of the II Plus model of 1979. The Apple II was if anything an even more tempting target for cloners than the TRS-80. While Steve Wozniak’s hardware design is justly still remembered as a marvel of compact elegance, it was also built entirely from readily available parts, lacking the complex and difficult-to-duplicate custom chips of competitors like Atari and Commodore. Wozniak had also insisted that every last diode on the Apple II’s circuit board be meticulously documented for the benefit of hackers just like him. And Apple, then as now, maintained some of the highest profit margins in the industry, creating a huge opportunity for a lean-and-mean cloner to undercut them.

The Franklin Ace 1000

A Franklin Ace 1000 mixed and matched with a genuine Apple floppy drive.

Assorted poorly distributed Far Eastern knock-offs aside, the first really viable Apple II clone arrived in mid-1982 in the form of the Franklin Ace line. The most popular model, the Ace 1000, offered complete hardware and software compatibility for about 25 percent less than a II Plus, while also boasting more memory and luxuries like a numeric keypad and upper- and lowercase letter input. The Ace terrified Apple. With the Apple III having turned into a disaster, Apple remained a one-platform company, completely dependent on continuing Apple II sales — and continuing high Apple II profit margins — to fund not one but two hugely ambitious, hugely innovative, and hugely expensive new platform initiatives, Lisa and Macintosh. A viable market in Apple II workalikes that cut seriously into sales, or forced price cuts, could bring everything down around their ears. Already six months before the Ace actually hit the market, as soon as they got word of Franklin’s plans, Apple’s lawyers were therefore looking for a way to challenge Franklin in court and drive their machine from the market.

As it turned out, the basis for a legal challenge wasn’t hard to find. Yes, the Apple II’s unexceptional hardware would seem to be fair game — but the machine’s systems software was not. Apple quickly confirmed that, like most of the TRS-80 cloners, Franklin had simply copied the contents of the II’s ROM chips; even bugs and the secret messages Apple’s programmers had hidden inside them were still there in Franklin’s versions. A triumphant Apple rushed to federal court to seek a preliminary injunction to keep the Ace off the market until the matter was decided through a trial. Much to their shocked dismay, the District Court for the Eastern District of Pennsylvania found the defense offered by Franklin’s legal team compelling enough to deny the injunction. The Ace came out right on schedule that summer of 1982, to good reviews and excellent sales.

Franklin’s defense sounds almost unbelievable today. They readily admitted that they had simply copied the contents of the ROM chips. They insisted, however, that the binary code contained on the chips, being a machine-generated sequence of 1s and 0s that existed only inside the chips and that couldn’t be reasonably read by a human, was not a form of creative expression and thus not eligible for copyright protection in the first place. In Franklin’s formulation, only the human-readable source code used to create the binary code stored on the ROM chips, which Franklin had no access to and no need for given that they had the binary code, was copyrightable. It was an audacious defense to say the least, one which if accepted would tear down the legal basis for the entire software industry. After all, how long would it take someone to leap to the conclusion that some hot new game, stored only in non-human-readable form on a floppy disk, was also ineligible for copyright protection? Astonishingly, when the case got back to the District Court for a proper trial, the judge again sided with Franklin, stating that “there is some doubt as to the copyrightability of the programs described in this litigation,” in spite of an earlier case, Williams Electronics, Inc. v. Arctic International, Inc., which quite clearly had established binary code as copyrightable. Only in August of 1983 was the lower court’s ruling overturned by the Federal Court of Appeals in Philadelphia. A truculent Franklin threatened to appeal to the Supreme Court, but finally agreed to a settlement the following January that demanded they start using their own ROMs if they wanted to keep cloning Apple IIs.

Apple Computer, Inc., v. Franklin Computer Corp. still stands today as a landmark in technology jurisprudence. It firmly and finally established the copyrightable status of software regardless of its form of distribution. And it of course also had an immediate impact on would-be cloners, making their lives much more difficult than before. With everyone now perfectly clear on what was and wasn’t legal, attorney David Grais clarified the process cloners would need to follow to avoid lawsuits in an episode of Computer Chronicles:

You have to have one person prepare a specification of what the program [the systems software] is supposed to do, and have another person who’s never seen the [original] program write a program to do it. If you can persuade a judge that the second fellow didn’t copy from the [original] code, then I think you’ll be pretty safe.

After going through this process, Apple II cloners needed to end up with systems software that behaved absolutely identically to the original. Every system call needed to take the exact same amount of time that it did on a real Apple II; each of the original software’s various little quirks and bugs needed to be meticulously duplicated. Anything less would bring with it incompatibility, because there was absolutely nothing in those ROMs that some enterprising hacker hadn’t used in some crazy, undocumented, unexpected way. This was a tall hurdle indeed, one which neither Franklin nor any other Apple II cloner was ever able to completely clear. New Franklins duly debuted with the new, legal ROMs, and duly proved to be much less compatible and thus much less desirable than the older models. Franklin left the Apple-cloning business within a few years in favor of hand-held dictionaries and thesauri.

There is, however, still another platform to consider, one on which the cloners would be markedly more successful: the IBM PC. The open or (better said) modular architecture of the IBM PC was not, as so many popular histories have claimed, a sign of a panicked or slapdash design process. It was rather simply the way that IBM did business. Back in the 1960s the company had revolutionized the world of mainframe computing with the IBM System/360, not a single computer model but a whole extended family of hardware and software designed to plug and play together in whatever combination best suited a customer’s needs. It was this product line, the most successful in IBM’s history, that propelled them to the position of absolute dominance of big corporate computing that they still enjoyed in the 1980s, and that reduced formerly proud competitors to playing within the house IBM had built by becoming humble “Plug-Compatible Manufacturers” selling peripherals that IBM hadn’t deigned to provide — or, just as frequently, selling clones of IBM’s products for lower prices. Still, the combined profits of all the cloners remained always far less than those of IBM itself; it seemed that lots of businesses wanted the security that IBM’s stellar reputation guaranteed, and were willing to pay a bit extra for it. IBM may have thought the PC market would play out the same way. If so, they were in for a rude surprise.

The IBM PC was also envisioned as not so much a computer as the cornerstone of an ever-evolving, interoperable computing family that could live for years or decades. Within three years of the original machine’s launch, you could already choose from two CPUs, the original Intel 8088 or the new 80286; could install as little as 16 K of memory or as much as 640 K; could choose among four different display cards, from the text-only Monochrome Display Adapter to the complicated and expensive CAD-oriented Professional Graphics Controller; could choose from a huge variety of other peripherals: floppy and hard disks, tape backup units, modems, printer interfaces, etc. The unifying common denominator amongst all this was a common operating system, MS-DOS, which had quickly established itself as the only one of the four operating paradigms supported by the original IBM PC that anyone actually used. Here we do see a key difference between the System/360 and the IBM PC, one destined to cause IBM much chagrin: whereas the former ran an in-house-developed IBM operating system, the operating system of the latter belonged to Microsoft.

The IBM architecture was different from that of the Apple II in that its operating system resided on disk, to be booted into memory at system startup, rather than being housed in ROM. Still, every computer needs to have some code in ROM. On an IBM PC, this code was known as the “Basic Input/Output System,” or BIOS, a nomenclature borrowed from the CP/M-based machines that preceded it. The BIOS was responsible on startup for doing some self-checks and configuration and booting the operating system from disk. It also contained a set of very basic, very low-level routines to do things like read from and write to the disks, detect keyboard input, or display text on the screen; these would be called constantly by MS-DOS and, very commonly, by applications as well while the machine was in operation. The BIOS was the one piece of software for the IBM PC that IBM themselves had written and owned, and for obvious reasons they weren’t inclined to share it with anyone else. Two small companies, Corona Labs and Eagle Computer, would simply copy IBM’s BIOS a la Franklin. It took the larger company all of one day to file suit and force complete capitulation and market withdrawal when those machines came to their attention in early 1984.
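The division of labor described above, a fixed menu of numbered low-level services in ROM that the operating system and applications call by number rather than by address, can be modeled in miniature. This Python sketch is purely an analogy: real BIOS services were reached through x86 software interrupts (INT 10h for video, INT 13h for disk, and so on), and the handler bodies here are invented stand-ins. But it shows why a clone BIOS was viable at all: only the observable behavior behind each service number had to match.

```python
# Toy model of a BIOS-style service layer. Callers know only service
# numbers and calling conventions, never implementations, so a clean-room
# BIOS can swap in its own handlers as long as their behavior is identical.

screen = []                                   # stand-in for the text display
disk = {0: b"boot sector", 1: b"FAT", 2: b"data"}  # stand-in for a floppy

def video_teletype(char):
    """Print one character to the 'screen' (analogous to INT 10h, AH=0Eh)."""
    screen.append(char)

def disk_read(sector):
    """Return the contents of one 'sector' (analogous to INT 13h, AH=02h)."""
    return disk.get(sector, b"")

# The "interrupt vector table": service number -> handler. A clone BIOS
# supplies its own table, written without ever seeing IBM's code.
SERVICES = {
    0x10: video_teletype,
    0x13: disk_read,
}

def bios_call(number, *args):
    # Dispatch on the service number, as the CPU's INT instruction would.
    return SERVICES[number](*args)

# An "application" that depends only on the numbered interface:
for c in "A>":
    bios_call(0x10, c)
boot = bios_call(0x13, 0)
print("".join(screen), boot)  # A> b'boot sector'
```

The compatibility problem the cloners faced lives entirely inside the handlers: any quirk of timing or side effect that software had come to depend on had to be reproduced exactly, because nothing in the interface itself enforces it.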

Long before those events, other wiser would-be cloners recognized that creating a workalike, “clean-room” version of IBM’s BIOS would be the key to executing a legal IBM clone. The IBM PC’s emphasis on modularity and future expansion meant that it was a bit more forgiving in this area than the likes of the more tightly integrated Apple II. Yet an IBM-compatible BIOS would still be a tricky business, fraught with technical and financial risk.

As the IBM PC was beginning to ship, a trio of Texas Instruments executives named Rod Canion, James Harris, and William Murto were kicking around ideas for getting out from under what they saw as a growing culture of non-innovation inside TI. Eager to start a business of their own, they considered everything from a Mexican restaurant to household gadgets like a beeper for finding lost keys. Eventually they started to ask what the people around them at TI wanted but weren’t getting in their professional lives. They soon had their answer: a usable portable computer that executives and engineers could cart around with them on the road, and that was cheap enough that their purchasing managers wouldn’t balk. Other companies had explored this realm before, most notably the brief-lived Osborne Computer with the Osborne 1, but those products had fallen down badly in the usability sweepstakes; the Osborne 1, for example, had a 5-inch display screen the mere thought of which could prompt severe eye strain in those with any experience with the machine, disk drives that could store all of 91 K, and just 64 K of memory. Importantly, all of those older portables ran CP/M, until now the standard for business computing. Canion, Harris, and Murto guessed, correctly, that CP/M’s days were numbered in the wake of IBM’s adoption of MS-DOS. Not wanting to be tied to a dying operating system, they first considered making their own. Yet when they polled the big software publishers about their interest in developing for yet another new, incompatible machine the results were not encouraging. There was only one thing for it: they must find a way to make their portable compatible with the IBM PC. If they could bring out such a machine before IBM did, the spoils could be enormous. Prominent tech venture capitalist Ben Rosen agreed, investing $2.5 million to help found Compaq Computer Corporation in February of 1982. 
What with solid funding and their own connections within the industry, Canion, Harris, and Murto thought they could easily design a hardware-compatible portable that was better than anything else available at the time. That just left the software side.

Given Bill Gates’s reputation as the Machiavelli of the computer industry, we perhaps shouldn’t be surprised that some journalists have credited him with anticipating the rise of PC clones from well before the release of the first IBM PC. That, however, is not the case. All indications are that Gates negotiated a deal that let Microsoft lease MS-DOS to IBM rather than sell it to them simply in the expectation that the IBM PC would be a big success, enough so that an ongoing licensing fee would amount to far more than a lump-sum payout in the long run. Thus he was as surprised as anyone when Compaq and a few other early would-be cloners contacted him to negotiate MS-DOS license deals for their own machines. Of course, Gates being Gates, it took him all of about ten minutes to grasp the implications of what was being requested, and to start making deals that, not incidentally, actually paid considerably better than the one he’d already made with IBM.

The BIOS would be a tougher nut to crack, the beachhead on which this invasion of Big Blue’s turf would succeed or fail. Having quickly concluded that simply copying IBM’s ROMs wasn’t a wise option, Compaq hired a staff of fifteen programmers who would dedicate the months to come to creating a slavish imitation. Programmers with any familiarity at all with the IBM BIOS were known as “dirty,” and barred from working on the project. Instead of relying on IBM’s published BIOS specifications (which might very well be incorrect due to oversight or skulduggery), the team took the thirty biggest applications on the market and worked through them one at a time, analyzing each BIOS call each program made and figuring out through trial and error what response it needed to receive. The two trickiest programs, which would go on to become a sort of stress test for clone compatibility both inside and outside of Compaq, proved to be Lotus 1-2-3 and Microsoft Flight Simulator.

Before the end of the year, Compaq was previewing their new portable to press and public and working hard to set up a strong dealer network. For the latter task they indulged in a bit of headhunting: they hired away from IBM H. L. “Sparky” Sparks, the man who had set up the IBM PC dealer network. Knowing all too well how dealers thought and what was most important to them, Sparks instituted a standard expected dealer markup of 36 percent, versus the 33 percent offered by IBM, thus giving them every reason to look hard at whether a Compaq might meet a customer’s needs just as well or better than a machine from Big Blue.

The Compaq Portable

Compaq’s first computer, the Portable

Savvy business realpolitik like that became a hallmark of Compaq. Previously clones had been the purview of small upstarts, often with a distinct air of the fly-by-night about them. The suburban-Houston-based Compaq, though, was different, not only from other cloners but also from the established companies of Silicon Valley. Compaq was older, more conservative, interested in changing the world only to the extent that that meant more Compaq computers on desks and in airplane luggage racks. “I don’t think you could get a 20-year-old to not try to satisfy his ego by ‘improving’ on IBM,” said J. Steven Flannigan, the man who led the BIOS reverse-engineering effort. “When you’re fat, balding, and 40, and have a lot of patents already, you don’t have to try.” That attitude was something corporate purchasing managers could understand. Indeed, Compaq bore with it quite a lot of the same sense of comforting stolidity as did IBM itself. Not quite the first to hit the market with an IBM clone with a “clean” BIOS (that honor likely belongs to Columbia Data Products, a much scruffier sort of operation that would be out of business by 1985), Compaq nevertheless legitimized the notion in the eyes of corporate America.

The Compaq Portable goes flying

The worst possible 1980s airplane seatmate: a business traveler lugging along a Compaq Portable.

Yet the Compaq Portable that started shipping very early in 1983 also succeeded because it was an excellent and — Flannigan’s sentiments aside — innovative product. By coming out with their portable before IBM itself, Compaq showed that clones need not be mere slavish imitations of their inspirations distinguished only by a lower price. “Portable” in 1983 did not, mind you, mean what it does today. The Compaq Portable was bigger and heavier — a full 28 pounds — than most desktop machines of today, something you manhandled around like a suitcase rather than slipping into a pocket or backpack. There wasn’t even a battery in the thing, meaning the businessperson on the go would likely be doing her “portable” computing only in her hotel room. Still, it was very thoughtfully designed within the technical constraints of its era; you could for instance attach it to a real monitor at your desk to enjoy color graphics in lieu of the little 9-inch monochrome screen that came built-in, a first step on the road to the ubiquitous laptop docking stations of today.

Launching fortuitously just as some manufacturing snafus and unexpected demand for the new PC/XT were making IBM’s own computers hard to secure in some places, the Compaq Portable took off like a rocket. Compaq sold 53,000 of them for $111 million in sales that first year, a record for a technology startup. IBM, suddenly in the unaccustomed position of playing catch-up, released their own portable the following year with fewer features but — and this was truly shocking — a lower price than the Compaq Portable; by forcing high-and-mighty IBM to compete on price, Compaq seemed to have somehow turned the world on its head. The IBM Portable PC was a notable commercial failure, first sign of IBM’s loosening grip on the monster they had birthed. Meanwhile Compaq launched their own head-to-head challenge that same year with the DeskPro line of desktop machines, to much greater success. Apple may have been attacking IBM in melodramatic propaganda films and declaring themselves and IBM to be locked in a battle of Good versus Evil, but IBM hardly seemed to notice the would-be Apple freedom fighters. The only company that really mattered to IBM, the only company that scared them, wasn’t sexy Apple but buttoned-down, square-jawed Compaq.

But Compaq was actually far from IBM’s only problem. Cloning just kept getting easier, for everyone. In the spring of 1984 two little companies called Award Software and Phoenix Technologies announced identical products almost simultaneously: a reverse-engineered, completely legal IBM-compatible BIOS which they would license to anyone who felt like using it to make a clone. Plenty of companies did, catapulting Award and Phoenix to the top of what was soon a booming niche industry (they would eventually resolve their rivalry the way that civilized businesspeople do it, by merging). With the one significant difficulty of cloning thus removed, making a new clone became almost a triviality, a matter of ordering up a handful of components along with MS-DOS and an off-the-shelf BIOS, slapping it all together, and shoving it out the door; the ambitious hobbyist could even do it in her home if she liked. By 1986, considerably more clones were being sold than IBMs, whose own sales were stagnant or even decreasing.

That year Intel started producing the 80386, the third generation of the line of CPUs that powered the IBM PC and its clones. IBM elected to wait a bit before making use of it, judging that the second-generation 80286, which they had incorporated into the very successful PC/AT in 1984, was still plenty powerful for the time being. It was a bad decision, predicated on a degree of dominance which IBM no longer enjoyed. Smelling opportunity, Compaq made their own 80386-based machine, the DeskPro 386, the first to sport the hot new chip. Prior to this machine, the cloners had always been content to let IBM pave the way for such fundamental advances. The DeskPro 386 marks Compaq’s — and the clone industry’s — coming of age. No longer just floating along in the wake of IBM, tinkering with form factors, prices, and feature sets, now they were driving events. Already in November of 1985, Bill Machrone of PC Magazine had seen where this was leading: “Now that it [IBM] has created the market, the market doesn’t necessarily need IBM for the machines.” We see here business computing going through its second fundamental shift (the first being the transition from CP/M to MS-DOS). What was an ecosystem of IBM and IBM clones now became a set of sometimes less-than-ideal, sometimes accidental, but nevertheless agreed-upon standards that were bigger than IBM or anyone else. IBM, Machrone wrote, “had better conform” to the standards or face the consequences just like anyone else. Tellingly, it’s at about this time that we see the phrase “IBM clone” begin to fade, to be replaced by “MS-DOS machine” or “Intel-based machine.”

The emerging Microsoft/Intel juggernaut (note the lack of an “IBM” in there) would eventually conquer the home as well. Already by the mid-1980s certain specimens of the breed were beginning to manifest features that could make them attractive for the home user. Let’s rewind just slightly to look at the most important of them, which I’ve mentioned in a couple of earlier articles but have never really given its full due.

When the folks at Radio Shack, trying to figure out what to do with their aging, fading TRS-80 line, saw the ill-fated IBM PCjr, they saw things well worth salvaging in its 16-color graphics chip and its three-voice sound synthesizer, both far superior to the versions found in its big brothers. Why not clone those pieces, package them into an otherwise fairly conventional PC clone, and sell the end result as the perfect all-around computer, one which could run all the critical business applications but could also play games in the style to which kids with Commodore 64s were accustomed? Thanks to the hype that had accompanied the PCjr’s launch, there were plenty of publishers out there with huge inventories of games and other software that supported the PCjr’s audiovisuals, inventories they’d be only too eager to unload on Radio Shack cheap. With those titles to prime the pump, who knew where things might go…

Launched in late 1984, the Tandy 1000 was the first IBM clone to be clearly pitched not so much at business as at the ordinary consumer. In addition to the audiovisual enhancements and very aggressive pricing, it included DeskMate, a sort of proto-GUI operating environment designed to insulate the user from the cryptic MS-DOS command prompt while giving access to six typical home applications that came built right in. A brilliant little idea all the way around, the Tandy 1000 rescued Radio Shack from the brink of computing irrelevance. It also proved a godsend for many software publishers who’d bet big on the PCjr; John Williams credits it with literally saving Sierra by providing a market for King’s Quest, a game Sierra had developed for the PCjr at horrendous expense and to underwhelming sales given that platform’s commercial failure. Indeed, the Tandy 1000 became so popular that it prompted lots of game publishers to have a second look at the heretofore dull beige world of the clones. As they jumped aboard the MS-DOS gravy train, many made sure to take advantage of the Tandy 1000’s audiovisual enhancements. Thousands of titles would eventually blurb what became known as “Tandy graphics support” on their boxes and advertisements. Having secured the business market, the Intel/Microsoft architecture’s longer, more twisting road to hegemony over home computing began in earnest with the Tandy 1000. And meanwhile poor IBM couldn’t even get proper credit for the graphics standard they’d actually invented. Sometimes you just can’t win for losing.

Another sign of the nascent but inexorably growing power of Intel/Microsoft in the home would come soon after the Tandy 1000, with the arrival of the first game to make many Apple, Atari, and Commodore owners wish that they had a Tandy 1000 or, indeed, even one of its less colorful relatives. We’ll get to that soon — no, really! — but first we have just one more detour to take.

(I was spoiled for choice on sources this time. A quick rundown of periodicals: Creative Computing of January 1983; Byte of January 1983, November 1984, and August 1985; PC Magazine of January 1987; New York Times of November 5 1982, October 26 1983, January 5 1984, February 1 1984, and February 22 1984; Fortune of February 18 1985. Computer Wars by Charles H. Ferguson and Charles R. Morris is a pretty good book-length study of IBM’s trials and tribulations during this period. More information on the EACA clones can be found at Terry Stewart’s site. More on Compaq’s roots in Houston can be found at the Texas State Historical Association. A few more invaluable links are included in the article proper.)

 
 


Apple, Carmen Sandiego, and the Rise of Edutainment

If there was any one application that was the favorite amongst early boosters of personal computing, it was education. Indeed, it could sometimes be difficult to find one of those digital utopianists who was willing to prioritize anything else — unsurprisingly, given that so much early PC culture grew out of places like The People’s Computer Company, who made “knowledge is power” their de facto mantra and talked of teaching people about computers and using computers to teach with equal countercultural fervor. Creative Computing, the first monthly magazine dedicated to personal computing, grew out of that idealistic milieu, founded by an educational consultant who filled a big chunk of its pages with plans, schemes, and dreams for computers as tools for democratizing, improving, and just making schooling more fun. A few years later, when Apple started selling the II, they pushed it hard as the learning computer, making deals with the influential likes of the Minnesota Educational Computing Consortium (MECC) of Oregon Trail fame that gave the machine a luster none of its competitors could touch. For much of the adult public, who may have had their first exposure to a PC when they visited a child’s classroom, the Apple II became synonymous with the PC, which was in turn almost synonymous with education in the days before IBM turned it into a business machine. We can still see the effect today: when journalists and advertisers look for an easy story of innovation to which to compare some new gadget, it’s always the Apple II they choose, not the TRS-80 or Commodore PET. And the iconic image of an Apple II in the public’s imagination remains a group of children gathered around it in a classroom.

For all that, though, most of the early educational software really wasn’t so compelling. The works of Edu-Ware, the first publisher to make education their main focus, were fairly typical. Most were created or co-created by Edu-Ware co-founder Sherwin Steffin, who brought with him a professional background of more than twenty years in education and education theory. He carefully outlined his philosophy of computerized instruction, backed as it was by all the latest research into the psychology of learning, in long-winded, somewhat pedantic essays for Softalk and Softline magazines, standard bearers of the burgeoning Apple II community. Steffin’s software may or may not have correctly applied the latest pedagogical research, but it mostly failed at making children want to learn with it. The programs were generally pretty boring exercises in drill and practice, lacking even proper titles. Fractions, Arithmetic Skills, or Compu-Read they said on their boxes, and fractions, arithmetic, or (compu-)reading was what you got, a series of dry drills to work through without a trace of wit, whimsy, or fun.

The other notable strand of early PC-based education was the incestuous practice of using the computer to teach kids about computers. The belief that being able to harness the power of the computer through BASIC would somehow become a force for social democratization and liberation is an old one, dating back to even before the first issues of Creative Computing — to the People’s Computer Company and, indeed, to the very researchers at Dartmouth College who created BASIC in the 1960s. As BASIC’s shortcomings became more and more evident, other instructional languages and courses based on them kept popping up in the early 1980s: PILOT, Logo, COMAL, etc. This craze for “computer literacy,” which all but insisted that every kid who didn’t learn to program was going to end up washing dishes or mowing lawns for a living, peaked along with the would-be home-computer revolution in about 1983. Advocating for programming as a universal life skill was like suggesting in 1908 that everyone needed to learn to take a car apart and put it back together to prepare for the new world that was about to arrive with the Model T — which, in an example of how some things never really change, was exactly what many people in 1908 were in fact suggesting. Joseph Weizenbaum of Eliza fame, always good for a sober corrective to the more ebullient dreams of his colleagues, offered a shockingly prescient take on the real computerized future by comparing the computer to the electric motor.

There are undoubtedly many more electric motors in the United States than there are people, and almost everybody owns a lot of electric motors without thinking about it. They are everywhere, in automobiles, food mixers, vacuum cleaners, even watches and pencil sharpeners. Yet, it doesn’t require any sort of electric-motor literacy to get on with the world, or, more importantly, to be able to use these gadgets.

Another important point about electric motors is that they’re invisible. If you question someone using a vacuum cleaner, of course they know that there is an electric motor inside. But nobody says, “Well, I think I’ll use an electric motor programmed to be a vacuum cleaner to vacuum the floor.”

The computer will also become largely invisible, as it already is to a large extent in the consumer market. I believe that the more pervasive the computer becomes, the more invisible it will become. We talk about it a lot now because it is new, but as we get used to the computer it will retreat into the background. How much hands-on computer experience will students need? The answer, of course, is not very much. The student and the practicing professional will operate special-purpose instruments that happen to have computers as components.

The pressure to make of every kid a programmer gradually faded as the 1980s wore on, leaving programming to those of us who found it genuinely fascinating. Today even the term “computer literacy,” always a strange linguistic choice anyway, feels more and more like a relic of history as this once-disruptive and scary new force has become as everyday as, well, the electric motor.

As for those other educational programs, they — at least some of them — got better by mid-decade. Programs like Number Munchers, Math Blaster, and Reader Rabbit added a bit more audiovisual sugar to their educational vegetables along with a more gamelike framework to their repetitive drills, and proved better able to hold children’s interest. For all the early rhetoric about computers and education, one could argue that the real golden age of the Apple II as an educational computer didn’t begin until about 1983 or 1984.

By that time a new category of educational software, partly a marketing construct but partly a genuinely new thing, was becoming more and more prominent: edutainment. Trip Hawkins, founder of Electronic Arts, has often claimed to have invented the portmanteau for EA’s 1984 title Seven Cities of Gold, but this is incorrect; a company called Milliken Publishing was already using the label for their programs for the Atari 8-bit line in late 1982, and it was already passing into common usage by the end of 1983. Edutainment dispensed with the old drill-and-practice model in favor of more open, playful forms of interaction that nevertheless promised, sometimes implicitly and sometimes explicitly, to teach. The skills they taught, meanwhile, were generally not the rigid, disembodied stuff of standardized tests but rather embedded organically into living virtual worlds. It’s all but impossible to name any particular game as the definitive first example of such a nebulous genre, but a good starting point might be Tom Snyder and Spinnaker Software.

Tom Snyder, 1984

Snyder had himself barely made it through high school. He came to blame his own failings as a student on his inability to relate to exactly the notions of arbitrary, contextless education that marked the early era of PC educational software: “Here, learn this set of facts. Write this paper. This is what you must know. This is what’s important.” When he became a fifth-grade teacher years later, he made it a point to ground his lessons always in the real world, to tell his students why it was useful to know the things he taught them and how it all related to the world around them. He often used self-designed games, first done with pencil and paper and cardboard and later done on computers, to let his students explore knowledge and its ramifications. In 1980 he founded a groundbreaking development company, Tom Snyder Productions, to commercialize some of those efforts. One of them became Snooper Troops, published as one of Spinnaker’s first titles in 1982; it had kids wandering around a small town trying to solve a mystery by compiling clues and using their powers of deduction. The next year’s In Search of the Most Amazing Thing, still a beloved memory of many of those who played it, combined clue-gathering with elements of economics and even diplomacy in a vast open world. Unlike so much other children’s software, Snyder’s games never talked down to their audience; children are after all just as capable of sensing when they’re being condescended to as anyone else. They differed most dramatically from the drill-and-practice software that preceded them in always making the educational elements an organic part of their worlds. One of Snyder’s favorite mantras applies to educational software as much as it does to any other creative endeavor and, indeed, to life: “Don’t be boring.” The many games of Tom Snyder Productions, most of which were not actually designed by Snyder himself, were often crude and slow, written as often as not in BASIC. But, at least at the conceptual level, they were seldom boring.

It’s of course true that a plain old game that requires a degree of thoughtfulness and a full-on work of edutainment can be very hard to disentangle from one another. Like so much else in life, the boundaries here can be nebulous at best, and often had as much to do with marketing, with the way a title was positioned by its owner, as with any intrinsic qualities of the title itself. When we go looking for those intrinsics, we can come up with only a grab bag of qualities of which any given edutainment title was likely to share a subset: being based on real history or being a simulation of some real aspect of science or technology; being relatively nonviolent; emphasizing thinking and logical problem-solving rather than fast reflexes. Like pornography, edutainment is something that many people seemed to just know when they saw it.

That said, there were plenty of titles that straddled the border between entertainment and edutainment. Spinnaker’s Telarium line of adventure games is a good example. Text-based games that were themselves based on books, published by a company that had heretofore specialized in education and edutainment… it wasn’t hard to grasp why parents might be expected to find them appealing, even if they were never explicitly marketed as anything other than games. Spinnaker’s other line of adventures, Windham Classics, blurred the lines even more by being based on acknowledged literary classics of the sort kids might be assigned to read in school rather than popular science fiction and fantasy, and by being directly pitched at adolescents of about ten to fourteen years of age. Tellingly, Tom Snyder Productions wrote one of the Windham Classics games; Dale Disharoon, previously a developer of Spinnaker educational software like Alphabet Zoo, wrote two more.

A certain amount of educational luster clung to the text adventure in general, and was implicit in much of the talk about interactive fiction as a new form of literature that was so prevalent during the brief bookware boom. One could even say it clung to the home computer itself, in the form of notions about “good screens” and “bad screens.” The family television was the bad screen, locus of those passive and mindless broadcasts that have set parents and educators fretting almost from the moment the medium was invented, and now the home of videogames, the popularity of which caused a reactionary near-hysteria in some circles; they would inure children to violence (if they thought Space Invaders was bad, imagine what they’d say about the games of today!) and almost literally rot their brains, making of them mindless slack-jawed zombies. The computer monitor, on the other hand, was the good screen, home of more thoughtful and creative forms of interaction and entertainment. What parent wouldn’t prefer to see her kid playing, say, Project: Space Station rather than Space Invaders? Home-computer makers and software publishers — at least the ones who weren’t making Space Invaders clones — caught on to this dynamic early and rode it hard.

As toy manufacturers had realized decades before, there are essentially two ways to market children’s entertainment. One way is to appeal to the children themselves, to make them want your product and nag Mom and Dad until they relent. The other is to appeal directly to Mom and Dad, to convince them that what you’re offering will be an improving experience for their child, perhaps, if you can manage them, with a few well-placed innuendoes about how said child will be left behind if she doesn’t have your product. With that in mind, it can be an interesting experiment to look at the box copy from software of the early home-computer era whilst asking yourself whether it’s written for the kids who were most likely to play it or the parents who were most likely to pay for it — or whether it hedges its bets by offering a little for both. Whatever else it was, emphasizing the educational qualities of your game was just good marketing; a 1984 survey found that 46 percent of computers in homes had been purchased by parents with the primary goal of improving their children’s education. It was the perfect market for the title that would come to stand alongside The Oregon Trail as one of the two classic examples of 1980s edutainment software.

Doug, Cathy, and Gary Carlston, 1983

The origins of the game that would become known as Where in the World is Carmen Sandiego? are confused, with lots of oft-contradictory memories and claims flying around. However, the most consistent story has it beginning with an idea by Gary Carlston of Brøderbund Software in 1983. He and his brother Doug had been fascinated by their family’s almanac as children: “We used to lie there and ask each other questions out of the almanac.” This evolved into impromptu quiz games in bed after the lights went out. Gary now proposed a game or, better yet, a series of games which would have players running down a series of clues about geography and history, answerable via a trusty almanac or other reference work to be included along with the game disk right there in the box.

Brøderbund didn’t actually develop much software in-house, preferring to publish the work of outside developers on a contract basis. While they did have a small staff of programmers and even artists, they were there mainly to assist outside developers by helping with difficult technical problems, porting code to other machines, and polishing in-game art rather than working up projects from scratch. But this idea just seemed to have too much potential to ignore or outsource. Gary was therefore soon installed in Brøderbund’s “rubber room” — so-called because it was the place where people went to bounce ideas off one another — along with Lauren Elliott, the company’s only salaried game designer; Gene Portwood, Elliott’s best friend, manager of Brøderbund’s programming team, and a pretty good artist; Ed Bernstein, head of Brøderbund’s art department; and programmer Dane Bigham, who would be expected to write not so much a game as a cross-platform database-driven engine that could power many ports and sequels beyond the Apple II original.

Gary’s first idea was to name the game Six Crowns of Henry VIII, and to make it a scavenger hunt for the eponymous crowns through Britain. However, the team soon turned that into something wider-scoped and more appealing to the emerging American edutainment market. You would be chasing an international criminal ring through cities located all over the world, trying to recover a series of stolen cultural artifacts, like a jade goddess from Singapore, an Inca mask from Peru, or a gargoyle from Notre Dame Cathedral (wonder how the thieves managed that one). It’s not entirely clear who came up with the idea for making the leader of the ring, whose capture would become the game’s ultimate goal, a woman named Carmen Sandiego, but Elliott believes the credit most likely belongs to Portwood. Regardless, everyone immediately liked the idea. “There were enough male bad guys,” said Elliott later, and “girls [could] be just as bad.” (Later, when the character became famous, Brøderbund would take some heat from Hispanic groups who claimed that the game associated a Hispanic surname with criminality. Gary replied with a tongue-in-cheek letter explaining that “Sandiego” was actually Carmen’s married name, that her maiden name was “Sondberg” and she was actually Swedish.) When development started in earnest, the Carmen team was pared down to a core trio of Elliott, who broadly speaking put together the game’s database of clues and cities; Portwood, who drew the graphics; and Bigham, who wrote the code. But, as Elliott later said, “A lot of what we did just happened. We didn’t think much about it.”

Where in the World is Carmen Sandiego?

To play that first Carmen Sandiego game today can be just a bit of an underwhelming experience; there’s really not that much to it. Each crime in the series, along with the clues that lead you to its perpetrator, is randomly generated from the game’s database of 10 possible suspects, 30 cities, and 1000 or so clues. Starting in the home city of the stolen treasure in question, you have about five days to track down each suspect. Assuming you’re on the right track, you’ll get clues in each city as to the suspect’s next destination among the several possibilities represented by the airline connections from that city: perhaps he “wanted to know the price of tweed” or “wanted to sail on the Severn.” (Both of these clues would point you to Britain, more specifically to London.) If you make the right deductions each step of the way you’ll apprehend the suspect in plenty of time. You’ll know you’ve made the wrong choice if you wind up at a dead-end city with no further clues on offer. Your only choice then is to backtrack, wasting precious time in the process. The tenth and final suspect to track down is always Carmen Sandiego herself, who for all of her subsequent fame is barely characterized at all in this first installment. Capture her, and you retire to the “Detective Hall of Fame.” There’s a little bit more to it, like the way that you must also compile details of the suspect’s appearance as you travel so you can eventually fill out an arrest warrant, but not a whole lot. Any modern player with Wikipedia open in an adjacent window can easily finish all ten cases and win the game in a matter of a few hours at most. By the time you do, the game’s sharply limited arsenal of clues, cities, and stolen treasures is already starting to feel repetitive.

Which is not to say that Carmen Sandiego is entirely bereft of modern appeal. When my wife and I played it over the course of a few evenings recently, we learned a few interesting things we hadn’t known before and even discovered a new country that I at least had never realized existed: the microstate of San Marino, beloved by stamp and coin collectors and both the oldest and the smallest constitutional republic in the world. My wife is now determined that we should make a holiday there.

Still, properly appreciating Carmen Sandiego’s contemporary appeal requires of us a little more work. The logical place to start is with that huge World Almanac and Book of Facts that made the game’s box the heaviest on the shelves. It can be a bit hard even for those of us old enough to have grown up before the World Wide Web to recover the mindset of an era before we had the world in our living rooms — or, better said in this age of mobile computing, in our pockets. Back in those days when you had to go to a library to do research, when your choices of recreation of an evening were between whatever shows the dozen or so television stations were showing and whatever books you had in the house, an almanac was magic to any kid with a healthy curiosity about the world and a little imagination, what with its thousand or more pages filled with exotic lands along with records of deeds, buildings, cities, people, animals, and geography whose very lack of context only made them more alluring. The whole world — and then some; there were star charts and the like for budding astronomers — seemed to have been stuffed within its covers.

In that spirit, one could almost call the Carmen Sandiego game disk ancillary to the almanac rather than the other way around. Who knew what delights you might stumble over while you tried to figure out, say, in which country the python made its home? The World Almanac continues to come out every year, and seems to have done surprisingly well, all things considered, surviving the forces that have killed off its typical companions on the reference shelf, like the encyclopedia. But of course it’s lost much of its old magic in these days of information glut. While we can still recapture a little of the old feeling by playing Carmen Sandiego with a web browser open, our search engines have just gotten too good; it’s harder to stumble across the same sorts of crazy facts and alluring diversions.

Carmen Sandiego captured so many kids because it tempted them to discover knowledge for themselves rather than attempting to drill it into them, and all whilst never talking down to them. Gary Carlston said of Brøderbund’s edutainment philosophy, “If we would’ve enjoyed it at age 12, and if we still enjoy it now, then it’s what we want. Whether it’s pedagogically correct is not relevant.” Carmen Sandiego did indeed attract criticism from earnest educational theorists armed with studies showing how it failed to live up to the latest research on learning; this low-level drumbeat of criticism continues to this day. Some of it may very well be correct and relevant; I’m hardly qualified to judge. What I do see, though, is that Carmen Sandiego offers a remarkably progressive view of knowledge and education for its time. At a time when schools were still teaching many subjects through rote memorization of facts and dates, when math courses were largely “take this set of numbers and manipulate them to become this other set of numbers” without ever explaining why, Carmen Sandiego grasped that success in the coming world of cheap and ubiquitous data would require not a head stuffed with facts but the ability to extract relevant information from the flood of data that surrounds us, to synthesize it into conclusions, and to apply it to a problem at hand. While drill-and-practice software taught kids to perform specific tasks, Carmen Sandiego, like all the best edutainment software, taught them how to think. Just as importantly, it taught them how much fun doing so could be.

Where in the World is Carmen Sandiego

Brøderbund may not have been all that concerned about making Carmen Sandiego “pedagogically correct,” but they were hardly blind to the game’s educational value, nor to the marketing potential therein. The back cover alone of Carmen Sandiego is a classic example of edutainment marketing, emphasizing the adventure aspects for the kids while also giving parents a picture of children beaming over an almanac and telling how they will be “introduced to world geography” — and all whilst carefully avoiding the E-word; telling any kid that something is “educational” was and is all but guaranteed to turn her off it completely.

For all that, though, the game proved to be a slow burner rather than an out-of-the-gates hit upon its release in late 1985. It was hardly a flop; sales were strong enough that Brøderbund released the first of many sequels, Where in the USA is Carmen Sandiego?, the following year. Yet year by year the game just got more popular, especially when Brøderbund started to reach out more seriously to educators, releasing special editions for schools and sending lots of free swag to those who agreed to host “Carmen Days,” for which students and teachers dressed up as Carmen or her henchmen or the detectives on their trail, and could call in to the “Acme Detective Agency” at Brøderbund itself to talk with Portwood or Elliott playing the role of “the Chief.” The combination of official school approval, the game’s natural appeal to both parents and children, and lots of savvy marketing proved to be a potent symbiosis indeed. Total sales of Carmen Sandiego games passed 1 million in 1989 and 2 million in 1991, by which time the series included not only Where in the World is Carmen Sandiego? and Where in the USA is Carmen Sandiego? but also Where in Europe is Carmen Sandiego?, Where in Time is Carmen Sandiego?, Where in America’s Past is Carmen Sandiego?, and the strangely specific Where in North Dakota is Carmen Sandiego?, prototype for a proposed series of state-level games that never got any further; Where in Space is Carmen Sandiego? would soon go in the opposite direction, rounding out the original series of reference-work-based titles on a cosmic scale. In 1991 Carmen also became a full-fledged media star, the first to be spawned by a computer game, when Where in the World is Carmen Sandiego? debuted as a children’s game show on PBS.

A Print Shop banner: an artifact as redolent of its era as Hula Hoops or bellbottoms are of theirs.

Through the early 1980s, Brøderbund had been a successful software publisher, but not outrageously so in comparison to their peers. At mid-decade, though, the company’s fortunes suddenly began to soar just as many of those peers were, shall we say, trending in the opposite direction. Brøderbund’s success was largely down to two breakout products which each succeeded in identifying a real, compelling use for home computers at a time when that was proving far more difficult than the boosters and venture capitalists had predicted. One was of course the Carmen Sandiego line. The other was a little something called The Print Shop, which let users design and print out signs and banners using a variety of fonts and clip art. How such a simple, straightforward application could become so beloved may seem hard to understand today, but beloved The Print Shop most definitely became. For the rest of the decade and beyond, its distinctive banners, enabled by the fan-fold paper used by the dot-matrix printers of the day, could be seen everywhere that people without a budget for professional signage gathered: at church socials, at amateur sporting events, inside school hallways and classrooms. Like the first desktop-publishing programs that were appearing on the Macintosh contemporaneously, The Print Shop was one more way in which computers were beginning to democratize creative production, a process, as disruptive and fraught as it is inspiring, that’s still ongoing today.

In having struck two such chords with the public in the form of The Print Shop and Carmen Sandiego, Brøderbund was far ahead of virtually all of their competitors who failed to find even one. Brøderbund lived something of a charmed existence for years, defying most of the hard-won conventional wisdom about consumer software being a niche product at best and the real money being in business software. If the Carlstons hadn’t been so gosh-darn nice, one might be tempted to begrudge them their success. (Once when the Carlstons briefly considered a merger with Electronic Arts, whose internal culture was much more ruthless and competitive, a writer said it would be a case of the Walton family moving in with the Manson family.) One could almost say that for Brøderbund alone the promises of the home-computer revolution really did materialize, with consumers rushing to buy from them not just games but practical software as well. Tellingly — and assuming we agree to label Carmen Sandiego as an educational product rather than a game — Brøderbund’s top-selling title was never a game during any given year between 1985 and the arrival of the company’s juggernaut of an adventure game Myst in 1993, despite their publication of hits like the Jordan Mechner games Karateka and Prince of Persia. Carmen Sandiego averaged 25 to 30 percent of Brøderbund’s sales during those years, behind only The Print Shop. The two lines together accounted for well over half of yearly revenues that were pushing past $50 million by decade’s end — still puny by the standards of business software but very impressive indeed by that of consumer software.

For the larger software market, Carmen Sandiego — and, for that matter, The Print Shop — were signs that, if the home computer hadn’t quite taken off as expected, it also wasn’t going to disappear or be relegated strictly to the role of niche game machine: there were, or at least with a bit more technological ripening could be, good reasons to own one. The same year that Brøderbund pushed into edutainment with Carmen Sandiego, MECC, who had reconstituted themselves as the for-profit (albeit still state-owned) publisher Minnesota Educational Computing Corporation in 1984, released the definitive, graphically enhanced version of that old chestnut The Oregon Trail, a title which shared with Carmen Sandiego an easygoing, progressive, experiential approach to learning. Together Oregon and Carmen became the twin icons of 1980s edutainment, still today an inescapable shared memory for virtually everyone who darkened a grade or middle school door in the United States between about 1985 and 1995.

The consequences of Carmen and Oregon and the many other programs they pulled along in their wake were particularly pronounced for the one remaining viable member of the old trinity of 1977: the Apple II. Lots of people both outside and inside Apple had been expecting the II market to finally collapse for several years already, but so far that had refused to happen. Apple, whose official corporate attitude toward the II had for some time now been vacillating between benevolent condescension and enlightened disinterest, did grant II loyalists some huge final favors now. One was the late 1986 release of the Apple IIGS, a radically updated version produced on a comparative shoestring by the company’s dwindling II engineering team with assistance from Steve Wozniak himself. The IIGS used a 16-bit Western Design Center 65C816 CPU that was capable of emulating the old 8-bit 6502 when necessary but was several times as powerful. Just as significantly, the older IIs’ antiquated graphics and sound were finally given a major overhaul that now made them amongst the best in the industry, just a tier or two below those of the current gold standard, Commodore’s new 68000-based Amiga. The IIGS turned out to be a significant if fairly brief-lived hit, outselling the Macintosh and all other II models by a considerable margin in its first year.

But arguably much more important for the Apple II’s long-term future was a series of special educational offers Apple made during 1986 and 1987. In January of the former year, they announced a rebate program wherein schools could send them old computers made by Apple or any of their competitors in return for substantial rebates on new Apple IIs. In April of that year, they announced major rebates for educators wishing to purchase Apple IIs for home use. Finally, in March of 1987, Apple created two new programs, the Apple Unified School System and the Apple Education Purchase Program, which together represented a major, institutionalized outreach and support effort designed to get even more Apple IIs into schools (and, not incidentally, more Macs into universities). The Apple II had been the school computer of choice virtually from the moment that schools started buying PCs at all, but these steps along with software like Carmen Sandiego and The Oregon Trail cemented and further extended its dominance, to the extent that many schools and families simply refused to let go of it. The bread-and-butter Apple II model, the IIe, remained in production until November of 1993, by which time this sturdy old machine, thoroughly obsolete already by 1985, was selling almost exclusively to educators and Apple regarded its continued presence in their product catalogs like that of the faintly embarrassing old uncle who just keeps showing up for every Thanksgiving dinner.

Even after the inevitable if long-delayed passing of the Apple II as a fixture in schools, Carmen and Oregon lived on. Both received the requisite CD-ROM upgrades, although it’s perhaps debatable in both instances how much the new multimedia flash really added to the experience. The television Carmen Sandiego game shows also continued to air in various incarnations through the end of the decade. Carmen Choose Your Own Adventure-style gamebooks, conventional young-adult novels, comic books, and a board game were also soon on offer, along with yet more computerized creations like Carmen Sandiego Word Detective. Only with the millennium did Carmen — always a bit milquetoast as a character and hardly the real source of the original games’ appeal — along with The Oregon Trail see their stars finally start to fade. Both retain a certain commercial viability today, but more as kitschy artifacts and nostalgia magnets than serious endeavors in either learning or entertainment. Educational software has finally moved on.

Perhaps not enough, though: it remains about 10 percent inspired, 10 percent acceptable in a workmanlike way, and 80 percent boredom stemming sometimes from well-meaning cluelessness and sometimes from a cynical desire to exploit parents, teachers, and children. Those looking to enter this notoriously underachieving field today could do worse than to hearken back to the simple charms of Carmen Sandiego, created as it was without guile and without reams of pedagogical research to back it up, out of the simple conviction that geography could actually be fun. All learning can be fun. You just have to do it right.

(See Engineering Play by Mizuko Ito for a fairly thorough survey of educational and edutainment software from an academic perspective. Gamers at Work by Morgan Ramsay has an interview with Doug and Gary Carlston which dwells on Carmen Sandiego at some length. Matt Waddell wrote a superb history of Carmen Sandiego for a class at Stanford University in 2001. A piece on Brøderbund on the eve of the first Carmen Sandiego game’s release was published in the September 1985 issue of MicroTimes. A summary of the state of Brøderbund circa mid-1991 appeared in the July 9, 1991, New York Times. Joseph Weizenbaum’s comments appeared in the July 1984 issue of Byte. The first use of the term “edutainment” that I could locate appeared in a Milliken Publishing advertisement in the January 1983 issue of Creative Computing. Articles involving Spinnaker and Tom Snyder appeared in the June 1984 Ahoy! and the October 1984 and December 1985 Compute!’s Gazette. And if you got through all that and would like to experience the original Apple II Carmen Sandiego for yourself, feel free to download the disk images and manual — but no almanac I’m afraid — from right here.)

 
 

Tags: , ,

Shadowkeep

Shadowkeep

The story of Shadowkeep, an even odder duck in the Telarium lineup than Amazon, begins with Sigma Distributing, one of the first big microcomputer hardware distributors in the Seattle area. In 1981 Christopher Anson, a Sigma vice president, sought and received permission to start a new subsidiary to develop original games to serve the growing demand for software for the computers Sigma was selling. Anson’s first two acts were to name the company Ultrasoft and to hire a programmer named Alan Clark away from Boeing. Clark became the technical architect of the set of tools and approaches that would define Ultrasoft during their brief existence.

Anson had decided that the best place for Ultrasoft to make a splash was in the field of illustrated adventure games, a nexus of excitement in the wake of Mystery House and The Wizard and the Princess. Like Scott Adams, Ken Williams, and Marc Blank before him, Clark realized that it would be more efficient in the long run to write an adventure-game engine and language that could give designers a bit of distance from the technical details of implementation as well as let Ultrasoft deploy their games to multiple platforms relatively painlessly. Deciding that a little corporate branding is always in order, they named their adventure programming language simply Ultra; the interpreter for each targeted platform, UltraCode; and their graphics system, UltraVision. From the standpoint of the end user, Ultrasoft’s most obvious innovation — or, if you like, gimmick — involved this last. UltraVision could display not just static pictures but also brief animations, which could be used to, say, show the player’s avatar actually walking from room to room. Less obvious but no less significant, however, was the parser, one of the first developed outside of Infocom that allowed more than two words — although, it should be said, it was nowhere near as impressive a creation overall (more on that shortly).

Ultrasoft developed two adventures using the system, The Mask of the Sun (1982) and Serpent’s Star (1983). Both are interesting in their way, more carefully crafted, atmospheric, and thoughtful than was the norm of the time. Serpent’s Star in particular does a surprisingly good job of matching its puzzles to its theme of Buddhist philosophy. But both — and particularly Mask of the Sun — are also riddled with the sorts of unfair elements that were all too typical of their era. And both are fairly excruciating to play under any conditions. Whatever its other merits, you see, the Ultra system is slow. A quick look at the games’ technical underpinnings gives a clue as to why: the Ultra program logic doesn’t appear to be compiled at all, merely interpreted in place. The only blessing of that approach was that it enabled some frustrated adventurers to find the solutions to the more incomprehensible puzzles by code diving.

Ultrasoft first tried to market and distribute Mask of the Sun and Serpent’s Star on their own, but found it tough sledding for a tiny company with mainly regional connections in the professionalizing software industry of 1983. They soon accepted the role of developer only, licensing both games to Brøderbund for publication. After spending most of 1983 working on an ambitious new game, an adventure-game/CRPG hybrid called Shadowkeep written in a new version of their system which they dubbed Ultra II, they found a publisher for it in Spinnaker, who took the largely completed game as a future member of their planned Telarium line.

Looking to find some way to make the game fit with the bookware theme of the line as a whole, Spinnaker came up with the idea of reversing the usual process, of hiring a name writer to adapt the game into a book rather than the opposite. Luckily, they had substantial time to get the novelization done; they signed the contract with Ultrasoft in late 1983, a year before they planned to launch the Telarium line. Spinnaker approached Warner Books, who hooked them up with the reigning king of media tie-in novels, Alan Dean Foster. After making his reputation within the industry ghost-writing the Star Wars novelization for George Lucas, Foster had gone on to do The Black Hole, Clash of the Titans, The Last Starfighter, and Alien among many others. Virtually every big science-fiction or fantasy movie seemed to arrive with an accompanying Alan Dean Foster novelization. (That’s still true today; his most recent novelization as of this writing is of Star Trek Into Darkness.)

Spinnaker furnished Foster with design documents and a copy of the Shadowkeep source code and let him have at it. Those were all he had to go on; he doesn’t recall ever meeting or speaking with anyone from the design team, nor ever actually playing the game. He does, however, recall it as a very challenging project indeed. Being a party-based CRPG in the mold of Wizardry, Shadowkeep has no actual protagonist to speak of — no characters at all, really, outside of the fellow who sells you equipment and the evil demon Dal’Brad who shows up for the final showdown on the last of the dungeon levels. He was thus forced to invent the vast majority of the novel himself, whilst struggling to strike a balance between writing some recognizable analogue to the experience of the game and giving away all of its challenges. He did his usual workmanlike job, handing in a readable little genre exercise for simultaneous release with the game. Tellingly, it’s not until halfway through the book that the heroes enter the Shadowkeep — i.e., reach the beginning of the game.

Shadowkeep

Said game is… well, it’s really strange. Imagine Wizardry played with a text parser, and you’ve pretty well summed up Shadowkeep. You make a party of up to nine(!) characters. Most of the usual is here: attribute scores, classes and races to choose from, ever-better equipment and spells to collect. Oddly missing, however, are character levels and any concept of experience; getting more powerful in this game is strictly a matter of finding or buying better stuff. The dungeon levels are the usual 16 × 16 grids full of traps, monsters, and assorted cartographic challenges. There are some original ideas here. For instance, the positions of the monsters that attack you and those of the members of your party are taken into account to a degree not found in Wizardry, adding some strategic depth to the experience. You likewise have more combat options than in Wizardry; in each round you can choose to forget defense and attack twice, or to just parry, or to attack once while not totally neglecting defense. And certainly the full-color graphics, which feature occasional examples of Ultrasoft’s trademark animations, are much better than Wizardry‘s wire frames.

Shadowkeep

Shadowkeep

Still, Shadowkeep mostly just makes you appreciate all the more how well Wizardry does the dungeon crawl. The game replaces Wizardry‘s hot-key interface with, yes, a text-adventure parser. You literally just type what you want to do: “OPEN DOOR,” “GET THE TORCH,” “CAST THE LUMINANCE SPELL,” “LIGHT THE TORCH AND PREPARE THE SWORD,” “PUT THE WAND OF TRAVEL IN THE CHEST.” Sounds fine, right? Well, what sounds fine in the abstract doesn’t work so well in practice. You must now type “F <RETURN>” (for “FORWARD”) instead of just “F” every time you want to walk forward a square in the dungeon. This may seem a minor thing, but consider that you’ll be entering this command thousands and thousands of times in the course of playing the game. That extra keystroke thus means thousands and thousands of extra keystrokes. And that’s the tip of the iceberg; this game is death by a hundred such small cuts. Commands by default are carried out by the leader of your party, who is not even a character you select but merely the one with the highest Leadership attribute score. Having someone else do something requires that you prepend her name to the command (“NAOMI GET THE TORCH AND GIVE IT TO REB”) — yet more tedious typing.
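The command-routing scheme described above can be sketched in a few lines. This is a hypothetical reconstruction, not Shadowkeep’s actual code — the character names, attribute values, and function names are all invented for illustration:

```python
# Hypothetical sketch of Shadowkeep-style party-command routing:
# a command defaults to the party leader (the character with the
# highest Leadership score) unless it begins with a character's name.

party = {"naomi": {"leadership": 14}, "reb": {"leadership": 9}}

def route_command(command, party):
    """Return (actor, rest_of_command). If the first word names a
    party member, she acts; otherwise the leader does."""
    words = command.lower().split()
    if words and words[0] in party:
        return words[0], " ".join(words[1:])
    leader = max(party, key=lambda name: party[name]["leadership"])
    return leader, " ".join(words)

print(route_command("GET THE TORCH", party))      # ('naomi', 'get the torch')
print(route_command("REB GET THE TORCH", party))  # ('reb', 'get the torch')
```

Note how the player never chooses the default actor at all — the Leadership attribute does — which is exactly why every command aimed at anyone else demands that extra tedious typing.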

And the parser, that focal point of the whole interface, is at least as exasperating as the mainline Telarium parser. Like Byron Preiss Video Productions and many others at this time, Ultrasoft chose to take a profoundly misguided approach to this most critical piece of their engine. As described in an article in Softline magazine:

Ultrasoft’s parser is based on concepts in artificial intelligence. In any given message, it eliminates words that don’t make sense and attempts to make sense out of words that are relevant to the situation. This method frees the player from the verb-noun format of the typical adventure’s input.

In other words, the parser pretends to be smarter than it is by simply throwing out anything it doesn’t understand and doing what it can with the rest. This approach may “free the player from the verb-noun format,” but it also guarantees that complex (and often not so complex) inputs will be not just rejected — which combined with a proper error message is at least a form of useful feedback — but misunderstood. Far from making the parser seem smarter, this just makes it seem that much dumber and that much more infuriating. It leads to situations like that in the Byron Preiss games where any input containing the word “LOOK” anywhere within it causes the parser to dump everything else and print a room description. In Shadowkeep, typing “NAOMI CAST CURE SPELL ON REB” leads her to cast it away into the ether — that “ON REB” was a bridge too far, and thus ignored. Such a system fails to recognize that at least 95% of the time those extra words are not just stuff the player tacked on for the hell of it (who wants to type more than necessary under any circumstances?) but essential information about what she really wants to do.
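The difference between the two philosophies is easy to see in miniature. The sketch below is not Ultrasoft’s actual parser — the vocabulary, function names, and example command are all invented — but it illustrates the “discard what you don’t understand” strategy against the alternative of rejecting the input with a useful error:

```python
# Hypothetical sketch of the "throw out unknown words" parsing strategy
# described above, contrasted with a parser that fails loudly instead.

KNOWN_WORDS = {"cast", "cure", "spell", "get", "torch", "open", "door"}
FILLER = {"the", "a", "an"}

def lenient_parse(command):
    """Silently drop filler and any unrecognized words, then act on
    what's left. This is how 'CAST CURE SPELL ON REB' loses its
    target: 'on' and 'reb' aren't in the vocabulary, so they vanish."""
    words = [w for w in command.lower().split() if w not in FILLER]
    return [w for w in words if w in KNOWN_WORDS]

def strict_parse(command):
    """Reject the whole input when any word is unrecognized, telling
    the player exactly which word caused the failure."""
    words = [w for w in command.lower().split() if w not in FILLER]
    for w in words:
        if w not in KNOWN_WORDS:
            raise ValueError(f"I don't know the word '{w}'.")
    return words

# The lenient parser silently mangles the player's intent:
print(lenient_parse("CAST CURE SPELL ON REB"))  # ['cast', 'cure', 'spell']
```

The strict version at least hands the player a fact she can act on — which word to rephrase — where the lenient version cheerfully executes something she never asked for.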

To play Shadowkeep is to constantly wrestle with the interface. After playing several hours there are basic tasks I still haven’t figured out how to do — like how to cast a cure spell on someone outside of combat, or how to just get a list of the spells a certain character knows. And, like Ultrasoft’s earlier games, Shadowkeep is slow. Every step in the dungeon seems to take an eternity, and as for more complex action… forget about it. Playing is like wading through molasses with shackled feet.

The rewards for all the parsing pain are relatively slight: a handful of logic- or object-oriented puzzles on each level that can perhaps be a bit more complex than they could be under the Wizardry engine. Needless to say, they aren’t worth the rest of the trouble, making Shadowkeep something of a lowlight in the long, chequered history of adventure/CRPG hybrids. Which is a shame, because Shadowkeep‘s dungeon levels do show evidence of some careful craftsmanship and, as noted above, there are some good, original ideas on display here. Shadowkeep is a perfect example of a potentially worthy game destroyed by horrid interface choices. And I mean that literally: if the game isn’t outright unplayable (some patient souls have apparently played and even enjoyed it), it’s closer than I ever need to come to that adjective.

Ultrasoft was already in the process of fading quietly away by the time of Shadowkeep‘s late 1984 release. They never managed to port the Ultra II engine beyond the Apple II, leaving Shadowkeep without that all-critical Commodore 64 version. Spinnaker toyed with doing the port themselves, even announcing it as coming soon on various occasions, but I see no reason to believe that ever happened. (A Commodore 64 version has been a semi-mythical white whale in collecting circles for many years now, but, despite some anecdotal claims and remembrances, no one has ever produced an actual working version to my knowledge.) The lack of a Commodore 64 version and the underwhelming nature of the game itself combined to make Shadowkeep the least successful — and, today, rarest — of all the Telarium games. Alan Dean Foster’s book, while no bestseller itself, appears to have sold far more copies than the game did, on the strength of the author’s name recognition and its $3 (as opposed to $35) price tag.

Shadowkeep consists, like most of the Telarium games, of four disk sides. In this case, however, all four sides are written to during play to preserve the current state of the dungeon levels; the player is expected to copy her originals before beginning. Most of the copies floating around the Internet contain the residue of the previous players in their dungeons. Thankfully, however, reader Peter Ferrie has provided me (and thus you) with a completely pristine set just waiting for you and only you to leave your marks upon them. If whilst playing Wizardry or Bard’s Tale you thought to yourself that this game would be even better if it played a lot slower and had a parser, you’ve just found your dream CRPG. All others should consider this one a subject for historical research only.

And on that less than stellar note we’ll be moving on from Telarium for a while. My final reckoning of their first five releases is: two worthy efforts (Dragonworld and Amazon); one could-have-been-a-contender (Fahrenheit 451); and two total misfires (Rendezvous with Rama and Shadowkeep). Not a horrible track record on the whole. We’ll see if they learned any lessons in time for their last few games down the road a ways. But next it’s time to get back to the big boys in the field, and tell the rest of the story of Infocom’s very eventful 1984.

(In addition to the sources listed in my first article on bookware and Telarium, I also referenced for this article a feature on Ultrasoft in the May/June 1983 Softline. And thanks to Alan Dean Foster for taking the time to share his memories of the Shadowkeep project with me.)

 
 

Tags: , , ,