
Author Archives: Jimmy Maher

This Game Is Over

Before the famous Videogame Crash of 1983 there was the Videogame Crash of 1976. By that year Atari’s Pong had been in arcades for four years, along with countless ball-bouncing variants: Handball, Hockey, Pin Pong, Dr. Pong, and of course Breakout. The public was already growing bored with all of them, as well as with the equally simplistic driving and shooting games that made up the rest of arcade fare. As videogame revenues declined, pinball, the form they were supposed to have superseded, started to make a comeback. Even Atari themselves started a pinball division, as manufacturers began applying some of the techniques they’d learned in videogames to a new generation of electromechanical pinball tables that rewarded players with lots of sounds, flashing lights, and high-score leaderboards. When Atari introduced its VCS home-game console in October of 1977, sales were predictably sluggish. Then, exactly one year later, Space Invaders arrived.

Developed by the Japanese company Taito and manufactured and sold in North America under license by Midway, Space Invaders had the perfect theme for a generation of kids entranced with Star Wars and Close Encounters. Its constant, frenetic action and, yes, the violence of its scenario also made it stand out markedly from comparatively placid games like Pong and Breakout. Space Invaders became the exemplar of videogames in general, the first game the general public thought of when one mentioned the form. With coin-operated arcade games suddenly experiencing a dramatic revival, sales of the Atari VCS also began to steadily increase. Thanks to a very good holiday season, sales for 1979 hit 1 million.

However, the real tipping point that would eventually result in Atari VCSs in more than 15% of U.S. homes came when Manny Gerard and Ray Kassar, Atari’s vice president and president respectively, negotiated a deal with their ostensible rivals Taito and Midway to make a version of Space Invaders for the VCS. Kassar is known today as the man who stifled innovation at Atari and mistreated his programmers so badly that the best of them decided to form their own company, Activision. Still, his marketing instinct at this moment was perfect. Kassar predicted that Space Invaders would not only be a huge hit with the VCS’s existing owners, but that it would actually sell consoles to people who wanted to play their arcade favorite at home. He was proven exactly right upon the VCS Space Invaders’s release in January of 1980. The VCS, dragged along in the wake of the game, doubled its sales in 1980, to 2 million units.

Atari took the lesson of Space Invaders to heart. Instead of investing energy into original games with innocuously descriptive titles like Basketball, Combat, and Air Sea Battle, as they had done for the first few years of the VCS, they now concentrated on licensing all of the big arcade hits. Atari had learned an important lesson: that the quantity and quality of available software is more important to a platform than the technical specifications of the platform itself. This fact would allow the Atari VCS to dominate the console field for years despite being absurdly primitive in comparison to competition like the Intellivision and the Vectrex.

Apple was learning a similar lesson at this time in the wake of the fortuitous decision that Dan Bricklin and Bob Frankston made to first implement VisiCalc on the Apple II. Indeed, one could argue that the survivors from the early PC industry — companies like Apple and, most notably, Microsoft — were the ones that got the supreme importance of software, while those who didn’t — companies like Commodore, Radio Shack’s computer division, and eventually Atari itself — were the ones ultimately destined for the proverbial dustbin of history. Software like VisiCalc provided an answer to the question that had been tripping up computer hobbyists for years when issued from the mouths of wives, girlfriends, and parents: “But what can you really do with it?” A computer that didn’t have a good base of software, no matter how impressive its hardware, wasn’t much use to the vast majority of the public who weren’t interested in writing their own programs.

With all this in mind, let’s talk about computer games (as opposed to console games) again. We can divide entertainment software in these early years into two broad categories, only one of which I’ve so far concerned myself with in this blog. I’ve been writing about the cerebral branch of computer gaming, slow-paced works inspired by the tabletop-gaming and fiction traditions. These are the purest of computer games, in that they existed only on PCs and, indeed, would have been impossible on the game consoles of their day. They depend on a relatively large memory to hold their relatively sophisticated world models (and, increasingly, disk storage to increase the scope of possibility thanks to virtual memory); a keyboard to provide a wide range of input possibilities; and the ability to display text easily on the screen to communicate in relatively nuanced ways with their players.

The other category consists of arcade-style gameplay brought onto the PC. With the exception of the Atari 400 and 800, none of the earliest PCs were terribly suited to this style of game, lacking sprites and other fast-animation technologies and often even appropriate game controllers. Yet with the arcade craze in full bloom, these games became very, very popular. Even the Commodore PET, which lacked any bitmapped graphics mode at all, had a version of Breakout implemented entirely in “text” using the machine’s extended ASCII character set.

On a machine like the Apple II, which did have bitmapped graphics, such games were even more popular. Nasir Gebelli and Bill Budge were the kings of the Apple II action game, and as such were known by virtually every Apple II hobbyist. Even Richard Garriott, programmer of a very different sort of game, was so excited upon receiving that first call from California Pacific about Akalabeth because CP was, as everyone knew, the home of Budge. If Computer Gaming World is to be believed, it was not Zork or Temple of Apshai or Wizardry that was the bestselling Apple II game of all time in mid-1982, but rather K-Razy Shootout, a clone of the arcade game Berzerk. They may have sold in minuscule numbers compared to their console counterparts and may not have always looked or played quite as nicely, but arcade-style games were a big deal on PCs right from the start. When the Commodore VIC-20 arrived, perched as it was in some tenuous place between PC and game console, the trend only accelerated.

You may have noticed a theme in my discussion of these games in this post and in a previous post: many of these games were, um, heavily inspired by popular coin-operated arcade games. In the earliest days, when the PC-software industry was truly minuscule and copyright still a foreign concept to many programmers, many aspired to make unabashed clones of the latest arcade hits, down to the name itself. By 1980, however, this approach was being replaced by something at least a little more subtle, in which programmers duplicated the gameplay but changed the title and (sometimes, to some extent) the presentation. It should be noted that not all PC action-game programmers were cloners; Gebelli and Budge, for instance, generally wrote original games, and perhaps therein lies much of their reputation. Still, clones were more the rule than the exception, and by 1981 the PC software industry had grown enough for Atari to start to notice — and to get pissed off about it. They took out full-page advertisements in many of the big computer magazines announcing “PIRACY: THIS GAME IS OVER.”

Some companies and individuals have copied Atari games in an attempt to reap undeserved profits from games that they did not develop. Atari must protect its investment so that we can continue to invest in new and better games. Accordingly, Atari gives warning to both the intentional pirate and to the individuals simply unaware of the copyright laws that Atari registers the audiovisual works associated with its games with the Library of Congress and considers its games proprietary. Atari will protect its rights by vigorously enforcing these copyrights and by taking the appropriate action against unauthorized entities who reproduce or adapt substantial copies of Atari games, regardless of what computer or other apparatus is used in their performance.

In referring to cloning as “piracy,” Atari is conflating two very separate issues, but they aren’t doing so thoughtlessly — there’s a legal strategy at work here.

Literally from the dawn of the PC era, when Bill Gates wrote his famous “Open Letter to Hobbyists,” software piracy was recognized by many in the industry as a major problem, a problem that some even claimed could kill the whole industry before it got properly started. Gates considered his letter necessary because the very concept of commercial software was a new thing, as new as the microcomputer itself. Previously, programs had been included with hardware and support contracts taken out with companies like IBM and DEC, or traded about freely amongst students, hackers, and scientists on the big machines. In fact, it wasn’t at all clear that software even could be copyrighted. The 1909 Copyright Act that was still in effect when Gates wrote his letter in January of 1976 states that to be copyrightable a work must be “fixed in a tangible medium of expression.” One interpretation of this requirement holds that an executable computer program, since it lives only electronically within the computer’s memory, fails the tangibility test. The Copyright Act of 1976, a major amendment, failed to really clarify the situation. Astonishingly, it was only with the Computer Software Copyright Act of 1980 that it was made unambiguously clear that software was copyrightable in the same way as books and movies and that, yes, all those pirates were actually doing something illegal as well as immoral.

But there was still some confusion about exactly what aspect of a computer program was copyrightable. When we’re talking about copyright on a book, we’re obviously concerned with the printed words on the page. When we’re talking about copyright on a film, we’re concerned with the images that the viewer sees unspooling on the screen and the sounds that accompany them. A computer program, however, has both of these aspects. There’s the “literary” side, the code to be run by the computer, which in many cases takes two forms, the source code written by the programmer and the binary code that the computer actually executes after the source has been fed through an assembler or compiler. And then there’s the “filmic” side, the images that the viewer sees on the screen before her and the sounds she hears. The 1980 law defines a computer program as a “set of statements or instructions to be used directly or indirectly in a computer in order to bring about a certain result.” Thus, it would seem to extend protection to source and executable code, but not to the end experience of the user.

Such protection was not quite enough for Atari. They therefore turned to a court case of 1980, Midway vs. Dirkschneider. Dirkschneider was a small company that essentially did in hardware what many PC programmers were doing in software, stamping out unauthorized clones of games from the big boys like Atari and Midway, then selling them to arcade operators at a substantial discount on the genuine article. When they started making their own version of Galaxian, one of Midway’s most popular games, under the name Galactic Invader, Midway sued them in a Nebraska court. The judge in that case ruled in favor of the plaintiff, on the basis of a new concept that quickly became known as the “ten-foot rule”: “If a reasonable person could not, at ten feet, tell the difference between two competitive products, then there was cause to believe an infringement was occurring.”

So, in conflating pirates who illegally copied and traded software with cloners who merely copied the ideas and appearance of others’ games, implementing them using entirely original code, Atari was attempting to dramatically expand the legal protections afforded to software. The advertisement is also, of course, a masterful piece of rhetoric meant to tar said cloners with the same brush of disrepute used for the pirates, who were criticized in countless hand-wringing editorials in the exact same magazines in which Atari’s advertisement appeared. All of this grandstanding moved out of the magazines and into the courts in late 1981, via the saga of Jawbreaker.

The big arcade hit of 1981 was Pac-Man. In fact, calling Pac-Man merely “big” is considerably underestimating the matter. The game was a full-fledged craze, dwarfing the popularity of even Space Invaders. Recent studies have shown Pac-Man to still be the most recognizable videogame character in the world, which by extension makes Pac-Man easily the most famous videogame ever created. Like Space Invaders, Pac-Man was an import from Japan, created there by Namco and distributed, again like Space Invaders, by Atari’s arch-rival of the standup-arcade world, Midway. Said rivalry did not, however, prevent the companies from working out a deal to get Pac-Man onto the Atari VCS. It was to be released just in time for Christmas 1981, and promised to be the huge VCS hit of the season. Kassar and his cronies rubbed their hands in anticipation, imagining the numbers it would sell — and the number of VCSs it would also move as those who had been resistant so far finally got on the bandwagon.

Yet long before the big release day came, John Harris, Ken Williams’s star Atari 400 and 800 programmer at On-Line Systems, had already written a virtually pixel-perfect clone of the game after obsessively studying it in action at the local arcade. Ken took one look and knew he didn’t dare release it. Even leaving aside Atari’s aggressive attempts to expand the definition of software “piracy,” the Pac-Man character himself was trademarked. Releasing the game as-is risked lawsuits from multiple quarters, all much larger and richer in lawyers than On-Line Systems. The result could very well be the destruction of everything he had built. Yet, the game was just so damn good. After discussing the problem with others, Ken told Harris to go home and redo the game’s graphics to preserve the gameplay but change the theme and appearance. Harris ended up delivering a bizarre tribute to the seemingly antithetical joys of candy and good dental hygiene. Pac-Man became a set of chomping teeth; the dots Life Savers; the ghosts jawbreakers. Every time the player finished a level, an animated toothbrush came out to brush her avatar’s teeth. None of it made a lot of sense, but then the original Pac-Man made, if anything, even less. Ken put it out there. It actually became On-Line’s second Pac-Man clone; another one called Gobbler was already available for the Apple II.

Meanwhile Atari, just as they had promised in that advertisement, started coming down hard on Pac-Man cloners. They “persuaded” Brøderbund Software to pull Snoggle for the Apple II off the market. They “convinced” a tiny publisher called Stoneware not to even release theirs, despite having already invested money in packaging and advertising. And they started calling Ken.

The situation between On-Line and Atari was more complicated than the others. Jawbreaker ran on Atari’s own 400 and 800 computers rather than the Apple II. On the one hand, this made Atari even more eager to stamp it out of existence, because they themselves had belatedly begun releasing many of their bestselling VCS titles (a group sure to include Pac-Man) in versions for the 400 and 800. On the other hand, though, this represented an opportunity. You see, Harris had naively given away some copies of his game back when it was still an unadulterated Pac-Man. Some of these (shades of Richard Garriott’s experience with California Pacific) had made it all the way to Atari’s headquarters. Thus their goals were twofold: to stamp out Jawbreaker, but also if possible to buy this superb version of Pac-Man to release under their own imprint. Unfortunately, Harris didn’t want to sell it to them. He loved the Atari computers, but he hated the company, famous by this time for their lack of respect for the programmers and engineers who actually built their products. (This lack of respect was such that the entire visionary team that had made the 400 and 800 had left the company by the time the machines made it into stores.)

At the center of all this was Ken, the very picture of a torn man. He wasn’t the sort who accepted being pushed around, and Atari was trying to do just that, threatening him with all kinds of legal hellfire. Yet he also knew that, well, they kind of had a point; if someone did to one of his games what On-Line was doing to Pac-Man, he’d be mad as hell. Whatever the remnants of the hippie lifestyle that hung around On-Line along with the occasional telltale whiff of marijuana smoke, Ken didn’t so much dream of overthrowing the man as joining him, of building On-Line into a publisher to rival Atari. He wasn’t sure he could get there by peddling knockoffs of other people’s designs, no matter how polished they were.

Thanks largely to Ken’s ambivalence, the final outcome of all this was, as tends to happen in real life, somewhat anticlimactic. On-Line defied Atari long enough to get dragged into court for a deposition, at which Atari tried to convince the judge to grant a preliminary injunction forcing On-Line to pull Jawbreaker off the market pending a full trial. The judge applied the legal precedent of the ten-foot rule, and, surprisingly, decided that Jawbreaker looked different enough from Pac-Man to refuse Atari’s motion. You can judge for yourself: below is a screenshot of the original arcade Pac-Man paired with one of Jawbreaker.

Atari’s lawyers were reportedly stunned at the rejection, but still, Ken had no real stomach for this fight. He walked out of the courtroom far from triumphant: “If this opens the door to other programmers ripping off my software, what happened here was a bad thing.” Shortly after, he called Atari to see if they couldn’t work something out to keep Jawbreaker on the market but share the wealth.

Right on schedule, Atari’s own infamously slapdash implementation of Pac-Man appeared just in time for Christmas. It moved well over 7 million units to consumers who didn’t seem to care a bit that the ghosts flickered horribly and the colors were all wrong. The following year, On-Line and Harris developed a version of the now authorized Jawbreaker for the Atari VCS, publishing it through a company called Tigervision. It didn’t sell a fraction of what its inferior predecessor had sold, of course, but it did represent a change in the mentality of Ken and his company. Much of the fun and craziness continued, but they were also becoming a “real” company ready to play with the big boys like Atari — with all the good and bad that entails.

Similar changes were coming to the industry as a whole. Thanks to Atari’s legal muscling, blatant clones of popular arcade games dried up. The industry was now big enough to attract attention from outside its own ranks, with the result that intellectual property was starting to become a big deal. Around this time Edu-Ware got sued for its Space games that were a little bit too inspired by Game Designers’ Workshop’s Traveller tabletop RPG; they replaced them with a new series in the same spirit called Empire. Scott Adams got threatened with a lawsuit of his own over Mission Impossible Adventure, and in response changed the name to Secret Mission.

Indeed, 1981 was the year when the microcomputer industry as a whole went fully and irrevocably professional, as punctuated by soaring sales of VisiCalc and the momentous if belated arrival of IBM on the scene. That’s another story we really have to talk about, but later. Next time, we’ll see how the two broad styles of computer gaming met one another in a single game for the first time.

(My most useful sources in writing this post were an article by Al Tommervik in the January 1982 Softline and Steven Levy’s Hackers.)

 


Computers for the Masses

The company that would eventually become Commodore International was formed in 1958 as an importer and assembler of Czechoslovakian portable typewriters for Canada and the northeastern United States. Its founder was a Polish immigrant and Auschwitz survivor named Jack Tramiel. Commodore first made the news as a part of the Atlantic Acceptance scandal of 1965, in which one of Canada’s largest savings and loans suddenly and unexpectedly collapsed. When the corpse was dissected, a rotten core of financial malfeasance, much of it involving its client Commodore, was revealed. It seems that Tramiel had become friends with the head of Atlantic, one C.P. Morgan, and the two had set up some mutually beneficial financial arrangements that were not, alas, so good for Atlantic Acceptance as a whole. Additionally, it appears that Tramiel likely lied under oath and altered documents to try to obscure the trail. (The complicated details of all this are frankly beyond me; Zube dissects it all at greater length on his home page, for those with better financial minds than mine.) The Canadian courts were plainly convinced of Tramiel’s culpability in the whole sorry affair, but ultimately decided they didn’t have enough hard evidence to prosecute him. A financier named Irving Gould rescued Tramiel and his scandal-wracked company from a richly deserved oblivion. Commodore remained alive and Tramiel remained in day-to-day control, but thanks to his controlling investment Gould now had him by the short hairs.

Tramiel and Gould would spend almost two decades locked in an embrace of loathing codependency. Tramiel worked like a demon, seldom taking a day off, fueled more by pride and spite than greed. Working under his famous mantra “Business is War,” he seemed to delight in destroying not only the competition but also suppliers, retailers, and often even his own employees when they lost favor in his eyes. Gould was a more easygoing sort. He put the money Tramiel earned him to good use, maintaining three huge homes in three countries, a private yacht, a private jet, and lots of private girlfriends. His only other big passion was tax law, which he studied with great gusto in devising schemes to keep the tax liability of himself and his company as close to zero as possible. (His biggest coup in that department was his incorporation of Commodore in the Bahamas, even though they had no factories, no employees, and no product for sale there.) Some of his favorite days were those in which Tramiel would come to him needing him to release some capital from his private stash to help him actually, you know, run a proper business, with a growth strategy and research and development and all that sort of thing. Gould would toy with him a bit on those occasions, and sometimes even give him what he wanted. But usually not. Better for Tramiel to pay for it out of his operating budget; Gould needed his pocket money, after all.

Commodore’s business over the next decade changed its focus from the manufacturing of typewriters and mechanical adding machines to a new invention, the electronic calculator, with an occasional sideline in, of all things, office furniture. They also built up an impressive distribution network for their products around the world, particularly in Europe. Indeed, Europe, thanks to well-run semi-independent spinoffs in Britain and West Germany, became the company’s strongest market. Commodore remained a niche player in the U.S. calculator market, but in Europe they became almost a household name. Through it all Commodore’s U.S. operation, the branch that ultimately called the shots and developed the product line, retained an ever-present whiff of the disreputable. One could quickly sense that this company just wasn’t quite respectable, that in most decisions quick and dirty was likely to win out over responsible and ethical. Which is not, I need to carefully emphasize, to cast aspersions on the many fine engineers who worked for Commodore over the years, who often achieved heroic results in spite of management’s shortsightedness or, eventually, outright incompetence.

Tramiel and Commodore stumbled into a key role in both the PC revolution and the videogame revolution. In 1976 the company was, not for the first nor the last time, struggling mightily. Texas Instruments had virtually destroyed their calculator business by introducing machines priced cheaper than Commodore could possibly match. The reason: TI owned its own chip-fabrication plants rather than having to source its chips from other suppliers. It was a matter of vertical integration, as they say in the business world. Desperate for some vertical integration of his own, Tramiel bought a chip company, MOS Technologies. With MOS came a new microprocessor, one that had been causing quite a lot of excitement amongst homebrew microcomputer hackers like Steve Wozniak: the 6502. Commodore also ended up with the creator of the 6502, MOS’s erstwhile head of engineering Chuck Peddle. For his next trick, Peddle was keen to build a computer around his CPU. Tramiel wasn’t so sure about the idea, but reluctantly agreed to let Peddle have a shot. The Commodore PET became the first of the trinity of 1977 to be announced, but the last to actually ship. Tramiel, you see, was having cash-flow problems as usual, and Gould was as usual quite unforthcoming.

The PET wasn’t a bad little machine at all. It wasn’t quite as advanced in some areas as the Apple II, but it was also considerably cheaper. Still, it was hard to articulate just where it fit in the North American market. Hobbyists on a budget favored the TRS-80, easily available from Radio Shack stores all over the country, while those who wanted the very best settled on the more impressive Apple II. Business users, meanwhile, fixated early on the variety of CP/M machines from boutique manufacturers, and later, in the wake of VisiCalc, also started buying Apple IIs. The PET therefore became something of an also-ran in North America in spite of the stir of excitement its first announcement had generated.

Europe, however, was a different story. Neither Apple nor Radio Shack had any proper distribution network there in the beginning. The PET therefore became the first significant microcomputer in Europe. With effectively no competition, Commodore was free to hike its prices in Europe to Apple II levels and beyond. This meant that PETs were most commonly purchased by businesses and installed in offices. Only France, where Apple set up distribution quite early on, remained resistant, while West Germany became a particularly strong market, with the Commodore name accorded respect in business equivalent to what CP/M received in the U.S. And when a PET version of VisiCalc was introduced to Europe in 1980, it caused almost as big a sensation as the Apple II version had the year before in America. Within a year or two, Commodore stopped even seriously trying to sell PETs in North America, but rather shipped most of the output of their U.S. factory to Europe, where they could charge more and where the competition was virtually nonexistent.

In North America Commodore’s role in the early microcomputer and game-console industries was also huge, but mostly behind the scenes, and all centered around the Commodore Semiconductor Group — what had once been MOS Technologies. In an oft-repeated scenario that Dave Haynie has dubbed the “Commodore Curse,” most of the innovative engineers who had created the 6502 fled soon after the Commodore purchase, driven away by Tramiel’s instinct for degradation and his refusal to properly fund their research-and-development efforts. For this reason, MOS, poised at the top of the microcomputer industry for a time, would never even come close to developing a viable successor to the 6502. Nevertheless, Commodore inherited a very advanced chipmaking operation — one of the best in the country in fact. It would take some years for inertia and neglect to break down the house that Peddle and company had built. In the meantime, they delivered the 6502s and variants found not only in the PET but also in the Apple II, the Atari VCS, the Atari 400 and 800, and plenty of other more short-lived systems. They also built many or most of the cartridges on which Atari VCS games shipped. All of which put Commodore in the enviable position of making money every time many of their ostensible competitors built something. Thanks to MOS and Europe, Commodore went from near bankruptcy to multiple stock splits, while Tramiel himself was worth $50 million by 1980. That year he rewarded Peddle, the technical architect of virtually all of this success, with termination and a dubious lawsuit that managed to wrangle away the $3 million in Commodore stock he had earned.

Commodore’s transformation from a business-computer manufacturer and behind-the-scenes industry player to the king of home computing also began in 1980, when Tramiel visited London for a meeting. He saw there for the first time an odd little machine called the Sinclair ZX-80. Peddled by an eccentric English inventor named Clive Sinclair, the ZX-80 was something of a throwback to the earliest U.S.-made microcomputers. It was sold as a semi-assembled kit, and, with just 1 K of memory and a display system so primitive that the screen went blank every time you typed on the keyboard, pretty much the bare-minimum machine that could still meet some reasonable definition of “computer.” For British enthusiasts, however, it was revelatory. Previously the only microcomputers for sale in Britain had been the Commodore PET line and a few equally business-oriented competitors. These machines cost thousands of pounds, putting them well out of reach of most private individuals in this country where average personal income lagged considerably behind that of the U.S. The ZX-80, though, sold for just under £100. For a generation of would-be hackers who, like the ones who had birthed the microcomputer industry in the U.S. five years before, simply wanted to get their hands on a computer — any computer — it was a dream come true. Sinclair sold 50,000 ZX-80s before coming out with something more refined the next year.

We’ll talk more about Sinclair and his toys in later posts, but for now let’s focus on what the ZX-80 meant to Tramiel. He began to think about a similar low-cost computer for the U.S. consumer market — this idea of a “home computer” that had been frequently discussed but had yet to come to any sort of real fruition. To succeed in the U.S. mass market Commodore would obviously need to put together something more refined than the ZX-80. It would have to be a fully assembled computer that was friendly, easy to use, and that came equipped with all of the hardware needed to hook it right up to the family television. And it would need to be at least a little more capable than the Atari VCS in the games department (to please the kids) and to have BASIC built in (to please the parents, who imagined their children getting a hand up on their future by learning about computers and how to program them).

Luckily, Commodore already had most of the parts they needed just sort of lying around. All the way back in 1977 their own Al Charpentier had designed the Video Interface Chip (the VIC) for a potential game console or arcade machine. It could display 16-color graphics at resolutions of up to 176 × 184, and could also generate up to three simple sounds at one time. Commodore had peddled it around a bit, but it had ended up on the shelf. Now it was dusted off to become the heart of the new computer. Sure, it wasn’t a patch on the Atari 400 and 800’s capabilities, but it was good enough. Commodore joined it up with much of the PET architecture in its most cost-reduced form, including the BASIC they’d bought from Microsoft years before, added a cartridge port, and they had their home computer. After test-marketing it in Japan as the VIC-1001, they brought it to North America as the VIC-20 in the spring of 1981, and soon after to Europe. (In the German-speaking countries it was called the VC-20 because of the unfortunate resemblance “VIC” had to the German verb “ficken” — to fuck.) In the U.S. the machine’s first list price was just under $300, in line with Tramiel’s new slogan: “Computers for the masses, not the classes.” Tramiel may have been about the last person in the world you’d expect to start advocating for the proletariat, but business sometimes makes strange bedfellows. Discounting construction kits and the like, the VIC-20 was easily the cheapest “real computer” yet sold in the U.S.

For the first time in the company’s history, Commodore created a major U.S. advertising campaign that was well-funded and smart, perhaps because it was largely the work of an import from Commodore’s much more PR-savvy British subsidiary named Kit Spencer. He hired as spokesman William Shatner, Captain Kirk himself. “Why buy just a videogame?” Shatner asked. “Invest in the wonder computer of the 1980s,” with “a real computer keyboard.” The messaging was masterful. The box copy announced that the VIC-20 was great for “household budgeting, personal improvement, student education, financial planning.” In reality, the VIC-20, with just 5 K of memory and an absurdly blocky 22-characters-per-line text display, was of limited (at best) utility for any of those things. But always Commodore snuck in a reference, seemingly as an afterthought, to the fact that the VIC-20 “plays great games too!” Commodore was effectively colluding with the kids they were really trying to reach, giving them lots of ways to convince Mom and Dad to buy them the cool new game machine they really wanted. Understanding that a good lineup of games was crucial to this strategy, they made sure that upon release a whole library of games, many of them unauthorized knockoffs of current arcade hits, was ready to go. For the more cerebral sorts, they also contracted with Scott Adams to make cartridge versions of his first five adventures available at launch.

Within a few months of the launch, Tramiel made a deal with K-Mart, one of the largest U.S. department-store chains of the time, to sell VIC-20s right from their shelves. This was an unprecedented move. Previously department stores had been the domain of the game consoles; the Atari VCS owed much of its early success to a distribution deal that Atari struck with Sears. Computers, meanwhile, were sold from specialized dealers whose trained employees could offer information, service, and support before and after the sale. Tramiel alienated and all but destroyed Commodore’s dealer network in the U.S., such as it was, by giving preferential treatment to retailers like K-Mart, even indulging in the dubiously legal practice of charging the latter lower prices per unit than he did the loyal dealers who had sometimes been with him for years. Caught up in his drive to make Commodore the home-computer company as well as his general everyday instinct to cause as much chaos and destruction as possible, Tramiel couldn’t have cared less when they complained and dropped their contracts in droves. Eventually this betrayal, like so many others, would come back to haunt Commodore. But for now they were suddenly riding higher than ever.

The VIC-20 resoundingly confirmed at last the mutterings about the potential for a low-cost home computer. It sold 1 million units in barely a year, the first computer of any type to do so. Apple, by comparison, had after five years of steadily building momentum managed to sell about 750,000 Apple IIs by that point, and Radio Shack's numbers were similar. The VIC-20 would go on to sell 2.5 million units before crashing back to earth almost as quickly as it had ascended; Commodore officially discontinued it in January of 1985, by which time it was generally selling for well under $100. Attractive as its price was, it was ultimately just too limited a machine to have longer legs. Still, while the vast majority of VIC-20s were used almost exclusively for playing games (at least 98% of the software released for the machine was games), some who didn't have access to a more advanced machine used it as their gateway to the wonders of computing. Most famously, Linus Torvalds, the Finnish creator of Linux, got his start exploring the innards of the VIC-20 installed in his bedroom. For European hackers like Torvalds, without as many options as the U.S. market afforded, the VIC-20 as well as the cheap Sinclair machines were godsends.

The immediate reaction to the VIC-20 from users of the Apple II and other more advanced machines was generally somewhere between a bemused shrug and a dismissive snort. With its minuscule memory and its software housed on cartridges or cassette tapes, the VIC-20 wasn’t capable of running most of the programs I’ve discussed on this blog, primitive as many of them have been. Even the Scott Adams games were possible only because they were housed on ROM cartridges rather than loaded into the VIC-20’s scant RAM. Games like Wizardry, Ultima, The Wizard and the Princess, or Zork — not to mention productivity game-changers like VisiCalc — were simply impossible here. The VIC-20’s software library, large and (briefly) profitable as it was, was built mostly of simple action games not all that far removed from the typical Atari VCS fare. Companies like On-Line Systems released a VIC-20 title here and there if someone stepped forward with something viable (why throw away easy money?), but mostly stayed with the machines that had brought them this far. To the extent that the VIC-20 was relevant to them at all, it was relevant as a stepping stone — or, if you will, a gateway drug to computing. Hopefully some of those VIC-20 buyers would get intrigued enough that they’d decide to buy a real system some day.

Yet in the long run the VIC-20 was only a proof of concept for the home computer. With the segment now shown to be viable and, indeed, firmly established, the next home computer to come from Commodore wouldn’t be so easy to ignore.

(By far the best, most unvarnished, and most complete history of Commodore is found in Brian Bagnall’s Commodore: A Company on the Edge and its predecessor On the Edge: The Spectacular Rise and Fall of Commodore. Both books are in desperate need of a copy editor, making them rather exhausting to read at times, and Bagnall’s insistence on slamming Apple and IBM constantly gets downright annoying. Still, the information and stories are there.

Michael Tomczyk’s much older The Home Computer Wars was previously the only real insider account of Commodore during this period, but it’s of dubious value at best in the wake of Bagnall’s books. Tomczyk inflates his own role in the creation and marketing of the VIC-20 enormously, and insists on painting Tramiel as a sort of social visionary. He’s amazed that Tramiel is willing to do business in Germany after spending time in Auschwitz, seeing this as a sign of the man’s essential nobility and forgiving nature. News flash: unprincipled men seldom put principles — correct or misguided — above the opportunity to make a buck.)

 


Of Game Consoles, Home Computers, and Personal Computers

When I first started writing the historical narrative that’s ended up consuming this blog, I should probably have stated clearly that I was writing about the history of computer games, not videogames or game consoles. The terms “computer game” and “videogame” have little or no separation today, but in the late 1970s and early 1980s the two were regarded as very distinct things. In Zap!, his history of Atari written just as that company was imploding in 1983, Scott Cohen takes the division as a given. He states, “Perhaps Atari’s most significant contribution is that it paved the way for the personal computer.” In predicting the future of the two categories, he is right about one and spectacularly wrong about the other. The PC, he says, will continue up a steadily inclining growth curve, becoming more and more an expected household fixture as the years go by. The game console, however, will be dismissed in future years as a “fad,” the early 1980s version of the Hula Hoop.

If we trace back far enough we can inevitably find some common origins, but the PC and game console were generally products of different folks with very different technical orientations and goals. Occasional collisions like Steve Jobs’s brief sojourn with Atari were more the exception than the rule. Certainly the scales of the two industries were completely out of proportion with one another. We’ve met plenty of folks on this blog who built businesses and careers and, yes, made lots of money from the first wave of PCs. Yet everything I’ve discussed is a drop in the bucket compared to the Atari-dominated videogame industry. A few figures should make this clear.

Apple, the star of the young PC industry, grew at an enviable rate in its early years. For example, sales more than doubled from 1979 to 1980, from 35,000 units to 78,000. Yet the Atari VCS console also doubled its sales over the same period: from 1 million in 1979 to 2 million in 1980. By the time the Apple II in 1983 crossed the magical threshold of 1 million total units sold, the VCS was knocking at the door of 20 million. Even the Intellivision, Mattel's distant-second-place competitor to the VCS, sold 200,000 units in 1980 alone. In mid-1982, the height of the videogame craze, game consoles could already be found in an estimated 17% of U.S. households. Market penetration like that would be years in coming to the PC world.

In software the story is similar. In 1980, a PC publisher with a hit game might dream of moving 15,000 units. Atari at that time already had two cartridges, Space Invaders and Asteroids, that had sold over 1 million copies. Activision, an upstart VCS-game-maker formed by disgruntled Atari programmers, debuted in 1980 with sales of $67 million on its $25 game cartridges. By way of comparison, Apple managed sales of $200 million on its $1500 (or more) computer systems. The VCS version of Pac-Man, the big hit of 1981, sold over 2 million copies that year alone. Again, it would be a decade or more before PC publishers would begin to see numbers like that for their biggest titles.

So, we have two very different worlds here, that of the mass-market, inexpensive game consoles and that of the PC, the latter of which remained the province of the most affluent, technology-savvy consumers only. But then a new category began to emerge, to slot itself right in the middle of this divide: the “home computer.” The first company to dip a toe into these waters was Atari itself.

Steve Jobs during his brief association with Atari brought a proposal for what would become the Apple II to Atari’s then-head Nolan Bushnell. With Atari already heavily committed to both arcade machines and the project that would become the VCS, Bushnell declined. (Bushnell did, however, get Jobs a meeting with potential investor Don Valentine, who in turn connected him with Mike Markkula. Markkula became the third employee at Apple, put up most of the cash the company used to get started in earnest, and played a key role in early marketing efforts. Many regard him as the unsung hero of Apple’s unlikely rise.) Only later on, after the success of the Apple II and TRS-80 proved the PC a viable bet, did Atari begin to develop a full-fledged computer of its own.

The Atari 400 and 800, released in late 1979, were odd ducks in comparison to other microcomputers. The internals were largely the work of three brilliant engineers, Steven Mayer, Joe Decuir, and Jay Miner, all of whom had also worked on the Atari VCS. Their design was unprecedented. Although they had at their heart the same MOS 6502 found in the Atari VCS and the Apple II, the 400 and 800 were built around a set of semi-intelligent custom chips that relieved the CPU of many of its housekeeping burdens to increase its overall processing potential considerably. These chips also brought graphics capabilities that were nothing short of stunning. Up to 128 colors could be displayed at resolutions of up to 352 X 240 pixels, and the machines also included sprites, small graphics blocks that could be overlaid over the background and moved quickly about; think of the ghosts in Pac-Man for a classic example. By comparison, the Apple II’s hi-res mode, 280 X 160 pixels with 6 possible colors, no sprites, and the color-transition limitations that result in all that ugly color fringing, had represented the previous state of the art in PC graphics. In addition, the Atari machines featured four-voice sound-synthesis circuitry. Their competitors offered either no sound at all, or, as in the case of the Apple II, little more than beeps and squeaks. As an audiovisual experience, the new Atari line was almost revolutionary.

Still, externally the Apple II looked and was equipped (not to mention was priced) like a machine of serious intent. The Ataris lacked the Apple’s flexible array of expansion slots as well as Steve Wozniak’s fast and reliable floppy-disk system. They shipped with just 8 K of memory. Their BASIC implementation, one of the few not sourced from Microsoft, was slow and generally kind of crummy. The low-end model, the 400, didn’t even have a proper keyboard, just an awkward membrane setup. And it wasn’t even all a story of missing features. When you inspected the machines more closely, you found something unexpected: a console-style port for game cartridges. The machines seemed like Frankensteins, stuck somewhere between the worlds of the game console and the PC. Enter the home computer — a full-fledged computer, but one plainly more interested in playing games and doing “fun” things than “serious” work. The Atari logo on the cases, of course, also contributed to the impression that, whatever else they were, these machines weren’t quite the same thing as, say, the Apple II.

Alas, Atari screwed the pooch with the 400 and 800 pretty badly. From the beginning they priced them too high for their obvious market; the 800 was initially only slightly less expensive than the Apple II. And, caught up like the rest of the country in VCS-fever, they put little effort into promotion. Many in management hardly seemed aware that they existed at all. In spite of this, their capabilities combined with the Atari name were enough to make them modest sales successes. They also attracted considerable software support. On-Line Systems, for instance, made them their second focus of software development, behind only the Apple II, during their first year or two in business. Still, they never quite lived up to their hardware's potential, never became the mass-market success they might (should?) have been.

The next company to make a feint toward the emerging idea of a home computer was Radio Shack, who released the TRS-80 Color Computer in 1980. (By the end of that year Radio Shack had four separate machines on the market under the TRS-80 moniker, all semi- or completely incompatible with one another. I haven't a clue why no one could come up with another name.) Like so much else from Radio Shack, the CoCo didn't seem to know quite what it wanted to be. Radio Shack did get the price about right for a home computer: $400. And they provided a cartridge port for instant access to games. Problem was, those games couldn't be all that great, because the video hardware, while it did indeed allow color, wasn't a patch on the Atari machines. Rather than spend money on such niceties, Tandy built the machine around a Motorola 6809, one of the most advanced 8-bit CPUs ever created. That attracted a small but devoted base of hardcore hackers who did things like install OS-9, the first microcomputer operating system capable of multitasking. Meanwhile the kids and families the machine was presumably meant to attract shrugged their shoulders at the unimpressive graphics and went back to their Atari VCSs. Another missed opportunity.

The company that finally hit the jackpot in the heretofore semi-mythical home-computer market was also the creator of the member of the trinity of 1977 that I’ve talked about the least: Commodore, creator of the PET. I’ll try to make up for some of that inattention next time.

 
 


The Wizardry Phenomenon

Of the two long-lived CRPG franchises that made their debuts in 1981, the Ultima series would prove to be the more critically and commercially successful in the long term. Yet in a state of affairs that brings to mind clichés about tortoises and hares and battles and wars, it was the first Wizardry game that really captured imaginations, not to mention the most sales, in 1981 and 1982. Ultima, mind you, was another very big success for Richard Garriott, receiving positive reviews and selling 20,000 copies in its first year. It along with Akalabeth made him a very prosperous young man indeed, enough that he would soon have to question whether there was any point in continuing at university to prepare for a “real” career (a story we’ll get to later). But Wizardry was operating on another plane entirely.

If reviews of Ultima were very positive, early reviews of Wizardry were little short of rapturous. Softalk, who published a review even before the game was available thanks to a pre-release copy, called Wizardry not just a game but “a place,” and “the ultimate computer Dungeons and Dragons,” and said those who “don’t give this game a try” would be “missing much.” Computer Gaming World called it “one of the all-time classic computer games,” “the standard by which all fantasy role-playing games should be compared.” Even Dragon magazine took note. In one of its occasional nods to the CRPG scene, it said that “there is so much good about this game, it’s difficult to decide where to begin,” and that it “would excite any dedicated fantasy role-player.” The consensus of these reviewers is that Greenberg and Woodhead had in some sense perfected the idea of D&D on the microcomputer, producing the first compulsively playable example of the form after all of the not-quite-there-yet experiments of Automated Simulations and others. While Ultima, for one, certainly has its own charms, it’s difficult to entirely disagree.

Rapturous press and positive word of mouth paid off commercially. Just two months after its release in September of 1981, Wizardry was already the second bestselling Apple II program on the market, behind only the unstoppable VisiCalc, according to Softalk‘s sales surveys. The September/October 1982 issue of Computer Gaming World included a survey of top-selling games and their alleged sales numbers through June 1982. (This is also the source that I used for the 20,000-copy figure for Ultima). Here, nine months after its release, Wizardry is claimed to have sold 24,000 copies. Ultima had not only sold fewer copies in total, but had been on the market three months longer. The only adventure games to have outsold Wizardry were Zork (32,000 copies), Temple of Apshai (30,000 copies), and The Wizard and the Princess (25,000 copies). All of these games had been on the market at least twice as long as Wizardry, and in the case of the former two on other platforms in addition to the Apple II. For the record, the only other games to outsell Wizardry were K-Razy Shootout (35,000 copies) and Snack Attack (25,000 copies), clones of the arcade hits Berzerk and Pac-Man respectively; Raster Blaster (25,000 copies), a pinball game from Apple II supercoder Bill Budge; and the evergreen Flight Simulator (30,000 copies). (Yes, bizarre as it sounds, the completely unremembered K-Razy Shootout may well have been the bestselling computer game of all-time in mid-1982 — counting only games sold for full-fledged PCs rather than game consoles, of course. On the other hand, there are enough oddities about CGW‘s list that I'm far from ready to take it in its entirety as gospel.) Impressive as its sales to that point had been, in mid-1982 Wizardry was still quite early in its commercial lifespan. As Apple IIs continued to sell in ever greater numbers, Wizardry also would continue as a major seller for several more years. A full year after the CGW list, Electronic Games magazine still called it "without a doubt, the most popular fantasy adventure game available for the Apple II."

Sales success like this, combined with the devotion the game tended to engender amongst those who bought it and, yes, the rampant piracy that was as typical of this era as it is of our own, led to a user base of active, long-term Wizardry players that was larger than the entire installed base of some of the Apple II's competition. Wizardry is of course a famously difficult game, leading many of these folks to cast around for outside aid. One of the more fascinating and important aspects of the Wizardry story is the cottage industry that arose to feed this hunger. At least two third-party character editors from tiny publishers, WizPlus and WizFix, appeared within months of Wizardry itself, offering players the opportunity (for $25 or so) to alter their characters' statistics at will and rescue dead characters left in the dungeon. These programs grew so popular that Sir-tech felt compelled to respond upon the release of the second Wizardry scenario in May of 1982 by inserting into the box a sheet bearing the following rather mean-spirited scold:

It has come to our attention that some software vendors are marketing so-called “cheat programs.” These programs allow you to create characters of arbitrary strength and ability.

While it may seem appealing to use these products, we urge you not to succumb to the temptation. It took more than four years of careful adjustment to properly balance Wizardry. These products tend to interfere with this subtle balance and may substantially reduce your playing pleasure. It would be akin to playing chess with additional queens, or poker with all cards wild.

It has also come to our attention that some of these programs are unreliable and may even destroy data. While we repair or replace inoperative disks free within 30 days of purchase, or for a nominal fee of $5.00 anytime thereafter, we will not do so for disks damaged by a cheat program.

Such pedantry foreshadows some of the mistakes that Sir-tech would soon begin to make with the franchise.

A year or two later, The Wizard’s Workbench from Magicsoft took advantage of Greenberg and Woodhead’s determination to make Wizardry a reusable, database-based game system by offering what amounted to a reconstruction of the tools Woodhead had created to author the original game. A full-fledged CRPG authoring tool in all but name, Wizard’s Workbench let the player alter existing Wizardry scenarios at will, as well as create her own with custom mazes to be mapped, monsters to be fought, magic items to be acquired, and puzzles to be solved — a precursor to systems like The Bard’s Tale Construction Set and Unlimited Adventures and, by extension, the more recent Neverwinter Nights.

Others trafficked not in software but in information. One Michael Nichols put together a binder’s worth of maps, data on monsters and items, and playing advice under the name “The Wizisystem”:

Wizardry is one of the most exciting and challenging games available for the Apple computer. Its complexity and seemingly endless variations make it interesting long after the average game has been gathering dust for months. Perhaps the most enduring aspect of Wizardry is that it forces the player to think logically, to act rationally, and to organize masses of data into usable form in order to be successful. In other words, the Wizardry player must combine the skills of a master strategist, a tax lawyer, a cartographer, an experienced researcher, and a Knight of the Round Table!

The Wizisystem allows the average player, who has neither the time nor the means to learn all these skills, to be successful at the game by teaching him to exert control over every phase of the game — from creating characters to opening chests. It gives the player a successful, easy-to-follow format and backs it up with information that is as complete and helpful as possible.

The essence of the Wizisystem is control through planning, organization, knowledge, and a methodical approach to the game.

Products like Wizisystem showed publishers that there was a market hungry for such detailed information on individual games. Soon most adventure-game publishers would be selling hints books as a tidy extra profit channel, and soon enough after that book-store shelves would be full of sometimes-hundreds-of-pages-long deconstructions of popular games of all stripes.

It all added up to something that Softline could already in its March 1982 issue call a “phenomenon” with only slight hyperbole. As with Eliza fifteen years before, some saw applications for Wizardry that sound over the top or even downright silly today. Harry Conover considered playing the game good training for working as a small-business manager: “As the manager of a small group of individuals, each with their own strengths and weaknesses, you must manipulate the members’ performances against the ‘competition’ so that they achieve a certain goal.” Chuck Dompa used Wizardry in a graduate-level continuing-education course (“CS470: Teaching Fantasy Simulation”) for educators at Penn State University. Dr. Ronald Levy, a New York child psychiatrist, started using the game in his work. He wrote a letter to Sir-tech describing his experiences with a deeply depressed, apparently suicidal child:

Jim agreed to play videogames on my Apple computer and he became fascinated by my description of the Wizardry game. He made a set of characters, gave them names, and played nonstop for almost an hour. After the first half hour, he was willing to discuss with me what he was doing in the game, and I was able to learn a great deal about him from what he had told me and from watching him play.

I found out that he was not as depressed as he seemed and that he was able to become enthusiastic about something he was interested in; and we were able to talk about some of his worries, using the game as a springboard. At the conclusion of this visit, he told me he had no intention of killing himself because he "wanted to come back and play some more." In this case, and in several others, I have been able, by using your game, to evaluate correctly children who initially seemed much more disturbed than they really were… Although you intended to create a recreational game, you have inadvertently provided me with a marvelous tool for my work with children.

Less compellingly, Levy raised the stakes further to claim that the individual characters that make up a Wizardry party were really each a fragment of the player’s psyche, alluding to the ideas that Hermann Hesse put forward in Steppenwolf. Alas, Dr. Levy, sometimes a computer game is just a computer game.

Wizardry‘s success inspired a certain amount of resentment from some of the old guard on PLATO, from whose games Greenberg and Woodhead had lifted so many of their ideas. Dirk Pellett, who did much work on the seminal PLATO CRPG dnd, claims to this day that Woodhead attempted to copy that game and release it under his own name on PLATO as Sorcerer. When he was called out for that, claims Pellett, he and Greenberg then “plagiarized” another popular PLATO game, Oubliette, to create Wizardry. For what it’s worth, I find this claim absurd. Oubliette did pioneer many ideas used in Wizardry, including the first-person view, but the contents of the latter’s dungeons were completely original. And the most obvious innovation of Wizardry, its placing the player in charge of an entire party instead of a single avatar, does indeed appear to originate with Wizardry itself. If Wizardry plagiarized Oubliette, then Zork plagiarized Adventure — and dnd plagiarized D&D. Indeed, it’s hard to think of a computer game of the last 30 years that is not a product of plagiarism under those terms. Yet with Greenberg and Woodhead having gotten so much recognition and money from being the first to bring to a paying market so many of the ideas of PLATO, such resentments are perhaps inevitable. (More surprising is the complete equanimity Will Crowther and Don Woods have always shown in the face of the commercialization of their own seminal work, Adventure.)

What all of this attention ultimately came down to for Sir-tech, of course, was sales. Lots and lots of sales. For its first offices the company rented out a 100 square-foot area in the spoon factory that had gotten all of this started in the first place. Sir-tech started out copying disks by hand for sale at a rate of about 100 per day, but soon invested in specialized duplication machines that raised their daily capacity to 500. And they started hiring; soon Norman and Robert Sirotek were joined in the office by five employees. Meanwhile Greenberg and Woodhead started doing what you do when you’ve just made a hit computer game: working on the sequel.

We’ll be tracing the parallel evolutions of the Wizardry and Ultima series for a long time to come. But next, as usual, something completely different.

 
 


Playing Wizardry

Writing about Ultima earlier, I described that game as the first to really feel like a CRPG as we would come to know the genre over the course of the rest of the 1980s. Yet now I find myself wanting to say the same thing about Wizardry, which was released just a few months after Ultima. That’s because these two games stand as the archetypes for two broad approaches to the CRPG that would mark the genre over the next decade and, arguably, even right up to the present. The Ultima approach emphasizes the fictional context: exploration, discovery, setting, and, eventually, story. Combat, although never far from center stage, is relatively deemphasized, at least in comparison with the Wizardry approach, which focuses on the process of adventuring above all else. Like their forefather, Wizardry-inspired games often take place in a single dungeon, seldom feature more than the stub of a story, and largely replace the charms of exploration, discovery, and setting with those of tactics and strategy. The Ultima strand is often mechanically a bit loose — or more than a bit, if we take Ultima itself, with its hit points as a purchasable commodity and its concept of character level as a function of time served, as an example. The Wizardry strand is largely about its mechanics, so it had better get them right. (As I wrote in my last post about Wizardry, Richard Garriott refined and balanced Ultima by playing it a bit himself and soliciting the opinions of a few buddies; Andrew Greenberg and Robert Woodhead put Wizardry through rigorous balancing and playtesting that consumed almost a year.) These bifurcated approaches parallel the dueling approaches to tabletop Dungeons and Dragons, as either a system for interactive storytelling enjoyed by “artful thespians” or a single-unit tactical wargame.

Wizardry, then, isn’t much concerned with niceties of setting or story. The manual, unusually lengthy and professional as it is, says nothing about where we are or just why we choose to spend our time delving deeper and deeper into the game’s 10-level dungeon. If a dungeon exists in a fantasy world, it must be delved, right? That’s simply a matter of faith. Only when we reach the 4th level of the dungeon do we learn the real purpose of it all, when we fight our way through a gauntlet of monsters to enter a special room.

CONGRATULATIONS, MY LOYAL AND WORTHY SUBJECTS. TODAY YOU HAVE SERVED ME WELL AND TRULY PROVEN YOURSELF WORTHY OF THE QUEST YOU ARE NOW TO UNDERTAKE. SEVERAL YEARS AGO, AN AMULET WAS STOLEN FROM THE TREASURY BY AN EVIL WIZARD WHO IS PURPORTED TO BE IN THE DUNGEON IMMEDIATELY BELOW WHERE YOU NOW STAND. THIS AMULET HAS POWERS WHICH WE ARE NOW IN DIRE NEED OF. IT IS YOUR QUEST TO FIND THIS AMULET AND RETRIEVE IT FROM THIS WIZARD. IN RECOGNITION OF YOUR GREAT DEEDS TODAY, I WILL GIVE YOU A BLUE RIBBON, WHICH MAY BE USED TO ACCESS THE LEVEL TRANSPORTER [otherwise known as an “elevator”] ON THIS FLOOR. WITHOUT IT, THE PARTY WOULD BE UNABLE TO ENTER THE ROOM IN WHICH IT LIES. GO NOW, AND GOD SPEED IN YOUR QUEST!

And that’s the last we hear about that, until we make it to the 10th dungeon level and the climax.

What Wizardry lacks in fictional context, it makes up for in mechanical depth. Nothing that predates it on microcomputers offers a shadow of its complexity. Like Ultima, Wizardry features the standard, archetypical D&D attributes, races, and classes, renamed a bit here and there for protection from Mr. Gygax’s legal team. Wizardry, however, lets us build a proper adventuring party with up to six members in lieu of the single adventurer of Ultima, with all the added tactical possibilities managing a team of adventurers implies. Also on offer here are four special classes in addition to the basic four, to which we can change characters when they become skilled enough at their basic professions. (In other words, Wizardry is already offering what the kids today call “prestige classes.”) Most impressive of all is the aspect that gave Wizardry its name: priests eventually have 29 separate spells to call upon, mages 21, each divided into 7 spell levels to be learned slowly as the character advances. Ultima‘s handful of purchasable scrolls, which had previously marked the state of the art in CRPG magic systems, pales in comparison. Most of the depth of Wizardry arises one way or another from its magic system. It’s not just a matter of learning which spells are most effective against which monsters, but also of husbanding one’s magic resources: deciding when one’s spell casters are depleted enough that it’s time to leave the dungeon, deciding whether the powerful spell is good enough against that demon or whether it’s time to use the really powerful one, etc. It’s been said that a good game is one that confronts players with interesting, non-obvious — read, difficult — decisions. By that metric, magic is largely what makes Wizardry a good game.

Of course, Wizardry‘s mechanics, from its selection of classes and races to its attribute scores that max out at 18 to its armor-class score that starts at 10 and moves downward for no apparent reason, are steeped in D&D. There’s even a suggestion in the manual that one could play Wizardry with one’s D&D group, with each player controlling a single character — not that that sounds very compelling or practical. The game also tries, not very successfully, to shoehorn in D&D‘s mechanic of alignment, a silly concept even on the tabletop. On the computer, good, evil, and neutral are just a set of arbitrary restrictions: good and evil cannot be in the same party, thieves cannot be good.

Sometimes you meet “friendly” monsters in the dungeon. If good characters kill them anyway, or evil characters let them go, there’s a chance that their alignments will change — which can in turn play the obvious havoc with party composition. (In an amusing example of unintended emergent behavior, it’s also possible for the “evil” mage at the end of the game to be… friendly. Now doesn’t that present a dilemma for a “good” adventurer, particularly since not killing him means not getting the amulet that the party needs to get out of his lair?)

So, Greenberg and Woodhead were to some extent just porting to the computer an experience that had already proven compelling as hell to many players, albeit doing a much more complete job of it than anyone had managed before. But there’s also much that’s original here. Indeed, so much that would become standard in later CRPGs has its origin here that it’s hard to know where to begin to describe it all. Wizardry is almost comparable to Adventure in defining a whole mode of play that would persist for many years and countless games. For those few of you who haven’t played an early Wizardry game, or one of its spiritual successors (read: slavish imitators) like The Bard’s Tale or Might and Magic, I’ll take you on a very brief guided tour of a few highlights. Sorry about my blasphemous adventurer names; I’ve been reading the Old Testament lately, and it seems I got somewhat carried away with it all.

Wizardry is divided into two sections: the castle (shown below), where we do all of the housekeeping chores like making characters, leveling up, putting together our party, shopping for equipment, etc.; and the dungeon, where the meat of the game takes place.

When we enter the dungeon, we start in “camp.” We are free to camp again at any time in the dungeon, as long as we aren’t in the middle of a fight. Camping gives us an opportunity to tinker with our characters and the party as a whole without needing to worry about monsters. We can also cast spells. Here I’ve just cast MAPORFIC, a very useful spell which reduces the armor class of the entire party by two for the duration of our stay in the dungeon. All spells have similar made-up names; casting one requires looking it up in the manual and entering its name.

Once we leave camp, we’re greeted with the standard traveling view: a first-person wireframe-3D view of our surroundings occupies the top left, with the rest of the screen given over to various textual status information and a command menu that’s really rather wasteful of screen space. (I suspect Greenberg and Woodhead use it because it gives them something with which to fill up some space that they don’t have to spend computing resources dynamically updating.)

I was just saying that Wizardry manages to be its own thing, separate from D&D. That becomes clear when we consider the player’s biggest challenge: mapping. It’s absolutely essential that she keep a meticulous map of her explorations. Getting lost and not knowing how to return to the stairs or elevator is almost invariably fatal. While tabletop D&D players are often also expected to keep rough maps of their journeys, few dungeon masters are as unforgiving as Wizardry. In addition to all the challenges of keeping track of lots of samey-looking corridors and rooms, the game soon begins to throw other mapping challenges at the player: teleporters that suddenly throw the party somewhere else entirely; spinners that spin them in place so quickly it’s easy to not realize it’s happened; passages that wrap around from one side of the dungeon to the other; dark areas that force one to map by trial and error, literally by bashing one’s head against the walls.

On the player’s side are an essential mage spell, DUMAPIC, that tells her exactly where she is in relation to the bottom-left corner of the dungeon level; and the knowledge that all dungeon levels are exactly 20 spaces by 20 spaces in size. Mapping is such a key part of Wizardry that Sir-tech even provided a special pad of graph paper for the purpose in the box, sized 20 X 20.
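Those fixed 20-by-20 dimensions, combined with the wrap-around passages mentioned earlier, make modular arithmetic the natural model for tracking the party’s position. Here’s a minimal sketch of the idea; all the names are mine, not anything from the game’s actual code:

```python
# Hypothetical sketch of tracking a party's position on a Wizardry-style
# 20 x 20 dungeon level, where corridors can wrap from one edge to the
# other. DUMAPIC-style coordinates: (0, 0) is the bottom-left corner.
LEVEL_SIZE = 20

class PartyPosition:
    def __init__(self, x=0, y=0):
        self.x, self.y = x, y

    def move(self, dx, dy):
        # Modular arithmetic models the wrap-around passages: stepping
        # east off column 19 lands the party back on column 0.
        self.x = (self.x + dx) % LEVEL_SIZE
        self.y = (self.y + dy) % LEVEL_SIZE
        return self.x, self.y

pos = PartyPosition(19, 0)
pos.move(1, 0)   # step east off the edge: x wraps around to 0
```

This is also why a single DUMAPIC reading is enough to re-anchor a map that’s gone astray: with a fixed origin and fixed dimensions, one known coordinate pins down everything else.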

The necessity to map for yourself is easily the most immediately off-putting aspect of a game like Wizardry for a modern player. While games before Wizardry certainly had dungeons, it was the first to really require such methodical mapping. The dungeons in Akalabeth and Ultima, for instance, don’t contain anything beyond randomized monsters to fight and randomized treasure to collect. The general approach in those games becomes to use “Ladder Down” spells to quickly move down to a level with monsters of about the right strength for one’s character, to wander around at random fighting monsters until satisfied and/or exhausted, then to use “Ladder Up” spells to make an escape. There’s nothing unique to really be found down there. Wizardry changed all that; its dungeon levels may be 99% empty rooms, corridors, and randomized monster encounters, but there’s just enough unique content to make exploring and mapping every nook and cranny feel essential. If that’s not motivation enough, there’s also the lack of a magic equivalent to “Ladder Up” and “Ladder Down” until one’s mage has reached level 13 or higher. Map-making is essential to survival in Wizardry, and for many years to follow laborious map-making would be a standard part of the CRPG experience. It’s an odd thing: I have little patience for mazes in text adventures, yet find something almost soothing about slowly building up a picture of a Wizardry dungeon on graph paper. Your mileage, inevitably, will vary.

In general Wizardry is all too happy to kill you, but it does offer some kindnesses here and there in addition to DUMAPIC and dungeon levels guaranteed to be 20 X 20 spaces. These proving grounds are, for example, one of the few fantasy dungeons to be equipped with a system of elevators. They let us bypass most of the levels to quickly get to the one we want. Here we’re about to go from level 1 to level 4.

From level 4 we can take another elevator all the way down to level 9. But, as you can see below, entering that second elevator is allowed for “authorized users only.”

Wizardry doesn’t have the ability to save any real world state at all. Only characters can be saved, and only from the castle. Each dungeon level is reset entirely the moment we enter it again (or, more accurately, reset when we leave it, when it gets dumped from memory to be replaced by whatever comes next). Amongst other things, this makes it possible to kill Werdna, the evil mage of level 10, and thus “win the game” over and over again. One way the game does manage to work around this state of affairs is through checks like what you see illustrated above. We can only enter the second elevator if we have the blue ribbon — and we can only get that through the fellow who enlisted our services in another part of level 4 (see the quotation above). By tying progress through the plot (such as it is) to objects in this way, Greenberg and Woodhead manage to preserve at least a semblance of game state. The blue ribbon is of course an object which we carry around with us, and that is preserved when we save our characters back at the castle. Therefore it gives the game a way of “knowing” whether we’ve completed the first stage of our quest, and thus whether it should allow us into the lower levels. It’s quite clever in its way, and, again, would become standard operating procedure in many other RPGs for years to come. The mimesis breaker is that, just as we can kill Werdna over and over, we can also acquire an infinite number of these blue ribbons by reentering that special room on level 4 again and again.

There’s a surprising amount of unique content in the first four levels: not only our quest-giver and the restricted elevator, but also some special rooms with their own atmospheric descriptions and a few other lock-and-key-style puzzles similar to, although less critical than, the second-elevator puzzle. In levels 5 through 9, however, such content is entirely absent. These levels hold nothing but empty corridors and rooms. I believe the reason for this is down to disk capacity. Wizardry shipped on two disks, but the first serves only to host the opening animation and some utilities. The game proper lives entirely on the second disk, as must all of the characters that players create. This disk is stuffed right to the gills, and probably would not allow for any more text or “special” areas. Presumably Greenberg and Woodhead realized this the hard way, when the first four levels were already built with quite a bit of unique detail.

We start to see more unique content again only on level 10, the lair of Werdna himself. There’s this, for instance:

From context we can conclude that Trebor must be the quest giver that we met back on level 4. “Werdna” and “Trebor” are also, of course, “Andrew” and “Robert” spelled backward. Wizardry might like to describe itself using some pretty high-minded rhetoric sometimes and might sport a very serious-looking dragon on its box cover, but Greenberg and Woodhead weren’t above indulging in some silly fun in the game proper. When mapped, level 8 spells out Woodhead’s initials; ditto level 9 for Greenberg’s.

In the midst of all this exploration and mapping we’re fighting a steady stream of monsters. Some of these fights are trivial, but others are less so, particularly as our characters advance in level and learn more magic and the monsters we face also get more diverse and much more dangerous, with more special capabilities of their own.

The screenshot above illustrates a pretty typical combat dilemma. In an extra little touch of cruelty most of its successors would abandon, Wizardry often decides not to immediately tell us just what kind of monsters we’re facing. The “unseen entities” above could be Murphy’s ghosts, which are pretty much harmless, or nightstalkers, a downright sadistic addition that drains a level every time it successfully hits a character. (Exceeded in cruelty only by the vampire, which drains two levels.) So, we are left wondering whether we need to throw every piece of high-level magic we have at these things in the hopes of killing them before they can make an attack, or whether we can take it easy and preserve our precious spells. As frustrating as it can be to waste one’s best spells, it usually pays to err on the side of caution in these situations; once a character reaches level 9 or so, each experience level represents hours of grinding. Indeed, if there’s anything Wizardry in general teaches, it’s the value of caution.

I won’t belabor the details of play any more here, but rather point you to the CRPG Addict’s posts on Wizardry for an entertaining description of the experience. Do note as you read, however, that he’s playing a somewhat later MS-DOS port of the Apple II original.

The Wizardry series today has the reputation of being the cruelest of all the early CRPGs. That’s by no means unearned, but I’d still like to offer something of a defense of the Wizardry approach. In Dungeons and Desktops, Matt Barton states that “CRPGs teach players how to be good risk-takers and decision-makers, managers and leaders,” on the way to making the, shall we say, bold claim that CRPGs are “possibly the best learning tool ever designed.” I’m not going to touch the latter claim, but there is something to the former, at least in the context of an old-school game of Wizardry.

For all its legendary difficulty, Wizardry requires no deductive or inductive brilliance or leaps of logical (or illogical) reasoning. It rewards patience, a willingness to experiment and learn from mistakes, attention to detail, and a dedication to doing things the right way. It does you no favors, but simply lays out its world before you and lets you sink or swim as you will. Once you have a feel for the game and understand what it demands from you, it’s usually only in the moment that you get sloppy, the moment you start to take shortcuts, that you die. And dying here has consequences; it’s not possible to save inside the dungeon, and if your party is killed they are dead, immediately. Do-overs exist only in the sense that you may be able to build up another party and send it down to retrieve the bodies for resurrection. This approach is probably down at least as much to the technical restrictions Greenberg and Woodhead were dealing with — saving the state of a whole dungeon is complicated — as to a deliberate design choice, but once enshrined it became one of Wizardry‘s calling cards.

Now, this is very possibly not the sort of game you want to play. (Feel free to insert your “I play games to have fun, not to…” statements here.) Unlike some “hardcore” chest-thumpers you’ll meet elsewhere on the Internet, I don’t think that makes you any stupider, more immature, or less manly than me. Hell, often I don’t want to play this sort of game either. But, you know, sometimes I do.

My wife and I played through one of the critical darlings of last year, L.A. Noire, recently. We were generally pretty disappointed with the experience. Leaving aside the sub-Law and Order plotting, the typically dodgy videogame writing, and the most uninteresting and unlikable hero I’ve seen in a long time, our prime source of frustration was that there was just no way to fuck this up. The player is reduced to stepping through endless series of rote tasks on the way to the next cut scene. The story is hard-coded as a series of death-defying cliffhangers, everything always happening at the last possible second in the most (melo-)dramatic way possible, and the game is quite happy to throw out everything you as the player have, you know, actually done to make sure it plays out that way. In the end, we were left feeling like bit players in someone else’s movie. Which might not have been too terrible, except it wasn’t even a very good movie.

In Wizardry, though, if you stagger out of the dungeon with two characters left alive with less than 10 hit points each, that experience is yours. It wasn’t scripted by a hack videogame writer; you own it. And if you slowly and methodically build up an ace party of characters, then take them down and stomp all over Werdna without any problems at all, there’s no need to bemoan the anticlimax. The satisfaction of a job well and thoroughly done is a reward of its own. After all, that’s pretty much how the good guys won World War II. To return to Barton’s thesis, it’s also the way you make a good life for yourself here in the real world; the people constantly scrambling out of metaphorical dungeons in the nick of time are usually not the happy and successful ones. If you’re in the right frame of mind, Wizardry, with its wire-frame graphics and its 10 K or so of total text, can feel more immersive and compelling than L.A. Noire, with all its polygons and voice actors, because Wizardry steps back and lets you make your own way through its world. (It also, of course, lets you fuck it up. Oh, boy, does it let you fuck it up.)

That’s one way to look at it. But then sometimes you’re surprised by six arch-mages and three dragons who proceed to blast you with spells that destroy your whole 15th-level party before anyone has a chance to do a thing in response, and you wish someone had at least thought to make sure that sort of thing couldn’t happen. Ah, well, sometimes life is like that too. Wizardry, like reality, can be a cruel mistress.

I’m making the Apple II version and its manual available for you to download, in case you’d like to live (or relive) the experience for yourself. You’ll need to remove write permissions from the first disk image before you boot with it. As part of its copy protection, Wizardry checks to see if the disk is write protected, and refuses to start if not. (If you’re using an un-write-protected disk, it assumes you must be a nasty pirate.)

Next time I’ll finish up with Wizardry by looking at what Softline magazine called the “Wizardry phenomenon” that followed its release.

 
 
