
Author Archives: Jimmy Maher

Micro Men

For practical purposes, the British PC industry lagged about three years behind the American. It wasn’t that it was impossible to buy a modern American machine. Commodore alone sold some 45,000 PET systems in Britain in that platform’s first three years of availability, and, while they were less common, you could certainly buy imported TRS-80s, Apple IIs, and Atari 400s and 800s if you had the money. But it’s that last part that’s key here. At a time when the pound was worth around $2.50, even the most bare-bones PET system would set you back at least £650, while an Apple II system of the type that was pretty much the expected standard in America by 1981 — a II Plus with 48 K, a color monitor, two floppy drives, perhaps a printer — would quickly climb to around the £2000 mark. To fully understand just how out of reach these prices made computers for the average Briton, you have to understand something about life there in the late 1970s and early 1980s.

The British economy hadn’t really been good for quite some years, suffering along with the rest of the country from a sort of general post-empire malaise punctuated by occasional embarrassing shocks like the Three-Day Week (1974), when chronic energy shortages forced the government to mandate that businesses could open only three days a week, and the Winter of Discontent (1978-79), when strikes across a whole range of industries brought the economy and, indeed, daily life to a virtual standstill. The latter events were sufficient to ensure the election as Prime Minister of perhaps the most polarizing figure in postwar British political history, Margaret Thatcher, on a platform that promised to drag Britain into the modern age, if necessary kicking and screaming, by rolling back most of the welfare state that had been erected in the aftermath of World War II. Yet nothing got better in the immediate wake of Thatcher’s election. In fact, as the government imposed harsh austerity measures and much of the country’s remaining industrial base collapsed under privatization, things just continued to get worse. By 1981 unemployment was at 12.5%, entire cities were reduced to industrial wasteland, riots were becoming a daily reality, and Thatcher was beset by howling mobs virtually everywhere she went. It felt like something more than just a serious recession; it felt dangerous. That summer The Specials summed up the mood of the country in the apocalyptic, chart-topping “Ghost Town.” Things would get slowly, painfully better after that low point, but it would be nearly a decade before unemployment shrank to reasonable levels and the modern economy Thatcher had promised really took hold with the beginning of the era of “cool Britannia.”

Suffice it to say, then, that most Britons would not have been able to afford American computers even if they were priced in line with what Americans paid for them. While PETs were sold to businesses and TRS-80s and Apple IIs to the handful of wealthy eccentrics who could afford them, a parallel domestic industry arose to serve everyday users at prices they could afford. It began in 1978, three years after the Altair in North America, with a handful of do-it-yourself kits that let hobbyists solder together contraptions of toggle switches and blinking lights. The British equivalent of the trinity of 1977 then arrived, right on schedule, in 1980.

So many characters from the early PC era are larger than life, and their photos seem to say it all about them. You’ve got, for example, Steve Jobs, the glib, handsome charmer whom you wouldn’t quite trust with your daughter.

You’ve got Jack Tramiel, who (Jewishness aside) looks like he should be sitting behind a mound of spaghetti mumbling about breaking kneecaps.

And you’ve got the man history remembers as the first to bring affordable computers to the British public, Sir Clive Sinclair. He looks like a mad genius inventor who should be making gadgets for James Bond — or maybe Maxwell Smart. If you left him alone at your house you’d probably return to find the cat on fire and the daughter’s hair turned blue.

Despite having absolutely no formal training, Sinclair graduated from gigs writing for electronics magazines in 1961 to found Sinclair Radionics, a firm with the perfect name for a mad scientist’s workshop. After years spent selling kits for making radios, amplifiers, test equipment, and the like to hobbyists, Sinclair Radionics started a consumer-electronics line, for which, as (once again) befitted any proper mad scientist, they produced groundbreaking gadgets with absurd design flaws and about the worst quality control imaginable. There was the Sinclair Executive, one of the first calculators small enough to fit in a pocket, but which had an unfortunate tendency to explode (!) when left on too long. And there was the Microvision, a portable television. Unfortunately, Sinclair had neglected to ask just who the hell really wanted to watch TV on a 2″ black-and-white screen, and it was a commercial flop.

But the stereotypical — or satirical — Sinclair product was the Black Watch.

On the plus side, it was one of the first digital wristwatches. On the negative side — gee, where to start? The Black Watch was chronically unreliable in actually, you know, keeping time, never a good feature in a watch; it was apparently very susceptible to climate changes, running at different speeds in different seasons. Batteries lasted for a solid ten days if you were lucky, and were almost as hard to replace as the watch had been to assemble in the first place. (Like many Sinclair products, it was available as a do-it-yourself kit as well as in pre-assembled form). It had a tendency to literally fall to pieces all at once as the clips that held it together fatigued. But even that wasn’t the worst possible failure. In what was becoming a Sinclair trademark, the Black Watch was also known to explode without warning.

Released in late 1975, the Black Watch was a fiasco that, combined with the onslaught of cheap calculators from Japan, marked the beginning of the end of Sinclair Radionics. Britain’s National Enterprise Board bought a majority interest in 1977, but quickly found Clive to be all but impossible to deal with, and found the hoped-for turnaround a tough nut to crack. The NEB finally pulled the plug on the company in the wake of Thatcher’s election; under Thatcher’s new paradigm, this sort of meddling in private business was of course exactly what the government should not be doing. By that time Clive had already started another company on the sly to wriggle free of government interference with his management decisions. He named it Science of Cambridge to keep his guiding hand at least somewhat under wraps. This was the company that would start the PC boom in Britain.

For an exaggerated but entertaining picture of Clive Sinclair the man, I’ll point you to the show whose title I stole for this post, the BBC one-off Micro Men. He was a genuinely talented inventor with a flair for the art of the possible and a determination to bring out products at prices that ordinary people could afford — a populist in the best sense of the word. He was also stupefyingly stubborn and arrogant, one of those supremely tedious people who love to talk about their IQ scores. (He was chairman of British Mensa for almost two decades.) In a typical interview for Your Computer magazine in 1981, he said, “I make mistakes, everyone does, but I never make them twice.” Someone of more average intelligence — like for instance your humble blogger here — might beg to differ that his history of exploding products would seem to point to a man who kept making the same mistakes over and over, thinking he could avoid the perspiration of polishing and perfecting through the inspiration of his initial brilliant idea. But what do I know?

Sinclair had been involved with some of those blinking-box computer kits I mentioned earlier, but he first entered the computer market in a big way with the release of the ZX80 in early 1980, the £100 machine I mentioned in an earlier post as Jack Tramiel’s inspiration for the Commodore VIC-20. Indeed, there are some similarities between the two men, both egocentric executives who were forced out of the calculator market by the cheaper Japanese competition. Yet we shouldn’t push the comparison too far. Sinclair was, to use the British term, a thoroughgoing boffin, filled with childlike enthusiasm for gadgets and for technology’s social potential. Tramiel, however, was all businessman; he would, to paraphrase one of Steve Jobs’s most famous pitches, have been perfectly happy to sell sugared water for his entire life if that gave him the competition he craved.

The ZX80 was, once again, available as either a semi-assembled kit or, for somewhat more, a completed product ready to plug in and use. With its tiny case and its membrane keyboard, it looked more like a large calculator than a computer. Indeed, its 1 K of standard RAM meant that it wasn’t good for much more than adding numbers until the user sprang for an expansion. Its standard BASIC environment was bizarre and seemed almost willfully unfriendly, and it was beset by the usual Sinclair reliability problems, with overheating a particular concern. (At least there were no reports of exploding ZX80s…) The design was so minimal that it didn’t even have a video chip, but rather relied on the CPU to generate a video signal entirely in software. From this stemmed one of its most distinctive “features”: because the CPU could only generate video when it was not doing something else, the screen went blank whenever a program was actually running, even momentarily every time the user hit a key. But it was a real computer, the first really within reach for the majority of Britons. Sinclair sold 100,000 of them in less than eighteen months.

Science of Cambridge was not the only British company to make a splash in the burgeoning home-computer market in 1980. Another young company, Acorn Computers, released its own machine, the Acorn Atom, later that year.

The Atom cost about 50% more than the ZX80, but still cost vastly less than any of the American machines. The extra money bought you a much more usable computer, with a proper keyboard, twice the RAM (even if 2 K was still sadly inadequate for actually doing much of anything), a display that didn’t flicker on and off, and a less, shall we say, idiosyncratic interpretation of BASIC. The competition between Sinclair and Acorn was personal. The head of Acorn, Chris Curry, had been for some twelve years Clive Sinclair’s right-hand man. The two had parted ways in late 1978, ironically because Curry wanted to produce a new microcomputer that Sinclair did not (yet) see the potential of. Curry went on to form Acorn with a partner, Hermann Hauser, and barely a year later — Sinclair having suddenly gotten the microcomputer religion — was going toe to toe with his erstwhile boss and mentor.

The following year, 1981, would prove a pivotal one. Sinclair, who changed the name of his company that year to Sinclair Research in the wake of Sinclair Radionics’s dissolution, introduced the ZX81 in March, an evolution of the ZX80 design that further reduced the price to just £50 in kit form, £70 fully assembled.

Amongst other modest improvements, the ZX81 could run in “slow” mode, in which enough CPU time was always reserved to update the display, eliminating the screen blanking at the cost of dramatically slower CPU throughput. And it could handle floating-point numbers, an impossibility on the ZX80. Of course, it was also a Sinclair product, with everything that entailed. The 16 K RAM expansion didn’t quite fit into its socket correctly; it would occasionally fall out of place with disastrous results. Actually, most of the connections had similar if less acute problems, forcing one to tiptoe gingerly around the machine. (Presumably those living near train tracks were just out of luck.)

The Commodore VIC-20 also arrived that year, at an initial price of about £180. Very much the lowest of the low-end machines in North America, the VIC-20 with its 5 K of RAM and color graphics capabilities was considerably more capable than either the unexpanded Sinclair or Acorn; thus the comparatively high price.

In North America, we saw the emergence of a commercial software market in 1978, as hobbyists like Scott Adams began packaging their programs on cassette tapes in Ziploc baggies and selling them. True to the three-year rule, a domestic British software market began to emerge in 1981, with a similar do-it-yourself personality of hand-copied cassettes and improvised packaging. (One could hear the creators’ children playing and similar background noises on some of these “data” tapes.) Software of course largely meant games, and a big part of games was text adventures.

A very good candidate for the first homegrown British example of the form is Planet of Death, a game for the ZX80 and ZX81 released around June of 1981 by Artic Software, a company formed by two university students, Richard Turner and Chris Thornton, the year before. Unlike the earliest American text-adventure coders, Turner and Thornton had plenty of examples to follow, thanks to their Video Genie computer, a Hong Kong-manufactured clone of the TRS-80 Model 1 that became more popular than the real thing in Britain. (In fact, they did their coding on the Genie, which shared the Sinclair machines’ Zilog Z-80 processor, and transferred their work to the more primitive Sinclairs.) The Artic adventure line, of which Planet of Death was the first, shows a marked Scott Adams influence, from the instructions insert that calls the player’s avatar her “puppet” to Artic’s system of numbering its adventures to help the devoted assemble a complete collection. (One difference: Artic used letters instead of numbers. Thus Planet of Death is Adventure A.)

Planet of Death doesn’t cut a very inspiring figure as the first example of British ludic narrative. Mostly it makes you appreciate its inspiration; whatever his other failings, Scott Adams always finished his games before he released them. Planet of Death plays like something you might find sloshing around the bottom of one of the modern IF Competitions, albeit without the built-in technical competency modern IF languages like Inform bring to the table. It’s as if Turner and Thornton ran out of memory and simply stopped where they were — which, come to think of it, is likely exactly what happened. You’ve got bugs galore, a maze that’s doubly frustrating because it ultimately leads nowhere, red herrings and half-finished puzzles, all wired up to an unusually obtuse two-word parser that thinks “with” is a verb. Yet, just as the ZX80 and ZX81 were real computers, however limited an implementation thereof, Planet of Death was a real adventure game, the first most of the British public had seen, and it sold well enough to spawn a whole line from Artic. It stands at the origin of an adventure-game scene that would become if anything even more vital and prolific than that in the U.S. — one we’ll be following in later posts.

In an important signifier of the growing acceptance of PCs in Britain, the omnipresent High Street newsstand chain WH Smith began selling the ZX81 in its stores with the arrival of the 1981 holiday season, billing it as “your first step into personal computing.” Just as the arrival of the VIC-20 in K-Mart stores in North America signaled a similar paradigm shift there, mainstream British stores would soon be stocking not just Sinclairs but also Acorns and Commodores. Within a few years British computer sales would surpass those in the U.S. on a per capita basis, as Britain became the most computer-mad nation on Earth. We’ll get back to that. For next time, though, we’ll return to the U.S. to look at the last major computer introduction of 1981, and the most long-lived and important of all.

 


Sentient Software

In 1979 a 30-year-old aspiring science-fiction writer named Mike Berlyn bought an Apple II. He had already finished and delivered his first two novels to Bantam Paperbacks, who would release them under the titles The Crystal Phoenix and The Integrated Man the following year. Now about to start on a third, he had heard that these new PCs were going to change the way writers wrote, and was eager to find out for himself. In the long term, the prediction was of course not wrong, but Berlyn quickly found that the technology of 1979 was, as they say, not quite there yet. The Apple II didn’t even support lower-case letters at that point, necessitating all sorts of kludges in early word processors that took them about as far away as you can get from the ideal of what you see is what you get. He ended up writing his third novel, eventually published by Ace Paperbacks as Blight under the pen name Mark Sonders in 1981, the old-fashioned way.

Still, Berlyn was far from disappointed with his purchase. The Apple II may still have been problematic from a practical standpoint, but Berlyn, like so many before and after him, found it an endlessly fascinating toy. When not writing that third book, he spent most of his time exploring his new machine. He found text adventures particularly compelling, but was disappointed by the obvious lack of literary skill of most of the people creating them. Being an enterprising sort, Berlyn decided when the third book was finished that, rather than start right away on a fourth, he’d like to try making a text adventure or two of his own. The result of that aspiration was Sentient Software, a company founded by Berlyn and his wife Muffy with the help of some other partners also located near the Berlyns’ Colorado home. Sentient published two games in 1981, Oo-Topos and Cyborg. Both were written and programmed entirely by Berlyn with a bit of help from his wife, and both were science-fiction adventures involving a damaged spaceship.

In many ways these games are very typical of their era. Technically, of the games I’ve already discussed on this blog, they are most similar to Softporn; they are built from a BASIC program with a two-word parser that fetches text and details of the storyworld as needed from data files stored on the disk. They are, in other words, about equivalent to the Scott Adams games in their parser and in the depth of their world modeling, but their use of the disk drive gives them space to be much more loquacious (certainly an important attribute for a “real” writer like Berlyn) and to have much bigger geographies. Indeed, their worlds are quite big ones, but made up mostly of empty rooms, connected via undescribed exits that necessitate painstaking mapping — and that’s outside the obligatory mazes. And of course, the parser makes many puzzles much harder than they ought to be. (Finding out what the correct verbs are, Cyborg tells us, is “half the fun.” Um, no.)

Yet in other ways these games represent something new and significant. Berlyn was the first author to come to the text adventure from the world of traditional fiction. He was interested in the form not, like the hackers who preceded him, as an interesting technical challenge, but rather as a potential new form of storytelling. The packaging of the games emphasized that they were not about “treasures” or “score,” but about “character development,” consistency, and plot. Some of those claims may have been more than a bit of a stretch, but Berlyn was trying, and that is significant in itself.

The plot of Cyborg, the more thematically audacious of the two games, casts you as, well, a cyborg, a human who has been physically and mentally merged with a robot. When play begins, you have amnesia, an adventure-game trope that would soon become a cliché but that may just see its first appearance here. Robbing your avatar of her memory allows Berlyn to place the two of you in the same mental situation. You both spend the game piecing together what brought you to this state, marooned on a stricken spaceship in orbit around a strange planet. Although you are expected to eventually repair the spaceship and lead your people — whom you eventually realize are colonists stored in suspended animation aboard the ship — to the planet below, the vast majority of the plot is not really story per se, but rather backstory, a frame to contain the game’s traditional puzzle- and mapping-oriented play. Within that frame, however, the game’s environments are indeed consistent and believable in a way that hadn’t been seen before. Like amnesia, Cyborg’s piece-together-the-back-story approach to plotting would soon become an adventure-game cliché. Still, it became a cliché because, at least in these earlier, less jaded days, it worked. Here it allows Berlyn to present a much richer fictional experience than would normally be possible given the primitive technology available to him. His use of it marks him as — and I don’t use this word lightly — a visionary, someone thinking about the medium’s potential in a very progressive way.

One of the most interesting aspects of Cyborg is its handling of the player / avatar split. You play a disembodied human intelligence who must communicate with another, synthetic entity to accomplish absolutely everything. The idea of a split or disembodied consciousness was one that Berlyn found endlessly intriguing; his first two novels both dealt with similar themes, and he would return to it yet again (and most famously) in his next game, Infocom’s Suspended. Here he gets huge mileage out of his concept, including using it to account for the limitations of his parser:

I MAY NOT SEEM VERY HELPFUL AT TIMES BUT I DO WHAT I CAN. MY VOCABULARY IS PRETTY LARGE CONSIDERING THE STATE MY CHIPS ARE IN. THE CIRCUITS USED TO MAKE LOGICAL DECISIONS AND CARRY OUT ORDERS ARE DIFFERENT THAN THOSE USED TO DESCRIBE LOCATIONS. I TELL YOU THIS SO YOU WILL UNDERSTAND THAT ALTHOUGH I MAY USE A WORD IN ONE SENSE THAT DOESN’T MEAN I’LL UNDERSTAND IT IN ALL CASES. IT WILL HELP US BOTH IF YOU ARE AS SPECIFIC AS POSSIBLE WHEN COMMUNICATING WITH ME. AVOID WORDS LIKE “USE” OR “CONTINUE.” IF YOU WANT TO DO SOMETHING I SAY WE CAN’T TRY A SIMILAR VERB.

The game’s simple hint system is likewise integrated into the fiction. You can ask your computerized companion what he thinks about locations or items, and occasionally — very occasionally — will get a helpful suggestion.

This unusual concept makes Cyborg one of the few (only?) text adventures ever written in the first-person plural. And again, it’s reflective of some unusually sophisticated thinking about the medium and its possibilities. Scott Adams and others had previously described the player’s avatar as her “puppet,” and at times seemed to give it a separate consciousness, at least if we can judge from the occasional snappy comebacks it gave to nonsensical or dangerous inputs. But no one had previously devised a scenario where even parser frustrations fitted into the scenario so seamlessly. Cyborg marks the first of a long line of games — and almost as many articles in game theory — to explicitly, consciously (ha!) play with the identities of player and avatar. Berlyn even extends the conceit to the verbs permitted. For instance, you cannot LOOK but must SCAN, and an INVENTORY becomes a BODY SCAN.

Given their obviously limited resources, Berlyn and company did the best they could marketing Oo-Topos and Cyborg. For packaging they used a very minimalist cardboard folder, but did commission some nice science-fiction art for the covers.

Still, and as Chuck Benton was discovering at about the same time, it was getting harder for the bedroom hacker without connections to distributors and the like to get his software into stores. Cyborg received an absolutely glowing review in the influential Softalk magazine: “Cyborg introduces the most exciting advances in adventuring since the original Adventure began the whole wonderful thing.” Yet even that wasn’t enough to overcome Sentient’s distribution problems and make the game a success.

Berlyn designed a couple more games for Sentient in 1982, albeit less ambitious arcade-oriented fare, called Gold Rush and Congo. They similarly didn’t make much of an impact. At this point Berlyn and his partners had some sort of falling out which led him to walk away from the company. Over the next couple of years, said partners funded ports of Berlyn’s adventures to the Atari 400 and 800, the IBM PC, and the Commodore 64, before allowing Sentient to fade quietly out of existence. Berlyn, however, was just getting started in interactive fiction, as we’ll see in later posts.

Cyborg is as fascinating conceptually as it can be frustrating to actually play, but it’s well worth a look by any student of the art of interactive fiction. I’ve therefore made the Apple II disk image available for you.

Next time: we’ll take our first tentative steps across the big pond.

 
 


The Future Was Here: The Commodore Amiga

As has been something of an open secret for quite a while now, I wrote a book. It’s called The Future Was Here: The Commodore Amiga, it’s published by the MIT Press, and now it’s shipping at last.

As the name would imply, my book is a history of the Amiga, a computing platform that pioneered much of the digital world of today. Indeed, my central thesis is that the Amiga represents the world’s first true multimedia personal computer. Much of the book is devoted to working out the implications of that claim.

One thing I wanted to do with the book, as with this blog, was to not neglect the technology in writing technological history. To understand what allowed the Amiga to, say, pioneer the field of desktop video (something that has become so ubiquitous in this era of YouTube that, like “desktop publishing,” the term has ceased to be a useful signifier), one has to understand a bit about its design, even about how the Amiga got its picture to the screen and how this differed from other contemporary computers. So while I don’t neglect culture and sociology, I do delve quite deeply into the inner workings of the machine. At the same time, I keep the jargon to a minimum and, when I do indulge, make it a point to explain it carefully beforehand. I thoroughly believe that any patient and interested reader is capable of understanding this stuff if the author just shows a little bit of care, and that’s the assumption that guided me throughout the writing. In other words: no computer science degrees are required. I’m going to go out on a limb here and say that I think many of you who enjoy this blog will also enjoy the book — even if only one chapter deals directly with games. (Hey, at least it’s one of the longest ones…)

Again as I do on this blog, I wanted to encourage active reading, to encourage you to go out and explore some of this technology and art for yourselves. With that in mind, I’ve created a website for the book that hosts a fair amount of content. The book itself can of course be purchased from many fine bookstores, online or brick and mortar.

Oh, and sorry things have been a little quiet with the blog lately. I should have some more stuff for you within a day or three.

 


My Eamon Problem

Fair warning — this post is going to be a bit meta. It has two purposes. The first is easily dispensed with: to tell you that I’ve revised my earlier posts on the history of Eamon to reflect what I believe to be a more supportable chronology which does not have the system appearing until late 1979. The rest of what follows describes briefly how I came to my conclusions. This is all rather inside baseball, but those of you thinking of growing up to become digital antiquarians yourselves might be interested in this slice of my poor detail-obsessed life.

Traditional histories have given Eamon a release date of 1980, presumably because the first published article about the system, a piece written by Don Brown himself for Recreational Computing, dates from the summer of that year. I initially saw no reason to doubt the traditional chronology. But then I made contact with John Nelson, founder of the National Eamon Users Club. He dropped a bomb on me by saying he had first played Eamon in 1978, and that at that time there were already four additional scenarios available. As the guy who probably did more for Eamon than anyone else, including its creator, Nelson was a hard fellow to doubt. So I wrote those posts based largely on his chronology, even though I never could manage to feel really confident in it. Ever since, those posts have remained the ones I’m least happy about. My dissatisfaction was such that I recently started rummaging through all of the early Eamon disks again, looking for something that would let me pin a definite date onto at least one of them, and thereby begin to build a chronology. As it happened, I found what I was looking for, and that in turn prompted me to revise the earlier articles and write this post. Before I tell you what I found, however, let me first state some of the misgivings that sent me looking in the first place.

The Apple II actually had two versions of the BASIC language. The original machine had in its ROM a very stripped-down version of the language, one that had been put together quickly by Steve Wozniak himself. This version was soon dubbed “Integer BASIC” because it had no support for floating-point (i.e., decimal) numbers, only integers. Because floating-point numbers are very important to certain types of applications, Apple quickly realized the need for a better, more complete implementation of BASIC. They bought one from Microsoft and spent considerable effort customizing it for the Apple II. They dubbed it Applesoft BASIC upon its release in January of 1978. Applesoft was initially not widely used, however, both because its earliest incarnation was quite buggy and because it was housed on tape or disk rather than in ROM, meaning the user had to load it into RAM to use it. With most machines still equipped with only 16 K of memory in these early days, Applesoft, which consumed 10 K by itself, was impractical for most users. It only really caught on from May of 1979, when Apple began shipping the II Plus with Applesoft in ROM; to run an Integer BASIC program on the II Plus, one had to load that language in from disk.

Yet Eamon is written in Applesoft BASIC. And there’s something else: the standard Eamon needs pretty much all of a 48 K Apple II’s memory. (The master disk did originally contain a special, stripped-down version of the program for 32 K machines.) It’s doubtful that it would even be possible to load Applesoft from disk and still have room for Eamon. Even if it was, a 48 K machine would have been a very unusually powerful one for 1978. After the 48 K Apple II Plus began shipping, however, the larger memory quite quickly became an expected standard.

And there’s the text-adventure chronology problem. Scott Adams first released Adventureland and Pirate Adventure during the second half of 1978 for the TRS-80. These games did not appear on the Apple II until early the following year, where they represent the first text adventures available for that platform. To have developed Eamon in 1978, Brown would have had to either: 1) be aware enough of the TRS-80 world that he played Adams’s games and decided to implement a similar parser-based interface on the Apple II; 2) have played Crowther and Woods’s Adventure or one of the other games it spawned on a big institutional computer; or 3) have come up with the concept of the text-adventure interface on his own, from scratch. None of these are impossible, but none seems hugely likely either. Depending on when in 1978 Eamon was released, an early Eamon even creates the somewhat earthshaking possibility that it may have been Brown, not Scott Adams, who first brought the text adventure to the microcomputer. Again, this just doesn’t feel right to me.

And then there’s that Recreational Computing article itself. In it Brown writes, “I know of five additional adventure diskettes.” Nelson, on the other hand, believes that “about 20” adventures were available by 1980. He suggested to me that Brown was perhaps referring to adventures that he himself had not written, but it’s very hard for me to read this sense into the paragraph in question. Nelson’s other suggestion, that the article had just lain on the shelf for many months before being printed, seems equally a stretch. If everything else pointed to an earlier chronology, I could accept such reasoning, but in combination with the other questions it becomes a good deal harder.

And then I found what I was looking for. Eamon #3, The Cave of the Mind, was the first not to be written by Brown himself, being from Jim Jacobson and Red Varnum. At the beginning of one of its programs is a REM statement with an actual date: January 30, 1980. This was enough to tip me back over to something much closer to the traditional chronology, with Brown developing the system in the latter half of 1979 in the wake of the Apple II Plus’s release. Sure, it’s possible that the date in the code of Cave represents a revision date rather than a date of completion or release, even though it doesn’t say this. But weighed together with all the other evidence, I feel pretty confident that a later date for Eamon is more likely than an earlier one.

None of this is meant to criticize John Nelson, who generously shared his memories with me. It’s just that 30 years is a long time. It’s also possible that Nelson might have played an earlier proto-Eamon, presumably written in Integer BASIC for an Apple II with much less memory, which Brown expanded at a later date into the Eamon we know today. Yet unless some real documentary evidence surfaces, or Brown suddenly starts talking, that remains only speculation.

So, the current Eamon articles still represent something of a best guess, and as such I’m still not entirely happy with them. But I think it’s a better guess than the one I made the first time around. Barring more new data, that will have to do.

 
 


Castle Wolfenstein

One night circa early 1981, Silas Warner of Muse Software dropped by a local 7-Eleven store, where he saw an arcade game called Berzerk.

Berzerk essentially played like an interactive version of the programming game Warner had just finished writing on the Apple II, Robot War. The player controlled a “humanoid” who looked more than a little like a robot himself, battling an array of other robots each equipped with their own armaments and personalities. But most impressively, Berzerk talked. The enemy robots shouted out science-fiction cliches like “Intruder alert!” and, Dalek style, single-word imperatives like “Attack!,” “Kill!,” and “Destroy!” Warner was entranced, especially considering that one of Muse’s flagship products was Warner’s own The Voice, an Apple II voice-synthesis system. Still, he’d had enough of robots for a while.

Then one night the old World War II flick The Guns of Navarone came on the television. The most successful film of 1961, it’s the story of a tiny group of Allied commandos who make their way across a (fictional) Greek island to destroy a vital German gun installation. Like most films of its ilk, it can be good escapist fun if you’re in the right frame of mind, even if most of its plot is forehead-slappingly silly. After seeing Navarone, Warner started thinking about whether it might be possible to replace robots with Nazis. One nice thing about filmic Nazis, after all, is that they tend to be as aggressively stupid as videogame robots, marching blithely into trap after ambush after deception while periodically shouting out “Achtung!,” “Jawohl!,” and “Sieg Heil!” in lieu of Berzerk‘s “Attack!,” “Kill!,” and “Destroy!” (One imagines that the Greeks in the movie, when not engaging in ethnically appropriate song and dance or seducing our heroes with their dewy-eyed, heroic-resistance-fighter gazes, must be wondering just how the hell they managed to get themselves conquered by this bunch of clowns.) Other elements of the movie also held potential. The heroes spend much of the latter half disguised in German uniforms, sneaking about until someone figures out the ruse and the killing has to start again. What a game mechanic!

So, from the odd couple of Berzerk and The Guns of Navarone was born Castle Wolfenstein.

Given Wolfenstein‘s position in the history of ludic narrative, it’s appropriate that it should have resulted from the pairing of an arcade game with a work of fiction. Wolfenstein was the first game to unify the two strands of computer gaming I described in my previous post, combining a real story and fictional context with action mechanics best carried out with a joystick or set of paddles. Yet this gameplay also demanded considerable thought, even strategizing, for success. In the console world, Warren Robinett had attempted a similar fusion a couple of years earlier with the Atari VCS game Adventure, which was directly inspired by Crowther and Woods’s game of the same name. Still, the VCS was horribly suited to the endeavor. Because it couldn’t display text at all, Adventure couldn’t set the scene like Wolfenstein did when the player first started a game. The following is mouthed by a dying cellmate in the castle/fortress in which you are being held prisoner:

“WELCOME TO CASTLE WOLFENSTEIN, MATE! THE NAZIS BROUGHT YOU HERE TO GET INFORMATION OUT OF YOU BEFORE THEY KILL YOU. THAT’S WHAT THIS PLACE IS FOR – IF YOU LISTEN YOU CAN HEAR THE SCREAMS. THEY’VE ALREADY WORKED ME OVER AND I’LL NEVER GET OUT ALIVE, BUT MAYBE YOU CAN WITH THIS GUN. I GOT IT OFF A DEAD GUARD BEFORE THEY CAUGHT ME. IT’S STANDARD ISSUE – EACH CLIP HOLDS 10 BULLETS, AND IT’S FULLY LOADED.

“BE CAREFUL, MATE, BECAUSE EVERY ROOM IN THE CASTLE IS GUARDED. THE REGULAR GUARDS CAN’T LEAVE THEIR POSTS WITHOUT ORDERS, BUT WATCH OUT FOR THE SS STORMTROOPERS. THEY’RE THE ONES IN THE BULLETPROOF VESTS AND THEY’RE LIKE BLOODY HOUNDS. ONCE THEY’VE PICKED UP YOUR TRAIL THEY WON’T STOP CHASING YOU UNTIL YOU KILL THEM AND YOU ALMOST NEED A GRENADE TO DO THAT.

“CASTLE WOLFENSTEIN IS FULL OF SUPPLIES TOO. I KNOW ONE CHAP WHO FOUND A WHOLE GERMAN UNIFORM AND ALMOST SNEAKED OUT PAST THE GUARDS. HE MIGHT HAVE MADE IT IF HE HADN’T SHOT SOME POOR SOD AND GOT THE SS ON HIS TRAIL. IF YOU CAN’T UNLOCK A SUPPLY CHEST, TRY SHOOTING IT OPEN. NOW I WOULDN’T GO SHOOTING AT CHESTS FULL OF EXPLOSIVES…

“ONE MORE THING. THE BATTLE PLANS FOR OPERATION RHEINGOLD ARE HIDDEN SOMEWHERE IN THE CASTLE. I’M SURE YOU KNOW WHAT IT WOULD MEAN TO THE ALLIED HIGH COMMAND IF WE COULD GET OUR HANDS ON THOSE…

“THEY’RE COMING FOR ME! GOOD LUCK!

“AIIIIEEEEEEE….”

Once into the game proper the text dries up, but there are still elements that make it feel like some facsimile of a real situation rather than an exercise in abstract arcade mechanics. The “verbs” available to the player are very limited in comparison to, say, even an old-school text adventure: move, aim, shoot, search a surrendered soldier or corpse, open a door or chest, throw a grenade, use a special item, take inventory. Yet the game’s commitment to simulation is such that this limited suite of actions yields a surprising impression of verisimilitude. One can, for example, use a grenade to blow up guards, but one can also use it to blast holes in walls. Such possibilities make the game a tour de force of early virtual worldbuilding; arguably no one had created a simulated world so believable on such a granular level prior to Wolfenstein.

There is even some scope for moral choice. If you catch them by surprise, guards will sometimes lift their arms in surrender, at which point you are free to kill them or leave them alive, as you will. Similarly, the game allows different approaches to its central problem of escape. One can attempt to methodically dispatch every single guard in every single room, but one can also try to dodge past them or outrun them, only killing as a last resort. Or one can find a uniform, and (in the game’s most obvious homage to The Guns of Navarone) try to just walk right out the front door that way. These qualities have led many to call Wolfenstein the first ancestor of the much later genre of stealth-based games like Metal Gear Solid and Thief. I don’t know as much about such games as I probably ought to, but I see no reason to disagree. The one limiting factor on the “sneaking” strategy is the need to find those battle plans in order to achieve full marks. To do that you have to search the various chests you come across, something which arouses the guards’ suspicion. (These may be videogame Nazis, but they aren’t, alas, quite that stupid.)

In order to make the game a replayable exercise (shades of the arcade again), the castle is randomly stocked with guards and supplies each time the player begins a new game. In addition, play progresses through a series of levels. The first time you play you are a private, and things are appropriately easier — although, it should be noted, never easy; Wolfenstein is, at least for me, a punishingly difficult game. Each time you beat the game on a given level, you increase in rank by one, and everything gets more difficult the next time around. The ultimate achievement is to become a field marshal.

In Warner’s own words, he threw “everything” Muse had on their shelf of technical goodies into Wolfenstein. For instance, we once more see here the high-res character generator Warner had also used in Robot War.

But most impressive was the inclusion of actual speech, a first for a computer game. To really appreciate how remarkable this was, you first have to understand how extraordinarily primitive the Apple II’s sound hardware actually was. The machine contained no sound synthesizer or waveform generator. A program could make sound only by directly toggling current to the speaker itself. Each time it did this, the result was an audible click. Click the speaker at the appropriate frequency, and you could create various beeps and boops, but nothing approaching the subtlety of human speech — or so went the conventional wisdom. The story of Wolfenstein‘s talking Nazis begins back in 1978, when a programmer named Bob Bishop released a pair of programs called Apple-Lis’ner and Appletalker.
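
The speaker-toggling technique described above can be sketched in a few lines of modern Python (a hypothetical illustration, not actual Apple II 6502 code; the sample rate and function name are my own):

```python
# A minimal sketch of 1-bit "click" synthesis: the only operation
# available is flipping the speaker between two positions. Toggling
# it 2*f times per second yields a square wave of pitch f.
SAMPLE_RATE = 44100  # playback rate for this illustration, not the Apple II's

def square_wave(freq_hz, duration_s):
    """Build a -1/+1 sample stream purely out of speaker toggles."""
    samples = []
    level = 1
    half_period = SAMPLE_RATE / (2 * freq_hz)  # samples between clicks
    next_toggle = half_period
    for i in range(int(SAMPLE_RATE * duration_s)):
        if i >= next_toggle:
            level = -level          # the "click": flip the speaker cone
            next_toggle += half_period
        samples.append(level)
    return samples

beep = square_wave(440, 0.1)        # a tenth of a second of 440 Hz
```

On the real machine the same effect was achieved by touching the speaker’s soft switch in a precisely timed loop; there is no volume control and no waveform other than this two-level square wave, which is exactly why speech seemed out of reach.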

Every Apple II shipped with a port that allowed a user to connect to it a standard cassette drive for storage, as well as the internal hardware to convert binary data into sound for recording and vice versa. Indeed, cassettes were the most common storage medium for the first few years of the Apple II’s life. Bishop realized that, thanks to the cassette port, every Apple II effectively contained a built-in audio digitizer, a way of converting sound data into binary data. If he attached a microphone to the cassette port, he should be able to “record” his own speech and store it on the computer. He devised a simplistic 1-bit sampling algorithm: for every sample at which the level of the incoming sound was above a certain threshold, click the speaker once. The result, as played back through Appletalker, was highly distorted but often intelligible speech. Warner refined Bishop’s innovations in 1980 in The Voice. It shipped with a library of pre-sampled phonemes, allowing the user to simply enter text at the keyboard and have the computer speak it — if the program properly deduced what phoneme belonged where, of course.
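
Bishop’s thresholding scheme can be reconstructed in miniature (a hypothetical sketch in modern Python; the names and the sine-wave stand-in are illustrative, not Bishop’s actual code):

```python
import math

# Hypothetical reconstruction of the 1-bit scheme described above:
# each incoming sample is kept only as "above the threshold or not,"
# and playback clicks the speaker once for every 1 bit.
def digitize_1bit(samples, threshold=0.0):
    """Reduce each analog sample to a single bit."""
    return [1 if s > threshold else 0 for s in samples]

def playback_clicks(bits):
    """Toggle the speaker once for every 1 bit."""
    level, out = 1, []
    for b in bits:
        if b:
            level = -level          # one click of the speaker
        out.append(level)
    return out

# A slow sine standing in for a voice waveform from the cassette port:
wave = [math.sin(2 * math.pi * i / 50) for i in range(200)]
bits = digitize_1bit(wave)          # 200 samples reduced to 200 bits
clicks = playback_clicks(bits)
```

Everything below the threshold is thrown away outright, which is why the reproduced speech was so heavily distorted and yet, with the rhythm and rough envelope of the voice preserved, often still intelligible.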

For Wolfenstein, Warner took advantage of an association that Muse had with a local recording studio, who processed Muse’s cassette software using equalizers and the like to create tapes that Muse claimed were more robust and reliable than those of the competition. Warner: “We went down there [to the studio] one fine day, and I spent several hours on the microphone saying, ‘Achtung!'” Given the primitive technology used to create them (not to mention Warner’s, um, unusual German diction), Wolfenstein‘s assorted shouts were often all but indecipherable. Rather than hurting, however, the distortion somehow added to the nightmare quality of the scenario as a whole, increasing the tension rather than the contrary.

Castle Wolfenstein

Warner’s magnum opus as a designer and programmer, Castle Wolfenstein remained Muse’s most successful product and reliable seller from its release in September of 1981 through Muse’s eventual dissolution, not only in its original Apple II incarnation but also in ports to the Atari 400 and 800, MS-DOS, and (most notably) the Commodore 64. Muse produced a belated sequel in 1984, Beyond Castle Wolfenstein, in which the player must break into Adolf Hitler’s underground bunker to assassinate the Führer himself rather than break out of a generic Nazi fortress. However, while Warner was involved in design discussions for that game, the actual implementation was done by others. The following year, Muse suddenly collapsed, done in by a string of avoidable mistakes in a scenario all too common for the early, hacker-led software publishers. Warner stayed in the games industry for another decade after Muse, but never found quite the creative freedom and that certain spark of something that had led to Robot War and Castle Wolfenstein in his banner year of 1981. He died at the age of 54 in 2004. Wolfenstein itself, of course, lived on when id Software released Wolfenstein 3D, the precursor to the landmark Doom, in 1992.

Whether we choose to call Castle Wolfenstein the first PC action adventure or the first stealth game or something else, its biggest importance for ludic narrative is its injection of narrative elements into a gameplay framework completely divorced from the text adventures and CRPGs that had previously represented the category on computers. As such it stands at the point of origin of a trend that would over years and decades snowball to enormous — some would say ridiculous — proportions. Today stories in games are absolutely everywhere, from big-budget FPSs to casual puzzlers. With its violence and cartoon-like Nazi villains, Wolfenstein is perhaps also a harbinger of how cheap and coarse so many of those stories would be. But then again, we can’t really blame Warner for that, can we?

If you’d like to try Silas Warner’s greatest legacy for yourself, you can download the Apple II disk image and manual from here.

Next time we have some odds and ends to clean up as we begin to wrap up 1981 at last.

 
 
