
Edu-Ware

In 1978 the Minnesota Educational Computing Consortium (MECC), home of Don Rawitsch and his game The Oregon Trail, was on the cutting edge of computers in education — so much so that, long before business or the general public took much notice of the things, it began considering how to bring microcomputers into Minnesota classrooms as a supplement to the teletypes, dumb terminals, and large time-sharing systems that were the order of the day. MECC went to the leading producers of microcomputers of the time for bids, a list that was of course headed by Radio Shack. The Shack responded in its usual uninterested fashion.

Some of the companies, particularly Radio Shack, were not enamored with this process and thought it was kind of hokey — the process being the bid process and state requirements — and so they weren’t real particular about how they responded. We told Radio Shack, “You know, if you don’t respond in the right way we can’t accept your bid,” and they weren’t willing to change. Everything was flying high and they were selling TRS-80s like mad.

Although most in the MECC bureaucracy would have preferred to deal with large, stable Radio Shack, tiny Apple bid aggressively and enthusiastically, and won the day. MECC ordered 500 Apple IIs, a huge order in a year in which Apple would sell just 7600 machines in total. Granted, Apple discounted the sale so heavily that it’s doubtful they made much of anything from it. But that mattered not a whit. In a storied career filled with savvy marketing moves, Steve Jobs never made a savvier one.

MECC not only began moving Apples into Minnesota classrooms, but also began porting its huge library of BASIC educational programs onto the platform. Let’s think about what this state of affairs means for a moment. MECC was already known all over the country as the leader in computer-based education, the example which all of the other more conservative, less well-funded educational institutions tended to belatedly follow. When those folks began thinking about microcomputers for their classrooms, they naturally asked what MECC was using: the Apple II. When they considered educational software, they once again had to take note of MECC’s rich library — a library being rapidly ported to just one microcomputer, the Apple II.

To push the process of educational adoption along, by 1979 Apple was beginning to promote the Apple II heavily as an educational tool in its print advertising.

Jobs realized that getting his computers into schools was the key to conquering a much bigger market: the home. Education was after all one of the most frequently cited reasons that families bought a computer. When Mom and Dad considered what computer to buy for Junior, the Apple II — the computer with all that educational software, the computer that Junior’s school was using, the computer that Junior himself had told them about and already knew how to operate — seemed to many the only logical choice, even if it did cost a bit more and, increasingly as time went on, didn’t have quite as impressive specifications as competing models. Those discounted Apple IIs for schools were loss leaders that paid off handsomely for years. Indeed, as soon as Apple had enough money to make it feasible, they increased their largesse, offering to give an Apple II absolutely free to every elementary school in the country. Moves like that created a stranglehold that even Apple itself proved unable to break in later years, when it wished the Apple II would just die already in favor of the Apple III and, later, the Macintosh. From the September 24, 1990, edition of InfoWorld:

Nearly 10 years later, elementary schools continue to buy Apple II technology. As a result, the strategy has kept what many industry observers contend is an overpriced and technically obsolete system in the mainstream. And it provided Apple with a virtual lock on the elementary school market that continues today.

That said, there was a bit more than smart marketing behind the Apple II’s classroom domination. Thanks to Woz’s chip- and circuit-saving design as well as the relative primitiveness of the machine itself, there wasn’t much to go wrong on the Apple II internally. And externally the thing was built like a tank. These factors helped the machines survive literally years of abuse at the hands of a whole generation of schoolchildren pounding their keyboards in frustration, poking at their screens with sticky fingers, and jamming the occasional floppy disk into a drive sideways. Teachers grew to love these tough little partners that offered them an occasional reprieve from classes full of wailing children.

Nor is it fair, regardless of the purity or lack thereof of Apple’s motivations in promoting education so heavily, to frame the discussion only in terms of sales and market share. Woz’s hackerish creation found itself a key player in an ongoing debate about the best way to approach education itself. We can perhaps come to understand that by looking at the career of one man, Sherwin Steffin. (Much of what follows is drawn from a portrait of Steffin and his company, Edu-Ware, that appeared in the May, 1981, issue of Softalk magazine.)

Steffin was not one of the young whiz kids of the microcomputer revolution. By the time Apple IIs began arriving in classrooms, he was almost 45 years old, with an impressive career in education already behind him. In addition to earning a bachelor’s degree in experimental psychology and a master’s degree in instructional technology, Steffin had combated gangs as a social worker in Detroit, taught junior high school for seven years, served as media director for a Chicago school district, served as coordinator of instructional system development at Northeastern University for four years, and developed instructional television for the National Technical Institute for the Deaf in Rochester, New York. From 1977, he worked as a senior research analyst at UCLA. The alleged crises in education that he wrestled with there sound eerily familiar today:

Conventional education was in serious difficulty. The end product was being perceived as less competent, less skilled, less curious, and lacking in the desire to learn.

Schools were filled with frustration. The teachers were getting the brunt of the public’s animosity, but the teacher had no mandate within which to work. It seemed that equally as important as teaching reading, writing, and arithmetic were his duties in teaching social skills, making the students patriotic, keeping them off drugs, and teaching them sex education without enlightening them about sex.

“Educational technologists” of Steffin’s generation tended to be greatly enamored with the theories of psychologist B.F. Skinner, inventor of “radical behaviorism.” Skinner believed that all human behavior is predetermined by genetics and by previous experience — the idea of a quasi-mystical “free will” was a useless chimera. He wrote a book, The Technology of Teaching, which applied radical behaviorism to the field of education and outlined his idea of “programmed instruction.” Skinner proposed education as essentially a series of rote drills: the student is asked a question, responds, and is immediately informed whether her answer was correct, ad infinitum. Educational technologists developed “programmed learning machines,” automated devices to implement the concept of programmed instruction. Not surprisingly, they weren’t a big success. In a rare show of unity, teachers and students alike loathed them. Not only were they inexpressibly dull to work with, but teachers especially found them downright dehumanizing (a sentiment that, given the thrust of his ideas, Skinner may have embraced wholeheartedly). They correctly argued that many subjects, such as art and literature appreciation and critical thinking, could hardly be pounded home through rote drills.

Steffin began to diverge from his peers, finding the programmed learning machines inadequate. All their other failings aside, they were only good for what he called “convergent thinking, meaning that problems are posed and all students are brought to the same answer.” Divergent thinking, the encouragement of individual critical thinking skills and even opinion, was surely at least as important, for he believed that “thinking is the path to freedom.” With the arrival of relatively cheap microcomputers like the Apple II, Steffin saw a much more flexible tool for learning than the straitjacketed programmed learning machines. In spite of having no programming experience or innate aptitude, he developed a program called Compu-Read to teach reading skills, first on UCLA’s big institutional system but later on an Apple II he had bought for research purposes. Like so many other semi-professional / semi-hobbyist programmers in those early years, he initially developed software as a sideline, licensing Compu-Read to the biggest of the early Apple II software publishers, Programma International. In the spring of 1979, however, Steffin was laid off from his post at UCLA. Rather than looking for another, he decided to jump into computer education with both feet, founding Edu-Ware in partnership with a UCLA student, Steve Pederson. Together they began churning out software at a feverish clip, copying the disks themselves and selling them in the Ziploc baggies that were typical of the era.

Edu-Ware’s offerings can be divided into three broad categories. Most were competent but plebeian educational drills that, truth be told, were not all that different from the old programmed learning machines. Their names were about as unexciting as their contents: Compu-Read, Fractions, Decimals, Arithmetic Skills, Compu-Spell, Algebra. (At least no one could say they weren’t descriptive.)

In remarkably short order, however, other Edu-Ware programs began to appear that occupied a hazy space at the intersection of educational tool, game, and simulation. Windfall: The Oil Crisis Game placed the player in charge of a large (albeit fictional) oil company. She could and presumably would try to win, of course, but she would also, inevitably, learn about a complex system that had almost broken down to produce the 1979 oil crisis. Network placed her in charge of a television network, balancing shows, schedules, and ratings, and learning about the pressures of mass media in the process. Terrorist focused on another subject much on people’s minds as the Iranian hostage crisis dragged on, placing her in the role of terrorist or government authority in hostage taking, airplane hijacking, or nuclear blackmail scenarios.

Created at a time when most other software either ignored the real world entirely or dealt with it only in the form of military hardware, these programs are remarkable for the way they determinedly engage with real, pressing social questions. But they are not just dry simulations. Each reflects an agenda and makes an argument about the world, making them perhaps the first examples of what would come to be called “persuasive games.” Their procedural rhetoric reflects the liberal worldview of Edu-Ware themselves. Network might even qualify as the first procedural satire, being inspired by the 1976 black comedy film of the same name.

And the third category? They don’t pretend to be simulations, or anything other than games for that matter, but they’re no less fascinating for all that. More on them next time.


I Have No Mouth, and I Must Scream

To the person who [is] contemplating buying this game, what would I say? I would say take your money and give it to the homeless, you’ll do more good. But if you are mad to buy this game, you’ll probably have a hell of a lot of fun playing it, it will probably make you uneasy, and you’ll probably be a smarter person when you’re done playing the game. Not because I’m smarter, but because everything was done to confuse and upset you. I am told by people that it is a game unlike any other game around at the moment and I guess that’s a good thing. Innovation and novelty is a good thing. It would be my delight if this game set a trend and all of the arcade bang-bang games that turn kids into pistol-packing papas and mamas were subsumed into games like this in which ethical considerations and using your brain and unraveling puzzles become the modus operandi. I don’t think it will happen. I don’t think you like to be diverted too much. So I’m actually out here to mess with you, if you want to know it. We created this game to give you all the stuff you think you want, but to put a burr into your side at the same time. To slip a little loco weed into your Coca-Cola. See you around.

— Harlan Ellison

Harlan Ellison made a very successful career out of biting the hands that fed him. The pint-sized dervish burst into literary prominence in the mid-1960s, marching at the vanguard of science fiction’s New Wave. In the pages of Frederik Pohl’s magazine If, he paraded a series of scintillatingly trippy short stories that were like nothing anyone had ever seen before, owing as much to James Joyce and Jack Kerouac as they did to Isaac Asimov and Robert Heinlein. Ellison demanded, both implicitly in his stories and explicitly in his interviews, that science fiction cast off its fetish for shiny technology-fueled utopias and address the semi-mythical Future in a more humanistic, skeptical way. His own prognostications in that vein were almost unrelentingly grim: “‘Repent, Harlequin!’ Said the Ticktockman” dealt with a future society where everyone was enslaved to the ticking of the government’s official clock; “I Have No Mouth, and I Must Scream” told of the last five humans left on a post-apocalyptic Earth, kept alive by an insane artificial intelligence so that he could torture them for all eternity; “A Boy and His Dog” told of a dog who was smarter than his feral, amoral human master, and helped him to find food to eat and women to rape as they roamed another post-apocalyptic landscape. To further abet his agenda of dragging science fiction kicking and screaming into the fearless realm of True Literature, Ellison became the editor of a 1967 anthology called Dangerous Visions, for which he begged a diverse group of established and up-and-coming science-fiction writers to pick a story idea that had crossed their mind but was so controversial and/or provocative that they had never dared send it to a magazine editor — and then to write it up and send it to him instead.

Ellison’s most impactful period in science fiction was relatively short-lived, ending with the publication of the somewhat underwhelming Again, Dangerous Visions in 1972. He obstinately refused to follow the expected career path of a writer in his position: that of writing a big, glossy novel to capitalize on the cachet his short stories had generated. Meanwhile even his output of new stories slowed in favor of more and more non-fiction essays, while those stories that did emerge lacked some of the old vim and vinegar. One cause of this was almost certainly his loss of Frederik Pohl as editor and bête noire. Possessing very different literary sensibilities, the two had locked horns ferociously over the most picayune details — Pohl called Ellison “as much pain and trouble as all the next ten troublesome writers combined” — but Pohl had unquestionably made Ellison’s early stories better. He was arguably the last person who was ever truly able to edit Harlan Ellison.

No matter. Harlan Ellison’s greatest creation of all was the persona of Harlan Ellison, a role he continued to play very well indeed right up until his death in 2018. “He is a test of our credulity,” wrote his fellow science-fiction writer David Gerrold in 1984. “He is too improbable to be real.”

Harlan Ellison on the set of Star Trek with Leonard Nimoy and William Shatner.

The point of origin of Harlan Ellison as science fiction’s very own enfant terrible can be traced back to the episode of Star Trek he wrote in 1966. “The City on the Edge of Forever” is often called the best single episode of the entire original series, but to Ellison it was and forever remained an abomination in its broadcast form. As you may remember, it’s a time-travel story, in which Kirk, Spock, and McCoy are cast back into the Great Depression on Earth, where Kirk falls in love with a beautiful social worker and peace activist, only to learn that he has to let her die in a traffic accident in order to prevent her pacifism from infecting the body politic to such an extent that the Nazis are able to win World War II. As good as the produced version of the episode is, Ellison insisted until his death that the undoctored script he first submitted was far, far better — and it must be acknowledged that at least some of the people who worked on Star Trek agreed with him. In a contemporaneous memo, producer Bob Justman lamented that, following several rounds of editing and rewriting, “there is hardly anything left of the beauty and mystery that was inherent in the screenplay as Harlan originally wrote it.” For his part, Ellison blamed Star Trek creator Gene Roddenberry loudly and repeatedly for “taking a chainsaw” to his script. In a fit of pique, he submitted his undoctored script for a 1967 Writers Guild Award. When it won, he literally danced on the table in front of Roddenberry inside the banquet hall, waving the trophy in Roddenberry’s face. Dorothy Fontana, the writer who had been assigned the unenviable task of changing Ellison’s script to fit with the series’s budget and its established characters, was so cowed by his antics that for 30 years she dared not tell him she had done so.

Despite this incident and many another, lower-profile one much like it, Ellison continued to work in Hollywood — as, indeed, he had been doing even before his star rose in literary science-fiction circles. Money, he forthrightly acknowledged, was his principal reason for writing for a medium he claimed to loathe. He liked creating series pilots most of all, he said, “because when they screw those up, they just don’t go on the air. I get paid and I’ve written something nice and it doesn’t have to get ruined.” His boorish behavior in meetings with the top movers and shakers of Hollywood became legendary, as did the lawsuits he fired hither and yon whenever he felt ill-used. Why did Hollywood put up with it? One answer is that Harlan Ellison was at the end of the day a talented writer who could deliver the goods when it counted, who wasn’t unaware of the tastes and desires of the very same viewing public he heaped with scorn at every opportunity. The other is that his perpetual cantankerousness made him a character, and no place loves a character more than Hollywood.

Then again, one could say the same of science-fiction fandom. Countless fans who had read few to none of Ellison’s actual stories grew up knowing him as their genre’s curmudgeonly uncle with the razor wit and the taste for blood. For them, Harlan Ellison was famous simply for being Harlan Ellison. Any lecture or interview he gave was bound to be highly entertaining. An encounter with Ellison became a rite of passage for science-fiction journalists and critics, who gingerly sidled up to him, fed him a line, and then ducked for cover while he went off at colorful and profane length.

Harlan Ellison was a talk-show regular during the 1970s. And small wonder: drop a topic in his slot, and something funny, outrageous, or profound — or all three — was guaranteed to come out.

It’s hard to say how much of Ellison’s rage against the world was genuine and how much was shtick. He frequently revealed in interviews that he was very conscious of his reputation, and hinted at times that he felt a certain pressure to maintain it. And, in keeping with many public figures with outrageous public personas, Ellison’s friends did speak of a warmer side to his private personality, of a man who, once he brought you into his fold, would go to ridiculous lengths to support, protect, and help you.

Still, the flame that burned in Ellison was probably more real than otherwise. He was at bottom a moralist, who loathed the hypocrisy and parsimony he saw all around him. Often described as a futurist, he was closer to a reactionary. Nowhere could one see this more plainly than in his relationship to technology. In 1985, when the personal-computer revolution had become almost old hat, he was still writing on a mechanical typewriter, using reasoning that sounded downright Amish.

The presence of technology does not mean you have to use that technology. Understand? The typewriter that I have — I use an Olympia and I have six of them — is the best typewriter ever made. That’s the level of technology that allows me to do my job best. Electric typewriters and word processors — which are vile in every respect — seem to me to be crutches for bad writing. I have never yet heard an argument for using a word processor that didn’t boil down to “It’s more convenient.” Convenient means lazy to me. Lazy means I can write all the shit I want and bash it out later. They can move it around, rewrite it later. What do I say? Have it right in your head before you sit down, that’s what art is all about. Art is form, art is shape, art is pace, it is measure, it is the sound of music. Don’t write slop and discordancy and think just because you have the technology to cover up your slovenliness that it makes you a better writer. It doesn’t.

Ellison’s attitude toward computers in general was no more nuanced. Asked what he thought about computer entertainment in 1987, he pronounced the phrase “an oxymoron.” Thus it came as quite a surprise to everyone five years later when it was announced that Harlan Ellison had agreed to collaborate on a computer game.



The source of the announcement was a Southern California publisher and developer called Cyberdreams, which had been founded by Pat Ketchum and Rolf Klug in 1990. Ketchum was a grizzled veteran of the home-computer wars, having entered the market with the founding of his first software publisher DataSoft on June 12, 1980. After a couple of years of spinning their wheels, DataSoft found traction when they released a product called Text Wizard, for a time the most popular word processor for Atari’s 8-bit home-computer line. (Its teenage programmer had started on the path to making it when he began experimenting with ways to subtly expand margins and increase line spacings in order to make his two-page school papers look like three…)

Once established, DataSoft moved heavily into games. Ketchum decided early on that working with pre-existing properties was the best way to ensure success. Thus DataSoft’s heyday, which lasted from roughly 1983 to 1987, was marked by a bewildering array of television shows (The Dallas Quest), martial-arts personalities (Bruce Lee), Sunday-comics characters (Heathcliff: Fun with Spelling), blockbuster movies (Conan, The Goonies), pulp fiction (Zorro), and even board games (221 B Baker St.), as well as a bevy of arcade ports and British imports. The quality level of this smorgasbord was hit or miss at best, but Ketchum’s commercial instinct for the derivative proved well-founded for almost half a decade. Only later in the 1980s, when more advanced computers began to replace the simple 8-bit machines that had been the perfect hosts for DataSoft’s cheap and cheerful games, did his somewhat lackadaisical attitude toward the nuts and bolts of his products catch up to him. He then left DataSoft to work for a time at Sullivan Bluth Interactive Media, which made ports of the old laser-disc arcade game Dragon’s Lair for various personal-computing platforms. Then, at the dawn of the new decade, he founded another company of his own with his new partner Rolf Klug.

The new company’s product strategy was conceived as an intriguing twist on that of the last one he had founded. Like DataSoft, Cyberdreams would rely heavily on licensed properties and personalities. But instead of embracing DataSoft’s random grab bag of junk-food culture, Cyberdreams would go decidedly upmarket, a move that was very much in keeping with the most rarefied cultural expectations for the new era of multimedia computing. Their first released product, which arrived in 1992, was called Dark Seed; it was an adventure game built around the striking and creepy techno-organic imagery of the Swiss artist H.R. Giger, best known for designing the eponymous creatures in the 1979 Ridley Scott film Alien. If calling Dark Seed a “collaboration” with Giger is perhaps stretching the point — although Giger licensed his existing paintings to Cyberdreams, he contributed no new art to the game — the end result certainly does capture his fetishistic aesthetic very, very well. Alas, it succeeds less well as a playable game. It runs in real time, meaning events can and will run away from any player who isn’t omniscient enough to be in the exact right spot at the exact right time, while its plot is most kindly described as rudimentary — and don’t even get me started on the pixel hunts. Suffice to say that few games in history have screamed “style over substance” louder than this one. Still, in an age hungry for fodder for the latest graphics cards and equally eager for proof that computer games could be as provocative as any other form of media, it did quite well.

By the time of Dark Seed’s release, Cyberdreams was already working on another game built around the aesthetic of another edgy artist most famous for his contributions to a Ridley Scott film: Syd Mead, who had done the set designs for Blade Runner, along with those of such other iconic science-fiction films as Star Trek: The Motion Picture, TRON, 2010, and the Alien sequel Aliens. CyberRace, the 1993 racing game that resulted from the partnership, was, like its Cyberdreams predecessor, long on visuals and short on satisfying gameplay.

Well before that game was completed — in fact, before even Dark Seed was released — Pat Ketchum had already approached Harlan Ellison to ask whether he could make a game out of his classic short story “I Have No Mouth, and I Must Scream.” Doing so was, if nothing else, an act of considerable bravery, given not only Ellison’s general reputation but his specific opinion of videogames as “an utter and absolute stupid waste of time.” And yet, likely as much to Ketchum’s astonishment as anyone else’s, he actually agreed to the project. Why? That is best left to Ellison to explain in his own inimitable fashion:

The question frequently asked of me is this: “Since it is common knowledge that you don’t even own a computer on which you could play an electronic game this complex, since it is common knowledge that you hate computers and frequently revile those who spend their nights logging onto bulletin boards, thereby filling the air with pointless gibberish, dumb questions that could’ve been answered had they bothered to read a book of modern history or even this morning’s newspaper, and mean-spirited gossip that needs endless hours the following day to be cleaned up; and since it is common knowledge that not only do you type your books and columns and TV and film scripts on a manual typewriter (not even an electric, but an actual finger-driven manual), but that the closest you’ve ever come to playing an actual computer- or videogame is the three hours you wasted during a Virgin Airlines flight back to the States from the UK; where the hell do you get off creating a high-tech cutting-edge enigma like this I Have No Mouth thing?”

To which my usual response would be, “Yo’ Mama!”

But I have been asked to attempt politeness, so I will vouchsafe courtesy and venture some tiny explication of what the eff I’m doing in here with all you weird gazoonies. Take your feet off the table.

Well, it goes back to that Oscar Wilde quote about perversion: “You may engage in a specific perversion once, and it can be chalked up to curiosity. But if you do it again, it must be presumed you are a pervert.”

They came to me in the dead of night, human toads in silk suits, from this giant megapolitan organization called Cyberdreams, and they offered me vast sums of money — all of it in pennies, with strings attached to each coin, so they could yank them back in a moment, like someone trying to outsmart a soft-drink machine with a slug on a wire — and they said, in their whispery croaky demon voices, “Let us make you a vast fortune! Just sell us the rights to use your name and the name of your most famous story, and we will make you wealthy beyond the dreams of mere mortals, or even Aaron Spelling, our toad brother in riches.”

Well, I’d once worked for Aaron Spelling on Burke’s Law, and that had about as much appeal to me as spending an evening discussing the relative merits of butcher knives with O.J. Simpson. So I told the toads that money was something I had no trouble making, that money is what they give you when you do your job well, and that I never do anything if it’s only for money. ‘Cause money ain’t no thang.

Well, for the third time, they then proceeded to do the dance, and sing the song, and hump the drums, and finally got down to it with the fuzzy ramadoola that can snare me: they said, “Well (#4), you’ve never done this sort of thing. Maybe it is that you are not capable of doing this here now thing.”

Never tell me not to go get a tall ladder and climb it and open the tippy-topmost kitchen cabinet in my mommy’s larder and reach around back there at the rear of the topmost shelf in the dark with the cobwebs and the spider-goojies and pull out that Mason jar full of hard nasty petrified chickpeas and strain and sweat to get the top off the jar till I get it open and then take several of those chickpeas and shove them up my nose. Never tell me that. Because as sure as birds gotta swim an’ fish gotta fly, when you come back home, you will find me lying stretched out blue as a Duke Ellington sonata, dead cold with beans or peas or lentils up my snout.

Or, as Oscar Wilde put it: “I couldn’t help it. I can resist anything except temptation.”

And there it is. I wish it were darker and more ominous than that, but the scaldingly dopey truth is that I wanted to see if I could do it. Create a computer game better than anyone else had created a computer game. I’d never done it, and I was desirous of testing my mettle. It’s a great flaw with me. My only flaw, as those who have known me longest will casually attest. (I know where they live.)

Having entered the meeting hoping only to secure the rights to Ellison’s short story, Pat Ketchum thus walked away having agreed to a full-fledged collaboration with the most choleric science-fiction writer in the world, a man destined to persist forevermore in referring to him simply as “the toad.” Whether this was a good or a bad outcome was very much up for debate.

Ketchum elected to pair Ellison with David Sears, a journalist and assistant editor for Compute! magazine who had made Cyberdreams’s acquaintance when he was assigned to write a preview of Dark Seed, then had gone on to write the hint book for the game. Before the deal was consummated, he had been told only that Cyberdreams hoped to adapt “one of” Ellison’s stories into a game: “I was thinking, oh, it could be ‘Repent, Harlequin!’ Said the Ticktockman,’ or maybe ‘A Boy and His Dog,’ and it’s going to be some kind of RPG or something.” When he was told that it was to be “I Have No Mouth, and I Must Scream,” he was taken aback: “I was like, what? There’s no way [to] turn that into a game!” In order to fully appreciate his dismay, we should look a bit more closely at the story in question.

Harlan Ellison often called “No Mouth” “one of the ten most-reprinted stories in the English language,” but this claim strikes me as extremely dubious. Certainly, however, it is one of the more frequently anthologized science-fiction classics. Written “in one blue-white fit of passion,” as Ellison put it, “like Captain Nemo sitting down at his organ and [playing] Toccata and Fugue in D Minor,” it spans no more than fifteen pages or so in the typical paperback edition, but manages to pack quite a punch into that space.

The backstory entails a three-way world war involving the United States, the Soviet Union, and China and their respective allies, with the forces of each bloc controlled by a supercomputer in the name of maximal killing efficiency. That last proved to be a mistake: instead of merely moving ships and armies around, the American computer evolved into a sentient consciousness and merged with its rival machines. The resulting personality was twisted by its birthright of war and violence. Thus it committed genocide on the blighted planet’s remaining humans, with the exception of just five of them, which it kept alive to physically and psychologically torture for its pleasure.  As the story proper opens, it’s been doing so for more than a century. Our highly unreliable narrator is one of the victims, a paranoid schizophrenic named Ted; the others, whom we meet only as the sketchiest of character sketches, are named Gorrister, Benny, Ellen (the lone woman in the group), and Nimdok. The computer calls itself AM, an acronym for its old designation of “Allied Mastercomputer,” but also a riff on Descartes: “I think, therefore I AM.”

The story’s plot, such as it is, revolves around the perpetually starving prisoners’ journey to a place that AM has promised them contains food beyond their wildest dreams. It’s just one more of his cruel jokes, of course: they wind up in a frigid cavern piled high with canned food, without benefit of a can opener. But then something occurs which AM has failed to anticipate: Ted and Ellen finally accept that there is only one true means of escape open to them. They break off the sharpest stalactites they can find and use them to kill the other three prisoners, after which Ted kills Ellen. But AM manages to intervene before Ted can kill himself. Enraged at having his playthings snatched away, he condemns the very last human on Earth to a fate more horrific even than what he has already experienced:

I am a great soft jelly thing. Smoothly rounded, with no mouth, with pulsing white holes filled by fog where my eyes used to be. Rubbery appendages that were once my arms; bulks rounding down into legless humps of slippery matter. I leave a moist trail when I move. Blotches of diseased, evil gray come and go on my surface, as though light is being beamed from within.

Outwardly: dumbly, I shamble about, a thing that could never have been known as human, a thing whose shape is so alien a travesty that humanity becomes more obscene for the vague resemblance.

Inwardly: alone. Here. Living under the land, under the sea, in the belly of AM, whom we created because our time was badly spent and we must have known unconsciously that he could do it better. At least the four of them are safe at last.

AM will be the madder for that. It makes me a little happier. And yet… AM has won, simply… he has taken his revenge…

I have no mouth. And I must scream.

Harlan Ellison was initially insistent that the game version of No Mouth preserve this miserably bleak ending. He declared himself greatly amused by the prospect of “a game that you cannot possibly win.” Less superciliously, he noted that the short story was intended to be, like so much of his work, a moral fable: it was about the nobility of doing the right thing, even when one doesn’t personally benefit — indeed, even when one will be punished terribly for it. To change the story’s ending would be to cut the heart out of its message.

Thus when poor young David Sears went to meet with Ellison for the first time — although Cyberdreams and Ellison were both based in Southern California, he himself was still working remotely from his native Mississippi — he faced the daunting prospect of convincing one of the most infamously stubborn writers in the world — a man who had spent decades belittling no less rarefied a character than Gene Roddenberry over the changes to his “City on the Edge of Forever” script — that such an ending just wouldn’t fly in the contemporary games market. The last company to make an adventure game with a “tragic” ending had been Infocom back in 1983, and they’d gotten so much blowback that no one had ever dared to try such a thing again. People demanded games that they could win.

Much to Sears’s own surprise, his first meeting with Ellison went very, very well. He won Ellison’s respect almost immediately, when he asked a question that the author claimed never to have been asked before: “Why are these [people] the five that AM has saved?” The question pointed a way for the game of No Mouth to become something distinctly different from the story — something richer, deeper, and even, I would argue, more philosophically mature.

Ellison and Sears decided together that each of AM’s victims had been crippled inside by some trauma before the final apocalyptic war began, and it was this that made them such particularly delightful playthings. The salt-of-the-earth truck driver Gorrister was wracked with guilt for having committed his wife to a mental institution; the hard-driving military man Benny was filled with self-loathing over his abandonment of his comrades in an Asian jungle; the genius computer scientist Ellen was forever reliving a brutal rape she had suffered at the hands of a coworker; the charming man of leisure Ted was in reality a con artist who had substituted sexual conquest for intimacy. The character with by far the most stains on his conscience was the elderly Nimdok, who had served as an assistant to Dr. Josef Mengele in the concentration camps of Nazi Germany.

You the player would guide each of the five through a surreal, symbolic simulacrum of his or her checkered past, helpfully provided by AM. While the latter’s goal was merely to torture them, your goal would be to cause them to redeem themselves in some small measure, by looking the demons of their past full in the face and making the hard, selfless choices they had failed to make the first time around. If they all succeeded in passing their tests of character, Ellison grudgingly agreed, the game could culminate in a relatively happy ending. Ellison:

This game [says] to the player there is more to the considered life than action. Television tells you any problem can be solved in 30 minutes, usually with a punch in the jaw, and that is not the way life is. The only thing you have to hang onto is not your muscles, or how pretty your face is, but how strong is your ethical behavior. How willing are you to risk everything — not just what’s convenient, but everything — to triumph. If someone comes away from this game saying to himself, “I had to make an extremely unpleasant choice, and I knew I was not going to benefit from that choice, but it was the only thing to do because it was the proper behavior,” then they will have played the game to some advantage.

Harlan Ellison and David Sears were now getting along fabulously. After several weeks spent working on a design document together, Ellison pronounced Sears “a brilliant young kid.” He went out of his way to be a good host. When he learned, for example, that Sears was greatly enamored with Neil Gaiman’s Sandman graphic novels, he called up said writer himself on his speakerphone: “Hi, Neil. This is David. He’s a fan and he’d love to talk to you about your work.” In retrospect, Ellison’s hospitality is perhaps less than shocking. He was in fact helpful and even kind throughout his life to young writers whom he deemed to be worth his trouble. David Sears was obviously one of these. “I don’t want to damage his reputation because I’m sure he spent decades building it up,” says Sears, “but he’s a real rascal with a heart of gold — but he doesn’t tolerate idiots.”

Harlan Ellison prepares to speak at the 1993 Computer Game Developers Conference.

The project had its industry coming-out party at the seventh annual Computer Game Developers Conference in May of 1993. In a measure of how genuinely excited Harlan Ellison was about it, he agreed to appear as one of the most unlikely keynote speakers in GDC history. His speech has not, alas, been preserved for posterity, but it appears to have been a typically pyrotechnic Ellison rant, judging by the angry response of Computer Gaming World editor Johnny L. Wilson, who took Ellison to be just the latest in a long line of clueless celebrity pundits swooping in to tell game makers what they were doing wrong. Like all of the others, Wilson said, Ellison “didn’t really understand technology or the challenges faced daily by his audience [of game developers].” His column, which bore the snarky title of “I Have No Message, but I Must Scream,” went on thusly:

The major thesis of the address seemed to be that the assembled game designers need to do something besides create games. We aren’t quite sure what he means.

If he means to take the games which the assembled designers are already making and infuse them with enough human emotion to bridge the gaps of interpersonal understanding, there are designers trying to accomplish this in many different ways (games with artificial personalities, multiplayer cooperation, and, most importantly, with story).

If he objects to the violence which is so pervasive in both computer and video games, he had best revisit the anarchic and glorious celebration of violence in his own work. Violence is an easy way to express conflict and resolution in any art form. It can also be powerful. That is why we advocate a more careful use of violence in certain games, but do not editorialize against violence per se.

Harlan Ellison says that the computer-game design community should quit playing games with their lives. We think Ellison should stop playing games with his audiences. It’s time to put away his “Bad Melville” impression and use his podium as a “futurist” to challenge his audiences instead of settling for cheap laughs and letting them miss the message.

Harlan Ellison seldom overlooked a slight, whether in print or in person, and this occasion was no exception. He gave Computer Gaming World the rather hilarious new moniker of Video Wahoo Magazine in a number of interviews after Wilson’s editorializing was brought to his attention.

But the other side of Harlan Ellison was also on display at that very same conference. David Sears had told Ellison shortly before he made his speech that he really, really wanted a permanent job in the games industry, not just the contract work he had been getting from Cyberdreams. So, Ellison carried a fishbowl onstage with him, explained to the audience that Sears was smart and creative as heck and urgently needed a job, and told them to drop their business cards in the bowl if they thought they might be able to offer him one. “Three days later,” says Sears, “I had a job at Virgin Games. If he called me today [this interview was given before Ellison’s death] and said, ‘I need you to fix the plumbing in my bathroom,’ I’d be on a plane.”

Ellison’s largesse was doubly selfless in that it stopped his No Mouth project in its tracks. With Sears having departed for Virgin Games, it spent at least six months on the shelf while Cyberdreams finished up CyberRace and embarked on a Dark Seed II. Finally Pat Ketchum handed it to a new hire, a veteran producer and designer named David Mullich.

It so happens that we met Mullich long, long ago, in the very early days of these histories. At the dawn of the 1980s, as a young programmer just out of university, he worked for the pioneering educational-software publisher Edu-Ware, whom he convinced to let him make some straight-up games as well. One of these was an unauthorized interactive take on the 1960s cult-classic television series The Prisoner; it was arguably the first commercial computer game in history to strive unabashedly toward the status of Art.

Mullich eventually left Edu-Ware to work for a variety of software developers and publishers. Rather belying his earliest experiments in game design, he built a reputation inside the industry as a steady hand well able to churn out robust and marketable if not always hugely innovative games and educational products that fit whatever license and/or design brief he was given. Yet the old impulse to make games with something to say about the world never completely left him. He was actually in the audience at the Game Developers Conference where Harlan Ellison made his keynote address; in marked contrast to Johnny L. Wilson, he found it bracing and exciting, not least because “I Have No Mouth, and I Must Scream” was his favorite short story of all time. Half a year or so later, Pat Ketchum called Mullich up to ask if he’d like to help Ellison get his game finished. He didn’t have to ask twice; after all those years spent slogging in the trenches of commerce, here was a chance for Mullich to make Art again.

His first meeting with Ellison didn’t begin well. Annoyed at the long delay from Cyberdreams’s side, Ellison mocked him as “another member of the brain trust.” It does seem that Mullich never quite developed the same warm relationship with Ellison that Sears had enjoyed: Ellison persisted in referring to him as “this new David, whose last name I’ve forgotten” even after the game was released. Nonetheless, he did soften his prejudicial first judgment enough to deem Mullich “a very nice guy.” Said nice guy took on the detail work of refining Sears and Ellison’s early design document — which, having been written by two people who had never made a game before, had some inevitable deficiencies — into a finished script that would combine Meaning with Playability, a task his background prepared him perfectly to take on. Mullich estimates that 50 percent of the dialog in the finished game is his, while 30 percent is down to Sears and just 20 percent to Ellison himself. Still, even that level of involvement was vastly greater than that of most established writers who deigned to put their names on games. And of course the core concepts of No Mouth were very much Ellison and Sears’s.

Pat Ketchum had by this point elected to remove Cyberdreams from the grunt work of game development; instead the company would act as a design mill and publisher only. Thus No Mouth was passed to an outfit called The Dreamers Guild for implementation under Mullich’s supervision. That became another long process; the computer game of I Have No Mouth, and I Must Scream wasn’t finally released until late 1995, fully three and a half years after Pat Ketchum had first visited Harlan Ellison to ask his permission to make it.

The latter’s enthusiasm for the project never abated over the course of that time. He bestowed his final gift upon David Mullich and the rest of Cyberdreams when he agreed to perform the role of AM himself. The result is one of the all-time great game voice-acting performances; Ellison, a man who loved to hear himself speak under any and all circumstances, leans into the persona of the psychopathic artificial intelligence with unhinged glee. After hearing him, you’ll never be able to imagine anyone else in the role.


Upon the game’s release, Ellison proved a disarmingly effective and professional spokesman for it; for all that he loved to rail against the stupidity of mainstream commercial media, he had decades of experience as a writer for hire, and knew the requirements of marketing. He wrote a conciliatory, generous, and self-deprecatory letter to Computer Gaming World — a.k.a. Video Wahoo Magazine — after the magazine pronounced No Mouth its Adventure Game of the Year. He even managed to remember David Mullich’s last name therein.

With a bewildering admixture of pleasure and confusion — I’m like a meson which doesn’t know which way to quark — I write to thank you and your staff. Pleasure, because everybody likes to cop the ring as this loopy caravanserie chugs on through Time and Space. Confusion, because — as we both know — I’m an absolute amateur at this exercise. To find myself not only avoiding catcalls and justified laughter at my efforts, but to be recognized with a nod of approval from a magazine that had previously chewed a neat, small hole through the front of my face… well, it’s bewildering.

David Sears and I worked very hard on I Have No Mouth. And we both get our accolades in your presentation. But someone else who had as much or more to do with bringing this project to fruition is David Mullich. He was the project supervisor and designer after David Sears moved on. He worked endlessly, and with what Balzac called “clean hands and composure,” to produce a property that would not shame either of us. It simply would not have won your award had not David Mullich mounted the barricades.

I remember when I addressed the Computer Game Designers’ banquet a couple of years ago, when I said I would work to the limits of my ability on I Have No Mouth, but that it would be my one venture into the medium. Nothing has changed. I’ve been there, done that, and now you won’t have to worry about me making a further pest of myself in your living room.

But for the honor you pay me, I am grateful. And bewildered.

Ellison’s acknowledgment of Mullich’s contribution is well-taken. Too often the makers of games that contain or purport to contain Deep Meaning believe this gives them a pass on the fundamentals of being playable and soluble. (For example, I might say, if you’ll allow me just a bit of Ellisonian snarkiness, that a large swath of the French games industry operated on this assumption for many years.) That No Mouth doesn’t fall victim to this fallacy — that it embeds its passion plays within the framework of a well-designed puzzle-driven adventure game — must surely be thanks to Mullich. In this sense, then, Sears’s departure came at the perfect time, allowing the experienced, detail-oriented Mullich to run with the grandiose concept which Sears and Ellison, those two game-design neophytes, had cooked up together. It was, one might say, the best of both worlds.

But, lest things start to sound too warm and fuzzy, know that Harlan Ellison was still Harlan Ellison. In the spring of 1996, he filed a lawsuit against Cyberdreams for unpaid royalties. Having spent his life in books and television, he may simply have failed to understand just how limited the sales prospects of an artsy, philosophical computer game like this one really were, regardless of how many awards it won. (Witness his comparison of Cyberdreams to the television empire of Aaron Spelling in one of the quotes above; in reality, the two operated not so much in different media galaxies as different universes.) “With the way the retail chain works, Cyberdreams probably hadn’t turned a profit on the game by the time the lawsuit was filed,” noted Computer Gaming World. “We’re not talking sales of Warcraft II here, folks.” I don’t know the details of Ellison’s lawsuit, nor what its ultimate outcome was. But I do know that David Mullich estimates today that No Mouth probably sold only about 40,000 copies in all.

Harlan Ellison didn’t always keep the sweeping promises he made in the heat of the moment; he huffily announced on several occasions that he was forever abandoning television, the medium with which he had spent so much of his career locked in a deadly embrace, only to be lured back in by money and pledges that this time things would be different. He did, however, keep his promise of never making another computer game. And that, of course, makes the one game he did help to make all the more special. I Have No Mouth, and I Must Scream stands out from the otherwise drearily of-its-time catalog of Cyberdreams as a multimedia art project that actually works — works as a game and, dare I say it, as a form of interactive literature. It stands today as a rare fulfillment of the promise that so many saw in games back in those heady days when “multimedia” was the buzzword of the zeitgeist — the promise of games as a sophisticated new form of storytelling capable of the same relevance and resonance as a good novel or movie. This is by no means the only worthwhile thing that videogames can be, nor perhaps even the thing they are best at being; much of the story of gaming during the half-decade after No Mouth’s release is that of a comprehensive rejection of the vision Cyberdreams embodied. The company went out of business in 1997, by which time its artsy-celebrity-driven modus operandi was looking as anachronistic as Frank Sinatra during the heyday of the Beatles.

Nevertheless, I Have No Mouth, and I Must Scream remains one of the best expressions to stem from its confused era, a welcome proof positive that sometimes the starry-eyed multimedia pundits could be right. David Mullich went on to work on such high-profile, beloved games as Heroes of Might and Magic III and Vampire: The Masquerade — Bloodlines, but he still considers No Mouth one of the proudest achievements of a long and varied career that has encompassed the naïvely idealistic and the crassly commercial in equal measure. As well he should: No Mouth is as meaningful and moving today as it was in 1995, a rare example of a game adaptation that can be said not just to capture but arguably to improve on its source material. It endures as a vital piece of Harlan Ellison’s literary legacy.


In I Have No Mouth, and I Must Scream, you explore the traumas of each of the five people imprisoned by the psychotic supercomputer AM, taken in whatever order you like. Finding a measure of redemption for each of them opens up an endgame which offers the same chance for the rest of humanity — a dramatic departure from the infamously bleak ending of the short story on which the game is based.

Each character’s vignette is a surreal evocation of his tortured psyche, but is also full of opportunities for him to acknowledge and thereby cleanse himself of his sins. Harlan Ellison particularly loved this bit of symbolism, involving the wife and mother-in-law of the truck driver Gorrister: he must literally let the two principal women in his life off the hook. (Get it?) Ellison’s innocent delight in interactions like these amused the experienced game designer David Mullich, for whom they were old hat.

In mechanical terms, No Mouth is a fairly typical adventure game of its period. Its engine’s one major innovation can be seen in the character portrait at bottom left. The background here starts out black, then lightens through progressive shades of green as the character in question faces his demons (literally here, in the case of Ted — the game is not always terribly subtle). Ideally, each vignette will conclude with a white background. Be warned: although No Mouth mostly adheres to a no-deaths-and-no-dead-ends philosophy — “dying” in a vignette just gets the character bounced back to his cage, whence he can try again — the best ending becomes impossible to achieve unless every character demonstrates a reasonable amount of moral growth in the process of completing his vignette.

The computer genius Ellen is mortified by yellow, the color worn by the man who raped her. Naturally, the shade features prominently in AM’s decor.

The professional soldier Benny confronts the graves of the men who died under his command.

If sins can be quantified, then Nimdok, the associate to Dr. Mengele, surely has the most to atone for. His vignette involves the fable of the Golem of Prague, who defended the city’s Jewish ghetto against the pogroms of the late sixteenth century. Asked whether he risked trivializing the Holocaust by putting it in a game, Harlan Ellison answered in the stridently negative: “Nothing could trivialize the Holocaust. I don’t care whether you mention it in a comic book, on bubble-gum wrappers, in computer games, or write it in graffiti on the wall. Never forget. Never forget.”


People say, “Oh, you’re so prolific.” That’s a remark made by assholes who don’t write. If I were a plumber and I repaired 10,000 toilets, would they say, “Boy, you’re a really prolific plumber?”

If I were to start over, I would be a plumber. I tell that to people, they laugh. They think I’m making it up. It’s not funny. I think a plumber, a good plumber who really cares and doesn’t overcharge and makes sure things are right, does more good for the human race in a given day than 50 writers. In the history of the world, there are maybe, what, 20, 30 books that ever had any influence on anybody, maybe The Analects of Confucius, maybe The History of the Peloponnesian Wars, maybe Uncle Tom’s Cabin. If I ever write anything that is remembered five minutes after I’m gone, I will consider myself having done the job well. I work hard at what I do; I take my work very seriously. I don’t take me particularly seriously. But I take the work seriously. But I don’t think writing is all that inherently a noble chore. When the toilet overflows, you don’t need Dostoevsky coming to your house.

That’s what I would do, I would get myself a job as a plumber. I would go back to bricklaying, which I used to do. I would become an electrician. Not an electrical engineer. I would become an electrician. I would, you know, install a night light in a kid’s nursery, and at the end of the day, if I felt like writing, I would write something. I don’t know what that has to do with the game or anything, but you asked so I told you.

— Harlan Ellison (1934-2018)

(Sources: the books The Way the Future Was by Frederick Pohl, These Are the Voyages: Season One by Marc Cushman with Susan Osborn, The Cambridge Companion to Science Fiction edited by Edward James and Farah Mendlesohn, I Have No Mouth & I Must Scream: Stories by Harlan Ellison, and I Have No Mouth, and I Must Scream: The Official Strategy Guide by Mel Odom; Starlog of September 1977, April 1980, August 1980, August 1984, November 1985, and December 1985; Compute! of November 1992; Computer Gaming World of March 1988, September 1992, July 1993, September 1993, April 1996, May 1996, July 1996, August 1996, November 1996, and June 1999; CU Amiga of November 1992 and February 1993; Next Generation of January 1996; A.N.A.L.O.G. of June 1987; Antic of August 1983; Retro Gamer 183. Online sources include a 1992 Game Informer retrospective on I Have No Mouth, and I Must Scream and a history of Cyberdreams at Game Nostalgia. My thanks also go to David Mullich for a brief chat about his career and his work on No Mouth.

I Have No Mouth, and I Must Scream is available as a digital purchase at GOG.com.)

 
 


Apple, Carmen Sandiego, and the Rise of Edutainment

If there was any one application that was the favorite amongst early boosters of personal computing, it was education. Indeed, it could sometimes be difficult to find one of those digital utopians who was willing to prioritize anything else — unsurprisingly, given that so much early PC culture grew out of places like The People’s Computer Company, who made “knowledge is power” their de facto mantra and talked of teaching people about computers and using computers to teach with equal countercultural fervor. Creative Computing, the first monthly magazine dedicated to personal computing, grew out of that idealistic milieu, founded by an educational consultant who filled a big chunk of its pages with plans, schemes, and dreams for computers as tools for democratizing, improving, and just making schooling more fun. A few years later, when Apple started selling the II, they pushed it hard as the learning computer, making deals with the influential likes of the Minnesota Educational Computing Consortium (MECC) of Oregon Trail fame that gave the machine a luster none of its competitors could touch. For much of the adult public, who may have had their first exposure to a PC when they visited a child’s classroom, the Apple II became synonymous with the PC, which was in turn almost synonymous with education in the days before IBM turned it into a business machine. We can still see the effect today: when journalists and advertisers look for an easy story of innovation to which to compare some new gadget, it’s always the Apple II they choose, not the TRS-80 or Commodore PET. And the iconic image of an Apple II in the public’s imagination remains a group of children gathered around it in a classroom.

For all that, though, most of the early educational software really wasn’t so compelling. The works of Edu-Ware, the first publisher to make education their main focus, were fairly typical. Most were created or co-created by Edu-Ware co-founder Sherwin Steffin, who brought with him a professional background of more than twenty years in education and education theory. He carefully outlined his philosophy of computerized instruction, backed as it was by all the latest research into the psychology of learning, in long-winded, somewhat pedantic essays for Softalk and Softline magazines, standard bearers of the burgeoning Apple II community. Steffin’s software may or may not have correctly applied the latest pedagogical research, but it mostly failed at making children want to learn with it. The programs were generally pretty boring exercises in drill and practice, lacking even proper titles. Fractions, Arithmetic Skills, or Compu-Read they said on their boxes, and fractions, arithmetic, or (compu-)reading was what you got, a series of dry drills to work through without a trace of wit, whimsy, or fun.

The other notable strand of early PC-based education was the incestuous practice of using the computer to teach kids about computers. The belief that being able to harness the power of the computer through BASIC would somehow become a force for social democratization and liberation is an old one, dating back to even before the first issues of Creative Computing — to the People’s Computer Company and, indeed, to the very researchers at Dartmouth College who created BASIC in the 1960s. As BASIC’s shortcomings became more and more evident, other instructional languages and courses based on them kept popping up in the early 1980s: PILOT, Logo, COMAL, etc. This craze for “computer literacy,” which all but insisted that every kid who didn’t learn to program was going to end up washing dishes or mowing lawns for a living, peaked along with the would-be home-computer revolution in about 1983. Advocating for programming as a universal life skill was like suggesting in 1908 that everyone needed to learn to take a car apart and put it back together to prepare for the new world that was about to arrive with the Model T — which, in an example of how some things never really change, was exactly what many people in 1908 were in fact suggesting. Joseph Weizenbaum of Eliza fame, always good for a sober corrective to the more ebullient dreams of his colleagues, offered a shockingly prescient take on the real computerized future by comparing the computer to the electric motor.

There are undoubtedly many more electric motors in the United States than there are people, and almost everybody owns a lot of electric motors without thinking about it. They are everywhere, in automobiles, food mixers, vacuum cleaners, even watches and pencil sharpeners. Yet, it doesn’t require any sort of electric-motor literacy to get on with the world, or, more importantly, to be able to use these gadgets.

Another important point about electric motors is that they’re invisible. If you question someone using a vacuum cleaner, of course they know that there is an electric motor inside. But nobody says, “Well, I think I’ll use an electric motor programmed to be a vacuum cleaner to vacuum the floor.”

The computer will also become largely invisible, as it already is to a large extent in the consumer market. I believe that the more pervasive the computer becomes, the more invisible it will become. We talk about it a lot now because it is new, but as we get used to the computer it will retreat into the background. How much hands-on computer experience will students need? The answer, of course, is not very much. The student and the practicing professional will operate special-purpose instruments that happen to have computers as components.

The pressure to make of every kid a programmer gradually faded as the 1980s wore on, leaving programming to those of us who found it genuinely fascinating. Today even the term “computer literacy,” always a strange linguistic choice anyway, feels more and more like a relic of history as this once-disruptive and scary new force has become as everyday as, well, the electric motor.

As for those other educational programs, they — at least some of them — got better by mid-decade. Programs like Number Munchers, Math Blaster, and Reader Rabbit added a bit more audiovisual sugar to their educational vegetables along with a more gamelike framework to their repetitive drills, and proved better able to hold children’s interest. For all the early rhetoric about computers and education, one could argue that the real golden age of the Apple II as an educational computer didn’t begin until about 1983 or 1984.

By that time a new category of educational software, partly a marketing construct but partly a genuinely new thing, was becoming more and more prominent: edutainment. Trip Hawkins, founder of Electronic Arts, has often claimed to have invented the portmanteau for EA’s 1984 title Seven Cities of Gold, but this is incorrect; a company called Milliken Publishing was already using the label for their programs for the Atari 8-bit line in late 1982, and it was already passing into common usage by the end of 1983. Edutainment dispensed with the old drill-and-practice model in favor of more open, playful forms of interaction that nevertheless promised, sometimes implicitly and sometimes explicitly, to teach. The skills they taught, meanwhile, were generally not the rigid, disembodied stuff of standardized tests but rather were embedded organically into living virtual worlds. It’s all but impossible to name any particular game as the definitive first example of such a nebulous genre, but a good starting point might be Tom Snyder and Spinnaker Software.

Tom Snyder, 1984

Snyder had himself barely made it through high school. He came to blame his own failings as a student on his inability to relate to exactly the sort of arbitrary, contextless education that would later mark the early era of PC educational software: “Here, learn this set of facts. Write this paper. This is what you must know. This is what’s important.” When he became a fifth-grade teacher years later, he made it a point to ground his lessons always in the real world, to tell his students why it was useful to know the things he taught them and how it all related to the world around them. He often used self-designed games, first with pencil, paper, and cardboard and later on computers, to let his students explore knowledge and its ramifications. In 1980 he founded a groundbreaking development company, Tom Snyder Productions, to commercialize some of those efforts. One of them became Snooper Troops, published as one of Spinnaker’s first titles in 1982; it had kids wandering around a small town trying to solve a mystery by compiling clues and using their powers of deduction. The next year’s In Search of the Most Amazing Thing, still a beloved memory of many of those who played it, combined clue-gathering with elements of economics and even diplomacy in a vast open world. Unlike so much other children’s software, Snyder’s games never talked down to their audience; children are after all just as capable of sensing when they’re being condescended to as anyone else. They differed most dramatically from the drill-and-practice software that preceded them in always making the educational elements an organic part of their worlds. One of Snyder’s favorite mantras applies to educational software as much as it does to any other creative endeavor and, indeed, to life: “Don’t be boring.” The many games of Tom Snyder Productions, most of which were not actually designed by Snyder himself, were often crude and slow, written as often as not in BASIC. But, at least at the conceptual level, they were seldom boring.

It’s of course true that a plain old game requiring a degree of thoughtfulness can be very hard to disentangle from a full-on work of edutainment. Like so much else in life, the boundaries here can be nebulous at best, and often had as much to do with marketing, with the way a title was positioned by its owner, as with any intrinsic qualities of the title itself. When we go looking for those intrinsic qualities, we can come up with only a grab bag of attributes of which any given edutainment title was likely to share a subset: being based on real history or being a simulation of some real aspect of science or technology; being relatively nonviolent; emphasizing thinking and logical problem-solving rather than fast reflexes. Like pornography, edutainment is something that many people seemed to just know when they saw it.

That said, there were plenty of titles that straddled the border between entertainment and edutainment. Spinnaker’s Telarium line of adventure games is a good example. Text-based games that were themselves based on books, published by a company that had heretofore specialized in education and edutainment… it wasn’t hard to grasp why parents might be expected to find them appealing, even if they were never explicitly marketed as anything other than games. Spinnaker’s other line of adventures, Windham Classics, blurred the lines even more by being based on acknowledged literary classics of the sort kids might be assigned to read in school rather than popular science fiction and fantasy, and by being directly pitched at adolescents of about ten to fourteen years of age. Tellingly, Tom Snyder Productions wrote one of the Windham Classics games; Dale Disharoon, previously a developer of Spinnaker educational software like Alphabet Zoo, wrote two more.

A certain amount of edutational luster clung to the text adventure in general; it was implicit in much of the talk about interactive fiction as a new form of literature that was so prevalent during the brief bookware boom. One could even say it clung to the home computer itself, in the form of notions about “good screens” and “bad screens.” The family television was the bad screen, locus of those passive and mindless broadcasts that have set parents and educators fretting almost from the moment the medium was invented, and now the home of videogames, the popularity of which caused a reactionary near-hysteria in some circles; they would inure children to violence (if they thought Space Invaders was bad, imagine what they’d say about the games of today!) and almost literally rot their brains, making of them mindless slack-jawed zombies. The computer monitor, on the other hand, was the good screen, home of more thoughtful and creative forms of interaction and entertainment. What parent wouldn’t prefer to see her kid playing, say, Project: Space Station rather than Space Invaders? Home-computer makers and software publishers — at least the ones who weren’t making Space Invaders clones — caught on to this dynamic early and rode it hard.

As toy manufacturers had realized decades before, there are essentially two ways to market children’s entertainment. One way is to appeal to the children themselves, to make them want your product and nag Mom and Dad until they relent. The other is to appeal directly to Mom and Dad, to convince them that what you’re offering will be an improving experience for their child, perhaps with a few well-placed innuendoes, if you can manage them, about how said child will be left behind if she doesn’t have your product. With that in mind, it can be an interesting experiment to look at the box copy from software of the early home-computer era whilst asking yourself whether it’s written for the kids who were most likely to play it or the parents who were most likely to pay for it — or whether it hedges its bets by offering a little for both. Whatever else it was, emphasizing the educational qualities of your game was just good marketing; a 1984 survey found that 46 percent of computers in homes had been purchased by parents with the primary goal of improving their children’s education. It was the perfect market for the title that would come to stand alongside The Oregon Trail as one of the two classic examples of 1980s edutainment software.

Doug, Cathy, and Gary Carlston, 1983

The origins of the game that would become known as Where in the World is Carmen Sandiego? are confused, with lots of oft-contradictory memories and claims flying around. However, the most consistent story has it beginning with an idea by Gary Carlston of Brøderbund Software in 1983. He and his brother Doug had been fascinated by their family’s almanac as children: “We used to lie there and ask each other questions out of the almanac.” This evolved into impromptu quiz games in bed after the lights went out. Gary now proposed a game or, better yet, a series of games which would have players running down a series of clues about geography and history, answerable via a trusty almanac or other reference work to be included along with the game disk right there in the box.

Brøderbund didn’t actually develop much software in-house, preferring to publish the work of outside developers on a contract basis. While they did have a small staff of programmers and even artists, they were there mainly to assist outside developers by helping with difficult technical problems, porting code to other machines, and polishing in-game art rather than working up projects from scratch. But this idea just seemed to have too much potential to ignore or outsource. Gary was therefore soon installed in Brøderbund’s “rubber room” — so-called because it was the place where people went to bounce ideas off one another — along with Lauren Elliott, the company’s only salaried game designer; Gene Portwood, Elliott’s best friend, manager of Brøderbund’s programming team, and a pretty good artist; Ed Bernstein, head of Brøderbund’s art department; and programmer Dane Bigham, who would be expected to write not so much a game as a cross-platform database-driven engine that could power many ports and sequels beyond the Apple II original.

Gary’s first idea was to name the game Six Crowns of Henry VIII, and to make it a scavenger hunt for the eponymous crowns through Britain. However, the team soon turned that into something wider-scoped and more appealing to the emerging American edutainment market. You would be chasing an international criminal ring through cities located all over the world, trying to recover a series of stolen cultural artifacts, like a jade goddess from Singapore, an Inca mask from Peru, or a gargoyle from Notre Dame Cathedral (wonder how the thieves managed that one). It’s not entirely clear who came up with the idea for making the leader of the ring, whose capture would become the game’s ultimate goal, a woman named Carmen Sandiego, but Elliott believes the credit most likely belongs to Portwood. Regardless, everyone immediately liked the idea. “There were enough male bad guys,” said Elliott later, and “girls [could] be just as bad.” (Later, when the character became famous, Brøderbund would take some heat from Hispanic groups who claimed that the game associated a Hispanic surname with criminality. Gary replied with a tongue-in-cheek letter explaining that “Sandiego” was actually Carmen’s married name, that her maiden name was “Sondberg” and she was actually Swedish.) When development started in earnest, the Carmen team was pared down to a core trio of Elliott, who broadly speaking put together the game’s database of clues and cities; Portwood, who drew the graphics; and Bigham, who wrote the code. But, as Elliott later said, “A lot of what we did just happened. We didn’t think much about it.”

Where in the World is Carmen Sandiego?

To play that first Carmen Sandiego game today can be a bit of an underwhelming experience; there’s just not that much to it. Each of a series of crimes, along with the clues that lead you to the perpetrator, is randomly generated from the game’s database of 10 possible suspects, 30 cities, and 1000 or so clues. Starting in the home city of the stolen treasure in question, you have about five days to track down each suspect. Assuming you’re on the right track, you’ll get clues in each city as to the suspect’s next destination among the several possibilities represented by the airline connections from that city: perhaps he “wanted to know the price of tweed” or “wanted to sail on the Severn.” (Both of these clues would point you to Britain, more specifically to London.) If you make the right deductions each step of the way you’ll apprehend the suspect in plenty of time. You’ll know you’ve made the wrong choice if you wind up at a dead-end city with no further clues on offer. Your only choice then is to backtrack, wasting precious time in the process. The tenth and final suspect to track down is always Carmen Sandiego herself, who for all of her subsequent fame is barely characterized at all in this first installment. Capture her, and you retire to the “Detective Hall of Fame.” There’s a little bit more to it, like the way that you must also compile details of the suspect’s appearance as you travel so you can eventually fill out an arrest warrant, but not a whole lot. Any modern player with Wikipedia open in an adjacent window can easily finish all ten cases and win the game in a matter of a few hours at most. By the time you do, the game’s sharply limited arsenal of clues, cities, and stolen treasures is already starting to feel repetitive.
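For the technically curious, the engine Dane Bigham built is at heart a template filler: pick a suspect, pick a trail of cities, then serve up clues from the database pointing toward each next stop. Here is a tiny sketch in Python of how such a case generator might work — my own illustration only, with a toy database and invented function names rather than anything from Brøderbund’s actual code (the two London clue strings are the ones quoted above):

```python
# A toy version of a database-driven Carmen Sandiego case generator.
# Everything here is hypothetical except the two quoted London clues.
import random

CITIES = {
    # city: clues that point a detective toward this city
    "London": ["wanted to know the price of tweed",
               "wanted to sail on the Severn"],
    "Lima":   ["asked about ruins high in the Andes"],
    "Tokyo":  ["changed a suitcase of money into yen"],
}
SUSPECTS = ["Carmen Sandiego", "a henchman in a loud trenchcoat"]

def generate_case(trail_length=4):
    """Build one case: a random suspect plus a trail of distinct cities."""
    suspect = random.choice(SUSPECTS)
    trail = random.sample(list(CITIES), k=min(trail_length, len(CITIES)))
    return suspect, trail

def clue_for(next_city):
    """Draw one clue from the database pointing at the next stop."""
    return random.choice(CITIES[next_city])

suspect, trail = generate_case()
for here, next_stop in zip(trail, trail[1:]):
    print(f"In {here}: a witness says the suspect {clue_for(next_stop)!r}.")
print(f"Corner {suspect} in {trail[-1]} before the warrant expires!")
```

Swap in 30 cities, 10 suspects, and 1000 clues and you get a sense of how a single data-driven engine of this general shape could power the many ports and sequels mentioned earlier.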

Which is not to say that Carmen Sandiego is entirely bereft of modern appeal. When my wife and I played it over the course of a few evenings recently, we learned a few interesting things we hadn’t known before and even discovered a new country that I at least had never realized existed: the microstate of San Marino, beloved by stamp and coin collectors and both the oldest and the smallest constitutional republic in the world. My wife is now determined that we should make a holiday there.

Still, properly appreciating Carmen Sandiego’s contemporary appeal requires of us a little more work. The logical place to start is with that huge World Almanac and Book of Facts that made the game’s box the heaviest on the shelves. It can be a bit hard even for those of us old enough to have grown up before the World Wide Web to recover the mindset of an era before we had the world in our living rooms — or, better said in this age of mobile computing, in our pockets. Back in those days when you had to go to a library to do research, when your choices of recreation of an evening were between whatever shows the dozen or so television stations were showing and whatever books you had in the house, an almanac was magic to any kid with a healthy curiosity about the world and a little imagination, what with its thousand or more pages filled with exotic lands along with records of deeds, buildings, cities, people, animals, and geography whose very lack of context only made them more alluring. The whole world — and then some; there were star charts and the like for budding astronomers — seemed to have been stuffed within its covers.

In that spirit, one could almost call the Carmen Sandiego game disk ancillary to the almanac rather than the other way around. Who knew what delights you might stumble over while you tried to figure out, say, in which country the python made its home? The World Almanac continues to come out every year, and seems to have done surprisingly well, all things considered, surviving the forces that have killed off its typical companions on the reference shelf, like the encyclopedia. But of course it’s lost much of its old magic in these days of information glut. While we can still recapture a little of the old feeling by playing Carmen Sandiego with a web browser open, our search engines have just gotten too good; it’s harder to stumble across the same sorts of crazy facts and alluring diversions.

Carmen Sandiego captured so many kids because it tempted them to discover knowledge for themselves rather than attempting to drill it into them, and all whilst never talking down to them. Gary Carlston said of Brøderbund’s edutainment philosophy, “If we would’ve enjoyed it at age 12, and if we still enjoy it now, then it’s what we want. Whether it’s pedagogically correct is not relevant.” Carmen Sandiego did indeed attract criticism from earnest educational theorists armed with studies showing how it failed to live up to the latest research on learning; this low-level drumbeat of criticism continues to this day. Some of it may very well be correct and relevant; I’m hardly qualified to judge. What I do see, though, is that Carmen Sandiego offers a remarkably progressive view of knowledge and education for its time. At a time when schools were still teaching many subjects through rote memorization of facts and dates, when math courses were largely “take this set of numbers and manipulate them to become this other set of numbers” without ever explaining why, Carmen Sandiego grasped that success in the coming world of cheap and ubiquitous data would require not a head stuffed with facts but the ability to extract what’s relevant from the flood of information that surrounds us, to synthesize it into conclusions, and to apply it to a problem at hand. While drill-and-practice software taught kids to perform specific tasks, Carmen Sandiego, like all the best edutainment software, taught them how to think. Just as importantly, it taught them how much fun doing so could be.

Where in the World is Carmen Sandiego?

Brøderbund may not have been all that concerned about making Carmen Sandiego “pedagogically correct,” but they were hardly blind to the game’s educational value, nor to the marketing potential therein. The back cover alone of Carmen Sandiego is a classic example of edutainment marketing, emphasizing the adventure aspects for the kids while also giving parents a picture of children beaming over an almanac and telling how they will be “introduced to world geography” — and all whilst carefully avoiding the E-word; telling any kid that something is “educational” was and is all but guaranteed to turn her off it completely.

For all that, though, the game proved to be a slow burner rather than an out-of-the-gates hit upon its release in late 1985. It was hardly a flop; sales were strong enough that Brøderbund released the first of many sequels, Where in the USA is Carmen Sandiego?, the following year. Yet year by year the game just got more popular, especially when Brøderbund started to reach out more seriously to educators, releasing special editions for schools and sending lots of free swag to those who agreed to host “Carmen Days,” for which students and teachers dressed up as Carmen or her henchmen or the detectives on their trail, and could call in to the “Acme Detective Agency” at Brøderbund itself to talk with Portwood or Elliott playing the role of “the Chief.” The combination of official school approval, the game’s natural appeal to both parents and children, and lots of savvy marketing proved to be a potent symbiosis indeed. Total sales of Carmen Sandiego games passed 1 million in 1989 and 2 million in 1991, by which time the series included not only Where in the World is Carmen Sandiego? and Where in the USA is Carmen Sandiego? but also Where in Europe is Carmen Sandiego?, Where in Time is Carmen Sandiego?, Where in America’s Past is Carmen Sandiego?, and the strangely specific Where in North Dakota is Carmen Sandiego?, prototype for a proposed series of state-level games that never got any further; Where in Space is Carmen Sandiego? would soon go in the opposite direction, rounding out the original series of reference-work-based titles on a cosmic scale. In 1991 Carmen also became a full-fledged media star, the first to be spawned by a computer game, when Where in the World is Carmen Sandiego? debuted as a children’s game show on PBS.

A Print Shop banner: an artifact as redolent of its era as Hula Hoops or bellbottoms are of theirs.

Through the early 1980s, Brøderbund had been a successful software publisher, but not outrageously so in comparison to their peers. At mid-decade, though, the company’s fortunes suddenly began to soar just as many of those peers were, shall we say, trending in the opposite direction. Brøderbund’s success was largely down to two breakout products which each succeeded in identifying a real, compelling use for home computers at a time when that was proving far more difficult than the boosters and venture capitalists had predicted. One was of course the Carmen Sandiego line. The other was a little something called The Print Shop, which let users design and print out signs and banners using a variety of fonts and clip art. How such a simple, straightforward application could become so beloved may seem hard to understand today, but beloved The Print Shop most definitely became. For the rest of the decade and beyond, its distinctive banners, enabled by the fan-fold paper used by the dot-matrix printers of the day, could be seen everywhere that people without a budget for professional signage gathered: at church socials, at amateur sporting events, inside school hallways and classrooms. Like the first desktop-publishing programs that were appearing on the Macintosh contemporaneously, The Print Shop was one more way in which computers were beginning to democratize creative production, a process, as disruptive and fraught as it is inspiring, that’s still ongoing today.

In having struck two such chords with the public in the form of The Print Shop and Carmen Sandiego, Brøderbund was far ahead of virtually all of their competitors, most of whom failed to find even one. Brøderbund lived something of a charmed existence for years, defying most of the hard-won conventional wisdom about consumer software being a niche product at best and the real money being in business software. If the Carlstons hadn’t been so gosh-darn nice, one might be tempted to begrudge them their success. (When the Carlstons briefly considered a merger with Electronic Arts, whose internal culture was much more ruthless and competitive, a writer said it would be a case of the Walton family moving in with the Manson family.) One could almost say that for Brøderbund alone the promises of the home-computer revolution really did materialize, with consumers rushing to buy from them not just games but practical software as well. Tellingly — and assuming we agree to label Carmen Sandiego as an educational product rather than a game — Brøderbund’s top-selling title was never a game during any given year between 1985 and the arrival of the company’s juggernaut of an adventure game Myst in 1993, despite their publication of hits like the Jordan Mechner games Karateka and Prince of Persia. Carmen Sandiego averaged 25 to 30 percent of Brøderbund’s sales during those years, behind only The Print Shop. The two lines together accounted for well over half of yearly revenues that were pushing past $50 million by decade’s end — still puny by the standards of business software but very impressive indeed by that of consumer software.

For the larger software market, Carmen Sandiego — and, for that matter, The Print Shop — were signs that, if the home computer hadn’t quite taken off as expected, it also wasn’t going to disappear or be relegated strictly to the role of niche game machine; there were, or at least with a bit more technological ripening could be, good reasons to own one. The same year that Brøderbund pushed into edutainment with Carmen Sandiego, MECC, who had reconstituted themselves as the for-profit (albeit still state-owned) publisher Minnesota Educational Computing Corporation in 1984, released the definitive, graphically enhanced version of that old chestnut The Oregon Trail, a title which shared with Carmen Sandiego an easygoing, progressive, experiential approach to learning. Together Oregon and Carmen became the twin icons of 1980s edutainment, still today an inescapable shared memory for virtually everyone who darkened a grade or middle school door in the United States between about 1985 and 1995.

The consequences of Carmen and Oregon and the many other programs they pulled along in their wake were particularly pronounced for the one remaining viable member of the old trinity of 1977: the Apple II. Lots of people both outside and inside Apple had been expecting the II market to finally collapse for several years already, but so far that had refused to happen. Apple, whose official corporate attitude toward the II had for some time now been vacillating between benevolent condescension and enlightened disinterest, did grant II loyalists some huge final favors. One was the late 1986 release of the Apple IIGS, a radically updated version produced on a comparative shoestring by the company’s dwindling II engineering team with assistance from Steve Wozniak himself. The IIGS used a 16-bit Western Design Center 65C816 CPU that was capable of emulating the old 8-bit 6502 when necessary but was several times as powerful. Just as significantly, the older IIs’ antiquated graphics and sound were finally given a major overhaul that now made them amongst the best in the industry, just a tier or two below those of the current gold standard, Commodore’s new 68000-based Amiga. The IIGS turned out to be a significant if fairly brief-lived hit, outselling the Macintosh and all other II models by a considerable margin in its first year.

But arguably much more important for the Apple II’s long-term future was a series of special educational offers Apple made during 1986 and 1987. In January of the former year, they announced a rebate program wherein schools could send them old computers made by Apple or any of their competitors in return for substantial rebates on new Apple IIs. In April of that year, they announced major rebates for educators wishing to purchase Apple IIs for home use. Finally, in March of 1987, Apple created two somethings called the Apple Unified School System and the Apple Education Purchase Program, which together represented a major, institutionalized outreach and support effort designed to get even more Apple IIs into schools (and, not incidentally, more Macs into universities). The Apple II had been the school computer of choice virtually from the moment that schools started buying PCs at all, but these steps along with software like Carmen Sandiego and The Oregon Trail cemented and further extended its dominance, to an extent that many schools and families simply refused to let go. The bread-and-butter Apple II model, the IIe, remained in production until November of 1993, by which time this sturdy old machine, thoroughly obsolete already by 1985, was selling almost exclusively to educators and Apple regarded its continued presence in their product catalogs like that of the faintly embarrassing old uncle who just keeps showing up for every Thanksgiving dinner.

Even after the inevitable if long-delayed passing of the Apple II as a fixture in schools, Carmen and Oregon lived on. Both received the requisite CD-ROM upgrades, although it’s perhaps debatable in both instances how much the new multimedia flash really added to the experience. The television Carmen Sandiego game shows also continued to air in various incarnations through the end of the decade. Carmen Choose Your Own Adventure-style gamebooks, conventional young-adult novels, comic books, and a board game were also soon on offer, along with yet more computerized creations like Carmen Sandiego Word Detective. Only with the millennium did Carmen — always a bit milquetoast as a character and hardly the real source of the original games’ appeal — along with The Oregon Trail see their stars finally start to fade. Both retain a certain commercial viability today, but more as kitschy artifacts and nostalgia magnets than serious endeavors in either learning or entertainment. Educational software has finally moved on.

Perhaps not enough, though: it remains about 10 percent inspired, 10 percent acceptable in a workmanlike way, and 80 percent boredom stemming sometimes from well-meaning cluelessness and sometimes from a cynical desire to exploit parents, teachers, and children. Those looking to enter this notoriously underachieving field today could do worse than to hearken back to the simple charms of Carmen Sandiego, created as it was without guile and without reams of pedagogical research to back it up, out of the simple conviction that geography could actually be fun. All learning can be fun. You just have to do it right.

(See Engineering Play by Mizuko Ito for a fairly thorough survey of educational and edutational software from an academic perspective. Gamers at Work by Morgan Ramsay has an interview with Doug and Gary Carlston which dwells on Carmen Sandiego at some length. Matt Waddell wrote a superb history of Carmen Sandiego for a class at Stanford University in 2001. A piece on Brøderbund on the eve of the first Carmen Sandiego game’s release was published in the September 1985 issue of MicroTimes. A summary of the state of Brøderbund circa mid-1991 appeared in the July 9, 1991, New York Times. Joseph Weizenbaum’s comments appeared in the July 1984 issue of Byte. The first use of the term “edutainment” that I could locate appeared in a Milliken Publishing advertisement in the January 1983 issue of Creative Computing. Articles involving Spinnaker and Tom Snyder appeared in the June 1984 Ahoy! and the October 1984 and December 1985 Compute!’s Gazette. And if you got through all that and would like to experience the original Apple II Carmen Sandiego for yourself, feel free to download the disk images and manual — but no almanac I’m afraid — from right here.)

 
 


Bookware

Suppose that in some alternate universe you are William Shakespeare. Strolling about London one day in the late sixteenth century, mulling over plans for your next novel, you come upon some workmen erecting a large wooden structure of peculiar shape. The design of the building strikes you as inappropriate for either a dwelling or a place of business.

A few questions gain you some information about a recent invention (this is an alternate universe, remember) called the “play.” Live people, sometimes costumed and in makeup, are getting up on a flat surface called a “stage” and acting out stories!

The clever people who have designed and built the first stages, as well as the inventors of acting, are right in there writing and directing the best plays they can come up with. (At least the best they can come up with in their spare time — each of these people necessarily has one or two active careers already going.)

In one of the earliest successful plays, dummies representing invading aliens (Frenchmen, perhaps, or Spaniards, from across the Channel) were lowered on ropes from concealed positions above the stage, while the actor (this play needed only one) ran back and forth, following shouted directions from the audience, trying to shoot all the dummies before they touched the floor. The audience liked this play a lot and cheered it enthusiastically.

In a somewhat more recent show, also very popular, the lead actor climbs about on a crazy scaffolding of planks and ladders, trying to accomplish some rather simple-minded tasks, while others costumed as fantastic creatures try to knock him off by throwing barrels. It’s good slapstick fun, and the audiences love it.

“Wait a minute,” you say to these eager people who have been proudly explaining how plays work. “Wait a minute. That all sounds amusing, yes. But I really think you’re on to something bigger. Let me go home and think about this for a while… How many people can you get onstage at once? How many lines can an actor memorize? Can you have it dark on one half of the stage and light on the other half?”

They look at each other. “We’re not really sure,” one replies at last. “Our stages are still pretty primitive. Our actors are all new at the job. Everybody is. Next year we’ll be able to do more. But what should we try to do?”

You don’t have any instant answers for them. A lot of vague ideas suddenly churning. Possibilities. …

“I hope you will go home and think about it, Will,” says one of the stage managers. “You’re good with words. Maybe we could have the man on the ladder say something more than ‘Ouch!’ and ‘Wow!'”

“Yes, something more,” you agree thoughtfully, turning away. The other stage people call good wishes after you. But you scarcely hear them. Your mind is involved with new ideas.

To work with — depend on — carpenters, actors, experts in stage machinery and lighting? Whatever story emerged would no longer be purely your own. But already you can see that the stage those others have created can capture the imagination and enthrall an audience, even with no more than a few clowns and ladders.

You head for home, for a place where you can sit down and think, and write. Your thoughts are on a story that you had planned to make into a book. The one to be called Hamlet…

The words above were written by science-fiction author and would-be gaming entrepreneur Fred Saberhagen in a feature article about the possibilities for interactive fiction (“Call Yourself Ishmael: Micros Get the Literary Itch”) in the September/October 1983 issue of Softline — the same article in fact that contained the first published discussion of Floyd’s death in Planetfall and what it portended. Saberhagen, whose own flirtations with interactivity would be considerable but commercially frustrating, was at the vanguard of an emerging conventional wisdom about the intersection of computers, games, and books. He and the fellow pundits who started to emerge during 1983 weren’t the first to begin to think along these lines. The editors of SoftSide magazine had first started writing about the literary potential of “compunovels” back in 1979, truly a leap of faith in light of the strangled prose and plots of the Scott Adams games and the other 16K adventures that were pretty much the only ones available to PC owners at the time. But it took the efforts of not only Saberhagen but also, and probably more significantly, respectable folks writing for respectable mainstream publications like The New York Times Book Review, Time, The Washington Post, The Boston Globe Magazine, and Scientific American to make of it a full-fledged meme. The boldest pundits declared that we were on the cusp of nothing less than a whole new form of literature that could be as rich, meaningful, and aesthetically brilliant as anything put down by Shakespeare or Melville. Suddenly what had once been dismissed as mere “adventure games” were worth taking seriously.

All of the articles just mentioned focused primarily or entirely on Infocom, the only company in the field whose games could realistically stand up to any scrutiny at all as literary works. Seeing this, and seeing Infocom growing less and less afraid to lay claim to the mantle of literary artists under such persistent stroking, lots of other companies started asking how they could steal some of Infocom’s cultural thunder and get a piece of what the pundits said would be the literature of the future. The players who now started entering the field were a surprisingly motley lot. There were the expected startups as well as old dogs in the software game looking to learn some new tricks. Epyx, Brøderbund, and Electronic Arts amongst others all discovered a latent passion for text, as did, of all people, console developer Imagic, whose action games for first-generation consoles like the Atari VCS had been enormously successful but who were now floundering like everything else in that industry in the wake of the Great Videogame Crash. But technologists were not the only opportunists looking to jump on the bandwagon Infocom was driving. The huge publishing houses of Simon and Schuster, Addison-Wesley, and Random House also started software arms, as did smaller publishers like science-fiction paperback specialist Baen Books. On the other end of the distributional pipeline, the two biggest American bookstore chains, Waldenbooks and B. Dalton, both set aside areas in their stores for entertainment software, as did the ubiquitous W.H. Smith in Britain. To complete the strange mixture and hedge the bets of their own software arm, Addison-Wesley signed a deal with Infocom in early 1984 to distribute their games to the bookselling trade, an important contract that got Infocom into thousands of bookstores populated with just the sort of literate customers they were trying to reach.

All of this forthright investment in the idea of interactive fiction as a serious literary force by such pillars of mainstream American business feels a bit unbelievable today. Nor is it without a certain note of tragic resonance as the great What Might Have Been for people like me who still unabashedly love the idea. We see here the culture and, at least as importantly, the culture’s business interests trying to work out just what computer games were, what they could be, and perhaps what they should be. It’s a fascinating process to watch, if also — again, for people with my sympathies — one with kind of a heartbreaking ending. Were computer games actually games in the way that, say, Monopoly was? Or were they more like books? Or movies? Or were they — and this was the most messy and complicated and also the most likely answer of all — like any or all or none of the above, depending on the particular title in question and its genre and target audience? The answer to these questions was essential to answering another, more practical one: how, where, and to whom should computer games be sold? The folks we’re concerned with today are those who decided to bet on computer games, or at least a segment of them, as the next iteration in the long history of the book. Even on the business side of the development equation their efforts mix the expected desire to shift a lot of units and make a lot of money with a genuine idealism about the artistic potential of the form in which they were working. It was, as should be more than clear by now, an odd time in media.

If computer games were or could be literature, and if they wanted to be taken seriously as such, said many people, the smartest thing to do was to get people who already knew how to write literature — or at least readable popular fiction — involved in their creation. Or, failing that, the next best thing must be to play in the rich fictional worlds they had created. Lots of people in lots of companies followed this chain of reasoning simultaneously. Thus began the era of what the British press pithily came to call “bookware,” the splicing of the new frontier of computer games with the old of books and their authors, who were sometimes (but not all that often) actively involved, sometimes uninterested but happy to be getting another royalty check, and sometimes dead and thus blissfully unaware of the whole exercise. A trend that had been presaged in 1982 by The Hobbit reached manic full flower in 1984.

The list of bookware that flooded the market in both the United States and Britain between 1984 and 1986 is long, and includes some of the biggest names in genre fiction of the era as well as some surprisingly high-brow figures. An incomplete list might include: The Robots of Dawn (Isaac Asimov); The Mist (Stephen King); Amnesia (Thomas M. Disch); Mindwheel (future American poet laureate Robert Pinsky); High Stakes and Twice Shy (Dick Francis); Wings Out of Shadow (Fred Saberhagen); The Fourth Protocol (Frederick Forsyth); The Dragonriders of Pern (Anne McCaffrey); The Stainless Steel Rat Saves the World (Harry Harrison); The Pen and the Dark (Colin Kapp); The Saga of Erik the Viking (Terry Jones); The Secret Diary of Adrian Mole Aged 13 3/4 (Sue Townsend); The Width of the World (Ian Watson); The Colour of Magic (Terry Pratchett); and Infocom’s own The Hitchhiker’s Guide to the Galaxy (Douglas Adams). Those whose tastes ran more to the classics could choose from Pride and Prejudice (Jane Austen); The Time Machine (H.G. Wells); Sherlock (Arthur Conan Doyle); Macbeth (William Shakespeare); The Snow Queen (Hans Christian Andersen); Dante’s Inferno; and Dracula (Bram Stoker). The list of games for which contracts were signed but which went unreleased due to the bookware bubble’s bursting includes Glory Road (Robert Heinlein); another Inferno (this one by Larry Niven and Jerry Pournelle); Animal Lover (Stephen R. Donaldson); 20,000 Leagues Under the Sea (Jules Verne); Special Deliverance (Clifford D. Simak); Soldier, Ask Not (Gordon R. Dickson); and The World Thinker (Jack Vance). The publishers involved with all these efforts largely abandoned the old labels of “adventure game” and “text adventure” in favor of ones that reflected their literary aspirations. Some shamelessly appropriated “interactive fiction” from Infocom (who had themselves, of course, shamelessly appropriated it from Robert Lafore). Others went with “interactive novels” or “computer novels.” Bantam Software came up with a particularly catchy label: “living literature.” Mosaic in Britain just bowed to the inevitable and stuck “bookware” on their boxes.

But it was none of the companies or games already mentioned who made the biggest bet on bookware. That was in fact a rapidly growing publisher located ten minutes from Infocom in Cambridge which had heretofore specialized in educational software: Spinnaker. Spinnaker debuted not one but two new imprints for bookware in 1984: Trillium (later Telarium, for reasons we’ll get to shortly), for book adaptations aimed at adults and teenagers; and Windham Classics, aimed at a somewhat younger set. They released no fewer than seven games between the two lines in 1984 alone (two more than Infocom’s total output for the year), followed by another five in 1985 and a final straggler in 1986, the twilight of the brief bookware era.

The story of Trillium and Windham Classics begins with a remarkable young publishing mogul named Byron Preiss. Born in 1953 in New York City, Preiss was one of those characters like Steve Jobs who could seemingly talk anybody into anything, one who as a callow youth not yet out of his teens was already able to convince older and supposedly wiser businesspeople to back his many and varied schemes. Throughout his career, Preiss mixed business with an idealism that seems to have been anything but affected. He first made a mark at seventeen, when he wrote and distributed The Block, an anti-drug comic book written for a near-illiterate reading level and aimed at grade-school children growing up in inner cities. He traveled around the country relentlessly to promote it, and eventually won the official endorsement of the Children’s Television Workshop of Sesame Street fame, as well as the lifelong friendship of one of the most important figures there, Chris Cerf. That’s just the way it was with Preiss throughout his life. He seemed genuinely unaware of the barriers between him and the people of power who could realize his dreams, and in consequence they seemed to just melt away. His gifts for gab and inspiration were legendary, as recalled by Leigh Ronald Grossman, one of a stable of young writers he came to nurture:

He had to be the most passionate person I’ve ever known, able to visualize what was special and exciting about EVERYTHING. I remember going to lunch with Byron and another publisher, in which the discussion hinged on a tired project that the whole staff was sick of dealing with after years of development. By the end of the lunch I was thrilled to be working on such a visionary project… and I’m still not quite sure how he did it.

He founded his own publishing company, Byron Preiss Visual Publications, in 1974 while enrolled in the film program at Stanford University. Through it he designed and/or published a variety of interesting, often groundbreaking work: a line of paperbacks that tried to revive the pulp adventures of the 1930s more than five years before Raiders of the Lost Ark; adult comics that didn’t involve superheroes and were forerunners of what have come to be called “graphic novels” today; Dragonworld, a beautifully illustrated epic fantasy novel which he himself coauthored; The Illustrated Harlan Ellison, whose pictures were in 3D and could be viewed through a pair of 3D glasses bound into the volume; various profusely illustrated nonfiction volumes for children and adults, on subjects ranging from the Beach Boys to dinosaurs. Throughout he assiduously cultivated relationships. By the time he turned thirty, Preiss’s Christmas-card list included people like Arthur C. Clarke, Isaac Asimov, the aforementioned Harlan Ellison, and Ray Bradbury in addition to many of the biggest names in the business of publishing and, yes, the members of the Beach Boys as well. C. David Seuss (a man whose role in this story will become clear shortly) believes that Preiss’s sheer likability was perhaps his biggest asset of all: “people just wanted to go along with his ideas because he was just so nice.”

Throughout his career Preiss pushed the boundaries of what a book could be — physically, formally, and aesthetically. It’s thus little surprise that he got involved when the idea of the interactive book began to emerge in earnest in the early 1980s. His first project was a unique series in the crowded field of Choose Your Own Adventure books and copycats that dominated the children’s sections of bookstores at the time. The Be an Interplanetary Spy series mostly replaces simple choices of the “what do you want to do next?” variety with visual and logic puzzles that have to be solved to advance the narrative. In keeping with the running theme of Preiss’s career, the books are also lavishly illustrated, to the point that they are more interactive comic book than conventional text; the illustrations are often more important than the words. He would later create two more traditional but also fondly remembered gamebook series, Time Machine and Explorer.

Preiss had been well aware of computers and computer games since the mid-1970s, when like so many other Stanford students he paid a visit to nearby Xerox PARC. By mid-1983, with home computers booming, he felt the time had come to get involved with them as yet another facet of what a book could be. Indeed, the Be an Interplanetary Spy books, which started to appear at just this time, show more than a hint of influence from games like those of Infocom in their many puzzles and their replacing the occasional choices of Choose Your Own Adventure with almost constant interaction. By packing three or four puzzles and narrative branches onto almost every page he was able to make the books feel less granular, more like a parser-based adventure game than a Choose Your Own Adventure book. Still, Preiss wanted to go further. He started shopping around an idea for a series of computer games based on works by established authors, many of whom he just happened to have personal relationships with and whose participation — or at least willingness to sign a licensing contract — he could thus all but assure. He found himself a dance partner in the aggressive young Spinnaker Software.

Founded in April of 1982 by two friends from Harvard Business School and the Boston Consulting Group, Bill Bowman and C. David Seuss, Spinnaker was, like Electronic Arts, one of the new guard of slicker, more conventionally professional software publishers that began popping up during 1982 and 1983. Neither Bowman nor Seuss had a hackerish bone in his body. For proof, one need look no further than the ties they insisted on wearing to work every day at Spinnaker, or Bowman’s habit of getting up at 5:30 every morning to attend mass, or his seven children who were often hovering around Spinnaker’s offices after school hours (a marked contrast to life at Infocom, populated mostly by childless twenty- and thirty-somethings). They were savvy businessmen who saw an opportunity to get in on the ground floor of an emerging market after the arrival of the IBM PC (Bowman believes he has one of the first hundred ever built) and the wave of purpose-built home computers that was set off by the Commodore VIC-20. The legendary venture capitalist Jacqueline Morby funded a nationwide jaunt to talk to retailers and look for obvious holes in the market, which revealed an exploding demand for educational software that could not be met by under-capitalized semi-amateurs like Edu-Ware. They jumped at it, with the aid of millions in venture capital arranged by Morby and her company TA Associates. (Morby would also take a place on Spinnaker’s board, along with those of Sierra On-Line and several others, making her one of the most powerful hidden shapers of the software industry of the 1980s.)

Spinnaker released their first products just in time for the Christmas of 1982. Some were done in-house, some by outside developers like educational pioneer Tom Snyder Productions. Quality inevitably varied, but some, like Snyder’s In Search of the Most Amazing Thing, have become children’s classics. Still, Spinnaker was emblematic of the changes that swept the software industry with the home-computer boom and the arrival of traditional big-business interests — and big-business money — looking to capitalize on it, yet another sign that the era Doug Carlston referred to as the software “Brotherhood” was well and truly ended. Their agenda plainly included driving older rivals like Edu-Ware out of the market entirely, a goal they soon accomplished. They placed all but unprecedented emphasis on licensing, branding, and advertising, a result of what Seuss calls his “First Law”: “Apply money at the point of resistance.” Spinnaker poured some 15% of their total earnings for 1983 back into huge advertising buys, much of it outside the trade press in places like Good Housekeeping, Better Homes and Gardens, and Newsweek. They soon hit upon the scheme of releasing their software under a number of sub-brands, each of which would be as often as possible a licensed take on an already well-known consumer entity. By the end of 1983 their stable already included Fisher-Price (educational software for the very young; name licensed from the huge toy company), Nova (science education; name licensed from the long-running PBS television series), and Better Living (personal productivity software), as well as the flagship Spinnaker brand. Whatever its other merits, the approach was a clever sleight of hand for a company that aimed to do nothing less than own educational software, and in time and with luck to extend that dominance to home-computing software as a whole. Bowman:

“Shelf space is all that matters in this business. If everything is under the Spinnaker brand, the consumer feels he’s not getting much of a choice, but here he can choose any of six brands.”

Feel free to put your own scare quotes around “choose” there. Spinnaker drew analogies between themselves and General Motors, who had their Buick, Chevrolet, Oldsmobile, etc. — if nothing else an indication that Bowman and Seuss weren’t afraid to dream big. When Byron Preiss came along with his idea for a new line of bookware, it was both an opportunity to add yet more brands to the stable and to begin to make inroads into entertainment software (something that had been on their long-term agenda from the start), just as the Better Living brand represented their first explorations of the productivity market. Their internal industry studies showed adventure games to be a very good place to start. Adventure games currently sold in less than one-third the numbers of action games, but the segment was growing rapidly even as action-game sales declined.

Shortly after Preiss came on the scene, one of Spinnaker’s best outside developers, Dale Disharoon, came to them with a bookware idea of his own: to make edutainment based on classic and contemporary children’s books. Disharoon’s educational titles sold in huge numbers for Spinnaker, so when he talked they tended to listen. Thus was born the Windham Classics line, brand number six for Spinnaker, as a companion to Trillium.

Preiss was most closely involved with the latter, and that’s also where we’ll be spending most of our time. As should be clear by now, Trillium owed its existence to a mixture of artistic idealism and commercial pragmatism — not that such strange bedfellows aren’t pretty much business as usual in any media industry. For Preiss, Trillium represented nothing less than the future — or a possible future — of fiction. For Spinnaker, it represented a bit of the same but also yet another branding opportunity, the chance to stamp the names of popular authors onto boxes and get in on the hot new trend in entertainment software. They placed a young marketer named Seth Godin in charge of Trillium’s image. (Godin has since gone on to a career as a prominent marketing guru.) Now it was time to start rounding up authors.

Here Preiss proved the value of all those connections to the world of written science fiction. He pulled out his thick phone list and started calling some of the name authors with whom he’d carefully built relationships over the last decade. He soon had Arthur C. Clarke signed for an adaptation of Rendezvous with Rama and Ray Bradbury for Fahrenheit 451. He also reached a tentative agreement with Robert Heinlein to adapt his juvenile classic Starman Jones, and initiated talks with Roger Zelazny, Philip José Farmer, Harry Harrison, and Alfred Bester. As with most of the products of the bookware era, the question of how involved these authors were actually willing to be was always a delicate one. Mostly they wound up politely sitting through presentations of the latest versions, their creative role amounting to a never- or almost never-exercised veto power and a signature on an appropriately glowing endorsement for the back of the box. Only Ray Bradbury was willing to get somewhat more involved, actively drawing up plot ideas for the Fahrenheit 451 game and, by some reports, even contributing some of its prose.

Thus when Spinnaker approached Michael Crichton about adapting one of his books, they must have been thrilled to learn that Crichton had already been working on an original game with some associates for eighteen months, and was in fact looking for a publisher. More problematic was a game called Shadowkeep, a text-adventure/CRPG hybrid offered to Spinnaker by a small developer called Ultrasoft. A book or an author was nowhere in sight, but it was just too impressive a game to pass up, so Spinnaker simply flipped the process on its head by hiring Alan Dean Foster, the reigning king of media tie-in novels (he had made his name in the industry by ghostwriting the novelization of the original Star Wars for George Lucas), to write a book based on the game. He would thus have his name displayed in huge letters on the box of a game he had had absolutely nothing to do with. For the Windham Classics line, Spinnaker took advantage of some blessedly out-of-copyright children’s classics like The Swiss Family Robinson, Treasure Island, Alice in Wonderland, and The Wizard of Oz, but did sign a license for the recent Green Sky Trilogy, whose author, Zilpha Keatley Snyder, involved herself heavily in the design of the resulting game. That game, Below the Root, is not a text adventure at all but is a lovely, lyrical classic in its own right.

Indeed, the games of Trillium and Windham Classics take a surprising variety of approaches, mixing illustrated text adventure with action and CRPG elements, sometimes in the same game. But the fundamental goal, particularly for the Trillium games, was to outdo Infocom at their own game. Largely at Seth Godin’s behest, Spinnaker put the Trillium games inside mouth-wateringly beautiful gatefold boxes that on aesthetic grounds might just outdo — dare I say it? — Infocom. They also started a newsletter in an effort to build a community of loyal customers in the same way Infocom had, although they were a bit stumped as to what to put in it beyond plugs and order forms for the games; Spinnaker’s offices weren’t filled with the same sort of inspired madness that made The New Zork Times such a constant delight. Infocom couldn’t help but feel the target Spinnaker had plainly painted on their back, couldn’t help but feel a bit unnerved by this big company — already much larger than Infocom — that seemed determined, with its big-name authors and its illustrations and other added gimmicks, to wrest away their comfortable niche in the industry. Spinnaker even negotiated a deal to get the Trillium games into every Waldenbooks in the country to sit side by side with the Infocom games that had also just arrived there thanks to Infocom’s deal with Addison Wesley. Various Infocom folks have quietly acknowledged that Spinnaker was the only competitor who ever really made them nervous, even in 1984, the year Infocom sold more games than in any other.

Infocom thus must have been quite pleased when the Trillium line hit a major snafu within weeks of launching in the fall of that year. It seemed there already existed a tiny publisher of educational books using the name of “Trillium.” Now the original Trillium’s lawyers came calling. Seuss feels they could have fought the suit and likely prevailed (the original Trillium had never actually registered their name as a trademark), but it would have been time-consuming, disruptive, and expensive. Spinnaker chose instead to start the embarrassing process of re-branding the entire line with a new name. They chose “Telarium,” a name that, for what it’s worth, I like a lot better anyway. (To avoid confusion as we crisscross the Trillium/Telarium split in this and future articles, I’m just going to refer to the brand as “Telarium” from now on.)

The whole debacle was anathema to a company so focused on branding. Almost as frustrating, negotiations with the Heinlein people over Starman Jones, previously considered such a done deal that a huge amount of work had already gone into the game, reached an impasse from which they would never emerge, even as Telarium spent months and months optimistically announcing the nearly completed Starman Jones as “coming soon.”

The arrival of Telarium was greeted with hostility and even a certain amount of contempt, not only at Infocom but amongst their other peers as well. Jon Freeman, never one to pull his punches, delivered a withering takedown via his column in Computer Gaming World:

In the first place, the economics of the situation almost guaranteed that the programmers involved, despite the hype, would not be first-class. Big Name Authors and Best-Selling Books don’t come cheap. Although the draw of author and book is clearly a marketing advantage — and should therefore be paid out of the marketing/advertising budget — the cost is normally borne by R&D. A big chunk of the royalties that would otherwise go to the developers of the game is paid, instead, to the author of the book. What top-flight game designer or programmer would take that kind of a pay cut? Regardless of their deficiencies as game designers, most good programmers are still at work on their own ideas. Those who can’t come up with original subjects for games are busy making a lucrative living converting popular games to new computers. Therefore, the majority of programmers available for these book projects are either inexperienced or inadequate or both.

The worst part is that the SF people involved don’t know how little they know about the subject. Few of the authors involved in all these projects play games: most lack the time; many lack the inclination. Technophobes like Ray Bradbury, who admits that he cannot use the computer he owns, believe the apex of computer usage is to enter the text of a book and read it on the CRT. Would he know a good computer game if he fell over one?

Yes, Freeman’s logic is questionable at best. Still, many of the people who had been in the industry for years saw Telarium as calculated and soulless, and not without cause. There was an air of contrivance, even perhaps a note of disingenuousness, about the whole enterprise. Seth Godin claimed, “We wanted to go to the people who could write [the games] the best. And that’s not programmers — it’s authors.” Which would be fine if only the authors in question were actually deeply involved; witness the particular absurdity of Shadowkeep as an “Alan Dean Foster” game. Meanwhile Telarium’s addition of graphics and music and even action sequences to the Infocom template smacked of bullet points on some marketer’s list of demands for more, more, more than the competition. When the line ultimately proved commercially disappointing, few felt much sympathy. In fact, many cheered the failure as an object lesson that you can’t buy your way to success with big licenses and an overstuffed marketing budget. (A lesson subsequent gaming history has not, alas, always borne out.)

Yet there was also, as I’ve already noted, that idealistic side to Telarium that hasn’t been discussed enough. A genuine aesthetic vision drove Telarium — a vision for games as coherent lived fictional experiences. Byron Preiss:

“We’re trying to make a game that is based on plot and characterization, not puzzles — the way a book is. If you read Fahrenheit 451, you don’t get stuck on page 50. And if you play the game, you don’t get stuck on frame 50, because the whole idea is that you’re interested in the game because of the characters and the plot and what’s happening. You care about what’s going on.”

Or, as he put it on another occasion: “When you’re reading a good book, say a Ludlum thriller, you’re really sweating because you believe you’re part of the story. Adventure games weren’t doing that because the puzzles kept bringing you back to reality.” People in and around Telarium expressed this determination over and over: to get beyond arbitrary, often frustrating set-piece puzzle solving to something worthy of Infocom’s chosen label of interactive fiction. To what extent they succeeded is debatable; certainly the games have plenty of rough edges. Still, they’re also more than worthy of the sort of careful second look that too few of the non-Infocom works of the bookware era have heretofore received. We’ll begin the process of remedying that next time.

(Many thanks to C. David Seuss for answering questions and sharing his memories with me. A Harvard Business School case study on Spinnaker was also invaluable. Particularly good contemporary articles on Telarium and the bookware phenomenon are in: the September/October 1983 Softline; the December 1984 Compute!’s Gazette; the June/July 1985 Commodore Power Play; the February 1985 Compute!; the April 1985 Electronic Games; the June 18, 1984 InfoWorld; the July 30, 1984 InfoWorld; the August 13, 1984 InfoWorld; the August 1984 Computer Gaming World; the February 1985 Micro-Adventurer; and the May 1985 Commodore Microcomputers. All were used for this article and the ones on individual Telarium games that will follow, as were Jason Scott’s Get Lamp archives which he kindly shared with me. The great illustration that begins this article was taken from the April 1985 Electronic Games.)

 

Posted on September 10, 2013 in Digital Antiquaria, Interactive Fiction

 


Summer Camp is Over

It’s difficult to exaggerate just what a phenomenon Atari and their VCS console were in the United States of the very early 1980s. The raw figures are astounding; nothing else I’ve written about on this blog holds a candle to Atari’s mainstream cultural impact. By the beginning of 1982 the rest of the business of their parent company, the longstanding media conglomerate Warner Communications, looked almost trivial in comparison. Atari reaped six times as much profit as Warner’s entire music division; five times as much as the film division. By the middle of the year 17 percent of American households owned a videogame console, up from 9 percent at the same time of the previous year. Atari all but owned this exploding market, to the tune of an 80 percent share. The company’s very name had become synonymous with videogames, as Kleenex is with tissue. People didn’t ask whether you played videogames; they asked whether you played Atari. As the company ramped up for the big Christmas season with their home version of the huge arcade hit Pac-Man as well as a licensed adaptation of the blockbuster movie of the year, E.T., they confidently predicted sales increases of 50 percent over the previous Christmas. But then, on December 7, they shocked the business world by revising those estimates radically downward, to perhaps a 10 or 15 percent increase. Granted, plenty of businesses would still love to have growth like that, but the fact remained that Atari for the first time had underperformed. Change was in the air, and everyone could sense it.

Those who had been watching closely and thoughtfully could already feel the winds of change the previous summer, when Atari’s infamously substandard version of Pac-Man sold in massive numbers, but not in quite such massive numbers as the company and their boosters had predicted; when sales of the Mattel Intellivision and the brand new ColecoVision soared, presumably at the expense of Atari’s aged VCS; when Commodore continued to aggressively market their low-cost home computers as a better alternative to a games console, and continued to be rewarded with huge sales. The big question became what form the post-VCS future of gaming would take, assuming it didn’t just fade away like the Hula Hoop fad to which videogames were so often compared. There were two broad schools of thought, each of which would prove to be right and wrong in its own way. Some thought that the so-called “second generation” consoles, like the ColecoVision, would pick up Atari’s slack and the console videogame industry would continue as strong as ever. Others, however, looked to the PC industry, which VisiCalc and the IBM PC had legitimized even as Commodore was proving that people would buy computers for the home in huge numbers if the price was right. The VIC-20 may have been only modestly more capable than the Atari VCS, but as a proof of concept of sorts it certainly got people’s attention. With prices dropping and new, much more capable machines on the horizon, many analysts cast their lot with the home computer as the real fruition of the craze that the Atari VCS had started. Full-fledged computers could offer so much better, richer experiences than the consoles thanks to their larger memories, their ability to display text, their keyboards, their disk-based storage. The newest computers had much better graphics and sound than their console counterparts to boot. And of course you could do more than play games with a computer, like write letters or help Junior learn BASIC as a leg-up for a computer world soon to come.

An increasing appreciation of the potential of home computers and computer games by the likes of Wall Street meant big changes for the pioneers I’ve been writing about on this blog. Although most of the signs of these changes would not be readily visible to consumers until the following year, 1982 was the year that Big Capital started flowing into the computer-game (as opposed to the console-centric videogame) industry. Slick new companies like Electronic Arts were founded, and old-media corporations started establishing software divisions. The old guard of pioneers would have to adapt to the new professionalism or die, a test many — like The Software Exchange, Adventure International, California Pacific, Muse, and Edu-Ware, among dozens of others — would fail. The minority that survived — like On-Line Systems (about to be rechristened Sierra On-Line), Brøderbund, Automated Simulations (about to be rechristened Epyx), Penguin, and Infocom — did so by giving their scruffy hacker bona fides a shave and a haircut, hiring accountants and MBAs and PR firms (thus the name changes), and generally starting to behave like real companies. Thanks to John Williams, who once again was generous enough to share his memories with me, I can write about how this process worked within On-Line Systems in some detail. The story of their transformative 1982, the year that summer camp ended, begins with a venture-capital firm.

TA Associates was founded in 1968 as one of the first of the new breed of VC firms. From the beginning, they were also one of the most savvy, often seeing huge returns on their investments while building a reputation for investing in emerging technologies like recombinant DNA and gene splicing at just the right moment. They were one of the first VC firms to develop an interest in the young PC industry, thanks largely to Jacqueline Morby, a hungry associate who came to TA (and to a career in business) only at age 40 in 1978, after raising her children. While many of her peers rushed to invest in hardware manufacturers like industry darling Apple, Morby stepped back and decided that software was where the action was really going to be. It’s perhaps difficult today to fully appreciate what a brave decision that was. Software was still an uncertain, vaguely (at best) understood idea among businesspeople at the time, as opposed to the concrete world of hardware. “Because it was something you couldn’t see, you couldn’t touch, you couldn’t hold,” she later said to InfoWorld, “it was a frightening thing to many investors.” For her first big software investment, in 1980, Morby backed what would ultimately prove to be the wrong horse: she invested in Digital Research, makers of CP/M, rather than Microsoft. Her record after that, however, would be much better, as she and TA maintained a reputation throughout the 1980s as one of the major players, if not the major player, in software VC. She described her approach in a recent interview:

If you talk to enough entrepreneurs, you quickly figure out which of their ventures are the most promising. First, I would consider the year they were formed. If a company was three years old and employed 100 people, that meant something was going right. Then, after researching what their products did, I’d call them — cold. In those days, nobody called on the presidents of companies to say, “Hi, I’m an investor and I’m interested in you. Might I come out to visit and introduce myself?” But most of the companies said, “Come on out. There’s no harm in talking.” My calling companies led to many, many investments throughout the years.

When you look at the potential of a company, the most important questions to consider are, “How big is its market and how fast is it growing?” If the market is only $100 million, it’s not worth investing. The company can’t get very big. Many engineers never ask these questions. They just like the toys that they’re inventing. So you find lots of companies that are going to grow to $5 million or so in sales, but never more, because the market for their products is not big enough.

By 1982, Morby, now a partner with TA thanks to her earlier software success, had become interested in investing in an entertainment-software company. If computer games were indeed to succeed console games once people grew tired of the limitations of the Atari VCS and its peers, the potential market was going to be absolutely huge. After kicking the tires around the industry, including at Brøderbund, she settled on On-Line Systems as just the company for her — unique enough to stand out with its scenic location and California attitude but eager to embrace the latest technologies, crank out hits, and generally take things to the next level.

When someone offers you millions of dollars virtually out of the blue, you’re likely to think that this is all too good to be true. And indeed, venture capital is always a two-edged sword, as many entrepreneurs have learned to their chagrin. TA’s money would come only with a host of strings attached: TA themselves would receive a 24 percent stake in On-Line Systems; Morby and some of her colleagues would sit on the board and have a significant say in the company’s strategic direction. Most of all, everyone would have to clean up their act and start acting like professionals, starting with the man at the top. Steven Levy described Ken Williams in his natural habitat in Hackers:

Ken’s new office was just about buried in junk. One new employee later reported that on first seeing the room, he assumed that someone had neglected to take out a huge, grungy pile of trash. Then he saw Ken at work, and understood. The twenty-eight-year-old executive, wearing his usual faded blue Apple Computer T-shirt and weather-beaten jeans with a hole in the knee, would sit behind the desk and carry on a conversation with employees or people on the phone while going through papers. The T-shirt would ride over Ken’s protruding belly, which was experiencing growth almost as dramatic as his company’s sales figures. Proceeding at lightning pace, he would glance at important contracts and casually throw them in the pile. Authors and suppliers would be on the phone constantly, wondering what had happened to their contracts. Major projects were in motion at On-Line for which contracts hadn’t been signed at all. No one seemed to know which programmer was doing what; in one case two programmers in different parts of the country were working on identical game conversions. Master disks, some without backups, some of them top secret IBM disks, were piled on the floor of Ken’s house, where one of his kids might pick it up or his dog piss on it. No, Ken Williams was not a detail person.

If Ken was not detail-oriented, he did possess a more valuable and unusual trait: the ability to see his own weaknesses. He therefore acceded wholeheartedly to TA’s demands that he hire a squad of polished young managers with suits, resumes, and business degrees. He even let TA field most of the candidates. He hired as president Dick Sunderland, a fellow he had worked for before the birth of On-Line, a boss the hackers under him had loathed as too pedantic, too predictable, too controlling, too boring. To Ken (and TA) this sounded like just the sober medicine On-Line would need to compete in the changing industry.

Which is not to say that all of this new professionalism didn’t also come with its attendant dangers. John Williams states frankly today that “some of those new managers came in with the idea that they would run the business after they pushed Ken to the side or out.” (It wasn’t clear to the Williamses whether they came up with that idea on their own or TA subtly conveyed it to them during the hiring process.) Ken also clashed constantly with his own hire Sunderland; the latter would be gone again within a year. Ken was walking a difficult line, trying to instill the structure his company needed to grow and compete and be generally taken seriously by the business community without entirely losing his original vision of a bunch of software artisans creating together in the woods. As org charts started getting stapled to walls, file cabinets started turning up locked, and executive secretaries started appearing as gatekeepers outside the Williamses’ offices, many of the old guard saw that vision as already dying. Some of them left. Needless to say, Ken no longer looked for their replacements in the local liquor store.

Ken proved amazingly adept at taking the good advice his new managers had to offer while remaining firmly in charge. After a while, most accepted that he wasn’t going anywhere and rewarded him with a grudging respect. Much of their advice involved the face that On-Line presented to the outer world. For a long time now everyone had agreed that the name “On-Line Systems,” chosen by Ken back when he had envisioned a systems-software company selling a version of FORTRAN for microcomputers, was pretty awful — “generic as could be and dull as dishwater” in John Williams’s words. They decided on the new name of “Sierra On-Line.” The former part conveyed the unique (and carefully cultivated) aura of backwoods artisans that still clung to the company even in these more businesslike days, while the latter served as a bridge to the past as well as providing an appropriate high-tech flourish (in those times “On-Line” still sounded high-tech). They had a snazzy logo featuring a scenic mountain backdrop drawn up, and revised and slicked up their packaging. The old Hi-Res Adventure line was now SierraVenture; the action games, SierraVision.

Sierra hired Barbara Hendra, a prominent New York PR person, to work further on their image. Surprisingly, the erstwhile retiring housewife Roberta was a big force behind this move; her success as a game designer had revealed an unexpected competitive streak and a flair for business of her own. Hendra nagged Roberta and especially Ken — he of the faded, paunch-revealing tee-shirt and the holey jeans — about their dress and mannerisms, teaching them how to interact with the movers and shakers in business and media. She arranged a string of phone interviews and in-person visits from inside and outside the trade press, including a major segment on the prime-time news program NBC Magazine. Ken was good with these junkets, but Roberta — pretty, chic, and charming — was the real star, Sierra’s PR ace in the hole, the antithesis of the nerds so many people still associated with computer games. When someone like Roberta said that computer games were going to be the mass-market entertainment of the future, it somehow sounded more believable than it did coming from a guy like Ken.

In the midst of all this, another windfall all but fell into Sierra’s lap. Christopher Cerf, a longtime associate of The Children’s Television Workshop of Sesame Street fame, approached them with some vague ideas about expanding CTW into software. From there discussions moved in the direction of a new movie being developed by another CTW stalwart: Jim Henson, creator of the Muppets. For Ken, who had been frantically reading up on entertainment and media in order to keep up with the changes happening around his company, the idea of working with Henson was nothing short of flabbergasting, and not just because the Muppets were near the apogee of their popularity on the heels of two hit movies, a long-running television series, and a classic Christmas special with John Denver. John Williams:

Ken developed a kind of worship for two men as he began to study up on entertainment. One was Walt Disney and the second was Jim Henson. Both were men who were enablers — not known as much for their own artistry so much as their ability to bring artists and business together to make really big things happen — and that was what Ken strived for. Walt was already gone of course, but Henson was still alive.

Ken Williams (right) hobnobbing with Jim Henson

The almost-completed movie was called The Dark Crystal. In the works on and off for five years, it marked a major departure for Henson and his associates. Although populated with the expected cast of puppets and costumed figures (and not a single recognizable human), there were no Muppets to be found in it. It was rather a serious — even dark — fantasy tale set in a richly organic landscape of the fantastic conceived by Henson’s creative partner on the project, designer and illustrator Brian Froud. In an early example of convergence culture, Henson and friends were eager to expand the world of the movie beyond the screen. They already planned a glossy hardcover book, a board game and a card game, and a traveling art exhibit. Now an adventure game, to be designed by Roberta, sounded like a good idea. Such a major media partnership was a first for a computer-game publisher, although Atari had been doing licensed games for the VCS for some time. Anyone looking for a sign that computer games were hitting the big time needed to look no further.

The Dark Crystal

For the Williamses, the changes that the venture capitalists had brought were nothing compared to this. Suddenly they were swept into the Hollywood/Manhattan media maelstrom, moving in circles so rarefied they’d barely realized such circles existed outside of their televisions. John Williams again:

I remember this time very well. Let me put it in a very personal perspective. I’m like 22 or 23. A guy who grew up in Wheaton, Illinois (which is just down the street from absolutely nowhere) and currently living in a town of like 5000 people 50 miles from the nearest traffic light. Now imagine this young wet-behind-the-ears punk walking through the subways and streets of Manhattan with Jim Henson, getting interviewed on WNBC talk radio while wearing his first real tailored suit. Eating at “21” with Chris Cerf, and taking limos to meet with publishing companies on Times Square. That was me – and I was just along for the ride. For Ken and Roberta, it was on a whole other level.

Much of the Williamses’ vision for computerized entertainment, of games as the next great storytelling medium to compete with Hollywood, was forged during this period. If they had ever doubted their own vision for Sierra, hobnobbing with the media elite convinced them that this stuff was going to get huge. Years before the technology would become practical, they started toying with the idea of hiring voice actors and considering how Screen Actors Guild contracts would translate to computer games.

But in the here and now there was still The Dark Crystal, in the form of both a movie and a game. Both ended up a bit underwhelming as actual works when set against what they represented to Sierra and the computer-game industry.

The movie is in some ways an extraordinary achievement, a living alien world built from Styrofoam, animatronics, and puppets. It’s at its most compelling when the camera simply lingers over the landscape and its strange inhabitants. Unfortunately, having created this world, Henson and company don’t seem quite sure what to do with it. The story is an unengaging quest narrative which pits an impossibly, blandly good “chosen one,” the Gelfling Jen, against the impossibly evil race of the Skeksis. It’s all rather painfully derivative of The Lord of the Rings: two small protagonists carry an object of great power into danger, with even a Gollum stand-in to dog their steps. Nor do the endless melodramatic voiceovers or the hammy voice acting do the film any favors. It’s a mystery to whom this film was really meant to appeal: it’s too dark and disturbing for children, too hokey and simplistic for adults, and has none of the wit and joy that marked Henson’s Muppets. There have been attempts in recent years to cast the movie, a relative commercial disappointment in its time, as a misunderstood masterpiece. I’m not buying it. The Dark Crystal is no Blade Runner.

The game is similarly difficult to recommend. As with The Hobbit, The Dark Crystal‘s quest narrative maps unusually well onto an adventure game, but Roberta showed none of the technical ambition that Veronika Megler displayed in making a game of her source material. The Dark Crystal suffers from the same technical and design flaws that mark all of the Hi-Res Adventure line: absurd puzzles, bad parser, barely-there world model; you’ve heard the litany before from me. In the midst of the problems, however, there are more nods toward story than we’re used to seeing in our old-school adventure games, even if they sometimes smack more of the necessities born of doing a movie adaptation than of a genuine striving to advance the medium. Early on we get by far the longest chunk of expository text to make it into any of the Hi-Res Adventure line.

The Dark Crystal

Unusually, the game is played in the third person, with you guiding the actions of the movie’s hero Jen and, later, both Jen and his eventual sidekick/tentative love interest, Kira. The duality of this is just odd; you never quite know who will respond to your commands. The third-person perspective extends to the graphics, which show Jen and Kira as part of each scene.

The Dark Crystal

As Carl Muckenhoupt mentions in his (highly recommended) posts about the game, it’s tempting to see the graphics as a transitional step between the first-person perspective of Roberta’s earlier Hi-Res Adventure games and the fully animated adventure games that she would make next — games that would have you guiding your onscreen avatar about an animated world in real time. It’s also very possible that working with someone else’s fleshed-out story and world inspired Roberta to push her own future original works further in the direction of real storytelling. Notably, before The Dark Crystal none of her games bothered to define their protagonists or even give them names; after it, all of them did.

Whatever influence it had on Roberta’s design approach, the fact remains that she seemed less passionate about The Dark Crystal itself than she had been about her previous games. With the licensing deal having been finalized as the movie was all but ready for release, The Dark Crystal was what John Williams euphemistically calls a “compressed timeline” game. Roberta spent only a month or so on the design while dealing with all of the distractions of her new life in the spotlight, then turned the whole thing over to Sierra’s team of in-house programmers and artists. It all feels a bit rote. John:

The simple truth is that the whole of the Dark Crystal project was, in the end, a business decision and not really driven by our developers or our creative people. I think that’s really why this is one of the least cared about and least remembered products in the Sierra stable. Look back at that game and there’s really none of Roberta’s imagination in there – and the programmers, artists, etc., involved were basically mimicking someone else’s work and creating someone else’s vision. The lack of passion shows.

The player must not so much do what seems correct for the characters in any given situation as try to recreate the events of the film. If she succeeds, she’s rewarded with… exactly what she already saw in the movie.

The Dark Crystal

Adapting a linear story to an interactive medium is much more difficult than it seems. This is certainly one of the least satisfying ways to approach it. The one nod toward the dynamism that marks The Hobbit is a pair of minions sent by the Skeksis to hunt you down: an intelligent bat and a Garthim, a giant, armored, crab-like creature with fearsome pincers. If you are spotted in the open by the bat, you have a limited amount of time to get under cover — trees, a cave, or the like — before a Garthim comes to do you in. That’s kind of impressive given the aging game engine, and it does help with the mimesis that so many of the game’s other elements actively work against. But alas, it’s just not enough.
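
Mechanically, that chase amounts to little more than a turn counter. Here’s a minimal sketch of the idea in modern Python, not anything from Sierra’s actual engine; the room names and the three-turn grace period are my own inventions for illustration:

```python
# A minimal sketch of a pursuit timer like The Dark Crystal's bat-and-Garthim
# mechanic. All names and the three-turn grace period are invented for
# illustration; nothing here is drawn from Sierra's actual Hi-Res engine.

SAFE_ROOMS = {"forest", "cave", "hollow log"}  # places that count as cover
GRACE_TURNS = 3                                # turns to find cover once spotted

class Pursuit:
    def __init__(self):
        self.turns_since_spotted = None  # None: the bat hasn't seen you yet

    def end_of_turn(self, current_room):
        """Advance the chase clock after each player command."""
        if current_room in SAFE_ROOMS:
            self.turns_since_spotted = None  # reaching cover resets the chase
            return "You are safely under cover."
        if self.turns_since_spotted is None:
            self.turns_since_spotted = 0     # caught in the open: spotted
            return "A bat wheels overhead, then flies off toward the castle!"
        self.turns_since_spotted += 1
        if self.turns_since_spotted >= GRACE_TURNS:
            return "A Garthim crashes out of the undergrowth. You have died."
        return "You hear an ominous clacking in the distance..."

# After the bat's warning, the player has three moves to reach a safe room;
# staying in the open for all of them brings the Garthim.
pursuit = Pursuit()
for room in ["plain", "plain", "plain", "plain"]:
    print(pursuit.end_of_turn(room))
```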

Even with the rushed development schedule, the game didn’t arrive in stores until more than a month after the movie’s December 17, 1982, premiere. After, in other words, the big Christmas buying season. That, along with the movie’s lukewarm critical reception and somewhat disappointing performance at the box office, likely contributed to The Dark Crystal not becoming the hit that Sierra had expected. Its sales were disappointing enough to sour Sierra on similar licensing deals for years to come. Ken developed a new motto: “I don’t play hits, I make them.”

Of course, it also would have been unwise to blame The Dark Crystal‘s underperformance entirely on timing or on its being tied to the fate of the movie. The old Hi-Res Adventure engine, which had been so amazing in the heyday of The Wizard and the Princess, was getting creaky with age, and had long since gone past the point of diminishing commercial returns; not only The Dark Crystal but also its immediate predecessor, the epic Time Zone, had failed to meet sales expectations. This seventh Hi-Res Adventure would therefore be the last. Clearly it was time to try something new if Sierra intended to keep their hand in adventure games. That something would prove to be as revolutionary a step as had been Mystery House. The Dark Crystal, meanwhile, sneaked away into history largely unloved and unremembered, one of the first of a long industry tradition of underwhelming, uninspired movie cash-ins. The fact that computer games had reached a stage where such cash-ins could exist is ultimately the most important thing about it.

If you’d like to try The Dark Crystal for yourself despite my criticisms, here are the Apple II disk images and the manual.

(As always, thanks to John Williams for his invaluable memories and insights on these days of yore. In addition to the links embedded in the text, Steven Levy’s Hackers and the old Atari history Zap! were also wonderful sources. Speaking of Atari histories: I look forward to diving into Marty Goldberg and Curt Vendel’s new one.)

 

Posted on December 12, 2012 in Digital Antiquaria, Interactive Fiction

 
