The Apple II

Steve Jobs’s unique sense of design and aesthetics has dominated every technology project he’s led following the Apple II — for better (the modern Macintosh, the iPhone, the iPod, the iPad) or worse (the Apple III) or somewhere in between (the original 1984 Macintosh, the NeXT workstations). The Apple II, though, was different. While Jobs’s stamp was all over it, so too was the stamp of another, very different personality: Steve Wozniak. The Apple II was a sort of dream machine, a product genuinely capable of being very different things to different people, for it unified Woz’s hackerish dedication to efficiency, openness, and possibility with Jobs’s gift for crafting elegant experiences for ordinary end users. The two visions it housed would soon begin to pull violently against one another, at Apple as in the computer industry as a whole, but for this moment in time, in this machine only, they found a perfect balance.

To call Jobs a mediocre engineer is probably giving him too much credit; the internals of the Apple II were all Woz. Steven Levy describes his motivation to build it:

It was the fertile atmosphere of Homebrew that guided Steve Wozniak through the incubation of the Apple II. The exchange of information, the access to esoteric technical hints, the swirling creative energy, and the chance to blow everybody’s mind with a well-hacked design or program… these were the incentives which only increased the intense desire Steve Wozniak already had: to build the kind of computer he wanted to play with. Computing was the boundary of his desires; he was not haunted by visions of riches and fame, nor was he obsessed by dreams of a world of end users exposed to computers.

When you open an Apple II, you see a lot of slots and a lot of empty space.

All those slots were key to Woz’s vision of the machine as a hacker’s ultimate plaything; each was an invitation to extend it in some interesting way. Predictably, Jobs was nonplussed by Woz’s insistence on devoting all that space to theoretical future possibilities, as this did not jibe at all with his vision of the Apple II as a seamless piece of consumer electronics to be simply plugged in and used. Surely one or two slots would be more than sufficient, he bargained. Jobs, by far the more forceful personality of the two, normally won disputes like this — but this time Woz uncharacteristically held his ground and got his slots.

Lucky that he did, too. Within months hackers, third-party companies, and Apple itself began finding ways to fill all of those slots — with sound boards, 80-column video boards, hard-disk and printer interfaces, modems, co-processor and accelerator cards, mouse interfaces, higher resolution graphics boards, video and audio digitizers, ad infinitum. The slots, combined with Woz’s dogged insistence that every technical nuance of his design be meticulously documented for the benefit of hackers everywhere, transformed the Apple II from a single, static machine into a dynamic engine of possibility. They are the one feature that, more than anything else, distinguished the Apple II from its contemporaries the PET and TRS-80, and allowed it to outlive those machines by a decade. Within months of the Apple II’s release, even Jobs would have reason to thank Woz for their existence.

All of the trinity of 1977 initially relied on cassette tapes for storage. Both the PET and TRS-80 in fact came with cassette drives as standard equipment, while the Apple II included only a cassette interface, to which the user was expected to connect her own tape player. A few months’ experience with this storage method, the very definition of balky, slow, and deeply unreliable, convinced everyone that something better was needed if these new machines were to progress beyond being techie toys and become useful for any sort of serious work. The obvious solution was the 5 1/4-inch floppy-disk technology recently devised by a small company called Shugart Associates. Woz soon set to work, coming up with a final design that engineers who understand such things still regard with reverence for its simplicity, reliability, and efficiency. The product, known as the Disk II, arrived on the market in mid-1978 for about $600, vastly increasing the usability and utility of the Apple II. Thanks to the expandability Woz had designed into the Apple II from the start, the machine was able to incorporate the new technology effortlessly. Even at $600, a very competitive price for a floppy-disk system at the time, Woz’s minimalist design aesthetic combined with the Apple II’s amenability to expansion meant that Apple made huge margins on the product; in West of Eden, Frank Rose claims that the Disk II system was ultimately as important to Apple’s early success as the Apple II itself. The PET and TRS-80 eventually got floppy-disk drives of their own, but only in a much uglier fashion; a TRS-80 owner who wished to upgrade to floppy disk, for instance, had to first buy Radio Shack’s bulky, ugly, and expensive “expansion interface,” an additional big box containing the sort of expansion capability that was built into the Apple II.

Another killer app enabled by the Apple II’s open architecture had a surprising source: Microsoft. In 1980, that company introduced its first hardware product, a Zilog Z80 CPU on a card which it dubbed the SoftCard. An Apple II so equipped had access to not only the growing library of Apple II software but also to CP/M and its hundreds of business-oriented applications. It gave Apple II owners the best of both worlds for just an additional $350. Small wonder that the card sold by the tens of thousands for the next several years, until the gradual drying up of CP/M software — a development, ironically, for which Microsoft was responsible with its new MS-DOS standard — made it irrelevant.

While most 6502-based computers were considered home and game machines of limited “serious” utility, products like the SoftCard and the various video cards that let the machine display 80 columns of text on the screen — an absolute requirement for useful word processing — lent the Apple II the reputation of a machine as useful for work as it was for play. This reputation, and the sales it undoubtedly engendered, were once again ultimately down to all those crazy slots. In this sense the Apple II much more closely resembled the commodity PC design first put together by IBM in 1981 than it did any subsequent design from Apple itself.

Another significant advantage that the Apple II had over its early competitors was its ability to display bitmap graphics. The TRS-80 and the PET, you may recall, were essentially limited to displaying text only. While it was possible to draw simple pictures using the suite of simple shape glyphs these machines provided in addition to traditional letters and punctuation (see my post on Temple of Apshai on the TRS-80), this technique was an inevitably limited one. The Apple II, however, provided a genuine grid of 280 × 192 individually addressable pixels. Making use of this capability was not easy on the programmer, and it came with a host of caveats and restrictions. Just 4 colors were available on the original Apple II, for instance, and oddities of the design meant that a given pixel could not always be set to any desired color. These circumstances led to the odd phasing and color fringing that still makes an Apple II display immediately recognizable even today. Still, the Apple II was easily the graphical class of the microcomputer field in 1977. (I’ll talk a bit more about the Apple II’s graphical system and its restrictions when I look at some specific games in future posts.)
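For the curious, here’s a rough sketch (in Python, purely for readability) of how the hi-res screen is usually described as being laid out: rows are interleaved rather than stored sequentially, and seven pixels share each byte, with the byte’s high bit selecting between color groups. The constants reflect the commonly documented layout of hi-res page 1; the helper names are mine, and this is an illustration of the scheme rather than a definitive reference.

```python
# Illustrative sketch of the commonly documented Apple II hi-res layout
# (page 1 at $2000). Rows are interleaved in three levels, and seven
# pixels share each byte -- two of the oddities described above.

HIRES_PAGE1 = 0x2000

def hires_row_address(y):
    """Base address of hi-res row y (0-191), per the usual formula."""
    return (HIRES_PAGE1
            + (y & 0x07) * 0x400         # which of the 8 interleaved groups
            + ((y >> 3) & 0x07) * 0x80   # which row within that group
            + (y >> 6) * 0x28)           # which third of the screen

def pixel_location(x, y):
    """Return (byte address, bit index) for pixel (x, y) on the 280 x 192 grid."""
    # The high bit of each byte selects between two color groups, which is
    # one reason a given pixel can't always be given a desired color.
    return hires_row_address(y) + x // 7, x % 7

print(hex(pixel_location(140, 96)[0]))   # a pixel near the middle of the screen
```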

So, Woz was all over the Apple II, in these particulars as well as many others. But where was Jobs?

He was, first of all, performing the role he always had during his earlier projects with Woz, that of taskmaster and enabler. Left to his own devices, Woz could lose himself for weeks in the most minute and ultimately inconsequential aspects of a design, or could drift entirely off task when, say, some new idea for an electronically enabled practical joke struck him. Jobs therefore took it upon himself to constantly monitor Woz and the pair of junior engineers who worked with him, keeping them focused and on schedule. He also solved practical problems for them in that inimitable Steve Jobs way. When it became apparent that it was going to be very difficult to design the RF modulator needed for hooking the computer up to a television (dedicated monitors at the time were a rare and pricy luxury) without falling afoul of federal RF interference standards, he had Woz remove this part from the computer entirely, passing the specifications instead on to a company called M&R Electronics. When sold separately and by another company, the RF modulator did not need to meet the same stringent standards. Apple II owners would simply buy their RF modulators separately for a modest cost, and everyone (most of all M&R, who were soon selling the little gadgets by the thousands) could be happy.

Such practical problem-solving aside, Jobs’s unique vision was also all over the finished product. It was Jobs who insisted that Woz’s design be housed within a sleek, injection-molded plastic case that looked slightly futuristic, but not so much as to clash with the decor of a typical home. It was Jobs who gave the machine its professional appearance, with its screws all hidden away underneath and with the colorful Apple logo (a reference to the machine’s unique graphical capabilities) front and center.

Jobs, showing a prejudice against fan noise that has continued with him to the present day, insisted that Woz and company find some other way to cool it, which feat they managed via a system of cleverly placed vents. And it was Jobs who gave the machine its unique note of friendly accessibility, via a sliding top giving easy access to the expansion slots and, a bit further down the line, unusually complete and professional documentation in the form of big, glossy, colorful manuals. Indeed, much of the Apple II ownership experience was not so far removed from the Apple ownership experience of today. Jobs worked to make Apple II owners feel themselves part of an exclusive club, a bit more rarified and refined than the run-of-the-mill PET and TRS-80 owners, by sending them freebies and updates (such as the aforementioned new manuals) from time to time. And just like the Apple of today, he was uninterested in competing too aggressively on price. If an Apple II cost a bit more — actually, a lot more, easily twice the price of a PET or TRS-80 — it was extra money well spent. Besides, what adds an aura of exclusivity to a product more effectively than a higher price? What we are left with, then, is a more expensive machine, but also an unquestionably better machine than its competitors, and — a couple of years down the road at least, once its software library started to come up to snuff — one uniquely suited to perform well in many different roles for many different people, from the hardcore hacker to the businessman to the teacher to the teenage videogamer.

When the Apple II made its debut at the first West Coast Computer Faire in April of 1977, Jobs’s promotional instincts were again in evidence. In contrast to the other displays, which were often marked with signs hand-drawn in black marker, Apple’s had a back-lit plexiglass number illuminating the company’s new logo; it still looks pretty slick even today.

In light of the confusion that still exists over who deserves the credit for selling the first fully assembled PC, perhaps we should take a moment to look at the chronology of the trinity of 1977. Commodore made the first move, showing an extremely rough prototype of what would become the PET at the Winter Consumer Electronics Show in January of 1977. It then proceeded to show progressively less rough prototypes at the Hannover Messe in March (a significant moment in the history of European computing) and the West Coast Computer Faire. However, the design was not fully finalized until July, and the first units did not trickle into stores until September. Even then, PETs remained notoriously hard to come by until well into 1978, thanks to the internal chaos and inefficiency that seemed endemic to Commodore throughout the company’s history. (Ironically, Jobs and Woz had demonstrated the Apple II technology privately to Commodore as well as Atari in 1976, offering to sell it to them for “a few hundred thousand” and positions on staff. They were turned down; Commodore, immensely underestimating the difficulty of the task, decided it could just as easily create a comparable design of its own and begin producing it in just a few months.) The TRS-80, meanwhile, was not announced until August of 1977, but appeared in Radio Shack stores in numbers within weeks of the PET to become by several orders of magnitude the biggest early seller of the trinity. And the Apple II? Woz’s machine was in a much more finished state than the PET at the West Coast Computer Faire, and began shipping to retailers almost right on schedule in June of 1977. Thus, while Commodore gets the credit for being the first to announce a pre-built PC, Apple was the first to actually follow through and ship one as a finished product. Finally, Radio Shack can have the consolation prize of having the first PC to sell in large numbers — 100,000 in just the last few months of 1977 alone, about twice the quantity of all other kit or preassembled microcomputers sold over the course of that entire year.

Actually, that leads to an interesting point: if Apple’s status as the first to ship a finished PC is secure, it’s also true that the company’s rise was not so meteoric as popular histories of that period tend to suggest. As impressive as both the Apple II and Jobs’s refined presentation of it were, Apple struggled a bit to attract attention at the West Coast Computer Faire in the face of some 175 competing product showcases, many of them much larger if not more refined than Apple’s. Byte magazine, for instance, did not see fit to so much as mention Apple in its extensive writeup of the show. Even after the machine began to ship, early sales were not breathtaking. Apple sold just 650 Apple IIs in 1977, and struggled for oxygen against Radio Shack with its huge distribution network of stores and immense production capacity. The next year was better (7600 sold), the next even better (35,000 sold, on the strength of increasingly robust software and hardware libraries). Still, the Apple II did not surpass the TRS-80 in total annual sales until 1983, on the way to its peak of 1,000,000 sold in 1984 (the year that is, ironically, immortalized as the Year of the Macintosh in the popular press).

Apple released an enhanced version of the II in 1979, the Apple II Plus. This model usually shipped with a full 48 K of RAM, a very impressive number for the time; the original Apple II had initially had only 4 K as standard equipment. Also notable was the replacement in ROM of the original Integer BASIC, written by Woz himself years before, when he first started attending Homebrew Computer Club meetings, with the so-called Applesoft BASIC. Applesoft corrected a pivotal flaw in the original Integer BASIC, its inability to deal with floating-point (i.e., decimal) numbers. This much more full-featured implementation was provided, like seemingly all microcomputer BASICs of the time, by Microsoft. (As evidenced by Applesoft BASIC and products like the SoftCard, Microsoft often worked quite closely with Apple during this early period, in marked contrast to the strained relationship the two companies would develop in later years.) Woz also tweaked the display system on the II Plus to manage 6 colors in hi-res mode instead of just 4.

By 1980, then, the Apple II had in the form of the II Plus reached a sort of maturity, even though holes — most notably, a lack of support for lower-case letters without the purchase of additional hardware — remained. It was not the best-selling machine of 1980, and certainly far from the cheapest, but in some ways still the most desirable. Woz’s fast and reliable Disk II design coupled with the comparatively cavernous RAM of the II Plus and the machine’s bitmap graphics capabilities gave inspiration for a new breed of adventure games and CRPGs, larger and more ambitious than their predecessors. We’ll begin to look at those developments next time.

Even in the aftermath of the Apple II’s first, relatively modest success, Jobs began working almost immediately to make sure Apple’s follow-up products reflected only his own vision of computing, gently easing Woz out of his central position. He began to treat Woz as something of a loose cannon to be carefully managed after Woz threatened Apple’s legendary 1980 IPO by selling or even giving away chunks of his private stock to various Apple employees who he just thought were doing a pretty good job and deserved a reward, gosh darn it. The Apple III, also introduced in 1980, was thus the product of a more traditional process of engineering by committee, with Woz given very little voice in the finished design. It was also Apple’s first failure, largely due to Jobs’s overweening arrogance and refusal to listen to what his engineers were telling him. Most notably, Jobs insisted that the Apple III, like the Apple II, ship without a cooling fan. This time, no amount of clever engineering hacks could prevent machines from melting by the thousands. Perhaps because of the deeply un-Jobs-ian, hackerish side of its personality, the Apple II was a machine Jobs tried repeatedly to kill, with little success; it remained the company’s biggest seller and principal source of revenue when he resigned from Apple in a huff following an internal dispute in 1985.

In February of 1981, Woz crashed the small airplane he had recently learned how to fly, suffering serious head trauma. This event marked the end of his truly cutting-edge engineering years, at Apple or anywhere else. Perhaps he took the crash as a wake-up call to engage with all those other wonders of life he’d been neglecting during the years he’d spent immersed in circuits and code. It’s also true, though, that the sort of high-wire engineering Woz did throughout the 1970s (not only with Apple and privately, but also with Hewlett Packard) is very mentally intense, and possibly the injury changed Woz’s brain enough to make that sort of work no longer possible. Regardless, he began to interest himself in other things: going back to university under an assumed name to finish his aborted degree, organizing two huge outdoor music and culture festivals (the “US Festivals” of 1982 and 1983), developing and trying to market a universal remote control. He is still officially an employee of Apple, but hasn’t worked a regular shift in the office since February of 1987. He wrote an autobiography (with the expected aid of a professional co-author) a few years ago, maintains a modest website, contributes to various worthy causes such as the Electronic Frontier Foundation, and, most bizarrely, made a recent appearance on Dancing with the Stars.

Asked back in 2000 if he considered himself an entrepreneur, Woz had this to say:

Not now. I’m not trying to do that because I wouldn’t put 20 hours a day into anything. And I wouldn’t go back to the engineering. The way I did it, every job was A+. I worked with such concentration and focus and I had hundreds of obscure engineering or programming things in my head. I was just real exceptional in that way. It was so intense you could not do that for very long—only when you’re young. I’m on the board of a couple of companies that you could say are start-ups, so I certainly support it, but I don’t live it. The older I get the more I like to take it easy.

Woz has certainly earned the right to “take it easy,” but there’s something a little sad to me about his post-Apple II career, which reads as the story of a man who never quite figured out what to do for a second act in life. And the odd note of subservience that always marked his relationship with Jobs is still present. From the same interview:

You know what, Steve Jobs is real nice to me. He lets me be an employee and that’s one of the biggest honors of my life. Some people wouldn’t be that way. He has a reputation for being nasty, but I think it’s only when he has to run a business. It’s never once come out around me. He never attacks me like you hear about him attacking other people. Even if I do have some flaky thinking.

It’s as if Woz, God bless his innocence, still does not understand that he was really treated rather shabbily by Jobs, and that, in a very real sense, it was he that made Jobs. In that light, it seems little enough to expect that Jobs refrain from hectoring him as he would one of his more typical employees.

As for Jobs himself… well, you know all about what became of him, right?

 
Posted on September 12, 2011 in Digital Antiquaria, Interactive Fiction

 


Jobs and Woz

As I write this, the news media and the blogosphere are just tailing off from an explosion of commentary and retrospectives triggered by an obviously ill Steve Jobs stepping down at last from his post as Apple’s CEO. The event marks the end of an era. With Bill Gates having retired from day-to-day involvement with Microsoft a few years ago, the two great survivors from those primordial computing days of the late 1970s and early 1980s no longer run the iconic companies that they began to build all those years ago.

For many, Bill and Steve embodied two fundamentally opposing approaches to technology. On one side was Gates, the awkwardly buttoned-down overachiever who never even as a multi-billionaire seemed quite comfortable in his own skin, wielding spreadsheets and databases while obsessing over Microsoft’s latest financial reports. On the other was Jobs, the epitome of California cool who never met a person he couldn’t charm, wielding art packages and music production software while talking about how technology could allow us to live better, more elegant lives. These attitudes were mirrored in the products of their respective companies. In In the Beginning Was the Command Line, Neal Stephenson compared the Macintosh with a sleek European sedan, while Windows was a station wagon which “had all the aesthetic appeal of a Soviet worker housing block; it leaked oil and blew gaskets, and [of course] it was an enormous success.” These contrasts — or should we say caricatures? — run deep. They were certainly not lost on Apple itself when it made its classic series of “I’m a Mac / I’m a PC” commercials to herald its big post-millennial Jobs-helmed comeback.

Even in the late 1970s, when he was a very young man, Jobs had an intuitive feeling for the way that technology ought to work and an aesthetic eye that was lacking in just about every one of the nerds and hackers that made up the rest of the early microcomputer industry. Almost uniquely among his contemporaries, Jobs had a vision of where all this stuff could go, a vision of a technological future that would appeal not just to the PC guy of those commercials but also to the Mac guy. The industry desperately needed a guy like Jobs — good-looking, glib, articulate, with an innate sense of aesthetics and design — to serve as an ambassador between the hackers and ordinary people. Jobs was the kind of guy who might visit a girlfriend’s home for dinner and walk away with a check to fund his startup business from the father and a freshly baked cake from the mother. He made all these hackers with their binary code and their soldering irons seem almost normal, and almost (if only by transference) kind of cool.

There’s a trope that comes up again and again amongst the old-timers who remember those days and the histories that are written of them: that it was a fundamentally innocent time, when hackers hacked just for the joy of it and accidentally created the modern world. In Triumph of the Nerds, Jobs’s partner in founding Apple, Steve Wozniak, said:

“It was just a little hobby company, like a lot of people do, not thinking anything of it. It wasn’t like we both thought it was going to go a long ways. We thought we would both do it for fun, but back then there was a short window in time where one person who could sit down and do some neat, good designs could turn them into a huge thing like the Apple II.”

I believe Wozniak, a hacker’s hacker if ever there was one. To imagine that an amity of hacking bliss united those guiding the companies that made up the early industry, though, is deeply mistaken. As shown by the number of companies and computer models that had already come and gone by even 1982, the PC industry was a cutthroat, hyper-competitive place.

In the same video, Jobs has this to say about those days:

“I was worth over a million dollars when I was 23, and over ten million dollars when I was 24, and over a hundred million dollars when I was 25, and it wasn’t that important, because I never did it for the money.”

In contrast to Wozniak’s comments, there’s a note of disingenuousness here. It seems suspicious that, for someone for whom finances are so unimportant, Jobs has such a specific recollection of his net worth at exact points in time; something tells me Wozniak would be hard-pressed to come up with similar figures. I mentioned once before on this blog how Jobs cheated his best friend Wozniak out of a $5000 bonus for designing Breakout on his behalf for Atari. Jobs was of course a very young man at the time, and we all have things from our youth we’d like to take back, but this moment always struck me as one of those significant markers of character that says something about who a person fundamentally is. Wozniak might dismiss the incident in his autobiography by saying, “We were just kids, you know,” but I can’t imagine him pulling that stunt on Jobs. In another of those markers of character, Wozniak was so honest that, upon designing the computer that would come to be known as the Apple I and founding a company with Jobs to market it, he suddenly recalled the employment contract he had signed with Hewlett Packard which said that all of his engineering work belonged to HP during the term of his employment, whether created in the office or at home, and tried to give his computer design to HP. Much to Jobs’s relief, HP just looked at it bemusedly and told Wozniak to knock himself out trying to sell the thing on his own.

In the case of Jobs, when we drill down past the veneer of California cool and trendy Buddhism we find a man as obsessively competitive as Gates; both men were the most demanding of bosses in their younger days, who belittled subordinates and deliberately fomented discord in the name of keeping everyone at their competitive best. Gates, however, lacked the charm and media savvy that kept Jobs the perpetual golden boy of technology. Even when he was very young, people spoke about the “reality distortion field” around Jobs that seemed to always convince others to see things his way and do his bidding.

And if Jobs isn’t quite the enlightened New Man whose image he has so carefully crafted, there’s a similarly subtle cognitive dissonance about his company. Apple’s contemporary products are undeniably beautiful in both their engineering and their appearance, and they’re even empowering in their way, but this quality only goes so far. To turn back to Stephenson again, these sleek machines have “their innards hermetically sealed, so that how they work is something of a mystery.” Empowering they may be, but only on Apple’s terms. In another sense, they foster dependence — dependence on Apple — rather than independence. And then, of course, all of that beauty and elegance comes at a premium price, such that they become status symbols. The idea of a computing device, whatever its price, becoming a status symbol anywhere but inside the community of nerds would of course have been inconceivable in 1980 — so that’s progress of a sort, and largely down to Jobs’s influence. Still, it’s tempting sometimes to compare the sealed unknowability of Apple’s products with the commodity PCs that once allowed the “evil” Bill Gates to very nearly take over the computing world entirely. A Windows-based PC may have been a domestic station wagon or (in another popular analogy) a pickup truck, but like those vehicles it was affordable to just about everyone, and it was easy to pop the hood open and tinker. Apple’s creations required a trip to the metaphorical exotic car dealership just to have their oil changed. A Macintosh might unleash your inner artist and impress the coffee-house circuit, but a PC could be purchased dirt cheap — or assembled from cast-off parts — and set up in the savannah to control the pumps that keep a village supplied with drinking water. There’s something to be said for cheap, ubiquitous, and deeply uncool commodity hardware; something to be said for the idea of (as another microcomputer pioneer put it) “computers for the masses, not the classes.”

A mention of Linux might seem appropriate at this juncture, as might a more fine-grained distinction between hardware and software, but these metaphors are already threatening to buckle under the strain. Let’s instead try to guide this discussion back to Jobs and Woz, an odd couple if ever there was one.

Wozniak was a classic old-school hacker. Even during high school in the late 1960s, he fantasized about computers the way that normal teenagers obsessed over girls and cars. His idea of fun was to laboriously write out programs in his notebooks, programs which he had no computer to run, and to imagine them in action. While other boys hoarded girlie magazines, Woz (as everyone called him) collected manuals for each new computer to hit the market — sometimes so he could redesign them better, more efficiently, in his imagination.

In 1970, during a working sabbatical of sorts from university, the 20-year-old Woz met the 15-year-old Steve Jobs. Despite the age difference, they became fast friends, bonding over a shared love of technology, music, and practical jokes. Soon they discovered another mutual obsession: phone phreaking, hacking the phone system to let one call long distance for free. The pair’s first joint business venture — instigated, as these sorts of things always were, by Jobs — was selling homemade “blue boxes” that could generate the tones needed to mimic a long-distance carrier.

Jobs was… not a classic old-school hacker. He was, outwardly at least, a classic hippie with a passion for Eastern philosophy and Bob Dylan, a “people person” with little patience for programming or engineering. Nevertheless, the reality distortion field allowed him to talk his way into a technician’s job at rising arcade-game manufacturer Atari. He even got Atari to give him a summer off and an airline ticket to India to do “spiritual research.” In spite of it all, though, the apparently clueless Jobs just kept delivering the goods. The reason, of course, was Woz, who by then was working full-time for Hewlett Packard during the day, then doing Jobs’s job for him by night. The dynamic duo’s finest hour at Atari was the arcade game Breakout. In what, at least from the outside, has all the markings of a classic codependent relationship, poor Woz was told that they had just four days to get the design done; actually, Jobs just wanted to get finished so he could jet off to attend the harvest at an apple orchard commune in Oregon. (You just can’t make some of this stuff up…) Woz met the deadline by going without sleep for four days straight, and did it using such an impossibly low number of chips that it ended up being unmanufacturable. Atari engineer Al Alcorn:

“Ironically, the design was so minimized that normal mere mortals couldn’t figure it out. To go to production, we had to have technicians testing the things so they could make sense of it. If any one part failed, the whole thing would come to its knees. And since Jobs didn’t really understand it and didn’t want us to know that he hadn’t done it, we ended up having to redesign it before it could be shipped.”

But Jobs made it to the apple festival, and got that $5000 bonus (the one he neglected to tell Woz about) to spend there. Even in 1984 Woz still believed that he and Jobs had earned only $700 for a design that became the big arcade hit of 1976.

We can really only speculate about what caused Woz to put up with treatment like this — but speculation is fun, so let’s have at it. Woz was one of those good-hearted sorts who want to like and be liked, but who, due to some failure of empathy or just from sheer trying too hard, are persistently just enough out of sync in social situations to make everything a bit awkward. Woz always seemed to laugh a little bit too loud or too long, couldn’t quite sense when the time was right to stop reciting from his store of Polish jokes, didn’t recognize when his endless pranks were about to cross the line from harmless fun into cruelty. For a person like this the opportunity to hang out with a gifted social animal like Jobs must have been hard to resist, no matter how unequal the relationship might seem.

And it wasn’t entirely one way — not at all, actually. When Woz was hacking on the project that would become the Apple I, he lusted after a new type of dynamic RAM chip, but couldn’t afford any. Jobs just called up the manufacturer and employed the reality distortion field to talk them into sending him some “samples.” Jobs was Woz’s enabler, in the most positive sense; he had a genius for getting things done. In fact, in the big picture it is Woz that is in Jobs’s debt. One senses that Jobs would have made his mark on the emerging microcomputer industry even if he had never met Woz — such was his drive. To be blunt, Jobs would have found another Woz. Without Jobs, though, Woz would have toiled away — happily, mind you — in some obscure engineering lab or other his entire life, quietly weaving his miniaturized magic out of silicon, and retired with perhaps a handful of obscure patents to mark his name for posterity.

Unsurprisingly given their backgrounds and interests, Woz and Jobs were members of the famous Homebrew Computer Club, Woz from the very first meeting on March 5, 1975. There, the social hierarchy was inverted, and it was Woz with his intimate knowledge of computers that was the star, Jobs that was the vaguely uncomfortable outsider.

Woz designed the machine that became the Apple I just for fun. It was somewhat unusual within Homebrew in that it used the new MOS 6502 CPU rather than the Intel 8080 of the original Altair, for the very good reason that Woz didn’t have a whole lot of money to throw around and the 6502 cost $25 versus $175 for the 8080. The process was almost committee-driven; Woz, who had the rare and remarkable gift of being without ego when it came to matters of design, would bring his work-in-progress to each biweekly Homebrew meeting, explaining what he’d done, describing where he was having problems, and soliciting advice and criticism. What he ended up with was pretty impressive. The machine could output to a television screen, as opposed to the flashing lights of the Altair; it used a keyboard, as opposed to toggle switches; and it could run a simple BASIC interpreter programmed by Woz himself. Woz said he “designed the Apple I because I wanted to give it away free for other people. I gave out schematics for building my computer at the next meeting I attended.”

Steve Jobs put a stop to those dangerous tendencies. He stepped in at this point to convince Woz to do what he never would have done on his own: to turn his hacking project into a real product provided by a real company. Woz sold his prized HP calculator and Jobs his Volkswagen van (didn’t someone once say that stereotypes are so much fun because they’re so often true?) to form Apple Computer on April 1, 1976. The Apple I was not a fully assembled computer like the trinity of 1977, but it was an intermediate step between the Altair and them; instead of a box of loose chips, you got a finished, fully soldered motherboard, which you then completed with your own case, power supply, keyboard, and monitor. The owner of an important early computer store, The Byte Shop, immediately wanted to buy 50 of them. Problem was, Jobs and Woz didn’t have the cash to buy the parts to make them. No problem; Jobs employed the reality distortion field to convince a wholesale electronics firm to give these two hippies tens of thousands of dollars in hardware in exchange for a promise to pay them in one month. Apple ended up selling 175 Apple Is over the next year, each assembled by hand in Jobs’s parents’ garage by Jobs and Woz and a friend or family member or two.

While that was going on, Woz was designing his masterpiece: the Apple II.

 
Posted on September 9, 2011 in Digital Antiquaria, Interactive Fiction

 


Binning the Trash-80

The microcomputer landscape of 1980 looked very different than it had when the trinity of 1977 first hit the scene. The hackers and early adopters who first made the TRS-80 a success were a step closer to sane than the soldering-iron-wielding crazies who had constructed Altairs in their garages out of a set of diagrams and a loose pile of chips, but only a step. Owning and operating a computer was still expensive and difficult, and the question on the lips of wives and girlfriends across the country — “But what is it really good for?” — did not have any particularly strong answers. By 1980, though, that was changing, sufficiently so in fact that segments of the population were beginning to purchase computers not out of fascination with the technology itself, but rather because of what the technology would allow them to accomplish. That was due to the work of all those early adopters, who hacked like mad to create useful things that would justify their time in the terms that matter most in a market economy, dollars and cents, and thus in turn buy them yet more time to hack.

The most celebrated of these early killer apps today, perhaps due to its having been featured in the Triumph of the Nerds documentary, is VisiCalc, the spreadsheet program whose basic approach is still echoed in the Microsoft Excel we all know and love (?) today. Introduced in late 1979, it gave accountants, small-business owners, and even home users compelling reasons to own a microcomputer — whether to calculate taxes or accounts receivable and payable, or just to keep the checkbook balanced. But there are other examples. The first crude word processing application was called The Electric Pencil; it predated even the trinity of 1977, appearing for the early kit computers in December of 1976. It took WordStar, released in September of 1978, to refine the concept into a program flexible and powerful enough to begin to replace the expensive specialized word-processing machines found on secretaries’ desks around the country. dBase, the first programmable relational database for microcomputers, made its first appearance in 1979. And while they were seldom openly mentioned as a reason to buy these early computers, games were always present as a sort of guilty pleasure and secret motivator. They were still crude and limited in 1980, but growing by leaps and bounds in both ambition and sales as the first specialized entertainment publishers such as Adventure International got off the ground, and as new microcomputers much more suited for play began to appear in the wake of the Atari VCS game-console sensation which began sweeping the country in earnest during the holiday season of 1979.

Ah, yes, the new machines. As new applications showed how useful and/or entertaining computers could be in both businesses and homes and as their sales figures responded, plenty of new players came rushing into the market. Some, such as the Exidy Sorcerer and Texas Instruments 99/4, found little traction, becoming mere historical footnotes and modern collector’s items. Others, though, heralded major new technological and cultural developments. We’ll get to these at some point, but for this post let’s see if we can bring some sort of order — i.e., some categories — to the crazy quilt of microcomputers available by 1980. Oddities like the TI 99/4 (the world’s first 16-bit microcomputer based on a CPU of TI’s own design) aside, most computers were based on one of two 8-bit CPU architectures.

First there was the Intel 8080, the chip at the heart of the original Altair kit computer and its contemporaries, and the Z80, a mostly compatible CPU from Zilog that nevertheless offered a more flexible, efficient design; this, you may recall, was the chip Tandy chose for the TRS-80. Apart from the TRS-80, which for better and (as we shall shortly see) for worse remained largely its own thing, these machines generally ran the first widespread platform-agnostic operating system for microcomputers, CP/M (Control Program for Microcomputers). Developed by Gary Kildall at the very dawn of the microcomputer era and published by his company Digital Research, CP/M was the MS-DOS — or, if you like, the Microsoft Windows — of this early era, a de facto if not official standard that allowed machines from a huge variety of makers to share software and information. (There is also a more tangible link between CP/M and MS-DOS: depending on whom you talk to, the original MS-DOS from 1981 was either “inspired by” CP/M or an outright unauthorized reverse engineering of the earlier OS. But that subject will doubtlessly come up again in later posts…) To run CP/M, a computer required two things: an Intel 8080 or Zilog Z80 CPU, and a certain standard bus design for communicating with its disk drives and other peripherals, known as the S-100 — a design which had its origins as far back as the original Altair. (UPDATE: As Jonno points out in the comments, an S-100 bus was not a strict requirement for CP/M.)

CP/M and the Intel- and Zilog-based architectures on which it ran became the standard environment for “serious” microcomputing of the late 1970s and early 1980s, the kind done in corporate offices and small businesses. WordStar and dBase were both born there, and VisiCalc, although conceived on the Apple II, quickly found its way there. CP/M had, however, no graphics capabilities at all and only limited support for real-time operations, making it problematic as a platform for many types of games and even educational software. It also relied upon the existence of at least one disk drive on its host platform at a time when such devices tended to be very pricy. These factors made CP/M and the 8080 a poor fit for the less expensive, usually cassette-based computers generally chosen by home users. That market was dominated by another hardware architecture, that of the MOS Technology 6502 CPU.

When the 6502 first appeared in 1975, MOS was a tiny independent chip-maker, but that changed when Commodore purchased the entire company in late 1976. This move, one of the smartest that Commodore head Jack Tramiel ever made, left Commodore in the enviable position of making money not only when it sold its own machines such as the PET, but also every time a rival purchased 6502s for its own products. Said rivals initially included only Apple with its Apple II line and a number of kit-based computers from various small manufacturers, but that would change soon enough.

A CP/M equivalent for 6502-based machines was never developed, meaning that they remained largely incompatible with one another. BASIC did serve as a partial lingua franca, as virtually all of these machines housed a version of Microsoft’s industry-standard BASIC in their ROMs, but there was enough variation from implementation to implementation that most programs needed at least some customizing. And of course when one progressed beyond BASIC to assembly language to take full advantage of everything a 6502-based machine had to offer — especially graphics and sound, which capabilities varied wildly from model to model — one was faced with essentially coding everything from scratch for each machine one wished to support. Crazy times — although with the ever-increasing proliferation of incompatible mobile computing devices in our own times it’s starting to look like 1980 all over again.

What the 6502 world lost in compatibility it gained in flexibility. Freed from the need to work through a comparatively complex and inefficient OS like CP/M, programmers could code right to the metal on these machines, manipulating every element of the hardware directly for maximum efficiency. Further, the 6502-based machines, being generally aimed at the home and education markets, tended to feature the graphics and sound capabilities that were missing from the bland, textual world of CP/M; the Apple II, for instance, was the only member of the trinity of 1977 with support for proper bitmap graphics, a subject I’ll begin to discuss in more detail in my next post.

But now you might be wondering where all of this left the TRS-80, which fit neatly into neither of the two categories just described. Although the TRS-80 was built around the Z80 CPU, Radio Shack had chosen in the name of penny pinching not to implement the S-100 bus design. (UPDATE: As happens from time to time around these parts, this is not correct. Actually, the problem involved the memory map of the original TRS-80, in which ROM preceded RAM; a CP/M machine required the reverse. Thanks to Jonno for pointing this out in the comments.) This made CP/M a nonstarter. Despite being a huge success in its early years and still having the largest installed base of any microcomputer, the TRS-80 had a future that was, at least in retrospect, already clouded in 1980. Its incompatibility with CP/M left it cut off from the quickly growing base of serious business software found on that OS. In spite of the TRS-80’s relatively cheap price, Radio Shack’s reputation as purveyors of cheap junk for the masses did little to attract business users, and in a classic chicken-or-the-egg scenario this lack of business users discouraged developers from porting their products from CP/M to the little oddball Tandy machine. And in the other half of the microcomputer market, the 6502-dominated world of games machines and hobbyist computing, the TRS-80 was also looking like an increasingly poor fit with its almost complete lack of graphics and absolutely complete lack of sound. The arrival of the Atari 400 and 800, colorful 6502-based machines with superb graphics and sound for the time, and, a bit later in early 1981, the Commodore VIC-20, a much less capable machine in comparison but one nevertheless sporting color graphics and sound for an unprecedentedly low price, were particularly ominous signs.
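To make the memory-map point concrete, here’s a small illustrative sketch (in Python, with the commonly cited addresses drawn from standard references rather than from this post, so treat it as an approximation): CP/M assumes writable RAM starting at address zero for its base page and loads programs just above that, while the original TRS-80 keeps ROM parked in exactly that region.

```python
# Rough sketch of the conflict described in the update above. Addresses are
# the commonly cited ones for CP/M and the original TRS-80; this is an
# illustration, not a precise hardware reference.

CPM_EXPECTS = {
    0x0000: "RAM: CP/M base page (system vectors and parameters)",
    0x0100: "RAM: Transient Program Area, where programs are loaded",
}

TRS80_MODEL1 = {
    0x0000: "ROM: Level II BASIC",        # ROM sits exactly where CP/M wants RAM
    0x4000: "RAM: user memory begins here",
}

# The clash in one line: the lowest addresses are ROM on the TRS-80,
# but CP/M needs to write there.
print("Conflict at $0000:", TRS80_MODEL1[0x0000], "vs.", CPM_EXPECTS[0x0000])
```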

While the wisdom of many of its moves is debatable, Tandy at least did not stand entirely still in the face of these developments. In fact, it released quite a blizzard of new machines, none of which came close to recapturing the market share the TRS-80 enjoyed in the late 1970s.

Tandy released a new machine called the TRS-80 Model 2 (the original TRS-80 now being retroactively renamed the Model 1) in late 1979. The Model 2 was designed to capture the business computing market that was passing the Model 1 by; it sold with integrated disk drives and included bank-switchable ROM, thus allowing it to run CP/M. But it was also a much more expensive machine than the Model 1 and, most dismaying of all, completely incompatible with it. Thanks to Radio Shack’s usual lack of marketing acumen and genius for clunky, tacky-looking design as well as its high price, it was not a big success in the business market, while its incompatibility made it of little interest to existing Model 1 owners.

The Model 3, which appeared to replace the Model 1 in the summer of 1980, was meanwhile rather forced on Radio Shack. The Model 1 had put out so much radio interference that, in an example of the boundless ingenuity that marked the early microcomputer era, people began writing programs to manipulate memory so as to make music using this interference along with a nearby transistor radio to pick it up. New FCC regulations for 1981 forced Radio Shack to build in proper RF shielding, and thus spoiled that particular kind of fun. In addition to fixing this issue, the Model 3 also sported a slightly faster version of the Z80 CPU and (hallelujah!) real lower-case letter support for both input and output amongst other modest improvements. Yet it did nothing to improve the Model 1’s meager display capabilities. And, in the one-step-forward two-steps-back dance that seemed to define Radio Shack, the Model 3 was optimistically said to be just “80%” compatible with the Model 1, while, once again, the design did not allow for CP/M. Radio Shack in its marketing genius now had three separate machines labeled the TRS-80, each partially or entirely incompatible with its siblings. Just imagine trying to figure out what software actually worked on your version…

And incredibly, there was yet another completely incompatible TRS-80 released in 1980, this one the most significant of all. Although officially called the TRS-80 Color Computer, it was a radical departure from anything seen before, being built around perhaps the most advanced 8-bit CPU ever produced, the new Motorola 6809E. Like so many Radio Shack systems, it offered intriguing potential bundled together with some dismaying weaknesses. On the plus side were the powerful 6809E itself and an advanced Microsoft BASIC that made it a favorite among hobbyist programmers; on the weak side were sound and graphics capabilities that, while a step up from the other TRS-80 models, were still not competitive with new and upcoming models from companies like Atari and Commodore. In spite of that the CoCos, as they soon became affectionately known, had a long run during which they consistently flew under the radar of the mainstream, attracting little in the way of games or applications from most publishers or even from Radio Shack itself, but survived on the back of a sort of cult industry all their own sustained by a fanatically loyal user base. The CoCo line did not finally go out of production until 1991.

There are many more interesting stories to tell about Radio Shack’s quirky little computers, but none would ever come close to dominating the industry the way that the TRS-80 Model 1 did for those first few years. In truth, even the Model 1 was popular because it was widely available at a time when distribution channels for other brands were barely extant and because its price was reasonable rather than because of any sterling technical qualities of the machine itself. The TRS-80 was really not so far removed from Radio Shack’s other products: it basically got the job done, but in about the most uncool and unsexy way imaginable. It primed the pump of the home computer industry and brought adventure games into the home for the first time, but already in 1980 its time was passing.

So, we’ll bid adieu to the old Trash-80 and move on next time to look at the machine that made the company that has come to define cool and sexy in technology. Yes, I’m talking about those two plucky kids in that California garage.

 
Posted on September 6, 2011 in Digital Antiquaria, Interactive Fiction

 


Robert Lafore’s Interactive Fiction

Quick: Who first coined the term interactive fiction? And why?

Assuming you had an answer at all, and assuming you’re a loquacious git like me, it may have run something like this:

The term originated, many years after the birth of the genre it describes, in the early 1980s with a company called Infocom. At that time, games of this sort were commonly known as “adventure games” or “text adventures,” the latter to distinguish them from the graphical brand of story-based games which were just beginning to compete with text-based titles in the marketplace of that time. Indeed, both terms are commonly used to this day, although they generally connote a rather “old-school” form of the genre that places most of its emphasis on the more gamelike, as opposed to literary, potentials of the form. Infocom decided that interactive fiction was a term which more accurately described their goal of creating a viable new literary form, and following that company’s demise the term was appropriated by a modern community of text-based storytellers who in many ways see themselves as heirs to Infocom’s legacy.

That, anyway, is what I wrote five years ago in my history of interactive fiction. I still think it describes pretty accurately Infocom’s motivation for replacing the term text adventure with IF, but it’s inaccurate in one important sense: the term did not actually originate with Infocom. It was rather the creation of a fellow named Robert Lafore, who founded a company under that name in 1979 and published software from 1980 to 1982 through Scott Adams’s Adventure International. By the time that Lafore came to AI, he already had three titles in the line ready to go. Local Call for Death and Two Heads of the Coin are mystery stories with an obvious debt to Dorothy L. Sayers’s Lord Peter Wimsey and Arthur Conan Doyle’s Sherlock Holmes respectively, while Six Micro-Stories presents six brief vignettes in a variety of settings and genres. Over the next year or so he wrote two more: His Majesty’s Ship Impetuous, in the style of C.S. Forester’s Horatio Hornblower novels; and Dragons of Hong Kong, in the spirit of Sax Rohmer’s Fu Manchu series of oriental mysteries.

To understand what Lafore’s concept of IF is and how it works, let’s begin with some promotional copy. After asking us to “step into a new dimension in literature,” Adventure International’s advertisement for the line continues:

Traditionally, literature has been a one-way medium. The information flow was from the novel to the reader, period. Interactive fiction changes this by permitting the reader to participate in the story itself.

The computer sets the scene with a fictional situation, which you read from the terminal. Then you become a character in the story: when it’s your turn to speak, you type in your response. The dialog of the other characters, and even the plot, will depend on what you say.

Wow. No previous computer “game” had dared to compare its story to that of a novel. Just the text above, divorced from the actual program it describes, demonstrates a real vision of the future of ludic narrative.

But as anyone who’s had experience with early computer-game ad copy knows, the reality often doesn’t match the rhetoric; the latter tends to be aspirational rather than descriptive, corresponding more to the game the authors would like to have created than to anything the technical constraints of 8-bit processors and minuscule memories actually allowed. Here’s a complete play-through of the first of the vignettes of Six Micro-Stories, “The Fatal Admission”:

Admittedly, this is not Lafore’s finest hour, so let’s try to be gentle. Let’s leave aside the fact that an admiral could only have been in the Kriegsmarine, not the Gestapo, as well as Lafore’s obvious cluelessness about German. Let’s also leave aside the illogicality of the question on which the story turns. (If I’ve been actively impersonating Colonel Braun for so long, how could I not know what flight wing I am with?) And let’s leave aside the unfair, learning-by-death aspect of the whole experience. I just want to get down to how the program works right now.

As will probably surprise no one, the program is not parsing the player’s responses in any meaningful sense, but rather doing simple pattern matching on the player’s input, somewhat in the style of Eliza but without even that program’s sophistication. Given this, it’s inevitably very easy to trip the program up, intentionally or unintentionally. Consider the following response to the admiral’s question about Captain Eiderdown:

What’s happened here is that the program has found the “not” in the player’s input and thrown out everything else, assuming the sentence to be a negative answer. This is not an isolated incident. Let’s try yet again to answer the trick question about the 57th Air Wing correctly and stay alive.

Cool! Now we can accept our new assignment and learn even more juicy Nazi secrets.

Woops. The program has entirely failed to understand us that time, which is at least better than a misunderstanding, I suppose.

Obviously simple answers are the best answers.
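Lafore’s BASIC isn’t reproduced here, but the behavior on display (keying on a single word like “not” and throwing everything else away) boils down to something like the following sketch. It’s in Python for readability, and the keyword lists and sample replies are invented for illustration; the real program’s rules surely differ in their details.

```python
# A rough reconstruction of the keyword matching described above. The
# keywords and sample inputs are invented for illustration; the original
# was written in BASIC and its exact rules differ.

def classify_reply(reply):
    """Bucket a free-form reply into YES / NO / UNKNOWN by scanning for keywords."""
    words = reply.lower().split()
    if any(w in ("no", "not", "never") for w in words):
        return "NO"        # any 'not' reads as a denial, whatever the sentence means
    if any(w in ("yes", "indeed", "certainly") for w in words):
        return "YES"
    return "UNKNOWN"       # everything else draws a canned 'I don't understand'

# The two failure modes on display above: a sentence containing 'not' that
# isn't a denial, and a phrasing the program can't place at all.
print(classify_reply("I have not failed to meet him"))      # -> NO
print(classify_reply("Tell me more about the assignment"))  # -> UNKNOWN
```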

Various vignettes in Six Micro-Stories do various things with the entered text. Perhaps the most complex and computationally interesting entry in the collection is called “Encounter in the Park,” in which you must try to get a date with a young lady you meet by chance in the park. It plays like a goal-directed version of Eliza, albeit a very primitive one. In case the connection is not obvious, the love interest’s name is even, you guessed it, Eliza.

The ultimate solution is ice cream; the mere mention of it causes Eliza to shed her sophisticated Updike-reading trappings and collapse into schoolgirl submissiveness. (The implications of this behavior in a paternalistic society like ours we will leave for the gender-studies experts.)

Another vignette is little more than a multiple-choice quiz on the age-old question of the definition of art.

And then there’s this nihilistic little number, in which nothing you type makes any difference whatsoever:

Lafore was either in a pissy mood when he wrote that one or homing in on some existential truth the rest of us can’t bear to face — take your choice.

But these are exceptions. When we get past the parser to look at the player’s actual options (thank God for BASIC!), we find that the remainder of the vignettes, as well as all of the vastly more compelling full-length stories, are really multiple-choice narratives, in which the player can choose from (usually) two or (occasionally) as many as three, four, or five options in a series of hard-coded decision points. In other words, these are really choose-your-own-adventure stories / hypertext narratives / choice-based narratives (choose your term). They are much closer to the Choose Your Own Adventure books that were just beginning to flood bookstores in 1980 than they are to the text adventures of Scott Adams or, indeed, to the interactive fiction that Infocom would soon be publishing. It’s just that their real nature is obscured by the frustrating Eliza-esque “parser” which adds an extra layer of guesswork to each decision. Sure, there are arguments to be made for the parser here. Theoretically at least, allowing the player to make decisions “in her own words” could help to draw her into the story and the role she plays there. In practice, though, the opportunities for miscommunication are so great that they outweigh any possible positives.
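Stripped of that faux parser, the skeleton of these stories is just a graph of hard-coded decision points. Here’s a sketch of the idea in Python, with the node text and choice words invented for illustration; the originals hard-code their branching directly in BASIC:

```python
# Invented example of the underlying choose-your-own-adventure structure.
# The real programs hard-code this branching in BASIC; none of the text below is Lafore's.

STORY = {
    "start":  {"text": "A strange sail looms to windward. Give chase or hold your course?",
               "choices": {"chase": "battle", "hold": "home"}},
    "battle": {"text": "You clear for action and close with the stranger.", "choices": {}},
    "home":   {"text": "You plod on toward port, wondering what might have been.", "choices": {}},
}

def play(node_id="start"):
    while True:
        node = STORY[node_id]
        print(node["text"])
        if not node["choices"]:
            break
        # Lafore wraps this prompt in keyword matching; here we just ask for the choice directly.
        answer = input("> ").strip().lower()
        node_id = node["choices"].get(answer, node_id)  # unrecognized input simply re-asks

play()
```

The Eliza-style layer simply sits between that input prompt and the lookup, turning whatever the player types into one of the two or three keywords it happens to recognize.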

In a demonstration of just how ridiculously far I’m willing to go to prove a point, I reimplemented what I consider to be the most satisfying of the longer stories, His Majesty’s Ship Impetuous, using the ChoiceScript system. (Well, okay, I did want to try out ChoiceScript as well, and this project made a good excuse…) If you care to play it, you’ll find that it’s a much more complete and satisfying effort than those I’ve highlighted above, if not totally free of their design problems, in that getting an optimal outcome requires a bit of learning from death. Still, Lafore’s writing is sharp and well suited to the genre, and the story as a whole is carefully thought through; it’s easily the most competently crafted fiction to have graced a computer screen as of 1980. Its good qualities come through much better shorn of the Eliza trappings. Indeed, it’s much more interesting to consider in this light, because choice-based narratives had not yet made their way to the computer when Lafore set to work.

I don’t want to try to formulate a detailed theory of choice-based narrative design here, particularly because Sam Kabo Ashwell is gradually doing exactly that via his amazing series of analyses of various works in the form — a series to which nothing I say here can bear comparison. I do, however, want to note that parser-based and choice-based narratives are very different from one another, yet are constantly confused; Lafore was perhaps the first to make this mistake, but he was hardly the last. For example, the Usenet newsgroup that was the center of IF discussion for many years, rec.arts.int-fiction, was originally founded by hypertext aficionados, only to be invaded and co-opted by the text-adventure people. And even today, the very interesting-looking Varytale project bills itself as “interactive fiction.” Choice of Games doesn’t go that far, but does actively encourage authors to submit their ChoiceScript games to IF competitions, something that doesn’t really bother me but that perhaps does bring two sets of expectations into a collision that doesn’t always serve either so well.

The primary formal difference here is in the granularity of the choices offered. Parser-based IF deals in micro-actions: picking up and dropping objects, moving from one concretely bounded space to another, fiddling with that lock in exactly the right way to get that door open. Choice-based narratives at their best deal in large, defining decisions: going to war with Eastasia or with Eurasia, trying to find your way back to the Cave of Time or giving up. Even when the choices presented are seemingly of an IF-like granularity, such as whether to take the left or the right branch in that dungeon you’re exploring, they should turn out to be of real consequence. A single choice in a choice-based narrative can encompass thousands of turns in a parser-based work of IF — or easily an entire game. When authors combine a choice-based structure with an IF-like level of granularity, the results are almost always unfortunate; see for example 2009 IF Competition entry Trap Cave, which attempted to shoehorn an Adventure-style cave crawl into a choice-based format, or Ashwell’s analysis of a Fighting Fantasy gamebook. A choice-based narrative needs to give its player real narrative agency — at the big, defining level of choosing where the plot goes — to be successful. Parser-based IF does not; it can let the player happily busy herself with the details while guiding her firmly along an essentially railroaded plot arc.

Given that they can tell such large swathes of story at a hop, we might be tempted to conclude that choice-based games allow deeper, richer stories. In a way that’s true, especially in the context of 1980; it would have been impossible to pack even 10% of the story of His Majesty’s Ship Impetuous into a Scott Adams-style game. It’s also true that choice-based narratives are generally not so much about challenging their players with puzzles or tactical dilemmas as they are about the “pure” experience of story. A contemporary reviewer of His Majesty’s Ship Impetuous writing in SoftSide magazine, Dave Albert, very perceptively picked up on these qualities and in the process summarized the joys of a choice-based narrative as well as some of the frustrations of Lafore’s early implementation of same:

Lafore has tried to write an open-ended story with several possible endings, and he has tried to structure it so that the reader/player is unaware of the import of the decisions made. Where in previous stories the player is allowed to ask any question that comes to mind (with often incongruous and confusing results), in Impetuous yes or no decisions are presented. There is no way to work around this structure, and it is greatly to the benefit of the program that such is the case. There is no puzzle to solve, only a story to develop. The end goal is to survive and the decisions that you make will dictate whether you do or not. However, you cannot decipher what is the proper course of action that will guarantee your success. There are enough critical points (decisions) in the program to make you uncertain of your actions after several games. This greatly enhances the value of the program.

I wouldn’t frame all the specific design choices in Impetuous quite so positively as Albert, but I think the larger points stand. Choice-based works encourage the player to view them from on high, like a puppet master manipulating not just her character but the strings of the plot itself; note that Impetuous is written in the third-person past tense. The player manipulates the story, but she does not always feel herself to be in the story. Some more recent choice-based works have even divorced the player entirely from any set in-world avatar. Parser-based IF, however, excels at putting you right there, immersed in the virtual reality you are exploring. Both approaches are valid for telling different kinds of stories, creating different kinds of experiences, and both can go horribly awry. Too many choice-based works leave their player feeling so removed from the action that she ceases to care at all (this, I must admit, is my typical reaction when playing even the modern ChoiceScript games, and the main reason my heart belongs firmly to the parser-based camp); too many parser-based works, especially early ones, become so fiddly that they only frustrate.

Minus the parser frustrations, Impetuous is a fairly successful piece of work, written at an appropriate level of abstraction for the choice-based form. If many of its choices are ultimately false ones, having no real effect on the plot, it disguises this well enough that the player does not really realize it, at least on the first play-through, while enough choices do matter to keep the player interested. Best of all, and most surprisingly when we consider the structure of, say, the early Choose Your Own Adventure books, there are no arbitrary, context-less choices (will you go right or left in this dungeon?) and no choices that lead out of the blue to death. Some of its ethical positions are debatable, such as the way it favors plunging headlong into battle versus a more considered approach, but perhaps that’s par for the course given its genre.

One of the amusing and/or disconcerting aspects of writing this blog has been that I sometimes find myself honoring pioneers who have no idea they are pioneers. After writing Dragons of Hong Kong, Lafore traded in his entertainment-software business for a long, successful, and still ongoing career as an author of technical books. I’m going to guess that he has no idea that the term he invented all those years ago remains vital while the works to which he applied it have been largely forgotten.

By way of remedying that at least a little bit, do give my implementation of His Majesty’s Ship Impetuous a shot. I think it works pretty well in this format, and is more entertaining and better written than it has any right to be. (Credit goes to Lafore for that, of course.) And if you’d like to play the originals of either Six Micro-Stories or His Majesty’s Ship Impetuous, you can do that too.

1. Download my Robert Lafore starter pack.
2. Start the sdltrs emulator.
3. Press F7, then load “newdos.dsk” in floppy drive 0 and either “microstories.dsk” or “impetuous.dsk” into floppy drive 1.
4. Reboot the emulator by pressing F10.
5. At the DOS prompt, type BASIC.
6. Type LOAD "STORY:1" for Six Micro-Stories; LOAD "STORY1:1" for His Majesty’s Ship Impetuous.
7. Type RUN.

Next time I want to have a look at the evolving state of the computer industry in 1980 — and begin to execute a (hopefully) deft platform switch.

 

Posted on September 1, 2011 in Digital Antiquaria, Interactive Fiction

 


A Busy 1980

When we last left Scott Adams at the end of 1979, he was poised to take this adventuring thing to the proverbial next level, with a solid catalog of games already available on a number of platforms, with perhaps the best name recognition in the nascent computer-game industry, and with a new company — Adventure International — ready to publish his works and the works of others under its own imprint. The following year saw him realize all of that potential and more, taking a place at the forefront of a new industry.

Adventure International grew by leaps and bounds for the next few years, while always remaining, like everything Adams touched, indelibly stamped with the personality of its founder. AI was the Dollar General of the early software industry. Its catalogs are filled with a ramshackle collection of software of every stripe. In addition to the expected text adventures from Adams and others (many of these also using Adams’s engine), there were arcade clones (“far superior to any Space Invader game for the TRS-80 microcomputer so far”, announces the blurb for Invaders Plus, with more honesty than legal wisdom); space strategy games (Galactic Empire and Galactic Trader); chess programs (“although a graphic display of the chessboard is provided, it is recommended that an actual chessboard be used during play…”); board game adaptations (Micropoly, which is once again foolish enough to advertise that it is a clone of Monopoly right in the promotional text); even TRS-80 Opera, which let one listen to the William Tell overture via a transistor radio set in close proximity to the machine’s cassette port (the TRS-80’s lack of proper RF shielding was not always such a bad thing). And for when fun-and-games time was over, there were also math programs, print spoolers, programming tools, drawing programs, and educational software to hand, the latter evincing the usual fascination with states and their capitals that was so common amongst early programmers. All of this software was relatively cheap, with $9.95 or $14.95 being the most common price points, and somewhat… variable… in quality. Still, there was a thrill to be had in walking the virtual aisles of the AI catalogs, gazing at the shelves groaning with the output of an expanding new industry, wondering what crazy (not to say hare-brained) idea would be around the next bend. Hovering over the whole scene was always the outsized personality of Adams himself, who would remain throughout AI’s brief but busy lifetime an unusually visible company leader. (A reading of the legal fine print shows AI itself to be merely “a division of Scott Adams, Inc.”)

The year 1980 represents an important historical moment for the entertainment-software industry. A few exceptions such as Microsoft and Automated Simulations aside, computer games had previously been distributed as a sideline by semi-amateurs who hung their Ziploc baggies up in the local computer store and signed up with the hobbyist distribution services run by SoftSide and Creative Computing magazines. Now, though, companies like AI and a few others that sprung up around the same time began to professionalize the field. Within a few years the Ziploc baggies would be replaced with slick, colorful boxes stuffed with glossy manuals and other goodies, and the semi-amateurs in their home offices and bedrooms with real development studios whose members did this stuff for a living. Computer games were becoming a viable business, bringing more resources onto the scene that would soon allow for bigger and more ambitious creations than anything yet seen, but also bringing all the complications and loss of innocence associated with monetizing a labor of love.

In light of the explosive growth of his company, it’s no surprise that Adams’s creation of new adventures slowed down dramatically at this point. Some of his energy was consumed — and not for the last time — with repackaging his already extant games. All received pen-and-watercolor cover illustrations courtesy of an artist known as “Peppy,” whose colorful if unpolished style perfectly suited the gonzo enthusiasm of Adams’s prose.

AI released just two new Adams-penned adventures in 1980: the Western pastiche Ghost Town in the spring and Savage Island Part One, first of a two-parter advertised as difficult enough for the hardcore of the hardcore, just in time for Christmas. I thought we’d take a closer look at the first of these to see how far Adams’s art had progressed since The Count.

The simple but painful answer to that question is: not at all, really. In fact, it has regressed in many ways. The Western setting was apparently merely the next on Adams’s list of genre touchstones to cover, as it does little to inform the experience of play. Ghost Town is a plotless treasure hunt, just like Adventureland; it’s as if The Count never happened: “Drop treasures then score.” Sigh.

We rob the saloon of its cash box just because it’s there. Double sigh.

Worse, even as a treasure hunt Ghost Town is neither entertaining nor satisfying. A few quips such as the response to trying to GO MIRROR (“I’m not Alice!”) aside, Ghost Town has lost some of the friendly warmth that made one somewhat willing to forgive the earlier games their own dodgy moments. The useful HELP command with its little nudges and food for thought has disappeared entirely, while the puzzles have devolved into a veritable catalog of design sins. Adams had been slowly ramping up the difficulty of each successive game that he wrote, apparently expecting his player to work through the games in order and thus to be prepared by the time she faced Ghost Town. I suppose that’s a reasonable enough approach in the abstract, but in reality there is no way to train for the puzzles in Ghost Town. Even some of the least objectionable require considerable outside knowledge, of things like the composition of gunpowder or the translation of Morse code.

Of course, in 1980, a time when Wikipedia was not a browser bookmark away, tracking down this sort of information might require a trip to the local library.

Other puzzles require us to see the room in question exactly as Adams pictured it, despite his famously terse room descriptions that do little more than list the objects therein. Still others reward only dogged persistence rather than insight, such as requiring us to tote a shovel around the map and dig in every single room to see whether anything turns up. Yet more, the worst of all, are protracted battles with the parser. How long it would take the average player to divine that she must SAY GIDYUP to get the horse to move is something I don’t even want to think about — nor how long she might fruitlessly try mixing the charcoal, sulfur, and saltpeter together before finally just typing MAKE GUNPOWDER. At times the parser seems not just technically limited but intentionally cruel.

Typing just GUNPOWDER as opposed to WITH GUNPOWDER at the above prompt results in a generic failure message. My experience with Ghost Town makes me more enamored than ever of the idea that these early games were simply too technologically limited to support difficult puzzles that were also fair and logically tenable, that ramping up their difficulty beyond a certain rather low threshold inevitably resulted in nonsense like so many of the puzzles in Ghost Town and the absurd end-game of Adventure.

It’s also tempting to conclude that Adams himself simply lacked the vision to continue to push the text adventure forward. Tempting, but not entirely correct. For a couple of years Adams wrote an occasional column for SoftSide magazine. In the November, 1980, edition he announced a planned new adventuring system called Odyssey, which would take advantage of disk-drive-equipped systems in the same way as did Microsoft Adventure, using all of that storage space as an auxiliary memory store. His plans were ambitious to say the least:

1. More than one player in an Odyssey at one time. Players may help (or hinder) one another as they see fit!
2. Full paragraphs instead of “baby talk,” e.g., “Shoe the horse with the horseshoe and the hammer and nails.”
3. Longer messages;
4. sound effects; and
5. expanded plot lines.

To develop this system I have actually had to develop a new type of computer language which I call OIL (Odyssey Interpretive Language) which is implemented by a special Odyssey assembler that generates Odyssey machine code. This machine code is then implemented on each different micro, e.g. Apple, TRS-80, etc., through a special host emulator to simulate my nonexistent Odyssey computer.

Currently (as of the Washington computer show, Sept., 1980) the system is in the final stages of implementing a host emulator on a TRS-80 32 K disk system and writing the first Odyssey (which has been sketched out and is tentatively entitled Martian Odyssey) in OIL to run on the emulator. I hope that by the time you are reading this, Odyssey Number One will be available from your local computer store or favorite mail order house.

The technical conception of Odyssey sounds remarkably similar to what would soon be rolled out by a tiny Massachusetts startup called Infocom. Interestingly, Marc Blank and Stu Galley of Infocom had laid out in the abstract the design of their virtual machine, the “Z-Machine,” in an article in Creative Computing just a couple of months before Adams wrote these words. Could he have been inspired by that article?
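The scheme Adams describes (compile a game once into code for an imaginary computer, then write a small “host emulator” for each real micro) is easy to illustrate in miniature. Here’s a toy sketch in Python; the instruction set is pure invention, with no relation to OIL or to the Z-Machine’s actual opcodes:

```python
# Toy illustration of the "nonexistent computer" approach: an invented instruction
# set plus the little interpreter each real machine would run. Nothing here reflects
# OIL or the Z-Machine; it is only meant to show the general shape of the idea.

PROGRAM = [
    ("print", "The lookout hails the deck: a sail on the horizon."),  # invented text
    ("ask",   "Do you give chase?"),
    ("branch_yes", 5),   # jump to instruction 5 on an affirmative answer
    ("print", "You let the stranger slip away."),
    ("halt",),
    ("print", "All hands! Clear for action!"),
    ("halt",),
]

def run(program):
    pc, answer = 0, False                 # program counter, last yes/no answer
    while True:
        op, *args = program[pc]
        if op == "print":
            print(args[0]); pc += 1
        elif op == "ask":
            answer = input(args[0] + " ").strip().lower().startswith("y"); pc += 1
        elif op == "branch_yes":
            pc = args[0] if answer else pc + 1
        elif op == "halt":
            break

run(PROGRAM)
```

Port the little run() loop to a new machine and every story written for the imaginary one comes along for free, which is presumably the economy Adams was after.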

Whatever the answer to that question, Martian Odyssey of course never appeared, and to my knowledge Adams never mentioned the Odyssey system again. For better and (ultimately) for worse, he elected to stick with what had brought him this far — meaning treasure hunts runnable on low-end 16 K computers equipped only with cassette drives. That strategy would continue to pay off handsomely enough for a few more years, yet it’s hard not to wonder about the path not taken, the territory ceded without a fight to Infocom and others. From 1980 on, Adams is more interesting as a businessman and an enabler for others than as a software artist in his own right. On that note, I want to talk about a few of the more interesting creations to stand alongside Adams’s own adventures in the jumble of the Adventure International catalog next time.

If you’d like to try Ghost Town, here’s a version you can load into the MESS emulator using its “Devices -> Quickload” function.

 
 
