
The Birth of Infocom

As the Dynamic Modeling Group put the final touches on Zork and put it to bed at last, it was beginning to feel like the end of an era at MIT. Marc Blank was about to graduate medical school and begin his residency in Pittsburgh, which would make extensive MIT hacking impossible even given his seemingly superhuman capacities. Others were finishing their own degree programs at MIT, or just running out of justifications for forestalling “real” careers with real salaries by hanging around their alma mater. In fact, a generational exodus was beginning, not just from the DMG but from MIT’s Laboratory for Computer Science and AI Lab in general as well. Pressures from the outside world were intruding on the hacker utopia inside MIT at last, pressures which in the next few years would change it forever. Much of the change stemmed from the invention of the microcomputer.

Most in established institutional hacking environments like MIT were initially nonplussed by what’s come to be called the PC revolution. That’s not so surprising, really. Those early microcomputers were absurdly limited machines. The homebrew hackers who bought (and often built) them were just excited to have unfettered access to something that, however minimally, met the definition of “computer.” Those privileged to find a place at an institution like MIT, however, not only had unfettered or nearly unfettered access to the systems there, but said systems were powerful enough to really do something. What charms did an Altair or even TRS-80 have to compare with sophisticated operating systems like TOPS-10 or TOPS-20 or ITS, with well-structured programming languages like LISP and MDL, with research into AI and natural-language processing, even with networked games like Maze and Trivia and, yes, Zork? The microcomputer world looked like a hopelessly uncultured and untutored one, bereft of a whole hacking tradition stretching back two decades or more. How could anyone try to build complex software using BASIC? When many institutional hackers deigned to notice the new machines at all, it was with withering contempt; Stu Galley called “We hate micros!” the unofficial motto of the DMG. They regarded the micros as little more than toys — the very same reaction as most of the general population.

By the spring of 1979, though, it was becoming increasingly clear to anyone willing to look that the little machines had their uses. WordStar, the first really usable microcomputer word processor, had been out for a year, and was moving more and more CP/M-based machines into offices and even writers’ studies. At the West Coast Computer Faire that May, Dan Bricklin gave the first public demonstration of VisiCalc, the world’s first spreadsheet program, which would revolutionize accounting and business-planning practice. “How did you ever do without it?” asked the first pre-release advertisement, hyperbolically but, as it turned out, presciently; a few years later millions would be asking themselves just that question. Unlike WordStar and even Scott Adams’s Adventureland, VisiCalc was not a more limited version of an institutional computing concept implemented on microcomputer hardware. It had been conceived, designed, and implemented entirely on the Apple II, the first genuinely new idea in software to be born on the microcomputer — and a sign of a burgeoning changing of the guard.

The microcomputer brought many, many more users to computers than had ever existed before. That in turn brought more private-industry investment into the field, driven by a new reality: that you could make real money at this stuff. And that knowledge brought big changes to MIT and other institutions of “pure” hacking. Most (in)famously, the AI Lab was riven that winter and spring of 1979 by a dispute between Richard Greenblatt, pretty much the dean of the traditional hacker ethic at MIT, and a more pragmatic administrator named Russell Noftsker. Along with a small team of other hackers and hardware engineers, Greenblatt had developed a small single-user computer — a sort of boutique micro, the first of what would come to be called “workstations” — optimized for running LISP. Believing the design to have real commercial potential, Noftsker approached Greenblatt with a proposal to form a company and manufacture it. Greenblatt initially agreed, but soon proved (at least in Noftsker’s view) unwilling to sacrifice even the most minute hacker principle in the face of business realities. The two split in an ugly way, with Noftsker taking much of the AI Lab with him to implement Greenblatt’s original concept as Symbolics, Inc. Feeling disillusioned and betrayed, Greenblatt eventually left as well to form his own, less successful company, Lisp Machines.

It’s not as if no one had ever founded a company out of MIT before, nor that commerce had never mixed with the idealism of the hackers there. The founders of DEC itself, Ken Olsen and Harlan Anderson, were MIT alumni who had done the basic design for what became DEC’s first machine, the PDP-1, as students there in the mid-1950s. Thereafter, MIT always maintained a cozy relationship with DEC, testing hardware and, most significantly, developing much essential software for the company’s machines — a relationship that was either, depending on how you look at it, a goldmine for the hackers in giving them perpetual access to the latest technology or a brilliant scheme by DEC for utilizing some of the best computing minds of their generation without paying them a dime. Still, what was happening at MIT in 1979 felt qualitatively different. These hackers were almost all software programmers, after all, and the microcomputer market was demonstrating that it was now possible to sell software on its own as prepackaged works, the way you might a record or a book. As a wise man once said, “Money changes everything.” Many MIT hackers were excited by the potential lucre, as evidenced by the fact that many more chose to follow Noftsker than the idealistic Greenblatt out of the university. Only a handful, such as Marvin Minsky and the ever-stubborn Richard Stallman, remained behind and continued to hew relentlessly to the old hacker ethic.

Infocom’s founders were not among the diehards. As shown by their willingness to add (gasp!) security to ITS to protect their Zork source, something that would have drawn howls of protest from Stallman on at least two different levels, their devotion to the hacker ethic of total sharing and transparency was negotiable at best. In fact, Al Vezza and the DMG had been mulling over commercial applications for the group’s creations as far back as 1976. As the 1979 spring semester wrapped up, however, it seemed clear that if this version of the DMG, about to be scattered to the proverbial winds as it was, wanted to do something commercially, the time to get started was now. And quite a lot of others at MIT were doing the same thing, weren’t they? It wouldn’t do to be left behind in an empty lab, as quite literally happened to poor old Richard Stallman. That’s how Al Vezza saw the situation, anyway, and his charges, eager to remain connected and not averse to increasing their modest university salaries, quickly agreed.

And so Infocom was officially founded on June 22, 1979, with ten stockholders. Included were three of the four hackers who had worked on Zork: Tim Anderson, Dave Lebling, and the newly minted Dr. Marc Blank (commuting from his new medical residency in Pittsburgh). There were also five other current or former DMG hackers: Mike Broos, Scott Cutler, Stu Galley, Joel Berez, and Chris Reeve. And then there was Vezza himself and even Licklider, who agreed to join in the same sort of advisory role he had filled for the DMG back at MIT. Each person kicked in whatever funding he could afford, ranging from $400 to $2000, and received an appropriate percentage of the new company’s stock in return. Total startup funds amounted to $11,500. The name was necessarily nondescript, considering that no one knew quite what (if anything) the company would eventually do. The fractured, futuristic compound name was much in vogue amongst technology companies of the time — Microsoft, CompuWare, EduWare — and Infocom just followed the trend in choosing the name “least objectionable to everyone.”

As should be clear from the above, Infocom did not exactly begin under auspicious circumstances. I’d call them a garage startup, except that they didn’t even have a garage. Infocom would exist for some months as more of a theoretical company in limbo than an actual business entity. It didn’t even get its first proper mailing address — a P.O. Box — until March of 1980. Needless to say, no one was quitting their day jobs as they met from time to time over the following months to talk about what ought to come next. In August, Mike Broos had already gotten bored with the endeavor and quit, leaving just nine partners. Everyone agreed that they needed something they could put together relatively quickly to sell and really get the company off the ground. More ambitious projects could then follow. But what could they do for that first project?

The hackers trolled through their old projects from MIT, looking for ideas. They kept coming back to the games. There was that Trivia game, but it wouldn’t be practical to store enough questions on a floppy disk to make it worthwhile. More intriguing was the Maze game. Stand-up arcades were booming at the time. If Infocom could build a version of Maze for arcades, they would have something unprecedented. Unfortunately, getting there would require a huge, expensive hardware- as well as software-engineering project. The Infocom partners were clever enough, but they were all software rather than hardware hackers, and money was in short supply. And then of course there was Zork… but there was no way to squeeze a 1 MB adventure game into a 32 K or 48 K microcomputer. Anyway, Vezza wasn’t really comfortable with getting into the games business on any terms, fearing it could tarnish the company’s brand even if only used to raise some early funds and bootstrap the startup. So there was also plenty of discussion of other, more business-like ideas also drawn from the DMG’s project history: a document-tracking system, an email system, a text-processing system.

Meanwhile, Blank was living in Pittsburgh and feeling rather unhappy at being cut off from his old hacking days at MIT. Luckily, he did have at least one old MIT connection there. Joel Berez had worked with the DMG before graduating in 1977. He had spent the last two years living in Pittsburgh and working for his family’s business (which experience perhaps influenced the others to elect him as Infocom’s President in November of 1979). Blank and Berez made a habit of getting together for Chinese food (always the hacker’s staple) and talking about the old times. These conversations kept coming back to Zork. Was it really impossible to even imagine getting the game onto a microcomputer? Soon the conversations turned from nostalgic to technical. As they began to discuss technical realities, other challenges beyond even that of sheer computing capacity presented themselves.

Even if they could somehow get Zork onto a microcomputer, which microcomputer should they choose? The TRS-80 was by far the best early seller, but the Apple II, the Cadillac of the trinity of 1977, was beginning to come on strong now, aided by the new II Plus model and VisiCalc. Next year, and the year after that… who knew? And all of these machines were hopelessly incompatible with one another, meaning that reaching multiple platforms would seemingly entail re-implementing Zork — and any future adventure games they might decide to create — from scratch on each. Blank and Berez cast about for some high-level language that might be relatively portable and acceptable for implementing a new Zork, but they didn’t find much. BASIC was, well, BASIC, and not even all that consistent from microcomputer to microcomputer. There was a promising new implementation of the more palatable Pascal for the Apple II on the horizon, but no word of a similar system on other platforms.

So, if they wanted to be able to sell their game to the whole microcomputer market rather than just a slice of it, they would need to come up with some sort of portable data design that could be made to work on many different microcomputers via an interpreter custom-coded for each model. Creating each interpreter would be a task in itself, of course, but at least a more modest one, and if Infocom should decide to do more games after Zork the labor savings would begin to become very significant indeed. In reaching this conclusion, they followed a line of reasoning already well-trod by Scott Adams and Automated Simulations.
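The scheme they were converging on — compile the game once into a compact, machine-independent program, then hand-code only a small interpreter for each micro — can be sketched in miniature. Everything below (the opcodes, the toy program) is invented purely for illustration; it is not Infocom’s actual format:

```python
# A toy "virtual machine": the game itself is portable bytecode, and
# only this small interpreter need be rewritten for each new machine.
# The opcodes and sample program are invented for illustration.

PRINT, GOTO, HALT = 0, 1, 2

def run(strings, program):
    """Execute a list of (opcode, operand) pairs and collect output."""
    pc = 0          # program counter
    out = []
    while True:
        op, arg = program[pc]
        if op == PRINT:      # emit string number `arg`
            out.append(strings[arg])
            pc += 1
        elif op == GOTO:     # jump to instruction number `arg`
            pc = arg
        elif op == HALT:
            return out

strings = ["West of House", "You are standing in an open field."]
program = [(PRINT, 0), (GOTO, 2), (PRINT, 1), (HALT, 0)]
print(run(strings, program))
```

The same `program` and `strings` data would run unchanged on any machine with a working interpreter — which is precisely the labor savings the partners were after.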

But then there was still another problem: Zork currently existed only as MDL source, a language which of course had no implementation on any microcomputer. If they didn’t want to rewrite the entire game from scratch — and wasn’t the point of this whole exercise to come up with a product relatively quickly and easily? — they would have to find a way to make that code run on microcomputers.

They had, then, quite a collection of problems. We’ll talk about how they solved every one of them — and pretty brilliantly at that — next time.

 
 


Ken and Roberta

There are two prototypical kinds of “computer professionals” in the world. First there are the purist hackers, who dive into the depths of circuits, operating systems, and programming languages like explorers discovering new lands; it wasn’t by chance that away from the computer Will Crowther was a caver, nor that he now spends his time deep-sea scuba diving. For the purists the reward is in the thing itself, in learning to understand and navigate this binary wonderland and maybe, just maybe, someday making (or helping to make) something really, truly new and cool. The other group is made up of the careerists. These people end up in the field for a variety of reasons: because they need to earn a good living to support their family (no shame in that); because they’ve heard computers are cool and the next big thing (hello, Internet bubble); because they have a vision of society which requires computers as its enabler (hello, Steve Jobs); because they just want to get really, really rich (why, there’s Steve again hiding out in the corner hoping not to be noticed — hi!). One thing only binds this disparate group together: they are attracted to computers not by their intrinsic interest in the machines themselves but by externalities, by a vision of what the machines can do, whether for them or for others. The two groups often seem — and believe themselves to be — at odds with one another, but in truth they need each other. Witness the dynamic duo of Woz and Jobs that built the Apple II and got it to the masses. Or witness Ken and Roberta Williams, the power couple of 1980s adventure gaming.

Ken and Roberta married in 1972. He was just 18 at the time; she was 19. He was attending California State Polytechnic University, Pomona as a physics major, and failing; she was living at home and not doing much of anything. Contrary to what you might be thinking, there was no shotgun involved. He simply wanted Roberta in his life and was determined to have her there, permanently. Steven Levy writes that his words to her were simply, “We’re getting married, and that’s it.” She “didn’t fight it.” Right there you learn a lot about their two personalities.

Within a year or so of their marriage Ken, a restless, driven, somewhat aggressive young man with no real respect for or interest in higher education with its hierarchical structure and its abstract theorizing, could see he wasn’t going to make it as a physics major, much less a physicist. Roberta, meanwhile, was now pregnant. Ken needed a career, and he needed one quick.

In the early 1970s the institutional computer industry was nearing its peak, supplying mainframes and minicomputers by the thousands to businesses, universities, public and private schools, branches of government, and research installations. We’ve met several of the prominent companies already (IBM, DEC, HP), each serving their own core sectors of this huge market while competing with one another on the margins. Another was Control Data Corporation. Founded in 1957 by a group of refugees from an even earlier company, Sperry, CDC had by the early 1970s carved out a reputation for itself as a manufacturer of prestigious and expensive supercomputers of the type used for some of the most intensive scientific computing. The supercomputer market was, however, a small one, and so the bulk of CDC’s business was courtesy of its line of more plebeian mainframes that competed directly with IBM for corporate business. To carve out a place for itself against the larger company, CDC tried to stick to a “10% rule”: to make sure each of its systems was always 10% faster and 10% cheaper than the nearest equivalent IBM model. For a number of years this approach was very good to CDC, sufficiently so that the company opened a little trade school all its own to train future custodians of its systems. Armed with a $1500 student loan co-signed by a very concerned father-in-law, Ken entered Control Data Institute. In doing so he was conforming to a stereotype that remains with the computer industry to this day: the pure hackers go to universities and get computer-science degrees; the careerists go to trade schools and get certificates in something “practical.”

Indeed, the atmosphere at CDI promised nothing like the free-wheeling intellectual exploration of the computer-science labs at MIT or Berkeley. The emphasis was on pounding in the rote tasks and procedures needed to maintain and run the big, batch-processing mainframes of CDC at the banks and other large bureaucratic entities that housed them. And that suited Ken, hungry for a career in business, just fine. Where an MIT hacker might have seen intolerable drudgery, he saw money to be made. When he turned out to be pretty good at this computer stuff — even, within limits, to enjoy it — that just increased the earning potential.

After finishing at CDI, Ken spent the rest of the 1970s living a life that we more typically associate with the following decade, bouncing from company to company in search of ever better salaries while generally also juggling two or three independent consulting gigs on the side. With computers still mysterious, almost occult objects to most people, a fast-talking, energetic, and ambitious young man like Ken could go far with just the modicum of knowledge he had gained at CDI. As that knowledge increased and he became an ever better programmer and problem solver courtesy of the best teacher of all, experience, he seemed even more of a miracle worker, and found himself even more in demand. Ken, in other words, was becoming a pretty damn good hacker almost in spite of himself. But he always wanted more — a new hot tub, a bigger house, a nicer car, a place in the country — even as he dreamed of retiring young and bequeathing a fortune to his children. (These things would in fact happen, although not in the way Ken thought they would in the 1970s.) Ken made no apologies for his materialism. “I guess greed,” he later told Levy, “would summarize me better than anything. I always want more.”

When the first kit computers that one could build in one’s home appeared in 1975, Ken barely noticed. There was no real money to be made in them, he believed, unlike his big, boring mainframes. When the trinity of 1977 marked the arrival of a PC you didn’t need a soldering iron to assemble, he likewise paid no attention. It was not until a couple of years later that the beginning of a real, paying market in professional business software, exemplified by pioneering applications like VisiCalc and WordStar, made Ken begin to pay attention to the little “toy” machines. When he finally bought an Apple II in January of 1980, it was for a very specific purpose.

At the time there were only two real language possibilities for Apple programmers: they could use BASIC, which was easy to learn and get started with but quickly became a nightmare when trying to structure large, complex programs; or assembly language, which gave the ultimate in precise control over the hardware but was well-nigh impenetrable for the uninitiated, tedious in the micro-management it required, and just as bereft of structure. Ken saw an opportunity for a more sophisticated high-level language, one designed to be used by serious programmers creating complex software. Specifically, he wanted to bring FORTRAN, as it happens the implementation language of the original Adventure (not that Ken likely knew this or cared), to the little Apple II. With that purpose in mind, he registered a company of his own, choosing to call it On-Line Systems, a name fairly typical of the vaguely futuristic, vaguely compound, but essentially meaningless names (Microsoft, anyone?) that were so common in the era.

And what was Roberta doing during these years? Well, she was raising the Williams’ two children and happily (at least to external observers) playing the role of housewife and homemaker. She had always been a painfully shy, passive personality who by her own admission “could hardly make a phone call.” If Ken seemed to already be living in the frenetic 1980s rather than the mellow 1970s, Roberta seemed a better match for the 1950s, the doting wife who took care of the children, made sure everyone in the family had a good breakfast, lunch, and dinner, and meekly entrusted the big decisions and the earning of a living to the man of the house. That makes what happened next doubly surprising.

Shortly before Ken bought that first Apple, and while the Williams’ second son was just eight months old, Ken happened to have a remote terminal at the house for one of his gigs. The mainframe to which it could connect had on it a copy of Adventure, which by now had been ported to a variety of other platforms beyond the PDP-10. Ken called Roberta over to have a look at what he regarded as nothing more than a curiosity. Roberta, however, was immediately transfixed. “I started playing and kept playing it. I had a baby at the time, Chris was eight months old; I totally ignored him. I didn’t want to be bothered. I didn’t want to stop and make dinner.” As Ken wondered what had become of his dutiful wife, Roberta stayed up most of the night playing, then lay awake in bed working through the puzzles in her mind. It was no doubt a relief to everyone when she finally finished the game after a month of effort.

But the respite didn’t last long. After Ken brought the Apple II into the home, it didn’t take Roberta long to learn about the works of Scott Adams. Soon she was back to obsessively playing again. But then another thought began to crowd out the conundrums of the games: what if she could make a text adventure of her own? She was turning the most inspirational corner I know, imagining herself as a creator rather than a passive consumer. Inspired mostly by Agatha Christie’s novel Ten Little Indians and the board game Clue, she began to sketch ideas for a text adventure as a cozy murder mystery, a genre that the form had not yet tackled. When she was pretty far along, she took a deep breath and laid out her ideas to Ken.

The story concept was certainly innovative, but it wasn’t the sort of innovation that would immediately appeal to a guy like Ken, with little interest in game design in the abstract. He was rather interested in products he could sell, operating intuitively by a rule he would later, for better and perhaps sometimes for worse, codify and articulate regularly: “Games have to have ‘WOW-value.’ If you don’t say ‘wow’ when someone describes the game to you, or you see it from 10 feet away, there’s no reason to market the game.” At first, caught up in his FORTRAN software and his prior experience of computers only as serious tools of business, he was dismissive of Roberta’s little project. But as she persisted, and as he perhaps began to notice that companies like Adventure International were growing rapidly and making real money just like the “serious” software houses, he began to reconsider. Still, he needed something special, needed an angle to help their little game stand out from the likes of the established line of Scott Adams games.

He began to think about the Apple II, with its comparatively cavernous 48 K of RAM, its fast and reliable disk drives, and its bitmap graphics capability. What if he designed their game around the unique capabilities of that machine, instead of taking the portable lowest-common-denominator approach of Adams? And then came the brainstorm: he could use the Apple’s hi-res mode to include pictures with the text. That would certainly make their game stand out. Pretty soon FORTRAN was forgotten, and work on Mystery House (the first of a whole line of On-Line Systems “Hi-Res Adventures”) had begun in earnest. The husband-and-wife team were not that far removed from Woz and Jobs: Roberta designed the game out of an inherent fascination with the thing itself, while Ken enabled her efforts, providing the tools and support she needed to bring her vision to life and, soon enough, finding ways to sell that vision to the masses.

 
 


The Apple II

Steve Jobs’s unique sense of design and aesthetics has dominated every technology project he’s led following the Apple II — for better (the modern Macintosh, the iPhone, the iPod, the iPad) or worse (the Apple III) or somewhere in between (the original 1984 Macintosh, the NeXT workstations). The Apple II, though, was different. While Jobs’s stamp was all over it, so too was the stamp of another, very different personality: Steve Wozniak. The Apple II was a sort of dream machine, a product genuinely capable of being very different things to different people, for it unified Woz’s hackerish dedication to efficiency, openness, and possibility with Jobs’s gift for crafting elegant experiences for ordinary end users. The two visions it housed would soon begin to pull violently against one another, at Apple as in the computer industry as a whole, but for this moment in time, in this machine only, they found a perfect balance.

To call Jobs a mediocre engineer is probably giving him too much credit; the internals of the Apple II were all Woz. Steven Levy describes his motivation to build it:

It was the fertile atmosphere of Homebrew that guided Steve Wozniak through the incubation of the Apple II. The exchange of information, the access to esoteric technical hints, the swirling creative energy, and the chance to blow everybody’s mind with a well-hacked design or program… these were the incentives which only increased the intense desire Steve Wozniak already had: to build the kind of computer he wanted to play with. Computing was the boundary of his desires; he was not haunted by visions of riches and fame, nor was he obsessed by dreams of a world of end users exposed to computers.

When you open an Apple II, you see a lot of slots and a lot of empty space.

All those slots were key to Woz’s vision of the machine as a hacker’s ultimate plaything; each was an invitation to extend it in some interesting way. Predictably, Jobs was nonplussed by Woz’s insistence on devoting all that space to theoretical future possibilities, as this did not jibe at all with his vision of the Apple II as a seamless piece of consumer electronics to be simply plugged in and used. Surely one or two slots would be more than sufficient, he bargained. Jobs, by far the more forceful personality of the two, normally won disputes like this — but this time Woz uncharacteristically held his ground and got his slots.

Lucky that he did, too. Within months hackers, third-party companies, and Apple itself began finding ways to fill all of those slots — with sound boards, 80-column video boards, hard-disk and printer interfaces, modems, co-processor and accelerator cards, mouse interfaces, higher resolution graphics boards, video and audio digitizers, ad infinitum. The slots, combined with Woz’s dogged insistence that every technical nuance of his design be meticulously documented for the benefit of hackers everywhere, transformed the Apple II from a single, static machine into a dynamic engine of possibility. They are the one feature that, more than anything else, distinguished the Apple II from its contemporaries the PET and TRS-80, and allowed it to outlive those machines by a decade. Within months of the Apple II’s release, even Jobs would have reason to thank Woz for their existence.

All of the trinity of 1977 initially relied on cassette tapes for storage. Both the PET and TRS-80 in fact came with cassette drives as standard equipment, while the Apple II included only a cassette interface, to which the user was expected to connect her own tape player. A few months’ experience with this storage method, the very definition of balky, slow, and deeply unreliable, convinced everyone that something better was needed if these new machines were to progress beyond being techie toys and become useful for any sort of serious work. The obvious solution was the 5 1/4 inch floppy-disk technology recently devised by a small company called Shugart Associates. Woz soon set to work, coming up with a final design that engineers who understand such things still regard with reverence for its simplicity, reliability, and efficiency. The product, known as the Disk II, arrived to market in mid-1978 for about $600, vastly increasing the usability and utility of the Apple II. Thanks to the expandability Woz had designed into the Apple II from the start, the machine was able to incorporate the new technology effortlessly. Even at $600, a very competitive price for a floppy-disk system at the time, Woz’s minimalist design aesthetic combined with the Apple II’s amenability to expansion meant that Apple made huge margins on the product; in West of Eden, Frank Rose claims that the Disk II system was ultimately as important to Apple’s early success as the Apple II itself. The PET and TRS-80 eventually got floppy-disk drives of their own, but only in a much uglier fashion; a TRS-80 owner who wished to upgrade to floppy disk, for instance, had to first buy Radio Shack’s bulky, ugly, and expensive “expansion interface,” an additional big box containing the slots that were built into the Apple II.

Another killer app enabled by the Apple II’s open architecture had a surprising source: Microsoft. In 1980, that company introduced its first hardware product, a Zilog Z80 CPU on a card which it dubbed the SoftCard. An Apple II so equipped had access to not only the growing library of Apple II software but also to CP/M and its hundreds of business-oriented applications. It gave Apple II owners the best of both worlds for just an additional $350. Small wonder that the card sold by the tens of thousands for the next several years, until the gradual drying up of CP/M software — a development, ironically, for which Microsoft was responsible with its new MS-DOS standard — made it irrelevant.

While most 6502-based computers were considered home and game machines of limited “serious” utility, products like the SoftCard and the various video cards that let the machine display 80 columns of text on the screen — an absolute requirement for useful word processing — lent the Apple II the reputation of a machine as useful for work as it was for play. This reputation, and the sales it undoubtedly engendered, were once again ultimately down to all those crazy slots. In this sense the Apple II much more closely resembled the commodity PC design first put together by IBM in 1981 than it did any subsequent design from Apple itself.

Another significant advantage that the Apple II had over its early competitors was its ability to display bitmap graphics. The TRS-80 and the PET, you may recall, were essentially limited to displaying text only. While it was possible to draw simple pictures using the suite of simple shape glyphs these machines provided in addition to traditional letters and punctuation (see my post on Temple of Apshai on the TRS-80), this technique was an inevitably limited one. The Apple II, however, provided a genuine grid of 280×192 individually addressable pixels. Making use of this capability was not easy for the programmer, and it came with a host of caveats and restrictions. Just four colors were available on the original Apple II, for instance, and oddities of the design meant that a given pixel could not always be any desired color. These circumstances led to the odd phasing and color fringing that still makes an Apple II display immediately recognizable even today. Still, the Apple II was easily the graphical class of the microcomputer world in 1977. (I’ll talk a bit more about the Apple II’s graphical system and its restrictions when I look at some specific games in future posts.)
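Those color restrictions follow a rule simple enough to state in a few lines of code: on the original four-color machine, a lone lit pixel reads as violet or green depending on whether its column is even or odd, and two adjacent lit pixels blend into white. The sketch below models only that rule — it ignores the later orange/blue palette bit and other hardware subtleties, and the function name and color labels are my own:

```python
# Simplified model of original Apple II hi-res color behavior.
# Ignores the high "palette" bit (orange/blue) and other quirks;
# names and labels here are illustrative, not from any Apple spec.

WIDTH, HEIGHT = 280, 192  # the hi-res pixel grid

def pixel_color(x, left_lit, right_lit):
    """Apparent color of a lit pixel at column x, given its neighbors."""
    if left_lit or right_lit:
        return "white"    # adjacent lit pixels blend to white
    return "violet" if x % 2 == 0 else "green"

# A one-pixel-wide vertical line can therefore only ever be violet
# or green, never white -- exactly the kind of restriction that
# produced the fringing artists had to design around.
print(pixel_color(10, False, False))  # even column → violet
print(pixel_color(11, False, False))  # odd column → green
print(pixel_color(11, True, False))   # has a lit neighbor → white
```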

So, Woz was all over the Apple II, in these particulars as well as many others. But where was Jobs?

He was, first of all, performing the role he always had during his earlier projects with Woz, that of taskmaster and enabler. Left to his own devices, Woz could lose himself for weeks in the most minute and ultimately inconsequential aspects of a design, or could drift entirely off task when, say, some new idea for an electronically enabled practical joke struck him. Jobs therefore took it upon himself to constantly monitor Woz and the pair of junior engineers who worked with him, keeping them focused and on schedule. He also solved practical problems for them in that inimitable Steve Jobs way. When it became apparent that it was going to be very difficult to design the RF modulator needed for hooking the computer up to a television (dedicated monitors at the time were a rare and pricey luxury) without falling afoul of federal RF interference standards, he had Woz remove this part from the computer entirely, instead passing the specifications on to a company called M&R Electronics. When sold separately and by another company, the RF modulator did not need to meet the same stringent standards. Apple II owners would simply buy their RF modulators separately for a modest cost, and everyone (most of all M&R, who were soon selling the little gadgets by the thousands) could be happy.

Such practical problem-solving aside, Jobs’s unique vision was also all over the finished product. It was Jobs who insisted that Woz’s design be housed within a sleek, injection-molded plastic case that looked slightly futuristic, but not so much as to clash with the decor of a typical home. It was Jobs who gave the machine its professional appearance, with its screws all hidden away underneath and with the colorful Apple logo (a reference to the machine’s unique graphical capabilities) front and center.

Jobs, showing a prejudice against fan noise that has continued with him to the present day, insisted that Woz and company find some other way to cool the machine, which feat they managed via a system of cleverly placed vents. And it was Jobs who gave the machine its unique note of friendly accessibility, via a sliding top giving easy access to the expansion slots and, a bit further down the line, unusually complete and professional documentation in the form of big, glossy, colorful manuals. Indeed, much of the Apple II ownership experience was not so far removed from the Apple ownership experience of today. Jobs worked to make Apple II owners feel themselves part of an exclusive club, a bit more rarefied and refined than the run-of-the-mill PET and TRS-80 owners, by sending them freebies and updates (such as the aforementioned new manuals) from time to time. And just like the Apple of today, he was uninterested in competing too aggressively on price. If an Apple II cost a bit more — actually, a lot more, easily twice the price of a PET or TRS-80 — it was extra money well spent. Besides, what adds an aura of exclusivity to a product more effectively than a higher price? What we are left with, then, is a more expensive machine, but also an unquestionably better machine than its competitors, and — a couple of years down the road at least, once its software library started to come up to snuff — one uniquely suited to perform well in many different roles for many different people, from the hardcore hacker to the businessman to the teacher to the teenage videogamer.

When the Apple II made its debut at the first West Coast Computer Faire in April of 1977, Jobs’s promotional instincts were again in evidence. In contrast to the other displays, which were often marked with signs hand-drawn in black marker, Apple’s had a back-lit plexiglass number illuminating the company’s new logo; it still looks pretty slick even today.

In light of the confusion that still exists over who deserves the credit for selling the first fully assembled PC, perhaps we should take a moment to look at the chronology of the trinity of 1977. Commodore made the first move, showing an extremely rough prototype of what would become the PET at the Winter Consumer Electronics Show in January of 1977. It then proceeded to show progressively less rough prototypes at the Hanover Messe in March (a significant moment in the history of European computing) and the West Coast Computer Faire. However, the design was not fully finalized until July, and the first units did not trickle into stores until September. Even then, PETs remained notoriously hard to come by until well into 1978, thanks to the internal chaos and inefficiency that seemed endemic to Commodore throughout the company’s history. (Ironically, Jobs and Woz had demonstrated the Apple II technology privately to Commodore as well as Atari in 1976, offering to sell it to them for “a few hundred thousand” and positions on staff. They were turned down; Commodore, immensely underestimating the difficulty of the task, decided it could just as easily create a comparable design of its own and begin producing it in just a few months.) The TRS-80, meanwhile, was not announced until August of 1977, but appeared in Radio Shack stores in numbers within weeks of the PET to become by far the biggest early seller of the trinity. And the Apple II? Woz’s machine was in a much more finished state than the PET at the West Coast Computer Faire, and began shipping to retailers almost right on schedule in June of 1977. Thus, while Commodore gets the credit for being the first to announce a pre-built PC, Apple was the first to actually follow through and ship one as a finished product. Finally, Radio Shack can have the consolation prize of having the first PC to sell in large numbers — 100,000 in just the last few months of 1977 alone, about twice the quantity of all other kit or preassembled microcomputers sold over the course of that entire year.

Actually, that leads to an interesting point: if Apple’s status as the maker of the first finished PC is secure, it’s also true that the company’s rise was not so meteoric as popular histories of that period tend to suggest. As impressive as both the Apple II and Jobs’s refined presentation of it were, Apple struggled a bit to attract attention at the West Coast Computer Faire in the face of some 175 competing product showcases, many of them much larger if not more refined than Apple’s. Byte magazine, for instance, did not see fit to so much as mention Apple in its extensive writeup of the show. Even after the machine began to ship, early sales were not breathtaking. Apple sold just 650 Apple IIs in 1977, and fought for oxygen against Radio Shack with its huge distribution network of stores and immense production capacity. The next year was better (7600 sold), the next even better (35,000 sold, on the strength of increasingly robust software and hardware libraries). Still, the Apple II did not surpass the TRS-80 in total annual sales until 1983, on the way to its peak of 1,000,000 sold in 1984 (the year that is, ironically, immortalized as the Year of the Macintosh in the popular press).

Apple released an enhanced version of the II in 1979, the Apple II Plus. This model usually shipped with a full 48 K of RAM, a very impressive number for the time; the original Apple II had initially had only 4 K as standard equipment. Also notable was the replacement in ROM of the original Integer BASIC, written by Woz himself years before, back when he first started attending Homebrew Computer Club meetings, with the so-called AppleSoft BASIC. AppleSoft corrected a pivotal flaw in the original Integer BASIC, its inability to deal with floating-point (i.e., decimal) numbers. This much more full-featured implementation was provided, like seemingly all microcomputer BASICs of the time, by Microsoft. (As evidenced by AppleSoft BASIC and products like the SoftCard, Microsoft often worked quite closely with Apple during this early period, in marked contrast to the strained relationship the two companies would develop in later years.) Woz also tweaked the display system on the II Plus to manage 6 colors in hi-res mode instead of just 4.

By 1980, then, the Apple II had in the form of the II Plus reached a sort of maturity, even though holes — most notably, a lack of support for lower-case letters without the purchase of additional hardware — remained. It was not the best-selling machine of 1980, and certainly far from the cheapest, but in some ways still the most desirable. Woz’s fast and reliable Disk II design coupled with the comparatively cavernous RAM of the II Plus and the machine’s bitmap graphics capabilities gave inspiration for a new breed of adventure and RPG games, larger and more ambitious than their predecessors. We’ll begin to look at those developments next time.

In the aftermath of even the Apple II’s first, relatively modest success, Jobs began working almost immediately to make sure Apple’s follow-up products reflected only his own vision of computing, gently easing Woz out of his central position. He began to treat Woz as something of a loose cannon to be carefully managed after Woz threatened Apple’s legendary 1980 IPO by selling or even giving away chunks of his private stock to various Apple employees who he just thought were doing a pretty good job and deserved a reward, gosh darn it. The Apple III, also introduced in 1980, was thus the product of a more traditional process of engineering by committee, with Woz given very little voice in the finished design. It was also Apple’s first failure, largely due to Jobs’s overweening arrogance and refusal to listen to what his engineers were telling him. Most notably, Jobs insisted that the Apple III, like the Apple II, ship without a cooling fan. This time, no amount of clever engineering hacks could prevent machines from melting by the thousands. Perhaps due to the deeply un-Jobs-ian, hackerish side of the Apple II’s personality, Jobs tried repeatedly to kill the machine, with little success; it remained the company’s biggest seller and principal source of revenue when he resigned from Apple in a huff following an internal dispute in 1985.

In February of 1981, Woz crashed the small airplane he had recently learned how to fly, suffering serious head trauma. This event marked the end of his truly cutting-edge engineering years, at Apple or anywhere else. Perhaps he took the crash as a wake-up call to engage with all those other wonders of life he’d been neglecting during the years he’d spent immersed in circuits and code. It’s also true, though, that the sort of high-wire engineering Woz did throughout the 1970s (not only with Apple and privately, but also with Hewlett Packard) is very mentally intense, and possibly Woz’s brain had been changed enough by the experience to make it no longer possible. Regardless, he began to interest himself in other things: going back to university under an assumed name to finish his aborted degree, organizing two huge outdoor music and culture festivals (The “US Festivals” of 1982 and 1983), developing and trying to market a universal remote control. He is still officially an employee of Apple, but hasn’t worked a regular shift in the office since February of 1987. He wrote an autobiography (with the expected aid of a professional co-author) a few years ago, maintains a modest website, contributes to various worthy causes such as the Electronic Frontier Foundation, and, most bizarrely, made a recent appearance on Dancing with the Stars.

Asked back in 2000 if he considered himself an entrepreneur, Woz had this to say:

Not now. I’m not trying to do that because I wouldn’t put 20 hours a day into anything. And I wouldn’t go back to the engineering. The way I did it, every job was A+. I worked with such concentration and focus and I had hundreds of obscure engineering or programming things in my head. I was just real exceptional in that way. It was so intense you could not do that for very long—only when you’re young. I’m on the board of a couple of companies that you could say are start-ups, so I certainly support it, but I don’t live it. The older I get the more I like to take it easy.

Woz has certainly earned the right to “take it easy,” but there’s something that strikes me as a little sad about his post-Apple II career, as the story of a man who never quite figured out what to do for a second act in life. And the odd note of subservience that always marked his relationship with Jobs is still present. From the same interview:

You know what, Steve Jobs is real nice to me. He lets me be an employee and that’s one of the biggest honors of my life. Some people wouldn’t be that way. He has a reputation for being nasty, but I think it’s only when he has to run a business. It’s never once come out around me. He never attacks me like you hear about him attacking other people. Even if I do have some flaky thinking.

It’s as if Woz, God bless his innocence, still does not understand that he was really treated rather shabbily by Jobs, and that, in a very real sense, it was he that made Jobs. In that light, it seems little enough to expect that Jobs refrain from hectoring him as he would one of his more typical employees.

As for Jobs himself… well, you know all about what became of him, right?

 

Posted by on September 12, 2011 in Digital Antiquaria, Interactive Fiction

 


Jobs and Woz

As I write this the news media and the blogosphere are just tailing off from an orgy of commentary and retrospectives triggered by an obviously ill Steve Jobs stepping down at last from his post as Apple’s CEO. The event marks the end of an era. With Bill Gates having retired from day-to-day involvement with Microsoft a few years ago, the two great survivors from those primordial computing days of the late 1970s and early 1980s no longer run the iconic companies that they began to build all those years ago.

For many, Bill and Steve embodied two fundamentally opposing approaches to technology. On one side was Gates, the awkwardly buttoned-down overachiever who never even as a multi-billionaire seemed quite comfortable in his own skin, wielding spreadsheets and databases while obsessing over Microsoft’s latest financial reports. On the other was Jobs, the epitome of California cool who never met a person he couldn’t charm, wielding art packages and music production software while talking about how technology could allow us to live better, more elegant lives. These attitudes were mirrored in the products of their respective companies. In In the Beginning Was the Command Line, Neal Stephenson compared the Macintosh with a sleek European sedan, while Windows was a station wagon which “had all the aesthetic appeal of a Soviet worker housing block; it leaked oil and blew gaskets, and [of course] it was an enormous success.” These contrasts — or should we say caricatures? — run deep. They were certainly not lost on Apple itself when it made its classic series of “I’m a Mac / I’m a PC” commercials to herald its big post-millennial Jobs-helmed comeback.

Even in the late 1970s, when he was a very young man, Jobs had an intuitive feeling for the way that technology ought to work and an aesthetic eye that was lacking in just about every one of the nerds and hackers that made up the rest of the early microcomputer industry. Almost uniquely among his contemporaries, Jobs had a vision of where all this stuff could go, a vision of a technological future that would appeal not just to the PC guy of those commercials but also to the Mac guy. The industry desperately needed a guy like Jobs — good-looking, glib, articulate, with an innate sense of aesthetics and design — to serve as an ambassador between the hackers and ordinary people. Jobs was the kind of guy who might visit a girlfriend’s home for dinner and walk away with a check to fund his startup business from the father and a freshly baked cake from the mother. He made all these hackers with their binary code and their soldering irons seem almost normal, and almost (if only by transference) kind of cool.

There’s a trope that comes up again and again amongst the old-timers who remember those days and the histories that are written of them: that it was a fundamentally innocent time, when hackers hacked just for the joy of it and accidentally created the modern world. In Triumph of the Nerds, Jobs’s partner in founding Apple, Steve Wozniak, said:

“It was just a little hobby company, like a lot of people do, not thinking anything of it. It wasn’t like we both thought it was going to go a long ways. We thought we would both do it for fun, but back then there was a short window in time where one person who could sit down and do some neat, good designs could turn them into a huge thing like the Apple II.”

I believe Wozniak, a hacker’s hacker if ever there was one. To imagine that an amity of hacking bliss united those guiding the companies that made up the early industry, though, is deeply mistaken. As shown by the number of companies and computer models that had already come and gone by even 1982, the PC industry was a cutthroat, hyper-competitive place.

In the same video, Jobs has this to say about those days:

“I was worth over a million dollars when I was 23, and over ten million dollars when I was 24, and over a hundred million dollars when I was 25, and it wasn’t that important, because I never did it for the money.”

In contrast to Wozniak’s comments, there’s a note of disingenuousness here. It seems suspicious that, for someone for whom finances are so unimportant, Jobs has such a specific recollection of his net worth at exact points in time; something tells me Wozniak would be challenged to come up with similar figures. I mentioned once before on this blog how Jobs cheated best friend Wozniak out of a $5000 bonus for designing Breakout on his behalf for Atari. Jobs was of course a very young man at the time, and we’d all like to take back things from our youth, but this moment always struck me as one of those significant markers of character that says something about who a person fundamentally is. Wozniak might dismiss the incident in his autobiography by saying, “We were just kids, you know,” but I can’t imagine him pulling that stunt on Jobs. In another of those markers of character, Wozniak was so honest that, upon designing the computer that would come to be known as the Apple I and founding a company with Jobs to market it, he suddenly recalled the employment contract he had signed with Hewlett Packard which said that all of his engineering work belonged to HP during the term of his employment, whether created in the office or at home, and tried to give his computer design to HP. Much to Jobs’s relief, HP just looked at it bemusedly and told Wozniak to knock himself out trying to sell the thing on his own.

In the case of Jobs, when we drill down past the veneer of California cool and trendy Buddhism we find a man as obsessively competitive as Gates; both men were the most demanding of bosses in their younger days, who belittled subordinates and deliberately fomented discord in the name of keeping everyone at their competitive best. Gates, however, lacked the charm and media savvy that kept Jobs the perpetual golden boy of technology. Even when he was very young, people spoke about the “reality distortion field” around Jobs that seemed to always convince others to see things his way and do his bidding.

And if Jobs isn’t quite the enlightened New Man whose image he has so carefully crafted, there’s a similarly subtle cognitive dissonance about his company. Apple’s contemporary products are undeniably beautiful in both their engineering and their appearance, and they’re even empowering in their way, but this quality only goes so far. To turn back to Stephenson again, these sleek machines have “their innards hermetically sealed, so that how they work is something of a mystery.” Empowering they may be, but only on Apple’s terms. In another sense, they foster dependence — dependence on Apple — rather than independence. And then, of course, all of that beauty and elegance comes at a premium price, such that they become status symbols. The idea of a computing device, whatever its price, becoming a status symbol anywhere but inside the community of nerds would of course have been inconceivable in 1980 — so that’s progress of a sort, and largely down to Jobs’s influence. Still, it’s tempting sometimes to compare the sealed unknowability of Apple’s products with the commodity PCs that once allowed the “evil” Bill Gates to very nearly take over the computing world entirely. A Windows-based PC may have been a domestic station wagon or (in another popular analogy) a pickup truck, but like those vehicles it was affordable to just about everyone, and it was easy to pop the hood open and tinker. Apple’s creations required a trip to the metaphorical exotic car dealership just to have their oil changed. A Macintosh might unleash your inner artist and impress the coffee-house circuit, but a PC could be purchased dirt cheap — or assembled from cast-off parts — and set up in the savannah to control the pumps that keep a village supplied with drinking water. There’s something to be said for cheap, ubiquitous, and deeply uncool commodity hardware; something to be said for the idea of (as another microcomputer pioneer put it) “computers for the masses, not the classes.”

A mention of Linux might seem appropriate at this juncture, as might a more fine-grained distinction between hardware and software, but these metaphors are already threatening to buckle under the strain. Let’s instead try to guide this discussion back to Jobs and Woz, an odd couple if ever there was one.

Wozniak was a classic old-school hacker. Even during high school in the late 1960s, he fantasized about computers the way that normal teenagers obsessed over girls and cars. His idea of fun was to laboriously write out programs in his notebooks, programs which he had no computer to run, and to imagine them in action. While other boys hoarded girlie magazines, Woz (as everyone called him) collected manuals for each new computer to hit the market — sometimes so he could redesign them better, more efficiently, in his imagination.

In 1970, during a working sabbatical of sorts from university, the 20-year-old Woz met the 15-year-old Steve Jobs. Despite the age difference, they became fast friends, bonding over a shared love of technology, music, and practical jokes. Soon they discovered another mutual obsession: phone phreaking, hacking the phone system to let one call long distance for free. The pair’s first joint business venture — instigated, as these sorts of things always were, by Jobs — was selling homemade “blue boxes” that could generate the tones needed to mimic a long-distance carrier.

Jobs was… not a classic old-school hacker. He was, outwardly at least, a classic hippie with a passion for Eastern philosophy and Bob Dylan, a “people person” with little patience for programming or engineering. Nevertheless, the reality distortion field allowed him to talk his way into a technician’s job at rising arcade-game manufacturer Atari. He even got Atari to give him a summer off and an airline ticket to India to do “spiritual research.” In spite of it all, though, the apparently clueless Jobs just kept delivering the goods. The reason, of course, was Woz, who by then was working full-time for Hewlett Packard during the day, then doing Jobs’s job for him by night. The dynamic duo’s finest hour at Atari was the arcade game Breakout. In what at least from the outside has all the markings of a classic codependent relationship, poor Woz was told that they had just four days to get the design done; actually, Jobs just wanted to get finished so he could jet off to attend the harvest at an apple orchard commune in Oregon. (You just can’t make some of this stuff up…) Woz met the deadline by going without sleep for four days straight, and did it using such an impossibly low number of chips that it ended up being unmanufacturable. Atari engineer Al Alcorn:

“Ironically, the design was so minimized that normal mere mortals couldn’t figure it out. To go to production, we had to have technicians testing the things so they could make sense of it. If any one part failed, the whole thing would come to its knees. And since Jobs didn’t really understand it and didn’t want us to know that he hadn’t done it, we ended up having to redesign it before it could be shipped.”

But Jobs made it to the apple festival, and he had that $5000 bonus — the one he neglected to tell Woz about — to spend there. Even in 1984 Woz still believed that he and Jobs had earned only $700 for a design that became the big arcade hit of 1976.

We can really only speculate about what caused Woz to put up with treatment like this — but speculation is fun, so let’s have at it. Woz was one of those good-hearted sorts who want to like and be liked, but who, due to some failure of empathy or just from sheer trying too hard, are persistently just enough out of sync in social situations to make everything a bit awkward. Woz always seemed to laugh a little bit too loud or too long, couldn’t quite sense when the time was right to stop reciting from his store of Polish jokes, didn’t recognize when his endless pranks were about to cross the line from harmless fun into cruelty. For a person like this the opportunity to hang out with a gifted social animal like Jobs must have been hard to resist, no matter how unequal the relationship might seem.

And it wasn’t entirely one way — not at all, actually. When Woz was hacking on the project that would become the Apple I, he lusted after a new type of dynamic RAM chip, but couldn’t afford any. Jobs just called up the manufacturer and employed the reality distortion field to talk them into sending him some “samples.” Jobs was Woz’s enabler, in the most positive sense; he had a genius for getting things done. In fact, in the big picture it is Woz that is in Jobs’s debt. One senses that Jobs would have made his mark on the emerging microcomputer industry even if he had never met Woz — such was his drive. To be blunt, Jobs would have found another Woz. Without Jobs, though, Woz would have toiled away — happily, mind you — in some obscure engineering lab or other his entire life, quietly weaving his miniaturized magic out of silicon, and retired with perhaps a handful of obscure patents to mark his name for posterity.

Unsurprisingly given their backgrounds and interests, Woz and Jobs were members of the famous Homebrew Computer Club, Woz from the very first meeting on March 5, 1975. There, the social hierarchy was inverted, and it was Woz with his intimate knowledge of computers that was the star, Jobs that was the vaguely uncomfortable outsider.

Woz designed the machine that became the Apple I just for fun. It was unusual within Homebrew in that it used the new MOS 6502 CPU rather than the Intel 8080 of the original Altair, for the very good reason that Woz didn’t have a whole lot of money to throw around and the 6502 cost $25 versus $175 for the 8080. The process was almost committee-driven; Woz, who had the rare and remarkable gift of being without ego when it came to matters of design, would bring his work-in-progress to each biweekly Homebrew meeting, explaining what he’d done, describing where he was having problems, and soliciting advice and criticism. What he ended up with was pretty impressive. The machine could output to a television screen, as opposed to the flashing lights of the Altair; it used a keyboard, as opposed to toggle switches; and it could run a simple BASIC interpreter programmed by Woz himself. Woz said he “designed the Apple I because I wanted to give it away free for other people. I gave out schematics for building my computer at the next meeting I attended.”

Steve Jobs put a stop to those dangerous tendencies. He stepped in at this point to convince Woz to do what he never would have done on his own: to turn his hacking project into a real product provided by a real company. Woz sold his prize HP calculator and Jobs his Volkswagen van (didn’t someone once say that stereotypes are so much fun because they’re so often true?) to form Apple Computer on April 1, 1976. The Apple I was not a fully assembled computer like the trinity of 1977, but it was an intermediate step between the Altair and them; instead of a box of loose chips, you got a finished, fully soldered motherboard to build onto with your own case, power supply, keyboard, and monitor. The owner of an important early computer store, The Byte Shop, immediately wanted to buy 50 of them. Problem was, Jobs and Woz didn’t have the cash to buy the parts to make them. No problem; Jobs employed the reality distortion field to convince a wholesale electronics firm to give these two hippies tens of thousands of dollars in hardware in exchange for a promise to pay them in one month. Apple ended up selling 175 Apple Is over the next year, each assembled by hand in Jobs’s parents’ garage by Jobs and Woz and a friend or family member or two.

While that was going on, Woz was designing his masterpiece: the Apple II.

 

Posted by on September 9, 2011 in Digital Antiquaria, Interactive Fiction

 


Binning the Trash-80

The microcomputer landscape of 1980 looked very different than it had when the trinity of 1977 first hit the scene. The hackers and early adopters who first made the TRS-80 a success were a step closer to sane than the solder-iron-wielding crazies who had constructed Altairs in their garages out of a set of diagrams and a loose pile of chips, but only a step. Owning and operating a computer was still expensive and difficult, and the question on the lips of wives and girlfriends across the country — “But what is it really good for?” — did not have any particularly strong answers. By 1980, though, that was changing, sufficiently so in fact that segments of the population were beginning to purchase computers not out of fascination with the technology itself, but rather because of what the technology would allow them to accomplish. That was due to the work of all those early adopters, who hacked like mad to create useful things that would justify their time in the terms that matter most in a market economy, dollars and cents, and thus in turn buy them yet more time to hack.

The most celebrated of these early killer apps today, perhaps due to its having been featured on The Triumph of the Nerds documentary, is VisiCalc, the spreadsheet program whose basic approach is still echoed in the Microsoft Excel we all know and love (?) today. Introduced in late 1979, it gave accountants, small-business owners, and even home users compelling reasons to own a microcomputer — whether to calculate taxes or accounts receivable and payable, or just to keep the checkbook balanced. But there are other examples. The first crude word processing application was called The Electric Pencil; it predated even the trinity of 1977, appearing for the early kit computers in December of 1976. It took WordStar, however, to refine the concept into a program flexible and powerful enough to begin to replace the expensive specialized word-processing machines found on secretaries’ desks around the country upon its release in September of 1978. dBase, the first programmable relational database for microcomputers, made its first appearance in 1979. And while they were seldom openly mentioned as a reason to buy these early computers, games were always present as a sort of guilty pleasure and secret motivator. They were still crude and limited in 1980, but growing by leaps and bounds in both ambition and sales as the first specialized entertainment publishers such as Adventure International got off the ground, and as new microcomputers much more suited for play began to appear in the wake of the Atari VCS game-console sensation which began sweeping the country in earnest during the holiday season of 1979.

Ah, yes, the new machines. As new applications showed how useful and/or entertaining computers could be in both businesses and homes and as their sales figures responded, plenty of new players came rushing into the market. Some, such as the Exidy Sorcerer and Texas Instruments 99/4, found little traction, becoming mere historical footnotes and modern collector’s items. Others, though, heralded major new technological and cultural developments. We’ll get to these at some point, but for this post let’s see if we can bring some sort of order — i.e., some categories — to the crazy quilt of microcomputers available by 1980. Oddities like the TI 99/4 (the world’s first 16-bit microcomputer based on a CPU of TI’s own design) aside, most computers were based on one of two 8-bit CPU architectures.

First there was the Intel 8080, the chip at the heart of the original Altair kit computer and its contemporaries, and the Z80, a mostly compatible CPU from Zilog that nevertheless offered a more flexible, efficient design; this, you may recall, was the chip Tandy chose for the TRS-80. Apart from the TRS-80, which for better and (as we shall shortly see) for worse remained largely its own thing, these machines generally ran the first widespread platform-agnostic operating system for microcomputers, CP/M (Control Program for Microcomputers). Developed by Gary Kildall at the very dawn of the microcomputer era and published by his company Digital Research, CP/M was the MS-DOS — or, if you like, the Microsoft Windows — of this early era, a de facto if not official standard that allowed machines from a huge variety of makers to share software and information. (There is also a more tangible link between CP/M and MS-DOS: depending on whom you talk to, the original MS-DOS from 1981 was either “inspired by” CP/M or an outright unauthorized reverse engineering of the earlier OS. But that subject will doubtlessly come up again in later posts…) For a computer to run CP/M, it required two things: an Intel 8080 or Zilog Z80 CPU, and a certain standard bus design for communicating with its disk drives and other peripherals, known as the S-100 — a design which had its origins as far back as the original Altair. (UPDATE: As Jonno points out in the comments, an S-100 bus was not a strict requirement for CP/M.)

CP/M and the Intel- and Zilog-based architectures on which it ran became the standard environment for “serious” microcomputing of the late 1970s and early 1980s, the kind done in corporate offices and small businesses. WordStar and dBase were both born there, and VisiCalc, although conceived on the Apple II, quickly found its way there. CP/M had, however, no graphics capabilities at all and only limited support for real-time operations, making it problematic as a platform for many types of games and even educational software. It also relied upon the existence of at least one disk drive on its host platform at a time when such devices tended to be very pricey. These factors made CP/M and the 8080 a poor fit for the less expensive, usually cassette-based computers generally chosen by home users. That market was dominated by another hardware architecture, that of the MOS Technology 6502 CPU.

When the 6502 first appeared in 1975, MOS was a tiny independent chip-maker, but that changed when Commodore purchased the entire company in late 1976. This move, one of the smartest that Commodore head Jack Tramiel ever made, left Commodore in the enviable position of making money not only when it sold its own machines such as the PET, but also every time a rival purchased 6502s for its own products. Said rivals initially included only Apple with its Apple II line and a number of kit-based computers from various small manufacturers, but that would change soon enough.

A CP/M equivalent for 6502-based machines was never developed, meaning that they remained largely incompatible with one another. BASIC did serve as a partial lingua franca, as virtually all of these machines housed a version of Microsoft’s industry-standard BASIC in their ROMs, but there was enough variation from implementation to implementation that most programs needed at least some customizing. And of course when one progressed beyond BASIC to assembly language to take full advantage of everything a 6502-based machine had to offer — especially graphics and sound, which capabilities varied wildly from model to model — one was faced with essentially coding everything from scratch for each machine one wished to support. Crazy times — although with the ever-increasing proliferation of incompatible mobile computing devices in our own times it’s starting to look like 1980 all over again.

What the 6502 world lost in compatibility it gained in flexibility. Freed from the need to work through a comparatively complex and inefficient OS like CP/M, programmers could code right to the metal on these machines, manipulating every element of the hardware directly for maximum efficiency. Further, the 6502-based machines, being generally aimed at the home and education markets, tended to feature the graphics and sound capabilities that were missing from the bland, textual world of CP/M; the Apple II, for instance, was the only member of the trinity of 1977 with support for proper bitmap graphics, a subject I’ll begin to discuss in more detail in my next post.

But now you might be wondering where all of this left the TRS-80, which fit neatly into neither of the two categories just described. Although the TRS-80 was built around the Z80 CPU, Radio Shack had chosen in the name of penny pinching not to implement the S-100 bus design. (UPDATE: As happens from time to time around these parts, this is not correct. Actually, the problem involved the memory map of the original TRS-80, in which ROM preceded RAM; a CP/M machine required the reverse. Thanks to Jonno for pointing this out in the comments.) This made CP/M a nonstarter. Despite being a huge success in its early years and still having the largest installed base of any microcomputer, the TRS-80’s future was, at least in retrospect, already clouded in 1980. Its incompatibility with CP/M left it cut off from the quickly growing base of serious business software found on that OS. In spite of the TRS-80’s relatively cheap price, Radio Shack’s reputation as purveyors of cheap junk for the masses did little to attract business users, and in a classic chicken-or-the-egg scenario this lack of business users discouraged developers from porting their products from CP/M to the little oddball Tandy machine. And in the other half of the microcomputer market, the 6502-dominated world of games machines and hobbyist computing, the TRS-80 was also looking like an increasingly poor fit with its almost complete lack of graphics and absolutely complete lack of sound. The arrival of the Atari 400 and 800, colorful 6502-based machines with superb graphics and sound for the time, and, a bit later in early 1981, the Commodore VIC-20, a much less capable machine in comparison but one nevertheless sporting color graphics and sound for an unprecedentedly low price, were particularly ominous signs.

While the wisdom of many of its moves is debatable, Tandy at least did not stand entirely still in the face of these developments. In fact, it released quite a blizzard of new machines, none of which came close to recapturing the market share the TRS-80 enjoyed in the late 1970s.

Tandy released a new machine called the TRS-80 Model 2 (the original TRS-80 being now retroactively renamed to the Model 1) in late 1979. The Model 2 was designed to capture the business computing market that was passing the Model 1 by; it sold with integrated disk drives and included bank-switchable ROM, thus allowing it to run CP/M. But it was also a much more expensive machine than the Model 1 and, most dismaying of all, completely incompatible with it. Thanks to Radio Shack’s usual lack of marketing acumen and genius for clunky, tacky-looking design as well as its high price, it was not a big success in the business market, while its incompatibility made it of little interest to existing Model 1 owners.

The Model 3, which appeared to replace the Model 1 in the summer of 1980, was rather forced on Radio Shack. The Model 1 had put out so much radio interference that, in an example of the boundless ingenuity that marked the early microcomputer era, people began writing programs to manipulate memory so as to make music using this interference along with a nearby transistor radio to pick it up. New FCC regulations for 1981 forced Radio Shack to build in proper RF shielding, and thus spoiled that particular kind of fun. In addition to fixing this issue, the Model 3 also sported a slightly faster version of the Z80 CPU and (hallelujah!) real lower-case letter support for both input and output amongst other modest improvements. Yet it did nothing to improve the Model 1’s meager display capabilities. And, in the one-step-forward two-steps-back dance that seemed to define Radio Shack, the Model 3 was optimistically said to be just “80%” compatible with the Model 1, while, once again, the design did not allow for CP/M. Radio Shack in their marketing genius now had three separate machines labeled the TRS-80, each partially or entirely incompatible with its siblings. Just imagine trying to figure out what software actually worked on your version…

And incredibly, there was yet another completely incompatible TRS-80 released in 1980, this one the most significant of all. Although officially called the TRS-80 Color Computer, it was a radical departure from anything seen before, being built around perhaps the most advanced 8-bit CPU ever produced, the new Motorola 6809E. Like so many Radio Shack systems, it offered intriguing potential bundled together with some dismaying weaknesses. On the plus side were the powerful 6809E itself and an advanced Microsoft BASIC that made it a favorite among hobbyist programmers; on the weak side were sound and graphics capabilities that, while a step up from the other TRS-80 models, were still not competitive with new and upcoming models from companies like Atari and Commodore. In spite of that, the CoCos, as they soon became affectionately known, had a long run during which they consistently flew under the radar of the mainstream, attracting little in the way of games or applications from most publishers or even from Radio Shack itself, but surviving on the back of a sort of cult industry all their own sustained by a fanatically loyal user base. The CoCo line did not finally go out of production until 1991.

There are many more interesting stories to tell about Radio Shack’s quirky little computers, but none would ever come close to dominating the industry the way that the TRS-80 Model 1 did for those first few years. In truth, even the Model 1 was popular because it was widely available at a time when distribution channels for other brands were barely extant and because its price was reasonable rather than because of any sterling technical qualities of the machine itself. The TRS-80 was really not so far removed from Radio Shack’s other products: it basically got the job done, but in about the most uncool and unsexy way imaginable. It primed the pump of the home computer industry and brought adventure games into the home for the first time, but already in 1980 its time was passing.

So, we’ll bid adieu to the old Trash-80 and move on next time to look at the machine that made the company that has come to define cool and sexy in technology. Yes, I’m talking about those two plucky kids in that California garage.


Posted by on September 6, 2011 in Digital Antiquaria, Interactive Fiction
