

A Net Before the Web, Part 3: Content and Competition

We saw in the last article how CompuServe’s user-driven philosophy led to this online service becoming an online community, steered to a large extent by its subscribers. Yet the choice between a content-driven model and a user-driven model has never really constituted a zero-sum proposition, whether on the Internet of today or the CompuServe of the 1980s. In fact, virtually from the moment that Jeff Wilkins decided the nascent MicroNET had potential that was worth seriously investing in — a moment we can date to the very end of 1979 — he started casting about for information and applications which CompuServe’s users couldn’t possibly create for themselves.

The list of top-down initiatives CompuServe would launch over the next several years reads amazingly similar to the list of aspirations, sketchily fulfilled if at all, with which The Source had made its much more high-profile debut. But whether Wilkins really was checking off the items on Bill von Meister’s original list or coming up with this stuff on his own doesn’t matter much in the end. What is important is how much of the daily online life of today was first tried out on CompuServe in the 1980s. Sometimes, as we’ve already seen in the case of the attempt to launch a digital-download service for commercial software, the world would prove not quite ready for what CompuServe strove to offer it. Still, the simple fact of the striving has historical significance of its own.

Very early on, Wilkins determined to bring the news to CompuServe. With The Source having cornered United Press International, he chose to ask the other national news wire, the Associated Press, to make their feed of important stories available to his subscribers. The almost accidental result of his inquiries was something even more prescient, a full-blown collision between the titans of Old Media and what would soon be known as the New Media. Jeff Wilkins:

I had been thinking about news for a long time — the potential to have it be searchable and immediate. And of course it lent itself to text pretty well; we were still at that point in time limited to all text.

I called the local newspaper, the Columbus Dispatch, and said that we’re building this service, and we’d like to have the AP wire; that’s where all the news came from in those days. They said the AP didn’t do that, but you could work on a test to convince them to participate. So, they gave us a test feed, and our technical team took that and parsed it and figured out how to set up menus and all that sort of thing. So, we had a crude working model of a news feed.

Then I called the Associated Press in New York and said I’d like to come talk to them about an idea. Of course, they gave me to a lower-level staffer. But I met him in New York and told him what we were trying to do. He said, “The AP is all the newspapers. They have a board of directors who make all the decisions. I doubt they’d be interested in this, but we’re having our conference in Hawaii next week. If you’ll let me take this demo you’ve just shown me out there, I’ll show it to them and see what they think. Then I’ll get back to you.” His name was Henry Heilman. He was a great guy.

About a week later, I’m in Columbus in my office and the phone rings. “This is Henry Heilman. I’m in Hawaii. Our board would like to come to Columbus to talk about your proposal.”

I said, “That’d be great! When would they like to come?”

He said, “They’d like to come next week.”

I said, “Who’s coming?”

He started to name names. And I recognized a couple of them. One was Katharine Graham from the Washington Post. Another was [Arthur Ochs] Sulzberger from the New York Times.

So, we set it up. It was really funny. Katharine Graham’s secretary called me and said, “Can you have a car for Mrs. Graham?”

I said, “What do you mean by a car?”

He said, “A limousine.”

We didn’t have a limousine service in Columbus, at least not that I ever used. But anyway, I made arrangements to have her and everybody else picked up.

So, ten of these people came to our little conference room, and we made a presentation.

They said, “What’s your proposal?”

I said, “Well, I would like to have ten newspapers participate in a test of an electronic-newspaper service, and in exchange I’d like advertising in your newspapers worth $250,000 apiece, talking about this project.”

They said, “Can we have a few minutes to talk?” They were in there 45 minutes. I remember sweating profusely, thinking they were never going to go on with this. But they came back and said, “Yes, we’ll accept the proposal — with one condition: we offer it to all our newspapers, and let any participate that want to, provided that the ten of us can [also] be in the test.”

So, that was how we kicked it all off. I said, “I have one final request: the Columbus Dispatch will be the first newspaper that comes online.”

They agreed, and that was how the electronic-newspaper [service] launched.

CompuServe is demonstrated to members of the Associated Press in 1980. Standing in the back row from left are Jeff Wilkins, Katharine Graham of the Washington Post, and John F. Wolfe of the Columbus Dispatch, which was soon to become the first newspaper in history to go online.

Wilkins’s tale serves to illustrate that the entrenched forces of establishment media aren’t always quite as hidebound as they may first appear. In fact, the vaguely defined idea of “electronic publishing” was very much in vogue at the time in certain circles, albeit greeted with equal measures of excitement and trepidation. It was the former impulse that led Reader’s Digest, by reputation at least about the most hidebound media institution of all, to buy a controlling interest in The Source in 1980, the same year CompuServe struck their newspaper deal.

But the fear that would always remain at the root of traditional publishing’s long, fraught negotiation with the online world was never hard to find just below the surface. Jim Batten of Knight-Ridder Newspapers was one of the more prominent skeptics, voicing fears that were in their way as prescient as the more optimistic rhetoric that came to surround this brave new world of online news: “Our concern was that if people might get their information in this way, they might no longer need newspapers.” Katharine Graham was playing both sides of the fence, lobbying in Congress for legislation that would prevent telephone companies from becoming “information providers” even as she was signing on with CompuServe.

Indeed, it appeared the newspaper industry in general didn’t entirely know its own mind. Keith Fuller, president of the Associated Press, summed up the two views that were at war within the psyches of people like Graham in these terms: “One [view is] that electronic delivery is the future knocking at the door, and the other [is] that electronic delivery is a disaster hunting a victim.” The decision to get in bed with CompuServe was not without controversy inside the AP’s member newspapers. One union, the Twin Cities Newspaper Guild No. 2, held a 26-day strike against the Minneapolis Star and Tribune after they elected to participate in the experiment. The union’s delivery carriers demanded guarantees that they would not lose their positions with a switch to electronic delivery, while editors and writers demanded that they receive the same residuals on electronically published articles as those they were accustomed to receiving for articles published on paper. It seems an absurdly early point for such conflicts to have begun, given the vanishingly small number of people who actually had the equipment and the willingness to reach their local newspaper online in 1980, but there you have it.

For CompuServe, on the other hand, the deal represented just another way to reach out to Middle America, to reach customers early and make their online service the only credible example of same in the eyes of most of them. They saw the importation of actual newspapers rather than just a news wire as a very significant step toward those goals. The news wires provided the skeleton of what people looked for in their hometown newspapers, but the meat, the bones, and the personality were found in the local human-interest stories, the opinion columns, the entertainment guides. These were the things that made spending a long, lazy weekend morning over the local newspaper and a pot of coffee such a mainstay of American life. In this light, the fact that it was the little Columbus Dispatch that first established an online presence rather than one of the big papers of record feels appropriate.

Each of the newspapers that participated in the program offered free time on CompuServe to any of their subscribers who wished to get a glimpse of the cyberspace future of journalism. The enormous attention the experiment garnered throughout the mainstream media made CompuServe a household name for the first time, at least among those interested in technology in the abstract. Rich Baker, a CompuServe executive:

All of a sudden, we had the biggest newspapers in the country running stories about CompuServe Information Service. The news stories spun off into wire stories, and our getting on the Today show. The Today crew came here so Garrick Utley could deliver the story. We got an incredible amount of exposure from the newspaper experiment. No amount of paid advertising could have accomplished such a feat.

While the experiment was a roaring success from the standpoint of CompuServe, the results from the newspapers’ standpoint were considerably more mixed — doubtless much to the relief of organizations like The Twin Cities Newspaper Guild No. 2. In the initial flurry of excitement, a few of the newspapers had devoted entire editorial staffs to their online editions, a practice that quickly proved untenable given the small number of online readers. Meanwhile even many of the early adopters among the reading public who had greeted the idea of an online newspaper with excitement had to admit in the end that it was a heck of a lot more pleasant to read a 25¢ physical newspaper than it was to watch stories scroll slowly onto a computer screen, bereft of illustrations or proper typesetting, at a price of $6 per hour — not to mention that it was a heck of a lot easier to read a paper-based newspaper at the breakfast table than it was to set up a computer there.

The trial program officially ended in June of 1982, and most of the fifteen or so newspapers who had participated ended their presence on CompuServe along with it. CompuServe’s grander plans for online news were eventually replaced by something called the Executive News Service, a much more limited digest of relevant wire reports for, as the name would indicate, the busy businessperson on the go. Tellingly, CompuServe shifted from telling potential customers about all the prestigious newspapers on offer to offering them the opportunity to “create your own newspaper” — a formulation much more in keeping with the user-driven ethos that had come to define so much of the service.

Another area where CompuServe reached toward a future that would prove to be just out of their grasp was online banking. On October 9, 1980, they announced a partnership with Radio Shack and the United American Bank of Knoxville, Tennessee, to offer the bank’s customers online access to their accounts. According to the press release, customers would be able to “receive current information on their checking accounts, use a bookkeeping service, and apply for loans,” with many more functions, including online bill paying and tax services, planned for the future. The service would represent, according to the bank’s president, “convenience banking without leaving home.” It certainly sounded promising, but it was a struggle to find any takers for the offer, limited as it was to United American Bank’s existing customers in eastern Tennessee. With computer security in its relative infancy, the safety of this, the most important of all their personal information, was a concern repeatedly and justifiably expressed by those who were surveyed on the topic. In the end, instead of becoming the first of many banks to go online, United American Bank elected to terminate the experiment within six months. It seemed that online banking, even more so than online newspapers, was an idea that was still just a little too far ahead of its time.

But other far-seeing ventures proved more successful. In 1982, just as the big newspaper experiment was ending, another electronic-publishing initiative was getting started. The World Book Encyclopedia went online with CompuServe that year, thus inadvertently hammering the first nail into the coffin of the paper-based encyclopedia. Countless wired schoolchildren were soon using this early ancestor of our own ubiquitous Wikipedia to write their reports without ever having to darken the door of a library.

Another, even more important initiative arrived in early 1984 in the form of the Electronic Mall. Once again, it had been The Source that had originated the idea of an online shopping emporium, making it part of their service from Day One. But, once again, online shopping had always been more of an aspiration than a reality there: few retailers initially set up storefronts, which led to few of The Source’s already scant subscribers taking an interest, which gave few other retailers much encouragement to join the fray. And so it was left to the more methodical CompuServe to become the real pioneers of e-commerce.


In contrast to The Source’s shopping mall, CompuServe’s Electronic Mall debuted with a very impressive list of online storefronts, a tribute to how powerful and well-connected Jeff Wilkins’s erstwhile corporate data processor was becoming in the consumer marketplace. Many of the early names in the Electronic Mall could indeed be found in the typical American brick-and-mortar shopping mall: Sears, Waldenbooks, American Express, Kodak, E.F. Hutton, in addition to the expected list of computer-oriented shops, which boasted names like Commodore and Microsoft. But just as notable as all the big names were all the little ones. In another early testament to the leveling effect of so much of online life, small online-only vendors clustered side by side with some of the biggest corporate trademarks in the country. The Electronic Mall would remain a fixture for the next decade and change, doing very well for CompuServe and many of the entities who opened storefronts there. In the process, it became the first really successful example of e-commerce, yet another blueprint for what the future would eventually bring to everyone.

This page from CompuServe’s print magazine Online Today shows some of the wide variety of products that could be purchased from the Electronic Mall by 1989.

Speaking of which: the same year that the Electronic Mall went online, Trans World Airlines opened a gateway to their internal reservations system on CompuServe, allowing subscribers to book their own travel. “This will be the first time that comprehensive worldwide airline information and fares will be available to consumers,” said a proud Jeff Wilkins. Other airlines followed, as did rental-car providers and hotels, precipitating a slow-rolling transformation in the way that people travel — and making life much more difficult for lots of professional travel agents.

So, already by the dawn of 1985 CompuServe encompassed an astonishing swathe of what we’ve come to think of as modern online life, some of it driven by users, some by content providers: email, forums, chat, news, encyclopedias, shopping, travel reservations. And even some of the things missing from that list, like digital distribution of commercial software and online banking, had been tried but had proved impractical. The range is so broad and so far-reaching that some of the technical pioneers who worked for CompuServe have in recent years made a lucrative sideline out of testifying to their prior art in patent cases, ruining the days of heaps of people who had believed themselves to be the innovators. “Almost everything people [have] tried to patent on the Internet,” notes Jeff Wilkins, “CompuServe had done in the early eighties.”

Having thus done his part for online posterity, Wilkins left the company in 1985 in order to get in on the ground floor of CD-ROM by opening a CD-pressing plant. His successor, Charlie McCall, made no dramatic changes to the solid framework Wilkins had left in place. For the remainder of the 1980s and well into the 1990s, CompuServe would just keep on trucking in business-as-usual mode, adding hundreds of thousands of new subscribers each year.

Prior to 1983, CompuServe had had the market for services like theirs virtually to themselves. Potential customers had only two other places to turn: The Source, which, perpetually mismanaged as it was, never posed all that much of a threat after 1980; and the network of private bulletin-board systems, which were regional, difficult to connect with, and, usually able to host only one user at a time, unable to offer anything like the same sense of real-time community. Indeed, CompuServe had deliberately tried to give the impression that theirs was the only online service that was or ever could be, deploying the word “utility” to foster a mental connection with the telephone system or the power grid (or, for modern sensibilities, with what the World Wide Web has become today).

But it was inevitable that others, seeing the growth CompuServe was enjoying, would want to enter the field. The first of these was DELPHI in March of 1983. Originally conceived as an online encyclopedia, it would always maintain a certain intellectual or literary focus. Shortly after its founding, for example, the service hosted what may have been the first online collaborative novel. In 1984, the science-fiction writer Orson Scott Card posted on DELPHI the entirety of Ender’s Game, destined to become his most famous novel, a year before it would see publication in print. Such coups aside, though, DELPHI lacked the corporate clout and the financial resources to challenge CompuServe for mainstream mindshare, and was never regarded by the latter as all that serious of a threat.

In October of 1985, however, a more serious threat did arrive in the form of GEnie. Formed by General Electric out of largely the same motivation that had led Jeff Wilkins to start MicroNET back in the day — the frustration of watching an expensive computing and telecommunications infrastructure sit all but dormant more than half of the time — GEnie arrived with an impressive array of offerings, many of them all too plainly modeled on those of its biggest competitor: chat, a forum system, shopping, news services, etc. Most of all, though, its owners planned to compete on the basis of price. In contrast to CompuServe’s $6.50 per hour, GEnie launched at a price of $5 per hour, an initial salvo in a slow-moving pricing cold war that would gradually bring down the average connection charge across the entire online-services industry over the years to come. While it would never even come close to catching CompuServe, GEnie would remain a force to be reckoned with in its own right for a long time.

And so it went, in accord with the implacable logic of capitalism. By the late 1980s there were several other viable online services as well, all orbiting the star that was CompuServe, defining themselves sometimes in their convergence, sometimes in their divergence. “We’re the more intellectual CompuServe!” said DELPHI; “We’re a cheaper version of CompuServe!” said GEnie; etc., etc. The fact that none of the services had any way of communicating with one another meant that each developed its own unique personality, partly defined by the priorities of its administrators but also partly, one senses, by random chance — or, rather, by the priorities of the people who happened to sign up in its earliest days.

For its part, CompuServe always maintained its reputation as the safe, steady online service, the one that might cost a little more than some of the others but that you knew you could rely on. A certain tradition of technical excellence which John Goltz had instilled from the company’s earliest days as a provider of corporate time-sharing services served them well in the consumer market. Their systems never — but never — went down, and even the odd glitches which often dogged their rivals’ offerings were all but unheard of. Some of their solutions to the problems of the moment were so thorough that they have remained with us to this day. In 1987, for example, CompuServe developed the Graphics Interchange Format, or GIF, as a way to allow their subscribers using many different models of computer running many different kinds of software to share pictures with one another. It would go on to become the first truly ubiquitous cross-platform graphical standard; GIF images have been created in the literal billions in the decades since the format’s inception.
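Part of what made GIF so portable, incidentally, is how simple the file format is to identify and parse on any machine: every GIF file opens with a six-byte ASCII signature (“GIF87a” for the original 1987 release, “GIF89a” for the 1989 revision), followed by the image’s width and height as little-endian 16-bit integers. A minimal sketch in Python — the helper names here are my own, not from any period tooling:

```python
def gif_version(data: bytes):
    """Return the GIF version string ('87a' or '89a'), or None if not a GIF."""
    if data[:3] == b"GIF" and data[3:6] in (b"87a", b"89a"):
        return data[3:6].decode("ascii")
    return None

def gif_dimensions(data: bytes):
    """Read width and height from the logical screen descriptor,
    which immediately follows the six-byte signature."""
    width = int.from_bytes(data[6:8], "little")
    height = int.from_bytes(data[8:10], "little")
    return width, height
```

Any program that could read bytes and do this much arithmetic could at least recognize a GIF, whatever the hardware — a design choice very much in keeping with CompuServe’s need to serve many incompatible models of home computer at once.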

Even as it expanded, the burgeoning online-services industry managed to survive at least one existential threat. In mid-1987, the Federal Communications Commission made plans to implement a fee on the local access numbers which customers used to connect to the services without incurring long-distance charges. Discount long-distance services for voice calls that made use of a similar system had always been required to send part of their revenue back to the local telephone exchanges whose equipment they used, something CompuServe and the other online services had heretofore managed to avoid. In effect, the FCC argued, users of everyday telephone services were subsidizing users of these newfangled online services. They now planned to charge the latter $4.50 to $5.40 per hour for the privilege, a move with the potential to wipe out the whole industry at a stroke. “My opinion is that online information is horrendously overpriced right now,” said one analyst. “If you raise the price, you’re cutting out more and more people.” When word of the plan got to CompuServe, they enlisted their subscribers and everyone else they could find in a furious campaign to get it rescinded before it went into effect on January 1, 1988 — and they succeeded, another testimony to their growing clout. “Aunt Minnie,” as one of the FCC’s stymied attorneys put it, would have to go on subsidizing “Joe Computer User” in the name of keeping a developing industry alive. Not that the users of CompuServe and the other online services thought of it in those terms: for them, it meant simply that they got to keep on chatting and reading and writing and shopping and playing and all the rest without seeing the prices they paid for the privilege more than double.

As they were fending off this threat at home, CompuServe was already casting an eye outside their country’s borders. They expanded into Japan in 1987, then into Switzerland and Britain the following year; other European countries then followed. Soon the stories of friendship and romance that constantly swirled around CB Simulator took on an international character: an Indiana woman moved to Dublin to marry an Irish man; a Japanese woman and her daughter moved to California to join an American man. “I would feel the same about Suzuko if she were from South Africa or lived in Moscow,” said the American man. Like so many Internet chatters who would come after them, the users of CB Simulator were learning the valuable lesson that people shouldn’t be judged by the passport they happen to hold.

Another landmark moment in Charlie McCall’s tenure — if one of more symbolic than practical importance by the time it arrived — came in 1989, when CompuServe, now 500,000 members strong, gobbled up their old arch-rival The Source, which was still straggling along with 50,000 members. Thus did a pioneer which had never quite lived up to its founders’ ambitions finally meet its end.

By the early 1990s, this net before the Web which Jeff Wilkins and Bill von Meister had first conceived almost simultaneously back in 1979 was reaching its peak, with CompuServe snowballing toward an eventual 3 million subscribers, with GEnie well into the hundreds of thousands, and with all the other services beavering along as well, filling their various niches.

And now, having reached this high-water mark, loading you down with so many data points describing so many firsts along the way, I feel keenly my failure to convey a more impressionistic sense of what it was really like to log onto one of these services. Unfortunately, I run into a problem that’s doomed to dog any digital antiquarian who tries to write about what the kids today like to call computing in the cloud: the lack of permanent artifacts to study in such an ephemeral form of media. There is, in other words, no preserved version of CompuServe that I can play in for research purposes, or point you to so that you can do the same, as I can the offline games I write about. I have only my imperfect memories from decades ago to go on — I was actually a GEnie man, having been lured by the cheaper price — along with what was written about the experience at the time. So, I’m going to take an unusual step, sort of an inversion of what we usually do around here. Instead of using the historical environment as a pathway to understanding why a certain game is the way it is, I’m going to do the opposite: suggest a game that you might play as a way of understanding the environment that spawned it.

Judith Pintar was a CompuServe regular in 1991 when she decided to write a game to simulate and gently satirize online life as she then knew it. Working with the text-adventure language AGT, she made CosmoServe. If you’re at all interested in learning more about pre-Web online culture, I strongly encourage you to play it. Try to solve it if you like — it’s a very good game in its own right — but feel free to use a walkthrough if you prefer.

In CosmoServe you’ll find much of what I’ve been writing about in this and the previous article: email, the Forums, chat, the Electronic Mall. From the struggle you sometimes had just to get online at all to the suggestive gossip on CB Simulator, from the ubiquity of Turbo Pascal to a killer computer virus — yes, we already had them this early on — it’s a perfect time capsule of online life circa 1991. I’ll have more to write about CosmoServe in a future article, but for now suffice to say that it conveys all the experiential context that I can’t quite manage to give you in non-interactive, purely historical articles like this one. You can almost hear the hair-raising howl of the modem connecting and the heavy clunk of a vintage IBM keyboard. Whether it happens to be a voyage of discovery or a nostalgia trip for you personally, I think you’ll find it has a lot to offer. Bless Judith Pintar for writing it.

As it happened, though, the online milieu Pintar so ably captured in 1991 was already being threatened at the time she wrote CosmoServe. What we’ve been tracing to this point has been a certain approach to the commercial online service, one based entirely or almost entirely on text, allowing subscribers to connect using almost any terminal program. Yet by 1991 there was another approach out there as well, which in time would lead to the biggest single online service of all — yes, bigger even than CompuServe. And when we trace its origins back to the beginning, we find the familiar name of Bill von Meister. Jeff Wilkins may have wound up stealing his thunder last time around, but the magnificent rogue wasn’t yet done shaping history.

(Sources: the book On the Way to the Web: The Secret History of the Internet and Its Founders by Michael A. Banks; Online Today of February 1988 and July 1989; 80 Microcomputing of January 1981; InfoWorld of November 24, 1980, April 9, 1984, May 21, 1984, November 5, 1984, and October 21, 1985; Personal Computing of January 1981 and October 1981; Family Computing of March 1984; MacWorld of September 1987; New York Times of June 16, 1987; Alexander Trevor’s brief technical history of CompuServe, which was first posted to Usenet in 1988; interviews with Jeff Wilkins from the Internet History Podcast and Conquering Columbus.)

 

Posted on November 10, 2017 in Digital Antiquaria, Interactive Fiction

 


A Net Before the Web, Part 2: Service to Community

Then she generated the light, and the sight of her room, flooded with radiance and studded with electric buttons, revived her. There were buttons and switches everywhere — buttons to call for food, for music, for clothing. There was the hot-bath button, by pressure of which a basin of (imitation) marble rose out of the floor, filled to the brim with a warm deodorized liquid. There was the cold-bath button. There was the button that produced literature. And there were of course the buttons by which she communicated with her friends. The room, though it contained nothing, was in touch with all that she cared for in the world.

— from “The Machine Stops” by E.M. Forster

If we wished to compare The Source with CompuServe’s MicroNET in their earliest days, we might say that the former emphasized the content it would provide to its subscribers while the latter planned to set its subscribers free to make their own content for themselves. In a later era, the World Wide Web would offer both of these things in a hundred-car pileup between the forces of traditional media and millions of empowered creative individuals; we as societies are still struggling in many ways to come to terms with the sea change this represents. It’s of course the second part of the equation — all those empowered creative individuals — that marks the real departure from the top-down media models of old. One might thus be tempted to say that MicroNET’s approach was the more visionary, hewing as it seemingly does to the philosophy sometimes known as “Web 2.0,” that guiding light of “mature” Internet culture. To do so, however, might be to give Jeff Wilkins and his colleagues a bit too much credit. The real driving force behind Wilkins’s MicroNET had little in common with the ideas that would come to be labelled Web 2.0, or for that matter the academic research that led to Web 1.0.

Wilkins had seen that computers were entering homes for the first time, but, raised on the big iron of institutional computing as he was, he couldn’t help but observe how absurdly primitive these new microcomputers really were. He thought of MicroNET as a way for people saddled with such toy computers to use them as the gateway to a real computer. Thus MicroNET’s early emphasis on programming languages. Why should hobbyists content themselves with the primitive BASIC dialects, 16 K (or less) memories, and slow and unreliable cassette-based storage of the first generation of microcomputers when MicroNET could offer them the chance to write and run larger programs in more sophisticated languages like Fortran and Pascal?

It didn’t take long, however, to see that most subscribers didn’t in fact come to MicroNET looking for a replacement for their little home computers. They rather saw it as a place to talk about the things they were doing on their micros: a place to trade tips, rumors, and ideas with one another. They were, in other words, less interested in writing programs on CompuServe’s big computers than they were in using them as a communications tool — as a way of learning how to write better programs on the TRS-80s and Apple IIs sitting right in front of them. Users groups were springing up all over the country for much the same purpose, but, valuable as they were, they were bound by all the constraints geography imposed on what was still a very small hobby in a very big country. What did you do between the monthly meetings of your users group? Some hobbyists logged onto MicroNET to get their fix of shop talk. And so, while the online programming environments sat largely unused, the email system and the public message boards were soon full of activity.

For all that this wasn’t quite what Wilkins had envisioned when he set up MicroNET, he adjusted to the reality on the ground with admirable alacrity. The first sign of the changing times came as early as December of 1979, when a new area called the “MicroNET Software Exchange” made its debut. Representing CompuServe’s first substantial investment of programming effort just for MicroNET subscribers, it was modeled after initiatives like the TRS-80 Software Exchange that was run by SoftSide magazine. With the commercial-software industry still in its infancy, these so-called “exchanges” gave programmers a conduit for selling their home-grown creations to the public. From the entrepreneurs whose wares could be found on them would be born many of the first generation of full-service software publishers — among them names like VisiCorp, Brøderbund, and Adventure International.

The MicroNET Software Exchange went online with 17 TRS-80 programs on offer, ranging in price from $1 to $49, with an average of $16.40. Subscribers who indulged could download the programs they purchased right away, seeing the price conveniently tacked onto their next MicroNET bill. But by the time MicroNET Software Exchange launched it was already clear to astute observers that this means of loosey-goosey commercial-software distribution — it wasn’t unusual for a single developer to “publish” the same program through half a dozen exchanges — probably wasn’t long for this world, doomed by the very same professional software publishers they had done so much to spawn. Despite the appeal to immediate gratification that downloading offered over waiting for physical cassettes to come in the mail, the MicroNET Software Exchange never took off. The era of digital download as a means of commercial-software distribution would require many years yet to come to fruition; this was one aspect of the digital life of the future that would indeed have to wait for a future that came equipped with the fast and reliable connections needed to download complex software painlessly.

CompuServe began to advertise MicroNET in early 1980 via simple spots like this one.

Still, the MicroNET Software Exchange did point to Wilkins’s evolving view of the service, just as the effort that went into creating it pointed to how MicroNET as a whole was moving out of the experimental phase, ready to take its place as an actively developed part of CompuServe’s business model. CompuServe began to take out some modest advertisements for the service in magazines like InfoWorld, and in the summer of 1980 dropped the separate MicroNET moniker altogether. The consumer online service was now known simply as CompuServe, all the former reticence about mixing corporate and consumer business in the same organization shoved aside. Many people within the company remained unhappy about the push into the consumer marketplace, but Wilkins dealt with the developing culture clash by isolating his small team of consumer-service developers in an office of their own, far from the jeering of their colleagues. Helping his cause immensely was the fact that Sandy Trevor, who had replaced John Goltz as the company’s chief technical architect, was himself an enthusiastic supporter of the consumer service, sending the skunk-works group many of his keenest technical minds. With him leading the way, almost all of the technical staff came around in fairly short order, and in time the rest of the staff would follow — especially as the consumer service started making the company real money. By 1987, it would constitute half of CompuServe’s revenue, nicely offsetting the continuing slow decline in the corporate time-sharing market.

It is true that early on the consumer side of the company grew fairly slowly; it would take until well into 1981 for it to reach 10,000 subscribers. Yet its perceived importance, both inside and outside of CompuServe, developed much more quickly. On May 12, 1980, the tax-preparation giant H&R Block bought CompuServe in a deal which left Jeff Wilkins in charge and promised to let him continue on the path he was already steering. Wilkins himself believed that the potential of the consumer service was a major motivating factor — if not the major factor — prompting H&R Block to make the deal. He told one interviewer at the time that he believed H&R Block wanted “to put themselves in a marketplace that is growing faster than the tax markets.” Needless to say, such a description no longer applied to corporate time-sharing services, now a stagnant rather than an exploding market.

Radically different though the two companies’ histories, industries, and cultures were, the acquisition led to surprisingly little internal friction. Wilkins used the sense of security the name of H&R Block lent in corporate America to make deals for the consumer service that may very well have been impossible otherwise, while H&R’s deep pockets and willingness to take the long view made it possible for him to expand on his already excellent telecommunications network, thereby making sure that when the users were ready to come to CompuServe en masse, CompuServe would have the pipes to accept their business. “You have to have the ability to anticipate, to be two or three years ahead of the market,” said Wilkins. By mid-decade, it would be possible to establish a rock-steady connection with CompuServe’s PDP-10s in Columbus via a local call from virtually anywhere in the country.

The telecommunications infrastructure wasn’t the only aspect of the consumer service that required the constant attention of Wilkins’s best engineers. The steadily growing user roll brought plenty of challenges to the programming staff as well. In the old days, when CompuServe had been strictly a provider of time-sharing to corporate clients, each client was assigned to a single PDP-10 in the data centers’ pool of machines; said machine stored all their data and ran all their software and was thus the only one they needed to access. The demands of the consumer service, however, soon extended beyond the capacity of any one machine. Dividing subscribers into pools and assigning them to individual machines was no solution, for all of the subscribers needed to be able to interact with one another in ways which CompuServe’s corporate clients didn’t. Sandy Trevor was the key designer of what came to be called the “yo-yo switch,” a methodology for balancing the load of the consumer service across the company’s range of twenty or more PDP-10s. Trevor:

When a user logs onto CompuServe and selects an option from the menu, he or she is automatically connected with the host on which the needed data is stored. If during an online session he later selects another item that’s on a different computer, he is quickly switched over to that host. Because it’s done so quickly, [the] user is unaware of the change.

This very divorcement of the details of computing hardware from computing in the abstract — to such an extent that the user never needs to think about the hardware at all — is the source of the adjective “cloud” in the modern notion of cloud computing. In the early 1980s, it was at the cutting edge of computer science, and points to how groundbreaking the CompuServe of that time was in a strictly technical as well as social and business sense.
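The routing scheme Trevor describes can be sketched in miniature. Everything below — the host names, the service list, the Session class itself — is invented for illustration; this is a toy model of topic-keyed session switching, not a reconstruction of CompuServe’s actual PDP-10 software.

```python
# A toy model of "yo-yo" routing: each service's data lives on one host,
# and a user's session silently follows the data from host to host.
# All host and service names here are hypothetical.

SERVICE_HOSTS = {           # which machine holds which service's data
    "email":  "pdp10-03",
    "forums": "pdp10-07",
    "cb":     "pdp10-12",
}

class Session:
    def __init__(self, user):
        self.user = user
        self.host = None    # no host until the first menu selection

    def select(self, service):
        """Route the session to whichever host owns the chosen service."""
        target = SERVICE_HOSTS[service]
        if target != self.host:
            self.host = target   # the "switch" -- invisible to the user
        return f"{self.user} using {service} on {self.host}"

s = Session("grove8764")
print(s.select("email"))    # prints "grove8764 using email on pdp10-03"
print(s.select("cb"))       # session silently yo-yos over to pdp10-12
```

The point of the design, as the next paragraph notes, is that the user addresses a service rather than a machine — the mapping from menu item to hardware is entirely the system’s concern.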

While the engineers were thus occupied on the technical end, CompuServe’s evolving marketing department developed ways to get the service in front of potential customers with what one might call an engineer’s single-minded precision. In the summer of 1980, CompuServe struck a deal with Radio Shack, who were selling far more home computers than anyone else at the time, to stock what came to be known as the “Snapaks”: packets containing everything a new subscriber needed to log into the system for the first time and set up an account. A customer could go from opening the packet to using the service within minutes. The Snapaks thus represented a potent force in the consumer marketplace: instant gratification.

From store shelves, the Snapaks found their way into modem boxes, as well as those housing most of the popular home computers. Just as software publishers had long since realized that a stunning percentage of software was purchased at the same time as the computer used to run it, CompuServe understood that the best way to capture a potential customer was to nab her early, in the first blush of excitement that accompanied taking her new toy home. Thanks to their connections and financial resources, no one else could rival them in this kind of outreach. It became a key part of their success, especially after the inevitable competition in the market for online consumer services — some of it far more dangerous than the moribund The Source — began to arrive by mid-decade.

But we perhaps get ahead of ourselves; that’s a story for my next article. At this point, I’d like to flip the script on this business history with which we’ve occupied ourselves until now. It’s time to put on a social historian’s hat and ask what the people who used this most popular and sophisticated of all the 1980s online services were actually doing when they logged on.

It turns out that much of it wasn’t all that far removed from what people still do online today. That fact, far from minimizing the importance of this pioneering service, only serves to underscore how prescient it really was. Humans are, as the cliché goes, social animals. “Social media” may not yet have been a term, but as early as 1980 CompuServe was evolving into a prime example of exactly that. Advertised as a service, it very quickly became a community.

From the beginning, of course, there was email, allowing CompuServe members to send private messages back and forth for any reason they liked. Already in November of 1981, 80 Microcomputing magazine could write of this subtly disruptive technology that “it may replace the postal system and take part of the load now carried by the telephone.”

While the concept of email — still generally referred to during the 1980s by more long-winded sobriquets like “electronic mail” — is a fairly obvious one, the fundamental issue which held back its acceptance as a replacement for paper mail for many years was the lack of inter-operability between the various email systems. For a CompuServe subscriber, this meant that she could only send and receive email to and from other CompuServe subscribers. In one of those quotations that become retroactively hilarious, Marvin Weinberger, a computer researcher, mused thus in 1984:

What we need is a sort of “Long Lines” carrier for electronic mail. It would be analogous to AT&T’s Long Lines, which transmits a message among the local telephone operating companies. So far, a few vendors have taken steps to exchange messages, but there are hundreds of mail systems. If electronic mail is really to become as useful as the telephone — meaning one could send a message to anybody, anywhere — then an entity of this type is a prerequisite.

Weinberger was overlooking the Internet, an entity of exactly the needed type which already existed and was in fact being used to exchange email all over the world as he said those words. Indeed, his words sound like the beginning of a joke: “Gee, if only there was an open computer network already in place for the purpose of sending all these data packets back and forth…”

But the Internet’s evolution into the publicly accessible World Wide Web was still years away; in 1984, it was available only to those with the right university, government, or corporate connections. In the meantime, the closed email systems of services like CompuServe did much to trap subscribers on the service with which they had originally signed up. Each online service was such a closed universe in all respects that moving from one to another meant literally abandoning one’s friends.

While email was a great tool for communicating with friends you’d already made on CompuServe, how did you make new ones? How, in other words, could you find people on CompuServe in the first place who shared your interests? The solution to this problem, arrived at in its most basic form as early as 1980, was what were first known as “Special Interest Groups,” then re-branded with the pithier moniker of simply “Forums.” Rather than dividing CompuServe’s offerings by function — email, bulletin boards, etc. — the Forum system divided them by topic. In a Forum, one could find and communicate with other subscribers who, one knew, were also there out of interest in the Forum’s topic.

Predictably enough, the earliest Forums tended to be dedicated to the computing hobby itself. Each brand of computer and, soon, each viable model of computer got its own Forum. These gatherings of like-minded subscribers came to wield considerable influence in the computer industry at large. Apple’s John Sculley and Steve Wozniak, for instance, both made themselves personally available from time to time on the Forum known as the “Micronetworked Apple Users Group.” It wasn’t unusual for journalists from the magazines to source their word-on-the-street reports from the CompuServe Forums, which came to serve them well as early harbingers of the way the public at large would react to any given plan, product, or announcement. Radio Shack developed the TRS-80 Model 100, the world’s first reasonably usable laptop computer, practically in partnership with the TRS-80 Forum. First they took the time to ask the people there what they wanted in a portable computer. Then they delivered prototype models to the Forum’s leading lights and collected their feedback — rinse and repeat through several more cycles. Throughout the process, the executives behind the project remained consistently available to the Forum’s members. The early subscribers to CompuServe were by definition trailblazers, and the people marketing home-computer hardware and software took their influence very, very seriously.

With time, though, CompuServe’s user base began to branch out beyond the hardcore hacker demographic, and the Forums reflected this in their growing diversity of subject matter. Jeff Wilkins has named aviation as the first non-computer topic to really take hold. Pilots, who were often early technology adopters, had congregated in enough numbers on CompuServe within a year or two that their pooled information on airplanes, airports, weather, and traffic became one of the best resources any aviator could have. Still more pilots started signing up for CompuServe just to have access to this goldmine, creating a snowball effect.

And as aviation went, so in time went heaps of other hobbies and topics of interest: law, medicine, gardening, religion, sports, travel, individual authors and musicians. Just as journalists in our own time have developed a sometimes disconcerting Twitter dependency, journalists by 1986 were finding a fair number of their alleged scoops on CompuServe. When the space shuttle Challenger blew up during launch in January of that year, the huge and active NASA Forum, with plenty of members perched at a privileged vantage point inside NASA itself, became the place to find the latest news about what had happened and why. By 1989, more than 170 Forums were in operation.

The real genius of the Forum system was CompuServe’s willingness to allow them to be driven by ordinary subscribers — a willingness that hearkens back in its way to the founding philosophy of the service. Recognizing that they couldn’t possibly administer such a diverse body of discussions, CompuServe’s employees didn’t even try. Instead they created a process whereby new Forums could be formed whenever enough subscribers had expressed interest in their proposed topics, and then turned over the administration to the experts, the people who knew best the topics they dealt with: the very same subscribers who had lobbied for them in the first place. Forum administrators — known as “sysops” in CompuServe parlance — were given free access, along with a cash stipend that was dependent on how active their domain was. For the biggest Forums, this could amount to a considerable amount of money. Jeff Wilkins has claimed that some sysops wound up earning up to $250,000 in the course of their CompuServe life.

Sysops enjoyed broad powers to go with their compensation. It was almost entirely they who wielded the censor’s pen, who said what was and wasn’t allowed. As their Forums grew, they were permitted to hire deputies to help them police their territory, rewarding them with gifts of free online time. By all accounts, the system worked remarkably well as an early example of the sort of community policing on which websites like Wikipedia would later come to depend. It was a self-regulating system; those few sysops who neglected their duties or abused their powers could expect their Forum’s traffic to dwindle away, until CompuServe shut the doors. Those Forums with particularly enthusiastic and active sysops, on the other hand, thrived, sometimes out of all seeming proportion to their esoteric areas of interest. The Source, still hewing largely to its content- rather than user-driven model, failed to implement anything like the Forum concept until 1985, and was rewarded with a far more fragmented, far less active social space, even taking into account the growing disparity between the numbers of subscribers on the two services.

While the Forums were instrumental in making CompuServe what it was, it was a single technical rather than administrative development which did the most of all to bind CompuServe’s subscribers together into a real community — a development which stands out today as the most obviously, undeniably groundbreaking aspect of the entire service.

The consumer service’s formative period had been marked by a short-lived but fairly intense craze for CB radio, fueled by corn-pone entertainments like Smokey and the Bandit, B.J. and the Bear, and The Dukes of Hazzard. For a while, cars sporting huge antenna rigs were a common sight on American highways, and truckers were left grumbling about all these amateurs muddying up their bandwidth. Radio Shack made a killing off the fad, selling CB kits in their stores alongside the TRS-80s that were fueling the contemporaneous early home-computer boom. The people who found CB radio interesting were very often the same ones who were buying computers and using them to log onto CompuServe.

Sandy Trevor

In late 1979, in the midst of the CB craze, CompuServe rolled out an addition to the operating system used on their time-sharing PDP-10s: a method of sharing segments of memory across multiple user sessions. It may not sound like the most exciting innovation, but it opened up worlds of new possibilities for direct, user-to-user interaction in real time. The synergy between CB enthusiasts and the computer enthusiasts on CompuServe inspired Sandy Trevor to use his programmers’ latest advance in the service of the first real-time online chat system. “It struck me that CB was something everyone had heard of,” he would later say. “Unlike many computer concepts, it wasn’t difficult for novices, and I thought it would provide a unique environment for meeting other people.” Jeff Wilkins recalls his first glimpse of what became known as “CB Simulator”:

We had an executive-committee meeting every Monday morning at 9:00; this was for the whole company. Sandy Trevor came to me before the meeting and said, “I want to show you what I did over the weekend. I call it CB. You pick a channel and you pick a username and you type, and everybody that’s on your channel sees what you’re typing.” He demonstrated it for me. I said, “Wow, that’s really interesting. I don’t know if people will use it or not, but we’ll give it a try and see. Let’s tell the executive committee about it, see what they think.”

So, we went to the executive-committee meeting and he gave a demonstration. I’ll never forget the expressions on their faces. They said, “You guys are insane! Nobody will ever use that! Why are we wasting our time on all this goofy stuff?”

Despite the committee’s objections, CB Simulator went live on February 21, 1980, with no fanfare whatsoever. CompuServe didn’t advertise it at all during its first four years of existence, and it wasn’t even on the menu system for the first year; would-be chatters had to learn the command to activate it from their more clued-in online friends. Sandy Trevor claims that this manifest ambivalence was shared by even Wilkins himself to a degree that’s perhaps obscured by the quotation above; “Jeff Wilkins,” he says, “thought it would be a fad.”

And yet CB Simulator went on to become CompuServe’s killer app, the place where the majority of subscribers spent the majority of their online time. A modern-day Wilkins, long since disabused of any doubts he might once have harbored, calls it out as the perfect combination of “high-tech” and “high-touch”; CB Simulator, more so than even the Forum system or anything else on CompuServe, provided that personal element that turned a conduit for information into a conduit for relationships. CompuServe’s advertising copy — after, that is, they bothered to start advertising CB Simulator — stated the case with only slight hyperbole: “There are students, lawyers, pilots, doctors, engineers, housewives, programmers, writers, all ready to welcome you from the moment you first access CB and type, ‘Hello, I’m new.'” For the people who used it, CB Simulator wasn’t a program or a service or even a technology; it was a social space where, once you’d learned the handful of needed commands, the technology quickly faded into the background.
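The channel mechanism Trevor built atop those shared memory segments can likewise be sketched as a toy model. The class names below are invented, and the handles are borrowed from elsewhere in this story; the real CB Simulator ran on shared PDP-10 memory rather than Python objects. But the broadcast principle is the same: anything typed on a channel appears on the screen of every session tuned to it.

```python
# A toy model of CB Simulator's broadcast channels: one shared buffer
# per channel, visible to every session tuned in. All names invented.

class Channel:
    def __init__(self, number):
        self.number = number
        self.listeners = []          # sessions currently tuned to this channel

    def say(self, handle, text):
        line = f"({self.number},{handle}) {text}"
        for session in self.listeners:
            session.screen.append(line)   # everyone on the channel sees it

class ChatSession:
    def __init__(self, handle):
        self.handle = handle
        self.screen = []             # lines this user's terminal has received

    def tune(self, channel):
        channel.listeners.append(self)
        self.channel = channel

    def type(self, text):
        self.channel.say(self.handle, text)

ch1 = Channel(1)
silver, mike = ChatSession("Silver"), ChatSession("Mike")
silver.tune(ch1)
mike.tune(ch1)
mike.type("Hello, I'm new")   # lands on both screens at once
```

Note that the sender’s own screen receives the line through the same broadcast as everyone else’s — there is no privileged copy, which is part of what made the medium feel like a shared room rather than a mail system.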

Steven K. Roberts received a great deal of press attention for his two-and-a-half-year trip across the highways and byways of the United States on his high-tech bicycle. On the cover of his book, he’s shown using a Tandy/Radio Shack portable computer — part of a model line designed, appropriately enough, in partnership with CompuServe subscribers — to connect to CompuServe from the road. He was a CB Simulator regular throughout his adventure.

For most people of the 1980s, the idea of having online friends was still a deeply odd one, but for the people who were part of the CB Simulator scene the relationships forged there were as real and as pure as any they formed in the “real” world — or perhaps in many cases even more so. One regular chatter noted that on the CB Simulator “you meet someone from the inside out. You judge them on their heart and values, not what kind of jeans they wear.” Pat Phelps, CompuServe’s longtime CB Simulator administrator, beloved to the point of being called “Mother Superior” by her charges, spoke of the doors that were opened in similarly utopian terms:

There is no king or queen or worker class to it. Everyone is totally equal; it’s a fantastic equalizer as far as social order goes. It doesn’t matter what sex or race you are or what you look like, or handicaps, or whatever. People judge you on your ideas, on how you communicate.

Many handicapped people, for example, can’t leave their homes, and they’re withdrawn and concerned about the way they look. Here’s a way they can meet new people, make friends from all over the country. It doesn’t matter if they’re handicapped because everyone is accepted for the thoughts they share over the computer. If you meet a person who doesn’t fit the image of what you thought they should look like, it doesn’t matter because you already care for them and accept them.

“It’s like having a house guest in the corner who will talk to you anytime you want,” said another chatter. “It’s a form of communication, like hanging out on a street corner.” But of course many of the people hanging out on this virtual street corner were the very sort who would have been extremely uncomfortable doing so in the real world. “I’ve always been a loner, and this is a convenient way to meet people,” said one. “For the first time in my life, I have a group of people I can communicate with anytime.”

One of the first of many CB Simulator parties was organized by Pat Phelps in Columbus on June 16, 1984. These happy dancers have for the most part never met before in the physical world — but they seem to be getting along well enough.

Some of the friendships that were forged on CB Simulator evolved into something more — and this even before the “lonely hearts” channels became a thing. Pat Phelps claimed that even during the earliest period of CB Simulator’s existence several couples who met there wound up getting married. Although they were almost certainly not the absolute first of their kind, the first well-documented instance of a couple who met online getting married dates to February 14, 1983.

George Stickles and Debbie Fuhrman were better known online as “Mike” and “Silver.” He was a 29-year-old who worked at a copy shop near Dallas, Texas, she a 23-year-old secretary from Phoenix, Arizona. They got to know each other by chatting for “five or six hours” every night; “He would type in these jokes on the computer, and I felt really comfortable,” said Fuhrman. She eventually moved to Dallas to be with him. As a tribute to their unusual courtship, they decided to hold a wedding online, where their other friends on CB Simulator could participate. At first they thought of only a mock marriage. “Then after we got into it,” said Fuhrman, “we decided, why not do it for real. Pat [Phelps] said, ‘Yeah, yeah, by all means, do it for real.’ So we decided to go ahead and do everything at the same time.” The online spectators included Fuhrman’s parents, who had been unable to travel from Phoenix to join their daughter and future son-in-law. The bridesmaid was named Cupcake, the caterer “<< >>,” the usher Gandalf, the photographer Challenger, while the best man was the perfectly named Bestman. Three computers were placed in the same room in Dallas: one for each half of the happy couple, one for a 24-hour on-call minister who had been plucked out of the local phone book. As they went through the ceremony, each typed his or her words in addition to speaking them aloud. “I was quite surprised at the number of people who attended, as well as how well everything went,” said Stickles. The couple left the ceremony in a hail of virtual rice: “***************************.”

Stickles and Fuhrman were interviewed a number of times by journalists interested in documenting this strange new phenomenon of online dating. Some of the other adventures and misadventures their articles describe still ring true to anyone who has dipped a toe in these waters:

A couple who had been communicating over the lines for two months decided to meet each other at a local bar. They had been talking on the phone earlier. “The phone conversation was marvelous,” says the woman, who goes by the handle BigGal. “We chatted, laughed, and conversed for the better part of three hours. I couldn’t believe such a human being existed.”

And then they met. Damion, who had claimed to be 6 feet tall, had “mysteriously shrunk to about 5 feet 6 inches,” says BigGal. “The well-built body I had imagined assumed an avocado shape, and what was left of his brown hair was more of a dull, dusty gray color. Damion, supposedly 24 to 27 years old, also fibbed about his age. He looked old enough to be my father.”

Anecdotes like these reveal that judging the opposite sex exclusively on “their hearts and values” only got some chatters so far.

Still, we can presume that some of the supposed dishonesty that could lead to misunderstandings arose not so much from malignant intent as an earnest desire to try on different identities that weren’t going to fly in many real-world regions of an intensely hetero-normative country. One chatter told a journalist of some intense online time spent with what he assumed to be a “lovely, very philosophical” woman — only to learn that she was “really” a guy named Dave. Was Dave engaging in dishonest behavior, or revealing a truer self — or was Dave in some sense doing both at once?

Inevitably, some people were less interested in the relationship-building aspect of the whole romantic enterprise than they were in getting right down to the sex. Channels dedicated to sex chat could be found on CB Simulator almost from the beginning, and were quietly tolerated by CompuServe’s administrators — if not, for obvious reasons, publicized. Below is a precious historical document: real footage from 1984 of one of CB Simulator’s “adult” channels, as preserved by YouTube user Mathew Melnick. From the common area shown on the video, chatters could pair up in private rooms in order to… well, you know what they were doing, don’t you?


So, this sort of thing certainly had its place on CB Simulator. But, particularly after the media latched onto the topic of online sexy talk with predictable enthusiasm, it didn’t take long for the very sort of uncomfortable exchanges so many women had seen CB Simulator as an escape from to begin to spill over into their online life as well. Indeed, this became one of the few topics on which the usually sanguine Pat Phelps expressed real worry:

CompuSex is a very small part of what the medium is about. I’m not against it. If people want to do that, it’s perfectly alright. But now, because of the publicity, the majority of women have gotten extremely shy. Most of them aren’t even going to “talk” mode anymore. I don’t do it anymore, unless it’s with someone I know, because most of the one-on-ones are sex calls now. It’s kind of shut the door to friendships and meeting new people. Many of the women I talked to felt the same way. It’s sad. It’s shutting the door against the real reason that CB was originated in the first place, for fun and friendship and camaraderie and romance.

Thus, already by the time Phelps said those words in 1984, the Garden of Eden that had been CB Simulator in the eyes of its first adopters was starting to collect its share of snakes.

Other chatters were less predatory, but just as depressing in the way they brought some of the less savory aspects of the real world with them online. The head of the Republican Forum, speaking from the vast wisdom she had accrued in her 24 years, seemed determined to live up to every stereotype about her political party when she sniffed that “usually CB people are more educated, make a little more money. They’re a better group of people.” It all served to point out, for anyone who was in doubt, that the online life of the future wouldn’t be all unicorns and rainbows. If everyone was equal on CB Simulator, it seemed that some still believed they were more equal than others.

Another discordant note was lent by a new phrase which had begun to enter journalistic parlance for the first time by 1984: “online addiction.” The phrase is still heard all too often today, but one big difference between then and now is that those using CompuServe and similar services during the 1980s were paying by the minute for the privilege. Lurid stories emerged, usually based on hearsay rather than direct reporting, describing chatters who had supposedly lost house and home to the compulsion. While the scope was perhaps often exaggerated, the problem for some people was real. Monthly bills of $500 or more weren’t unusual among the CB Simulator hardcore, who occasionally confessed to forgoing niceties like a new car to replace that beat-up old clunker in order to have the money to keep chatting.

But there are downsides to any social revolution. The fact remains that the people hanging out on CB Simulator and other online spaces like it were at the vanguard of something extraordinary, something destined to be far more a force for good than its opposite. For countless people, home-bound or otherwise isolated by circumstance from those in the physical spaces around them, CompuServe became a vital part of their existence. I have no statistics to hand on how many people didn’t take their own lives or make some other tragic decision because of CompuServe, but I strongly suspect they number more than a few. Born as a prosaic exercise in corporate time-sharing, CompuServe’s evolution into the largest and most vibrant online community of the 1980s — it could boast 500,000 active members by 1989 — is one of the more unlikely and inspiring tales of a pivotal era in computer history. As yet, though, we’ve only seen half the picture. Next time, we’ll see how Big Media went digital for the first time thanks to CompuServe.

(Sources: the book On the Way to the Web: The Secret History of the Internet and its Founders by Michael A. Banks and Computing Across America: The Bicycle Odyssey of a High-Tech Nomad by Steven K. Roberts; Creative Computing of March 1980; InfoWorld of May 26 1980, March 14 1983, July 2 1984, July 9 1984, July 23 1984, and July 30 1984; 80 Microcomputing of November 1980 and November 1981; Online Today of June 1985 and July 1989; Alexander Trevor’s brief technical history of CompuServe, which was first posted to Usenet in 1988; interviews with Jeff Wilkins from the Internet History Podcast and Conquering Columbus.)

 
 

A Net Before the Web, Part 1: The Establishment Man and the Magnificent Rogue

On July 9, 1979, journalists filtered into one of the lavish reception halls in Manhattan’s Plaza Hotel to witness the flashy roll-out of The Source, an online service for home-computer owners that claimed to be the first of its kind. The master of ceremonies was none other than the famous science-fiction and science-fact writer Isaac Asimov. With his nutty-professor persona in full flower, his trademark mutton-chop sideburns bristling in the strobe of the flashbulbs, Asimov said that “this is the beginning of the Information Age! By the 21st century, The Source will be as vital as electricity, the telephone, and running water.”

Actually, though, The Source wasn’t quite the first of its kind. Just eight days before, another new online service had made a more quiet official debut. It was called MicroNET, and came from an established provider of corporate time-shared computing services called CompuServe. MicroNET got no splashy unveiling, no celebrity spokesman, just a typewritten announcement letter sent to members of selected computer users groups.

The contrast between the two roll-outs says much about the men behind them, who between them would come to shape much of the online world of the 1980s and beyond. They were almost exactly the same age, but cut from very different cloth. Jeff Wilkins, the executive in charge of CompuServe, could be bold when he felt it was warranted, but his personality lent itself to a measured, incremental approach that made him a natural favorite with the conservative business establishment. “The changes that will come to microcomputing because of computer networks will be evolutionary in nature,” he said just after launching MicroNET. Even after Wilkins left CompuServe in 1985, it would continue to bear the stamp of his careful approach to doing business for many years.

But William von Meister, the man behind The Source and its afore-described splashier unveiling, preferred revolutions to evolutions. He was high-strung, mercurial, careless, sometimes a little unhinged. Described as a “magnificent rogue” by one acquaintance, as a “pathological entrepreneur” by another, he made businesses faster than he made children — of whom, being devoted to excess in all its incarnations, he had eight. His businesses seldom lasted very long, and when they did survive, they did so without him at their helm, usually after he had been chased out of them in a cloud of acrimony and legal proceedings. A terrible businessman by most standards, he could nevertheless “raise money from the dead,” as one investor put it, thereby moving on to the next scheme while the previous one was still going down in flames. Still, whatever else you could say about him, Bill von Meister had vision. Building the online societies of the future would require cockeyed dreamers like him just as much as it would sober tacticians like Jeff Wilkins.


Had an anonymous salesman who worked for Digital Equipment Corporation in 1968 been slightly less good at his job, CompuServe would most likely never have come to be.

The salesman in question had been assigned to a customer named John Goltz, fresh out of the University of Arizona and working now in Columbus, Ohio, for a startup. But lest the word “startup” convey a mistaken impression of young men with big dreams out to change the world, Silicon Valley-style, know that this particular startup lived within about the most unsexy industry imaginable: life insurance. No matter; from Goltz’s perspective anyway the work was interesting enough.

He found himself doing the work because Harry Gard, the founder of the freshly minted Golden United Life Insurance, wanted to modernize his hidebound industry, at least modestly, by putting insurance records online via a central computer which agents in branch offices could all access. He had first thought of giving the job to his son-in-law Jeff Wilkins, an industrious University of Arizona alumnus who had graduated with a degree in electrical engineering and now ran a successful burglar-alarm business of his own in Tucson. “The difference between electrical engineering and computing didn’t occur to him,” remembers Wilkins. “I told him that I didn’t know anything about computing, but I had a friend who did.” That friend was John Goltz, whose degree in computer science made him the more logical candidate in Wilkins’s eyes.

Once hired, Goltz contacted DEC to talk about buying a PDP-9, a sturdy and well-understood machine that should be perfectly adequate for his new company’s initial needs. But our aforementioned fast-talking salesman gave him the hard up-sell, telling him about the cutting-edge PDP-10 he could lease for only “a little more.” Like the poor rube who walks into his local Ford dealership to buy a Focus and drives out in a Mustang, Goltz’s hacker heart couldn’t resist the lure of DEC’s 36-bit hot rod. He repeated the salesman’s pitch almost verbatim to his boss, and Gard, not knowing a PDP-10 from a PDP-9 from a HAL 9000, said fine, go for it. Once his dream machine was delivered and installed in a former grocery store, Goltz duly started building the online database for which he’d been hired.

The notoriously insular life-insurance market was, however, a difficult nut to crack. Orders came in at a trickle, and Goltz’s $1 million PDP-10 sat idle most of the time. It was at this point, looking for a way both to make his computer earn its keep and to keep his employer afloat, that Goltz proposed that Golden United Life Insurance enter into the non-insurance business of selling time-shared computer cycles. Once again, Gard told him to go for it; any port in a storm and all that.

At the dawn of the 1970s, time-sharing was the hottest buzzword in the computer field. Over the course of the 1950s and 1960s, the biggest institutions in the United States — government bureaucracies, banks, automobile manufacturers and other heavy industries — had all gradually been computerized via hulking mainframes that, attended by bureaucratic priesthoods of their own and filling entire building floors, chewed through and spat out millions of records every day. But that left out the countless smaller organizations who could make good use of computers but had neither the funds to pay for a mainframe’s care and upkeep nor a need for more than a small fraction of its vast computing power. DEC, working closely with university computer-science departments like that of MIT, had been largely responsible for the solution to this dilemma. Time-sharing, enabled by a new generation of multi-user, multitasking operating systems like DEC’s TOPS-10 and an evolving telecommunications infrastructure that made it possible to link up with computers from remote locations via dumb terminals, allowed computer cycles and data storage to be treated as a commodity. A business or other organization, in other words, could literally share time on a remote computer system with others, paying for only the cycles and storage they actually used. (If you think that all this sounds suspiciously like the supposedly modern innovation of “cloud computing,” you’re exactly right. In technology as in life, a surprising number of things are cyclical, with only the vocabulary changing.)

Jeff Wilkins

John Goltz possessed a keen technical mind, but he had neither the aptitude nor the desire to run the business side of Golden United’s venture into time-sharing. So, Harry Gard turned once again to his son-in-law. “I liked what I was doing in Arizona,” Jeff Wilkins says. “I enjoyed having my own company, so I really didn’t want to come out.” Finally, Gard offered him $1.5 million in equity, enough of an eye-opener to get him to consider the opportunity more seriously. “I set down the ground rules,” he says. “I had to have complete control.” In January of 1970, with Gard having agreed to that stipulation, the 27-year-old Jeff Wilkins abandoned his burglar-alarm business in Tucson to come to Columbus and run a new Golden United subsidiary which was to be called Compu-Serv.

With time-sharing all the rage in computer circles, it was a tough market they were entering. Wilkins remembers cutting his first bill to a client for all of $150, thinking all the while that it was going to take a lot of bills just like it to pay for this $1 million computer. But Compu-Serv was blessed with a steady hand in Wilkins himself and a patient backer with reasonably deep pockets in his father-in-law. Wilkins hired most of his staff out of big companies like IBM and Xerox. They mirrored their young but very buttoned-down boss, going everywhere in white shirt and tie, lending an aura of conservative professionalism that belied the operation’s small size and made it attractive to the business establishment. In 1972, Compu-Serv turned the corner into a profitability that would last for many, many years to come.

In the beginning, they sold nothing more than raw computer access; the programs that ran on the computers all had to come from the clients themselves. As the business expanded, though, Compu-Serv began to offer off-the-shelf software as well, tailored to the various industries they found themselves serving. They began, naturally enough, with the “Life Insurance Data Information System,” a re-purposing of the application Goltz had already built for Golden United. Expanding the reach of their applications from there, they cultivated a reputation as a full-service business partner rather than a mere provider of a commodity. Most importantly of all, they invested heavily in their own telecommunications infrastructure that existed in parallel with the nascent Internet and other early networks, using lines leased from AT&T and a system of routers — actually, DEC minicomputers running software of their own devising — for packet-switching. From their first handful of clients in and around Columbus, Compu-Serv thus spread their tendrils all over the country. They weren’t the cheapest game in town, but for the risk-averse businessperson looking for a full-service time-sharing provider with a fast and efficient network, they made for a very appealing package.

In 1975, Compu-Serv was spun off from the moribund Golden United Life Insurance, going public with a NASDAQ listing. Thus freed at last, the child quickly eclipsed the parent; the first stock split happened within a year. In 1977, Compu-Serv changed their name to CompuServe. By this point, they had more than two dozen offices spread through all the major metropolitan areas, and that one PDP-10 in a grocery store had turned into more than a dozen machines filling two data centers near Columbus. Their customer roll included more than 600 businesses. By now, even big business had long since come to see the economic advantages time-sharing offered in many scenarios. CompuServe’s customers included Fortune 100 giants like AMAX (the largest miner of aluminum, coal, and steel in the country), Goldman Sachs, and Owens Corning, along with government agencies like the Department of Transportation. “CompuServe is one of the best — if not the best — time-sharing companies in the country,” said AMAX’s director of research.

Inside one of CompuServe’s data centers.

The process that would turn this corporate data processor of the 1970s into the most popular consumer online service of the 1980s was born out of much the same reasoning that had spawned it in the first place. Once again, it all came down to precious computer cycles that were sitting there unused. To keep their clients happy, CompuServe was forced to make sure they had enough computing capacity to meet peak-hour demand. This meant that the majority of the time said capacity was woefully underutilized; the demand for CompuServe’s computer cycles was an order of magnitude higher during weekday working hours than it was during nights, evenings, and weekends, when the offices of their corporate clients were deserted. This state of affairs had always rankled Jeff Wilkins, nothing if not a lover of efficiency. Yet it had always seemed an intractable problem; it wasn’t as if they could ask half their customers to start working a graveyard shift.

Come 1979, though, a new development was causing Wilkins to wonder if there might in fact be a use for at least some of those off-hour cycles. The age of personal computing was dawning. Turnkey microcomputers were now available from Apple, Commodore, and Radio Shack. The last company alone was on track to sell more than 50,000 TRS-80s before the end of the year, and many more models from many more companies were in the offing. The number of home-computer hobbyists was still minuscule by any conventional standard, but it could, it seemed to Wilkins, only grow. Might some of those hobbyists be willing and able to dial in and make use of CompuServe’s dearly bought PDP-10 systems while the business world slept? If so, who knew what it might turn into?

It wasn’t as if a little diversity would be a bad thing. While CompuServe was still doing very well on the strength of their fine reputation — they would bill their clients for $19 million in 1979 — the time-sharing market in general was showing signs of softening. The primary impetus behind it — the sheer expense of owning one’s own computing infrastructure — was slowly bleeding away as minicomputers like the DEC PDP-11, small enough to shove away in a closet somewhere rather than requiring a room or a floor of its own, became a more and more cost-effective solution. Rather than a $1 million proposition, as it had been ten years ago, a new DEC system could now be had for as little $150,000. Meanwhile a new piece of software called VisiCalc — the first spreadsheet program ever, at least as the modern world understands that term — would soon show that even an early, primitive microcomputer could already replace a time-shared terminal hookup in a business’s accounting department. And once entrenched in that vital area, microcomputers could only continue to spread throughout the corporation.

Still, the consumer market for online services, if it existed, wasn’t worth betting CompuServe’s existing business model on. Wilkins entered this new realm, as he did most things, with cautious probity. The new service would be called MicroNET so as to keep it from damaging the CompuServe brand in the eyes of their traditional customers, whether because it became a failure or just because of the foray into the untidy consumer market that it represented. And it would be “market-driven” rather than “competition-driven.” In Wilkins’s terminology, this meant that they would provide some basic time-sharing infrastructure — including email and a bulletin board for exchanging messages, a selection of languages for writing and running programs, and a suite of popular PDP-10 games like Adventure and Star Trek — but would otherwise adopt a wait-and-see attitude on adding customized consumer services, letting the market — i.e., all those hobbyists dialing in from home — do what they would with the system in the meantime.

Even with all these caveats, he had a hard time selling the idea to his board, who were perfectly happy with the current business model, thank you very much, and who had the contempt for the new microcomputers and the people who used them that was shared by many who had been raised on the big iron of DEC and IBM. They took to calling his idea “schlock time-sharing.”

Mustering all his powers of persuasion, Wilkins was able to overrule the naysayers sufficiently to launch a closed trial. On May 1, 1979, CompuServe quietly offered free logins to any members of the Midwest Affiliation of Computer Clubs, headquartered right there in Columbus, who asked for them. With modems still a rare and pricey commodity, it took time to get MicroNET off the ground; Wilkins remembers anxiously watching the connectivity lights inside the data center during the evenings, and seeing them remain almost entirely dimmed. But then, gradually, they started blinking.

After exactly two months, with several hundred active members having proved to Wilkins’s satisfaction that a potential market existed, he made MicroNET an official CompuServe service, open to all. To the dissatisfaction of his early adopters, that meant they had to start paying: a $30 signup charge, followed by $5 per hour for evening and weekend access, $12 per hour if they were foolish enough to log on during the day, when CompuServe’s corporate clients needed the machines. To the satisfaction of Wilkins, most of his early adopters grumbled but duly signed up, and they were followed by a slow but steady trickle of new arrivals. The service went entirely unadvertised, news of its existence spreading among computer hobbyists strictly by word of mouth. MicroNET was almost literally nothing in the context of CompuServe’s business as a whole — it would account for roughly 1 percent of their 1979 revenue, less than many of their larger corporate accounts generated individually — yet it marked the beginning of something big, something even Wilkins couldn’t possibly anticipate.
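Those published rates make the arithmetic of a hobbyist’s monthly bill easy to sketch. The little calculation below uses the figures quoted above; the usage amounts themselves are hypothetical, chosen purely for illustration:

```python
# MicroNET's 1979 rates, as quoted above: $5/hour evenings and
# weekends, $12/hour during weekday business hours. (The one-time
# $30 signup charge is left out of the recurring bill.)

EVENING_RATE = 5.00   # dollars per hour, evenings/weekends
DAYTIME_RATE = 12.00  # dollars per hour, weekday business hours

def monthly_bill(evening_hours: float, daytime_hours: float = 0.0) -> float:
    """Recurring monthly charge for the given hours of connect time."""
    return evening_hours * EVENING_RATE + daytime_hours * DAYTIME_RATE

# A hypothetical hobbyist who dials in for an hour most evenings:
print(f"${monthly_bill(30):.2f}")      # 30 evening hours -> $150.00

# The same usage shifted to business hours costs far more:
print(f"${monthly_bill(0, 30):.2f}")   # 30 daytime hours -> $360.00
```

The 2.4x daytime premium was the point: it steered hobbyists toward exactly the off-peak hours when CompuServe’s machines would otherwise have sat idle.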

But MicroNET didn’t stand alone. Even as one online service was getting started in about the most low-key fashion imaginable, another was making a much more high-profile entrance. It was fortunate that Wilkins chose to see MicroNET as “market-driven” rather than “competition-driven.” Otherwise, he wouldn’t have been happy to see his thunder being stolen by The Source.

Bill von Meister

Like Jeff Wilkins, Bill von Meister was 36 years old. Unlike Wilkins, he already had on his resume a long string of entrepreneurial failures to go along with a couple of major successes. An unapologetic epicurean with a love for food, wine, cars, and women, he had been a child not just of privilege but of aristocracy, his father a godson of the last German kaiser, his mother an Austrian countess. His parents had immigrated to New York in the chaos that followed World War I, when Germany and Austria could be uncomfortable places for wealthy royalty, and there his father had made the transition from landed aristocrat to successful businessman with rather shocking ease. Among other ventures, he became a pivotal architect of the storied Zeppelin airship service between Germany and the United States — although the burning of the Hindenburg did rather put the kibosh on that part of his portfolio, as it did passenger-carrying airships in general.

The son inherited at least some of the father’s acumen. Leveraging his familial wealth alongside an unrivaled ability to talk people into giving him money — one friend called him the best he’d ever seen at “taking money from venture capitalists, burning it all up, and then getting more money from the same venture capitalist” — the younger von Meister pursued idea after idea, some visionary, some terrible. By 1977, he had hit pay dirt twice already in his career, once when he created what was eventually branded as Western Union’s “Mailgram” service for sending a form of electronic mail well before computer email existed, once when he created a corporate telephone service called Telemax. Unfortunately, the money he earned from these successes disappeared as quickly as it poured in, spent to finance his high lifestyle and his many other, failed entrepreneurial projects.

Late in 1977, he founded Digital Broadcasting Corporation in Fairfax County, Virginia, to implement a scheme for narrow-casting digital data using the FM radio band. “Typical uses,” ran the proposal, “would include price information for store managers in a retail chain, bad-check information to banks, and policy information to agents of an insurance company.” Von Meister needed financing to bring off this latest scheme, and he needed a factory to build the equipment that would be needed. Luckily, a man who believed he could facilitate both called him one day in the spring of 1978 after reading a description of his plans in Business Week.

Jack Taub had made his first fortune as the founder of Scott Publishing, known for their catalogs serving the stamp-collecting hobby. Now, he was so excited by von Meister’s scheme that he immediately bought into Digital Broadcasting Corporation to the tune of $500,000 of much-needed capital, good for a 42.5 percent stake. But every bit as important as Taub’s personal fortune were the connections he had within the federal government. By promising to build a factory in the economically disadvantaged inner city of Charlotte, North Carolina, he convinced the Commerce Department’s Economic Development Administration to guarantee 90 percent of a $6 million bank loan from North Carolina National Bank, under a program meant to channel financing into job-creating enterprises.

Unfortunately, the project soon ran into serious difficulties with another government agency: the Federal Communications Commission, who noted pointedly that the law which had set aside the FM radio band had stipulated it should be reserved for applications “of interest to the public.” Using it to send private data, many officials at the FCC believed, wasn’t quite what the law’s framers had had in mind. And while the FCC hemmed and hawed, von Meister was fomenting chaos within the telecommunications and broadcasting industries at large by claiming his new corporation’s name gave him exclusive rights to the term “digital broadcasting,” a modest buzzword of its own at the time. His legal threats left a bad taste in the mouth of many a potential partner, and the scheme withered away under the enormous logistical challenges of getting such a service off the ground. The factory which the Commerce Department had so naively thought they were financing never opened, but Digital Broadcasting kept what remained of the money they had received for the purpose.

They now planned to use the money for something else entirely. Von Meister and Taub had always seen business-to-business broadcasting as only the first stage of their company’s growth. In the longer term, they had envisioned a consumer service which would transmit and even receive information — news and weather reports, television listings, shopping offers, opinion polls, etc. — to and from terminals located in ordinary homes. When doing all this over the FM radio band began to look untenable, they had cast about for alternative approaches; they were, after all, still flush with a fair amount of cash. It didn’t take them long to take note of all those TRS-80s and other home computers that were making their way into the homes of early adopters. Both Taub and von Meister would later claim to have been the first to suggest a pivot from digital broadcasting to a microcomputer-oriented online information utility. In the beginning, they called it CompuCom.

The most obvious problem CompuCom faced — its most obvious disadvantage in comparison to CompuServe’s MicroNET — was the lack of a telecommunications network of its own. Once again, both Taub and von Meister would later claim to have been the first to see the solution. One or the other or both took note of another usage inequality directly related to the one that had spawned MicroNET. Just as the computers of time-sharing services like CompuServe sat largely idle during nights and weekends, traffic on the telecommunications lines corporate clients used to connect to them was also all but nonexistent more than half of the time. Digital Broadcasting came to GTE Telenet with an offer to lease this idle bandwidth at a rate of 75¢ per connection per hour, a dramatic discount from the rates paid by typical business customers. GTE, on the presumption that something was better than nothing, agreed. And while they were making the deal to use the telecommunications network, von Meister and Taub also made a deal with GTE Telenet to run the new service on the computers in the latter’s data centers, using all that excess computing power that lay idle along with the telecommunications bandwidth on nights and weekends. Because they needed to build no physical infrastructure, von Meister and Taub believed that CompuCom could afford to be relatively cheap during off-hours; the initial pricing plan stipulated just $2.75 per hour during evenings and weekends, with a $100 signup fee and a minimum monthly charge of $10.

For all the similarities in their way of taking advantage of the time-sharing industry’s logistical quirks, not to mention their shared status as the pioneers of much of modern online life, there were important differences between the nascent MicroNET and CompuCom. From the first, von Meister envisioned his service not just as a provider of computer access but as a provider of content. The public-domain games that were the sum total of MicroNET’s initial content were only the beginning for him. Mirroring its creator, CompuCom was envisioned as a service for the well-heeled Playboy– and Sharper Image-reading technophile lounge lizard, with wine lists, horoscopes, entertainment guides for the major metropolitan areas, and an online shopping mall. In a landmark deal, von Meister convinced United Press International, one of the two providers of raw news wires to the nation’s journalistic infrastructure, to offer their feed through CompuCom as well — unfiltered, up-to-the-minute information of a sort that had never been available to the average consumer before. The New York Times provided a product-information database, Prentice Hall provided tax information, and Dow Jones provided a stock ticker. Von Meister contracted with the French manufacturer Alcatel for terminals custom-made just for logging onto CompuCom, perfect for those wanting to get in on the action who weren’t interested in becoming computer nerds in the process. For the same prospective customers, he insisted that the system, while necessarily all text given the state of the technology of the time, be navigable via multiple-choice menus rather than an arcane command line.

In the spring of 1979, just before the first trials began, CompuCom was renamed The Source; the former name sounded dangerously close to “CompuCon,” a disadvantage that was only exacerbated by the founder’s checkered business reputation. The service officially opened for business, eight days after MicroNET had done the same, with that July 9 press conference featuring Isaac Asimov and the considerable fanfare it generated. Indeed, the press notices were almost as ebullient as The Source’s own advertising, with the Wall Street Journal calling it “an overnight sensation among the cognoscenti of the computing world.” Graeme Keeping, a business executive who would later be in charge of the service but was at this time just another outsider looking in, had this to say about those earliest days:

The announcement was made with the traditional style of the then-masters of The Source. A lot of fanfare, a lot of pizazz, a lot of sizzle. There was absolutely no substance whatsoever to the announcement. They had nothing to back it up with.

Electronic publishing was in its infancy in those days. It was such a romantic dream that there never had to be a product in order to generate excitement. Nobody had to see anything real. People wanted it so badly, like a cure for cancer. We all want it, but is it really there? I equate it to Laetrile.

While that is perhaps a little unfair — there were, as we’ve just seen, several significant deals with content providers in place before July of 1979 — it was certainly true that the hype rather overwhelmed the comparatively paltry reality one found upon actually logging into The Source.

Nevertheless, any comparison of The Source and MicroNET at this stage would have to place the former well ahead in terms of ambition, vision, and public profile. That distinction becomes less surprising when we consider that what was a side experiment for Jeff Wilkins was the whole enchilada for von Meister and Taub. For the very same reason, any neutral observer forced to guess which of these two nascent services would rise to dominance would almost certainly have gone with The Source. Such a reckoning wouldn’t have accounted, however, for the vortex of chaos that was Bill von Meister.

It was the typical von Meister problem: he had built all this buzz by spending money he didn’t have — in fact, by spending so much money that to this day it’s hard to figure out where it could all possibly have gone. As of October of 1979, the company had $1000 left in the bank and $8 million in debt. Meanwhile The Source itself, despite all the buzz, had managed to attract at most a couple of thousand actual subscribers. It was, after all, still very early days for home computers in general, modems were an even more exotic species, and the Alcatel terminals had yet to arrive from France, being buried in some transatlantic bureaucratic muddle.

Jack Taub

By his own later account, Jack Taub had had little awareness over the course of the last year or so of what von Meister was doing with the company’s money, being content to contribute ideas and strategic guidance and let his partner handle day-to-day operations. But that October he finally sat down to take a hard look at the books. He would later pronounce the experience of doing so “an assault on my system. Von Meister is a terrific entrepreneur, but he doesn’t know when to stop entrepreneuring. The company was in terrible shape. It was not going to survive. Money was being spent like water.” With what he considered to be a triage situation on his hands, Taub delivered an ultimatum to von Meister. He would pay him $3140 right now — 1¢ for each of his shares — and would promise to pay him another dollar per share in three years if The Source was still around then. In return, von Meister would walk away from the mess he had created, escaping any legal action that might otherwise become a consequence of his gross mismanagement. According to Taub’s account, von Meister agreed to these terms with uncharacteristic meekness, leaving his vision of The Source as just one more paving stone on his boulevard of broken entrepreneurial dreams, and leaving Taub to get down to the practical business of saving the company. “I think if I had waited another week,” the latter would later say, “it would have been too late.”

As it was, Digital Broadcasting teetered on the edge of bankruptcy for months, with Taub scrambling to secure new lines of credit to keep the existing creditors satisfied and, when all else failed, injecting more of his own money into the company. Through it all, he still had to deal with von Meister, who, as any student of his career to date could have predicted, soon had second thoughts about going away quietly — if, that is, he’d ever planned to do so in the first place. Taub learned that von Meister had taken much of Digital Broadcasting’s proprietary technology out the door with him, and was now shopping it around the telecommunications industry; that sparked a lawsuit on Taub’s behalf. Von Meister claimed his ejection had been illegal; that sparked another, going in the opposite direction. Apparently concluding that his promise not to sue von Meister for his mismanagement of the company was thus nullified, Taub counter-sued with exactly that charge. With a vengeful von Meister on his trail, he said that he couldn’t afford to “sleep with both eyes closed.”

By March of 1980, The Source had managed to attract about 3000 subscribers, but the online citizens were growing restless. Many features weren’t quite as advertised. The heavily hyped nightlife guides, for instance, mostly existed only for the Washington Beltway, the home of The Source. The email system was down about half the time, and even when it was allegedly working it was anyone’s guess whether a message that was sent would actually be delivered. Failings like these could be attributed easily enough to the usual technical growing pains, but other complaints carried with them an implication of nefarious intent. The Source’s customers could read the business pages of the newspaper as well as anyone, and knew that Jack Taub was fighting for his company’s life on multiple fronts. In that situation, some customers reasoned, there would be a strong incentive to find ways to bill them just that little bit more. Thus there were dark accusations that the supposedly user-friendly menu system had been engineered to be as verbose and convoluted as possible in order to maximize the time users spent online just trying to get to where they wanted to go. On a 110- or 300-baud connection — for comparison purposes, consider that a good touch typist could far exceed the former rate — receiving all these textual menus could take considerable time, especially given the laggy response time of the system as a whole whenever more than a handful of people were logged on. And for some reason, a request to log off the system in an orderly way simply didn’t work most of the time, forcing users to break the connection themselves. After they did so, it would conveniently — conveniently for The Source’s accountants, that is — take the system five minutes or so to recognize their absence and stop charging them.

A sampling of the many error messages with which early users of The Source became all too familiar.

The accusations of nefarious intent were, for what it’s worth, very unlikely to have had any basis in reality. Jack Taub was a hustler, but he wasn’t a con man. On the contrary, he was earnestly trying to save a company whose future he deeply believed in. His biggest problem was the government-secured loan, on which Digital Broadcasting Corporation had by now defaulted, forcing the Commerce Department to pay $3.2 million to the National Bank of North Carolina. The government bureaucrats, understandably displeased, were threatening to seize his company and dismantle it in the hope of getting at least some of that money back. They were made extra motivated by the fact that the whole affair had leaked into the papers, with the Washington Post in particular treating it as a minor public scandal, an example of Your Tax Dollars at Waste.

Improvising like mad, Taub convinced the government to allow him to make a $300,000 down payment, and thereafter to repay the money he owed over a period of up to 22 years at an interest rate of just 2 percent. Beginning in 1982, the company, now trading as The Source Telecomputing Corporation rather than Digital Broadcasting Corporation, would have to repay either $50,000 or 10 percent of their net profit each year, whichever was greater; beginning in 1993, the former figure would rise to $100,000 if the loan still hadn’t been repaid. “The government got a good deal,” claimed Taub. “They get 100 cents on the dollar, and get their money back faster if I’m able to do something with the company.” While some might have begged to differ with his characterization of the arrangement as a “good deal,” it was, the government must have judged, the best it was likely to get under the circumstances. “The question is to work out some kind of reasonable solution where you recover something rather than nothing,” said one official familiar with the matter. “While it sounds like they’re giving it away, they already did that. They already made their mistake with the original loan.”

With the deal with the Commerce Department in place, Taub convinced The Reader's Digest Association, publisher of the most popular magazine in the world, who were eager to get in on the ground floor of what was being billed in some circles as the next big thing in media, to buy 51 percent of The Source for $3 million in September of 1980, thus securing desperately needed operating capital. But when, shortly thereafter, a judge ruled in favor of von Meister on the charge that he had been unlawfully forced out of the company, Taub was left scrambling once again. He was forced to go back to Reader's Digest, convincing them this time to increase their stake to 80 percent, leaving only the remaining 20 percent in his own hands. And with that second capital injection to hand, he convinced von Meister to lay the court battle to rest with a settlement check for $1 million.

The Source had finally attained a measure of stability, and Jack Taub's extended triage could thus come to an end at last. Along the way, however, he had maneuvered himself out of his controlling interest and, soon, out of a job. Majority ownership having its privileges, Reader's Digest elected to replace him with one of their own: Graeme Keeping, the executive who had lobbied hardest to buy The Source in the first place. "Any publisher today, if he doesn't get into electronic publishing," Keeping was fond of saying, "is either going to be forced into it by economic circumstances or will have great difficulty staying in the paper-and-ink business."

The Source’s Prime computer systems, a millstone around their neck for years (although the monkey does seem to be enjoying them).

The Source may have found a safe harbor with one of the moneyed giants of American media, but it would never regain its early mojo. Keeping proved to be less than the strategic mastermind he believed himself to be, with a habit of over-promising and under-delivering — and, worse, of making terrible choices based on his own overoptimistic projections. The worst example of the tendency came early in his tenure, in the spring of 1981, when he was promising the New York Times he would have 60,000 subscribers by 1982. Determined to make sure he had the computing capacity to meet the demand, he cancelled the contract to use GTE Telenet's computing facilities, opening his own data center instead and filling it with his own machines. At a stroke, this destroyed a key part of the logistical economies which had done so much to spawn The Source (and, for that matter, CompuServe's MicroNET) in the first place. The Source's shiny new computers now sat idle during the day with no customers to service. Come 1982, The Source had only 20,000 subscribers, and all those expensive computers were barely ticking over even at peak usage. This move alone cost The Source millions. Meanwhile, the deal with Alcatel for custom-made terminals having fallen through during the chaos of Taub's tenure, Keeping made a new one with Zenith to make "a semi-intelligent terminal with a hole in the back through which you can turn it into a computer." That impractical flight of fancy also came to naught, but not before costing The Source more money. Such failures led to Keeping's ouster in June of 1982, to be replaced by another anodyne chief from the Reader's Digest executive pool named George Grune.

Soon after, Control Data Corporation, a maker of supercomputers, bought a 30 percent share of The Source for a reported $5 million. But even this latest injection of capital, technical expertise, and content — Control Data would eventually move much of their pioneering Plato educational network onto the service — changed little. The Source went through three more chief executives in the next two years. The user roll continued to grow, finally reaching 60,000 in September of 1984 — some two and a half years after Graeme Keeping's prediction, for those keeping score — but the company perpetually lost money, was perpetually about to turn the corner into mainstream acceptance and profitability but never actually did. Thanks not least to Keeping's data-center boondoggle, the hourly rate for non-prime usage had risen to $7.75 per hour by 1984, making this onetime pioneer that now felt more and more like an also-ran a hard sell in terms of dollars and cents as well. Neither the leading name in the online-services industry nor the one with the deepest pockets — there were limits to Reader's Digest's largess — The Source struggled to attract third-party content. A disturbing number of those 60,000 subscribers rarely or never logged on, paying only the minimum monthly charge of $10. One analyst noted that well-heeled computer owners "apparently are willing to pay to have these electronic services available, even if they don't use them regularly. From a business point of view, that's a formula for survival, but not for success."

The Source was fated to remain a survivor but never a real success for the rest of its existence. Back in Columbus, however, CompuServe’s consumer offering was on a very different trajectory. Begun in such a low-key way that Jeff Wilkins had refused even to describe it as being in competition with The Source, CompuServe’s erstwhile MicroNET — now re-branded as simply CompuServe, full stop — was going places of which its rival could only dream. Indeed, one might say it was going to the very places of which Bill von Meister had been dreaming in 1979.

(Sources: the book On the Way to the Web: The Secret History of the Internet and its Founders by Michael A. Banks and Stealing Time: Steve Case, Jerry Levin, and the Collapse of AOL Time Warner by Alec Klein; Creative Computing of March 1980; InfoWorld of April 14 1980, May 26 1980, January 11 1982, May 24 1982, and November 5 1984; Wall Street Journal of November 6 1979; Online Today of July 1989; 80 Microcomputing of November 1980; The Intelligent Machines Journal of March 14 1979 and June 25 1979; Washington Post of May 11 1937, July 10 1978, February 10 1980, and November 4 1980; Alexander Trevor’s brief technical history of CompuServe, which was first posted to Usenet in 1988; interviews with Jeff Wilkins from the Internet History Podcast and Conquering Columbus.)


A Full-Motion-Video Consulting Detective

Over the course of six months in 1967, 50 million people visited Expo ’67 in Montreal, one of the most successful international exhibitions in the history of the world. Representatives from 62 nations set up pavilions there, showcasing the cutting edge in science, technology, and the arts. The Czechoslovakian pavilion was a surprisingly large one, with a “fairytale area” for children, a collection of blown Bohemian glassware, a “Symphony of Developed Industry,” and a snack bar offering “famous Pilsen beer.” But the hit of the pavilion — indeed, one of the sleeper hits of the Expo as a whole — was to be found inside a small, nondescript movie theater. It was called Kinoautomat, and it was the world’s first interactive movie.

Visitors who attended a screening found themselves ushered to seats that sported an unusual accessory: large green and red buttons mounted to the seat backs in front of them. The star of the film, a well-known Czech character actor named Miroslav Horníček, trotted onto the tiny stage in front of the screen to explain that the movie the visitors were about to see was unlike any they had ever seen before. From time to time, the action would stop and he would pop up again to let the audience decide what his character did next onscreen. Each audience member would register which of the two choices she preferred by pressing the appropriate button, the results would be tallied, and simple majority rule would decide the issue.

As a film, Kinoautomat is a slightly risqué but otherwise harmless farce. The protagonist, a Mr. Novak, has just bought some flowers to give to his wife — it's her birthday today — and is waiting at home for her to return to their apartment when his neighbor's wife, an attractive young blonde, accidentally locks herself out of her own apartment with only a towel on. She frantically bangs on Mr. Novak's door, putting him in an awkward position and presenting the audience with their first choice. Should he let her in and try to explain the presence of a naked woman in their apartment to his wife when she arrives, or should he refuse the poor girl, leaving her to shiver in the altogether in the hallway? After this first choice is made, another hour or so of escalating misunderstanding and mass confusion ensues, during which the audience is given another seven or so opportunities to vote on what happens next.

Kinoautomat played to packed houses throughout the Expo’s run, garnering heaps of press attention in the process. Radúz Činčera, the film’s director and the entire project’s mastermind, was lauded for creating what was called by some critics one of the boldest innovations in the history of cinema. After the Expo was over, Činčera’s interactive movie theater was set up several more times in several other cities, always with a positive response, and Hollywood tried to open a discussion about licensing the technology behind it. But the interest and exposure gradually dissipated, perhaps partly due to a crackdown on “decadent” art by Czechoslovakia’s ruling Communist Party, but almost certainly due in the largest part to the logistical challenges involved in setting up the interactive movie theaters that were needed to show it. It was last shown at Expo ’74 in Spokane, Washington, after which it disappeared from screens and memories for more than two decades, to be rescued from obscurity only well into the 1990s, after the Iron Curtain had been thrown open, when it was stumbled upon once again by some of the first academics to study seriously the nature of interactivity in digital mediums.

Had Činčera’s experiment been better remembered at the beginning of the 1990s, it might have saved a lot of time for those game developers dreaming of making interactive movies on personal computers and CD-ROM-based set-top boxes. Sure, the technology Činčera had to work with was immeasurably more primitive; his branching narrative was accomplished by the simple expedient of setting up two film projectors at the back of the theater and having an attendant place a lens cap over whichever held the non-applicable reel. Yet the more fundamental issues he wrestled with — those of how to create a meaningfully interactive experience by splicing together chunks of non-interactive filmed content — remained unchanged more than two decades later.

The dirty little secret about Kinoautomat was that the interactivity in this first interactive film was a lie. Each branch the story took contrived only to give lip service to the audience’s choice, after which it found a way to loop back onto the film’s fixed narrative through-line. Whether the audience was full of conscientious empathizers endeavoring to make the wisest choices for Mr. Novak or crazed anarchists trying to incite as much chaos as possible — the latter approach, for what it’s worth, was by far the more common — the end result would be the same: poor Mr. Novak’s entire apartment complex would always wind up burning to the ground in the final scenes, thanks to a long chain of happenstance that began with that naked girl knocking on his door. Činčera had been able to get away with this trick thanks to the novelty of the experience and, most of all, thanks to the fact that his audience, unless they made the effort to come back more than once or to compare detailed notes with those who had attended other screenings, was never confronted with how meaningless their choices actually were.

While it had worked out okay for Kinoautomat, this sort of fake interactivity wasn’t, needless to say, a sustainable path for building the whole new interactive-movie industry — a union of Silicon Valley and Hollywood — which some of the most prominent names in the games industry were talking of circa 1990. At the same time, though, the hard reality was that to create an interactive movie out of filmed, real-world content that did offer genuinely meaningful, story-altering branches seemed for all practical purposes impossible. The conventional computer graphics that had heretofore been used in games, generated by the computer and drawn on the screen programmatically, were a completely different animal than the canned snippets of video which so many were now claiming would mark the proverbial Great Leap Forward. Conventional computer graphics could be instantly, subtly, and comprehensively responsive to the player’s actions. The snippets in what the industry would soon come to call a “full-motion-video” game could be mixed and matched and juggled, but only in comparatively enormous static chunks.

This might not sound like an impossible barrier in and of itself. Indeed, the medium of textual interactive fiction had already been confronted with seemingly similar contrasts in granularity between two disparate approaches which had both proved equally viable. As I’ve had occasion to discuss in an earlier article, a hypertext narrative built out of discrete hard branches is much more limiting in some ways than a parser-driven text adventure with its multitudinous options available at every turn — but, importantly, the opposite is also true. A parser-driven game that’s forever fussing over what room the player is standing in and what she’s carrying with her at any given instant is ill-suited to convey large sweeps of time and plot. Each approach, in other words, is best suited for a different kind of experience. A hypertext narrative can become a wide-angle exploration of life-changing choices and their consequences, while the zoomed-in perspective of the text adventure is better suited to puzzle-solving and geographical exploration — that is, to the exploration of a physical space rather than a story space.

And yet if we do attempt to extend a similar comparison to a full-motion-video adventure game versus one built out of conventional computer graphics, it may hold up in the abstract, but quickly falls apart in the realm of the practical and the specific. Although the projects exploring full-motion-video applications were among the most expensive the games industry of 1990 had ever funded, their budgets paled next to those of even a cheap Hollywood production. To produce full-motion-video games with meaningfully branching narratives would require their developers to stretch their already meager budgets far enough to shoot many, many non-interactive movies in order to create a single interactive movie, accepting that the player would see only a small percentage of all those hours of footage on any given play-through. And even assuming that the budget could somehow be stretched to allow such a thing, there were other practical concerns to reckon with; after all, even the wondrous new storage medium of CD-ROM had its limits in terms of capacity.

Faced with these issues, would-be designers of full-motion-video games did what all game designers do: they worked to find approaches that — since there was no way to bash through the barriers imposed on them — skirted around the problem.

They did have at least one example to follow or reject — one that, unlike Kinoautomat, virtually every working game designer knew well. Dragon's Lair, the biggest arcade hit of 1983, had been built out of a chopped-up cartoon which un-spooled from a laser disc housed inside the machine. It replaced all of the complications of branching plots with a simple do-or-die approach. The player needed to guide the joystick through just the right pattern of rote movements — a pattern identifiable only through extensive trial and error — in time with the video playing on the screen. Failure meant death, success meant the cartoon continued to the next scene — no muss, no fuss. But, as the many arcade games that had tried to duplicate Dragon's Lair's short-lived success had proved, it was hardly a recipe for a satisfying game once the novelty wore off.

Another option was to use full-motion video for cut scenes rather than as the real basis of a game, interspersing static video sequences used for purposes of exposition in between interactive sequences powered by conventional computer graphics. In time, this would become something of a default approach to the problem of full-motion video, showing up in games as diverse as the Wing Commander series of space-combat simulators, the Command & Conquer real-time strategy series, and even first-person shooters like Realms of the Haunting. But such juxtapositions would always be doomed to look a little jarring, the ludic equivalent of an animated film which from time to time switches to live action for no aesthetically valid reason. As such, this would largely become the industry’s fallback position, the way full-motion video wound up being deployed as a last resort after designers had failed to hit upon a less jarring formula. Certainly in the early days of full-motion video — the period we’re interested in right now — there still remained the hope that some better approach to the melding of computer game and film might be discovered.

The most promising approaches — the ones, that is, that came closest to working — often used full-motion video in the context of a computerized mystery. In itself, this is hardly surprising. Despite the well-known preference of gamers and game designers for science-fiction and fantasy scenarios, the genre of traditional fiction most obviously suited for ludic adaptation is in fact the classic mystery novel, the only literary genre that actively casts itself as a sort of game between writer and reader. A mystery novel, one might say, is really two stories woven together. One is that of the crime itself, which is committed before the book proper really gets going. The other is that of the detective’s unraveling of the crime; it’s here, of course, that the ludic element comes in, as the reader too is challenged to assemble the clues alongside the detective and try to deduce the perpetrator, method, and motive before they are revealed to her.

For a game designer wrestling with the challenges inherent in working with full-motion video, the advantages of this structure count double. The crime itself is that most blessed of things for a designer cast adrift on a sea of interactivity: a fixed story, an unchanging piece of solid narrative ground. In the realm of interactivity, then, the designer is only forced to deal with the investigation, a relatively circumscribed story space that isn’t so much about making a story as uncovering one that already exists. The player/detective juggles pieces of that already extant story, trying to slot them together to make the full picture. In that context, the limitations of full-motion video — all those static chunks of film footage that must be mixed and matched — suddenly don’t sound quite so limiting. Full-motion video, an ill-fitting solution that has to be pounded into place with a sledgehammer in most interactive applications, suddenly starts seeming like an almost elegant fit.

The origin story of the most prominent of the early full-motion-video mysteries, a product at the bleeding edge of technology at the time it was introduced, ironically stretches back to a time before computers were even invented. In 1935, J.G. Links, a prominent London furrier, came up with an idea to take the game-like elements of the traditional mystery novel to the next level. What if a crime could be presented to the reader not as a story about its uncovering but in a more unprocessed form, as a "dossier" of clues, evidence, and suspects? The reader would be challenged to assemble this jigsaw into a coherent description of who, what, when, and where. Then, when she thought she was ready, she could open a sealed envelope containing the solution to find out if she had been correct. Links pitched the idea to a friend of his who was well-positioned to see it through with him: Dennis Wheatley, a very popular writer of crime and adventure novels. Together Links and Wheatley created four "Dennis Wheatley Crime Dossiers," which enjoyed considerable success before the undertaking was stopped short by the outbreak of World War II. After the war, mysteries in game form drifted into the less verisimilitudinous but far more replayable likes of Cluedo, while non-digital interactive narratives moved into the medium of experiential wargames, which in turn led, in time, to the great tabletop-gaming revolution that was Dungeons & Dragons.

And that could very well have been the end of the story, leaving the Dennis Wheatley Crime Dossiers as merely a road not taken in game history, works ahead of their time that wound up getting stranded there. But in 1979 Mayflower Books began republishing the dossiers, a complicated undertaking that involved recreating the various bits of “physical evidence” — including pills, fabric samples, cigarette butts, and even locks of hair — that had accompanied them. There is little indication that their efforts were rewarded with major sales. Yet, coming as they did at a fraught historical moment for interactive storytelling in general — the first Choose Your Own Adventure book was published that same year; the game Adventure had hit computers a couple of years before; Dungeons & Dragons was breaking into the mainstream media — the reprinted dossiers’ influence would prove surprisingly pervasive with innovators in the burgeoning field. They would, for instance, provide Marc Blank with the idea of making a sort of crime dossier of his own to accompany Infocom’s 1982 computerized mystery Deadline, thereby establishing the Infocom tradition of scene-setting “feelies” and elaborate packaging in general. And another important game whose existence is hard to imagine without the example provided by the Dennis Wheatley Crime Dossiers appeared a year before Deadline.

Prior to the Mayflower reprints, the closest available alternative to the Crime Dossiers had been a 1975 Sherlock Holmes-starring board game called 221B Baker Street: The Master Detective Game. It plays like a more coherent version of Cluedo, thanks to its utilization of pre-crafted mysteries that are included in the box rather than a reliance on random combinations of suspects, locations, and weapons. Otherwise, however, the experience isn’t all that markedly different, with players rolling dice and moving their tokens around the game board, trying to complete their “solution checklists” before their rivals. The competitive element introduces a bit of cognitive dissonance that is never really resolved: this game of Sherlock Holmes actually features several versions of Holmes, all racing around London trying to solve each mystery before the others can. But more importantly, playing it still feels more like solving a crossword puzzle than solving a mystery.

Two of those frustrated by the limitations of 221B Baker Street were Gary Grady and Suzanne Goldberg, amateur scholars of Sherlock Holmes living in San Francisco. “A game like 221B Baker Street doesn’t give a player a choice,” Grady noted. “You have no control over the clue you’re going to get and there’s no relationship of the clues to the process of play. We wanted the idea of solving a mystery rather than a puzzle.” In 1979, with the negative example of 221B Baker Street and the positive example of the Dennis Wheatley Crime Dossiers to light the way, the two started work on a mammoth undertaking that would come to be known as Sherlock Holmes Consulting Detective upon its publication two years later. Packaged and sold as a board game, it in truth had much less in common with the likes of Cluedo or 221B Baker Street than it did with the Dennis Wheatley Crime Dossiers. Grady and Goldberg provided rules for playing competitively if you insisted, and a scoring system that challenged you to solve a case after collecting the least amount of evidence possible, but just about everyone who has played it agrees that the real joy of the game is simply in solving the ten labyrinthine cases, each worthy of an Arthur Conan Doyle story of its own, that are included in the box.

Each case is housed in a booklet of its own, whose first page or two sets up the mystery to be solved in rich prose that might indeed have been lifted right out of a vintage Holmes story. The rest of the booklet consists of more paragraphs to be read as you visit various locations around London, following the evidence trail wherever it leads. When you choose to visit someplace (or somebody), you look it up in the London directory that is included, which will give you a coded reference. If that code is included in the case’s booklet, eureka, you may just have stumbled upon more information to guide your investigation; at the very least, you’ve found something new to read. In addition to the case books, you have lovingly crafted editions of the London Times from the day of each case to scour for more clues; cleverly, the newspapers used for early cases can contain clues for later cases as well, meaning the haystack you’re searching for needles gets steadily bigger as you progress from case to case. You also have a map of London, which can become unexpectedly useful for tracing the movements of suspects. Indeed, each case forces you to apply a whole range of approaches and modes of thought to its solution. When you think you’re ready, you turn to the “quiz book” and answer the questions about the case therein, then turn the page to find out if you were correct.

If Sherlock Holmes Consulting Detective presents a daunting challenge to its player, the same must go ten times over for its designers. The amount of effort that must have gone into creating, collating, intertwining, and typesetting such an intricate web of information fairly boggles the mind. The game is effectively ten Dennis Wheatley Crime Dossiers in one box, all cross-referencing one another, looping back on one another. That Grady and Goldberg, working in an era before computerized word processing was widespread, managed it at all is stunning.

Unable to interest any of the established makers of board games in such an odd product, the two published it themselves, forming a little company called Sleuth Publications for the purpose. A niche product if ever there was one, it did manage to attract a champion in Games magazine, who called it “the most ingenious and realistic detective game ever devised.” The same magazine did much to raise its profile when they added it to their mail-order store in 1983. A German translation won the hugely prestigious Spiel des Jahres in 1985, a very unusual selection for a competition that typically favored spare board games of abstract logic. Over the years, Sleuth published a number of additional case packs, along with another boxed game in the same style: Gumshoe, a noirish experience rooted in Raymond Chandler rather than Arthur Conan Doyle which was less successful, both creatively and commercially, than its predecessor.

And then these elaborate analog productions, almost defiantly old-fashioned in their reliance on paper and text and imagination, became the unlikely source material for the most high-profile computerized mysteries of the early CD-ROM era.

The transformation would be wrought by ICOM Simulations, a small developer who had always focused their efforts on emerging technology. They had first made their name with the release of Déjà Vu on the Macintosh in 1985, one of the first adventure games to replace the parser with a practical point-and-click interface; in its day, it was quite the technological marvel. Three more games built using the same engine had followed, along with ports to many, many platforms. But by the time Déjà Vu II hit the scene in 1988, the interface was starting to look a little clunky and dated next to the efforts of companies like Lucasfilm Games, and ICOM decided it was time to make a change — time to jump into the unexplored waters of CD-ROM and full-motion video. They had always been technophiles first, game designers second, as was demonstrated by the somewhat iffy designs of most of their extant games. It therefore made a degree of sense to adapt someone else’s work to CD-ROM. They decided that Sherlock Holmes Consulting Detective, that most coolly intellectual of mystery-solving board games, would counter-intuitively adapt very well to a medium that was supposed to allow hotter, more immersive computerized experiences than ever before.

As we’ve already seen, the limitations of working with chunks of static text are actually very similar in some ways to those of working with chunks of static video. ICOM thus decided that the board game’s methods for working around those limitations should work very well for the computer game as well. The little textual vignettes which filled the case booklets, to be read as the player moved about London trying to solve the case, could be recreated by live actors. There would be no complicated branching narrative, just a player moving about London, being fed video clips of her interviews with suspects. Because the tabletop game included no mechanism for tracking where the player had already been and what she had done, the text in the case booklets had been carefully written to make no such presumptions. Again, this was perfect for a full-motion-video adaptation.

Gary Grady and Suzanne Goldberg were happy to license their work; after laboring all these years on such a complicated niche product, the day on which ICOM knocked on their door must have been a big one indeed. Ken Tarolla, the man who took charge of the project for ICOM, chose three of the ten cases from the original Sherlock Holmes Consulting Detective to serve as the basis of the computer game. He now had to reckon with the challenges of going from programming games to filming them. Undaunted, he had the vignettes from the case booklets turned into scripts by a professional screenwriter, cast 35 actors in the 50 speaking parts, and rented a sound stage in Minneapolis — far from ICOM’s Chicago offices, but needs must — for the shoot. The production wound up requiring 70 costumes along with 25 separate sets, a huge investment for a small developer like ICOM. In spite of their small size, they evinced a commitment to production values few of their peers could match. Notably, they didn’t take the money-saving shortcut of replacing physical sets with computer-generated graphics spliced in behind the actors. For this reason, their work holds up much better today than that of most of their peers.

Indeed, as befits a developer of ICOM’s established technical excellence — even if they were working in an entirely new medium — the video sequences are surprisingly good, the acting and set design about up to the standard of a typical daytime-television soap opera. If that seems like damning with faint praise, know that the majority of similar productions come off far, far worse. Peter Farley, the actor hired to play Holmes, may not be a Basil Rathbone, Jeremy Brett, or Benedict Cumberbatch, but neither does he embarrass himself. The interface is decent, and the game opens with a video tutorial narrated by Holmes himself — a clear sign of how hard Consulting Detective is straining to be the more mainstream, more casual form of interactive entertainment that the CD-ROM was supposed to precipitate.

First announced in 1990 and planned as a cross-platform product from the beginning, spanning the many rival CD-ROM initiatives on personal computers, set-top boxes, and game consoles, ICOM’s various versions of Consulting Detective were all delayed for long stretches by a problem which dogged every developer working in the same space: the struggle to find a way of getting video from CD-ROM to the screen at a reasonable resolution, frame rate, and number of colors. The game debuted in mid-1991 on the NEC TurboGrafx-16, an also-ran in the console wars which happened to be the first such device to offer a CD-ROM drive as an accessory. In early 1992, it made its way to the Commodore CDTV, thanks to a code library for video playback devised by Carl Sassenrath, long a pivotal figure in Amiga circles. Then, and most importantly in commercial terms, the slow advance of computing hardware finally made it possible to port the game to Macintosh and MS-DOS desktop computers equipped with CD-ROM drives later in the same year.

Sherlock Holmes Consulting Detective became a common sight in “multimedia upgrade kits” like this one from Creative Labs.

As one of the first and most audiovisually impressive products of its kind, Consulting Detective existed in an uneasy space somewhere between game and tech demo. It was hard for anyone who had never seen actual video featuring actual actors playing on a computer before to focus on much else when the game was shown to them. It was therefore frequently bundled with the “multimedia upgrade kits,” consisting of a sound card and CD-ROM drive, that were sold by companies like Creative Labs beginning in 1992. Thanks to these pack-in deals, it shipped in huge numbers by games-industry standards. Thus encouraged, ICOM went back to the well for a Consulting Detective Volume II and Volume III, each with another three cases from the original board game. These releases, however, did predictably less well without the advantages of novelty and of being a common pack-in item.

As I’ve noted already, Consulting Detective looks surprisingly good on the surface even today, while at the time of its release it was nothing short of astonishing. Yet it doesn’t take much playing time before the flaws start to show through. Oddly given the great care that so clearly went into its surface production, many of its problems feel like failures of ambition. As I’ve also already noted, no real state whatsoever is tracked by the game; you just march around London watching videos until you think you’ve assembled a complete picture of the case, then march off to trial, which takes the form of a quiz on who did what and why. If you go back to a place you’ve already been, the game doesn’t remember it: the same video clip merely plays again. This statelessness turns out to be deeply damaging to the experience. I can perhaps best explain by taking as an example the first case in the first volume of the series. (Minor spoilers do follow in the next several paragraphs. Skip down to the penultimate paragraph — beginning with “To be fair…” — to avoid them entirely.)

“The Mummy’s Curse” concerns the murder on separate occasions of all three of the archaeologists who have recently led a high-profile expedition to Egypt. One of the murders took place aboard the ship on which the expedition was returning to London, laden with treasures taken — today, we would say “looted” — from a newly discovered tomb. We can presume that one of the other passengers most likely did the deed. So, we acquire the passenger manifest for the ship and proceed to visit each of the suspects in turn. Among them are Mr. and Mrs. Fenwick, two eccentric members of the leisured class. Each of them claims not to have seen, heard, or otherwise had anything to do with the murder. But Louise Fenwick has a little dog, a Yorkshire terrier of whom she is inordinately fond and who traveled with the couple on their voyage. (Don’t judge the game too harshly from the excerpt below; it features some of the hammiest acting of all, with a Mrs. Fenwick who seems to be channeling Miss Piggy — a Miss Piggy, that is, with a fake English accent as horrid as only an American can make it.)


The existence of Mrs. Fenwick’s dog is very interesting in that the Scotland Yard criminologist who handled the case found some dog hair on the victim’s body. Our next natural instinct would be to find out whether the hair could indeed have come from a Yorkshire terrier — but revisiting Scotland Yard will only cause the video from there which we’ve already seen to play again. Thus stymied on that front, we probe further into Mrs. Fenwick’s background. We learn that the victim once gave a lecture before the Royal Society where he talked about dissecting his own Yorkshire terrier after its death, provoking the ire of the Anti-Vivisection League, of which Louise Fenwick is a member. And it gets still better: she personally harassed the victim, threatening to dissect him herself. Now, it’s very possible that this is all coincidence and red herrings, but it’s certainly something worth following up on. So we visit the Fenwicks again to ask her about it — and get to watch the video we already saw play again. Stymied once more.

This example hopefully begins to illustrate how Sherlock Holmes Consulting Detective breaks its promise to let you be the detective and solve the crime yourself in the way aficionados of mystery novels had been dreaming of doing for a century. Because the game never knows what you know, and because it lets you decide only where you go, never anything about what you do after you get there, playing it actually becomes much more difficult than being a “real” detective. You’re constantly being hobbled by all these artificial constraints. Again and again, you find yourself seething because you can’t ask the question Holmes would most certainly be asking in your situation. It’s a form of fake difficulty, caused by the constraints of the game engine rather than the nature of the case.

Consider once more, then, how this plays out in practice in “The Mummy’s Curse.” We pick up this potentially case-cracking clue about Mrs. Fenwick’s previous relations with the victim. If we’ve ever read a mystery novel or watched a crime drama, we know immediately what to do. Caught up in the fiction, we rush back to the Fenwicks without even thinking about it. We get there, and of course it doesn’t work; we just get the same old spiel. It’s a thoroughly deflating experience. This isn’t just a sin against mimesis; it’s wholesale mimesis genocide.

It is true that the board-game version of Consulting Detective suffers from the exact same flaws born of its own statelessness. By presenting a case strictly as a collection of extant clues to be put together rather than asking you to ferret them out for yourself — by in effect eliminating from the equation both the story of the crime and the story of the investigation which turned up the clues — the Dennis Wheatley Crime Dossiers avoid most of these frustrations, at the expense of feeling like drier, more static endeavors. I will say that the infelicities of Sherlock Holmes Consulting Detective in general feel more egregious in the computer version — perhaps because the hotter medium of video promotes a depth of immersion in the fiction that makes it feel like even more of a betrayal when the immersion breaks down; or, more prosaically, simply because we feel that the computer ought to be capable of doing a better job of things than it is, while we’re more forgiving of the obvious constraints of a purely analog design.

Of course, it was this very same statelessness that made the design such an attractive one for adaptation to full-motion video in the first place. In other words, the problems with the format which Kinoautomat highlighted in 1967 aren’t quite as easy to dodge around as ICOM perhaps thought. It does feel like ICOM could have done a little better on this front, even within the limitations of full-motion video. Would it have killed them to provide a few clips instead of just one for some of the key scenes, with the one that plays dependent on what the player has already learned? Yes, I’m aware that that has the potential to become a very slippery slope indeed. But still… work with us just a bit, ICOM.
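To make the suggestion concrete: the sort of knowledge-gated scene table imagined above is a very small amount of bookkeeping. The sketch below is purely illustrative (ICOM’s actual engine is not documented here, and every location, clue, and filename in it is invented), but it shows how a handful of alternate clips per key scene could be selected based on what the player has already learned:

```python
# A minimal sketch (not ICOM's actual engine) of how a full-motion-video
# adventure could pick among a few pre-filmed clips for one scene based
# on what the player has already learned. All names here are invented
# for illustration.

visited = set()      # locations the player has already been to
known_facts = set()  # clues the player has picked up so far

# Each scene offers a default clip plus optional variants, each keyed
# to a fact the player must already know for that variant to make sense.
scenes = {
    "fenwick_residence": {
        "default": "fenwick_intro.mov",
        "variants": [
            ("vivisection_lecture", "fenwick_confronted.mov"),
            ("dog_hair_on_body", "fenwick_dog_questions.mov"),
        ],
    },
}

def clip_for(location):
    """Return the most specific clip the player's knowledge unlocks."""
    scene = scenes[location]
    # Later variants take precedence, so the scene author orders them
    # from least to most revealing.
    chosen = scene["default"]
    for fact, clip in scene["variants"]:
        if fact in known_facts:
            chosen = clip
    visited.add(location)
    return chosen
```

Even a table this crude would let a return visit to the Fenwicks acknowledge the Anti-Vivisection League clue. The authoring cost is a few extra filmed clips per key scene, not a fully branching narrative.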

While I don’t want to spend too much more time pillorying this pioneering but flawed game, I do have to point out one more issue: setting aside the problems that arise from the nature of the engine, the cases themselves are often seriously flawed. They’ve all been shortened and simplified in comparison to the board game, which gives rise to some of the issues. Still, not everything in the board game itself is unimpeachable. Holmes’s own narratives of the cases’ solutions, which follow after you complete them by answering all of the questions in the trial phases correctly, are often rife with questionable assumptions and intuitive leaps that would never hold up on an episode of Perry Mason, much less a real trial. At the conclusion of “The Mummy’s Curse,” for instance, he tells us there was “no reason to assume” that the three archaeologists weren’t all killed by the same person. Fair enough — but there is also no reason to assume the opposite, no reason to assume we aren’t dealing with a copycat killer or killers, given that all of the details surrounding the first of the murders were published on the front page of the London Times. And yet Holmes’s entire solution to the case follows from exactly that questionable assumption. It serves, for example, as his logic for eliminating Mrs. Fenwick as a suspect, since she had neither motive nor opportunity to kill the other two archaeologists.

To be fair to Gary Grady and Suzanne Goldberg, this case is regarded by fans of the original board game as the weakest of all ten (it actually shows up as the sixth case there). Why ICOM chose to lead with this of all cases is the greatest mystery of all. Most of the ones that follow are better — but rarely, it must be said, as airtight as our cocky friend Holmes would have them be. But then, in this sense ICOM is perhaps only being true to the Sherlock Holmes canon. For all Holmes’s purported devotion to rigorous logic, Arthur Conan Doyle’s tales never play fair with readers hoping to solve the mysteries for themselves, hinging always on similar logical fallacies and superhuman intuitive leaps. If one chooses to read the classic Sherlock Holmes stories — and many of them certainly are well worth reading — it shouldn’t be in the hope of solving their mysteries before he does.

The three volumes of Sherlock Holmes Consulting Detective would, along with a couple of other not-quite-satisfying full-motion-video games, mark the end of the line for ICOM. Faced with the mounting budgets that made it harder and harder for a small developer to survive, they left the gaming scene quietly in the mid-1990s. The catalog of games they left behind is a fairly small one, but includes in Déjà Vu and Consulting Detective two of the most technically significant works of their times. The Consulting Detective games were by no means the only interactive mysteries of the early full-motion-video era; a company called Tiger Media also released a couple of mysteries on CD-ROM, with a similar set of frustrating limitations, and the British publisher Domark even announced but never released a CD-ROM take on one of the old Dennis Wheatley Crime Dossiers. The ICOM mysteries were, however, the most prominent and popular. Flawed though they are, they remain fascinating historical artifacts with much to teach us: about the nature of those days when seeing an actual video clip playing on a monitor screen was akin to magic; about the perils and perhaps some of the hidden potential of building games out of real-world video; about game design in general. In that spirit, we’ll be exploring more experiments with full-motion video in articles to come, looking at how they circumvented — or failed to circumvent — the issues that dogged Kinoautomat, Dragon’s Lair, and Sherlock Holmes Consulting Detective alike.

(Sources: the book Media and Participation by Nico Carpentier; Byte of May 1992; Amazing Computing of May 1991, July 1991, March 1992, and May 1992; Amiga Format of March 1991; Amiga Computing of October 1992; CD-ROM Today of July 1993; Computer Gaming World of January 1991, August 1991, June 1992, and March 1993; Family Computing of February 1984; Softline of September 1982; Questbusters of July 1991 and September 1991; CU Amiga of October 1992. Online sources include Joe Pranevich’s interview with Dave Marsh on The Adventure Gamer; the home page of Kinoautomat today; Expo ’67 in Montreal; and Brian Moriarty’s annotated excerpt from Kinoautomat, taken from his lecture “I Sing the Story Electric.”

Some of the folks who once were ICOM Simulations have remastered the three cases from the first volume of the series and now sell them on Steam. The Sherlock Holmes Consulting Detective tabletop line is in print again thanks to Asmodee Games. While I don’t love it quite as much as some do due to some of the issues mentioned in this article, it’s still a unique experience today that’s well worth checking out.)

 
 


The 68000 Wars, Part 5: The Age of Multimedia

A group of engineers from Commodore dropped in unannounced on the monthly meeting of the San Diego Amiga Users Group in April of 1988. They said they were on their way to West Germany with some important new technology to share with their European colleagues. With a few hours to spare before they had to catch their flight, they’d decided to share it with the user group’s members as well.

They had with them nothing less than the machine that would soon be released as the next-generation Amiga: the Amiga 3000. From the moment they powered it up to display the familiar Workbench startup icon re-imagined as a three-dimensional ray-traced rendering, the crowd was in awe. The new model sported a 68020 processor running at more than twice the clock speed of the old 68000, with a set of custom chips redesigned to match its throughput; graphics in 2 million colors instead of 4096, shown at non-interlaced — read, non-flickering — resolutions of 640 X 400 and beyond; an AmigaOS 2.0 Workbench that looked far more professional than the garish version 1.3 that was shipping with current Amigas. The crowd was just getting warmed up when the team said they had to run. They did, after all, have a plane to catch.

Word spread like crazy over the online services. Calls poured in to Commodore’s headquarters in West Chester, Pennsylvania, but the staff there didn’t seem to know what any of the callers were talking about. Clearly this must be a very top-secret project; the engineering team must have committed a major breach of protocol by jumping the gun as they had. Who would have dreamed that Commodore was already in the final stages of a project which the Amiga community had been begging them just to get started on?

Who indeed? The whole thing was a lie. The tip-off was right there in the April date of the San Diego Users Group Meeting. The president of the group, along with a few co-conspirators, had taken a Macintosh II motherboard and shoehorned it into an Amiga 2000 case. They’d had “Amiga 3000” labels typeset and stuck them on the case, and created some reasonable-looking renderings of Amiga applications, just enough to get them through the brief amount of time their team of “Commodore engineers” — actually people from the nearby Los Angeles Amiga Users Group — would spend presenting the package. When the truth came out, some in the Amiga community congratulated the culprits for a prank well-played, while others were predictably outraged. What hurt more than the fact that they had been fooled was the reality that a Macintosh that was available right now had been able to impersonate an Amiga that existed only in their dreams. If that wasn’t an ominous sign for their favored platform’s future, it was hard to say what would be.

Of course, this combination of counterfeit hardware and sketchy demos, no matter how masterfully acted before the audience, couldn’t have been all that convincing to a neutral observer with a modicum of skepticism. Like all great hoaxes, this one succeeded because it built upon what its audience already desperately wanted to believe. In doing so, it inadvertently provided a preview of what it would mean to be an Amiga user in the future: an ongoing triumph of hope over hard-won experience. It’s been said before that the worst thing you can do is to enter into a relationship in the hope that you will be able to change the other party. Amiga users would have reason to learn that lesson over and over again: Commodore would never change. Yet many would never take the lesson to heart. To be an Amiga user would be always to be fixated upon the next shiny object out there on the horizon, always to be sure this would be the thing that would finally turn everything around, only to be disappointed again and again.

Hoaxes aside, rumors about the Amiga 3000 had been swirling around since the introduction of the 500 and 2000 models in 1987. But for a long time a rumor was all the new machine was, even as the MS-DOS and Macintosh platforms continued to evolve apace. Commodore’s engineering team was dedicated and occasionally brilliant, but their numbers were tiny in comparison to those of comparable companies, much less bigger ones like Apple and IBM, the latter of whose annual research budget was greater than Commodore’s total sales. And Commodore’s engineers were perpetually underpaid and underappreciated by their managers to boot. The only real reason for a top-flight engineer to work at Commodore was love of the Amiga itself. In light of the conditions under which they were forced to work, what the engineering staff did manage to accomplish is remarkable.

After the crushing disappointment that had been the 1989 Christmas season, when Commodore’s last and most concerted attempt to break the Amiga 500 into the American mainstream had failed, it didn’t take hope long to flower again in the new year. “The chance for an explosive Amiga market growth is still there,” wrote Amazing Computing at that time, in a line that could have summed up the sentiment of every issue they published between 1986 and 1994.

Still, reasons for optimism seemingly did exist. For one thing, Commodore’s American operation had another new man in charge, an event which always brought with it the hope that the new boss might not prove the same as the old boss. Replacing the unfortunately named Max Toy was Harold Copperman, a real, honest-to-goodness computer-industry veteran, coming off a twenty-year stint with IBM, followed by two years with Apple; he had almost literally stepped offstage from the New York Mac Business Expo, where he had introduced John Sculley to the speaker’s podium, and into his new office at Commodore. With the attempt to pitch the Amiga 500 to low-end users as the successor to the Commodore 64 having failed to gain any traction, the biggest current grounds for optimism were that Copperman, whose experience was in business computers, could make inroads into that market for the higher-end Amiga models. Rumor had it that the dismissal of Toy and the hiring of Copperman had occurred following a civil war that had riven the company, with one faction — Toy apparently among them — saying Commodore should de-emphasize the Amiga in favor of jumping on the MS-DOS bandwagon, while the other faction saw little future — or, perhaps better said, little profit margin — in becoming just another maker of commodity clones. If you were an Amiga fan, you could at least breathe a sigh of relief that the right side had won out in that fight.

The Amiga 3000

It was in that hopeful spring of 1990 that the real Amiga 3000, a machine custom-made for the high-end market, made its bow. It wasn’t a revolutionary update to the Amiga 2000 by any means, but it did offer some welcome enhancements. In fact, it bore some marked similarities to the hoax Amiga 3000 of 1988. For instance, replacing the old 68000 was a 32-bit 68030 processor, and replacing AmigaOS 1.3 was the new and much-improved — both practically and aesthetically — AmigaOS 2.0. The flicker of the interlaced graphics modes could finally be a thing of the past, at least if the user sprang for the right type of monitor, and a new “super-high resolution” mode of 1280 X 400 was available, albeit with only four onscreen colors. The maximum amount of “chip memory” — memory that could be addressed by the machine’s custom chips, and thus could be fully utilized for graphics and sound — had already increased from 512 K to 1 MB with the release of a “Fatter Agnus” chip, which could be retrofitted into older examples of the Amiga 500 and 2000, in 1989. Now it increased to 2 MB with the Amiga 3000.

The rather garish and toy-like AmigaOS 1.3 Workbench.

The much slicker Workbench 2.0.

So, yes, the Amiga 3000 was very welcome, as was any sign of technological progress. Yet it was also hard not to feel a little disappointed that, five years after the unveiling of the first Amiga, the platform had only advanced this far. The hard fact was that Commodore’s engineers, forced to work on a shoestring as they were, were still tinkering at the edges of the architecture that Jay Miner and his team had devised all those years before rather than truly digging into it to make the more fundamental changes that were urgently needed to keep up with the competition. The interlace flicker was eliminated, for instance, not by altering the custom chips themselves but by hanging an external “flicker fixer” onto the end of the bus to de-interlace the interlaced output they still produced before it reached the monitor. And the custom chips still ran no faster than they had in the original Amiga, meaning the hot new 68030 had to slow down to a crawl every time it needed to access the chip memory it shared with them. The color palette remained stuck at 4096 shades, and, with the exception of the new super-high resolution mode, whose weirdly stretched pixels and four colors limited its usability, the graphics modes as a whole remained unchanged. Amiga owners had spent years mocking the Apple Macintosh and the Atari ST for their allegedly unimaginative, compromised designs, contrasting them continually with Jay Miner’s elegant dream machine. Now, that argument was getting harder to make; the Amiga too was starting to look a little compromised and inelegant.

Harold Copperman personally introduced the Amiga 3000 in a lavish event — lavish at least by Commodore’s standards — held at New York City’s trendy Palladium nightclub. With CD-ROM in the offing and audiovisual standards improving rapidly across the computer industry, “multimedia” stood with the likes of “hypertext” as one of the great buzzwords of the age. Commodore was all over it, even going so far as to name the event “Multimedia Live!” From Copperman’s address:

It’s our turn. It’s our time. We had the technology four and a half years ago. In fact, we had the product ready for multimedia before multimedia was ready for a product. Today we’re improving the technology, and we’re in the catbird seat. It is our time. It is Commodore’s time.

I’m at Commodore just as multimedia becomes the most important item in the marketplace. Once again I’m with the leader. Of course, in this industry a leader doesn’t have any followers; he just has a lot of other companies trying to pass him by. But take a close look: the other companies are talking multimedia, but they’re not doing it. They’re a long way behind Commodore — not even close.

Multimedia is a first-class way for conveying a message because it takes the strength of the intellectual content and adds the verve — the emotion-grabbing, head-turning, pulse-raising impact that comes from great visuals plus a dynamic soundtrack. For everyone with a message to deliver, it unleashes extraordinary ability. For the businessman, educator, or government manager, it turns any ordinary meeting into an experience.

In a way, this speech was cut from the same cloth as the Amiga 3000 itself. It was certainly a sign of progress, but was it progress enough? Even as he sounded more engaged and more engaging than had plenty of other tepid Commodore executives, Copperman inadvertently pointed out much of what was still wrong with the organization he helmed. He was right that Commodore had had the technology to do multimedia for a long time; as I’ve argued at length elsewhere, the Amiga was in fact the world’s first multimedia personal computer, all the way back in 1985. Still, the obvious question one is left with after reading the first paragraph of the extract above is why, if Commodore had the technology to do multimedia four and a half years ago, they’ve waited until now to tell anyone about it. In short, why is the world of 1990 “ready” for multimedia when the world of 1985 wasn’t? Contrary to Copperman’s claim about being a leader, Commodore’s own management had begun to evince an understanding of what the Amiga was and what made it special only after other companies had started building computers similar to it. Real business leaders don’t wait around for the world to decide it’s ready for their products; they make products the world doesn’t yet know it needs, then tell it why it needs them. Five years after being gifted with the Amiga, which stands alongside the Macintosh as one of the two most visionary computers of the 1980s precisely because of its embrace of multimedia, Commodore managed at this event to give every impression that they were multimedia bandwagon jumpers.

The Amiga 3000 didn’t turn into the game changer the faithful were always dreaming of. It sold moderately, mostly to the established Amiga hardcore, but had little obvious effect on the platform’s overall marketplace position. Harold Copperman was blamed for the disappointment, and was duly fired by Irving Gould, the principal shareholder and ultimate authority at Commodore, at the beginning of 1991. The new company line became an exact inversion of that which had held sway at the time of the Amiga 3000’s introduction: Copperman’s expertise was business computing, but Commodore’s future lay in consumer computing. Jim Dionne, head of Commodore’s Canadian division and supposedly an expert consumer marketer, was brought in to replace him.

An old joke began to make the rounds of the company once again. A new executive arrives at his desk at Commodore and finds three envelopes in the drawer, each labelled “open in case of emergency” and numbered one, two, and three. When the company gets into trouble for the first time on his watch, he opens the first envelope. Inside is a note: “Blame your predecessor.” So he does, and that saves his bacon for a while, but then things go south again. He opens the second envelope: “Blame your vice-presidents.” So he does, and gets another lease on life, but of course it only lasts a little while. He opens the third envelope. “Prepare three envelopes…” he begins to read.

Yet anyone who happened to be looking closely might have observed that the firing of Copperman represented something more than the usual shuffling of the deck chairs on the S.S. Commodore. Upon his promotion, it was made clear to Jim Dionne that he was to be held on a much shorter leash than his predecessors, his authority carefully circumscribed. Filling the power vacuum was one Mehdi Ali, a lawyer and finance guy who had come to Commodore a couple of years before as a consultant and had since ingratiated himself more and more with Irving Gould. Now he advanced to the title of president of Commodore International, Gould’s right-hand man in running the global organization; indeed, he seemed to be calling far more shots these days than his globe-trotting boss, who never seemed to be around when you needed him anyway. Ali’s rise would not prove a happy event for anyone who cared about the long-term health of the company.

For now, though, the full import of the changes in Commodore’s management structure was far from clear. Amiga users were on to the next Great White Hope, one that in fact had already been hinted at in the Palladium as the Amiga 3000 was being introduced. Once more “multimedia” would be the buzzword, but this time the focus would go back to the American consumer market Commodore had repeatedly failed to capture with the Amiga 500. The clue had been there in a seemingly innocuous, almost throwaway line from the speech delivered to the Palladium crowd by C. Lloyd Mahaffrey, Commodore’s director of marketing: “While professional users comprise the majority of the multimedia-related markets today, future plans call for penetration into the consumer market as home users begin to discover the benefits of multimedia.”

Commodore’s management, (proud?) owners of the world’s first multimedia personal computer, had for most of the latter 1980s been conspicuous by their complete disinterest in their industry’s initial forays into CD-ROM, the storage medium that, along with the graphics and sound hardware the Amiga already possessed, could have been the crowning piece of the platform’s multimedia edifice. The disinterest persisted in spite of the subtle and eventually blatant hints that were being dropped by people like Cinemaware’s Bob Jacob, whose pioneering “interactive movies” were screaming to be liberated from the constraints of 880 K floppy disks.

In 1989, a tiny piece of Commodore’s small engineering staff — described as “mavericks” by at least one source — resolved to take matters into their own hands, mating an Amiga with a CD-ROM drive and preparing a few demos designed to convince their managers of the potential that was being missed. Management was indeed convinced by the demo — but convinced to go in a radically different direction from that of simply making a CD-ROM drive that could be plugged into existing Amigas.

The Dutch electronics giant Philips had been struggling for what seemed like forever to finish something they envisioned as a whole new category of consumer electronics: a set-top box for the consumption of interactive multimedia content on CD. They called it CD-I, and it was already very, very late. Originally projected for release in time for the Christmas of 1987, its constant delays had left half the entertainment-software industry, who had invested heavily in the platform, in limbo on the whole subject of CD-ROM. What if Commodore could steal Philips’s thunder by combining a CD-ROM drive with the audiovisually capable Amiga architecture not in a desktop computer but in a set-top box of their own? This could be the magic bullet they’d been looking for, the long-awaited replacement for the Commodore 64 in American living rooms.

The industry’s fixation on these CD-ROM set-top boxes — a fixation which was hardly confined to Philips and Commodore alone — perhaps requires a bit of explanation. One thing these gadgets were not, at least if you listened to the voices promoting them, was game consoles. The set-top boxes could be used for many purposes, from displaying multimedia encyclopedias to playing music CDs. And even when they were used for pure interactive entertainment, it would be, at least potentially, adult entertainment (a term that was generally not meant in the pornographic sense, although some were already muttering about the possibilities that lurked therein as well). This was part and parcel of a vision that came to dominate much of digital entertainment between about 1989 and 1994: that of a sort of grand bargain between Northern and Southern California, a melding of the new interactive technologies coming out of Silicon Valley with the movie-making machine of Hollywood. Much of television viewing, so went the argument, would become interactive, the VCR replaced with the multimedia set-top box.

In light of all this conventional wisdom, Commodore’s determination to enter the fray — effectively to finish the job that Philips couldn’t seem to — can all too easily be seen as just another example of the me-too-ism that had clung to their earlier multimedia pronouncements. At the time, though, the project was exciting enough that Commodore was able to lure quite a number of prominent names to work with them on it. Carl Sassenrath, who had designed the core of the original AmigaOS — including its revolutionary multitasking capability — signed on again to adapt his work to the needs of a set-top box. (“In many ways, it was what we had originally dreamed for the Amiga,” he would later say of the project, a telling quote indeed.) Jim Sachs, still the most famous of Amiga artists thanks to his work on Cinemaware’s Defender of the Crown, agreed to design the look of the user interface. Reichart von Wolfsheild and Leo Schwab, both well-known Amiga developers, also joined. And for the role of marketing evangelist Commodore hired none other than Nolan Bushnell, the founder almost two decades before of Atari, the very first company to place interactive entertainment in American living rooms. The project as a whole was placed in the capable hands of Gail Wellington, known throughout the Amiga community as the only Commodore manager with a dollop of sense. The gadget itself came to be called CDTV — an acronym, Commodore would later claim in a part of the sales pitch that fooled no one, for “Commodore Dynamic Total Vision.”

Nolan Bushnell, Mr. Atari himself, plugs CDTV at a trade show.

Commodore announced CDTV at the Summer Consumer Electronics Show in June of 1990, inviting selected attendees to visit a back room and witness a small black box, looking for all the world like a VCR or a stereo component, running some simple demos. From the beginning, they worked hard to disassociate the product from the Amiga and, indeed, from computers in general. The word “Amiga” appeared nowhere on the hardware or anywhere on the packaging, and if all went according to plan CDTV would be sold next to televisions and stereos in department stores, not in computer shops. Commodore pointed out that everything from refrigerators to automobiles contained microprocessors these days, but no one called those things computers. Why should CDTV be any different? It required no monitor, instead hooking up to the family television set. It neither included nor required a keyboard — much industry research had supposedly proved that non-computer users feared keyboards more than anything else — nor even a mouse, being controlled entirely through a remote control that looked pretty much like any other specimen of same one might find between the cushions of a modern sofa. “If you know how to change TV channels,” said a spokesman, “you can take full advantage of CDTV.” It would be available, Commodore claimed, before the Christmas of 1990, which should be well before CD-I despite the latter’s monumental head start.

That timeline sounded overoptimistic even when it was first announced, and few were surprised to see the launch date slip into 1991. But the extra time did allow a surprising number of developers to jump aboard the CDTV train. Commodore had never been good at developer relations, and weren’t terribly good at it now; developers complained that the tools Commodore provided were always late and inadequate and that help with technical problems wasn’t easy to come by, while financial help was predictably nonexistent. Still, lots of CD-I projects had been left in limbo by Philips’s dithering and were attractive targets for adaptation to CDTV, while the new platform’s Amiga underpinnings made it fairly simple to port over extant Amiga games like SimCity and Battle Chess. By early 1991, Commodore could point to about fifty officially announced CDTV titles, among them products from such heavy hitters as Grolier, Disney, Guinness (the publisher, not the beer company), Lucasfilm, and Sierra. This relatively long list of CDTV developers certainly seemed a good sign, even if not all of the products they proposed to create looked likely to be all that exciting, or perhaps even all that good. Plenty of platforms, including the original Amiga, had launched with much less.

While the world — or at least the Amiga world — held its collective breath waiting for CDTV’s debut, the charismatic Nolan Bushnell did what he had been hired to do: evangelize like crazy. “What we are really trying to do is make multimedia a reality, and I think we’ve done that,” he said. The hyperbole was flying thick and fast from all quarters. “This will change forever the way we communicate, learn, and entertain,” said Irving Gould. Not to be outdone, Bushnell noted that “books were great in their day, but books right now don’t cut it. They’re obsolete.” (Really, why was everyone so determined to declare the death of the book during this period?)

CDTV being introduced at the 1991 World of Amiga show. Doing the introducing is Gail Wellington, head of the CDTV project and one of the unsung heroes of Commodore.

The first finished CDTV units showed up at the World of Amiga show in New York City in April of 1991; Commodore sold their first 350 to the Amiga faithful there. A staggered roll-out followed: to five major American cities, Canada, and the Commodore stronghold of Britain in May; to France, Germany, and Italy in the summer; to the rest of the United States in time for Christmas. With CD-I now four years late, CDTV thus became the first CD-ROM-based set-top box you could actually go out and buy. Doing so would set you back just under $1000.

The Amiga community, despite being less than thrilled by the excision of all mention of their platform’s name from the product, greeted the launch with the same enthusiasm they had lavished on the Amiga 3000, their Great White Hope of the previous year, or for that matter the big Christmas marketing campaign of 1989. Amazing Computing spoke with bated breath of CDTV becoming the “standard for interactive multimedia consumer hardware.”

“Yes, but what is it for?” These prospective customers’ confusion is almost palpable.

Alas, there followed a movie we’ve already seen many times. Commodore’s marketing was ham-handed as usual, declaring CDTV “nothing short of revolutionary” but failing to describe in clear, comprehensible terms why anyone who was more interested in relaxing on the sofa than fomenting revolutions might actually want one. The determination to disassociate CDTV from the scary world of computers was so complete that the computer magazines weren’t even allowed advance models; Amiga Format, the biggest Amiga magazine in Britain at the time with a circulation of more than 160,000, could only manage to secure their preview unit by making a side deal with a CDTV developer. CDTV units were instead sent to stereo magazines, who shrugged their shoulders at this weird thing this weird computer company had sent them and returned to reviewing the latest conventional CD players. Nolan Bushnell, the alleged marketing genius who was supposed to be CDTV’s ace in the hole, talked a hyperbolic game at the trade shows but seemed otherwise disengaged, happy just to show up and give his speeches and pocket his fat paychecks. One could almost suspect — perish the thought! — that he had only taken this gig for the money.

In the face of all this, CDTV struggled mightily to make any headway at all. When CD-I hit the market just before Christmas, boasting more impressive hardware than CDTV for roughly the same price, it only made the hill that much steeper. Commodore now had a rival in a market category whose very existence consumers still obstinately refused to recognize. As an established maker of consumer electronics in good standing with the major retailers — something Commodore hadn’t been since the heyday of the Commodore 64 — Philips had lots of advantages in trying to flog their particular white elephant, not to mention an advertising budget their rival could only dream of. CD-I was soon everywhere, on store shelves and in the pages of the glossy lifestyle magazines, while CDTV was almost nowhere. Commodore did what they could, cutting the list price of CDTV to less than $800 and bundling with it The New Grolier Encyclopedia and the smash Amiga game Lemmings. It didn’t help. After an ugly Christmas season, Nolan Bushnell and the other big names all deserted the sinking ship.

Even leaving aside the difficulties inherent in trying to introduce people to an entirely new category of consumer electronics — difficulties that were only magnified by Commodore’s longstanding marketing ineptitude — CDTV had always been problematic in ways that had been all too easy for the true believers to overlook. It was clunky in comparison to CD-I, with a remote control that felt awkward to use, especially for games, and a drive which required that the discs first be placed into an external holder before being loaded into the unit proper. More fundamentally, the very re-purposing of old Amiga technology that had allowed it to beat CD-I to market made it an even more limited platform than its rival for running the sophisticated adult entertainments it was supposed to have enabled. Much of the delay in getting CD-I to market had been the product of a long struggle to find a way of doing video playback with some sort of reasonable fidelity. Even the released CD-I performed far from ideally in this area, but it did better than CDTV, which at best — at best, mind you — might be able to fill about a third of the television screen with low-resolution video running at a choppy twelve frames per second. It was going to be hard to facilitate a union of Silicon Valley and Hollywood with technology like that.

None of CDTV’s problems were the fault of the people who had created it, who had, like so many Commodore engineers before and after them, been asked to pull off a miracle on a shoestring. They had managed to create, if not quite a miracle, something that worked far better than it had a right to. It just wasn’t quite good enough to overcome the marketing issues, the competition from CD-I, and the marketplace confusion engendered by an interactive set-top box that said it wasn’t a game console but definitely wasn’t a home computer either.

CDTV could be outfitted with a number of accessories that turned it into more of a “real” computer. Still, those making software for the system couldn’t count on any of these accessories being present, which served to greatly restrict their products’ scope of possibility.

Which isn’t to say that some groundbreaking work wasn’t done by the developers who took a leap of faith on Commodore — almost always a bad bet in financial terms — and produced software for the platform. CDTV’s early software catalog was actually much more impressive than that of CD-I, whose long gestation had caused so many initially enthusiastic developers to walk away in disgust. The New Grolier Encyclopedia was a true multimedia encyclopedia; the entry for John F. Kennedy, for example, included not only a textual biography and photos to go along with it but audio excerpts from his most famous speeches. The American Heritage Dictionary also offered images where relevant, along with an audio pronunciation of every single word. American Vista: The Multimedia U.S. Atlas boasted lots of imagery of its own to add flavor to its maps, and could plan a route between any two points in the country at the click of a button. All of these things may sound ordinary today, but in a way that very modern ordinariness is a testament to what pioneering products these really were. They did in fact present an argument that, while others merely talked about the multimedia future, Commodore through CDTV was doing it — imperfectly and clunkily, yes, but one has to start somewhere.

One of the most impressive CDTV titles of all marked the return of one of the Amiga’s most beloved icons. After designing the CDTV’s menu system, the indefatigable Jim Sachs returned to the scene of his most famous creation. Really a remake rather than a sequel, Defender of the Crown II reintroduced many of the graphics and tactical complexities that had been excised from the original in the name of saving time, pairing them with a full orchestral soundtrack, digitized sound effects, and a narrator to detail the proceedings in the appropriate dulcet English accent. It was, Sachs said, “the game the original Defender of the Crown was meant to be, both in gameplay and graphics.” He did almost all of the work on this elaborate multimedia production all by himself, farming out little more than the aforementioned narration, and Commodore themselves released the game, having acquired the right to do so from the now-defunct Cinemaware at auction. While, as with the original, its long-term play value is perhaps questionable, Defender of the Crown II even today still looks and sounds mouth-wateringly gorgeous.


If any one title on CDTV was impressive enough to sell the machine by itself, this ought to have been it. Unfortunately, it didn’t appear until well into 1992, by which time CDTV already had the odor of death clinging to it. The very fact that Commodore allowed the game to be billed as the sequel to one so intimately connected to the Amiga’s early days speaks to a marketing change they had instituted to try to breathe some life back into the platform.

The change was born out of an insurrection staged by Commodore’s United Kingdom branch, who always seemed to be about five steps ahead of the home office in any area you cared to name. Kelly Sumner, managing director of Commodore UK:

We weren’t involved in any of the development of CDTV technology; that was all done in America. We were taking the lead from the corporate company. And there was a concrete stance of “this is how you promote it, this is the way forward, don’t do this, don’t do that.” So, that’s what we did.

But after six or eight months we basically turned around and said, “You don’t know what you’re talking about. It ain’t going to go anywhere, and if it does go anywhere you’re going to have to spend so much money that it isn’t worth doing. So, we’re going to call it the Amiga CDTV, we’re going to produce a package with disk drives and such like, and we’re going to promote it like that. People can understand that, and you don’t have to spend so much money.”

True to their word, Commodore UK put together what they called “The Multimedia Home Computer Pack,” combining a CDTV unit with a keyboard, a mouse, an external disk drive, and the software necessary to use it as a conventional Amiga as well as a multimedia appliance — all for just £100 more than a CDTV unit alone. Commodore’s American operation grudgingly followed their lead, allowing the word “Amiga” to creep back into their presentations and advertising copy.

Very late in the day, Commodore finally began acknowledging and even celebrating CDTV’s Amigahood.

But it was too late — and not only for CDTV but in another sense for the Amiga platform itself. The great hidden cost of the CDTV disappointment was the damage it did to the prospects for CD-ROM on the Amiga proper. Commodore had been so determined to position CDTV as its own thing that they had rejected the possibility of equipping Amiga computers as well with CD-ROM drives, despite the pleas of software developers and everyday customers alike. A CD-ROM drive wasn’t officially mated to the world’s first multimedia personal computer until the fall of 1992, when, with CDTV now all but left for dead, Commodore finally started shipping an external drive that made it possible to run most CDTV software, as well as CD-based software designed specifically for Amiga computers, on an Amiga 500. Even then, Commodore provided no official CD-ROM solution for Amiga 2000 and 3000 owners, forcing them to cobble together third-party adapters that could interface with drives designed for the Macintosh. The people who owned the high-end Amiga models, of course, were the ones working in the very cutting-edge fields that cried out for CD-ROM.

It’s difficult to overstate the amount of damage the Amiga’s absence from the CD-ROM party, the hottest ticket in computing at the time, did to the platform’s prospects. It single-handedly gave the lie to every word in Harold Copperman’s 1990 speech about Commodore being “the leaders in multimedia.” Many of the most vibrant Amiga developers were forced to shift to the Macintosh or another platform by the lack of CD-ROM support. Of all Commodore’s failures, this one must loom among the largest. They allowed the Macintosh to become the platform most associated with the new era of CD-ROM-enabled multimedia computing without even bothering to contest the territory. The war was over before Commodore even realized a war was on.

Commodore’s feeble last gasp in terms of marketing CDTV positioned it as essentially an accessory to desktop Amigas, a “low-cost delivery system for multimedia” targeted at business and government rather than living rooms. The idea was that you could create presentations on Amiga computers, send them off to be mastered onto CD, then drag the CDTV along to board meetings or planning councils to show them off. In that spirit, a CDTV unit was reduced to a free toss-in if you bought an Amiga 3000 — two slow-selling products that deserved one another.

The final verdict on CDTV is about as ugly as they come: fewer than 30,000 sold worldwide in some eighteen months of trying; fewer than 10,000 sold in the American market Commodore so desperately wanted to break back into, and many or most of those sold at fire-sale discounts after the platform’s fate was clear. In other words, the 350 CDTV units that had been sold to the faithful at that first ebullient World of Amiga show made up an alarmingly high percentage of all the CDTV units that would ever sell. (Philips, by contrast, would eventually manage to move about 1 million CD-I units over the course of about seven years of trying.)

The picture I’ve painted of the state of Commodore thus far is a fairly bleak one. Yet that bleakness wasn’t really reflected in the company’s bottom line during the first couple of years of the 1990s. For all the trouble Commodore had breaking new products in North America and elsewhere, their legacy products were still a force to be reckoned with outside the United States. Here the end of the Cold War and subsequent lifting of the Iron Curtain proved a boon. The newly liberated peoples of Eastern Europe were eager to get their hands on Western computers and computer games, but had little money to spend on them. The venerable old Commodore 64, pulling along behind it that rich catalog of thousands upon thousands of games of all stripes, was the perfect machine for these emerging markets. Effectively dead in North America and trending that way in Western Europe, it now enjoyed a new lease on life in the former Soviet sphere, its sales numbers suddenly climbing sharply again instead of falling. The Commodore 64 was, it seemed, the cockroach of computers; you just couldn’t kill it. Not that Commodore wanted to: they would happily bank every dollar their most famous creation could still earn them. Meanwhile the Amiga 500 was selling better than ever in Western Europe, where it was now the most popular single gaming platform of all, and Commodore happily banked those profits as well.

Commodore’s stock even enjoyed a short-lived bubble of sorts. In the spring and early summer of 1991, with sales strong all over Europe and CDTV poised to hit the scene, the stock price soared past $20, stratospheric heights by Commodore’s recent standards. This being Commodore, the stock collapsed below $10 again just as quickly — but, hey, it was nice while it lasted. That same year, worldwide sales topped the magical $1 billion mark, another height that had last been seen in the heyday of the Commodore 64. Commodore was now the second most popular maker of personal computers in Europe, with a market share of 12.4 percent, just slightly behind IBM’s 12.7 percent. The Amiga was now selling at a clip of 1 million machines per year, which would bring the total installed base to 4.5 million by the end of 1992. Of that total, 3.5 million were in Europe: 1.3 million in Germany, 1.2 million in Britain, 600,000 in Italy, 250,000 in France, 80,000 in Scandinavia. (Ironically in light of the machine’s Spanish name, one of the few places in Western Europe where it never did well at all was Spain.) To celebrate their European success, Irving Gould and Mehdi Ali took home salaries in 1991 of $1.75 million and $2.4 million respectively, the latter figure $400,000 more than the chairman of IBM, a company fifty times Commodore’s size, was earning.

But it wasn’t hard to see that Commodore, in relying on all of these legacy products sold in foreign markets, was living on borrowed time. Even in Europe, MS-DOS was beginning to slowly creep up on the Amiga as a gaming platform by 1992, while Nintendo and Sega, the two big Japanese console makers, were finally starting to take notice of this virgin territory after having ignored it for so long. While Amiga sales in Europe in 1992 remained blessedly steady, sales of the Amiga in North America were down as usual, sales of the Commodore 64 in Eastern Europe fell off thanks to economic chaos in the region, and sales of Commodore’s line of commodity PC clones cratered so badly that they pulled out of that market entirely. It all added up to total sales of about $900 million. The company was still profitable that year, but considerably less so than the year before. Everyone was now looking forward to 1993 with more than a little trepidation.

Even as Commodore faced an uncertain future, they could at least take comfort that their arch-enemy Atari was having a much worse time of it. In the very early 1990s, Atari enjoyed some success, if not as much as they had hoped, with their Lynx handheld game console, a more upscale rival to the Nintendo Game Boy. The Atari Portfolio, a genuinely groundbreaking palmtop computer, also did fairly well for them, if perhaps not quite as well as it deserved. But the story of their flagship computing platform, the Atari ST, was less happy. Already all but dead in the United States, the ST’s market share in Europe shrank in proportion to the Amiga’s increasing sales, such that it fell from second to third most popular gaming computer in 1991, trailing MS-DOS now as well as the Amiga.

Atari tried to remedy the slowing sales with new machines they called the STe line, which increased the color palette to 4096 colors and added a blitter chip to aid onscreen animation. (The delighted Amiga zealots at Amazing Computing wrote of these Amiga-inspired developments that they reminded them of “an Amiga 500 created by a primitive tribe that had never actually seen an Amiga, but had heard reports from missionaries of what the Amiga could do.”) But the new hardware broke compatibility with much existing software, and it only got harder to justify buying an STe instead of an Amiga 500 as the latter’s price slowly fell. Atari’s total sales in 1991 were just $285 million, down by some 30 percent from the previous year and barely a quarter of the numbers Commodore was doing. Jack Tramiel and his sons kept their heads above water only by selling off pieces of the company, such as the Taiwanese manufacturing facility that went for $40.9 million that year. You didn’t have to be an expert in the computer business to understand how unsustainable that path was. In the second quarter of 1992, Atari posted a loss of $39.8 million on sales of just $23.3 million, a rather remarkable feat in itself. Whatever else lay in store for Commodore and the Amiga, they had apparently buried old Mr. “Business is War.”

Still, this was no time to bask in the glow of sweet revenge. The question of where Commodore and the Amiga went from here was being asked with increasing urgency in 1992, and for very good reason. The answer would arrive in the latter half of the year, in the form at long last of the real, fundamental technical improvements the Amiga community had been begging for for so long. But had Commodore done enough, and had they done it in time to make a difference? Those questions loomed large as the 68000 Wars were about to enter their final phase.

(Sources: the book On the Edge: The Spectacular Rise and Fall of Commodore by Brian Bagnall; Amazing Computing of August 1987, June 1988, June 1989, July 1989, May 1990, June 1990, July 1990, August 1990, September 1990, December 1990, January 1991, February 1991, March 1991, April 1991, May 1991, June 1991, August 1991, September 1991, November 1991, January 1992, February 1992, March 1992, April 1992, June 1992, July 1992, August 1992, September 1992, November 1992, and December 1992; Info of July/August 1988 and January/February 1989; Amiga Format of July 1991, July 1995, and the 1992 annual; The One of September 1990, May 1991, and December 1991; CU Amiga of June 1992, October 1992, and November 1992; Amiga Computing of April 1992; AmigaWorld of June 1991. Online sources include Matt Barton’s YouTube interview with Jim Sachs, Sébastien Jeudy’s interview with Carl Sassenrath, Greg Donner’s Workbench Nostalgia, and Atari’s annual reports from 1989, available on archive.org. My huge thanks to reader “himitsu” for pointing me to the last and providing some other useful information on Commodore and Atari’s financials during this period in the comments to a previous article in this series. And thank you to Reichart von Wolfsheild, who took time from his busy schedule to spend a Saturday morning with me looking back on the CDTV project.)
