
A Net Before the Web, Part 1: The Establishment Man and the Magnificent Rogue

On July 9, 1979, journalists filtered into one of the lavish reception halls in Manhattan’s Plaza Hotel to witness the flashy roll-out of The Source, an online service for home-computer owners that claimed to be the first of its kind. The master of ceremonies was none other than the famous science-fiction and science-fact writer Isaac Asimov. With his nutty-professor persona in full flower, his trademark mutton-chop sideburns bristling in the strobe of the flashbulbs, Asimov proclaimed, “This is the beginning of the Information Age! By the 21st century, The Source will be as vital as electricity, the telephone, and running water.”

Actually, though, The Source wasn’t quite the first of its kind. Just eight days before, another new online service had made a much quieter official debut. It was called MicroNET, and it came from an established provider of corporate time-shared computing services called CompuServe. MicroNET got no splashy unveiling, no celebrity spokesman, just a typewritten announcement letter sent to members of selected computer users groups.

The contrast between the two roll-outs says much about the men behind them, who between them would come to shape much of the online world of the 1980s and beyond. They were almost exactly the same age, but cut from very different cloth. Jeff Wilkins, the executive in charge of CompuServe, could be bold when he felt it was warranted, but his personality lent itself to a measured, incremental approach that made him a natural favorite with the conservative business establishment. “The changes that will come to microcomputing because of computer networks will be evolutionary in nature,” he said just after launching MicroNET. Even after Wilkins left CompuServe in 1985, it would continue to bear the stamp of his careful approach to doing business for many years.

But William von Meister, the man behind The Source and its afore-described splashier unveiling, preferred revolutions to evolutions. He was high-strung, mercurial, careless, sometimes a little unhinged. Described as a “magnificent rogue” by one acquaintance, as a “pathological entrepreneur” by another, he made businesses faster than he made children — of whom, being devoted to excess in all its incarnations, he had eight. His businesses seldom lasted very long, and when they did survive, they did so without him at the helm, usually after he had been chased out of them in a cloud of acrimony and legal proceedings. A terrible businessman by most standards, he could nevertheless “raise money from the dead,” as one investor put it, thereby moving on to the next scheme while the previous one was still going down in flames. Still, whatever else you could say about him, Bill von Meister had vision. Building the online societies of the future would require cockeyed dreamers like him just as much as it would sober tacticians like Jeff Wilkins.


Had an anonymous salesman who worked for Digital Equipment Corporation in 1968 been slightly less good at his job, CompuServe would most likely never have come to be.

The salesman in question had been assigned to a customer named John Goltz, fresh out of the University of Arizona and now working in Columbus, Ohio, for a startup. But lest the word “startup” convey a mistaken impression of young men with big dreams out to change the world, Silicon Valley-style, know that this particular startup lived within about the most unsexy industry imaginable: life insurance. No matter; from Goltz’s perspective, anyway, the work was interesting enough.

He found himself doing the work because Harry Gard, the founder of the freshly minted Golden United Life Insurance, wanted to modernize his hidebound industry, at least modestly, by putting insurance records online via a central computer which agents in branch offices could all access. He had first thought of giving the job to his son-in-law Jeff Wilkins, an industrious University of Arizona alumnus who had graduated with a degree in electrical engineering and now ran a successful burglar-alarm business of his own in Tucson. “The difference between electrical engineering and computing didn’t occur to him,” remembers Wilkins. “I told him that I didn’t know anything about computing, but I had a friend who did.” That friend was John Goltz, whose degree in computer science made him the more logical candidate in Wilkins’s eyes.

Once hired, Goltz contacted DEC to talk about buying a PDP-9, a sturdy and well-understood machine that should have been perfectly adequate for his new company’s initial needs. But our aforementioned fast-talking salesman gave him the hard up-sell, telling him about the cutting-edge PDP-10 he could lease for only “a little more.” Like the poor rube who walks into his local Ford dealership to buy a Focus and drives out in a Mustang, Goltz’s hacker heart couldn’t resist the lure of DEC’s 36-bit hot rod. He repeated the salesman’s pitch almost verbatim to his boss, and Gard, not knowing a PDP-10 from a PDP-9 from a HAL 9000, said fine, go for it. Once his dream machine was delivered and installed in a former grocery store, Goltz duly started building the online database for which he’d been hired.

The notoriously insular life-insurance market was, however, a difficult nut to crack. Orders came in at a trickle, and Goltz’s $1 million PDP-10 sat idle most of the time. It was at this point, looking for a way both to make his computer earn its keep and to keep his employer afloat, that Goltz proposed that Golden United Life Insurance enter into the non-insurance business of selling time-shared computer cycles. Once again, Gard told him to go for it; any port in a storm and all that.

At the dawn of the 1970s, time-sharing was the hottest buzzword in the computer field. Over the course of the 1950s and 1960s, the biggest institutions in the United States — government bureaucracies, banks, automobile manufacturers and other heavy industries — had all gradually been computerized via hulking mainframes that, attended by bureaucratic priesthoods of their own and filling entire building floors, chewed through and spat out millions of records every day. But that left out the countless smaller organizations who could make good use of computers but had neither the funds to pay for a mainframe’s care and upkeep nor a need for more than a small fraction of its vast computing power. DEC, working closely with university computer-science departments like that of MIT, had been largely responsible for the solution to this dilemma. Time-sharing, enabled by a new generation of multi-user, multitasking operating systems like DEC’s TOPS-10 and an evolving telecommunications infrastructure that made it possible to link up with computers from remote locations via dumb terminals, allowed computer cycles and data storage to be treated as a commodity. A business or other organization, in other words, could literally share time on a remote computer system with others, paying for only the cycles and storage they actually used. (If you think that all this sounds suspiciously like the supposedly modern innovation of “cloud computing,” you’re exactly right. In technology as in life, a surprising number of things are cyclical, with only the vocabulary changing.)
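
For modern readers who want a concrete picture of the trick involved, here is a minimal sketch of round-robin time-slicing, the scheduling idea at the heart of operating systems like TOPS-10. Everything in it (the quantum, the job names, the metering) is invented for illustration; it models the pay-per-cycle principle rather than any actual DEC code.

```python
# A toy round-robin scheduler: one machine, many users, each billed only
# for the CPU time they actually consume. Purely illustrative; real
# systems like TOPS-10 were vastly more sophisticated.
from collections import deque

QUANTUM = 0.05  # hypothetical time slice: seconds of CPU granted per turn

class Job:
    def __init__(self, user, cpu_needed):
        self.user = user              # whose account gets billed
        self.remaining = cpu_needed   # seconds of CPU still required
        self.billed = 0.0             # seconds of CPU actually consumed

def run(jobs):
    queue = deque(jobs)
    while queue:
        job = queue.popleft()
        used = min(QUANTUM, job.remaining)
        job.remaining -= used
        job.billed += used            # meter the usage as it happens
        if job.remaining > 0:
            queue.append(job)         # unfinished jobs rejoin the line
    return jobs

for job in run([Job("bank", 0.20), Job("insurer", 0.05)]):
    print(f"{job.user} billed for {job.billed:.2f}s of CPU time")
```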

Jeff Wilkins

John Goltz possessed a keen technical mind, but he had neither the aptitude nor the desire to run the business side of Golden United’s venture into time-sharing. So, Harry Gard turned once again to his son-in-law. “I liked what I was doing in Arizona,” Jeff Wilkins says. “I enjoyed having my own company, so I really didn’t want to come out.” Finally, Gard offered him $1.5 million in equity, enough of an eye-opener to get him to consider the opportunity more seriously. “I set down the ground rules,” he says. “I had to have complete control.” In January of 1970, with Gard having agreed to that stipulation, the 27-year-old Jeff Wilkins abandoned his burglar-alarm business in Tucson to come to Columbus and run a new Golden United subsidiary which was to be called Compu-Serv.

With time-sharing all the rage in computer circles, it was a tough market they were entering. Wilkins remembers cutting his first bill to a client for all of $150, thinking all the while that it was going to take a lot of bills just like it to pay for this $1 million computer. But Compu-Serv was blessed with a steady hand in Wilkins himself and a patient backer with reasonably deep pockets in his father-in-law. Wilkins hired most of his staff out of big companies like IBM and Xerox. They mirrored their young but very buttoned-down boss, going everywhere in white shirt and tie, lending an aura of conservative professionalism that belied the operation’s small size and made it attractive to the business establishment. In 1972, Compu-Serv turned the corner into a profitability that would last for many, many years to come.

In the beginning, they sold nothing more than raw computer access; the programs that ran on the computers all had to come from the clients themselves. As the business expanded, though, Compu-Serv began to offer off-the-shelf software as well, tailored to the various industries they found themselves serving. They began, naturally enough, with the “Life Insurance Data Information System,” a re-purposing of the application Goltz had already built for Golden United. Expanding the reach of their applications from there, they cultivated a reputation as a full-service business partner rather than a mere provider of a commodity. Most importantly of all, they invested heavily in their own telecommunications infrastructure, which existed in parallel with the nascent Internet and other early networks, using lines leased from AT&T and a system of routers — actually, DEC minicomputers running software of their own devising — for packet-switching. From their first handful of clients in and around Columbus, Compu-Serv thus spread their tendrils all over the country. They weren’t the cheapest game in town, but for the risk-averse businessperson looking for a full-service time-sharing provider with a fast and efficient network, they made for a very appealing package.

In 1975, Compu-Serv was spun off from the moribund Golden United Life Insurance, going public with a NASDAQ listing. Thus freed at last, the child quickly eclipsed the parent; the first stock split happened within a year. In 1977, Compu-Serv changed their name to CompuServe. By this point, they had more than two dozen offices spread through all the major metropolitan areas, and that one PDP-10 in a grocery store had turned into more than a dozen machines filling two data centers near Columbus. Their customer roll included more than 600 businesses. By now, even big business had long since come to see the economic advantages time-sharing offered in many scenarios. CompuServe’s customers included Fortune 100 giants like AMAX (a mining giant dealing in aluminum, coal, and other metals), Goldman Sachs, and Owens Corning, along with government agencies like the Department of Transportation. “CompuServe is one of the best — if not the best — time-sharing companies in the country,” said AMAX’s director of research.

Inside one of CompuServe’s data centers.

The process that would turn this corporate data processor of the 1970s into the most popular consumer online service of the 1980s was born out of much the same reasoning that had spawned the company in the first place. Once again, it all came down to precious computer cycles that were sitting there unused. To keep their clients happy, CompuServe was forced to make sure they had enough computing capacity to meet peak-hour demand. This meant that the majority of the time said capacity was woefully underutilized; the demand for CompuServe’s computer cycles was an order of magnitude higher during weekday working hours than it was during nights, evenings, and weekends, when the offices of their corporate clients were deserted. This state of affairs had always rankled Jeff Wilkins, nothing if not a lover of efficiency. Yet it had always seemed an intractable problem; it wasn’t as if they could ask half their customers to start working a graveyard shift.

Come 1979, though, a new development was causing Wilkins to wonder if there might in fact be a use for at least some of those off-hour cycles. The age of personal computing was in the offing. Turnkey microcomputers were now available from Apple, Commodore, and Radio Shack. The last company alone was on track to sell more than 50,000 TRS-80s before the end of the year, and many more models from many more companies were on the way. The number of home-computer hobbyists was still minuscule by any conventional standard, but it could, it seemed to Wilkins, only grow. Might some of those hobbyists be willing and able to dial in and make use of CompuServe’s dearly bought PDP-10 systems while the business world slept? If so, who knew what it might turn into?

It wasn’t as if a little diversity would be a bad thing. While CompuServe was still doing very well on the strength of their fine reputation — they would bill their clients for $19 million in 1979 — the time-sharing market in general was showing signs of softening. The primary impetus behind it — the sheer expense of owning one’s own computing infrastructure — was slowly bleeding away as minicomputers like the DEC PDP-11, small enough to shove away in a closet somewhere rather than requiring a room or a floor of its own, became a more and more cost-effective solution. Rather than a $1 million proposition, as it had been ten years before, a new DEC system could now be had for as little as $150,000. Meanwhile, a new piece of software called VisiCalc — the first spreadsheet program ever, at least as the modern world understands that term — would soon show that even an early, primitive microcomputer could already replace a time-shared terminal hookup in a business’s accounting department. And once entrenched in that vital area, microcomputers could only continue to spread throughout the corporation.

Still, the consumer market for online services, if it existed, wasn’t worth betting CompuServe’s existing business model on. Wilkins entered this new realm, as he did most things, with cautious probity. The new service would be called MicroNET so as to keep it from damaging the CompuServe brand in the eyes of their traditional customers, whether because it became a failure or just because of the foray into the untidy consumer market that it represented. And it would be “market-driven” rather than “competition-driven.” In Wilkins’s terminology, this meant that they would provide some basic time-sharing infrastructure — including email and a bulletin board for exchanging messages, a selection of languages for writing and running programs, and a suite of popular PDP-10 games like Adventure and Star Trek — but would otherwise adopt a wait-and-see attitude on adding customized consumer services, letting the market — i.e., all those hobbyists dialing in from home — do what they would with the system in the meantime.

Even with all these caveats, he had a hard time selling the idea to his board, who were perfectly happy with the current business model, thank you very much, and who had the contempt for the new microcomputers and the people who used them that was shared by many who had been raised on the big iron of DEC and IBM. They took to calling his idea “schlock time-sharing.”

Mustering all his powers of persuasion, Wilkins was able to overrule the naysayers long enough to launch a closed trial. On May 1, 1979, CompuServe quietly offered free logins to any members of the Midwest Affiliation of Computer Clubs, headquartered right there in Columbus, who asked for them. With modems still a rare and pricey commodity, it took time to get MicroNET off the ground; Wilkins remembers anxiously watching the connectivity lights inside the data center during the evenings, and seeing them remain almost entirely dimmed. But then, gradually, they started blinking.

After exactly two months, with several hundred active members having proved to Wilkins’s satisfaction that a potential market existed, he made MicroNET an official CompuServe service, open to all. To the dissatisfaction of his early adopters, that meant they had to start paying: a $30 signup charge, followed by $5 per hour for evening and weekend access, $12 per hour if they were foolish enough to log on during the day, when CompuServe’s corporate clients needed the machines. To the satisfaction of Wilkins, most of his early adopters grumbled but duly signed up, and they were followed by a slow but steady trickle of new arrivals. The service went entirely unadvertised, news of its existence spreading among computer hobbyists strictly by word of mouth. MicroNET was almost literally nothing in the context of CompuServe’s business as a whole — it would account for roughly 1 percent of their 1979 revenue, less than many of their larger corporate accounts brought in individually — yet it marked the beginning of something big, something even Wilkins couldn’t possibly anticipate.

But MicroNET didn’t stand alone. Even as one online service was getting started in about the most low-key fashion imaginable, another was making a much more high-profile entrance. It was fortunate that Wilkins chose to see MicroNET as “market-driven” rather than “competition-driven.” Otherwise, he wouldn’t have been happy to see his thunder being stolen by The Source.

Bill von Meister

Like Jeff Wilkins, Bill von Meister was 36 years old. Unlike Wilkins, he already had on his resume a long string of entrepreneurial failures to go along with a couple of major successes. An unapologetic epicurean with a love for food, wine, cars, and women, he had been a child not just of privilege but of aristocracy, his father a godson of the last German kaiser, his mother an Austrian countess. His parents had immigrated to New York in the chaos that followed World War I, when Germany and Austria could be uncomfortable places for wealthy royalty, and there his father had made the transition from landed aristocrat to successful businessman with rather shocking ease. Among other ventures, he became a pivotal architect of the storied Zeppelin airship service between Germany and the United States — although the burning of the Hindenburg did rather put the kibosh on that part of his portfolio, as it did on passenger-carrying airships in general.

The son inherited at least some of the father’s acumen. Leveraging his familial wealth alongside an unrivaled ability to talk people into giving him money — one friend called him the best he’d ever seen at “taking money from venture capitalists, burning it all up, and then getting more money from the same venture capitalists” — the younger von Meister pursued idea after idea, some visionary, some terrible. By 1977, he had hit pay dirt twice already in his career, once when he created what was eventually branded as Western Union’s “Mailgram” service for sending a form of electronic mail well before computer email existed, once when he created a corporate telephone service called Telemax. Unfortunately, the money he earned from these successes disappeared as quickly as it poured in, spent to finance his high lifestyle and his many other, failed entrepreneurial projects.

Late in 1977, he founded Digital Broadcasting Corporation in Fairfax County, Virginia, to implement a scheme for narrow-casting digital data using the FM radio band. “Typical uses,” ran the proposal, “would include price information for store managers in a retail chain, bad-check information to banks, and policy information to agents of an insurance company.” Von Meister needed financing to bring off this latest scheme, and he needed a factory to build the requisite equipment. Luckily, a man who believed he could facilitate both called him one day in the spring of 1978 after reading a description of his plans in Business Week.

Jack Taub had made his first fortune as the founder of Scott Publishing, known for their catalogs serving the stamp-collecting hobby. Now, he was so excited by von Meister’s scheme that he immediately bought into Digital Broadcasting Corporation to the tune of $500,000 of much-needed capital, good for a 42.5 percent stake. But every bit as important as Taub’s personal fortune were the connections he had within the federal government. By promising to build a factory in the economically disadvantaged inner city of Charlotte, North Carolina, he convinced the Commerce Department’s Economic Development Administration to guarantee 90 percent of a $6 million bank loan from North Carolina National Bank, under a program meant to channel financing into job-creating enterprises.

Unfortunately, the project soon ran into serious difficulties with another government agency: the Federal Communications Commission, who noted pointedly that the law which had set aside the FM radio band had stipulated it should be reserved for applications “of interest to the public.” Using it to send private data, many officials at the FCC believed, wasn’t quite what the law’s framers had had in mind. And while the FCC hemmed and hawed, von Meister was fomenting chaos within the telecommunications and broadcasting industries at large by claiming his new corporation’s name gave him exclusive rights to the term “digital broadcasting,” a modest buzzword of its own at the time. His legal threats left a bad taste in the mouth of many a potential partner, and the scheme withered away under the enormous logistical challenges that getting such a service off the ground entailed. The factory which the Commerce Department had so naively thought they were financing never opened, but Digital Broadcasting kept what remained of the money they had received for the purpose.

They now planned to use the money for something else entirely. Von Meister and Taub had always seen business-to-business broadcasting as only the first stage of their company’s growth. In the longer term, they had envisioned a consumer service which would transmit and even receive information — news and weather reports, television listings, shopping offers, opinion polls, etc. — to and from terminals located in ordinary homes. When doing all this over the FM radio band began to look untenable, they had cast about for alternative approaches; they were, after all, still flush with a fair amount of cash. It didn’t take them long to take note of all those TRS-80s and other home computers that were making their way into the homes of early adopters. Both Taub and von Meister would later claim to have been the first to suggest a pivot from digital broadcasting to a microcomputer-oriented online information utility. In the beginning, they called it CompuCom.

The most obvious problem CompuCom faced — its most obvious disadvantage in comparison to CompuServe’s MicroNET — was the lack of a telecommunications network of its own. Once again, both Taub and von Meister would later claim to have been the first to see the solution. One or the other or both took note of another usage inequality directly related to the one that had spawned MicroNET. Just as the computers of time-sharing services like CompuServe sat largely idle during nights and weekends, traffic on the telecommunications lines corporate clients used to connect to them was also all but nonexistent more than half of the time. Digital Broadcasting came to GTE Telenet with an offer to lease this idle bandwidth at a rate of 75¢ per connection per hour, a dramatically reduced price from that of typical business customers. GTE, on the presumption that something was better than nothing, agreed. And while they were making the deal to use the telecommunications network, von Meister and Taub also made a deal with GTE Telenet to run the new service on the computers in the latter’s data centers, using all that excess computing power that lay idle along with the telecommunications bandwidth on nights and weekends. Because they needed to build no physical infrastructure, von Meister and Taub believed that CompuCom could afford to be relatively cheap during off-hours; the initial pricing plan stipulated just $2.75 per hour during evenings and weekends, with a $100 signup fee and a minimum monthly charge of $10.

For all the similarities in their way of taking advantage of the time-sharing industry’s logistical quirks, not to mention their shared status as the pioneers of much of modern online life, there were important differences between the nascent MicroNET and CompuCom. From the first, von Meister envisioned his service not just as a provider of computer access but as a provider of content. The public-domain games that were the sum total of MicroNET’s initial content were only the beginning for him. Mirroring its creator, CompuCom was envisioned as a service for the well-heeled Playboy- and Sharper Image-reading technophile lounge lizard, with wine lists, horoscopes, entertainment guides for the major metropolitan areas, and an online shopping mall. In a landmark deal, von Meister convinced United Press International, one of the two providers of raw news wires to the nation’s journalistic infrastructure, to offer their feed through CompuCom as well — unfiltered, up-to-the-minute information of a sort that had never been available to the average consumer before. The New York Times provided a product-information database, Prentice Hall provided tax information, and Dow Jones provided a stock ticker. Von Meister contracted with the French manufacturer Alcatel for terminals custom-made just for logging onto CompuCom, perfect for those wanting to get in on the action who weren’t interested in becoming computer nerds in the process. For the same prospective customers, he insisted that the system, while necessarily all text given the state of the technology of the time, be navigable via multiple-choice menus rather than an arcane command line.

In the spring of 1979, just before the first trials began, CompuCom was renamed The Source; the former name sounded dangerously close to “CompuCon,” a disadvantage that was only exacerbated by the founder’s checkered business reputation. The service officially opened for business, eight days after MicroNET had done the same, with that July 9 press conference featuring Isaac Asimov and the considerable fanfare it generated. Indeed, the press notices were almost as ebullient as The Source’s own advertising, with the Wall Street Journal calling it “an overnight sensation among the cognoscenti of the computing world.” Graeme Keeping, a business executive who would later be in charge of the service but was at this time just another outsider looking in, had this to say about those earliest days:

The announcement was made with the traditional style of the then-masters of The Source. A lot of fanfare, a lot of pizazz, a lot of sizzle. There was absolutely no substance whatsoever to the announcement. They had nothing to back it up with.

Electronic publishing was in its infancy in those days. It was such a romantic dream that there never had to be a product in order to generate excitement. Nobody had to see anything real. People wanted it so badly, like a cure for cancer. We all want it, but is it really there? I equate it to Laetrile.

While that is perhaps a little unfair — there were, as we’ve just seen, several significant deals with content providers in place before July of 1979 — it was certainly true that the hype rather overwhelmed the comparatively paltry reality one found upon actually logging into The Source.

Nevertheless, any comparison of The Source and MicroNET at this stage would have to place the former well ahead in terms of ambition, vision, and public profile. That distinction becomes less surprising when we consider that what was a side experiment for Jeff Wilkins was the whole enchilada for von Meister and Taub. For the very same reason, any neutral observer forced to guess which of these two nascent services would rise to dominance would almost certainly have gone with The Source. Such a reckoning wouldn’t have accounted, however, for the vortex of chaos that was Bill von Meister.

It was the typical von Meister problem: he had built all this buzz by spending money he didn’t have — in fact, by spending so much money that to this day it’s hard to figure out where it could all possibly have gone. As of October of 1979, the company had $1000 left in the bank and $8 million in debt. Meanwhile The Source itself, despite all the buzz, had managed to attract at most a couple of thousand actual subscribers. It was, after all, still very early days for home computers in general, modems were an even more exotic species, and the Alcatel terminals had yet to arrive from France, being buried in some transatlantic bureaucratic muddle.

Jack Taub

By his own later account, Jack Taub had had little awareness over the course of the previous year or so of what von Meister was doing with the company’s money, being content to contribute ideas and strategic guidance and let his partner handle day-to-day operations. But that October he finally sat down to take a hard look at the books. He would later pronounce the experience of doing so “an assault on my system. Von Meister is a terrific entrepreneur, but he doesn’t know when to stop entrepreneuring. The company was in terrible shape. It was not going to survive. Money was being spent like water.” With what he considered to be a triage situation on his hands, Taub delivered an ultimatum to von Meister. He would pay him $3140 right now — 1¢ for each of his shares — and would promise to pay him another dollar per share in three years if The Source was still around then. In return, von Meister would walk away from the mess he had created, escaping any legal action that might otherwise become a consequence of his gross mismanagement. According to Taub’s account, von Meister agreed to these terms with uncharacteristic meekness, leaving his vision of The Source as just one more paving stone on his boulevard of broken entrepreneurial dreams, and leaving Taub to get down to the practical business of saving the company. “I think if I had waited another week,” the latter would later say, “it would have been too late.”

As it was, Digital Broadcasting teetered on the edge of bankruptcy for months, with Taub scrambling to secure new lines of credit to keep the existing creditors satisfied and, when all else failed, injecting more of his own money into the company. Through it all, he still had to deal with von Meister, who, as any student of his career to date could have predicted, soon had second thoughts about going away quietly — if, that is, he’d ever planned to do so in the first place. Taub learned that von Meister had taken much of Digital Broadcasting’s proprietary technology out the door with him, and was now shopping it around the telecommunications industry; that sparked a lawsuit from Taub. Von Meister claimed his ejection had been illegal; that sparked another, going in the opposite direction. Apparently concluding that his promise not to sue von Meister for his mismanagement of the company was thus nullified, Taub counter-sued with exactly that charge. With a vengeful von Meister on his trail, he said that he couldn’t afford to “sleep with both eyes closed.”

By March of 1980, The Source had managed to attract about 3000 subscribers, but the online citizens were growing restless. Many features weren’t quite as advertised. The heavily hyped nightlife guides, for instance, mostly existed only for the Washington Beltway, the home of The Source. The email system was down about half the time, and even when it was allegedly working it was anyone’s guess whether a message that was sent would actually be delivered. Failings like these could be attributed easily enough to the usual technical growing pains, but other complaints carried with them an implication of nefarious intent. The Source’s customers could read the business pages of the newspaper as well as anyone, and knew that Jack Taub was fighting for his company’s life on multiple fronts. In that situation, some customers reasoned, there would be a strong incentive to find ways to bill them just that little bit more. Thus there were dark accusations that the supposedly user-friendly menu system had been engineered to be as verbose and convoluted as possible in order to maximize the time users spent online just trying to get to where they wanted to go. On a 110- or 300-baud connection — for comparison purposes, consider that 110 baud works out to only about ten characters per second — receiving all these textual menus could take considerable time, especially given the laggy response time of the system as a whole whenever more than a handful of people were logged on. And for some reason, a request to log off the system in an orderly way simply didn’t work most of the time, forcing users to break the connection themselves. After they did so, it would conveniently — conveniently for The Source’s accountants, that is — take the system five minutes or so to recognize their absence and stop charging them.
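
Those complaints are easy to quantify. A back-of-the-envelope sketch follows; the menu size is an invented example rather than a measured figure from The Source, but the line rates and the $2.75 hourly rate come from the story above.

```python
# With the era's common serial framing of roughly 10 bits on the wire per
# character, the line rate in baud divided by 10 gives characters per second.
def seconds_to_receive(chars, baud, bits_per_char=10):
    return chars * bits_per_char / baud

MENU_CHARS = 1500  # hypothetical size of one screenful of nested menus
for baud in (110, 300):
    t = seconds_to_receive(MENU_CHARS, baud)
    print(f"{baud} baud: {t:.0f} seconds just to paint one menu screen")

# Five minutes of phantom connect time after a failed logoff, at the
# launch-era evening rate of $2.75 per hour:
print(f"Phantom logoff cost: ${2.75 * 5 / 60:.2f} per session")
```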

A sampling of the many error messages with which early users of The Source became all too familiar.

The accusations of nefarious intent were, for what it’s worth, very unlikely to have had any basis in reality. Jack Taub was a hustler, but he wasn’t a con man. On the contrary, he was earnestly trying to save a company whose future he deeply believed in. His biggest problem was the government-secured loan, on which Digital Broadcasting Corporation had by now defaulted, forcing the Commerce Department to pay $3.2 million to North Carolina National Bank. The government bureaucrats, understandably displeased, were threatening to seize his company and dismantle it in the hope of getting at least some of that money back. They were made all the more motivated by the fact that the whole affair had leaked into the papers, with the Washington Post in particular treating it as a minor public scandal, an example of Your Tax Dollars at Waste.

Improvising like mad, Taub convinced the government to allow him to make a $300,000 down payment, and thereafter to repay the money he owed over a period of up to 22 years at an interest rate of just 2 percent. Beginning in 1982, the company, now trading as The Source Telecomputing Corporation rather than Digital Broadcasting Corporation, would have to repay either $50,000 or 10 percent of their net profit each year, whichever was greater; beginning in 1993, the former figure would rise to $100,000 if the loan still hadn’t been repaid. “The government got a good deal,” claimed Taub. “They get 100 cents on the dollar, and get their money back faster if I’m able to do something with the company.” While some might have begged to differ with his characterization of the arrangement as a “good deal,” it was, the government must have judged, the best it was likely to get under the circumstances. “The question is to work out some kind of reasonable solution where you recover something rather than nothing,” said one official familiar with the matter. “While it sounds like they’re giving it away, they already did that. They already made their mistake with the original loan.”

With the deal with the Commerce Department in place, Taub convinced The Reader’s Digest Association, publisher of the most popular magazine in the world and eager to get in on the ground floor of what was being billed in some circles as the next big thing in media, to buy 51 percent of The Source for $3 million in September of 1980, thus securing desperately needed operating capital. But when a judge shortly thereafter ruled in favor of von Meister on his charge that he had been unlawfully forced out of the company, Taub was left scrambling once again. He was forced to go back to Reader’s Digest, convincing them this time to increase their stake to 80 percent, leaving only the remaining 20 percent in his own hands. And with that second capital injection to hand, he convinced von Meister to lay the court battle to rest with a settlement check for $1 million.

The Source had finally attained a measure of stability, and Jack Taub’s extended triage could thus come to an end at last. Along the way, however, he had maneuvered himself out of his controlling interest and, soon, out of a job. Majority ownership having its privileges, Reader’s Digest elected to replace him with one of their own: Graeme Keeping, the executive who had lobbied hardest to buy The Source in the first place. “Any publisher today, if he doesn’t get into electronic publishing,” Keeping was fond of saying, “is either going to be forced into it by economic circumstances or will have great difficulty staying in the paper-and-ink business.”

The Source’s Prime computer systems, a millstone around their neck for years (although the monkey does seem to be enjoying them).

The Source may have found a safe harbor with one of the moneyed giants of American media, but it would never regain its early mojo. Keeping proved to be less than the strategic mastermind he believed himself to be, with a habit of over-promising and under-delivering — and, worse, of making terrible choices based on his own overoptimistic projections. The worst example of this tendency came early in his tenure, in the spring of 1981, when he was promising the New York Times he would have 60,000 subscribers by 1982. Determined to make sure he had the computing capacity to meet the demand, he cancelled the contract to use GTE Telenet’s computing facilities, opening his own data center instead and filling it with his own machines. At a stroke, this destroyed a key part of the logistical economies which had done so much to spawn The Source (and, for that matter, CompuServe’s MicroNET) in the first place. The Source’s shiny new computers now sat idle during the day with no customers to service. Come 1982, The Source had only 20,000 subscribers, and all those expensive computers were barely ticking over even at peak usage. This move alone cost The Source millions. Meanwhile, the deal with Alcatel for custom-made terminals having fallen through during the chaos of Taub’s tenure, Keeping made a new one with Zenith to make “a semi-intelligent terminal with a hole in the back through which you can turn it into a computer.” That impractical flight of fancy also came to naught, but not before costing The Source more money. Such failures led to Keeping’s ouster in June of 1982; he was replaced by another anodyne chief from the Reader’s Digest executive pool, George Grune.

Soon after, Control Data Corporation, a maker of supercomputers, bought a 30 percent share of The Source for a reported $5 million. But even this latest injection of capital, technical expertise, and content — Control Data would eventually move much of their pioneering PLATO educational network onto the service — changed little. The Source went through three more chief executives in the next two years. The user roll continued to grow, finally reaching 60,000 in September of 1984 — some two and a half years after Graeme Keeping’s prediction, for those keeping score — but the company perpetually lost money, was perpetually about to turn the corner into mainstream acceptance and profitability, and never actually did. Thanks not least to Keeping’s data-center boondoggle, the rate for non-prime usage had risen to $7.75 per hour by 1984, making this onetime pioneer, which now felt more and more like an also-ran, a hard sell in terms of dollars and cents as well. Neither the leading name in the online-services industry nor the one with the deepest pockets — there were limits to Reader’s Digest’s largess — The Source struggled to attract third-party content. A disturbing number of those 60,000 subscribers rarely or never logged on, paying only the minimum monthly charge of $10. One analyst noted that well-heeled computer owners “apparently are willing to pay to have these electronic services available, even if they don’t use them regularly. From a business point of view, that’s a formula for survival, but not for success.”

The Source was fated to remain a survivor but never a real success for the rest of its existence. Back in Columbus, however, CompuServe’s consumer offering was on a very different trajectory. Begun in such a low-key way that Jeff Wilkins had refused even to describe it as being in competition with The Source, CompuServe’s erstwhile MicroNET — now re-branded as simply CompuServe, full stop — was going places of which its rival could only dream. Indeed, one might say it was going to the very places of which Bill von Meister had been dreaming in 1979.

(Sources: the book On the Way to the Web: The Secret History of the Internet and its Founders by Michael A. Banks and Stealing Time: Steve Case, Jerry Levin, and the Collapse of AOL Time Warner by Alec Klein; Creative Computing of March 1980; InfoWorld of April 14 1980, May 26 1980, January 11 1982, May 24 1982, and November 5 1984; Wall Street Journal of November 6 1979; Online Today of July 1989; 80 Microcomputing of November 1980; The Intelligent Machines Journal of March 14 1979 and June 25 1979; Washington Post of May 11 1937, July 10 1978, February 10 1980, and November 4 1980; Alexander Trevor’s brief technical history of CompuServe, which was first posted to Usenet in 1988; interviews with Jeff Wilkins from the Internet History Podcast and Conquering Columbus.)


A Full-Motion-Video Consulting Detective

Over the course of six months in 1967, 50 million people visited Expo ’67 in Montreal, one of the most successful international exhibitions in the history of the world. Representatives from 62 nations set up pavilions there, showcasing the cutting edge in science, technology, and the arts. The Czechoslovakian pavilion was a surprisingly large one, with a “fairytale area” for children, a collection of blown Bohemian glassware, a “Symphony of Developed Industry,” and a snack bar offering “famous Pilsen beer.” But the hit of the pavilion — indeed, one of the sleeper hits of the Expo as a whole — was to be found inside a small, nondescript movie theater. It was called Kinoautomat, and it was the world’s first interactive movie.

Visitors who attended a screening found themselves ushered to seats that sported an unusual accessory: large green and red buttons mounted to the seat backs in front of them. The star of the film, a well-known Czech character actor named Miroslav Horníček, trotted onto the tiny stage in front of the screen to explain that the movie the visitors were about to see was unlike any they had ever seen before. From time to time, the action would stop and he would pop up again to let the audience decide what his character did next onscreen. Each audience member would register which of the two choices she preferred by pressing the appropriate button, the results would be tallied, and simple majority rule would decide the issue.

As a film, Kinoautomat is a slightly risqué but otherwise harmless farce. The protagonist, a Mr. Novak, has just bought some flowers to give to his wife — it’s her birthday today — and is waiting at home for her to return to their apartment when his neighbor’s wife, an attractive young blonde, accidentally locks herself out of her own apartment with only a towel on. She frantically bangs on Mr. Novak’s door, putting him in an awkward position and presenting the audience with their first choice. Should he let her in and try to explain the presence of a naked woman in their apartment to his wife when she arrives, or should he refuse the poor girl, leaving her to shiver in the altogether in the hallway? After this first choice is made, another hour or so of escalating misunderstanding and mass confusion ensues, during which the audience is given another seven or so opportunities to vote on what happens next.

Kinoautomat played to packed houses throughout the Expo’s run, garnering heaps of press attention in the process. Radúz Činčera, the film’s director and the entire project’s mastermind, was lauded by some critics for creating one of the boldest innovations in the history of cinema. After the Expo was over, Činčera’s interactive movie theater was set up several more times in several other cities, always with a positive response, and Hollywood tried to open a discussion about licensing the technology behind it. But the interest and exposure gradually dissipated, perhaps partly due to a crackdown on “decadent” art by Czechoslovakia’s ruling Communist Party, but almost certainly due in larger part to the logistical challenges involved in setting up the interactive movie theaters that were needed to show it. It was last shown at Expo ’74 in Spokane, Washington, after which it disappeared from screens and memories for more than two decades, to be rescued from obscurity only well into the 1990s, after the Iron Curtain had been thrown open, when it was stumbled upon once again by some of the first academics to study seriously the nature of interactivity in digital mediums.

Had Činčera’s experiment been better remembered at the beginning of the 1990s, it might have saved a lot of time for those game developers dreaming of making interactive movies on personal computers and CD-ROM-based set-top boxes. Sure, the technology Činčera had to work with was immeasurably more primitive; his branching narrative was accomplished by the simple expedient of setting up two film projectors at the back of the theater and having an attendant place a lens cap over whichever held the non-applicable reel. Yet the more fundamental issues he wrestled with — those of how to create a meaningfully interactive experience by splicing together chunks of non-interactive filmed content — remained unchanged more than two decades later.

The dirty little secret about Kinoautomat was that the interactivity in this first interactive film was a lie. Each branch the story took contrived only to give lip service to the audience’s choice, after which it found a way to loop back onto the film’s fixed narrative through-line. Whether the audience was full of conscientious empathizers endeavoring to make the wisest choices for Mr. Novak or crazed anarchists trying to incite as much chaos as possible — the latter approach, for what it’s worth, was by far the more common — the end result would be the same: poor Mr. Novak’s entire apartment complex would always wind up burning to the ground in the final scenes, thanks to a long chain of happenstance that began with that naked girl knocking on his door. Činčera had been able to get away with this trick thanks to the novelty of the experience and, most of all, thanks to the fact that his audience, unless they made the effort to come back more than once or to compare detailed notes with those who had attended other screenings, was never confronted with how meaningless their choices actually were.
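
Game designers today would call this a “foldback” structure: the story branches briefly at each choice, then folds back into a single fixed spine. Here is a minimal sketch of the shape, with invented scene names standing in for the film’s reels:

```python
# A toy foldback story graph in the spirit of Kinoautomat: every vote
# picks between two scenes, but both edges rejoin the same spine, so the
# finale never changes. Scene names are invented for illustration.
story = {
    "knock":   {"let_her_in": "farce_a", "turn_her_away": "farce_b"},
    "farce_a": {"next": "spine_2"},   # both branches fold back...
    "farce_b": {"next": "spine_2"},   # ...into the same scene
    "spine_2": {"panic": "chaos_a", "stay_calm": "chaos_b"},
    "chaos_a": {"next": "fire"},
    "chaos_b": {"next": "fire"},
    "fire":    None,                  # the apartment block always burns
}

def play(choices):
    scene = "knock"
    path = [scene]
    while story[scene] is not None:
        options = story[scene]
        pick = choices.pop(0) if len(options) > 1 else "next"
        scene = options[pick]
        path.append(scene)
    return path

# Two opposite-minded audiences still reach the identical finale:
print(play(["let_her_in", "panic"]))
print(play(["turn_her_away", "stay_calm"]))
```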

While it had worked out okay for Kinoautomat, this sort of fake interactivity wasn’t, needless to say, a sustainable path for building the whole new interactive-movie industry — a union of Silicon Valley and Hollywood — which some of the most prominent names in the games industry were talking of circa 1990. At the same time, though, the hard reality was that to create an interactive movie out of filmed, real-world content that did offer genuinely meaningful, story-altering branches seemed for all practical purposes impossible. The conventional computer graphics that had heretofore been used in games, generated by the computer and drawn on the screen programmatically, were a completely different animal than the canned snippets of video which so many were now claiming would mark the proverbial Great Leap Forward. Conventional computer graphics could be instantly, subtly, and comprehensively responsive to the player’s actions. The snippets in what the industry would soon come to call a “full-motion-video” game could be mixed and matched and juggled, but only in comparatively enormous static chunks.

This might not sound like an impossible barrier in and of itself. Indeed, the medium of textual interactive fiction had already been confronted with seemingly similar contrasts in granularity between two disparate approaches which had both proved equally viable. As I’ve had occasion to discuss in an earlier article, a hypertext narrative built out of discrete hard branches is much more limiting in some ways than a parser-driven text adventure with its multitudinous options available at every turn — but, importantly, the opposite is also true. A parser-driven game that’s forever fussing over what room the player is standing in and what she’s carrying with her at any given instant is ill-suited to convey large sweeps of time and plot. Each approach, in other words, is best suited for a different kind of experience. A hypertext narrative can become a wide-angle exploration of life-changing choices and their consequences, while the zoomed-in perspective of the text adventure is better suited to puzzle-solving and geographical exploration — that is, to the exploration of a physical space rather than a story space.

And yet if we do attempt to extend a similar comparison to a full-motion-video adventure game versus one built out of conventional computer graphics, it may hold up in the abstract, but quickly falls apart in the realm of the practical and the specific. Although the projects exploring full-motion-video applications were among the most expensive the games industry of 1990 had ever funded, their budgets paled next to those of even a cheap Hollywood production. To produce full-motion-video games with meaningfully branching narratives would require their developers to stretch their already meager budgets far enough to shoot many, many non-interactive movies in order to create a single interactive movie, accepting that the player would see only a small percentage of all those hours of footage on any given play-through. And even assuming that the budget could somehow be stretched to allow such a thing, there were other practical concerns to reckon with; after all, even the wondrous new storage medium of CD-ROM had its limits in terms of capacity.

Faced with these issues, would-be designers of full-motion-video games did what all game designers do: since there was no way to bash through the barriers imposed on them, they worked to find approaches that skirted around them.

They did have at least one example to follow or reject — one that, unlike Kinoautomat, virtually every working game designer knew well. Dragon’s Lair, the biggest arcade hit of 1983, had been built out of a chopped-up cartoon which un-spooled from a laser disc housed inside the machine. It replaced all of the complications of branching plots with a simple do-or-die approach. The player needed to guide the joystick through just the right pattern of rote movements — a pattern identifiable only through extensive trial and error — in time with the video playing on the screen. Failure meant death, success meant the cartoon continued to the next scene — no muss, no fuss. But, as the many arcade games that had tried to duplicate Dragon’s Lair’s short-lived success had proved, it was hardly a recipe for a satisfying game once the novelty wore off.

Another option was to use full-motion video for cut scenes rather than as the real basis of a game, interspersing static video sequences, used for purposes of exposition, between interactive sequences powered by conventional computer graphics. In time, this would become something of a default approach to the problem of full-motion video, showing up in games as diverse as the Wing Commander series of space-combat simulators, the Command & Conquer real-time strategy series, and even first-person shooters like Realms of the Haunting. But such juxtapositions would always be doomed to look a little jarring, the ludic equivalent of an animated film which from time to time switches to live action for no aesthetically valid reason. It would largely remain the industry’s fallback position, the way full-motion video wound up being deployed as a last resort after designers had failed to hit upon a less jarring formula. Certainly in the early days of full-motion video — the period we’re interested in right now — there still remained the hope that some better approach to the melding of computer game and film might be discovered.

The most promising approaches — the ones, that is, that came closest to working — often used full-motion video in the context of a computerized mystery. In itself, this is hardly surprising. Despite the well-known preference of gamers and game designers for science-fiction and fantasy scenarios, the genre of traditional fiction most obviously suited for ludic adaptation is in fact the classic mystery novel, the only literary genre that actively casts itself as a sort of game between writer and reader. A mystery novel, one might say, is really two stories woven together. One is that of the crime itself, which is committed before the book proper really gets going. The other is that of the detective’s unraveling of the crime; it’s here, of course, that the ludic element comes in, as the reader too is challenged to assemble the clues alongside the detective and try to deduce the perpetrator, method, and motive before they are revealed to her.

For a game designer wrestling with the challenges inherent in working with full-motion video, the advantages of this structure count double. The crime itself is that most blessed of things for a designer cast adrift on a sea of interactivity: a fixed story, an unchanging piece of solid narrative ground. In the realm of interactivity, then, the designer is only forced to deal with the investigation, a relatively circumscribed story space that isn’t so much about making a story as uncovering one that already exists. The player/detective juggles pieces of that already extant story, trying to slot them together to make the full picture. In that context, the limitations of full-motion video — all those static chunks of film footage that must be mixed and matched — suddenly don’t sound quite so limiting. Full-motion video, an ill-fitting solution that has to be pounded into place with a sledgehammer in most interactive applications, suddenly starts seeming like an almost elegant fit.

The origin story of the most prominent of the early full-motion-video mysteries, a product at the bleeding edge of technology at the time it was introduced, ironically stretches back to a time before computers were even invented. In 1935, J.G. Links, a prominent London furrier, came up with an idea to take the game-like elements of the traditional mystery novel to the next level. What if a crime could be presented to the reader not as a story about its uncovering but in a more unprocessed form, as a “dossier” of clues, evidence, and suspects? The reader would be challenged to assemble this jigsaw into a coherent description of who, what, when, and where. Then, when she thought she was ready, she could open a sealed envelope containing the solution to find out if she had been correct. Links pitched the idea to a friend of his who was well-positioned to see it through with him: Dennis Wheatley, a very popular writer of crime and adventure novels. Together Links and Wheatley created four “Dennis Wheatley Crime Dossiers,” which enjoyed considerable success before the undertaking was stopped short by the outbreak of World War II. After the war, mysteries in game form drifted into the less verisimilitudinous but far more replayable likes of Cluedo, while non-digital interactive narratives moved into the medium of experiential wargames, which in turn led, in time, to the great tabletop-gaming revolution that was Dungeons & Dragons.

And that could very well have been the end of the story, leaving the Dennis Wheatley Crime Dossiers as merely a road not taken in game history, works ahead of their time that wound up getting stranded there. But in 1979 Mayflower Books began republishing the dossiers, a complicated undertaking that involved recreating the various bits of “physical evidence” — including pills, fabric samples, cigarette butts, and even locks of hair — that had accompanied them. There is little indication that their efforts were rewarded with major sales. Yet, coming as they did at a fraught historical moment for interactive storytelling in general — the first Choose Your Own Adventure book was published that same year; the game Adventure had hit computers a couple of years before; Dungeons & Dragons was breaking into the mainstream media — the reprinted dossiers’ influence would prove surprisingly pervasive with innovators in the burgeoning field. They would, for instance, provide Marc Blank with the idea of making a sort of crime dossier of his own to accompany Infocom’s 1982 computerized mystery Deadline, thereby establishing the Infocom tradition of scene-setting “feelies” and elaborate packaging in general. And another important game whose existence is hard to imagine without the example provided by the Dennis Wheatley Crime Dossiers appeared a year before Deadline.

Prior to the Mayflower reprints, the closest available alternative to the Crime Dossiers had been a 1975 Sherlock Holmes-starring board game called 221B Baker Street: The Master Detective Game. It plays like a more coherent version of Cluedo, thanks to its use of pre-crafted mysteries included in the box rather than a reliance on random combinations of suspects, locations, and weapons. Otherwise, however, the experience isn’t all that markedly different, with players rolling dice and moving their tokens around the game board, trying to complete their “solution checklists” before their rivals. The competitive element introduces a bit of cognitive dissonance that is never really resolved: this game of Sherlock Holmes actually features several versions of Holmes, all racing around London trying to solve each mystery before the others can. But more importantly, playing it still feels more like solving a crossword puzzle than solving a mystery.

Two of those frustrated by the limitations of 221B Baker Street were Gary Grady and Suzanne Goldberg, amateur scholars of Sherlock Holmes living in San Francisco. “A game like 221B Baker Street doesn’t give a player a choice,” Grady noted. “You have no control over the clue you’re going to get and there’s no relationship of the clues to the process of play. We wanted the idea of solving a mystery rather than a puzzle.” In 1979, with the negative example of 221B Baker Street and the positive example of the Dennis Wheatley Crime Dossiers to light the way, the two started work on a mammoth undertaking that would come to be known as Sherlock Holmes Consulting Detective upon its publication two years later. Packaged and sold as a board game, it in truth had much less in common with the likes of Cluedo or 221B Baker Street than it did with the Dennis Wheatley Crime Dossiers. Grady and Goldberg provided rules for playing competitively if you insisted, and a scoring system that challenged you to solve a case after collecting the least amount of evidence possible, but just about everyone who has played it agrees that the real joy of the game is simply in solving the ten labyrinthine cases, each worthy of an Arthur Conan Doyle story of its own, that are included in the box.

Each case is housed in a booklet of its own, whose first page or two sets up the mystery to be solved in rich prose that might indeed have been lifted right out of a vintage Holmes story. The rest of the booklet consists of more paragraphs to be read as you visit various locations around London, following the evidence trail wherever it leads. When you choose to visit someplace (or somebody), you look it up in the London directory that is included, which will give you a coded reference. If that code is included in the case’s booklet, eureka, you may just have stumbled upon more information to guide your investigation; at the very least, you’ve found something new to read. In addition to the case books, you have lovingly crafted editions of the London Times from the day of each case to scour for more clues; cleverly, the newspapers used for early cases can contain clues for later cases as well, meaning the haystack in which you’re searching for needles grows steadily bigger as you progress from case to case. You also have a map of London, which can become unexpectedly useful for tracing the movements of suspects. Indeed, each case forces you to apply a whole range of approaches and modes of thought to its solution. When you think you’re ready, you turn to the “quiz book” and answer the questions about the case therein, then turn the page to find out if you were correct.
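
For the programmatically inclined, the whole apparatus boils down to a pair of lookup tables. Here is a minimal sketch in Python of the lookup the paper game has you perform by hand; the directory codes and booklet text are invented for illustration, and the real directory and case booklets are of course vastly larger.

```python
# A minimal sketch of Consulting Detective's paper "database."
# The directory codes and booklet entries here are invented.

# The London directory maps every person or place to a fixed code.
directory = {
    "Scotland Yard": "13 SW",
    "Somerset House": "42 WC",
}

# Each case booklet contains entries for only some of those codes.
case_booklet = {
    "13 SW": "The criminologist shows you dog hairs found on the body...",
}

def visit(name):
    """Look a name up in the directory, then in this case's booklet."""
    code = directory.get(name)
    if code is None:
        return "That name appears nowhere in the directory."
    return case_booklet.get(code, "Nothing of interest here for this case.")

print(visit("Scotland Yard"))   # a lead
print(visit("Somerset House"))  # a dead end, at least for this case
```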

If Sherlock Holmes Consulting Detective presents a daunting challenge to its player, the same must go ten times over for its designers. The amount of effort that must have gone into creating, collating, intertwining, and typesetting such an intricate web of information fairly boggles the mind. The game is effectively ten Dennis Wheatley Crime Dossiers in one box, all cross-referencing one another, looping back on one another. That Grady and Goldberg, working in an era before computerized word processing was widespread, managed it at all is stunning.

Unable to interest any of the established makers of board games in such an odd product, the two published it themselves, forming a little company called Sleuth Publications for the purpose. A niche product if ever there was one, it did manage to attract a champion in Games magazine, which called it “the most ingenious and realistic detective game ever devised.” The same magazine did much to raise its profile when they added it to their mail-order store in 1983. A German translation won the hugely prestigious Spiel des Jahres in 1985, a very unusual selection for a competition that typically favored spare board games of abstract logic. Over the years, Sleuth published a number of additional case packs, along with another boxed game in the same style: Gumshoe, a noirish experience rooted in Raymond Chandler rather than Arthur Conan Doyle, which was less successful, both creatively and commercially, than its predecessor.

And then these elaborate analog productions, almost defiantly old-fashioned in their reliance on paper and text and imagination, became the unlikely source material for the most high-profile computerized mysteries of the early CD-ROM era.

The transformation would be wrought by ICOM Simulations, a small developer who had always focused their efforts on emerging technology. They had first made their name with the release of Déjà Vu on the Macintosh in 1985, one of the first adventure games to replace the parser with a practical point-and-click interface; in its day, it was quite the technological marvel. Three more games built using the same engine had followed, along with ports to many, many platforms. But by the time Déjà Vu II hit the scene in 1988, the interface was starting to look a little clunky and dated next to the efforts of companies like Lucasfilm Games, and ICOM decided it was time to make a change — time to jump into the unexplored waters of CD-ROM and full-motion video. They had always been technophiles first, game designers second, as was demonstrated by the somewhat iffy designs of most of their extant games. It therefore made a degree of sense to adapt someone else’s work to CD-ROM. They decided that Sherlock Holmes Consulting Detective, that most coolly intellectual of mystery-solving board games, would counter-intuitively adapt very well to a medium that was supposed to allow hotter, more immersive computerized experiences than ever before.

As we’ve already seen, the limitations of working with chunks of static text are actually very similar in some ways to those of working with chunks of static video. ICOM thus reasoned that the board game’s methods for working around those limitations would serve the computer game equally well. The little textual vignettes which filled the case booklets, to be read as the player moved about London trying to solve the case, could be recreated by live actors. There would be no complicated branching narrative, just a player moving about London, being fed video clips of her interviews with suspects. Because the tabletop game included no mechanism for tracking where the player had already been and what she had done, the text in the case booklets had been carefully written to make no such presumptions. Again, this was perfect for a full-motion-video adaptation.
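
In engine terms, that amounts to a single stateless mapping from location to film clip. The following sketch uses hypothetical names rather than anything from ICOM’s actual code, but it captures the essential logic: nothing the player has seen or done changes what plays.

```python
# A sketch of a stateless full-motion-video engine in the Consulting
# Detective mold. All names here are hypothetical.

clips = {
    "scotland_yard": "scotland_yard.mov",
    "fenwick_home": "fenwick_interview.mov",
}

def play_video(filename):
    print(f"[playing {filename}]")

def visit(location):
    # No state is consulted or updated: revisiting a location
    # simply replays the same footage, verbatim.
    play_video(clips[location])

visit("fenwick_home")
visit("fenwick_home")  # the same clip plays again
```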

Gary Grady and Suzanne Goldberg were happy to license their work; after laboring all these years on such a complicated niche product, the day on which ICOM knocked on their door must have been a big one indeed. Ken Tarolla, the man who took charge of the project for ICOM, chose three of the ten cases from the original Sherlock Holmes Consulting Detective to serve as the basis of the computer game. He now had to reckon with the challenges of going from programming games to filming them. Undaunted, he had the vignettes from the case booklets turned into scripts by a professional screenwriter, hired 35 actors to fill the 50 speaking parts, and rented a sound stage in Minneapolis — far from ICOM’s Chicago offices, but needs must — for the shoot. The production wound up requiring 70 costumes along with 25 separate sets, a huge investment for a small developer like ICOM. In spite of their small size, they evinced a commitment to production values few of their peers could match. Notably, they didn’t take the money-saving shortcut of replacing physical sets with computer-generated graphics spliced in behind the actors. For this reason, their work holds up much better today than that of most of their peers.

Indeed, as befits a developer of ICOM’s established technical excellence — even if they were working in an entirely new medium — the video sequences are surprisingly good, the acting and set design about up to the standard of a typical daytime-television soap opera. If that seems like damning with faint praise, know that the majority of similar productions come off far, far worse. Peter Farley, the actor hired to play Holmes, may not be a Basil Rathbone, Jeremy Brett, or Benedict Cumberbatch, but neither does he embarrass himself. The interface is decent, and the game opens with a video tutorial narrated by Holmes himself — a clear sign of how hard Consulting Detective is straining to be the more mainstream, more casual form of interactive entertainment that the CD-ROM was supposed to precipitate.

First announced in 1990 and planned as a cross-platform product from the beginning, spanning the many rival CD-ROM initiatives on personal computers, set-top boxes, and game consoles, ICOM’s various versions of Consulting Detective were all delayed for long stretches by a problem which dogged every developer working in the same space: the struggle to find a way of getting video from CD-ROM to the screen at a reasonable resolution, frame rate, and number of colors. The game debuted in mid-1991 on the NEC TurboGrafx-16, an also-ran in the console wars which happened to be the first such device to offer a CD-ROM drive as an accessory. In early 1992, it made its way to the Commodore CDTV, thanks to a code library for video playback devised by Carl Sassenrath, long a pivotal figure in Amiga circles. Then, and most importantly in commercial terms, the slow advance of computing hardware finally made it possible to port the game to Macintosh and MS-DOS desktop computers equipped with CD-ROM drives later in the same year.

Sherlock Holmes Consulting Detective became a common sight in “multimedia upgrade kits” like this one from Creative Labs.

As one of the first and most audiovisually impressive products of its kind, Consulting Detective existed in an uneasy space somewhere between game and tech demo. It was hard for anyone who had never seen actual video featuring actual actors playing on a computer before to focus on much else when the game was shown to them. It was therefore frequently bundled with the “multimedia upgrade kits,” consisting of a sound card and CD-ROM drive, that were sold by companies like Creative Labs beginning in 1992. Thanks to these pack-in deals, it shipped in huge numbers by conventional games-industry standards. Thus encouraged, ICOM went back to the well for a Consulting Detective Volume II and Volume III, each with another three cases from the original board game. These releases, however, predictably did less well without the advantages of novelty and of being a common pack-in item.

As I’ve noted already, Consulting Detective looks surprisingly good on the surface even today, while at the time of its release it was nothing short of astonishing. Yet it doesn’t take much playing time before the flaws start to show through. Oddly, given the great care that so clearly went into its surface production, many of its problems feel like failures of ambition. As I’ve also already noted, no real state whatsoever is tracked by the game; you just march around London watching videos until you think you’ve assembled a complete picture of the case, then march off to trial, which takes the form of a quiz on who did what and why. If you go back to a place you’ve already been, the game doesn’t remember it: the same video clip merely plays again. This statelessness turns out to be deeply damaging to the experience. I can perhaps best explain by taking as an example the first case in the first volume of the series. (Minor spoilers do follow in the next several paragraphs. Skip down to the penultimate paragraph — beginning with “To be fair…” — to avoid them entirely.)

“The Mummy’s Curse” concerns the murder on separate occasions of all three of the archaeologists who have recently led a high-profile expedition to Egypt. One of the murders took place aboard the ship on which the expedition was returning to London, laden with treasures taken — today, we would say “looted” — from a newly discovered tomb. We can presume that one of the other passengers most likely did the deed. So, we acquire the passenger manifest for the ship and proceed to visit each of the suspects in turn. Among them are Mr. and Mrs. Fenwick, two eccentric members of the leisured class. Each of them claims not to have seen, heard, or otherwise had anything to do with the murder. But Louise Fenwick has a little dog, a Yorkshire terrier of whom she is inordinately fond and who traveled with the couple on their voyage. (Don’t judge the game too harshly from the excerpt below; it features some of the hammiest acting of all, with a Mrs. Fenwick who seems to be channeling Miss Piggy — a Miss Piggy, that is, with a fake English accent as horrid as only an American can make it.)


The existence of Mrs. Fenwick’s dog is very interesting in that the Scotland Yard criminologist who handled the case found some dog hair on the victim’s body. Our next natural instinct would be to find out whether the hair could indeed have come from a Yorkshire terrier — but revisiting Scotland Yard will only cause the video from there which we’ve already seen to play again. Thus stymied on that front, we probe further into Mrs. Fenwick’s background. We learn that the victim once gave a lecture before the Royal Society where he talked about dissecting his own Yorkshire terrier after its death, provoking the ire of the Anti-Vivisection League, of which Louise Fenwick is a member. And it gets still better: she personally harassed the victim, threatening to dissect him herself. Now, it’s very possible that this is all coincidence and red herrings, but it’s certainly something worth following up on. So we visit the Fenwicks again to ask her about it — and get to watch the video we already saw play again. Stymied once more.

This example hopefully begins to illustrate how Sherlock Holmes Consulting Detective breaks its promise to let you be the detective and solve the crime yourself in the way aficionados of mystery novels had been dreaming of doing for a century. Because the game never knows what you know, and because it lets you decide only where you go, not what you do after you get there, playing it actually becomes much more difficult than being a “real” detective. You’re constantly being hobbled by all these artificial constraints. Again and again, you find yourself seething because you can’t ask the question Holmes would most certainly be asking in your situation. It’s a form of fake difficulty, caused by the constraints of the game engine rather than the nature of the case.

Consider once more, then, how this plays out in practice in “The Mummy’s Curse.” We pick up this potentially case-cracking clue about Mrs. Fenwick’s previous relations with the victim. If we’ve ever read a mystery novel or watched a crime drama, we know immediately what to do. Caught up in the fiction, we rush back to the Fenwicks without even thinking about it. We get there, and of course it doesn’t work; we just get the same old spiel. It’s a thoroughly deflating experience. This isn’t just a sin against mimesis; it’s wholesale mimesis genocide.

It is true that the board-game version of Consulting Detective suffers from the exact same flaws born of its own statelessness. By presenting a case strictly as a collection of extant clues to be put together rather than asking you to ferret them out for yourself — by in effect eliminating from the equation both the story of the crime and the story of the investigation which turned up the clues — the Dennis Wheatley Crime Dossiers avoid most of these frustrations, at the expense of feeling like drier, more static endeavors. I will say that the infelicities of Sherlock Holmes Consulting Detective in general feel more egregious in the computer version — perhaps because the hotter medium of video promotes a depth of immersion in the fiction that makes it feel like even more of a betrayal when the immersion breaks down; or, more prosaically, simply because we feel that the computer ought to be capable of doing a better job of things than it is, while we’re more forgiving of the obvious constraints of a purely analog design.

Of course, it was this very same statelessness that made the design such an attractive one for adaptation to full-motion video in the first place. In other words, the problems with the format which Kinoautomat highlighted in 1967 aren’t quite as easy to dodge around as ICOM perhaps thought. It does feel like ICOM could have done a little better on this front, even within the limitations of full-motion video. Would it have killed them to provide a few clips instead of just one for some of the key scenes, with the one that plays dependent on what the player has already learned? Yes, I’m aware that that has the potential to become a very slippery slope indeed. But still… work with us just a bit, ICOM.
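
For what it’s worth, the modest fix I’m imagining needn’t have meant a full branching narrative. A few flags, plus an alternate clip for the handful of scenes where it matters most, would have gone a long way. A sketch of the idea, again with hypothetical names:

```python
# A sketch of the modest fix suggested above: a few flags, and
# alternate clips for key scenes gated on them. Names are hypothetical.

flags = set()  # facts the player has uncovered so far

# Each location lists (required_flag, clip) pairs, checked in order,
# with an unconditional default clip at the end.
clips = {
    "fenwick_home": [
        ("knows_vivisection_grudge", "fenwick_confrontation.mov"),
        (None, "fenwick_interview.mov"),
    ],
}

def play_video(filename):
    print(f"[playing {filename}]")

def visit(location):
    for required_flag, clip in clips[location]:
        if required_flag is None or required_flag in flags:
            play_video(clip)
            return

visit("fenwick_home")                  # the stock interview
flags.add("knows_vivisection_grudge")  # the player learns of the lecture incident
visit("fenwick_home")                  # now the confrontation scene plays
```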

While I don’t want to spend too much more time pillorying this pioneering but flawed game, I do have to point out one more issue: setting aside the problems that arise from the nature of the engine, the cases themselves often have serious problems of their own. They’ve all been shortened and simplified in comparison to the board game, which gives rise to some of the issues. That said, not everything in the board game itself is unimpeachable. Holmes’s own narratives of the cases’ solutions, which follow after you complete them by answering all of the questions in the trial phases correctly, are often rife with questionable assumptions and intuitive leaps that would never hold up on an episode of Perry Mason, much less at a real trial. At the conclusion of “The Mummy’s Curse,” for instance, he tells us there was “no reason to assume” that the three archaeologists weren’t all killed by the same person. Fair enough — but there is also no reason to assume the opposite, no reason to assume we aren’t dealing with a copycat killer or killers, given that all of the details surrounding the first of the murders were published on the front page of the London Times. And yet Holmes’s entire solution to the case follows from exactly that questionable assumption. It serves, for example, as his logic for eliminating Mrs. Fenwick as a suspect, since she had neither motive nor opportunity to kill the other two archaeologists.

To be fair to Gary Grady and Suzanne Goldberg, this case is regarded by fans of the original board game as the weakest of all ten (it actually shows up as the sixth case there). Why ICOM chose to lead with this of all cases is the greatest mystery of all. Most of the ones that follow are better — but rarely, it must be said, as airtight as our cocky friend Holmes would have them be. But then, in this sense ICOM is perhaps only being true to the Sherlock Holmes canon. For all Holmes’s purported devotion to rigorous logic, Arthur Conan Doyle’s tales never play fair with readers hoping to solve the mysteries for themselves, hinging always on similar logical fallacies and superhuman intuitive leaps. If one chooses to read the classic Sherlock Holmes stories — and many of them certainly are well worth reading — it shouldn’t be in the hope of solving their mysteries before he does.

The three volumes of Sherlock Holmes Consulting Detective would, along with a couple of other not-quite-satisfying full-motion-video games, mark the end of the line for ICOM. Faced with the mounting budgets that made it harder and harder for a small developer to survive, they left the gaming scene quietly in the mid-1990s. The catalog of games they left behind is a fairly small one, but includes in Déjà Vu and Consulting Detective two of the most technically significant works of their times. The Consulting Detective games were by no means the only interactive mysteries of the early full-motion-video era; a company called Tiger Media also released a couple of mysteries on CD-ROM, with a similar set of frustrating limitations, and the British publisher Domark even announced but never released a CD-ROM take on one of the old Dennis Wheatley Crime Dossiers. The ICOM mysteries were, however, the most prominent and popular. Flawed though they are, they remain fascinating historical artifacts with much to teach us: about the nature of those days when seeing an actual video clip playing on a monitor screen was akin to magic; about the perils and perhaps some of the hidden potential of building games out of real-world video; about game design in general. In that spirit, we’ll be exploring more experiments with full-motion video in articles to come, looking at how they circumvented — or failed to circumvent — the issues that dogged Kinoautomat, Dragon’s Lair, and Sherlock Holmes Consulting Detective alike.

(Sources: the book Media and Participation by Nico Carpentier; Byte of May 1992; Amazing Computing of May 1991, July 1991, March 1992, and May 1992; Amiga Format of March 1991; Amiga Computing of October 1992; CD-ROM Today of July 1993; Computer Gaming World of January 1991, August 1991, June 1992, and March 1993; Family Computing of February 1984; Softline of September 1982; Questbusters of July 1991 and September 1991; CU Amiga of October 1992. Online sources include Joe Pranevich’s interview with Dave Marsh on The Adventure Gamer; the home page of Kinoautomat today; Expo ’67 in Montreal; and Brian Moriarty’s annotated excerpt from Kinoautomat, taken from his lecture “I Sing the Story Electric.”

Some of the folks who once were ICOM Simulations have remastered the three cases from the first volume of the series and now sell them on Steam. The Sherlock Holmes Consulting Detective tabletop line is in print again. While I don’t love it quite as much as some do due to some of the issues mentioned in this article, it’s still a unique experience today that’s well worth checking out.)


The 68000 Wars, Part 5: The Age of Multimedia

A group of engineers from Commodore dropped in unannounced on the monthly meeting of the San Diego Amiga Users Group in April of 1988. They said they were on their way to West Germany with some important new technology to share with their European colleagues. With a few hours to spare before they had to catch their flight, they’d decided to share it with the user group’s members as well.

They had with them nothing less than the machine that would soon be released as the next-generation Amiga: the Amiga 3000. From the moment they powered it up to display the familiar Workbench startup icon re-imagined as a three-dimensional ray-traced rendering, the crowd was in awe. The new model sported a 68020 processor running at more than twice the clock speed of the old 68000, with a set of custom chips redesigned to match its throughput; graphics in 2 million colors instead of 4096, shown at non-interlaced — read, non-flickering — resolutions of 640 X 400 and beyond; an AmigaOS 2.0 Workbench that looked far more professional than the garish version 1.3 that was shipping with current Amigas. The crowd was just getting warmed up when the team said they had to run. They did, after all, have a plane to catch.

Word spread like crazy over the online services. Calls poured in to Commodore’s headquarters in West Chester, Pennsylvania, but no one there seemed to know what the callers were talking about. Clearly this must be a very top-secret project; the engineering team must have committed a major breach of protocol by jumping the gun as they had. Who would have dreamed that Commodore was already in the final stages of a project which the Amiga community had been begging them just to get started on?

Who indeed? The whole thing was a lie. The tip-off was right there in the April date of the San Diego Users Group Meeting. The president of the group, along with a few co-conspirators, had taken a Macintosh II motherboard and shoehorned it into an Amiga 2000 case. They’d had “Amiga 3000” labels typeset and stuck them on the case, and created some reasonable-looking renderings of Amiga applications, just enough to get them through the brief amount of time their team of “Commodore engineers” — actually people from the nearby Los Angeles Amiga Users Group — would spend presenting the package. When the truth came out, some in the Amiga community congratulated the culprits for a prank well-played, while others were predictably outraged. What hurt more than the fact that they had been fooled was the reality that a Macintosh that was available right now had been able to impersonate an Amiga that existed only in their dreams. If that wasn’t an ominous sign for their favored platform’s future, it was hard to say what would be.

Of course, this combination of counterfeit hardware and sketchy demos, no matter how masterfully acted before the audience, couldn’t have been all that convincing to a neutral observer with a modicum of skepticism. Like all great hoaxes, this one succeeded because it built upon what its audience already desperately wanted to believe. In doing so, it inadvertently provided a preview of what it would mean to be an Amiga user in the future: an ongoing triumph of hope over hard-won experience. It’s been said before that the worst thing you can do is to enter into a relationship in the hope that you will be able to change the other party. Amiga users would have reason to learn that lesson over and over again: Commodore would never change. Yet many would never take the lesson to heart. To be an Amiga user would be always to be fixated upon the next shiny object out there on the horizon, always to be sure this would be the thing that would finally turn everything around, only to be disappointed again and again.

Hoaxes aside, rumors about the Amiga 3000 had been swirling around since the introduction of the 500 and 2000 models in 1987. But for a long time a rumor was all the new machine was, even as the MS-DOS and Macintosh platforms continued to evolve apace. Commodore’s engineering team was dedicated and occasionally brilliant, but their numbers were tiny in comparison to those of comparable companies, much less bigger ones like Apple and IBM, the latter of which had an annual research budget greater than Commodore’s total sales. And Commodore’s engineers were perpetually underpaid and underappreciated by their managers to boot. The only real reason for a top-flight engineer to work at Commodore was love of the Amiga itself. In light of the conditions under which they were forced to work, what the engineering staff did manage to accomplish is remarkable.

After the crushing disappointment that had been the 1989 Christmas season, when Commodore’s last and most concerted attempt to break the Amiga 500 into the American mainstream had failed, it didn’t take hope long to flower again in the new year. “The chance for an explosive Amiga market growth is still there,” wrote Amazing Computing at that time, in a line that could have summed up the sentiment of every issue they published between 1986 and 1994.

Still, reasons for optimism did seem to exist. For one thing, Commodore’s American operation had another new man in charge, an event which always brought with it the hope that the new boss might not prove the same as the old boss. Replacing the unfortunately named Max Toy was Harold Copperman, a real, honest-to-goodness computer-industry veteran, coming off a twenty-year stint with IBM, followed by two years with Apple; he had almost literally stepped offstage from the New York Mac Business Expo, where he had introduced John Sculley to the speaker’s podium, and into his new office at Commodore. With the attempt to pitch the Amiga 500 to low-end users as the successor to the Commodore 64 having failed to gain any traction, the biggest current grounds for optimism were that Copperman, whose experience was in business computers, could make inroads into that market for the higher-end Amiga models. Rumor had it that the dismissal of Toy and the hiring of Copperman had occurred following a civil war that had riven the company, with one faction — Toy apparently among them — saying Commodore should de-emphasize the Amiga in favor of jumping on the MS-DOS bandwagon, while the other faction saw little future — or, perhaps better said, little profit margin — in becoming just another maker of commodity clones. If you were an Amiga fan, you could at least breathe a sigh of relief that the right side had won out in that fight.

The Amiga 3000

It was in that hopeful spring of 1990 that the real Amiga 3000, a machine custom-made for the high-end market, made its bow. It wasn’t a revolutionary update to the Amiga 2000 by any means, but it did offer some welcome enhancements. In fact, it bore some marked similarities to the hoax Amiga 3000 of 1988. For instance, replacing the old 68000 was a 32-bit 68030 processor, and replacing AmigaOS 1.3 was the new and much-improved — both practically and aesthetically — AmigaOS 2.0. The flicker of the interlaced graphics modes could finally be a thing of the past, at least if the user sprang for the right type of monitor, and a new “super-high resolution” mode of 1280 X 400 was available, albeit with only four onscreen colors. The maximum amount of “chip memory” — memory that could be addressed by the machine’s custom chips, and thus could be fully utilized for graphics and sound — had already increased from 512 K to 1 MB with the release of a “Fatter Agnus” chip, which could be retrofitted into older examples of the Amiga 500 and 2000, in 1989. Now it increased to 2 MB with the Amiga 3000.

The rather garish and toy-like AmigaOS 1.3 Workbench.

The much slicker Workbench 2.0.

So, yes, the Amiga 3000 was very welcome, as was any sign of technological progress. Yet it was also hard not to feel a little disappointed that, five years after the unveiling of the first Amiga, the platform had only advanced this far. The hard fact was that Commodore’s engineers, forced to work on a shoestring as they were, were still tinkering at the edges of the architecture that Jay Miner and his team had devised all those years before rather than truly digging into it to make the more fundamental changes that were urgently needed to keep up with the competition. The interlace flicker was eliminated, for instance, not by altering the custom chips themselves but by hanging an external “flicker fixer” onto the end of the bus to de-interlace the interlaced output they still produced before it reached the monitor. And the custom chips still ran no faster than they had in the original Amiga, meaning the hot new 68030 had to slow down to a crawl every time it needed to access the chip memory it shared with them. The color palette remained stuck at 4096 shades, and, with the exception of the new super-high resolution mode, whose weirdly stretched pixels and four colors limited its usability, the graphics modes as a whole remained unchanged. Amiga owners had spent years mocking the Apple Macintosh and the Atari ST for their allegedly unimaginative, compromised designs, contrasting them continually with Jay Miner’s elegant dream machine. Now, that argument was getting harder to make; the Amiga too was starting to look a little compromised and inelegant.

Harold Copperman personally introduced the Amiga 3000 in a lavish event — lavish at least by Commodore’s standards — held at New York City’s trendy Palladium nightclub. With CD-ROM in the offing and audiovisual standards improving rapidly across the computer industry, “multimedia” stood with the likes of “hypertext” as one of the great buzzwords of the age. Commodore was all over it, even going so far as to name the event “Multimedia Live!” From Copperman’s address:

It’s our turn. It’s our time. We had the technology four and a half years ago. In fact, we had the product ready for multimedia before multimedia was ready for a product. Today we’re improving the technology, and we’re in the catbird seat. It is our time. It is Commodore’s time.

I’m at Commodore just as multimedia becomes the most important item in the marketplace. Once again I’m with the leader. Of course, in this industry a leader doesn’t have any followers; he just has a lot of other companies trying to pass him by. But take a close look: the other companies are talking multimedia, but they’re not doing it. They’re a long way behind Commodore — not even close.

Multimedia is a first-class way for conveying a message because it takes the strength of the intellectual content and adds the verve — the emotion-grabbing, head-turning, pulse-raising impact that comes from great visuals plus a dynamic soundtrack. For everyone with a message to deliver, it unleashes extraordinary ability. For the businessman, educator, or government manager, it turns any ordinary meeting into an experience.

In a way, this speech was cut from the same cloth as the Amiga 3000 itself. It was certainly a sign of progress, but was it progress enough? Even as he sounded more engaged and more engaging than had plenty of other tepid Commodore executives, Copperman inadvertently pointed out much of what was still wrong with the organization he helmed. He was right that Commodore had had the technology to do multimedia for a long time; as I’ve argued at length elsewhere, the Amiga was in fact the world’s first multimedia personal computer, all the way back in 1985. Still, the obvious question one is left with after reading the first paragraph of the extract above is why, if Commodore had the technology to do multimedia four and a half years ago, they’ve waited until now to tell anyone about it. In short, why is the world of 1990 “ready” for multimedia when the world of 1985 wasn’t? Contrary to Copperman’s claim about being a leader, Commodore’s own management had begun to evince an understanding of what the Amiga was and what made it special only after other companies had started building computers similar to it. Real business leaders don’t wait around for the world to decide it’s ready for their products; they make products the world doesn’t yet know it needs, then tell it why it needs them. Five years after being gifted with the Amiga, which stands alongside the Macintosh as one of the two most visionary computers of the 1980s precisely because of its embrace of multimedia, Commodore managed at this event to give every impression that they were the multimedia bandwagon jumpers.

The Amiga 3000 didn’t turn into the game changer the faithful were always dreaming of. It sold moderately, mostly to the established Amiga hardcore, but had little obvious effect on the platform’s overall marketplace position. Harold Copperman was blamed for the disappointment, and was duly fired by Irving Gould, the principal shareholder and ultimate authority at Commodore, at the beginning of 1991. The new company line became an exact inversion of that which had held sway at the time of the Amiga 3000’s introduction: Copperman’s expertise was business computing, but Commodore’s future lay in consumer computing. Jim Dionne, head of Commodore’s Canadian division and supposedly an expert consumer marketer, was brought in to replace him.

An old joke began to make the rounds of the company once again. A new executive arrives at his desk at Commodore and finds three envelopes in the drawer, each labelled “open in case of emergency” and numbered one, two, and three. When the company gets into trouble for the first time on his watch, he opens the first envelope. Inside is a note: “Blame your predecessor.” So he does, and that saves his bacon for a while, but then things go south again. He opens the second envelope: “Blame your vice-presidents.” So he does, and gets another lease on life, but of course it only lasts a little while. He opens the third envelope. “Prepare three envelopes…” he begins to read.

Yet anyone who happened to be looking closely might have observed that the firing of Copperman represented something more than the usual shuffling of the deck chairs on the S.S. Commodore. Upon his promotion, it was made clear to Jim Dionne that he was to be held on a much shorter leash than his predecessors, his authority carefully circumscribed. Filling the power vacuum was one Mehdi Ali, a lawyer and finance guy who had come to Commodore a couple of years before as a consultant and had since insinuated himself ever deeper into Irving Gould’s confidence. Now he advanced to the title of president of Commodore International, Gould’s right-hand man in running the global organization; indeed, he seemed to be calling far more shots these days than his globe-trotting boss, who never seemed to be around when you needed him anyway. Ali’s rise would not prove a happy event for anyone who cared about the long-term health of the company.

For now, though, the full import of the changes in Commodore’s management structure was far from clear. Amiga users were on to the next Great White Hope, one that in fact had already been hinted at in the Palladium as the Amiga 3000 was being introduced. Once more “multimedia” would be the buzzword, but this time the focus would go back to the American consumer market Commodore had repeatedly failed to capture with the Amiga 500. The clue had been there in a seemingly innocuous, almost throwaway line from the speech delivered to the Palladium crowd by C. Lloyd Mahaffrey, Commodore’s director of marketing: “While professional users comprise the majority of the multimedia-related markets today, future plans call for penetration into the consumer market as home users begin to discover the benefits of multimedia.”

Commodore’s management, (proud?) owners of the world’s first multimedia personal computer, had for most of the latter 1980s been conspicuous by their complete disinterest in their industry’s initial forays into CD-ROM, the storage medium that, along with the graphics and sound hardware the Amiga already possessed, could have been the crowning piece of the platform’s multimedia edifice. The disinterest persisted in spite of the subtle and eventually blatant hints that were being dropped by people like Cinemaware’s Bob Jacob, whose pioneering “interactive movies” were screaming to be liberated from the constraints of 880 K floppy disks.

In 1989, a tiny piece of Commodore’s small engineering staff — described as “mavericks” by at least one source — resolved to take matters into their own hands, mating an Amiga with a CD-ROM drive and preparing a few demos designed to convince their managers of the potential that was being missed. Management was indeed convinced by the demo — but convinced to go in a radically different direction from that of simply making a CD-ROM drive that could be plugged into existing Amigas.

The Dutch electronics giant Philips had been struggling for what seemed like forever to finish something they envisioned as a whole new category of consumer electronics: a set-top box for the consumption of interactive multimedia content on CD. They called it CD-I, and it was already very, very late. Originally projected for release in time for the Christmas of 1987, its constant delays had left half the entertainment-software industry, who had invested heavily in the platform, in limbo on the whole subject of CD-ROM. What if Commodore could steal Philips’s thunder by combining a CD-ROM drive with the audiovisually capable Amiga architecture not in a desktop computer but in a set-top box of their own? This could be the magic bullet they’d been looking for, the long-awaited replacement for the Commodore 64 in American living rooms.

The industry’s fixation on these CD-ROM set-top boxes — a fixation which was hardly confined to Philips and Commodore alone — perhaps requires a bit of explanation. One thing these gadgets were not, at least if you listened to the voices promoting them, was game consoles. The set-top boxes could be used for many purposes, from displaying multimedia encyclopedias to playing music CDs. And even when they were used for pure interactive entertainment, it would be, at least potentially, adult entertainment (a term that was generally not meant in the pornographic sense, although some were already muttering about the possibilities that lurked therein as well). This was part and parcel of a vision that came to dominate much of digital entertainment between about 1989 and 1994: that of a sort of grand bargain between Northern and Southern California, a melding of the new interactive technologies coming out of Silicon Valley with the movie-making machine of Hollywood. Much of television viewing, so went the argument, would become interactive, the VCR replaced with the multimedia set-top box.

In light of all this conventional wisdom, Commodore’s determination to enter the fray — effectively to finish the job that Philips couldn’t seem to — can all too easily be seen as just another example of the me-too-ism that had clung to their earlier multimedia pronouncements. At the time, though, the project was exciting enough that Commodore was able to lure quite a number of prominent names to work with them on it. Carl Sassenrath, who had designed the core of the original AmigaOS — including its revolutionary multitasking capability — signed on again to adapt his work to the needs of a set-top box. (“In many ways, it was what we had originally dreamed for the Amiga,” he would later say of the project, a telling quote indeed.) Jim Sachs, still the most famous of Amiga artists thanks to his work on Cinemaware’s Defender of the Crown, agreed to design the look of the user interface. Reichart von Wolfsheild and Leo Schwab, both well-known Amiga developers, also joined. And for the role of marketing evangelist Commodore hired none other than Nolan Bushnell, the founder almost two decades before of Atari, the very first company to place interactive entertainment in American living rooms. The project as a whole was placed in the capable hands of Gail Wellington, known throughout the Amiga community as the only Commodore manager with a dollop of sense. The gadget itself came to be called CDTV — an acronym, Commodore would later claim in a part of the sales pitch that fooled no one, for “Commodore Dynamic Total Vision.”

Nolan Bushnell, Mr. Atari himself, plugs CDTV at a trade show.

Commodore announced CDTV at the Summer Consumer Electronics Show in June of 1990, inviting selected attendees to visit a back room and witness a small black box, looking for all the world like a VCR or a stereo component, running some simple demos. From the beginning, they worked hard to disassociate the product from the Amiga and, indeed, from computers in general. The word “Amiga” appeared nowhere on the hardware or anywhere on the packaging, and if all went according to plan CDTV would be sold next to televisions and stereos in department stores, not in computer shops. Commodore pointed out that everything from refrigerators to automobiles contained microprocessors these days, but no one called those things computers. Why should CDTV be any different? It required no monitor, instead hooking up to the family television set. It neither included nor required a keyboard — much industry research had supposedly proved that non-computer users feared keyboards more than anything else — nor even a mouse, being controlled entirely through a remote control that looked pretty much like any other specimen of same one might find between the cushions of a modern sofa. “If you know how to change TV channels,” said a spokesman, “you can take full advantage of CDTV.” It would be available, Commodore claimed, before the Christmas of 1990, which should be well before CD-I despite the latter’s monumental head start.

That timeline sounded overoptimistic even when it was first announced, and few were surprised to see the launch date slip into 1991. But the extra time did allow a surprising number of developers to jump aboard the CDTV train. Commodore had never been good at developer relations, and weren’t terribly good at it now; developers complained that the tools Commodore provided were always late and inadequate and that help with technical problems wasn’t easy to come by, while financial help was predictably nonexistent. Still, lots of CD-I projects had been left in limbo by Phillips’s dithering and were attractive targets for adaptation to CDTV, while the new platform’s Amiga underpinnings made it fairly simple to port over extant Amiga games like SimCity and Battle Chess. By early 1991, Commodore could point to about fifty officially announced CDTV titles, among them products from such heavy hitters as Grolier, Disney, Guinness (the publisher, not the beer company), Lucasfilm, and Sierra. This relatively long list of CDTV developers certainly seemed a good sign, even if not all of the products they proposed to create looked likely to be all that exciting, or perhaps even all that good. Plenty of platforms, including the original Amiga, had launched with much less.

While the world — or at least the Amiga world — held its collective breath waiting for CDTV’s debut, the charismatic Nolan Bushnell did what he had been hired to do: evangelize like crazy. “What we are really trying to do is make multimedia a reality, and I think we’ve done that,” he said. The hyperbole was flying thick and fast from all quarters. “This will change forever the way we communicate, learn, and entertain,” said Irving Gould. Not to be outdone, Bushnell noted that “books were great in their day, but books right now don’t cut it. They’re obsolete.” (Really, why was everyone so determined to declare the death of the book during this period?)

CDTV being introduced at the 1991 World of Amiga show. Doing the introducing is Gail Wellington, head of the CDTV project and one of the unsung heroes of Commodore.

The first finished CDTV units showed up at the World of Amiga show in New York City in April of 1991; Commodore sold their first 350 to the Amiga faithful there. A staggered roll-out followed: to five major American cities, Canada, and the Commodore stronghold of Britain in May; to France, Germany, and Italy in the summer; to the rest of the United States in time for Christmas. With CD-I now four years late, CDTV thus became the first CD-ROM-based set-top box you could actually go out and buy. Doing so would set you back just under $1000.

The Amiga community, despite being less than thrilled by the excision of all mention of their platform’s name from the product, greeted the launch with the same enthusiasm they had lavished on the Amiga 3000, their Great White Hope of the previous year, or for that matter the big Christmas marketing campaign of 1989. Amazing Computing spoke breathlessly of CDTV becoming the “standard for interactive multimedia consumer hardware.”

“Yes, but what is it for?” These prospective customers’ confusion is almost palpable.

Alas, there followed a movie we’ve already seen many times. Commodore’s marketing was ham-handed as usual, declaring CDTV “nothing short of revolutionary” but failing to describe in clear, comprehensible terms why anyone who was more interested in relaxing on the sofa than fomenting revolutions might actually want one. The determination to disassociate CDTV from the scary world of computers was so complete that the computer magazines weren’t even allowed advance models; Amiga Format, the biggest Amiga magazine in Britain at the time with a circulation of more than 160,000, could only manage to secure their preview unit by making a side deal with a CDTV developer. CDTV units were instead sent to stereo magazines, who shrugged their shoulders at this weird thing this weird computer company had sent them and returned to reviewing the latest conventional CD players. Nolan Bushnell, the alleged marketing genius who was supposed to be CDTV’s ace in the hole, talked a hyperbolic game at the trade shows but seemed otherwise disengaged, happy just to show up and give his speeches and pocket his fat paychecks. One could almost suspect — perish the thought! — that he had only taken this gig for the money.

In the face of all this, CDTV struggled mightily to make any headway at all. When CD-I hit the market just before Christmas, boasting more impressive hardware than CDTV for roughly the same price, it only made the hill that much steeper. Commodore now had a rival in a market category whose very existence consumers still obstinately refused to recognize. As an established maker of consumer electronics in good standing with the major retailers — something Commodore hadn’t been since the heyday of the Commodore 64 — Philips had lots of advantages in trying to flog their particular white elephant, not to mention an advertising budget their rival could only dream of. CD-I was soon everywhere, on store shelves and in the pages of the glossy lifestyle magazines, while CDTV was almost nowhere. Commodore did what they could, cutting the list price of CDTV to less than $800 and bundling with it The New Grolier Encyclopedia and the smash Amiga game Lemmings. It didn’t help. After an ugly Christmas season, Nolan Bushnell and the other big names all deserted the sinking ship.

Even leaving aside the difficulties inherent in trying to introduce people to an entirely new category of consumer electronics — difficulties that were only magnified by Commodore’s longstanding marketing ineptitude — CDTV had always been problematic in ways that had been all too easy for the true believers to overlook. It was clunky in comparison to CD-I, with a remote control that felt awkward to use, especially for games, and a drive which required that the discs first be placed into an external holder before being loaded into the unit proper. More fundamentally, the very re-purposing of old Amiga technology that had allowed it to beat CD-I to market made it an even more limited platform than its rival for running the sophisticated adult entertainments it was supposed to have enabled. Much of the delay in getting CD-I to market had been the product of a long struggle to find a way of doing video playback with some sort of reasonable fidelity. Even the released CD-I performed far from ideally in this area, but it did better than CDTV, which at best — at best, mind you — might be able to fill about a third of the television screen with low-resolution video running at a choppy twelve frames per second. It was going to be hard to facilitate a union of Silicon Valley and Hollywood with technology like that.
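
The arithmetic behind that limitation is easy enough to reconstruct. A single-speed CD-ROM drive delivers roughly 150 K of data per second, and even a modest video window eats through that almost instantly without compression. The figures below are illustrative approximations, not CDTV’s exact internal numbers:

```python
# Back-of-envelope figures for CD-ROM-era video. All illustrative.
cd_rate = 150 * 1024             # single-speed CD-ROM: ~150 KB/s

# A roughly third-of-a-screen window of, say, 192 x 128 pixels at
# one byte per pixel (256 colors), twelve frames per second:
frame_bytes = 192 * 128 * 1      # ~24 KB per uncompressed frame
video_rate = frame_bytes * 12    # ~288 KB/s of raw pixel data

print(video_rate / cd_rate)      # ~1.9: nearly double the drive's bandwidth
```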

None of CDTV’s problems were the fault of the people who had created it, who had, like so many Commodore engineers before and after them, been asked to pull off a miracle on a shoestring. They had managed to create, if not quite a miracle, something that worked far better than it had a right to. It just wasn’t quite good enough to overcome the marketing issues, the competition from CD-I, and the marketplace confusion engendered by an interactive set-top box that said it wasn’t a game console but definitely wasn’t a home computer either.

CDTV could be outfitted with a number of accessories that turned it into more of a “real” computer. Still, those making software for the system couldn’t count on any of these accessories being present, which served to greatly restrict their products’ scope of possibility.

Which isn’t to say that some groundbreaking work wasn’t done by the developers who took a leap of faith on Commodore — almost always a bad bet in financial terms — and produced software for the platform. CDTV’s early software catalog was actually much more impressive than that of CD-I, whose long gestation had caused so many initially enthusiastic developers to walk away in disgust. The New Grolier Encyclopedia was a true multimedia encyclopedia; the entry for John F. Kennedy, for example, included not only a textual biography and photos to go along with it but audio excerpts from his most famous speeches. The American Heritage Dictionary also offered images where relevant, along with an audio pronunciation of every single word. American Vista: The Multimedia U.S. Atlas boasted lots of imagery of its own to add flavor to its maps, and could plan a route between any two points in the country at the click of a button. All of these things may sound ordinary today, but in a way that very modern ordinariness is a testament to what pioneering products these really were. They did in fact present an argument that, while others merely talked about the multimedia future, Commodore through CDTV was doing it — imperfectly and clunkily, yes, but one has to start somewhere.

One of the most impressive CDTV titles of all marked the return of one of the Amiga’s most beloved icons. After designing the CDTV’s menu system, the indefatigable Jim Sachs returned to the scene of his most famous creation. Really a remake rather than a sequel, Defender of the Crown II reintroduced many of the graphics and tactical complexities that had been excised from the original in the name of saving time, pairing them with a full orchestral soundtrack, digitized sound effects, and a narrator to detail the proceedings in the appropriate dulcet English accent. It was, Sachs said, “the game the original Defender of the Crown was meant to be, both in gameplay and graphics.” He did almost all of the work on this elaborate multimedia production himself, farming out little more than the aforementioned narration, and Commodore themselves released the game, having acquired the right to do so from the now-defunct Cinemaware at auction. While, as with the original, its long-term play value is perhaps questionable, Defender of the Crown II even today still looks and sounds mouth-wateringly gorgeous.


If any one title on CDTV was impressive enough to sell the machine by itself, this ought to have been it. Unfortunately, it didn’t appear until well into 1992, by which time CDTV already had the odor of death clinging to it. The very fact that Commodore allowed the game to be billed as the sequel to one so intimately connected to the Amiga’s early days speaks to a marketing change they had instituted to try to breathe some life back into the platform.

The change was born out of an insurrection staged by Commodore’s United Kingdom branch, who always seemed to be about five steps ahead of the home office in any area you cared to name. Kelly Sumner, managing director of Commodore UK:

We weren’t involved in any of the development of CDTV technology; that was all done in America. We were taking the lead from the corporate company. And there was a concrete stance of “this is how you promote it, this is the way forward, don’t do this, don’t do that.” So, that’s what we did.

But after six or eight months we basically turned around and said, “You don’t know what you’re talking about. It ain’t going to go anywhere, and if it does go anywhere you’re going to have to spend so much money that it isn’t worth doing. So, we’re going to call it the Amiga CDTV, we’re going to produce a package with disk drives and such like, and we’re going to promote it like that. People can understand that, and you don’t have to spend so much money.”

True to their word, Commodore UK put together what they called “The Multimedia Home Computer Pack,” combining a CDTV unit with a keyboard, a mouse, an external disk drive, and the software necessary to use it as a conventional Amiga as well as a multimedia appliance — all for just £100 more than a CDTV unit alone. Commodore’s American operation grudgingly followed their lead, allowing the word “Amiga” to creep back into their presentations and advertising copy.

Very late in the day, Commodore finally began acknowledging and even celebrating CDTV’s Amigahood.

But it was too late — and not only for CDTV but in another sense for the Amiga platform itself. The great hidden cost of the CDTV disappointment was the damage it did to the prospects for CD-ROM on the Amiga proper. Commodore had been so determined to position CDTV as its own thing that they had rejected the possibility of equipping Amiga computers as well with CD-ROM drives, despite the pleas of software developers and everyday customers alike. A CD-ROM drive wasn’t officially mated to the world’s first multimedia personal computer until the fall of 1992, when, with CDTV now all but left for dead, Commodore finally started shipping an external drive that made it possible to run most CDTV software, as well as CD-based software designed specifically for Amiga computers, on an Amiga 500. Even then, Commodore provided no official CD-ROM solution for Amiga 2000 and 3000 owners, forcing them to cobble together third-party adapters that could interface with drives designed for the Macintosh. The people who owned the high-end Amiga models, of course, were the ones working in the very cutting-edge fields that cried out for CD-ROM.

It’s difficult to overstate how much damage the Amiga’s absence from the CD-ROM party, the hottest ticket in computing at the time, did to the platform’s prospects. It single-handedly gave the lie to every word in Harold Copperman’s 1990 speech about Commodore being “the leaders in multimedia.” Many of the most vibrant Amiga developers were forced to shift to the Macintosh or another platform by the lack of CD-ROM support. Of all Commodore’s failures, this one must rank among the largest. They allowed the Macintosh to become the platform most associated with the new era of CD-ROM-enabled multimedia computing without even bothering to contest the territory. The war was over before Commodore even realized a war was on.

Commodore’s feeble last gasp in terms of marketing CDTV positioned it as essentially an accessory to desktop Amigas, a “low-cost delivery system for multimedia” targeted at business and government rather than living rooms. The idea was that you could create presentations on Amiga computers, send them off to be mastered onto CD, then drag the CDTV along to board meetings or planning councils to show them off. In that spirit, a CDTV unit was reduced to a free toss-in if you bought an Amiga 3000 — two slow-selling products that deserved one another.

The final verdict on CDTV is about as ugly as they come: fewer than 30,000 sold worldwide in some eighteen months of trying; fewer than 10,000 sold in the American market Commodore so desperately wanted to break back into, and many or most of those sold at fire-sale discounts after the platform’s fate was clear. In other words, the 350 CDTV units that had been sold to the faithful at that first ebullient World of Amiga show made up an alarmingly high percentage of all the CDTV units that would ever sell. (Philips, by contrast, would eventually manage to move about 1 million CD-I units over the course of about seven years of trying.)

The picture I’ve painted of the state of Commodore thus far is a fairly bleak one. Yet that bleakness wasn’t really reflected in the company’s bottom line during the first couple of years of the 1990s. For all the trouble Commodore had breaking new products in North America and elsewhere, their legacy products were still a force to be reckoned with outside the United States. Here the end of the Cold War and subsequent lifting of the Iron Curtain proved a boon. The newly liberated peoples of Eastern Europe were eager to get their hands on Western computers and computer games, but had little money to spend on them. The venerable old Commodore 64, pulling along behind it that rich catalog of thousands upon thousands of games of all stripes, was the perfect machine for these emerging markets. Effectively dead in North America and trending that way in Western Europe, it now enjoyed a new lease on life in the former Soviet sphere, its sales numbers suddenly climbing sharply again instead of falling. The Commodore 64 was, it seemed, the cockroach of computers; you just couldn’t kill it. Not that Commodore wanted to: they would happily bank every dollar their most famous creation could still earn them. Meanwhile the Amiga 500 was selling better than ever in Western Europe, where it was now the most popular single gaming platform of all, and Commodore happily banked those profits as well.

Commodore’s stock even enjoyed a short-lived bubble of sorts. In the spring and early summer of 1991, with sales strong all over Europe and CDTV poised to hit the scene, the stock price soared past $20, stratospheric heights by Commodore’s recent standards. This being Commodore, the stock collapsed below $10 again just as quickly — but, hey, it was nice while it lasted. In the fiscal year ending on June 30, 1991, worldwide sales topped the magical $1 billion mark, another height that had last been seen in the heyday of the Commodore 64. Commodore was now the second most popular maker of personal computers in Europe, with a market share of 12.4 percent, just slightly behind IBM’s 12.7 percent. The Amiga was now selling at a clip of 1 million machines per year, which would bring the total installed base to 4.5 million by the end of 1992. Of that total, 3.5 million were in Europe: 1.3 million in Germany, 1.2 million in Britain, 600,000 in Italy, 250,000 in France, 80,000 in Scandinavia. (Ironically in light of the machine’s Spanish name, one of the few places in Western Europe where it never did well at all was Spain.) To celebrate their European success, Irving Gould and Mehdi Ali took home salaries in 1991 of $1.75 million and $2.4 million respectively, the latter figure $400,000 more than the chairman of IBM, a company fifty times Commodore’s size, was earning.

But it wasn’t hard to see that Commodore, in relying on all of these legacy products sold in foreign markets, was living on borrowed time. Even in Europe, MS-DOS was beginning to slowly creep up on the Amiga as a gaming platform by 1992, while Nintendo and Sega, the two big Japanese console makers, were finally starting to take notice of this virgin territory after having ignored it for so long. While Amiga sales in Europe in 1992 remained blessedly steady, sales of the Amiga in North America were down as usual, sales of the Commodore 64 in Eastern Europe fell off thanks to economic chaos in the region, and sales of Commodore’s line of commodity PC clones cratered so badly that they pulled out of that market entirely. It all added up to a bottom line of about $900 million in total sales for the fiscal year ending on June 30, 1992. The company was still profitable, but considerably less so than it had been the year before. Everyone was now looking forward to 1993 with more than a little trepidation.

Even as Commodore faced an uncertain future, they could at least take comfort that their arch-enemy Atari was having a much worse time of it. In the very early 1990s, Atari enjoyed some success, if not as much as they had hoped, with their Lynx handheld game console, a more upscale rival to the Nintendo Game Boy. The Atari Portfolio, a genuinely groundbreaking palmtop computer, also did fairly well for them, if perhaps not quite as well as it deserved. But the story of their flagship computing platform, the Atari ST, was less happy. Already all but dead in the United States, the ST saw its market share in Europe shrink in proportion to the Amiga’s increasing sales, such that it fell from the second to the third most popular gaming computer there in 1991, trailing MS-DOS now as well as the Amiga.

Atari tried to remedy the slowing sales with new machines they called the STe line, which increased the color palette to 4096 shades and added a blitter chip to aid onscreen animation. (The delighted Amiga zealots at Amazing Computing wrote of these Amiga-inspired developments that they reminded them of “an Amiga 500 created by a primitive tribe that had never actually seen an Amiga, but had heard reports from missionaries of what the Amiga could do.”) But the new hardware broke compatibility with much existing software, and it only got harder to justify buying an STe instead of an Amiga 500 as the latter’s price slowly fell. Atari’s total sales in 1991 were just $285 million, down by some 30 percent from the previous year and barely a quarter of the numbers Commodore was doing. Jack Tramiel and his sons kept their heads above water only by selling off pieces of the company, such as the Taiwanese manufacturing facility that went for $40.9 million that year. You didn’t have to be an expert in the computer business to understand how unsustainable that path was. In the second quarter of 1992, Atari posted a loss of $39.8 million on sales of just $23.3 million, a rather remarkable feat in itself. Whatever else lay in store for Commodore and the Amiga, they had apparently buried old Mr. “Business is War.”

Still, this was no time to bask in the glow of sweet revenge. The question of where Commodore and the Amiga went from here was being asked with increasing urgency in 1992, and for very good reason. The answer would arrive in the latter half of the year, in the form at long last of the real, fundamental technical improvements the Amiga community had been begging for for so long. But had Commodore done enough, and had they done it in time to make a difference? Those questions loomed large as the 68000 Wars were about to enter their final phase.

(Sources: the book On the Edge: The Spectacular Rise and Fall of Commodore by Brian Bagnall; Amazing Computing of August 1987, June 1988, June 1989, July 1989, May 1990, June 1990, July 1990, August 1990, September 1990, December 1990, January 1991, February 1991, March 1991, April 1991, May 1991, June 1991, August 1991, September 1991, November 1991, January 1992, February 1992, March 1992, April 1992, June 1992, July 1992, August 1992, September 1992, November 1992, and December 1992; Info of July/August 1988 and January/February 1989; Amiga Format of July 1991, July 1995, and the 1992 annual; The One of September 1990, May 1991, and December 1991; CU Amiga of June 1992, October 1992, and November 1992; Amiga Computing of April 1992; AmigaWorld of June 1991. Online sources include Matt Barton’s YouTube interview with Jim Sachs, Sébastien Jeudy’s interview with Carl Sassenrath, Greg Donner’s Workbench Nostalgia, and Atari’s annual reports from 1989, available on archive.org. My huge thanks to reader “himitsu” for pointing me to the last and providing some other useful information on Commodore and Atari’s financials during this period in the comments to a previous article in this series. And thank you to Reichart von Wolfsheild, who took time from his busy schedule to spend a Saturday morning with me looking back on the CDTV project.)

 
 


Games on the Mersey, Part 5: The Lemmings Effect

“Mummy, what’s a group of lemmings called?”

“A pact. That’s right, a suicide pact.”

“Mummy, when a lemming dies does it go to heaven like all good little girls?”

“Don’t be soft. They burn in hell — like distress flares.”

(courtesy of the January 1991 issue of The One)


 

If you had looked at the state of Psygnosis in 1990 and tried to decide which of their outside developers would break the mold of beautiful-but-empty action games, you would have had no reason to single out DMA Design over any of the others. Certainly Menace and Blood Money, the two games DMA had already created for Psygnosis, gave little sign that any visionaries lurked within their ranks. From their generic teenage-cool titles to their rote gameplay, both games were as typical of Psygnosis as anything else in their catalog.

And yet DMA Design — and particularly their leader, David Jones — did in fact have abilities as yet undreamt of. People have a way of surprising you sometimes. And isn’t that a wonderful thing?


 

There are some interesting parallels between the early days of DMA Design and the early days of Imagine Software, that predecessor to Psygnosis. Like Imagine, DMA was born in a city far from the cultural capitals of Britain — even farther away than Liverpool, in fact, all the way up in Dundee, Scotland.

Dundee as well had traditionally been a working-class town that thrived as a seaport, until the increasing size of merchant ships gradually made the role untenable over the course of the twentieth century. Leaders in Dundee, as in Liverpool, worked earnestly to find new foundations for their city’s economy. And one of these vectors of economic possibility — yet again as in Liverpool — was electronics. Dundee convinced the American company National Cash Register Corporation, better known as NCR, to build their principal manufacturing plant for Britain and much of Europe there as early as 1945, and by the 1960s the city was known throughout Britain as a hub of electronics manufacture. Among the electronics companies that came to Dundee was Timex, the Norwegian/Dutch/American watchmaking conglomerate.

In 1980, Sinclair Research subcontracted out most of the manufacture of their new ZX80 home computer to the Timex plant in Dundee. The relationship continued with the ZX81, and then with Sinclair’s real heavy hitter, the Spectrum. By 1983, Timex was straining to keep up with demand for the little machines, hiring like mad in and around Dundee in order to keep the production lines rolling day and night.

David Jones. (It’s my understanding that the nose is removable.)

One of the people they hired was David Jones, 18 years old and fresh out of an experimental new “computer studies” program that was being pioneered in Dundee. He came to Timex on an apprenticeship, which paid for him to take more computer courses at nearby Kingsway Technical College.

Just as Bruce Everiss’s Microdigital shop had done in Liverpool, Kingsway College was fomenting a hacking scene in Dundee, made up largely of working-class youths who but for this golden chance might have had to resign themselves to lives spent sweeping out warehouses. Both students of the college and interested non-students would meet regularly in the common areas to talk shop and, inevitably, to trade pirated games. Jones became one of the informal leaders of the collective. To ensure a steady supply of games for his mates, he even joined a cracking group in the international piracy “scene” who called themselves the Kent Team.

But for the most dedicated of the Kingsway gang, Jones among them, trading and playing games was a diversion rather than the real point. Like the gang as a whole, this hacker hardcore was of disparate temperaments and ages — one of them, named Mike Dailly, was just 14 years old when he started showing up at the college — but they were united by a certain seriousness about computers, by the fact that computers for them were, rather than just a hobby or even a potential means of making a living, an all-consuming passion. Unlike their more dilettantish peers, they were more interested in understanding how the games they copied worked and learning how to make their own than they were in playing them for their own sake.

In 1986, Sinclair sold their entire extant home-computer line to Amstrad, who promptly took the manufacturing of same in-house, leaving Timex out of a huge contract. Most of the Dundee plant’s employees, Jones among them, were laid off as a result. At the urging of his parents, he invested half of his £2000 severance check into a degree course in Computer Science at the Dundee College of Technology (now known as Abertay University). He used the other half to buy a Commodore Amiga.

Brand-new and pricey, the Amiga was still a very exotic piece of kit anywhere in Britain in 1986, much less in far-flung Dundee; Jones may very well have been the first person in his hometown to own one. His new computer made him more popular than ever among the Kingsway hackers. With the help of his best buddies from there and some others he’d met through the Kent Team, his experiments with his new toy gradually coalesced around making a shoot-em-up game that he could hopefully sell to a publisher.

David Jones first met Ian Hetherington and Dave Lawson of Psygnosis in late 1987, when he took his shoot-em-up-in-progress to a Personal Computer World Show to hawk it to potential publishers. Still at that age when the events of last year, much less those of three years ago, seem like ancient history, he had little awareness of their company’s checkered past as Imagine, and less concern about it. “You know, I don’t think I even researched it that well,” he says. “I remember the stories about it, but back in those days everything was moving so quickly, it never even crossed my mind.” Instead he was wowed, like so many developers, by Psygnosis’s cool good looks. The idea of seeing his game gussied up by the likes of Roger Dean was a difficult one to resist. He also liked the fact that Psygnosis was, relatively speaking, close to Dundee: only about half a day by car.

Menace

Psygnosis was perhaps slightly less impressed. They agreed to publish the game, running it through their legendary art department in the process, giving it the requisite Roger Dean cover, and changing the name from the Psygnosis-like Draconia to the still more Psygnosis-like Menace. But they didn’t quite judge it to be up to the standard of their flagship games, publishing it in a smaller box under a new budget line they were starting called Psyclapse (a name shared with the proposed but never-completed second megagame from the Imagine days).

Still, even at a budget price and thus a budget royalty rate, Menace did well enough to make Jones a very happy young man, selling about 20,000 copies in the still emerging European Amiga market. It even got a showcase spot in the United States, when the popular television program Computer Chronicles devoted a rare episode to the Amiga and Commodore’s representative chose Menace as the game to show off alongside Interplay’s hit Battle Chess; host Stewart Cheifet was wowed by the “really hot graphics.” A huge auto buff — another trait he shared with the benighted original founders of Imagine — Jones made enough from the game to buy his first new car. It was a Vauxhall Astra hot hatch rather than a Ferrari, but, hey, you had to start somewhere.

Even before Menace‘s release, he had taken to calling his informal game-development club DMA Design, a name that fit right in with Psygnosis’s effect-obsessed aesthetic: “DMA” in the computer world is short for “Direct Memory Access,” something the Amiga’s custom chips all utilized to generate audiovisual effects without burdening the CPU. (When the glory days of Psygnosis with Amiga tech-heads had passed, Jones would take to saying that the name stood for “Doesn’t Mean Anything.”) In the wake of Menace‘s success, he decided to drop out of college and hang out his shingle in a cramped two-room space owned by his fiancée’s father, located on a quiet street above a baby shop and across the way from a fish-and-chips joint. He moved in in August of 1989, bringing with him many of his old Kingsway College mates on a full-time or part-time basis, as their circumstances and his little company’s income stream dictated.

DMA’s first office, a nondescript place perched above a baby shop in Dundee. That’s artist Gary Timmons peering out the window. It was in this improbable location that the most successful game the British games industry had produced to date was born.

DMA, which still had more the atmosphere of a computer clubhouse than a conventional company, gladly took on whatever projects Psygnosis threw them, including the rather thankless task of converting existing Amiga titles — among them Shadow of the Beast — to more limited, non-Amiga platforms like the Commodore 64. Meanwhile Jones shepherded to completion their second original game, another typically Psygnosisian confection called Blood Money. This game came out as a full-price title, a sign that they were working their way up through the ranks. Even better, it sold twice as many copies as had Menace.

So, they charged ahead on yet another game cut from the same cloth. Just about everything you need to know about their plans for Gore is contained in the name. Should you insist on more information, consider this sample from a magazine preview of the work-in-progress: “Slicing an adversary’s neck in two doesn’t necessarily guarantee its defeat. There’s a chance the decapitated head will sprout wings and fly right back at you!” But then, in the midst of work on that charming creation, fate intervened to change everything forever for DMA and Psygnosis alike.

The process that would lead to the most popular computer game the nation of Britain had yet produced had actually begun months before, in fact within days of the DMA crew moving into their new clubhouse. It all began with an argument.

Blood Money was in the final stages of development at the time — it would be released before the end of 1989 — and, having not yet settled on making Gore, Jones and company had been toying with ideas for a potential sequel they tentatively called Walker, based around one of the characters in Blood Money, a war robot obviously inspired by the Imperial walkers in Star Wars: The Empire Strikes Back. Scott Johnson, an artist whom DMA had recently rescued from a life of servitude behind the counter of the local McDonald’s, said that the little men which the walker in the new game would shoot with its laser gun and crush beneath its feet would need to be at least 16 by 16 pixels in size to look decent. Mike Dailly, the Kingsway club veteran, disagreed, and set about trying to prove himself right by drawing them inside a box of just 8 by 8 pixels.

But once he got started drawing little men in Deluxe Paint, he just couldn’t stop. (Deluxe Paint did always tend to have that effect on people; not for nothing did artist George Christensen once call it “the greatest videogame ever designed.”) Just for fun, he added a ten-ton weight out of a Coyote-and-Road-Runner cartoon, which crushed the little fellows as they walked beneath it. Johnson soon jumped back into the fray as well, and the two wound up creating a screen-full of animations, mostly showing the little characters coming to various unhappy ends. Almost forgotten by the end of the day was the fact that Dailly had proved his point: a size of 8 by 8 pixels was enough.

This screen of Deluxe Paint animations, thrown together on a lark one afternoon by a couple of bored young men, would spawn a franchise which sold 15 million games.

It was another member of the DMA club, Russell Kay, who looked at the animations and spoke the fateful words: “There’s a game in that!” He took to calling the little men lemmings. It was something about the way they always seemed to be moving in lockstep and in great quantities across the screen whenever they showed up — and the way they were always blundering into whatever forms of imaginative deaths their illustrators could conjure up. The new name resulted in the appearance of the little men morphing into something not quite rodent-like but not quite human either, a bunch of vaguely Smurf-like fellows with bulbous noses and a cuteness people all over the world would soon find irresistible — even as they watched them die horribly by the dozens.

The very first lemmings

 


Did you know?

James Bond author Ian Fleming’s real name was in fact Ian Frank Lemming. It doesn’t take a genius to see how easily a mistake was made.

Success for Jane Lemming was always on the cards, but it took a change of name and hairstyle to become a top television personality — better known as Jan Leeming.

“One day you’ll be a big movie star,” someone once told leading Hollywood heartthrob Jack Lemmon (real name Jack Lemming). And he is.

(courtesy of the January 1991 issue of The One)

Real-world lemmings are in fact very different from the myth of little rodent soldiers following one another blindly off the side of a cliff. The myth, which may first have shown up in print in Cyril Kornbluth’s 1951 science-fiction story “The Marching Morons,” was popularized by Disney later in the decade, first in their cartoons and later in a purported nature documentary called White Wilderness, whose makers herded the poor creatures off a cliff while the cameras rolled.

While it has little to no basis in reality, the myth can be a difficult proposition to resist in terms of metaphor. Fans of Infocom’s interactive fiction may remember lemmings showing up in Trinity, where the parallel between lemmings marching off a cliff and Cold War nuclear brinkmanship becomes a part of author Brian Moriarty’s rich symbolic language. It’s safe to say, though, that no similarly rarefied thoughts about the creatures were swirling around DMA Design. They just thought they were really, really funny.


 

Despite Russell Kay’s prophetic comment, no one at DMA Design was initially in a hurry to do anything more with their lemmings than amuse themselves. For some months, the lemmings were merely a joke that was passed around DMA’s circle in Dundee in the form of pictures and animations, showing them doing various ridiculous things and getting killed in various hilarious ways.

In time, though, Jones found himself at something of an impasse with the Gore project. He had decided he simply couldn’t show all the gore he wanted to on a 512 K Amiga, but Psygnosis was extremely reluctant, despite the example of other recent releases like Dungeon Master, to let him make a game that required 1 MB of memory. He decided to shelve Gore for a while, to let the market catch up to his ambitions for it. In the meantime, he turned, almost reluctantly, to those lemmings that were still crawling all over the office, to see if there was indeed a game there. The answer, of course, would prove to be a resounding yes. There was one hell of a game there, one great enough to ensure that Gore would never cross anyone’s mind again.

The game which Jones started to program in June of 1990 was in one sense a natural evolution of the animations, so often featuring long lines of marching lemmings, that his colleagues had been creating in recent months. In another, though, it was a dramatic break from anything DMA — or, for that matter, Psygnosis — had done before, evolving into a constructive puzzle game rather than another destructive action game.

That said, reflexes and timing and even a certain amount of destruction certainly have their roles to play as well. Lemmings is a level-based game. The little fellows — up to 100 of them in all — pour out of a chute and march mindlessly across the level’s terrain from left to right. They’ll happily walk off cliffs or into vats of water or acid, or straight into whatever other traps the level contains. They’ll turn around and march in the other direction only if they encounter a wall or other solid barrier — and, once they start going from right to left instead of left to right, they won’t stop until they’re dead or they’re forced to turn around yet again. Your task is to alter their behavior just enough to get as many of them as possible safely to the level’s exit. Doing so often requires sacrificing some of them for the greater good.

To meet your goal, you have a limited but surprisingly flexible palette of possibilities at your disposal. You can change the behavior of individual lemmings by assigning them one or both of two special abilities, and/or by telling them to perform one of six special actions. The special abilities include making a lemming a “climber,” able to crawl up sheer vertical surfaces like a sort of inchworm; and making a lemming a “floater,” equipped with an umbrella-cum-parachute which will let him fall any distance without harm. The special actions include telling a lemming to blow himself up (!), possibly damaging the terrain around him in the process; turning him into a “blocker,” standing in one place and forcing any other lemmings who bump into him to turn around and march in the other direction; having him build a section of bridge — or, perhaps better said, of an upward-angling ramp; or having him dig in any of three directions: horizontally, diagonally, or vertically. Each level lets you use each special ability and each special action only a limited number of times. These restrictions are key to much of the challenge of the later levels; advanced Lemmings players become all too familiar with the frustration of winding up short by that one bridge-builder or digger, and having to start over with a completely new approach because of it.

In addition to the tools at your disposal which apply to individual lemmings, you also have a couple of more universal tools to hand. You can control the rate at which lemmings pour out of the entrance chute — although you can’t slow them down below a level’s starting speed — and you can pause the game to take a breather and position your cursor just right. The later levels require you to take advantage of both of these abilities, not least because each level has a time limit.
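The interplay of these mechanics is simple enough to capture in code, and doing so may make the design clearer. Below is a minimal sketch in C of the bookkeeping just described: a per-level budget of each special ability and action, plus a release rate that can be raised but never pushed below the level’s starting speed. Every name and number here is my own invention for illustration’s sake; none of it comes from DMA’s actual source.

```c
/* A toy model of a Lemmings level's bookkeeping. All names and numbers
   are invented for illustration; none come from the actual game. */
#include <stdio.h>

enum Action { CLIMBER, FLOATER, BOMBER, BLOCKER,
              BUILDER, BASHER, MINER, DIGGER, NUM_ACTIONS };

typedef struct {
    int quota[NUM_ACTIONS]; /* remaining uses of each ability/action */
    int release_rate;       /* how quickly lemmings pour from the chute */
    int min_release_rate;   /* the starting speed is also the floor */
    int seconds_left;       /* every level has a time limit */
} Level;

/* Spend one use of an action, if the level still allows it. */
int try_action(Level *lv, enum Action a) {
    if (lv->quota[a] == 0) return 0; /* all out: time to rethink the plan */
    lv->quota[a]--;
    return 1;
}

/* The release rate can be raised at will, but never lowered below
   the level's starting value. */
void set_release_rate(Level *lv, int rate) {
    lv->release_rate = (rate < lv->min_release_rate)
                           ? lv->min_release_rate : rate;
}

int main(void) {
    /* A hypothetical level: two bombers, two blockers, three builders,
       one basher, one digger, release rate 50, five minutes. */
    Level lv = { {0, 0, 2, 2, 3, 1, 0, 1}, 50, 50, 5 * 60 };
    if (try_action(&lv, BASHER)) puts("bashing through the outcropping...");
    if (!try_action(&lv, BASHER)) puts("no bashers left: start over!");
    set_release_rate(&lv, 1); /* clamped back up to the starting 50 */
    printf("release rate: %d\n", lv.release_rate);
    return 0;
}
```

Run out of your one basher with a wall still in the way, and there’s nothing for it but to restart the level with a new plan — exactly the frustration described above.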


A Four-Screenshot Introduction to Lemmings

We’re about to begin one of the early levels — the first on the second difficulty level, or 31st out of 120 in all. We see the level’s name, the number of lemmings with which we have to deal, the number we’re required to save, their “release rate” — how quickly they fall out of the chute and into the world — and how much time we have.

Now we’ve started the level proper. We need to build a bridge to cross this gap. To keep the other lemmings from rushing after our slow-working bridge-builder and falling into the abyss, we’ve turned the one just behind him into a blocker. When we’re ready to let the lemmings all march onward, we can tell the blocker to blow himself up, thus clearing the way again. Note our toolbar at the bottom of the screen, including the count of how many of each type of action we have left.

With the gap safely bridged, we turn our lead lemming into a horizontal digger — or “basher” in the game’s preferred nomenclature — to get through the outcropping.

There were no other barriers between our lead lemming and the exit. So, we’ve blown up our blocker to release the hounds — er, lemmings — and now watch them stream toward the exit. We’ve only lost one lemming in total, that one being our poor blocker — a 99-percent survival rate on a level that only required us to save 50 percent. But don’t get too cocky; the levels will soon start getting much, much harder.


 

Setting aside design considerations for the moment, Lemmings is nothing short of an amazing feat in purely technical terms. Its levels, which usually sprawl over the width of several screens, consist entirely of deformable terrain. In other words, you can, assuming you have the actions at your disposal, dig wherever you want, build bridges wherever you want, etc., and the lemmings will interact with the changed terrain just as you would expect. To have the computer not just paint a landscape onto the screen but to be aware of and reactive to the potentially changing contents of every single pixel was remarkable in the game’s day. And then to move up to 100 independent lemmings in real time, while also being responsive to the player’s inputs… Lemmings is a program any hacker would be thrilled to claim.
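To make that feat a little more concrete, here is a toy sketch in C of the sort of pixel-level logic involved, assuming for illustration a tiny monochrome playfield. The terrain is nothing but an array of pixels; each tick, a lemming falls if there is no solid pixel underfoot, walks ahead otherwise, and turns around when a wall blocks the way. A basher would simply erase the pixels in front of him, and the very same tick logic would then react to the altered terrain automatically. Needless to say, DMA’s real engine was vastly more sophisticated than this, and none of these names come from it.

```c
/* A toy deformable-terrain walker: '#' is a solid pixel, '.' is air.
   An illustrative sketch only -- not DMA's algorithm. */
#include <stdio.h>

#define W 16
#define H 8

static char terrain[H][W + 1] = {
    "................",
    "................",
    "................",
    "................",
    "................",
    "................",
    "####....########",
    "################",
};

typedef struct { int x, y, dir; } Lemming; /* dir: +1 right, -1 left */

static int solid(int x, int y) {
    return x >= 0 && x < W && y >= 0 && y < H && terrain[y][x] == '#';
}

/* One tick of the simulation. A real engine would also track falling
   distance (to splat the careless) and check for traps and exits. */
static void tick(Lemming *l) {
    if (!solid(l->x, l->y + 1)) { l->y++; return; }   /* nothing underfoot */
    if (solid(l->x + l->dir, l->y)) l->dir = -l->dir; /* wall: turn around */
    else l->x += l->dir;                              /* walk onward */
}

/* "Bashing" is just deleting terrain pixels; tick() reacts on its own. */
void bash(const Lemming *l) {
    if (l->x + l->dir >= 0 && l->x + l->dir < W)
        terrain[l->y][l->x + l->dir] = '.';
}

int main(void) {
    Lemming l = { 2, 0, +1 };
    for (int t = 0; t < 14; t++) { /* falls, walks, turns at the wall */
        tick(&l);
        printf("t=%2d  x=%2d  y=%d  dir=%+d\n", t, l.x, l.y, l.dir);
    }
    return 0;
}
```

Because every query goes through the terrain array itself, any change to it, whether from a digger below or a builder’s freshly laid ramp, is picked up on the next tick with no extra machinery, which is precisely the sort of thing that made moving 100 lemmings at once over changing ground such a feat on 1990 hardware.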

As soon as he had the basic engine up and running, David Jones visited Psygnosis with it and “four or eight” levels in tow, to show them what he was now planning to turn into DMA’s next game. Jones:

They were a big company, probably about thirty or forty people. I said, “I’ll just go out to lunch, but what I’ll do is I’ll just leave the demo with a bunch of you here — grab it, play it, see what you think.” I remember coming back from lunch and it was on every single machine in the office. And everybody was just really, really enjoying it. At that time I thought, “Well, we have something really special here.”

That reaction would become typical. In an earlier article, I wrote about a Tetris Effect, meant to describe the way that game took over lives and destroyed productivity wherever it went. We might just as well coin the term “Lemmings Effect” now to describe a similar phenomenon. Part and parcel of the Lemmings Effect were all the terrible jokes: “What do lemmings drink?” “Lemmingade!” “What’s their favorite dessert?” “Lemming meringue pie!”

Following its release, Lemmings would be widely greeted as an immaculate creation, a stroke of genius with no antecedents. Our old friend Bruce Everiss, not someone inclined to praise anything Ian Hetherington was involved in without ample justification, nevertheless expressed the sentiment in his inimitably hyperbolic way in 1995:

In the worlds of novels and cinema, it is recognised that there are only a small number of plots in the universe. Each new book or film takes one of these plots and interprets it in a different way.

So it is in computer games. Every new title has been seen in many different guises; it is merely the execution that is new. The Amiga unleashed new levels of sound, graphics, and computer power on the home market. Software titles utilised these capabilities in some amazing packages, but they were all just re-formulations of what had gone before.

Until Lemmings. DMA Design created a totally new concept. In computer games this is less common than rocking-horse manure. Not only was the concept of Lemmings completely new, but also the execution was exemplary, displaying the Amiga’s capabilities well.

Lemmings is indeed a shockingly original game, made all the more shocking by coming from a developer and a publisher that had heretofore given one so little reason to anticipate originality. Still, if we join some of the lemmings in digging a bit beneath the surface, we can in fact see a source for some of the ideas that went into it.

David Jones and his colleagues were fanatic devotees of Peter Molyneux’s Populous before and during their work on Lemmings. Populous, like Lemmings, demands that you control a diffuse mass of actors through somewhat indirect means, by manipulating the environment and changing the behavior of certain individuals in the mass. Indeed, Lemmings has a surprising amount in common with Populous, even as the former is a puzzle game of rodent rescue and the latter a strategy game of medieval warfare. Jones and company went so far as to add a two-player mode to Lemmings in homage to the Populous tournaments that filled many an evening spent in the clubhouse above the baby shop. (In the case of Lemmings, however, the two-player mode wouldn’t prove terribly popular, not least because it required a single Amiga equipped with two mice; unique and entertaining, it’s also largely forgotten today, having been left out of the sequels and most of the ports.)

Although it’s seldom if ever described using the name, Lemmings thus fits into a group of so-called “god games” that were coming to the fore at the turn of the decade; in addition to Populous, the other famous exemplar from the time is Will Wright’s SimCity. More broadly, it also fits into a longstanding British tradition of spatial puzzle games, as exemplified by titles like The Sentinel.

The Lemmings level editor

But one area where David Jones and company wisely departed from the model of Populous and The Sentinel was in building all of the levels in Lemmings by hand. British programmers had always had a huge fondness for procedural generation, which suited both the more limited hardware they had to work with in comparison to their American peers and the smaller teams in which they generally worked. Jones bucked that precedent by building a level editor for Lemmings as soon as he had the game engine itself working reasonably well. The gang in and around the clubhouse all spent time with the level editor, coming up with devious creations. Once a week or so, they would vote on the best of them, then upload these to Psygnosis.

Belying their reputation for favoring style over substance, Psygnosis had fallen in love with the game of Lemmings from the day of Jones’s first visit with his early demo in tow. It had won over virtually everyone who worked there, gamer and non-gamer alike. That fact became a key advantage for the work-in-progress. Everyone in Liverpool would pile on to play the latest levels as they were sent over from Dundee, faxing back feedback on which ones should make the cut and how those that did could be made even more interesting. As the game accelerated toward completion, Jones started offering a £10 bounty for every level that passed muster with Psygnosis, leading to yet more frenzied activity in both Dundee and Liverpool. Almost accidentally, DMA and Psygnosis had hit upon a way of ensuring that Lemmings would get many times the play-testing of the typical game of its era, all born of the fact that everyone was dying to play it — and dying to talk about playing it, dying to explain how the levels could be made even better. The results would show in the finished product. Without the 120 lovingly handcrafted levels that shipped with the finished game, Lemmings would have been an incredible programming feat and perhaps an enjoyable diversion, but it could never have been the sensation it became.

Just as the quality of the levels was undoubtedly increased immeasurably by the huge willing testing pool of Psygnosis staffers, their diversity was increased by having so many different personalities making them. Even today, those who were involved in making the game can immediately recognize a level’s author from its design and even from its graphical look. Gary Timmons, a DMA artist, became famed, oddly enough given his day job, for his minimalist levels that featured little but the bare essentials needed to fulfill their functions. Mike Dailly, at the other extreme, loved to make his levels look “pretty,” filling them with colors and patterns that had nothing to do with the gameplay. The quintessential examples of Dailly’s aesthetic were a few special levels which were filled with graphics from the earlier Psygnosis games Shadow of the Beast I and II, Awesome, and DMA’s own Menace.

Awesome meets Lemmings. For some reason, I find the image of all these little primary-colored cartoon lemmings blundering through these menacing teenage-cool landscapes to be about the funniest — and certainly the most subversive — thing in the entire game.

But of course the most important thing is how the levels play — and here Lemmings only rarely disappoints. There are levels which play like action games, demanding perfect clicking and good reflexes above all; there are levels which play like the most cerebral of strategy games, demanding perfect planning followed by methodical execution of the plan. There are levels where you need to shepherd a bare handful of lemmings through an obstacle course of tricks and traps without losing a single one; there are levels where you have 100 lemmings, and will need to kill 90 percent of them in order to get a few traumatized survivors to the exit. There are levels which are brutally compressed, where you have only one minute to succeed or fail; there are levels which give you fully nine minutes to guide your charges on a long journey across several screens’ worth of terrain.

One of the most remarkable aspects of Lemmings is the way it takes the time to teach you how to play it. The very first level is called “Just Dig!,” and, indeed, requires nothing more of you. As you continue through the first dozen levels or so, the game gradually introduces you to each of the verbs at your command. Throughout the levels that follow, necessity — that ultimate motivator — will force you to build upon what you already know, learning new tricks and new combinations. But it all begins gently, and the progression from rank beginner to master lemming-herder feels organic. Although the general trajectory of the difficulty is ever upward as you work your way through the levels, there are peaks and valleys along the way, such that a level that you have to struggle with for an hour or two will usually be followed by one or two less daunting challenges.

All of this has since become widely accepted as good design practice, but games in 1990 were very seldom designed like this. Looking for contemporaneous points of comparison, the best I can come up with is a game in a very different genre: the pioneering real-time dungeon-crawler Dungeon Master, which also gently teaches you how to play it interactively, without ever resorting to words, and then slowly ramps up the difficulty until it becomes very difficult indeed. Dungeon Master and Lemmings stand out in gaming history for not only inventing whole new paradigms of play, but for perfecting them in the same fell swoop. Just as it’s difficult to find a real-time dungeon crawler that’s better than Dungeon Master, you won’t find a creature-herding puzzle game that’s better than the original Lemmings without looking long and hard.

If I were to criticize anything in Lemmings, I’d have to point to the last handful of levels. For all its progressive design sensibilities, Lemmings was created in an era when games were expected to be hard, when the consensus view had it that actually beating one ought to be a monumental achievement. In that spirit, DMA pushed the engine — not to mention the player — to the ragged edge and perhaps a little beyond in the final levels. A blogger named Nadia, who took upon herself the daunting task of completing every single level in every single Lemmings game and writing about them all, described the things you need to do to beat many of these final levels as “exploiting weird junk in the game engine.” These levels are “the wrong kind of difficult,” she goes on to say, and I agree. Ah, well… at least the very last level is a solid one that manages to encompass much of what has come before, sending the game out on a fine note.

Here we see one of the problems that dog the final levels. There are 49 lemmings packed together in a tiny space. I have exactly one horizontal dig at my disposal, and need to apply it to a lemming pointing to the right so the group can make its exit, but there’s no possible way to separate one lemming from another in this jumble. So, I’m down to blind luck. If luck isn’t with me — if the lemming I wind up clicking on is pointed in the wrong direction — I have to start the level over through no fault of my own. It will then take considerable time and effort to arrive back at this point and try again. This sort of situation is sometimes called “fake difficulty” — difficulty that arises from the technical limitations of the interface or the game engine rather than purer design considerations. It is, needless to say, not ideal.

To modern ears, taking this ludic masterpiece from nothing to finished in less than a year sounds like an incredible feat. Yet that was actually a fairly long development cycle by the standards of the British games industry of 1990. Certainly Psygnosis’s marketers weren’t entirely happy about its length. Knowing they had something special on their hands, they would have preferred to release it in time for Christmas. Thankfully, better sense prevailed, keeping the game off the market until it was completely ready.

As Lemmings‘s February 1991 release date approached, Psygnosis’s marketers therefore had to content themselves with beating the hype drum for all it was worth. They sent early versions to the magazines, to enlist them in building up the buzz about the game. One by one, the magazines too fell under the thrall of the Lemmings Effect. Amiga Format would later admit that they had greeted the arrival of the first jiffy bag from Psygnosis with little excitement. “Some snubbed it at first,” they wrote, “saying that they didn’t like puzzlers, but in the end the sound of one ‘Oh, no!’ would turn even the most hardened cynic into an addict.”

When a journalist from the American magazine .info visited Liverpool, he went to a meeting where Psygnosis showed the game to their distributors for the first time. The reaction of these jaded veterans of the industry was as telling as had been the Lemmings Effect that had swept through all those disparate magazine offices. “They had to be physically torn away from the computers,” wrote .info, “and crowds of kibitzers gathered to tell the person playing how to do it.” When an eight-level demo version of the game went out on magazine cover disks a few weeks before the launch, the response from the public at large was, once again in David Jones’s words, “tremendous,” prompting the usually cautious Psygnosis — the lessons of the Imagine days still counted with Ian Hetherington — to commit to a far larger initial pressing than they had ever done for any game before.

And yet it wasn’t anywhere near large enough. When Lemmings was released on Valentine’s Day, 1991, its first-day sales were unprecedented for Psygnosis, who were, for all their carefully cultivated cool, still a small publisher in a big industry. Jones remembers Ian Hetherington phoning him up hourly to report the latest numbers from the distributors: 40,000 sold, 50,000 sold. On its very first day, the game sold out all 60,000 copies Psygnosis had pressed. To put this number in perspective, consider that DMA’s Menace had sold 20,000 copies in all, Blood Money had sold 40,000 copies, and a game which sold 60,000 copies over its lifetime on the Amiga was a huge success by Psygnosis’s usual standards. Lemmings was a success on another level entirely, transforming literally overnight the lives of everyone who had been involved in making it happen. Psygnosis would struggle for months to turn out enough copies to meet the insatiable demand.

Unleashed at last to write about the game which had taken over their offices, the gaming press fell over themselves to praise it; it may have been only February, but there was no doubt what title was destined to be game of the year. The magazine The One felt they needed five pages just to properly cover all its nuances — or, rather, to gush all over them. (“There’s only one problem with Lemmings: it’s too addictive by half. Don’t play it if you have better things to do. You won’t ever get round to doing them.”) ACE openly expressed the shock many were feeling: shock that this hugely playable game could have come out of Psygnosis. It felt as if all of the thinking about design that they could never be bothered to do in the past had now been packed into this one release.

And as went the British Amiga scene, so went Europe and then the world. The game reached American shores within weeks, and was embraced by the much smaller Amiga scene there with the same enthusiasm their European peers had evinced. North American Amiga owners had seen their favored platform, so recently the premier gaming computer on their continent as it still was in Europe, falling out of favor over the course of the previous year, with cutting-edge releases like Wing Commander appearing first on MS-DOS and only later making their way — and in less impressive versions at that — to the Amiga. Lemmings would go down in history as a somewhat melancholy milestone: as one of the last Amiga games to make American owners of other computers envious.

But then, Psygnosis had no intention of keeping a hit like this one as an Amiga exclusive for very long. Within months, an MS-DOS version was available. Amiga owners didn’t hesitate to point out its failings in comparison to the original — the controls weren’t quite right, they insisted, and the unique two-player mode had been cut out entirely — but the game’s charms were still more than intact enough. It was in MS-DOS form that Lemmings really conquered North America, thus belatedly fulfilling for Ian Hetherington, last man standing at Psygnosis from the Imagine days, the old Imagine dream of becoming a major player on the worldwide software stage. In 1992, the magnitude of their newfound success in North America led Psygnosis to open their first branch office in Boston. The people who worked there spent most of their time answering a hint line set up for the hundreds of thousands — soon, millions, especially after the game made its way to Nintendo consoles — of frazzled Americans who were hopelessly stymied by this or that level.

Britain was the country that came the closest to realizing the oft-repeated ambition of having its game developers treated like rock stars. DMA Design was well-nigh worshiped by Amiga owners in the wake of Lemmings. Here they appear in a poster inserted into a games magazine, ready to be pasted onto a teenage girl’s wall alongside her favorite boy bands. Well, perhaps a really nerdy teenage girl’s wall. Hey, it could happen…

There was talk for some time of polishing up David Jones’s in-house level editor and turning it into a Lemmings Construction Kit, but Psygnosis soon decided that they would rather sell their customers more content in the form of add-on disks than a way of making their own levels. Addicts looking for their next fix could thus get their hands on Oh, No! More Lemmings before the end of 1991, with 100 more levels on offer. This collection had largely been assembled from the cast-offs that hadn’t quite made the cut for the first game, and the reasons why weren’t hard to sense: these levels were hard, and a little too often hard in the same cheap way as some of the final levels of the original. Still, it served the purpose, delivering another huge hit. That Christmas, Psygnosis gave away Xmas Lemmings, a free demo disk with a few levels re-skinned for the holiday season. They hit upon a magical synergy in doing so; the goofy little creatures, now dressed in Santa suits, went together perfectly with Christmas, and similar disks would become a tradition for several more holiday seasons to come.

This insertion of Lemmings into the cozy family atmosphere of Christmas is emblematic of the game’s appeal to so many outside the usual hardcore-gamer demographic. Indeed, Lemmings joined Tetris during this period in presaging the post-millennial casual-game market. Perhaps even more so than Tetris, it has most of the important casual traits: it’s eminently approachable, easy to learn, bright and friendly in appearance, and thoroughly cute. Granted, it’s cute in an odd way that doesn’t reward close thinking overmuch: the lemmings are after all exploding when they pipe up with their trademark “Oh, no!,” and playing the game for any length of time entails sending thousands upon thousands of the little buggers to their deaths. And yet cute it somehow manages to be.

One important quality Lemmings shares with Tetris is the way it rewards whatever level of engagement you care to give it. The early levels can be blundered through by anyone looking for a few minutes’ diversion — stories abounded of four-year-olds managing the training levels — but the later ones will challenge absolutely anyone who tackles them. Likewise, the level-based structure means Lemmings can be used to fill a coffee break or a long weekend, as you wish. This willingness to meet players on their own terms is another of the traits of gaming’s so-called “casual revolution” to come. It’s up to you to get off the train wherever interest and dedication dictate.

But those aspects of the Lemmings story — and with them the game’s full historical importance — would be clearly seen only years in the future. In the here and now, DMA had more practical concerns. David Jones abandoned the clubhouse above the baby shop for much larger digs, hired more staff, and went back to the grindstone to deliver a full-fledged sequel. He didn’t, however, neglect to upgrade his lifestyle to match his new circumstances, including the requisite exotic sports car.

Unlike the old guard of Imagine Software, Jones could actually afford his excesses. When all the sales of all the sequels that would eventually be released are combined with those of the original, the total adds up to some 15 million games sold, making Lemmings by far the biggest gaming property ever to be born on the Amiga and the biggest to be born in Britain prior to Grand Theft Auto — a series which, because the success of Lemmings apparently hadn’t been enough for them, the nucleus of DMA Design would later be responsible for as well.

Meanwhile Psygnosis, now The House That Lemmings Built, also found larger offices, in Liverpool’s Brunswick Business Park, and began to cautiously indulge in a bit more excess of their own. Ian Hetherington was the only Imagine veteran still standing, but never mind: games on the Mersey had finally come of age thanks to a game from — of all places! — Dundee.

(Sources: the books Grand Thieves and Tomb Raiders: How British Videogames Conquered the World by Rebecca Levene and Magnus Anderson, and Sinclair and the “Sunrise” Technology: The Deconstruction of a Myth by Ian Adamson and Richard Kennedy; the 1989 episode of Computer Chronicles entitled “The Commodore Amiga”; The One of June 1989, March 1990, September 1990, and January 1991; .info of November 1990 and February 1991; Amiga Format of May 1993 and July 1995, and the annual for 1992; The Games Machine of June 1989; ACE of April 1991; Amiga World of June 1991; Amazing Computing of April 1991 and March 1992; the online articles “From Lemmings to Wipeout: How Ian Hetherington Incubated Gaming Success” from Polygon, “The Psygnosis Story: John White, Director of Software” from Edge Online, and “An Ode to the Owl: The Inside Story of Psygnosis” from Push Square; “Playing Catch Up: GTA/Lemmings’ Dave Jones” from Game Developer; and Mike Dailly’s “Complete History of DMA Design.” My thanks also go to Jason Scott for sharing his memories of working at Psygnosis’s American branch office in the early 1990s.

If you haven’t played Lemmings before, you’ve been missing out; this is one game everyone should experience. Feel free to download the original Amiga version from here. Amiga Forever is an excellent, easy-to-use emulation package for running it. For those less concerned about historical purity, there are a number of versions available to play right in your browser.)

 
 
