
The Dream of Flight

After Edison’s original phonograph came out, people said that they could not detect a difference between a phonograph and a real performance. Clearly the standard that they had for audio fidelity back in 1910 was radically different from the standard we have. They got the same enjoyment out of that Edison phonograph that we do out of [a] high-fidelity [stereo]. As audio fidelity has gotten better and better, our standards have gotten higher and higher; if we listen to a phonograph from 1910, it sounds horrible to our modern ears.

The same thing has obviously happened to flight simulators.

— Brand Fortner, 2010

It seems to me that vintage flight simulators have aged worse than just about any other genre of game. No, they weren’t the only games that required a large helping of imagination to overlook their underwhelming audiovisuals, that had sometimes to ask their players to see them as what they aspired to be rather than what they actually were. But they were perhaps the ones in which this requirement was most marked. When we look back on them today, we find ourselves shaking our heads and asking what the heck we were all thinking.

Growing up in the 1980s, I certainly wasn’t immune to the appeal of virtual flight; I spent many hours with subLogic’s Flight Simulator II and MicroProse’s Gunship on my Commodore 64, then hours more with F/A-18 Interceptor on my Commodore Amiga. Revisited today, however, all of those games strike me as absurdly, unplayably primitive. Therefore they and the many games like them have appeared in these histories only in the form of passing mentions.

The case of flight simulators thus serves to illustrate some of the natural tensions implicit in what I do here. On the one hand, I want to celebrate the games that still stand up today, maybe even get some of you to try them for the first time all these years later — and I’ve yet to find a vintage flight simulator which I can recommend on those terms. But on the other hand, I want to sketch an accurate, non-anachronistic picture of these bygone eras of gaming as they really were. In this latter sense, my efforts to date have been sadly inadequate in the case of flight simulators; the harsh fact is that these games which I’ve neglected so completely were in fact among the most popular of their time, accounting on occasion for as much as 25 percent of the computer-game industry’s total revenue. Microsoft Flight Simulator, the prototypical and perennial product of its type, was the most commercially successful single franchise in all of computer gaming between 1982 and 1995 — all despite having no goals other than the ones you set for yourself and for the most part no guns either. (Let that sink in for a moment!)

All of which is to say that a reckoning is long overdue here. This article, while it may not quite give Microsoft Flight Simulator and its siblings their due, will at least begin to redress the balance.



Many people assumed in the 1980s, as they still tend to do today, that the early microcomputer flight simulators were imperfect imitations of the bigger simulators that were used to train pilots for real-world flying. In point of fact, though, the relationship between the two was more subtle — even more symbiotic — than one might guess. To appreciate how this could be, we need to remember that the 3D-graphics techniques that were being used to power all flight simulators by the 1980s were a new technology at the time — new not just to microcomputers but to all computers. Until the 1980s, the “big” flight simulators made for training purposes were very different beasts from the ones that came later.

That said, the idea of flight simulation in general goes back a long, long way, almost all the way back to the dawn of powered flight itself. It took very little time at all after Orville and Wilbur Wright made their first flights in Kitty Hawk, North Carolina, for people to start asking how they might train new pilots in some more forgiving, less dangerous way than putting them behind the controls of a real airplane and hoping for the best. A 1910 issue of Flight magazine — the “first aero weekly in the world” — describes the “Sanders Teacher,” a mock-up of a real airplane mounted on a pivoting base so that it could sway with the wind in response to control inputs; unlike the fragile real aircraft of its era, this one was best “flown” when there was a stiff breeze.

The Sanders Teacher, one of the earliest attempts to simulate flight.

In 1929, Edwin Link of Binghamton, New York, created the Link Trainer, the first flight simulator that we might immediately recognize as such today. An electro-mechanical device driven by organ bellows in its prototype form, it looked like an amputated single-seater-airplane cockpit. The entire apparatus pitched and turned in response to a trainee’s movements of the controls therein, while an instructor sat next to the gadget to evaluate his performance. After an initially skeptical response from the market, usage of the Link Trainer around the world exploded with the various military buildups that began in the mid-1930s. It was used extensively, in both its official incarnation and in unlicensed knock-offs, by virtually every combatant nation in World War II; it was a rite of passage for tens of thousands of new pilots, marking the most widespread use of technology in the cause of simulation to that point in the history of the world.

An American student pilot in a Link Trainer, circa 1943.

The programmable digital computers which began to appear after the war held out the prospect of providing a more complete simulation of all aspects of flight than analog devices like the Link Trainer and its successors could hope to achieve. Already in 1950, the United States Navy funded a research effort in that direction at the University of Pennsylvania. Yet it wasn’t until ten years later that the first computerized flight simulators began to appear. Once again, Link Aviation Devices provided the breakthrough machine here, in the form of the Link Mark 1, whose three processors shared 10 K of memory to present the most credible imitation of real flight yet, with even wind and engine noise included if you bought the most advanced model. By 1970, virtually all flight simulators had gone digital.

But there was a persistent problem afflicting all of these efforts at flight simulation, even after the dawn of the digital age. Although the movements of cockpit instruments and even the physical motion of the aircraft itself could be practically implemented, the view out the window could not. What these machines thus wound up simulating was a totally blind form of flying, as in the heaviest of fogs or the darkest of nights, when the pilot has only her instruments to guide her. Flying by instruments was certainly a useful skill to have, but the inability of the simulators to portray even a ground and horizon for the pilot to sight on was a persistent source of frustration to those who dreamed of simulating flight as it more typically occurred in the real world.

Various schemes were devised to remedy the situation, some using reels of film that were projected on the “windows” of the cockpit, some even using a moving video camera which “flew” over model terrain. But snippets of static video are a crude tool indeed in an interactive context, and none of these solutions yielded anything close to the visual impression of real flight. What was needed was an out-the-window view that was generated on the fly in real time by the computer.

In 1973, McDonnell-Douglas introduced the VITAL II, a computerized visual display which could be added to existing flight simulators. Its technology, however, was still fundamentally different from that of the flight simulators that would appear later. The computers which ran the latter would use what’s known as raster-based or bitmap graphics: a grid of pixels stored in memory, which are painted to the monitor screen by the computer’s display circuitry without additional programming. VITAL II, by contrast, used something known as vector graphics, in which the computer’s processor directly controls the electron gun inside the display screen, telling it where to go and when to fire to produce an image on the screen. Although bitmap graphics are far easier for the programmer to work with and more flexible in countless ways, they do eat up memory, a commodity which most computers of the early 1970s had precious little of to spare. Therefore vector graphics were still being used for many applications, including this one.

Thanks to the limitations of its hardware, the VITAL II could only show white points of light on the surface of a black screen, and thus could only be used to depict a night flight. Indeed, it showed only lights — the lights of runways, airports, and to some extent their surrounding cities.
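For those who find a scrap of code clearer than prose, here is a minimal sketch of the raster-versus-vector distinction in modern C. It is purely my own illustration; the names, dimensions, and coordinates are invented, and nothing in it is drawn from VITAL II or any other period hardware.

```c
/* Illustrative only: a toy contrast between the two display models.
 * All names and numbers are made up for the example. */

#include <stdint.h>

/* Raster/bitmap model: the whole image lives in memory as a grid of
 * pixels, and the display circuitry scans that buffer out to the screen
 * on its own. The program just writes pixels, but the buffer itself
 * costs real memory (here 64,000 bytes). */
#define WIDTH  320
#define HEIGHT 200
static uint8_t framebuffer[HEIGHT][WIDTH];

static void plot_pixel(int x, int y, uint8_t color)
{
    framebuffer[y][x] = color;   /* the hardware paints it from here on */
}

/* Vector model: there is no pixel grid at all, only a short list of
 * strokes that the processor replays every refresh, steering the
 * electron gun directly. Memory use is tiny, which is why the approach
 * survived into the early 1970s, but the screen can show only lines
 * and points. */
typedef struct { int x0, y0, x1, y1; } beam_stroke;

static const beam_stroke display_list[] = {
    { 100, 180, 110, 120 },   /* one string of runway lights */
    { 140, 180, 130, 120 },   /* the opposite string         */
};
```

The trade-off in the comments is the whole story: the bitmap buys flexibility at the cost of memory, while the display list buys memory at the cost of being able to draw little more than points and strokes.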


Such was the state of the art in flight simulation during the mid-1970s, when a young man named Bruce Artwick was attending the University of Illinois in Champaign.



Flight simulators aside, this university occupies an important place in the history of computing in that it was the home of PLATO, the pioneering computer network that anticipated much of the digital culture that would come two decades or more after it. A huge variety of games were developed for PLATO, including the first CRPGs and, most importantly for our purposes today, the first flight simulator to be intended for entertainment and casual exploration rather than professional pilot training. Brand Fortner’s game of Airfight wasn’t quite a real-time simulation as we think of it today — you had to hit the NEXT key over and over to update the screen — but it could almost feel like it ran in real time to those willing and able to pound their keyboards with sufficient gusto. Brian Dear described the experience in his book about the PLATO system:

By today’s standards, Airfight’s graphics and realism, like every other PLATO game, are hopelessly primitive. But in the 1970s Airfight was simply unbelievable. These rooms full of PLATO terminals weren’t “PLATO classrooms,” they were PLATO arcades, and they were free. If you were lucky enough to get in (there were always more people wanting to play than the game could handle), you joined the Circle or the Triangle teams, chose from a list of different airplane types to fly, and suddenly found yourself in a fighter plane, looking out of the cockpit window at the runway in front of you, with the control tower far down the runway… You’d hit “9” to set the throttle at maximum, “a” for afterburners, “w” a few times to pull the stick back, and then NEXT NEXT NEXT NEXT NEXT NEXT NEXT to update the screen as you rolled down the runway, lifted off, and shot up into the sky to join the fight. It might be seconds or minutes, depending on how far away the enemy airplanes were, before you saw dots in the sky, dots that as you flew closer and closer turned into little circles and triangles. (So they weren’t photorealistic airplanes — it didn’t matter. You didn’t notice. This was battle. This was Airfight.) As you got closer and closer to one of these planes, the circles and triangles got more defined — still small, still pathetically primitive by today’s standards — but you knew you were getting closer and that’s all that mattered. As you got closer and closer you hit “s” to put up your sights, to aim. Eventually, if you were good, lucky, or both, you would be so close that you’d see a little empty space, an opening, inside the little circle or triangle icon. That’s when you were close enough to see what players called “the whites of their eyes” and that’s when you let ’em have it: SHIFT-S to shoot. SHIFT-S again. And again. Until you’d run out of ammo and KABOOM! It was glorious.

And it was addictive. People stayed up all night playing Airfight. If you went to a room full of PLATO terminals, you’d hear the clack-clack-clack-clack-clack-CLACKETY-CLACK-CLACK-BAM-BAM!-WHAM!-CLACK-CLACK! of everyone’s keyboards, as the gamers pounded them, mostly NEXT-NEXT-NEXT’ing to update their view and their radar displays (another innovation of this game — in-cockpit radar displays, showing you where the enemy was).

The standard PLATO terminal at that time was an astonishingly advanced piece of hardware to place at the disposal of everyday university students: a monochrome bitmap display of no less than 512 × 512 pixels. Thus Airfight, in addition to being the first casual flight simulator, was the first flight simulator of any kind to use a bitmap display. This fact wasn’t lost on Bruce Artwick when he first saw the game in action — for Artwick already knew a little something about the state of the art in serious flight simulation.

The University of Illinois’s Institute of Aviation was one of the premier aerospace programs in the country, training both engineers and pilots. Artwick happened to be pursuing a master’s degree in general electrical engineering, but he roomed with one of the university’s so-called “aviation jocks”: an accomplished pilot named Stu Moment, who was training to become a flight instructor at the same time that he pursued a degree in business. “We agreed that Stu would teach me to fly if I taught him about digital electronics,” Artwick remembers. Although Artwick’s electrical-engineering program would seemingly mark him as a designer of hardware, the technological disciplines were more fluid in the 1970s than they’ve become today. His real passion, indulged willingly enough by his professors, had turned out to be the nascent field of bitmap 3D graphics. So, he found himself with one foot in the world of 3D programming, the other in that of aviation: the perfect résumé for a maker of flight simulators.

Airfight hit Artwick like a revelation. In a flash, he understood that the PLATO terminal could become the display technology behind a flight simulator used for more serious purposes. He sought and received funding from the Office of Naval Research to make a prototype 3D display useful for that purpose as his master’s thesis. Taking advantage of his knowledge of hardware engineering, he managed to connect a PLATO terminal to one of the DEC PDP-11 minicomputers used at the Aviation Institute. He then employed this setup to create what his final thesis called “a versatile computer-generated flight display,” submitting his code and a 60-page description of its workings to his instructors and to the Office of Naval Research.

It’s hard to say whether Artwick’s thesis, which he completed in May of 1976, was remarked upon at all by the makers of the flight simulators already in use for pilot training. Many technical experiments like it came out of the aerospace-industrial complex’s web of affiliated institutions, sometimes to languish in obscurity, sometimes to provide a good idea or two for others to carry forward, but seldom to be given much credit after the fact. We can say, however, that by the end of the 1970s the shift to bitmap graphics was finally beginning among makers of serious flight simulators. And once begun, it happened with amazing speed; by the mid-1980s, quite impressive out-the-cockpit views, depicting nighttime or daytime scenery in full color, had become the norm, making the likes of the VITAL II system look like the most primordial of dinosaurs.

This photo from a 1986 brochure by a flight-simulator maker known as Rediffusion Simulation shows how far the technology progressed in a remarkably short period of time after bitmap 3D graphics were first introduced on the big simulators. Although the graphical resolution and detail are vastly less than one would find in a simulator of today, the Rubicon has already been crossed. From now on, improvements will be a question of degree rather than kind.

Meanwhile the same technology was coming home as well, looking a bit less impressive than the state-of-the-art simulators in military and civilian flight schools but a heck of a lot better than VITAL II. And Artwick’s early work on that PLATO terminal most definitely was a pivotal building block toward these simulators, given that the most important person behind them was none other than Artwick himself.



After university, Artwick parlayed his thesis into a job with Hughes Aircraft in California, but found it difficult to develop his innovations further within such a large corporate bureaucracy. His now-former roommate Stu Moment started working as a flight instructor right there in Champaign, only to find that equally unsatisfying. In early 1977, the two decided to form a software company to serve the new breed of hobbyist-oriented microcomputers. It was an auspicious moment to be doing so; the Trinity of 1977 — the Apple II, Radio Shack TRS-80, and Commodore PET, constituting the first three pre-assembled personal computers — was on the near horizon, poised to democratize the hobby for those who weren’t overly proficient with a soldering iron. Artwick and Moment named their company subLogic, after a type of computer circuit. It would prove a typical tech-startup partnership in many ways: the reserved, retiring Artwick would be the visionary and the technician, while the more flamboyant, outgoing Moment would be the manager and the salesman.

Artwick and Moment didn’t initially conceive of their company as a specialist in flight simulators; they rather imagined their specialty to be 3D graphics in all of their potential applications. Accordingly, their first product was “The subLogic Three-Dimensional Micrographics Package,” a set of libraries to help one code one’s own 3D graphics in the do-it-yourself spirit of the age. Similar technical tools continued to occupy them for the first couple of years, even as both partners continued to work their day jobs, hoping that grander things might await them in the future, once the market for personal computers had had time to mature a bit more.

In June of 1979, they decided that the moment had come. Artwick quit his job at Hughes and joined Moment back in Champaign, where he started to work on subLogic’s first piece of real consumer software. Every time he had attempted to tell neophytes in the past about what it was his little company really did, he had been greeted with the same blank stare and the same stated or implied question: “But what can you really do with all this 3D-graphics stuff?” And he had learned that one response in particular on his part could almost always make his interlocutors’ eyes light up with excitement: “Well, you could use it to make a flight simulator, for instance.” So, subLogic would indeed make a flight simulator for the new microcomputers. Being owned and operated by two pilots — one of them a licensed flight instructor and the other one having considerable experience in coding for flight simulators running on bigger computers — subLogic was certainly as qualified as anyone for the task.

They released a product entitled simply Flight Simulator for the Apple II in January of 1980. One can’t help but draw comparisons with Will Crowther and Don Woods’s game of Adventure at this point; like it, Flight Simulator was not only the first of its kind but would lend its name to the entire genre of games that followed in its footsteps.

Fearing that his rudimentary, goal-less simulation would quickly bore its users, Artwick at the last minute added a mode called “British Ace,” which featured guns and enemy aircraft to be shot down in an experience distinctly reminiscent of Airfight. But he soon discovered, rather to his surprise, that most people didn’t find those additional accoutrements to be the most exciting aspect of the program. They enjoyed simply flying around this tiny virtual world with its single runway and bridge and mountain — enjoyed it despite all the compromises that a host machine with six-color graphics, 32 K of memory, and a 1 MHz 8-bit CPU demanded. It turned out that a substantial portion of early microcomputer owners were to a greater or lesser degree frustrated pilots, kept from taking to the air by the costs and all of the other logistics involved with acquiring a pilot’s license and getting time behind the controls of a real airplane. They were so eager to believe in what Flight Simulator purported to be that their imaginations were able to bridge the Grand Canyon-sized gap between aspiration and reality. This would continue to be the case over the course of the many years it would take for the former to catch up to the latter.

Flight Simulator on the Apple II.

Still, subLogic didn’t immediately go all-in for flight simulation. They released a variety of other entertainment products, from strategy games to arcade games. They even managed one big hit in the latter category, one that for a time outsold all versions of Flight Simulator: Bruce Artwick’s Night Mission Pinball was a sensation in Apple II circles upon its release in the spring of 1982, widely acknowledged as the best game of its type prior to Bill Budge’s landmark Pinball Construction Set the following year. subLogic wouldn’t release their last non-flight-simulator title until 1986, when an attempt to get a sports line off the ground fizzled out with subLogic Football. In the long run, though, it was indeed flight simulation that would make subLogic one of the most profitable companies in their industry, all thanks to a little software publisher known as Microsoft.

In late 1981, Microsoft came to subLogic looking to make a deal. IBM had outsourced the new IBM PC’s operating system to Microsoft, whilst also charging them with creating or acquiring a variety of other software for the machine, including games. So, Microsoft wanted Artwick to create a “second generation” of his Flight Simulator for the IBM PC, taking full advantage of its comparatively torrid 4.77 MHz 16-bit processor.

Artwick spent a year on the project, working sixteen hours or more per day during the last quarter of that period. The program he turned in at the end of the year was both a dramatic improvement on what had come before and a remarkably complete simulation of flight for its era. Its implementation of aeronautics had now progressed to the point that a specific airplane could be said to be modeled: a Cessna 182 Skylane, a beloved staple of private and recreational aviation that was first manufactured in 1956 and has remained in production to this day. Artwick replaced the wire-frame graphics of the Apple II version with solid-filled color, replaced its single airport with more than twenty of them from the metropolitan areas of New York, Chicago, Seattle, and Los Angeles. He added weather, as well as everything you needed to fly through the thickest fog or darkest night using instruments alone; you could tune radio navigation beacons to find your way from airport to airport. You could even expect to contend with random engine failures if you were brave enough to turn that setting on. And, in a move that would have important implications in the future, Artwick also designed and implemented a coordinate system capable of encompassing the greater portion of North America, from southern Canada down to the Caribbean, although it was all just empty green space at this point outside of the four metropolitan areas.

Microsoft Flight Simulator 1.0

This first Microsoft Flight Simulator was released in late 1982, and promptly became ubiquitous on a computer that was otherwise not known as much of a game machine. Many stodgy business-oriented users who wouldn’t be caught dead playing any other game seemed to believe that this one was acceptable; it was something to do with the label of “simulator,” something to do with its stately, cerebral personality. Microsoft’s own brief spasm of interest in games in general would soon peter out, such that Flight Simulator would spend half a decade or more as the only game in their entire product catalog. Yet it would consistently sell in such numbers that they would never dream of dropping it.

When the first wave of PC clones hit the market soon after Flight Simulator was released, the computer magazines took to using it as a compatibility litmus test. After all, it pushed the CPU to its absolute limit, even as its code was full of tricky optimizations that took advantage of seemingly every single quirk of IBM’s own Color Graphics Adapter. Therefore, went the logic, if a PC clone could run Flight Simulator, it could probably run just about anything written for a real IBM PC. Soon all of the clone makers were rushing to buy copies of the game, to make sure their machines could pass the stringent test it represented before they shipped them out to reviewers.

Meanwhile Artwick began to port Microsoft Flight Simulator’s innovations into versions for most other popular computers, under the rather confusing title of Flight Simulator II. (There had never been a subLogic Flight Simulator I on most of the computers for which this Flight Simulator II was released.) Evincing at least a modest spirit of vive la différence, these versions chose to simulate a Piper Cherokee, another private-aviation mainstay, instead of the Cessna.

Although the inexpensive 8-bit computers for which Flight Simulator II was first released were far better suited than the IBM PC for running many types of games, this particular game was not among them. Consider the case of the Commodore 64, the heart of the mid-1980s computer-gaming market. The 64’s graphics system had been designed with 2D arcade games in mind, not 3D flight simulators; its sprites — small images that could be overlaid onto the screen and moved about quickly — were perfect for representing, say, Pac-Man in his maze, but useless in the context of a flight simulator. At the same time, the differences between an IBM PC and a Commodore 64 in overall processing throughput made themselves painfully evident. On the IBM, Flight Simulator could usually manage a relatively acceptable ten frames or so per second; on the 64, you were lucky to get two or three. “We gave it a try and did the best we could,” was Artwick’s own less-than-promising assessment of the 8-bit ports.

Nevertheless, the Commodore 64 version of Flight Simulator II is the one that I spent many hours earnestly attempting to play as a boy. Doing so entailed peering at a landscape of garish green under a sky of solid blue, struggling to derive meaning from a few jagged white lines that shifted and rearranged themselves with agonizing slowness, each frame giving way to the next with a skip and a jerk. Does that line there represent the runway I’m looking for, or is it a part of one of the handful of other landmarks the game has deigned to implement, such as the Empire State Building? It was damnably hard to know.

Flight Simulator II on the Commodore 64.

As many a real pilot who tried Flight Simulator II noted, a virtual Piper Cherokee was perversely more difficult to fly than the real thing, thanks to the lack of perspective provided by the crude graphics, the clunky keyboard-based controls — it was possible to use a joystick, but wasn’t really recommended because of the imprecision of the instrument — and the extreme degree of lag that came with trying to cram so much physics modeling through the narrow aperture of an 8-bit microprocessor. Let’s say you’re attempting a landing. You hit a key to move the elevators a little bit and begin your glide path, but nothing happens for several long seconds. So, getting nervous as you see the white line that you think probably represents the runway getting a bit longer, you hit the same key again, then perhaps once more for good measure. And suddenly you’re in a power dive, your view out the screen a uniform block of green. So you desperately pound the up-elevator key and cut the throttle — and ten or twenty seconds later, you find the sky filling your screen, your plane on the verge of stalling and crashing to earth tail-first. More frantic course corrections ensue. And so it continues, with you swaying and bobbling through the sky like a drunken sailor transported to the new ocean of the heavens. Who needed enemy airplanes to shoot at in the face of all these challenges? Just getting your plane up and then down again in one piece — thankfully, the simulator didn’t really care at the end of the day whether it was on a runway or not! — was an epic achievement.

Needless to say, Flight Simulator II’s appeal is utterly lost on me today. And yet in its day the sheer will to believe, from me and hundreds of thousands of other would-be pilots like me, allowed it to soar comfortably over all of the objections raised by its practical implementation of our grand dream of flight.

At a time when books on computer games had yet to find a place on the shelves of bookstores, books on Flight Simulator became the great exception. It began in 1985, when a fellow named Charles Gulick published 40 Great Flight Simulator Adventures, a collection of setups with exciting-sounding titles — “Low Pass on the Pacific,” “Dead-Stick off San Clemente” — that required one-tenth Flight Simulator and nine-tenths committed imagination to live up to their names. Gulick became the king of the literary sub-genre he had founded, writing five more books of a similar ilk over the following years. But he was far from alone: the website Flight Sim Books has collected no fewer than twenty of its namesake books, all published between the mid-1980s and the mid-1990s, ranging from the hardcore likes of Realistic Commercial Flying with Flight Simulator to more whimsical fare like A Flight Simulator Odyssey. The fact that publishers kept churning them out indicates that there was a solid market for them, which in turn points to just how committed to the dream the community of virtual fliers really was.

Of course, the game that called itself simply Flight Simulator was by no means the only one in the genre it had spawned. While a few companies did try to sell their own civilian flight simulators, none managed to seriously challenge the ones from subLogic. But military flight simulators were a different matter; MicroProse Software in particular made their reputation with a string of these. Often designed and programmed by Sid Meier, MicroProse’s simulators were distinguished by their willingness to sacrifice a fair amount of realism to the cause of decent frame rates and general playability, with the added attraction of enemy aircraft to shoot down and cities to bomb. (While the old “British Ace” mode did remain a part of the subLogic Flight Simulator into the late 1980s, it never felt like more than the afterthought it was.) Meier’s F-15 Strike Eagle, the most successful of all the MicroProse simulators, sold almost as well as subLogic’s products for a time; some sources claim that its total sales during the ten years after its initial release in 1984 reached 1 million units.

subLogic itself did dip a toe into military flight simulation with Jet in 1985. Programmed by one Charles Guy rather than Bruce Artwick, this F-16 and F/A-18 simulator was a bit more relaxed and a bit more traditionally game-like than the flagship product, offering air-, land-, and sea-based targets for your guns and bombs that could and did shoot back. Still, its presentation remained distinctly dry in comparison to the more gung-ho personality of the MicroProse simulators. Although reasonably successful, it never had the impact of its older civilian sibling. Instead Spectrum Holobyte’s Falcon, which debuted in 1987 for 16-bit and better machines only, took up the banner of realism-above-all-else in the realm of jet fighters — almost notoriously so, in fact: it came with a small-print spiral-bound manual of almost 300 pages, and required weeks of dedication just to learn to fly reasonably well, much less to fly into battle. And yet it too sold in the hundreds of thousands.

In the meantime, Artwick was continuing to plug steadily away, making his Flight Simulator slowly better. A version 2.0 of the Microsoft release, with four times as many airports to fly from and many other improvements, appeared already in 1984, soon after the 8-bit Flight Simulator II; it was then ported to the new Apple Macintosh, the only computing platform besides their own which Microsoft had chosen to officially support. When the Atari ST and Commodore Amiga appeared in 1985, sporting unprecedented audiovisual capabilities, subLogic released versions of Flight Simulator II for those machines with dazzling graphical improvements; these versions even gave you the option of flying a sleek Learjet instead of a humble single-prop airborne econobox. Version 3.0 of Microsoft Flight Simulator arrived in 1988, coming complete with the Learjet, support for the latest VGA graphics cards, and an in-game flight instructor among other enhancements.

Microsoft Flight Simulator 3.0 included the first attempt at in-program flight instruction. It would continue to appear in all subsequent releases, being slowly refined all the while, much like the simulator itself.

Betwixt and between these major releases, subLogic took advantage of Artwick’s foresight in designing a huge potential world into Flight Simulator by releasing a series of “scenery disks” to fill in all of that empty space with accurately modeled natural features and airports, along with selected other human-made landmarks. The sufficiently dedicated — i.e., those who were willing to purchase a dozen scenery disks at $20 or $30 a pop — could eventually fly all over the continental United States and beyond, exploring a virtual world larger than any other in existence at the time.

Indeed, the scenery disks added a whole new layer of interest to Flight Simulator. Taking in their sights and ferreting out all of their secrets became a game in itself, overlaid upon that of flying the airplane. It could add a much-needed sense of purpose to one’s airborne ramblings; inevitably, the books embraced this aspect with gusto, becoming in effect tour guides to the scenery disks. When subLogic made a scenery disk for Hawaii in 1989, they even saw fit to include “the very first structured scenery adventure”:

Locating the hidden jewel of the goddess Pele isn’t easy. You’ll have to find and follow an intricate set of clues scattered about the islands that, with luck, will guide you to your goal. This treasure hunt will challenge all of your flying skills, but the reward is an experience you’ll never forget!



The sales racked up by all of these products are impossible to calculate precisely, but we can say with surety that they were impressive. An interview with Artwick in the July 1985 issue of Computer Entertainment magazine states that Flight Simulator in all its versions had already surpassed 800,000 copies sold by that point. The other piece of hard data I’ve been able to dig up is found in a Microsoft press release from December of 1995, which states that Microsoft Flight Simulator alone had sold over 3 million copies by then. Added to those figures must be the sales of Flight Simulator II for various platforms, which must surely have been in the hundreds of thousands in their own right. Jet too did reasonably well, while all of those scenery disks sold well enough that subLogic completed the planned dozen and then went still further, making special disks for Western Europe, Japan, and the aforementioned Hawaii, along with an ultra-detailed one covering San Francisco alone.

When we start with all this, and then add in the fact that subLogic remained a consistently small operation with just a handful of employees, we wind up with two founders who did very well for themselves indeed. Unsurprisingly, then, Bruce Artwick and Stu Moment, those two college friends made good, were a popular subject for magazine profiles. They were a dashing pair of young entrepreneurs, with the full complement of bachelor toys at their disposal, including a Cessna company plane which they flew to trade shows and, so they claimed, used to do modeling for their simulations. When David Hunter from the Apple II magazine Softalk visited them for a profile in January of 1983, he went so far as to compare them to Butch Cassidy and the Sundance Kid. (Sadly, he didn’t clarify which was which…)

Speed is exhilarating. Uncontrolled growth is intoxicating. As long as youth can dream, life will never move fast enough.

Whether it’s motorcycles, cars, planes, skiing, volleyball, or assembly language, Bruce Artwick likes speed. He likes Winchester disk drives, BMWs, zooming through undergraduate and graduate school in four years, and tearing down the Angeles Crest Highway on a Suzuki at a dangerous clip. The president of subLogic, Artwick is a tall, quiet, 29-year-old bachelor. He possesses a remarkable mind, which has created several of the finest programs ever to grace the Apple’s RAM.

Contrast Artwick with Stu Moment. Outgoing, of medium height, and possessed of an exceptional love of flying, Moment is subLogic’s chairman of the board. A businessman, Moment has steered the company to its present course, complementing Artwick’s superior software-engineering talents with organizational and financial skills. He’s even picked up some modest programming skills, designing a system for logging flight hours at a fair-sized flying institute.

Redford and Newman. Lewis and Clark. Laurel and Hardy. Jobs and Wozniak. Artwick and Moment. The grand adventurers riding the hard trail, living and playing at lives larger than life. It’s an old story.

Stu Moment and Bruce Artwick with their Cessna on a cold morning for flying, 1982.

When the journalists weren’t around, however, the dynamic duo’s relationship was more fractious than the public realized. Artwick wanted only to pursue the extremely profitable niche which subLogic had carved out for themselves, while Moment’s natural impulse was to expand into other areas of gaming. Most of all, though, it was likely just a case of two headstrong personalities in too close a proximity to one another, with far too much money flying through the air around them. That, alas, is also an old story.

As early as 1981, the two spent some time working out of separate offices, so badly were they treading on one another’s toes when they shared just one. In 1984, Artwick, clearly planning for a potential future without Moment, formed his own Bruce Artwick Organization and started providing his software to subLogic, which was now solely Moment’s company, on a contract basis.

The final split didn’t happen until 1989, but when it did, it was ugly. Lawsuits flew back and forth, disputing what code and other intellectual property belonged to subLogic and what belonged to Artwick’s organization. To this day, each man prefers not to say the other’s name if he can avoid it.

This breakup marked the end of the Flight Simulator II product line — which was perhaps just as well, as the platforms on which it ran were soon to run out of rope anyway in North America. Moment tried to keep subLogic going with 1990’s Flight Assignment: Airline Transport Pilot, a simulation of big commercial aircraft, but it didn’t do well. He then mothballed the company for several years, only to try again to revive it by hiring a team to make an easier flight simulator for beginners. He sold both the company and the product to Sierra in November of 1995, and Flight Light Plus shipped three months later. It too was a failure, and the subLogic name disappeared thereafter.

It was Artwick who walked away from the breakup with the real prize, in the form of the ongoing contract with Microsoft. So, Microsoft Flight Simulator continued its evolution under his steady hand. Version 4.0 shipped in 1989, version 5.0 in 1993. Artwick himself names the latter as the entire series’s watershed moment; running on a fast computer equipped with one of the latest high-resolution Super-VGA graphics cards, it finally provided the sort of experience he’d been dreaming of when he’d written his master’s thesis on the use of bitmap 3D graphics in flight simulation all those years before. Any further audiovisual improvements from here on out were just gravy as far as he was concerned.

Flying above San Francisco in Microsoft Flight Simulator 5.0.



Such a watershed strikes me as a good place to stop today. Having so belatedly broken my silence on the subject, I’ll try to do a better job now of keeping tabs on Flight Simulator as it goes on to become the longest-lived single franchise in the history of computer gaming. (As of this writing, a new version has just been released, spanning ten dual-layer DVDs in its physical-media version, some 85 GB of data — a marked contrast indeed to that first cassette-based Flight Simulator for the 16 K TRS-80.) Before I leave you today, though, we should perhaps take one more moment to appreciate the achievements of those 1980s versions.

It’s abundantly true that they’re not anything you’re likely to want to play today; time most definitely hasn’t been kind to them. In their day, though, they had a purity, even a nobility to them that we shouldn’t allow the passage of time to erase. They gave anyone who had ever looked up at an airplane passing overhead and dreamed of being behind its controls a way to live that dream, in however imperfect a way. Although it billed itself as a hardcore simulation, Flight Simulator was in reality as much an exercise in fantasy as any other game. It let kids like me soar into the heavens as someone else, someone leading a very different sort of life. Yes, its success was a tribute to its maker Bruce Artwick, but it was also, I would argue, a tribute to everyone who persevered with it in the face of a million reasons just to give up. The people who flew Flight Simulator religiously, who bought the books and worked through a pre-flight checklist before taking off each time and somehow managed to convince themselves that the crude pixelated screen in front of them actually showed a beautiful heavenly panorama, did so out of love of the idea of flight. For them, the blue-and-green world of Flight Simulator was a wonderland of Possibility. Far be it from me to look askance upon them from my perch in their future.

(Sources: the books The Friendly Orange Glow: The Untold Story of the Rise of Cyberculture by Brian Dear and Taking Flight: History, Fundamentals, and Applications of Flight Simulation by Christopher D. Watkins and Stephen R. Marenka; Flight of December 10 1910 and March 22 1913; Softalk of January 1983; Kilobaud of October 1977; Softalk IBM of February 1983; Data Processing of April 1968; Compute!’s Gazette of January 1985; Computer Gaming World of April 1987 and September 1990; Computer Entertainment of July 1985; PC Magazine of January 1983; Illinois CS Alumni News of spring 1996; the article “High-Power Graphic Computers for Visual Simulation: A Real-Time Rendering Revolution” by Mary K. Kaiser, presented to the 1996 symposium Supercomputer Applications in Psychology; Bruce Artwick’s master’s thesis “A Versatile Computer-Generated Dynamic Flight Display”; flight-simulator product brochures from Link and Rediffusion; documents from the Sierra archive housed at The Strong Museum of Play in Rochester, New York; a brochure from an exhibition on the Link Trainer at the Roberson Museum and Science Center in 2000. Online sources include a VITAL II product-demonstration video; an interview with Bruce Artwick by Robert Scoble; a panel discussion from the celebration of PLATO’s 50th anniversary at the Computer History Museum; “A Brief History of Aircraft Flight Simulation” by Kevin Moore; the books hosted at Flight Sim Books. My guiding light through this article has been Josef Havlik’s “History of Microsoft Flight Simulator.” He did at least half of the research so that I didn’t have to…)

 
 


Games on the Net Before the Web, Part 2: MUD

You are stood on a narrow road between The Land and whence you came. To the north and south are the small foothills of a pair of majestic mountains, with a large wall running round. To the west the road continues, where in the distance you can see a thatched cottage opposite an ancient cemetery. The way out is to the east, where a shroud of mist covers the secret path by which you entered The Land.

— the first text players saw upon entering the University of Essex MUD 


During the time when the original PDP-10 Zork was the hottest game in institutional computing, a fair number of overseas visitors managed to access the machine that ran it at MIT’s Dynamic Modeling Group. One day in 1980, one such visitor, from the University of Essex in Britain, left a strange message on the Zork mailing list: “You haven’t lived ’til you’ve died in MUD.”

When the folks inside MIT investigated, they learned that the spirit of hacker one-upmanship which had caused them to beget Zork as a response to Crowther and Woods’s Adventure had finally come back around to bite them. Zork had perfected the Adventure formula at the same time that it exploded it, applying a much more advanced parser and a much more detailed and coherent world model to a game that was several times as big. Now, MUD — the Multi-User Dungeon — was taking the next step, applying the innovations of its predecessors to the world’s first shared persistent virtual world. The creators of MUD had first encountered Zork in the form of an unauthorized port which bore the name of Dungeon; thus the name of Multi-User Dungeon made the challenge to the existing state of the art in text adventures all too plain.

Later in 1980, Dave Lebling of the Dynamic Modeling Group penned an article for Byte magazine about Zork which, not coincidentally, included this musing about possible future directions:

Another area where experimentation is going on is that of multiplayer CFS [computerized fantasy simulation] games. Each player (possibly not even aware how many others are playing) would see only his own view of the territory. He would be notified when other players enter or leave the room, and could talk to them.

This would not, however, be the road which Lebling and his colleagues would ultimately choose to go down. Instead they would focus their energies on crafting the most polished, compelling single-player text adventures they possibly could, forming the company known as Infocom to publish them on the microcomputers of the time — a story regular readers of this blog already know well.

And yet the idea of the multi-player text adventure — and with it the idea of a text adventure that was a persistent world to be visited again and again rather than a single game to be solved and set aside — wasn’t going to go away either. On the contrary: a direct line can be traced from Adventure and Zork through MUD and its many descendants, and on to the functioning virtual societies, with populations in some cases bigger than many small countries here on our real planet, that are the persistent online worlds of today.


As has been the case for more seminal developments in computing history than some might like to admit, MUD was spawned less by a grand theoretical vision than by a technical quirk in the computer which ran it. In fact, it was the very same quirk as that which would lead Sandy Trevor over at CompuServe in the United States to create the equally seminal CB Simulator, the world’s first real-time chat program.

The big DEC PDP-10 computer, a staple of institutional computing and with it of hacker culture during the 1970s, might host dozens of simultaneous users, many of whom might be running the same program. It would be absurdly wasteful of precious memory to copy this program’s code over and over into each individual user’s private memory space. Therefore each user was given access to two pools of memory. One, used for program code, was shared among all users on the system — so that if, say, many of them were running the same text editor then the code for that text editor would have to exist in the computer’s memory in only one place. The other area was reserved for the unique data — in this example, the actual file being edited — that each user was working with; this private space could only be accessed by her. In itself, none of this constituted a quirk; it was just good system design, and as such is still used by most computers today.

Roy Trubshaw

But something that was a little quirky about the PDP-10 was noticed by a University of Essex student named Roy Trubshaw in 1977: the fact that nothing absolutely demanded that only static program code, as opposed to dynamic data of other sorts, reside in the shared memory space. With a bit of trickery, the PDP-10 could be convinced to use the shared space to facilitate real-time communications between users who would otherwise be isolated inside their own bubbles. Trubshaw’s first experiments along these lines were as basic as could be: he wrote a couple of programs to cause one user’s terminal to echo text being typed into another’s. “It might seem odd to someone who wasn’t there,” he remembers, “but the feeling of achievement when the line of text on one teletype appeared as typed on the second teletype was just awesome.” From such experiments would eventually spring MUD — for, Trubshaw would realize, there was nothing preventing the shared memory space from containing an entire virtual world rather than just lines of typed text.
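Trubshaw’s trick has a rough modern analogue in POSIX shared memory. The sketch below is mine, not anything resembling his PDP-10 code, and every name in it is invented; it simply shows two otherwise-isolated processes exchanging a line of text through a region of memory they both map, which is the essence of what so delighted him.

```c
/* Illustrative only: a modern Unix analogue of shared-memory messaging.
 * Nothing here is PDP-10 code; all names are made up for the example. */

#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

#define SHM_NAME "/shared_line_demo"
#define SHM_SIZE 4096

int main(int argc, char **argv)
{
    /* Both processes open the same named region; the operating system
     * maps a single copy of it into each one's address space, much as
     * the PDP-10 mapped one copy of shared code into every user's space. */
    int fd = shm_open(SHM_NAME, O_CREAT | O_RDWR, 0600);
    if (fd < 0) { perror("shm_open"); return 1; }
    if (ftruncate(fd, SHM_SIZE) < 0) { perror("ftruncate"); return 1; }

    char *line = mmap(NULL, SHM_SIZE, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, 0);
    if (line == MAP_FAILED) { perror("mmap"); return 1; }

    if (argc > 1) {
        /* The "typist": whatever this process writes into the region... */
        strncpy(line, argv[1], SHM_SIZE - 1);
    } else {
        /* ..."appears as typed" to any other process that reads it. */
        printf("the other user typed: %s\n", line);
    }
    return 0;
}
```

Run it once with an argument to play the typist, then again without one from a second terminal to read the message back. Swap that single line of text for a data structure describing rooms, objects, and players, and you have the germ of a shared virtual world.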

The road from typed text to virtual world would, like so much else in gaming history, pass through Adventure. The year of 1977 was also the year of Will Crowther and Don Woods’s pioneering creation, which fascinated Trubshaw as much as it did all of the other hackers who encountered it. The source code to Adventure fortuitously popped up on the University of Essex’s PDP-10 the following year, while Trubshaw was still tinkering with his shared-memory experiments. Taking inspiration from Crowther and Woods’s code — but not too much; he considered the game’s implementation “by and large a giant kludge” — Trubshaw developed a markup language for describing a shared world, to be brought to life by an interpreter program. He named his new language MUDDL, for Multi-User Dungeon Development Language. (MUDDL was an amusing if coincidental echo of MDL — pronounced “muddle” — the language the Dynamic Modeling Group at MIT had used to implement Zork. Ditto Infocom’s later MDL-based in-house language, ZIL: the Zork Implementation Language.) The first MUDDL-built shared world to go online modeled, in the grand tradition of countless other first text-adventure creations, the house where Trubshaw had grown up and where his parents still lived. While it’s difficult to anchor these developments precisely in time, the project may have reached this point as early as late 1978, and certainly by 1979.

Richard Bartle

Trubshaw’s most enthusiastic fan and assistant was an undergraduate named Richard Bartle with a taste for Tolkien, Dungeons & Dragons, and the single-player text adventures, like Adventure and Zork, which echoed them. With that perfect résumé to hand, Bartle began to function as the world designer for the nascent MUD. In the spring of 1980, shortly after Trubshaw and Bartle together had posted that cryptic message to the Zork mailing list, Trubshaw graduated from university — he blamed the time spent working on MUD for having finished with a second rather than a first — and moved to Belgium, bequeathing all of his old code unto Bartle. Trubshaw wouldn’t do any more serious work on MUD for the next several years, and would only very rarely visit it as a player. From this point on, it would be Richard Bartle’s baby. (There is an interesting parallel here with the original Adventure, which was started by Will Crowther largely as a coding experiment, only to be abandoned by him and developed into a full-fledged game later by Don Woods. The notable difference is that Trubshaw and Bartle, unlike Crowther and Woods, did know one another, and actually worked together on the project for a while.)

Bartle was now working earnestly on making the world of MUD, which he called simply The Land, into a place worth visiting. From its modest beginnings as a facsimile of Trubshaw’s parents’ house it would grow over the course of years to become an immense place, with some 600 rooms, encompassing everything from the expected Tolkienesque fantasy to an area based — without authorization, of course — on Jim Henson’s Fraggle Rock television program.

At first, The Land was inhabited only by a select group of University of Essex students who could locally access the PDP-10 on which it ran. The same PDP-10 was, however, also accessible by a privileged few outside the university who had managed to finagle, by means legal or extra-legal, access to the early Internet. One of the first such outsiders to drop by was a precocious 17-year-old named Jeremy San who spent his days prowling the Internet, looking for interesting things. MUD, he knew immediately, was very interesting indeed. He became its greatest propagandist among the tiny British modem fraternity of the time; a huge percentage of the people who wound up playing MUD did so thanks to his encouragement. (Jeremy San took the handle of “Jez” on MUD, a nickname which leaked out of its virtual context to become his professional name. As Jez San, he would go on to become a major figure in British game development, responsible for Starglider among other hits.)

Enough outsiders like San were soon playing MUD to prompt the division of players into “internals,” meaning people playing from within the university itself, and “externals,” meaning people logging on from outside its hallowed halls. The university’s administrators proved rather astonishingly generous with their computing resources, allowing players from all over the country and, eventually, all over the world to log on and play during evenings and weekends. But make no mistake: doing so was a tall order in the Britain of the early 1980s, where electronic communications in general were still in a far more primitive state than in the United States of the period, where even simple local phone calls were still charged by the minute and modems were rarer than hen’s teeth among home-computer owners. To add to the logistical difficulties, MUD didn’t even become available on the university’s PDP-10 until 1:00 AM, after everyone else had presumably finished doing their real work on the machine. Nevertheless, enough people combined the requisite privilege, cleverness, and dedication that the software’s limitation of no more than 36 simultaneous players quickly began to frustrate.

A typical MUD session

For all that even MUD’s very name implied it to be nothing more than a multiplayer version of Zork/Dungeon, the move from a single-player game to be solved and set aside to a persistent world to be inhabited by the same players for months or years changed everything. The design questions which confronted Bartle were completely different from those being debated contemporaneously inside the early Infocom. In fact, they had far more in common with those still being debated by the makers of the massively-multiplayer games of today than they did with those surrounding the single-player games of their own time. In a single-player game, the player is the star of the show; the (virtual) world revolves around her. Not so inside The Land. Most traditional text-adventure puzzles made no sense at all there. The first person to come along and solve a puzzle might have fun with it, but after that the shared world meant that what was solved for one was solved for all: the door remained unlocked, the drawbridge remained lowered, etc. This was not, needless to say, a good use of a designer’s energy. MUD did include some set-piece puzzles which could be solved by simply typing an answer, without affecting the environment — riddles, number sequences, etc. — but even these became mere pointless annoyances to a player after she had solved them once, and tended to be so widely spoiled by the first player to solve them that they too hardly seemed a good use of a designer’s time.

The most innovative puzzles in The Land were those which took advantage of this being a multiplayer text adventure to demand cooperation; think of a heavy portcullis which can only be lifted by several players straining together. Perhaps the cleverest example of this species of puzzle was The Swamp, a huge maze with an immensely valuable crown hidden at its center. Because it was a swamp, the tried-and-true drop-and-plot technique for solving mazes was a nonstarter; items dropped would just disappear below the surface. The only way to defeat the maze was instead to bring a team of a dozen or more players, leaving one player behind in each room to mark its identity.

Yet puzzles — even brilliant puzzles like this one — were never really the heart of MUD‘s appeal; even the Swamp maze became a fairly rote exercise as soon as the first group figured out how to approach it. Although it presented itself in the guise of a text adventure, MUD often played more like a CRPG. Replacing the deterministic behavior of Adventure and Zork was a focus on emergent behavior. Each player had scores for strength, stamina, and dexterity, which determined the outcome of many actions, including combat with the many creatures which dotted the landscape. These attributes increased as one’s in-game accomplishments grew.

Like most traditional text adventures, MUD had a system of points for measuring the player’s achievements. Less typically, but very much in keeping with the CRPG side of the game’s personality, this score translated into levels, which among other things allowed players to attach honorifics (“Hero,” etc.) to their names. Points became a sort of currency within The Land; it was possible to transfer some of your points to another player by “hugging” or “kissing” her, and this was often done in exchange for goods and services.

The most straightforward way of scoring points, however, was at first glance lifted straight from Adventure and Zork: treasures were scattered about The Land, which players could retrieve and drop into a designated location. But this scheme alone was obviously unsuitable to a multiplayer context, at least if Bartle didn’t want to spend all his time hiding new treasures for players to discover. Instead he came up with a scheme, always more tolerated than liked by even the most dedicated players, by which The Land was periodically “reset,” with all treasures moved back to their original starting locations. Such resets, besides being something of a blight on MUD‘s very identity as an allegedly persistent online world, were an imperfect solution in that the more experienced players were the ones who knew where the treasures lay after a reset; they thus could rush over to claim them before the newbies had a chance. It would often take the veterans no more than five minutes to scarf up all the treasures during a “reset rush hour,” an exercise as unchallenging for them as it was baffling for the newbies.
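To make the mechanic a little more concrete, here is a minimal sketch in Python of how such a scoring-and-reset scheme might work. Everything in it — the names, the point values, the data structures — is my own invention for illustration; none of it comes from the actual MUD code.

```python
# Illustrative sketch only: a toy model of MUD-style treasure scoring and resets.
# All names and values here are hypothetical, not taken from the real game.

TREASURES = {
    # treasure -> the room it starts in after every reset
    "crown": "swamp centre",
    "chalice": "chapel",
    "ingot": "mine",
}

world = {name: {"location": home, "scored": False} for name, home in TREASURES.items()}

def score_treasure(name, player):
    """A player drops a treasure in the designated scoring spot and earns points for it."""
    if not world[name]["scored"]:
        world[name]["scored"] = True
        player["points"] += 100  # arbitrary per-treasure value for the sketch

def reset_land():
    """The periodic reset: every treasure goes back to its starting room, scored or not."""
    for name, home in TREASURES.items():
        world[name]["location"] = home
        world[name]["scored"] = False
```

The “reset rush hour” falls straight out of a scheme like this: anyone who already knows the contents of TREASURES can sweep them all up again the moment reset_land() runs, while a newcomer has to find them the hard way.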

But then, the same newbies who were left nonplussed by the reset rush hour soon found much worse to complain about. Not only did MUD feature permadeath, but it allowed its experienced players to kill the new ones. Indeed, it even encouraged such behavior by rewarding the griefers with one-twenty-fourth of the points their victims had accumulated over the course of their careers. Combined with the cliquey nature of MUD’s culture, the practice could make The Land a hugely off-putting place to visit for the first time.

Tormenting the newbies was a favorite pastime among the regulars. Richard Bartle tells the sad story of one of MUD‘s first two externals, who managed to connect from all the way over in the United States in 1980 and got nothing but grief for his trouble:

Also in the game was Niatram, one of the system operators (who can’t spell his name backwards). He decided to loom up on this [newbie] character, follow him around a bit, then kill him. This he repeated several times, gaining plenty of points in the process. Finally, the newcomer was at his wits’ end.

“Who’s this Niatram character?” he asked. “He keeps following me around and killing me!” “Yes, he’s done that to me before,” came the reply. “I think he may be dungeon-generated!” At this point Niatram appeared, and out of despair his victim quit, rather than be killed yet again by this “artificial person.”

Today’s virtual worlds are generally wise enough to prevent this sort of thing by one means or another.

Player agency was another area where MUD was dramatically different from the massively-multiplayer games of today, but in this case the difference was, at least from some points of view, a more positive one. Once ordinary players, or “mortals” in MUD speak, reached a certain level, they became “wizards” and “witches” — known by the unisex term of “wizzes” — who were perhaps better described as gods, given that they were granted a stunning degree of control over The Land. “Wiz mode” was originally simply the debug mode used by Trubshaw and Bartle themselves, but it wound up becoming a key part of the game, Bartle’s answer to the expert player’s question of “Well, what now?”

Becoming a wiz required one to amass 76,800 points without getting killed, no mean feat in the cheerfully genocidal realm of The Land. The powers wiz mode conveyed extended to the point of being able to crash the game at will. Tired of watching wizzes invent ways to do so to prove their cleverness, Bartle actually added the verb “crash” to the wizzes’ vocabulary in order to convey the message that, yes, you have the power to do this; you therefore don’t have to bother inventing more convoluted methods for doing so, like, say, plucking the rain from the sky of two separate rooms and mashing the lot together. Safe in the knowledge that they could crash the game at will, effortlessly, most wizzes did indeed see no reason to bother actually doing so. Which didn’t, of course, prevent all the accidental crashes: it was always a sure sign that someone new had attained wiz status and was experimenting with her powers when the game just kept going down over and over.

Once mastered, though, the wizzes’ powers truly were extraordinary. They could snoop on mortals, watching the commands they typed and what appeared on their screens in response; could move objects from any room to any other room without touching them; could move themselves or other players from any room to any other room; could use a “finger of death” spell to instantly kill any mortal, permanently; could grant instant wiz status to any non-wiz they chose.

It seemingly defies all logic — defies everything we know about human nature — that a game willing to grant some of its players such powers could have sustained itself for a week, much less the years during which MUD ran at the University of Essex. It was all indicative of what a different sort of game MUD really was. Running on a university’s computer, free to access by anyone with the wherewithal to make a connection, MUD was a very different proposition from a commercial game, more a joint creative experiment than a product. Bartle:

MUD is an evolving game, and so indeed it should be. It has been incremented gradually, with new ideas put in to be instantly tested by a horde of willing wizzes, or mortals if it was something that they could use (the various “injury” spells — BLIND, DEAFEN, CRIPPLE, DUMB, and CURE — for example). This is one of the great strengths of doing MUD at a university; it’s all research. If a commercial company were to put up a game riddled with bugs, the players would be justifiably upset when it crashed on them. Here, though, it’s free for them to play and they actually like finding mistakes, because it gets them one over on me (and occasionally gets them some points for their honesty!). And it’s also good because we don’t have to pay people to play-test, either — plenty will do it willingly in their spare time for free!

So there will always be a place for MUDs at universities, simply so that research into them can proceed. Universities can have “programs,” whereas commercial companies must have “products.” Products don’t crash (well, not often!) and they are nice and stable. Programs crash like nobody’s business and you never know from one day to the next whether some terrible new command has been added which you don’t know about, but which someone who does is about to use on you. Products are fun, but they don’t change until everything has been thoroughly tested; programs are exciting in their volatility.

Perhaps there is a place for the “not fully tested” system. Even if I as a player did have to put up with a crash every 20 minutes (MUD needs to be reset once a night on average), I think that experiencing the excitement of seeing things evolving and of being among the first to use the novel commands would make me happy to play the program, not the product. Fortunately, enough people think the same way to make debugging that much easier, and to encourage new additions to make the game even more fun for generations of adventurers to come.

Every rule in the game existed, it seemed, to be exploited; mechanically, the place was a leaky sieve that poor Richard Bartle was constantly rushing around to patch. MUD‘s players were endlessly creative when it came to finding exploits. One of Bartle’s favorite anecdotes involves two players who were each about halfway to wiz status. Deciding they’d had enough incrementalism, they hatched a plan to put themselves over the top. First, one of the players kissed the other some 1400 times in succession to transfer all of his points to her, giving her enough to make wiz. Then the newly minted wiz used her special powers to grant instant wiz status to her helper. In response to that, Bartle had to add a rule that players could only use hugs and kisses to transfer points to those with a smaller point total than their own. Just another day at the office…
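For illustration, here is a minimal sketch — with invented names, and in Python rather than anything resembling MUD’s actual code — of the kind of rule Bartle describes: a hug or kiss transfers points only when the recipient currently has fewer than the giver, which is precisely what closes off the two-player bootstrapping trick.

```python
# Illustrative sketch only; the function and field names are hypothetical.

WIZ_THRESHOLD = 76_800  # the point total needed to make wiz, per Bartle

def transfer_points(giver, receiver, amount):
    """Hug/kiss point transfer with the anti-exploit rule: points may only flow
    'downhill', to a player who currently has fewer points than the giver."""
    if receiver["points"] >= giver["points"]:
        return False  # blocked: the recipient is already at or above the giver's total
    amount = min(amount, giver["points"])
    giver["points"] -= amount
    receiver["points"] += amount
    return True
```

Because points can only flow downhill under a rule like this, two players who pool their scores can at best equalize them; neither can end up with more points than the richer of the two started with, let alone be boosted past the wiz threshold by transfers alone.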

For all that exploits were a way of life among The Land’s denizens, the absolute power granted to the wizzes proved not to corrupt absolutely. In fact, just the opposite. For all that there was a definite hierarchy in place among the inhabitants of The Land, for all that tormenting the newbies was regarded as such good sport by virtually everyone, a certain attenuated but real code of fair play which Bartle himself did nothing to institute took hold within MUD. And it was the wizzes, the players with the ability to wreck The Land almost beyond redemption if they chose to do so, who came to regard the code as most sacred. Bartle:

The wizzes were once mortals themselves. Wizzes know all too well what it’s like to be summoned to a cold, dark room and left alone with the word “hehehe” ringing in your ears. They know the disappointment in forging through the swamp for half an hour only to find that someone has swapped the incredibly valuable crown in the centre for a fake one. They’ve felt the pangs of outrage when you’ve been attacked by a souped-up bunny rabbit which took you 15 minutes to kill. In short, they know when to stop.

Wizzes make the game. They rule it, they stamp their personalities on it, and they give mortals something to aim for, a goal, a purpose, something which explains why they’re in there hacking and slaying. If MUD does nothing else for multi-user adventure games, for evolving the concept of a wiz it should always be remembered.

By 1984, 53 players had made wiz status: 40 from England, 5 from Scotland, 4 from Wales, 1 from Ireland, 1 from the United States, 1 from Czechoslovakia(!), 1 from Malaysia(!!). The creative powers granted to them made MUD a self-sustaining community. After Bartle built The Land and made it available, he needed to do no more. The place was perfectly capable of evolving without him.

“The people,” Bartle noted, “are the game” of MUD. The sense of shared ownership could make The Land a downright cozy place when the inhabitants took a break from killing one another. There was a bizarre craze for Trivial Pursuit for a while, with wizzes shouting out questions which mortals could answer for in-game rewards. And there was at least one MUD wedding, between Frobozz the Wizard and Kate the Enchantress, with the happy couple taking up honeymoon residence in the coveted “Wizard’s Chamber.” (“What interaction occurred thereafter is unknown,” wrote the journalist covering the story.)

Christmas was always a special time of the year, when the wizzes would institute a strictly enforced ban on player-versus-player combat, scatter the landscape with holly, snowmen, wreaths, and plum puddings, and plant Christmas trees in the forests, where one was now apt to encounter a wandering Santa Claus with his reindeer in lieu of the usual monsters. The wizzes desisted from sporting with the newbies to vicariously enjoy their surprise and delight when they logged on to find The Land transformed into a winter wonderland, complete with snow. (“What’s all this snow?” “I don’t know. I just saw Father Christmas go by, and someone has given me this cracker…”) Players would band together to sing Christmas carols, swigging all the while from a shared bottle of rum and diligently role-playing their resulting intoxication.

But who, you might wonder, were all these people who flocked to The Land every night? Like the CB Simulator fraternity on CompuServe, it was a diverse demographic who enjoyed this glimpse of an online future, ranging from teenage hacker whiz kids like Jeremy San to academics in their forties. Perhaps the most dedicated player of all was Sue, the first woman to make wiz, who slept for just three hours or so each night before MUD went online, then played for the full six hours it was available every morning before heading off to her ordinary office job.

Just as on CB Simulator, some of the other people on MUD who claimed to be women weren’t really women at all. Among this group was one Felicity, who was eventually found to be the avatar of a dude named Mark. (Felicity had the reputation, for what it’s worth, of being the “kindest, most responsible wiz of all time.”) Once Felicity/Mark’s deception was uncovered, players claiming to be women were routinely greeted with suspicion. A favorite tactic was to enlist bona fide females like Sue to engage the latest suspect in a discussion of, say, dress sizes. Thereby were the imposters generally discovered quite quickly.

A wiz called simply Evil gained the reputation of being the most eccentric of all time — quite an achievement with this group. Bartle:

If you wanted to get to any room from any other, no matter how far away, he could give you the shortest route instantly. This was despite the fact that he laboured under a tremendous disability: east-west dyslexia.

It is for this that Evil is best known. His entire in-the-head map of MUD, and all those he wrote down on paper, were flipped east for west. His misapprehension extended to commands, so if he wanted to go west from the start, which is to the left, he’d think it was to the right, and that the command for going to the right was west. So he’d get it correct, but in the wrong way! So absolutely everything was inverted, in a kind of “Evil through the looking glass”. Indeed, when I finally found out about his error I put a looking glass in MUD to celebrate!

He didn’t realise his mistake for years after he’d made it to wiz, and if people used left/right descriptions instead of west/east, he just thought they were barmy. Only when I drew a map of MUD on a blackboard did he finally discover his gaffe, and to this day he thinks a subtle change in the physics of the universe caused everyone in the world to swap east for west in their heads except for him, who remained unaffected due to his enormous and obvious intelligence…

While MUD’s personality as a game may have been molded more by the cast of eccentrics who inhabited The Land than by Richard Bartle himself, the latter would always remain by far the most important person associated with the game. MUD was, after all, his baby at the end of the day. Even after finishing his degree, he remained at the University of Essex as an artificial-intelligence researcher, with the ulterior motive of continuing to further the cause of MUD. And indeed, from the standpoint of publicity as from so many others, he continued to prove himself to be its greatest asset. As charming and articulate in person as he was easy and prolific in print, Bartle was able to attract far more press attention for his odd experiment running in the middle of the night on an academic computer than one might expect. He became a fixture in the British computing press of the 1980s, penning long articles about life in The Land for magazines like Practical Computing and Micro Adventurer. His efforts in this regard have proved a goldmine to modern historians, who have come to recognize, in a way that his contemporary peers couldn’t possibly be expected to, what a landmark creation this first persistent multi-player virtual world really was. It’s largely Bartle’s old articles that form the basis of this new article of mine. With so much in online-gaming history already lost to the ephemerality of the digital medium, we can be thankful that Bartle was as prolific as he was.

Still, preserving MUD culture for posterity wasn’t Richard Bartle’s main motivation for writing these articles. This academic researcher didn’t want MUD to remain a tiny research project at a second-tier university forever; he had a keen interest in moving it beyond the confines of the University of Essex. Early on he gave copies of the software to universities in Portugal, Sweden, Norway, and Scotland, sparking the transformation of the name MUD from a designation for a specific virtual world to a generic tag applied to all parser-driven textual virtual worlds. Despite the identical mechanics, the people who played each incarnation of the software which Bartle shared so freely gave each version of The Land a distinctive character. The MUDs in Scotland and Norway, for instance, were much more easygoing than the one at the University of Essex; delegations from the former expressed horror at the sheer amount of killing that went on in the latter when they came for a visit.

Bartle had no doubt that MUDs represented a better model for adventure gaming, the direction the entire genre by all rights ought to be going:

Like it or not, in the next few years multiplayer games like MUD are going to become the dominating factor in adventure games. The reason for this is quite simple — MUDS are absolutely fantastic to play! The fact that it’s You against Them, rather than You against It, adds an extra electricity you just can’t experience in a single-player game. If you like adventure games already, MUD will absolutely slay you (often literally!).

This quote dates from 1984. As it would imply, Bartle by that year had decided the time was right to start commercializing his research experiment. He made a deal to bring MUD to Compunet, a pioneering online service for British owners of Commodore computers, initially funded largely by Commodore’s own innovative United Kingdom branch. The first service of its kind in Britain, Compunet would muddle along for years without ever achieving the same success as QuantumLink, its nearest equivalent in the United States. The tiny staff’s efforts were constantly undone by the sheer expense of telecommunications in Britain, along with a perpetual lack of funding to set up a proper infrastructure for their service; Compunet had, for instance, only a single access number for the entire country, meaning that the vast majority of potential customers had to pay long-distance surcharges just to access it. Although it was available on Compunet for several years, MUD too just never managed to make much of an impression there.

MUSE hosted a gathering of wizzes from the University of Essex MUD to mark the launch of the BT MUD. Richard Bartle is the mustachioed fellow standing near the center of the group.

But even as the Compunet MUD was failing to set the world on fire, the game had attracted an important patron. Simon Dally was a longtime gamer, first of tabletop and later of computer games, who had done very well for himself in the book-publishing trade.  When he stumbled across MUD for the first time, it was love at first sight. Dally was certain that MUD could change the world, and that it could make him and its creators very rich in the process. He, Bartle, and Roy Trubshaw — the last had recently returned to England from Belgium — formed Multi-User Entertainments, or MUSE (not to be confused with the American software publisher of the same name), to exploit what Dally, even more so than the always-enthusiastic Bartle, believed to be the game’s immense commercial potential. (Trubshaw was more skeptical, and seemed to have agreed to work on MUD once again more for the programming challenge than for its supposed potential to make him rich or to change the world.)

The first project MUSE undertook was to completely rewrite MUD, with Trubshaw once again doing the low-level architecture and Bartle building a world upon these technical underpinnings. Whereas the old MUD had been inextricably tied to the rapidly aging DEC PDP-10, what became known as MUD2 was designed to “run on just about any minicomputer or mainframe in the world” with a minimum of porting; its first incarnation ran on a DEC VAX, a much newer and more powerful piece of kit than the PDP-10. Among many other improvements came the inevitable increases in scale: the old limit of 36 active players on the original MUD became 100 on MUD2, and The Land itself could now be twice as large. As befitted his day job as an artificial-intelligence researcher, Bartle was particularly proud of his new non-player creatures — known as “mobiles” in MUD speak — which were able to talk and trade with human players whilst pursuing goals of their own.

A proud Simon Dally announces the BT MUD.

Dally took the new-and-improved MUD to British Telecom and convinced them to make it available as a standalone dial-up service. Playing MUD in its latest commercial incarnation wouldn’t be cheap: it cost £20 to buy the starter pack, plus £2 per hour to play, all on top of the cost of the phone call needed to dial in. With prices like these, the new MUD endeavored to shed its scruffy hacker origins and project an upscale image, beginning with a gala launch at The London Dungeon. Dally used his connections to get a glossy book published: An Introduction to MUD, penned by one Duncan Howard. “I know the BT venture is just the start of something truly enormous,” Dally said. “Our MUD development language — MUDDL — will allow anyone to come to us with an idea for an interactive type of game, and allow us to implement it quickly and cheaply. We are certainly ahead of the States, where MegaWars III, a rather limited interactive game, is going down well, and we have high hopes of selling MUD to the Americans.” Those hopes would come to fruition in remarkably short order.

But in neither country would the game achieve the “truly enormous” status Dally had so confidently predicted. The BT venture proved particularly disappointing. One step Bartle took which may not have been terribly wise was to convince much of the Essex MUD old guard to migrate over to the new game, pulling strings to get them online at reduced or free rates. Thus he imported the old game’s murderous culture and set up an upper class and a lower class of players at a stroke. A writer for Acorn User magazine who surveyed the newbies found that they “don’t much like MUD. They can’t find any treasure, don’t know what to do, and spend their time waiting for the next reset or chatting to each other in the bar. Typical answers to my inquiries included phrases like ‘bored’ or ‘where’s all the T?'”

In addition to the culture clashes, the BT MUD was also a victim, like the Compunet MUD before it, of all the difficulties inherent in telecommunicating in Britain at the time. MUSE could measure their falling status within British Telecom by the people they were assigned as account managers. “It was a gradual decline,” remembers Bartle, “from speaking to board-level directors to being signed off by a youth-opportunities employee.” The next wave of adventure games in Britain the BT MUD most definitely didn’t become. It continued to run for years, but, after the initial publicity blitz puttered out, it existed as little more than another hangout for the same old small in-group it had attracted at the University of Essex. The BT MUD was finally closed down in early 1991, when British Telecom decided they’d had enough of an exercise that had long since become pointless from their perspective.

MUSE’s efforts did fare significantly better in the United States. Dally convinced no less of an online player than CompuServe to take the game as an offering on their service. It went up there in the spring of 1986 under the name of British Legends. Perhaps assuming that what had worked for Lord British of Ultima fame could work equally well for them, CompuServe’s marketers emphasized the Anglo connection at every opportunity. They declared over-optimistically that “in England, the game is a sensation,” with “thousands of players.” A couple of years after British Legends made its debut, CompuServe began offering their service in Britain as well as the United States, facilitating the first meetings between large numbers of British and American players in the very same MUD. “I’ve learned more about the United Kingdom from the British players than I learned in all my college courses,” enthused one American player.

Mechanically, British Legends remained unchanged from its other incarnations, including the wizzes running roughshod over the place. Still, it tended to be a less cutthroat world than the Essex MUD had been, with more time given over to socializing and cooperating and less to fighting. And of course it was blessed with an initial group of players who were all starting alike from scratch, and thus largely avoided the class conflicts which plagued the BT MUD. It would remain available on CompuServe for more than thirteen years, becoming by far the most long-lived of all the MUD incarnations licensed by MUSE. It was successful enough that virtually all of CompuServe’s competitors either made their own licensing deals with MUSE or came up with an alternative MUD of their own — or, in some cases, did both. Some of the latter were very innovative in their own right. GEnie’s Imagine Nation, for example, strained to be a kinder, gentler sort of MUD, eschewing combat and even goals entirely in favor of providing an environment for its denizens just to hang out and socialize, or to invent their own games in the form of scavenger hunts and trivia contests. In later years MUDs in general would increasingly go in this direction. (See, for instance, the modern interactive-fiction community’s longstanding, beloved IF MUD, where people congregate to play text adventures together, to discuss game design, and just to chitchat rather than to chase one another around with virtual swords.)

Yet the relative commercial success MUDs enjoyed in the United States shouldn’t be overstated. Even British Legends, probably the most popular single MUD incarnation ever, was always a niche product even within that small niche of the public with the money and the interest to access a service like CompuServe in the first place. The online services made MUDs available because they could do so fairly cheaply, and because, once they were up, they tended to produce a self-sustaining community of hardcore players which required virtually no nurturing — that is to say, they required virtually no further financial investment whatsoever, thanks to the self-sustaining genius of wiz mode.

But in the big picture, Richard Bartle and Simon Dally’s mutual passion was destined to be far more influential than it would be profitable. The death of the dream of MUDs as the dominant adventure-game form of the future came in a tragically literal fashion in 1989, when Dally killed himself. It’s impossible to say to what extent his suicide was prompted by his disappointment at the failure of MUD to achieve the world domination he had predicted and to what extent it was a product of the other mental-health issues from which he had apparently been suffering, as manifested in behavior his friends and colleagues later described as “erratic.” There can be no doubt, however, that his death marked the definitive end of the MUD’s flirtation with mainstream prominence.

Any attempt to explain why MUDs remained a niche interest must begin with their textual nature. MUSE had been formed just after the text adventure had reached its commercial peak and was about to enter its long decline, over the course of which it would gradually be replaced on store shelves by the graphic adventure. A game that consisted of nothing but text was doomed to become a harder and harder sell after 1985. But MUDs had their problems even in comparison to single-player text adventures — problems which Richard Bartle was always a bit too eager to overlook. Many players of adventure games loved puzzles, but, as we’ve seen, puzzles didn’t really work all that well in MUDs. Many craved the experience of seeing a story through from beginning to end, knowing all the while that they were the most important character in that story; MUDs couldn’t provide this either. Ask many who had tried to like MUDs, and they’d tell you that they were just too capricious, too unstructured, too difficult to get into, even before you started wrestling with a parser that sometimes seemed willfully determined not to understand you. MUDs had invented the idea of the persistent online virtual world, but the millions and millions of players who would later come to live a good chunk of their lives in such places would have a very different technological window onto them.

The way forward, in commercial terms at least, would be through more structured designs attached to cleaner interfaces, eventually using graphics instead of text wherever possible. While the hardcore who loved MUDs for the very things the casual dabblers hated about them complained — and by no means entirely without justification — that something precious was being lost, game developers would increasingly push in this more accessible direction. Instead of making multi-player text adventures with CRPG elements, they would build their persistent worlds on the framework of the traditional CRPG — full stop.

(Sources: the books MMOs from the Inside Out by Richard Bartle and Grand Thieves and Tomb Raiders: How British Videogames Conquered the World by Magnus Anderson and Rebecca Levene; Byte of December 1980; Micro Adventurer of September 1984, October 1984, November 1984, December 1984, January 1985, February 1985, and March 1985; Practical Computing of June 1982 and December 1983; Popular Computing Weekly of December 20 1984, February 28 1985, and May 23 1985; Commodore Disk User of November 1987; Your Computer of September 1985; Sinclair User of November 1985; Computer and Video Games of November 1985; Commodore User of February 1986; Acorn User of July 1985, October 1985, April 1987, and June 1987; New Computer Express of March 9 1991; Online Today of February 1986, January 1988, and March 1988; The Gamer’s Connection of September/October 1988; Questbusters of October 1989. Online sources include the article “CNET — Moving with the Times” from Commodore Apocalypse and Gamespy’s history of MUDs.

You can still play MUD1 today by telnetting to british-legends.com, port 27750. You can play MUD2 by telnetting to mud2.com, port 27723.)

 

A Net Before the Web, Part 1: The Establishment Man and the Magnificent Rogue

On July 9, 1979, journalists filtered into one of the lavish reception halls in Manhattan’s Plaza Hotel to witness the flashy roll-out of The Source, an online service for home-computer owners that claimed to be the first of its kind. The master of ceremonies was none other than the famous science-fiction and science-fact writer Isaac Asimov. With his nutty-professor persona in full flower, his trademark mutton-chop sideburns bristling in the strobe of the flashbulbs, Asimov said that “this is the beginning of the Information Age! By the 21st century, The Source will be as vital as electricity, the telephone, and running water.”

Actually, though, The Source wasn’t quite the first of its kind. Just eight days before, another new online service had made a more quiet official debut. It was called MicroNET, and came from an established provider of corporate time-shared computing services called CompuServe. MicroNET got no splashy unveiling, no celebrity spokesman, just a typewritten announcement letter sent to members of selected computer users groups.

The contrast between the two roll-outs says much about the men behind them, who between them would come to shape much of the online world of the 1980s and beyond. They were almost exactly the same age as one another, but cut from very different cloths. Jeff Wilkins, the executive in charge of CompuServe, could be bold when he felt it was warranted, but his personality lent itself to a measured, incremental approach that made him a natural favorite with the conservative business establishment. “The changes that will come to microcomputing because of computer networks will be evolutionary in nature,” he said just after launching MicroNET. Even after Wilkins left CompuServe in 1985, it would continue to bear the stamp of his careful approach to doing business for many years.

But William von Meister, the man behind The Source and its afore-described splashier unveiling, preferred revolutions to evolutions. He was high-strung, mercurial, careless, sometimes a little unhinged. Described as a “magnificent rogue” by one acquaintance, as a “pathological entrepreneur” by another, he made businesses faster than he made children — of whom, being devoted to excess in all its incarnations, he had eight. His businesses seldom lasted very long, and when they did survive, they did so without him at their helm, usually after he had been chased out of them in a cloud of acrimony and legal proceedings. A terrible businessman by most standards, he could nevertheless “raise money from the dead,” as one investor put it, thereby moving on to the next scheme while the previous one was still going down in flames. Still, whatever else you could say about him, Bill von Meister had vision. Building the online societies of the future would require cockeyed dreamers like him just as much as it would sober tacticians like Jeff Wilkins.


Had an anonymous salesman who worked for Digital Equipment Corporation in 1968 been slightly less good at his job, CompuServe would most likely never have come to be.

The salesman in question had been assigned to a customer named John Goltz, fresh out of the University of Arizona and working now in Columbus, Ohio, for a startup. But lest the word “startup” convey a mistaken impression of young men with big dreams out to change the world, Silicon Valley-style, know that this particular startup lived within about the most unsexy industry imaginable: life insurance. No matter; from Goltz’s perspective anyway the work was interesting enough.

He found himself doing the work because Harry Gard, the founder of the freshly minted Golden United Life Insurance, wanted to modernize his hidebound industry, at least modestly, by putting insurance records online via a central computer which agents in branch offices could all access. He had first thought of giving the job to his son-in-law Jeff Wilkins, an industrious University of Arizona alumnus who had graduated with a degree in electrical engineering and now ran a successful burglar-alarm business of his own in Tucson. “The difference between electrical engineering and computing didn’t occur to him,” remembers Wilkins. “I told him that I didn’t know anything about computing, but I had a friend who did.” That friend was John Goltz, whose degree in computer science made him the more logical candidate in Wilkins’s eyes.

Once hired, Goltz contacted DEC to talk about buying a PDP-9, a sturdy and well-understood machine that should be perfectly adequate for his new company’s initial needs. But our aforementioned fast-talking salesman gave him the hard up-sell, telling him about the cutting-edge PDP-10 he could lease for only “a little more.” Like the poor rube who walks into his local Ford dealership to buy a Focus and drives out in a Mustang, Goltz’s hacker heart couldn’t resist the lure of DEC’s 36-bit hot rod. He repeated the salesman’s pitch almost verbatim to his boss, and Gard, not knowing a PDP-10 from a PDP-9 from a HAL 9000, said fine, go for it. Once his dream machine was delivered and installed in a former grocery store, Goltz duly started building the online database for which he’d been hired.

The notoriously insular life-insurance market was, however, a difficult nut to crack. Orders came in at a trickle, and Goltz’s $1 million PDP-10 sat mostly idle. It was at this point, looking for a way both to make his computer earn its keep and to keep his employer afloat, that Goltz proposed that Golden United Life Insurance enter into the non-insurance business of selling time-shared computer cycles. Once again, Gard told him to go for it; any port in a storm and all that.

At the dawn of the 1970s, time-sharing was the hottest buzzword in the computer field. Over the course of the 1950s and 1960s, the biggest institutions in the United States — government bureaucracies, banks, automobile manufacturers and other heavy industries — had all gradually been computerized via hulking mainframes that, attended by bureaucratic priesthoods of their own and filling entire building floors, chewed through and spat out millions of records every day. But that left out the countless smaller organizations who could make good use of computers but had neither the funds to pay for a mainframe’s care and upkeep nor a need for more than a small fraction of its vast computing power. DEC, working closely with university computer-science departments like that of MIT, had been largely responsible for the solution to this dilemma. Time-sharing, enabled by a new generation of multi-user, multitasking operating systems like DEC’s TOPS-10 and an evolving telecommunications infrastructure that made it possible to link up with computers from remote locations via dumb terminals, allowed computer cycles and data storage to be treated as a commodity. A business or other organization, in other words, could literally share time on a remote computer system with others, paying for only the cycles and storage they actually used. (If you think that all this sounds suspiciously like the supposedly modern innovation of “cloud computing,” you’re exactly right. In technology as in life, a surprising number of things are cyclical, with only the vocabulary changing.)

Jeff Wilkins

John Goltz possessed a keen technical mind, but he had neither the aptitude nor the desire to run the business side of Golden United’s venture into time-sharing. So, Harry Gard turned once again to his son-in-law. “I liked what I was doing in Arizona,” Jeff Wilkins says. “I enjoyed having my own company, so I really didn’t want to come out.” Finally, Gard offered him $1.5 million in equity, enough of an eye-opener to get him to consider the opportunity more seriously. “I set down the ground rules,” he says. “I had to have complete control.” In January of 1970, with Gard having agreed to that stipulation, the 27-year-old Jeff Wilkins abandoned his burglar-alarm business in Tucson to come to Columbus and run a new Golden United subsidiary which was to be called Compu-Serv.

With time-sharing all the rage in computer circles, it was a tough market they were entering. Wilkins remembers cutting his first bill to a client for all of $150, thinking all the while that it was going to take a lot of bills just like it to pay for this $1 million computer. But Compu-Serv was blessed with a steady hand in Wilkins himself and a patient backer with reasonably deep pockets in his father-in-law. Wilkins hired most of his staff out of big companies like IBM and Xerox. They mirrored their young but very buttoned-down boss, going everywhere in white shirt and tie, lending an aura of conservative professionalism that belied the operation’s small size and made it attractive to the business establishment. In 1972, Compu-Serv turned the corner into a profitability that would last for many, many years to come.

In the beginning, they sold nothing more than raw computer access; the programs that ran on the computers all had to come from the clients themselves. As the business expanded, though, Compu-Serv began to offer off-the-shelf software as well to suit the various industries they found themselves serving. They began, naturally enough, with the “Life Insurance Data Information System,” a re-purposing of the application Goltz had already built for Golden United. Expanding the reach of their applications from there, they cultivated a reputation as a full-service business partner rather than a mere provider of a commodity. Most importantly of all, they invested heavily in their own telecommunications infrastructure, which existed in parallel with the nascent Internet and other early networks, using lines leased from AT&T and a system of routers — actually, DEC minicomputers running software of their own devising — for packet-switching. From their first handful of clients in and around Columbus, Compu-Serv thus spread their tendrils all over the country. They weren’t the cheapest game in town, but for the risk-averse businessperson looking for a full-service time-sharing provider with a fast and efficient network, they made for a very appealing package.

In 1975, Compu-Serv was spun off from the moribund Golden United Life Insurance, going public with a NASDAQ listing. Thus freed at last, the child quickly eclipsed the parent; the first stock split happened within a year. In 1977, Compu-Serv changed their name to CompuServe. By this point, they had more than two dozen offices spread through all the major metropolitan areas, and that one PDP-10 in a grocery store had turned into more than a dozen machines filling two data centers near Columbus. Their customer roll included more than 600 businesses. By now, even big business had long since come to see the economic advantages time-sharing offered in many scenarios. CompuServe’s customers included Fortune 100 giants like AMAX (the largest miner of aluminum, coal, and steel in the country), Goldman Sachs, and Owens Corning, along with government agencies like the Department of Transportation. “CompuServe is one of the best — if not the best — time-sharing companies in the country,” said AMAX’s director of research.

Inside one of CompuServe’s data centers.

The process that would turn this corporate data processor of the 1970s into the most popular consumer online service of the 1980s was born out of much the same reasoning that had spawned it in the first place. Once again, it all came down to precious computer cycles that were sitting there unused. To keep their clients happy, CompuServe was forced to make sure they had enough computing capacity to meet peak-hour demand. This meant that the majority of the time said capacity was woefully underutilized; the demand for CompuServe’s computer cycles was an order of magnitude higher during weekday working hours than it was during nights, evenings, and weekends, when the offices of their corporate clients were deserted. This state of affairs had always rankled Jeff Wilkins, nothing if not a lover of efficiency. Yet it had always seemed an intractable problem; it wasn’t as if they could ask half their customers to start working a graveyard shift.

Come 1979, though, a new development was causing Wilkins to wonder if there might in fact be a use for at least some of those off-hour cycles. The age of personal computing was in the offing. Turnkey microcomputers were now available from Apple, Commodore, and Radio Shack. The last company alone was on track to sell more than 50,000 TRS-80s before the end of the year, and many more models from many more companies were on the way. The number of home-computer hobbyists was still minuscule by any conventional standard, but it could, it seemed to Wilkins, only grow. Might some of those hobbyists be willing and able to dial in and make use of CompuServe’s dearly bought PDP-10 systems while the business world slept? If so, who knew what it might turn into?

It wasn’t as if a little diversity would be a bad thing. While CompuServe was still doing very well on the strength of their fine reputation — they would bill their clients for $19 million in 1979 — the time-sharing market in general was showing signs of softening. The primary impetus behind it — the sheer expense of owning one’s own computing infrastructure — was slowly bleeding away as minicomputers like the DEC PDP-11, small enough to shove away in a closet somewhere rather than requiring a room or a floor of its own, became a more and more cost-effective solution. Rather than a $1 million proposition, as it had been ten years ago, a new DEC system could now be had for as little as $150,000. Meanwhile a new piece of software called VisiCalc — the first spreadsheet program ever, at least as the modern world understands that term — would soon show that even an early, primitive microcomputer could already replace a time-shared terminal hookup in a business’s accounting department. And once entrenched in that vital area, microcomputers could only continue to spread throughout the corporation.

Still, the consumer market for online services, if it existed, wasn’t worth betting CompuServe’s existing business model on. Wilkins entered this new realm, as he did most things, with cautious probity. The new service would be called MicroNET so as to keep it from damaging the CompuServe brand in the eyes of their traditional customers, whether because it became a failure or just because of the foray into the untidy consumer market that it represented. And it would be “market-driven” rather than “competition-driven.” In Wilkins’s terminology, this meant that they would provide some basic time-sharing infrastructure — including email and a bulletin board for exchanging messages, a selection of languages for writing and running programs, and a suite of popular PDP-10 games like Adventure and Star Trek — but would otherwise adopt a wait-and-see attitude on adding customized consumer services, letting the market — i.e., all those hobbyists dialing in from home — do what they would with the system in the meantime.

Even with all these caveats, he had a hard time selling the idea to his board, who were perfectly happy with the current business model, thank you very much, and who had the contempt for the new microcomputers and the people who used them that was shared by many who had been raised on the big iron of DEC and IBM. They took to calling his idea “schlock time-sharing.”

Mustering all his powers of persuasion, Wilkins was able to overrule the naysayers sufficiently to launch a closed trial. On May 1, 1979, CompuServe quietly offered free logins to any members of the Midwest Affiliation of Computer Clubs, headquartered right there in Columbus, who asked for them. With modems still a rare and pricey commodity, it took time to get MicroNET off the ground; Wilkins remembers anxiously watching the connectivity lights inside the data center during the evenings, and seeing them remain almost entirely dimmed. But then, gradually, they started blinking.

After exactly two months, with several hundred active members having proved to Wilkins’s satisfaction that a potential market existed, he made MicroNET an official CompuServe service, open to all. To the dissatisfaction of his early adopters, that meant they had to start paying: a $30 signup charge, followed by $5 per hour for evening and weekend access, $12 per hour if they were foolish enough to log on during the day, when CompuServe’s corporate clients needed the machines. To the satisfaction of Wilkins, most of his early adopters grumbled but duly signed up, and they were followed by a slow but steady trickle of new arrivals. The service went entirely unadvertised, news of its existence spreading among computer hobbyists strictly by word of mouth. MicroNET was almost literally nothing in the context of CompuServe’s business as a whole — it would account for roughly 1 percent of their 1979 revenue, less than heaps of their larger individual corporate accounts — yet it marked the beginning of something big, something even Wilkins couldn’t possibly anticipate.

But MicroNET didn’t stand alone. Even as one online service was getting started in about the most low-key fashion imaginable, another was making a much more high-profile entrance. It was fortunate that Wilkins chose to see MicroNET as “market-driven” rather than “competition-driven.” Otherwise, he wouldn’t have been happy to see his thunder being stolen by The Source.

Bill von Meister

Like Jeff Wilkins, Bill von Meister was 36 years old. Unlike Wilkins, he already had on his resume a long string of entrepreneurial failures to go along with a couple of major successes. An unapologetic epicurean with a love for food, wine, cars, and women, he had been a child not just of privilege but of aristocracy, his father a godson of the last German kaiser, his mother an Austrian countess. His parents had immigrated to New York in the chaos that followed World War I, when Germany and Austria could be uncomfortable places for wealthy royalty, and there his father had made the transition from landed aristocrat to successful businessman with rather shocking ease. Among other ventures, he became a pivotal architect of the storied Zeppelin airship service between Germany and the United States — although the burning of the Hindenburg did rather put the kibosh on that part of his portfolio, as it did passenger-carrying airships in general.

The son inherited at least some of the father’s acumen. Leveraging his familial wealth alongside an unrivaled ability to talk people into giving him money — one friend called him the best he’d ever seen at “taking money from venture capitalists, burning it all up, and then getting more money from the same venture capitalist” — the younger von Meister pursued idea after idea, some visionary, some terrible. By 1977, he had hit pay dirt twice already in his career, once when he created what was eventually branded as Western Union’s “Mailgram” service for sending a form of electronic mail well before computer email existed, once when he created a corporate telephone service called Telemax. Unfortunately, the money he earned from these successes disappeared as quickly as it poured in, spent to finance his high lifestyle and his many other, failed entrepreneurial projects.

Late in 1977, he founded Digital Broadcasting Corporation in Fairfax County, Virginia, to implement a scheme for narrow-casting digital data using the FM radio band. “Typical uses,” ran the proposal, “would include price information for store managers in a retail chain, bad-check information to banks, and policy information to agents of an insurance company.” Von Meister needed financing to bring off this latest scheme, and he needed a factory to build the equipment that would be needed. Luckily, a man who believed he could facilitate both called him one day in the spring of 1978 after reading a description of his plans in Business Week.

Jack Taub had made his first fortune as the founder of Scott Publishing, known for their catalogs serving the stamp-collecting hobby. Now, he was so excited by von Meister’s scheme that he immediately bought into Digital Broadcasting Corporation to the tune of $500,000 of much-needed capital, good for a 42.5 percent stake. But every bit as important as Taub’s personal fortune were the connections he had within the federal government. By promising to build a factory in the economically disadvantaged inner city of Charlotte, North Carolina, he convinced the Commerce Department’s Economic Development Administration to guarantee 90 percent of a $6 million bank loan from North Carolina National Bank, under a program meant to channel financing into job-creating enterprises.

Unfortunately, the project soon ran into serious difficulties with another government agency: the Federal Communications Commission, who noted pointedly that the law which had set aside the FM radio band had stipulated it should be reserved for applications “of interest to the public.” Using it to send private data, many officials at the FCC believed, wasn’t quite what the law’s framers had had in mind. And while the FCC hemmed and hawed, von Meister was fomenting chaos within the telecommunications and broadcasting industries at large by claiming his new corporation’s name gave him exclusive rights to the term “digital broadcasting,” a modest buzzword of its own at the time. His legal threats left a bad taste in the mouth of many a potential partner, and the scheme withered away under the enormous logistical challenges getting such a service off the ground must entail. The factory which the Commerce Department had so naively thought they were financing never opened, but Digital Broadcasting kept what remained of the money they had received for the purpose.

They now planned to use the money for something else entirely. Von Meister and Taub had always seen business-to-business broadcasting as only the first stage of their company’s growth. In the longer term, they had envisioned a consumer service which would transmit and even receive information — news and weather reports, television listings, shopping offers, opinion polls, etc. — to and from terminals located in ordinary homes. When doing all this over the FM radio band began to look untenable, they had cast about for alternative approaches; they were, after all, still flush with a fair amount of cash. It didn’t take them long to take note of all those TRS-80s and other home computers that were making their way into the homes of early adopters. Both Taub and von Meister would later claim to have been the first to suggest a pivot from digital broadcasting to a microcomputer-oriented online information utility. In the beginning, they called it CompuCom.

The most obvious problem CompuCom faced — its most obvious disadvantage in comparison to CompuServe’s MicroNET — was the lack of a telecommunications network of its own. Once again, both Taub and von Meister would later claim to have been the first to see the solution. One or the other or both took note of another usage inequality directly related to the one that had spawned MicroNET. Just as the computers of time-sharing services like CompuServe sat largely idle during nights and weekends, traffic on the telecommunications lines corporate clients used to connect to them was also all but nonexistent more than half of the time. Digital Broadcasting came to GTE Telenet with an offer to lease this idle bandwidth at a rate of 75¢ per connection per hour, a dramatically reduced price from that of typical business customers. GTE, on the presumption that something was better than nothing, agreed. And while they were making the deal to use the telecommunications network, von Meister and Taub also made a deal with GTE Telenet to run the new service on the computers in the latter’s data centers, using all that excess computing power that lay idle along with the telecommunications bandwidth on nights and weekends. Because they needed to build no physical infrastructure, von Meister and Taub believed that CompuCom could afford to be relatively cheap during off-hours; the initial pricing plan stipulated just $2.75 per hour during evenings and weekends, with a $100 signup fee and a minimum monthly charge of $10.

For all the similarities in their way of taking advantage of the time-sharing industry’s logistical quirks, not to mention their shared status as the pioneers of much of modern online life, there were important differences between the nascent MicroNET and CompuCom. From the first, von Meister envisioned his service not just as a provider of computer access but as a provider of content. The public-domain games that were the sum total of MicroNET’s initial content were only the beginning for him. Mirroring its creator, CompuCom was envisioned as a service for the well-heeled Playboy– and Sharper Image-reading technophile lounge lizard, with wine lists, horoscopes, entertainment guides for the major metropolitan areas, and an online shopping mall. In a landmark deal, von Meister convinced United Press International, one of the two providers of raw news wires to the nation’s journalistic infrastructure, to offer their feed through CompuCom as well — unfiltered, up-to-the-minute information of a sort that had never been available to the average consumer before. The New York Times provided a product-information database, Prentice Hall provided tax information, and Dow Jones provided a stock ticker. Von Meister contracted with the French manufacturer Alcatel for terminals custom-made just for logging onto CompuCom, perfect for those wanting to get in on the action who weren’t interested in becoming computer nerds in the process. For the same prospective customers, he insisted that the system, while necessarily all text given the state of the technology of the time, be navigable via multiple-choice menus rather than an arcane command line.

In the spring of 1979, just before the first trials began, CompuCom was renamed The Source; the former name sounded dangerously close to “CompuCon,” a disadvantage that was only exacerbated by the founder’s checkered business reputation. The service officially opened for business, eight days after MicroNET had done the same, with that July 9 press conference featuring Isaac Asimov and the considerable fanfare it generated. Indeed, the press notices were almost as ebullient as The Source’s own advertising, with the Wall Street Journal calling it “an overnight sensation among the cognoscenti of the computing world.” Graeme Keeping, a business executive who would later be in charge of the service but was at this time just another outsider looking in, had this to say about those earliest days:

The announcement was made with the traditional style of the then-masters of The Source. A lot of fanfare, a lot of pizazz, a lot of sizzle. There was absolutely no substance whatsoever to the announcement. They had nothing to back it up with.

Electronic publishing was in its infancy in those days. It was such a romantic dream that there never had to be a product in order to generate excitement. Nobody had to see anything real. People wanted it so badly, like a cure for cancer. We all want it, but is it really there? I equate it to Laetrile.

While that is perhaps a little unfair — there were, as we’ve just seen, several significant deals with content providers in place before July of 1979 — it was certainly true that the hype rather overwhelmed the comparatively paltry reality one found upon actually logging into The Source.

Nevertheless, any comparison of The Source and MicroNET at this stage would have to place the former well ahead in terms of ambition, vision, and public profile. That distinction becomes less surprising when we consider that what was a side experiment for Jeff Wilkins was the whole enchilada for von Meister and Taub. For the very same reason, any neutral observer forced to guess which of these two nascent services would rise to dominance would almost certainly have gone with The Source. Such a reckoning wouldn’t have accounted, however, for the vortex of chaos that was Bill von Meister.

It was the typical von Meister problem: he had built all this buzz by spending money he didn’t have — in fact, by spending so much money that to this day it’s hard to figure out where it could all possibly have gone. As of October of 1979, the company had $1000 left in the bank and $8 million in debt. Meanwhile The Source itself, despite all the buzz, had managed to attract at most a couple of thousand actual subscribers. It was, after all, still very early days for home computers in general, modems were an even more exotic species, and the Alcatel terminals had yet to arrive from France, being buried in some transatlantic bureaucratic muddle.

Jack Taub

By his own later account, Jack Taub had had little awareness over the course of the last year or so of what von Meister was doing with the company’s money, being content to contribute ideas and strategic guidance and let his partner handle day-to-day operations. But that October he finally sat down to take a hard look at the books. He would later pronounce the experience of doing so “an assault on my system. Von Meister is a terrific entrepreneur, but he doesn’t know when to stop entrepreneuring. The company was in terrible shape. It was not going to survive. Money was being spent like water.” With what he considered to be a triage situation on his hands, Taub delivered an ultimatum to von Meister. He would pay him $3140 right now — 1¢ for each of his shares — and would promise to pay him another dollar per share in three years if The Source was still around then. In return, von Meister would walk away from the mess he had created, escaping any legal action that might otherwise become a consequence of his gross mismanagement. According to Taub’s account, von Meister agreed to these terms with uncharacteristic meekness, leaving his vision of The Source as just one more paving stone on his boulevard of broken entrepreneurial dreams, and leaving Taub to get down to the practical business of saving the company. “I think if I had waited another week,” the latter would later say, “it would have been too late.”

As it was, Digital Broadcasting teetered on the edge of bankruptcy for months, with Taub scrambling to secure new lines of credit to keep the existing creditors satisfied and, when all else failed, injecting more of his own money into the company. Through it all, he still had to deal with von Meister, who, as any student of his career to date could have predicted, soon had second thoughts about going away quietly — if, that is, he’d ever planned to do so in the first place. Taub learned that von Meister had taken much of Digital Broadcasting’s proprietary technology out the door with him, and was now shopping it around the telecommunications industry; that sparked a lawsuit on Taub’s behalf. Von Meister claimed his ejection had been illegal; that sparked another, going in the opposite direction. Apparently concluding that his promise not to sue von Meister for his mismanagement of the company was thus nullified, Taub counter-sued with exactly that charge. With a vengeful von Meister on his trail, he said that he couldn’t afford to “sleep with both eyes closed.”

By March of 1980, The Source had managed to attract about 3000 subscribers, but the online citizens were growing restless. Many features weren’t quite as advertised. The heavily hyped nightlife guides, for instance, mostly existed only for the Washington Beltway, the home of The Source. The email system was down about half the time, and even when it was allegedly working it was anyone’s guess whether a message that was sent would actually be delivered. Failings like these could be attributed easily enough to the usual technical growing pains, but other complaints carried with them an implication of nefarious intent. The Source’s customers could read the business pages of the newspaper as well as anyone, and knew that Jack Taub was fighting for his company’s life on multiple fronts. In that situation, some customers reasoned, there would be a strong incentive to find ways to bill them just that little bit more. Thus there were dark accusations that the supposedly user-friendly menu system had been engineered to be as verbose and convoluted as possible in order to maximize the time users spent online just trying to get to where they wanted to go. On a 110- or 300-baud connection — for comparison purposes, consider that a good touch typist could far exceed the former rate — receiving all these textual menus could take considerable time, especially given the laggy response time of the system as a whole whenever more than a handful of people were logged on. And for some reason, a request to log off the system in an orderly way simply didn’t work most of the time, forcing users to break the connection themselves. After they did so, it would conveniently — conveniently for The Source’s accountants, that is — take the system five minutes or so to recognize their absence and stop charging them.
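To put some rough numbers behind that comparison (my own back-of-the-envelope arithmetic, not anything drawn from The Source's documentation): a modem of that era typically needed about ten bits on the wire for every character once start and stop bits are counted, so 110 baud works out to roughly 11 characters per second and 300 baud to roughly 30. The little sketch below, which assumes a purely illustrative 1500-character menu screen, shows why all that menu text felt so expensive.

# Back-of-the-envelope estimate of how long The Source's verbose menus took
# to arrive over the modems of 1980. The framing overhead (10 bits per
# character) and the 1500-character menu screen are illustrative assumptions,
# not measurements of the actual service.

BITS_PER_CHARACTER = 10    # 8 data bits plus start/stop framing, roughly
MENU_CHARACTERS = 1500     # a plausible full screen of menu text

for baud in (110, 300):
    chars_per_second = baud / BITS_PER_CHARACTER
    seconds = MENU_CHARACTERS / chars_per_second
    print(f"{baud} baud: ~{chars_per_second:.0f} characters/second, "
          f"~{seconds:.0f} seconds per menu screen")

# 110 baud: ~11 characters/second, ~136 seconds per menu screen
# 300 baud: ~30 characters/second, ~50 seconds per menu screen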

A sampling of the many error messages with which early users of The Source became all too familiar.

The accusations of nefarious intent were, for what it’s worth, very unlikely to have had any basis in reality. Jack Taub was a hustler, but he wasn’t a con man. On the contrary, he was earnestly trying to save a company whose future he deeply believed in. His biggest problem was the government-secured loan, on which Digital Broadcasting Corporation had by now defaulted, forcing the Commerce Department to pay $3.2 million to the National Bank of North Carolina. The government bureaucrats, understandably displeased, were threatening to seize his company and dismantle it in the hope of getting at least some of that money back. They were made extra motivated by the fact that the whole affair had leaked into the papers, with the Washington Post in particular treating it as a minor public scandal, an example of Your Tax Dollars at Waste.

Improvising like mad, Taub convinced the government to allow him to make a $300,000 down payment, and thereafter to repay the money he owed over a period of up to 22 years at an interest rate of just 2 percent. Beginning in 1982, the company, now trading as The Source Telecomputing Corporation rather than Digital Broadcasting Corporation, would have to repay either $50,000 or 10 percent of their net profit each year, whichever was greater; beginning in 1993, the former figure would rise to $100,000 if the loan still hadn’t been repaid. “The government got a good deal,” claimed Taub. “They get 100 cents on the dollar, and get their money back faster if I’m able to do something with the company.” While some might have begged to differ with his characterization of the arrangement as a “good deal,” it was, the government must have judged, the best it was likely to get under the circumstances. “The question is to work out some kind of reasonable solution where you recover something rather than nothing,” said one official familiar with the matter. “While it sounds like they’re giving it away, they already did that. They already made their mistake with the original loan.”

With the deal with the Commerce Department in place, Taub convinced The Readers Digest Association, publisher of the most popular magazine in the world, who were eager to get in on the ground floor of what was being billed in some circles as the next big thing in media, to buy 51 percent of The Source for $3 million in September of 1980, thus securing desperately needed operating capital. But when, shortly thereafter, a judge ruled in favor of von Meister on the charge that he had been unlawfully forced out of the company, Taub was left scrambling once again. He was forced to go back to Readers Digest, convincing them this time to increase their stake to 80 percent, leaving only the remaining 20 percent in his own hands. And with that second capital injection to hand, he convinced von Meister to lay the court battle to rest with a settlement check for $1 million.

The Source had finally attained a measure of stability, and Jack Taub’s extended triage could thus come to an end at last. Along the way, however, he had maneuvered himself out of his controlling interest and, soon, out of a job. Majority ownership having its privileges, Readers Digest elected to replace him with one of their own: Graeme Keeping, the executive who had lobbied hardest to buy The Source in the first place. “Any publisher today, if he doesn’t get into electronic publishing,” Keeping was fond of saying, “is either going to be forced into it by economic circumstances or will have great difficulty staying in the paper-and-ink business.”

The Source’s Prime computer systems, a millstone around their neck for years (although the monkey does seem to be enjoying them).

The Source may have found a safe harbor with one of the moneyed giants of American media, but it would never regain its early mojo. Keeping proved to be less than the strategic mastermind he believed himself to be, with a habit of over-promising and under-delivering — and, worse, of making terrible choices based on his own overoptimistic projections. The worst example of the tendency came early in his tenure, in the spring of 1981, when he was promising the New York Times he would have 60,000 subscribers by 1982. Determined to make sure he had the computing capacity to meet the demand, he cancelled the contract to use GTE Telenet’s computing facilities, opening his own data center instead and filling it with his own machines. At a stroke, this destroyed a key part of the logistical economies which had done so much to spawn The Source (and, for that matter, CompuServe’s MicroNET) in the first place. The Source’s shiny new computers now sat idle during the day with no customers to service. Come 1982, The Source had only 20,000 subscribers, and all those expensive computers were barely ticking over even at peak usage. This move alone cost The Source millions. Meanwhile, the deal with Alcatel for custom-made terminals having fallen through during the chaos of Taub’s tenure, Keeping made a new one with Zenith to make “a semi-intelligent terminal with a hole in the back through which you can turn it into a computer.” That impractical flight of fancy also came to naught, but not before costing The Source more money. Such failures led to Keeping’s ouster in June of 1982, to be replaced by another anodyne chief from the Readers Digest executive pool named George Grune.

Soon after, Control Data Corporation, a maker of supercomputers, bought a 30 percent share of The Source for a reported $5 million. But even this latest injection of capital, technical expertise, and content — Control Data would eventually move much of their pioneering Plato educational network onto the service — changed little. The Source went through three more chief executives in the next two years. The user roll continued to grow, finally reaching 60,000 in September of 1984 — some two and a half years after Graeme Keeping's prediction, for those keeping score — but the company perpetually lost money; it was perpetually about to turn the corner into mainstream acceptance and profitability, yet never actually did. Thanks not least to Keeping's data-center boondoggle, the rate for non-prime usage had risen to $7.75 per hour by 1984, making this onetime pioneer, which now felt more and more like an also-ran, a hard sell in terms of dollars and cents as well. Neither the leading name in the online-services industry nor the one with the deepest pockets — there were limits to Readers Digest's largess — The Source struggled to attract third-party content. A disturbing number of those 60,000 subscribers rarely or never logged on, paying only the minimum monthly charge of $10. One analyst noted that well-heeled computer owners "apparently are willing to pay to have these electronic services available, even if they don't use them regularly. From a business point of view, that's a formula for survival, but not for success."

The Source was fated to remain a survivor but never a real success for the rest of its existence. Back in Columbus, however, CompuServe’s consumer offering was on a very different trajectory. Begun in such a low-key way that Jeff Wilkins had refused even to describe it as being in competition with The Source, CompuServe’s erstwhile MicroNET — now re-branded as simply CompuServe, full stop — was going places of which its rival could only dream. Indeed, one might say it was going to the very places of which Bill von Meister had been dreaming in 1979.

(Sources: the book On the Way to the Web: The Secret History of the Internet and its Founders by Michael A. Banks and Stealing Time: Steve Case, Jerry Levin, and the Collapse of AOL Time Warner by Alec Klein; Creative Computing of March 1980; InfoWorld of April 14 1980, May 26 1980, January 11 1982, May 24 1982, and November 5 1984; Wall Street Journal of November 6 1979; Online Today of July 1989; 80 Microcomputing of November 1980; The Intelligent Machines Journal of March 14 1979 and June 25 1979; Washington Post of May 11 1937, July 10 1978, February 10 1980, and November 4 1980; Alexander Trevor’s brief technical history of CompuServe, which was first posted to Usenet in 1988; interviews with Jeff Wilkins from the Internet History Podcast and Conquering Columbus.)

 
 


The Faery Tale Life of MicroIllusions

MicroIllusions

With the notable exception of Electronic Arts, the established American software industry was uncertain what to make of the Amiga in the wake of its initial release. Impressive as the machine was, it was also an expensive proposition from a parent company best known for much cheaper computers — and a parent company that was in a financial freefall to boot. Thus most publishers confined their support to inexpensive ports of existing titles that wouldn’t break the bank if, as so many expected, neither Commodore nor their Amiga were still around in a year or so.

Disappointing as this situation was to many early Amiga adopters, it spelled Opportunity for many an ambitious would-be Amiga entrepreneur. Just as early issues of Amazing Computing, the Amiga’s most respected technical magazine, carry with them some of the spirit of the original Byte magazine, of smart people joining together to figure out what this new thing is and what they can do with it, the early Amiga software scene represents the last great flowering of the Spirit of ’76 that had birthed the modern software industry. Like their peers of a decade before, the early Amiga developers were motivated more by love and passion than by money, and often operated more as a collective of friends and colleagues working toward a shared purpose than as competitors — i.e., as what Doug Carlston had once dubbed a “brotherhood” of software. Cinemaware became the breakout star of this group who committed themselves to the Amiga quickly and completely; that company was soon known to plenty of people who had never actually touched an Amiga for themselves. But there were plenty of others whose distinctly non-focus-group-tested names speak to their scruffy origins: companies like Aegis, Byte by Byte, and the one destined to be the great survivor of this pioneering era, NewTek. That NewTek would be just about the only one of these companies still in business in six or seven years does say something about the trajectory of the Amiga in North America, but perhaps says just as much about the nature of the companies themselves. Once again like their peers in the early 8-bit software industry, these early Amiga publishers carried along with their commitment to relentless innovation an often shocking ineptitude at executing fundamentals of running a business like writing marketing copy, keeping books, drawing up contracts, and paying taxes.

The story of our company of choice for today, MicroIllusions, is typical enough to almost stand in for that of the early Amiga software industry as a whole. At the same time, though, "typical" in this time of rampant innovation meant some extraordinarily original software. That's particularly true of the works of MicroIllusions's highest-profile programmer, David Joiner (or, as he was better known by his friends then and still today, "Talin," his online handle). Joiner, who describes his view of the universe as that of "some giant art project," has dedicated his life to being "compulsively creative," in both the digital and analog worlds. His vacuum-formed costumes, which transform him into alien space-bugs or knights in armor, have been the hit of many a science-fiction convention. He's also an enthusiastic painter — "for a long time I thought my career was going to be in art" — as well as a musician and composer.

Joiner was first exposed to computers during the four years he spent at the end of the 1970s in the Air Force, programming the big mainframes of the Strategic Air Command in Omaha. He spent the several years following his discharge kicking around the margins of the burgeoning PC industry, writing amateur and semi-professional games for the Radio Shack Color Computer among other models and working briefly for DataSoft before they went bankrupt in the industry’s great mid-decade shakeout. That left him in the state in which a Los Angeles-area computer-store owner named Jim Steinert first met him: 27 years old, sleeping on friends’ couches, picking up contract programming work when he could get it, and spending much of the rest of his time hanging out at Steinert’s store — KJ Computer, located in the suburb of Granada Hills — drooling over their new Amigas.

The seeds of MicroIllusions were planted during one day’s idle conversation when Steinert complained to Joiner that, while the Amiga supposedly had speech synthesis built into its operating system, he had never actually heard his machines talk; in the first releases of AmigaOS, the ability was hidden within the operating system’s libraries, accessible only to programmers who knew how to make the right system calls. Seeing an interesting challenge, not to mention a chance to get more time in front of one of Steinert’s precious Amigas, Joiner said that he could easily write a program to make the Amiga talk for anyone. He proved as good as his word within a few hours. Impressed, Steinert asked if he could sell the new program in his store for a straight 50/50 split. Given his circumstances, Joiner was hardly in a position to quibble. When the program sold well, Steinert decided to get into Amiga software development in earnest with the help of his wunderkind.

He leased an office for his new venture MicroIllusions a few blocks from his store, and also picked up the lease on a small house to form a little software-development commune consisting of Joiner and three of his friends, talented artists and/or programmers all. Joiner describes the first year or two he spent creating inside that little house as “probably the best time of my life. We really felt like we were building the future.”

In these heady days when the Amiga was fondly imagined by its zealots as likely to become the new face of mainstream family-friendly computing, Steinert pushed Joiner to make as his first project an edutainment title similar to a Commodore 64 hit called Cave of the Word Wizard, in which the player must explore a cave whilst answering occasional spelling questions to proceed. Joiner’s response was Discovery, which replaced the cave with a spaceship and allowed for the creation of many additional data disks covering subjects from math to history to science to simple trivia for adult players. Setting a pattern that would hold for the remainder of his time with MicroIllusions, Joiner brought all his creative skills to bear on the one-man project, drawing all of the art himself in Deluxe Paint and also writing a music soundtrack in addition to the game’s code — and all in just four months. Because the Amiga would never quite conquer North America as Steinert had anticipated, the market for the Discovery line would always be a limited one, but it would prove a consistent if modest seller for MicroIllusions for years to come.

Having proved himself with Discovery, Joiner was allowed to embark on his dream project: a hybrid game — part action, part adventure, part CRPG — that would harness the Amiga’s capabilities in the service of something different from anything that had come before. He wanted to incorporate two key ideas, one involving the game’s fiction, the other its presentation. In the case of the former, he wanted to push past the oh-so-earnest high-fantasy pastiches typical of CRPG fictions in favor of something more whimsical, more Brothers Grimm than J.R.R. Tolkien. This territory was hardly completely unexplored in gaming — Roberta Williams in particular had built a career around her love of fairy tales — but it was unusual to see in a game that owed as much to action games and CRPGs as it did to the adventure games for which she was known. Joiner’s other big idea, meanwhile, really was something entirely new under the sun: he wanted to create a world that the player would traverse not in discrete steps or even screens but as a single scrolling, contiguous landscape of open-ended, real-time possibility.
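To make that last idea a bit more concrete: in programming terms, what Joiner was proposing amounts to one enormous, uniform tile map viewed through a camera that tracks the player tile by tile, rather than a collection of self-contained screens that flip over as you cross their edges. The sketch below is purely my own illustration of that distinction; the map dimensions, view size, and function names are arbitrary stand-ins, not anything taken from The Faery Tale Adventure's actual code.

# A minimal illustration of a contiguous scrolling world versus the older
# screen-by-screen convention. All numbers here are arbitrary stand-ins;
# nothing is derived from The Faery Tale Adventure itself.

WORLD_WIDTH, WORLD_HEIGHT = 1024, 768   # size of the whole world, in tiles
VIEW_WIDTH, VIEW_HEIGHT = 20, 12        # tiles visible on screen at once

def camera_contiguous(player_x, player_y):
    """Center the view on the player, clamped to the world's edges.
    Each step the player takes shifts the view by one tile: smooth scrolling."""
    cam_x = min(max(player_x - VIEW_WIDTH // 2, 0), WORLD_WIDTH - VIEW_WIDTH)
    cam_y = min(max(player_y - VIEW_HEIGHT // 2, 0), WORLD_HEIGHT - VIEW_HEIGHT)
    return cam_x, cam_y

def camera_screen_flip(player_x, player_y):
    """The older convention: the world is carved into fixed screens, and the
    view jumps a whole screen at a time when the player crosses an edge."""
    cam_x = (player_x // VIEW_WIDTH) * VIEW_WIDTH
    cam_y = (player_y // VIEW_HEIGHT) * VIEW_HEIGHT
    return cam_x, cam_y

# One step east across what would have been a screen boundary:
print(camera_contiguous(19, 6), camera_contiguous(20, 6))    # (9, 0) (10, 0)
print(camera_screen_flip(19, 6), camera_screen_flip(20, 6))  # (0, 0) (20, 0)

The first camera slides one tile at a time and never changes scale between town and wilderness, which is exactly the always-scrolling feel described in the paragraph above; the second snaps a whole screen at once, the convention most of The Faery Tale Adventure's contemporaries still followed.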

The Faery Tale Adventure

The Faery Tale Adventure is the story of three brothers who set out to save their village of Tambry from an evil necromancer. Their quest will require one or more of them — if you get one of them killed, you automatically take the reins of another — to traverse the vast world of Holm from end to end, a process that by itself could take the player hours of real time if journeying entirely on foot. Whilst traveling, you must also fight monsters and assemble the clues necessary to complete your quest.

It’s difficult to convey using words just how lovely and lyrical your journeys around Holm can be. Even screenshots don’t do The Faery Tale Adventure justice; this is a game that really must be seen and heard in action. Here, then, is just a little taste, in which I take a magical ride on the back of a giant turtle to visit a sorceress in her crystalline lair.


If it’s difficult to fully describe the experience of playing The Faery Tale Adventure using words, it’s doubly difficult to explain just how stunning it was in its day. Note the depth-giving isometric perspective, still a rarity in games of the mid-1980s. Note the way that characters and things cast subtle shadows. And note how the entirety of the world is presented at the same scale. Gone is the wilderness/town dichotomy of the Ultima games, in which the latter blow up from tiny spots on the map to self-contained worlds of their own when you step into them. In The Faery Tale Adventure, if it’s small (or big) on the outside, it’s small (or big) on the inside. I’d also tell you to note the wonderful music, except that I’m quite sure you already have (assuming you have the sound turned on, of course). Seldom has music made a game what it is to quite the extent it does this one.

Leaving aside the goal of actually solving the game, which is kind of hopeless — we’ll get to that in a moment — The Faery Tale Adventure is all about the rhythm of wandering, following roadways and peeking into hidden corners as the music plays and day turns to night and back again. Like another early Amiga landmark, Defender of the Crown, and unlike far too many other games on this platform and others, it has a textured aesthetic all its own that’s much more memorable than the bloody action-movie pyrotechnics so typical of games then and now. Play it just a little, and you’ll never, ever forget it.

That every bit of this vast world and all that makes it up — code, art, and music alike — was created virtually unaided by David Joiner in about seven months never ceases to amaze me. This Leonardo had found his niche at last:

It’s ironic because when I was growing up I was never able to focus on one single creative outlet and ignore the others — and this was considered a disadvantage. People would tell me that I had to learn to focus on one thing. Otherwise I would never be successful, just be a dilettante. I struggled to find a profession which would use all of my skills, not just some of them.

Still, stunning technical and aesthetic achievement that it is, there's no denying that The Faery Tale Adventure is kind of a mess as a piece of game design. Its problems are all too typical of a game designed and implemented by a single idiosyncratic individual with, at best, limited external input. Some of the mechanical wonkiness I can live with. For instance, I'm not too bothered by the fact that, if you don't get killed in one of the first few extremely difficult fights, you level up within an hour or so to the point that fighting becomes little more than a trivial annoyance for the rest of the game. The broken character-building aspect is forgivable in light of the fact that character building doesn't feel like what the game really wants to be about anyway. (That said, CRPG addicts should certainly approach this one with caution.)

But we can't so easily wave aside the broken main spine of the game, the fact that it's all but insoluble on its own terms. The Faery Tale Adventure presents itself as a breadcrumb-following game like the Ultimas, but its breadcrumbs are so scattered at some stages, so literally nonexistent at others, that it's all but impossible to piece together where to go or what to do. After exploring Holm for a while, the charm of the music and the colorful graphics begins to fade and you begin to realize what a dismayingly empty place it really is. Almost every building is vacant, virtually every hotly anticipated new voyage of discovery proves ultimately underwhelming. Moments of wonder, like the first time you hitch a ride on that turtle you see above — or, even better, on a majestic swan — do crop up from time to time, but far too infrequently. The final impression is of nothing so much as a beautiful world running inside a marvelous engine that's now just waiting for a designer to come along and, you know, write an actual game for it all. You can see contemporaneous reviewers struggling with this impression whilst giving the game the benefit of every possible doubt; The Faery Tale Adventure is nothing if not a game that makes you want to love it. Computer Gaming World's Roy Wagner, for instance, felt compelled to attach to his very positive critical take on the game a rough walkthrough, just to make it actually playable. Later, when the game was ported to the Sega Genesis, Sega found its design so intractable that they demanded that a similar walkthrough be included in the manual itself. (One could wish that they had demanded that the design be properly fixed instead.) Joiner notes that his approach was to "start with a basic engine and then add detail like crazy," which does rather sound like code for "write a game engine and then try to shoehorn an actual semblance of game in there at the last minute, when you realize your deadline is looming."

David Joiner all dressed up in his armor for the Faery Tale Adventure package.

Yet the fact remains that technical game-changers and audiovisual charmers like The Faery Tale Adventure can usually get away with a multitude of design sins in the face of the gaming public's insatiable appetite for the new. Knowing he had a potential hit on his hands, Steinert determined to make a veritable Electronic Arts-style rock star out of its creator. Joiner strapped himself into his knightly armor for an inadvertently hilarious photo shoot; the end results look tacky and painfully nerdy in exactly the way that the game itself doesn't. But no matter: The Faery Tale Adventure became a hit on the Amiga after its release in early 1987 — just in time for a new influx of unabashed Amiga gamers in the form of new Amiga 500 owners — whereupon Steinert took advantage of his favored platform's halo effect by selling less compelling ports for the Commodore 64 and MS-DOS and, years later, that rather more impressive Sega Genesis translation. The game's success led to Activision signing MicroIllusions on as an "affiliated publisher," a real shot at the big time.

For better or for worse, though, Steinert just couldn’t bring himself to leave behind his scruffy roots in the Amiga hacking community. Games remained only one aspect of MicroIllusions, who also developed and published such only-Amiga-makes-it-possible software as Photon Paint, the first program to let one draw and edit pictures directly in the Amiga’s 4096-color HAM mode, and Cel Animator, a classical animation package crafted with the aid of Heidi Turnipseed, a veteran of Disney and Don Bluth Productions. The high-water point for MicroIllusions unsurprisingly corresponded with that of the Amiga itself in North America: 1988, when sales were trending upward and the big breakthrough seemed just around the corner. MicroIllusions was known during this period for their lavish trade-show displays — in truth, probably more lavish than they could realistically afford even then — that made them among the most prominent of the Amiga-centric software houses not named Cinemaware. That summer MicroIllusions products took up almost half of a Computer Chronicles television feature on the Amiga scene.

The MicroIllusions booth at the January 1988 AmiExpo show in Los Angeles, which filled half of one wall inside the Westin Bonaventure’s convention space.

David Joiner demonstrated his latest project-in-progress on-air on that program: Music-X, a MIDI music sequencer that he’d created largely out of concern that the hated Atari ST was getting ahead of the Amiga when it came to music software (never underestimate the motivation provided by good old platform jingoism). By the time that Music-X appeared at last to rave reviews at the tail end of 1989, MicroIllusions was already in dire straits, their phones perpetually coming on- and off-line and rumors swirling about their alleged demise. That situation would remain largely unchanged for another two desperate years. Their plight must to some extent be linked to that of the Amiga itself, which had failed to ever take off as Steinert had confidently expected when purchasing all that lavish trade-show floor space. It also didn’t help that, while they released a number of other modestly well-reviewed games, they never managed another transformative hit to come close to The Faery Tale Adventure. Thanks to that failure, Activision humiliatingly dropped them as an affiliated publisher barely a year after signing them up, citing the low sales of their latest games as just not making it worth anyone’s while anymore. Even an unexpected high-profile deal with Hanna-Barbera to produce games based on cartoon franchises like Scooby Doo, The Flintstones, and The Jetsons — truly a lifeline if ever there was one — collapsed amid allegations of breached contracts and botched schedules.

One suspects that the real cause behind these failures and so many others was a nemesis of MicroIllusions’s own making that also plagued many others in the Amiga’s home-grown software industry: a simple lack of business acumen, and with it an associated tendency to place dreams before ethics. Rather than belabor the point too much more personally, I’ll deliver David Joiner’s take on Jim Steinert’s idea of running a business:

My financial relationship with MicroIllusions was long and complicated. Jim wasn’t a good businessman. That was not unusual for the software industry at the time, but there was so much wide-open opportunity that any half-competent person could start a software business and be moderately successful.

Jim and I also differed in our approach to business ethics. He imagined himself to be a sharp dealer, and once boasted to me how he “saved money” in dealing with disk-duplication companies. You see, at the time there were companies which would do all the duplication work — that is, make copies of the floppy disks, print the packaging, and assemble the boxes. And many of these companies offered ninety-day terms — that is, you didn’t have to pay for ninety days, so you could use the money you made selling the product to pay back the duplicators. This made it possible to be an entrepreneur with very little startup capital, other than the sweat equity of writing software.

Well, Jim’s idea was that when the ninety days come up you simply refuse to pay — and then, eight months later when the duplicators eventually get around to suing you, you settle out of court for like one-third of the money. This same kind of playing fast and loose with the rules is what caused him to lose the Hasbro [sic. — I believe he means Hanna-Barbera] contract, which up to that point had been an incredibly valuable asset to the company.

Many years later, I went over all the royalty statements I had gotten from MicroIllusions, and discovered that there were lots of basic arithmetic errors in them — and not always in Jim’s favor.

The story of MicroIllusions is hardly unique among the companies we've encountered in this history, having much in common with that of the immediately preceding generation of software pioneers: companies like California Pacific, Muse, and Adventure International. Enthusiasm and programming talent can only make up for a lack of basic business acumen for so long. Despite it all, MicroIllusions somehow survived, at least nominally, through 1991, when their remaining assets, including The Faery Tale Adventure, were acquired by a new company called HollyWare, who used the contracts they had purchased to launch a fruitless $10 million lawsuit against a now sorely ailing Activision for allegedly mishandling that old distribution deal. As an Amazing Computing columnist wrote as the suit went into discovery, "The really interesting thing to discover is how MicroIllusions expects to get ten megabucks out of a company with a negative net worth." HollyWare, needless to say, didn't last very long.

By then Joiner had long since moved on to greener pastures in games and other forms of software development, although he would never again helm quite so impactful a project as The Faery Tale Adventure. The writing had been on the wall for software Leonardos even as he was creating his masterwork. Working with a new development team who called themselves The Dreamer’s Guild, he did belatedly create Halls of the Dead: The Faery Tale Adventure II in 1997. In the tradition of its predecessor, it looked and initially seemed to play great, but showed itself over time to be half-finished and well-nigh uncompleteable.

In the end, then, the business legacy of MicroIllusions is a bit of a tawdry one, one more example of a phenomenon that would always plague the Amiga: the platform seemed to attract idealists and shysters in equal numbers — and, somehow, often in the same individual. Yet it's because its story is both so groundbreaking and so typical that the company makes such a worthwhile case study for anyone wishing to understand the oft-dirty life and times of the Amiga in its heyday. During MicroIllusions's brief existence they produced some visionary software that, like so much else that came out of the Amiga scene, gave the world an imperfect glimpse of its multimedia future. That's as true of Photon Paint, the progenitor of photographic-quality visual editors like Photoshop, as it is of Music-X, a forerunner of easy-to-use music packages like GarageBand. And, most importantly for our purposes, it's true of The Faery Tale Adventure, a rough draft of what games might come to be in the future. It's a game that's perhaps best appreciated in the context of its time, as I'm so able to do thanks to all of the research — okay, playing of old games — I do for this blog. It stands out so dramatically from its contemporaries that it gave me a catch in my throat when I first saw it again that wasn't that different from the one I felt when I saw it for the first time back in 1987. That's a feeling that may be hard for you to entirely duplicate if you're not a really — I mean, really — dedicated reader who's playing all these games right along with me. But no matter. If you have an hour or two to kill, give it a download,[1] fire up an Amiga emulator, and just have a little wander through Holm. Never did a bad game feel — and sound — so good.

(Sources: Computer Gaming World of February 1988 and October 1991; Commodore Magazine of September 1989; Amazing Computing of August 1987, April 1988, June 1988, August 1989, October 1989, March 1990, April 1990, October 1991, December 1991, and April 1992; Info of April 1992. The home page of David Joiner (Talin) hasn’t been updated since 2000, but was nevertheless very useful. Still more useful was an interview with Joiner done by Amiga Lore.)

Footnotes
1 The music at the beginning of the game is a distorted mess in this version, the only otherwise working one I could find. This is down to one of the few differences between the Amiga 1000, for which the game was originally designed, and later Amiga models — a sound pointer doesn’t get automatically set in the latter. If you just give it a moment, the music will resolve from dissonance to consonance and will play as it should henceforward. I think it’s kind of a cool effect, actually — but then I occasionally blast Sonic Youth, much to my wife’s chagrin, so take that with a grain of salt.

Note that you will need to answer a few copy-protection questions at the beginning by using the map included in the zip.

For those of you who are hopeless completionists, I’ve also included with this zip the Computer Gaming World review that gives much valuable guidance on how to pursue your (otherwise almost certainly futile) quest.

By far the easiest way to get started in Amiga emulation and to play this game and the other Amiga games I’ll be featuring in this blog for quite some time to come is by purchasing Cloanto’s Amiga Forever package. It makes the whole process pretty painless.

 
 


This Tormented Business, Part 3

In the June 1985 issue of Compute! magazine, in an otherwise innocuous editorial about font sizes and page layouts and column lengths, Richard Mansfield casually dropped a bombshell: that the number of companies in the PC industry had shrunk by 80% over the past year. Now, the reality was not quite so apocalyptic as that number (not to mention lots of fevered pundits) would make it seem. Many of the people and companies included within it were doubtless dabblers, who saw a chance to jump on a hot new trend, then saw the money wasn’t going to come so easily after all and walked away again. But still… 80%. Let’s look at some more numbers to try to unpack what that figure means.

Home computer installed base, 1978-1982

The chart above shows the numbers of actively used computers in American homes between 1978 and 1982. The first big spike came in the latter year, when cheap machines like the Commodore VIC-20, the Texas Instruments 99/4A, and the Timex Sinclair came online in a big way just as the videogame console market began to go soft. Home computers, the pundits said, were the logical successors to that fad, and consumers seemed to agree by almost quintupling their numbers in the space of a single year.

Actual and projected installed base of home computers, 1982-1987

The chart above shows the actual and forecasted installed base of active home-computer users between 1982 and 1987. As you can see, things continued to go swimmingly through 1983 — the peak of the home-computer wars, Jack Tramiel and Commodore’s year of triumph. By year’s end, following the most spectacular Christmas of the 1980s for the home-computer industry, the number of computers in American homes was over 250% of what it had been at the beginning of the year. With millions upon millions of American homes still unconverted, everyone assumed that this was only the beginning of the beginning, that growth by leaps and bounds was inevitable until the end of the decade at least.

Things didn’t work out that way. Not only did 1984 fall short of projections by more than 50%, but sales to first-time buyers weren’t even sufficient to make up for those who got bored with their balky toys from the previous year or two and relegated them to closets, first step on their long, gradual journeys to the dumpster. (One research firm would later estimate that consumers threw out 1.5 million home computers in 1985 alone.) I’ve talked in earlier articles about the many perfectly good, sensible reasons that consumers grew so quickly disillusioned with their purchases, a list which includes a complete lack of killer apps — beyond games, that is — for the average household, the pain of actually using these primitive machines, and hidden costs in the form of all of the extra hardware and software needed to do much of anything with one of them. Home computers just didn’t live up to the hype; at least the old Atari VCS really was cheap and simple and fun, exactly as advertised. Most Americans found home computers to be none of these things. Their experience of 1982 and 1983 was bad enough to sour many of them on computers for a decade or more.

As bad as the chart above looks, it took a surprising amount of time for the industry to realize just how far off-track things had gone. 1984 was a paradoxical year of mixed messages in many respects, one that saw for instance the Apple II and Infocom both enjoy their biggest sales years ever. It wasn't until 1984 became 1985, and the industry counted its dollars and woke up to the realization that the Christmas just past had been a deeply disappointing one, that the full scale of the problem sank in and the dying-home-computer-industry became as big a media meme as the home-computer-as-social-revolution had been just a year or two before.

Still, some knew long before that something was very, very wrong. The bellwether of virtually any consumer-facing industry has always been — prior, at least, to the Internet age — its magazines. A healthy, growing industry means lots of readers buying at newsstands and signing up for subscriptions, as well as lots of vigorous new companies eager to advertise, to tell the public about all the new stuff they have to sell them. Conversely, when interest and sales begin to flag the newsstands start to reduce their magazine selection to make more room for other subjects, subscriptions are allowed to lapse, and advertising budgets are the quickest and least immediately painful things to cut. And as the first companies start to fold, those with whom they’ve signed advertising contracts tend to be about the last creditors to get paid. Woe betide the magazine that’s let itself go too far out on a limb — like, you know, one assuming it’s a part of an industry likely to grow almost exponentially for years to come — when that happens. The carnage in the magazines, those engines of excitement and advice and community, was appalling during the eighteen months between mid-1984 and the end of 1985.

The period was bookended by two particularly painful losses. Softalk, the de facto voice of the Apple II community, simply never appeared again after an apparently business-as-usual August 1984 issue. An even sadder loss was that of Creative Computing, which at least got to say goodbye in a last editorial (“Great While It Lasted”) in its last issue in December of 1985. The first newsstand magazine devoted to personal and, well, creative computing, it had been founded by David Ahl, a visionary if ever there was one, in October of 1974, months before the Altair. Ahl sold the magazine to the big conglomerate Ziff-Davis in 1982, but remained on as editor-in-chief right through to that last editorial. Throughout its run Creative Computing remained relentlessly idealistic about the potential for personal computing, always thinking about next year and of what the products they reviewed meant in the context of the ongoing PC revolution as a whole. Just to take one example: in response to the arrival of the first prototype laser-disc players in 1976, the magazine laid out a manifesto for what would come to be known as multimedia computing well over ten years later.

David H. Ahl

Creative Computing also published books, the best-selling, most important, and most beloved of which was titled simply BASIC Computer Games, a compendium of type-in listings featuring games that had been making the rounds of The People's Computer Company and the pages of Creative Computing itself for years. BASIC Computer Games sold a staggering one million copies in English and in translations to French and German, years before any pre-packaged computer game would come close to such a feat. Many a young hacker pecked out its listings and then started to experiment by changing a variable here or a statement there, learning in the process the wonderful quality that separates computers from game consoles and just about every other form of electronic entertainment: that you can use the same device you play games on to also make games, or just about anything else you want. Creative computing indeed. The voice of Ahl, every bit as much a pioneer as a Steve Wozniak or Steve Jobs, would be sorely missed in the years to come. Ironically, his magazine's end came just as machines like the Macintosh and Amiga were arriving to begin to bring to fruition some of his more expansive predictions of earlier years. (One of Creative Computing's last issues featured a gushing review of the Amiga which called it nothing less than "a new medium of expression.") It's a sign of the immense respect with which Ahl and his magazine were still viewed in the industry that several competing magazines took the time to remark on the loss of Creative Computing and offer a warm eulogy — an act of graciousness unusual indeed in the increasingly cutthroat world of computer publishing. As Info magazine noted, "There could be no better history of personal computing than a complete collection of Creatives."

In 1985 the pain spread in earnest to the software industry. Many pioneering companies, including some we've met in earlier articles on this blog, collapsed during the year. Any company that hadn't shed its old crufty hacker's skin and learned to start behaving like professionals was doomed, as were many who had listened a bit too much to the professionals and pundits and over-expanded and over-borrowed in the expectation of the perpetually-exploding industry that had been promised them. Also doomed was anyone whose creations just weren't good enough; those users who had chosen to stick with this computer thing were far savvier and more demanding than the neophytes of earlier years. Muse Software of Castle Wolfenstein fame was amongst the victims, as was our more recent acquaintance Synapse Software, who were shuttered by Brøderbund barely a year after the acquisition. The Carlstons may have been nice folks, but their company didn't survive by throwing good money after bad, and neither of Synapse's principal assets — their expertise with the fading Atari 8-bit line and their Electronic Novel line — was worth much of anything in the evolving industry order.

Indeed, adventure-game makers were if anything hit even harder than the rest of the industry. By mid-1985 it was becoming clear that bookware had been a blind alley; virtually nothing in the category, excepting only Infocom’s The Hitchhiker’s Guide to the Galaxy, did much of anything commercially. Companies like Spinnaker (owner of the Telarium and Windham Classics brands) and Brøderbund now began to divest themselves of their bookware assets almost as eagerly as they had acquired them. So much for the dream of a new interactive literature. Other expectations were also dramatically tempered. Trip Hawkins, for instance, was finally forced to give up on his dream of game designers as the rock stars of the 1980s and a shelf of games joining a shelf of records inside every hip living room. Electronic Arts now retrenched and refocused on becoming a big, highly respected fish in the relatively tiny pond of hardcore gaming (the only kind of gaming there was in this window between the Atari VCS’s collapse and the arrival of Nintendo). Yes, computer games were just computer games again.

Almost unremarked amidst all of the bankruptcies and retractions and cancellations was the collapse of Scott Adams’s Adventure International, one of the oldest of all the companies we’ve met on this site. Whether due to stubbornness or lack of funds or failure of vision or simple loyalty to what had brung’em, Adams had refused for years to upgrade his core technology, continuing to sell the same little 16 K, two-word-parser games he had started writing back in 1978. Their revamp into the SAGA line added crude graphics to the equation, but little else. Thus Infocom had long since stolen Adams’s crown as the king of adventure-gaming, not so much by besting him as by Adams not even trying to compete. Adams instead fed — and, for a time, quite well — on the ultra-low-end market, those machines like the Commodore VIC-20 and Texas Instruments 99/4A that weren’t a whole lot more capable than the original TRS-80 on which he’d first written Adventureland. These machines, unfortunately, were exactly the ones which found their way into closets and attics with the most frequency after the home-computer boom passed its heyday. Therein lay the root of AI’s troubles.

By 1984 much or most of Adventure International's revenue was coming from Britain, thus belatedly justifying the company's name, chosen in a fit of optimism when Adams and his wife were still making packaging out of baby-bottle liners and struggling to grasp the vagaries of wholesale pricing; expansion across an ocean must have seemed far-fetched indeed at that time. With their more modest cassette-based computers, their absolute mania for adventures, and their accompanying willingness to forgive faults and limitations that Americans no longer would, Britons offered Adams a more hospitable market all around. An independent quasi-subsidiary, Adventure International UK, offered not just the classic dozen original Scott Adams games and the OtherVentures titles, but also many more games written in Britain by British authors like the prolific Brian Howarth using Adams's engine. Adams himself was a celebrity amongst British adventurers. Everyone knew him for his crazy Afro that made him easy to spot across a crowded trade-show floor, and the magazines jostled for quotes and interviews and the fans for autographs whenever he made one of his occasional trips across the pond.

Scott Adams hams it up for the British press

In late 1983 or early 1984 a tremendous opportunity to improve Adventure International’s standing on both continents virtually fell into Adams’s lap. Joe Calamari, an executive vice president with Marvel Comics, called Adams out of the blue to propose that Marvel and AI collaborate on a line of games and accompanying comic books starring the Marvel superheroes. While it would take many years for Marvel to catch up to their perpetual arch-rivals DC Comics in bringing their brand to the masses via the multiplexes, Marvel at this time was making a modest but in its way innovative push into trans-media storytelling via deals like this one and the one they inked around the same time with TSR of Dungeons and Dragons fame to do a Marvel tabletop RPG. Their choice of Adventure International for the computer-game license could be read as surprising; AI was hardly at the cutting edge of the game industry, and given the huge demographic overlap between gamers and comics readers the Marvel license would certainly have been appealing to other, slicker publishers. Perhaps AI’s support for the cheap low-end machines, not to mention their games’ typical price of $12 or so as opposed to $30 or more, led Marvel to consider them a better fit for their generally younger readers. (As with science fiction, the golden age for superheroes is about twelve.) As possible evidence of exactly this thought process, consider that Commodore, who may have suggested AI to Marvel and apparently did play some sort of intermediary role in the negotiations, had been doing very well with cartridge versions of the first five Scott Adams adventures on the VIC-20 throughout the peak years of the home-computer boom.

It’s hard not to compare this early, crude experiment in trans-media storytelling with the Marvel of today, whose characters feature in cinematic extravaganzas costing hundreds of millions to produce. We’ve certainly come a long way. (Whether it’s a change for the better is of course in the eye of the beholder.) It’s also yet another sign of just how huge text adventures were for a few years there that Marvel chose this format for the games at all. The cerebral pleasures of text and puzzles hardly feel like an obvious fit for the “Wham! Bam! Pow!” action of a superhero comic — not that this marks the strangest mismatch between form and content of the bookware era.

Marvel's Hulk QuestProbe issue

Adams signed a deal with Marvel to make a dozen games, one that gave him a crazy amount of creative freedom. He gave the series its truly awful name, the uncomfortably medicinal-sounding QuestProbe. (It’s choices like this that distinguish companies like AI, who couldn’t afford PR firms and image advisers or just couldn’t be bothered, from companies like Infocom who could. As for Marvel, who knows what they were thinking…) He also enjoyed a privilege that would turn any superhero-loving kid — and more than a few superhero-loving adults — a Hulk-like green with envy: he outlined a story to accompany each game, then gave it to Marvel to be turned into a full-blown comic book, sold as part of a “Scott Adams/Marvel Comics Limited Series.”

The Human Torch and The Thing

Alas, the Marvel deal, AI’s last, best chance to live and possibly even prosper, turned into an opportunity squandered. The QuestProbe games are painfully, shamefully bad by just about every criterion. The graphics are crude and ugly, the prose strangled, the situations all but incomprehensible (especially if you aren’t lucky enough to have the accompanying comic to hand), and the puzzles a hopeless mix of the inane and the inscrutable. They are, in other words, pretty much like all the other Scott Adams games after the first half-dozen or so, and that just wasn’t good enough anymore, even for the patience of twelve-year-olds. After the first game, which featured the Hulk, was roundly panned even by the forgiving gaming press (the making of a game bad enough to achieve that was something of a feat in itself), Adams did begin to include some modest innovations: the next game, featuring Spider-Man, debuted at last a parser capable of understanding more than two words (not that it was otherwise up to much); and the third and as it turned out final game, featuring the Human Torch and the Thing, had you controlling both characters, able to switch between them at will — an interesting idea badly executed. By the time that third game trickled out in mid-1985, AI was already collapsing.

Adams today identifies the immediate cause of AI’s failure, no doubt accurately, as a rash of returned product from distributors who had over-ordered in anticipation of a big Christmas rush that never materialized. AI, which had never attracted the injections of venture capital and the accompanying professional financial oversight of fellow pioneers like Sierra, found themselves unable to pay back their distributors. With no one willing to extend them credit given conditions in the industry as a whole, there was no viable recourse but bankruptcy. Yet the deeper cause was Adams’s inability or unwillingness to change his games with the times. He’s stated many times in interviews that he virtually never looked at any of the games produced by his rivals; for instance, he never played an Infocom game after Zork. His logic was that he didn’t want to have his designs “polluted” by the ideas and puzzles of others. This is, at best, an odd stance to take; try to imagine a novelist who refuses to read books, or a musician who doesn’t listen to music. It perhaps does much to explain the time-warp quality of the QuestProbe games. It’s strange that the man who had the vision and the technical chops to get viable adventures working on 16 K microcomputers in the first place should prove so unable to further iterate on that first masterful leap, but there you have it. Adams went on with his professional life as a programmer outside of the games industry, and Adventure International passed quietly into history.

Nigel Bamford, Michael Woodroffe, and Patricia Woodroffe of Adventure International UK

One part of the brief-lived AI empire did survive. Mike Woodroffe, head of the still-viable Adventure International UK, disentangled that organization from its erstwhile namesake and renamed it Adventure Soft. The company would go on to a long if only sporadically active life as a developer of graphic adventures, whose best-known games would be the Simon the Sorcerer series, The Feeble Files, and two Elvira-themed pseudo-CRPGs. Adventure Soft continues as an at least nominally going concern today, although their website is little more than a storefront for sometimes decades-old titles.

All told, then, 1985 was a brutal year in American software and particularly games software, one that weeded out the weak sisters like Adventure International, Muse, Synapse, and countless others without remorse — not to mention the casualties in publishing and hardware and still other, ancillary areas. Old timers who had grown up as hackers with many of the year’s casualties can be forgiven for seeing it in terms as apocalyptic as did the more hyperbole-prone members of the media. David Ahl, from his final Creative Computing editorial:

The personal-computing industry is largely composed of adolescent companies and inexperienced managers being forced to grow up much too fast by market forces that they themselves created. The big guys are sailing in with battleships, and the friendly competition of a few years ago has become all-out war with no holds barred. The media smells blood and death, which makes for interesting reading (and sales). Their alarmist disaster stories have simply exacerbated the situation.

Still, if we’re seeking silver linings, they aren’t that hard to come by. Just to take the obvious: another look at the chart above will show that, if the home-computer user base wasn’t growing much, it also — that one brief blip in 1984 aside — wasn’t shrinking either. There was still a very viable, even vibrant market there. It was just a market that had reached an equilibrium far, far sooner than anyone had anticipated. The pain of 1985 was the pain of adjusting expectations to match that reality — the reality that the number of computers in homes wouldn’t increase in big jumps again until the arrival of the Internet and cheap multimedia PCs in the early 1990s gave everyone a good reason to own one. The generation of microcomputers sandwiched between those and the old 8-bits — the Apple Macintosh, the Atari ST, the Commodore Amiga, the Tandy 1000, and a rash of other ever cheaper and more capable MS-DOS-based machines — would seldom be sold to complete neophytes. Rather, they would go to people looking to upgrade their old Apple IIs, Commodore 64s, Atari 800s, and TRS-80s. A tempering of expectations, especially for hardware makers, would be necessary. Not everyone would upgrade, after all, meaning home-computer sales wouldn’t come close to their 1983 peak for many years to come. As David Thornburg noted in a perceptive article for Compute! magazine, computers were and would for years remain a hobby, not an everyday home appliance.

If you go to someone’s house and see a computer sitting in the den, I’ll bet you say, “Hey, I see you’re into computers. How about that!”

Have you ever gone into someone’s house and said, “Hey! I see you’re into refrigerators. Wow! Automatic ice-cube maker too! I was going to get one of those myself — thought I’d get a 16-cube model, but then I heard that the 32-cubers were going to come out soon.”

If the home computer was an appliance, we would talk about it like one.

David Ahl offered another comparison to explain why the home computer hadn’t yet achieved appliance status and wasn’t likely to for some time to come.

People who don’t have computers are looking for user friendliness of a sort that just isn’t available today. You can rent a car virtually anywhere in the world and in a minute or two be familiar enough with the vehicle and local traffic laws to drive off with a reasonable degree of confidence. When it is that easy to use a computer, then manufacturers can legitimately speak of user friendliness. We are a long way from that point today.

When a reeling software industry proved unable to fill the space allocated for it at the 1985 Summer Consumer Electronics Show, a big chunk was instead given over to pornographic videos, an industry that was thriving on the back of booming VCR sales in exactly the way the software industry wasn’t on lukewarm home-computer sales. Game consoles and home computers may come and go, but some interests are eternal.

If you were a committed gamer in for the long haul, however, the outcome of all this chaos was arguably at least as positive as it was negative. With computer owners an ever savvier and more experienced lot unwilling to suffer bad or even mediocre games anymore, with publishers all competing frantically for a big enough slice of a fixed pie to keep them alive, games in general just kept getting better at a prodigious rate. By 1986 developers would be taking the Commodore 64 in particular to places that would have been simply unimaginable when the machine debuted back in 1982. And as for the next-generation machines… well, even more splendid work was in the offing there. Everything was improving: not just graphics and sound but also the craft of design.

But before we can revel too much in the positives we have more pain to address. Next time we’ll look at Infocom’s disastrous 1985, the year that came within a whisker of cutting off the most beloved canon in interactive fiction at the halfway mark.

(My huge thanks to C. David Seuss, former CEO of Spinnaker Software, who answered my questions about this era, pointed me to a useful Harvard Business School Case Study, and provided the charts shown above and other documents. The usual thanks also to Jason Scott, whose interview with Scott Adams for Get Lamp was also invaluable. Useful magazine sources this time included: Compute! of March 1985, June 1985, and January 1986; Your Computer of November 1985; Creative Computing of December 1985; Computer Gaming World of January 1985; Computer and Video Games of May 1986; Info of December 1985/January 1986. Finally, if you don’t believe me that the QuestProbe games are really, really bad, feel free to download them in their Commodore 64 incarnations and see for yourself.)
