
A Working-Class Hero, Part 1: Proletariat, Prisoner, and Pilot

You may wonder what on earth the following is doing on “a history of computer entertainment.” If so, please trust that the method behind my madness will become clear in the course of the next few articles. In the meantime, I hope you’ll enjoy getting away, just for a little while, from computers and games and business machinations involving them to an earlier time that was even more fraught and no less fascinating.

Edward Mannock

Edward Mannock, Great Britain’s ace of aces of World War I and of all time, was also a man of a bewildering number of names. He wasn’t even born as Edward Mannock, but rather as Edward Corringham.

His father, another Edward Mannock, was the black sheep of an otherwise respectable middle-class London family. In 1878, in debt and likely in trouble with the law, he joined the army as an enlisted soldier, an occupation that was considered so ignoble at the time that his family begged him to assume an alias. Thus he became Edward Corringham, and it was as Corporal Edward Corringham that he married an Irish girl named Julia Sullivan in 1883 in Ballincollig, Ireland, the village at which his regiment was stationed. Four children, two boys and two girls, followed, the family moving all the while around Ireland and England behind Edward’s regiment. Young Edward, the third of the children, was born on May 21, 1888, in Brighton, England. Despite being born in England to an English father, the boy would always self-identify as at least as much Irish as English. That identity came complete with his mother’s brogue, an accent he may have actively cultivated as an act of defiance against his cruel drunkard of an English father.

In 1891, his father was discharged from the army, and the family reverted to his old surname of Mannock, with young Edward taking Corringham as his middle name for old times’ sake. Moving back to London in the hope of being accepted back into the respectable Mannock family’s good graces, they found nothing of the kind on offer. On the contrary, the other Mannocks were all too willing to visit the father’s sins upon his children, continuing to disown them all completely. Edward the elder had difficulty finding steady work, both due to the disrepute in which an ex-soldier was held and his own fondness for drink, and the £40 severance he had been awarded at the end of his service quickly evaporated. By the end of eighteen months, the family was living in horrid squalor and poverty, abused daily one and all by the man of the house, who took his frustration out on his wife and children with fists and kicks.

With no other prospects on offer, Edward the elder rejoined the army, enlisting with a regiment that was about to be shipped off to India. Once again, wife and children duly followed him to this latest posting. Life there was a little better; the family, not even considered on the level of the servant class back in England, could actually afford a servant of their own in India. With the economic stresses now eased, some of the physical abuse slackened, although it would never entirely go away.

It was in India, in these much improved if hardly luxurious conditions, that young Edward, now typically called “Eddie,” passed most of his childhood. In the beginning, he was a rather sickly boy, a result of the malnutrition, unsanitary conditions, and constant abuse that had marked his old life back in England. He contracted a serious amoebic infection in his eyes, which blinded him completely for as long as a fortnight.[1] But in time he grew into an active boy with a keen interest in sports of all types.

Eddie received a reasonably good primary-school education in India via a local Jesuit mission. In 1899, his father sailed with his regiment to South Africa to take part in the Boer Wars while his wife and children were left to await his return in India. He wound up spending three years in South Africa, at one point actually volunteering to join another regiment and remain there rather than return to his family — as sure an indication as any of just how estranged he really was from them. At last, in 1902 he was shipped back to Canterbury, England, without ever returning to India at all; his wife was left to book her own passage with her children to join him.

Eddie was 14 upon their return. With virtually no memory of England, he had to acclimate himself to life in Canterbury, which was not yet the tourist trap it is today, just a rather anonymous English market town surrounding a grand cathedral. Then, within a few months of the family’s arrival, his father walked out on them for the last time. Father and son would never see each other again.

Any thought of further schooling for Eddie must now be forgotten. With no other means of support, the entire family had to go to work. For years, Eddie worked menial jobs. His first was that of a grocer’s boy, schlepping crates full of food around the town. Later he worked as a barber’s boy, sweeping floors, washing hair, and mixing vats full of shaving soap. Both jobs offered only long hours of the most stultifying labor for the most meager of wages, but needs must.

But meanwhile his older brother Patrick had had a stroke of luck. The best educated of the family thanks to his having come to the good offices of the Jesuits in India at an older age than his little brother, he found employment as an accounting clerk at the National Telephone Company. When another opening came up some time later, he was able to secure it for the now 20-year-old Eddie. This really was an extraordinary stroke of luck by most people’s measure. A clerk’s job offered relatively reasonable working hours spent indoors in an office, sitting relatively comfortably behind a desk. At the end of 45 years or so, it even offered the relative security of a modest pension. For a young man of Eddie Mannock’s social standing, this was about the most he could reasonably hope for in life. No, the work itself wasn’t especially exciting, but what work open to him was? The vast majority of young men in his position would have accepted the job gratefully and remained there for the rest of their working life. Indeed, Patrick Mannock did precisely this. Eddie, however, accepted it only with some reluctance, and almost immediately started looking for a way out.

Eddie Mannock was developing into an ambitious man who refused to accept that this seeming best lot in life offered him by the Edwardian class system was the only one possible. Well over his childhood sickliness, he was now a strong young man who, while hardly well-educated, had a certain native mechanical intelligence that made him very good at solving practical problems of many descriptions. He played cricket at every opportunity, loved to fish, and loved to tinker with engines and electricity whenever given the chance. He joined the Territorial Force, the forerunner to the modern British Army Reserve, where he drilled regularly with his cavalry unit, becoming quite a good horseman.

From the beginning, the circumscribed indoor life of an accounting clerk rankled. Better, he thought, to be at the scene of the action, rigging cable out in the field as a linesman for the same National Telephone Company that now employed him as an office worker. At last, after some three years behind a desk, he asked for a transfer to the field, thus stating his willingness to forgo his cushy office job for a much more difficult and dangerous life of physical labor. Everyone thought he was crazy, but his request was finally granted.

To take up his new duties, Mannock was transferred to the village of Wellingborough in the East Midlands. His fellow workers there, taking note of his Irish brogue, rechristened him “Paddy.” Products of the working class just as he was, they accepted with equanimity his radical politics, which he didn’t hesitate to share with them. At some point during his years in Canterbury, you see, Edward Mannock had become a committed socialist.

It should be noted that most of the policies for which Mannock argued, so radical in the context of Edwardian England, have, thanks to the steady march of progress, long since come to be accepted as baseline standards by even many conservative Western politicians. He wanted universal suffrage, regardless of class, sex, race, income, or land ownership (or lack thereof). He wanted, if not necessarily to abolish the nobility and the monarchy, at least to strip them of political power. He wanted a livable minimum wage, a ceiling on the number of hours one could be expected to work per day and per week, and the abolition of child labor. He wanted a progressive tax system to redistribute the nation’s wealth more equally, but he was certainly no full-blown Marxist. His socialism didn’t even imply any particular discomfort with the notion of a British Empire, or the notion of taking up arms to defend it, as shown by his enthusiastic continuing participation in the Territorial Force. Likewise, he remained, while not overly religious by any means, a member of the Catholic Church. Even his views on the age-old question of Ireland, with its inflamed passions on both sides, sound oddly moderate today. Despite his proud Irish ancestry, he was in favor only of Home Rule — the creation of a separate Irish Parliament that would be able to adjudicate many questions of domestic politics for itself — rather than a fully independent Ireland.

The three years Mannock spent in Wellingborough were good ones, perhaps the best of his short life. The work was every bit as difficult and dangerous as had been advertised, but he found it suited his need for physical activity and his tinkerer’s instinct alike. Soon after his arrival, he met Jim Eyles, the manager of a small foundry in town which made pipes and gutters. Typically enough for Mannock the avid cricketer, they met at a cricket match. Eyles was playing, Mannock was only watching, but the former had a boil on his neck that was giving him all kinds of problems, and so the latter offered to bat for him. He was out for a duck, but the two struck up a conversation and, soon, a friendship which grew so close that Eyles asked Mannock if he’d like to move in with him and his wife and son. Mannock became known around the Eyles household, the first version of a comfortable family life he had ever known, by the slightly more dignified sobriquet of “Pat” rather than “Paddy.”

He regarded Eyles, who shared his political views, as a mentor and a father figure, a role the latter was more than happy to play. Eyles encouraged him to read the books found in the family library, which helped to give his socialism, previously a patchwork of good intentions and intuitive beliefs, the framework of a coherent political ideology. The two would sit up well into the night after the rest of the family had retired, discussing the ideas therein along with all the latest political news. And with Eyles’s encouragement Mannock’s socialism began to go beyond mere talk: he helped to found a branch of the socialist Independent Labour Party right there in Wellingborough. Passionate, idealistic, and articulate in his rough-hewn way, he might, Eyles began to think, have a real future in politics.

Nevertheless, a certain endemic restlessness that seemed always to exist at the root of Mannock’s character began in time to reassert itself. He sought adventure, wanted to make his way in the world outside of provincial England. He considered trying to become a diamond miner in South Africa or a plantation owner in the West Indies, but in the end he settled on the slightly more sober scheme of continuing his current trade in Turkey. The “sick man of Europe” though the Ottoman Empire may have been for decades if not centuries, its government was still doing its feeble best to modernize. Of late, these efforts had come to include the construction of a telephone network. It seemed the perfect opportunity for an ambitious man of Mannock’s talents. Thus one bleak winter day a melancholy Eyles family walked him to the local train station to begin the first stage of a long journey to the edge of the fabled Orient.

Mannock’s new life in Turkey could hardly have started out better. He showed up at the offices of the National Telephone Company in Constantinople, which was responsible for installing the new telephone network, showed his credentials, and was immediately offered a job as a rigging foreman. Placed in charge of others for the first time in his life, Mannock showed a knack for leadership, even though those he was leading were Turks with whom he could barely communicate thanks to the language barrier. He proved himself an eminently capable man on a project where capable men were sorely needed, and moved up quickly in his superiors’ estimation, being handed more and more responsibility. When not working, he lived, like virtually all of the British expatriates in Constantinople, in a small enclave with the air of a country club, where opportunities for swimming, rowing, riding, and playing tennis, croquet, and of course his beloved cricket were abundant. All told, it made for a fine life for the vigorous young man, who was now going by the nickname of “Murphy,” yet another name given him in tribute to his Irish heritage. The only problem was the date: the 25-year-old Edward “Eddie/Paddy/Pat/Murphy” Corringham Mannock had arrived in Constantinople in February of 1914. A war that absolutely no one saw coming was soon to engulf much of Europe and the world.

Why should anyone have been thinking about war as the lovely spring of 1914 turned into the most bucolic summer anyone could remember? There hadn’t been a truly major, extended war in Western or Central Europe since Napoleon’s final defeat back in 1815. The intervening century had been on the whole the most peaceful in recorded European history, marked only by a handful of brief conflicts that had ended with fairly minimal casualties, along with more extended but remote proxy wars like the Crimean War and the Boer Wars in which Mannock’s father had fought. Historians of later times would be able to identify all sorts of plausible reasons to call the Europe of 1914 a “powder keg”: an entangling system of ill-considered alliances all but guaranteed to turn local conflicts into continent-spanning ones; the rising tide of nationalism and its less pleasant little brother militarism; the decrepit dysfunction of the Ottoman Empire, the Austro-Hungarian Empire, and Czarist Russia; the destabilization that resulted from dynamic young Germany sitting right next to all these staid old men; a nostalgic glorification of the wars of old, and with it a feeling that, with Europe not having had a really good war in such a long while, perhaps now was a good time for one; a desire on the part of many nations to try out all the modern military hardware they’d been so busily accumulating. But none of these feel terribly satisfying as a real reason to wage war. The greatest tragedy of the First World War must be that it was waged for very little concrete reason at all.

The Turks had been drifting closer to Germany for years. Indeed, they had quite the crush on the younger power. They struggled with mixed results to remodel their ragtag military in the German Army’s hyper-disciplined image, and their commanders and statesmen affected the side whiskers favored by the German general staff. The prospect of using Germany to inflict revenge for past slights upon Greece and Russia, their two most longstanding ethnic enemies, held immense appeal. When the assassination of the Austrian Archduke Francis Ferdinand by a Serbian nationalist caused the dominoes to begin to fall in Europe that summer of 1914, Mannock wrote home that “things [are] very serious here. War in the air. Great anti-British feelings displayed by the people.” Still, Turkey remained at least ostensibly neutral through that fateful summer and for some time thereafter, leaving the little expatriated community in Constantinople in an uncomfortable limbo. Certainly for Mannock, who was moving up so quickly, forgoing his life in Turkey must have felt like a bitter pill to swallow. So, like many of his fellows, he cooled his heels. If it held true to recent form, this latest European war would be a quick, clean one, after which everyone could settle down again and concentrate on the jobs they’d come to Turkey to do. And if the government of Turkey did begin to make a clear public shift in the direction of Germany, there should still be time to get out before war was declared.

He was wrong on both counts. Referencing a list of recent grievances against Britain in particular, the Ottoman Empire unexpectedly declared war on Britain, France, and Russia on October 29. Mannock and his fellow Britons were immediately interned.

Actually, “interned” is far too soft a verb to use. While the women and children were treated comparatively gently and soon sent home, the men were thrown in prison, where they lived under the most appalling conditions; not for nothing has the term “Turkish prison” become such a cliché in reference to man’s inhumanity to man. Eyewitness accounts of Mannock during this period are somewhat sketchy, but they describe a man who repeatedly defied the guards, and in return was subjected to beatings, starvation, and solitary confinement even more extreme than that endured by the other prisoners. He came to be regarded as a leader and a role model by his comrades, many of whom were older and more frail than he. One of them later described him as “our philosopher, friend, and guide”: “He was always cheery and helpful, and kept the men ‘British’ all through.” He dug a hole under the fence surrounding the prison, which he used not to attempt to escape but to make nighttime journeys to a nearby black market, returning with food to supplement the prisoners’ meager daily ration of black bread and water. For the first time in his life, he showed himself capable of genuine heroism. In a different, perhaps better world, we might admire him more for what he did during these five desperate months than for what he would later accomplish in the cockpit of an airplane.

After much negotiation, largely undertaken by the American ambassador to Turkey on behalf of the British government, an exchange of British and Turkish nationals was finally worked out. On April 1, 1915, Mannock and his fellow prisoners were released — only to be packed on board a train for a long, circuitous journey across the Balkans that proved as physically trying as life in prison had been. Reaching Greece at last, they sailed for home. Upon Mannock’s return to Wellingborough, Jim Eyles was shocked at the appearance of his surrogate prodigal son. He looked, Eyles would later say, “an absolute mess,” still wracked by dysentery and malaria from his time in prison and quite possibly pneumonia from exposure to the elements on the long journey home.

Still a strong young man at bottom, Mannock would recover physically from the Turkish nightmare. But in another sense he was changed forever. He had become a very bitter man, his idealistic enthusiasm for the international socialist movement now exchanged for a darker, more violent worldview. He reserved his greatest hatred not for the Turks but for the Germans; manifesting the casual racism of the times, he regarded them as the Turks’ masters, and thus the true cause of all his recent suffering. What with his connections to international socialism, Mannock must have known at some intellectual level that Germany as a whole was no more of a political piece than was Britain, that there were plenty inside the country, even plenty fighting on the front lines, who lamented the reactionary militarism of the Kaiser and the war that was being fought in its name. But somehow none of that seemed to register anymore. Germans were a scourge, an affront to everything Mannock and his fellow socialists believed in. He wanted not just to make the world safe for socialism; he wanted to kill Germans.

This darkness in Mannock’s character, which would only grow more pronounced with time, is something his biographers and hagiographers alike have always struggled to come to terms with. Certainly it’s tempting to put on one’s psychologist’s hat, to speculate on whether in hating Germans so rabidly he was sublimating the frustration and rage of a lifetime over the course of which, no matter how capable he proved himself or how nobly he conducted himself, he must always be looked down upon by his alleged betters in the strict Edwardian class hierarchy, must always be condescended to and made subtly to understand that he could never truly be of the class of men who were so happy to make use of his talents. Then again, maybe a cigar really is just a cigar in this case. It certainly wasn’t unusual for even ardent socialists to support the British war effort wholeheartedly. For each of them who elected to endure ridicule by sitting out the war, sometimes in prison, as a conscientious objector, several more went to war willingly, convinced both that the war was worth fighting on its own merits to combat German militarism and that, having fought and won it for their alleged betters, the British working classes would have to be rewarded with new opportunity and equality. It would only be after the war was over, when new opportunity and equality most conspicuously failed to materialize, that the British left’s view of the war would harden into that of a colossal, pointless, criminal sham perpetrated by the ruling classes upon the working men who fought and died in it.

In this sense, then, Mannock was fairly unexceptional among those of his creed in eagerly rejoining the Territorial Force just as soon as his health would allow, well before mandatory conscription, the inevitable result of the casualties the volunteer army was sustaining, went into effect in 1916. He was assigned to an ambulance unit which drilled in England for months on end without ever getting sent to the front. This situation rankled Mannock deeply, as did the very notion of serving in a noncombatant role. In November of 1915, now fully recovered physically from his ordeal in Turkey, he applied for a transfer and an officer’s commission with the Royal Engineers; he thought, rightly, that they would be pleased to have a man of his practical experience with rigging and wiring. He told Eyles that his ultimate goal was to join the so-called “tunneling” forces. One of the dirtiest and deadliest jobs on the front, tunneling meant literally digging tunnels from friendly trenches under those of the enemy and planting explosives there. It seldom worked out all that well; in keeping with so many aspects of trench warfare, the whole enterprise tended to take on a farcical, blackly comic tone, with the tunnelers often winding up in the completely wrong place and blowing up nothing more than a few trees, or blowing themselves up with the touchy explosives of the era long before they made it to enemy lines. Nevertheless, the notion struck a chord with Mannock’s sheer bloody-mindedness. “Blow the bastards up!” he told Eyles. “The higher they go and the more pieces they come down [in], the better.”

Transferred to a newly established Royal Engineers unit in Bedfordshire in early 1916 and commissioned as an officer, Mannock, who for some reason was now going by the nickname of “Jerry,” found himself in a social environment very different from that he had known as a Territorial. Here, in this unit that had accepted him at all only due to his practical experience as a telephone rigger, he was surrounded by much younger men — Mannock was now pushing thirty — from the upper classes. They didn’t take any better to his gruff working-class manners than they did to his forthrightly radical politics; ditto his Irish brogue, especially when the Easter Rising began that April. They didn’t know whether he committed so many social faux pas because he didn’t know any better, because he didn’t care, or because he was actively tweaking them — but, increasingly, they began to suspect the last. And for good reason: Mannock judged most of his comrades to be the worst sort of drones of the English class system, shirkers who had used their family connections to win a posting to the Royal Engineers in the hope that the long period of training that must precede transfer to the front would let the war get finished before they had to take part in it. Having joined the unit for precisely the opposite reason, he was a man apart in the officers’ mess in this as in so many other ways.

When he learned upon completing several months of basic training that he might have to spend another full year training to become a tunneler — news that must have struck his comrades as wonderful — Mannock applied yet again for a transfer, this time to the air corps; absurdly counter-intuitive as it strikes us today, it actually took far less time to learn how to fly an airplane in those days than it did to learn how to dig tunnels and blow stuff up. His unit’s commander, perhaps weary of this capable but confrontational socialist firebrand who was constantly disrupting life in his officers’ mess, pushed this latest transfer request through. After passing the various medical and psychological exams, Mannock began aviator ground school in Reading in August of 1916.

Ground school filled about two months, during which Mannock and his fellow trainees were expected to learn the details of eight different types of aircraft engines. Other subjects included “General Flying,” “Aircraft Rigging,” “Theory of Flight,” “Bombs,” “Instruments,” “Morse Signalling,” and “Artillery Cooperation.” As the last two subjects attest, the real value of the airplanes of the time — the only roles in which they would make a really significant contribution during the war — was as the ultimate reconnaissance platform, taking photographs of the enemy lines and rear areas and guiding artillery barrages in real time. All of the exploits of the renowned “knights of the air,” the great fighter aces who were followed so eagerly by press and public, surrounded and perhaps too often obscured this real purpose. Aerial observation was therefore considered a skill that one and all needed to learn. Ground school included a rather fascinating early interactive simulator to help them do just that, as described by one Second Lieutenant Frederick Ortweiler:

In a large room, laid out on the floor was a model of the Ypres Salient and re-entrant by Messines, made exactly as it would be seen from an aeroplane 8000 to 10,000 feet up. With a squared map it was possible to pick out all the various roads, etc., and we were given practice in picking out points on the map. Then, by a system of little lights in the model, we were made to imagine that a battery was firing on a target and we were correcting. We would first be shown the target by having it lit up; then a flash would appear as the battery fired and another where the shot fell. Then we would have to send corrections over the buzzer till an “OK” was registered and the shoot finished.

Wars, if they go on long enough, produce a social leveling effect. The Royal Flying Corps, strictly the domain of “gentlemen” early on, was broadening by necessity to admit men like Mannock, who made up for in practical skills what they lacked in breeding. The class blending produced inevitable conflicts of the sort with which Mannock was all too familiar by now. A simpering gentleman named Dudley McKergow sniffed:

There are some perfectly appalling people here now. Their intonation is terrible and you can pick out hairdressers, Jews who would sell tobacco, the typical shop attendant, the comic-turn man at the very provincial show, and the greasy mechanic type. These are the class of fellows from cadet school — hardly one of them has any pretence of being a gentleman. There are still a very good crowd of observers and we keep to ourselves.

Mannock, of course, did nothing to make himself more palatable to the Dudley McKergows around him; he was as unashamed of his accent as he was of his political opinions. On the front, at least, the snobbery would fade somewhat in the face of the more elemental realities of life and death.

A Maurice Farman biplane

After ground school, it was off to Hendon Airfield in North London for flight school. It’s difficult to fully convey today how new and wondrous the very idea of powered flight was in those days, quite apart from any applicability as a tool of war. It had, after all, been little more than a decade since the Wright brothers first flew at Kitty Hawk, North Carolina. A body of best practices for the teaching of new pilots was still all but nonexistent. The standard training aircraft were Maurice Farman biplanes, ungainly contraptions dating from before the war that looked hardly more airworthy than the original Wright brothers glider. They flew about the same as they looked, slewing sluggishly through the sky. Still, they responded so slowly to the controls that it was difficult to get into irreversible trouble with them, and they proved surprisingly durable when the trainees bounced them down hard on runways or taxied them into trees. The Farmans were equipped with dual controls for the trainee and his instructor, but the engine sitting directly behind them both was so loud that spoken communication was impossible. If an instructor felt he absolutely had to say something to his pupil, his only way of doing so was to turn the engine off for a moment — a risky procedure, as an engine once switched off wasn’t always guaranteed to start back up again.

What with the short and sketchy ground-school curriculum that preceded each trainee’s first flight, the job of the instructors could sometimes be almost as perilous as that of the front-line pilots. Indeed, some took to calling their trainees “Huns” (the universal British appellation for the Germans) because they considered the scariest of them every bit as dangerous as any German ace. Many of the very worst of the trainees — the ones who most needed instruction — were pushed through having barely taken the controls at all, simply because the harried instructors lacked the time or patience to correct their failings. Plenty of these hapless fledglings wound up killing themselves in routine exercises before they ever made it to the front. Those who did survive that long could look forward to a life expectancy even shorter than that of the average greenhorn on the front.

What Mannock made of the whole confused process has not been recorded. However, given the yeoman work he would later do in systematizing the art of air combat into something approaching a science, he must have been fairly appalled at the chaos. Regardless, he was certainly among the better fledglings of his class, proving himself an able pilot if not quite a superb one, his native mechanical aptitude serving him well yet again. He was good enough that upon earning his “ticket” — his pilot’s license — and beginning to fly solo, he was earmarked for posting to a “scout,” or fighter, squadron rather than being relegated to a slower, heavier observation plane along with the less promising pilots. Problem was, actual scout aircraft were in short supply. He therefore spent some months after his training was supposedly finished being moved rather aimlessly around England, making occasional flights in obsolete De Havilland DH.2s. He seethed at each new domestic posting and dreamed of the day when he would finally get to go to the front. In the meantime this man of many names at last acquired the sobriquet by which he would go down in history: he became known as “Mick” or sometimes “Mickey” among his jittery fellow fledglings, yet one more tribute to his Irishness.

De Havilland DH.2

In March of 1917, at the end of this period of impatient waiting, the veteran ace James McCudden, who would come to be credited with 57 victories, second most of any British pilot of the war, spent some time giving Mannock and his fellow inexperienced pilots some final instruction before the long-awaited posting to the front came. This encounter between Britain’s eventual ace of aces and the runner-up led to one of the most famous stories about Mannock — famous despite or perhaps because of how atypical of him the incident is. One day, McCudden told his charges that it was impossible to pull an airplane out of a spin that began below 2000 feet in time to avoid hitting the ground. The next day, Mannock threw his DH.2 into a spin at just 1500 feet. If he had hoped to prove McCudden wrong, he didn’t quite succeed: he managed to regain just enough control to plunk his plane roughly down on its wheels directly in front of the Vickers Ammunition Factory, yards away from rows and rows of sheds stuffed full of high explosives. Mannock insisted to his livid superiors that the whole thing had been an accident, which may very well have been true; unlike so many of his fellow aces, he would never acquire a reputation for heedless daredevilry. McCudden, however, was convinced that the “impetuous young Irishman” — McCudden was about to turn 22, while Mannock was almost 30, making the description rather rich — had done it deliberately to show him up.

Although the two men would never fly into battle together, they would cross paths regularly over the fifteen months or so each had left to live. During this time they would build a friendship but also a marked rivalry, only to die within three weeks of one another. For now, though, for the older but much, much greener of the pair, training days were finally over. On April 1, 1917 — two years to the day after he had been released from a Turkish prison — Second Lieutenant Edward “Eddie/Paddy/Pat/Jerry/Murphy/Mickey/Mick” Corringham Mannock arrived at St. Omer, France, for final posting to an active-duty squadron on the front lines. At long last, he was going to war.

(Sources for this article and those that follow in this series: Mick Mannock, Fighter Pilot by Adrian Smith; The Personal Diary of ‘Mick’ Mannock, introduced and annotated by Frederick Oughton; The Knighted Skies by Edward Jablonski; The American Heritage History of World War I by S.L.A. Marshall; Aces Falling by Peter Hart; Bloody April by Peter Hart; King of Airfighters by Ira Jones; Mount of Aces by Paul R. Hare; Winged Victory by V.M. Yeates; The First World War by John Keegan; A Short History of World War I by James L. Stokesbury.)

Footnotes
1 Generations of hagiographers would later claim that the infection left Mannock’s vision out of his left eye permanently impaired if not destroyed entirely, thus giving rise to the legend of “the ace with one eye,” a story worthy of a Biggles novel. Manly lad that he was, the accounts claim, he never revealed his handicap to any but those who were closest to him out of a horror of being pitied. Instead he worked for hours on end to find ways to compensate when, for instance, playing cricket (a sport at which he was actually quite accomplished). Thus did the hagiographers neatly sidestep the fact that the vast majority of those who remembered Mannock remembered absolutely nothing of note about his vision, other than perhaps an unusually intense stare when he talked with them.

Similarly, the hagiographers claimed that he managed to pass at least three eye exams with ease prior to becoming a pilot by using a photographic memory that is in evidence nowhere else in his life’s story to memorize the optical chart. As it happens, one Lieutenant Gilbert Preston, who had lost his left eye as a ground soldier at Gallipoli before deciding to sign on with the Royal Flying Corps, tried to fool the doctor in exactly the same way Mannock is claimed to have done. It didn’t work out terribly well for him:

I thought that I had fooled the doctor, because after I had read the reading board with my right eye, he turned me to the window and said, “Tell me what you see out of the window.” I knew that I would have to come back to reading the eye chart, so I memorised all of the lines on the board. When I finished describing what I had seen out the window, he swung me around, and covered my right eye and said, “Will you continue reading the eye chart?” I knew what was coming, so I started to “read” the board. Suddenly he said, “You’re blind in that eye, aren’t you?” I said, “Oh no, not quite.” He told me, “Uncover your right eye and look again at the chart.” While I had been looking out the window, and unknown to me, he had turned the chart over and the only single letter on that chart was the letter “E.” I was heartsick as I thought my own chances were non-existent. He then announced, “Don’t take it too much to heart, because I have orders to send you to the Flying Corps – whether you can see or not!” To my disappointment he informed me that I could not qualify as a pilot and that I would go to France as an observer.

So, the stories of Edward Mannock as “the ace with one eye” are all, needless to say, almost certainly complete bunk. Nor are they necessary for casting him in the role of hero; his story is plenty heroic without them.

 

Turning on, Booting up, and Jacking into Neuromancer

When a novel becomes notably successful, Hollywood generally comes calling to secure the film rights. Many an author naïvely assumes that the acquisition of film rights means an actual film will get made, and in fairly short order at that. And thus is many an author sorely disappointed. Almost every popular novelist who’s been around for a while has stories to tell about Hollywood’s unique form of development purgatory. The sad fact is that the cost of acquiring the rights to even the biggest bestseller is a drop in the bucket in comparison to the cost of making a film out of them. Indeed, the cost is so trivial in terms of Hollywood budgets that many studios are willing to splash out for rights to books they never seriously envision doing anything productive with at all, simply to keep them out of the hands of rivals and protect their own properties in similar genres.

One could well imagine the much-discussed but never-made movie of William Gibson’s landmark cyberpunk novel Neuromancer falling into this standard pattern. Instead, though, its story is far, far more bizarre than the norm — and in its weird way far more entertaining.

Our story begins not with the power brokers of Hollywood, but rather with two young men at the very bottom of the Tinseltown social hierarchy. Ashley Tyler and Jeffrey Kinart were a pair of surfer dudes and cabana boys who worked the swimming pool of the exclusive Beverly Hills Hotel. Serving moguls and stars every day, they noticed that the things they observed their charges doing really didn’t seem all that difficult at all. With a little luck and a little drive, even a couple of service workers like them could probably become players. Despite having no money, no education in filmmaking, and no real inroads with the people who tipped them for delivering poolside drinks, they hatched a plan in early 1985 to make a sequel to their favorite film of all time, the previous year’s strange postmodern action comedy The Adventures of Buckaroo Banzai Across the 8th Dimension.

The idea was highly problematic, not only for all of the reasons I’ve just listed but also because Buckaroo Banzai, while regarded as something of a cult classic today, had been a notorious flop in its own day, recouping barely a third of its production budget — hardly, in other words, likely sequel fodder. Nevertheless, Tyler and Kinart were able to recruit Earl Mac Rauch, the creator of the Buckaroo Banzai character and writer of the film’s screenplay, to join their little company-in-name-only, which they appropriately titled Cabana Boy Productions. As they made the rounds of the studios, the all-too-plainly clueless Tyler and Kinart didn’t manage to drum up much interest for their Buckaroo Banzai sequel, but the Hollywood establishment found their delusions of grandeur and surfer-boy personalities so intriguing that there was reportedly some talk of signing them to a deal — not to make a Buckaroo Banzai movie, but as the fodder for a television comedy, a sort of Beverly Hillbillies for the 1980s.

After some months, the cabana boys finally recognized that Buckaroo Banzai had little chance of getting resurrected, and moved on to wanting to make a movie out of the hottest novel in science fiction: William Gibson’s Neuromancer. Rauch’s own career wasn’t exactly going gangbusters; in addition to Buckaroo Banzai, he also had on his résumé New York, New York, mob-movie maestro Martin Scorsese’s misbegotten attempt to make a classic Hollywood musical. Thus he agreed to stick with the pair, promising to write the screenplay if they could secure the rights to Neuromancer. In the meantime, they continued to schmooze the guests at the Beverly Hills Hotel, making their revised pitch to any of them who would listen. Against the odds, they stumbled upon one guest who took them very seriously indeed.

As was all too easy to tell from her rictus smile, Deborah Rosenberg was the wife of a plastic surgeon. Her husband, Victor Rosenberg, had been in private practice in New York City since 1970, serving the rich, the famous, and the would-be rich and famous. He also enjoyed a profitable sideline as a writer and commentator on his field for the supermarket tabloids, the glossy beauty magazines, and the bored-housewife talk-show circuit, where he was a regular on programs like Live with Regis and Kathie Lee, The Oprah Winfrey Show, and Donahue. When business took him and his wife to Beverly Hills in late 1985, Deborah was left to loiter by the pool while her husband attended a medical convention. It was there that she made the acquaintance of Tyler and Kinart.

Smelling money, the cabana boys talked up their plans to her with their usual gusto despite her having nothing to do with the film industry. Unaccountably, Deborah Rosenberg thought the idea of making Neuromancer with them a smashing one, and convinced her husband to put up seed capital for the endeavor. Ashley Tyler actually followed the Rosenbergs back to New York and moved into their mansion as a permanent house guest while he and Deborah continued to work on their plans. There would be much speculation around both Hollywood and New York in the months to come about exactly what sort of relationship Deborah and Ashley had, and whether her husband a) was aware of Deborah’s possible extramarital shenanigans and b) cared if he was.

While the irony of Gibson’s book full of cosmetic surgeries and body modifications of all descriptions being adapted by a plastic surgeon would have been particularly rich, Victor took little active role in the project, seeming to regard it (and possibly Ashley?) primarily as a way to keep his high-maintenance wife occupied. He did, however, help her to incorporate Cabana Boy Productions properly in January of 1986, and a few weeks later, having confirmed that Neuromancer rather surprisingly remained un-optioned, offered William Gibson $100,000 for all non-print-media rights to the novel. Gibson was almost as naïve as Deborah and her cabana boys; he had never earned more than the most menial of wages before finishing the science-fiction novel of the decade eighteen months earlier. He jumped at the offer with no further negotiation whatsoever, mumbling something about using the unexpected windfall to remodel his kitchen. The film rights to the hottest science-fiction novel in recent memory were now in the hands of two California surfer dudes and a plastic surgeon’s trophy wife. And then, just to make the situation that much more surreal, Timothy Leary showed up.

I should briefly introduce Leary for those of you who may not be that familiar with the psychologist whom President Nixon once called “the most dangerous man in America.” At the age of 42 in 1963, the heretofore respectable Leary was fired from his professorship at Harvard, allegedly for skipping lectures but really for administering psychedelic drugs to students without proper authorization. Ousted by the establishment, he joined the nascent counterculture as an elder statesman and cool hippie uncle. Whilst battling unsuccessfully to keep LSD and similar drugs legal — by 1968, they would be outlawed nationwide despite his best efforts — Leary traveled the country delivering “lectures” that came complete with a live backing band, light shows, and more pseudo-mystical mumbo jumbo than could be found anywhere this side of a Scientology convention. In his encounters with the straight mainstream press, he strained to be as outrageous and confrontational as possible. His favorite saying became one of the most enduring of the entire Age of Aquarius: “Turn on, tune in, drop out.” Persecuted relentlessly by the establishment as the Judas who had betrayed their trust, Leary was repeatedly arrested for drug possession. This, of course, only endeared him that much more to the counterculture, who regarded each successive bust as another instance of his personal martyrdom for their cause. The Moody Blues wrote an oh-so-sixties anthem about him called “Legend of a Mind” and made it the centerpiece of their 1968 album In Search of the Lost Chord; the Beatles song “Come Together” was begun as a campaign anthem for Leary’s farcical candidacy for governor of California.

In January of 1970, Leary, the last person in the world on whom any judge was inclined to be lenient, was sentenced to ten years imprisonment by the state of California for the possession of two marijuana cigarettes. With the aid of the terrorist group the Weather Underground, he escaped from prison that September and fled overseas, first to Algeria, then to Switzerland, where, now totally out of his depth in the criminal underworld, he wound up being kept under house arrest as a sort of prize pet by a high-living international arms dealer. When he was recaptured by Swiss authorities and extradited back to the United States in 1972, it thus came as something of a relief for him. He continued to write books in prison, but otherwise kept a lower profile as the last embers of the counterculture burned themselves out. His sentence was commuted by California Governor Jerry Brown in 1976, and he was released.

Free at last, he was slightly at loose ends, being widely regarded as a creaky anachronism of a decade that already felt very long ago and far away; in the age of disco, cocaine was the wonderdrug rather than LSD. But in 1983, when he played Infocom’s Suspended, he discovered a new passion that would come to dominate the last thirteen years of his life. He wrote to Mike Berlyn, the author of the game, to tell him that Suspended had “changed his life,” that he had been “completely overwhelmed by the way the characters split reality into six pieces.” He had, he said, “not thought much of computers before then,” but Suspended “had made computers a reality” for him. Later that year, he visited Infocom with an idea for, as one employee of the company remembers it, “a personality that would sit on top of the operating system, observe what you did, and modify what the computer would do and how it would present information based on your personal history, what you’d done on the computer.” If such an idea seems insanely ambitious in the context of early 1980s technology, it perhaps points to some of the issues that would tend to keep Leary, who wasn’t a programmer and had no real technical understanding of how computers worked, at the margins of the industry. His flamboyance and tendency to talk in superlatives made him an uneasy fit with the more low-key personality of Infocom. Another employee remembers Leary as being “too self-centered to make a good partner. He wanted his name and his ideas on something, but he didn’t want us to tell him how to do it.”

Mind Mirror

His overtures to Infocom having come to naught, Leary moved on, but he didn’t forget about computers. Far from it. As the waves of hype about home computers rolled across the nation, Leary saw in them much the same revolutionary potential he had once seen in peace, love, and LSD — and he also saw in them, one suspects, a new vehicle to bring himself, an inveterate lover of the spotlight, back to a certain cultural relevance. Computers, he declared, were better than drugs: “the language of computers [gives] me the metaphor I was searching for twenty years ago.” He helpfully provided the media with a new go-to slogan to apply to his latest ideas, albeit one that would never quite catch on like the earlier one had: “Turn on, boot up, jack in.” “Who controls the pictures on the screen controls the future,” he said, “and computers let people control their own screen.”

In that spirit, he formed a small software developer of his own, which he dubbed Futique. Futique’s one tangible product was Mind Mirror, published by Electronic Arts in 1986. It stands to this day as the single strangest piece of software Electronic Arts has ever released. Billed as “part tool, part game, and part philosopher on a disk,” Mind Mirror was mostly incomprehensible — a vastly less intuitive Alter Ego with all the campy fun of that game’s terrible writing and dubious psychological insights leached out in favor of charts, graphs, and rambling manifestos. Electronic Arts found that Leary’s cultural cachet with the average computer user wasn’t as great as they might have hoped; despite their plastering his name and picture all over the box, Mind Mirror resoundingly flopped.

It was in the midst of all this activity that Leary encountered William Gibson’s novel Neuromancer. Perhaps unsurprisingly given the oft-cited link between Gibson’s vision of an ecstatic virtual reality called the Matrix and his earlier drug experiences, Leary became an instant cyberpunk convert, embracing the new sub-genre with all of his characteristic enthusiasm. Gibson, he said, had written “the New Testament of the 21st century.” Having evidently decided that the surest route to profundity lay in placing the prefix “cyber-” in front of every possible word, he went on to describe Neuromancer as “an encyclopedic epic for the cyber-screen culture of the immediate future, and an inspiring cyber-theology for the Information Age.” He reached out to the man he had anointed as the cyber-prophet behind this new cyber-theology, sparking up an acquaintance if never quite a real friendship. It was probably through Gibson — the chain of events isn’t entirely clear — that Leary became acquainted with the management of Cabana Boy Productions and their plans for a Neuromancer film. He promptly jumped in with them.

Through happenstance and sheer determination, the cabana boys now had a real corporation with at least a modicum of real funding, the rights to a real bestselling novel, and a real professional screenwriter — and the real Timothy Leary, for whatever that was worth. They were almost starting to look like a credible operation — until, that is, they started to talk.

Cabana Boy’s attempts to sell their proposed $20 million film to Hollywood were, according to one journalist, “a comedy of errors and naïveté — but what they lack in experience they are making up for in showmanship.” Although they were still not taken all that seriously by anyone, their back story and their personalities were enough to secure brief write-ups in People and Us, and David Letterman, always on the lookout for endearing eccentrics to interview and/or make fun of on his late-night talk show, seriously considered having them on. “My bet,” concluded the journalist, “is that they’ll make a movie about Cabana Boy before Neuromancer ever gets off the ground.”

Around the middle of 1986, Cabana Boy made a sizzle reel to shop around the Hollywood studios. William Gibson, his agent, and his publicist with Berkley Books were even convinced to show up and offer a few pleasantries. Almost everyone comes across as hopelessly vacuous in this, the only actual film footage Cabana Boy would ever manage to produce.


Shortly after the sizzle reel was made, Earl Mac Rauch split when he was offered the chance to work on a biopic about comedian John Belushi. No problem, said Deborah Rosenberg and Ashley Tyler, we’ll just write the Neuromancer script ourselves — this despite neither of them having ever written anything before, much less the screenplay to a proverbial “major motion picture.” At about the same time, Jeffrey Kinart had a falling-out with his old poolside partner — his absence from the promo video may have been a sign of the troubles to come — and left as well. Tyler himself left at the end of 1987, marking the exit of the last actual cabana boy from Cabana Boy, even as Deborah Rosenberg remained no closer to signing the necessary contracts to make the film than she had been at the beginning of the endeavor. On the other hand, she had acquired two entertainment lawyers, a producer, a production designer, a bevy of “financial consultants,” offices in three cities for indeterminate purposes, and millions of dollars in debt. Still undaunted, on August 4, 1988, she registered her completed script, a document it would be fascinating but probably kind of horrifying to read, with the United States Copyright Office.

While all this was going on, Timothy Leary was obsessing over what may very well have been his real motivation for associating himself with Cabana Boy in the first place: turning Neuromancer into a computer game, or, as he preferred to call it, a “mind play” or “performance book.” Cabana Boy had, you’ll remember, picked up all electronic-media rights to the novel in addition to the film rights. Envisioning a Neuromancer game developed for the revolutionary new Commodore Amiga by his own company Futique, the fabulously well-connected Leary assembled a typically star-studded cast of characters to help him make it. It included David Byrne, lead singer of the rock band Talking Heads; Keith Haring, a trendy up-and-coming visual artist; Helmut Newton, a world-famous fashion photographer; Devo, the New Wave rock group; and none other than William Gibson’s personal literary hero William S. Burroughs to adapt the work to the computer.

This image created for Timothy Leary’s “mind play” of Neuromancer features the artist Keith Haring, who was to play the role of Case. Haring died of AIDS in 1990 at the age of just 31, but nevertheless left behind him a surprisingly rich legacy.

This image created for Timothy Leary’s “mind play” of Neuromancer features David Byrne of the band Talking Heads.

Leary sub-contracted the rights for a Neuromancer game from Cabana Boy, and was able to secure a tentative deal with Electronic Arts. But that fell through when Mind Mirror hit the market and bombed. Another tentative agreement, this time with Jim Levy’s artistically ambitious Activision, collapsed when the much more practical-minded Bruce Davis took over control of that publisher in January of 1987. Neuromancer was a property that should have had huge draw with the computer-game demographic, but everyone, it seemed, was more than a little leery of Leary and his avant-garde aspirations. For some time, the game project didn’t make much more headway than the movie.

Neuromancer the game was saved by a very unusual friendship. While Leary was still associated with Electronic Arts, an unnamed someone at the publisher had introduced him to the head of one of their best development studios, Brian Fargo of Interplay, saying that he thought the two of them “will get along well.” “Timothy and his wife Barbara came down to my office, and sure enough we all hit it off great,” remembers Fargo. “Tim was fascinated by technology; he thought about it and talked about it all the time. So I was his go-to guy for questions about it.”

Being friends with the erstwhile most dangerous man in America was quite an eye-opening experience for the clean-cut former track star. Leary relished his stardom, somewhat faded though its luster may have been by the 1980s, and gloried in the access it gave him to the trendy jet-setting elite. Fargo remembers that Leary “would take me to all the hottest clubs in L.A. I got to go to the Playboy Mansion when I was 24 years old; I met O.J. and Nicole Simpson at his house, and Devo, and David Byrne from Talking Heads. It was a good time.”

His deals with Electronic Arts and Activision having fallen through, it was only natural for Leary to turn at last to his friend Brian Fargo to get his Neuromancer game made. Accepting the project, hot property though Neuromancer was among science-fiction fans, wasn’t without risk for Fargo. Interplay was a commercially-focused developer whose reputation rested largely on their Bard’s Tale series of traditional dungeon-crawling CRPGs; “mind plays” hadn’t exactly been in their bailiwick. Nor did they have a great deal of financial breathing room for artistic experimentation. Interplay, despite the huge success of the first Bard’s Tale game in particular, remained a small, fragile company that could ill-afford an expensive flop. In fact, they were about to embark on a major transition that would only amplify these concerns. Fargo, convinced that the main reason his company wasn’t making more money from The Bard’s Tale and their other games was the lousy 15 percent royalty they were getting from Electronic Arts — a deal which the latter company flatly refused to renegotiate — was moving inexorably toward severing those ties and trying to go it alone as a publisher as well as a developer. Doing so would mean giving up the possibility of making more Bard’s Tale games; that trademark would remain with Electronic Arts. Without that crutch to lean on, an independent Interplay would need to make all-new hits right out of the gate. And, judging from the performance of Mind Mirror, a Timothy Leary mind play didn’t seem all that likely to become one.

Fargo must therefore have breathed a sigh of relief when Leary, perhaps growing tired of this project he’d been flogging for quite some time, perhaps made more willing to trust Fargo’s instincts by the fact that he considered him a friend, said he would be happy to step back into a mere “consulting” role. He did, however, arrange for William Gibson to join Fargo at his house one day to throw out ideas. Gibson was amiable enough, but ultimately just not all that interested, as he tacitly admitted: “I was offered a lot more opportunity for input than I felt capable of acting on. One thing that quickly became apparent to me was that I hadn’t the foggiest notion of the way an interactive computer game had to be constructed, the various levels of architecture involved. It was fascinating, but I felt I’d best keep my nose out of it and let talented professionals go about the actual business of making the game.” So, Fargo and his team, which would come to include programmer Troy A. Miles, artist Charles H.H. Weidman III, and writers and designers Bruce Balfour and Mike Stackpole, were left alone to make their game. While none of them was a William Gibson, much less a William S. Burroughs, they did have a much better idea of what made for a fun, commercially viable computer game than did anyone on the dream team Leary had assembled.

Three fifths of the team that wound up making Interplay’s Neuromancer: Troy Miles, Charles H.H. Weidman III, and Bruce Balfour.

One member of Leary’s old team did agree to stay with the project. Brian Fargo:

My phone rang one night at close to one o’clock in the morning. It was Timothy, and he was all excited that he had gotten Devo to do the soundtrack. I said, “That’s great.” But however I said it, he didn’t think I sounded enthused enough, so he started yelling at me that he had worked so hard on this, and he should get more excitement out of me. Of course, I literally had just woken up.

So, next time I saw him, I said, “Tim, you can’t do that. It’s not fair. You can’t wake me up out of a dead sleep and tell me I’m not excited enough.” He said, “Brian, this is why we’re friends. I really appreciate the fact that you can tell me that. And you’re right.”

But in the end, Devo didn’t provide a full soundtrack, only a chiptune version of “Some Things Never Change,” a track taken from their latest album Total Devo, which plays over Neuromancer‘s splash screen.

The opening of the game. Case, now recast as a hapless loser, not much better than a space janitor, wakes up face-down in a plate of “synth-spaghetti.”

As an adaptation of the novel, Neuromancer the game can only be considered a dismal failure. Like that of the book, the game’s story begins in a sprawling Japanese metropolis of the future called Chiba City, stars a down-on-his-luck console cowboy named Case, and comes to revolve around a rogue artificial intelligence named Neuromancer. Otherwise, though, the plot of the game has very little resemblance to that of the novel. Considered in any other light than the commercial, the license is completely pointless; this could easily have been a generic cyberpunk adventure.

The game’s tone departs, if anything, even further from its source material than its plot does. Out of a sense of obligation, it occasionally shoehorns in a few lines of Gibson’s prose, but, rather than even trying to capture the noirish moodiness of the novel, the game aims for considerably lower-hanging fruit. In what was becoming a sort of default setting for adventure-game protagonists by the late 1980s, Case is now a semi-incompetent loser whom the game can feel free to make fun of, inhabiting a science-fiction-comedy universe which has much more to do with Douglas Adams — or, to move the fruit just that much lower, Planetfall or Space Quest — than William Gibson. This approach showed up so often in adventure games for very practical reasons: it removed from the designers most of the burden of trying to craft really coherent, believable narratives out of the very limited suite of puzzle and gameplay mechanics at their disposal. Being able to play everything for laughs just made design so much easier. Cop-out though it kind of was, it must be admitted that some of the most beloved classics of the adventure-game genre use exactly this approach. Still, it does have the effect of making Neuromancer the game read almost like a satire of Neuromancer the novel, which can hardly be ideal, at least from the standpoint of the licenser.

And yet, when divorced from its source material and considered strictly as a computer game, Neuromancer succeeds rather brilliantly. It plays on three levels, only the first of which is open to you in the beginning. Those earliest stages confine you to “meat space,” where you walk around, talk with other characters, and solve simple puzzles. Once you find a way to get your console back from the man to whom you pawned it, you’ll be able to enter the second level. Essentially a simulation of the online bulletin-board scene of the game’s own time, it has you logging onto various “databases,” where you can download new programs to run on your console, piece together clues and passwords, read forums and email, and hack banks and other entities. Only around the midway point of the game will you reach the Matrix proper, a true virtual-reality environment. Here you’ll have to engage in graphical combat with ever more potent forms of ICE (“Intrusion Countermeasures Electronics”) to penetrate ever more important databases.

Particularly at this stage, the game has a strong CRPG component; not only do you need to earn money to buy ever better consoles, software, and “skill chips” that conveniently slot right into Case’s brain, but as Case fights ICE on the Matrix his core skills improve with experience. It’s a heady brew, wonderfully varied and entertaining. Despite the limitations of the Commodore 64, the platform on which it made its debut, Neuromancer is one of the most content-rich games of its era, with none of the endless random combats and assorted busywork that stretch the contemporaneous CRPGs of Interplay and others to such interminable lengths. Neuromancer ends just about when you feel it ought to end, having provided the addictive rush of building up a character from a weakling to a powerhouse without ever having bored you in the process.

Reading messages from the Scene… err, from Neuromancer‘s version of the hacker underground.

One of the more eyebrow-raising aspects of Neuromancer is the obvious influence that the real underground world of the Scene had on it. The lingo, the attitudes… all of it is drawn from pirate BBS culture, circa 1988. Ironically, the game evokes the spirit of the Scene far better than it does anything from Gibson’s novel, serving in this respect as a time capsule par excellence. At least some people at Interplay, it seems, were far more familiar with that illegal world than any upstanding citizen ought to have been. Neuromancer is merely one more chapter in the long shared history of legitimate software developers and pirates, who were always more interconnected and even mutually dependent than the strident rhetoric of the Software Publishers Association might lead one to suspect. Richard Garriott’s Akalabeth was first discovered by his eventual publisher California Pacific via a pirated version someone brought into the office; Sid Meier ran one of the most prolific piracy rings in Baltimore before he became one of the most famous game designers in history… the anecdotes are endless. Just to blur the lines that much more, soon after Neuromancer some cracking groups would begin to go legitimate, becoming game makers in their own right.

Like other Interplay games from this period, Neuromancer is also notable for how far it’s willing to push the barriers of acceptability in what was still the games industry’s equivalent of pre-Hays Code Hollywood. There’s an online sex board you can visit, a happy-ending massage parlor, a whore wandering the streets. Still, and for all that it’s not exactly a comedic revelation, I find that the writing in Neuromancer makes it a more likable game than, say, Wasteland, with its somewhat juvenile transgression for transgression’s sake. Neuromancer walks right up to that line on one or two occasions, but never quite crosses it in this critic’s opinion.

Of course, it’s not without some niggles. The interface, especially in the meat-space portions, is a little clunky; it looks like a typical point-and-click adventure game, but its control scheme is less intuitive than it appears, which can lead to some cognitive dissonance when you first start to play. But that sorts itself out once you get into the swing of things. Neuromancer is by far my favorite Interplay game of the 1980s, boldly original but also thoroughly playable — and, it should be noted, rigorously fair. Take careful notes and do your due diligence, and you can feel confident of being able to solve this one.

About to do battle with an artificial intelligence, the most fearsome of the foes you’ll encounter in the Matrix.

Neuromancer was released on the Commodore 64 and the Apple II in late 1988 as one of Interplay’s first two self-published games. The other, fortunately for Interplay but perhaps unfortunately for Neuromancer‘s commercial prospects, was an Amiga game called Battle Chess. Far less conceptually ambitious than Neuromancer, Battle Chess was an everyday chess engine, no better or worse than dozens of other ones that could be found in the public domain, onto which Interplay had grafted “4 MB of animation” and “400 K of digitized sound” (yes, those figures were considered very impressive at the time). When you moved a piece on the board, you got to watch it walk over to its new position, possibly killing other pieces in the process. And that was it, the entire gimmick. But, in those days when games were so frequently purchased as showpieces for one’s graphics and sound hardware, it was more than enough. Battle Chess became just the major hit Interplay needed to establish themselves as a publisher, but in the process it sucked all of Neuromancer‘s oxygen right out of the room. Despite the strength of the license, the latter game went comparatively neglected by Interplay, still a very small company with very limited resources, in the rush to capitalize on the Battle Chess sensation. Neuromancer was ported to MS-DOS and the Apple IIGS in 1989 and to the Amiga in 1990 — in my opinion this last is the definitive version — but was never a big promotional priority and never sold in more than middling numbers. Early talk of a sequel, to have been based on William Gibson’s second novel Count Zero, remained only that. Neuromancer is all but forgotten today, one of the lost gems of its era.

I always make it a special point to highlight games I consider to be genuine classics, the ones that still hold up very well today, and that goes double if they aren’t generally well-remembered. Neuromancer fits into both categories. So, please, feel free to download the Amiga version from right here, pick up an Amiga emulator if you don’t have one already, and have at it. This one really is worth it, folks.

I’ll of course have much more to say about the newly self-sufficient Interplay in future articles. But as for the other players in today’s little drama:

Timothy Leary remained committed to using computers to “express the panoramas of your own brain” right up until he died in 1996, although without ever managing to bring any of his various projects, which increasingly hewed to Matrix-like three-dimensional virtual realities drawn from William Gibson, into anything more than the most experimental of forms.

William Gibson himself… well, I covered him in my last article, didn’t I?

Deborah Rosenberg soldiered on for quite some time alone with the cabana-boy-less Cabana Boy; per contractual stipulation, the Neuromancer game box said that it was “soon to be a major motion picture from Cabana Boy Productions.” And, indeed, she at last managed to sign an actual contract with Tri-Star Pictures on June 2, 1989, to further develop her screenplay, at which point Tri-Star would, “at its discretion,” “produce the movie.” But apparently Tri-Star took discretion to be the better part of valor in the end; nothing else was ever heard of the deal. Cabana Boy was officially dissolved on March 24, 1993. There followed years of litigation between the Rosenbergs and the Internal Revenue Service; it seems the former had illegally deducted all of the money they’d poured into the venture from their tax returns. (It’s largely thanks to the paper trail left behind by the tax-court case, which wasn’t finally settled until 2000, that we know as much about the details of Cabana Boy as we do.) Deborah Rosenberg has presumably gone back to being simply the wife of a plastic surgeon to the stars, whatever that entails, her producing and screenwriting aspirations nipped in the bud and tucked back away wherever it was they came from.

Earl Mac Rauch wrote the screenplay for Wired, the biopic about John Belushi, only to see it greeted with jeers and walk-outs at the 1989 Cannes Film Festival. It went on to become a critical and financial disaster. Having collected three strikes in the form of New York, New York, Buckaroo Banzai, and now Wired, Rauch was out. He vanished into obscurity, although I understand he has resurfaced in recent years to write some Buckaroo Banzai graphic novels.

And as for our two cabana boys, Ashley Tyler and Jeffrey Kinart… who knows? Perhaps they’re patrolling some pool somewhere to this day, regaling the guests with glories that were or glories that may, with the right financial contribution, yet be.

(Sources: Computer Gaming World of September 1988; The Games Machine of October 1988; Aboriginal Science Fiction of October 1986; AmigaWorld of May 1988; Compute! of October 1991; The One of February 1989; Starlog of July 1984; Spin of April 1987. Online sources include the sordid details of the Cabana Boy tax case from the United States Tax Court archive, and Alison Rhonemus’s blog post on some of the contents of Timothy Leary’s papers, which are now held at the New York Public Library. I also made use of the Get Lamp interview archives which Jason Scott so kindly shared with me. Finally, my huge thanks to Brian Fargo for taking time from his busy schedule to discuss his memories of Interplay’s early days with me.)

 

The Prophet of Cyberspace

William Gibson

William Gibson was born on March 17, 1948, on the coast of South Carolina. An only child, he was just six years old when his father, a middle manager for a construction company, choked on his food and died while away on one of his many business trips. Mother and son moved back to the former’s childhood home, a small town in Virginia.

Life there was trying for the young boy. His mother, whom he describes today as “chronically anxious and depressive,” never quite seemed to get over the death of her husband, and never quite knew how to relate to her son. Gibson grew up “introverted” and “hyper-bookish,” “the original can’t-hit-the-baseball kid,” feeling perpetually isolated from the world around him. He found refuge, like so many similar personalities, in the shinier, simpler worlds of science fiction. He dreamed of growing up to inhabit those worlds full-time by becoming a science-fiction writer in his own right.

At age 15, desperate for a new start, Gibson convinced his mother to ship him off to a private school for boys in Arizona. It was by his account as bizarre a place as any of the environments that would later show up in his fiction.

It was like a dumping ground for chronically damaged adolescent boys. There were just some weird stories there, from all over the country. They ranged from a 17-year-old, I think from Louisiana, who was like a total alcoholic, man, a terminal, end-of-the-stage guy who weighed about 300 pounds and could drink two quarts of vodka straight up and pretend he hadn’t drunk any to this incredibly great-looking, I mean, beautiful kid from San Francisco, who was crazy because from age 10 his parents had sent him to plastic surgeons because they didn’t like the way he looked.

Still, the clean desert air and the forced socialization of life at the school seemed to do him good. He began to come out of his shell. Meanwhile the 1960s were starting to roll, and young William, again like so many of his peers, replaced science fiction with Beatles, Beats, and, most of all, William S. Burroughs, the writer who remains his personal literary hero to this day.

William Gibson on the road, 1967

As his senior year at the boys’ school was just beginning, Gibson’s mother died as abruptly as had his father. Left all alone in the world, he went a little crazy. He was implicated in a drug ring at his school — he still insists today that he was innocent — and kicked out just weeks away from graduation. With no one left to go home to, he hit the road like Burroughs and his other Beat heroes, hoping to discover enlightenment through hedonism; when required like all 18-year-olds to register for the draft, he listed as his primary ambition in life the sampling of every drug ever invented. He apparently made a pretty good stab at realizing that ambition, whilst tramping around North America and, a little later, Europe for years on end, working odd jobs in communes and head shops and taking each day as it came. By necessity, he learned the unwritten rules and hierarchies of power that govern life on the street, a hard-won wisdom that would later set him apart as a writer.

In 1972, he wound up married to a girl he’d met on his travels and living in Vancouver, British Columbia, where he still makes his home to this day. As determined as ever to avoid a conventional workaday life, he realized that, thanks to Canada’s generous student-aid program, he could actually earn more money by attending university than he could working some menial job. He therefore enrolled at the University of British Columbia as an English major. Much to his own surprise, the classes he took there and the people he met in them reawakened his childhood love of science fiction and the written word in general, and with them his desire to write. Gibson’s first short story was published in 1977 in a short-lived, obscure little journal occupying some uncertain ground between fanzine and professional magazine; he earned all of $27 from the venture. Juvenilia though it may be, “Fragments of a Hologram Rose,” a moody, plot-less bit of atmospherics about a jilted lover of the near future who relies on virtual-reality “ASP cassettes” to sleep, already bears his unique stylistic stamp. But after writing it he published nothing else for a long while, occupying himself instead with raising his first child and living the life of a househusband while his wife, now a teacher with a Master’s Degree in linguistics, supported the family. It seemed a writer needed to know so much, and he hardly knew where to start learning it all.

It was punk rock and its child post-punk that finally got him going in earnest. Bands like Wire and Joy Division, who proved you didn’t need to know how to play like Emerson, Lake, and Palmer to make daring, inspiring music, convinced him to apply the same lesson to his writing — to just get on with it. When he did, things happened with stunning quickness. His second story, a delightful romp called “The Gernsback Continuum,” was purchased by Terry Carr, a legendary science-fiction editor and taste-maker, for the 1981 edition of his long-running Universe series of paperback short-story anthologies. With that feather in his cap, Gibson began regularly selling stories to Omni, one of the most respected of the contemporary science-fiction magazines. The first story of his that Omni published, “Johnny Mnemonic,” became the manifesto of a whole new science-fiction sub-genre that had Gibson as its leading light. The small network of writers, critics, and fellow travelers sometimes called themselves “The Movement,” sometimes “The Mirrorshades Group.” But in the end, the world would come to know them as the cyberpunks.

If forced to name one thing that made cyberpunk different from what had come before, I wouldn’t point to any of the exotic computer technology or the murky noirish aesthetics. I’d rather point to eight words found in Gibson’s 1982 story “Burning Chrome”: “the street finds its own uses for things.” Those words signaled a shift away from past science fiction’s antiseptic idealized futures toward more organic futures extrapolated from the dirty chaos of the contemporary street. William Gibson, a man who out of necessity had learned to read the street, was the ideal writer to become the movement’s standard bearer. While traditional science-fiction writers were interested in technology for its own sake, Gibson was interested in the effect of technology on people and societies.

Cyberpunk, this first science fiction of the street, was responding to a fundamental shift in the focus of technological development in the real world. The cutting-edge technology of previous decades had been deployed as large-scale, outwardly focused projects, often funded with public money: projects like the Hoover Dam, the Manhattan Project, and that ultimate expression of macro-technology the Apollo moon landing. Even our computers were things filling entire floors, to be programmed and maintained by a small army of lab-coated drones. Golden-age science fiction was right on-board with this emphasis on ever greater scope and scale, extrapolating grand voyages to the stars alongside huge infrastructure projects back home.

Not long after macro-technology enjoyed its greatest hurrah in the communal adventure that was Apollo, however, technology began to get personal. In the mid-1970s, the first personal computers began to appear. In 1979, in an event of almost equal significance, Sony introduced the Walkman, a cassette player the size of your hand, the first piece of lifestyle technology that you could carry around with you. The PC and the Walkman begat our iPhones and Fitbits of today. And if we believe what Gibson and the other cyberpunks were already saying in the early 1980s, those gadgets will in turn beget chip implants, nerve splices, body modifications, and artificial organs. The public has become personal; the outward-facing has become inward-facing; the macro spaces have become micro spaces. We now focus on making ever smaller gadgets, even as we’ve turned our attention away from the outer space beyond our planet in favor of drilling down ever further into the infinitesimal inner spaces of genes and cells, into the tiniest particles that form our universe. All of these trends first showed up in science fiction in the form of cyberpunk.

In marked contrast to the boldness of his stories’ content, Gibson was peculiarly cautious, even hesitant, when it came to the process of writing and of making a proper career out of the act. The fact that Neuromancer, Gibson’s seminal first novel, came into being when it did was entirely down to the intervention of Terry Carr, the same man who had kick-started Gibson’s career as a writer of short stories by publishing “The Gernsback Continuum.” When in 1983 he was put in charge of a new “Ace Specials” line of science-fiction paperbacks reserved exclusively for the first novels of up-and-coming writers, Carr immediately thought again of William Gibson. A great believer in Gibson’s talent and potential importance, he cajoled him into taking an advance and agreeing to write a novel; Gibson had considered himself still “four or five years away” from being ready to tackle such a daunting task. “It wasn’t that vast forces were silently urging me to write,” he says. “It’s just that Terry Carr had given me this money and I had to make up some kind of story. I didn’t have a clue, so I said, ‘Well, I’ll plagiarize myself and see what comes of it.'” And indeed, there isn’t that much in 1984’s Neuromancer that would have felt really new to anyone who had read all of the stories Gibson had written in the few years before it. As a distillation of all the ideas with which he’d been experimenting in one 271-page novel, however, it was hard to beat.

 

Neuromancer

The plot is never the most important aspect of a William Gibson novel, and this first one is no exception to that rule. Still, for the record…

Neuromancer takes place at some indeterminate time in the future, in a gritty society where the planet is polluted and capitalism has run amok, but the designer drugs and technological toys are great if you can pay for them. Our hero is Case, a former “console cowboy” who used to make his living inside the virtual reality, or “Matrix,” of a worldwide computer network, battling “ICE” (“Intrusion Countermeasures Electronics”) and pulling off heists for fun and profit. Unfortunately for him, an ex-employer with a grudge has recently fried those pieces of Case’s brain that interface with his console and let him inject himself into “cyberspace.” Left stuck permanently in “meat” space, as the novel opens he’s a borderline suicidal, down-and-out junkie. But soon he’s offered the chance to get his nervous system repaired and get back into the game by a mysterious fellow named Armitage, mastermind of a ragtag gang of outlaws who are investigating mysterious happenings on the Matrix. Eventually they’ll discover a rogue artificial intelligence behind it all — the titular Neuromancer.

Given that plot summary, we can no longer avoid addressing the thing for which William Gibson will always first and foremost be known, whatever his own wishes on the matter: he’s the man who invented the term “cyberspace,” as well as the verb “to surf” it and with them much of the attitudinal vector that accompanied the rise of the World Wide Web in the 1990s. It should be noted that both neologisms actually predate Neuromancer in Gibson’s work, dating back to 1982’s “Burning Chrome.” And it should most definitely be noted that he was hardly the first to stumble upon many of the ideas behind the attitude. We’ve already chronicled some of the developments in the realms of theory and practical experimentation that led to the World Wide Web. And in the realm of fiction, a mathematician and part-time science-fiction writer named Vernor Vinge had published True Names, a novella describing a worldwide networked virtual reality of its own, in 1981; its plot also bears some striking similarities to that of Gibson’s later Neuromancer. But Vinge was (and is) a much more prosaic writer than Gibson, hewing more to science fiction’s sturdy old school of Asimov, Clarke, and Heinlein. He could propose the idea of a worldwide network and then proceed to work it out with much more technical rigorousness than Gibson could ever dream of mustering, but he couldn’t hope to make it anywhere near as sexy.

For many, the most inexplicable thing about Gibson’s work is that he should ever have come up with all this cyberspace stuff in the first place. As he took a certain perverse delight in explaining to his wide-eyed early interviewers, in his real-world life Gibson was something of a Luddite even by the standards of the 1980s. He had, for instance, never owned or used a computer at the time he wrote his early stories and Neuromancer; he wrote of his sleek high-tech futures on a clunky mechanical typewriter dating from 1927. (Gibson immortalized it within Neuromancer itself by placing it in disassembled form on the desk of Julius Deane, an underworld kingpin Case visits early in the novel.) And I’ve seen no evidence that Gibson was aware of True Names prior to writing “Burning Chrome” and Neuromancer, much less the body of esoteric and (at the time) obscure academic literature on computer networking and hypertext.

Typically, Gibson first conceived the idea of the Matrix not from reading tech magazines and academic journals, as Vinge did in conceiving his own so-called “Other Plane,” but on the street, while gazing through the window of an arcade. Seeing the rapt stares of the players made him think they believed in “some kind of actual space behind the screen, someplace you can’t see but you know is there.” In Neuromancer, he describes the Matrix as the rush of a drug high, a sensation with which his youthful adventures in the counterculture had doubtless left him intimately familiar.

He closed his eyes.

Found the ridged face of the power stud.

And in the bloodlit dark behind his eyes, silver phosphenes boiling in from the edge of space, hypnagogic images jerking past like film compiled from random frames. Symbols, figures, faces, a blurred, fragmented mandala of visual information.

Please, he prayed, now –

A gray disk, the color of Chiba sky.

Now –

Disk beginning to rotate, faster, becoming a sphere of paler gray. Expanding —

And flowed, flowered for him, fluid neon origami trick, the unfolding of his distanceless home, his country, transparent 3D chessboard extending to infinity. Inner eye opening to the stepped scarlet pyramid of the Eastern Seaboard Fission Authority burning beyond the green cubes of Mitsubishi Bank of America, and high and very far away he saw the spiral arms of military systems, forever beyond his reach.

And somewhere he was laughing, in a white-painted loft, distant fingers caressing the deck, tears of release streaking his face.

Much of the supposedly “futuristic” slang in Neuromancer is really “dope dealer’s slang” or “biker’s talk” Gibson had picked up on his travels. Aside from the pervasive role played by the street, he has always listed the most direct influences on Neuromancer as the cut-up novels of his literary hero William S. Burroughs, the noirish detective novels of Dashiell Hammett, and the deliciously dystopian nighttime neon metropolis of Ridley Scott’s film Blade Runner, which in its exploration of subjectivity, the nature of identity, and the influence of technology on same hit many of the same notes that became staples of Gibson’s work. That so much of the modern world seems to be shaped in Neuromancer‘s image says much about Gibson’s purely intuitive but nevertheless prescient genius — and also something about the way that science fiction can be not only a predictor but a shaper of the future, an idea I’ll return to shortly.

But before we move on to that subject and others we should take just a moment more to consider how unique Neuromancer, a bestseller that’s a triumph of style as much as anything else, really is in the annals of science fiction. In a genre still not overly known for striking or elegant prose, William Gibson is one of the few writers immediately recognizable after just a paragraph or two. If, on the other hand, you’re looking for air-tight world-building and careful plotting, Gibson is definitely not the place to find it. “You’ll notice in Neuromancer there’s obviously been a war,” he said in an interview, “but I don’t explain what caused it or even who was fighting it. I’ve never had the patience or the desire to work out the details of who’s doing what to whom, or exactly when something is taking place, or what’s become of the United States.”

I remember standing in a record store one day with a friend of mine who was quite a good guitar player when Jimi Hendrix’s famous Woodstock rendition of “The Star-Spangled Banner” came over the sound system. “All he does is make a bunch of noise to cover it up every time he flubs a note,” said my friend — albeit, as even he had to agree, kind of a dazzling noise. I sometimes think of that conversation when I read Neuromancer and Gibson’s other early works. There’s an ostentatious, look-at-me! quality to his prose, fueled by, as Gibson admitted, his “blind animal panic” at the prospect of “losing the reader’s attention.” Or, as critic Andrew M. Butler puts it more dryly: “This novel demonstrates great linguistic density, Gibson’s style perhaps blinding the reader to any shortcomings of the novel, and at times distancing us from the characters and what Gibson the author may feel about them.” The actual action of the story, meanwhile, Butler sums up not entirely unfairly as, “Case, the hapless protagonist, stumbles between crises, barely knowing what’s going on, at risk from a femme fatale and being made offers he cannot refuse from mysterious Mr. Bigs.” Again, you don’t read William Gibson for the plot.

Which of course only makes Neuromancer‘s warm reception by the normally plot-focused readers of science fiction all the more striking. But make no mistake: it was a massive critical and commercial success, winning the Hugo and Nebula Awards for its year and, as soon as word spread following its very low-key release, selling like crazy. Unanimously recognized as the science-fiction novel of 1984, it was being labeled the novel of the decade well before the 1980s were actually over; it was just that hard to imagine another book coming out that could compete with its influence. Gibson found himself in a situation somewhat akin to that of Douglas Adams during the same period, lauded by the science-fiction community but never quite feeling a part of it. “Everyone’s been so nice,” he said in the first blush of his success, “but I still feel very much out of place in the company of most science-fiction writers. It’s as though I don’t know what to do when I’m around them, so I’m usually very polite and keep my tie on. Science-fiction authors are often strange, ill-socialized people who have good minds but are still kids.” Politeness or no, descriptions like that weren’t likely to win him many new friends among them. And, indeed, there was a considerable backlash against him by more traditionalist writers and readers, couched in much the same rhetoric that had been deployed against science fiction’s New Wave of writers of twenty years before.

But if we wish to find reasons that so much of science-fiction fandom did embrace Neuromancer so enthusiastically, we can certainly find some that were very practical if not self-serving, and that had little to do with the literary stylings of William S. Burroughs or extrapolations on the social import of technological development. Simply put, Neuromancer was cool, and cool was something that many of the kids who read it decidedly lacked in their own lives. It’s no great revelation to say that kids who like science fiction were and are drawn in disproportionate numbers to computers. Prior to Neuromancer, such kids had few media heroes to look up to; computer hackers were almost uniformly depicted as socially inept nerds in Coke-bottle glasses and pocket protectors. But now along came Case, and with him a new model of the hacker as rock star, dazzling with his Mad Skillz on the Matrix by day and getting hot and heavy with his girlfriend Molly Millions, who seemed to have walked into the book out of an MTV music video, by night. For the young pirates and phreakers who made up the Scene, Neuromancer was the feast they’d never realized they were hungry for. Cyberpunk ideas, iconography, and vocabulary were quickly woven into the Scene’s social fabric.

Like much about Neuromancer‘s success, this way of reading it, which reduced it down to a stylish exercise in escapism, bothered Gibson. His book was, he insisted, not about how cool it was to be “hard and glossy” like Case and Molly, but about “what being hard and glossy does to you.” “My publishers keep telling me the adolescent market is where it’s at,” he said, “and that makes me pretty uncomfortable because I remember what my tastes ran to at that age.”

While Gibson may have been uncomfortable with the huge appetite for comic-book-style cyberpunk that followed Neuromancer‘s success, plenty of others weren’t reluctant to forgo any deeper literary aspirations in favor of piling the casual violence and casual sex atop the casual tech. As the violence got ever more extreme and the sex ever more lurid, cyberpunk risked turning into the most insufferable of clichés.

Sensation though cyberpunk was in the rather insular world of written science fiction, William Gibson and the sub-genre he had pioneered filtered only gradually into the world outside of that ghetto. The first cyberpunk character to take to the screen arguably was, in what feels like a very appropriate gesture, a character who allegedly lived within a television: Max Headroom, a curious computerized talking head who became an odd sort of cultural icon for a few years there during the mid- to late-1980s. Invented for a 1985 low-budget British television movie called Max Headroom: 20 Minutes into the Future, Max went on to host his own talk show on British television, to become an international spokesman for the ill-fated New Coke, and finally to star in an American dramatic series which managed to air 14 episodes on ABC during 1987 and 1988. While they lacked anything precisely equivalent to the Matrix, the movie and the dramatic series otherwise trafficked in themes, dystopic environments, and gritty technologies of the street not far removed at all from those of Neuromancer. The ambitions of Max’s creators were constantly curtailed by painfully obvious budgetary limitations as well as the pop-cultural baggage carried by the character himself; by the time of the 1987 television series he had become more associated with camp than serious science fiction. Nevertheless, the television series in particular makes for fascinating viewing for any student of cyberpunk history. (The series endeared itself to Commodore Amiga owners in another way: Amigas were used to create many of the visual effects used on the show, although not, as was occasionally reported, to render Max Headroom himself. He was actually played by an actor wearing a prosthetic mask, with various visual and auditory effects added in post-production to create the character’s trademark tics.)

There are other examples of cyberpunk’s slowly growing influence to be found in the film and television of the late 1980s and early 1990s, such as the street-savvy, darkly humorous low-budget action flick RoboCop. But William Gibson’s elevation to the status of Prophet of Cyberspace in the eyes of the mainstream really began in earnest with a magazine called Wired, launched in 1993 by an eclectic mix of journalists, entrepreneurs, and academics. Envisioned as a glossy lifestyle magazine for the hip and tech-conscious — the initial pitch labeled it “the Rolling Stone of technology” — Wired‘s aesthetics were to a large degree modeled on William Gibson. When they convinced him to contribute a rare non-fiction article (on Singapore, which he described as “Disneyland with the death penalty”) to the fourth issue, the editors were so excited that they stuck the author rather than the subject of the article on their magazine’s cover.

Wired

Well-funded and editorially polished in all the ways that traditional technology journals weren’t, Wired was perfectly situated to become mainstream journalism’s go-to resource for understanding the World Wide Web and the technology bubble expanding around it. It was largely through Wired that “cyberspace” and “surfing” became indelible parts of the vocabulary of the age, even as both neologisms felt a long, long way in spirit from the actual experience of using the World Wide Web in those early days, involving as it did mostly text-only pages delivered to the screen at a glacial pace. No matter. The vocabulary surrounding technology has always tended to be grounded in aspiration rather than reality, and perhaps that’s as it should be. By the latter 1990s, Gibson was being acknowledged by even such dowdy organs as The New York Times as the man who had predicted it all five years before the World Wide Web was so much as a gleam in the eye of Tim Berners-Lee.

To ask whether William Gibson deserves his popular status as a prophet is, I would suggest, a little pointless. Yes, Vernor Vinge may have better claim to the title in the realm of fiction, and certainly people like Vannevar Bush, Douglas Engelbart, Ted Nelson, and even Bill Atkinson of Apple have huge claims on the raw ideas that turned into the World Wide Web. Even within the oeuvre of William Gibson himself, his predictions in other areas of personal technology and society — not least his anticipation of globalization and its discontents — strike me as actually more prescient than his rather vague vision of a global computerized Matrix.

Yet, whether we like it or not, journalism and popular history do tend to condense complexities down to single, easily graspable names, and in this case the beneficiary of that tendency is William Gibson. And it’s not as if he didn’t make a contribution. Whatever the rest did, Gibson was the guy who made the idea of a networked society — almost a networked consciousness — accessible, cool, and fun. In doing so, he turned the old idea of science fiction as prophecy on its head. Those kids who grew up reading Neuromancer became the adults who are building the technology of today. If, with the latest developments in virtual reality, we seem to be inching ever closer to a true worldwide Matrix, we can well ask ourselves who is the influenced and who is the influencer. Certainly Neuromancer‘s effect on our popular culture has been all but incalculable. The Matrix, the fifth highest-grossing film of 1999 and a mind-expanding pop-culture touchstone of its era, borrowed from Gibson to the extent of naming itself after his version of virtual reality. In our own time, it’s hard to imagine current too-cool-for-school television hits like Westworld, Mr. Robot, and Black Mirror existing without the example of Neuromancer (or, at least, without The Matrix and thus by extension Neuromancer). The old stereotype of the closeted computer nerd, if not quite banished to the closet from which it came, does now face strong competition indeed. Cyberpunk has largely faded away as a science-fiction sub-genre or even just a recognized point of view, not because the ideas behind it died but because they’ve become so darn commonplace.

You may have noticed that up to this point I’ve said nothing about the books William Gibson wrote after Neuromancer. That it’s been so easy to avoid doing so says much about his subsequent career, doomed as it is always to be overshadowed by his very first novel. For understandable reasons, the situation hasn’t always sat well with Gibson himself. Already in 1992, he could only wryly reply, “Yeah, and they’ll never let me forget it,” when introduced as the man who invented cyberspace — this well before his mainstream fame as the inventor of the word had really even begun to take off. Writing a first book with the impact of Neuromancer is not an unalloyed blessing.

That said, one must also acknowledge that Gibson didn’t do his later career any favors in getting out from under Neuromancer‘s shadow. Evincing that peculiar professional caution that always sat behind his bold prose, he mined the same territory for years, releasing a series of books whose titles — Count Zero, Mona Lisa Overdrive, Virtual Light — seem as of a piece as their dystopic settings and their vaguely realized plots. It’s not that these books have nothing to say; it’s rather that almost everything they do say is already said by Neuromancer. His one major pre-millennial departure from form, 1990’s The Difference Engine, is an influential exercise in Victorian steampunk, but also a book whose genesis owed much to his good friend and fellow cyberpunk icon Bruce Sterling, with whom he collaborated on it.

Here’s the thing, though: as he wrote all those somewhat interchangeable novels through the late 1980s and 1990s, William Gibson was becoming a better writer. His big breakthrough came with 2003’s Pattern Recognition, in my opinion the best pure novel he’s ever written. Perhaps not coincidentally, Pattern Recognition also marks the moment when Gibson, who had been steadily inching closer to the present ever since Neuromancer, finally decided to set a story in our own contemporary world. His prose is as wonderful as ever, full of sentence after sentence I can only wish I’d come up with, yet now free of the look-at-me! ostentation of his early work. One of the best ways to appreciate how much subtler a writer Gibson has become is to look at his handling of his female characters. Molly Millions from Neuromancer was every teenage boy’s wet dream come to life. Cayce, the protagonist of Pattern Recognition — her name is a sly nod back to Neuromancer‘s Case — is, well, just a person. Her sexuality is part of her identity, but it’s just a part. A strong, capable, intelligent character, she’s not celebrated by the author for any of these qualities. Instead she’s allowed just to be. This strikes me as a wonderful sign of progress — for William Gibson, and perhaps for all of us.

Which isn’t to say that Gibson’s dystopias have turned into utopias. While his actual plots remain as underwhelming as ever, no working writer of today that I’m aware of captures so adroitly the sense of dislocation and isolation that has become such a staple of post-millennial life — paradoxically so in this world that’s more interconnected than ever. If some person from the future or the past asked you how we live now, you could do a lot worse than to simply hand her one of William Gibson’s recent novels.

Whether Gibson is still a science-fiction writer is up for debate and, like so many exercises in labeling, ultimately inconsequential. There remains a coterie of old fans unhappy with the new direction, who complain about every new novel he writes because it isn’t another Neuromancer. By way of compensation, Gibson has come to be widely accepted as a writer of note outside of science-fiction fandom — a writer of note, that is, for something more than being the inventor of cyberspace. That of course doesn’t mean he will ever write another book with the impact of Neuromancer, but Gibson, who never envisioned himself as anything more than a cult writer in the first place, seems to have made his peace at last with the inevitability of the phrases “author of Neuromancer” and “coiner of the term ‘cyberspace'” appearing in the first line of his eventual obituary. Asked in 2007 by The New York Times whether he was “sick of being known as the writer who coined the word ‘cyberspace,'” he said he thought he’d “miss it if it went away.” In the meantime, he has more novels to write. We may not be able to escape our yesterdays, but we always have our today.

(Sources: True Names by Vernor Vinge; Conversations with William Gibson, edited by Patrick A. Smith; Bruce Sterling and William Gibson’s introductions to the William Gibson short-story collection Burning Chrome; Bruce Sterling’s preface to the cyberpunk short-story anthology Mirrorshades; “Science Fiction from 1980 to the Present” by John Clute and “Postmodernism and Science Fiction” by Andrew M. Butler, both found in The Cambridge Companion to Science Fiction; Spin of April 1987; Los Angeles Times of September 12 1993; The New York Times of August 19 2007; William Gibson’s autobiography from his website; “William Gibson and the Summer of Love” from the Toronto Dream Project; and of course the short stories and novels of William Gibson.)

 
 


How Jordan Mechner Made a Different Sort of Interactive Movie (or, The Virtues of Restraint)

One can learn much about the state of computer gaming in any given period by looking to the metaphors its practitioners are embracing. In the early 1980s, when interfaces were entirely textual and graphics crude or nonexistent, text adventures like those of Infocom were heralded as the vanguard of a new interactive literature destined to augment or entirely supersede non-interactive books. That idea peaked with the mid-decade bookware boom, when just about every entertainment-software publisher (and a few traditional book publishers) were rushing to sign established authors and books to interactive projects. It then proceeded to collapse just as quickly under the weight of its own self-importance when the games proved less compelling and the public less interested than anticipated.

Prompted by new machines like the Commodore Amiga with their spectacular graphics and sound, the industry reacted to that failure by turning to the movies for media mentorship. This relationship would prove more long-lasting. By the end of the 1980s, companies like Cinemaware and Sierra were looking forward confidently to a blending of Hollywood and Silicon Valley that they believed might just replace the conventional non-interactive movie, not to mention computer games as people had known them to that point. Soon most of the major publishers would be conducting casting calls and hiring sound stages, trying literally to make games out of films. It was an approach fraught with problems — problems that were only slowly and grudgingly acknowledged by these would-be unifiers of Southern and Northern Californian entertainment. Before it ran its course, it spawned lots of really terrible games (and, it must be admitted, against all the odds the occasional good one as well).

Given the game industry’s growing fixation on the movies as the clock wound down on the 1980s, Jordan Mechner would seem the perfect man for the age. Struggling with the blessing or curse of an equally abiding love for both mediums, his professional life had already been marked by constant vacillation between movies and games. Inevitably, his love of film influenced him even when he was making games. But, perhaps because that love was so deep and genuine, he accomplished the blending in a more even-handed, organic way than would most of the multi-CD, multi-gigabyte interactive movies that would soon be cluttering store shelves. Mechner’s most famous game, by contrast, filled just two Apple II disk sides — less than 300 K in total. And yet the cinematic techniques it employs have far more in common with those found in the games of today than do those of its more literal-minded rivals.


 

As a boy growing up in the wealthy hamlet of Chappaqua, New York, Jordan Mechner dreamed of becoming “a writer, animator, or filmmaker.” But those ambitions got modified if not discarded when he discovered computers at his high school. Soon after, he got his hands on his own Apple II for the first time. Honing his chops as a programmer, he started contributing occasional columns on BASIC to Creative Computing magazine at the age of just 14. Yet fun as it was to be the magazine’s youngest contributor, his real reason for learning programming was always to make games. “Games were the only kind of software I knew,” he says. “They were the only kind that I enjoyed. At that time, I didn’t really see any use for a word processor or a spreadsheet.” He fell into the throes of what he describes as an “obsession” to get a game of his own published.

Initially, he did what lots of other game programmers were doing at the time: cloning the big standup-arcade hits for fun and (hopefully) profit. He made a letter-perfect copy of Atari’s Asteroids, changed the titular space rocks to bright bouncing balls in the interest of plausible deniability, and sent the resulting Deathbounce off to Brøderbund for consideration; what with Brøderbund having been largely built on the back of Apple Galaxian, an arcade clone which made no effort whatsoever to conceal its source material, the publisher seemed a very logical choice. But Doug Carlston was now trying to distance his company from such fare for reasons of reputation as well as his fear of Atari’s increasingly aggressive legal threats. Nice guy that he was, he called Mechner personally to explain why Deathbounce wasn’t for Brøderbund. He promised to send Mechner a free copy of Brøderbund’s latest hit, Choplifter, suggesting he think about whether he might be able to apply the programming chops he had demonstrated in Deathbounce to a more original game, as Choplifter‘s creator Dan Gorlin had done. Mechner remembers the conversation as well-nigh life-changing. He had been so immersed in the programming side of making games that the idea of doing an original design had never really occurred to him before: “I didn’t have to copy someone else’s arcade game. I was allowed to design my own!”

Carlston’s phone call came in May of 1982, when Mechner was finishing up his first year at Yale University; undecided about his major as he was about so much else in his life at the time, he would eventually wind up with a Bachelor’s in psychology. We’re granted an unusually candid and personal glimpse into his life between 1982 and 1993 thanks to his private journals, which he published (doubtless in a somewhat expurgated form) in 2012. The early years paint a picture of a bright, sensitive young man born into a certain privilege that carries with it the luxury of putting off adulthood for quite some time. He romanticizes chance encounters (“I saw a heartbreakingly beautiful young blonde out of the corner of my eye. She was wearing a blue down vest. As she passed, our eyes met. She smiled at me. As I went out I held the door for her; her fingers grazed mine. Then she was gone.”); frets frequently about cutting classes and generally not being the man he ought to be (“I think Ben is the only person who truly comprehends the depths of how little classwork I do.”); alternates between grand plans accompanied by frenzies of activity and indecision accompanied by long days of utter sloth (“Here’s what I do do: listen to music. Browse in record stores. Read newspapers, magazines, play computer games, stare out the windows. See a lot of movies.”); muses with all the self-obliviousness of youth on whether he would prefer “writing a bestselling novel or directing a blockbusting film,” as if attaining fame and fortune was as simple as deciding on one or the other.

At Yale, film, that other constant of his creative life, came to the fore. He joined every film society he stumbled upon, signed up for every film-studies course in the catalog, and set about “trying to see in four years every film ever made”; Akira Kurosawa’s classic adventure epic Seven Samurai (a major inspiration behind Star Wars among other things) emerged as his favorite of them all. He also discovered an unexpected affinity for silent cinema, which naturally led him to compare that earliest era of film with the current state of computer games, a medium that seemed in a similar state of promising creative infancy. All of this, combined with the example of Choplifter and the karate lessons he was sporadically attending, led to Karateka, the belated fruition of his obsession with getting a game published.

To a surprising degree given his youth and naivete, Mechner consciously designed Karateka as the proverbial Next Big Thing in action games after the first wave of simple quarter munchers, whose market he watched collapse over the two-plus years he spent intermittently working on it. Plenty of fighting games had appeared on the Apple II and other platforms before, some of them very playable; Mechner wasn’t sure he could really improve on their templates when it came to pure game play. What he could do, however, was give his game some of the feel and emotional resonance of cinema. Reasoning that computer games were technically on par with the first decade or two of film in terms of the storytelling tools at his disposal, he mimicked the great silent-film directors in building his story out of the broadest archetypal elements: an unnamed hero must assault a mountain fortress to rescue an abducted princess, fighting through wave after wave of enemies, culminating in a showdown with the villain himself. He energetically cross-cut the interactive fighting sequences with non-interactive scenes of the villain issuing orders to his minions while the princess looks around nervously in her cell — a suspense-building technique from cinema dating back to The Birth of a Nation. He mimicked the horizontal wipes Kurosawa used for transitions in Seven Samurai; mimicked the scrolling textual prologue from Star Wars. When the player lost or won, he printed “THE END” on the screen in lieu of “GAME OVER.” And, indeed, he made it possible, although certainly not easy, to win Karateka and carry the princess off into the sunset. The player was, in other words, playing for bigger stakes than a new high score.

Karateka

The most technically innovative aspect of Karateka — suggested, like much in the game, by Mechner’s very supportive father — involved the actual people on the screen. To make his fighters move as realistically as possible, Mechner made use for the first time in a computer game of an old cartoon-animation technique known as rotoscoping. After shooting some film footage of his karate instructor in action, doing various kicks and punches, Mechner used an ancient Moviola editing machine that had somehow wound up in the basement of the family home to isolate and make prints out of every third frame. He imported the figure at the center of each print into his Apple II by tracing it on a contraption called the VersaWriter. Flipped through in sequence, the resulting sprites appeared to “move” in an unusually fluid and realistic fashion. “When I saw that sketchy little figure walk across the screen,” he wrote in his journal, “looking just like Dennis [his karate instructor], all I could say was ‘ALL RIGHT!’ It was a glorious moment.”
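If the technique sounds abstract, the payoff is easy to picture. The sketch below is purely illustrative: it assumes the traced frames already exist as a list of images in memory, and it borrows the 15-frames-per-second figure Mechner would later cite in his journal. It is not Mechner’s Apple II code, just the underlying idea expressed in modern Python.

```python
# Illustrative only: rotoscoped animation reduced to its essence. Once each
# traced frame has been digitized, "lifelike" motion is nothing more than
# flipping through those frames in order at a steady rate.
import itertools
import time

def play_rotoscoped(frames, draw, fps=15):
    """Cycle through pre-traced sprite frames at a fixed rate.

    frames: a sequence of already-digitized sprite images (any type)
    draw:   a callback that puts one frame on the screen
    fps:    frames per second; 15 is the figure from Mechner's journal
    """
    for frame in itertools.cycle(frames):
        draw(frame)            # hand the current frame to whatever renders it
        time.sleep(1.0 / fps)  # hold it on screen for 1/fps of a second

# Toy usage, with print() standing in for an actual renderer:
# play_rotoscoped(["kick_1", "kick_2", "kick_3"], print)
```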

Karateka

Doug Carlston, who clearly saw something special in this earnest kid, was gently encouraging and almost infinitely patient with him. When it looked like Mechner had come up with something potentially great at last, Carlston signed him to a contract and flew him out to California in the summer of 1984 to finish it up with the help of Brøderbund’s in-house staff. Released just a little too late to fully capitalize on the 1984 Christmas rush, Karateka started slowly but gradually turned into a hit, especially once the Commodore 64 port dropped in June of 1985. Once ported to Nintendo for the domestic Japanese market, it proceeded to sell many hundreds of thousands of units, making Jordan Mechner a very flush young man indeed.

So, Mechner, about to somehow manage to graduate despite all the assignments missed and classes cut in favor of working on Karateka, seemed poised for a fruitful career making games. Yet he continued to vacillate between his twin obsessions. Even as his game, the most significant accomplishment of his young life and one of which anyone could justly be proud, entered the homestretch, he wrote that “I definitely want my next project to be film-related. Videogames have taken up enough of my time for now.” In the wake of his game’s release, the steady stream of royalties therefrom only made it easier to dabble in film.

Mechner spent much of the year after graduating from university back at home in Chappaqua working on his first screenplay. In between writing dialog and wracking himself with doubt over whether he really wanted to do another game at all, he occasionally turned his attention to the idea of a successor to Karateka. Already during that first summer after Yale, he and Gene Portwood, a Brøderbund executive, dreamed up a scenario for just such a beast: an Arabian Nights-inspired story involving an evil sultan, a kidnapped princess, and a young man — the player, naturally — who must rescue her. Karateka in Middle Eastern clothing though it may have been in terms of plot, that was hardly considered a drawback by Brøderbund, given the success of Mechner’s first game.

Seven frames of animation ready to be photocopied and digitized.

Determined to improve upon the rotoscoping of Karateka, Mechner came up with a plan to film a moving figure and use a digitizer to capture the frames into the computer, rather than tracing the figure using the VersaWriter. He spent $2500 on a high-end VCR and video camera that fall, knowing he would return them before his month’s grace period was out (“I feel so dishonest,” he wrote in his journal). The technique he had in the works may have been an improvement over what he had done for Karateka, but it was still very primitive and hugely labor-intensive. After shooting his video, he would play it back on the VCR, pausing it on each frame he wanted to capture. Then he would take a picture of the screen using an ordinary still camera and get the film developed. Next step was to trace the outline of the figure in the photograph using Magic Marker and fill him in using White-Out. Then he would Xerox the doctored photograph to get a black-and-white version with a very clear silhouette of the figure. Finally, he would digitize the photocopy to import it into his Apple II, and erase everything around the figure by hand on the computer to create a single frame of sprite animation. He would then get to go through this process a few hundred more times to get the prince’s full repertoire of movements down.
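Only the last step of that pipeline was digital at all, and even it amounted to little more than reducing a gray photocopy to a clean one-bit silhouette. A hedged, modern re-creation of just that step might look like the following; it uses the Pillow imaging library as a stand-in for the Apple II digitizer and the hand clean-up, and the filename is hypothetical.

```python
# A modern stand-in for the final, digital step only: turn a photocopied
# frame into a one-bit silhouette. The 1985 original ran through a
# black-and-white digitizer followed by plenty of manual pixel erasing.
from PIL import Image

def photocopy_to_sprite(path, threshold=128):
    """Reduce a scanned photocopy to a pure black-and-white silhouette."""
    gray = Image.open(path).convert("L")   # load as 8-bit grayscale
    # Everything darker than the threshold becomes one color, everything
    # lighter the other; which side the figure lands on depends on the
    # source material, and PIL.ImageOps.invert() can flip it if needed.
    mono = gray.point(lambda p: 255 if p < threshold else 0)
    return mono.convert("1")               # one bit per pixel

# sprite = photocopy_to_sprite("frame_001.png")  # hypothetical filename
```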


On October 20, 1985, Jordan Mechner did his first concrete work on the game that would become Prince of Persia, using his ill-gotten video camera to film his 16-year-old brother David running and jumping through a local parking lot. When he finally got around to buying a primitive black-and-white image digitizer for his trusty Apple II more than six months later, he quickly determined that the footage he’d shot was useless due to poor color separation. Nevertheless, he saw potential magic.

I still think this can work. The key is not to clean up the frames too much. The figure will be tiny and messy and look like crap… but I have faith that, when the frames are run in sequence at 15 fps, it’ll create an illusion of life that’s more amazing than anything that’s ever been seen on an Apple II screen. The little guy will be wiggling and jiggling like a Ralph Bakshi rotoscope job… but he’ll be alive. He’ll be this little shimmering beacon of life in the static Apple-graphics Persian world I’ll build for him to run around in.

For months after that burst of enthusiasm, however, he did little more with the game.

At last in September of 1986, having sent his screenplay off to Hollywood and thus with nothing more to do on that front but wait, Mechner moved out to San Rafael, California, close to Brøderbund’s offices, determined to start in earnest on Prince of Persia. He spent much time over the next few months refining his animation technique, until by Christmas everyone who saw the little running and jumping figure was “bowled over” by him. Yet after that progress again slowed to a crawl, as he struggled to motivate himself to turn his animation demos into an actual game.

And then, on May 4, 1987, came the phone call that would stop the little running prince in his tracks for the better part of a year. A real Hollywood agent called to tell him she “loved” his script for Birthstone, a Spielbergian supernatural comedy/thriller along the lines of Gremlins or The Goonies. Within days of her call, the script was optioned by Larry Turman, a major producer with films like The Graduate on his resume. For months Mechner fielded phone calls from a diverse cast of characters with a diverse cast of suggestions, did endless rewrites, and tried to play the Hollywood game, schmoozing and negotiating and trying not to appear to be the awkward, unworldly kid he still largely was. Only when Birthstone seemed permanently stuck in development hell — “Hollywood’s the only town where you can die of encouragement,” he says wryly, quoting Pauline Kael —  did he give up and turn his attention back to games. Mechner notes today that just getting as far as he did with his very first script was a huge achievement and a great start in itself. After all, he was, if not quite hobnobbing with the Hollywood elite, at least getting rejection letters from such people as Michael Apted, Michael Crichton, and Henry Winkler; such people were reading his script. But he had been spoiled by the success of Karateka. If he wrote another screenplay, there was no guarantee it would get even as far as his first had. If he finished Prince of Persia, on the other hand, he knew Brøderbund would publish it.

And so, in 1988, it was back to games, back to Prince of Persia. Inspired by “puzzly” 8-bit action games like Doug Smith’s Lode Runner and Ed Hobbs’s The Castles of Dr. Creep, his second game was shaping up to be more than just a game of combat. Instead his prince would have to make his way through area after area full of tricks, traps, and perilous drops. “What I wanted to do with Prince of Persia,” Mechner says, “was a game which would have that kind of logical, head-scratching, fast-action, Lode Runner-esque puzzles in a level-based game but also have a story and a character that was trying to accomplish a recognizable human goal, like save a princess. I was trying to merge those two things.” Ideally, the game would play like the iconic first ten minutes of Raiders of the Lost Ark, in which Indiana Jones runs and leaps and dodges and sometimes outwits rather than merely outruns a series of traps. For a long while, Mechner planned to make the hero entirely defenseless, as a sort of commentary on the needless ultra-violence found in so many other games. In the end, he didn’t go that far — the allure of sword-fighting, not to mention commercial considerations, proved too strong — but Prince of Persia was nevertheless shaping up to be a far more ambitious, multi-faceted work than Karateka, boasting much more than just improved running and jumping animations.

With just 128 K of memory to work with on the Apple II, Mechner was forced to make Prince of Persia a modular design, relying on a handful of elements which are repeatedly reused and recombined. Take, for instance, the case of the loose floorboards. The first time they appear, they’re a simple trap: you have to jump over a section of the floor to avoid falling into a pit. Later, they appear on the ceiling, as part of the floor above your own; caught in an apparent cul de sac, you have to jump up and bash the ceiling to open an escape route. Still later, they can be used strategically: to kill guards below you by dropping the floorboards on their heads, or to hold down a pressure plate below you that opens a door on the level on which you’re currently standing. It’s a fine example of a constraint in game design turning into a strength. “There’s a certain elegance to taking an element the player is already familiar with,” says Mechner, “and challenging him to think about it in a different way.”
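A rough sketch of that idea in modern terms might look like the following. The class and method names are invented for illustration and bear no relation to Mechner’s actual Apple II code, but they show how a single element can be defined once and behave differently depending on where it sits and what touches it.

```python
# Not Prince of Persia's code, just the design idea it embodies:
# one tile type, several behaviors, depending on context.
from dataclasses import dataclass

@dataclass
class LooseFloorboard:
    fallen: bool = False

    def on_stepped_on(self):
        # As part of the floor: a trap that gives way underfoot.
        self.fallen = True
        return "the board drops away, taking whoever stood on it into the pit"

    def on_struck_from_below(self):
        # As part of the ceiling: bash it loose to open an escape route.
        self.fallen = True
        return "the board is knocked free, opening a way through the ceiling"

    def on_lands_on(self, target):
        # Once falling, the same element doubles as a weapon or a tool.
        if target == "guard":
            return "the guard below is knocked out"
        if target == "pressure plate":
            return "the plate is held down, keeping a distant door open"
        return "the board shatters harmlessly on the floor"
```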


On July 14, 1989, Mechner shot the final footage for Prince of Persia: the denouement, showing the prince — now played by the game’s project manager at Brøderbund, Brian Ehler — embracing the rescued princess — played by Tina LaDeau, the 18-year-old daughter of another Brøderbund employee, in her prom dress. (“Man, she is a fox,” Mechner wrote in his journal. “Brian couldn’t stop blushing when I had her embrace him.”)

The game shipped for the Apple II on October 6, 1989. And then, despite a very positive review in Computer Gaming World — Charles Ardai called it nothing less than “the Star Wars of its field,” music to the ears of a movie buff like Mechner — it proceeded to sell barely at all: perhaps 500 units a month. It was, everyone at Brøderbund agreed, at least a year too late to hope to sell significant numbers of a game like this on the Apple II, whose only remaining commercial strength was educational software, thanks to the sheer number of the things still installed in American schools. Mechner’s procrastination and vacillation had spoiled this version’s commercial prospects entirely.

Thankfully, the Apple II version wasn’t to be the only one. Brøderbund already had programmers and artists working on ports to MS-DOS and the Amiga, the last two truly viable computer-gaming platforms in North America. Mechner as well turned his attention to the versions for these more advanced machines as soon as the Apple II version was finished. And once again his father pitched in, composing a lovely score for the luxuriously sophisticated sound hardware now at the game’s disposal. “This is going to be the definitive version of Prince of Persia,” Mechner enthused over the MS-DOS version. “With VGA [graphics] and sound card, on a fast machine, it’ll blow the Apple away. It looks like a Disney film. It’s the most beautiful game I’ve ever seen.” Reworked though they were in almost all particulars, at the heart of the new versions lay the same digitized film footage that had made the 8-bit prince run and leap so fluidly.

Prince of Persia

And yet, after it shipped on April 19, 1990, the MS-DOS version also disappointed. Mechner chafed over his publisher’s disinterest in promoting the game; they seemed on the verge of writing it off, noting how the vastly superior MS-DOS version was being regarded as just another port of an old 8-bit game, and thus would likely never be given a fair shake by press or public. True as ever to the bifurcated pattern of his life, he decided to turn back to film. Having tried and failed to get into New York University film school, he resorted to working as a production assistant in movies by way of supporting himself and trying to drum up contacts in the film-making community of New York. Thus the first anniversary of Prince of Persia‘s original release on the Apple II found him schlepping crates around New York City. His career as a game developer seemed to be behind him, and truth be told his prospects as a filmmaker didn’t look a whole lot brighter.

The situation began to reverse itself only after the Amiga version was finished — programmed, as it happened, by Dan Gorlin, the very fellow whose Choplifter had first inspired Mechner to look at his own games differently. In Europe, the Amiga’s stronghold, Prince of Persia was free of the baggage which it carried in North America — few in Europe had much idea of what an Apple II even was — and doubtless benefited from a much deeper and richer tradition on European computers of action-adventures and platform puzzlers. It received ebullient reviews and turned into a big hit on European Amigas, and its reputation gradually leaked back across the pond to turn it at last into a hit in its homeland as well. Thus did Prince of Persia become a slow grower of an international sensation — a very unusual phenomenon in the hits-driven world of videogames, where shelf lives are usually short and retailer patience shorter. Soon came the console releases, along with releases for various other European and Japanese domestic computers, sending total sales soaring to over 2 million units.

By the beginning of 1992, Mechner was far removed from his plight of just eighteen months before. He was drowning in royalties, consulting intermittently with Brøderbund on a Prince of Persia 2 — it was understood that his days in the programming trenches were behind him — and living a globetrotting lifestyle, jaunting from Paris to San Rafael to Madrid to New York as whim and business took him. He was also planning his first film, a short documentary to be shot in Cuba, and already beginning to mull over what would turn into his most ambitious and fascinating game production of all, known at this point only as “the train game.”

Prince of Persia, which despite the merits of that eventual “train game” is and will likely always remain Mechner’s signature work, strikes me most of all as a triumph of presentation. The actual game play is punishingly difficult. Each of its twelve levels is essentially an elaborate puzzle that can only be worked out by dying many times when not getting trapped into one of way too many dead ends. Even once you think you have it all worked out, you still need to execute every step with perfect precision, no mean feat in itself. Messing up at any point in the process means starting that level over again from the beginning. And, because you only have one hour of real time to rescue the princess, every failure is extremely costly; a perfect playthrough, accomplished with absolute surety and no hesitations, takes about half an hour, leaving precious little margin for error. At least there is a “save” feature that will let you bookmark each level starting with the third, so you don’t have to replay the whole game every time you screw up — which, believe me, you will, hundreds if not thousands of times before you finally rescue the princess. Beating Prince of Persia fair and square is a project for a summer vacation of those long-gone adolescent days when responsibilities were few and distractions fewer. As a busy adult, I find it too repetitive and too reliant on rote patterns, as well as — let’s be honest here — just too demanding on my aging reflexes. In short, the effort-to-reward ratio strikes me as way out of whack. Of course, I’m sure that, given Prince of Persia‘s status as a beloved icon of gaming, many of you have a different opinion.

So, let’s turn back to something on which we can hopefully all agree: the brilliance of that aforementioned presentation, which brings to aesthetic maturity many of the techniques Mechner had first begun to experiment with in Karateka. Rather than using filmed footage as a tool for the achievement of fluid, lifelike motion, as Mechner did, games during the years immediately following Prince of Persia would be plastered with jarring chunks of poorly acted, poorly staged “full-motion video.” Such spectacles look far more dated today than the restrained minimalism of Prince of Persia. The industry as a whole would take years to wind up back at the place where Jordan Mechner had started: appropriating some of the language of cinema in the service of telling a story and building drama, without trying to turn games into literal interactive movies. Mechner:

Just as theater is its own thing — with its own conventions, things that it does well, things it does badly — so is film, and so [are] computer games. And there is a way to borrow from one medium to another, and in fact that’s what an all-new medium does when it’s first starting out. Film, when it was new, looked like someone set up a camera front and center and filmed a staged play. Then the things that are specific to film — like the moving camera, close-ups, reaction shots, dissolves — all these kinds of things became part of the language of cinema. It’s the same with computer games. To take a long film sequence and to play that on your TV screen is the bad way to make a game cinematic. The computer game is not a VCR. But if you can borrow from the knowledge that we all carry inside our heads of how cuts work, how reaction shots work, what a low angle means dramatically, what it means when the camera suddenly pulls back… We’ve got this whole collective unconscious of the vocabulary of film, and that’s a tremendously valuable tool to bring into computer gaming.

In a medium that has always struggled to tamp down its instinct toward aesthetic maximalism, Mechner’s games still stand out for their concern with balance and proportion. Mechner again:

Visuals are [a] component where it’s often tempting to compromise. You think, “Well, we could put a menu bar across here, we could put a number in the upper right-hand corner of the screen representing how many potions you’ve drunk,” or something. The easy solution is always to do something that as a side effect is going to make the game look ugly. So I took as one of the ground rules going in that the overall screen layout had to be pleasing, had to be strong and simple. So that somebody who was not playing the game but who walked into the room and saw someone else playing it would be struck by a pleasing composition and could stop to watch for a minute, thinking, “This looks good, this looks as if I’m watching a movie.” It really forces you as a designer to struggle to find the best solution for things like inventory. You can’t take the first solution that suggests itself, you have to try to solve it within the constraints you set yourself.

Mechner’s take on visual aesthetics can be seen as a subversion of Ken Williams’s old “ten-foot rule,” which, as you might remember, stated that every Sierra game ought to be visually arresting enough to make someone say “Wow!” when glimpsing it from ten feet away across a crowded shop. Mechner believed that game visuals ought to be more than just striking; they ought to be aesthetically good by the more refined standards of film and the other, even older visual arts. All that time Mechner spent obsessing over films and film-making, which could all too easily be labeled a complete waste of time, actually allowed him to bring something unique to the table, something that made him different from virtually all of his many contemporaries in the interactive-movie business.

There are various ways to situate Jordan Mechner’s work in general and Prince of Persia in particular within the context of gaming history. It can be read as the last great swan song of the Apple II and, indeed, of the entire era of 8-bit computer gaming, at least in North America. It can be read as yet one more example of Brøderbund’s downright bizarre commercial Midas touch, which continued to yield a staggering number of hits from a decidedly modest roster of new releases (Brøderbund also released SimCity in 1989, thus spawning two of the most iconic franchises in gaming history within bare months of one another). It can be read as the precursor to countless cinematic action-adventures and platformers to come, many of whose designers would acknowledge it as a direct influence. In its elegant simplicity, it can even be read as a fascinating outlier from the high-concept complexity that would come to dominate American computer gaming in the very early 1990s. But the reading that makes me happiest is to simply say that Prince of Persia showed how less can be more.

(Sources: Game Design Theory and Practice by Richard Rouse III; The Making of Karateka and The Making of Prince of Persia by Jordan Mechner; Creative Computing of March 1979, September 1979, and May 1980; Next Generation of May 1998; Computer Gaming World of December 1989; Jordan Mechner’s Prince of Persia postmortem from the 2011 Game Developers Conference; “Jordan Mechner: The Man Who Would Be Prince” from Games™; the Jordan Mechner and Brøderbund archives at the Strong Museum of Play.)


Cinemaware’s Year in the Desert

The last year of the 1980s was also the last that the Commodore Amiga would enjoy as the ultimate American game machine. Even as the low-end computer-game market was being pummeled into virtual nonexistence by the Nintendo Entertainment System, leaving the Amiga with little room into which to expand downward, the heretofore business-centric world of MS-DOS was developing rapidly on the high end, with VGA graphics and sound cards becoming more and more common. The observant could already recognize that these developments, combined with Commodore’s lackadaisical attitude toward improving their own technology, must spell serious trouble for the Amiga in the long run.

But for now, for this one more year, things were still going pretty well. Amiga zealots celebrated loudly and proudly at the beginning of 1989 when news broke that the platform had pushed past the magic barrier of 1 million machines sold. As convinced as ever that world domination was just around the corner for their beloved “Amy,” they believed that number would have to lead to her being taken much more seriously by the big non-gaming software houses. While that, alas, would never happen, sales were just beginning to take off in many of the European markets that would sustain the Amiga well into the 1990s.

This last positive development fed directly into the bottom line of Cinemaware, the American software house most closely identified with the Amiga, to a large extent even in Europe. Cinemaware’s founder Bob Jacob wisely forged close ties with the exploding European Amiga market via a partnership with the British publisher Mirrorsoft. In this way he got Cinemaware’s games wide distribution and promotion throughout Europe, racking up sales across the pond under the Mirrorsoft imprint that often dramatically exceeded those Cinemaware was able to generate under their own label in North America. The same partnership led to another welcome revenue stream: the importation of European games into Cinemaware’s home country. Games like Speedball, by the rockstar British developers The Bitmap Brothers, didn’t have much in common with Cinemaware’s usual high-concept fare, but did feed the appetite for splashy, frenetic, often ultra-violent action among American youngsters who had recently found Amiga 500s under their Christmas trees.

Yet Cinemaware’s biggest claim to fame remained their homegrown interactive movies — which is not to say that everyone was a fan of their titular cinematic approach to game-making. A steady drumbeat of criticism, much of it far from unjustified, had accompanied the release of each new interactive movie since the days of Defender of the Crown. Take away all of the music and pretty pictures that surrounded their actual game play, went the standard line of attack, and these games were nothing but shallow if not outright broken exercises in strategy attached to wonky, uninteresting action mini-games. Cinemaware clearly took the criticism to heart despite the sales success they continued to enjoy. Indeed, the second half of the company’s rather brief history can to a large extent be read as a series of reactions to that inescapable negative drumbeat, a series of attempts to show that they could make good games as well as pretty ones.

At first, the new emphasis on depth led to decidedly mixed results. Conflating depth with difficulty in a manner akin to the way that so many adventure-game designers conflate difficulty with unfairness, Cinemaware gave the world Rocket Ranger as their second interactive movie of 1988. It had all the ingredients to be great, but was undone by balance issues exactly the opposite of those which had plagued the prototypical Cinemaware game, Defender of the Crown. In short, Rocket Ranger was just too hard, a classic game-design lesson in the dangers of overcompensation and the importance of extensive play-testing to get that elusive balance just right. With two more new interactive movies on the docket for 1989, players were left wondering whether this would be the year when Cinemaware would finally get it right.

Lords of the Rising Sun

Certainly they showed no sign of backing away from their determination to bring more depth to their games. On the contrary, they pushed that envelope still harder with Lords of the Rising Sun, their first interactive movie of 1989. At first glance, it was a very typical Cinemaware confection, a Defender of the Crown set in feudal Japan. Built like that older game from the tropes and names of real history without bothering to be remotely rigorous about any of it, Lords of the Rising Sun is also another strategy game broken up by action-oriented minigames — the third time already, following Defender of the Crown and Rocket Ranger, that Cinemaware had employed this template. This time, however, a concerted effort was made to beef up the strategy game, not least by making it into a much more extended affair. Lords of the Rising Sun became just the second interactive movie to include a save-game feature, and in this case it was absolutely necessary; a full game could absorb many hours. It thus departed more markedly than anything the company had yet done from Bob Jacob’s original vision of fast-playing, non-taxing, ultra-accessible games. Indeed, with a thick manual and a surprising amount of strategic and tactical detail to keep track of, Lords of the Rising Sun can feel more like an SSI than a typical Cinemaware game once you look past its beautiful audiovisual presentation. Reaching for the skies if not punching above their weight, Cinemaware even elected to include the option of playing the game as an exercise in pure strategy, with the action sequences excised.


But sadly, the strategy aspect is as inscrutable as a Zen koan. While Rocket Ranger presents with elegance and grace a simple strategy game that would be immensely entertaining if it wasn’t always kicking your ass, Lords of the Rising Sun is just baffling. You’re expected to move your armies over a map of Japan, recruiting allies where possible, fighting battles to subdue enemies where not. Yet it’s all but impossible to divine any real sense of the overall situation from the display. This would-be strategy game ends up feeling more random than anything else, as you watch your banners wander around seemingly of their own volition, bumping occasionally into other banners that may represent enemies or friends. It suffers mightily from a lack of clear status displays, making it really, really hard to keep track of who wants to do what to whom. If you have the mini-games turned on, the bird’s-eye view is broken up by arcade sequences that are at least as awkward as the strategy game. In the end, Lords of the Rising Sun is just no fun at all.

While it’s very pretty, Lords of the Rising Sun‘s animated, scrolling map is nicer to look at than it is a practical tool for strategizing.

Press and public alike were notably unkind to Lords of the Rising Sun. Claims like Bob Jacob’s that “there is more animation in Lords than has ever been done in any computer game” — a claim as unquantifiable as it was dubious, especially in light of some of Sierra’s recent efforts — did nothing to shake Cinemaware’s reputation for being all sizzle, no steak. Ken St. Andre of Tunnels & Trolls and Wasteland fame, reviewing the game for Questbusters magazine, took Cinemaware to task on its every aspect, beginning with the excruciating picture on the box of a cowering maiden about to fall out of her kimono; he deemed it “an insult to women everywhere and to Japanese culture in particular.” (Such a criticism sounds particularly forceful coming from St. Andre; Wasteland with its herpes-infested prostitutes and all the rest is hardly a bastion of political correctness.) He concluded his review with a zinger so good I wish I’d thought of it: he called the game “a Japanese Noh play.”

Many other reviewers, while less boldly critical, seemed nonplussed by the whole experience — a very understandable reaction to the strategy game’s vagaries. Sales were disappointing in comparison to those of earlier interactive movies, and the game has gone down in history alongside the equally underwhelming S.D.I. as perhaps the least remembered of all the Cinemaware titles.

It Came from the Desert

So, what with the game-play criticisms beginning to affect the bottom line, Cinemaware really needed to deliver something special for their second game of 1989. Thankfully, It Came from the Desert would prove to be the point where they finally got this interactive-movie thing right, delivering at long last a game as nice to play as it is to look at.


It Came from the Desert was the first of the interactive movies not to grow from a seed of an idea planted by Bob Jacob himself. Its originator was rather David Riordan, a newcomer to the Cinemaware fold with an interesting career in entertainment already behind him. As a very young man, he’d made a go of it in rock music, enjoying his biggest success in 1970 with a song called “Green-Eyed Lady,” a #3 hit he co-wrote for the (briefly) popular psychedelic band Sugarloaf. A perennial on Boomer radio to this day, that song’s royalties doubtless went a long way toward letting him explore his other creative passions after his music career wound down. He worked in movies for a while, and then worked with MIT on a project exploring the interactive potential of laser discs. After that, he worked briefly for Lucasfilm Games during their heady early days with Peter Langston at the helm. And from there, he moved on to Atari, where he worked on laser-disc-driven stand-up arcade games until it became obvious that Dragon’s Lair and its spawn had been the flashiest of flashes in the pan.

David Riordan on the job at Cinemaware.

Riordan’s resume points to a clear interest in blending cinematic approaches with interactivity. It thus comes as little surprise that he was immediately entranced when he first saw Defender of the Crown one day at his brother-in-law’s house. It had, he says, “all the movie attributes and approaches that I had been trying to get George Lucas interested in” while still with Lucasfilm. He wrote to Cinemaware, sparking up a friendship with Bob Jacob which led him to join the company in 1988. Seeing in Riordan a man who very much shared his own vision for Cinemaware, Jacob relinquished a good deal of the creative control onto which he had heretofore held so tightly. Riordan was placed in charge of the company’s new “Interactive Entertainment Group,” which was envisioned as a production line for cranking out new interactive movies of far greater sophistication than those Cinemaware had made to date. These latest and greatest efforts were to be made available on a whole host of platforms, from their traditional bread and butter the Amiga to the much-vaunted CD-based platforms now in the offing from a number of hardware manufacturers. If all went well, It Came from the Desert would mark the beginning of a whole new era for Cinemaware.

Here we can see, just barely (sorry for this picture’s terrible fidelity), Cinemaware’s interactive-movie scripting tool, which they dubbed MasterPlan, running in HyperCard.

Cinemaware spent months building the technology that would allow them to make It Came from the Desert. Riordan’s agenda can best be described as a desire to free game design from the tyranny of programmers. If this new medium was to advance sufficiently to tell really good, interesting interactive stories, he reasoned, its tools would have to become something that non-coding “real” writers could successfully grapple with. Continuing to advance Cinemaware’s movie metaphors, his team developed a game engine that could largely be “scripted” in point-and-click fashion in HyperCard rather than needing to be programmed in any conventional sense. Major changes to the structure of a game could be made without ever needing to write a line of code, simply by editing the master plan of the game in a HyperCard tool Cinemaware called, appropriately enough, MasterPlan. The development process leveraged the best attributes of a number of rival platforms: Amigas ran the peerless Deluxe Paint for the creation of art; Macs ran HyperCard for the high-level planning; fast IBM clones served as the plumbing of the operation, churning through compilations and compressions. It was by anyone’s standards an impressive collection of technology — so impressive that the British magazine ACE, after visiting a dozen or more studios on a sort of grand tour of the American games industry, declared Cinemaware’s development system the most advanced of them all. Cinemaware had come a long way from the days of Defender of the Crown, whose development process had consisted principally of locking programmer R.J. Mical into his office with a single Amiga and a bunch of art and music and not letting him out again until he had a game. “If we ever get a real computer movie,” ACE concluded, “this is where it’s going to come from.”
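To make the idea concrete, here is a loose sketch of what such a data-driven approach looks like in general. The scene names, fields, and tiny interpreter below are invented for illustration and are not Cinemaware’s MasterPlan format; they simply show how a “script” that is plain data lets a writer reshuffle a game without touching engine code.

```python
# Illustrative only: a "master plan" as plain data, walked by a tiny engine.
# Reordering scenes or adding choices means editing the data, not the code.
SCENES = {
    "cabin": {
        "text": "You wake in your cabin outside Lizard Breath.",
        "choices": {"Drive into town": "town", "Hike out to the crater": "crater"},
    },
    "town": {
        "text": "The mayor doesn't believe a word of it.",
        "choices": {"Gather more evidence": "crater", "Go home": "cabin"},
    },
    "crater": {
        "text": "Something enormous has been digging out here...",
        "choices": {"Head back to town": "town"},
    },
}

def run(scenes, start="cabin"):
    """Walk the scene table, printing text and following the player's choices."""
    scene = start
    while True:
        node = scenes[scene]
        print(node["text"])
        labels = list(node["choices"])
        for i, label in enumerate(labels, 1):
            print(f"  {i}. {label}")
        pick = input("> ")
        if pick.isdigit() and 1 <= int(pick) <= len(labels):
            scene = node["choices"][labels[int(pick) - 1]]

# run(SCENES)  # uncomment to try the toy interpreter; Ctrl-C to quit
```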

It Came from the Desert

While it’s debatable whether It Came from the Desert quite rises to that standard, it certainly is Cinemaware’s most earnest and successful attempt at crafting a true interactive narrative since King of Chicago. The premise is right in their usual B-movie wheelhouse. Based loosely on the campy 1950s classic Them!, the game takes place in a small desert town with the charming appellation of Lizard Breath that’s beset by an alarming number of giant radioactive ants, product of a recent meteor strike. You play a geologist in town; “the most interesting rocks always end up in the least interesting places,” notes the introduction wryly. Beginning in your cabin, you can move about the town and its surroundings as you will, interacting with its colorful cast of inhabitants via simple multiple-choice dialogs and getting into scrapes of various sorts which lead to the expected Cinemaware action sequences. Your first priority is largely to convince the townies that they have a problem in the first place; this task you can accomplish by collecting enough evidence of the threat to finally gain the attention of the rather stupefyingly stupid mayor. Get that far, and you’ll be placed in charge of the town’s overall defense, at which point a strategic aspect joins the blend of action and adventure to create a heady brew indeed. Your ultimate goal, which you have just fifteen days in total to accomplish, is to find the ants’ main nest and kill the queen.

It Came from the Desert excels in all the ways that most of Cinemaware’s interactive movies excel. The graphics and sound were absolutely spectacular in their day, and still serve very well today; you can well-nigh taste the gritty desert winds. What makes it a standout in the Cinemaware catalog, however, is the unusual amount of attention that’s been paid to the design — to your experience as the player. A heavily plot-driven game like this could and usually did go only one way in the 1980s. You probably know what I’m picturing: a long string of choke points requiring you to be in just the right place at just the right time to avoid being locked out of victory. Thankfully, It Came from the Desert steers well away from that approach. The plot is a dynamic thing rolling relentlessly onward, but your allies in the town are not entirely without agency of their own. If you fail to accomplish something, someone else might just help you out — perhaps not as quickly or efficiently as one might ideally wish, but at least you still feel you have a shot.

And even without the townies’ help, there are lots of ways to accomplish almost everything you need to. The environment as a whole is remarkably dynamic, far from the static set of puzzle pieces so typical of more traditional adventure games of this era and our own. There’s a lot going on under the hood in this one, far more than Cinemaware’s previous games would ever lead one to expect. Over the course of the fifteen days, the town’s inhabitants go from utterly unconcerned about the strange critters out there in the desert to full-on, backs-against-the-wall, fight-or-flight panic mode. By the end, when the ants are roaming at will through the rubble that once was Lizard Breath destroying anything and anyone in their path, the mood feels far more apocalyptic than that of any number of would-be “epic” games. One need only contrast the frantic mood at the end of the game with the dry, sarcastic tone of the beginning — appropriate to an academic stranded in a podunk town — to realize that one really does go on a narrative journey over the few hours it takes to play.

Which brings me to another remarkable thing: you can’t die in It Came from the Desert. If you lose at one of the action games, you wake up in the hospital, where you have the option of spending some precious time recuperating or trying to escape in shorter order via another mini-game. (No, I have no idea why a town the size of Lizard Breath should have a hospital.) In making sure that every individual challenge or decision doesn’t represent an all-or-nothing proposition, It Came from the Desert leaves room for the sort of improvisational derring-do that turns a play-through into a memorable, organic story. It’s not precisely that knowledge of past lives isn’t required; you’re almost certain to need several tries to finally save Lizard Breath. Yet each time you play you get to live a complete story, even if it is one that ends badly. Meanwhile you’re learning the lay of the land, learning to play more efficiently and getting steadily better at the action games, which are themselves unusually varied and satisfying by Cinemaware’s often dodgy standards. There are not just many ways to lose It Came from the Desert but also many paths to victory. Win or lose, your story in It Came from the Desert is your story; you get to own it. There’s a save-game feature, but I don’t recommend that you use it except as a bookmark when you really do need to do something else for a while. Otherwise just play along and let the chips fall where they may. At last, here we have a Cinemaware interactive movie that’s neither too easy nor too hard; this one is just right, challenging but not insurmountable.

It Came from the Desert evolves into a strategy game among other things, as you deploy the town’s forces to battle each new ant infestation while you continue the search for the main hive.

Widely and justifiably regarded among the old-school Amiga cognoscenti of today as Cinemaware’s finest hour, It Came from the Desert was clearly seen as something special within Cinemaware as well back in the day; one only has to glance at contemporary comments from those who worked on the game to sense their pride and excitement. There was a sense both inside and outside their offices that Cinemaware was finally beginning to crack a nut they’d been gnawing on for quite some time. Even Ken St. Andre was happy this time. “Cinemaware’s large creative team has managed to do a lot of things very well indeed in this game,” he wrote, “and as a result they have produced a game that looks great, sounds great, moves along at a rapid pace, is filled with off-the-wall humor without being dumb, and is occasionally both gripping and exciting.”

When It Came from the Desert proved a big commercial success, Cinemaware pulled together some ideas that had been left out of the original game due to space constraints, combined them with a plot involving the discovery of a second ant queen, and made it all into a sequel subtitled Ant-Heads!. Released at a relatively low price only as an add-on for the original game — thus foreshadowing a practice that would get more and more popular as the 1990s wore on — Ant-Heads! was essentially a new MasterPlan script that utilized the art and music assets from the original game, a fine demonstration of the power of Cinemaware’s new development system. It upped the difficulty a bit by straitening the time limit from fifteen days to ten, but otherwise played much like the original — which, considering how strong said original had been, suited most people just fine.

It Came from the Desert, along with the suite of tools used to create it, might very well have marked the start of exactly the new era of more sophisticated Cinemaware interactive movies that David Riordan had intended it to. As things shook out, however, it would have more to do with endings than beginnings. Cinemaware would manage just one more of these big productions before being undone by bad decisions, bad luck, and a changing marketplace. We’ll finish up with the story of their visionary if so often flawed games soon. In the meantime, by all means go play It Came from the Desert if time and motivation allow. I was frankly surprised at how well it still held up when I tackled it recently, and I think it just might surprise you as well.

(Sources: The One from April 1989, June 1989, and June 1990; ACE from April 1990; Commodore Magazine from November 1988; Questbusters from September 1989, February 1990, and May 1990; Matt Barton’s interview with Bob Jacob on Gamasutra.)
