Acorn and Amstrad

…he explains to her that Sinclair, the British inventor, had a way of getting things right, but also exactly wrong. Foreseeing the market for affordable personal computers, Sinclair decided that what people would want to do with them was to learn programming. The ZX81, marketed in the United States as the Timex 1000, cost less than the equivalent of a hundred dollars, but required the user to key in programs, tapping away on that little motel keyboard-sticker. This had resulted both in the short market-life of the product and, in Voytek’s opinion, twenty years on, in the relative preponderance of skilled programmers in the United Kingdom. They had had their heads turned by these little boxes, he believes, and by the need to program them. “Like hackers in Bulgaria,” he adds, obscurely.

“But if Timex sold it in the United States,” she asks him, “why didn’t we get the programmers?”

“You have programmers, but America is different. America wanted Nintendo. Nintendo gives you no programmers…”

— William Gibson, Pattern Recognition

A couple of years ago I ventured out of the man cave to give a talk about the Amiga at a small game-development conference in Oslo. I blazed through as much of the platform’s history as I could in 45 minutes or so, emphasizing for my audience of mostly young students from a nearby university the Amiga’s status as the preeminent gaming platform in Europe for a fair number of years. They didn’t take much convincing; even this crowd, young as they were, had their share of childhood memories involving Amiga 500s and 1200s. Mostly they seemed surprised that the Amiga hadn’t ever been all that terribly popular in the United States. During the question-and-answer session, someone asked a question that stopped me short: if American kids hadn’t been playing games on their Amigas, just what the hell had they been playing on?

The answer itself wasn’t hard to arrive at: the sorts of kids who migrated from 8-bit Sinclairs, Acorns, Amstrads, and Commodores to 16-bit Amigas and Atari STs in Britain made a much more lateral move in the United States, migrating to the 8-bit Nintendo Entertainment System.

More complex and interesting are the ramifications of these trends. Because the Atari VCS console was never a major presence in Britain and the rest of Europe during its heyday, and because Nintendo arrived only very belatedly, for many years videogames played in the home there meant games played on home computers. Much could be said about how it shaped the market across Europe that most people’s favored platform was a device suited to creation as well as consumption. The magazines were filled with stories of bedroom gamers who had become bedroom coders and finally Software Stars. Such stories stand in marked contrast to an American console-gaming magazine like Nintendo Power, which was all about consumption without any accompanying ethos of creation.

But most importantly for our purposes today, the relative neglect of Britain in particular by the big computing powers in the United States and Japan — for many years, Commodore was the only company of either nation to make a serious effort to sell their machines into British homes — gave space for a flourishing domestic trade in homegrown machines. When Britain became the nation with the most computers per capita on the planet at mid-decade, most of the computers in question bore the logo of either Acorn or Sinclair, the two great rivals at the heart of the young British microcomputer industry.

Acorn, co-founded by Clive Sinclair’s former right-hand man Chris Curry and an Austrian academic named Hermann Hauser, was an archetypal example of an engineering-driven company. Their machines were a little more baroque, a little better built, and consequently a little more expensive than they needed to be, while their public persona was reserved and just a little condescending, much like that of the BBC that had given its official imprimatur to Acorn’s most popular machine, the BBC Micro. Sinclair, despite “Uncle Clive’s” public reputation as the British Inspector Gadget, was just the opposite of Acorn: cheap and cheerful, with the common touch. Acorns sold to the educators, to the serious hobbyists, and to the posh, while Sinclairs dominated with the masses.

Yet Acorn and Sinclair were similar in one important respect: they were both in their own ways very poorly managed companies. When the British home-computer market hit an iceberg in 1985, both were caught in untenable positions, drowning in excess inventory. Acorn — quintessentially British, based in the storied heart of Britain’s “Silicon Fen” of Cambridge — was faced with a choice between dissolution and selling themselves to the Italian typewriter manufacturer Olivetti; after some hand-wringing, they chose the latter course. Sinclair also sold out: to the new kid on the block of British computing, Amstrad, owned by a gruff Cockney with a penchant for controversy named Alan Sugar who was well on his way to becoming the British Donald Trump.

Ever mindful of the practical concerns of their largely working-class customers, Amstrad made much of the CPC’s bundled monitor in their advertising, noting that Junior could play on the CPC without tying up the family television.

Amstrad had already been well-established as a maker of inexpensive stereo equipment and other consumer electronics when their first computers, the CPC (“Colour Personal Computer”) line, debuted in June of 1984. The CPC range was created and sold as a somewhat more capable Sinclair Spectrum. It consisted of well-built and smartly priced if technically unimaginative computers that were fine choices for gaming, boasting as they did reasonably good if hardly revolutionary graphics and sound. Like most Amstrad products, they strained to be as easy to use as possible, shipping as complete units — tape or disk drive and monitor included — at a time when virtually all of their rivals had to be assembled piece by piece via separate purchases.

The CPC line did very well from the outset, even as Acorn and Sinclair were soon watching their own sales implode. Pundits attributed the line’s success to what they called “the Amstrad Effect”: Alan Sugar’s instinct for delivering practical products at a good price at the precise instant when the technology behind them was ready for the mass market — i.e., was about to become desirable to his oft-stated target demographic of “the truck driver and his wife.” Sugar preferred to let others advance the technical state of the art, then swoop in to reap the rewards of their innovations when the time was right. The CPC line was a great example of him doing just that.

But the most dramatic and surprising iteration of the Amstrad Effect didn’t just feed the existing market for colorful game machines; it found an entirely new market segment, one that Amstrad’s competitors had completely missed. The story of the creation of the Amstrad PCW line is a classic tale of Alan Sugar, a man who knew almost nothing about computers but knew all he needed to about the people who bought them.

One day just a few months after the release of the first CPC machines, Sugar found himself in an airplane over Asia with Bob Watkins, one of his most trusted executives. A restless Sugar asked Watkins for a piece of paper, and proceeded to draw on it a contraption that included a computer, a monitor, a disk drive, and a printer, all in one unit. Looking at the market during the run-up to the CPC launch, Sugar had recognized that the only true mainstream uses for the current generation of computers in the home were as game machines and word processors. With the CPC, he had the former application covered. But what about the latter? All of the inexpensive machines currently on the market, like the Sinclair Spectrum, were oriented toward playing games rather than word processing, trading the possibility of displaying crisp 80-column text for colorful graphics in lower resolutions. Meanwhile all of the more expensive ones, like the BBC Micro, were created by and for hardcore techies rather than Sugar’s truck drivers. If they could apply their patented technology-for-the-masses approach to a word processor for the home and small business — making a cheap, well-built, all-in-one design emphasizing ease of use for the common person — Amstrad might just have another hit on their hands, this time in a market of their own utterly without competition. Internally, the project was named after Sugar’s secretary Joyce, since it would hopefully make her job and those of many like her much easier. It would eventually come to market as the “PCW,” or “Personal Computer Word Processor.”

The first Amstrad PCW machine, complete with bundled printer. Note how the disk drive and the computer itself are built into the same case as the monitor, a very unusual design for the period.

Even more so than the CPC, the PCW was a thoroughly underwhelming package for technophiles. It was built around the tried-and-true Z80 8-bit CPU and ran CP/M, an operating system already considered obsolete by big business, MS-DOS having become the standard in the wake of the IBM PC. The bundled word-processing software, contracted out to a company called Locomotive Software, wasn’t likely to impress power users of WordStar or WordPerfect overmuch — but it was, in keeping with the Amstrad philosophy, unusually friendly and easy to use. Sugar knew his target customers, knew that they “didn’t give a shit whether there was an elastic band or an 8086 or a 286 driving the thing. They wouldn’t know what you were talking about.”

As usual, most of Amstrad’s hardware-engineering efforts went into packaging and cost-cutting. It was decided that the printer would have to be housed separately from the system unit for technical reasons, but otherwise the finished machine conformed remarkably well to Sugar’s original vision. Best of all, it had a price of just £399. By way of comparison, Acorn’s most recent BBC Micro Model B+ had half as much memory and no disk drive, monitor, or printer included — and was priced at £499.

Nervous as ever about intimidating potential customers, Amstrad was at pains to market the PCW first and foremost as a turnkey word-processing solution for homes and small businesses, as a general-purpose computer only secondarily if at all. “It’s more than a word processor for less than most typewriters,” ran their tagline. At the launch event in the heart of the City in August of 1985, three female secretaries paraded across the stage: a snooty one who demanded one of the competition’s expensive computer systems; a tarty one who said a typewriter was more than good enough; and a smart, reasonable one who naturally preferred the PCW. Man-of-the-people Sugar crowed extravagantly that Amstrad had “brought word-processing within the reach of every small business, one-man band, home-worker, and two-finger typist in the country.” Harping on one of his favorite themes, he noted that once again Amstrad had “produced what the customer wants and not a boffin’s ego trip.”

Sugar’s aggressive manner may have grated on many buttoned-down trade journalists, but few could deny that he might just open up a whole new market for computers with the PCW. Electrical Retailer and Trader was typical, calling the PCW “a grown-up computer that does something people want, packaged and sold in a way they can understand, at a price they’ll accept.” But even that note of optimism proved far too mild for the reality of the machine’s success. The PCW exploded out of the gate, selling 350,000 units in the first eight months. It probably could have sold a lot more than that, but Amstrad, caught off-guard by the sales numbers despite their founder’s own bullishness on the product, couldn’t make and ship them fast enough.

Level 9’s Time and Magik text adventure running on a PCW.

Surprisingly for such a utilitarian package, the PCW garnered considerable loyalty and even love among the millions in Britain and all across Europe who eventually bought one. Their enthusiasm was enough to sustain a big, glossy newsstand magazine dedicated to the PCW alone — an odd development indeed for this machine that seemed on the face of it to be anything but a hacker’s darling. A thriving software ecosystem that reached well beyond word processing sprang up around the machine. Despite the PCW’s monochrome display and virtually nonexistent animation and sound capabilities, even games were far from unheard of on the platform. For obvious reasons, text adventures in particular became big favorites of PCW owners; with its comfortable full-travel keyboard, its fast disk drive, its relatively cavernous 256 K of memory, and its 80-column text display, a PCW was actually a far better fit for the genre than the likes of a Sinclair Spectrum. The PCW market for text adventures was strong enough to quite possibly allow companies like Magnetic Scrolls and Level 9 to hang on a year or two longer than they might otherwise have managed.

So, Amstrad was already soaring on the strength of the CPC and especially the PCW when they shocked the nation and cemented their position as the dominant force in mainstream British computing with the acquisition of Sinclair in April of 1986. Eminently practical man of business that he was, Sugar bought Sinclair partly to eliminate a rival, but also because he realized that, home-computer slump or no, the market for a machine as popular as the Sinclair Spectrum wasn’t likely to just disappear overnight. He could pick up right where Uncle Clive had left off, selling the existing machine just as it was to new buyers who wanted access to the staggering number of cheap games available for the platform. Sugar thought he could make a hell of a lot of money this way while needing to expend very little effort.

Once again, time proved him more correct than even he had ever imagined. Driven by that huge base of games, demand for new Spectrums persisted into the 1990s. Amstrad repackaged the technology from time to time and, perhaps most importantly, dramatically improved on Sinclair’s infamously shoddy quality control. But they never seriously re-imagined the Spectrum. It was now what Sugar liked to call “a commodity product.” He compared it to suntan lotion of all things: the department stores “put it in their window in July and August and they take it away in the winter.” The Spectrum’s version of July and August was of course November and December; every Christmas sparked a new rush of sales to the parents of a new group of youngsters just coming of age and discovering the magic of videogames.

A battered and uncertain Acorn, now a subsidiary of Olivetti, faced a formidable rival indeed in Alan Sugar’s organization. In a sense, the fundamental dichotomies hadn’t changed that much since Amstrad took Sinclair’s place as the yin to Acorn’s yang. Acorn remained as technology-driven as ever, while Amstrad was all about giving the masses what they craved in the form of cheap computers that were technically just good enough. Amstrad, however, was a much more dangerous form of people’s computer company than had been their predecessor in the role. After releasing some notoriously shoddy stereo equipment under the Amstrad banner in the 1970s and paying the price in returns and reputation, Alan Sugar had learned a lesson that continued to elude Clive Sinclair: that selling well-built, reliable products, even at a price of a few more quid on the final price tag and/or a few less in the profit margin, pays off more than corner-cutting in the long run. Unlike Uncle Clive, who had bumbled and stumbled his way to huge success and just as quickly back to failure, Sugar was a seasoned businessman and a master marketer. The diffident boffins of Acorn looked destined to have a hard time against a seasoned brawler like Sugar, raised on the mean streets of the cutthroat Tottenham Court Road electronics trade. It hardly seemed a fair fight at all.

But then, in the immediate wake of their acquisition by Olivetti, nothing boded particularly well for Acorn. New hardware releases were limited to enhanced versions of the 1981-vintage, 8-bit BBC Micro line that were little more ambitious than Amstrad’s re-packagings of the Spectrum. It was an open secret that Acorn was putting much effort into designing a new CPU in-house to serve as the heart of their eventual next-generation machine, an unprecedented step in an industry where CPU-makers and computer-makers had always been separate entities. For many, it seemed yet one more example of Acorn’s boffinish tendencies getting the best of them, causing them to laboriously reinvent the wheel rather than do what the rest of the microcomputer world was doing: grabbing a 68000 from Motorola or an 80286 from Intel and just getting on with the 16-bit machine their customers were clamoring for. While Acorn dithered with their new chip, they continued to fall further and further behind Amstrad, who in the wake of the Sinclair acquisition had now gone from a British home-computer market share of 0 to 60 percent in less than two years. Acorn was beginning to look downright irrelevant to many Britons in the market for the sorts of affordable, practical computer systems Amstrad was happily providing them with by the bucketful.

Measured in terms of public prominence, Acorn’s best days were indeed already behind them; they would never recapture those high-profile halcyon days of the early 1980s, when the BBC Micro had first been anointed as the British establishment’s officially designated choice for those looking to get in on the ground floor of the computer revolution. Yet the new CPU they were now in the midst of creating, far from being a pointless boondoggle, would ultimately have a far greater impact than anything they’d done before — and not just in Britain but over the entire world. For the CPU architecture Acorn was creating in those uncertain mid-1980s was the one that has gone on to become the most popular ever: the ubiquitous ARM. “ARM” originally stood for “Acorn RISC Machine”; only later was it retrofitted into “Advanced RISC Machine.” Needless to say, no one at Acorn had any idea of the monster they were creating. How could they?

ARM, the chip that changed the world.

“RISC” stands for “Reduced Instruction Set Computer.” The idea didn’t originate with Acorn, but had already been kicking around American university and corporate engineering departments for some time. (As Hermann Hauser later wryly noted, “Normally British people invent something, and the exploitation is in America. But this is a counterexample.”) Still, the philosophy behind ARM was adhered to by only a strident minority before Acorn first picked it up in 1983.

The overwhelming trend in commercial microprocessor design up to that point had been for chips to offer ever larger and more complex instruction sets. By making “opcodes” — single instructions issued directly to the CPU — capable of doing more in a single step, machine-level code could be made more comprehensible for programmers and the programs themselves more compact. RISC advocates came to call this traditional approach to CPU architecture “CISC,” or “Complex Instruction Set Computing.” They believed that CISC was becoming increasingly counterproductive with each new generation of microprocessors. Seeing how the price and size of memory chips continued to drop significantly almost every year, they judged — in the long term, correctly — that memory usage would become much less important than raw speed in future computers. They therefore also judged that it would be more than acceptable in the future to trade smaller programs for faster ones. And they judged that they could accomplish exactly that trade-off by traveling directly against the prevailing winds in CPU design — by making a CPU that offered a radically reduced instruction set of extremely simple opcodes that were each ruthlessly optimized to execute very, very quickly.

A program written for a RISC processor might need to execute far more opcodes than the same program written for a CISC processor, but those opcodes would execute so quickly that the end result would still be a dramatic increase in throughput. Yes, it would use more memory, and, yes, it would be harder to read as machine code — but already fewer and fewer people were programming computers at such a low level anyway. The trend, which they judged likely only to accelerate, was toward high-level languages that abstracted away the details of processor design. In this prediction again, time would prove the RISC advocates correct. Programs might not even need to be as much larger as one might think; RISC advocates argued, with some evidence to back up their claims, that few programs really took full advantage of the more esoteric opcodes of the CISC chips, that the CISC chips were in effect being programmed as if they were RISC chips much of the time anyway. In short, then, a small but not insubstantial minority of academic and corporate researchers were beginning to believe that the time was ripe to replace CISC with RISC.
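To make the trade-off concrete, here’s a minimal back-of-the-envelope sketch. The instruction counts and cycle times below are purely illustrative assumptions of mine, not measurements of any real chip; the point is only to show how a program with 50 percent more opcodes can still finish in a fraction of the time.

```python
# Toy model of the RISC bet: a program needs more instructions,
# but each one completes in far fewer clock cycles.
# All numbers below are illustrative assumptions, not real measurements.

def runtime_seconds(instructions, avg_cycles_per_instruction, clock_hz):
    """Total time = instruction count x average cycles each / clock rate."""
    return instructions * avg_cycles_per_instruction / clock_hz

clock = 8_000_000  # both hypothetical chips running at 8 MHz

cisc = runtime_seconds(1_000_000, 6.0, clock)  # fewer, complex opcodes
risc = runtime_seconds(1_500_000, 1.5, clock)  # 50% more, but simple opcodes

print(f"CISC: {cisc * 1000:.0f} ms")  # 750 ms
print(f"RISC: {risc * 1000:.0f} ms")  # 281 ms -- faster despite more opcodes
```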

And now Acorn was about to act on that belief. In typical boffinish fashion, their ARM project was begun as essentially a personal passion project by Roger Wilson [1] and Steve Furber, two key engineers behind the original BBC Micro. Hermann Hauser admits that for quite some time he gave them “no people” and “no money” to help with the work, making ARM “the only microprocessor ever to be designed by just two people.” When talks began with Olivetti in early 1985, ARM remained such a back-burner long shot that Acorn never even bothered to tell their potential saviors about it. But as time went on, the ARM chip came more and more to the fore as potentially the best thing Acorn had ever done. Having, almost perversely in the view of many, refused for so long to produce a 16-bit replacement for the BBC Micro line, Acorn now proposed to leapfrog that generation entirely; the ARM, you see, was a 32-bit chip. Early tests of the first prototype in April of 1985 showed that at 8 MHz it yielded an average throughput of about 3.5 MIPS, compared to 2.5 MIPS at 10 MHz for the 68020, the first 32-bit entry in Motorola’s popular 68000 line of CISC processors. (Normalized for clock speed, that works out to roughly 0.44 MIPS per MHz against the 68020’s 0.25.) And the ARM was much, much cheaper and simpler to produce than the 68020. It appeared that Wilson and Furber’s shoestring project had yielded a world-class microprocessor.

ARM made its public bow via a series of little-noticed blurbs that appeared in the British trade press around October of 1985, even as the stockbrokers in the City and BBC Micro owners in their homes were still trying to digest the news of Acorn’s acquisition by Olivetti. Acorn was testing a new “super-fast chip,” announced the magazine Acorn User, which had “worked the first time”: “It is designed to do a limited set of tasks very quickly, and is the result of the latest thinking in chip design.” From such small seeds do great empires grow.

The Acorn Archimedes

The machine that Acorn designed as a home for the new chip was called the Acorn Archimedes — or at times, because Acorn had been able to retain the official imprimatur of the BBC, the BBC Archimedes. It was on the whole a magnificent piece of kit, in a different league entirely from the competition in terms of pure performance. It was, for instance, several times faster than a 68000-based Amiga, Macintosh, or Atari ST in many benchmarks despite running at a clock speed of just 8 MHz, roughly the same as all of the aforementioned competitors. Its graphics capabilities were almost as impressive, offering 256 colors onscreen at once from a palette of 4096 at resolutions as high as 640 × 512. So, Acorn had the hardware side of the house well in hand. The problem was the software.

Graphical user interfaces being all the rage in the wake of the Apple Macintosh’s 1984 debut, Acorn judged that the Archimedes as well had to be so equipped. Deciding to go to the source of the world’s very first GUI, they opened a new office for operating-system development a long, long way from their Cambridge home: right next door to Xerox’s famed Palo Alto Research Center, in the heart of California’s Silicon Valley. But the operating-system team’s progress was slow. Communication and coordination were difficult over such a distance, and the team seemed to be infected with the same preference for abstract research over practical product development that had always marked Xerox’s own facility in Palo Alto. The new operating system, to be called ARX, lagged far behind hardware development. “It became a black hole into which we poured effort,” remembers Wilson.

At last, with the completed Archimedes hardware waiting only on some software to make it run, Acorn decided to replace ARX with something they called Arthur, a BASIC-based operating environment very similar to the old BBC BASIC with a rudimentary GUI stuck on top. “All operating-system geniuses were firmly working on ARX,” says Wilson, “so we couldn’t actually spare any of the experts to work on Arthur.” The end result did indeed look like something put together by Acorn’s B team. Parts of Arthur were actually written in interpreted BASIC, which Acorn was able to get away with thanks to the blazing speed of the Archimedes hardware. Still, running Arthur on hardware designed for a cutting-edge Unix-like operating system with preemptive multitasking and the whole lot was rather like dropping a two-speed gearbox into a Lamborghini; it got the job done, after a fashion, but felt rather against the spirit of the thing.

When the Archimedes debuted in August of 1987, its price tag of £975 and up along with all of its infelicities on the software side gave little hope to those not blinded with loyalty to Acorn that this extraordinary machine would be able to compete with Amstrad’s good-enough models. The Archimedes was yet another Acorn machine for the boffins and the posh. Most of all, though, it would be bought by educators who were looking to replace aging BBC Micros and might still be attracted by the BBC branding and the partial compatibility of the new machine with the old, thanks to software emulators and the much-loved BBC BASIC still found as the heart of Arthur.

Even as Amstrad continued to dominate the mass market, a small but loyal ecosystem sprang up around the Archimedes, enough to support a software scene strong on educational software and technical tools for programming and engineering, all a natural fit for the typical Acorn user. And, while the Archimedes was never likely to become the first choice for pure game lovers, a fair number of popular games did get ported. After all, even boffins and educators — or, perhaps more likely, their students — liked to indulge in a bit of pure fun sometimes.

In April of 1989, after almost two long, frustrating years of delays, Acorn released a revision of Arthur comprehensive enough to be given a whole new name. The new RISC OS incorporated many if not all of the original ambitions for ARX, at last providing the Archimedes with an attractive modern operating system worthy of its hardware. But by then, of course, it was far too late to capture the buzz a more complete Archimedes package might have garnered at its launch back in 1987.

Much to the frustration of many of their most loyal customers, Acorn still seemed not so much inept at marketing their wares to the common person as completely uninterested in doing so. It was as if they felt themselves somehow above it all. Perhaps they had taken a lesson from their one earlier attempt to climb down from their ivory tower and sell a computer for the masses. That attempt had taken the form of the Acorn Electron, a cut-down version of the BBC Micro released in 1983 as a direct competitor to the Sinclair Spectrum. Poor sales and overproduction of the Electron had been the biggest single contributor to Acorn’s mid-decade financial collapse and the loss of their independence to Olivetti. Having survived that trauma (after a fashion), Acorn seemed content to tinker away with technology for its own sake and to let the chips fall where they would when it came to actually selling the stuff that resulted.

Alan Sugar shows off the first of his new line of PC clones.

If it provided any comfort to frustrated Acorn loyalists, Amstrad also began to seem more and more at sea after their triumphant first couple of years in the computer market. In September of 1986, they added a fourth line of computers to their catalog with the release of the PC — as opposed to PCW — range. As the first range of IBM clones targeted at the British mass market, the Amstrad PC line might have played a role in its homeland similar to that of the Tandy 1000 in the United States, popularizing these heretofore business-centric machines among home users. As usual with Amstrad, the price certainly looked right for the task. The cheapest Amstrad PC model, with a generous 512 K of memory but no hard drive, cost £399; the most expensive, which included a 20 MB hard drive, £949. Before the Amstrad PC’s release, the cheapest IBM clone on the British market had retailed for £1429.

But, while not a flop, the PC range never took off quite as meteorically as some had expected. For months the line was dogged by reports of overheating brought on by the machine’s lack of a fan (shades of the Apple III fiasco) that may or may not have had a firm basis in fact. Alan Sugar himself was convinced that the reports could be traced back to skulduggery by IBM and other clone manufacturers trying to torpedo his cheaper machines. When he finally bowed to the pressure to add a fan, he did so as gracelessly as imaginable.

I’m a realistic person and we are a marketing organization, so if it’s the difference between people buying the machine or not, I’ll stick a bloody fan in it. And if they say they want bright pink spots on it, I’ll do that too. What is the use of me banging my head against a brick wall and saying, “You don’t need the damn fan, sunshine?”

But there were other problems as well, problems that were less easily fixed. Amstrad struggled to source hard disks, which had proved a far more popular option than expected, resulting in huge production backlogs on many models. And, worst of all, they found that they had finally overreached themselves by setting the prices too low to be realistically sustainable; prices began to creep upward almost immediately.

For that matter, prices were creeping upward across Amstrad’s entire range of computers. In 1986, after years of controversy over the alleged dumping of memory chips onto the international market by the Japanese semiconductor industry, the United States pressured Japan into signing a trade pact that would force them to throttle back their production and increase their prices. Absent the Japanese deluge, however, there simply weren’t enough memory chips being made in the world to fill an ever more voracious demand. By 1988, the situation had escalated into a full-blown crisis for volume computer manufacturers like Amstrad, who couldn’t find enough memory chips to build all the computers their customers wanted — and certainly not at the prices their customers were used to paying for them. In 1988, for the first time in years, Amstrad’s annual sales declined, after the company was forced to raise prices and cut production dramatically due to the memory shortage. Desperate to secure a steady supply of chips so he could ramp up production again, Sugar bought into Micron Technology, one of only two American firms making memory chips, in October of 1988 to the tune of £45 million. But within a year the memory-chip crisis, anticipated by virtually everyone at the time of the Micron buy-in to go on for years yet, petered out when factories in other parts of Asia began to come online with new technologies to produce memory chips more cheaply and quickly than ever. Micron’s stock plummeted, another major loss for Amstrad. The buy-in hadn’t been “the greatest deal I’ve ever done,” admitted Sugar.

Many saw in the Amstrad of these final years of the 1980s an all too typical story in business: that of a company that had been born and grown wildly as a cult of personality around its founder, until one day it got too big for any one man to oversee. The founder’s vision seemed to bleed away as the middle managers and the layers of bureaucracy moved in. Seduced by the higher profit margins enjoyed by business computers, Amstrad strayed ever further from Sugar’s old target demographic. New models in the PC range crept north of £1000, even £2000 for the top-of-the-line machines, while the more truck-driver-focused PCW and CPC lines were increasingly neglected. The CPC line would be discontinued entirely in 1990, leaving only the antique Spectrum to soldier on for a couple more years for Amstrad in the role of general-purpose home computer. It seemed that Amstrad at some fundamental level didn’t really know how to go about producing a brand new machine in the spirit of the CPC in this era when making a new home computer was much more complicated than plugging together some off-the-shelf chips and hiring a few hackers to knock out a BASIC for the thing. Amstrad would continue to make computers for many years to come, but by the time the 1990s dawned their brief-lived glory days of 60 percent market share were already fading into the rosy glow of nostalgia.

For all their very real achievements over the course of a very remarkable decade in British computing, Acorn and Amstrad each had their own unique blind spot that kept them from achieving even more. In the Archimedes, Acorn had a machine that was a match for any other microcomputer in the world in any application you cared to name, from games to business to education. Yet they released it in half-baked form at too high a price, then failed to market it properly. In their various ranges, Amstrad had the most comprehensive lineup of computers of anyone in Britain during the mid- to late-1980s. Yet they lacked the corporate culture to imagine what people would want five years from now in addition to what they wanted today. The world needs visionaries and commodifiers alike. What British computing lacked in the 1980s was a company capable of integrating the two.

That lack left wide open a huge gap in the market: space for a next-generation home computer with a lot more power and much better graphics and sound than the likes of the old Sinclair Spectrum, but that still wouldn’t cost a fortune. Packaged, priced, and marketed differently, the Archimedes might have been that machine. As it was, buyers looked to foreign companies to provide. Neglected as Europe still was by the console makers of Japan, the British punters’ choice largely came down to one of two American imports, the Commodore Amiga and the Atari ST. Both — especially the former — would live very well in this gap that neither Acorn nor Amstrad deigned to fill for too long. Acorn did belatedly try with the release of the Archimedes A3000 model in mid-1989 — laid out in the all-in-one-case, disk-drive-on-the-side fashion of an Amiga 500, styled to resemble the old BBC Micro, and priced at a more reasonable if still not quite reasonable enough £745. But by that time the Archimedes’s fate as a boutique computer for the wealthy, the dedicated, and the well-connected was already decided. As the decade ended, an astute observer could already detect that the wild and woolly days of British computing as a unique culture unto itself were numbered.

The Archimedes A3000 marked the end of an era, the last Acorn machine to bear the BBC logo.

And that would be that, but for one detail: the fairly earth-shattering detail of ARM. The ARM CPU’s ability to get extraordinary performance out of a relatively low clock speed had a huge unintended benefit that was barely even noticed by Acorn when they were in the process of designing it. In the world of computer engineering, higher clock speeds translate quite directly into higher power usage. Thus the ARM chip could do more with less power, a quality that, along with its cheapness and simplicity, made it the ideal choice for an emerging new breed of mobile computing devices. In 1990 Apple Computer, hard at work on a revolutionary “personal digital assistant” called the Newton, came calling on Acorn. A new spinoff was formed in November of 1990, a partnership among Acorn, Apple, and the semiconductor firm VLSI Technology, who had been fabricating Acorn’s ARM chips from the start. Called simply ARM Holdings, it was intended as a way to popularize the ARM architecture, particularly in the emerging mobile space, among end-user computer manufacturers like Apple who might be leery of buying ARM chips directly from a direct competitor like Acorn.

And popularize it has. To date about ten ARM CPUs have been made for every man, woman, and child on the planet, and the numbers look likely to continue to soar almost exponentially for many years to come. ARM CPUs are found today in more than 95 percent of all mobile phones. Throw in laptops (even laptops built around Intel processors usually boast several ARM chips as well), tablets, music players, cameras, GPS units… well, you get the picture. If it’s portable and it’s vaguely computery, chances are there’s an ARM inside. ARM, the most successful CPU architecture the world has ever known, looks likely to continue to thrive for many, many years to come, a classic example of unintended consequences and unintended benefits in engineering. Not a bad legacy for an era, is it?

(Sources: the book Sugar: The Amstrad Story by David Thomas; Acorn User of July 1985, October 1985, March 1986, September 1986, November 1986, June 1987, August 1987, September 1987, October 1988, November 1988, December 1988, February 1989, June 1989, and December 1989; Byte of November 1984; 8000 Plus of October 1986; Amstrad Action of November 1985; interviews with Hermann Hauser, Sophie Wilson, and Steve Furber at the Computer History Museum.)

Footnotes

[1] Roger Wilson now lives as Sophie Wilson. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times.


This Tormented Business, Part 2

In December of 1984 Sir Clive Sinclair and Chris Curry, heads of those leading lights of the British PC revolution, Sinclair Research and Acorn Computers respectively, gave a Daily Mirror columnist named Michael Jeacock a Christmas gift for the ages. Like Jeacock, Sinclair and Curry were having a drink — separately — with colleagues in the Baron of Beef pub, a popular watering hole for the hackers and engineers employed in Cambridge’s “Silicon Fen.” Spotting his rival across the room, Sinclair marched up to him and started to give him a piece of his mind. It seemed he was very unhappy about a recent series of Acorn advertisements which accused Sinclair computers of shoddy workmanship and poor reliability. To make sure Curry fully understood his position, he emphasized his words with repeated whacks about the head and shoulders with a rolled-up newspaper. Curry took understandable exception, and a certain amount of pushing and shoving ensued, although no actual punches were thrown. The conflict apparently broke out again later that evening at Shades, a quieter wine bar to which the two had adjourned to patch up their differences — unsuccessfully by all indications.

If you know anything about Fleet Street, you know how they reacted to a goldmine like this. Jeacock’s relatively staid account which greeted readers who opened the Christmas Eve edition of the Daily Mirror was only the beginning. Soon the tabloids were buzzing gleefully over what quickly became a full-blown “punch-up.” Some wrote in a fever of indignation over such undignified antics; Sinclair had just been knighted, for God’s sake. Others wrote in a different sort of fever: another Daily Mirror columnist, Jean Rook, wrote that she found Sinclair’s aggression sexually exciting.

It would be a few more months before the British public would begin to understand the real reason these middle-aged boffins had acted such fools. Still heralded publicly as the standard bearers of the new British economy, they were coming to the private realization that things had all gone inexplicably, horribly wrong for their companies. Both were staring down a veritable abyss, with no idea how to pull up or leap over. They were getting desperate — and desperation makes people behave in undignified and, well, desperate ways. They couldn’t even blame their situations on fate and misfortune, even if 1984 had been a year of inevitable changes and shakeouts which had left the software industry confused by its contradictory signs and portents and seen the end or the beginning of the end of weak sisters on the hardware side like Dragon, Camputers, and Oric. No, their situations were directly attributable to decisions they had personally made over the last eighteen months. Each made many of these decisions against his better judgment in the hope of one-upping his rival. Indeed, the corporate rivalry that led them to a public bar fight — and the far worse indignities still to come — has a Shakespearian dimension, being bound up in the relationship between these two once and future friends, each rampantly egotistical and deeply insecure in equal measure and each coveting what the other had. Rarely does business get so personal.

Acorn’s flagship computer, the BBC Micro, is amusingly described by Francis Spufford in Backroom Boys as the Volvo of early British computers: safe, absurdly well-engineered and well-built, expensive, and just a little bit boring. Acorn had taken full advantage of the BBC’s institutional blessing to sell the machine in huge quantities to another set of institutions, the British school system; by the mid-1980s some 90% of British schools had BBC Micros on the premises. Those sales, combined with others to small businesses and to well-heeled families looking for a stolid, professional-quality machine for the back office — i.e., the sorts of families likely to have a Volvo in the driveway as well — were more than enough to make a booming business of Acorn.

Yet when the person on the street thought about computers, it wasn’t Curry’s name or even Acorn’s that popped first to mind. No, it was the avuncular boffin Uncle Clive and his cheap and cheerful Spectrum. It was Sinclair who was knighted for his “service to British industry”; Sinclair who was sought out for endless radio, television, and print interviews to pontificate on the state of the nation. Even more cuttingly, it was the Spectrum that a generation of young Britons came to love — a generation that dutifully pecked out their assignments on the BBC Micros at their schools and then rushed home to gather around their Speccys and have some fun. Chris Curry wanted some of their love as well.

Acorn Electron

Enter in 1983 the Acorn Electron, a radically cost-reduced version of the BBC Micro designed to take on the Spectrum on its own turf. Enthusiasm for the Electron amongst the rank and file at Acorn was questionable at best. Most were not afflicted with Curry’s need to show up his old boss, but rather manifested a strain of stuffy Cambridge elitism that would cling to Acorn throughout its history. They held Sinclair’s cheap machines and the games played on them in a certain contempt. They were happy to cede that segment to him; they would rather be working on innovative new technology — Acorn had already initiated a 32-bit RISC processor project that would eventually result in the ubiquitous ARM architecture that dominates smartphones and tablets today — than repackaging old technology for mewling schoolchildren. Curry had to struggle mightily to push the Electron project through in the face of such indifference.

A price of £200, about half that of the BBC Micro, would get buyers the same 32 K of memory and the same excellent BASIC, albeit in a smaller, less professional case. However, the Electron’s overall performance was sharply curtailed by an inefficient (but cheaper) new memory configuration. The Electron’s sound capabilities also suffered greatly by comparison with its big brother, and the BBC Micro’s Mode 7, a text-only display mode that programmers loved because it greatly reduced the amount of precious memory that needed to be allocated to the display, was eliminated entirely. And, much cheaper than the BBC Micro though it may have been, it was still more expensive than the Spectrum. On paper it would seem quite a dubious proposition. Still, a considerable number of punters went for it that Christmas of 1983, the very peak of the British micro boom. Many were perhaps made willing to part with a bit more cash by the Electron’s solidity and obviously superior build quality in comparison to the Speccy.

But now Curry found himself in a truly heartbreaking position for any captain of industry: he couldn’t meet the demand. The machine was done at last, many months behind schedule, but problems with suppliers and processes which no one had bothered to address during development meant that Electrons trickled rather than poured into stores. “We’re having to disappoint customers,” announced a spokeswoman for W.H. Smith. “We are not able to supply demand. What we have has sold out, and while we are expecting more deliveries the amount will still be well below demand.” By some estimates, Acorn missed out on as many as 100,000 Electron sales that Christmas. Worse, most of those who found the Electron sold out at W.H. Smith and other shops presumably shrugged and walked away with a Spectrum or a Commodore 64 instead — mustn’t disappoint the children who expected to find a shiny new computer under the tree.

Never again was the lesson that Curry took away from the episode. Whatever else happened, he was damn sure going to have enough Electrons to feed demand next Christmas. Already in June of 1984 Curry had Acorn start placing huge orders with suppliers and subcontractors. He filled his warehouses with the things, then waited for the big Christmas orders to start. This time he was going to make a killing and give old Clive a run for his money.

The orders never came. The home-computer market had indeed peaked the previous Christmas. While lots of Spectrums were sold that Christmas of 1984 in absolute numbers, it wasn’t a patch on the year before. And with the Spectrum more entrenched than ever as the biggest gaming platform in Britain, and the Commodore 64 as the second biggest, people just weren’t much interested in the Electron anymore. Six months into the following year Acorn’s warehouses still contained at least 70,000 completed Electrons along with components for many more. “The popular games-playing market has become a very uncomfortable place to be. Price competition will be horrific. It is not a market we want to be in for very long,” said Curry. The problem was, he was in it, up to his eyebrows, and he had no idea how to get out.

Taking perhaps too much to heart Margaret Thatcher’s rhetoric about her country’s young microcomputer industry as a path to a new Pax Britannica, Curry had also recently made another awful strategic decision: to push the BBC Micro into the United States. Acorn spent hugely to set up a North American subsidiary and fund an advertising blitz. They succeeded only in learning that there was no place for them in America. The Apple II had long since owned American schools, the Commodore 64 dominated gaming, and IBM PCs and compatibles ruled the world of business computing. And the boom days of home computing were already over in North America just as in Britain; the industry there was undergoing a dramatic slowdown and shakeout of its own. What could an odd British import with poor hardware distribution and poorer software distribution do in the face of all that? The answer was of course absolutely nothing. Acorn walked away humbled and with £10 to £12 million in losses to show for their American adventure.

To add to the misery, domestic sales of the BBC Micro, Acorn’s bread and butter, also began to collapse as 1984 turned into 1985. Preoccupied with long-term projects like the RISC chip as well as short-term stopgaps like the Electron, Acorn had neglected the BBC Micro for far too long. Incredibly, the machine still shipped with just 32 K of memory three years after a much cheaper Spectrum model had debuted with 48 K. This was disastrous from a marketing standpoint. Salespeople on the high streets had long since realized that memory size was the one specification that virtually every customer could understand, that they used this figure along with price as their main points of comparison. (It was no accident that Commodore’s early advertising campaign for the 64 in the United States pounded relentlessly and apparently effectively on “64 K” and “$600” to the exclusion of everything else.) The BBC Micro didn’t fare very well by either metric. Meanwhile the institutional education market had just about reached complete saturation. When you already own 90% of a market, there’s not much more to be done there unless you come up with something new to sell them — something Acorn didn’t have.

How was Acorn to survive? The City couldn’t answer that question, and the share price therefore plunged from a high of 193p to as low as 23p before the Stock Exchange mercifully suspended trading. A savior appeared just in time in the form of the Turin, Italy-based firm Olivetti, a long-established maker of typewriters, calculators, and other business equipment, including recently PCs. Olivetti initially purchased a 49 percent stake in Acorn. When that plus the release of a stopgap 64 K version of the BBC Micro failed to stop the bleeding — shares cratered to as low as 9p and trading had to be suspended again — Olivetti stepped in again to up their stake to 80 percent and take the company fully under their wing. Acorn would survive in the form of an Olivetti subsidiary to eventually change the world with the ARM architecture, but the old dream for Acorn as a proudly and independently British exporter and popularizer of computing was dead, smothered by, as wags were soon putting it, “the Shroud of Turin.”

If Chris Curry wanted the popular love that Clive Sinclair enjoyed, Sir Clive coveted something that belonged to Curry: respectability. The image of his machines as essentially toys, good for games and perhaps a bit of BASIC-learning but not much else, rankled him deeply. He therefore decided that his company’s next computer would not be a direct successor to the Spectrum but rather a “Quantum Leap” into the small-business and educational markets where Acorn had been enjoying so much success.

He shouldn’t have bothered. While the Electron was a competent if somewhat underwhelming little creation, the Sinclair QL was simply botched every which way from Tuesday right from start to finish. Apparently for marketing reasons as much as anything else, Sir Clive decided on a chip from the new Motorola 68000 line that had everyone talking. Yet to save a few pounds he insisted that his engineers use the 68008 rather than the 68000 proper, the former being a crippled version of the latter with an 8-bit rather than 16-bit data bus and, as a result, about half the overall processing potential. He also continued his bizarre aversion to disk drives, insisting that the QL come equipped with two of his Microdrives instead — a classically Sinclairian bit of tortured technology that looked much like one of those old lost and unlamented 8-track audio tapes and managed to be far slower than a floppy disk and far less reliable than a cassette tape (previously the most unreliable form of computer storage known to man). The only possible justification for the contraption was sheer bloody-mindedness — or anticipation of the money Sinclair stood to make as the sole sellers of Microdrive media if they could ever just get the punters to start buying the things. These questionable decisions alone would have been enough to torpedo the QL. They were, however, just the tip of an iceberg. Oh, what an iceberg…

The QL today feels like an artifact from an alternate timeline of computing in which the arrival of new chips and new technologies didn’t lead to the paradigm shifts of our own timeline. No, in this timeline things just pretty much stayed as they had been, with computers booting up to a BASIC environment housed in ROM and directed via arcane textual commands. The QL must be one of the most profoundly un-visionary computers ever released. The 68000 line wasn’t important just because it ran faster than the old 8-bit Z80s and 6502s; Intel’s 16-bit 8086 line had been doing that for years. It was important because, among other things, its seven levels of external interrupts made it a natural choice for the new paradigm of the graphical user interface and the new paradigm of programming required to write for a GUI: event-driven (as opposed to procedural) programming. This is the reason Apple chose it for their revolutionary Lisa and Macintosh. Sinclair, however, simply used the 68008 like a souped-up Z80, leaving one feeling that the company had rather missed a pretty significant point. It’s an indictment that’s doubly damning in light of Sir Clive’s alleged role at Sinclair as a sort of visionary-in-chief — or, to choose a particularly hyperbolic contemporary description from The Sun, as “the most prodigious inventor since Leonardo.” But then, as we shall see, computers didn’t ultimately have a lot to do with Sir Clive’s visions.
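For readers who haven’t bumped into the distinction, here is a minimal sketch of the two programming styles in question — in modern Python rather than anything that ever ran on a 68000, with a hand-rolled handler registry standing in for a real GUI toolkit’s event queue. It’s purely illustrative, not how Lisa or Macintosh software was actually structured.

```python
# Procedural style: the program drives, demanding input when it chooses.
def procedural():
    name = input("Name? ")       # execution blocks here until the user obeys
    print(f"Hello, {name}")

# Event-driven style: the program registers handlers up front, then a
# dispatch loop delivers events in whatever order the user happens to act.
handlers = {}

def on(event, fn):
    handlers[event] = fn

def dispatch(event, *args):
    if event in handlers:
        handlers[event](*args)

on("click", lambda: print("button clicked"))
on("keypress", lambda ch: print(f"key pressed: {ch}"))

# Stand-in for the GUI's event queue: the user, not the program, sets the order.
for event, args in [("keypress", ("a",)), ("click", ()), ("keypress", ("b",))]:
    dispatch(event, *args)
```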

Clive Sinclair launches the QL

The big unveiling of the QL on January 12, 1984, was a landmark of smoke and mirrors even by Sinclair’s usual standards. Sir Clive declared there that the QL would begin shipping within 28 days to anyone who cared to order one at the low price of £400, despite the fact that no functioning QL actually existed. I don’t mean, mind you, that the prototypes had yet to go into production. I mean rather that no one at Sinclair had yet managed to cobble together a single working machine. Press in attendance were shown non-interactive demonstrations played back on monitors from videotape, while the alleged prototype was kept well away from them. Reporters were told that they could book a review machine, to be sent to them “soon.”

The question of just why Sinclair was in such a godawful hurry to debut the QL is one that’s never been satisfactorily answered. Some have claimed that Sir Clive was eager to preempt Apple’s unveiling of the Macintosh, scheduled for less than two weeks later, but I tend to see this view as implying an awareness of the international computer industry and trends therein that I’m not sure Sir Clive possessed. One thing, however, is clear: the oft-repeated claim that the QL represents the first mass-market 68000-based computer doesn’t hold water. Steve Jobs debuted a working Macintosh on January 24, 1984, and Apple started shipping the Macintosh months before Sinclair did the QL.

As those 28 days stretched into months, events went through the same cycle that had greeted previous Sinclair launches: excitement and anticipation fading into anger and accusations of bad faith and, soon enough, yet another round of investigations and threats by the Advertising Standards Authority. Desperate to show that the QL existed in some form and avoid legal action on behalf of the punters whose money they’d been holding for weeks or months, Sinclair hand-delivered a few dozen machines to journalists and customers in April. These sported an odd accessory: a square appendage hanging off the back of the otherwise sleek case. It seems Sinclair’s engineers had realized at some late date that they couldn’t actually fit everything they were supposed to inside the case. By the time QLs finally started shipping in quantity that summer the unwanted accessory had been removed and its contents somehow stuffed inside the case proper, but that turned out to have been the least of the machine’s problems.

Early QL complete with dongle

Amongst the more troubling of these was a horrid keyboard, by now something of a Sinclair tradition. Sinclair did deign to give the new machine actual plastic keys in lieu of the famous “dead flesh” rubber keys of the Spectrum, but the keys still rested upon a cheap membrane rather than having the mechanical action of even such cut-price competitors as the Commodore VIC-20. The keyboard was awful to type on, a virtual kiss of death all by itself for a supposed business computer. And it soon emerged that the keyboard, like everything else on the QL, didn’t work properly on even its own limited terms. Individual keys either stuck or didn’t register, or did both as the mood struck them. Reports later emerged that Sinclair had actually solicited bids for a mechanical keyboard from a Japanese manufacturer and found that it would cost little if anything more than the membrane job, but elected to stick with the membrane because it was such a “Sinclair trademark.” The mind boggles.

And then there were the performance problems brought on by a perfect storm of a crippled CPU, the Microdrives, and the poorly written business software that came with the machine. Your Computer magazine published the following astonishing account of what it took to save a 750-word document in the word processor:

1. Press F3 key followed by 6. A period of 35 seconds elapses by which time the computer has found the save section of Quill and then asks if I wish to save the default file, i.e. the file I am working on.

2. Press ENTER. After a further 10 seconds the computer finds that the file already exists and asks if I wish to overwrite it.

3. Press Y. A period of 100 seconds elapses while the old file is erased and the new one saved and verified in its place. The user is then asked if he wishes to carry on with the same document.

4. Press ENTER. Why a further 25 seconds is required here is beyond me as the file must be in memory as we have just saved it. Unfortunately, the file is now at the start, so to get back to where I was:

5. Press F3 key then G followed by B. The Goto procedure to get to the bottom of the file, a further 28 seconds.

For those keeping score, that’s 3 minutes and 18 seconds (35 + 10 + 100 + 25 + 28 = 198 seconds) to save a 750-word document. For a 3000-word document, that time jumped to a full five minutes.

Your Computer concluded their review of the QL with a prime demonstration of the crazily mixed messaging that marked all coverage of the machine. It was “slightly tacky,” “the time for foisting unproven products on the marketplace has gone,” and “it would be a brave business which would entrust essential data to Microdrives.” Yet it was also a “fascinating package” and “certain to be a commercial success.” It arguably was “fascinating” in its own peculiar way. “Commercial success,” however, wasn’t in the cards. Sinclair did keep plugging away at the QL for months after its release, and did manage to make it moderately more usable. But the damage was long since done. Even the generally forgiving British public couldn’t accept the eccentricities of this particular Sinclair creation. Sales were atrocious. Still, Sir Clive, never one to give up easily, continued to sell and promote it for almost two years.

There’s a dirty secret about Sir Clive Sinclair the computer visionary that most people never quite caught on to: he really didn’t know that much about computers, nor did he care all that much about them. Far from being the “most prodigious inventor since Leonardo,” Sir Clive remained fixated for decades on exactly two ideas: his miniature television and his electric car. The original Sinclair ZX80 had been floated largely to get Sinclair Research off the ground so that he could pursue those twin white whales. Computers had been a solution to a cashflow problem, a means to an end. His success meant that by 1983 he had the money he needed to go after the television and the car, the areas where he would really make his mark, full on. Both being absolutely atrocious ideas, this was bad, bad news for anyone with a vested interest in Sinclair Research.

The TV80 was a fairly bland failure by Sinclair standards: he came, he spent millions manufacturing thousands of devices that mostly didn’t work properly and that nobody would have wanted even if they had, and he exited again full of plans for the next Microvision iteration, the one that would get it right and convince the public at last of the virtues of a 2-inch television screen. But the electric car… ah, that one was one for the ages, one worthy of an honored place beside the exploding watches of yore. Sir Clive’s C5 electric tricycle was such an awful idea that even his normally pliable colleagues resisted letting Sinclair Research get sucked into it. He therefore took £8.6 million out to found a new company, Sinclair Vehicles.

The biggest problem in making an electric car, then and now, is developing batteries light enough, powerful enough, and long-lasting enough to rival gasoline or diesel. In 1984, researchers were still a long way from that goal. A kilogram of gasoline holds about 13,000 watt-hours of energy; a state-of-the-art lead-acid battery circa 1984 stored about 50 watt-hours per kilogram, a gap of roughly 260 to 1. That’s the crux of the problem; all else is relative trivialities. Having no engineering solution to offer for the hard part of the problem, Sinclair solved it through a logical leap that rivals any of Douglas Adams’s comedic syllogisms: he would simply pretend the hard problem didn’t exist and just do the easy stuff. From his adoring biography The Sinclair Story:

Part of the ground-up approach was not to spend enormous amounts trying to develop a more efficient battery, but to make use of the models available. Sinclair’s very sound reasoning was that a successful electric vehicle would provide the necessary push to battery manufacturers to pursue their own developments in the fullness of time; for him to sponsor this work would be a misplacement of funds.

There’s of course a certain chicken-or-egg problem inherent in this “sound reasoning,” in that the reason a “successful electric vehicle” didn’t yet exist was precisely because a successful electric vehicle required improved battery technology to power it. Or, put another way: if you could make a successful electric vehicle without improved batteries, why would its existence provide a “push to battery manufacturers?” Rather than a successful electric vehicle, Sir Clive made the QL and Black Watch of electric vehicles all rolled into one, an absurd little tricycle that was simultaneously underwhelming (to observe) and terrifying (to actually drive in traffic).

Sinclair C5

He unveiled the C5 on January 10, 1985, almost exactly one year after the QL dog-and-pony show and for the same price of £400. The press assembled at Alexandra Palace couldn’t help but question the wisdom of unveiling an open tricycle on a cold January day. But, once again, logistics were the least of the C5’s problems. A sizable percentage of the demonstration models simply didn’t work at all. The journalists dutifully tottered off on those that did, only to find that the advertised top speed of 15 mph was actually more like 5 mph — a brisk walking speed — on level ground. The batteries in many of the tricycles went dead or overheated — it was hard to tell which — with a plaintive little “Peep! Peep!” well before their advertised service range of 20 miles. Those journalists whose batteries did hold out found that they didn’t have enough horsepower to get up the modest hill leading back to the exhibition area. It was a disgruntled and disheveled group of cyclists who straggled back to Sir Clive, pedaling or lugging the 30-kilogram gadgets alongside. They could take comfort only in the savaging they were about to give him. When the press found out that the C5 was manufactured in a Hoover vacuum-cleaner plant and its motor was a variation on one developed for washing machines, the good times only got that much better. If there’s a single moment when Sir Clive turned the corner from visionary to laughingstock, this is it.

Sinclair Research wasn’t doing a whole lot better than its founder as 1984 turned into 1985. In addition to the huge losses sustained on the QL and TV80 fiascoes, Sinclair had, like Acorn, lost a bundle in the United States. Back in 1982, they had cut a deal with the American company Timex, who were already manufacturing all of their computers for them from a factory in Dundee, Scotland, to export the ZX81 to America as the Timex Sinclair 1000. It arrived in July of 1982, just as the American home-computing boom was taking off. Priced at $99 and extravagantly advertised as “the first computer under $100,” the TS 1000 sold like gangbusters for a short while; for a few months it was by far the bestselling computer in the country. But it was, with its 2 K of memory, its calculator keyboard, and its blurry text-only black-and-white display, a computer in only the most nominal sense. When Jack Tramiel started in earnest his assault on the low end later in the year with the — relatively speaking — more useful and usable Commodore VIC-20, the TS 1000 was squashed flat.

Undeterred, Timex and Sinclair tried again with an Americanized version of the Spectrum, the TS 2068. With the best of intentions, they elected to improve the Speccy modestly to make it more competitive in America, adding an improved sound chip, a couple of built-in joystick ports (British Speccy owners had to buy a separate interface), a couple of new graphics modes, a cartridge port, even a somewhat less awful version of Sinclair’s trademark awful keyboards. The consequence of those improvements, however, was that most existing Spectrum software became incompatible. This weird little British machine with no software support was priced only slightly less than the Commodore 64 with its rich and growing library of great games. It never had a chance. Timex, like other big players such as Texas Instruments and Coleco, were soon sheepishly announcing their withdrawal from the home-computer market, vanquished like the others by Commodore.

Back in Britain, meanwhile, it was becoming clear that, as if Sinclair hadn’t already had enough problems, domestic sales of the Spectrum were beginning to slow. Sinclair was still in a dominant position, owning some 40 percent of the British market. However, conventional wisdom had it that that market was becoming saturated; by late 1984 most of the people in Britain who were likely to buy a computer had already done so, to the tune of far more sales per capita than any other country on the planet. Sinclair’s only chance to maintain sales seemed to be to sell new machines to those who already owned older models. Yet they had pissed away the time and resources needed to create a next-generation Speccy on the QL. In desperation they rushed out something called the Spectrum Plus for Christmas 1984: a slightly more substantial-looking Spectrum with a better keyboard like that of the QL (still not a genuinely good one, of course; “Sinclair trademark” and all that). With no changes to its actual computing capabilities, this wasn’t exactly a compelling upgrade package for current Spectrum owners. And, Sinclair still being Sinclair, the same old problems continued; most Spectrum Pluses arrived with several of the vaunted new plastic keys floating around loose in the box.

By mid-1985, Sinclair’s position wasn’t a whole lot better than that of Acorn. They were drowning in unsold inventories of Spectrums and QLs dating back to the previous Christmas season and even before, mired in debt, and without the resources to develop the Spectrum successor they desperately needed.

Then it seemed that their own Olivetti-equivalent had arrived. In a “World Exclusive!” article in the June 17, 1985, edition, the Daily Mirror announced that “Maxwell Saves Sinclair.” The Maxwell in question was the famous British tycoon and financier Robert Maxwell, who would inject some £12 million into the company. In return, Sir Clive would have to accept some adult supervision: he would become a “life president” and consultant, with Maxwell installing a management team of his own choosing. Everyone was relieved, even Margaret Thatcher. “The government has been aware that these talks have been going on and welcomes any move to put the Sinclair business on a firm footing,” said a spokesman.

Then, not quite two months after the carefully calibrated leak to the Daily Mirror, Maxwell suddenly scuttled the deal. We’re not quite sure why. Some have said that, after a thorough review of Sinclair’s books, Maxwell concluded the company was simply irredeemable; some that Sir Clive refused to quietly accept his “life president” post and go away the way Maxwell expected him to; some that Sir Clive planned to go away all too soon, taking with him a promising wafer-scale integration process a few researchers had been working on to serve as his lifeboat and bridge to yet another incarnation of an independent Sinclair, as the ZX80 had served as a bridge between the Sinclair Radionics of the 1970s and the Sinclair Research of the 1980s. Still others say that Sir Clive was never serious about the deal, that the whole process was a Machiavellian plot on his part to keep his creditors at bay until the Christmas buying season began to loom, after which they would continue to wait and see in the hope that Sinclair could sell off at least some of all that inventory before the doors were shut. This last, at least, I tend to doubt; like the idea that he staged the QL unveiling to upstage the Macintosh, it ascribes a level of guile and business acumen to Sir Clive that I’m not sure he possessed.

At any rate, Sinclair Research staggered into 1986 alive and still independent but by all appearances mortally wounded. A sign of just how far they had fallen came when they had to beg the next Spectrum iteration from some of the people they were supposed to be supplying it to: Spain’s Investrónica, signatories to the only really viable foreign distribution deal they had managed to set up. The Spectrum 128 was a manifestation of Investrónica’s impatience and frustration with their partner. After waiting years for a properly updated Spectrum, they had decided to just make their own. Quickly created as it was by a technology distributor rather than a technology developer, the Spectrum 128 was a bit of a hack-and-splice job, grafting an extra 80 K of memory, an improved sound chip, and some other bits and pieces onto the venerable Speccy framework. Nevertheless, it was better than nothing, and it was compatible with older Speccy games. Sinclair Research scooped it up and started selling it in Britain as well.

The state of Acorn and Sinclair as 1986 began was enough to trigger a crisis of faith in Britain. The postwar era, and particularly the 1970s, had felt for many people like the long, slow unraveling of an economy that had once been the envy of the world. It wasn’t only Thatcher’s Conservatives who had seen Sir Clive and Acorn as standard bearers leading the way to a new Britain built on innovation and silicon. If many other areas of the economy were finally, belatedly improving after years and years of doldrums, the sudden collapse of Sinclair and Acorn nevertheless felt like a bucket of cold water to the dreamer’s face. All of the old insecurities, the old questions about whether Britain could truly compete on the world economic stage came to the fore again to a degree thoroughly out of line with what the actual economic impact of a defunct Acorn and Sinclair would have been. Now those who still clung to dreams of a silicon Britain found themselves chanting an unexpected mantra: thank God for Alan Sugar.

Sugar, the business titan with the R&B loverman’s name, had ended his formal schooling at age 16. A born salesman and wheeler and dealer, he learned his trade as an importer and wholesaler on London’s bustling Tottenham Court Road, then as now one of the densest collections of electronics shops in Europe. He founded his business, Amstrad, literally out of the back of a van there in 1968. By the late 1970s he had built Amstrad into a force to be reckoned with as purveyors of discount stereo equipment, loved by his declared target demographic of “the truck driver and his wife” as much as it was loathed by audiophiles.

He understood his target market so well because he was his target market. An unrepentant Eastender, he never tried to refine his working-class tastes, never tried to smooth away his Cockney diction despite living in a country where accent was still equated by many with destiny. The name of his company was itself a typical Cockneyism, a contraction of its original name of A.M.S. Trading Company (“A.M.S.” being Sugar’s initials). Sugar:

There was the snooty area of the public that would never buy an Amstrad hi-fi and they went out and bought Pioneer or whatever, and they’re 5 percent of the market. The other 95 percent of the market wants something that makes a noise and looks good. And they bought our stuff.

An Amstrad stereo might not be the best choice for picking out the subtle shadings of the second violin section, but it was just fine for cranking out the latest Led Zeppelin record good and loud. Sugar’s understanding of what constituted “good enough” captured fully one-third of the British stereo market for Amstrad by 1982, far more than any other single company.

In 1983, Sugar suddenly decided that Amstrad should build a home computer to compete with Sinclair, Acorn, and Commodore. Conventional wisdom would hold that this was absolutely terrible timing. Amstrad was about to jump into the market just in time for it to enter a decline. Still, if Sugar could hardly have been aware of what 1984 and 1985 would bring, he did see some fairly obvious problems with the approach of his would-be competitors which he believed Amstrad could correct. In a sense, he’d been here before.

Stereos had traditionally been sold the way that computer systems were in 1983: as mix-and-match components — an amplifier here, a tape deck and record player there, speakers in that corner — which the buyer had to purchase separately and assemble herself. One of Sugar’s greatest coups had come when he had realized circa 1978 that his truck drivers hated this approach at least as much as the audiophiles reveled in it. They hated comparing a bunch of gadgets with specifications they didn’t understand anyway; hated opening a whole pile of boxes and trying to wire everything together; hated needing four or five sockets just to power one stereo. Amstrad therefore introduced the Tower System: one box, one price, one socket — plug it in and go. It became by far their biggest seller, and changed the industry in the process.

Amstrad’s computer would follow the same philosophy, with the computer, a tape drive, and a monitor all sold as one unit. The included monitor in particular would become a marketing boon. Monitors being quite unusual in Britain, many a family was wracked with conflict every evening over whether the television was going to be used for watching TV or playing on the Speccy. The new Amstrad would, as the advertisements loudly proclaimed, make all that a thing of the past.

Amstrad CPC464

The CPC-464 computer which began shipping in June of 1984 was in many other ways a typical Amstrad creation. Sugar, who considered “boffin” a term of derision, was utterly uninterested in technological innovation for its own sake. Indeed, Sugar made it clear from the beginning that, should the CPC-464 disappoint, he would simply cut his losses and drop the product, as he had dropped televisions, CB radios, and car stereos before it. He was interested in profits, not the products which generated them. So, other than in its integrated design, the CPC-464 innovated nowhere. Instead it was just a solid, conservative computer that was at least in the same ballpark as the competition in every particular and matched or exceeded it in most: 64 K of memory, impressive color graphics, a decent sound chip, a more than decent BASIC. Build quality and customer service were, if not quite up to Acorn’s standards, more than a notch or two above Sinclair’s and more than adequate for a computer costing about £350 with tape drive and color monitor. Amstrad also did some very smart things to ease the machine’s path to consumer adoption: they paid several dozen programmers to have a modest library of games and other software available right from launch, and started Amstrad Computer User magazine to begin to build a community of users. These strategies, along with the commonsense value-for-your-pound approach of the machine itself, let the CPC-464 and succeeding machines do something almost inconceivable to the competitors collapsing around them: post strong sales that continued to grow by the month, making stereos a relatively minor part of Amstrad’s booming business within just a couple of years.

Amstrad’s results were so anomalous compared to those of the industry as a whole that for a considerable length of time the City simply refused to believe them. Amstrad’s share price continued to drop through mid-1985 in direct defiance of rosy sales figures. It wasn’t until Amstrad’s fiscal year ended in June and the annual report appeared showing sales of £136.1 million and an increase in profits of 121 percent that the City finally began to accept that Amstrad computers were for real. Alan Sugar describes in his own inimitable way the triumphalism of this period of Amstrad’s history:

The usual array of predators, such as Dixons, W. H. Smith, and Boots, were hovering around like the praying mantis, saying, “Ha, ha, you’ve got too many computers, haven’t you? We’re going to jump on you and steal them off you and rape you when you need money badly, just like Uncle Clive.” And we said, “We haven’t got any.” They didn’t believe us, until such time as they had purged their stocks and finished raping Clive Sinclair and Acorn, and realized they had nothing left to sell. So they turned to us again in November of 1985 and said, “What about a few of your computers at cheaper prices?” We stuck the proverbial two fingers in the air, and that’s how we got price stability back into the market. They thought we were sitting on stockpiles and they were doing us a big favour. But we had no inventory. It had gone to France and Spain.

Continental Europe was indeed a huge key to Amstrad’s success. When Acorn and Sinclair had looked to expand internationally, they had looked to the hyper-competitive and already troubled home-computer market in the United States, an all too typical example of British Anglocentrism. (As Bill Bryson once wrote, a traveler visiting Britain with no knowledge of geography would likely conclude from the media and the conversations around her that Britain lay a few miles off the coast of the United States, perhaps about where Cuba is in our world, and it was the rest of Europe that was thousands of miles of ocean away.) Meanwhile they had all but ignored all that virgin territory that started just a ferry ride away. Alan Sugar had no such prejudices. He let America alone, instead pushing his computers into Spain, France, and the German-speaking countries (where they were initially sold under the Schneider imprint — ironically, another company that had gotten its start selling low-priced stereo equipment). Amstrad’s arrival, along with an increasingly aggressive push from Commodore’s West German subsidiary, marks the moment when home computers at last began to spread in earnest through Western Europe, to be greeted there by kids and hackers with just as much enthusiasm and talent as their British, American, and Japanese counterparts.

One day in early 1986, Alan Sugar received an unexpected call from Mark Souhami, manager of the Dixons chain of consumer-electronics stores. Souhami dropped a bombshell: it seemed that Sir Clive was interested in selling his computer operation to Amstrad, the only company left in the market with the resources for such an acquisition. Dixons, who still sold considerable numbers of Spectrums and thus had a vested interest in keeping the supply flowing, had been recruited to act as intermediaries. Sir Clive and Sugar soon met personally for a quiet lunch in Liverpool Street Station. Sir Clive later reported that he found Sugar “delightful” — “very pleasant company, a witty man.” Sugar was less gracious, ruthlessly mocking in private Sir Clive’s carefully cultivated “Etonian accent” and his intellectual pretensions.

At 3:00 AM on April 2, 1986, after several weeks of often strained negotiations, Amstrad agreed to buy all of the intellectual property for and existing stocks of Sinclair’s computers for £16 million. The sum would allow Sir Clive to pay off his creditors and make a clean break from the computer market to pursue his real passions. Tellingly, Sinclair Research itself along with the TV80 and the C5 were explicitly excluded from the transfer — not that Sugar had any interest in such financial losers anyway. With a stroke of the pen, Alan Sugar and Amstrad now owned 60 percent of the British home-computer market along with a big chunk of the exploding continental European market. All less than two years after the CPC-464 had debuted under a cloud of doubt.

Clive Sinclair and Alan Sugar

When Sugar and Sir Clive officially announced their deal at a press conference on April 7, the press rightly marked it as the end of an era. The famous photograph of their uncomfortable handshake before the assembled flash bulbs stands as one of the more indelible in the history of British computing, a passing of the mantle from Sir Clive the eccentric boffin to Sugar the gruff, rough, and ruthless man of the bottom line. British computing had lost its innocence, and things would never quite be the same again. Thatcher had backed the wrong horse in choosing Sir Clive as her personification of the new British capitalist spirit. (Sugar would get a belated knighthood of his own in 2000.) On the plus side, British computing was still alive as an independent entity, a state of affairs that had looked very doubtful just the year before. Indeed, it was poised to make a huge impact yet through Amstrad.

Those who fretted that Sugar might have bought the Spectrum just to kill it needn’t have; he was far too smart and unsentimental for that. If people still wanted Spectrums, he would give them Spectrums. Amstrad thus remade the Speccy in the CPC line’s image, complete with an integrated tape drive, and continued to sell it as the low end of their lineup into the 1990s, until even the diehards had moved on. Quality and reliability improved markedly, and the thing even got a proper keyboard at long last. The QL, however, got no such treatment; Sugar put it out of its misery without a second thought.

Clive Sinclair rides off into the sunset

I’ll doubtless have more to say about a triumphant Amstrad and a humbled but still technically formidable Acorn in future articles. Sir Clive, however, will now ride off into the sunset — presumably on a C5 — to tinker with his electric cars and surface occasionally to delight the press with a crazy anecdote. He exited the computer market with dreams as grandiose as ever, but no one would ever quite take him seriously again. For a fellow who takes himself so manifestly seriously, that has to be a difficult thing to bear. Sinclair Research exists as a nominal corporation to this day, but for most of the past three decades its only actual employee appears to have been Sir Clive himself, still plugging away at his electric car (miniaturized televisions have not been in further evidence). I know I’ve been awfully hard on Sir Clive, but in truth I rather like him. He possessed arrogance, stubbornness, and shortsightedness in abundance, but no guile and very little greed. Amongst the rogues’ gallery of executives who built the international PC industry that practically qualifies him for sainthood. He was certainly the most entertaining computer mogul of all time, and he did manage almost in spite of himself to change Britain forever. The British public still has a heartfelt affection for the odd little fellow — as well they should. Eccentrics like him don’t come around every day.

(Much of this article was drawn from following the news items and articles in my favorite of the early British micro magazines, Your Computer, between January 1984 and May 1986. Other useful magazines: Popular Computing Weekly of November 10, 1983, and January 12, 1984; Sinclair User of November 1984, February 1985, and March 1985. Two business biographies of Sir Clive are recommended, one admiring and one critical: The Sinclair Story by Rodney Dale and Sinclair and the “Sunrise” Technology by Ian Adamson and Richard Kennedy respectively. The best account I’ve found of Amstrad’s early history is in Alan Sugar: The Amstrad Story by David Thomas. Good online articles: The Register’s features on the Sinclair Microdrives, the QL, and the Acorn Electron; Stairway to Hell’s reprinting of a series of articles on Acorn’s history from Acorn User magazine. Finally, by all means check out the delightful BBC docudrama Micro Men if you haven’t already and marvel that the events and personalities depicted therein are only slightly exaggerated. That film is also the source of the last picture in this article; it was just too perfect an image to resist.)

 
 


The Merry Pranksters of Automata

The Piman and Uncle Groucho

I closed my last article by noting that Ultimate Play the Game’s works can feel just a bit soulless to me with their slickness and unrelenting commercial focus — an opinion I’m sure many of you don’t share. Well, rest assured that I can’t attach any such complaint to my subject for today.

Automata UK was the creation of a pair of agitators named Mel Croucher and Christian Penfold who became the Merry Pranksters of the early British software industry, mixing absurdist humor with the DIY ethos of punk rock and more than a hint of an agitprop sensibility. Whatever else you care to say about them, you certainly can’t call them slick or commercial. Their works and their rhetoric harkened back to older utopian dreams for personal computing as a means to universal empowerment for all — dreams immersed in the ideals of the counterculture and promoted in the likes of the People’s Computer Company newsletter and the early issues of Creative Computing. With home computing taking off in Britain and the traditional forces of business and culture getting involved in a big way, those dreams were already beginning to sound quaint and anachronistic by the time Automata peaked in 1983 and 1984. Possessed as they were of about the level of business acumen you might expect from a pair of self-described “old hippies,” they were doomed from the start. Still, they had one hell of a lot of fun while they lasted.

Croucher was a disillusioned architect coming off a stint working under Rashid bin Saeed Al Maktoum to construct the modern Dubai. He’d also worked as a cartographer, played bass in rock bands, and briefly entertained the idea of becoming a painter. If Croucher was the would-be artist and visionary, Penfold was, relatively speaking, the more grounded; he’d sold everything from cars to plants to advertising space amidst various other jobs. Like many fruitful partnerships, they weren’t always simpatico with one another. Croucher declared that he was “to the left of Tony Benn” while Penfold was “to the right of Enoch Powell”: “That’s why it works — otherwise we’d come in in the morning and agree!”

Automata was founded circa 1977 on Croucher’s Dubai windfall. Like Melbourne House, it wasn’t initially conceived as a game or software developer, as is betrayed by the original full name: “Automata Cartography.” Capitalizing on Croucher’s background in cartography, they made tourist-friendly informational brochures and maps for the likes of the Sealink ferry service and British Airways. Those soon morphed into audio travel guides and promotions for foreign hotels as well as radio spots. Their guide to their home base of Portsmouth, narrated in the persona of, of all people, Charles Dickens, was heard by every tourist who booked a pleasure cruise around the harbor. They even produced some feature radio programming, such as a quiz program for Radio Victory that Penfold described as “rather like University Challenge without the brains.”

Mel Croucher and Christian Penfold, 1983

They were aboard a ferry on the English Channel, returning from working on a production for Sealink in the Channel Islands, when Croucher told Penfold about the Sinclair ZX81 computer he had just purchased, his first exposure to computing since the 1960s, when he’d struggled to teach the big machine at his university how to play “Twinkle, Twinkle Little Star” and flash its lights in rhythm. He’d seen advertisements for computer games in magazines for £4 or £5, a princely sum compared to what he and Penfold were used to getting paid for their work. And he recognized that the world stood on an artistic precipice.

I knew that computers were not for performing business functions at all, but would transform everything I was involved in. I was absolutely convinced that computers would facilitate the convergence of film, book, theatre and music, with the added miracle of interactivity.

Having learned from his struggles with ALGOL in earlier decades that he wasn’t much of a programmer, he asked Penfold, who seemed like he’d have a better mind for the job, to take up the programming and implement his (Croucher’s) many ideas. Penfold, as Croucher has delighted in telling interviewers ever since, promptly threw up — not in reaction to the idea, but because the crossing was rough and he was prone to seasickness. On that auspicious note, the new venture was born.

Options were of course decidedly limited on the ZX81 with its 1 K of memory. So Croucher and Penfold would put ten mini-games onto a cassette, each necessarily trivial in itself but building as a collection to address some overarching theme. Both reacted viscerally to violence in games even in this era when that meant no more than onscreen blobs knocking off other onscreen blobs. “If there is such a thing as an alien, it doesn’t want to come down to earth and get killed,” noted Croucher. “I have yet to find an arcade game where there is a full trial at the start,” rejoined Penfold. Croucher later put it even more strongly: “I think people who create violent games are lazy, ignorant, and have poodle shit for brains.” Thus Automata’s games from first to last would be resolutely nonviolent.

Yet that didn’t keep them from offending in other ways, as evidenced by their aptly named first tape, Can of Worms, with its “piss takes” (Croucher’s words) involving acne, vasectomies, Hitler, and Reagan, and featuring a Space Invaders satire along with Royal Flush, an anti-monarchy screed which, yes, does involve a toilet. It all seemed to Croucher the appropriate response to those combative early years of Thatcher’s reign, times of “repression, depression, recession, and political mayhem.” Some reviewers were predictably outraged, accusing Automata of “peddling pornography to kids” (exactly the sort of reaction, one senses, that Croucher and Penfold were hoping for). Others took it all more casually; Eric Deeson noted bemusedly in Your Computer that Can of Worms “must suffice for readers with bad taste until something more revolting appears.” Croucher and Penfold obligingly tried to up the ante with Love and Death, a progression from fertilization to death (a theme to which they would eventually return for their most famous work), and The Bible, always a subject guaranteed to enrage. All were storyboarded by Croucher and then programmed by Penfold in crude BASIC, which he bragged was like his poetry — “unstructured.”

When the Spectrum appeared and sold like crazy, Automata, like most software houses, quickly made the switch to the new machine with its color graphics and luxurious 48 K of memory. They came into their own with their first game for the Speccy, the most commercially successful they would ever release. Pimania is an illustrated text adventure involving the Piman, a pink fellow with a grotesquely huge nose whose relationship to the player lives on some uncertain ground between ally and antagonist. Croucher had based him on a neighbor, “a deadpan poet with a great Scottish accent and peculiar vocal delivery.” He quickly became Automata’s mascot. Children took to calling the office wanting to speak to him; Croucher or Penfold obligingly played the part. He starred in a comic strip drawn by Robin Evans which ran on the back cover of every issue of Popular Computing Weekly. And he became a beloved presence at shows and other events, with Penfold usually inside the costume. Penfold:

He is an escape — an extension of our own personalities — all the nice and nasty bits rolled into one. But now he no longer just exists in our minds. He is real. He has his own character.

As for the game itself, the first puzzle tells you just about everything you need to know about what you’re in for. The opening screen simply says, “A key turns the lock,” then refuses to do anything else until you figure out that it wants you to press the Spectrum’s PI key. After that you get to work out that navigation is based not on the usual compass directions but the hands of a clock. And then things start to get difficult. Depending on how you look at it and how charitable you’re feeling, Pimania is either a surrealistic but tough-as-nails old-school adventure game, a confused and confusing and essentially unsolvable effort all too typical of beginning designers and programmers writing their first adventure, or a piece of sadistic satire sending up the absurdities of early adventure games.

Whatever its virtues or flaws, Pimania sold quite well, largely on the back of the Piman’s inexplicable popularity and a brilliant promotional idea. This was inspired by Kit Williams’s 1979 children’s book Masquerade, which encoded within it the location of a hare made out of gold and jewels, buried at a secret spot somewhere in England. Thousands scoured the book for clues to the golden hare’s whereabouts until an intrepid seeker finally recovered it almost three years after publication. The book itself became a bestseller.

The Golden Sundial of Pi

Croucher and Penfold commissioned De Beers Diamond International Award winner Barbara Tipple to make a Golden Sundial of Pi out of gold, lapis lazuli, obsidian, and diamond, at a claimed value of an extraordinary £6000. Winning Pimania would provide the clues as to where and, crucially, when to show up to claim the prize; only one day of the year would suffice.

Contests were quickly becoming de rigueur for almost every major adventure-game release in Britain, but few attracted the attention and passion of this one. And yet the Pimania mystery went unsolved, even though Penfold’s BASIC was an accessible target for would-be code divers. As months stretched into years, some began darkly hinting that maybe there wasn’t actually a Golden Sundial at all, that the whole thing was an elaborate practical joke being played on the gaming public — which admittedly would fit right into Automata’s public persona, but would also be unspeakably cruel. One poor fellow, convinced he’d cracked the code, even made plans to jet off to Bethlehem for Christmas. When Penfold and Croucher got wind of that, they were kind enough to tell him through the press that he was on the wrong track; I don’t know whether he believed them or went off anyway. The rather more accessible Stonehenge was a favorite target of many others, while yet more, having worked out a connection to Pegasus, visited seemingly everything everywhere having anything to do with horses. All for naught.

Uncle Groucho (Croucher) and the Piman (Penfold)

Even as the Pimania mystery remained unsolved, Automata launched a new contest to go with their next adventure, which bore the intimidating title of My Name is Uncle Groucho… You Win a Fat Cigar. As trippy as its predecessor, it had players chasing Groucho Marx — who now became Croucher’s alter ego to join Penfold in his Piman suit — around the United States. It was all surreal enough that it led interviewer Tristan Donovan recently to ask Croucher whether “drugs were a factor.” Croucher insists that he and Penfold were running on nothing stronger than beer and cigarettes.

This time the contest at least was a bit more conventional. The first player to identify a celebrity from clues provided in the game would win a trip to Hollywood on the Concorde and passage back home on the QE2. This one had an actual winner in relatively short order, one Phil Daley of Stoke-on-Trent. The celebrity in question, it turned out, was Mickey Mouse.

Piman and Friends

In addition to their own games, Croucher and Penfold also published quite a number of titles which they received from outside programmers. Each would be retrofitted into Automata’s ever-growing lore, which soon included a cast of characters largely drawn from the comic strip, with names like Ooncle Arthur, Swettibitz, and (my favorite) Lady Clair Sinclive. Each outside submission got an appropriately Automatatized new name, usually an awful pun: Pi-Balled, Pi-Eyed, Pi in the Sky. And each also got a theme song on the other side of the cassette, recorded on a little studio setup in Croucher’s front room. On these Croucher, who was something of a frustrated rock star, could run wild. The results are some weird amalgamation of musique concrète, New Wave synth-pop, a Monty Python sketch, and Croucher’s personal hero Frank Zappa. Or, for another set of comparisons: Your Computer called the track below (included with the Groucho game) “a curious fusion of a Mark Knopfler vocal and Depeche Mode backing, with a Bonzo Dog Band playout.”

Groucho

And then came 1984 and Croucher’s magnum opus, Deus Ex Machina. He largely retired from Merry Prankstering for some six months to pour his heart and soul into it. He described it not as an Automata project proper but as a “personal indulgence.” He believed he was creating Art, as well as the future of computerized entertainment:

I thought that by the mid-1980s ALL cutting-edge computer games would be like interactive movies, with proper structures, real characters, half-decent original stories, an acceptable soundtrack, a variety of user-defined narratives and variable outcomes. So I thought I’d better get in first, and produce the computer-game equivalent to Metropolis and Citizen Kane before the bastards started churning out dross. I wanted individuals to become totally immersed in the piece.

Penfold recognized Deus Ex Machina as “the crescendo of an idea” for Croucher, “an emotional achievement.”

Croucher had the idea of synchronizing a soundtrack to a computer game, to create an integrated, multimedia experience well before that word was in common usage. Because the Spectrum’s sound capabilities were rudimentary at best, the soundtrack would come on a separate cassette which the player had to play at the same time as the game. The theme would be worthy of a prog-rock concept album: the journey of an individual through Shakespeare’s Seven Ages of Man in a vaguely dystopian postmodern world of mechanization, media saturation, and “Defect Police.” Perhaps it didn’t make a whole lot of literal sense — you’re somehow born from a dropping left behind by the last mouse on earth “as the nerve gas eased its sphincter” — but that hadn’t stopped Tommy or The Wall, had it?

The soundtrack, some 45 minutes in length, was mostly recorded by Croucher at home; he would have loved to have put a “half-decent band” together, but finances wouldn’t allow. He was, however, determined to get some well-known voices to feature on it, and to record them properly. He therefore booked precious time in a big London studio.

He recruited Frankie Howerd, an old-school showman and perennial on the British comedy circuit, known if not always loved by everyone in Britain in the same way as a Rodney Dangerfield or Gallagher in the United States, to play the role of the head of the Defect Police, a “terrifying idiot.” (“It turned out that the real thing would eventually appear in the form of George W. Bush, but let’s not get into that.”) Howerd showed little understanding of or interest in the project. He did his job — no more, no less.

More enthusiastic was former Doctor Who Jon Pertwee as the master of ceremonies. He angered everyone by arriving two hours late for his session, but explained that he’d fallen off his motorcycle on the way over and literally limped in as quickly as he could, bruised and still in leathers. All was forgiven, especially when he did a brilliant job. He and Croucher became fast friends, so much so that they later wrote a book together full of absurdist ramblings.

Croucher first wanted Patrick Moore, an eccentric popularizer of astronomy who was sort of the British Carl Sagan, complete with unforgettable tics that made him a television natural, to play the part of a sperm — “that would have been utterly surreal.” But when that fell through, he lucked into pub rocker Ian Dury of “Hit Me With Your Rhythm Stick” fame. Dury “hated what mainstream games were offering kids,” and was excited to work on an alternative.

For the female voice of the machine, Dury recommended Marianne Faithfull, but she “was buried in drugs and we couldn’t get a meeting together.” After a bid for pop diva Hazel O’Connor also fell through, Croucher settled on a local singer named Donna Bailey, who wound up contributing the best singing on the soundtrack by far. The choir from nearby Warblington School also popped in to provide some “Another Brick in the Wall”-style vocals.

The game that accompanied all this was programmed to Croucher’s specifications by Andrew Stagg, a “boy genius” assembly coder he had discovered. It consists of a series of simple action games synchronized to the soundtrack. Your goal, such as it is, is to keep your “degree of ideal entity” as high as possible; each mistake in the games costs you percentage points. However, play continues inexorably onward no matter how badly you screw up, an obvious limitation of syncing a non-interactive soundtrack to an allegedly interactive experience; see the sketch after the quote below. Then again, maybe that’s for the best: the games are not only of rather limited interest but also brutally difficult. I’ve never seen anyone get to the end with anything other than the worst possible score of 0%. For Croucher, the score wasn’t the point anyway:

The metaphor of the score is incidental, and I hoped people would interpret it to suit themselves. Do nothing — you’ll never win. Do everything right — you’ll feel good for a while, you’ll be regarded well according to society’s rules, but you’ll still never win. However, as the man [on the soundtrack] says — Imagine if this was nothing more than a computer game and we could start our lives all over again, and do it better. That was the only meaning really.
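Mechanically, the bind is easy to see in miniature. Here is a minimal sketch, with every name again invented for illustration rather than taken from the actual program: a game loop slaved to a fixed-length soundtrack can dock points, but it can never pause, branch, or end early.

```c
#include <stdio.h>

#define SOUNDTRACK_SECONDS (45 * 60)  /* the cassette, not the player, sets the pace */

/* Invented stub: did the player fumble during this second of the show?
   Real code would be polling the joystick against whatever is onscreen. */
static int player_fumbled(int second)
{
    return second % 7 == 0;
}

int main(void)
{
    double ideal_entity = 100.0;  /* the "degree of ideal entity" score */

    /* The loop is clocked by soundtrack time, not by player input: it never
       waits, never branches, and never ends early. Failure only bleeds the
       score; the show rolls on regardless. */
    for (int second = 0; second < SOUNDTRACK_SECONDS; second++) {
        if (player_fumbled(second))
            ideal_entity -= 0.25;
        if (ideal_entity < 0.0)
            ideal_entity = 0.0;
    }

    printf("degree of ideal entity: %.0f%%\n", ideal_entity);
    return 0;
}
```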

I have some problems with Deus Ex Machina and the rhetoric deployed around it, which we’ll get to momentarily, but when it works it can approach the total immersion Croucher was striving for, gameplay and music blending into a seamless whole. “The Lover” is a particular favorite, a gentle ballad, well sung by Bailey, that morphs into something more disturbing along with the game on the screen.


Despite good reviews by writers who admittedly weren’t entirely sure what to make of a piece of unabashed multimedia art sandwiched into their usual diet of Jet Set Willy and Manic Miner, and despite being released just in time for the big Christmas buying season, sales were about as close to nonexistent as they could conceivably be. By February, some five months on from release, just 700 copies had been sold on the Spectrum; a more recent port to the Commodore 64 had sold all of twelve. Still, Deus Ex Machina won “Program of the Year” from the Computer Trade Association that February. But it was now, as a profile in Sinclair User from this period put it, “an angry, bitter world” for Automata, the easygoing insanity of Groucho and the Piman replaced by uglier sentiments. Penfold dropped all of the jokey pretensions in his acceptance speech to voice his opinion of the state of a changed industry in no uncertain terms.

He and Croucher blamed Deus Ex Machina’s failure entirely on new systems of software distribution. When they had gotten into the business back in 1981, software was sold mostly via mail-order advertisements in the hobbyist magazines, with the remainder being sold directly to the handful of computer shops scattered around the country. This approach became untenable, however, when computers came to the High Street and the big vanilla chains like Boots and W.H. Smith got into the game. As in virtually every other retail industry, distributors stepped in to act as the middlemen between the publishers and the final points of sale. With far more software being produced by 1984 than they could possibly handle, the distributors could afford to be selective; indeed, they would say their survival depended upon it. They carefully evaluated each game sent to them to establish not only whether it was a reasonably professional, bug-free effort but also whether it had enough mass appeal to be worthy of precious shelf space. There was an inevitable power imbalance at play: small publishers, desperate to get their games onto shelves in what was turning into a very uncertain year, needed distributors more than distributors needed them. Thus the distributors could afford to drive hard bargains, demanding a 40 to 60 percent discount off retail price and a grace period of up to two months after delivery to pay the bill. They also wanted games to fit into one of three price points: £2 for older “classics” and newer discount titles; £6 for typical new games; £10 for big prestige releases like The Lords of Midnight or anything from the demigods over at Ultimate Play the Game.
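To put the squeeze into concrete figures, here is the arithmetic from the publisher’s side of the table, computed from the discount terms just described rather than from any specific contract:

```c
#include <stdio.h>

int main(void)
{
    const double price_points[] = { 2.0, 6.0, 10.0 };  /* the three retail tiers        */
    const double discounts[]    = { 0.40, 0.60 };      /* demanded off the retail price */

    for (int i = 0; i < 3; i++) {
        for (int j = 0; j < 2; j++) {
            double publisher_take = price_points[i] * (1.0 - discounts[j]);
            printf("£%5.2f game at %2.0f%% discount: publisher receives £%.2f per copy\n",
                   price_points[i], discounts[j] * 100.0, publisher_take);
        }
    }
    return 0;
}
```

A typical £6 game could thus net its publisher as little as £2.40 per copy, and even that money might not arrive until two months after delivery.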

Automata bucked every one of these demands. Not only did they continue to demand high margins and cash on delivery, but they insisted that Deus Ex Machina retail at a grandiose £15. They weren’t entirely alone; others fought the new world order and sometimes, at least temporarily, won. The most famous among them is Acornsoft, who insisted on a similarly high price point for Elite. When a number of distributors refused to carry the game, Acornsoft simply shrugged and sold to those who would, until Elite became a transformative hit and the recalcitrant distributors came back begging for Acornsoft’s business. The artsy, avant-garde Deus Ex Machina, however, obviously lacked the mass appeal of an Elite. This time it was mostly the distributors who shrugged and moved on, with a predictable impact on the game’s commercial fortunes.

Given these circumstances, Croucher was and is eager to attribute Deus Ex Machina’s fate to the age-old struggle between art and commerce:

The corporates had taken over as they always will when they spot a new lucrative market. They wanted standard product. Deus was designed as non-standard and I got the market completely wrong.

I suppose that’s fair enough as far as it goes, even given the odd fact that Automata was in a death struggle to charge their customers more than the going rate. The thing is, though, the suits were kind of right in this instance. Taken as a one- or two-time experience, Deus Ex Machina is intriguing, even inspiring. As a game, however, it’s not up to much at all. It took a retrospective review in ACE magazine to finally acknowledge the obvious: “the actual gameplay was strictly humdrum.” Now, Croucher and others might respond, and not without justification, that to evaluate it the same way one evaluates a more traditional game is rather missing the point; Deus Ex Machina is more multimedia experience than game. Fair enough. Except that it’s hard to overlook the fact that players were being asked to pay £15 for the privilege, one hell of a lot of money in 1984 Britain. For that price — or, for that matter, for £10 or £6 or even £2 — they deserved more than an hour or two of entertainment. Deus Ex Machina’s failure to find a market was indeed a failure of distribution. Yet it’s not one that can really be blamed on the distributors; again, that’s just the way that the retail business works, whether you’re selling shoes, books, or computer software. It wasn’t their fault that there was no practical way to get shorter-form works to the public for a price that wouldn’t leave them feeling ripped off. The distributors just recognized, probably rightly, that there was no practical place for Deus Ex Machina in the software industry of 1984. Such is the curse of the visionary.

Automata never recovered commercially or psychologically from the failure of Deus Ex Machina. The old spirit of anarchic fun became one of passive-aggressive petulance. “Automata’s next product will be something truly wonderful, but we’re just not going to release it until everybody pulls their socks up,” Penfold declared. “Automata are too good for this industry.”

Croucher walked away in mid-1985. Penfold at first did the expected, declaring his intention to struggle on, but Croucher’s departure marked the effective end to Automata as a going concern. Penfold dropped out of sight, while Croucher continued his career as (in Jaroslav Švelch’s words) a “perpetually failed visionary.” He designed another avant-garde game or two for other publishers; tried and failed to launch a multimedia game console that would combine laser-disc video with traditional computer graphics; wrote a linked series of comic fantasy adventures of one Tamara Knight for Crash magazine; wrote a column for Computer Shopper; designed a game and a viral-marketing campaign for Duracell; did God knows what all inside and outside the computer industry.

Deus Ex Machina has become a minor cause célèbre among academics and advocates for games as art, who tend to overrate it somewhat; it’s certainly interesting, certainly visionary, but hardly a deathless masterpiece. It’s simply the best that could be done by these people with these resources at this time — and there’s no shame in that. Much the same could be said about the rest of Automata’s works. They’re great to talk about, but too crude to be all that engaging to play today. Automata embraced a punk-rock ethos of production, but failed to recognize that games have in some senses a higher bar to clear. A bit of hiss on a record may be discountable, even charming, but a game with similar technical flaws is simply excruciating to play; it’s right here that comparisons between independent games and independent music break down horribly. Appropriately enough for what often seemed more an exercise in performance art than a real software company, Automata’s crazy catalog of subversive games is most fascinating simply because it existed at all.

Mel Croucher, Sue Cooper, Christian Penfold (as the Piman), and Lizi Newman

But before Automata leaves the stage and I end this article, there’s one more piece of the story to tell. On July 22, 1985, Sue Cooper and Lizi Newman, a schoolteacher and music-shop proprietor respectively from Yorkshire, arrived at an odd local landmark on the Sussex Downs: a gigantic horse cut into a chalk hill. They stood at the horse’s mouth in a driving rain, looking around nervously. Then the Piman clambered out from behind a clump of bushes. He presented the ladies with their prize while Croucher played his theme song one last time. Automata had played fair after all; the Golden Sundial of Pi was real. Perhaps because Automata was winding down and they were tired of spending their July 22s crouching in the Sussex mud, they had even shown the women a bit of mercy: they really should have been standing at the horse’s arse.

Penfold and Croucher wave bye-bye

(If you’d like to experience Deus Ex Machina for yourself — and despite my reservations it’s well worth the effort — I’ve prepared a care package with the Spectrum tape image, the soundtrack as MP3 files, and the manual. Both sides of the tape are in the same tape image. The emulator should automatically load in the second side when the time comes, leaving you only to start the second soundtrack. That’s how it works under Fuse, anyway.

Information and art for this article were drawn from the following magazines: Your Spectrum of December 1984; Popular Computing Weekly of June 30 1983, January 12 1984, April 4 1985, November 28 1985; Home Computing Weekly of July 19 1983, December 13 1983, February 26 1985; Computer and Video Games of November 1982, December 1982, January 1984, October 1985, September 1986; Crash of February 1984, May 1985, April 1986; Big K of December 1984; Sinclair User of September 1984, February 1985, April 1985; Your Computer of February 1982, December 1983; ACE of May 1988; Computer Choice of January 1984. Also invaluable were ZX Golden Years, the new Automata website, the Deus Ex Machina 2 website, and the Piman Files. The last includes all of the Automata songs as MP3 files, including the one I’ve sampled here.)


The Legend of Ultimate Play the Game

Ultimate Play the Game

Although the sorts of games they created are a little outside of my usual beat, I couldn’t possibly write an overview of 1984 in British gaming without including the little company with the big, unwieldy name of Ultimate Play the Game. During their glory years, which numbered no more than two, Ultimate was unabashedly worshiped amongst Spectrum owners. They took the place that the screw-ups at Imagine Software had thought was reserved for them: that of the most talented, innovative, cool, and fabulously successful developer in the country. No other Speccy developer, before or after, would ever come close to equaling Ultimate’s reputation. Amongst British gamers of a certain age or just a certain historical bent, the name “Ultimate Play the Game” is still spoken in positively reverent tones. No one, but no one, carries the mystique of Ultimate. Far from being a disadvantage, their short life only adds to their aura. They’re British gaming’s Joy Division.

Much of Ultimate’s mystique is down to their elusiveness. When they debuted their first games in 1983, the four founders of Ashby Computer and Graphics (trading name Ultimate Play the Game) — brothers Chris and Tim Stamper along with Tim’s girlfriend (later wife) Carole Ward and John Lathbury — did the usual dance, inviting the press out to their modest office in the Midlands town of Ashby-de-la-Zouch for interviews. Little did anyone realize that those first two interviews given that summer to Home Computing Weekly and Popular Computing Weekly would also be the last. At first, so the partners claimed, they were simply too busy writing and selling games to deal with the press. But it couldn’t have taken them long to realize that their silence only intensified gamers’ fascination. It was the best kind of promotion — the kind which costs nothing and for which you have to do nothing. The veil of secrecy has never really lifted; you don’t need two hands to count the total number of sit-down interviews given by the Stampers in the last thirty years. That can make things complicated for a fellow like me, but we shall do the best we can to ferret out some facts.

Born in 1958, Chris Stamper discovered computers while studying physics and electronics at Loughborough University. He was soon making one of his own, built around an RCA 1802 microprocessor, and learning to program it; this comfort with hardware as well as software would be key to his career. By 1979 he’d finagled a job at Associated Leisure, a new British firm set up not only to import Japanese and American arcade machines into the country but also to support the needs of the arcade owners who purchased them once they arrived. In that spirit, Chris spent much of his time working up conversion kits which would allow an owner to, say, convert a Space Invaders machine to a Galaxian when the former started getting long in the tooth. He also found a job for his less technical little brother Tim in graphics and design, and became firm friends with John Lathbury, another coder and hardware engineer.

Tim, Carole, and Chris Stamper

Their manager at Associated was a fellow named Norman Parker. Realizing the talent he had working under him, he convinced them all to leave Associated with him for a new venture: a company called Zilec, which would become one of just two British companies to manufacture and sell original arcade games. They became his secret weapons, engineering and programming games that were sold in Britain under the Zilec name or licensed to big Japanese companies like Konami and Sega.

The Stampers made a big deal of their time in the arcade industry in those first interviews as Ultimate, and with good reason. The experience they gained was invaluable. For one thing, the games they made were all built around the ubiquitous Zilog Z80, the chip also at the heart of the Sinclair Spectrum; Chris and John had every opcode and register permanently etched into their brains long before they saw their first Speccy. But just as important were the personal contacts they made. Barely out of their teens, they got to travel the world. Parker: “They saw all the best products from around the world. They learned their trade well.” They learned the delicate feint and parry of Japanese business decorum, and forged a bond with a Floridian named Joel Hochberg who had been in coin-operated entertainment since the days when that meant pinball machines; he had helped the young Nolan Bushnell launch the original Pong. Parker and Hochberg inculcated in them their own unsentimental view of electronic entertainment as first and foremost a business that should make them money. Parker:

They are completely down to earth about games. They know what a game has to do to make money. In the arcades a game has to make money immediately or it will almost literally be scrapped. They learned this important lesson from the arcade business.

Developing using Ultimate’s workstation

With the arcade industry beginning to go soft and British microcomputers sweeping the country, the Stamper brothers and Lathbury decided to bail on Zilec and start their own company to cater to that new market. Carole Ward came in as well, as another graphic designer and “company secretary.” From the beginning, Ultimate developed games in a way very different from the typical Speccy bedroom coder. As with so much else about Ultimate, details of their development system are not entirely clear, but those early interviews do describe it as a 32-bit multi-user system on which they could write and compile their code and ship it over to an attached Spectrum for execution. (A best guess would be a 68000-based Unix workstation.) The methodology obviously borrowed heavily from that used by Zilec and their competitors for writing arcade firmware, and cost “several thousand pounds.” They chose the Spectrum, its low-end model with just 16 K of memory, as their target platform for reasons “purely economic — to finance the costs of the development system and to provide revenue quickly they needed a big-selling computer, and the 16 K Spectrum fit the bill.”
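
Nothing more concrete than that survives about how the system actually worked, so the sketch below is purely a loose modern analogue of the workflow it describes (edit and assemble on the big machine, then squirt the result into a waiting Spectrum). It uses pasmo, a real present-day Z80 cross-assembler, plus pyserial; the port name, baud rate, and the little load-address protocol are all invented for illustration, and Ultimate’s actual link was surely custom hardware.

```python
import subprocess
import serial  # pyserial; assumes some serial link wired to the target

def build_and_send(source="game.asm", binary="game.bin", org=0x8000):
    # Cross-assemble on the host; pasmo emits a raw Z80 binary.
    subprocess.run(["pasmo", source, binary], check=True)
    with open(binary, "rb") as f:
        code = f.read()

    # Ship it to a receiver stub already running on the Spectrum.
    # The header (load address + length, little-endian) is a made-up
    # convention for this sketch, not anything Ultimate used.
    with serial.Serial("/dev/ttyUSB0", 19200, timeout=5) as link:
        link.write(org.to_bytes(2, "little"))
        link.write(len(code).to_bytes(2, "little"))
        link.write(code)

if __name__ == "__main__":
    build_and_send()
```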

Ultimate made their public debut in mid-1983 with Jetpac, followed by a spurt of three more games in two months and no shortage of self-confidence. Tim:

There is now an awful lot of software out for the Spectrum, but ours will always sell because it’s better. We have worried a lot of our competitors. Suddenly Jetpac came out from nowhere, but in fact we have more experience than all of them. I think we have raised the user expectation of what the Spectrum can do and software houses have been forced to raise their standards in line with us.

Jetpac

The games justified the hubris. Those early titles are fast-paced, colorful, and smartly designed, and pack a staggering amount of content into 16 K. They really did seem to reveal a whole new machine hiding inside the Speccy, one which no one had ever suspected was there and which no one but Ultimate knew how to access. While gameplay remains firmly in the three-lives, ever-more-difficult-levels mold of the arcade, the concepts are not only original but, gratifyingly, have you building things rather than just blowing them up. In Pssst you’re a gardener trying to grow a flower and protect it from rampaging insects; in Cookie you’re a baker trying to make a cake, Swedish Chef-style, with animate ingredients that don’t want to be baked. Gamers responded. Ultimate was soon rewarded with the biggest overall sales in the industry and the beginnings of the incomparable reputation they still enjoy today. By year’s end their games sat at #1, #3, and #4 on Computer and Video Games’s Spectrum sales top ten — and they had retreated into their offices, protected by thick perspex windows, an entry phone to ward off casual knockers, and “Private: Keep Out” signs on the garage at the rear that would soon hold a Lamborghini or two.

Ultimate’s sealed offices

But the games that would really make Ultimate legendary would come with the next batch, for which they stepped up to the 48 K Spectrum, the one almost everyone was actually buying by now.

Atic Atac

The first of these bigger games was Lunar Jetman, a sequel to their very first game Jetpac with more complex play and a vastly larger area but still with the standard arcade structure of level after level, each more difficult than the previous, unfolding until you die. Then, between that game and the next, Ultimate’s design approach underwent a quiet revolution, from bringing games that could have been arcade hits to the Spectrum to longer-form experiences that could only have been born and bred for the home. The next game, Atic Atac, is another real-time action game with Ultimate’s by-now-expected superb graphics, but it drops you into a haunted castle full of rooms to explore, connections to map, monsters to defeat or avoid, objects to pick up, little puzzles to solve, all while contending with a time limit. There’s a score, but it’s not the real point of the endeavor — that’s to collect the three keys which will let you escape the castle. If you do so, you’ve actually won and the game is over. Remind you of anything? Crash magazine’s reviewer grappled with what kind of game Atic Atac really was: “Atic Atac is no way a true adventure, but neither is it a shoot-the-baddies game.” In those days, a “true” adventure meant a text adventure (possibly with illustrations for flavor) because that’s all there was. The genre that Atic Atac pioneered would within a year be named the “action-adventure.”

Which is not to say that Ultimate was the absolute first to tread this ground. The canonical first example of the form is Warren Robinett’s game Adventure, which he coded for the Atari VCS in the United States in 1979. Robinett had seen and been fascinated by the original Crowther and Woods Adventure on a visit to Stanford University. His game for Atari was an explicit attempt to translate the elements that had intrigued him on Stanford’s big PDP-10 to a form playable on the very different hardware of the VCS — so explicit an attempt that he even appropriated the name. Robinett:

I think you could describe what I did as translating the adventure-game idea from one medium to another. You can call text-adventure games a “medium”: the output to the player is text descriptions, the player types text commands to the game. That’s all there is: no graphics, no sound, no animation. The medium I was translating to was the videogame, where you have a joystick — one button and the four directions (or eight if you want to think of it that way) — and it has graphics, has color, has animation, has sound. So it had to change to work in that medium.

Robinett’s game was primitive even by the standards of Spectrum software of just a few years later; it had to be to run in 4 K of cartridge-based ROM and 128 bytes of RAM. But it threw down a gauntlet — albeit one which few American software houses, working the fertile ground of Zork-style text adventures and Ultima- and Wizardry-style CRPGs, took up. The folks from Ultimate, who had traveled far and seen much of the international videogame business while at Zilec and thus had to have seen Robinett’s Adventure, now picked it up for Britain instead. Atic Atac and the Ultimate action-adventures which followed — Sabre Wulf, Underwurlde, Knight Lore, Alien 8, Nightshade, Gunfright, and Pentagram — touched off a veritable gaming subculture of copycats and, eventually, games which took the concept even further. The big open-world action-adventure would remain a cottage industry in British software through the end of the decade and beyond, with a string of iconic titles: Mercenary, Fairlight, Spindizzy, Head Over Heels, Exile, the Dizzy series, just for starters.

Ultimate reached their pinnacle of success at the end of 1984, when they released Underwurlde and Knight Lore simultaneously just in time for Christmas. The former was a continuation of what they had already wrought in Atic Atac and Sabre Wulf, but the latter marked the last big innovation they would gift to their eager public: the first isometric action-adventure.

Knight Lore

Ultimate’s worlds prior to Knight Lore, like those of virtually everyone else, were worlds of just two dimensions. (See my article on Elite for the reasons behind that and its ramifications.) They were viewed from directly above, from a strangely depthless sideways, or from even stranger, Pac-Man-like perspectives that conform to no obvious reality. Game developers had been frustrated by the limitations of 2D for years, but true 3D graphics, while hardly impossible, as David Braben and Ian Bell among others had proved, were very, very difficult to achieve on a simple 8-bit computer, and usually limited to wireframes if you wanted to get any performance at all out of them. Isometric perspective would turn out to offer a compromise, a way to get many of the benefits of 3D without most of the costs.

Engineers and architects had grown frustrated with the limitations of their 2D blueprints many years before videogames existed. They had realized that drawing a rectangular object like, say, a building from a sweet spot above and canted 45 degrees horizontally gave an excellent view of it in all its 3D glory without being very taxing on the drawer: it’s essentially just a matter of leaving vertical lines alone, rotating horizontal lines by -30 degrees, and rotating depth lines by +30 degrees. Best of all, because the projection is parallel rather than perspective, objects deeper in the picture remain exactly the same size as those closer to the viewer — again, a huge burden lifted from the drawer. Or from the programmer and her computer. The transformations required to display an isometric view on a computer are fairly trivial compared to full 3D rendering. There are inevitable limitations — the real world isn’t generally so neatly rectangular — but, still, isometric graphics offered a magic bullet, a way to get something for, if not quite nothing, a very reduced cost.
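
For the programmatically inclined, here’s a minimal sketch of that projection in Python. The axis conventions are my own, and real Spectrum-era engines typically cheated the 30-degree lines into a 2:1 pixel step (two pixels across for every one down) rather than do any trigonometry on a Z80, but the principle is the same.

```python
import math

COS30 = math.cos(math.radians(30))   # ~0.866
SIN30 = 0.5

def project_isometric(x, y, z):
    """Map a 3D grid point to 2D screen coordinates.

    Here x runs toward the viewer's lower right, y (depth) toward the
    lower left, and z straight up; screen y grows downward. There is
    no division by distance anywhere: this is a parallel projection,
    which is exactly why deeper objects stay the same size.
    """
    screen_x = (x - y) * COS30
    screen_y = (x + y) * SIN30 - z
    return screen_x, screen_y

# Project the eight corners of a unit cube. The four "far" corners
# come out at exactly the same scale as the four near ones -- no
# foreshortening to compute.
for corner in [(cx, cy, cz) for cx in (0, 1)
               for cy in (0, 1) for cz in (0, 1)]:
    print(corner, "->", project_isometric(*corner))
```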

Given that, it’s if anything surprising how slow game programmers were to catch on to their wonders. The first prominent games to use isometric graphics were a couple of arcade releases of 1982, Sega’s Zaxxon and Gottlieb’s Q*bert. Isometric graphics first came to the Spectrum in late 1983 in the form of Ant Attack, written by a young sculptor named Sandy White who applied his knowledge of real-world 3D construction to the construction of a virtual world inside his Speccy. But all of these titles were games in the traditional arcade mold. Ultimate applied the isometric perspective to a more ambitious, expansive action-adventure for the first time with Knight Lore. It was a match made in heaven; within a year virtually everyone else would make the switch as well. (One wag writing to Crash magazine labelled Knight Lore the “second most cloned piece of software after Wordstar.”) Not only did the isometric view bring a whole new depth (sorry!) of play, but it was stunning eye candy, becoming for most the ultimate (sorry again!) demonstration of Ultimate’s graphics prowess. One reviewer on a Spectrum nostalgia site: “I think that setting eyes on Knight Lore for the first time must have been like seeing King Kong on the screen back in the 1930s when the film was released. It (Knight Lore) was the single most staggering thing I’d ever seen on a Spectrum.” Another: “I was there in the computer shop on the day Knight Lore was first loaded into the demonstration Speccy. Within thirty seconds there was a crowd about seven deep, everyone clambering over each other to get a better look.” It really was that amazing. As with Ultimate itself, a gauzy patina of awe still surrounds Knight Lore today amongst British gamers. A 1994 article in Edge magazine pronounced it nothing less than “the greatest single advance in the history of computer games.” That’s an assertion that reveals a certain loss of perspective (done now, I promise!) in my opinion, but it’s nevertheless a very, very important moment whose reverberations would echo for years through British gaming and well beyond.

Knight Lore is also enshrined in Ultimate lore for another reason. In an interview given to The Games Machine magazine in December of 1987 (the first they had agreed to since those early pieces back in 1983), the Stamper brothers dropped a bombshell:

Knight Lore was finished before Sabre Wulf. But we decided then that the market wasn’t ready for it. Because if we released Knight Lore and Alien 8 — which was already half-finished — we wouldn’t have sold Sabre Wulf. So we released Sabre Wulf, which was a colossal success, and then released the other two.

This little anecdote has had huge resonance in the years since because it so confirms the legend of Ultimate as both technical and artistic geniuses (they were doing isometric action-adventures even earlier than we thought!) and iconoclastic masters of PR and marketing (who else was thinking in terms of what the market was “ready for” in the wild and wooly days of the early British software industry?). And indeed, assuming that it’s completely true, it does show uncanny foresight of a sort which many other publishers could have used. (Just to give one example: it strikes me that Beyond and Mike Singleton would have done better to wait another few months at least before releasing Doomdark’s Revenge, to let the buzz from The Lords of Midnight run its course and fully prime the pump for the sequel.)

Ultimate’s star dimmed somewhat during 1985, beginning as early as the next game, Alien 8, which prompted a few grumbles amidst the usual superlative reviews that it was really just Knight Lore in Space. By the time of Nightshade and Gunfright a developing critical consensus had it that Ultimate was not only failing to advance their action-adventure formula but actually beginning to regress, offering less interesting puzzles, less control, and less to do. A handful of underwhelming Commodore 64 games done by outside contractors — this after Ultimate had sworn they would always do all of their games in-house — damaged their reputation even more. In truth, the divide in quality pre- and post-Knight Lore was probably exaggerated; the British gaming press had more than a whiff of the tabloid about it, and generally delighted in tearing down heroes even more than it enjoyed building them up. (One magazine was so cruel as to gleefully follow Mark Butler around from developer to developer as he desperately searched for a job following the Imagine Software debacle. With that example of how the press treated previous golden boys, the folks at Ultimate could probably count themselves lucky that their own fall from favor was confined to some harsh reviews.) Then, just as 1986 began, Ultimate did something else they’d said they’d never do: sold out to the big, well-funded house U.S. Gold, who had become a huge player by licensing, repackaging, and distributing American software for the British market. Some more underwhelming games followed, and then the Ultimate imprint disappeared entirely, making its final appearance on a Collected Works anthology that packaged together their first eleven Spectrum games — all of the ones which the founders had actually programmed. Lathbury and the Stampers disappeared entirely for a couple of years. Speculation about just where the hell these most legendary Speccy developers of all had gotten to ran rampant.

What had happened, it eventually emerged, was one of the strangest transformations in the history of gaming. The Stampers claim that as early as late 1983, even before the run of action-adventures which would form the cream of Ultimate’s legacy, they brought the first Nintendo Family Computers (Famicoms) into the office. These strange little toylike game consoles, completely unknown in Europe or North America, were the hottest thing going in Japan, as Ultimate knew well from their ongoing international contacts. As home consoles crashed everywhere else in the world, as the received wisdom from all the British and American pundits said that consoles were dead and home computers were the future, the Famicom was motoring past the half-million-unit mark after a matter of months on the market. Tim Stamper:

The machine, for the price it was available in Japan then [about $100 US], had colossal potential — we looked at this and we looked at the Spectrum — and then the Spectrum was hot stuff, but this was incredible. So we spent possibly eight months finding everything out about this system — its custom chips, and it takes a fair bit of work — we managed to do that and then started to write on the machine.

It was at this time, around the point when Knight Lore was about to hit store shelves, that the Stamper brothers set into motion a master plan to gradually divest themselves of the Speccy. The Speccy was huge in its home country, but it was in some sense provincial; it would never penetrate far beyond Britain and a few other European markets despite hopeful arrangements with Timex and others that tried to introduce it to North America and other such tempting places. (Eastern-bloc countries cloned the hell out of it, but that of course didn’t lead to more software markets for the Western likes of Ultimate.) Yes, you might make the same argument about the Famicom, but Nintendo was a smarter company than Sinclair with far more resources at their disposal and long-term plans in the offing for worldwide domination, as the Stampers knew well from their Japanese contacts. Ultimate was doing brilliant, unprecedented things with the Speccy graphically, but the Nintendo was more obviously capable in that area right out of the box; just imagine how far they might push it. And the Famicom had one other hugely appealing quality: its programs were sold on cartridges rather than tapes or disks. This made it a much more approachable system for children, novices, and the computerphobic: playing a game was a simple matter of shoving a cartridge in and turning on the machine; no entering arcane commands, no waiting twenty minutes with fingers crossed. Best of all, cartridges made casual piracy impossible — piracy which many Speccy publishers claimed was costing them half or more of their revenue.

So, for most of the same reasons that other publishers would later cite, but far, far earlier than any of them, Ultimate decided to jump on the console bandwagon. Working without manuals or Nintendo’s development kits, they figured out how the Famicom worked and developed ways to program it. Nintendo, however, ruled the Famicom ecosystem with an iron fist. The only way to publish for the machine was through them, and they weren’t in the habit of granting licenses to companies outside of Japan. Enter their old American friend Joel Hochberg, who had longstanding relationships with Nintendo. He approached Nintendo executive Minoru Arakawa and finagled a meeting for the Stampers. Nintendo was impressed enough with the demos they showed that they signed them. Hochberg and the Stampers — Lathbury quietly dropped out of the picture around this point for reasons that have never been explained — formed a new company, Rare, to develop for the Nintendo, using the money still coming in from Ultimate and, eventually, Ultimate’s sale to U.S. Gold to fund this new entity. Rare went on to become one of, if not the, most valued software partners of Nintendo and one of, if not the, most prolific and successful makers of Nintendo games outside Nintendo themselves, while the odd little Famicom went on to change everything everywhere as the Nintendo Entertainment System.

That, anyway, is the story as the Stampers have told it and the one that has gone down in gamer legend. I don’t have any evidence to refute it, although I do have a sneaking suspicion that there’s a bit of PR and self-mythologizing going on here, that it wasn’t all that meticulously plotted from the beginning. Certainly one must admit that it smacks just a bit of, say, a really convoluted Homeland plot which depends on every person reacting just so and every link in the chain connecting flawlessly. Ah, well… I suppose gamers need their legends too. Whatever the dirty details, the Stampers have proved themselves to be very adroit businessmen by anybody’s standards. And we’ll leave it at that.

Technically superlative and meticulously crafted as Ultimate’s and Rare’s games are, I’m not a huge fan of them in general. They’re just so commercially calculated, so plainly aimed with pinpoint precision at the center of a market demographic that I can’t feel much, for lack of a better word, soul in them. I’ll cast my lot with the dreamers and crazed would-be electronic artists instead, even if their games aren’t quite so air-tight technically. I’d perhaps feel better about the Stamper legacy if there were any mention of joy or fun in the few interviews we have with them to balance all the talk of competencies and markets and competition and games as “products.” (Tim: “We actually act, I suppose, as Nintendo’s development team. If they feel they are lacking a product on a machine, they tell us, we develop it, and so we are sure of licensing product to them.”) But I must also acknowledge that I wasn’t there when Knight Lore and the others first wowed players, and my view of them might be different if I had been. Certainly the legend of Ultimate Play the Game will live on, and perhaps that’s as it should be. Yes, gamers do need their legends.

(Magazine sources: Your Computer of June 1983; Home Computing Weekly of August 9 1983; Popular Computing Weekly of August 18 1983, November 10 1983; Personal Computer Games of Summer 1983, January 1985; Computer and Video Games of March 1986; The Games Machine of March 1988; Retro Gamer #20; Edge of September 1994; Commodore User of July 1985; Next Generation of November 1995. Warren Robinett and Atari Adventure material drawn from Racing the Beam by Nick Montfort and Ian Bogost and Jason Scott’s Get Lamp interview with Robinett. Nintendo material drawn from Game Over: How Nintendo Conquered the World by David Sheff.

Rare has consistently refused to grant permission to distribute the old Ultimate catalog and been quite aggressive with those who do so without approval. Since their lawyers are doubtless bigger than mine, no downloads this time. Sorry!)


Mike Singleton and The Lords of Midnight

The Lords of Midnight

If Ian Bell and David Braben, brashly young and brashly nerdy in that traditional math-and-science sort of way, became the baseline expectation for British game developers, Mike Singleton, creator of the beloved adventure/strategy hybrid The Lords of Midnight, was the outlier. He was older, for one thing, already in his mid-thirties at the peak of his fame in the mid-1980s. And his background was also different. After a failed stab at becoming a theoretical physicist, Singleton had graduated university with a degree in English, and spent a decade teaching the subject before computers arrived to provide an outlet for his latent talents for game design and programming. As Singleton himself put it, this background gave him an advantage over many of his peers in that “I am able to spell correctly, thank goodness.” More significantly, there’s a certain sense of nuance, even of grandeur, to The Lords of Midnight and some of his other games that distinguishes them from the more typical teenage-Dungeon-Master virtual worlds that crowded them for space on store shelves. His road from English teacher to one of the most famous developers in Britain not named Bell or Braben began in the late 1970s with, of all things, a betting shop.

The shop in question was owned by a friend in his home town of Liverpool. Said proprietor was always complaining about the math involved in some of the more elaborate bets favored by his patrons, such as the “around the clock,” which consisted of thirteen individual wagers placed on three different horses. Singleton offered to help out by writing some routines on the Sinclair programmable calculator he’d just received as a birthday gift. When that looked promising, they spent £100 on a Texas Instruments TI-59 calculator to take things to the next level. When that also went well, they upgraded yet again, this time to a Commodore PET with the idea of creating a complete PET-based software suite for managing all of the functions of a betting shop — a package which they could sell. But it didn’t work out. They just couldn’t figure a way to get the software to run fast enough to keep up with the fingers of an experienced bookmaker with a line of eager punters to service.
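
I won’t pretend to reconstruct the exact composition of the “around the clock” and its thirteen wagers, but the flavor of the arithmetic Singleton was automating is easy to show. Here’s a minimal Python sketch, assuming modern decimal odds, that settles the simpler full-cover combination of singles, doubles, and a treble over three horses (seven bets in all); every multi-leg bet multiplies its winners’ odds together, which is exactly the sort of repetitive calculation that cries out for a programmable calculator or a PET.

```python
from itertools import combinations

def full_cover_returns(odds, won, stake=1.0):
    """Settle singles, doubles, and a treble over three selections.

    odds: decimal odds per horse; won: whether each horse won. Each
    multi-leg bet pays stake times the product of its winning legs'
    odds, and loses entirely if any leg loses.
    """
    total = 0.0
    for size in (1, 2, 3):                    # singles, doubles, treble
        for legs in combinations(range(len(odds)), size):
            if all(won[i] for i in legs):
                payout = stake
                for i in legs:
                    payout *= odds[i]
                total += payout
    return total

# Three horses at 3.0, 4.5, and 2.0; the first two win. Returns the
# two winning singles plus the winning double: 3.0 + 4.5 + 13.5 = 21.0.
print(full_cover_returns([3.0, 4.5, 2.0], [True, True, False]))
```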

Looking around for some alternative use for their hardware investment, Singleton and his partner came up with an idea that was, in its way, visionary. What if they let customers play and bet on the computer whilst hanging about the shop waiting for race results? Singleton’s first game (of a sort, anyway), Computer Race, was born for this mercenary purpose. Customers who fed it money were rewarded with an onscreen, animated horse race. Singleton based his horses’ movements on a picture to be found on the wall of half the betting shops in Britain, Eadweard Muybridge’s “The Horse in Motion.”

The Horse in Motion

The outcome of each race was determined randomly, with the customary small edge for the house. Again, Singleton and partner envisioned selling the game to betting shops all over the country. But again, their hopes came to little. This time they were undone by a legal rather than a technical problem. They ran afoul of British laws designed to keep betting shops as unwelcoming as possible. Singleton: “Having chairs is a little bit dicey, you might be contravening the laws — you’re encouraging people to go in a betting shop.” After narrowly losing a test prosecution, they decided the plan was too risky to go ahead with. It seemed the era of virtual racing must still be some years away. They sold exactly one Computer Race setup, to a betting shop in Ireland not subject to British law, where it played merrily for some years.

Having failed to become a gambling mogul, Singleton “took the computer and ran” to see if he could find another way to earn back its purchase price. Already a pretty good assembly-language programmer thanks to his experience with the betting shop, he wrote an action game he named Space Ace and sold it to PETSoft, one of the first software publishers to spring up in Britain. It sold 200 to 300 copies — not bad given the size of the British computer market at the time. But Singleton sensed a bigger opportunity in the offing. Despite their name, PETSoft had entered negotiations with Sinclair Research to market a line of programs for that company’s first microcomputer, the ZX80. Unfortunately, the deal fell through. Fortunately, Singleton was a persistent sort. He called Clive Sinclair directly to offer his services as an experienced programmer with a published game to his credit. Uncle Clive took a shine to him, and invited him up to Cambridge to pick up a prototype of, and sign a contract to develop some games for, Sinclair’s second machine, the ZX81.

To say the scope of possibility was limited on the ZX81 hardly begins to state the case; the machine had all of 1 K of memory. Over his two-week break from teaching that Christmas of 1980, Singleton knocked out half a dozen simple BASIC games and shipped them back to Sinclair to become Games Pack I. Now he began to reap some real financial rewards from his efforts at last: Games Pack I, that product of two weeks’ effort, earned him an astonishing £6000 during the following year. By way of perspective, know that £6000 was the average Briton’s yearly salary in 1980. “It was the best rate of pay I’ve ever had or am ever likely to have!” noted Singleton wryly.

If he’d been chasing the money until this point, Singleton nevertheless genuinely loved games. He was an avid board gamer and would-be game designer from childhood; he designed his first “James Bond-style” board game at 13, and retained an enduring fascination with Go throughout his life. In 1977 he discovered his first computer game (again, of a sort). Starweb was a play-by-mail game from the pioneer of the format, the American company Flying Buffalo. A harbinger of countless space-based grand-strategy games to come, Starweb required that players communicate each move on paper via post, writing their commands out using an arcane language to be entered into the game’s host, an exotic Raytheon 704 minicomputer. Once all of the players’ moves were entered, the Raytheon spit out the results in another arcane format that was, just for extra fun, completely different from the command-entry format. These reports were sent to players by return post. When two players met in the game, each was provided with her counterpart’s address, so they could talk amongst themselves and plot alliances and wars. And so it went, at a cost of $1.75 per move. Despite the extra postage cost and time delay entailed by his living in far-off England, Singleton was entranced. His first game lasted two years, ending in his victory over about fifteen others. He continued to play avidly thereafter. (Starweb, now approaching its 40th anniversary, is still an ongoing concern. The only obvious change from the game Singleton knew is that the communication can now be done via email.)

With that £6000 burning a hole in his pocket, Singleton now decided to press his faithful PET into service yet again to be the host of Starlord, a, shall we say, tribute to Starweb more accessible to British and European gamers. The PET got some pricey exotic technology which ate up a good chunk of the £6000: a hard disk and a color ink-jet printer. The latter allowed him to send players full-color maps of the galaxy and their position in it. Indeed, he made the entire game more colorful and friendly than its inspiration: “order forms” were now used to communicate your moves and their results were sent back in plain, readable English along with the map and copious status reports. All for the modest fee of £1.25 per turn. It was a lot of fiddly work for Singleton, but with players soon numbering in the hundreds (the game would peak around early 1984 with over 700), it was also quite profitable. When he realized he was earning more from Starlord than he was from teaching, he quit his day job.

He supplemented his Starlord income with more action games for the low-cost microcomputers that were now starting to flood Britain. First came Shadowfax, an unauthorized Lord of the Rings knockoff of the sort the rising profile of the computer-games industry wouldn’t allow for much longer. It had you playing Gandalf riding against the Black Riders; he reused the horse animations from Computer Race. It was also the first sign of a Tolkien fixation that would flower more elaborately in The Lords of Midnight. Shadowfax was followed by Siege and then Snake Pit (Singleton’s personal favorite from this early stage of his career). All were released for the Commodore VIC-20 and later the Sinclair Spectrum on the Postern label, and all did quite well.

Computer and Video Games magazine, October 1983

He went through a phase of being fascinated with the possibilities for 3D displays — meaning 3D in the sense of those funny glasses you wear at the movies, not 3D rendering as in Elite. 3 Deep Space, released initially for the BBC Micro — Singleton apparently never met a machine he didn’t want to program — would be, Computer and Video Games magazine confidently predicted, “the first of a flood of stereoscopic games to hit the micro shops.” Or not, although Singleton gave it a hell of a try, porting 3 Deep Space to the Commodore VIC-20 and 64 as well as the Spectrum and publishing a bunch of 3D type-in listings along with the bound-in glasses to view them in the same issue of Computer and Video Games that made the aforementioned prediction (funny how that worked out). 3 Deep Space became the first flop of Singleton’s career.

His close relationship with Computer and Video Games was thanks to his friendship with that innovative magazine’s founder and editor, Terry Pratt. He even once ran a special simplified version of Starlord (called The Seventh Empire) just for the magazine’s readers. Pratt left Computer and Video Games in 1983 to take charge of a new publisher being set up by the media conglomerate EMAP, to be called Beyond Software. He immediately started urging Singleton to leave Postern and come write games for him. In September of 1983 Singleton invited Pratt to his home in Chester, where they discussed three different ideas that might be turned into games. By the time of a 2004 interview for Retro Gamer magazine Singleton had forgotten what two of them were, but the third had been indelibly stamped not only in his head but in those of many thousands of gamers. It was a new graphics technique he called “landscaping” which he believed could be made to work on the Speccy. Given a grid-based map containing forests, plains, mountains, lakes, etc., landscaping would let him programmatically generate and draw a first-person view from any square on the grid, facing in any of the eight compass directions. He thought he might use it in a game far more ambitious than his previous simple action titles: something like the current bestseller The Hobbit, only much, much better. He believed he could pack a 64 × 64-square “game board” into his Spectrum’s 48 K of memory. This added up to 4096 individual locations to visit, each with eight possible views: over 32,000 individual pictures to see. The ad copy would practically write itself. Pratt told him to go for it: “I would first implement the technique and then, providing it worked, we’d go ahead with the full game.”
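
Singleton never published the code behind landscaping, so the Python sketch below is strictly my own illustration of the general idea, with every name and constant invented for the purpose: scan the map squares inside the viewer’s field of vision from the deepest row forward, and emit each visible feature with its sideways offset and distance, so that nearer features can be drawn larger, lower on the screen, and on top of farther ones.

```python
# Eight compass facings as (dx, dy) steps on the map grid.
FACINGS = {
    'N': (0, -1), 'NE': (1, -1), 'E': (1, 0), 'SE': (1, 1),
    'S': (0, 1),  'SW': (-1, 1), 'W': (-1, 0), 'NW': (-1, -1),
}

def landscape_view(terrain, x, y, facing, max_depth=7):
    """Yield drawable features for a first-person view from (x, y).

    terrain is a grid of one-character terrain codes. Squares are
    visited from the deepest row forward so that nearer features get
    painted over farther ones (a painter's algorithm). Nothing is
    stored per view: the picture is regenerated from the map every
    time, which is the whole memory-saving trick.
    """
    fx, fy = FACINGS[facing]
    px, py = -fy, fx                           # perpendicular step
    for depth in range(max_depth, 0, -1):      # far to near
        for side in range(-depth, depth + 1):  # widening frustum
            mx = x + fx * depth + px * side
            my = y + fy * depth + py * side
            if 0 <= my < len(terrain) and 0 <= mx < len(terrain[0]):
                feature = terrain[my][mx]
                if feature != '.':             # '.' = empty plain
                    yield feature, side, depth

# A toy 8x8 map: m = mountain, f = forest, . = plains.
world = [
    "..m.m...",
    ".m...f..",
    "........",
    "..f..f..",
    "........",
    ".m......",
    "........",
    "........",
]
for feature, side, depth in landscape_view(world, 4, 6, 'N'):
    print(f"draw {feature!r} offset {side:+d}, {depth} squares away")
```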

As would be the case with David Braben’s first rotating 3D spaceships, from this technical germ would be born a spectacularly innovative game design. Singleton envisioned a strategic war game that would play like an adventure game. Instead of viewing the conflict from on-high, you would see it through the eyes of those whose actions you controlled, whom you would be able to switch among and order about individually. The experiential aspect of war games, the role of the imagination in evoking their unfolding narratives of conflict in the mind’s eye, had always been an important part of the form’s appeal, much as stuffy grognards muttering about analyzing history and scoffing at games of mere make-believe might have sometimes been loath to admit it. Now Singleton could really bring that aspect to the fore, really let you live the experience through your generals’ eyes in a way that felt more like an adventure game. Yet there would be no parser, and text would be kept to a minimum to save memory for code. Singleton, who like many hardcore strategy gamers had little use for traditional adventures with their static worlds and static puzzles, was determined that his game be an unpredictable, dynamic, replayable experience like the cardboard war games he loved: “Routes aren’t dictated by the programmer in advance. You are in control of the main characters and their ultimate destiny.”

The Lords of Midnight

Of course, he would need a fictional context for the game. He thought of adopting the setting of a new play-by-mail design he’d been working on called The Lords of Atlantis, but Pratt wasn’t excited about that. Anyway, an underwater setting probably wasn’t worth the trouble. A medieval fantasy setting, he decided, would yield the best combination of popular appeal and ease of implementation, and be a natural fit for the sensibility of a Tolkien fan like himself. He decided to make his land a cold, icebound place simply because he “liked the combination” of the Spectrum’s shades of white and blue. He conceived of a land in which seasons last eons. His story would take place on the winter solstice, the darkest, coldest point just before a hoped-for new dawn and gradual thaw. The Lords of Atlantis became The Lords of Midnight. Now he started in earnest to build a world:

I drew a large map, which I still have, in nice felt-tip colours. It isn’t quite as difficult as it sounds and really I bet Tolkien did the same — you start off with a few word endings and tack different syllables on the front until you come up with something that sounds good, so you sit there going “Ushgarak, Ashgarak, Ighrem” to yourself until you get something that sounds nice or horrible according to what you want. [In reality the linguist Tolkien, who started with rigorously worked-out made-up languages and wrote The Lord of the Rings and the rest of the lore of Middle Earth largely to learn about the people speaking them, would have been aghast.] And once I got the map, then I started doing the story before I got on with any programming, and it was really the story that built up the atmosphere. I managed to get through the story quite quickly, in about three weeks. And that clarified all the major characters.

The story Singleton wrote, the preamble to the War of the Solstice, would eventually be included with the game as a novella. Taken on one level it’s just another Tolkien knockoff, if reasonably well written as such things go. We’ve got the usual stand-ins for Sauron (Lord Doomdark), Aragorn (Luxor), Frodo (Morkin), Elrond (Corleth), Gandalf (Rorthron), and Gollum (Fawkrin). The game itself flies its Tolkien flag with equal pride. There are two ways to win which neatly parallel the two stories told in The Two Towers and The Return of the King: use Luxor (Aragorn) and as many of the thirty other Lords of the Free as he can rally to his banner to defeat Doomdark (Sauron) militarily, or sneak Morkin (Frodo) deep into Doomdark’s realm to destroy the Ice Crown (One Ring), source of Doomdark’s power.

The Lords of Midnight

The Lords of Midnight

Yet there’s a sense of the timeless to The Lords of Midnight which is seldom seen in other games then or now. Singleton does not just appropriate the surface tropes of The Lord of the Rings but also manages to capture some of its epic grandeur. In their advertising copy, Beyond would ceaselessly pound on that word “epic”: “The Lords of Midnight is not simply an adventure game nor simply a war game. It is really a new type that we have chosen to call an epic game, for as you play The Lords of Midnight you will be writing a new chapter in the history of the peoples of the Free.”

“Epic” is a word, like “tragedy” or “hero,” that’s been overused by our Hollywoodized culture to the point of meaninglessness; these days movies based on toy robots are epic sagas, and that party your friendly local teenager went to last weekend was epic, dude. The Lords of Midnight, however, reaches back to the word as it applies to The Iliad, The Odyssey, Paradise Lost, and, yes, Tolkien’s modern epic The Lord of the Rings. Not that I’m making direct artistic comparisons here, mind you. It’s just that there’s a similar if more modest windy majesty blowing through Mike Singleton’s land of Midnight, something I can’t quite put my finger on that makes it more than your typical fantasy pastiche. One need only read a few comments from the many players who remember the game well to realize that I’m not the only one to feel it. More so than strategic depth (which it has to a surprising degree; folks continue to discuss and refine their strategies to this day) or anything else, that feeling seems to be the defining trait of The Lords of Midnight for most players. It manages to be epic in the classical sense when other games settle for adjectives like “exciting” or “interesting” or “immersive.” It’s in the graphics; it’s in the simple elegance of the mechanics (there are only a few actions each character can perform); it’s even in the text, which is stately, literate, and dignified while necessarily remaining very succinct. Even when you lose — and you will lose, early and often, before you figure out how everything works and develop a serviceable battle plan — you feel like the destruction of the People of the Free also has an epic resonance of its own; the story of Hector can be as compelling as that of Odysseus, after all.

Although they are very different games with very different personalities, The Lords of Midnight and Elite have a similar sense of scope about them. Both are incredibly complex systems in light of the hardware on which they run. More importantly, however, both are masterful examples of the Eliza effect writ large: they make you want to believe — make you actively imagine — that there is more to their universes than there actually is. Paradoxically, their very verisimilitude can be linked back to their hardware limitations. With scant memory and no disk storage to fall back upon, their creators were forced to generate most of their content procedurally rather than constantly slurping up static assets — whether in the form of graphics, sounds, data, or just text — from disk. The result feels unpredictable, flexible, and responsive to you in a way that most contemporary “epic” American adventure and CRPG games, which by The Lords of Midnight‘s time routinely spanned multiple disk sides, never manage with all their reams of static data. Taken purely in terms of the feeling of scope and possibility that they evoke, The Lords of Midnight and Elite are still some of the most awe-inspiring virtual worlds ever made.
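
The memory arithmetic behind that trade-off is easy to illustrate. Content stored as data costs bytes for every location; content regenerated deterministically from a seed and a set of coordinates costs only the code that regenerates it. Here’s a trivial Python sketch of the principle, not of any particular game’s actual algorithm:

```python
import random

def describe_square(world_seed, x, y):
    """Recreate the 'content' of map square (x, y) on demand.

    Seeding the generator from the coordinates means the same square
    always comes out the same on every visit, yet nothing is stored:
    a 1000 x 1000 world occupies no more memory than a 10 x 10 one.
    """
    rng = random.Random(f"{world_seed}:{x}:{y}")
    terrain = rng.choice(["plains", "forest", "mountains", "ruins"])
    mood = rng.choice(["bleak", "silent", "windswept", "moonlit"])
    return f"You stand amid {mood} {terrain}."

print(describe_square(1984, 17, 3))
print(describe_square(1984, 17, 3))  # identical, with no saved state
```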

Whatever other qualities The Lords of Midnight shares with Elite, it had nothing like that game’s extended gestation time. Having spent the last three months of 1983 developing the landscaping technique and designing the game and its world and of course writing the novella, Singleton didn’t start actually coding the game until January. He finished in a bare three more months of twelve-hour days, turning the whole thing in to Pratt that April. The whole project consumed less than seven months from start to finish. Within weeks, the game was on shelves. The final product was, in defiance of Beyond’s less than rigorous testing regime and the hard-won wisdom of anyone who’s ever programmed seriously, very nearly bug free. It’s a remarkable achievement indeed, one of the last of the great lone-wolf games. Singleton had done everything: design, graphics, coding, even writing most of the manual in the form of that novella. Technology and ever-greater player expectations would soon make such an approach, even for an apparent near savant like Singleton, untenable.

Mike Singleton shows off The Lords of Midnight, 1984

Beyond, who had failed to make much of a splash with their first few titles, knew they had something special in The Lords of Midnight. They began previewing the game even before Singleton was completely done with it, creating a buzz of interest. By now it had become standard practice in Britain for publishers of hot new games to tell players to contact them as soon as they won, with booty and ten minutes of fame either explicitly or implicitly up for grabs to the quickest and the cleverest. Recognizing the game’s literary texture, Beyond came up with one of the craziest such contests ever staged. Prospective winners, they said, should laboriously print every single screen of their experience, using a command Singleton had helpfully added for that purpose, to create “an illustrated history of the War of the Solstice.” The first person to send in a complete history of a winning game would become “the coauthor of a fantasy novel based around your adventures in Midnight,” which Beyond would “arrange for a fantasy writer to turn into a book to be published by one of the UK’s top fantasy publishers.” An audacious idea indeed, but one that turned into a fiasco. Whatever agreements Beyond had or thought they had, they ultimately couldn’t find any name author to write such a book nor anyone to publish it. The winning player, who had sent in his thick sheaf of papers just two weeks after the game was released, was quietly pacified with some alternate prizes, and the whole episode was flushed down the memory hole as quietly as Beyond could manage.

That hiccup aside, everything went smashingly. The Lords of Midnight sold 10,000 copies in its first two weeks at the unusually expensive price of £10. It turned into a solid success, albeit one that hovered around the tenth position on the sales charts for months on end rather than a chart-topper. No worries; that felt somehow appropriate, given the sort of slow-building but ultimately entrancing experience it is to play. Almost every Speccy-focused or platform-agnostic games magazine in existence published tips, maps, and often complete strategic solutions for both the quest victory and the military victory over the next year. There’s no better illustration of the esteem in which it came to be held by at least the more cerebral wing of Speccy gamers than Crash magazine’s big 1984 year-end poll. The Lords of Midnight was voted “Best Text/Graphical Adventure” by no less than 51% of voters; its closest competition came in the form of Sherlock with all of 10% of the vote. The game’s reputation only continued to build as the years passed and magazines like Crash turned more and more to dwell on past glories in light of the slowly dwindling supply of new Speccy games. In 1991 they dubbed it and its sequel “probably the two most exciting computer games ever written.” Its position in Spectrum lore remains inviolate today. (Beyond also funded ports to the Commodore 64 and Amstrad CPC, but those, while very positively reviewed and successful enough, never quite attracted the worship that Speccy lovers continue to bestow on the original.)

That aforementioned sequel, Doomdark’s Revenge, appeared, following another frenzy of design and programming on Singleton’s part, just in time for Christmas 1984. It was largely what one might expect: an even larger map, a few new commands to try, “few major surprises” (as Crash admitted in an otherwise glowing review). It was undoubtedly (as another reviewer noted) “more sophisticated and more difficult,” and became another big success for Beyond, but most today feel it failed to improve on its predecessor as either game or fiction. While it’s not a bad game by any stretch, even Singleton himself eventually came to recognize the original as superior. Ironically, some of its flaws were down to its increased “sophistication,” particularly its elaborations on the rather simplistic AI of the original. Singleton:

With 20/20 hindsight, the way the characters in Doomdark’s Revenge made and broke alliances of their own accord and moved about the map on their own quests made things too unpredictable for the sort of strategic planning a player could do in Lords. Perhaps some other feedback — in the form of news or intelligence information — to the player on what was actually going on in the (largely unseen) background between the other characters would have made this feature really work.

That said, I should also note that there is a substantial minority of players who prefer the larger map and more unpredictable play of Doomdark’s Revenge. Whatever else we say about it, the fact that a program running on a 48 K Spectrum is making and breaking alliances amongst non-player characters is pretty amazing in the same way as are the dynamic characters of The Hobbit and Sherlock — and, perhaps, similarly problematic from the standpoint of playability, falling into an uncanny valley of AI which lessens rather than furthers the Eliza effect.

Singleton posing for Crash magazine at the peak of his fame

The scope of The Lords of Midnight and Doomdark’s Revenge as well as innovative techniques like landscaping were so inspiring that Singleton couldn’t help but become a hacking hero even amongst Spectrum owners who lacked the patience to sit down and properly play the games. While he would never challenge Bell and Braben in the mainstream-fame sweepstakes, he was a big, big man in British computing for several years, with the magazines all jostling to interview him about not only coding and design but also such pressing concerns as his favorite food (steak and chips) and favorite pop groups (Pink Floyd, Deep Purple, and Led Zeppelin; 1970s hard rock was quite the staple amongst 1980s hackers).

But the interviewers’ favorite question was always about the status of The Eye of the Moon, the final game in what had been promised from the first to be a trilogy (how could such an unabashed Tolkien pastiche be anything else?). Singleton delivered tidbits about what the last game would entail, including a yet larger map and a different structure that would break the game (and the map) into a number of smaller quests, each building upon the previous and increasing in difficulty, to keep it from becoming overwhelming in the way that Doomdark’s Revenge already arguably was. But first he wanted to write a new, different sort of game for the Commodore 64, his first with a collaborator — thus marking the end of this lonest of lone-wolf coders, a sign of changing times if ever there was one — called Quake Minus One. Then Beyond, after barely two years as an independent entity, got bought by the giant British Telecomsoft, who were throwing their weight and money around the British games industry like crazy during the mid-1980s. BT had already, you may remember, outbid everyone else for the biggest game in the industry, Elite. They had also bought a Star Trek license from Paramount, and Singleton was drafted to work on that game for a time while The Eye of the Moon continued to languish. The Star Trek project didn’t go well at all, and he left to form his own small development house, Maelstrom Games. In the end, The Eye of the Moon never appeared, doomed by some combination of other pressing priorities, over-ambition, and legal uncertainty about who really owned what; Singleton and Pratt, you see, hadn’t bothered with contracts, only “gentleman’s agreements,” which was fine until a big, fussy company like BT came into the picture. Maelstrom did finally make and release an alternative concept as a third game, Lords of Midnight 3: The Citadel, in 1995 on the Domark label, but probably shouldn’t have bothered; few have much good to say about that effort.

Singleton’s career after his glory years of the mid-1980s is a motley mix of the worthy and the similarly underwhelming, with one more unimpeachable classic, 1989’s Midwinter, thrown in for good measure. Lords of Midnight 3 marked his last project in the role of designer. He remained in the industry until his untimely death in 2012, but was forced to content himself with lower-profile, mostly purely technical roles on big teams, working on games whose titles sometimes sound like something from an Onion parody of manic videogame culture: HSX: Hypersonic Extreme, Wrath Unleashed. A long, long way from the stately dignity of The Lords of Midnight…

Having praised that game so effusively here, I should note before concluding that it isn’t as playable as, say, Elite. Formulating a strategy and keeping track of up to 32 individual leaders of the Free, not to mention Doomdark’s hordes, all but requires extensive notes and visual aids. The map views and more usable status screens that a modern strategy game of this ilk would have are painfully missing, leaving your own elbow grease as the only viable alternative. Perhaps the most practical approach is to print out one of the detailed maps published by one of the magazines on the biggest sheet of paper you can find, then to borrow some counters to use on it from the nearest handy board game. I’ve included a pretty good map, from Crash‘s July 1991 issue, along with the Spectrum tape image and the manual in a zip file which you’re welcome to download.

If all that sounds like too much effort, I highly recommend that you check out Chris Wild’s loving remake for iOS, Android, OS X, and Windows. It adds exactly those conveniences that can feel so painfully missing to modern sensibilities while still preserving the atmosphere and wonder of the original. Either way, by all means play it if you haven’t already. It’s one of the greats — and, particularly when taken in tandem with Midwinter, more than legacy enough for anyone.

(Sources: Crash of June 1984, January 1985, February 1985, March 1985, June 1987, April 1989, September 1991; Micro Adventurer of August 1984, November 1984; Personal Computer Games of August 1984; Your Computer of April 1986, December 1987; Big K of July 1984; Computer and Video Games 1985 Yearbook and of May 1982, January 1983, October 1983; Popular Computing Weekly of November 29 1984, March 13 1986; Retro Gamer #4. The painted flavor illustrations were taken from the August 1984 issue of Personal Computer Games.)
