Acorn and Amstrad

…he explains to her that Sinclair, the British inventor, had a way of getting things right, but also exactly wrong. Foreseeing the market for affordable personal computers, Sinclair decided that what people would want to do with them was to learn programming. The ZX81, marketed in the United States as the Timex 1000, cost less than the equivalent of a hundred dollars, but required the user to key in programs, tapping away on that little motel keyboard-sticker. This had resulted both in the short market-life of the product and, in Voytek’s opinion, twenty years on, in the relative preponderance of skilled programmers in the United Kingdom. They had had their heads turned by these little boxes, he believes, and by the need to program them. “Like hackers in Bulgaria,” he adds, obscurely.

“But if Timex sold it in the United States,” she asks him, “why didn’t we get the programmers?”

“You have programmers, but America is different. America wanted Nintendo. Nintendo gives you no programmers…”

— William Gibson, Pattern Recognition

A couple of years ago I ventured out of the man cave to give a talk about the Amiga at a small game-development conference in Oslo. I blazed through as much of the platform’s history as I could in 45 minutes or so, emphasizing for my audience of mostly young students from a nearby university the Amiga’s status as the preeminent gaming platform in Europe for a fair number of years. They didn’t take much convincing; even this crowd, young as they were, had their share of childhood memories involving Amiga 500s and 1200s. Mostly they seemed surprised that the Amiga hadn’t ever been all that terribly popular in the United States. During the question-and-answer session, someone asked a question that stopped me short: if American kids hadn’t been playing games on their Amigas, just what the hell had they been playing on?

The answer itself wasn’t hard to arrive at: the sorts of kids who migrated from 8-bit Sinclairs, Acorns, Amstrads, and Commodores to 16-bit Amigas and Atari STs in Britain made a much more lateral move in the United States, migrating to the 8-bit Nintendo Entertainment System.

More complex and interesting are the ramifications of these trends. Because the Atari VCS console was never a major presence in Britain and the rest of Europe during its heyday, and because Nintendo arrived only very belatedly, for many years videogames played in the home there meant games played on home computers. One could say much about how it affected the market across Europe that most people’s favored platform was a device useful for creation as well as consumption. The magazines were filled with stories of bedroom gamers who had become bedroom coders and finally Software Stars. Such stories make a marked contrast to an American console-gaming magazine like Nintendo Power, all about consumption without the accompanying ethos of creation.

But most importantly for our purposes today, the relative neglect of Britain in particular by the big computing powers in the United States and Japan — for many years, Commodore was the only company of either nation to make a serious effort to sell their machines into British homes — gave space for a flourishing domestic trade in homegrown machines. When Britain became the nation with the most computers per capita on the planet at mid-decade, most of the computers in question bore the logo of either Acorn or Sinclair, the two great rivals at the heart of the young British microcomputer industry.

Acorn, co-founded by Clive Sinclair’s former right-hand man Chris Curry and an Austrian academic named Hermann Hauser, was an archetypal example of an engineering-driven company. Their machines were a little more baroque, a little better built, and consequently a little more expensive than they needed to be, while their public persona was reserved and just a little condescending, much like that of the BBC that had given its official imprimatur to Acorn’s most popular machine, the BBC Micro. Sinclair, despite “Uncle Clive’s” public reputation as the British Inspector Gadget, was just the opposite of Acorn: cheap and cheerful, with the common touch. Acorns sold to the educators, to the serious hobbyists, and to the posh, while Sinclairs dominated with the masses.

Yet Acorn and Sinclair were similar in one important respect: they were both in their own ways very poorly managed companies. When the British home-computer market hit an iceberg in 1985, both were caught in untenable positions, drowning in excess inventory. Acorn — quintessentially British, based in the storied heart of Britain’s “Silicon Fen” of Cambridge — was faced with a choice between dissolution and selling themselves to the Italian typewriter manufacturer Olivetti; after some hand-wringing, they chose the latter course. Sinclair also sold out: to the new kid on the block of British computing, Amstrad, owned by a gruff Cockney with a penchant for controversy named Alan Sugar who was well on his way to becoming the British Donald Trump.

Ever mindful of the practical concerns of their largely working-class customers, Amstrad made much of the CPC’s bundled monitor in their advertising, noting that Junior could play on the CPC without tying up the family television.

Amstrad had already been well-established as a maker of inexpensive stereo equipment and other consumer electronics when their first computers, the CPC (“Colour Personal Computer”) line, debuted in June of 1984. The CPC range was created and sold as a somewhat more capable Sinclair Spectrum. It consisted of well-built and smartly priced if technically unimaginative computers that were fine choices for gaming, boasting as they did reasonably good if hardly revolutionary graphics and sound. Like most Amstrad products, they strained to be as easy to use as possible, shipping as complete units — tape or disk drive and monitor included — at a time when virtually all of their rivals had to be assembled piece by piece via separate purchases.

The CPC line did very well from the outset, even as Acorn and Sinclair were soon watching their own sales implode. Pundits attributed the line’s success to what they called “the Amstrad Effect”: Alan Sugar’s instinct for delivering practical products at a good price at the precise instant when the technology behind them was ready for the mass market — i.e., was about to become desirable to his oft-stated target demographic of “the truck driver and his wife.” Sugar preferred to let others advance the technical state of the art, then swoop in to reap the rewards of their innovations when the time was right. The CPC line was a great example of him doing just that.

But the most dramatic and surprising iteration of the Amstrad Effect didn’t just feed the existing market for colorful game machines; it found an entirely new market segment, one that Amstrad’s competitors had completely missed until then. The story of the creation of the Amstrad PCW line is a classic tale of Alan Sugar, a man who knew almost nothing about computers but knew all he needed to about the people who bought them.

One day just a few months after the release of the first CPC machines, Sugar found himself in an airplane over Asia with Bob Watkins, one of his most trusted executives. A restless Sugar asked Watkins for a piece of paper, and proceeded to draw on it a contraption that included a computer, a monitor, a disk drive, and a printer, all in one unit. Looking at the market during the run-up to the CPC launch, Sugar had recognized that the only true mainstream uses for the current generation of computers in the home were as game machines and word processors. With the CPC, he had the former application covered. But what about the latter? All of the inexpensive machines currently on the market, like the Sinclair Spectrum, were oriented toward playing games rather than word processing, trading the possibility of displaying crisp 80-column text for colorful graphics in lower resolutions. Meanwhile all of the more expensive ones, like the BBC Micro, were created by and for hardcore techies rather than Sugar’s truck drivers. If they could apply their patented technology-for-the-masses approach to a word processor for the home and small business — making a cheap, well-built, all-in-one design emphasizing ease of use for the common person — Amstrad might just have another hit on their hands, this time in a market of their own utterly without competition. Internally, the project was named after Sugar’s secretary Joyce, since it would hopefully make her job and those of many like her much easier. It would eventually come to market as the “PCW,” or “Personal Computer Word Processor.”

The first Amstrad PCW machine, complete with bundled printer. Note how the disk drive and the computer itself are built into the same case as the monitor, a very unusual design for the period.

Even more so than the CPC, the PCW was a thoroughly underwhelming package for technophiles. It was built around the tried-and-true Z80 8-bit CPU and ran CP/M, an operating system already considered obsolete by big business, MS-DOS having become the standard in the wake of the IBM PC. The bundled word-processing software, contracted out to a company called Locomotive Software, wasn’t likely to impress power users of WordStar or WordPerfect overmuch — but it was, in keeping with the Amstrad philosophy, unusually friendly and easy to use. Sugar knew his target customers, knew that they “didn’t give a shit whether there was an elastic band or an 8086 or a 286 driving the thing. They wouldn’t know what you were talking about.”

As usual, most of Amstrad’s hardware-engineering efforts went into packaging and cost-cutting. It was decided that the printer would have to be housed separately from the system unit for technical reasons, but otherwise the finished machine conformed remarkably well to Sugar’s original vision. Best of all, it had a price of just £399. By way of comparison, Acorn’s most recent BBC Micro Model B+ had half as much memory and no disk drive, monitor, or printer included — and was priced at £499.

Nervous as ever about intimidating potential customers, Amstrad was at pains to market the PCW first and foremost as a turnkey word-processing solution for homes and small businesses, as a general-purpose computer only secondarily if at all. “It’s more than a word processor for less than most typewriters,” ran their tagline. At the launch event in the heart of the City in August of 1985, three female secretaries paraded across the stage: a snooty one who demanded one of the competition’s expensive computer systems; a tarty one who said a typewriter was more than good enough; and a smart, reasonable one who naturally preferred the PCW. Man-of-the-people Sugar crowed extravagantly that Amstrad had “brought word-processing within the reach of every small business, one-man band, home-worker, and two-finger typist in the country.” Harping on one of his favorite themes, he noted that once again Amstrad had “produced what the customer wants and not a boffin’s ego trip.”

Sugar’s aggressive manner may have grated on many buttoned-down trade journalists, but few could deny that he might just open up a whole new market for computers with the PCW. Electrical Retailer and Trader was typical, calling the PCW “a grown-up computer that does something people want, packaged and sold in a way they can understand, at a price they’ll accept.” But even that note of optimism proved far too mild for the reality of the machine’s success. The PCW exploded out of the gate, selling 350,000 units in the first eight months. It probably could have sold a lot more than that, but Amstrad, caught off-guard by the sales numbers despite their founder’s own bullishness on the product, couldn’t make and ship them fast enough.

Level 9’s Time and Magik text adventure running on a PCW.

Surprisingly for such a utilitarian package, the PCW garnered considerable loyalty and even love among the millions in Britain and all across Europe who eventually bought one. Their enthusiasm was enough to sustain a big, glossy newsstand magazine dedicated to the PCW alone — an odd development indeed for this machine that seemed on the face of it to be anything but a hacker’s darling. A thriving software ecosystem that reached well beyond word processing sprang up around the machine. Despite the PCW’s monochrome display and virtually nonexistent animation and sound capabilities, even games were far from unheard of on the platform. For obvious reasons, text adventures in particular became big favorites of PCW owners; with its comfortable full-travel keyboard, its fast disk drive, its relatively cavernous 256 K of memory, and its 80-column text display, a PCW was actually a far better fit for the genre than the likes of a Sinclair Spectrum. The PCW market for text adventures was strong enough to quite possibly allow companies like Magnetic Scrolls and Level 9 to hang on a year or two longer than they might otherwise have managed.

So, Amstrad was already soaring on the strength of the CPC and especially the PCW when they shocked the nation and cemented their position as the dominant force in mainstream British computing with the acquisition of Sinclair in April of 1986. Eminently practical man of business that he was, Sugar bought Sinclair partly to eliminate a rival, but also because he realized that, home-computer slump or no, the market for a machine as popular as the Sinclair Spectrum wasn’t likely to just disappear overnight. He could pick up right where Uncle Clive had left off, selling the existing machine just as it was to new buyers who wanted access to the staggering number of cheap games available for the platform. Sugar thought he could make a hell of a lot of money this way while needing to expend very little effort.

Once again, time proved him more correct than even he had ever imagined. Driven by that huge base of games, demand for new Spectrums persisted into the 1990s. Amstrad repackaged the technology from time to time and, perhaps most importantly, dramatically improved on Sinclair’s infamously shoddy quality control. But they never seriously re-imagined the Spectrum. It was now what Sugar liked to call “a commodity product.” He compared it to suntan lotion of all things: the department stores “put it in their window in July and August and they take it away in the winter.” The Spectrum’s version of July and August was of course November and December; every Christmas sparked a new rush of sales to the parents of a new group of youngsters just coming of age and discovering the magic of videogames.

A battered and uncertain Acorn, now a subsidiary of Olivetti, faced a formidable rival indeed in Alan Sugar’s organization. In a sense, the fundamental dichotomies hadn’t changed that much since Amstrad took Sinclair’s place as the yin to Acorn’s yang. Acorn remained as technology-driven as ever, while Amstrad was all about giving the masses what they craved in the form of cheap computers that were technically just good enough. Amstrad, however, was a much more dangerous form of people’s computer company than had been their predecessor in the role. After releasing some notoriously shoddy stereo equipment under the Amstrad banner in the 1970s and paying the price in returns and reputation, Alan Sugar had learned a lesson that continued to elude Clive Sinclair: that selling well-built, reliable products, even at the cost of a few more quid on the final price tag and/or a few less in the profit margin, pays off more than corner-cutting in the long run. Unlike Uncle Clive, who had bumbled and stumbled his way to huge success and just as quickly back to failure, Sugar was a seasoned businessman and a master marketer. The diffident boffins of Acorn looked destined to have a hard time against a brawler like Sugar, raised on the mean streets of the cutthroat Tottenham Court Road electronics trade. It hardly seemed a fair fight at all.

But then, in the immediate wake of the acquisition by Olivetti, nothing boded all that well for Acorn. New hardware releases were limited to enhanced versions of the 1981-vintage, 8-bit BBC Micro line that were little more ambitious than Amstrad’s re-packagings of the Spectrum. It was an open secret that Acorn was putting much effort into designing a new CPU in-house to serve as the heart of their eventual next-generation machine, an unprecedented step in an industry where CPU-makers and computer-makers had always been separate entities. For many, it seemed yet one more example of Acorn’s boffinish tendencies getting the best of them, causing them to laboriously reinvent the wheel rather than do what the rest of the microcomputer world was doing: grabbing a 68000 from Motorola or an 80286 from Intel and just getting on with the 16-bit machine their customers were clamoring for. While Acorn dithered with their new chip, they continued to fall further and further behind Amstrad, who in the wake of the Sinclair acquisition had now gone from a British home-computer market share of 0 to 60 percent in less than two years. Acorn was beginning to look downright irrelevant to many Britons in the market for the sorts of affordable, practical computer systems Amstrad was happily providing them with by the bucketful.

Measured in terms of public prominence, Acorn’s best days were indeed already behind them; they would never recapture those high-profile halcyon days of the early 1980s, when the BBC Micro had first been anointed as the British establishment’s officially designated choice for those looking to get in on the ground floor of the computer revolution. Yet the new CPU they were now in the midst of creating, far from being a pointless boondoggle, would ultimately have a far greater impact than anything they’d done before — and not just in Britain but over the entire world. For the CPU architecture Acorn was creating in those uncertain mid-1980s was the one that has gone on to become the most popular ever: the ubiquitous ARM. “ARM” originally stood for “Acorn RISC Machine”; only later was it retrofitted into “Advanced RISC Machine.” Needless to say, no one at Acorn had any idea of the monster they were creating. How could they?

ARM, the chip that changed the world.

“RISC” stands for “Reduced Instruction Set Computer.” The idea didn’t originate with Acorn, but had already been kicking around American university and corporate engineering departments for some time. (As Hermann Hauser later wryly noted, “Normally British people invent something, and the exploitation is in America. But this is a counterexample.”) Still, the philosophy behind ARM was adhered to by only a strident minority before Acorn first picked it up in 1983.

The overwhelming trend in commercial microprocessor design up to that point had been for chips to offer ever larger and more complex instruction sets. By making “opcodes” — single instructions issued directly to the CPU — capable of doing more in a single step, machine-level code could be made more comprehensible for programmers and the programs themselves more compact. RISC advocates came to call this traditional approach to CPU architecture “CISC,” or “Complex Instruction Set Computing.” They believed that CISC was becoming increasingly counterproductive with each new generation of microprocessors. Seeing how the price and size of memory chips continued to drop significantly almost every year, they judged — in the long term, correctly — that memory usage would become much less important than raw speed in future computers. They therefore also judged that it would be more than acceptable in the future to trade smaller programs for faster ones. And they judged that they could accomplish exactly that trade-off by traveling directly against the prevailing winds in CPU design — by making a CPU that offered a radically reduced instruction set of extremely simple opcodes that were each ruthlessly optimized to execute very, very quickly.

A program written for a RISC processor might need to execute far more opcodes than the same program written for a CISC processor, but those opcodes would execute so quickly that the end result would still be a dramatic increase in throughput. Yes, it would use more memory, and, yes, it would be harder to read as machine code — but already fewer and fewer people were programming computers at such a low level anyway. The trend, which they judged likely only to accelerate, was toward high-level languages that abstracted away the details of processor design. In this prediction again, time would prove the RISC advocates correct. Programs might not even turn out to be that much larger in practice; RISC advocates argued, with some evidence to back up their claims, that few programs really took full advantage of the more esoteric opcodes of the CISC chips — that the CISC chips were in effect being programmed as if they were RISC chips much of the time anyway. In short, then, a definite minority of academic and corporate researchers, albeit not an insubstantial one, were beginning to believe that the time was ripe to replace CISC with RISC.
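To make that trade-off concrete, here is a minimal back-of-the-envelope sketch in Python. The opcode counts, byte sizes, and cycle timings below are purely hypothetical illustrations of the argument, not measurements of any real CISC or RISC chip:

```python
# Hypothetical comparison of one complex CISC opcode against an equivalent
# sequence of simple RISC opcodes. All numbers are invented for illustration.

# A memory-to-memory multiply-accumulate as a single CISC opcode:
# one 6-byte instruction that takes 20 clock cycles to complete.
cisc_bytes, cisc_cycles = 6, 20

# The RISC equivalent: five simple opcodes (two loads, a multiply, an add,
# a store), each 4 bytes long and each finishing in 2 cycles.
risc_opcodes = 5
risc_bytes = risc_opcodes * 4    # 20 bytes of code versus 6
risc_cycles = risc_opcodes * 2   # 10 cycles versus 20

print(f"code size:   CISC {cisc_bytes} bytes, RISC {risc_bytes} bytes")
print(f"time to run: CISC {cisc_cycles} cycles, RISC {risc_cycles} cycles")
# The RISC version is roughly three times larger but finishes in half the
# time -- exactly the trade of cheap memory for raw speed described above.
```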

And now Acorn was about to act on that belief. In typical boffinish fashion, their ARM project was begun as essentially a personal passion project by Roger Wilson[1] and Steve Furber, two key engineers behind the original BBC Micro. Hermann Hauser admits that for quite some time he gave them “no people” and “no money” to help with the work, making ARM “the only microprocessor ever to be designed by just two people.” When talks began with Olivetti in early 1985, ARM remained such a back-burner long shot that Acorn never even bothered to tell their potential saviors about it. But as time went on, the ARM chip came more and more to the fore as potentially the best thing Acorn had ever done. Having, almost perversely in the view of many, refused to produce a 16-bit replacement for the BBC Micro line for so long, Acorn now proposed to leapfrog that generation entirely; the ARM, you see, was a 32-bit chip. Early tests of the first prototype in April of 1985 showed that at 8 MHz it yielded an average throughput of about 3.5 MIPS, compared to 2.5 MIPS at 10 MHz for the 68020, the first 32-bit entry in Motorola’s popular 68000 line of CISC processors. And the ARM was much, much cheaper and simpler to produce than the 68020. It appeared that Wilson and Furber’s shoestring project had yielded a world-class microprocessor.
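Those early benchmark figures are worth unpacking: normalized per clock tick, the prototype was doing roughly 75 percent more work than the 68020. A quick bit of arithmetic, using only the numbers quoted above:

```python
# Instructions completed per clock cycle, derived from the throughput figures
# quoted above (1 MIPS = one million instructions per second).
arm_mips, arm_mhz = 3.5, 8.0          # ARM prototype, April 1985
m68020_mips, m68020_mhz = 2.5, 10.0   # Motorola 68020, for comparison

arm_per_clock = arm_mips / arm_mhz            # ~0.44 instructions per cycle
m68020_per_clock = m68020_mips / m68020_mhz   # ~0.25 instructions per cycle

print(f"ARM:   {arm_per_clock:.2f} instructions per clock")
print(f"68020: {m68020_per_clock:.2f} instructions per clock")
print(f"ARM does ~{arm_per_clock / m68020_per_clock:.2f}x the work per tick")
```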

ARM made its public bow via a series of little-noticed blurbs that appeared in the British trade press around October of 1985, even as the stockbrokers in the City and BBC Micro owners in their homes were still trying to digest the news of Acorn’s acquisition by Olivetti. Acorn was testing a new “super-fast chip,” announced the magazine Acorn User, which had “worked the first time”: “It is designed to do a limited set of tasks very quickly, and is the result of the latest thinking in chip design.” From such small seeds do great empires grow.

The Acorn Archimedes

The machine that Acorn designed as a home for the new chip was called the Acorn Archimedes — or at times, because Acorn had been able to retain the official imprimatur of the BBC, the BBC Archimedes. It was on the whole a magnificent piece of kit, in a different league entirely from the competition in terms of pure performance. It was, for instance, several times faster than a 68000-based Amiga, Macintosh, or Atari ST in many benchmarks despite running at a clock speed of just 8 MHz, roughly the same as all of the aforementioned competitors. Its graphics capabilities were almost as impressive, offering 256 colors onscreen at once from a palette of 4096 at resolutions as high as 640×512. So, Acorn had the hardware side of the house well in hand. The problem was the software.

Graphical user interfaces being all the rage in the wake of the Apple Macintosh’s 1984 debut, Acorn judged that the Archimedes as well had to be so equipped. Deciding to go to the source of the world’s very first GUI, they opened a new office for operating-system development a long, long way from their Cambridge home: right next door to Xerox’s famed Palo Alto Research Center, in the heart of California’s Silicon Valley. But the operating-system team’s progress was slow. Communication and coordination were difficult over such a distance, and the team seemed to be infected with the same preference for abstract research over practical product development that had always marked Xerox’s own facility in Palo Alto. The new operating system, to be called ARX, lagged far behind hardware development. “It became a black hole into which we poured effort,” remembers Wilson.

At last, with the completed Archimedes hardware waiting only on some software to make it run, Acorn decided to replace ARX with something they called Arthur, a BASIC-based operating environment very similar to the old BBC BASIC with a rudimentary GUI stuck on top. “All operating-system geniuses were firmly working on ARX,” says Wilson, “so we couldn’t actually spare any of the experts to work on Arthur.” The end result did indeed look like something put together by Acorn’s B team. Parts of Arthur were actually written in interpreted BASIC, which Acorn was able to get away with thanks to the blazing speed of the Archimedes hardware. Still, running Arthur on hardware designed for a cutting-edge Unix-like operating system with preemptive multitasking and the whole lot was rather like dropping a two-speed gearbox into a Lamborghini; it got the job done, after a fashion, but felt rather against the spirit of the thing.

When the Archimedes debuted in August of 1987, its price tag of £975 and up along with all of its infelicities on the software side gave little hope to those not blinded with loyalty to Acorn that this extraordinary machine would be able to compete with Amstrad’s good-enough models. The Archimedes was yet another Acorn machine for the boffins and the posh. Most of all, though, it would be bought by educators who were looking to replace aging BBC Micros and might still be attracted by the BBC branding and the partial compatibility of the new machine with the old, thanks to software emulators and the much-loved BBC BASIC still found as the heart of Arthur.

Even as Amstrad continued to dominate the mass market, a small but loyal ecosystem sprang up around the Archimedes, enough to support a software scene strong on educational software and technical tools for programming and engineering, all a natural fit for the typical Acorn user. And, while the Archimedes was never likely to become the first choice for pure game lovers, a fair number of popular games did get ported. After all, even boffins and educators — or, perhaps more likely, their students — liked to indulge in a bit of pure fun sometimes.

In April of 1989, after almost two long, frustrating years of delays, Acorn released a revision of Arthur comprehensive enough to be given a whole new name. The new RISC OS incorporated many if not all of the original ambitions for ARX, at last providing the Archimedes with an attractive modern operating system worthy of its hardware. But by then, of course, it was far too late to capture the buzz a more complete Archimedes package might have garnered at its launch back in 1987.

Much to the frustration of many of their most loyal customers, Acorn still seemed not so much inept at marketing their wares to the common person as completely uninterested in doing so. It was as if they felt themselves somehow above it all. Perhaps they had taken a lesson from their one earlier attempt to climb down from their ivory tower and sell a computer for the masses. That attempt had taken the form of the Acorn Electron, a cut-down version of the BBC Micro released in 1983 as a direct competitor to the Sinclair Spectrum. Poor sales and overproduction of the Electron had been the biggest single contributor to Acorn’s mid-decade financial collapse and the loss of their independence to Olivetti. Having survived that trauma (after a fashion), Acorn seemed content to tinker away with technology for its own sake and to let the chips fall where they would when it came to actually selling the stuff that resulted.

Alan Sugar shows off the first of his new line of PC clones.

If it provided any comfort to frustrated Acorn loyalists, Amstrad also began to seem more and more at sea after their triumphant first couple of years in the computer market. In September of 1986, they added a fourth line of computers to their catalog with the release of the PC — as opposed to PCW — range. The first IBM clones targeted at the British mass market, the Amstrad PC line might have played a role in its homeland similar to that of the Tandy 1000 in the United States, popularizing these heretofore business-centric machines among home users. As usual with Amstrad, the price certainly looked right for the task. The cheapest Amstrad PC model, with a generous 512 K of memory but no hard drive, cost £399; the most expensive, which included a 20 MB hard drive, £949. Before the Amstrad PC’s release, the cheapest IBM clone on the British market had retailed for £1429.

But, while not a flop, the PC range never took off quite as meteorically as some had expected. For months the line was dogged by reports of overheating brought on by the machine’s lack of a fan (shades of the Apple III fiasco) that may or may not have had a firm basis in fact. Alan Sugar himself was convinced that the reports could be traced back to skulduggery by IBM and other clone manufacturers trying to torpedo his cheaper machines. When he finally bowed to the pressure to add a fan, he did so as gracelessly as could be imagined.

I’m a realistic person and we are a marketing organization, so if it’s the difference between people buying the machine or not, I’ll stick a bloody fan in it. And if they say they want bright pink spots on it, I’ll do that too. What is the use of me banging my head against a brick wall and saying, “You don’t need the damn fan, sunshine?”

But there were other problems as well, problems that were less easily fixed. Amstrad struggled to source hard disks, which had proved a far more popular option than expected, resulting in huge production backlogs on many models. And, worst of all, they found that they had finally overreached themselves by setting the prices too low to be realistically sustainable; prices began to creep upward almost immediately.

For that matter, prices were creeping upward across Amstrad’s entire range of computers. In 1986, after years of controversy over the alleged dumping of memory chips into the international market on the part of the Japanese semiconductor industry, the United States pressured Japan into signing a trade pact that would force them to throttle back their production and increase their prices. Absent the Japanese deluge, however, there simply weren’t enough memory chips being made in the world to fill an ever more voracious demand. By 1988, the situation had escalated into a full-blown crisis for volume computer manufacturers like Amstrad, who couldn’t find enough memory chips to build all the computers their customers wanted — and certainly not at the prices their customers were used to paying for them. Amstrad’s annual sales declined for the first time in a long time in 1988 after they were forced to raise prices and cut production dramatically due to the memory shortage. Desperate to secure a steady supply of chips so he could ramp up production again, Sugar bought into Micron Technology, one of only two American firms making memory chips, in October of 1988 to the tune of £45 million. But within a year the memory-chip crisis, anticipated by virtually everyone at the time of the Micron buy-in to go on for years yet, petered out when factories in other parts of Asia began to come online with new technologies to produce memory chips more cheaply and quickly than ever. Micron’s stock plummeted, another major loss for Amstrad. The buy-in hadn’t been “the greatest deal I’ve ever done,” admitted Sugar.

Many saw in the Amstrad of these final years of the 1980s an all too typical story in business: that of a company that had been born and grown wildly as a cult of personality around its founder, until one day it got too big for any one man to oversee. The founder’s vision seemed to bleed away as the middle managers and the layers of bureaucracy moved in. Seduced by the higher profit margins enjoyed by business computers, Amstrad strayed ever further from Sugar’s old target demographic. New models in the PC range crept north of £1000, even £2000 for the top-of-the-line machines, while the more truck-driver-focused PCW and CPC lines were increasingly neglected. The CPC line would be discontinued entirely in 1990, leaving only the antique Spectrum to soldier on for a couple more years for Amstrad in the role of general-purpose home computer. It seemed that Amstrad at some fundamental level didn’t really know how to go about producing a brand new machine in the spirit of the CPC in this era when making a new home computer was much more complicated than plugging together some off-the-shelf chips and hiring a few hackers to knock out a BASIC for the thing. Amstrad would continue to make computers for many years to come, but by the time the 1990s dawned their short-lived glory days of 60 percent market share were already fading into the rosy glow of nostalgia.

For all their very real achievements over the course of a remarkable decade in British computing, Acorn and Amstrad each had their own unique blind spot that kept them from achieving even more. In the Archimedes, Acorn had a machine that was a match for any other microcomputer in the world in any application you cared to name, from games to business to education. Yet they released it in half-baked form at too high a price, then failed to market it properly. In their various ranges, Amstrad had the most comprehensive lineup of computers of anyone in Britain during the mid- to late-1980s. Yet they lacked the corporate culture to imagine what people would want five years down the road in addition to what they wanted right then. The world needs visionaries and commodifiers alike. What British computing lacked in the 1980s was a company capable of integrating the two.

That lack left wide open a huge gap in the market: space for a next-generation home computer with a lot more power and much better graphics and sound than the likes of the old Sinclair Spectrum, but that still wouldn’t cost a fortune. Packaged, priced, and marketed differently, the Archimedes might have been that machine. As it was, buyers looked to foreign companies to provide it. Neglected as Europe still was by the console makers of Japan, the British punters’ choice largely came down to one of two American imports, the Commodore Amiga and the Atari ST. Both — especially the former — would live very well in this gap, which neither Acorn nor Amstrad deigned to fill for far too long. Acorn did belatedly try with the release of the Archimedes A3000 model in mid-1989 — laid out in the all-in-one-case, disk-drive-on-the-side fashion of an Amiga 500, styled to resemble the old BBC Micro, and priced at a more reasonable if still not quite reasonable enough £745. But by that time the Archimedes’s fate as a boutique computer for the wealthy, the dedicated, and the well-connected was already decided. As the decade ended, an astute observer could already detect that the wild and woolly days of British computing as a unique culture unto itself were numbered.

The Archimedes A3000 marked the end of an era, the last Acorn machine to bear the BBC logo.

And that would be that, but for one detail: the fairly earth-shattering detail of ARM. The ARM CPU’s ability to get extraordinary performance out of a relatively low clock speed had a huge unintended benefit that was barely even noticed by Acorn when they were in the process of designing it. In the world of computer engineering, higher clock speeds translate quite directly into higher power usage. Thus the ARM chip could do more with less power, a quality that, along with its cheapness and simplicity, made it the ideal choice for an emerging new breed of mobile computing devices. In 1990 Apple Computer, hard at work on a revolutionary “personal digital assistant” called the Newton, came calling on Acorn. A new spinoff was formed in November of 1990, a partnership among Acorn, Apple, and the semiconductor firm VLSI Technology, who had been fabricating Acorn’s ARM chips from the start. Called simply ARM Holdings, it was intended as a way to popularize the ARM architecture, particularly in the emerging mobile space, among end-user computer manufacturers like Apple who might be leery of buying ARM chips directly from a direct competitor like Acorn.

And popularize it has. To date about ten ARM CPUs have been made for every man, woman, and child on the planet, and the numbers look likely to continue to soar almost exponentially for many years to come. ARM CPUs are found today in more than 95 percent of all mobile phones. Throw in laptops (even laptops built around Intel processors usually boast several ARM chips as well), tablets, music players, cameras, GPS units… well, you get the picture. If it’s portable and it’s vaguely computery, chances are there’s an ARM inside. ARM, the most successful CPU architecture the world has ever known, looks likely to continue to thrive for many, many years to come, a classic example of unintended consequences and unintended benefits in engineering. Not a bad legacy for an era, is it?

(Sources: the book Sugar: The Amstrad Story by David Thomas; Acorn User of July 1985, October 1985, March 1986, September 1986, November 1986, June 1987, August 1987, September 1987, October 1988, November 1988, December 1988, February 1989, June 1989, and December 1989; Byte of November 1984; 8000 Plus of October 1986; Amstrad Action of November 1985; interviews with Hermann Hauser, Sophie Wilson, and Steve Furber at the Computer History Museum.)

Footnotes

1 Roger Wilson now lives as Sophie Wilson. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times.
 


This Tormented Business, Part 2

In December of 1984 Sir Clive Sinclair and Chris Curry, heads of those leading lights of the British PC revolution Sinclair Research and Acorn Computers respectively, gave a Daily Mirror columnist named Michael Jeacock a Christmas gift for the ages. Like Jeacock, Sinclair and Curry were having a drink — separately — with colleagues in the Baron of Beef pub, a popular watering hole for the hackers and engineers employed in Cambridge’s “Silicon Fen.” Spotting his rival across the room, Sinclair marched up to him and started to give him a piece of his mind. It seemed he was very unhappy about a recent series of Acorn advertisements which accused Sinclair computers of shoddy workmanship and poor reliability. To make sure Curry fully understood his position, he emphasized his words with repeated whacks about the head and shoulders with a rolled-up newspaper. Curry took understandable exception, and a certain amount of pushing and shoving ensued, although no actual punches were thrown. The conflict apparently broke out again later that evening at Shades, a quieter wine bar to which the two had adjourned to patch up their differences — unsuccessfully by all indications.

If you know anything about Fleet Street, you know how they reacted to a goldmine like this. Jeacock’s relatively staid account which greeted readers who opened the Christmas Eve edition of the Daily Mirror was only the beginning. Soon the tabloids were buzzing gleefully over what quickly became a full-blown “punch-up.” Some wrote in a fever of indignation over such undignified antics; Sinclair had just been knighted, for God’s sake. Others wrote in a different sort of fever: another Daily Mirror columnist, Jean Rook, wrote that she found Sinclair’s aggression sexually exciting.

It would be a few more months before the British public would begin to understand the real reason these middle-aged boffins had acted like such fools. Still heralded publicly as the standard bearers of the new British economy, they were coming to the private realization that things had all gone inexplicably, horribly wrong for their companies. Both were staring down a veritable abyss, with no idea how to pull up or leap over. They were getting desperate — and desperation makes people behave in undignified and, well, desperate ways. They couldn’t even blame their situations on fate and misfortune, even if 1984 had been a year of inevitable changes and shakeouts, one which had left the software industry confused by contradictory signs and portents and had seen the end, or the beginning of the end, of weak sisters on the hardware side like Dragon, Camputers, and Oric. No, their situations were directly attributable to decisions they had personally made over the last eighteen months. Each made many of these decisions against his better judgment in the hope of one-upping his rival. Indeed, the corporate rivalry that led them to a public bar fight — and the far worse indignities still to come — has a Shakespearian dimension, being bound up in the relationship between these two once and future friends, each rampantly egotistical and deeply insecure in equal measure and each coveting what the other had. Rarely does business get so personal.

Acorn’s flagship computer, the BBC Micro, is amusingly described by Francis Spufford in Backroom Boys as the Volvo of early British computers: safe, absurdly well-engineered and well-built, expensive, and just a little bit boring. Acorn had taken full advantage of the BBC’s institutional blessing to sell the machine in huge quantities to another set of institutions, the British school system; by the mid-1980s some 90% of British schools had BBC Micros on the premises. Those sales, combined with others to small businesses and to well-heeled families looking for a stolid, professional-quality machine for the back office — i.e., the sorts of families likely to have a Volvo in the driveway as well — were more than enough to make a booming business of Acorn.

Yet when the person on the street thought about computers, it wasn’t Curry’s name or even Acorn’s that popped first to mind. No, it was the avuncular boffin Uncle Clive and his cheap and cheerful Spectrum. It was Sinclair who was knighted for his “service to British industry”; Sinclair who was sought out for endless radio, television, and print interviews to pontificate on the state of the nation. Even more cuttingly, it was the Spectrum that a generation of young Britons came to love — a generation that dutifully pecked out their assignments on the BBC Micros at their schools and then rushed home to gather around their Speccys and have some fun. Chris Curry wanted some of their love as well.

Acorn Electron

Enter in 1983 the Acorn Electron, a radically cost-reduced version of the BBC Micro designed to take on the Spectrum on its own turf. Enthusiasm for the Electron amongst the rank and file at Acorn was questionable at best. Most were not afflicted with Curry’s need to show up his old boss, but rather manifested a strain of stuffy Cambridge elitism that would cling to Acorn throughout its history. They held Sinclair’s cheap machines and the games they played in a certain contempt. They were happy to cede that segment to him; they would rather be working on innovative new technology — Acorn had already initiated a 32-bit RISC processor project that would eventually result in the ubiquitous ARM architecture that dominates smartphones and tablets today — than repackaging old technology for mewling schoolchildren. Curry had to struggle mightily to push the Electron project through in the face of such indifference.

A price of £200, about half that of the BBC Micro, would get buyers the same 32 K of memory and the same excellent BASIC, albeit in a smaller, less professional case. However, the Electron’s overall performance was sharply curtailed by an inefficient (but cheaper) new memory configuration. The Electron’s sound capabilities also suffered greatly by comparison with its big brother, and the BBC Micro’s Mode 7, a text-only display mode that programmers loved because it greatly reduced the amount of precious memory that needed to be allocated to the display, was eliminated entirely. And, much cheaper than the BBC Micro though it may have been, it was still more expensive than the Spectrum. On paper it would seem quite a dubious proposition. Still, a considerable number of punters went for it that Christmas of 1983, the very peak of the British micro boom. Many were perhaps made willing to part with a bit more cash by the Electron’s solidity and obviously superior build quality in comparison to the Speccy.

But now Curry found himself in a truly heartbreaking position for any captain of industry: he couldn’t meet the demand. Now that it was done, many months behind schedule, problems with suppliers and processes which no one had bothered to address during development meant that Electrons trickled rather than poured into stores. “We’re having to disappoint customers,” announced a spokeswoman for W.H. Smith. “We are not able to supply demand. What we have has sold out, and while we are expecting more deliveries the amount will still be well below demand.” By some estimates, Acorn missed out on as many as 100,000 Electron sales that Christmas. Worse, most of those in W.H. Smith and other shops who found the Electrons sold out presumably shrugged and walked away with a Spectrum or a Commodore 64 instead — mustn’t disappoint the children who expected to find a shiny new computer under the tree.

Never again was the lesson that Curry took away from the episode. Whatever else happened, he was damn sure going to have enough Electrons to feed demand next Christmas. Already in June of 1984 Curry had Acorn start placing huge orders with suppliers and subcontractors. He filled his warehouses with the things, then waited for the big Christmas orders to start. This time he was going to make a killing and give old Clive a run for his money.

The orders never came. The home-computer market had indeed peaked the previous Christmas. While lots of Spectrums were sold that Christmas of 1984 in absolute numbers, it wasn’t a patch on the year before. And with the Spectrum more entrenched than ever as the biggest gaming platform in Britain, and the Commodore 64 as the second biggest, people just weren’t much interested in the Electron anymore. Six months into the following year Acorn’s warehouses still contained at least 70,000 completed Electrons along with components for many more. “The popular games-playing market has become a very uncomfortable place to be. Price competition will be horrific. It is not a market we want to be in for very long,” said Curry. The problem was, he was in it, up to his eyebrows, and he had no idea how to get out.

Taking perhaps too much to heart Margaret Thatcher’s rhetoric about her country’s young microcomputer industry as a path to a new Pax Britannica, Curry had also recently made another awful strategic decision: to push the BBC Micro into the United States. Acorn spent hugely to set up a North American subsidiary and fund an advertising blitz. They succeeded only in learning that there was no place for them in America. The Apple II had long since owned American schools, the Commodore 64 dominated gaming, and IBM PCs and compatibles ruled the world of business computing. And the boom days of home computing were already over in North America just as in Britain; the industry there was undergoing a dramatic slowdown and shakeout of its own. What could an odd British import with poor hardware distribution and poorer software distribution do in the face of all that? The answer was of course absolutely nothing. Acorn walked away humbled and with £10 to £12 million in losses to show for their American adventure.

To add to the misery, domestic sales of the BBC Micro, Acorn’s bread and butter, also began to collapse as 1984 turned into 1985. Preoccupied with long-term projects like the RISC chip as well as short-term stopgaps like the Electron, Acorn had neglected the BBC Micro for far too long. Incredibly, the machine still shipped with just 32 K of memory three years after a much cheaper Spectrum model had debuted with 48 K. This was disastrous from a marketing standpoint. Salespeople on the high streets had long since realized that memory size was the one specification that virtually every customer could understand, and that customers used this figure along with price as their main points of comparison. (It was no accident that Commodore’s early advertising campaign for the 64 in the United States pounded relentlessly and apparently effectively on “64 K” and “$600” to the exclusion of everything else.) The BBC Micro didn’t fare very well by either metric. Meanwhile the institutional education market had just about reached complete saturation. When you already own 90% of a market, there’s not much more to be done there unless you come up with something new to sell them — something Acorn didn’t have.

How was Acorn to survive? The City couldn’t answer that question, and the share price therefore plunged from a high of 193p to as low as 23p before the Stock Exchange mercifully suspended trading. A savior appeared just in time in the form of the Turin, Italy-based firm Olivetti, a long-established maker of typewriters, calculators, and other business equipment, including recently PCs. Olivetti initially purchased a 49 percent stake in Acorn. When that plus the release of a stopgap 64 K version of the BBC Micro failed to stop the bleeding — shares cratered to as low as 9p and trading had to be suspended again — Olivetti stepped in again to up their stake to 80 percent and take the company fully under their wing. Acorn would survive in the form of an Olivetti subsidiary to eventually change the world with the ARM architecture, but the old dream of Acorn as a proudly independent British exporter and popularizer of computing was dead, smothered by, as wags were soon putting it, “the Shroud of Turin.”

If Chris Curry wanted the popular love that Clive Sinclair enjoyed, Sir Clive coveted something that belonged to Curry: respectability. The image of his machines as essentially toys, good for games and perhaps a bit of BASIC-learning but not much else, rankled him deeply. He therefore decided that his company’s next computer would not be a direct successor to the Spectrum but rather a “Quantum Leap” into the small-business and educational markets where Acorn had been enjoying so much success.

He shouldn’t have bothered. While the Electron was a competent if somewhat underwhelming little creation, the Sinclair QL was simply botched every which way from Tuesday right from start to finish. Apparently for marketing reasons as much as anything else, Sir Clive decided on a chip from the new Motorola 68000 line that had everyone talking. Yet to save a few pounds he insisted that his engineers use the 68008 rather than the 68000 proper, the former being a crippled version of the latter with an 8-bit rather than 16-bit data bus and, as a result, about half the overall processing potential. He also continued his bizarre aversion to disk drives, insisting that the QL come equipped with two of his Microdrives instead — a classically Sinclairian bit of tortured technology that looked much like one of those old lost and unlamented 8-track audio tapes and managed to be far slower than a floppy disk and far less reliable than a cassette tape (previously the most unreliable form of computer storage known to man). The only possible justification for the contraption was sheer bloody-mindedness — or anticipation of the money Sinclair stood to make as the sole sellers of Microdrive media if they could ever just get the punters to start buying the things. These questionable decisions alone would have been enough to torpedo the QL. They were, however, just the tip of an iceberg. Oh, what an iceberg…

The QL today feels like an artifact from an alternate timeline of computing in which the arrival of new chips and new technologies didn’t lead to the paradigm shifts of our own timeline. No, in this timeline things just pretty much stayed as they had been, with computers booting up to a BASIC environment housed in ROM and directed via arcane textual commands. The QL must be one of the most profoundly un-visionary computers ever released. The 68000 line wasn’t important just because it ran faster than the old 8-bit Z80s and 6502s; Intel’s 16-bit 8086 line had been doing that for years. It was important because, among other things, its seven levels of external interrupts made it a natural choice for the new paradigm of the graphical user interface and the new paradigm of programming required to write for a GUI: event-driven (as opposed to procedural) programming. This is the reason Apple chose it for their revolutionary Lisa and Macintosh. Sinclair, however, simply used a 68008 like a souped-up Z80, leaving one feeling that the company had rather missed a pretty significant point. It’s an indictment that’s doubly damning in light of Sir Clive’s alleged role at Sinclair as a sort of visionary-in-chief — or, to choose a particularly hyperbolic contemporary description from The Sun, as “the most prodigious inventor since Leonardo.” But then, as we shall see, computers didn’t ultimately have a lot to do with Sir Clive’s visions.

Clive Sinclair launches the QL

The big unveiling of the QL on January 12, 1984, was a landmark of smoke and mirrors even by Sinclair’s usual standards. Sir Clive declared there that the QL would begin shipping within 28 days to anyone who cared to order one at the low price of £400, despite the fact that no functioning QL actually existed. I don’t mean, mind you, that the prototypes had yet to go into production. I mean rather that no one at Sinclair had yet managed to cobble together a single working machine. Press in attendance were shown non-interactive demonstrations played back on monitors from videotape, while the alleged prototype was kept well away from them. Reporters were told that they could book a review machine, to be sent to them “soon.”

The question of just why Sinclair was in such a godawful hurry to debut the QL is one that’s never been satisfactorily answered. Some have claimed that Sir Clive was eager to preempt Apple’s unveiling of the Macintosh, scheduled for less than two weeks later, but I tend to see this view as implying an awareness of the international computer industry and trends therein that I’m not sure Sir Clive possessed. One thing, however, is clear: the oft-repeated claim that the QL represents the first mass-market 68000-based computer doesn’t hold water. Steve Jobs debuted a working Macintosh on January 24, 1984, and Apple started shipping the Macintosh months before Sinclair did the QL.

As those 28 days stretched into months, events went through the same cycle that had greeted previous Sinclair launches: excitement and anticipation fading into anger and accusations of bad faith and, soon enough, yet another round of investigations and threats by the Advertising Standards Authority. Desperate to show that the QL existed in some form and avoid legal action on behalf of the punters whose money they’d been holding for weeks or months, Sinclair hand-delivered a few dozen machines to journalists and customers in April. These sported an odd accessory: a square appendage hanging off the back of the otherwise sleek case. It seems Sinclair’s engineers had realized at some late date that they couldn’t actually fit everything they were supposed to inside the case. By the time QLs finally started shipping in quantity that summer the unwanted accessory had been removed and its contents somehow stuffed inside the case proper, but that turned out to have been the least of the machine’s problems.

An early Sinclair QL, complete with dongle

Amongst the more troubling of these was a horrid keyboard, something of another Sinclair tradition by now. Sinclair did deign to give the new machine actual plastic keys in lieu of the famous “dead flesh” rubber keys of the Spectrum, but the keys still rested upon a cheap membrane rather than having the mechanical action of such high-flying competitors as the Commodore VIC-20. The keyboard was awful to type on, a virtual kiss of death all by itself for a supposed business computer. And it soon emerged that the keyboard, like everything else on the QL, didn’t work properly on even its own limited terms. Individual keys either stuck or didn’t register, or did both as the mood struck them. Reports later emerged that Sinclair had actually solicited bids for a mechanical keyboard from a Japanese manufacturer and found it would cost very little if anything more than the membrane job, but elected to stick with the membrane because it was such a “Sinclair trademark.” The mind boggles.

And then there were the performance problems brought on by a perfect storm of a crippled CPU, the Microdrives, and the poorly written business software that came with the machine. Your Computer magazine published the following astonishing account of what it took to save a 750-word document in the word processor:

1. Press F3 key followed by 6. A period of 35 seconds elapses by which time the computer has found the save section of Quill and then asks if I wish to save the default file, i.e. the file I am working on.

2. Press ENTER. After a further 10 seconds the computer finds that the file already exists and asks if I wish to overwrite it.

3. Press Y. A period of 100 seconds elapses while the old file is erased and the new one saved and verified in its place. The user is then asked if he wishes to carry on with the same document.

4. Press ENTER. Why a further 25 seconds is required here is beyond me as the file must be in memory as we have just saved it. Unfortunately, the file is now at the start, so to get back to where I was:

5. Press F3 key then G followed by B. The Goto procedure to get to the bottom of the file, a further 28 seconds.

For those keeping score, that’s 3 minutes and 18 seconds to save a 750-word document. For a 3000-word document, that time jumped to a full five minutes.

Your Computer concluded their review of the QL with a prime demonstration of the crazily mixed messaging that marked all coverage of the machine. It was “slightly tacky,” “the time for foisting unproven products on the marketplace has gone,” and “it would be a brave business which would entrust essential data to Microdrives.” Yet it was also a “fascinating package” and “certain to be a commercial success.” It arguably was “fascinating” in its own peculiar way. “Commercial success,” however, wasn’t in the cards. Sinclair did keep plugging away at the QL for months after its release, and did manage to make it moderately more usable. But the damage was long since done. Even the generally forgiving British public couldn’t accept the eccentricities of this particular Sinclair creation. Sales were atrocious. Still, Sir Clive, never one to give up easily, continued to sell and promote it for almost two years.

There’s a dirty secret about Sir Clive Sinclair the computer visionary that most people never quite caught on to: he really didn’t know that much about computers, nor did he care all that much about them. Far from being the “most prodigious inventor since Leonardo,” Sir Clive remained fixated for decades on exactly two ideas: his miniature television and his electric car. The original Sinclair ZX80 had been floated largely to get Sinclair Research off the ground so that he could pursue those twin white whales. Computers had been a solution to a cashflow problem, a means to an end. His success meant that by 1983 he had the money he needed to go after the television and the car, the areas where he would really make his mark, full on. Both being absolutely atrocious ideas, this was bad, bad news for anyone with a vested interest in Sinclair Research.

The TV80 was a fairly bland failure by Sinclair standards: he came, he spent millions manufacturing thousands of devices that mostly didn’t work properly and that nobody would have wanted even if they had, and he exited again full of plans for the next Microvision iteration, the one that would get it right and convince the public at last of the virtues of a 2-inch television screen. But the electric car… ah, that one was one for the ages, one worthy of an honored place beside the exploding watches of yore. Sir Clive’s C5 electric tricycle was such an awful idea that even his normally pliable colleagues resisted letting Sinclair Research get sucked up in it. He therefore took £8.6 million out to found a new company, Sinclair Vehicles.

The biggest problem in making an electric car, then and now, is developing batteries light enough, powerful enough, and long-lasting enough to rival gasoline or diesel. In 1984, researchers were still a very long way from that goal. A kilogram of gasoline has an energy potential of some 13,000 watt-hours; a kilogram of state-of-the-art lead-acid battery circa 1984 held about 50 watt-hours — a gap of roughly 260 to 1. That’s the crux of the problem; everything else is a relative triviality. Having no engineering solution to offer for the hard part of the problem, Sinclair solved it through a logical leap that rivals any of Douglas Adams’s comedic syllogisms: he would simply pretend the hard problem didn’t exist and just do the easy stuff. From his adoring biography The Sinclair Story:

Part of the ground-up approach was not to spend enormous amounts trying to develop a more efficient battery, but to make use of the models available. Sinclair’s very sound reasoning was that a successful electric vehicle would provide the necessary push to battery manufacturers to pursue their own developments in the fullness of time; for him to sponsor this work would be a misplacement of funds.

There’s of course a certain chicken-or-egg problem inherent in this “sound reasoning,” in that the reason a “successful electric vehicle” didn’t yet exist was precisely because a successful electric vehicle required improved battery technology to power it. Or, put another way: if you could make a successful electric vehicle without improved batteries, why would its existence provide a “push to battery manufacturers?” Rather than a successful electric vehicle, Sir Clive made the QL and Black Watch of electric vehicles all rolled into one, an absurd little tricycle that was simultaneously underwhelming (to observe) and terrifying (to actually drive in traffic).

Sinclair C5

He unveiled the C5 on January 10, 1985, almost exactly one year after the QL dog-and-pony show and for the same price of £400. The press assembled at Alexandra Palace couldn’t help but question the wisdom of unveiling an open tricycle on a cold January day. But, once again, logistics were the least of the C5’s problems. A sizable percentage of the demonstration models simply didn’t work at all. The journalists dutifully tottered off on those that did, only to find that the advertised top speed of 15 mph was actually more like 5 mph — a brisk walking speed — on level ground. The batteries in many of the tricycles went dead or overheated — it was hard to tell which — with a plaintive little “Peep! Peep!” well before their advertised service range of 20 miles. Those journalists whose batteries did hold out found that they didn’t have enough horsepower to get up the modest hill leading back to the exhibition area. It was a disgruntled and disheveled group of cyclists who straggled back to Sir Clive, pedaling or lugging the 30-kilogram gadgets alongside. They could take comfort only in the savaging they were about to give him. When the press found out that the C5 was manufactured in a Hoover vacuum-cleaner plant and its motor was a variation on one developed for washing machines, the good times only got that much better. If there’s a single moment when Sir Clive turned the corner from visionary to laughingstock, this is it.

Sinclair Research wasn’t doing a whole lot better than its founder as 1984 turned into 1985. In addition to the huge losses sustained on the QL and TV80 fiascoes, Sinclair had, like Acorn, lost a bundle in the United States. Back in 1982, they had cut a deal with the American company Timex, who were already manufacturing all of their computers for them from a factory in Dundee, Scotland, to export the ZX81 to America as the Timex Sinclair 1000. It arrived in July of 1982, just as the American home-computing boom was taking off. Priced at $99 and extravagantly advertised as “the first computer under $100,” the TS 1000 sold like gangbusters for a short while; for a few months it was by far the bestselling computer in the country. But it was, with its 2 K of memory, its calculator keyboard, and its blurry text-only black-and-white display, a computer in only the most nominal sense. When Jack Tramiel started in earnest his assault on the low-end later in the year with the — relatively speaking — more useful and usable Commodore VIC-20, the TS 1000 was squashed flat.

Undeterred, Timex and Sinclair tried again with an Americanized version of the Spectrum, the TS 2068. With the best of intentions, they elected to improve the Speccy modestly to make it more competitive in America, adding an improved sound chip, a couple of built-in joystick ports (British Speccy owners had to buy a separate interface), a couple of new graphics modes, a cartridge port, even a somewhat less awful version of Sinclair’s trademark awful keyboards. The consequence of those improvements, however, was that most existing Spectrum software became incompatible. This weird little British machine with no software support was priced only slightly less than the Commodore 64 with its rich and growing library of great games. It never had a chance. Timex, like other big players such as Texas Instruments and Coleco, were soon sheepishly announcing their withdrawal from the home-computer market, vanquished like the others by Commodore.

Back in Britain, meanwhile, it was becoming clear that, as if Sinclair hadn’t already had enough problems, domestic sales of the Spectrum were beginning to slow. Sinclair was still in a dominant position, owning some 40 percent of the British market. However, conventional wisdom had it that that market was becoming saturated; by late 1984 most of the people in Britain who were likely to buy a computer had already done so, to the tune of far more sales per capita than any other country on the planet. Sinclair’s only chance to maintain sales would seem to be to sell new machines to those who already owned older models. Yet they had pissed away the time and resources needed to create a next-generation Speccy on the QL. In desperation they rushed out something called the Spectrum Plus for Christmas 1984: a slightly more substantial-looking Spectrum with a better keyboard like that of the QL (still not a genuinely good one, of course; “Sinclair trademark” and all that). With no changes to its actual computing capabilities, this wasn’t exactly a compelling upgrade package for current Spectrum owners. And, Sinclair still being Sinclair, the same old problems continued; most Spectrum Pluses arrived with several of the vaunted new plastic keys floating around loose in the box.

By mid-1985, Sinclair’s position wasn’t a whole lot better than that of Acorn. They were drowning in unsold inventories of Spectrums and QLs dating back to the previous Christmas season and even before, mired in debt, and without the resources to develop the Spectrum successor they desperately needed.

Then it seemed that their own Olivetti-equivalent had arrived. In a “World Exclusive!” article in the June 17, 1985, edition, the Daily Mirror announced that “Maxwell Saves Sinclair.” The Maxwell in question was the famous British tycoon and financier Robert Maxwell, who would inject some £12 million into the company. In return, Sir Clive would have to accept some adult supervision: he would become a “life president” and consultant, with Maxwell installing a management team of his own choosing. Everyone was relieved, even Margaret Thatcher. “The government has been aware that these talks have been going on and welcomes any move to put the Sinclair business on a firm footing,” said a spokesman.

Then, not quite two months after the carefully calibrated leak to the Daily Mirror, Maxwell suddenly scuttled the deal. We’re not quite sure why. Some have said that, after a thorough review of Sinclair’s books, Maxwell concluded the company was simply irredeemable; some that Sir Clive refused to quietly accept his “life president” post and go away the way Maxwell expected him to; some that Sir Clive planned to go away all too soon, taking with him a promising wafer-scale chip integration process a few researchers had been working on to serve as his lifeboat and bridge to yet another incarnation of an independent Sinclair, as the ZX80 had served as a bridge between the Sinclair Radionics of the 1970s and the Sinclair Research of the 1980s. Still others say that Sir Clive was never serious about the deal, that the whole process was a Machiavellian plot on his part to keep his creditors at bay until the Christmas buying season began to loom, after which they would continue to wait and see in the hope that Sinclair could sell off at least some of all that inventory before the doors were shut. This last, at least, I tend to doubt; like the idea that he staged the QL unveiling to upstage the Macintosh, it ascribes a level of guile and business acumen to Sir Clive that I’m not sure he possessed.

At any rate, Sinclair Research staggered into 1986 alive and still independent but by all appearances mortally wounded. A sign of just how far they had fallen came when they had to beg the next Spectrum iteration from some of the people they were supposed to be supplying it to: Spain’s Investrónica, signatories to the only really viable foreign distribution deal they had managed to set up. The Spectrum 128 was a manifestation of Investrónica’s impatience and frustration with their partner. After waiting years for a properly updated Spectrum, they had decided to just make their own. Created as it was quickly by a technology distributor rather than technology developer, the Spectrum 128 was a bit of a hack-and-splice job, grafting an extra 80 K of memory, an improved sound chip, and some other bits and pieces onto the venerable Speccy framework. Nevertheless, it was better than nothing, and it was compatible with older Speccy games. Sinclair Research scooped it up and started selling it in Britain as well.

The state of Acorn and Sinclair as 1986 began was enough to trigger a crisis of faith in Britain. The postwar era, and particularly the 1970s, had felt for many people like the long, slow unraveling of an economy that had once been the envy of the world. It wasn’t only Thatcher’s Conservatives who had seen Sir Clive and Acorn as standard bearers leading the way to a new Britain built on innovation and silicon. If many other areas of the economy were finally, belatedly improving after years and years of doldrums, the sudden collapse of Sinclair and Acorn nevertheless felt like a bucket of cold water to the dreamer’s face. All of the old insecurities, the old questions about whether Britain could truly compete on the world economic stage came to the fore again to a degree thoroughly out of line with what the actual economic impact of a defunct Acorn and Sinclair would have been. Now those who still clung to dreams of a silicon Britain found themselves chanting an unexpected mantra: thank God for Alan Sugar.

Sugar, the business titan with the R&B loverman’s name, had ended his formal schooling at age 16. A born salesman and wheeler and dealer, he learned his trade as an importer and wholesaler on London’s bustling Tottenham Court Road, then as now one of the densest collections of electronics shops in Europe. He founded his business, Amstrad, literally out of the back of a van there in 1968. By the late 1970s he had built Amstrad into a force to be reckoned with as purveyors of discount stereo equipment, loved by his declared target demographic of “the truck driver and his wife” as much as it was loathed by audiophiles.

He understood his target market so well because he was his target market. An unrepentant Eastender, he never tried to refine his working-class tastes, never tried to smooth away his Cockney diction despite living in a country where accent was still equated by many with destiny. The name of his company was itself a typical Cockneyism, a contraction of its original name of A.M.S. Trading Company (“A.M.S.” being Sugar’s initials). Sugar:

There was the snooty area of the public that would never buy an Amstrad hi-fi and they went out and bought Pioneer or whatever, and they’re 5 percent of the market. The other 95 percent of the market wants something that makes a noise and looks good. And they bought our stuff.

An Amstrad stereo might not be the best choice for picking out the subtle shadings of the second violin section, but it was just fine for cranking out the latest Led Zeppelin record good and loud. Sugar’s understanding of what constituted “good enough” captured fully one-third of the British stereo market for Amstrad by 1982, far more than any other single company.

In 1983, Sugar suddenly decided that Amstrad should build a home computer to compete with Sinclair, Acorn, and Commodore. Conventional wisdom would hold that this was absolutely terrible timing. Amstrad was about to jump into the market just in time for it to enter a decline. Still, if Sugar could hardly have been aware of what 1984 and 1985 would bring, he did see some fairly obvious problems with the approach of his would-be competitors which he believed Amstrad could correct. In a sense, he’d been here before.

Stereos had traditionally been sold the way that computer systems were in 1983: as mix-and-match components — an amplifier here, a tape deck and record player there, speakers in that corner — which the buyer had to purchase separately and assemble herself. One of Sugar’s greatest coups had come when he had realized circa 1978 that his truck drivers hated this approach at least as much as the audiophiles reveled in it. They hated comparing a bunch of gadgets with specifications they didn’t understand anyway; hated opening a whole pile of boxes and trying to wire everything together; hated needing four or five sockets just to power one stereo. Amstrad therefore introduced the Tower System: one box, one price, one socket — plug it in and go. It became by far their biggest seller, and changed the industry in the process.

Amstrad’s computer would follow the same philosophy, with the computer, a tape drive, and a monitor all sold as one unit. The included monitor in particular would become a marketing boon. Monitors being quite unusual in Britain, many a family was wracked with conflict every evening over whether the television was going to be used for watching TV or playing on the Speccy. The new Amstrad would, as the advertisements loudly proclaimed, make all that a thing of the past.

Amstrad CPC464

The CPC-464 computer which began shipping in June of 1984 was in many other ways a typical Amstrad creation. Sugar, who considered “boffin” a term of derision, was utterly uninterested in technological innovation for its own sake. Indeed, Sugar made it clear from the beginning that, should the CPC-464 disappoint, he would simply cut his losses and drop the product, as he had at various times televisions, CB radios, and car stereos before it. He was interested in profits, not the products which generated them. So, other than in its integrated design the CPC-464 innovated nowhere. It instead was just a solid, conservative computer that was at least in the same ballpark as the competition in every particular and matched or exceeded it in most: 64 K of memory, impressive color graphics, a decent sound chip, a more than decent BASIC. Build quality and customer service were, if not quite up to Acorn’s standards, more than a notch or two above Sinclair’s and more than adequate for a computer costing about £350 with tape drive and color monitor. Amstrad also did some very smart things to ease the machine’s path to consumer adoption: they paid several dozen programmers to have a modest library of games and other software available right from launch, and started Amstrad Computer User magazine to begin to build a community of users. These strategies, along with the commonsense value-for-your-pound approach of the machine itself, let the CPC-464 and succeeding machines do something almost inconceivable to the competitors collapsing around them: post strong sales that continued to grow by the month, making stereos a relatively minor part of Amstrad’s booming business within just a couple of years.

Amstrad’s results were so at odds with those of the industry as a whole that for a considerable length of time the City simply refused to believe them. The company’s share price continued to drop through mid-1985 in direct defiance of rosy sales figures. It wasn’t until Amstrad’s fiscal year ended in June and the annual report appeared showing sales of £136.1 million and an increase in profits of 121 percent that the City finally began to accept that Amstrad computers were for real. Alan Sugar describes in his own inimitable way the triumphalism of this period of Amstrad’s history:

The usual array of predators, such as Dixons, W. H. Smith, and Boots, were hovering around like the praying mantis, saying, “Ha, ha, you’ve got too many computers, haven’t you? We’re going to jump on you and steal them off you and rape you when you need money badly, just like Uncle Clive.” And we said, “We haven’t got any.” They didn’t believe us, until such time as they had purged their stocks and finished raping Clive Sinclair and Acorn, and realized they had nothing left to sell. So they turned to us again in November of 1985 and said, “What about a few of your computers at cheaper prices?” We stuck the proverbial two fingers in the air, and that’s how we got price stability back into the market. They thought we were sitting on stockpiles and they were doing us a big favour. But we had no inventory. It had gone to France and Spain.

Continental Europe was indeed a huge key to Amstrad’s success. When Acorn and Sinclair had looked to expand internationally, they had looked to the hyper-competitive and already troubled home-computer market in the United States, an all too typical example of British Anglocentrism. (As Bill Bryson once wrote, a traveler visiting Britain with no knowledge of geography would likely conclude from the media and the conversations around her that Britain lay a few miles off the coast of the United States, perhaps about where Cuba is in our world, and it was the rest of Europe that was thousands of miles of ocean away.) Meanwhile they had all but ignored all that virgin territory that started just a ferry ride away. Alan Sugar had no such prejudices. He let America alone, instead pushing his computers into Spain, France, and the German-speaking countries (where they were initially sold under the Schneider imprint — ironically, another company that had gotten its start selling low-priced stereo equipment). Amstrad’s arrival, along with an increasingly aggressive push from Commodore’s West German subsidiary, marks the moment when home computers at last began to spread in earnest through Western Europe, to be greeted there by kids and hackers with just as much enthusiasm and talent as their British, American, and Japanese counterparts.

One day in early 1986, Alan Sugar received an unexpected call from Mark Souhami, manager of the Dixons chain of consumer-electronics stores. Souhami dropped a bombshell: it seemed that Sir Clive was interested in selling his computer operation to Amstrad, the only company left in the market with the resources for such an acquisition. Dixons, who still sold considerable numbers of Spectrums and thus had a vested interest in keeping the supply flowing, had been recruited to act as intermediaries. Sir Clive and Sugar soon met personally for a quiet lunch in Liverpool Street Station. Sir Clive later reported that he found Sugar “delightful” — “very pleasant company, a witty man.” Sugar was less gracious, ruthlessly mocking in private Sir Clive’s carefully cultivated “Etonian accent” and his intellectual pretensions.

At 3:00 AM on April 2, 1986, after several weeks of often strained negotiations, Amstrad agreed to buy all of the intellectual property behind Sinclair’s computers, along with the existing stocks, for £16 million. The sum would allow Sir Clive to pay off his creditors and make a clean break from the computer market to pursue his real passions. Tellingly, Sinclair Research itself, along with the TV80 and the C5, was explicitly excluded from the transfer — not that Sugar had any interest in such financial losers anyway. With a stroke of the pen, Alan Sugar and Amstrad now owned 60 percent of the British home-computer market along with a big chunk of the exploding continental European market. All less than two years after the CPC-464 had debuted under a cloud of doubt.

Clive Sinclair and Alan Sugar

When Sugar and Sir Clive officially announced their deal at a press conference on April 7, the press rightly marked it as the end of an era. The famous photograph of their uncomfortable handshake before the assembled flash bulbs stands as one of the more indelible in the history of British computing, a passing of the mantle from Sir Clive the eccentric boffin to Sugar the gruff, rough, and ruthless man of the bottom line. British computing had lost its innocence, and things would never quite be the same again. Thatcher had backed the wrong horse in choosing Sir Clive as her personification of the new British capitalist spirit. (Sugar would get a belated knighthood of his own in 2000.) On the plus side, British computing was still alive as an independent entity, a state of affairs that had looked very doubtful just the year before. Indeed, it was poised to make a huge impact yet through Amstrad.

Those who fretted that Sugar might have bought the Spectrum just to kill it needn’t have; he was far too smart and unsentimental for that. If people still wanted Spectrums, he would give them Spectrums. Amstrad thus remade the Speccy in the CPC line’s image, complete with an integrated tape drive, and continued to sell it as the low end of their lineup into the 1990s, until even the diehards had moved on. Quality and reliability improved markedly, and the thing even got a proper keyboard at long last. The QL, however, got no such treatment; Sugar put it out of its misery without a second thought.

Clive Sinclair rides off into the sunset

I’ll doubtless have more to say about a triumphant Amstrad and a humbled but still technically formidable Acorn in future articles. Sir Clive, however, will now ride off into the sunset — presumably on a C5 — to tinker with his electric cars and surface occasionally to delight the press with a crazy anecdote. He exited the computer market with dreams as grandiose as ever, but no one would ever quite take him seriously again. For a fellow who takes himself so manifestly seriously, that has to be a difficult thing to bear. Sinclair Research exists as a nominal corporation to this day, but for most of the past three decades its only actual employee appears to have been Sir Clive himself, still plugging away at his electric car (miniaturized televisions have not been in further evidence). I know I’ve been awfully hard on Sir Clive, but in truth I rather like him. He possessed arrogance, stubbornness, and shortsightedness in abundance, but no guile and very little greed. Amongst the rogues’ gallery of executives who built the international PC industry, that practically qualifies him for sainthood. He was certainly the most entertaining computer mogul of all time, and he did manage almost in spite of himself to change Britain forever. The British public still has a heartfelt affection for the odd little fellow — as well they should. Eccentrics like him don’t come around every day.

(Much of this article was drawn from following the news items and articles in my favorite of the early British micro magazines, Your Computer, between January 1984 and May 1986. Other useful magazines: Popular Computing Weekly of November 10 1983 and January 12 1984; Sinclair User of November 1984, February 1985, and March 1985. Two business biographies of Sir Clive are recommended, one admiring and one critical: The Sinclair Story by Rodney Dale and Sinclair and the “Sunrise” Technology by Ian Adamson and Richard Kennedy respectively. The best account I’ve found of Amstrad’s early history is in Alan Sugar: The Amstrad Story by David Thomas. Good online articles: The Register’s features on the Sinclair Microdrives, the QL, and the Acorn Electron; Stairway to Hell’s reprinting of a series of articles on Acorn’s history from Acorn User magazine. Finally, by all means check out the delightful BBC docudrama Micro Men if you haven’t already and marvel that the events and personalities depicted therein are only slightly exaggerated. That film is also the source of the last picture in this article; it was just too perfect an image to resist.)

 
 


Elite (or, The Universe on 32 K Per Day)

BBC Micro Elite

Sometimes great works go unappreciated during their time. Other times their time knows exactly what they’re on about. The latter was the good fortune of Elite, Ian Bell and David Braben’s epic game of space combat, trading, and exploration. Arriving at a confused and confusing time in the British games industry, Elite caused a rush of excitement the likes of which had never been seen before even in an industry that seemed to live and die on hype, becoming a bestseller several times over despite being initially released on a platform, the BBC Micro, that was not generally considered much of a gaming machine. Bell and Braben became recognizable stars, their names tripping off the tongues of a generation of British gamers the way that those of Lennon and McCartney had their parents’. It was about as close as the industry would ever get to Trip Hawkins’s dream of game designers as the rock stars of the 1980s. As for the game they created… well, that’s gone down in history as just possibly the most remembered and respected single computer game of the 1980s. But we’re beginning with the ending, which isn’t our usual way around here. Let’s go back to the beginning and see how it all came about.

Bell and Braben first met one another during the autumn of 1982, when both arrived at Cambridge University as first-year undergraduates. Bell was to read math, Braben physics. More importantly, both were avid hackers. Bell brought a BBC Micro to university with him, Braben an example of that machine’s predecessor, the Atom, which he had expanded and soldered on and generally hacked at enough to make Dr. Frankenstein proud. Bell had real professional programming experience, at least of a sort: he’d gotten his version of Reversi published by a tiny company called Program Power, and would soon see an original action game, Freefall, published by Acornsoft, software arm of the company that made the computers on his and Braben’s desks. Braben had just passion and aptitude. The two bonded quickly.

Not that they became precisely bosom buddies. As their later story would demonstrate to anyone’s satisfaction, they were very different personalities. If I may strain an analogy just one more time, Bell was the John Lennon of the pair, pessimistic, introverted, and perhaps just a little bit tortured, while Braben was the Paul McCartney, an optimistic charmer with one eye on the market to go with one eye on his art. If not for their passion for Acorn computers, they would have likely had little to say to one another. Both, however, had programming talent to burn, along with a less obvious but at least as important instinct for visionary game design.

But then, in the era of Elite even more so than today, technological innovation and design innovation were often inextricably linked, with the latter most often flowing from the former. Thus the design that would become Elite stemmed directly from a routine Braben wrote in June of 1983 which could draw four different static 3D spaceships using wire-frame graphics. To understand what made those spaceships so different, and so fraught with potential, we should look to the state of game graphics in general circa 1983.

Defender Pac-Man

Almost all action games of 1983 or earlier show their world from either directly overhead or sideways (like Defender) or some odd hybrid of the two that doesn’t quite make sense in the real world (like Pac-Man). They employ a third-person perspective; you see and control an onscreen avatar from a distance, rather than viewing the world through his eyes. He, his enemies, and perhaps some other elements like laser fire move over a relatively static background image. This approach makes life much easier for programmers in at least a couple of ways. Updating big chunks of screen is very expensive in terms of the computing power available to early PCs and stand-up arcade games. Therefore many of them implemented hardware sprites: little movable chunks of graphics that exist separately from the rest of the screen inside the computer and are overlaid onto it by the video hardware only on the physical monitor screen, at no cost to the CPU. A game like Defender or Pac-Man is an ideal fit for such technology; I trust it won’t be difficult to figure out which parts of the screens above are implemented as sprites and which as background graphics. (In the early days all of the work could be left to sprites: a few early games, such as Boot Hill, consist only of sprites, which are sometimes projected over a painted background image.)

There’s also another, more subtle advantage to the traditional arcade-game perspective. If you think about it for a moment, you’ll realize that the worlds shown on the screens above don’t correspond to any recognizable version of our reality even postulating that it could contain invading aliens or munching heads being pursued through a maze of food pellets by ghosts. These worlds are strictly 2D; they lack any notion of depth. Pac-Man and his friends are living in a computerized version of Edwin Abbott’s Flatland; if we were to see this world through his perspective, it would be a very strange one indeed. Similarly, your spaceship in Defender can go up and down and left and right, but not in and out. This is very convenient for the programmer because the computer screen also happens to be flat, possessed of an X- and a Y-dimension but no Z-dimension. Thus the coordinates of any object in this flat world being simulated correspond nicely to its coordinates on the physical screen.

But what if you aren’t satisfied with a Flatland-esque world shown from a locked vertical or horizontal perspective? What if you want to immerse your player in your world good and proper, and to make it one that corresponds to our own of three dimensions while you’re at it? Well, now your job just got a whole lot more difficult. As it happened, however, that was exactly what Bell and Braben were soon trying to do. The crux of the problem, the crux of a huge body of 3D graphics theory as well as lots and lots of specialized hardware that is probably a part of the computer you’re using to read this and for which, if you’re a hardcore gamer, you may have paid hundreds of dollars, is disarmingly simple: how to translate the X, Y, and Z of a world that lives inside the computer to the X and Y of the computer screen. The starting point must be the rules of visual perspective, well understood by artists since at least the Renaissance. But that well-trodden path opens into a thicket of complications when applied to the computer. Lacking as it does an artist’s intuitive understanding of the real world, a computer has to be laboriously instructed on how to avoid drawing objects that are behind other objects on top of the ones in front of them, how to figure out which surfaces of an object are visible and which are not, etc. Just to make the challenges even greater, sprites aren’t of any real use for 3D graphics: the entire screen is necessarily changing all the time when moving a first-person perspective through a 3D world.
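
To make the core of that translation concrete, here is a minimal sketch in modern Python of the similar-triangles math at the heart of any wireframe renderer. It’s illustrative only; Bell and Braben worked in hand-tuned 6502 assembly language, and the screen size and focal length below are arbitrary choices of mine.

```python
# A toy wireframe projector. The camera sits at the origin looking down the
# +z axis; the screen size and focal length are arbitrary choices for this
# sketch, not anything taken from the real game.

SCREEN_W, SCREEN_H = 320, 256   # a BBC-Micro-ish screen, purely for flavor
FOCAL = 160.0                   # distance from the eye to the projection plane

def project(x, y, z):
    """Map a 3D point to 2D screen coordinates by similar triangles:
    the farther away the point, the closer it lands to the screen's center."""
    sx = SCREEN_W / 2 + FOCAL * x / z
    sy = SCREEN_H / 2 - FOCAL * y / z   # screen y grows downward
    return sx, sy

# A wireframe object is just vertices plus the edges connecting them.
# Here: a 20-unit cube floating 100 units in front of the viewer.
verts = [(x, y, z + 100) for x in (-10, 10) for y in (-10, 10) for z in (-10, 10)]
edges = [(a, b) for a in range(8) for b in range(a + 1, 8)
         if sum(va != vb for va, vb in zip(verts[a], verts[b])) == 1]

for a, b in edges:
    # A real renderer would now draw a line between these two screen points.
    print(project(*verts[a]), "->", project(*verts[b]))
```

The single division by z is the whole trick: distant points crowd toward the center of the screen, exactly as perspective demands. All the complications described above, like deciding which lines should be hidden, pile up on top of that one simple operation.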

Bell and Braben were hardly the first to enter into this territory. Indeed, the field of 3D graphics isn’t all that much younger than the field of computer graphics itself. Academic researchers during the 1960s and especially the 1970s laid down much of the work that still grounds the field today. One minor contributor to this growing body of work was a graphics researcher and aviation enthusiast named Bruce Artwick, who finished a Master’s degree at the University of Illinois (home of PLATO) in 1976. For his thesis project, he combined his two interests. “A Versatile Computer-Generated Dynamic Flight Display” described a flight simulator featuring a first-person, out-the-cockpit view of a 3D world. In 1980, Artwick with his new company SubLogic brought to market the aptly titled Flight Simulator for the Apple II and TRS-80. Running in as little as 16 K of memory, it marked microcomputer gamers’ first encounter with the format that now dominates the industry: interactive, animated 3D graphics. The Flight Simulator line, whether sold under the imprint of SubLogic or Microsoft, went on to become a computing institution spanning some three decades.

SubLogic Flight Simulator on the Apple II (1980)

Groundbreaking as they were, however, the early versions of Flight Simulator were also, as their name would imply, much more simulator than game. They provided no story, no goals, no sense of progression — just an empty world to fly through. Yes, they did include a mode called “British Ace 30 Aerial Battle,” which transformed your little Cessna into a World War I biplane and let you fly around trying to shoot other planes out of the sky, but, well, let’s just say that it was always clear when playing it that Artwick’s real priorities lay elsewhere. Mostly you were expected to make your own fun refining your piloting technique and, of course, marveling that this 3D world could exist at all on a 16 K 8-bit microcomputer.

Battlezone

A more traditionally gamelike application of 3D came to arcades that same year in the form of Atari’s Battlezone. In it you control a tank in battle against other tanks. You view the action from a first-person perspective, through a screen made to resemble the periscope of a real tank. Battlezone eventually made it to home computers and consoles as well, albeit not until 1983. While their awareness of Flight Simulator is questionable (it was an American product made for American platforms in a very bifurcated computing world), Bell and Braben were aware of and had played Battlezone in the arcades. It was the impetus for Braben’s rotating 3D spaceships and for the combat game Bell and Braben would soon be designing around them.

They were determined to bring 3D to a 2 MHz 8-bit computer with 32 K of memory, and to do it in the context of a real game with real things to do. At least they didn’t have to bemoan the uselessness of sprites to this new paradigm: having been created with educational and “practical” uses in mind rather than gaming, the BBC Micro didn’t have any anyway. Programming, like politics, being the art of the possible, compromises would be needed if they were to have a prayer. Braben had already made the wise choice to set his 3D demo in space. Space is full of, well, space. It’s almost entirely empty, thus dramatically reducing the amount of stuff their game would have to draw. One other obvious decision was to perform only the first part of the full two-part rendering process, drawing in the outlines of objects in their 3D world but not going back and filling in their surfaces, an even more complicated and expensive process. (As the screens above illustrate, Artwick and Atari had already made the same compromise in their own initial implementations of 3D.)

BBC Micro Elite. Note that the rendering is far from perfect, with lots of line breakage. Luckily, this isn’t so obvious when the ships are in motion.

Thus Braben made his first spaceships as simple as possible, with just enough lines and points to make of each a recognizable shape. This turned out to be wise for another reason: complex designs shown in wireframe tend to turn into a confusing mishmash of lines. To simplify rendering, all objects were also made convex, meaning that any given line will only pass in and out of the object once; as Braben himself put it in a talk at a recent Game Developers Conference, a block of cheddar cheese is convex but a block of Swiss is not. Later in the game’s development, when Bell and Braben had managed to considerably accelerate the original rendering code, more complex ships, like Bell’s Transporter, were added.

Another area of concern was the control of your own spaceship, the one through whose cockpit you would be viewing this 3D universe. A spaceship, like an airplane, can change its orientation in six ways, being able to yaw, pitch, or roll in either direction. Yet a joystick can be moved in only four cardinal directions — perfect for a 2D world but problematic for their 3D world. Bell and Braben soon realized, however, that being in space saved them. With no ground, and thus no real notion of up and down with which to contend, turns could be accomplished by simply rolling to the desired orientation and pitching up or down; no need for a yaw control at all. While they took full advantage of the good parts of being in space, they also wisely decided not to try to make the game a remotely realistic simulation of spaceflight. Like Star Wars, their game would be one of dogfights in space, with ships banking and swooping as if flying through an atmosphere rather than a vacuum. Anything else would just feel too disorienting, they judged. Most people would prefer to be Luke Skywalker rather than David Bowman anyway.
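
A toy demonstration of why that works: roll about the nose axis until the target direction lies straight ahead-and-up, then pitch to meet it. The sketch below is plain Python with floating-point trigonometry, nothing like the arithmetic an 8-bit machine would actually use, but it shows that the two controls really do cover every direction.

```python
import math

def roll_then_pitch(target):
    """Given a desired unit direction for the nose, return the roll and pitch
    angles (radians) that carry the ship's forward vector (0, 0, 1) onto it.
    Roll is about the nose (z) axis; pitch is about the ship's own x axis
    after rolling."""
    dx, dy, dz = target
    pitch = math.acos(max(-1.0, min(1.0, dz)))
    roll = math.atan2(-dx, dy) if (dx, dy) != (0.0, 0.0) else 0.0
    return roll, pitch

def nose_after(roll, pitch):
    """Where the nose ends up after rolling, then pitching, from (0, 0, 1)."""
    sr, cr = math.sin(roll), math.cos(roll)
    sp, cp = math.sin(pitch), math.cos(pitch)
    return (-sr * sp, cr * sp, cp)

# Pick an arbitrary unit direction and confirm roll-then-pitch reaches it.
target = (0.6, 0.48, 0.64)                   # 0.36 + 0.2304 + 0.4096 = 1.0
print(nose_after(*roll_then_pitch(target)))  # ~(0.6, 0.48, 0.64)
```

Running it prints a vector equal to the chosen target, confirming that roll plus pitch is all the steering a spaceship with no horizon needs.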

So, yes, this would be a game of space combat. That was always a given. But what should it be beyond that? How should that combat be structured, framed? With a workable 3D engine running at last after some months of concerted effort, it was time to ask these questions seriously. One alternative would be to make a traditional arcade-style game, complete with three lives, a score, and ever-escalating waves of enemy ships to gun down. To make, in other words, Battlezone with spaceships. Certainly what they already had was more than impressive enough to sell lots of copies.

Instead, Bell and Braben made their next visionary decision, to make their game something much more than just an arcade-style shooter. They would embed the shooting within a long-form experience that would give it a context, a purpose beyond high-score bragging rights. This was not, as effervescent popular histories of Elite‘s birth have often implied, completely unprecedented. Long-form experiences were not hard to find in computer games years before Bell and Braben — in adventures, in CRPGs, in strategy and war games. It was, however, rather more unusual to see this approach combined with action elements. Taken on their own, the action elements of Bell and Braben’s game were groundbreaking enough to go down as an important moment in gaming history. By refusing to stop there, they would ensure that their game would break ground in multiple directions, and go down as not just important but one of the most important ever.

The inspiration came from tabletop RPGs, a pastime both Bell and Braben indulged in from time to time, although, perhaps tellingly, usually not together. They liked the way an RPG campaign could span many, many sessions, could turn into an ongoing long-form narrative. And they liked the process of building up a character from a low-level nothing to a veritable god over weeks, months, or years. Of course, your “character” in their game was really your spaceship. Fair enough; your goal would be to upgrade that with ever better weapons and defenses that not coincidentally bore a strong resemblance to those in Bell’s favorite RPG: Traveller, the first popular tabletop RPG to replace swords and sorcery with rockets and rayguns. From here the rest of the design seemed to unspool almost of its own accord.

BBC Micro Elite BBC Micro Elite

They needed a mechanism for upgrading the ship, something more interesting than just adding the next piece to the ship automatically every time a certain score threshold was reached. The natural choice was money; every option would have a cost, letting players prioritize and truly make their spaceships their own.

Okay, but how to earn money? Drawing again from Traveller (a game whose imprint would be all over the finished Elite not just in mechanics but in its overall feel), you could be a trader plying the spaceways, buying low in one system in the hopes of selling high in another — a whole new strategic dimension.

But then how would that involve combat? Well, the ships attacking you could be pirates. This would also go a long way to explain why they were so chaotic and kind of random in their behavior, an inevitable result of limited memory and horsepower to devote to their artificial intelligence. Pirates, after all, were chaotic and kind of random by their very nature.

But actually landing on all those trading planets obviously wasn’t going to be workable; avoiding those complications was the reason for setting the game in space in the first place. No problem; you could just dock at space stations in orbit around them. Bell and Braben came up with a new challenge to make this more interesting: in a bit inspired by 2001: A Space Odyssey, you would have to carefully guide your spaceship into the rotating station’s docking bay at the end of every voyage. Of course, over time this could get tedious as well as frustrating (a botched approach generally means instant death). No problem; for a mere 1000 credits, you could buy a docking computer to do it for you. Other non-combat-oriented ship upgrades were also added to the catalog, like a fuel scoop to gather fuel by skimming the surface of a sun instead of buying it at a station.

If those spaceships attacking you really were pirates, thought Bell and Braben, the authorities would probably be quite pleased with you for shooting them down. Why not put bounties on them, so you could make your living as a bounty hunter if you got bored with trading? Now the possibilities really started rolling. If you could shoot pirates for money, you could also attack peaceful traders — become a pirate yourself, in other words, if you felt you could outduel the police Vipers that would attack you from time to time once your reputation became known. They came up with an alternative use for the fuel scoop: use it to scoop up the cargo of ships you’d destroyed to sell on the stations. The fuel scoop also became key to yet another way of making money: buy a special mining laser, break up asteroids with it, and scoop up the alloys they contained to sell stationside. If only they’d had more than 32 K of memory, they could have gone on like this forever.

But 32 K was all they had, and that was a constant challenge to their growing ambitions. For this grand game of trading to work, there had to be a big, varied galaxy to explore. There should be planets with a variety of economies and governments, from safe, established democracies for the conservative, peaceful trader to visit to anarchies home to hordes of pirates for the brave or foolhardy looking to make a big score. They came up with a scheme to let them pack all of the vital information about a star system with a single inhabited planet — its location, its economy, its type of government, its technology level, its population, its dominant species, its GDP, its size, even its name and a bit of flavor text — into just six bytes. Even so, a modest galaxy of 100 star systems would still require 600 bytes that they just couldn’t seem to find. Now came their most storied stroke of inspiration.
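
I won’t pretend to know the exact byte layout Bell and Braben settled on, but the general idea, carving a handful of small bit-fields out of 48 bits and then deriving everything else from them by fixed rules, might look something like the sketch below. The field names and widths here are purely my own invention for illustration.

```python
def unpack_system(seed):
    """Decode a six-byte seed into a star system. The field widths here are
    invented for this sketch; the real game used its own layout and derived
    names, species, and flavor text from the same bits with further rules."""
    assert len(seed) == 6
    b = int.from_bytes(seed, "big")          # 48 bits to slice up
    return {
        "x":          (b >> 40) & 0xFF,      # galactic position, 0-255
        "y":          (b >> 32) & 0xFF,
        "economy":    (b >> 29) & 0x07,      # 8 economy types
        "government": (b >> 26) & 0x07,      # 8 government types
        "tech_level": (b >> 22) & 0x0F,      # 0-15
        "population": (b >> 16) & 0x3F,      # in some convenient unit
        "name_seed":  b & 0xFFFF,            # feeds a name generator
    }

# Six bytes in, a whole (hypothetical) star system out.
print(unpack_system(bytes([0x5A, 0x48, 0x02, 0x53, 0xB7, 0x63])))
```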

In 1202 an Italian mathematician named Fibonacci described a simple construct that became known as the Fibonacci sequence. In its classic form, you begin with two numbers, either 1 and 1 or 0 and 1. To get the third number in the sequence, you add the first two together. You then add the second and third number together to get the fourth. Etc., etc. A common and very useful variation is to drop all but the least significant digit of each number that is generated. It’s also common to begin the sequence not with 1 and 1 or 0 and 1 but some other, arbitrary pair. So, a sequence that begins with 2 and 7 would look like this:

2 7 9 6 5 1 6 7 3 0 3 3 6 9 5 4 9 3 …

The sequence appears random, but is actually entirely predictable for any given starting pair. This variation, however, is only a starting point. You can apply any rules you care to specify to a sequence of numbers with entirely predictable results, as long as you are consistent about it. Bell and Braben realized that they could seed their galaxy with any sequence they wished of six hexadecimal numbers to represent the starting system. Then they could manipulate those numbers in a predetermined way to generate the next; manipulate those to generate the next; etc. They decided that 256 systems was a good size for their galaxy. They needed just those initial six bytes to “store” all 256 planets. In addition to the memory savings, this method of generating their galaxy also saved Bell and Braben many hours spent designing it from scratch. Indeed, growing new galaxies from different starting seeds soon became a game of its own for them. They went through many iterations before finding the one that made it into the final game. Some they had to throw out right away for obvious reasons, such as the one with a system called “Arse” and the ones that had unreachable systems, outside of the player’s ship’s seven-light-year range from any other stars. Others just didn’t feel right.
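
The toy version of the idea is easy to play with. This little Python sketch reproduces the sequence printed above from the starting pair of 2 and 7; the game applied the same principle to its six-byte seeds with rules of its own rather than this decimal example, but the payoff is identical: an unpredictable-looking yet perfectly repeatable stream of numbers, and therefore a galaxy that never needs to be stored anywhere.

```python
def lagged_sum_digits(a, b, count):
    """The 'keep only the last digit' Fibonacci variation described above."""
    out = [a, b]
    while len(out) < count:
        out.append((out[-2] + out[-1]) % 10)   # drop everything but the last digit
    return out

print(lagged_sum_digits(2, 7, 18))
# [2, 7, 9, 6, 5, 1, 6, 7, 3, 0, 3, 3, 6, 9, 5, 4, 9, 3]
```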

After a few months of steady work, the basics of what would become Elite were all in place in their heads if not entirely in their code. They decided it was time to see if anyone would be interested in publishing it. Braben believed they should try to find the biggest publisher possible, one with the reach to properly support and promote this game like no other. He accordingly secured them an appointment at the London offices of Thorn EMI, the recently instantiated software division of one of the largest media conglomerates in the world. Very much a sign of this heady period in British computing, Thorn EMI had been founded in the expectation that computer games were destined to be the next big thing in media. Like their colleagues over in EMI’s music division looking for the next big hit single, they weren’t looking for deathless art or niche audiences; they were looking for big, mainstream hits. They had developed a checklist of sorts, a list of what they thought would appeal to the general public that wasn’t all that far removed from Trip Hawkins’s guidelines for American “consumer software.” Their games should be simple, intuitive, colorful, and not too demanding. Bell and Braben’s complicated game — while it was a technical wonder; anyone could see that — was none of these things. They said it was nothing for them, although Bell and Braben were welcome to come back any time to show a reworked — i.e., simplified — version. (In the end, Thorn EMI would find that technology wasn’t ready for casual consumer software, and wouldn’t be for years. The hardcore was all they had to sell to. Unwilling or unable to adapt to this reality as Hawkins’s Electronic Arts eventually did, they faded away quietly without ever managing to find the breakout mainstream hit they sought.)

Bell suggested they try Acornsoft, who had already published his game Freefall. In many ways Acornsoft should have been the logical choice from the start. Bell already had connections there, they knew the BBC Micro better than anyone, and they were located right there in Cambridge practically next door to the university proper, an institution with which they had deep and abiding links. (Regular readers will remember that it was Acornsoft and Cambridge oceanography professor Peter Killworth who provided a commercial outlet for the adventure games created on Cambridge’s Phoenix mainframe.) Yet Braben was reluctant. Always the more commercially minded of the pair, he knew that Acornsoft was hardly at the forefront of the British games industry. Their modest lineup of adventure games, educational software, and utilities had some very worthy members, yet the operation as a whole, like most software adjuncts to hardware companies, always felt like a bit of an afterthought. With their limited advertising and doughtily minimalist packaging, no Acornsoft title had ever sold more than a few tens of thousands of copies, and most never cracked 5000 — a far cry from the numbers Braben fondly imagined for their game. Acornsoft’s association with Acorn also concerned him in that it would necessarily limit the game to only Acorn computers. He and Bell weren’t hugely fond of the Commodore 64 or especially the Sinclair Spectrum, but he knew that their game would have to be ported to those more prominent gaming platforms at some point if it was to realize its commercial potential. In short, Acornsoft was… provincial.

Still, he agreed to accompany Bell to Acornsoft’s offices. It was, to say the least, a place very different from Thorn EMI’s posh digs in central London. From Francis Spufford’s Backroom Boys:

[Acornsoft] operated from one room of a warren of offices above the marketplace. You got there by sidling around the dustbins next to the Eastern Electricity showroom. Past the window display of cookers and fridge-freezers, up a steep little staircase, and into a cramped maze that would remind one employee, looking back, of a level from Doom. “Very back bedroom,” remembered David Braben, approvingly. In Acornsoft’s office they found a rat’s nest of desks and cables, and four people not much older than themselves.

Two of those four people, managing director David Johnson-Davies and chief editor Chris Jordan, would become the unsung heroes of Elite. Both got the game immediately, grasping not just its technical wizardry but also Bell and Braben’s larger vision for the whole experience. They both realized that this thing had the potential to be huge, bigger by an order of magnitude than anything Acornsoft had done before. Of course, it also represented a risk. Bell and Braben looked and acted like the couple of headstrong kids they still were. What if they flaked out? Nor was Acornsoft accustomed to issuing contracts and advances on unfinished software. Acornsoft had been conceived as an outlet for moonlighters and hobbyists, who sold them their homegrown software only once it was finished. Their normal policy was to not even look at programs that weren’t done; Bell and Braben were there at all only as a favor to Bell, a fellow with whom Acornsoft had a history and whom they liked personally. Still, Acorn as a whole was doing well; there was enough money to try something new, and this was too big a chance to pass up. They offered Bell and Braben a contract and an advance.

Now Braben made a move that would be as critical to Elite‘s success as anything in the game itself. Still concerned about Acornsoft’s provinciality, he negotiated a non-exclusive license which would allow them to develop and market versions for other machines after the versions for the Acorn machines were finished. Not quite sure what he was on about, Johnson-Davies agreed. With his share of the advance, Braben bought his own BBC Micro, retiring his hacked and abused old Atom at last.

As Bell and Braben worked to finish their game, Acornsoft provided essential playtesting while Johnson-Davies and Jordan served as an invaluable source of guidance and a certain adult wisdom. Sometimes the latter was needed to keep their ambitions in check, as when Bell and Braben burst into the Acornsoft office one day having had an epiphany. They had realized that, if all they needed to grow a galaxy was a starting seed of six numbers, they could have an infinite number of them — well, okay, about 282 trillion of them — in the game. They could let the player buy a “galactic hyperdrive” to jump between them, whereupon they would just generate a new random seed and let it rip. Johnson-Davies now showed a sharp design instinct of his own in walking them back a bit. Having more galaxies sounds like a great idea, he said, but having so many will actually spoil the illusion of a real persistent universe you’ve worked so hard to create. It will all just start to feel like what it really is: random. Nor will many of these new galaxies be pleasing places to explore, since you won’t be able to look at them and reject the ones with unreachable systems and the like. Bell and Braben agreed to settle for just eight galaxies, with a total of 2048 star systems to visit. That should be more than enough for anyone. Perhaps too many for Bell and Braben and Acornsoft’s testers: a planet Arse sneaked into one of these later galaxies and made it into the released version of the game.
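
For the curious, the arithmetic behind those numbers is simple enough: a six-byte seed is 48 bits, so the space of possible galaxies is 2 to the 48th power, while the eight galaxies that shipped hold 8 times 256 star systems.

```python
print(2 ** 48)   # 281474976710656 possible seeds, i.e. roughly 282 trillion galaxies
print(8 * 256)   # 2048 star systems across the eight galaxies in the released game
```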

Even as they gently squashed some of Bell and Braben’s more outlandish ideas, Johnson-Davies and Jordan still felt like something was missing. For all its technical and formal innovations, for all its scope of possibility, the game lacked any sort of real goal. Now, to some extent that was just the nature of the beast Bell and Braben had created. They would have dearly loved to have a real story to give context, had even planned on it at some stage (Braben says that “trading was originally going to be a very minor aspect”), but they now had to accept the fact that they weren’t going to be able to wedge some elaborate plot along with everything else into 32 K. Still, suggested Johnson-Davies and Jordan, maybe they could add something simple, something to mark progress and give bragging rights. Thus was born the system of ranks, based on the number of kills you’ve achieved. You start Harmless. After notching eight kills you become Mostly Harmless (a nod to The Hitchhiker’s Guide to the Galaxy). Each rank thereafter is exponentially more difficult to achieve, until, after some 6400 kills, you become Elite. There was the goal, one that should keep players playing a good long time.

It was also in a backhanded sort of way a political statement. Cambridge University was awash with indignation over the policies of Margaret Thatcher; a major coal miners’ strike which would become the battlefield for Thatcher’s final vanquishing of organized labor had the university’s liberal-arts wings all in a tumult from March of 1984. Bell and Braben bucked the university’s conventional wisdom to side with Thatcher. The player’s goal of becoming Elite was meant as a subtle nod toward the libertarian ideal of the self-made man, and a little poke in the eye of their leftist acquaintances. It also emphasized their view of their game as fundamentally about space combat, not trading. It gave players a powerful motivation to engage with what Bell and Braben still regarded as the most compelling part of the experience. You can make a lot of money as a peaceful, law-abiding trader who prudently runs from pirates when they show up, but you’ll never make Elite that way.

In finding an overarching goal they also found the title that had eluded them for some time. They first planned to call the game The Elite, a name to celebrate the group that much of Cambridge was railing against. But the filenames used for the game just said “Elite.” In time, they dropped the article from the official title as well. Elite it became — shorter, punchier, and with fewer political ramifications for Acornsoft to deal with.

Similarly subtle swipes at Cambridge’s liberal-arts students, whom in the long tradition of hard-science students Bell and Braben regarded as mushy-minded prima donnas, made it into the text tables that Bell developed to describe the planets in the game. After the Fibonacci sequence had done its work, some were populated by “edible poets”; others by “carnivorous arts graduates.” Ah, youth.

Bell and Braben had disk drives on their BBC Micros. After compressing their code as much as they possibly could, they finally began to make use of the drives’ capabilities within the game. They split the game into two parts: the trading program, loaded in when you docked at a station, and the program handling travel and combat, loaded as soon as you left one. This concerned Acornsoft greatly because most BBC Micro owners still had only cassette drives, which didn’t allow such loading on the fly. What good was the game of the decade if most people couldn’t play it? So they convinced the two to fork the game three ways. One version, the definitive one with all the goodies, would indeed require a BBC Micro with a disk drive. Another, for a tape-equipped BBC Micro, would be similar but would offer a smaller variety of ships to encounter along with simplified trading and a bit less detail to planets you visited and to the overall experience. Finally, Acornsoft convinced them to create a third version, stripped down even more, for the BBC Micro’s little brother, the Acorn Electron, the machine Acorn had introduced the previous year in an attempt to compete with the cheap Sinclair Spectrum.

Bell and Braben were naturally most excited about the disk-based version, particularly when they realized they had enough space still to add a little something extra. They made a couple of hand-crafted “missions” that pop up when you’ve been playing for a while: one to hunt down and destroy a stolen prototype of a new warship, another to courier some secret documents from one end of the galaxy to the other. These gave at least a taste of the more prominent story elements they wished they had space for.

Elite's packaging

While Bell and Braben finished up the coding, Johnson-Davies and Jordan worked to give the game the packaging and the launch it deserved. Acornsoft figured they needed to do all they could to justify the price they’d chosen to charge for the thing: from £12.95 to £17.65 depending on version, well over twice the typical going rate for a hot new game. They prepared a box of goodies the likes of which had never been seen before, not just from bland little Acornsoft but from anyone in the British games industry. Only some of the more lavish American packages, like those for the Ultimas and various Infocom games, could even begin to compare, and even by their standards Elite was grandiose. To a 63-page instruction manual Johnson-Davies and Jordan added The Dark Wheel, a separate scene-setting novella they commissioned from Robert Holdstock, an author just about to come into his own with the publication of his novel Mythago Wood. And they still weren’t done. They also added a ship-identification poster, a quick-reference guide, a keyboard overlay, some stickers, and a postcard to send to Acornsoft when you achieved the rank of Competent, entitling you to a certificate of achievement (an onscreen code revealed at that point would serve as proof).

Acornsoft stepped in and froze further development during the summer of 1984. The packaging was just about ready, and work on the game, while it would never be truly finished in the eyes of Bell and Braben, struck Acornsoft as about to reach a point of diminishing returns. And everyone was a little bit paranoid that something similar to Elite, even if it was nowhere near as good, might come out and steal their thunder. Bell and Braben grudgingly agreed that the time for release had come. But then, just as Acornsoft was about to send the master disk for duplication, Braben called Chris Jordan in a frenzy. They’d solved a niggling problem that had been bothering everyone for months, that of a “radar” scope to show where enemy ships are in relation to your own. The problem was, again, that of trying to map three dimensions onto two. Bell and Braben had done the best they could by providing two complementary scanners that had to be read in conjunction to get the full picture, but it always felt, in contrast to just about everything else about the game, kind of clunky and un-ideal. Now they had come up with a way to pack everything onto a single screen. It was beautiful. Showing a commitment few publishers then or now could match, Acornsoft agreed to take the new version of the game, which brought with it the painful task of having the manual edited and re-typeset to describe the new radar scope. Now, two years after Braben had first started playing with 3D spaceship models, they were done.

Buzz about Acornsoft’s secret “Project Bell” had been high for months. For launch day, Acornsoft rented Thorpe Park, a small amusement park (nowadays a much bigger one) near London. In a darkened room, with suitably portentous music playing, the world got its first glimpse of Elite — and of its two creators, who for the next few years would be the face of the young British games industry. In their picture from the launch party they look much as the British public would come to know them: Braben in the foreground, glib and personable; Bell a bit more uncertain and stereotypically nerdy and, much to Acornsoft’s occasional chagrin, more liable to go off-script.

David Braben and Ian Bell

Elite itself, needless to say, became a hit. Acorn and Acornsoft were making a big play for the home-computer market that Christmas, trying to challenge Sinclair and Commodore on their own turf, and Elite became a big part of that push. Advertising was shockingly frequent and grandiose for anyone who remembered the Acornsoft of old. The £50,000 campaign even included some television spots. Acornsoft Elite eventually sold almost 150,000 units between the BBC Micro and the Electron, a huge number for an absurdly expensive game on platforms not particularly popular with gamers. And most of those customers seemed to play Elite with an enthusiasm bordering on the obsessive. The first person known to become Elite was one Hal Bertram, on November 3, 1984, about five weeks after the game’s release. By the end of the year he had many companions in glory, while Acornsoft was positively flooded with postcards sent in by those attaining at least Competent status; they could barely make the badges they sent back to these folks fast enough. Many were doubtless aided by a bug in the ship-equipping code that had slipped through testing and was soon making the rounds amongst players: you could make infinite cash by trying to buy a laser you already had, whereupon the game would reward you with a generous cash credit in addition to the expected refusal. Undeterred, Acornsoft fixed the bug and sponsored a series of live monthly contests culminating in a grand showdown at the Acorn Users Show.

Still, it was clear to Braben that the really big numbers would come only when Elite came to the Speccy and the Commodore 64. The game was the talk of the industry, with owners of those more popular platforms, who had not even been aware of Acornsoft’s existence a few months before, clamoring to play it after it — along with its creators — began appearing in places like Channel 4 News.

And now we see the significance of that non-exclusive license Braben had negotiated. He heard through the grapevine about a former literary agent named Jacquie Lyons, who had recently become the first agent representing game developers in Britain. Lyons:

A friend rang up and told me about Ian Bell and David Braben. Elite had just happened and Ian and David had retained all rights other than for the BBC, which was extremely bright of them. They wanted me to represent the rest of those rights.

With virtually every publisher in Britain dying to publish Elite for the other, more popular gaming platforms, Lyons decided that there was one foolproof way to find out who really wanted it, and to make sure her new clients got served as well as possible in the process — i.e., paid as well as possible. At the beginning of December she held an auction, which, in her own words, “caused a lot of trouble in the industry — I was told this was an appalling way to go about it.” Lyons responded that such an approach was common in the publishing world from which she hailed. And what better way to ensure that your publisher would put everything they had into a game than to make them pay as dearly as possible for it? The deep pockets of British Telecom won the day amidst a flurry of media interest. Having just entered the software market with a new imprint called Firebird and eager to make a big splash with the highest-profile game in the industry, BT paid an undisclosed but “substantial” sum — Bell and Braben each got six figures up-front — for publishing rights to Elite on all platforms other than the Acorn machines. Suddenly Bell and Braben, who had yet to receive their first royalty checks from Acornsoft, were very wealthy young men.

For their part, Acornsoft allowed Bell and Braben to move on without fighting at all to retain Elite as a desperately needed platform exclusive. Indeed, they handled Bell and Braben’s departure with almost incomprehensibly good grace, even working out agreements to allow Firebird to reuse most of the wonderful supplemental materials they had stuffed into that bursting box. Perhaps they just had bigger fish to fry. Elite, you see, was the sole bright spot in a disastrous Christmas for Acorn as a whole, one rife with miscalculations which effectively wrecked the company. A desperate Acorn was purchased by the Italian firm Olivetti in 1985, and became thereafter a very different sort of place. The Acornsoft label was retired barely a year after the Elite launch, with Johnson-Davies and Jordan and all of their colleagues going on to other things.

But the game they had introduced to the world was just getting started. Bell and Braben themselves ported it to the Commodore 64. That version is not quite as fast and smooth as the BBC version — the 64’s 6502-compatible CPU is clocked at just 1 MHz instead of the BBC’s 2 MHz — but it takes advantage of the 64’s better graphics and its positively cavernous 64 K of memory to add, in compensation, more color and a welcome touch of whimsy to undercut its otherwise uncompromisingly dog-eat-dog world. There’s a third special mission, this one a bit of silliness drawn from the beloved Star Trek episode “The Trouble with Tribbles.” When the tribble — excuse me, “trumble” — population aboard your ship has mushroomed to the point that the little buggers start crawling around the screen in front of you, it’s laugh-out-loud funny, even if it is just about impossible to figure out how to get rid of them absent spoilers. But best of all is the new music which plays during the automated docking sequence: Johann Strauss’s “The Blue Danube,” a tribute to everyone’s favorite part of 2001: A Space Odyssey. It comes as a complete surprise (if you haven’t read an article like this, that is…) when you first flip the switch to try out your hard-won docking computer and are greeted with this unexpected note of easy beauty. Soon your travels assume an addictive rhythm: the calculus of buying and selling, followed by the tension and occasional excitement of the voyage itself, followed by the grace notes of “The Blue Danube,” when you know you’ve survived another voyage and can sit back and enjoy a few minutes of peace before starting the process over again. Life in a microcosm?


The Commodore 64 Elite established a tradition of each port being largely hand-coded all over again; this gives each its own feel. Scottish developers Torus took on the challenging task of converting Elite to the Spectrum, which is built around a Z80 rather than the 6502 family at the heart of the BBC Micro and Commodore 64. Speccy Elite arrived several months after the Commodore 64 version and about a year after the original, touching off another huge wave of sales. Amidst the usual slate of added and lost features, it included yet more special missions, for a total of five. Missions became the most obvious way for the many individual developers who worked on Elite over the years to put their own creative stamp on the game, a trend actively encouraged by Bell and Braben; “just have your own fun” with the missions was always their response when asked for advice. About the same time as the Spectrum Elite arrived in Britain, Firebird brought the Commodore 64 Elite to the United States, where it — stop me if you’ve heard this before — became a huge hit, one of relatively few games of the 1980s to make a major impact in both the European and North American markets. It served to establish Firebird as an important publisher in the U.S., the first such to be based in Britain and one which would give many other British games deserved exposure in that bigger market.

The ball was now well and truly rolling. For almost a decade the existing versions just kept on selling and the ports just kept on coming: to big players of the era like the IBM PC, the Apple II, the Atari ST, the Commodore Amiga, and the Amstrad CPC as well as occasional also-rans like the Tatung Einstein. Even the Nintendo Entertainment System got a surprisingly faithful and enjoyable version in 1991. In the end Elite made it to 17 separate platforms. Ian Bell has guessed in one place that it sold about 600,000 copies. David Braben claims that Elite surpassed 1 million copies worldwide, but this claim is much more dubious. Regardless of the final tally, Elite was certainly amongst the most commercially successful born-on-a-PC games of the 1980s.

Bell and Braben’s mainstream fame proved to be almost as enduring — in September of 1991 The One magazine could still describe the latter as “the most famous developer in Britain” — but their partnership less so. The two tried for some time to make Elite II for the BBC Micro and the Commodore 64, but never got close to completing it for reasons which vary with the teller. In Bell’s version, the game was just too ambitious for the hardware; in Braben’s, Bell was more interested in enjoying his new wealth and practicing his new hobby of martial arts than buckling down to work. Braben alone finally made and released Frontier: Elite II, a hugely polarizing sequel, in 1993. The erstwhile partners then spent the rest of the decade in ugly squabbles and petty lawsuits. To the best of my knowledge, the two still refuse to speak to one another. While both agreed to give talks upon the game’s 25th anniversary at the GameCity Festival in Nottingham in 2009, they did so only on the condition that they didn’t have to share a stage. Like most people who have studied their history, I have my opinions about who is the more difficult partner and who is more at fault. In truth, though, neither one comes out looking very good.

Bell retired quietly to the country many years ago to tinker with mathematics, martial arts, and mysticism. He hasn’t released a game since the original Elite. Braben, in contrast, has built himself a prominent career as a designer and executive in the modern games industry. If he’s no longer quite the most famous developer in Britain, he’s certainly not all that far out of the running. He recently Kickstarted a new iteration of the Elite concept called Elite: Dangerous to the tune of more than £1.5 million, proof of the game’s enduring appeal and its place in even the contemporary gaming zeitgeist, as well as the cachet Braben’s name still carries.

And what is the source of that appeal? As with any great game for which it all just seemed to come together somehow, that can be a difficult question to fully answer. I could talk about how it was one of the first games to show the immersive potential of even the most primitive of 3D graphics, prefiguring the direction the entire industry would go a decade later. I could talk about how it was one of the first to graft a larger context to its core action-based gameplay, giving players a reason to care beyond wanting to run up a high score. I could talk about how perfectly realized its universe is, how it absolutely nails atmosphere; its cold beauty is just that, beautiful. Those minimalist wireframe spaceships are key here. Later iterations for more advanced platforms, which fill in the spaceships with color, never felt quite like Elite to me. But then I suspect that for most folks the definitive version of Elite is the one they played first…

Maybe the most impressive thing that Elite evokes is a sense of possibility. You really do feel when you start playing, even today, even when you’ve read articles like this one and know most of its tricks, that you can go anywhere (as, given time and patience, you can), and that anything might happen there (okay, not so much). Yes, over time, especially over these jaded times, that sense fades, this Fibonacci universe starts to lose some of its verisimilitude, and it all starts to feel kind of samey. I must confess that when I played again recently for this article that point came for me long before I got anywhere close to becoming Elite. I think for the game to last longer for me I’d need some more of those story elements Bell and Braben originally hoped to include. But just the fact that that feeling is there, even for a little while, is amazing, the sort of amazing that makes Elite one of the most important computer games ever released. In addition to being a great play in its own right, it represents a fundamental building block of the virtual worlds of today and those still to come.

(In addition to being such a huge hit and such a seminal game historically, Elite comes equipped with a very compelling origin story. Together these factors have caused it to be written and talked about to a degree almost no other game of its era can match. Thus my challenge with this article was not so much finding information as sorting through it all and trying to decide which of the various versions of events was most likely to be correct.

The lengthiest and most detailed print chronicle of all is that in the book Backroom Boys by Francis Spufford. More cursory histories have been published by Edge Online and IGN. Vintage sources used for this article include: Your Computer of December 1984; The One of January 1991 and September 1991; Micro Adventurer of January 1985; Home Computing Weekly of December 11, 1984; Personal Computing Weekly of August 23, 1984. David Braben’s talk at the 2011 Game Developers Conference was a goldmine, while Ian Bell’s home page has a lot of information in its archives. Other useful fan pages included FrontierAstro and The Acorn Elite Pages. And when you get bored with serious research, check out the Elite episode of Brits Who Made the Modern World, which in its first ten seconds credits the game with starting the British games industry and goes on to indulge in several other howlers before it’s a minute old. It makes a great example of the hilariously hyperbolic press coverage that always clings to Elite.

Finally, rather than provide a playable version of Elite here I’ll just point you once again to Ian Bell’s pages, where you’ll find versions for many, many platforms.

Updated June 14, 2014 and July 14, 2014: I heard from Chris Jordan, who set me straight on more than a few facts and figures found in the original version of this article. Edits made.)

 

Peter Killworth’s 1983

Peter Killworth always found time for a wide array of hobbies and activities, ranging from anthropology to stage magic, to supplement his significant career in oceanographic research, but one suspects that he must have been even busier than usual during 1983. Inspired by the unexpected success of Philosopher’s Quest, he published two original games that year with Acornsoft, and ported a third to the BBC Micro for them.

The two originals were Castle of Riddles and Countdown to Doom. The former feels very much like Philosopher’s Quest II, although it is a completely original effort and not, as is sometimes reported, derived à la Zork II and III from yet-unused sections of Philosopher’s Quest’s mainframe source, Brand X. The plot, such as it is, casts you as a down-on-your-luck adventurer who is hired by a wizard to recover a certain Ring of Power (where have we heard that before?) from an evil warlock with a penchant for riddle games. Acornsoft had a good reason to want Castle of Riddles to be particularly difficult even by the rather heartless standards of the time: they made solving it into a national contest. Working in conjunction with Your Computer magazine, the company collected orders during the first weeks of 1983, then shipped out copies to all would-be participants on February 15. The first to solve it would get a voucher good for £1500 worth of Acorn hardware and software of his choice, along with a magnificently nerdy “£700 hallmarked silver ring-shaped trophy mounted on a presentation plinth and inscribed ‘King of the Ring.’”

Castle of Riddles contest announcement

When several weeks went by without a winner, there was some concern that Killworth had made the game too difficult, that no one would manage to solve it before the contest’s expiry date of March 31. Thus a 34-year-old businessman named Colin Bignell thought he had an excellent chance when he finished the game at last late one night in March. He immediately dashed to his car and drove through the dawn from his home in Littlehampton, Sussex, to Your Computer’s offices in London to deliver the code word that the game reveals upon completion. But alas, as he pulled up outside, one Peter Voke was already inside doing the same thing; Bignell had to settle for runner-up status.

Contest winner Peter Voke stands third from left; runner-up Colin Bignell first from left.

The Castle of Riddles contest was something of a landmark. Many other publishers would launch similar efforts in the years to come. It proved to be an excellent way to build buzz around a new title in the British software industry, which, much more than its American counterpart, thrived on just this kind of hype and excitement. Perhaps less fortunate was the effect it had on the designs involved. They simply had to be damnably, absurdly difficult to prevent a stampede of players beating down the publishers’ doors hours after release. Thus what could already be a stubbornly intractable genre had some of its worst tendencies elevated to the realm of virtual necessity. Indeed, Castle of Riddles itself is the least of Killworth’s games. Even he regarded it with little fondness; it’s the only one of his Acornsoft games that he did not choose to revive for the company with which he later became associated, Topologika.

Much more impressive, and a significant step forward for Killworth as a designer, is Countdown to Doom, a science-fiction scenario. Your spaceship has just crash-landed on the planet of Doomawangara. You have all of about 215 turns to repair your ship — accomplished by gathering the spare parts that are conveniently lying about the planet and dropping them into the ship’s hold — and escape before the ship “collapses” for reasons that aren’t entirely clear (beyond the wish for an in-game turn limit, that is). As that tight turn limit suggests, Doom is an extremely difficult game laced with the usual sudden, blameless player deaths that are such a staple of the Cambridge approach to adventure games. This game, like its stablemates, sends the dial smashing right through the top of the Zarfian Cruelty Scale and just keeps on going. With only 215 turns to hand, getting everything done makes for quite an exercise in planning even once you know the solution to each individual puzzle. Yet its puzzles, while hard as nails, mostly stay just on the right side of fairness, only dipping a toe or two occasionally over the line. They reward intellectual leaps as much as or more than dogged persistence (not that the latter isn’t required as well). Let me give a quick example of how heartless yet kind of magical these puzzles are.

In your initial explorations you come upon a bunch of gibberish written on a wall.

Countdown to Doom

Anyone who’s ever played an adventure game can guess that this must be an encoded message, but how to crack it? Well, later in the game you come upon another strange message on a wall.

Countdown to Doom

This is all the information the game provides for cracking the code. Want to have a go at it? Go ahead; I’ll wait…

So, the solution is to take every fifth letter after the first, cycling around and around until every letter is used. This yields “Say ‘flezz’ to disable the robot.” Sure enough, there’s an annoying little thief of a robot elsewhere in the game.

Even if you cracked the code, don’t feel too smug; you had an advantage. In the actual game there is nothing to connect these two messages together, nothing to indicate the second provides the key for the first. I needed a nudge to make that connection myself when I played, but doing the rest on my own thereafter was so satisfying that I kind of love the game for it.
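
For the programmers in the audience, the reading rule is simple enough to express in a few lines of Python. What follows is only a sketch of the rule as described above, not anything from the game itself; the demo string is a stand-in that I’ve scrambled myself so the example is self-contained, since the actual wall text lives in the screenshot and isn’t transcribed here.

def decode(cipher, step=5):
    # Take the first letter, then every fifth letter after it, wrapping around
    # until every letter has been used once. (This only works cleanly when the
    # message length shares no common factor with the step.)
    n = len(cipher)
    return "".join(cipher[(i * step) % n] for i in range(n))

def encode(plain, step=5):
    # The inverse operation, used here only to manufacture a demo ciphertext.
    n = len(plain)
    out = [""] * n
    for i, ch in enumerate(plain):
        out[(i * step) % n] = ch
    return "".join(out)

demo = encode("Say 'flezz' to disable the robot")  # a stand-in, not the game's wall text
print(decode(demo))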

Countdown to Doom is easier to love than many games of the Cambridge tradition. For all its cruelty, it does display some hints of mercy. You’re expected to gather six needed spare parts to solve the most pressing problem, that of escape, but a full score also requires satisfying your greedy inner adventurer by gathering six treasures. These, which are generally the more challenging to collect, are actually optional; it’s possible to escape and thus ostensibly win the game (apart from a chiding message telling you you could have done even better) without collecting a single one. This choice adds a welcome dose of positive reinforcement. It’s more satisfying to win the game with a less-than-optimal score and then go back in to improve it than it is to simply fail over and over.

In contrast to Castle of Riddles, Countdown to Doom always remained one of Killworth’s favorite children. Its design is tight and perfect in its own uncompromising way, its puzzles often brilliant. Games from this tradition will always be a minority taste even amongst the minority that can still stomach old-school text adventures in this day and age, but Countdown to Doom is just about as fine an exemplar of said tradition as you’ll find.

Killworth’s final effort for 1983 was another minor landmark. Kingdom of Hamil was a loving BBC Micro port of Hamil, the first Phoenix game to be solely authored by Phoenix stalwart Jonathan Partington. Thus it became the first Phoenix game not authored by Killworth to make it into homes, and the first to retain its original title and to remain basically complete in its new form. The story, embellished a bit over the original on the Acornsoft box copy, casts you as the displaced heir to the throne of Hamil, needing to prove your worth in order to prove your identity. This being an old-school adventure game, “worth” is meant literally here: you must collect valuable treasures and drop them in the castle vault. As usual, none of this makes a whole lot of sense. No one would ever accuse the Phoenix games of even storybook realism.

But then you don’t play these games for their stories, and Hamil has some wonderful elements. Partington always had a certain fondness for large-scale, dynamic puzzles that often span multiple rooms while requiring precise timing, the sort of thing that demands to be worked out carefully with pen and paper. His talents are much in evidence here. There’s a chase scene with a dinosaur that spans more than 30 turns yet has to be planned and executed perfectly down to the last move, and a similar sequence in which the terrain literally explodes behind you. Much of Hamil is so fun to solve that you can almost forgive it its few puzzles that cross the line. The last of these, however, does a good job of crushing any spirit of generosity you might still have. It’s one of the classic what-the-fuck moments in adventure gaming, reading like a caricature of the brainy, mathematical Phoenix tradition.

Early in Hamil, you find yet another encoded message on a wall.

Kingdom of Hamil

This is actually easier than the similar puzzle in Countdown to Doom. When a certain locked door starts asking you for a password, it’s not too difficult to figure out that it must be a simple substitution cypher, with the first three words representing “The password is…” By the time you get to the climax of the game, then, you feel pretty confident in deciphering the messages that appear.

Kingdom of Hamil

Cracking the code yields:

WHAT ARE THEIR ORDERS?
WHAT WAS THE PHRASE?
WHAT IS THE SET, SORTED?

But your difficulties are only beginning. I’ve actually now given you everything the game does, so have at it if you like.

Ready to continue? Okay! In the words of the anonymous writer of a walkthrough from long ago:

You must obtain the set of letters from THE PASSWORD IS, which is THEPASWORDI. Then you must sort the letters, resulting in ADEHIOPRSTW. Finally you must encode this string. You do this in the opposite way in which you decoded messages. Thus, if, for example TPM was decoded to THE, THE is encoded as TPM. ADEHIOPRSTW encodes to NYMPHSWALTZ. To finish the game you must type NYMPHS WALTZ. (SAY NYMPHS WALTZ or NYMPHSWALTZ do not work.)

Really, what could be more clear? Again, solving this here and now, while ridiculously difficult, is actually much easier than it would be for someone encountering it in the game. There you are given no more indication than what you see above of just what “the phrase” refers to, in the midst of a big game full of phrases (it’s a text adventure, after all). Thus we come to the hate in my love-hate relationship with Phoenix.
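
If you’d like to verify the first two steps for yourself, they boil down to a few lines of Python. This is only a sketch of the procedure the walkthrough describes; the final step, re-encoding through the game’s substitution alphabet, requires the mapping recovered from the in-game wall messages, so I’ve simply noted the walkthrough’s result in a comment.

def unique_letters(phrase):
    # Keep each letter the first time it appears, ignoring spaces and repeats.
    seen = []
    for ch in phrase.replace(" ", ""):
        if ch not in seen:
            seen.append(ch)
    return "".join(seen)

letters = unique_letters("THE PASSWORD IS")   # -> 'THEPASWORDI'
sorted_letters = "".join(sorted(letters))     # -> 'ADEHIOPRSTW'
print(letters, sorted_letters)
# Per the walkthrough quoted above, pushing 'ADEHIOPRSTW' back through the
# game's cypher yields NYMPHS WALTZ, the phrase that finishes the game.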

But you don’t have to take my word for it. I’ve prepared a zip file for those of you interested in exploring Killworth’s 1983 for yourselves. It includes each of the three games as a BBC Micro tape image, the way they were first distributed. (To start one of the games on a BBC Micro emulator, mount the tape image, then type *TAPE followed by CH.””. Note also that at least some of the disk images of these games floating around ROM archives and abandonware sites are corrupted and uncompleteable.) I’ve also included some hint sheets, which you’ll likely need. For what it’s worth, when I play I give myself unfettered access to the first level of hints. This usually provides the sort of little nudges that the games so painfully lack, the likes of which Infocom would have provided within the games themselves as a matter of course by this time. I find this lets me appreciate the games’ qualities and enjoy solving them without the whole thing devolving into an exercise in masochism.

Next time we’ll check in with our other special friends in British adventuring, Level 9, to see how 1983 treated them.

 


1983 in British Computing

Like its American counterpart, the British PC industry was untenably fragmented by the beginning of 1983. The previous year had been deemed Information Technology Year by the government. Unlike so many government initiatives, this one had succeeded swimmingly in its goal of drumming up excitement and enthusiasm amongst the public for microcomputers. Where excitement and enthusiasm go in a market economy, of course, also go products. Thus the new computers had come thick and fast throughout 1982. In addition to the BBC Micro and the Sinclair Spectrum which I’ve already written about, there were heaps of other machines whose names sound like something spewed by a beta version of Google Translate: the Dragon 32, the Grundy NewBrain, the Jupiter Ace, the Camputers Lynx, the Oric-1. Throw in a spate of knockoffs and clones from the Far East, and the situation was truly chaos; most of these machines were incompatible with one another. Something had to give. If the lines of battle had been drawn up in 1982, the war would begin in earnest in 1983, just as it had in North America.

Even if you aren’t that familiar with British computing history, you probably aren’t exactly in suspense about who won in the British theater of the Home Computer Wars of 1983. The fact that I chose to feature the BBC Micro and the Sinclair Spectrum on this blog in preference to all those other oddball models pretty much says it all. The BBC Micro found a home in virtually every school in Britain, and, even at a street price of £400 or so, also became a favorite of researchers and hardcore hobbyists, who loved its sturdy construction and expandability. The Spectrum had none of these things going for it, but it did have a £130 price tag and a bright color graphics display for games. Neither machine was perfect, but each was a good fit for its niche. And in addition to hardware specifications both had considerable soft power working in their favor. The BBC Micro had been blessed with the imprimatur of the British government, and thus stood in effect as the official computer of the British nation. And the Speccy came from Uncle Clive, the man who had first brought low-cost computing to the British masses. Sure, he was a bit eccentric and a bit prickly, but that was just a manifestation of his impatience with the bureaucrats and business concerns that delayed his inventions reaching the masses. It was an image that Sinclair, who had begun to read his own positive press notices when said notices existed only in his head, positively reveled in. Throughout the year Sinclair struggled to keep up with demand, as seemingly every kid in Britain begged for a Speccy. Meanwhile those other computers straggled on as best they could before bowing to the inevitable one by one during this year and the next. Kids wanted what their friends had, and their friends all had Speccys.

Put crudely, then, the BBC Micro came to occupy the space in British computing held by the Apple II in North America, the Establishment choice for education and home computing. The Speccy, meanwhile, was the Commodore 64, the cheaper, ruder, funner model that the kids adored. Just to keep us from getting too neat with our analogies, it should be noted that the Commodore 64 itself also began arriving in numbers in Britain during 1983. However, the vagaries of economics and exchange rates being what they were, its initial price there was closer to that of the BBC Micro than the Spectrum, limiting its sales. The Commodore 64 became the computer for the posh public-school kids, while the Speccy remained the choice of the masses. The former was unquestionably a much more capable machine than the latter by any objective measure, but even in later years, when the price dropped and the 64’s popularity grew, it never quite got the same sort of love that accrued to the Spectrum. Like driving on the wrong side of the road and eating baked beans for breakfast, there was just something indelibly British about the Speccy’s peculiar BASIC and weird keyboard, something that made a generation of British gamers and game programmers fall in love with it as their machine.

To work this comparison one last time, Clive Sinclair’s 1983 in Britain was like Jack Tramiel’s in North America — the best, most unblemished, most triumphant year of a long, chequered career. It must have felt like vindication itself when he received an invitation to attend the Queen’s Birthday Honors in June to receive a knighthood. Suddenly Uncle Clive had become Sir Clive. Given Sinclair’s relationship with the British bureaucracy, the honor might have seemed a surprising one. Indeed, at the time that he received it his company was still embroiled in various government investigations over its failure to ship products to customers in a timely fashion, as well as a rash of complaints about shoddy workmanship. (Sinclair was still desperately trying to recall some 28,000 Spectrum power packs that had the potential to shock a person into unconsciousness — shades of the exploding watches of yore.) Luckily, he was a huge favorite of Margaret Thatcher, no friend of entrenched bureaucratic forces herself, who saw him as exactly the kind of entrepreneur that her new, more freedom-loving and capitalism-friendly Britain needed. And Thatcher, who was riding a tide of personal popularity and renewed patriotism of her own in the wake of the Falklands War, generally got what she wanted. The press gushed with praise in the wake of Sinclair’s honor, some of it justified, some of it, shall we say, somewhat overblown. Popular Computing Weekly credited him with “transforming Britain from a nation of shopkeepers to a nation of micro users.” Sinclair User simply announced that he had “invented the home micro,” conveniently forgetting about some folks on the other side of the Atlantic.

Clive still being Clive regardless of his honorific, he sank his cash and his reputation into projects that were of debatable wisdom at best. In lieu of a floppy-disk drive for the Spectrum, he invested in a strange piece of technology called the Microdrive, a tape-based system that looked and operated rather like an old 8-track audio tape. Announced simultaneously with the Spectrum itself back in April of 1982, the Microdrive didn’t finally arrive until the summer of 1983. When it did, it was like a caricature of a Sinclair product: cheaper than the competition but also slow and balky and horribly unreliable. A computer crash at the wrong moment could erase an entire tape in seconds. Users may have partially embraced the Speccy because of its eccentricities, but this was taking things too far. Rather than being charming, the Microdrive was just sort of terrifying. It never got much love from Speccy users, who chose to stick with the even slower but more trustworthy medium of the cassette tape. In his choice to develop such a white elephant rather than invest in the plebeian, well-proven technology of the floppy disk, we see the most exasperating side of Clive Sinclair, who was always trying to prove how much more clever he was than the conventional wisdom of his competitors, even though conventional wisdom is often conventional for a reason. The Microdrive in turn shows the dangers of a company that is absolutely controlled by a single mercurial individualist. Sinclair’s backers and fans would learn much more about that in the time to come.

Then again, at least the Microdrive was a computer product. Sir Clive, who always harbored a deep skepticism about how long this computer thing was really going to last, also sank energy and resources into his twin white whales, a miniature, portable television set and an electric car. Both projects would provoke much hilarity in the British press in later years when his star as a captain of industry had faded. But rather than go on about any of that today, let’s just leave Sir Clive to enjoy his big year. His road will get bumpier soon enough.

Criticisms aside, Sinclair did play a huge role in turning Britain into the most computer-mad nation on Earth. Despite the American industry’s considerable head start, a greater percentage of British than American homes had computers by the end of 1983. Already by April total British microcomputer sales had passed the one-million mark. By December the Speccy alone was flirting with that figure.

All those computers in private hands meant a software marketplace that was, if anything, growing even faster than the hardware side. And since the computers selling in the biggest numbers were the Speccys being installed in bedrooms and living rooms across Britain, software in this context meant mostly games. By 1983 a hit game could make you, at least for the time being, rich, as was demonstrated by a flood of new game publishers populated by brash young men just a year or two (at most) removed from bicycling to school. Now they drove Porsches and Ferraris to posh offices in the most fashionable parts of town. A company called Imagine Software, publishers of such Speccy hits as Arcadia, was amongst the most spectacular of the success stories. When a BBC film crew visited their office for a documentary feature in early 1984, they found “huge, luxurious offices, acres of carpet, computer terminals by the ton load, lots of young programmers, secretaries in abundance, young ‘gophers’ acting as runners for the management, and a company garage packed with a fleet of Ferrari Boxers, BMWs for the lesser executives, and the famous Mark Butler custom hand-built Harris motorbike.” Clearly a certain sector of British society had another very good reason to love Sir Clive: his creation was making them rich.

Just as in America, established media forces were eager to get a piece of the action. Virgin Records launched Virgin Games, and, of all people, K-tel, those purveyors of cheesy TV-peddled hits compilations, also jumped in, attending the Midland Computer Fair with a well-publicized £1 million burning a hole in their pockets for deal-making with eager programmers.

Yet even with the arrival of Big Money on the scene the British industry remained wilder, woolier, and more democratic than its American counterpart. The games themselves remained much less expensive. Whereas a big release like Ultima III could exceed $50 in America, games in Britain rarely exceeded £10, and most sold for around £5 or even less. With less invested in any particular title, both publishers and buyers were more willing to take chances on crazy ideas, and individual programmers had a better chance of seeing their creations on store shelves and actually making them some money. Even if no one wanted to give them a chance they could just start their own little company in the hope of becoming the next Imagine; distribution was also comparatively wide open, making it relatively easy to get your game to the public on your own. It all added up to a market that had a lot of product that for very good reasons would never have passed muster in the United States. Yet it also had a spirit of wild-eyed, devil-may-care creativity about it that was sometimes lacking amongst the more staid, polished American publishers.

My special interest, adventure games, was a big part of the industry, amongst the most popular genres out there if not the most popular of all. As with other kinds of games, adventures seemed to be multiplying exponentially from month to month. Britain was not just computer mad but also adventure mad. Well before the end of the year, production of new British adventure games far outstripped that of American ones, and the disparity would only continue to grow over the next few years. In November Micro Adventurer debuted, the first magazine anywhere in the world dedicated not to games in general but specifically to this genre.

To survey this explosion of titles in any real depth would bog us down for months; that will have to remain a task for some other, even more esoteric blog than this one. But I will try to convey some sense of the times by continuing to follow the careers of some friends we met earlier. We’ll do that next time.

(This survey of the scene is drawn mainly from the Your Computer and Sinclair User issues of 1983, with occasional forays into Home Computing Weekly and Popular Computing Weekly. The image is taken from the 1984 Sinclair User annual’s cover.)

 
