A quick note on terminology before we get started: “CD-ROM” can be used to refer either to the use of CDs as a data-storage format for computers in general or to the Microsoft-sponsored specification for same. I’ll be using the term largely in the former sense in the introduction to this article, in the latter after something called “CD-I” enters the picture. I hope the point of transition won’t be too hard to identify, but my apologies if this leads to any confusion. Sometimes this language of ours is a very inexact thing.
In the first week of March 1986, much of the computer industry converged on Seattle for the first annual Microsoft CD-ROM Conference. Microsoft had anticipated about 500 to 600 attendees at the four-day event. Instead more than 1000 showed up, forcing the organizers to turn many of them away at the door of a conference center that by law could only accommodate 800 people. Between the presentations on CD-ROM’s bright future, the attendees wandered through an exhibit hall showcasing the format’s capabilities. The hit of the hall was what was about to become the first CD-ROM product ever to be made available for sale to the public, consisting of the text of all 21 volumes of Grolier’s Academic American Encyclopedia, some 200 MB in all, on a single disc. It was to be published by KnowledgeSet, a spinoff of Digital Research. Digital’s founder Gary Kildall, apparently forgiving Bill Gates his earlier trespasses in snookering a vital IBM contract out from under his nose, gave the conference’s keynote address.
Kildall’s willingness to forgive and forget in light of the bright optical-storage future that stood before the computer industry seemed very much in harmony with the mood of the conference as a whole. Sentiments often verged on the utopian, with talk of a new “paperless society” abounding, a revolution to rival that of Gutenberg. “The compact disc represents a major discontinuity in the cost of producing and distributing information,” said one Ed Schmid of DEC. “You have to go back to the invention of movable type and the printing press to find something equivalent.” The enthusiasm was so intense and the good vibes among the participants — many of them, like Gates and Kildall, normally the bitterest of enemies — so marked that some came to call the conference “the computer industry’s Woodstock.” If the attendees couldn’t quite smell peace and love in the air, they certainly could smell potential and profit.
All the excitement came down to a single almost unbelievable number: the 650 MB of storage offered by every tiny, inexpensive-to-manufacture compact disc. It’s very, very difficult to fully convey in our current world of gigabytes and terabytes just how inconceivably huge a figure 650 MB actually was in 1986, a time when a 40 MB hard drive was a cavernous, how-can-I-ever-possibly-fill-this-thing luxury found on only the most high-end computers. For developers who had been used to making their projects fit onto floppy disks boasting less than 1 MB of space, the idea of CD-ROM sounded like winning the lottery several times over. You could put an entire 21-volume encyclopedia on one of the things, for Pete’s sake, and still have more than two-thirds of the space left over! Suddenly one of the most nail-biting constraints against which they had always labored would be… well, not so much eased as simply erased. After all, how could anything possibly fill 650 MB?
And just in case that wasn’t enough great news, there was also the fact that the CD was a read-only format. If the industry as a whole moved to CD-ROM as its format of choice, the whole piracy problem, which organizations like the Software Publishers Association ardently believed was costing the industry billions every year, would dry up and blow away like a dandelion in the fall. Small wonder that the mood at the conference sometimes approached evangelistic fervor. Microsoft, as swept away with it all as anyone, published a collection of the papers that were presented there under the very non-businesslike, non-Microsoft-like title of CD-ROM: The New Papyrus. The format just seemed to demand a touch of rhapsodic poetry.
But the rhapsody wasn’t destined to last very long. The promised land of a software industry built around the effectively unlimited storage capacity of the compact disc would prove infuriatingly difficult to reach; the process of doing so would stretch over the better part of a decade, by the end of which time the promised land wouldn’t seem quite so promising anymore. Throughout that stretch, CD-ROM was always coming in a year or two, always the next big thing right there on the horizon that never quite arrived. This situation, so antithetical to the usual propulsive pace of computer technology, was brought about partly by limitations of the format itself which were all too easy to overlook amid the optimism of that first conference, and partly by a unique combination of external factors that sometimes almost seemed to conspire, perfect-storm-like, to keep CD-ROM out of the hands of consumers.
The compact disc was developed as a format for music by a partnership of the Dutch electronics giant Philips and Japan’s Sony during the late 1970s. Unlike the earlier analog laser-disc format for the storage of video, itself a joint project of Philips and the American media conglomerate MCA, the CD stored information digitally, as long strings of ones and zeros to be passed through digital-to-analog converters and thus turned into rich stereo sound. Philips and Sony published the final specifications for the music CD in 1980, opening up to others who wished to license the technology what would become known as the “Red Book” standard, after the color of the binder in which it was described. The first consumer-oriented CD players began to appear in Japan in 1982, in the rest of the world the following year. Confined at first to the high-end audiophile market, the CD was by the time of that first Microsoft CD-ROM Conference in 1986 already well on its way to overtaking the record album and, eventually, the cassette tape to become the most common format for music consumption all over the world.
There were good reasons for the CD’s soaring popularity. Not only did CDs sound better than all but the most expensive audiophile turntables, with a complete absence of hiss or surface noise, but, given that nothing actually touched the surface of a disc when it was being played, they could effectively last forever, no matter how many times you listened to them; “Perfect sound forever!” ran the tagline of an early CD advertising campaign. Then there was the way you could find any song you liked on a CD just by tapping a few buttons, as opposed to trying to drop a stylus on a record at just the right point or rewind and fast-forward a cassette to just the right spot. And then there was the way that CDs could be carried around and stored so much more easily than a record album, plus the way they could hold up to 75 minutes’ worth of music, enough to pack many double vinyl albums onto a single CD. Throw in the lack of a need to change sides to listen to a full album, and seldom has a new media format appeared that is so clearly better than the existing formats in almost all respects.
It didn’t take long for the computer industry to come to see the CD format, envisioned originally strictly as a music medium, as a natural one to extend to other types of data storage. Where the rubber met the road — or the laser met the platter — a CD player was just a mechanism for reading bits off the surface of the disc and sending them on to some other circuitry that knew what to do with them. This circuitry could just as easily be part of a computer as a stereo system.
Such a sanguine view was perhaps a bit overly reductionist. When one started really delving into the practicalities of the CD as a format for data storage, one found a number of limitations, almost all of them drawn directly from the technology’s original purpose as a music-delivery solution. For one thing, CD drives could only deliver user data off a disc at a rate of 153.6 K per second, a figure tied not coincidentally to the speed required to stream standard CD sound for real-time playback. The raw bits on a music CD pass under the laser at about 176.4 K per second; a CD-ROM drive reads at that same physical rate, but each 2352-byte sector gives up 304 bytes to synchronization, addressing, and an extra layer of error correction, leaving 2048 bytes of user data per sector and thus the slightly lower effective rate. Such a throughput was considered pretty good but hardly breathtaking by mid-1980s hard-disk standards; an average 10 MB hard drive of the period might have a transfer rate of about 96 K per second, although high-performance drives could triple or even quadruple that figure.
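If you want to check the arithmetic behind those throughput figures, they fall straight out of the published Red Book and Yellow Book parameters. A quick sketch, with Python used purely as a calculator:

```python
# Red Book audio: 44,100 samples/sec, 2 channels, 2 bytes (16 bits) per sample.
audio_rate = 44_100 * 2 * 2          # raw bytes per second streamed off the disc
print(audio_rate)                    # 176400, i.e. the 176.4 K/s raw rate

# Yellow Book CD-ROM (Mode 1): the same 75 sectors per second, but each
# 2352-byte sector keeps only 2048 bytes of user data once sync, addressing,
# and the extra error-correction layer take their 304-byte cut.
cdrom_rate = 75 * 2048
print(cdrom_rate)                    # 153600, i.e. the famous 153.6 K/s
```

Note that 75 sectors of 2352 bytes also works out to 176,400 bytes per second: a CD-ROM drive reads bits at exactly the audio rate, and only the share left over for user data differs.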
More problematic was a CD drive’s atrocious seek time — i.e., the time it needed to move its laser to a given piece of data on the disc. An average 10 MB hard disk of 1986 had a typical seek time of about 100 milliseconds, a worst-case-scenario maximum of about 200 — although, again, high-performance models could improve on those figures by a factor of four. A CD drive, by contrast, had a typical seek time of 500 milliseconds, a maximum of 1000 — one full second. The designers of the music CD hadn’t been particularly concerned by the issue, for a music-CD player would spend the vast majority of its time reading linear streams of sound data. On those occasions when the user did request a certain track found deeper on the disc, even a full second spent by the drive in seeking out her favorite song would hardly be noticed, especially in comparison to the pain of trying to find something on a cassette or a record album. For storage of computer data, however, the slow seek time gave far more cause for concern.
Given these issues of performance, which promised only to get more marked in comparison to hard drives as the latter continued to get faster, one might well ask why the industry was so determined to adapt the music CD specifically to data storage rather than using Philips and Sony’s work as a springboard to another optical format with affordances more suitable to the role. In fact, any number of companies did choose the latter course, developing optical formats in various configurations and capacities, many even offering the ability to write to as well as read from the disc. (Such units were called “WORM” drives, for “Write Once Read Many”; data, in other words, could be written to their discs, but not erased or rewritten thereafter.) But, being manufactured in minuscule quantities as essentially bespoke items, all such efforts were doomed to be extremely expensive.
The CD, on the other hand, had the advantage of an existing infrastructure dedicated to stamping out the little silver discs and filling them with data. At the moment, that data consisted almost exclusively of encoded music, but the process of making the discs didn’t care a whit what the ones and zeros being burned into them actually represented. CD-ROM would allow the computer industry to piggy-back on an extant, mature technology that was already nearing ubiquity. That was a huge advantage when set against the cost of developing a new format from scratch and setting up a similar infrastructure to turn it out in bulk — not to mention the challenge of getting the chaotic, hyper-competitive computer industry to agree on another format in the first place. For all these reasons, there was surprisingly little debate on whether adapting the music CD to the purpose of data storage was really the best way to go. For better or for worse, the industry hitched its wagon to the CD; its infelicities as a general-purpose data-storage solution would just have to be worked around.
One of the first problems to be confronted was the issue of a logical file format for CD-ROM. The physical layout of the bits on a data CD was largely dictated by the design of the platters themselves and the machinery used to burn data into them. Yet none of that existing infrastructure had anything to say about how a filesystem appropriate for use with a computer should work within that physical layout. Microsoft, understanding that a certain degree of inter-operability was a valuable thing to have even among the otherwise rival platforms that might wind up embracing CD-ROM, pushed early for a standardized logical format. As a preliminary step on the road to that landmark first CD-ROM Conference, they brought together a more intimate group of eleven other industry leaders at the High Sierra Resort and Casino in Lake Tahoe in November of 1985 to hash out a specification. Among those present were Philips, Sony, Apple, and DEC; notably absent was IBM, a clear sign of Microsoft’s growing determination to step out of the shadow of Big Blue and start dictating the direction of the industry in their own right. The so-called “High Sierra” format would be officially published in finalized form in May of 1986.
In the run-up to the first Microsoft CD-ROM Conference, then, everything seemed to be coming together nicely. CD-ROM had its problems, but virtually everyone agreed that it was a tremendously exciting development. For their part, Microsoft had established themselves as the driving force behind the nascent optical revolution, driven by a Bill Gates who was personally passionate about the format and keenly aware that his company, the purveyor of clunky old MS-DOS, needed a cutting-edge project to rival any of Apple’s, if only for reasons of public relations. And then, just five days before the conference was scheduled to convene — timing that struck very few as accidental — Philips injected a seething ball of chaos into the system via something called CD-I.
CD-I was a different, competing file format for CD data storage. But CD-I was also much, much more. Excited by the success the music CD had enjoyed, Philips, with the tacit support of Sony, had decided to adapt the format into the all-singing, all-dancing, all-around future of home entertainment in the abstract. Philips would be making a CD-I box for the home, built around a compact real-time operating system called OS-9 running on a Motorola 68000 processor. But this would be no typical home computer; the user would be able to control CD-I entirely using a VCR-style remote control. CD-I was envisioned as the interactive television of the future, a platform for not only conventional videogames but also lifestyle products of every description, from interactive astronomy lessons to the ultimate in exercise tapes. Philips certainly wasn’t short of ideas:
Think of owning an encyclopedia which presents chosen topics in several different ways. Watching a short audio/video sequence to gain a general background to the topic. Then choosing a word or subject for more in-depth study. Jumping to another topic without losing your place — and returning again after studying the related topic to proceed further. Or watching a cartoon film, concert, or opera with the interactive capabilities of CD-I added. Displaying the score, libretto, or text onscreen in a choice of languages. Or removing one singer or instrument to be able to sing along with the music.
Just as they had with the music CD, Philips would license the specifications to whoever else wanted to make gadgets of their own capable of playing the CD-I discs. They declared confidently that there would be as many CD-I players in the world as phonographs within a few years of the format’s debut, that “in the long run” CD-I “could be every bit as big as the CD-audio market.”
Already at the Microsoft CD-ROM Conference, Philips began aggressively courting developers in the existing computer-games industry to embrace CD-I. Plenty of them were more than happy to do so. Despite the optimism that dominated at the conference, it wasn’t clear how much priority Microsoft, who earned the vast majority of their money from business computing, would really give to more consumer-focused applications of CD-ROM like gaming. Philips, on the other hand, was a giant of consumer electronics. While they paid due lip service to applications of CD-I in areas like corporate training, it was always clear that it would be first and foremost a technology for the living room, one that comprehensively addressed what most believed was the biggest factor limiting the market for conventional computer games: that the machines that ran them were just too fiddly to operate. At the time that CD-I was first announced, the videogame console was almost universally regarded as a dead fad; the machine that would so dramatically reverse that conventional wisdom, the Nintendo Entertainment System, was still an oddball upstart being sold in selected markets only. Thus many game makers saw CD-I as their only viable route out of the back bedroom and into the living room — into the mainstream of home entertainment.
So, when Philips spoke, the game developers listened. Many publishers, including big powerhouses like Activision as well as smaller boutique houses like the 68000 specialists Aegis Development, committed to CD-I projects during 1986, receiving in return a copy of the closely guarded “Green Book” that detailed the inner workings of the system. There was no small pressure to get in on the action quickly, for Philips was promising to ship the first finished CD-I units in time for the Christmas of 1987. Trip Hawkins of Electronic Arts made CD-I a particular priority, forming a whole new in-house development division for the platform. He’d been waiting for a true next-generation mainstream game machine for years. At first, he’d thought the Commodore Amiga would be that machine, but Commodore’s clueless marketing and the Amiga’s high price were making such an outcome look less and less likely. So now he was looking to CD-I, which promised graphics and sound as good as those of the Amiga, along with the all but infinite storage of the unpirateable CD format, and all in a tidy, inexpensive package designed for the living room. What wasn’t to like? He imagined Silicon Valley becoming “the New Hollywood,” imagined a game like Electronic Arts’s hit Starflight remade as a CD-I experience.
You could actually do it just like a real movie. You could hire a costume designer from the movie business, and create special-effects costumes for the aliens. Then you’d videotape scenes with the aliens, and have somebody do a soundtrack for the voices and for the text that they speak in the game.
Then you’d digitize all of that. You could fill up all the space on the disc with animated aliens and interesting sounds. You would also have a universe that’s a lot more interesting to look at. You might have an out-of-the-cockpit view, like Star Trek, with planets that look like planets — rotating, with detailed zooms and that sort of thing.
Such a futuristic vision seemed thoroughly justifiable based on Philips’s CD-I hype, which promised a rich multimedia environment combining CD-quality stereo sound with full-motion video, all at a time when just displaying a photo-realistic still image captured from life on a computer screen was considered an amazing feat. (Among extant personal computers, only the Amiga could manage it.) When developers began to dive into the Green Book, however, they found the reality of CD-I often sharply at odds with the hype. For instance, if you decided to take advantage of the CD-quality audio, you had to tie up the CD drive entirely to stream it, meaning you couldn’t use it to fetch pictures or video or anything else for this supposed rich multimedia environment.
Video playback became an even bigger sore spot that echoed back to those fundamental limitations that had been baked into the CD when it was regarded only as a medium for music delivery. A transfer rate of barely 150 K per second just wasn’t much to work with in terms of streaming video. Developers found themselves stymied by an infuriating Catch-22. If you tried to work with an uncompressed or only modestly compressed video format, you simply couldn’t read it off the disc fast enough to display it in real time. Yet if you tried to use more advanced compression techniques, it became so expensive in terms of computation to decompress the data that the CD-I unit’s 68000 CPU couldn’t keep up. The best you could manage was to play video snippets that only filled a quarter of the screen — not a limitation that felt overly compatible with the idea of CD-I as the future of home entertainment in the abstract. It meant that a game like the old laser-disc-driven arcade favorite Dragon’s Lair, the very sort of thing people tended to think of first when you mentioned optical storage in the context of entertainment, would be impossible with CD-I. The developers who had signed contracts with Philips and committed major resources to CD-I could only soldier on and hope the technology would continue to evolve.
By 1987, then, the CD as a computer format had been split into two camps. While the games industry had embraced CD-I, the powers that were in business computing had jumped aboard the less ambitious, Microsoft-sponsored standard of CD-ROM, which solved issues like the problematic video playback of CD-I by the simple expedient of not having anything at all to say about them. Perhaps the most impressive of the very early CD-ROM products was the Microsoft Bookshelf, which combined Roget’s Thesaurus, The American Heritage Dictionary, The Chicago Manual of Style, The World Almanac and Book of Facts, and Bartlett’s Familiar Quotations alongside spelling and grammar checkers, a ZIP Code directory, and a collection of forms and form letters, all on a single disc — as fine a demonstration of the potential of the new format as could be imagined short of all that rich multimedia that Philips had promised. Microsoft proudly noted that Bookshelf was their largest single product ever in terms of the number of bits it contained and their smallest ever in physical size. Nevertheless, with most drives costing north of $1000 and products to use with them like Microsoft Bookshelf hundreds more, CD-ROM remained a pricey proposition found in vanishingly few homes — and for that matter not in all that many businesses either.
But at least actual products were available in CD-ROM format, which was more than could be said for CD-I. As 1986 turned into 1987, developers still hadn’t received any CD-I hardware at all, being forced to content themselves with printed specifications and examples of the system in action distributed on videotape by Philips. Particularly for a small company like Aegis, which had committed heavily to a game based on Jules Verne’s 20,000 Leagues Under the Sea, for which they had recruited Jim Sachs of Defender of the Crown fame as illustrator, it was turning into a potentially dangerous situation.
The computer industry — even those parts of it now more committed to CD-I than CD-ROM — dutifully came together once again for the second Microsoft CD-ROM Conference in March of 1987. In contrast to the unusual Pacific Northwest sunshine of the previous conference, the weather this year seemed to match the more unsettled mood: three days of torrential downpour. It was a more skeptical and decidedly less Woodstock-like audience who filed into the auditorium one day for a presentation by no less unlikely a party than the venerable old American conglomerate General Electric. But in the course of that presentation, the old rapture came back in a hurry, culminating in a spontaneous standing ovation. What had so shocked and amazed the audience was the impossible made real: full-screen video running in real-time off a CD drive connected to what to all appearances was an ordinary IBM PC/AT computer. Digital Video Interactive, or DVI, had just made its dramatic debut.
DVI’s origins dated back to 1983, when engineer Larry Ryan of another old-school American company, RCA, had been working on ways to make the old analog laser-disc technology more interactive. Growing frustrated with the limitations he kept bumping against, he proposed to his bosses that RCA dump the laser disc from the equation entirely and embrace digital optical storage. They agreed, and a new project on those lines was begun in 1984. It was still ongoing two years later — just reaching the prototype stage, in fact — when General Electric acquired RCA.
DVI worked by throwing specialized hardware at the problem which Philips had been fruitlessly trying to solve via software alone. By using ultra-intensive compression techniques, it was possible to crunch video playing at a resolution of 256 × 240 — not an overwhelming resolution even by the standards of the day, but not that far below the practical resolution of a typical television set either — down to a size below 153.6 K per second of footage without losing too much quality. This fact was fairly well-known, not least to Philips. The bottleneck had always been the cost of decompressing the footage fast enough to get it onto the screen in real time. DVI attacked this problem via a hardware add-on that consisted principally of a pair of semi-autonomous custom chips designed just for the task of decompressing the video stream as quickly as possible. DVI effectively transformed the potential 75 minutes of sound that could be stored on a CD into 75 minutes of video.
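To get a feel for the scale of the problem those custom chips were solving, consider the compression ratio involved. The frame rate and pixel depth below are illustrative assumptions rather than DVI’s actual (and varied) pixel formats, but the order of magnitude is the point:

```python
# How hard must 256x240 video be squeezed to fit a single-speed CD stream?
# Assumed, hypothetical parameters: 30 frames/sec, 2 bytes per pixel.
width, height = 256, 240
fps = 30
bytes_per_pixel = 2

raw_rate = width * height * fps * bytes_per_pixel  # uncompressed bytes/sec
cd_budget = 153_600                                # 1X CD user-data rate
print(raw_rate)                    # 3686400, about 3.7 MB of pixels per second
print(raw_rate / cd_budget)        # 24.0, i.e. roughly 24:1 compression needed
```

Sustaining a 24:1 squeeze and undoing it in real time was simply beyond a general-purpose CPU of the era; hence the dedicated decompression silicon.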
Philosophically, the design bore similarities to the Amiga’s custom chips — similarities which became even more striking when you considered some of the other capabilities that came almost as accidental byproducts of the design. You could, for instance, overlay conventional graphics onto the streaming video by using the computer’s normal display circuitry in conjunction with DVI, just as you could use an Amiga to overlay titles and other graphics onto a “genlocked” feed from a VCR or other video source. But the difference with DVI was that it required no complicated external video source at all, just a CD in the computer’s CD drive. The potential for games was obvious.
Still, DVI’s dramatic debut barely ended before the industry’s doubts began. It seemed clear enough that DVI was technically better than CD-I, at least in the hugely important area of video playback, but General Electric — hardly anyone’s idea of a nimble innovator — offered as yet no clear road map for the technology, no hint of what they really planned to do with it. Should game developers place their CD-I projects on hold to see if something better really was coming in the form of DVI, or should they charge full speed ahead and damn the torpedoes? Some did one, some did the other; some made halfhearted commitments to both technologies, some vacillated between them.
But worst of all was the effect that DVI had on Philips. That presentation threw them into a spin from which they never really recovered. Fearful of getting their clock cleaned in the marketplace by a General Electric product based on DVI, Philips stopped CD-I in its tracks, demanding that a way be found to make it do full-screen video as well. From an original plan to ship the first finished CD-I units in time for Christmas 1987, the timetable slipped to promise the first prototypes for developers by January of 1988. Then that deadline also came and went, and all that developers had received were software emulators. Now the development prototypes were promised by summer 1988, with finished units expected to ship in 1989. The delay notwithstanding, Philips still confidently predicted sales in “the tens of millions.” But then world domination was delayed again until 1990, then 1991.
Wanting CD-I to offer the best of everything, Philips let the project chase its own tail for years, trying to address every actual or potential innovation from every actual or potential rival. The game publishers who had jumped aboard with such enthusiasm in the early days were racked with doubt upon the announcement of each successive delay. Should they jump off the merry-go-round now and cut their losses, or should they stay the course in the hope that CD-I finally would turn into the revolutionary product Philips had been promising for so long? To this day, you merely have to mention CD-I to even the most mild-mannered old games-industry insider to be greeted with a torrent of invective. Philips’s merry-go-round cost the industry dearly. Some smaller developers who had trusted Philips enough to bet their very survival on CD-I paid the ultimate price. Aegis, for example, went out of business in 1990 with CD-I still vaporware.
While CD-I chased its tail, General Electric, the unwitting instigators of all this chaos, tried to decide in their slow, bureaucratic way what to do with this DVI thing they’d inherited. Thus things were as unsettled as ever on the CD-I and DVI fronts when the third Microsoft CD-ROM Conference convened in March of 1988. The old plain-Jane CD-ROM format, however, seemed still to be advancing slowly but steadily. Certainly Microsoft appeared to be in fine fettle; harking back to the downpour that had greeted the previous year’s conference, they passed out oversized gold umbrellas to everyone — emblazoned, naturally, with the Microsoft logo in huge type. They could announce at their conference that the High Sierra logical format for CD-ROM had been accepted, with some modest modifications to support languages other than English, by the International Organization for Standardization as something that would henceforward be known as “ISO 9660.” (It remains the standard logical format for CD-ROM to this day.) Meanwhile Philips and Sony were about to begrudgingly codify the physical format for CD-ROM, extant already as a de facto standard for several years now, as the Yellow Book, latest addition to a library of binders that was turning into quite the rainbow. Apple, who had previously been resistant to CD-ROM, driven as it was by their arch-rival Microsoft, showed up with an official CD-ROM drive for a Macintosh or even an Apple II, albeit at a typically luxurious Apple price of $1200. Even IBM showed up for the conference this time, albeit with a single computer attached to a non-IBM CD-ROM drive and a carefully noncommittal official stance on all this optical evangelism.
As CD-ROM gathered momentum, the stories of DVI and CD-I alike were already beginning to peter out in anticlimax. After doing little with DVI for eighteen long months, General Electric finally sold it to Intel at the end of 1988, explaining that DVI just “didn’t mesh with [their] strategic plans.” Intel began shipping DVI setups to early adopters in 1989, but they cost a staggering $20,000 — a long, long way from a reasonable consumer price point. DVI continued to lurch along into the 1990s, but the price remained too high. Intel, possessed of no corporate tradition of marketing directly to consumers, often seemed little more motivated to turn DVI into a practical product than had been General Electric. Thus did the technology that had caused such a sensation and such disruption in 1987 gradually become yesterday’s news.
Ironically, we can lay the blame for the creeping irrelevancy of DVI directly at the feet of the work for which Intel was best known. As Gordon Moore — himself an Intel man — had predicted decades before, the overall throughput of Intel’s most powerful microprocessors continued to double every two years or so. This situation meant that the problem DVI addressed through all that specialized hardware — that of conventional general-purpose CPUs not having enough horsepower to decompress an ultra-compressed video stream fast enough — wasn’t long for this world. And meanwhile other engineers were attacking the problem from the other side, addressing the standard CD’s user-data rate of just 153.6 K per second. They realized that by spinning the disc, and clocking the drive’s circuitry, at an integer multiple of the standard rate, its reading (and seeking) speed could be increased correspondingly. Soon so-called “2X” drives began to appear, capable of reading data at over 300 K per second, followed in time by “4X” drives, “8X” drives, and whatever unholy figure they’ve reached by today. These developments rendered all of the baroque circuitry of DVI pointless, a solution in search of a problem. Who needed all that complicated stuff?
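The “NX” naming that resulted was nothing more than an integer multiplier on that single-speed user-data rate. A quick sketch of the progression (figures in decimal K, matching the convention used throughout this article; the 52X entry reflects roughly where CD-ROM drives eventually topped out):

```python
# Each speed grade multiplies the 1X user-data rate of 153,600 bytes/sec.
base = 153_600
for n in (1, 2, 4, 8, 52):
    print(f"{n}X: {base * n / 1000:.1f} K/s")
# 2X works out to 307.2 K/s, the "over 300 K per second" figure above.
```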
CD-I’s end was even more protracted and ignominious. The absurd wait eventually got to be too much for even the most loyal CD-I developers. One by one, they dropped their projects. It marked a major tipping point when in 1989 Electronic Arts, the most enthusiastic of all the software publishers in the early days of CD-I, closed down the department they had formed to develop for the platform, writing off millions of dollars on the aborted venture. In another telling sign of the times, Greg Riker, the manager of that department, left Electronic Arts to work for Microsoft on CD-ROM.
When CD-I finally trickled onto store shelves just a few weeks shy of Christmas 1991, it was able to display full-screen video of a sort but only in 128 colors, and was accompanied by an underwhelming selection of slapdash games and lifestyle products, most funded by Philips themselves, that were a far cry from those halcyon expectations of 1986. CD-I sales disappointed — immediately, consistently, and comprehensively. Philips, nothing if not persistent, beat the dead horse for some seven years before giving up at last, having sold only 1 million units in total, many of them at fire-sale discounts.
In the end, the big beneficiary of the endless CD-I/DVI standoff was CD-ROM, the simple, commonsense format that had made its public debut well before either of them. By 1993 or so, you didn’t need anything special to play video off a CD at a quality equal to or better than what had been so amazing in 1987; an up-to-date CPU combined with a 2X CD-ROM drive would do the job just fine. The Microsoft standard had won out. Funny how often that happened in the 1980s and 1990s, isn’t it?
Bill Gates’s reputation as a master Machiavellian being what it is, I’ve heard it suggested that the chaos and indecision which followed the public debut of DVI had been consciously engineered by him — that he had convinced a clueless General Electric to give that 1987 demonstration and later convinced Intel to keep DVI at least ostensibly alive, thereby paralyzing Philips long enough for everyday PC hardware and vanilla CD-ROM to win the day, all the while knowing full well that DVI would never amount to anything. That sounds a little far-fetched to this writer, but who knows? Philips’s decision to announce CD-I five days before Microsoft’s CD-ROM Conference had clearly been a direct shot across Bill Gates’s bow, and such challenges did tend not to end well for the challenger. Anything else is, and must likely always remain, mere speculation.
(Sources: Amazing Computing of May 1986; Byte of May 1986, October 1986, April 1987, January 1989, May 1989, and December 1990; Commodore Magazine of November 1988; 68 Micro Journal of August/September 1989; Compute! of February 1987 and June 1988; Macworld of April 1988; ACE of September 1989, March 1990, and April 1990; The One of October 1988 and November 1988; Sierra On-Line’s newsletter of Autumn 1989; PC Magazine of April 29 1986; the premiere issue of AmigaWorld; episodes of the Computer Chronicles television series entitled “Optical Storage Devices,” “CD-ROMs,” and “Optical Storage”; the book CD-ROM: The New Papyrus from Microsoft Press. Finally, my huge thanks to William Volk, late of Aegis and Mediagenic, for sharing his memories and impressions of the CD wars with me in an interview.)
1. The data on a music CD is actually read at a speed of approximately 172.3 K per second. The first CD-ROM drives had an effective reading speed that was slightly slower due to the need for additional error-correcting checksums in the raw data.
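The arithmetic behind both figures can be sketched as a quick check (my own, not something stated in the original text):

```python
# Checking the two rates in the footnote above.
# Red Book audio: 2 channels x 44,100 samples/second x 2 bytes/sample.
audio_rate = 2 * 44_100 * 2      # = 176,400 bytes/second
print(audio_rate / 1024)         # ~172.3 in "binary" K per second

# CD-ROM: 75 sectors/second with a 2,048-byte payload each; the rest of
# every 2,352-byte sector is given over to extra error correction.
data_rate = 75 * 2_048           # = 153,600 bytes/second
print(data_rate / 1000)          # 153.6 in "decimal" K per second
```

Note that, read this way, the two familiar figures mix units: 172.3 is the audio rate reckoned in binary K of 1,024 bytes, while 153.6 is the data rate in decimal K.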
September 30, 2016 at 4:45 pm
Another nice bit of history I didn’t know. Thank you!
For what it’s worth, CD-ROM speeds finally plateaued at 52x. They did try to make them faster — I caught that exact time at computer fairs, sometime around 1994 (?) — but discs were shattering inside these new drives with alarming regularity, so they quietly vanished off the market in just a few months. Wouldn’t surprise me to hear that DVDs don’t spin any faster either, but they store data at much higher densities, so it’s less of a problem.
October 1, 2016 at 9:32 pm
DVD was an advance in pit density, led by much better laser tracking and error-correction algorithms than CD.
One thing I think is important to point out is that CD is the first microprocessor-controlled storage format. That is, you can’t build a CD player without putting a CPU in there somewhere to decode the table of contents, respond to user commands, etc. Sure, the DAC was pure hardware but the interactivity was only possible through software.
Other appliances in the early 80s that required a CPU were the VCR and microwave oven. Other appliances, like the TV, dishwasher or vacuum cleaner took much longer to get a CPU. The latter still frequently don’t have one today.
November 16, 2016 at 1:54 pm
The microwave oven didn’t, and still doesn’t, need a CPU. Most still have a simple egg-timer mechanism and a bunch of microswitches.
August 31, 2017 at 4:14 am
Interesting historical footnote: Kenwood came out with a 72x drive in 2001 and a 62x drive at some earlier point. They accomplished this by spinning the disc at a lower speed and splitting the laser into multiple beams.
September 30, 2016 at 6:03 pm
“Tacit” means “silent”, so this reads really weird.
September 30, 2016 at 6:55 pm
Right you are. Thanks!
September 30, 2016 at 9:04 pm
Thanks! I must point out now that the word “support” has now gone missing, though…
October 1, 2016 at 6:10 am
Sigh… show me not to try to edit articles on the couch. Thanks!
October 1, 2016 at 5:37 am
Rather interesting. I’m quite familiar with the last days of CD-I, when Philips became Nintendo’s second partner on the SNES-CD add-on that never wound up materializing. (Nintendo initially partnered with Sony, but decided the licensing agreements were too onerous and aborted project Play Station; Sony would take their work, combined with some practice making CD-I hardware, and release their standalone PlayStation (note the lack of a space) console, which inflicted a blow that Nintendo has never recovered from.) But I had no idea that the genesis of the product started so far back.
October 10, 2016 at 12:52 pm
One reason that Nintendo dropped Sony and went to CD-I was the settlement of the Baer Magnavox Videogame Patent case. It’s also how Link ended up on CD-I. And it’s also why Activision nearly closed in 1990/1 and it gave Bobby Kotick the opportunity to come in and save the company.
October 1, 2016 at 6:25 am
the way they could hold up to 75 minutes worth of music, enough to pack many double vinyl albums onto a single CD.
I’m confused. What are you referring to as a “double album” here? A 33 1/3 RPM LP is a little less than 25 minutes per side, isn’t it?
followed in time by “4X” drives, “8X” drives, and whatever unholy figure they’ve reached by today.
LOL. I suppose you’ve seen the Mythbusters thing where they actually spin CDs fast enough that they shatter?
October 1, 2016 at 6:58 am
The capacity of a record album depends to some extent on the nature of the content. Bass-heavy stuff tends to cause the needle to jump out of the groove if you pack the grooves too closely. You could probably get a few minutes more than your 25 minutes on a side in the case of, say, spoken-word stuff, while the practical limit for music — especially certain kinds of music, like rap or disco — is a little less.
But when I wrote of “double albums,” I meant *specific* double albums, many of which don’t come close to filling two records to maximum capacity. London Calling by the Clash and Exile on Main Street by the Rolling Stones, for instance, both come in well under 70 minutes.
I hadn’t seen the Mythbusters thing, no. Will look for it. ;)
October 1, 2016 at 7:37 am
Ahhh, my grammar understanding fail, then. You meant that many individual double albums could fit onto a CD. I was reading it as many simultaneously, which was absurd (for standard CD audio format). Gotcha.
October 3, 2016 at 11:13 am
You could probably get a few minutes more than your 25 minutes on a side in the case of, say, spoken-word stuff, while the practical limit for music — especially certain kinds of music, like rap or disco — is a little less.
This is far afield from the content of the post, but there are a reasonable number of LPs that go pretty far beyond 25 minutes a side. Miles Davis’s Get Up With It has three sides that are over half an hour each, for instance; that’s one of the more extreme cases but there are a lot more that are pretty jam-packed. I’ve never had any issues with the sound of Get Up With It or with the needle jumping out, but I’m far from an audiophile.
October 3, 2016 at 11:33 am
Volume also has an effect — make a record quieter and you can pack more on. You might want to see if you Miles Davis album is noticeably quieter than others in your collection…
October 3, 2016 at 1:00 pm
That sounds like the sort of thing that would be fun to test, but I’m not sure my wife will be on board. (closed-eyes laugh emoticon goes here)
October 1, 2016 at 4:13 pm
This article really shows your mastery at combining and condensing mountains of information from numerous sources, and producing a thoroughly readable and entertaining result.
Many thanks for another classic article!
I would also like you to comment on the following thought that came to me while reading the article, specifically your comment on things repeating themselves in 1980s and 90s computing: don’t all those optical data standards and hardware hark back to the floppy-disk market of 1982–85, with all those different densities, encoding methods, wide performance variations, and custom solutions, until the 3.5″ Sony double-density format and drive became a de-facto standard, with 5 1/4″ drives mostly a legacy/budget thing?
October 2, 2016 at 7:26 am
Even the 3.5-inch floppy disks were a confusion of formats for a long time. While I seem to recall that the Atari ST used the same format as MS-DOS, Apple had their own for the Mac, as did Commodore for the Amiga. Actually, the Mac had multiple formats itself, having switched filesystems about the time of the Mac Plus’s release. At some point in the late 1980s, Apple, realizing the Mac was never going to replace MS-DOS in corporate America and therefore needed to inter-operate with it, began making machines that could read and write to MS-DOS disks as well. We can see this as the industry finally beginning to converge on a single floppy format.
I do think that CD-ROM was a rather wise and far-seeing attempt to head off a similar confusion of formats in optical storage. A standard was particularly valuable in the case of CD-ROM in that the sheer size of the things made the idea of combining, say, MS-DOS and Mac versions of a product onto one disc viable. And indeed, there would be a lot of these multi-platform CD-ROMs, especially in the early days. Any way that a company could reduce the number of SKUs they were manufacturing and warehousing was generally a good thing for the bottom line…
October 2, 2016 at 11:22 pm
Enlightening, as always! Thanks for the clarification!
Can’t wait until next week!
October 8, 2016 at 4:48 am
Even when the CD-ROM (High Sierra/ISO 9660) format was used for the underlying layer, Mac/Windows/Unix had different ways of extending the 8.3 file names that later had to be standardized (e.g. Rock Ridge extension). I think it took until the early 90s to get that right.
Still, I agree that High Sierra in 1986 was extremely forward-looking.
October 1, 2016 at 11:11 pm
As always, awesome read! Thank you for compiling and so readably supplying this cool old stuff.
a few typo catches:
– When one started really delving into the *practalities* of the CD
– *These development* rendered all of the baroque
– Bill Gates’s *bows* — it scans oddly for him to have multiple bows, but this may have been intentional, as he certainly had several efforts underway, more than one of which may have been affected by Philips’s move.
October 2, 2016 at 7:16 am
The idiom can be written as either “bow” or “bows,” but on reflection I agree that the latter sounds a little odd when applied to just one guy. Thanks!
Iván Díaz Álvarez
October 2, 2016 at 8:23 am
A very interesting article. I would only add that Philips finally added DVI-like functionality to CD-I in the form of the DVC (Digital Video Cartridge). This was an expansion module that you could install in most CD-I models and that would later become integrated; with it you could actually play full-screen video products, including a port of Dragon’s Lair. I also remember a short period in which MPEG-decoder cards for PCs were a thing, being as expensive as a high-end video card; this soon faded out as CPUs became able to decode MPEG content by themselves.
October 2, 2016 at 4:38 pm
“Seldom has a new media format appeared that is so clearly better than the existing formats in almost all respects.”
Get ready to have the audiophiles on your heels!
Multimedia, set-top boxes, and connectivity is something that really hasn’t been examined in depth from a computer games standpoint. People just assume that it kind of “happened”, but the CD-ROM coming to prominence was such a layered process that only really coincided with the IBM-PC clone wars. You portrayed the anti-climax of it all rather well.
I have been learning a lot more about SGI and Warner’s set-top box recently, and it seemed like every single one of these projects missed its window. CD-I, 3DO, Magic Carpet, the whole lot of them. Sure, they would have been beaten by the internet, but I think they could have at least created a market had they been more prescient about putting out the technology sooner. I don’t know if you’ll cover 3DO in any larger sense, but I do know a guy who was there during the product’s development if you’re interested.
October 2, 2016 at 6:59 pm
I put that “at least” in there as a sop to hipsters and their “warm-sounding” vinyl. ;) (I actually think the vinyl revival is kind of cool, but I’m under no illusion that the format is really better in any objective way than CD.)
I think the failure of the set-top boxes of the 1990s was partially a failure of marketing. None of these companies articulated very well what you were really supposed to *do* with them, other than “a little of everything.” They were desperately afraid of selling them as “just” videogame consoles, but failed to articulate a plausible alternate use that consumers could really grab hold of. Companies like Sega and Sony who sold CD-based boxes as game consoles did much better.
Philips in particular had started on CD-I at a time when consoles were still regarded as a dead fad, and thus shied away from any hint of association with the likes of the Atari VCS. Later, they never updated their messaging to match the post-Nintendo reality.
I haven’t thought seriously yet about how deeply I want to delve into 3DO, but I’ll make a note of your offer in case I do go deep enough that an inside source would be a good thing to have. Thank you!
October 3, 2016 at 7:34 am
Ah, but that’s the whole issue, don’t you see? Human beings are anything but objective. Moreover, we’ve evolved in an analog environment full of little imperfections that are never twice the same, so digital formats, with their ability to perfectly reproduce sound the same way every single time, strike us as artificial. Which they are. And increasing fidelity beyond the ability of the human ear to even perceive it can’t possibly help.
When audiophiles claim that vinyl sounds “better”, they don’t mean in any objective way. Of course not. The whole point is that it sounds subjectively better. Yes, once again, in yet another way, worse is better. We’d all do well to learn that lesson already.
October 3, 2016 at 12:57 pm
I will say that, pulling out my CD collection after a while, the whole “effectively last forever” thing was oversold. A lot of my CDs have developed glitches. This is probably because of the way I store/handle them, but then again “perfect sound as long as you never take it out to play it” isn’t what we were promised.
And anyway, that’s probably irrelevant, as the CD is a dinosaur format too. Us dinosaurs like to say (or liked to say a few years ago) that young’uns today would never own an LP, not realizing that it’s CDs that they’ll never own now.
October 2, 2016 at 9:29 pm
I can’t remember when I got my first CD-ROM; it came in a big Creative Labs ‘edutainment’ box, and the drive itself was suspended in a webwork of blue plastic film that made me think of transporting nitroglycerine. It came with the Grolier encyclopedia, and I found a copy of something called Myst a few months later, so that period is kind of a blur.
Not realizing how long it had been in development, I’d always wondered just what Philips was thinking when the CD-I went to market. It seemed like a slapdash also-ran with some really weird licenses and nothing else to recommend it.
October 4, 2016 at 7:30 pm
I joined the fray in 1991, when the first ‘affordable’ CD-ROM-based entertainment system arrived in the form of the Fujitsu FM-Towns, soon to be followed by offerings from Sega and, shortly after, the first PlayStation consoles. At the time, the FM-Towns had an unusually sophisticated video card that allowed hand-coded video decompressors to generate full-screen interactive video (so long as you didn’t mind a limited palette).
It was a time of some chaos, with many different technical routes being explored simultaneously – from unique graphics chips (in the Jaguar), to CD-ROM and early 3D-rendering techniques. Various companies were trying to set (or lock in) standards that they could own – often with the quality of the hardware coming secondary to the complexity of the licensing agreement.
Worse still, no-one really knew what games should be like. Suddenly, hand-drawn 2D sprites looked crude, but there wasn’t an easy way to generate anything more sophisticated. Even when you did have the graphics, what would the gameplay be? Choose-your-own-adventure-style video? Quiz games? Some complicated tie-in to Hollywood (which was what everyone wanted to emulate)? There is a hidden story of the growth of the tools that allowed all this new content to be created and tested, once you could no longer produce the game itself on the machine it was ultimately intended to run on.
October 7, 2016 at 3:13 am
Oof, I imagine Jimmy will eventually get to that horrid period of full-motion-video games in the mid-’90s. We thought we wanted real actors, but we really, really didn’t.
November 12, 2016 at 3:32 pm
Hey, some of us still have a thing for FMV. :)
When done well. Which it often wasn’t. But hey, I have fond memories of both Phantasmagoria games (I understand most people don’t), Gabriel Knight 2, Shivers, Harvester, The Riddle of Master Lu, Temujin. Different sort of FMVs all, of course.
It wasn’t until decades later that I realised the amount of crap that had been churned out by FMV bandwagonners, like Angel Devoid or Dracula Unleashed. Maybe I was just lucky growing up, huh? Still, some gems like The Dark Side Of The Moon are worth experiencing. Others, like Conspiracy, are just so bad it’s not worth it.
It’s just plain *hard* to make good FMV. I’m sure it’s worth it, but I really can’t imagine anyone in the industry seriously wanting to work around its limitations nowadays – especially with its reputation. But good FMV *can* exist.
October 7, 2016 at 8:41 am
Philips is also guilty of inflicting two rather awful Zelda games on the world.
October 11, 2016 at 9:02 pm
Thanks Magnavox Baer Patent Case. That was a settlement with Nintendo.
October 8, 2016 at 4:15 am
I was working as a computer networking technician when the CD-ROM was introduced. I can confirm that the capacity was, for that time, mind-blowingly huge. Whenever I would hear the total number of bytes you could store on one of them, it was completely outside my ability to reason about it, so I would forget it again. It was like if somebody said “eleventy jillion.”
The company I worked for sold Novell Netware networks to businesses. Since the drives were so expensive, one thing Netware could do for you was allow you to attach a CD-ROM drive to a server somewhere, which could be shared by all the computers in your office.
March 13, 2018 at 11:59 pm
This is a minor point, and one that wouldn’t be worth making had you not said that “153.6 K per second” is “exactly the speed required to stream standard CD sound”. In fact it is only related to it.
CD audio actually requires 176,400 bytes/second (2 channels * 44,100 samples/second * 2 bytes/sample).
But this is divided into chunks of 1/75th second (2352 bytes). For CD-ROM, each of these chunks contains a 2 kilobyte sector, giving your CD-ROM rate of 153,600 bytes/second (or 150 KB/second if you reckon it in powers of two).
The difference in payload allows for an extra layer of error detection and correction, data needing to be more robust than audio. Applications (such as CD-I) not fussed about that, or about convenient sector sizes, could forgo it to obtain a data rate close to that for Red Book audio, but let’s not get too bogged down!
March 14, 2018 at 10:01 am
Thanks for this! Made an edit and added a footnote.
September 27, 2019 at 12:05 pm
I didn’t even know that CD-I and DVI existed. I had my first encounter with CD-ROMs when my dad finally upgraded the family computer from the Apple 2e (which was already old when he got it used) to a brand new Windows ME PC. So, it must’ve been in 2000 sometime. Can’t remember which game I got first.
Now, in 2019, companies seem to be moving away from having any disc drives, or any physical media storage at all. Everything is download this, stream that. It’s rare to see a new laptop with a disc drive built in, I’ve seen a few desktops that don’t have one. Kinda a shame really, because accounts get hacked, or banned, or the internet goes down, or any number of reasons it’d be nice to just pop in a disc.
January 24, 2023 at 2:28 pm
Highly interesting stuff as usual, thank you! I unfortunately don’t have anything insightful to add, but here’s a nitpick if you still take those:
“In the end, the big benefactor of the endless CD-I/DVI standoff was CD-ROM, the simple, commonsense format that had made its public debut well before either of them.”
I think you mean “beneficiary”
January 25, 2023 at 2:49 am
I do indeed. Thanks!