Audio Killed the Blogging Star

Ken Gagne and Mike Maginnis recently invited me to be a guest on their Open Apple podcast. The result, which should show definitively why I chose to be a writer rather than a deejay, can now be enjoyed on their website or just by clicking the play button below. We talk some about the blog and various other projects, and then I offer plenty of color commentary about things I sometimes know something about and sometimes do not. Check it out if you have a couple of hours to spare.

Huge thanks to Ken and Mike for inviting me on the show. It was a lot of fun to do.

 

The Pinball Wizard

Bill Budge in Electronic Arts software artist pose

The name of Bill Budge has already come up from time to time on this blog. Mentioning him has been almost unavoidable, for he was one of the titans amongst early Apple II programmers, worshiped for his graphical wizardry by virtually everyone who played games. As you may remember, his name carried such weight that when Al Remmers of California Pacific first contacted Richard Garriott to propose publishing Akalabeth, Garriott’s first reaction was a sort of “I’m not worthy” shock at the prospect of sharing a publisher with the great Budge. Having now arrived at the birth of Electronic Arts and of Budge’s masterpiece, Pinball Construction Set, this seems a good moment to take a step back and look at what made Budge such a star.

Budge was always a tinkerer, always fascinated by the idea of construction sets. As a young kid, he played with blocks, Tinkertoys, and Erector sets. As an older kid, he moved on to fiddling with telescopes and model rockets. (“It’s amazing we didn’t kill ourselves.”) After moving about the country constantly during Budge’s younger years, his family finally settled in the San Francisco Bay Area by the time he began high school in the late 1960s. It was a fortuitous move. With the heart of the burgeoning Silicon Valley easily accessible, Budge’s family had found the perfect spot for a boy who liked to tinker with technology. A teacher at his school named Harriet Hungate started a class in “computer math” soon after Budge arrived. The students wrote their programs out by hand, then sent them off to a local company that had agreed to donate some time on its IBM 1401 mainframe. They learned whether their programs had worked from a printout sent back to the school. It was a primitive way of working, but Budge was immediately smitten. He calls the moment he discovered what a loop is one of the “transcendent moments” of his life. He “just programmed all the time” during his last two years of high school. Hungate was eventually able to finagle a deal with another local business to get a terminal installed at the school, connected to an HP 2100 machine hosting HP Time-Shared BASIC. Budge spent hours writing computer versions of tic-tac-toe, checkers, and Go.

But then high school was over. Without the ready access to computers his school had afforded him, Budge tried to put programming behind him. He entered the University of California, Santa Cruz, as an English major, with vague aspirations toward becoming a novelist. Yet in the end the pull of programming proved too strong. After two years he transferred to Berkeley as a computer-science major. He earned his bachelor’s degree there, then stayed on to study for a PhD. He was still working on it in late 1978 when the Apple II first entered his life.

As you might expect, the arrival of the trinity of 1977 had prompted considerable discussion within Berkeley’s computer-science department. Budge dithered for a time about whether to buy one, and if so which one. At last friend and fellow graduate student Andy Hertzfeld convinced him to go with the local product of nearby Apple Computer. It wasn’t an easy decision to make; the Commodore PET and the TRS-80 were both much cheaper (a major consideration for a starving student), and the TRS-80 had a vastly larger installed base of users and much more software available. Still, Budge decided that the Apple II was worth the extra money when he saw the Disk II system and the feature that would make his career, the bitmapped hi-res graphics mode. He spent half of his annual income on an Apple II of his own. It was so precious that he would carefully stow the machine away back in its box, securely swaddled in its original protective plastic, whenever he finished using it.

As he explored the possibilities of his treasure, Budge kept coming back again and again to hi-res mode. He worked to divine everything about how it worked and what he might do with it. His first big programming project was to rewrite much of Steve Wozniak’s original Breakout game, which shipped with every early Apple II. He replaced Woz’s graphics code with his own routines to make the game play faster and smoother, more like its arcade inspiration. When he had taken that as far as he could, he started thinking about writing a game of his own. He was well acquainted with Pong from a machine installed at the local pizza parlor. Now he recreated the experience on the Apple II. He names “getting my first Pong ball bouncing around on the screen” as another of his life’s transcendent moments: “When I finished my version of Pong, it was kind of a magical moment for me. It was night, and I turned the lights off in my apartment and watched the trailing of the ball on the phosphors of my eighty-dollar black and white TV.” He added a number of optional obstacle layouts to the basic template for variety, then submitted the game, which he named Penny Arcade, to Apple themselves. They agreed to trade him a printer for it, and earmarked it for The Apple Tapes, a cassette full of “introductory programs” to be included with every specimen of the new Apple II Plus model they were about to release. In the manual for the collection they misattributed the game to “Bob Budge,” but it mattered little. Soon enough everyone would know his name.

Penny Arcade

With his very first game shipping with every Apple II Plus, Budge was naturally happy to continue with his new hobby. He started hanging around the local arcades, observing and taking careful notes on the latest games. Then he would go home and clone them. Budge had little interest in playing the games, and even less in playing the role of game designer. For him, the thrill — the real game, if you will — was in finding ways to make his little Apple II produce the same visuals and gameplay as the arcade machines, or at least as close as he could possibly get. In a few years Atari would be suing people for doing what Budge was doing, but right now the software industry was small and obscure enough that he could get away with it.

Budge’s big breakthrough came when a friend introduced him to a traveling sales rep named Al Remmers, who went from store to store selling 8″ floppy-disk drives. He and Budge made a deal: Remmers would package the games up in Ziploc baggies and sell them to the stores he visited on his treks, and they would split the profits fifty-fifty. Budge was shocked to earn $7000 in the first month, more than his previous annual income. From this relationship was born Remmers’s short-lived but significant software-publishing company, California Pacific, as well as Budge’s reputation as the dean of Apple II graphics programmers. His games may not have been original, but they looked and played better than just about anything else out there. To help others who dreamed of doing what he did, he packaged some of his routines together as Bill Budge’s 3-D Graphics System. His reputation was such that this package sold almost as well as his games. This was how easily fame and fortune could come to a really hot programmer during that brief window of a few years when word traveled quickly in a small community aching for more and better software for its machines.

In fact, his reputation soared so quickly that Apple themselves came calling. Budge, who had been putting less and less effort into his studies as his income from his games increased, dropped out of Berkeley to join his old buddy Andy Hertzfeld in Cupertino. He was made — what else? — a graphics specialist working in the ill-fated Apple III division. He ended up spending only about a year at Apple during 1980 and 1981, but two experiences there would have a huge impact on his future work, and by extension on the field of computer gaming.

While Budge was working at Apple, much of the engineering team, including Hertzfeld and even Woz himself, was going through a hardcore pinball phase: “They were students of the game, talking about catches, and how to pass the ball from flipper to flipper, and they really got into it.” Flush with cash as they were after the IPO, many at Apple started filling their houses with pinball tables.

Budge’s first pinball game, from Trilogy of Games

Budge didn’t find pinball intrinsically much more interesting than purely electronic arcade games. Still, one of the first games Budge sold through Remmers had been a simple pinball game, later included in his very successful Trilogy of Games package published by California Pacific. Pinball was, after all, a fairly natural expansion of the simple Pong variants he had started with. Now, witnessing the engineers’ enthusiasm led him to consider whether he could do the game better justice: create something on the Apple II that played and felt like real pinball, with the realistic physics that are so key to the game. It was a daunting proposition in some ways, but unusually susceptible to computer simulation in others. A game of pinball is all about physics, with no need to implement an opponent AI. And the action is all centered on that single moving ball while everything else remains relatively static, meaning it should be possible to do on the Apple II despite that machine’s lack of hardware sprites. (This lack made the Apple II less suited to many action games than the likes of the Atari 8-bit computers or even the Atari VCS.) After some months of work at home and after hours, Budge had finished Raster Blaster.
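
To make concrete what that simulation entails, here is a minimal sketch of the kind of physics loop involved. It is written in modern Python rather than the hand-tuned 6502 assembly Budge actually used, and every constant in it is invented for illustration; the point is simply how little must move each frame.

```python
# A minimal, illustrative physics loop in the spirit of a pinball
# simulation: one ball under gravity bouncing off the table edges.
# All constants are invented for illustration; Raster Blaster itself
# was hand-tuned 6502 assembly, not Python.

DT = 1.0 / 60.0            # one frame at 60 frames per second
GRAVITY = 400.0            # downward acceleration, in pixels/sec^2
RESTITUTION = 0.85         # fraction of speed kept after a bounce
WIDTH, HEIGHT = 280, 192   # Apple II hi-res screen dimensions

def step(x, y, vx, vy):
    """Advance the ball one frame, bouncing it off the table edges."""
    vy += GRAVITY * DT     # gravity only affects vertical velocity
    x += vx * DT
    y += vy * DT
    if x < 0 or x > WIDTH:     # side walls reverse horizontal motion
        vx = -vx * RESTITUTION
        x = min(max(x, 0), WIDTH)
    if y < 0 or y > HEIGHT:    # top edge and, in a real table, the drain
        vy = -vy * RESTITUTION
        y = min(max(y, 0), HEIGHT)
    return x, y, vx, vy

# Only the ball moves from frame to frame, so only a few pixels need
# erasing and redrawing each time -- the reason a sprite-less Apple II
# could keep up.
x, y, vx, vy = 140.0, 20.0, 60.0, 0.0
for _ in range(120):       # simulate two seconds of play
    x, y, vx, vy = step(x, y, vx, vy)
```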

Raster Blaster

Raster Blaster was the best thing Budge had yet done — so good that he decided he didn’t want to give it to California Pacific. Budge felt that Remmers wasn’t really doing much for him by this point, just shoveling disks into his homemade-looking packaging, shipping them off to the distributor Softsel, and collecting 50% of the money that came back. The games practically sold themselves on the basis of Budge’s name, not California Pacific’s. Budge was a deeply conflict-averse personality, but his father pushed him to cut his ties with California Pacific, to go out on his own and thereby double his potential earnings. And anyway, he was getting bored in his job at Apple. So he quit, and along with his sister formed BudgeCo. He would write the games, just as he always had, and she would handle the business side of things. Raster Blaster got BudgeCo off the ground in fine form. It garnered rave reviews and became a huge hit in the rapidly growing Apple II community, Budge’s biggest game yet by far. Small wonder — it was the first computer pinball game that actually felt like pinball, and also one of the most graphically impressive games yet seen on the Apple II.

But next came the question of what to do for a follow-up. It was now 1982, and it was no longer legally advisable to blatantly clone established arcade games. Budge struggled for weeks to come up with an idea for an original game, but he got nowhere. Not only did he have no innate talent for game design, he had no real interest in it either. Out of this frustration came the brilliant conceptual leap that would make his legacy.

Above I mentioned that two aspects of Budge’s brief time at Apple would prove important. The other was the Lisa project. Budge did not work directly on or with the Lisa team, but he was fascinated by their work, and observed their progress closely. Like any good computer-science graduate student, he had been aware of the work going on at Xerox PARC. Yet he had known the Alto’s interface only as a set of ideas and presentations. When he could actually play with a real GUI on the Lisa prototypes, it made a strong impression. Now it provided a way out of his creative dilemma. He was uninterested in games and game design; what interested him was the technology used to make games. Therefore, why not give the people who actually did want to become designers a set of tools to let them do that? Since these people might be no more interested in programming than he was in design, he would not just give them a library of code like the old 3-D Graphics System he had published through California Pacific. No, he would give them a visual design tool to make their own pinball tables, with a GUI inspired by the work of the Lisa team.

The original Pinball Construction Set box art, featuring pieces of the pinball machine that Budge disassembled to plan the program

Budge had resisted buying a pinball table of his own while at Apple, but now he bought a used model from a local thrift shop. He took it apart carefully, cataloging the pieces that made up the playfield. Just as the Lisa’s interface used a desktop as its metaphor, his program would let the user build a pinball machine from a bin of iconographic spare parts. The project was hugely more ambitious than anything he had tackled before, even if he had already written some of the components, such as a simple paint program for customizing the look of a table, for his personal use in developing Raster Blaster. Budge was determined to give his would-be creator as much scope as he possibly could. That meant fifteen different components that she could drag and drop anywhere on the surface of her table. It meant letting her alter gravity or the other laws of physics if she liked. It meant letting her make custom scoring combinations, so that bumping this followed by that gave double points. And, because every creator wants to share her work, it meant letting her save her custom table as a separate program that her friends could load and play just as they did Budge’s own Raster Blaster. That Budge accomplished all of this, and in just 48 K of memory, makes Pinball Construction Set one of the great feats of Apple II programming. None other than Steve Wozniak has called it “the greatest program ever written for an 8-bit machine.”
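
It is worth pausing on what saving a table as a standalone program implies technically: the table must be pure data that a generic engine can interpret. Here is a hypothetical sketch of that idea, in modern Python with invented names, not anything drawn from Budge’s actual code.

```python
# A hypothetical, much-simplified model of the construction-set idea:
# the table is pure data (parts, physics settings, scoring rules) that
# a generic engine interprets, which is what makes a saved table
# playable as a program in its own right. All names are my invention.
from dataclasses import dataclass, field

@dataclass
class Part:
    kind: str        # e.g. "bumper", "flipper", "spinner"
    x: int           # position on the table, in pixels
    y: int
    points: int = 0  # base score awarded when the ball hits this part

@dataclass
class Table:
    parts: list = field(default_factory=list)
    gravity: float = 1.0      # user-adjustable, as in PCS
    elasticity: float = 0.85  # likewise
    combos: dict = field(default_factory=dict)  # (prev, next) -> multiplier

    def score_hit(self, prev_kind, part):
        """Score a hit, applying PCS-style 'this after that' combos."""
        multiplier = self.combos.get((prev_kind, part.kind), 1)
        return part.points * multiplier

# "Designing" a table is just editing data; the drag-and-drop GUI was
# a friendly front end for exactly this kind of structure.
table = Table(gravity=0.5)  # a low-gravity table
table.parts.append(Part("bumper", 100, 60, points=100))
table.parts.append(Part("target", 180, 40, points=500))
table.combos[("bumper", "target")] = 2  # target after bumper scores double
```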

Pinball Construction Set

Amazing as it was, when BudgeCo released Pinball Construction Set at the end of 1982 its sales were disappointing. It garnered nowhere near the attention of Raster Blaster. The software industry had changed dramatically over the previous year. A tiny operation like BudgeCo could no longer just put a game out — even a great, groundbreaking game like PCS — and wait for sales. It was getting more expensive to advertise, harder to get reviews and get noticed in general. Yet when Trip Hawkins came to him a few months later asking to re-release PCS through his new publisher Electronic Arts, Budge was reluctant, nervous of the slick young Hawkins and his slick young company. But Hawkins just wouldn’t take no for an answer; he said he would make Budge and his program stars, said that only he could find PCS the audience its brilliance deserved — and he offered one hell of a nice advance and royalty rate to boot. And EA did have Woz himself on the board of directors, and Woz said he thought signing up would be a smart move. Budge agreed at last; thus BudgeCo passed into history less than two years after its formation.

As good as PCS was, it’s very possible that Hawkins had another, ulterior motive in pursuing Budge with such vigor. To understand it, we need to know something about what Budge was like personally. Given the résumé I’ve been outlining — spent his best years of high school poring over computer code; regarded his Apple II as his most precious possession; had his most transcendent moments programming it; etc. — you’ve probably already formulated a shorthand picture. If the Budge of that picture is, shall we say, a little bit on the nerdy, introverted side, you can be forgiven. The real Budge, however, was nothing like what you might expect; as he himself put it, he “didn’t quite fit the mold.” He had a tall, rangy build and handsome features beneath a luxurious head of hair, with striking eyes that a teenage girl might call dreamy. At 29 (although he looked perhaps 22), he was comfortable in his own skin in a way that some people never manage, with an easy grace about him that made others as glad to talk to him as they were to listen. His overall persona smacked more of enlightened California beach bum than hardcore programmer. And he took a great picture. If there was one person amongst Hawkins’s initial crew of developers who could actually pull off the rock-star/software-artist role, it was Budge; he might even attract a groupie or two. He was a dream come true for the PR firm Hawkins had inherited from his time at Apple, Regis McKenna, Inc. Thus the EA version of PCS was designed to look even more like a contemporary rock album than any of the other games. The name of Bill Budge, the man EA planned to make their very own rock star, was far larger on the box than the name of his game.

EA's version of Pinball Construction Set

The down-to-earth Budge himself was rather bemused by EA’s approach, but he shrugged his shoulders and went along with it in his usual easygoing manner. When EA arranged for rock photographer Norman Seeff to do the famous “software artists” photo shoot, they asked that the subjects all wear appropriately bohemian dark clothing to the set. Budge went one better: he showed up with a single studded leather glove he’d bought for dressing up as a punk rocker for a party thrown by the Apple Macintosh team. He brought it simply as a joke, a bit of fun poked at all this rock-star noise. Imagine, then, how shocked he was when Seeff and the others demanded that he actually wear it. Thus Budge in his leather glove became the standout figure of that iconic image. As he later sheepishly admitted, “That’s not really me.” Soon after, he got a software-artist photo shoot and advertisement all to himself, filled with vague profundities that may or may not have actually passed his lips beforehand. (“Programming for a microcomputer is like writing a poem using a 600-word vocabulary.”)

EA booked Budge into every gig they could find for him. He did a lengthy one-on-one interview with Charlie Rose for CBS News Nightwatch (“He knew absolutely nothing. He seemed like your typical blow-dried guy without a lot of substance. But I guess I was wrong about him.”); he demonstrated PCS alongside Hawkins on the influential show Computer Chronicles; he featured in a big segment on Japanese television, at a time when that country’s own designers were toiling in obscurity for their parent corporations; he had his photo on the cover of The Wall Street Journal; he was featured alongside the likes of Steve Jobs in an Esquire article on visionaries under the age of forty.

With his album out and the photo shoots done and the promotional spots lined up, it still remained for EA’s rock star to hit the road — to tour. If the highs just described were pretty intoxicating for a computer-game programmer, this part of the process kept him most assuredly grounded. Budge and EA’s Bing Gordon went on a series of what were billed as “Software Artists Tours,” sometimes accompanied by other designers, sometimes alone. The idea was something like a book tour, a chance to sign autographs and meet the adoring fans. Determined to break beyond the ghetto of traditional computer culture, EA booked them not only into computer stores but also into places like Macy’s in New York City, where they were greeted with confusion and bemusement. Even the computer stores sometimes seemed surprised to see them. Whether because of communication problems or simple lack of interest, actual fans were often rare or nonexistent at the events. Hawkins’s dream of hordes of fans clutching their EA albums, fighting for an autograph… well, it didn’t happen, even though PCS became a major hit in its new EA duds (it would eventually sell over 300,000 copies across all platforms, a huge figure in those days). Often there seemed to be more press people eager to score an interview than actual fans at the appearances, and often the stores themselves didn’t quite know what to do with their software artists. One manager first demanded that Budge buy himself a new outfit (he was, in the manager’s opinion, under-dressed to be “working” in his store), then asked him if he could make himself useful by going behind the register and ringing up some customers. “That’s when I realized maybe I wouldn’t be a rock star,” a laconic Budge later said.

Budge wasn’t idle in the down-times between PR junkets. Privileged with one of the few Macintosh prototypes allowed outside of Apple, he used its bundled MacPaint application as the model for MousePaint, a paint program that Apple bundled with the first mouse for the Apple II. He also ported PCS to the Mac. Still, the fans and press were expecting something big, something as revolutionary as PCS itself had been — and small wonder, given the way that EA had hyped him as a visionary.

One of the most gratifying aspects of PCS had been the unexpected things people found to do with it, things that often had little obvious relationship to the game of pinball. Children loved to fill the entire space with bumpers, then watch the ball bouncing about among them like a piece of multimedia art. Others liked to just use the program’s paint tools to make pictures, scattering the ostensible pinball components here and there not for their practical functions but for aesthetic purposes. If people could make such creative use of a pinball kit, what might they do with something more generalized? As uninterested as ever in designing a game in the traditional sense, Budge began to think about how he could take the concept of the construction set to the next step. He imagined a Construction Set Construction Set, a completely visual programming environment that would let the user build anything she liked — something like ThingLab, an older and admittedly somewhat abstruse stab at the idea that had existed at Xerox PARC. His ideas about Construction Set Construction Set were, to say the least, ambitious:

“I could build anything from Pac-Man to Missile Command to a very, very powerful programming language. It’s the kind of a program that has a very wide application. A physics teacher, for example, could build all kinds of simulations, of little micro-worlds, set up different labs and provide dynamic little worlds that aren’t really videogames.”

It turned out to be a bridge too far. Budge tinkered with the idea for a couple of years, but never could figure out how to even begin to implement it. (Nor has anyone else in the years since.) In fact, he never made a proper follow-up to PCS at all. Ironically, Budge, the EA software artist who best looked the part, was one of the least able to play the role in the long term. As becomes clear upon reading any interview with Budge, old or new, his passion is not for games; it’s for code. In the early days of computer gaming the very different disciplines of programming and game design had been conflated into one, simply because most of the people who owned computers and were interested in making games for them were programmers. During the mid-1980s, however, the two roles began to pull apart as the people who used computers and the way games were developed changed. Budge fell smack into the chasm that opened up in the middle. Lauded as a brilliant designer, he was in reality a brilliant programmer. People expected from him something he didn’t quite know how to give them, although he tried mightily with his Construction Set Construction Set idea.

Budge at home in early 1985, the beginning of his “years in the wilderness”

So, he finally just gave up. After 1984 the interviews and appearances and celebrity petered out. His continuing royalties from PCS made work unnecessary for some years, so he all but gave up programming, spending most of his time windsurfing instead (a sport that Bing Gordon, perhaps to his regret, had taught him). Most people would have a problem returning to obscurity after appearing on television, being featured in newspapers, and even having their own magazine column (in Softalk), but it seemed to affect Budge not at all: “I’m kind of glad when I don’t have anything new out and people forget about me.” Eventually EA quietly accepted that they weren’t going to get another game from him and quit calling. Budge refers to this period as his “years in the wilderness.” By 1990 the name of Bill Budge, such a superstar in his day, came up only when the old-timers started asking each other, “Whatever happened to…?”

In the early 1990s Budge, now married and more settled, decided to return to the games industry, going to work first on yet another pinball game: Virtual Pinball for the Sega Genesis console. Without the pressure of star billing to live up to, and with a more mature industry that now had a place for his talents as a pure programmer’s programmer, he stuck with it this time. He’s remained in the industry ever since, unknown to the public but immensely respected by his peers within the companies for which he’s worked. For Budge, one of those people with an innate genius for taking life as it comes, that seems more than enough. Appropriately, he’s spent most of his revived career as what’s known as a tools programmer, making utilities that others use to make the actual games. In that sense his career, bizarre as its trajectory has been, does have a certain consistency.

PCS, his one towering achievement as a software artist, deserves to be remembered for at least a couple of reasons. First of all there is of course its status as the first really elegant tool to let anyone make a game she could be proud of. It spawned a whole swathe of other “construction set” software, from EA and others, all aimed at fostering this most creative sort of play. That’s a beautiful legacy to have. Yet its historical importance is greater than even that would imply. PCS represents, to my knowledge, the first application of the ideas that began at Xerox PARC to an ordinary piece of software which ordinary people could buy at an ordinary price and run on ordinary computers. It proved that you didn’t need an expensive workstation-class machine like the Apple Lisa to make friendlier, more intuitive software; you could do it on a 48 K Apple II. No mouse available? Don’t let that stop you; use a joystick or a set of paddles or just the arrow keys. Thousands and thousands of people first saw a GUI in the form of Pinball Construction Set. Just as significantly, thousands of other designers saw its elegance and started implementing similar interfaces in their own games. The floating, disembodied hand of PCS, so odd when the game first appeared, would be seen everywhere in games within a couple of years of its release. And game manuals soon wouldn’t need to carefully define “icon,” as the PCS manual did. PCS is a surprising legacy for the Lisa project to have; certainly nothing like it was what anyone involved with Lisa was expecting or aiming for. But sometimes legacies are like that.
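
As a purely hypothetical illustration of what supporting all those devices entails, consider the sketch below: a program like PCS only needs each device reduced to a common currency, nudges to a single on-screen cursor. The device names and scaling are invented for illustration.

```python
# A hypothetical sketch of the input abstraction such a GUI needs:
# every supported device is translated into the same thing, a small
# (dx, dy) nudge applied to one on-screen cursor.

def cursor_delta(device, reading):
    """Translate one poll of a device into a (dx, dy) cursor delta."""
    if device == "mouse":
        return reading                      # mice report deltas directly
    if device == "joystick":
        dx, dy = reading                    # analog stick direction
        return (dx * 2, dy * 2)             # scale to a comfortable speed
    if device == "keyboard":
        arrows = {"left": (-1, 0), "right": (1, 0),
                  "up": (0, -1), "down": (0, 1)}
        return arrows.get(reading, (0, 0))
    return (0, 0)

# The rest of the program never needs to care which device is attached.
cursor = [140, 96]  # start mid-screen
for device, reading in [("keyboard", "right"), ("joystick", (3, -1))]:
    dx, dy = cursor_delta(device, reading)
    cursor[0] += dx
    cursor[1] += dy
```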

Next time we’ll look at another of those seminal early EA games. If you’d like to try to make something of your own in the meantime, here’s the Apple II disk image and manual for Pinball Construction Set.

(What with his celebrity in Apple II circles between 1979 and 1985, there’s a lot of good information on Budge available in primary-source documents from the period. In particular, see the November 1982 Softline, the December 1985 Compute!’s Gazette, the March 1985 Electronic Games, the March 1985 Enter, the September 1984 Creative Computing, and Budge’s own column in later issues of Softalk. Budge is also featured in Halcyon Days, and Wired magazine published a retrospective on his career when he was given a Pioneer Award by the Academy of Interactive Arts and Sciences in 2011. Budge’s interview at the awards ceremony was videotaped and is available for viewing online.)

 
 


Seeing Farther

Trip Hawkins at the new Electronic Arts offices, 1983

Born in northern California in 1953, William M. “Trip” Hawkins III was the perfect age to be captured by the tabletop experiential games that had begun to arrive in force by his teenage years. He experimented with the Avalon Hill wargames, but what really captured his imagination was Strat-O-Matic Football. A huge football fan, he loved the idea of guiding a team game by game through the drama of a full NFL season — loved it enough that he was willing to put up with all of the dice-rolling and math that were part of the process. Unfortunately, his friends were not so entranced. After taking a look at the closely printed manual and all of the complicated forms, they threw up their hands and asked Trip if he’d maybe like to just watch some TV instead. Here was born for Hawkins a lifelong antipathy toward the “boob tube,” a belief that such a passive, brain-numbing medium could and should be superseded by other, interactive forms of entertainment. Yet he had also run into the classic experiential gamer’s dilemma. To wring a dramatic experience out of Strat-O-Matic you had to spend far too much time fiddling with numbers and mundane details. Some people revel in that sort of thing, losing themselves in games as systems. Hawkins’s friends, however, wanted games to be lived experiences. Fiddling with the system only clouded the fictional context that really interested them, and made the whole thing feel far too much like schoolwork.

Then, in 1971, Hawkins saw his first computer, a DEC PDP-8. The answer to his dilemma seemed clear: he could run games on the computer, letting the machine handle all of the boring stuff. Being possessed of a strong entrepreneurial streak — he would start his first (unsuccessful) business venture before the age of 20, selling a Strat-O-Matic-inspired football game of his own design — he decided that his mission in life would be to start a company to make computer games. By this he imagined not the simple arcade games that would soon begin appearing in bars and shopping malls, but richer, deeper experiences in the spirit of the board games that had so equally enticed and frustrated him.

As I mentioned in my last article, Hawkins was possessed of some of the same qualities that marked the young Steve Jobs, including intense charisma and the associated reality-distortion field that let him convince older and presumably wiser people to do highly improbable things. He thus became the first and (I assume) only person ever to graduate from Harvard with a degree in “Strategy and Applied Game Theory,” for which he combined social-science and computer-science courses. He thought what he learned would aid him both in the real world of business and in the simulated worlds he hoped to create. He used his access to computers at Harvard to refine his ideas, continuing to tinker with what would always remain his biggest gaming love, football simulations. In 1975, the arrival of the microprocessor and the first kit microcomputers such as the Altair made him sit down and try to decide on a date when this new technology would make his dream of a home-computer-entertainment company viable. He claims to have decided then that 1982 would be the perfect moment. And indeed, 1982 would be the year he founded Electronic Arts. If that all sounds a little too neat to be entirely believable, the fact remains that the patience and dedication he showed in the face of considerable temptation to go down other paths are, as we shall see, amazing. As the next step in his master plan, he went off to Stanford for an MBA. And then came Apple, and a pivotal role in the Lisa project.

Hawkins was one of the beneficiaries of Apple’s IPO at the end of 1980; his first two-and-a-half years in the workforce made him a millionaire, free never to work again if he didn’t feel like it. With incentives like that, and a position as marketing director for one of the most prominent young companies in the country, it would be easy to forgive him for putting games in the category of childish things left behind. Yet he never forgot his dream through those years at Apple. Hawkins was the outlier amongst a management team not just uninterested in games but a little bit afraid of them, as indicative of a product line less “serious” (read: useful for business) than IBM’s. Even whilst dutifully trying to ingratiate Apple with po-faced businessmen, Hawkins kept up with the thriving game scene on the Apple II. Witnessing the success of companies like Brøderbund and On-Line, he began to fret that the entertainment revolution was coming even sooner than he had anticipated, and that he was missing it. In January of 1982, he thus told his colleagues that he wanted to resign for the most preposterous of reasons: he wanted to start a game company. Hawkins at first acquiesced when they told him how foolish he was to walk away from a company like Apple, but a few months later he resigned again, and this time stuck to his guns.

On May 28, 1982, Hawkins officially founded the venture he had been dreaming of for over ten years, under the truly awful name of Amazin’ Software. He was just 28 years old. He had a small fortune of his own to inject into the company thanks to the Apple IPO, but he would need much, much more to launch on the lavish scale he envisioned. Fortunately, he had an established relationship with an investor named Don Valentine, head of Sequoia Capital, one of the most important sources of start-up funding in Silicon Valley. Valentine and Sequoia had already helped to fund Atari, Apple, and Shugart (developers of the floppy disk), among others. Now he found Hawkins’s vision of a next-generation entertainment-software publisher compelling. He became more like a business partner than an investor, providing much more than money. After working out of his home for a few months, Hawkins set up shop inside Sequoia’s offices when he began to hire his first employees. As Valentine later wryly explained, he told Hawkins he had to leave only when Hawkins’s own people outnumbered the Sequoia people in the building. Hawkins then moved his company to a spacious three-story building in San Mateo, California, where it would remain for the next fifteen years. To begin to fill the space, Hawkins put together a team of ex-Apple people (like Joe Ybarra), ex-Xerox PARC people (like Tim Mott), ace advertising executives (like Bing Gordon, who would remain with the company for more than 25 years), and people from other games companies, from IBM, and from VisiCorp. Even Steve Wozniak agreed to sit on the board of directors.

But, you might ask, just what did all these people find so compelling about Hawkins’s vision? Well, he proposed a completely new approach to computer games — to the way they were designed, programmed, marketed, and even played (or, more accurately, he wished to change who played them). As I’ve described in earlier articles on this blog, the computer-game industry was growing rapidly by 1982, and with the arrival of new, inexpensive yet capable platforms like the Commodore 64 it was beginning to attract serious attention from people like Don Valentine as the potential next big thing to replace the increasingly moribund game consoles. Yet the industry had also only recently left the Ziploc era behind. Its products — full of garish cover art, typo-riddled manuals, bugs, and cryptic user interfaces — still bore an unmistakable whiff of the dingy basements in which they were created. In short, computer games still felt almost as much hobby as business. Hawkins proposed to change that by selling games tailored to ordinary consumers, games with the same professional polish found in the book and music industries. He felt the best way to do that was not to devalue the creative component of games and sell them simply as toys or product, as Atari had been doing for years with its game cartridges. Indeed, Atari’s current struggles illustrated that this was exactly the wrong approach. No, the best way to sell games was to celebrate them as art, made by real artists. Thus the eventual name of his company, arrived at after a long day of brainstorming in October of 1982: Electronic Arts. It was simple, classy, elegant, everything Hawkins wanted his games to be, in contrast to the scruffy products of the first-generation companies with whom he’d be competing.

Hawkins spent considerable time refining his ideas for the new “consumer software” he wanted EA to produce. Eventually he arrived at a formula: EA’s games must be “simple, hot, and deep.”

Many of his ideas about simplicity came from the Lisa project. Like the Lisa’s desktop, the interface of an EA game should be as simple as possible, driven as much as possible by obvious visual cues right there on the screen. There should be no cryptic command-key sequences, and it shouldn’t be necessary to read the manual to learn how to play.

“Hotness” is the most abstract of the three qualities. It’s not quite the same as Marshall McLuhan’s definition of the term in Understanding Media, although there is a definite kinship. Hawkins described it as meaning that a program should take maximum advantage of what he saw as the four great strengths of the computer as an artistic medium: video; sound, increasingly important with the arrival of the Commodore 64 and its magnificent SID chip; interactivity, the single quality that most distinguished the computer from any other form of electronic media; and the ability to hide computational machinery from the player, solving the bookkeeping problem that had so frustrated Hawkins in the tabletop simulations he had played as a kid. Hawkins wanted his games to push all four qualities “as far as you can” on each platform for which they were released.

Finally there is the notion of depth. Hawkins wanted EA’s games to strive for that classic ideal of being simple to learn and play, but challenging — and infinitely interesting — to master. He also pointedly considered this quality to be the real differentiator between the new generation of computer games to be made by EA and the old console and standup arcade games that were aimed at the same market of ordinary consumers. Sustained interest, he argued, required depth, and it was exactly the lack of depth that had caused consumers to lose interest in Atari’s games, in a way he believed they never would with EA’s. He liked to say that arcade games were reactive rather than interactive, requiring the player to use her reflexes but not her intelligence or creativity.

There’s a definite sense of the over-optimistic here, particularly in this belief in the power of depth. A parade of truly awful games that have nevertheless become huge hits in the years since the Great Videogame Crash rather puts the lie to the idea that people would come to reject bad designs that seem determined to insult their intelligence. Nevertheless, much of Hawkins’s vision did indeed end up coming true. In his 2009 book A Casual Revolution, Jesper Juul laid out the qualities he feels define the new generation of casual games now played by a huge swathe of the population. The successful ones, he writes, are: possessed of an immediately identifiable fictional context; easy to play for the first time; easily interruptible, and accepting of any level of player dedication; difficult enough to be interesting but not difficult enough to frustrate; and “juicy,” offering a constant, colorful stream of feedback to every action to hold the player’s interest. Juul’s criteria benefit from many additional years of gaming history, but they aren’t that horribly far from Hawkins’s vision for gaming back in 1982. That’s not to say that the casual model of gaming is or should be the only viable model — indeed, EA themselves would depart from it constantly over the years, and often for good reasons — but as a blueprint for consumer software, it’s hard to beat. When I played some early EA titles again recently, after spending the last couple of years immersed in games of earlier vintage for this blog, I felt like I’d crossed some threshold into, if not quite modernity, at least something that felt a whole lot closer to it.

All of Hawkins’s design goals seemed great in the abstract, but of course to realize them he’d need to find actual designers capable of crafting that elusive combination of simple and deep gameplay. Not wanting to take any chances, he decided to go with several proven hands alongside the newcomers for EA’s first titles. He therefore made a list of those whose work had impressed him and started making calls, asking them to publish their next game through EA. To entice them, he offered exactly what you might expect: advances (a first for the industry) and generous royalty rates. That, however, was only the beginning of the pitch. Hawkins promised to do everything possible to let his developers and designers just do what they did best: create. To do so he would borrow liberally from the model of other forms of entertainment. Each development team would be assigned an in-house producer who would be their point of contact with EA and who would make sure all the boring stuff got done: arranging testing, arranging ports to other machines, adding copy protection, getting the manual written, keeping contracts up to date, coordinating with advertising and packaging designers. (EA’s early star in this role would be Joe Ybarra, who shepherded a string of classic titles through development.) In the long term, Hawkins also promised them access to a suite of in-house development tools: workstation computers; tools for creating video and audio content, along with in-house artists to help use them; even a cross-platform FORTH compiler. Such tools would not always be used as widely or as soon as Hawkins had hoped, but they were, like so much else about EA, a preview of how game development would work in the future. But the most enticing thing that Hawkins offered his developers, and by far the most remembered today, was an appeal aimed straight at their egos: he promised to make them rock stars.

Hawkins had decided, logically enough, that if computer games were art then those who created them had to be considered artists. In fact, he decided to build EA’s June 1983 launch around this premise of “software artists.” Each EA game box bore a carefully crafted mission statement that made the company sound more like an artistic enclave than a for-profit corporation:

We’re an association of electronic artists who share a common goal. We want to fulfill the potential of personal computing. That’s a tall order. But with enough imagination and enthusiasm we believe there’s a good chance for success. Our products, like this game, are evidence of our intent. If you’d like to get involved, please write to us at…

The boxes themselves were of a slim-line design deliberately evocative of record albums, with big gatefold insides featuring pictures and profiles of the artists behind the work. Hawkins imagined that, just as you always bought the new album from your favorite band, you would rush to buy the next game from Bill Budge or Dan Bunten; that every hip household would eventually have a shelf full of EA games waiting to be pulled down and played in lieu of an evening of television.

Indeed, EA’s first big advertising blitz was designed to demonstrate just what a hip and important new artistic medium the computer was. Hawkins had his stable of developers photographed in brooding rock-star poses lifted straight from an Annie Leibovitz shoot for Rolling Stone — which was appropriate, because EA largely bypassed the traditional computer press to run the advertisements in just that sort of glossy mainstream magazine.

early EA advertisement

The advertising headlines argued for software as the next great art form: “We See Farther”; “Can a Computer Make You Cry?” (The answer to the latter was essentially “We’re working on it.”) Nobody had ever promoted computer games quite like this. It was, if nothing else, audacious as all hell.

But now we come to the part of the article where we have to ask What It All Means. We have to be careful here. It would be very easy to look at the idealistic sentiments in those early advertisements, compare them with the allegedly soulless corporate behemoth that is EA today (voted “Worst Company in America” for 2012), and drift off into an elegy for the artistic integrity of the early days of gaming, lost to the perpetual adolescence and sequel-driven creative bankruptcy the medium seems to be caught in today. That’s very, very easy to do, as demonstrated by the countless other blog entries in just that mold that you’ll find all over the Internet; I even did it once myself back in graduate school. Nor is it precisely a point of view without merit. Still, before we go too far down that sepia-toned road let’s make room for some other facets of all this.

There may be more similarities between the EA of 1983 and the EA of today than nostalgia likes to admit. A certain streak of cold corporate ruthlessness was a part of EA’s personality even then. For all the idealism, EA wasn’t terribly interested in playing nice with the others who had already done so much toward building this new industry. They bypassed the established distribution system that Ken Williams had first begun to build back in 1980 in favor of setting up their own network that let them sell their products to stores directly. It may seem a small thing, but the message was that EA didn’t need the rest of the industry, that now the adults were ready to take over, thank you. The first-generation publishers tended to view EA as wealthy carpetbaggers swooping in to capitalize on what they had spent years building. Yes, part of that was just inevitable jealousy toward the well-financed, well-connected Hawkins who never had to start his company on the shoestrings that they did, but there’s also a grain of truth to their complaints. June of 1983 marks as good a line of demarcation as any for the final end of the era that Doug Carlston called the software Brotherhood. The industry would be a different place in the post-EA era. The games would be better, more polished and sophisticated, but the competition would also be more ruthless, the atmosphere colder, everything slicker and more calculated. It’s hard not to feel that EA had something to do with that. The fact is that EA was always known to its competitors as a company of hard edges and sharp elbows.

One other thing that’s always lost in nostalgic reminiscence over EA’s first advertising campaign is the awkward fact that it didn’t actually work out all that well. EA found that the mainstream public did not respond as hoped to their software artists, and within six months had already begun to switch gears, away from advertising the creators and back to advertising the creations on their own merits, as was the norm for other publishers. They also returned to the trade press for promotion, and often relaxed Hawkins’s rules for consumer software in favor of titles that catered more to the hardcore: within a few years EA would have extended dungeon crawls, tough-as-nails adventure games, and strategy games with thick manuals, just like everybody else. It turned out that consumers — or, perhaps more accurately, the PCs of the era — weren’t yet quite ready for consumer software. EA would turn into a very successful publisher, but not the force for widespread, mainstream cultural change Hawkins had imagined. Games would still be viewed by most of the tastemakers as kids’ stuff for many years to come. When that became clear (as it did in fairly short order), EA would continue to credit their developers on the box covers and to offer photos inside, but no longer made them the centerpiece of their marketing. Certainly there were no more developers-as-rock-stars photos like the one above.

Which brings us to another point worth noting. I have no doubt that much of the idealistic sentiment in those early advertisements was genuine, just as I have no doubt that Hawkins really, genuinely loved games and the potential of games and wanted to bring them to more people. Yet EA was also a business, funded by unsentimental people like Don Valentine, who ultimately demanded that EA live up to its earning potential. If presenting themselves to the public as an enclave of artists worked to do that, great. If making great, groundbreaking games did that, double great. But when push came to shove, EA needed to make money. Even the advertisement above displays as much cold calculation as it does idealism. There’s something not quite genuine about all those nerds mugging like rock stars. The message of the advertisements resonates because it was good PR, PR that perfectly connected with what EA wanted to be and — just as importantly — with what so many commentators today so desperately want them to have been. But, like the iconic Infocom advertisements that still largely define perceptions of that company, it’s also a very, very carefully crafted piece of calculated rhetoric. So, What It All Means is… complicated.

But to return to firmer ground, that of the games themselves: they were mostly good. Often really, really good. EA launched that June with seven titles available or announced: M.U.L.E., by Dan Bunten and his company Ozark Softscape; Archon, by Free Fall Associates; Murder on the Zinderneuf, also by Free Fall; Worms? by David Maynard; Pinball Construction Set by Bill Budge; Hard Hat Mack by Michael Abbot and Matthew Alexander; and Axis Assassin by John Field. Perhaps surprisingly given Hawkins’s connections to Apple, the first four of those originated on the Atari 8-bit machines, only the last three on the Apple II. On the other hand, with the Commodore 64 still quite new and something of an unknown quantity as these games were in development, the Atari machines were the best qualified to realize Hawkins’s vision of audiovisually “hot” games. (EA funded ports to most viable platforms as a matter of course for most games, so all of these titles did eventually reach several platforms.)

EA’s starting lineup was so good and so important to gaming history that I want to look at several of them individually. We’ll get started with that next time.

(There are several good interviews and articles about EA’s history available on the Internet. That said, it’s also worthwhile to go back to the spate of interviews and articles that greeted EA’s entrance back in 1983. Particularly good ones can be found in the October 1983 Byte, the July/August 1983 Softline, and the October 1983 Computer Gaming World.)

 
 


Lisa

In mid-1978 Apple Computer hired Trip Hawkins, a 25-year-old with a newly minted MBA from Stanford, to work in the marketing department. He quickly became a great favorite of Steve Jobs. The two were of similar ages and similar dispositions, good looking and almost blindingly charismatic when they turned on the charm. They were soon confirmed running buddies; Hawkins later told of “smoking dope” in a Vegas hotel room during a CES show, then going down to shoot craps all night. Less superficially, they thought differently than both the technicians and engineers at Apple (like Steve Wozniak) and the older, more conventional businessmen there (like Mike Markkula and Michael Scott). Their fondest dreams were not of bytes or market share, but rather of changing the way people lived through technology. Jobs monopolized much of Hawkins’s time during the latter part of 1978, as the two talked for hours on end about what form a truly paradigm-shifting computer might take. The ideas that began here would retain, through years of chaos to come, the casual code name they initially gave them: “Lisa.”

There’s been some confusion about the origins of the name. Years later, when they began selling the Lisa as an actual product, Apple tried to turn it into LISA, short for “Local Integrated Software Architecture.” This was so obviously tortured that even the fawning computer press to whom they first promoted the machine had some fun with it; “Let’s Invent Some Acronyms” was a popular variation. Some sources name Lisa as “the original hardware engineer’s daughter.” Yet it’s hard to get past the fact that just before all those long conversations with Hawkins, Jobs had a daughter born to an on-again/off-again girlfriend he had known since high school. They named her Lisa Nicole. The story of what followed is confused and not terribly flattering to Jobs personally (not that it’s difficult to find examples of the young Jobs behaving like a jerk). After apparently being present at the birth and helping to name the baby, not to mention naming his new pet project after her, something caused Jobs to have a sudden change of heart. He denied paternity vigorously; when asked who he thought the real father was, he charmingly suggested it could be any man among some 28 percent of the male population of the country. Even when a court-ordered paternity test gave him about a 95 percent chance of being the father, he continued to deny it, claiming to be sterile. A few years later, when Jobs was worth some $210 million, he was still cutting a check each month to Lisa’s mother for exactly the amount the court had ordered: $385. Only slowly and begrudgingly would he accept his daughter in the years to come. At the end of that process he finally acknowledged the origin of the computer’s name: “Obviously it was named for my daughter,” he told his official biographer Walter Isaacson shortly before his death. The “original hardware engineer” apparently referenced above, Ken Rothmuller, was even more blunt: “Jobs is such an egomaniac, do you really think he would have allowed such an important project to be named after anybody but his own child?”

Jobs and Hawkins were convinced that the Lisa needed to be not just more powerful than the likes of the Apple II, but also — and here is the real key — much easier, much more fun for ordinary people to use. They imagined a machine that would replace esoteric key presses and cryptic command prompts with a set of simple on-screen menus that would guide the user through her work every step of the way. They conceived not just a piece of hardware waiting for outside programmers to make something of it, like the Apple II, but an integrated software/hardware package, a complete computing environment tailored for friendliness and ease of use. Indeed, the software would be the real heart of the machine.

But of course making software friendlier would put unprecedented demands on the hardware. This was as true then as it is today. As almost any programmer will tell you, the amount of work that goes into a program and the amount of computing horsepower it demands are directly proportional to how effortlessly it seems to operate from the user’s perspective. Clearly Jobs and Hawkins’s ideas for Lisa were beyond the capabilities of the little 6502 inside the Apple II. Yet among the other options available at the time, only Intel’s new 16-bit 8086 looked like it might have the power to do the job. Unfortunately, Apple and their engineers disliked Intel and its architecture passionately, rendering that a non-option. (Generally computer makers have broken down into two camps: those who use Intel chips and derivatives such as the Zilog Z80, and those who use other chips. Until fairly recently, Apple was always firmly in the latter camp.) In the spring of 1979, with the Apple II Plus finished and with most of the other engineers occupied getting the Sara project (eventually to be known as the Apple III) off the ground, Woz therefore set to work on one hell of an ambitious project: a brand-new CPU built in-house for Lisa, using a technique he had always favored called bit slicing.

Up to this time Lisa had had little official existence within Apple; it was just grounds for conjecture and dreaming by Jobs and his closest circle. But on July 30, 1979, it took official form at last, when Ken Rothmuller, like Woz late of nearby Hewlett-Packard, came on board to lead the project under the close eye of Jobs, who divided his time between Sara and Lisa. Sara was now envisioned as the immediate successor to the Apple II, a much-improved version of the same basic technology; Lisa as the proverbial paradigm shift in computing that would come somewhat later. Most of the people whom Rothmuller hired were also HP alumni, as were many of those working on Sara; Apple in these days could seem almost like a divisional office of the larger company. This caused no small chagrin to Jobs, who considered the HP engineers the worst sort of unimaginative, plodding “Clydesdales,” but it was inevitable given Apple’s proximity to the giant.

While they waited for Woz’s new processor, the Lisa people started prototyping software on the Apple II. Already a bitmapped display that could show images and a variety of font styles was considered essential. The early software thus ran on an Apple II fitted with a custom-built display board offering a resolution of 356 X 720. At this stage the interface was to be built around “soft keys”: each application would always show a menu of functions that were mapped to a set of programmable function keys on the keyboard. It was a problematic approach, wasteful of precious screen real estate and limited by the number of function keys on the keyboard, but it was the best anyone had yet come up with.

The original Lisa user interface, circa autumn 1979. Note the menu of “soft keys” at the bottom.

That October Rothmuller’s team assembled the first working Lisa computer around a prototype of Woz’s processor. Just as they were doing so, however, they became aware of an alternative that would let them avoid both the trouble and expense of refining a whole new processor and the idiosyncrasies of Woz, who was quickly falling out of favor with management. This new chip would have a tremendous impact on computing during the decade to come. It was the Motorola 68000.

The 68000 must have seemed like the answer to a prayer. At a time when the first practical 16-bit chips like the Intel 8086 were just making their presence felt, the 68000 went one better. Externally, it fetched and stored from memory like a 16-bit chip, but it could perform many internal operations as a 32-bit chip, while a 24-bit address bus let it address a full 16 MB of memory, a truly mind-boggling quantity in 1979. It could be clocked at up to 8 MHz, and had a beautiful system of interrupts built in that made it ideal as the centerpiece of a sophisticated operating system like those that had heretofore been seen only on the big institutional computers. In short, it was simply the sleekest, sexiest, most modern microprocessor available. Naturally, Apple wanted it for the Lisa. Motorola was still tooling up to produce the chips — they wouldn’t begin coming out in quantity until the end of 1980 — but Apple was able to finagle a deal that gave them access to prototypes and pre-release samples. Woz’s processor was put on the shelf. The Lisa was now to be a 68000 machine, the CPU of the future housed in the computing paradigm of the future. It’s at this stage, with the Lisa envisioned as a soft-key-driven 68000-based computer, that Xerox PARC enters the picture.
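
To make those bus widths concrete, a quick back-of-the-envelope check may help (the arithmetic below is mine, not anything from Motorola): each additional address line doubles the number of distinct memory addresses a chip can form.

```python
# Address space grows as 2 to the power of the number of address lines.
print(2 ** 16)   # 65,536 bytes: the 64 K ceiling of 8-bit-era chips like the 6502
print(2 ** 20)   # 1,048,576 bytes: the 1 MB reach of the 8086's 20 lines
print(2 ** 24)   # 16,777,216 bytes: the 16 MB reach of the 68000's 24 lines
```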

The story of Steve Jobs’s visit to PARC in December of 1979 has passed into computer lore as a fateful instant when everything changed, one to stand alongside IBM’s visit to Digital Research the following year. Depending on your opinion of Jobs and Apple, they would either go on to refine, implement, and popularize these brilliant ideas about which Xerox themselves were clueless, or shamelessly rip off the work of others — and then have the hypocrisy to sue still others for trying to do the same, via their “look and feel” battle with Microsoft over the Windows interface. In truth, the PARC visit was in at least some ways less momentous than conventional wisdom would have it. To begin with, the events that set the meeting in motion had little to do with the future of computing as implemented through the Lisa project or anywhere else, and a lot to do with a pressing, immediate worry afflicting Mike Markkula and Michael Scott.

Back in early 1978, Apple had been the first PC maker to produce a disk system, using the new technology of the 5 1/4″ floppy disk, which had been developed by a company called Shugart Associates. Woz’s Disk II system was as important to the Apple II’s success as the Apple II itself, a night-and-day upgrade over the old, slow, and balky cassette tapes, one that enabled, amongst many other advances, the Apple II’s first killer app, VisiCalc. Apple initially bought its drives directly from Shugart, the only possible source. However, they soon became frustrated with the prices they were paying (apparently Apple’s legendarily high profit margins were justifiable for them, but not for others) and with the pace of delivery. They therefore found a cut-rate electronics manufacturer in Japan, Alps Electric Company, whom they helped to clone the Shugart drives. Through Alps they were able to get all the drives they wanted, and get them much cheaper than they could through Shugart. Trouble was, blatantly cloning Shugart’s patented technology in this way left them vulnerable to all sorts of legal action. By this time, Apple had a reputation as an up-and-coming company to watch, and was raising money toward an eventual IPO from a variety of investors. When he heard that Xerox’s financial people were interested in making an investment, Scott suddenly saw a way to protect the company from Shugart — for Shugart, you see, was wholly owned by Xerox. Scott reasoned, correctly, that Xerox would not allow one of its subsidiaries to sue a company in which it had a vested interest. Apple and Xerox quickly concluded an agreement that allowed the latter to buy 100,000 shares of the former for a rather paltry $1 million. As a sort of adjunct, the two companies also agreed to do some limited technology exchange. It was this that led to Jobs’s legendary visit to PARC some months later.

The fact that it took him so long to finally visit shows that PARC’s technology was not so high on Jobs’s list of priorities. The basics of what PARC had to offer were hardly any big secret amongst people who thought about such things during the 1970s. It was something of a rite of passage for ambitious computer-science graduate students, at least those from the nearby universities, to take a tour and get a glimpse of what everyone was increasingly coming to regard as the interface of the future. Several people at Apple, and even on the Lisa team, were very aware of PARC’s work, and many of PARC’s ideas had already made their way into the Lisa. Reports vary somewhat, but some claim that the Lisa already had the concept of windowing and even an optional mouse before the visit to PARC. And certainly the Alto’s bitmapped display model was already present. The Lisa team member who finally convinced Jobs to visit PARC, Bill Atkinson, later claimed to wish he had never done so: “Those one and a half hours tainted everything we did, and so much of what we did was original research.”

The legendary visit to PARC was actually two visits, which took place a few days apart. The first involved a small group, perhaps no more than Atkinson and Jobs themselves. The second was a much lengthier and more in-depth demonstration that spanned most of a day, and involved most of the principal players on Lisa, including Hawkins. As Jobs later freely admitted, he saw three breakthrough technologies at PARC — the GUI, the LAN, and object-oriented programming in the form of Smalltalk — but was so taken by the first that he hardly noticed the other two. Jobs was never particularly interested in how technology was constructed, so his lack of engagement with the third is perhaps understandable. His inability to grasp the importance of networking, however, would become a major problem for Apple in the future. (A fun anecdote has the Jobs of a few years later, tired of being bothered about Apple’s piss-poor networking, throwing a floppy disk at his interlocutor and saying, “Here’s your fucking network!”)

Even if there weren’t as many outright revelations at PARC as legend would have it, it’s certainly true that Jobs and the rest of the Lisa team found what they saw there hugely inspiring. Suddenly all of these ideas that they had been discussing in the abstract were there, before them, realized in actual pixels. PARC showed them that it could be done. As Hawkins later put it, “We got religion.”

Of course, every religion needs a sacred text. Hawkins provided one in the spring of 1980 when he finished the 75-page “Lisa Marketing Requirements.” Far more than its name would imply, it was in fact a blueprint for the entire project as Jobs and Hawkins now envisioned it. It’s a fascinating read today. Lisa will “portray a friendlier ‘personality’ and ‘disposition’ than ordinary computers to allow first-time users to develop the same emotional attachment with their system that they have with their car or their stereo.” Over and over occurs a phrase that was also the mission statement of Xerox PARC: “the office of the future.” Other parts, however, put the lie to the notion that Apple decided to just junk everything it had already done on the Lisa and clone the Xerox Alto. While a mouse is now considered essential, for instance, they are still holding onto the old notion of a custom keyboard with soft keys. The MR document was Hawkins’s last major contribution to Lisa. Soon after writing it, he became Apple’s head of marketing, a move that limited his further involvement with the project.

While the HP contingent remained strongly represented, as the team grew Apple began poaching from PARC itself, eventually hiring more than fifteen ex-PARCers. Those who weren’t on board with the new, increasingly PARC-centric direction found it tough going. Rothmuller, for instance, was unceremoniously dumped for thinking too traditionally. And then, unexpectedly, Jobs himself was gone.

As 1980 drew to a close, with the IPO looming and the Apple III already starting to show signs of becoming a fiasco, CEO Michael Scott decided that he had to impose some order on the company and neuter Jobs, whose often atrocious treatment of others was bringing a steady rain of complaints down upon his desk. He therefore reorganized the entire company along much stricter operational lines. Jobs begged to be given the newly created Lisa division, but Scott was having none of it. Lisa, after all, was coming more and more to represent the long-term future of Apple, and after watching the results of his meddling in the Sara project, Scott had decided that he didn’t want Jobs anywhere near it. If Jobs would just confine himself to joining Woz as Apple’s token spokesman and mascot, that would be more than enough of a contribution, thank you very much. He placed Lisa in the hands of yet another steady ex-HP man, John Couch. Jobs went off in a huff, eventually to busy himself with another project called Macintosh. From now on he would be at war with his erstwhile pet; one of his first strikes was to lure Atkinson, an ace graphics programmer, away to the Macintosh project.

By now 68000-based prototype machines were available and software development was ramping up. Wayne Rosing was now in charge of hardware; Bruce Daniels, who had co-written the original MIT Zork and written Infocom’s first interpreter for the Apple II, was in charge of the operating system; and Larry Tesler, late of PARC, was in charge of the six integrated applications to be at the heart of the office of the future. They were: Lisa Draw, a drawing program; Lisa Write, a what-you-see-is-what-you-get word processor in the tradition of PARC’s Gypsy; Lisa Project, a project manager; Lisa Calc, a spreadsheet; Lisa List, a database; and Lisa Graph, a business-graphing program. From a very early date the team took the then-unusual step of getting constant feedback on the interface from ordinary people. When the time came to conduct another round of testing, they would go to Apple’s Human Resources department and request a few new hires from the clerical pool or the like who had not yet been exposed to Lisa. Tesler:

We had a couple of real beauties where the users couldn’t use any of the versions that were given to them and they would immediately say, “Why don’t you just do it this way?” and that was obviously the way to do it. So sometimes we got the ideas from our user tests, and as soon as we heard the idea we all thought, “Why didn’t we think of that?” Then we did it that way.

It’s difficult to overstate what a revolutionary change this was from the way software had been developed before, in which a programmer’s notion of utilitarian functionality was preeminent. It was through this process that the team’s most obvious addition to the PARC template arose: pull-down menus. User testing also led them to decide to include just one button on the mouse, in contrast to the PARC mice, which had three or more. While additional buttons could be useful for advanced users, new users found them intimidating. Thus was born Apple’s stubborn commitment to the single-button mouse, which would persist for more than twenty years. The final major piece of the user-interface puzzle — of the GUI blueprint we still know today — came in June of 1981, when the team first saw the Xerox Star. The desktop metaphor was so obviously right for the office of the future that they immediately adopted it. Thus the Lisa in its final form was an amalgam of ideas taken from PARC and from the Star, but it also represented a significant amount of original research.

As 1982 began, the picture of what Lisa should be was largely complete. Now it just came down to finishing everything. As the year wore on, the milestones piled up. In February the system clipboard went in, allowing the user to cut and paste not just between the six bundled applications but also, presumably, between any that might be written in the future — a major part of the Lisa vision of a unified, consistent computing environment. On July 30, 1982, the team started all six applications at once on a single computer to test the capabilities of the advanced, multitasking operating system. On September 1, the Lisa was declared feature complete; all that remained now was to swat the bugs. On October 10, it was demonstrated to Apple’s sales force for the first time.

The Apple Lisa. Note the two Twiggy drives to the right. The 5 MB hard drive sits on top.

John Couch’s people had a lot to show off. The Lisa’s hardware was quite impressive, with its high-resolution bitmapped display, its mouse, and its astonishing full megabyte of memory. (To understand just how huge that number was in 1982, consider that the IBM PC had not been designed to support more than 640 K, a figure IBM regarded as a strictly theoretical upper limit no one was ever likely to reach in the real world.) Yet it was the software that was the most impressive part. To use an overworked phrase that in this case is actually deserved, Lisa OS was years ahead of its time. Aside from the hacker-oriented OS-9, it was the first operating system on a PC to support multitasking. If the user started up enough programs to exceed even the machine’s 1 MB of memory, a virtual-memory scheme kicked in to cache the excess onto the 5 MB hard drive. (By way of comparison, consider that this level of sophistication would not come to a Microsoft operating system until Windows 3.0, released in 1990.) It was possible to cut and paste data between applications effortlessly using the system clipboard. With its suite of sophisticated what-you-see-is-what-you-get applications that benefited greatly from all that end-user testing, and a GUI desktop that went beyond even what had been seen on the Star (and arguably beyond anything that would be seen for the rest of the 1980s) in consistency and usability, the Lisa was kind of amazing. Apple confidently expected it to change the world, or at least to remake the face of computing, and in this case their hubris seemed justified.
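
To make the virtual-memory trick a little more concrete, here’s a minimal sketch of demand paging in the abstract. Everything in it — the tiny page counts, the least-recently-used eviction policy, the names — is my own illustrative choice, not a description of how Lisa OS actually worked.

```python
from collections import OrderedDict

RAM_PAGES = 4                  # pretend RAM holds only four pages
ram = OrderedDict()            # page id -> contents, kept in least-recently-used order
disk = {}                      # stand-in for the 5 MB hard drive

def touch(page):
    """Access a page, faulting it in from disk if necessary."""
    if page not in ram:
        if len(ram) >= RAM_PAGES:              # RAM is full, so make room:
            victim, data = ram.popitem(last=False)
            disk[victim] = data                # evict the LRU page to disk
        ram[page] = disk.pop(page, "fresh")    # bring the requested page in
    ram.move_to_end(page)                      # mark it most recently used
    return ram[page]
```

The effect is what Lisa’s users saw: programs keep running past the limits of physical memory, at the cost of occasional trips to the much slower disk.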

Apple officially announced the Lisa on January 19, 1983, alongside the Apple IIe in an event it labeled “Evolution/Revolution.” (I trust you can guess which was which.) They managed to convince a grudging Jobs, still the face of the company, to present these two machines that he ardently hated in his heart of hearts. It must have cut especially deep because the introduction was essentially a celebration of a bet he was about to lose to Couch: that he could get his Macintosh out before the Lisa. Jobs had come to hate everything about the Lisa project since his dismissal. He saw the Lisa team, now over 200 people strong when the business and marketing arms were taken into account, as bloated and coddled, full of the sort of conservative, lazy HP plodders he loathed. That loathing extended to Couch himself, whose low-key style of “management by walking around” and whose insistence that his people work sane hours and be given time for a life outside of Apple contrasted markedly with the approach of the more high-strung Jobs.

But then, Jobs had much to be unhappy about at this point. Time magazine had planned to make him its “Man of the Year” for 1982, until their journalists, digging around for material for the feature, unearthed a series of rather unflattering revelations about Jobs’s personal life, his chequered, contentious career at Apple, and the hatred many even in his own company felt toward him. Prominent among the revelations were the first reports of the existence of Jobs’s daughter Lisa and Jobs’s shabby treatment of her and her mother. In the face of all this, Time turned the Jobs feature into an elegy for a brilliant young man corrupted and isolated from his erstwhile friends by money and fame. (Those who had known Jobs before his “corruption” mostly just shrugged at such a Shakespearian portrayal and said, well, he’d always kind of been an asshole.) The Man of the Year honor, meanwhile, went to the computer itself — a weird sort of “man,” but what was the alternative? Who symbolized the face of the computer age to mainstream America better than Jobs? The snub rankled Jobs greatly. It didn’t make Apple any too happy either, as now their new wonder-computer was hopelessly entangled with the tawdry details of Jobs’s personal life. They had discussed changing the name many times, to something like the Apple IV or — this was Trip Hawkins’s suggestion — the Apple One. But they had ended up keeping “Lisa” because it was catchy, friendly, and maybe even a little bit sexy, and separated the new machine clearly from both the Apple III fiasco and everything else that had come before from Apple. Now they wished they could change it, but, with advertising already printed and announcements made, there was nothing to be done. It was the first ominous sign of a launch that would end up going nothing like they had hoped and planned.

Still, as time rolled on toward June of 1983, when the Lisa would actually start shipping, everything seemed to be going swimmingly. Helped along by ecstatic reviews that rightly saw the Lisa as a potentially revolutionary machine, Apple’s stock soared to $55 on the eve of the first shipments, up from $29 at the time of the Evolution/Revolution event. Partly this was down to the unexpectedly strong sales of the Apple IIe, which unlike the Lisa had gone into production immediately after its announcement, but mostly it was all about the sexier Lisa. Apple already had 12,000 orders in the queue before the first machine shipped.

But then, with the Lisa actually shipping at last, the orders suddenly stopped coming. Worse, many of those that had already been placed were cancelled or returned. Within the echo chamber inside Apple, Lisa had looked like a surefire winner, but that perception had depended upon ignoring a lot of serious problems with the computer itself, not to mention some harsh marketplace realities, in favor of the Lisa’s revolutionary qualities. Now the problems all started becoming clear.

Granted, some of the criticisms that now surfaced were hilariously off-base in light of a future that would prove the Lisa right about so many fundamentals. As always, some people just didn’t get what Lisa was about, were just too mired in the conventional wisdom. From a contemporary issue of Byte:

The mouse itself seems pointless; why replace a device you’re afraid the executive is afraid of (the keyboard) with another unfamiliar device? If Apple was seriously interested in the psychology involved it would have given said executive a light pen.

While the desktop-with-icons metaphor may be useful, were I a Fortune 500 company vice-president, I would be mortally insulted that a designer felt my computer had to show me a picture of a wastebasket to direct me to the delete-file function. Such offensive condescension shows up throughout the design, even in the hardware (e.g., labeling the disk release button “Disk Request”).

I’d hoped (apparently in vain) that Apple finally understood how badly its cutesy, whimsical image hurts its chances of executive-suite penetration. This image crops up in too many ways on the Lisa: the Apple (control) key, the mouse, and on and on. Please, guys, the next time you’re in the executive-suite waiting room, flip through the magazines on the table. You’ll find Fortune, Barron’s, Forbes, etc., but certainly not Nibble. There’s a lesson there.

Other criticisms, however, struck much closer to home. There was one in particular that came to virtually everyone’s lips as soon as they sat down in front of a Lisa: it was slow. No matter how beautiful and friendly this new interface might look, actually using it meant accepting windows that jerked reluctantly into place seconds after being pulled with the mouse, a word processor that a moderately skilled typist could outrace by lines at a time, and menus that drew themselves line by laborious line while you sat waiting and wondering if you were ever going to be able to just get this letter finished. Poor performance had been the dirty little secret plaguing GUI implementations for years. Certainly it had been no different on the Alto. One PARC staffer estimated that the Alto’s overall speed would have to be improved by a factor of ten for it to be a viable commercial product outside the friendly confines of PARC and its ultra-patient researchers. Apple only compounded the problem with a hardware design that was surprisingly clunky in one of its most vital areas. Bizarrely, on a machine that was ultimately going to be noticed primarily for its display, they decided against adding any specialized chips to help generate said display, choosing instead to dump the entire burden onto the 68000. Apple would not even have needed to design its own custom display chip, a task that would have been difficult without the resources of, say, Commodore’s MOS Technologies subsidiary. Off-the-shelf solutions, like the NEC 7220, were available, but Apple chose not to avail themselves of them. To compound the problem still further, they throttled the Lisa’s 68000 back to 5 MHz from its maximum of 8 MHz to keep it in sync with the screen refreshes it needed to constantly perform. With the 68000 so overloaded and strangled, the Lisa could seem almost as unusably slow as the old Alto at many tasks.

Other problems that should have been obvious before the Lisa was released also cropped up. The machine used a new kind of floppy-disk drive that Apple had been struggling with in-house since all the way back in 1978. Known informally as Twiggy, the disks had the same external dimensions as the industry-standard 5 1/4″ disks, but were of a new design that allowed greater capacity, speed, and (theoretically) reliability. Trouble was, the custom disks were expensive and hard to find (after all, only the Lisa used them), and the whole system never worked properly, requiring constant servicing. The fact that they were on the Lisa at all made little sense in light of the new 3.5″ “micro-floppy” standard just introduced by Sony. Those disks were reliable, inexpensive, and easily available — everything Twiggy was not — while matching or exceeding all of Twiggy’s other specifications. They were in fact so good that they would remain a computer-industry staple for the next twenty years. But Apple had poured millions into the Twiggy boondoggle during the previous several years of constant internal confusion, and they were determined to use it.

And then there was the price. Trip Hawkins’s marketing-requirements document from back in 1980 had presciently warned that the Lisa must be priced at less than $5000 to have a chance of making an impact. Somewhere along the way, however, that bit of wisdom had been lost. The Lisa debuted at no less than $10,000, a figure that in 1983 dollars could buy you a pretty nice new car. Given its extreme price and the resulting necessity that it be marketed exclusively to big corporate customers, it’s tough to say whether the Lisa can really be considered a PC in the mold of the Apple II and IBM PC at all. It utterly lacked the democratic hobbyist spirit that had made the Apple II such a success. Software could be written for the Lisa only by yoking two Lisas together, one to run the program being developed and the other to write it on, with the aid of an expensive toolkit available only from Apple. It was a barrier to entry so high that the Lisa was practically a closed system like the Xerox Star, confined to running only the software that Apple provided. Indeed, if Lisa had come from a company not known exclusively as a PC maker — like, say, Xerox — perhaps it would have been taken by the trade press as a workstation computer or an “office information system” in the vein of the Star. Yet the Lisa also came up short in several key areas in comparison to the only marginally more expensive Star. It lacked the Star’s networking support, meaning that a key element of PARC’s office of the future was missing. And it lacked a laser printer; in its stead Apple offered a dot-matrix model it had jointly developed with C. Itoh. Like too much else about the Lisa, it turned out slow, clunky, and unreliable; documents on paper were always a disappointment after viewing them on the Lisa’s crisp screen. Any office manager willing to spend the cash for the Lisa might very well have been better off splashing out some extra for the Star (not that many people were buying either system).

Finally, there was the Macintosh problem. Thanks to their internal confusion and the engine of chaos that was Steve Jobs, Apple had two 68000-based computers sporting mice, GUI-based operating systems, and high-resolution bitmapped monochrome displays. Best of all, the two computers were completely incompatible with each other. Seeing his opportunity, Jobs started leaking like a sieve about Apple’s next computer even as he dutifully demonstrated the Lisa. Virtually every preview or review thus concluded with a mention of the rumors about something called “Mackintosh,” which promised to do just about everything Lisa did for a fraction of the price. Apple’s worst enemy could hardly have come up with a better scheme to squelch the Lisa’s sales.

The rest of the Lisa story is largely that of an increasingly desperate Apple struggling to breathe life back into her. In September they dropped the price to $8200, or $7000 for just the machine and the opportunity to order the applications à la carte rather than as a mandatory bundle. By now Apple’s shares had dropped back to $27, less than they had been at the start of the year. At year’s end they had sold just 15,000 Lisas, down from estimates of 50,000 in those heady days of June.

Lisa 2. The Twiggy drives have been replaced by a single 3.5″ drive, and the hard drive is now internal.

In January of 1984 Apple released a much-needed revised model, the Lisa 2, which replaced the Twiggy drives with 3.5″ models. The price was now in the range of $4000 to $5500. But the Macintosh, now also released at last, well and truly stole the Lisa’s thunder yet again. The last of the Lisas were repackaged with a layer of emulation software as the Macintosh XL in January of 1985, marking the end of the Lisa nameplate. Sales actually picked up considerably after this move, as the price dropped again and the XL was still more advanced in many ways than the current “real” Macintosh. Still, it marked the end of the line for Lisa technology; the XL was officially discontinued on April 29, 1985, just less than two years after the first Lisa had rolled off the production line. In the end Apple sold no more than 60,000 Lisas and Macintosh XLs in total.

The Lisa was in many ways half-baked, and its commercial fate, at least in hindsight, is perfectly understandable. Yet its soft power was immense. It showed that a sophisticated, multitasking operating system could be done on a microcomputer, as could a full GUI. The latter achievement in particular would have immediate repercussions. While it would still be years before most average users would have machines built entirely around the PARC/Lisa model of computing, there was much about the Lisa that was implementable even on the modest 8-bit machines that would remain the norm in homes for years to come. Lisa showed that software could be more visual, easier to use, friendlier even on those machines. That new attitude would begin to take root, and nowhere more so than in the ostensible main subject of this blog, which I’ve been neglecting horribly lately: games. We’ll begin to see how the Lisa way trickled down to the masses in my next article, which I promise will be about games again at last.

On December 14, 1989, Xerox finally got around to suing Apple for allegedly ripping off their PARC innovations, thus prompting the joke that Xerox can’t even sue you on time. With the cat so well and thoroughly out of the bag by this point, the suit was dismissed a few months later.

(As with most aspects of Apple history, there’s enough material available on the Lisa project in print and on the Internet for days of reading. A particularly fascinating source, because it consists entirely of primary-source documents, is the Lisa directory on Asimov.net.)

Xerox PARC

One day in 1962 J.C.R. Licklider, head of the Defense Department’s Information Processing Techniques Office and a future Infocom board member, ran into a young man named Robert Taylor at an intra-government conference on emerging computer technologies. Lick was seventeen years older than Taylor, but the two found they had much in common. Both had studied psychology at university, with a special research interest in the psychology of human hearing. Both had moved on to computers, to become what we might today call user-interface specialists, studying the ways that the human mind receives and processes information and how computers might be designed to work in more intuitive accord with their masters. Both were humanists, more concerned with that amorphous thing we’ve come to call the user experience than with the bits and bytes that obsessed the technicians and engineers around them. And both were also from the South — Lick from Missouri, Taylor from Texas — and spoke in a corresponding slow drawl that often caused new acquaintances to underestimate them. A friendship was formed.

Robert Taylor, as photographed by Annie Leibovitz in 1972 for a Rolling Stone feature article

Taylor was working at the time at NASA, having been hired to work on simulators during the big build-up that followed President Kennedy’s Moon-before-the-decade-is-out speech. Rather astonishingly, considering the excitement building around the drive to the Moon, Taylor found himself increasingly dissatisfied there. He wasn’t content working on the margins of even as magnificent an endeavor as this one. Fueled by his conversations with Lick about the potential of computers, he wanted to be at the heart of the action. In 1964 he got his wish: stepping down as head of IPTO, Lick recommended that Ivan Sutherland be made his replacement, and that Taylor be made Sutherland’s immediate deputy. Barely two years later Sutherland himself stepped down, making the 34-year-old Taylor head of the most far-reaching, well-funded computer-research grant program in the country.

By this time Taylor had long ago come to share Lick’s dream of computers as more than just calculating and tabulating machines. They had the potential to become personal, interactive companions that would not replace human brainpower (as many of the strong-AI proponents dreamed) but rather complement, magnify, and transmit it. Taylor put his finger on the distinction in a later interview: “I was never interested in the computer as a mathematical device, but as a communications device.” He and Lick together published a manifesto of sorts in 1968 that still stands as a landmark in the field of computer science, the appropriately named “The Computer as a Communication Device.” They meant that literally as well as figuratively: it was Taylor who initiated the program that would lead to the ARPANET, the predecessor to the modern Internet.

The first mouse, created at SRI circa 1964

One of Taylor’s favorites amongst his stable of researchers became Doug Engelbart, who seemed almost uniquely capable of realizing his and Lick’s concept of a new computing paradigm in actual hardware. While developing an early full-screen text editor at the Stanford Research Institute, Engelbart found that users complained of how laborious it was to slowly move the cursor around the screen using arrow keys. To make it easier, he and his team hollowed out a small block of wood, mounting two mechanical wheels attached to potentiometers in the bottom and a button on top. They named it a “mouse,” because, with the cord trailing out of the back to connect it with a terminal, that’s sort of what it looked like. The strange, homemade-looking gadget was crude and awkward compared to what we use today. Nevertheless, his users found it a great improvement over the keyboard alone. The mouse was just one of many innovations of Engelbart and his team. Their work climaxed in a bravura demonstration in December of 1968, which exposed the public for the first time not only to the mouse but also to the concepts of networked communication, multimedia, and even the core ideas behind what would become known as hypertext. Engelbart pulled out all the stops to put on a great show, and was rewarded with a standing ovation.

By the time of that demonstration, though, and certainly by the time the ARPANET first went live the following year, the atmosphere at ARPA itself was changing. Whereas earlier Taylor had largely been left to invest his resources in whatever seemed to him useful and important, the ever-escalating Vietnam War was bringing with it both tightening research budgets and demands that all research be “mission-focused” — i.e., tailored not only to a specific objective, but to a specific military objective at that. Further, Taylor found himself more and more ambivalent about both the war itself and the idea of working for the vast engine that was waging it. After being required to visit Vietnam personally several times to sort out IT logistics there, he decided he’d had enough. He resigned from ARPA at the end of 1969, accepting a position with the University of Utah, which was conducting pioneering (and blessedly civilian) research in computer graphics.

He was still there a year later when George Pake, an old colleague whose research had been partially funded through ARPA, called him. Pake now worked for the Xerox Corporation, which was in the process of opening a new blue-sky research facility that he would head. One half of its staff and resources would be dedicated to Xerox’s traditional forte, filled with chemists and physicists doing materials research into photocopying technology. The other half, however, would be dedicated to computer technology in a bid to make Xerox not just the copy-machine guys but the holistic architects of “the office of the future.” Eager to exploit Taylor’s old ARPA connections, which placed him on a first-name basis with virtually every prominent computer scientist in the country, Pake offered Taylor a job as an “associate manager” — more specifically, as a sort of recruiter-in-chief and visionary-in-residence — at the new facility in Palo Alto, California, just outside Stanford University. Bored already by Mormon-dominated Salt Lake City, Taylor quickly accepted.

The very idea of a facility like Xerox’s Palo Alto Research Center feels anachronistic today, what with its open-ended goals and dedication to “pure” research. When hired to run the place, Pake frankly told Xerox that they shouldn’t expect any benefits from the research that would go on there for five to ten years. In that he wasn’t entirely correct, for PARC did do one thing for Xerox immediately: it gave them bragging rights.

Xerox was a hugely profitable company circa 1970, but relatively new to the big stage. Founded back in 1906 as The Haloid Photographic Company, they had really hit the big time only in 1960, when they started shipping the first copy machine practical for the everyday office, the Xerox 914. Now they were eager to expand their empire beyond copy machines, to rival older giants like IBM and AT&T. One part of doing so had to be a cutting-edge research facility of their own, like IBM’s Thomas J. Watson Research Center and the fabled Bell Labs. Palo Alto was chosen as the location not so much because it was in the heart of Silicon Valley as because it was a long way from the majority of Xerox’s facilities on the other coast. Like its inspirations, PARC was to be kept separate from any whiff of corporate group-think or practical business concerns.

Once installed at PARC, Taylor started going through his address book to staff the place. In a sense it was the perfect moment to be opening such a facility. The American economy was slowing, leaving fewer private companies with the spare change to fund the sort of expensive and uncertain pure research that PARC was planning. Meanwhile government funding for basic computer-science research was also drying up, due to budget squeezes and Congressional demands that every project funded by ARPA must have a specific, targeted military objective. The salad days of Taylor’s ARPA reign, in other words, were well and truly over. It all added up to a buyer’s market for PARC. Taylor had his pick of a large litter of well-credentialed thinkers and engineers who were suddenly having a much harder time finding interesting gigs. Somewhat under the radar of Pake, he started putting together a group specifically tailored to advance the dream he shared with Lick and Engelbart for a new, more humanistic approach to computing.

One of his early recruits was William English, who had served as Engelbart’s right-hand man through much of the previous decade; it was English who had actually constructed the mouse that Engelbart had conceived. Many inside SRI, English not least among them, had grown frustrated with Engelbart, who managed with an air of patrician entitlement and seemed perpetually uninterested in building upon the likes of that showstopping 1968 demonstration by packaging his innovations into practical forms that might eventually reach outside the laboratory and the exhibit hall. English’s recruitment was the prelude to a full-blown defection of Engelbart’s team; a dozen more eventually followed. One of their first projects was to redesign the mouse, replacing the perpendicularly mounted wheels with a single ball that allowed easier, more precise movement. That work would be key to much of what would follow at PARC. It would also remain the standard mouse design for some thirty years, until the optical mouse began to phase out its older mechanical ancestor at last.

Alan Kay

Taylor was filling PARC with practical skill from SRI and elsewhere, but he still felt he lacked someone to join him in the role of conceptual thinker and philosopher. He wanted someone who could be an ally against the conventional wisdom — still held even by many he had hired — of computers as big, institutional systems rather than tools for the individual. He therefore recruited Alan Kay, a colleague and intellectual soul mate from his brief tenure at the University of Utah. Kay dreamed of a personal computer with “enough power to outrace your senses of sight and hearing, enough capacity to store thousands of pages, poems, letters, recipes, records, drawings, animations, musical scores, and anything else you would like to remember and change.” It was all pretty vague stuff, enough so that many in the computer-science community — including some of those working at PARC — regarded him as a crackpot, a fuzzy-headed dreamer slumming it in a field built on hard logic. Of course, they also said the same about Taylor. Taylor decided that Kay was just what he needed to make sure that PARC didn’t just become another exercise in incremental engineering. Sure enough, Kay arrived dreaming of something that wouldn’t materialize in anything like the form he imagined until some two decades later. He called it the Dynabook. It was a small, flat rectangular box, about 9″ X 12.5″, which flipped open to reveal a screen and keyboard on which one could read, write, play, watch, and listen using media of one’s own choice. Kay was already describing a multimedia laptop computer — and he wasn’t that far away from the spirit of the iPad.

Combining the idealism of Taylor and Kay with the practical knowledge of their engineering staff and at least a strong nod toward the strategic needs of their parent corporation, PARC gradually refined its goal to be the creation of an office of the future that could hopefully also be a stepping stone on the path to a new paradigm for computing. Said office was constructed during the 1970s around four technologies developed right there at PARC: the laser printer; a new computer small enough to fit under a desk and possessed of almost unprecedented graphical capabilities; practical local-area networking in the form of Ethernet; and the graphical user interface (GUI). Together they all but encapsulated the face that computing would assume twenty years later.

Gary Starkweather’s laser printer

Of the aforementioned technologies, the laser printer was the most immediately, obviously applicable to Xerox’s core business. It’s thus not so surprising that its creator, Gary Starkweather, was one of the few at PARC to have been employed at Xerox before the opening of the new research facility. Previous computer printers had been clanking, chattering affairs that smashed a limited set of blocky characters onto a continuous feed of yellow-tinged fan-fold paper. They were okay for program listings and data dumps but hardly acceptable for creating business correspondence. In its original implementation Starkweather’s laser printer was also ugly: an unwieldy contraption sprouting wires out of every orifice, perched like a huge parasite atop a Xerox copy machine whose mechanisms it controlled. It was, however, revolutionary in that it treated documents not as a series of discrete characters but as a series of intricate pictures to be reproduced by the machinery of the copier. The advantages of the new approach were huge. Not only was the print quality vastly better, but it appeared on crisp white sheets of normal office paper. Best of all, it was now possible to use a variety of pleasing proportional fonts to replace the ugly old fixed-width characters of the line printers, to include non-English characters like umlauts and accents, and to add charts, graphs, and decorative touches like borders, even pictures.

Xerox Alto

The new computer was called the Alto. It was designed to be a personal computer, semi-portable and supporting just one user, although since it was not built around a microprocessor it was not technically a microcomputer like those that would soon be arriving on the hobbyist market. The Alto’s small size made it somewhat unusual, but what most set it apart was its display.

Most computers of this period — those that were interactive and thus used a display at all, that is — had no real concept of managing a display. They rather simply dumped their plain-text output, fire-and-forget fashion, to a teletype printer or terminal. (It was on the former devices, for example, that the earliest text adventures were played, with the response to each command unspooling onto fan-folded paper.) Even more advanced systems, like the full-screen text editors with which Engelbart’s team had worked, tracked the contents of the screen only as a set of cells, each of which could contain a single fixed-width ASCII character; no greater granularity was possible, nor any shapes not contained in the terminal’s single character set. Early experiments with computer graphics, such as the legendary Spacewar! game developed at MIT in the early 1960s, used a technique known as vector graphics, in which the computer directly controlled the electron gun that drew the images on the screen. A picture would be stored not as a grid of pixels but as a series of instructions — the sequence of strokes used to draw it on the display. (This is essentially the same technique as that developed by Ken Williams to store the graphics for On-Line’s Hi-Res Adventure line years later.) Because the early vector displays had no concept of display memory at all, a picture would have to be traced out many times per second, else the phosphors on the display would fade back to black. Such systems were not only difficult to program but much too coarse to allow the intricacies of text.
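
As a rough illustration of the vector approach, the sketch below (all names and coordinates my own, purely hypothetical) stores a “picture” as a list of strokes that must be retraced over and over:

```python
# A picture on a vector display: a list of strokes, not a grid of pixels.
display_list = [
    ((100, 100), (200, 100)),   # each entry: draw a line from point A to point B
    ((200, 100), (150, 180)),
    ((150, 180), (100, 100)),   # together, a triangle traced stroke by stroke
]

def refresh(draw_line):
    """Retrace every stroke. Real hardware had to do this many times per
    second, or the phosphors would fade back to black."""
    for start, end in display_list:
        draw_line(start, end)

# For demonstration, "draw" by printing each stroke:
refresh(lambda a, b: print("stroke from", a, "to", b))
```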

The Alto formed its display in a different way — the way the device you’re reading this on almost certainly does it. It stored the contents of its screen in memory as a grid of individual pixels, known as a bitmap. One bit represented the on/off status of one pixel; a total of 489,648 of them were needed to represent the Alto’s 606 X 808 pixel black-and-white screen. (The Alto’s monitor had an unusual portrait orientation that duplicated the dimensions of a standard 8 1/2″ X 11″ sheet of office paper, in keeping with its intended place as the centerpiece of the office of the future.) This area of memory, often called a frame buffer — at the time a fairly esoteric design choice — was then simply duplicated onto the monitor screen by the Alto’s video hardware. Just as the laser printer saw textual documents as pictures to be reproduced dot by dot, the Alto saw even its textual displays in the same way. This approach was far more taxing on memory and computing power than traditional approaches, but it had huge advantages. Now the user need no longer be restricted to a single font; she could have a choice of type styles, or even design her own. And each letter need no longer fit into a single fixed-size cell on the screen, meaning that more elegant and readable proportional fonts were now possible.
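
A bitmapped display, by contrast, is just memory. Here’s a minimal sketch of a one-bit-per-pixel frame buffer with the Alto’s dimensions; the helper function and the bit ordering are my own illustrative choices, not the Alto’s actual memory layout:

```python
WIDTH, HEIGHT = 606, 808                      # the Alto's portrait-mode screen

# One bit per pixel: 489,648 bits, packed eight to a byte.
framebuffer = bytearray(WIDTH * HEIGHT // 8)  # 61,206 bytes

def set_pixel(x, y, on=True):
    """Turn a single pixel on or off by flipping one bit in memory."""
    index = y * WIDTH + x                     # which bit in the whole bitmap
    byte, bit = divmod(index, 8)
    if on:
        framebuffer[byte] |= 1 << (7 - bit)   # assuming most-significant bit first
    else:
        framebuffer[byte] &= ~(1 << (7 - bit))

# Glyphs, window borders, and mouse cursors are all just patterns of bits
# here; the video hardware scans this memory out to the monitor unchanged.
set_pixel(100, 200)
```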

Amongst many other applications, the what-you-see-is-what-you-get word processor was born on the Alto as a direct result of its bitmapped display. A word processor called Gypsy became the machine’s first and most important killer app. Using Gypsy, the user could mix fonts and styles and even images in a document, viewing it all onscreen exactly as it would later look on paper, thanks to the laser printer. The combination was so powerful, went so far beyond what people had heretofore been able to expect of computers or typewriters, that a new term, “desktop publishing,” was eventually coined to describe it. Suddenly an individual with an Alto and a laser printer could produce work that could rival — in appearance, anyway — that of a major publishing house. (As PARC’s own David Liddle wryly said, “Before that, you had to have an article accepted for publication to see your words rendered so beautifully. Now it could be complete rubbish, and still look beautiful.”) Soon even the professionals would be abandoning their old paste boards and mechanical typesetters. Ginn & Co., a textbook-publishing subsidiary of Xerox, became the first publisher in the world to go digital, thanks to a network of laser printers and Altos running Gypsy.

Ethernet

Speaking of which: the Ethernet network was largely the creation of PARC researcher Robert Metcalfe. Various networking schemes had been proposed and sometimes implemented in the years before Ethernet, but they all carried two big disadvantages: they were proprietary, limited to the products of a single company or even to a single type of machine; and they were fragile, prone to immediate failure if the smallest of their far-flung elements should break down. Ethernet overcame both problems. It was a well-documented standard that was also almost absurdly simple to implement, containing the bare minimum needed to accomplish the task effectively and absolutely nothing else. This quality also made it extremely reliable, as did its decentralized design, which left the network dependent on no single computer to continue functioning. Unlike most earlier networking systems, which relied upon a charged cable like that used by the telephone system, Ethernet messages could pass through any passive conductive medium, including uncharged copper wire of the sort sold by the spool in any hardware store. The new protocol was simpler, cheaper, safer, and more reliable than anything that had come before.

Like so much else at PARC, Ethernet represented both a practical step toward the office of the future and a component of Taylor’s idealistic crusade for computers as communications devices. In immediate, practical terms, it let dozens of Altos at PARC or Ginn & Co. share just a few of Starkweather’s pricey laser printers. In the long run, it provided the standard by which millions of disparate devices could talk to one another — the “computer as a communications device” in its purest terms. Ethernet remains today one of the bedrock technologies of the hyper-connected world in which we live, a basic design so effective at what it does that it still hasn’t been improved upon.

A GUI file manager running on an Alto

The GUI was the slowest and most gradual of the innovations to come to PARC. When the Alto was designed, Engelbart and English’s mouse was included. However, it was pictured as being used only for the specialized function for which they had originally designed it: positioning the cursor within a text document, a natural convenience for the centerpiece of the office of the future. But then Alan Kay and his small team, known as the “lunatic fringe” even amongst the others at PARC, got their hands on some Altos and started to play. Unlike the hardcore programmers and engineers elsewhere at PARC, Kay’s team had not been selected on the basis of credentials or hacking talent. Kay rather looked for people “with special stars in their eyes,” dreamers and grand conceptual thinkers like him. Any technical skills they might lack, he reasoned, they could learn, or rely on other PARC hackers to provide; one of his brightest stars was Diana Merry, a former secretary for a PARC manager who just got what Kay was on about and took to coming to his meetings. Provided with the Alto, the closest they could come to Kay’s cherished Dynabook, they went to work to make the technology sing. They developed a programming language called Smalltalk that was not only the first rigorously object-oriented language in history, the forerunner of C++, Java, and many others, but also simple enough for a grade-school kid to use. With Smalltalk they wrote a twelve-voice music synthesizer and composer (“Twang”), sketching and drawing programs galore, and of course the inevitable games (a networked, multiplayer version of the old standard Star Trek was a particular hit). Throughout, they repurposed the mouse in unexpected new ways.

Kay and his team realized that many of the functions they were developing were complementary; it was likely that users would want to do them simultaneously. One might, for example, want to write an instructional document in a text editor at the same time as one edited a diagram meant for it in a drawing program. They developed tools to let users do this, but ran into a problem: the Alto’s screen, just the size of a single sheet of paper, simply couldn’t contain it all. Returning yet again to the idea of the office of the future, Kay asked what people in real offices did when they ran out of space on their desk. The answer, of course, was that they simply piled the document they were using at that instant on top of the one they weren’t, then proceeded to flip between the documents as needed. From there it all just seemed to come gushing out of Kay and his team.
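
That pile-of-papers insight maps neatly onto a simple data structure: a list of windows ordered from bottom to top. The sketch below is purely illustrative — my own names and simplifications, not PARC’s actual Smalltalk code:

```python
class Window:
    def __init__(self, title, x, y, w, h):
        self.title, self.x, self.y = title, x, y
        self.w, self.h = w, h

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

# The "desk": the last item in the list is the frontmost window.
desktop = [Window("diagram", 10, 10, 300, 200),
           Window("document", 120, 80, 300, 200)]

def click(px, py):
    """Find the frontmost window under the pointer and raise it, like
    pulling a sheet of paper out of the pile and laying it on top."""
    for win in reversed(desktop):
        if win.contains(px, py):
            desktop.remove(win)
            desktop.append(win)
            return win
    return None

# Clicking at (150, 100) hits both windows but raises only the frontmost one.
print(click(150, 100).title)   # -> document
```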

The Alto’s Smalltalk windowing system in mature form

In February of 1975 Kay called together much of PARC, saying he had “a few things to show them.” What they saw was a rough draft of the graphical user interface that we all know today: discrete, overlapping windows; mouse-driven navigation; pop-up menus. In a very real way it was the fruition of everything they had been working on for almost five years, and everything Taylor and Kay had been dreaming of for many more. At last, at least in this privileged research institution, the technology was catching up to their dreams. Now, not quite two years after the Alto itself had been finished, they knew what it needed to be. Kay and the others at PARC would spend several more years refining the vision, but the blueprint for the future was in place already in 1975.

Xerox ultimately received little of the benefit they might have from all this visionary technology. A rather hidebound, intensely bureaucratic management structure never really understood the work that was going on at PARC, whose personnel they thought of as vaguely dangerous, undisciplined, and semi-disreputable. Unsurprisingly, they capitalized most effectively on the PARC invention closest to the copier technology they already understood: the laser printer. Even here they lost years to confusion and bureaucratic infighting, allowing IBM to beat them to market with the world’s first commercial laser printer. However, Starkweather’s work finally resulted in the smaller, more refined Xerox 9700 of 1977, which remained for many years a major moneymaker. Indeed, the expense of PARC as a whole was likely financially justified by the 9700 alone.

Still, the bulk of PARC’s innovations went comparatively unexploited. During the late 1970s Xerox did sell Alto workstations to a small number of customers, among them Sweden’s national telephone service, Boeing, and Jimmy Carter’s White House. Yet the commercialization of the Alto, despite pleading from many inside PARC who were growing tired of seeing their innovations used only in their laboratories, was never regarded by Xerox’s management as more than a cautious experiment. With a bit of corporate urgency, Altos could easily have been offered for sale well before the trinity of 1977 made its debut. While a more expensive machine designed for a very different market, a computer equipped with a full GUI on sale before the likes of the Apple II, TRS-80, and PET would likely have dramatically altered the evolution of the PC and made Xerox a major player in the PC revolution. Very possibly they might have ended up playing a role similar to that of IBM in our own timeline — only years earlier, and with better, more visionary technology.

The Xerox Star

Xerox’s most concerted attempt to exploit the innovations of PARC as a whole came only in 1981, in the form of the Xerox Star “office information system.” The product of an extended six years of troubled development, shepherded to release against the odds at last by ex-PARCer David Liddle, the Star did it all, and often better than it had been done inside PARC itself. The innovations of Kay and his researchers — icons, windows, scroll bars, sliders, pop-up menus — were refined into the full desktop metaphor that remains with us today, the perfect paradigm for the office of the future. Also included in each Star was a built-in Ethernet port to link it with its peers as well as with the office laser printer. The new machine represented the commercial fruition of everything PARC had been working toward for the last decade.

The desktop metaphor in full flight on the Star

Alas, the Star was a commercial failure. Its price of almost $17,000 per workstation meant that assembling a full office of the future could easily send the total price tag north of $100,000. It also had the misfortune to arrive just a few months before the IBM PC, a vastly simpler, utilitarian design that lacked the Star’s elegance but was much cheaper and open to third-party hardware and software. Marketed as a very unconventional piece of conventional office equipment rather than as a full-fledged PC, the Star was by contrast locked into the hardware and software Xerox was willing to provide. In the end Xerox managed to sell only about 30,000 of them — a sad, anticlimactic ending to the glory days of innovation at PARC. (The same year that the Star was released, Robert Taylor left PARC, taking the last remnants of his original team of innovators with him. By this time Alan Kay was already long gone, driven away by management’s increased demands for practical, shorter-term projects rather than leaps of faith.)

Like the Alto, the Star would have an influence far out of proportion to the number produced. It is, after all, to this machine that we owe the ubiquitous desktop metaphor. If anything, the innovations of the Star tend to go somewhat under-credited today in the understandable rush to lionize the achievements inside PARC proper. Perhaps this is somewhat down to Xerox’s dour advertising rhetoric, which presented the Star as “just” an “office administration assistant”; those words don’t exactly smack of a machine to change the world.

Oddly, the Star’s fate was just the first of a whole string of similar disappointments from many companies. The GUI and the desktop metaphor were concepts that seemed obviously, commonsensically better to just about everyone who saw them than the way computers then worked, but it would take another full decade for them to remake the face of the general-purpose PC. Those years are littered with failures and disappointments. Everyone knew what the future must look like, but no one could quite manage to get there. We’ll look at one of the earliest and most famous of these bumps on the road next time.

(Despite some disconcerting errors of fact about the computing world outside the laboratory, Michael A. Hiltzik’s Dealers of Lightning is the essential history of Xerox PARC, and immensely readable to boot. If you’re interested in delving into what went on there in far more detail than I can provide in a single blog post, it’s your obvious next stop.)