Planet Interactive Fiction

The Digital Antiquarian

The Freedom to Associate

by Jimmy Maher at September 23, 2016 05:00 PM

In 1854, an Austrian priest and physics teacher named Gregor Mendel sought and received permission from his abbot to plant a two-acre garden of pea plants on the grounds of the monastery at which he lived. Over the course of the next seven years, he bred together thousands upon thousands of the plants under carefully controlled circumstances, recording in a journal the appearance of every single offspring that resulted, as defined by seven characteristics: plant height, pod shape and color, seed shape and color, and flower position and color. In the end, he collected enough data to formulate the basis of the modern science of genetics, in the form of a theory of dominant and recessive traits passed down in pairs from generation to generation. He presented his paper on the subject, “Experiments on Plant Hybridization,” before the Natural History Society of Brünn in 1865, and saw it published in a poorly circulated scientific journal the following year.

And then came… nothing. For various reasons — perhaps due partly to the paper’s unassuming title, perhaps due partly to the fact that Mendel was hardly a known figure in the world of biology, undoubtedly due largely to the poor circulation of the journal in which it was published — few noticed it at all, and those who did dismissed it seemingly without grasping its import. Most notably, Charles Darwin, whose On the Origin of Species had been published while Mendel was in the midst of his own experiments, seems never to have been aware of the paper at all, thereby missing this key gear in the mechanism of evolution. Mendel was promoted to abbot of his monastery shortly after the publication of his paper, and the increased responsibilities of his new post ended his career as a scientist. He died in 1884, remembered as a quiet man of religion who had for a time been a gentleman dabbler in the science of botany.

But then, at the turn of the century, the German botanist Carl Correns stumbled upon Mendel’s work while conducting his own investigations into floral genetics, becoming in the process the first to grasp its true significance. To his huge credit, he advanced Mendel’s name as the real originator of the set of theories which he, along with one or two other scientists working independently, was beginning to rediscover. Correns effectively shamed those other scientists as well into acknowledging that Mendel had figured it all out decades before any of them even came close. It was truly a selfless act; today the name of Carl Correns is unknown except in esoteric scientific circles, while Gregor Mendel’s has been done the ultimate honor of becoming an adjective (“Mendelian”) and a noun (“Mendelism”) locatable in any good dictionary.

Vannevar Bush

So, all’s well that ends well, right? Well, maybe, but maybe not. Some 30 years after the rediscovery of Mendel’s work, an American named Vannevar Bush, dean of MIT’s School of Engineering, came to see the 35 years that had passed between the publication of Mendel’s theory and the affirmation of its importance as a troubling symptom of the modern condition. Once upon a time, all knowledge had been regarded as of a piece, and it had been possible for a great mind to hold within itself huge swathes of this collective knowledge of humanity, everything informing everything else. Think of that classic example of a Renaissance man, Leonardo da Vinci, who was simultaneously a musician, a physicist, a mathematician, an anatomist, a botanist, a geologist, a cartographer, an alchemist, an astronomer, an engineer, and an inventor. Most of all, of course, he was a great visual artist, but he used everything else he was carrying around in that giant brain of his to create paintings and drawings as technically meticulous as they were artistically sublime.

By Bush’s time, however, the world had long since entered the Age of the Specialist. As the sheer quantity of information in every field exploded, those who wished to do worthwhile work in any given field — even those people gifted with giant brains — were increasingly being forced to dedicate their intellectual lives entirely to that field and only that field, just to keep up. The intellectual elite were in danger of becoming a race of mole people, closeted one-dimensionals fixated always on the details of their ever more specialized trades, never on the bigger picture. And even then, the amount of information surrounding them was so vast, and existing systems for indexing and keeping track of it all so feeble, that they could miss really important stuff within their own specialties; witness the way the biologists of the late nineteenth century had missed Gregor Mendel’s work, and the 35-year head start it had cost the new science of genetics. “Mendel’s work was lost,” Bush would later write, “because of the crudity with which information is transmitted between men.” How many other major scientific advances were lying lost in the flood of articles being published every year, a flood that had increased by an order of magnitude just since Mendel’s time? “In this are thoughts,” wrote Bush, “certainly not often as great as Mendel’s, but important to our progress. Many of them become lost; many others are repeated over and over.” “This sort of catastrophe is undoubtedly being repeated all around us,” he believed, “as truly significant attainments become lost in the sea of the inconsequential.”

Bush’s musings were swept aside for a time by the rush of historical events. As the prospect of another world war loomed, he became President Franklin Delano Roosevelt’s foremost advisor on matters involving science and engineering. During the war, he shepherded through countless major advances in the technologies of attack and defense, culminating in the most fearsome weapon the world had ever known: the atomic bomb. It was actually this last that caused Bush to return to the seemingly unrelated topic of information management, a problem he now saw in a more urgent light than ever. Clearly the world was entering a new era, one with far less tolerance for the human folly, born of so much context-less mole-person ideology, that had spawned the current war.

Practical man that he was, Bush decided there was nothing for it but to roll up his sleeves and make a concrete proposal describing how humanity could solve the needle-in-a-haystack problem of the modern information explosion. Doing so must entail grappling with something as fundamental as “how creative men think, and what can be done to help them think. It is a problem of how the great mass of material shall be handled so that the individual can draw from it what he needs — instantly, correctly, and with utter freedom.”

As revolutionary manifestos go, Vannevar Bush’s “As We May Think” is very unusual in terms of both the man who wrote it and the audience that read it. Bush was no Karl Marx, toiling away in discontented obscurity and poverty. On the contrary, he was a wealthy upper-class patrician who was, as a member of the White House inner circle, about as fabulously well-connected as it was possible for a man to be. His article appeared first in the July 1945 edition of the Atlantic Monthly, hardly a bastion of radical thought. Soon after, it was republished in somewhat abridged form by Life, the most popular magazine on the planet. Thereby did this visionary document reach literally millions of readers.

With the atomic bomb still a state secret, Bush couldn’t refer directly to his real reasons for wanting so urgently to write down his ideas now. Yet the dawning of the atomic age nevertheless haunts his article.

It is the physicists who have been thrown most violently off stride, who have left academic pursuits for the making of strange destructive gadgets, who have had to devise new methods for their unanticipated assignments. They have done their part on the devices that made it possible to turn back the enemy, have worked in combined effort with the physicists of our allies. They have felt within themselves the stir of achievement. They have been part of a great team. Now, as peace approaches, one asks where they will find objectives worthy of their best.

Seen in one light, Bush’s essay is similar to many of those that would follow from other Manhattan Project alumni during the uncertain interstitial period between the end of World War II and the onset of the Cold War. Bush was like many of his colleagues in feeling the need to advance a utopian agenda to counter the apocalyptic potential of the weapon they had wrought, in needing to see the ultimate evil that was the atomic bomb in almost paradoxical terms as a potential force for good that would finally shake the world awake.

Bush was true to his engineer’s heart, however, in basing his utopian vision on technology rather than politics. The world was drowning in information, making the act of information synthesis — intradisciplinary and interdisciplinary alike — ever more difficult.

The difficulty seems to be, not so much that we publish unduly in view of the extent and variety of present-day interests, but rather that publication has been extended far beyond our present ability to make real use of the record. The summation of human experience is being expanded at a prodigious rate, and the means we use for threading through the consequent maze to the momentarily important item is the same as was used in the days of square-rigged ships.

Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. It can be in only one place, unless duplicates are used; one has to have rules as to which path will locate it, and the rules are cumbersome. Having found one item, moreover, one has to emerge from the system and reenter on a new path.

The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. It has other characteristics, of course; trails that are not frequently followed are prone to fade, items are not fully permanent, memory is transitory. Yet the speed of action, the intricacy of trails, the detail of mental pictures, is awe-inspiring beyond all else in nature.

Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve it, for his records have relative permanency. The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage.

Bush was not among the vanishingly small number of people who were working in the nascent field of digital computing in 1945. His “memex,” the invention he proposed to let an individual free-associate all of the information in her personal library, was more steampunk than cyberpunk, all whirring gears, snickering levers, and whooshing microfilm strips. But really, those things are just details; he got all of the important stuff right. I want to quote some more from “As We May Think,” and somewhat at length at that, because… well, because its vision of the future is just that important. This is how the memex should work:

When the user is building a trail, he names it, inserts the name in his code book, and taps it out on his keyboard. Before him are the two items to be joined, projected onto adjacent viewing positions. At the bottom of each there are a number of blank code spaces, and a pointer is set to indicate one of these on each item. The user taps a single key, and the items are permanently joined. In each code space appears the code word. Out of view, but also in the code space, is inserted a set of dots for photocell viewing; and on each item these dots by their positions designate the index number of the other item.

Thereafter, at any time, when one of these items is in view, the other can be instantly recalled merely by tapping a button below the corresponding code space. Moreover, when numerous items have been thus joined together to form a trail, they can be reviewed in turn, rapidly or slowly, by deflecting a lever like that used for turning the pages of a book. It is exactly as though the physical items had been gathered together from widely separated sources and bound together to form a new book. It is more than this, for any item can be joined into numerous trails.

The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constants. He inserts a page of longhand analysis of his own. Thus he builds a trail of his interest through the maze of materials available to him.

And his trails do not fade. Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outraged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own memex, there to be linked into the more general trail.

Wholly new forms of encyclopedias will appear, ready-made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified. The lawyer has at his touch the associated opinions and decisions of his whole experience, and of the experience of friends and authorities. The patent attorney has on call the millions of issued patents, with familiar trails to every point of his client’s interest. The physician, puzzled by a patient’s reactions, strikes the trail established in studying an earlier similar case, and runs rapidly through analogous case histories, with side references to the classics for the pertinent anatomy and histology. The chemist, struggling with the synthesis of an organic compound, has all the chemical literature before him in his laboratory, with trails following the analogies of compounds, and side trails to their physical and chemical behavior.

The historian, with a vast chronological account of a people, parallels it with a skip trail which stops only on the salient items, and can follow at any time contemporary trails which lead him all over civilization at a particular epoch. There is a new profession of trail blazers, those who find delight in the task of establishing useful trails through the enormous mass of the common record. The inheritance from the master becomes, not only his additions to the world’s record, but for his disciples the entire scaffolding by which they were erected.
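Bush’s trail mechanism is, in modern terms, a simple data structure: named, ordered sequences of item references, in which a single item may sit on any number of trails — the crucial departure from a hierarchical index, where each item can be filed in only one place. A minimal Python sketch of the idea (the class and method names here are my own illustration, not anything Bush specified):

```python
# Toy model of memex-style associative trails: items are documents,
# trails are named, ordered sequences of item ids, and any one item
# may belong to any number of trails (unlike a hierarchical index).

class Memex:
    def __init__(self):
        self.items = {}   # item id -> text
        self.trails = {}  # trail name -> ordered list of item ids

    def add_item(self, item_id, text):
        self.items[item_id] = text

    def join(self, trail, item_id):
        """Append an item to a named trail, creating the trail if needed."""
        self.trails.setdefault(trail, []).append(item_id)

    def replay(self, trail):
        """Review a trail in order, as with Bush's page-turning lever."""
        return [(i, self.items[i]) for i in self.trails[trail]]

    def trails_through(self, item_id):
        """Every trail an item participates in -- association, not filing."""
        return [name for name, ids in self.trails.items() if item_id in ids]

mx = Memex()
mx.add_item("enc", "Encyclopedia article on the bow and arrow")
mx.add_item("hist", "History of the Crusades")
mx.add_item("elas", "Textbook chapter on elasticity")
mx.join("turkish-bow", "enc")
mx.join("turkish-bow", "hist")
mx.join("materials", "elas")
mx.join("materials", "enc")      # the same item, joined to a second trail
print(mx.trails_through("enc"))  # ['turkish-bow', 'materials']
```

The point of the sketch is the last line: asking which trails pass through an item is exactly the question a subclass-by-subclass filing system cannot answer without re-entering the index from the top.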

Ted Nelson

There is no record of what all those millions of Atlantic Monthly and Life readers made of Bush’s ideas in 1945 — or for that matter if they made anything of them at all. In the decades that followed, however, the article became a touchstone of the burgeoning semi-underground world of creative computing. Among its discoverers was Ted Nelson, who is, depending on whom you talk to, either one of the greatest visionaries in the history of computing or one of the greatest crackpots — or, quite possibly, both. Born in 1937 to a Hollywood director and his actress wife, then raised by his wealthy and indulgent grandparents following the inevitable Hollywood divorce, Nelson would live a life largely defined by, as Gary Wolf put it in his classic profile for Wired magazine, his “aversion to finishing.” As in, finishing anything at all, or just the concept of finishing in the abstract. Well into middle age, he would be diagnosed with attention-deficit disorder, an alleged malady he came to celebrate as his “hummingbird mind.” This condition perhaps explains why he was so eager to find a way of forging permanent, retraceable associations among all the information floating around inside and outside his brain.

Nelson coined the terms “hypertext” and “hypermedia” at some point during the early 1960s, when he was a graduate student at Harvard. (Typically, he got a score of Incomplete in the course for which he invented them, not to mention an Incomplete on his PhD as a whole.) While they’re widely used all but interchangeably today, in Nelson’s original formulation the former term was reserved for purely textual works, the latter for those incorporating other forms of media, like images and sound. But today we’ll just go with the modern flow, call them all hypertexts, and leave it at that. In his scheme, then, hypertexts were texts capable of being “zipped” together with other hypertexts, memex-like, wherever the reader or writer wished to preserve associations between them. He presented his new buzzwords to the world at a conference of the Association for Computing Machinery in 1965, to little impact. Nelson, possessed of a loudly declamatory style of discourse and all the rabble-rousing fervor of a street-corner anarchist, would never be taken all that seriously by the academic establishment.

Instead, it being the 1960s and all, he went underground, embracing computing’s burgeoning counterculture. His eventual testament, one of the few things he ever did manage to complete — after a fashion, at any rate — was a massive 1200-page tome called Computer Lib/Dream Machines, self-published in 1974, just in time for the heyday of the Altair and the Homebrew Computer Club, whose members embraced Nelson as something of a patron saint. As the name would indicate, Computer Lib/Dream Machines was actually two separate books, bound back to back. Theoretically, Computer Lib was the more grounded volume, full of practical advice about gaining access to and using computers, while Dream Machines was full of the really out-there ideas. In practice, though, they were often hard to distinguish. Indeed, it was hard to even find anything in the books, which were published as mimeographed facsimile copies filled with jotted marginalia and cartoons drafted in Nelson’s shaky hand, with no table of contents or page numbers and no discernible organizing principle beyond the stream of consciousness of Nelson’s hummingbird mind. (I trust that the irony of a book concerned with finding new organizing principles for information itself being such an impenetrable morass is too obvious to be worth belaboring further.) Nelson followed Computer Lib/Dream Machines with 1981’s Literary Machines, a text written in a similar style that dwelt, when it could be bothered, at even greater length on the idea of hypertext.

The most consistently central theme of Nelson’s books, to whatever extent one could be discerned, was an elaboration of the hypertext concept he called Xanadu, after the pleasure palace in Samuel Taylor Coleridge’s poem “Kubla Khan.” The product of an opium-fueled hallucination, the 54-line poem is a mere fragment of a much longer work Coleridge had intended to write. Problem was, in the course of writing down the first part of his waking dream he was interrupted; by the time he returned to his desk he had simply forgotten the rest.

So, Nelson’s Xanadu was intended to preserve information that would otherwise be lost, which goal it would achieve through associative linking on a global scale. Beyond that, it was almost impossible to say precisely what Xanadu was or wasn’t. Certainly it sounds much like the World Wide Web to modern ears, but Nelson insists adamantly that the web is a mere bad implementation of the merest shadow of his full idea. Xanadu has been under allegedly active development since the late 1960s, making it the most long-lived single project in the history of computer programming, and by far history’s most legendary piece of vaporware. As of this writing, the sum total of all those years of work is a set of web pages written in Nelson’s inimitable declamatory style, littered with angry screeds against the World Wide Web, along with some online samples that either don’t work quite right or are simply too paradigm-shattering for my poor mind to grasp.

In my own years on this planet, I’ve come to reserve my greatest respect for people who finish things, a judgment which perhaps makes me less than the ideal critic of Ted Nelson’s work. Nevertheless, even I can recognize that Nelson deserves huge credit for transporting Bush’s ideas to their natural habitat of digital computers, for inventing the term “hypertext,” for defining an approach to links (or “zips”) in a digital space, and, last but far from least, for making the crucial leap from Vannevar Bush’s concept of the single-user memex machine to an interconnected global network of hyperlinks.

But of course ideas, of which both Bush and Nelson had so many, are not finished implementations. During the 1960s, 1970s, and early 1980s, there were various efforts — in addition, that is, to the quixotic effort that was Xanadu — to wrestle at least some of the concepts put forward by these two visionaries into concrete existence. Yet it wouldn’t be until 1987 that a corporation with real financial resources and real commercial savvy would at last place a reasonably complete implementation of hypertext before the public. And it all started with a frustrated programmer looking for a project.

Steve Jobs and Bill Atkinson

Had he never had anything to do with hypertext, Bill Atkinson’s place in the history of computing would still be assured. Coming to Apple Computer in 1978, when the company was only about eighteen months removed from that famous Cupertino garage, Atkinson was instrumental in convincing Steve Jobs to visit the Xerox Palo Alto Research Center, thereby setting in motion the chain of events that would lead to the Macintosh. A brilliant programmer by anybody’s measure, he eventually wound up on the Lisa team. He wrote the routines to draw pixels onto the Lisa’s screen — routines on which, what with the Lisa being a fundamentally graphical machine whose every display was bitmapped, every other program depended. Jobs was so impressed by Atkinson’s work on what he named LisaGraf that he recruited him to port his routines over to the nascent Macintosh. Atkinson’s routines, now dubbed QuickDraw, would remain at the core of MacOS for the next fifteen years. But Atkinson’s contribution to the Mac went yet further: after QuickDraw, he proceeded to design and program MacPaint, one of the two applications included with the finished machine, and one that’s still justifiably regarded as a little marvel of intuitive user-interface design.

Atkinson’s work on the Mac was so essential to the machine’s success that shortly after its release he became just the fourth person to be named an Apple Fellow — an honor that carried with it, implicitly if not explicitly, a degree of autonomy for the recipient in the choosing of future projects. The first project that Atkinson chose for himself was something he called the Magic Slate, based on a gadget called the Dynabook that had been proposed years earlier by Xerox PARC alum (and Atkinson’s fellow Apple Fellow) Alan Kay: a small, thin, inexpensive handheld computer controlled via a touch screen. It was, as anyone who has ever seen an iPhone or iPad will attest, a prescient project indeed, but also one that simply wasn’t realizable using mid-1980s computer technology. Having been convinced of this at last by his skeptical managers after some months of flailing, Atkinson wondered if he might not be able to create the next best thing in the form of a sort of software version of the Magic Slate, running on the Macintosh desktop.

In a way, the Magic Slate had always had as much to do with the ideas of Bush and Nelson as it did with those of Kay. Atkinson had envisioned its interface as a network of “pages” which the user navigated among by tapping links therein — a hypertext in its own right. Now he transported the same concept to the Macintosh desktop, whilst making his metaphorical pages into metaphorical stacks of index cards. He called the end result, the product of many months of design and programming, “Wildcard.” Later, when the trademark “Wildcard” proved to be tied up by another company, it turned into “HyperCard” — a much better name anyway in my book.

By the time he had HyperCard in some sort of reasonably usable shape, Atkinson was all but convinced that he would have to either sell the thing to some outside software publisher or start his own company to market it. With Steve Jobs now long gone and with him much of the old Jobsian spirit of changing the world through better computing, Apple was heavily focused on turning the Macintosh into a practical business machine. The new, more sober mood in Cupertino — not to mention Apple’s more buttoned-down public image — would seem to indicate that they were hardly up for another wide-eyed “revolutionary” product. It was Alan Kay, still kicking around Cupertino puttering with this and that, who convinced Atkinson to give CEO John Sculley a chance before he took HyperCard elsewhere. Kay brokered a meeting between Sculley and Atkinson, in which the latter was able to personally demonstrate to the former what he’d been working on all these months. Much to Atkinson’s surprise, Sculley loved HyperCard. Apparently at least some of the old Jobsian fervor was still alive and well after all inside Apple’s executive suite.

At its most basic, a HyperCard stack to modern eyes resembles nothing so much as a PowerPoint presentation, albeit one which can be navigated non-linearly by tapping links on the slides themselves. Just as in PowerPoint, the HyperCard designer could drag and drop various forms of media onto a card. Taken even at this fairly superficial level, HyperCard was already a full-fledged hypertext-authoring (and hypertext-reading) tool — by no means the first specimen of its kind, but the first with the requisite combination of friendliness, practicality, and attractiveness to make it an appealing environment for the everyday computer user. One of Atkinson’s favorite early demo stacks had many cards with pictures of people wearing hats. If you clicked on a hat, you were sent to another card showing someone else wearing a hat. Ditto for other articles of fashion. It may sound banal, but this really was revolutionary, organization by association in action. Indeed, one might say that HyperCard was Vannevar Bush’s memex, fully realized at last.

But the system showed itself to have much, much more to offer when the author started to dig into HyperTalk, the included scripting language. All sorts of logic, simple or complex, could be accomplished by linking scripts to clicks on the surface of the cards. At this level, HyperCard became an almost magical tool for some types of game development, as we’ll see in future articles. It was also a natural fit for many other applications: information kiosks, interactive tutorials, educational software, expert systems, reference libraries, etc.
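The division of labor HyperCard settled on — cards as pages, buttons as links, scripts as logic attached to those buttons — can be captured in a toy model. The following Python is an illustration of the general shape only (it is not HyperTalk, and the card and button names are invented for the example):

```python
# Toy model of a HyperCard-like stack: cards hold named buttons, and
# each button carries a script (here a Python callable) that typically
# navigates to another card -- links and logic on the same surface.

class Stack:
    def __init__(self):
        self.cards = {}     # card name -> {button label: handler}
        self.current = None

    def add_card(self, name, buttons=None):
        self.cards[name] = buttons or {}
        if self.current is None:
            self.current = name   # the first card added is shown first

    def go(self, name):
        """Jump to another card, like HyperTalk's 'go to card'."""
        self.current = name

    def click(self, label):
        """Run the script attached to a button on the current card."""
        self.cards[self.current][label](self)

stack = Stack()
stack.add_card("hats", {"fedora": lambda s: s.go("more-hats")})
stack.add_card("more-hats", {"back": lambda s: s.go("hats")})
stack.click("fedora")
print(stack.current)  # more-hats
```

A plain link is just a script whose only action is a jump; the step up to “an almost magical tool for some types of game development” comes from the fact that the same handler can do arbitrary computation before deciding where to go.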

HyperCard in action

HyperCard in action

John Sculley himself premiered HyperCard at the August 1987 MacWorld show. Showing unusual largess in his determination to get HyperCard into the hands of as many people as possible as quickly as possible, he announced that henceforward all new Macs would ship with a free copy of the system, while existing owners could buy copies for their machines for just $49. He called HyperCard the most important product Apple had released during his tenure there. Considering that Sculley had also been present for the launch of the original Macintosh, this was certainly saying something. And yet he wasn’t clearly in the wrong either. As important as the Macintosh, the realization in practical commercial form of the computer-interface paradigms pioneered at Xerox PARC during the 1970s, has been to our digital lives of today, the concept of associative indexing — hyperlinking — has proved at least as significant. But then, the two do go together like strawberries and cream, the point-and-click paradigm providing the perfect way to intuitively navigate through a labyrinth of hyperlinks. It was no coincidence that an enjoyable implementation of hypertext appeared first on the Macintosh; the latter almost seemed a prerequisite for the former.

The full revolutionary nature of the concept of hypertext was far from easy to get across in advertising copy, but Apple gave it a surprisingly good go, paying due homage to Vannevar Bush in the process.

In the wake of that MacWorld presentation, a towering tide of HyperCard hype rolled from one side of the computer industry to the other, out into the mainstream media, and then back again, over and over. Hypertext’s time had finally come. In 1985, it was an esoteric fringe concept known only to academics and a handful of hackers, being treated at real length and depth in print only in Ted Nelson’s own sprawling, well-nigh impenetrable tomes. Four years later, every bookstore in the land sported a shelf positively groaning with trendy paperbacks advertising hypertext this and hypertext that. By then the curmudgeons had also begun to come out in force, always a sure sign that an idea has truly reached critical mass. Presentations showed up in conference catalogs with snarky titles like “Hypertext: Will It Cook Me Breakfast Too?”

The curmudgeons had plenty of rabid enthusiasm to push back against. HyperCard, even more so than the Macintosh itself, had a way of turning the most sober-minded computing veterans into starry-eyed fanatics. Jan Lewis, a longtime business-computing analyst, declared that “HyperCard is going to revolutionize the way computing is done, and possibly the way human thought is done.” Throwing caution to the wind, she abandoned her post at InfoWorld to found HyperAge, the first magazine dedicated to the revolution. “There’s a tremendous demand,” she said. “If you look at the online services, the bulletin boards, the various ad hoc meetings, user groups — there is literally a HyperCulture developing, almost a cult.” To judge from her own impassioned statements, she should know. She recruited Ted Nelson himself — one of the HyperCard holy trinity of Bush, Nelson, and Atkinson — to write a monthly column.

HyperCard effectively amounted to an entirely new computing platform that just happened to run atop the older platform that was the Macintosh. As Lewis noted, user-created HyperCard stacks — this new platform’s word for “programs” or “software” — were soon being traded all over the telecommunications networks. The first commercial publisher to jump into the HyperCard game was, somewhat surprisingly, Mediagenic.1 Bruce Davis, Mediagenic’s CEO, has hardly gone down in history as a paragon of progressive thought in the realms of computer games and software in general, but he defied his modern reputation in this one area at least by pushing quickly and aggressively into “stackware.” One of the first examples of same that Mediagenic published was Focal Point, a collection of business and personal-productivity tools written by one Danny Goodman, who was soon to publish a massive bible called The Complete HyperCard Handbook, thus securing for himself the mantle of the new ecosystem’s go-to programming guru. Focal Point was a fine demonstration that just about any sort of software could be created by the sufficiently motivated HyperCard programmer. But it was another early Mediagenic release, City to City, that was more indicative of the system’s real potential. It was a travel guide to most major American cities — an effortlessly browsable and searchable guide to “the best food, lodgings, and other necessities” to be found in each of the metropolises in its database.

City to City

Other publishers — large, small, and just starting out — followed Mediagenic’s lead, releasing a bevy of fascinating products. The people behind The Whole Earth Catalog — themselves the inspiration for Ted Nelson’s efforts in self-publication — converted their current edition into a HyperCard stack filling a staggering 80 floppy disks. A tiny company called Voyager combined HyperCard with a laser-disc player — a very common combination among ambitious early HyperCard developers — to offer an interactive version of the National Gallery of Art which could be explored using such associative search terms as “Impressionist landscapes with boats.” Culture 1.0 let you explore its namesake through “3700 years of Western history — over 200 graphics, 2000 hypertext links, and 90 essays covering topics from the Black Plague to Impressionism,” all on just 7 floppy disks. Mission: The Moon, from the newly launched interactive arm of ABC News, gathered together details of every single Mercury, Gemini, and Apollo mission, including videos of each mission hosted on a companion laser disc. A professor of music converted his entire Music Appreciation 101 course into a stack. The American Heritage Dictionary appeared as stackware. And lots of what we might call “middlestackware” appeared to help budding programmers with their own creations: HyperComposer for writing music in HyperCard, Take One for adding animations to cards.

Just two factors were missing from HyperCard to allow hypertext to reach its full potential. One was a storage medium capable of holding lots of data, to allow for truly rich multimedia experiences, combining the lavish amounts of video, still pictures, music, sound, and of course text that the system clearly cried out for. Thankfully, that problem was about to be remedied via a new technology which we’ll be examining in my very next article.

The other problem was a little thornier, and would take a little longer to solve. For all its wonders, a HyperCard stack was still confined to the single Macintosh on which it ran; there was no provision for linking between stacks running on entirely separate computers. In other words, one might think of a HyperCard stack as equivalent to a single web site running locally off a single computer’s hard drive, without the ability to field external links alongside its internal links. Thus the really key component of Ted Nelson’s Xanadu dream, that of a networked hypertext environment potentially spanning the entire globe, remained unrealized. In 1990, Bill Nisen, the developer of a hypertext system called Guide that slightly predated HyperCard but wasn’t as practical or usable, stated the problem thus:

The one thing that is precluding the wide acceptance of hypertext and hypermedia is adequate broadcast mechanisms. We need to find ways in which we can broadcast the results of hypermedia authoring. We’re looking to in the future the ubiquitous availability of local-area networks and low-cost digital-transmission facilities. Once we can put the results of this authoring into the hands of more users, we’re going to see this industry really explode.

Already at the time Nisen made that statement, a researcher in Britain named Tim Berners-Lee had started to experiment with something he called the Hypertext Transfer Protocol. The first real web site, the beginning of the World Wide Web, would go online in 1991. It would take a few more years even from that point, but a shared hypertextual space of a scope and scale the likes of which few could imagine was on the way. The world already had its memex in the form of HyperCard. Now — and although this equivalency would scandalize Ted Nelson — it was about to get its Xanadu.

Associative indexing permeates our lives so thoroughly today that, as with so many truly fundamental paradigm shifts, the full scope of the change it has wrought can be difficult to fully appreciate. A century ago, education was still largely an exercise in retention: names, dates, Latin verb cognates. Today’s educational institutions — at least the more enlightened ones — recognize that it’s more important to teach their pupils how to think than it is to fill their heads with facts; facts, after all, are now cheap and easy to acquire when you need them. That such a revolution in the way we think about thought happened in just a couple of decades strikes me as incredible. That I happened to be present to witness it strikes me as amazing.

What I’ve witnessed has been a revolution in humanity’s relationship to information itself that’s every bit as significant as any political revolution in history. Some Singularity proponents will tell you that it marks the first step on the road to a vast worldwide consciousness. But even if you choose not to go that far, the ideas of Vannevar Bush and Ted Nelson are still with you every time you bring up Google. We live in a world in which much of the sum total of human knowledge is available over an electronic connection found in almost every modern home. This is wondrous. Yet what’s still more wondrous is the way that we can find almost any obscure fact, passage, opinion, or idea we like from within that mass, thanks to selection by association. Mama, we’re all cyborgs now.

(Sources: the books Hackers: Heroes of the Computer Revolution and Insanely Great: The Life and Times of the Macintosh, the Computer That Changed Everything by Steven Levy; Computer Lib/Dream Machines and Literary Machines by Ted Nelson; From Memex to Hypertext: Vannevar Bush and the Mind’s Machine, edited by James M. Nyce and Paul Kahn; The New Media Reader, edited by Noah Wardrip-Fruin and Nick Montfort; Multimedia and Hypertext: The Internet and Beyond by Jakob Nielsen; The Making of the Atomic Bomb by Richard Rhodes. Also the June 1995 Wired magazine profile of Ted Nelson; Andy Hertzfeld’s website Folklore; and the Computer Chronicles television episodes entitled “HyperCard,” “MacWorld Special 1988,” “HyperCard Update,” and “Hypertext.”)

  1. Mediagenic was known as Activision until mid-1988. To avoid confusion, I just stick with the name “Mediagenic” in this article. 

September 21, 2016


An Epic Adventure Concludes

September 21, 2016 11:01 PM

Four years.

Four games.

One and a half million words.

Thirty-five thousand choices.


The mood in the inkle office today is one of celebration, and sadness.

The Sorcery! series is now complete.

The Fellowship

If you've enjoyed the adventure with us, here are a few people you might want to thank outside of the core inkle team:

Eddie Sharam - character artist. Eddie's done all the humans, monsters and re-animated furniture that appear in the series; sometimes drawing from John Blanche's crazy original illustrations, and sometimes using a little craziness of his own.

Laurence Chapman - composer. We now have original themes for Parts 2, 3 and 4, recorded by live orchestra, and we think they're incredible.

Emily Morganti - publicist. When we started work on part one we had three press emails in a spreadsheet and were planning to stand in a corner of PAX with a banner. Coverage is the oxygen of development and yet most developers have no idea how to do it. Emily sorted us right out.

Iain Merrick - coder. Iain started out in the Sorcery! team doing the audio design for the first part (the infamous walrus laugh that plays in the villages of the Shamutanti Hills was his idea). Since then he became rather more full-time, handling the cross-platform magic that takes iOS code and generates the Android and desktop builds.

Graham Robertson - writer. Graham wrote a lot of the core content for Part 4, adapting the original material, and adding a lot of strange new ideas of his own. Graham likes cats, ghosts, and hungry Goblins - you'll know when you stumble on some of his work.

Mike Schley - cartographer. Mike's maps and building drawings are the heart and soul of the Sorcery! experience. We've loved his work from the moment we first saw it, and the most exciting moments during development have always been the delivery of a new piece of art.

And last but certainly not least, Steve Jackson himself. After creating the Fighting Fantasy series with fellow dungeon-master Ian Livingstone, Steve wrote the Sorcery! series as a standalone set of four books, with an innovative magic system, and aimed at slightly older readers.

Set in a previously unexplored corner of the FF world, they told their own story, and each book has its own tricks and devices. Speaking personally, I can still remember the feeling of being lost in the pages; and the thrill of casting the right spell at the right time (because in those days, you were supposed to learn the spellbook by heart!)

Steve created a vibrant, rich, nasty-but-nice world filled with potential; and then gave us the freedom we needed to play with it.


Then there are our testers. Fifty people played the adventure prior to launch, and of them, five were simply extraordinary. Kathryn, Felicity, John, Nikki and Jules - you guys have been amazing.

A long journey

The first Sorcery! game saw us taking our first steps as a company as not just technical developers, but gameplay and content designers as well: now, four years later, we're a BAFTA-nominated studio, working on our own game-worlds.

Where the first game saw us pushing our ink compilation script to breaking point, we're now fully migrated over to ink, and are enjoying seeing studios all around the world using the system as well.

The reaction to the series has been overwhelming to say the least - from the glowing reviews of Part 1 that appeared on mainstream gaming sites like IGN and Kotaku, through the wonderful Yogscast rendition of the first two games, to the antics at our preview party in London last week - it's been a real thrill to be seeing people getting excited about the series.

So what's left?

We've one more piece of business to do in the world of Sorcery! - a final update to the series, coming shortly. We'll have more information on that later, but if you've finished Sorcery! 4 already you'll know what we have in store.

And, of course, there is one last thing...


But for now - sit back, relax, and get ready, for death awaits you in the Fortress of Sorcerers...

Renga in Blue

Haunted House: Finished

by Jason Dyer at September 21, 2016 10:00 PM

Last I visited Haunted House I thought I was done playing. Fate decided otherwise.

Before I go on, I want to preface: this game was written with *very* tight requirements. The TRS-80 was originally released with only 4K of memory space, and while the base model was swiftly upgraded to 16K it appears Radio Shack wanted Haunted House to be playable on any of their systems, including the lowest-end models.

Hence, the entirety of this game fits on two 4K cassette tapes, and not as a single 8K program; each cassette is a self-contained part of the game. For reference, Adventureland (which is legendary for extremely tight space requirements) uses the entire luxurious 16K of the newer model (that is, four times the size).

So in a way Haunted House is an impossibility, a marvel. It is still a deeply bad game.

We left off on holding a bucket of water, with no apparent way to apply it to a fire.


You can “pour bucket” but it just pours water on the ground and refills. Would you suspect a bucket of endless water is a useless red herring? (Well, maybe Joseph Nuccio would.)


I want to stop for a moment and emphasize that you can walk through the fire without carrying the bucket of water. The bucket is entirely unnecessary, and its entire existence seems to be very specifically engineered to force players into an intentionally impossible game of guess-the-verb. Perhaps this doesn’t sound so frustrating with me just describing it, but I assure you that in terms of actual gameplay this is possibly the worst maneuver I’ve ever seen. There is an analogous part in Crowther and Woods’ Adventure, but that at least has the saving grace of not offering an item that seems like a completely logical solution.

In any case, the part with climbing the rope which takes you to the second floor swaps you to “Tape 2.” (The version I was playing has the tapes merged so a tape swap is unnecessary.) The code on Tape 2 is entirely self-contained to the extent that some verbs that work on the first floor don’t work on the second floor, and vice versa.

To continue, I took the magic sword and went wandering:


Given your original inventory is all gone, and the verb set is even more limited than the first floor, the only option is to kill them all (“YOUR MAGIC SWORD ENABLES YOU TO KILL THE GHOST!”).

After slaying the ghosts, there’s another ghost, a … superghost of sorts?


It won’t let you just pass by either. With only TAKE, DROP, direction commands, and KILL at your disposal, what to do?

Well, obviously, go off to another room and drop off the sword. (In another context, this might have been kind of neat, but here it is just random.)

This is followed by a “maze” of sorts with a bunch of identical ghost rooms, exploiting the fact that going in a direction just repeats the room description, beating out stiff competition for the award for Least Verisimilitude in Any Maze Ever.


Eventually, going south gets to a room with a sign.


Let’s just summarize:

  1. There are three exits: east, west, and south. Two of them will kill you. There is no hint as to which one.
  2. If you ignore the sign by, say, wandering around the maze too fast, you will die, because you have to read the sign in order to live (even if you went through the correct exit).
  3. If you carry the sign with you after reading it you will also die (even if you went through the correct exit).
  4. Dying for any of the reasons above requires a reset of the second floor. I am dearly hoping it didn’t require reloading the cassette.

Somehow I don’t feel bad about spoiling the end.


Of course, a game like this deserves a seriously impressive Amiga remake (thanks, Sean Murphy!).

Feel free to share any personal stories you have about this game in the comments. The back cover claims it is fun for the entire family. When is the last time you’ve played something that’s done that?

Emily Short

Bowls of Oatmeal and Text Generation

by Emily Short at September 21, 2016 05:00 PM

In a comment on my recent post on text generation, Dryman wrote

I was wondering if you’d been following some of the recent games criticism discussing where procedural content has ultimately failed to be very interesting or engaging (in games such as Spore or No Man’s Sky), and might have some general thoughts about how procedurally generated text content can potentially be made to resonate more strongly than was the case for the largely graphical reskins in those games. Much of the discussion has focused on what Kate Compton has called the problem of “procedural oatmeal” – i.e. it is very easy to pour 10,000 bowls of plain oatmeal, with each oat being in a different position and each bowl essentially unique, but it is very hard to make these differences *matter* to an audience, and be perceived as truly different in any memorable or thought-provoking way.

(“Some of the recent games criticism” also includes this article from Mike Cook on changing how we talk about procedural generation.)

In response to which, I partly want to point to Bruno Dias’ recent Roguelike Celebration talk, which addresses exactly this point: that procedural generation is not a way to get large amounts of content from small amounts of input, and that procedural generation requires an appropriate design approach.

But to speak for myself: I think the key question in oatmeal-avoidance is whether the generation is connected to anything mechanical. I might be able to generate haiku, or funny food names, or imaginary constellations, or names of funny English-sounding towns, but all that generation is purely decorative unless it is tightly correlated with gameplay — and the player will soon realize that it is decorative and start looking past it.

But what kind of mechanical connection is required? It would be a lot to ask that every procedurally generated variant needed to correspond to different gameplay affordances — that procgen is only interesting if having 10K output states means 10K different verbs or verb clusters or play strategies that could be unlocked.

That’s a very demanding design problem. It’s not completely inconceivable to have elements that work together to generate fresh mechanics: Dominion‘s longevity comes from the fact that the cards play off one another in really interesting ways and make each other useful (or not useful) quite quickly: Peddler might be pointless in a game without +Action and +Buy cards, but in a game with a good action/buy engine and, say, Bishop, it might become a massive source of victory points. There are entire blogs devoted to pointing out productive Dominion card combos and strategies. But that’s a challenge in mechanics design, and if every application of procgen text had to be that mechanically rich, we wouldn’t need it very often.

But there are other applications as well, happily. In The Mary Jane of Tomorrow I wanted pretty much everything the robot said to serve as a reminder of her current training state: it acts as both status information and as a perceivable consequence of what the player has done with her. Show her a book about cowgirls, and she’ll start speaking in a funny accent. Teach her about botany, and suddenly she’ll get really specific about the types of apple she can bake in an apple pie.

Procgen methods are great for this kind of low-level, layered consequence: representing how the player has changed the world state persistently in a small way, rather than massively in a way that forked the narrative.

At the same time, ideally, you get some results that aren’t exactly what the player might have predicted or anticipated. The robot making up haiku with botanical names and a cowgirl accent is juicier, richer, and more appealing than a status display on the side of her neck that says

Cowgirl: ON
Botany: ON
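A minimal sketch of what that kind of state-conditioned generation might look like in code. Everything here is a hypothetical illustration of the idea, not the game's actual implementation: each training flag contributes its own transformation, so every line the character speaks doubles as a readout of her current state.

```python
# Hypothetical sketch: training flags shape every generated line,
# so dialogue itself serves as the status display.

LEXICON = {
    # Topic training swaps generic nouns for specific ones.
    "botany": {"an apple": "a Northern Spy apple"},
}

ACCENTS = {
    # A crude "cowgirl" drawl, applied after the vocabulary pass.
    "cowgirl": lambda s: s.replace("you", "y'all"),
}

def speak(template: str, state: set) -> str:
    line = template
    for topic in sorted(state & LEXICON.keys()):
        for generic, specific in LEXICON[topic].items():
            line = line.replace(generic, specific)
    for accent in sorted(state & ACCENTS.keys()):
        line = ACCENTS[accent](line)
    return line
```

Calling `speak("Shall I bake you an apple pie?", {"cowgirl", "botany"})` produces a single line that reveals both flags at once, where a bare status panel would just list them.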

Or to look at this another way: I spend a lot of time, in my projects, coming up with little jokes and surprises and colorful images to reward the player for trying a particular thing. Even when I’m writing for Fallen London, I really like to come up with callbacks: one of the most fun things about writing The Frequently Deceased was inventing ludicrously unsuitable ways for small children to interact with the player’s inventory items. (You can pick up a very, very large collection of child-inappropriate possessions in Fallen London.) Procedural methods let me create some of those experiences dynamically.

And then there’s the specific task of making dialogue feel highly responsive to the context in which it appears. A lot of my own interest in procgen text comes out of the desire to combine authored, narrative-centric dialogue with ways of expressing relationships, social moves, and emotions. So I’m less casting about for what the model should be, and more trying to figure out how to write the generator to produce the level of richness I want.

Still, at its best, I think procedural generation, textual or otherwise, can be used to reinforce that sense of agency and presence in the world, rather than to make it feel as though nothing matters. But it isn’t a substitute for designing content. It’s a way of designing content — one that is often at least as labor-intensive as other ways, and that also demands a strong capacity for abstraction and the ability to characterize one’s aesthetic goals.



Stuff About Stuff

Spellchecking your WIP: instructions, and a question

by Andrew ( at September 21, 2016 12:42 PM

I've realized that spell checking can be tricky for a tester on the other side, and bad typos can also trip them up. So someone I was testing for, for IFComp, asked how I was able to do it for their Twine game. This may seem late in the game for such a post, but I think it will give high value for the time it takes. So, have away!
I have Windows, so I run Notepad++, which is a really awesome app if you don't have it. Worth the install time, and not just for this spell check.


  1. Right-click to get the source of the Twine game, then ctrl-a, copy and paste into a document in Notepad++.
  2. Do ctrl-f to find, then click the "regular expression" radio button.
  3. For what to find, type <[^>]*> (this translates to <(bunch of HTML tag stuff)> but matches nothing outside the tags).
  4. For what to replace, type \n
  5. Replace all.
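If you'd rather script that stripping step (or aren't on Windows), the same find-and-replace takes a few lines of Python. This is just a sketch mirroring the regex above, not part of the Notepad++ workflow:

```python
import re

def strip_tags(source: str) -> str:
    # Mirror the Notepad++ replace: every <...> tag becomes a newline,
    # leaving only the text outside the tags for the spell checker.
    return re.sub(r"<[^>]*>", "\n", source)
```

Feed the result to whatever spell checker you like; a command-line tool such as aspell reads plain text fine.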

Then I ran the native spell checker in Notepad++: ctrl-shift-alt-s.

You may also have to define c:\program files (x86)\aspell\en.prepl as not read-only in Windows Explorer.

INFORM 7: was a BIG help for me.

Ctrl+F --> go to the Mark tab --> toggle "Bookmark line" --> click Mark All. (You want to search for say ".)
Select menu Search --> Bookmark --> Copy Bookmarked Lines.

Then you can replace say " with nothing, and (still in regular-expression mode) \[[^\]]*\] with \n, and then spell check as before.
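The same bookmark-and-replace sequence can also be scripted; here's a Python sketch equivalent in spirit to the steps above (not an official tool, and it assumes the Inform source has been saved to a plain-text file):

```python
import re

def extract_say_text(source: str) -> str:
    # Keep only lines containing a say "..." phrase, then strip the quotes
    # and replace [bracketed substitutions] with newlines, as in the
    # Notepad++ steps, so only spell-checkable prose remains.
    lines = [ln for ln in source.splitlines() if 'say "' in ln]
    text = "\n".join(lines).replace('say "', "").replace('"', "")
    return re.sub(r"\[[^\]]*\]", "\n", text)
```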

If anyone knows a good Mac text editor this works with, that'd be a big help.

September 20, 2016

Emily Short

Short, Friendly Parser Puzzle Games

by Emily Short at September 20, 2016 09:00 PM

From time to time I post lists of games that do particular things. This time the criteria are: the game is a relatively short, not overwhelmingly difficult parser piece, which should be playable in a couple of hours (and often less); it has definite puzzles, a game-like arc, and a win state; and it’s old enough, new enough, or under-discussed enough that you might not have already heard of it.

I almost put Oppositely Opal in here, as that is just the kind of game I’m talking about, but its healthy batch of XYZZY nominations mean you probably know about it already.

Reference and Representation: An Approach to First-Order Semantics. This is a fairly new Ryan Veeder game, and it reveals its Veederishness first by using a title that fakes you out into thinking you are about to download someone’s thesis. It is, in fact, an entertaining short puzzle game about being an early human, someone who doesn’t yet understand the concepts of language and symbol. It is not just a game with a protagonist who knows less than the player; it is actually exploring how we understand what we see when we see it, and models the transformation of the protagonist’s understanding. If you like the idea of cave man communication as game, you might also want to check out The Edifice.

Starry Seeksorrow (Caleb Wilson). From last year’s ShuffleComp. The protagonist is a magical doll that comes to life when necessary to protect the main character: this is gentle fantasy with a few hints of something darker behind the scenes.

Ka (Dan Efran). An escape game themed around the Egyptian afterlife, in which you have to perform rituals in order to make progress as a soul. It’s solemn and dreamy, and sometimes a bit reminiscent of Zarf’s work: a landscape full of partially metaphorical objects, an absence of other people or the pressure of time.

Fragile Shells (Stephen Granade). Escape from a damaged orbiting space station. Granade is a physicist who has worked extensively with NASA and on communicating scientific concepts to a general audience; Fragile Shells presents a realistic, near-future setting, in contrast with a lot of space games. Speaking of which:

Piracy 2.0 (Sean Huxter). Your spaceship is attacked by pirates; can you get out and save yourself? This one is particularly rich in alternate solutions and story outcomes, and is longer than most of the others on this page, while still being roughly the length for IF Comp. I really enjoyed it at the time, but it hasn’t been discussed as much afterwards as I might have expected, especially given the rich array of possible outcomes the story provides.

Tex Bonaventure and the Temple of the Water of Life (Truthcraze). As the name implies, this is an Indiana Jones-style adventure with a couple of unexpected puzzles. The comp version had a few tricky moments, but I generally enjoyed it.

If you like the archaeology angle but don’t want to spend the whole game on that, The Beetmonger’s Journal (Scott Starkey) has an archaeological frame story and some fun experimentation with narrative and viewpoint.

Sparkle (Juhana Leinonen) offers mystical, transformative magic puzzles, from the original ShuffleComp. The internal logic of those puzzles is a bit silly, but the game clues them well enough to make it all work.


Looking for something longer? Here’s a list of substantial, high-quality, but underplayed large parser games.

Tagged: starry seeksorrow, tex bonaventure and the water of life

September 18, 2016

Wade's Important Astrolab

Leadlight and a new OS for the Apple II

by Wade ( at September 18, 2016 06:22 AM

John Brooks, best known in Apple circles for programming the amazing Apple IIGS version of fantasy platformer Rastan in 1990, recently released an unofficial update to ProDOS 8, the OS used by 8-bit Apple II computers. The last official version was 2.0.3, released by Apple in 1993, so that's 23 years between lunches. Brooks's 2.0.4 release includes improvements for almost the whole range of Apple IIs, both the 8-bit ones and the 16-bit Apple IIGS.

One thing about ProDOS 2.x in general is that it never ran on the oldest Apple II models: the original Apple II, the Apple II+ and the unenhanced Apple IIe. You needed an enhanced Apple IIe, an Apple IIc or an Apple IIGS. Until now, anyway. Brooks's 2.0.4 lets you run ProDOS 2.x on the earlier IIs.

When I programmed Leadlight back in 2009–2010, I had to assess what the minimum hardware requirement for it would be. The game uses ProDOS 2.0.3 and lowercase characters, so I stated that the minimum would be an enhanced Apple IIe. Now the game could potentially run on older machines under ProDOS 2.0.4, so long as they've been upgraded to 64KB of RAM and have 80-column cards in them (giving lowercase capability).

I don't think I'll be racing to implement this possible OS change. Leadlight is at a very stable place now and content-synced between the Apple II and Inform versions, but there is a glimmer of appeal in the idea of tweaking it to try to get it to work on even more limited hardware without taking anything out of it.

* You can buy Leadlight Gamma for modern devices or get the original Apple II version for free at

These Heterogenous Tasks


by Sam Kabo Ashwell at September 18, 2016 01:01 AM

ANCIENT MYSTERIES OF IF COMP is my attempt, in the run-up to the 2016 IF Competition, to go back over Comp entries which I missed the first time around.

Nolan Bonvouloir’s The Primrose Path was released in the 2006 IF Competition, taking a rather distant second place to Emily Short’s Floatpoint. It was nominated for two XYZZY Awards – Best Game and Best Individual PC. It’s the kind of game which has a pretty decent reputation, but doesn’t show up much on IFDB recommendation lists or in discussion.

It was also an Inform 7 game released the same year that I7 was released as a public beta; for a first-time IF author with “more or less nonexistent” programming experience to pick up the somewhat-immature I7, learn to code in it, and produce a game that placed second in the Comp within five months is a pretty amazing accomplishment.

If I was looking for a good counterpart to The Primrose Path… tonally, I might go with Eurydice. But it’s also firmly a member of the time-travel tangle genre, alongside works like All Things Devours, First Things First, Fifteen Minutes and Meanwhile.

The writing feels very grounded, with a matter-of-fact tone. This does an excellent job of maintaining a magical-realism feeling; you always feel that Matilda is a real person, rather than a fantasy trope or a player avatar. But this is a game which is impossible to talk about purely as literature, because its plot mechanics attempt some technically ambitious things – “technically” not in the sense that it’s doing anything outlandishly unprecedented, or inherently difficult to code, but in terms of very game-specific elements of craft: the care and management of player comprehension when you can’t entirely control which bits of a very open plot they’ve encountered. I came away with the idea that it doesn’t come up in discussion much because nobody feels confident enough to bring it up.

The Primrose Path is almost an amazing game, and contains multitudes, but it stumbles badly over a big, standard challenge: making sure the player understands what’s going on enough to appreciate it. I think it’s aiming for a Diana Wynne Jones-style sensation of being at the edge of a vast, powerful river of events, only partly grasped. But this is a very delicate balance to maintain, and Primrose Path‘s plot is sufficiently non-linear that it often relies heavily on bad assumptions about what the player already knows. There are strong suggestions that the game contains far more information and possibility than you’ll find by taking the walkthrough – that it wants to be a game you dig deep into to wangle out secrets – but veering away from the walkthrough often seems to put you in impossible situations without making that clear. I recommend the ClubFloyd transcript, although it’s very long and contains a great deal of reloading saves, and thus may be rather confusing in its own right.

The game opens on a dramatic plunge into the action, and we learn things more through allusions than direct information. Even the explanations leave a lot unclarified. As I played, I always had a lot of unanswered questions in mind; initially about the backstory of the characters, but increasingly about how this whole fantastic apparatus works. The characters don’t live in the present, but in an unnamed city which has recently undergone quarantines and highly destructive riots. And all this is potentially strong stuff, as long as the audience begins to feel that they’re getting it as things move along. Personally, I found myself accumulating more confusions more rapidly than I found answers.

Early on, the fantastical stuff is the least mysterious part of the game – we all understand how ‘paintings as magical portals’ ought to work. But the time-stop / time-travel shenanigans turn out to be considerably more important than initially suspected. As you traverse the world, the regular action of the portals and the machinations of plot mess around with your inventory an awful lot, which makes for further disorientation. I’m not sure if this meant to be the kind of game which needs to have its possibilities plotted out on paper, but after two losing playthroughs and one walkthrough-heavy victory it certainly felt that way.

Time-tangle puzzles in IF are typically pretty sparse, tending towards plainer writing, less complex NPCs, not a whole lot of deep backstory. Partly this is to give the author less to worry about – Primrose Path has multiple NPCs who can be encountered in a variety of unusual situations, who act on their own agendas, and who can be talked to about a fairly broad variety of plot elements which might be in a variety of states – but partly it’s so that the player has less to track. The two interact: the difficulty in grasping all of the plot is exacerbated because it’s hard to trust that everything is functioning as it should. Primrose Path‘s past versions have had bugs which, if not breaking the game, definitely make its logic more confusing; if something unexpected happens, it’s often very difficult to be confident that it was meant to.

Let’s back up for a moment and talk about themes. There is a very traditional element that immediately presents itself: locked doors, unlocked with specific keys, colour-coded. It’s entirely conventional for adventure-style games to feature this kind of gating, and for games in general to delimit their worlds in somewhat-arbitrary ways; but because of the highly naturalistic feel of the writing, this cumulatively feels more acutely confining, faintly sinister. The sliding doors to Matilda’s garden are swollen shut. The house faces onto a busy street, and Matilda doesn’t have a driver’s license any more. Matilda is a middle-aged woman, and there’s a distant sense of opportunities quietly closing down.

And then there are all these keys. Keys should open doors. The first painting you enter leads to a highly dramatic landscape, which sets up the expectation that you’re going to be going on a Cook’s Tour fantasy journey; and then, after a period in which you get keys one at a time, you find an entire key-ring, with keys of amber, jade, ruby. Jeweled keys are absolutely fantasy territory. Jeweled keys should open spectacular things. But as it turns out, they just open doors back into different areas of your house. You’re promised Narnia, and instead you get your own bathroom, and a brutal little cycle of inevitability.

At times, this confinement seems arbitrary; when time is stopped, Matilda can’t walk on water, but she can climb into the air on raindrops. (Which is the one unavoidable spoiler I already knew about the game – like, when I saw Jack and the Cuckoo-Clock Heart I thought of the ending as That Primrose Path Thing, despite having never got to it in the game.) More obviously, Matilda is physically and socially frail; she gets pushed around by other characters, and often freezes up in a crisis (except when she doesn’t).

Leo is also confined, although in ways that we only see at a remove. He comes across as deeply frustrated; he’s very much a Young Man in Turmoil – and here I start to wonder if the Shakespearean title is more than a useful phrase, if Irene is Gertrude and Leo Hamlet. Which would leave you as mad Ophelia.

Irene’s conversation, I’ve noticed, tends to be the polar opposite of her son’s: whereas Leo is so intensely present that I feel slightly on edge whenever I speak to him, Irene is so vague that you start to wonder whether she’s there at all.

But of course Leo is a prominent absence for most of the game, and the story is delivered with an Irenic vagueness. For all that Leo forms – sort of – a love interest, the game isn’t much of a romance. Partly this is an inversion: we’re used to the figure of the mysterious woman who always barely eludes the hero, but when it’s a man it just feels irritating.

In fact, I kind of got to the point where I felt that this was designed as an accretive, figure-out-how-to-win-by-losing game. There are strong hints of this in at least one losing ending:

By the time my former self comes innocently down the stairs, about to answer her doorbell, I am already deep into the woods. Perhaps without my interference she’ll be able to sort all this out better than I have.

Early on, we get hints of a triangle-of-identities thing:

>ask about me
I don’t talk about you out in the open like that . . . surely you can understand. Irene would think me insane.

>x me
But I can’t see you. Do you even have a body? Or were you trying to get me to examine myself?

And then, later on, this becomes suddenly, terrifyingly important and you don’t really understand why. The studied vagueness about certain things escalates into a defence mechanism, with Matilda refusing to follow commands (but also not doing anything independently). There are hints about this, buried fairly deep, but they’re far from conclusive. This is not a game that’s big on answers.

The struggle between Matilda and whoever ‘you’ refers to escalates to being downright uncomfortable at times:

>x shelf
Well, yes, there’s a wooden shelf here, along the far wall, but honestly, I see nothing important on it.

>search it
No, really, I don’t see anything I want to take.

>take ring
Damn it. All right, so the ring is on the shelf. But is it really necessary that I cart it about with me? I turned it down years ago; there’s really no need to go into all that again.

That was a rhetorical question.

>take ring
Taken, damn you.

There’s a suggestion, very deeply buried, that Matilda is schizophrenic; but it seems just as likely that the command-giver is something more fantastical that has just been treated as schizophrenia. The game doesn’t want you to have tidy explanations for characters: Irene is initially characterised as senile, dangerously crazy; at other times she appears dangerously shrewd, at times her vagueness seems calculated, and by the conclusion it’s still not clear whether she was the most sensible one of the trio. (The big threat that motivates her – the danger of encountering yourself through time-travel – might be a paper tiger. It’s never articulated very clearly why it’s so awful.)

So: The Primrose Path is a challenging work, both to play and to nail down. I didn’t come away from it with a sense of narrative satisfaction, and I don’t know whether I was meant to.

September 16, 2016

Choice of Games

Samurai of Hyuga Book 1 and Fatehaven on Steam

by Dan Fabulich at September 16, 2016 07:01 PM

To celebrate the launch of Samurai of Hyuga Book 2, we’re also announcing that Devon’s earlier games in our Hosted Games program, Samurai of Hyuga Book 1 and Fatehaven, are out now on Steam! They’re 25% off this week only, so buy them today!

New Hosted Game! Samurai of Hyuga Book 2 by Devon Connell

by Dan Fabulich at September 16, 2016 06:01 PM

Hosted Games has a new game for you to play!

Samurai of Hyuga Book 2

Samurai of Hyuga Book 2 is the blood-pumping sequel to the interactive tale you already know. Return to the land of silk and steel, where fantasy and reality clash and tough choices await you at every turn.

Good thing you’re still the toughest ronin around.

Become a bodyguard, a savior, or just a killer with a good excuse. Try to keep your mind intact as you travel down the path of madness, with twisted romances and drama at every turn. Love and lust, spirits and demons. What happens when you can’t tell the difference anymore?

That and so much more await you in the second book of this epic series!

  • Reclaim your role as a badass ronin, a master manslayer, and reluctant bodyguard for hire!
  • Find romance or let it find you, tainted and twisted as it may be!
  • Poetry and board games, dates and kabuki—try not to forget yourself in this unforgettable adventure!
  • Over 215,000 words of interactive fiction!

Devon developed this game using ChoiceScript, a simple programming language for writing multiple-choice interactive novels like these. Writing games with ChoiceScript is easy and fun, even for authors with no programming experience. Write your own game and Hosted Games will publish it for you, giving you a share of the revenue your game produces.

The Digital Antiquarian

Cracking Open the Mac

by Jimmy Maher at September 16, 2016 05:00 PM

The Macintosh II

The biggest problem with the Macintosh hardware was pretty obvious, which was its limited expandability. But the problem wasn’t really technical as much as philosophical, which was that we wanted to eliminate the inevitable complexity that was a consequence of hardware expandability, both for the user and the developer, by having every Macintosh be identical. It was a valid point of view, even somewhat courageous, but not very practical, because things were still changing too fast in the computer industry for it to work, driven by the relentless tides of Moore’s Law.

— original Macintosh team-member Andy Hertzfeld

Jef Raskin and Steve Jobs didn’t agree on much, but they did agree on their loathing for expansion slots. The absence of slots was one of the bedrock attributes of Raskin’s original vision for the Macintosh, the most immediately obvious difference between it and Apple’s then-current flagship product, the Apple II. In contrast to Steve Wozniak’s beloved hacker plaything, Raskin’s computer for the people would be as effortless to set up and use as a stereo, a television, or a toaster.

When Jobs took over the Macintosh project — some, including Raskin himself, would say stole it — he changed just about every detail except this one. Yet some members of the tiny team he put together, fiercely loyal to their leader and his vision of a “computer for the rest of us” though they were, were beginning to question the wisdom of this aspect of the machine by the time the Macintosh came together in its final form. It was a little hard in January of 1984 not to question the wisdom of shipping an essentially unexpandable appliance with just 128 K of memory and a single floppy-disk drive for a price of $2495. At some level, it seemed, this just wasn’t how the computer market worked.

Jobs would reply that the whole point of the Macintosh was to change how computers worked, and with them the workings of the computer market. He wasn’t entirely without concrete arguments to back up his position. One had only to glance over at the IBM clone market — always Jobs’s first choice as the antonym to the Mac — to see how chaotic a totally open platform could be. Clone users were getting all too familiar with the IRQ and memory-address conflicts that could result from plugging two cards that were determined not to play nice together into the same machine, and software developers were getting used to chasing down obscure bugs that only popped up when their programs ran on certain combinations of hardware.

Viewed in the big picture, we could actually say that Jobs was prescient in his determination to stamp out that chaos, to make every Macintosh the same as every other, to make the platform in general a thoroughly known quantity for software developers. The norm in personal computing as most people know it — whether we’re talking phones, tablets, laptops, or increasingly even desktop computers — has long since become sealed boxes of one stripe or another. But there are some important factors that make said sealed boxes a better idea now than they were back then. For one thing, the pace of hardware and software development alike has slowed enough that a new computer can remain viable just as it was when purchased for ten years or more. For another, prices have come down enough that throwing a device away and starting over with a new one isn’t so cost-prohibitive as it once was. With personal computers still exotic, expensive machines in a constant state of flux at the time of the Mac’s introduction, the computer as a sealed appliance was a vastly more problematic proposition.

Determined to do everything possible to keep users out of the Mac’s innards, Apple used Torx screws for which screwdrivers weren’t commonly available to seal it, and even threatened users with electrocution should they persist in trying to open it. The contrast with the Apple II, whose top could be popped in seconds using nothing more than a pair of hands to reveal seven tempting expansion slots, could hardly have been more striking.

It was the early adopters who spotted the potential in that first slow, under-powered Macintosh, the people who believed Jobs’s promise that the machine’s success or failure would be determined by the number who bought it in its first hundred days on the market, who bore the brunt of Apple’s decision to seal it as tightly as Fort Knox. When Apple in September of 1984 released the so-called “Fat Mac” with 512 K of memory, the quantity that in the opinion of just about everyone — including most of those at Apple not named Steve Jobs — the machine should have shipped with in the first place, owners of the original model were offered the opportunity to bring their machines to their dealers and have them retro-fitted to the new specifications for $995. This “deal” sparked considerable outrage and even a letter-writing campaign that tried to shame Apple into bettering the terms of the upgrade. Disgruntled existing owners pointed out that their total costs for a 512 K Macintosh amounted to $3490, while a Fat Mac could be bought outright by a prospective new member of the Macintosh fold for $2795. “Apple should have bent over backward for the people who supported it in the beginning,” said one of the protest’s ringleaders. “I’m never going to feel the same about Apple again.” Apple, for better or for worse never a company that was terribly susceptible to such public shaming, sent their disgruntled customers a couple of free software packages and told them to suck it up.

The Macintosh Plus

Barely fifteen months later, when Apple released the Macintosh Plus with 1 MB of memory among other advancements, the merry-go-round spun again. This time the upgrade would cost owners of the earlier models over $1000, along with lots of downtime while their machines sat in queues at their dealers. With software developers rushing to take advantage of the increased memory of each successive model, dedicated users could hardly afford to regard each successive upgrade as optional. As things stood, then, they were effectively paying a service charge of about $1000 per year just to remain a part of the Macintosh community. Owning a Mac was like owning a car that had to go into the shop for a week for a complete engine overhaul once every year. Apple, then as now, was famous for the loyalty of their users, but this was stretching even that legendary goodwill to the breaking point.

For some time voices within Apple had been mumbling that this approach simply couldn’t continue if the Macintosh was to become a serious, long-lived computing platform; Apple simply had to open the Mac up, even if that entailed making it a little more like all those hated beige IBM clones. During the first months after the launch, Steve Jobs was able to stamp out these deviations from his dogma, but as sales stalled and his relationship with John Sculley, the CEO he’d hand-picked to run the company he’d co-founded, deteriorated, the grumblers grew steadily more persistent and empowered.

The architect of one of the more startling about-faces in Apple’s corporate history would be Jean-Louis Gassée, a high-strung marketing executive newly arrived in Silicon Valley from Apple’s French subsidiary. Gassée privately — very privately in the first months after his arrival, when Jobs’s word still was law — agreed with many on Apple’s staff that the only way to achieve the dream of making the Macintosh into a standard to rival or beat the Intel/IBM/Microsoft trifecta was to open the platform. Thus he quietly encouraged a number of engineers to submit proposals on what direction they would take the platform in if given free rein. He came to favor the ideas of Mike Dhuey and Brian Berkeley, two young engineers who envisioned a machine with slots as plentiful and easily accessible as those of the Apple II or an IBM clone. Their “Little Big Mac” would be based around the 32-bit Motorola 68020 chip rather than the 16-bit 68000 of the current models, and would also sport color — another Jobsian heresy.

In May of 1985, Jobs made the mistake of trying to recruit Gassée into a rather clumsy conspiracy he was formulating to oust Sculley, with whom he was now in almost constant conflict. Rather than jump aboard the coup train, Gassée promptly blew the whistle to Sculley, precipitating an open showdown between Jobs and Sculley in which, much to Jobs’s surprise, the entirety of Apple’s board backed Sculley. Stripped of his power and exiled to a small office in a remote corner of Apple’s Cupertino campus, Jobs would soon depart amid recriminations and lawsuits to found a new venture called NeXT.

Gassée’s betrayal of Jobs’s confidence may have had a semi-altruistic motivation. Convinced that the Mac needed to open up to survive, perhaps he concluded that that would only happen if Jobs was out of the picture. Then again, perhaps it came down to a motivation as base as personal jealousy. With a penchant for leather and a love of inscrutable phraseology — “the Apple II smelled like infinity” is a typical phrase from his manifesto The Third Apple, “an invitation to voyage into a region of the mind where technology and poetry exist side by side, feeding each other” — Gassée seemed to self-consciously adopt the persona of a Gallic version of Jobs himself. But regardless, with Jobs now out of the picture Gassée was able to consolidate his own power base, taking over Jobs’s old role as leader of the Macintosh division. He went out and bought a personalized license plate for his sports car: “OPEN MAC.”

Coming some four months after Jobs’s final departure, the Mac Plus already included such signs of the changing times as a keyboard with arrow keys and a numeric keypad, anathema to Jobs’s old mouse-only orthodoxy. But much, much bigger changes were also well underway. Apple’s 1985 annual report, released in the spring of 1986, dropped a bombshell: a Mac with slots was on the way. Dhuey and Berkeley’s open Macintosh was now proceeding… well, openly.

The Macintosh II

When it debuted five months behind schedule in March of 1987, the Macintosh II was greeted as a stunning but welcome repudiation of much of what the Mac had supposedly stood for. In place of the compact all-in-one-case designs of the past, the new Mac was a big, chunky box full of empty space and empty slots — six of them altogether — with the monitor an item to be purchased separately and perched on top. Indeed, one could easily mistake the Mac II at a glance for a high-end IBM clone; its big, un-stylish case even included a cooling fan, an item that placed even higher than expansion slots and arrow keys on Steve Jobs’s old list of forbidden attributes.

Apple’s commitment to their new vision of a modular, open Macintosh was so complete that the Mac II didn’t include any on-board video at all; the buyer of the $6500 machine would still have to buy the video card of her choice separately. Apple’s own high-end video card offered display capabilities unprecedented in a personal computer: a palette of over 16 million colors, 256 of them displayable onscreen at any one time at resolutions as high as 640 × 480. And, in keeping with the philosophy behind the Mac II as a whole, the machine was ready and willing to accept a still more impressive graphics card just as soon as someone managed to make one. The Mac II actually represented colors internally using 48 bits, allowing some 281 trillion different shades. These idealized colors were then translated automatically into the closest approximations the actual display hardware could manage. This fidelity to the subtlest vagaries of color would make the Mac II the favorite of people working in many artistic and image-processing fields, especially when those aforementioned even better video cards began to hit the market in earnest. Even today no other platform can match the Mac in its persnickety attention to the details of accurate color reproduction.

Some of the Mac II’s capabilities truly were ahead of their time. Here we see a desktop extended across two monitors, each powered by its own video card.

The irony wasn’t lost on journalists or users when, just weeks after the Mac II’s debut, IBM debuted their new PS/2 line, marked by sleeker, slimmer cases and many features that would once have been placed on add-on-cards now integrated into the motherboards. While Apple was suddenly encouraging the sort of no-strings-attached hardware hacking on the Macintosh that had made their earlier Apple II so successful, IBM was trying to stamp that sort of thing out on their own heretofore open platform via their new Micro Channel Architecture, which demanded that anyone other than IBM who wanted to expand a PS/2 machine negotiate a license and pay for the privilege. “The original Mac’s lack of slots stunted its growth and forced Apple to expand the machine by offering new models,” wrote Byte. “With the Mac II, Apple — and, more importantly, third-party developers — can expand the machine radically without forcing you to buy a new computer. This is the design on which Apple plans to build its Macintosh empire.” It seemed like the whole world of personal computing was turning upside down, Apple turning into IBM and IBM turning into Apple.

The Macintosh SE

The Macintosh SE

If so, however, Apple’s empire would be a very exclusive place. By the time you’d bought a monitor, video card, hard drive, keyboard — yes, even the keyboard was a separate item — and other needful accessories, a Mac II system could rise uncomfortably close to the $10,000 mark. Those who weren’t quite flush enough to splash out that much money could still enjoy a taste of the Mac’s new spirit of openness via the simultaneously released Mac SE, which cost $3699 for a hard-drive-equipped model. The SE was a 68000-based machine that looked much like its forefathers — built-in black-and-white monitor included — but did have a single expansion slot inside its case. The single slot was a little underwhelming in comparison to the Mac II, but it was better than nothing, even if Apple did still recommend that customers take their machines to their dealers if they wanted to actually install something in it. Apple’s not-terribly-helpful advice for those needing to employ more than one expansion card was to buy an “integrated” card that combined multiple functions. If you couldn’t find a card that happened to combine exactly the functions you needed, you were presumably just out of luck.

During the final years of the 1980s, Apple would continue to release new models of the Mac II and the Mac SE, now established as the two separate Macintosh flavors. These updates enhanced the machines with such welcome goodies as 68030 processors and more memory, but, thanks to the wonders of open architecture, didn’t immediately invalidate the models that had come before. The original Mac II, for instance, could be easily upgraded from the 68020 to the 68030 just by dropping a card into one of its slots.

The Steve Jobs-less Apple, now thoroughly under the control of the more sober and pragmatic John Sculley, toned down the old visionary rhetoric in favor of a more businesslike focus. Even the engineers dutifully toed the new corporate line, at least publicly, and didn’t hesitate to denigrate Apple’s erstwhile visionary-in-chief in the process. “Steve Jobs thought that he was right and didn’t care what the market wanted,” Mike Dhuey said in an interview to accompany the Mac II’s release. “It’s like he thought everyone wanted to buy a size-nine shoe. The Mac II is specifically a market-driven machine, rather than what we wanted for ourselves. My job is to take all the market needs and make the best computer. It’s sort of like musicians — if they make music only to satisfy their own needs, they lose their audience.” Apple, everyone was trying to convey, had grown up and left all that changing-the-world business behind along with Steve Jobs. They were now as sober and serious as IBM, their machines ready to take their places as direct competitors to those of Big Blue and the clonesters.

To a rather surprising degree, the world of business computing accepted Apple and the Mac’s new persona. Through 1986, the machines to which the Macintosh was most frequently compared were the Commodore Amiga and Atari ST. In the wake of the Mac II and Mac SE, however, the Macintosh was elevated to a different plane. Now the omnipresent point of comparison was high-end IBM compatibles; the Amiga and ST, despite their architectural similarities, seldom even saw their existence acknowledged in relation to the Mac. There were some good reasons for this neglect beyond the obvious ones of pricing and parent-company rhetoric. For one, the Macintosh was always a far more polished experience for the end user than either of the other 68000-based machines. For another, Apple had enjoyed a far more positive reputation with corporate America than Commodore or Atari had even well before any of the three platforms in question had existed. Still, the nature of the latest magazine comparisons was a clear sign that Apple’s bid to move the Mac upscale was succeeding.

Whatever one thought of Apple’s new, more buttoned-down image, there was no denying that the market welcomed the open Macintosh with a matching set of open arms. Byte went so far as to call the Mac II “the most important product that Apple has released since the original Apple II,” thus elevating it to a landmark status greater even than that of the first Mac model. While history hasn’t been overly kind to that judgment, the fact remains that third-party software and hardware developers, who had heretofore been constipated by the frustrating limitations of the closed Macintosh architecture, burst out now in myriad glorious ways. “We can’t think of everything,” said an ebullient Jean-Louis Gassée. “The charm of a flexible, open product is that people who know something you don’t know will take care of it. That’s what they’re doing in the marketplace.” The biannual Macworld shows gained a reputation as the most exciting events on the industry’s calendar, the beat to which every journalist lobbied to be assigned. The January 1988 show in San Francisco, the first to reflect the full impact of Apple’s philosophical about-face, had 20,000 attendees on its first day, and could have had a lot more than that had there been a way to pack them into the exhibit hall. Annual Macintosh sales more than tripled between 1986 and 1988, with cumulative sales hitting 2 million machines in the latter year. And already fully 200,000 of the Macs out there by that point were Mac IIs, an extraordinary number really given that machine’s high price. Granted, the Macintosh had hit the 2-million mark fully three years behind the pace Steve Jobs had foreseen shortly after the original machine’s introduction. But nevertheless, it did look like at least some of the more modest of his predictions were starting to come true at last.

An Apple Watch 27 years before its time? Just one example of the extraordinary innovation of the Macintosh market was the WristMac from Ex Machina, a “personal information manager” that could be synchronized with a Mac to take the place of your appointment calendar, to-do list, and Rolodex.

While the Macintosh was never going to seriously challenge the IBM standard on the desks of corporate America when it came to commonplace business tasks like word processing and accounting, it was becoming a fixture in design departments of many stripes, and the staple platform of entire niche industries — most notably, the publishing industry, thanks to the revolutionary combination of Aldus PageMaker (or one of the many other desktop-publishing packages that followed it) and an Apple LaserWriter printer (or one of the many other laser printers that followed it). By 1989, Apple could claim about 10 percent of the business-computing market, making them the third biggest player there after IBM and Compaq — and of course the only significant player there not running a Microsoft operating system. What with Apple’s premium prices and high profit margins, third place really wasn’t so bad, especially in comparison with the moribund state of the Macintosh of just a few years before.

Steve Jobs and John Sculley in happier times.

So, the Macintosh was flying pretty high as the curtain began to come down on the 1980s. It’s instructive and more than a little ironic to contrast the conventional wisdom that accompanied that success with the conventional wisdom of today. Despite the strong counterexample of Nintendo’s exploding walled garden over in the videogame-console space, the success the Macintosh had enjoyed since Apple’s decision to open up the platform was taken as incontrovertible proof that openness in terms of software and hardware alike was the only viable model for computing’s future. In today’s world of closed iOS and Android ecosystems and computing via disposable black boxes, such an assertion sounds highly naive.

But even more striking is the shift in the perception of Steve Jobs. In the late 1980s, he was loathed even by many strident Mac fans, whilst being regarded in the business and computer-industry press and, indeed, much of the popular press in general as a dilettante, a spoiled enfant terrible whose ill-informed meddling had very nearly sunk a billion-dollar corporation. John Sculley, by contrast, was lauded as exactly the responsible grown-up Apple had needed to scrub the company of Jobs’s starry-eyed hippie meanderings and lead them into their bright businesslike present. Today popular opinion on the two men has neatly reversed itself: Sculley is seen as the unimaginative corporate wonk who mismanaged Jobs’s brilliant vision, Jobs as the greatest — or at least the coolest — computing visionary of all time. In the end, of course, the truth must lie somewhere in the middle. Sculley’s strengths tended to be Jobs’s weaknesses, and vice versa. Apple would have been far better off had the two been able to find a way to continue to work together. But, in Jobs’s case especially, that would have required a fundamental shift in who these men were.

The loss among Apple’s management of that old Jobsian spirit of zealotry, overblown and impractical though it could sometimes be, was felt keenly by the Macintosh even during these years of considerable success. Only Jean-Louis Gassée was around to try to provide a splash of the old spirit of iconoclastic idealism, and everyone had to agree in the end that he made a rather second-rate Steve Jobs. When Sculley tried on the mantle of visionary — as when he named his fluffy corporate autobiography Odyssey and subtitled it “a journey of adventure, ideas, and the future” — it never quite seemed to fit him right. The diction was always off somehow, like he was playing a Silicon Valley version of Mad Libs. “This is an adventure of passion and romance, not just progress and profit,” he told the January 1988 Macworld attendees, apparently feeling able to wax a little more poetic than usual before this audience of true believers. “Together we set a course for the world which promises to elevate the self-esteem of the individual rather than a future of subservience to impersonal institutions.” (Apple detractors might note that elevating their notoriously smug users’ self-esteem did indeed sometimes seem to be what the company was best at.)

It was hard not to feel that the Mac had lost something. Jobs had lured Sculley from Pepsi because the latter was widely regarded as a genius of consumer marketing; the Pepsi Challenge, one of the most iconic campaigns in the long history of the cola wars, had been his brainchild. And yet, even before Jobs’s acrimonious departure, Sculley, bowing to pressure from Apple’s stockholders, had oriented the Macintosh almost entirely toward taking on the faceless legions of IBM and Compaq that dominated business computing. Consumer computing was largely left to take care of itself in the form of the 8-bit Apple II line, whose final model, the technically impressive but hugely overpriced IIGS, languished with virtually no promotion. Sculley, a little out of his depth in Silicon Valley, was just following the conventional wisdom that business computing was where the real money was. Businesspeople tended to be turned off by wild-eyed talk of changing the world; thus Apple’s new, more sober facade. And they were equally turned off by any whiff of fun or, God forbid, games; thus the old sense of whimsy that had been one of the original Mac’s most charming attributes seemed to leach away a little more with each successive model.

Those who pointed out that business computing had a net worth many times that of home computing weren’t wrong, but they were missing something important and at least in retrospect fairly obvious: namely, the fact that most of the companies who could make good use of computers had already bought them by now. The business-computing industry would doubtless continue to be profitable for many and even to grow steadily alongside the economy, but its days of untapped potential and explosive growth were behind it. Consumer computing, on the other hand, was still largely virgin territory. Millions of people were out there who had been frustrated by the limitations of the machines at the heart of the brief-lived first home-computer boom, but who were still willing to be intrigued by the next generation of computing technology, still willing to be sold on computers as an everyday lifestyle accessory. Give them a truly elegant, easy-to-use computer — like, say, the Macintosh — and who knew what might happen. This was the vision Jef Raskin had had in starting the ball rolling on the Mac back in 1979, the one that had still been present, if somewhat obscured even then by a high price, in the first released version of the machine with its “the computer for the rest of us” tagline. And this was the vision that Sculley betrayed after Jobs’s departure by keeping prices sky-high and ignoring the consumer market.

“We don’t want to castrate our computers to make them inexpensive,” said Jean-Louis Gassée. “We make Hondas, we don’t make Yugos.” Fair enough, but the Mac was priced closer to Mercedes than Honda territory. And it was common knowledge that Apple’s profit margins remained just about the fattest in the industry, thus raising the question of how much “castration” would really be necessary to make a more reasonably priced Mac. The situation reached almost surrealistic levels with the release of the Mac IIfx in March of 1990, an admittedly “wicked fast” addition to the product line but one that cost $9870 sans monitor or video card, thus replacing the metaphorical with the literal in Gassée’s favored comparison: a complete Mac IIfx system cost more than most actual brand-new Hondas. By now, the idea of the Mac as “the computer for the rest of us” seemed a bitter joke.

Apple was choosing to fight over scraps of the business market when an untapped land of milk and honey — the land of consumer computing — lay just over the horizon. It was the IBM-compatible machines, not the Macintosh, that lurched in fits and starts to fill that space, adopting in the process most of the Mac’s best ideas, even if they seldom managed to implement those ideas quite as elegantly. By the time Apple woke up to what was happening in the 1990s and rushed to fill the gap with a welter of more reasonably priced consumer-grade Macs, it was too late. Computing as most Americans knew it was an exclusively Wintel world, with Macs the incompatible, artsy-fartsy oddballs. All but locked out of the fastest-growing sectors of personal computing, the very sectors the Macintosh had been so perfectly poised to own outright, Apple was destined to have a very difficult 1990s. So difficult, in fact, that they would survive the decade’s many lows only by the skin of their teeth.

This cartoon by Tom Meyer, published in the San Francisco Chronicle, shows the emerging new popular consensus about Apple by the early 1990s: increasingly overpriced, bloated designs and increasingly clueless management.

Now that the 68000 Wars have faded into history and passions have cooled, we can see that the Macintosh was in some ways almost as ill-served by its parent company as was the Commodore Amiga by its. Apple’s management in the post-Jobs era, like Commodore’s, seemed in some fundamental way not to get the very creation they’d unleashed on the world. And so, as with the Amiga, it was left to the users of the Macintosh to take up the slack, to keep the vision thing in the equation. Thankfully, they did a hell of a job with that. Something in the Mac’s DNA, something which Apple’s new sobriety could mask but never destroy, led it to remain a hotbed of inspiring innovations that had little to do with the nuts and bolts of running a day-to-day business. Sometimes seemingly in spite of Apple’s best efforts, the most committed Mac loyalists never forgot the Jobsian rhetoric that had greeted the platform’s introduction, continuing to see it as something far more compelling and beautiful than a tool for business. A 1988 survey by Macworld magazine revealed that 85 percent of their readers, the true Mac hardcore, kept their Macs at home, where they used them at least some of the time for pleasure rather than business.

So, the Mac world remained the first place to look if you wanted to see what the artists and the dreamers were getting up to with computers. We’ve already seen some examples of their work in earlier articles. In the course of the next few, we’ll see some more.

(Sources: Amazing Computing of February 1988, April 1988, May 1988, and August 1988; Info of July/August 1988; Byte of May 1986, June 1986, November 1986, April 1987, October 1987, and June 1990; InfoWorld of November 26 1984; Computer Chronicles television episodes entitled “The New Macs,” “Macintosh Business Software,” “Macworld Special 1988,” “Business Graphics Part 1,” “Macworld Boston 1988,” “Macworld San Francisco 1989,” and “Desktop Presentation Software Part 1”; the books West of Eden: The End of Innocence at Apple Computer by Frank Rose, Apple Confidential 2.0: The Definitive History of the World’s Most Colorful Computer Company by Owen W. Linzmayer, and Insanely Great: The Life and Times of Macintosh, the Computer that Changed Everything by Steven Levy; Andy Hertzfeld’s website Folklore.)


September 15, 2016


Version 1.0 of Gamefic and the SDK Are Now Available

September 15, 2016 02:01 PM

Today I released Version 1 of Gamefic and the Gamefic SDK. This release is significant for several reasons: Read More

Emily Short

Mid-September Link Assortment

by Emily Short at September 15, 2016 02:00 PM

Events (some of which I mentioned last time, but are still in the future):

September 28, Boston area, PR-IF is holding its next meetup.

September 28 is also the deadline to submit games to be shown at WordPlay London, a November event centered on interactive text and held at the British Library. You may submit your own works or nominate works by other people.

IF Comp launches at the beginning of October. If you’d like to donate prizes, that option is still open to you.

Sept 30-Oct 2 in Santa Clara, CA is GaymerX, where Carolyn VanEseltine will be running a workshop on modern interactive fiction.

The Los Angeles IF Meetup has been making twitching, not-quite-actually-dead-yet movements, so it might be worth joining if you live in that area and would like to be notified of LA-area events.

AdventureX is running a Kickstarter to support its London event November 19-20. With luck, this money will help do things like fund travel for additional speakers and cover venue costs. Wordplay London is also happening November 19, which is a little unfortunate, but the two events are working together to try to minimize overlap, so you could go to Wordplay on the 19th and AdventureX on the 20th for a full weekend of texty, adventureful goodness.

Meanwhile, October 16 is Comp Game Party Day in London, when we will get together on a Sunday afternoon to play a bunch of newly released IF Comp games. I have a couple of volunteer readers, but could use more, and/or snack-bringers, so if you would like to come and do one of those things, please let me know. And of course if you do not want to bring anything except yourself, that is also welcome.


Choices: And Their Souls Were Eaten is a serialized IF work for Android and iOS by Felicity Banks, published through Tin Man Games. Felicity describes it as “a gloriously epic 1830s magical steampunk adventure.” New pieces of the story come out weekly. (There is already a previous serial, Choices: And The Sun Went Out, in progress. On iOS, subscribing to one gives access to both; I gather there’s a different, “story pass” payment mechanism on Android.) Felicity also has an interview up about her recent work and how she got into IF writing, together with some background about the series.

Lethophobia is a new StoryNexus piece by Olivia Wood (Failbetter’s awesome editor previously described) and Jess @notmoro. It is a game at a much different scale than the average StoryNexus piece, one where you can spend dozens of actions exploring a garden. The flavor is not very much like Fallen London, and yet it shares FL’s tendency to the evocative phrase. Beware Southend.

New from Porpentine: All Your Time-Tossed Selves, an interactive fiction implemented in Google Forms. My favorite thing about this piece is the way it marks some questions as * Required, inserting a bossy, bureaucratic affect into what is otherwise a rather evocative, allusive, poetical form. A runner up, though, is the way that Google Forms allows you to see all the analytics on everyone else’s answers before you.

Blood Will Out is a not exactly cheery Twine story about cancer diagnosis. It uses repeated text passages to good effect. Depressing but solid and representative of someone’s specific individual experience.

IntroComp is over, and the results have been posted — you can still check out the games if you’re interested.

Twine, meanwhile, has had a new beta release, 2.1.0, with improvements to running speed as well as the look and feel of the tool:


Because it’s beta, you should keep backups if you decide to try it out on an important project.

Criticism: a review of Writing Interactive Fiction with Twine, which I’ve also reviewed in the past — a nice take if you want a view of the book from someone who was working through it in order to learn.

Sam Kabo Ashwell covers Photograph, an IF Comp game from some years back; it’s a more thoughtful take than I’ve previously seen on this piece.

I wrote about IF about alien kinds of life for Rock Paper Shotgun: The Axolotl Project, Coloratura, Vesp, and Solarium.

Here’s Rock Paper Shotgun on Tin Man Games’ app version of The Warlock of Firetop Mountain.

At Giant Bomb, Bruno Dias writes about how algorithms take on the responsibility for human priority-setting and decision-making:

Except at no point are human decision-makers ever replaced. Instead, they’re put behind a curtain made out of software. Where before you had bosses cutting pay and jacking up prices, now you have an algorithm telling you the value of something. Algorithms are useful in many ways, but Silicon Valley has made an artform out of using them to sublimate responsibility.

This is an important point around politics and economics, as Bruno frames it; it’s also important when we come to think about procedural artwork. The procedure came from somewhere and expresses something.

Academic crossover:

Here are papers from ICCC 2016, work on computational creativity that might be of interest to people in the procgen and narrative generation parts of IF.

For content | code | process, Judy Malloy has posted an updated article on the electronic manuscript, which provides an overview of several works (including First Draft of the Revolution as well as work by Mark Marino) that reimagine physical text formats. For example, here’s a screenshot of Soothcircuit, a circuit diagram as fortune-telling device:


September 14, 2016

Oreolek's silence

A planetary magic system

by Oreolek at September 14, 2016 05:00 PM

Credit to this post (in Russian) by Roger for bringing that up.

Let’s construct a planetary magic system. Everything on this planet is connected, and the “butterfly effect” is a law of the universe.

For example, at 9:02 AM someone kisses a blonde under the Triumphal Arch, and at the same time someone else boards a steamboat on the Rhine and releases a white pigeon. About five minutes later a third person, in Shanghai, must topple a rice bowl to the ground. That’s it, the spell is ready; now the mage activates it. He throws a rose into a fire, and by some mystic causality an unbreakable shield forms around him.

It’s important that the actions are precise and mundane, while the result can be precise and is certainly supernatural. When the spell is ready, the mage feels that it’s time and can activate it.

This magic has to be born in collaboration with other players. The mage is not an alchemist mixing magical components, nor a scholar reading boring textbooks; he’s an organizer. His task is to arrange for other people to perform precise actions at exact times, in accordance with the mystical “butterfly effect” plan.

This is perfect for LARPs, where we get a lot of players involved in a strange and incomprehensible process. As a bonus, our mages would look like creeps, or heist masterminds, or both.

I imagine the IF variation of this system would be hard to keep from falling into “soup can” territory, but it’s worth a try.

A variation of this idea is to give the mage a spell book: he has to write down every action he encounters or causes while playing, who did what and when. (This definitely falls on the “creep” end of the spectrum.) Then he can use this list for his magicking.

Of course, like every magic system for games, this idea can be expanded and changed. Which would be only for the best.

These Heterogenous Tasks

ANCIENT MYSTERIES OF IF COMP: Laid Off from the Synesthesia Factory

by Sam Kabo Ashwell at September 14, 2016 02:01 AM

ANCIENT MYSTERIES OF IF COMP is my attempt, in the run-up to the 2016 IF Competition, to go back over Comp entries which I missed the first time around.

Laid Off From The Synesthesia Factory, by Katherine Morayati, placed 30th of 53 entries in IF Comp 2015; it won the XYZZY Award for Best Use of Innovation, and was nominated for Best Writing and Best Implementation.

The prose of Synfac is a long way from the standard parser style: it’s chewy, dense, sprinkled with unexpected words. It’s not flowy writing. A lot of writing, you already know where the sentence is going as you begin it: this is not that.  It’s writing that’s extremely uninterested in commonplaces, that hurries over extraneous verbiage because it has an awful lot to fit in; it suggests a protagonist who has entirely too many associations with everything in her life, and can’t stop going over them. Sentences are broken up with colons, semicolons, em-dashes; phrases are abbreviated in the manner of bullet-point notes written to oneself.

x me
What you are: A trim, functional paragon of a woman in lifelong battle with a disheveled unraveled omnidirectional grab of a girl.

The contrast here between the curt anapest stab of ‘what you are’, the near-logorrhea of ‘disheveled unraveled omnidirectional’, the return at the end to short cruel words. The use of ‘grab’, the sense of which is completely clear even though the exact meaning isn’t – are you a thing seized, a grab-bag, or the act itself of grabbing these disparate elements together. So, yeah, with writing like this I’m already sold.

The dense expression of complicated things is not invariably quite this transparent, and at the larger scale it’s very much an assembly of fragments, not quite so much bricolage as a collection of notes in the same hand. There’s the feeling of having a very great deal of data and analysis, but nothing resembling a thesis that would pull it all together. To a great extent this is yer basic alienation-of-modernity yearning-for-authenticity piece, and it’s mostly reflected in this lack of guiding purpose in a teeming world: the blind soulless corporate focus-grouping in search of a brand identity, the internet of a million tangents, none of which satisfy.

This is basically a story about getting high and fucking someone you probably shouldn’t, the self-care equivalent of thumping a machine to see if that fixes it. Roslyn has been developing a Synpiece, a device which fits in the eye and literally colours emotions. Now she’s been fired from the project, which it’s suggested was largely her brainchild. Defeated, directionless, her thoughts are heavily inflected by emotional exhaustion or non-response, something like when a relative dies and grief fails to show up on the approved schedule. There’s a basic pattern of depression, more socially acceptable in men than women, which goes: this obviously isn’t working, something absolutely has to change, and the only way I can see to effect change is to fuck some things up and see what happens. When your depression isn’t bipolar but you fucking well need a manic upswing anyway and are desperate to find a way to engineer one. Become oversaturated. Often the world does not play along with this plan; Synfac‘s endings are anticlimaxes.

Mechanically, Synfac doesn’t reveal itself. Let’s say you set out to write this game according to route-1 design. Premise: there’s a device that allows you to alter your mood through use of colours. Immediate suggestion: the game verbs are colours, and their effect is to alter the protagonist’s mood, perhaps making her behave differently. UI: convey this to the player and get them to internalise what each colour means, probably bootstrapping on existing cultural associations, so red is anger, blue is sadness, all the levers are labelled and you have some idea of what they should do. Synfac mentions – but doesn’t fully lay out – a system like this, but this text is easily missed. It has no intention of being a game of neatly-labeled levers.

The vaunted feature of the game is that the action will progress no matter what you do; your input might affect the course of the action, or it might just trigger certain thoughts, or it might not have any very clear effect and Roslyn will think about something or do something anyway. You’ll end up leaving the house whether you mean to or not. (More contrasts: the teeming multifarious world and the very few real choices it genuinely offers. There are only four places in the country where there are jobs in your field, and none presently have anything for someone of your specialisations. There are two boys available, neither of whom have what you need right now.) This encourages a certain amount of tea-leaf reading on the first playthrough, a tendency to overestimate the secret magic of how the game is understanding your input. (One of the Synpiece peripherals is basically a sentiment-analysis panopticon.) I am not sure that it would have worked in a piece very much longer; I don’t think it needs to.

It comes with a soundtrack, which is very tonally consistent, to the point of being on-the-nose: soft-voiced women singing about emotional isolation, music soft enough to form a background while playing a text game that requires close attention to words, gentle touches of electronica. More than one of the songs is about women who love robots – Ladytron’s Deep Blue is about a computer who won’t commit.

sub-Q Magazine

Strange Horizons 2016 Fund Drive Interactive Fiction Stretch Goal

by Tory Hoke at September 14, 2016 01:01 AM

…and they’re partnering with us to do it!

From Strange Horizons’ 2016 fund drive announcement, courtesy Editor-in-Chief Niall Harrison:

The second area we’re hoping you will help us to expand into is interactive fiction. As with translations, this is an area we’ve been open to, without really necessarily having the in-house skills or experience to make it work. But it turns out—if you didn’t already know—that one of our art directors, Tory Hoke, runs a fantastic IF magazine called Sub-Q, which has published writers such as Yoon Ha Lee, J. Y. Yang, E. Lily Yu, and many more. And so, if we raise US$23,000—our maximum goal this year—we’re going to bring Sub-Q in-house at Strange Horizons, to publish special issues twice per year, under a similar model to Samovar. Sub-Q will pay 9 cents per word.

This is big.

For years, Strange Horizons has solicited and published interactive fiction, including “You Are Here” by Bogi Takács. By partnering twice a year with sub-Q, SH will bring new eyeballs to interactive fiction and new voices to its magazine. We provide the technology, SH selects the fiction, and the stories appear as a Strange Horizons imprint. More details and submission guidelines will be revealed.

This all depends, of course, on Strange Horizons reaching their final stretch goal.

Get the word out.

And start writing.

We’ve got wonderful things to do.

The post Strange Horizons 2016 Fund Drive Interactive Fiction Stretch Goal appeared first on sub-Q Magazine.

September 12, 2016


Sorcery! 4 launch delayed

September 12, 2016 06:01 PM

We're hugely sorry to announce this, but Sorcery! 4's launch date has had to change. The epic finale to the series will now be going live on Steam, iOS and Android on September 22nd - one week later than originally planned.

After such a long journey - four games over four years, and nearly one and a half million words of content - we want to be sure the release is as good as it can be.


In particular, this means that Thursday's live playthrough event at Loading Bar in London won't be a launch party so much as a preview party. But come down all the same - because we'll be giving out Steam codes for the game, which unlock immediately.

Once again, we hope you'll forgive us the delay. To help soften the blow, we've put the first three Sorcery! games on sale on the App Store for their cheapest price ever - just $1 each. So if you're missing any instalments, now's the time to pick them up.

Classic Adventure Solution Archive

CASA Update - 31 new game entries, 23 new solutions, 26 new maps, 2 new hints, 1 new fixed game

by Gunness at September 12, 2016 11:27 AM

Apart from adding a number of games from the various Speed IF comps (thanks, Garry), ChickenMan just pointed me towards a new, lengthy podcast from TRS-80 Trash Talk with none other than Scott Adams. The podcast goes into a lot of detail about Scott's career and the inspiration for his various games. Well worth a listen! The bit featuring him starts around the 42:40 mark.

Contributors: Garry, Sylvester, ahope1, Sudders, Zuperfaust, terri, Doreen B, Alastair, iamaran, Dorothy, Vaxalon, pippa, Gunness

Emily Short

Visualizing Procgen Text, Part Two

by Emily Short at September 12, 2016 11:00 AM

In a previous post (now several months ago — sorry!) I wrote about visualization strategies for looking at procedurally generated text from The Mary Jane of Tomorrow and determining whether the procedural generation was being used to best effect.

There, I proposed looking at salience (how many aspects of the world model are reflected by this text?), variety (how many options are there to fill this particular slot?) and distribution of varying sections (which parts of a sentence are being looked up elsewhere?)

It’s probably worth having a look back at that post for context if you’re interested in this one, because there’s quite a lot of explanation there which I won’t try to duplicate here. But I’ve taken many of the examples from that post and run them through a Processing script that does the following things:

Underline text that has been expanded from a grammar token.

Underlining is not the prettiest thing, but the intent here is to expose the template structure of the text. The phrase “diced spam pie” is the result of expanding four layers of grammar tokens; in the last iteration, the Diced Spam is generated by one token that generates meat types, and the Pie by another token that generates types of dish.

This method also draws attention to cases where the chunks of composition are too large or are inconsistent in size, as in the case of the generated limericks for this game:


Though various factors (limerick topic, chosen rhyme scheme) have to be considered in selecting each line, the lines themselves don’t have room for a great deal of variation, and are likely to seem conspicuously same-y after a fairly short period of time. The first line of text locks in the choice of surrounding rhyme, which is part of why the later lines have to operate within a much smaller possibility space.

Increase the font size of text if it is more salient. Here the words “canard” and “braised” appear only because we’re matching a number of different tags in the world model: the character is able to cook, and she’s acquainted with French. By contrast, the phrase “this week” is randomized and does not depend on any world-model features in order to appear, so even though there are some variants that could have been slotted in, the particular choice of text is not especially a better fit than any other piece of text.

This particular example came out pretty ugly and looks like bad web-ad text even if you don’t read the actual content. I think that’s not coincidental.

Color the font to reflect how much variation was possible. Specifically, what this does is increase the red component of a piece of text to maximum; then the green component; and then the blue component. The input is the log of the number of possible variant texts that were available to be slotted into that position.

This was the trickiest rule to get to where I wanted it. I wanted to suggest that both very high-variance and very low-variance phrases were less juicy than phrases with a moderate number of plausible substitutions. That meant picking a scheme in which low-variance phrases would be very dark red or black; the desirable medium-variance phrases are brighter red or orange; and high-variance phrases turn grey or white.

Here “wordless” and “lusty” are adjectives chosen randomly from a huge adjective list, with no tags connecting them to the model world. As a result, even though there are a lot of possibilities, they’re likely not to resonate much with the reader; they’ll feel obviously just random after a little while. (In the same way, in the Braised Butterflied Canard example above, the word “seraphic” is highly randomized.)
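The black-to-red-to-white ramp described above can be sketched in a few lines. This is illustrative Python rather than the Processing original; the `scale` constant, which sets how many log-units it takes to saturate each channel, is my own assumption, not a value from the actual script.

```python
import math

def variance_color(n_variants, scale=3.0):
    # Map the log of the variant count onto a black -> red -> orange ->
    # white ramp: the red channel saturates first, then green, then blue.
    # `scale` (an assumption, not from the original script) sets how many
    # log-units it takes to fill each channel.
    t = math.log(max(n_variants, 1)) / scale
    red = min(t, 1.0)
    green = min(max(t - 1.0, 0.0), 1.0)
    blue = min(max(t - 2.0, 0.0), 1.0)
    return (round(red * 255), round(green * 255), round(blue * 255))

print(variance_color(1))     # (0, 0, 0): no variation reads as black
print(variance_color(20))    # (255, 0, 0): a moderate slot glows bright red
print(variance_color(5000))  # (255, 255, 214): huge slots wash out toward white
```

Low-variance text stays near black, the desirable middle of the range lands in the bright reds and oranges, and very high-variance slots drift toward grey-white, matching the scheme above.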

Finally, here’s the visualization result I got for the piece of generated text I liked best in that initial analysis:


We see that this text is more uniform in size and color than most of the others, that the whole thing has a fair degree of salience, and that special substitution words occur about as often as stressed words might in a poem.


There’s another evaluative criterion we don’t get from this strategy, namely the ability to visualize the whole expansion space implicit in a single grammar token.

By this I don’t mean the sort of generic chart you get if you look up visualize context free grammar on Google — those are generally aimed at understanding the structural space for the purposes of parsing rather than generation, and in particular aren’t designed to help the user visualize the richness of the space they’re creating.

But having some such strategy is important because (as I found working on both Annals of the Parrigues and The Mary Jane of Tomorrow) it’s easy to get bogged down expanding the text space in ways that don’t help very visibly or even that are actively counterproductive in terms of the reader/player’s experience of the generated text.

For instance, let’s go back to the following table from Mary Jane, which describes the ways that the tag [fruit] might be expanded:

Table of Dubious Fruits
tag-list description
{ { } } “[one of]Lime[or]Orange[or]Banana[or]Pear[or]Lemon[or]Pineapple[or]Cranberry[or]Apple[at random]”
{ { cowgirl, pos } } “Prickly Pear”
{ { french, pos } } “[one of]Fraise[or]Framboise[at random]”
{ { innuendo, pos } } “[one of]Mango[or]Passionfruit[at random]”
{ { botany, pos } } “[Table of Cultivars type] Apple”

This table means that, if no world model features apply, the player could receive any of the eight random fruits listed on the first line. If they’re training a cowgirl, “Prickly Pear” comes into play as an option; if French, “Fraise/Framboise” becomes available, and so on. [Table of Cultivars] actually calls out to a whole other massive expansion table, containing hundreds of apple species names, which I unapologetically yanked from Darius Kazemi’s corpora set. In theory, the presence of that table massively expands the number of conceivable fruit descriptions in the game, but in practice, it’s far enough down the chain of grammar, and contingent enough on other world model features, that it is unlikely to be consulted very often. Putting it there was fun, and I enjoyed having really specific species names available for the botanical version of the robot, but if my aim had strictly been to increase the experienced richness and variety of the text, that addition would’ve been a big waste of time.

There’s something even worse one could do, though, which is to start with this:

Table of Dubious Fruits
tag-list description
{ { } } “[one of]Lime[or]Orange[or]Banana[or]Pear[or]Lemon[or]Pineapple[or]Cranberry[or]Apple[at random]”

where the player has a 1 in 8 chance of randomly getting each fruit, and expand it to this:

Table of Dubious Fruits
tag-list description
{ { } } “[one of]Lime[or]Orange[or]Banana[or]Pear[or]Lemon[or]Pineapple[or]Cranberry[or]Apple[at random]”
{ { } } “Prickly Pear”

…where the player now has a 1/16 chance of getting any of the first fruits, and a 1/2 chance of getting Prickly Pear, with the result that, though I just added content, the player is much likelier to see repetitions, and it will feel as though the content space is actually smaller.

This is a deliberately simple example in order to illustrate the point, but the same problem can arise in sneakier ways, if we add a grammar token with only a few possible expansions alongside another token with many possible expansions.
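The skew is easy to confirm with a quick simulation of the naive pick-a-row-then-pick-an-entry scheme. This is a Python sketch with illustrative data; the game itself uses Inform 7 tables, so nothing here is the actual implementation.

```python
import random

# Two versions of the same expansion table. In the "diluted" one, a single
# extra row halves the chance of every original fruit.
FLAT = [["Lime", "Orange", "Banana", "Pear",
         "Lemon", "Pineapple", "Cranberry", "Apple"]]
DILUTED = FLAT + [["Prickly Pear"]]

def expand(table, rng):
    # Naive selection: pick a row uniformly, then an entry in it uniformly.
    return rng.choice(rng.choice(table))

rng = random.Random(0)
n = 100_000
hits = sum(expand(DILUTED, rng) == "Prickly Pear" for _ in range(n))
print(hits / n)  # ~0.5: half of all draws, not the 1/9 a count of options suggests
```

Prickly Pear swallows half the output space the moment its row is added, which is exactly the "content space feels smaller" effect described above.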

All kinds of other implementation complications can come into play here: some procedural grammar systems also use percentages to control the rate at which particular elements are used, for instance. So one possible strategy would be to make the system self-balancing — able to assess how many possible expansions there were down each grammatical route and then set its own probabilities accordingly, so that it would select the 8-fruit token eight times out of nine, and the 1-fruit token only one time in nine.

This would, I think, require that there not be any loops in the grammar production (or else that the system halted its self-analysis if it discovered any). And, of course, perfect balance of production is not necessarily what we’re trying for anyway.
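One way to sketch that self-balancing idea (hypothetical Python over a toy acyclic grammar, not the game's actual tables): count the final strings reachable from each alternative and weight the choice by those counts, so that every final string comes up equally often.

```python
import random

# A toy grammar: a token maps to alternatives, each either a literal string
# or the name of another token. All names here are illustrative.
GRAMMAR = {
    "fruit": ["plain-fruit", "Prickly Pear"],
    "plain-fruit": ["Lime", "Orange", "Banana", "Pear",
                    "Lemon", "Pineapple", "Cranberry", "Apple"],
}

def leaf_count(symbol):
    # Number of distinct final strings reachable from this symbol.
    # Assumes an acyclic grammar, as noted above it would have to be.
    if symbol not in GRAMMAR:
        return 1
    return sum(leaf_count(alt) for alt in GRAMMAR[symbol])

def balanced_expand(symbol, rng):
    # Weight each alternative by its leaf count: "plain-fruit" is picked
    # eight times out of nine here, so each fruit surfaces equally often.
    if symbol not in GRAMMAR:
        return symbol
    alts = GRAMMAR[symbol]
    choice = rng.choices(alts, weights=[leaf_count(a) for a in alts])[0]
    return balanced_expand(choice, rng)

rng = random.Random(1)
n = 90_000
hits = sum(balanced_expand("fruit", rng) == "Prickly Pear" for _ in range(n))
print(hits / n)  # close to 1/9, as the balanced scheme intends
```

A real system would cache the counts rather than recomputing them per expansion, and might deliberately depart from perfect balance for exactly the reasons given above.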

I’m still thinking about some of the visualization angles on this one.

Tagged: procedural generation, processing, the mary jane of tomorrow, visualization

These Heterogenous Tasks


by Sam Kabo Ashwell at September 12, 2016 02:01 AM


The Interactive Fiction Competition kicks off in a little under three weeks. Since its inception in 1995, IF Comp has spurred the production of – or at least served as a convenient platform for – over seven hundred works of IF, and has had a huge impact on the culture, craft and criticism of interactive fiction.


IF Comp represents an explosion of activity, and is very much a community event, with everyone playing and talking about the same games at the same time. There’s a corresponding post-comp fatigue. In the past I’ve generally tried to play, review and score every game in the Comp, but sometimes that just doesn’t happen. And once the Comp is over, the sense of urgency diminishes, and the games I missed get abandoned in the midden of Things I Should Really Get Around To One Day. (Not that those games are particularly ill-served – they almost certainly received more critical attention than games released outside any comp.)

So ANCIENT MYSTERIES OF IF COMP is a silly title to motivate me to actually get around to going over some of them, in the run-up to IF Comp 2016. We’ll see how it goes.

Photograph: A Portrait of Reflection, by Steve Evans, won third place in IF Comp 2002, and was nominated for two XYZZY Awards. The first time around I somehow managed to get stuck about midway through, and never quite figured out what anybody saw in it. (Looking at the game now, I suspect that I wasn’t actually stuck, just thought I was. It may also be a difference between the original comp version and the more recent post-comp release.)

It fits straightforwardly into a parser IF tradition composed of games like Photopia, Tapestry, Map, The Life (and Deaths) of Dr. M: sombre retrospective works about life, death and regrets, with a surreal or metaphysical frame. At the opening of Photograph, it’s strongly implied that the protagonist is in the process of committing suicide. (Later, it transpires that he’s allowing himself to succumb to an unnamed ailment.) The story unfolds through a series of nested flashbacks.

Formally, Photograph is very much in the style of what was then considered the New School: the puzzles are all very simple, and the game almost always prods you towards the next step to take. But this is still very much a game that’s native to parser, with its empty buildings, its use of objects as talismans of memory, its sense of the difficulty of the individual motions of ordinary life.

You stand on a balcony looking out over treetops and distant roofs to the river and the hills beyond. The sky is pink and blue pastels, the sun barely risen above the late morning mists and woodsmoke haze. A fog rolls down the river valley like a lazy cloud.

This is tight writing, for all that it’s going for rich aesthetics; a picture is painted, a mood evoked without getting bogged down in particulars. Even when it’s being relatively prolix, Photograph doesn’t ever drag, but nor does it call heavy attention to its style – except, perhaps, when it becomes worryingly sparse, as though the viewpoint character is trying to avoid thinking about it, or is viewing life through an anhedonic haze.

You are on a wooden balcony that looks out over a river valley. An open door to the west leads back into the house. You are cold.

> i
You are carrying nothing.

> x me
You are 55 years old, but look (and feel) much older. Although you aren’t planning on going far away from your wood fire today, you are nevertheless rugged up in a thick coat.

> x coat
Your coat is old and weathered. The pockets are empty.

> x bird
The bird is a starling. It looks agitated.

> x gum
It’s a tall blue gum eucalyptus tree. A bird sits on a high branch.

This is a story about inertia, depression and isolation. At fifty-five, Jack lives alone in the bush. Everything is an effort, his house is a tip, there is nothing to really look forward to, and most things are colourless and bland. The world is cold, and Jack is constantly insulated in a heavy coat. Jack is conscious of having withdrawn from life, and is consumed by regret about it, fixating on a photograph taken just before he broke up with an old girlfriend: to him, it picks out the exact point at which things began to go wrong.

Jack’s connection to the Australian bush where he has lived since childhood is a recurring theme: while much of the game is described in terse, basic-information parserese (Jack is aware of, but tries to ignore, the neglected clutter of his home) most of the moments of loving description are about the woods. Jack breaks up with Melanie in large part because he won’t give up the bush for her sake; indeed, apart from being attractive, the main characterisation Melanie gets concerns her city-girl discomfort with the bush. But at the same time the bush is a trap, a moribund economy, isolation, bad roads, yards full of rusting old crap.

Jack hangs on to the idea that if this one moment could have been different, his whole life would have gone better. The game ultimately undermines this in a couple of ways. If you replay and don’t climb the waterfall, the game responds “but that’s not how it really happened” and you’re back where you were before. And at the end of the game, we get a retrospective of a different death, at a younger age, no happier, showing Jack as a workaholic who neglected his family. Jack’s real failing, you’re left thinking, is allowing himself to be consumed by regret. At one point, he watches as a spider captures a fly in its web, but loses interest as soon as the fly’s fate is inevitable; later, waking up, he imagines the tangled sheets of his unmade bed as the web-strands trapping the fly. The obvious reading is that Jack, feeling the rest of his life has become inevitable, has lost interest in it.

There’s a thing that’s easy to miss in the mid-game: driving to town, your car goes into a ditch. If, in an earlier flashback, you replanted a seedling tree on the side of the road, that tree is now grown (eucalyptus grows quickly) and can be used to winch the car out of the ditch. There have been some discussions of how exactly the causality of this works – since if you do it wrong, you can apparently backtrack to put the tree in place – but I think that focusing on a causal accounting of the story’s events is beside the point. This is, I think, meant to be about the ridiculousness of Jack’s one-pivotal-moment stance. The seedling could have died, there should be plenty of other trees in the bush to attach a winch to, and there’s no guarantee that you’d spin off the road in that exact spot. The coincidence is so contrived that it can only be an expression of how nonsensical certain kinds of regret are – you’ve got a three-quarter-inch sprocket wrench, if only you’d just happened to have brought it on holiday with you…

Through all this runs a recurring theme of the Egyptian afterlife, which was introduced to Jack by a later, flakier girlfriend who believed in past lives. The themes here are less obvious, in large part because there are so many possibilities. The placing of organs in canopic jars gets particular attention: Jack sees himself as a hollow man, heartless and gutless. But there’s also a sense of the futility of awaiting rebirth.

It’s easy to read Photograph as fatalistic: Jack’s failing is not about particular choices, but is rather an ingrained character flaw, a basic unwillingness to stay engaged in relationships (or to stop obsessing over relationships which weren’t going to work anyway). And the game doesn’t offer much of a vision of what a good outcome could have looked like, some happy medium between burnout salaryman and anchoritic shack-dweller. Solutions are not required of literature: the tragedy here isn’t about Jack’s failure, but rather his tangled, self-defeating attitude to it. And as such, it’s a strong portrait of the pit that is depression, of its suffocating sense that everything is your fault.

Stuff About Stuff

Grubbyville Postmortem

by Andrew at September 12, 2016 12:21 AM

Well, it was nice to be able to place, which was my competitive goal for IntroComp. Thanks to Jacqueline Ashwell for holding and organizing it, for so many years.

But what about creative goals?

It's always a bit frustrating to read criticism of stuff I should know, or thought I'd fixed or looked at, and to realize--yeah, I didn't quite nail that down. And that's what I got from various reviews. Part of this was picking the project up maybe a week before the deadline, after a lot of on-and-off futzing and a few years of failed entries.
So I'm grateful to my testers for helping me cut down a lot of extraneous stuff (enough I won't detail here) and for their positive contributions as well. One bug that slipped through was a special case I could/should have checked for. I'll describe it later. Doing so now immediately after seems like a backhanded compliment to my testers.

It was nice to be on the other side of the criticism, which was civil and helpful--and there was more than enough of it. I think IntroComp is a fantastic tune-up for IFComp judging: less pressure, no dreams destroyed, less of a big deal if you're wrong, and the anonymity of your complaints can help. Even a minor note helps the writer. So I really hope to contribute feedback at the very least next year, and hopefully reviews to push IntroComp as much as I can. Being on both sides of the competition has been very positive for me, for relatively less investment than IFComp. Maybe it can work for you, too.

As for the game: Grubbyville is almost based on a real story. I remember the Valedictorian Wars at my high school. Two years ahead, someone got a B+ in Health, costing them the #1 spot. One year ahead, someone got a B+ in Food for You (no! Really! They apparently smart-mouthed the teacher. On a more serious note, they were icky and manipulative to the new #1, and when someone reminds me of #2, that is a huge red flag), costing _them_ the #1 spot.

I'm not sure what happened my year, but somehow #1 overtook #2 our sophomore year and never gave the lead back despite both of them getting all A's. Years later, #3 mentioned to me that not taking the advanced placement science classes hurt them. (I think they've gotten over it. They're a successful lawyer now.) I wanted to capture some of the potential absurdity. The idea came about in the 2011 (!) IFComp authors' forum when I just wrote down my intent to maybe write a game like this. The thing is--you do want to achieve. But there's a bizarre sort of perfectionism in valedictorian races, and I was well aware of the competition that went on. And I remember how working on my class rank was fun, til it wasn't, because (high school angst over corruption, kids gaming the system knowing what certain teachers liked, etc. stuff cut here.)

I've always been interested in social jockeying and such, but I wondered if Grubbyville would be a farce--then I discovered Alain de Botton's movie Status Anxiety and I read the book and I subscribed to the School of Life channel, and I started getting ideas of my own. Of the sort of questions we can ask about grades and how they're not perfect, and when ambition is good and bad. So I actually had a reason to get re-started. But it'd been on the shelf a long time. From back before I understood certain principles of game design.

Grubbyville had three levels when I first designed it. Just like my high school. (I didn't just start with "My lousy apartment." I went in for "My lousy school," too.) I cut it down a bit. But maybe not enough. The thing was--I always know I should cut things down more, but it's tough to hold on to certain things. As for the diagonal directions? Well, my high school was in the shape of an H. So I didn't want to copy that directly, and the X seemed to be simpler and leave fewer rooms. But I designed the map a few years ago, and then I let Grubbyville sit while I hacked through my wordplay games and realized diagonal directions are tricky. Then when I got back to it, I didn't really see how to re-fit the map.

But fortunately I had a couple puzzles, which felt like enough. There were getting the yearbook pass (being nice or nasty) and the book and giving it to Dennis, and being nice or a jerk to him. I didn't intend them to be hard, but even when I shut off two wings, there were still too many rooms, and so some people found it tricky to put things together. So I think I should have redesigned this, and I should've put the chess hovel in with the math team. The chess and math teams at my school always had a sort of uneasy truce, where both thought the other was smart and all but wasn't a REAL academic contest discipline, and maybe I should've/could've put more into it. So that was a clear way to simplify things that I missed. The thing was, I had ideas for other puzzles, but I talked myself out of it because it might get too complex. When it wasn't about too complex, but that the complexity would be better in the puzzles/interaction than the map.

I think one thing I can do is either 1) let you warp to the wings or 2) just put the game in twine where you can click on a location. It needs to be simpler. That's a common theme. But I also did manage to weed out a lot of things that were just programming experiments that succeeded but didn't translate well to the game. My testers found them and picked them out so you did not have to.

As for Max? One person's feedback called Max a genius, but I never thought of him that way. I just thought of him as someone who knew how to play the system. Which in itself isn't good or bad, but his rival (Harley) is definitely dislikeable. Max may be more subtly wrong, as I tried to point out in Participation Points. So I need to clear up about Max earlier. Have you ever met someone who seems to be nice and says all the right things? Well, Harley is meant to be an obvious fraud, and Max, not so obvious. Max may think he's nicer than he is, and Harley may think he's more charming than he is. So I might have Max express a bit of contempt for "lazy geniuses" or something to show that side of him.

I confess I was thinking about Max vs Harley sort of parallelling voting for political parties. Max may be the lesser of two evils, but he has the chance of being good. He's smart enough. But there are lots of cop outs.

And I also need to make the explanation of how to be Valedictorian smoother. The thing is, it's simple in my mind--but at the same time I wanted flexibility for you to understand as much as you want. I do tend to get into boondoggles with conversations, and I was aware you might have a lot of possibilities, but someone got eight, which is too many. So I think having Pop drop off the spreadsheet and let you read it would be a good idea/start. Again, this is something I cut down, but not enough. I wanted it to be a la carte, so you could see as much as you wanted, but I still didn't hit on it.

Someone also mentioned they were surprised it was not a puzzly game. I think I got similar feedback for Problems Compound. The thing is--the puzzly stuff feels like a double edged sword. I'm pleased what I've done with it. But I may've typecast myself. All the same, I do see Grubbyville as potentially a sort of puzzle: how to be a nice person? What about the big questions that seem like paradoxes? I planned to have Max run into people with worse grades who unfairly looked up to him and cut him down. Maybe I could have Max outright reject weird abstract puzzles for more practical concerns, so that I can avoid any impressions from my previous games and help build his character.

And the thing is, I want to show up the cynicism without being cynical. The mini-computer games are an example of that. They aren't meant to make a grand statement, but they're just, well, there.

So I will probably completely redo the map. I'll need more NPCs. And I'll try to make it clearer what you've done, and you need to do, and whom you need to talk to. I also want to provide clearer ways for Max to fail, because I think that may be interesting, too. I want giving up to be an option, whether because Max is generally disgusted with the tinkering to get the GPA, or he just wants to quit. And of course I want Max to be able to be valedictorian and be snotty or nice while doing so.

Oh. The one bug that slipped through? It was device-specific and a one-line error-case fix I should've checked for anyway. It was the MAP command, which they all tried--and helped me fix the actual text. None of them were using mobile devices with narrow screens, but Verityvirtue was, and I appreciate the heads up. That bug is on me--it's a matter of giving myself time to check the sort of error cases programmers can focus on, and I did, but not enough. Error checking sounds silly, but often it's a good thing to do when you're not feeling creative, and often a good warm-up to lead to something actually creative. It's not a substitute, but -- I find I never know what will get me going. And I need as many ways to try and start as I can.

I made a public GitHub repository. I'll contribute to it whenever, but it may be sporadic. It seems like Twine could be very good for what I want to do. The game's not too big now, so I will look into whether automated testing is a thing by now. Twine might allow for the simplicity and clarity I need for this project. But until then, nothing is stopping me from implementing general creative ideas. Porting can't be that much work.

September 11, 2016

Emily Short

Aviary Attorney (Sketchy Logic)

by Emily Short at September 11, 2016 02:00 PM

Screen Shot 2016-09-07 at 10.22.24 PM.png

Aviary Attorney is a game in which you guide some French lawyers, who happen to be birds, through evidence collection and trial scenes in which they pick holes in the opposing testimony. It owes a great deal to Phoenix Wright: Ace Attorney, emulating its gameplay and in-court responses. People also compare it with Hatoful Boyfriend, because both are visual novels with birds who act like humans, but Aviary Attorney owes less of a debt there: the gameplay and style are really rather different.

The art, meanwhile, is lifted from the public domain work of French caricaturist J.J. Grandville, and the game’s narrative takes place against the rising action of the revolutionary year 1848. There are also many current jokes and references: the evidence binder where you store pictures of people you’ve met is your “Face Book,” for instance.

The joke could have been too weak to sustain play through the whole game. But I wound up liking it a lot, and not just because the game only needed a few hours to play through. Sketchy Logic do a good job with the light animation, the soundtrack, the dialogue writing: moment to moment, production values are consistently solid.

More to the point, though, this is not just a grab-bag of goofy cases. The whole piece is addressing themes of justice, rationality, the use of force, and the relationship between the poor and the wealthy. 1848 Paris, as portrayed here, is a place with huge disparities in wealth and class; a place where judges preferentially protect the well-to-do, and where police may arbitrarily shoot the poor. In one of the endings, you are literally assembling evidence to work out whether the victim of a (supposed) police shooting was hit in the front or the back, and under what circumstances.

There’s a substantial amount of branching and consequence in this story. Things you do as early as the first act can have moderate effect elsewhere; later, your decisions become more and more consequential, determining the fates of characters and of the city itself. The final case you confront depends on what you’ve done in acts 1-3, which means that you have the option of encountering any of three different climactic sequences: in Ashwellian CYOA structure vocabulary, take a branch-and-bottleneck structure and glue a sorting hat on the end of it. (A few other games do this as well, including a number of Choice of Games pieces. Choice of Robots does it especially strongly.)
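In code terms, a "sorting hat" glued onto a branch-and-bottleneck structure might look something like this. This is a hypothetical sketch of the pattern, not Aviary Attorney's actual logic; the tag names and the majority-vote rule are my own invention:

```python
from collections import Counter

def pick_finale(act_choices):
    """The 'sorting hat' at the bottleneck: each choice made in acts 1-3
    leans toward one of the climactic sequences, and the finale with the
    most accumulated leanings is the one the player gets."""
    tally = Counter(act_choices)
    finale, _count = tally.most_common(1)[0]
    return finale

# A playthrough whose choices mostly leaned one way gets that climax.
print(pick_finale(["liberte", "egalite", "liberte"]))  # liberte
```

The defining feature of the pattern is that the early acts are shared (with local variation), and only at the end does the accumulated state route the player into one of several mutually exclusive branches.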

But you’re never completely in control of the situation, either. There are lots of things that the player doesn’t know or that the player character can’t avoid: this is not a power fantasy, but a story of moral responsibility. I found myself getting a bit anxious about the fact that I might be making errors that would cause genuine and upsetting problems for the characters in the long run.

There is a Cult of Rationality in the game, and at one point dialogue questions whether this is significantly different from a religion; on the other hand, none of the major character decisions in the story seem to be driven by religious faith.

So the key opposition there is not between faith and reason. Faith really has no defenders. Instead, it’s between a naive trust in what reason can do and a more sophisticated understanding of its misapplications and defects. While much of the gameplay is about trying to gather evidence and make rational arguments (at least, rational within the game’s logic), the outcomes repeatedly remind us that both our knowledge and the persuasive power of reason are limited. And the final outcome of the Fraternité ending turns on a character’s fear overriding their other inclinations, and choosing a course accordingly.

The moral choices we make in the game project backward as well as forward. Aviary Attorney is an inconsistent-reality game. Jayjay Falcon’s background is a mystery in every playthrough, but the truth ultimately revealed changes depending on what you decide to have him do. I know some players find that kind of thing extremely irritating, wanting all branches of a game to work together to reveal the same backstory from different angles. I did not find it bothersome myself, especially because Jayjay’s projected backstory is dependent on a choice he makes at a critical moment: essentially the player is deciding what kind of person he is, and thus perhaps what kind of person he used to be.

I have a few gameplay niggles. I couldn’t find any way to skip through dialogue I’d already seen once, the way one typically can in Ren’Py visual novels, and I missed that affordance.

Steam has a walkthrough for the game; I looked at it for information about a few paths I didn’t experience in my own playing. (Despite what the walkthrough says, though, ending 4C is available to play, and indeed is the ending I first received.)

[Disclosure: I played a copy of this game that I bought with my own money. I have no commercial relationship with the creators as of the time of writing.]

Tagged: aviary attorney, hatoful boyfriend, jj grandville, phoenix wright

September 09, 2016

The Digital Antiquarian

So You Want to Be a Hero?

by Jimmy Maher at September 09, 2016 04:00 PM

Lori Ann and Corey Cole

Lori Ann and Corey Cole

Rule #1 is “The Player Must have Fun.” It’s trivially easy for a game designer to “defeat” players. We have all the tools and all the power. The trick is to play on the same side as the players, to tell the story together, and to make them the stars.

That rule is probably the biggest differentiator that made our games special. We didn’t strive to make the toughest, hardest-to-solve puzzles. We focused on the characters, the stories, and making the player the star.

— Corey Cole

It feels thoroughly appropriate that Corey and Lori Ann Cole first met over a game of Dungeons & Dragons. The meeting in question took place at Westercon — the West Coast Science Fantasy Conference — in San Francisco in the summer of 1979. Corey was the Dungeon Master, leading a group of players through an original scenario of his own devising that he would later succeed in getting published as The Tower of Indomitable Circumstances. But on this occasion he found the pretty young woman who was sitting at his table even more interesting than Dungeons & Dragons. Undaunted by mere geography — Corey was programming computers for a living in Southern California while Lori taught school in Arizona — the two struck up a romantic relationship. Within a few years, they were married, settling eventually in San Jose.

They had much in common. As their mutual presence at a convention like Westercon will attest, both the current and the future Cole were lovers of science-fiction and fantasy literature and its frequent corollary, gaming, from well before their first meeting. Their earliest joint endeavor — besides, that is, the joint endeavor of romance — was The Spellbook, a newsletter for tabletop-RPG aficionados which they edited and self-published.

Corey also nurtured an abiding passion for computers that had long since turned into a career. After first learning to program in Fortran and COBOL while still in high school during the early 1970s, his subsequent experiences had constituted a veritable grand tour of some of the most significant developments of this formative decade of creative computing. He logged onto the ARPANET (predecessor of the modern Internet) from a terminal at his chosen alma mater, the University of California, Santa Barbara; played the original Adventure in the classic way, via a paper teletype machine; played games on the PLATO system, including the legendary proto-CRPGs Oubliette and DND that were hosted there. After graduating, he took a job with his father’s company, a manufacturer of computer terminals, traveling the country writing software for customers’ installations. By 1981, he had moved on to become a specialist in word-processing and typesetting software, all the while hacking code and playing games at home on his home-built CP/M computer and his Commodore PET.

When the Atari ST was introduced in 1985, offering an unprecedented amount of power for the price, Corey saw in it the potential to become the everyday computer of the future. He threw himself into this latest passion with abandon, becoming an active member of the influential Bay Area Atari Users Group, a contributor to the new ST magazine STart, and even the founder of a new company, Visionary Systems; the particular vision in question was that of translating his professional programming experience into desktop-publishing software for the ST.

Interestingly, Corey’s passion for computers and computer games was largely not shared by Lori. Like many dedicated players of tabletop RPGs, she always felt the computerized variety to be lacking in texture, story, and most of all freedom. She could enjoy games like Wizardry in bursts with friends, but ultimately found them far too constraining to sustain her interest. And she felt equally frustrated by the limitations of both the parser-driven text adventures of Infocom and the graphical adventures of Sierra. Her disinterest in the status quo of computer gaming would soon prove an ironic asset, prompting her to push her own and Corey’s games in a different and very worthwhile direction.

By early 1988, it was becoming clear that the Atari ST was doomed to remain a niche platform in North America, and thus that Corey’s plan to get rich selling desktop-publishing software for it wasn’t likely to pan out. Meanwhile his chronic asthma was making it increasingly difficult to live in the crowded urban environs of San Jose. The Coles were at one of life’s proverbial crossroads, unsure what to do next.

Then one day they got a call from Carolly Hauksdottir, an old friend from science-fiction fandom — she wrote and sang filk songs with them — who happened to be working now as an artist for Sierra On-Line down in the rural paradise of Oakhurst, California. It seemed she had just come out of a meeting with Sierra’s management in which Ken Williams had stated emphatically that they needed to get back into the CRPG market. Since their brief association with Richard Garriott, which had led to their releasing Ultima II and re-releasing Ultima I, Sierra’s presence in CRPGs had amounted to a single game called Wrath of Denethenor, a middling effort for the Apple II and Commodore 64 sold to them by an outside developer. As that meager record will attest, Ken had never heretofore made the genre much of a priority. But of late the market for CRPGs showed signs of exploding, as evidenced by the huge success of other publishers’ releases like The Bard’s Tale and Ultima IV. To get themselves a piece of that action, Ken stated in his typical grandiose style that Sierra would need to hire “a published, award-winning, tournament-level Dungeon Master” and set him loose with their latest technology. Corey and Lori quickly reasoned that The Tower of Indomitable Circumstances had been published by the small tabletop publisher Judges Guild, as had their newsletter, albeit self-published; that Corey had once won a tournament at Gen Con as a player; and that together they had once created and run a tournament scenario for a Doctor Who convention. Between the two of them, then, they were indeed “published, award-winning, tournament-level Dungeon Masters.” Right?

Well, perhaps they weren’t quite what Ken had in mind after all. When Corey called to offer their services, at any rate, he sounded decidedly skeptical. He was much more intrigued by another skill Corey mentioned in passing: his talent for programming the Atari ST. Sierra had exactly one programmer who knew anything about the ST, and was under the gun to get their new SCI engine ported over to that platform as quickly as possible. Ken wound up hiring Corey for this task, waving aside the initial reason for Corey’s call with the vague statement that “we’ll talk about game design later.”

What with Corey filling such an urgent need on another front, one can easily imagine Ken’s “later” never arriving at all. Corey, however, never stopped bringing up game design, and with it the talents of his wife that he thought would make her perfect for the role. While he thought that the SCI engine, despite its alleged universal applicability, could never be used to power a convincing hardcore CRPG of the Bard’s Tale/Ultima stripe, he did believe it could serve very well as the base for a hybrid game — part CRPG, part traditional Sierra adventure game. Such a hybrid would suit Lori just fine; her whole interest in the idea of designing computer games was “to bring storytelling and [the] interesting plot lines of books and tabletop role-playing into the hack-and-slash thrill of a computer game.” Given the technological constraints of the time, a hybrid actually seemed a far better vehicle for accomplishing that than a hardcore CRPG.

So, while Corey programmed in Sierra’s offices, Lori sat at home with their young son, sketching out a game. In fact, knowing that Sierra’s entire business model revolved around game series rather than one-offs, she sketched out a plan for four games, each taking place in a different milieu corresponding to one of the four points of the compass, one of the four seasons, and one of the four classical elements of Earth, Fire, Air, and Water. As was typical of CRPGs of this period, the player would be able to transfer the same character, evolving and growing in power all the while, into each successive game in the series.

With his established tabletop-RPG designer still not having turned up, Ken finally relented and brought Lori on to make her hybrid game. But the programmer with whom she was initially teamed was very religious, and refused to continue when he learned that the player would have the option of choosing a “thief” class. And so, after finishing up some of his porting projects, Corey joined her on what they were now calling Hero’s Quest I: So You Want to Be a Hero. Painted in the broadest strokes, he became what he describes as the “left brain” to Lori’s “right brain” on the project, focusing on the details of systems and rules while Lori handled the broader aspects of plot and setting. Still, these generalized roles were by no means absolute. It was Corey, for instance, an incorrigible punster — so don’t incorrige him! — who contributed most of the horrid puns that abound throughout the finished game.

Less than hardcore though they envisioned their hybrid to be, Lori and Corey nevertheless wanted to do far more than simply graft a few statistics and a combat engine onto a typical Sierra adventure game. They would offer their player the choice of three classes, each with its own approach to solving problems: through combat and brute force in the case of the fighter, through spells in the case of the magic user, through finesse and trickery in the case of the thief. This meant that the Coles would in effect have to design Hero’s Quest three times, twining together an intricate tapestry of differing solutions to its problems. Considering this reality, one inevitably thinks of what Ron Gilbert said immediately after finishing Maniac Mansion, a game in which the player could select her own team of protagonists but one notably free of the additional complications engendered by Hero’s Quest’s emergent CRPG mechanics: “I’m never doing that again!” The Coles, however, would not only do it again — in fact, four times more — but they would consistently do it well, succeeding at the tricky task of genre blending where designers as talented as Brian Moriarty had stumbled.

Instead of thinking in terms of “puzzles,” the Coles preferred to think in terms of “problems.” In Hero’s Quest, many of these problems can be treated like a traditional adventure-game puzzle and overcome using your own logic. But it’s often possible to power through the same problem using your character’s skills and abilities. This quality makes it blessedly difficult to get yourself well-and-truly, permanently stuck. Let’s say you need to get a fish from a fisherman in order to get past the bear who’s blocking your passage across a river. You might, in traditional adventure-game style, use another item you found somewhere to repair his leaky boat, thus causing him to give you a fish as a small token of his appreciation. But you might also, if your character’s intelligence score is high enough, be able to convince him to give you a fish through logical persuasion alone. Or you might bypass the whole question of the fish entirely if your character is strong and skilled enough to defeat the bear in combat. Moriarty’s Beyond Zork tries to accomplish a superficially similar blending of the hard-coded adventure game and the emergent CRPG, but does so far less flexibly, dividing its problems rather arbitrarily into those soluble by adventure-game means and those soluble by CRPG means. The result for the player is often confusion, as things that ought to work fail to do so simply because a problem fell into the wrong category. Hero’s Quest was the first to get the blending right.

The core of the Hero’s Quest game system, based on incremental skill and attribute improvements rather than the more monolithic level-based structure of Dungeons & Dragons, reached back to a system of tabletop rules the Coles had begun formulating years before setting to work on their first computer game. It has the advantage of offering nearly constant feedback, a nearly constant sense of progress even if you happen to be stuck on one of the more concrete problems in the game. Spend some time bashing monsters, and your character’s “weapons use” score along with his strength and agility will go up; practice throwing daggers on the game’s handy target range, and his “throwing” skill will increase a little with almost every toss. Although you choose a class for your character at the outset, there’s nothing preventing you from building up a magic user who’s also pretty handy with a sword, or a fighter who knows how to throw a spell or two. You’ll just have to sacrifice some points in the beginning to get a start in the non-core discipline, then keep on practicing.
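
The flavor of this use-based advancement — each practice attempt having a chance to nudge a skill up a point, with diminishing returns as the score climbs — can be captured in a few lines of Python. This is a toy illustration only, not the actual Quest for Glory rules or SCI code; the particular diminishing-returns formula and the cap of 100 are my own assumptions:

```python
import random

class Hero:
    """Toy model of use-based skill advancement (an illustration,
    not the actual Quest for Glory system)."""

    def __init__(self, **skills: int) -> None:
        self.skills = dict(skills)

    def practice(self, skill: str) -> None:
        # Each attempt has a chance to raise the skill one point,
        # with the odds shrinking as the score approaches a cap of 100.
        score = self.skills.get(skill, 0)
        if random.random() < (100 - score) / 100:
            self.skills[skill] = score + 1

hero = Hero(throwing=10)
for _ in range(200):           # e.g. 200 tosses on the target range
    hero.practice("throwing")
```

Under a scheme like this, early practice pays off quickly while mastery comes slowly — which matches the "nearly constant sense of progress" the Coles were after.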

If forced to choose one adjective to describe Hero’s Quest and the series it spawned as a whole, I would have to go with “generous” — not, as the regular readers among you can doubtless attest, an adjective I commonly apply to Sierra games in general. Hero’s Quest‘s generosity extends far beyond its lack of the sudden deaths, incomprehensible puzzles, hidden dead ends, and generalized player abuse that were so typical of Sierra designs. Departing from Sierra’s other series with their King Grahams, Rosellas, Roger Wilcos, and Larry Laffers, the Coles elected not to name or characterize their hero, preferring to let their players imagine and sculpt the character each of them wanted to play. Even within the framework of a given character class, alternate approaches and puzzle — er, problem — solutions abound, while the environment is fleshed-out to a degree unprecedented in a Sierra adventure game. Virtually every reasonable action, not to mention plenty of unreasonable ones, will give some sort of response, some acknowledgement of your cleverness, curiosity, or sense of humor. Almost any way you prefer to approach your role is viable. For instance, while it’s possible to leave behind a trail of monstrous carnage worthy of a Bard’s Tale game, it’s also possible to complete the entire game without taking a single life. The game is so responsive to your will that the few places where it does fall down a bit, such as in not allowing you to choose the sex of your character — resource constraints led the Coles to make the hero male-only — stand out more in contrast to how flexible the rest of this particular game is than they do in contrast to most other games of the period.

Hero’s Quest‘s message of positive empowerment feels particularly bracing in our current age of antiheroes and murky morality.

Indeed, Hero’s Quest is such a design outlier from the other Sierra games of its era that I contacted the Coles with the explicit goal of coming to understand just how it could have come out so differently. Corey took me back all the way to the mid-1970s, to one of his formative experiences as a computer programmer and game designer alike, when he wrote a simple player-versus-computer tic-tac-toe game for a time-shared system to which he had access. “Originally,” he says, “it played perfectly, always winning or drawing, and nobody played it for long. After I introduced random play errors by the computer, so that a lucky player could occasionally win, people got hooked on the game.” From this “ah-ha!” moment and a few others like it was born the Coles’ Rule #1 for game design: “The player must always have fun.” “We try to remember that rule,” says Corey, “every time we create a potentially frustrating puzzle.” The trick, as he describes it, is to make “the puzzles and challenges feel difficult, yet give the player a chance to succeed after reasonable effort.” Which leads us to Rule #2: “The player wants to win.” “We aren’t here to antagonize the players,” he says. “We work with them in a cooperative storytelling effort. If the player fails, everybody loses; we want to see everyone win.”

Although their professional credits in the realm of game design were all but nonexistent at the time they came to Sierra, the Coles were nevertheless used to thinking about games far more deeply than was the norm in Oakhurst. They were, for one thing, dedicated players of games, very much in tune with the experience of being a player, whether sitting behind a table or a computer. Ken Williams, by contrast, had no interest in tabletop games, and had sat down and played for himself exactly one computerized adventure game in his life (that game being, characteristically for Ken, the ribald original Softporn). While Roberta Williams had been inspired to create Mystery House by the original Adventure and some of the early Scott Adams games, her experience as a player never extended much beyond those primitive early text adventures; she was soon telling interviewers that she “didn’t have the time” to actually play games. Most of Sierra’s other design staff came to the role through the back door of being working artists, writers, or programmers, not through the obvious front door of loving games as a pursuit unto themselves. Corey states bluntly that “almost nobody there played [emphasis mine] games.” The isolation from the ordinary player’s experience that led to so many bad designs was actually encouraged by Ken Williams; he suggested that his staffers not look too much at what the competition was doing out of some misguided desire to preserve the “originality” of Sierra’s own games.

But the Coles were a little different from the majority of said staffers. Corey points out that they were both over thirty by the time they started at Sierra. They had, he notes, also “traveled a fair amount,” and “both the real-life experience and extensive tabletop-gaming experience gave [us] a more ‘mature’ attitude toward game development, especially that the designer is a partner to the player, not an antagonist to be overcome.” Given the wealth of experience with games and ideas about how games ought to be that they brought with them to Sierra, the Coles probably benefited as much from the laissez-faire approach to game-making engendered by Ken Williams as some of the other designers perhaps suffered from the same lack of direction. Certainly Ken’s personal guidance was only sporadic, and often inconsistent. Corey:

Once in a while, Ken Williams would wander through the development area — it might happen two or three times in a day, or more likely the visits might be three weeks apart. Everyone learned that it was essential to show Ken some really cool sequence or feature that he hadn’t seen before. You only showed him one such sequence because you needed to reserve two more in case he came back the same day.

Our first (and Sierra’s first) Producer, Guruka Singh Khalsa, taught us the “Ken Williams Rule” based on something Robert Heinlein wrote: “That which he tells you three times is true.” Ken constantly came up with half-baked ideas, some of them amazing, some terrible, and some impractical. If he said something once, you nodded in agreement. Twice, you sat up and listened. Anything he said three times was law and had to be done. While Ken mostly played a management role at Sierra, he also had some great creative ideas that really improved our games. Of course, it takes fifteen seconds to express an idea, and sometimes days or weeks to make it work. That’s why we ignored the half-baked, non-repeated suggestions.

The Coles had no affinity for any of Sierra’s extant games; they considered them “unfair and not much fun.” Yet the process of game development at Sierra was so unstructured that they had little sense of really reacting against them either.  As I mentioned earlier, Lori didn’t much care for any of the adventure games she had seen, from any company. She wouldn’t change her position until she played Lucasfilm Games’s The Secret of Monkey Island in 1990. After that experience, she became a great fan of the Lucasfilm adventures, enjoying them far more than the works of her fellow designers at Sierra. For now, though, rather than emulating existing computerized adventure or RPG games, the Coles strove to re-create the experience of sitting at a table with a great storytelling game master at its head.

Looking beyond issues of pure design, another sign of the Coles’ relatively broad experience in life and games can be seen in their choice of settings. Rather than settling for the generic “lazy Medieval” settings so typical of Dungeons & Dragons-derived computer games then and now, they planned their series as a hall of windows into some of the richest myths and legends of our own planet. The first game, which corresponded in Lori’s grand scheme for the series as a whole to the direction North, the season Spring, and the element Earth, is at first glance the most traditional of the series’s settings. This choice was very much a conscious one, planned to help the series attract an initial group of players and get some commercial traction; bullish as he was on series in general, Ken Williams wasn’t particularly noted for his patience with new ones that didn’t start pulling their own weight within a game or two. Look a little closer, though, and even the first game’s lush fantasy landscape, full of creatures that seem to have been lifted straight out of a Dungeons & Dragons Monster Manual, stands out as fairly unique. Inspired by an interest in German culture that had its roots in the year Corey had spent as a high-school exchange student in West Berlin back in 1971 and 1972, the Coles made their Medieval setting distinctly Germanic, as is highlighted by the name of the town around which the action centers: Spielburg. (Needless to say, the same name is also an example of the Coles’ love of puns and pop-culture in-joking.) Later games would roam still much further afield from the lazy-Medieval norm. The second, for instance, moves into an Arabian Nights milieu, while still later ones explore the myths and legends of Africa, Eastern Europe, and Greece. The Coles’ determination to inject a little world music into the cultural soundtracks of their mostly young players stands out as downright noble. 
Their series doubtless prompted more than a few blinkered teenage boys to begin to realize what a big old interesting world there really is out there.

Hero's Quest

Of course, neither the first Hero’s Quest nor any of the later ones in the series would be entirely faultless. Sierra suffered from the persistent delusion that their SCI engine was a truly universal hammer suitable for every sort of nail, leading them to incorporate action sequences into almost every one of their adventure games. These are almost invariably excruciating, afflicted with slow response times and imprecise, clumsy controls. Hero’s Quest, alas, isn’t an exception to this dubious norm. It has an action-oriented combat engine so unresponsive that no one I’ve ever talked to tries to do anything with it but just pound on the “attack” key until the monster is dead or it’s obviously time to run away. And then there are some move-exactly-right-or-you’re-dead sequences in the end game that are almost as frustrating as some of the ones found in Sierra’s other series. But still, far more important in the end are all the things Hero’s Quest does right, and more often than not in marked contrast to just about every other Sierra game of its era.

Hero’s Quest was slated into Sierra’s release schedule for Christmas 1989, part of a diverse lineup of holiday releases that also included the third Leisure Suit Larry game from the ever-prolific Al Lowe and something called The Colonel’s Bequest, a bit of a departure for Roberta Williams in the form of an Agatha Christie-style cozy murder mystery. With no new King’s Quest game on offer that year, Hero’s Quest, the only fantasy release among Sierra’s 1989 lineup, rather unexpectedly took up much of its slack. As pre-orders piled up to such an extent that Sierra projected needing to press 100,000 copies right off the bat just to meet the holiday demand, Corey struggled desperately with a sequence — the kobold cave, for those of you who have played the game already — that just wouldn’t come together. At last he brokered a deal to withhold only the disk that would contain that sequence from the duplicators. In a single feverish week, he rewrote it from scratch. The withheld disk was then duplicated in time to join the rest, and the game as a whole shipped on schedule, largely if not entirely bug-free. Even more impressively, it was, despite receiving absolutely no outside beta-testing — Sierra still had no way of seriously evaluating ordinary players’ reactions to a game before release — every bit as friendly, flexible, and soluble as the Coles had always envisioned it to be.

Hero's Quest

The game became the hit its pre-orders had indicated it would, its sales outpacing even the new Leisure Suit Larry and Roberta Williams’s new game by a comfortable margin that holiday season. The reviews were superlative; Questbusters‘s reviewer pronounced it “honestly the most fun I’ve had with any game in years,” and Computer Gaming World made it their “Adventure Game of the Year.” While it would be nice to attribute this success entirely to the public embracing its fine design sensibilities, which they had learned of via all the fine reviews, its sales numbers undoubtedly had much to do with its good fortune in being released during this year without a King’s Quest. Hero’s Quest was for many a harried parent and greedy child alike the closest analogue to Roberta Williams’s blockbuster series among the new releases on store shelves. The game sold over 130,000 copies in its first year on the market, about 200,000 copies in all in its original version, then another 100,000 copies when it was remade in 1992 using Sierra’s latest technology. Such numbers were, if not quite in the same tier as a King’s Quest, certainly nothing to sneeze at. In creative and commercial terms alike, the Coles’ series was off to a fine start.

At the instant of Hero’s Quest‘s release, Sierra was just embarking on a major transition in their approach to game-making. Ken Williams had decided it was time at last to make the huge leap from the EGA graphics standard, which could display just 16 colors onscreen from a palette of 64, to VGA, which could display 256 colors from a palette of 262,144. To help accomplish this transition, he had hired Bill Davis, a seasoned Hollywood animator, in the new role of Sierra’s “Creative Director” in July of 1989. Davis systematized Sierra’s heretofore laissez-faire approach to game development into a rigidly formulated Hollywood-style production pipeline. Under his scheme, the artists would now be isolated from the programmers and designers; inter-team communication would happen only through a bureaucratic paper trail.

The changes inevitably disrupted Sierra’s game-making operation, which of late had been churning out new adventure games at a rate of about half a dozen per year. Many of the company’s resources for 1990 were being poured into King’s Quest V, which was intended, as had been the norm for that series since the beginning, to be the big showpiece game demonstrating all the company’s latest goodies, including not only 256-color VGA graphics but also a new Lucasfilm Games-style point-and-click interface in lieu of the old text parser. King’s Quest V would of course be Sierra’s big title for Christmas. They had only two other adventure games on their schedule for 1990, both begun using the older technology and development methodology well before the end of 1989 and both planned for release in the first half of the new year. One, an Arthurian game by an established writer for television named Christy Marx, was called Conquests of Camelot: The Search for the Grail (thus winning the prize of being the most strained application of Sierra’s cherished “Quest” moniker ever). The other, a foray into Tom Clancy-style techno-thriller territory by Police Quest designer Jim Walls, was called Code-Name: ICEMAN. Though they had every reason to believe that King’s Quest V would become another major hit, Sierra was decidedly uncertain about the prospects of these other two games. They felt they needed at least one more game in an established series if they hoped to maintain the commercial momentum they’d been building up in recent years. Yet it wasn’t clear where that game was to come from; one side effect of the transition to VGA graphics was that art took much longer to create, and games thus took longer to make. Lori and Corey were called into a meeting and given two options. 
One, which Lori at least remembers management being strongly in favor of, was to make the second game in their series using Sierra’s older EGA- and parser-driven technology, getting it out in time to become King’s Quest V‘s running mate for the Christmas of 1990. The other was to be moved in some capacity to the King’s Quest V project, with the opportunity to return to Hero’s Quest only at some uncertain future date. They chose — or were pushed into — the former.

Despite using the older technology, their second game was, at Davis’s insistence, created using the newer production methodology. This meant among other things that the artists, now isolated from the rest of the developers, had to create the background scenes on paper; their pictures were then scanned in for use in the game. I’d like to reserve the full details of Sierra’s dramatically changed production methodology for another article, where I can give them their proper due. Suffice for today to say that, while necessary in many respects for a VGA game, the new processes struck everyone as a strange way to create a game using the sharply limited EGA color palette. By far the most obvious difference they made was that everything seemed to take much longer. Lori Ann Cole:

We got the worst of both worlds. We got a new [development] system that had never been tried before, and all the bugs that went with that. And we were doing it under the old-school technology where the colors weren’t as good and all that. We were under a new administration with a different way of treating people. We got time clocks; we had to punch in a number to get into the office so that we would work the set number of hours. We had all of a sudden gone from this free-form company to an authoritarian one: “This is the hours you have to work. Programmers will work over here and artists will work over there, and only their bosses can talk to one another; you can’t talk to the artist that’s doing the art.”

Some of the Coles’ frustrations with the new regime came out in the game they were making. Have a close look at the name of Raseir, an oppressed city — sort of an Arabian Nights version of Nineteen Eighty-Four — where the climax of the game occurs.

Scheduled for a late September release, exactly one year after the release of the first Hero’s Quest, the second game shipped two months behind schedule, coming out far too close to Christmas to have a prayer of fully capitalizing on the holiday rush. And then when it did finally ship, it didn’t even ship as Hero’s Quest II.

Quest for Glory II

In 1989, the same year that Sierra had released the first Hero’s Quest, the British division of the multi-national toys and games giant Milton Bradley had released HeroQuest, a sort of board-gameified version of Dungeons & Dragons. They managed to register their trademark on the name for Europe shortly before Sierra registered theirs for Europe and North America. After the board game turned into a big European success, Milton Bradley elected to bring it to North America the following year, whilst also entering into talks with some British developers about turning it into a computer game. Clearly something had to be done about the name conflict, and thanks to their having registered the trademark first, Milton Bradley believed they had the upper hand. When the bigger company’s lawyers came calling, Sierra, unwilling to get entangled in an expensive lawsuit they probably couldn’t win anyway, signed a settlement that not only demanded they change the name of their series but also stated that they couldn’t even continue using the old name long enough to properly advertise that “Hero’s Quest has a new name!” Thus when Hero’s Quest and its nearly finished sequel were hastily rechristened Quest for Glory, the event was announced only via a single four-sentence press release.

So, a veritable perfect storm of circumstances had conspired to undermine the commercial prospects of the newly rechristened Quest for Glory II: Trial by Fire. Sierra’s last parser-driven 16-color game, it was going head to head with the technological wonder that was King’s Quest V — another fantasy game to boot. Due to its late release, it lost the chance to pick up even many or most of the scraps King’s Quest V might have left it. And finally, the name change meant that the very idea of a Quest for Glory II struck most Sierra fans as a complete non sequitur; they had no idea what game it was allegedly a sequel to. Under the circumstances, it’s remarkable that Quest for Glory II performed as well as it did. It sold an estimated 110,000 to 120,000 copies — not quite the hit its predecessor had been, but not quite the flop one could so easily imagine the newer game becoming under the circumstances either. Sales were still strong enough that this eminently worthy series was allowed to continue.

As a finished game, Quest for Glory II betrays relatively little sign of its difficult gestation, even if there are perhaps just a few more rough edges in comparison to its predecessor. The most common complaint is that the much more intricate and linear plot this time out can lead to a fair amount of time spent waiting for the next plot event to fire, with few concrete goals to achieve in the meantime. This syndrome can especially afflict those players who’ve elected to transfer in an established character from the first game, and thus have little need for the grinding with which newbies are likely to occupy themselves. At the same time, though, the new emphasis on plot isn’t entirely unwelcome in light of the almost complete lack of same in the first game, while the setting this time out of a desert land drawn from the Arabian Nights is even more interesting than was that of the previous game. The leisurely pace can make Quest for Glory II feel like a sort of vacation simulator, a relaxing computerized holiday spent chatting with the locals, sampling the cuisine, enjoying belly dances and poetry readings, and shopping in the bazaars. (Indeed, your first challenge in the game is one all too familiar to every tourist in a new land: converting the money you brought with you from Spielburg into the local currency.) I’ve actually heard Quest for Glory II described by a fair number of players as their favorite in the entire series. If push comes to shove, I’d probably have to say that I slightly prefer the first game, but I wouldn’t want to be without either of them. Certainly Quest for Glory II is about as fine a swan song for the era of parser-driven Sierra graphical adventures as one could possibly hope for.

The combat system used in the Quest for Glory games would change constantly from game to game. The one found in the second game is a little more responsive and playable than its predecessor.

Had more adventure-game designers at Sierra and elsewhere followed the Coles’ lead, the history of the genre might have played out quite differently. As it is, though, we’ll have to be content with the games we do have. I’d hugely encourage any of you who haven’t played the Quest for Glory games to give them a shot — preferably in order, transferring the same character from game to game, just as the Coles ideally intended it. They’re available for purchase today in a painless one-click install for modern systems, and they remain as funny, likable, and, yes, generous as ever.

We’ll be returning to the Coles in due course to tell the rest of their series’s story. Next time, though, we’ll turn our attention to the Apple Macintosh, a platform we’ve been neglecting a bit of late, to see how it was faring as the 1990s were fast approaching.

Hero's Quest

(Sources: Questbusters of December 1989; Computer Gaming World of September 1990 and April 1991; Sierra’s newsletters dated Autumn 1989, Spring 1990, Summer 1991, Spring 1992, and Autumn 1992; Antic of August 1986; STart of Summer 1986; Dragon of October 1991; press releases and annual reports found in the Sierra archive at the Strong Museum of Play. Online sources include Matt Chats 173 and 174; Lori Ann Cole’s interview with Adventure Classic Gaming; Lori and Corey’s appearance on the Space Venture podcast; and various entries on the Coles’ own blog. But my biggest resource of all has been the Coles themselves, who took the time to patiently answer my many nit-picky questions at length. Thank you, Corey and Lori! And finally, courtesy of Corey and Lori, a little bonus for the truly dedicated among you who have read this far: some pages from an issue of their newsletter The Spell Book, including Corey’s take on “Fantasy Gaming Via Computer” circa summer 1982.)


September 08, 2016

Renga in Blue

Haunted House (1979)

by Jason Dyer at September 08, 2016 09:00 PM


This game was published by Radio Shack — the same ones who made the TRS-80 — and for obvious reasons was only available on that platform. The manual and tapes (it was originally published on two) give a copyright date of 1979, so I’m sticking with that.

It gives no author but mentions “Device Oriented Games” as the developer, which went on to make Bedlam (1982) and Pyramid 2000 (1982). Bedlam names the author as Robert Arnstein, who I am fairly certain was the author of every game from that company. Robert Arnstein is also credited as the author of Raäka-Tū (1981) and Xenos (1982), so we’ve got a genuine text adventure auteur on our hands. (Trivia: earlier he wrote 8080 Chess, the very first microcomputer program to participate in the ACM North American Computer Chess Championship.)

Clearly the most dramatic text adventure opening of all time.

Old Man Murray once ran a feature called “Time to Crate” which evaluated games based on how long it took for the game to have a crate. (They were everywhere at the end of the 1990s. Often it took 5 seconds or less to find a crate.) Text adventures of this era could be evaluated on the “time to reference of Crowther/Woods Adventure” system, which in this case is two moves.


Saying “plugh” tosses you inside the haunted house, with an objective to escape. There are no room descriptions, just room names (“YOU ARE AT THE DEN.”) and so far the only danger has been in ignoring a floating knife:


(Just taking the knife prevents the death.)

If you go in a direction that is invalid, the game will just print the room description again. I first thought there were mazelike loops everywhere, but since this happens in every single room, it must just be a quirk of the game.

Even for the era, the verb set I’ve been able to find is really sparse: directions (NSEW only), OPEN, CLOSE, DROP, GET, READ, POUR, and CLIMB (which just gives a response of “NO.”). Trying to use an invalid verb on an object gives the response “WHAT SHOULD I DO WITH IT?”, which is frustrating in that it barely pretends to understand. The only way I found to test whether a verb works at all is to type it without an object: a valid verb prompts “WHAT?” as opposed to “I DON’T UNDERSTAND.”
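
For the curious, the response logic described above can be reconstructed as a short Python sketch. This is purely my reconstruction from the observed behavior, not the original TRS-80 BASIC, and the table of handled verb/object pairs is hypothetical:

```python
# Verbs the parser recognizes at all, per the list above.
KNOWN_VERBS = {"OPEN", "CLOSE", "DROP", "GET", "READ", "POUR", "CLIMB"}
# Hypothetical verb/object pairs the game actually does something with.
HANDLED = {("GET", "KNIFE"), ("POUR", "BUCKET")}

def respond(command: str) -> str:
    words = command.upper().split()
    if not words:
        return "WHAT?"
    verb = words[0]
    obj = words[1] if len(words) > 1 else None
    if obj is not None:
        # Any verb applied to an object gets the same stock reply
        # unless the pair is specifically handled.
        if verb in KNOWN_VERBS and (verb, obj) in HANDLED:
            return "OK."
        return "WHAT SHOULD I DO WITH IT?"
    # A bare verb reveals whether the parser knows it at all.
    return "WHAT?" if verb in KNOWN_VERBS else "I DON'T UNDERSTAND."
```

The point of the sketch is how little the object-handling path discriminates: only the bare-verb path leaks any information about what the parser actually understands, which is exactly why testing verbs without objects was the only workable probe.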


For a long time I was stuck by a locked door. It turned out to be an absolutely horrible trick. I’ll explain in a second, but take a moment to study the right side of the map and think about it first.

Recall the “loop” property where room descriptions just repeat if you can’t move. There’s a servant’s quarters with a cabinet next to another room with a cabinet. There is no way to tell the difference between looping in place and entering a new room unless you’ve dropped something in the first room.


Things did not improve after I found the key. I came across a raging fire. I happened to be holding a bucket of water (one that magically refills if I pour it, even) but I am completely unable to apply it to the fire.


It’s been a while since I’ve skipped finishing a game for this project, but I just might have to invoke that option.

The Gameshelf: IF

My Obduction nonreview

by Andrew Plotkin at September 08, 2016 07:42 PM

Obduction is a really good adventure game. You should play it.

I finished the game a week ago and I've had a heck of a time thinking of anything to say. To be sure, my Myst review was written in 2002 and my Myst 5 review in 2010, so the sensible course is just to wait five or ten years and see where Cyan's gotten to. An Obduction review will make an excellent retrospective.

But I do want you to buy the game. (To help make sure Cyan makes it another five or ten years.) So, yeah, it's a really good game and you should play it.

Some of the Obduction posts I thought about writing, but didn't:

Comparing Obduction to Myst. Everybody else has done that. Summary: it's Myst except larger, and also Cyan has gotten better at story and puzzle design. End of blog post.

Comparing Obduction to Riven. Yeah, Riven is also Myst except larger and with better story and puzzle design. So Obduction is pretty much a new game as good as Riven. End of blog post.

Comparing Obduction to The Witness. Problem is, my whole Witness post was just comparing The Witness to Myst. Summary: The Witness really has no interest in being Myst. It's doing something else. Obduction is doing the same thing as Myst only Cyan has gotten better at it. End of blog post.

Talking about what I liked most. Boring and spoilery. I want you to play the game, not read about it.

Talking about what I liked least. It's not a perfect game. The plot is weirdly off-screen, and the couple of times it's thrust on-screen are the scenes where you're most confused about what you're doing. A couple of the puzzles are underclued, and in one case a puzzle's clues become unavailable (so if you didn't take notes, you're in trouble). But these are not large gripes, and you should still play the game.

Talking about the puzzle difficulty. Worth mentioning. Obduction keeps a tight hold on its puzzle mechanics; there are just a few major ones and most of the puzzles are about understanding them. But the game also exercises restraint about how far to take them. It does not take the Witness tack of "push every mechanic until your brain explodes." The result is a fairly smooth ride (although there are some shaky spots, as I said). There is no "that damn puzzle", which I think we can agree Riven has one of (and Witness has two or two dozen, depending on your mood).

Describing the bugs. Good grief, that's what Steam forums are for. Go wallow if you like.

Talking about the shadow. I admit a desire to go on a tear about the shadow. The Witness gives you, without recourse, a male shadow -- tallish, slender, short hair -- probably Jon Blow -- or if not him, certainly not you. Obduction gives you a choice between two shadows. Is that different? It's not much different.

It would be a great expenditure of effort to import the whole Uru avatar-modelling system with body shape and hairstyles and clothing -- plus height! -- just to model the shadow. Perhaps that's silly. But at this point, offering the choice between a 160-pound male avatar and a 120-pound female one feels like a thoroughly inadequate gesture towards player inclusiveness.

(Yes, it happens that late in Obduction you get an exact readout of your weight. It's not my weight, I'll tell you that.)

And so: This is even less a review than most of my not-really-reviews. I suppose I feel somewhat bruised by today's culture of games discussion, where DID THE DEVELOPER LIE is a more central question than what the game is doing and how well it does it. Also HOW CAN YOU POSSIBLY CHARGE THAT MUCH. And THE BUGS.

I admit the bugs aren't great. (I suffered from the black-page journal bug, and had to hit a wiki to fill in the holes.) But when I look around, I see a bunch of discussion that I want to back right the heck away from. Thus, all these posts I'm not writing.

If you want to know whether Obduction is worth the money, go take a long walk and think about what kind of games you enjoy. If you enjoy environmental puzzle adventure games, play Obduction. And I'll come back and write a review in a few years, when we've all gotten a better idea of how the next generation of adventure games is playing out.

September 07, 2016

Not Dead Hugo

roody labs

by Roody at September 07, 2016 10:34 PM

My latest bit of Hugo coding has consisted of distracted yet productive meandering.

* * *

Roodylib takes several important Hugo library routines and breaks them up into smaller routines for the sake of readability and modification (I'd rather give authors a way to change just the important thing than make them swap out the entire 126 lines of FindObject).  Of course, one side effect of this is that it greatly increases the starting number of routines in any given game.  The Hugo default maximum number of routines is 320.  Now, this is a changeable, soft limit, but it worries me when a Roodylib game creeps up to that 320 limit; I don't want to get to the point where my beginning advice to new authors includes how to raise the routine limit.  They have enough things to take in at that point.

DescribePlace, the routine responsible for room descriptions, was one that I split into several routines so authors had the ability to change the order in which things are listed.  To combat the creeping-routine problem, I redesigned the routines as objects with one routine to execute them.  It's actually been done for a while, but I didn't mention it because it's kind of a useless modification, all things considered, and the number of routines probably only bothers me.

* * *

I also modified the "shell" files included with Roodylib to automatically compile with the -s switch.  This provides compilation statistics like number of objects, routines, etc.  I always find this useful; I figure others will, too.

* * *

Sir Jiz has a lot of timers in his game, most of which he's been handling with the room each_turn property.  I usually do this by keeping the number-of-turns-spent-in-room in the misc property so the each_turn property routine can have some "select self.misc" code and go from there.  Long story short, Jiz was getting sick of reminding himself where the best place to set the misc value was.  I figured, ah, yeah, I guess Roodylib could do something about that.  So now there's a RoomTurnCount thing so nobody has to mess with misc anymore.

select RoomTurnCount
case 0
"Runs as soon as player enters room."
case 1
"Runs after first turn."
case 2
"And so on."

Although maybe I should have pushed him towards using a daemon instead.  Ah, well, always nice to have multiple ways to do things.

* * *

Some months ago, someone expressed interest in there being a Hugo Comp this year, so I tried to gauge interest from everybody at the forums and throw some theme ideas around.  I had been in the middle of a playthrough of "Spellcasting 201: The Sorcerer's Appliance" so I figured a magic-themed comp would be fun.  I even offered to throw together an extension for authors to use so they wouldn't have to write the magic system themselves.

I started off by looking at Cardinal Teulbachs' (yes, back in the olden days, we used to have an IF community member named "Cardinal Teulbachs") take on a spellcasting system.  His code strictly prohibited modification, though, so I was going to have to write my own thing from scratch (I don't always honor code licenses, but hey, this time I did).

I think I came up with the base design for my system, but I decided I needed to refresh my memory on how spells worked in the Enchanter trilogy (how many memory slots the player has, whether all spells can be memorized multiple times, stuff like that).

The funny thing, though, is that I found myself super distracted by the fact that you could not read dropped scrolls in Teulbachs' sample game, which was a departure from expected Infocom behavior.  Just the same, it made some sense, and I decided it'd be nice to write an object class system for objects that had to be held to be read (like a pamphlet) while still allowing for ones that don't (like a billboard).

More difficult still, I wanted to do this at the grammar level so that "You don't have that." messages wouldn't use up a turn.  Truthfully, it's not easy to get varied behavior out of held/unheld verbs.  I knew that when I did get around to writing this system, I'd have to use my Roodylib "routine grammar helper" system.

I finally got around to looking at this problem yesterday.  It was one of those funny times where you return to an old problem a bit smarter and almost resent the obligation to improve your solution ("WHY CAN'T I JUST STAY STUPID FOREVER?").

First, to help me design the readable object classes, I looked over some grammar classes I had made previously for containers that are emptied in different ways (those that had to be held vs those that don't and so forth).  In my testing, though, the "empty" code wasn't working, and for a while there, I thought maybe I had broken FindObject somewhere along the way (and getting FindObject to do the things I already have it do was a scary, confusing journey so I wasn't looking forward to working on it again).

It turned out that a call to FindObject from AnythingTokenCheck (which itself is called from within FindObject) should have used the "recurse" argument.  Whew!

I also decided that half of my original "routine grammar helper" code was unnecessary.

But anyway, that's all working better now, although I might redesign the routine grammar helper again, possibly to use attributes instead of variable masking.

And, oh yeah, if you were curious, no, I don't think the Hugo Comp is happening.  That discussion fizzled out pretty quickly.

* * *

STATUSTYPE is a global variable that determines what kind of information is displayed in the upper right corner of the status window.  One of the several options displays a game clock, as in Deadline.  Since the HoursMinutes routine used to print the time can also do military time, a while back I added a STATUSTYPE value that provides a military-time clock (like in Border Zone).  It kind of bugged me that the code didn't have an easy way to switch between military time with a colon and without.  While it would have been easy enough to give the author a choice, I really wanted it to be as unobtrusive as possible because, really, there's almost no chance anyone is going to write a military-time game again, and it'd just be embarrassing to show that I spent much time on a time-configuration system for a feature no one would ever use.  I ended up just going with a #set NO_MILITARY_COLON flag.
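The whole toggle amounts to very little code. A rough Python analogue (the function name and arguments are invented for illustration; this is not Roodylib's actual Hugo code, just a sketch of what a NO_MILITARY_COLON-style switch controls):

```python
def military_time(hours, minutes, with_colon=False):
    """Format a 24-hour status-line clock, e.g. 1405 vs 14:05.
    The with_colon switch plays the role a compile-time flag
    like NO_MILITARY_COLON would play in the library."""
    sep = ":" if with_colon else ""
    return f"{hours:02d}{sep}{minutes:02d}"

print(military_time(14, 5))        # → 1405
print(military_time(14, 5, True))  # → 14:05
```

Which is exactly why a single #set flag feels like the right weight for the feature.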

* * *

There are a couple other things, but I'm getting tired of writing.  Anyway, probably will try to put out a Roodylib update within the next couple weeks and then maybe do a round of uploading-stuff-to-the-IF-archive.  Then, maybe, write some IF?  (gasp)

The Gameshelf: IF

Dropbox dropping support for playable HTML

by Andrew Plotkin at September 07, 2016 09:11 PM

(This has been widely noted, but I wanted to summarize what's known.)

At the beginning of September, some Dropbox users got email:

We’re writing to let you know that we’ll be discontinuing the ability to render HTML content in-browser via shared links or Public Folder. If you're using Dropbox shared links to host HTML files for a website, the content will no longer display in-browser.

(Text copied from a post on the ChoiceOfGames forum -- thanks jeantown.)

Dropbox has posted a more complete summary on their web site:

Dropbox Basic (free) users: Beginning October 3, 2016, you can no longer use shared links to render HTML content in a web browser. If you created a website that directly displays HTML content from your Dropbox, it will no longer render in the browser. The HTML content itself will still remain in your Dropbox and can be shared.

Dropbox Pro and Business users: Beginning September 1, 2017, you can no longer render HTML content.

In other words, in a month (for free users) or twelve months (for paid users), people will no longer be able to play your HTML-based games directly off of Dropbox. They'll either appear as raw HTML or as "download this file" links -- it's not clear which. (Other kinds of files, such as images or CSS files, will not be affected.)

Why are they doing this? I haven't seen a public explanation, but I assume it's because jerks are using Dropbox to anonymously host Javascript malware. Google Drive has announced a similar change.

Okay, you may ask, but does anybody publicize games this way? The ChoiceOfGames forum thread implies that the answer is yes. See also this thread and this thread.

In fact I've done this myself. When I first posted Bigger Than You Think as part of Yuletide 2012, I hosted it on Dropbox for the first seven days. Yuletide has a seven-day anonymity period, and Dropbox was an obvious short-term solution.

I know of games which only exist as Dropbox URLs, notably the creepy-comic Twine game Mastaba Snoopy. It was widely discussed in early 2013, but the only known source was this Dropbox URL. (Which currently redirects to this equivalent URL.) So that's a wee bit of history which will stop working in a month, or maybe twelve months.

To be sure, Mastaba Snoopy will not vanish. You will be able to download it as HTML (from the old URL) and play it locally. It will work fine that way. (In fact, it may work better. I've seen the Dropbox version mess up the game's Unicode something fierce.)

However, that only works because the game is a single self-contained HTML file. An HTML game with included images, sounds, JS/CSS files, or other resources would be harder to fetch. (BTYT included several JS/CSS files.) You'd have to download the HTML, ferret out all the relative URLs, and then download those too. This is always possible (unless the author has really worked to obfuscate the code!) but it may not be trivial.


If you are the author of a Twine game (or other web-based game) on Dropbox, and you have abandoned it, then you're not reading this post. Drat!

(If you're reading this and you care about the future playability of your game, I count that as "not abandoned".)

Your game is not under threat of disappearance, but most casual players will think it is broken. Preservation-minded fans may pick it up and make copies -- probably without your permission, since you're not reading this post. Sorry! Deliberate non-archivability of games is an interesting subject, but chances are high that somebody will download a copy for posterity. I have downloaded a copy of Mastaba Snoopy for my own files.

If you are the author of such a game and you want to keep it easily playable, you have various options for reposting it:

  • Itch.IO: Free. HTML games work fine, although they appear in an iframe. Probably you could launch the frame as a separate window if you tried.

  • Github Pages: If you have a Github account, you can post HTML pages at your github.io domain. This is a better fit for open-source projects, although the Pages repository is not required to be public.

  • Free, but you need a Twitter account. Intended for Twine games. I believe you can only upload a single HTML file, but I bet you could rig up a scheme where the HTML is on and the resource files (images, JS, CSS) are on Dropbox. Let me know if you make that work.

I've seen people suggest the IF Archive, but this is not a great solution. Speaking with my Archive hat on, we prefer that HTML-based games be uploaded as archives (.zip files). We don't want to be a front-line resource for playing IF; we're just not set up for that.

(We're not enforcing this as a hard-and-fast rule. In particular the IF Competition folders on the Archive contain a lot of playable games. Sorry; a 25-year history makes for a lot of exceptions.)

(It would be interesting if or a similar site gained the ability to download a Twine .zip package off the Archive, unpack and cache it, and offer it as a playable game. Eh? Eh?)

I have a long-held view -- admittedly biased by my long history with the IF Archive -- that IF preservation is deeply tied to the notion of a single-file release format for games. Of course this goes back to the days when you put your .z5 or .gam file up on the IF Archive and that was it; that's what people downloaded.

Years later, we caught on to the idea of making browser-playable IF. That left us in a weird state where authors were expected to release games twice, on a web page (a bunch of files) and on the Archive (a single file). The service helped stitch that divide back together -- you could just upload to the Archive and get browser-playability for free. (Although without stylistic customizability.)

But then Twine turned up, and life got messy again, because the Twine model only envisioned browser playability. Which was clearly sensible; downloading a file in a funny obscure format is obviously the wrong choice. Unless you think about archiving and preservation! Then having a file at a URL makes life so much easier.

And so we wind up back at this blog post. I have no concrete suggestions beyond the unpack-and-cache service I mentioned above. Which, yes, has its own security implications. (The same ones Dropbox faced in the first place.)

September 06, 2016

Web Interactive Fiction

Ruminations on a burst of Inform 6 coding in Inform 7 times…

by David Cornelson at September 06, 2016 01:01 AM

You may have seen my post of a very small story. I spent part of my Labor Day weekend appeasing a festering lack of creativity by “cleaning up” some old code in Inform 6. I had a zip file of all of the story and I6 library files, along with the 1999 version of infrmw32.

I haven’t been doing a ton of Inform 7 coding lately, but enough in the last year or so that I’m mostly comfortable with its syntax and usage scenarios. In my work life I write mostly C#, so jumping back into Inform 6 was interesting. Most of the basic syntax is easily remembered, but boy, those I6 idiosyncrasies are a boatload of fun. Not having static analysis of your code is really frustrating when you find things like:

Object Foo with name 'foo', counter true, has scenery;

And try to check:


That little mishap, which will compile and execute but give you the wrong value, is awesome.

But the most glaring difference is in usability. I eventually got my testing down to quick reruns of code in WinFrotz, but having to plan through all of that, manage code by hand, and debug with print statements makes Inform 7 a huge blessing. One of the reasons I even tackled this was that I had written reasonably decent code, broken into include files, with base classes. Without that, I might not have bothered.

Even so, I can honestly say that for small stories, I might actually use Inform 6 instead of Inform 7. For Speed-IF-type endeavors, it might just be (for me) a more efficient use of my time.

Of course for any WIPs I have, it’s Inform 7 all the way. At least now I’ve been reminded of some of the benefits.

Add a Comment

September 05, 2016

These Heterogenous Tasks

Introcomp ’16: Spellbound; Some Exceptions for Reasons Unknown

by Sam Kabo Ashwell at September 05, 2016 08:01 PM

Introcomp, a competition for the opening sections of interactive fiction games, is running through September 10. (Full disclosure: I am married to the comp organiser.) Today: a couple of fantasy parser games.

Spellbound (Adam Perry) is a wordplay parser game, taking roughly the same approach as the wordplay mechanic of Brian Rapp’s 2010 game Under, in Erebus: collect letters from the environment, then assemble them into words which, becoming real, form puzzle solutions.

Erebus, while ultimately a pretty charming game, made some big unforced errors; Spellbound avoids these. You’re immediately and clearly told what the game’s mechanic is, so you can start playing for real very promptly; and using letters doesn’t exhaust them, so you don’t have to waste time on retrieving new copies. Without this friction, Spellbound makes for a straightforward, smooth-playing, low-challenge intro. Like Erebus, your goal is mostly to guess what kind of currently-possible word might solve your present problems: you have a three-letter word frame, and as you accumulate letters you can make more stuff.
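The core loop is easy to see in miniature. A toy Python sketch of the search the player performs (the function, the collected letters, and the word list are all invented for illustration, not anything from the game):

```python
def possible_words(collected, frame_size, dictionary):
    """Words of exactly frame_size length spellable from the
    collected letters. Letters are reusable, matching Spellbound's
    rule that using a letter doesn't exhaust it."""
    pool = set(collected)
    return sorted(w for w in dictionary
                  if len(w) == frame_size and set(w) <= pool)

# Hypothetical inventory of letters and a toy dictionary:
print(possible_words("abt", 3, {"bat", "tab", "cat", "act", "tack"}))
# → ['bat', 'tab']
```

As the rack grows, the candidate set grows with it, which is the "more letters, more stuff" progression described above.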

A consistent issue with wordplay IF is that it doesn’t generally lend itself well to compelling stories, worlds or characters. Most are surreal, fragmentary pieces with motivation that’s either weak or entirely metafictional; the only solid exception is Counterfeit Monkey. Often when I’m playing wordplay IF, I end up with the feeling that I’m really just playing a word puzzle and the IF-ish part isn’t contributing anything except a nuisance. Spellbound seems as though it might be aiming at a somewhat more cohesive setting than Nord and Bert, Ad Verbum or Andrew Schultz’s work, but thus far it hasn’t been developed into much more than a puzzle backdrop. The protagonist’s motivation is ‘go on a quest to retrieve all the letters’, and they’re otherwise uncharacterised; so the thing’s not very compelling as fiction.

Considered as wordplay puzzles, Spellbound‘s are very simple. They would get somewhat tougher with more letters and a bigger rack, and hence more possibilities to guess at; but not, I suspect, more interesting. So what I’d want to see in the rest of the game would be elaborations on the basic mechanic – variations that go beyond a steady increase in available letters and frame size, words that have persistent effects rather than acting as simple lock-key solutions.

Some Exceptions for Reasons Unknown (Thomas Mack) is a parser game about a fantasy thief who has to deal with a dragon.

Thomas Mack was one of the authors of Speculative Fiction, and there are obvious similarities here: the protagonist is a rogue type, serving a wizard, in a world of corruption, incompetence and other forms of mild nastiness. The following gives a pretty decent idea of the flavour of things:

The tapestries show Blackacre’s founder, depicted as a victorious blond knight in spotless armor, slaying some sort of monstrous serpent. According to town legend, he defeated a monster that had been threatening the area and made it safe again for human habitation; in gratitude, the struggling settlers then declared him their mayor. In reality, Blackacre was founded by a consortium of herring merchants looking for a port closer to their shoals to cut down on transportation costs, and the founding comprised filing a proposal with the deputy undersecretary of the Royal Ministry of Fisheries. The artist chose the story that’s easier to weave into a tapestry.

This recognisably fits into a well-trodden and comfortable subgenre of parser IF, alongside works like Augmented Fourth or Risorgimento Represso: snarky fantasy comedies with caricature characters, formally quite traditional, often with some amount of nastiness but with a tone that’s essentially light comedy.

The exact premise of the story is a little bit fiddly: the protagonist was in the Thieves’ Guild but got forced out after a failed heist; they’re now a wizard’s apprentice, and they’re making an application to lead an expedition to hunt a troublesome dragon; but the wizard will only allow them to do this if they do a bunch of chores first. ‘Do some chores for a wizard’ is very high on the list of IF Objectives That Make Me Instantly Weary, falling somewhere between ‘make yourself some coffee at your office job’ and ‘restore functionality to a deserted spacecraft.’ So there’s pretty strong signalling that this is aiming at being a very traditional text adventure.

So, for instance, there’s a puzzle where you have to steal something from a shop, but the door is guarded by a golem who knows if you’re sneaking things out unpaid-for. This is a really classic species of IF puzzle – the alternate conveyance for an object which can’t be carried through a gateway – and it incorporates a really standard element, an NPC whose actions are predictable and can thus be readily exploited if you figure out how. And this last is how basically all the puzzles work: they don’t have a particularly systematic mechanic, but they share a common theme of subterfuge and manipulation. This is not a very pleasant way of thinking about people, and it’s justified by the PC being kind of a selfish jerk, which is itself justified by everyone in this world being a selfish jerk. But – and this is what I really mean when I say the tone is light – it’s not really very concerned with what it’s like to inhabit a world where everyone’s viciously selfish. The protagonist is a notch or two up from an AFGNCAAP – they have a history and personal motives, but not a name, a gender, or very much in the way of an internal life.

The writing’s fine, although the comedy relies so consistently on a tone of… comfortable cynicism? that it ends up feeling rather one-note. I’m willing to forgive a very great deal if a comedy piece can get a genuine laugh out of me, and this never quite got there.

So Exceptions appears to be aiming to be the kind of game that I’d score at about a 6 in the IF Comp: capably made, mildly entertaining, but fundamentally safe, not aiming at anything ambitious or unexpected in prose style, mechanics or subject-matter. I feel a little bad about grumbling about it, because there’s solid craft on display here; it’s also the most substantial of the entries by a considerable margin, and the one which gives me the most confidence that the author has the chops to complete it. I just couldn’t find anything to get very excited about.

Performance note: I got some pretty heavy lag on some commands, especially INVENTORY. This is fairly unusual for a game made in Inform 6 that isn’t obviously doing anything unusually complex.

Renga in Blue

Eamon: The Lair of the Minotaur

by Jason Dyer at September 05, 2016 12:00 PM

This is the first “full length” game for Eamon past the Beginner’s Cave, and is written by Donald Brown himself.


“Girlfriend” as a choice was automatic. If your character is female it assumes “boyfriend”.

In order to play I had to take a character through the Cave first to gather enough experience in combat, then port that same character into the Lair. I can’t emphasize enough how pleasing this sort of continuity feels; I’m fairly sure this is part of the reason Eamon took off.

There’s sort of a plot?


This doesn’t play nearly as well as Beginner’s Cave. That game was tight enough that it felt like a genuine dungeon crawl and all the features had a chance to shine. This game has the same problem as Greg Hassett’s games, where more space for rooms leads to more rooms that do nothing.


(Click on the map for a larger version.)

Mind you, the RPG system is still relatively strong, and I had emergent sequences like this one:

  • I ran across a “black knight” whose heavy armor was very hard to penetrate in battle; fortunately, the knight fumbled and dropped their sword, which I was able to grab. The knight then proceeded to run away. This led to a weird inversion where I was stalking the black knight, repeatedly trying to hit it (for the weapon experience, of course) like I was the relentless stalker of some horror movie. Eventually I got tired of trying to knock the knight’s hit points down to zero and let it live.
  • In the process of knight-stalking I came across a “wandering minstrel eye” who was friendly and started following me around. Not helping in combat, mind you, just following, like a small puppy.
  • I met an (evil?) priest in a room full of ancient books, whom I bested in an extended combat. Unfortunately, in the midst of battle the priest decided the wandering eye was a valid target and slew it in a single blow.
  • I found the girlfriend in need of rescue tied to an altar with another evil priest. Unfortunately I was low on health and died before I could free her.

Related to health, I had enough money to come in with a spell this time (HEAL) which predictably healed some damage from prior combats, but as far as I could tell only worked once during the game. It’s almost more like I bought a consumable potion rather than a spell. Maybe it regenerates after enough turns or some such but I wasn’t able to figure out a way to use the spell again.

After the debacle above I made a second character which I first ran through the Beginner’s Cave again trying to get better statistics. That character fumbled and killed himself with his own sword before he could even make it out of that game. Whoops.

I repeated the sequence with a third, much more successful character before bringing him to the Lair. This time I was a bit more selective in my combats and managed to free the girlfriend, who was then able to contribute to combat. I then made my way through the maze (see map above; the “loops” connecting bottom to top were non-obvious) and defeated the minotaur, mainly by staying alive long enough for him to drop his weapon.

The strongest aspect of the game past the regular Eamon system is the amount of optional activity. Since no treasures are “required” and simply result in more gold at the end of the adventure, monsters and puzzles can be ignored to the extent that there’s a “branching plot” feel.

For example: There’s a stone with the word “CIGAM” on it, and if you SAY the right word (I’ll let you guess which) an emerald will pop out. There’s a patch of ground that appears to have been recently dug, and if you bring a shovel you will find some gold coins. There’s a room with 5000 silver coins which are only practical to carry if you find a magic bag in another part of the map.

There are also two “neutral” monsters: a blacksmith with a golden anvil (who is neutral when you enter his room, but whom you can kill and rob, because D&D) and a gypsy with a wicked-looking sword. The charisma stat also comes into play here. I suspect it’s possible to make friends with the black knight with a lucky enough reaction, for instance.


There’s even one “backup item” branch. At the beginning there’s a coffin with a skeleton; if you kill the skeleton you get a “skeleton key” you need to unlock a gate later. If you skip fighting the skeleton (easy to do, since just past it there’s a river that makes for a one-way trip), the previously-mentioned priest with the ancient books has a skeleton key you can use instead.

While this game and the next couple of Eamons are early enough in history that I wouldn’t want to miss them, I do suspect enough of them tip far enough into the “RPG” category that I may start skipping them in my All the Adventures list. As is, though, Eamon won’t be coming back until I’m out of 1979.


The obligatory Adventure reference. This is more useful than it might appear, because it makes influences clear; when Jimmy Maher was trying to apply a date to Eamon it was otherwise unclear if Donald Brown had seen Adventure at all.