Planet Interactive Fiction

April 21, 2018

PyramidIF

Spring Thing 2018 - ULTRAMARINE: A Seapunk Adventure

by Hanon Ondricek ([email protected]) at April 21, 2018 09:20 PM

ULTRAMARINE: A SEAPUNK ADVENTURE (Seven Submarines) - This appears to be the only full-on Ren'Py Visual Novel in Spring Thing. Seven individuals (submarines?) are credited and a lot of obvious work went into this presentation. The art skews more toward superhero comic than anime, and I got a very He-Man/She-Ra after-school cartoon vibe from this. Sexy Mer-People! Look at the hair! I like the visual design a lot but had to turn the music way down - not because it was bad, but because the music is performed with very synthetic instrument patches that I (personally) couldn't take much of. That could be easily fixed in software by remixing the tracks.
The mer-hunks are described as "giants"...but nothing seems specifically out of proportion beyond a comic book or professional wrestling scale here...

I'm a newcomer to VN, so I'm interested in how the presentation varies from text parser and choice IF. One thing about VN is that there's a lot of text.

The Prince's cape is faultlessly starched...or...well, this is underwater so fair enough.

A lot of text. I guess I shouldn't be as surprised as I am, coming from a completely text-based medium, but VNs require you to either click forward a lot, set the auto-advance to a comfortable pace to go hands-free, or skip text - most every VN engine has this built into the controls, so maybe this isn't out of the ordinary.



I'd been reading VNs where "you" (the PC) aren't onscreen, and it took me a few minutes to figure out that I was "playing" Gabrielle Freedman (or at least making choices for her) despite the writing being in the third person. She and the other two main characters (Prince Nautica and his trusty guard-bro Zeppelin) shuffle around while facing front a good deal before the director settles down and they land in some default positions to rest. They look great. And then they've got a whole lot of exposition for us - a good deal of worldbuilding was done here about undersea magikal-with-a-K denizens and everyone has superpowers, but there's a lot of "as we all know" chatter like a daily soap opera where they can't always show you a car crash or brain surgery but they'll talk about it for a week till the viewer almost feels like they were there.


I may be unfair in this criticism since I don't know the threshold of novel-to-visual that's generally accepted. For my personal taste, I think VN works best when the PC can have personal "in the now" conversations with one or more characters as opposed to being told about epic battles they weren't around f-

Whoa BATTLE SEQUENCE

Nobody expects a battle sequence! Well...all right! I'm happy I made it there. It's nothing extraordinary gameplay-wise but I finally got to participate in the action! Or at least manage some numbers for a while. Again...maybe I'm expecting too much from a visual novel - but novel means the writing should still fall under the jurisdiction of "show don't tell" - even more so since they've got all this luscious art to use.

I appreciated ULTRAMARINE, and though I think I made only two or three actual choices, I got a numbered unsatisfactory ending, so it appears I could go through again and try for another ending using the SKIP function (I didn't bother to save) but I think I can infer the other branches of plot I didn't discover. Though this is listed as a "full length" game - mostly due to the expository water-treading - I felt like I was being somehow hastily brought up to speed on a much more expansive story in a bigger world than shown here. I'd love to see at least some of this happen over some still art or kinetic concept drawings to break up the characters just do-si-do-ing their positions while facing the audience and describing the off-stage action.




Emily Short

Life in a Northern Town (People + Places, Spring Thing 2018)

by Emily Short at April 21, 2018 11:40 AM


From Spring Thing 2018, Life in a Northern Town is what sometimes gets called dynamic fiction as a subset of interactive fiction: a piece in which, for most characters, you’re never making a single choice that changes an outcome or modifies the shape of a narrative. (Brianna’s chapter, in inklewriter, is an exception: she has actual agency over who she chooses to engage with. But the vast, vast majority of this story is about people making dangerous decisions while the player has no opportunity to intervene or prevent them from doing so.)

For most of the elements, a majority of the clicks are click-to-continue options, and some of the sub-stories in the piece are presented in formats such as groups of images on Instagram, where branching would be very hard to arrange. Other elements are told in Twine or on WordPress, eight different people’s perspectives on the same story — though it’s not really trying for a mimetic effect here. It’s not ARG-ishly pretending to actually be the blogs of all these people. Here and there, images are included, especially on the Instagram segments, but elsewhere it’s almost all text, including the largest chunk of the story which is presented in unstyled Twine.

Still, it’s not the same story it would have been if it had been written into a book. The work of reading it is part of the point, for one thing. This is a story about labor, and that labor is recaptured in the act of reading.

For another, the dynamic-fiction presentation fractures the temporal sequence of scenes, especially in the Twine segments. Often there will be a short scene of dialogue between characters, and then clicking through a link will reveal another beat in the same conversation, another interaction, which might be chronologically before or after the first. It doesn’t really matter how they’re joined up, temporally. I never found this to be confusing. Rather, it gave me a sense that I was getting the overall impression of the interaction and then a couple of other key moments from that interaction, in the same way I might when going over a memory in my head. A handful of times the revealed secondary beat actually overturns the sense of the initial interaction.

So I can see reasons for the way it’s presented, but this is a long piece of work — took me some hours to read, and I’m a pretty fast reader — and by the end I would really have appreciated a more comfortable, less laborious reading experience. Other markers are missing, too: there aren’t chapter breaks, so sometimes the story ratchets forward to a new scene or location without an explicit division. There’s no progress indicator, either, which I really miss when I’ve got a multi-hour work on my hands.

Something like this stands or falls on the quality of its writing. In my initial encounter with the first of its linked stories, “Dangerous Work”, I was a little discouraged by the styling and structure — of course it’s not always the case, but standard, unformatted blue-and-white-on-black Twine sometimes goes with low-effort authoring. But I found myself continuing to read screen after screen, connecting with the luckless protagonist and her precarious life in and around Minneapolis.

I’ve lived in that area myself, some years back, and I have relatives who farm in North Dakota. It felt familiar in places, and truthful when unfamiliar.

Eventually, the plot kicks up a notch, and we have a story that feels like it owes a lot to Fargo, especially if by Fargo you mean not just the original movie but Fargo as a TV series and a Fiasco-inspiration, Fargo the genre, a genre of single-story towns and grey-brown plains and blood on snow, a genre in which incompetent, venal criminality leads to shocking amounts of violence. Crime is really best left to the professionals: that’s one of the core principles of a Fargo story. Meanwhile, there’s a virtuous police officer, typically female, who understands that for society’s fabric to hold, we all have to be kind to each other, and follow the rules.

And Fargo the genre is also a black comedy genre, a farce in which characters desperately try to distract each other because, say, the dead man’s severed thumb has rolled into a corner and the unsuspecting janitor has just arrived for work.

Life in a Northern Town is not mostly all that funny, or not funny in that way: farce, even sinister farce, needs a fast pace, and neither the writing nor the interaction style is particularly hurried here. But many of the other elements are present: people who mean well-ish; who aren’t so much bad people as insufficiently motivated to be lawful, who start things they can’t finish and watch them spin out of control.

At the same time, there’s another strand here: Life in a Northern Town is darker than the average Fargo-story about the underlying systems of life. Our heroes do stupid things and get in trouble because they don’t have enough money: so far, so standard. But the reason they don’t have enough money is that they’ve been screwed over by the previous generation, both generally and specifically. The economy is bad, and also their parents, specifically, were bad managers and left them in a state. People are in pain, physically and emotionally, and they resort to various methods to dull that pain because society as a whole isn’t providing much by way of mental or physical health care. They all have to make moral compromises to have enough to live on, and even before things go really wrong, our main protagonist is supporting a fracking camp, which in theory she doesn’t approve of, but what choice does she have?

Meanwhile, there’s no good police officer here, no one who represents Lawful Good as a successful way to lead a civilized life.

So it’s grim and long, like a midwestern winter. More editing might help. It’s not in bad shape, and I only noted a few typos in all its plentiful text. But segments of Amina’s story drag out, revisiting similar conversation scenes over and over before anything happens; Brianna’s story occasionally shifts without warning or explanation between first- and third-person narration. And I wouldn’t have minded a stronger distinction of voice between the characters, who sometimes seem a bit alike in the way they talk and the cadence of their narration. New character perspectives typically bring some new information, but for the most part they’re reiterating a story we already know, perhaps a story we’ve already been through half a dozen times. Keeping this vibrant needs either a tight, thriller-like construction, each character bringing something new to the story that snaps the meaning of the plot into new perspective — or else a much looser one, more an anthology of lives with greater divergence in their experiences.

Then, too, I feel like the segments of the story that are presented last on the contents page are not necessarily the strongest. So it’s possible — indeed, likely — to work through this in a way that lands the whole experience with a whimper rather than a bang. Life in a Northern Town is more of a novel as hypertext pieces go, but it’s relying on structures of interactivity and forms of navigation suited to short stories.


Chapters are nodes in Arcadia‘s content map; each thread is a viewpoint character

I do wonder what this would have been like presented with a map like the one in Iain Pears’ Arcadia, the ability to slide sideways from one timeline to another, to follow one character or another away from a given encounter, to pick whether we wanted to read next about events in South Dakota or about contemporary happenings in Finland, or even back up a step in another character’s storyline and find out what brought that character to this particular impasse.

Because I felt that the piece intends the readers to be comparing and contrasting scenes from different angles, but it’s not actually easy to do that, to jump back and forth between places in a markerless expanse of Twine text.

In any case, the overall idea — exploring a narrative space from multiple character viewpoints and with the ability to swap between characters at different times — reminded me not only of Arcadia and Common Ground but also of Punchdrunk’s immersive theatre productions — where you cannot go backward in time, but where the events of a plot may repeat several times over an evening in order to make the plot space explorable.

April 20, 2018

The Digital Antiquarian

The Game of Everything, Part 6: Civilization and Religion

by Jimmy Maher at April 20, 2018 05:41 PM

Science without religion is lame, religion without science is blind.

— Albert Einstein

If you ever feel like listening to two people talking past one another, put a strident atheist and a committed theist in a room together and ask them to discuss The God Question. The strident atheist — who, as a colleague of the psychologist and philosopher of religion William James once put it, “believes in No-God and worships Him” — will trot out a long series of supremely obvious, supremely tedious Objective Truths. He’ll talk about evolution, about the “God of the gaps” theory of religion as a mere placeholder for all the things we don’t yet understand, about background radiation from the Big Bang, about the age-old dilemma of how a righteous God could allow all of the evil and suffering which plague our world. He’ll talk and talk and talk, all the while presuming that the theist couldn’t possibly be intelligent enough to have considered any of these things for herself, and that once she’s been exposed to them at last her God delusion will vanish in a puff of incontrovertible logic. The theist, for her part, is much less equipped to argue in this fashion, but she does her best, trying to explain using the crude tool of words her ineffable experiences that transcend language. But her atheist friend, alas, has no time, patience, or possibly capability for transcendence.

My own intention today certainly isn’t to convince you of the existence or non-existence of God. Being a happy agnostic —  one of what the Catholic historian Hugh Ross Williamson called “the wishy-washy boneless mediocrities who flap around in the middle” — I make a poor advocate for either side of the debate.  But I will say that, while I feel a little sorry for those people who have made themselves slaves to religious dogma and thereby all but lost the capacity to reason in many areas of their lives, I also feel for those who have lost or purged the capacity to get beyond logic and quantities and experience the transcendent.

“One must have musical ears to know the value of a symphony,” writes William James. “One must have been in love one’s self to understand a lover’s state of mind. Lacking the heart or ear, we cannot interpret the musician or the lover justly, and are even likely to consider him weak-minded or absurd.” Richard Dawkins, one of the more tedious of our present-day believers in No-God, spends the better part of a chapter in his book The God Delusion twisting himself into knots over the Einstein quote that opens this article, trying to logically square the belief of the most important scientist of the twentieth century in the universe’s ineffability with the same figure’s claim not to believe in a “personal God.” Like a cat chasing a laser pointer, Dawkins keeps running around trying to pin down that which refuses to be captured. He might be happier if he could learn just to let the mystery be.

In a sense, a game which hopes to capture the more transcendent aspects of life runs into the same barriers as the unadulteratedly rational person hoping to understand them. Many commenters, myself included, have criticized games over the years for a certain thematic niggardliness, a refusal to look beyond the physics of tanks and trains and trebuchets and engage with the concerns of higher art. We’ve tended to lay this failure at the feet of a design culture that too often celebrates immaturity, but that may not be entirely fair. Computers are at bottom calculating machines, meaning they’re well-suited to simulating easily quantifiable physical realities. But how do you quantify love, beauty, or religious experience? It can be dangerous even to try. At worst — and possibly at best as well — you can wind up demeaning the very ineffabilities you wished to celebrate.

Civilization as well falls victim to this possibly irreconcilable dilemma. In creating their game of everything, their earnest attempt to capture the entirety of the long drama of human civilization, Sid Meier and Bruce Shelley could hardly afford to leave out religion, one of said drama’s prime driving forces. Indeed, they endeavored to give it more than a token part, including the pivotal advances of Ceremonial Burial, Mysticism, and Religion — the last really a stand-in for Christianity, giving you the opportunity to build “cathedrals” — along with such religious Wonders of the World as the Oracle of Delphi, the Sistine Chapel, and the church music of Johann Sebastian Bach. Yet it seems that they didn’t know quite what to do with these things in mechanical, quantifiable, computable terms.

The Civilopedia thus tells us that religion was important in history only because “it brought peace of mind and the ability to get on with the work of life.” In that spirit, all of the advances and Wonders dealing with religion serve in one way or another to decrease the unhappiness level of your cities — a level which, if it gets too high, can throw a city and possibly even your entire civilization into revolt. “The role of religion in Sid Meier’s Civilization,” note Johnny L. Wilson and Alan Emrich in Civilization: or Rome on 640K a Day, “is basically the cynical role of pacifying the masses rather than serving as an agent for progress.” This didn’t sit terribly well with Wilson in particular, who happened to be an ordained Baptist minister. Nor could it have done so with Sid Meier, himself a lifelong believer. But, really, what else were they to do with religion in the context of a numbers-oriented strategy game?

I don’t have an answer to that question, but I do feel compelled to make the argument the game fails to make, to offer a defense of religion — and particularly, what with Civilization being a Western game with a Western historical orientation, Christianity — as a true agent of progress rather than a mere panacea. In these times of ours, when science and religion seem to be at war and the latter is all too frequently read as the greatest impediment to our continued progress, such a defense is perhaps more needed than ever.

Richard Dawkins smugly pats himself on the back for his fair-mindedness when, asked if he really considers religion to be the root of all evil in the world, he replies that no, “religion is not the root of all evil, for no one thing is the root of everything.” And yet, he tells us:

Imagine, with John Lennon, a world with no religion. Imagine no suicide bombers, no 9/11, no 7/7, no Crusades, no witch hunts, no Gunpowder Plot, no Indian partition, no Israeli/Palestinian Wars, no Serb/Croat/Muslim massacres, no persecution of Jews as “Christ-killers,” no Northern Ireland “troubles,” no “honour killings,” no shiny-suited bouffant-haired televangelists fleecing gullible people of their money (“God wants you to give till it hurts”). Imagine no Taliban to blow up ancient statues, no public beheadings of blasphemers, no flogging of female skin for the crime of showing an inch of it.

Fair points all; the record of religious — and not least Christian — atrocities is well-established. In the interest of complete fairness, however, let’s also acknowledge that but for religion those ancient statues whose destruction at the hands of the Taliban Dawkins so rightfully decries, not to mention his Jews being persecuted by Christians, would never have existed in the first place. Scholar of early Christianity Bart D. Ehrman — who, in case it matters, is himself today a reluctant non-believer — describes a small subset of the other things the world would lack if Christianity alone had never come to be:

The ancient triumph of Christianity proved to be the single greatest cultural transformation our world has ever seen. Without it the entire history of Late Antiquity would not have happened as it did. We would never have had the Middle Ages, the Reformation, the Renaissance, or modernity as we know it. There could never have been a Matthew Arnold. Or any of the Victorian poets. Or any of the other authors of our canon: no Milton, no Shakespeare, no Chaucer. We would have had none of our revered artists: no Michelangelo, Leonardo da Vinci, or Rembrandt. And none of our brilliant composers: no Mozart, Handel, or Bach.

One could say that such an elaborate counterfactual sounds more impressive than it really is; the proverbial butterfly flapping its wings somewhere in antiquity could presumably also have deprived us of all those things. Yet I think Ehrman’s deeper point is that all of the things and people he mentions, along with the modern world order and even the narrative of progress that has done so much to shape it, are at heart deeply Christian, whether they express any beliefs about God or not. I realize that’s an audacious statement to make, so let me try to unpack it as carefully as possible.

In earlier articles, I’ve danced around the idea of the narrative of progress as a prescriptive ethical framework — a statement of the way things ought to be — rather than a descriptive explication of the way they actually are. Let me try to make that idea clearer now by turning to one of the most important documents to emerge from the Enlightenment, the era that spawned the narrative of progress: the American Declaration of Independence.

We don’t need to read any further than the beginning of the second paragraph to find what we’re looking for: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the pursuit of Happiness.” No other sentence I’ve ever read foregrounds the metaphysical aspect of progress quite like this one, the most famous sentence in the Declaration, possibly the most famous in all of American history. It’s a sentence that still gives me goosebumps every time I read it, thanks not least to that first clause: “We hold these truths to be self-evident.” This might just be the most sweeping logical hand-wave I’ve ever seen. Nowhere in 1776 were any of these “truths” about human equality “self-evident.” The very notion that a functioning society could ever be founded on the principle of equality among people was no more than a century old. Over the course of that century, philosophers such as Immanuel Kant and John Locke had expended thousands of pages in justifying what Thomas Jefferson was now dismissing as a given not worthy of discussion. With all due caveats to the scourge of slavery and the oppression of women and all the other imperfections of the young United States to come, the example that a society could indeed be built around equal toleration and respect for everyone was one of the most inspiring the world has ever known — and one that had very little to do with strict rationality.

Even today, there is absolutely no scientific basis to a claim that all people are equal. Science clearly tells us just the opposite: that some people are smarter, stronger, and healthier than other people. Still, the modern progressive ideal, allegedly so rational, continues to take as one of its most important principles Jefferson’s leap of faith. Nor does Jefferson’s extra-rational idealism stand alone. Consider that one version of the narrative of progress, the one bound up with Georg Wilhelm Friedrich Hegel’s eschatology of an end to which all of history is leading, tells us that that end will be achieved when all people are allowed to work in their individual thymos-fulfilling roles, as they ought to be. But “ought to,” of course, has little relevance in logic or science.

If the progressive impulse cannot be ascribed to pure rationality, we have to ask ourselves where Jefferson’s noble hand-wave came from. In a move that will surprise none of you who’ve read this far, I’d like to propose that the seeds of progressivism lie in the earliest days of the upstart religion of Christianity.

“The past is a foreign country,” wrote L.P. Hartley in The Go-Between. “They do things differently there.” And no more recent past is quite so foreign to us as the time before Jesus Christ was (actually or mythically) born. Bart D. Ehrman characterizes pre-Christian Mediterranean civilization as a culture of “dominance”:

In a culture of dominance, those with power are expected to assert their will over those who are weaker. Rulers are to dominate their subjects, patrons their clients, masters their slaves, men their women. This ideology was not merely a cynical grab for power or a conscious mode of oppression. It was the commonsense, millennia-old view that virtually everyone accepted and shared, including the weak and marginalized.

This ideology affected both social relations and government policy. It made slavery a virtually unquestioned institution promoting the good of society; it made the male head of the household a sovereign despot over all those under him; it made wars of conquest, and the slaughter they entailed, natural and sensible for the well-being of the valued part of the human race (that is, those invested with power).

With such an ideology one would not expect to find governmental welfare programs to assist weaker members of society: the poor, homeless, hungry, or oppressed. One would not expect to find hospitals to assist the sick, injured, or dying. One would not expect to find private institutions of charity designed to help those in need.

There’s a telling scene late in The Iliad which says volumes about the ancient system of ethics, and how different it was from our own. Achilles is about to inflict the killing blow on a young Trojan warrior, who begs desperately for his life. Achilles’s reply follows:

“Come, friend, you too must die. Why moan about it so?
Even Patroclus died, a far, far better man than you.
And look, you see how handsome and powerful I am?
The son of a great man, the mother who gave me life
a deathless goddess. But even for me, I tell you,
death and the strong force of fate are waiting.
There will come a dawn or sunset or high noon
when a man will take my life in battle too —
flinging a spear perhaps
or whipping a deadly arrow off his bow.”

Life and death in Homer are matters of fate, not of morality. Mercy is neither given nor expected by his heroes.

Of course, the ancients had gods — plenty of them, in fact. A belief in a spiritual realm of the supernatural is far, far older than human civilization, dating back to the primitive animism of the earliest hunter-gatherers. By the time Homer first chanted the passage above, the pantheon of Greek gods familiar to every schoolchild today had been around for many centuries. Yet these gods, unsurprisingly, reflected the culture of dominance, so unspeakably brutal to our sensibilities, that we see in The Iliad, a poem explicitly chanted in homage to them.

The way these gods were worshiped was different enough from what we think of as religion today to raise the question of whether the word even applies to ancient sensibilities. Many ancient cultures seem to have had no concept or expectation of an afterlife (thus rather putting the lie to one argument frequently trotted out by atheists, that the entirety of the God Impulse can be explained by the very natural human dread of death). The ancient Romans carved the phrase “non fui; fui; non sum; non curo” on gravestones, which translates to “I was not; I was; I am not; I care not.” It’s a long way from “at rest with God.”

Another, even more important difference was the non-exclusivity of the ancient gods. Ancient “religion” was not so much a creed or even a collection of creeds as it was a buffet of gods from which one could mix and match as one would. When one civilization encountered another, it was common for each to assimilate the gods of the other, creating a sort of divine mash-up. Sumerian gods blended with the Babylonian, who blended with the Greek, who were given Latin names and assimilated by the Romans… there was truly a god for every taste and for every need. If you were unlucky in love, you might want to curry favor with Aphrodite; if the crops needed rain, perhaps you should sacrifice to Demeter; etc., etc. The notion of converting to a religion, much less that of being “born again” or the like, would have been greeted by the ancients with complete befuddlement.

And then into this milieu came Jesus Christ, only to be promptly, as Douglas Adams once put it, “nailed to a tree for saying how great it would be to be nice to people for a change.” It’s very difficult to adequately convey just how revolutionary Christianity, a religion based on love and compassion rather than dominance, really was. I defer one more time to Bart D. Ehrman:

Leaders of the Christian church preached and urged an ethic of love and service. One person was not more important than another. All were on the same footing before God: the master was no more significant than the slave, the husband than the wife, the powerful than the weak, or the robust than the diseased.

The very idea that society should serve the poor, the sick, and the marginalized became a distinctively Christian concern. Without the conquest of Christianity, we may well never have had institutionalized welfare for the poor or organized healthcare for the sick.  Billions of people may never have embraced the idea that society should serve the marginalized or be concerned with the well-being of the needy, values that most of us in the West have simply assumed are “human” values.

Christianity carried within it as well a notion of freedom of choice that would be critical to the development of liberal democracy. Unlike the other belief systems of the ancient world, which painted people as hapless playthings of their gods, Christianity demanded that you choose whether to follow Christ’s teachings and thus be saved; you held the fate of your own soul in your own hands. If ordinary people have agency over their souls, why not agency over their governments?

But that was the distant future. For the people of the ancient world, Christianity’s tenet that they — regardless of who they were — were worthy of receiving the same love and compassion they were urged to bestow upon others had an immense, obvious appeal. Hegel, a philosopher of ambiguous personal spiritual beliefs who saw religions as intellectual memes arising out of the practical needs of the people who created them, would later describe Christianity as the perfect slave religion, providing the slaves who made up the bulk of its adherents during the early years with the tool of their own eventual liberation.

And so, over the course of almost 300 years, Christianity gradually bubbled up from the most wretched and scorned members of society to finally reach the Roman Emperor Constantine, the most powerful man in the world, in his luxurious palace. The raw numbers accompanying its growth are themselves amazing. At the time of Jesus Christ’s death, the entirety of the Christian religion consisted of his 20 or so immediate disciples. By the time Constantine converted in AD 312, there were about 3 million Christians in the world, despite persecution by the same monarch’s predecessors. In the wake of Constantine’s official sanction, Christianity grew to as many as 35 million disciples by AD 400. And today roughly one-third of the world’s population — almost 2.5 billion people — call themselves Christians of one kind or another. For meme theorists, Christianity provides perhaps the ultimate example of an idea that was so immensely appealing on its own merits that it became literally unstoppable. And for political historians, its takeover of the Roman Empire provides perhaps the first example in history of a class revolution, a demonstration of the power of the masses to shake the palaces of the elites.

Which is not to say that everything about the Christian epoch would prove better than what had come before it. “There is no need of force and injury because religion cannot be forced,” wrote the Christian scholar Lactantius hopefully around AD 300. “It is a matter that must be managed by words rather than blows, so that it may be voluntary.” Plenty would conspicuously fail to take his words to heart, beginning just 35 years later with the appropriately named Firmicus, an advisor with the newly Christianized government of Rome, who told his liege that “your severity should be visited in every way on the crime of idolatry.” The annals of the history that followed are bursting at the seams with petty tyrants, from Medieval warlords using the cross on their shields to justify their blood lust to modern-day politicians of the Moral Majority railing against “those sorts of people,” who have adopted the iconography of Christianity whilst missing the real point entirely. This aspect of Christianity’s history cannot and should not be ignored.

That said, I don’t want to belabor too much more today Christianity’s long history as a force for both good and ill. I’ll just note that the Protestant Reformation of the sixteenth and seventeenth centuries, which led to the bloodiest war in human history prior to World War I, also brought with it a new, works-focused — as opposed to faith-focused — vitality to the religion, doing much to spark that extraordinary acceleration in the narrative of progress which began in the eighteenth century. I remember discussing the narrative of progress with a conservative Catholic acquaintance of mine who’s skeptical of the whole notion’s spiritual utility. “That’s a very Protestant idea,” he said about it, a little dismissively. “Yeah,” I had to agree after some thought. “I guess you’re right.” Protestantism is still linked in the popular imagination with practical progress; the phrase “Protestant work ethic” still crops up again and again, and studies continue to show that large-scale conversions to Protestantism are usually accompanied — for whatever reason — by increases in a society’s productivity and a decline in criminality.

One could even argue that it was really the combination of the ethos of love, compassion, equality, and personal agency that had been lurking within Christianity from the beginning with this new Protestant spirit of practical, worldly achievement in the old ethos’s service that led to the Declaration of Independence and the United States of America, that “shining city on a hill” inspiring the rest of the world. (The parallels between this worldly symbol of hope and the Christian Heaven are, I trust, so obvious as to not be worth going into here.) It took the world almost 2000 years to make the retrospectively obvious leap from the idea that all people are equal before God to the notion that all people are equal, period. Indeed, in many ways we still haven’t quite gotten there, even in our most “civilized” countries. Nevertheless, it’s hard to imagine the second leap being made absent the first; the seeds of the Declaration of Independence were planted in the New Testament of the Christian Bible.

Of course, counterfactuals will always have their appeal. If, as many a secular humanist has argued over the years, equality and mutual respect really are just a rationally better way to order a society, it’s certainly possible we would have gotten as far as we have today by some other route — possibly even have gotten farther, if we had been spared some of the less useful baggage which comes attached to Christianity. In the end, however, we have only one version of history which we can truly judge: the one that actually took place. So, credit where it’s due.

Said credit hasn’t always been forthcoming. In light of the less inspiring aspects of Christianity’s history, there’s a marked tendency in some circles to condemn its faults without acknowledging its historical virtues, often accompanied by a romanticizing of the pre-Christian era. By the time Constantine converted to Christianity in AD 312, thus transforming it at a stroke from an upstart populist movement to the status it still holds today as the dominant religion of the Western world, the Roman Empire was getting long in the tooth indeed, and the thousand years of regress and stagnation that would come to be called the Dark Ages were looming. Given the timing, it’s all too easy for historians of certain schools to blame Christianity for what followed. Meanwhile many a libertine professor of art or literature has talked of the ancients’ comfort with matters of the body and sexuality, contrasting it favorably with the longstanding Christian discomfort with same.

But our foremost eulogizer of the ancient ways remains that foremost critic of the narrative of progress in general, Friedrich Nietzsche. His homage to the superiority of might makes right over Christian compassion carries with it an unpleasant whiff of the Nazi ideology that would burst into prominence thirty years after his death:

The sick are the greatest danger for the well. The weaker, not the stronger, are the strong’s undoing. It is not fear of our fellow man which we should wish to see diminished; for fear rouses those who are strong to become terrible in turn themselves, and preserves the hard-earned and successful type of humanity. What is to be dreaded by us more than any other doom is not fear but rather the great disgust, not fear but rather the great pity — disgust and pity for our human fellows.

The morbid are our greatest peril — not the “bad” men, not the predatory beings. Those born wrong, the miscarried, the broken — they it is, the weakest who are undermining the vitality of the race, poisoning our trust in life, and putting humanity in question. Every look of them is a sigh — “Would I were something other! I am sick and tired of what I am.” In this swamp soil of self-contempt, every poisonous weed flourishes, and all so small, so secret, so dishonest, and so sweetly rotten. Here swarm the worms of sensitiveness and resentment, here the air smells odious with secrecy, with what is not to be acknowledged; here is woven endlessly the net of the meanest conspiracies, the conspiracy of those who suffer against those who succeed and are victorious; here the very aspect of the victorious is hated — as if health, success, strength, pride, and the sense of power were in themselves things vicious, for which one ought eventually to make bitter expiation. Oh, how these people would themselves like to inflict expiation, how they thirst to be the hangmen! And all the while their duplicity never confesses their hatred to be hatred.

To be sure, there were good things about the ancient ways. When spiritual beliefs are a buffet, there’s little point in fighting holy wars; while the ancients fought frequently and violently over many things large and small, they generally didn’t go to war over their gods. Even governmental suppression of religious faith, which forms such an important part of the early legends of Christianity, was apparently suffered by few other groups of believers.2 Still, it’s hard to believe that very many of our post-Christ romanticizers of the ancient ways would really choose to go back there if push came to shove — least of all among them Nietzsche, a sickly, physically weak man who suffered several major mental breakdowns over the course of his life. He wouldn’t have lasted a day among his beloved Bronze Age Greeks; ironically, it was only the fruits of the progress he so decried that allowed him to fulfill his own form of thymos.

At any rate, I hope I’ve made a reasonable case for Christianity as a set of ideas that have done the world much good, perhaps even enough to outweigh the atrocities committed in the religion’s name. At this juncture, I do want to emphasize again that one’s opinion of Christian values need not have any connection with one’s belief in the veracity of the Christian God. For my part, I try my deeply imperfect best to live by those core values of love, compassion, and equality, but I have absolutely no sense of an anthropomorphic God looking down from somewhere above, much less a desire to pray to Him.

It even strikes me as reasonable to argue that the God part of Christianity has outlived His essentialness; one might say that the political philosophy of secular humanism is little more than Christianity where faith in God is replaced with faith in human rationality. Certainly the world today is more secular than it’s ever been, even as it’s also more peaceful and prosperous than it’s ever been. A substantial portion of those 2.5 billion nominal Christians give lip service but little else to the religion; I think about the people all across Europe who still let a small part of their taxes go to their country’s official church out of some vague sense of patriotic obligation, despite never actually darkening any physical church’s doors.

Our modern world’s peace and prosperity would seem to be a powerful argument for secularism. Yet a question is still frequently raised: does a society lose something important when it loses the God part of Christianity — or for that matter the God part of any other religion — even if it retains most of the core values? Some, such as our atheist friend Richard Dawkins, treat the very notion of religiosity as social capital with contempt, another version of the same old bread-and-circuses coddling of the masses, keeping them peaceful and malleable by telling them that another, better life lies in wait after they die, thus causing them to forgo opportunities for bettering their lots in this life. But, as happens with disconcerting regularity, Dawkins’s argument here is an oversimplification. As we’ve seen already, a belief in an afterlife isn’t a necessary component of spiritual belief (although, as the example of Christianity proves, it certainly can’t hurt a religion’s popularity). It’s more interesting to address the question not through the micro lens of what is good for an individual or even a collection of individuals in society, but rather through the macro lens of what is good for society as an entity unto itself.

And it turns out that there are plenty of people, many of them not believers themselves, who express concern over what else a country loses as it loses its religion. The most immediately obvious of the problematic outcomes is a declining birth rate. The well-known pension crisis in Europe, caused by the failure of populations there to replace themselves, correlates with the fact that Europe is by far the most secular place in the world. More abstractly but perhaps even more importantly, the decline in organized religion in Europe and in North America has contributed strongly to a loss of communal commons. There was a time not that long ago when the local church was the center of a community’s social life, not just a place of worship but one of marriages, funerals, pot lucks, swap meets, dances, celebrations, and fairs, a place where people from all walks of life came together to flirt, to socialize, to hash out policy, to deal with crises, and to help those less fortunate. Our communities have grown more diffuse with the decline of religion, on both a local and a national scale.

Concern about the loss of religion as a binding social force, balanced against a competing and equally valid concern for the plight of those who choose not to participate in the majority religion, has provoked much commentary in recent decades. We live more and more isolated lives, goes the argument, cut off from our peers, existing in a bubble of multimedia fantasy and empty consumerism, working only to satisfy ourselves. Already in 1995, before the full effect of the World Wide Web and other new communications technologies had been felt, the political scientist Robert D. Putnam created a stir in the United States with his article “Bowling Alone: America’s Declining Social Capital,” which postulated that civic participation of the sort that had often been facilitated by churches was on a worrisome decline. For many critics of progress, the alleged isolating effect of technology has only made the decline more worrisome in more recent years. In Denmark, the country where I live now — and a country which is among the most secular even in secular Europe — newly arrived immigrants have sometimes commented to me about the isolating effect of even the comprehensive government-administered secular safety net: how elderly people who would once have been taken care of by their families get shunted off to publicly-funded nursing homes instead, how children can cut ties with their families as soon as they reach university age thanks to a generous program of student stipends.

The state of Christianity in many countries today, as more of a default or vestigial religion than a vital driving faith, is often contrasted unfavorably with that of Islam, its monotheistic younger brother which still trails it somewhat in absolute numbers of believers but seems to attract far more passion and devotion from those it has. Mixed with reluctant admiration of Islam’s vitality is fear of the intolerance it supposedly breeds and the acts of terrorism carried out in its name. I’ve had nothing to say about Islam thus far — in my defense, neither does the game of Civilization — and the end of this long article isn’t the best place to start analyzing it. I will note, however, that the history of Islam, like that of Christianity, has encompassed both inspirational achievements and horrible atrocities. Rather than it being Islam itself that is incompatible with liberal democracy, there seems to be something about conditions in the notorious cauldron of conflict that is the Middle East — perhaps the distortions produced by immense wealth sitting there just underground in the form of oil and the persistent Western meddling that oil has attracted — which has repeatedly stunted those countries’ political and economic development. Majority Muslim nations in other parts of the world, such as Indonesia and Senegal, do manage to exist as reasonably free and stable democracies. Ultimately, the wave of radical Islamic terrorism that has provoked such worldwide panic since September 11, 2001, may have at least as much to do with disenfranchisement and hopelessness as it does with religion. If and when the lives of the young Muslim men who are currently most likely to become terrorists improve, their zeal to be religious martyrs will likely fade — as quite likely will, for better or for worse, the zeal of many of them for their religion in general. After all, we’ve already seen this movie play out with Christianity in the starring role.

As for Christianity, the jury is still out on the effects of its decline in a world which has to a large extent embraced its values but may not feel as much of a need for its God and for its trappings of worship. One highly optimistic techno-progressivist view — one to which I’m admittedly very sympathetic — holds that the ties that bind us together haven’t really been weakened so very much at all, that the tyranny of geography over our circles of empathy is merely being replaced, thanks to new technologies of communication and travel, by true communities of interest, where physical location need be of only limited relevance. Even the demographic crisis provoked by declining birth rates might be solved by future technologies which produce more wealth for everyone with less manpower. And the fact remains that, taken in the abstract, fewer people on this fragile planet of ours is really a very good thing. We shall see.

I realize I’ve had little to say directly about the game of Civilization in this article, but I’m not quite willing to apologize for that. As I stated at the outset, the game’s handling of religion isn’t terribly deep; there just isn’t a lot of “there” there when it comes to religion and Civilization. Yet religion has been so profoundly important to the development of real-world civilization that this series of articles would have felt incomplete if I didn’t try to remedy the game’s lack by addressing the topic in some depth. And in another way, of course, the game of Civilization would never have existed without the religion of Christianity in particular, simply because so much of the animating force of the narrative of progress, which in turn is the animating force of Civilization, is rooted in Christian values. In that sense, then, this article has been all about the game of Civilization — as it has been all about the values underpinning so much of the global order we live in today.

(Sources: the books Civilization, or Rome on 640K A Day by Johnny L. Wilson and Alan Emrich, The Story of Civilization Volume I: Our Oriental Heritage by Will Durant, The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker, The End of History and the Last Man by Francis Fukuyama, The Iliad by Homer, Lectures on the Philosophy of History by Georg Wilhelm Friedrich Hegel, The Varieties of Religious Experience by William James, The Genealogy of Morals by Friedrich Nietzsche, The Communist Manifesto by Karl Marx and Friedrich Engels, The Human Use of Human Beings by Norbert Wiener, The Hitchhiker’s Guide to the Galaxy by Douglas Adams, The Triumph of Christianity: How a Forbidden Religion Swept the World by Bart D. Ehrman, The Past is a Foreign Country by David Lowenthal, Bowling Alone: America’s Declining Social Capital by Robert D. Putnam, A Christmas Carol by Charles Dickens, and The God Delusion by Richard Dawkins.)


  1. There were just three exceptions to the rule of non-exclusivity, all of them also rare pre-Christian examples of monotheism. The Egyptian pharaoh Akhenaten decreed around 1350 BC that his kingdom’s traditional pantheon of gods be replaced with the single sun god Aten. But his new religion was accepted only resentfully, with the old gods continuing to be worshiped in secret, and was gradually done away with after his death. Then there was Zoroastrianism, a religion with some eyebrow-raising similarities to the later religion of Christianity which sprang up in Iran in the sixth century BC. It still has active adherents today. And then of course there were the Jews, whose single God would brook no rivals in His people’s hearts and minds. But the heyday of an independent kingdom of Judah was brief indeed, and in the centuries that followed the Jews were regarded as a minor band of oddball outcasts, a football to be kicked back and forth by their more powerful neighbors. 

  2. The most well-documented incidence of same occurred in 186 BC, and targeted worshipers of Bacchus, the famously rowdy god of wine. These drunkards got in such a habit of going on raping-and-pillaging sprees through the countryside that the Roman Senate, down to its last nerve with the bro-dudes of the classical world, rounded up 7000 of them for execution and pulled down their temples all over Roman territory. 

Renga in Blue

Quarterstaff: The Infamous Puzzle

by Jason Dyer at April 20, 2018 04:40 PM

In a curious way, even though I just started, I’ve been playing Quarterstaff for four years.

It’s long been one of the two Infocom games I’ve never tried (Shogun is the other one) and at one point when I was organizing my files I wanted to make a directory so I could play Quarterstaff when the time was right. I set up a Macintosh emulator (a bit of a ritual in itself) and gathered the documentation files I knew I would need. According to my file dates, this happened in 2014.

I had heard that in particular there was a puzzle reliant on the documentation that was quite nasty to solve.

The most significant “real” puzzle is that of deciphering a set of magic words using a parchment and wooden coin included in the game package. (Apparently quite a few players were stumped by this — Infocom actually gave away the entire solution in the very last issue of “The Status Line,” which is included in manual download below).
Home of the Underdogs

The documentation included the “parchment” on the top of this post, as well as a wooden coin.

Knowing about the puzzle’s reputation, intermittently I would take a glance at the image files in my directory, idly trying to solve the puzzle. Was there an acrostic or something of that sort in the poem? What did the difference between the coin and the parchment pictures mean? Do the animals to the side have a meaning?

Fast-forwarding to now:

This is the way to the second level, but this is also the location of the identify wand, which seems to be critical to the game, because examining it says “The glowing identify wand is in the gouged hole. A wand that looks to be used for copy protection. You had better read the documentation to figure out how to use it.”

The manual states the format for wand use is [MAGIC WORD] [TARGET OF MAGIC]. The mystery seemed to be what magic words could be used, and thus the puzzle boiled down to finding “magic verbs” the game would recognize. The four mini-poems at the bottom seemed to be applicable.

To glean the secret of a Wand,
Spy the rising sun, and pace
Southward six.

Here I was stumped, likely as stumped as the poor Status Line readers, until I had a lateral insight. Let’s clip an image from the game as a bit of spoiler space …

.
.
.
.
(don’t go on unless you want the puzzle completely spoiled)
.
.
.
.

… before mentioning I remembered that the coin was a physical object, and while it was not certain from the pictures, it appeared to fit inside the compass circle on top of the parchment itself.

Additionally, I noticed there was an arrow on the coin; I originally assumed it pointed to north, but then realized that, being a physical object, the coin itself could be rotated to match whatever the poem wanted. That is, if we “spy the rising sun” (start pointing east) the arrow on the coin can be rotated to face east. Then from the eastmost point we can read off six letters rotating clockwise (“pacing southward”).

This gets ODEEPS which is indeed recognized by the game!
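For the curious, the mechanics of the decoding can be sketched in a few lines of code. Note that the actual letter ring printed on the parchment isn't reproduced here; the ring below is invented purely so that the east-facing, six-letter clockwise read produces ODEEPS:

```python
# Hypothetical sketch of the parchment/coin decoding in Quarterstaff.
# The ring of letters is read clockwise starting from north; the real
# parchment's letters differ, but the mechanism is the same.

RING = list("XXXODEEPSXXX")  # 12 invented positions, clockwise from north

def read_word(ring, start_dir, count):
    """Read `count` letters clockwise, starting from a compass direction."""
    offsets = {"north": 0, "east": len(ring) // 4,
               "south": len(ring) // 2, "west": 3 * len(ring) // 4}
    start = offsets[start_dir]
    return "".join(ring[(start + i) % len(ring)] for i in range(count))

# "Spy the rising sun, and pace southward six": start east, read six clockwise.
print(read_word(RING, "east", 6))  # ODEEPS
```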

>Odeeps identify wand.
The identify wand glows faintly and suddenly Titus clearly understands exactly what it can be used for: Using this wand will allow the wielder to identify scrolls, wands, potions, and keys. The key words necessary for using the wand can be deciphered from the scroll and coin included in the game packaging.

I know I promised I would get to combat this time, but I’m going to wait a little longer while I explore Level 2; the game makes some very extended claims about artificial intelligence and I’m trying to verify whether any of them hold up.

April 19, 2018

Renga in Blue

Quarterstaff: Great in Concept, Painful in Execution

by Jason Dyer at April 19, 2018 11:40 PM

The back of the Infocom box, via an Etsy auction.

It’s been a while! (You might want to reread my first post about Quarterstaff and then come back here. TLDR version: Quarterstaff is a Macintosh-only hybrid text adventure RPG with multiple characters.) While I’ve been busy with other projects, to be fair, Quarterstaff itself is trying really hard to be unplayable.

1. The multiple characters sound good in principle but are painful in practice. Members of a group can act separately, so you get a series of prompts like:

L Titus? DRINK POTION
F Bruno? Z
F Eolene? EXAMINE BELT

so while one character is trying to do something finicky like adjust their inventory, you have to control the other characters at the same time. (“L” stands for leader and “F” stands for follower. You can change who is the leader and also separate groups.)

This gets really bad with something like DROP ALL or TAKE ALL because each item is considered a separate action, so if someone is dropping three items, your other party members are prompted multiple times for actions in between each item getting dropped. It’s as ridiculous as it sounds:

Fortunately (although I only found this out about 2 hours in) it’s possible to turn off this feature by deselecting a character name from one of the menus (it just has the “clover” symbol, no name). Multiple character control is still needed for things like combat, though.

2. There are lots of circumstances (at least early on) where a character is too heavily weighted down to enter a particular area. This not only requires the aforementioned inventory shuffle, but if someone who gets stuck is a follower, whoops! — your regular party goes ahead and your follower stays behind in the dark.

3. The interface uses multiple windows for player control and messages, map, and graphics. This doesn’t sound bad at first, but if a character gets separated from their group it pops up a new window, and the graphics are wildly inconsistent in size so that particular window grows or shrinks on every turn.

Note I’ve left the top left free because the picture sometimes takes up the entire area I have allocated. If I accidentally click in that blank space with no picture I get sent to the desktop.

4. The parser is on shaky ground at times.

Once I tried to >OPEN CLOSET and the game just picked it up instead.

5. Party death results in this ignominious screen (and the famous “Macintosh beep”) and then a summary exit to desktop.

6. While this is not the game’s fault, I’ve had my emulator crash on me multiple times. I’m going to switch software and see if that helps. Fingers crossed!

I’ll try to get into combat next time; I haven’t seen enough of it to really write about it properly.

Undo Restart Restore

Texture development time

by Juhana at April 19, 2018 07:41 PM

Some time ago Jim Munroe and I released an authoring system called Texture. The system was under development for several years as we spent a couple of hours on it here and there whenever we had the time.

I use a time tracking tool called Toggl to keep track of how long I've been doing various things, so I have relatively accurate statistics for how much time I've really spent working on Texture. The system has three major parts: the Writer that authors can use to create Texture games, the Reader which is the "engine" that plays those games, and the Public Library which is the repository of public works made with the system.

In total I've spent about 326 hours on the project. The summary is below:

Part                 Time (hours)   % of total
Writer               141            43%
Public Library       73             22%
Prototyping          50             15%
Reader               42             13%
Project management   20             6%
Total                326
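The rounded percentages follow directly from the hour figures; a quick sketch (the labels simply mirror the figures in the post):

```python
# Reproduce the time-tracking percentages from the per-part hour totals.
hours = {
    "Writer": 141,
    "Public Library": 73,
    "Prototyping": 50,
    "Reader": 42,
    "Project management": 20,
}
total = sum(hours.values())  # 326 hours overall
for part, h in hours.items():
    print(f"{part}: {h} h ({h / total:.0%})")
```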

Pie chart of time spent making different parts of Texture

Some notes about those figures:

  • Prototyping is time spent on planning and making mockups and initial throwaway versions before deciding on the final design. We did a lot of trial-and-error iterations and test versions before starting to work on what became the final product.
  • Project management is mainly emails and Skypeing with Jim to discuss and plan the project.
  • Because of reusable code there's some overlap in many areas. Working on something that was needed for multiple areas is marked down somewhat arbitrarily in the statistics.
  • I didn't start tracking time right at the beginning, so the prototyping figure is the most uncertain one. A total of 300-350 hours for the entire project should be about right.

Additionally, while I did all the programming work, these figures don't include Jim's contributions that total to about 250 hours. He did most of the system design and spent a lot of time on the community outreach, managing beta testers and writing demo games.

For me the main takeaway from these figures is that it takes a relatively short time to make a new system, but a lot of time to make a user-friendly tool for authoring games in that system. For example, two of my custom-engine games (Ex Nihilo and Ex Materia) were each practically one-weekend projects built from scratch, but building an editor for others to make similar games would take at least a month or two of full-time work.

Finally, here are some statistics about the lines of code in each part of the system, excluding third-party libraries. "Other" includes mostly helper scripts and server setup.

                 JavaScript   HTML   CSS    Other   Total
Writer           4051         1298   536    36      5921
Public Library   1924         1268   208    47      3447
Reader           964          135    499    0       1598
Other            385          0      0      36      421
Total            7324         2701   1243   119     11387

Bar chart of the amount of code written for Texture

PyramidIF

Spring Thing 2018 - REALLY, IF / REALLY, ALWAYS

by Hanon Ondricek ([email protected]) at April 19, 2018 07:50 AM




 Really, If / Really, Always, Dawn Sueoka (The Orange Juice Public Library) - A conversation with a simple AI based on Eliza, one of the earliest chatbots, which could hold somewhat natural-seeming conversations by reframing the querent's input as another question in almost psychotherapeutic fashion. Since this is Twine, the player chooses from a list of responses which sometimes feel random. It seems procedurally generated, but there are only so many ways I can answer...I'm navigating a syntactical maze - am I interviewing it or...

Am I still playing this?

It's not long. It doesn't force itself on you, so don't be put off by the warnings if horror is not your thing - it's not that, exactly - but Really, If / Really, Always got into my head in a way that most games don't. I know it's still back there. It's trying to be quiet, but I can feel it.

April 18, 2018

Emily Short

The 39 Steps (John Buchan remade)

by Emily Short at April 18, 2018 11:42 PM

Screen Shot 2015-10-01 at 11.12.38 AM

The 39 Steps app is an adaptation of the book and movie of the same name, available on Steam. It gets a lot of comments about how it is not a game, which is probably unsurprising given the Steam audience. Some of its defenders argue that it's really meant to be an enhanced book, and therefore the lack of gameplay is to be expected.

I don’t demand recognizable gameplay elements in my interactive stories, but I do want some consistency in how the interface works and how it’s engaging the audience.

39 Steps uses interaction and gestures for pacing: click to move the story onward and read more text. Rotate the mouse to move the text forward or backwards. (I hated this one. I don’t have a mouse; I’m using a trackpad. I never quite worked out whether I was doing the gesture correctly.)

It also uses interaction to create a sense of place and context. Sometimes the text narrative will pause and put you in an environment with two or three interactive objects you can look at more closely. This is a bit like Gone Home with less walking or looking for pale pixels in dim corners, which, in my view, is a net positive. The main narrative is full of pompous, stalwart-colonial stuff about going to South Africa and establishing diamond mines, or the protagonist’s friend deciding to try his luck in the Congo. This is true to its original period but hard now to read without at least an undercurrent of distress. So when in the protagonist’s club we find objects such as this:

Screen Shot 2015-10-01 at 11.09.14 AM

…they serve to ground the story more concretely in its particular time, and to suggest that the app doesn’t uncritically approve of all this empire-building. (Unless, that is, you’re the sort of person who can look at that map and think “Rah, the good old days!”)

All the same, though, it felt like an adaptation without a strong understanding — as though someone had looked at the original story and asked where they could stick in some pictures and clickable bits, rather than reimagining it from the ground up as an interactive story.

This piece was recommended to me by someone who finds most traditional interactive fiction disappointing, because they’re looking for more audio-visual richness.

(Confession: I found this piece sufficiently irritating to interact with that I did not complete the whole thing.)

 

Lautz of IF

Trizbort v1.7

by jasonlautzenheiser at April 18, 2018 12:41 AM

New version of Trizbort released into the wild today.  V1.7 has just a few small items in it.

Open maps from Web

You can now open Trizbort files from the web.  Ctrl-Shift-O or File -> Open Map from Web will bring up a small dialog where you can enter the URL of the map.  Trizbort will download it and open it up.   The file can then be saved locally, or any other function can be performed on it.   The URL will also be included in the recent files list so you can open it again later.   Be aware that if you close Trizbort while a map opened from a URL is still loaded, it will be re-opened the next time you start Trizbort (if the option to load the last opened map is enabled).

Small Object Updates

Another small change is to the objects section of the room properties dialog.  You can now indicate that an object is worn rather than carried, by putting a [w] at the end of the object text.  In I7 this distinction matters; other languages simply treat the object as carried.

Also in I7, an object can be a “part of” another object.  This is indicated by indenting the child object under the parent and adding [h] to the end; without the [h], an indented object is treated as being contained by its parent.  Again, this is a distinction that matters in I7.

If other languages handle these the way I7 handles them, please let me know, as I’m not versed in the other languages beyond the basics.
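As a rough Inform 7 sketch of the distinctions these markers map onto (the room and object names here are invented purely for illustration, not taken from any Trizbort map):

```inform7
The Bedroom is a room.

The jacket is worn by the player.    [a Trizbort object written "jacket [w]"]
The wallet is carried by the player. [a plain "wallet" entry - carried is the default]

The wardrobe is a container in the Bedroom.
The key is in the wardrobe.          [indented under "wardrobe" - contained by it]
The mirror is part of the wardrobe.  [indented and marked with [h] - part of, not inside]
```

In I7 these are three different relations (wearing, containment, incorporation), which is why the exported source needs the extra markers to tell them apart.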

Other than that, there are just some minor bug fixes and code refactoring.

New Update process

If you are running 1.6.1, you have a new option to do an update.  Under the Help menu there is an option for Check for Updates.  This will detect that v1.7 is available and will prompt you to download and auto-install the new version.

Of course, if you prefer or are running an older version, you can still head over to GitHub and grab the latest release zip file and unzip as normal.

As always, any problems, questions or comments, please let me know here or over on GitHub.  Enjoy!

>TILT AT WINDMILLS

The Choice in "Mama Possum" (Moments Lost)

by Aaron A. Reed ([email protected]) at April 18, 2018 12:26 AM

“Moments Lost” is a blog series where I deconstruct a single moment from a narrative game, of any vintage, and talk through how and why it works. This post originally appeared on Medium.
The Game: Mama Possum (2017) is a short narrative Twine piece with visuals and sound, created by Kevin Snow, George Kavallines, and Priscilla Snow, from a concept by Cassandra Khaw. It tells the story of two sisters, co-pilots of a giant mecha that gives them a telepathic bond, defending their home in the American South from enormous invading roach monsters.

This article will contain spoilers for the end of the game. It’s available for $5 on Itch.io, runs in a web browser, and takes about 7–10 minutes to play.


Interaction alternates between clicking linked text and touching buttons on the mecha’s dashboard to advance the story. Only one such option is ever available at a time — except for one particular moment.

The Moment: Near the end of the story, the characters suffer a surprise attack from a swarm of roaches; a malfunction gravely wounds the narrator’s sister and is about to finish her off. At this critical moment, and for the first time ever, you have a choice between two clickable options: either the button to activate the mech’s missiles, or the words “my sister”:
Out the windshield, roaches fly on toward the Arkansas border, leavin us for dead. They’re still in range of Mama Possum’s missiles for a time. But an awful mechanical grinding from across the cab raises the hair on my neck — I know damn well what my sister is about to suffer.


Games have taught us what this is: an either-or choice. We can launch the missiles and save Arkansas, or we can save our sister’s life.

Choosing the missiles seems to bear this theory out: 
I pull myself up to the dashboard, all weight off the bad ankle, and lean into the button. Hold down for ten seconds … the floor rumbles as those beautiful steel babies hiss out of Mama Possum’s torso, trails of smoke behind em. No time to watch the fireworks.
Angle of the floor slides me away from my sister. I wear out all my arm muscle, pullin my broke body, till I’m just baby fat and adrenaline.
But if we replay, we see that clicking “my sister” merely skips this first passage:
Angle of the floor slides me away from my sister. I wear out all my arm muscle, pullin my broke body, till I’m just baby fat and adrenaline.
Regardless of your choice, the missile launch and the fate of Arkansas are not mentioned again. The story always concludes with the narrator sacrificing herself to save her sister, and then, bleeding and losing consciousness, remembering “a good time, just in case I die there”: a fleeting image of the home her sister made and what it meant:
I could have driven away, then. Could have drove off with that snapshot and still possess this inklin that my sister had done it. She’d gone and built a household that looked nothin like the one we were raised in. Just threw the whole poisoned blueprint in the trash and make a place love could survive.
Ain’t that its own kind of miracle.
Possum’s one moment of perceived agency — the brief instant the piece becomes multicursal, not just ergodic — has a heightened tension because of its uniqueness, like a single color shot in a black-and-white film. We are meant to feel the moment is important, is special.

But this is not a story about choices. It’s a story where you die for, or perhaps with, a sister, and about that mattering more than whether or not the world gets saved. We might read its one fleeting moment of illusory agency as temptation, as homage to form, or as rejection of a zero-sum binary. Regardless, we are neither rewarded nor punished for our action, because the heart of the story is an action there was never any doubt our character would take.

Additional reading: author Kevin Snow wrote a postmortem of the piece.



April 17, 2018

Retroactive Fiction

Xanadu Adventure (1982) — shop till you drop

by Ant at April 17, 2018 04:41 PM

Xanadu Adventure was a text adventure game written for the BBC Micro by Paul Shave in 1982. Typically for the time, Shave was strongly influenced by Crowther and Woods’s Colossal Cave, and Xanadu therefore featured the requisite stream, grating, and forest-maze, etc. But there was little chance that any jaded adventurers would be bored by these all-too-familiar surroundings because before they could start exploring the game they were forced to go shopping.

The game opens with a list of miscellaneous items including weapons, torches, food, and, um, postcards. And each item is given a price. You’re then told that you have “125 shillings to spend”, and it’s up to you to decide exactly how to spend it — although initially you have no idea which of the items are going to be of any use.

The adventure shop and the player’s limited spending power were just a couple of ways in which Xanadu Adventure distinguished itself from the average ADVENT clone. Another was that it had a two-player mode. This didn’t involve the use of RS-423 cables or multiple BBC Micros as in Graham Nelson’s Escape From Solaris, which I wrote about previously. Xanadu’s two-player mode required players to make a fixed number of moves, alternately, on the same computer. Players could ally and combine their weapon-count to fight monsters, or they could turn around and beat the hell out of each other instead. (The latter was the more popular choice with the playtesters of the game, who happened to be the author’s sons.)

But Paul Shave had more tricks up his sleeve. He decided to introduce randomness into Xanadu and, in the process, created what might well be the first CRPG on the BBC Micro. This, I now realise, is a wonderful thing. But it caused me no end of stress when I first started playing the game. Perhaps a list of some of the key features of Xanadu Adventure will help to explain why:

  • Random object-placement! Many of the objects in the game, including some of the treasures you have to collect, are placed in random locations when you start a new game. One of the most important objects is the spare cash, which in some new games doesn’t seem to be anywhere at all (though it actually is)! The problem of having to slog around the map because of random treasure-placement is compounded by…
  • Random dwarves and dragons! They pop up when you least expect it, and you have to kill them (the dragons, at least) to move forward. And sometimes when you kill a dragon, you experience…
  • Random sword-breakage! After fighting a dragon, your sword may (or may not) get broken, so you have to trek all the way back to the blacksmith to get it re-forged. And all this trekking about, finding treasures and/or repairing swords, is particularly bothersome because of…
  • Limited light! The batteries in your torch eventually run out, so you need to go back to the shop, which is right at the beginning of the game, in order to buy some more — and no, you can’t buy them at the start of a new game because there’s a…
  • Limited inventory! There’s only a certain number of objects you can carry at one time. You can buy a bag to increase the inventory limit, but then you run into the problem of there being…
  • Limited cash! You get 125 shillings at the start of the game, most of which you have to spend straight away on weaponry and light. There is some extra cash you might come across later, but you can never be sure where or when that will be because of…
  • Random object-placement! (REPEAT UNTIL FALSE…)

If for some reason you want a deeper understanding of how Xanadu Adventure can not only amaze and impress but also drive a player to the very brink of madness, then see my walkthrough video, above.

I have to admit that it might be slightly unfair of me to harp on about the relentless, exhausting unpredictability of Xanadu because the randomness did actually lead to some interesting emergent behaviour, which I was able to exploit to make my adventuring a little easier: see the “To Catch A Dragon” section of the video, for example, which surprised even the original author.

See also the entry for Xanadu at CASA (solutionarchive.com), which links to my written walkthrough (which I now know to be flawed — because Xanadu).

There are further details about the game at Stardot.

And you can play Xanadu Adventure online at bbcmicro.co.uk.

Emily Short

Spy EYE (The Marino Family, Spring Thing 2018)

by Emily Short at April 17, 2018 12:40 PM

From Spring Thing 2018, Spy EYE is a continuation of the Mrs. Wobbles series (Mysterious Floor; Parrot the Pirate; Switcheroo). Like the earlier pieces in the series, it’s an Undum work that tells a part-fantasy, part-reality story about children in foster care. (I also highly recommend Lucian Smith’s guest post about Switcheroo.)

In this case, the protagonists are a Latinx brother and sister whose parents are missing, and the story revolves around going to look for them and rescue them. The story lets you play as either Juan (the older brother) or Ichel (the younger sister), and they have different takes on whether to expect their parents back any time soon. That touch reminded me of a few other stories where the choice of viewpoint character is meant to shed some light on a family situation — Stephen Granade’s Common Ground, most notably.

The subject-matter of these stories has progressed over the course of the Mrs. Wobbles series, and this one definitely feels like it’s for slightly older children than the first episode or two.

In Spy EYE, the kids are able to spend more time outside of their house, and we get more of a view of where they live, a much more detailed and vibrant representation of Los Angeles and its environs than we saw in previous games. At one point, they learn how to make their beds fly, and in a somewhat Mary Poppins sequence, visit West Hollywood and Koreatown and other LA neighborhoods — in each case getting a sense of how the social norms of the area might affect family life and personal freedom.

Meanwhile one of the sinister forces of the story is the Santana wind. Anyone who’s ever lived in LA is likely to appreciate how the Santa Anas correlate with impatience, irrationality, and a prickly sense you might at any moment come out of your own skin.

The gist of the story is challenging, though it’s wrapped in some playful fantasy imagery. The parents in this story are not dead, but they’re going through some things that make it impossible to take care of their children right at the moment. Can our protagonists accept that fact, and wait, and at the same time hold onto the faith that their family is still a family? It’s a situation that asks these children for a degree of empathy and long-term perspective-taking well beyond their years: even adults may have trouble realizing that their parents are limited and human, and forgiving them for it.


Previous stories in this series have added some power-up features designed to reward the player for reading as much optional text as possible, but in the earlier versions they weren’t always fleshed out, and in some cases didn’t seem to do much. Here, those systems have been a bit more rigorously implemented, with the ability to collect poems and gather “page points”. These allow free rewinds at certain points in the story, and unlock a few extra attacks in the final battle.

There were a few points where I felt the story would have benefitted from a bit more proofreading, and several others where I felt that I wasn’t really the target audience. I also felt uncomfortable about the way a major antagonist is labelled as “voodoo dude,” as a way to automatically mark him as sinister: this is an area where someone else would be much better equipped than I am to speak to stereotypes and cultural representation, though.

And since I’m talking more than usual about interface this month: Undum has been around for a few years now, and this piece is using its affordances in a pretty standard way. It still works well. The animation of new text and the sense of continuous flow make this a good tool for stories with relatively sparse branching and a fair number of click-to-continue sorts of links. Spot illustrations do well in this kind of interface, and the Marinos make good use of that, with some quality art in a style that’s been consistent across all four episodes.

April 16, 2018

Zarf Updates

Missing moments in games

by Andrew Plotkin ([email protected]) at April 16, 2018 10:05 PM

This weekend I finally played David OReilly's Everything. (I tried a pre-release build a couple of years ago, but I had trouble with the controller and gave up almost immediately. This time I played all the way through, or at least to what I can call an "ending".)
Spoiler warning: I am about to start talking about the ending ("ending") of Everything, and how it is constructed. I'm going to go into detail. So if your kink is surprises, buy Everything and play through it before reading this. It won't take you too long.
Death of the Author warning: I am about to start talking about the intent of games by seeing how they are constructed. Indeed, I will be making assumptions about how the design evolved. That is: I will be reading games as texts. I realize it's perfectly possible to go find the designer and ask what they intended, or how the game evolved -- but that's not the point here. That is not, as Alan Watts says in Everything's adopted narration, the game we're playing.

Everything is a philosophical game, and it ends on a philosophical note. Having explored all up and down the Great Mesh of Being, from carbon atoms to galactic clusters, you have now gotten stuck in a wasteland of junk and despair. This is expressed literally: pianos, shoes, and french fries wander by, bemoaning the hopelessness of their existence. You have gotten used to wandering the universe with absolute freedom, but now you are trapped. The "exit" button doesn't work.
Finally you meet a guide (the game Everything itself), and it tells you what to do. You have to enter the "thoughts" screen and empty it out by hitting the B button. Then you will be free to leave. And you do that, and it works. Freedom! Fireworks!
But it's an anticlimactic way to express this ending, ain't it? The game literally tells you what to do -- the pop-up help box says "press the back button to enter the thoughts screen, and then press B to erase everything." (I'm paraphrasing that, but only a little.) You follow instructions. It might as well start "Would you kindly"!
Thematically, it's a perfect ending. The whole game is underpinned by the aforementioned Alan Watts telling you that you are everything. Everything is you. Your ego and your separateness from the world are just habits of thought. So obviously the key realization is to discard all these thoughts -- all these expressions of individuality that you've collected from shoes and horses and planets throughout the game. Let it all go. Experience the universe with no preconceptions. Right?
To figure this out would be a powerful moment -- a mad idea which makes perfect thematic sense, and you try it, and it works. I think that's as close as a game could come to conveying the Zen experience of satori! But Everything balks. It never leads you to this epiphany. It doesn't even give you space to experience it. It just hands you instructions.
I can only imagine that the original design was for you to be stuck, stay stuck, until you came to this realization on your own. But then in playtesting it didn't work at all. No surprise: the "thoughts" screen is very marginal to the gameplay. I never entered it at all except by pushing the wrong controller button! There are no game mechanics concerned with selecting, discarding, or managing your inventory of thoughts. So to introduce one at the end is impossible -- I mean, you can try, but your players will never catch on.
One can imagine design changes to make this work better. Provide moments of blockage throughout the game which can be "solved" by selecting and discarding a specific thought. Then extend this blockage metaphor to the wasteland, so that the player realizes that all thought is now blockage and must be discarded.
I don't know (allowing the author a moment of resurrection) whether David OReilly tried such alternate designs. Maybe he tried them and they didn't work. Maybe they cluttered up the game UI too much, or bogged down the pacing. Maybe the notion of a difficult puzzle in a philosophy game was repellent, and he decided to commit to a model of a game which can play itself. (At which he succeeded.)
I might compare this ending non-puzzle to the second-to-last puzzle ("puzzle") of Everything, in which you are told to "return to where you started". With no clues at all about how to do that! I found this genuinely difficult, and Googled for a hint, and then it made thematic sense. Sort of.

...But what the heck, let's jump tracks and talk about a completely different game: El Shaddai, a stylish button-masher beat-em-up from 2011. This game, too, left me with the sense of a missing epiphany.
Let's set this up. You are Enoch, travelling the realms of Heaven in anachronistic blue jeans, clobbering fallen angels. Because -- because -- look, the plot makes no sense. I think it's based on a manga. Never mind the plot. You wield angelic swords and things, but as you bash monsters, your weapon gets gunked up with evil red-glowing ichor. You have to back off and hit the "purify" button, which cleans away the corruption with holy fire and restores your weapon. This is the basic combat mechanic.
In addition to walloping nephilim, you are chasing a little girl named Nanna (or Inanna, Ishtar, etc). Then you are frozen in carbonite for ten years and Nanna grows up, because manga. Then she is infected by evil. You chase her some more, only now she's a woman being devoured by red-glowing ichor.
See where this is going? You finally catch up with Nanna/Ishtar as she lies dying. You pick her up and cleanse her soul, purifying the corruption with holy fire...
Except this happens in a cut scene! There is no moment where you push the "purify" button for the win. You just watch it happen.
This situation is not like Everything. The "purify" mechanic is deeply embedded in El Shaddai. You are taught it at the beginning and you spend the whole game practicing it. It's become instinctive by this point. So it would work perfectly well to have the game linger in that moment -- Enoch cradling Nanna's dying body -- and allow the player to catch the clue and purify her. It would work. It would be, in a sense, a quicktime action -- but quicktime done right, with semantically meaningful controls. No need for a time limit, either.
So why didn't the developers do this? I can't imagine that it playtested badly. Even if a player completely misses the point, it's a game controller -- button-mashing will work, albeit without the satisfying epiphany. Maybe the game engine wasn't flexible enough for this sort of non-cut-scene, non-gameplay mode? I don't know. But the whole game and storyline are shaped to come together at this moment at the end. Someone must have intended to do it. Then they... didn't.

When people ask how I design games, I say: I start with the interactions and work backwards to puzzles and story. (Take for granted that I like puzzles.)
Game design is hard, and I mean as an absolute percentage of the work of game-building. You want that perfect moment of gameplay -- the moment when the player sees a possibility appear out of the dense landscape of their learned game experience. Setting that up takes an incredible amount of work! The entirety of the game design, back to the first moments of play, becomes subordinated to this requirement: get the player to where this realization is possible. Give them the tools, make them use the tools, lead them to experiment with the tools.
The story becomes just another avenue for imparting expectations about game mechanics. For both of these games, I fully believe that the designers started with these core moments. Everything: free yourself from your thoughts and expectations. What kind of game leads to this point? A game in which everything is constantly talking to you, handing you mundane thoughts and complaints just like the ones that fill your own head.
Or El Shaddai: if you can purify weapons, you can apply that power to heal a human. What kind of game leads to this point? A game full of weapons, where using and then purifying them is a familiar cycle. But also: a game with a human NPC whom you feel protective towards and want to save if she becomes ill.
(No, it didn't have to be a cute passive anime girl who goes through puberty in the course of the storyline. The designers could have pushed the trope boundaries a little harder there.)
You can see this front-to-back design in my games, I hope. I see it in any successful adventure or puzzle game. Some lean towards systematic mechanics, some towards surprising combinations and outcomes. But this is a continuum, not an opposition.
It's these odd cases that really irk me, though. Games where I can see the setup; they've done the work; they just don't quite carry through. Near misses sting so much worse than simple failures, right? I can't even say "missed opportunities", because there's no way they were overlooked. Someone backed away. Something went wrong, something intervened. I may never know.
(Unless I ask the designers, which I won't do -- not within the confines of this post, anyhow.)

PyramidIF

Spring Thing 2018 - A somewhat chilly first dip into the games.

by Hanon Ondricek ([email protected]) at April 16, 2018 05:29 PM

Spring is here! Warm sun! Flowering trees! Green grass and--I just went out to move my car and there are snow flurries. I know other regions are getting actual piling snow still, so while it might not feel like a Spring Thing, a new crop of games is here! I often get wrapped up creating my own IF bidness and don't play new releases as much as I should, but since we can't really mow and plant yet, let's take shelter in the gazebo where a bumper crop of 20 entries are festivalling. I don't guarantee I will be able to chip the frost off of all of these hopeful, budding shrubs, but here's a first foray into what will hopefully soon yield thriving IF greenery:

Best Gopher Ever, Arthur DiBianca (Parser Inform/Z) - I had to resist adding an exclamation point at the end of this title. "Help the unfortunate residents of Fairview! (Who are all animals, by the way.)" This is billed as a light puzzle game for all ages and delivers on that. DiBianca has a knack for pruning the command set of parser games down to a necessary few, but I found despite this I kept reflexively trying to EXAMINE everything. This is a game of intertwined fetch-quests, and you, the title "go-fer" ostensibly, run errands for an impressive list of three-letter-named animals. After I rolled my eyes, I enjoyed the busy-work, almost IF-Sudoku vibe of this. The STATUS command is helpful as a quest-log, and were it not for an extremely helpful graphical MAP, I might not have seen this through to the end. The downside to this is that after getting into the groove and navigating 80wpm through the map, I stopped paying attention to idle messages and got stuck on the last minor puzzle. I also secretly hoped there might be a hidden meta-game, but didn't run across one in my play.

A Bunch of Keys, Mike Gerwat (with coding by Al Golden) (Parser Inform-Glulx) - This sounds like it should be my jam: "A time travel story of a real-life piano tuner and repairman who just happens to be blind." This piece is semi-autobiographical - the protagonist shares the author's last name, and I only say "semi" due to the time-travel element, unless there's something I'm unaware of. There's a read-me-first text file which gets a bit over-explanatory...the protagonist is blind and partially deaf with cochlear implants - fair enough. Great setup for an IF. Time travel, limited-specialized interaction...I'm all in. Teach me stuff about how you experience the world. Genesis (the band) is name-dropped in the when-play-begins... Good good...the banner pre-warns me I need to read the help menu and the walkthroughs and "not complain" if I can't finish the game otherwise...uh-

EXAMINING a thing is actually touching it, but EXAMINING a person is actually examining since this isn't one of those games... I can make the mental translation that the descriptions are based on touch and other senses besides sight. The description of the shirt the player is wearing is surprisingly visual, but I suppose the PC knows he's wearing his Stones concert tee and has memorized a verbal description of the logo...fair enough. Wow. The text help menu is extensive. I imagine I'll want to read all the background info and resources about visual and hearing impairment afterward as kind of a "DVD extras" for the game...
4. For the snowflakes out there. There is a bit of grow-up material in this game and fifty years ago we didn't have any political correctness...
Okay, sure? There's a lot of this pre-defensive disclamatory talk. The game is created hard on purpose and I'm supposed to learn that. I get it, people are rude and I'm certain this author is heading off people 'splaining his own disability to him. I don't expect to feel "welcomed" in a game which purports to let me experience perception and persona different than I'm used to and I want to play, but...enough of this. I'm probably dipping into the help material way too early.
....There's a diploma in a perspex frame on the wall here. It's hanging on a hook and can be removed easily from the wall. A coffee table and a very old comfortable couch are here as well. The Kitchen is to the west and the Bathroom is to the east. The Bedroom is north of here.

: x frame
Your Piano Tuning/Repair Diploma from 1968 is in the frame.

: take frame
For some reason you can't remove the frame from the wall.

: take diploma
For some reason you can't remove it from the wall.
Huh? Okay, it's hard like this. Not hard, but basically unfair. It's one thing to simulate challenges of experience with the parser, it's another for the game to just lie about the environment. I try going west but the game stops me when my hand encounters a key and an envelope which contains an invitation.
: x envelope
This an envelope that from the college you went to has sent you. You were told by your care person Maz that you would be getting one.

: w
You decide to read the contents of the envelope first.

: read envelope
You can't read the envelope.

: read invitation
The invitation is typewritten and you'll have to call your care person Maz to come and read it to you. You replace it in the envelope.
: w
You decide to read the contents of the envelope first.

: read key
Since you're blind, that's not possible, but you scan it and hear what's on the fob.

: w
You decide to read the contents of the envelope first.

: scan key
Unitl you find your scanner, that's not happening.
And ragequit. When I replayed the game just now to capture the text for this, I went west immediately, and here's what happened:
: w
As you head to your kitchen you hear the frame fall from the wall. You pick it up and and place it on the coffee table for the time being.

: x frame
Your Piano Tuning/Repair Diploma from 1968 is in the frame.

: feel frame
Your Piano Tuning/Repair Diploma from 1968 is in the frame.
Where's the key? Where's the envelope? The diploma was waiting to fall if I didn't examine it? I mean, maybe this might be part of the upcoming time-travel hijinks to be experienced later in the game, but I guess the takeaway is that I wouldn't last five seconds in the author's shoes. I think I finally figured out I'm supposed to use FEEL [OBJECT] as a more reliable EXAMINE. That said, forcing the player to do things in a specific order "You decide to read the contents first." "You can't read that, you're blind..." with this much confusion (at least right at the beginning after being somewhat grouchily advised to RTFM and the extensive supplementary material) is less-than-stellar game design, and I'm gonna need to return to this again later to see if I can make more headway.

Confessions of an NPC, Charles Hans Huang (Twine) - Again, this should be my jam - a meta-exploration of the heretofore unrealized motivations and emotions of usually-peripheral trope characters in an adventure game. Sounds fun! Turns out, Confessions is heady and thought-provoking, and I suppose I shouldn't expect escapist fun from all my games. I kind of feel like a tourist in NYC looking to take in a Broadway show but since that flashy musical everyone's talking about is sold out, I instead end up in a critically-praised but very serious Pinter play which is a cycle of five monologues by characters in a medieval fantasy world (Holograms! That's fun!) but who are actually modern people in these costumes monologuing about very up-to-the-minute problems of modern society. It's good, and very topical, but not at all what I was expecting.

I have mixed feelings. This is a message piece masquerading as satire - which is nothing more than a quick coat of glittery fantasy-trope paint that really doesn't lend any deeper irony to what's going on except that this audience probably has played games. I suppose it's a spoonful of sugar for the "medicine" to come? I appreciate what this is doing. It's an exploration of hot topics - Mother is exploring the psychology of guilt at being a relative of a school shooter; Princess is a prisoner who doesn't want to shrug off the yoke of her pampered existence because she's safe with the evil she knows, and ends up doing bad things in hopes of making a similarly bad situation less bad for others. Moneymaker is in the business of basically pimping out the Princess to make the game fun for oblivious Heroes in kind of a Westworld situation...

These stories are the kinds of revelations that usually earn the right to turn us on our heads by first sucking the player into a "fun" fantasy world a la Doki Doki Literature Club or Undertale, where we've already formed a worldview and have a basis of uninformed choices to build upon - but here it feels like we've skipped the revelatory-turn business (No! This fantasy world is ACTUALLY our own!!!...!): Confessions expects us to know that part already, and the game just handwaves it. To continue the nerdish Broadway metaphor: this is like arriving at Into the Woods at intermission and going "Why is everyone singing slow ballads about loss? Where's the fantasy fairy tale fun I came for? Did I miss something? What's all this deconstruction?" (Don't you hate theatergoers who skip Act One and then ask a bunch of questions?)

The player is frequently pressed to answer whether they'd make the same choice as the character. The answers don't really matter, but at the end of each encounter you're called to make a choice and compelled to type the justification for it into a text entry field, in your own words. (Nrrrgh...essay questions...) The game does a good job of assuring you that your answers aren't being saved and reviewed by a shadowy government bureau, and all the sensitive scenes are politely preceded by trigger warnings (however, you can't reach the self-discovery essay finale unless you complete them all).

I understand the need for this, and it's a great use of interactive fiction. Perhaps because this is such a RIGHT NOW piece deconstructing RIGHT NOW social politics RIGHT NOW, while we're all neck-deep in social change, it comes off more heavy-handed and "preaching to the choir" than it would at another time. I feel the people who would benefit most from this experience aren't people who would normally play IF. Maybe this will become a classic in the future that answers the question "What did 2018 feel like?" I was still holding out hope the entire time that at some point the witch would burst out and get hoisted on a broom while singing loud high notes.
----
BONUS - Recursion, Adrian Belmes - Not part of Spring Thing, but I got a random Twitter notification and played this. Perhaps I'm in a fragile state having attended a funeral this weekend and was in the right frame of mind for sad piano music and maudlin Twinery, but this is done with such care and restraint and evoked such shockingly warm feelings of bittersweetness (over and over) I had to shout it out.
----
(No, that's not me you hear warbling "Defying Gravity" in the corner. Seriously.)

Emily Short

Known Unknowns (Brendan Patrick Hennessy)

by Emily Short at April 16, 2018 02:41 PM

Known Unknowns is a four-part Twine series by the author of Birdland and set in the same universe. The protagonist is Nadia, a Toronto teenager who is trying to deal with her sexuality, fraught relationships with several of her classmates, various annoying teachers, and the real possibility that she has just encountered a ghost raccoon.

Like Birdland, this is Y/A queer romance — but this time the choices are less about self-characterization and more about how you’re going to interact with the side characters. (And, as in Birdland, the core plot remains the same regardless. This is not as far as I can tell a heavily branching story, but the interpretation of individual scenes can vary a good bit.) Known Unknowns is immensely charming and accessible, solidly structured and well paced — and as it’s now available in its complete form, there’s no waiting between episodes.

The whole experience is smooth enough that it’s easy to ride right over the formal experimentation here. Though it’s a Twine game, Known Unknowns clearly belongs to a tradition of interactive fiction that knows about parser IF as well: there are several sequences where you’re navigating a space (in some cases even a space with compass directions) and choosing what to look at and which characters to talk to; and there are scattered jokes that are a clear acknowledgement of parser IF traditions. There are also moments that feel more reminiscent of Life is Strange, especially the elective conversations with scattered classmates. And there are a handful of sequences in which the player’s choices are even more surprising. Any given screen of text in Known Unknowns might represent:

  • Dialogue with one or more other characters, followed by a choice (often not verbatim) of what to say next
    • …including some examples that are not exactly in English
  • A room description, with choices of where to go or whom to interact with
  • The screenplay of a show we’re watching, with options as to which dialogue we want to comment on
  • Elements of a to-do list we’re working our way through, which we can tackle in any order

And notably, the default action here, the thing that happens most of the time when you click on a link, is not examine but discuss. Many, many Twine games divide links between those that will give you an object description and those that will move the story forward. Known Unknowns keeps you in conversation, and even your observation takes the form of conversation rather than interior monologue.

It’s good conversation. Hennessy’s dialogue is funny and assured. The few adults are pretty much all figures of satire — the mocking and dismissive French teacher, the history teacher who is desperately trying to be one of the kids, the occasional parent — but the other students mostly move (or at least can move) from an initial stereotype to something a bit more fleshed out, if you spend enough time to get to know them. Indeed, most people you encounter have a solid motive for what they do, even if it’s not initially clear what that is.


Meanwhile, also because of this rigorous focus on conversation, we often don’t explicitly see or specify what Nadia is thinking. We see what she’s saying to other characters — but that could be untrue, or limited, or represent internal confusion. It’s a very cinematic approach in a lot of ways — embracing the way film and television rely on subtext, rather than novelistically telling you what the protagonist has in mind.

There are quite a few moments where I wish I could make Nadia be more direct with people, and more truthful about what’s going on with her. And there were a few times when the choice structure also funneled me towards one thing I didn’t really want to do, like texting her boyfriend Allen. Curiously, the parser-esque structure where you can move from room to room helps disguise the occasions when there is only one option that will move the story forward — because you’re welcome to keep wandering through the rooms looking for something else to do, but at the end of the day, only that one interaction option will make things progress.

In the case where I wound up texting Allen because I couldn’t find anything else to do, it felt like Nadia was falling back on this as a habit and a default. Contacting him was something to do, when she didn’t have the imagination to picture different ways she could act in the present social circumstances. If this scenario had been replaced with a Twine screen where I only had one option, Text Allen, I would have felt a bit more railroaded; and I also wouldn’t have had the experience of flailing around trying to come up with another option, which was part of Nadia’s journey as a protagonist, not just my journey as a player.

But then, Nadia’s limitations are the point. She doesn’t have the confidence or self-knowledge to navigate high school romances very effectively. She needs to understand herself better before she can be honest with other people. In contrast with, say, Rameses, she is allowed to make progress and change over the course of the story.

This is a sweet and lovely piece, not to mention very funny. I highly recommend it.

April 15, 2018

Emily Short

Mid-April Link Assortment

by Emily Short at April 15, 2018 04:40 AM

April 18 is the next meeting of the Oxford/London IF Meetup, where inkle studios’ Joseph Humfrey will talk to us about making interactive text look good and flow well — and in my view there’s no one better to learn that from.

April 23 is the next gathering of the Dublin Interactive Fiction Meetup.

April 25 is the next meeting of PR-IF, the Boston/Cambridge IF Meetup.

May 5 is the next meeting of the SF Bay IF Meetup.

Also May 5, the Baltimore/Washington DC IF meetup looks at Sherlock Indomitable.

Spring Thing 2018 games are available to play, and judging continues through May 7.

Exact date is still TBD, but May’s Oxford/London IF Meetup will be a workshop on using Tracery for Twitterbots.

Feral Vector is May 31-June 2 this year. This is a joyous, playful indie conference in Yorkshire and has always been delightful when I’ve been able to attend. (I can’t make it this year, alas.)

*

At the risk of Overthinking It, I’ve been loosely theming blog content on different topics each month, and also trying to align content with what the Oxford/London IF Meetup is doing that month. Here’s what you can expect around here for the coming several months:

  • This month we’re looking at user interface and different types of IF experience in that sense.
  • Next month, May, is about procedural text and generated meaning. The London IF Meetup date for this is still TBD, but the subject matter is known: George Buckenham will be leading a workshop on using Tracery and building Twitter bots.
  • June is all about parser IF and expressive input. Graham Nelson will be talking to the London IF Meetup about what he’s been doing with Inform lately.
  • July, quality-based narrative approaches. Leigh Alexander will talk to London IF Meetup about narrative design for Reigns: Her Majesty. (There may be spreadsheets. I get very excited about spreadsheet talks.)

*

A number of the talks from GDC 2018 are now available on the GDC Vault, some of which are free. Some interesting items from there:

AI Wish List: What Do Designers Want out of AI?

Exploring Helplessness in Games with ‘Bury Me, My Love’ — I’ve mentioned this game, which tells a story about Syrian refugees, a few times previously.

Game Design Patterns for Building Friendships — not so much an IF topic, but an interesting systems design question.

Procedurally Generating History in Caves of Qud — cool stuff if you are into procedural text, procedural backstory, and simulation-heavy games.

Queens of the Phone Age: Narrative Design of Reigns: Her Majesty. As mentioned above, we will be hearing more from Leigh about writing this game at a forthcoming London IF Meetup.

Writing Modular Characters for System-Driven Games — Tanya Short talks about how you structure and write for procedural characters.

This doesn’t cover nearly everything interesting that happened at GDC, but some of the other talks remain paywalled for the time being.

*

You can find out more about inkle’s forthcoming Heaven’s Vault at The Verge, with trailer videos and a discussion of the constructed language used in gameplay.

Jason Grinblat shares some amazing procedural map generation examples.

My talk from the Malta Game Jam is available here.

April 13, 2018

The Digital Antiquarian

The Game of Everything, Part 5: Civilization and War

by Jimmy Maher at April 13, 2018 03:41 PM

War appears to be as old as mankind, but peace is a modern invention.

— Henry Maine

As soon as they decided to bring rival civilizations into their game of Civilization to compete with the one being guided by the player, Sid Meier and Bruce Shelley knew they would also have to bring in ways of fighting wars. This understanding may be a depressing one on some level, but it squares with the realities of history. As far back as we can observe, humans have been killing one another. Even the possibility of long-term, lasting peace in the world is, as the Henry Maine quote above says, a very recent idea.

Tellingly, The Iliad, the oldest complete work in the Western literary canon, is a story of war. Likely written down for the first time in the eighth century BC, it hearkens back to the Trojan War of yet several centuries earlier, a conflict shrouded in myth and legend even at the time the supposed blind poet Homer first began to chant his tale. The epic does devote space to the ultimate pointlessness of being pawns in the sport of the gods that was the Bronze Age Greeks’ conception of war, as well as the suffering engendered by it. Yet that doesn’t prevent it from glorying in all the killing, thus illustrating that ultra-violent popular entertainments are anything but a modern phenomenon. The goriest videogame has nothing on Homer:

He hurled and Athena drove the shaft
and it split the archer’s nose between the eyes —
it cracked his glistening teeth, the tough bronze
cut off his tongue at the roots, smashed his jaw
and the point came ripping out beneath his chin.
He pitched from his car, armor clanged against him,
a glimmering blaze of metal dazzling round his back —
the purebreds reared aside, hoofs pounding the air
and his life and power slipped away on the wind.

Just as Homer looms large in the early Greek literary tradition, one Heraclitus does the same in early Greek philosophy; legend tells us he wrote around 500 BC. Only fragments of his works remain to us today, mostly in the form of quotations lifted from them by later philosophers. Those fragments and the things those later commentators wrote about him identify Heraclitus as a philosopher of flux and change; “No man ever steps in the same river twice,” goes his most famous aphorism. He was apparently the first to identify the tension between physis, the reality of being in all its chaotic, ever-changing splendor, and logos, meaning literally “word” or “speech” in Greek — all of the rules of logic and ethics which humans apply in the hopeless task of trying to understand and master physis. A disciple of Heraclitus would call the narrative of progress a pathetic attempt to bridle the physis of history by forcing a comforting logophilic structure upon it.

As a philosopher of unbridled physis, Heraclitus was also a philosopher of war, of conflict in all its forms. “We must know that strife is common to all and strife is justice,” he wrote, “and that all things come into being through strife necessarily.” Neglected for a long time in favor of the cooler metaphysics of Socrates, Plato, and Aristotle, Heraclitus burst back into prominence at last in the early nineteenth century AD, when he was rediscovered by the German school of idealist philosophers. Later in the same century, Friedrich Nietzsche, who loathed the rationality of the Enlightenment and the narrative of progress it inspired, saw in Homer and Heraclitus alike a purer, more essential reflection of the reality of existence.

But we need not agree with Nietzsche that the Greeks of the Bronze Age had everything right and that it’s been all downhill from there to find something of value in Heraclitus. Consider again this assertion that “all things come into being through strife.” There is, it seems to me, some truth there, perhaps more truth than we’d like to admit. As Nietzsche’s contemporary Charles Darwin taught us, this is how biological evolution works. Strife is, in other words, what made us, the human race, what we are as a species. And it would certainly appear that our earliest civilizations too came into being through strife.

During the Enlightenment era, two dueling points of view about the nature of primitive peoples dominated. The Swiss/French philosopher Jean-Jacques Rousseau coined the cliché of the “gentle savage,” who lived in a state of nature with his companions in an Eden of peace and tranquility, untouched by the profanities of modern life; progress in all its guises, Rousseau asserted, had only led humanity to “decrepitness.” Rousseau saw the narrative of progress as a narrative of regress, of civilization and all its trappings serving only to divorce humanity more and more from the idyllic state of nature. But Thomas Hobbes, whom we already encountered in my previous article, took the polar opposite view, seeing the lives of primitive peoples as “nasty, brutish, and short,” and seeing the civilizing forces of progress as the best things ever to befall his species. He believed, in other words, that humanity’s distancing itself more and more from the primitive state of nature was an unalloyed good thing.

This duality has remained with us to the present day. You can see much of Rousseau in the Woodstock Generation’s claim that “we are stardust, we are golden, and we’ve got to get ourselves back to the Garden,” as you can in some aspects of the modern environmental movement and our societies’ general fetish for all things “natural.” Meanwhile Hobbes’s ideology of progress is, of course, the core, driving idea behind the game of Civilization, among other signs of the times.

So, we have to ask ourselves, who’s right — or, at the very least, who’s more nearly right? There are few if any human communities left in the world today who live in so complete a state of nature as to conclusively prove or disprove the theory of the gentle savage. We can, however, turn to the evidence of archaeology to arrive at what feels like a fairly satisfying answer.

Almost all of the most famous finds of Stone Age corpses show evidence at the least of having suffered violent trauma, more commonly of having died from it. Indeed, the fact that it can seem almost impossible to turn up any human remains that don’t show evidence of violence has become something of a running joke among archaeologists. Ötzi the Iceman, as a 5000-year-old body discovered in the Austrian Alps in 1991 became known, turned out to have been shot with a bow and arrow and dumped into the crevasse where he was found. Kennewick Man, a 9500-year-old body discovered in Washington State in 1996, had been shot in the pelvis with some sort of stone projectile that remained embedded there. Lindow Man, a 2000-year-old body discovered in rural England in 1984, had been bonked on the head with a blunt object, had his neck broken with a twisted cord, and then, just to make sure, had his throat cut. Another 2000-year-old body found more recently in England had been beheaded, probably in a form of ritual sacrifice. Yet another recent discovery, a 4600-year-old family consisting of a man, a woman, and their two children, showed evidence of having been killed in a raid on their encampment. The Garden of Eden theory of early human history, it would appear, is right out.

Rather than being their antagonist, violence — or, often, the threat of violence — was a prime driver of early civilizations. Sentiment may have sufficed for primitive humans to keep their family and perhaps their friends close, but it was the logic of survival that pushed them to begin to enlarge their circles of concern, to band together into the larger communities that could form the basis for civilization. Long before humans had any inkling of a narrative of progress, the most important, tangible benefit of civilization was protection from the depredations of hostile neighbors. The Enlightenment philosopher Immanuel Kant called this development, which paradoxically arose out of the impulse toward conflict rather than cooperation, “asocial sociability.”

In a sense, this reality that civilizations are born in violence is baked into the game of Civilization. From the mid-game on, it’s possible to make a very good go of it as a peaceful democracy, to be a good global citizen not declaring war unless war is declared on you, striving to trade and research your way to Alpha Centauri. Before you reach that stage, however, you have to be a despotic state no better than any of the others. As every Civilization veteran knows, it’s absolutely vital to establish sovereignty over your starting continent during this early stage in order to have enough cities and resources to be competitive later on. Thus you can’t afford to play the gentle savage, even if you believe such a person ever existed. If, as is likely, there are rival civilizations on your starting continent, you have to conquer them before you can think about peaceful coexistence with anyone else.

But the debt which the narrative of progress owes to war and the threat of war extends far beyond a civilization’s early stages, both in real history and in the game. In fact, the modern world order, built around fairly large nation-states with strong centralized governments, is, along with all of the progress it has spawned on so many fronts, a direct outgrowth of the need to project military power effectively.

Early twentieth-century writers, reacting to the horrible wars of their times, concocted the legend of war as a more honorable affair during earlier ages, one in which civilians were spared and soldiers comported themselves as civilized men. One has only to read The Iliad to know what a load of bunk that is; war has always been the nastiest, most brutal business there is, and codes of behavior have seldom survived an army’s first contact with the enemy. And if one was unfortunate enough to be a civilian caught between two armies… well, raping and pillaging were as popular among soldiers of earlier centuries as they were among those of the twentieth, as the stories of same in The Iliad once again cogently illustrate.

Still, there were important differences between the wars that were fought prior to the eighteenth century and those that came later. It’s easy today to overlook how differently societies were organized prior to the Enlightenment era. Such modern countries as Germany and Italy were still collections of small independent states, cooperating at best under a framework of uneasy alliances. Even where there existed a centralized government, the monarch’s power was sharply limited under the feudal systems that held sway. If he wished to go to war, he was often reduced to begging his nobles for the money and manpower necessary to do so. In addition, economies in general had very limited resources to set aside from the basic task of feeding themselves in favor of waging war. It all added up to make wars into hugely inefficient businesses, where months or even years could go by between significant battles. In many ways, of course, that was good for the people of the countries fighting them.

It was the unification of England and Scotland as Great Britain in 1707 that marked the beginning of the modern nation-state. Thirty years before said unification, the entire English army consisted of no more than 15,000 soldiers, a number that could be packed into a typical modern sports arena and leave plenty of seats to spare. The historian John Brewer describes what followed:

The late seventeenth and eighteenth centuries saw an astonishing transformation in British government, one which put muscles on the bones of the British body politic, increasing its endurance, strength, and reach. Britain was able to shoulder an ever more ponderous burden of military commitments thanks to a radical increase in taxation, the development of public-debt finance (a national debt) on an unprecedented scale, and the growth of a sizable public administration devoted to organizing the fiscal and military activities of the state.

This radical remaking was driven by two principal factors. One was advances in technology and engineering that freed up more and more people to work at tasks other than food production; this was, after all, the period of the Enlightenment, when the narrative of progress went into overdrive. The other was the need to efficiently project military power to ever more far-flung locales in the world — the need of a burgeoning British Empire.

Alongside a centralized government bureaucracy and standing military grew that necessary evil for funding it: taxes. The effective average British tax rate rose from 3.5 percent in 1675 to 23 percent a century later, to no less than 35 percent at the height of the Napoleonic Wars of the early nineteenth century. And, what with the people from these earlier centuries not being all that different from us at bottom, lots and lots of them didn’t like it one bit. One William Pulteney spoke for them:

Let any gentleman but look into the Statute Books lying upon our Table, he will see there to what a vast Bulk, to what a Number of Volumes, our Statutes relating to Taxes have swelled. It is monstrous, it is even frightful to look into the Indexes, where for several Columns together we see nothing but Taxes, Taxes, Taxes.

The modern developed nation-state — bureaucratic, orderly, highly centralized, and absurdly highly taxed in comparison to any other era of human history — had been born, largely to meet the needs of the military.

With one country having remade itself in this more militarily efficient image, the other countries of the world felt they had no choice but to follow in order to remain competitive. In Europe, France and Spain concentrated more power than ever before in the hands of a central government, while the various small kingdoms that had traditionally made up Italy and Germany finally felt compelled to unify as centralized nations in their own right. The Ottoman Empire too remade itself after suffering a series of humiliating defeats at the hands of the ultra-modern nation-state of Napoleonic France, as did faraway Japan after the American Commodore Matthew Perry waltzed into Edo Bay with a small fleet of modern warships and held the shogunate hostage at gunpoint; “rich country, strong army” became the slogan for Japan’s economic, bureaucratic, social, and of course military modernization.

The game of Civilization does a remarkably clever job of depicting most of the factors I’ve just described. The deeper into the game you play, the more your maintenance costs go up, meaning that you have to tax your people more and more to maintain your civilization. And the game also captures the spur which the threat of war constantly gave to the progressive impulse; if you let your civilization fall too far behind its rivals in military terms, they will pounce. This alone provides a strong motivation to keep researching the latest advances, exactly as it did historically. The narrative of progress, in the game and in history, owes much to war.

But when we come to the second half of the twentieth century of our own planet’s history, the notion of war and/or the threat of war as a prime driver of the narrative of progress becomes more fraught. It has long been commonplace for critics of progress to contrast the bloody twentieth century with the relatively peaceful nineteenth century, using a range of seemingly telling statistics about death and suffering to anchor their contention that the narrative of progress has really only made us better at killing one another. Yet their insistence on passing their statistical judgment on the twentieth century as a whole obscures something rather important: while the first half of the century was indeed inordinately, almost inconceivably bloody, the second half was vastly less so. The statistics for the century as a whole, in other words, are hopelessly skewed by what we can all agree to hope were the historical anomalies of the two biggest wars ever fought.

Since the end of World War II, the situation has been much different. While small wars have certainly continued to be fought, two proverbial “great powers” haven’t met one another directly on a battlefield since 1945: that’s 73 years as I write these words, a record for all of post-classical human history. As the political scientist Robert Jervis could write already in 1988, “the most striking characteristic of the postwar period is just that — it can be called ‘postwar’ because the major powers have not fought each other since 1945. Such a lengthy period of peace among the most powerful states is unprecedented.” The change is so marked that historians have come up with a name for the period stretching from 1945 to the present: “The Long Peace.” This is the aspect of the Cold War which was overlooked by a public justifiably worried about the threat of nuclear annihilation, which was obscured by the small-scale proxy wars and police actions fought by the Americans in places like Vietnam and by the Soviets in places like Afghanistan. And yet the Long Peace has now outlasted the Cold War with which it overlapped by more than a quarter of a century.

If we want to identify what changed in the nature of warfare itself at the end of World War II, the answer is blazingly obvious: the atomic bomb entered the picture. The idea of a weapon so terrible that it would bring an end to war wasn’t, it should be noted, a new one at the time the bomb entered the picture. In 1864, Victor Hugo, looking forward to a future replete with flying machines, proposed that their potential on the battlefield would be sufficient to make armies “vanish, and with them the whole business of war.” Even the logic of mutually-assured destruction wasn’t really new at the dawn of the Cold War. In 1891, Alfred Nobel, the inventor of dynamite, suggested to an Austrian countess that “perhaps my factories will put an end to war sooner than your congresses: on the day that two army corps can mutually annihilate each other in a second, all civilized nations will surely recoil with horror and disband their troops.”

Still, nuclear weapons, with their capacity to mutually destroy not just opposing armies but opposing civilizations — and, indeed, the entirety of the world that built them — were clearly something new under the sun. It’s thus not hugely surprising to find that the game of Civilization doesn’t seem quite sure what to do with them when they finally appear so late in the day. After doing a credible job in the broad strokes, all things considered, of portraying the global balance of military power through World War II, the edges really begin to fray at the advent of the nuclear age. The game makes no space for the total destruction of an all-out nuclear exchange. Nuclear strikes come at a considerable cost to the environment, but it is possible in the game to win a nuclear war, sending what some critics regard as a regrettable message. To be fair to Sid Meier and Bruce Shelley, it would be very difficult indeed to implement nuclear weapons in a way that feels both true to history and satisfying in the game. Thus Civilization fell victim here to Meier’s old maxim of “fun trumps history.” That said, the designers did make an obvious attempt to simulate what a Pandora’s Box nuclear weapons really are in at least one way. When one civilization builds the Manhattan Project Wonder, all civilizations in the game who have researched the Rocketry advance get instant access to nuclear weapons.

This side of the game serves as a fine illustration of an aspect of strategy-game design that’s very easy to overlook. Many players believe that the ideal artificial intelligence plays just like a human would, but this isn’t necessarily the case. If the more militaristic civilizations in the game were to start wildly nuking the player, ruining the civilization she’d spent so long building, she wouldn’t feel pleased that the artificial intelligence was so smart. Not at all; she’d feel like she was being punished for no good reason. Fun, it seems, also trumps perfect artificial intelligence. In light of this maxim, your opponents in Civilization are notably reluctant to employ nuclear weapons, only nuking you if you start nuking them.

The one memorable exception to this rule is down to a bug. Gandhi, the leader of the Indian civilization, is coded to be extremely non-aggressive. Problem is, his “aggression” setting is so low that it can actually loop back around to the maximum value when modifiers get subtracted. The upshot of all this is that he winds up being passive-aggressive rather than non-aggressive, avoiding all conflict until he acquires nuclear weapons, then letting the nukes fly with abandon. One can see this behavior as an unfortunate if unintentional bit of ethnic stereotyping. But one can also, of course, see it as kind of hilarious.
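The looping-around behavior described above is classic unsigned-integer underflow. Here's a minimal Python sketch of the mechanism as the story is usually retold — the function name, field size, and modifier values are illustrative assumptions, not taken from Civilization's actual source:

```python
# Hypothetical sketch of the retold "Gandhi bug": if aggression is stored
# in an unsigned 8-bit field, subtracting past zero wraps around to a huge
# value instead of clamping at zero.

def adjust_aggression(aggression: int, modifier: int) -> int:
    """Apply a modifier to an aggression score stored as an unsigned byte.

    Python ints never overflow, so `% 256` emulates the wraparound an
    unsigned 8-bit field would exhibit in C.
    """
    return (aggression + modifier) % 256

gandhi = 1                               # minimum aggression in the game
gandhi = adjust_aggression(gandhi, -2)   # e.g. a government-type reduction
print(gandhi)                            # 255: maximally aggressive
```

Because an unsigned byte can only hold 0 through 255, "one below zero" becomes 255 — which is how a leader coded for minimum aggression could end up reading as maximally aggressive once modifiers are applied.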

At any rate, there is nothing like the Long Peace accounted for in the game. (Admittedly, the Long Peace was much shorter at the time that the original Civilization was made.) As for historians: their points of view on the subject can be broadly sorted into two opposing camps, which I’ll call the realpolitik view and the globalist-idealist view. Both camps give due deference to the importance of nuclear weapons in any discussion of postwar geopolitics, but they diverge from there.

Those who fancy themselves the sober realists of the realpolitik school believe that the fundamentals of war and peace haven’t really changed at all, only the stakes in terms of potential destruction. From the first time that a primitive tribe armed itself with spears to make a warlike neighboring tribe think twice before attacking its camp, weapons of war have been as useful for preventing wars as for fighting them. Nuclear weapons, the realpolitik camp claims, represent only a change in degree, not in kind. From this point of view arose the rhetoric of peace through strength, deployed liberally by steely Cold Warriors on both sides of the Iron Curtain. Safety, went the logic, lay in being so militarily strong that no one would ever dare mess with you. The Long Peace was a credit to the fact that the United States and the Soviet Union — and, after the Cold War, the United States alone — were so thoroughly prepared to kick any other country’s ass, with or without employing nuclear weapons.

The globalist-idealist view doesn’t ignore the awesome power of nuclear weapons by any means, but sees it through a more nuanced framework. Many people at the dawn of the nuclear age — not least many of the scientists of the Manhattan Project who had helped to build the bomb — hoped that its power would lead to a philosophical or even spiritual awakening, prompting humanity to finally put an end to war. Some went so far as to advocate for the sharing of the technology behind the bomb with all the countries of the world, thus placing the whole world on a level playing field and ending the dominion of strong over weak countries everywhere. Such a thing wasn’t done, but there may be reason to believe that the idealistic impulse which led to proposals like this one found another outlet which has done the world an immense amount of good.

Looking back to the actual horrors of the previous few decades and the potential horrors of nuclear war, countries across the world after World War II instituted an international system of order that would have sounded like a utopian dream five years before. Its centerpiece was the United Nations, a forum unlike any that had existed before in human history, a place to which disputes between countries could be brought, to be hashed out with the help of neutral peers before they turned into shooting wars. Meanwhile an International Court of Justice would, again for the first time in history, institute a binding, globalized system of law to which all of the United Nations’s signatories, big or small, would be bound.

These are the major, obvious institutions of the globalized postwar order, but the spirit that spawned them has led to countless other international organs of communication and dispute resolution. Perhaps the most amazing of these — and an institution whose amazingness is too often overlooked — is the European Union. Known throughout most of history as the world’s preeminent powder keg of war, Europe, with its dozens of culturally, ethnically, and linguistically diverse countries packed together more closely than anywhere else in the world, has at last managed to set aside ancient rivalries and the many wars to which they historically led in favor of a grand continent-spanning cooperative project that’s made the idea of another general European war all but unimaginable. Even the recent decision by Britain to withdraw from the European Union hasn’t, as was breathlessly predicted by so many Cassandras, led to the dissolution of the project. Instead the latest polling shows substantially increased support for the European Union among the citizens of its remaining member states, as if the blow that was Brexit caused many to wake up to just how precious it really is.

To the extent that it takes a position, the game of Civilization winds up pitching its tent with the realpolitik school, although one does sense that this is done almost by default. Its mechanics are suited to depicting a global order based on the military balance of power, but, while the United Nations does make a token appearance, the game has no real mechanism for transcending nationalism and the wars that tend to accompany it. Only limited cooperation between rival civilizations is possible, and, especially at the higher difficulty levels, it’s a careful player indeed who manages to avoid wars in the climactic stages of the game. All of this is perhaps unfortunate, but forgivable given the long arc of history the game has to cover.

In the real world, however, your humble writer here does see reason to believe that we may be edging into a new, post-national, postwar-in-the-universal-sense era. Of course, we need to be very careful when we begin to assert that we’re privileged to live in a unique time in history. Many an earlier era has been foolish enough to regard itself as unique, only to learn, sometimes painfully, that the old rules still apply. Yet recent decades really do seem to have altered our attitudes toward war. The acquisition of territory by military force, once considered a matter of course, is now looked upon so unfavorably by the world at large that even as established a bad actor on the world stage as Vladimir Putin’s Russia felt compelled to justify its annexation of the Crimea in 2014 with a sham referendum. The United States, widely regarded with some justification as the last remaining warmonger among the well-developed Western nations, nevertheless goes to lengths that would have been inconceivable in earlier eras to avoid civilian casualties in its various military adventures. The same reluctance to accept war for the ugly business it is goes a long way toward explaining why, despite having the most powerful military the world has ever known, the same country clearly wins so very few of the wars it starts.

Changing attitudes toward war in the West can also be charted through our war memorials. London’s Trafalgar Square, a celebration of a major naval victory over Napoleon, is almost a caricature of extravagant triumphalism, with an outsized Admiral Horatio Nelson looking proudly down on the scene from the top of a 170-foot column. The Vietnam War Memorial in Washington, D.C., on the other hand, engages with its subject — one of those recent wars the United States failed to win out of an unwillingness to behave as brutally as was necessary — not as a triumph but as a tragedy, being a somber roll call of the ordinary soldiers who lost their lives there. But perhaps nowhere is the transformation in attitudes more marked than in Germany, which, after instigating the most terrible war in history well under a century ago, is now arguably the most fundamentally pacifistic nation in the West, going so far as to anger free-speech advocates by banning blood in videogames and banning right-wing political parties that venture anywhere close to the ideological territory once occupied by the Nazi party.

This notion that we are on the cusp of a new era of peaceful international cooperation, that soon the brutality of war might be as unthinkable to the modern mind as that of slavery or institutionalized torture, was a key component of Francis Fukuyama’s assertion that humanity might be reaching the end of its history. A quarter-century on from that audacious thesis, the international order has been shaken at times, particularly by events in recent years, but the edifices built in the aftermath of World War II still stand. Even if we can only partially agree with the statement that humanity has finally found an orderly alternative to war through those edifices — reserving the other half of the Long Peace equation to the old realpolitik of might makes right, in the person of the peace-guaranteeing power of the United States and that ultimate deterrent of nuclear weapons — we might be slowly leaving behind the era of nationalism that began with the emergence of strong, centralized nation-states in the eighteenth and nineteenth centuries and led eventually to so much bloodshed in the first half of the twentieth. Even more optimistically, we might soon be able to say farewell to war as humanity has always known it. “Last night I had the strangest dream,” goes a lovely old folk song. “I dreamed the world had all agreed to put an end to war.” Today we are, by any objective measurement, closer to achieving that strange dream than we’ve ever been before. War has defined our past to a disconcerting degree, but perhaps it need not do the same for our future.

What would and should a postwar world really be like? Many have looked askance at the idea of a world free of war, seeing it as a world free as well of the noble virtues of honor, sacrifice, and courage, a world where people live only to selfishly gratify their personal needs. Unsurprisingly, Nietzsche is counted among the critics, painting a picture of a world full of “men without backs” who are no better than slaves to their creature comforts. More surprisingly, Georg Wilhelm Friedrich Hegel, that original architect of a narrative of progress climaxing in a peaceful and prosperous end of history, shared many of the same concerns, going so far as to state that nations at the end of history would need to continue to require military service of their citizens and continue to fight the occasional war in order to keep the noble virtues alive. Modern critics of the lifestyle of developed Western nations, speaking from both inside and outside those nations’ umbrellas, decry their servile “softness,” decry the way that the vicissitudes of fashion and consumerism take the place of the great feats that once stirred men’s souls. Peace and prosperity, goes another, related argument, are ultimately boring; some theories about the outbreak of World War I have long held that its biggest motivator was that countries just got tired of getting along and wanted a little mayhem to break up the monotony. Certainly our fictions — not least our videogame fictions — would be a lot duller without wars to relive.

I can understand such concerns on one level, but feel like they reflect a profound lack of imagination on another. I can’t, alas, count myself among the younger generation or generations who must put the finishing touches on a post-national, postwar world order, if it should ever come to be. Yet I can say that our current younger generation’s greater tolerance toward diversity and marked disinclination toward violence don’t strike me as being accompanied by any deficit of idealism or passion. And there is much that can replace war in their souls that is even more stirring. They could finally get serious about cleaning up this poor planet which their elders have spent so many centuries trashing. They could do something for the poorest regions of the world, to bring the benefits of the prosperous postwar international order to all. They could follow the example of humanity’s grandest adventure to date — the Apollo Moon landing, which truly was shared by the entire world thanks to the progressive technology of television — and look outward, first to Mars, perhaps eventually all the way to Alpha Centauri. For that matter, my own generation could make a solid start on many of these things right now. With all due respect to Hegel and Fukuyama, the end of war need not mean the end of history. It could mean that our real history is just getting started.

(Sources: the books Civilization, or Rome on 640K A Day by Johnny L. Wilson and Alan Emrich, The Story of Civilization Volume I: Our Oriental Heritage by Will Durant, The Past is a Foreign Country by David Lowenthal, The Sinews of Power by John Brewer, The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker, The End of History and the Last Man by Francis Fukuyama, The Iliad by Homer, Fragments by Heraclitus, The Social Contract by Jean-Jacques Rousseau, Leviathan by Thomas Hobbes, Lectures on the Philosophy of History by Georg Wilhelm Friedrich Hegel, Thus Spoke Zarathustra by Friedrich Nietzsche, A Treatise of Human Nature by David Hume, Basic Writings of Kant by Immanuel Kant, and Nationalism: A Very Short Introduction by Steven Grosby; the article “Strategic Digital Defense: Video Games and Reagan’s ‘Star Wars’ Program, 1980-1987” by William M. Knoblauch, found in the book Playing with the Past: Digital Games and the Simulation of History.)

April 12, 2018

Retroactive Fiction

Escape From Solaris (1984) by Graham Nelson

by Ant at April 12, 2018 10:41 PM

Previously I wrote about The Discovery, which is the first of two works of interactive fiction (known together as Galaxy’s Edge) that were written for the 8-bit BBC Micro in 1984 by Graham Nelson.

The Discovery is, more or less, a conventional single-player text adventure game. But the second game, Escape From Solaris, is for two players. It can be played either in split-screen mode on a single BBC Micro, with players taking alternate turns at the keyboard, or, unusually, on two BBC Micros connected together with a serial communications cable (RS-423).

This early form of “networked gaming” was an innovation. In fact, Escape From Solaris is the only Beeb game I know of that works in quite this way. (I did manage to find a BBC Micro version of Battleships that also used RS-423 comms between two Beebs, but Solaris is probably the only text adventure to do so.)

I’d been wanting to try to get the game running on two machines for quite a while, and last year I finally got around to it (after taking ages to realise that the way the comms lead had to be wired up was actually rather obvious). See the YouTube video above. Thanks to Lee for letting me borrow his setup and for helping me demo the game.

If you want to try Escape From Solaris yourself, you can play the game in an emulator in your browser, but only in split-screen mode in a single window (networked Beebs are not included):

Play Escape From Solaris online

Choice of Games

Silverworld — Survive the past. Save the future.

by Rachel E. Towers at April 12, 2018 05:41 PM

We’re proud to announce that Silverworld, the latest in our popular “Choice of Games” line of multiple-choice interactive-fiction games, is now available for Steam, iOS, and Android. It’s 30% off until April 19th!

In a world of trackless jungles, colossal beasts, and cruel pre-human civilizations, you must survive the past if you want to save the future! You were only meant to guard the laboratory, but when a treacherous power cripples Doctor Sabbatine’s time machine, you’re left stranded! Face the savage inhabitants of Silverworld and build your own civilization—or plunder the past and return home unimaginably rich!

Silverworld is a 560,000-word interactive time-travel fantasy novel by Kyle Marquis, where your choices control the story. It’s entirely text-based—without graphics or sound effects—and fueled by the vast, unstoppable power of your imagination.

You need allies to survive, but who can you trust? The locals may have already betrayed you to appease their enemies. The empress back home has ordered you to plunder this new world. Your friend survived the crash only for the jungle to infect him, transforming him into something inhuman. And the expedition’s chief adviser has imprisoned the Icons—architects of the universe, masters of time—and fled to build his own civilization.

Can you rebuild Doctor Sabbatine’s time machine and return home? You must protect your timeline, but at what cost? And after leading the people of Silverworld, will you even want to?

• Play as male, female, or nonbinary; straight, gay, bi, or ace.
• Carve out your own Stone Age nation.
• Face giant lizards, renegade airships, feathered apes, and the volcano fortress of the snake people!
• Uncover the secret history of your benefactor Doctor Sabbatine and her robot helpers.
• Confront challenges with threats or charm, overt violence or subtle tricks.
• Use modern technology to survive, or abandon it and go native!
• Protect the past from exploitation, or be the first to cash in.
• Fight the False Icon, surrender to its will, or try to trick it into granting you your heart’s desire.
• Befriend, betray, and romance robots, invincible warriors, and bee women from the Crystal Plains.
• Fight to free the Icons—the creators of the universe—or enslave them for your own ends.

You can save the future…if you can survive the past.

We hope you enjoy playing Silverworld. We encourage you to tell your friends about it, and recommend the game on StumbleUpon, Facebook, Twitter, and other sites. Don’t forget: our initial download rate determines our ranking on the App Store. The more times you download in the first week, the better our games will rank.

Available in the “Choice of Games” app for iPhone and iPad

The Choice of Games “omnibus” app is a new way to play our games on iOS: a single app that collects all of our Choice of Games titles in one place.

(The omnibus app is only available on iOS for now, not Android.)

Download the omnibus for free, and you’ll receive free, unlimited access to some of our classics, and free demos of our greatest hits and new releases.

Learn more about the omnibus in our omnibus FAQ on our website.

Give it a try!

April 10, 2018

Choice of Games

Author Interview: Kyle Marquis, “Silverworld”

by Mary Duffy at April 10, 2018 05:41 PM

In a world of trackless jungles, colossal beasts, and cruel pre-human civilizations, you must survive the past if you want to save the future! You were only meant to guard the laboratory, but when a treacherous power cripples Doctor Sabbatine’s time machine, you’re left stranded! Face the savage inhabitants of Silverworld and build your own civilization—or plunder the past and return home unimaginably rich!

Silverworld is a 560,000-word interactive time-travel fantasy novel by Kyle Marquis, author of Empyrean. I sat down with Kyle to find out more about Silverworld and his upcoming Choice of Games projects. Silverworld releases this Thursday, April 12th.


Silverworld is your second game. What lessons did you carry over from writing Empyrean?

The main lesson from writing a Choice of Games game is that they’re not like any other game–not a text adventure, not a module for a tabletop RPG. Game mechanics that work in one system don’t necessarily translate. Anyone who’s tried to implement an elaborate inventory system in Choicescript has learned that lesson. For Silverworld, I streamlined the stat system, focusing on unipolar variables (Charisma, Education) instead of opposed variables (Charming/Domineering, Formal Education/Street Smarts), and simplified the testing mechanics. That’s a technical way of saying that Silverworld is built to be intuitive and easy to understand. In Empyrean you’re an ace pilot, an idea most gamers are familiar with. Silverworld is a time-travel alt-history game where you get to build a Stone Age village and fight evil crystal gods; the mechanics had to be clear so the players could focus on the world they’re trying to survive in.
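The unipolar-versus-opposed distinction Marquis draws can be sketched in a few lines. This is a hypothetical illustration in Python, not actual ChoiceScript (which expresses stats through its own *set and *if commands), and the stat names and thresholds are invented for the example:

```python
# Hypothetical illustration of the two stat styles discussed above;
# not actual ChoiceScript code.

def unipolar_check(stat: int, difficulty: int) -> bool:
    """Unipolar stat (e.g. Charisma on a 0-100 scale): a test simply
    asks whether the stat meets the difficulty."""
    return stat >= difficulty

def opposed_check(charming: int, difficulty: int, want_charming: bool) -> bool:
    """Opposed pair (Charming vs. Domineering) stored as one 0-100
    value: high means Charming, low means Domineering, so a test must
    first work out which end of the scale applies."""
    effective = charming if want_charming else 100 - charming
    return effective >= difficulty

print(unipolar_check(60, 50))        # True: Charisma 60 beats difficulty 50
print(opposed_check(30, 50, False))  # True: the Domineering side is 70
```

The unipolar version is easier for a player to reason about, which matches the stated goal of keeping the mechanics intuitive so players can focus on the world.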

This one’s a massive 500,000 words, which puts it in the top 3 or 4 games for length we’ve published. And in fact, when Empyrean came out at 300k+ words it was one of our longest at the time. Any comment?

I believe you can accomplish anything if you put your mind to it and don’t understand how much trouble you’re making for yourself.

In fact I had two related goals with Silverworld. First, I wanted to let players choose in what order they tackled the challenges facing them. Scenes in Silverworld aren’t linear; to repair your time machine, you can seek out components in any one of three areas, in any order, and the challenges change based on how far along in the game you are. Second, I wanted to avoid one of the main problems with games where you’re given that kind of agency: they can feel like the whole world is static, with other people just hanging around rather than pursuing their own goals. So in Silverworld, you act, then your enemies, rivals, and companions advance their own plans, then you act again, back and forth as you react against each other. You’re up against some clever and determined competition, from ruthless colonizers to ancient gods, and I wanted to keep the pressure on the player while still giving them a range of options. The result is a large and very complicated game full of many different ways to solve the problems facing you.

Give me a little background on Silverworld. Is it a time travel game? Is it an alternate history? Is it primarily about the volcano fortress of the snake people?

There are snake people, and they do conduct horrific scientific experiments from a fortress hidden inside a volcano. There are also feathered apes, riding lizards, an airship full of insane survivalists, jungle cults, and at least one T. Rex. It’s that kind of game. But what I wanted to do with Silverworld is take a lot of old Lost World tropes (and they are old–Sir Arthur Conan Doyle wrote The Lost World over a century ago!) and tie them together in a new way. Silverworld is a savage world adventure, but I wanted to use those tropes to explore civilization and colonialism, the spread of technology and the nature of political control. It’s a game about history, religion, civilization, and the origins and nature of justice.

But you’ve actually also been writing another game, a “shorter” game, alongside Silverworld, called The Tower Behind the Moon. Tell us a little about it.

Tower is a fantasy game about an archmage seeking ascension. Fantasy wizards seem to have a sort of life-cycle, like cicadas: (1) apprentice, (2) adventurer-wizard, (3) wizard-in-a-tower, (4) wizard-god. I wanted to tell a story about what happens in the last month between (3) and (4), about how a wizard escapes the bounds of the mortal world…or fails to do so.

Though there’s still plenty of action–monsters to confront and enemy wizards to duel–Tower is less of an adventure story than Empyrean or Silverworld; it’s quieter and more elegiac. You play a magician who is, in a way, attending their own funeral, wrapping up loose ends before departing to become an archon, or a deity, or an entombed lich or something even stranger. You settle debts, make sure your apprentice and other helpers have a place in a world without you, and try to make peace with the mistakes you made and the things you’ve done for power and knowledge.

Are you ever concerned that the extremely specific worlds you write are Lynchianly incomprehensible and alienating to our readers?

In my experience, people are surprisingly comfortable with weird situations and settings as long as they can follow a clear emotional journey. Decades of familiarity have normalized many franchises for us–think Star Wars or the Marvel universe–but if you try explaining them without using any proper nouns, you realize two things. First, the individual parts are very, very weird. (Two robots need a space wizard to help them rescue a princess. A cryogenically preserved World War 2 super-soldier and a huge angry green scientist fight a Norse god and his alien army.) Second, stories with clear character arcs are easy to understand even if the details are unfamiliar. In Silverworld, you play a poor nobody forced to take charge when an experiment goes disastrously wrong. However a player approaches the deadly world they’re trapped in, as a noble hero or self-interested crook, as a warrior or a diplomat, they can follow their character’s emotional development through the course of the story. Deliver that, and it doesn’t matter how weird the snake people are.

Speaking of extremely specific worlds: Pon Para is your next big project for Choice of Games, the first part of which would likely be out sometime in the summer of 2019, yes?

I’m just getting started on Pon Para and the Great Southern Labyrinth, which is the first game in a Bronze Age fantasy trilogy. What will Pon Para look like, exactly? It depends on what people like most about Silverworld. The audience for Choice of Games is still figuring out what they want from these strange and wonderful games, and as long as I get to invent my weird little worlds and populate them with people you can date and/or swordfight, I’m glad to shape my games around what people like most.

sub-Q Magazine

Making Interactive Fiction: Going beyond “test your stuff.”

by Bruno Dias at April 10, 2018 01:41 PM

The most-often given advice to new IF writers, and writers in general, is “get feedback.”

Okay, so you’ve taken your story from a concept to a draft. You’re ready to show it to folks; maybe you’re even pretty proud of what you’ve put together. You take it to beta readers… and then they give you feedback.

Ah. Yeah. You signed up for this. What do you do with it?

Receiving feedback, and making use of it, is a hurdle for a lot of first-time writers. And it’s tough when you’re writing fiction that you feel is personal, or deeply subjective. Getting good feedback means finding people who can be good testers, which is challenging in itself.

A good test reader is someone who shares your aesthetic goals, someone who you can trust to be honest with you, but ideally not someone who is like you in every respect. Cultivating a circle of peers that I can exchange feedback with has been really useful to me, though I honestly couldn’t tell you how to make that happen.

You’ve probably been told to swallow your pride already, but really that’s just the start. Yes, feedback is going to be useless to you if you can’t stay objective about it, but it’s not enough to be objective. You need to know how to contextualize what you’re getting back from testers.

Some of the feedback you receive is going to be straightforward, of course: having more eyes on a project will help you find obvious bugs and errors. Having people with a diversity of viewpoints read your work before you publish it will often catch blind spots.

But what most new writers really struggle with is the frustration-guilt seesaw of getting feedback you think is bad, but you can’t be sure you’re not just giving in to your own biases. And yeah, sometimes you are being biased; I can’t really give you a way of knowing for sure. It gets easier with experience… I hope.

But for me, the key has been realizing that not all feedback is good, but all feedback is useful. Ultimately, even if someone’s perspective isn’t representative of who you’re trying to reach, they’re going to have a reaction that you can get something out of. The trick is to temper those reactions with other feedback, and keep circling back to the original vision for the project. Look at feedback not as answering the question of what you should do, but of what you are doing now; from there, you can derive what to change to get to what you want to be doing.

Most of all, don’t take early feedback looking for small tweaks. Early versions of your story are probably going to need some substantial surgery; embrace that. It’s often worth thinking through big structural changes in early revisions. IF writing is inevitably very iterative; it’s difficult to know how something will land ahead of time. And often, seemingly small problems are just the tip of the iceberg of a deeper weakness; getting to a story that hangs together well often means finding big solutions rather than spackling over the little cracks.

So: Writing is hard, not everyone shares your goals and aesthetics, but everyone has something you can glean information from. The worst possible feedback will tell you the boundaries of the audience you’re going to reach, or the ways you can expect to be misunderstood, but don’t expect (or settle for) the worst. Seeking out good feedback can be hugely rewarding, and over time, it’ll make you a better writer. Nothing has helped me improve more than editors, and while a lot of IF writing doesn’t have the luxury of a full-on editor, the basic principle still holds: Putting more eyes on a project will make it stronger.


Bruno Dias is a writer and narrative designer based in São Paulo. His work has appeared in video game publications (Waypoint, PC Gamer), games (Where the Water Tastes Like Wine) and interactive fiction on Sub-Q and elsewhere.

The post Making Interactive Fiction: Going beyond “test your stuff.” appeared first on sub-Q Magazine.

April 09, 2018

Zarf Updates

Myst 25th-anniversary Kickstarter

by Andrew Plotkin ([email protected]) at April 09, 2018 05:24 PM

Cyan just launched a Kickstarter for a 25th-anniversary re-release of Myst. And all four sequels, newly updated for Windows. And the single-player Uru collection (Complete Chronicles). Myst (the first one) will be included in both classic slideshow and full-3D (RealMyst) forms.
Everything will be available on Steam, but the big prize is a physical collection of discs in a case which looks (of course) like a linking book.
(Regrettably, this is Windows-only. They don't have the resources to update all the Mac ports.)
Cyan has been hinting at this for a few days now. Also, you know, posting to Facebook about it, which goes a bit beyond hinting. But here it is.
So that's the headline. Is there anything interesting to say about it, other than "Back this"?
Obviously, we'd prefer a new game rather than a re-release of some old games. I still have all my old CD-ROMs of the Myst series -- although it's a shaky bet which ones might run on any computer that's not buried in a closet. But Myst 3 and Myst 4 have been out of print for a very long time. This release is a first opportunity for a lot of younger players. I'm rather keen on replaying them myself.
(Yes, 3 and 4 have some obvious flaws, which are running jokes in the Myst fan community. So do 1, 2, and 5. There isn't a one of them that I regret playing.)
The more important question, to me, is "What happens to Cyan next?" Recent articles have made it clear that the company is in very tight straits right now. Obduction did not make enough money to fund a new game. In fact, Cyan is still paying for Obduction work, since the PSVR port remains stuck in the pipeline.
So this Kickstarter, if it succeeds, will get some cash into the barrel. But it looks like they're just aiming to raise enough money to fund the production of the physical rewards. (They have, very sensibly, omitted all mention of stretch goals. If the project over-funds, they'll make more of what they've planned.)
Hopefully they've done their spreadsheets right and they'll break even on the rewards. Then they'll have an additional ongoing income stream from Steam sales. I don't know if that will add up to much -- as several developers have posted recently, it's a rough year for narrative games, and re-releases of old games are going to be hard put to compete. But any long tail is better than no long tail.
In the meantime, we'll have nifty memorabilia to fondle.

April 08, 2018

Zarf Updates

Why does Twitter allow third-party clients?

by Andrew Plotkin ([email protected]) at April 08, 2018 10:31 PM

In last month's open letter to Slack, I wrote:
(Twitter may not block third-party clients, but it sure wants to discourage people from using them.)
The next shoe in that caterpillar cavalcade just hit the floor:
After June 19th, 2018, “streaming services” at Twitter will be removed. This means two things for third-party apps:
  • Push notifications will no longer arrive
  • Timelines won’t refresh automatically
That's from a web page, Apps of a Feather, which was just launched as a joint announcement from the developers of four popular Twitter clients. Twitter has responded by "delaying the scheduled June 19th deprecation date" (@TwitterDev, Apr 6) but it's unclear if their new Account Activity API will be sufficient for third-party apps to keep working.
This sucks for me. If third-party clients vanish -- and I see that day coming, soon or late -- I will not be switching to the official Twitter client or Twitter.com web site. I'm not saying that out of principle or anything. I just find the official Twitter experience to be abysmal. I can't do anything with it. It's noisy, it's out of order, and it's full of ads. No.
(Apps of a Feather is hosted by Twitterrific, Tweetbot, Talon, and Tweetings. I use Echofon on iOS and Tweetbot on Mac. Echofon hasn't posted or tweeted anything about the issue, which is worrisome in a different direction.)
You might imagine, given my Slack post, that I will now write an open letter to Twitter telling them to continue supporting third-party clients. Sorry; nope; waste of time. Twitter isn't listening to me.
The question isn't why Twitter would drop support for third-party clients. The question is why they've kept supporting them for so long. Remember I just said that the official Twitter experience is full of ads? The clients I use don't show ads. I'm using Twitter ad-free. I am a freeloader! Why does Twitter put up with me?
They've never said, but I have a theory. I believe Twitter sees me as a selling point for their service. Not me, I mean, but people like me: early adopters with a lot of followers, who are seen as important or interesting people to follow. (In one circle or another.) "Influencers," if you will. I am a very small-time influencer, but there are a fair number of such people. Big-name bloggers, pundits, and so on.
We are, I am sure, the biggest users of third-party clients. We started with Twitter early, and we like how early Twitter behaved. (Ad-free, for a start.) We are cranky and unwilling to change our habits.
So for years (my theory says), Twitter has had a problem. They want to keep us early-cohorters around, because their selling proposition for newcomers is "Twitter is full of interesting people." But they don't want newcomers to use Twitter the way we do, because we're free-thinking radicals. Twitter wants newcomers to use the web site, which they have total control over. That's their only hope of getting and staying profitable.
This explains Twitter's weird, half-assed client support over the years. In 2012, they limited how popular third-party clients could get. (So old people could keep using their clients, but it's hard for those clients to acquire new users.) Over the past few years, Twitter has added new features which are not available to third-party clients. (I don't care about those features -- I just want my old-fashioned behavior -- but newcomers will want them.)
The problem with this, of course, is that every year there are more newcomers, and they follow more people who aren't me. Or people like me. Even if I'm right, the early cohort is a shrinking piece of Twitter's selling proposition. One day they're going to just shove us off the boat.
Last week's announcement, and partial retraction, is just another step in that dance. Third-party clients will still work, but maybe they won't refresh as smoothly. Or maybe they will. Nobody knows. Twitter isn't saying, because every year they're a little farther away from caring.
Where does this leave me? Off the boat. I can't use Official Twitter. I'll keep using third-party clients even if they become degraded. (I can use fussy, degraded interfaces for a long time.) If they go away entirely? I guess I lose Twitter.
There will be no Twitter replacement. I mean, there will be no service that is "like Twitter, with a Twitter-sized mass audience, but run with respect for users." You can't get that many users with an open service, because big services are expensive to run, and the only way to make money is to grind up your users for advertising paste.
On the other hand, I don't have a Twitter-sized mass audience. I have about 2500 followers. IF services that I help run, like IFComp and IFTFoundation, are of the same order of magnitude.
I have a Mastodon account: @zarfeblong (on the gamedev.place server). Perhaps the gamedev and IF crowd will migrate to Mastodon. That's not an absurd idea. Mastodon will never become a Twitter replacement, but it might work for my followers. I'm going to stay optimistic. Let's see what happens.

April 06, 2018

The Digital Antiquarian

The Game of Everything, Part 4: Civilization and Geography

by Jimmy Maher at April 06, 2018 04:41 PM

Most of history is guessing, and the rest is prejudice.

— Will Durant

Every veteran Civilization player has had some unfortunate run-ins with the game’s “barbarians”: small groups of people who don’t belong to either your civilization or any of its major rivals, but nevertheless turn up to harass you from time to time with their primitive weaponry and decided aversion to diplomacy. More of a nuisance than a serious threat most of the time, they can spell the doom of your nascent civilization if they should march into your capital before you’ve set up a proper defense for it. What are we to make of these cultural Others — or, perhaps better said, culture-less Others — who don’t ever develop like a proper civilization ought to do?

The word “barbarian” stems from the ancient Greek “bárbaros,” meaning anyone who is not Greek. Yet the word resonates most strongly with the history of ancient Rome rather than Greece. The barbarians at the gates of the Roman Empire were all those peoples outside the emperor’s rule, who encroached closer and closer upon the capital over the course of the Empire’s long decline, until the fateful sacking of Rome by the Visigoths in AD 410. Given the months of development during which Civilization existed as essentially a wargame of the ancient world, it’s not hard to imagine how the word “barbarian” found a home there.

Civilization’s barbarians, then, really are the game’s cultural Others, standing in for the vast majority of the human societies that have existed on our planet, who have never become “civilized” in the sense of giving up the nomadic hunter-gatherer lifestyle, taking up farming, and developing writing and the other traits of relatively advanced cultures. One of the biggest questions in the fields of history, archaeology, anthropology, and sociology has long been just why they’ve failed to do so. Or, to turn the question around: why have a minority of peoples chosen or been forced to become more or less civilized rather than remaining in a state of nature? What, in other words, starts a people off down the narrative of progress? I’d like to take a closer look at that question today, but first it would be helpful to address an important prerequisite: just what do we mean when we talk about a civilization anyway?

The word “civilization,” although derived from the Latin “civilis” — meaning pertaining to the “civis,” or citizen — is a surprisingly young one in English. Samuel Johnson considered it too new-fangled to be included in his Dictionary of the English Language of 1772; he preferred “civility” (a word guaranteed to prompt quite some confusion if you try to substitute it for “civilization” today). The twentieth-century popular historian Will Durant, perhaps the greatest and certainly the most readable holistic chronicler of our planet’s various civilizations, proposed the following definition at the beginning of his eleven-volume Story of Civilization:

Civilization is a social order promoting cultural creation. Four elements constitute it: economic provision, political organization, moral traditions, and the pursuit of knowledge and the arts. It begins where chaos and insecurity end. For when fear is overcome, curiosity and constructiveness are free, and man has passed by natural impulse towards the understanding and embellishment of life.

“Civilization” is a very loose word, whose boundaries can vary wildly with the telling and the teller. We can just as easily talk about human civilization in the abstract as we can a Western civilization or an American civilization. The game of Civilization certainly doesn’t make matters any more clear-cut. It first implies that it’s an holistic account of human civilization writ large, then proceeds to subsume up to seven active rival civilizations within that whole. In this series of articles, I’m afraid that I’m all over the place in much the same way; it’s hard not to be. But let’s step back now and look at how both abstract human civilization and the first individual civilizations began.

Homo sapiens — meaning genetically modern humans roughly our equals in raw cognitive ability —  have existed for at least 200,000 years. Long before developing any form of civilization, they had spread to almost every corner of the planet. Human civilization, on the other hand, has existed no more than 12,000 years at the outside. Thus civilization spans only a tiny portion of human history, which itself spans a still vastly tinier portion of the history of life on our planet.

How and why did civilized societies finally begin to appear after so many centuries of non-civilized humanity? I’ll tackle the easier part of that question first: the “how”. Let me share with you a narrative of progress taking place in the traditional “cradle of civilization,” the Fertile Crescent of the Middle East, seat of the earliest known societies to have developed such hallmarks of mature civilization as writing.

For hundreds of thousands of years, the Fertile Crescent and the lands around it were made up of rich prairies, an ideal hunting ground for the nomadic peoples who lived there from the time of the proto-humans known as homo erectus, 1.8 million years ago. But around 12,000 to 10,000 BC, the Middle East was transformed by the end of our planet’s most recent Ice Age, turning what had been prairie lands into steppes and desert. The peoples who lived there, who had once roamed and hunted so freely across the region, were forced to cluster in the great river valleys, the only places that still had enough water to sustain them. With wild game now much scarcer than it had been, they learned to raise crops and domesticated animals, which necessitated them staying in one place. Thus they made the transformation from nomadic hunting and gathering to sedentary farming — a transformation which marks the traditional dividing line between non-civilized and civilized peoples. “The first form of culture is agriculture,” writes Will Durant.

Early on, the peoples of the Fertile Crescent developed what is, perhaps counter-intuitively, the most fundamental technology of civilization: pottery, an advance whose value every Civilization player knows. As is described in the Civilopedia, the pots they made allowed them to “lay up provisions for the uncertain future” out of the crops they harvested, sufficient to get them through the winter months when other food sources were scarce.

One might say that the invention of pottery — or rather the onset of the future-oriented mindset it signifies — marks the point of fruition of humanity’s psychological transition from a state of nature to something akin to the modern condition. Some of you might be familiar with the so-called “worker placement” school of modern board games — a sub-genre on which the city-management screen in the original Civilization has been a huge hidden influence. In a game like Agricola, you know that you need to collect enough food to feed your family by the end of the current round, and then again by the end of every subsequent round. You can rustle up a wild boar and slaughter it to deal with the problem now and let the future worry about itself, or you can defer other forms of gratification and use much more labor to plow a field and plant it with grains or vegetables, knowing that it will really begin to pay off only much later in the game. Deferred gratification is, as you’ve probably guessed, by far the better strategy.
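The deferred-gratification arithmetic can be sketched as a toy simulation. All of the payoff numbers below are invented purely for illustration; they are not Agricola's actual rules, just the shape of the trade-off:

```python
# Toy model of the worker-placement trade-off described above.
# All payoff numbers are invented for illustration; they are NOT
# Agricola's actual rules.

def hunt_every_round(rounds):
    """Take the immediate payoff every round (slaughter the boar now)."""
    food = 0
    for _ in range(rounds):
        food += 2  # a small but instant return each round
    return food

def plow_and_plant(rounds, setup_rounds=3):
    """Sink the early rounds into fields, then reap a larger
    recurring payoff for every remaining round."""
    food = 0
    for r in range(rounds):
        if r >= setup_rounds:
            food += 4  # each harvest outproduces a hunt
    return food

# Farming starts behind but wins once the game runs long enough.
for rounds in (4, 8, 14):
    print(rounds, hunt_every_round(rounds), plow_and_plant(rounds))
```

In a short game the hunter comes out ahead; past the crossover point the farmer's recurring harvests dominate, which is exactly the mindset shift the paragraph above describes.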

It was, one might say, when humans developed the right mindset for playing a game like Agricola that everything changed. Something precious was gained, but something perhaps equally precious was lost. I use a lot of loaded language in this article, speaking about “primitive peoples” and “barbarians” as I do, and, good progressive that I am, generally write it under the assumption that civilization is a good thing. So, let me take a moment here to acknowledge that people do indeed lose something when they become civilized.

It is the fate of the civilized human alone among all the world’s creatures to have come unstuck in time. We’re constantly casting our gaze forward or backward, living all too seldom in the now. What the ancient Greeks called the physis moment — the complete immersion in life that we can observe in a cat on the prowl or a toddler on the playground — becomes harder and harder for us to recapture as we grow older. To think about the future also means to worry about it; to think about the past means to indulge in guilt and recrimination. “Of what are you thinking?” the polar explorer Robert Peary once asked one of his Inuit guides. “I do not have to think,” the Inuit replied. “I have plenty of meat.” An argument could be made that the barbarians are the wisest of all the peoples of the earth.

But, for better or for worse, we progressives don’t tend to make that argument. So, we return to our narrative of progress…

Cities and civilizations are inextricably bound together, not only historically but also linguistically; the word “city” is derived from the same Latin root as “civilization.” Cities provide a place for large numbers of people to meet and trade goods and ideas, and their economies of scale make specialization possible, creating space initially for blacksmiths and healers, later for philosophers and artists, as well as for hierarchies of class and power. They mark the point of transition from Karl Marx’s “primitive communism” to his so-called “slave society” — “Despotism” in the game of Civilization. On a more positive note, one might also say that the first cities with their early forms of specialization mark the first steps in Hegel’s long road toward a perfect, thymos-fulfilling end of history.

By the time a game of Civilization begins in 4000 BC, the Age of Stone was about to give way to the Age of Metal in the Fertile Crescent; people there were learning to smelt copper, which would soon be followed by bronze, as described by another pivotal early Civilization advance, Bronze Working. At this point, small cities had existed up and down the Fertile Crescent for thousands of years. The region was now rife with civilization, complete with religion, art, technology, written documents, and some form of government. Multi-roomed houses were built out of compressed mud or out of bricks covered with plaster, with floors made out of mud packed over a foundation of reeds. Inside the houses were ovens and stoves for cooking; just outside were cisterns for catching and storing rainwater. When not farming or building houses, the people carved statues and busts, and built altars to their gods. Around Jericho, one of the oldest of the settlements if not the oldest of all, they built walls out of huge stone blocks, thus creating the first example of a walled city in the history of the world. By 2500 BC, one of the civilizations of the Fertile Crescent would be capable of constructing the Pyramids of Giza, the oldest of Civilization’s Wonders of the World and still the first thing most people think of when they hear that phrase.

But in writing about the Fertile Crescent I am of course outlining a narrative of progress for only one small part of the world. Elsewhere, the situation was very different, and would remain so for a long, long time to come. By way of illustrating those differences, let’s fast-forward about 4000 years on from the Pyramids, to AD 1500.

At that late date, much of the rest of the world had still not progressed as far as the Fertile Crescent of 2500 BC. The two greatest empires of the Americas, those of the Aztecs and the Incas, were for all intents and purposes still mired in the Stone Age, having not yet learned to make metals suitable for anything other than decoration. And those civilizations were actually strikingly advanced by comparison with many or most of the other peoples of the world. In Australia, on New Guinea and many other Pacific islands, over much of the rest of the Americas and sub-Saharan Africa, people trailed even the Aztecs and the Incas by thousands of years, having not yet learned the art of farming.

Consider that in 11,000 BC all peoples on the earth were still hunter-gatherers. Since then, some peoples had stagnated, while others — in the Middle East, in Europe, in parts of Asia — had advanced in ways that were literally unimaginable to their primitive counterparts. And so we arrive back at our original question: why should this be?

For a long time, Europeans, those heirs to what was first wrought in the Fertile Crescent, thought they knew exactly why. Their race was, they believed, simply superior to all of the others — superior in intelligence, in motivation, in creativity, in morality. And, as the superior race, the world and all its bounty were theirs by right. Thus the infernal practice of slavery, after having fallen into abeyance in the West during the Middle Ages, reared its head again in the new American colonies.

In time, some European attitudes toward the other peoples of the earth softened into a more benevolent if condescending paternalism. As the superior race, went the thinking, it was up to them to raise up the rest of the world, to Christianize it and to provide for it the trappings of civilization which it had been unable to develop for itself. Rudyard Kipling, that great poet of the latter days of the British Empire, urged his race to “take up the white man’s burden” as a moral obligation to the benighted inferior peoples of the world.

If there was any objective truth to the racial theories underlying Kipling’s rhetoric, we progressives would find ourselves on the horns of an ugly dilemma, split between our allegiance to rationality and science on the one hand and the visceral repugnance every fair-minded person must feel toward racism on the other. Fortunately for us, then, there is no dilemma here: racism is not only morally repugnant, it’s also bad science.

Any attempt to measure intelligence is a problematic exercise on the face of it; there are many forms of intelligence, such as empathy and artistic intelligence, about which the standard I.Q. test has nothing to say. And even within the limited scope of I.Q., cultural factors are notoriously difficult to remove from the testing process. Nevertheless, to the extent that we can measure such a thing there seems little or nothing to indicate that the overall cognitive ability of, say, a primitive tribesman from New Guinea suffers at all in comparison to that of a “civilized” person. Indeed, in some areas, such as spatial awareness and improvisational problem-solving, primitive people are quite likely our superiors. When we think about it, this stands to reason. For a tribesman on a jungle hunt, an error in judgment could mean that he and his family won’t have anything to eat that night — or, in the worst case, that he won’t get to return to his family ever again. Against that sort of motivator, the threat of failing to get into one’s favored university because of an SAT score that wasn’t all it might have been suddenly doesn’t feel quite so motivating.

All of which is good for our consciences, but it still doesn’t answer the question we’ve been dancing around since the beginning of this article. If racial differences don’t explain why the narrative of progress takes root in some peoples and not in others, what does? Plenty of other possibilities have been proposed, all centering more or less around geography and ecology.

Climate is one proposed determining factor, upon which the citizens of Germany, Scandinavia, the United States, and Canada among other places have sometimes leaned in order to explain why their countries’ economies are generally more dynamic than those of their southern counterparts. The fact that residents of more northerly regions had to work so much harder to survive — to find food and to stay warm in a much harsher climate — supposedly instilled in them a superior work ethic — and perhaps, necessity being the mother of invention, a greater intellectual flair to boot. Will Durant expressed a similar sentiment on a more universal scale in his Story of Civilization, claiming that the “heat of the tropics,” and the “lethargy” it breeds, are fundamentally hostile to civilization. But such claims too often find their evidence in ethnic stereotypes almost as execrable as those that spawned the notion of a white man’s burden, and of equally nonexistent veracity.

The fact is that the more dynamic economies of Northern Europe and the northernmost Americas are a phenomenon dating back only a few centuries at most, not the millennia that would make them solid evidence for the climate-as-destiny hypothesis; ancient Rome, the civilization that still springs to mind first when one says the word “civilization,” was itself situated in the warm, lazy, lethargic, fun-in-the-sun region of Europe. Indeed, the peoples of Northern Europe are comparative latecomers to the cultural party. Until not that many centuries ago, Northern Europe was quite literally the land of the barbarians; the Visigoths who so famously sacked Rome, it must be remembered, were a Germanic people. In the Americas as well, the most advanced native societies, the only ones to develop writing, were found in present-day Mexico and Peru rather than the United States or Canada.

The credence given to the climate-as-destiny theory for many years in the face of such obvious objections, combined with the way that evidence of civilization decays much faster in tropical environments than it does elsewhere, caused archaeologists to entirely overlook the existence of some tropical civilizations. A dry desert environment is, by contrast, about as perfect for preserving archaeological evidence as any natural environment can be, and this goes a long way in explaining why we know so much about certain regions of the world in comparison to others. Michael Heckenberger caused a sensation in archaeological and anthropological circles in 2009 when he published an article in Scientific American about the ancient Xingu people of the Amazon rain forest, who lived in well-developed, orderly communities which Heckenberger compared to the Victorian urban planner Ebenezer Howard’s utopian “garden cities of tomorrow.”

Another proposed determining factor for civilization or the lack thereof, also prevalent among scholars for many years, is even more oddly specific than the climate-as-destiny hypothesis. Civilization develops, goes the claim, when people find themselves forced to settle in river valleys of otherwise arid climates — i.e., exactly the conditions that prevailed in the Fertile Crescent. The only way for a growing population to survive in such a place was to develop the large-scale systems of irrigation that could bring the life-giving waters of the river further and further from their source. Undertaking such projects, the first-ever examples of what we would today call public works, required a form of government, even of bureaucracy. Ergo, civilization.

In addition to the example of the Fertile Crescent, proponents of the “hydraulic theory” of civilization have pointed to other examples of a similar process apparently occurring: in the Indus Valley of India, in the Yellow and Yangtze Valleys of China, in the river valleys of Mexico and Peru. The hydraulic theory was very much still a part of the anthropological discussion at the time that Sid Meier and Bruce Shelley were making Civilization, and likely informs the way that food-producing river squares are such ideal spots for founding your first cities, as well as the importance placed on irrigating the land around your cities in order to make them grow.

But more recent archaeology, in the Fertile Crescent and elsewhere, has cast doubt upon the hydraulic theory. It appears that the first governments developed not in tandem with systems of irrigation but rather considerably before them. The assertion that a fairly complex system of government is a prerequisite for large-scale irrigation remains as valid as ever, as does the game of Civilization’s decision to emphasize the importance of irrigation in general. Yet it doesn’t appear that the need for irrigation was the impetus for government.

In 1997, a UCLA professor of geography and physiology named Jared Diamond published a book called Guns, Germs, and Steel, which, unusually, created an equal sensation in both the popular media and in academic circles. I described in my previous article the theory of technological determinism to which the game of Civilization seems to subscribe. In his book, Diamond asserted that, before there could be technological determinism, there must be a form of environmental determinism. One could of course argue that both the climate-as-destiny and the hydraulics-as-destiny theories I’ve just outlined fall into that category. What made Diamond’s work unique, however, was the much more holistic approach he took to the question of environmental determinism.

We tend to see the development of civilization through an anachronistic frame today, one which can distort reality as it was lived by the people of the time. In particular, we’ve taken to heart Thomas Hobbes’s famous description of the lives of primitive humans as “solitary, poor, nasty, brutish, and short,” just as we have the image of the development of agriculture, pottery, and all they wrought as the gateway to a better state of being. What we overlook is that nobody back in the day was trying to develop civilization; for people who have no experience with civilization, the very idea of it is, as I’ve already noted, literally unimaginable. No, people were just trying to get to the end of another day with food in their bellies.

We moderns overlook the fact that primitive farming was really, really hard, while hunting and gathering often wasn’t all that onerous when the environment was suited to it. Being a primitive farmer in most parts of the world meant working much longer hours than being a hunter-gatherer. Even today, the people of primitive agricultural societies are on average smaller and weaker than those of societies based on hunting and gathering, tending to die much younger after having lived much more unpleasant lives. The only way anyone would make the switch from hunting and gathering to agriculture was if there just wasn’t any other alternative. The first civilizations, in other words, arose not out of some visionary commitment to progress, but as a form of emergency crisis management.

With this wisdom in our back pocket, we can now revisit our narrative of progress about those first civilizations from a new perspective. Until roughly 12,000 years ago, it just didn’t make sense for people to be anything other than hunter-gatherers; the effort-to-reward ratio was all out of whack for farming. But then that began to change in some parts of the world, thanks to a global change in climate. The end of the Ice Age in about 10,000 BC caused the extinction in some parts of the world of many of the large mammals on which humans had depended for meat, while the same climate change greatly benefited some forms of plant life, among them certain varieties of wild cereals that could, once clever humans figured out the magic of seeds and planting, become the bedrock of agriculture. By no means did all tribes in a given region adopt agriculture at the same time, but once any given tribe began to farm, a positive-feedback loop ensued. An agricultural society makes much more efficient use of land — i.e., can support far more people per square mile — than does one based on hunting and gathering. Therefore the populations of nascent civilizations which adopted agriculture exploded in comparison to those of their neighbors who continued to try to make a go of it as hunter-gatherers despite the worsening conditions for doing so. In time, these neighboring Luddites were absorbed, exterminated, or driven out of the region by their more advanced neighbors — or they eventually learned from said neighbors and adopted agriculture themselves.
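That feedback loop is easy to sketch numerically. In this toy logistic-growth model the growth rate and carrying capacities are invented for illustration; the only claim taken from the text is that the same land supports far more farmers than hunter-gatherers:

```python
# Minimal sketch of the positive-feedback loop described above. The
# growth rate and carrying capacities are invented for illustration.

def grow(pop, rate, capacity, generations):
    """Discrete logistic growth toward the land's carrying capacity."""
    for _ in range(generations):
        pop += rate * pop * (1 - pop / capacity)
    return pop

AREA = 100  # square miles of shared territory

# Same starting band of 50 people, same innate growth rate; only the
# land's effective carrying capacity differs between the two lifestyles.
hunters = grow(50, rate=0.05, capacity=1 * AREA, generations=100)
farmers = grow(50, rate=0.05, capacity=50 * AREA, generations=100)

print(round(hunters), round(farmers))
```

The hunter-gatherer band plateaus at what the land can feed, while the farming population keeps compounding for generations, which is why the farmers end up absorbing, displacing, or converting their neighbors.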

All of these things happened in some places that were adversely affected by the change in climate, beginning with the Fertile Crescent, but in other, equally affected places they did not. And in some of these places, the reasons why have long been troublingly unclear. For example, the African peoples living south of the Fertile Crescent suffered greatly from the shortage of wild game caused by the ending of the Ice Age, and had available to them the wild cereal known as sorghum, one of the most important early crops of their counterparts to the north. Yet these peoples, unlike said counterparts, failed to develop agriculture. Similarly, the peoples of Western Europe had flax available to them, while the peoples of the Balkans had einkorn wheat, both also staple crops of the Fertile Crescent. Yet these peoples as well failed to adopt agriculture until having it imposed upon them millennia later by encroachers from the south and east.

What made the Fertile Crescent and a select few other regions so special? It was this confusing mystery that prompted many an archaeologist to reach for ideas like the climate hypothesis and the hydraulic hypothesis.

But what Jared Diamond teaches us is that there’s very little mystery here at all when one looks at the situation in the right light. What made the Fertile Crescent so special was the fact that all of the plants I’ve just mentioned were available, just waiting to be domesticated. A civilization can’t live by sorghum, flax, or einkorn wheat alone; it requires the right combination of crops in order to sustain itself. Agriculture in the Fertile Crescent sprang up around eight staples that have become known as the “founder crops” of civilization: emmer wheat, einkorn wheat, barley, lentils, peas, chickpeas, bitter vetch, and flax. This combination alone provided enough nutrition to sustain a population, if need be, without any form of meat. No other region of the world was so richly blessed.

So, it was only in the Fertile Crescent that the people of approximately 10,000 BC had both a strong motivation to change their way of life from one based on hunting and gathering to one based on agriculture and the right resources at their disposal for actually doing so. In time, some peoples in some other parts of the world would encounter the same combination of motivation and opportunity, and civilizations would take root there as well. Many other peoples would remain happily committed to hunting and gathering, in some cases right up until the present day. One thing, however, would remain consistent: when civilizations which had developed agriculture encountered more primitive people and strongly wished to destroy them, absorb them, or just push them out, they would have little trouble doing so.

With this new way of looking at things, we understand that the reason so many of the first civilizations started in river valleys wasn’t due to the hydraulic hypothesis, or indeed to anything intrinsic to river valleys themselves. People rather moved into river valleys as their last remaining option when the environment outside them became uncongenial to their sustenance. And once they were there, circumstance forced them to become civilized. Thus the climate hypothesis of civilization is correct about one thing, even if it applies the lesson far too narrowly: civilization doesn’t arise in places of plenty; it arises in places where life is hard enough to force people to improvise.

But of course the development of civilization isn’t simply an either/or proposition. Even those civilizations which learned to farm and thus started down a narrative of progress didn’t progress at the same rate. Consider perhaps the most infamous example in history of a more advanced civilization meeting one that was less so: Hernán Cortés’s conquest of the Aztec Empire in the early 1500s with a tiny army that was vastly outnumbered by the native warriors it vanquished. The Aztecs had spotted about a 5000-year head start on the narrative of progress to the Spaniards, who could trace the roots of their civilization all the way back to those earliest settlers of the Fertile Crescent. Yet the timeline alone doesn’t suffice to explain why the Aztecs were so very much less advanced in so many ways. Jared Diamond asserts persuasively that, not only was the Aztec Empire not as advanced as Spain and other European nations in 1500, but it would never, ever have reached parity with the Europeans of 1500, not even if it had been given many more millennia to progress in splendid isolation. The reasons for this come down not to race, to climate, or to some sort of qualitative difference in river valleys, but to a natural environment that was very different in a more holistic sense.

Europeans had been blessed with thirteen different species of large mammals that were well-suited for domestication; these provided them with meat, milk, clothing, transportation, fertilizer, building materials, and, by pulling plows and turning grindstones, the industrial power of their day. In all of these ways, they spurred the progress of European civilization. Central America, by contrast, had no large mammals at all that were suitable for such roles. And in terms of plants as well, luck had favored the Europeans over the Aztecs; the latter lacked the wide assortment of nutrition-rich grains available to the former, having to make do instead with less nourishing corn as their staple crop. All of these factors meant that the Aztecs had to work much harder than the Europeans to feed and otherwise provide for their people. And so we come again to this idea of specialization, and the efficiencies it produces, as a key determinant of the narrative of progress. Will Durant noted that in a well-developed civilization “some men are set aside from the making of material things, and produce science and philosophy, literature and art.” The Aztecs could afford to set far fewer of their citizenry aside for such purposes than could the Europeans — a crippling disadvantage.

And there were still further disadvantages. The wider variety of animal life in Europe had led to the evolution of far more microbes hoping to infect it. European humans had in turn developed resistances to the cornucopia of germs they carried with them to the New World, resistances which the native inhabitants there lacked. And then the fact that Europe’s habitable regions were smaller and more densely populated had resulted in much more intense competition for land and resources, spurring the development of the technologies of warfare. Between Cortés’s guns, germs, and steel, the Aztecs never had a chance. The inexorable logic of environmental determinism ruled the day.

Civilization can hardly be expected to capture all of the nuances of Guns, Germs, and Steel, not least because it was made six years before Jared Diamond’s book was published. Yet to a surprising degree it gets the broad strokes of geography-as-destiny right. Barbarians, for instance, are spawned in the inhospitable polar regions of the world — regions in which agriculture, and thus civilization, is impossible. (Maybe all those barbarians who are constantly encroaching on your territory are really just trying to get warm…) In our real world as well, the polar regions have historically been populated only by primitive hunter-gatherer communities. Thankfully, Civilization has no interest in race as a determining factor in the narrative of progress. The game has occasionally been criticized for stereotyping its various civilizations in its artificial-intelligence algorithms, but one could equally argue that it’s really the individual civilizations’ chosen historical leaders — Abraham Lincoln, Josef Stalin, Napoleon, etc. — that are being modeled/stereotyped.

Jared Diamond’s theory of environmental determinism remains widely respected today if not universally accepted in its entirety. Some have objected to the very spirit of determinism that underlies it, which seems to assert that all of us humans really are strictly a product of our environment, which seems to imply that human history can be studied much as we do natural science, can be reduced to theorems and natural laws. This stands in stark contrast to an older view of history as driven by “great men,” as articulated by Thomas Carlyle: “Universal history, the history of what man has accomplished in this world, is at bottom the History of the Great Men who have worked here.” This approach to history, with its emphasis on the achievements and decisions of individual actors and its love for stirring narratives, has long since fallen out of fashion among academics like Diamond in favor of a more systemic approach. It’s certainly possible to argue that we’ve lost something thereby, that we’ve cut the soul out of the endeavor; we do suffer, it seems to me, a paucity of great tellers of history today. Whatever else you can say about it, Diamond’s approach to history doesn’t exactly yield a page-turner. Where is our time’s Will Durant?

Alexis de Tocqueville, that great French chronicler of early American democracy, mocked both politicians, who believe that all history occurs through their “pulling of strings,” and grand theorists like Jared Diamond, who believe all of history can be boiled down, science-like, to “great first causes.” He wrote instead of “tendencies” in history which nevertheless depend on the free actions of individuals to come to fruition. Maybe we too can settle for a compromise, and say that the conditions at least need to be right in order for our proverbial great persons to build great civilizations. Such a notion was articulated long before current academic fashion held sway by no less august a nation-builder than Otto von Bismarck: “The statesman’s task,” he wrote, “is to hear God’s footsteps marching through history, and to try to catch on to His coattails as He marches past.” “It remains an open question,” allows even Jared Diamond, “how wide and lasting the effects of idiosyncratic individuals on history really are.”

For those of us who believe or wish to believe in the narrative of progress, meanwhile, Diamond’s ideas provide yet further ground for sobering thought, for they rather cut against the game of Civilization‘s spirit of progress as an historical inevitability. Consider once again that homo sapiens roughly equal to ourselves in intelligence and capability have been around for 200,000 years, while human civilization has existed for only 12,000 years at the outside. The auspicious beginning to the game of Civilization, which portrays the entire natural history of your planet leading up to the moment in 4000 BC when you take control of your little band of settlers, rather makes it appear that these events were destined to happen. Yet the narrative of progress was anything but an inevitability in reality; its beginning was spurred only by a fluke change in climate. On the shifting sands of this random confluence of events have all of the glories of human civilization been built. Had the fluke not occurred, you and I would likely still be running through the jungle, spears in hand. (Would we be happier or sadder? An interesting question!) Or, had that fluke or some other spur to progress happened earlier, you and I might already be living on a planet orbiting Alpha Centauri.

(Sources: the books Civilization, or Rome on 640K A Day by Johnny L. Wilson and Alan Emrich, The Story of Civilization Volume I: Our Oriental Heritage by Will Durant, Guns, Germs, and Steel: The Fates of Human Societies by Jared Diamond, The Face of the Ancient Orient: Near-Eastern Civilization in Pre-Classical Times by Sabatino Moscati, A Dictionary of the English Language by Samuel Johnson, The Spirit of the Laws by Montesquieu, Plough, Sword, and Book: The Structure of Human History by Ernest Gellner, Oriental Despotism: A Comparative Study in Total Power by Karl Wittfogel, The Structures of Everyday Life: Civilization and Capitalism by Fernand Braudel, Souvenirs by Alexis de Tocqueville, and Nationalism: A Very Short Introduction by Steven Grosby; the article “Phantasms of Rome: Video Games and Cultural Identity” by Emily Joy Bembeneck, found in the book Playing with the Past: Digital Games and the Simulation of History; Scientific American of October 2009.)

April 05, 2018

Choice of Games

Try Our New All-in-one Omnibus App for iPhone and iPad

by Dan Fabulich at April 05, 2018 05:42 PM

We’re proud to announce the launch of our new Choice of Games “omnibus” app for iPhone and iPad, available now in Apple’s App Store!

It’s a new way to play our games on iOS: a single app that collects all of our Choice of Games titles in one place.

(The omnibus app is only available on iOS for now, not Android.)

Download the omnibus for free, and you’ll receive:

  • Free access to seven complete Choice of Games titles: Choice of the Dragon, Choice of Broadsides, Choice of the Vampire, Choice of the Deathless, Choice of Kung Fu, For Rent: Haunted House, and Creatures Such As We.
  • Free demos of some of our all-time greatest hits and new releases, including Choice of Robots, Choice of Rebels, and Psy High. You can pay to unlock the full versions.

Here are the features that we’re working on for the future:

  • Search: find your favorite games within the app
  • Sort by title, author, and genre
  • Internal reviews: rate your favorite games
  • Improved graphics and menus
  • Transfer app purchases into the omnibus app

We’re sure you have a lot of questions; you can learn more about it in our omnibus FAQ.

Give it a try!

The Hero Project: Open Season — Can you win America’s #1 reality show for heroes?

by Rachel E. Towers at April 05, 2018 04:44 PM

We’re proud to announce that The Hero Project: Open Season, the latest in our popular “Choice of Games” line of multiple-choice interactive-fiction games, is now available for Steam, iOS, and Android. It’s 25% off until April 12!

Can you win The Hero Project, America’s #1 reality competition for heroes? Team up with allies old and new to unravel a conspiracy threatening your world, and save the planet from destruction!

The Hero Project: Open Season is a 170,000-word interactive novel, and the final installment of Zachary Sergi’s Hero Project series. It’s entirely text-based, without graphics or sound effects, and fueled by the vast, unstoppable power of your imagination.

In a competition full of heroic stars, will you rise high enough to influence the way society views Powered people? What will you do when your fight soars to heights you never expected…and when your journey falls back into the perspective of the original Heroes Rise Trilogy main character?

As you rise, the decisions you make will shape the world for your Powered peers—and shape your relationships and potential romances. Will you fight for Powered rights or personal gain?

• Play as male, female, trans, or non-binary; gay, straight, bisexual, non-categorizable, or ace
• Play a new hero, in a brand new season of The Hero Project
• Use your animalistic Powers to survive deadly missions
• Kick slugging butt with Prodigal as your sidekick
• Become an advocate role model, a powerful kingpin, or a dangerous freedom fighter
• See Black Magic, Jury and Jenny again
• Secure the fate of a new Powered capital, or exploit its resources
• Untangle the conspiracy behind the scenes of The Hero Project fast enough to save the entire world
• Play as the original Heroes Rise Trilogy hero in two interactive interludes!
• Enter into one of ten different romantic relationships!

In Open Season, everyone is a target. Who is yours?

We hope you enjoy playing The Hero Project: Open Season. We encourage you to tell your friends about it, and recommend the game on StumbleUpon, Facebook, Twitter, and other sites. Don’t forget: our initial download rate determines our ranking on the App Store. The more times you download in the first week, the better our games will rank.

April 03, 2018

sub-Q Magazine

April 2018 – Table of Contents

by Stewart C Baker at April 03, 2018 08:42 PM

Perhaps it’s just my upbringing in England and current residence in Oregon, but I’ve always associated Spring with rain.

Also associated with Spring in some folklore traditions are certain types of fairy. In pre-Roman Spain, for instance, the Cantabrian people believed that on the night of the Spring Equinox, small female fairies called Anjana gathered in mountain fields and danced until dawn; they were thought to bring protection and happiness (source).

This month’s story, “Nine Moments in Fairyland” by Hannah Powell-Smith, takes a more ambiguous view of the fair folk. In it, a catastrophe attracts a creature of shards and splinters. What will nine moments in Fairyland do to you? Find out when the story is published on April 17th—or subscribe to our Patreon for access now.

Next week, Bruno Dias will continue his regular column about making Interactive Fiction. This month, he’ll tackle the issue of how to handle feedback on your work. That will go live on April 10th, so keep an eye out!

And on April 24th, we’ll have a guest post from Hannah Powell-Smith to round out the month’s content.

Whatever the weather’s like where you are (and boy do I need to find a new opener for these posts), I hope you enjoy what we have on offer here at sub-Q.

The post April 2018 – Table of Contents appeared first on sub-Q Magazine.

what will you do now?

Map

by verityvirtue at April 03, 2018 05:41 PM

By Ade McT. (parser; IFDB)

[Time to completion: >1 hour]

[Content warnings for mentions of abortion, implied child death]

In Map, you play a fed-up housewife in a subtly mutating house. Space, here, is used to reveal memories: the more the reader learns about the PC, the more the house expands to accommodate that, and each new room offers a chance at atonement. Just as space moves non-linearly, time creeps strangely. If you know Pratchett’s metaphor of the Trousers of Time, or think of decision-making as creating forks in a timeline, it’s very much like that. Just as the PC can enter new rooms in the house, she can step into new branches of her own timeline.

The themes in this game reminded me of Sara Dee’s Tough Beans, or, a more recent example, Cat Manning’s Honeysuckle. All of these feature female protagonists who have been dutiful and responsible, doing what was expected of them until they were all but forgotten, before some catalytic event drives them to change.

In Map, the protagonist is much less involved, on the micro level. The rooms the player discovers let them relive key decision-making moments in the PC’s life, but once you enter a moment, you can simply wait for it to reach the only choice you have: a binary yes/no. Without this constraint, though, the game might have swollen to an unmanageable size, so the limited agency seems more a pragmatic design choice than anything else, and on a conceptual level, it does work: how many times have you wondered what would have happened if you’d made a different decision?

The scope of this game is narrow and deep, delving into the emotions underpinning life-changing moments and distilling these moments into a fork in a very personal timeline. Some bits went way over my head (the rubber plant, for instance), but overall it was an ambitious, thoughtful piece.

April 01, 2018

IFTF Blog

Accessibility testing update: Spring 2018

by Jason McIntosh at April 01, 2018 08:41 PM

Since assuming management of IFTF’s accessibility testing project a few months ago, I’ve decided to update this blog every now and again with a public progress summary. Consider this the first!

This spring, the accessibility committee plans to form a number of working groups who will each create a small, purpose-built accessibility-test game. Currently, we’re planning to build games in Inform, Twine, and ChoiceScript.

Each game will serve as a sort of obstacle course of accessibility challenges, both those common to computer interactions, and those specific to IF. Notably, the entity running this course will not be the game’s player, but the game’s own development and play software. The player — one of the volunteer testers we plan to recruit, in the near future — will simply ride along and take note of how well it performs. We’ll base these challenges on WCAG 2.0 Level A, a set of recommendations that — if properly implemented — provide a broad basis of good accessibility practices.

The high-level questions each game seeks to answer:

  • Do this platform’s development tools allow an author to make appropriate affordances for players with disabilities?

    (As a simple example: Can you specify alt text that accompanies an image which appears during the game?)

  • Do the separate programs that run games created with this platform succeed in representing any such affordances?

    (If an image within a game has alt text defined, for example, will the play-platform know when and how to display that text appropriately?)
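For web-targeting systems, the second kind of question can even be spot-checked mechanically. Here's a toy illustration (not the committee's actual tooling, and the names `AltTextChecker` and `check_alt_text` are hypothetical): a scan of a game's HTML export, such as a compiled Twine file, for images missing the alt text that WCAG 2.0's first success criterion requires.

```python
# Toy sketch: report <img> tags lacking alt text in an HTML export.
# This is an illustration only, not IFTF's planned test methodology.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # (line, column) of each offending <img> tag

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            # Flag both a missing alt attribute and an empty/whitespace one.
            if alt is None or not alt.strip():
                self.missing.append(self.getpos())

def check_alt_text(html_text):
    checker = AltTextChecker()
    checker.feed(html_text)
    return checker.missing

# Example: one image with alt text, one without.
sample = ('<p><img src="cloak.png" alt="A dark cloak on a hook"></p>'
          '<p><img src="bar.png"></p>')
print(check_alt_text(sample))  # reports the position of the second <img>
```

A real test game would go much further, of course, covering dynamic text, timed effects, and keyboard navigation; the point is only that many affordances leave machine-checkable traces in the output.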

The real questions will be much more specific, of course, and many will make their way into surveys that the working groups will create alongside the test games. Earlier this year, committee member Deborah Kaplan drafted a simple example survey for a notional test game, to show the spirit we have in mind.

This work also takes a page from Roger Firth’s Cloak of Darkness. This classic specification for a tiny parser-based IF includes a number of activities that a traditional model-world game system is expected to support: moving around among rooms, playing with light sources, hanging clothes on hooks, and so on. By boasting a Cloak of Darkness implementation, a given parser-IF development system can both prove its basic functionality and provide a small, rich source-code example.

Unsurprisingly, accessibility committee members with longer personal IF histories all thought of Cloak at around the same time, once we started thinking about asking testers to play a pre-arranged game of some kind. Since we will aim to test not model-world flexibility but rather affordances for players with disabilities, we opted to create new works rather than explicitly adapt this one. But, threads of that trusty old Cloak will wind through our output, just the same.

March 30, 2018

The Digital Antiquarian

The Game of Everything, Part 3: Civilization and the Narrative of Progress

by Jimmy Maher at March 30, 2018 05:41 PM

You can’t say civilization don’t advance. In every war, they kill you in a new way.

— Will Rogers

Civilization is a game for all time, but the original version of Civilization, as created by Sid Meier and Bruce Shelley and published by MicroProse in 1991, is also a game thoroughly of its time. When you finish guiding your civilization’s history, whether because you’ve conquered the world, been conquered by the world, flown to Alpha Centauri, or simply retired, you’re ranked on a scale of real history’s leaders. If you’ve played really, really badly — as in, closing-your-eyes-and-clicking-randomly badly — you find yourself equated to Dan Quayle, the United States’s vice president as of 1991.

Quayle has long since slunk out of public life, meaning that younger players who try out the game today will likely fail to get the joke here. In his day, however, his vacuousness was legendary enough to make even the most hardened would-be assassin think twice before visiting ill upon President George H.W. Bush. Among Quayle’s greatest hits were the time he claimed the Holocaust had happened in the United States, the time he claimed Mars had a breathable atmosphere and canals filled with water, and the time he lost a spelling bee to a twelve-year-old by misspelling “potato.” And then there was the feud he started with the sitcom character Murphy Brown when she chose to have a child out of wedlock, raising the question of whether he knew that the little people he saw inside his television didn’t actually live there. It says much about what a universal figure of derision he had become by 1991 that Meier and Shelley, in no hurry to offend any potential customer of any political persuasion, nevertheless felt free to mock him in this way as the ne plus ultra of air-headed politicians. Everybody felt free to mock Dan Quayle.

But the spirit of the times in which Civilization was made is woven much deeper into its fabric than a joke tossed onto an end-of-game leader board. In fact, it’s inseparable from the game’s single most important identifying feature. The Advances Chart of which I’ve already made so much reflects a view of history as a well-nigh inevitable narrative of progress — a view that had rather fallen out of favor for much of the twentieth century, but which had come roaring back now at the end of it, prompted by the ending of the Cold War.

It’s very difficult to adequately convey today just how that ending felt to those of us in the West who had lived through what preceded it. Just a few years before, we had shivered in our beds after watching The Day After in the United States or Threads in Britain, while Ronald Reagan droned on obliviously about the Soviet Union as an “evil empire.” A decade earlier, Secretary of State Henry Kissinger had said that the Cold War would be “unending”: “We must learn to conduct foreign policy without escape and without respite. This condition will not go away.” Which was perhaps just as well, given that no one could seem to formulate an endgame for it that didn’t leave the world a heap of irradiated ashes. It was very nearly the conventional wisdom that someday, somehow, a mistake would be made by one side or the other, or one side would simply find itself backed too far into a corner. At that point, the missiles would fly, and that would be that. The logic of history seemed to be on the side of such pessimism. After all, what other weapon in the long history of warfare had humanity ever invented and then not used?

And then, with head-snapping speed, it was all simply… over. Over, for the most part, peacefully. In a series of events so improbable no novelist would ever have dared write them down, one side just decided to give up. The Berlin Wall came down, the Iron Curtain opened, and the Cold War ended in the most blessed anticlimax in the history of the world. As Meier and Shelley were finishing Civilization, the dissolution of the Soviet Union was entering its final stages. In June of 1991, Boris Yeltsin became the first democratically elected president of the nascent “Russian Federation.” In August, the world held its breath as a cabal of communist hardliners mounted a last-ditch coup in the hope of restoring the old order, only to exhale again when the coup collapsed three days later. And on December 26, 1991, a couple of weeks after Civilization had reached store shelves, the Soviet Union was officially dissolved. Russia was once again just Russia.

For those of us in the West, the whole course of events was rather hard to fathom; it was hard to know how to react to suddenly not living under the dark threat of nuclear annihilation. But Americans at least, a people who seldom need much encouragement to wave their flags and play their anthem, soon got with the program and started celebrating the historical triumph of their “way of life.” If they needed any further encouragement, the country’s first post-Cold War military adventure, the pushing of Saddam Hussein’s Iraq out of Kuwait, was being televised live every night on CNN even as the Soviet Union was still winding down. That war proved the perfect antidote to the lingering malaise of Vietnam; it was, at least from the perspective of 30,000 feet shown on CNN, swift and clean, all of the things the “police actions” of the Cold War had so seldom been. The United States was unchallenged in the world, in the ascendant as never before.

The feeling of exaltation wasn’t confined to flag-waving populists. The most-discussed book of 1992 among the chattering classes had no less grandiose a title than The End of History and the Last Man. It was the perfect book for the times, an explication of all the reasons that the United States had triumphed in the Cold War and could now look forward to a world molded in its image. Written by a heretofore obscure member of the RAND think tank named Francis Fukuyama, and based on a journal article he had first published in 1989, the book identified “a fundamental process at work that dictates a common evolutionary pattern for all human societies — in short, something like a Universal History of mankind in the direction of liberal democracy.” (“Liberal” in this context refers to classical liberalism — i.e., prioritizing the sanctity of the individual over the goals of the collective — rather than the word’s modern political connotations.) The central question of history, Fukuyama argued, had been that of how humanity should best order itself, economically and politically, in order to ensure the best possible material, social, and spiritual state of being for everyone. With the end of the Cold War, communism and totalitarianism, the last great challengers to capitalism and democracy by way of an answer to that question, had given up the ghost. From now on, liberal democracy would reign supreme, meaning that, while events would certainly continue, history had come to the fruition toward which it had been building through all the centuries past.

More discussed than actually read even in its heyday, Fukuyama’s book has since become an all-purpose punching bag, described as hopelessly naive in right-wing realpolitik circles and morally reprehensible in left-wing postmodernist and Marxist circles. Widely denounced by the latter group in particular as a purveyor of “jingoist triumphalism” in service of American hegemony, Fukuyama felt compelled to stress in a new 2003 afterword to The End of History that he actually saw the European Union as a better model for liberal democracy’s future than the government of his own country. His book is in many ways a book of its time — an example of a history book itself becoming history — but it’s neither as jingoistic nor as naive as its current reputation would suggest. Most of Fukuyama’s critics fail to address the fact that the idea of a secular historical eschatology hardly originated with him. In fact, he takes pains in the book to situate it within a long-established historiographical tradition, albeit updated to account for such an earth-shaking event as the end of the Cold War.

History as evolutionary progress is an Enlightenment ideal that ironically predates Charles Darwin’s theory of biological evolution. Without it, the American and French Revolutions, those two great attempts to bring concrete form to Enlightenment ideas about human rights and just government, would have been unimaginable. During the decades prior to those revolutions, thinkers like Immanuel Kant, Baruch Spinoza, Thomas Hobbes, David Hume, and Adam Smith had articulated a new way of looking at the world, based on reason, science, humanism, and, yes, progress. “With our understanding of the world advanced by science and our circle of sympathy expanded through reason and cosmopolitanism,” writes the modern-day cognitive psychologist Steven Pinker of the Enlightenment, “humanity could make intellectual and moral progress. It need not resign itself to the miseries and irrationalities of the present, nor try to turn back the clock to a lost golden age.” To be a progressive, then or now, is to believe in the actuality or at least the potentiality of human history as a narrative of progress. It implies a realistic, fact-based approach to problem-solving that prefers to look forward to the future rather than back to the past.

But of course, any rousing narrative of progress worth its salt needs to have a proper bang-up climax. It was the German philosopher Georg Wilhelm Friedrich Hegel in the early 1800s who first proposed the notion of a point of fruition toward which the history of humanity was leading. Indeed, he went so far as to claim that history may have reached this goal already in his own time, with the American Revolution, the French Revolution, and Napoleon’s victory over Prussia at the Battle of Jena in 1806 having proved the superiority of his time’s version of the modern liberal state. (Francis Fukuyama had nothing on this guy when it came to premature declarations of mission accomplished.) Hegel connected his notions of historical progress with a Greek word from classical philosophy: thymos. It’s difficult to concisely translate, but it connotes the social worth of an individual, the sum total of her competencies and predilections. The ideal society, according to Hegel, is one where each person has the opportunity to become this best self — or, we might say today, to use another word that smacks dreadfully of pop psychology, to “self-actualize.”

This, then, was the natural end point toward which all of history to date had been struggling. As befits a philosopher of the idealist school, Hegel gave the narrative of progress an idealistic tinge which still clings to its alleged rationality even today, whether it takes as its climax Hegel’s universal thymos, Fukuyama’s stable democratic world order, or for that matter Civilization‘s trip to Alpha Centauri.

Narratives of progress had a natural appeal during the nineteenth century, a relatively peaceful era once Napoleon had been dispensed with, and one in which real, tangible signs of progress were appearing at an unprecedented pace, in the form of new ideas and new inventions. In this time before nuclear Armageddon or global warming had crossed anyone’s most remote imaginings, the wonders of technological progress in particular — the railroad, the telegraph, the light bulb — were regarded as an unalloyed positive force in the world. By the latter half of the century, almost everyone seemed to be a techno-progressive. “Upon the whole,” wrote the British statesman William Ewart Gladstone in 1887, “we who lived fifty, sixty, seventy years back have lived into a gentler time.”

Even the less sanguine thinkers couldn’t resist the allure of the narrative of progress. On the contrary: it was Karl Marx among all nineteenth-century thinkers who devoted the most energy to a cherished historical eschatology. With Enlightened scientific precision, he laid out six “phases of history” through which the world must inevitably pass: primitive communism, the slave society, feudalism, capitalism, socialism, and finally true communism. He disagreed with Hegel that liberalism — i.e., capitalism — could mark the end of history by noting, by no means without justification, that the thymos-fulfilling nations the latter praised actually empowered only a tiny portion of their populations, that being white men of the moneyed classes; these were the only people Hegel tended to think of when he talked about “the people.”

But even in the nineteenth century the narrative of progress had its critics. The most vocal among them was yet another German philosopher, Friedrich Nietzsche. Whereas Hegel spoke of the thymos, Nietzsche preferred megalothymia: the need of the superior man to assert his superiority. To Nietzsche, nobility was found only in conflict. All of these so-called “progressive” institutions — such as democracy and human rights — were creating a world of “men without chests,” alienated from the real essence of life; the modern nation-state was “the coldest of all cold monsters.” Hegel had imagined a world where slaves would be freed from their shackles. Maybe so, said Nietzsche — but they will still be slaves. Instead of history as a ladder, Nietzsche preferred to see it as a loop — an “eternal recurrence.” Whether progress was defined in terms of railroads, telegraphs, and light bulbs or social contracts, human rights, and democracies, it was all nonsense, merely the window dressing for the eternal human cycle of strife and striving.

While Nietzsche’s views were decidedly idiosyncratic in his own century, the narrative of progress’s critics grew enormously in number in the century which followed. In contrast to the peace and prosperity that marked the last few decades of the nineteenth century in particular, the first half of the twentieth century was dominated by a devastating one-two punch: the two bloodiest wars in the whole bloody history of human warfare, the second of them accompanied by one of the most concerted attempts at genocide in man’s whole history of inhumanity to man. It wasn’t lost on anyone that the country which had conceived this last, and then proceeded to carry it out with such remorseless Teutonic efficiency, was the very place where Hegel had lived and written about his ideals of ethical progress. Meanwhile, in the Soviet Union, Josef Stalin was turning Marx’s dreams of a communist utopia into another brutal farce. In the face of all this, the narrative of progress seemed at best a quaint notion, at worst a cruel joke. After all, it was only the supposedly civilizing fruits of progress — technology, bureaucracy, a rules-based system of order — that allowed Hitler and Stalin’s reigns of terror to be so tragically effective.

Even when World War II ended with the good guys victorious, any sense that the narrative of progress could now be considered firmly back on track was undone by the specter of the atomic bomb. Maybe, thought many, the end goal toward which the narrative was leading wasn’t an Enlightened world but rather nuclear apocalypse. In his 1960 novel A Canticle for Leibowitz, Walter M. Miller, Jr., introduced a grotesque new spin on Nietzsche’s old idea of the eternal recurrence. He proposed that human civilization might progress from the hunter-gatherer phase to the point of developing nuclear weapons, and then proceed to destroy itself — again and again and again for all eternity. In 1980, the astronomer Carl Sagan upped the ante even further in his television miniseries Cosmos. Maybe, he proposed, the reason we had failed to find any evidence of intelligent extraterrestrial life was because any species which reached roughly humanity’s current level of technological development was doomed to annihilate itself within a handful of years — the eternal recurrence on a cosmic scale.

But then came that wonderful day in 1989 when the Berlin Wall came down. What followed was the most ebullient few years of the twentieth century. The peace treaty which had concluded World War I had felt like something of a hollow sham even at the time, while the ending of World War II had been sobered by the creeping shadows of the atomic bomb and the Cold War. But now, at the end of the twentieth century’s third great global conflict of ideologies, there was seemingly no reason not to feel thoroughly positive about the world’s future. The only problems remaining in the world were small in comparison to the prospect of nuclear annihilation, and they could be dealt with by a united world community of democratic nations, as was demonstrated by the clean, quick, and painless First Gulf War. From the perspective of the early 1990s, even much of the century’s darkest history could be seen in a decidedly different light. Amidst all of the wars and genocides, the century had produced agents for peace like the United Nations, along with extraordinary scientific, medical, and technological progress that had made the lives of countless people better on countless fronts. And, to cap it all off, the fact remained that we hadn’t annihilated each other. Maybe the narrative of progress was as vital as ever. Maybe it just worked in more roundabout and mysterious ways than anticipated. Maybe it was sometimes just hard to see the forest of overall progress amidst all the trees of current events.

As we all know, progress’s moment of triumph would prove even more short-lived than most of history’s golden ages. Within a few years of the fall of the Berlin Wall, an unspeakably brutal war in Bosnia and Herzegovina was showing that age-old ethnic and religious animus could still be more powerful than idealistic talk about democracy and human rights. Well before the end of the 1990s, it was becoming clear that Russia, rather than striding forward to join the international community of liberal democracies, was sliding backward into economic chaos and political corruption, priming the pump for a return to authoritarian rule. And then came September 11, 2001, the definitive end of an era that would come to be regarded not as the beginning of humanity’s permanently peaceful and prosperous post-history but as a briefly tranquil interregnum between the Cold War and a seemingly eternal War on Terror. In a future article, we’ll try to reckon with the changes that have come to the world since those ebullient days of the early 1990s.

Right now, however, let’s turn back to Civilization. Even as I continue to emphasize that Sid Meier and Bruce Shelley weren’t out to make a political statement with their game of everything, I must also note that its embrace of the narrative of progress as its core mechanic, combined with its spirit of rational practicality and a certain idealism, winds up making it a very progressive work indeed. Loath though he has always been to talk about politics, Meier has admitted to a measure of pride at having included global warming in the game; if you don’t control your modern civilization’s pollution levels, your coastal cities will literally get swallowed up by the encroaching ocean.
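The mechanic Meier describes can be sketched in a few lines of code. To be clear, the numbers and structure below are a toy model of my own devising, not the game’s actual rules: accumulated pollution feeds a warming counter, and once the counter crosses a threshold a coastal tile is swallowed by the ocean.

```python
import random

# Toy sketch of a Civilization-style global-warming mechanic.
# The threshold and per-square contribution are invented for
# illustration; they are not the game's actual values.
WARMING_THRESHOLD = 10

def end_of_turn(pollution_squares, warming, coastal_tiles, rng):
    """Each polluted square adds to warming; past the threshold,
    a random coastal tile floods and the counter resets."""
    warming += pollution_squares
    if warming >= WARMING_THRESHOLD and coastal_tiles:
        lost = rng.choice(coastal_tiles)
        coastal_tiles = [t for t in coastal_tiles if t != lost]
        warming = 0
    return warming, coastal_tiles

rng = random.Random(42)
warming, coast = 0, ["grassland", "plains", "forest"]
for _ in range(5):  # five turns of an unchecked, polluting civilization
    warming, coast = end_of_turn(3, warming, coast, rng)
```

The point of the sketch is simply that the mechanic is a feedback loop with a ratchet: the player can reduce `pollution_squares` through cleaner infrastructure, but tiles already lost to the ocean stay lost.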

I want to stress that “progressive” as I use it here is not a synonym for (non-classically) “liberal”; it’s perfectly possible to be a progressive who believes in smaller government, possible even to be a progressive libertarian. Still, even at the time of the game’s release, when the American right tended to be more trusting of science and objective truth than they’ve become today, its pragmatism about our fragile planet didn’t always sit well with MicroProse’s traditional customer base of largely conservative military-simulation and wargame fans. Computer Gaming World‘s Alan Emrich, who was more than largely conservative, penned the hilarious passage below in his otherwise extremely positive review of the game. It can probably stand in for many a crusty old grognard’s reaction to some of the game’s most interesting aspects (“politically correct,” it seems, was 1991’s version of “social-justice warrior”):

Civilization strives to be a “hip” game, and deals with popular social issues from the standpoint of “political correctness.” Thus, global warming is a tremendous threat. Odd, for such a recent, unproven theory. Evolution is expressed in the game’s introduction, but at least that debate has been around a while. Pollution, therefore, becomes a society’s primary focus after industrialization takes place, with players being channeled toward more politically-correct power plants, recycling centers, and mass transit to address the problem. Even the beta-test “super-highway” Wonder of the World gave way to “women’s suffrage.” While women’s suffrage is a novel concept for its effect during gameplay, it is also another brick in the wall of political correctness.

One hardly knows where to start with this. Should we begin with the bizarre idea that any designer of a massive computer strategy game, about the most unhip thing in the world, would ever have striven to be “hip?” Or with the idea that evolution might still be up for “debate?” Or with the idea that recycling centers and mass transit, and letting women vote, for God’s sake, are dubious notions born of political correctness, that apparent source of all the world’s evils? Or with the last mangled metaphor, which seems to be saying the opposite of what it wants to say? (Was Emrich listening to too much Pink Floyd at the time?)  Instead of snarking further, I’m just going to move on.

A more useful subject to examine right now might be just what kind of progress it is that Civilization‘s Advances Chart represents. The belief to which the game seems to subscribe, that progress in technology and hard science will inevitably drive the broader culture forward, is sometimes referred to as technological determinism. It can be contrasted with the more metaphysical narrative of progress favored by the likes of Hegel, as it can with the social-collective narrative of progress favored by Marx. Unsurprisingly, it tends to find its most enthusiastic fans among scientists, engineers, and science-fiction writers.

Given the sort of work it is, it makes a lot of practical sense for Civilization to cast its lot with the technologists’ camp; it is, after all, much easier to build into a strategy game the results of the development of the musket than it is to chart the impact of a William Shakespeare. Even outside the rules of a strategy game, for that matter, it’s far more difficult to map great art onto a narrative of progress than it is other great human achievements. While scientists, engineers, and even philosophers build upon one another’s work in fairly obvious ways, great artists often stand alone; Shakespeare continues to be widely acknowledged today as the greatest writer of English ever to have lived, even as progress has long since swept all other aspects of his century aside.

Still, importantly, Shakespeare is in the game, as are Michelangelo and Bach, and as are markers of social progress like women’s suffrage and labor unions. (By way of confirming all of Alan Emrich’s deepest suspicions, the game makes communism a prerequisite for the last.) It is, in other words, not remarkable that Civilization on the whole favors a “hard” form of progress; what is remarkable is the degree to which it manages to depart from such a bias from time to time. If the game sometimes strains to find a concrete advantage to confer upon the softer forms of progress — women’s suffrage makes your population less prone to unhappiness, which makes at least a modicum of sense; labor unions give you access to “mechanized infantry” units, which makes pretty much no sense whatsoever — its heart is nevertheless in the right place.

The first people ever to pen a study of Civilization‘s assumptions about history were none other than our old friend Alan Emrich and his fellow Computer Gaming World scribe Johnny L. Wilson, who did so together in the context of a strategy guide called Civilization: or Rome on 640K a Day. It has to be one of the most interesting books ever written about a computer game; it’s actually fairly useless as a practical strategy guide, but is full of fascinating observations about the deeper implications of Civilization as a simulation of history. Wilson writes from a forthrightly liberal point of view, while Emrich is, as we’ve already seen, deeply conservative, and the frisson between the two gives the book additional zest. Here’s what it has to say about Civilization‘s implementation of the narrative of progress:

To be civilized in terms of Sid Meier’s Civilization means to be making material progress in terms of economic well-being and scientific advancement. The game has an underlying belief in such progress. In fact, this dogma is so strong that there is actually no problem in Sid Meier’s Civilization that cannot be solved by human effort (using settler units) or more technology. There are, as a correspondent named Gary Boone wrote to us shortly after the game’s release, no Luddites (reactionary anti-technological activists during the Industrial Revolution) in this game’s universe. It is, to paraphrase Voltaire’s Dr. Pangloss, the best of all progressing worlds.

Many an earnest progressive in the real world has doubtless wished for such an alternate universe. Galileo wished he could write about heliocentrism without being hauled before an ecclesiastical court; Einstein wished he could pursue his Theory of Relativity without contending with a pitchfork-wielding mob of Isaac Newton disciples; modern researchers wish they could explore gene therapy without people forever trying to take their stem cells away. All of these wishes come true in Civilization, that best of all progressing worlds.

Of course, even those of us who proudly call ourselves progressives need to recognize that the narrative of progress has its caveats. Many of the narrative’s adherents, not least among them Civilization, have tended to see it as an historical inevitability. Notably, Civilization has no mechanisms by which advances, once acquired, can be lost again. Yet clearly this has happened in real human history, most famously during the thousand-year interregnum between the fall of the Roman Empire and the Renaissance, during the early centuries of which humanity in the West was actively regressing by countless measures; much knowledge, along with much art and literature, was lost forever during the so-called Dark Ages. (Far more would have been lost had not the Muslim world saved much of Europe’s heritage from the neglect and depredations of the European peoples — much as is done, come to think of it, by a small society of monks after each successive Apocalypse in Miller’s A Canticle for Leibowitz.)

We should thus remember as we construct our narratives of progress for our own world that the data in favor of progress as an inevitability is thin on the ground indeed. We have only one history we can look back upon, making it very questionable to gather too many hard-and-fast rules therefrom. All but the most committed Luddite would agree that progress has occurred over the last several centuries, and at an ever-increasing rate at that, but we have no form of cosmic assurance that it will continue. “The simple faith in progress is not a conviction belonging to strength, but one belonging to acquiescence and hence to weakness,” wrote Norbert Wiener in 1950, going on to note that the sheer pace of progress in recent times had given those times a character unique in human history:

There is no use in looking anywhere in earlier history for parallels to the successful inventions of the steam engine, the steamboat, the locomotive, the modern smelting of metals, the telegraph, the transoceanic cable, the introduction of electric power, dynamite and the high-explosive missile, the airplane, the electric valve, and the atomic bomb. The inventions in metallurgy which heralded the origin of the Bronze Age are neither so concentrated in time nor so manifold as to offer a good counterexample. It is very well for the classical economist to assure us suavely that these changes are purely changes in degree, and that changes in degree do not vitiate historic parallels. The difference between a medicinal dose of strychnine and a fatal one is also only one of degree.

Civilization rather cleverly disguises this “difference of degree” by making each turn represent less and less time as you move through history. Nevertheless, progress at anything but the most glacial pace remains a fairly recent development that may be more of an historical anomaly than an inevitability.

Whatever else it is, the narrative of progress is also a deeply American view of history. The United States is young enough to have been born after progress in the abstract had become an idea in philosophy. Indeed, its origin story is inextricably bound up in Enlightenment idealism. That fact, combined with the fact that the United States has been fortunate enough to suffer very few major tragedies in its existence, has caused a version of the narrative of progress to become the default way of teaching American history at the pre-university level. One could thus say that every American citizen, this one included, is indoctrinated in the narrative of progress before reaching adulthood. This indoctrination can make it difficult to even notice the existence of other views of history.

Civilization, for its part, is a deeply American game, and much about the narrative of progress must have seemed self-evident to its designers, to the point that they never even thought about it. The game has garnered plenty of criticism in academia for its Americanisms. Matthew Kapell, for instance, in indelible academic fashion labels Civilization a “simulacrum” of the “American monomythic structure.” Such essays often have more of an ideological axe to grind than does the game itself, and strike me as rather unfair to a couple of designers who at the end of the day were just making a good-faith attempt to portray history as it looked to them. Still, we Americans would do well to keep in mind that our country’s view of history isn’t a universal one.

But if we shouldn’t trust in progress as inevitable, how should we think about it? To begin with, we might acknowledge that the narrative of progress has always been as much an ethical position, a description of the way things ought to be, as it has been a description of the way they necessarily are. This has been the real American dream of progress, one always bigger than the country’s profoundly imperfect reality, one in which much of the world outside its borders has been able to find inspiration. Call me a product of my upbringing if you will, but it’s a dream to which I still wholeheartedly subscribe. To be a progressive is to recognize that the world is a better place than it used to be — that, by almost any measurement we care to take, human life is better than it’s ever been on this little planet of ours — thanks to those Enlightenment virtues of reason, science, humanism, and progress. And it is to assert that we have the capacity to make things yet much, much better for all of our planet’s people.

Progress will continue to be the binding theme of this series of articles, as it is the central theme of Civilization. We’ll continue to turn it around, to peer at it, to poke and prod it from various perspectives. Because ultimately, responsibility for our future doesn’t lie with some dead hand of historical or technological determinism. It lies with us, the strategizers sitting behind the screen, pulling the levers of our real world’s civilizations.

(Sources: the books Civilization, or Rome on 640K A Day by Johnny L. Wilson and Alan Emrich, The Past is a Foreign Country by David Lowenthal, The Human Use of Human Beings by Norbert Wiener, The True and Only Heaven: Progress and Its Critics by Christopher Lasch, History of the Idea of Progress by Robert Nisbet, The Idea of Progress by J.B. Bury, Enlightenment Now: The Case for Reason, Science, Humanism, and Progress by Steven Pinker, The End of History and the Last Man by Francis Fukuyama, A Canticle for Leibowitz by Walter M. Miller, Jr., Ulysses by James Joyce, Lectures on the Philosophy of History by Georg Wilhelm Friedrich Hegel, Thus Spoke Zarathustra by Friedrich Nietzsche, and The Communist Manifesto by Karl Marx and Friedrich Engels; Computer Gaming World of September 1991 and December 1991; Popular Culture Review of Summer 2002.)