Forty years of geekitude

It was forty years ago this fall, in 1977, that I bought my first computer. I had little experience with computers before that – a few weeks working after hours on an APL system at the U of T, mostly to play games against the machine, and a few magazine articles on the coming ‘personal’ computer wave. Nothing seriously hands-on, and no programming skills either. But as soon as I saw one, I had to have it. And so I bought one.

Since then, I have not been a day without one, and generally had more than just one in my home. As many as six or seven at one time, back in the early 1980s, all different brands. But that was when I was writing about them, editing computer books and writing computer manuals.

My first computer was a TRS-80 Model 1. TRS stood for Tandy Radio Shack. It was a 16KB computer (yes: that’s 16,384 bytes of memory). In comparison, my current laptop has 8GB, or 8,388,608 kilobytes: 524,288 times the Model 1’s amount of RAM!
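For anyone who wants to double-check that sort of arithmetic, it’s a two-liner (using the binary, 1024-based units RAM is conventionally measured in):

```python
# RAM sizes in bytes, using binary (1024-based) units.
model_1_ram = 16 * 1024       # 16KB in the TRS-80 Model 1
laptop_ram = 8 * 1024 ** 3    # 8GB in a modern laptop

# How many times the old machine's memory fits into the new one.
print(laptop_ram // model_1_ram)  # → 524288
```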

It was powered by a Zilog Z-80 eight-bit processor. My current machines all run 64-bit, multi-core processors. It had no USB ports, didn’t use a mouse, and had no audio card. Smartphones today are more versatile and more powerful. But not as much fun.

Before I bought it, I debated for a week or two whether to get the TRS-80 or the competing Commodore PET, powered by the 6502 processor. The PET had similar limitations in memory and input devices, but came with a green-and-black screen integrated with the keyboard in one unit. The TRS-80, however, was sold at a Radio Shack store within walking distance, and the store also offered nighttime classes to teach the basics. The PET was only sold at stores downtown, so I bought the closer one.

I had to boot it and load programs from a cassette tape player. A year or so later, I upgraded to a 64KB RAM system and dual 5.25″ floppy drives. Each floppy could hold about 160KB of programs or data. It had a standalone B&W monitor with no real graphics capability, although canny programmers used the block characters in its character set to create pseudo-graphics (a bit like today’s Dwarf Fortress game displays, but only in B&W).
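That block-character trick is easy to recreate today: treat each character cell as one fat pixel and draw with filled blocks. A toy sketch in Python (the sprite bitmap is invented for illustration):

```python
# Emulate TRS-80-style pseudo-graphics: each "pixel" of a small
# bitmap becomes a filled or empty character cell on a text screen.
SPRITE = [
    "01100110",
    "11111111",
    "01111110",
    "00011000",
]

def render(bitmap):
    """Draw a bitmap of '0'/'1' strings using block characters."""
    return "\n".join(
        "".join("█" if bit == "1" else " " for bit in row)
        for row in bitmap
    )

print(render(SPRITE))
```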
Continue reading “Forty years of geekitude”

Microsoft killed solitaire for me

Solitaire – also known as Klondike and Patience – is a very popular game on computers. So popular, in fact, that a version of this 200-year-old card game has been included by Microsoft in every version of Windows since 3.0 (1990), aside from a brief hiatus with Win 8 (a gap filled by third-party versions). Microsoft has even launched a version for iOS, playable on the iPhone and iPad.

And according to some reports, it is the most widely used program by Windows users, by a long shot – more than Word, Outlook, PowerPoint, and Explorer. Writer Luke Plunkett called that statistic “frightening.”

But for millions of us, solitaire fills the time; it occupies our brains during long travel times, in waiting rooms, in between loading, downloading, burning to disk or compiling. Not just the one game: there is a whole raft of solo card games under the name solitaire – freecell, spider, Klondike, pyramid and tri-peaks among them – that people play regularly, and sometimes obsessively. Many is the time I have stopped writing this blog or some other piece, trapped by writer’s block or simple exhaustion, to while away a few minutes recharging with a simple game of solitaire.

As Plunkett wrote:

You mention Solitaire and—after the amazing end-game card haze—the first thing that pops into your head is that it was once seen as the single biggest threat to office productivity facing this planet’s workers. And in many regards, that’s correct.
Most people who have worked in an office can testify to the lure of the game, and could name one or two colleagues who spent a little too much time cutting the decks when they should have been filing reports. Some even take it too far; in 2006, New York Mayor Michael Bloomberg famously fired a city employee he caught playing the game while at work.
This addiction can even spread beyond the workplace and into people’s homes. My father has spent more time playing Freecell over the past decade than he has doing housework, for example. Things can get even worse for some: in 1996, Dr. Maressa Hecht Orzack opened the world’s first clinic for computer addicts as a response to her own chronic Solitaire addiction.

In May 2008, Slate magazine ran a story titled “Solitaire-y Confinement: Why we can’t stop playing a computerized card game.” In it, author Josh Levin wrote:

The game’s continued pre-eminence is a remarkable feat—it’s something akin to living in a universe in which Pong were the most-popular title for PlayStation 3. One reason solitaire endures is its predictability. The gameplay and aesthetic have remained remarkably stable; a visitor from the year 1990 could play the latest Windows version without a glitch, at least if he could figure out how to use the Start menu. It also remains one of the very few computer programs, game or nongame, that old people can predictably navigate. Brad Fregger, the developer of Solitaire Royale, the first commercial solitaire game for the Macintosh and the PC, told me that his 89-year-old mother still calls regularly to brag about her high scores. The game has also maintained a strong foothold in the modern-day cubicle.

So with such widespread popularity – a game beloved by millions, maybe even billions – you have to wonder why Microsoft seems bent on destroying the experience in Windows 10. Levin calls solitaire the “…cockroach of gaming, remarkably flexible and adaptable.” Perhaps Microsoft wants to stamp it out.
Continue reading “Microsoft killed solitaire for me”


Transcendence

It’s not surprising that AI replaced the biological form in the popular Frankenstein monster trope. In fact, the smart-evil-machine scenario has been done so often this past decade or so that I’m more surprised any film writer or director can manage to give it some semblance of uniqueness that distinguishes it from all the others.

Transcendence tries, tries very hard and almost makes it. But the brass ring remains out of reach. Still, it’s worth watching if you’re a scifi buff because, well, it’s scifi.* And even bad scifi is better than no scifi at all. Well, maybe not the Transformer franchise, but pretty much the rest of it.

More than that, while it doesn’t tread much new ground, it does make use of some nifty sets and special effects.

The evil robot has been with us in film for a very long time. Fritz Lang’s 1927 film Metropolis was the first to portray a sentient robot (the ‘Maschinenmensch’), created to “resurrect” its creator’s former lover. In Transcendence, the character of Dr. Will Caster (Johnny Depp) is similarly “resurrected,” but in virtual space: inside a computer. And of course he/it evolves within those confines into something more than human.

Angry non-techies storm the castle with pitchforks and burn the whole place down. Well, okay, it’s an underground data centre in the desert and they use artillery, but it’s basically the same thing. It’s a monster movie with CGI lipstick. And better yet, it’s in the $8 bin (with both Blu-ray and DVD editions in the case…) at Wal-Mart. But be prepared to question the premise. And a lot more.
Continue reading “Transcendence”

Server upgrade coming

Sometime in the next two weeks, I will be amalgamating the servers for the several sites I manage, consolidating them onto one new and (I hope) faster, more efficient server. There may be some downtime while the files and databases migrate, like virtual birds, to their new home.

I hope that the digital gods of server migration allow my moves to go smoothly. I would sacrifice a virtual dove to propitiate them, if I could only find their virtual altar… would that I were the digital Odysseus…

For most users, it will, I expect, be but a momentary blip in the service, a temporary lapse of rant soon reconstructed. No more than a couple of hours of downtime while the ether is busy with transient bytes flitting hither and yon. My biggest concern is the Blue Agave forum which operates on an Invision system… the transition to the current servers wasn’t all that smooth when I moved a few years back. But we’ll see how it evolves… I might need the aid of Invision’s tech team, too…. but that should not concern you.

If things don’t go smoothly, and it takes longer than expected, it may be the result of my clumsy handling of the tools (while still technically inclined, my edge has, I admit, lost some of its crispness as I age). Or it may be some deeper, larger problem that requires tech support to save me from myself and the quicksand of SQL content.

I can migrate the static files easily enough, but depend somewhat on online tools to make the transition for the blog and WordPress databases. And then there’s all that PHP stuff…
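Part of what makes those WordPress moves fiddly is that options and widget settings are stored as PHP-serialized strings, so a plain search-and-replace of the site URL breaks their length prefixes. A toy Python illustration of the problem and the fix (the URLs are made up; real migration tools handle quoting and multibyte text far more carefully):

```python
import re

def fix_serialized(dump: str) -> str:
    """Recompute the s:N:"..." length prefixes in a dump after a
    URL swap (naive: assumes ASCII values with no embedded '";')."""
    return re.sub(
        r's:\d+:"(.*?)";',
        lambda m: f's:{len(m.group(1))}:"{m.group(1)}";',
        dump,
    )

# A naive replace leaves the old length (22) on a 23-character URL...
moved = 'a:1:{s:3:"url";s:22:"http://old.example.com";}'.replace(
    "http://old.example.com", "https://new.example.com"
)
# ...which fix_serialized repairs:
print(fix_serialized(moved))  # → a:1:{s:3:"url";s:23:"https://new.example.com";}
```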

Anyway, things may appear and disappear, and odd error pages emerge, but take heart that I have not vanished from the network, merely taken the high road to the deep north, as Basho did, but of course virtually, and expecting to return momentarily. Should my site appear gone, take heart that it has not shuffled off this mortal coil, but merely retired momentarily to a far, far better place…. and will reappear when the digital stars align.

Refresh, refresh, refresh and return and it will all be made clear. I hope. If not…. well, I can always start afresh.

The WOW Factor

After two years away from the game, I was recently convinced by a friend to return to World of Warcraft and play once more in the fantasy universe of WOW. At 10 years old, WOW remains the biggest, most-subscribed, most popular MMORPG, with around 10 million subscribers.

By technology’s rapid-aging standards, WOW is a grandfather game; maybe even a great-grandfather. It has certainly spawned a lot of offspring, although not all are legitimate.

I started playing WOW back in 2005, although I didn’t play it seriously and attentively until a little later, after the first expansion. Then I got heavily into the game, so that for a long stretch barely a day went by without at least doing the daily quests for one or more characters.

I dutifully paid my monthly subscription fee for years. I upgraded to the first expansion set, The Burning Crusade. Then the Wrath of the Lich King. Then the third, Cataclysm, when it came out in late 2010. By the time the fourth expansion set, Mists of Pandaria, was released in the fall of 2012, I was already losing interest, and the corny fighting pandas the expansion threw in just didn’t make me want to shell out another $50 plus the $15 a month.

I slogged on for a few more months, but in December 2012, I finally gave up. I wasn’t enjoying the way the game had evolved. I wasn’t having fun any more.

I had long stopped being obsessed with finishing pointless quests, running back and forth collecting useless items for some NPC. And running was what I did most of the time. You can’t get a mount to travel faster until level 20. Flying mounts at level 60. A lot of the grind is spent running. My fingers were getting stiff.

My game time had dwindled from hours a day to hours a week. Then a month. Finally, I simply didn’t care any more.

I was tired of the repetitive canned responses from NPCs. The voice acting was old and stale. The cartoonish scenery and characters no longer amused me. I had had a small boost to my enjoyment when they added flying mounts (Cataclysm?), but that soon became tired, too. Questing and collecting and making things became a grind, not fun.

I was never big on some of the game’s aspects, even from the start – battlegrounds and raids weren’t attractive to me. Nor was PvP. I preferred questing, often solo or with a single friend, and the occasional dungeon crawl with a mixed party. But after I reached the pinnacle – level 70 at first, then cranked to 80 – with most of my characters, it simply paled. Wash, rinse, repeat.

The expansions added territory to explore, new quests, new opponents, but generally they seemed to be a kind of kitchen-sink approach: stuff was added, changed, removed with seeming arbitrariness. The new races, the new enemies didn’t seem to match the logic of the original game series. Sometimes it felt like the whole WOW universe was designed by 14-year-olds with lots of passion but lacking a solid background in fantasy.
Continue reading “The WOW Factor”

Banished: Sandbox Gaming at Its Best

Banished is a medieval-style city-building game, along the lines of SimCity, but with several significant differences. While not as slick or comprehensive as SimCity, it still provides compelling, addictive gameplay.*

It’s slow and cerebral, true, not your basic action-filled RPG or FPS, but it’s one of those games that demand ‘just another fifteen minutes’ that easily stretch into the wee hours. And with infinitely variable maps and a wide range of community-made mods that enhance and change the dynamics, it promises a lot of repeat play for fans of the genre.

The first difference between the two city-building sims is in goals: Banished doesn’t have any, aside from simply surviving. That’s tough enough. No goals for growth, population, buildings or the like. It’s a sandbox game in which you do whatever you want, but there are clearly strategies that work better than others. Careful attention has to be paid to the details: resources, housing, jobs, education, food, weather, trade and so on.

The second is size. In SimCity, it’s pretty easy to get big cities with large populations fairly quickly. In Banished, after 20 in-game years in four different games, each town I built still had a population of around 100. Growth is slow. I’ve built cities in SimCity that cover almost the entire map. In Banished, terrain and modest growth have kept my towns small. I’ve seen screenshots from other players showing larger towns, so I know they can be built, but it takes more time and patience than I have yet put into it.

The third is the level and type of detail. SimCity focuses on modern infrastructure and technology. Banished doesn’t concern itself with water, hydro and sewage, or the other trappings of modern civilization. Technologically, it’s somewhere between 1500 and 1700, so the detail is limited. The number of building types is minimal compared with SimCity, too.
Continue reading “Banished: Sandbox Gaming at Its Best”

Testing a Homeplug-Powerline Network

I’ve had some wireless issues for quite some time now. There are dead spots in the house – a central wall has metal ducts and a gas fireplace, beside the laundry room with its metal-enclosed washer and dryer. About 5-6m of metal interfere with the wireless signal. The modem is attached to the cable, which comes through the north side of the house, and there are no other active cable outlets in other rooms (there are outlets, but they’ve never been properly connected).

Plus the ducts and pipes in the basement and the metal front door interfere with the signal out of doors, making it difficult to get good reception in a large portion of the yard – including our favourite summer sitting location: the newly rebuilt front deck.


And just to confound matters, my Acer Aspire laptop has the annoying habit of losing its internet connection – while all my other wireless devices are fine – although it can see other networks nearby and even connect to my modem. Just not the net.

I’ve looked at all sorts of solutions, from wireless extenders and bridge routers, to rewiring a large portion of the house to accommodate moving the cabling to allow the modem to be placed closer to the laptop. I’ve moved the modem a few times, but the reception has only improved indoors – out of doors it remains unstable.

There is often a 10-metre ethernet cable running across the floor between laptop and modem when I want to be sure my access isn’t interrupted. Susan hates it. It’s a trip hazard and looks hokey.

This week I decided to try a different approach (a suggestion from Neville on Facebook): a powerline (aka homeplug) network extender. It’s a whole area I knew nothing about before this week, except for the vague understanding that the network connects via the AC power lines in your home.

Basically, you plug one adapter into a wall socket and attach it to your modem via a (shorter) ethernet cable. Then you plug a second adapter in somewhere else in your house, preferably close to your computer, and do whatever the device needs to establish a connection (in my case, push a button). When they connect, you run another (shorter) cable between the adapter and your computer.

But which one? Which type, which standard, which brand, which feature set? That’s what I spent most of my past few days studying. Reading reviews, technical papers, speed tests, manufacturers’ claims. Prices range from $40 to almost $150 for the minimal two adapters. Why the difference and would it really matter to me?
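One sanity check I’d trust more than any “AV 500” marketing number is a timed transfer between two machines on the link. A rough, dependency-free sketch in Python (shown here looping back on one machine; on a real network, you’d run the receiving side on the far computer):

```python
import socket
import threading
import time

def throughput_mbps(nbytes: int, seconds: float) -> float:
    """Convert a transfer measurement to megabits per second."""
    return nbytes * 8 / seconds / 1_000_000

def measure(payload: bytes = b"x" * 1_000_000) -> float:
    """Time a TCP transfer of `payload` over loopback."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
    srv.listen(1)
    addr = srv.getsockname()

    def sink():
        conn, _ = srv.accept()
        while conn.recv(65536):  # drain until the sender closes
            pass
        conn.close()

    t = threading.Thread(target=sink)
    t.start()
    cli = socket.create_connection(addr)
    start = time.perf_counter()
    cli.sendall(payload)
    cli.close()
    t.join()
    srv.close()
    return throughput_mbps(len(payload), time.perf_counter() - start)

print(f"{measure():.1f} Mbit/s")
```

Loopback numbers will be absurdly high, of course; the point is to compare the same measurement with the receiver plugged in at the far end of the powerline link.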

In the end I went low-end rather than cutting-edge. I bought a D-Link PowerLine AV 500 Network Starter Kit (DHP-309AV) from the local Staples store. It took all of two minutes to set up, another minute to connect the cables, and my laptop was connected to the internet.

And if it proves itself in the upcoming months, I may look at the new models due out this fall to upgrade to the new AV2 standard, and get some extra ethernet ports strung around the house.

Pondering Responsive web design

I’ve been building websites since the early 1990s, and have had my own websites continually since 1995. For a few years, I did website design and analysis for commercial clients – mostly small local businesses. I even taught web design at a local adult learning centre for a couple of years. Way back when the Net was relatively new, I even did some pages for local events. Although I do less coding today, mostly for my own use, I still have an interest in the developments in web technology and layout.

I taught myself the basics of HTML back when it was version 1.0, 20 years ago – not all that difficult if you were schooled in the old word processors like Perfect Writer and WordStar. The first word processors used similar markup styles. Some even required users to compile the text into transient files in order to see the formatting results, because formatting couldn’t be shown onscreen at the same time as the markup. These programs had to be small, tight and efficient to fit in the limited physical memory of the day – 16 to 64KB – which left little room for rich features. Ah, the good old days of the Z80 and 6502 processors.

HTML was fairly easy (for me), but clumsy. It was a flat, 2D system and building some elements – tables in particular – was awkward and time-consuming. HTML tried – with limited success – to mix design with structure in one all-encompassing language. It was predicated on the printed page – basically replicating it onscreen. The initial versions of HTML were a desktop-publishing-like environment for the screen.

But the old ways are not always suited for the new devices. Page designs and layouts done even five years ago may be outdated and ill-suited for mobile devices (as I have found from my own work). New design paradigms are needed to stay current with the ebb and flow of technologies.

Continue reading “Pondering Responsive web design”

World of Tanks

Battlefield view
Tanks are a long-distance weapon, you know. They are best used in concert with one another to provide cover and overwatch fire, and are best placed in a covered or hull-down position where their profile is reduced to the minimum. Tanks should never travel alone; they should always advance with supporting vehicles on their flanks.

That’s pretty much what I said to my teammates that Saturday morning. However, I may have typed it a little more tersely. Something like, “%#$&@ idiots. Y R U in the open w/o support?”

I watched as the majority of them rushed across the field, only to be picked off in the open by well-placed enemy tanks and turned into smoldering wrecks that dotted the battlefield. Don’t these people know anything about basic tank doctrine? I wondered. Well, probably not. This is the internet, after all.

Still, I wanted to shout it out: tanks are not close-range weapons. Or rather, they weren’t intended to be. This isn’t paintball. You can’t exactly sneak around in 25 or 30 tons of metal. But you can be clever and use the terrain to your advantage: peek carefully around corners and over rises, and stay hidden in bushes while you wait.

But there they were – half the team racing towards the enemy flag like heavy-metal Rambos, ignoring terrain, elevation, cover, overwatch or even one another. And paying the price. Boom! Another teammate in flames. You might have heard me swearing as you walked by the house that morning.

That left me, with three others out of an initial 15, to guard the base; trying to cover all possible paths of approach, stay hidden and stay alive. And pick off the enemy, now bold enough to move forward. An enemy which still had nine intact vehicles, including a very active artillery piece and two tank destroyers, each with two kills already. A team that seemed to understand how to play much better than our side.

We lost that one.

Good thing it’s just a game, and the losers merely have to wait until the match ends, then come back to life and play again. But when there’s no penalty for dying except waiting, you won’t learn anything.

Continue reading “World of Tanks”

The Mac celebrates 30 years

A recent article on Gizmodo shows off some previously unseen (or perhaps just forgotten) footage of a young Steve Jobs unveiling the Macintosh computer, back on January 30, 1984.

Thirty years ago, this week.

Seems like forever ago. But I remember it, and reasonably well. I remember where I was living then, what I was working on, and who I was with (I’m still with her…)

The video clip also includes the famous Orwellian “1984” TV ad Apple used to launch the Mac. That’s worth watching for itself. It was a really cheeky ad, and generated a lot of chatter about marketing at the time. The clip includes other Mac ads you should watch.

I had a Mac around then, bought, as I recall, in late 84 or early 85. I had had a Lisa – the Mac’s unsuccessful predecessor – on loan for a few months in 83 or early 84. I wasn’t impressed with the Lisa, but the Mac really captivated me.

I also had an IBM PC, from 82 or 83, and never quite understood the anti-IBM sentiments Jobs and Apple promoted among users. But then PC users fought back just as adamantly over the superiority of their platform.

As a computer geek from way back, I just loved having any – every – computer. When I started computing, I lived in a two-bedroom apartment; the second bedroom (the larger of the two, of necessity) became a workroom filled with computers, books, manuals, printers, modems, tools, chips, soldering irons, and cables. As a technical and documentation writer, I always had extra hardware and software on loan from manufacturers and distributors. I once described my room as looking like a “NASA launch site.”

When we eventually bought our own house, I had a room for my books and computers, too, although they tended to escape and overrun the rest of the house. Same thing has happened here, although the amount of hardware is much reduced from the glory days (more ukuleles today than computers).

But ever since my first computer, I have not been a day without at least one computer in the house, usually several.

By the time the Mac was released, I had been computing for more than six years. I bought my first computer in the fall of 1977, a TRS-80, and soon had several machines (an Apple in 79, an Atari 400 in 1980 and then an 800 in 81). I belonged to user groups, subscribed to at least a dozen computer magazines, and wrote for several, including one of the earliest columns on computer gaming (in Moves magazine). I attended many computer fests and exhibitions in Canada and the USA – in fact, I helped organize a microcomputer fair in Toronto, at the International Centre, in the mid-80s.

As you read this, in 2014, I’ve been at it for almost 37 years.

So I take some umbrage when I read this condescending snippet on Gizmodo:

30 years ago the landscape of personal computing was vastly different. That is to say, it hardly existed.

Hogwash. It was alive and well – thriving in its entrepreneurial glory. Only poorly-informed journalists who have not done their research would make such a claim. Or perhaps they are too young to know of the rich history of personal computing prior to their own acquisition of a device.

By 1984, we had seen the TRS-80, Commodore PET, Apple II, Kaypro, IBM PC, Atari 400, 800 and 1200, Sinclair, TI-99, the Acorn, Coleco Adam and many others. Apple’s own IIc would be released later in 1984.

We would soon see both the Commodore Amiga and the Atari ST 16-bit computers launched. I had them all, and a few others passed through my hands in that time, too.

In the 80s, CompuServe dominated the field of online services, with millions of customers. I was a sysop on CompuServe for many of those years. I even operated my own BBS for a while.

CompuServe was challenged – aggressively, but not very successfully – by several competitors in that decade including The Source and Delphi (I was later a sysop on Delphi, too, before moving to Collingwood).

Continue reading “The Mac celebrates 30 years”

Narrative and free agency in game design

As a former World of Warcraft player, I can attest to how compelling it is to play an immersive, massive, 3D role-playing game. Acting out scenarios in a fantasy world is more involving than merely reading a fantasy novel. You get addicted to being part of the narrative, to swinging the sword instead of just reading about it.

Just as when you’re reading a good novel and can’t stop turning the pages, you keep playing to see how the next chapter/adventure/scenario plays out, especially when you don’t always have to follow the script.

It’s not so much about the gameplay, as much as it is being part of the story. Well-designed games compel you to continue playing through a combination of action, puzzle solving, rewards and group activities.

WOW is an MMO – a massively multiplayer online game – set in a fantasy world that draws much of its substance from Tolkien and other fantasy writers. Many role-playing games (RPGs) follow the pseudo-Tolkien model, but most follow paths laid out in fantasy literature (e.g. characters and novels by Robert Howard, Edgar Rice Burroughs, H.P. Lovecraft or more modern writers).

WOW is, of course, not the only game that offers that sort of setting, but at eight years old, with about 12 million subscribers, it’s both the largest and longest-lasting of them. It has thus become the yardstick for measuring any other game in the genre. None of its competitors – Rift, Guild Wars, Lord of the Rings, Star Wars, etc. – have a fraction of its players.

RPGs owe their ancestry to a small boxed set of rules published in 1974, called Dungeons and Dragons. Written by Gary Gygax and Dave Arneson (whose name subsequently disappeared from the list of authors in later printings), it essentially created the standards for fantasy role-playing that are still in use today.

This is documented in great detail in Jon Peterson’s 700-page tome, Playing at the World (his blog is here). It was reading this book that got me thinking about game design again (and to dig through what few old wargames and rules books I have in the basement…).

In his introduction, Peterson identifies “freedom of agency” as one of the key components, “as much a necessary condition for inclusion in the genre of role-playing games as is role assumption.” The ability to make choices of action, goal, and behaviour is central to a compelling game. In the Wired interview, linked above, Peterson defends gaming,

“…not as fads or disposable products of pop culture, but instead as a legitimate part of intellectual history, heirs to a tradition that stretches back centuries and involves many great thinkers and innovators.”

Which is similar to what I’ve been writing about for a few decades.* Gaming, at least in the simulation-style games, is not merely a pointless pastime, but rather an intellectual exercise.

Computer games have both redefined entertainment and set the bar for hardware and software development. Games are incredibly demanding of computer resources compared to, say, a spreadsheet. Consider the processing required to keep track of dozens, even hundreds of players who are interacting in 3D space in realtime, plus all of the geography, terrain, in-game trades and purchases, combat, weather and environmental effects. And to keep everyone in the game fully informed of all the events, locations and activities of their characters, pets, party members, resources, movement paths, mail… it’s a stunning amount of work.

Beyond the coding, there are some basic components any game needs to be successful:

  • Clearly defined purpose and goals;
  • Challenge;
  • Identifiable opponents to overcome;
  • Reward for accomplishing goals or overcoming challenges;
  • An understandable and accessible board geography where the game is played;
  • Clear and concise rules.

RPGs add other elements to create that immersive experience, including:

  • Connecting story/narrative;
  • Character choice, advancement and development;
  • Consequences of actions or behaviour;
  • Alternate races (orcs, elves, dwarves, etc.);
  • Role assumption (taking on the persona of a character in the story);
  • Free agency (the ability to move and act independent of the script);
  • Believable fantasy, alternate or futuristic world environment;
  • Clear sides with which the races align and which have competing goals.

Computer games have additional components:

  • Good graphics and visual appeal;
  • Good AI (artificial intelligence) and NPCs (non-player characters);
  • Believable environmental interactions, simulated physics and effects;
  • Appropriate sound (and sometimes music);
  • Interactivity with NPCs, environment;
  • Social activity (in MMOs).

Some RPGs (e.g. Fable, The Witcher, Fallout 3) have more complex “consequences” built into decision-making within their games. Certain choices – such as attacking or stealing from computer-controlled non-player characters (NPCs), or how you answer their questions – affect the way others relate to your character. How well these mimic actual social or personal behaviour is debatable. Mostly they seem to me merely designed to add chrome to role assumption. In some cases, they don’t really affect the game or quests.

Since these are solo games, rather than MMOs, you can usually save your game before you make a choice, then replay it with a different choice if you don’t like the outcome. That tends to dissipate the suspension of disbelief necessary for immersion.

I don’t include “fun” in any of these lists because fun, like beauty or taste, is subjective. Players will gravitate to the games that provide the highest entertainment value for their own interests and aptitudes. I, for example, never found WOW’s battlegrounds “fun” but always enjoyed questing and exploring (solo and in parties). Others eschew the quests for the PvP combat in the battlegrounds.

Can the storyline absorb players sufficiently, for long enough and deeply enough to suspend disbelief and make you care about both the characters and the action? That depends on how well the narrative is scripted. A good storyline has to be crafted as carefully as a good novel and needs to generate a similar emotional response.

Clearly, however, game narrative is very different from a storyline in a book, since choice is a key element in gaming.

Quests can also be seen as ‘micro-narratives.’ In many games, the plot or story is merely a shell that contains numerous micro-stories presented as quests. Sometimes these are dynamic, so that the nature or goals of quest B depend on how, or how well, you accomplished quest A. However, the shell story needs to be coherent so players don’t simply feel they’re moving from one mini-game to the next for no reason.

A lot of games fall down with thin stories, pointless quests (collect X eyeballs or Y spleens), and predictable A-to-B-to-C plots. And too many depend more on action to move them along than on plot or participatory narrative (e.g. Diablo III).

Patrick Holleman, on the Game Design forum, writes,

“…the difference between traditional games and videogames is that videogames have a world in which everything about the game, except for controller input, takes place. This world is created, controlled, and sometimes populated by continuous and discrete artificial intelligence. The player is a guest in that world, the central participant in its mechanics. Even still, the world is usually not driven by the player; it is the designer’s world, and should be studied as such.”

Holleman also asks, “…whether or not videogames are similar enough to traditional narratives that we should study them the same way.” In response, he adds,

“To begin, it makes sense to admit that some portions of videogame narratives are exactly like books; the player reads them without interacting except to turn the ‘page’. Some narrative segments in videogames are exactly like movies; the player watches them without doing anything except pausing and unpausing. No decent videogame is entirely like movies or books. A movie creates a fictional world that one can see and hear, but viewers are locked into a guided tour that the filmmakers have scheduled for the viewer, and viewers can never deviate from that tour. In a videogame, on the other hand, the player is presented with a world that can be accessed largely at their own discretion. Videogames that are too linear—too much like the guided tours of movies—are often deprecated by critics and gamers.”

Interactivity is essential, but is not synonymous with narrative. Ernest Adams, writing in “Three Problems For Interactive Storytellers,” said,

“Interactivity is almost the opposite of narrative; narrative flows under the direction of the author, while interactivity depends on the player for motive power.”

In that same article, Henry Jenkins writes,

“You say narrative to the average gamer and what they are apt to imagine is something on the order of a choose-your-own adventure book, a form noted for its lifelessness and mechanical exposition rather than enthralling entertainment, thematic sophistication, or character complexity… Most often, when we discuss games as stories, we are referring to games that either enable players to perform or witness narrative events – for example, to grab a lightsabre and dispatch Darth Maul in the case of a Star Wars game. Narrative enters such games on two levels – in terms of broadly defined goals or conflicts and on the level of localized incidents. “

Immersiveness also depends heavily on how much free will the player has, and on the ability to write ourselves into the script. In games like Diablo III, the action is very linear, with little flexibility to explore or act outside the prescribed plot and territory. These games have little immersive value (and, at least with D3, little replay value, either). Others, like Mass Effect and Dragon Age, combine limited freedom with scripted activities and plots.

Morrowind, Skyrim and the post-apocalyptic Fallout 3 provide a generally freely roam-able world, and in some cases, the ability to attempt quests well beyond your character’s level (some MMOs offer this, as well).**

While few solo RPGs offer such significant free agency, it is the hallmark of most MMOs. Holleman writes,

“World of Warcraft is another game heavily dependent on the depth and persuasiveness of its world; it has the benefit of being an ever-expanding world as well, with content updates and expansion packs. The first time through the game tends to be the best, from a narrative perspective. The structure of the quests (tasks with completion rewards) that guide gameplay are heavy on exploration, but often a bit short on variety, i.e. collect 10 quest items, for the millionth time. What makes these quests and dungeons compelling—at least the first time through the game—is that they are driven by a strong, interesting setting.”

Because RPGs have a character-building ladder system, the reason many players don’t explore MMO environments more fully is usually that their characters are leveled too low to survive in higher-level zones. Safe passage is sometimes offered (e.g. roads that hostile NPCs don’t patrol), or swift transportation (riding or flying mounts in WOW), to encourage more exploration.

Most MMOs have graduated zones for each race. Each offers a playable region challenging for characters within a given level range, such as 1-5, 6-10, 11-15, etc. You play your character in a zone until it levels up enough to enter (and survive in) the next one. Each zone has level-related quests to fulfill that aid your advancement.***

Completing all available quests is also part of the achievement ladder. Players are encouraged to complete all quests in all zones, regardless of their level. The problem with this system is that, in many MMOs, when your high-level character enters a low-level zone (for example, for another race), the quests are ridiculously easy, but you still want to complete them. On the other hand, quests are designed to get players to explore the entire zone while questing, which increases the sense of immersion.

Where most games have a defined end (in RPGs, usually the defeat of a final boss character), MMOs are often open-ended: they can be played after the characters have reached their highest level, accomplished all available quests and defeated all the boss characters. Usually such activities are social: group raids, battlegrounds and dungeons outside the formal narrative and questing lines (essentially making them into fantasy variants on the FPS-PvP line of gaming). It’s also possible to create new characters and start again from level one, often choosing a different race, type/class (warrior, priest, hunter, etc.) or even alliance.

As the goal of game design, immersion is difficult to achieve: it depends on the interaction of several factors, as well as the independent activities of players outside the scripted narrative. It’s an interesting challenge that, so far, no single game has managed to meet fully, but it’s always interesting to examine the results.****


* I started wargaming in the mid-1970s, bought a computer in 1977, and by around 1980 was writing a regular column on computer games for Moves magazine, as well as writing articles for contemporary programming magazines. I wrote about computers and game design for several magazines in the 1980s, including Antic and ST Log, and published a column on technology in Canadian newspapers for a decade from the mid-1990s, which often looked at game developments.

** First-person shooters (FPS) like Call of Duty and Medal of Honor usually combine scripted scenarios with open or semi-open gameplay in a small environment. Very few have a fully open environment (Far Cry, however, was one).

*** Level grinding is when you rush through all the available quests solely to get your characters up to a reasonable level of strength, to be able to use powers or traits unlocked at higher levels, and then to engage in multiplayer activities like dungeons and raids. It’s common in WOW to see level 60-80 characters doing level 1-10 quests to complete their achievement ladders. For lower-level players, this can be frustrating: you watch a higher-level character blaze through an area, taking quest items or killing quest characters with ease, forcing you to wait for them to respawn. Guild Wars 2 has a different approach. When the player’s level is higher than the zone’s range, it is reduced within that zone to keep repeated and group quests competitive. A level 35 character, playing in a 1-5 zone, will play at level 3-5. Weapon and armour strengths are decreased accordingly. This is somewhat offset by the character’s accumulated buffs, unlocked skills and so on, so it is easier but still a challenge. This heightens the immersive value of GW2.
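The GW2-style downscaling described above can be sketched in a few lines. This is only an illustration of the idea – the function names and the proportional-scaling formula are my own assumptions, not the game’s actual implementation:

```python
# Illustrative sketch of zone-based level downscaling (GW2-style).
# The names and the linear scaling formula are assumptions for
# demonstration, not the game's real code.

def effective_level(player_level: int, zone_max: int) -> int:
    """Clamp a character's level to the zone's intended maximum."""
    return min(player_level, zone_max)

def scaled_stat(base_stat: int, player_level: int, eff_level: int) -> int:
    """Reduce a stat (attack, armour) in proportion to the downscaled level."""
    if player_level <= eff_level:
        return base_stat  # no downscaling needed in an at-level zone
    return max(1, base_stat * eff_level // player_level)

# A level-35 character entering a 1-5 zone plays at level 5,
# with stats cut proportionally.
print(effective_level(35, 5))    # → 5
print(scaled_stat(700, 35, 5))   # → 100
```

Because buffs and unlocked skills are left untouched in this sketch, the downscaled character remains somewhat stronger than a native low-level one, which matches the “easier but still a challenge” balance the footnote describes.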

**** One of the things in WOW that, for me, detracts from immersion is the cartoonish style of characters and buildings. Games like Rift and GW2 have tried to make the player feel less distanced through more realistic graphics and animation. However, none of them match the detail or lifelike characters we see in Call of Duty or Medal of Honor. Some licence must be allowed, of course, for fantasy races and characters.