The Mac celebrates 30 years

A recent article on Gizmodo shows off some previously unseen (or perhaps just forgotten) footage of a young Steve Jobs unveiling the Macintosh computer, back on January 30, 1984.

Thirty years ago this week.

Seems like forever ago. But I remember it, and reasonably well. I remember where I was living then, what I was working on, and who I was with (I’m still with her…).

The video clip also includes the famous Orwellian “1984” TV ad Apple used to launch the Mac. That’s worth watching in its own right. It was a really cheeky ad, and it generated a lot of chatter about marketing at the time. The clip includes other Mac ads you should watch.

I had a Mac around then, bought, as I recall, in late 84 or early 85. I had had a Lisa – the Mac’s unsuccessful predecessor – on loan for a few months in 83 or early 84. I wasn’t impressed with the Lisa, but the Mac really captivated me.

I also had an IBM PC, from 82 or 83, and never quite understood the anti-IBM sentiments Jobs and Apple promoted among users. But then PC users fought back just as adamantly over the superiority of their platform.

As a computer geek from way back, I just loved having any – every – computer. When I started computing, I lived in a two-bedroom apartment; the second bedroom (the larger of the two, of necessity) became a workroom filled with computers, books, manuals, printers, modems, tools, chips, soldering irons, and cables. As a technical and documentation writer, I always had extra hardware and software on loan from manufacturers and distributors. I once described my room as looking like a “NASA launch site.”

When we eventually bought our own house, I had a room for my books and computers, too, although they tended to escape and overrun the rest of the house. Same thing has happened here, although the amount of hardware is much reduced from the glory days (more ukuleles today than computers).

But ever since my first computer, I have not been a day without at least one computer in the house, usually several.

By the time the Mac was released, I had been computing for more than six years. I bought my first computer in the fall of 1977, a TRS-80, and soon had several machines (an Apple in 79, an Atari 400 in 1980 and then an 800 in 81). I belonged to user groups, subscribed to at least a dozen computer magazines, and wrote for several, including one of the earliest columns on computer gaming (in Moves magazine). I attended many computer fests and exhibitions in Canada and the USA – in fact, I helped organize a microcomputer fair in Toronto, at the International Centre, in the mid-80s.

As you read this, in 2014, I’ve been at it for almost 37 years.

So I take some umbrage when I read this condescending snippet on Gizmodo:

30 years ago the landscape of personal computing was vastly different. That is to say, it hardly existed.

Hogwash. It was alive and well – thriving in its entrepreneurial glory. Only poorly-informed journalists who have not done their research would make such a claim. Or perhaps they are too young to know of the rich history of personal computing prior to their own acquisition of a device.

By 1984, we had seen the TRS-80, Commodore Pet, Apple II, Kaypro, IBM PC, Atari 400, 800 and 1200, Sinclair, TI-99, the Acorn, Coleco Adam and many others. Apple’s own IIc would be released later in 1984.

We would soon see both the Commodore Amiga and Atari ST 16-bit computers launched. I had them all, and a few others passed through my hands in that time, too.

In the 80s, CompuServe dominated the field of online services with millions of customers as it spread. I was a sysop on CompuServe for many of those years. I even operated my own BBS for a while.

CompuServe was challenged – aggressively, but not very successfully – by several competitors in that decade including The Source and Delphi (I was later a sysop on Delphi, too, before moving to Collingwood).



Narrative and free agency in game design

As a former World of Warcraft player, I can attest to how compelling it is to play an immersive, massive, 3D role-playing game. Acting out scenarios in a fantasy world is more involving than merely reading a fantasy novel. You get addicted to being part of the narrative, to swinging the sword instead of just reading about it.

Just as when you’re reading a good novel and can’t stop turning the pages, you keep playing to see how the next chapter/adventure/scenario plays out, especially when you don’t always have to follow the script.

It’s not so much about the gameplay, as much as it is being part of the story. Well-designed games compel you to continue playing through a combination of action, puzzle solving, rewards and group activities.

WOW is an MMO – massively multiplayer online game – set in a fantasy world that draws much of its substance from Tolkien and other fantasy writers. Many role-playing games (RPGs) follow this pseudo-Tolkien model, and most follow paths laid out in fantasy literature (e.g. the characters and novels of Robert E. Howard, Edgar Rice Burroughs, H.P. Lovecraft or more modern writers).

WOW is, of course, not the only game that offers that sort of setting, but at eight years old, with about 12 million subscribers, it’s both the largest and longest-lasting of them. It thus becomes the yardstick for measuring any other game in the genre. None of its competitors – Rift, Guild Wars, Lord of the Rings, Star Wars, etc. – have a fraction of the players.

RPGs owe their ancestry to a small boxed set of rules published in 1974, called Dungeons and Dragons. Written by Gary Gygax and Dave Arneson (whose name subsequently disappeared from the list of authors in later printings), it essentially created the standards for fantasy role playing that are still in use today.

This is documented in great detail in Jon Peterson’s 700-page tome, Playing at the World (his blog is here). It was reading this book that got me thinking about game design again (and to dig through what few old wargames and rules books I have in the basement…).

In his introduction, Peterson identifies “freedom of agency” as one of the key components, “as much a necessary condition for inclusion in the genre of role-playing games as is role assumption.” The ability to make choices of action, of goal, and of behaviour is central to a compelling game. In the Wired interview, linked above, Peterson defends gaming,

“…not as fads or disposable products of pop culture, but instead as a legitimate part of intellectual history, heirs to a tradition that stretches back centuries and involves many great thinkers and innovators.”

Which is similar to what I’ve been writing about for a few decades.* Gaming, at least in the simulation-style games, is not merely a pointless pastime, but rather an intellectual exercise.

Computer games have both redefined entertainment and set the bar for hardware and software development. Games are incredibly demanding of computer resources compared to, say, a spreadsheet. Consider the processing required to keep track of dozens, even hundreds of players who are interacting in 3D space in realtime, plus all of the geography, terrain, in-game trades and purchases, combat, weather and environmental effects. And to keep everyone in the game fully informed of all the events, locations and activities of their characters, pets, party members, resources, movement paths, mail… it’s a stunning amount of work.

Beyond the coding, there are some basic components any game needs to be successful:

  • Clearly defined purpose and goals;
  • Challenge;
  • Identifiable opponents to overcome;
  • Reward for accomplishing goals or overcoming challenges;
  • An understandable and accessible board geography where the game is played;
  • Clear and concise rules.

RPGs add other elements to create that immersive experience, including:

  • Connecting story/narrative;
  • Character choice, advancement and development;
  • Consequences of actions or behaviour;
  • Alternate races (orcs, elves, dwarves, etc.);
  • Role assumption (taking on the persona of a character in the story);
  • Free agency (the ability to move and act independent of the script);
  • Believable fantasy, alternate or futuristic world environment;
  • Clear sides with which the races align and which have competing goals.

Computer games have additional components:

  • Good graphics and visual appeal;
  • Good AI (artificial intelligence) and NPCs (non-player characters);
  • Believable environmental interactions, simulated physics and effects;
  • Appropriate sound (and sometimes music);
  • Interactivity with NPCs, environment;
  • Social activity (in MMOs).

Some RPGs (e.g. Fable, The Witcher, Fallout 3) have more complex “consequences” built into decision-making within their games. Certain choices – such as attacking or stealing from computer-controlled non-player characters (NPCs), or how you answer their questions – affect the way others relate to your character. How well these mimic actual social or personal behaviour is debatable. Mostly they seem to me merely designed to add chrome to role assumption. In some cases, they don’t really affect the game or quests.
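One common way to build such consequences is a faction-standing table that every significant action nudges up or down. This is a minimal sketch with invented names, actions and weights – not the implementation of any actual game:

```python
# Hypothetical faction-reputation sketch; Reputation, record_action and the
# action weights are all invented for illustration.

class Reputation:
    """Tracks how each NPC faction regards the player."""

    def __init__(self):
        self.standing = {}  # faction name -> numeric standing

    def record_action(self, faction, action):
        # Assumed weights; a real game would tune these per quest and faction.
        weights = {"helped": +10, "traded": +1, "stole": -15, "attacked": -40}
        self.standing[faction] = self.standing.get(faction, 0) + weights[action]

    def disposition(self, faction):
        """Map numeric standing onto the reaction NPCs show the player."""
        s = self.standing.get(faction, 0)
        if s <= -25:
            return "hostile"    # e.g. guards attack on sight
        if s < 25:
            return "neutral"
        return "friendly"       # e.g. unlocks dialogue or vendor discounts

rep = Reputation()
rep.record_action("villagers", "stole")
rep.record_action("villagers", "attacked")
print(rep.disposition("villagers"))  # -> hostile
```

A system like this is cheap to implement, which is perhaps why it so often amounts to chrome: unless quests actually branch on the disposition value, the table changes dialogue flavour and little else.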

Since these are solo games, rather than MMOs, you can usually save your game before you make a choice, then replay it with a different choice if you don’t like an outcome. That tends to dissipate the suspension of disbelief necessary for immersion.

I don’t include “fun” in any of these lists because fun, like beauty or taste, is subjective. Players will gravitate to the games that provide the highest entertainment value for their own interests and aptitudes. I, for example, never found WOW’s battlegrounds “fun” but always enjoyed questing and exploring (solo and in parties). Others eschew the quests for the PvP combat in the battlegrounds.

Can the storyline absorb the players sufficiently, and for long enough, to suspend disbelief – deeply enough to make you care about both the characters and the action? It depends on how well the narrative is scripted. A good storyline has to be crafted as carefully as a good novel, and needs to generate a similar emotional response.

Clearly, however, game narrative is very different from a storyline in a book, since choice is a key element in gaming.

Quests can also be seen as ‘micro-narratives.’ In many games, the plot or story is merely a shell that contains numerous micro-stories presented as quests. Sometimes these are dynamic, so that the nature or goals of quest B depends on how or how well you accomplished quest A. However, the shell story needs to be coherent so players don’t simply feel they’re moving from one mini-game to the next for no reason.
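That kind of dynamic chaining can be sketched as a simple lookup from a quest’s outcome to the follow-up quest it unlocks. All of the quest names and outcomes below are hypothetical, just to show the shape of the idea:

```python
# Hypothetical branching quest chain: the outcome of quest A determines
# which version of quest B the player is offered next.

QUESTS = {
    "A": {"goal": "Rescue the captured scout",
          # outcome of A -> follow-up quest it unlocks
          "next": {"rescued": "B_escort", "failed": "B_avenge"}},
    "B_escort": {"goal": "Escort the scout home", "next": {}},
    "B_avenge": {"goal": "Avenge the fallen scout", "next": {}},
}

def next_quest(quest_id, outcome):
    """Return the follow-up quest unlocked by this outcome, or None."""
    return QUESTS[quest_id]["next"].get(outcome)

print(next_quest("A", "failed"))  # -> B_avenge
```

Even a table this small makes the design tension visible: every outcome branch multiplies the content that has to be written, which is why so many games fall back on a single linear chain inside a shell story.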

A lot of games fall down with thin stories, pointless quests (collect X eyeballs or Y spleens), and predictable A-to-B-to-C plots. And too many depend on action to move them along rather than plot or participatory narrative (e.g. Diablo III).

Patrick Holleman, on the Game Design forum, writes,

“…the difference between traditional games and videogames is that videogames have a world in which everything about the game, except for controller input, takes place. This world is created, controlled, and sometimes populated by continuous and discrete artificial intelligence. The player is a guest in that world, the central participant in its mechanics. Even still, the world is usually not driven by the player; it is the designer’s world, and should be studied as such.”

Holleman also asks, “…whether or not videogames are similar enough to traditional narratives that we should study them the same way.” In response, he adds,

“To begin, it makes sense to admit that some portions of videogame narratives are exactly like books; the player reads them without interacting except to turn the ‘page’. Some narrative segments in videogames are exactly like movies; the player watches them without doing anything except pausing and unpausing. No decent videogame is entirely like movies or books. A movie creates a fictional world that one can see and hear, but viewers are locked into a guided tour that the filmmakers have scheduled for the viewer, and viewers can never deviate from that tour. In a videogame, on the other hand, the player is presented with a world that can be accessed largely at their own discretion. Videogames that are too linear—too much like the guided tours of movies—are often deprecated by critics and gamers.”

Interactivity is essential, but is not synonymous with narrative. Ernest Adams, writing in “Three Problems For Interactive Storytellers,” said,

“Interactivity is almost the opposite of narrative; narrative flows under the direction of the author, while interactivity depends on the player for motive power.”

In that same article, Henry Jenkins writes,

“You say narrative to the average gamer and what they are apt to imagine is something on the order of a choose-your-own adventure book, a form noted for its lifelessness and mechanical exposition rather than enthralling entertainment, thematic sophistication, or character complexity… Most often, when we discuss games as stories, we are referring to games that either enable players to perform or witness narrative events – for example, to grab a lightsabre and dispatch Darth Maul in the case of a Star Wars game. Narrative enters such games on two levels – in terms of broadly defined goals or conflicts and on the level of localized incidents. ”

Immersiveness also depends heavily on how much free will the player has – the ability to write yourself into the script. In games like Diablo III, the action is very linear, with little flexibility to explore or act outside the prescribed plot and territory. These games have little immersive value (and, at least with D3, little replay value, either). Others, like Mass Effect and Dragon Age, combine limited freedom with scripted activities and plots.

Morrowind, Skyrim and the post-apocalyptic Fallout 3 provide a generally freely roamable world, and in some cases the ability to attempt quests well beyond your character’s level (some MMOs offer this, as well).**

While few solo RPG games offer such significant free agency, it is the hallmark of most MMOs. Holleman writes,

“World of Warcraft is another game heavily dependent on the depth and persuasiveness of its world; it has the benefit of being an ever-expanding world as well, with content updates and expansion packs. The first time through the game tends to be the best, from a narrative perspective. The structure of the quests (tasks with completion rewards) that guide gameplay are heavy on exploration, but often a bit short on variety, i.e. collect 10 quest items, for the millionth time. What makes these quests and dungeons compelling—at least the first time through the game—is that they are driven by a strong, interesting setting.”

Because RPGs have a character-building ladder system, many players don’t explore the MMO environments more fully simply because their characters are too low-level to survive in higher-level zones. Some games offer safe passage (e.g. roads where hostile NPCs don’t patrol) or swift transportation (riding or flying mounts in WOW) to encourage more exploration.

Most MMOs have graduated zones for each race. These offer playable regions challenging for characters within that given range, such as levels 1-5, 6-10, 11-15, etc. You play your character in a zone until it levels up to be able to enter (and survive in) the next zone. Each zone has level-related quests to fulfill to aid your advancement.***
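That zone-band structure amounts to little more than a range check on the character’s level. A minimal sketch, with made-up zone names and level bands:

```python
# Hypothetical graduated-zone table: (zone name, min level, max level).
ZONES = [
    ("Starter Vale", 1, 5),
    ("Darkwood", 6, 10),
    ("Frost Peaks", 11, 15),
]

def playable_zones(level):
    """Zones whose level band includes the character's current level."""
    return [name for name, lo, hi in ZONES if lo <= level <= hi]

print(playable_zones(7))  # -> ['Darkwood']
```

The same table can gate quest availability: each zone’s quests are simply tagged with the zone’s band, so levelling up naturally walks the player from one band to the next.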

Completing all available quests is also part of the achievement ladder. Players are encouraged to complete all quests in all zones, regardless of their level. The problem with this system is that, in many MMOs, when your high-level character enters a low-level zone (for example, for another race), the quests are ridiculously easy but you still want to complete them. On the other hand, quests are designed to get players to explore the entire zone while questing, which increases the sense of immersion.

Where most games have a defined end (in RPGs, usually the defeat of a final boss character), MMOs are often open-ended: they can be played after the characters have reached their highest level, accomplished all available quests and defeated all the boss characters. Usually such activities are social: group raids, battlegrounds and dungeons outside the formal narrative and questing lines (essentially making them into fantasy variants on the FPS-PvP line of gaming). It’s also possible to create new characters and start again from level one, often choosing a different race, type/class (warrior, priest, hunter, etc.) or even alliance.

As the goal of game design, immersion is difficult to achieve: it depends on the interaction of several factors, as well as the independent activities of players outside the scripted narrative. It’s an interesting challenge that, so far, no single game has managed to meet fully, but it’s always interesting to examine the results.  ****


* I started wargaming in the mid-1970s, bought a computer in 1977, and by around 1980 was writing a regular column on computer games for Moves magazine, as well as writing articles for contemporary programming magazines. I wrote about computers and game design for several magazines in the 1980s including Antic and ST Log, and published a column on technology in Canadian newspapers for a decade from the mid 1990s, which often looked at game developments.

** First-person shooters (FPS) like Call of Duty and Medal of Honor usually combine scripted scenarios with open or semi-open gameplay in a small environment. Very few have a fully open environment (Far Cry, however, was one).

*** Level grinding is when you rush through all the available quests solely to get your characters up to a reasonable level of strength to be able to use powers or traits unlocked at higher levels, and then to engage in multiplayer activities like dungeons and raids. It’s common in WOW to see level 60-80 characters doing level 1-10 quests to complete their achievement ladders. For the lower level players, this can be frustrating as you watch a higher-level character blaze through an area, taking quest items or killing quest characters with ease, forcing you to wait for them to respawn. Guild Wars 2 has a different approach. When the player’s level is higher than the zone, that level is reduced in that zone to make repeat and collective quests competitive. A level 35 character, playing in a 1-5 zone, will play at level 3-5. Weapon and armour strengths are decreased accordingly. This is somewhat offset by the character’s accumulated buffs, unlocked skills and so on, so it is easier but still a challenge. This heightens the immersive value of GW2.
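The downscaling described above can be sketched as clamping the character’s effective level to the top of the zone’s band and scaling stats proportionally. Guild Wars 2’s actual formula is more involved (and accounts for the buffs and skills mentioned above); this is only an assumed illustration of the principle:

```python
# Assumed GW2-style downscaling sketch; the clamp and the proportional
# stat scaling are illustrative, not the game's real formula.

def effective_level(char_level, zone_max):
    """Downscale only: low-level characters keep their real level."""
    return min(char_level, zone_max)

def scaled_stat(stat, char_level, zone_max):
    """Scale a power/armour stat in proportion to the effective level."""
    eff = effective_level(char_level, zone_max)
    return stat * eff // char_level

# A level 35 character with 700 attack entering a 1-5 zone:
print(effective_level(35, 5))   # -> 5
print(scaled_stat(700, 35, 5))  # -> 100
```

The design payoff is exactly what the footnote describes: a high-level visitor can no longer one-shot everything in a starter zone, so repeat and collective quests stay competitive.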

**** One of the things in WOW that, for me, detracts from immersion is the cartoonish style of characters and buildings. Games like Rift and GW2 have tried to make the player feel less distanced through more realistic graphics and animation. However, none of them are up to the detail or lifelike characters we see in Call of Duty or Medal of Honor. Some licence must be allowed, of course, for fantasy races and characters.