Banished is a medieval-style city building game, along the lines of SimCity, but with several significant differences. While not as slick or comprehensive as SimCity, it still provides compelling, addictive gameplay.
It’s slow and cerebral, true, not your basic action-filled RPG or FPS, but it’s one of those games that demand ‘just another fifteen minutes’ that easily stretch into the wee hours. And with infinitely variable maps and a wide range of community-made mods that enhance and change the dynamics, it promises a lot of repeat play for fans of the genre.
The first difference between the two city-building sims is in goals: Banished doesn’t have any, aside from simply surviving. That’s tough enough. There are no goals for growth, population, buildings or the like. It’s a sandbox game in which you do whatever you want, but there are clearly strategies that work better than others. Careful attention has to be paid to the details: resources, housing, jobs, education, food, weather, trade and so on.
Second is the size. In SimCity, it’s pretty easy to get big cities with large populations fairly quickly. In Banished, after 20 in-game years in four different games, each town I built was still around 100 population. Growth is slow. I’ve built cities in SimCity that cover almost the entire map. In Banished, terrain and modest growth have kept my towns small. I’ve seen screenshots from other players showing larger towns, so I know they can be built, but it takes more time and patience than I have yet put into it.
Third is the detail level and type. SimCity focuses on modern infrastructure and technology. Banished doesn’t concern itself with water, hydro and sewage or the trappings of modern civilization. Technologically, it’s somewhere between 1500 and 1700, so the detail is limited. The number of building types is minimal compared with SimCity, too.
I’ve had some wireless issues for quite some time now. There are dead spots in the house – a central wall has metal ducts and a gas fireplace, which are beside the laundry room with its metal-enclosed washer and dryer. About 5-6m of metal interfere with the wireless signal. The modem is attached to the cable, which comes through the north side of the house, and there are no other active cable outlets in other rooms (there are outlets for cable, but they’ve never been properly connected).
Plus the ducts and pipes in the basement and the metal front door interfere with the signal out of doors, making it difficult to get good reception in a large portion of the yard – including our favourite summer sitting location: the newly rebuilt front deck.
And just to confound matters, my Acer Aspire laptop has the annoying habit of losing its internet connection – while all my other wireless devices are fine – although it can see other networks nearby and even connect to my modem. Just not the net.
I’ve looked at all sorts of solutions, from wireless extenders and bridge routers, to rewiring a large portion of the house to accommodate moving the cabling to allow the modem to be placed closer to the laptop. I’ve moved the modem a few times, but the reception has only improved indoors – out of doors it remains unstable.
There is often a 10-metre ethernet cable running across the floor between laptop and modem when I want to be sure my access isn’t interrupted. Susan hates it. It’s a trip hazard and looks hokey.
This week I decided to try a different approach (a suggestion from Neville on Facebook): a powerline (aka homeplug) network extender. It’s a whole area I knew nothing about before this week, except for the vague understanding that the network connects via the AC power lines in your home.
Basically, you plug one adapter into a wall socket and attach it to your modem with a (shorter) ethernet cable. Then you plug a second adapter into a socket somewhere else in your house, preferably close to your computer, and do whatever the device needs to establish a connection (in my case, push a button). Once they connect, you run another (shorter) cable between the second adapter and your computer.
But which one? Which type, which standard, which brand, which feature set? That’s what I spent most of my past few days studying. Reading reviews, technical papers, speed tests, manufacturers’ claims. Prices range from $40 to almost $150 for the minimal two adapters. Why the difference and would it really matter to me?
In the end I went low-end rather than cutting-edge. I bought a D-Link “PowerLine AV 500 Network Starter Kit” (DHP-309AV) from the local Staples store. It took all of two minutes to set it up, another minute to connect cables, and my laptop was connected to the internet.
And if it proves itself in the upcoming months, I may look at the new models due out this fall to upgrade to the new AV2 standard, and get some extra ethernet ports strung around the house.
I’ve been building websites since the early 1990s, and have had my own websites continually since 1995. For a few years, I did website design and analysis for commercial clients – mostly small local businesses. I even taught web design at a local adult learning centre for a couple of years. Way back when the Net was relatively new, I even did some pages for local events. Although I do less coding today, mostly for my own use, I still have an interest in the developments in web technology and layout.
I taught myself the basics of HTML back when it was version 1.0, 20 years ago – not all that difficult if you were schooled in using the old word processors like Perfect Writer and WordStar. The first word processors used similar markup styles. Some even required users to compile the text into transient files in order to see the formatting results, because the formatting couldn’t be shown onscreen at the same time as the markup. These programs were small, tight and efficient enough to fit in the limited physical memory of early computers – 16 to 64KB – but not very feature-rich. Ah, the good old days of the Z80 and 6502 processors.
HTML was fairly easy (for me), but clumsy. It was a flat, 2D system and building some elements – tables in particular – was awkward and time-consuming. HTML tried – with limited success – to mix design with structure in one all-encompassing language. It was predicated on the printed page – basically replicating it onscreen. The initial versions of HTML were a desktop-publishing-like environment for the screen.
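To see why tables in particular were tedious, here is a minimal sketch (the content is invented) of the kind of markup even a trivial two-column table required – every row and cell written out by hand, with presentational attributes like `border` and `align` mixed directly into the structure, since there was no CSS to separate design from content:

```html
<!-- Hypothetical early-style HTML table: presentation and
     structure tangled together in the same markup. -->
<table border="1">
  <tr><th>Item</th><th>Price</th></tr>
  <tr><td>Widget</td><td align="right">$1.25</td></tr>
  <tr><td>Gadget</td><td align="right">$3.50</td></tr>
</table>
```

Multiply that by dozens of rows – with no editor support – and the time cost becomes obvious.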
But the old ways are not always suited for the new devices. Page designs and layouts done even five years ago may be outdated and ill-suited for mobile devices (as I have found from my own work). New design paradigms are needed to stay current with the ebb and flow of technologies.
Tanks are a long distance weapon, you know. They are best used in concert with one another to provide cover and overwatch fire, and are best placed in a covered or hull-down position where their profile is reduced to the minimum. Tanks should never travel alone; they should always advance with supporting vehicles on their flanks.
That’s pretty much what I said to my teammates that Saturday morning. However, I may have typed it a little more tersely. Something like, “%#$&@ idiots. Y R U in the open w/o support?”
I watched as the majority of them rushed across the field to be picked off in the open by well-placed enemy tanks, and turned into smoldering wrecks that dotted the battlefield. Don’t these people know anything about basic tank doctrine, I wondered? Well, probably not. This is the internet, after all.
Still, I want to shout out. Tanks are not close-range weapons. Or rather, they weren’t intended to be. This isn’t paintball. You can’t exactly sneak around in 25 or 30 tons of metal. But you can be clever and use the terrain to your advantage: peek carefully around corners, over rises, and stay hidden in bushes while you wait.
But there they were – half the team racing towards the enemy flag like heavy-metal Rambos, ignoring terrain, elevation, cover, overwatch or even one another. And paying the price. Boom! Another teammate in flames. You might have heard me swearing as you walked by the house that morning.
That left me with three others out of an initial 15 to guard the base; trying to cover all possible paths of approach, stay hidden and stay alive. And pick off the enemy, now bold enough to move forward. An enemy which still had nine intact vehicles, including a very active artillery and two tank destroyers, each with two kills already. A team that seemed to understand how to play much better than our side.
We lost that one.
Good thing it’s just a game and the losers merely have to wait it out until the match ends, then come to life and play again. When there’s no other penalty for dying except to wait, you won’t learn anything.
A recent article on Gizmodo shows off some previously unseen (or perhaps just forgotten) footage of a young Steve Jobs unveiling the Macintosh computer, back on January 30, 1984.
Thirty years ago, this week.
Seems like forever ago. But I remember it, and reasonably well. I remember where I was living then, what I was working on, and who I was with (I’m still with her…)
The video clip also includes the famous Orwellian “1984” TV ad Apple used to launch the Mac. That’s worth watching for itself. It was a really cheeky ad, and generated a lot of chatter about marketing at the time. The clip includes other Mac ads you should watch.
I had a Mac around then, bought, as I recall, in late 84 or early 85. I had had a Lisa – the Mac’s unsuccessful predecessor – on loan for a few months in 83 or early 84. I wasn’t impressed with the Lisa, but the Mac really captivated me.
I also had an IBM PC, from 82 or 83, and never quite understood the anti-IBM sentiments Jobs and Apple promoted among users. But then PC users fought back just as adamantly over the superiority of their platform.
As a computer geek from way back, I just loved having any – every – computer. When I started computing, I lived in a two-bedroom apartment; the second bedroom (the larger of the two, of necessity) became a workroom filled with computers, books, manuals, printers, modems, tools, chips, soldering irons and cables. As a technical and documentation writer, I always had extra hardware and software on loan from manufacturers and distributors. I once described my room as looking like a “NASA launch site.”
When we eventually bought our own house, I had a room for my books and computers, too, although they tended to escape and overrun the rest of the house. Same thing has happened here, although the amount of hardware is much reduced from the glory days (more ukuleles today than computers).
But ever since my first computer, I have not been a day without at least one computer in the house, usually several.
By the time the Mac was released, I had been computing for more than six years. I bought my first computer in the fall of 1977, a TRS-80, and soon had several machines (an Apple in 79, an Atari 400 in 1980 and then an 800 in 81). I belonged to user groups, subscribed to at least a dozen computer magazines, and wrote for several, including one of the earliest columns on computer gaming (in Moves magazine). I attended many computer fests and exhibitions in Canada and the USA – in fact, I helped organize a microcomputer fair in Toronto, at the International Centre, in the mid-80s.
As you read this, in 2014, I’ve been at it for almost 37 years.
So I take some umbrage when I read this condescending snippet on Gizmodo:
30 years ago the landscape of personal computing was vastly different. That is to say, it hardly existed.
Hogwash. It was alive and well – thriving in its entrepreneurial glory. Only poorly-informed journalists who have not done their research would make such a claim. Or perhaps they are too young to know of the rich history of personal computing prior to their own acquisition of a device.
By 1984, we had seen the TRS-80, Commodore Pet, Apple II, Kaypro, IBM PC, Atari 400, 800 and 1200, Sinclair, TI-99, the Acorn, Coleco Adam and many others. Apple’s own IIc would be released later in 1984.
We would soon see both the Commodore Amiga and Atari ST 16-bit computers launched. I owned them all, and a few others passed through my hands in that time, too.
In the 80s, CompuServe dominated the field of online services, with millions of customers. I was a sysop on CompuServe for many of those years. I even operated my own BBS for a while.
CompuServe was challenged – aggressively, but not very successfully – by several competitors in that decade including The Source and Delphi (I was later a sysop on Delphi, too, before moving to Collingwood).
As a former World of Warcraft player, I can attest to how compelling it is to play an immersive, massive, 3D role-playing game. Acting out scenarios in a fantasy world is more involving than merely reading a fantasy novel. You get addicted to being part of the narrative, to swinging the sword instead of just reading about it.
Just as when you’re reading a good novel and can’t stop turning the pages, you keep playing to see how the next chapter/adventure/scenario plays out, especially when you don’t always have to follow the script.
It’s not so much about the gameplay, as much as it is being part of the story. Well-designed games compel you to continue playing through a combination of action, puzzle solving, rewards and group activities.
WOW is an MMO – a massively multiplayer online game – set in a fantasy world that draws much of its substance from Tolkien and other fantasy writers. Many role-playing games (RPGs) follow the pseudo-Tolkien model; most follow paths laid out elsewhere in fantasy literature (e.g. characters and novels by Robert Howard, Edgar Rice Burroughs, H.P. Lovecraft or more modern writers).
WOW is, of course, not the only game that offers that sort of setting, but at eight years old, with about 12 million subscribers, it’s both the largest and longest-lasting of them. It thus becomes the yardstick for measuring any other game in the genre. None of its competitors – Rift, Guild Wars, Lord of the Rings, Star Wars, etc. – have a fraction of the players.
RPGs owe their ancestry to a small boxed set of rules published in 1974, called Dungeons and Dragons. Written by Gary Gygax and Dave Arneson (whose name subsequently disappeared from the list of authors in later printings), it essentially created the standards for fantasy role playing that are still in use today.
This is documented in great detail in Jon Peterson’s 700-page tome, Playing at the World (his blog is here). It was reading this book that got me thinking about game design again (and to dig through what few old wargames and rules books I have in the basement…).
In his introduction, Peterson identifies “freedom of agency” as one of the key components, “as much a necessary condition for inclusion in the genre of role-playing games as is role assumption.” The ability to make choices of action, goal and behaviour is central to a compelling game. In the Wired interview, linked above, Peterson defends gaming,
“…not as fads or disposable products of pop culture, but instead as a legitimate part of intellectual history, heirs to a tradition that stretches back centuries and involves many great thinkers and innovators.”
Which is similar to what I’ve been writing about for a few decades.* Gaming, at least in the simulation-style games, is not merely a pointless pastime, but rather an intellectual exercise.
Computer games have both redefined entertainment and set the bar for hardware and software development. Games are incredibly demanding of computer resources compared to, say, a spreadsheet. Consider the processing required to keep track of dozens, even hundreds of players who are interacting in 3D space in realtime, plus all of the geography, terrain, in-game trades and purchases, combat, weather and environmental effects. And to keep everyone in the game fully informed of all the events, locations and activities of their characters, pets, party members, resources, movement paths, mail… it’s a stunning amount of work.
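The scale of that bookkeeping shows up even in a toy sketch. Below is a naive O(n²) visibility pass (the names, coordinates and 50-unit view range are all invented for illustration) of the kind a server must rerun many times per second, before spatial partitioning and other optimizations are applied – and this is only one of the many per-tick jobs the paragraph lists:

```python
import math
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    x: float
    y: float

VIEW_RANGE = 50.0  # hypothetical distance within which players "see" each other

def visible_pairs(players):
    """Naive all-pairs visibility check: O(n^2) distance tests per tick."""
    pairs = []
    for i, a in enumerate(players):
        for b in players[i + 1:]:
            if math.hypot(a.x - b.x, a.y - b.y) <= VIEW_RANGE:
                pairs.append((a.name, b.name))
    return pairs

# Six players spaced 10 units apart: every pair is within view range.
players = [Player(f"p{i}", float(i * 10), 0.0) for i in range(6)]
print(len(visible_pairs(players)))  # prints 15
```

With hundreds of players, the pair count grows quadratically, which is why real servers carve the world into zones and grids rather than testing everyone against everyone.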
Beyond the coding, there are some basic components any game needs to be successful:
Clearly defined purpose and goals;
Identifiable opponents to overcome;
Reward for accomplishing goals or overcoming challenges;
An understandable and accessible board geography where the game is played;
Clear and concise rules.
RPGs add other elements to create that immersive experience, including:
Character choice, advancement and development;
Consequences of actions or behaviour;
Alternate races (orcs, elves, dwarves, etc.);
Role assumption (taking on the persona of a character in the story);
Free agency (the ability to move and act independent of the script);
Believable fantasy, alternate or futuristic world environment;
Clear sides with which the races align and which have competing goals.
Computer games have additional components:
Good graphics and visual appeal;
Good AI (artificial intelligence) and NPCs (non-player characters);
Believable environmental interactions, simulated physics and effects;
Appropriate sound (and sometimes music);
Interactivity with NPCs, environment;
Social activity (in MMOs).
Some RPGs (e.g. Fable, The Witcher, Fallout 3) have more complex “consequences” built into decision-making. Certain choices – such as attacking or stealing from non-player characters, or how you answer their questions – affect the way others relate to your character. How well these mimic actual social or personal behaviour is debatable. Mostly they seem to me merely designed to add chrome to role assumption. In some cases, they don’t really affect the game or quests.
Since these are solo games, rather than MMOs, you can usually save your game before you make a choice, then replay it with a different choice if you don’t like an outcome. That tends to dissipate the suspension of disbelief necessary for immersion.
I don’t include “fun” in any of these lists because fun, like beauty or taste, is subjective. Players will gravitate to the games that provide the highest entertainment value for their own interests and aptitudes. I, for example, never found WOW’s battlegrounds “fun” but always enjoyed questing and exploring (solo and in parties). Others eschew the quests for the PvP combat in the battlegrounds.
Can the storyline absorb the players sufficiently, for long enough and deeply enough to suspend disbelief, to make you care about both the characters and the action? It depends on how well the narrative is scripted. A good storyline has to be crafted as carefully as a good novel and needs to generate a similar emotional response.
Clearly, however, game narrative is very different from a storyline in a book, since choice is a key element in gaming.
Quests can also be seen as ‘micro-narratives.’ In many games, the plot or story is merely a shell that contains numerous micro-stories presented as quests. Sometimes these are dynamic, so that the nature or goals of quest B depends on how or how well you accomplished quest A. However, the shell story needs to be coherent so players don’t simply feel they’re moving from one mini-game to the next for no reason.
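That dynamic chaining – quest B depending on how quest A was resolved – can be sketched in a few lines. All the quest names and outcomes below are hypothetical, purely to illustrate the branching idea:

```python
# Toy quest-chain table: (previous quest, its outcome) -> next quest.
QUEST_CHAINS = {
    ("rescue_villagers", "all_saved"): "escort_survivors_home",
    ("rescue_villagers", "some_lost"): "avenge_the_fallen",
}

def next_quest(prev_id, prev_outcome):
    """Pick quest B based on how quest A ended, so the chain reads as
    one continuous story rather than disconnected mini-games."""
    return QUEST_CHAINS.get((prev_id, prev_outcome), "return_to_questgiver")

print(next_quest("rescue_villagers", "some_lost"))  # prints avenge_the_fallen
```

Even this trivial lookup shows why the shell story matters: without a coherent frame, the branches are just a menu of unrelated tasks.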
A lot of games fall down with thin stories, pointless quests (collect X eyeballs or Y spleens), and predictable A-to-B-to-C plots. And too many depend more on action to move them along than on plot or participatory narrative (e.g. Diablo III).
“…the difference between traditional games and videogames is that videogames have a world in which everything about the game, except for controller input, takes place. This world is created, controlled, and sometimes populated by continuous and discrete artificial intelligence. The player is a guest in that world, the central participant in its mechanics. Even still, the world is usually not driven by the player; it is the designer’s world, and should be studied as such.”
Holleman also asks, “…whether or not videogames are similar enough to traditional narratives that we should study them the same way.” In response, he adds,
“To begin, it makes sense to admit that some portions of videogame narratives are exactly like books; the player reads them without interacting except to turn the ‘page’. Some narrative segments in videogames are exactly like movies; the player watches them without doing anything except pausing and unpausing. No decent videogame is entirely like movies or books. A movie creates a fictional world that one can see and hear, but viewers are locked into a guided tour that the filmmakers have scheduled for the viewer, and viewers can never deviate from that tour. In a videogame, on the other hand, the player is presented with a world that can be accessed largely at their own discretion. Videogames that are too linear—too much like the guided tours of movies—are often deprecated by critics and gamers.”
Interactivity is essential, but is not synonymous with narrative. Ernest Adams, writing in “Three Problems For Interactive Storytellers,” said,
“Interactivity is almost the opposite of narrative; narrative flows under the direction of the author, while interactivity depends on the player for motive power.”
“You say narrative to the average gamer and what they are apt to imagine is something on the order of a choose-your-own adventure book, a form noted for its lifelessness and mechanical exposition rather than enthralling entertainment, thematic sophistication, or character complexity… Most often, when we discuss games as stories, we are referring to games that either enable players to perform or witness narrative events – for example, to grab a lightsabre and dispatch Darth Maul in the case of a Star Wars game. Narrative enters such games on two levels – in terms of broadly defined goals or conflicts and on the level of localized incidents.”
Immersiveness also depends heavily on how much free will the player has – the ability to write ourselves into the script. In games like Diablo III, the action is very linear, with little flexibility to explore or act outside the prescribed plot and territory. These games have little immersive value (and, at least with D3, little replay value, either). Others, like Mass Effect and Dragon Age, combine limited freedom with scripted activities and plots.
Morrowind, Skyrim and the post-apocalyptic Fallout 3 provide a generally freely roamable world, and in some cases the ability to attempt quests well beyond your character’s level (some MMOs offer this as well).**
While few solo RPG games offer such significant free agency, it is the hallmark of most MMOs. Holleman writes,
“World of Warcraft is another game heavily dependent on the depth and persuasiveness of its world; it has the benefit of being an ever-expanding world as well, with content updates and expansion packs. The first time through the game tends to be the best, from a narrative perspective. The structure of the quests (tasks with completion rewards) that guide gameplay are heavy on exploration, but often a bit short on variety, i.e. collect 10 quest items, for the millionth time. What makes these quests and dungeons compelling—at least the first time through the game—is that they are driven by a strong, interesting setting.”
Because RPGs have a character-levelling ladder, the reason many players don’t explore the MMO environments more fully is usually that their characters are too low-level to survive in higher-level zones. Some sort of safe passage is sometimes offered (e.g. roads where hostile NPCs don’t patrol), or swift transportation is available (riding or flying mounts in WOW) to encourage more exploration.
Most MMOs have graduated zones for each race. These offer playable regions challenging for characters within that given range, such as levels 1-5, 6-10, 11-15, etc. You play your character in a zone until it levels up to be able to enter (and survive in) the next zone. Each zone has level-related quests to fulfill to aid your advancement.***
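The graduated-zone idea reduces to a simple lookup. The level bands below are hypothetical (they mirror the 1-5, 6-10, 11-15 pattern mentioned above), just to show how a game might route a character to level-appropriate content:

```python
# Hypothetical zone bands: each tuple is the character-level range
# a zone's content is designed to challenge.
ZONE_BANDS = [(1, 5), (6, 10), (11, 15)]

def zone_for_level(level):
    """Return the band a character of this level is intended to play in;
    characters past the last band fall through to endgame content."""
    for low, high in ZONE_BANDS:
        if low <= level <= high:
            return (low, high)
    return ZONE_BANDS[-1]

print(zone_for_level(7))  # prints (6, 10)
```

Real MMOs layer quests, mob difficulty and rewards onto each band, but the gating logic is essentially this.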
Completing all available quests is also part of the achievement ladder. Players are encouraged to complete all quests in all zones, regardless of their level. The problem with this system is that, in many MMOs, when your high-level character enters a low-level zone (for example, for another race), the quests are ridiculously easy but you still want to complete them. On the other hand, quests are designed to get players to explore the entire zone while questing, which increases the sense of immersion.
Where most games have a defined end (in RPGs, usually the defeat of a final boss character), MMOs are often open-ended: they can be played after the characters have reached their highest level, accomplished all available quests and defeated all the boss characters. Usually such activities are social: group raids, battlegrounds and dungeons outside the formal narrative and questing lines (essentially making them into fantasy variants on the FPS-PvP line of gaming). It’s also possible to create new characters and start again from level one, often choosing a different race, type/class (warrior, priest, hunter, etc.) or even alliance.
As the goal of game design, immersion is difficult to achieve: it depends on the interaction of several factors, as well as the independent activities of players outside the scripted narrative. It’s an interesting challenge that, so far, no single game has managed to meet fully, but it’s always interesting to examine the results. ****
* I started wargaming in the mid-1970s, bought a computer in 1977, and by around 1980 was writing a regular column on computer games for Moves magazine, as well as writing articles for contemporary programming magazines. I wrote about computers and game design for several magazines in the 1980s including Antic and ST Log, and published a column on technology in Canadian newspapers for a decade from the mid 1990s, which often looked at game developments.
** First-person shooters (FPS) like Call of Duty and Medal of Honor usually combine scripted scenarios with open or semi-open gameplay in a small environment. Very few have a fully open environment (Far Cry, however, was one).
*** Level grinding is when you rush through all the available quests solely to get your characters up to a reasonable level of strength to be able to use powers or traits unlocked at higher levels, and then to engage in multiplayer activities like dungeons and raids. It’s common in WOW to see level 60-80 characters doing level 1-10 quests to complete their achievement ladders. For the lower level players, this can be frustrating as you watch a higher-level character blaze through an area, taking quest items or killing quest characters with ease, forcing you to wait for them to respawn. Guild Wars 2 has a different approach. When the player’s level is higher than the zone, that level is reduced in that zone to make repeat and collective quests competitive. A level 35 character, playing in a 1-5 zone, will play at level 3-5. Weapon and armour strengths are decreased accordingly. This is somewhat offset by the character’s accumulated buffs, unlocked skills and so on, so it is easier but still a challenge. This heightens the immersive value of GW2.
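The GW2-style down-levelling described above can be approximated in one line. This is only a rough sketch of the idea – the real system also scales attributes and gear, and the exact formula is more nuanced than a simple clamp:

```python
def downscale(char_level, zone_high):
    """Rough sketch of down-levelling: a character above the zone's
    level ceiling is clamped to it, so old content stays a mild
    challenge; characters at or below the ceiling are untouched."""
    return min(char_level, zone_high)

print(downscale(35, 5))  # prints 5 (a level 35 character in a 1-5 zone)
print(downscale(3, 5))   # prints 3 (low-level characters play as-is)
```

Because accumulated skills and traits survive the clamp, the downscaled character remains somewhat stronger than a native one, which matches the “easier but still a challenge” effect described above.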
**** One of the things in WOW that, for me, detracts from immersion is the cartoonish style of characters and buildings. Games like Rift and GW2 have tried to make the player feel less distanced through more realistic graphics and animation. However, none of them are up to the detail or lifelike characters we see in Call of Duty or Medal of Honor. Some licence must be allowed, of course, for fantasy races and characters.