Of mice and men, and trackballs, too

All the mice tested
Top: Aukey, Steelcase (wired gaming). Bottom: Anker, EV, Logitech, Kensington (from the left, seen from the back)

Late last year, I purchased another laptop to separate my work and recreational uses. After a long search in stores, and a lot of online reading and comparing of models, I decided to get an MSI gaming rig (an entry-level model in their pantheon, admittedly). That process got me thinking again about how we buy and sell computers.*

Computers are, for the most part, sold like muscle cars: what’s under the hood gets the attention. The processor, RAM, speed and drive capacity all get writ large in ads and promoted in stores. But it’s all a bit misleading. For most uses – surfing the web, email, some word processing or spreadsheet work, non-graphics-intensive games, shopping on Amazon, that sort of thing – any modern computer or tablet will do.

Today’s smartphones and tablets pack bazillions more processing power into a single handheld device than a room full of bulky, freezer-sized IBM 360s had a few decades back. I ran games, word processors, spreadsheets and more on Atari and other eight-bit computers that couldn’t out-compute a modern digital watch, let alone an i3 laptop (and that’s a weak sibling to the i5 and i7). Those seemingly limited Chromebooks and bargain-priced laptops are really monsters of computing muscle compared to what we used only a couple of decades back.

Yes, the hardware specs matter if you have processor-heavy work such as graphic design, video or music editing, 3D animation or graphics-intensive gaming. But for the most part, what should really matter when you’re choosing a computer is where you interact with it the most: the input/output devices, namely the screen, the keyboard and the mouse or trackpad. That’s where you’ll spend the most time and get the most sensory feedback.

All the mice tested
Top: Aukey, Steelcase (wired gaming). Bottom: Anker, EV, Logitech, Kensington (from the left, seen from the front)

I recently decided to change my mouse. Or mice, rather, since each laptop has its own. In part it’s because, after many hours a day spent with one, my wrists and fingers can be tired and sore. I only use the built-in trackpads when I don’t have access to a mouse, because I find them inefficient and clumsy.

Arm twisting

I’ve favoured wired gaming mice in the past for several reasons. First, a wired connection is consistent, where a wireless one might be susceptible to interference (and gaming mice have excellent, if often long, cables). Second, a gaming mouse usually has a lot more features than a regular mouse, including programmable buttons and more levels of speed and sensitivity. Third, they offer better (longer-lasting) buttons and scroll wheels, built for constant clicking and wheeling. And fourth, from long experience, I’ve learned not to buy the cheapest mice: they are generally less durable and less accurate than those from recognized companies.***

Traditional mice have the same basic problem for me and many other users: they force the user’s arm to be held for long periods in a position that can encourage strain and wear. Part of my work includes graphic design that needs precision control, and part includes copying and pasting text and links from one monitor to applications on another, so the cursor travels a fair distance. More standard uses include document processing in word processors and spreadsheets. I’m on the computer many, many hours every day. And I find my arms and wrists hurting all too much these days.

I decided to look at something different: ergonomic mice, including vertical mice and trackballs. Here’s what I discovered, and my review of each.
Continue reading “Of mice and men, and trackballs, too”

Forty years of geekitude

TRS-80 Model 1

It was forty years ago this fall, in 1977, that I bought my first computer. I had little experience with computers prior to that – a few weeks working after hours on an APL system at the U of T, mostly to play games against the machine, and reading a few magazine articles on the coming ‘personal’ computer wave. Nothing seriously hands-on, experience-wise, and no programming skills either. But as soon as I saw one, I had to have it. And so I bought one.

Since then, I have not been a day without one, and generally had more than just one in my home. As many as six or seven at one time, back in the early 1980s, all different brands. But that was when I was writing about them, editing computer books and writing computer manuals.

My first computer was a TRS-80 Model 1. TRS stood for Tandy Radio Shack. It was a 16KB computer (yes: that’s 16,384 bytes of memory). In comparison, my current laptop has 8GB, or 8,388,608 kilobytes: 524,288 times the Model 1’s amount of RAM!
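If you want to check that arithmetic yourself, here’s a quick back-of-the-envelope sketch in Python (my own, assuming binary units throughout, i.e. 1KB = 1,024 bytes):

```python
# Back-of-the-envelope comparison of the TRS-80 Model 1's RAM with a modern 8GB laptop,
# using binary units (1 KB = 1,024 bytes, 1 GB = 1,024**3 bytes).
trs80_ram_bytes = 16 * 1024        # 16 KB = 16,384 bytes
laptop_ram_bytes = 8 * 1024 ** 3   # 8 GB = 8,589,934,592 bytes

print(f"TRS-80 Model 1: {trs80_ram_bytes:,} bytes")
print(f"Modern laptop:  {laptop_ram_bytes:,} bytes ({laptop_ram_bytes // 1024:,} KB)")
print(f"Ratio: {laptop_ram_bytes // trs80_ram_bytes:,}x")  # 524,288x
```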

It was powered by a Zilog Z-80 eight-bit processor. My current machines all run 64-bit, multi-core processors. It had no USB ports, didn’t use a mouse, and had no audio card. Smartphones today are more versatile and more powerful. But not as much fun.

Before I bought it, I debated for a week or two whether to get the TRS-80 or the competing Commodore PET, powered by the 6502 processor. The PET had similar limitations in memory and input devices, but came with a green-on-black screen integrated with the keyboard in one unit. But the TRS-80 was sold at a nearby Radio Shack store within walking distance, and the store also offered evening classes to teach the basics. The PET was only sold at stores downtown, so I bought the closer one.

I had to boot it and load programs from a cassette tape player. A year or so later, I upgraded to a 64KB RAM system and dual floppy (5.25″) drives. Each floppy could hold about 160KB of programs or data. It had a standalone B&W monitor with no real graphics capability, although canny programmers used the block characters in its character set to create pseudo-graphics (a bit like today’s Dwarf Fortress game displays, but only in B&W).
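As a rough modern illustration of that trick (purely my own sketch: the TRS-80 used its own block-graphics characters, not Unicode), you can still build little pictures out of nothing but text cells:

```python
# A modern approximation of character-cell pseudo-graphics: render a tiny bitmap
# with Unicode half-block characters, packing two pixel rows into each line of text.
bitmap = [
    "0110",
    "1111",
    "1001",
    "0110",
]

# Map (top pixel, bottom pixel) to a block character: space, upper half, lower half, full block.
blocks = {(0, 0): " ", (1, 0): "\u2580", (0, 1): "\u2584", (1, 1): "\u2588"}

for top, bottom in zip(bitmap[0::2], bitmap[1::2]):
    print("".join(blocks[(int(t), int(b))] for t, b in zip(top, bottom)))
```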
Continue reading “Forty years of geekitude”

Microsoft killed solitaire for me

Solitaire – also known as Klondike and Patience – is a very popular game on computers. So popular, in fact, that a version of this 200-year-old card game has been included by Microsoft in every version of Windows since 3.0 (1990), aside from a brief hiatus with Win 8 (a gap filled by third-party versions). Microsoft has even launched a version for iOS, playable on the Mac, iPhone and iPad.

And according to some reports, it is the most widely used program among Windows users, by a long shot – more than Word, Outlook, PowerPoint and Explorer. Writer Luke Plunkett called that statistic “frightening.”

But for millions of us, solitaire fills the time; it occupies our brains during long travel, in waiting rooms, and in between loading, downloading, burning-to-disk or compiling sessions. Not just the one game: there is a whole raft of solo card games under the name solitaire – FreeCell, Spider, Klondike, Pyramid and TriPeaks among them – that people play regularly, and sometimes obsessively. Many is the time I have stopped writing this blog or some other piece, trapped by writer’s block or simple exhaustion, to while away a few minutes recharging with a simple game of solitaire.

As Plunkett wrote:

You mention Solitaire and—after the amazing end-game card haze—the first thing that pops into your head is that it was once seen as the single biggest threat to office productivity facing this planet’s workers. And in many regards, that’s correct.
Most people who have worked in an office can testify to the lure of the game, and could name one or two colleagues who spent a little too much time cutting the decks when they should have been filing reports. Some even take it too far; in 2006, New York Mayor Michael Bloomberg famously fired a city employee he caught playing the game while at work.
This addiction can even spread beyond the workplace and into people’s homes. My father has spent more time playing Freecell over the past decade than he has doing housework, for example. Things can get even worse for some: in 1996, Dr. Maressa Hecht Orzack opened the world’s first clinic for computer addicts as a response to her own chronic Solitaire addiction.

In May, 2008, Slate magazine ran a story titled, “Solitaire-y Confinement: Why we can’t stop playing a computerized card game.” In it, author Josh Levin wrote:

The game’s continued pre-eminence is a remarkable feat—it’s something akin to living in a universe in which Pong were the most-popular title for PlayStation 3. One reason solitaire endures is its predictability. The gameplay and aesthetic have remained remarkably stable; a visitor from the year 1990 could play the latest Windows version without a glitch, at least if he could figure out how to use the Start menu. It also remains one of the very few computer programs, game or nongame, that old people can predictably navigate. Brad Fregger, the developer of Solitaire Royale, the first commercial solitaire game for the Macintosh and the PC, told me that his 89-year-old mother still calls regularly to brag about her high scores. The game has also maintained a strong foothold in the modern-day cubicle.

With such widespread popularity – a game beloved by millions, maybe even billions – you have to wonder why Microsoft seems bent on destroying the experience in Windows 10. Levin calls solitaire the “…cockroach of gaming, remarkably flexible and adaptable.” Perhaps Microsoft wants to stamp it out.
Continue reading “Microsoft killed solitaire for me”

Back to black

Grey scales

I had noticed of late that several websites had become more difficult to read, having opted for lighter grey text instead of a more robust black. But it didn’t dawn on me that it wasn’t my aging eyes: it was a trend. That is, until I read an article on Backchannel called “How the Web Became Unreadable.”

It’s a good read for anyone interested in typography, design and layout – and not just on the Web, but in print as well. It makes several good points about contrast, including some important technical details about how contrast is measured.
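To give a sense of what that measurement looks like, here’s a minimal sketch of the WCAG contrast-ratio calculation (my own illustration, not code from the article), which compares the relative luminance of the text and background colours; 4.5:1 is the usual minimum recommended for body text:

```python
# Sketch of the WCAG 2.x contrast-ratio calculation for two sRGB colours.
# Ratio = (L1 + 0.05) / (L2 + 0.05), where L1 is the lighter colour's relative luminance.

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of an sRGB colour given as '#rrggbb'."""
    channels = [int(hex_colour.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4 for c in channels]
    r, g, b = linear
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 2))  # black on white: 21.0
print(round(contrast_ratio("#777777", "#ffffff"), 2))  # mid-grey on white: ~4.5, borderline
print(round(contrast_ratio("#aaaaaa", "#ffffff"), 2))  # pale grey on white: well under 4.5
```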

I’ve written in the past about how contrast is important in design (here and here, for example). But apparently there’s been a design trend of late away from contrast and towards murkiness. In his article, author Kevin Marks notes:

There’s a widespread movement in design circles to reduce the contrast between text and background, making type harder to read. Apple is guilty. Google is, too. So is Twitter.

Others have noticed this too, even before Marks. In 2015, Katie Sherman wrote on the Nielsen Norman Group’s site:

A low-contrast design aesthetic is haunting the web, taking legibility and discoverability with it. It’s straining our eyes, making us all feel older, and a little less capable. Lured by the trend of minimalism, sites are abandoning their high-contrast traditions and switching to the Dark Side (or should I say, the Medium-Gray Side). For sites willing to sacrifice readability for design prowess, low-contrast text has become a predictable choice, with predictable, persistent usability flaws.

This trend surprises and distresses me because it seems singularly user-hostile: anti-ergonomic and at odds with the whole point of the internet. Apparently it’s part of a minimalist design aesthetic. Now, I don’t mind clean, uncluttered web pages, but I balk at making them unreadable. Pale grey text reduces both accessibility and legibility.

Continue reading “Back to black”

The bucket list, kicked

Kick the bucket

Nowadays the “bucket list” concept has become a wildly popular cultural meme, thanks to the movie of the same name. Subsequent marketing of the idea to millennials has proven a successful means to deprive them of their income, with which they seem eager to part.

I don’t like the concept. The list, I mean, not necessarily the plucking of the millennial chickens who willingly hand over their financial feathers. They get what they deserve.

Bucketlist.org has, at the time of this writing, more than 5.317 million “dreams” for you to pursue. Contributed by more than 450,000 people. And your individual dream? Part of the Borg’s list. Pretty hard to think of something original that the previous 450,000 folks didn’t already add to the list.

Just search “bucket list” on Google and you’ll turn up close to 52 million hits, and a huge number of them are selling something, from New Age codswallop to travel to high-tech gadgets and everything in between. Nowadays, “your” bucket list is everyone’s bucket list and has become part of a slick campaign aimed at your wallet. At every corner there’s some entrepreneur eager to play Virgil to your hollow life’s Dante, for a price.

A bucket list is, we learned from the film, the wish list of things you want to accomplish before you kick the metaphorical bucket – i.e., die – as a means to give your previously pathetic life some substance. That notion quickly morphed into a commercial selling point, and it seems I encounter it every day in some new form, usually on social media. It’s up there with posts about puppies, angels, magic crystals, and nasty troll posts about liberals.

The movie is about two seniors undergoing an end-of-life crisis trying to figure out the Meaning of It All. They resolve to avoid dwelling on their inevitable end by taking very expensive trips around the world (Jack Nicholson plays a billionaire…). It’s a cute, moving film. It’s fiction, but also a great marketing idea. We are all susceptible to Hollywood, after all. And, of course, we all have billionaire friends who will buy the tickets, right?

Okay, I get it: we all want life to make sense, and to have meaning that makes the 9-to-5 grind worthwhile. But even if our lives are meaningless, we don’t want to die, either. We want to be able to say something we did made the journey worth the effort. But is this the way? Is life simply a series of boxes we check off? A list that keeps growing with more and more items to check? Your self-esteem will suffer if you don’t check this off. And this. And this. And this…

Continue reading “The bucket list, kicked”

PE5: Corel’s Wrong Direction

Way back in 1990, a program called Fractal Painter was published by Fractal Design. It offered a “natural media” approach to digital art: mimicking real-world art tools and media in the digital environment. You could – if you had more artistic skill than I do – make an image onscreen that looked like a photo of a real-media piece. Images had texture, oils had highlights. You could mix colours like you do in real life.

It was brilliant, exciting and ground-breaking stuff.

In 1997, after gobbling up several other products (including Poser 3D), the company became MetaCreations, but it extended itself a little too far and split into fragments that were individually acquired by Microsoft and Corel.*

Corel continued to publish – and enhance – Painter. It turned Painter into the foremost “natural media” digital art program on the market. Its latest incarnation is Painter 2016, a remarkable and powerful art program that sells for $500.

The price, however, deterred many users who wanted a simple art program for non-commercial uses, or just to make art-like variations of their digital photographs. So Corel developed Painter Essentials (PE) at a modest (under $50) price. PE was, essentially, Painter-light, offering a stripped-down subset of Painter’s tools.

The big draw in Painter Essentials was its auto-paint feature, which automated the brush and pen strokes. With a few clicks, users could turn a digital image (a photograph, for example) into an art-like rendition. Oil, charcoal, pencil, watercolour, impressionist, modern, pen-and-ink… just a few easy clicks to create an art-like image. Even watching the process work was mesmerizing.

Want a photo to look like a charcoal sketch? A watercolour? A Cézanne or Van Gogh painting? PE 4 had those options. Better yet, it had options to control the brush size, colours, canvas, stroke frequency and so on in each category. That meant users could personalize images in many ways previously unavailable except at much higher cost.

People loved PE. It filled a need: creating beautiful art without the effort, time or cost of Painter, and without the talent required to use all those tools. It offered enough control to make each resulting image unique and to give users a feeling they had accomplished something more than just clicking.

It had many powerful tools and features – Painter’s brushes – to make it more than a toy.

Continue reading “PE5: Corel’s Wrong Direction”