Goodbye, Information Age

“Say goodbye to the information age: it’s all about reputation now,” is the headline of an article by the Italian philosopher and professor Gloria Origgi, published recently on Aeon Magazine’s website.

She writes:

…the vastly increased access to information and knowledge we have today does not empower us or make us more cognitively autonomous. Rather, it renders us more dependent on other people’s judgments and evaluations of the information with which we are faced.

I no longer need to open a computer, go online and type my questions into Google if I want to know something: I can simply ask it. “Hey Google, what’s the population of China?” or “Hey Google, who’s the mayor of Midland, Ontario?” or “Hey Google, how many lines are in Hamlet?” Google will answer with all the data. If I ask, “Hey Google, what are the headlines this morning?” it will play a recent CBC newscast.

Google Home can, however, only give me a summary, a snippet, a teaser. Should I want to delve deeper, or into more than one question, I still need to go online and search. And that leads me into the information swamp that is the internet. How do I sort it all out?

The way we access information has changed as radically as the amount available to us. Just look at the Cambridge Dictionary’s Word of the Year for 2018: “nomophobia,” which means “a fear or worry at the idea of being without your mobile phone or unable to use it.”

Describing a typical day in his life, Dan Nixon writes of how we isolate ourselves with our phones, imagining they are instead connecting us:

…the deluge of stimuli competing to grab our attention almost certainly inclines us towards instant gratification. This crowds out space for the exploratory mode of attention. When I get to the bus stop now, I automatically reach for my phone, rather than stare into space; my fellow commuters (when I do raise my head) seem to be doing the same thing.

What could be so engaging on the phone that the writer cannot use the time to, say, think? Read? Observe? Communicate with his fellow travellers? A set of eleven studies found that “…participants typically did not enjoy spending 6 to 15 minutes in a room by themselves with nothing to do but think, that they enjoyed doing mundane external activities much more, and that many preferred to administer electric shocks to themselves instead of being left alone with their thoughts.” The phone serves as a personal barrier to interaction instead of facilitating it. It’s a feedback loop: the sensory response makes it seem we are “doing something,” while simply thinking comes to feel like “doing nothing.”

“Nothing, to my way of thinking, is better proof of a well-ordered mind than a man’s ability to stop just where he is and pass some time in his own company.”
Seneca, Letter II to Lucilius, trans. Robin Campbell, Penguin Classics: Letters from a Stoic, 2004.

Continue reading “Goodbye, Information Age”

Dictionary vs Dictionary.com

Did you know that doxastic is a philosophical adjective relating to an individual’s beliefs? Or that doxorubicin is an antibiotic used in treating leukemia? Or that doxy is a 16th-century word for a mistress or prostitute? That drack is Australian slang for unattractive or dreary? That drabble means to make wet and dirty in muddy water? That a downwarp is a broad depression in the earth’s surface? That drail is a weighted fish hook? That dragonnade means quartering troops on a population, while a dragonet is a small fish and a dragoman is an interpreter? That a dramaturge is a literary editor on a theatre staff?

These are words I read when I was looking up the word doxology last night. They all appear close to doxology, either on the same page or an adjacent one. Anyone with even a modicum of curiosity can open a dictionary and find these and other words in their search for the meaning of an unfamiliar or uncommon word. In fact, it’s quite entertaining to simply open a dictionary at any random page and read, because you are likely to learn something new each time (well, perhaps less so if you use one of the generic no-name dictionaries sold in box stores).

My bedside dictionary is the Concise Oxford, but I also have several other Oxford editions, a Random House, a Merriam-Webster, and a Chambers, plus some others. I often refer to several for a more comprehensive understanding of a word. And yes, I do keep one by the bed, because I read a lot before sleep and sometimes encounter unfamiliar words. Oxford, because it’s simply the best: I like the layout and typography, and it’s English, not American.
Continue reading “Dictionary vs Dictionary.com”

Of mice and men, and trackballs, too

All the mice tested
Top: Aukey, Steelcase (wired gaming). Bottom: Anker, EV, Logitech, Kensington (from the left, seen from the back)

Late last year, I purchased another laptop to separate my work and recreational uses. After a long search in stores, and a lot of online reading and comparing models, I decided to get an MSI gaming rig (an entry-level model in their pantheon, admittedly). That process got me thinking again about how we buy and sell computers.*

Computers are, for the most part, sold like muscle cars: what’s under the hood gets the attention. The processor, RAM, speed, and drive capacity all get writ large in ads and promoted in stores. But it’s all a bit misleading. For most uses – surfing the web, email, some word processing or spreadsheet work, non-graphics-intensive games, shopping on Amazon, that sort of thing – any modern computer or tablet will do.

Today’s smartphones and tablets pack bazillions of times more processing power into a single handheld device than a room full of bulky, freezer-sized IBM 360s had a few decades back. I ran games, word processors, spreadsheets and more on Atari and other eight-bit computers that couldn’t out-compute a modern digital watch, let alone an i3 laptop (and that’s a weak sibling to the i5 and i7). Those seemingly limited Chromebooks and bargain-priced laptops are really monsters of computing muscle compared to what we used only a couple of decades back.

Yes, the hardware specs matter if you have processor-heavy work such as graphic design, video or music editing, 3D animation or graphics-intensive gaming. But for the most part, what should really matter when you’re choosing a computer is where you interact with it the most: the input/output devices – the screen, the keyboard, and the mouse or trackpad. That’s where you’ll spend the most time, and where you’ll get the most sensory feedback.

All the mice tested
Top: Aukey, Steelcase (wired gaming). Bottom: Anker, EV, Logitech, Kensington (from the left, seen from the front)

I recently decided to change my mouse. Or mice, rather, since each laptop has its own. In part it’s because, after many hours a day spent with one, my wrists and fingers can be tired and sore. I only use the built-in trackpads when I don’t have access to a mouse, because I find them inefficient and clumsy.

I’ve favoured wired gaming mice in the past for several reasons. First, a wired connection is consistent, where a wireless one might be susceptible to interference (and gaming mice have excellent, often long cables). Second, a gaming mouse usually has a lot more features than a regular mouse, including programmable buttons and more levels of speed and sensitivity. Third, they offer better (longer-lasting) buttons and scroll wheels, built for constant clicking and wheeling. And fourth, from long experience, I’ve learned not to buy the cheapest mice: they are generally less durable and less accurate than those from recognized companies.***

Traditional mice have the same basic problem for me and many other users: they force the user’s arm to be held for long periods in a position that can encourage strain and wear. Part of my work includes graphic design that needs precision control, and part includes copying and pasting text and links from one monitor to applications on another, so the cursor travels a fair distance. More standard uses include document processing in word processors and spreadsheets. I’m on the computer many, many hours every day. And I find my arm and wrist hurting all too much these days.

I decided to look at something different: ergonomic mice, including vertical mice and trackballs. Here’s what I discovered, and my review of each.
Continue reading “Of mice and men, and trackballs, too”

Forty years of geekitude

It was forty years ago this fall, in 1977, that I bought my first computer. I had little experience with computers prior to that – a few weeks working after hours on an APL system at the U of T, mostly to play games against the machine, and reading a few magazine articles on the coming ‘personal’ computer wave. Nothing seriously hands-on, experience-wise, and no programming skills either. But as soon as I saw one, I had to have it. And so I bought one.

Since then, I have not been a day without one, and generally had more than just one in my home. As many as six or seven at one time, back in the early 1980s, all different brands. But that was when I was writing about them, editing computer books and writing computer manuals.

My first computer was a TRS-80 Model 1. TRS stood for Tandy Radio Shack. It was a 16KB computer (yes: that’s 16,384 bytes of memory). In comparison, my current laptop has 8GB, or 8,388,608 kilobytes: 524,288 times the Model 1’s amount of RAM!
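A quick back-of-the-envelope check of that ratio, sketched in Python (using binary units, where 1 KB is 1,024 bytes):

model_1_ram = 16 * 1024          # the TRS-80 Model 1's 16 KB, in bytes
laptop_ram = 8 * 1024 ** 3       # the current laptop's 8 GB, in bytes
print(laptop_ram // 1024)             # 8,388,608 -- the laptop's RAM in kilobytes
print(laptop_ram // model_1_ram)      # 524,288 -- how many times the Model 1's RAM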

It was powered by a Zilog Z-80 eight-bit processor. My current machines all run 64-bit, multi-core processors. It had no USB ports, didn’t use a mouse, and had no audio card. Smartphones today are more versatile and more powerful. But not as much fun.

Before I bought it, I debated for a week or two whether to get the TRS or the competing Commodore PET, powered by the 6502 processor. The PET had similar limitations in memory and input devices, but came with a green-on-black screen integrated with the keyboard in one unit. But the TRS was sold at a nearby Radio Shack store within walking distance, and the store also offered evening classes to teach the basics. The PET was only sold at stores downtown, so I bought the closer one.

I had to boot it and load programs from a cassette tape player. A year or so later, I upgraded to a 64KB RAM system and dual floppy (5.25″) drives. Each floppy could hold about 160KB of programs or data. It had a standalone B&W monitor with no real graphics capability, although canny programmers used the block characters in its character set to create pseudo-graphics (a bit like today’s Dwarf Fortress game displays, but only in B&W).
Continue reading “Forty years of geekitude”

Microsoft killed solitaire for me

Solitaire – also known as Klondike and Patience – is a very popular game on computers. So popular, in fact, that a version of this 200-year-old card game has been included by Microsoft in every version of Windows since 3.0 (1990), aside from a brief hiatus with Windows 8 (a gap filled by third-party versions). Microsoft has even launched a version for iOS, playable on the iPhone and iPad.

And according to some reports, it is the most widely used program among Windows users, by a long shot. More than Word, Outlook, PowerPoint, and Explorer. Writer Luke Plunkett called that statistic “frightening.”

But for millions of us, solitaire fills the time; it occupies our brains during long travel times, in waiting rooms, and in the pauses while something loads, downloads, burns to disk, or compiles. Not just the one game: there is a whole raft of solo card games under the name solitaire – FreeCell, Spider, Klondike, Pyramid and TriPeaks among them – that people play regularly. And sometimes obsessively. Many is the time I have stopped writing this blog or some other piece, trapped by writer’s block or simple exhaustion, to while away a few minutes recharging with a simple game of solitaire.

As Plunkett wrote:

You mention Solitaire and—after the amazing end-game card haze—the first thing that pops into your head is that it was once seen as the single biggest threat to office productivity facing this planet’s workers. And in many regards, that’s correct.
Most people who have worked in an office can testify to the lure of the game, and could name one or two colleagues who spent a little too much time cutting the decks when they should have been filing reports. Some even take it too far; in 2006, New York Mayor Michael Bloomberg famously fired a city employee he caught playing the game while at work.
This addiction can even spread beyond the workplace and into people’s homes. My father has spent more time playing Freecell over the past decade than he has doing housework, for example. Things can get even worse for some: in 1996, Dr. Maressa Hecht Orzack opened the world’s first clinic for computer addicts as a response to her own chronic Solitaire addiction.

In May 2008, Slate magazine ran a story titled “Solitaire-y Confinement: Why we can’t stop playing a computerized card game.” In it, author Josh Levin wrote:

The game’s continued pre-eminence is a remarkable feat—it’s something akin to living in a universe in which Pong were the most-popular title for PlayStation 3. One reason solitaire endures is its predictability. The gameplay and aesthetic have remained remarkably stable; a visitor from the year 1990 could play the latest Windows version without a glitch, at least if he could figure out how to use the Start menu. It also remains one of the very few computer programs, game or nongame, that old people can predictably navigate. Brad Fregger, the developer of Solitaire Royale, the first commercial solitaire game for the Macintosh and the PC, told me that his 89-year-old mother still calls regularly to brag about her high scores. The game has also maintained a strong foothold in the modern-day cubicle.

So, given its widespread popularity – a game beloved by millions, maybe even billions – you have to wonder why Microsoft seems bent on destroying the experience in Windows 10. Levin calls solitaire the “…cockroach of gaming, remarkably flexible and adaptable.” Perhaps Microsoft wants to stamp it out.
Continue reading “Microsoft killed solitaire for me”

Back to black

I had noticed of late that several websites are more difficult to read because they have opted to use a lighter grey text instead of a more robust black. But it didn’t dawn on me that it wasn’t my aging eyes: this was a trend. That is, until I read an article on Backchannel called “How the Web Became Unreadable.”

It’s a good read for anyone interested in typography, design and layout – and not just on the Web, but in print as well. It makes several good points about contrast, including some important technical details about how it is measured.
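For the curious, the usual yardstick is the WCAG contrast ratio, which compares the relative luminance of text and background and runs from 1:1 (identical colours) to 21:1 (black on white). Here is a rough sketch of the calculation in Python; the hex colour values at the end are just illustrative, not taken from any particular site:

def relative_luminance(hex_colour):
    # Convert an sRGB hex colour such as "#777777" to relative luminance (per WCAG 2.x).
    hex_colour = hex_colour.lstrip("#")
    channels = [int(hex_colour[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    # Undo the sRGB gamma curve, then weight the channels for human perception.
    linear = [c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4 for c in channels]
    r, g, b = linear
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(foreground, background):
    # Ratio of the lighter luminance to the darker, each offset by 0.05.
    l1, l2 = relative_luminance(foreground), relative_luminance(background)
    return (max(l1, l2) + 0.05) / (min(l1, l2) + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 1))   # 21.0 -- black on white
print(round(contrast_ratio("#777777", "#ffffff"), 1))   # about 4.5 -- mid-grey on white
print(round(contrast_ratio("#aaaaaa", "#ffffff"), 1))   # about 2.3 -- pale grey on white

The WCAG AA guideline asks for at least 4.5:1 for body text, which that mid-grey on white only just manages; anything paler fails outright.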

I’ve written in the past about how contrast is important in design (here, and here for example). But apparently there’s a design trend of late away from contrast towards murkiness. In his article, author Kevin Marks notes:

There’s a widespread movement in design circles to reduce the contrast between text and background, making type harder to read. Apple is guilty. Google is, too. So is Twitter.

Others have noticed this too, even before Marks. In 2015, Katie Sherwin wrote on the Nielsen Norman Group’s site:

A low-contrast design aesthetic is haunting the web, taking legibility and discoverability with it. It’s straining our eyes, making us all feel older, and a little less capable. Lured by the trend of minimalism, sites are abandoning their high-contrast traditions and switching to the Dark Side (or should I say, the Medium-Gray Side). For sites willing to sacrifice readability for design prowess, low-contrast text has become a predictable choice, with predictable, persistent usability flaws.

This trend surprises and distresses me because it seems singularly user-hostile: anti-ergonomic and contrary to the whole point of the internet. Apparently it’s part of a minimalist design aesthetic. Now, I don’t mind clean, uncluttered web pages, but I balk at making them unreadable. Pale grey text reduces both accessibility and legibility.

Continue reading “Back to black”