Synecdoche, Universe

No Man's Sky
In the delightfully quirky, postmodern film Synecdoche, New York, the late Philip Seymour Hoffman plays a theater director obsessed with creating a set that realistically represents New York City for his increasingly ambitious stage production. But as he tries to incorporate more and more people and bits that represent the city, the set grows and grows into a micro-city itself. As Wikipedia describes it:

The plot follows an ailing theater director (Hoffman) as he works on an increasingly elaborate stage production whose extreme commitment to realism begins to blur the boundaries between fiction and reality. The film’s title is a play on Schenectady, New York, where much of the film is set, and the concept of synecdoche, wherein a part of something represents the whole, or vice versa.

I feel much the same thinking and obsession went into the creation of No Man’s Sky, a sandbox (“action-adventure survival,” plus trading, exploration, fighting, gathering, building, mining, refining, upgrading, flying, meeting aliens, and more) science fiction computer game of enormous size and scope that attempts to cram everything imaginable into one game.  Synecdoche, Universe might be a suitable nickname for this sprawling, all-encompassing game.* Again from Wikipedia:

Players are free to perform within the entirety of a procedurally generated deterministic open world universe, which includes over 18 quintillion planets… nearly all parts of the galaxy, including stars, planets, flora and fauna on these planets, and sentient alien encounters, are created through procedural generation…

Eighteen quintillion? That’s 18,000,000,000,000,000,000. Beyond comprehension. I can’t vouch for anything close to that number, since in about 25 hours of play, I’ve only been to five or six planets in No Man’s Sky (NMS).
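As an aside for the technically minded: 18 quintillion is roughly 2^64 (about 1.8 × 10^19), the kind of number you get when worlds are indexed by a 64-bit value rather than stored on disk. Hello Games has not published its generator, so the sketch below is only a toy illustration of the general principle of deterministic procedural generation; the property names and ranges are invented for the example.

```python
import hashlib
import random

def planet_properties(galaxy_seed: int, planet_index: int) -> dict:
    """Derive a planet's traits deterministically from a galaxy seed and a
    planet index: the same inputs always reproduce the same world, so
    nothing needs to be stored on disk."""
    # Mix seed and index into a stable 64-bit value, then use it to seed a PRNG.
    digest = hashlib.sha256(f"{galaxy_seed}:{planet_index}".encode()).digest()
    rng = random.Random(int.from_bytes(digest[:8], "big"))

    # Illustrative properties only; not No Man's Sky's actual attributes.
    return {
        "radius_km": rng.randint(1_500, 8_000),
        "gravity_g": round(rng.uniform(0.3, 2.5), 2),
        "biome": rng.choice(["lush", "frozen", "toxic", "barren", "scorched"]),
        "has_fauna": rng.random() < 0.6,
    }

# The same seed and index always yield the same planet, on any machine.
print(planet_properties(galaxy_seed=42, planet_index=1_000_000_007))
```

The point is that the universe never has to exist all at once: any planet can be reconstructed, identically, the moment someone flies to it.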

My first four game starts (three on similarly difficult planets, one sandbox in a more habitable clime) were all just learning experiences that, after fumbling, failing, and even dying, I deleted after only a few hours each. My current game accounts for more than half of my logged time, spent entirely on one planet apart from a couple of short visits to a nearby orbital space station. Most of that time has been running or walking around, exploring. I’ll come back to that. Meanwhile, I’m still poking about on that single planet while the rest of the universe awaits.

Continue reading “Synecdoche, Universe”

The Long Read Lost

Reading by candlelight
“What we read, how we read, and why we read change how we think, changes that are continuing now at a faster pace,” wrote Maryanne Wolf, a neuroscientist, in her book Reader, Come Home: The Reading Brain in the Digital World (Harper Paperbacks, 2019). It’s the sequel to her previous book on reading and neuroscience, Proust and the Squid (Harper, 2007). In that earlier book, Wolf famously wrote,

We are not only what we read, we are how we read.

Reading — Marcel Proust called it a “fertile miracle of communication effected in solitude” — is a breathtakingly remarkable and uniquely human talent, yet one we have no genetic predisposition for, as we do for speaking or for social behaviour. No one is born knowing how to read. It must be learned by each of us individually, painstakingly, from a young age, and practiced through a lifetime. It is the classic case of nurture over nature. Yet there are an estimated 800 million illiterate people in the world today.

Learning to read changes our brains, rewires our neural networks, creates new connections, and helps us think. Not in a metaphorical sense: the changes have been mapped by neuroscientists like Wolf and her colleagues. Yet reading (and its companion inventions, writing and the alphabet, the latter even younger at a mere 3,800 years old) is a very recent talent, historically speaking. The oldest known records of writing are a mere 5,500 years old; the oldest surviving Sumerian literary tablets are about 4,400 years old. The first complete alphabet (ancient Greek, with symbols for vowels as well as consonants) dates from around 750 BCE. In modern times, the first book was produced on a Western printing press only about 570 years ago. That’s a remarkably short time in the 300,000-400,000-year history of our species.

“In a span of only six millennia reading became the transformative catalyst for intellectual development within individuals and within literate cultures,” Wolf added. Right from the beginning of writing, stories were part of the written record: the imaginations of ancient civilizations were carved on clay and in stone, for us to read even today.

Literate cultures. The term might refer to cultures with a reasonably high level of basic literacy, regardless of what is read, but it could also refer to a civilization with a culture of deep, passionate, and lengthy reading: one that celebrates books, poetry, magazines, and other forms of the written word. It’s a civilization that has book clubs, that discusses and shares books, that has public libraries and bookstores and poetry festivals, that performs plays and celebrates authors. A literate culture even has cursive writing (something of a canary in the coal mine of literacy).

We are such a culture, even though — at least from my perspective — we continue to move at an accelerating pace toward a more visually-oriented, non-reading culture, away from the written form; a short-form culture where the tweet, the sound bite, and the YouTube video all have more reach than a long article or story. Our attachment to many of the longer written forms is dissipating. Long reads online are often prefaced by the initialism TL;DR (“Too Long; Didn’t Read”), with a tweet-sized précis for those who will not (or cannot) read the longer piece.

The quality of our reading is not only an index of the quality of our thought, it is our best-known path to developing whole new pathways in the cerebral evolution of our species. There is much at stake in the development of the reading brain and in the quickening changes that now characterize its current, evolving iterations. (P. 2)

We live in an astoundingly complex, complicated, demanding, challenging world. To understand it even at a very basic level, we need to be able to read, and read deeply, not simply watch videos or read tweets. We need to ignore the noise of social media and open books, newspapers (real newspapers, not merely the local ad-wrappers), and magazines to get a fuller explanation of what is happening in our lives. No one can understand or learn about politics, economics, or science from tweets.

Not reading deeply is plunging us into an increasingly anti-intellectual age, suspicious of learning and science. We have world leaders who are barely literate or are even functionally illiterate, and yet who take pride in their ignorance. The result is the proliferation of conspiracy cults, pseudoscience, anti-mask and anti-vaccination movements, and both political and religious fundamentalism (most of which claptrap, not surprisingly, originates from the right wing of the political spectrum).

And it’s not just Donald Trump, although he is the epitome of the illiterate, uninformed, conspiracy-addled leader. Look at the leaders of Turkey, Brazil, Hungary, India, the Philippines, and even here in Ontario: populist (right-wing) leaders like these share similar attributes, including a distrust of institutions, science, and experts. I’ve served with members of our local municipal council who never even read agendas or staff reports, let alone books (we now have a council replete with such non-readers). The result at all levels of government is evident in the decay of public debate, the reduction to populist, slogan-based oratory, slovenly and uninformed decision-making, and lackluster governance. But I digress.

Continue reading “The Long Read Lost”

The day that reason died

Aliens, sort of
I’m not a believer in alien visitations and UFOs, but I’ll bet if an alien did swing by, after an hour or two observing us, checking out Facebook or Twitter, they’d lock their doors, hang a detour sign around our planet, and race off. They’d tell their friends not to visit us because we were all nuts. Scarily, dangerously crazy.

Seriously. What sort of world can be called civilized when it has people touting — and believing — homeopathy? Reiki? Chemtrails? Anti-vaccination screeds? Anti-mask whines during a frigging pandemic? Wind turbines cause cancer? 5G towers spread COVID-19? Creationism? Reflexology? Alien abductions? Crop circles? Astrology? Crystal healing? Ghosts? Flat earth? Bigfoot? Psychics? Ayurveda? Nigerian generals offering us free money? Palmistry? David Avocado Wolfe? David Icke? Gwyneth Paltrow? Donald Trump? Alex Jones? The Food Babe? Televangelists? Ken Ham? You have to be really hard-of-thinking or massively gullible to fall for any of it. But we do, and we fall for it by the millions.

And that doesn’t include the baseless, puerile crap like racism, homophobia, misogyny, pedophilia, anti-Semitism, radical religion, trickle-down economics, and nationalism, all of which evils remain rampant despite concerted efforts to educate people since the Enlightenment. Little wonder aliens wouldn’t want to be seen here.

Why would they want to land on a planet of such extreme hypochondriacs, who one day are happily eating muffins and bread, and the next day, by the millions, suddenly develop gluten “sensitivity” or even “allergies” right after some pseudo-wellness guru pronounces gluten an evil that is killing them? Or who self-diagnose with whatever illness or pseudo-illness they last saw in a YouTube video? Or who go ballistic over being asked to wear a mask for public safety despite its very minor inconvenience? Or who refuse to get a vaccination to help develop herd immunity and would prefer their children suffer the illness instead?

Despite all the efforts, despite science, logic, rational debate, medicine, facts, and common sense (which is not common at all these days), everything has been downgraded into mere opinion. Everyone has a right to an opinion, we say (which is politically correct bollocks), and we respect their opinion (even if it’s toxic bullshit or simply batshit crazy, or in Donald Trump’s case, both). All opinions get equal weight and consideration, especially on social media, where people will eagerly agree with anything that confirms their existing beliefs that the world is out to get them, or that makes them feel special.

Who should you believe in this dark age of anti-science and anti-intellectualism: unemployed, high-school-dropout Bob, who lives in his parents’ basement and watches porn in his PJs when he’s not cranking out conspiracy videos, or Dr. Fauci, an award-winning physician, medical researcher, epidemiologist, and immunologist who has dedicated his whole life to public health care, with more than five decades of experience in the field, who has served as director of the National Institute of Allergy and Infectious Diseases (NIAID) since 1984, and is considered one of the world’s leading experts on infectious diseases? But there are two sides to every issue, cry Bob’s followers (by the way, there aren’t: that’s another stupid fallacy), who rush to share Bob’s latest video about why you don’t need to wear a mask during a pandemic, and how you’ll develop immunity if we all just cough on one another. What do experts know, they ask. Bob speaks for us; he’s one of us. We trust Bob, not the elitist guy with the string of degrees. And even if we do get sick, we can just drink some bleach because our president said it will cure us.

Doomed. We are so fucking doomed when wingnuts like Bob (or Trump) get any traction. But there’s Gwyneth Paltrow doing a Netflix series to promote her batshit crazy ideas about health and wellness, and women shovelling their money at her to buy her magic stones to stuff into their vaginas. Bob is just a small, sad voice compared to the commercial money harvesting machines that Paltrow, Wolfe, and Vani Hari are. Doomed, I tell you.

While a lot of hokum has been around for ages, I’ve often wondered if there was some recent, seminal event that caused it to explode as it has into every corner of the world. Sure, the internet is the conduit for most of the codswallop these days, but was there something earlier that started the tsunami of ignorance, bile, anti-intellectualism, incivility, and bullshit? Was there a tipping point when reason sank and cranks went from bottom-feeding fringe to riding the surface?

Maybe — I think I’ve found it: August 22, 1987.

Continue reading “The day that reason died”

Johnson’s words

Samuel Johnson
I have recently been reading through the David Crystal anthology of words from Samuel Johnson’s dictionary (Penguin, 2006), attempting to cross-reference it with entries in the Jack Lynch anthology (Levenger Press, 2004), comparing how the two editors chose their selections, and seeing how the book designers chose to present them. Yes, I know: reading dictionaries isn’t a common pastime, but if you love words, then you do it.

In part, I’m doing so for the sheer delight of the reading (Johnson’s wit shines through in so many of the entries), and as a measure of the differences in book design, but also with an odd project in mind: The Word-of-the-Day From Johnson. I had the notion of transcribing a single word at random every day and posting it online and on social media. Not something that seems to have been done before, as far as I can tell.
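The mechanics of such a project would be easy to automate. Below is a minimal sketch, assuming the chosen entries have already been transcribed into a simple mapping from headword to definition; the three sample definitions are abridged versions of Johnson’s best-known entries and stand in only as placeholders.

```python
import datetime
import random

# Placeholder entries; a real run would use the full set of transcriptions.
ENTRIES = {
    "oats": "A grain, which in England is generally given to horses, "
            "but in Scotland supports the people.",
    "lexicographer": "A writer of dictionaries; a harmless drudge.",
    "patron": "Commonly a wretch who supports with insolence, "
              "and is paid with flattery.",
}

def word_of_the_day(today: datetime.date) -> tuple[str, str]:
    """Pick one entry 'at random' but deterministically per date, so the
    same day always yields the same word."""
    rng = random.Random(today.toordinal())  # seed the PRNG with the date
    word = rng.choice(sorted(ENTRIES))      # sort for a stable ordering
    return word, ENTRIES[word]

word, definition = word_of_the_day(datetime.date.today())
print(f"{word.upper()}: {definition}")
```

Seeding the random pick with the date keeps the choice reproducible, so the “word of the day” is the same no matter when or where the script runs that day.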

I’ve previously written about how much I enjoy Johnson’s dictionary, and how I recommend it to anyone who enjoys reading, not merely bibliophiles, logophiles, and lexicographers. However, there is no reasonably priced version of the complete dictionary with its 40,000-plus entries, just various selections. As good as the abridgments are, readers will soon ache, as I do, to read more than the limited number of definitions they provide.

Continue reading “Johnson’s words”

Goodbye, Information Age

Fake news
“Say goodbye to the information age: it’s all about reputation now,” is the headline of an article by Italian philosopher and professor Gloria Origgi, published recently on Aeon Magazine’s website.

She writes:

…the vastly increased access to information and knowledge we have today does not empower us or make us more cognitively autonomous. Rather, it renders us more dependent on other people’s judgments and evaluations of the information with which we are faced.

I no longer need to open a computer, go online, and type my questions into Google if I want to know something: I can simply ask my Google Home aloud. “Hey Google, what’s the population of China?” or “Hey Google, who’s the mayor of Midland, Ontario?” or “Hey Google, how many lines are in Hamlet?” Google will answer with all the data. If I ask, “Hey Google, what are the headlines this morning?” it will play a recent CBC newscast.

Google Home can, however, only give me a summary, a snippet, a teaser. Should I want to delve deeper or into more than one question, I still need to go online and search. And that leads me into the information swamp that is the internet. How do I sort it all out?

The way we access information has changed as radically as the amount available to us. Just look at the Cambridge Dictionary’s Word of the Year for 2018: “nomophobia,” which means “a fear or worry at the idea of being without your mobile phone or unable to use it.”

Describing a typical day in his life, Dan Nixon writes of how we isolate ourselves with our phones, imagining they are instead connecting us:

…the deluge of stimuli competing to grab our attention almost certainly inclines us towards instant gratification. This crowds out space for the exploratory mode of attention. When I get to the bus stop now, I automatically reach for my phone, rather than stare into space; my fellow commuters (when I do raise my head) seem to be doing the same thing.

What could there be that is so engaging on the phone that the writer cannot use the time to, say, think? Read? Observe? Communicate with his fellow travellers? Eleven studies found that “…participants typically did not enjoy spending 6 to 15 minutes in a room by themselves with nothing to do but think, that they enjoyed doing mundane external activities much more, and that many preferred to administer electric shocks to themselves instead of being left alone with their thoughts.” The phone serves as a personal barrier to interaction instead of facilitating it. It’s a feedback loop: making it seem we are “doing something” by giving us a sensory response, while making it seem that simply thinking is “doing nothing.”

“Nothing, to my way of thinking, is better proof of a well-ordered mind than a man’s ability to stop just where he is and pass some time in his own company.”
Seneca, Letter II to Lucilius, trans. Robin Campbell, Penguin Classics: Letters from a Stoic, 2004.

Continue reading “Goodbye, Information Age”

Dictionary vs Dictionary.com

Concise OED
Did you know that doxastic is a philosophical adjective relating to an individual’s beliefs? Or that doxorubicin is an antibiotic used in treating leukemia? Or that doxy is a 16th-century word for mistress and prostitute? That drack is Australian slang for unattractive or dreary? Drabble means to make wet and dirty in muddy water? A downwarp is a broad depression in the earth’s surface? Drail is a weighted fish hook? Dragonnade means quartering troops on a population, while dragonet is a small fish, but a dragoman is an interpreter? That a dramaturge is a literary editor on a theatre staff?

These are words I read when I was looking up the word doxology last night. They all appear close to doxology, either on the same or an adjacent page. Anyone with even a modicum of curiosity opening a dictionary can find these and other words in their search for the meaning of an unfamiliar or uncommon word. In fact, it’s quite entertaining to simply open a dictionary at any random page and read, because you are likely to learn something new each time (well, perhaps less so if you use one of the generic no-name dictionaries bought in a box store).

My bedside dictionary is the Concise Oxford, but I also have several other Oxford editions, a Random House, a Merriam-Webster, and a Chambers, plus some others. I often refer to several for a more comprehensive understanding of a word. And yes, I do keep one by the bed because I read a lot before sleep and sometimes encounter unfamiliar words. Oxford because it’s simply the best: I like the layout and typography, and it’s English, not American.
Continue reading “Dictionary vs Dictionary.com”

Of mice and men, and trackballs, too

All the mice tested
Top: Aukey, Steelcase (wired gaming). Bottom: Anker, EV, Logitech, Kensington (from the left, seen from the back)

Late last year, I purchased another laptop to separate my work and recreational uses. After a long search in stores, and a lot of online reading and comparing of models, I decided to get an MSI gaming rig (an entry-level model in their pantheon, admittedly). That process got me thinking again about how we buy and sell computers.*

Computers are, for the most part, sold like muscle cars: what’s under the hood gets the attention. The processor, RAM, speed, and drive capacity all get writ large in ads and promoted in stores. But it’s all a bit misleading. For most uses – surfing the web, email, some word processing or spreadsheet work, non-graphics-intensive games, shopping on Amazon, that sort of thing – any modern computer or tablet will do.

Today’s smartphones and tablets have bazillions of times more processing power in a single handheld device than a room full of bulky, freezer-sized IBM 360s had a few decades back. I ran games, word processors, spreadsheets, and more on Atari and other eight-bit computers that couldn’t out-compute a modern digital watch, let alone an i3 laptop (and that’s the weak sibling of the i5 and i7). Those seemingly limited Chromebooks and bargain-priced laptops are really monsters of computing muscle compared to what we used only a couple of decades back.

Yes, the hardware specs matter if you have processor-heavy work such as graphic design, video or music editing, 3D animation, or graphics-intensive gaming. But for the most part, what should really matter when you’re choosing a computer is where you interact with it the most: the input/output devices, meaning the screen, the keyboard, and the mouse or trackpad. That’s where you’ll spend the most time and get the most sensory feedback.

All the mice tested
Top: Aukey, Steelcase (wired gaming). Bottom: Anker, EV, Logitech, Kensington (from the left, seen from the front)

I recently decided to change my mouse. Or mice, rather, since each laptop has its own. In part it’s because, after many hours a day spent with one, my wrists and fingers can be tired and sore. I only use the built-in trackpads when I don’t have access to a mouse, because I find them inefficient and clumsy.

Arm twisting
I’ve favoured wired gaming mice in the past for several reasons. First, a wired connection is consistent where a wireless one might be susceptible to interference (and gaming mice have excellent, if often long, cables). Second, a gaming mouse usually has a lot more features than a regular mouse, including programmable buttons and more levels of speed and sensitivity. Third, they offer better (longer-lasting) buttons and scroll wheels, built for constant clicking and wheeling. And fourth, from long experience, I’ve learned not to buy the cheapest mice: they are generally less durable and less accurate than those from recognized companies.***

Traditional mice have the same basic problem for me and many other users: they force the user’s arm to be held for long periods in a position that can encourage strain and wear. Part of my work includes graphic design that needs precision control, and part includes copying and pasting text and links from one monitor to applications on another, so the cursor travels a fair distance. More standard uses include document processing in word processors and spreadsheets. I’m on the computer many, many hours every day. And I find my arms and wrists hurting all too much these days.

I decided to look at something different: ergonomic mice, including vertical mice and trackballs. Here’s what I discovered, and my review of each.
Continue reading “Of mice and men, and trackballs, too”

Forty years of geekitude

TRS-80 Model 1
It was forty years ago this fall, in 1977, that I bought my first computer. I had little experience with computers prior to that – a few weeks working after hours on an APL system at the U of T, mostly to play games against the machine, and reading a few magazine articles on the coming ‘personal’ computer wave. Nothing seriously hands-on, experience-wise, and no programming skills either. But as soon as I saw one, I had to have it. And so I bought one.

Since then, I have not been a day without one, and generally had more than just one in my home. As many as six or seven at one time, back in the early 1980s, all different brands. But that was when I was writing about them, editing computer books and writing computer manuals.

My first computer was a TRS-80 Model 1. TRS stood for Tandy Radio Shack. It was a 16KB computer (yes: that’s 16,384 bytes of memory). In comparison, my current laptop has 8GB, or 8,388,608 kilobytes: 524,288 times the Model 1’s amount of RAM!
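A quick back-of-the-envelope check of that comparison, using binary kilobytes throughout (this is just arithmetic, nothing specific to either machine):

```python
trs80_ram_kb = 16                     # TRS-80 Model 1: 16 KB (16,384 bytes)
laptop_ram_kb = 8 * 1024 * 1024       # 8 GB expressed in kilobytes: 8,388,608 KB
print(laptop_ram_kb // trs80_ram_kb)  # 524288, i.e. over half a million times the RAM
```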

It was powered by a Zilog Z-80 eight-bit processor. My current machines all run 64-bit, multi-core processors. It had no USB ports, didn’t use a mouse, and had no audio card. Smartphones today are more versatile and more powerful. But not as much fun.

Before I bought it, I debated for a week or two whether to get the TRS or the competing Commodore PET, powered by the 6502 processor. It had similar limitations in memory and input devices, but came with a green and black screen integrated with the keyboard in one unit. But the TRS was sold at a nearby Radio Shack store within walking distance, and they also offered nighttime classes to teach the basics. The PET was only sold at stores downtown, so I bought the closer one.

I had to boot it and load programs from a cassette tape player. A year or so later, I upgraded to a 64KB RAM system and dual floppy (5.25″) drives. Each floppy could hold about 160KB of programs or data. It had a standalone B&W monitor that didn’t have any graphics capability, although canny programmers used the block characters in its character set to create pseudo-graphics (a bit like today’s Dwarf Fortress game displays, but only in B&W).
Continue reading “Forty years of geekitude”

Microsoft killed solitaire for me

Solitaire – also known as Klondike and Patience – is a very popular game on computers. So popular, in fact, that a version of this 200-year-old card game has been included by Microsoft in every version of Windows since 3.0 (1990), aside from a brief hiatus with Windows 8 (a gap filled by third-party versions). Microsoft has even launched a version for iOS, playable on the Mac, iPhone, and iPad.

And according to some reports, it is the most widely used program among Windows users, by a long shot. More than Word, Outlook, PowerPoint, and Explorer. Writer Luke Plunkett called that statistic “frightening.”

But for millions of us, solitaire fills the time; it occupies our brains during long travel times, in waiting rooms, and while waiting for something to load, download, burn to disk, or compile. Not just the one game: there is a whole raft of solo card games under the name solitaire – freecell, spider, Klondike, pyramid, and tri-peaks among them – that people play regularly. And sometimes obsessively. Many is the time I have stopped writing this blog or some other piece, trapped by writer’s block or simple exhaustion, to while away a few minutes recharging with a simple game of solitaire.

As Plunkett wrote:

You mention Solitaire and—after the amazing end-game card haze—the first thing that pops into your head is that it was once seen as the single biggest threat to office productivity facing this planet’s workers. And in many regards, that’s correct.
Most people who have worked in an office can testify to the lure of the game, and could name one or two colleagues who spent a little too much time cutting the decks when they should have been filing reports. Some even take it too far; in 2006, New York Mayor Michael Bloomberg famously fired a city employee he caught playing the game while at work.
This addiction can even spread beyond the workplace and into people’s homes. My father has spent more time playing Freecell over the past decade than he has doing housework, for example. Things can get even worse for some: in 1996, Dr. Maressa Hecht Orzack opened the world’s first clinic for computer addicts as a response to her own chronic Solitaire addiction.

In May 2008, Slate magazine ran a story titled “Solitaire-y Confinement: Why we can’t stop playing a computerized card game.” In it, author Josh Levin wrote:

The game’s continued pre-eminence is a remarkable feat—it’s something akin to living in a universe in which Pong were the most-popular title for PlayStation 3. One reason solitaire endures is its predictability. The gameplay and aesthetic have remained remarkably stable; a visitor from the year 1990 could play the latest Windows version without a glitch, at least if he could figure out how to use the Start menu. It also remains one of the very few computer programs, game or nongame, that old people can predictably navigate. Brad Fregger, the developer of Solitaire Royale, the first commercial solitaire game for the Macintosh and the PC, told me that his 89-year-old mother still calls regularly to brag about her high scores. The game has also maintained a strong foothold in the modern-day cubicle.

So, given its widespread popularity, a game beloved by millions and maybe even billions, you have to wonder why Microsoft seems bent on destroying the experience in Windows 10. Levin calls solitaire the “…cockroach of gaming, remarkably flexible and adaptable.” Perhaps Microsoft wants to stamp it out.
Continue reading “Microsoft killed solitaire for me”

Back to black

Grey scales
I had noticed of late that several websites are more difficult to read, having opted to use a lighter grey text instead of a more robust black. But it didn’t dawn on me that it wasn’t my aging eyes: it was a trend. That is, until I read an article on Backchannel called “How the Web Became Unreadable.”

It’s a good read for anyone interested in typography, design, and layout – and not just on the web, but in print as well. It makes several good points about contrast, including some important technical details about how contrast is measured.
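For anyone curious what “measuring contrast” actually involves: most accessibility tools use the WCAG 2.x definition, in which each colour is converted to a relative luminance and the two luminances (each offset by 0.05) are divided to give a ratio between 1:1 and 21:1. The article may present it differently; this is simply a sketch of that standard formula:

```python
def relative_luminance(r: int, g: int, b: int) -> float:
    """Relative luminance of an sRGB colour, per the WCAG 2.x definition."""
    def linearize(channel: int) -> float:
        c = channel / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = (linearize(c) for c in (r, g, b))
    return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    l1, l2 = relative_luminance(*fg), relative_luminance(*bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white page is the 21:1 maximum; a fashionable light grey
# (#AAAAAA) on white falls well below WCAG's 4.5:1 minimum for body text.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))        # 21.0
print(round(contrast_ratio((170, 170, 170), (255, 255, 255)), 1))  # about 2.3
```

WCAG recommends a ratio of at least 4.5:1 for normal body text, a bar that much of the trendy light grey on white fails by a wide margin.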

I’ve written in the past about how contrast is important in design (here and here, for example). But apparently there’s a design trend of late away from contrast and towards murkiness. In his article, author Kevin Marks notes:

There’s a widespread movement in design circles to reduce the contrast between text and background, making type harder to read. Apple is guilty. Google is, too. So is Twitter.

Others have noticed this too, even before Marks. In 2015, Katie Sherman wrote on the Nielsen Norman Group’s site:

A low-contrast design aesthetic is haunting the web, taking legibility and discoverability with it. It’s straining our eyes, making us all feel older, and a little less capable. Lured by the trend of minimalism, sites are abandoning their high-contrast traditions and switching to the Dark Side (or should I say, the Medium-Gray Side). For sites willing to sacrifice readability for design prowess, low-contrast text has become a predictable choice, with predictable, persistent usability flaws.

This trend surprises and distresses me because it seems singularly user-hostile: anti-ergonomic and contrary to the whole point of the internet. Apparently it’s part of a minimalist design fashion. Now, I don’t mind clean, uncluttered web pages, but I balk at making them unreadable. Pale grey text reduces both accessibility and legibility.

Continue reading “Back to black”