Synecdoche, Universe

No Man's Sky
In the delightfully quirky, postmodern film Synecdoche, New York, the late Philip Seymour Hoffman plays a theatre director obsessed with creating a set that realistically represents New York City for an increasingly ambitious stage production. But as he tries to incorporate more and more of the people and pieces that make up the city, the set grows and grows into a micro-city itself. As Wikipedia describes it:

The plot follows an ailing theater director (Hoffman) as he works on an increasingly elaborate stage production whose extreme commitment to realism begins to blur the boundaries between fiction and reality. The film’s title is a play on Schenectady, New York, where much of the film is set, and the concept of synecdoche, wherein a part of something represents the whole, or vice versa.

I feel much the same thinking and obsession went into the creation of No Man’s Sky, a sandbox (“action-adventure survival,” plus trading, exploration, fighting, gathering, building, mining, refining, upgrading, flying, meeting aliens, and more) science fiction computer game of enormous size and scope that attempts to cram everything imaginable into one game. Synecdoche, Universe might be a suitable nickname for this sprawling, all-encompassing game.* Again from Wikipedia:

Players are free to perform within the entirety of a procedurally generated deterministic open world universe, which includes over 18 quintillion planets… nearly all parts of the galaxy, including stars, planets, flora and fauna on these planets, and sentient alien encounters, are created through procedural generation…

Eighteen quintillion? That’s 18,000,000,000,000,000,000. Beyond comprehension. I can’t vouch for anything close to that number, since in about 25 hours of play, I’ve visited only five or six planets in No Man’s Sky (NMS).
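
It’s probably no coincidence that the number is 18 quintillion: 2^64 is about 1.8 × 10^19, which suggests every planet is identified by a 64-bit seed. In a deterministic procedural universe, nothing is stored; each world is recomputed, identically, whenever anyone visits its coordinates. Here is a minimal sketch of the idea in Python; the attribute names and mixing constants are my own illustrative inventions, not Hello Games’ actual code:

```python
import random

def planet(x: int, y: int, z: int) -> dict:
    """Derive a planet's traits purely from its galactic coordinates.

    The coordinates are folded into a single 64-bit seed; a 64-bit
    seed space holds 2**64 (about 1.8e19 -- the "18 quintillion")
    distinct possible worlds.
    """
    # Classic spatial-hash mix (illustrative constants), masked to 64 bits.
    seed = ((x * 73856093) ^ (y * 19349663) ^ (z * 83492791)) & (2**64 - 1)
    rng = random.Random(seed)  # deterministic: same seed, same planet
    return {
        "radius_km": rng.randint(1_000, 10_000),
        "gravity_g": round(rng.uniform(0.3, 2.5), 2),
        "toxic": rng.random() < 0.4,   # plenty of hostile starting worlds
        "fauna_species": rng.randint(0, 30),
    }

# Every player who "visits" these coordinates sees the same world,
# even though it was never saved anywhere:
assert planet(42, -7, 1138) == planet(42, -7, 1138)
print(planet(42, -7, 1138))
```

The trick is that storage is traded for computation: the universe is a function, not a database, so its size is bounded by the seed space rather than by any disk.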

My first four game starts (three on similarly difficult planets, one a sandbox in a more habitable clime) were all just learning experiences that I deleted after fumbling, failing, and even dying, having played only a few hours each. My current game accounts for more than half of my logged play time, spent entirely on one planet apart from a couple of short visits to a nearby orbital space station. Most of my time on this planet has been spent running or walking around, exploring. I’ll come back to that. Meanwhile, I’m still poking about on one planet while the rest of the universe awaits.

Continue reading “Synecdoche, Universe”

The Long Read Lost

Reading by candlelight
“What we read, how we read, and why we read change how we think, changes that are continuing now at a faster pace,” wrote Maryanne Wolf, a neuroscientist, in her book Reader, Come Home: The Reading Brain in the Digital World (Harper Paperbacks, 2019). It’s the sequel to her previous book on reading and neuroscience, Proust and the Squid (Harper, 2007). In that earlier book, Wolf famously wrote,

We are not only what we read, we are how we read.

Reading — Marcel Proust called it a “fertile miracle of communication effected in solitude” — is a breathtakingly remarkable and uniquely human talent, yet one we have no genetic predisposition for, as we do for speaking or for social behaviour. No one is born knowing how to read. It must be learned by each of us individually, painstakingly, from a young age, and practiced through a lifetime. It is the classic case of nurture over nature. Yet there are an estimated 800 million illiterate people in the world today.

Learning to read changes our brains, rewires our neural networks, creates new connections, and helps us think. Not in a metaphorical sense: the changes have been mapped by neuroscientists like Wolf and her colleagues. Yet reading (and its companion inventions, writing and the alphabet, the latter even younger at a mere 3,800 years old) is a very recent talent, historically speaking. The oldest known record of writing is a mere 5,500 years old; the oldest Sumerian tablets are about 4,400 years old. The first complete alphabet (ancient Greek, with symbols for vowels as well as consonants) dates from around 750 BCE. In modern times, the first book was produced on a Western printing press only about 570 years ago. That’s a remarkably short time in the 300,000-400,000-year history of our species.

“In a span of only six millennia reading became the transformative catalyst for intellectual development within individuals and within literate cultures,” Wolf added. Right from the beginning of writing, stories were part of the written record: the imaginations of ancient civilizations were carved on clay and in stone, for us to read even today.

Literate cultures. The term might refer to cultures with a reasonably high level of basic literacy, regardless of what gets read, but it could also refer to a civilization with a culture of deep, passionate, lengthy reading: one that delights in books, poetry, magazines, and other forms of the written word. It’s a civilization that has book clubs, discusses and shares books, has public libraries and bookstores and poetry festivals, performs plays, and celebrates its authors. A literate culture even has cursive writing (something of a canary in the coal mine of literacy).

We are such a culture, even though — at least from my perspective — we continue to move at an accelerating pace toward a more visually-oriented, non-reading culture, away from the written form; a short-form culture where the tweet, the sound bite, and the YouTube video all have more reach than a long article or story. Our attachment to many of the longer written forms is dissipating. Long reads online are often prefaced by the initialism TL;DR — “Too Long; Didn’t Read” — with a tweet-sized precis for those who will not (or cannot) read the longer piece.

The quality of our reading is not only an index of the quality of our thought, it is our best-known path to developing whole new pathways in the cerebral evolution of our species. There is much at stake in the development of the reading brain and in the quickening changes that now characterize its current, evolving iterations. (p. 2)

We live in an astoundingly complex, demanding, challenging world. To understand it even at a very basic level, we need to be able to read, and read deeply, not simply watch videos or scan tweets. We need to ignore the noise of social media and open books, newspapers (real newspapers, not merely the local ad-wrappers), and magazines to get a fuller explanation of what is happening in our lives. No one can understand or learn about politics, economics, or science from tweets.

Not reading deeply is plunging us into an increasingly anti-intellectual age, suspicious of learning and science. We have world leaders who are barely literate or are even functionally illiterate, and yet who take pride in their ignorance. The result is the proliferation of conspiracy cults, pseudoscience, anti-mask and anti-vaccination movements, and both political and religious fundamentalism (most of which claptrap, not surprisingly, originates from the right wing of the political spectrum).

And it’s not just Donald Trump, although he is the epitome of the illiterate, uninformed, conspiracy-addled leader. Look at the leaders of Turkey, Brazil, Hungary, India, the Philippines, and even here in Ontario: populist (right-wing) leaders like these share similar attributes, including a distrust of institutions, science, and experts. I’ve served with members of our local municipal council who never even read agendas or staff reports, let alone books (we now have a council replete with such non-readers). The result at all levels of government is evident in the decay of public debate, the reduction to populist, slogan-based oratory, slovenly and uninformed decision making, and lacklustre governance. But I digress.

Continue reading “The Long Read Lost”

The day that reason died

Aliens, sort of
I’m not a believer in alien visitations and UFOs, but I’ll bet if an alien did swing by, after an hour or two observing us, checking out Facebook or Twitter, they’d lock their doors, hang a detour sign around our planet, and race off. They’d tell their friends not to visit us because we were all nuts. Scarily, dangerously crazy.

Seriously. What sort of world can be called civilized when it has people touting — and believing — homeopathy? Reiki? Chemtrails? Anti-vaccination screeds? Anti-mask whines during a frigging pandemic? Wind turbines cause cancer? 5G towers spread COVID-19? Creationism? Reflexology? Alien abductions? Crop circles? Astrology? Crystal healing? Ghosts? Flat earth? Bigfoot? Psychics? Ayurveda? Nigerian generals offering us free money? Palmistry? David Avocado Wolfe? David Icke? Gwyneth Paltrow? Donald Trump? Alex Jones? The Food Babe? Televangelists? Ken Ham? You have to be really hard-of-thinking or massively gullible to fall for any of it. But we do, and we fall for it by the millions.

And that doesn’t include the baseless, puerile crap like racism, homophobia, misogyny, pedophilia, anti-Semitism, radical religion, trickle-down economics, and nationalism, all of which evils remain rampant despite concerted efforts to educate people since the Enlightenment. Little wonder aliens wouldn’t want to be seen here.

Why would they want to land on a planet of such extreme hypochondriacs, who one day are happily eating muffins and bread, and the next day millions of them suddenly develop gluten “sensitivity” or even “allergies,” right after some pseudo-wellness guru pronounces gluten an evil that is killing them? Or who self-diagnose with whatever illness or pseudo-illness they saw in the latest YouTube video? Or who go ballistic over being asked to wear a mask for public safety despite its very minor inconvenience? Or who refuse to get a vaccination to help develop herd immunity and would prefer their children suffer the illness instead?

Despite all the efforts, despite science, logic, rational debate, medicine, facts, and common sense (which is not common at all these days), everything has been downgraded into mere opinion. Everyone has a right to an opinion, we say (which is politically correct bollocks), and we respect their opinion (even if it’s toxic bullshit or simply batshit crazy, or in Donald Trump’s case, both). All opinions get equal weight and consideration, especially on social media, where people will eagerly agree with anything that confirms their existing beliefs that the world is out to get them or that makes them feel special.

Who should you believe in this dark age of anti-science and anti-intellectualism: unemployed, high-school-dropout Bob, who lives in his parents’ basement and watches porn in his PJs when he’s not cranking out conspiracy videos, or Dr. Fauci, an award-winning physician, medical researcher, epidemiologist, and immunologist who has dedicated his whole life to public health care, with more than five decades’ experience in the field, who has served as director of the National Institute of Allergy and Infectious Diseases (NIAID) since 1984, and is considered one of the world’s leading experts on infectious diseases? But there are two sides to every issue, cry Bob’s followers (by the way, there aren’t: that’s another stupid fallacy) as they rush to share Bob’s latest video about why you don’t need to wear a mask during a pandemic, and how you’ll develop immunity if we all just cough on one another. What do experts know, they ask. Bob speaks for us; he’s one of us. We trust Bob, not the elitist guy with the string of degrees. And even if we do get sick, we can just drink some bleach, because our president said it will cure us.

Doomed. We are so fucking doomed when wingnuts like Bob (or Trump) get any traction. But there’s Gwyneth Paltrow doing a Netflix series to promote her batshit crazy ideas about health and wellness, and women shovelling their money at her to buy her magic stones to stuff into their vaginas. Bob is just a small, sad voice compared to the commercial money-harvesting machines that Paltrow, Wolfe, and Vani Hari are. Doomed, I tell you.

While a lot of hokum has been around for ages, I’ve often wondered if there was some recent, seminal event that caused it to explode as it has into every corner of the world. Sure, the internet is the conduit for most of the codswallop these days, but was there something earlier that started the tsunami of ignorance, bile, anti-intellectualism, incivility, and bullshit? Was there a tipping point when reason sank and cranks went from bottom-feeding fringe to riding the surface?

Maybe — I think I’ve found it: August 22, 1987.

Continue reading “The day that reason died”

Johnson’s words

Samuel Johnson
I have recently been reading through the David Crystal anthology of words from Samuel Johnson’s dictionary (Penguin, 2006), attempting to cross-reference it with entries in the Jack Lynch anthology (Levenger Press, 2004), comparing how the two editors chose their selections and how the book designers chose to present them. Yes, I know: reading dictionaries isn’t a common pastime, but if you love words, then you do it.

In part, I’m doing so for the sheer delight of the reading (Johnson’s wit shines through in so many of the entries), and as a measure of the differences in book design, but also with an odd project in mind: The Word-of-the-Day from Johnson. I had the notion of transcribing a single word at random every day and posting it online and on social media. It doesn’t seem to have been done before, as far as I can tell.

I’ve previously written about how much I enjoy Johnson’s dictionary, and how I recommend it to anyone who enjoys reading, not merely bibliophiles, logophiles, and lexicographers. However, there is no reasonably priced version of the complete dictionary with its 40,000-plus entries, just various selections. As good as the abridgments are, readers will soon ache, as I do, to read more than the limited number of definitions they provide.

Continue reading “Johnson’s words”

Goodbye, Information Age

Fake news
“Say goodbye to the information age: it’s all about reputation now,” is the headline of an article by Italian philosopher and professor Gloria Origgi, published recently on Aeon Magazine’s website.

She writes:

…the vastly increased access to information and knowledge we have today does not empower us or make us more cognitively autonomous. Rather, it renders us more dependent on other people’s judgments and evaluations of the information with which we are faced.

I no longer need to open a computer, go online, and type my questions into Google if I want to know something: I can simply ask my Google Home. “Hey Google, what’s the population of China?” or “Hey Google, who’s the mayor of Midland, Ontario?” or “Hey Google, how many lines are in Hamlet?” Google will answer with the data. If I ask, “Hey Google, what are the headlines this morning?” it will play a recent CBC newscast.

Google Home can, however, only give me a summary, a snippet, a teaser. Should I want to delve deeper, or ask more than one question, I still need to go online and search. And that leads me into the information swamp that is the internet. How do I sort it all out?

The way we access information has changed as radically as the amount available to us. Just look at the Cambridge Dictionary’s Word of the Year for 2018: “nomophobia”, which means “a fear or worry at the idea of being without your mobile phone or unable to use it”.

Describing a typical day in his life, Dan Nixon writes of how we isolate ourselves with our phones, imagining they are instead connecting us:

…the deluge of stimuli competing to grab our attention almost certainly inclines us towards instant gratification. This crowds out space for the exploratory mode of attention. When I get to the bus stop now, I automatically reach for my phone, rather than stare into space; my fellow commuters (when I do raise my head) seem to be doing the same thing.

What could there be that is so engaging on the phone that the writer cannot use the time to, say, think? Read? Observe? Communicate with his fellow travellers? Eleven studies found that “…participants typically did not enjoy spending 6 to 15 minutes in a room by themselves with nothing to do but think, that they enjoyed doing mundane external activities much more, and that many preferred to administer electric shocks to themselves instead of being left alone with their thoughts.” The phone serves as a personal barrier to interaction instead of facilitating it. It’s a feedback loop: the phone makes it seem we are “doing something” by giving us a sensory response, while making it seem that simply thinking is “doing nothing.”

“Nothing, to my way of thinking, is better proof of a well-ordered mind than a man’s ability to stop just where he is and pass some time in his own company.”
Seneca, Letter II to Lucilius, trans. Robin Campbell, Penguin Classics: Letters from a Stoic, 2004.

Continue reading “Goodbye, Information Age”

Dictionary vs Dictionary.com

Concise OED
Did you know that doxastic is a philosophical adjective relating to an individual’s beliefs? Or that doxorubicin is an antibiotic used in treating leukemia? Or that doxy is a 16th-century word for a mistress or prostitute? That drack is Australian slang for unattractive or dreary? That drabble means to make wet and dirty in muddy water? That a downwarp is a broad depression in the earth’s surface? That a drail is a weighted fish hook? That dragonnade means quartering troops on a population, while a dragonet is a small fish but a dragoman is an interpreter? That a dramaturge is a literary editor on a theatre staff?

These are words I read when I was looking up the word doxology last night. They all appear close to doxology, either on the same or an adjacent page. Anyone with even a modicum of curiosity who opens a dictionary can find these and other words while searching for the meaning of an unfamiliar or uncommon word. In fact, it’s quite entertaining simply to open a dictionary at any random page and read, because you are likely to learn something new each time (well, perhaps less so if you use one of the generic no-name dictionaries bought in a box store).

My bedside dictionary is the Concise Oxford, but I also have several other Oxford editions, a Random House, a Merriam-Webster, and a Chambers, plus some others. I often refer to several for a more comprehensive understanding of a word. And yes, I do keep one by the bed, because I read a lot before sleep and sometimes encounter unfamiliar words. Oxford, because it’s simply the best: I like the layout and typography, and it’s English, not American.

Continue reading “Dictionary vs Dictionary.com”