11/12/13

Gluten, Sourdough, Fads and Ailments


Gluten, that everyday protein found in many grains, has become the health-fad followers’ most recent evil spectre, and many people (one in three, surveys suggest) have jumped onto the anti-gluten bandwagon, generally with a simplistic message: “gluten bad.”

Like most diet fads, I expect it will fall off centre stage when the next Big Thing To Rise Against comes along. Until then, gluten gets sensationalized, demonized and generally misunderstood.

Headlines like this abound (it was matched by a CBC Radio story on Ontario Morning Tuesday, Nov. 12):

Sourdough breadmaking cuts gluten content in baked goods
Celiacs and gluten avoiders have a new way to enjoy a slice of bread

That’s from a misleading and potentially dangerous CBC story about sourdough bread. It’s dangerous because there are people who suffer severe reactions from gluten intake (celiac disease, or CD), and others who have a non-celiac intolerance (sensitivity) to gluten (not, as some sites claim, an allergy), and both might be misled into thinking sourdough bread is now safe.

People – thinking CBC a reliable, even credible, source – might consume regular sourdough bread, or at least bread labelled as “sourdough,” believing this article deems it safe, when it may in fact cause severe and painful reactions.*

The article says:

A handful of recent studies have some good news for those trying to reduce the amount of gluten they eat — old-fashioned sourdough baking techniques significantly cut gluten content in bread…

But the reporter fails to identify those studies, so readers must do their own research to find out what those studies actually say (and, more importantly, what they don’t say). Nor does the writer say whether all sourdough methods work, or just some (Google “sourdough starter” and you’ll find hundreds of recipes, some using wild yeast, others domestic yeast). The writer then adds:

A team of Italian scientists led by Luigi Greco at the University of Naples authored a 2010 study that showed significantly lower levels of gluten in sourdough made according to old methods.

Old methods? Like leaving the starter in a peasant’s thatch-roofed, mud-walled hut, shared with the family pig?

Well, unless I completely misread it, that study of 13 people didn’t say anything of the sort about “old methods.” It showed reduced gluten in “fully hydrolyzed wheat flour” that had been treated in a sterile laboratory environment with a clinical mix of cultured bacteria commonly found in sourdough, plus added fungal enzymes:

Fermentation with selected lactobacilli added with fungal proteases, routinely used as an improver in bakery industries, decreased the concentration of gluten to below 10 ppm. Despite the markedly reduced concentration of gluten, the resulting spray-dried flour was still adequately workable. As shown in this and other studies, the hydrolyzed flour is suitable for making sweet baked goods and also bread and pasta if supplemented with gluten-free structuring agents…

A 60-day diet of baked goods made from hydrolyzed wheat flour, manufactured with sourdough lactobacilli and fungal proteases, was not toxic to patients with CD.

Which is good news and encourages further research, but it is not a promise that all breads labelled “sourdough” will have that effect. Or that the baker’s sourdough starter will contain the necessary organisms, in the right quantities and balance, to sufficiently reduce the gluten in the flour. Or that the length of fermentation will be sufficient to achieve those results. Or that the flours used in the bakery are the same as those used in the research (different flours have different gluten levels).

Notice that caveat for bakers: “…if supplemented with gluten-free structuring agents…” These test subjects were fed pastries, not breads or pasta.
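A back-of-envelope calculation puts the study’s numbers in perspective. This is entirely my own illustration, not from the article or the study: the ~10% gluten content of ordinary bread flour and the 20 ppm regulatory “gluten-free” labelling threshold are my assumed figures for scale.

```python
# Rough scale of the reduction the study reports (assumed figures):
# ordinary bread flour is roughly 10% gluten-forming protein by weight.
flour_gluten_ppm = 0.10 * 1_000_000   # ~100,000 ppm to start with

study_result_ppm = 10   # the study reports gluten below 10 ppm
gf_label_ppm = 20       # common regulatory "gluten-free" threshold

reduction_needed = flour_gluten_ppm / study_result_ppm
print(f"Reduction factor: {reduction_needed:,.0f}x")
print(f"Below labelling threshold: {study_result_ppm < gf_label_ppm}")
```

In other words, the laboratory process cut gluten by a factor on the order of ten thousand. A casual bakery ferment that falls even slightly short of that still leaves gluten levels that are dangerous for a celiac.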


11/10/13

I’m struggling with this…


My recent passion for bread and baking has caused a bit of an internal upset. Not the baking thereof, but rather the writing about it. I’m doing a lot of that recently. Writing (and, yes, baking too). And of course it comes with the attendant research into bread’s history, the combing through websites for recipes and book reviews, the hunt for equipment and the discussions about yeasts, pH balance, sourdough starters, Canadian versus American flours, protein contents, vintage and ancient grains… gawds, I’m having fun.

And it is, if you don’t mind some hyperbole, damn tasty fun.

It’s both culinary and hands-on science, with a bushel of history tossed into the mix. I haven’t had this much fun since I discovered the ukulele, back in 2008.

I suppose everyone needs new challenges, new horizons, new mountains to conquer. Bread – well, so far artisan, rustic bread – is my latest Everest. Which is a bit synchronistic, because sourdough is often described as the “Everest of breads,” and sourdough is one of my next projects, on the horizon for this winter. En route to that pinnacle, I have a lot to learn.

It’s a bit of a throwback for me, because I was baking bread 25–30 years ago with all the earnestness of a wannabe chef. My notes from classes at the Toronto Academy of Culinary Arts date to the mid-1970s, and I was still making bread (albeit mostly in bread machines by 1988) into the late 1980s. I stopped after we moved here. Now I’ve started again.

Thanks to both a spate of new books on breadmaking and the internet, I can now re-indulge that interest and share the experiences of others, as well as their recipes. It’s an obsession, I admit, but a creative one.

I am pondering starting a whole new WordPress blog zone for my baking and research about bread. One amateur to another, it would be: me flailing around with recipe after recipe, tweaking, tinkering and photographing. And about the historical impact and implications of bread, as I have recently posted (on the impact of ergot and witchcraft via bread). My naked tweets would be – unlike Anthony Weiner’s – about my bread results.

Bread blogging is already being done successfully on other sites, such as The Fresh Loaf (highly recommended if you’re into bread baking, by the way). But I am reluctant to create my own blog zone on their site (if for no other reason than that I want to be able to create recipes in my preferred format and style, which is possible through WP plug-ins). Besides, there are a lot more accomplished people there, and my efforts would seem presumptuous.


11/9/13

Bread, Madness and Christianity


The witch craze of Europe is a popular, albeit often misrepresented, part of our collective history. Everyone knows witches were hunted, tortured and often killed – burned at the stake, a particularly repulsive method of murder. While not a uniquely Christian form of killing, it was practiced widely by Christians throughout history in every European nation, and perfected in ritual by the Spanish Inquisition.

Hunting witches in the period between 1480 and 1750 (the so-called “classical period” of witch hunting) resulted in between 40,000 and 60,000 executions, although some authorities estimate the total to be as high as 100,000.

While it’s politically correct these days to report they were all killed at the hands of religious zealots, it’s actually a lot more complicated than that. But that’s not the subject of this post.

What really interests me is the potential cause of this madness, not the religious response to it. Yes, I know belief in witches has been around since biblical times, in many cultures, and people are still being killed today because of it. But Europe’s witch craze was something different: madness and murder on an almost industrial scale. Why so many?

The answer may lie in that staple of our foodstuffs: bread.

Okay, not all breads. Just breads made with rye flour, it seems (well, not 100%, but that comes a bit further down the post. No peeking!). Pumpernickel, a dense rye bread, may derive its name from the German for “Devil’s fart.” Really. The stuff you learn online. Anyway, witches may be the result of food poisoning – not, as the church believed, the supernatural. A bad case of mistaken identity, that.

Rye grain (Secale cereale) is susceptible to ergot (Claviceps purpurea), a fungus with a whole lot of chemicals in it that, when eaten, have some nasty side effects, from burning to madness to death. I mentioned this briefly in a recent blog post on the history of bread making. It’s a fascinating chapter in the history of bread (which itself is a fascinating chapter in the history of humanity).

The madness comes from alkaloids in ergot that bear a resemblance to LSD, as Wikipedia tells us:

The ergot sclerotium contains high concentrations (up to 2% of dry mass) of the alkaloid ergotamine, a complex molecule consisting of a tripeptide-derived cyclol-lactam ring connected via amide linkage to a lysergic acid (ergoline) moiety, and other alkaloids of the ergoline group that are biosynthesized by the fungus. Ergot alkaloids have a wide range of biological activities including effects on circulation and neurotransmission.
Ergot alkaloids can be classified into two classes:

  1. derivatives of 6,8-dimethylergoline and
  2. lysergic acid derivatives.

Ah, Timothy Leary, where were you when you were needed, back in the 15th and 16th centuries? The madness and physical side effects of eating ergot are colloquially called “St. Anthony’s Fire.” We call it ergotism today:

In large doses, ergotamine paralyzes the motor nerve endings of the sympathetic nervous system. The disease ergotism (St. Anthony’s fire) is caused by excessive intake of ergot. This can occur by the overuse of the drug or by eating baked goods made with contaminated flour, as happened in the Middle Ages. (Ergotism also can affect cattle, by their eating ergot-infected grain and grass).

Acute and chronic ergotism are characterized by mental disorientation, convulsions, muscle cramps, and dry gangrene of the extremities.

A psychoactive drug, lysergic acid diethylamide, best known as LSD, is chemically related to ergotamine.

I suspect the effect would have been frightening, confusing and disorienting – combined with the physical pains, burning, convulsions, the gangrene and other effects. No one would connect the effects with rye until the late 17th century. But for more than a millennium, stories of outbreaks of madness and St. Anthony’s Fire would fill the chronicles.**

And it would often be blamed not on the bread, but on a supernatural cause: the devil, demons or witchcraft. Christianity was not particularly kind to people accused of consorting with the devil.


10/27/13

Anti-Intellectualism: The New Elitism


There’s a growing – and disturbing – trend in modern culture: anti-intellectual elitism. The dismissal of art, science, culture, philosophy, of rhetoric and debate, of literature and poetry, and their replacement by entertainment, spectacle, self-righteous ignorance, and deliberate gullibility. These are usually followed by vituperative ridicule and angry caterwauling when anyone challenges the populist ideals or ideologies.

As if having a brain, as if having any aspirations to culture, to art, to learning – or worse, to science – was an evil, malicious thing that must be stomped upon. As if the literati were plotting world domination by quoting Shakespeare or Chaucer. Or Carl Sagan, Charles Darwin, Richard Dawkins.

“The mind of this country, taught to aim at low objects, eats upon itself.”
Ralph Waldo Emerson, oration to the Phi Beta Kappa Society Cambridge, August 31, 1837.

Anti-intellectualism isn’t new – Richard Hofstadter wrote about it in 1963 – but it has become highly visible on the internet, where pseudoscience and conspiracy theories have developed unchallenged into popular anti-science and anti-rationalist countercultures, many followed and accepted by millions.

Hofstadter wrote,

Anti-intellectualism is a resentment and suspicion of the life of the mind and of those who are considered to represent it, and a disposition constantly to minimize the value of that life.

He warned in his book that intellectualism was “on the run” in America. It still is.*

Just look at the superstitious Jenny-McCarthyites who fear vaccinations with the same religious fervour medieval peasants feared black cats crossing their paths. Or the muddle-headed practitioners and followers of homeopathy. The chemtrail conspiracists. The anti-wind turbine and the anti-fluoride crowd. Any Scientologist. Or any religious fundamentalist. The list of true believers in the anti-intellectual crowd is huge.

Online technology didn’t create these mythologies, or the gullibility of their followers, but the internet is the great equalizer and the great popularizer. It’s not making us smarter; in fact, it may be dumbing down a lot of folks. That’s because anyone, anywhere, can have his or her say, and there’s no way to easily discern the intellectual wheat from the abundant chaff without doing some hard thinking and analysis.

Technology has created the sense of entitlement that every comment, every opinion is of equal value, regardless of the context and the person making that comment. It’s the ultimate democratizer. But it’s a democracy where communication is reduced to the lowest level: the instant, the brief and the angry retort.

Facebook and Twitter don’t have categories that identify posters as more relevant or more important than others. If the prime minister posts on Facebook, he doesn’t get a gold box around his post that says he’s in charge of the country. If Stephen Hawking weighs into a Facebook debate about the nature of the space-time continuum, he doesn’t get a special icon that lets people know he owns this conversation.***

All messages we post have the same weight, the same gravity. There’s nothing to identify any post as more informed, more factually correct or even more relevant. So it becomes easy to derail a discussion by spurious claims and allegations, by innuendo, lies or simply confrontational language.

We’re all equally important on the internet. One person’s belief in magic, superstition or conspiracies gets the same opportunity to be heard and seen as those about science and empirical fact. In the online land of the blind, the one-eyed man has no special significance.

Facebook image

We’re creating a world of dummies. Angry dummies who feel they have the right, the authority and the need not only to comment on everything, but to make sure their voice is heard above the rest, and to drag down any opposing views through personal attacks, loud repetition and confrontation.

When they can’t respond with an intellectual counterargument – as is often the case – the anti-intellectuals respond with the ideology of their peer group (see the religious content of the message in the image taken from Facebook on the left) or ad hominem attacks. Name calling. Belittling and demeaning the opponent.

Bill Keller, writing in the New York Times, said,

The Web culture is simultaneously elitist and anti-authoritarian…

But it’s not an elitism of wisdom, education, experience or knowledge. The new elite are the angry posters, those who can shout loudest and most often: a clique of bullies and malcontents baying together like dogs cornering a fox. Too often it’s a combined elite of the anti-intellectuals and the conspiracy followers – not those who can voice the most cogent, most coherent response.

Together they foment a rabid culture of anti-rationalism where every fact is suspect; every shadow holds a secret conspiracy. Rational thought is the enemy. Critical thinking is the devil’s tool.


10/23/13

Survival of the Fittest


Charles Darwin has long been associated with the phrase “survival of the fittest.” For a century and a half, people have used it to refer to their understanding of his explanation of how species evolved.

But it wasn’t his. And it has obscured the understanding of Darwin’s own theory.

It came from a contemporary of Darwin, Herbert Spencer – an English polymath: philosopher, biologist, anthropologist, sociologist, economist, liberal political theorist, utilitarian – and, by some accounts, an early libertarian. His ideas came from people like Malthus and Adam Smith (read more about his philosophy here). Wikipedia tells us:

For many, the name of Herbert Spencer would be virtually synonymous with Social Darwinism, a social theory that applies the law of the survival of the fittest to society; humanitarian impulses had to be resisted as nothing should be allowed to interfere with nature’s laws, including the social struggle for existence. Spencer desired the elimination of the unfit through their failure to reproduce, rather than coercion or state intervention to initiate their physical annihilation.

He wrote his interpretation of Darwin’s ideas in an 1864 textbook of biology:

“This survival of the fittest, which I have here sought to express in mechanical terms, is that which Mr. Darwin has called ‘natural selection’, or the preservation of favoured races in the struggle for life.”

Spencer was really trying to apply Darwin’s ideas to his own ideas about economics, class struggle, competition and politics. He also believed in Lamarckism – the inheritance of attributes gained in one generation by the next – which has long since been discredited. But whether or not you agree with Spencer’s views, his reduction of Darwin’s theory to a convenient axiom did the theory an injustice.

In the public mind, Darwin’s ideas about natural selection were confusing and challenging. They became conflated with Spencer’s ideas, and somehow the phrase stuck – the Victorian era equivalent of a bumper-sticker slogan. It became wildly popular, and was soon applied to social and political phenomena, not simply biological ones.

It was so popular as a catchphrase that in the 1869 fifth edition of his book, On the Origin of Species, Darwin – unfortunately – added this line:

“But the expression often used by Mr. Herbert Spencer, of the Survival of the Fittest, is more accurate, and is sometimes equally convenient.”

The problem is really in how the word “fittest” is defined. Like its sister term, “theory,” it has both a common and a scientific meaning.*

Fittest, in Darwin’s sense, doesn’t mean the biggest, best, toughest, strongest or even the most competitive. It’s not the macho concept of superiority. It isn’t about power, control or brute force.

It means “best suited to the immediate environment.” It has also been described as a “property of the relationship between the organism and the environment.” That might mean a different colour, a smaller size, or less activity – whatever offers the best opportunity to survive and breed. Having offspring is key.

It’s a far more subtle notion than commonly used. As Wikipedia says:

Modern evolutionary theory defines fitness not by how long an organism lives, but by how successful it is at reproducing. If an organism lives half as long as others of its species, but has twice as many offspring surviving to adulthood, its genes will become more common in the adult population of the next generation.
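That live-half-as-long-but-breed-twice-as-much arithmetic is easy to check with a toy simulation. This is entirely my own illustration (the genotype names, population size and offspring counts are invented for the sketch, not drawn from any source quoted here):

```python
import random

random.seed(42)  # fixed seed so the toy run is reproducible

def next_generation(pop, offspring_per):
    # Each individual contributes to a breeding pool in proportion to
    # the number of its offspring that survive to adulthood; the next
    # generation is a random draw from that pool.
    pool = []
    for genotype in pop:
        pool.extend([genotype] * offspring_per[genotype])
    return random.choices(pool, k=len(pop))

# Hypothetical genotypes: "short" lives half as long as "long" but
# leaves twice as many offspring surviving to adulthood.
offspring_per = {"short": 4, "long": 2}

pop = ["short"] * 50 + ["long"] * 50   # start at 50/50
for _ in range(30):
    pop = next_generation(pop, offspring_per)

print("Frequency of 'short':", pop.count("short") / len(pop))
```

Run it and the shorter-lived genotype comes to dominate the population, exactly as the quote predicts: in this definition of fitness, lifespan matters only insofar as it affects reproductive output.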


10/20/13

Infestations, Microbes & Parasites


Staphylococci, Corynebacteria, Actinobacteria, Clostridiales, and Bacilli. They’re the most common, but they’re not the only ones. Bacteria. Microbes. Yes, even parasites. Living in your belly button. And on your skin. Your hair. But the belly button flora and fauna fascinate me the most.*

We’ve known ever since the microscope was invented that we had a population of hitchhikers living on our skin, scalp – and even inside us. The great Antony van Leeuwenhoek is known as the “father of microbiology” for his explorations through his fledgling, primitive microscope in the late 17th and early 18th centuries. But we are still learning about the biodiversity we all have.**

There was a recent story in the Atlantic reporting that researchers had found, among the 2,368 species of bacteria recorded so far living in our navels, 1,458 new species.

New as in previously unknown until some scientist stuck a Q-tip into his or her belly button and examined the results. Which is what they are still doing, exploring this brave new frontier in microbiology. It must have been one of those eureka moments.

Gawd, dontcha love science? To boldly go where no Q-tip has gone before…

The Atlantic story opens:

Instead of taking your fingerprint, maybe police should swab our belly buttons with Q-tips. No, that’s ridiculous, actually. But the idea illustrates a point made by a group of North Carolina-based researchers in their new Belly Button Biodiversity (BBB) project. Last month, the group published results of their first of many experiments, in which they swabbed 60 belly buttons and identified a total of 2,368 species of bacteria. People’s individual profiles were snowflake-ily, bacterially unique.

Most of the microbes that live on our skin are harmless, and many are actually beneficial: they protect us from more virulent invaders. What’s remarkable is the sheer number and variety of them.***

The scientists at Your Wildlife wrote that among all those species, there are some common forms found in most navels:

Turns out, belly buttons are a jungle of microbial biodiversity: we detected over 2300 species! And get this, only eight of those 2300 species– we call them oligarchs – were quite frequent and abundant, present in more than 70% of the individuals we sampled.

Oligarchs. Makes me think of the Politburo. Or Putin’s new Russian clique. What role do these bacterial oligarchs play in their micro-environment?
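The researchers’ “oligarch” criterion – a species present in more than 70% of the individuals sampled – is simple enough to sketch in a few lines of code. The sample data below is entirely made up for illustration; the real project swabbed 60 navels, not four:

```python
# Toy version of the "oligarch" test: which species appear in more
# than 70% of sampled individuals? (Sample data invented for the sketch.)
samples = [
    {"Staphylococcus", "Corynebacterium", "Micrococcus"},
    {"Staphylococcus", "Corynebacterium", "Clostridium"},
    {"Staphylococcus", "Bacillus"},
    {"Staphylococcus", "Corynebacterium", "Micrococcus"},
]

all_species = set().union(*samples)
threshold = 0.7 * len(samples)   # must appear in > 70% of samples

oligarchs = {sp for sp in all_species
             if sum(sp in s for s in samples) > threshold}

print(sorted(oligarchs))  # → ['Corynebacterium', 'Staphylococcus']
```

In this toy data only two of the six species clear the bar, mirroring the real finding that a mere eight of 2,300-odd species qualified.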

We’re an entire ecosystem, Scientific American tells us:

Over the past 10 years or so, however, researchers have demonstrated that the human body is not such a neatly self-sufficient island after all. It is more like a complex ecosystem—a social network—containing trillions of bacteria and other microorganisms that inhabit our skin, genital areas, mouth and especially intestines. In fact, most of the cells in the human body are not human at all. Bacterial cells in the human body outnumber human cells 10 to one. Moreover, this mixed community of microbial cells and the genes they contain, collectively known as the microbiome, does not threaten us but offers vital help with basic physiological processes—from digestion to growth to self-defense.

By the way, you can contribute to this ongoing research; check the notes at yourwildlife.org about how to help. The belly button experiment is over, as researchers assess their samples, but they are moving on to armpits, a whole new territory.
