Author Archives: glabwrites

Torture: Being Forced to Watch TV

The day before Thanksgiving I found myself watching broadcast television. Not by choice, mind you. I haven’t had my own broadcast-receiving TV since, oh, the spring of 1997. A few years before that, I’d already eschewed broadcast news and found my mental health much improved for it. By ’97, I realized that all of broadcast television is designed to make the viewer feel profoundly unhappy with one’s lot in life — his car, her body, the cleanliness of their toilet, the smell of their armpits, the whiteness of their teeth, their lack of bravado in gunfights, and their inability to fuck every comely or handsome figure who saunters into the room.

I haven’t regretted for one nanosecond my decision to boot broadcast TV.

But Wednesday, I was in a position wherein I was unable to run, shrieking, out of the room when the TV, tuned to WTHR in Indianapolis, was turned on. See, I’m in the middle of a six-week course of daily hyperbaric chamber treatments. Also known as HBO (for hyperbaric oxygen), the treatments are a must for the likes of diabetics who’ve lost or are at risk of losing toes or feet, say, to the ravages of their horribly unfortunate disease, or for those, like me, who’ve undergone cancer radiation treatment. In my case, my cancer was in my neck, so I’d had to submit to a month and a half of daily linear-beam radiation therapy. The result was the smashing of a number of malignant lymph nodes surrounding my thyroid gland as well as the weakening of my jaw to the point that the bone now has the structural integrity of styrofoam and the blood vessels supplying said mandible have been shrunk to a thread, making me vulnerable to tooth abscesses and unable to heal in that locale should any dental work be done. Turns out, I indeed do have an abscess now, and that work can’t be done until, through HBO, my mandibular blood supply has been strengthened and enriched.

The HBO treatments work like this: I lie in a coffin-like airtight container for hours a day breathing pure oxygen at twice normal atmospheric pressure. Every day, I strip down, take off all jewelry and my glasses, get questioned about whether I’ve put on underarm deodorant or skin lotion, get physically examined, and then lie down, flat on my back, my arms at my sides, and get sealed into this clear tube. The hope is that at the end of six weeks the blood flow in my jaw will be so enhanced that I’ll be able to get my broken tooth removed and then start scheduling three other surgeries that can’t be done right now because of that abscess. Phew! I’ve said it before and I’ll say it again: once a cancer patient, always a cancer patient.

A Hyperbaric Chamber.

You might notice the TV screen suspended above the HBO tube in the photo. That’s cool. The people who run these devices realize people like me would probably prefer to leap off tall buildings to lying in a coffin-like tube for hours every day so, to ameliorate that unhappiness, they provide TV. I can’t bring a book or my crossword puzzles into the tube because, for the same reason I have to strip and have no oily substances on my body, a pure-oxygen environment makes anything combustible ferociously flammable. (Those of us of a certain age might remember the fatal fire that took the lives of the three Apollo 1 astronauts in January 1967. During a practice run, they were sealed into their capsule breathing pure oxygen and a stray spark set off a conflagration within it. The astronauts died of asphyxiation within moments.)

The Interior of Apollo 1 after the Fire.

The people at my HBO facility (ironically, just yards from the cancer treatment center where I did my radiation stint back in 2016) allow us patients to bring in DVDs to watch during our sessions. And for those who don’t collect movies, the facility actually has a library of DVDs, donated by angels and past patients. I bring in my own DVDs and Wednesday I’d been watching On the Waterfront with Marlon Brando, Eva Marie Saint, Lee J. Cobb, Rod Steiger, and Karl Malden and scored by Leonard Bernstein. I’ve seen the movie a dozen times but each time, I’m blown away by the acting chops displayed therein.

Steiger (L) and Brando, Playing Brothers in On the Waterfront.

Brando’s performance, along with a couple of other films he’d done around the same time, essentially redefined how actors act in movies. For that matter, all the main players were adherents of the then-revolutionary Stanislavski system or method of acting. They were no longer “stage actors” but fully immersed themselves into character.

For my money, if someone wants to assert that On the Waterfront is the greatest movie ever made, I wouldn’t argue too much.

In any case, the movie ended with about twenty minutes left in my HBO session. The attendant then switched the TV to broadcast and I was treated to a program called Daily Blast Live, wherein four people sit behind a desk and blather. The four seem to be straight out of a TV producer’s dreamworld of diversity, with a black man and woman and a white man and woman, ranging in age from early 30s to late-ish 40s, all imbued with nice, clean, middle-class values, competing with each other to convince us that they’re just like you and me.

Watching these four was a revelation. I should have known, but have forgotten in the last quarter century, how godawfully vacuous broadcast TV is. I felt as if I were watching an over-the-top satire of Paddy Chayefsky’s Network. In fact, for a hot few moments, I actually thought I was watching some take-off on all these daytime TV shows. But no, this was the real thing and, for chrissakes, if this is what America watches on a regular basis no wonder so many people are thrashing about, subscribing to conspiracy theories, voting for carnival barkers for president, refusing to wear masks during a pandemic, and every other sin we’ve been committing for decades in this holy land. Broadcast TV has warped people’s minds, shattered our collective view of reality, and turned us into slack-jawed zombies.

None of this, of course, is any breaking news, but, as I say, I’d so completely divorced myself from this sick oeuvre that I’d forgotten how bizarre it all is.

The four were talking about the next day’s Thanksgiving meal and, swear to god, they spent at least ten minutes discussing whether one should eat like a pig, stuffing one’s self to near nausea, or perhaps take it easy and eat in something akin to moderation. They argued this point with all the passion and ferocity of Karl Marx and Sen. Joe McCarthy in some fantasy world fighting about communism versus capitalism.

A Knock-down, Drag-out Battle.

One of them, the black man, posited, “I think it’s Thanksgiving (quite an astute observation, I might add) and we should eat to our heart’s content!” He uttered this with all the conviction of a man calling for an end to the child sex slave trade. The white man shook his head vigorously and countered, “No, no, no, no! It’s better to eat small portions. That way, you can enjoy your food and not suffer afterward.” He offered this position with the assurance of Einstein chatting about his special relativity theory. This went on for long minutes until the black woman said, “Well, let’s all agree there’s nothing so satisfying as sitting back on the sofa with your belt undone.” The other three nodded as if she’d advocated for an end to all wars.

The white woman then shifted gears and introduced a remote interview with a woman who starred in one of those Real Housewives shows, which I didn’t even know was a thing anymore. This woman was from Orange County. Leaning forward toward the camera, the white woman asked, earnestly, what the Real Housewife lady was going to do tomorrow on Thanksgiving Day. Get ready for a shock: the lady said she and her family were going to eat a big meal!

After running down the list of things she was going to have on her table — all of which were typical Thanksgiving fare, to which the Daily Blast panel oohed and aahed as if she were ticking off exotic treats from distant foreign lands — the Real Housewife lady turned deadly serious and asked if she could be indulged in crowing about her young daughter’s recent fabulous accomplishment. Given license to crow, she then revealed her daughter had participated in an event that raised money for some life-threatening disease research. The Daily Blast gang gaped and gasped and, honestly, I wouldn’t have been surprised if one or more of them demanded she be nominated for the next Nobel Peace Prize.

Mercifully, my HBO session had come to an end and while I was being de-pressurized, the speakers within my tube went silent.

When the attendant brought me out of the tube, I resisted with all my might the urge to ask her if she watched this show every day. And, if she’d said yes, I was fully prepared to yell, “What in the goddamned hell is wrong with you?”

I dunno, maybe none of this is news to some of you but I drove away from the HBO facility in a daze. I still can’t believe this is how we entertain ourselves, this is how we get informed about the world around us. But I really shouldn’t be surprised. Look at what in the hell we’ve become.

What Is Science?

There are any number of terms and/or concepts bandied about these days that mean many different things to many different people. One of the important features of clear language is the understanding that we all pretty much agree on what words mean. I use the qualifier “pretty much” because definitions shouldn’t be written in stone, impervious to lexicographical evolution. But if I say to you, “Watch out, there’s an angry hornet on your right shoulder,” it’d behoove you to know that your and my definitions of “right,” “shoulder,” and “hornet” jibe.

Awful [Image: Fotolia/AP]

All languages are fluid, constantly changing. English is no better or worse, in that sense, than any other tongue. Take, for instance, the words terrific and awful. Anybody today who uses either term is conveying a meaning that everybody would get. Something terrific is good to an almost superlative degree. Something awful is bad, to the same extent. Yet terrific originally denoted something that inspired terror. Awful, on the other hand, described a thing or idea that filled one with awe, the interior, say, of the Ulm Cathedral in Germany, once one of the tallest structures in the world. Its stone steeple reaches 530 feet into the sky. Imagine being a rustic Briton in pre-skyscraper days, arriving in the Teutonic big city and strolling up to the edifice. She or he’d be filled with awe. “This,” she’d say, “is awful!”

Back then, those within earshot’d know precisely what she meant.

If she said the same thing today, listeners would be scratching their heads.

And, by the way, most people refer to that structure, officially the Ulm Minster, as a cathedral but, truth is, it is no such thing. A cathedral, technically, is the home church of a bishop, the headquarters of what is called in the Christian nomenclature a See.

See? A didact from the Holy See (the official name of the sovereign state whose capital is the Vatican City), might shake his finger at you for calling the Ulm structure a cathedral, but no one else on Earth would. That bit of inexactness (or, some might say, laziness) among the hoi polloi has led to an effective change in the meaning of the word cathedral. Most people today would say it refers to any grand or awe-inspiring church.

For pity’s sake, the very term hoi polloi itself can mean something quite different from the original intent. About 75 years ago, the term meant the unwashed masses, rubes, unsophisticates, the common clay. As such, obviously, it was a slur. People of a certain “noble” rank used it to describe the slobs of no rank or wealth they had to suffer seeing whenever they ventured out from their safe estates. Pretty straightforward, no?

No. Here’s the Merriam-Webster definition of the term hoi polloi:

  1. The general populace; the masses.
  2. People of distinction or wealth or elevated social status; the elite.

Well, which is it?

Fortunately for us in the year 2021, few use the term. It, again, is an insult. Perhaps it fell out of fashion because nobody could guess with any assurance what you meant when you uttered it.

Back when I was a kid, white people started copping terms from black people, who themselves had been copping terms from jazz hipsters. One of them was bad, as in good. My father and brother used to whip themselves into a frenzy watching the Chicago Bears play football every fall Sunday afternoon. One day, my brother said of Dick Butkus, the legendary Hall of Fame middle linebacker whose very name at the time conjured an image of the immovable barrier, the unforgiving force or, simply, the best pro football defender alive, “Man, he’s bad!”

My father, not yet hip, was flummoxed. You could almost see the wheels spinning within his head: Butkus? Bad?

I could go on. Hipster once was a descriptor most wannabe cool guys would have loved to be called. Anybody who played bebop jazz or listened to it was a hipster. Charlie Parker was a hipster. Lenny Bruce was a hipster. The Beat Writers were hipsters. Outsiders, rebels, anti-establishment types. Today? Let’s go to the Wikipedia reference, Hipster (contemporary subculture):

Affluent or middle class youth?! Charlie Parker? Lenny Bruce? Allen Ginsberg?

Like I said, language is fluid. Dig?

This, natch, is all preamble to the question, What is science? This is important because people are using the term promiscuously in public discourse, on social media, and in their own minds. In fact, the very word has become a definitive marker as emotionally and forensically fraught as religion once was (and, to a vanishing extent, remains). When someone says “That’s just science,” they’re really saying, “That’s the truth, and if you don’t believe it, you’re not going to hell but, man, you’re out of it.”

Problem is, people who buy into astrology, for instance, will argue that the practice of it and belief in it is firmly grounded in science. They’ll say people have been working on its charts and formulae for thousands of years. It’s a noble and entrenched science.

Anti-vaxxers swear to their gods that they have science on their side. Those who believe genetically modified organisms (GMOs) are poison say their stance is science-based. Conspiracy theorists who explain the origin of COVID-19, the collapse of several Manhattan skyscrapers after the 9/11 attack, even the idea that John F. Kennedy was, as The Onion headline blared, “shot 129 times from 43 different angles while riding through downtown Dallas in a motorcade,” will tell you, That’s just science.

In each of the aforementioned cases, scientific consensus held an opposing view.

The word science today too often connotes Received Wisdom just as much as the Bible or the pronouncements of the Rev. Sun Myung Moon once meant to their adherents. In many cases, science is the new Bible; scientists the new priests. The hoi polloi (there’s that word again) casually surf the internet and, finding some quote or assertion by a white lab coat-wearing figure, accept the same without question.

Science, in millions and millions of people’s minds, is now orthodoxy.

Like so many terms, so many ideas, science today is becoming meaningless.

Which is a damned shame because, not terribly long ago (before the Age of Trump and, to be sure, before the advent of the internet) the word had a hard and fast meaning.

People say, “Science says…,” as if there’s some authority, some panel of infallible experts who speak in its name. As if all the scientists of the world are in lock-step agreement on things. As if there’s a daily or weekly report issued by them, the purpose of which is to posit inerrant truths.

Thing is, scientists are nothing more than a bunch of human beings trying to do the best they can in their chosen fields. There are good ones; there are lousy ones. This is not to belittle their efforts. But it can be assumed that if 90 percent or more of the practitioners in any scientific discipline buy into an idea, it’s likely spot on. Of course, scientific consensus has been wrong before. Many times, in fact. Think of things like phrenology, social “survival of the fittest,” Piltdown Man, the Steady State universe, and many other now-debunked but one-time accepted truths.

Oliver G. Alvar wrote in 2019:

Science makes mistakes, there’s no doubt about it. If it claimed to possess perfect knowledge of the world, it would be no better than religion or other dogmatic doctrines. Unlike religion, science doesn’t deal in absolutes, but in probabilities — which is how we conduct our everyday knowledge anyway.

Science is humanity’s accumulated body of knowledge, ever-changing, constantly being refined, the particulars within it occasionally refuted. With each passing day practitioners of experience, capability, and acclaim within it are trying to make it better, more accurate, closer to the truth. The best of scientists innately grasp that they’ll never fully know any “truth,” yet, they still strive toward it.

They do their work following the guidelines of the scientific method. It’s utterly logical and exquisitely simple. Here’s a chart I found on the website Lumen:

You see the box reading “Form a hypothesis…”? That’s a term people usually conflate with theory. “That’s your theory,” people might say when they’re really implying, “You’re full of shit.” A hypothesis is a guess based on observation, an unproven stab. A theory is a hypothesis that has been so repeatedly tested and so overwhelmingly supported by evidence that it works quite well as a model of understanding, evolution, say, or the Big Bang.
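Purely as a toy illustration (none of this comes from the chart or from any real scientific practice; every name in it is invented), the observe-hypothesize-test-refine loop can be sketched in a few lines of Python. A hidden “rule of nature” is probed by experiment, and the hypothesis that keeps surviving the tests is the one that earns its way toward theory status:

```python
import random

def run_experiment(true_rule, guess_rule, trials=100):
    """Test a hypothesis (guess_rule) against nature (true_rule)
    by comparing their predictions on random observations."""
    observations = [random.randint(1, 100) for _ in range(trials)]
    hits = sum(guess_rule(x) == true_rule(x) for x in observations)
    return hits / trials  # fraction of observations the hypothesis explains

# "Nature": a hidden rule we can only probe by experiment.
nature = lambda x: x % 2 == 0

# Hypothesis 1: "numbers divisible by 4 are special" -- partly right.
h1 = lambda x: x % 4 == 0
# Hypothesis 2: "even numbers are special" -- a refinement of h1.
h2 = lambda x: x % 2 == 0

print(run_experiment(nature, h1))  # imperfect fit: back to the drawing board
print(run_experiment(nature, h2))  # perfect fit: keeps surviving the tests
```

No single perfect score “proves” h2, of course; it just means the hypothesis has survived another round of testing, which is all science ever claims.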

In any case, scientists are not the spiritual kin of Moses, who came down from Mount Sinai carrying the Ten Commandments. People throughout history have craved such a prophet, such a visionary, certain of “truth.” They still do today. If anything, the impulse within us to embrace such a man (always a man) is less in the year 2021 than it was in 1921 or, for that matter, 1121 or 1121 BCE. That’s good. Yet it remains.

Moses, essentially, said, This is so. Ideally, scientists say, This just might be.

Leave it to Isaac Asimov to put it best:

The most exciting phrase to hear in science, the one that heralds new discoveries, is not “Eureka” but “That’s funny…”


Good scientist that he was, Asimov got a bigger kick out of being puzzled by something than he got from finding any sort of “truth.”

From the Minds of Babies

From Nobel Prize-winning physicist Frank Wilczek’s book, Fundamentals: Ten Keys to Reality:

[T]o appreciate the physical universe properly, one must be “born again.”

As I was fleshing out the text of this book, my grandson Luke was born. During the drafting, I got to observe the first few months of his life. I saw how he studied his own hands, wide-eyed, and began to realize that he controlled them. I saw the joy with which he learned to reach out and grasp objects in the external world. I watched him experiment with objects, dropping them and searching for them, and repeating himself (and repeating himself…), as if not quite certain of the result, but laughing in joy when he found them.

In these and many other ways, I could see Luke was constructing a model of the world. He approached it with insatiable curiosity and few preconceptions. By interacting with the world, he learned the things that nearly all human adults take for granted, such as that the world divides itself into self and not-self, that thoughts can control movements of self but not of not-self, and that we can look at bodies and not change their properties.

Babies are like little scientists, making experiments and drawing conclusions….

Wilczek is one of those super-brains-on-two-legs guys who, in their day jobs, are probing into the very existence of…, well, existence while they moonlight as authors penning books for the less cerebral among us to try to pretend to understand what they’re writing about. He knows what quantum mechanics is all about. Or, let me put that more accurately: among humans on Earth, he’s one of the very few who can at least know what he doesn’t yet know about the subject. For, as Richard Feynman once famously observed, “I think I can safely say that nobody understands quantum mechanics.”

Feynman, of course, was one of those super-human intellectuals who wrote a lot for the lay person. Carl Sagan, too. Among the living, there are Dava Sobel and Neil deGrasse Tyson. In fact, here’s a list of great science writers, living and dead, who can (in most cases almost) be grasped by the likes of you and me:


Pick any three of the above and delve into one or more of their books and you’ll find your life enriched beyond any monetary figure. A caveat: it won’t be terribly easy getting through any of their books. No matter — the Big Mike philosophy goes: Nothing worth doing is easy.

Anyway, let’s go back to that Wilczek quote. The idea being that perhaps the truest and purest scientists are human babies trying to figure out what those spindly little things on the ends of their hands are for; why things drop; how, if my diaper is full and I howl about it, some big person’ll come along and put a new one on me; and, eventually, what’ll happen when I stick a safety pin point into one of those little holes in the wall.

How beautiful is that idea?

Babies observe. They wonder about what they’re observing. They experiment. Sometimes the result of that experiment hurts. Sometimes the result is they learn something. They file either conclusion away in their memories for future use.

The smartest among us are those who understand they don’t know all that much and are hungry to learn.

It’s Up To The Women

With apologies to Eleanor Roosevelt, who hoped her book of that name would spur women to get involved in politics. They’ve done so, to a certain extent. Now it’s time for women to elbow their way into another field.

How many of these names do you recognize? (And before you start, you don’t have to ID them with any degree of precision; just say what their general profession was.)

  • Mary Elliot Hill
  • Lise Meitner
  • Jewel Plummer Cobb
  • Cecilia Payne-Gaposchkin
  • Carolyn Porco
  • Rosalind Franklin
  • Emmanuelle Charpentier
  • Dorothy Crowfoot Hodgkin
  • Joy Buolamwini
  • Émilie du Châtelet
  • Shubha Tole
  • Mary Calkins

I’d guess if any of these names is familiar, it would be that of Rosalind Franklin. Maybe Lise Meitner. Otherwise, I’ll wager the vast majority of Pencillistas won’t get who these people are. And, for pity’s sake, Pencillistas are the hippest, coolest, most well-read, brightest bulbs in the box, here or anywhere else for that matter. When a Pencillista doesn’t know who you are, you are, for all intents and purposes, nonexistent.

Only the people named above certainly do/did exist. Not only that, they contributed greatly to the advancement of the human species. Without them, we’d be in a bit of a pickle: grossly uninformed about the nature of the universe, sans one of humankind’s most depended-upon materials, unable to ward off the effects of one of the most virulent cancers, lacking in synthetic penicillin and, potentially, at the mercy of the worst aspects of Artificial Intelligence, among many other effects these people have either instituted, discovered, or averted.

Joy Buolamwini

They are all women and they are/were all scientists. Everybody knows who Einstein, Newton, and Darwin were. Who, besides me (and I had to look up all but a couple of the names), knows who Joy Buolamwini is? Or who Jewel Plummer Cobb was? The only distinction between them and the aforementioned scientific titans was the fact that these virtually anonymous figures possessed vaginas rather than penises.

That’s all it comes down to, really. We’d love to think the smartest among us might also be the most forward-thinking, the most open to diversity within their ranks. But hell no. Take the case of Rosalind Franklin, for instance. She should have shared in the 1962 Nobel Prize in Physiology or Medicine along with James Watson, Francis Crick, and Maurice Wilkins. (She’d died in 1958, and the Nobel isn’t awarded posthumously, but her male colleagues hardly clamored for her recognition while she lived.) Dang it, she had as much to do with identifying and describing the DNA molecule as any of those three, but her genitalia made her ignorable to them. They, especially Watson, stood on their heads to deprive her of her rightful share of laurels for figuring out just exactly how the basic building block of life is structured. In fact, in Watson’s famous memoir, The Double Helix, he complained that Franklin was difficult to work with and criticized her for not caring enough about her appearance.

Rosalind Franklin (Image: Vittorio Luzzati)

Yeah, sure. Remember how the world’s scientists didn’t take Einstein seriously because he refused to run a comb through his hair? Or how the members of the Royal Society snubbed Newton because he was essentially a pathological loner?

Here’s a truth: the smartest man in the world has about the same chance of being an insufferable jerk as the least educated.

Even in this supposedly enlightened era, when women are running countries, making movies, writing novels, heading corporations, running police departments, rapping the gavel as Speaker of the House, and doing anything males can do, fewer than three in ten of the world’s research scientists and academicians are women.

Could that be one reason why so many of our scientific advancements of the past couple of centuries have been things like bombs that blow up entire cities, chemical compounds that foul our groundwater, motors that turn our air into unbreathable muck, and prescription drugs that turn us into addicted zombies? Perhaps a scientific community made up entirely of women over the last 200 or so years might have put us in the same perilous state we find ourselves in now. That’s something for college sophomores to discuss while passing around bongs and copies of The Bell Jar. What we do know is it’s been guys in white lab coats who’ve put the planet on the brink of catastrophe. Well, they and other males in a hundred and fifty other vocations, to be sure.

Perhaps in another hundred years (we should live so long) the aforementioned gender ratio in science will be reversed. Perhaps the women-dominated STEM field of the future will adopt a more caring, concerned outlook regarding its research and discoveries. It wouldn’t at all hurt us to find out.

A population of primarily female scientists certainly couldn’t harm us and our planet any more than the traditional, guy-dominated tribe already has.

In any case, let me hip you to three women who are interpreting and explaining the world of science to the general public these days. Two are authors and one a podcast maven. Their respective takes on the world of science are both refreshing and needed. So, here they are:

Natalie Angier: She’s the author of what I consider to be my most indispensable book on science, The Canon: A Whirligig Tour of the Beautiful Basics of Science. Notice that? She describes the fundamental tenets of science as “beautiful” — and they are. Yet, how many men have waxed so poetic about, say, chemistry or physics? A few decades back, when Carl Sagan wrote his iconic The Dragons of Eden and Broca’s Brain, critics called him science’s first poet. If so, then Angier is science poetry’s Bob Dylan. She transforms what could have been a dry recitation of facts into a multi-layered, complex love song. Yep, she loves science — and she’s smart as a whip. She’s also written three other science books, including her expert take on the female gender, Woman: An Intimate Geography. Angier has been a New York Times science writer since 1990 and won the 1991 Pulitzer Prize for Beat Reporting for numerous articles on science.


Mary Roach: If Natalie Angier is science’s poet, Roach is its standup comic. Or, more accurately, the nation’s high school science teacher whose class everybody wanted to take. She views science through an ironic, amused lens, explaining and revealing with humor and a heavy dosage of irreverence. Her most recent book is Fuzz: When Nature Breaks the Law, in which she addresses the collisions between humans and critters, and between us and potentially toxic flora and natural substances. She begins by telling us that several centuries ago when, say, a bear mauled a hapless wanderer in the woods, the bear could be tried in a court of law! Bears, of course, aren’t brought up on charges anymore but they still occasionally rip someone to shreds. Roach also has written a shelf’s worth of engaging, informative books including:

  • Stiff: The Curious Lives of Human Cadavers
  • Spook: Science Tackles the Afterlife
  • Bonk: The Curious Coupling of Science and Sex
  • Packing for Mars: The Curious Science of Life in the Void
  • Gulp: Adventures on the Alimentary Canal
  • Grunt: The Curious Science of Humans at War

All these titles are available through her website.


The youngest of this trio is Rebecca Watson, host of the regular podcast on her site, Skepchick. Watson founded Skepchick in 2005 in order to, in her words, “discuss science and skepticism from a woman’s perspective.” An aspiring magician throughout college, she was inspired to get into the science and skepticism business after meeting James Randi, “The Amazing Randi,” noted illusionist and myth/magic debunker. With the rise of the internet over the last quarter century, scientific misinformation and downright quackery and fraud have spread like so many coronavirus variants among the unvaccinated. Watson is never lacking for new topics to tackle and bunkum to refute.


Bottom line: Conventional wisdom holds that science will save us from ourselves. Perhaps a better — or at least alternative — way of looking at it is women scientists ought to be given a shot to come to humanity’s, and the Earth’s, rescue. And, just as important, they ought to take that shot.





…The More They Stay The Same

When I was a kid, nine years old in 1965, I read about and watched TV historical programs about World War II, just 20 years in the past, and thought it was something that had happened in an ancient time, so long ago that people were different, had evolved way past such horror.

Tomorrow we mark 20 years since 9/11. Being an old coot now, I realize that a couple of decades is nothing more than a barely perceptible blip in the long stretch of time.

What I know now is those horrors, WWII and 9/11, changed us as a nation, both for good and for evil, and we are who we are today because of them.

WWII brought us together, taught us the value of sacrifice and righteous resistance against the evils of the tyrannical empires we fought. Conversely, it convinced us we were the richest, most powerful empire in the history of the Earth, able to exert our will over any and every other nation on the globe, and had the god-given right to do whatever the hell we wanted no matter how it affected anything or anybody else.

Berlin, 1945.

9/11, too, reminded us about sacrifice and righteous resistance against the theocratic hoodlums of the world. But, like WWII, it made us suspicious of other cultures and other religions, willing to sacrifice our liberties for some imaginary sense of safety and security, and filled all of us, to a certain extent, with a vestigial sense of hatred and fear.

New York, 2001.

The thing that never changes is evil begets evil.

Lead Us Not

Thomas Midgley.

I don’t know if he was the most dangerous American scientist ever to pour the contents of various beakers into a larger vessel and then loose the result, come what may, upon the general public, but if there were a competition for that title Thomas Midgley would be a top contender.

Go get yourself a cool drink, grab a chair, have a seat, and ponder with me this man who, very likely, is as responsible for much of the environmental mess we find ourselves in as any other human ever to grace this tainted planet.


Midgley was so smart, so creative, so imaginative, that his efforts in the lab touched millions — nay, tens of millions, even hundreds of millions — of lives here in this holy land and around the globe. Heck, let’s go all the way and say he’s somehow affected billions of us. But, like Prometheus of Greek mythology, his discoveries, his labors, wound up screwing the lot of us over…, well, to a nearly mythical extent. If only he were a mythical figure. The life and work of Thomas Midgley are all too true.

In fact, so brilliant was he, so gifted, that the last thing he conjured up in his fertile scientific mind wound up killing him. Fitting, I guess. It’s the living — us — who are dealing with the fallout from his endeavors.

First, let’s consider lead. Its chemical symbol is Pb, short for the Latin plumbum. It was in ancient Rome that water was first delivered to homes via lead pipes. That’s why we call plumbers plumbers. That’s why, some historians suggest, the Roman Empire fell. Its population, the hypothesis goes, all started going mad from the effects of lead in their water. Forget for a moment that there are countless reasons why the Roman Empire — why any empire — has collapsed or ever will. Hell, we’re witnessing that very phenomenon happening right here, right now. Just keep in mind that as far back as the end of the 19th century, the deleterious effects of lead became known. That’s when people began advancing their Roman Empire conjecture.

How Ancient Romans Formed Lead Water Pipes.

According to the Centers for Disease Control, lead, once it enters the human body, can and will cause such symptoms as abdominal pain, constipation, fatigue, headache, irritability, loss of appetite, memory loss, pain in the extremities, and overall physical weakness. Intense, chronic exposure to lead can produce in humans the inability to concentrate, depression, nausea, loss of coordination, insomnia, stupor, slurred speech, anemia, hallucinations, palsies, convulsions, and cancer. Children exposed to lead, for instance that found in lead paints in old homes, become antisocial, hyperkinetic, and aggressive, profoundly affecting their performance in school. Prolonged exposure may even lead to a type of blindness called scotoma. The World Health Organization estimates lead causes upwards of 10 percent of all mental disabilities in the world even today. The WHO warns that no level of exposure to lead is safe.

All this was becoming apparent toward the end of the 1800s. That’s fully a quarter century at least before Midgley, an employee of the General Motors-owned research lab at Dayton, Ohio, started futzing around with lead. Educated at Cornell University in mechanical engineering, Midgley became interested in chemistry at the Dayton lab. One of the knocks (you’ll pardon the pun) against the then-newfangled automobiles in the 19-teens was the fact that the early engines in them knocked like hell. Auto manufacturers became hot for a gasoline compound that would eliminate the maddening knock as so many Model Ts sped along at, oh, 16 mph.

Midgley, by trial and error, found that adding a substance called tetraethyl lead, a derivative of basic lead, to gas greatly reduced loud knocking. The big boss at General Motors, Charles Kettering, and Midgley filed for a US patent and began to license the new compound’s use to petroleum companies. They became rich men. Well, Midgley became rich; Kettering simply richer.

Kettering and his GM crew, mindful of the growing awareness that lead was a poison, took pains not to mention its addition to gasoline, so they trademarked the new gas ethyl. As in, something akin to a woman’s name, your aunt for example, the one who bakes apple pies. How can that hurt you?

Naturally, since gasoline, leaded or not, burns and its vapors enter the atmosphere through the automobile’s tailpipe, for the next five decades after Midgley’s discovery, adults and children inhaled lead with virtually every breath they took.

It wasn’t until the mid-1970s that the Environmental Protection Agency started working to get leaded gas off the market. It took 20 years until ethyl™ was completely banned in this country for use in private automobiles. The rest of the world followed either quickly or leisurely. The last country to ban leaded gas seems to be Algeria. And guess what the hell! That ban went into effect today, August 30, 2021.

Researchers have argued that the dramatic decrease in the worldwide violent crime rate over the last few decades can be directly connected to the disappearance of ethyl from the consumer market.

Clearly, leaded gas effectively kicked the shit out of humanity. Thanks, Thomas Midgley.

Funny thing is, Midgley staged a demonstration in hopes of allaying the public’s fears about leaded gasoline soon after he developed it in 1921. Before reporters, he washed his hands in the stuff and then inhaled deeply above a jugful of it for a full minute, claiming he could repeat the demonstration every day and not suffer any ill effects. What he didn’t tell the reporters that day was that he’d already had to take a long break from his work due to lead poisoning a few months previously. From then on, he assiduously avoided coming anywhere near lead, except for that theatrical demonstration. When workers at various leaded gas processing plants started losing their minds and even dying, one company’s spokesperson announced the poor souls “probably went insane because they worked too hard.” Their fault, in other words.

Still, Thomas Midgley wasn’t finished. Toward the end of the decade of the 1920s, another newfangled machine, the air conditioner, was becoming more and more popular. Those early air conditioners used things like ammonia, chloromethane, propane, and sulfur dioxide as coolants. Problem was any and all of them were prone to explosions. General Motors, at the time, owned the Frigidaire Corporation, one of the early manufacturers of air conditioners. Kettering put Midgley in charge of a team working to find a “safer” substance for A/Cs. The Midgley team came up with Freon™️, a chlorofluorocarbon. Freon was volatile and many chemists warned it would be dangerous to produce the stuff on a widespread basis but Midgley and his team pooh-poohed the notion. Freon, they retorted, was chemically inert. Soon, almost every air conditioner made used Freon as a coolant and, before you knew it, manufacturers of underarm deodorants and other products coming in aerosol cans were gobbling up Freon as a propellant.

Flash forward to the 1970s and ’80s. Environmental scientists began noticing holes in the atmosphere’s ozone layer above both polar regions. The ozone layer protects life on Earth from overdosing on ultraviolet radiation coming from the Sun. The Sun, natch, keeps us alive but too much UV radiation can cause in humans eye cataracts, suppressed immune systems, genetic damage and skin cancer. The ozone layer filters up to 99 percent of the Sun’s UV radiation from reaching the Earth’s surface. Those holes in the ozone layer rapidly spreading above the poles were determined to have been caused by none other than Midgley et al’s Freon and other commercial chlorofluorocarbons. The nations of the world agreed to ban the use of them gradually when they signed the Montreal Protocol in 1987. Derivatives of Freon called hydrofluorocarbons were substituted by manufacturers for a few years but they, too, were found to be extremely dangerous. Under terms of a 2019 amendment to the Montreal Protocol, these HFCs have been branded “super greenhouse gases” and are now strictly regulated.

So, in a short decade, Thomas Midgley developed a couple of industrial products that have spurred much of the population to violent crime and subjected the rest of us, not dead or in prison, to an all-too-often fatal form of cancer. Prometheus, indeed.

I forgot to mention, Thomas Midgley, for his efforts, was awarded the Nichols Medal by the American Chemical Society and the Perkins Medal by the Society of Chemical Industry. Both these prizes were awarded after Midgley took a sabbatical in 1923 due to the aforementioned case of lead poisoning he suffered.

So far as I can determine, Midgley has not received any medals from environmental or public health organizations.

Midgley did not live a terribly long life. He contracted a case of polio in 1940 when he was 51 years old. Like President Franklin D. Roosevelt, Midgley became a cripple. Ever the inventor, Midgley, unable to walk, designed while lying in his bed a pulley device that would lift him above his bed so the sheets could be changed and would turn his inert body so he wouldn’t suffer bedsores. The device was driven by an electric motor pulling and releasing a network of ropes. Wouldn’t you know it, one day in 1944 Midgley got tangled up in the mesh of ropes and was strangled to death. He was 55 years old.

Neither a poet nor a novelist could conjure so fitting an end for him.

Only The President?

Things Every Adult Ought to Know

We’ve been living under the shadow of the mushroom cloud for going on 76 years. It was on a Monday, August 6, 1945, that the Japanese city of Hiroshima was virtually fried off the face of the Earth by a single nuclear weapon dropped by an American Army Air Forces B-29.

Hiroshima, Burnt Out of Existence.

The bomb had exploded at approximately 8:16am, Japan Standard Time. An estimated 80,000 people were killed, either instantly by the momentary +10,000ºF temperature within the bomb’s 1,200-foot-diameter fireball or within moments by the firestorm that hellpoint ignited in the city 1,900 feet below it. Everything — vehicles, mules, birds, people, structures (except for a very few reinforced concrete, earthquake-resistant buildings) — within a mile radius of ground zero was vaporized. Outside that circle, extending out another mile, everything was burned in a wind-driven inferno that lasted for hours. Only a lack of stuff left to burn caused the firestorm to fizzle out.

Within the next few months and years, some 6,000 more people died from radiation effects. Those who were in the blast zone and survived faced, for the rest of their lives, a high risk of cancer directly related to their exposure to radiation.

That particular bomb today seems laughably primitive. Even when it was dropped, Manhattan Project physicists and Army Air Forces commanders understood a much more complicated but also more efficient bomb would be used in the ensuing days as well as in future warfare. The Hiroshima bomb, nicknamed Little Boy, was a gun-type shell that produced a nuclear fission explosion. Its designers had re-purposed a large-bore naval artillery gun and encased it in a ten-foot-long aerodynamic cylinder. At the moment of detonation, a pellet of Uranium-235 was fired down the length of the gun tube until it nestled precisely within a hollow cylinder, also made of U-235. That created a critical mass, initiating an uncontrolled nuclear chain reaction, releasing heat, light and X-ray energy of previously unimaginable proportions.

Kid Stuff.

Three days later, another B-29 dropped a second nuclear weapon, this one nicknamed Fat Man, on the city of Nagasaki. In Fat Man, a 3 1/2-inch diameter ball of plutonium was squeezed into critical mass by a concentric shell of explosives, the resultant heat and blast wave killing another 75,000 or so people either instantly or by the explosion’s aftereffects. Japan surrendered within a week.

In the whole of human history, a total of more than 150,000 people have been killed in the only two wartime uses of nuclear weapons. Since those two incidents, the world’s nations have constructed well more than 60,000 nuclear weapons. A more exact total is impossible to ascertain since each nation’s nuclear weapon inventory is kept secret. Thus far, eight nations have been recognized as possessing nuclear weapons. They are the United States, Russia, France, the United Kingdom, China, North Korea, Pakistan, and India. Most observers believe Israel also possesses a nuclear inventory but that nation refuses to verify it, preferring to let its Middle East rivals fret over the question. Were you to state in court that Israel is a nuclear power, it’s a good bet you wouldn’t be at risk of perjuring yourself.


By the way, it’s generally acknowledged that South Africa, under its apartheid rulers, had built a few nuclear weapons but after the African National Congress ousted that regime, the nation’s nuclear bombs were dismantled. Knowing humanity as we do, South Africa’s actions in this matter remain stunning to this day.

The nuclear bombs nations possess in the year 2021 (some 13,000-plus overall) are mostly of the thermonuclear variety. Dubbed “The Super” by its earliest advocate, physicist Edward Teller, and commonly known as the hydrogen bomb, a thermonuclear device actually uses an old-fashioned atom bomb, something akin to the Nagasaki explosive, as a detonator. When a hydrogen bomb is dropped, the atom bomb within it explodes, creating enough heat to cause a fusion reaction. In the old fission bombs, atomic nuclei caught in the chain reaction are split apart, releasing energy. In Teller et al‘s “Super,” the energy created by those splitting nuclei is merely the match that lights the real guts of the thing, a mass of hydrogen isotopes. The nuclei of those hydrogen isotopes are fused together, forming helium atoms, the same type of reaction that goes on in the cores of stars. In order for the bomb to cause that fusion, the temperature must momentarily reach about 180,000,000ºF.

Fission vs. Fusion.

The blast generated by a hydrogen bomb makes both the Little Boy and Fat Man explosions look like firecrackers set off by children. Were a one-megaton hydrogen bomb dropped on Hiroshima that day in August 1945, its destructive power — including, to one degree or another, the crushing overpressure, initial and residual radiation, heat, and the resultant fires — would effectively destroy everything within a nearly five-mile radius, with significant damage to structures within a seven-plus-mile radius. A lethal dose of radiation would extend outward up to 90 miles, depending on wind direction and speed. Death for anyone caught within that radiation plume would ensue within two weeks. An area up to 250 miles distant, again depending on wind speed and direction, would be uninhabitable for up to three years.

By the way, a megaton in nuke-speak is analogous to one million tons of TNT. That’s big. How big? Consider this: the biggest thermonuclear device ever exploded, the USSR’s “Tsar Bomba,” dropped from an airplane in October 1961 over the absolute nowheresville locale of Russia’s Novaya Zemlya island archipelago north of the Arctic Circle, had a yield of 50 megatons. The crew of the aircraft that dropped the bomb barely survived the blast even though the plane was more than 24 miles away at the moment of the explosion. Soviet planners previously had estimated the crew would have a 50 percent chance of surviving the blast but it was important enough to them to risk those lives in order to prove to the United States how big its nuclear dick was.
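For the arithmetic-minded, here’s a rough sketch in Python of what those yields mean side by side. It assumes the standard convention that one megaton of TNT equals 4.184 × 10¹⁵ joules, and the commonly cited estimate of roughly 15 kilotons for the Hiroshima bomb:

```python
# Back-of-envelope check on what "50 megatons" means in energy terms.
# Standard convention: 1 megaton of TNT = 4.184e15 joules.
MEGATON_J = 4.184e15

tsar_bomba_mt = 50
hiroshima_kt = 15  # Little Boy's yield, per the usual estimate

tsar_energy = tsar_bomba_mt * MEGATON_J
ratio = (tsar_bomba_mt * 1000) / hiroshima_kt

print(f"Tsar Bomba energy: {tsar_energy:.3e} joules")
print(f"Roughly {ratio:,.0f} times the Hiroshima bomb")
```

Run the numbers and the Tsar Bomba works out to more than three thousand Hiroshimas in a single device.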

The Tsar Bomba’s Explosive Force in Terms of a Cube of TNT. That’s the Eiffel Tower on the Left, for Comparison.

Here in the United States, a nation just as concerned with nuclear genital size as the (now) Russians, we go about our daily business, most of us, believing only the president can authorize the use of nuclear weapons by our armed forces. To this point, the Army, the Navy and the Air Force (the Army Air Forces became a separate service in 1947) possess and control separate nuclear stockpiles. Spy movies and suspense novels over the last eight decades have led us to believe the President of the United States travels around followed by a military officer carrying the “Football,” a briefcase containing the launch codes and communications devices that allow only him (that gender thus far) to “press the red button.” No general or admiral, the belief goes, no matter how high up in the chain of command, can launch the Bomb without a presidential go-ahead.

It’s all bullshit.

A Member of the Armed Services Carrying “The Football” Accompanies the President at All Times.

From the weeks before the Hiroshima bombing when Harry S Truman lay awake in bed for nights at a time trying to decide whether to authorize the use of this nation’s terrible new weapon, the assumption always has been it’s the president who has the sole authority to use a nuclear bomb. The average American thinks there’s some kind of mechanical barrier — that “Football” — in addition to tradition and an abundance of prudence that make it impossible for anyone but the Chief Executive to make such an apocalyptic decision.

Not so. Not at all.

In fact, the number of people who can elect to drop a hydrogen bomb on a city — be it Moscow, Beijing, Tehran or any major metropolis in a country that happens to stick in their craw at that moment — reaches into the thousands.

Let’s ponder that again: thousands of people, American people, can, on a whim, obliterate a major world city, killing hundreds of thousands, even millions, in a blinding flash of light and heat.

In the last few years, a number of books have been published recounting the history of this Holy Land’s nuclear arsenal. That history has been a doozy.

Two books in particular illuminate what is in reality a not-very controlled control of this nation’s nuclear arsenal. It can be assumed that the arsenals of Russia and at least some of the rest of the nuclear powers are similarly left in the hands of many people, not all of whom, of course, have been vetted for sanity, compassion, morality, or decency. The books are reporter Fred Kaplan’s The Bomb: Presidents, Generals, and the Secret History of Nuclear War, and Daniel Ellsberg‘s The Doomsday Machine: Confessions of a Nuclear Planner.

Kaplan‘s book is largely based on Freedom of Information Act requests as well as scheduled classified information releases. Ellsberg’s research was more direct; he was a nuclear war planner for the RAND Corporation, the nonprofit financed by the US government to analyze, basically, how big and effective our military dick is.

Both Kaplan and Ellsberg became aghast at both the destructive power of our nuclear arsenal and the mechanisms to control and utilize it. Both authors remark that every president from John F. Kennedy to the present day * was stunned by the power he controlled, a capability each learned of in his first days in office. And, yes, there is a “Football” and it does indeed contain the codes the president needs to launch a nuclear attack. But that “Football” is no barrier to all those people whose fingers rest not on the nation’s entire nuclear inventory but merely on some of it.

[ * Not only that, the succeeding presidents to a man immediately became convinced the nuclear arms race must be reversed, with one exception, acc’d’g to Kaplan. When the 45th President took office, he nearly gleefully urged his military commanders to increase significantly the number of nuclear weapons in the United States arsenal, just because, it can be surmised, bigger is better.]

US Marine Corps 1st Lieutenant Daniel Ellsberg (c. 1957).

Those button-pushers range from military theater commanders, admirals or generals in charge of broad regions of operation like the Pacific Ocean or Europe, down to bomber pilots and submarine captains whose craft are laden with one or more thermonuclear weapons. For instance, acc’d’g to Ellsberg, President Dwight Eisenhower in the late 1950s gave the then-named Commander in Chief–Pacific Command (CINCPAC), Admiral Harry Felt, the authority to use any and all of the nuclear weapons under his command, basically, any time he felt the need to. That order, Felt attested, had never been rescinded by the time The Doomsday Machine was published.

Going one step further, Regional CINC’s have authorized pilots and submarine commanders to use their thermonuclear weapons at their individual discretion any time communications are lost between themselves and their bases at times of high alert. Knowing what we know about the reliability of any of our modes of reaching out to each other (phones, radios, the internet), it’s reasonable to assume those pilots and captains’d be on their own, burdened with the decision to roast a city of several million, far more often than is comfortable to ponder.

In other words, a small town’s worth of potential Major T.J. “King” Kongs from “Dr. Strangelove,” flying airplanes or sailing on or beneath the surface of the world’s seas, are all that stand between us and armageddon.

Given that both Russia’s and the US’s strategies are to respond en masse with nuclear weapons should either party launch a single bomb against the other, only the sanity and sense of human decency of those few thousand has kept the lot of us from being cremated into our constituent atoms.

We’re All Guests Here In Smallworld

Things Every Adult Ought To Know.

Let’s ponder the Earth’s most dominant life form.

No, it’s not fans of Star Wars, Harry Potter, or Billie Eilish. It’s not Asians or Africans or Americans, North or South. It’s not cicadas (anymore, for the next 17 years or so) or mosquitos, despite what folks in their backyards on August nights might think.

It’s bacteria.


These prokaryotic microorganisms that household cleaners and hand sanitizers are designed to eliminate from the face of this mad, mad, mad, mad world are more numerous, more plentiful, and far more resilient than any other form of life hereabouts.

Bill Bryson writes in his exquisite A Short History of Nearly Everything:

Because we humans are big and clever enough to produce and utilize antibiotics and disinfectants, it is easy to convince ourselves that we have banished bacteria to the fringes of existence. Don’t you believe it. Bacteria may not build cities or have interesting social lives, but they will be here when the Sun explodes. This is their planet, and we are on it only because they allow us to be.

So, the evolutionary lifespan of bacteria appears to stretch from the very onset of life here on Earth, 3.5 billion years ago, to the last gasp of it, ten billion or so years hence. For such little guys, bacteria are strong and stubborn critters.

I took a food safety course back when I was working in the education department at Whole Foods Market (don’t hate me, please). The thrust of the entire course was kitchen staff and front of the house food handlers should have as their overriding concern the aim of eliminating every single eensy bacterium on our hands, our utensils, our food preparation machines, our plates, our pots and pans, in short, every conceivable surface that might come in contact with the pâtés and all-natural corn dogs we’re fixin’ up.

That’s an aim so ambitious as to be impossible. We might as well try to get rid of all those pesky nitrogen molecules we fill our lungs with every time we inhale. Our success rate would be about the same. For that matter, it’d be as destructive an aim as banishing the world’s bacteria, inasmuch as nitrogen is a key building block in both DNA and plant life. So let’s not, okay?

That doesn’t mean cooks, servers, and dishwashers ought to scrap the whole notion of scrubbing their mitts after engaging in the production of Nos. 1 and/or 2 and before getting back to handling comestibles. It’s all a numbers game; we’ll get back to that.

But let’s continue pondering the overall numbers of earthly bacteria, past and present, without which it’d be curtains around here.

A couple of decades ago, a researcher named William Whitman and his team at the University of Georgia took it upon themselves to estimate how many bacteria were alive at the time on this planet. Natch, they couldn’t hope to count them all, so they estimated. And they came up with an awfully big number. So big we’ll have to put it into words rather than figures. The Whitman gang posited that some five million trillion trillion individual bacteria lived, breathed, and ate on Earth in the year 1998. That’s the numeral 5 followed by thirty zeros. Hell, that’s a bigger number than all the US dollars the likes of Jeff Bezos, Elon Musk, and all the other plutocratic, borderline sociopathic wealth grabbers many Americans love to idolize possess or control.
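If you’d like to check the zeros yourself, here’s a tiny Python sketch. The only assumption is the American short-scale meanings of million and trillion:

```python
# "Five million trillion trillion" bacteria, per the Whitman estimate.
million = 10 ** 6
trillion = 10 ** 12
whitman_estimate = 5 * million * trillion * trillion

print(whitman_estimate)                  # the numeral 5 followed by thirty zeros
print(str(whitman_estimate).count("0"))  # count the zeros: 30
```

Python handles integers of arbitrary size, so the full thirty-one-digit number prints out intact.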

In the ensuing 23 years, those bacteria have happily reproduced, of course, so the Whitman number most assuredly is greater today.

Most bacteria live in the soil or in the oceans, leaving us landlubbers with a mere several million trillion of the little buggers to contend with (or benefit from). There are plenty of them to go around.

For instance, think of how much oil we’ve pumped from beneath the Earth’s surface since the development of the first commercially successful internal combustion engine 160 or so years ago. John Jones, of the University of Aberdeen’s School of Engineering, has estimated we’ve yanked some 135 billion barrels out of wells since John D. Rockefeller’s Standard Oil Company began obsessively pumping it way back in 1870. Burnable oil, most scientists believe, results from decaying plankton, other microscopic marine life, and bacteria. The lion’s share of the biomass most assuredly is bacteria.

Or should we now start saying the bacterium’s share?

From the page on bacteria from Oxford University’s Museum of Natural History:

Bacteria survive, thrive, fight and die by the trillion every moment. They swim using nanoscopic motors, and battle with spears. They sense, communicate, remember. And as scientists discover more about these tiny organisms, it is becoming clear that bacteria wield huge influence over us, shaping Earth’s past, our present and the future for us all. We have only recently realised how much our lives are inextricably linked with the lives of bacteria. We are living in a bacterial world.

What good do bacteria do for us? Plenty good.

Each of us has ten times as many bacteria in us as actual body cells. Don’t get scared. Of the 30,000-plus species of bacteria so far identified, only 100 or so can cause humans harm. The rest of them do things like produce oxygen and enzymes in plant life, clean our ground water, fertilize our farmlands, create vitamins within our stomachs and intestines, reside in our bellies so we can digest food, ferment things (both in nature and in labs), devour potentially harmful microorganisms in nature or in industrial spills, and do any number of other things w/o which we’d be in awful shape.

But it’s those dangerous little guys — salmonella, say, or campylobacter; foodborne illnesses — that we fret over when we see our waitstaff exit the powder room.

Don’t get scared again, but this is the truth: every bite of food we take is laden with billions of bacteria. Period. No way around it. Just as every breath we take is chock-full of nitrogen. That’s the world in which we live.

So why do we want our food handlers to scrub their paws? As mentioned earlier, it’s a numbers game. The most fecund among the many species of bacteria are able to create new generations every ten to twenty minutes. To drive that point home: some bacteria colonies can double in size in less time than it takes you to make your brown bag lunch in the morning.
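To see what doubling every ten to twenty minutes actually does, here’s a minimal sketch. It assumes a 15-minute doubling time, the midpoint of that range, and an arbitrary starting colony of 1,000 cells:

```python
# Exponential growth: a colony doubles once per doubling interval.
def colony_size(start, minutes, doubling_minutes=15):
    """Return colony size after the given minutes of unchecked growth."""
    return start * 2 ** (minutes // doubling_minutes)

print(colony_size(1000, 60))   # after one hour: 16,000
print(colony_size(1000, 240))  # after four hours: 65,536,000
```

That four-hour figure, you’ll notice, is exactly the window the food safety rules below worry about.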

So, if these evil little buggers — the aforementioned salmonella, campylobacter, et al — are so numerous and becoming more so by the minute, what’s the use of us even trying to ward them off?

Salmonella (R) & Campylobacter

The human body has an army of similarly minuscule cells whose sole purpose in this existence is to kick the crap out of bacterial invaders that have somehow weaseled their way into us. Three soldiers in this army are phagocytes (meaning they eat bacteria), body cells that have been made immune by previous exposure to bacteria, and things called complement proteins. To understand precisely how these troops work, I’d need to make this post 23,000 words long and, truth is, most people think I blab on far too much in the first place, so if you’re curious about their machinations you’re welcome to do your own research.

The success of any army is almost wholly dependent on numbers. The Allies beat the Axis in World War II because most of the planet’s nations were aligned against Hitler, Tojo, and Il Duce. Hell, the Soviets lost 20 to 30 million people in that fracas while a mere 3.6 million Germans gave up their lives for the Führer‘s deranged ambitions. Yet the Russkies and their satellite partners were among the participants in the carving up of Germany when the insanity ended on May 8, 1945. The USSR and the rest of the Allies had far outnumbered the Wehrmacht.

So it goes when the human body’s defenses sense an army of bacteria marching in on it. Which, as previously mentioned, happens every time you suck out of a straw dipped in a chocolate shake topped with whipped cream. There are more of our good guys than the bacterial army’s bad guys.


The bacteria that come into us from our food loiter within us for some time even as our defenses go about munching or otherwise neutralizing them. But, as mentioned, bacteria can multiply rapidly. Success for our continued good health depends on our phagocytes and their buddies doing their thing quickly enough so that the reproducing bacteria don’t begin to outnumber them.

One way bacteria from that chocolate shake can win the battle is if the load carried by the ice cream or even the tainted straw is so huge that our defenses are outnumbered. Then, the body must turn to its specialized soldiers. Call them our own Navy SEALs. They include fever, vomiting, and diarrhea. To mix metaphors, their purpose is to act like bouncers trying to clear up a barroom brawl. When the melee becomes too crazy, they simply have to turn on the overhead lights and toss everybody out the door.

A common date for people to experience that dramatic ousting is Thanksgiving. Often you’ll hear folks say they had the family over for the big turkey feed and a bunch of them caught the flu. That’s likely not the case. Vomiting and diarrhea aren’t normal symptoms of the flu virus. What’s far more likely is the bird, its juices, the sweet taters, and all the other foods in which bacteria can thrive and create numerous generations have been sitting out on the table for a couple of hours. People have been picking at the stuff and simply overwhelming their own defenses with monster loads of campylobacter or even salmonella.

A Veritable Orgy of Frolicking Bacteria.

Good food safety practices hold that the three main factors that control bacterial growth are Time, Temperature, and Acidity. There are a couple of other factors but they aren’t in your control, so let’s ignore them. The latest research shows that bacteria reproduce most efficiently between 40 and 140 degrees Fahrenheit. That’s why your refrigerator is set for 37 to 39 degrees. At that temperature, bacterial growth is retarded enough so that those sweet potatoes may keep for a few days, although after about a week, they’ll start looking rather psychedelic. Your freezer is set much lower: 0ºF. That slows bacterial reproduction down so much you can keep your yams frozen for a year and not see acid-trip colors on them.

As for time, foods generally can be left out for a total of four hours. Total being the key word here. You must add up all the time the food has been lingering at temps between 40 and 140. If it had been out for an hour and a half at dinner time, then put away in the fridge until the next day when it was taken and left out for another hour, that’s two and a half hours in the danger zone. Now you’re pushing your luck. And, if you eat the leftovers at room temperature, you haven’t killed off whatever bacteria resides in it through reheating, so that’s another risk you’re taking.
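The cumulative bookkeeping described above can be sketched in a few lines. The interval lengths here are the hypothetical ones from the example: an hour and a half at dinner, another hour out of the fridge the next day:

```python
# The four-hour rule is cumulative: total up every stretch the food
# spends in the 40-to-140ºF danger zone, across days if need be.
DANGER_LIMIT_HOURS = 4.0

danger_zone_hours = [1.5, 1.0]  # dinner time, then out again the next day
total = sum(danger_zone_hours)

print(f"{total} hours in the danger zone")
print("toss it" if total >= DANGER_LIMIT_HOURS else "edible, but you're pushing your luck")
```

Add a third long stretch on the counter and the total blows past four hours, at which point the safe move is the garbage can, not the microwave.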

And, really importantly, if you haven’t washed your hands before plunging them into the mashed potatoes, you’re transferring all the bacteria swirling around in the oils on your fingers and on your nails into your food. You ain’t gonna enjoy hugging the porcelain bowl after that — a possibility that increases with each violation of good food safety practices.

So, the takeaway is this: a few bacterial species, in sufficient numbers within us, can make us terribly ill or even kill us, but most bacteria keep us alive.

And ain’t that just like life? It’s the damnedest contradiction.

TEAOTK*: Visits To A Teensy Planet

* Things Every Adult Ought To Know, No. 1

Welcome to the first of — it is to be hoped — many. This one will provide few answers but many questions. And isn’t that what science is all about?

They’re Here! They’re Here!

Every ten or so years for the past three quarters of a century, Americans go UFO crazy.

Just after the end of World War II, and extending into the early 1950s, people in our Holy Land started seeing UFOs all over the place. Then, in the mid ’60s and on into the ’70s, after a lull in sightings, people became all agog over alien visitations again. UFO mania hit rock bottom in the ’80s and ’90s and then on into the 21st Century when people were too busy playing the stock market or worrying about when the Muslim War on the West * would explode. [ * Speaking of manias. ]

1st Question: Do You Believe In UFOs?

Well, do ya, punk? As for me, the answer is, Yes, of course I believe in UFOs! No one in good conscience and/or operating under the simple rules of grammar and logic can deny the existence of UFOs. They are things some people occasionally see in the sky that they cannot in any way, y’know, identify.

Now, if what you really mean is Do you believe this planet is being visited by intelligent beings from some other planet and they have been flying around for decades, watching us do whatever it is they think we’re doing?, my answer would be somewhat different. Is it possible alien spaceships are careening through our blue skies? Sure. It’s possible. Anything’s possible. But is it probable? Now things get a little sticky.

Perhaps one of the reasons many people are eager to believe UFOs are actually alien spaceships is their knowledge that even we, humans, the otherwise lunkheads who cannot save ourselves from climate change immolation or racial bigotry or jaw-dropping wealth inequalities, have already, in the last 64 years,* sent rocket ships and odd-looking machines into orbit around the Earth; to the moon, Mars, and Venus; on a grand tour of the solar system; and even into the fiery Sun.

[ * The USSR launched Sputnik into Earth orbit on October 4, 1957. It was the first human-made gadget ever to partially escape the bonds of this planet’s gravity. Sputnik, nearly two feet in diameter, was a shiny hollow metal ball with four radio antennae attached to it. Frankly, it looked cool as hell but, natch, it scared the bejesus out of America because many of us alive and aware at the time figured the godless commies were fixin’ to either drop hydrogen bombs on us from orbit or at least keep an eye on everything we do down here. Sputnik 1 stayed in orbit for precisely three months; it burned up in the atmosphere on January 4, 1958. The launch of that first Sputnik (Russian for satellite — clearly the Russkies’ guys in charge of naming the thing were not spiritual descendants of Tolstoy or Chekhov) signaled the beginning of the Space Race. ]

The idea being, hell, if we can do it, surely others in this big, wide universe can send contraptions our way, right?

The problem is, our space travels thus far have been embarrassingly modest in scope and distance. We’ve not yet come anywhere near traveling to inhabited cosmic locales. Some researchers suspect Mars or Saturn’s moon Enceladus may now or at some time in the past have harbored primitive, microscopic life, but it’s a good bet those little critters — if they exist — aren’t running around telling each other about visitors from another planet.

The farthest one of our spacecraft has flown is Voyager 1, launched in September 1977 to go poking around the outer reaches of the Solar System. As of May 31st this year, it is still flying outward from us and the Sun, still receiving and transmitting messages, and is a little bit more than 14 billion miles away from our star. Now 14 billion miles seems like a fairly ambitious trek but, in the scheme of things, it’s next to nothing.


It’s taken Voyager 1 some 44 years to get that far out. But, as I say, “that far out” ain’t squat. The space probe is still within the boundaries of the Solar System. Even at 14 billion miles out, it’s not but a third of the way to the currently known edge of the Solar System, a boundary known as the Kuiper Cliff. The farthest extent of the Kuiper Belt, the eponymous Cliff is the place beyond which no objects circling the Sun have yet been identified. That doesn’t mean they don’t exist, only that we can’t see them. So the Solar System just might extend out much farther than the Kuiper Cliff’s 47 billion-mile radius.

That means we haven’t even left home yet, really.

So, assuming no intelligent creatures live in our Solar System (and there’s debate over the question of whether we humans are intelligent creatures, to be honest) we’ll have to look to the stars for civilizations that might be advanced enough to take an extended weekend trip to this tiny rock.

The nearest star to our Solar System is called Proxima Centauri. It is four and a quarter light years away. That’s almost 25 trillion miles. Trillion, babies. Twenty-five thousand billion. It’d take Voyager 1, were it so aimed, nearly 84,000 years to get to Proxima Centauri at its current rate of speed. To give you an idea of how long that is, consider that humanity, 84,000 years ago, had not yet achieved its Great Leap Forward, in which it learned to bury its dead, make clothing from animal skins, or draw those animal figures in the Lascaux caves in southwest France. In other words, humans have evolved to a spectacularly dramatic extent in that time. How might our species evolve over the next 84,000 years? We’d certainly be unrecognizable to our contemporary selves, no?
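If you’d like to check my arithmetic, here’s a quick Python sketch. The speed of light is a textbook constant; Voyager 1’s cruising speed I’ve rounded to roughly 38,000 miles per hour, an assumption on my part, so treat the travel-time figure as order-of-magnitude only — plug in a different speed and you’ll land somewhere else in the same tens-of-millennia ballpark:

```python
# Back-of-envelope check on the distance to Proxima Centauri and how long
# a Voyager-speed probe would need to cover it. Voyager 1's ~38,000 mph
# is a rounded assumption; the travel time is order-of-magnitude only.

LIGHT_SPEED_MPS = 186_282                # miles per second (textbook figure)
HOURS_PER_YEAR = 365.25 * 24

def light_years_to_miles(light_years):
    """Convert light years to miles using the speed of light."""
    return light_years * LIGHT_SPEED_MPS * 3600 * HOURS_PER_YEAR

proxima_miles = light_years_to_miles(4.25)
print(f"{proxima_miles:.2e} miles")      # about 2.5e13 -- 25 trillion miles

VOYAGER_MPH = 38_000                     # assumed cruising speed
travel_years = proxima_miles / VOYAGER_MPH / HOURS_PER_YEAR
print(f"{travel_years:,.0f} years")      # many tens of thousands of years
```

Either way the punchline stands: at chemical-rocket speeds, the nearest star is an entire epoch of human evolution away.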

Anyway, let’s assume that putative intelligent civilization on a planet circling Proxima Centauri has developed a propulsion system allowing its space probes to travel much faster than Voyager 1. There are a couple of problems with getting spaceships up to interstellar speeds. One is fuel. You can’t use coal or gasoline to achieve those speeds, of course, and even our most advanced liquid rocket fuels — subcooled liquid oxygen and kerosene in SpaceX’s Falcon Heavy — can only produce speeds of 25,000 miles per hour. And the Heavy must carry 430 tons of the stuff to get it into orbit around the Earth. Multiply that by the fingers of both hands plus those of several of your friends to get a rocket free of the Earth’s gravitational bonds. That’s heavy (you’ll pardon the pun) and a problem our Proxima Centauri folks’d have had to overcome so many, many, many, many years ago.


Let’s assume the Proxima Centauri-ites have developed the Mother of All Rockets, capable of propelling a probe at speeds far beyond what we, simple humans, have thus far conjured. How fast would it go?

Faster, Faster, Faster!

Well, you’d like it to travel at some significant fraction of the speed of light, right? Oops. There’s another problem. The speed of light is the universe’s, well, speed limit. No piece of material, complex or otherwise, can travel faster than that. In fact, no object with mass can even reach that speed — the closer it gets, the harder the universe pushes back. Meaning some super-advanced Toyota Prius whose makers might hope for it to go, say, 90 percent the speed of light would be straining against the very fabric of physics, rather than merely against the patience of its occupants.

Not only that, the energy needed to accelerate a nice-sized piece of machinery to any significant fraction of the speed of light approaches infinity the nearer it gets to that speed. It takes scads and gobs of energy simply to get subatomic particles within a whisker of the speed of light at places like CERN’s Large Hadron Collider or Fermilab’s late, lamented Tevatron — machines that gulp electricity on the scale of a small city. Imagine the power needs of our souped-up Prius.
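To put a number on how fast that energy bill blows up, here’s a small Python sketch using the standard relativity formulas — the Lorentz factor and relativistic kinetic energy. The 1,500-kilogram mass is just my guess at what a Prius-sized machine weighs:

```python
# How the universe's speed limit bites: the kinetic energy of a mass m
# moving at speed v is (gamma - 1) * m * c^2, where the Lorentz factor
# gamma = 1 / sqrt(1 - (v/c)^2) blows up as v approaches c.
import math

C_METERS_PER_SEC = 299_792_458.0        # speed of light

def lorentz_gamma(v_fraction_of_c):
    """Lorentz factor for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_fraction_of_c ** 2)

def kinetic_energy_joules(mass_kg, v_fraction_of_c):
    return (lorentz_gamma(v_fraction_of_c) - 1.0) * mass_kg * C_METERS_PER_SEC ** 2

# A ~1,500 kg car (Prius-ish, an assumed figure) at 90% of light speed:
print(lorentz_gamma(0.9))                         # about 2.29
print(f"{kinetic_energy_joules(1500, 0.9):.1e}")  # about 1.7e20 joules
print(lorentz_gamma(0.9999))                      # about 71 -- and climbing
```

That 1.7 × 10²⁰ joules is, ballpark, a sizable fraction of all the energy humanity burns in a year — for one car, before it so much as flips on its turn signal.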

You Need A Machine This Big To Accelerate A Proton.

So, we’ll have to say it’d take those Proxima Centauri explorers at the very least many thousands of years to get to us, during which time they’ll not only have evolved through countless generations but they’ll have had to eat, defecate, bathe, read, have sex, clean out their rocket’s closets, and all the other things intelligent creatures must do. I’d guess after some tens of thousands of years, interstellar space travelers probably would have forgotten why in the hell they headed this way in the first place.

Then again, they might have sent un-crewed space probes to visit us. That’s a possibility. The problem there is powering the thing. The Proxima Centauri-ites’d have to have come up with a power source to keep the turn signals and navigation system on in the thing, no mean feat. Any civilization that comes up with a battery that lasts tens of thousands of years is advanced indeed.

Say they did send an un-crewed craft to fly around our skies. Fair enough; as I say it could be possible. The thing is, people these days are seeing not one, not a couple, not several, not even ten, but dozens and hundreds of UFOs that, they think, must most assuredly be alien spaceships. All those problems associated with getting one craft here from another star’s planet must be multiplied accordingly to get those hundreds here.


Guns are displayed at Dragonman’s, an arms seller east of Colorado Springs, Colo.

Come to think of it, why is the Earth so special that another civilization must labor so spectacularly to get here? And why must that civilization’s scientists keep its probes circling the Earth for years and years and years only to learn that we obsessively watch TV, hate each other over our external colors, spend our treasure on devices that kill each other, and amuse ourselves by listening to Kanye West and Harry Styles?

Were I a Proxima Centauri-ite, I’d say Earthlings are a dreadful bore when they’re not downright dangerously weird. Let’s go someplace else.


I’ll say it again, it’s entirely possible some wildly advanced alien civilization has visited the Earth or is in the process of gallivanting around in our atmosphere. I doubt, though, if it’s true, that we’d even be able to recognize their arrival. The difficulties in interstellar travel are so many that we can’t even comprehend what such successful travelers between the stars might look like. They wouldn’t be traveling in souped-up Priuses or even customized Falcon Heavy rockets.

I can’t see the dark blobs on photographs and videotape taken by Air Force pilots being the preferred method of interstellar space exploration for a group of beings that has somehow outpaced human intellectual development by a factor of thousands.

Again, there are UFOs, to be sure. And again, we have no idea what in the holy hell they are.



Hot Air: What Does a Pencil Look Like?

A Different Direction

Join me in something new here.

For the last year or more, I’ve been averaging only a post a month on this global communications colossus. From the time I started The Electron Pencil back in 2012 through 2019 or so, I strove — and mostly succeeded — to put up a post a day herein. For the last couple of years of that run, I wrote about the 45th President of the United States more than any other topic. Much more. The truth is, what in the hell else was there to write about starting in the summer of 2016? What had once been a Simpsons cartoon joke had become — improbably, alarmingly, disturbingly — serious business. The joke was on us.

Funny-Not Funny.

So, as I say, I wrote, angrily for the most part, about President Gag. And, truth be told, it eventually became a millstone. Thinking and writing about Trump, that is. By ’19, I was sick to death of him and the country that had elected him on a technicality. Next thing I knew, I was going weeks at a time without putting up a Pencil post.

Even though this Holy Land has had a new president for some five months now, I’ve not yet got back into the groove of posting regularly, much less daily. And for that period of time I’ve been wondering what to do with this tool I have at my fingertips and that I pay for, I might add. I subscribe to the WordPress Business package, an option that allows me to put up podcasts and get all sorts of analytics and bells and whistles that the WP free basic package lacks. I pondered long and hard about simply going back to basic and saving the yearly premium subscription fee. Hell, I even tossed around the idea of closing down this shop altogether, but I abhor that option most of all.

Back at the beginning (the year 1 AP, or Anno Penicillum * ) I did a lot of local news coverage and opinionating here, another thing I lost pretty much all my ardor for as Bloomington, like the rest of the country, became a soap opera of antagonists snarling at each other, righteous brothers- and sisters-in-arms convinced everyone on the other side of even the most innocuous issue was in league with Satan, or at least an aspiring child pornographer. I eventually lost any desire to continue wading into the cesspool of local news and issues as well.

[ * Some sources have the word penicillum as the Latin translation for the American English pencil. Those sources go on to assert the Latin word actually meant small penis back in the days of Cicero and Augustus Caesar. I suppose I get the connection, pencils and penises sort of resemble each other — emphasis on sort of. Once I learned this, though, I was hooked. Yep, I’m definitely denoting each year of the Pencil era as an Anno Penicillum.]

Bill Bryson

In any case, I’ve considered any number of different ways I could go with this blog and website. The one, though, that keeps popping back into mind has to do with science. Loyal Pencillistas know I’m a voracious reader. I purchase books the way some people buy cars or wine or Hummels. That is, obsessively. At the Book Corner, where I still work a few hours each week, when people ask me what I like to read, I tell them history and science. Hell, my favorite living author is Bill Bryson, who writes about both topics (as well as language and travel).

So, yeah, science. I love science. Or shall I say sciences? Every single one of them. Astronomy, particle physics, engineering, medicine, biology, geology, archeology, anthropology, mathematics. Name a hard science and I’m in on it, as much as an unlettered layperson can be. The soft sciences — psychology, sociology, and political science — you can keep. I mean, I’ll converse with anybody about those topics; for pity’s sake, I’ll converse with anybody about anything. But I’m fairly averse to accumulating books on those subjects and I take the pronouncements emanating from mavens in those soft sciences with a grain of salt. But the sciences that traffic in testable, demonstrable, observable principles? Friends, count me in.

Ergo (don’t you just love Latin?), I want to turn this Pencil thing into a fun science reader. Sure, why not? The idea being in each post I’ll ruminate * on a specific science or topic, illuminating it with a light, hopefully witty, touch. Let’s look at it as a digest of Things Every Adult Ought to Know. Every adult and a goodly number of exceptional kids, too.

[ * Most dictionaries define ruminating as 1) thinking deeply about a subject and 2) chewing cud. Don’t you just love American English?]

What’s She Thinking About?

Richard Feynman

Don’t you agree there is a floor-level of knowledge the grown-up human beings of the 21st Century ought to possess? We don’t necessarily have to be on intimate terms with quantum electrodynamics (the daddy-o of which, Richard Feynman, once famously said anyone who purports to truly understand that particular science simply doesn’t) but, dang mang, we should by all rights know the difference between tensile, torque, shear, and compressive strength (we’d like to feel safe and secure when driving across big, high bridges) or what the four macronutrients are for human beings (water, fats, carbohydrates, and proteins). We don’t need to be PhD candidates in any of these sciences but, golly, we’d better know a little something about all of them.

For that matter, each and every one of us should know who Rosalind Franklin, Cecilia Payne, and Loney Clinton Gordon were. BTW: I’m not linking to their names here because I want to do future posts on each of them and more.

I’m going to start up this new Pencil push sometime within the next few days. If you dig it, keep coming back. If not, there are plenty of other ways for you to occupy your time in this world. Speaking of the world, did you know a University of Texas researcher determined that if everybody alive on Earth today hoped to enjoy a lifestyle similar to the average American, we’d need the resources of ten planet Earths?

See what I mean? That’s the kind of thing I’ll traffic in when this new Things-Every-Adult-Ought-to-Know phase of the Pencil kicks off.

See you soon.

Does This Look Like a Bunch of Penises to You?
