Giving Thanks

Two blue post office mail boxes

It is that time of year when we are encouraged to step back, assess our lives, and give thanks. And in doing so this year an unlikely recipient of my thanks has floated to the surface of my own consciousness—the United States Post Office.

Out of nowhere, right?

Can you imagine an institution that has faced more change over the last couple of decades? Can you imagine an eatery or coffee house with a more complicated menu? Or a company with more impatient or intolerant customers?

Technology, of course, has eliminated much of the hard mail that anyone really looked forward to in the past. We now have e-mail, Facebook, Skype, and the like for staying in touch with friends and family. What’s left are bills, which are themselves quickly disappearing from the mailbox, and, of course, circulars, which I strongly believe should be illegal, for many reasons.

What’s left, in other words, are the dregs of the delivery business: packages of every size and shape imaginable, bound for Uncle Ned, who lives off the grid in the middle of nowhere.

And yet the level of respect and courtesy postal workers show their customers is, in my experience, unsurpassed by any company, utility, or government agency. And I mean it. I can think of no organization on the planet that so consistently delivers such high levels of customer service at a one-on-one level, to customers who don’t always deserve it.

I am sure there are many people who deserve credit for that. But I have to give some of the credit to the American Postal Workers Union (APWU) that represents US postal workers. Now, I have never been a member of a union and as a retired business executive I spent a good part of my career going head to head with unions, not all of it pleasant. In my experience, however, companies get the unions they deserve. I strongly believe that the decline in union membership in the US is both cause and effect of the stagnation in wages and the obliteration of the middle class we are now experiencing.

And, as a result, we get the service that we, as customers, pay for. The drive to improve customer service at the USPS would not have succeeded if postal workers were paid on a disgraceful par with fast-food or other retail workers. The APWU forced us to pay our postal workers the respect of a living wage. And look at what we got in return. It has been a very good investment.

It goes back to an age-old lesson that we just never seem to learn: You reap what you sow. Pay people only as much as you are forced to and you will get the level of commitment and engagement you deserve. Pay them a living wage and they will pay you back with service and courtesy many times over.

Now, if we could only do something about those circulars.

Contact: You may reach the author at

Who Belched?


If a novel opens with, “There was an audible grunt in the audience,” you would probably assume it was a man who was grunting. A grunt is a well-defined noise, not unlike the way English treats consonants. They are distinct and discrete sounds, naturally associated with men.

Man historically applied his manhood to the world through the universal “he” pronoun, a convention most authors and linguists challenge today. Some still ignore the challenge, of course. But what have the rest of us done?

Some of us use “she” as the universal pronoun, but such usage never goes unnoticed. Many think of “she,” with some disdain, as the politically correct “he.” Others are okay with it but notice nonetheless. And the rest of us use “it” or “one,” as in “one beer drinker belched.” Does anyone, however, think of “it” as having any qualities associated with the female gender?

Language is an entirely arbitrary convention. It is whatever we say it is. So who decided that “it” would be gender neutral? Men, of course. It’s a bit of English trickery, in fact, since the universal “it” in French is transparently masculine.

Like everything else in life and the universe, language exists in context. While it is a medium for communication, its very structure and rules of usage influence the outcome. Language doesn’t just unveil meaning; it defines it.

Powerful is one of the most, well, powerful adjectives in the language. A person who can lift a car by its bumper is certainly powerful. Some people are powerful storytellers. Some have a powerful voice. Corporate and political leaders are typically powerful. So, too, are military generals.

The adjective “powerful” is, in theory, gender-neutral. Each of the people described in the preceding paragraph could have been a man or a woman. The chances are good, however, that most people were inclined to think of men.

Since context is everything, as I note in Understanding Life, it’s important to recognize that context inevitably renders a solution that is defined by the medium itself. Language is no exception.

Whoever invented scissors, for example, had to have been right-handed. When a left-hander like me uses typical off-the-shelf shears, the effort to cut a sheet of paper in half is not pretty.

Why does there remain a glass ceiling in corporate America today? Is it that all men are closet misogynists? There are some of those, for sure. But the real reason is that men designed the game that is advancement in the corporate arena. (A very masculine way to put it, of course.) It’s not enough for men to “accept” the idea of women in the corner office. Gender equality in business won’t happen until we redefine the rules of the game in a totally gender-neutral way.

The same goes for race. “Color blind” won’t cut it. Visual neutrality is not enough. We have to redefine the game and the way we talk about it.


Language as Hammer

Carpenter with hammer hitting nails

I am fascinated by language. Not linguistics, mind you. My interest is the arbitrary and symbolic nature of language. Given the role it plays in communication, it’s important to understand the tool and not just the results. Because the tool influences how we get the job done. And the how, being the bridge between the tool and the objective, is where we spend most of our time and effort.

Consider the hammer. All carpenters carry one. Why? Because of the nails. The nail has long been the “how” of choice for securing two pieces of wood. And so it is with letters, sounds, and numbers. They require a tool for coordination. We call that tool language.

Orwell’s dystopia, still fresh in my mind, included an active initiative to shorten the language to as few words as possible. This makes sense given the Party’s objective of limiting thought. Think political correctness today. The objective is the same: prevent unsanctioned thought.

While I have always considered language to be limiting and arbitrary, I am just now beginning to realize how much language actually influences thought and behavior. Enter stage right a study by two cognitive psychologists, Lera Boroditsky and Alice Gaby, of Stanford and UC Berkeley, respectively.

Their experiment involved native speakers of English, Hebrew, and Kuuk Thaayorre. The last is the language of a small Aboriginal community in northern Australia.

For each group they laid out a set of pictures representing a time progression. One set, for example, showed a man aging. Another showed a banana being eaten. And so on.

They then asked each participant to arrange the pictures in the proper order of progression. And guess what? The English speakers arranged them left to right. The Hebrew speakers arranged them right to left, which, if you don’t know, and I didn’t, is how Hebrew is written. The Aboriginal participants, however, arranged them from east to west, regardless of which point on the compass they were facing at the time. The layout relative to their own bodies, in other words, depended on the direction they happened to face.

So what? Well, as just one example, the engineers working on artificial intelligence at places like Google are close to mastering machine translation. Language is a structure of patterns, so that’s not surprising. It’s a perfect application for the digital brain, really, so it’s not too incredible that we will soon be able to hold normal conversations in multiple languages simply by holding our smartphones in front of us.

But will we truly understand if we don’t think in the fashion that the other participants think? (Again, the importance of why, the theory behind my Understanding Series of books.)

The biggest biases built into the English language will have to wait, as I have run out of room today. A preview: the English language is structured in ways that causally reinforce the convention of white, male supremacy. Not the language of PC, mind you. The kind we all use.


Irony & the Internet

Anti-Prism demonstration, Frankfurt

For some reason beyond conscious thought I decided to go back and read George Orwell’s dystopian masterpiece, 1984. As the book was written in 1949, it has often been heralded as eerily prescient. And that assessment surely held as I worked my way through it.

With my sensitivity thus set to maximum, I read an online news article this morning about irony. The assertion made was that irony is the most misused word in the English language today. It is often used to denote a merely surprising or unexpected coincidence when, according to the experts, the unexpected side of the comparison must be intentional. (Merriam-Webster, the living-language folks, dispute this. The fact that irony is so widely used in the mere-coincidence way makes that usage appropriate. Bravo, M-W!)

One of the central themes of 1984 is the notion of the three-tiered pyramidal society. (The Inner Party, the Outer Party, and the proles, in the Orwellian vernacular.) History is merely the record of the Top and Middle levels of society jockeying for position. (The Bottom, Orwell maintains, are so consumed with the drudgery of existence that they cannot escape.)

All of that changed in 1984, however, because of the Thought Police, who have found a way to make the existing social distinctions permanent.

The Thought Police have always been a threat, of course, but for much of history they have lacked the means to act on their intent. In 1984, however, the ruling class has the telescreen, a 24/7 interactive communication device controlled by the government. (It is ironic, in the intentional sense of the word, that only the Inner Party members can turn off the machines that watch them.)

In 2017, of course, we have the Internet. And we are now beginning to understand, thanks to the Russians, that the Internet is not the democratic utopia once envisioned. The algorithms that it is built upon are inherently biased, and there is no effective way to patrol the activities of 3 billion reporters in the field for accuracy. The Internet can even be turned back on us, the hapless users, to spy on us. Not even our pajamas can protect us any more.

The problem with Orwell’s Thought Police, of course, is the same problem found in the proper use of irony. Beyond the tipping point it can no longer be contained.

Perhaps the most concerning aspect of Orwell’s dystopia, which is truly ironic, is that it thrives on patently ironic words that are at the heart of the State’s power. “Crimestop, in short, means protective stupidity.” And blackwhite, which “means also the ability to believe that black is white, and more, to know that black is white, and to forget that one has ever believed the contrary.”

In yet another irony, the most powerful allies of the Thought Police are the children, since all children, after all, are taught to lust for power, to get ahead, by the very parents they ultimately turn in.

Ironic indeed.


What Science Tells Us About _______

Microbiological studies

The title of this post is ubiquitous on the Internet these days. And it’s misleading in a way that the Internet seems to excel at. It’s both powerful clickbait and a deceitful way to disguise opinion by wrapping it in a white lab coat and giving it the aura of irrefutability.

Clinician Chris Kresser has written an excellent book called Unconventional Medicine. It was just released and offers both a damning critique of our current health care system and an attractive, logical alternative. Our current healthcare model, Kresser argues, is built on a preoccupation with trauma and disease.

He suggests, however, that advances in technology have allowed the science of medicine to evolve at a faster rate than the human body itself. We are victims of our own success. And technology has impacted our social, physical, and epicurean habits in ways that the current specialized and disease-centric medical paradigm is ill equipped to recognize or address.

As a result, “One in two Americans now has a chronic disease, and one in four has multiple chronic diseases…” The treatment of chronic disease now absorbs more than three-quarters of all healthcare spending and we’re not winning. Chronic diseases, like diabetes and cardiovascular disease, now account for seven of the top ten causes of death.

The pharmaceutical companies, of course, are the big winners. But, as Kresser points out, it’s not a fair fight. The pharmaceutical companies now fund two-thirds of all medical research in the US, contributing greatly to the disturbing trend exposed by Dr. Barbara Starfield: “…medical care is the third leading cause of death in this country.” Side effects and the interaction of drugs prescribed by multiple medical specialists are undoubtedly contributing to the problem.

This is, of course, part of a much bigger problem in America today. English biochemist, author, and researcher Rupert Sheldrake refers to it as “the science delusion.” In his case the characterization is based on ten presumed scientific dogmas that are themselves not authoritative in any scientific way.

Marcia Angell, a former editor of the New England Journal of Medicine, has this to say: “It is simply no longer possible to believe much of the clinical research that is published.” Stanford researcher John Ioannidis agrees. He has published a paper entitled “Why Most Published Research Findings Are False,” noting that most research is better at cataloguing the prevailing bias than at discovering new scientific truths.

The proof is in the pudding. The scientific method is based on replicability. Cause and effect, science holds, is fixed by the laws of nature. In one recent study, however, researchers attempted to replicate the results of 100 published psychology studies and failed to do so in 65 percent of the cases. Researchers from Bayer, likewise, attempted to replicate the research behind sixty-seven blockbuster drugs currently in use and failed in 75 percent of their efforts.

Where is Pyrrho of Elis, the Greek philosopher who founded the philosophical school of skepticism, when we need him?


The Humanity of Analog

Close-up of a turntable needle playing an LP

It is commonly accepted wisdom that those of us who can remember where we were when President Kennedy was shot are not particularly good with all things digital.

There are many theories as to why. Most of them have to do with the momentum of habit, and, of course, a general lack of familiarity. It is, in fact, hard for me to remember a time when my teenage daughters didn’t have their iPhones in hand.

I have another theory, however. I think this apparent distinction simply reflects the difference between the digital and analog worlds. The turntable on which my father listened to his prized Boots Randolph records was analog. As was the reel-to-reel tape deck on which I listened to Emmylou Harris in college. My daughters, on the other hand, have their smart phones.

The distinction matters because analog devices could be broken, fairly easily, in fact. If something wasn’t working properly, you approached the repair with some caution, knowing that if you did the wrong thing you could make the device irreparable.

Digital, on the other hand, can usually be repaired through a risk-free reboot. Just unplug it, wait a moment, and plug it in again. Digital technology is, in a word, regenerative. It can heal itself. (The irony, of course, is that digital ensures obsolescence. When it does break, it can’t be fixed.)

I am not a total digital neophyte, mind you. I don’t have a landline and I have active accounts on Facebook, Twitter, and LinkedIn. I even blog, obviously.

When I checked my iPhone for news the other day, there was a notification that I had a message entitled “Best Moments of 2017.” It looked harmless enough, so I clicked, and was treated to a wonderful video showing a selection of my own pictures set precisely to appropriate music. It was complex enough, in fact, that the pictures were not sequenced evenly. The pictures themselves had clearly been curated.

There were a couple of tells, as they say in poker. The picture of my feet, taken in obvious error, was one. And my wife was aghast at the selection of pictures chosen of her. Overall, however, it was pretty darn good.

As a result, I immediately assumed that someone had put it together. Here was a Swiss watch; it was obvious that someone had designed it.

I texted my daughters and thanked them, telling them, of course, how proud I was of their digital talents. They, in turn, texted back and informed me that my phone itself had created the video using software that I didn’t even know existed.

“How cute,” noted my sixteen-year-old, “that you thought we had done it.” I immediately felt old, of course, but I wasn’t sure if I was impressed or saddened that my daughters will never know the analog world in which I grew up. It was a more vulnerable world. Not quite as clever perhaps. But more like us in that way.


Digital Idealism

Social Network Applications
photo credit: iStock/Wachiwit

The Internet is empowering idealism. And that’s not a good thing.

Internet trolls have long sought to expose the fact that Internet celebrities frequently doctor their selfies before posting them. They take off the weight, eliminate the skin blemishes, tone the muscles, and generally enhance the photo toward an idealized state. Even the photo itself is staged—stand like this, not this.

Au naturel, it seems, is so yesterday. Despite the frantic efforts of the trolls to gain their own notoriety, however, the legions of followers don’t seem to care. If anything, they actively promote the doctoring. Those who expose the truth or post undoctored photos are, in the end, themselves shamed for enabling the shamers. “Leave the bubble be!”

The ideal, in other words, is the new reality. Thigh gaps and six-pack abs are the new standards of beauty even though they don’t really exist in nature.

Whether this pursuit of idealism at the expense of reality is cause or effect I don’t know. Perhaps this simply underlines the aspirational power of all things digital. The Internet of Everything, after all, promises to eliminate scarcity, maximize our leisure time, and put astonishing experiences at our fingertips.

This trend, however, will have both intended and unintended consequences. And chief among the latter, I suspect, will be a continual erosion of trust.

Earlier this year, Mark Zuckerberg, the CEO of Facebook, said, “For the last decade we’ve been focusing on making the world more open and connected…” Going forward, however, he has defined Facebook’s new strategy and vision to be “to give people the power to build community and bring the world closer together.” And, “Going forward, we will measure Facebook’s progress with groups based on meaningful groups, not groups overall.”

It sounds enticing, to be sure. But is it realistic? I think not.

The main ingredient of a viable and productive community is trust. In the small communities of yore, and I was born and raised in one, that trust was largely gained through familiarity and exposure. The transparency was 24/7. There are no secrets in a small community. Whether or not the members like each other, they know each other’s business.

That’s not the case in a digital community. Fake news, doctored photographs, and over-hyped personal narratives are the norm. The duality of the Internet is that it is both the perfect place to find community and the perfect place to hide. There is little distinction between truth and propaganda.

The Internet is a world of fantasy, not authenticity. And, in the short run, it might be fun to join the fantastic communities it spawns. Over time, however, I think the search for substance will come back into vogue. It always does. It’s how we’re wired.

The trust deficit, for many reasons, including the algorithmic bias I have discussed before, is inherent to a digital society. We need more community, for sure, but we won’t find it online once we understand its reality.
