Saturday, July 22, 2017

American Pastoral by Philip Roth

This is the first Philip Roth that I have read, and I suppose that I’ll read more, although I’ve been hearing lately about how dark and nihilistic he can be. American Pastoral certainly has its darkness and its nihilism, but I think it is also attempting to do something profound.

I was ten and I had never read anything like it. The cruelty of life. The injustice of it. I could not believe it. The reprehensible member of the Dodgers is Razzle Nugent, a great pitcher but a drunk and a hothead, a violent bully fiercely jealous of the Kid. And yet it is not Razzle carried off “inert” on a stretcher but the best of them all, the farm orphan called the Kid, modest, serious, chaste, loyal, naive, undiscourageable, hard-working, soft-spoken, courageous, a brilliant athlete, a beautiful, austere boy.

This is on page 9, and our first-person narrator (who seems to disappear as the novel wears on) is reflecting on a baseball morality play that made a big impact on his young mind. As he continues, he speculates on what the lessons of the book might mean for our actual protagonist, a boyhood hero of the narrator nicknamed the Swede.

Needless to say, I thought of the Swede and the Kid as one and wondered how the Swede could bear to read this book that had left me near tears and unable to sleep. Had I the courage to address him, I would have asked if he thought the ending meant the Kid was finished or whether it meant the possibility of yet another comeback. The word “inert” terrified me. Was the Kid killed by the last catch of the year? Did the Swede know? Did he care? Did it occur to him that if disaster could strike down the Kid from Tomkinsville, it could come and strike the great Swede down too? Or was a book about a sweet star savagely and unjustly punished -- a book about a greatly gifted innocent whose worst fault is a tendency to keep his right shoulder down and swing up but whom the thundering heavens destroy nonetheless -- simply a book between those “Thinker” bookends up on his shelf?

The book our narrator is referring to is called The Kid from Tomkinsville (although even he contemplates that it could have better been called The Lamb from Tomkinsville or even The Lamb from Tomkinsville Led to the Slaughter), but, of course, behind him it is Roth referring to American Pastoral. The Swede -- Seymour Irving Levov -- is the Kid, in the sense that the Swede is also a “sweet star savagely and unjustly punished,” but that juxtaposition is not the profound thing that Roth is trying to do. For that, we have to understand that the Swede is not just a man, but an entire generation of Americans, and that the thing that brings him down is not cruel fate, but the beloved generation that follows his own.

But before going there, Roth offers the reader a caution. He doesn’t know if he can actually succeed in doing what he’s attempting to do.

An Astonishing Farce of Misperception

You fight your superficiality, your shallowness, so as to try to come at people without unreal expectations, without an overload of bias or hope or arrogance, as untanklike as you can be, sans cannon and machine guns and steel plating half a foot thick; you come at them unmenacingly on your own ten toes instead of tearing up the turf with your caterpillar treads, take them on with an open mind, as equals, man to man, as we used to say, and yet you never fail to get them wrong. You might as well have the brain of a tank. You get them wrong before you meet them, while you’re anticipating meeting them; you get them wrong while you’re with them; and then you go home to tell somebody else about the meeting and you get them all wrong again. Since the same generally goes for them with you, the whole thing is really a dazzling illusion empty of all perception, an astonishing farce of misperception.

Roth can’t truly know the subject of his own book, just as, we will come to see, the two characters within it who come to represent the older and younger generations of America -- Swede Levov and his daughter Merry -- can’t truly know each other. It is the desire to know, and the inability to do so, that creates the bitter tragedy.

And yet what are we to do about this terribly significant business of other people, which gets bled of the significance we think it has and takes on instead a significance that is ludicrous, so ill-equipped are we all to envision one another’s interior working and invisible aims? Is everyone to go off and lock the door and sit secluded like the lonely writers do, in a soundproof cell, summoning people out of words and then proposing that these word people are closer to the real thing than the real people that we mangle with our ignorance every day? The fact remains that getting people right is not what living is all about anyway. It’s getting them wrong that is living, getting them wrong and wrong and wrong and then, on careful reconsideration, getting them wrong again. That’s how we know we’re alive: we’re wrong. Maybe the best thing would be to forget being right or wrong about people and just go along for the ride. But if you can do that -- well, lucky you.

And don’t try to take sides, Roth seems to caution us. One is not right and the other wrong -- or at least there is no way for us to tell, given our basic ignorance of another’s interior working and invisible aims. Better to just go along for the ride.

The American Pastoral and the American Berserk

Here’s the passage that gives you the clue you need to decipher Roth’s profundity.

The disruption of the anticipated American future that was simply to have unrolled out of the solid American past, out of each generation’s getting smarter -- smarter for knowing the inadequacies and limitations of the generations before -- out of each new generation’s breaking away from the parochialism a little further, out of desire to go the limit in America with your rights, forming yourself as an ideal person who gets rid of the traditional Jewish habits and attitudes, who frees himself of the pre-America insecurities and the old, constraining obsessions so as to live unapologetically as an equal among equals.

This is Swede Levov, the child of Jewish immigrants, a generation of people embracing the American dream and all of its totems and rituals.

And then the loss of the daughter, the fourth American generation, a daughter on the run who was to have been the perfected image of himself as he had been the perfected image of his father, and his father the perfected image of his father’s father … the angry, rebarbative spitting-out daughter with no interest whatever in being the next successful Levov, flushing him out of hiding as if he were a fugitive -- initiating the Swede into the displacement of another America entirely, the daughter and the decade blasting to smithereens his particular form of utopian thinking, the plague America infiltrating the Swede’s castle and there infecting everyone. The daughter who transports him out of the longed-for American pastoral and into everything that is its antithesis and its enemy, into the fury, the violence, and the desperation of the counterpastoral -- into the indigenous American berserk.

And this is Merry, the radical, a generation of people disillusioned with the very totems and rituals that define the generation that came before.

In the course of the novel we will discover that Merry took her radicalization seriously, bombing a drugstore in her hometown, killing the proprietor, and spending years on the run and out of touch with her father -- all as a protest against his politics, his country, his generation, him.

And we will also discover that, for these things, the Swede can only blame himself.

I am thinking of the Swede and of what happened to his country in a mere twenty-five years, between the triumphant days at wartime Weequahic High and the explosion of his daughter’s bomb in 1968, of that mysterious, troubling, extraordinary historical transition. I am thinking of the sixties and of the disorder occasioned by the Vietnam War, of how certain families lost their kids and certain families didn’t and how the Seymour Levovs were one of those that did -- families full of tolerance and kindly, well-intentioned liberal goodwill, and theirs were the kids who went on a rampage, or went to jail, or disappeared underground, or fled to Sweden or Canada. I am thinking of the Swede’s great fall and of how he must have imagined that it was founded on some failure of his own responsibility. There is where it must begin. It doesn’t matter if he was the cause of anything. He makes himself responsible anyway. He has been doing that all his life, making himself unnaturally responsible, keeping under control not just himself but whatever else threatens to be uncontrollable, giving his all to keep his world together. Yes, the cause of the disaster has for him to be a transgression. How else would the Swede explain it to himself? It has to be a transgression, a single transgression, even if it is only he who identifies it as a transgression. The disaster that befalls him begins in a failure of his responsibility, as he imagines it.

But, perhaps as you can begin to see even in that excerpt, it is always important in this novel not to view Seymour and Merry Levov as individuals -- as people that Roth has told us we are incapable of truly knowing anyway -- but as generations, wrestling with each other for the soul of America. The Swede, in blaming himself, embodies the mindset of an aspirational generation, while Merry, in rejecting all that her father has arranged and decoded for her, embodies the mindset of a nihilistic one -- the American pastoral versus the American berserk.

Because what is it, exactly, that Merry finds so objectionable about her father, that the younger generation finds so objectionable about the older? The narrator alludes to it when, as adults, he and the Swede meet for dinner in the opening pages.

I was impressed, as the meal wore on, by how assured he seemed of everything commonplace he said, and how everything he said was suffused by his good nature. I kept waiting for him to lay bare something more than this pointed unobjectionableness, but all that rose to the surface was more surface. What he has instead of a being, I thought, is blandness -- the guy’s radiant with it. He has devised for himself an incognito, and the incognito has become him. Several times during the meal I didn’t think I was going to make it, didn’t think I’d get to dessert if he was going to keep praising his family and praising his family … until I began to wonder if it wasn’t that he was incognito but that he was mad.

And the Swede’s brother throws it in his face much later in the novel.

“No, you’re not the renegade. You’re the one who does everything right.”

“I don’t follow this. You say that like an insult.” Angrily [the Swede] says, “What the hell is wrong with doing things right?”

“Nothing. Nothing. Except that’s what your daughter has been blasting away at all her life. You don’t reveal yourself to people, Seymour. You keep yourself a secret. Nobody knows what you are. You certainly never let her know who you are. That’s what she’s been blasting away at -- that facade. All your fucking norms. Take a good look at what she did to your norms.”

“I don’t know what you want from me. You’ve always been too smart for me. Is this your response? Is this it?”

“You win the trophy. You always make the right move. You’re loved by everybody. You marry Miss New Jersey, for God’s sake. There’s thinking for you. Why did you marry her? For the appearance. Why do you do everything? For the appearance!”

The Swede is a man so swamped in the cultural ideal of his generation that nothing individual, nothing messy, nothing radical, ever swims to the surface.

There is a powerful scene early in the novel that illustrates the Swede’s need for this control, for this all-consuming normality, and the hidden frailty that secretly lives within him, the shattered self he can show no one but which is a direct result of Merry’s betrayal. He is giving a young woman named Rita a tour of his family’s glove manufacturing business, and Roth dives deep into Melvillian detail as the Swede discusses, demonstrates, and diagrams both the art and science that is glove making. It goes on for so long, and in so much obsessive detail, that I began to wonder what it all meant. It’s clearly not just an interlude. And then this.

This is the silking, that’s a story in itself, but this is what she’s going to do first. … This is called a piqué machine, it sews the finest stitch, called piqué, requires far more skill than the other stitches. … This is called a polishing machine and that is called a stretcher and you are called honey and I am called Daddy and this is called living and the other is called dying and this is called madness and this is called mourning and this is called hell, pure hell, and you have to have strong ties to be able to stick it out, this is called trying-to-go-on-as-though-nothing-has-happened and this is called wanting-to-be-dead-and-wanting-to-find-her-and-to-kill-her-and-to-save-her-from-whatever-she-is-going-through-wherever-on-earth-she-may-be-at-this-moment, this unbridled outpouring is called blotting-out-everything and it does not work, I am half insane, the shattering force of that bomb is too great. … And then they were back at his office again, waiting for Rita’s gloves to come from the finishing department, and he was repeating to her a favorite observation of his father’s, one that his father had read somewhere and always used to impress visitors, and he heard himself repeating it, word for word, as his own.

There’s a kind of sad beauty in both this device and Roth’s writing. It’s one of those rare moments in literature where something is set up and the unexpected pay-off delivers seven-fold. It really captures the emotion of the Swede’s impossible situation.

The Awfulness of Her Terrible Autonomy

That’s a phrase I circled when I encountered it on the page. Much of the novel will be consumed by the Swede’s obsession with his daughter, and by his inability to understand her actions.

Nor could he say he hated his daughter for what she had done -- if he could! If only, instead of living chaotically in the world where she wasn’t and in the world where she once was and in the world where she might now be, he could come to hate her enough not to care anything about her world, then or now. If only he could be back thinking like everybody else, once again the totally natural man instead of this riven charlatan of sincerity, an artless outer Swede and a tormented inner Swede, a visible stable Swede and a concealed beleaguered Swede, an easygoing, smiling sham Swede enshrouding the Swede buried alive. If only he could even faintly reconstitute the undivided oneness of existence that had made for his straightforward physical confidence and freedom before he became the father of an alleged murderer. If only he could be as unknowing as some people perceived him to be -- if only he could be as perfectly simple as the legend of Swede Levov concocted by the hero-worshipping kids of his day. If only he could say, “I hate this house!” and be Weequahic’s Swede Levov again. If he could say, “I hate that child! I never want to see her again!” and then go ahead, disown her, forevermore despise and reject her and the vision for which she was willing, if not to kill, then to cruelly abandon her own family, a vision having nothing whatsoever to do with “ideals” but with dishonesty, criminality, megalomania, and insanity. Blind antagonism and an infantile desire to menace -- those were her ideals. In search always of something to hate. Yes, it went way, way beyond her stuttering. That violent hatred of America was a disease unto itself. And he loved America. Loved being an American. But back then he hadn’t dared begin to explain to her why he did, for fear of unleashing the demon, insult. They lived in dread of Merry’s stuttering tongue. And by then he had no influence anyway. [His wife] Dawn had no influence. His parents had no influence. In what way was she “his” any longer if she hadn’t even been his then, certainly not his if to drive her into her frightening blitzkrieg mentality it required no more than for her own father to begin to explain why his affections happened to be for the country where he’d been born and raised. Stuttering, sputtering little bitch! Who the fuck did she think she was?

Pages and pages of this: self-abuse, shame, and torment. He loves her. He hates her. He can’t understand her.

Hate America? Why, he lived in America the way he lived inside his own skin. All the pleasures of his younger years were American pleasures, all that success and happiness had been American, and he need no longer keep his mouth shut about it just to defuse her ignorant hatred. The loneliness he would feel as a man without all his American feelings. The longing he would feel if he had to live in another country. Yes, everything that gave meaning to his accomplishments had been American. Everything he loved was here.

The voice of one generation. Struggling to understand the mind of another.

For her, being an American was loathing America, but loving America was something he could not let go of any more than he could have let go of loving his father and his mother, any more than he could have let go of his decency. How could she “hate” this country when she had no conception of this country? How could a child of his be so blind as to revile the “rotten system” that had given her own family every opportunity to succeed? To revile her “capitalist” parents as though their wealth were the product of anything other than the unstinting industry of three generations. The men of three generations, including even himself, slogging through the slime and stink of a tannery. The family that started out in a tannery, at one with, side by side with, the lowest of the low -- now to her “capitalist dogs.” There wasn’t much difference, and she knew it, between hating America and hating them. He loved the America she hated and blamed for everything that was imperfect in life and wanted violently to overturn, he loved the “bourgeois values” she hated and ridiculed and wanted to subvert, he loved the mother she hated and had all but murdered by doing what she did. Ignorant little fucking bitch! The price they had paid!

And all of it -- the Swede and Merry, the two generations they represent, the American Pastoral and the American Berserk -- Roth ruthlessly allows all of it to circle high above the reader like a desert scavenger, only and always to eventually come down to feed on that one powerful phrase. The awfulness of her terrible autonomy. We do what we want. And there is nothing, not even generations of toil and fealty to a dream, that can stop us.

The Kiss

At first, I was not going to include this, both because I didn’t think it was crucial to one’s understanding of the novel, and because I wasn’t sure I could adequately convey its subtle subversion. But as I reflect back on the novel, re-reading all the pages I’ve dog-eared and passages I’ve underlined, I’ve come to realize I do have to address it.

I found him in Deal, New Jersey, at the seaside cottage, the summer his daughter was eleven, back when she couldn’t stay out of his lap or stop calling him by cute pet names, couldn’t “resist,” as she put it, examining with the tip of her finger the close way his ears were fitted to his skull.

This is the narrator again, looking into the life of the Swede as a grown man, not “as a god or a demigod whose triumphs one could exult as a boy but his life as another assailable man.”

Wrapped in a towel, she would run through the house and out to the clothesline to fetch a dry bathing suit, shouting as she went, “Nobody look!” and several evenings she had barged into the bathroom where he was bathing and, when she saw him, cried out, “Oh, pardonnez-moi -- j’ai pensé que--” “Scram,” he told her, “get-outahere-moi.”

She, of course, is Merry, and the time is one of innocent and adoring love.

Driving along with him back from the beach one day that summer, dopily sun-drunk, lolling against his bare shoulder, she had turned up her face and, half innocently, half audaciously, precociously playing the grown-up girl, said, “Daddy, kiss me the way you k-k-kiss umumumother.”

Merry is eleven and, as described earlier, suffering with an awkward stutter.

Sun-drunk himself, voluptuously fatigued from rolling all morning with her in the heavy surf, he had looked down to see that one of the shoulder straps of her swimsuit had dropped over her arm, and there was her nipple, the hard red bee bite that was her nipple. “N-n-no,” he said -- and stunned them both. “And fix your suit,” he added feebly. Soundlessly she obeyed.

It was stunning because the Swede had made fun of her stammer, something he had never done before, something he had previously seemed incapable of doing. He immediately regrets it.

“I’m sorry, cookie--” “Oh, I deserve it,” she said, trying with all her might to hold back her tears and be his chirpingly charming pal again. “It’s the same at school. It’s the same with my friends. I get started with something and I can’t stop. I just get c-c-carried awuh-awuh-awuh-awuh--”

To deepen the sense of betrayal, Roth next gives us the following long paragraph.

It was a while since he’d seen her turn white like that or seen her face contorted like that. She fought for the word longer than, on that particular day, he could possibly bear. “Awuh-awuh--” And yet he knew better than anyone what not to do when, as Merry put it, she “started phumphing to beat the band.” He was the parent she could always rely on not to jump all over her every time she opened her mouth. “Cool it,” he would tell Dawn, “relax, lay off her,” but Dawn could not help herself. Merry began to stutter badly and Dawn’s hands were clasped at her waist and her eyes fixed on the child’s lips, eyes that said, “I know you can do it!” while saying, “I know that you can’t!” Merry’s stuttering just killed her mother, and that killed Merry. “I’m not the problem -- Mother is!” And so was the teacher the problem when she tried to spare Merry by not calling on her. So was everybody the problem when they started feeling sorry for her. And when she was fluent suddenly and free of stuttering, the problem were the compliments. She resented terribly being praised for fluency, and as soon as she was praised she lost it completely -- sometimes, Merry would say, to the point that she was afraid “I’m going to short out my whole system.” Amazing how this child could summon up the strength to joke about it -- his precious lighthearted jokester! If only it were within Dawn’s power to become a little lighthearted about it herself. But it was the Swede alone who could always manage to be close to perfect with her, though even he had all he could do not to cry out in exasperation, “If you dare the gods and are fluent, what terrible thing do you think will happen?” The exasperation never surfaced: he did not wring his hands like her mother, when she was in trouble he did not watch her lips or mouth her words with her like her mother, he did not turn her, every time she spoke, into the most important person not merely in the room but in the entire world -- he did everything he could not to make her stigma into Merry’s way of being Einstein. Instead his eyes assured her that he would do all he could to help but that when she was with him she must stutter freely if she needed to. And yet he had said to her, “N-n-no.” He had done what Dawn would rather die than do -- he had made fun of her.

“Awuh-awuh-awuh--”

There’s so much here. The competition between a mother and a daughter; the love between a daughter and a father; the struggle of a child to understand what growing up means; the struggle of a parent to keep from shaping children in an idealized image. Universals, all; and all expertly bundled together in this little vignette about a beach cottage and an adolescent stammer. There’s so much here, but there’s so much more to come.

“Oh, cookie,” he said, and at just the moment when he had understood that the summer’s mutual, seemingly harmless playacting -- the two of them nibbling at an intimacy too enjoyable to swear off and yet not in any way to be taken seriously, to be much concerned with, to be given an excessive significance, something utterly uncarnal that would fade away once the vacation was over and she was in school all day and he had returned to work, nothing that they couldn’t easily find their way back from -- just when he had come to understand that the summer romance required some readjusting all around, he lost his vaunted sense of proportion, drew her to him with one arm, and kissed her stammering mouth with the passion that she had been asking him for all month long while knowing only obscurely what she was asking for.

Yes. That. A single but singularly horrific lapse of parental judgment and betrayal.

Was he supposed to feel that way? It happened before he could think. She was only eleven. Momentarily it was frightening. This was not anything he had ever worried about for a second, this was a taboo that you didn’t even think of as a taboo, something you are prohibited from doing that felt absolutely natural not to do, you just proceeded effortlessly -- and then, however momentarily, this.

When I first read this, I really struggled with it. Despite Roth’s elaborate context and the self-tortured inner dialogue he provides the Swede, the kiss still feels out of place. It goes too far. As my college creative writing teacher would have said, he hasn’t earned it.

Never in his entire life, not as a son, a husband, a father, even as an employer, had he given way to anything so alien to the emotional rules by which he was governed, and later he wondered if this strange parental misstep was not the lapse from responsibility for which he paid for the rest of his life. The kiss bore no resemblance to anything serious, was not an imitation of anything, had never been repeated, had itself lasted five seconds … ten at most … but after the disaster, when he went obsessively searching for the origins of their suffering, it was that anomalous moment -- when she was eleven and he was thirty-six and the two of them, all stirred up by the strong sea and the hot sun, were heading happily home alone from the beach -- that he remembered.

And it does fill that niche in the story. The Swede, as we have seen, desperate both to blame himself and to find the reason for “the disaster,” for Merry’s radicalization and her bombing of the local drugstore, will obsess and obsess and obsess some more over this transgression.

Did it have to do with him? That foolish kiss? That was ten years behind them, and besides, it had been nothing, had come to nothing, did not appear to have meant anything much to her even at the time. Could something as meaningless, as commonplace, as ephemeral, as understandable, as forgivable, as innocent … No! How could he be asked again and again to take seriously things that were not serious? Yet that was the predicament that Merry had forced on him all the way back when she was blasting away at the dinner table about the immorality of their bourgeois life. How could anybody take that childish ranting seriously? He had done as well as any parent could have -- he had listened and listened when it was all he could do not to get up from dinner and walk away until she’d spewed herself out; he had nodded and agreed to as much as he could even marginally agree to, and when he opposed her -- say, about the moral efficacy of the profit motive -- always it was with restraint, with all the patient reasonableness he could muster. And this was not easy for him, given that it was the profit motive to which a child requiring tens of thousands of dollars’ worth of orthodontia, psychiatry, and speech therapy -- not to mention ballet lessons and riding lessons and tennis lessons, all of which, growing up, she at one time or another was convinced she could not survive without -- might be thought to owe if not a certain allegiance then at least a minuscule portion of gratitude. Perhaps the mistake was to have tried so hard to take seriously what was in no way serious; perhaps what he should have done, instead of listening so intently, so respectfully, to her ignorant raving was to reach over the table and whack her across the mouth.

But what would that have taught her about the profit motive -- what would it have taught her about him? Yet if he had, if, then the veiled mouth could be taken seriously. He could now berate himself, “Yes, I did it to her, I did it with my outbursts, my temper.” But it seemed as though he had done whatever had been done to her because he could not abide a temper, had not wanted one or dared to have one. He had done it by kissing her. But that couldn’t be. None of this could possibly be.

But then I began to think about the novel’s larger canvas, about how the Swede and Merry represented two generations in the American story, the Swede’s generation humble yet self-important, Merry’s angry at how easy everything seems to be. And through this lens, the kiss takes on a more metaphoric meaning. The Swede’s generation loves its children, will do anything to keep them from pain and danger, but, in removing the struggle from their lives, it removes the very thing that builds the kind of character it esteems most. They love. But they don’t parent.

The Swede’s father gets it.

“I remember when Jewish kids were home doing their homework. What happened? What the hell happened to our smart Jewish kids? If, God forbid, their parents are no longer oppressed for a while, they run where they think they can find oppression. Can’t live without it. Once Jews ran away from oppression; now they run away from no-oppression. Once they ran away from being poor; now they run away from being rich. It’s crazy. They have parents they can’t hate anymore because their parents are so good to them, so they hate America instead.”

The tragedy of American Pastoral is that the Swede, and his generation, never does.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.


Monday, July 17, 2017

Clarifying the What Not the How

I'm doing performance evaluations with my direct reports this week and, on the advice of one of those direct reports, I'm asking everyone for their feedback on my performance as well. What, essentially, should I be doing that I'm not doing to help make everyone's job easier, to remove some of the roadblocks and barriers that are holding people back?

So far, two people have zeroed in on the same thing, taken from the list of behaviors that we created to describe alignment with our core values: "Bring purpose and understanding to complex and uncertain environments."

I get it. We work in a complex environment. Some of that complexity is inherent to our organization, and to many associations. I sometimes call it the diffused nature of leadership, in which the authority for determining courses of action rests not with an individual but with a group -- the Board, a committee, a staff team. But some of that complexity is my own doing. If you've spent any time reading this blog, then you know I'm always tinkering with the processes and mechanisms that our association uses to come up with its strategy agenda and operational plans. Sometimes, I know, I can overwhelm people with new terminology and experimental ways of doing things.

So, I'm doing the best thing I can with this constructive criticism. I'm accepting it, taking it to heart, and considering how to adjust my behavior to address it.

But as I am thinking those things through, at least one essential point has occurred to me. As I figure out ways to bring more purpose and understanding to our complex strategic and operational environment, my focus must remain entirely on what we are here to do, not on how we're going to do it.

Of all the experimental iterations that I've introduced in the organization, the one that I remain most committed to is finding ways to better empower (and hold accountable) the people closest to the challenges we face to determine the methods by which we will surmount and surpass them. Our Board embraced this mindset a few years ago, and now has a culture sharply focused on determining the intended outcomes of our organization. The Board is self-policing in this regard, scrupulously keeping itself out of the weeds, and reminding itself whenever necessary that it has formally delegated the determination of the means to achieve our ends to its chief staff executive and his staff.

I'm embracing the same mindset and trying to build the same culture within our staff organization. Yes, I need to be more clear about what it is we are trying to achieve, but in my attempt to be more clear, I have to avoid directives about how we should be achieving those things.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.

Image Source
https://npengage.com/nonprofit-fundraising/from-an-event-manger-focus-on-the-end-goal-donor-engagement-retention/


Monday, July 10, 2017

Multiple Views Around the Board Table

At our recent strategic Board retreat I had an interesting experience. I was sitting at the head of the Board table, as I always do, only the person on my left was not the Board chair I had worked with throughout the past year. It was the new Board chair we had just inducted the evening before, at something we call our "Pass the Gavel" dinner.

He wasn't a total stranger. He's been on the Board for several years, the last few in various positions on our Executive Committee. If you're familiar with succession ladders, you know the drill. Secretary, Treasurer, Vice Chair, and now Chair. But the reason he had received the gavel the night before was because today's session was focused on setting our strategy agenda for the upcoming year -- the year in which he would serve the association as its Board Chair.

Like a lot of discussions at our Board table, the one leading up to this important decision was a sometimes fuzzy one, with different Board members expressing different (and sometimes divergent) opinions about what the association should be doing and how it should define our success. I'd seen our new Board Chair in situations like this before, and as before, he ably guided the discussion towards a concrete conclusion -- not something he had predetermined, but something that the table itself created, provided, of course, that it aligned with the agreed-upon strategic direction we had already settled on.

When the discussion was over and we had our outcome, we took a much needed break. Before our Board Chair could get pulled away I grabbed his attention and quickly sketched out a framework for what we had just decided around the Board table. It was something I had been quietly developing while the discussion had meandered towards its conclusion. I wasn't changing anything about that outcome; I was just putting the strategic decisions in a rudimentary operational structure that I thought we could use to guide association activities in the upcoming year. All I wanted was his initial feedback on it before fleshing it out any further.

He listened and looked at what I was sketching for him. When I was finished he simply nodded and said he had a different perspective. In two or three sentences he described an alternate framework that covered the same strategic bases but attacked them from a different angle.

I nodded in return, realizing that his frame made sense -- more sense than mine -- especially when viewed from the perspective of the Board members. And that realization triggered the interesting experience I referenced in this post's opening line.

My framework, I could now see with his framework placed next to it as a kind of foil, was built around how I would execute functions within our staff organization. It relied on Objectives, Departments, and Programs. That's the world I needed to live in if I was going to make things happen in the association. His framework, in contrast, was built around how he would lead discussions around future Board tables. It relied on Goals, Metrics, and Resources. That's the world he needed to live in if he was going to make things happen in the association.

Neither framework was wrong. Indeed, they were both right, but for different purposes. But importantly, they were not the same. Each one offered a different view of our strategy, and each, if used to guide the organization, would require a different structure for its execution. One was about the staff and the other was about the Board and those, at the end of the day, are two different things.

The interesting experience came when I understood that if my Board Chair and I were going to employ our different frameworks in our respective spheres, neither one of us would necessarily need to approve or understand the entirety of what the other was doing. To work together as an effective leadership team, we would only need to approve and understand the places where our frameworks connected -- where strategy turned into action, and where action delivered results.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.

Image Source
https://www.pinterest.com/explore/barn-board-tables/


Saturday, July 8, 2017

Lindbergh by A. Scott Berg

A biography of Charles Lindbergh, a prominent figure in American history whom, I admit, I knew very little about before picking up this tome. Sure, The Spirit of St. Louis, everyone knows that, and the Lindbergh Baby, most people know that one, too. And a sizeable portion of people could probably cite something about his activities before, during and after World War II, where his views matured too late for most, especially given his status as an American Hero.

But other than that, what did I really know about Charles A. Lindbergh? Absolutely nothing.

Here, then, are four small anecdotes from one man’s life. I’m not sure they add up to anything, but they are the pieces that are most likely to stick with me.

The Baby

Here’s the grisly scene as Berg describes it.

The officers had a badly decomposed child’s body before them, face down in the dirt. The size of the body, the shape of the skull, the still golden, curly hair all suggested the Lindbergh baby. More police were summoned to the makeshift gravesite. They carefully turned over what proved to be an incomplete corpse. Not only had the figure blackened severely, but its left leg was missing from the knee down as was the right arm below the elbow and the left hand. The body parts had probably been eaten by animals, as had most of the viscera. But the eyes, the nose, and the dimpled chin left little doubt as to the corpse’s identity. The clothes were in bad condition, but intact.

This is 72 days after the crime, after “the most widespread search ever conducted in police history.” The small body was discovered accidentally by a motorist who had pulled over to relieve himself on the side of a dark country road. The clothes would clinch it. The police had a detailed description of what the child had been wearing on the night he was abducted.

The officers returned to the corpse with Colonel Schwarzkopf [father of the Desert Storm general and head of the investigation]. Under his direction, an inspector cut and peeled off each layer of the baby’s clothes, manipulating the body with a stick. He accidentally pierced the softened skull, leaving a small hole below the right earlobe. Each article of clothing was exactly as Betty Gow [the child’s nurse] had described, down to the scalloped flannel undershirt with its blue thread. A visible skull fracture suggested a violent blow to the head had been the cause of death.

The subsequent autopsy would reach the same conclusion.

The autopsy by Dr. Charles H. Mitchell revealed no signs of strangulation or bullets. With so much decomposition to the body, there was little for him to add beyond the supposition that “the cause of death is a fractured skull due to external violence.” Because blood had been found nowhere near the crime-scene, not even on the chisel left behind, it seemed logical that when the ladder had broken, the baby had met his death smashing against the side of the house or onto the ground.

What a scene. Like something out of a Coen Brothers movie, only real, horridly real, but a tragic farce all the same. The bumbling perpetrator, blinded by his own vision of the “fabled catbird seat,” doing the unspeakable, putting disastrous events and accidents into motion, bringing nothing but grief and death for himself and others. Imagine. On the way down the ladder, from the second-story bedroom, the Lindbergh Baby clutched in one hand and a rough wooden rung in the other, the rung breaks, he falls, the baby smacks his head and dies. What does one do? What can one possibly do? Useless. Useless.

The Ladder

That may be the most grisly aspect of that colossally sad tale, but it is not nearly the most interesting.

Since the spring of 1932, the wood technologist Arthur Koehler had been analyzing the kidnap ladder. He began by completely disassembling it, numbering each rail and rung. Several types of wood -- pine, birch, fir -- went into the ladder’s construction, each with its own internal markings of rings and knots and its own external markings from the machinery that milled the raw timber into lumber and from the tools used to build the ladder. One piece of wood -- identified as “rail number 16” -- was especially interesting because it had four nail holes in it that had no connection with the making of the ladder, thus suggesting prior usage. Of low-grade sapwood, with no signs of weathering, it suggested that the rail had been previously nailed down indoors and used for rough construction, perhaps in the interior of a garage or attic.

This “wood technologist” Arthur Koehler was some kind of savant, the head of the Forest Service Laboratory of the Department of Agriculture in Madison, Wisconsin, who claimed that “lumber had specific markings as individualized as fingerprints, from which he could trace its history -- where it was grown, where it was milled, where it was sold.” He was about to prove it.

There were dozens of other clues that kept Koehler on the investigative trail. The rungs of this homemade ladder, for example, were of soft Ponderosa pine but showed no signs of wear, indicating that the ladder had been built for this particular job. The marks on those rungs from the planer that dressed the wood revealed an unusual combination of cutter heads. Koehler mailed a form letter to 1,600 lumber mills on the East Coast, asking if their lumber planers shared the same characteristics. Positive replies came from twenty-five mills, which were asked to send sample boards. From them, Koehler was able to identify the Dorn Lumber Mill in McCormick, South Carolina, as the source of the boards that became the ladder’s siderails. Twenty-five lumberyards had received shipments of Dorn’s southern pine since the fall of 1929. Through scientific deduction, Koehler whittled the list down to the National Lumber and Millwork Company in the Bronx, which had bought its shipment in December 1931, three months before the kidnapping.

Unfortunately, National Lumber and Millwork was mostly a cash business, and few records were kept as to who its customers were. But then, a search of the suspected kidnapper’s house turned up an interesting discovery.

Although the lead detective from New Jersey had been in [the] attic several times, he had not previously noticed one of the pine planks in its southwest corner was shorter than the other boards by a good eight feet. This detective suddenly recalled the wood expert, Arthur Koehler, commenting that rail 16 of the ladder had some prior use. Rail 16 was brought to the Bronx and laid across the crossbeams of the attic floor. Four holes in the rail lined up exactly with four nailholes in the floor joists.

Arthur Koehler was summoned. Although a little more than an inch of wood had been cut away between the rail and the original floor plank, the number, color, dimension, and pattern of the rings indicated to him that the one piece of wood had been cut from the other. Koehler also examined a hand plane taken from [the suspected kidnapper’s] garage, whose blade markings, he said, revealed that it had been used in making the ladder.

Astounding. Not only did the poorly constructed ladder fail on the kidnapper, breaking on his nefarious descent, causing him to lose the precious cargo of both his and the Lindberghs’ dreams, but the same ladder then seems to betray him, revealing its secrets to a wood necromancer and the world. At the trial, Koehler’s testimony was as incontrovertible as it was devastating. This reader, 80 years after the fact, was just as mesmerized by its exactitude and resistless logic.

In the end, Koehler’s testimony had been so dumbfounding in its precision that there was little for the defense to challenge. As Ford Madox Ford observed in a column for The New York Times, Koehler “was like the instrument of a blind and atrociously menacing destiny. You shuddered at the thought of what might happen to you if such a mind and such an inconceivable industry should get to work upon your own remote past -- a man who searched 1,900 factories for the traces of the scratches of your plane on a piece of wood. It was fantastic and horrifying.”

Indeed. It still is.

The War

Somewhat famously, Charles Lindbergh opposed America’s entry into the Second World War. Some thought him a German sympathizer, others an apologist for the atrocities of the Nazi regime. My jury remains out. According to what I read, he was clearly a leading figure in the America First movement, sincerely believing that it was more patriotic to keep America out of the entanglements of European Wars. He was certainly not the only one who felt that way.

But when the War came, Lindbergh was equally patriotic in his desire to fight, to offer his aid to his country in its time of need. He was, after all, a pioneer of world aviation. The Spirit of St. Louis was just the beginning. He likely knew more about the aerial combat capabilities of the different belligerent nations than anyone else on earth. But to oversimplify the situation, he had angered FDR with his pre-war rhetoric -- especially given the size of the platform he launched it from -- and he was only allowed to play a minor role in America’s aerial action against the Germans and the Japanese.

And it was in the Pacific theater that he first saw the true horrors of war.

[The Japanese stronghold of] Biak also provided Lindbergh with the most grotesque images of war he had ever seen, visions that would haunt him forever. On Monday, July 24, 1944, Lindbergh and several officers drove a jeep to the Mokmer west caves, where the enemy had waged one of its most stubborn stands. They went as far as they could up a crude military road, then walked the next few hundred feet towards the caves. Going down a hill, they came to a pass with bodies of a Japanese officer and a dozen soldiers “lying sprawled about in the gruesome positions which only mangled bodies can take.” Several weeks of weather and ants had eaten most of the flesh from the skeletons. The sight of skulls smashed to fragments prompted one officer to say, “I see that the infantry have been up to their favorite occupation,” namely, knocking out gold-filled teeth for souvenirs.

In a way, the reaction that Lindbergh describes next is understandable.

At the side of the road, they passed a bomb crater in which lay the bodies of another half-dozen Japanese soldiers, partly covered with a truckload of garbage Allied troops had dumped on top of them. “I have never felt more ashamed of my people,” Lindbergh wrote in his journal. “To kill, I understand; that is an essential part of war. Whatever method of killing your enemy is most effective is, I believe, justified. But for our people to kill by torture and to descend to throwing the bodies of our enemies into a bomb crater and dumping garbage on top of them nauseates me.”

But in another way, I think it is hideous, revealing an almost childlike morality at work in his mind, where killing people in whichever way is most effective is “justified,” simply because your nation is at war with their nation. Killing them is justified, but dumping garbage on their corpses is nauseating.

And contrast that to the very next paragraph.

On July 28, 1944, Lindbergh joined up with the 433rd Fighter Squadron, as observer in the No. 3 position of an eight-plane sweep. Their mission was to bomb and strafe “targets of opportunity” on Amboina, a small, Japanese-held island off the southwest coast of Ceram.

Well, that’s okay, right? After all, dropping bombs on people is justified because it is a very effective way of killing your enemies. Just make sure you don’t cover their dead bodies with your garbage when you’re done with them. You land that plane and make sure they get the burial your touching respect for the dignity of human life demands.

But Lindbergh saw even worse things in Germany, and these began to change his thinking on the subject.

The next day was even more phantasmagoric. Intimations of what lay ahead came at breakfast, as members of Lindbergh’s party discussed alleged savageries at Camp Dora. “That’s where the Germans had furnaces that were too small to take a whole body, so they used to cut the arms and legs off and stuff ‘em in that way,” said one man. “The prisoners were so badly starved that hundreds of them were beyond saving when the Americans came,” added another.

I’ll admit, I’m fascinated by the horrors of war -- the savage details that are so often missing from a society’s abstract exploration of the subject -- and I think I’m fascinated by them because I believe they have to be better remembered. They have to be remembered in a way they seldom are when the next call for war comes marching down the street. I offer only that as my justification for transcribing what comes next.

A short time later, Lindbergh and his party had made their way up the mountainside above the camp, off the road so that they might reach a low, factory-like building. The diameter of its brick smokestack was disproportionately large for its height. At one end of the building, he saw two dozen stretchers, soiled and bloodstained -- “one of them showing the dark red outline of a human body which had lain upon it.” Upon entering the building they saw a plain black coffin with a white cross painted on it. Beside that, covered in canvas on the concrete floor, lay what was unmistakably a human body. In a moment, Lindbergh realized exactly what kind of “factory” he had entered.

Moving into the main room of the building, Lindbergh saw two large furnaces, side by side, with steel stretchers for holding the bodies protruding through the open doors. “The fact that two furnaces were required added to the depressing mass-production horror of the place,” Lindbergh would note. The sight appalled him. “Here was a place where men and life and death had reached the lowest form of degradation,” he wrote. “How could any reward in national progress even faintly justify the establishment and operation of such a place. When the value of life and the dignity of death are removed, what is left for man?”

In these comments, I think, we can see Lindbergh’s moral understanding beginning to change, shoved, as it was, from its position of superiority by the mechanical brutality that surrounded him. But he’s not there yet. Onward.

A figure walked through the door, something between a young boy and an old man. It was a seventeen-year-old Pole, wearing a striped prison uniform, cinched at the waist but otherwise much too large for his skeleton of a body. Speaking German to Lieutenant Uellendahl, he pointed to the furnaces and said, “Twenty-five thousand in a year and a half.” Then he ushered the two Americans into the room they had first entered, and he lifted the canvas from the corpse on the floor.

“It was terrible,” the boy said, his face contorted in anguish. “Three years of it.” Pointing to the bony cadaver, he added, “He was my friend -- and he [was] fat.” As though sleepwalking, Lindbergh followed the boy outside, his mind “still dwelling on those furnaces, on that body, on the people and the system which let such things arise.” He was jerked back to reality by Uellendahl’s translating again: “Twenty-five thousand in a year and a half. And from each one there is only so much.” The boy cupped his hands together, then looked down. Lindbergh followed his gaze and realized they were standing at the edge of a pit, eight feet by six feet, and possibly six feet deep. It was filled to overflowing with ashes and bone chips. Lindbergh noticed two oblong mounds of clay nearby, evidently pits that had been capped. The boy reached down and picked up a knee joint, which he held out for Lindbergh’s inspection.

Yes. A human knee joint. How could someone not be forever affected by such an experience?

The horrors were not lost on Lindbergh. “Of course, I knew these things were going on,” he would write in his journal on June 11, 1945; “but it is one thing to have the intellectual knowledge, even to look at photographs someone else has taken, and quite another to stand on the scene yourself, seeing, hearing, feeling with your own senses.” His mind flashed back to the rotting Japanese bodies he had discovered in the Biak caves and the load of garbage he had seen dumped on dead soldiers in a bomb crater. He thought in rapid succession of stories he had heard of Americans machine-gunning prisoners on a Hollandia airstrip, of Australians pushing Japanese captives out of transport planes, of American soldiers probing the mouths of Japanese soldiers for gold-filled teeth, of pictures of Mussolini and his mistress hanging by their feet. “As far back as one can go in history,” he told himself, “these atrocities have been going on, not only in Germany with its Dachaus and its Buchenwalds and its Camp Doras, but in Russia, in the Pacific, in the riotings and lynchings at home, in the less-publicized uprisings in Central and South America, the cruelties of China, a few years ago in Spain, in pogroms of the past, the burning of witches in New England, tearing people apart on the English racks, burnings at the stake for the benefit of Christ and God.”

The rhetoric is rising. Is it all just poetry? Or will he make the difficult connection?

Lindbergh never considered that his ignoring -- or his ignorance of -- the Nazi slaughter was tantamount to condoning it. Instead, he stood ready to accept only collective blame, as an American and a member of the human race. “It seemed impossible that men -- civilized men -- could degenerate to such a level,” he wrote. “Yet they had. Here at Camp Dora in Germany; there in the coral caves of Biak. But there, it was we, Americans, who had done such things, we who claimed to stand for something different. We, who claimed that the German was defiling humanity in his treatment of the Jew, were doing the same thing in our treatment of the Jap.”

There. At last. A fully mature moral reflection. There is a universal depravity in man, certainly more fully expressed in some instances than in others, but present across all cultures and typically hidden in one’s own cultural context. The fight is not always against the other. Sometimes, and frequently most importantly, the fight is to keep from seeing the other at all.

Unfortunately, this stroke of conscience came much too late for many people of his time and, I’m afraid, for me. Even much later, when Lindbergh had become a crusader for environmental causes, people questioned his zeal.

Some, particularly Jews, found Lindbergh’s newfound passion disconcerting, especially when he flung around such phrases as, “I don’t want history to record my generation as being responsible for the extermination of any form of life.” Longtime editorial writer Max Lerner, for one, wondered, “Where the hell was he when Hitler was trying to exterminate an entire race of human beings?”

The Man

The most interesting part of this biographical journey for me was, I think, the slow and slogging realization that Lindbergh was not a great man. Many biographies put their subject on a pedestal, explaining and often excusing their subject’s human failings as some part of the mystic alchemy that our culture requires for greatness. But not here. Berg maintains an editorial distance throughout, showing Lindbergh, as much as possible, as the man he was.

And, in that respect, there are two episodes from late in Lindbergh’s life that are worth remembering.

In October 1965, Lindbergh invited each of his children and their spouses to join him for several weeks camping in southern Kenya. He and Anne offered to cover most of the costs. Lindbergh flew ahead, on the new weekly Pan American flight from New York to Nairobi, arriving on December eleventh. Over the next few weeks, Jon and his wife, Barbara, left their five children behind on Bainbridge Island, Washington, where they had settled; Anne [Lindbergh’s daughter] and Julien Feydy flew down from Paris with Scott, who had transferred to Cambridge University; and Anne [Lindbergh’s wife] arrived with Reeve, a student at Radcliffe. Only Land -- with his wife and two children on their four-thousand-acre ranch on the Blackfoot River in Montana -- politely declined the offer, anticipating several strained weeks marching to the relentless beat of his father’s drum. “I’m not going,” he told his wife, “--too many people and too tight a schedule.”

Lindbergh, evidently, was a difficult man to be around, even in the opinion of his son. And, as the next episode reveals, also in the opinion of his wife.

Charles assured Anne that she would come to care for Hawaii once Argonauta was completed and that he intended to spend more time with his wife there. He misled her on both counts. It rained steadily the first week in January 1971, when they returned to Hawaii to move into their newly completed house; and they quickly discovered the roof leaked. Worse than that, despite Charles's admonitions, the architect and contractor had failed to create proper drainage for the house. A torrential downpour awakened them their first night; and muddy streams, just as Charles had foretold, sluiced through the house. They spent the next few hours out in the storm, he digging channels with a bucket while she built a mud dam. The house had not even dried out when they were invaded by armies of ants, spiders, cockroaches, lizards, rats, even a mongoose. And then Lindbergh was summoned to an emergency meeting of the Pan American board in New York.

Anne was, as she scratched in her diary, “furious to be left at this point in this place in this state. A place which is not of my choosing. I do not have friends, family, or interests here. It is not a place I would normally choose to live in alone. I only came for him -- because he loves it & said he expected to be here with me. I am angry not only at him but at myself for hoping that he would at least stay here.” Argonauta did not even have a telephone, and the nearest people were ten minutes away through the mud. Propane gas motors generated electricity -- one for lights, the other for appliances; but, she wrote Lucia Valentine, she would gladly trade her few modern conveniences for a little company. What she found most discouraging was “the pattern of being left” and -- after all her years of weeping to her therapist and wailing in her diary -- her own inability to walk away from such unacceptable behavior.

As Anne alludes above, this was not the first time Lindbergh had acted with such willful disregard for her and her feelings. It was simply the latest example in what was by then more than forty years of marriage.

When they were together, he expected her attention to be focused on him, his self-absorption reaching comical proportions. He sometimes forbade her to pick up the telephone when it rang; and if he found her spending too much time gabbing to friends, he sometimes grabbed his gun from the closet and threatened to shoot the phone. When Anne replaced some seventy-five-year-old mattresses in the guestroom with a new set -- bought on sale at Bloomingdales -- it sparked a sermon on her contributing to the fall of civilization. He became obsessed with the general breakdown of law and order and the upsurge in anarchy, and he often groused about “what’s happening to the country.”

He was, I came to understand, in many respects a cantankerous old man, his mental faculties fading and regressing closer and closer to the baseness of his own self-centered personality.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.



Monday, July 3, 2017

Sharing Results from the Board Meeting

Those of you who are chief staff executives of associations, I have a question for you. Each time you come back to the office after one of your Board meetings, what do you tell your staff about what happened at the Board table?

Let me guess. Only what they need to know.

I used to feel the same way. At my Board meetings, there are typically so many different kinds of conversations -- not so many different conversations, so many different kinds of conversations -- that it's often hard to know where to begin.

There are, of course, the action items. Those are the easy ones. The Board voted on this or that issue and here is how the vote came out. But beyond that there are all the different kinds of conversations that swirl around or amidst the action items. I look at these as a kind of running commentary -- on programs, on performance, on strategic intent -- all three of them frequently blurring together into a continuous stream of ideas, suggestions, and direction.

How many of those do you share?

As it turns out in my case, very few. We have for some time now been working with our Board to more clearly define the line between governance and management. The Board handles governance -- which we have come to understand as defining the expected outcomes, or ends, of the organization. And I, as the chief staff executive, handle management -- which we have come to understand as defining the methods, or means, for how those outcomes are going to be achieved.

Board-level discussion then, when placed on this footing, is focused almost entirely on defining the right outcomes and determining if those outcomes are being met. What is our mission and is that mission being fulfilled? If we aren't clear on the mission, we need to get clear. If we are clear on the mission, but we aren't achieving it, we need to find new ways of pursuing it.

In order to help the Board stay in this zone, it's important to have the right structures in place for their review of organizational success. Mission and purpose aren't the kinds of things that should be changing every time the Board gets together, so the focus inevitably turns to how we are measuring success and what those metrics are telling us about the capabilities of the organization. That can sometimes be messy territory, but the end result should always be fairly clear. We're either measuring the right thing or we're not. We're either succeeding against that metric or we're not. Action items, once placed in this frame, almost always steer clear of programmatic micromanaging and stay focused on building the resources the organization needs to do its job.

And that, I've discovered, makes reporting Board meeting outcomes to the larger staff a much easier task. There isn't a long list of disparate programmatic directives. There are really just two topics of conversation. Here's where we're doing well and here's where we have to do better. The trick is no longer figuring out what to tell people. The challenge, more frequently, is coming up with new solutions to old problems.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.

Image Source
http://www.wisegeek.org/what-is-a-staff-meeting.htm

Monday, June 26, 2017

Build Before You Change

As I write this I am on the plane out to my association's annual strategic board retreat. For those who wonder when I find the time to write my blog posts, airplane rides are great for my output.

I just finished re-reviewing the report I've prepared and will give to the board. Our annual retreat comes right at the end of our fiscal year, which gives us the ideal opportunity to both look back at the year just ending and to look forward at the year just beginning. And my report attempts to do exactly that. To look back on the successes of the year just ending and to look forward at the metrics and goals that will help determine our success in the year just beginning.

My summary comment on the whole package goes something like this: We've had a very successful year. Member participation and engagement is up, pretty much across the board, and our outreach networks to stakeholders outside our association have also grown substantially. As we look ahead to next year, a key priority will be exerting better leverage on those networks for the outcomes we seek.

That's especially true, I think, when we look at our efforts to see more of the technology our association represents being taught in our nation's universities and technical schools. One key focus area for us has been in building better stakeholder networks in these areas -- essentially engaging with the instructors and administrators in these institutions who would be in a position to actually do and facilitate this teaching.

We've offered a number of research and curriculum grants through our charitable foundation to many of these individuals, and the beneficial outcome of those activities extends beyond the creation of new curriculum pieces focused on our technology. We now have a substantial body of university professors and technical school instructors who are familiar with our association and interested in working collaboratively with us.

And only now that this network has been built do I feel that the time is right to try and leverage it for the wholesale change that we seek. In retrospect, building the connections took a great deal of time, but it was time that was necessary. Without the right partners, there is little chance that we would be able to create the kind of change we feel we need.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.

Image Source
http://www.sunny923.com/2016/01/06/investing-in-legos-is-smarter-than-putting-money-in-stocks-or-your-401k/



Saturday, June 24, 2017

Doubt by Jennifer Michael Hecht

A book I enjoyed a lot less than I thought I would. It’s called “a history” by its author, as in “a history of doubt,” and indeed it is as it chronicles the principles and personalities of great doubters (i.e., those who doubt the existence of God or gods) through 2,600 years of history. An ambitious and worthy subject, but at times it felt like I was reading an encyclopedia.

Here are the random bits that seemed to jump out at me. Given the subject matter of the book, you can be sure that they take an unapologetic freethinking bent.

Christian Saints = Pagan Nature Gods

By the sixth century, Christians in the West had won over the cities, but the countryside was still a place of almost endless supernatural energies, and even city dwellers saw the natural world in this spirited way. The great God of the Christians was too far away for farmers, and the Son may have been human but he was not available for watering the fields or fending off locusts. In parts of Spain the practice of leaving little piles of votive candles near springs, in trees, and on hilltops and crossroads was still so rampant that as late as the 690s dramatic Church ceremonies were staged to transfer the candles to the local churches and announce that idolatry was finally dead. What actually worked was not sermons against the enchanted natural world, but rather the reenchantment of the world in Christian terms. Gregory of Tours (538-594) was most responsible for the reinterpretation of the Christian saints as capable of helping average people in their relationship with the natural world; through them springs and crossroads once again became sanctioned places for worship. The saints brought healing, mercy, and fertility to the small places of field and hearth, and brought safety on byroads and high seas. In myriad ways, water was holy again, and trees might spring up on the graves of saints.

Never thought of saints this way before, but it makes total sense. Similar to the way the early Christian church adopted pagan holidays as its own.

Begging the Question

After my long post on The Mind and the Brain, where I accuse the author of constantly begging the question in a similar way, this one really resonated with me.

[John] Locke did not agree with [Rene] Descartes, because Locke noticed that “I think, therefore I am” is a bit of a leap (as the Buddha might have happily pointed out); that “I think, therefore thinking happens” is pretty much all you can get.

Touché.

More Christian Tormentors Than Christian Martyrs

There were not that many martyrs anyway, wrote [Edward] Gibbon, announcing “a melancholy truth which obtrudes itself on the reluctant mind,” that “even admitting” all the Christian martyrdom history has recorded, “or devotion has feigned … it must still be acknowledged that the Christians, in the course of their intestine dissensions, have inflicted far greater severities on each other than they had experienced from the zeal of infidels.” The number of Protestants “executed in a single province and a single reign far exceeded that of the primitive martyrs in the space of three centuries and of the Roman empire.”

An excellent point.

Jesus: Jefferson’s Philosopher, not Savior

Thomas Jefferson, author of the Declaration of Independence and third President of the United States, in a letter to his friend, William Short:

“That Jesus did not mean to impose himself on mankind as the son of God, physically speaking, I have been convinced by the writings of men more learned than myself in that lore. But that he might conscientiously believe himself inspired from above, is very possible. The whole religion of the Jews, inculcated on him from his infancy, was founded in the belief of divine inspiration … he might readily mistake the coruscations of his own fine genius for inspirations of an higher order. This belief carried, therefore, no more personal imputation, than the belief of Socrates, that himself was under the care and admonitions of a guardian Daemon. And how many of our wisest men still believe in the reality of these inspirations, while perfectly sane on all other subjects.”

So fixated, it seemed, was Jefferson on separating the philosopher Jesus from the mythical one that he famously edited his own Bible, taking out the supernatural mumbo-jumbo that seems to permeate the Gospels. He also penned this delightful quote:

“But the greatest of all the reformers of the depraved religion of his own country was Jesus of Nazareth. Abstracting what is really his from the rubbish in which it is buried, easily distinguished by its luster from the dross of his biographers, and as separable from that as the diamond from the dunghill.”

Schopenhauer: Jefferson’s Disciple

In the following passage, philosopher Arthur Schopenhauer seems to be taking a page straight out of Jefferson's notebook.

He wrote that believers convince themselves their religion’s myths are somehow connected to its ethical code and thus “regard every attack on the myth as an attack on right and virtue.”

Here, Schopenhauer’s myth is Jefferson’s dunghill, and Schopenhauer’s ethical code is Jefferson’s diamond. But the German takes the idea one step further.

Almost comically, “this reaches such lengths that, in monotheistic nations, atheism or godlessness has become the synonym for absence of all morality.”

To those who equate myth with morality, the rejection of one must therefore entail the rejection of the other.

A Fundamental Misunderstanding

Finally...

[Charles] Bradlaugh wrote that “the Atheist does not say ‘There is no God,’” but says: “‘I know not what you mean by God; I am without idea of God; the word God is to me a sound conveying no clear or distinct affirmation. I do not deny God, because I cannot deny that of which I have no conception’ especially when even those who believe in the things cannot even define it.”

This seems to me one of the fundamental misunderstandings that exist between believers and non-believers. One cannot deny that which one does not understand.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.





Monday, June 19, 2017

It's Okay to Ignore People

My phone doesn't ring as much as it used to. Fifteen years ago, it seemed, my phone rang all the time. Sometimes it was a member of my association looking for help, sometimes it was an unsolicited salesperson, and sometimes it was someone looking for a piece of information that only I or my association could provide.

And, as someone interested in maintaining a professional reputation, I tried to respond to as many of these calls as I could. The members, of course, would get my immediate and prompt attention. The unsolicited salespeople would be politely asked to stop calling if I wasn't in the market or otherwise interested in their services. And I would do whatever I could, within the policies and procedures of my association, to help the people looking for information.

Today, as I said, my phone doesn't ring as much as it used to. The phone is not as popular as it used to be, and I'm in a different position. It's probably harder for outsiders to get my number and get to me. But the calls that do get through still fall into the same three categories.

And today, the only people who get my attention are the members. Both the unsolicited salespeople and the strangers looking for information are ignored.

And that's okay.

Frankly, it took me some time to come to that conclusion. The first to get the cold shoulder were the unsolicited salespeople, and I actually felt guilty about that for a few years. They've got a tough job after all -- calling strangers on the telephone and asking them to buy something they probably don't want. But they created so many interruptions for me -- needless interruptions -- that I eventually found peace with the decision to ignore them.

And shortly thereafter, I realized that the strangers seeking information were creating exactly the same kind of interruptions for me.

Hi, you don't know me, but I'm doing a study on the industry your association represents, and I was wondering if I could get a few minutes of your time?

Hello, I work for a venture capital firm and we're thinking about buying one of the companies in your industry, and I need whatever information you have on the size of the product market this company plays in.

Good afternoon, I'm an engineer and I've invented a new product that's going to revolutionize the industry your association represents, and I want you to put me in touch with the companies most likely to license this technology.

One day, after getting one of these calls, I had a kind of epiphany. Nine times out of ten, the kind of information I was being asked to provide was tightly connected to the value proposition that we had created for our members, and for which they paid substantial amounts of money in the form of membership dues. In other words, I worked for a trade association, not a public help line. The information I had access to was not only valuable, it belonged to my members, not to any stranger who had found our phone number on our website.

So I started ignoring the people making these calls as well. And, unlike the case of the unsolicited salespeople, I didn't feel guilty at all.

Inspired by this.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.

Image Source
http://sugarsystems.com/2017/01/3-reasons-why-people-ignore-your-linkedin-requests/




Monday, June 12, 2017

Don't Be Misled By the Concentric Circles of Diversity

A few weeks ago I mentioned that Spark Consulting was out with another white paper -- this one on the sometimes challenging topic of diversity and inclusion -- and that it was another thought-provoking read for association CEOs. If you're interested, you can download "Include Is A Verb: Moving From Talk to Action on Diversity and Inclusion" here. It's free and you don't even have to register for it.

I also said that, for me, there were several key concepts. Here's another one.

Look at the picture accompanying this blog post. It's from the white paper, and it leads off the section in which the authors provide some helpful advice on starting your own diversity and inclusion initiative at your association. They, quite correctly, I think, advise that you start first and foremost with yourself.

The first step is to undertake the work individuals must do on themselves.

Start in the center of the target with yourself and then, as implied by the picture, begin working your way out in concentric rings, working next to reform your workplace, then your volunteer leaders, then your membership, and finally, if you have the courage, the very profession or industry your association represents.

To be fair to the authors, they admit in the text of the white paper that things are not really this linear. That, for example, diversity in the industry your association represents is obviously a prerequisite for diversity in your association's membership, and that diversity in your association's membership is just as obviously a prerequisite for diversity in your association's leadership. In this regard, diversity in the outer three concentric rings shown in the picture moves from the outside in, not the inside out.

In my previous post I shared some of the leadership discussions and diversity initiatives that I participated in when I was Board chair of the Wisconsin Society of Association Executives. Well, it was this realization about the white paper's outer three concentric rings--and the recognition of how difficult changing the diversity of the profession we represented would be--that was one of the primary factors that led us down the "dimensions of diversity" path I described. Rather than determining what the diversity of the association management profession in Wisconsin should be, we decided instead to better understand what the diversity of that profession was, and then work proactively to ensure that that diversity was reflected in our association's membership and leadership.

That's one problem I have with the picture. Once you're told you're supposed to start in the center, you assume you have to keep moving outward through the rings. You don't.

Here's the other problem I have with it. The diversity of your association workplace and the diversity of your association leadership have little or no connection at all.

Unless you work for one of the few associations of association professionals, or for an even rarer association entirely staffed by the same people who work in the industry or profession the association represents, then, by definition, the profession of the people who work for the association and the profession of the people who belong to the association are two different professions. And two different professions likely have two different dimensions of diversity. What's important in one may not be important in the other, and therefore, fixing one may have no impact on fixing the other.

It might actually be better for the white paper to show two targets instead of one. The first with yourself in the middle, working outward to change the diversity of your workplace, and the second with your association's industry or profession in the middle, again working outward to change the diversity of your association's membership, and then its leadership. That way, not only do you start from the right premise, you've correctly split the task before you into its two basic strategies.

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.



Saturday, June 10, 2017

The Mind and the Brain by Jeffrey M. Schwartz, M.D., and Sharon Begley

Some time ago I read a book called The Brain That Changes Itself, written by Norman Doidge. Its central thesis is that the brain exhibits something called “plasticity”—that it can be rewired and retrained throughout life, a notion that runs contrary to a hundred years of brain theory, but which is gaining more and more acceptance. Each chapter in Doidge’s book presents a case study of someone who consciously or unconsciously used the plasticity of their brain to change fundamental behaviors or regain functionality medical science predicted was impossible.

My own reaction to Doidge’s book was that its subject was fascinating stuff. Doidge makes a compelling argument that the brain is not as we once believed it to be. But a larger—and far more fascinating—question seemed to loom unanswered in all of Doidge’s case studies. What, specifically, is doing the rewiring? Is there an entity, separate from the brain, such as the “mind” or the “soul,” that can exert top-down control over this process? Or is plasticity an inherent property of the brain the way wetness is an inherent property of water? Is there, or isn’t there, a “ghost in the machine?”

Well, Jeffrey Schwartz in The Mind and the Brain tackles that question head-on, and comes down decidedly on the side that the mind does exist as something distinct from the brain. And not only that, the mind can change the brain through “its” conscious will, and that it manifests itself through quantum phenomena in our heads.

I’m not sure I buy that, but let’s unpack it.

1. The Mind Does Exist As Something Distinct from the Brain

From Schwartz’s introduction, page 11:

Through the centuries, the idea of mindfulness has appeared, under various names, in other branches of philosophy. Adam Smith, one of the leading philosophers of the eighteenth-century Scottish Enlightenment, developed the idea of the “impartial and well-informed spectator.” This is “the man within,” Smith wrote in 1759 in The Theory of Moral Sentiments, an observing power we all have access to, which allows us to observe our internal feelings as if from without. This distancing allows us to witness our actions, thoughts, and emotions not as an involved participant but as a disinterested observer. In Smith’s words:

“When I endeavor to examine my own conduct … I divide myself as it were into two persons; and that I, the examiner and judge, represent a different character from the other I, the person whose conduct is examined into and judged of. The first is the spectator. … The second is the agent, the person whom I properly call myself, and of whose conduct, under the character of a spectator, I was endeavoring to form some opinion.”

It was in this way, Smith concluded, that “we suppose ourselves the spectators of our own behaviour.” The change in perspective accomplished by the impartial spectator is far from easy, however: Smith clearly recognized the “fatiguing exertions” it required.

Before reading any further, I scribbled in the margin, “Will he make the argument that this ‘spectator’ is the mind, and what it observes is the brain?” And Schwartz goes on to do exactly that farther down the same page and through the rest of the book. He is approaching the question as a clinician, trying to find a solution to the obsessive-compulsive disorders (OCDs) of his patients, and in this idea of the observing mind and the acting brain, he thinks he has found -- and later demonstrates -- an effective therapy.

The obsessions that besiege the patient seemed quite clearly to be caused by pathological, mechanical brain processes -- mechanical in the sense that we can, with reasonable confidence, trace their origins and the brain pathways involved in their transmission. OCD’s clear and discrete presentation of symptoms, and reasonably well-understood pathophysiology, suggested that the brain side of the equation could, with enough effort, be nailed down.

As for the mind side, although the cardinal symptom of obsessive-compulsive disorder is the persistent, exhausting intrusion of an unwanted thought and an unwanted urge to act on that thought, the disease is also marked by something else: what is known as an ego-dystonic character.

I’ll stop there simply to note the word choice. Scarcely a paragraph after introducing Smith’s idea of “the impartial and well-informed spectator,” Schwartz has simply adopted an assumed linguistic paradigm of “the brain side” and “the mind side” of his equation. Stating it evidently makes it so.

Now, I haven’t read Adam Smith, and I’m certainly not a clinical psychiatrist, but the problem I detect is that Schwartz is begging the question. The effectiveness of the therapy he builds on this premise -- and it does prove to be effective -- is a red herring.

When someone with the disease experiences a typical OCD thought, some part of his mind knows quite clearly that his hands are not really dirty, for instance, or that the door is not really unlocked (especially since he has gone back and checked it four times already). Some part of his mind (even if, in serious cases, it is only a small part) is standing outside and apart from the OCD symptoms, observing and reflecting insightfully on their sheer bizarreness.

Just because there are two “parts” of the brain, each chemically encoded with a pair of discordant thoughts, and just because you name one of those parts the “mind” and the other part the “brain,” that does not then mean that the “brain” and the “mind” exist as separate phenomena, with the dividing line of physicality (or quantum superposition) running between them. You have stated that the mind and the brain are different, but you have not proven it.

A. The Language Defines, or Dismisses, the Argument

I’ve complained about this before, but Schwartz’s book often took my frustration to new heights. We frankly need a new vocabulary to talk substantively about these issues. Until that day arrives, unfortunately, even bona fide experts in the field, like Schwartz, will be hobbled by inaccurate and misleading turns of phrase like the dozens that appear in his text.

Some simply demonstrate just how confusing the subject is, with terms that should have clear and mutually exclusive definitions being used interchangeably. The following three sentences appear on the same spread of pages.

The discovery that neuroplasticity can be induced in people who have suffered a stroke demonstrated, more than any other finding, the clinical power of a brain that can rewire itself.

Oops. Watch your language there. Aren’t you the one arguing that it is the “mind” that changes the “brain,” not the “brain” that changes itself?

Stapp’s youthful pursuit of the foundations of quantum physics evolved, in his later years, into an exploration of the mind’s physical power to shape the brain.

That’s better. That’s what you mean to say, isn’t it?

Individuals choose what they will attend to, ignoring all other stimuli in order to focus on one conversation, one string of printed characters, or, in Buddhist mindfulness meditation, one breath in and one breath out.

Wait. I thought we were talking about minds and brains. Now you’re calling something an “individual?” Is that the “mind,” or the “brain,” or some third thing you haven’t yet defined?

These are all more or less innocent slips, not intended to frame the argument in one direction or another. But, all too frequently, I found Schwartz to be using language that is outright dismissive of any other paradigm than the one he favors. To see what I mean by that, let’s dive a little deeper into a very deep subject.

B. Moral Philosophy 101

I’m tired of reading paragraphs like this.

One important answer is that the materialist-determinist model of the brain has profound implications for notions like moral responsibility and personal freedom. The interpretation of mind that dominates neuroscience is inimical to both. For if we truly believe, when the day is done, that our mind and all that term entails -- the choices we make, the reactions we have, the emotions we feel -- are nothing but the expression of a machine governed by the rules of classical physics and chemistry, and that our behavior follows ineluctably from the workings of our neurons, then we’re forced to conclude that the subjective sense of freedom is a “user illusion.” Our sense that we are free to make moral decisions is a cruel joke, and society’s insistence that individuals (with exceptions for the very young and the mentally ill) be held responsible for their actions is no more firmly rooted in reason than a sand castle is rooted in the beach.

There are so many things wrong with this line of thinking that I marvel that grown, educated adults actually subscribe to it.

First, as I began to describe above, you’re begging the question (again) with phrases like “nothing but.” For if we truly believe … that our mind and all that term entails … are nothing but the expression of a machine governed by the rules of classical physics and chemistry. Well, what else besides physics and chemistry would they be? Elven magic? “Nothing but” are words by which you smuggle your conclusion into the premise, and which therefore bias the very phraseology of your discussion in its favor. Drop them, and words like them. For if we truly believe … that our mind and all that term entails … are the expression of a machine governed by the rules of classical physics and chemistry … then we’re forced to conclude … What? What are we forced to conclude? Not sure, but at least we’re no longer presuming something not in evidence.

Second, you’re using the wrong grammar. If the mind is an expression of a machine governed by physical laws, then there is no “we” to be further speaking of, at least not the kind of “we” encoded in the syntax you persist in using. In the universe you have postulated, “we” haven’t lost anything, because “we” don’t exist. Choices are made, reactions are had, and emotions are felt, but “we” aren’t making, having or feeling them. The thinking is flawed if for no other reason than because the predicate does not match the subject.

Third, looking ahead to concepts that will be argued later in the book, why should we believe that matter governed by the rules of classical physics is devoid of moral content, yet matter governed by those of quantum physics is somehow imbued with it? When the “particle” goes through the slit and we see the resulting wave pattern on the screen, we may not understand how it is possible, but is the particle in question not following the same kind of universal laws as the planet classically orbiting the sun? Physical laws, regardless of how counter-intuitive they may seem, are still physical laws. And quantum brains could be just as determined as classical ones.

And fourth, even in the world you’re describing, moral action and personal responsibility still exist. Shouldn’t a malfunctioning machine be fixed? Even if it is only harming other machines? To use an emerging example, imagine a world in which we have driverless trucks transporting goods all over the country for us. If one of those driverless trucks had a flaw in its programming that forced it to run other driverless trucks off the road, who would argue that something shouldn’t be done about it? That driverless truck should be reprogrammed, or decommissioned if reprogramming proved ineffective. Now, I get that some may still argue that the only reason to fix the driverless truck is because by derailing the shipments of goods, it is by extension harming the moral agents (i.e., people) for whom those goods are intended. To that I say maybe. But it seems to me that even in a world where those people have no agency -- where, in other words, choices are made, reactions are had, and emotions are felt, not by an observing mind, but by an acting brain -- then there remains a kind of utility in fixing broken machines that creates a moral imperative.

2. The Mind Can Change the Brain Through “Its” Conscious Will

I understand why we all struggle with these ideas, and why some, like Schwartz, want to find a way out of what most will perceive as a “chicken and egg” dilemma. If the actions of the brain come from the physics of its electrochemical activity, then it seems wrong to also say that the electrochemical activity of the brain comes from the actions of the brain. It feels like it has to be one or the other, not both. Either the brain (or the “mind” or the “individual” or the “soul”) directs its electrochemical activity to create specific actions within itself and its body -- a condition authors like Schwartz seem desperate to prove -- or the naturally occurring electrochemical activity of the brain creates the sensation of consciousness that accompanies so many of the brain’s specific actions -- a condition that seems to be anathema to the same authors.

But there is actually a fair amount of reasonable conjecture about that second possibility. In evolution circles, it is called epiphenomenalism.

A. Epiphenomenalism

Epiphenomenalism acknowledges that mind is a real phenomenon but holds that it cannot have any effect on the physical world. This school acknowledges that mind and matter are two separate beasts, as are physical events and mental events, but only in the sense that qualia and consciousness are not strictly reducible to neuronal events, any more than the properties of water are reducible to the chemical characteristics of oxygen and hydrogen. From this perspective, consciousness is an epiphenomenon of neuronal processes.

This is from a fairly helpful section of Schwartz’s book where he briefly describes six different “philosophies of mind and matter,” listing them from what he terms the most to the least materialistic. They are:

Functionalism
Epiphenomenalism
Emergent Materialism
Agnostic Physicalism
Process Philosophy
Dualistic Interactionism

Never mind what they all mean; I only showed them to illustrate how far to the materialistic side my sympathies lie. Except, when it comes to epiphenomenalism, I think Schwartz gets it slightly wrong -- or at least he doesn’t quite capture in his description what I think is going on.

Epiphenomenalism views the brain as the cause of all aspects of the mind, but because it holds that the physical world is causally closed -- that is, that physical events can only have physical causes -- it holds that the mind itself doesn’t actually cause anything to happen that the brain hasn’t already taken care of. It thus leaves us with a rather withered sort of mind, one in which consciousness is, at least in scientific terms, reduced to an impotent shadow of its former self.

He keeps begging the question, doesn’t he? Why on earth would anyone believe in something that “withers” the mind, that “reduces” it to an “impotent shadow” of its former self?

As a nonphysical phenomenon, it cannot act on the physical world. It cannot make stuff happen. It cannot, say, make an arm move. Epiphenomenalism holds that the brain is the cause of all the mental events in the mind but that the mind itself is not the cause of anything. Because it maintains that the causal arrow points in only one direction, from material to mental, this school denies the causal efficacy of mental states.

This causal arrow idea is key to Schwartz’s entire thesis, but instead of questioning that, let’s take a look at what Schwartz says about evolutionary biologists and their definition of epiphenomenalism.

The basic principles of evolutionary biology would seem to dictate that any natural phenomenon as prominent in our lives as our experience of consciousness must necessarily have some discernible and quantifiable effect in order for it to exist, and to persist, in nature at all. It must, in other words, confer some selective advantage.

Sigh. More begging the question. If philosophical epiphenomenalism is true, then there is no external observer that can determine that the “experience of consciousness” is “prominent in our lives.” By choosing that phraseology, you are biasing the argument against the possibility that epiphenomenalism is true. Again, the better phraseology comes from just deleting the complicating words: The basic principles of evolutionary biology would seem to dictate that any natural phenomenon must necessarily have some discernible and quantifiable effect in order for it to exist, and to persist, in nature at all.

True enough. But as Schwartz well knows, there is a problem. Not every natural phenomenon is specifically selected for. Some are along for the ride with others that are.

True, evolutionary biologists can trot out many examples of traits that have been carried along on the river of evolution although not specifically selected for (the evolutionary biologists Stephen Jay Gould and Richard Lewontin called such traits spandrels, the architectural term for the elements between the exterior curve of an arch and the right angle of the walls around it, which were not intentionally built but were instead formed by two architectural traits that were “selected for”). But consciousness seems like an awfully prominent trait not to have been the target of some selection pressure.

There we go again. Consciousness “seems like an awfully prominent trait.” It does? To whom? To the observing mind you haven’t yet proven is anything other than the product of the acting brain? There is a deep reluctance on the part of many to consider -- or, as I hope I have shown, to even frame a fair argument for -- the possibility that consciousness is epiphenomenal. Part of this reluctance comes, I suppose, from the very nature of consciousness itself. The ghost in the machine would have a hard time, after all, admitting that it was, in fact, a ghost. If our consciousness were epiphenomenal, then it’s likely that “we” wouldn’t be able to tell that it was. Almost by definition. As Schwartz quotes one of his colleagues:

Epiphenomenalism is a possible thesis, but it is absolutely incredible, and if we seriously accepted it, it would make a change in our world view more radical than any previous change, including the Copernican Revolution, Einsteinian relativity theory and quantum mechanics.

To which I say, yes, exactly.

B. Conscious Thoughts Without Conscious Intent

Experiments have shown repeatedly that the brain activity responsible for conscious movements begins before conscious awareness. The most famous of these experiments may have been performed by neurologist Benjamin Libet in the 1980s. Libet will also show up later in our discussion, but we needn’t even go there to demonstrate the point. Evidence that conscious brain activity occurs through actions other than the free exercise of consciousness can be pulled from Schwartz’s own work on the study of obsessive-compulsive disorders.

Someone with obsessive-compulsive disorder derives no joy from the actions she takes. This puts OCD in marked contrast to, for instance, compulsive gambling or compulsive shopping. Although both compulsive shoppers and compulsive gamblers lack the impulse control to resist another trip to the mall or another game of video poker, at least they find the irresistible activity, well, kind of fun. An OCD patient, in contrast, dreads the arrival of the obsessive thought and is ashamed and embarrassed by the compulsive behavior. She carries out behaviors whose grip she is desperate to escape, either because she hopes that doing so will prevent some imagined horror, or because resisting the impulse leaves her mind unbearably ridden with anxiety and tortured by insistent, intrusive urges. Since the obsessions cannot be silenced, the compulsions cannot be resisted. The sufferer feels like a marionette at the end of a string, manipulated and jerked around by a cruel puppeteer -- her own brain.

Wow. Seems like OCD is itself proof that conscious thoughts can occur without conscious intent. What makes it a disorder is perhaps not the fact that the “conscious” mind cannot control what appear to be the determined actions of the brain, but the fact that the epiphenomenal consciousness finds itself out of sync with those same determined actions. Normally, after all, the consciousness believes it has willed the determined actions of the brain to occur.

And Schwartz’s short description of selective serotonin reuptake inhibitors (SSRIs) as a potential pharmacological therapy for OCD leaves, to my way of thinking, a gaping question that needs further exploration. These SSRIs (Prozac, Paxil, Zoloft, Luvox, and Celexa) all block “the molecular pump that moves serotonin back into the neurons from which it was released, thus allowing more of the chemical to remain in the synapse.” It makes me wonder if serotonin isn’t a kind of “consciousness chemical,” since the more of it you have in your synapses, the more aligned the actions of your determined brain seem to be with your conscious will. Too little serotonin in your synapses and the resulting OCD behaviors reveal how disconnected those two phenomena can really be.

C. Telling People They Are More Than Their Gray Matter Doesn’t Make It So

But that’s not how Schwartz approached the problem and not the kind of therapy that he developed as a result. He goes into some detail about the anatomy and biochemistry of the brain that appear responsible for OCD, and I have no reason to question any of it. He describes it as a kind of overactive “worry circuit” in the brain, something that fires to help alert the organism that something is amiss in its environment. In the case of OCD, the circuit fires even when things are not amiss, causing the sufferer to check and double-check to make sure the things not out of order are, in fact, in order.

That’s all good. What troubled me was that his therapeutic approach depended on the belief that the patient existed as something separate from the functioning of her brain.

I began showing patients in the treatment group their PET scans, to drive home the point that an imbalance in their brains was causing their obsessive thoughts and compulsive behaviors. Initially, some were dismayed that their brain was abnormal. But generally it dawned on them, especially with therapy, that they are more than their gray matter. When one patient … exclaimed, “It’s not me; it’s my OCD!” a light went off in my head: what if I could convince patients that the way they responded to the thoughts of OCD could actually change their brains?

This makes absolutely no sense to me. Saying “It’s not me; it’s my OCD!” is akin to saying “It’s not me; it’s the way my brain is functioning!” They are, in essence, the same thing, as troubling as that thought may be to someone who believes that they exist in some way separate from their unique brain function.

Which, of course, many people do, and which, paradoxically, allows Schwartz’s cognitive therapy for OCD to be efficacious. Among several other techniques, Schwartz developed practices he calls Relabeling and Reattributing, where the OCD patient makes conscious the separation between herself and her malfunctioning brain.

Accentuating Relabeling by Reattributing the condition to a rogue neurological circuit deepens patients’ cognitive insight into the true nature of their symptoms, which in turn strengthens their belief that the thoughts and urges of OCD are separate from their will and their self. By Reattributing their symptoms to a brain glitch, the patients recognize that an obsessive thought is, in a sense, not “real” but, rather, mental noise, a barrage of false signals. This improves patients’ ability not to take the OCD thoughts at face value. Reattributing is particularly effective at directing the patient’s attention away from demoralizing and stressful attempts to squash the bothersome OCD feeling by engaging in compulsive behaviors. Realizing that brain biochemistry is responsible for the intensity and intrusiveness of the symptoms helps patients realize that their habitual frantic attempts to wash (or count or check) away the symptoms are futile.

I fear I would make a troublesome patient for Dr. Schwartz, for, even though I could potentially benefit from his new therapeutic ideas, I wouldn’t be able to keep from asking troublesome questions. What goes on in the brain, for example, that brain biochemistry isn’t responsible for? If it is brain biochemistry that is responsible for the intensity and intrusiveness of my OCD symptoms, then isn’t it also brain biochemistry that is responsible for my conscious attempts to relabel and reattribute them? What separates one from the other?

Relabeling and Reattributing reinforce each other. Together, they put the difficult experience of an OCD symptom into a workable context: Relabeling clarifies what is happening, and Reattributing affirms why it’s happening, with the result that patients more accurately assess their pathological thoughts and urges. The accentuation of Relabeling by Reattributing also tends to amplify mindfulness. Through mindfulness, the patient distances himself (that is, his locus of conscious awareness) from his OCD (an intrusive experience entirely determined by material forces). This puts mental space between his will and the unwanted urges that would otherwise overpower the will.

Putting mental space between the will and the unwanted urges is something I can understand and get behind. But the other separation Schwartz is talking about still seems to me asserted but not proven. The parenthetical phrases alone in the above excerpt contain enough fuzziness to keep the two of us going around in circles for some time. In the first, he seems to be saying that the patient can be equated with his locus of conscious awareness. We are, mysteriously it would seem, the act of paying attention to something. In the second, he again begs the question with that slippery word entirely. “An intrusive experience entirely determined by material forces” is a very different idea than the one expressed by “an intrusive experience determined by material forces.” The former implies that its counterpart, in this case the will, is not determined by material forces. But who says it isn’t?

Despite the fact that Schwartz’s cognitive therapy works, it does not necessarily follow that his patients are able to marshal their conscious will from a source other than the material forces on which the brain is built and functions. How can the observing mind cause something to happen in the acting brain through “its” conscious will when that very consciousness may very well be dependent on the brain and its causal activities in order to manifest itself? Schwartz’s OCD therapy may have shown that there is a way out of that maze, but I don’t think it shows what that way is.

3. The Mind Manifests Itself Through Quantum Phenomena in Our Heads

I don’t understand quantum mechanics. And I’ll wager that Schwartz doesn’t either. As physicist Richard Feynman once reportedly said, if you think you understand quantum mechanics, then you absolutely do not understand quantum mechanics. It is, seemingly by its very nature, counter-intuitive and resistant to human understanding, full of undecipherable math equations and observable phenomena that apparently defy the physical logic that governs the classical world of bodies in motion.

And yet, Schwartz hangs his entire argument for the mechanism by which the mind changes the brain on one of these seeming tricks of quantum mechanics -- the apparent reality that the act of observation changes, or perhaps defines, what is being observed.

I’ll try to tease apart what Schwartz purports to understand about quantum mechanics from what I think I know about the subject, but we’re likely to get twisted into even more challenging knots than the philosophical ones I’ve created so far.

Let’s begin with the math.

A. Mere Mathematical Devices

It has been a century since the German physicist Max Planck fired the opening shot in what would become the quantum revolution. On October 19, 1900, he submitted to the Berlin Physical Society a proposal that electromagnetic radiation (visible light, infrared radiation, ultraviolet radiation, and the rest of the electromagnetic spectrum) exists as tiny, indivisible packets of energy rather than as a continuous stream. He later christened these packets quanta. … Planck viewed his quanta as mere mathematical devices, something he invoked in “an act of desperation” to explain why heated, glowing objects emit the frequencies of energy that they do (an exasperating puzzle known as the black-body radiation problem). He did not seriously entertain the possibility that they corresponded to physical entities. It was just that if you treated light and other electromagnetic energy as traveling in quanta, the equations all came out right.

So here we have one of the fundamental realities of science. Not that light travels in “tiny, indivisible packets” called quanta, but that math is the tool that science uses to describe, not define, natural phenomena. In one of my most accessible examples, the mathematics of the epicycles used in pre-Keplerian astronomy did an admirable job of describing (and predicting) the observable motions of the planets in the solar system, but the math used did not bring into existence the great celestial wheels upon wheels that the formulae described. Like all sciences, quantum mechanics uses mathematics first to describe what is observed and then to extrapolate from those descriptive formulae new phenomena and understandings of reality. In some cases, those extrapolations are confirmed by observation, and those instances are held up as proof of the predictive power of the mathematical theory. But one always has to be careful to remember that the theory is one constructed of epicycles -- structures that have no hard existence in reality.
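A brief aside of my own, since neither the formula nor its interpretation appears in the passage Schwartz quotes: the “mere mathematical device” Planck introduced was about as simple as an equation can be,

$$E = h\nu$$

where E is the energy of a single quantum, ν (nu) is the frequency of the radiation, and h is the constant of proportionality we now call Planck's constant. Treating energy as coming only in whole multiples of hν was the assumption that made the black-body calculations come out right; nothing in the formula itself claims the packets are physically real, which is exactly the point about description versus definition.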

Now, as I said earlier, one of the fundamental observed phenomena of quantum mechanics is the apparent reality that the act of observation changes, or perhaps defines, what is being observed. Let’s delve deep into what I only mentioned earlier, the famous two-slit experiment.

B. The Famous Two-Slit Experiment

In 1801, the English polymath Thomas Young rigged up the test that has been known forever after as the two-slit experiment. At the time, physicists were locked in debate over whether light consisted of particles (minuscule corpuscles of energy) or waves (regular undulations of a medium, like water waves in a pond). In an attempt to settle the question, Young made two closely spaced vertical slits in a black curtain. He allowed monochromatic light to strike the curtain, passing through the slits and hitting a screen on the opposite wall. Now, if we were to do a comparable experiment with something we know to be corpuscular rather than wavelike -- marbles, say -- there is no doubt about the outcome.

Pay close attention to Schwartz’s use of terms here. Planck’s “quanta” have now become “particles,” a transposition not fully explained, and “marbles” have been equated with those particles of light (soon to be called “photons”), an extrapolation indefensible by any structure of logic I’m familiar with. In what way, exactly, should we expect marbles to act similarly to “minuscule corpuscles of energy”?

Most marbles fired at, for instance, a fence missing two slats would hit the fence and drop on this side. But a few marbles would pass through the gaps and, if we had coated the marbles with fresh white paint, leave two bright blotches on the wall beyond, corresponding to the positions of the two openings.

This is not what Young observed.

Again, why should it be? Comparing photons to marbles is like comparing electrons to elephants.

Instead, the light created, on a screen beyond the slitted curtain, a pattern of zebra stripes, alternating dark and light vertical bands. It’s called an interference pattern. Its genesis was clear: where crests of light waves from one slit met crests of waves from the other, the waves reinforced each other, producing the bright bands. Where the crest of a wave from one slit met the trough of a wave from the other, they canceled each other, producing the dark bands. Since the crests and troughs of a light wave are not visible to the naked eye, this is easier to visualize with water waves.

Okay, but be careful. We’re about to make another “photons to marbles” comparison. Unlike water waves, light waves do not need a medium in order to propagate. A few minutes on Google helped me verify that double-slit experiments done in a vacuum produce the same results: an interference pattern despite the lack of any medium for the waves to be traveling through. Light waves, therefore, are about as analogous to water waves as photons are to marbles (or electrons are to elephants).
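One more aside of my own, standard textbook optics rather than anything Schwartz works through: the zebra stripes fall exactly where the simple geometry of the two paths predicts them. With slit spacing d, wavelength λ, and viewing angle θ, the waves from the two slits arrive in step or out of step according to

$$d\sin\theta = m\lambda \;\;\text{(bright bands)}, \qquad d\sin\theta = \left(m + \tfrac{1}{2}\right)\lambda \;\;\text{(dark bands)},$$

for whole numbers m. The bright bands appear wherever the path difference between the two slits is a whole number of wavelengths, the dark bands wherever it is an odd half-number, and that simple bookkeeping is the entire content of the interference pattern Young saw.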

Place a barrier with two openings in a pool of water. Drop a heavy object into the pool -- watermelons work -- and observe the waves on the other side of the barrier. As they radiate out from the watermelon splash, the ripples form nice concentric circles. When any ripple reaches the barrier, it passes through both openings and, on the other side, resumes radiating, now as concentric half-circles. Where a crest of ripples from the left opening meets a crest of ripples from the right, you get a double-height wave. But where crest meets trough, you get a zone of calm. Hence Young’s interpretation of his double-slit experiment: if light produces the same interference patterns as water waves, which we know to be waves, then light must be a wave, too. For if light were particulate, it would produce not the zebra stripes he saw but, rather, the sum of the patterns emerging from the two slits when they are opened separately -- two splotches of light, perhaps, like the marbles thrown at our broken fence.

It’s not my broken fence. I think what troubles me most about this line of reasoning is how falsely dichotomous it is. It assumes that light is either a wave or a particle, although we already know that light bends (if not breaks) the very definitions of both of those words. Let me boldly state the obvious. Light is neither a wave nor a particle. It is something else. You planted your own seeds of disappointment when you decided that light had to act predictably like either a wave or a particle. Don’t blame the observation when you yourself biased your thinking against it.
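For what it’s worth, the zebra stripes themselves are not mysterious once the wave description is written down. In the idealized textbook treatment -- two infinitely narrow slits a distance d apart, monochromatic light of wavelength λ, a distant screen; this is my gloss, since Schwartz gives no formula -- the brightness at an angle θ from the centerline is

\[ I(\theta) \;=\; 4 I_{0}\,\cos^{2}\!\left(\frac{\pi d \sin\theta}{\lambda}\right) \]

where I_0 is the brightness that either slit alone would produce at that point on the screen. Bright bands where the cosine is ±1, dark bands where it is zero. One line of trigonometry, no metaphysics required -- at least not yet.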

But what does any of this have to do with the mind or the brain? We’re getting there.

So far, so understandable. But, for the next trick, turn the light source way, way down so that it emits but a single photon, or light particle, at a time. (Today’s photodetectors can literally count photons.) Put a photographic plate on the other side, beyond the slits. Now we have a situation more analogous, it would seem, to the marbles going through the fence: zip goes one photon, perhaps making it through a slit. Zip goes the next, doing the same. Surely the pattern produced would be the sum of the patterns produced by opening each slit separately -- again, perhaps two intermingled splotches of light, one centered behind the left slit and the other behind the right.

Why would you think that? You’re still not shooting marbles. You’re emitting the smallest possible quantities of light. Marbles go through one missing fence slat or the other. Light, whatever its quantity, will go through both curtain slits.

But no.

As hundreds and then thousands of photons make the journey (this experiment was conducted by physicists in Paris in the mid-1980s), the pattern they create is a wonder to behold. Instead of the two broad patches of light, after enough photons have made the trip you see the zebra stripes. The interference pattern has struck again. But what interfered with what? This time the photons were clearly particles -- the scientists counted each one as it left the gate -- and our apparatus allowed only a single photon to make the journey at a time. Even if you run out for coffee between photons, the result is eventually the same interference pattern. Is it possible that the photon departed the light source as a particle and arrived on the photographic plate as a particle (for we can see each arrive, making a white dot on the plate as it lands) -- but in between it became a wave, able to go through both slits at once and interfere with itself just as a water wave from our watermelon drop goes through the two openings in the barrier? Even weirder, each photon -- and remember, we can release them at any interval -- manages to land at precisely the right spot on the plate to contribute its part to the interference pattern.

Do you see how Schwartz is tying himself into knots, trying to interpret the behavior of light through the false dichotomy of wave vs. particle? It has to be one or the other, so let’s call it a particle when it acts like a particle and a wave when it acts like a wave. Except I don’t see why such knots are necessary since the “particles” are acting like particles, in the sense that they hit the photographic plate one at a time, just as one would expect them to. It’s only when “hundreds and thousands” of particles make the journey that the interference pattern emerges.
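In fact, you can watch exactly that behavior -- one dot at a time, stripes only in aggregate -- fall out of nothing more than the probability rule given by the interference formula above. Here is a toy simulation of my own; the wavelength and slit spacing are made-up but reasonable numbers, and none of this comes from Schwartz’s book.

# Toy single-photon double-slit simulation (my own illustration, not Schwartz's).
# Each "photon" lands at one random spot; the stripes appear only in aggregate.
import math
import random

WAVELENGTH = 500e-9       # meters (green light) -- made-up but reasonable
SLIT_SPACING = 50e-6      # meters -- made-up slit separation
SCREEN_DISTANCE = 1.0     # meters from the slits to the screen
SCREEN_HALF_WIDTH = 0.05  # meters; we only watch the middle of the screen

def relative_intensity(x: float) -> float:
    """Relative probability (0 to 1) of a photon landing at screen position x."""
    theta = math.atan2(x, SCREEN_DISTANCE)
    return math.cos(math.pi * SLIT_SPACING * math.sin(theta) / WAVELENGTH) ** 2

def one_photon() -> float:
    """Sample a single landing position by rejection sampling."""
    while True:
        x = random.uniform(-SCREEN_HALF_WIDTH, SCREEN_HALF_WIDTH)
        if random.random() < relative_intensity(x):
            return x

def exposure(n_photons: int, n_bins: int = 60) -> list[int]:
    """Accumulate individual dots into a histogram across the screen."""
    counts = [0] * n_bins
    for _ in range(n_photons):
        offset = one_photon() + SCREEN_HALF_WIDTH
        counts[min(int(offset / (2 * SCREEN_HALF_WIDTH) * n_bins), n_bins - 1)] += 1
    return counts

if __name__ == "__main__":
    for count in exposure(20_000):
        print("#" * (count // 20))   # crude ASCII photographic plate: zebra stripes

The simulation takes the interference probabilities as given, of course -- it explains nothing about why single quanta obey them -- but it does show that “each photon lands at precisely the right spot” requires nothing more than each photon landing at a random spot drawn from the same distribution.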

And, as Schwartz goes on to explain, this is not just some “weird property of light.” The same experiments, with the same results, have been done with electrons and “larger particles.”

Electrons -- a form of matter -- can behave as waves. A single electron can take two different paths from source to detector and interfere with itself: during its travels it can be in two places at once. The same experiments have been performed with larger particles, such as ions, with identical results. And ions … are the currency of the brain, the particles whose movements are the basis for the action potential by which neurons communicate. They are also, in the case of calcium ions, the key to triggering neurotransmitter release. This is a crucial point: ions are subject to all of the counterintuitive rules of quantum physics.

Maybe now you see where Schwartz is going with all of this. Forget the fact that calling electrons “a form of matter” is misleading in the extreme. And forget the fact that using the same term, “particle,” to describe both electrons (whatever they are) and calcium ions (composite objects with 20 protons, 20 neutrons, and 18 electrons) is also misleading in the extreme. Schwartz has apparently taken us down this quantum journey so he can arrive at this destination -- the ions that are the basis for neurological function exhibit quantum properties.

Because, for the purpose of his conjecture, Schwartz has only seemingly been tying himself into knots. He knows why photons, electrons, and calcium ions exhibit these strange “double-slit” behaviors, and it isn’t because they are sometimes particles and sometimes waves. He’s been leading us down the garden path.

C. Collapsing Wave Functions

A key to understanding the whole bizarre situation is that we actually measure the photon or electron at only two points in the experiment: when we release it (in which case a photodetector counts it) and when we note its arrival at the end. The conventional explanation is that the act of measurement makes a spread-out, fuzzy wave (at the slits) collapse into a discrete, definite particle (on the scintillation plate or other detector). According to quantum theory, what in fact passes through the slits is a wave of probability. In fact, quantum physics describes the behavior of a particle by something called the Schrödinger wave equation (after Erwin Schrödinger, who conceived it in 1926). Just as Newton’s second law describes the behavior of particles, so Schrödinger’s wave equation specifies the continuous and smooth evolution of the wave function at all times when it is not being observed. The wave function encodes the entire range of possibilities for that particle’s behavior -- where the particle is, when. It contains all the information needed to compute the probabilities of finding the particle in any particular place, any time. These many possibilities are called superpositions. The element of chance is key, for rather than specifying the location, or the energy, or any other trait of a particle, the equation modestly settles for describing the probability that those traits will have particular values. (In precise terms, the square of the amplitude of the wave function at any given position gives the probability that the particle will be found in some region near that position.) In this sense the Schrödinger wave can be considered a probability wave.

Oh my god. What does all of that mean? Let me simplify it for you. It’s math. Neither the math invented by Newton to describe the observed behavior of particles nor the math invented by Huygens to describe the observed behavior of waves suffices when it comes to the observed “two-slit” behavior of photons, electrons, and calcium ions. So a very smart person named Schrödinger invented a new set of math equations that do describe those observed behaviors. Schrödinger’s math deals with probabilities, not with concrete factors, and, as such, when one extrapolates his equations to make accurate predictions about the behavior of the phenomena represented by his “wave functions,” some very unusual things happen.
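To put one concrete equation where Schwartz keeps things entirely verbal -- this is the standard textbook form for a single particle of mass m moving in one dimension in a potential V(x), not anything taken from his book -- the wave function Ψ(x, t) evolves according to

\[ i\hbar\,\frac{\partial \Psi(x,t)}{\partial t} \;=\; -\frac{\hbar^{2}}{2m}\,\frac{\partial^{2}\Psi(x,t)}{\partial x^{2}} \;+\; V(x)\,\Psi(x,t) \]

and the Born rule says that, for a normalized wave function, the probability of finding the particle in a small region dx around position x at time t is |Ψ(x, t)|² dx -- the “square of the amplitude” in the passage above. Deterministic evolution of the function, probabilistic statements about what you will actually see: that is about all the mathematics itself asserts.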

When a quantum particle or collection of particles is left alone to go its merry way unobserved, its properties evolve in time and space according to the deterministic wave equation. At this point (that is, before the electron or photon is observed), the quantum particle has no definite location. It exists instead as a fog of probabilities: there are certain odds (pretty good ones) that, in the appropriate experiment, it will be in the bright bands on the plate, other odds (lower) that it will land in the dark bands, and other odds (lower still, but nonzero) that it will be in the Starbucks across the street.

Do you see what Schwartz did there? He equated the inability of the math to determine a precise location for the quantum particle with the supposed reality that the quantum particle has no precise location. Do you see what else he did? He equated the fact that the Schrödinger wave equation has solutions that are nonzero for the quantum particle being in the Starbucks across the street with the supposed reality that the quantum particle could, in fact, be in the Starbucks across the street. The math tells us that both are possible -- in a “nonzero” kind of way -- so I guess we’d better consider them.

But as soon as an observer performs a measurement -- detecting an electron landing on a plate, say -- the wave function seems to undergo an abrupt change: the location of the particle it describes is now almost definite. The particle is no longer the old amalgam of probabilities spread over a large region. Instead, if the observer sees the electron in this tiny region, then only that part of the wave function representing the small region where observation has found it survives. Every other probability for the electron’s position has vanished. Before the observation, the system had a range of possibilities; afterward, it has a single actuality. This is the infamous collapse of the wave function.
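Stripped of the spooky language, what that passage describes looks, at least operationally, like nothing more exotic than conditioning a probability table on what you have just learned. Here is a toy sketch of my own -- the outcomes and numbers are invented, and classical probabilities are of course not the full quantum formalism (there are no interfering amplitudes here) -- but it captures what “only that part of the wave function … survives” means in practice.

# Toy illustration (mine, not from Schwartz's book): "collapse" as conditioning.
from typing import Callable

prior = {   # invented pre-measurement odds for where the electron might turn up
    "left bright band": 0.40,
    "left dark band": 0.10,
    "right bright band": 0.40,
    "right dark band": 0.09,
    "Starbucks across the street": 0.01,   # "nonzero," as promised
}

def collapse(probabilities: dict[str, float],
             compatible: Callable[[str], bool]) -> dict[str, float]:
    """Discard outcomes ruled out by the observation, then renormalize the rest."""
    survivors = {k: v for k, v in probabilities.items() if compatible(k)}
    total = sum(survivors.values())
    return {k: round(v / total, 3) for k, v in survivors.items()}

# A coarse measurement finds the electron somewhere on the left half of the plate:
print(collapse(prior, lambda outcome: outcome.startswith("left")))
# -> {'left bright band': 0.8, 'left dark band': 0.2}; every other "possibility" is gone

Whether something deeper than that bookkeeping is going on is exactly what the competing interpretations argue about.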

Schwartz will go on to describe the multiple theories that various physicists have offered to explain the “abrupt change” or “collapse” of the wave functions that describe the observed behaviors of photons, electrons, and calcium ions. But Schwartz settles on one in service of the “mind changes the brain” theory he is developing.

[Niels] Bohr insisted that quantum theory is about our knowledge of a system and about predictions based on that knowledge; it is not about reality “out there.” That is, it does not address what had, since before Aristotle, been the primary subject of physicists’ curiosity -- namely, the “real” world. The physicists [who agreed with Bohr] threw in their lot with this view, agreeing that the quantum state represents our knowledge of a physical system.

Before the act of observation, it is impossible to know which of the many probabilities inherent in the Schrödinger wave function will become actualized. Who, or what, chooses which of the probabilities to make real? Who, or what, chooses how the wave function “collapses?” Is the choice made by nature, or by the observer? According to [Bohr’s interpretation], it is the observer who both decides which aspect of nature is to be probed and reads the answer nature gives. The mind of the observer helps choose which of an uncountable number of possible realities come into being in the form of observations. A specific question (Is the electron here or there?) has been asked, and an observation has been performed (Aha! The electron is there!), corralling an unruly wave of probability into a well-behaved quantum of certainty. Bohr was silent on how observation performs this magic. It seems, though, as if registering the observation in the mind of the observer somehow turns the trick: the mental event collapses the wave function. Bohr, squirming under the implications of his own work, resisted the idea that an observer, through observation, is actually influencing the course of physical events outside his body. Others had no such qualms. As the late physicist Heinz Pagels wrote in his wonderful 1982 book The Cosmic Code, “There is no meaning to the objective existence of an electron at some point in space … independent of any actual observation. The electron seems to spring into existence as a real object only when we observe it!”

“Seems” being the operative word in that last sentence. But like Heinz Pagels, Schwartz has no qualms accepting that which made Niels Bohr squirm.

D. Quantum Brains

This maxim that “reality only exists when it is observed,” if true, remains one of the deepest and most misunderstood puzzles of quantum mechanics, but it is ultimately the peg on which Schwartz hangs his hat. Because, according to the theory that Schwartz spends the rest of his book developing, the observer in the “quantum brain” -- the entity that forces the wave function of the brain’s electrochemical activity to collapse into a concrete reality -- is, you guessed it, the mind.

And, although you more than likely have thought this long before getting to this paragraph, this idea -- that it is the “observing mind” that creates biochemical reality by collapsing Schrödinger wave functions in the “acting brain” -- is where Schwartz really starts driving us off the rails.

Applying quantum theory to the brain means recognizing that the behaviors of atoms and subatomic particles that constitute the brain, in particular the behavior of ions whose movements create electrical signals along axons and neurotransmitters that are released into synapses, are all described by Schrödinger wave equations. Thanks to superpositions of possibilities, calcium ions might or might not diffuse to sites that trigger the emptying of synaptic vesicles, and thus a drop of neurotransmitter might or might not be released. The result is a whole slew of quantum superpositions of possible brain events.

Okay, I’m with you so far, but I notice you’re no longer talking about the nonzero solutions to the Schrödinger wave equations that would put the calcium ions responsible for neurotransmitter release in the Starbucks across the street.

When such superpositions describe whether a radioactive atom has disintegrated [a reference to an earlier description of the thought experiment involving Schrödinger’s famous cat], we say that those superpositions of possibilities collapse into a single actuality at the moment we observe the state of that previously ambiguous atom.

Technically, the possibilities collapse into a single actuality when they are observed, not when “we” observe them. The electrons hit the photographic plate whether we’re in the room or not, just as Schrödinger’s cat is alive or dead in the box before we open it. Plenty of famous physicists accept this interpretation, but Schwartz carefully avoids it in order to build his case. He continues ...

The resulting increment in the observer’s knowledge of the quantum system (the newly acquired knowledge that the atom has decayed or not) entails a collapse of the wave functions describing his brain.

This is a reference not just to the radioactive atom threatening Schrödinger’s cat, but to an interpretation of quantum mechanics that classifies it as an information system within a consciousness rather than a mechanical system within a classical universe. Quantum mechanics, in this view, is not a science of particle/wave things interacting with their environment, but a science of the extent to which brains can understand and interpret the world around them. That’s why Schwartz talks about the “newly acquired knowledge” as a part of the quantum system. It’s an unjustified leap -- in my opinion -- but let’s allow Schwartz to continue …

This point is key: once the brains of observers are included in the quantum system, the wave function describing the state of the brain of any observer collapses to the form corresponding to his new knowledge. The quantum state of the brain must collapse when an observer experiences the outcome of a measurement. The collapse occurs in conjunction with the conscious act of experiencing the outcome of the observation. And it occurs in the brain of the observer -- the observer who has learned something about the system.

Do you see what he did? We’ve now moved away from the wave functions of calcium ions -- probabilistic math equations describing all the possible energies and locations of “physical” objects -- to wave functions of … what? Knowledge “moments” of brains? Probabilistic math equations describing all the possible energies and locations of all the elementary particles in a brain that correspond to a particular and distinct “set” of knowledge? Do such equations even exist? Is there a physics professor who has written such an equation down? How many lecture hall blackboards did it take? And how many times did the wave function of his brain change in the time it took him to write it? Let’s give Schwartz a chance to explain …

What do we mean by collapsing the quantum state of the brain? Like an atom threatening Schrödinger’s cat, the entire brain of an observer can be described by a quantum state that represents all the various possibilities of all of its material constituents.

“Can be described” as in “can be expressed mathematically,” or “can be described” as in “can be conjectured”? Are we dealing with an actual or a thought experiment here? Back to Schwartz …

That brain state evolves deterministically until a conscious observation occurs. Just before an observation, both the observed quantum system (let’s stick with the radioactive atom) and the brain that observes it exist as a profusion of possible states. Think of each possible state as a branch on a tree. Each branch corresponds to some possible state of knowledge, or course of action. But when the observation registers in the mind of the observer, the branches are brutally ...

Brutally?

… pruned: only the branches compatible with the observer’s experience remain. If, say, the observation is that the sun is shining, then the associated physical event is the updating of the brain’s representation of the weather. Branches corresponding to “the sky is overcast” are chopped off. An increase in knowledge is accompanied by an associated reduction of the quantum state of the brain. And with that, the quantum brain changes, too.

Okay. That’s as far as I’m going to go with this. Schwartz has so hopelessly confused his argument that I don’t think we are plausibly talking about reality any more.

This business about branches being pruned is a reference to the “many worlds” theory -- one of the explanations for the “collapse” of the wave function that some physicists have offered, and which Schwartz himself seemed to dismiss earlier in his text. It says that all possible solutions to the wave function do in fact exist: the one observed here in this universe, and all the others … um, somewhere else. Enough said there.

But more practically, if Schwartz is going to claim that the brain exists in a quantum state that changes (or collapses?) with each new piece of knowledge it receives, he frankly has a lot more explaining to do.

First, what counts as a “piece” (a quantum?) of knowledge? The example Schwartz continues to use is the decay of the radioactive atom threatening Schrödinger’s cat. He treats that in the binary sense. Either the atom has decayed or it hasn’t, and he therefore treats the resulting quantum brain states as equally binary. It looks like this if it knows the atom has decayed, and it looks like that if it knows it hasn’t. But I’m not sure I buy that we’re only talking about two brain states (or two solutions to the brain’s Schrödinger wave function) here. To borrow a phrase, there “seems” to be a lot going on in our brains moment to moment that would probably contaminate any pure sample of quantum brain states that we’re trying to measure. Forget all the random bits of conscious trivia that are constantly flowing along in our awareness (sorry, can’t resist: Did I turn off the iron? Did I lock the house?). What about all the subconscious monitoring and control our brains are responsible for? Heart activity, respiration, digestion, tactile awareness, spatial orientation -- our brains are always working on these processes whether we are aware of them or not. What if the duodenum needs to contract, or the pancreas needs to secrete, just as the bit of “important” quantum information about decaying atoms comes into our awareness? Can we really be sure that we’re collapsing the right wave function at the right time?

Second, why stop at the brain? As I read through Schwartz’s text I was constantly confused as to which wave function he was talking about as he teased his way through the various pieces of his theory. Are we talking about the wave function associated with a single electron on a single calcium ion, a single calcium ion in a single synapse, all the calcium ions in a single synapse, all the calcium ions in all the synapses, all the molecules and floating ions/atoms that constitute a single neuron, all the neurons associated with the quantum knowledge gain in question, all the neurons in the entire brain, and/or the whole brain itself? And if we accept that wave functions can usefully describe macroscopic objects like brains, then why stop there? What about the wave functions associated with the head, the body, the room, the building, the city, the earth, the solar system, the galaxy, and the universe? Why aren’t we talking about the addition of quantum information to any of those collapsing wave functions? (I try to put some rough numbers on how quickly these hypothetical wave functions balloon in a short sketch at the end of this section.)

And third, who, exactly, is the observer?

The fact that the collapse of the wave function so elegantly allows an active role for consciousness -- which is required for an intuitively meaningful understanding of the effects of effort on brain function -- is itself strong support for using a collapse-based interpretation in any scientific analysis of mental influences on brain action.

Evidently, “we” are. We’re back to the unproved assumption that consciousness exists as something separate from the activity of the brain, invoked to provide a mechanism for explaining that activity -- this time on a quantum rather than a spiritual level.
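Before moving on, and as promised above, here is some back-of-envelope arithmetic on how quickly these hypothetical wave functions balloon. The specific numbers are my own rough placeholders -- Schwartz supplies none -- but the combinatorial point does not depend on them.

# Rough arithmetic (mine, not Schwartz's) on how fast the "branches" multiply.
import math

# First objection: the brain never registers just ONE binary fact at a time.
facts_in_awareness = [
    "radioactive atom decayed?",
    "did I turn off the iron?",
    "did I lock the house?",
    "duodenum contracting?",
    "pancreas secreting?",
]
print(f"{len(facts_in_awareness)} simultaneous yes/no facts -> "
      f"{2 ** len(facts_in_awareness)} candidate brain states, not 2")

# Second objection: even stopping at the brain, the object is unimaginably large.
# Model each synapse as a mere two-state system and count the amplitudes needed.
n_synapses = 10 ** 14                     # a commonly cited order of magnitude
digits = int(n_synapses * math.log10(2)) + 1
print(f"2**{n_synapses} amplitudes -- a number with about {digits:,} digits")

Even that wildly oversimplified model of a “quantum brain” is beyond anything anyone could write on a blackboard, let alone measure.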

E. Free Will Vetoes Determined Brain Activity

Let me try to conclude with this. After presenting his hypothesis that the “observing mind” is the observer in the quantum phenomena that collapse into brain activity, Schwartz spends most of the rest of his book making the case that this “quantum observer” indeed represents our “efficacious will,” that is, the non-physical ability to choose between a multitude of quantum states and direct our brains towards certain actions and not others. Quoting Benjamin Libet, he of the famous experiments that show that the brain activity associated with conscious action occurs before the consciousness becomes aware of it, Schwartz says:

But in later years [Libet] embraced the notion that free will serves as the gatekeeper for thoughts bubbling up from the brain and did not duck the moral implications of that. “Our experimental work in voluntary action led to inferences about responsibility and free will,” he explained in late 2000. “Since the volitional process is initiated in the brain unconsciously, one cannot be held to feel guilty or sinful for simply having an urge or wish to do something asocial.

Jesus in Matthew 5:28 might disagree with that.

But conscious control over the possible act is available, making people responsible for their actions. The unconscious initiation of a voluntary act provides direct evidence for the brain’s role in unconscious mental processes. I, as an experimental scientist, am led to suggest that true free will is a [more accurate scientific description] than determinism.”

Forgive me, I couldn’t resist the biblical reference, because, frankly, that reads more like theology than science to me. You will be tempted by forces you can’t control, but you have the strength to resist them and choose the righteous path.

But where does this volitional activity come from? What does science tell us about the source of this ability to veto the asocial thoughts that come “bubbling up from the brain”?

Study after study has indeed found a primary role for the prefrontal cortex in freely performed volitional activity. “That aspect of free will which is concerned with the voluntary selection of one action rather than another critically depends upon the normal functioning of the dorsolateral prefrontal cortex and associated brain regions,” Sean Spence and Chris Frith concluded in “The Volitional Brain.” Damage to this region, which lies just behind the forehead and temples and is the most evolutionarily advanced brain area, seems to diminish one’s ability to initiate spontaneous activity and to remain focused on one task rather than be distracted by something else. These symptoms are what one would predict in someone unable to choose a particular course of action. Large lesions of this region turn people into virtual automatons whose actions are reflexive responses to environmental cues: such patients typically don spectacles simply because they are laid before them, or eat food presented to them, mindlessly and automatically. (This is what those who have had prefrontal lobotomy do.) And studies in the 1990s found that when subjects are told they are free to make a particular movement at the time of their own choosing -- in an experimental protocol much like Libet’s -- the decision to act is accompanied by activity in the dorsolateral prefrontal cortex. Without inflating the philosophical implications of this and similar findings, it seems safe to conclude that the prefrontal cortex plays a central role in the seemingly free selection of behaviors, choosing from a number of possible actions by inhibiting all but one and focusing attention on the chosen one. It makes sense, then, that when this region is damaged patients become unable to stifle inappropriate responses to their environment: a slew of possible responses bubbles up, as it does in all of us, but brain damage robs patients of the cerebral equipment required to choose the appropriate one.

So, evidently, a brain with a damaged dorsolateral prefrontal cortex -- or without one at all -- does not have the “cerebral equipment” needed to exercise its “efficacious will” over the unbidden actions that deterministically occur in the other portions of the brain. Here I seem to have an answer to the question I asked earlier about the “size” of the necessary wave function. Without a dorsolateral prefrontal cortex, there is evidently no observer to collapse the brain’s wave function -- and yet, quantum phenomena undoubtedly occur as calcium ions make their deterministic passages in and out of neurons elsewhere in the brain. Which observer is collapsing those?

+ + +

This post first appeared on the blog of Eric Lanke, an association executive and author. You can follow him on Twitter @ericlanke or contact him at eric.lanke@gmail.com.