Wednesday, February 22, 2017

Blood and Hate and Silence

Blood and Hate and Silence. 


I always wanted to kill humans, possibly about a dozen I hated, but since I still had some vestiges of rationality in me, I also didn't want to get caught. So I waited for the right kind of opportunities to present themselves. I had waited for a long time; months rolled into years and then decades. 

Finally, an opportunity came. I took advantage of it, though I also took some risks with it. I couldn't help myself. A single shot, maybe two or three, to the head, then covering my tracks and walking away, would have been too easy and anti-climactic. But I was a dramatic, bombastic, flamboyant kind of guy. So I gave in to who I really was. I handcuffed him, tied his feet, threw him in my car's trunk, and told him to shut up or I would blow his head off right there and then. 

It was early January. I took him to the woods, in the middle of nowhere in Northern Wisconsin, close to Minnesota. I told him to get out of the car after loosening the rope on his feet so he could walk. Then I stuck a little plastic ball in his mouth and sealed it with strong duct tape. 

He walked in front of me. His legs wobbled. The air was crisp and sharp, though the sky was overcast. I could see mist coming out of his nostrils and trailing thinly away. I also smelled the naked, animalistic odor of fear wafting from him. He knew he was going to die, but not how or when. I had taped his mouth because I didn't want to hear his pleading and cries. I was a soft-hearted kind of guy. I cried easily myself. And I wrote poetry.

I tied him to a birch tree. Then I put on a pair of boxing gloves. But first I kicked him hard in the groin. He couldn't bend over because of the restraining rope. I heard a muffled sound coming from his mouth. Tears streamed down his face. While I worked him over, from his head down to his belly, I recited his sins against me, one by one. I didn't shout or scream to express my rage. I just released my pent-up frustrations by methodically hitting him on his face and body, a blow for each sin I recounted for his "benefit". I was sure he couldn't remember them all. He was that kind of a dog, a petty-minded and persistent dog. He thought he had fun at my expense, making up outlandish, outrageous lies about me. "Guess who's laughing now, asshole?" I leaned over and whispered in his ear. After about five minutes, the cur's face was swollen and bloody, his eyes puffed up and completely shut. I swear I heard celestial music in tune with my every blow. Winds arrived with increasing speed, lifting fallen leaves temporarily off the ground. They swirled around me and him---me and my human punching bag, two solitary human figures in the wilderness, in the biting cold, in the presence of silent birch and elm and maple and pine trees and shrubs; one figure in constant motion, the other stationary and tied to a tree. Then slanting snow started coming down hard. A blizzard was in full swing. Nature must have been angry, too. The only noises were my punching blows and the howling winds. I was perspiring from the exertion. He was by then unconscious and barely breathing. I ended his misery by pulling out a knife and cutting his throat. A jet of red blood landed in the white wet snow. 

As I walked back to my car, I felt cathartic and peaceful, and yet I pondered the imperative of keeping my mouth shut, the necessity and beauty of Silence. 

The celestial music stopped. I only heard the howling winds. Snow was landing hard on my face. 

One down, eleven more to go. The Dirty Dozen. That was what I called them. They had been in my consciousness during my waking hours and in my dreams at night. They were one of the reasons I wanted to live: so that one day I could exact vengeance. There is a saying that Love trumps Hate. Maybe so. But in my universe, Hate and Love are intertwined. If you didn't have Love, you wouldn't know Hate. Hate is Love unfulfilled. 

The asshole I had just killed was vermin that deserved its fate. It asked for it. It was a dumb one. That was why I caught it first. The other animals would be more elusive. But I have patience. It is one of my prime virtues. I'm hot-tempered but patient. Sounds like a contradiction, an oxymoron. But Man is a self-conflicting, internally warring animal, suspended somewhere between Angel and Devil. Anybody knows that, even a stupid guy like me. One doesn't have to be a Sigmund Freud to be cognizant of that fact. Man aspires to be God, so he invented Him; but he's aware that he's more of an asshole, so he also invented the Devil/Lucifer/Satan. And Man is restless, bouncing back and forth between the two poles, contributing not much to Life and the World, but accounting for a lot of its destruction and heartaches. 

Some Greek guy said thousands of years ago that Man is an animal that wants to know. I seriously doubt that statement's veracity, just by looking at the bullshit dogmas and doctrines peddled by the so-called religious and political "leaders". If you're dumb and ignorant, and hence gullible, there are always assholes trying to take advantage of you by telling you lies. But since you're stupid and uninformed, you would embrace those lies and firmly believe that those lies are "truths". 

Life, to me, must be an endless struggle and fight for Truth. A human must surround himself with facts, truths, and sound reasoning, not stupid, Blind Faith. Blind Faith makes you a blind and dumb slave, hence manageable, controllable, and useful. How sad and how brutal! But that's what Life is all about: an endless struggle for domination and exploitation, for power, and definitely not for Love. Love is just a front, a bait to attract potential prey. I'm not saying there are no humans who are full of Love. Yes, there are. I have even met several. That's why I haven't given up completely on the human race. Not yet. 

But where am I? And what am I doing? Am I preaching or trying to write a fictional story, or both? I don't know. I just know one thing: there's an "artist" inside me, trying to break out. So I keep trying to weave words together. Maybe someday the tapestry of my words will constitute something akin to "Art". But now, since I'm stuck, that is to say, I'm having what's called "writer's block", I'm going to make a detour by condensing, paraphrasing, or just plain copying a portion of a story ("The Largesse of the Sea Maiden") by Denis Johnson that shocked and awed me by virtue of its artistry and brevity, the kind of story that I've wanted to write: concise, bare, and yet lingering in the mind of the reader after he finishes reading it and wonders whether the story is based on real life or just on the imagination of the writer. In fact, several, if not all, of the "stories" I've written have this kind of flavor. 

"After dinner, nobody went home right away...We sat around in the living room describing the loudest sounds we'd ever heard. One said it was his wife's voice when she told him she didn't love him anymore and wanted a divorce. Another recalled the pounding of his heart when he suffered a coronary. Tia Jones had become a grandmother at the age of thirty-seven and hoped never again to hear anything so loud as her granddaughter crying in her sixteen-year-old daughter's arms. Her husband, Ralph, said it hurt his ears when his brother opened his mouth in public, because his brother had Tourette's syndrome and erupted with remarks like "I masturbate! Your penis smells good!" in front of perfect strangers on a bus or during a movie, or even in church.

Young Chris Case reversed the direction and introduced the topic of silences. He said the most silent thing he'd ever heard was the land mine taking off his right leg outside Kabul, Afghanistan. 

As for the other silences, nobody contributed. In fact, there came a silence now....I hadn't even known that Chris had fought in Afghanistan. "A land mine?" I said. 
"Yes sir. A land mine."
"Can I see it?" Deirdre said.
"No, ma'am," Chris said. " I don't carry land mines around on my person."
"No, I mean your leg."
"It was blown off."
"I mean the part that's still there!"
"I'll show you," he said, "if you kiss it."

Shocked laughter. We started talking about the most ridiculous things we'd ever kissed. Nothing of interest. We'd all kissed only people, and only in the usual places. "All right, then," Chris told Deirdre. "Here's your chance for the conversation's most unique entry."
"No, I don't want to kiss your leg!"

....I think we all felt a little irritated with Deirdre. We all wanted to see.

Morton Sands was there too, that night, and for the most part he'd managed to keep quiet. Now he said, "Jesus Christ, Deirdre!"
"Oh, well. OK," she said.
Chris pulled up his right pant leg......Deirdre got down on her bare knees before him, and he hitched forward in his seat to move the scarred stump within two inches of Deirdre's face. Now she started to cry. Now we were all embarrassed, a little ashamed (of ourselves). 

For nearly a minute, we waited. 

Ralph Jones was sitting beside Chris. Then Ralph said, "Chris, I remember when I saw you fight two guys outside the Aces Tavern (right before you got sent to war). No kidding," Ralph told the rest of us. "He went outside with these two guys and beat the crap out of both of them."
"I guess I could've given them a break," Chris said. "They were both pretty drunk."
......
We wanted to see how this sort of thing worked out. How often will you witness a woman kissing an amputation? But Ralph ruined everything by talking. He'd broken the spell. Chris worked the prosthetics back into place and tightened the straps and rearranged his pant leg. Deirdre stood up and wiped her eyes and smoothed her skirt and took her seat, and that was that. 

The outcome of all this was that Chris and Deirdre, about six months later, down at the courthouse, in the presence of very nearly the same group of friends, were married by a magistrate. Yes, they're husband and wife. You and I know what goes on."

I don't know about you, but I don't know what's going on. However, tomorrow I'll be on a ship for a week-long cruise in the Caribbean. If I meet a pretty, vivacious Latina on a crutch or even in a wheelchair, and if I'm challenged to kiss her, anywhere, just to start a conversation and get to know her, I'll take my chances. I'll be glad to do it, unlike Deirdre. I won't be crying. No sir, I won't. 

Wissai
May 14, 2016

Tuesday, February 21, 2017

Biography of Angela Carter, a book review by Dwight Garner

THE INVENTION OF ANGELA CARTER
A Biography
By Edmund Gordon
525 pages. Oxford University Press. $35.
The English writer Angela Carter (1940-1992) tended to look, one observer said, “like someone who’d been left out in a hurricane.” She liked to make an impression, and her hair was often wild. She wore, when young, what she termed “a reasonably suave Jimi Hendrix cut.”
She enjoyed floppy hats, tattered furs, large eyeglasses. Boredom was her enemy. Carter was a disrupter of dull dinner parties. A friend called her a “raconteur of glee.” If she rang you on the phone, you’d clear your schedule for the afternoon.
She was a similarly disruptive agent in British fiction. Her novels, when they began arriving in the late 1960s, were unlike the button-down realism that then prevailed. They were fantastical, feminist, absurdist, sexy. She tinkered with genres (fairy tales, horror, science fiction, gothic) most literary writers scorned.
Carter found an audience before she died, at 51, of lung cancer. But it was only after her death that her reputation was secured, and it has continued to rise. The Times of London, in 2008, ranked Carter 10th on its list of “the 50 greatest writers since 1945.” In 2012, her novel “Nights at the Circus” was named the best of the winners of the James Tait Black Memorial Prize.
Like all the best writers, she was incapable of phoning anything in. Her fiction aside, Carter’s thick book of collected journalism, travel writing, criticism and essays, “Shaking a Leg” (1997), is its own erudite stay against dullness.
Now we have “The Invention of Angela Carter,” the first full-length biography, and it will consolidate her position. Edmund Gordon has written a terrific book — judicious, warm, confident and casually witty. The ratio of insight to literary-world gossip, of white swan to black swan, is as well calibrated as one of Sara Mearns’s impossible balletic leaps.
Gordon has had the good fortune to seize upon, for his subject, not only an important writer but one who led a deeply interesting life. This bio unfolds a bit like one of the fairy tales Carter shook to release its meaning. The pages turn themselves.
She was born Angela Olive Stalker and grew up mostly in London, playing with her brother in the post-Blitz urban rubble. Her father was a journalist. Her mother was a woman who loved too much. She smothered Angela with affection and food.
“It wasn’t easy to become obese” in England in the 1940s, Gordon writes, when milk and butter were rationed. With her mother’s eager assistance, Carter did so. Her nicknames at school included Tubs and Fatty.
She escaped her mother’s clutches by attending an elite prep school on a scholarship, where the weight fell off. She was an intense reader. She escaped, too, by marrying young, at 19, rather than applying, as she had been encouraged to do, to Oxford.
Her first husband, Paul Carter, was deep into England’s nascent folk music scene. The couple became known, sometimes sneeringly, as “the Folk Singing Carters.” This marriage lasted nine years. Carter later had cruel things to say about her husband. She told a friend she had had “more meaningful relationships with people I’ve sat next to on aeroplanes.”
She began writing fiction in earnest while married to Paul Carter, though she did not feel supported by him. She wrote a different friend: “Behind every great man is a woman dedicated to his greatness whilst behind every great woman is a man dedicated to bringing her down.”
Carter’s first novel, “Shadow Dance,” appeared in 1966. By 1972 she had published five more. Her life changed when she went to live in Tokyo for two years on prize money from a Somerset Maugham Award.
In Japan she took on a younger lover, the first of several in her life. About one, she commented: “Every time I pull down his underpants, I feel more & more like Humbert Humbert.” Sex was important to Carter and she wrote about it beautifully, in her fiction and in the letters and journals of which Gordon makes use.
In her book “The Sadeian Woman” (1979), she wrote, “We do not go to bed in simple pairs; even if we choose not to refer to them, we still drag there with us the cultural impedimenta of our social class, our parents’ lives, our bank balances.”
Carter never learned to drive or ride a bike. She was a savage smoker of cigarettes. “When she was writing,” Gordon says, “smoke would curl out from the keyhole in her room.”
She was, oddly, to borrow a phrase from Kingsley Amis, a “mean sod” about alcohol. “Guests often felt frustrated by her habit of pouring them a single glass of wine,” Gordon writes, “then corking the bottle and putting it back in the fridge, never to emerge again.”
She returned from Japan to a changing literary world in London. She assisted her friend Kazuo Ishiguro in finding an agent. She helped usher Pat Barker into print. She befriended Salman Rushdie and Ian McEwan.
There was a boom in British fiction in the early 1980s, with Rushdie, Ishiguro, McEwan and Martin Amis at its vanguard. “Angela missed out on all this,” Gordon writes. Her book advances, unlike theirs, did not soar. She always felt she had never quite broken through.
Selling each new book, Gordon tells us, was a struggle. It is hard to see why. Carter wrote some of the 20th century’s unforgettable first sentences. Her novel “The Passion of New Eve” (1977) begins this way: “The last night I spent in London, I took some girl or other to the movies and, through her mediation, I paid you a little tribute of spermatozoa, Tristessa.”
Carter ultimately married again, to a construction worker 15 years her junior, and had her only child, a son, at 43. After her death, Rushdie wrote that “English literature has lost its high sorceress, its benevolent white witch.” This biography is witchy, in the best sense, as well.

Thinking

WHY FACTS DON’T CHANGE OUR MINDS

New discoveries about the human mind show the limitations of reason.

In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.

Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.

As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.

In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.

“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”

A few years later, a new set of Stanford students was recruited for a related study. The students were handed packets of information about a pair of firefighters, Frank K. and George H. Frank’s bio noted that, among other things, he had a baby daughter and he liked to scuba dive. George had a small son and played golf. The packets also included the men’s responses on what the researchers called the Risky-Conservative Choice Test. According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option. In the other version, Frank also chose the safest option, but he was a lousy firefighter who’d been put “on report” by his supervisors several times. Once again, midway through the study, the students were informed that they’d been misled, and that the information they’d received was entirely fictitious. The students were then asked to describe their own beliefs. What sort of attitude toward risk did they think a successful firefighter would have? The students who’d received the first packet thought that he would avoid it. The students in the second group thought he’d embrace it.

Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted. In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.

The Stanford studies became famous. Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?

In a new book, “The Enigma of Reason” (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. Mercier, who works at a French research institute in Lyon, and Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context.

Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument runs, more or less, as follows: Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“Reason is an adaptation to the hypersocial niche humans have evolved for themselves,” Mercier and Sperber write. Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.

Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments. One of the most famous of these was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

The students were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who’d originally opposed capital punishment did the reverse. At the end of the experiment, the students were asked once again about their views. Those who’d started out pro-capital punishment were now even more in favor of it; those who’d opposed it were even more hostile.

If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats—the human equivalent of the cat around the corner—it’s a trait that should have been selected against. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”

Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.

A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry. Participants were asked to answer a series of simple reasoning problems. They were then asked to explain their responses, and were given a chance to modify them if they identified mistakes. The majority were satisfied with their original choices; fewer than fifteen per cent changed their minds in step two.

In step three, participants were shown one of the same problems, along with their answer and the answer of another participant, who’d come to a different conclusion. Once again, they were given the chance to change their responses. But a trick had been played: the answers presented to them as someone else’s were actually their own, and vice versa. About half the participants realized what was going on. Among the other half, suddenly people became a lot more critical. Nearly sixty per cent now rejected the responses that they’d earlier been satisfied with.


This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group. Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave. There was little advantage in reasoning clearly, while much was to be gained from winning arguments.

Among the many, many issues our forebears didn’t worry about were the deterrent effects of capital punishment and the ideal attributes of a firefighter. Nor did they have to contend with fabricated studies, or fake news, or Twitter. It’s no wonder, then, that today reason often seems to fail us. As Mercier and Sperber write, “This is one of many cases in which the environment changed too quickly for natural selection to catch up.”

Steven Sloman, a professor at Brown, and Philip Fernbach, a professor at the University of Colorado, are also cognitive scientists. They, too, believe sociability is the key to how the human mind functions or, perhaps more pertinently, malfunctions. They begin their book, “The Knowledge Illusion: Why We Never Think Alone” (Riverhead), with a look at toilets.

Virtually everyone in the United States, and indeed throughout the developed world, is familiar with toilets. A typical flush toilet has a ceramic bowl filled with water. When the handle is depressed, or the button pushed, the water—and everything that’s been deposited in it—gets sucked into a pipe and from there into the sewage system. But how does this actually happen?

In a study conducted at Yale, graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cylinder locks. They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again. Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped. (Toilets, it turns out, are more complicated than they appear.)

Sloman and Fernbach see this effect, which they call the “illusion of explanatory depth,” just about everywhere. People believe that they know way more than they actually do. What allows us to persist in this belief is other people. In the case of my toilet, someone else designed it so that I can operate it easily. This is something humans are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins.

“One implication of the naturalness with which we divide cognitive labor,” they write, is that there’s “no sharp boundary between one person’s ideas and knowledge” and “those of other members” of the group.

This borderlessness, or, if you prefer, confusion, is also crucial to what we consider progress. As people invented new tools for new ways of living, they simultaneously created new realms of ignorance; if everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering.

Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain. It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about. Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map. The farther off base they were about the geography, the more likely they were to favor military intervention. (Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)

Surveys on many other issues have yielded similarly dismaying results. “As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. And here our dependence on other minds reinforces the problem. If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.

“This is how a community of knowledge can become dangerous,” Sloman and Fernbach observe. The two have performed their own version of the toilet experiment, substituting public policy for household gadgets. In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system? Or merit-based pay for teachers? Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals. Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. Most people at this point ran into trouble. Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently.

Sloman and Fernbach see in this result a little candle for a dark world. If we—or our friends or the pundits on CNN—spent less time pontificating and more trying to work through the implications of policy proposals, we’d realize how clueless we are and moderate our views. This, they write, “may be the only form of thinking that will shatter the illusion of explanatory depth and change people’s attitudes.”

One way to look at science is as a system that corrects for people’s natural inclinations. In a well-run laboratory, there’s no room for myside bias; the results have to be reproducible in other laboratories, by researchers who have no motive to confirm them. And this, it could be argued, is why the system has proved so successful. At any given moment, a field may be dominated by squabbles, but, in the end, the methodology prevails. Science moves forward, even as we remain stuck in place.

In “Denying to the Grave: Why We Ignore the Facts That Will Save Us” (Oxford), Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist, probe the gap between what science tells us and what we tell ourselves. Their concern is with those persistent beliefs which are not just demonstrably false but also potentially deadly, like the conviction that vaccines are hazardous. Of course, what’s hazardous is not being vaccinated; that’s why vaccines were created in the first place. “Immunization is one of the triumphs of modern medicine,” the Gormans note. But no matter how many scientific studies conclude that vaccines are safe, and that there’s no link between immunizations and autism, anti-vaxxers remain unmoved. (They can now count on their side—sort of—Donald Trump, who has said that, although he and his wife had their son, Barron, vaccinated, they refused to do so on the timetable recommended by pediatricians.)

The Gormans, too, argue that ways of thinking that now seem self-destructive must at some point have been adaptive. And they, too, dedicate many pages to confirmation bias, which, they claim, has a physiological component. They cite research suggesting that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs. “It feels good to ‘stick to our guns’ even if we are wrong,” they observe.

The Gormans don’t just want to catalogue the ways we go wrong; they want to correct for them. There must be some way, they maintain, to convince people that vaccines are good for kids, and handguns are dangerous. (Another widespread but statistically insupportable belief they’d like to discredit is that owning a gun makes you safer.) But here they encounter the very problems they have enumerated. Providing people with accurate information doesn’t seem to help; they simply discount it. Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science. “The challenge that remains,” they write toward the end of their book, “is to figure out how to address the tendencies that lead to false scientific belief.”

“The Enigma of Reason,” “The Knowledge Illusion,” and “Denying to the Grave” were all written before the November election. And yet they anticipate Kellyanne Conway and the rise of “alternative facts.” These days, it can feel as if the entire country has been given over to a vast psychological experiment being run either by no one or by Steve Bannon. Rational agents would be able to think their way to a solution. But, on this matter, the literature is not reassuring. 


Ainsi Parlait/Thus Spoke/Así Dijo Wissai

canngon.blogspot.com

Trump's Attack on the Media

Twenty years ago, in what now seems an Edenic time, I wrote a book called Breaking the News. The Atlantic ran a cover-story excerpt called “Why Americans Hate the Media.” The book’s main argument was that when reporters presented an overly conflict-centered, tactics-minded, “horse race”-dominant picture of public life, they hurt the news business, and they hurt the function of democracy as well. If the press served up public life as just another version of reality TV, but with less-interestingly scripted plot lines and less-sexy-looking participants, then the public would naturally turn away from this less enticing entertainment and go for the real thing. Back then, it was quaintly possible to think of “news” and “entertainment” as separable realms.

We’re now a million miles down the news-as-entertainment road. Instinctively by Trump, perhaps strategically by Bannon and others, the Trump moment has promoted the idea that there are no facts, no reality, no authorities, no actual truth. There’s only us and them. Donald Trump's caudillo skill as a performer is being the “I” who can be the voice of the “us.” He’s simply better at that than the other side is. I expect that if any reporters with experience in 1930s Italy were still around, they’d be writing about parallels with Il Duce. It is no coincidence that reporters who have dealt with state-news systems in autocratic Russia (like David Remnick or Masha Gessen) or China (where I have lived) are much more concerned by Trump than amused.

I won’t draw the comparison to reporters who were in Germany in the 1930s; that is heavy-handed, and a stretch. But I’ll close with part of an interview with Hannah Arendt, who was one of the great interpreters of that era. This is from an interview published in 1978 in the New York Review of Books, freshly relevant now:

The moment we no longer have a free press, anything can happen. What makes it possible for a totalitarian or any other dictatorship to rule is that people are not informed; how can you have an opinion if you are not informed? If everybody always lies to you, the consequence is not that you believe the lies, but rather that nobody believes anything any longer. This is because lies, by their very nature, have to be changed, and a lying government has constantly to rewrite its own history. On the receiving end you get not only one lie—a lie which you could go on for the rest of your days—but you get a great number of lies, depending on how the political wind blows.

And a people that no longer can believe anything cannot make up its mind. It is deprived not only of its capacity to act but also of its capacity to think and to judge. And with such a people you can then do what you please.

ABOUT THE AUTHOR

  • James Fallows
    JAMES FALLOWS is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. He has reported extensively from outside the United States and once worked as President Carter's chief speechwriter. His latest book is China Airborne.

The Russification of America by Roger Cohen in the NYT 2/21/2017

MUNICH — This Munich Security Conference was different. A Frenchman defended NATO against the American president. The Russian foreign minister was here, but the phantom American secretary of state was not. An ex-Swedish prime minister had to respond to the “last-night-in-Sweden affair” — an ominous incident in a placid Scandinavian state dreamed up by Donald Trump in his refugee delirium.
Surreal hardly begins to describe the proceedings at this annual gathering, the Davos of foreign policy. This is what happens when the United States is all over the place. Allies get nervous; they don’t know what to believe. “Trump’s uncontrolled communication is unsettling the world,” John Kasich, the Republican governor of Ohio, told me. That is an understatement. The Trump doctrine is chaos.
Vice President Mike Pence came, communicated and exited without taking questions. He said the United States would be “unwavering” in its commitment to NATO, whose glories he extolled. (He never mentioned the European Union, whose fragmentation Trump encourages.)
If a question had been allowed, it might have been: “Mr. Pence, you defend NATO but your boss says it’s obsolete. So which is it?”
To which the answer could well have been: “This is an administration that says everything and the contrary of everything. I advise you to get used to it — and pay up.”
But getting used to an American president who responds through Twitter to the last guy in the room or what he’s just seen on TV, has no notion of or interest in European history, and has turned America’s word into junk, is not easy. Europeans are reeling.
Jean-Marc Ayrault, the French foreign minister, insisted that, “We certainly cannot say that NATO is obsolete.” (For a long time the French kind of wished it was.) Wolfgang Ischinger, the chairman of the Munich conference, told Deutsche Welle that if Trump continues to advocate against the European Union it would amount to a “nonmilitary declaration of war.” Those are extraordinary words from a distinguished former German ambassador about the American president.
For me, the most troubling thing was finding myself unsure who was more credible — Pence or Sergei Lavrov, the Russian foreign minister. The Russification of America under Trump has proceeded apace. Vladimir Putin’s macho authoritarianism, disdain for the press, and mockery of the truth have installed themselves on the Potomac.
Putin is only the latest exponent of what John le Carré called “the classic, timeless, all-Russian, barefaced, whopping lie” and what Joseph Conrad before him called Russian officialdom’s “almost sublime disdain for the truth.”
The Russian system under Putin is a false democracy based on a Potemkin village of props — political parties, media, judiciary — that are the fig leaf covering repression or elimination of opponents. Russia runs on lies. It’s alternative-fact central (you know, there are no Russian troops in Ukraine). But what happens when the United States begins to be infected with Russian disease?
Pence’s speech may not have been precisely a barefaced whopping lie, but it certainly showed barefaced whopping disdain for the intelligence of the audience (you know, nothing has changed with Trump, ha-ha.) By comparison, Lavrov was blunt. He announced the dawn of the “post-West world order.” That became a theme. Mohammad Javad Zarif, the Iranian foreign minister, announced the “post-Western global order.”
I wonder what that means — perhaps a world of lies, repression, unreason and violence. It advances as America offers only incoherence. To counter the drift, what is needed? A functioning American State Department would be a start.
Right now, Rex Tillerson, the secretary of state, cuts a lonely figure, his reasonable choice of deputy nixed by Trump, his authority (if any) unclear. America today has no foreign policy. It’s veering between empty reaffirmations of old bonds (Pence) and the scattershot anti-Muslim, Sweden-syndrome, anti-trade, bellicose, what’s-in-it-for-me mercantilism of Trump.
Month two of this presidency needs to produce a capacity to speak with one voice. It was interesting to see John Kelly, the secretary of Homeland Security, talk about a coming revised travel ban order for seven mainly Muslim countries and say that “this time” he’d be able to work on the rollout plan. Rough translation: last time he was cut out of the process and it was a real mess. Almost everything has been.
I’m skeptical of Trump ever running a disciplined administration. His feelings about Europe are already clear and won’t change. The European Union needs to step into the moral void by standing unequivocally for the values that must define the West: truth, facts, reason, science, tolerance, freedom, democracy and the rule of law. For now it’s unclear if the Trump administration is friend or foe in that fight.
Carl Bildt, the former Swedish prime minister, joked to me that the most terrifying aspect of Trump’s “incident” in Sweden was “the suppression of it by all the major Swedish newspapers, and even Swedish citizens.” We laughed. The unfunny moment will come when Trump lashes out based on nothing but fervid imaginings and the “post-West” order stumbles from confusion into conflagration.