Archive for the ‘essay’ Category


The rise and fall of Default Man

November 10, 2014

How did the straight, white, middle-class Default Man take control of our society – and how can he be dethroned?

 

By Grayson Perry, a Turner Prize-winning artist. In 2012, his series All In The Best Possible Taste was broadcast on Channel 4, and in 2013 he delivered the BBC’s Reith Lectures. He guest-edited the New Statesman in October 2014.

This article was first published in the New Statesman in October 2014.

 


Attack of the clones: Default Man is so entrenched in society that he is “like a Death Star hiding behind the moon”. Artwork by Grayson Perry

 

Paddle your canoe up the River Thames and you will come round the bend and see a forest of huge totems jutting into the sky. Great shiny monoliths in various phallic shapes, they are the wondrous cultural artefacts of a remarkable tribe. We all know someone from this powerful tribe but we very rarely, if ever, ascribe their power to the fact that they have a particular tribal identity.

I think this tribe, a small minority of our native population, needs closer examination. In the UK, its members probably make up about 10 per cent of the population (see infographic below); globally, probably less than 1 per cent. In a phrase used more often in association with Operation Yewtree, they are among us and hide in plain sight.

They dominate the upper echelons of our society, imposing, unconsciously or otherwise, their values and preferences on the rest of the population. With their colourful textile phalluses hanging round their necks, they make up an overwhelming majority in government, in boardrooms and also in the media.

They are, of course, white, middle-class, heterosexual men, usually middle-aged. And every component of that description has historically played a part in making this tribe a group that punches far, far above its weight. I have struggled to find a name for this identity that will trip off the tongue, or that doesn’t clutter the page with unpronounceable acronyms such as WMCMAHM. “The White Blob” was a strong contender but in the end I opted to call him Default Man. I like the word “default”, for not only does it mean “the result of not making an active choice”, but two of its synonyms are “failure to pay” and “evasion”, which seems incredibly appropriate, considering the group I wish to talk about.

Today, in politically correct 21st-century Britain, you might think things would have changed but somehow the Great White Male has thrived and continues to colonise the high-status, high-earning, high-power roles (93 per cent of executive directors in the UK are white men; 77 per cent of parliament is male). The Great White Male’s combination of good education, manners, charm, confidence and sexual attractiveness (or “money”, as I like to call it) means he has a strong grip on the keys to power. Of course, the main reason he has those qualities in the first place is what he is, not what he has achieved. John Scalzi, in his blog Whatever, thought that being a straight white male was like playing the computer game called Life with the difficulty setting on “Easy”. If you are a Default Man you look like power.

I must confess that I qualify in many ways to be a Default Man myself but I feel that by coming from a working-class background and being an artist and a transvestite, I have enough cultural distance from the towers of power. I have space to turn round and get a fairly good look at the edifice.

In the course of making my documentary series about identity, Who Are You?, for Channel 4, the identity I found hardest to talk about, the most elusive, was Default Man’s. Somehow, his world-view, his take on society, now so overlaps with the dominant narrative that it is like a Death Star hiding behind the moon. We cannot unpick his thoughts and feelings from the “proper, right-thinking” attitudes of our society. It is like in the past, when people who spoke in cut-glass, RP, BBC tones would insist they did not have an accent, only northerners and poor people had one of those. We live and breathe in a Default Male world: no wonder he succeeds, for much of our society operates on his terms.

Chris Huhne (60, Westminster, PPE Magdalen, self-destructively heterosexual), the Default Man we chose to interview for our series, pooh-poohed any suggestion when asked if he benefited from membership or if he represented this group. Lone Default Man will never admit to, or be fully aware of, the tribal advantages of his identity. They are, naturally, full subscribers to that glorious capitalist project, they are individuals!

 

This adherence to being individuals is the nub of the matter. Being “individual” means that if they achieve something good, it is down to their own efforts. They got the job because they are brilliant, not because they are a Default Man, and they are also presumed more competent by other Default Men. If they do something bad it is also down to the individual and not to do with their gender, race or class. If a Default Man commits a crime it is not because fraud or sexual harassment, say, are endemic in his tribe (coughs), it is because he is a wrong ’un. If a Default Man gets emotional it is because he is a “passionate” individual, whereas if he were a woman it would often be blamed on her sex.

When we talk of identity, we often think of groups such as black Muslim lesbians in wheelchairs. This is because identity only seems to become an issue when it is challenged or under threat. Our classic Default Man is rarely under existential threat; consequently, his identity remains unexamined. It ambles along blithely, never having to stand up for its rights or to defend its homeland.

When talking about identity groups, the word “community” often crops up. The working class, gay people, black people or Muslims are always represented by a “community leader”. We rarely, if ever, hear of the white middle-class community. “Communities” are defined in the eye of Default Man. Community seems to be a euphemism for the vulnerable lower orders. Community is “other”. Communities usually seem to be embattled, separate from society. “Society” is what Default Man belongs to.

In news stories such as the alleged “Trojan Horse” plot in Birmingham schools and the recent child-abuse scandal in Rotherham, the central involvement of an ethnic or faith “community” skews the attitudes of police, social services and the media. The Muslim or Pakistani heritage of those accused becomes the focus. I’m not saying that faith and ethnic groups don’t have their particular problems but the recipe for such trouble is made up of more than one spicy, foreign ingredient. I would say it involves more than a few handfuls of common-or-garden education/class issues, poor mental health and, of course, the essential ingredient in nearly all nasty or violent problems, men. Yeah, men – bit like them Default Men but without suits on.

In her essay “Visual Pleasure and Narrative Cinema”, published in 1975, Laura Mulvey coined the term “the male gaze”. She was writing about how the gaze of the movie camera reflected the heterosexual male viewpoint of the directors (a viewpoint very much still with us, considering that only 9 per cent of the top 250 Hollywood films in 2012 were directed by women and only 2 per cent of the cinematographers were female).

The Default Male gaze does not just dominate cinema, it looks down on society like the eye on Sauron’s tower in The Lord of the Rings. Every other identity group is “othered” by it. It is the gaze of the expensively nondescript corporate leader watching consumers adorn themselves with his company’s products the better to get his attention.

Default Man feels he is the reference point from which all other values and cultures are judged. Default Man is the zero longitude of identities.

He has forged a society very much in his own image, to the point where now much of what other groups think and feel is the same. They take on the attitudes of Default Man because they are the attitudes of our elders, our education, our government, our media. If Default Men approve of something it must be good, and if they disapprove it must be bad, so people end up hating themselves, because their internalised Default Man is berating them for being female, gay, black, silly or wild.

I often hear women approvingly describe themselves or other women as feisty. Feisty, I feel, has sexist implications, as if standing up for yourself was exceptional in a woman. It sounds like a word that a raffish Lothario would use about a difficult conquest.

I once gave a talk on kinky sex and during the questions afterwards a gay woman floated an interesting thought: “Is the legalising of gay marriage an attempt to neutralise the otherness of homosexuals?” she asked. Was the subversive alternative being neutered by allowing gays to marry and ape a hetero lifestyle? Many gay people might have enjoyed their dangerous outsider status. Had Default Man implanted a desire to be just like him?

Is the fact that we think like Default Man the reason why a black female Doctor Who has not happened, that it might seem “wrong” or clunky? In my experience, when I go to the doctor I am more likely to see a non-white woman than a Default Man.

It is difficult to tweezer out the effect of Default Man on our culture, so ingrained is it after centuries of their rules. A friend was once on a flight from Egypt. As it came in to land at Heathrow he looked down at the rows of mock-Tudor stockbroker-belt houses in west London. Pointing them out, he said to the Egyptian man sitting next to him: “Oh well, back to boring old England.” The Egyptian replied, “Ah, but to me this is very exotic.” And he was right. To much of the world the Default Englishman is a funny foreign folk icon, with his bowler hat, his Savile Row suit and Hugh Grant accent, living like Reggie Perrin in one of those polite suburban semis. All the same, his tribal costume and rituals have probably clothed and informed the global power elite more than any other culture. Leaders wear his clothes, talk his language and subscribe to some version of his model of how society “should be”.

When I was at art college in the late Seventies/early Eighties, one of the slogans the feminists used was: “Objectivity is Male Subjectivity.” This brilliantly encapsulates how male power nestles in our very language, exerting influence at the most fundamental level. Men, especially Default Men, have put forward their biased, highly emotional views as somehow “rational”, more considered, more “calm down, dear”. Women and “exotic” minorities are framed as “passionate” or “emotional” as if they, the Default Men, had this unique ability to somehow look round the side of that most interior lens, the lens that is always distorted by our feelings. Default Man somehow had a dispassionate, empirical, objective vision of the world as a birthright, and everyone else was at the mercy of turbulent, uncontrolled feelings. That, of course, explained why the “others” often held views that were at such odds with their supposedly cool, analytic vision of the world.

Recently, footage of the UN spokesman Chris Gunness breaking down in tears as he spoke of the horrors occurring in Gaza went viral. It was newsworthy because reporters and such spokespeople are supposed to be dispassionate and impartial. To show such feelings was to be “unprofessional”. And lo! The inherited mental health issues of Default Man are cast as a necessity for serious employment.

I think Default Man should be made aware of the costs and increasing obsolescence of this trait, celebrated as “a stiff upper lip”. This habit of denying, recasting or suppressing emotion may give him the veneer of “professionalism” but, as David Hume put it: “Reason is a slave of the passions.” To be unaware of or unwilling to examine feelings means those feelings have free rein to influence behaviour unconsciously. Unchecked, they can motivate Default Man covertly, unacknowledged, often wreaking havoc. Even if rooted in long-past events in the deep unconscious, these emotions still fester, churning in the dark at the bottom of the well. Who knows what unconscious, screwed-up “personal journeys” are being played out on the nation by emotionally illiterate Default Men?

Being male and middle class and being from a generation that still valued the stiff upper lip means our Default Man is an ideal candidate for low emotional awareness. He sits in a gender/class/age nexus marked “Unexploded Emotional Time Bomb”.

These people have been in charge of our world for a long time.

Things may be changing.

****

Women are often stereotyped as the emotional ones, and men as rational. But, after the 2008 crash, the picture looked different, as Hanna Rosin wrote in an article in the Atlantic titled “The End of Men”:

Researchers have started looking into the relationship between testosterone and excessive risk, and wondering if groups of men, in some basic hormonal way, spur each other to make reckless decisions. The picture emerging is a mirror image of the traditional gender map: men and markets on the side of the irrational and overemotional, and women on the side of the cool and level-headed.

Over the centuries, empirical, clear thinking has become branded with the image of Default Men. They were the ones granted the opportunity, the education, the leisure, the power to put their thoughts out into the world. In people’s minds, what do professors look like? What do judges look like? What do leaders look like? The very aesthetic of seriousness has been monopolised by Default Man. Practically every person on the globe who wants to be taken seriously in politics, business and the media dresses up in some way like a Default Man, in a grey, western, two-piece business suit. Not for nothing is it referred to as “power dressing”. We’ve all seen those photo ops of world leaders: colour and pattern shriek out as anachronistic. Consequently, many women have adopted this armour of the unremarkable. Angela Merkel, the most powerful woman in the world, wears a predictable unfussy, feminised version of the male look. Hillary Clinton has adopted a similar style. Some businesswomen describe this need to tone down their feminine appearance as “taking on the third gender”.

Peter Jones on Dragons’ Den was once referred to as “eccentric” for wearing brightly coloured stripy socks. So rigid is the Default Man look that men’s suit fashions pivot on tiny changes of detail at a glacial pace. US politicians wear such a narrow version of the Default Man look that you rarely see one wearing a tie that is not plain or striped.

 

Suits you, sir: Grayson Perry as Default Man. Photo: Kalpesh Lathigra/New Statesman

 

One tactic that men use to disguise their subjectively restricted clothing choices is the justification of spurious function. As if they need a watch that splits lap times and works 300 feet underwater, or a Himalayan mountaineer’s jacket for a walk in the park. The rufty-tufty army/hunter camouflage pattern is now to boys as pink is to girls. Curiously, I think the real function of the sober business suit is not to look smart but as camouflage. A person in a grey suit is invisible, in the way burglars often wear hi-vis jackets to pass as unremarkable “workmen”. The business suit is the uniform of those who do the looking, the appraising. It rebuffs comment by its sheer ubiquity. Many office workers loathe dress-down Fridays because they can no longer hide behind a suit. They might have to expose something of their messy selves through their “casual” clothes. Modern, overprofessionalised politicians, having spent too long in the besuited tribal compound, find casual dress very difficult to get right convincingly. David Cameron, while ruining Converse basketball shoes for the rest of us, never seemed to me as if he belonged in a pair.

When I am out and about in an eye-catching frock, men often remark to me, “Oh, I wish I could dress like you and did not have to wear a boring suit.” Have to! The male role is heavily policed from birth, by parents, peers and bosses. Politicians in particular are harshly kept in line by a media that seems to uphold more bizarrely rigid standards of conformity than those held by any citizen. Each component of the Default Male role – his gender, his class, his age and his sexuality – confines him to an ever narrower set of behaviours, until riding a bicycle or growing a beard, having messy hair or enjoying a pint are seen as ker-azy eccentricity. The fashionable members’ club Shoreditch House, the kind of place where “creatives” with two iPhones and three bicycles hang out, has a “No Suits” rule. How much of this is a pseudo-rebellious pose and how much is in recognition of the pernicious effect of the overgrown schoolboy’s uniform, I do not know.

I dwell on the suit because I feel it exemplifies how the upholders of Default Male values hide in plain sight. Imagine if, by democratic decree, the business suit was banned, like certain items of Islamic dress have been banned in some countries. Default Men would flounder and complain that they were not being treated with “respect”.

The most pervasive aspect of the Default Man identity is that it masquerades very efficiently as “normal” – and “normal”, along with “natural”, is a dangerous word, often at the root of hateful prejudice. As Sherrie Bourg Carter, author of High-Octane Women, writes:

Women in today’s workforce . . . are experiencing a much more camouflaged foe – second-generation gender biases . . . “work cultures and practices that appear neutral and natural on their face”, yet they reflect masculine values and life situations of men.

Personally, working in the arts, I do not often encounter Default Man en masse, but when I do it is a shock. I occasionally get invited to formal dinners in the City of London and on arrival, I am met, in my lurid cocktail dress, with a sea of dinner jackets; perhaps harshly, my expectations of a satisfying conversation drop. I feel rude mentioning the black-clad elephant in the room. I sense that I am the anthropologist allowed in to the tribal ritual.

Of course, this weird minority, these curiously dominant white males, are anything but normal. “Normal,” as Carl Jung said, “is the ideal aim for the unsuccessful.” They like to keep their abnormal power low-key: the higher the power, the duller the suit and tie, a Mercedes rather than a Rolls, just another old man chatting casually to prime ministers at the wedding of a tabloid editor.

Revolution is happening. I am loath to use the R word because bearded young men usually characterise it as sudden and violent. But that is just another unhelpful cliché. I feel real revolutions happen thoughtfully in peacetime. A move away from the dominance of Default Man is happening, but way too slowly. Such changes in society seem to happen at a pace set by incremental shifts in the animal spirits of the population. I have heard many of the “rational” (ie, male) arguments against quotas and positive discrimination but I feel it is a necessary fudge to enable just change to happen in the foreseeable future. At the present rate of change it will take more than a hundred years before the UK parliament is 50 per cent female.

The outcry against positive discrimination is the wail of someone who is having their privilege taken away. For talented black, female and working-class people to take their just place in the limited seats of power, some of those Default Men are going to have to give up their seats.

Perhaps Default Man needs to step down from some of his most celebrated roles. I’d happily watch a gay black James Bond and an all-female Top Gear, QI or Have I Got News for You. Jeremy Paxman should have been replaced by a woman on Newsnight. More importantly, we need a quota of MPs who (shock) have not been to university but have worked on the shop floor of key industries; have had life experiences that reflect their constituents’; who actually represent the country rather than just a narrow idea of what a politician looks like. The ridiculousness of objections to quotas would become clear if you were to suggest that, instead of calling it affirmative action, we adopted “Proportionate Default Man Quotas” for government and business. We are wasting talent. Women make up a majority of graduates in such relevant fields as law.

Default Man seems to be the embodiment of George Bernard Shaw’s unreasonable man: “The reasonable man adapts himself to the world; the unreasonable one persists in trying to make the world adapt to himself. Therefore all progress depends on the unreasonable man.”

Default Man’s days may be numbered; a lot of his habits are seen at best as old-fashioned or quaint and at worst as redundant, dangerous or criminal. He carries a raft of unhelpful habits and attitudes gifted to him from history – adrenalin addiction, a need for certainty, snobbery, emotional constipation and an overdeveloped sense of entitlement – which have often proved disastrous for society and can also stop poor Default Man from leading a fulfilling life.

Earlier this year, at the Being A Man festival at the Southbank Centre in London, I gave a talk on masculinity called: “Men, Sit Down for your Rights!”. A jokey title, yes, but one making a serious point: that perhaps, if men were to loosen their grip on power, there might be some benefits for them. The straitjacket of the Default Man identity is not necessarily one happily donned by all members of the tribe: many struggle with the bad fit of being leader, provider, status hunter, sexual predator, respectable and dignified symbol of straight achievement. Maybe the “invisible weightless backpack” that the US feminist Peggy McIntosh uses to describe white privilege, full of “special provisions, maps, passports, codebooks, visas, clothes, tools and blank checks”, does weigh rather a lot after all.


An Authoritarian Future

January 7, 2014

Why The West Slowly Abandons Its Civil Liberties

By Werner de Gruijter of Global Research

 

 


 

Politicians on both sides of the Atlantic who construct an image of toughness – tough on crime, on terrorism, on humanist-inspired idealism and so on – are tapping into a sensitive spot that blocks critical thought among the public. Obama’s harsh reaction to Edward Snowden’s revelations is just another example. Somehow, it seems, “We, the people…” have lost track of ourselves. Here are four main reasons why we abandon our once hard-fought civil rights.

Many countries in the West, such as Britain, France, Spain, the US and the Netherlands, have in recent years experienced a dramatic increase in technological surveillance and a steady decline in parliamentary and judicial control over state police and secret services.

Issues such as the ban on torture, detention without charge, privacy and freedom of speech were reframed in public debate in favour of state control. And almost everybody accepted it. To be fair, there was some opposition – but it lacked intensity. Why is this happening?

To give an example: under former British prime minister Tony Blair, 45 criminal laws were approved, creating 3,000 new criminal offences. The British writer John Kampfner argues that more new criminal offences were created in his country in the past ten years than in the hundred years before. All this was legitimized by the idea that a ‘terrorist’ virus had attacked Western civilization. Of course, there is some truth in that – but the risks were grossly exaggerated. Still, we fearfully went along with the proposed measures.

This cultural shift towards a perhaps more authoritarian future for the West is no accident of nature. It is man-made. Given the opportunity, such top-down shifts happen only if politicians, corporations, media pundits and other cultural icons are able to find the right symbols and techniques to get a new message across.

But besides these techniques, there is something else that stimulates our apathy in this respect. The famous American psychologist Abraham Maslow pointed to the importance of leisure time for our personal well-being as well as for the well-being of the community as a whole – leisure creates, so to speak, the possibility of making well-informed decisions. Currently our leisure time is under assault. Thirty years of income stagnation in the midst of rising prices – people have to struggle to earn a living – means that most of us have less time for critical thought.

It has been made even harder to reflect on important issues, since politicians and opinion leaders use marketing tools in order to seduce. Remember that soon after the 2008 banking bailout the discussion was reframed in such a way that government spending, instead of the unregulated financial sector itself, was the root cause of all ‘evil’ – this message was repeated like a commercial, over and over again. This technique of repetition effectively neutralizes critical thinking. Hence, the Nazi propagandist Joseph Goebbels was on to something when he famously stated:

“If you tell a lie big enough and keep repeating it, people will eventually come to believe it.”

Long after Goebbels died, psychologists discovered experimentally that human beings have a natural tendency to respond more receptively to a message, of whatever kind, the more they are exposed to it. They call this the “mere-exposure effect”. We should ask ourselves whether this tendency is healthy for our general welfare.

Furthermore, psychologists discovered that our ability to think critically is severely limited when we act under stress. Frightened people tend to perceive reality through a prism of simple right-and-wrong answers, leaving the complexities aside. Scared, we are easily fooled. Politicians and corporations can’t resist the temptation to manipulate this animal instinct – as when we started a war without having been shown any serious proof of its legitimacy.

One would expect the mainstream media, in their role as guard dog, to attack those politicians who create black-and-white polemics. Currently, however, most (privately owned) media echo the voice of corporations, which these days doesn’t differ much in substance from that of the government. As a result, alternative and more nuanced voices are under-represented in cultural discourse, which again makes it harder to reach well-informed decisions.

And when considering the information that is filtered through to a broad audience, one also notes the slow but steady disappearance of the dividing line between news media and entertainment. The American academic Daniel Hallin argues that the average sound bite politicians are given in media appearances shrank from forty seconds in the 1960s to ten seconds in 1988. Hallin’s crucial point is that the biggest victim of this still ongoing process is the careful scrutiny of social problems. The result is so-called ‘horse race’ news – news about politics presented as a game of “who’s the wittiest”, in which politicians try to be popular instead of reasonable. The blur of catchy one-liners reaching the audience creates a further alienation from reality.

Taken together, the assault on leisure, the repetition of information, the politics of fear and the transformation of our media outlets from guard dogs into lap dogs create a situation in which our spirit for the common good slowly dissolves into an ocean of noise, distraction and misinformation.

Meanwhile, the social environment that politicians, corporations and media gurus are constructing produces anxieties and illusions in order to generate profit or political gain. Together these social forces act as a gravitational pull towards government and corporate empowerment. That is to say, they drain the strength of the people to participate in the maintenance of a mentally healthy, meaningful democratic environment.

Thomas Jefferson once argued that a government should fear the power of the people. In that respect, the general apathy with which the public has responded to Snowden’s revelations is a cynical demonstration of our times. Yet, however little, a message this confronting does still stir society a tiny bit. We are not completely brain-dead – and there is some hope in that.

Probably the best question contemporary Westerners can ask themselves is: will today’s power structure be able to obscure these clear violations of civil rights, or is the message too loud to ignore?

Or, to put it more bluntly: will there be a transition to a meaningful democracy in the West, or to an advanced form of authoritarianism? What’s your point of view?

 

 


The problem with beliefs – Jim Walker

December 9, 2013
Belief, Justice, to think, to know.


 

 

Introduction

People have slaughtered each other in wars, inquisitions, and political actions for centuries and still kill each other over beliefs in religions, political ideologies, and philosophies. These belief-systems, when stated as propositions, may appear mystical and genuine to the naive, but when confronted with a testable basis of reason and experiment, they fail miserably. I maintain that faiths (types of beliefs) create more social problems than they solve and that the potential dangers from them could threaten the future of humankind.

Throughout history, humankind has paid reverence to beliefs and mystical thinking. Organized religion has played the most significant role in the support and propagation of beliefs and faith. This has resulted in an acceptance of beliefs in general. However much one may reject religion, religious support of supernatural events gives credence to superstitions in general and to faith (belief without evidence), mysticism, and miracles. Most scientists, politicians, philosophers, and even atheists support the notion that some forms of belief provide a valuable means to establish “truth” as long as they contain the backing of data and facts. Belief has long been a socially acceptable form of thinking in science as well as religion. Indeed, once a proposition turns to belief, it automatically undermines opposition to itself. Dostoyevsky warned us that those who reject religion “will end by drenching the earth in blood.” But this represents a belief in itself. Our history has shown that the bloodletting has occurred mostly as a result of religions or other belief-systems, not from the people who reject them.

However, does rational thinking require the adherence to beliefs at all? Does productive science, ethics, or a satisfied life require any attachment to a belief of any kind? Can we predict future events, act on data, theories, and facts without resorting to the ownership of belief? This paper attempts to show that, indeed, one need not own beliefs of any kind to establish scientific facts, observe and enjoy nature, or live a productive, moral, and useful life.

Relative to the history of life, human languages have existed on the earth for only a brief period, a flash of an instant compared to the millions of years of evolution. (Estimates for the beginnings of language range from 40,000 to 200,000 years ago.) It should come as no surprise that language takes time to develop into a useful means of communication. As in all information systems, errors can easily creep in, especially at the beginning of development. Nor should it surprise us that our language and thought processes may contain errors, delusions and beliefs. It would behoove us to find these errors, attempt to deal with them, and become aware of their dangers.

The ability to predict the future successfully provides humans with the means to survive. No other animal species can think, remember, imagine, and forecast to the degree that Homo sapiens can. To replace our thoughts with intransigent beliefs belies the very nature of the creative thinking process that keeps us alive.

Before I go on, I’d like to apologize for the sloppy writing style of this article. I intend this as a work in progress as this reflects my thoughts about the subject of belief along with what science has discovered about it. As new information arrives, I either make changes or add information on the fly, so some things may seem out of order, anachronistic or repeated. I have no expertise in neuroscience or psychology and my main source for disowning beliefs comes from my own experience, thus I use the word “I” a lot, something that authors of scientific papers should never use. Sorry.

I learned how to disown beliefs even before I had any scientific understanding of the subject. Many people do not understand how a person can do this, so I hope to explain that one can indeed live without beliefs, or at least give readers a better understanding of the subject. I also hope to explain what belief means, what it doesn’t mean, and the problems beliefs can cause. If my experience applies only to me and no one else, then I probably have an abnormal brain. Fortunately, the scientific information that has arrived has tended to support my case and diminish the argument against it. Nor do I intend to proselytize or try to convince you that you should abandon your beliefs. Perhaps some people can’t disown beliefs, even in principle, for some reason that I have no awareness of.

 


 

Origins of belief

 

“The closest relative of the chimp is the human. Not orangs, but people. Us. Chimps and humans are nearer kin than are chimps and gorillas or any other kinds of apes not of the same species.”
-Carl Sagan

Very little evidence has yet appeared about how belief arose in humans. As social animals, we have probably always held beliefs to some degree. Studies of our closest DNA relatives, the apes, suggest that primate social animals require both followers and leaders. The followers must assume the codes of conduct of their leaders if they wish to live without social conflict. Since there are always more followers than leaders, the trait of accepting leaders without challenge, together with the introduction of language, may have led human primates towards the expression of beliefs.

As one possibility, perhaps the human animal believes as an inherent result of expressed genes (phenotypes). Interestingly, some animals have in their DNA a predisposition for imprinted programming. [1] One extreme example of maturation imprinting occurs with newborn greylag geese, which regard the first suitable animal they see as their parent and follow it around. In nature goslings usually see their natural mother first, but if humankind interrupts the natural process and a newborn goose first sees a human, it comes to regard itself, in some sense, as human, thus compromising its natural life as a goose. Some young animals have a kind of “eidetic” memory; they will believe whatever gets taught to them. Do humans exhibit a similar kind of imprinting while young, as many other animals do? Or do we learn how to believe from our parents, expressed through memetic inheritance? The degree to which humans imprint or learn belief memories, or can control or reduce their beliefs, remains open for further investigation. Most people accept, without question, the religion of their youth. Learning about the mechanism of beliefs at this early stage may help us understand the consequences of impressionable teaching and may lead us to modify the strategy of early learning so as to avoid the debilitating effects of unexamined beliefs.

Some evolutionary biologists think that beliefs require an evolutionary explanation because every known culture through history has had beliefs. And if beliefs have an evolutionary survival advantage, how do they serve that advantage? Of course no one knows for sure, because beliefs do not leave behind fossil evidence. Nevertheless one can still propose a hypothesis, and the best one I’ve heard comes from Richard Dawkins. In his book The God Delusion he explains it on pages 172-179. I’ll give you a brief section from this chapter. Although he writes about religion, it also applies to beliefs in general:

“My specific hypothesis is about children. More than any other species, we survive by the accumulated experience of previous generations, and that experience needs to be passed on to children for their protection and well-being. Theoretically, children might learn from personal experience not to go too near a cliff edge, not to eat untried red berries, not to swim in crocodile-infested waters. But, to say the least, there will be selective advantage to child brains that possess the rule of thumb: believe, without question, whatever your grown-ups tell you. Obey your parents; obey the tribal elders without question. This is a generally valuable rule of thumb to believe.” [Dawkins] (also watch a video of his explanation)

[Humans also communicate these belief rules through spoken or symbolic language. Since other animals do not have the language ability to the degree of humans, that explains why animals do not have religions.]

However, as children grow up, they no longer need simply to obey their parents, because their brains have fully developed and they can think for themselves. Unfortunately, evolution has no way to clean up these belief traits in adulthood, so the beliefs they inherited from their parents remain.

The evolutionary advantage of relying on beliefs while young, although it helps the survival of our species, can also lead to bad consequences later in adult life – though not so severe as to prevent the survival of our species. These bad consequences of beliefs may have led early humans toward violence against members of their own kind. As early Homo sapiens collected beliefs, some must surely have included beliefs about violence, possibly to protect them from other tribes who might harm them, or whom they believed might harm them.

The earliest evidence of human culture, from Paleolithic and Mesolithic societies, shows that humans practiced some form of violence against fellow humans. These violent actions appear similar to the brutality of other primate species (chimpanzees, our closest primate relatives, for example, engage in chimpanzee warfare). Later, the skills of human weaponry increased during the Neolithic period, and archeologists have uncovered evidence of executions and sacrifices. Although no one has direct evidence of the languages spoken in the Neolithic period, violence of this kind no doubt requires communication, so these societies probably had language, along with beliefs to justify their executions and sacrifices.

Many early societies believed in spirits and animism, the belief that animals and inanimate objects possess a spirit. Indeed, the Latin word anima means soul, and the word “spirit” derives from the Latin word for breath. No doubt ignorance about the nature of wind, breath and the movement of animals led them to construct an “explanation” of things in their world. How could they possibly know the difference between beliefs, facts, and evidence? These early societies had hardly anything we would call multiculturalism, and this alone would isolate their belief systems from other belief systems. Imagine, for example, that you lived in a tribe that held strong beliefs and you came across another tribe that held an entirely different set of beliefs. Without an understanding of cultural diversity, or even of the difference between beliefs and facts, how could you not feel threatened by a tribe whose beliefs conflicted with your own?

With language came the contemplation and study of thoughtful systems. Socrates and Plato introduced beliefs in “forms” of things existing independently of their physical examples; these philosophical beliefs treated physical things as superficial representations of an underlying and absolute “reality.” Aristotle carried the concept further but attached these forms to physical objects as “essences.” He posited the existence of a soul and introduced the concept of an unmoved mover (God) to explain the matter that moves through the “heavens.” These ghostly concepts live on today, not only in religion but in our language. Many times we express essence ideas without thinking about them because they exist in the very structure of common communication derived from ancient philosophers. Since no one can see or measure these essences, the only way to comprehend them comes in the form of belief. Sadly, people still accept these essences as “real” based on nothing but faith, without ever investigating whether they exist or not.

Orthodox religionists hinged their “sacred” philosophies upon the shoulders of ancient philosophers. Plotinus reorganized Plato’s work as the basis for Neoplatonism, which lasted for many centuries. Thomas Aquinas became the foremost disseminator of Aristotle’s thought. Aristotelianism and its limited logic still hold the minds of many believers. Today people still believe in spirits inhabiting inanimate objects, in gods, angels, ghosts, and alien UFOs, without ever questioning the reliability of their sources. Belief and faith can overpower a person’s mind to such an extent that, even in the teeth of contrary evidence, he will continue to believe for no other reason than that others around him believe, or that people have believed for centuries.

“Religion. n. A daughter of Hope and Fear, explaining to ignorance the nature of the Unknowable.”
-A. Bierce

 


 

The meaning of belief

To establish a common ground for the general concept of belief, I hold to the common usage of the term from the American Heritage dictionary:

Belief: 1. The mental act, condition, or habit of placing trust or confidence in a person or thing; faith. 2. Mental acceptance or conviction in the truth or actuality of something. 3. Something believed or accepted as true; especially, a particular tenet, or a body of tenets, accepted by a group of persons.

Believe: 1. To accept as true or real. 2. To credit with veracity; have confidence in; trust.

In its simplest form, belief occurs as a mental act, a thinking process in the brain that requires two things: a feeling and a logical statement. To “believe” requires a conscious feeling of truth. To communicate what this feeling refers to requires some form of logical structure such as spoken or written language. Thus a belief requires a thought plus a conscious feeling of “truth” which, according to neurological brain research, stems from the limbic part of the brain (discussed in the mechanism of belief, below). In other words, belief occurs as a thought with a feeling or emotion “attached”: Belief = emotion + logic. Because belief requires emotion, it also represents a psychological state, not simply a mechanical thinking state.

In all cases, I refer to beliefs as occurring in an aware state of consciousness. Beliefs here do not refer to subconscious thoughts, or any mental activity occurring below the threshold of consciousness. Nor do beliefs apply to sleeping and dream states, or to unconscious habits, or instincts. When a person owns a belief, s/he consciously accepts their own belief. The degree of feeling to which one accepts their own beliefs, as valid, can vary from mild acceptance to certain absoluteness. Thus it would prove meaningless to say that a person has beliefs without them knowing it or for them to deny their own beliefs. Obviously, a person who does not believe in something, does not believe in that something; a person who believes in something, does believe in that something. Belief requires conscious acceptance.

 

How belief confuses arguments

In the mildest form of belief, that of acceptance without absoluteness, a speaker or writer can simply replace belief words with more descriptive words to avoid confusion.

Note that in most instances, one can replace the word “believe” with the word “think”. For example:

“I believe it will rain tonight.”

can transpose into:

“I think it will rain tonight.”

Most simple beliefs come from the expression of the experience of external events. From past experience, for example, people believe that dark clouds can produce rain; therefore we attempt to predict the weather by forecasting from past events. However, believing that an event will occur can produce disappointment if the prediction never happens. Making a prediction based on past events does not require believing in the future event; it requires only a good guess as to what may or may not happen. We can eliminate many of these simple beliefs by replacing the word “believe” with the word “think.” The word “think” describes the mental process of predicting, instead of relying on the abstraction of belief, which reflects a hope that may not happen. And if we replaced Aristotelian either-or beliefs with statistical thinking, we would describe probable events instead of believed events.

Belief represents a type of conscious mental thought, a subclass of many kinds of mental activity. Thinking may or may not include beliefs or faiths. Therefore, when I use the word “think” I mean it to represent thought absent of emotional belief.

If “think” won’t work as a substitute for belief, then it should prove easy to find another substitute word. For example:

“Our challenge then, is to believe only evidence claims that are likely.”

can transpose into:

“Our challenge then, is to use only evidence claims that are likely.”

Because belief statements contain logical propositions, one should consider whether emotions and feelings have anything to do with our logic at all. The anecdote about Archimedes running through the streets crying “Eureka!” after discovering the relationship between mass and volume describes his emotion after making his discovery. This could have led him to believe that density equals mass divided by volume, but do feelings and emotions add anything useful to his logical statement? If feelings really do add to our logical structures, then why not add them to our mathematical statements? One could, for example, make up a table of ordinal words to express the intensity of the feelings, such as “eureka!” (for the highest emotion), “good!” (for a lesser emotion), “meh” (an ambivalent emotion), etc. We can then plug our emotions into our logical statement:

 D[eureka!] = M/V

Now we have an attempt to use mathematical statements as beliefs. Belief = emotion + logic.

As you should see from this silly example, the variable of emotion not only does nothing to help the equation; the belief could also vary from person to person. And if the holder of that belief dies, so goes their belief. Archimedes died with his beliefs, but mathematicians today might think of his equation as D[meh] = M/V, yet the truth value still holds even if no one believes it.
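To see that the truth value really is independent of anyone’s feelings, strip the emotion out and run the plain equation with some illustrative numbers (the figures are mine, chosen only as an example; about 19.3 g/cm³ happens to be the density of gold, the metal of Archimedes’ famous crown problem):

D = M/V; with M = 1,930 g and V = 100 cm³, D = 1,930 ÷ 100 = 19.3 g/cm³

Whether the person doing the division feels “eureka!” or “meh”, the result comes out the same.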

The example above looks silly when you attach an emotion to a mathematical statement, but the very same thing happens when you use beliefs in your language. Once you take the emotional aspect of your belief out of your statement, it no longer, by definition, equals a belief. You simply have a propositional statement. “I believe that it will rain” turns into “It will rain.” If that seems too certain an expression, simply describe your uncertainty: “It probably will rain tonight”, or “It may rain tonight.” In a mathematical statement you might include a probability number.
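As a minimal sketch of such a statement (the 0.8 is an illustrative number, not a real forecast):

P(rain tonight) = 0.8 – that is, “I estimate an 80 per cent chance of rain tonight.”

This says everything the belief statement said about the world, with the degree of confidence stated as a number rather than felt as an emotion.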

Belief words can have many meanings. They can range from a guess (“I believe so”) to absolute certainty (“I BELIEVE in God”). So when a person uses a belief word in a sentence, the reader might get an entirely different meaning than intended. It could mean just the opposite of what you mean to your audience. For example, if you say to religious people, “I believe in science,” they might think you mean it as an absolute, in the way they believe in God. It is from this very kind of misinterpretation that a religious person can come to think that science represents a religion. Realize that some religious people quote-mine popular scientific literature just to prove what some scientist believes or has faith in. Carl Sagan, Richard Dawkins, and Daniel Dennett, for example, have all used “belief” in their books and speeches, but with the intention of justifying their belief with evidence and logic. The reader, however, might take it as the strong version of the word. So why use belief at all? To avoid this problem, simply don’t use “belief” and “faith”, and substitute more descriptive words. Of course, if you do hold stronger forms of belief – if you hold supernatural beliefs, for example – then of course you should use the words belief or faith.

Many kinds of concepts occur without the need for belief. People can invent rules, maps, games, social laws, and models without requiring a belief or absolute trust in them. For example, a map may prove useful to get from point A to point B, but to believe that the map equals the territory would produce a falsehood. Humans invented the game of baseball, but it requires no need to believe in the game, or to attach some kind of “truth” to it. People can enjoy baseball, simply for the game itself. Technological societies invent “rules of the road” and construct traffic lights, signs and warnings. We do not take these rules as absolute but realize that they form a system of conduct that allow mass transit to exist. If any confidence results from the use of models and rules, it should come from experience of past events predicted by the models rather than from the emotions connected to beliefs.

 

 

Examples of non-beliefs

Many people misunderstand what constitutes belief and what does not. For many, belief has so infiltrated their minds, that everything perceived or thought incorporates a belief for them, including all of their knowledge and experience. This hierarchical, top-down, approach, in effect, puts such a person entirely within a world of solipsistic reasoning. Why? Because all thoughts describe a belief for them and since beliefs only occur within the mind, every belief refers to the self.

However, beliefs have no bilateral symmetry requirements; although one can believe in knowledge, one can obtain knowledge without owning beliefs; although one certainly accepts their own beliefs, not all things accepted require beliefs.

Consider that if one defined belief to incorporate all forms of thought, then the word belief would become tautological and meaningless, not to mention that knowledge and experience would fall as a subset of belief. Need I remind the reader that words differ not only in their spelling, but in their meanings? The following gives examples of non-beliefs:

Acceptance: Although belief requires some form of acceptance, not all things accepted require belief (beliefs have no bilateral symmetry requirements). Examples: I can accept the premise of a fictional story, but I do not for one moment believe in it. I can accept a scientific hypothesis without believing in it. Computers accept data and produce solutions, but computers have no consciousness, let alone beliefs. Many arguments can take the form of Devil’s Advocate to oppose an argument with which the arguer may not necessarily disagree.

Action: Although many people believe in the actions they perform, one can act without beliefs (beliefs have no bilateral symmetry requirements). Actions can occur out of a desire, a submission to an authority, or by unplanned events or even by mechanical means completely absent of humans. Examples: I can act a part without believing in it. I can act from a set of rules, but I do not need to believe the rules. I might act from an order from the police or government. I may act out of a desire to achieve something. There occurs no action which requires belief.

Agreement: Although belief requires some form of agreement (believers agree that their beliefs have validity), not all agreements represent beliefs (beliefs have no bilateral symmetry requirements). However, for some people (myself included), agreement requires no belief at all. Examples: I might agree that Captain Kirk served aboard the Starship Enterprise, but I hold no beliefs in Star-Trek fiction. I may agree with the rules of football, but I do not need to believe in football in order to understand the game; I may not even like the game! I may agree with any premise, without believing in it.

Knowledge: Knowledge comes from awareness of the world, or understanding gained through experience. Although people may believe in what they know, knowledge has no requirement for belief (beliefs have no bilateral symmetry requirements). Examples: I may have knowledge of a story, poem or song, but I have no need to believe it. I know the rules of many games, but I do not believe in games. I know the mathematics of calculus, but I do not believe in calculus. I have knowledge of information, but I do not believe in information. I have direct knowledge of my existence through sensations, thought, and awareness, but I do not believe I exist: I know I exist (even though I may not know how I exist).

Information: Although many people believe the information they receive, information received does not require belief (again, beliefs have no bilateral symmetry requirements). Examples: the information from books, stories, science, theories, fiction, religion, etc., all represent communicated ideas, but one does not need to believe in any communication in order to utilize it.

 


 

Differences between beliefs and thinking without beliefs

The two charts above represent a visual abstraction of the differences between the path of belief and the path to knowledge. Both paths represent a form of thinking or mental activity. Note that the chart on the left shows a convergence point at the bottom, where simple beliefs and thoughts coexist. At this level they appear virtually the same, with the only difference amounting to semantic designation (“believe” can substitute for “think” and vice versa). However, as each path progresses, they diverge: the path of belief progresses towards intransigence and the path of knowledge leads to factual knowledge. Each progresses as a matter of degree and each forms an independent path. For example, belief requires no external evidence whatsoever (examples: belief in ghosts, gods, astrology, etc.), and the path of knowledge requires no reliance on beliefs (examples: the earth orbits the sun and airplanes fly regardless of whether you believe in them or not). However, the path towards knowledge requires external verification (observation and testing), whereas the path of belief does not. The path towards workable knowledge (facts) must agree with nature if we wish to utilize it. The path of belief requires no agreement with nature at all (although it might coincide with it).

Unfortunately, the usual practice of thinking involves the combination of beliefs with theory and factual knowledge (see the right chart). Most people tend to own beliefs about facts and knowledge, including perhaps the most rational people of all – scientists and philosophers. A hypothesis or a theory may lead a scientist to believe strongly in his or her theories, the verification of test results may lead them to have faith in the results, and an established fact may lead some scientists to hold dogmatically to its verification (even if later evidence contradicts it). Thus even a scientist can attach beliefs to theories, faith to verification, and dogma to facts. Although scientists rarely approach intransigence (though some do), they usually believe in their data and theories, most philosophers believe in their philosophies, and most of them will die with their beliefs. As Max Planck once said, “A scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die and a new generation grows up that is familiar with it.” Fortunately, dogmatic belief does not appear as prevalent in science as it was in the last two centuries. Scientists like Freud, Jung, Velikovsky, and even Einstein held stubborn beliefs bordering on inflexible religious-like thinking, even when presented with evidence that contradicted their beliefs. I suspect that much of the reduced degree of dogma in the scientific community today results from better communication (especially through the internet), a broader understanding of the surrounding sciences, and the humbling realization that some other scientist will call them out on their theories; but dogma still occurs to an unnecessary degree, in my opinion.

If facts about nature come from nature itself, then every scientific fact can stand on its evidence alone, as can the theories that explain those facts. At no time do we need beliefs to understand facts and theories. Nature occurs without human beliefs and so does reliable evidence. And once we understand our facts and theories, we call that knowledge. There simply exists no apparent necessity for attaching beliefs to knowledge.

Think about the following: regardless of how strongly one has attached faith to scientific facts, no matter how religious the disposition of a scientist, there has never appeared a single workable theory or scientific fact that required the concept of a god or a superstitious idea. Not a single workable mathematical equation contains a symbol for a “creator.” There occurs not the slightest evidence for ghosts in our machines or in our bodies. Even the most ardent non-believers can live their lives in complete accord with nature and live as long as the most fanatical believer. The same holds true for non-religious beliefs. In spite of the temporary mental comfort that belief might bring (as drugs do), what purpose can belief serve in the establishment of useful knowledge about the world? Note that when a person dies, so go his or her beliefs; but if that person lived as a scientist and provided the world with a workable piece of knowledge about nature, then only the knowledge remains useful. The beliefs held during the person’s lifetime have no bearing whatsoever on the usefulness of the knowledge that he or she brought into the world.

Now you might argue that the knowledge brought to us by persons no longer living still requires people to believe in it, but on what grounds and to what degree?

“Have you ever noticed…. Anybody going slower than you is an idiot, and anyone going faster than you is a maniac?” –George Carlin

I find it interesting to observe the state of belief in people. They almost always see the problem of fanatical belief above them on the chart, but they never accept the disbelief of those below them. Believers always retain just the right amount of belief, it seems, and they unconsciously settle into a kind of self-centered, subjective dogma. I contend that most of us do not own beliefs of every kind and, indeed, we disbelieve more than we believe. Just as some believers hold fewer beliefs than others, non-believers simply sit at the bottom of the scale. If you can, temporarily, put yourself outside your own beliefs, you can question why you dismiss the beliefs of others, while perhaps understanding why non-believers dismiss yours.

The degree problem goes away once you understand that the amount of belief says nothing about the usefulness or factual nature of knowledge. If you squint your eyes, pray, or through sheer willpower force your belief to strengthen, will that improve your knowledge? (It can certainly produce falsehoods, but how can it improve on knowledge?) Conversely, if you act on your knowledge without belief, will that change the status of your knowledge? If you think knowledge requires some belief between the extremes of strong belief and no belief, then just what degree of belief do you think necessary for the proper understanding of knowledge?

Some people have argued that all knowledge represents a form of belief. Well, it certainly can if you believe that and, no doubt, most people do believe that knowledge describes a belief, but that doesn’t mean it has to. Even Plato, through Socrates, defined knowledge as “justified true belief,” but this only describes what I already mentioned above (combining beliefs with knowledge), and, again, this certainly serves far better than a belief without knowledge (I suppose Plato might call that unjustified false belief). But can knowledge exist without beliefs at all? Yes. And I can give examples.

I could use examples of animals such as insects or reptiles, but someone might object on the grounds that they possess some form of consciousness and beliefs, so I will give an example of non-living entities: autonomous computers. Autonomous drones, for example, can take off, fly and land without a human pilot or even a remote pilot. These aircraft take in information from the world around them through cameras and sensors, process that information, make algorithmic decisions and act on them by navigating, taking photos, or sending lethal bombs to kill enemy targets. To give another example, IBM’s computer called Watson (also autonomous) defeated the best Jeopardy players in the world. Its designers made it capable of understanding human language and knowledge by data mining documents, dictionaries, anthologies, and encyclopedias and deriving a correct answer. These computer systems, in fact, possess some knowledge about the world around them, otherwise they would not have the ability to carry out their tasks. These autonomous computers have no consciousness or emotions, so they cannot possibly have beliefs. Knowledge can indeed exist without beliefs. Humans, too, can act on knowledge even without consciousness. Sleepwalking, or driving a car while holding a conversation, for example, can result in actions driven by subconscious knowledge without the person consciously knowing what has happened.
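To make the point concrete, a toy sketch follows (in Python). The sensor values, thresholds and actions come purely from my imagination, but the structure shows the essence: a controller that senses, consults stored facts, and acts– with no belief state anywhere in the loop.

```python
# A toy autonomous controller: it senses, consults stored knowledge,
# and acts. No step requires anything resembling belief or conviction.
# (All sensor values and thresholds here are invented for illustration.)

KNOWLEDGE = {"safe_altitude_m": 120, "min_battery_pct": 20}  # stored facts

def decide(altitude_m: float, battery_pct: float) -> str:
    """Map sensed state to an action using stored knowledge only."""
    if battery_pct < KNOWLEDGE["min_battery_pct"]:
        return "return_to_base"
    if altitude_m < KNOWLEDGE["safe_altitude_m"]:
        return "climb"
    return "hold_course"

# Simulated sensor readings standing in for cameras and altimeters:
for altitude, battery in [(80, 90), (150, 90), (150, 10)]:
    print(altitude, battery, "->", decide(altitude, battery))
```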

Of course humans do not live like computers, and we grow up with beliefs, perhaps even ingrained into our genes, but I submit that the claim that an intelligent, conscious human cannot understand knowledge without beliefs has no basis. Humans have a unique ability to understand abstractions and even abstractions about abstractions (metacognition); at least some humans have that ability (more on this below). One can understand how a belief can adversely affect knowledge and thus learn to act on knowledge without owning beliefs. Nor do I claim that all people have the ability to disown beliefs. Perhaps some people can’t, even if they wanted to. It certainly seems that some people, especially highly religious people, do not have that ability. Perhaps their genetic and/or cultural upbringing forever prevents them from doing so; I don’t know. However, to suggest that every human must have beliefs belies the very fact that some of us don’t.

I submit that some, if not most, conscious human beings can learn to gather, understand, and accumulate knowledge and act on it without owning a single belief, and that this provides far more advantage than disadvantage for the advancement of knowledge.

 


 

Problems that derive from belief

Although one can argue that beliefs supported by scientific evidence represent a benign form of beliefs, they also act as barriers towards further understanding. Even the most productive scientists and philosophers through the ages have held beliefs which prevented them from seeing beyond their discoveries and inventions.

For example, Aristotle believed in a prime mover, a “god” that moves the sun and moon and objects through space. With a belief such as this, one cannot possibly understand the laws of gravitation or inertia. Isaac Newton saw through that, established predictions of gravitational events and developed a workable gravitational theory. Amazingly, Newton began to think about relativity long before Albert Einstein; however, his belief in absolute time prevented him from formulating a workable theory. Einstein saw through that, thought in terms of relative time, and formulated his famous theory of general relativity. But even Einstein owned beliefs which barred him from understanding the consequences of quantum mechanics. He could not accept pure randomness in subatomic physics; hence his famous declaration of belief: “God does not play dice.” Regardless, physicists now realize that quantum mechanics not only has nature playing with dice, but requires randomness if one wishes to predict with any statistical accuracy. And on it goes.

Even though great scientists, like any human, can fall prey to beliefs, their discoveries live beyond the barriers of their naive beliefs. Not only did they establish new knowledge about the universe but they also established its limits and, with them, the elimination of absolutes (and if you think about it, only a believer could pretend to know about absolutes, something not even in principle testable for mortal humans). For example, Einstein found the limits to velocity and time (once believed as absolute), Heisenberg saw the limits to reality (uncertainty principle), and Kurt Gödel’s incompleteness theorem set a limit on our knowledge of the basic truths of mathematics. A belief in absolutes that directly contradict these scientific discoveries can only bar one from further understanding.

Although thinking without beliefs does not, by any means, guarantee that people will make scientific breakthroughs, it can, at the very least, remove unnecessary mental obstructions. Belief, even at its lowest level of influence, can create problematic and unnecessary barriers.

As belief progresses towards faith and dogma, the problems escalate and become more obvious. We see this in religions and political ideologies, especially those that contain scripts (bibles, manifestos) which honor war, intolerance, slavery and superstition. We see this in the religious inquisitions, “holy” wars, and slavery. During the period of the Black Plague, millions of humans died in ignorance of the disease, believing that God or Satan caused it, while their religious leaders did little or nothing to encourage experimental scientific investigation. In the 1930s and 40s the world saw the fanatical idealism of communists (which has far more in common with religion than with atheism) as they destroyed millions of lives. We saw how Christianized Germany produced Nazism and the Holocaust, defending against the Jews in order to fight for the Lord (Hitler’s stated belief). To this day, one can observe religious and ethnic beliefs creating war and intolerance in Bosnia, Sri Lanka, Israel, Africa, Russia and in Muslim countries. The tragedy of 9/11 could not have occurred without religious belief in an afterlife. Only religion produces the concept of moral war. Only a religious-minded government would allow science to flounder while emphasizing faith-based programs.

Why does religious belief create such monstrous atrocities? Because religion expresses everything in terms of belief, faith, and absolutes, without need for reason or even understanding. Religion puts reality, morality, love, happiness and desire in a supernatural realm inaccessible to the mind of man. How can humans ever achieve peace when their religious scripts have their god condoning war and violence, while man must accept the superstitious belief that their unknowable god does this for mysterious reasons, forever beyond the comprehension of man? How can you understand the physics of the universe if you believe that an unfathomable supernatural agent created everything just a few thousand years ago? How can you live a full, happy life if your religion denies the nature of sex, desire, and mind? How can you have workable government if you believe laws derive from an incomprehensible super-being? How can you safeguard the future of the planet or your grandchildren if you believe that supernatural predestination will end the world?

Parents teach children at a very young age to believe in abstract concepts such as Santa Claus, the tooth fairy, and supernatural gods. These parents have no understanding of the dangers that such beliefs might cause. Thus we prepare our society not only to accept beliefs, but to honor and fight for them. This commonly results in conflicts between free expression and censorship. For a believer, expressions of ideas in and of themselves represent beliefs. Thus violent television, movies and fictions present opportunities for the unaware to believe in them.

If, instead, we taught our children about beliefs, how they infect the mind, and the dangers they can produce, society would have little need for censoring ideas. For without believers, there would remain no one to believe them, and the violence and fantasy portrayed by their fictions could only represent just that– fictions.

“Don’t believe anything. Regard things on a scale of probabilities. The things that seem most absurd, put under ‘Low Probability’, and the things that seem most plausible, you put under ‘High Probability’. Never believe anything. Once you believe anything, you stop thinking about it.”
–Robert Anton Wilson
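One could even treat Wilson’s advice as a procedure. A toy sketch follows, with made-up claims and entirely subjective numbers, only to show the posture: every entry remains revisable, and none gets promoted to “believed.”

```python
# Ranking claims on a scale of probabilities instead of belief/disbelief.
# The claims and the numbers represent placeholders, not data.
claims = {
    "The sun will appear tomorrow": 0.999999,
    "This new drug outperforms a placebo": 0.6,
    "Ghosts exist": 0.001,
}

for claim, p in sorted(claims.items(), key=lambda kv: -kv[1]):
    if p > 0.9:
        label = "high probability"
    elif p < 0.1:
        label = "low probability"
    else:
        label = "uncertain"
    print(f"{p:>9.6f}  {label:<16}  {claim}")
```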

 


 

The mechanism of belief

Because belief requires a mental process involving neural activity, science can investigate its mechanism. Although the abstractions of belief sit at a hierarchical level above the neuron level, there obviously occurs a connection between neuronal activity and mental thought, and vice versa. Unfortunately, we still have only minute knowledge about the workings of the brain, let alone the complex process that produces thought. However, studies have shown that some forms of delusional thought involve problems with the neocortex. Indeed, one of the characteristics of schizophrenic delusion involves grandiose and religious thinking [3]. Some have even suggested that schizophrenia involves beliefs and attitudes taught to sufferers while young [4].

Also, in epilepsy, neurological storms can trigger feelings and thoughts divorced from external events. Although the neocortex and its sensory equipment get their information from the external world, the limbic system takes its cues from within. The neuroscientist Paul MacLean became fascinated with the “limbic storms” suffered by patients with temporal-lobe epilepsy [5]. MacLean reported:

“During seizures, they’d have this Eureka feeling all out of context– feelings of revelation, that this is the truth, the absolute truth, and nothing but the truth.”

“You know what bugs me most about the brain? It’s that the limbic system, this primitive brain that can neither read nor write, provides us with the feeling of what is real, true, and important.”

This provides an important clue to the mechanism of belief because it suggests that what we think of as true or real actually produces or triggers a feeling. Belief in this sense, then, means a thought with a feeling attached, where the feeling gives us a sense of conviction or truth. In normal people, a well-reasoned thought can trigger a eureka-like feeling, thus generating a belief. This emotional tag attached to a thought may very well have served an important evolutionary role because it would give Homo sapiens a way to prioritize thoughts that confer a survival advantage. These eureka-like emotions also feel good and might very well enhance the memory of survival thoughts.

In abnormal thinking, even an irrational thought can trigger the same eureka-like feeling. In other words, a reasoned thought and an irrational thought can both trigger a feeling of “truth”; in other words, a belief. In its most extreme form, epiphany-like beliefs can result from the ingestion of hallucinogenic chemicals, fanatical religious rituals, extreme fasting, or chemical imbalances in the brain (e.g., manic-depressive/bipolar disorder, schizophrenia, etc.). All of these conditions can lead to excessive beliefs and intense feelings, yet with only irrational thoughts attached to them.

The worst forms of schizophrenia almost always involve extreme forms of delusional belief. Schizophrenics hear voices, act on impulse, think they hear the voice of God or Satan, or act out whatever belief-myth they grew up with. Interestingly, it appears that only thinking animals develop schizophrenia. We have no other animal model for this disease of holding false beliefs and perceiving unreal things [6]. Schizophrenia appears to exist only in humans.

According to V.S. Ramachandran, patients with temporal lobe epilepsy may experience a variety of symptoms that include an obsessive preoccupation with religion and the intensified and narrowed emotional responses that appear characteristic of mystical experience.

I present epileptic storms and schizophrenia here because they represent examples of mental disorder that can result in beliefs pegged to their extreme limit. I trust that most people will recognize that these mental diseases can result in dangerous forms of thinking. If the extreme beliefs held by schizophrenics represent a danger and an undesirable trait, then at what point below this do we consider beliefs desirable?

Since I first posted this article, further research has arrived on the subject that supports the connection of emotions to belief. In 2007, Sam Harris, et al., used functional magnetic resonance imaging (fMRI) to study the brains of 14 adults while they judged written statements as “true,” “false” or “undecidable.” They found “strong reciprocal connections from the limbic system, the basal ganglia, and the association cortex of the parietal lobe. This region of the frontal lobes appears to be instrumental in linking factual knowledge with relevant emotional associations.” The study suggests an anatomic link between purely cognitive aspects of belief and emotional reward. It also suggests that “the physiological difference between belief and disbelief can be independent of a proposition’s content and affective associations” (italics mine) [Harris]. This suggested independence means that a proposition, or knowledge itself, does not require any emotion at all. In disbelief (a form of negative belief, and not the same as no-belief), the researchers also found a pattern of activation similar to that of belief.

Many believers seem to think that all humans believe and that belief represents a requirement for human life. We can show the falsity of this assumption by simply eliminating thought entirely. Not everyone can do this, especially schizophrenics, but for those who wish to, there exist methods for doing so.

At the opposite end of the spectrum, some people can completely stop their thoughts. And when someone can stop the thought process, beliefs cease to exist, at least temporarily. Ancient meditation and modern biofeedback practices show how to reduce or stop the semantic noise within our heads. During this practice, concentrating on a single idea or word (mantra) can reduce the thought level to a minimum (ekaggata). The final aim of eliminating this single thought results in a state of no-thought (the “higher” levels of jhanic samadhi). While in such a state, all thoughts, ideas, and beliefs cease. Indeed, EEG (electroencephalography) scans reveal that during meditative states, theta and alpha brain waves (associated with relaxed attention) dominate, whereas beta waves (associated with active, goal-oriented thinking) diminish.

I bring up meditation and delusion to show that a range of belief intensity occurs between these two extremes.

[Chart: Degrees of belief]

The curve above represents a population of beliefs from 0 (no beliefs, no thinking) to 1 (extreme beliefs, irrational thinking), charted with only two data points (x). The dotted line represents a guess, since I have no data to plot actual probabilities (future investigators will have to gather this information). The degree of belief determines dispositions to hold an idea as absolute or true. Thus, insane forms of thinking (delusional, schizophrenic, etc.) would appear at the far right end of the graph. Extremists (far-right political and religious-right, for example) might appear at around 0.8–0.9. The opposite of extremism would fall toward the left end of the chart (meditators, daydreaming, etc.). From my personal observation, most people do not fall at either end of the spectrum; most fall somewhere well between the two limits. For the general population, I suspect the graph would appear as a bell curve, as shown above.
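Since the dotted line represents only a guess, anyone can redraw the curve from guessed parameters alone. The short sketch below (Python, using the numpy and matplotlib libraries) does exactly that, with an invented mean and spread; it plots no real data because none yet exists.

```python
# Redrawing the guessed "degrees of belief" curve: a bell curve over
# [0, 1], anchored only by the two endpoints the text describes.
# The mean and spread are invented; no real data exists to plot (yet).
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 1, 200)
mu, sigma = 0.5, 0.15                        # guessed parameters
y = np.exp(-0.5 * ((x - mu) / sigma) ** 2)   # unnormalized Gaussian

plt.plot(x, y, "--", label="guessed population of believers")
plt.scatter([0, 1], [0, 0], marker="x", color="k",
            label="known endpoints: no thought / delusion")
plt.xlabel("degree of belief (0 = none, 1 = extreme)")
plt.ylabel("relative frequency (hypothetical)")
plt.legend()
plt.show()
```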

Although schizophrenia describes an obviously dysfunctional disease that causes harm to sufferers and possibly to others, many schizophrenic properties can coexist in the “normal” human thinking process without observers ever noticing. Delusional thinking usually accompanies schizophrenia. But note that delusions represent false beliefs, virtually the same as the conditions for faith. Faith has become acceptable mainly because powerful social institutions support it.

Symptoms of mental disease, of course, do not appear identical for everyone. Some people may have only one episode of schizophrenia in their lifetime. Others may have recurring episodes but lead relatively normal lives in between. Others may have severe symptoms for a lifetime. Indeed, many whom we consider sane commit the most atrocious criminal acts without a diagnosis of insanity. Even legal acts such as war, inquisitions, and pogroms can cause harm to their believers as well as to others. Yet we do not diagnose these acts of belief as mental disease because the very engine of belief puts them in a context of acceptability. Most societies do not abhor war; instead, they honor it, because their belief-systems support the notion of solving problems through the mass killing called war. If, instead, we approached belief-supported violence the way we attempt to solve mental diseases, perhaps we might produce solutions to some of our cultural problems.

A question arises out of these low-to-extreme forms of belief: if extreme beliefs represent a symptom or cause of mental disorder, then can a lack of belief produce a better, healthier [or whatever desirable characteristic you may want to name] way of socially interacting with people? At the low limit, that of meditation, one stops not only belief but all forms of thought. This of course would result in a dangerous living condition if continued indefinitely, but only at the expense of the meditator. At worst the meditator might die for lack of food, but he or she could hardly harm anyone else. But what if one could learn how to think without beliefs? Might it not serve as an advantage, making our thoughts more efficient?

Of course accidents will happen and tragedies will occur. Errors in our models of perception will no doubt always happen. But if we can reduce or eliminate beliefs, wouldn’t we have fewer reasons to harm others through prejudice or violence? Without beliefs, our thoughts would follow the prevailing evidence instead of blocking it with unnecessary convictions.

Even if we cannot solve all mental diseases or prevent dangerous beliefs from forming, we might at least become aware of the mental processes that create beliefs and why they sometimes lead to intransigence. Although no one yet has a clear understanding of how schizophrenia originates, it appears that it may have some connection with genetics, brain damage, chemical imbalances or social upbringing. Fortunately treatments have become available for many mental diseases. For those who have mild cases of mental problems, education alone may redirect the neural path towards productive thinking. For others, drugs and therapy can help alleviate mental problems. Likewise, early education in critical thinking, identification of logical fallacies, and the mechanism of belief may alleviate many of our dangerous beliefs.

 

 


Disowning beliefs

 

From the meaning of belief described above, a person who owns a belief must possess two things: a thought, and the feeling of that thought as ‘true.’ The first requires a functioning neocortex and the second a functioning limbic system (note: “functioning” here includes abnormal as well as normal functioning). This evolutionary and biologically inherited function brings up a valid question:

If a functioning human brain produces thought along with a feeling of ‘truth,’ then all humans who have functioning brains must experience beliefs, no?

Yes! And although this seems to contradict the very concept of no-beliefs, we humans have something that other animals don’t have (except, perhaps, some other primates): the power of introspection and the ability to see our own abstractions (at least some humans have this ability). Psychologists call this ability metacognition (a term coined by John Flavell), which simply means “cognition about cognition.” Indeed, I have the experience of belief when reading a convincing novel or watching a movie or a play, but I know that novels and movies represent fictions because I have the ability to think about my feelings and thoughts. Although I buy the belief, temporarily, for the entertainment value, I do not own the belief. It would prove not only silly but dangerous to walk out of a theater (say, The Exorcist) and still believe the story. The same goes for any belief experience, whether it comes from rational scientific reasoning or from fictions or myths. I may feel (believe) that I have discovered a scientific truth, but I know that my belief comes as a property of brain function, and I have the ability to disown the belief. I can say that it feels right, but I also know that feelings don’t represent facts or knowledge any more than color exists as a property in matter. I also know that feelings-of-truth can mislead, especially when future evidence contradicts the truth-value of the belief, and can lead to intransigence. I can acknowledge the feeling, but I don’t have to acknowledge the belief.

By putting yourself in a higher abstraction, you can ‘see’ the abstractions below you. In this sense you act as the arbiter of your thoughts, picking out those which produce the best results and dismissing those which don’t work, all without owning any belief. Owning beliefs means that you blind yourself to seeing them as what they represent: abstractions. You must also defend the beliefs you own, or else feel oppressed when someone attacks them, and this can lead to depression, argument, violence, or any ultimately tragic end. By disowning beliefs, you not only avoid having to defend them, but you avoid the problems associated with them.

If you still don’t understand how you can disown an inherited biological function, let me give you an analogy using an even older biological function: the sense of balance.

Every normal human has it: those little grains of calcium carbonate, the otoconia, in the inner ear that tickle the hairs of the maculae and detect gravity and acceleration. Pilots of early aviation used to rely on this sense in what they called “flying by the seat of the pants.” But during stormy weather or night flying, pilots became disoriented and began to lose their lives. At first the survivors chalked it up to high winds (how dare anyone accuse these brave pilots of becoming disoriented). But the aviation scientists knew better. When they invented instrument flying, the old-timers balked, but pilots grudgingly learned to rely on the instruments. They learned to distrust their own senses and replaced them with more reliable instruments. One might even ask the heretical question: do humans really need a sense of balance to fly at all? Note that nowhere in that question does it say that one should eliminate the sense of balance.

I simply ask a similar question about belief: do humans need beliefs to survive? Nowhere in that question do I claim that one should eliminate the feeling of belief, only that one can eliminate ownership of it. We humans have an evolved brain that can contemplate our own abstractions and beliefs. We can disown beliefs and replace them. So in the analogy of the sense of balance, what mechanism serves as the flying instrument that replaces belief? Critical thinking coupled with empirical testing (science).

You can feel that something seems true, even if false, while at the same time you do not have to think of it as true.

 


Inside our head vs Outside our head

 

Many people have a difficult time telling the difference between what happens inside their heads and what happens outside their heads. And I don’t mean just schizophrenics or psychopaths, but also sane people. Most of us have had confusions about “reality” at some time in our lives. Since all sensations and information come to the brain filtered, we experience all our perceptions in our heads. To establish the difference between outside versus inside events, we usually derive, through intuition, some sort of comparative test. Most of our sensations instinctively tell us what occurs outside. As infants, we quickly learn that the sounds we hear in our heads actually emanate from the outside. We learn to manipulate objects through touch, observe movement through sight, etc. As we grow, we begin to form abstract thought and we attach these abstractions to our perceptions. Observation, reasoning, and experimentation give us the means to determine the difference between outside our heads and inside our heads.

Errors can creep into our thinking process, and from there they can invade our language system. This happens in virtually any information system. If we do not correct these linguistic and logical errors, we may go for years propagating ancient errors without thinking about them. It seems obvious that this has already occurred in many cultures that have promoted dangerous belief sets. Although most will agree that dangerous beliefs present a threat and that we should do something about them, many beliefs that seem inconsequential receive no concern at all. These seemingly innocent beliefs act through our language system and can give us a false sense of “knowing.”

To give an example, we usually think of color as “out there.” We observe green foliage, blue skies, red apples, etc. Yet color, demonstrably, does not occur “out there” but rather exclusively inside our heads. Matter contains no color. Color has no basis in the physics of light. Color, rather, describes a sensation [10]. However, matter does “reflect” or produce light. Our eyes absorb this energy and our brains interpret this information by “tagging” a sensation of color to it. Many times we express this perception through an error of language that projects color as “out there.” We use ancient “essence” words like “is” and “be” that attribute mystical properties to events which occur only in our heads. For example, “the grass IS green” seems to project the property of “greenness” onto an external plant form. Regardless of how much chlorophyll a plant may contain, it contains no “green.” The color green occurs in our brains as a “tag” to an indirect reflective property of light. Yet our “essence” words and ideas continually fool us into thinking that such things exist outside our heads, without the slightest evidence to support it. To help eliminate these “essence” verbs, we can simply replace them with descriptive verbs. Instead of saying “The grass is green,” I might say, “The grass appears green (to me).” The descriptive verb “appears” connects perception to the observer instead of placing it outside the body. Many sentences which use “to be” verbs produce false or misleading statements [9].
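One can even mechanize the simplest form of this substitution. The crude sketch below handles only the bare “X is/are Y” pattern and would mangle anything subtler (real rewriting requires human judgment), but it shows the direction of the repair:

```python
# A crude mechanical version of the substitution described above:
# rewrite simple "X is/are Y" sentences with the descriptive verbs
# "appears/appear". This toy handles only the simplest pattern.
import re

def descriptify(sentence: str) -> str:
    sentence = re.sub(r"\bis\b", "appears", sentence)
    sentence = re.sub(r"\bare\b", "appear", sentence)
    return sentence

print(descriptify("The grass is green."))   # The grass appears green.
print(descriptify("The skies are blue."))   # The skies appear blue.
```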

 


From belief to faith

 

Many rational people, including most scientists, still insist on utilizing beliefs with the rationale that evidence must accompany beliefs to support them. Of course it proves more prudent to attach evidence to one’s beliefs than to own beliefs without evidence, but why should anyone feel compelled to attach beliefs to evidence at all? Why not stand on the evidence without beliefs? Consider a measurement, for example the velocity of light. I can simply state the calculated or measured velocity as a numerical figure, or I can say, “I believe that the speed of light equals 299,792 km/s.” But the velocity represents a measurement of an external event, not a belief. Belief in the velocity of light adds nothing to the information about the velocity of light. The belief only reflects an intransigent property of the believer and nothing at all about the measured property. Regardless of how mild the intransigence, the belief itself provides no scientific value at all. On the contrary, the belief within that individual may grow to such an extent that it overshadows the evidential data and may cause the believer to hold on to his theory even if future evidence contradicts it. Held as a theory only, without belief, it remains open to future evidence that may reveal new data to modify and improve it.

I have met such believers before, and when shown evidence of the differing velocity of light in crystals, their belief in an absolute value for light rose to the occasion to combat this (to them) new information. Note that when I say belief appears unnecessary to evidence, I do not mean that ideas and thoughts should not accompany evidence. On the contrary, instead of beliefs, we can establish theories and models about the evidence– a predictive and productive way of understanding its consequences. (I’ll add more about this later.)

Although the reasons why people tend towards certain belief-systems remain unclear, Frank Sulloway, a research scholar, has proposed that family dynamics and birth order influence social survival strategies [8]. In general terms, firstborns tend to think conservatively and laterborns tend to think as liberals. In the extremes of both liberals and conservatives, beliefs can take on a fantastical form of thinking. In its most dangerous form, belief takes on its most intransigent property as faith: the reliance on hope and ignorance. Indeed, many psychopaths and schizophrenics provide extreme examples of faith, as the beliefs inside their heads take over from the evidence outside their heads. Some researchers have noted a higher prevalence of schizophrenia in certain religions [11].

 


 

Hypotheses, theories and models

Many religious people who challenge scientists attempt to make scientific theories equivalent to faith. I suspect this gives the faithful comfort, as reducing theory to the level of faith puts both on an equal plane. However, useful theories do not rely on faith and do not even require belief. Scientific theories must agree with nature to some degree; faith need not. If a theory’s predictions fail to produce results, then the theory itself cannot provide usefulness and scientists must throw it out. A hypothesis represents nothing more than a good guess subject to further verification, and usually precedes a theory. A workable theory, however, represents a good guess based on evidence that makes useful predictions.

“It does not make any difference how beautiful your guess is. It does not make any difference how smart you are, who made the guess, or what his name is– if it disagrees with experiment it is wrong. That is all there is to it.”
-Richard Feynman

Newton’s theory of gravity, for example, represents a useful set of guesses that make predictions about matter traveling through space. Newton’s mechanics, however, does not give us absolute or exact predictions. It only allows predictions about matter within acceptable tolerances. Einstein’s theory of gravity carries Newton’s theories to ever more exact figures and we can make even better guesses. But note that the theories of gravity must rely on outside evidence, and the guess must agree with experiment. A theory, therefore, without supporting evidence has no meaning. The following provides some examples of theories:

The kinetic theory of matter depends on the measurable forces between particles of matter.

The theories of gravitation depend on the facts of the measurable results of matter in the field of gravity.

The theory of natural selection depends on the facts of evolution as confirmed by observation, evidence and experiment.

Note that understanding any scientific aspect about the physical world requires some form of theoretical thought.

Models differ from theories, in that they usually represent an abstract copy of the event or thing that we wish to understand. They may provide us with predictions, but they can never fully represent the subject in all its nature. A model represents an incomplete abstraction of a thing outside our heads. Maps, scale models, computer simulations, etc. all provide us with methods to predict the future of an event or thing. For an example of scientific modeling, look at the history of the investigation of atoms. As the evidence accumulated, the physicists made better and more accurate (although incomplete) models of the structure of matter.

A hypothesis may lead to experiment, and both may lead to a theory. If the theory of the evidence provides accurate predictions every time, we sometimes call these “laws” or “knowledge.” Note, however, that “knowledge” does not mean that it comes as absolute. A fact or theory may change in the future, and we may have to modify our knowledge to accommodate the changing evidence.
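A worked example shows what “accurate predictions within tolerances” means in practice. The sketch below applies Newton’s gravitation to predict the length of Earth’s year from standard published constants and compares the guess with the measured value:

```python
# Newton's theory as a predictive guess: derive Earth's year from
# T = 2*pi*sqrt(a^3 / (G*M)) and compare with the measured value.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
A = 1.496e11         # Earth's mean orbital radius, m

predicted_days = 2 * math.pi * math.sqrt(A**3 / (G * M_SUN)) / 86400
measured_days = 365.256  # sidereal year, days

print(f"predicted: {predicted_days:.1f} days")
print(f"measured:  {measured_days:.3f} days")
print(f"error:     {abs(predicted_days - measured_days) / measured_days:.4%}")
```

The guess lands within a small fraction of a per cent of measurement– and it does so whether or not anyone believes in it.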

By utilizing hypotheses, theories and models, we can express thoughts about the world without resorting to beliefs and faith.

 


 

Logic, mathematics, and reason

Unfortunately, many people misuse the concept of logic and believe that it provides a method of arriving at “truth” about the world– that if they propose a logical argument, it somehow gains validity over external events. However, logic by itself says little about the world and does not guarantee “truth.” Logic provides a language of self-consistent reasoning that pertains only to its own construction. A logical conclusion based on sound reasoning might, in fact, disagree with the external event we wish to understand. For example, consider the following logical construction:

All judges are lawyers

No bishops are lawyers

Therefore: No bishops are judges

The above syllogism consists of valid logic. However, each of its propositions must agree with observation before its conclusion can provide any usefulness. Does every judge actually serve as a lawyer? Has no bishop ever served as a lawyer? Reason and logic without evidential support cannot determine much about the world until the evidence supports the propositions.
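One can check this concern mechanically. The sketch below renders the syllogism as set relations over invented names; the form remains valid either way, but the conclusion only helps us when both premises survive checking against the data:

```python
# The syllogism rendered as set relations, over invented data.
# The *form* stays valid either way; the conclusion only helps us
# if both premises actually hold in the world we check them against.
judges  = {"Alice", "Bob"}
lawyers = {"Alice", "Bob", "Carol"}
bishops = {"Dave", "Carol"}          # Carol: a bishop who practices law

all_judges_are_lawyers = judges <= lawyers            # premise 1
no_bishops_are_lawyers = bishops.isdisjoint(lawyers)  # premise 2
no_bishops_are_judges  = bishops.isdisjoint(judges)   # conclusion

print(all_judges_are_lawyers)  # True
print(no_bishops_are_lawyers)  # False -- this premise fails observation
print(no_bishops_are_judges)   # True here, but no longer guaranteed
```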

All ghosts are spirits

No cartoons are spirits

Therefore: No cartoons are ghosts

The logic above appears sound, but what in the world does it mean, and how does it relate to the world? To what context does it refer? What about Casper the ghost?

Interestingly, one of the signs of mental illness, especially schizophrenia, involves irrational thinking and errors in syllogistic reasoning [12].

Note also that many different “logics” exist for many different fields. Traditional logic, for example, simply does not work in the world of quantum physics. The math, the reasoning, and the logic of the quantum world differ widely from those of the macro-world. Unfortunately, today most people rely on only one kind of logic, usually some form of Aristotelian logic. We tend to think in terms of black/white, true/false, good/evil, guilty/not-guilty, up/down, inside/outside, etc. Although many things, indeed, follow this simple kind of logic, a plethora of things operate through a continuum. Although Aristotelian logic may work well for digital circuits or simple syllogisms, it fails miserably when we try to understand the human condition or phenomena that vary continuously.

Mathematics represents a symbolic language of logic and provides us with a tool for reasoning. But mathematics and logic must accommodate external events if they wish to explain them. Of course people may hold beliefs about one mathematical system over another, but any philosophical belief always fails in the light of nature. Only the accuracy of the predictions matters in the mathematical world; beliefs have no bearing on the outcome, regardless of how good they may make their believers feel. In fact, it has appeared commonplace in physics, especially quantum mechanics, that two entirely different mathematical approaches derived from different starting points turn out to give identical quantitative answers [13].

Although logic and mathematics may provide useful tools for reason, scientists may encounter information about the world that matches no logic whatsoever. Unknowns and incomplete information occur many times, but that does not necessarily prevent establishing useful results. Doctors knew that aspirin, for example, worked as a pain blocker, but for many years they had no workable explanation of how it worked. Even gravity, to this day, with all the mathematics predicting its effect on matter, has stumped physicists as to the nature of its mechanism. Many times physicists do not even understand why their system works; they only know that it works. The prime requirement of making useful predictions must come from nature herself, from things outside our heads. All the beliefs, theories, logics and models, regardless of how well constructed, cannot do us any good unless they have some support from evidence. Many times events outside our heads provide us with life-sustaining support without our thinking about them at all (such as breathing air)!

Instead of relying on one logical system, as most people do, we might instead adopt a language that incorporates a system of logics, choosing the system that best fits the object of investigation. Sadly, our English language contains severe limitations and cannot possibly express many of the extraordinary discoveries of the new physics. Mathematics allows a language of continua, multiple dimensions, and infinities, all without the need for introducing ghostly beliefs.

 


 

Preconceived beliefs

I once heard an amusing story about the artist Picasso. I don’t know if it actually happened, but it makes a point about how people construct beliefs of reality from abstractions:

A stranger recognizing Picasso asked him why he didn’t paint pictures of people “the way they really are.” Picasso asked the man what he meant by “the way they really are,” and the man pulled out of his wallet a snapshot of his wife as an example. Picasso responded: “Isn’t she rather small and flat?”

To believe that an abstract representation shows the actual thing leads to an unnecessarily biased form of perception. Belief of any kind puts a kind of shield on the thinker and puts in its place a form of thought which in effect says: “This is real.” Preconceived beliefs coupled with a lack of information can lead to false conclusions.

To take another example, I might say to a group of people, “I love fish.” Everyone may hear me correctly, but because of their preconceived beliefs and a lack of context, some may interpret my meaning as a statement about dining while others may believe I have a love for aquarium fish. Virtually all expressions of thought contain some limitations, and adding preconceived ideas without evidentiary support can produce false statements and beliefs.

Without resorting to belief, I can look at a photograph and see that it only resembles some aspect of a particular thing or person, and that it represents an indirect abstraction. Without belief, I can question a proposition before arriving at a conclusion.

 


 

Limitations of knowledge

“It used to be thought that physics describes the universe. Now we know that physics only describes what we can say about the universe.”
-Niels Bohr

“It is always better to have no ideas than false ones; to believe nothing, than to believe what is wrong.”
-Thomas Jefferson

Our thoughts and expressions through language represent abstractions about the world– metaphors and models about things, not the things themselves. Language and thought cannot describe the totality of a thing any more than a painting or picture can. A picture does not equal its subject, and a map does not equal its territory. But our myths, maps, models, and abstract thoughts do provide a limited means to understand the world and to make predictions about external events. They provide a way to quantify and simplify our communication systems so that we can perform desirable and useful actions in the world. But if we allow unnecessary thoughts and beliefs to reside with our abstractions, we develop semantic noise which can lead to incorrect information.

As limited humans, we do not possess absolute knowledge. Our perceptions and information come to us incomplete. When we look at, touch and measure an object, for example, we only observe part of its totality. Belief, on the other hand, can produce the illusion that we understand without limitations. Eliminating concepts of belief at least puts us closer to the range of our perceptions. We inherit mortal limitations: we cannot know with absolute certainty about the external world; we cannot completely remove doubt about our conclusions. Many philosophers and scientists have come to this same observation [14]. Doubt leaves the door open for further investigation. Intransigent belief puts a mental barrier in the way of further knowledge.

 


 

Bias (point of view)

Because our models and theories represent limited knowledge about the world, they force us to examine the universe within boundaries. This produces a point of view. Bias represents a focus, direction, or preference towards a point of view while ignoring, or failing to examine, existing evidence. One cannot avoid a point of view. Regardless of how one might try to prevent bias, something will almost always get left out of the description. Similar to Heisenberg’s uncertainty principle, the narrower a focus becomes, the more gets left outside it. And vice versa: the more general a view becomes, the more the details get left out. If one tries to include the details with the general, a view can bog down in an overblown aggregation of information, turning a direction of thought into a cloud of complexity; and even then, the entire system would reside within a framework of limitations. Regardless of how thoroughly one may reject beliefs, a point of view occurs if only because each of us represents a unique and limited spatial entity within the universe.

The negative aspect we usually associate with bias does not come from bias itself but rather from the belief that comes with it. Belief produces a set of brackets around a point of view that says, in effect, “The answer lies here.” Once you believe you have found the answer, your point of view becomes biased (intransigent and prejudiced) and prevents you from looking at other possible alternatives. Again, beliefs act as a barrier to further understanding. If a person develops faith in a point of view, it can become so overwhelming that the faithful will not yield to better information, even in the light of convincing evidence. A biased belief can convince its believers that they hold the key to all understanding and “truth” without providing any evidence to support it.

A point of view, however, does not demand a predisposition to belief; it can simply represent a direction of thought. Ideas, by their very nature, represent limitations of thought. As long as a point of view produces a reasonable explanation, uses only the pertinent information necessary to make predictions, and leaves open the possibility of change in favor of better evidence, it serves as a useful and productive tool. As we learn and understand our limitations– that a point of view represents an understood direction– we gain the possibility of transcending it into an even more productive point of view.

 


 

Imagination, fantasy and wonder

“Imagination is more important than knowledge.”
-Albert Einstein

As humans, we have the remarkable ability to make things up and to pretend. Imagination and fantasy provide us with one of the most pleasurable ways to experience thoughts and give us one of the fundamental requirements for the ability to create. Our imagination provides us with the mental capacity to express models in our heads and to act out scenarios of love, conquest, gamesmanship and adventure. I can’t imagine any new invention, art, or literature deriving without its author engaging in the pleasure of a fantasy. The feeling of wonder about things in the world and the mysteries of the universe fills us with imagination and speculation. Although Einstein put imagination above knowledge (something I don’t necessarily agree with), it certainly serves a very useful function.

Fantasy and imagination, of course, require no belief in them. They provide us a way to model and hypothesize non-actual events that may eventually lead to knowledge of actual things, or perhaps even to a novel invention. Fantasy coupled with ideas about actual events can lead to great insights about future events. Many a science fiction story, for example, has inspired scientists to construct hypotheses that led to verifiable experiments and the invention of useful machines. Even fantasy by itself provides an enjoyable way of expressing thoughts. But if individuals begin to believe in their own fantasies, or worse, have faith in them, then usually only disappointment or tragedy results.

 


 

Natural desire

“We always move on two feet– the two poles of knowledge and desire.”
-Elie Faure

Desire comes to us as a natural feeling. As biological animals, we cannot avoid desires. We desire food, shelter, freedom of expression, etc. As exploratory animals, we humans use our minds as a tool to help satisfy the desires within us. With reflection and thought, we learn the limits of our desires. Eating too much, for example, can damage health and prevent the satisfaction of other desires. By understanding the consequences of desire, we can avoid the excesses and blockages of desire. The expression and satisfaction of our desires (sex, feelings, hunger, etc.) constitutes a human need, and if we do not satisfy our natural needs, severe consequences can result.

Sadly, many of our belief-systems put a stranglehold on our natural instinctive desires. If a belief-system teaches that “sex is evil,” “only godly belief will help you,” or suppresses expression and communication, we may turn depraved, depressed, or violent.

Believers often express desire indirectly in terms of hope, a form of wishful thinking. Indeed, faith hinges on the requirements of hope and ignorance. Hope without an adequate method of achieving our desires can lead to debilitating disappointment and sorrow. I can only imagine the number of tragedies that have occurred from failures due to excessive wishful thinking. Instead of relying on faith and hope, we might analyze our desires and use our knowledge and creative minds to find ways of satisfying them.

 


Morality

Many people think that morality stems from religion, usually from some form of ‘divine’ instruction in the form of scripture, holy writ, or the teachings of shamans or priests. However, research from evolutionary biologists and sociobiologists has shown that the precursors of human morality occur in many other social animals, especially primates such as chimpanzees and bonobos (our closest animal relatives). Religion emerged after morality; thus, human morality cannot have come originally from religion. As an example from personal experience, I remember as a child learning golden-rule behavior by interacting with my schoolmates in the sandlot before anyone taught me about religion– indeed, before I even knew what the golden rule or morality meant! I simply behaved in a manner that felt right to me at the time. (A few other children acted through iron-rule behavior: the “bullies.”)

Morality ultimately stems from the brain, and it requires emotions and consciousness. The science of human behavior suggests that innate morality comes to us from birth, perhaps similar to the language instinct, where humans have an innate capacity for language even though any particular language comes from cultural development (see Steven Pinker and Noam Chomsky). Religion may have served as the first system to control morality through religious belief instruction (and force), but that says nothing about the workability of a moral system. In fact, one can argue that religious morality creates more moral problems because it does not conform to reality (relying as it does on supernatural beliefs, not on nature) and because it produces dogma which can prevent one from establishing workable morality in the light of new evidence. After all, the three most influential religions in the world (Judaism, Christianity, Islam) stem from books written during the Bronze and Iron Ages, long before people understood the science of biology and human behavior. Clearly, thousands of years of moral instruction from these religions have never produced a workable moral system (do I really need to go into wars, slavery, pogroms, witch hunts, intolerances, etc. to explain this?).

Since humans live in the natural world and science provides the only tools to understand the natural world, it follows that science might provide us with the best way to establish workable moral systems. Unfortunately, much of human nature remains unknown to us, and scientists have barely begun to study moral systems. Moreover, the dogmatic belief that morality can only come from religion further bars people, even many scientists, from thinking about it. Nevertheless, the science of morality has begun: with the philosophical ideas of Jeremy Bentham and the philosophy of consequentialism, with the research on human cooperation from Robert Axelrod, and with the many scientists now studying how the brain creates moral judgments.
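Axelrod’s result lends itself to a small demonstration. The sketch below runs an iterated prisoner’s dilemma with the standard payoff convention; mutual, reciprocating cooperation (tit-for-tat) earns far more against its own kind than pure defectors earn against each other, hinting at how cooperative “moral” behavior can emerge without any belief at all:

```python
# A minimal iterated prisoner's dilemma in the spirit of Axelrod's
# cooperation tournaments. Standard payoffs: T=5, R=3, P=1, S=0.
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(my_hist, their_hist):
    return their_hist[-1] if their_hist else "C"  # cooperate, then mirror

def always_defect(my_hist, their_hist):
    return "D"

def play(strat_a, strat_b, rounds=200):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strat_a(hist_a, hist_b), strat_b(hist_b, hist_a)
        pa, pb = PAYOFF[(a, b)]
        hist_a.append(a); hist_b.append(b)
        score_a += pa; score_b += pb
    return score_a, score_b

print("TFT vs TFT:     ", play(tit_for_tat, tit_for_tat))      # (600, 600)
print("TFT vs defector:", play(tit_for_tat, always_defect))    # (199, 204)
print("Defector pair:  ", play(always_defect, always_defect))  # (200, 200)
```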

Innate morality does not require ownership of beliefs because it acts through our biological system in response to stimuli and our environment (although many people do attach beliefs to it). I do not have to believe in order to act. Religious morality, however, almost always requires belief because it involves religious instruction which one must believe in order to accept the dogma. In both cases, innate morality or religious morality might prove tragically wrong because of particular circumstances (for example, I might treat someone altruistically, not knowing that that person relies on deception and trickery to get what he wants, or I might turn the other cheek to an enemy, which could result in the death of myself and others).

Instead of relying on innate feelings or belief, I can spend more time thinking about and evaluating my feelings and the feelings of those around me, trying to establish the consequences of my actions (ethics).

Morality requires feelings and emotions because our subjective values stem from emotions, and we need values to establish morality. Here we have emotions that trump logical reasoning (just the opposite of beliefs). For morality, we want to use emotions within logical structures– not as beliefs, but as a way to achieve desires and wants. Beliefs involve statements about external truths which do not require feelings, but in morality we must use our feelings to direct us toward a workable moral system. One does not need belief statements to do this; one need only use desire statements. For example: I want people to live together peacefully because everyone will feel better as a result. I might then describe a way to achieve this want by establishing a theory around it. At no time do I require beliefs to establish statements about morality.

Many of our innate feelings already drive us in this direction, but only a full study of the behavior and feelings of humans can result in any kind of consensus on the right action to take. And this requires science. In any case, one could construct an ethical system that remains flexible, based on human nature and science, all without owning a single moral belief. Of course, disowning beliefs does not guarantee a workable moral system, but it does get rid of all the belief-based systems that have no connection at all with human nature. At the very least, this opens up opportunities to create a moral system that works for the individual as well as for others.

 


 

Everyone believes in something?

Many a believer, religious and atheist alike, will express astonishment at any statement against belief, if for no other reason than that they believe and the people around them hold beliefs. They tend to form a belief-of-its-own that projects beliefs onto others. However, simply because most people own beliefs, it does not follow that all people own beliefs. To claim knowledge that everyone on earth believes in something amounts to an astonishing proclamation: it would require an omniscient ability to see into the minds of every human on earth. Moreover, many people fail to understand that belief requires conscious acceptance. People who own beliefs (unless they lie) do not deny them; quite the contrary, people who believe admit their beliefs quite readily. Furthermore, few people stop to ask what we mean by beliefs, or understand that one can replace belief with other forms of “thinking.”


I don’t believe the sun will rise tomorrow, but I predict it will

Not believing in something does not mean thinking that it will not happen; the absence of belief does not prevent one from predicting the event. It may seem fatuous not to believe the sun will appear the next day. However, as a limited human being, I maintain no absolute certainty that a sunrise will occur. At best, I can only make a prediction based on past experience. Since I have experienced daylight every day of my life, and know of no human who hasn’t, I have no evidence that a sunrise will fail to occur tomorrow. Therefore I can predict, from past experience, that a sunrise will very likely occur the next day. Note that I do not require believing to do this – only observation, experience, and good guessing. Prediction based on experience, in this case, replaces belief. But note that my prediction may prove wrong, however remote the chances. We have evidence that supernovas occur in the universe and can destroy local solar systems. If such an event occurred in our part of the galaxy, our sun could be destroyed, along with the earth and all humans on it. So although there exists a very remote chance that the sun will not appear, I can at least predict with great (but imperfect) accuracy that I will see sunlight the next day.

By replacing belief with predictive thought, one can eliminate the need for belief, yet still maintain an outlook on life and make useful predictions.
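
To make this concrete, here is a minimal sketch, in Python, of one standard way to turn accumulated experience into a probability rather than a belief: Laplace’s rule of succession, which estimates the chance of an event recurring, after it has occurred in n out of n observed trials, as (n + 1) / (n + 2). The observation count below is an illustrative assumption for the example, not a figure from the text.

# A minimal sketch of prediction without belief, using Laplace's rule of
# succession: after an event has occurred in n out of n observed trials,
# estimate the probability that it occurs next time as (n + 1) / (n + 2).

def rule_of_succession(successes: int, trials: int) -> float:
    """Estimated probability that the event occurs on the next trial."""
    return (successes + 1) / (trials + 2)

# Illustrative assumption: roughly 40 years of daily sunrises, no failures.
days_observed = 40 * 365
p_sunrise = rule_of_succession(days_observed, days_observed)

print(f"Predicted probability of a sunrise tomorrow: {p_sunrise:.6f}")
# Prints ~0.999932: very likely, but never 1.0 - a prediction, not a belief.

However many sunrises accumulate, the estimate approaches certainty without ever reaching it, which captures the difference between predicting and believing.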


Don’t you believe you exist?

To the believer who poses this question, I can only respond with “I know I exist, but apparently you only believe you exist.”

Questions about belief in our own existence aim to put a philosophical end to the discussion by proposing a proposition that (to believers) no one could possibly deny. However, eliminating belief does not deny the evidence of existence. This appears so obvious that it only shows the power of belief to blind people to the world around them.

Any fair observer will note that no animal, humans included, requires a belief in its own existence. Humans, however, have the power of knowledge and the ability to express themselves. I know I exist because I receive knowledge of my existence every second of my conscious life, directly from my feelings, perceptions, and thoughts – no belief required. Belief only introduces an unnecessary proposition: I can simply say “I exist” instead of “I believe I exist.” My knowledge of existence comes from experience, not belief. Eliminating beliefs makes our statements more concise, accurate and meaningful.

However, those who only believe in their existence automatically reduce their entire lives to an abstraction: a belief. In effect, they have put an unnecessary barrier between their minds and the world around them.


Owning no beliefs does not result in nihilism

To characterize the absence of beliefs as nihilism only creates a straw man. Of course a nihilist might very well claim to abandon knowledge of existence, but that claim usually comes in the form of a belief – one believes that nothing exists, or that no one can know anything. Nothing I have written rejects the notion of existence or knowledge, whether metaphysical, political or ethical. Abandoning beliefs does not cut one off from reality, morality or society. On the contrary, I submit that eliminating ownership of beliefs tends to enhance our knowledge of things, by removing the very obstruction that prevents us from knowing how things work in the universe. The elimination of beliefs as I describe it represents the very antithesis of nihilism. The problems that derive from beliefs prevent us from gaining knowledge of existence, morality and workable political systems.

Ironically, many believers who accuse others of nihilism follow a similar path themselves by denying reality in favor of superstitious beliefs. How in the world can one know about reality while believing in a supernatural force which (according to religious philosophers) remains entirely separate from the world and which, in principle, no one can know?

So if you think (or believe) that I submit to a form of nihilism, then you have abandoned a main premise of this essay and put yourself at a disadvantage by ignoring or denying an idea that, in my opinion, remains valid and very workable.


No, I don’t believe my own words

And neither should you. But I do ask questions, and because you have read this far – even if you strongly disagree – you must ask yourself this: which method works best, acting on beliefs or acting on knowledge? If you have difficulty answering that question, then perhaps your beliefs prevent you from acknowledging the obvious.

This text presents points of view based on my (and others’) experiences, observations, and research into the thought process. I do not present them as beliefs but rather as an investigation into the mechanism of belief. If any of my statements prove false, then they will show simply that, and they remain subject to further revision. Disowning beliefs does not guarantee “truth” or accuracy; it offers only a method to help clear away superstitions and falsehoods.


Summary

Beliefs and faiths represent a type of mental activity that produces an unnecessary and dangerous false sense of trust and wrongful information (thinking coupled with the feeling of “truth”). Faith rarely agrees with the world around us. History has shown that beliefs and faith of the most intransigent kind have served as the trigger for tragic violence and destruction and have sustained people’s ignorance. Replacing beliefs with predictive thoughts based on experience and evidence provides a means to eliminate intransigence and dangerous superstitious thought.

Beliefs and faiths do not establish “truths” or facts. It does not matter how many people believe or for how many centuries they have believed; it does not matter how reverent or important people consider a belief: if it does not agree with the evidence, then it simply has no validity in the outside world. All the things we know about the world, we can express without referring to a belief. Even at its most benign, belief can act as a barrier to further understanding.

I present a very simple observation about the limits of ignorance and knowledge: if you don’t know about something and you submit it to nothing but belief, it will likely prove false; if you know about something, then you don’t need to believe it, because you know it. Between ignorance and knowledge lie the uncertainties about the world, and the best way to handle uncertainties involves thinking in terms of probabilities. So what use does belief have?

If you have awareness of abstracting, you can then begin to replace believing with thinking.

Instead of owning beliefs, we can utilize hypotheses, theories, and models to make predictions about things in the world. In semantic form, we can replace “belief” words with “thinking” words, which better describe the formation of our ideas. We can use our imaginations to create new hypotheses toward desired goals. Certainly we will fail sometimes, but disowning beliefs allows us to correct our mistakes without submitting our ideas to years or centuries of time-consuming traditional barriers. Theory coupled with imagination can yield inventive thoughts and points of view. By further understanding our language and eliminating unworkable essence words, we can communicate without resorting to preconceived ideas based on past beliefs. Our feeling of wonder about the universe provides the fuel for exploration; how much more magnificent the results from useful thoughts than from ones based on faith.


Notes:

[1] Sagan, C., Druyan, A., "Shadows of Forgotten Ancestors," p. 198

[2] Eisler, Riane, “The Chalice & the Blade,” Chapter 2

[3] Shapiro, Sue A., “Contemporary Theories of Schizophrenia, Review and Synthesis,” p.10

See also Early Warning Signs of schizophrenia: http://www.mentalhealth.com/book/p40-sc02.html#Head_5

[4] Modrow, John, “How to Become a Schizophrenic,” See Introduction & Chapter 1

[5] Hooper, Judith & Teresi, Dick, "The 3-Pound Universe," p. 48 (paperback)

[6] Hooper, Judith & Teresi, Dick, "The 3-Pound Universe," p. 106 (paperback)

[7] Scheibe, Karl E., “Beliefs and Values,” p.27

[8] Sulloway, Frank J., "Born to Rebel: Birth Order, Family Dynamics, and Creative Lives." Sulloway presents a scientific statistical analysis of radical believers in history compared with conservative believers. His findings offer evidence that family dynamics influence the behavior of siblings: firstborns tend to identify with parental authority and the status quo, while laterborns tend to rebel against authority. This engine of behavior can influence what we believe in.

[9] Bourland, Jr., D. David, and Johnston, P. D., "To Be or Not: An E-Prime Anthology," International Society for General Semantics, 1991

[10] Feynman, Richard, "The Feynman Lectures on Physics," Vol. 1, pp. 35-10

[11] Bellak M.D., Leopold, “Disorders of the Schizophrenic Syndrome,” pp. 26-27

[12] Chapman, Loren J. & Chapman, Jean P., "Disordered Thought in Schizophrenia," Chapter 8: "Errors in Syllogistic Reasoning"

[13] Heisenberg’s matrix mechanics and Schrodinger’s wave mechanics provide an example of two mathematical systems which give equivalent results. See Polkinghorne, J.C., “The Quantum World,” p.14 (paperback)

[14] Levi, Isaac, “The Fixation of Belief and its Undoing,” pp. 2-3


Bibliography:

Bellak M.D., Leopold, “Disorders of the Schizophrenic Syndrome,” Basic Books, Inc., New York, 1979

Bourland, Jr., D. David, and Johnston, P. D., “To Be or Not: An E-Prime Anthology,” International Society for General Semantics, 1991

Chapman, Loren J. & Chapman, Jean P., "Disordered Thought in Schizophrenia," Prentice-Hall, Inc., 1973

Crees, Adrian, “Anatomy of Religion,” Freshet Press, 1989

Dawkins, Richard, "The God Delusion," Bantam Press, 2006

Feynman, Richard, “The Character of Physical Law,” The M.I.T. Press, 1965

Feynman, Richard, "The Feynman Lectures on Physics," Vol. 1, Addison-Wesley Publishing Co., 1963

Gottesman, Irving I., “Schizophrenia Genesis, the Origins of Madness,” 1991

Harris, Sam, Sheth, Sameer A., & Cohen, Mark S., "Functional Neuroimaging of Belief, Disbelief, and Uncertainty," 2007

Herbert, Nick, “Quantum Reality, Beyond the New Physics,” Anchor Books, 1985

Hoffer, Eric, "The True Believer," The New American Library, 1951

Hooper, Judith & Teresi, Dick, “The 3-Pound Universe,” Dell Publishing, 1986

"Is Religion a Form of Insanity?" A Free Inquiry Symposium, Free Inquiry, Vol. 13, No. 3, Summer 1993

Levi, Isaac, “The Fixation of Belief and its Undoing,” Cambridge University Press, 1991

Modrow, John, “How to Become a Schizophrenic,” Apollyon Press, 1995

Murphy, H.B.M., "Cultural Factors in the Genesis of Schizophrenia," in "The Transmission of Schizophrenia," Rosenthal, D., & Kety, S.S. (eds.), Oxford: Pergamon Press, 1968

Polkinghorne, J.C., “The Quantum World,” Princeton University Press, 1984

Ramachandran, V.S. & Blakeslee, S., "Phantoms in the Brain: Probing the Mysteries of the Human Mind," Quill, 1999

Sagan, Carl & Druyan, Ann, “Shadows of Forgotten Ancestors,” Random House, New York, 1992

Scheibe, Karl E., “Beliefs and Values,” Holt, Rinehart and Winston, Inc., 1970

Shapiro, Sue A., “Contemporary Theories of Schizophrenia, Review and Synthesis,” McGraw-Hill Book Co. 1981

Sinclair, W.A., “The Traditional Formal Logic,” Methuen & Co. Ltd., 1937

Sulloway, Frank J., "Born to Rebel: Birth Order, Family Dynamics, and Creative Lives," Pantheon Books, New York, 1996

Internet sites:

An Essay on Belief and Acceptance by Jonathan Cohen

Atoms, a short history of the knowledge of the atom by Jim Walker

Brain Waves and Meditation

Confusing the Map for the Territory by Jim Walker

Schizophrenia: early warning signs by Max Birchwood, Elizabeth Spencer & Dermot McGovern

Pictures of the brain’s activity during Yoga Nidra by Robert Nilsson

Understanding E-Prime by R. A. Wilson


h1

War is a racket – It always has been. By USMC Smedley Butler

September 13, 2013
USMC Major General Smedley D. Butler (1881-1940)

Smedley Darlington Butler was a Major General in the U.S. Marine Corps, the highest rank authorized at that time, and at the time of his death the most decorated Marine in U.S. history.

WAR IS A RACKET

It always has been.

It is possibly the oldest, easily the most profitable, surely the most vicious. It is the only one international in scope. It is the only one in which the profits are reckoned in dollars and the losses in lives.

A racket is best described, I believe, as something that is not what it seems to the majority of the people. Only a small “inside” group knows what it is about. It is conducted for the benefit of the very few, at the expense of the very many. Out of war a few people make huge fortunes.

In the [First] World War, a mere handful garnered the profits of the conflict. At least 21,000 new millionaires and billionaires were made in the United States during the World War. That many admitted their huge blood gains in their income tax returns. How many other war millionaires falsified their tax returns, no one knows.

How many of these war millionaires shouldered a rifle? How many of them dug a trench? How many of them knew what it meant to go hungry in a rat-infested dug-out? How many of them spent sleepless, frightened nights, ducking shells and shrapnel and machine gun bullets? How many of them parried a bayonet thrust of an enemy? How many of them were wounded or killed in battle?

Out of war, nations acquire additional territory, if they are victorious. They just take it. This newly acquired territory promptly is exploited by the few – the selfsame few who wrung dollars out of blood in the war. The general public shoulders the bill.

And what is this bill?

This bill renders a horrible accounting. Newly placed gravestones. Mangled bodies. Shattered minds. Broken hearts and homes. Economic instability. Depression and all its attendant miseries. Back-breaking taxation for generations and generations.

For a great many years, as a soldier, I had a suspicion that war was a racket; not until I retired to civil life did I fully realize it. Now that I see the international war clouds gathering, as they are today, I must face it and speak out.

Again they are choosing sides. France and Russia met and agreed to stand side by side. Italy and Austria hurried to make a similar agreement. Poland and Germany cast sheep’s eyes at each other, forgetting, for the nonce [for the moment], their dispute over the Polish Corridor.

The assassination of King Alexander of Jugoslavia [Yugoslavia] complicated matters. Jugoslavia and Hungary, long bitter enemies, were almost at each other’s throats. Italy was ready to jump in. But France was waiting. So was Czechoslovakia. All of them are looking ahead to war. Not the people – not those who fight and pay and die – only those who foment wars and remain safely at home to profit.

There are 40,000,000 men under arms in the world today, and our statesmen and diplomats have the temerity to say that war is not in the making.

Hell’s bells! Are these 40,000,000 men being trained to be dancers?

Not in Italy, to be sure. Premier Mussolini knows what they are being trained for. He, at least, is frank enough to speak out. Only the other day, Il Duce in “International Conciliation,” the publication of the Carnegie Endowment for International Peace, said:

“And above all, Fascism, the more it considers and observes the future and the development of humanity quite apart from political considerations of the moment, believes neither in the possibility nor the utility of perpetual peace… War alone brings up to its highest tension all human energy and puts the stamp of nobility upon the people who have the courage to meet it.”

Undoubtedly Mussolini means exactly what he says. His well-trained army, his great fleet of planes, and even his navy are ready for war – anxious for it, apparently. His recent stand at the side of Hungary in the latter’s dispute with Jugoslavia showed that. And the hurried mobilization of his troops on the Austrian border after the assassination of Dollfuss showed it too. There are others in Europe too whose sabre rattling presages war, sooner or later.

Herr Hitler, with his rearming Germany and his constant demands for more and more arms, is an equal if not greater menace to peace. France only recently increased the term of military service for its youth from a year to eighteen months.

Yes, all over, nations are camping in their arms. The mad dogs of Europe are on the loose. In the Orient the maneuvering is more adroit. Back in 1904, when Russia and Japan fought, we kicked out our old friends the Russians and backed Japan. Then our very generous international bankers were financing Japan. Now the trend is to poison us against the Japanese. What does the “open door” policy to China mean to us? Our trade with China is about $90,000,000 a year. Or the Philippine Islands? We have spent about $600,000,000 in the Philippines in thirty-five years and we (our bankers and industrialists and speculators) have private investments there of less than $200,000,000.

Then, to save that China trade of about $90,000,000, or to protect these private investments of less than $200,000,000 in the Philippines, we would be all stirred up to hate Japan and go to war – a war that might well cost us tens of billions of dollars, hundreds of thousands of lives of Americans, and many more hundreds of thousands of physically maimed and mentally unbalanced men.

Of course, for this loss, there would be a compensating profit – fortunes would be made. Millions and billions of dollars would be piled up. By a few. Munitions makers. Bankers. Ship builders. Manufacturers. Meat packers. Speculators. They would fare well.

Yes, they are getting ready for another war. Why shouldn’t they? It pays high dividends.

But what does it profit the men who are killed? What does it profit their mothers and sisters, their wives and their sweethearts? What does it profit their children?

What does it profit anyone except the very few to whom war means huge profits?

Yes, and what does it profit the nation?

Take our own case. Until 1898 we didn’t own a bit of territory outside the mainland of North America. At that time our national debt was a little more than $1,000,000,000. Then we became “internationally minded.” We forgot, or shunted aside, the advice of the Father of our country. We forgot George Washington’s warning about “entangling alliances.” We went to war. We acquired outside territory. At the end of the World War period, as a direct result of our fiddling in international affairs, our national debt had jumped to over $25,000,000,000. Our total favorable trade balance during the twenty-five-year period was about $24,000,000,000. Therefore, on a purely bookkeeping basis, we ran a little behind year for year, and that foreign trade might well have been ours without the wars.

It would have been far cheaper (not to say safer) for the average American who pays the bills to stay out of foreign entanglements. For a very few this racket, like bootlegging and other underworld rackets, brings fancy profits, but the cost of operations is always transferred to the people – who do not profit.

WHO MAKES THE PROFITS?

The World War, rather our brief participation in it, has cost the United States some $52,000,000,000. Figure it out. That means $400 to every American man, woman, and child. And we haven’t paid the debt yet. We are paying it, our children will pay it, and our children’s children probably still will be paying the cost of that war.

The normal profits of a business concern in the United States are six, eight, ten, and sometimes twelve percent. But war-time profits – ah! that is another matter – twenty, sixty, one hundred, three hundred, and even eighteen hundred per cent – the sky is the limit. All that traffic will bear. Uncle Sam has the money. Let’s get it.

Of course, it isn’t put that crudely in war time. It is dressed into speeches about patriotism, love of country, and “we must all put our shoulders to the wheel,” but the profits jump and leap and skyrocket – and are safely pocketed. Let’s just take a few examples:

Take our friends the du Ponts, the powder people – didn’t one of them testify before a Senate committee recently that their powder won the war? Or saved the world for democracy? Or something? How did they do in the war? They were a patriotic corporation. Well, the average earnings of the du Ponts for the period 1910 to 1914 were $6,000,000 a year. It wasn’t much, but the du Ponts managed to get along on it. Now let’s look at their average yearly profit during the war years, 1914 to 1918. Fifty-eight million dollars a year profit we find! Nearly ten times that of normal times, and the profits of normal times were pretty good. An increase in profits of more than 950 per cent.

Take one of our little steel companies that patriotically shunted aside the making of rails and girders and bridges to manufacture war materials. Well, their 1910-1914 yearly earnings averaged $6,000,000. Then came the war. And, like loyal citizens, Bethlehem Steel promptly turned to munitions making. Did their profits jump – or did they let Uncle Sam in for a bargain? Well, their 1914-1918 average was $49,000,000 a year!

Or, let’s take United States Steel. The normal earnings during the five-year period prior to the war were $105,000,000 a year. Not bad. Then along came the war and up went the profits. The average yearly profit for the period 1914-1918 was $240,000,000. Not bad.

There you have some of the steel and powder earnings. Let’s look at something else. A little copper, perhaps. That always does well in war times.

Anaconda, for instance. Average yearly earnings during the pre-war years 1910-1914 of $10,000,000. During the war years 1914-1918 profits leaped to $34,000,000 per year.

Or Utah Copper. Average of $5,000,000 per year during the 1910-1914 period. Jumped to an average of $21,000,000 yearly profits for the war period.

Let’s group these five, with three smaller companies. The total yearly average profits of the pre-war period 1910-1914 were $137,480,000. Then along came the war. The average yearly profits for this group skyrocketed to $408,300,000.

A little increase in profits of approximately 200 per cent.

Does war pay? It paid them. But they aren’t the only ones. There are still others. Let’s take leather.

For the three-year period before the war the total profits of Central Leather Company were $3,500,000. That was approximately $1,167,000 a year. Well, in 1916 Central Leather returned a profit of $15,000,000, a small increase of 1,100 per cent. That’s all. The General Chemical Company averaged a profit for the three years before the war of a little over $800,000 a year. Came the war, and the profits jumped to $12,000,000 – a leap of 1,400 per cent.

International Nickel Company – and you can’t have a war without nickel – showed an increase in profits from a mere average of $4,000,000 a year to $73,000,000 yearly. Not bad? An increase of more than 1,700 per cent.

American Sugar Refining Company averaged $2,000,000 a year for the three years before the war. In 1916 a profit of $6,000,000 was recorded.

Listen to Senate Document No. 259. The Sixty-Fifth Congress, reporting on corporate earnings and government revenues. Considering the profits of 122 meat packers, 153 cotton manufacturers, 299 garment makers, 49 steel plants, and 340 coal producers during the war. Profits under 25 per cent were exceptional. For instance the coal companies made between 100 per cent and 7,856 per cent on their capital stock during the war. The Chicago packers doubled and tripled their earnings.

And let us not forget the bankers who financed the great war. If anyone had the cream of the profits it was the bankers. Being partnerships rather than incorporated organizations, they do not have to report to stockholders. And their profits were as secret as they were immense. How the bankers made their millions and their billions I do not know, because those little secrets never become public – even before a Senate investigatory body.

But here’s how some of the other patriotic industrialists and speculators chiseled their way into war profits.

Take the shoe people. They like war. It brings business with abnormal profits. They made huge profits on sales abroad to our allies. Perhaps, like the munitions manufacturers and armament makers, they also sold to the enemy. For a dollar is a dollar whether it comes from Germany or from France. But they did well by Uncle Sam too. For instance, they sold Uncle Sam 35,000,000 pairs of hobnailed service shoes. There were 4,000,000 soldiers. Eight pairs, and more, to a soldier. My regiment during the war had only one pair to a soldier. Some of these shoes probably are still in existence. They were good shoes. But when the war was over Uncle Sam had a matter of 25,000,000 pairs left over. Bought – and paid for. Profits recorded and pocketed.

There was still lots of leather left. So the leather people sold your Uncle Sam hundreds of thousands of McClellan saddles for the cavalry. But there wasn’t any American cavalry overseas! Somebody had to get rid of this leather, however. Somebody had to make a profit in it – so we had a lot of McClellan saddles. And we probably have those yet.

Also somebody had a lot of mosquito netting. They sold your Uncle Sam 20,000,000 mosquito nets for the use of the soldiers overseas. I suppose the boys were expected to put it over them as they tried to sleep in muddy trenches – one hand scratching cooties on their backs and the other making passes at scurrying rats. Well, not one of these mosquito nets ever got to France!

Anyhow, these thoughtful manufacturers wanted to make sure that no soldier would be without his mosquito net, so 40,000,000 additional yards of mosquito netting were sold to Uncle Sam.

There were pretty good profits in mosquito netting in those days, even if there were no mosquitoes in France. I suppose, if the war had lasted just a little longer, the enterprising mosquito netting manufacturers would have sold your Uncle Sam a couple of consignments of mosquitoes to plant in France so that more mosquito netting would be in order.

Airplane and engine manufacturers felt they, too, should get their just profits out of this war. Why not? Everybody else was getting theirs. So $1,000,000,000 – count them if you live long enough – was spent by Uncle Sam in building airplane engines that never left the ground! Not one plane, or motor, out of the billion dollars worth ordered, ever got into a battle in France. Just the same the manufacturers made their little profit of 30, 100, or perhaps 300 per cent.

Undershirts for soldiers cost 14¢ [cents] to make and Uncle Sam paid 30¢ to 40¢ each for them – a nice little profit for the undershirt manufacturer. And the stocking manufacturer and the uniform manufacturers and the cap manufacturers and the steel helmet manufacturers – all got theirs.

Why, when the war was over some 4,000,000 sets of equipment – knapsacks and the things that go to fill them – crammed warehouses on this side. Now they are being scrapped because the regulations have changed the contents. But the manufacturers collected their wartime profits on them – and they will do it all over again the next time.

There were lots of brilliant ideas for profit making during the war.

One very versatile patriot sold Uncle Sam twelve dozen 48-inch wrenches. Oh, they were very nice wrenches. The only trouble was that there was only one nut ever made that was large enough for these wrenches. That is the one that holds the turbines at Niagara Falls. Well, after Uncle Sam had bought them and the manufacturer had pocketed the profit, the wrenches were put on freight cars and shunted all around the United States in an effort to find a use for them. When the Armistice was signed it was indeed a sad blow to the wrench manufacturer. He was just about to make some nuts to fit the wrenches. Then he planned to sell these, too, to your Uncle Sam.

Still another had the brilliant idea that colonels shouldn’t ride in automobiles, nor should they even ride on horseback. One has probably seen a picture of Andy Jackson riding in a buckboard. Well, some 6,000 buckboards were sold to Uncle Sam for the use of colonels! Not one of them was used. But the buckboard manufacturer got his war profit.

The shipbuilders felt they should come in on some of it, too. They built a lot of ships that made a lot of profit. More than $3,000,000,000 worth. Some of the ships were all right. But $635,000,000 worth of them were made of wood and wouldn’t float! The seams opened up – and they sank. We paid for them, though. And somebody pocketed the profits.

It has been estimated by statisticians and economists and researchers that the war cost your Uncle Sam $52,000,000,000. Of this sum, $39,000,000,000 was expended in the actual war itself. This expenditure yielded $16,000,000,000 in profits. That is how the 21,000 billionaires and millionaires got that way. This $16,000,000,000 profits is not to be sneezed at. It is quite a tidy sum. And it went to a very few.

The Senate (Nye) committee probe of the munitions industry and its wartime profits, despite its sensational disclosures, hardly has scratched the surface.

Even so, it has had some effect. The State Department has been studying “for some time” methods of keeping out of war. The War Department suddenly decides it has a wonderful plan to spring. The Administration names a committee – with the War and Navy Departments ably represented under the chairmanship of a Wall Street speculator – to limit profits in war time. To what extent isn’t suggested. Hmmm. Possibly the profits of 300 and 600 and 1,600 per cent of those who turned blood into gold in the World War would be limited to some smaller figure.

Apparently, however, the plan does not call for any limitation of losses – that is, the losses of those who fight the war. As far as I have been able to ascertain there is nothing in the scheme to limit a soldier to the loss of but one eye, or one arm, or to limit his wounds to one or two or three. Or to limit the loss of life.

There is nothing in this scheme, apparently, that says not more than 12 per cent of a regiment shall be wounded in battle, or that not more than 7 per cent in a division shall be killed.

Of course, the committee cannot be bothered with such trifling matters.

WHO PAYS THE BILLS?

Who provides the profits – these nice little profits of 20, 100, 300, 1,500 and 1,800 per cent? We all pay them – in taxation. We paid the bankers their profits when we bought Liberty Bonds at $100.00 and sold them back at $84 or $86 to the bankers. These bankers collected $100 plus. It was a simple manipulation. The bankers control the security marts. It was easy for them to depress the price of these bonds. Then all of us – the people – got frightened and sold the bonds at $84 or $86. The bankers bought them. Then these same bankers stimulated a boom and government bonds went to par – and above. Then the bankers collected their profits.

But the soldier pays the biggest part of the bill.

If you don’t believe this, visit the American cemeteries on the battlefields abroad. Or visit any of the veterans’ hospitals in the United States. On a tour of the country, in the midst of which I am at the time of this writing, I have visited eighteen government hospitals for veterans. In them are a total of about 50,000 destroyed men – men who were the pick of the nation eighteen years ago. The very able chief surgeon at the government hospital at Milwaukee, where there are 3,800 of the living dead, told me that mortality among veterans is three times as great as among those who stayed at home.

Boys with a normal viewpoint were taken out of the fields and offices and factories and classrooms and put into the ranks. There they were remolded; they were made over; they were made to “about face”; to regard murder as the order of the day. They were put shoulder to shoulder and, through mass psychology, they were entirely changed. We used them for a couple of years and trained them to think nothing at all of killing or of being killed.

Then, suddenly, we discharged them and told them to make another “about face”! This time they had to do their own readjustment, sans [without] mass psychology, sans officers’ aid and advice and sans nation-wide propaganda. We didn’t need them any more. So we scattered them about without any “three-minute” or “Liberty Loan” speeches or parades. Many, too many, of these fine young boys are eventually destroyed, mentally, because they could not make that final “about face” alone.

In the government hospital in Marion, Indiana, 1,800 of these boys are in pens! Five hundred of them in a barracks with steel bars and wires all around outside the buildings and on the porches. These already have been mentally destroyed. These boys don’t even look like human beings. Oh, the looks on their faces! Physically, they are in good shape; mentally, they are gone.

There are thousands and thousands of these cases, and more and more are coming in all the time. The tremendous excitement of the war, the sudden cutting off of that excitement – the young boys couldn’t stand it.

That’s a part of the bill. So much for the dead – they have paid their part of the war profits. So much for the mentally and physically wounded – they are paying now their share of the war profits. But the others paid, too – they paid with heartbreaks when they tore themselves away from their firesides and their families to don the uniform of Uncle Sam – on which a profit had been made. They paid another part in the training camps where they were regimented and drilled while others took their jobs and their places in the lives of their communities. They paid for it in the trenches where they shot and were shot; where they were hungry for days at a time; where they slept in the mud and the cold and in the rain – with the moans and shrieks of the dying for a horrible lullaby.

But don’t forget – the soldier paid part of the dollars and cents bill too.

Up to and including the Spanish-American War, we had a prize system, and soldiers and sailors fought for money. During the Civil War they were paid bonuses, in many instances, before they went into service. The government, or states, paid as high as $1,200 for an enlistment. In the Spanish-American War they gave prize money. When we captured any vessels, the soldiers all got their share – at least, they were supposed to. Then it was found that we could reduce the cost of wars by taking all the prize money and keeping it, but conscripting [drafting] the soldier anyway. Then soldiers couldn’t bargain for their labor. Everyone else could bargain, but the soldier couldn’t.

Napoleon once said,

“All men are enamored of decorations…they positively hunger for them.”

So by developing the Napoleonic system – the medal business – the government learned it could get soldiers for less money, because the boys liked to be decorated. Until the Civil War there were no medals. Then the Congressional Medal of Honor was handed out. It made enlistments easier. After the Civil War no new medals were issued until the Spanish-American War.

In the World War, we used propaganda to make the boys accept conscription. They were made to feel ashamed if they didn’t join the army.

So vicious was this war propaganda that even God was brought into it. With few exceptions our clergymen joined in the clamor to kill, kill, kill. To kill the Germans. God is on our side…it is His will that the Germans be killed.

And in Germany, the good pastors called upon the Germans to kill the allies…to please the same God. That was a part of the general propaganda, built up to make people war conscious and murder conscious.

Beautiful ideals were painted for our boys who were sent out to die. This was the “war to end all wars.” This was the “war to make the world safe for democracy.” No one mentioned to them, as they marched away, that their going and their dying would mean huge war profits. No one told these American soldiers that they might be shot down by bullets made by their own brothers here. No one told them that the ships on which they were going to cross might be torpedoed by submarines built with United States patents. They were just told it was to be a “glorious adventure.”

Thus, having stuffed patriotism down their throats, it was decided to make them help pay for the war, too. So, we gave them the large salary of $30 a month.

All they had to do for this munificent sum was to leave their dear ones behind, give up their jobs, lie in swampy trenches, eat canned willy (when they could get it) and kill and kill and kill…and be killed.

But wait!

Half of that wage (just a little more than a riveter in a shipyard or a laborer in a munitions factory safe at home made in a day) was promptly taken from him to support his dependents, so that they would not become a charge upon his community. Then we made him pay what amounted to accident insurance – something the employer pays for in an enlightened state – and that cost him $6 a month. He had less than $9 a month left.

Then, the most crowning insolence of all – he was virtually blackjacked into paying for his own ammunition, clothing, and food by being made to buy Liberty Bonds. Most soldiers got no money at all on pay days.

We made them buy Liberty Bonds at $100 and then we bought them back – when they came back from the war and couldn’t find work – at $84 and $86. And the soldiers bought about $2,000,000,000 worth of these bonds!

Yes, the soldier pays the greater part of the bill. His family pays too. They pay it in the same heart-break that he does. As he suffers, they suffer. At nights, as he lay in the trenches and watched shrapnel burst about him, they lay home in their beds and tossed sleeplessly – his father, his mother, his wife, his sisters, his brothers, his sons, and his daughters.

When he returned home minus an eye, or minus a leg or with his mind broken, they suffered too – as much as and even sometimes more than he. Yes, and they, too, contributed their dollars to the profits that the munitions makers and bankers and shipbuilders and manufacturers and speculators made. They, too, bought Liberty Bonds and contributed to the profit of the bankers after the Armistice in the hocus-pocus of manipulated Liberty Bond prices.

And even now the families of the wounded men and of the mentally broken and those who never were able to readjust themselves are still suffering and still paying.

HOW TO SMASH THIS RACKET!

WELL, it’s a racket, all right.

A few profit – and the many pay. But there is a way to stop it. You can’t end it by disarmament conferences. You can’t eliminate it by peace parleys at Geneva. Well-meaning but impractical groups can’t wipe it out by resolutions. It can be smashed effectively only by taking the profit out of war.

The only way to smash this racket is to conscript capital and industry and labor before the nation’s manhood can be conscripted. One month before the Government can conscript the young men of the nation – it must conscript capital and industry and labor. Let the officers and the directors and the high-powered executives of our armament factories and our munitions makers and our shipbuilders and our airplane builders and the manufacturers of all the other things that provide profit in war time, as well as the bankers and the speculators, be conscripted – to get $30 a month, the same wage as the lads in the trenches get.

Let the workers in these plants get the same wages – all the workers, all presidents, all executives, all directors, all managers, all bankers – yes, and all generals and all admirals and all officers and all politicians and all government office holders – everyone in the nation be restricted to a total monthly income not to exceed that paid to the soldier in the trenches!

Let all these kings and tycoons and masters of business and all those workers in industry and all our senators and governors and mayors pay half of their monthly $30 wage to their families and pay war risk insurance and buy Liberty Bonds.

Why shouldn’t they?

They aren’t running any risk of being killed or of having their bodies mangled or their minds shattered. They aren’t sleeping in muddy trenches. They aren’t hungry. The soldiers are!

Give capital and industry and labor thirty days to think it over and you will find, by that time, there will be no war. That will smash the war racket – that and nothing else.

Maybe I am a little too optimistic. Capital still has some say. So capital won’t permit the taking of the profit out of war until the people – those who do the suffering and still pay the price – make up their minds that those they elect to office shall do their bidding, and not that of the profiteers.

Another step necessary in this fight to smash the war racket is the limited plebiscite to determine whether a war should be declared. A plebiscite not of all the voters but merely of those who would be called upon to do the fighting and dying. There wouldn’t be very much sense in having a 76-year-old president of a munitions factory or the flat-footed head of an international banking firm or the cross-eyed manager of a uniform manufacturing plant – all of whom see visions of tremendous profits in the event of war – voting on whether the nation should go to war or not. They never would be called upon to shoulder arms – to sleep in a trench and to be shot. Only those who would be called upon to risk their lives for their country should have the privilege of voting to determine whether the nation should go to war.

There is ample precedent for restricting the voting to those affected. Many of our states have restrictions on those permitted to vote. In most, it is necessary to be able to read and write before you may vote. In some, you must own property. It would be a simple matter each year for the men coming of military age to register in their communities as they did in the draft during the World War and be examined physically. Those who could pass and who would therefore be called upon to bear arms in the event of war would be eligible to vote in a limited plebiscite. They should be the ones to have the power to decide – and not a Congress few of whose members are within the age limit and fewer still of whom are in physical condition to bear arms. Only those who must suffer should have the right to vote.

A third step in this business of smashing the war racket is to make certain that our military forces are truly forces for defense only.

At each session of Congress the question of further naval appropriations comes up. The swivel-chair admirals of Washington (and there are always a lot of them) are very adroit lobbyists. And they are smart. They don’t shout that “We need a lot of battleships to war on this nation or that nation.” Oh no. First of all, they let it be known that America is menaced by a great naval power. Almost any day, these admirals will tell you, the great fleet of this supposed enemy will strike suddenly and annihilate 125,000,000 people. Just like that. Then they begin to cry for a larger navy. For what? To fight the enemy? Oh my, no. Oh, no. For defense purposes only.

Then, incidentally, they announce maneuvers in the Pacific. For defense. Uh, huh.

The Pacific is a great big ocean. We have a tremendous coastline on the Pacific. Will the maneuvers be off the coast, two or three hundred miles? Oh, no. The maneuvers will be two thousand, yes, perhaps even thirty-five hundred miles, off the coast.

The Japanese, a proud people, of course will be pleased beyond expression to see the United States fleet so close to Nippon’s shores. Even as pleased as would be the residents of California were they to dimly discern, through the morning mist, the Japanese fleet playing at war games off Los Angeles.

The ships of our navy, it can be seen, should be specifically limited, by law, to within 200 miles of our coastline. Had that been the law in 1898 the Maine would never have gone to Havana Harbor. She never would have been blown up. There would have been no war with Spain with its attendant loss of life. Two hundred miles is ample, in the opinion of experts, for defense purposes. Our nation cannot start an offensive war if its ships can’t go further than 200 miles from the coastline. Planes might be permitted to go as far as 500 miles from the coast for purposes of reconnaissance. And the army should never leave the territorial limits of our nation.

To summarize:

Three steps must be taken to smash the war racket.

We must take the profit out of war.

We must permit the youth of the land who would bear arms to decide whether or not there should be war.

We must limit our military forces to home defense purposes.

TO HELL WITH WAR!

I am not such a fool as to believe that war is a thing of the past. I know the people do not want war, but there is no use in saying we cannot be pushed into another war.

Looking back, Woodrow Wilson was re-elected president in 1916 on a platform that he had “kept us out of war” and on the implied promise that he would “keep us out of war.” Yet, five months later he asked Congress to declare war on Germany.

In that five-month interval the people had not been asked whether they had changed their minds. The 4,000,000 young men who put on uniforms and marched or sailed away were not asked whether they wanted to go forth to suffer and die.

Then what caused our government to change its mind so suddenly?

Money.

An allied commission, it may be recalled, came over shortly before the war declaration and called on the President. The President summoned a group of advisers. The head of the commission spoke. Stripped of its diplomatic language, this is what he told the President and his group:

“There is no use kidding ourselves any longer. The cause of the allies is lost. We now owe you (American bankers, American munitions makers, American manufacturers, American speculators, American exporters) five or six billion dollars.

“If we lose (and without the help of the United States we must lose) we, England, France and Italy, cannot pay back this money…and Germany won’t. So…”

Had secrecy been outlawed as far as war negotiations were concerned, and had the press been invited to be present at that conference, or had radio been available to broadcast the proceedings, America never would have entered the World War. But this conference, like all war discussions, was shrouded in utmost secrecy. When our boys were sent off to war they were told it was a “war to make the world safe for democracy” and a “war to end all wars.”

Well, eighteen years after, the world has less of democracy than it had then. Besides, what business is it of ours whether Russia or Germany or England or France or Italy or Austria live under democracies or monarchies? Whether they are Fascists or Communists? Our problem is to preserve our own democracy.

And very little, if anything, has been accomplished to assure us that the World War was really the war to end all wars.

Yes, we have had disarmament conferences and limitations of arms conferences. They don’t mean a thing. One has just failed; the results of another have been nullified. We send our professional soldiers and our sailors and our politicians and our diplomats to these conferences. And what happens?

The professional soldiers and sailors don’t want to disarm. No admiral wants to be without a ship. No general wants to be without a command. Both mean men without jobs. They are not for disarmament. They cannot be for limitations of arms. And at all these conferences, lurking in the background but all-powerful, just the same, are the sinister agents of those who profit by war. They see to it that these conferences do not disarm or seriously limit armaments.

The chief aim of any power at any of these conferences has not been to achieve disarmament to prevent war but rather to get more armament for itself and less for any potential foe.

There is only one way to disarm with any semblance of practicability. That is for all nations to get together and scrap every ship, every gun, every rifle, every tank, every war plane. Even this, if it were possible, would not be enough.

The next war, according to experts, will be fought not with battleships, not by artillery, not with rifles and not with machine guns. It will be fought with deadly chemicals and gases.

Secretly each nation is studying and perfecting newer and ghastlier means of annihilating its foes wholesale. Yes, ships will continue to be built, for the shipbuilders must make their profits. And guns still will be manufactured and powder and rifles will be made, for the munitions makers must make their huge profits. And the soldiers, of course, must wear uniforms, for the manufacturers must make their war profits too.

But victory or defeat will be determined by the skill and ingenuity of our scientists.

If we put them to work making poison gas and more and more fiendish mechanical and explosive instruments of destruction, they will have no time for the constructive job of building greater prosperity for all peoples. By putting them to this useful job, we can all make more money out of peace than we can out of war – even the munitions makers.

So…I say,

TO HELL WITH WAR!

h1

The Unbearable Lightness of Being Tony Blair

June 4, 2013

By Matthew Carr / December 3rd, 2009

At some point in the New Year Tony Blair will appear before the Chilcot Inquiry, established by the British government to assess the historical ‘lessons’ of the Iraq war. Few individuals bear more responsibility for the invasion and its calamitous aftermath than Blair. Not only was his single-minded determination crucial in bringing his own country into the war, but his close political relationship with the Bush administration also helped US hawks present the case for war to a skeptical American public.

Tony Blair

The consequences of this intervention are well known: hundreds of thousands of Iraqi deaths and four million refugees and internally displaced persons; thousands of British and American soldiers killed or wounded; an Iraqi society devastated by war and counterinsurgency, by criminal and terrorist violence, ethnic cleansing and death squads; a neo-colonial occupation marked by torture and brutality and barely credible levels of financial corruption and incompetence.

All these consequences constitute one of the most extraordinary disasters – and one of the greatest crimes – in British political history. Yet the man who did so much to make this disaster possible has yet to be held accountable. The Chilcot Inquiry is unlikely to make much progress in this direction. Sir John Chilcot has already made it clear that his inquiry does not intend to ‘apportion blame’, and his commission contains two of Blair’s self-professed admirers. Blair himself will undoubtedly be at his slick, Teflon-like best, indignant at any suggestion of lowly motives behind his actions or slurs on his ‘reputation’. But accountability is necessary, and not only because of Iraq. As one of the most militaristic prime ministers in British history, Blair is an emblem of the new imperial violence of the 21st century. More than any other Western leader, he embodies the oxymoronic fantasy of ‘humanitarian’ warfare and the doctrine of liberal interventionism that makes such wars possible.

The Liberal Crusader

Posterity will struggle to unravel the disconcerting combination of evangelical moral fervour, cynicism, narcissism and duplicity that marks Blair’s trajectory on the world stage. Blair has always attributed his decisions as a leader to a principled determination to ‘do the right thing’. Like Margaret Thatcher before him, he has often presented himself as a conviction politician, but he has also shown an almost plaintive desire to be admired as a noble and heroic figure, grappling with difficult decisions at the lonely summits of power. At a ‘National Prayer Breakfast’ for Barack Obama earlier this year, he told the incoming president:

When I was Prime Minister I had cause often to reflect on leadership. Courage in leadership is not simply about having the nerve to take difficult decisions or even in doing the right thing, since oftentimes God alone knows what the right thing is. It is to be in our natural state – which is one of nagging doubt, imperfect knowledge, and uncertain prediction – and to be prepared nonetheless to put on the mantle of responsibility and to stand up in full view of the world, to step out when others step back, to assume the loneliness of the final decision-maker, not sure of success but unsure of it.

The mixture of fake humility, narcissism and self-congratulation is vintage Blair. At no time in his premiership did he give any indication that the knowledge that informed his actions might be ‘imperfect’ or incomplete. His speeches and interviews are punctuated with expressions such as ‘I believe that…’ or ‘I have absolutely no doubt that…’ to presage even the most dubious or tendentious claims, as if the mere fact that he believed them was sufficient proof of their truthfulness.

Where George Bush cultivated a more folksy sincerity, Blair was always more eloquent, sure-footed and plausible, with an ability to appeal to very different audiences and constituencies. These qualities were already evident during Blair’s first appearance as a principled liberal interventionist, during the NATO bombing of Serbia in March 1999 – a war that Blair described in typically Manichean style as ‘a battle between good and evil, between civilisation and barbarism.’ In his famous ‘doctrine of international community’ speech, delivered in Chicago in April 1999, he described the NATO campaign as the product of a new concept of ‘international community’, in which states no longer pursued the selfish national interests of the past but were ‘guided by a more subtle blend of mutual self-interest and moral purpose in defending the values we cherish’. These principles ignored the fact that NATO had launched the war in order to bypass the United Nations, the body that actually represented the ‘international community.’ When Blair evoked ‘the tear stained faces of the hundreds of thousands of refugees streaming across the border’ from Kosovo as a justification for intervention, he did not mention that this exodus had taken place after the war had begun, when the Serbian president Slobodan Milosevic, with characteristic ruthlessness, ordered the mass expulsion of 800,000 Kosovars in retaliation for the NATO bombings.

Blair’s statement of principle also ignored the fact that the war was essentially a gamble – even a reckless one – that was relatively cost-free to those who launched it. Both Blair and Clinton had assumed that air power would force Milosevic into an early surrender without the need to commit troops on the ground. When this did not happen, NATO began to escalate its bombing raids and air strikes on Serbian cities and economic ‘infrastructure’. Had Milosevic not capitulated on 11 June, NATO would have been forced to intensify the bombing of Serbian cities and carry out a ground invasion, and the notion of a humanitarian war might have looked even more threadbare.

Coming at the end of a grim decade punctuated by bloody catastrophes in the Balkans and Rwanda, the war was nevertheless widely supported across the British political spectrum. Kosovo crystallised an emerging consensus amongst conservative and liberal writers alike, which argued that Western – and more specifically American – military power could be used for moral and humanitarian purposes.

Few people were more seduced by what he called the ‘imperfect instrument’ of military power than Blair. It was in this period that Blair’s self-belief began to mutate into something more messianic, and his sense of his own greatness was matched by equally grandiose aspirations for his country. In December 1999, he called for Britain to become a ‘beacon’ to the rest of the world that would ‘stand up for justice and carry the torch of freedom everywhere where there is injustice and conflict, whether in Kosovo or East Timor.’

Even then, there were contradictions in this agenda. In 1997, Blair overruled an attempt by his Foreign Secretary Robin Cook to ban the sale of Hawk jets to Indonesia as part of an ‘ethical foreign policy’ limiting arms sales to regimes with poor human rights records. Few countries had records as bad as Indonesia’s, but Cook’s interpretation of ‘ethical’ was at odds with Blair’s commitment to British Aerospace (BAE) – a commitment that would later lead him to block a Serious Fraud Office investigation into alleged malfeasance in the company’s dealings with Saudi Arabia. In 2000, BAE sold Hawks to Robert Mugabe’s regime in Zimbabwe. Two years later, Blair personally helped persuade India to buy sixty Hawk jets at a time when India and Pakistan were on the brink of full-scale war. Nor did Blair’s moral commitment to human rights prevent him from supporting Vladimir Putin’s brutal assault on Chechnya, a quid pro quo for Russian acquiescence in NATO’s war in Kosovo.

Blair himself had always qualified the idealistic component of his ‘doctrine of international community’ by arguing that the decision over whether to take military action should be dependent not just on whether force was morally desirable, but on whether it was practically feasible and in the national interest. Both conditions appeared to be present in the fortuitous British intervention in the Sierra Leone civil war in May 2000, when a small contingent of 1,000 troops was sent to evacuate British nationals and inadvertently helped to stabilize the country and bring its deposed president back to power.

The Road to Iraq

Blair’s sense of what was possible and desirable was radically altered by the 11 September attacks on the United States. The attacks brought all Blair’s messianic instincts to the fore, so that he seemed to see himself as an indispensable figure in a world-historic drama. In the weeks after 9/11, he briefly became the Pied Piper of the war on terror, travelling back and forth across the world in an attempt to rally international support behind US military action against what he called the ‘new evil’ of ‘mass terrorism’.

This urgency was not accompanied by any evidence of original or independent insight into the phenomenon that he described. In a speech to the Labour Party conference on 2 October 2001, with American-led forces only days away from a military assault on Afghanistan to topple the Taliban, Blair raised the question of whether an attempt should be made to ‘understand the causes of terror’. He immediately rejected the ‘moral ambiguity’ that such an effort might involve, since ‘nothing could ever justify the events of September 11 and it is to turn justice on its head to pretend it could.’

Very few people were attempting to ‘justify’ the attacks, but not everyone was prepared to attribute them to metaphysical evil. Some in Blair’s own party had reservations about the impact of the bombing on Afghan civilians. Blair insisted: ‘The action we take will be proportionate; targeted; we will do all we humanly can to avoid civilian casualties.’ His rhetoric then reached visionary heights, with the promise that

The starving, the wretched, the dispossessed, the ignorant, those living in want and squalor from the deserts of North Africa to the slums of Gaza, to the mountains of Afghanistan: they too are our cause. This is a moment to seize. The kaleidoscope has been shaken. The pieces are in flux. Soon they will settle again. Before they do, let us reorder this world around us.

These pronouncements were partly intended to legitimize an American-led military operation whose objectives were not nearly as ambitious or utopian as he described. Blair’s invocation of a new 21st-century white man’s burden was strongly influenced by the more hard-headed ‘defensive imperialism’ propounded by the Foreign Office intellectual Robert Cooper, one of the few career officials in Blair’s inner circle. Cooper first came to public attention in April 2002, when he published an article in The Observer newspaper in which he argued that ‘post imperial, postmodern states’ were obliged to use ‘double standards’ in dealing with rogue or failed states in ‘zones of chaos’ such as Afghanistan, which might require ‘the rougher methods of an earlier era – force, pre-emptive attack, deception, whatever is necessary to deal with those who still live in the nineteenth century world of every state for itself.’

This was a fairly exact description of Blair’s own worldview. In a speech to the Lord Mayor’s Banquet in London in November 2002, he argued that the war on terror was a ‘new kind of war’ that could be directed against specific states as well as terrorist groups, since ‘States which are failed, which repress their people brutally, in which notions of democracy and the rule of law are alien, share the same absence of rational boundaries to their actions as the terrorist.’ Stripped of its contemporary references to terrorists, rogue states and WMD, this rhetoric reprised an old trope from British imperial history – that of the ‘mad’ foreigner who can only be subdued by civilising violence. One of the states where Blair observed an ‘absence of rational boundaries’ was Iraq, in an early indication of his willingness to comply with the new agenda that was beginning to take shape in Washington.

There is no space here to analyse in detail the devious and duplicitous strategies through which Blair manoeuvred his country into the Iraq war, but it is worth recalling the broad contours of this process. Blair’s support for American military action in Iraq was already evident as early as March 2002, according to a leaked memo written by his special foreign policy adviser David Manning after a visit to the White House. In it Manning informed Blair that he had assured the Bush administration that ‘you would not budge in your support for regime change, but you had to manage a press, a Parliament and a public opinion that was very different from anything in the United States’.

These differences are crucial to understanding Blair’s political strategy in the long build-up to war. For more than a year, he repeatedly denied that military action in Iraq was inevitable and insisted that he was merely trying to get the United Nations to pressure Saddam to disarm. Subsequent leaked documents, such as the ‘Downing Street memo’, make it clear that the ‘UN route’ was not intended to avert war but to create the conditions in which war became inevitable. To ensure this outcome, Iraq policy was directed by Blair and a small coterie of special advisers, who systematically and relentlessly set out to terrify the British public and present Iraq as a clear and present danger to British national security. In his foreword to the September 2002 dossier entitled “Iraq’s Weapons of Mass Destruction: The Assessment of the British Government,” Blair declared:

What I believe the assessed intelligence has established beyond doubt is that Saddam has continued in his efforts to produce chemical and biological weapons, that he continues in his efforts to produce nuclear weapons, and that he has been able to extend the range of his ballistic missile programme… I am in no doubt that the threat is serious and current.

It is impossible to know if Blair really believed these declarations, partly because it is difficult to disentangle what he actually believed from what was politically convenient, and also because his own slippery and often contradictory explanations shifted once it became apparent that these claims were false. Carne Ross, a British diplomat at the UN with long experience of Iraq who resigned in protest at the war, later told a parliamentary committee: ‘I knew that evidence they were presenting for WMD was totally implausible… All my colleagues knew that too.’ Blair’s disenchanted Secretary of State for International Development, Clare Short, also claimed that she was told by Defence Intelligence Staff (DIS) officials that any chemical or biological material that Iraq possessed ‘almost certainly wasn’t weaponised’. In his resignation speech in protest at the war, Robin Cook declared emphatically that no intelligence information he had ever seen had claimed that Saddam possessed WMD.

Why was Blair so certain when others were so doubtful? Why was his government forced to draw on intelligence material of dubious and even laughable quality, from crude forgeries to an outdated PhD thesis plagiarised from the internet, to prove its case? Why did it control the flow of information to the point where Blair’s own cabinet was barely informed on a range of crucial issues, such as the Attorney General’s 13-page legal opinion on the legality of the war, which was whittled down to a 300-word summary?

These questions have yet to be conclusively answered. On 30 September 2003, amid mounting criticism of the post-invasion chaos in Iraq, Blair attempted to explain his decision to support military action by asking the Labour Party conference to imagine the dilemma in which he found himself after 9/11:

I believe the security threat of the 21st century is not countries waging conventional war. I believe that in today’s interdependent world the threat is chaos. It is fanaticism defeating reason. Suppose the terrorists repeated September 11 or worse. Suppose they got hold of a chemical or nuclear dirty bomb; and if they could, they would. What then? And if this is the threat of the 21st century, Britain should be in there helping confront it, not because we are America’s poodle, but because dealing with it will make Britain safer.

It is impossible to know how much this explanation really described his state of mind before the war. But it did not explain why military action was necessary against a regime that did not have the weapons he described. Even as Blair’s inner circle talked up the threat of Iraq publicly, they often struggled to understand the urgency themselves. In a diary entry on 3 September 2002, Blair’s pugnacious press officer Alastair Campbell records a discussion about Iraq which raised the questions ‘Why now? What was it that we knew now that we didn’t before that made us believe we had to do it now?’ The answer comes from Blair, who says that ‘dealing with Saddam was the right thing to do’ and was ‘definitely worth doing.’

According to Campbell, Blair was convinced that ‘it would be folly for Britain to go against the US on a fundamental policy, and he really believed in getting rid of bad people like Saddam’. Blair may have been sincere in his detestation of Saddam’s regime. But such loathing was not matched by any awareness of the politics and history that made his regime possible – or the potential consequences of its downfall. The Guardian correspondent Jonathan Steele describes how Blair was visited at Downing Street shortly before the war by three leading British Middle East experts, who tried to impress on Blair that Iraq was a ‘very complicated country’ with ‘tremendous inter-communal resentments’ that might not be containable if Saddam was overthrown. According to Steele, these arguments made little impact on Blair, who merely replied, ‘But the man’s uniquely evil, isn’t he?’

The three academics were reportedly dumbfounded by this simplistic response. One later described Blair as ‘someone with a very shallow mind, who’s not interested in issues other than the personalities of the top people, no interest in social forces, political trends, etc’. Another recalled his ‘weird mixture of total cynicism and moral fervour.’ This was not the only occasion when Blair was warned of the potentially negative consequences of military intervention, but they did not affect his belief that regime change was ‘right’ or that it would be successful.

Catastrophe

In The March of Folly, a study of disastrous governmental decisions from the Trojan horse to the Vietnam war, the historian Barbara Tuchman noted a recurring tendency to ‘wooden-headedness’ on the part of governments and rulers – a phenomenon she defined as ‘assessing a situation in terms of preconceived fixed notions while ignoring or rejecting any contrary signs.’ These observations can certainly be applied to the Iraq war – and to Blair’s contribution to it. His diplomacy was partially successful in persuading the Bush administration to override its more unilateralist instincts and accept the return of the UN inspectors to Iraq. But things went awry when the other members of the Security Council refused to accept that Iraq was in breach of its resolutions and asked for the inspectors to be given more time – an objective that clashed with the US military timetable.

At this point Blair was forced at last to declare his hand. Blaming the pusillanimity of the French and Germans in not committing themselves to military action, he argued that Iraq was now in breach of the Security Council’s resolutions and that war had become unavoidable. While millions of people marched against the war worldwide, Blair prepared to ‘liberate’ a country he knew almost nothing about, a country that was not a real place but a fantasy onto which he and his acolytes projected their dim sense of moral purpose. In his statement of October 2001 announcing the beginning of the war in Afghanistan, he had insisted that ‘no country lightly commits forces to military action and the inevitable risks involved.’

But few wars have ever been undertaken with such a serene and blissful disregard for the consequences as the Iraq invasion. Blair’s supporters have described the British participation in the war as a noble and principled intervention, but there is little evidence of nobility or principle amongst the coterie of special advisers and officials who made it possible. The former Times editor Peter Stothard’s fly-on-the-wall portrait of Blair and his circle before and immediately after the war reveals men and women with no obvious motives at all, beyond an unquestioning loyalty to their superiors and to ‘Tony’ in particular. They are minions and war flies, floating in the slipstream of American military power, whose excitement at the drama of vicarious warfare is matched by a pervasive cynicism that reveals itself in their own in-house jargon, such as the verb ‘to Kofi’ – a semantic device based on the name of the UN Secretary-General, which Stothard translates as meaning ‘we had better obscure this bit of military planning with a good coat of humanitarian waffle.’

For Blair and his acolytes the moral uplift of humanitarian war cannot be disturbed by dead and wounded bodies, destruction, grief and terror. Theirs is a war of memos, emails and press briefings by mobile phone, fought by bureaucrats, spin doctors and apparatchiks obsessed with avoiding negative newspaper headlines and dictating the news agenda. At the same time these carpeted combat zones are dominated by male officials intoxicated by the long-distance drama of bloodless, telegenic conflict. Blair himself demonstrates an almost boyish enthusiasm for the war. When he asks for ‘bigger maps’ of Iraq to be pinned up in his Downing Street ‘den’, even the faithful Sally Morgan observes that ‘he would really have liked a sandpit with tanks.’ Asked by Stothard how he feels about the ‘deaths of children’ caused by the ‘avoidable act’ of the Iraq invasion, Blair once again manages to turn other people’s tragedies into a testament to his own moral grandeur:

He puts down the fountain pen. Behind his gaze there is a momentary blankness. Aides have spoken of how much he has felt the responsibility of shedding blood. He speaks of being ready ‘to meet my maker’ and answer for ‘those who have died or have been horribly maimed as a result of my decisions’. He accepts that others who share a belief in his maker, who believe in “the same God”, assess that the last judgement will be against him…. He talks of how he has to isolate himself when people are dying from what he has decided he must do. He talks of how he has to put barriers in his mind.

These ‘barriers’ were also evident in the aftermath of the invasion. By January 2004, Iraq was slipping into a vortex of chaotic violence and insurgency; the British ambassador to Iraq, Jeremy Greenstock, later recalled how Blair ‘didn’t want to understand the full horror of what he was hearing from us.’ When the horror became unavoidable, Blair refused to accept any responsibility for it and blamed anyone else, whether it was al Qaeda, local terrorists or neighbouring countries such as Syria and Iran. In April 2004 fifty-two former British ambassadors wrote an unprecedented open letter to the prime minister, which pointed out that

The conduct of the war in Iraq has made it clear that there was no effective plan for the post-Saddam settlement. All those with experience of the area predicted that the occupation of Iraq by the Coalition forces would meet serious and stubborn resistance, as has proved to be the case. To describe the resistance as led by terrorists, fanatics and foreigners is neither convincing nor helpful.

Blair has never accepted such criticisms. Year after year, he continued to reiterate the same refrain – that he was ‘not sorry’ for getting rid of Saddam – while ignoring or downplaying the consequences that followed. He showed a similar dishonesty as the repercussions of the Iraq war began to reach Britain. In 2004, a Home Office and Foreign Office report concluded that the risk of terrorist attacks in the UK had significantly increased as a result of the Iraq war and that many British Muslims had become disillusioned by ‘a perceived “double standard” in the foreign policy of western governments, in particular Britain and the US’.

These conclusions were echoed by both mainstream security analysts and intelligence agencies – but routinely rejected by Blair himself. On 7 July 2005, these predictions were proven brutally accurate by the suicide bombings on the London Underground during the G8 Summit in Gleneagles. At a press conference that day, Blair delivered his ritual interpretation of such events:

Our determination to defend our values and our way of life is greater than their determination to cause death and destruction to innocent people in a desire to impose extremism on the world. Whatever they do, it is our determination that they will never succeed in destroying what we hold dear in this country and in other civilised nations throughout the world.

The ‘martyrdom videos’ released by the 7/7 attackers left no doubt that their actions were intended as a response to western military action in the Muslim world, and in Iraq in particular. Whatever else can be said about this ‘justification’, it had nothing to do with Blair’s sonorous platitudes. His refusal to accept that his own actions might have made his country less safe may have been due to genuine conviction, but there was always a suggestion of something more cunning and devious behind Blair’s description of himself as ‘a pretty straight sort of guy’.

Blair has always been an unwavering supporter of Israel. Throughout Israel’s bombardment of Lebanon in July-August 2006 he publicly deplored what he called the humanitarian ‘catastrophe’ caused by the war, while refusing to support a ceasefire that might have brought this catastrophe to an end, in order to give Israel more time to achieve its war aims and crush Hezbollah.

On 18 July 2006 a microphone at the G8 Summit inadvertently recorded a conversation about Lebanon between Blair and George Bush, in which the two men criticized Kofi Annan’s attempts to broker a ceasefire. When Bush tells Blair that his Secretary of State Condoleezza Rice will shortly be going to Lebanon to discuss ways of bringing the war to an end, Blair offers to go himself to ‘prepare the ground’, arguing that ‘if she goes she might have to succeed, as it were, whereas if I went I could just talk.’

It is worth pausing to consider the implications of this astounding statement. Here is Blair the great humanitarian crusader, offering himself as a peace envoy not to secure a peace agreement but so that he can ‘just talk’ – and prolong the war. The same devious duplicity has been evident in his role as the Quartet’s envoy to the Middle East. Though Blair has presented himself as an honest broker in the Israeli-Palestinian conflict, he has remained as supportive of Israeli interests as he was during his time in office.

Given the task of ‘strengthening Palestinian institutions’, he colluded in the American-Israeli-EU blockade imposed on Hamas in Gaza. The man who had once hailed the ‘slums of Gaza’ as ‘our cause’ often expressed his concern at the impact of Israeli restrictions on the inhabitants of the Gaza Strip, but his few public pronouncements on this issue made it clear that he believed that Hamas, not Israel, was ultimately responsible for them. In December 2008 he gave an interview to the newspaper Ha’aretz which made it clear that he was aware that a major Israeli military action in Gaza was being planned. When Israel launched Operation Cast Lead at the end of that month, his silence was broken only by the usual expressions of humanitarian concern, which studiously avoided any criticism of the military action itself.

Such behaviour may explain why Blair was awarded a $1 million prize by Israel’s Dan David Foundation in 2009 for ‘his exceptional leadership and steadfast determination in helping to engineer agreements and forge lasting solutions to areas in conflict’. In January of the same year, he received the Presidential Medal of Freedom from the departing George Bush in recognition of his efforts to promote “democracy, human rights and peace abroad”, together with a Congressional Gold Medal bearing his own slogan, ‘our real weapons are not our guns but our beliefs.’ Blair has also accrued rewards of a less symbolic kind. Within months of leaving office he was recruited as an adviser to JP Morgan Chase, with an annual salary of £1m, followed by a similar appointment at Zurich Financial Services that netted another £500,000. That same year he was appointed special envoy to the Middle East for the UN-US-Russia-EU Quartet. Today he is reportedly the highest-paid public speaker in the world, charging up to £400,000 for half-hour speeches on the international lecture circuit.

In addition to appearing on TV chat shows and radio programmes and delivering lectures on various continents, Blair has maintained a frenetic international schedule that at times seems to make him a ubiquitous presence. He also has a cyber-presence on MySpace and Facebook, where visitors can buy copies of an imprint of his hand to raise money for charity (‘an awesome item for fans of Tony’). His two charitable institutions, the Tony Blair Faith Foundation and the Tony Blair Sports Foundation, have been associated with a range of issues, from climate change and malaria to child obesity and interfaith dialogue.

Faith has become a dominant theme in the new career of a politician who famously did not ‘do God’ while in office. Shortly after his resignation, Blair converted to Catholicism – a conversion that has dovetailed seamlessly with his relentless acquisition of wealth. In April 2009, he explained the purpose of his Faith Foundation to the Toronto Star in the following terms:

It is true there are two faces of faith: one reactionary, extreme, occasionally violent; the other, compassion, love, fellowship and solidarity. So the task for the foundation is: first, to help people understand different faiths better so they can understand different cultures more fully; and, second, to promote faith as part of progress and reconciliation, not a focus for conflict and sectarian divisions.

Which of these two ‘faces’ belongs to Blair himself is open to question. In April 2009, he delivered an unrepentant speech in Chicago that revisited his ‘doctrine of international community’ and accused Iran of sponsoring or ideologically supporting terrorism across the world, from Mumbai to Somalia. Blair insisted that the West should continue to use hard and soft power against a terrorist enemy that ‘kills the innocent’ and ‘creates chaos in a world which increasingly works through confidence and stability.’ Nowhere in Blair’s speech was there any recognition of the chaos generated by the ‘interventions’ that he had promoted so avidly and continued to insist on. There were only the same simplistic binary formulations, the same sanctimonious paeans to ‘our’ values, the same ability to harness grand moral principles to current American propaganda tropes.

Blair’s sense of his own greatness is clearly impervious to self-doubt, and he now appears to believe that he is God’s instrument on earth. Others appear to see him in the same way. In August 2009, he took time off from a holiday on the software billionaire Larry Ellison’s yacht to deliver a speech to the prestigious Communion and Liberation conference at Rimini in which he condemned ‘the restless search for short-term material gain in a globalised economic system’. Incredibly, in October 2009, Blair was proposed by the British government as a candidate for the first president of the European Council. His supporters claimed that Blair’s star quality would create a ‘motorcade effect’ that would be beneficial to Europe. Many of Blair’s compatriots breathed a sigh of relief when European leaders took a different view, but his supporters are clearly as besotted with their hero as they ever were. And whatever conclusions the Chilcot Inquiry reaches, the triumph of this vain, hollow and dangerous man is a bleak reflection of his times, in which, as Yeats once wrote in a different context:

The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity.

 

The Chilcot Inquiry

http://www.iraqinquiry.org.uk/

 

h1

Django Unchained: The D is silent.

March 4, 2013



by C. Liegh McInnis

It was never correct to group much of African American-themed cinema under the heading “Blaxploitation.” The latest blockbuster of the genre deserves to be examined as serious social commentary. “Django Unchained exposes well the complex classes of slaves, the complex relationships between slaves, and the complex relationships between slaves and whites within the ‘peculiar institution.'”

Django Unchained: Don’t Miss What’s Truly Important Because of the Smoke and Mirrors

by C. Liegh McInnis

“Stephen is the example of the calculating, critical thinking slave who learns/masters the plantation system/culture and manipulates it to his good fortune regardless of whom he must hurt.”

Django Unchained is a very good, possibly great, Western, if one can stomach Quentin Tarantino’s highly sexualized and gory style. My issue with Tarantino is that most of his films seem to be rooted in or use the white fascination with the exotic and violent black as a trope or backdrop for the sexuality and violence of his movies. However, in this case, Tarantino’s hypersexual and ultraviolent style is a perfectly suited vehicle to show the horrors of slavery, especially the degradation of human beings into chattel for the economic gain and perverse pleasure of white supremacy. That being said, while being a visually moving, if often grotesque, film rooted in sex and violence, Django Unchained exposes well the complex classes of slaves, the complex relationships between slaves, and the complex relationships between slaves and whites within the “peculiar institution.”

Yet, contrary to Ben Daite’s assertion in his review “Django Unchained – The Black, The Beautiful & Ev’thing Ugly,” Django Unchained is not the first or best film to do this. So, when Daite asserts,

“It’s a black hero movie of some sort, a well crafted emancipation epic of a black man and shames the myriad emancipation organisms we have been hitherto inundated with in movies like the coveted Sweet Sweetbacks Badasssss Song by many a black film academics. Who said a black film cannot be bold, hot, intelligent, packed, disturbing and soothing at the same time? No film, like Django Unchained, has ever drawn the moral and physical color line so inadvertently,”

I can only cringe at the fact that Daite allows his desire to love and defend Django Unchained to show just how clueless he is regarding the history of African American films.

“Tarantino’s hypersexual and ultraviolent style is a perfectly suited vehicle to show the horrors of slavery.”

First, while lacking the budget needed to make it as well-polished cinematically as Django Unchained, Sweet Sweetback’s Badasssss Song (1971) is a very good film. In fact, its style is quite revolutionary. The film’s fast-paced montages and jump-cuts were unique features in American cinema at the time and a precursor to the action-packed style for which Daite applauds Django Unchained. Also, the manner in which Sweetback is forced to use his penis constantly as a bargaining tool comments on the American fixation with the black penis (as Tarantino eventually does in the later stage of Django Unchained) and on the notion that far too many African American men fall into the trap of allowing their penis to become a major aspect of their identity. Furthermore, I should not be forced to remind Daite of Sidney Poitier and Harry Belafonte’s Buck and the Preacher (1972), which not only addresses the conflict between African Americans and whites but also addresses the problem of all people of color (in this case African Americans and Native Americans) navigating their issues with each other while engaging their common issues with whites. And, if we understand that Death Wish and the Dirty Harry series are, essentially, urban Westerns because they are driven by the same white supremacist notions of conquering the savages, then we understand that Shaft (1971) also refutes Daite’s historically misinformed assertion. Additionally, Django Unchained does not come close to discussing or drawing “the moral and physical color line” that is drawn, deconstructed, mocked, refashioned, and obliterated in the manner of Blazing Saddles (1974), written by Mel Brooks and Richard Pryor.

Of course, maybe Daite’s assertion of Django Unchained‘s superiority to other African American films is rooted in it being “hot” or “hotter” than other African American films that address race, but someone should remind Mr. Daite that “hotness” is relative and often judged on differing generational criteria. With that said, I seem to remember that most women found Richard Roundtree to be “hot” in his portrayal of John Shaft, and the same is true of Mario Van Peebles’ portrayal of Jesse Lee in Posse, Denzel Washington’s portrayal of Trip in Glory, and, if my memory serves me correctly, more than a few women were brought to a swoon by John Amos’ portrayal in Roots of the African warrior, Kunta Kinte, whose unbreakable desire for freedom and courage to obtain it are the heart and soul of the narrative. But, in fairness to Daite, when he says “hot,” I know that he also means the stylistic production and presentation of the film. Again, to this I respond that “hotness” is relative and often based on generational criteria as well as what the current technology allows a film to do.

“White filmmakers are regularly given larger budgets and more creative control than African American filmmakers.”

Remember, we all have fashion moments in our past and hope that no one ever finds the pictures. The same is true of film. The marvel of fashion and high-tech production, especially special effects, in movies often appears inferior (lame and dated) just ten years later. But the reviews of that time tell us just how “hot” and stylish those effects were then. Still, in any era a film’s “hotness” is directly related to its production budget. Therefore, Django Unchained‘s “hotness” may be more a tribute to the manner in which white filmmakers are regularly given larger budgets and more creative control than African American filmmakers. Let’s not forget that Spike Lee was forced to go with his hat in his hand to African American funders to finish Malcolm X because the studio’s budget wouldn’t produce the epic that Lee sought to make. And even Robert Townsend had to fight with the studio for more money because, as he says, “The amount of the budget determines whether there are five hundred screaming fans after a Five Heartbeats concert or if there are just five people in an empty alley.” So, the style or hotness of the film is not so much about Tarantino as it is about the types of limited budgets African American filmmakers are given even after they have proven themselves to be excellent.

Yet, what’s really flawed about Daite’s review is that he spends so much time trying to convince readers that Tarantino is a bold and revolutionary director just for making Django Unchained that he never fully discusses the most important aspect of the film, which is the juxtaposition and exploration of the various ideologies of slaves, namely the ideological positions of Stephen (Samuel L. Jackson) and Django (Jamie Foxx) as well as Broomhilda (Kerry Washington), when one considers that she is an example of the manner in which various slave classes/ideologies were often created based on the ideology of the plantation where a slave was born or purchased as an infant. (Check the history of Phillis Wheatley.) So if Mr. Daite could remove his head from Tarantino’s ass and stop making jabs at Lee long enough, he might find the time to write an analysis of the film that shows us how Django Unchained succeeds, rather than spending the entire review stroking Tarantino’s…err…ego while giving the middle finger to the history of African American cinema. Thus, the saddest part of Mr. Daite’s analysis is that he becomes guilty of the same flaw of which he accuses Lee. For some reason Daite seems to think that he can only celebrate Django Unchained by denouncing the history of African American cinema.

“Tarantino correctly identifies Stephen as the traditional Greek and Shakespearean figure, such as Iago, who has the ear of the King and manipulates his position for his own good often at the demise of others.”

To be clear, when Samuel Jackson responded to Tarantino asking whether he would have a problem playing Stephen by stating, “You mean do I have a problem playing the most hated black man in the history of American cinema?,” one wonders if the general public will understand the depths of what Jackson was saying. What makes Django Unchained a good, possibly great, film is, again, the layering of the complexity of African American characters. Stephen is not just a flat, stereotypical house nigger or sellout or Uncle Tom or handkerchief head. Stephen is the example of the calculating, critical thinking slave who learns/masters the plantation system/culture and manipulates it to his good fortune regardless of whom he must hurt. To his credit, even Tarantino correctly identifies Stephen as the traditional Greek and Shakespearean figure, such as Iago, who has the ear of the King and manipulates his position for his own good often at the demise of others. But even more, Stephen is proof that the slaves both intellectually (administratively) and physically maintained the plantation during slavery and much of the South after slavery.

As a digression, watching The Jack Benny Program, I often wondered if the white writers purposefully crafted Rochester, Benny’s valet and chauffeur, as being more intelligent and moral than Jack Benny or if it was simply a Freudian slip of white supremacist schizophrenia. Moreover, drawing a chronological line from Rochester to Stephen and plotting that line with a host of African American servants and slaves, one realizes that African Americans not only built America but also maintained it administratively. Yet, I wonder how many people will not realize that the library scene between Stephen and his master, Calvin Candie (Leonardo DiCaprio), is not fantasy but a fictionalized retelling of the manner in which African American slaves and their offspring have been and have remained counselors for whites in leadership positions. How many African Americans had the ideas, while whites obtained the patents or job promotions based on African American intellect and work? Stephen is not a heroic character, but he is not a mindless boob either. Stephen is an example of one of the various ways that African Americans were forced or chose to analyze, navigate, and manipulate the schizophrenia of white supremacy for survival and profit.

“The African American community is not taking seriously the need to produce enough critical thinkers to engage and evaluate artistic portrayal of real-life issues.”

An African American whom I have known since high school once said to me in 2010, “C. Liegh, your problem is that you spend too much time with niggers. Niggers ain’t got no money, no power, nor the sense to use either if and when they get ’em. That’s why I hang out with white folks.” This person works as a highly paid consultant in his field with very lucrative contracts from major white firms. One person may view him as a modern-day Stephen, and another may view him as somebody just taking life as it is. The real question is what informs how one perceives this person, because the larger problem is that the African American community is not taking seriously the need to produce enough critical thinkers to engage and evaluate artistic portrayal of real-life issues in a manner that allows the mass of African Americans to understand what is being said about us in all forms of media and what is being done to us in every way possible, even when it is being done by us. Furthermore, a key to understanding all of this is understanding the complex history of African people.

So, again, I wonder how many people will really understand what is occurring in the library scene between Stephen and Candie. If there is real brilliance to Django Unchained, it is Tarantino’s writing of and Jackson’s portrayal of Stephen, and DiCaprio’s ability to portray Candie’s schizophrenic dependence upon Stephen in a manner that exposes Stephen’s plantation magnitude. Then, add to this Django’s ability to analyze various circumstances and navigate them accordingly, along with Tarantino’s crafting and positioning of other slaves to complete the complex, three-dimensional portrayal of multifaceted human beings all reacting to slavery in the manner that best suits their understanding, personality, and benefit, and Django Unchained does become a film worthy of most of the praise that Daite gives it. One does not need to marginalize the history of African American cinema to celebrate the artistic (creative and critical) successes of Django Unchained, unless one is positioning oneself to be a literary neo-Stephen.

C. Liegh McInnis is an instructor of English at Jackson State University, the publisher and editor of Black Magnolias Literary Journal, the author of seven books, including four collections of poetry, and one collection of short fiction.

h1

Short essay: Barracks by Tw1itteratti

December 26, 2012


The 19.30 from London Bridge to East Croydon is always a favourite. After a hard day in the office there are few compensations for a lonely journey home. But if the evening is warm and the train is quiet and clean, you can get on with a bit of reading or perhaps discreetly identify the attractive commuters from behind the headphones of your iPod. Sometimes, if you are lucky, you may receive a reciprocal glance of approval, but most often you will be ignored. Commuter trains are not the place for chance involvements, although this may say more about me than David Lean’s “Brief Encounter”.

On such a day, I swept into that temple of painted steel and glass that is East Croydon, on the outskirts of London, and took my place in the crowded slow march to the ticket barriers. As usual, I was waylaid by the ponderous shuffling of a languid teenager; then, a short skip and an Astaire-esque shuffle later, I was beyond the immediate obstacle and onto the next: a tanned and suited European business type with a bad tie, whose luggage and laptops dangled all around him like a fairground ride. A genuinely insurmountable human carousel that I could neither see past nor through; I was stuck. I followed, cursing my bad luck, to the ticket barriers, only to be further waylaid, as East Croydon’s barriers are not designed for human carousels. My eyes rolled and I huffed and puffed, but my obstacle remained unhurried.

Eventually, the bright haze of the exit opened up to a vista of tramlines, traffic lights and taxi ranks. The beggars outside the coffee shops moped miserably over their empty begging hats, while the British Transport Police hung around and provided some relief from harassment by beggars, hoodlums and drunks alike. I headed out onto the street, freed from the confines of the commuter treadmill, when I chanced upon an old friend whom I had last seen some six years previously.

Time had eroded my memory of his real name, but I could recollect his nickname: “Barracks”. I readied myself. Engaging my old “homeboy” profile, I activated the pleased-to-see-you smile and removed my headphones. I stuck out a friendly punch to his chest, followed by a robust handshake that rotated into a brotherly clinch of palms.

Barracks responded with familiar recognition.  He knew the protocol and returned the punch and grab.

Although he was short in stature, it was clear life had been good to him, because the gangly figure of six years ago was now a powerfully built black man with a pot-belly and sparkling, smiling eyes.

“BARRACKS! Nahhhh man, how long’s it been? Man, it’s good to see you.” I shake his hand warmly; he’s still the same old Barracks. “What’you doing with yourself?” I said cheerfully. He threw his head back as if to jangle free the stories that needed to be told, and the glint faded in his eyes. “Nahh man, things haven’t been so good, you nahhh mean? I’ve just got back from Manchester and I’ve moved into a small bedsit in East Croydon. I don’t know how long I’ll be here.” He paused. I stopped smiling.

“It’s mah girl, man. She’s…” He kisses his teeth as he growls out the ‘g’ in girl.

“We’ve just split…” He seems to splutter through his words as his manner turns dark and closed. His head tilts to the side and back as he wrestles his painful memories. His brow creases and his eyes begin to cloud a bit with water. “Nahhh man, I’ll tell you about it, but… what about you and the kids? You’re married, init?” Sensing the impending sob story, I figure I’ll be upbeat. I compress all my good news into some choice cuts and relate them in the form of a summary.

“Well, life’s been pretty good lately. My wife gave birth to our baby daughter; she’s six months now. My boy’s doing well at school; he’s just started playing football.” I pause, nodding my head in happy recollection. “The christening’s coming up soon too, but hey, what about you? What happened?” I’m smiling, but it’s guarded, pending sympathy.

Amongst men who have grown up together and perhaps lost regular touch over time, the opportunity to talk about oneself should not be passed up lightly; but in these busy, stolen moments between rushing and bustling home from work, priority is always given to a sad story.

Barracks had sexual problems, which had left him unable to conceive a child in his first marriage. His impotence was traced to problems with his urinary works. He’d seen the specialists, done the tests, taken the Viagra, but no dice. His wife stayed with him for five years but took her leave amidst a host of other reasons. Apparently they’d had problems beyond their troubles conceiving and in the end had started to resent each other. She asked him to leave after a bruising argument about his failed attempts to show affection and warmth.

He moved out and stayed with friends, but became depressed. The phone calls were long and painful. The text messages were sharp and hurtful. His phone was full of them, like a diary of heartbreak. I glanced at the reams of pitiful texts asking for second chances, meetings, Christmas outings, holiday requests, all turned down. I stopped looking, even while he scrolled down and down.

He had moved north to Manchester while they maintained telephone contact. Eventually the calls shifted from frequent to infrequent, and the long silences tolled the death knell for their marriage. A few months later he received a call. She’d taken a new lover and for the first time in years was genuinely happy. Barracks’ heart cracked when, a month later, he received a text message saying she was pregnant and wanted a divorce.

By then Barracks had taken up with Mandy, who already had a baby daughter who over the next five years came to know him as Dad. Happiness came, after a fashion. Mandy’s parents loved him, she loved him, the baby girl loved him, but the gnawing cancer of unresolved guilt about his infertility, his ex-wife’s lost love, and the fear that he could never properly love another, coupled with Mandy’s expectation that he should make a good woman of her, made him feel inadequate. He took comfort in nightclubs, dancing and drinking his troubles away, even as Mandy preferred to stay at home.

One day, after a heavy night out with his new friends, the dizzying effect of drink mixed with beautiful sirens pulsing in disco lights beckoned him. He answered the call and woke in a different bed, wrapped in different arms.

Mandy discovered his infidelity by way of confession; he had been defeated by guilt and had come clean. Her hopes dashed and her life broken, she now cries most nights, but cannot bring herself to ask him to leave. Her daughter still loves him and her family don’t know.

“I really screwed up!” he sighs sadly. We stare in silence, and it’s a long one. “But you know, I love that girl… She was everything to me.” I can’t look him in the eye under this cold and heavy sky. We’ve stopped smiling. We’re not connected anymore.

“Nahhh man, what the hell made you do it?” He seems stunned and attempts some brief justification: “Naahh man, I was out with the guys. You know… the guys them! And you know, one thing led to another, and, ahhh well, you know?” I’m irritated and fed up. I mask it, gently proffer a “Naahh man!!!” and nod my head, apparently sympathetically. His eyes are watery and full of emotional debris. “Look, I’ve gotta go. We’ll catch up soon, yeah?” He nodded. I didn’t take his number, nor did I leave my card.

He wanted to talk some more, but I didn’t need his bad news. I shook his hand weakly and left.