Adam Ash

Your daily entertainment scout. Whatever is happening out there, you'll find the best writing about it in here.

Thursday, June 30, 2005

Black Perspective: plenty of reason to be suspicious of whites

If you're African-American, this probably won't be news to you. If you're not, it might be. From The Black Commentator via wood s lot (see blogroll). Listen up, honky bastards.

REJECT THE LANGUAGE OF WHITE SUPREMACY by Glen Ford and Peter Gamble

As we move toward an historic national Black convention in the first quarter of 2006 – “Going back to Gary,” as convener William Lucy, President of the Coalition of Black Trade Unionists, phrased it, referring to the 1972 National Black Political Convention in that Indiana city – it is imperative that we reexamine the language of our political discourse. Otherwise, we will wind up talking nonsense – or worse, speaking against our own interests.

In the 33 years since the Gary convention, corporate-speak has become ever more deeply embedded in the national conversation, reflecting the assumptions and aspirations of the very rich, who have vastly increased and concentrated their power over civil society. This alien language saturates the political culture via corporate media of all kinds, insidiously defining the parameters of discussion. Once one becomes entrapped in the value-laden matrix of the enemy’s language, the battle is all but lost. We cannot strategize ourselves out of the racist-corporate coil while ensnared in the enemy’s carefully crafted definitions and points of reference.

“Going back to Gary” must mean going back to straight talk, from the African American perspective. The political consensus among the Black masses remains remarkably consistent, but has been relentlessly challenged since 1972 by 1) the rise of a small but vocal corporate class of African Americans who see their own fortunes as linked to larger corporate structures, and 2) aggressive corporate subsidization, beginning in the mid-Nineties, of a growing clique of Black politicians who define Black progress in terms of acceptance among rich, white people.

Thus, the internal contradictions in African American politics have greatly multiplied since Gary. This has not occurred because of increasing conservatism among a much enlarged Black middle class over the last three decades – a corporate-concocted slander for which there is no factual evidence – but because of the determination of Big Money to impose an alternative leadership on the recalcitrant Black masses.

Time for confrontation, not celebration

The 1972 National Black Political Convention took place in an atmosphere of euphoria over the demise of Jim Crow, which struck the shackles from those Black social sectors that were prepared to take advantage of new opportunities, and empowered a new set of politicians who found themselves in majority Black jurisdictions. When the call to convention went out, everyone was welcomed, and as many as 5,000 showed up. Although much worthwhile political work was accomplished, the general atmosphere was celebratory. We were “Movin’ on Up” to “Celebrate Good Times.” Most believed there “Ain’t No Stoppin’ Us Now – We’re on the Move.”

Essentially, the Gary-era Black discussion centered on consolidation of the gains made during the previous civil rights decade. Short shrift was given to those who had called for deep structural change in the United States, whose demands (and often, lives) were snuffed out by U.S. police and intelligence agencies amidst the carnival of No-Mo’-Jim-Crow. There seemed to be great promise for Black America under a post-segregation regime – and certainly there was, for some. As long as that promise seemed attainable, demands for basic change in American (and world) power relationships were deemed by the upwardly mobile African American sectors to be passé – distractions, quaint but dated.

This self-satisfied analysis was encouraged by a (mostly) white corporate class that harbored larger plans for total world domination: for the absolute, planetary rule of money. By the mid-Nineties, important elements of this class finally got over their reflexive racism – the aversion to sitting in a room with more than a few Black people – and invited some Black folks to join the club.

In 1972, Black collaborators had to work hard to get paid even a pittance to advance the corporate agenda that is inextricably entwined with the ideology of White American Manifest Destiny. Today, they are actively solicited, and handsomely paid in monetary, media and political currency. Corporate-speak is mimicked in many high places of Black American society. For example, corporations dominated the leadership-selection process of our largest mass organization, the NAACP. Corporations have always had a special place in the National Urban League. Corporate influence has reached unprecedented levels among members of the Congressional Black Caucus – while most members stand firm with the historical Black Consensus. And corporations have created out of whole cloth a number of purportedly “Black” organizations, such as the Black Alliance for Educational Options (BAEO), which serve the interests of Wal-Mart and the right-wing Bradley Foundation – and are now also subsidized by the Bush regime.

Under Bush, the Black clergy have been subjected to wholesale cooptation, through the Bradley Foundation-invented Faith Based Initiatives bribery schemes. This massive subornation of a critical Black institution resulted in only a net two percent change in Black party affiliation – from nine percent to eleven percent Black GOP voters in 2004. The base remains steadfast, but the leadership institutions have been infected by corporate and Republican money.

Even so, the major Black Baptist denominations this year reaffirmed their allegiance to the “social gospel” that is our proudest legacy, and has generated and encouraged so many other movements that have pushed the envelope of civilization.

Who, then, should be welcomed to the next “Gary” convention, tentatively scheduled for March, 2006? Everyone, just as in 1972.

It is the job of the conveners and organizers of the next National Black Political Convention to set the terms of the Great Black Debate. In large part, this is a function of language – to craft a language that is not infested with assumptions born of white privilege, imperialism, war-mongering, and anti-social ideology. In other words: Let’s talk Black. Among our own people, that kind of conversation wins the day, every time. Amen.

The apologists, collaborators, and opportunists cannot confuse us if we speak directly to the issues that our people care about most deeply. Let the turncoats come – and be exposed. Some may even be saved.

No rule of law?

It is at times of crisis that precise language becomes most important. The Bush regime has plunged the entire planet into crisis – their grotesque version of globalization. While serving as White House counsel, the current Attorney General of the United States, Alberto Gonzales, derided the Geneva Convention – the basis of international law, and codified as U.S. law – as “quaint.” The 2002 Gonzales memo signaled that the regime was preparing to launch, not a War on Terror, but a war against world order, and against the rule of law within U.S. borders. We must reject his proclamation, in precise language that affirms the magnificent wording of the Geneva Convention, which outlaws aggressive war and upholds the right of all peoples to self-determination. That means get out of Iraq now, and no further threats to the independence and self-determination of other nations.

Black America is the firmest national constituency for peace, having opposed U.S. adventures abroad in greater proportions than any other ethnic group. We arrived at this more civilized state of being through our own gory experience of White American Manifest Destiny, which declared that a Black person has no rights “that a white man is bound to respect.” George Bush is acting out this vicious dictum on a global scale, and we know it. Therefore, we must tell the truth, as the masses of Black folks understand it: the United States is an aggressor nation in the world, and we demand that it cease, immediately. The problem in Iraq is not U.S. casualties, which are the result of George Bush’s crimes, but the predicate crime of U.S. aggression, a violation of international law.

Mass Incarceration is Genocide

Black America has always stood for the rule of law, despite the fact that American law has so often ruled against us. It does so every day, in vast disproportion to the anti-social behavior of some African American individuals. Guantanamo Bay is, indeed, part of an international “gulag” of American prisons, dotting the globe, as Amnesty International has declared. But the largest “gulag” in the world is in the United States – half Black and only 30 percent white, in a 70 percent white country. Fully one out of eight incarcerated human beings on earth are African American, the casualties of an internal war that has not ceased since the Euro-American aggressions against Africa. Rather, it escalates.

At our next grand convention, we must state in no uncertain terms that the real crime wave is being committed against us by all levels of U.S. governments, which have placed Black people under surveillance for the purpose of incarcerating them, and devised laws that impact most heavily on our communities. Mass Black incarceration is a legacy of slavery and, therefore, a form of genocide. “We Charge Genocide,” again – because our social structures are being deliberately destroyed through government policy. Our language must make that plain.

This language is not meant for the oppressors’ ears, but for those of our own people. Black conventions are meant to mobilize Black people. Others are invited to take note. Our object is to galvanize African Americans to take action.

Our history tells us that others follow our lead. Therefore, as a people that believe in the oneness of humanity, we are obligated to lead. We must reject the entire edifice of language that justifies a U.S. war machine that costs more than all the rest of the world’s militaries, combined, and then claims there is no money for the people’s welfare. We know where the money is: it is engaged in criminal, global corporate enterprises, such as war. We demand these enterprises cease, and that the national treasure be redirected to domestic concerns, and to righting the wrongs that the oppressors have inflicted on humanity throughout the planet – including the wrongs committed against Black people in the United States.

We must not argue on corporate terms, about the “affordability” of national health insurance, or housing. We have a right to life, and to live somewhere. There will be no negotiation. Human rights trump property rights and corporate rights and warmonger rights. State it clearly.

A real social contract

Corporate politicians and media deploy the code words of “working people” and “middle class” to mean “white people” as the “deserving” members of the national community. We must reject such language, which is intended to exclude all Black people, including those who work, but explicitly dismisses the unemployed. We must not accept that corporate decisions to eject or bar people from the workplace should have moral authority or political effect. All citizens have a right to live a decent life. We must demand a national minimum income, in addition to living wage standards.

The cost of the Iraq war and related U.S. military deployments would finance a fundamental change in the average American’s life expectations – and life span. Nobody needs such a change more than Black folks. The rich can afford it. We need to say so, and dare the rich to go to some other country with their money. Nobody else wants them, and nobody else will fund their military, which is the savior of their holdings.

We must directly confront the idea – the unquestioned Holy Grail of corporate politicians and media – that corporations have the rights of citizens. Black and brown metropolises (the 100 largest U.S. cities have non-white majorities) are at the mercy of corporate barons who shape the urban landscape to fit their profit-driven needs. Inevitably, they move in white people, the process that we call “gentrification.” This process is mostly unchallenged, yet it decides where Black people will live and work, and whether we will preserve the majorities that allow us to even contemplate meaningful democracy in urban America. While we are still majorities in these places, we must take action to exercise the powers that cities possess, in the service of our people. There must be a movement for Democratic Development – development that serves the people who already live in the city. This is perhaps the greatest challenge that faces the next National Black Convention because, if it turns out anything like Gary, in 1972, there will be plenty of Black politicians in attendance who have not done a damn thing to preserve the assets of the cities they nominally oversee, or to protect their own electorate from being displaced by corporate power. So be it. We must tell the truth, because our people are in crisis.

There is no solving the problem of urban education, unless we can force the sharing of education funds. White people in the mass have shown over the last four decades that they will not share classrooms with us. But they must share the money, to correct the gross disparity between suburban and urban schools. Integration is not a one-way street, but citizenship is a shared status. We must state clearly that we are entitled to equal funding – that is, funding adequate for a white suburban district, and additional monies to deal with problems that suburbs don’t have.

There are many other issues that must be tackled as we struggle to escape the Race to the Bottom that has been initiated by multinational corporations, is politically empowered by the historical racism of white Americans, and is made lethal by the military power of the U.S. state. The conveners of the next Black Political Convention should keep the agenda as efficient as possible, knowing that our assembled folks will add a plethora of resolutions. But keep our eyes on the prize. Black folks understand racism, but the whole world is getting an education in unbridled corporate behavior that leads to famine, wars, and the dismantling of social services worldwide – including in the United States, which is intentionally being made to fail as a society.

In the current configuration, globalization means corporate rule – by the gun, if necessary. Privatization is part and parcel of the deal, a divvying up of the spoils. National rights and the rights of minorities are all subservient to the rights of capital. Voting rights go down the toilet.

We must teach a lesson in resistance, and give guidance to action – for our own people, and to those who look to us for leadership in the desert that is the United States.

If we are to Speak Truth to Power, we must aim our words with precision.

Black knowledge as a weapon

The Coalition of Black Trade Unionists is ideal for the mission that faces us, since Black unionists have an intimate understanding of both corporate ruthlessness and the racist machinations that white privilege has found so successful through centuries of plunder and rule. Black unionists know the animal up close, and have smelled his foul breath. They also know the weaknesses of white co-unionists, who are quick to claim white privilege and abandon class solidarity. These are lessons learned painfully – but they become weapons in the hands of those committed to struggle.

We must not accept the legitimacy of the current rulers of the United States. They are thieves: stealers of elections; of the bodies of a million imprisoned African Americans; of the minds that are enfeebled by their corporate media; of the countries that they treat as plantations, and feel they can invade at will; and thieves of the productive capacity of the world, which grows every year, but fills only their own bank accounts.

The predatory lenders of the United States have stolen a half trillion dollars from the pockets of African Americans, according to anti-racist reporter Tim Wise. They have also stolen whole continents, and converted their populations into low-wage slaves whose labor is used as a weapon against workers of the United States, including the dwindling number of Black workers fortunate enough to have jobs.

The global tentacles of multinational corporations are not a logical consequence of human civilization, but a construction of predators, whom we know all too well. It is our task to uphold civilization, against the corporate machine that would crush all humans underfoot. We know the feeling. We’ve been crushed before, and reel from the butt of the gun. But still we rise, to indict the criminals.

And we, alone, have a constituency that is ready to march – if we tell them where and why to go. Let us choose our words carefully.

Reaffirm the Black Consensus

We are not looking for drama, but for clarity. The corporations and their war machines are providing the drama. A nation and world in crisis need clarity. Black America retains the power to speak, in the terms of our elders who identified with – and were – the Wretched of the Earth. On this coming Fourth of July, remember the words of Frederick Douglass, our greatest thinker and leader of the 19th Century, who spoke in the darkest hours of our people’s oppression, in 1852:
“I do not hesitate to declare, with all my soul, that the character and conduct of this nation never looked blacker to me than on this 4th of July!”

Frederick Douglass’s words were not popular with his white audience, but they buoyed the spirits of a people seeking freedom. They resonate with us now, when the “character and conduct of this nation never looked blacker.”

We can end this madness, but not without great exercise of discipline, and the precise use of language. Much of it has already been written. But we must write the next version of our destiny…and act on it.

(Glen Ford and Peter Gamble, Co-Publishers of BlackCommentator.com, are writing a book on Barack Obama and the Crisis in Black Leadership.)

Quote of the week

"In Latin America the upper classes say that bodyguards are like testicles -- big, hairy and always outside when the party happens." --Dana Vachon

Nude Thoughts 6


Don’t disturb my fantasy, woman. This is about her, not you. I don’t need your help. I just need my mind, and the memory of her standing by the window, bent over to open it, her behind sticking out – oh! the piercing poignancy of it, and the pang of her behind's atavistic call to my poor bereft dick aching and hungering and dying for the ripe fruit of her hanging three feet away. That’s all I need. Mind you, if at the end, you want to lean over, and put your lips around the tip, feel free. Go ahead. But not before then, OK? A man needs his fantasies.

Wednesday, June 29, 2005

The torturers accept Senator Durbin's apology

From fafblog:

Senator Durbin:
You had the nerve to compare torture conducted by America's brave men and women in uniform to torture conducted by Nazis, Soviets, and Khmer Rouge, and now you finally apologized. As well you should have: that comment was offensive. Deeply offensive. Many of the proud men and women at Guantanamo were deeply upset by it, shaking with anger, fear and doubt as they kicked naked men in the ribs and released the dogs. One soldier could hardly muster up the morale to brain a prisoner and stuff his semi-conscious head into a toilet bowl. But they found the strength and resolve to keep going, senator. Because they aren't torturing for Nazis or communists or some third world hellhole. Those boys are torturing for the stars and stripes, senator - and don't you forget it.

But the men and women in the United States military don't just learn how to twist arms into stress positions and chain detainees to the ceiling. They also learn to forgive. And they've decided to forgive you, senator.

On behalf of all the torturers working hard today in the United States military, the Medium Lobster would like to say: apology accepted, Senator Durbin.

The boys in the 101st Thumbscrew Division would accept your heartfelt concession personally, but they're busy serving the highest ideals of their country, beating a young Afghan man tied to the floor in his own feces. He is believed to be either a steadfast and vicious enemy of freedom intent on the murder of thousands of American citizens, or another cab driver. It's so hard to keep track these days.
(Posted by the Medium Lobster)

Last chance to burn the flag


I couldn't find a flag to burn, so here's a picture. Maybe we should have a National Free Speech "Burn the Flag" Day before the Senate passes a law against flag-burning, which they're working on now, because it's such a burning issue. (Do you get the feeling that the GOP has run out of agenda? No more new ideas from the new ideas party?)
BTW, why flag-burning? What about blowing your nose with the flag? Peeing on it? Wiping your ass with it? Masturbating with it and splooging all over it? There are worse things that can happen to a flag than being burned. How would you feel if you were a flag and someone vomited a fried-egg-and-bacon puke all over you?

Male fantasy of female sex fantasy


If I were a woman, this would be a favorite sex fantasy. Am I just an inveterate male mule, or might I have some vestige of an insight into female sex fantasies? You know, women and horses and all that.

Blogging about abortion with Dear Bitch

Your humble blogger got into an email exchange with his favorite blogger, Bitch Ph.D (see blogroll) about reframing abortion. She posted about it on her blog here. Comments following this post now number 54. Interesting stuff about debating with "life starts at conception" friends and other related matters.
Prof. Bitch is great, because she not only blogs about feminism and the academic life, but also about her cute kid and her open marriage. It's a fascinating blog because it strikes a perfect balance between the personal and the public: the essence of what makes blogs interesting. I read her for what she says -- and for how she lives.
I worry that my blog is all public and nothing personal. Maybe I should start blogging about something intensely personal, besides hating that muthafucka Bush. Like, er, I don't know, favorite animals to fuck, or why I like painting my penis an extremely bright fluorescent glow-in-the-dark orange. Ask me something really personal, and I'll answer.

Tuesday, June 28, 2005

Why is Pop Art more lasting than Abstract Expressionism?

From LA Times:

A critique of stinginess by David Pagel
Pop art has evolved, creating an ever more fertile fusion of high spirits and a purposefully lowbrow aesthetic

Pop art is not what it used to be. What began with Andy Warhol in New York and Ed Ruscha in Los Angeles has grown up over the last 45 years, maturing into a style as sophisticated, refined and elegant as the best of Abstract Expressionism. Today, Fred Tomaselli's mind-bending mélanges of pills and pictures, Sharon Ellis' artifice-laden landscapes and Marcelo Pombo's meticulously dripped paintings take Pop far beyond its humble beginnings.

Pop got its start by making fun of the platitudes that second- and third-generation Abstract Expressionism had settled into, mocking the increasingly popular movement's increasingly shrill insistence on gestural spontaneity, existential anxiety and psychological authenticity.

In a nutshell, popularity was a problem for Abstract Expressionism, which derived its significance and the core of its identity from not fitting into the culture around it. That led to a precarious foothold in the world and a case of old-fashioned elitism. In love with its otherness, Abstract Expressionism could not survive success.

Popularity was never a problem for Pop. Success is still Pop's stock in trade, its modus operandi and raison d'être. Having changed the way the world looks, its influence extends across all levels of culture.

David Reed's abstract canvases draw equally on CinemaScope movies and Baroque painting, creating noir-tinged dramas suffused with smoldering sensuality. Philip Argent transforms the airless otherworldliness of virtual reality and the whiplash graphics of digital design into expansive images of breathless beauty and unsettling perfectionism.

From the beginning, Pop threw in its lot with the flash and glam of commercial culture. Broadly addressed to the populace rather than niche-marketed to a handful of cognoscenti, Pop had no problem with large audiences or with the unruliness that accompanied their love. Its artists used marketing techniques to make works in which easy-to-read emblems ridiculed the ideals of established art while serving up similar visual kicks. Pop criticized high art's exclusivity and privilege in the name of shopping — of finding, or realizing, individual desires in mundane products and eventually cobbling together an identity out of mass-produced things made powerful because they were available to others and not private.

Ever since Romanticism, which followed the standardization that swept Europe during the Industrial Revolution, art's identity has been tied to its capacity to enable aficionados and fans to distinguish themselves from the masses, who are invariably disparaged for being uncouth, unschooled, uncool and vulgar. To snobs, being different meant being better than everyone else.

In contrast, Pop embraced sameness. It did not rely on specialists or put stock in specialness. Sticking to the superficial (but never simple) appearance of things, Pop required no experts to translate its deep inner meanings. Although detractors contended that its serial works were blasé, Pop lived and died by its capacity to create the exhilarating sense that my excitement about something in no way diminishes your excitement about the same thing, and vice versa. Meaning was not a zero-sum game. Objects were not inert things to be accumulated, but transitory vehicles that functioned as lightning rods for talk, argument and (sometimes) writing. Pop sidestepped established hierarchies by proposing — and delivering — a world of voluntary participation and self-selective equality.

Because of its egalitarian ethos, Pop had a problem with opulence. In the '60s, opulence seemed old-fashioned and conservative. Pop concentrated on expanding the franchise of art, broadening the parameters of aesthetics by getting art out of the institutions in which it had been ghettoized. The battle left little room for refinement, and Pop's groundbreaking works were blunt. Lacking formal niceties, they borrowed the look of posters and trafficked in the renegade urgency of grass-roots social movements.

After 40 years, opulence is no longer a problem for Pop. Subtlety and sophistication are as much a part of it as brashness and vulgarity. Luxury and accessibility meet in Jorge Pardo's laser-cut lamps made of plywood, Lari Pittman's paintings of operatic comics and Polly Apfelbaum's flowers cut from swatches of tacky synthetic velvet. The civility of good manners mingles promiscuously with the characteristics of bad taste: garishness, excess, spectacle. Pop now embodies uptown eloquence, self-effacement and understatement, but always filtered through a downtown vocabulary of sleazy verve and boldness.

In contemporary and first-generation Pop, social and economic class play out differently than they do in European art. In the United States, art does not belong to the aristocracy, the state or even big business. A product (and an advertisement for itself), American art has more autonomy than art in societies whose social fabrics are more tightly woven. Here, more free play exists between the pleasures and privileges and powers of art and those of more traditional forms of wealth.

Even so, this country's ruling aesthetic is modeled on Minimalism. The durability, severity and seriousness of that style's works echo the virtues of business: predictability, efficiency and detachment. It's no coincidence that the homes and offices of the rich and powerful are designed to look like art museums, with restraint, control and a puritanical denial of sensuality as dominant design features.

Contemporary Pop flies in the face of such authoritarian parsimony. Its works make a place for the extravagant appetites of the unglamorous strata of society, for whom bigger is better, more is merrier and lavishness is the whole point of having an appetite.

A type of lowbrow baroque takes shape. It fuses a working-class love of things (grounded in appreciation of manual dexterity) with an equal — but not opposite — love of intellectual prowess, historical wisdom and formal rigor. Its amiable relationship with craftsmanship corresponds to a lack of antipathy toward design, resulting in user-friendly hospitality and more-where-that-came-from generosity.

Pop and connoisseurship are no longer opposed. Sophistication is not limited to highbrow cultivation but encompasses enthusiasms that cut across classes, arising wherever passion has room to pursue its own ends, on its own terms.

Contemporary Pop offers a critique of stinginess — whether of concept, attitude or material — by making a place for a wealth of embodied experiences that include bedazzlement, awe and joy.

(David Pagel is curator of "POPulence" at Blaffer Gallery, the Art Museum of the University of Houston, June 25 through Aug. 27. His essay is adapted from "Not Your Father's Pop," to be published in the exhibition catalog.)

Monday, June 27, 2005

When will the fucking Dems get a backbone?

From truthout:
The Howling by John Cory (a Nam vet with a Purple Heart)

    Sometimes I get so angry I could just spit.

    What the hell is wrong with you Democratic politicians? Why do you turn on each other? Why won't you stand by your own words and ideals? Why do you keep apologizing for telling the truth? And why can't anyone see McCain's true colors? Or Lieberman or Biden?

    And why in God's name do you expect folks like me to come support you after you turn tail and run?

    Senator Durbin apologizes, for what? Telling the truth?

    The howling banshees of the GOP and corrupt media sprout hair under a full moon and wail into the sky until the noise becomes unbearable for simpering politicians. And the jackals of right-wing power feast on the carcass of democracy. Warm jelly spine is such a delicacy for these carnivores.

    We know the truth out here. We will stand with anyone who stands up for us. Where are you?

    This GOP has the moral certitude of Errol Flynn at a convention of underage bargirls in Bangkok. And ethics? Please! Republicans have the ethics of - well - Tom DeLay and Randy "Duke" Cunningham and the Ohio Coin-gOp scandal. They quote Stalin as a means of dealing with the judiciary or anyone who opposes them, but no one makes them apologize. They call Democrats traitors, and accuse them of aiding the enemy, while at the same time, the GOP ensures their friends profit from the deaths of American soldiers. There is not a corpse in this country that these people will not stick Uncle Dick's cheney into, in order to screw themselves into a frenzy of godly power. And yet, these bilious bullies rule the political playground? Give me a break!

    If you're a Democrat, I'll give you everything I have, money, time, whatever; and all you have to do is - give me my country back. All you have to do is stand up and face the storm. Expose these hypocrites for what they are and what they do in the name of America. My America.

    Let me tell you three things you could do if you really want my support. I'm no corporation, just a working stiff, so I will certainly understand if you pay me no never-mind. But if you are interested in stirring the pot and maybe generating a whole heap of grassroots support, here's my Three-Step Democracy Recovery program:
1. Support John Conyers and the hearings on the Downing Street Minutes and every revelation that comes out of these papers. Hold Bush Co. accountable by making the issue public via the Internet, your own web sites, any newspaper with integrity (not NY Times or Washington Post - I said integrity), support liberal blogs at every opportunity. Put it on your answering machines.

2. Take a group, a LARGE group of Congressmen and Senators over to Dover Air Base to honor and photograph the returning dead from the wars in Iraq and Afghanistan. Do not be deterred by MPs or civilian police, and be willing to be arrested for trespassing. Let the Bush administration arrest a congressional delegation representing the families of the fallen for daring to face the price America pays for Bush's War.

3. Rent a Jumbotron in Times Square and use the Nightline model of the Iran hostage crisis - remember? (Day 152 of "America Held Hostage") Only this time, the Jumbotron flashes the pictures of our young troops with the caption: "Dead: 1,732 - Wounded: 5,432 - of the Bush War."

And then, as an added bonus, offer up this message on the giant screen: Let Us Pray.
Our Forefathers, who art in heaven,
How hollowed have become thy names.
The rule of kingdom has come,
The Republic is undone
On Earth just as they intend to do to Heaven.

They give us each day, a daily dread
And warn us about our trespasses
While they trespass against us -
We, the people.

Oh, lead us out of wars of pre-emption,
And deliver us from one-party evil,
For we know there should be no corporate kingdom,
Only empowerment of the people,
And the glory of the common good
That is true Democracy
For ever and ever,
Amen.

And after you have done this - let the howling begin. I'll stand beside you.

Doesn't this make your blood boil?

1. On average, the 2003 tax cut has already given $93,500 to every millionaire. It is estimated that 52% of the benefits of George W. Bush's 2001–03 tax cuts have enriched the wealthiest 1% of Americans (those with an average annual income of $1,491,000).
2. On average, the 2003 tax cut gave $217 to every middle-income person. By 2010, it is estimated that just 1% of the benefits of the tax cut will go to the bottom 20% of Americans (those with an average annual income of $12,200).
3. During at least one year since 2000, 82 of the largest American corporations -- including General Motors, El Paso Energy, and, before the scandal broke, Enron -- paid no income tax.

Quote of the week

"In our time, political speech and writing are largely the defense of the indefensible." --George Orwell

The War in Iraq: the last gasp of Empire

What is Iraq if not the last gasp of America as Empire? The end of the Cold War is often seen as a U.S. victory over the U.S.S.R., but in fact, it cost both superpowers their empires.

The parallel between these two losses of empire is uncanny.

Just as Russia lost its empire in Eastern Europe, we lost our "empire" in Western Europe. During the Cold War, Western Europe was a collection of U.S. client states under our nuclear umbrella, our missiles and troops stationed there to protect it against the U.S.S.R. The collapse of the Soviet Empire not only set free Eastern Europe (East Germany, Poland, Hungary, Czechoslovakia, Bulgaria, Romania, Yugoslavia and Albania), but also liberated Western Europe to pursue its own course, which it has with the European Union.

The EU is now a single-currency economy in competition with the U.S. -- and with the power to bring U.S. companies like GE and Microsoft to heel, something heretofore unthinkable. By declining to follow us into Iraq (making it a U.S.-U.K. Lone Ranger and Tonto adventure), the two EU leaders, Germany and France, have carved out a separate role for EU diplomacy in the world, which is often, and ironically, allied with Russia against the U.S.

The parallel continues in another way. Both the U.S. and the U.S.S.R. also lost their empires in their own backyards. The Baltic states (Estonia, Lithuania, and Latvia) and the old Soviet Republics (Ukraine, Georgia, Azerbaijan, etc.) are now states independent of Russia -- while U.S. hegemony over our own backyard in South America is at an end, too. Until quite recently, we installed puppet regimes in South America almost as freely as the British used to play Puppet Shuffle in the Middle-East. Now South Americans can vote, and they vote for leaders who are charting an independent course well to the left of us.

We're still the #1 military and economic force on earth -- two great imperial weapons -- but huge question marks have emerged over our continued efficacy in both these spheres. We have a rival economy as strong as our own in the EU, as well as the emerging giants of India and China to contend with. The Indians seem set to become the IT leader of the world -- not only do we outsource our computer needs there, but our own Silicon Valley has felt the impact of Indian nationals. And the Chinese, now the undisputed manufacturing powerhouse of the world, will overtake us as the world's biggest economy in the next two decades.

Besides overtaking us, China will probably end up owning us, too. Under Bush we've gone into total hock to China. And they are returning the favor by using our dollars to start buying U.S. companies. The Chinese bids for Unocal and Maytag are only the beginning of a massive redistribution of wealth started by our own president. Bush's sell-out to the Chinese will probably be his most lasting legacy.

Still, we have the strongest military ever in the history of humankind. But as Vietnam and now Iraq prove, military power has its limits. Iraq will probably be our final military adventure for imperial gain, as the former Yugoslavia was probably our last military adventure for humanitarian reasons (we declined to intervene in Rwanda, the biggest genocide since the Holocaust, and we are staying out of Darfur, even though we're calling it a genocide).

What is the Soviet parallel to our Iraq adventure? Their war in Afghanistan. That quagmire lasted nine years, and led to the Soviet collapse. Expect ours to end sooner than that, now that polls show that U.S. voters are losing heart for our continued presence there.

It's truly instructive how the Iraq adventure, all by itself, exemplifies our loss of power in the world. It started as a fear-of-Islamic-terror imperial oil grab with the strategic aim of planting permanent U.S. military bases in a Middle-Eastern puppet state under Ahmed Chalabi. Emboldened by our victory in Afghanistan, we went into Iraq thinking it would be "a cakewalk" -- probably our greatest moment of imperial hubris.

And now our greatest humiliation. We've had to actually change motives in mid-course. Pretty unprecedented. Far from a cakewalk, the Iraq adventure has become an arduous nation-building effort to set Iraq on the course of democracy, as well as a most unfortunate sinking of our moral reputation around the world. Osama Bin Laden couldn't have hoped for better: we're his #1 recruitment agency.

Nation-building was the last thing we wanted. But circumstances, mostly in the person of Al-Sistani, forced our hand. When we balked at elections, this Iranian-born cleric simply brought out his Iraqi Shiite followers en masse in a protest march. We had to swallow our imperial pride and recast our presence there as an exercise in democracy. One Muslim cleric stopped our imperial ambitions cold.

In fact, far from securing our influence in the Middle-East, the Iraq war will greatly decrease our influence in the region. If the Iraq democracy project succeeds, it will bring about a powerful Shiite alliance of Iraq and Iran against us. We started off by serving our own interests there (as well as Israel's) -- and now we are serving the interests of Iran. Not a terrific example of being an empire, now that we're doing the dirty work for Bin Laden and Iran. The empire as dupe.

Such are the ironies of history. But fortunately for the health of our republic, this irony brings the idea of America as an empire to its final end. There's something poetically fitting in the fact that the greatest power on earth has found the cure for its manifest destiny in Mesopotamia, the cradle of civilization.

Sunday, June 26, 2005

Eleven steps to a better brain

From The New Scientist:

It doesn't matter how brainy you are or how much education you've had - you can still improve and expand your mind. Boosting your mental faculties doesn't have to mean studying hard or becoming a reclusive bookworm. There are lots of tricks, techniques and habits, as well as changes to your lifestyle, diet and behaviour, that can help you flex your grey matter and get the best out of your brain cells. And here are 11 of them.

1. Smart drugs:
Does getting old have to mean worsening memory, slower reactions and fuzzy thinking?

AROUND the age of 40, honest folks may already admit to noticing changes in their mental abilities. This is the beginning of a gradual decline that in all too many of us will culminate in full-blown dementia. If it were possible somehow to reverse it, slow it or mask it, wouldn't you?

A few drugs that might do the job - a practice known as "cognitive enhancement" - are already on the market, and a few dozen others are on the way. Perhaps the best-known is modafinil. Licensed to treat narcolepsy, the condition that causes people to suddenly fall asleep, it has notable effects in healthy people too. Modafinil can keep a person awake and alert for 90 hours straight, with none of the jitteriness and bad concentration that amphetamines or even coffee seem to produce.

In fact, with the help of modafinil, sleep-deprived people can perform even better than their well-rested, unmedicated selves. The forfeited rest doesn't even need to be made good. Military research is finding that people can stay awake for 40 hours, sleep the normal 8 hours, and then pull a few more all-nighters with no ill effects. It's an open secret that many, perhaps most, prescriptions for modafinil are written not for people who suffer from narcolepsy, but for those who simply want to stay awake. Similarly, many people are using Ritalin not because they suffer from attention deficit or any other disorder, but because they want superior concentration during exams or heavy-duty negotiations.

The pharmaceutical pipeline is clogged with promising compounds - drugs that act on the nicotinic receptors that smokers have long exploited, drugs that work on the cannabinoid system to block pot-smoking-type effects. Some drugs have also been specially designed to augment memory. Many of these look genuinely plausible: they seem to work, and without any major side effects.

So why aren't we all on cognitive enhancers already? "We need to be careful what we wish for," says Daniele Piomelli at the University of California at Irvine. He is studying the body's cannabinoid system with a view to making memories less emotionally charged in people suffering from post-traumatic stress disorder. Tinkering with memory may have unwanted effects, he warns. "Ultimately we may end up remembering things we don't want to."

Gary Lynch, also at UC Irvine, voices a similar concern. He is the inventor of ampakines, a class of drugs that changes the rules about how a memory is encoded and how strong a memory trace is - the essence of learning (see New Scientist, 14 May, p 6). But maybe the rules have already been optimised by evolution, he suggests. What looks to be an improvement could have hidden downsides.

Still, the opportunity may be too tempting to pass up. The drug acts only in the brain, claims Lynch. It has a short half-life of hours. Ampakines have been shown to restore function to severely sleep-deprived monkeys that would otherwise perform poorly. Preliminary studies in humans are just as exciting. You could make an elderly person perform like a much younger person, he says. And who doesn't wish for that?

2. Food for thought:
You are what you eat, and that includes your brain. So what is the ultimate mastermind diet?

YOUR brain is the greediest organ in your body, with some quite specific dietary requirements. So it is hardly surprising that what you eat can affect how you think. If you believe the dietary supplement industry, you could become the next Einstein just by popping the right combination of pills. Look closer, however, and it isn't that simple. The savvy consumer should take talk of brain-boosting diets with a pinch of low-sodium salt. But if it is possible to eat your way to genius, it must surely be worth a try.

First, go to the top of the class by eating breakfast. The brain is best fuelled by a steady supply of glucose, and many studies have shown that skipping breakfast reduces people's performance at school and at work.

But it isn't simply a matter of getting some calories down. According to research published in 2003, kids breakfasting on fizzy drinks and sugary snacks performed at the level of an average 70-year-old in tests of memory and attention. Beans on toast is a far better combination, as Barbara Stewart from the University of Ulster, UK, discovered. Toast alone boosted children's scores on a variety of cognitive tests, but when the tests got tougher, the breakfast with the high-protein beans worked best. Beans are also a good source of fibre, and other research has shown a link between a high-fibre diet and improved cognition. If you can't stomach beans before midday, wholemeal toast with Marmite makes a great alternative. The yeast extract is packed with B vitamins, whose brain-boosting powers have been demonstrated in many studies.

A smart choice for lunch is omelette and salad. Eggs are rich in choline, which your body uses to produce the neurotransmitter acetylcholine. Researchers at Boston University found that when healthy young adults were given the drug scopolamine, which blocks acetylcholine receptors in the brain, it significantly reduced their ability to remember word pairs. Low levels of acetylcholine are also associated with Alzheimer's disease, and some studies suggest that boosting dietary intake may slow age-related memory loss.

A salad packed full of antioxidants, including beta-carotene and vitamins C and E, should also help keep an ageing brain in tip-top condition by helping to mop up damaging free radicals. Dwight Tapp and colleagues from the University of California at Irvine found that a diet high in antioxidants improved the cognitive skills of 39 ageing beagles - proving that you can teach an old dog new tricks.

Round off lunch with a yogurt dessert, and you should be alert and ready to face the stresses of the afternoon. That's because yogurt contains the amino acid tyrosine, needed for the production of the neurotransmitters dopamine and noradrenalin, among others. Studies by the US military indicate that tyrosine becomes depleted when we are under stress and that supplementing your intake can improve alertness and memory.

Don't forget to snaffle a snack mid-afternoon, to maintain your glucose levels. Just make sure you avoid junk food, and especially highly processed goodies such as cakes, pastries and biscuits, which contain trans-fatty acids. These not only pile on the pounds, but are implicated in a slew of serious mental disorders, from dyslexia and ADHD (attention deficit hyperactivity disorder) to autism. Hard evidence for this is still thin on the ground, but last year researchers at the annual Society for Neuroscience meeting in San Diego, California, reported that rats and mice raised on the rodent equivalent of junk food struggled to find their way around a maze, and took longer to remember solutions to problems they had already solved.

It seems that some of the damage may be mediated through triglyceride, a cholesterol-like substance found at high levels in rodents fed on trans-fats. When the researchers gave these rats a drug to bring triglyceride levels down again, the animals' performance on the memory tasks improved.

Brains are around 60 per cent fat, so if trans-fats clog up the system, what should you eat to keep it well oiled? Evidence is mounting in favour of omega-3 fatty acids, in particular docosahexaenoic acid or DHA. In other words, your granny was right: fish is the best brain food. Not only will it feed and lubricate a developing brain, DHA also seems to help stave off dementia. Studies published last year reveal that older mice from a strain genetically altered to develop Alzheimer's had 70 per cent less of the amyloid plaques associated with the disease when fed on a high-DHA diet.

Finally, you could do worse than finish off your evening meal with strawberries and blueberries. Rats fed on these fruits have shown improved coordination, concentration and short-term memory. And even if they don't work such wonders in people, they still taste fantastic. So what have you got to lose?

3. The Mozart effect:
Music may tune up your thinking, but you can't just crank up the volume and expect to become a genius

A DECADE ago Frances Rauscher, a psychologist now at the University of Wisconsin at Oshkosh, and her colleagues made waves with the discovery that listening to Mozart improved people's mathematical and spatial reasoning. Even rats ran mazes faster and more accurately after hearing Mozart than after white noise or music by the minimalist composer Philip Glass. Last year, Rauscher reported that, for rats at least, a Mozart piano sonata seems to stimulate activity in three genes involved in nerve-cell signalling in the brain.

This sounds like the most harmonious way to tune up your mental faculties. But before you grab the CDs, hear this note of caution. Not everyone who has looked for the Mozart effect has found it. What's more, even its proponents tend to think that music boosts brain power simply because it makes listeners feel better - relaxed and stimulated at the same time - and that a comparable stimulus might do just as well. In fact, one study found that listening to a story gave a similar performance boost.

There is, however, one way in which music really does make you smarter, though unfortunately it requires a bit more effort than just selecting something mellow on your iPod. Music lessons are the key. Six-year-old children who were given music lessons, as opposed to drama lessons or no extra instruction, got a 2 to 3-point boost in IQ scores compared with the others. Similarly, Rauscher found that after two years of music lessons, pre-school children scored better on spatial reasoning tests than those who took computer lessons.

Maybe music lessons exercise a range of mental skills, with their requirement for delicate and precise finger movements, and listening for pitch and rhythm, all combined with an emotional dimension. Nobody knows for sure. Neither do they know whether adults can get the same mental boost as young children. But, surely, it can't hurt to try.

4. Gainful employment:
Put your mind to work in the right way and it could repay you with an impressive bonus

UNTIL recently, a person's IQ - a measure of all kinds of mental problem-solving abilities, including spatial skills, memory and verbal reasoning - was thought to be a fixed commodity largely determined by genetics. But recent hints suggest that a very basic brain function called working memory might underlie our general intelligence, opening up the intriguing possibility that if you improve your working memory, you could boost your IQ too.

Working memory is the brain's short-term information storage system. It's a workbench for solving mental problems. For example, if you calculate 73 - 6 + 7, your working memory will store the intermediate steps necessary to work out the answer (73 - 6 = 67, then 67 + 7 = 74). And the amount of information that the working memory can hold is strongly related to general intelligence.

A team led by Torkel Klingberg at the Karolinska Institute in Stockholm, Sweden, has found signs that the neural systems that underlie working memory may grow in response to training. Using functional magnetic resonance imaging (fMRI) brain scans, they measured the brain activity of adults before and after a working-memory training programme, which involved tasks such as memorising the positions of a series of dots on a grid. After five weeks of training, their brain activity had increased in the regions associated with this type of memory (Nature Neuroscience, vol 7, p 75).

Perhaps more significantly, when the group studied children who had completed these types of mental workouts, they saw improvement in a range of cognitive abilities not related to the training, and a leap in IQ test scores of 8 per cent (Journal of the American Academy of Child and Adolescent Psychiatry, vol 44, p 177). It's early days yet, but Klingberg thinks working-memory training could be a key to unlocking brain power. "Genetics determines a lot and so does the early gestation period," he says. "On top of that, there is a few per cent - we don't know how much - that can be improved by training."

5. Memory marvels:
Mind like a sieve? Don't worry. The difference between mere mortals and memory champs is more method than mental capacity

AN AUDITORIUM is filled with 600 people. As they file out, they each tell you their name. An hour later, you are asked to recall them all. Can you do it? Most of us would balk at the idea. But in truth we're probably all up to the task. It just needs a little technique and dedication.

First, learn a trick from the "mnemonists" who routinely memorise strings of thousands of digits, entire epic poems, or hundreds of unrelated words. When Eleanor Maguire from University College London and her colleagues studied eight front runners in the annual World Memory Championships they did not find any evidence that these people have particularly high IQs or differently configured brains. But, while memorising, these people did show activity in three brain regions that become active during movements and navigation tasks but are not normally active during simple memory tests.

This may be connected to the fact that seven of them used a strategy in which they place items to be remembered along a visualised route (Nature Neuroscience, vol 6, p 90). To remember the sequence of an entire pack of playing cards for example, the champions assign each card an identity, perhaps an object or person, and as they flick through the cards they can make up a story based on a sequence of interactions between these characters and objects at sites along a well-trodden route.

Actors use a related technique: they attach emotional meaning to what they say. We always remember highly emotional moments better than less emotionally loaded ones. Professional actors also seem to link words with movement, remembering action-accompanied lines significantly better than those delivered while static, even months after a show has closed.

Helga Noice, a psychologist from Elmhurst College in Illinois, and Tony Noice, an actor, who together discovered this effect, found that non-thesps can benefit by adopting a similar technique. Students who paired their words with previously learned actions could reproduce 38 per cent of them after just 5 minutes, whereas rote learners only managed 14 per cent. The Noices believe that having two mental representations gives you a better shot at remembering what you are supposed to say.

Strategy is important in everyday life too, says Barry Gordon from Johns Hopkins University in Baltimore, Maryland. Simple things like always putting your car keys in the same place, writing things down to get them off your mind, or just deciding to pay attention, can make a big difference to how much information you retain. And if names are your downfall, try making some mental associations. Just remember to keep the derogatory ones to yourself.

6. Sleep on it:
Never underestimate the power of a good night's rest

SKIMPING on sleep does awful things to your brain. Planning, problem-solving, learning, concentration, working memory and alertness all take a hit. IQ scores tumble. "If you have been awake for 21 hours straight, your abilities are equivalent to someone who is legally drunk," says Sean Drummond from the University of California, San Diego. And you don't need to pull an all-nighter to suffer the effects: two or three late nights and early mornings on the trot have the same effect.

Luckily, it's reversible - and more. If you let someone who isn't sleep-deprived have an extra hour or two of shut-eye, they perform much better than normal on tasks requiring sustained attention, such as taking an exam. And being able to concentrate harder has knock-on benefits for overall mental performance. "Attention is the base of a mental pyramid," says Drummond. "If you boost that, you can't help boosting everything above it."

These are not the only benefits of a decent night's sleep. Sleep is when your brain processes new memories, practises and hones new skills - and even solves problems. Say you're trying to master a new video game. Instead of grinding away into the small hours, you would be better off playing for a couple of hours, then going to bed. While you are asleep your brain will reactivate the circuits it was using as you learned the game, rehearse them, and then shunt the new memories into long-term storage. When you wake up, hey presto! You will be a better player. The same applies to other skills such as playing the piano, driving a car and, some researchers claim, memorising facts and figures. Even taking a nap after training can help, says Carlyle Smith of Trent University in Peterborough, Ontario.

There is also some evidence that sleep can help produce moments of problem-solving insight. The famous story about the Russian chemist Dmitri Mendeleev suddenly "getting" the periodic table in a dream after a day spent struggling with the problem is probably true. It seems that sleep somehow allows the brain to juggle new memories to produce flashes of creative insight. So if you want to have a eureka moment, stop racking your brains and get your head down.

7. Body and mind:
Physical exercise can boost brain as well as brawn

IT'S a dream come true for those who hate studying. Simply walking sedately for half an hour three times a week can improve abilities such as learning, concentration and abstract reasoning by 15 per cent. The effects are particularly noticeable in older people. Senior citizens who walk regularly perform better on memory tests than their sedentary peers. What's more, over several years their scores on a variety of cognitive tests show far less decline than those of non-walkers. Every extra mile a week has measurable benefits.

It's not only oldies who benefit, however. Angela Balding from the University of Exeter, UK, has found that schoolchildren who exercise three or four times a week get higher than average exam grades at age 10 or 11. The effect is strongest in boys, and while Balding admits that the link may not be causal, she suggests that aerobic exercise may boost mental powers by getting extra oxygen to your energy-guzzling brain.

There's another reason why your brain loves physical exercise: it promotes the growth of new brain cells. Until recently, received wisdom had it that we are born with a full complement of neurons and produce no new ones during our lifetime. Fred Gage from the Salk Institute in La Jolla, California, busted that myth in 2000 when he showed that even adults can grow new brain cells. He also found that exercise is one of the best ways to achieve this.

In mice, at least, the brain-building effects of exercise are strongest in the hippocampus, which is involved with learning and memory. This also happens to be the brain region that is damaged by elevated levels of the stress hormone cortisol. So if you are feeling frazzled, do your brain a favour and go for a run.

Even more gentle exercise, such as yoga, can do wonders for your brain. Last year, researchers at the University of California, Los Angeles, reported results from a pilot study in which they considered the mood-altering ability of different yoga poses. Comparing back bends, forward bends and standing poses, they concluded that the best way to get a mental lift is to bend over backwards.

And the effect works both ways. Just as physical exercise can boost the brain, mental exercise can boost the body. In 2001, researchers at the Cleveland Clinic Foundation in Ohio asked volunteers to spend just 15 minutes a day thinking about exercising their biceps. After 12 weeks, their arms were 13 per cent stronger.

8. Nuns on a run:
If you don't want senility to interfere with your old age, perhaps you should seek some sisterly guidance

THE convent of the School Sisters of Notre Dame on Good Counsel Hill in Mankato, Minnesota, might seem an unusual place for a pioneering brain-science experiment. But a study of its 75- to 107-year-old inhabitants is revealing more about keeping the brain alive and healthy than perhaps any other to date. The "Nun study" is a unique collaboration between 678 Catholic sisters recruited in 1991 and Alzheimer's expert David Snowdon of the Sanders-Brown Center on Aging and the University of Kentucky in Lexington.

The sisters' miraculous longevity - the group boasts seven centenarians and many others well on their way - is surely in no small part attributable to their impeccable lifestyle. They do not drink or smoke, they live quietly and communally, they are spiritual and calm and they eat healthily and in moderation. Nevertheless, small differences between individual nuns could reveal the key to a healthy mind in later life.

Some of the nuns have suffered from Alzheimer's disease, but many have avoided any kind of dementia or senility. They include Sister Matthia, who was mentally fit and active from her birth in 1894 to the day she died peacefully in her sleep, aged 104. She was happy and productive, knitting mittens for the poor every day until the end of her life. A post-mortem of Sister Matthia's brain revealed no signs of excessive ageing. But in some other, remarkable cases, Snowdon has found sisters who showed no outward signs of senility in life, yet had brains that looked as if they were ravaged by dementia.

How did Sister Matthia and the others cheat time? Snowdon's study, which includes an annual barrage of mental agility tests and detailed medical exams, has found several common denominators. The right amount of the B vitamin folate is one. Verbal ability early in life is another, as are positive emotions early in life, which were revealed by Snowdon's analysis of the personal autobiographical essays each woman wrote in her 20s as she took her vows. Activities such as crosswords, knitting and exercising also helped to prevent senility, showing that the old adage "use it or lose it" is pertinent. And spirituality, or the positive attitude that comes from it, can't be overlooked. But individual differences also matter. To avoid dementia, your general health may be vital: metabolic problems, small strokes and head injuries seem to be common triggers of Alzheimer's dementia.

Obviously, you don't have to become a nun to stay mentally agile. We can all aspire to these kinds of improvements. As one of the sisters put it, "Think no evil, do no evil, hear no evil, and you will never write a best-selling novel."

9. Attention seeking:
You can be smart, well-read, creative and knowledgeable, but none of it is any use if your mind isn't on the job

PAYING attention is a complex mental process, an interplay of zooming in on detail and stepping back to survey the big picture. So unfortunately there is no single remedy to enhance your concentration. But there are a few ways to improve it.

The first is to raise your arousal levels. The brain's attentional state is controlled by the neurotransmitters dopamine and noradrenalin. Dopamine encourages a persistent, goal-centred state of mind whereas noradrenalin produces an outward-looking, vigilant state. So not surprisingly, anything that raises dopamine levels can boost your powers of concentration.

One way to do this is with drugs such as amphetamines and the ADHD drug methylphenidate, better known as Ritalin. Caffeine also works. But if you prefer the drug-free approach, the best strategy is to sleep well, eat foods packed with slow-release sugars, and take lots of exercise. It also helps if you are trying to focus on something that you find interesting.

The second step is to cut down on distractions. Workplace studies have found that it takes up to 15 minutes to regain a deep state of concentration after a distraction such as a phone call. Just a few such interruptions and half the day is wasted.

Music can help as long as you listen to something familiar and soothing that serves primarily to drown out background noise. Psychologists also recommend that you avoid working near potential diversions, such as the fridge.

There are mental drills to deal with distractions. College counsellors routinely teach students to recognise when their thoughts are wandering, and to catch themselves by saying "Stop! Be here now!" It sounds corny but can develop into a valuable habit. As any Zen meditator will tell you, concentration is as much a skill to be lovingly cultivated as it is a physicochemical state of the brain.

10. Positive feedback:
Thought control is easier than you might imagine

IT SOUNDS a bit New Age, but there is a mysterious method of thought control you can learn that seems to boost brain power. No one quite knows how it works, and it is hard to describe exactly how to do it: it's not relaxation or concentration as such, more a state of mind. It's called neurofeedback. And it is slowly gaining scientific credibility.

Neurofeedback grew out of biofeedback therapy, popular in the 1960s. It works by showing people a real-time measure of some seemingly uncontrollable aspect of their physiology - heart rate, say - and encouraging them to try and change it. Astonishingly, many patients found that they could, though only rarely could they describe how they did it.

More recently, this technique has been applied to the brain - specifically to brain wave activity measured by an electroencephalogram, or EEG. The first attempts were aimed at boosting the size of the alpha wave, which crescendos when we are calm and focused. In one experiment, researchers linked the speed of a car in a computer game to the size of the alpha wave. They then asked subjects to make the car go faster using only their minds. Many managed to do so, and seemed to become more alert and focused as a result.
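
(Sketched in code, the loop is simple -- something like the following, where the simulated EEG read, the alpha baseline and the speed scaling are all invented stand-ins for illustration, not the researchers' actual setup.)

import random

# Toy neurofeedback loop: the car's speed tracks the (simulated)
# amplitude of the alpha band, so the subject gets an immediate,
# visible reward for shifting their own brain state.
ALPHA_BASELINE = 10.0  # assumed resting alpha amplitude, arbitrary units

def read_alpha():
    # Stand-in for a real EEG read; real systems band-pass around 8-12 Hz.
    return ALPHA_BASELINE + random.uniform(-3.0, 6.0)

def car_speed(alpha, top_speed=120.0):
    # Scale alpha relative to baseline into a speed the subject watches.
    return max(0.0, min(top_speed, 60.0 * alpha / ALPHA_BASELINE))

for tick in range(5):
    a = read_alpha()
    print(f"tick {tick}: alpha {a:.1f} -> {car_speed(a):.0f} km/h")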

This early success encouraged others, and neurofeedback soon became a popular alternative therapy for ADHD. There is now good scientific evidence that it works, as well as some success in treating epilepsy, depression, tinnitus, anxiety, stroke and brain injuries.

And to keep up with the times, some experimenters have used brain scanners in place of EEGs. Scanners can allow people to see and control activity of specific parts of the brain. A team at Stanford University in California showed that people could learn to control pain by watching the activity of their pain centres (New Scientist, 1 May 2004, p 9).

But what about outside the clinic? Will neurofeedback ever allow ordinary people to boost their brain function? Possibly. John Gruzelier of Imperial College London has shown that it can improve medical students' memory and make them feel calmer before exams. He has also shown that it can improve musicians' and dancers' technique, and is testing it out on opera singers and surgeons.

Niels Birbaumer from the University of Tübingen in Germany wants to see whether neurofeedback can help psychopathic criminals control their impulsiveness. And there are hints that the method could boost creativity, enhance our orgasms, give shy people more confidence, lift low moods, alter the balance between left and right brain activity, and alter personality traits. All this by the power of thought.

Friday, June 24, 2005

Deep Thoughts: the triple alienation of being post-human

I'm not well-versed in Theory, having studied English at university when Practical Criticism and Leavis were all the rage, so I have only a nodding, passing and autodidactic acquaintance with it (note to self: must read Terry Eagleton's book). But I like the way it makes explosions in my mind, the way existentialism did when it counseled me to embrace my teenage despair, and damned me to freedom in a dark, indifferent and god-absent universe.
This piece makes those little explosions in my mind, even if every now and then, a sentence or two leaves me intellectually blinded and bereft in my indifference about Theory. The piece comes via the most excellent wood s lot, from CTheory. (The C, I believe, stands for Canada. Remember Canada? It's what the death camp inmates called the place where the belongings of the incoming Jews went -- the place of stuff, the storeroom of useful, desirable, trade-worthy things.)
The piece manages to conjoin the usual suspects -- Foucault, Lacan, Zizek -- with Descartes, Narcissus, Icarus, Terri Schiavo and "faith-based" American Empire. Quite a nifty trick. You may have to hang in there, especially through Lacan's fucking Mirror Stage (break that goddam mirror, I say, and live with seven bad-luck years of Theory), but you might get a few petite brainfarts in your posthuman cortical synapses.

1000 DAYS OF THEORY
The Lacanian Conspiracy by Ted Hiebert

I went to school with an artist named Ryan Klaschinsky. We are no longer in contact, but one of his artworks has stayed with me. A work best not seen, but imagined. The Spirit is Willing but the Flesh is Weak. A work about the futility of flesh -- the futility of bodies in a delirious world. Flesh colored liquid foam -- haphazardly strewn about on an old turntable. Prosthetic body parts, refuse. Tongues, or brains. That special shade of rosy pink has never looked more decrepit. The Spirit is Willing but the Flesh is Weak. Revolving flesh, spun around, flopping from side to side. Useless flesh. Front row at the car crash -- meaning not only an unobstructed view but the possibility of being hit oneself. The confrontation with one's own being as a result of the horrific -- the fate of the posthuman.

1
It has been asserted by many that the body has become virtual. This is the fate of flesh in the 21st century. Not only coded flesh and data bodies, but more importantly -- as always -- the inverse. Flesh codes and body data as we self-regulate towards a pending utopia of nihilism itself. Ultimately, it is the fate of nihilism to be exactly that which the posthuman demands.

Make no mistake, we ARE posthuman. Not only because we are inextricably wired to the world around us -- prosthetic extensions on every front, from television to radio, internet to video games, cell phones to email... even clothing and language fit this particular bill -- but more importantly again, because we are wired to ourselves, puppets that will not give up the delusion of being also the puppet master. The history of Western thinking has been -- quite simply -- wrong. The first technology was not language, nor the wheel, nor even the tree-branch-turned-spear. The first technology is and has always been the technology of reflection -- that through which we inscribe ourselves onto our own flesh.

Interestingly enough, however, the theories of the posthuman are not yet written, as they should be, upon the flesh of bodies themselves. For the moment at least, it seems that we are content with the rhetoric of resistance -- intellectual self-fashioning in an age of subjective uncertainty and uncertain subjectivity. Consider, as an example, Katherine Hayles' definition of the posthuman:
the posthuman is 'post' not because it is unfree but because there is no a priori way to identify a self-will that can be clearly distinguished from an other-will.[1]

Might it not, however, be slightly ironic in this instance that the a priori distinguishing of "will" returns as the rhetorical basis for self-definition in an age of technological proliferation? Might one not accuse such a theory of having all too quickly forgotten the writings of the poststructuralists, namely Barthes and Foucault, who argued so eloquently about the death of authorship and the exact absence of will in contemporary subjectivity?[2] And, along the lines of such a reconsideration, might we not take Hayles' argument to an even more extreme position, in which it is precisely this "absence of will" that becomes the very factor determining "how we became posthuman"?[3]

Indeed, along these lines, it is no longer the will at stake in posthumanism, but rather the body -- the site upon which the "will-less will" collapses into subjective confusion. And through this collapse, something even more dangerous happens -- now, it is no longer merely selves that are incapable of self-distinguishing, but bodies too. And no longer is this even a function of technological extensions of mind and perception, but rather more fundamentally because this has always been the fate of the self, and indeed the fate of the body as well.

We are posthuman -- our theories tattooed on the flesh of our bodies, stamped and branded, not merely as symbols of aesthetic affiliation, but rather as something dangerously equivalent -- symbols of belonging and defiance. And it is this particular paradox -- of defiant belonging -- that can be so dangerous precisely because the two are now equivalent: futile resistance in an age of unverifiable self-fashioning.

Consider the hypothesis: If I didn't already know myself I might well fail to recognize myself in my own reflection.

The self has never had a reliable face, nor any face a reliable self. And it is the technology of reflection through which this fundamental self-deceit is reflected back to us -- bodies now tattooed only with the alienating futility of nihilism itself.

2
The argument is relatively simple:

if the self in contemporary crisis (technological or otherwise) is the same as the self at stake in the writings of Foucault,
and, if the self at stake in the writings of Foucault is the same as the self engendered by the Lacanian mirror-stage,
and, if the self engendered by the Lacanian mirror-stage is the same as the self behind Descartes' cogito ergo sum,
then, we end up with the following formulation: placebo ergo sum -- I hallucinate myself into being.

The explanation is slightly more complex.

It begins quite simply with the observation that through Descartes the self comes into existence at the moment of its self-conception.[4] What is not simple are the ways in which this self-conception occurs, nor the particularities of the "existence" which that self-conception engenders. To read Descartes out of his historical context -- in the attempt to find some groundwork for such an assertion in a contemporary setting -- would be to suggest that fundamentally this existence is of the same nature as its conception. Existence then becomes merely the observation of existing, in whatever form that observation might take.

More complex is the question of "who" observes -- a question that Descartes does not actually answer, but which both Lacan and Foucault do.

For Foucault, this "who" is the normalized self. There are no other options. For despite the "inside joke" that might be posited through a privileged understanding -- an enlightened observation -- there can be no rational hierarchy in play in the ordering of self-conceived existence. Rather, Foucault is explicit: "discipline makes individuals,"[5] and it is not self-conception that is responsible for individualized existence but rather acculturation, education, social participation. We learn to perceive ourselves, learn to self-conceive under the shadow of discipline itself.[6]

Strangely however, we also learn to forget. For a normalized self is not quite normal if it does not also think itself thinking itself. In other words, the Foucauldian normalized self is one that is disciplined into self-conception, indeed one that has a vested stake in maintaining the conception of itself as autonomous. Disciplined into autonomous self-perception, what is forgotten -- as a condition of autonomous being -- is that being was never autonomous.

Candy coated autonomy. Except that the sweetness turns sour as soon as we facialize the sinister authorities responsible for our self-deceits. To doubt the doctor is to doubt the medicine, and once "off our meds" the face of identity itself cannot possibly ever look the same.

Under the sign of Foucault, consequently, my identity always will have a face that is not mine. In fact, it is inevitably the face of the disillusioned doctor, whose medications I cannot possibly believe. This is not a problem as long as I maintain an active forgetfulness in the face of an autonomy that was never mine -- but the moment I cease to forget, my body is no longer my own -- and even its face has changed.

The problem, of course, is that under this sign of identity-in-crisis we are fully able to facialize the authorities responsible. While the signs of self-deceit cause the breach of faith in the autonomy we remember, this self-deceit is immediately transferred and transformed into the accusation of deceit leveled against our own circumstances of personal formation. This is to say that we do not -- indeed we cannot -- blame ourselves for our own inherent depersonalization, for we have neither a (self) subject to blame, nor an autonomous perspective to wager.

3
This is why for the formulation of the posthuman "placebo ergo sum" it is necessary to consider Lacan. For while one might say that under the sign of Foucault there is "no a priori way to distinguish a self-will from an other-will," there is most certainly a Foucauldian framework by which to clearly distinguish the other-wills responsible for our formulations of self.[7] And the futility of identity under the sign of Foucault is precisely that these other-wills are the only wills that can be clearly distinguished. We did not choose ourselves but had ourselves chosen for us, in fact by others themselves equally unchosen. And again in this case, what is true for wills is true for bodies as well. No longer is it as simple as to say that we do not recognize ourselves in the mirror -- now, in fact, we inevitably recognize someone else.

But isn't this exactly Lacan's assertion in "The Mirror Stage as Formative of the Function of the I as Revealed in Psychoanalytic Experience"? Isn't Lacan's theory of the "mirror-stage" precisely a theory that suggests that the ego is formed -- the self itself is formed -- out of self-alienation?
... the important point is that this form [the mirror-stage] situates the agency of the ego, before its social determination, in a fictional direction, which will always remain irreducible for the individual alone...by which he must resolve as I his discordance with his own reality... [and which consequently] ... symbolizes the mental permanence of the I, at the same time as it prefigures its alienating destination...[8]

Consequently, wouldn't this mean that Foucault has not, in the end, reduced the possibilities of individuality at all, but merely come full-circle to that position in which his own self was born? Here, in a way that is frighteningly poignant, Lacan has out-Foucaulted Foucault -- making his theories of social conditioning and discipline already redundant to the basic self-alienation required for the birth of self-conception itself.

In other words, the child body, before even recognizing itself as alienated, comes to its own self-conception through precisely a form of alienation. The mechanism of this alienation however, is not quite equivalent to that of which Foucault speaks, but rather more similar to the technological alienation that a theorist like Paul Virilio would call the automation of perception:
'Now objects perceive me', the painter Paul Klee wrote in his Notebooks. This rather startling assertion has recently become objective fact, the truth.[9]

While Virilio's automated perception is strictly industrial, and Lacan's is biological, this nevertheless suggests that it is a technology of perception -- in this instance the technology of reflection -- that is the primordial factor responsible for the self-alienation of the individual. In other words: I have always been other, indeed that is my condition of being. I am unchosen. And what this means, of course, is that no longer can I facialize the world around me as responsible for my lack of individuality -- rather, I must recognize that in my self-recognition I have already chosen, however circumstantial it might seem, through my engagement with technological mediation, to not be myself. And this is not simply the result of social control or discourses of power, but rather the very condition of having a self to begin with. The (adult) confrontation with oneself as another is the re-enactment of something much more primal, much more fundamental, much less human.

Could we not then say, given the dual frameworks of Foucauldian depersonalization and its a priori Lacanian counterpart, that there is something that happens when we "re-realize" the self-alienation at the core of subjective being? A second-order mirror-stage perhaps -- that re-death of the self that was already fundamentally dead? And to re-realize ourselves as such entities is also to realize that our own personal histories, our personal trajectories and formulations, are themselves equally hallucinatory -- equally unchosen -- as the "I" that spawned them.

Consequently, if Foucault -- as an already self-alienated individual -- can come to the conclusion that individuality is already self-alienated, is his declaration self-revealing or prophetic? In other words, is the fate of the posthuman to return, not to the redundancy of humanist fantasies of autonomy, but rather to the re-realization that humanity was only ever a phase in its cultural development? And, in this spirit could we not assert that: to be human is to be alienated; to be posthuman is to be self-alienated?

placebo ergo sum -- I am born into posthuman destiny.

It is an already normalized self that now confronts itself as already normalized. Yet no longer is there any hope of recovery, for it is not as simple as attributing self-formation to contextual development. Lacan posits a "nature" in contrast to Foucault's "nurture," and yet the result is the same. Doubly depersonalized, the myth of autonomy has always already been the spectre of the myth of biology.
These reflections [on the workings of the mirror-stage] lead me to recognize in the spatial captation manifested in the mirror-stage, even before the social dialectic, the effect in man of an organic insufficiency in his natural reality -- in so far as any meaning can be given to the word 'nature'.[10]

But it is also precisely here, in this attribution of essence -- this attribution of a non-human (alienated) nature to all things human -- that the truly sinister side of Lacan comes out. For if the spectre of autonomy under the sign of Foucault begins to look somewhat like a social conspiracy, under the sign of Lacan this conspiracy is precisely genetic. Betrayed by our own bodies into not-being, the mirror stage becomes the literal stage upon which the fantasy of existence is acted out.
The mirror-stage is a drama whose internal thrust is precipitated from insufficiency to anticipation... [and] which will mark with its rigid structure the subject's entire mental development.[11]

Consequently, a question: what would Lacan see when he looks in the mirror? For if the fantasy of autonomy resides with the spectral double -- the self-alienated ego -- are the words of Lacan not, rather, precisely the words of his reversed reflection? And if Lacan himself is and must be seen as exactly self-alienated to the same extent as everyone else, which Lacan do we hear when he speaks? The answer is obvious: we hear the posthuman Lacan, the doubled Lacan. This is necessary since, as a biological theory, the "mirror stage" does not allow for a privileged recovery, nor for even the possibility of "inside knowledge" of one's a priori alienation.[12] In fact, Lacan's theory itself is a better answer to "how we became posthuman" than any other: we became human through self-alienation, we became posthuman through (re)recognition of our own humanity as myth. Lacan's voice is not the voice of (human) alienation, but rather that of (posthuman) belonging.

One need not try very hard to turn this into a conspiracy theory of sorts, as the fundamental politics of identity, that in which the self is irrecoverably alienated, serves the convenient (or sinister) purpose for which psychoanalysis itself becomes indispensable. The sympathetic ear that knowingly tells the individual that it understands the frustration of alienation simultaneously leads it further into the trauma of irrecoverable loss. Indeed in the mirror of psychoanalysis we recognize that which we can never recover, namely a fantasy of an autonomous whole that, without psychoanalysis, without Lacan, we never would have known we had.

What this is to say, in the end, is not that psychoanalysis is itself a form of normalization, understandable under the Foucauldian umbrella of disciplined thinking, but just the inverse. Alienation is the form in which we come to the world, and it may thus be only "natural" that we continue to seek it out, continue to be fascinated by the very desire for nihilism that is the only predictable trajectory initiated by the original fall from singularity. "Mirrors make individuals." And the Lacanian conspiracy is one that has his name only because he was the first to "alienate" it from the already self-alienated perspective of posthuman destiny.

placebo ergo sum.

4
And isn't this why Narcissus and Icarus are, in fact, the same person?

We all know the story of Narcissus, he who was so completely infatuated with his own reflection that he refused all human company in the name of self-absorption. We know also that his fate was death, starvation due to a complete indifference to the world around him.[13] Narcissus was the first nihilist -- accidental self-destruction as a result of overzealous self-obsession.

The story of Icarus is different, for he was not obsessed, but rather possessed, by his reflection. They say that Icarus, flying on wings of wax and feathers, flew too close to the sun, his wings melted and he fell to his death in the ocean waters below.[14] This is only part of the story. For Icarus was (no less than Narcissus) posthuman. And it was not the sun, but the gravity of his reflection, peering up at him from the ocean waters below, that caused his (literal) downfall. Pulled, disciplined, infatuated, Icarus simply could not help himself -- his mistake was to self-conceive, his accidental self-conception his downfall.

Narcissus is Icarus born again.

5
placebo ergo sum.

I think of Daniel White's "The Bride of Compassionate Conservatism," and the tragedy of Terri Schiavo's spectacularized death in a "culture of life." And I wonder if the spectacularization of life might not be in some ways equivalent. Is it only silenced flesh that cannot speak on its own behalf, or was Foucault right and even the "free speech" of humanist identity is couched in such extreme formulae of social and cultural conditioning that its own voice is already that of White's Vox Angelae?
'I' speak through the aperture of the media... My body is the property of medicine, of my spouse, of my parents, of the courts, of the legislatures, of the society of spectacle. I am a death parade. I am a life without consciousness... a hybrid of biology and technology, a cyborg whose consciousness is no longer 'mine'...

'I' am cortically dead.[15]

But was I ever cortically alive?

placebo ergo sum.

I think of Arthur Kroker's "Born Again Ideology," and the chilling account of American politics as the "faith-based" attempt to provoke the Apocalypse in the name of Christ's second coming. And I wonder if there isn't something frighteningly similar in this account to the "cortical" life of spectacularized death?
The second coming of god then as the real politics of American empire: a fateful meeting of the ancient prophecies of the Old Testament with full-spectrum futurism of cyber-warfare. That's Born Again Ideology, and this time, the rulers of the American covenant intend to get it right, far right, with a style of political action -- an unyielding politics based on preemptive action, a politics of hand to mouth existence, constant military interventions, ceaselessly stirring up turbulence, media provocations intended to provoke panic fear among the domestic population for which redemptive violence is the only recourse...[16]

With "born again politics" do we not also then find ourselves "re-born" as a domesticated population?

placebo ergo sum.

Now, quite simply, I wonder why these two stories feel so similar. Might it not be exactly because they say the same thing? Might it not in fact be precisely because we have all been born twice -- once the birth we have been told to remember -- the other, the engendering of self, the memory of our own Icarian plummet into being?

Born once at birth -- born a second time through the mirror-stage.[17]

Born once into flesh -- then born again into humanist being.

Isn't the similarity striking? Perhaps we are all "born again" to be "brides of compassionate conservatism," formed to be submissive to our very delusions of autonomy. And while this may sound tenuous to some, let us remember that both Lacan and Zizek (his "born again" counterpart) are adamant that autonomy cannot be sustained without the prosthetic (prophetic?) sustenance of precisely the big Other.[18] Born itself out of opposition, autonomy needs the discourse of power, indeed the discourses of conspiracy and fascism as well. Consequently, even autonomy has been mis-thought by the Western mind. Not the freedom to choose, but rather only ever the mere freedom to refuse.

But this is dangerous. For to rethink autonomy as the freedom to refuse is also to say that none of us is autonomous -- at least insofar as we invoke a right to make choices. Rather to choose, under this paradigm, is devastating to the very notions of precious self and inextricable subjectivity that might otherwise be the (prosthetic) grounds upon which individuality could be based.

6
Might we not rethink the history of humanist theory then as precisely equivalent to Kroker's "faith-based politics"?[19] Might it not in fact be suggested that "faith-based politics" can be both so deadly and so prominent because it has always already been the face of philosophy? Faith-based philosophy? Isn't that the accusation that Wittgenstein levels against the traditions of analytic thought when he says quite simply that "behind every well-founded belief lies belief that is not founded"?[20] And sadly, isn't it also through this identical formulation that the "faith-based self" comes back as the unverifiable, illegitimate offspring of psychoanalysis?

Consequently, if (according to Kroker) American politics never really broke with its 17th century Puritan roots, but simply added a revamped technological face to its "faith-based" body,[21] might we not assert that something similar is true for all things faith-based, from philosophy to identity itself? "Born again identity" as the sign of all things Western, all things humanist.

Consequently, is it possible that philosophy never left the Renaissance, the literal "re-birth" that has remained the pinnacle moment of Western culture itself? And might this not then be the singularly decisive moment, after which we have all become posthuman, reborn into nihilist self-alienation?

placebo ergo sum.

-------------------------------------
Notes:
[1] N. Katherine Hayles, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press, 1999. p. 4.
[2] see Roland Barthes, "The Death of the Author." In Image, Music, Text. Stephen Heath, trans. London: Fontana, 1977; also, Michel Foucault, "On the Author." In Art in Theory: 1900-1990. Charles Harrison and Paul Wood, eds. Oxford: Blackwell, 1992.
[3] Hayles does, in fact, address the question of the body. However, her analysis limits itself to the discourses of embodiment and experience and, despite her dismissal of Foucault for precisely not addressing the question of embodiment, her discussion itself remains (arguably) rhetorical. see Hayles, How We Became Posthuman, Chapter 8.
[4] The following passage (from Lex Newman, "Descartes' Epistemology." Stanford Encyclopedia of Philosophy: plato.stanford.edu/entries/descartes-epistemology) should make this assertion clear:
Famously, Descartes puts forward a very simple candidate as the "first item of knowledge." The candidate is suggested by methodic doubt -- by the very effort at thinking all my thoughts might be mistaken. Early in the Second Meditation, Descartes has his meditator observe:
I have convinced myself that there is absolutely nothing in the world, no sky, no earth, no minds, no bodies. Does it now follow that I too do not exist? No: if I convinced myself of something then I certainly existed. But there is a deceiver of supreme power and cunning who is deliberately and constantly deceiving me. In that case I too undoubtedly exist, if he is deceiving me; and let him deceive me as much as he can, he will never bring it about that I am nothing so long as I think that I am something. So after considering everything very thoroughly, I must finally conclude that this proposition, I am, I exist, is necessarily true whenever it is put forward by me or conceived in my mind. (Med. 2, AT 7:25)
As the canonical formulation has it, I think therefore I am (Latin: cogito ergo sum; French: je pense, donc je suis) -- a formulation which does not expressly arise in the Meditations.
[5] Michel Foucault, Discipline and Punish: The Birth of the Prison. Alan Sheridan, trans. New York: Vintage, 1979. p. 170.
[6] Ibid., p. 183.
[7] see Foucault, Discipline and Punish, p. 183.
[8] Jacques Lacan, "The Mirror Stage as Formative of the Function of the I as Revealed in Psychoanalytic Experience." In A. Elliott, ed. The Blackwell Reader in Contemporary Social Theory. Oxford: Blackwell, 1999. p. 62.
[9] Paul Virilio, The Vision Machine. Indianapolis: Indiana University Press, 1994. p. 59.
[10] Lacan, p. 63.
[11] Ibid., p. 64.
[12] In his analysis of Lacan, Zizek argues that the goal of psychoanalysis is precisely "not the pacification/gentrification of the trauma, but the acceptance of the very fact that our lives involve a traumatic kernel beyond redemption, that there is a dimension of our being which forever resists redemption-deliverance." This traumatic kernel, if it is to be "accepted" and cannot be redeemed, cannot be less present in the voice of Lacan, or of Zizek for that matter, than anyone else. Even an "enlightened" voice remains traumatized. see: Slavoj Zizek, On Belief. London: Routledge, 2001. p. 98.
[13] For a full account of the myth of Narcissus see: Robert Graves, The Greek Myths, Vol.1. London: Penguin, 1955. pp. 286-288.
[14] For a full account of the myth of Icarus see: Robert Graves, The Greek Myths, Vol.1. London: Penguin, 1955. pp. 311-318.
[15] Daniel White, "Terri Schiavo -- Bride of Compassionate Conservatism." CTheory, article td152, www.ctheory.net/text_file.asp?pick=449
[16] Arthur Kroker, "Born Again Ideology." CTheory, article td153, www.ctheory.net/text_file.asp?pick=451
[17] This is analogous to what Lacan calls the "specific prematurity of birth in man." Lacan, p. 63.
[18] "The notion of sacrifice usually associated with Lacanian psychoanalysis is that of a gesture that enacts the disavowal of the impotence of the big Other: at its most elementary, the subject does not offer his sacrifice to profit from it himself, but to fill in the lack in the Other, to sustain the appearance of the Other's omnipotence or, at least, consistency." Zizek, p.70.
[19] Kroker argues that under the sign of "Born Again Ideology," contemporary American politics are reducible to their "faith-based" Puritan roots. Kroker, "Born Again Ideology."
[20] Ludwig Wittgenstein, On Certainty. Dennis Paul and G.E.M. Anscombe, trans. New York: Harper, 1972. p. 253 (33e).
[21] Kroker, "Born Again Ideology."
------------------------------
Ted Hiebert is a Canadian visual artist and theorist. His artwork has been shown in solo exhibitions across Canada and in group exhibitions internationally. Hiebert is the Editorial Assistant for CTheory, a Research Assistant at the Pacific Centre for Technology and Culture (University of Victoria) and a Ph.D. candidate in