Adam Ash

Your daily entertainment scout. Whatever is happening out there, you'll find the best writing about it in here.

Tuesday, October 31, 2006

Weird World: Osama Bin Laden's Tora Bora cave hide-out to become a tourist resort

Osama Bin Laden's secret cave hideout is being converted - into a £5.3million tourist resort. Hotels and restaurants are being constructed on mountains overlooking the al-Qaeda chief's Tora Bora refuge in Afghanistan, reports the Sun. Former warlord Gul Agha Sherazi, now a local governor, said: "Tora Bora is world famous - but we want it to be known for tourism, not terrorism. It was known as a picnic spot long before anyone had heard of Osama Bin Laden."

Bin Laden, who hid there in 2001 after the Taliban government was ousted, is believed to have fled after a US bombing blitz. Two journalists were killed there this month but Mr Sherazi insisted: "Tora Bora is 100 per cent safe."

US election: it's time for progressives in the Democratic Party to stand up and be counted

A Democratic Wing of the Democrats?
Liberal Doormats: Tread on Us
By LANCE SELFA/Counterpunch

The self-styled "Democratic wing of the Democratic Party"--the liberals who try to rally the Democratic base for every election--are always the good soldiers.

In a party increasingly consumed with offering a Republican-Lite agenda, they hold out hope that a Democratic Congress might investigate the Bush administration, enact national health care or cut off funds for the occupation in Iraq. Even though party leaders disdain them, they toil on for the good of the party.

A case in point is the most recent attempt at an "inside-outside" strategy of changing the Democrats: the Progressive Democrats of America (PDA). Founded in 2004 and now claiming more than 85,000 members in 135 chapters, the PDA takes credit for winning resolutions in support of Bush's impeachment and of withdrawal from Iraq in state-level Democratic Parties.

In an October 17 message on the eve of congressional elections, PDA leader Tim Carpenter notes: "The strategic activism of Progressive Democrats of America gives me new hope that the Democratic Party and our country can be turned around."

Yet the fact is that the majority of liberal candidates the PDA backed in Democratic primaries lost to more conservative Democrats--many of them backed by the party establishment. Many of the winners--especially those, like Illinois candidate Tammy Duckworth, who were recruited and promoted by the Democratic Congressional Campaign Committee and its pro-war leader Rep. Rahm Emanuel--are pro-war themselves.

This has put the PDA in the same position as previous formations like it: working for the election of Democrats who not only don't share their views on the war or health care, but are actually opposed to them. Yet in the interests of party unity and a broader outlook, the PDA has urged its members to work for these candidates.

One justification is that if the Democrats take the House, then safe-seat liberals will take over key congressional committees. "Imagine the investigative work that could be done on the Downing Street Memos and the Ohio voting irregularities and the steps that could and would be taken toward the censure of President Bush with these members managing the committees," Carpenter wrote.

"For this reason, PDA is urging its members and all progressives to donate, organize and vote Democratic in November. It may involve some holding of noses in some districts, but the stakes are high and the road ahead is long. Progressives must support all the Democratic nominees--including [Jane] Harman, [Al] Wynn and Duckworth as well as centrist Democrats who faced no progressive primary challenge--so we can demand and expect the support of centrist Democrats when our candidates win future primaries."

Fat chance of that. Just look at Sen. Joe Lieberman, who is favored to win re-election as an independent after he refused to accept his defeat in the Democratic primary to the moderate Iraq war critic Ned Lamont.

The problem doesn't just come from selfish "centrists." It's in the setup of politics itself, where liberal support for Democrats is always a one-way street.

A good recent example might be the case of PDA-endorsed Rep. Sherrod Brown (D-Ohio), who is looking increasingly likely to become a senator from Ohio, defeating incumbent Republican Mike DeWine.

On September 17, Carpenter presented Brown with a "Backbone Campaign" award as part of a PDA-supported effort to reward Democrats who stand up against special interests, the Republicans and the White House. In Brown's case, the PDA was commending him for leading opposition to the Central American Free Trade Agreement.

A few days later, Brown's backbone crumbled. He was one of only 34 House Democrats to vote for Bush's torture bill.

Of course, Brown was in a difficult position. What's opposing torture and supporting the right of habeas corpus against the possibility of facing a Republican attack-ad melding his face with Osama bin Laden's?

The PDA continues to back Brown, which goes to show once again that liberal groups working within the Democratic Party end up as the "gofers." They work the hardest at inspiring people to vote for an uninspiring party, and receive little in return. Yet they redouble their efforts to elect the Democrats.

As the socialist Hal Draper wrote in 1967 about the "lib-labs" (liberal-labor) of his day: "The Democrats have learned well that they have the lib-lab vote in their back pocket, and that therefore the forces to be appeased are those forces on the right."

Don't the liberals ever get tired of being treated as doormats? How about calling another press conference and publicly rescinding the award to Brown? Now that would be a show of backbone.

(Lance Selfa writes for the Socialist Worker.)

2. Progress by Committee
Progressives won't get party leadership, but seniority brings them committee chairs
By Geov Parrish

The punditsphere is abuzz with speculation over what will happen in the 110th Congress next year, when and if Democrats take over one or both houses. Among liberals and progressives, it is an article of faith that winning on Nov. 7 is The Only Thing That Matters, and that the sea change likely in next week's election will by definition make everything better.

At the macro level, where headlines are made, this is beyond wishful thinking. There is absolutely nothing in the last six years of Democrats' performance as a minority party to suggest that once in control of one or both houses, the Democratic Party leadership will provide much of a useful counterweight, let alone impediment, to the inevitable Bush administration excesses of incompetence, corruption, lies, and Constitution-shredding. From Enron and tax cuts to the PATRIOT Act (twice), Iraq, NSA wiretapping, torture and secret prisons (and the abolition of habeas corpus), Medicare, Katrina relief, Supreme Court and lesser appointments, and far more, congressional Democrats have been damned near useless for six long years.

Harry Reid and Nancy Pelosi, the prospective leaders of the Senate and House, are centrists who attained their positions by mastering the game of corporate fundraising. And most of the Senate hopefuls for the 2008 presidential race, starting with frontrunner Hillary Clinton and current media darling Barack Obama, and adding Biden, Bayh, and to a lesser extent Kerry, are legislative centrists. For all of them, not offending potential donors (and, oh yeah, voters) will be the bottom line for the next two years. And they'll need even more corporate cash than usual. Don't look for a whole lot of confrontation with, let alone impeachment of, the Executive Branch from those folks. Democrats are doing well this year as a result of public hostility to Republican rule and to Congress, not because of anything the Dems have stood for. In terms of public perceptions, that's not likely to change much in the next two years.

In other words, if Democrats sweep to victory next week, sure, progressives should celebrate — it is what Dubya, two years ago, charmingly called an "accountability moment." Moreover, an unprecedented level of grassroots activism will have helped make it possible, always a good thing for a democracy in peril. But don't stop working on November 8. A lot of organizing will remain before the folks at the head of the Democratic Party pyramid pursue anything remotely resembling a progressive agenda.

That's the bad news in the prospective 110th Congress. The good news is at the next level down in the pyramid.

Progressives on Capitol Hill don't get party leadership positions these days. What they do get is seniority, especially in the House, where many have represented safe urban districts for decades. And this year, as never before, a remarkable number of congressional progressives are in line to become powerful committee chairs.

There are only a few progressives (or prospective new ones) in the Senate. I'd count Tom Harkin, Edward Kennedy, Barbara Boxer, Russ Feingold, and Patrick Leahy among the returning members, along with Sherrod Brown and independent Bernie Sanders among the newcomers. On a good day, a handful of others will join them.

But if the Democrats should somehow edge into the Senate majority, look at what those five returning progressives would be doing. Four are in line to get committee chairs: Leahy at Judiciary, Harkin at Agriculture, Kennedy at Health, Education, Labor, and Pensions, and Boxer, in all likelihood, at Environment. The fifth, Feingold, is a probable presidential candidate and so will have elevated visibility.

That fledgling progressive caucus, especially Leahy at Judiciary, will carry influence all out of proportion to its numbers. But the real earthquake will be in the House, where all 21 committees are currently chaired by white men. When the Democrats (barring a miracle or Act Of Diebold) win the House, the Congressional Black Caucus will be instantly transformed from afterthought to power broker. And plenty of other progressives will be newly running things, too. Consider these probable new committee chairs: John Conyers (Judiciary); Alcee Hastings (Intelligence); Charlie Rangel (Ways and Means); David Obey (Appropriations); Henry Waxman (Government Reform); Bennie Thompson (Homeland Security); Louise McIntosh Slaughter (Rules); George Miller (Education); Barney Frank (Financial Services); Tom Lantos (International Relations); James Oberstar (Transportation); Juanita Millender-McDonald (House Administration); Nydia Velazquez (Small Business).

That's 13 of the 21 House committees. It's enough to make one giddy. And note that Conyers and Leahy control the two Judiciary Committees. While impeachment, however richly deserved, is a political non-starter (party leadership would never allow or support it), it's the committees that launch investigations and issue subpoenas. Despite the timidity of Pelosi and company, there's opportunity for hours' worth of "accountability moments" in the next two years should these far more progressive committee chairs start poking around.

At the broader legislative level, Harry Reid and Nancy Pelosi aren't especially inclined to prioritize legislation they believe will hurt the party's 2008 chances with donors (and, oh yeah, voters). But they can only work with the bills the congressional committees give them.

Buried in staff minutiae and far away from the headlines, the real good news as to how the 110th Congress will turn out is that a remarkable amount of the legislation to be considered might need to pass muster with progressive dealmakers before it ever reaches a vote.

That would be something worth celebrating.

3. The Polls Predict a Huge Republican Defeat. The People Aren't So Sure
The war is a disaster and Bush has become a political liability, but can the Democrats turn that into midterm triumph?
By Gary Younge

There are polls and there are people. Polls are relatively straightforward. When compiled reliably they are supposed to tell a story in digits. That story may be contradictory (people say they want more social services and less tax), but even the contradictions are clear. Polls provide the pillars for the mega-narratives - the broad brush strokes that draw the big picture.

People, on the other hand, are anything but straightforward. They dissemble, evade and lie - even to themselves. They confuse what they want to be true with what they know to be true. A person who does not contradict himself or herself is a bore, an ideologue or both. To fit people into a bigger picture, you must first change their shape.

With just one week to go before the midterm elections, the polls speak with one voice. The Republicans are heading for resounding defeat. A Pew research poll released late last week shows the Democrats with a double-digit lead among likely voters in the crucial competitive districts. The Democrats need 15 seats to retake the House of Representatives and six to win the Senate.

In the 131 House seats where John Kerry mustered only 40-49% of the vote two years ago, Democrats now have a lead of about 5%. They are ahead among men, whites, suburbanites, southerners and rural voters, all groups in which they trailed heavily in 2004. Independents now favour Democrats by 16%; four years ago it was just 3%.

The Republicans are favoured for dealing with terrorism, North Korea and immigration. On everything else, including Iraq, the economy, healthcare, morality and taxes, the Democrats are ahead. Even the Republicans' most faithful supporters are rapidly losing confidence in the administration's strategy on Iraq. Almost half of white evangelical Protestants believe the US should set a timetable for withdrawal - a 40% hike in just two months.

If the pollsters are correct, the US is set for a transformation on a scale somewhere between the Gingrich revolution of 1994, the last Republican legislative revival, and the Blair landslide of 1997. Republicans are about to be crushed by a series of meteorological metaphors - tsunamis, floods and hurricanes are poised to descend on the US electoral landscape with devastating effect.

But, to hear the people talk, you would think the country was in store for little more than grey skies with a chance of rain. Many voters in key districts in the midwest say they are undecided or just plain uninterested. Ask them what will swing it for them and they shrug. The big issues, they say, are Iraq and something else - usually healthcare, the economy or social security. Hurricane Katrina, corruption and terrorism never come up. They will answer questions about the election if you ask them, but it rarely seems to have been on their minds before you interrupted. Even big regional papers, such as the Chicago Tribune, are more focused on the races for local and state office (if indeed they are looking at politics at all) than the national scene. The issues people say are important are national but for the most part the voters insist on a local remedy. "I'll vote for the man not the party," they say. And which man do they like best? They're not sure.

Take Barbara, a "staunch Democrat", who lives in Chicago's suburbs, in one of the most hotly contested districts in the country, and voted for Bush in 2000. She is opposed to the war, but she is not yet sure whether she will vote for Democratic candidate Tammy Duckworth, the Iraq war veteran who lost her leg in the conflict. "I don't have any real feelings about her one way or another," she says. What will swing it for Barbara? "I don't know. Something will hit me."

The Democrats do seem more determined; those who voted for Bush in the last election are having second thoughts about both him and his party. But you come away with a sense that this could end up resembling Kerry's defeat in 2004 or John Major's victory in 1992.

None the less, two common strands do tie the people and the polls. First, the American public has concluded that the Iraq war has been an abject failure and wants the troops to come home. In 1999, George Bush Sr explained why he didn't move on to Baghdad after the first Gulf war. "Whose life would be on my hands as the commander-in-chief because I, unilaterally, went beyond international law, went beyond the stated mission and said, 'We're going to show our macho? We're going into Baghdad. We're going to be an occupying power - America in an Arab land - with no allies at our side.' It would have been disastrous."

This accurately describes his son's actions and, more importantly, the way they are commonly understood. The principal determinant of American support for any military intervention is not whether it is right or wrong but whether people think it will be successful.

For the past three years Bush has been telling anyone who would listen that the only option was to "stay the course". So long as people thought things were going well, they backed him. With US casualties rising and talk of civil war no longer taboo, the mood has soured. Last week he said that staying the course had never been his strategy. He refused to set a timetable but instead referred to "benchmarks". This may be one semantic illusion too many. Many Americans now fear that, having lied his way into the war, he will lie his way out of it.

The second is that, largely as a result of the war, Bush has become a political liability. Attempts to pass this off as midterm blues simply do not wash. Six years into Clinton's presidency in 1998 - after the Lewinsky scandal had broken - the Democrats picked up five seats. In Reagan's sixth year, Republicans once again lost just five.

In 2004 Bush stood not for president but for commander-in-chief - the war leader. He called himself "the decider" - a man of principle and determination who checked his gut before he checked the polls. Unlike Kerry, the "flip-flopper", he would stand his ground when times were hard.

"The greatest thing about this man is he's steady," said comedian Stephen Colbert in a now famous parody before the White House correspondents' dinner in May. "You know where he stands. He believes the same thing on Wednesday that he believed on Monday, no matter what happened Tuesday. Events can change; this man's beliefs never will."

With Bush's approval ratings stuck in the high 30s and low 40s since Katrina, Democrats run ads tarring their opponents by association with the president. One ad in Colorado shows Republican Rick O'Donnell standing alongside Bush, and warns that a vote for O'Donnell means "another vote for George Bush's agenda". Meanwhile, many Republicans are desperate to distance themselves from him. "George Bush is not a message I want to talk about," struggling Indiana congressman John Hostettler told the Chicago Tribune recently.

Quite what impact this will have on the elections on November 7 is not quite clear. Bush isn't on the ballot and the House of Representatives could not recall the troops even if it wanted to. The fact that Americans want to change course at home as well as abroad does not mean that they want to follow the Democratic course - even if they knew what it was. People are complicated. And the only poll that matters has not yet taken place.

Bookplanet: does Stephen King write literature?

King takes aim at Lit wits
Stephen King's best novel in a decade is still hostage to those damn professors

Stephen King's esteemed place in the hierarchy of genre fiction authors is undisputed, but a more contentious question has arisen in the past decade or so: Can King's work be considered literature?

Fans, and even a few brave academics, have argued that King's novels and stories, taken together, provide a composite portrait of late-20th-century American masculinity in crisis. They also point to his undeniable storytelling skills, his vast if somewhat lurid imagination and his gift for capturing American speech as proof of, if not literary greatness, then something more than mere hack work.

This question hit the media in a big way when King won the 2003 National Book Award for distinguished contribution to American letters. King's supporters, many of them such "literary" authors as Michael Chabon, said that it was about time that a writer of King's calibre was recognized by the literary establishment; many traditionalists saw the bestowing of the award on a mere horror writer as another example of the dumbing down of American culture.

There was nothing new in any of this. The literary value of genre fiction, broadly defined as work that eschews moral ambiguity, narrative experimentation, verbal nuance and depth of characterization for such "lowbrow" considerations as story, suspense and obvious readability, has vexed critics and academics of the English-speaking world since at least the late 19th century.

What most of the commentators failed to ask is whether King's work is any good, regardless of its genre classification and "literary" intentions. The only question worth asking is whether the work stands up to a close reading, both as genre fiction and literary fiction.

King's latest novel, Lisey's Story , provides the ideal text to test that question, embodying all of the strengths and limitations of King's work. Like many of King's later novels, Lisey's Story attempts to extend the boundaries of horror fiction by delving deeper into the everyday life and complex internal motivations of its central character, in this case Lisey Debusher Landon.

Lisey is the recent widow of Scott Landon, a fantastically successful bestselling author of dark, fantastic fiction whose early death left Lisey with a fat bank account and a rambling house in the country full of memories and boxes of manuscripts, letters and other detritus from his prodigious literary output, plus all the commentary it generated in media and academic circles.

Scott has been dead for two years when the novel opens. At the urging of her older sister and under pressure from assorted academics — who see in Scott's unpublished papers a goldmine for potential critical commentary — Lisey finally begins the long slog of cleaning out her dead husband's study.

This narrative set-up allows King to insert a stunningly unsubtle commentary on his own feelings about the worthiness of genre fiction and its often cool reception by the critical community. As Lisey begins her long recollection of Scott's storied career, she heaps scorn on everything from academics to critics to a tiny university magazine called Push-Pelt — which, Lisey immediately concludes, is "one of those names designed by English majors to be charming and mean absolutely nothing."

Lisey even coins a word — Incunks — to describe the impatient "wheedlers" who are after Scott's papers. King obviously finds this word pretty clever, as he has Lisey use it four times in the next two pages. This would be fine, if not a little annoying, if King were using this repetition to illustrate some facet of Lisey and Scott's characters, but it becomes clear that King is entirely onside with his fictional couple and their anti-academic bent.

So when Lisey describes a professor who failed to come to Scott's aid after a botched assassination attempt as a "southern-fried chickenshit coward," the reader can bet that the narrative will bear this judgment out.

And when another professor accidentally sets a psychotic fan on the trail of Scott's unpublished work, that professor will prove to be the kind of weasely, failed-writer-turned-teacher that Lisey predicted he'd be.

But King is after something bigger than a bludgeoning metatext on the supremacy of the author over the critic, and King's own status as a "literary" author.

Lisey's Story hinges on Lisey's prolonged, painful meditation on 25 years of marriage to a very troubled author with a complex imaginary world — and here the novel both fails spectacularly and occasionally opens up into some of the most genuinely moving and nuanced writing of King's career.

First the bad stuff: As with his treatment of the bad guys, King's personal feelings for Lisey and her beloved late husband utterly dictate their treatment, both in tone and narrative outcome. From page one, the message is clear: If you don't like Scott and Lisey, you're on the side of the Incunks and the chickenshit cowards.

Worse, King goes to painful lengths to establish the depth of his protagonists' love for each other by embroidering Lisey's reminiscences with a constant gush of baby talk and insiders' jokes, which makes reading the narrative feel at times like being trapped under the bed of an aging couple on their second honeymoon.

King is trying to mine the language of intimacy to explore both the depths and limits of Scott and Lisey's love. While this technique does help give their relationship a depth not found in ordinary genre fiction, King is far too enamoured of his creations to occasionally tell them to shut up and get on with the story.

In spite of this self-indulgence and an opening hundred pages that could easily have been trimmed to 20, Lisey's Story is King's best novel in at least a decade.

The central story — Lisey's attempt to unravel the mystery of Scott's imaginary and not-so-imaginary worlds — shows King at his best, blending the mundane daily realities of average American lives with fully realized, fantastic landscapes of paranoia, obsession, loss and unbridled imagination.

It's a shame King second-guesses his own truly awesome imagination and storytelling powers to stoop to the level of a merely literary author.

(James Grainger is the author of the story collection The Long Slide (ECW), winner of the 2005 ReLit Award.)

What we need is a war on war: how about a Department of Peace?

The Cry of Our Inner Gandhi
What's so scary about a Department of Peace?
By Robert C. Koehler

“While Republicans fight the War on Terror, grow our robust economy, and crack down on illegal immigration, House Democrats plot to establish a Department of Peace, raise your taxes, and minimize penalties for crack dealers. The difference couldn’t be starker.”

As a contribution to the general noise and ignorance, House Whip Roy Blunt’s Web-site politicking is nothing special. Boo! Scared yet? Fear-baiting at election time is standard GOP save-our-keister strategy, but the list of acceptable bogeymen that party leaders parade before the constituency, with inimitable cynicism, is always instructional.

Taxes, check. Crack dealers, check. Department of Peace . . . huh?

Heaping derision on this quiet but potent piece of legislation — H.R. 3760, which now has 74 co-sponsors — may be a miscalculation on Blunt’s part, given that most Americans have lost patience with the carnage in Iraq, don’t feel safer because of the war on terror and want the country to move in a new direction.

I can understand why Blunt himself would be scared of it — the establishment of a cabinet-level Department of Peace would signal a profound national direction change — but, sadly, I also understand why he sees it as a safe target to mock and misrepresent. The extraordinary notion that violence, like disease, may have causes that can be eradicated — that it is not embedded in human nature and therefore inevitable — isn’t in wide circulation yet. It remains barely a pinprick in the national awareness, as manifested by the mainstream media and other outlets of popular culture.

The concept is also dangerous and upsets the powers that be. Violence is not only big business, it permeates the mythology that unites us as a nation. To suggest building a culture of peace, of which a Department of Peace would be one component, no doubt seems like a “plot” to the likes of Blunt — but I’m convinced there is a groundswell of hope for such a culture, indeed, a spiritual hunger for it.

A woman recently wrote to me: “I don’t think I’ve ever felt a deeper level of frustration with the direction this country is going. Honestly though, what do you do? I give to the candidates and important causes, I’ve gone to marches and rallies, I write letters when necessary but I honestly don’t know what to do with the anger, frustration, despair that I feel. I’ve had this conversation with friends and we talk about it but then agree that we don’t ‘Do’ anything. But what is there to do? What is the best way to get involved?”

How many of us haven’t felt such anguish ourselves? There’s no simple fix for this sort of frustration, which, though it may be triggered by the Bush presidency, is far more spiritual in nature than it is political. For all the nation’s vaunted self-aggrandizement as the world’s oldest democracy, we are not encouraged by the mass media to participate in public life — certainly not at that level.

The Washington Post, for instance, in a story about House Minority Leader Nancy Pelosi (the story quotes Blunt’s laundry list of bogeymen), describes how the California Democrat set about revitalizing her party after its defeat in ’04. Did she reach out to the public, tap into the great desire for change afoot in the land or craft a relevant party platform? Well, actually, no.

Instead, the story matter-of-factly notes, “she reached out to advertising executives, Internet moguls and language specialists to ask how Democrats could rise from the ashes and challenge President Bush and the Republicans.”

This is the kind of story that ruins my day. The word “participatory” seems to be so thoroughly atrophied at this point that no self-respecting journalist would seriously consider using it as a modifier — much less an amplifier — for “democracy.” Yet my anguished letter-writer is groping for precisely this word. That vibrating imperative she expressed, to do something that matters, is nothing less, in my view, than the cry of our inner Gandhi to become the change we want to see happen in the world.

And this brings me back to the Department of Peace, the culture of peace, the idea of peace. If we don’t break the cycles of violence that keep hatred and injustice at a constant simmer, our future is limited and stunted. “We need a partner in our government so that peace becomes an organizing principle in this society,” said Dot Maver, executive director of The Peace Alliance. That’s the value of the movement to establish a Department of Peace, and for those of you, like my correspondent, who want to know where to put your energy, this may be the place.

Peace, as defined by Johan Galtung, is “the capacity to handle conflicts with empathy, nonviolence and creativity.” Far from being a “plot” hatched by a cabal of Democrats led by Nancy Pelosi, as Rep. Blunt seems to think it is (if only he were right), peace is a principle, an array of social technologies and, above all, a life commitment demanding every ounce of our strength.

(Robert Koehler, an award-winning, Chicago-based journalist, is an editor at Tribune Media Services and a nationally syndicated writer.)

Those sexy Swedes wear their libidos on their sleeves

The Sexdagarna Bares All
This week Swedish students can attend a Sex Fair -- right on their campus. The organizers are offering lectures on pornography and female orgasms, dildo displays and lubricant samples. Should we be shocked, or are the Swedes just living up to their reputation?
By Frauke Lüpke-Narberhaus in Stockholm/Der Spiegel

It's normal for Stockholm University to be plastered with posters, but this one stands out. The bright red poster is covered with breasts, hands, feet, handcuffs and boots. And it features the word "Sexdagarna" -- "Sex Fair" -- in thick black lettering.

Italian student Anita Boffano can't help but stare. "They want to shock us!" is the first thing she says. Then she adds: "Oh, that's typically Swedish. Everything is so open and free here."

Twenty-one-year-old Anita is heading to the university this afternoon just as she would do on any other day -- but she's revised her choice of subject. She's giving her Swedish course a miss and putting sex on the agenda instead. Today's topics: "Men, Sex and Coming at the Right Moment"; "Questions and Answers About Sex Toys"; "Porn Saves the World"; "Anarchy in Relationships -- How it Works"; "Discussion on Asexuality"; and "Faces of the Female Orgasm." On Monday and Wednesday, experts will address other important sex-related topics. "Poly -- Day to Day Life with Multiple Partners," for example. Or "Tips for Anal Sex."

Shocking? That's not the point, organizers insist. "We want to inform people," explains Caroline Orre, the project's 27-year-old organizer. Fourteen companies and non-profit organizations are participating in the fair. They want to raise awareness about safe sex, Orre explains. "We also want to normalize things and break taboos. No one should be ashamed or feel abnormal just because they prefer certain sexual practices," she says.

Etchings of copulating couples

So the idea is for the Sexdargana to be respectable, not just salacious -- even if the first impressions visitors get may make them think otherwise. Incense wafts through the air from inside the sex temple's open doors. The lights have been dimmed. The carpet is wine-red, like the heavy curtains. The walls are lined with etchings of copulating couples and sculptures of two, three or four men and women in the most unlikely positions.

Salacious or not, the Sex Fair has reached Stockholm University -- and three have already been held there. This year Uppsala University is losing its virginity: another Sex Fair will be held there in November. But Orre would like nothing more than to see the project spread all over Sweden, and she's already talking to possible collaborators.

Stockholm's UniversitetsStudentkar is hosting the event. The student assembly has rented a hall in the Allhuset building, on the university campus. The organizers have converted the academic hall into a sexual paradise -- for three days.

Still, and despite all the incense, the organizers don't want to create any false impressions: Sex is risky, and Malin Devstam from Sesam, an organization financed by Stockholm's provincial government, is there to inform visitors about those risks. "Students fall within the age group most frequently affected by sexually transmitted diseases," she says. "The Sex Fair helps us reach these young people." But some students don't come here because they're afraid of catching a disease -- they're worried about their sex life. "They're often so focused on their studies that they lose interest in sex," says Malin Devstam.

Let's talk about sex

Restoring that interest is Marika Smith's hobbyhorse. She's put her treasures on display on the next stand: dildos, vibrators, combs, lubricants. Her job: inspiring people sexually. Her target audience: the very same students that Devstam talks about. But few of them have the courage to approach her stand: "It's pretty obvious what your stand is about when you're standing there with a dildo in your hand," she says.

All in all, the Swedes seem pretty relaxed about the Sex Fair. The mood here really is as frank and free as 21-year-old Anita from Italy thought when she saw the poster. But Orre, the project organizer, takes a different view: "I think many students don't approve of the Sex Fair. But they would never say so -- that's how people are in Sweden. Exchange students are much more open-minded."

And then, sure enough, a lady in her fifties, weighed down by shopping bags, shows up at the stand and begins ranting loudly. But listen closely and it turns out she's not upset about the Sex Fair itself -- she's annoyed with one of the people working there: She wants more free lubricant samples than he's willing to give her.

Anita from Italy hasn't taken any lubricants. But they weren't the reason she came to the Sex Fair anyway. She came to break a taboo. "The more you talk about sex the easier it becomes to do so. That makes it easier to express your desires, fears and worries," she says. Class dismissed.

We need a war on arms

We Arm The World -- by William D. Hartung

While the U.S. hangs its foreign policy on preventing the spread of “weapons of mass destruction” (a worthy goal, however grossly the Bush administration goes about achieving it), it continues to ignore a more immediate threat—the proliferation of small arms and light weapons—that deserves serious attention as well. These low-tech arms have been described as “slow motion weapons of mass destruction,” because they are responsible for hundreds of thousands of deaths over the past dozen years, from the genocide in Rwanda to the ongoing civil war in the Democratic Republic of the Congo. Yet yesterday, the United States, the world's largest supplier of small arms, was the only country to vote against an historic United Nations proposal to curb traffic in arms.

The United Nations vote was the culmination of the work of a network of prominent individuals and diverse non-governmental organizations. They set out to address the problem of small arms and light weapons—as well as larger systems like tanks, fighter planes and attack helicopters—by putting forward a proposal for an Arms Trade Treaty. The thrust of the proposed treaty is to curb arms transfers to major human rights abusers and areas of conflict. It would also urge weapons suppliers to limit weapons sales that are likely to undermine development in poor nations.

Other elements of an arms treaty could include the creation of common international criteria for assessing particular exports, and movement toward global enforcement mechanisms such as licensing of the arms brokers and shippers who are all too often at the center of illegal deals that have fueled conflicts in Liberia, Sierra Leone, Angola and Rwanda.

As a first step—by a vote of 139 to 1 with 24 abstentions—the U.N. General Assembly agreed yesterday to create a two-part process aimed at pursuing such a treaty. The United States was the only vote in opposition to the resolution.

Now, as a result of the successful vote, the first step will be a survey of U.N. member states by the secretary general’s office. The survey will seek the views of U.N. members on the feasibility and practicality of a legally binding treaty that would set international standards on arms transfers. In 2008, these same questions will be addressed by a group of experts that will delve more deeply into the subject.

However long it takes, a treaty will be an historic step forward in the global arms control regime. Up until now, the arms trade has been the “orphan of arms control”—bound by no international treaties of the sort that govern the possession or spread of nuclear, chemical and biological weaponry.

How can it be that the Bush administration was the only government in the world that voted against even thinking about an Arms Trade Treaty? U.S. security has suffered more harm than good from the widespread availability of small arms and light weapons, which often end up being used against U.S. troops. A recent study by researchers at Johns Hopkins University has found that over half of U.S. casualties in Iraq have been inflicted by AK-47s.

In fact, American-made weapons also frequently end up pointing at American soldiers. For example, the early foundations of al-Qaida were built in part on relationships and weaponry that came from the billions of dollars in U.S. support for the Afghan mujahadin during the war to expel Soviet forces from that country. U.S. military personnel in Somalia and Panama faced U.S.-supplied weaponry that had been given to those nations when they were U.S. allies. In Panama, the issue at hand was a change in Panamanian and U.S. government policies. In Somalia, warlords got hold of U.S.-origin weapons in the wake of the overthrow of the Siad Barre dictatorship. These patterns are likely to continue if nothing is done to stem the wholesale trade in weapons.

The specific impacts of runaway arms trafficking on U.S. forces are amplified by broader concerns. Relatively inexpensive and readily available small arms and light weapons can be used to destabilize countries, creating political chaos and economic devastation. In turn this can contribute to making these countries havens for terrorism while undermining their ability to achieve economic self-sufficiency and accountable governments.

So, the question remains, why is the United States opposed to taking measures to stop this deadly trade? The first answer is strategic. The executive branch wants to preserve its “freedom of action” to arm U.S.-allied groups like the Nicaraguan contras, the Afghan mujahadin, Jonas Savimbi’s UNITA movement in Angola, the Iraqi National Congress and groups opposed to the current regime in Iran. Even if one accepts the right of the United States to attempt to overthrow governments that oppose its short-term political or economic imperatives—which this author does not—the short-term “benefits” of these arms-supply relationships are inevitably outweighed by the long-term costs to U.S. and global interests. Unfortunately, short-sighted policymakers in Washington—of both parties—have failed to understand or accept this fundamental principle.

As the world’s number one arms exporting nation, the United States has a special responsibility to take the lead in regulating the trade. A 2005 report by the World Policy Institute found that of the largest U.S. arms recipients in the developing world, over 70 percent were undemocratic regimes, major human rights abusers or both.

The United States is not alone in the business of unsavory arms exports. A recent report by the research group Saferworld found that in the past year, the United Kingdom provided weapons to 19 of 20 nations that had been singled out by its own government as “major countries of concern” for human rights abuses. And the Control Arms Campaign has found Russian, Greek, Chinese and U.S.-origin bullets in the Democratic Republic of the Congo, which is engaged in one of the deadliest civil wars in living memory.

A second factor in U.S. opposition to any substantial measures to curb the weapons trade is the role of the domestic gun lobby. Both the National Rifle Association and its allied organization, the World Forum on the Future of Sport Shooting Activities, have gone on record against an Arms Trade Treaty. National Rifle Association propaganda has made the false claim that a treaty would lead to the confiscation of guns owned by U.S. citizens.

The good news is that, despite U.S. opposition, the U.N. General Assembly has voted to support steps towards the creation of a treaty regulating the arms trade. This is due in large part to the strenuous efforts of organizations like Amnesty International, Oxfam and the International Action Network on Small Arms, which includes over 500 member organizations in more than 100 countries.

There is a long way to go before there will be an international treaty curbing the arms trade, but this week’s action at the United Nations is an important step forward, and an indication that progress can be made even in the face of opposition by the Bush administration and the gun lobby. A change in U.S. policy is urgently needed, but in the meantime the rest of the world is moving ahead without us.

(William D. Hartung is a senior fellow at the World Policy Institute at the New School and the director of the Institute’s Arms Trade Resource Center.)

Living way longer by eating way less

One for the Ages: A Prescription That May Extend Life -- by MICHAEL MASON

How depressing, how utterly unjust, to be the one in your social circle who is aging least gracefully.

In a laboratory at the Wisconsin National Primate Research Center, Matthias is learning about time’s caprice the hard way. At 28, getting on for a rhesus monkey, Matthias is losing his hair, lugging a paunch and getting a face full of wrinkles.

Yet in the cage next to his, gleefully hooting at strangers, one of Matthias’s lab mates, Rudy, is the picture of monkey vitality, although he is slightly older. Thin and feisty, Rudy stops grooming his smooth coat just long enough to pirouette toward a proffered piece of fruit.

Tempted with the same treat, Matthias rises wearily and extends a frail hand. “You can really see the difference,” said Dr. Ricki Colman, an associate scientist at the center who cares for the animals.

What a visitor cannot see may be even more interesting. As a result of a simple lifestyle intervention, Rudy and primates like him seem poised to live very long, very vital lives.

This approach, called calorie restriction, involves eating about 30 percent fewer calories than normal while still getting adequate amounts of vitamins, minerals and other nutrients. Aside from direct genetic manipulation, calorie restriction is the only strategy known to extend life consistently in a variety of animal species.
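As a back-of-the-envelope sketch of what such a regimen means in practice (the 30 percent figure comes from the research described here; the 2,500-calorie baseline is an illustrative assumption, not a figure from the article):

```python
# Rough sketch of the calorie-restriction arithmetic.
# The 30% reduction is the figure cited in the research above;
# the 2,500 kcal/day baseline is an illustrative assumption.

def restricted_target(baseline_kcal, restriction=0.30):
    """Daily calorie target under a given restriction fraction."""
    return baseline_kcal * (1 - restriction)

baseline = 2500  # assumed typical adult intake, kcal/day
print(restricted_target(baseline))  # 1750.0
```

On that assumed baseline, the diet works out to about 1,750 calories a day, every day, for life.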

How this drastic diet affects the body has been the subject of intense research. Recently, the effort has begun to bear fruit, producing a steady stream of studies indicating that the rate of aging is plastic, not fixed, and that it can be manipulated.

In the last year, calorie-restricted diets have been shown in various animals to affect molecular pathways likely to be involved in the progression of Alzheimer’s disease, diabetes, heart disease, Parkinson’s disease and cancer. Earlier this year, researchers studying dietary effects on humans went so far as to claim that calorie restriction may be more effective than exercise at preventing age-related diseases.

Monkeys like Rudy seem to be proving the thesis. Recent tests show that the animals on restricted diets, including Canto and Eeyore, two other rhesus monkeys at the primate research center, are in indisputably better health as they near old age than Matthias and other normally fed lab mates like Owen and Johann. The average lifespan for laboratory monkeys is 27.

The findings cast doubt on long-held scientific and cultural beliefs regarding the inevitability of the body’s decline. They also suggest that other interventions, which include new drugs, may retard aging even if the diet itself should prove ineffective in humans. One leading candidate, a newly synthesized form of resveratrol — an antioxidant present in large amounts in red wine — is already being tested in patients. It may eventually be the first of a new class of anti-aging drugs. Extrapolating from recent animal findings, Dr. Richard A. Miller, a pathologist at the University of Michigan, estimated that a pill mimicking the effects of calorie restriction might increase human life span to about 112 healthy years, with the occasional senior living until 140, though some experts view that projection as overly optimistic.

According to a report by the Rand Corporation, such a drug would be among the most cost-effective breakthroughs possible in medicine, providing Americans more healthy years at less expense (an estimated $8,800 a year) than new cancer vaccines or stroke treatments.

“The effects are global, so calorie restriction has the potential to help us identify anti-aging mechanisms throughout the body,” said Richard Weindruch, a gerontologist at the University of Wisconsin who directs research on the monkeys.

Many scientists regard the study of life extension, once just a reliable plotline in science fiction, as a national priority. The number of Americans 65 and older will double in the next 25 years to about 72 million, according to government census data. By then, seniors will account for nearly 20 percent of the population, up from just 12 percent in 2003.

Earlier this year, four prominent gerontologists, among them Dr. Miller, published a paper calling for the government to spend $3 billion annually in pursuit of a modest goal: delaying the onset of age-related diseases by seven years.

Doing so, the authors asserted, would lay the foundation for a healthier and wealthier country, a so-called longevity dividend.

“The demographic wave entering their 60s is enormous, and that is likely to greatly increase the prevalence of diseases like diabetes and heart disease,” said Dr. S. Jay Olshansky, an epidemiologist at the University of Illinois at Chicago, and one of the paper’s authors. “The simplest way to positively affect them all is to slow down aging.”

Science, of course, is still a long way from doing anything of the sort. Aging is a complicated phenomenon, the intersection of an array of biological processes set in motion by genetics, lifestyle, even evolution itself.

Still, in laboratories around the world, scientists are becoming adept at breeding animal Methuselahs, extraordinarily long lived and healthy worms, fish, mice and flies.

In 1935, Dr. Clive McCay, a nutritionist at Cornell University, discovered that mice that were fed 30 percent fewer calories lived about 40 percent longer than their free-grazing laboratory mates. The dieting mice were also more physically active and far less prone to the diseases of advanced age.

Dr. McCay’s experiment has been successfully duplicated in a variety of species. In almost every instance, the subjects on low-calorie diets have proven to be not just longer lived, but also more resistant to age-related ailments.

“In mice, calorie restriction doesn’t just extend life span,” said Leonard P. Guarente, professor of biology at the Massachusetts Institute of Technology. “It mitigates many diseases of aging: cancer, cardiovascular disease, neurodegenerative disease. The gain is just enormous.”

For years, scientists financed by the National Institute on Aging have closely monitored rhesus monkeys on restricted and normal-calorie diets. At the University of Wisconsin, where 50 animals survive from the original group of 76, the differences are just now becoming apparent in the older animals.

Those on normal diets, like Matthias, are beginning to show signs of advancing age similar to those seen in humans. Three of them, for instance, have developed diabetes, and a fourth has died of the disease. Five have died of cancer.

But Rudy and his colleagues on low-calorie meal plans are faring better. None have diabetes, and only three have died of cancer. It is too early to know if they will outlive their lab mates, but the dieters here and at the other labs also have lower blood pressure and lower blood levels of certain dangerous fats, glucose and insulin.

“The preliminary indicators are that we’re looking at a robust life extension in the restricted animals,” Dr. Weindruch said.

Despite widespread scientific enthusiasm, the evidence that calorie restriction works in humans is indirect at best. The practice was popularized in diet books by Dr. Roy Walford, a legendary pathologist at the University of California, Los Angeles, who spent much of the last 30 years of his life following a calorie-restricted regimen. He died of Lou Gehrig’s disease in 2004 at 79.

Largely as a result of his advocacy, several thousand people are now on calorie-restricted diets in the United States, says Brian M. Delaney, president of the Calorie Restriction Society.

Mike Linksvayer, a 36-year-old chief technology officer at a San Francisco nonprofit group, embarked on just such a diet six years ago. On an average day, he eats an apple or some cereal for breakfast, followed by a small vegan dish at lunch. Dinner is whatever his wife has cooked, excluding bread, rice, sugar and whatever else Mr. Linksvayer deems unhealthy (this often includes the entrée). On weekends, he occasionally fasts.

Mr. Linksvayer, 6 feet tall and 135 pounds, estimated that he gets by on about 2,000 to 2,100 calories a day, a low number for men of his age and activity level, and his blood pressure is a remarkably low 112 over 63. He said he has never been in better health.
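His height and weight put him near the edge of the clinically "underweight" range; a quick body-mass-index calculation makes the point (the standard imperial BMI formula and the usual 18.5 cutoff are conventions not stated in the article):

```python
# BMI from weight in pounds and height in inches,
# using the standard imperial formula: 703 * lb / in^2.
# The 18.5 "underweight" cutoff is the usual clinical convention.

def bmi(pounds, inches):
    return 703 * pounds / inches ** 2

value = bmi(135, 72)  # 135 pounds, 6 feet = 72 inches
print(round(value, 1))  # 18.3, just under the 18.5 cutoff
```

A BMI of about 18.3 is the kind of "exceptional thinness" the mortality studies discussed below worry about.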

“I don’t really get sick,” he said. “Mostly I do the diet to be healthier, but if it helps me live longer, hey, I’ll take that, too.”

Researchers at Washington University in St. Louis have been tracking the health of small groups of calorie-restricted dieters. Earlier this year, they reported that the dieters had better-functioning hearts and fewer signs of inflammation, which is a precursor to clogged arteries, than similar subjects on regular diets.

In previous studies, people in calorie-restricted groups were shown to have lower levels of LDL, the so-called bad cholesterol, and triglycerides. They also showed higher levels of HDL, the so-called good cholesterol, virtually no arterial blockage and, like Mr. Linksvayer, remarkably low blood pressure.

“Calorie restriction has a powerful, protective effect against diseases associated with aging,” said Dr. John O. Holloszy, a Washington University professor of medicine. “We don’t know how long each individual will end up living, but they certainly have a longer life expectancy than average.”

Researchers at Louisiana State University reported in April in The Journal of the American Medical Association that patients on an experimental low-calorie diet had lower insulin levels and body temperatures, both possible markers of longevity, and fewer signs of the chromosomal damage typically associated with aging.

These studies and others have led many scientists to believe they have stumbled onto a central determinant of natural life span. Animals on restricted diets seem particularly resistant to environmental stresses like oxidation and heat, perhaps even radiation. “It is a very deep, very important function,” Dr. Miller said.

Experts theorize that limited access to energy alarms the body, so to speak, activating a cascade of biochemical signals that tell each cell to direct energy away from reproductive functions, toward repair and maintenance. The calorie-restricted organism is stronger, according to this hypothesis, because individual cells are more efficiently repairing mutations, using energy, defending themselves and mopping up harmful byproducts like free radicals.

“The stressed cell is really pulling out all the stops” to preserve itself, said Dr. Cynthia Kenyon, a molecular biologist at the University of California, San Francisco. “This system could have evolved as a way of letting animals take a timeout from reproduction when times are harsh.”

But many experts are unsettled by the prospect, however unlikely, of Americans adopting a draconian diet in hopes of living longer. Even the current epidemiological data, they note, do not consistently show that those who are thinnest live longest. After analyzing decades of national mortality statistics, federal researchers reported last year that exceptional thinness, a logical consequence of calorie restriction, was associated with an increased risk of death. This controversial study did not attempt to assess the number of calories the subjects had been consuming, or the quality of their diets, which may have had an effect on mortality rates.

Despite the initially promising results from studies of primates, some scientists doubt that calorie restriction can ever work effectively in humans. A mathematical model published last year by researchers at University of California, Los Angeles, and University of California, Irvine, predicted that the maximum life span gain from calorie restriction for humans would be just 7 percent. A more likely figure, the authors said, was 2 percent.
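Translated into years, the model's percentages are modest. Assuming a baseline life expectancy of about 78 years (an illustrative round figure, not one given in the article):

```python
# Converting the UCLA / UC Irvine model's predicted life-span
# gains from percentages into years. The 78-year baseline is
# an assumed illustrative figure, not from the article.

baseline_years = 78

for label, gain in [("maximum gain (7%)", 0.07), ("likely gain (2%)", 0.02)]:
    print(f"{label}: roughly {baseline_years * gain:.1f} extra years")
```

On that assumption, the model predicts at most about five and a half extra years, and more likely under two.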

“Calorie restriction is doomed to fail, and will make people miserable in the process of attempting it,” said Dr. Jay Phelan, an evolutionary biologist at the University of California, Los Angeles, and a co-author of the paper. “We do see benefits, but not an increase in life span.”

Mice that must scratch for food for a couple of years would be analogous, in terms of natural selection, to humans who must survive 20-year famines, Dr. Phelan said. But nature seldom demands that humans endure such conditions.

Besides, he added, there is virtually no chance Americans will adopt such a severe menu plan in great numbers.

“Have you ever tried to go without food for a day?” Dr. Phelan asked. “I did it once, because I was curious about what the mice in my lab experienced, and I couldn’t even function at the end of the day.”

Even researchers who believe calorie restriction can extend life in humans concede that few Americans are likely to stick to such a restrained diet over a long period.

The aging of the body is the aging of its cells, researchers like to say. While cell death is hardwired into every organism’s DNA, much of the infirmity that comes with advancing years is from an accumulation of molecular insults that, experts contend, may to some degree be prevented, even reversed.

“The goal is not just to make people live longer,” said Dr. David A. Sinclair, a molecular biologist at Harvard. “It’s to see eventually that an 80-year-old feels like a 50-year-old does today.”

In a series of studies, Dr. Kenyon, of the University of California, San Francisco, has created mutant roundworms that live six times longer than normal, largely because of a mutation in a single gene called daf-2. The gene encodes a receptor on the surface of cells similar to a receptor in humans that responds to two important hormones, insulin and the insulin-like growth factor 1, or IGF-1.

Insulin is necessary for the body to transport glucose into cells to fuel their operations. Dr. Kenyon and other researchers suggest that worm cells with mutated receptors may be “tricked” into sensing that nutrients are not available, even when they are. With its maintenance machinery thereby turned on high, each worm cell lives far longer — and so does the worm.

Many experts are now convinced that the energy-signaling pathways that employ insulin and IGF-1 are deeply involved in setting an organism’s life span. Some researchers have even described Type 2 diabetes, which is marked by insensitivity to the hormone insulin, as simply an accelerated form of aging.

In yeast, scientists have discovered a gene similar to daf-2, called SIR2, which also helps to coordinate the cell’s defensive response once activated by calorie restriction or another external stressor. These genes encode proteins called sirtuins, which are found in both plants and animals.

A mammalian version of the SIR2 gene, called SIRT1, has been shown to regulate a number of processes necessary for long-term survival in calorie-restricted mice.

Scientists are now trying to develop synthetic compounds that affect the genes daf-2 and SIRT1.

Several candidate drugs designed to prevent age-related diseases, particularly diabetes, are on the drawing boards at biotech companies. Sirtris Pharmaceuticals, in Boston, already has begun testing a new drug in patients with Type 2 diabetes that acts on SIRT1 to improve the functioning of mitochondria, the cell’s energy factories.

While an anti-aging pill may be the next big blockbuster, some ethicists believe that the all-out determination to extend life span is veined with arrogance. As appointments with death are postponed, says Dr. Leon R. Kass, former chairman of the President’s Council on Bioethics, human lives may become less engaging, less meaningful, even less beautiful.

“Mortality makes life matter,” Dr. Kass recently wrote. “Immortality is a kind of oblivion — like death itself.”

That man’s time on this planet is limited, and rightfully so, is a cultural belief deeply held by many. But whether an increasing life span affords greater opportunity to find meaning or distracts from the pursuit, the prospect has become too great a temptation to ignore — least of all, for scientists.

“It’s just a big waste of talent and wisdom to have people die in their 60s and 70s,” said Dr. Sinclair of Harvard.

Deep thoughts: we've always been an expansionist cowboy nation

Cowboy Nation -- by Robert Kagan

These days, we are having a national debate over the direction of foreign policy. Beyond the obvious difficulties in Iraq and Afghanistan, there is a broader sense that our nation has gone astray. We have become too militaristic, too idealistic, too arrogant; we have become an "empire." Much of the world views us as dangerous. In response, many call for the United States to return to its foreign policy traditions, as if that would provide the answer.

What exactly are those traditions? One tradition is this kind of debate, which we've been having ever since the birth of the nation, when Patrick Henry accused supporters of the Constitution of conspiring to turn the young republic into a "great and mighty empire." Today, we are mightier than Henry could have ever imagined. Yet we prefer to see ourselves in modest terms--as a reluctant hegemon, a status quo power that seeks only ordered stability in the international arena. James Schlesinger captured this perspective several years ago, when he said that Americans have "been thrust into a position of lonely preeminence." The United States, he added, is "a most unusual, not to say odd, country to serve as international leader." If, at times, we venture forth and embroil ourselves in the affairs of others, it is either because we have been attacked or because of the emergence of some dangerous revolutionary force--German Nazism, Japanese imperialism, Soviet communism, radical Islamism. Americans do not choose war; war is thrust upon us. As a recent presidential candidate put it, "The United States of America never goes to war because we want to; we only go to war because we have to. That is the standard of our nation."

But that self-image, with its yearning for some imagined lost innocence, is based on myth. Far from the modest republic that history books often portray, the early United States was an expansionist power from the moment the first pilgrim set foot on the continent; and it did not stop expanding--territorially, commercially, culturally, and geopolitically--over the next four centuries. The United States has never been a status quo power; it has always been a revolutionary one, consistently expanding its participation and influence in the world in ever-widening arcs. The impulse to involve ourselves in the affairs of others is neither a modern phenomenon nor a deviation from the American spirit. It is embedded in the American DNA.

Long before the country's founding, British colonists were busy driving the Native American population off millions of acres of land and almost out of existence. From the 1740s through the 1820s, and then in another burst in the 1840s, Americans expanded relentlessly westward from the Alleghenies to the Ohio Valley and on past the Rocky Mountains to the Pacific, southward into Mexico and Florida, and northward toward Canada--eventually pushing off the continent not only Indians, but the great empires of France, Spain, and Russia as well. (The United Kingdom alone barely managed to defend its foothold in North America.) This often violent territorial expansion was directed not by redneck "Jacksonians" but by eastern gentlemen expansionists like George Washington, Thomas Jefferson, and John Quincy Adams.

It would have been extraordinary had early Americans amassed all this territory and power without really wishing for it. But they did wish for it. With 20 years of peace, Washington predicted in his valedictory, the United States would acquire the power to "bid defiance, in a just cause, to any earthly power whatsoever." Jefferson foresaw a vast "empire of liberty" spreading west, north, and south across the continent. Hamilton believed the United States would, "erelong, assume an attitude correspondent with its great destinies--majestic, efficient, and operative of great things. A noble career lies before it." John Quincy Adams considered the United States "destined by God and nature to be the most populous and powerful people ever combined under one social compact." And Americans' aspirations only grew in intensity over the decades, as national power and influence increased. In the 1850s, William Seward predicted that the United States would become the world's dominant power, "the greatest of existing states, greater than any that has ever existed." A century later, Dean Acheson, present at the creation of a U.S.-dominated world order, would describe the United States as "the locomotive at the head of mankind" and the rest of the world as "the caboose." More recently, Bill Clinton labeled the United States "the world's indispensable nation."

From the beginning, others have seen Americans not as a people who sought ordered stability but as persistent disturbers of the status quo. As the ancient Corinthians said of the Athenians, they were "incapable of either living a quiet life themselves or of allowing anyone else to do so." Nineteenth-century Americans were, in the words of French diplomats, "numerous," "warlike," and an "enemy to be feared." In 1817, John Quincy Adams reported from London, "The universal feeling of Europe in witnessing the gigantic growth of our population and power is that we shall, if united, become a very dangerous member of the society of nations." The United States was dangerous not only because it was expansionist, but also because its liberal republicanism threatened the established conservative order of that era. Austria's Prince Metternich rightly feared what would happen to the "moral force" of Europe's conservative monarchies when "this flood of evil doctrines" was married to the military, economic, and political power Americans seemed destined to acquire.

What Metternich understood, and what others would learn, was that the United States was a nation with almost boundless ambition and a potent sense of national honor, for which it was willing to go to war. It exhibited the kind of spiritedness, and even fierceness, in defense of home, hearth, and belief that the ancient Greeks called thumos. It was an uncommonly impatient nation, often dissatisfied with the way things were, almost always convinced of the possibility of beneficial change and of its own role as a catalyst. It was also a nation with a strong martial tradition. Eighteenth- and nineteenth-century Americans loved peace, but they also believed in the potentially salutary effects of war. "No man in the nation desires peace more than I," Henry Clay declared before the war with Great Britain in 1812. "But I prefer the troubled ocean of war, demanded by the honor and independence of the country, with all its calamities, and desolations, to the tranquil, putrescent pool of ignominious peace." Decades later, Oliver Wendell Holmes Jr., the famed jurist who had fought--and been wounded three times--in the Civil War, observed, "War, when you are at it, is horrible and dull. It is only when time has passed that you see that its message was divine."

Modern Americans don't talk this way anymore, but it is not obvious that we are very different in our attitudes toward war. Our martial tradition has remained remarkably durable, especially when compared with most other democracies in the post-World War II era. From 1989 to 2003, a 14-year period spanning three very different presidencies, the United States deployed large numbers of combat troops or engaged in extended campaigns of aerial bombing and missile attacks on nine different occasions: in Panama (1989), Somalia (1992), Haiti (1994), Bosnia (1995-1996), Kosovo (1999), Afghanistan (2001), and Iraq (1991, 1998, 2003). That is an average of one significant military intervention every 19 months--a greater frequency than at any time in our history. Americans stand almost alone in believing in the utility and even necessity of war as a means of obtaining justice. Surveys commissioned by the German Marshall Fund consistently show that 80 percent of Americans agree with the proposition that "[u]nder some conditions, war is necessary to obtain justice." In France, Germany, Italy, and Spain, less than one-third of the population agrees.

How do we reconcile the gap between our preferred self-image and this historical reality? With difficulty. We are, and have always been, uncomfortable with our power, our ambition, and our willingness to use force to achieve our objectives. What the historian Gordon Wood has called our deeply rooted "republicanism" has always made us suspicious of power, even our own. Our enlightenment liberalism, with its belief in universal rights and self-determination, makes us uncomfortable using our influence, even in what we regard as a good cause, to deprive others of their freedom of action. Our religious conscience makes us look disapprovingly on ambition--both personal and national. Our modern democratic worldview conceives of "honor" as something antiquated and undemocratic. These misgivings rarely stop us from pursuing our goals, any more than our suspicion of wealth stops us from trying to accumulate it. But they do make us reluctant to see ourselves as others see us. Instead, we construct more comforting narratives of our past. Or we create some idealized foreign policy against which to measure our present behavior. We hope that we can either return to the policies of that imagined past or approximate some imagined ideal to recapture our innocence. It is easier than facing the hard truth: America's expansiveness, intrusiveness, and tendency toward political, economic, and strategic dominance are not some aberration from our true nature. That is our nature.

Why are we this way? In many respects, we share characteristics common to all peoples through history. Like others, Americans have sought power to achieve prosperity, independence, and security as well as less tangible goals. As American power increased, so, too, did American ambitions, both noble and venal. Growing power changes nations, just as it changes people. It changes their perceptions of the world and their place in it. It increases their sense of entitlement and reduces their tolerance for obstacles that stand in their way. Power also increases ambition. When Americans acquired the unimaginably vast territory of Louisiana at the dawn of the nineteenth century, doubling the size of their young nation with lands that would take decades to settle, they did not rest content but immediately looked for still more territory beyond their new borders. As one foreign diplomat observed, "Since the Americans have acquired Louisiana, they appear unable to bear any barriers round them."

But, in addition to the common human tendency to seek greater power and influence over one's surroundings, Americans have been driven outward into the world by something else: the potent, revolutionary ideology of liberalism that they adopted at the nation's birth. Indeed, it is probably liberalism, more than any other factor, that has made the United States so energetic, expansive, and intrusive over the course of its history.

Liberalism fueled the prodigious territorial and commercial expansion in the eighteenth and nineteenth centuries that made the United States, first, the dominant power in North America and, then, a world power. It did so by elevating the rights of the individual over the state--by declaring that all people had a right to life, liberty, property, and the pursuit of happiness and by insisting it was the government's primary job to safeguard those rights. American political leaders had little choice but to permit, and sometimes support, territorial and commercial claims made by their citizens, even when those claims encroached on the lands or waters of foreigners. Other eighteenth- and nineteenth-century governments, ruled by absolute monarchs, permitted national expansion when it served personal or dynastic interests--and, like Napoleon in the New World, blocked it when it did not. When the king of England tried to curtail the territorial and commercial expansionism of his Anglo-American subjects, they rebelled and established a government that would not hold them back. In this respect, the most important foreign policy statement in U.S. history was not George Washington's farewell address or the Monroe Doctrine but the Declaration of Independence and the enlightenment ideals it placed at the heart of American nationhood. Putting those ideals into practice was a radical new departure in government, and it inevitably produced a new kind of foreign policy.

Liberalism not only drove territorial and commercial expansion; it also provided an overarching ideological justification for such expansion. By expanding territorially, commercially, politically, and culturally, Americans believed that they were bringing both modern civilization and the "blessings of liberty" to whichever nations they touched in their search for opportunity. As Jefferson told one Indian leader: "We desire above all things, brother, to instruct you in whatever we know ourselves. We wish to learn you all our arts and to make you wise and wealthy." In one form or another, Americans have been making that offer of instruction to peoples around the world ever since.

Americans, from the beginning, measured the world exclusively according to the assumptions of liberalism. These included, above all, a belief in what the Declaration of Independence called the "self-evident" universality of certain basic truths--not only that all men were created equal and endowed by God with inalienable rights, but also that the only legitimate and just governments were those that derived their powers "from the consent of the governed." According to the Declaration, "whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it." Such a worldview does not admit the possibility of alternative truths. Americans, over the centuries, accepted the existence of cultural distinctions that influenced other peoples to rule themselves differently. But they never really accepted the legitimacy of despotic governments, no matter how deeply rooted in culture. As a result, they viewed them as transitory. And so, wherever Americans looked in the world, they saw the possibility and the desirability of change.

The notion of progress is a central tenet of liberalism. More than any other people, Americans have taken a progressive view of history, evaluating other nations according to where they stood on the continuum of progress. The Russians, Theodore Roosevelt believed, were "below the Germans just as the Germans are below us ... [but] we are all treading the same path, some faster, some slower." If Roosevelt's language sounds antiquated, our modern perspective is scarcely different. Although we may disagree among ourselves about the pace of progress, almost all Americans believe that it is both inevitable and desirable. We generally agree on the need to assist other nations in their political and economic development. But development toward what, if not toward the liberal democratic ideal that defines our nationalism? The "great struggle of the epoch," Madison declared in the 1820s, is "between liberty and despotism." Because the rights of man were written "by the hand of the divinity itself," as Hamilton put it, that struggle could ultimately have only one outcome.

It was a short step from that conviction to the belief that the interests of the United States were practically indistinguishable from the interests of the world. "The cause of America is in a great measure the cause of all mankind," Thomas Paine argued at the time of the revolution. Herman Melville would later write that, for Americans, "national selfishness is unbounded philanthropy; for we cannot do a good to America but we give alms to the world." It was another short step to the belief that the United States had a special, even unique, role to play in serving as a catalyst for the evolution of mankind. "The rights asserted by our forefathers were not peculiar to themselves," Seward declared, "they were the common rights of mankind." Therefore, he said, the United States had a duty "to renovate the condition of mankind" and lead the way to "the universal restoration of power to the governed" everywhere in the world. Decades earlier, John Quincy Adams had noted with pride that the United States was the source of ideas that made "the throne of every European monarch rock under him as with the throes of an earthquake." Praising the American Revolution, he exhorted "every individual among the sceptered lords of mankind: 'Go thou and do likewise!'"

A Russian minister, appalled at this "appeal to the nations of Europe to rise against their Governments," noted the hypocrisy of Adams's message, asking, "How about your two million black slaves?" Indeed. The same United States that called for global revolution on behalf of freedom was, throughout its first eight decades, also the world's great defender of racial despotism. The slaveholding South was itself a brutal tyranny, almost totalitarian in its efforts to control the speech and personal behavior of whites as well as blacks. Much of the U.S. territorial expansion in the nineteenth century--including the Mexican War, which garnered today's American Southwest and California--was driven by slaveholders, insisting on new lands to which they could spread their despotic system.

In the end, the violent abolition of slavery in the United States was a defining moment in the country's foreign policy: It strengthened the American tendency toward liberal moralism in foreign affairs. The Northern struggle against slavery, culminating in the Civil War, was America's first moral crusade. The military defeat of the Southern slaveholders was America's first war of ideological conquest. And what followed was America's first attempt at occupation and democratic nation-building (with the same mixed results as later efforts). The effect of the whole struggle was to intensify the American dedication to the universality of rights and to reaffirm the Declaration of Independence, rather than the Constitution with its tacit acceptance of slavery, as the central document of American nationhood. The Civil War fixed in the American mind, or at least in the Northern mind, the idea of the just war--a battle, fought for moral reasons, whose objectives can be achieved only through military action.

Such thinking led to the Spanish-American War of 1898. One of the most popular wars in U.S. history, it enjoyed the support of both political parties, of William Jennings Bryan and Andrew Carnegie, of eastern Brahmin Republicans like Henry Cabot Lodge, radical prairie populists, and labor leaders. Although one would not know it from reading most histories today, the war was motivated primarily by humanitarian concerns. Civil strife in Cuba and the brutal policies of the Spanish government--in particular the herding of the civilian population into "reconcentration" camps--had caused some 300,000 deaths, one-fifth of Cuba's population. Most of the victims were women, children, and the elderly. Lodge and many others argued that the United States had a responsibility to defend the Cuban people against Spanish oppression precisely because it had the power to do so. "Here we stand motionless, a great and powerful country not six hours away from these scenes of useless bloodshed and destruction," he said, imploring that, if the United States "stands for humanity and civilization, we should exercise every influence of our great country to put a stop to that war which is now raging in Cuba and give to that island once more peace, liberty, and independence." The overwhelming majority of the nation agreed. The U.S. intervention put an end to that suffering and saved untold thousands of lives. When John Hay called it a "splendid little war," it was not because of the smashing military victory--Hay was no militarist. It was the lofty purposes and accomplishments of the war that were splendid.

It was also true that the United States had self-interested reasons for going to war: commercial interests in Cuba, as well as the desire to remove Spain from the hemisphere and establish our preeminence in the region. Most of Europe condemned the United States as selfish and aggressive, failing to credit it with humanitarian impulses. Moreover, the war produced some unintended and, for many who idealistically supported it, disillusioning consequences. It led to the acquisition of the Philippines and a most unsplendid war against independence-minded Filipinos. It also produced a well-intentioned, but ultimately disappointing, multiyear occupation of Cuba that would haunt Americans for another century. And it reignited an old debate over the course of U.S. foreign policy--similar to the one that consumes us today.

Now, as then, the projection of U.S. power for liberal purposes faces its share of domestic criticism--warnings against arrogance, hubris, excessive idealism, and "imperialism." Throughout the eighteenth and nineteenth centuries, conservatives in the republican tradition of Patrick Henry worried about the effect at home of expansive policies abroad. They predicted, correctly, that a big foreign policy generally meant a big federal government, which--in their eyes--meant impingements on the rights and freedoms of the individual. The conservatives of the slaveholding South were the great realists of the nineteenth century. They opposed moralism, rightly fearing it would be turned against the institution of slavery. As Jefferson Davis put it, "We are not engaged in a Quixotic fight for the rights of man. Our struggle is for inherited rights. ... We are conservative." At the end of the century, when Americans were enthusiastically pushing across the Pacific, critics like Grover Cleveland's long-forgotten secretary of state, Walter Q. Gresham, warned that "[e]very nation, and especially every strong nation, must sometimes be conscious of an impulse to rush into difficulties that do not concern it, except in a highly imaginary way. To restrain the indulgence of such a propensity is not only the part of wisdom, but a duty we owe to the world as an example of the strength, the moderation, and the beneficence of popular government."

But, just as progressivism and big government have generally triumphed in domestic affairs, so, too, has the liberal approach to the world beyond our shores. Henry failed to defeat the Constitution. Southern realism lost to Northern idealism. The critics of liberal foreign policy--whether conservative, realist, or leftist--have rarely managed to steer the United States on a different course.

The result has been some accomplishments of great historical importance--the defeat of German Nazism, Japanese imperialism, and Soviet communism--as well as some notable failures and disappointments. But it was not as if the successes were the product of a good America and the failures the product of a bad America. They were all the product of the same America. The achievements, as well as the disappointments, derived from the very qualities that often make us queasy: our willingness to accumulate and use power; our ambition and sense of honor; our spiritedness in defense of both our interests and our principles; our dissatisfaction with the status quo; our belief in the possibility of change. And, throughout, whether succeeding or failing, we have remained a "dangerous" nation in many senses--dangerous to tyrannies, dangerous to those who do not want our particular brand of liberalism, dangerous to those who fear our martial spirit and our thumos, dangerous to those, including Americans, who would prefer an international order not built around a dominant and often domineering United States.

Whether a different kind of international system or a different kind of America would be preferable is a debate worth having. But let us have this debate about our future without illusions about our past.

Monday, October 30, 2006

Adam's blogbox: the truth about Iraq you won't get from Bush/Cheney or our media

The weird thing about Iraq is that it has no central government -- none whatsoever.

The official US-sanctioned ‘government’ sits in the Green Zone and has no power. Mostly, these officials travel overseas on our tax money, sometimes to stand by Bush or Blair and make announcements. At one point this year the entire Iraqi cabinet was overseas.

Meanwhile, outside the Green Zone, amidst the facts on the ground that people are always talking about, various militias are running various parts of the country. And in those parts they’re running, they devote their time to offing their ethnic rivals in what some call "sectarian violence" and others "civil war," and what I'd call your basic Middle East government: warlordism.

It would be easy to split this no-central-government country in three: a Kurdish zone (the Kurds have run themselves for years now, and their warlords don't allow any Arabs in), a Shiite zone, and a Sunni zone -- though the Sunnis don't want a split, because their area has no oil, and they, who ruled Iraq under Saddam, would find themselves fatally impoverished.

So you can divide the country in three, but how do you divide Baghdad? It would be like trying to separate a rabid threesome of snakes on Viagra. In the capital, Sunnis and Shiites live cheek by jowl. Neighborhoods are mixed; unmixed Sunni and Shiite neighborhoods sit right next to each other, on both sides of the river. That's why the killing in Baghdad goes on regardless, openly in daylight on the streets for all to see, as Shiites and Sunnis try to consolidate the neighborhoods they can consolidate.

The ‘government’ of Iraq is not a government like we understand it. If the US pulled out today, the Iraqi ‘government’ would immediately flee overseas, back to where many of its members have come from -- expats who returned to their home country when Saddam was overthrown. They returned after being away for 30 years, not knowing a damn thing about their country anymore, in order to batten on the largesse of the US occupation. This they know how to do -- vampire fleas sucking themselves fat as hogs from our tax dollars like it's warm blood on tap. We’re not just talking thousands and millions; we’re talking billions as well. These guys are more corrupt than a clusterfuck of Abramoffs.

They’re making hay while the sun shines, because they know that once the US pulls out, they’ll have to get out of Dodge pronto themselves -- to escape being whacked by their own angry citizens.

The truth about Iraq is that it has become a vast boondoggle for the benefit of this corrupt 'government' and our own military-industrial complex: for Halliburton's KBR, for Bechtel, for the extended Bush family (Bush Sr, who gets greased by the Carlyle Group, Uncle 'Bucky' Bush, who runs the military contractor ESSI, Neil Mallon Bush of the Silverado savings and loan scam, who helps companies cash in on Iraq, and Marvin Pierce Bush), and for other Bush/Cheney cronies who've connected their fat butts via a direct pipeline into our Treasury, and who've dialed the sucking force of said butts right up to eleven (see Iraq pieces below).

Iraq itself is a mess, warlorded by militias, who use drills and blowtorches on their opponents before they shoot them in the back of the head. Great swathes of Shiite Iraq are under the control of the Mahdi Army, who are killing Sunnis like they’re drowning fresh grosses of kittens every day.

And who created this mess? One Paul Bremer of the CPA, when he disbanded the Iraqi army and sent its soldiers home unemployed, with no further pay but with their guns and ammo in hand, to fuel the insurgency and "sectarian violence."

That was bad enough. What was maybe worse was firing the entire Iraq bureaucracy that ran the country, because they were Baathists. They were Baathists because they had to be Baathists to get a job from Saddam, but Bremer thought they were the brethren of Satan, so he de-Baathified the administration and Armageddoned Iraq's entire administrative structure.

In two fell swoops, Bremer destroyed the two institutions that held the country together. He left Iraq without any functioning administration, and then ducked out himself.

The irony is he didn't have to be the stupidest administrator since a horse was declared a Roman Senator by its proud emperor-owner. He had a great historical precedent to learn from. How did we get Germany back on its feet after WW2? We pumped in a lot of money, yes. But mainly, we tried and executed a handful of the top Nazi brass, but we weren't so dumb as to de-Nazify the whole place from top to bottom. We sort of interviewed people about their backgrounds and lectured them about their politics, but basically we let the rest of the Nazi businessmen, officials, lawyers and others return to their business as usual to build their country back up.

In Iraq, our neocons were too stupid to follow our own successful historical blueprint. These guys are so dumb, they miss the toilet when they sit down to take a shit. For some pie-in-the-sky conservative dream state, ideologue Bremer wrecked the government that was there. God knows what he was thinking. Did he think he was going to build a new country by letting a bunch of twenty-something rightwing American staffers in the Green Zone run a foreign place along evangelical lines? Or create a conservative haven with his decrees about a maximum of 15% corporate taxes and other conservative BS -- with nobody to enforce these laws? Or privatize the economy and sell it to Halliburton and other US corporations for them to run some capitalist-type heaven in the midst of a civil war? The same US companies who've not been able to get electricity and water back to pre-war levels? The same companies that charge the American taxpayer up to 45 percent overhead for their own staff salaries and housing out of their 'reconstruction' budgets, and overcharge the Pentagon for not reconstructing anything?

Today Iraqis are longing for the days of Saddam, when they had security. And when there was much less snuffing of the neighbors. The country is thoroughly butt-bonked, blood-assed and bum-buggered for the next 20 years. And we did it, not the Iraqis. We destroyed their institutions, and let them vote for a bunch of corrupt pols who play us for suckers and sit high and dry in the safety of the Green Zone and act like they’re running the country, a country they’re too afraid to travel in, unless surrounded by US Army tanks, jeeps and Humvees.

What a mess. It’s insoluble. Bush says we’re training the Iraqis to stand up so we can stand down. Such BS. What we’re actually doing is training the Shiites in the army to kill the Sunnis, and giving them the arms to do it with. That’s what we’re doing. We’re knee-deep in a civil war, on the side of the Shiites.

Is Bush telling us any of this? No. All we get from him is the usual: lies, lies, and more lies. The man can't open his mouth without a lie falling out, one he sucked out of Karl Rove's ass the night before.

It’s an abdication of responsibility last seen in the governments of Caligula and Nero.

Impeachment. Jeez, that's not enough. In any other epoch, there'd probably be some kind of military tribunal court-martial thing. In our country, the whole nation deserves to be court-martialed. We have completely destroyed another country. For what? Not even Bush or Cheney could tell you straight anymore, even if they wanted to. They've probably forgotten by now that originally, hyped up by the quick success of the Afghan War, they thought the Iraq invasion would net them a friendly puppet regime under Chalabi, who'd give their Texas oil buddies the exploration rights that Saddam gave to the Russians and Chinese. And also score us 14 big fat military bases, which the Pentagon is still building in Iraq, from which they hoped to rule the Middle East. There's about as much chance of that happening as Paris Hilton making a videotape of her and Bill Frist doing each other doggie-style.

And what did Bush/Cheney get besides billions for their cronies? Thousands of dead American soldiers, this October closing in on 100 killed and another 400 wounded, the worst month ever -- plus 655,000 dead Iraqis.

What they got was blood on their hands, buckets and rivers and Niagara Falls of it.

This is what your President has done, America. This is what your country has done. Your President refuses to acknowledge any responsibility. He’ll never have the guts to do it.

So the question we as a nation have to ask ourselves is this: will we?

Music: what's the most sampled record in hip-hop?

All Rise for the National Anthem of Hip-Hop -- by WILL HERMES

THIS is a story about a nearly forgotten album and the birth of hip-hop music. Like many good hip-hop tales, and pop yarns in general, it involves unlikely characters rising from obscurity and is colored with creative passion, violence, drugs, thievery, paydays and paybacks.

It’s a story of a Bronx D.J. making his name with a record that began as the soundtrack for a B-movie called “The Thing With Two Heads.” And it suggests that the two most important drummers in rap history might be a guy who spent his career touring behind Neil Diamond and another who played with John Lennon and Eric Clapton before stabbing his mother to death and being committed to a mental hospital.

The record in question is “Bongo Rock,” a 1972 LP by the Incredible Bongo Band, which, after years of bootlegging, is being properly reissued by the Mr. Bongo label on Tuesday, paired on a single disc with its 1974 follow-up, “The Return of the Incredible Bongo Band.” Created by a group of Hollywood session musicians who never toured, it’s a set of sometimes thrilling, sometimes cheesy instrumentals built on tight brass charts, psychedelic guitar riffs, funky keyboard vamps and heavy percussion.

“Bongo Rock” is significant, however, for being one of the musical cornerstones of rap. While it’s hard to measure these things accurately, it is certainly one of the most sampled LP’s in history, if not the most sampled. Most every history-minded hip-hop D.J. has a copy, and the first few bars of its signature number, a driving cover version of the 60’s instrumental number “Apache,” can send crowds into overdrive.

The Bongo Band’s “Apache” has been recycled continually in rap songs over the years; just this past August, Missy Elliott won an MTV video award for the clip to her song “We Run This,” whose central motif is lifted wholesale from “Apache.” According to Kool Herc, the stylistic pioneer many people consider to be the father of hip-hop music, the Bongo’s “Apache” is “the national anthem of hip-hop.”

The story of “Bongo Rock,” however, begins far from hip-hop’s birthplace in New York, with Michael Viner, a white kid from Washington who worked for Senator Robert F. Kennedy in Los Angeles. After Kennedy’s assassination in 1968, Mr. Viner, then 24, roomed with his friend Rosie Grier, the football player turned actor, and returned to the film industry, where he had worked as a youngster. He rose through the ranks and was soon in charge of music for MGM.

“I did soundtracks for American International Pictures — Frankie Avalon and Annette Funicello,” Mr. Viner said from his office in Beverly Hills. “I worked with Sinatra. My first big hit was ‘Candyman’ by Sammy Davis Jr. I signed him to MGM. I picked that song.”

Mr. Viner, who is 62 and runs Phoenix Books, a successful publisher and audio-book label that counts Bill Maher and the Kiss singer Gene Simmons among its clients, is soft spoken as Hollywood moguls go. But he’s a savvy businessman. Forthcoming, too.

“I was always a second- or third-rate musician,” he confessed. “I can play a few things badly. But I was always smart enough to get much more talented people to surround myself with.”

And so it was when he assembled a group to score a chase scene for “The Thing With Two Heads,” a ludicrous but racially provocative horror film starring his pal Mr. Grier and Ray Milland as, respectively, a black head and a white head attached to a single body. The resulting two tracks, “Bongo Rock” and “Bongolia,” appeared on the soundtrack LP and, later, on a seven-inch single, which became a surprise hit.

When R&B D.J.’s began playing it, Mr. Viner explained, “the record company spent extra money putting our pictures on the little 45 sleeves. But as soon as people saw all the white guys in the band, it stopped selling. So they went back to the generic blank sleeve.”

Flush with some success, the group reconvened to make a full-length LP. And befitting its name, the core of the band was two powerhouse percussionists. One, the bongo and conga player King Errisson, was a revered session man from the Bahamas who played limbo shows in a Miami strip joint before jumping into New York’s early-60’s jazz scene, where he befriended musicians like Miles Davis and Cannonball Adderley. Mr. Errisson eventually found his groove in California, where, among other gigs, he was Motown’s go-to guy for percussion in the late 60’s and early 70’s, after the label relocated from Detroit to Los Angeles. He played on sessions by the Temptations, the Four Tops, the Jackson Five and the Supremes.

“I got my real education in L.A., which had the greatest musicians in the world,” Mr. Errisson said from his home in Las Vegas. “That was my universe. I was doing four or five sessions a day with different artists. If you didn’t learn then, you didn’t have an ear to learn.”

The drummer was Jim Gordon, a handsome California boy probably best remembered as part of the group Derek and the Dominos. During that stint he came up with the famous piano coda for the hit “Layla,” written with his bandmate Eric Clapton. His fat, precise beats also turn up on recordings by John Lennon (the “Imagine” album), George Harrison, Traffic, Frank Zappa, Steely Dan, Merle Haggard and the Monkees.

“In my book, he was the No. 1 session drummer in town in the 60’s and 70’s,” recalled the Los Angeles producer and songwriter Perry Botkin, who arranged the material on “Bongo Rock.” “I would put off record dates just to make sure I could get him. He was absolutely incredible.”

“Bongo Rock,” credited to Michael Viner’s Incredible Bongo Band, was released in 1972 and did not set the world on fire. “It bombed,” said Mr. Botkin, who could barely recall the details of the recording. “It was just another session. It didn’t mean anything.” The band soon regrouped for a follow-up LP, “The Return of the Incredible Bongo Band,” which fared even worse.

And that, for all anyone involved could guess, was the end of it. Mr. Viner and Mr. Botkin turned their attention to other projects. Mr. Errisson began his tenure with Neil Diamond’s band, which continues.

Mr. Gordon, for his part, had dived headlong into the rock scene. After years of living the high life and maintaining his reputation as a consummately reliable player, he became a famous drug casualty. He stopped playing altogether and later claimed he was haunted by imaginary voices. After a series of violent episodes and self-elected stays in psychiatric hospitals, he murdered his mother — whom he identified as one of the voices in his head — with a knife in 1983.

“He was the sweetest guy you’d ever want to meet,” Mr. Botkin recalled of Mr. Gordon, who is currently an inmate at the California Medical Facility, a state prison in Vacaville, Calif. “We were very good friends. The drugs got him.”

Meanwhile, back in late 1972 in the Bronx, a young Jamaican immigrant who worked as a D.J. at parties under the name Kool Herc discovered the “Bongo Rock” LP through his colleague, DJ Timmy Tim. He had heard the “Bongo Rock” single, which he thought was O.K. But “Apache” was something else. Beginning with the tandem drumming of Mr. Errisson and Mr. Gordon, the song peaks like a fireworks display, with bursts of organ, horns and surf guitar exploding amid a rain of bongo and kit-drum beats. It drove dancers crazy at the Hevalo, on Jerome Avenue between Tremont and Burnside, where Herc had a steady gig and where he first played the record for a crowd.

“I used that record to start what I called the Merry-Go-Round,” he explained in a telephone interview, retelling an oft-told story. “It was the segment where I played all the records I had with beats in them, one by one. I’d use it at the hypest part of the night, between 2:30 and 3 a.m. Everybody loved that part of my format.”

Soon, the Merry-Go-Round evolved, as Herc acquired extra copies of certain records, which allowed him to extend percussion-driven sections of songs indefinitely through hand manipulation of the turntables, creating hypnotic percussive loops. The “Bongo Rock” LP — specifically “Apache,” but other tracks, too — was the first record he used in this way. Others followed suit, using breakbeats (as the percussion samples became known) to undergird the chants and rhymes and exclamations of M.C.’s. Rap was born.

The “Bongo Rock” LP never fell far from favor as a hip-hop building block, from the Sugarhill Gang’s 1981 rap version of “Apache” to tracks by LL Cool J, Moby, Nas and countless other artists. The lure of “Apache” has obsessed fans as well. Last year Michaelangelo Matos, a music writer, compiled a history of the song, tracing its evolution from the British guitarist Bert Weedon’s mellow 1960 version through more uptempo versions by the Shadows, the Danish guitarist Jørgen Ingmann (who had a No. 1 hit with it in the United States) and the Ventures, up to modern-day sampled versions. A fellow music historian and blogger, Oliver Wang, picked up on Mr. Matos’s paper (first presented at the annual Experience Music Project Pop Conference in Seattle) and posted a clutch of “Apache” MP3’s on his music blog, Soul Sides.

Eventually the creators of “Bongo Rock” became aware of their music’s rebirth. The original Bongo Band recordings had been contractually slated to revert to Mr. Viner from MGM after eight years. But the company had been sold and bought and sold again, so Mr. Viner had to do some detective work to claim ownership, which he secured in 1990. Attempting to track down the countless bootleggers, let alone getting paid for what some guess are thousands of sample usages, was another matter.

“I just tried to get some of the illegal recordings off the market piece by piece,” Mr. Viner said, adding that he had been only partly successful. Scanning eBay during a phone interview, he laughed over his finds. “My favorites are records that we never even made,” he said. “Here: ‘Vincent Miner’s Incredible Bongo Band Promo CD.’ It’s amazing. ‘Killer Funk’ LP by the Incredible Bongo Band — $79.99.” (He noted that there had been one authorized reissue of “Bongo Rock,” in 2000, by a British label, now defunct, that he says never paid him.)

Mr. Viner recently hired someone to help him track down payments for sample usages in cases where a substantial portion of a Bongo Band recording was used in making a song. But most of their work will be backtracking. Current rap artists are happy to pay for the imprimatur of using the original Incredible Bongo Band recordings, which are not just laden with history but remarkably funky.

Perhaps the most notable is Nas, who has used Incredible Bongo Band samples on “Made Ya Look” and “Thieves Theme” (the latter using the band’s cover of Iron Butterfly’s “In-A-Gadda-Da-Vida”), among other songs. “Those breaks are so hip-hop,” he said in an e-mail message via his label, Def Jam. “I’m going to continue using them again and again.” Mr. Viner said he was pleased that Nas has a new record due later this year that reportedly does just that.

And never one to miss a business opportunity, Mr. Viner has reconvened a new version of the Incredible Bongo Band, which is finishing work on a record he plans to shop to labels. There are also plans for the group’s first public performances, maybe even a tour.

The lineup isn’t the same: not only is Mr. Gordon absent, but Perry Botkin has retired from for-hire arranging in favor of making experimental electronic music at home. But there’s talk of Kool Herc being involved. And Mr. Errisson, who put the bongos in the original recordings, is back at the forefront (Neil Diamond is taking a year off, so he has time on his hands). This time, though, he’s a bit savvier about the business end of things.

“Back then, I was so busy and so biggity, I didn’t pay attention to the one percent he offered me,” Mr. Errisson said with a laugh, referring to Mr. Viner’s offer of royalties on the original recordings. “I said, ‘Just pay me,’ you know? I got what I asked for. But now the thing turned out to be a smash. There you go — those things happen.” His deal with Mr. Viner for the coming record, he said, is more favorable: a 50-50 split.

“That’s better than all the wives I ever married, that kind of deal,” Mr. Errisson said. “Michael has a lot of great ideas. He comes up with these corny tunes, but they’re great tunes that we can make work. He seems to know what he’s doing, so I think we’ll do O.K.”

And what is Mr. Viner’s working title for the new record? (Cue drummer’s rim shot.) “Sample This.”