Waving the Bloody Shirt
by MARK GABRISH CONLAN
Copyright © 2006 by Mark Gabrish Conlan for Zenger’s Newsmagazine • Used by permission
A funny — or maybe not-so-funny — thing happened to the Democratic Party on its way to what a lot of people thought would be an election in which at least one house of Congress would revert to Democratic control. The fifth anniversary of the 9/11 attacks occurred — and there was President Bush on TV virtually every day, making masterly speeches that all but came out and said, “If you elect Democrats, the terrorists win.” There Bush was, again and again, saying on September 6, “To win the war on terror, we must be able to detain, question, and, when appropriate, prosecute terrorists captured here in America, and on the battlefields around the world” — meaning, in context, that we can no longer afford to allow niggling little nit-picking about Constitutional rights and due process to get in the way; we have to be able to hold so-called “terror suspects” indefinitely and either not try them at all or put them before special military tribunals where they won’t be allowed legal representation, the chance to confront the witnesses against them, or even knowledge of the evidence on which they’re being charged.
Earlier in the year, when Democrats in Congress hailed the U.S. Supreme Court decision demanding that the detainees at Guantánamo be tried under due-process rights and denounced the Bush administration programs to wiretap at will and monitor the telephone calls of all Americans looking for so-called “patterns” that might indicate terrorists, Bush’s principal political strategist, Karl Rove, said, “Bring it on.” Rove was supremely confident that he and his boss had done such a great job of keeping the people afraid of new acts of terrorism — and convinced that only the authoritarian policies of the Bush administration can keep the U.S. from being hit by terrorists again — that any attempt on the part of Democrats to make the conduct of the “war on terrorism” an issue would backfire against them and help the Republicans.
On September 21, the Los Angeles Times published the results of a poll it co-sponsored with the Bloomberg network that proved that Bush and Rove were right. “President Bush’s approval rating has reached its highest level since January, helping to boost the Republican Party’s image across a range of domestic and national-security issues just seven weeks before this year’s midterm election,” Ronald Brownstein’s story on the poll results began. “Democrats hold a lead in the poll, 49 percent to 39 percent, when registered voters are asked which party they intend to support for Congress this year. But that advantage may rest on softening ground; on virtually every comparison between the parties measured in the survey, Republicans have improved their position since early summer. In particular, Republicans have nearly doubled their advantage when voters are asked which party they trust most to protect the nation against terrorism.”
In the poll, 49 percent of the respondents said they thought the Republicans would be better than the Democrats at “national security/war on terrorism” versus 32 percent who thought the Democrats would be better — and if that result holds on November 7 the Democrats might just as well stay home. The Republicans were able to make historically unprecedented gains in the 2002 midterm elections and increase their Congressional majorities in 2004 partly by creating social “wedge issues” like same-sex marriage that split African-Americans and other socially conservative people of color away from the Democrats, but mainly by making the “war on terror” the Issue of Issues, the one that separates the political men from the boys — or girls.
The Republicans have done this sort of thing many times before. Beginning in the 1866 midterm campaign, they engaged in a campaign tactic that came to be called “waving the bloody shirt.” Over and over again, Republican orators reminded voters of which party had tried to appease the South and preserve and expand slavery — and which party had forcefully resisted the South and won the Civil War. “Waving the bloody shirt” enabled the Republicans to win the next four Presidential elections after Lincoln’s assassination — even though the first postwar President they elected, Civil War hero Ulysses S. Grant, ran the most corrupt administration the U.S. had seen to that time and essentially sold policy to the highest corporate bidder. Even as voters’ memories of the Civil War faded, the political machine the Republicans had built up by “waving the bloody shirt” enabled them to dominate American politics until the Depression, when the failure of “trickle-down” economic policies benefiting the rich became obvious for all to see.
While the Democrats got to do a bit of their own bloody-shirt waving during World War II — reminding voters of which party had wanted to challenge Hitler and the Axis and which party had provided most of the isolationists — the Republicans seized the tactic back and used it with a vengeance throughout the 1950’s. This time the “bloody shirt” was the threat of Communism in general and the accusation that the Democrats had “lost China,” and therefore were responsible for the millions murdered under Mao’s repressive regime, in particular. Under anti-Communist hawk Presidents like Truman, Kennedy and Johnson, the Democrats were able to neutralize the “bloody shirt” a bit by proclaiming their own “toughness on Communism” — resulting in the debacle of Viet Nam and allowing the Republicans to retake the “national security” issue and wrest domination of American politics from such seemingly hapless Democrats as McGovern, Carter, Mondale, Dukakis and Clinton.
When al-Qaeda’s hijacked planes slammed into the World Trade Center and the Pentagon, the Republican Party got the biggest, bloodiest, reddest shirt they’d had to wave since the Civil War — and they’ve made full use of it. An entire Republican infrastructure of communication aimed way beyond what we usually think of as “politics” — a network of churches spewing forth Republican propaganda from the pulpit every Sunday (while the Internal Revenue Service ignores them and instead goes after a peace-oriented Left-leaning congregation in Riverside and threatens them with loss of their tax exemption), talk radio stations broadcasting Republican propaganda 24/7, and a largely successful effort to bend the mainstream corporate media in a more Right-wing direction — projects the message again and again that Republicans are Menschen and Democrats are wimps; Republicans will keep you safe and Democrats will appease the terrorists; 9/11 was Clinton’s fault, not Bush’s; there won’t be any successful terrorist attacks on the U.S. if the Republicans stay in power but there will be if the Democrats take over.
The Republicans have far more going for them than the “bloody shirt” of 9/11 and a communications infrastructure the Democrats can’t even begin to rival. (The pathetic attempt at a “liberal” talk-radio network, Air America, is so broke it can’t even afford to pay its star broadcaster, Al Franken.) They’ve already taken power in so many state legislatures — where the lines of Congressional districts are drawn — that they’ve been able to use modern remapping software to draw districts meticulously crafted to elect as many Republicans as possible. The Republican National Committee has a program nicknamed “Voter Vault” that can actually target voters as individuals and research otherwise Democratic-leaning voters for the one piece of information about them that will give the Republicans a chance to win them over — like the mailing the GOP just sent to snowmobile owners in Michigan to persuade them that incumbent Democratic Senator Debbie Stabenow is an “environmental extremist” who’s going to block them from using their snowmobiles. Not only have the Democrats barely even begun to create anything similar, the information that fuels Voter Vault comes from databases created by private corporations — and there’s nothing to stop them from selling this information to Republicans while denying it to Democrats.
The Republicans also have one other key advantage: they control the actual process of counting votes. All the computer equipment and programs used in U.S. elections are made by three companies, all of whose CEO’s are heavy donors to the Republican party and its candidates. Elections in America are run on a state-by-state basis, and many decisions like how many polling places to open in which neighborhoods are made by partisan secretaries of state — and we’ve seen from the records of Katherine Harris in Florida in 2000 and Ken Blackwell in Ohio in 2004 how a secretary of state willing to bend his or her decisions in a direction that favors voters in Republican-leaning areas and handicaps voters in Democrat-leaning areas can, if not determine an outcome, at least twist it enough to support the Republicans’ bloody shirts, wedge issues and Voter Vaults and help them to victory.
But the biggest advantage the Republicans have over the Democrats is that Republicans are a vertebrate life form and Democrats aren’t. Republicans have the courage of bad convictions while Democrats seem to pride themselves on not having any convictions at all. Republicans protect, service and coddle their base, while Democrats continually bash theirs — from racist attacks on African-American cultural figures to passing “free-trade” treaties that destroy American jobs and undermine what’s left of the private-sector labor movement. Republican strategists win by reaching out to their base and getting their partisans excited about voting for them; Democrats continually reach for the will-o’-the-wisp of “swing voters” — and keep losing.
In a way, the 2006 elections are a win-win situation for the Republicans. If the Democrats do eke out a bare majority in the House of Representatives, the Republicans will have them available as a ready-made scapegoat for their own failures. If the Democrats can’t parlay the abysmal record of Bush’s incompetence — Iraq, Katrina, the attempt to privatize Social Security in the guise of “reform,” a prescription drug plan that has enriched the pharmaceutical companies at the expense of the federal government and America’s senior citizens, and so on — into at least a House majority, their credibility as an opposition party will be smashed for decades to come and the “bloody shirt” of 9/11 will wave over Washington, D.C. for the rest of our lives.
Monday, September 25, 2006
International Law Professor Speaks on Lebanon War
Both Sides Broke Standards, but Israel Went Much Farther
by MARK GABRISH CONLAN
Copyright © 2006 by Mark Gabrish Conlan for Zenger’s Newsmagazine • Used by permission
When Israel sent its army into Lebanon in mid-July as a response to a border raid from Hezbollah, the Shi’ite Islamic militia that helped drive Israel out of Lebanon six years earlier, the entire world took notice. But much of the media reporting on the war, especially in the United States, didn’t take into account the historical context of the attacks or analyze whether they met the standards of the Geneva Conventions and the other international treaties governing the actions of nation-states at war. On September 21, the United Nations Peace Association sponsored a talk by San Diego State University international relations and international law professor Jonathan Graubart at the Hall of Nations in Balboa Park that attempted to put the conflict — and the earlier battle between Israeli forces and Hamas militants in Gaza — in context and analyze its rights and wrongs in terms of international law. His conclusion: both sides broke international law, but Israel’s violations were far greater and more serious than those of Hezbollah or Hamas.
“Hezbollah starts the most recent activities by a border incursion and kidnapping two Israeli soldiers,” Graubart explained. “That’s illegal, but Israel’s response is disproportionate. One rule in international law is that you don’t up the ante. Israel should have gone through diplomatic channels [to negotiate the release of Hezbollah’s prisoners] or at most staged a proportionate response. The massive bombing of Lebanon has no justification under international law.”
Graubart also explained the doctrine of “acceptable warfare,” defined in Common Article 3 of the Geneva Conventions — a provision that’s been in the news in the U.S. a lot lately because it also contains the prohibition against torture or “cruel and inhuman treatment” of prisoners or detainees which the Bush administration is trying to set aside on the grounds that it’s supposedly “vague.” The part of Common Article 3 most relevant to the conflict between Israel and Hezbollah, Graubart explained, is the idea of “differentiation between civilian and military targets.”
According to Graubart, Common Article 3 flatly prohibits attacks on civilian targets “unless they’ve been converted to military use.” For so-called “dual-use” targets — ones which have both civilian and military uses — “the burden of proof is on the attacker” to show that the military value of hitting the target outweighs the collateral damage in civilian deaths and property destruction. This meant that Hezbollah’s rocket attacks on Israel were illegal — but, Graubart added, “the U.S. attack on Kosovo in 1999 was worse.” Graubart also said that Israel’s worst violation of international law was its open attack on Lebanon’s civilian population on the ground that they were “supporting” the Hezbollah militia. “Collective punishment of the broader population of Lebanon for ‘support’ of the militia is illegal,” he said.
Acknowledging that there is no institution with either the authority or the power to punish violators of international law the way criminal justice systems routinely punish people who break domestic laws, Graubart said, “What’s the relevance and value of international law? It’s useful, but it’s a status quo institution. It sets forth useful ideas about sovereignty and standards. Certainly, it’s biased towards the powerful. They have the biggest role in making and enforcing treaties. The United Nations Security Council is shaped by the great powers” — specifically the United States and the other four nations with veto power (Britain, France, Russia and China) — “and will be successful to the extent the great powers allow it to be.”
Graubart said that the attempt to use the U.N. to broker a cease-fire to the Israel-Lebanon war was hamstrung and delayed for over a month by both U.S. and Israeli intransigence. “There were quick international calls for a cease-fire [after the Israeli attack] but Israel and the U.S. didn’t agree,” he explained. “The U.S. got in the way of a cease-fire for a while, so a lot of people died.” Israel refused to agree to a cease-fire, Graubart said, because they staged the attack in the first place in order to disarm the Hezbollah militia completely — and it was only “in early to mid-August, when Israel’s political and military leaders realized they couldn’t utterly defeat Hezbollah,” that they agreed to the cease-fire embodied in U.N. Security Council Resolution 1701.
According to Graubart, Israel originally wanted that resolution to include a demand that Hezbollah disarm — thereby asking an international peacekeeping force to do what Israel’s own military hadn’t been able to — “but that would have been suicidal for an international force.” Graubart added that the attack on Lebanon had originally been supported by 80 percent or more of Israel’s population, but later Israeli public opinion shifted to a consensus, not that the attack was wrong, but “that it was militarily unwise.” One reason for that change of mind was the extent to which the attack boosted the prestige of Hezbollah, “not only in Lebanon but throughout the entire Middle East,” Graubart explained. After all, Hezbollah had held out against a vastly larger, better-equipped Israeli force for over a month — longer than the professional armed forces of Egypt, Syria and Jordan were able to do in their 1967 and 1973 wars with Israel.
In fact, while Israel’s motive for the attack was supposedly to disarm Hezbollah, “Israel’s actions were designed to stop any momentum for disarmament,” Graubart said. He also noted that, though the resolution provides for humanitarian aid to the people of Lebanon — over 700,000 of whom were made homeless by the Israeli attack — “at the U.S.’s insistence, it does not allow aid to go to areas controlled by Hezbollah.” Graubart said that this condition — which the U.S. is imposing on a lot of other international aid programs and peacekeeping missions — is dangerous not only because it punishes innocent Lebanese for living in the “wrong” part of their country, but also because a peacekeeping or humanitarian mission can function only if it’s seen as neutral, and the U.S. ban on aid in certain areas makes it look as if the aid workers have taken sides.
Graubart also mentioned the virtually forgotten prologue to Israel’s attack on Lebanon: a similar attack on the Palestinian outpost in the Gaza Strip between Israel and Egypt. That started much the way the Lebanese attack did: with a border incursion by Hamas, the combination guerrilla army, political party and social-service organization that now controls the Palestinian government. Hamas staged a raid on an Israeli military outpost on June 25, killing two Israeli soldiers and capturing one, and in response Israel “reacted with a full-scale raid and captured eight Hamas ministers” — all duly elected representatives in the Palestinian government. “Gaza is still a disaster,” Graubart said.
Much of Graubart’s talk dealt with the U.N. Security Council and how the United States “has become more assertive about trying to get the Security Council to go along with its actions” since it won U.N. authorization for its first attack on Iraq in the 1990-1991 Gulf War. Though the U.N. hasn’t always gone along with the U.S. — “it didn’t go along with the blockade of Cuba or the 1989 invasion of Panama,” Graubart conceded — even on the U.S. conquest of Iraq in 2003, though the Security Council refused to authorize the war (actually the U.S. withdrew its request for a use-of-force resolution when it became clear there wouldn’t be a majority on the Council for one), “it did pass a hard-line resolution after the war and ratify the occupation afterwards.”
Graubart addressed two common criticisms about the U.N. — that it’s biased against Israel and it’s biased against the United States. “The Security Council can’t be biased against the U.S. because the U.S. is a permanent member with veto power,” he said — and as for the alleged “bias” against Israel, supposedly proven by a 1970’s resolution of the U.N. General Assembly equating Zionism with racism, Graubart said that while there may be “a considerable volume” of U.N. resolutions calling on Israel to return to its 1967 borders and allow the rest of occupied Palestine to become an independent country, “the Israeli-Palestinian conflict resonates throughout the Middle East and the resolutions generally seek a peaceful solution in conjunction with the United States.”
Activist Compares Civil Rights, Immigrants’ Movements
by MARK GABRISH CONLAN
Copyright © 2006 by Mark Gabrish Conlan for Zenger’s Newsmagazine • All rights reserved
The modern-day movement in the U.S. to protect the rights of immigrants and win amnesty and decent livings for the estimated 12 million undocumented immigrants currently in the U.S. could learn a great deal from both the successes and the failures of the African-American civil rights movement, veteran activist Joel Geier told a meeting sponsored by the International Socialist Organization (ISO) at the City Heights Recreation Center September 23. According to Geier, “The civil rights movement didn’t end racism in America, but it did make it impolite to be racist. Now the racism is coming out again.”
Even before Geier spoke, ISO members at the meeting discussed an action earlier that day that underscored his point. The ISO has been working in coalition with local groups in National City to make that community a “sanctuary city” for undocumented immigrants, which would forbid National City police officers from working with federal Immigration and Customs Enforcement (ICE) agents to conduct sweeps looking for undocumented immigrants. The anti-immigrant Minutemen scheduled a protest against the “sanctuary city” proposal for the morning of September 23 — and according to ISO members who attended, their counter-demonstration drew 200 people to the Minutemen’s 100.
“I’m sure most of you have heard of Elvira Arellano, who took sanctuary in a city building in Chicago and said, ‘I’ve learned from Rosa Parks, the law is wrong and I’m not going to the back of the bus,’” Geier said at the start of his speech. Throughout his talk Geier, an ISO member, told the familiar history of the civil rights movement but put his own revolutionary socialist “spin” on it, criticizing those parts of the movement that insisted on nonviolence and wanted to work with the Democratic Party and praising those who wanted to make the movement specifically and openly revolutionary and anti-capitalist.
According to Geier, though the U.S. Supreme Court ruled in May 1954 in Brown v. Board of Education that racial segregation in public schools was unconstitutional, segregation remained in place without any challenge from activists until a year and a half later, when Rosa Parks refused to give up her seat on a bus in Montgomery, Alabama to a white passenger and triggered a boycott of the bus system by African-Americans. “The whole Black community mobilized and organized car pools to break the segregation,” Geier said. “It started on December 1, 1955 and went on until December 14, 1956, 381 days in which you had to mobilize the population and keep spirits up. It went on until the city was about to go bankrupt and settled.”
Geier dated the existence of the civil rights movement from the 1955-56 bus boycott until the early 1970’s. He said its main lesson for activists today was that “it was a long struggle, and we have to commit to a long struggle. It’s a struggle in which there will be ups and downs, victories and defeats, and you have to build an organization that can withstand setbacks.” Geier noted that the civil rights movement didn’t have any major victories after Montgomery until four years later, when four Black college students in Greensboro, North Carolina mobilized and staged a sit-in to demand service at Woolworth’s whites-only lunch counter.
“The next day,” Geier recalled, “there’s a little rally on campus and 12 more people join. In the next two to three months you’ve got a movement of 20,000 to 30,000 people involved in sit-ins and support actions.” At least part of the tactic’s effectiveness, he added, came from the fact that the students targeted a national chain — which meant that people in Northern cities, where there was racial prejudice but not overt, legislated segregation, could join in by staging support protests at Woolworth’s in their own communities. Geier called the Greensboro sit-ins “the start of the mass civil rights movement.”
Why did it take so long between Montgomery and Greensboro? According to Geier, “there were no real organizations to mobilize successful demonstrations. The existing organizations, the National Association for the Advancement of Colored People (NAACP) and the Urban League, were obstacles. The Democratic Party controlled both houses of Congress and had since the 1930’s. They did not pass one civil rights bill [until 1957] or even an anti-lynching bill.” (Actually there had been two Congresses between 1932 and 1956 — 1947-48 and 1953-54 — that had been under Republican control; and even when the Democrats were nominally in charge, control really rested with the racist Southern “Dixiecrats” who chaired all the major Congressional committees. Later the “Dixiecrats” would shift from the Democratic to the Republican party, thereby moving political control of the South — and, eventually, control of Congress as well — to the Republicans.)
Aside from the absence of organizations in the African-American community to sustain a mass movement in the late 1950’s and the co-optation of the nascent movement by the Democratic Party, Geier also said there was “an enormous backlash after Brown” by white Southerners determined to maintain segregation. Geier noted the formation of White Citizens’ Councils throughout the South — essentially above-ground versions of the Ku Klux Klan — and the “Southern Manifesto” signed by 105 Southern Democrats (“Dixiecrats”) in Congress “pledging to block all civil rights bills.” Even a conservative African-American group like the NAACP was labeled a “subversive organization” in 12 Southern states, meaning that “people who belonged to it couldn’t be schoolteachers or work for the government” — at a time when those jobs represented almost the only hope for African-Americans to work their way up in the system.
“All the things the civil rights movement engaged in were illegal,” Geier stressed. “Segregation was the law, as it is today in Pennsylvania and Colorado (where city and state governments have passed laws discriminating against immigrants), and they were forsaking it and relying on themselves. The U.S. Supreme Court passed Brown but 10 years later 98 percent of Black kids in the South still went to segregated schools.” Geier said that the movement’s strategy was not only civil disobedience but what they called “social dislocation,” which essentially meant getting in the racists’ faces so often and in so many places that they couldn’t continue the system of segregation and would have to end it.
What this strategy needed to work, Geier said, was “the creation of organizations, of leadership, cadres, policies and strategies.” Eventually two direct-action civil rights organizations emerged: the Student Non-Violent Coordinating Committee (SNCC), which grew out of the Greensboro sit-ins; and the Congress of Racial Equality (CORE), an older group which was taken over by militants and adopted militant strategies and tactics. “Right now,” Geier added, “there are few organizations in the immigrant community, and most of them are impediments, aimed at lobbying and compromise.”
Geier also said “there was a confusion in terms of ideas” in the civil rights movement as it organized. “The people had mixed consciousness; they were militant and willing to break the law, but they also had both liberal and conservative ideas. The conservative ideas they had were nonviolence and ‘Christian love,’ and the liberal idea was to support liberal Democrats. A decade later, these same people were revolutionaries against imperialism and capitalism. Forty percent of all Blacks under 30 [in the early 1970’s] identified with the Black Panther Party. What changed people was the struggles of the sit-ins, the Freedom Rides, Albany, Birmingham, Selma. In struggle, people get bolder, more confident, more militant.”
This, Geier explained, is one of the principal lessons the immigration rights activists of today have to learn from the civil rights movement. “People were more confident on May 2” — the day after the nationwide general strike of Latinos — “than they were on April 25 or May 25,” Geier said. He also stressed the need for immigrant rights activists to remain clear on what they want — amnesty, a “path to citizenship” and a living wage for all immigrants in the U.S., documented or not — and resist the temptation to “compromise” and accept the so-called “moderate” bill in the U.S. Senate, which would continue fencing and militarizing the U.S.-Mexico border, set up a guest-worker program which (according to Geier) would just repeat the abuses of the bracero program of the 1940’s and 1950’s, and only allow a handful of the undocumented immigrants currently here a “path to citizenship.”
Returning to his history of the civil rights movement, Geier said that “by 1964 you had a cadre of activists, 30,000 to 50,000 people, who had been politically educated by the struggles. There were a lot of key questions they had to confront, and which moved them Left. The first was nonviolence: is it a principle or a tactic? Should you engage in self-defense or not? Not that many people would agree to sign up for a movement in which they couldn’t defend themselves, especially when three civil rights workers were murdered in Mississippi by the police. Every civil rights organizer got a gun and didn’t drive at night without it. At a demonstration, there would be peaceful, nonviolent demonstrators, and then within the crowd there’d be people with guns called Deacons for Defense.”
The other two questions they had to resolve were their relationship with the Democratic Party and the whole question of “integration” as the movement’s goal. According to Geier, the militant civil rights activists definitively rejected the Democrats after the party turned down the challenge of the Mississippi Freedom Democratic Party, which went to the 1964 Democratic convention with a full slate of delegates and a demand that the party unseat the white racist Dixiecrat delegation from Mississippi and let them represent Mississippi instead. When the party offered the Freedom Democrats only two non-voting “at-large” seats on the convention floor — and moderate African-American activists, including Martin Luther King, advised the Freedom Democrats to accept the “compromise” — “it led to an enormous split within the civil rights movement over whether it was fighting for Blacks or more concerned about the electoral needs of the Democratic Party,” Geier explained.
As for “integration” as a goal, the movement started to grapple with the question of “do we want to integrate with the racists?” precisely when the Viet Nam war was being escalated and not only Black but white and, eventually, Latino activists started to move Left after being radicalized by the war the liberal Democrats then running the U.S. government were waging against Viet Nam. The civil rights movement was also confronting the continuing economic stagnation in Black America — “in 1960 Blacks made 53 percent of what whites did and in 1964 it was 54 percent” — and there were riots in over 200 American cities, which Geier described as “urban uprisings” which “created a radical Black consciousness” and gave rise to the phase in the movement known as “Black Power.”
“Black Power” itself led to four responses, Geier said. One was the liberal response of having middle-class African-Americans work their way up through the political structure of the Democratic Party and ultimately run for and win elective office. One was the conservative response of “Black capitalism,” which called on African-Americans to start businesses (usually very small ones since that was all the capital they could raise) and work their way up in the corporate system. One was a movement for “community control” of the schools and police in African-American neighborhoods — which, Geier explained, ignored the fact that the police and schools were still dependent on funding from white governments, and that money could be pulled from any school system or police department under nominal Black control.
The fourth response was “revolutionary consciousness,” which led to the formation of the Black Panther Party and the League of Revolutionary Black Workers in the late 1960’s. Though generally supportive of these groups’ goals, Geier was critical of their lack of structure. “They were organizations without clear cadres or political strategies,” he said. “The Panthers went from 35 people in 1967 to 30,000 in 1969, and the government developed a two-pronged strategy against them: the carrot and the stick. The carrot was yielding on the ‘community control’ issues, and the stick was repression, shooting and killing people. Without a stable leadership program, they were quickly snuffed out.”
Geier acknowledged the difference between conditions in the late 1960’s and now. Then, he said, the U.S. was in the middle of “a period of extraordinary economic prosperity” and the Black militants believed the white working classes were content with their lot and uninterested in radical social change, while today “the immigrant rights demonstrations are those of the worst paid and most exploited” workers in the U.S. But he also pointed to similarities, especially in the use of racism as a tool to keep American working people fighting each other instead of the capitalists. He warned that if there’s another recession, there’s a real possibility of “mass deportations” of immigrant workers “unless American workers realize that the immigrants’ issues are their issues” and white workers join with workers of color to demand decent wages and working conditions for all.
STEPHEN DAVIS:
“Wrongful Death” Author Criticizes Orthodox View of AIDS
interview by MARK GABRISH CONLAN
Copyright © 2006 by Mark Gabrish Conlan for Zenger’s Newsmagazine • All rights reserved
The cover of Stephen Davis’s novel Wrongful Death: The AIDS Trial is appropriately bright and arresting: in front of a red background, a blindfolded figure of Justice (from a sculpture by artist Dennis Taylor) holds a set of balance scales with a cascade of AZT capsules falling from one side. The copy on the back cover is equally intense: “How bureaucratic lies and incompetence, gross medical malpractice, and unbridled greed by a drug company cost 300,000 American lives in just 10 years — five times the number killed in the entire Viet Nam war!”
But behind the lurid cover lies a gripping John Grisham-style courtroom drama that vividly and entertainingly presents the case Davis has to make — that HIV cannot possibly be the cause of AIDS; that AZT, the first drug ever approved for AIDS treatment (in 1987), was a brutally toxic chemotherapy that killed most of the people who took it; that the discovery and claimed “isolation” of HIV were fraudulent; and that the modern-day treatments seem “better” than AZT only because they’re marginally less toxic. What’s more, though ostensibly a work of fiction, it ends with 71 pages of references so readers can check out for themselves what he’s saying in the book.
A former Arizona state senator and physician’s assistant, Davis has had a widely varied background which he discusses below. Since he published the book earlier this year, he has been active in the alternative AIDS movement, attempting to bring its disparate groups together for a unified publicity campaign to make more Americans aware that there is another way to look at AIDS besides the simple-minded “HIV = AIDS = death” equation we’ve been propagandized to believe since HIV was first politically proclaimed to be the cause of AIDS at a press conference in April 1984.
Davis is also working on a new novel exposing the inaccuracies of standard HIV antibody tests, pointing out that there are at least 64 possible causes of false-positive results documented in the scientific literature — among them such common infections as hepatitis, herpes, malaria and flu, as well as pregnancy, especially in women who have been pregnant before. He’s hoping to organize public opinion against the recent recommendation of the U.S. Centers for Disease Control (CDC) to have people tested for HIV antibodies as part of their routine medical examinations — which he fears will mean millions of falsely identified “HIV positives” being subjected to pressure to take unneeded anti-HIV medications that will only speed them to their graves.
To buy Wrongful Death: The AIDS Trial, click on the link at www.healsd.org, the Web site of a local alternative AIDS organization co-founded and chaired by this author, or visit Davis’s own Web site at www.theaidstrial.com.
Zenger’s: Why don’t you start by giving me some of your background? The blurb on you in the book is quite fascinating: “Former Arizona state senator, commercial pilot, captain of a whale and dolphin research ship and founding member and musical director for Up with People.”
Stephen Davis: Yes, Mark, I’ve had quite a varied and interesting life. I’ve had an opportunity to do a lot of things, and I got into the AIDS/HIV question as a result of my experience in government and in medicine, as a physician’s assistant. That was what allowed me to, first of all, understand some of the studies that have been done by actually reading the studies themselves, rather than taking the media’s word for what they said. It also provided me with a fairly high level of skepticism about what was being told to the American public.
Zenger’s: What about your experience in government conditioned you to look with favor on alternative explanations of what’s going on in AIDS?
Davis: Well, when I got elected in 1974 I was 28 years old. I was very naïve about government, and, like everyone else who talked to me, I thought that if you wanted to make any changes, you would do it from the inside. So while I was lying on my bunk in Viet Nam one day, I decided that I would try to run for office and see if I could change the situation, and hopefully make sure something like Viet Nam would never happen again.
When I got into government as a state senator, and was very intimately involved in the whole process, I discovered that government was not the solution; that, indeed, it itself was the problem. I didn’t actually even like being a senator. I did run for re-election. I’m really not sure I could tell you why, but I was literally very happy when I lost. I just didn’t feel like that was where we were going to make a difference.
Having seen the way government operated, and the forces at work within government dictating what could and could not be done, I just really believed, when Robert Gallo stood up on April 23, 1984 on behalf of the Department of Health and Human Services and announced that he had found the probable cause of AIDS, it just didn’t fly. It was just another piece of propaganda. I consider it to be very much like the Warren Commission. I don’t think a lot of people anymore believe that John F. Kennedy was shot by a lone gunman with a magic bullet. I didn’t believe that HIV causes AIDS. It was as simple as that.
Zenger’s: So at the time that Gallo made his announcement, it was more just a gut-level feeling, based on general skepticism that government could actually get the answer to a momentous question like this right?
Davis: It was partly that. It was partly because Margaret Heckler, who was then secretary of the Department of Health and Human Services, had no medical or scientific experience. She was put into that position by Ronald Reagan because of her business and management expertise. Reagan was a big one on making government “efficient,” and I’m not being critical of Margaret Heckler or her appointment, but she had no basis on which to judge any pronouncement by Gallo, or even to ask Gallo any intelligent questions, before she stood up and announced to the world that the United States had found the cause of AIDS.
Also, I was skeptical because this hadn’t even been submitted for peer review by Gallo, which is the normal medical/scientific process of making a discovery: you send it out to your peers so they can test your discovery to see if they can verify it. Then you publish it. This was medical diagnosis by press conference. That’s unprecedented in the history of medical and scientific research.
Then, from my medical experience, I did not believe that a retrovirus could cause the kind of damage that was evident with AIDS. And, of course, in 1990, Dr. Luc Montagnier [co-discoverer of HIV with his laboratory assistant, Françoise Barre-Sinoussi] verified that himself when he said that HIV could not cause AIDS [without a co-factor] and that a retrovirus had never caused harm in a human being, much less killed the numbers of cells that were necessary in order to bring on AIDS. I was sure HIV couldn’t possibly cause AIDS, but I could not prove that because I had not studied it. It was not until 1996, when I read Peter Duesberg’s book Inventing the AIDS Virus, that I began to see that the scientific evidence also supported my gut feeling.
As I said earlier, we have been told many, many things by the government and by the media — and by the medical profession, to be honest with you — about studies they claim say this or that. And, honestly, when you go read the studies themselves they don’t say this or that. As a matter of fact, most of them say exactly the opposite of what we’re being told. I had the chance and the ability to read these studies for myself and make my own decision, rather than buying the propaganda that was being disseminated.
Zenger’s: When I heard about the press conference, my gut-level reaction was simply that AIDS seemed to be a collection of so many varied phenomena — it struck so many people so differently; some people lived for years with it, some people died almost immediately; some people got one set of diseases, some people got another set — that I greeted the news with a kind of shrug and said, “Well, that’s one possibility.” But I really didn’t think that a condition that had so many possible outcomes, so many different symptomatologies, could possibly have a single cause.
Davis: And that was almost easier to swallow back then than it is now, since the CDC has expanded the definition of AIDS to include diseases that have nothing to do with immune deficiency. One of the things that I talk about in my book is the fact that we really need to be talking about four entirely separate classes of “AIDS,” which each have their own causes. We keep lumping them together into one big soup, and not only does this not make sense, it amounts to medical malpractice to diagnose people with “HIV” and tell them that they’re going to get and die of this lethal disease called “AIDS,” when in fact, in Africa for example, you don’t have to have HIV to be diagnosed with AIDS.
But, of course, in the United States we keep hearing on radio and television about “AIDS in Africa,” and nobody in this country knows that AIDS in Africa doesn’t mean what it means here. You don’t have to have either HIV or immune deficiency to be diagnosed with “AIDS” in Africa. All you have to have is a persistent cough, a persistent fever, diarrhea and lose more than 10 percent of your body weight [within a 30-day period]. It’s literally insane and very illogical to be talking about AIDS as a specific disease and linking it to a specific virus or retrovirus called HIV.
Zenger’s: When [University of California at Berkeley microbiology professor] Peter Duesberg published his first paper against HIV as the cause of AIDS in 1987, his argument in a nutshell was, “For 10 years we have researched retroviruses as possible causes of cancer because they do not kill their host cells. They need their host cells in order to survive themselves. Now we are saying that a retrovirus causes a disease in which massive numbers of the host cell die.”
Davis: Yes, but unfortunately, by that time it was already too late, wasn’t it? It was just like the assassination of John F. Kennedy, or the start of the Viet Nam war. We now know that the incident normally called the Gulf of Tonkin didn’t happen, but President Johnson could get on national television and say, “We were attacked in the Gulf of Tonkin, and therefore we are going to war in Viet Nam.” Well, that attack never occurred, and yet it was just not until a couple of years ago, I think, maybe even shorter than that, that the actual paperwork, the actual memos and classified documents were released showing that that was all a fake story as well.
Now, we lost 58,000 people, Americans, in Viet Nam. We’ve lost 500,000 to AIDS from the same kind of fabrication — fantasy, if you will — from the government. So I’m wondering when the American people are going to wake up and see that this happens over and over and over again, and that we’d better start questioning the things that are being told to us, both by the government and the media.
Zenger’s: Or another more recent one: the weapons of mass destruction in Iraq. We were told that Saddam Hussein’s government was an imminent threat to us; and also, while the President and the people in his administration never actually said that Saddam Hussein had anything to do with the attacks on 9/11, over and over again they used “Saddam Hussein” and “Iraq” in the same sentences as “al-Qaeda” and “9/11,” and people made the connection, which didn’t exist.
Davis: Absolutely. Did you hear Bush the other day flat-out say, “Iraq had no connection to al-Qaeda at the time”? I think he probably spoke without thinking of what he was saying, but that was what he actually said. I saw it on CNN; he actually came right out and said that there was no connection between Saddam Hussein and Iraq, and al-Qaeda for 9/11. But once a story gets started, it’s hard to stop it, especially when money gets involved. And now, of course, AIDS has become a multi-billion dollar business throughout the world, and people’s livelihoods depend on HIV causing AIDS.
When I was in the Arizona state senate there was a law that I was prohibited from voting on or participating in any legislation in which I had a direct financial interest. That is not only ethical but logical and proper, and I would expect the same kind of ethics and logic from the people involved in the AIDS industry. Those whose livelihoods depend on the HIV/AIDS hypothesis literally have no business saying a word about this; and, when they do, what they say cannot be trusted. That’s the bottom line.
Zenger’s: You said that you really got interested in the alternative point of view on AIDS after you read Peter Duesberg’s book Inventing the AIDS Virus in 1996. What was your reaction to that?
Davis: I sat down and read that cover to cover in three days. My specialty has always been to take difficult, complex concepts and rewrite or rephrase them in simple three-letter words so that I can understand them, and hopefully others will as well. So I immediately took Peter’s book, and a couple of other studies that I read at the time, and I wrote a 10,000-word paper called “AIDSGate” and published it on the Internet. Remember, this was 1997, so it was an outline of AIDS and HIV and AZT as we knew it up to that point.
In that paper, I specifically asked people who were HIV [positive] or families of those who were HIV positive, who had lost a loved one to “AIDS,” to file a class-action suit against Robert Gallo, against the FDA, against the Department of Health and Human Services, and [AZT manufacturer] Burroughs Wellcome, as it was called at that time — now GlaxoSmithKline — file a class-action suit for the wrongful deaths of their loved ones, based on the fact that all the studies show that HIV does not cause AIDS — AZT, on the other hand, does — and that these people died first of all from a misdiagnosis, and secondly, iatrogenic causes of being prescribed this lethal chemotherapy.
I couldn’t do anything about that myself. I’m not HIV positive. I frankly never knew anyone — and still don’t — who had AIDS or died from it, so in legal terms I had no standing. I had no ability to be involved in such a suit. But I certainly hoped others would file that kind of suit. I thought it would be helpful to have a court case in which the truth comes out. The answers to specific questions are provided in sworn testimony, like, where’s the proof that HIV causes AIDS, when in fact the proof is obviously that HIV fails every medical and scientific test to be called the cause of AIDS — or to cause any disease, for that matter. Where’s the proof that HIV has even been isolated? Where’s the proof that the HIV blood tests have been validated? They have never even been demonstrated to prove HIV antibody reactions, much less HIV reactions. So all these are questions that I had hoped would come out in a class-action lawsuit.
I thought that at that point I was finished with this issue. I had done my thing. I had explained this so that almost anybody could understand it. I had put out a solution for it, and I thought I was finished with that issue and I was moving on. Unfortunately, no class-action suit was ever filed. There were private suits filed, I was told, and settled by GlaxoSmithKline for a lot of money, but also with the stipulation that nothing ever be made public about the settlements. Therefore, the general public has never had the benefit of hearing this.
When no class-action lawsuit over AIDS, HIV and AZT happened, I had the idea about three years ago, 2003, to write a movie script as if that court case had actually happened. It turned out to be a two-part, four-hour made-for-TV movie called Wrongful Death: The AIDS Trial. Now, obviously, this was a fictitious court case, but I made sure that every word of testimony, every witness, every statement made in that movie was absolutely factual and based on medical and scientific research. That screenplay came in second in a screenwriting contest. But of course nobody was going to produce it. So last year a friend of mine talked me into rewriting that screenplay as a novel, and hence my book, Wrongful Death: The AIDS Trial. That’s how all that came about.
Zenger’s: I noticed reading the book that you based it largely on Peter Duesberg’s writings and also on the reporting of John Crewdson, which I found rather ironic because, while Crewdson did a great job of exposing the shenanigans around who discovered HIV and how the blood test was developed, he’s as staunchly committed to HIV as the cause of AIDS as anybody. His book contains a lot of sneering references to alternative points of view, saying for example that Montagnier discredited himself and dealt himself out of the world of serious AIDS research when he made that speech in San Francisco in 1990 suggesting that HIV needed a co-factor. So did you have any qualms about melding these two perspectives: Peter Duesberg’s that HIV doesn’t cause AIDS, and John Crewdson’s that it does, but we were lied to about how it was discovered and who invented the test?
Davis: Not all of us can get everything right all of the time, and Crewdson got some stuff right some of the time. I wasn’t about to throw that out simply because he got other stuff wrong. I even disagree somewhat with Peter [Duesberg] in some of his conclusions about the causes of AIDS. I’m one of those who staunchly adhere to the theory, which was held by the CDC for the first three years, from 1981 to 1984, that amyl nitrite and its derivatives, butyl nitrite and isobutyl nitrite, were the actual cause of AIDS.
Now, when we say that we are talking specifically about the “AIDS” that sprang up in the homosexual community in the early 1980’s, which was specifically an immunodeficiency disease syndrome, with KS [Kaposi’s sarcoma] of course being the hallmark disease. At that time, there was a lot of proof that KS was linked directly to poppers, and that poppers indeed destroyed the immune system and indeed turned many classes of antibiotics into carcinogens.
So Peter’s probably right in one sense, that there was a lot of antibiotic use as well in the homosexual community, and the combination of the antibiotics with the amyl nitrite wreaked havoc, but I don’t just go to a generalization of recreational drugs [as the sole cause of AIDS]. There’s another movement that says it has to do with simple stress in the immune system, “oxidative stress” is what it’s called. And that’s fine. I’m not saying there’s a single cause of AIDS. I think there are four different kinds of AIDS, each with their own causes.
But it’s not like I swallowed everything Peter said, or swallowed everything Crewdson said, or John Lauritsen [journalist and author of Poison by Prescription and The AIDS War], or anybody. I felt that they all had some very good contributions to make to the understanding of this disaster, this tragedy, and that the place we ought to be focusing our attention now is on what we continue to do today to people who are diagnosed as “HIV positive,” especially looking at the blood tests that make the diagnosis.
Zenger’s: You’ve mentioned a few times that you think what’s called “AIDS” is really four different syndromes. Could you briefly run down what you think each one is?
Davis: Yes. I call the first kind “classic AIDS.” That is the kind of AIDS that sprang up in the homosexual community, with the first cases being identified by Dr. Michael Gottlieb in L.A. in May 1981. I believe these cases are linked directly to the use of poppers in the homosexual community. There’s so much documentation about the widespread use of poppers, and their side effects, that I think the link is actually pretty clear.
The second kind of AIDS is what I call “iatrogenic AIDS,” “iatrogenic” meaning caused by a doctor, a hospital or a drug. We were very familiar with iatrogenic immune deficiency prior to the advent of AIDS, because that is exactly what cancer patients get: iatrogenic immune deficiency from their chemotherapy and radiation therapy and so forth. Most cancer patients, as I understand it, die of opportunistic diseases they get from their therapy, rather than dying of the cancer. So immune suppression caused by drugs was not a new thing at all.
But what we started doing in 1987 is giving this very lethal drug called AZT, proven to destroy the immune system, not only to people who were sick with AIDS, but who were HIV positive but had no symptoms. They were perfectly healthy human beings. What happened is, of course, those people developed AIDS from the immune suppression caused by AZT, and they died.
Zenger’s: My recollection is that AZT was originally marketed only for people with actual AIDS diagnoses, and it wasn’t until 1989 that Burroughs Wellcome won approval to sell it to so-called “asymptomatic” HIV positives as well.
Davis: That is correct. It started in 1987, and as a matter of fact the actual FDA [U.S. Food and Drug Administration] approval in 1987 was based on the stipulation that they would only give it to very sick AIDS patients. However, in 1988 they started giving it [to asymptomatic HIV positives] without approval, and in 1989 they got that approval. You’re correct.
You have to remember that Burroughs Wellcome also held the patent on amyl nitrite [the active ingredient in poppers]. When amyl nitrite was being threatened in the first three years, 1981 to 1984, by the CDC’s belief that it caused AIDS, Burroughs Wellcome very much wanted to get the attention off of amyl nitrite, and therefore totally supported the idea that it was a virus instead.
Plus they were not the ones that created AZT. That was created in a cancer lab in Michigan back in 1964. Burroughs Wellcome or someone found it on a back shelf, but Burroughs Wellcome put no money whatsoever into the development of AZT. They simply took AZT from someone else who developed it, ran it through some tests and came out saying that it killed HIV. Well, of course, it also killed every other, healthy cell in the body as well. In fact, they lied at the FDA approval committee and said that it only killed one healthy cell for every 1,000 infected cells, when in fact the ratio was 1,000 times that.
Zenger’s: What are the other two kinds of AIDS, as you see them?
Davis: The third kind of AIDS I call “AIDS by definition,” and that is because the Centers for Disease Control and Prevention changed the definition of AIDS many, many times between 1981 and 1993. Each time they did that, they would add more diseases. The latest addition was cervical cancer, which obviously itself is not an immune-deficient disease, and has no relationship to HIV. But that was done solely because HIV was failing the first epidemiological law of viral and microbial diseases, which means that a virus cannot differentiate between men and women, and has to affect them fairly equally.
Well, up until 1993, 90 percent of the AIDS cases were men. And that didn’t sit very well. So they added cervical cancer to begin to include more women, and now I believe it’s only 83 percent men and 17 percent women. But there’s no other logical reason for cervical cancer to be included on a list of AIDS-defining diseases.
Zenger’s: As I recall, the reason given at the time was, “We find a lot of cervical cancer in women who test HIV positive; therefore we believe there is a connection, and therefore we consider it part of the syndrome.”
Davis: Oh, well, over the years they have come up with lots of different justifications and rationalizations for the things that they do. I don’t believe that for an instant. I don’t think there’s any science to support that, either. But it is simply one example of how the CDC has changed the definition of AIDS over the years. Now it includes 30 different diseases, many of which are not opportunistic and therefore not dependent upon an immune-deficient environment. So you have this whole class of people getting “AIDS” simply by definition. That’s what that class is. And the fourth group of AIDS is —
Zenger’s: [Ph.D. chemist and alternative AIDS researcher] David Rasnick pointed out that one of the things they did in 1993 is allow you to be diagnosed with AIDS if you were not sick at all. If you have an HIV positive test result and a T-cell count below 200, then you can be considered an “AIDS” patient.
Davis: Yes. That’s part of the AIDS by definition, absolutely.
Zenger’s: Rasnick also said that he looked very closely at the proportions of AIDS patients, according to the CDC’s records, based on what their AIDS-defining condition was. He said that as of 1997, two-thirds of all new AIDS diagnoses in the United States were in that category of people who were not sick, who simply had an HIV positive test and a T-cell count under 200. After 1997, the CDC stopped providing that information.
Davis: [Laughs.] That’s great. I love that. I had not heard that, Mark, but that’s so typical of what’s happened in this whole tragedy. It’s so typical. Just like, in 1991, when the CDC was embarrassed that there were thousands of AIDS cases with no evidence of HIV or the HIV antibodies, they created the new disease ICL, “idiopathic CD4 lymphocytopenia,” so that from that point on, anyone with AIDS symptoms who was HIV negative would be reclassified or rediagnosed with ICL instead. Therefore, since 1991, there has been a direct relationship — caused by definition — of HIV and AIDS. This is part of the game they play.
They actually played this earlier. This has some precedent. In the polio era, after the Salk vaccine came out, if anyone came to hospitals with polio symptoms, and they had had the vaccine, they were reclassified as “aseptic meningitis” instead of polio, so that no one got “polio” after the vaccine. They’ve played this game for a while, and it is a game. I consider it to be medical malpractice myself.
Zenger’s: And the fourth kind of AIDS?
Davis: AIDS in Africa, which we’ve already discussed. So it’s “classic AIDS,” linked to amyl nitrites; iatrogenic AIDS, linked to AZT; AIDS by definition, linked to the CDC; and AIDS in Africa. I’m not saying that there aren’t other factors involved. I’m not saying that there can’t be other immunosuppressive drugs that might be causing the classic AIDS. I’m not saying that there can’t be other HIV drugs, in addition to AZT, causing the iatrogenic AIDS. I’m just simply saying that these are the major factors that I have found to be involved.
Zenger’s: In 1996, of course, we had the big splash at the Vancouver AIDS conference, the introduction of the protease inhibitors, the three-drug cocktails, and everybody in the establishment was saying, “This is a great triumph. These are life-saving drugs. We have finally turned the corner on this. Thanks to these wonderful new drugs, AIDS is going to be a chronic, manageable disease and people will be able to live long and fairly healthy lives. They just have to take the drugs for the rest of their lives.” And the AIDS death rates did indeed go down. What do you think happened?
Davis: Well, we got people off of AZT monotherapy, full-strength AZT. That’s why those drugs, the new drugs, were life-saving, because they stopped 1,500 milligrams of AZT on a daily basis. They were life-saving, but only because they didn’t do the kind of damage that AZT was doing. They could not create AIDS in at least the intensity or the numbers that AZT had done, killing 300,000 Americans from 1987 to 1997. That is what cut down all the cases, starting in 1996 — I think the drop from 1995 to 1996 was 50 percent and the drop from 1996 to 1997 another 50 percent, as we began to get AZT off the market.
Now, my question is whether the remaining 600 milligrams of AZT in, for example, Trizivir and Combivir, are still lethal. It would be very difficult to determine, because how would you in fact separate the effects of one of the drugs in that cocktail in that study? But AZT, of course, is still being given in very special circumstances and in these two cocktails, and frankly, we just don’t know whether in fact it is still killing people or not.
All we know is we still have 15,000 deaths from AIDS each year. We also know that there is still use of poppers in the homosexual community, because poppers have never been blamed or linked directly to AIDS since 1984, and so we may still be seeing some effect of that. And, of course, as David Rasnick says, of the 15,000 deaths per year, we still have people who were totally asymptomatic but were HIV positive with a depressed T-cell count who were then put on these new HAART drugs — Highly Active Anti-Retroviral Therapy — which indeed themselves have lethal side effects, such as liver damage, severe liver damage.
Zenger’s: At the 2004 AIDS conference in Barcelona it was admitted that at that time the number one cause of AIDS deaths was liver failure that could be traced to the anti-HIV drugs. Of course, the reason that doesn’t bother the AIDS mainstream is they make the assumption that without treatment, anybody with HIV will inevitably progress to AIDS and a premature death.
Davis: Yes, that’s one of their good rationalizations. Again, it has no basis in fact, because there’s no study to prove that. As a matter of fact, the studies that I’ve read say the opposite: that people do better off placebos than they do on the HAART drugs.
On August 5, 2006, the Lancet published the longest and largest study of this kind on HAART. It included 22,000 patients over 10 years. What it did — and this is a little tricky — what it did is it compared the first year of HAART therapy by calendar year. In other words, they started in 1996 and they took people who went on HAART for their first year in 1996 and compared them to the people who went on HAART for the first year starting in 1997, and on up.
The results were quite clear. While there was a definite decrease in viral load — in other words, there was a betterment in viral load testing — there was also a decrease in CD4 cell count in the later years. And there was no improvement in longevity. In other words, the HAART drugs that are being given today actually are worse than the drugs being given in 1996. They produce worse effects, other than a better viral load test. People are getting AIDS faster. The onset to the first AIDS event is shorter. Their immune systems are worse off and, as I said, there is no increase in longevity. That study has just been released, and it is, in my mind, very powerful.
Zenger’s: In fact, one of the quirkier things I have noticed about this issue politically is that there are a few voices on the Right, particularly the Libertarian Right, who have been willing to question AIDS and HIV, but on the Left it has become kind of an article of faith, and a test of your level of concern about African people, people of color, Queer people, etc., that you go down the line with HIV and you join all these campaigns to raise money to buy AIDS drugs for the Third World, where you have “these millions of people who are living with HIV who are going to die unless we get them these lifesaving medications,” blah blah blah. Any thoughts you might have on how the politics of this plays out: why so many people go along with it, and in particular why so many Leftists, who ordinarily pride themselves on questioning big government and big business, go along with HIV?
Davis: Well, a couple of personal opinions. One, of course, “show me the money.” We know for a fact that Magic Johnson is on the GlaxoSmithKline payroll, and therefore will not say a word about his AZT experience, or even what he’s actually doing today, and allows his picture to be used in connection with Combivir ads — although the ads are very careful not to attribute what they say to Magic Johnson. They just put his picture there.
Also, there is such an assumption in the liberal community that this shows compassion: “We care about these people.” Bill Clinton — I don't know whether Bill Clinton is on the GlaxoSmithKline payroll or not, but he always likes to feel other people’s pain, and so here is a cause of compassion and caring. And that’s all that people see. They don’t understand that other people, like us, care just as much and are just as compassionate, and believe that actually the best thing we could do for people would be to get them off these drugs and clarify the whole situation surrounding HIV.
But that’s not the way we’re perceived, as a matter of fact. I’m sure you’ve seen [Robin Scovill’s film] The Other Side of AIDS, where [Dr. Mark] Wainberg has some not-so-nice comments about the personality of Peter Duesberg. So we have been painted as people who want other people to die. This is actually very common in public health. Let me go specifically to the Incarnation Children’s Center in New York City. You’ve seen “Guinea-Pig Kids”?
Zenger’s: I know the Incarnation Children’s Center story quite well, because I interviewed Liam Scheff, who broke it. [The story dealt with an orphanage in New York City that housed children with HIV who were wards of the state, either because their parents were dead or had lost custody due to recreational drug use. The children were enrolled, without the consent of their parents or surviving family members, in HIV drug trials sponsored by the National Institutes of Health in association with pharmaceutical companies. In some cases, the kids were operated on and gastrointestinal tubes were inserted surgically so their doctors could force-feed them the HIV medications being studied.]
Davis: The video is very disturbing. I came away from watching that video wondering how in the world people could do that to children. When I say “that,” I mean surgically insert stomach tubes and then force-feed them these dangerous, lethal medications. I didn’t know how those people could sleep at night. I finally realized they have to have convinced themselves that they’re doing good in the name of public health. First of all, they could never have read any scientific studies themselves. And they have to believe that they’re doing the best for that child. To me, this is the tragedy, because people are accepting what they’re told by the authorities, by the government, by the media, by the public-health officials, and they really must feel that this is the best thing for that child.
I couldn’t ever go there. No matter what I believed, I could never justify surgically inserting a stomach tube in a one-year-old and feeding him medication. That’s beyond me. But I think it marks the sad state of affairs in medicine and science these days. And, according to a study by Gary Null, more people in the United States die from iatrogenic causes — that means by the doctors and the hospitals and the drugs — than by cancer or heart disease. In other words, iatrogenic causes are the leading cause of death in the United States. That’s how bad medicine has gotten. And I think we’re going to have to have a major revolution.
In fact, we are having a major revolution in terms of people’s faith in their doctors. When the truth breaks out about AIDS and HIV and these HIV blood tests and the HIV drugs, I think it’s going to bring the wall down, and we’re going to see major changes in our social structure.
Zenger’s: You’ve mentioned several places your forthcoming book, which is specifically about the HIV antibody tests. Could you tell me a little about that project? Is it going to be another novel or a nonfiction work, and what’s the gist, basically?
Davis: Yes it is, actually. It is going to be a novel and it’s going to use the same lead character, the health reporter from the Arizona Tribune, from my original book, Wrongful Death: The AIDS Trial. Her name is Sarah Meadows. Sarah embarks on this Erin Brockovich-style research concerning the HIV blood test and the fact that they have not been validated, there are so many false positives, that the proteins that are used in the test have never been specifically linked to HIV itself, that there are 10 different criteria used around the world to judge the results of the test, and on and on and on, and even every test manufacturer puts a disclaimer in their test kit that this test kit may not be used to diagnose the presence or absence of HIV.
This book will deal with her research about the people who are diagnosed as “HIV positive” as the result of these arbitrary and capricious and fraudulent HIV blood tests. It will include half a dozen or so actual true-life stories of HIV positives and what the diagnosis meant in their life, and what they’re doing about it, and what the HIV drugs are doing to people. It will specifically focus on the HIV positives who are not taking drugs and are doing well without them.
I want to end by referring you to a Web site called www.staynegative.org. If you look at the banner on this site there are people with diapers, people with distended bellies from lipodystrophy, people with sunken cheeks from facial wasting. This is a perfect example of how they’re spinning this story. Anyone going to this Web page is led to believe that these men are examples of what HIV can do.
But lipodystrophy is not a symptom of HIV, or even of AIDS. It’s a proven side effect of the HIV drugs that HIV positives are forced to take. The title of this Web site is “HIV — not fabulous.” The scientifically correct title would be, “HIV Drugs — not fabulous.” Frankly, it is false advertising to blame HIV, rather than the HIV drugs, for the condition of these men. This is criminal, and I hope to bring this to court one day.
Friday, September 15, 2006
“Another Crack in the Wall”
Song of the George W. Bush Administration; sung to the tune of Pink Floyd’s “Another Brick in the Wall”
We don’t need no Constitution
We don’t need due-process laws
We already know who’s guilty
Ask us how, we say, “Because … ”
Hey! Judges! Leave our trials alone!
All in all, it’s just another crack in the wall.
We don’t need no separation
Church and state, they should be one.
America’s a Christian nation
Or else the terrorists have won.
Hey! Judges! Leave that cross alone!
All in all, it’s just another crack in the wall.
We don’t need no bans on torture
We need all information
Never mind if they are lying,
Waterboarding sure is fun!
Hey! CIA! Make those ragheads moan!
All in all, it’s just another crack in the wall.
We don’t need no free elections,
No polls with long lines in the sun,
We’ve got out-of-date computers,
Diebold tells us who has won.
Hey! Voters! You can just stay home!
All in all, it’s just another crack in the wall.
We don’t need no reconstruction,
We don’t need no New Orleans,
God has swept away the poor folk,
God will keep our cities clean.
Hey! Losers! Soon you’ll all be gone.
All in all, it’s just another crack in the wall.
— Mark Gabrish Conlan, September 15, 2006
(inspired by Bill O’Reilly and dedicated to Philip Paulson)
“Planet of Slums” Author Mike Davis Speaks
Gives Audience Bleak View of World’s Urban Future
by MARK GABRISH CONLAN
Copyright © 2006 by Mark Gabrish Conlan for Zenger’s Newsmagazine • Used by permission
You don’t go to a Mike Davis lecture expecting to have a good time or to hear a rosy, optimistic vision of the future of the human race. The best-selling author of City of Quartz, Ecology of Fear and Under the Perfect Sun came to the First Unitarian-Universalist Church in Hillcrest September 13 to present the ideas in his latest book, Planet of Slums, and express a grim view of humanity’s urban future. According to Davis, not only does the majority of the world’s human population now live in cities instead of rural environments — for the first time in human history — but up to one-third of those city dwellers live in environments so squalid, and lead lives so far removed from the organized economy, that he feels it’s legitimate to describe them as living in slums.
“We’re accustomed to think that the earth is well explored, but until recently we’ve known as little about this new urban planet as our ancestors in the 1830’s and 1840’s knew about the slums of their time,” Davis explained. Most of what we do know about the worldwide slum environments, he added, comes from the studies of a United Nations program called UN-Habitat, which regularly commissions case studies documenting the levels of income, poverty and deprivation in various human environments.
The group recently released a report called The Challenge of the Slums and held a global summit meeting in Vancouver two months ago. This study, said Davis, “gives us our first look at the new urban reality, comparable to the historical audits of the slums of London and New York, all this a testament to how invisible the poor are.” But if you rely for your information on America’s mainstream media, you never heard of the report or the meeting held to discuss it. According to Davis, neither the New York Times nor the Los Angeles Times published one word about this.
Davis also cited as a source “a Yugoslavian economist who works for the World Bank, who has sampled census data and done household surveys around the world, including the former Soviet Union and China.” This economist, Davis explained, came up with something called the Gini Index, which measures the distribution of wealth and income and thereby quantifies the gap between rich and poor both nationwide and worldwide. According to Davis, the current Gini Index for the world is 0.66 — “which is what would result if one-third of the world’s people consumed everything and left the other two-thirds with nothing at all.”
According to the U.N. Habitat report which Davis used as his primary source for Planet of Slums, “no fewer than one billion human beings currently live in slums. The U.N. report doesn’t get distracted by political correctness. It defines a ‘slum’ as a neighborhood or community of anywhere from 200 to half a million people characterized by substandard housing and basic lack of infrastructure: potable water, electricity or sewage.” Davis added that “the 19th century reporters also characterized slums as dens of vice and criminality,” but that isn’t necessarily true of today’s slums — and when it is, Davis added, it’s usually the result of the slum population’s economic marginality and what its residents have to do to survive.
“Not all people who live in slums are poor, but at least 90 percent are,” Davis explained. “The number of the actual urban poor, defined as those who make less than $2 a day — or the extremely poor, defined as making less than $1 a day — is larger because not all urban poor live in slums.” Davis compared the one billion slum dwellers described in the U.N. Habitat report to the people Dickens and Gorky wrote about in the 19th and early 20th centuries, and said the report “puts the problem of the slum at the top of the list of problems the human race will face.” Davis said the one billion figure is, if anything, an underestimate. For example, he explained, the report says only 16 percent of urban Mexicans live in slums — a figure he questioned, since Mexico is one of the few slum environments outside the U.S. he’s actually visited, and he’s convinced the real proportion is far higher.
One of the points Davis returned to again and again is that the world’s slum population exists outside the formal economies of the world as a whole or the individual country where they live. He explained that the reason earlier generations of slum dwellers were able to work their way out of poverty through what he called “sweat equity and hard work” was that they had the option of “squatting” on unused land around the city perimeter. Today, he explained, the earlier generations of squatters have become the new slumlords; once they acquired legal title to their properties, they began to rent them out to the new urban migrants.
What’s more, Davis added, the formal economies of the poorest areas of the planet, where the fastest population growth is occurring — like sub-Saharan Africa, particularly its southwest coast — are producing almost no new jobs. “In urban Africa, job growth is almost entirely outside the organized economy,” he said. “People create their own employment by renting a rickshaw or a pushcart from somebody. In certain parts of Africa they open a shebeen to sell beer from their living rooms, and have their kids work in trades based on child labor. In Taiwan rural women are turning to prostitution. Criminal activities and street gangs are universal parts of the subsistence economy. The U.N. found that the total number of people worldwide involved in the underground economy to be about equal to the total number of people living in slums.”
Davis put the blame for the world’s growing slum population squarely on the Right-wing economic philosophy variously known as globalization, neoliberalism or the “Washington consensus,” and on its international enforcers: the World Bank and International Monetary Fund (IMF). “In the 1970’s former Defense Secretary Robert McNamara became head of the World Bank at a time when the world was overflowing with surplus petrodollars parked in banks which were eager to loan it to the Third World,” he explained. “The IMF ceded part of its role to the World Bank, which became intensely involved in lending money and then, when the countries went into debt around 1980, the World Bank and IMF started imposing ‘structural adjustment programs’ (SAP’s) to force Third World countries to readjust their economies according to the Washington consensus.”
That “consensus,” Davis said, “was that big government had failed and international lenders should give smaller amounts of money to the poor themselves.” That wasn’t such a bad idea in and of itself, Davis admitted, but it came attached to a wide variety of other demands imposed on these countries. They had to decimate their public sectors, slashing social services and closing “unprofitable” publicly owned industries that had nonetheless employed large percentages of these countries’ populations. Even worse, they had to revamp their agriculture, abandoning the cultivation of food crops to feed their own people and instead producing export crops that could be sold on the world market to help pay the countries’ debts.
“SAP’s shrunk both public employment and home market-based employment,” Davis explained. “People came to the cities when the job market was collapsing, the infrastructure was collapsing and the governments were abandoning any plans for new housing. This was when the slum explosion occurred.” This, Davis argued, put the Third World countries in a continuing cycle of dependency in which they were essentially in the same place, economically, they had been when they were still formal colonies of the European powers. Today liberal reformers still talk about “microfinance” and “microcredit” programs as the way to lift Third World countries out of poverty — but, Davis said, “the success stories of microcredit are dwarfed by the scale of the problem.”
Part of the problem, Davis added, is that the informal economy can only support a limited number of people. “It consists of a very small number of niches in which a large number of people try to fit,” he explained. “There can only be so many rickshaw drivers or street vendors. The economic competition among the poor is increasing as more and more people compete for the same scraps. This will not go on endlessly. Eventually there will be rules that only people who are Moroccan, or Muslim, or members of one political party, will be permitted to work. Thus the informal economy breeds sectarianism.”
It’s the dynamic of slums and the disappearance of virtually all opportunities for poor people to advance economically, Davis said — not a so-called “clash of civilizations” between the U.S. and Islam — that’s responsible for the resistance in Iraq, both its persistence against the U.S. and the increasing sectarian violence rocking Baghdad and other Iraqi cities. “I’m not claiming al-Qaeda, which generally consists of alienated middle-class Saudis, as part of this problem,” Davis said, “but nobody in the world today has anything serious to say to 15-, 16- or 17-year-olds in cities who are waking up to the kind of future they face.”
Towards the end of his talk Davis took two audience questions, one from this reporter and one from Activist San Diego founder Martin Eder, both of which gave him a chance to focus on what could be done to deal with the slum problem — and the sheer unlikelihood that what has to be done will in fact happen. “The only way the human species will survive this century and the environmental disasters brought about by indiscriminate capitalism is to make the cities our arks,” Davis said. “The only way to mobilize resources on a finite planet is to construct public spaces. If you’re a wealthy professor, you can buy books every day on Amazon.com, but my library will never be as utopian as a great public library. Privatized consumption turns us all into addicts of wealth. It can’t meet the sorts of needs public institutions can. So much of the literature points to a very dire destiny, but cities also have the potential to bring people together for public purposes.”
The Science of Sleep: Frustrating Dream Movie
by MARK GABRISH CONLAN
Copyright © 2006 by Mark Gabrish Conlan for Zenger's Newsmagazine • All rights reserved
The Science of Sleep — a French-Italian co-production from the world’s oldest continuously existing motion picture company, Gaumont, with a title rather awkwardly translated from the French La Science des Rêves (The Knowledge of Dreams would be closer) — begins as a real charmer. Basically it’s a remake of The Secret Life of Walter Mitty, drawing on both James Thurber’s marvelous short story of the man who couldn’t stop daydreaming and the 1947 film version, for which screenwriters Ken Englund and Everett Freeman made Mitty a proofreader for a pulp-fiction publisher (thereby immersing him in the same man-of-action clichés that fueled his dreams) and involved him in a real adventure story as far-fetched as anything in his dreams or his employer’s publications.
In this version, written and directed by Michel Gondry — who directed Eternal Sunshine of the Spotless Mind and co-wrote its story with Charlie Kaufman and Pierre Bismuth, only to see the reviewers hail Kaufman as that film’s auteur because he wrote the actual screenplay — the Mitty character is Stéphane Miroux (Gael García Bernal). The product of a Mexican father and a French mother (though in that case, why does he have a French last name?), he went to live with his dad when his parents broke up. As the film begins he’s just lost his father to cancer and mom (Miou-Miou) has summoned him to Paris with the promise of a job that will tap his skills as an artist and a graphic designer.
As things turn out, the job is about as mindless as one could imagine: he’s going to be doing pasteup for office calendars manufactured by a company headed by one M. Pouchet (Pierre Vaneck) — a name that seems a calculated pun on “poulet,” the French word for “chicken.” He’s totally uninterested in the job and relatively uninterested in his motley group of co-workers — except for Guy (Alain Chabat), who insists on giving him dubious advice on how to meet, relate to and seduce women — and he’s living in a building owned by his mom, where he meets two potential girlfriends, neighbor Stéphanie (Charlotte Gainsbourg) and her friend Zoë (Emma de Caunes). He’s first attracted to Zoë but soon shifts his affections to Stéphanie, partly because she’s actually there in the same building with him and partly because Gondry has set it up that way and signaled his intentions to us by giving them such similar first names.
Nothing in Stéphane’s waking life, however, matters anywhere near as much as the world he enters when he dreams. We first see him hosting a faux TV show called “Stéphane TV,” which he seems to imagine as a cross between Pee Wee’s Playhouse and Wayne’s World, with cheerily mock props: the “cameras” are cardboard boxes with tubes stuck on the front to simulate lenses, the monitor screens in back of him are also cardboard with TV-shaped holes cut in them, and the sound baffling is made from egg crates. Gondry has clearly thrown far more imagination into the dream sequences than any other part of the movie; the dreams range from hard-edged realism to animated cardboard models and everything in between (the one in which he tries to flee the police in a cardboard car is especially charming), and they’re continually interesting visually in a way the rest of the film is not.
At first we’re genuinely charmed by Stéphane’s character — especially as compared to the hard-bitten, unscrupulous, ambitious aspiring actors Bernal played in Bad Education and dot the i — and we want to see him learn what he has to from his dream world, mature as an adult human being and end up with Stéphanie at the end. The problem with this film is that he doesn’t: he doesn’t grow, he doesn’t change, he doesn’t do anything. He just slacks off more and more on his job and alienates Stéphanie with gross physical references to her body in the manner of an eleven-year-old boy on a school playground taking out his first twinges of puberty on the girls his age. As the movie progresses — or at least unreels — Stéphanie gets more and more alienated by his boorish immaturity — and so does the audience, until by the end both she and we are glad to be rid of him.
Gondry’s inability — or unwillingness — to let his central character grow up makes it impossible to enjoy The Science of Sleep, despite its visual elegance and funny gags. It may seem strange that a sophisticated Latin American actor like Bernal comes off as more of a boor than Jim Carrey did in a similarly themed film by the same director, but that’s what happens. The characters in Eternal Sunshine, with their desperate desires to regain the memories artificially pumped out of them, touch the audience in ways that Gondry’s puppets in Science of Sleep do not. Gondry came to film directing via music videos, and it shows in his airy indifference to dramatic sense and his willingness to move his story in any direction that momentarily suits his desire for striking visual images — but whereas other music-video directors who’ve attempted feature films have had a problem with keeping their characters consistent, Science of Sleep errs in the opposite direction: Stéphane is too consistent, and we get tired of seeing him make the same mistakes over and over.
Buster Keaton’s 1924 Sherlock, Jr. remains the greatest dream comedy ever put on film; like Stéphane, Keaton’s character is a naïf, unable to relate normally to women (or anyone else), who assumes an air of potency and power when, falling asleep while running a movie in a small-town theatre (he works as the projectionist), he literally dreams his way into the movie he’s showing. As director, writer and star, Keaton shows a dazzling imagination far beyond Gondry’s, shooting his character’s dream with real-looking people and props that make the tension between real-reality and dream-reality that much more excruciating and hilarious. Gondry has a real sense of atmosphere — in his evocation of what a publishing company looked like before personal computers and scanners, with pasteup boards and stat cameras, you can practically smell the foul odors coming from the photographic typesetting machines and the chemicals that developed their output into usable galleys — but his movie isn’t anywhere nearly as entertaining as it would have been if he’d just let Stéphane grow up a little.
Hollywoodland: Marvelous Truth-Based Neo-Noir
by MARK GABRISH CONLAN
On June 16, 1959, actor George Reeves — a journeyman contract player at Warner Bros. in the late 1930’s and early 1940’s who’d had a minor role in Gone With the Wind, served in World War II, come back to a town that had passed him by, moved to New York, done live TV and landed a role in the filmed series Adventures of Superman that turned him into an icon — was found shot to death in the upstairs bedroom of his home while some friends of his were partying downstairs. The police officially ruled Reeves’ death a suicide, and that’s what most of the people who knew him thought. Jack Larson, who played Jimmy Olsen on the Adventures of Superman show, said he thought Reeves was depressed because he wanted to appeal to adult audiences and the only people who came to his personal appearances were children. But rumors have persisted that he was actually murdered, and that his killing had something to do with his long affair with Toni Lanier Mannix, wife of MGM’s second-in-command Eddie Mannix.
That’s the factual basis behind the new movie Hollywoodland, which opens September 8. Directed by Allen Coulter — who’s never made a feature film before but has done a large amount of work for TV, including such edgy series as The X-Files, Millennium, Sex and the City and The Sopranos — and written and co-produced by Paul Bernbaum, another TV vet with just one previous feature credit (something from 1998 called Family Plan), Hollywoodland uses Reeves’ death as the basis for a modern-day film noir about greed, lust, the lure of stardom and the enveloping fog of protectiveness the major studios wrapped around their operations from the 1930’s to the 1950’s to make sure that nothing came out in the media that might deglamorize the stars and discourage moviegoers from paying money to see their films.
Hollywoodland is basically two movies in one. It’s a biopic of the last eight years in the life of Reeves (Ben Affleck) that picks him up as a studio-system reject, follows him through his unexpected — and unexpectedly humiliating — stardom as Superman on TV, and grimly contrasts his off-screen drinking, smoking and screwing with the clean-cut image he was expected to maintain as the living embodiment of “truth, justice and the American way.” Intercut with this story is the tale of seamy (and fictional) private detective Louis Simo (Adrien Brody, top-billed) who, reduced to working out of a motel room for only one client, latches on to the Reeves case as a way of making megabucks and gets Reeves’ mother (Heather Allin) to pay him to investigate the case as a murder.
Cynical tales about Hollywood and its denizens are nothing new on screen — indeed one might say that Hollywoodland was made for the moviegoers who liked Chinatown and L.A. Confidential — but Hollywoodland is a richer and better film than either of those. It doesn’t wear its cynicism quite so obviously on its sleeve, and writer Bernbaum is careful to give his central characters at least some points of audience appeal and sympathy. Simo may be in it for the money, but he’s also saddled with an ex-wife (Molly Parker) and a son (Charlie Lea at age 5, Zach Mills later) who idolized Reeves as Superman, then burned his Superman suit — an earlier present from daddy — in a fit of disappointed rage after Reeves died. As for Reeves, he becomes something of a pathetic figure, torn between being the boy-toy of Toni Mannix (Diane Lane) and his lust-at-first-sight relationship with fiancée Leonore Lemmon (Robin Tunney) and also hating the part of Superman even as it gives him something of the stardom he’s long craved.
Hollywoodland is occasionally too sluggish in its pacing, too brown-toned in its physical appearance (a common failing of modern films set in the recent past) and sometimes confusing in the time sequence: often we don’t know which storyline a scene is part of until we see whether Brody or Affleck is in it. But most of Coulter’s direction is tight and involving, and the cast members seem so “right” for their parts it’s amazing that anyone else was even considered for this film (which they were: among the actors up for Simo were Benicio Del Toro and Joaquin Phoenix, while Hugh Jackman and Kyle MacLachlan were on the list to play Reeves). Brody proves that there is indeed life for him after The Pianist; he gives his character the right mix of seediness and underlying integrity, and at times seems to be taking as much punishment here as he did as a man hiding out from the Nazis. Lane is equally adept at playing the seductress in her opening scene and the still good-looking but aging woman at the end who has to face not only the loss of her boy-toy to someone younger but the evidence in her mirror. Affleck, after his recent string of flops, seems all too right for the role of a star on the skids; though he’s not as tall as the real Reeves (who, in his early days at Warners, competed for “B” leads with another tall, fair and gangly young actor, Ronald Reagan), he otherwise looks uncannily like him and brings dimension to a part far more complex than any Reeves himself played.
One especially noteworthy aspect of Hollywoodland is its illustration of how destructive the wrong kind of fame can be. Reeves shows up for the Superman audition sure that the show will never be aired and he’ll get a quick, one-shot paycheck that won’t do long-term damage to his career. When he gets the role and starts actually shooting, he’s forced to wear a padded costume to look more muscular and to do his own stunts, taking a bone-jarring fall from the wire harness that’s supposed to make it look as if he can fly. When he performs as Superman live before an audience of kids, one aims a real gun at him and threatens to shoot him, sure that the bullets will just bounce off the way they do on TV. When Reeves finally lands a role in a big, classy movie, From Here to Eternity, the preview audience giggles when he comes on and wags in the crowd titter, “That’s Superman!” — and he’s cut out of the film. (The real Reeves is visible, barely, in From Here to Eternity, but he was purged from its credits.) And when the Superman series is dropped after seven seasons, the only job his manager can line up for him is wrestling.
There’s a good deal more that’s dark and sinister about this story. Though the film never takes a position on whether Reeves’ death was suicide or murder, it includes real people in the dramatis personae, notably Eddie Mannix (Bob Hoskins, totally suppressing his British accent and talking like an East Coast Jew), who rose from bouncer at the Palisades amusement park in New Jersey to second-in-command behind Louis B. Mayer at MGM; and Howard Strickling (Joe Spano), who spent his life at MGM covering up everything from Clark Gable’s drunk-driving arrests to the suicide of Jean Harlow’s husband. (The real Strickling turned down seven-figure offers to write a tell-all memoir and took his secrets to his grave.) At its height, the studio system — under which even the biggest stars were long-term employees of a single company, contractually barred from choosing their own parts or working anywhere else — was paternalistic but also often brutal in the way it exploited the up-and-coming and disposed of the down-and-falling. In the 1950’s this system was down but not yet out, and the aura of evil that surrounds Mannix and Strickling as they’re depicted here stems from the studios’ determination — aided by a far more cooperative media than today’s — to quash all adverse publicity and present the great stars as epitomes of glamour and morals off screen as well as on.
Allen Coulter and Paul Bernbaum have taken a potentially great story and achieved just the right mix of elements. Tragedy, absurdity, pathos and rage alternate in a plot in which nothing is quite as it seems and the efforts of the characters to achieve a normal life in the Hollywood hothouse — particularly Simo’s doomed efforts to patch his family back together while still dating his secretary (who’s seeing someone else, just as Eddie Mannix’s wife feels entitled to take Reeves as a sort of male mistress because Mr. Mannix has an extra-relational sex life of his own) — are simultaneously exasperating and moving in a twisted sort of way. Even the ending leaves it unclear whether Simo has found himself, sold out or both. Hollywoodland — filmed under the even more ironic working title Truth, Justice and the American Way — dredges up a sow’s ear of Hollywood scandal and turns it into a silk purse. It’s a movie not to be missed.