Ron Fournier: Not A Journalist (Actually, He's A Dick)

So a guy named Ron Fournier (RonFo) is the AP's Washington Bureau Chief. This morning he put this on the wire:
He [Obama] picked a 35-year veteran of the Senate — the ultimate insider — rather than a candidate from outside Washington, such as Govs. Tim Kaine of Virginia or Kathleen Sebelius of Kansas; or from outside his party, such as Sen. Chuck Hagel of Nebraska; or from outside the mostly white male club of vice presidential candidates. Hillary Rodham Clinton didn't even make his short list.
This gives the Obama-as-neophyte meme some legs, no? But it's bullshit. Here is why:

There are two ways to consider Fournier's piece: substantively and in the broader context.

First, on the substance, Fournier's analysis seems a little lazy. By his logic, any potential running mate shows a "lack of confidence" -- picking Hillary would mean Obama lacked confidence in his ability to win over women voters; picking Bayh would mean Obama lacked confidence in his ability to win over independents and conservative Dems; picking Webb would mean Obama lacked confidence in his ability to win over voters concerned about national security; picking Kaine would mean Obama lacked confidence in his ability to win over voters in the South; etc. For that matter, "the status quo" in Washington has been conservative Republican rule. Biden may be an old pro and a DC insider, but he's anything but "the status quo."

Second, in context, Fournier's objectivity covering the presidential race continues to look shaky. We are, after all, talking about a journalist who, as recently as last year, considered working for the McCain campaign.

RonFo is a McBushie! And he sends news all over the world on the AP wire! Keep up the smackdowns!!

A Young Journalist's Take On Biden

Spencer Ackerman (formerly of TNR, stupid bastards) who writes Attackerman has interviewed Biden a few times. Biden, like the universe, is yin and yang, dark and light, up and down, so on and so on. But what shines through is the fact that Biden doesn't bullshit around. He is a serious guy, a wonk, and it is apparently obvious to those who meet him. Here is Ackerman's best description from his post:
Something else that struck me from our interviews. Biden is many things, but he's absolutely not intellectually insecure. I've seen his key staffers argue with him on important, substantive points of policy -- war policy, even -- while I was in the room, notebook out, voice recorder on. Once Biden agreed to an interview about the war after coming back from an eleven-hour flight from Libya, and was disturbingly sharp, and so was his key foreign-policy aide, Tony Blinken. A lot of politicians keep yes-men around. Biden keeps intellectual counterweights around, both his staff and the press, to keep himself sharp. Whatever his faults, he'll be ready to govern from the start.
I am thrilled about Biden as the choice. I like the fact that he argues, has opinions, and takes no shit. Forget the other nonsense about plagiarism (not really) and credential padding (more like parsing). The guy will be an asset to an Obama administration. So, just shut up!

The Republican Machine Will Eat Your Charismatic Candidate

Mike Kinsley has a new post at the Post about the dangers of success. Money quote:
The greatest strength of this year's Democratic candidate, Barack Obama, is his eloquence, his charisma, his ability to create excitement and draw a crowd. This could be a legitimate debating point if the Republicans were saying that, on some particular issue or even many, Obama is using his charm and way with words to disguise a lack of substance in what he says. But Republican ambitions are grander: They are attacking Obama's charisma, as if popularity itself were a disqualifying factor and whoever draws the larger crowds is by definition the lesser candidate. This is truly perverse. It comes close to being an attack on democracy itself. Can the Republicans possibly score with such a preposterous argument?

Oh, probably.
Let's not fawn!


It's nice to hear Biden fired up, literally!

Biden Helps Obama

This is Bidentown. Notice how close it is to Pennsylvania. He is an asset, and may sway some of those in need of catharsis. Anyway, I was just thinking and mapping...

I Had An Original Thought: 2

The school reformers who would "professional development" us to death would also have you believe that most teachers just aren't very smart. These are the same folks who, rightly or wrongly, put weight in IQ indicators. For argument, let's stipulate that too many teachers are stoopid. Here is a chart showing the population's IQ distribution:

If 100 is average, and a standard deviation is 15, go a couple of SDs in either direction and you're looking at someone packing serious brainpower, or not, so to speak. How many folks with IQs of 120+ are going into teaching? Not many. Hell, there aren't very many of them, period! What might lure them?
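For the curious, here's a quick back-of-the-envelope sketch of just how rare those folks are. It assumes IQ scores are normally distributed with mean 100 and SD 15 (the standard convention, and what the chart above shows); nothing here is specific to any real dataset.

```python
from math import erf, sqrt

def iq_tail(threshold, mean=100.0, sd=15.0):
    """Fraction of a normal(mean, sd) population at or above threshold,
    computed from the normal CDF via the error function."""
    z = (threshold - mean) / sd
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

print(f"Share of population with IQ 120+: {iq_tail(120):.1%}")  # roughly 9%
print(f"Share of population with IQ 130+: {iq_tail(130):.1%}")  # roughly 2%
```

So even before you ask how many of them choose teaching, only about one person in eleven clears the 120 bar at all.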

Also, I think there are teachers out there who are incredibly smart, and do their thing their way, and it works. They are indeed, and in IQ score, exceptional. So leave them alone! Let them teach! They bring what NCLB is quickly draining from our kids--creativity! Thoughtfulness! Things that matter to humanity, not some bottom line or rung on a career ladder.


Like I Said...

It's Biden. Secret Service agents have been dispatched to his house.
More here.
Update: It's 1:23am in CA. Where's my damn text message!
Update II: It's 1:55 and I just got the email. Sent Joe a nice welcome note. Now GO GET ELECTED!!!
Update III: It's 1:58 and I got my text message. Carry on.

Schools Aren't Businesses!

Here is another one of those school-as-business/weigh-the-pig debunking articles. Most of these, and the ones I write, come from the position that teachers and schools cannot perform miracles; parents must be involved, and be accountable for the education of their children. Read the article below.
Charlie Kyte: Don't just take aim at the schools; work with them


August 21, 2008

The only chance Minnesota has of developing a competitive future workforce lies in a closely collaborative relationship between businesspeople and educators.

That relationship, though, is far from mutually supportive. Business leaders criticize educators and demand that market-model benchmarks be met to earn what seems like grudging support. Educators resist engaging with the business community for fear of scrutiny that focuses on shortcomings rather than on successes.

Last Sunday, Bill Blazar of the Minnesota Chamber of Commerce laid out his expectations of the Minneapolis public schools in order to gain his support of their much-needed operating levy, to be voted on in November. He stated that he wanted a demonstration of accountability from the school system.

Since success will only come with a sincere effort to work together, I've developed a number of suggested benchmarks for our business community partners. This is what they need to do to earn the support of educators as we together try to have more students be successful.

•Make sure every one of your employees who is a parent of young children has their own child "learning/reading/behaviorally" ready when they begin kindergarten. Make part of their compensation dependent on this goal, and reward them for success.

•Create every possible opportunity to have teachers come inside your businesses to see firsthand the skills your employees need to be successful. Do this at "scale" for large numbers of educators.

•Create a media marketing/advertising campaign that will embed the belief in students and parents that obtaining an education will provide a better future. Tell them "do well in school and we will have a job waiting for you." Sell education as their best hope for the future.

•Begin to compliment educators when there is success. The constant "tough love" set of messages is not lifting our teachers. Rather, it is crushing their spirits.

Oh, and by the way ... we would like all these goals to be measurable.

Let's start working together.

Charlie Kyte is the executive director of the Minnesota Association of School Administrators and served for 20 years as a superintendent of schools in Minnesota.


The Perpetuation Of False Memes (It's Bad!)

I seem to be on a "say it enough and it becomes effectively true" kick. Over at LG&M they have a great little piece demonstrating how destructive such stupidity can be. Remember Al Gore in 2000? A little perspective. Here you go:

Manufactured Outrage

Since it's amazing how persistent the idea that Al Gore was an awful candidate is -- with special focus on his allegedly bad debate performance -- it's worth returning to this simple fact:

But without question, “Al Gore’s [alleged] operatic sighs” played a key role in Campaign 2000. In the wake of that first Bush-Gore debate, TV journalists put Gore’s (infrequent) sighs on a tape; jacked the volume way, way up; and played them again and again, in a loop. And yes, this seemed to affect the election. In the immediate aftermath of that debate, five polls of viewers were taken; in all five polls, viewers said that Gore had won the debate, by an average margin of ten points. But so what? After “journalists” played that loop tape—and flogged some trivial errors by Gore—judgments about the debate began changing. Within a week, Gore lost his lead in the national polls. He was forced to fight from behind right through Election Day.
To be clear: it wasn't "the public" that thought Gore's sighing was more important than the substantive matters, or that Gore's trivial errors about which particular FEMA official accompanied him on a particular visit were more important than George Bush's baldfaced howlers about his fiscal proposals. It was the press. People who actually watched the debate thought Gore won; people who learned about it from press coverage didn't. And it's not clear what a candidate can do in the face of such skewed priorities. It is, however, crucial to be alert to the ways these narratives develop and to counteract them before the fact.

More Education Myth Debunking: Learning Styles

Learning Styles. Multiple Intelligences. Blank Slates. Ghosts In Our Machines. Noble Savages. Jewish Popes. These things do not exist. They are products of our minds, and as such hold great weight, even though they are wrong. You know how these things work--something that is not true is said enough, and heard enough, that everyone just believes it, even though the something is blatantly, obviously, demonstrably false. An example: Saddam Hussein had something to do with 9-11. It was said enough that many believe(d) it. And it affected our behavior. We started a war and killed a bunch of people. So, believing things that are untrue can be very dangerous, even when everyone thinks they are true.

Now, this video by Daniel Willingham debunks the learning styles myth. It has been debunked in many places, fortunately (just Google "learning styles debunked"). Watch, and enjoy the feeling of "ahhhh, I thought so!"

Who Should Obama Pick?

Good grief! Obama should, and apparently did, pick his VP based on what he wants, not what the game demands. But he's gotta play the game, right? It's unfortunately looking more and more like he is going to have to play the game to get elected.

How horrible is it that the one candidate in my voting life whose morals I want to support (like not pandering--to anyone) may be required to pander to a bunch of whiny, in-need-of-catharsis, foot-shooting Clinton supporters by making her his VP? I don't think it will happen, because I think Obama is pretty sure he wants to be the president when he gets elected--not the Clintons.

But is the race so tight that Democrats risk losing to McCain? Looks like that may be so. So, do we need to play the game and start getting behind a Clinton for VP thing? Maybe! But you watch, it will be Biden!

What If There Were No Toys?

Kids used to play. Now they seem to compete with objects bought for them. It is disappointing. Here is a little ditty pointing out how kids morphed from imaginative to needy.
Old-Fashioned Play Builds Serious Skills

by Alix Spiegel

Morning Edition, February 21, 2008 ·

On October 3, 1955, the Mickey Mouse Club debuted on television. As we all now know, the show quickly became a cultural icon, one of those phenomena that helped define an era.

What is less remembered but equally, if not more, important, is that another transformative cultural event happened that day: The Mattel toy company began advertising a gun called the "Thunder Burp."

I know — who's ever heard of the Thunder Burp?

Well, no one.

The reason the advertisement is significant is because it marked the first time that any toy company had attempted to peddle merchandise on television outside of the Christmas season. Until 1955, ad budgets at toy companies were minuscule, so the only time they could afford to hawk their wares on TV was during Christmas. But then came Mattel and the Thunder Burp, which, according to Howard Chudacoff, a cultural historian at Brown University, was a kind of historical watershed. Almost overnight, children's play became focused, as never before, on things — the toys themselves.

"It's interesting to me that when we talk about play today, the first thing that comes to mind are toys," says Chudacoff. "Whereas when I would think of play in the 19th century, I would think of activity rather than an object."

Chudacoff's recently published history of child's play argues that for most of human history what children did when they played was roam in packs large or small, more or less unsupervised, and engage in freewheeling imaginative play. They were pirates and princesses, aristocrats and action heroes. Basically, says Chudacoff, they spent most of their time doing what looked like nothing much at all.

"They improvised play, whether it was in the outdoors… or whether it was on a street corner or somebody's back yard," Chudacoff says. "They improvised their own play; they regulated their play; they made up their own rules."

But during the second half of the 20th century, Chudacoff argues, play changed radically. Instead of spending their time in autonomous shifting make-believe, children were supplied with ever more specific toys for play and predetermined scripts. Essentially, instead of playing pirate with a tree branch they played Star Wars with a toy light saber. Chudacoff calls this the commercialization and co-optation of child's play — a trend which begins to shrink the size of children's imaginative space.

But commercialization isn't the only reason imagination comes under siege. In the second half of the 20th century, Chudacoff says, parents became increasingly concerned about safety, and were driven to create play environments that were secure and could not be penetrated by threats of the outside world. Karate classes, gymnastics, summer camps — these create safe environments for children, Chudacoff says. And they also do something more: for middle-class parents increasingly worried about achievement, they offer to enrich a child's mind.

Change in Play, Change in Kids

Clearly the way that children spend their time has changed. Here's the issue: A growing number of psychologists believe that these changes in what children do has also changed kids' cognitive and emotional development.

It turns out that all that time spent playing make-believe actually helped children develop a critical cognitive skill called executive function. Executive function has a number of different elements, but a central one is the ability to self-regulate. Kids with good self-regulation are able to control their emotions and behavior, resist impulses, and exert self-control and discipline.

We know that children's capacity for self-regulation has diminished. A recent study replicated a study of self-regulation first done in the late 1940s, in which psychological researchers asked kids ages 3, 5 and 7 to do a number of exercises. One of those exercises included standing perfectly still without moving. The 3-year-olds couldn't stand still at all, the 5-year-olds could do it for about three minutes, and the 7-year-olds could stand pretty much as long as the researchers asked. In 2001, researchers repeated this experiment. But, psychologist Elena Bodrova at Mid-Continent Research for Education and Learning says, the results were very different.

"Today's 5-year-olds were acting at the level of 3-year-olds 60 years ago, and today's 7-year-olds were barely approaching the level of a 5-year-old 60 years ago," Bodrova explains. "So the results were very sad."

Sad because self-regulation is incredibly important. Poor executive function is associated with high dropout rates, drug use and crime. In fact, good executive function is a better predictor of success in school than a child's IQ. Children who are able to manage their feelings and pay attention are better able to learn. As executive function researcher Laura Berk explains, "Self-regulation predicts effective development in virtually every domain."

The Importance of Self-Regulation

According to Berk, one reason make-believe is such a powerful tool for building self-discipline is because during make-believe, children engage in what's called private speech: They talk to themselves about what they are going to do and how they are going to do it.

"In fact, if we compare preschoolers' activities and the amount of private speech that occurs across them, we find that this self-regulating language is highest during make-believe play," Berk says. "And this type of self-regulating language… has been shown in many studies to be predictive of executive functions."

And it's not just children who use private speech to control themselves. If we look at adult use of private speech, Berk says, "we're often using it to surmount obstacles, to master cognitive and social skills, and to manage our emotions."

Unfortunately, the more structured the play, the more children's private speech declines. Essentially, because children's play is so focused on lessons and leagues, and because kids' toys increasingly inhibit imaginative play, kids aren't getting a chance to practice policing themselves. When they have that opportunity, says Berk, the results are clear: Self-regulation improves.

"One index that researchers, including myself, have used… is the extent to which a child, for example, cleans up independently after a free-choice period in preschool," Berk says. "We find that children who are most effective at complex make-believe play take on that responsibility with… greater willingness, and even will assist others in doing so without teacher prompting."

Despite the evidence of the benefits of imaginative play, however, even in the context of preschool young children's play is in decline. According to Yale psychological researcher Dorothy Singer, teachers and school administrators just don't see the value.

"Because of the testing, and the emphasis now that you have to really pass these tests, teachers are starting earlier and earlier to drill the kids in their basic fundamentals. Play is viewed as unnecessary, a waste of time," Singer says. "I have so many articles that have documented the shortening of free play for children, where the teachers in these schools are using the time for cognitive skills."

It seems that in the rush to give children every advantage — to protect them, to stimulate them, to enrich them — our culture has unwittingly compromised one of the activities that helped children most. All that wasted time was not such a waste after all.

(And some suggestions:)

Your Questions on Kids & Play

Organizing play for kids has never seemed like more work. But researchers Adele Diamond and Deborah Leong have good news: The best kind of play costs nothing and really only has one main requirement — imagination.

Here, they answer your questions about play.

Better Ways to Play
Self-regulation is a critical skill for kids. Unfortunately, most kids today spend a lot of time doing three things: watching television, playing video games and taking lessons. None of these activities promote self-regulation.

We asked for alternatives from three researchers: Deborah Leong, professor of psychology at Metropolitan State College of Denver, Elena Bodrova, senior researcher with Mid-Continent Research for Education and Learning, and Laura Berk, professor of psychology at Illinois State University.

Here are their suggestions:

Simon Says: Simon Says is a game that requires children to inhibit themselves. You have to think and not do something, which helps to build self-regulation.

Complex Imaginative Play: This is play where your child plans scenarios and enacts those scenarios for a fair amount of time, a half-hour at a minimum, though longer is better. Sustained play that lasts for hours is best. Realistic props are good for very young children, but otherwise encourage kids to use symbolic props that they create and make through their imaginations. For example, a stick becomes a sword.

Activities That Require Planning: Games with directions, patterns for construction, recipes for cooking, for instance.

Joint Storybook Reading: "Reading storybooks with preschoolers promotes self-regulation, not just because it fosters language development, but because children's stories are filled with characters who model effective self-regulatory strategies," says researcher Laura Berk.

She cites the classic example of Watty Piper's The Little Engine That Could, in which a little blue engine pulling a train of toys and food over a mountain breaks down and must find a way to complete its journey. The engine chants, "I think I can. I think I can. I think I can," and with persistence and effort, surmounts the challenge.

Encourage Children to Talk to Themselves: "Like adults, children spontaneously speak to themselves to guide and manage their own behavior," Berk says. "In fact, children often use self-guiding comments recently picked up from their interactions with adults, signaling that they are beginning to apply those strategies to themselves.

"Permitting and encouraging children to be verbally active — to speak to themselves while engaged in challenging tasks — fosters concentration, effort, problem-solving, and task success." — Alix Spiegel

McCain Was NOT Tortured! (According To (V)POTUS)

Andrew Sullivan says in black and white what needs to be said about the sinking depths to which Republicans will go to get elected. It proves again that the Repubs care about the ends, not the means!
Does Bush Believe McCain Was Tortured?

In all the discussion of John McCain's recently recovered memory of a religious epiphany in Vietnam, one thing has been missing. The torture that was deployed against McCain emerges in all the various accounts. It involved sleep deprivation, the withholding of medical treatment, stress positions, long-time standing, and beating. Sound familiar?

According to the Bush administration's definition of torture, McCain was therefore not tortured.

Cheney denies that McCain was tortured; as does Bush. So do John Yoo and David Addington and George Tenet. In the one indisputably authentic version of the story of a Vietnamese guard showing compassion, McCain talks of the agony of long-time standing. A quarter century later, Don Rumsfeld was putting his signature to memos lengthening the agony of "long-time standing" that victims of Bush's torture regime would have to endure. These torture techniques are, according to the president of the United States, merely "enhanced interrogation."

No war crimes were committed against McCain. And the techniques used are, according to the president, tools to extract accurate information. And so the false confessions that McCain was forced to make were, according to the logic of the Bush administration, as accurate as the "intelligence" we have procured from "interrogating" terror suspects. Feel safer?

The cross-in-the-dirt story - although deeply fishy to any fair observer - is in the realm of the unprovable. But the actual techniques used on McCain, and the lies they were designed to legitimize, are a matter of historical record. And the government of the United States now practices the very same techniques that the Communist government of North Vietnam once proudly used against American soldiers. When they are used against future John McCains, the victims will know, in a way McCain didn't, that their own government has no moral standing to complain.

Now the kicker: in the Military Commissions Act, McCain acquiesced to the use of these techniques against terror suspects by the CIA. And so the tortured became the enabler of torture. Someone somewhere cried out in pain for the same reasons McCain once did. And McCain let it continue.

These are the prices people pay for power.

Teachers Hate NCLB (You Would Too If It Happened To You!)

Here is an article by an elementary school teacher. She sounds like most of us! Read it and weep.
One Teacher’s Cry: Why I Hate No Child Left Behind
By Susan J. Hobart, August 2008 Issue

I’m a teacher. I’ve taught elementary school for eleven years. I’ve always told people, “I have the best job in the world.” I crafted curriculum that made students think, and they had fun while learning. At the end of the day, I felt energized. Today, more often than not, I feel demoralized.

While I still connect my lesson plans to students’ lives and work to make it real, this no longer is my sole focus. Today I have a new nickname: testbuster. Singing to the tune of “Ghostbusters,” I teach test-taking strategies similar to those taught in Stanley Kaplan prep courses for the SAT. I spend an inordinate amount of time showing students how to “bubble up,” the term for darkening those little circles that accompany multiple choice questions on standardized tests.

I am told these are invaluable skills to have.

I am told if we do a good job, our students will do well.

I am told that our district does not teach to the test.

I am told that the time we are spending preparing for and administering the tests, analyzing the results, and attending in-services to help our children become proficient on this annual measure of success will pay off by reducing the academic achievement gap between our white children and our children of color.

I am told a lot of things.

But what I know is that I’m not the teacher I used to be. And it takes a toll. I used to be the one who raved about my classroom, even after a long week. Pollyanna, people called me. Today, when I speak with former colleagues, they are amazed at the cynicism creeping into my voice.

What has changed?

No Child Left Behind is certainly a big part of the problem. The children I test are from a wide variety of abilities and backgrounds. Whether they have a cognitive disability, speak entry-level English, or have speech or language delays, everyone takes the same test and the results are posted. Special education students may have some accommodations, but they take the same test and are expected to perform at the same level as general education students. Students new to this country or with a native language other than English must also take the same test and are expected to perform at the same level as children whose native language is English. Picture yourself taking a five-day test in French after moving to Paris last year.

No Child Left Behind is one size fits all. But any experienced teacher knows how warped a yardstick that is.

I spent yesterday in a meeting discussing this year’s standardized test results. Our team was feeling less than optimistic in spite of additional targeted funds made available to our students who are low income or who perform poorly on such tests.

As an educator, I know these tests are only one measure, one snapshot, of student achievement. Unfortunately, they are the make-or-break assessment that determines our status with the Department of Education.

They are the numbers that are published in the paper.

They are the scores that homebuyers look at when deciding if they should move into a neighborhood.

They are the numbers that are pulled out and held over us, as more and greater rigidity enters the curriculum.

I was recently told we cannot buddy up with a first-grade class during our core literacy time. It does not fit the definition of core literacy, I was told. Reading with younger children has been a boon to literacy improvement for my struggling readers and my new English-speaking students. Now I must throw this tool away?

In an increasingly diverse public school setting, there is not one educational pedagogy that fits all students. We study and discuss differentiated curriculum, modify teaching strategies, and set “just right reading levels” to scaffold student learning. But No Child Left Behind doesn’t care about that. It takes no note of where they started or how much they may have progressed.

As a teacher, I measure progress and achievement for my students on a daily basis. I set the bar high, expecting a lot.

I don’t argue with the importance of assessment; it informs my instruction for each child.

I don’t argue with the importance of accountability; I believe in it strongly—for myself and my students.

I have empathy for our administrators who have to stand up and be told that we are “challenged schools.” And I have empathy for our administrators who have to turn around and drill it into our teacher heads, telling us we must do things “this” way to get results. I feel for them. They are judged on the numbers, as well.

No Child Left Behind is a symptom of a larger problem: the attack on public education itself. Like the school choice effort, which uses public funds to finance private schools and cherry-pick the best students, No Child Left Behind is designed to punish public schools and to demonstrate that private is best.

But I don’t think we’ve turned a corner that we can’t come back from. Public education has been a dynamic vehicle in our country since its inception. We must grapple with maintaining this progressive institution. Policymakers and educators know that education holds out hope as the great equalizer in this country. It can inspire and propel a student, a family, a community.

The state where I teach has a large academic achievement gap for African American and low income children. That is unacceptable. Spending time, money, energy on testing everyone with a “one size fits all test” will not eliminate or reduce that gap.

Instead, we need teacher-led professional development and more local control of school budgets and policymaking. Beyond that, we need to address the economic and social issues many children face, instead of punishing the schools that are trying to do right by these students.

We’ve got things backwards today. Children should be in the front seat, not the testing companies. And teachers should be rewarded for teaching, not for being Stanley Kaplan tutors.

Ten years ago, I taught a student named Cayla. A couple of months ago, I got a note from her, one of those things that teachers thrive on.

“Ms. Hobart was different than other teachers, in a good way,” she wrote. “We didn’t learn just from a textbook; we experienced the topics by ‘jumping into the textbook.’ We got to construct a rainforest in our classroom, have a fancy lunch on the Queen Elizabeth II, and go on a safari through Africa. What I learned ten years ago still sticks with me today. When I become a teacher, I hope to inspire my students as much as she inspired hers.”

Last week, I received a call from Niecy, another student from that class ten years ago. She was calling from southern Illinois to tell me she was graduating from high school this month and had just found out that she has won a scholarship to a college in Indiana. I was ecstatic in my happiness for her. We laughed, and I told her I was looking at a photo of her on my wall, building a pyramid out of paper bricks with her classmates.

I also had a recent conversation with Manuel in a grocery parking lot. He reminded me of my promise eight years ago to attend his high school graduation. I plan to be there.

Cayla and Niecy and Manuel are three of the reasons I teach. They are the reasons that some days this still feels like a passion and not a job.

When I pick up the broom at the end of the day to sweep my class due to budget cuts, I remember Cayla.

When I drive home demoralized after another meeting where our success is dissected with a knife manufactured in Texas, I remember Niecy.

When another new program that is going to solve the reading disparity, resulting in higher test scores, is introduced on top of another new program that was supposed to result in the same thing, I remember Manuel.

They are the fires that fuel my passion. They are the lifeboats that help me ride this current wave in education.

Eight or ten years from now, I want other former students to contact me and tell me a success story from their lives. I don’t want to be remembered as the teacher who taught them how to sing “Testbusters” or to “bubble up.” I want to be remembered as a teacher who inspired them to learn.

Susan J. Hobart, M.S. Ed., is a National Board Certified Teacher living in the Midwest.

Tolerance And Diversity: Trouble?

Here is an essay (from the Council for Secular Humanism) about some pitfalls one encounters in the "diversity/tolerance" pedagogy. I agree with most of it, though some of you may find it a bit too critical. In my classroom I have seen how exposure does not necessarily impart good feeling about others. My take on diversity is like my take on most things; you must be smart enough to see the forest for the trees.

Doubts about Celebrating Diversity
By Kenneth R. Stunkel

We like to believe that colleges and universities are unique sanctuaries for critical inquiry and mostly logical thinking and that academics resist unexamined, foolish beliefs as a professional responsibility. As an example of how shaky these assumptions have become in an atmosphere charged with political correctness, I cite my own university. A statement from the president's office has put faculty and students on notice that all beliefs are to be acknowledged and respected, and that "socially constructed" differences are to be acknowledged and celebrated. Well-meant as all this may sound, apparently, no one thought out the implications of these injunctions, had doubts about their wisdom, or raised objections. This essay attempts to do all three by focusing on an irrational, unhealthy phenomenon in higher education that has become insistent and pervasive. American pluralism and egalitarianism have merged with identity mania, the self-esteem movement, and postmodern indeterminacy to produce a seemingly fine, idealistic notion—a celebration of differences between peoples and cultures. Diversity as a celebration must not be confused with diversity as a redress of historic inequities through just representation of women and minorities in the public life of opportunity, work, and study. Nor is it about enjoying a harmless variety of taste, style, demeanor, or aesthetic experience, as when people unlike one another in small ways (as most people are) assemble to hear an African poet or to sample cuisines at an international food festival. And on the most trivial level, the issue is not fashion anomalies like students wearing rings in their noses, eyebrows, and tongues.

The Cult of Differences

At issue is a sometimes overt but more commonly hidden assumption that differences are better and more fundamental than similarities. The idea is not new. The first theorist and champion of incommensurable cultural diversity was Johann Gottlieb Herder, who flourished near the close of the eighteenth century and argued for cultural nationalism and the accepting of differences based on a common heritage of language and custom. In the Middle Ages, nominalists and realists debated whether individual things are more real than any similarities they may share. The tension between differences and similarities is my theme as well, but what I have in mind is an inflated status for ethnicity, not physical differences of race or gender, which are unavoidable and given. Ethnicity implies traditions, beliefs, and practices rather than the anthropology of physical appearance or differences of reproductive anatomy. Academic paeans to ethnicity claim that cultural differences between groups merit spontaneous admiration. The questionable premise is that traditions, beliefs, and practices in all their ethnic and historical profusion self-authenticate their claims to truth, beauty, and goodness. Not only must all the "voices" be heard, whatever they come up with must be treated with respect, since no voice has less or more significance than any other. From this hyper-tolerant perspective, it is not good enough in a pluralistic society to cultivate forbearance or to be content with provisional civility extended to differences of belief, experience, and cultural background. Open-ended diversity is thrust upon us as a positive object of obligatory good feeling. Acceptance of differing outlooks, behavior, habits, customs, and values must be enthusiastic to ward off intolerance and confirm difference as virtue.

For converts to this doctrine of good feeling about differences, the more differences the better and all differences are equal in a spirit of radical democracy. Without an abundance of diversity, sanctified by parity, there would be no cause for revelry. A dictionary (Merriam-Webster's) defines the word celebrate along a continuum from the sacred ("to perform a sacrament") to the secular ("to hold up or play up for public notice"). A generic slant on celebration suggests a receptive attitude in which all sense of discomfort about differences is sponged away. Doubt about the value of diversity is tantamount to outright intolerance, hateful perversity, or lamentable backwardness. If any fragment of difference should provoke indifference, dislike, outrage, skepticism, or resistance, the suspicious party may face quarantine for sensitivity therapy or slip into disrepute as a reactionary. Despite these risks of dissent, I invite some reflection on the pitfalls and limitations of celebrating diversity.

The Price of Ethnicity

However one may react to various cultural practices and beliefs, it is not self-evident that diversity is either good or bad, which holds for similarity as well. Good judgment about what is desirable or not requires historical and social context and invites cautious reflection about consequences. Non-Western traditions have viewed social and cultural differences as little more than blunt facts of life, inviting exclusion, repression, or degrees of accommodation. Celebration has never been an issue. In many countries with an ethnic mix, plain, old toleration ("live and let live") is something of a miracle. Consciousness of kind, however minor the criteria for better or worse, is the mortar that binds people into cohesive groups, until education or wider perspectives crack the mold. Such bonds have the functional purpose of promoting social harmony. Should ethnic differences intervene with consciousness of kind, the outcome might be harmless enough, but it can also be disastrous. Who can argue credibly that diversity has been good for Hutu and Tutsi, Albanian and Serbian, Israeli and Palestinian? At the right historical moment, relatively unimpressive differences of tradition, perception, and interest have triggered mutual persecution and slaughter, with no sense of a common humanity that ought to take precedence over narrow tribal or ethnic identities. The historical reality is that differences have seldom been acknowledged and tolerated, much less celebrated.

The contemporary prevalence of ethnic and tribal conflict suggests it is unrealistic, irrational, and dangerous to embrace difference as an absolute good. Unqualified diversity can be as oppressive as unqualified uniformity. A decision about where to draw the line normally occurs in practice rather than theory. Nevertheless, some obvious antinomies come to mind. A patriarchal status system in which rights are gender specific, for example, is incompatible with a gender-neutral system of equality before the law. Where a gulf between competing values is less dramatic and more bridgeable, to want celebration on top of judicious, humane accommodation invites caution. Sharing the same country and history does not prevent nasty conflicts between secular and orthodox Jews in Israel. Irish Protestants and Catholics present the same spectacle of minor differences adding up to serious conflict. Across the globe, there are peoples sharing the same land and history who are eager to kill one another, like Muslims and Hindus in Kashmir, or Muslims and Muslims in Iraq.

Cultural differences can proliferate with no thought whatever of common interests. A spectacular example at hand is in Indonesia, where one tiny island out of some 17,000 (called Alor) has 140,000 people divided into fifty tribes, speaking nearly as many tongues in seven language groups. Agreement on anything touching the common good is understandably difficult. Imagine a school system in Alor trying to be "ethnically sensitive" while also laboring to impart a shared foundation of knowledge, goals, and commitments. The sensible option in modern pluralistic societies is to ask how much and what kinds of difference to accommodate before consensus becomes impossible and the social order devolves into an incoherent sheet of sand. Much the same can be said for multicultural curricula in schools and colleges, which have overwhelmed any sense of standards and coherence in many places. Eagerness to promote and vindicate diversity is usually indifferent to the social unity needed to keep a school system or a curriculum afloat.

Irreconcilable Values

Celebration of diversity in general makes it difficult to stand for anything in particular. Postmodernists, who claim that truth, meaning, and reality fluctuate with the rise and fall of individual and group perspectives and interests, exploit messy historical facts of irreconcilable or warring differences. Allegedly, no impartial viewpoint is available (a caveat that logically must include postmodern doctrines) to judge the adequacy of "stories" or "narratives" by which minds and bodies are connected to the world. A consequence of unlimited pluralism as a higher good is the demolition of shared purpose in a common world that nullifies any plausible idea of universal human rights. When a bill of particulars is requested, inclusive diversity clashes with familiar notions of impartial justice, fairness, compassion, and rationality. An appeal to human rights assumes the existence of needs and interests embedded in a human identity that takes precedence over lesser identities defined by narrow categories of race, class, gender, and ethnicity.

The world is and always has been a playground for incompatible, mutually hostile value systems. Mormons would be practicing polygamy openly as part of their religion if they had not emerged as a minority in a monogamous society. The Taliban in Afghanistan was persuaded by religious conviction to dispatch adulterous women by stoning them to death. In Morocco and Iran, a Muslim who converts to another faith is severely punished. There are still societies in the Middle East, Asia, and Africa that accept and practice human bondage, the favorite commodities being women and children. How are such cultural practices to be evaluated simultaneously from the perspective of human rights and the blanket imperatives of diversity?

Should an ethnic attachment to astrology be included as a legitimate discipline in college curricula because politicians and bureaucrats in India submit decisions bearing on public issues to readings of the stars? Should tribal shamans be licensed to practice "alternative" medicine? In postmodern jargon, is not one scientific or medical "narrative" as good as another? Japanese identity is still defined markedly by a code of duty and obligation to a group, and the Japanese are notably ethnocentric, acutely aware that someone is a foreigner (gaijin). The Western preference, traceable to the eighteenth century's Enlightenment, is for liberty and expression of the individual in a spirit of tolerant cosmopolitanism. Are both options to be celebrated equally without a murmur of skepticism? Lewis Mumford argued that all cultures and civilizations can be judged by a simple criterion—to what extent are autonomy and unimpeded development of the person respected and nurtured? In other words, do beliefs and practices of a culture enhance possibilities of human life or diminish them? If such a standard resonates, where does that leave Islam with regard to the status of women?

Learning about the "Other"

Ideological pluralists assume that education is the royal path to happy as well as peaceful diversity. Let us better understand other peoples and cultures, goes the argument, so that non-Western "voices" are heard impartially and all identities surface to stand as equals. In secondary and higher education, this belief has been codified as doctrine, but is loaded with impediments. Any teacher wanting a curriculum to mirror diversity faces an insuperable and value-laden problem of selection. Some five thousand ethnic groups are scattered across some two hundred nations, and America hosts around ninety ethnic enclaves. In educational venues, which are to be represented or left out, since no one has time or knowledge to include all of them? If differences are equal, what is the principle of choice? Is not choosing one over the other arbitrary bias and discrimination? And where in a course of study does one find relief from ethnic exposition and celebration to address priorities like reading and writing, geography and mathematics, history and science?

The root fallacy is to think that mere exposure to unfamiliar cultural traditions will promote sympathy and understanding. It may or may not, depending on depth of exposure, a recipient's aptitude, and what ends up being understood and assimilated. Toleration in the ethnic domain is not an inevitable result of understanding. Really knowing the ways and thoughts of non-Western cultures, as opposed to brushing against sanitized versions of them, may have the opposite effect and stimulate dislike. Whatever an individual takes away from cursory reading, group confessionals, show-and-tell sessions, or field trips is likely to be shallow, ephemeral, and misleading. Even if a modicum of interest results from selective exposure, it does not follow that understanding has been achieved.

Every cultural tradition has a grim, murky side that discourages celebration. Censorship of the bad stuff to avoid offending someone invites deficient understanding and later disillusionment. The more some of us understand the social basis for the widespread African practice of vaginal mutilation of young girls to protect their virtue, the more we dislike and oppose it. The Muslim practice of secluding and controlling women is hard to tolerate much less celebrate. Some teachers indirectly praise Aztec culture because it was "victimized" by predatory Europeans, but conveniently ignore brutal Aztec imperialism in old Mexico. An appreciation of Aztec temple architecture is shallow without an understanding that thousands of people had their hearts cut out at the top in religious ceremonies that culminated with bodies tumbling down the steep steps. Full understanding of what the structures were used for makes appreciation more difficult and ambiguous. Indeed, a full understanding that Aztec practices centered on propitiation of bloody, improbable deities might well induce disgust and alienation.

There is a price for understanding other cultural traditions: investment of time and effort, an immersion in chores of hard study that contemporary students resent and evade. Submission to historical settings and absorption in difficult texts are unavoidable conditions for real understanding. The imperfect but attainable attitude of suspended judgment supported by deep knowledge is sine qua non. Alien terms must be mastered. Islam cannot be understood without the Qur'an or Hinduism without the Vedas, nor can words like Sharia, jihad, Shi'ite, karma, Shaivite, and puja be ignored. Convictions and teachings in either case are not grapes plucked effortlessly from a vine, and complications abound. Quite apart from barriers to understanding ways of thinking associated with Confucian China or Buddhist Thailand, life and ideas in the West can be as mystifying in some historical periods as anything gleaned from the anthropologist's notebook. Medieval scholasticism and Renaissance kabbalism, at least in my experience, mystify and befuddle students as much as Hindu Vedanta or the Daoist yin and yang.

Humanity versus Ethnicity

If the idea of universal human rights is taken seriously, then an excess of conflicting social and cultural differences becomes an impediment to their realization. The sociologist Karl Mannheim observed long ago that no society could expect a shared system of coherent values without a process for their creation, dissemination, reconciliation, and assimilation. Such a process in American institutional life, particularly higher education, has been notably weak; it has also been rejected outright by an assortment of ideologues in the past quarter century. The task for American democracy is to secure and sustain an accommodation between diversity and the shared beliefs and commitments that define a society and a nation. If a tradition of universal human rights transcends local ethnic traditions, ethnic diversity without limits or interference will have to yield.

A plausible aim of enlightened education is to lift people above their parochial roots to a larger view of the world, to help them transcend limitations of birth and upbringing, to hasten their liberation from shuttered windows of race, class, gender, and ethnicity. Diversity ideology supplies instead a melancholy determination of schools, scholarship, and public rhetoric to herd people more deeply into a cul-de-sac of glorified particularity. Another aim of good education is critical thinking, which was once a distinguishing mark of the academy. The fate of that ideal is ironic and bizarre. On the one hand, criticism has become a form of intellectual suicide in which "theories" like deconstruction and social construction set out to level everything in sight and end with self-immolation on their own dead-end premises. On the other hand, academic multiculturalists insist that questioning beliefs, values, and practices of the "other," whatever they may be, much less rejecting them, is "insensitive" and "intrusive." Criticism is tantamount to intolerance.

A rare spirit of criticism was codified in the Enlightenment, which flourished only once before in the Greek world, from the sixth to the fourth centuries b.c.e. Its goals were to expose errors and make way for unexpected truths. In our postmodern euphoria, radical pluralists deny the existence of truths that make us free (while, of course, claiming or implying that a truth has been enunciated). Truth is rejected as a form of bondage, because it implies whatever may be true is true for all. The present surrogate for truth is diversity. It is sad that a quintessentially European ideal like diversity is in conflict with the ideal of criticism, which requires argument and evidence to support any belief. On Enlightenment premises, all views and ways of life cannot be admitted as equals. No belief or practice, however sacred or wedded to group or individual self-esteem, is immune from examination. It is intellectual and moral cowardice to refrain from responsible criticism on the ground that offense may be taken. The discomfort of being offended is a consequence of living in a complex world while holding questionable or unsupportable beliefs. It is also a risk associated with getting a decent education and growing up (no more Tooth Fairy). It is an inevitable consequence of encountering points of view and ways of life that cannot or do not want to be reconciled. Indiscriminate "celebration" is incompatible with critical thinking.

No matter how one cuts it, tension and conflict will surface when incompatible value systems confront issues of belief and action in public life. The best hope lies in selective accommodation of differences guided by modest expectations anchored to a core of shared convictions sheltered by common sense and open to criticism. These realities about diversity are widely evaded and denied in higher education, where a dreary scene of identity seeking is being played out in exclusive, solipsistic groups, each claiming a unique version of meaning, truth, and reality, each contributing to an impenetrable social babble, all of it stoutly defended by uncritically idealistic academics—but also by campus zealots and block wardens on the lookout for heresy.

An uninformed, unsuspecting student body, awash in diversity rhetoric and pedagogy, maneuvered by solemn, earnest action plans shaped by diversity ideologues, might be led to think that ethnic violence and hatred, alive and readily visible around the world, have nothing to do with ethnicity and its inherent premise of exclusiveness.

Kenneth Stunkel is a professor of history at Monmouth University in New Jersey.

Seven: Houses That Is

Obama is hitting back harder than I thought he would. Good!

Update: It looks like 8!


Prove Your Premise!!

I read a snippet of a story over at Schools Matter that just bolsters the correct notion that NCLB is a policy with a head-up-its-ass problem. Here is the mythbusting story from NewsReview, Reno!
Policy myths cause a lot of government’s problems, and education is especially damaged
Can a school system built on imaginary premises possibly succeed?

By Dennis Myers

Richard Rothstein spent three years as education columnist for the New York Times, giving a popular audience an unaccustomed look at the work of a scholar in sharp, to-the-point essays that challenged conventional wisdom and corrected public policy assumptions. Reading them and his current work as an associate at the Economic Policy Institute in Washington, D.C., one gets the impression of a very patient man. Although he must deal constantly with public policy myths—things we all know to be “true” but are not—he keeps plugging away trying to dispel them. “All I can do is keep on telling the truth as I see it, and others have to do the same,” Rothstein says. “In the long run, I have to hope the truth wins out.”

The number of myths on which so many public policies are built raises serious questions about whether those policies have any hope of succeeding. In education, there are a number of myths that have been repeated so incessantly by press and politicians that they have become “true” in the public’s mind. Samples:

• Business uses performance pay, so schools should do the same

• Schools are violent

• Parents and students are fleeing public for private schools

• Schools should use the kind of numerical goals that business uses

• Charter schools outperform public schools

In fact, business generally avoids performance pay, schools are the safest places children frequent, private school enrollment is declining, business recommends against numerical goals, and public schools generally perform better than charter schools.

Take just one of them—numerical goals, which were actually written into the No Child Left Behind Act in the belief that they are an accepted business practice. In fact, they reflect a practice that has long since passed out of fashion in the business world, which found that such goals focused workers on process instead of outcome and generated fear in the workplace, low productivity, and customer alienation.

Using modern research methods, the business community discovered that when a worker must have numbers to show, she or he will crank them out by some means, at the expense of careful workmanship and productivity. Legendary statistician W. Edwards Deming, who in the postwar years gave Japanese management the methods of design, product quality and sales that brought that nation to commercial dominance and later swept the United States, wrote in his book Out Of the Crisis, “A numerical goal leads to distortion and faking, especially when the system is not capable of meeting the goal. Anybody will meet the quota (goal) allotted to him. He is not responsible for the losses so generated. Sears Roebuck waded into trouble in 1992 by allotting goals to their Auto Service Centers. Agents tried to meet the goals set for them. They did, to the detriment of the customer and of the reputation of the company.”

Insurance consultant John Pryor tells the companies he advises, “Focus first on the underwriting or claims or audit processes—and the quality of their delivery from the customer’s perspective—and then it can be determined if productivity is optimum.”

Yet numerical goals—what Rothstein calls “goals distortion”—are a basic part of U.S. education policy.

Our peaceful schools

In 1998 during the spate of heavily publicized school shootings around the nation, the Center on Juvenile and Criminal Justice checked some figures on school violence against figures provided by the National Climatic Data Center. “To give the reader a sense of the idiosyncratic nature of these [school violence] events,” the CJCJ reported, “the number of children killed by gun violence in schools is about half the number of Americans killed annually by lightning strikes.”

Nothing has changed since then. There were more deaths on school grounds 32 years ago—when there were 80 million fewer people in the United States than there are today—than now, when they usually number fewer than 50 annually. Of the home, the 7-Eleven, and the park, school is a child’s safest environ—far safer than the home, where as many children are killed in family violence every three days as died at Columbine. But the myth lives on.

Rothstein has been a one-person myth-buster in the education field by doing what reporters are supposed to do—checking the facts and statistics before reporting a “trend” (which may be the most abused word in journalism).

When in the early years of the Bush administration members of Congress were arguing that the private sector uses performance pay, so schools should do likewise, Rothstein went looking for companies that did so. He talked to firms like Wal-Mart and Cisco Systems, consulted Harvard Business School, and called private and commercial schools. All told him the same thing. “The private sector does nothing of the sort,” he said. He even called John Chubb at Edison Schools Inc., the largest firm that tries to get contracts to commercially operate public schools. Chubb said using test scores to influence pay was a mistake. (Rothstein does acknowledge that “stockbrokers and sales clerks are paid on commission” but says the hardball sales tactics the practice fosters “should be intolerable where children are concerned.”)

What George Bush, in his 2000 campaign, called parents voting with their feet—taking kids out of public and into private schools—did not escape Rothstein’s notice. He found that the actual number showed enrollment falling in private schools at every income level.

The charter school myth has been examined in depth, with Rothstein a part of it. In their book The Charter School Dust-Up, authors Martin Carnoy, Rebecca Jacobsen, Lawrence Mishel, and Rothstein examined 19 studies in 11 states and the District of Columbia, and then folded in data from the 2003 National Assessment of Educational Progress (NAEP) tests. Their conclusion: “There is no evidence that, on average, charter schools outperform regular public schools. In fact, there is evidence that the average impact of charter schools is negative.” (Their study also found that a claim that charter schools serve more disadvantaged students than public schools was false.)

In some cases, legislators have enacted legislation knowing full well that it was flawed. Rothstein recalls that during the debates over No Child Left Behind, economists Thomas Kane and Douglas Staiger produced a paper showing that, under the legislation, schools would be rewarded or penalized entirely because of dubious statistics produced by inadequate testing. Months went by while members of Congress wrestled with the problem. Then, they just gave up and passed the bill anyway.

“It’s a mistake to adopt policies that you know, based on the science, cannot work effectively,” Rothstein said.

The cost

None of this would matter much if it were harmless, but it is very harmful. Though the pendulum is now swinging back, for a decade precious education dollars were diverted to expensive high-tech security gear, more police and weapons, and expansion of juvenile jails. Various panaceas were legislated. In Virginia, the governor proposed eliminating after-school programs, an established violence preventive.

No Child Left Behind has exacerbated long-standing education problems, such as teacher shortages.

Can a school system succeed when its policy premises are false?

Former Nevada school superintendent Eugene Paslov says policy myths are complicated—like onions, with merit to be found in some layers as they are peeled: “Some of it has merit and much of it is mythology.” It’s hard to imagine hard-pressed school administrators having the time or resources to sort things out.

Washoe County School District spokesperson Steve Mulvenon says he has to spend unnecessary amounts of time helping reporters do stories about school violence in the hope that they’ll get it right, though at times he has become so exasperated that he has considered cutting off assistance to such stories. “I guess I need to keep putting the message out there on each one of these incidents, that it is an aberration, that it is unusual.”

R.I.P. Stephanie Tubbs Jones

A sad day for Democracy. (from HuffPo)

Pilobolus: The Blue Mat Group

I was swimming freestyle and came across this. I had not seen these folks before, but they are pretty cool! Check out this vid from Late Night with Conan:

It's The Means Meme

Robert Reich, who gets some flak in his comments, hits it out of the park with his comparison of Republicans' and Democrats' consideration of "the means" to a given end. The end, he says, is what Republicans care about--regardless of the means! Do you agree? I do. Read his post here, or continue and read it below.

McCain, Obama, and the Inherent Advantage of Caring More About Ends Than Means

We’ve been here before: The Republican attack machine at full throttle, spewing lies in best-selling books, on Fox News, on talk radio. The mainstream media reporting on the controversy, thereby giving it more air time and squeezing out the Democrats’ affirmative message. Followed by accusations by Democrats that Republicans are playing unfairly. Responded to by smiling shrugs and winks from Republicans, who say Democrats can’t take the heat or can’t enjoy a joke or are out of touch with average Americans who are concerned about whatever it is the Republicans are lying about. This ignites a furious debate among Democrats about how negative they should go against the Republican. “If we use their tactics, we’ll lose the moral high ground,” say the Democratic doves. “If we don’t, we’ll lose the war,” say the Democratic hawks. The debate is never fully resolved. The Democrats sort of fight back but don’t have the heart to do to Republicans what Republicans do to them. And so it goes.

The underlying problem is that Democrats care about means as well as ends, while Republicans care almost exclusively about ends and will use any means to get there. The paradox lies deeper. For most Democrats, the means are part of the ends. We want an electoral process that eschews the lying and cheating we’ve witnessed since Richard Nixon’s dirty tricks. If we use their tactics, we undermine our own goal, violating one of the very things that distinguishes us from them. Yet if we don’t stoop to their level, how can we prevail in a system that allows – even rewards – such lying and cheating?

It’s the same with governing. Right-wing Republicans detest government, so when they screw it up – failing to protect the citizens of New Orleans or returning veterans in Walter Reed hospital, or wasting billions of taxpayer dollars on non-competitive bids for the military, turning budget surpluses into massive deficits – they’re proving their own subterranean point that the public can’t trust government to do anything right. Democrats, once in power, inherit this legacy of distrust and deficit, and spend much of their time in office working their way out of it. And also inordinate time and energy promoting good governmental processes (recall Al Gore’s “making government work” crusade, which holds the record for the most arduous effort generating the least media attention).

Democrats also care about the rule of law – adherence to legal norms, rules, and precedents – as an end in itself. Republican administrations view the law as a potential obstacle to achieving particular ends. Anyone trying to chronicle the Bushie’s disregard for the rule of law is quickly overwhelmed with examples, such as violating civil service laws to fill up the executive branch with political hacks; riding roughshod over constitutional laws in firing federal prosecutors; wiretapping Americans in clear violation of law; holding prisoners of war without charge, in violation of international law; using torture. Democrats, once in power, regard laws as serious constraints on that power. (When I was secretary of labor, the department’s lawyers would instruct me about what I could not do because I was unauthorized to do it, rather than how I might reinterpret or bend the laws in order that I could. The lawyers who work in the Bush administration do the opposite.)

Those who are willing to do anything to achieve their ends will always have a tactical advantage over those who regard the means as ends in themselves. The question posed in this election, and, one hopes, by an Obama administration, is whether the moral authority generated by the latter position is itself enough to overcome these odds.

Jay P. Greene: Still A Moron

Jay P. Greene is known as an "Education Researcher," though we use the term loosely. Leo Casey over at Edwize has a nice explanation of some of the issues JayP has muddled with his moronishness.
Jay Greene and the United Cherry Pickers

Jay Greene needs a union. Anyone who works as hard as he does at cherry picking education research — a white collar version of uncreative, gritty farm work — could use the collective power of an organization that unites all cherry pickers in common cause. Just picture it: a picket line of Greene, Hoxby, Moe and Peterson, marching in line outside of an AERA convention, with signs declaring “Unfair To Educational Research Cherry Pickers” — all carefully written on oak tag paper bought at Wal-Mart.

How could life as an educational research cherry picker be so tough, the reader might ask, that Greene would have to resort to the usurious rent seeking of nefarious monopoly power? How could he end up hitchhiking on the road to serfdom?

Here’s how: Serious research conducted by respected scholars without an ideological axe to grind has consistently found every major voucher experiment in the United States wanting. John Witte’s and Cecilia Rouse’s definitive analyses of the Milwaukee voucher program and the Indiana University studies of the Cleveland voucher program have shown no meaningful educational performance advantage for students in those two high profile, large scale voucher programs. The US Department of Education studies of the Washington DC voucher program [here and here] show no significant educational performance benefits. An overview of the current state of research on vouchers can be found here.

All of this just makes Jay Greene and his comrades in the United Cherry Pickers work harder and harder, on a desperate search through the bountiful fruit of educational research for something, anything that can be cherry picked to support vouchers. Just look at what Greene has been reduced to: glittering generalities that repeat the same tired misrepresentations, again and again, in the most unimaginative way. [He even cites research that is not on the subject of vouchers: Hank Levin will be most surprised to learn that his research "supports" vouchers.]

When the research that is there doesn’t do the job, the “have laptop, will produce junk science on demand” crowd at the United Cherry Pickers make up their own, ‘refined’ versions. In the heat of the 2000 presidential election campaign, Paul Peterson announced a “Harvard study” that found African-American students participating in private voucher programs in New York City, Dayton, Ohio and Princeton, New Jersey had significantly better results on a standardized test. Peterson’s claims went so far beyond what the actual evidence demonstrated that one of his partners in the research, the firm Mathematica, went public with its repudiation. Further research by Princeton University’s Alan Krueger and Pei Zhu cast even more doubt on the results. If you wonder what is at stake in the seemingly ‘inside baseball’ fight over peer review of research [see here and here], it is precisely this misuse of the reputation and currency of the academy to promote a policy agenda with “research” that fails to meet minimal academic standards.

So in the interests of union solidarity, let us provide the following recommendation to Jay Greene and the United Cherry Pickers: you are working too hard, take a vacation. Labor Day is coming up — we won that one for you.


Rachel Maddow Gets A Show!

It took a little while, but the smartest woman on TV, Rachel Maddow, just got her own show!
Rachel Maddow will replace Dan Abrams as host of the 9PM hour on MSNBC, the New York Times' Bill Carter reports. Just last month in a Times article by Jacques Steinberg, MSNBC president Phil Griffin declared Maddow "at the top" of a "very short list" for those who should have their own show, though at the time he said he "[didn't] know when" that would be. As Carter reports, the final stretch of the 2008 election season will be Maddow's debut as the host of her own MSNBC show:
Just in time for the closing rush of the presidential election, MSNBC is shaking up its prime-time programming lineup, removing the long-time host -- and one-time general manager of the network -- Dan Abrams from his 9 p.m. program and replacing him with Rachel Maddow, who has emerged as a favored political commentator for the all-news cable channel.

The moves, which were confirmed by MSNBC executives Tuesday, are expected to be finalized by Wednesday, with Mr. Abrams's last program on Thursday. After MSNBC's extensive coverage of the two political conventions during the next two weeks, Ms. Maddow will begin her program on Sept. 8.

MSNBC is highlighting the date, 9/8/08, connecting it to the start of the Olympics on 8/8/08, as a way to signal what the network's president, Phil Griffin, said "will be the final leg of the political race this year." He added, "We're making that Rachel's debut."

Mr. Abrams, who is well liked at MSNBC, is expected to remain at both that network and at NBC News, where he is the chief legal correspondent. He will also serve as an anchor during some of MSNBC's daytime coverage, as well as a substitute host on NBC's "Today" show, Mr. Griffin said.
The last broadcast of Abrams' "Verdict" will air Thursday.

Abrams, the network's former General Manager, told the Times that he understood the decision.

"Putting my general manager's hat back on, considering where the network is right now, it is actually the right call," he said.

Almost immediately, Keith Olbermann took to DailyKos to celebrate the news, brag about his involvement in the decision — "Yes, I had something to do with it," he wrote — and remind readers that though Maddow's rise at the network was quick ("less than five months between first paid appearance and own show"), his was quicker ("I believe I still hold the MSNBC record: I came back to guest host for three days in 2003 and 39 days later I had a contract to do the 8 PM show.").


Saddleback: Who Cares? (Hertzberg Does)

I love Hertzberg. He is smart as hell, and a great writer, and sees things others do not. He is concerned about how poorly Obama did in his Warren interview. I personally don't give a shit, because pandering to the religious makes me sick. Hertzberg wrote about his reactions to the Saddleback thing here. Here is the good part:
Today’s evangelical Christianity may be thriving on TV and at the megachurch collection plate, but it has yet to find a cause worthy of its fervor. Its crusade on behalf of the unborn and unquickened—which, if successful, would make criminals out of actually existing women and doctors while doing less than nothing to relieve actual human suffering—is a sad waste. A century and a half ago, the great evangelical blockbuster was “Uncle Tom’s Cabin,” by Beecher’s sister Harriet. Now it’s Warren’s self-help manuals and the “Left Behind” series. This is a progression that offers meager evidence either for evolution or for intelligent design.
Saddle Sore

I had hoped for a more fluid, energetic performance from Barack Obama at Saturday night’s Saddleback Church forum—something along these lines:

Alas, it was not to be. Obama didn’t seem to know how to go with the flow and make a real emotional connection with his audience, both in the room and beyond. He seemed a little out of sorts—tentative and wan and much too careful, introducing too many of his comments with a choked “y’know” and keeping his eyes downcast instead of locked on Warren’s. He was wilted, not crisp. Or so he seemed to me.

I was especially disappointed by his answer on same-sex marriage. He said what he has often said before: that he sees marriage as a union of a man and a woman; that he supports civil unions that extend rights and obligations to same-sex couples; that under the Constitution marriage is a matter for the states and should remain so. It felt chilly and legalistic—not quite Dukakis on capital punishment for wife-murderers, but perilously close. He missed a chance to challenge his evangelical audience and connect with it by pointing out the human contradiction between sectarian doctrine and Christian compassion. Christian denominations take all manner of views of homosexuality, but it’s hard to find a Christian these days who insists that simply being homosexual—having a gay or lesbian orientation—is sinful in and of itself. If two people love each other and wish to commit themselves to each other and are eager to take on the responsibilities and joys of family life, including the raising of children (biological or adopted) in a loving home, isn’t that a good thing, not just for them and the children they give a home to, but for all of us? Obama said, “I think my faith is strong enough and my marriage is strong enough that I can afford those civil rights to others, even if I have a different perspective or a different view.” But how helpful is it to imply that other people—people whose faith and/or marriages aren’t so strong—are in danger of abandoning their faith or their marriage because gay people are permitted to get married?

There are observers I respect—Andrew Sullivan, for one*, and members of my family with whom I watched the program, for two more—who saw thoughtfulness and humility where I saw hesitation and eggshell-walking and, watching McCain, saw pandering and bloviating where I saw shrewdness and confidence. If they’re right, and I hope they are, I still wish that the thoughtfulness and humility had been accompanied by a bit more passion and force.

Everyone, me included, seems to agree that Rick Warren was the undisputed winner of the night. Granted, he isn’t a particularly probing questioner. But at least he was polite, and he didn’t commit any of the sins James Fallows enumerates in the current Atlantic. With his genial personality, his emphasis on happiness over hellfire, and his instinct for (relative) moderation, he reminds me a little of the young Henry Ward Beecher, a fascinating biography of whom I happen to be reading at the moment. Not that they’re in quite the same league, of course. Warren’s willingness to admit the reality of global warming and his admonitions to his fellow evangelical heavies to quit demonizing Democrats are most welcome, but he’s still got a ways to go before he can match the content or courage of Beecher’s stirring antislavery sermons.

Today’s evangelical Christianity may be thriving on TV and at the megachurch collection plate, but it has yet to find a cause worthy of its fervor. Its crusade on behalf of the unborn and unquickened—which, if successful, would make criminals out of actually existing women and doctors while doing less than nothing to relieve actual human suffering—is a sad waste. A century and a half ago, the great evangelical blockbuster was “Uncle Tom’s Cabin,” by Beecher’s sister Harriet. Now it’s Warren’s self-help manuals and the “Left Behind” series. This is a progression that offers meager evidence either for evolution or for intelligent design.

*See also Andrew’s superb series of posts yesterday on the questionable provenance of McCain’s “cross in the dirt” anecdote.
(Photograph: Alex Brandon)


My Obama VP Prediction

Joe Biden

Why don't you vote for your VP pick above? Sorry, the poll was created before Musharraf was available as an option, and it can't be altered.

The Drug War: We're Losing

Do you think we have a drug problem in the U.S.? Well, we do. Too many of us get arrested for smoking pot (well, not me), while we all take all kinds of mind-altering drugs prescribed by our doctors. McCain apparently loves Ambien. Which is the bigger problem?

Watch this, then tell me...

Go To Newshoggers

This Newshoggers story is an important read because it reminds us that McCain is dangerous, and Obama must win. Visit Newshoggers.
Snatching defeat from the jaws of victory

By Ron Beasley

I was late to endorse Obama - late January. I was afraid that he was not up to fighting the Republican slime machine and that he was an empty suit - lots of talk and little substance. I haven't seen anything at this point that would lead me to believe I was wrong. This is very disturbing because the thought of John McCain as president truly scares the hell out of me. At the first sign of trouble he looks for somebody to hit.

Response to 9/11 Offers Outline of McCain Doctrine
WASHINGTON — Senator John McCain arrived late at his Senate office on the morning of Sept. 11, 2001, just after the first plane hit the World Trade Center. “This is war,” he murmured to his aides. The sound of scrambling fighter planes rattled the windows, sending a tremor of panic through the room.

Within hours, Mr. McCain, the Vietnam War hero and famed straight talker of the 2000 Republican primary, had taken on a new role: the leading advocate of taking the American retaliation against Al Qaeda far beyond Afghanistan. In a marathon of television and radio appearances, Mr. McCain recited a short list of other countries said to support terrorism, invariably including Iraq, Iran and Syria.

“There is a system out there or network, and that network is going to have to be attacked,” Mr. McCain said the next morning on ABC News. “It isn’t just Afghanistan,” he added, on MSNBC. “I don’t think if you got bin Laden tomorrow that the threat has disappeared,” he said on CBS, pointing toward other countries in the Middle East.

Within a month he made clear his priority. “Very obviously Iraq is the first country,” he declared on CNN. By Jan. 2, Mr. McCain was on the aircraft carrier Theodore Roosevelt in the Arabian Sea, yelling to a crowd of sailors and airmen: “Next up, Baghdad!”
Now if that doesn't scare you, it should. Digby, as usual, says it very well:
I remember writing a long time ago that John McCain is the man George W. Bush was pretending to be, right down to the flight suit. The Real Thing is actually far more dangerous than the cheap imitation. If he wins this thing, we could find ourselves in a very, very serious crisis, of both economic stability and national security ---- and very likely of our government itself. This man is unstable.
But she also gives us a branch of hope. The corporate types know this and they also must know that Armageddon would be bad for business.
The funny thing is that I don't think the Big Money Boyz expect the Republicans to win this election so they didn't think there was much danger in putting Buck Turgidson on the ballot. You can't help but wonder if they are having some second thoughts about allowing for even that slim possibility.
All those tax cuts and deregulation won't be worth much if the hot-headed John McCain brings on a nuclear winter. That said, the race is still a lot closer than it should be, because Obama is not connecting with the average voter, and he needs to go after John McCain. Obama already has a handicap: we don't like to talk about it, but he is a black man running for president in the US of A. John McCain has some handicaps as well.
  • His age - there is nothing wrong with making that an issue.
  • He's a Republican.
  • He didn't have the "right stuff" to be an admiral.
  • He admits that we are worse off than we were 4 years ago but plans to continue Bush policies.
  • He's a hot head.
And that's just a few. Use them all. I'm sorry, but nice guys usually finish last in US politics and this is too important.