
Archive for the ‘Columns’ Category

My discovery earlier this week of Chapeau Noir Golf prompted me to do a Top Ten Best-Dressed Players on the PGA Tour list yesterday, which in turn prompted me to do a Ten Worst-Dressed Players list today.

Let me first say that, with a few exceptions, it’s hard to call any PGA Tour players “badly dressed.” The vast majority of them are wearing slacks, shirts and shoes that neither I nor my two readers could ever afford, and it’s difficult to look too terrible when your outfit costs the better part of a Super Bowl ticket. Still, a few guys on Tour do manage to blow it in the dressing department — even when they’re spending thousands of dollars to do it.

In order from worst dressed to least badly dressed (No. 1 being the worst), my Worst Dressed list is as follows:

1. John Daly. But for Charley Hoffman, Long John would have won this title going away — his first win in quite some time. The Loudmouth pants are atrocious and he doesn’t help things by combining them with hats and shirts that rarely go. And while it may be unfair to say he looks like he dresses in the dark, it is fair to say he looks like he dresses in a dimly-lit room. (Ed. note: I don’t begrudge the guys at Loudmouth. They succeeded in building a national brand, something I tried to do with Betcha.com but failed [although, to be fair, I didn’t exactly get a fair shake]. I just don’t care for their gear.)

The hat, pants, and shirt/sweater are all fine -- by themselves. But JD usually looks like he dressed in a dimly-lit room.

2. Charley Hoffman. Green shoes, flat-brimmed hats, green gloves. I just don’t know what this guy is thinking, but he looks bad in a “hard to look at” sort of way. If Daly was -18 in the Worst Dressed Tourney, Hoffman was -17.

3. Woody Austin. Aquaman’s a distant third in my Worst Dressed Tourney metaphor, somewhere around -10. I don’t know what he’s wearing these days, but I think he wore that Tabasco gear for about ten years. That’s ten years too long. Ditto for Scott Hoch, who’d have made my list were he not now on the Champions Tour.

4. Boo Weekley. I should probably be kinder to Boo because at least he’s making an effort and has his own schtick. But that camo look he’s got going on just doesn’t work. Even for a country hayseed.

Ugliest. Shirts. Around.

5. J.B. Holmes. I don’t know what it is about J.B.’s look, but for some reason he just doesn’t look good. Maybe it’s that steady diet of two-toned shirts. Or maybe it’s that he so often favors orange shirts with black undershirts (Halloween, anyone?). Or maybe it’s the Cobra hat that never seems to go with the rest of his outfit. I can’t put my finger on it, but it has him at No. 5 on this list.

6. Phil Mickelson. I love watching Phil. He’s the second-best player of his generation, he’s as charismatic as the day is long, and I think he’d make a great PGA Tour Commissioner one day. (Ditto for his caddie, Jim “Bones” MacKay, who’s articulate, knows the Tour inside and out, and is as well respected in his craft as anyone who’s ever lugged a bag.) But Phil just doesn’t look good on the golf course. It isn’t that he doesn’t drop the cash — I heard his custom-made brown alligator shoes went for something like $1,500. It’s that what he wears just doesn’t work. The pinstripe pants are too much, and those shirts — my God, those shirts. For the life of me I can’t figure out how Phil doesn’t get it, golf-fashion-wise. I’d have him higher on this list, but at least he has some style — I can’t say that about any of the top five.

7. Steve Elkington. Conventional wisdom was that Elk was one of the best-dressed players on Tour. Neither I nor his colleagues see it that way. The ’95 PGA champion used to look like a ho-hum PGA Tour pro. But somewhere along the line something went sideways. I’ve heard commentators say he designs his own stuff; if true, he should consider leaving clothing design to professional designers. These days, his hats sit too high, his shirts come up short in both size and sleeve length, and I think his pants still have pleats.

Like Philly Mick, Elk tries -- but doesn't pull it off.

8. Bubba Watson. The Man in the Black Hat ranks Bubba among the best-dressed players on Tour. (More.) As I explained in my Best Dressed column, I dissent.

9. Sergio Garcia. Sergio often gets it right on the golf course. But just as often he gets it wrong — way wrong. His Adidas hats are awful with that piping, and the Doug Sanders-inspired monochrome look doesn’t work well — especially when he does it in anything but black. (Remember his canary suit at Hoylake?) Of course, no one looks good when they’re horking loogies.

Too often Sergio looks like walking sherbet.

10. Bill Haas. Jay’s kid is a classic example of a guy dressing about twenty years older than he is. The guy’s lean, cut and nice looking — but he dresses like a guy fixing to move to the Champions Tour. Too many knit collars and sleeves to the elbows for a guy in his twenties. Because he’s otherwise got the look, this one’s an easy fix, and I don’t anticipate him being on this list for long.

Dishonorable mention goes to Dustin Johnson (see my commentary on my Best Dressed entry), Jim Furyk (those camp shirts of a few years back were horrid), Jeff Quinney (see Bill Haas), Matt Kuchar (ditto, plus his pants don’t fit) and Brian Davis (those techie shirts and techier sunglasses — yuck).


Last week, the Washington State Supreme Court ruled that Betcha.com, a Seattle-based person-to-person betting platform I founded, violated the very state Gambling Act I designed it to comply with. The Court’s opinion didn’t pass the giggle test.

Betcha was a social networking site where people offered and accepted bet propositions. (Think eBay meets Facebook in Las Vegas.) We charged people to offer and accept those propositions. The Site was honor-based: bettors could opt out of their losses (read: no gambling), but if they did they risked receiving negative feedback. The Washington State Gambling Commission shut us down in 2007 — we’ve been in court ever since. In 2009, a Washington Court of Appeals held that there was “no logical basis” to conclude Betcha bettors were “gambling” under state law. It also held we were not bookmaking, as the State had alleged.

The Supreme Court didn’t reach “gambling.” Instead it held that Betcha, which lets individuals bypass bookmakers by connecting them personally, was itself a bookmaking operation and thus engaged in “professional gambling.”

How it got there was jaw-dropping.

Washington’s Gambling Act prohibits all sorts of “professional gambling.”  “A person is engaged in ‘professional gambling’ … when (inter alia) (t)he person engages in bookmaking.” RCW 9.46.0269(1)(d). “(B)ookmaking” means “accepting bets, upon the outcome of future contingent events, as a business or in which the bettor is charged a fee or ‘vigorish’ for the opportunity to place a bet.” RCW 9.46.0213. The Court read the first “or” as separating two independent clauses – everything from “accepting” to “business” on one side, “in which” to “bet” on the other. Since Betcha charged fees, the Court reasoned, it violated the second clause.

The problem: statutory definitions are meant to be read in context. “(B)ookmaking” appears in the definition of “professional gambling,” and when you plug the Court’s second definition of bookmaking into that provision, its error is obvious:

“A person is engaged in professional gambling … when (t)he person engages in (in which the bettor is charged a fee or ‘vigorish’ for the opportunity to place a bet).”

Two consecutive “ins”? Even Microsoft Word’s grammar check knows that’s wrong. The correct read – ours – was that, fees charged or not, one must “accept” bets to be a bookie. By adding an active verb – “charging” – to the statute, making two independent clauses where two dependent ones are written, Justice Tom Chambers (or his law clerk) literally rewrote the law I had tried to abide by, turning a necessary component – “accepting bets” – into a wholly unnecessary one. That we did not “accept” bets was our principal defense! Reasonable people can quibble about what it means to “accept” a bet – though I wonder what the bettor who accepted the bet did if Betcha “accepted” it. (“Super-accept” it?) They cannot quibble about grammar or the order and tense of words in a statute.

There’s more. We argued that implied in the term “bets” was that they be gambling bets, just as the word “races” in a hypothetical Bobsled Act would be limited to bobsled races and not, say, horse- or three-legged races. We should have been safe: “bet” appeared on both sides of the aforementioned “or,” and the State didn’t address our points. No matter. The Court brushed them aside, too, concluding that we were asking it to read words into the statute. So the Gambling Act covers even betting that isn’t gambling. Wow. Having decided that Betcha was bookmaking, the Court didn’t consider whether Betcha bettors were “gambling” – thus, “professional gambling” without any actual gambling, or even any thought given to gambling. All in a criminal statute, where doubts about coverage are supposed to be resolved against the State. Not a single justice doubted such an odd result. Hmm.

I knew we were in trouble at oral argument, when the justices raised objection after objection we’d knocked down in our supplemental brief. (I wonder if a single justice even skimmed it.) I really knew we were toast when Justice Jim Johnson asked whether Betcha would compete against tribal casinos. But I didn’t think a state supreme court – or a traffic court – would airmail in an opinion that so evidenced a pick-the-winner-first approach. If it didn’t mean I’d almost certainly go to prison, the Court’s earth-moving would be comical. So blatant were the Court’s errors that I wonder whether it was the law or the identity of the litigants that mattered at the Temple of Justice. I would have preferred a one-line opinion that said “look, dude, you can’t beat the State.”

At least there’d have been no pretense of objectivity.

Nicholas G. Jenkins is a 1991 graduate of the University of Washington, a 1994 graduate of the Georgetown University Law Center, and the founder of Betcha.com. He blogs at JenkinsFamilyBlog.Wordpress.com.


Ed. note: I wrote this column back in November 2003. I must have been hurting for material.

Just when you thought there were no good arguments left to keep white males off college campuses, researchers at Harvard have come up with another one. In a study entitled “Watering Down Drinks: The Moderating Effect of College Demographics on Alcohol Use of High-Risk Groups,” Professors Henry Wechsler, Ph.D. and Meichun Kuo of Harvard’s School of Public Health concluded that white males on college campuses cause each other to binge drink. Study the study, however, and a more apt title comes to mind — “Watered Down Logic.”

The study supposedly asked “whether colleges with larger enrollments of students from demographic groups with lower rates of binge drinking (women and minorities) exert a moderating effect on students from groups with higher binge drinking rates (white males).” It analyzed data from 52,312 college students at predominantly white colleges from the 1993, 1997, 1999, and 2001 College Alcohol Study surveys.

According to Harvard’s press release — which announced the study and will be the only thing about it anyone actually reads – the study concluded that binge drinking rates among white, male and underage students are lower at college campuses that have larger proportions of minority, female, and older students. It also found that greater diversity on campuses may serve as a “risk-protective factor,” even for those who were binge drinkers in high school. That is, incoming white freshmen who did not binge drink in high school were less likely to start binge drinking as college students if their universities had higher proportions of African American, Latino, Asian or older students. Conversely, incoming white freshmen who were binge drinking in high school were less likely to continue binging when attending schools with higher percentages of minority or older students. This “risk protective” finding is, in the words of the press release, its “most significant” conclusion.

Now these are quite sensational conclusions. One problem – they don’t follow. It’s one thing to show a correlation between more female or African American students and lower binge drinking rates. The study indeed did that. But it’s quite another to conclude that the former causes the latter. Freddy Freshman might decide to bury his head in the books on a Friday night. But it doesn’t follow that, but for the presence of twenty-two African Americans in his dorm instead of twenty, he’d have been out hammering a half rack of Heineys. I’ve seen shaky logic in academic studies before, but this takes the keg.

There are — dare I say — other, more common sense, explanations. Wechsler himself admits deep in the text of the actual study that “(c)olleges that have larger numbers of minority and older students and women may attract white, underage and male students with different attitudes about drinking.” In other words, lunkheads who binge drink might be more attracted to the Arizona States of the world than the Yales or Harvards.

Or it might be that male students prone to binge drinking just don’t get accepted to schools with larger numbers of minority and older students. If there’s an inverse correlation between high school binge drinking and grade point averages, and a positive correlation between premier schools and diversity — and I’m sure Harvard would be proud to say there is — then that’s as good a bet as any.

But neither of these possibilities mattered to the Harvard PC – er, PR – machine that issued the press release. Nor did the causal issue matter much to Dr. Wechsler, who, in paragraph three of the press release, announced like a proud papa that “(t)his study has shown that having a diverse student body on college campuses is an important factor in lowering binge-drinking rates.” (Italics mine.) No mention of alternate causal explanations, which made Wechsler’s next leap easy: “(i)n making decisions about admissions, colleges should recognize the many benefits of greater diversity on campus, including a possible decrease in problem drinking.”

What folly. Using Wechsler’s logic, one way to combat incidents of interracial violence on campus is to not consider race as a factor in admissions. After all, considering race necessarily increases a student body’s minority population and, with it, the occurrences of violence between races. Want to increase graduation rates? Admit fewer African Americans – their graduation rates pale compared to whites’. Want to combat the problem of eating disorders on college campuses? Admit fewer women – after all, they are statistically more inclined to eating disorders than men. I don’t see anyone in the Harvard faculty lounge signing on to any of those cures. But sign on to a recommendation that casts white males in a negative light – no problem.

It gets worse. The brilliant Hah-vud minds who wrote the press release concluded that “(t)he findings suggest practical solutions for predominantly white colleges, including: creating a campus environment that would attract a diverse student body; increasing the number of minorities on campus; encouraging more women and older students to live on campus, and in fraternity and sorority houses; and decreasing the heavy concentration on campus of likely high-risk drinkers who are overwhelmingly young, male, and white.”

Diversi-crats should “Amen” that sophistry, because the machine’s startlingly broad conclusion could be used to justify just about anything in their platform. Need a reason to build a new campus diversity center? How about: “A new diversity center will attract a diverse student body, which will decrease the incidents of binge drinking on campus.” Looking for a reason to admit Bobby Blackguy over Willie Whiteguy? How about: “Admitting Bobby will decrease the heavy concentration of white males on campus, which will decrease the incidents of binge drinking.” (Note to any white diversi-crats still reading: skip the “Amen.” An atheist may charge you with a hate crime.)

The media is lapping this stuff up off the tap. The Health Channel on Discovery.com proclaimed that “Campus Diversity Leads to Drop in Binge Drinking.” Some outfit called Join Together Online ran a headline over Harvard’s press release proclaiming that “Diverse College Campuses Yield Lower Binge Drinking Rates.” And the always-objective Reuters announced that “Diversity Helps Colleges Trim Binge Drinking.” Even CNN picked up the story. I don’t blame them: they know that sensational, anti-white male headlines are catnip for the chattering class. Trivial matters like truth are secondary at best.

College admissions officers don’t need more reasons to stamp “REJECTED” on white boys’ college applications. Most of them already think all white males are oppressive, even the ones still in high school. They see it as their role to exact a little payback, and now, thanks to this social engineering masquerading as scholarship, they can cite public health concerns to do it. Here’s hoping admissions officers stick to the “oppressor” sham when they discriminate against white males. At least for a few of their great-great-great-great-great-grandfathers, it might have been true.


Ed. note: I penned this in November 2006, when I still cared about such things.

The worst part about the Republicans’ loss of the Senate on Tuesday night was, as Newt Gingrich and outgoing Senate Judiciary Chair Arlen Specter (among others) have since pointed out, that it was oh-so-avoidable. The war was an albatross for Republicans heading into Election Day — specifically, the Bush Administration’s rhetoric about “staying the course.” As late as November 1, President Bush said he expected Rumsfeld to be with him through the duration of his Administration.

But — and here’s the infuriating part — President Bush’s actions didn’t match his rhetoric. Chief of Staff Josh Bolten has since admitted that, behind the scenes, the Administration was looking for a Rumsfeld replacement. And they had him — Texas A&M president Robert Gates — before the election. As Specter pointed out, it’s “a hard thing to calculate (exactly when Bush settled on Rumsfeld’s successor). But it’s highly doubtful that he made up his mind between the time the election returns came in on Tuesday and Wednesday when Rumsfeld was out.” I’d add: you don’t have the president of Texas A&M show up to a Wednesday morning press conference announcing him as Rumsfeld’s replacement unless he’d already agreed to take the position.

The president’s unfortunate decision to withhold his announcement ended up being a major blunder, probably even a historic one. In the short term, it cost the Republicans the Senate. To wit: a poll on AOL News asked: “Would you have voted differently if you knew Rumsfeld was resigning?” Eight percent (8%) of the 283,363 people who responded as of this writing said “yes.” Eight percent! That’s more than enough to have made a difference, and then some. Both George Allen in Virginia and Conrad Burns in Montana lost their races by less than 1% of their states’ votes. Had either Allen or Burns won, the Republicans would have kept the Senate.

Medium term, the decision will probably cost America the war. Soon-to-be Majority Leader Harry Reid has made it clear that Senate Democrats will spend the next two years bogging the Administration down with subpoenas and document requests. He calls it “oversight.” The presumptive new Chair of the House Government Reform Committee, Henry Waxman of California, echoed the sentiment last week when he said the list of areas of possible Administration wrongdoing is so long that “the most difficult thing will be to pick and choose.” Democratic operative Susan Estrich calls it “Democrats Gone Wild.” The end result of all this — Nancy Pelosi’s headline-grabbing pledge notwithstanding — will, at best, be an Administration that spends the next two years doing little more than responding to document requests. At worst, Democratic “oversight” could very well result in impeachment hearings — if not for some underlying charge like the well-weathered, trumped-up WMDs, then for some perceived shortcoming in responding to subpoenas or document requests. (See Patrick Fitzgerald and Plamegate for precedent on that one.) The best way for the Administration to avoid this: negotiate a deal whereby, in exchange for dropping impeachment and investigations ad nauseam, the Democratic leadership gets what it wants. In this case, that means announcing a troop withdrawal — er, “redeployment” — from Iraq. In other words: four-plus years, 3,000-some deaths, and untold billions — all for naught.

Long term, it may end up costing a lot more. If I’m right about the war winding down, Democrats in Congress can claim that the United States got control of Iraq once the Administration started listening to them. The Republicans’ strongest card throughout the last generation — foreign policy — will then be in Democratic hands. That will probably be enough to elect a President Obama or Clinton. It will certainly be enough to keep the Senate in Democratic hands through 2008. That means a move left in health care, entitlements, and economic/tax policy, among other areas. It also means the one more conservative Supreme Court appointment needed to kickstart a Roberts revolution ain’t gonna happen.

Worst of all for Americans, the very avoidable election results sent a message to all those who would oppose America abroad that they can effect political change in America — and hence, military victory on the battlefield — if only they keep up the propaganda war. Terrorists now know it’s official: America may not be a paper tiger. But it is surely a tiger run by its papers.

President Bush might have had a good point when he said after the election that announcing Rumsfeld’s ouster before the election would have put off the troops. But it was just that — a point. It was far outweighed by the risk of holding back. The president’s dubious call may well kill two revolutions — the conservative revolution at home, and the democratic revolution (battle one: Iraq) abroad. Just a guess, but the mostly Republican troops will find that to be a far more off-putting result.


The following op-ed was published in the Seattle Post-Intelligencer on October 2, 2007. A few days later, Governor Christine Gregoire signed my extradition papers to Louisiana. That made me and my colleagues the first human beings ever extradited to the Bayou State for allegedly violating its online gambling law.

By NICHOLAS G. JENKINS
GUEST COLUMNIST

Soon two colleagues and I may be hauled off to Louisiana in shackles. Our “crime”? I founded and they work for Betcha.com, a Seattle-based person-to-person betting Web site on which, acting with the Washington State Gambling Commission, a Louisiana state trooper accepted four bets. Our take: 70 cents. The trooper says the patent-pending Betcha.com violates Louisiana’s online gambling law. We think Betcha.com is legal, and in July we filed a lawsuit against the WSGC to get a state judge’s take. Our hearing is scheduled for Nov. 9.

Gov. Chris Gregoire, who bet openly against the governors of North Carolina and Pennsylvania in the Seahawks ’06 Super Bowl run, can stop the extraditions. She should.

Betcha already has paid dearly for being located in Washington. The ultra-aggressive WSGC demanded that we shut Betcha down and cough up our revenue 13 days after we launched. The WSGC was working with Louisiana at the time, but Betcha saw no action from the Bayou state until the trooper bet — 32 minutes after we notified the WSGC about our lawsuit. That raises an inference of retaliation. The WSGC raided our office and seized our computer equipment, but kept our lawsuit quiet when it obtained the necessary search warrant. Then it launched a forfeiture action — while our action was pending. (That happened within days of a judge’s ruling that the WSGC acted arbitrarily and capriciously in a suit against another in-state employer.)

A King County prosecutor alleged under oath that we fled Louisiana in July to avoid arrest. Problem: My colleagues have never been to Louisiana, and I was last there in 1994. After we filed our brief in September, the WSGC requested more time to weigh Betcha.com against the gambling laws it’s supposed to know. While we wait, at least two U.S.-based startups have launched competing Web sites.

Sending us to Louisiana over 70 cents would be piling on. We may not leave Louisiana anytime soon — after all, we’ve allegedly fled arrest once before. It would also be unprecedented. Louisiana has reportedly issued arrest warrants for more than fifty people for allegedly violating its online gambling law, but no one’s been extradited. The closest Louisiana came to nabbing someone was in 2006, when Peter Dicks of UK-based Sportingbet PLC was arrested at New York’s JFK Airport. Then-Gov. George Pataki refused to extradite.

Sparing us would not get us out of the woods. Louisiana authorities will not drop the warrants, so we cannot travel internationally. A traffic stop may result in arrest and extradition. (We’ve already been to jail for this debacle — me twice.) Sportingbet paid $400,000 to a Louisiana parish to drop four Sportingbet warrants. Louisiana will probably start our bidding somewhere in that neighborhood. For 70 cents.

If Gregoire needs a reason to spare us other than that it’s the right thing to do, there’s the law. Louisiana’s online gambling law is almost certainly unconstitutional. Or she can look to the shortcomings in the extradition papers — there are many. Governors often cite paperwork problems to decline extraditions — it happened last week when the governor of a Southeast state refused to extradite a woman wanted by Oklahoma. The Southeast state that refused extradition? Yup — Louisiana.

Nicholas G. Jenkins is a Seattle native and founder of Betcha.com. He is a graduate of Burien’s John F. Kennedy Memorial High School, the University of Washington and the Georgetown University Law Center.


Ed. note: I penned this one after President Reagan died in 2004.

I’ve often wondered how the left co-opted the descriptor “tolerant.” To most people on my side of the fence, the average liberal is slightly more tolerant than a Gestapo member. After my latest experience in Toleranceville – that Left Coast hometown of mine, Seattle, Washington – I decided to wonder in a column.

I must confess that what happened didn’t seem particularly earth-shattering at first. Some clown ripped a “Reagan/Bush ’84” bumper sticker off my car. Putting it there was supposed to be my most visible step yet out of the ideological closet. There aren’t many “out” conservatives in these parts percentagewise, and being one is like being a cross-dressing Marine at boot camp – better to keep it on the down low. President Reagan’s death reminded me that a good American – like him – stands strong, resolute and proud. “My driver’s conservative,” the sticker would tell drivers behind me. “And he’s damn proud of it.”

It was bad enough that some yahoo took my sticker. What was worse was that it survived on my bumper for only three hours. At most. I put it there at 3:30 pm. By 6:30, after driving all of one mile and parking my car on a very public street for about an hour, it was gone. So sometime between 5:30 and 6:30, some self-appointed guardian of All Things Fit for Public Viewing decided that my expression of Right Pride wasn’t so fit. Three hours. John Kerry takes longer to flip-flop.

In hindsight, I guess I shouldn’t be surprised. The left’s “Take Back America” theme is, at bottom, a message of vigilantism — President Bush stole their country under cover of night (via an election), and they have an inalienable right to take it back. Their political leaders have slightly more room for the right than National Socialists had for Jews in the early years. Indeed, Howard Dean said he wanted to break up the Fox News Channel on ideological grounds. And now Tom Harkin wants Rush Limbaugh off Armed Forces Radio.

Their foot soldiers are more belligerent. In my hometown, Tolerant Ones vandalized Starbucks stores at the WTO rally with the same vigor that young Nazis went after synagogues on Kristallnacht — and the police let them. A guy who lives a few miles north of me has had his car keyed, his house egged, and his mail box blown up – several times. His crime is having a “Bush/Cheney” sign in his front yard. In truth, the left’s message is “Take Back America . . . by any means necessary.” If those means happen to include vandalizing other people’s private property – so be it.

As troubling as losing my “Reagan/Bush ’84” sticker was, the reaction of my supposedly tolerant friends was even more so. One friend – bright and articulate, but just this side of Karl Marx – said I should feel lucky: the Tolerant One could have keyed my car door or slashed my tires or busted my windows. To which I wondered aloud whether blacks in the South felt lucky when the KKK burned their houses instead of killing them altogether. Another card-carrying lefty pal suggested with a straight face that a fellow right-side-of-the-fencer took the sticker for his own private collection. To which I replied: it’s possible that John Kerry didn’t care about the net worth of his two multi-gajillionaire wives when he married them – but I doubt it. A fellow rightie straightened me out. She told me that conservative bumper stickers don’t actually go on bumpers anymore, at least in the Northwest. If you want one to last, you have to tape it to the inside of the car’s back window.

Don’t get me wrong. My world isn’t over – yet. If losing my six-dollar “Reagan/Bush ’84” bumper sticker is the worst thing that happens to me, I’ll be okay. It’s still outrageous. In America, we’re supposed to be free to express ourselves – especially our political selves. Other people are supposed to tolerate that expression. That is, more than anything else, the very essence of being “American.” If the best conservatives can hope for is to get by without their tires being slashed, then the war in Iraq isn’t the only war we should be worried about. Vandalizing private property wasn’t right when good ol’ boys were burning crosses in front yards, and it isn’t right now.

My fear is that this was a foreshadowing of something worse, like the banning of “insensitive” political signage altogether. Can’t happen in America, you say? We’re well on the way. In America today, many businesses, schools, apartment complexes and universities ban the American flag, lest some non-Americans feel offended. The Koran is required reading to get into the University of North Carolina (a public university), lest incoming freshmen be insensitive to the plight of fellow Muslim students. In many schools nationwide, fifth- and sixth-graders are required to pray to Allah to make them “sensitive” to Muslim students post-9/11. Here in Seattle, municipal employees got the word from above not to wish co-workers “Merry Christmas,” lest they offend each other. All this in a country whose Constitution supposedly protects political and religious freedom. Is the specter of political signage being banned in certain “tolerant” towns – maybe Seattle – really so outlandish? Say, on “sensitivity” or “public safety” grounds? I don’t think so.

Funny thing is, if I were a homosexual and “Reagan/Bush ’84” were one of those rainbow stickers, the left would scream about gay intolerance and I’d be labeled a hate-crime victim. If I were black and someone ripped off my Black Power bumper sticker, they’d say my civil rights were violated. But you won’t see that kind of reaction when the victim of the self-appointed Bumper Sticker Police is — like me — a heterosexual, Christian, conservative, Reagan-loving white male. Out on the left, tolerance is a one-way street.


Ed. note: This column was published by Orbus Max on October 17, 2003.

Earlier this week the United States Supreme Court agreed to review the Ninth Circuit’s decision that struck down the recital of the Pledge of Allegiance in California’s public schools. At issue were the words “under God” in the Pledge, the inclusion of which the three-judge panel felt amounted to California establishing religion. Here’s hoping the Supremes will take this case not only as another opportunity to reverse the Ninth Circuit – a frequent happening, that – but also as a chance to inject some common sense and accurate history into the church/state discussion. Because from where I’m sitting, there’s little of either.

The case has been the subject of an intense national debate. A district court in Sacramento initially dismissed the lawsuit brought by an atheist, Michael A. Newdow, who did not want his daughter exposed daily in her elementary school classroom to “a ritual proclaiming that there is a God.” A three-member panel of the Ninth Circuit overturned that decision, first ruling in June that the words “under God” made the pledge itself unconstitutional. Earlier this year, the court tempered its ruling by confining it to the public school context, invalidating school policies that require teachers to lead willing students in the Pledge.

The Ninth Circuit’s aversion to “under God” didn’t come out of nowhere. Left-leaning jurists have been building a wall between church and state since the days of the Warren Court. Inspired by the spirits of Thomas Jefferson and James Madison, the Supreme Court began its modern-day crusade against religion in 1962 when it struck down a state-composed non-denominational classroom prayer (Engel v. Vitale). It later extended that ruling to ban daily classroom readings of the Bible (Abington School Dist. v. Schempp (1963)) and to strike down an Alabama statute that authorized a one-minute silent period at the start of each day to be used for meditation or prayer (Wallace v. Jaffree (1985)). These practices were all voluntary, and were all invalidated even though the First Amendment says the government may not prohibit “the free exercise” of religion. The basis: the Jeffersonian proposition that, although it didn’t say so, the First Amendment was intended to erect a wall between church and state.

What a bill of goods. As M. Stanton Evans detailed in his excellent 1994 book “The Theme Is Freedom: Religion, Politics, and the American Tradition,” before, during, and after they ratified the Constitution, the state legislatures and Congress acted like a wall between church and state was the last thing on their collective mind. Official state churches were all the rage in colonial times, with no fewer than nine of the colonies having state religions as late as 1775. They gradually fell out of favor over the next several years, but in 1789, the year Congress proposed the Bill of Rights, the three New England states still had official religions, and most of the other states still retained some official sanctions for religious belief. South Carolina’s constitution, for example, deemed “the Christian Protestant religion” to be “the established religion of the state” and said that no religious society could be deemed a church unless it agreed, inter alia, that “the Christian religion is the true religion.” The Maryland Constitution decreed “a general and equal tax for the support of the Christian religion.” New Jersey expressed the idea by saying “no Protestant inhabitant of this colony shall be denied the enjoyment of any civil right.” In 1780, Massachusetts authorized a special levy to support “public Protestant teachers of piety, religion and morality.” New Hampshire later adopted that formula – verbatim.

State involvement with religion didn’t vanish with ratification. Until 1826, you had to be a Christian to hold public office in Maryland. North Carolina required elected officials to be Protestant until 1835, when it said any type of Christian would do. Massachusetts didn’t abolish its established church until 1833. New Jersey didn’t allow Roman Catholics to hold office until 1844. New Hampshire didn’t abandon its requirement that one had to be a Protestant to serve in the legislature until 1877.

Congress didn’t ditch religion, either. After it passed the Bill of Rights, Congress retained its Chaplain (a position created in 1774) and continued to open its proceedings with a prayer. And – here’s a biggie – the very day after it passed the Bill of Rights in 1789, the House of Representatives passed a resolution calling for a day of national prayer and thanksgiving. The resolution thanked the “Almighty God” for allowing the United States the opportunity to establish a constitutional government. An also-grateful-to-God President George Washington issued a proclamation designating that day, now known as Thanksgiving. See the problem? If the First Amendment really meant to establish a separation of church and state, then Congress, the states and George Washington violated it – and kept on violating it – right after its inception. Logically, that’s nonsense.

Truth is, the men who passed the Bill of Rights intended for the Establishment Clause to speak not to what the states could say about religion vis-à-vis the people, but to what Congress could do vis-à-vis the states. We know that because, after much deliberation, that’s how they wrote it: “Congress shall make no law respecting an establishment of religion.” (Italics mine.) And we know it because, in introducing a set of amendments to the Constitution in June 1789, including one speaking to the establishment of religion, Madison said as much. To wit: when Roger Sherman suggested to Madison that the Establishment Clause was unnecessary because of the Tenth Amendment, which reserved all powers not expressly granted to the federal government to the states, Madison explained that he apprehended the meaning of the words to be, that Congress shall not establish a religion and enforce the legal observation of it by law, nor compel men to worship God in any manner contrary to their conscience. Whether the words are necessary or not, he did not mean to say, but they had been required by some of the state conventions, who seemed to entertain an opinion that (under the “necessary and proper” clause) . . . Congress . . . might infringe the rights of conscience and establish a national religion; to prevent these effects he presumed the amendment was intended, and he thought it as well expressed as the nature of language would admit. (Italics in Evans’s version.)

Translation: Madison introduced the language because others in the state conventions wanted it; he did so on their behalf; and its intent was merely to prohibit Congress from interfering with the states’ prerogatives on the subject of religion by setting up a national one. Madison should have listened to Sherman because, suffice it to say, never in the history of man has such a clarifier wrought such havoc.

But straightforward as it was, the Establishment Clause hasn’t aged well. Since the Fourteenth Amendment was ratified after the Civil War to guarantee equal protection and due process to blacks, the Supreme Court has extended it to make almost all of the Bill of Rights’ guarantees applicable to the states. Of course nowhere in the Fourteenth or any of the other Civil War amendments does it say “the Bill of Rights shall now be applicable to the states,” but in most matters, particularly in areas of individual rights, binding state governments to the same set of rules as the federal government made good sense. But in 1947, the Court made the Establishment Clause binding on the states. The irony: the provision originally offered on behalf of the states to protect their prerogatives on matters of religion now restricted those very prerogatives. And with that and other nonsensical rulings, we find ourselves in the mess we’re in today — a constitutional provision intended to protect states from Congress imposing on them a national religion is now read to prohibit an elementary school teacher from allowing her students to praise the Man Upstairs. Jerry Garcia forgive me, but what a long, strange trip it’s been.

I don’t expect to hear much about this history in the national debate certain to get cacophonous before the “under God” decision. The revisionist drumbeat will get louder, and with separatists in the papers, on the Internet, and on the tube 24×7, they can do a lot of revising. But repeating “the world is flat” doesn’t make it so, and repeating “the First Amendment was adopted to establish a wall between church and state” doesn’t make it so, either.

From historical cherrypickers I expect to hear the names Jefferson and Madison, both of whom are said to have been strict separatists later in life. Maybe, but Jefferson was not a member of the Constitutional Convention, or of the Congress that considered the Bill of Rights, or even his home state of Virginia’s ratifying convention. From 1784 to 1789, the year of ratification, Jefferson was in France: what he wanted for the Establishment Clause should be about as relevant as what Ronald Reagan wanted for the 1964 Civil Rights Act. Madison was at least there for ratification, but even assuming he wanted strict separation at the time – no sure thing, given what he said to Sherman – it’s clear that’s not what the states or Congress had in mind.

I’m not holding my breath, but a few separatists might admit that a strict separation of church and state wasn’t what was originally intended. But, they’ll say, that was then and this is now, and no one wants to return to the days when only Protestants need apply. I agree. But the answer isn’t to create fictitious dogma. It’s to defeat such (to date) hypothetical measures at the ballot box. There’s a word for that: democracy.

Look for others to wax poetic about how the wall separating church and state is part of “American tradition.” It has been lately, but only because historians, judges, and other fiction writers ignored how inextricably intertwined religion and government were in the early years of our nation. Marriage between a man and a woman is also part of our tradition. Just guessing, but I doubt they’d consider that tradition a constitutional life preserver.

Bet the collection plate on at least once-daily invocations of the Taliban as an example of what happens when church/state separation isn’t absolute. That’s poppycock for the soundbite set. States have long had police forces that haven’t become Gestapos and, outside of Manhattan, they’ve long taxed without becoming socialist. Surely America can tolerate religion in public life without plunging into the abyss of totalitarian theocracy. After all, governments were heavily involved in religion before, during and after the Bill of Rights, and none ever went Taliban.

Next year, the Supreme Court can do something about this mess. No doubt, injecting real history into the Establishment Clause fray would be a marked departure from recent precedent. But even liberals agree the Court should take the lead in redressing past wrongs. And if perpetrating on the American people the fraud that our nation was built on a strict church/state separation isn’t such a wrong, I don’t know what is. The Court will start its day with a clerk declaring “God save this honorable Court.” Let’s hope the Court returns the favor.

