Saturday, May 30, 2015

Needed: A billionaire!


This piece was originally titled "America needs an opposition party," but I changed the title after a correspondence with Bruce Bartlett, a former advisor to Reagan and the senior Bush and a former member of the Heritage Foundation.  Mr. Bartlett was interviewed by Patt Morrison in today's L.A. Times (http://www.latimes.com/opinion/op-ed/la-oe-morrison-bartlett-20150603-column.html#page=1) about his view that the GOP leadership and its extended family of consultants and handlers exist in an ideological bubble that keeps them from understanding the game around them as well as the Democrats do.  This fits with my premise in this post that the U.S. has no opposition party.  I wrote Mr. Bartlett and he kindly wrote back.  He told me (I quote him with permission) that there is "zero support" within the GOP for reassessing its alliances.  I asked if he thought it was time for a third party, and he responded that any movement for a third party would require a billionaire.  Of course Mr. Bartlett is correct about the billionaire, so I'm sending this out with a new title, "Needed: A billionaire!", along with Mr. Bartlett's interview link, to any billionaires, or friends or associates of a billionaire, who might have a philosophical/historical bent and who might find fulfillment in supplying meaning to American political life.

An opposition party is one that opposes the party in power.  We in the U.S. do not have such a party.

The GOP is certainly not a credible opposition party, as its inability to generate a front-runner for the 2016 presidential race indicates.  The Los Angeles Times reports (“GOP donors await a favorite,” 5/17/15, http://www.latimes.com/nation/politics/la-na-gop-fundraising-20150517-story.html#page=1) that although there are as many wealthy GOP donors as ever, no GOP candidate for president is polling over 20% (and many are polling far less), so, unlike recent presidential elections in which there was a GOP front-runner (e.g. Mitt Romney in 2012), the donors don’t know whom to back.

The Democrats, of course, already have a front-runner in Hillary Clinton, who is polling about 80%, far above any other candidate of either party, though her strategy of saying nothing on substantive issues, perhaps hoping to glide in mainly on the basis of being the first woman president, may upset some of her base later on. 

Republican donors are waiting, but what are they waiting for?  A front-runner?  That could be a long wait.  What they are really waiting for is the Party to redefine itself in the wake of Romney’s defeat by President Obama.

What does it mean for a political party to redefine itself, and why would it need to do that?  The Democrats know the answer, which is why they hold the presidency now.  After the 1984 presidential election, when Ronald Reagan defeated Walter Mondale in a landslide, there was much talk of the demise of the Democratic Party, much like the talk today of the demise of the GOP.  But in contrast to the GOP’s current lack of reaction to its losses, the Democrats created the Democratic Leadership Council (DLC), which turned the party away from “New Left” and FDR New Deal traditions, towards positions on the right in areas like free trade, union power and welfare reform.  Public spats were purposely engaged in- for instance with Jesse Jackson, who called the DLC “Democrats for the Leisure Class”- to impress the change on potential Democratic voters.  The result was the ascendancy of Bill Clinton and the Democratic Party, an ascendancy that continues through Obama.

But the Republican leadership does not appear to have undertaken a similar assessment of the Romney defeat.  If it had, it would have found that far-right social positions, such as those associated in most voters' minds with the Tea Party, were the single most important cause of that defeat, just as those positions have negated the electability of GOP candidates for statewide office in California.

These social positions were made prominent by Senator Rick Santorum, Romney's 2012 rival for the GOP nomination: that abortion doctors should be tried for murder; that contraception and homosexuality are sins; that the Founding Fathers never intended separation of church and state.  It happens that I oppose these positions, but in this essay I just point out the numbers.  Far-right social positions are supported by about 30% of the electorate; the real problem is that the other 70% are so repelled by them that they will vote for a Democrat they hate before they’ll vote for a Republican who is associated with the Tea Party.  The problem was enough to doom Romney, who was ill-advised to hold the Tea Party vote rather than sacrifice some of it for a wider slice of the electorate.  Republican leadership needs to look at far-right social positions and ask whether official party endorsement of them is worth the cost.

Absent such a Party reassessment, there seems small likelihood that a definitive GOP front-runner will appear in 2016, or that such a front-runner could beat Clinton.  Jeb Bush, the closest to a front-runner (though he too polls no more than 20%), is trying to distance himself from the Tea Party and run on his own positions, which include shifts to the left on immigration and education, and in doing so he faces complications with his base.  As noted, though, it's not the base that's the problem; it's the leadership.  Bush may take his own stands, but people will identify him with the national party, whatever he says.  The Republican leadership needs the guts and foresight to do what the Democrats did with the DLC, and take a stand.

Why do I care?  I am registered "Decline to State" - the fastest growing "party" around - meaning I'm looking for a party to represent me but have not found one yet.  The Democrats are deeply corrupt in two areas I care about: education (pushing Obama's ill-thought-out, pork-driven Common Core Standards, or CCS) and foreign policy (Obama has single-handedly destroyed the anti-war movement, which has proven no match for him and his supportive sophisticates in Washington).  But how do the Republicans, the only hope for an opposition party, stand on these issues?

Romney's confused statements against Common Core - a combination of Tea Party hyperbole about Big Brother coming for your children and misinformation about the nature of the funding (Romney complained that the Feds were bribing the states to accept CCS, when the states are paying nearly all the bill) - squandered millions of potential votes.

The Republicans have been no help with foreign policy.  Rather than pointing out that the world left us by Obama is much more dangerous than that left us by G.W. Bush, they shoot off demagogic rhetoric along party lines, sounding like uninformed bullies on the playground.

Thus we have no opposition party.  The GOP, in my view, now plays the role of a prizefighter throwing a fight.  In return for giving up the presidency and (especially in California) statewide office, it gets safe rural seats and a spoiler role in Congress.  Ideologically, it will stand for nothing in 2016, not even Tea Party ideas, which will continue to go down to defeat.  Much of the GOP leadership is fine with this role- but the obvious lack of utility of our old "two party" system will start to gum up the presidential election in new ways, demonstrating to the Millennial generation that the machinery of American democracy is archaic, neglected and corrupt.  The Millennials are the big prize, currently being lost in a big way.

What's stopping the Republican Party from being more than it has become, from fixing itself the way the DLC fixed the Democrats?  One factor is that the old system of party bosses is gone, replaced by consultants whose purview is limited to particular candidates, not the health of the party or the credibility of the democratic process.  What's left of party leadership does not relish the thought of messing with its base, but what is that base?  My impression from talking to self-identified Tea Party members is that there is no consensus in the Tea Party on social issues.  I attended a Tea Party meeting in Hollywood where the featured speaker was a gay man who talked in support of gay marriage, and no one batted an eyelash.  The president of the group told me that the Tea Party does not take positions on social issues. Who knew?  Someone should tell the media, and then tell the Party.

Republican leadership faces a choice.  The Party can retreat to rural seats and give up state offices and the presidency, or it can take a page from the Democratic playbook and figure out how to survive as a national party.  Clearly a shake-out involving far right social positions would entail a struggle, but that is how parties and candidates are defined, and definition brings success.  Most importantly, America would have an opposition party.


Sunday, April 19, 2015

Tales from the front

Now that I’m officially old (69 last January), I notice that my definition as a “senior” is relentlessly reiterated and emphasized by our culture.  The message is: “You are old; get in your place.”   In response, to break the definition, I need to do things outside my “age group” so I’ll have something vital to write about (for instance, I don’t think two weeks of pain in my right hip will hold the reader’s attention, interesting though it is to me).

With this in mind I told my colleague at the high school from which I retired six years ago (and where I now coach debate in the morning) that I would cover his classes for two weeks, even while knowing the substantial quantitative and qualitative differences between part-time and full-time teaching.  It seemed unlikely I could work the two weeks without finding something to write about.

That expectation was confirmed on the first day, in fifth period, when I distinctly heard, in a male voice from across the room, “Fuck Jews.”

I walked over to the area where the voice came from and stood, taking in the peer solidarity, all eyes attentively looking away.  I lingered for a few moments, then returned to my seat.

The next day, in fourth period, came another “Fuck Jews,” from the same side of the room.  I walked over and saw a boy from fifth period, Atilla, black with rasta hair, who had sneaked in.  The boy thus became a suspect, but I thought it wise to just look around, ask Atilla to leave fourth period, and say nothing.

On the third day, in fifth period, Atilla came at me with what I felt was false friendliness (“Hey, can I call you Mr. L?” with a big smile).  I responded, “You know, what sticks in my mind is that I heard someone say something terrible from this part of the room in fifth period two days ago, and the same thing in fourth period when you sneaked in yesterday, and I’m wondering if you said it.”  A look of shocked innocence appeared on Atilla’s face, and I added, “In thirty years of teaching, I’ve never heard anyone say this terrible thing before,” which was true.  A white boy sitting nearby, who I later learned was Jewish, then said, with a smile, “There’s a first time for everything,” to which I replied, “And a last.”

That was the end of it, anti-climactic perhaps, but I felt I had made as much of a point as I was going to, or needed to, and indeed for the next two weeks the classroom was mercifully free of “Fuck Jews.” 

What is going through the mind of a teenage boy who says, “Fuck Jews,” anyway?  Is this likely a boy who knows any history?  Who has had an unpleasant experience with a Jew or Jews?  It seemed more likely he had just discovered how much commotion could be caused with this simple utterance and decided to give it a try, though one does have to wonder if the current climate played a role.  This is a pivotal time for Jews as a group.  The view that the U.S. is overly obligated to Israel, simmering for years, has suddenly taken political shape with the odd dance between Israeli Prime Minister Netanyahu and President Obama.  I call it odd because of the alliance it suggests between Israelis- and thus, in a sense, Jews in general- and the far-right Christian movement in the U.S.  The idea seems to be that there’s a common bond between Jews and Christians expressed in the apocalyptic visions of the Book of Revelation, which, we’re told, portends a time when Israel faces attack from “Gog and Magog” (generally interpreted as Russia), when a red heifer will be born indicating that, try as it might, the whole world will not be able to destroy Israel, and when Jesus will come down and carry all the saved Christian souls to heaven, leaving the Jews on earth to pay any outstanding taxes.
I suppose I should be happy with the news that the Jews don’t die, but World War III, if that’s what this is supposed to be, will not be kind to anyone, so I’m holding off celebrating.

One might add that this is not the first time Jews have been told it’s their duty to stand in the middle of humanity’s strife and take a hit.  The Nazi Party in the early ’30s proclaimed itself Zionist.  The “solution” to the “Jewish problem,” at that stage, was that all the Jews of Europe would leave for the Holy Land.  The Party even offered agricultural training for future kibbutzniks.  The Netanyahu/Obama formulation seems a modern version of this.

Do individual Jews get a choice?  Can I, for instance, just by stating my preference, remove myself from this historical concept, which suggests that the best hope for Jews in America is to take cover as God’s target practice for the world?  Forgive me if I look for an alternative evolution.



But I digress.

There were other interesting revelations during my two-week close encounter with teenagers.  In their dual status as children and adults, teenagers display a strange combination of intelligence and ignorance (suggested in the etymology of “sophomore”).  A video was planned for the first week: the original 1975 “Stepford Wives,” about husbands who turn their wives into fawning robots.  Part of the assessment required students to speculate on how this movie would be different if it were made today.  A number of students wrote that an important difference would be that today famous movie stars would be used to draw crowds, not unknown actors as in “Stepford Wives.”  I thought this misapprehension was important enough to merit a special lecture, so I informed the students that Katharine Ross and Paula Prentiss, stars of “Stepford Wives," were famous in 1975.  Blank stares from the students told me reinforcement was needed, so I told them that the day would come when they would mention Taylor Swift to a younger person and this person would have no idea who they were talking about.  Further blank stares indicated not, I thought, that they did not understand me, but that they did not believe me that all generations seek and lose fame- that their minds were uncomfortable with the idea that plus ça change, plus c’est la même chose.  I witnessed the same phenomenon in the ’60s, when many felt that no generation before ours had needed a sexual revolution.  It’s hard for all of us, isn't it, to accept that humans haven't much changed in the last 50,000 years?

Many students corroborated Steven Spielberg's comment that if he had to make "Jaws" again today, he would need to put a death-by-shark earlier after the opening credits, because today's audience will not wait longer than a few minutes for violence.  The first violence in "Stepford Wives" comes after about forty minutes of character and plot development, and several students commented that it was boring to wait so long for violence, that there needed to be much more violence and killing in the movie to make it interesting.

In spite of their predilection for violence, many students were taken aback by the dark ending of "Stepford Wives," in which evil wins, and several wrote that today's audiences prefer a happy ending.  In fact the 2004 re-make of "Stepford Wives," justly panned by critics as a ruined husk of the original, changed the ending to a happy one, where the robotic wives manage to turn off the chips implanted in their heads by their husbands (in the original, each robot strangles its original human model, so there's no going back).

There was a gender divide over the movie.  Many of the girls liked it; only a few of the boys did.

Another salient feature of the two-week job was how much work was involved.  I haven't worked this hard since I retired in 2009.  I implemented curriculum, gave tests and graded them, and handled discipline (by far the most arduous of the tasks).  I did that for 25 years, but it carries a special poignancy in retirement.  Teaching public school sure is hard, and getting harder.  Why?  Because there is little attempt to update public schools, to infuse them with the modern world.  I hear the objections to this statement already: We wire our schools to the internet, buy computers for everyone, etc.  How can I say the schools are not updated?

I can say it because the updating is more a gift to technology vendors than a transformation of learning.  Though word processing is a tremendous leap forward from the typewriter, facility in typing and editing has little bearing on students' writing skills, which, as most parents and teachers will tell you, have not improved significantly since the internet arrived.  Nor have reading skills improved, or understanding of history, math, or much of anything beyond understanding of the technology itself.  We have mistaken technology for cultural consensus and awareness.

Of course, this situation will not last.  Sometime in the future, perhaps after the red heifer is born and Gog and Magog get the green light, we’ll have the technology sorted out.  Kids will enter the classroom, turn on the computer, put on a headset, and interact with the software, while the “teacher,” now a computer technician, oversees. 

That will be the easy part.  There is also the problem of unemployment, caused in large part by the same machines we celebrate.  There are very few jobs awaiting the students in my colleague's classes.  Why would that change?  No doubt the answer will come, as it has in the past, from war.  Whether it’s battling Gog and Magog or ISIS, we’ll find work for idle hands, as the forces we're fighting have done.

Finally, I was struck by my students' indifference to politics.  Hillary Clinton announced her candidacy for president in the second week, but I heard no mention of it from any student.  Quite a contrast to 2008, when every student was mesmerized by the election of Obama.  That moment of credibility is gone; teenagers are among the demographics most cynical and skeptical about politics.  And why not?  Who from the exalted heights comes “down” to the high school level to talk to them, to explain the world and their role in it?  One thinks of the former Iraqi official who, from his classroom, told a "Vice" reporter that, for criticizing the regime, he had been "exiled to teach high school."  Siberia with bells!

America should take a lesson from its teachers:  If you want our culture and country to survive, make the young a priority, in more ways than buying them breakfast.  Tell it like it is about public education, if you can.  






Thursday, April 02, 2015

Why I quit politics

To give background for my discussion of school board politics with Dr. Cheryl Lubin on her radio show, In Our Times, this Friday, 4/3, at 3:00 p.m., I am re-posting this account of my 1993 campaign for L.A. school board.

To hear the show live or download, go to http://www.latalkradio.com/Cheryl.php.

"Why I quit politics" is reposted from Andrei Codrescu's journal, Exquisite Corpse, at http://www.corpse.org/archives/issue_12/clash/lasken.html



Of course you have to do something before you can quit it. I was a novice politician for almost a year in 1993, the year I ran for a seat on the Los Angeles School Board. I walked door to door, badgered people on the street, debated my opponent at public forums and on T.V. I talked to the newspapers, gave them statements, bios, photos. My opponent was the incumbent, well connected in Democratic circles through his political family, fast with facts and figures, thinner and younger than I.

From the start I had dumb luck. Most importantly, the teachers union, United Teachers of Los Angeles, declined to make an endorsement in our race, although they had supported the incumbent in his first campaign. I would have been dead in the water against them.

I also had luck in packaging. I was a classroom teacher, and this turned out to be a greatly saleable ballot label against my opponent's "Board member" (Political operatives have learned about this, and will scrounge deeply to find any past connection between the classroom and their candidates).

I stumbled into a lucky situation with a political sign company. The first company I approached, a major one in L.A., had been stiffed by a series of candidates and was reluctant to commit to me. My father had loaned me two thousand dollars for my campaign, and I blurted out that I would pay this up front in the form of a cashier's check. Within two days hundreds of signs saying "Keep Askin' for Lasken" were all over the turf in contention (so-called Region 5, the western edge of the city running north from Westchester to Chatsworth). Compounding this beginner's luck was what I found to be a striking naivety in seemingly sophisticated people. For instance, a school administrator, a follower of news and an activist in neighborhood politics, said in reference to the signs that she had no idea I had so much "support."

My timing with the issues was lucky. The opinion in the San Fernando Valley was almost entirely for breaking up the giant L.A. school district (second largest in the country after New York's), and the west San Fernando Valley, the part in Region 5, was the most intensely pro-breakup. The incumbent was not in a position to support breakup, and I had supported it for years.

The issue of bilingual education worked in my favor. Though I supported California's efforts to help non-English speaking children with native language support, I was opposed to the withholding of English language instruction until higher grades. This played well with voters, anticipating the landslide passage five years later of Proposition 227, which mandated English language instruction in addition to native language support. Newspaper editors, most particularly Jack Miles at the L.A. Times, liked the topic, and I was able to publish a series of articles on bilingual education; several appeared during the campaign.

One week before the election I got a call from a pro-choice organization. They had been planning to send thousands of mailers in support of the incumbent because he had paid them a sizable fee and, of course, was pro-choice. I had only evinced the latter virtue. It happened that someone in the incumbent's campaign had angered them, and they had decided to support me in the mailer for free.

Topping off my luck, I won a raffle that placed my name first among the seven candidates. The effect of " 1. Doug Lasken-Teacher" was hard to beat as product placement.

The result of my luck: I received 36,000 votes, coming in second behind the incumbent's 50,000 (turnout was large in this election because of the Riordan-Wu mayoral race). Had I taken 1% more of his vote, we would have been in a run-off. The day after the election the L.A. Times referred to "...newcomer Doug Lasken's surprising showing."

I remember standing at a newsstand off Hollywood Boulevard at 6:00 a.m., reading, with trembling hands, the Times' hopeful obituary of me. Something sank inside me. The Doors' "This is the End" comes to mind. I knew I would not "capitalize" on my dumb luck, but I did not know why. I did not know why I had, at that moment, quit politics.

Well, perhaps what I didn't know was how to say it. I'm going to try to say it now: Politicians can't say "I don't know."

Politicians, in fact, can't say much at all of what they think. Well, "Duh," you say. Yes, but when you're in a political situation where you're setting yourself up as the person who knows what's best, who has an answer to complex problems, there's a certain poignancy that comes with the knowledge that you're constructing a facade, a veil of words that sounds right, while the much-vaunted human cortex watches as from the end of a long tunnel.

The above mental state was produced by certain types of questions, such as, "How would you increase test scores?" There is familiar boilerplate to deal with such questions: "Every student must receive quality instruction...We must have accountability and standards... Education must be our number one priority...", etc. Not that there is anything incorrect in such sentiments, but if they contained any important policy ideas we would be experiencing a much larger number of high scoring children. I did my best to sling a few slogans, and I used the English language instruction and breakup issues with some effect, but my brain was uncomfortable, my speech somewhat hesitant, and this perhaps cost me the 1% and the runoff.

Delving deeper into my uncooperative mind, I found something truly scary. It's not just that I wasn't in a position to say what I really thought about raising test scores. My hands hover now above the keyboard, waiting for a sign. No sign comes. Some muse has got me this far, but at the crucial moment she stands silent.

What the hell, here goes. Well you see, the thing is... I didn't really know how to raise test scores. I did believe that breaking up the district might improve efficiency, and that teaching English would improve English skills, but I wasn't completely sure test scores would go up significantly as a result. After all, when we talk about raising test scores we're not just talking about a few numbers going up; we're talking about real improvement in children's intellectual abilities. How do you get fifth graders in large numbers to know their times-tables, and remember them into secondary school? How do you get secondary students in large numbers to read books, really read them, from beginning to end? Why would a few corrective policy changes produce such profound educational outcomes?

Hindsight has justified the hesitation I felt during my campaign. Proposition 227 reinstated English instruction. A well funded "Standards" movement took hold in California and in much of the rest of the country, accompanied by millions of dollars in new textbooks and teacher training. There has been math reform, with renewed emphasis on basics. These reforms have helped a lot of kids, but they have not "raised test scores" in the real sense. In other words, although there have been small jumps in scores, there is no systemic, widespread change in our students. If you walk into a California classroom at random you are unlikely to find kids who can read well, or want to read, or who do math with the facility you find in Asia. Nor will you find this two years from now, or four years from now. It's not happening and it's not going to happen.

Why not? Because the discussion is political, and therefore incomplete. Standards are important, and logical instruction is important. But those are the easy parts.

Back to the reporter asking me how I would raise test scores. Let's say a cosmic force had ordered me to tell the truth. What would I have said? I might have stammered, "Well... I'm not sure." The reporter's brain would then have closed my file, stamping "loser" on it. If he was polite, though, there would be a pause, and then I would begin to think. This in itself, the sight of a politician lost in thought while the world waits, is anathema to a successful image. But if the cosmic force could get everyone to wait a bit, I could have given a decent answer. The discussion might have gone something like this:

Me: Well, we have a fundamental disconnect between our media-based culture and the school setting. Virtually every kid is taught by the media to gaze at colored images which ridicule schools and teachers. We have nothing effective to counter this. We have not figured out a modern motivation for students. The U.S. is one of the few countries in the world that has ruled out physical pain as an educational tool (Singapore, much admired by math reformers, achieves the highest secondary math scores in the world partly by beating underachievers with bamboo canes). We do rely on the psychological pain implicit in the report card grade, but because of grade inflation, rampant from kindergarten through graduate school, and the glorification in the media of school failure, grades alone have become a weak motivator for all but a few students.

Reporter: So you advocate beating our students?

Me: Of course not.

Reporter: Then what do you advocate?

Me: We've forgotten economic incentive.

Reporter: For teenagers?

Me: Yes. Our surplus-based society has extended childhood, resulting in dependence on parents at later ages, but teenagers are in their physical and intellectual prime, and will remain so into their twenties. They are designed to create and work, but the automation that gave us our surplus has resulted in a more seriously underemployed society than we like to admit. There are over 100,000 gang members in L.A., but there are not 100,000 jobs for them, not even menial ones. The standard curriculum in high school does not relate directly to visible jobs. Perhaps shop and computer classes do, but the thousands of jobs it would take to rationalize that curriculum do not exist. Honors students, the handful of clever kids who know how they will work the system, put up with non-job-related curricula because they see a path to employment based on grades and general literacy, but they too have to wait. It is arguable that one of the purposes of secondary school is to serve as a holding facility to keep teenagers out of the job market. The first several years of college may serve the same purpose.

Reporter: So... you would propose...?

Me: Well, somehow we need to have an economy that can absorb many more teenagers and people in their early twenties, and a school system that clearly feeds into this economy. But our technology, automation, may have made this impossible.

Reporter: How do you propose to remedy this?

Me ( after very long pause): I don't know.

End of dialogue, and career. Even an answer like "We will have to replace our world economy, built up in haphazard form over two hundred years of industrial revolution, with a completely new, rationally organized economy," impractical as it might be as a campaign position, would be better than "I don't know." Anything is better than "I don't know."

It might seem strange to an extraterrestrial visitor from an advanced civilization that we have no place in our public discourse for "I don't know", since we so often, clearly, don't know, but it's basic human psychology at work. Management theorists have shown that leaders get approval for making decisions, for being decisive, regardless of the results (advice routinely followed by politicians). This is understandable given the human condition. We really don't know what we are supposed to do on this earth, or even if we are supposed to do something. If our leaders admitted this in public, society at large might collapse in terror. Still though, it can be something of a hindrance to problem solving to maintain at all times that soothing platitudes are solutions.

So after a refreshing brush with the fast lane, I returned, sober but wiser, to the classroom, where I find I can say "I don't know" a lot - to students, to parents, to my colleagues - and they don't seem to mind. Hey, wait a minute, these people vote, or will vote... Hmmm.


Thursday, February 05, 2015

ISIS: A virtual reality

"Virtual" is a difficult term to define, especially in the modern phrase, "virtual reality." Of course, by the time kids are in middle school they know what virtual reality is, but ask them to define it. Then ask yourself.  In this essay I've assembled what knowledge I could about "virtual reality" and the role the concept plays in modern society.  At the end I relate what I find to our conception of the terrorist group ISIS.

A good dictionary covers the basics: "Virtual" is related to the noun, "virtue," which we know to mean, "a morally good quality," like integrity or honesty, from Latin virtus, "merit," "perfection," from vir, "man."  The transition from vir to the rest is a challenging etymological puzzle (while you're at it, consider "woman of virtue," a woman who has not had sex with a man), but my focus here is the equally mystifying modern usage of "virtual."


Back to the dictionary- there are three broad definitions of "virtual":


Number 1: "Almost or nearly as described, but not completely or according to strict definition: the troops stopped at the virtual border."   Virtual borders are not official borders on a map, but de facto borders, determined by use.


[Note: Only definition Number 1 clearly references the historic usage of virtue, in the sense of "possessing certain virtues."  In the example above, virtual borders have the "virtue" of being observed by practice, though not the virtue of being indicated on maps.]


Number 2: "In computing, not physically existing as such but made by software to appear to do so: a virtual computer."  In other words, imaginary.


Number 3: "Physics, denoting particles or interactions with extremely short lifetimes and indefinitely great energies, postulated as intermediates in some processes."  In other words, particles, or things, that exist for such a brief period of time that their reality as things is questionable.


I would have guessed that "virtual reality" derived from Number 3, since Number 3 is the most confusing.  Does the length of time that something exists have bearing on whether it exists?  How long do individual humans exist?  In galactic time, it's not very long.   So is ours a lesser existence?  That subject will have to wait for another essay, however, since "virtual reality" derives from definition Number 2, which means, as noted, imaginary.


Under virtual reality we get: "The computer-generated simulation of a three-dimensional image or environment that can be interacted with in a seemingly real or physical way by a person using special electronic equipment, such as a helmet with a screen inside or gloves fitted with sensors."

The question I ask at this point is, why do we need to call computer-generated simulation any sort of reality, as if it were a type of reality?  We never had this need with novels, plays or movies.  Those are not types of realities.  They are imaginary.


In modern, media based culture, we do seem to have a need to think, or feel, that we create our reality.  It is a sort of honesty, in one sense.  After all, when we turn on the news and see what is happening in far away parts of the world, the news show is constructing reality, so that what we receive is not reality only, but a reality put together by the show.  We have "reality shows," in which people behave in stage-managed ways, real only in the sense that we've made the behavior real on the show.


I'm calling this usage "honest" because, unlike past ages when, for instance, young men recruited for the Crusades were told that various things were happening in the Holy Land that required invasion, those various things were held to be real, not virtually real.  So our culture, by holding that certain things can be virtually real, as opposed to just real, admits a pervasive doubt into our discourse, and doubt is a virtue.


The extended context is not so hopeful, however, because it suggests that we don't require actual reality from our media, that it is enough to produce simulated, virtual reality, as video games do.


It is in this context that I consider ISIS, which, with its professionally produced, ready-for-prime-time video of a man burning to death, has realized the predictions of numerous science fiction novels, from the media-mediated wars of George Orwell's 1984 to the blurred lines between war and mass entertainment in Suzanne Collins' Hunger Games.  On the day of the ISIS video, our national news anchors breathlessly described the "high production values" of the video, Scott Pelley of CBS marveling that no Hollywood studio could have done "better," as if he were delivering a movie review.  In a sense he was.


The next day my friend showed me a video on his computer that his girlfriend had sent him.  It purported to be a recording of the transmission from a drone that was conducting an attack on ISIS ground troops who were attacking the peshmerga (Kurdish enemies of ISIS, thus our allies).  It was a nighttime attack, the ground troops glowing white through infrared lenses.  The chatter from drone control, which was hundreds or thousands of miles from the scene, was dispassionate though highly engaged, technical, referencing targets and coordinates, ordering rocket and 30mm fire that resulted moments later in white flashes where running forms had been.  It looked exactly like a video game.  My friend and I could have been blasting aliens or Kazakhs (a favorite game foe for a while).  Orson Scott Card's Ender's Game came to mind, in which hot-shot 9th grade gamers are told by the military that they are trying out a new training video, while they are in fact fighting real aliens (spoiler alert: Book III reveals that the invading force was actually on a peaceful mission).


What do these musings have to do with the real ISIS?  And by the way, my point is not that ISIS is not real (or evil).  It's hard to see how their actions could be faked.  That man was really burned, and the earlier victims were really beheaded.  I'm talking about the thinking behind ISIS, specifically their marketing department, and they clearly have one.  The War of ISIS is packaged for young men the way a video game would be packaged.  You know how you'll be watching a TV show that young people also watch, and suddenly there's a commercial, long minutes of CGI heroes blasting a variety of monsters, with forceful titles, like End of Doom Part III!  Now it's The Rise of ISIS Part I!


Our commentators are wondering where ISIS came from, and what it wants.  It is not a country.  It has no past as an established enemy, no clear parentage.  Its psychology seems not to reference anything in the surrounding world.


In other words, ISIS somehow does not seem real.  Virtually, though, it's real enough.  And it certainly has no problem with ratings.  There's no question that we'll have to win this war, virtually and really.





Sunday, January 11, 2015

Mystical kicks on Route 66

Our driving trip the first week of 2015, like our last to Havasu City, took me and Susan into the Southwest, this time as far as Santa Fe, New Mexico.  Our itinerary paralleled old Route 66, the romanticized predecessor of the Interstate Highway System, surviving stretches of which feature commercial clusters that promise and often deliver nostalgic glimpses of early automotive America.  This trip was different from our Havasu getaway in another respect: We did not get away from the world, but were immersed in it by the car radio and our mobile devices as word of the French massacres at Charlie Hebdo and the kosher deli invaded our space and seemed to stain even the open deserts.

Our first stop, Williams, Arizona, where we stayed at the Railway Hotel and took the delightful old narrow gauge train to the Grand Canyon, occurred before the awful news, so we could relax and enjoy the Americana.  The train was built when there were no drivable roads leading to the canyon and was taken by many notables in the 1920s, including movie stars and presidents.  It chugged along for two hours each way, past beautiful snowy terrain.

At the canyon, after some minutes staring into the seemingly endless depths, it seemed to me that, thanks to the Colorado River, one can look into private parts of the earth.  The guide explained that it is a common misperception that the river cut into a motionless plain; in fact the plain rose as part of the Colorado Plateau, the river just keeping level.  The result in any case is an assault on the earth’s integrity, allowing the viewer to look where we normally can’t look.  I wondered if the earth was in pain from this violation.  Or was it a sexual penetration?  Is the earth ravished here, in the throes of rapture from the river’s thrust?  When such thoughts come over me I take a cautionary moment to think of the scorn a geologist might have for them.  The earth alive?  Native Americans were allowed to think that, but we are not, as the dogma of our state religion, Science, has it that matter is a random chaos of unconscious reaction.  Our permitted mystical focus, which we call “God,” is not matter, but spirit, guiding us in our manipulation of matter.  Our drive to colonize the extra-terrestrial universe is a crusade to force all matter to conform to our will.  Such, anyway, were my thoughts at the Grand Canyon.

The next day we toured Meteor Crater, Az., a wonderful gigantic dent in the earth, surrounded to the horizon by the Colorado Plateau, dated to a meteor collision about 50,000 years ago.  The crater and visitor center are privately owned by the descendants of a man who staked a mining claim here and dug several hundred feet below the crater floor in the mistaken belief that a mass of extraterrestrial iron that he could sell to the railroads was buried there.  In fact the nickel-iron mass had disintegrated on impact into myriad small particles which lay everywhere around the site.  The largest intact piece of the meteor, about two feet long, is in the visitor center.  I touched it eagerly, wondering what far away mind, what “other”, I might be in contact with, again, obviously, deviating from the state religion, where iron and nickel atoms are “materials,” in effect dead things. 

On the tour of the rim I had forgotten my water bottle and the guide suggested I eat snow.  I did and it was delicious, and I wondered if any of the dispersed iron atoms were in the snow.  Could it be that I would metabolize pieces from “outer-space,” that they would become part of me?  Looking down into the crater I again had the thought that something sexual had occurred.  Was the earth fucked here?  If so, perhaps there was a resonance in the ground beneath my feet, detectable even in the snow I was eating.  And I thought, not for the first time, that it’s a good thing our state religion does not have an Inquisition (or, at any rate, a formal one).

At about this time the news of the Charlie Hebdo attack broke, followed soon after by news of the kosher deli attack, and our trip from then on was not, strictly speaking, a vacation, in the etymological sense of a vacated mind, though we found some escape our first night in Santa Fe.  We arrived late.  It was cold and dark; most stores and restaurants around the historic city center were closed.  Luckily we found that the Hotel La Fonda’s restaurant was not only open but featured a lively country band and a group of people who had been dancing there for 35 years (per our waitress, who seated us close to the band).  Several of the dancing couples were quite expert, one in particular whose precision moves looked professional, so, although we love dancing, we felt a certain hesitation to join in.  The first martini took care of that (we learned the next day that, at 7,000 feet, the effects of alcohol in Santa Fe are notably enhanced), and we danced through a number of songs (the regulars were quite tolerant of our lurching about, for which we were grateful).

The next day we took a walking tour of the city, our initial mood somewhat dour, both because of the aftereffects of our revelries the night before and the deepening horror on the news.  But that news proved to be an engaging backdrop to the tour.  Santa Fe is one of those cities where history informs everything.  It embodies the living memory of Native Americans, the Spanish Conquest, the Catholic Church, the Mexican period, the early American period, and something else we did not know about.  The guide took us into a sprawling 19th Century hacienda, through room after room added over the years, and in the furthest room was the former office of Robert Oppenheimer, where he met, towards the end of World War II, with other people with familiar names, like Edward Teller, as these men, engaged in the Manhattan Project, oversaw the invention of the atom bomb in the nearby desert.  Touching the preserved objects on Oppenheimer’s desk, I thought of the atom, and again my heresy was aroused, probably more so in the context of events unfolding in France.  What is an atom?  It comes from the Greek, meaning, “that which cannot be cut,” but of course we have cut it.  What does that mean, to “cut” an atom?  We have split it into “sub-atomic” particles, an oxymoron in the etymological sense, although, as if attempting to make semantic amends, science journals often assert that within the atom we've found new “basic building blocks” which cannot be cut, like quarks, or whatever particle we have not cut yet. 

We’ve found that when we cut an atom, a burst of something we call “energy” emerges.  This energy can be used to do things, like heat cities or incinerate hundreds of thousands of people.  The latter is what Oppenheimer et al. had in mind.

But what is an atom?  To find out, we hurl them at great speeds at each other, causing collisions that rip apart their structure so that tiny components spill out.  In my blasphemy, I consider that a strange way to find out what an atom is.  What if you were an advanced being from another galaxy and you wondered what people were?  You note that they move excitedly over the planet, changing everything, often rubbing against each other, activity that apparently produces more of them.  To further your understanding of people-particles, you deem it necessary to hurl them against each other, ripping apart their structure so that their components spill out.  That does not seem a likely scenario, as an advanced being would probably figure that nothing much would be learned about people that way.  I guess what I’m saying is, we’re not advanced.

After the first atom bomb was detonated in New Mexico in 1945, Oppenheimer is said to have quoted the Bhagavad Gita: “Now I am become Death, the destroyer of worlds.”  That occurred to me as we walked out into the cold sun in the courtyard.  What underlying phenomenon is happening in France, and almost everywhere in the world?  Are we becoming Death?  Does our state religion of dead matter mask a religion of murder and suicide?

Fortunately we had a few more kicks on Route 66 to revive the sense of carefree getaway, like a stop in Winslow, Arizona, where we sought coffee and parked at random on a corner memorializing the Eagles’ song, “Take it Easy.”  The corner store featured a plaster statue of a young dude flashing a smile and waving across the street at a permanently parked flatbed Ford, a plaster girl at the wheel.  Above the statue was a sign reading “Route 66."  While Susan checked the stores, I wandered down from the Ford, finding a gap between buildings featuring a lonely hut, big enough for one person to stand in, with a wooden sign reading, "World's Smallest Church! Come in and Pray!"  I went in and closed my eyes, hearing only the cars on Route 66, and I saw everything that had attached to my ego since I was one year old suddenly stripped away, as by a tornado or an atom bomb, and from everyone around me and the culture of the whole world, a vast layer was stripped away, and we were all souls, basic building blocks of consciousness.  I jumped out after a few seconds, wondering what spirits I had disturbed here.  I looked over to the plaster girl in the flatbed Ford for an answer, but she gave none.

Through the long and rainy deserts on the final stretch from Phoenix to L.A., we heard that the male French terrorists (one female escaped) had been killed.  I spent the hours navigating the deserts and L.A.’s freeways wondering if there was any logic to feeling good that these men were dead.  I think everyone would agree that those deaths do not represent the end of something, but a mere beginning in our religious quest to become death, destroyer of worlds.  

We give thanks to the Southwest, and Route 66, and the earth as it expresses itself here, for providing a backdrop to our thoughts and mysterious lives.



Friday, November 28, 2014

Bad words

Isn’t it odd that a word can be bad? Odd, that is, that the word itself is bad, not its referent. And odd that there’s no clear logic behind the bad word’s badness. For instance, “murder” and “torture” refer, in most people’s minds, to bad things, but the words are not bad. The word “fuck,” however, is bad, though it doesn’t refer to anything bad in an absolute sense.

"Fuck" is probably the most bad of the bad words, though, as noted, its referent, expressed acceptably in Latin as "copulation," ("a coupling together") is morally neutral.  How does such a word become bad?

History demonstrates the agonized process.  Christian Konrad Sprengel, 18th century German naturalist, was the first academician to suggest that flowers are sexual organs. For his pains he was hounded out of polite society and his work vilified. Today it is common knowledge that a wholly female flower is a type of vagina, that male-only flowers are types of penises, and hermaphroditic flowers are cocks with pussies attached that fuck themselves.

Sorry for the cheap shock value of my prose, but I’m trying to make a point: Sprengel turned “flower” into a bad word.

As an elementary and high school teacher I spent a lot of time and energy trying to dissuade children from saying bad words that denoted sexual organs, various sex acts and, of course, excrement. In this essay I ponder what I was trying to accomplish, and what our culture is trying to accomplish.

I’m a crossover person who remembers bygone eras. In 1955 my family went to see the movie “Picnic” because we’d heard that William Holden said “damn.” A hushed, almost worshipful audience awaited the big moment, and when the word was uttered a gasp in unison pervaded the theater. The movie producer’s gamble had paid off: box office dividends from a bad word. Few at that time realized that the dam was about to burst (sorry).

Fast forward to San Francisco State, 1969- my Chaucer professor has just charged breathlessly into the classroom. Instead of giving us a page number to find, he asks if we’ve heard what’s going on at U.C. Berkeley. Mario Savio and an army of dedicated young people have taken a stand for free speech, he informs us. We can say “fuck” if we want to!  Add cable TV a few years later, and the rest is history.

Fuck! Fuck! Fuck! Fuck! Thank you Mario!

Fast forward from the 60's to 1983, when, as a new elementary school teacher in inner-city L.A., I face a demure little black girl who, standing before my desk, has just said, “fuck.” There is no context, just the word, hanging understated in the air. I track down the mother’s work number and call. The mother’s response: “Let me get this straight. You called me at work to tell me my daughter said ‘fuck’?”

“Er…yes…” I stammer, and realize I need a zeitgeist upgrade.

Fast forward a few years and I'm a high school English teacher, listening all day to kids speak in 60's style linguistic abandon.

Like everything else in our society, our language protocols are in a state of flux.  At times of head-spinning change, it's helpful to ponder history.  The Norman invasion of England in 1066 gives some needed background. The Normans spoke French (though they were only two generations removed from their Viking ancestry) and imposed their language on the indigenous Anglo-Saxons, whom they despised beyond words, especially four-letter words. The Anglo-Saxons said things like “fuck” and “shit,” scum that they were, while the Normans, heirs to Latin, could say, in the French versions, “copulate” and “defecate.” Thus Savio's battle for free speech represents a continuation of the thousand-year struggle for the Anglo-Saxons' right to speak the mother tongue.

The "four-letter" words do of course have another property: they carry emotion.  Compare these two sentences:

1. There are dog feces on the mat.

2. There's dog shit on the fucking mat!

The first sentence is devoid of emotion, an expression of information only; the second, identical to the first except for two bad words, a contraction and an exclamation mark, explodes with emotion.   It is their prohibition that has attached emotional power to the bad words.  They are forbidden... special.  The process has given us useful words which express levels of emotion that other words cannot.

Once the prohibition has been gone long enough, the words' power will diminish.

In the high school portion of my teaching career I was able to formulate a policy on the goodness or badness of words based on their usefulness. “Plethora” I identified as a bad word because it’s ugly and show-offy, making its common synonyms more useful.  When we read an Anglo-Saxon bad word in literature, I encouraged students to assess the word's usefulness in its context.  Words are either useful or they're not. They are useful if they carry meaning and force; they are not useful if they don’t. If I have to hear “mother-fucker” all fucking day, that phrase is not useful. If it's only used once in a while, well then, maybe….