WE’RE ALL GOING TO DIE (probably not from Ebola, though)


Spreading Disease, Hollywood Style

Around 2 million people die every year in the US. About 38,000 will overdose on drugs, 33,000 will die in a car accident, 31,000 will get shot (by accident or by choice), and a little over 26,000 will fall down and not get up.

In any given year, 3,000 to 49,000 other people will die from the flu or complications from respiratory disease linked to the flu.

The rest of us will hang around for about 78.9 years until various body parts fail or our cells betray us. I don’t often wish I was 18 again, but sometimes I miss that sense of immortality endemic to youth.

Shortly after I started teaching literature, I had a student ask if all “great” literature (his air quotes, not mine) was about death.

Of course not, I told him, but death and taxes are the elemental constants of human existence. Taxes you can avoid if you own a multinational corporation with offshore offices or if you have a Congressional patron allowing you to shirk your duty to America and mooch off your fellow Americans.

Death not so much.

You, I continued, are sitting in this 8:00 am class, listening to me prattle on so you can earn three hours of credit, so you can get a college degree, so you can get a good job, so you can buy a house, so you can subscribe to a good cable package, so you can stock a little mini-fridge full of beer, and yet, before it’s all said and done, you’re going to die.

Death isn’t a question of if. Just when and how. Don’t get excited, though: if you don’t snort some coke, throw back some shots, and then drive 100 mph waving a gun around the car, you’ll probably be around for the midterm. Please study.

By the way, thanks for spending what might be your final moments with me. I hope you get your money’s worth.

Of course, I should have also said that if an airline pilot, beaten by a rogue, genius gorilla trained in James Franco’s attic, takes an international flight after a nosebleed, then WE’RE ALL GOING TO DIE.

From the simian flu. Or the SARS. Or the HIV. Or the bird flu. Or the Ebola. Or whatever turned everyone into zombies in the Walking Dead.

Feel free to skip the final if that happens.

Sell your stocks. Stockpile tomato sauce, pickles, and fruit cocktail. Bury your gold. Use your book as a fire starter. Literature matters, but when the apes take over, knowing how to find Shakespearean allusions in Sons of Anarchy won’t help you survive.

He never asked another question in class.

I’ve written before about the 24-hour news cycle and the impact social media has on public perception of information. If aliens landed in America (assuming any of them weren’t shot or put in deportation centers near the border), they would assume America was in the midst of an Ebola pandemic.

Ban travel, Republicans say. Appoint an Ebola Czar, the President says. It’s a conspiracy, my crazy cousin says. “Obama is from Africa. Ebola is from Africa. I’m just sayin’.” Blame the Centers for Disease Control, blame budget cuts, just be sure to blame someone before we all die!

It’s enough to make me wish Justin Bieber would punch someone so the headlines would change. Maybe President Obama will do us all a favor and hold a latte while he salutes the Marines to give the folks at Fox News something else to talk about.

Honestly, I don’t want to downplay the danger of Ebola (or any other infectious disease) and I understand that by definition something infectious can spread without warning. Around 4,500 people have died in Africa, and we have a moral obligation to help countries not as fortunate as us contain the disease. We also have a vested interest in working to solve world-wide medical crises. This disease can be isolated and contained.

But our political leaders also have an ethical responsibility to calm down just a tad. I realize that American politicians seem to have the emotional stability of a teenager on prom night, but can we at least pretend to care about the facts?

ONE person has died in America from Ebola.


I’m no math major, but that seems like a relatively small number.

So far in America, there have been two infections. Two is really just another way of saying a couple, which is still not many. I haven’t even had to use all the fingers on one hand, yet.

We also know that Senegal and Nigeria have been declared Ebola free, and the Spanish nurse who contracted the disease has tested negative.

I’m not really one for big, sweeping declarations, but even though we’ve cut the CDC budget by almost 50% since 2006, and, as Judy Stone points out in her Scientific American article, we’ve politicized science funding to the detriment of our preparedness, we still have the best medical facilities and infrastructure in the world.

America is uniquely capable and prepared to stymie infectious disease outbreaks.

If we can keep our wits about us and let science work.

No offense, Senator Cruz, but until I see the MD after your name, let the experts do their jobs.

Like many of you, I’m growing increasingly weary of politicians turning things like Ebola outbreaks in Africa into opportunities to score political points at home. When Rick Perry had to stand as the voice of reason in your political party, you know things are going off the rails.

Travel bans weren’t a good idea under President Bush (because they don’t work), and they won’t work any better under President Obama. ISIS is not gathering at the US/Mexico border with vials of Ebola, ready to infect Americans with rapid-fire sneezing. We don’t need an Ebola Czar to coordinate our Ebola response because a private hospital in Dallas didn’t follow protocols. How much is this guy going to get paid to tell hospitals to follow protocol, and why can’t I apply for that job?

What we really need is someone to reassure us, to tell us that yes, we are all going to die (but probably not from Ebola) and probably not today.

It Was a Dark and Stormy Century: And It Just Might Be Our Fault


Dr. Mann and colleagues’ graph for the IPCC

Dr. Michael E. Mann, author of The Hockey Stick and the Climate Wars, is on our campus today. Dr. Mann is best known for the graph you see in the upper left-hand corner. The graph, submitted to the IPCC (Intergovernmental Panel on Climate Change), shows the temperature changes over the last thousand years. If you look closely (or if you click on the graph), you will see a sharp uptick in temperatures starting during and directly after the industrial revolution.

The graph is designed to provide a simple, easy-to-read measurement that demonstrates anthropogenic (human-caused) climate change based on 1,000 years’ worth of temperature and climate data using multiple measurement tools.

The graph and leaked emails from members of the IPCC research group thrust Dr. Mann into the political spotlight. Critics of climate change and global warming cherry-picked sentences from emails and willfully (or ignorantly) misconstrued conversations between scientists who had been working together for many years to cast doubt on both the data and Dr. Mann’s professionalism.

It was, in many ways, a picture-perfect example of the politics of personal attack writ large on the cultural landscape. Notable climatologists Sean Hannity, Michele Bachmann, Jim Inhofe, and Sarah Palin attacked Dr. Mann’s data while other critics accused him of politicizing climate change. Dr. Mann was called before Congress and dressed down while Senators grandstanded on TV, collecting campaign contributions from Exxon, Halliburton, and Mobil in the backrooms. For his critics, Dr. Mann’s hockey stick became an example of his hockey shtick, and they clearly imagined that destroying him personally would discredit his data. (A rhetorical device, I recognize, I employ in this paragraph as well.)

Certainly, there are legitimate questions to be asked about rising temperatures, and all science requires peer review. The very nature of scientific inquiry demands that data be reproducible. When data can be reproduced, we move toward scientific consensus.

And please note that we have scientific consensus on global climate change and global warming. No reputable scientific agency throughout the world denies humanity’s impact on the levels of CO2 and other measures of global warming. The ice caps are melting, super-storms are quickly becoming the norm, and long-term droughts have human fingerprints all over them. The question, at this point, isn’t really if humans have impacted climate change. Groups like 350.org and Dr. Mann aren’t leading some world-wide scientific conspiracy, cooking the data, and tricking people.

The question, instead, needs to be how we balance our needs for fuel, energy, and food within the ecological limits of our planet. Driving hybrids and eating granola can’t be the only answer but doing nothing isn’t really an option at this point.

But what is most fascinating, I think, about Dr. Mann’s story isn’t the coming global climate disaster but the intense and incredibly loud adverse reaction to the scientific data.

In some ways, the reaction to any data that disrupts our livelihood and threatens our way of life reflects an ages-old distrust of science and experts. One need only look at popular culture’s presentation of mad scientists to recognize that we have a healthy distrust of people who play with chemicals and the other fundamental building blocks of life on earth. Dr. Frankenstein, Dr. Jekyll, Dr. Moreau, Dr. Strangelove: when scientists go bad, they can destroy the planet. My guess is we will eventually learn that the zombie strain terrorizing Rick and his crew on the Walking Dead is man-made.

The visceral reaction to the climate change data, I think, reflects some of this distrust. Truly disruptive ideas in the scientific arena have an ability to change the way we see humanity’s role in the universe. Imagine, if you will, living your entire life believing the earth is the center of the universe and our role is pre-ordained, defined by a higher power. When that belief is challenged by science and when the very underpinnings of our moral identity are called into question, we often fight back and reject the change.

In the 21st century, social media, the internet, and an ability to isolate ourselves in the echo chamber of like-minded people have also turned too many of us into “experts” in our own minds. We reject scientific inquiry for observational data, forgetting that just because we can’t see it doesn’t mean it might not be true. Too many of us forget that our lifetime, these 70-90 years we walk the planet, is but a small sample of time. Dr. Mann’s data covers over a thousand years. Your memories from 1950 barely register as a trend, and a cold year in Topeka, Kansas, doesn’t disprove global warming.

But I also recognize that things aren’t so simple.

Andrew Hoffman argues “Climate change is an existential challenge to our contemporary world view.” Multiple polls show us that climate change is poised to replace health care as the essential battleground issue in the coming years because, I think, it challenges our relationship with the physical world. Religion and myth show us as passive recipients of weather and climate. Floods, droughts, hordes of locusts, and other phenomena are “natural” disasters, occurring mysteriously or as a way for God (or the gods) to remind us that they have the power to unleash such things and we don’t.

Except now Dr. Mann and his colleagues are arguing we do have that power, and I think it scares the bejeesus out of some people. But I think it also smells to some like yet another large, international organization trying to define and redefine what it means to be a human being. Well-meaning skeptics of climate change sense a loss of power as that “self” is redefined and the power of God (or the gods) is diminished.

Simply put, climate change is both a moral issue and an issue of morality that brings science, religion, and politics into conflict.

Perhaps justly so. As we embark on this conversation, though, it might be nice if we could focus less on the messengers and pay far more attention to the message. Doing so, I think, might help us save the planet and emerge as better people in the meantime.

Keep Your Hands Where I Can See Them


The Great Porn Experiment–TED Talk (Click to view the 15-minute video)

In my English 1302 course, my students have to write a research paper, and I allow them to choose any topic that interests them. Except abortion, Elvis sightings, the death penalty, and UFOs: “No one, especially a first year college student,” I tell them, “can separate belief from opinion, fact from fiction, or faith from rationality with regard to any of those topics.” More important, these are such highly charged and emotional issues that students too often assume the grade is relative to whether I agree or disagree with them. Any potential learning goes out the window when they assume they failed because I am either 1) a liberal pinko communist sympathizer or 2) a conservative right-wing nut job.

In other words, my life is much easier if we just avoid certain topics. Plus, half the students in my class did an awful job in their high school debate class defending the right to life or the right to choose. Budding young Clarence Darrows they are not.

So, shortly after I started teaching when a student told me he wanted to write about porn addiction, my only recourse was to tell him no visual aids. (I had to say the same thing to the student who wanted to write about the lingerie retail business, arguing for the positive psychological impact of matching bras and panties. It was a strange semester.)

My porn addiction student (and, yes, when you’ve taught enough students that’s the only way we can remember them sometimes) did a pretty good job. His argument, simply put, was that the internet was transforming the distribution of pornography and the easy availability would create addictions. These addictions would destroy families, he argued, and lead to the downfall of civilization. (Remember–he was 18. Hyperbole is second nature to many first year college writers. Notably, he and lingerie woman should have joined forces–she argued that Victoria’s Secret might save marriages by making women feel sexy and confident.)

Despite his overzealous claims at the end of his essay, his paper hinged not on pornography itself (although he argued any porn was bad porn). Instead, he argued that the internet would increase the amount of porn people saw and that increase would create a corresponding increase in divorce.

For an 18-year-old student, the argument was relatively sophisticated, and he did a pretty good job of gathering scholarly sources. Even at the time, I had my doubts about some of his conclusions, but the nature of a first year research paper isn’t to be perfect. His job was to gather scholarly papers, write well, and use that information to think critically about a topic.

My student was probably wrong about increased porn availability and divorce, but it does appear that he was correct in foreseeing the inherent dangers of widespread pornography.

I’m sure, at this point in the blog, I should make the requisite announcement that I’m no internet porn expert, but I should also note that I have two teenage boys at home. My wife and I, as you might imagine, are cognizant of the delicate balancing act between teaching our kids how to respect the human body, working hard to protect them from inappropriate images, and recognizing that their bodies are taking that evolutionary ride toward manhood.

But, in our ever connected world, their opportunities often exceed our ability to monitor them.

They are, in short, becoming increasingly aware of their manhood. (Although as a dad, I think I’m supposed to hitch up my pants, shrug my shoulders, and declare proudly “There ain’t nothing short about their manhood. If you know what I mean.”) We know they’ve both accessed porn on our computer. (I can assure you that we don’t make it easy but passwords and a central location are no match for a determined teenager.)

Either way, as we are able to increase our understanding of the brain via neuroscience and as we watch the first generation truly raised on the easy availability of high speed internet, we can begin measuring the impact of long-term exposure to, well, exposure.

And it’s not pretty. Scientists are finding that long-term exposure to internet porn, coupled with, well, auto-coupling, creates both physical and psychological problems. In essence, virtual stimulation might stop real penetration. As important, viewing internet porn and masturbating creates a chemical reaction that mirrors other addictions. As viewers get addicted, they, literally, can’t get an erection and perform in real life. (It’s like being addicted to whiskey without the hangover or bad breath!)

In other words, the irony of internet porn is that you know 100 different positions, but you can’t actually perform any of them.

Don’t get me wrong. I’m not interested in making light of the situation. (Although I’m an American male and talking about sex and masturbation brings out the junior high kid in me.)

Any addiction is problematic and I suspect we will see more and more studies looking at the brain’s reaction to visual stimuli and the internet. We already are witnessing the different ways these students interact with material and the influence visual stimuli have on the evolution of the brain. We know, simply put, that the internet is re-wiring that mass of tissue between our ears. As our Net Generation goes to college, such studies will be increasingly important and, perhaps, helpful.

Especially, if we can find a way to get kids hooked on reading novels. I guess putting pictures of naked bodies in the books isn’t a good idea, though.

Just Because We Don’t Like It Doesn’t Mean We Won’t Remember It


Click to view music video by Bowling for Soup

Hollywood has been telling us for years that high school might very well be the most important time of our lives. While there are movies about college years, those films really just take the high school concept (jocks, nerds, princesses, criminals, etc.) and add legalized drinking (as opposed to the illegal drinking that takes place in high school movies) and at least one snobby college professor. It is the high school movie, perhaps best exemplified by movies like Dazed and Confused, Napoleon Dynamite, and The Breakfast Club (although I’m sure any one of us could come up with our own list), that has the ability to capture a more universal, American idea. In theory, we all go to high school; hence, we all have a place in the high school movie.

Obviously, Hollywood plays on stereotypes, often neglecting some issues in order to focus on sex (American Pie, for instance), but the basic concept is that who we are in high school largely shapes and predicts our future selves. For a four-minute refresher on the high school experience, one need go no further than Bowling for Soup’s “High School Never Ends.” The notable exceptions, it seems, are the “princesses,” those girls who found self-esteem and popularity in their beauty and who become less confident the older they get, and the “brainy girls,” who grow more confident the older they get. (See the link below for more on this difference.) For the rest of us, well, high school never ends.

Now, as is often the case, neuroscience is catching up to popular ideology. Not only is our self-image especially “adhesive” during this time, “the prefrontal cortex–the part of the brain that governs our ability to reason, grasp abstractions, control impulses, and self-reflect–undergoes a huge flurry of activity.” The net result–“During times when your identity is in transition, . . . it’s possible you store memories better than you do in times of stability.”

Our adolescent years, science tells us, offer us great opportunities to learn. We shape our identity, we develop our capacity for social engagement, and our brains are, quite literally, growing by leaps and bounds. Unfortunately, we also are a hormonal mess of conflicting emotions, and we have a limited capability to understand what we are learning. It’s why every parent has entertained, at least once, trading in the 15-year-old for a newer, improved model. (Or, perhaps, for the younger version that was sweeter, kinder, and a tad bit more predictable emotionally.)

Regardless of our emotional stability, the memories are being stored whether we like it or not.

I’m not going to pretend, in a thousand words or less, that I can offer a cure for what ails our educational system, but I do think we have to recognize the extraordinary possibilities available when we work with kids whose brains are in transition. First and foremost, we must recognize that these kids are soaking up information and creating memories regardless of their desire to learn. While we all want active, engaged participation, and such students definitely master the material, we also have to remember that exposure matters. Life, as I tell people all the time, is a marathon not a sprint.

I’m under no illusions, either, that we can create a sea change in the social culture at the high school level. I’m sure there are things we can do to ameliorate the situation, but high school reflects our larger social construct. Pretty people get more stuff. Athletes are more popular. We have a certain cultural distrust of intellectuals and people who play the trombone. Our high school students, quite frankly and for better or for worse, are taking their social cues from the adult world.

But we can use these formative years to build a catalogue, a Rolodex if you will, of memories for our students. Many years ago, I used to use a pop culture reader in my freshman composition classes. The idea, simplistically enough, was that the topics would be interesting to my students and, as such, they would be more engaged. We could talk about music, film, TV, sports, and other contemporary topics, and those conversations would create intellectual growth and curiosity.

What I found, however, is that my students were so poorly versed in anything outside of themselves and their own self-interests that they had almost no ability to dig deeper than the surface. They had, quite frankly, just spent the last 8 years of their education studying things that interested them, and their only reference point educationally was their own ego. Critical thinking amounted to “It sucks” or “It’s cool.” Or, when they couldn’t decide, they would tell me “It’s all good.”

What we know, if we watch enough John Hughes films and read enough neuroscience, is that we must change the way we approach adolescent education. Our students need reference points outside their own experience, whether they are interested or not and regardless of whether they “use” that experience in their 9th grade class or on a standardized exam. I’m not advocating we stop teaching certain skills or that we stop using popular culture references. But I am arguing that we force our students to read, see, and listen to works of art that influence the contemporary moment in which they live, even if those works aren’t part of the students’ lexicon. I don’t care anymore if the Iliad is boring (and Troy with Brad Pitt more exciting), and it doesn’t bother me if Picasso’s paintings are weird looking.

They may not like it and they may be bored, but education isn’t just about that moment in time. What neuroscience tells us is that what we experience in 1985 is relevant in 1995, 2005, and all points in between. We may not know when it matters, but the memory is there and available. Let’s stop worrying about skills we can measure in a specific place and time, and start focusing a little bit more on the kind of memories we want these kids to store.

It’s Beginning to Look a Lot Like . . . Spring?

Last night was a good night in our house. I went home, threw some steaks on the grill and drank a cold beer (or three). With two teenage boys, steak is a real treat in our house, mostly because it takes about half a cow to make even a small dent in their hunger.

I don’t want to sound like an old codger, and I’ll willingly admit that good steaks (New York Strip or T-Bones) have always been pricey, but I do remember the days when a nice sirloin was a treat but not a luxury.

I also remember the days when I put the grill away during December. I realize complaining about warm weather might seem odd, especially to my friends in Minnesota. They don’t have a lot of sympathy when I complain about how warm I was playing golf last Sunday (or that I’m grilling steak on Dec. 6). But, can I share my first world problem with a tinge of political outrage?

Lamar Smith, long-time Republican Congressman from Texas (one of those men who is, ironically, an anti-government career politician), has been named chair of the House Committee on Science, Space, and Technology. Smith, despite living in a state suffering from the worst drought since the 1950s, has his doubts about the impact of humans on the climate, and his bona fides regarding science are certainly questionable.

At the same time, Smith notes “we can help future generations get [to space] by encouraging kids to study in STEM fields (science, technology, engineering and mathematics). If America is going to remain competitive in today’s global economy, we need to remain innovative and focused on exploring science and expanding new technologies.” (Maybe the Republicans did learn from the election that crazy has no place in science.)

On the surface, it sounds like Smith at least recognizes that science matters. If I had Smith’s ear, I would tell him let’s leave behind the notion of blame regarding the climate and recognize that we simply must do something about climate change. In other words, stop fighting the idea that humans are or are not having an impact, and start focusing on initiatives that push renewable energy and invest in companies, universities, and individuals who can help us overcome the impact of climate change. I’ll offer Rep. Smith a little more advice, also. Drilling for more oil isn’t the answer and, while this may sound like heresy, in an era with limited funding, I think Smith can invest our money someplace other than pushing educational initiatives from a committee that isn’t an educational committee. (Call me crazy, but how about if we leave encouraging kids to study STEM fields to the Education Agency?)

In other words, Smith’s committee should be investing in research and development. Stay focused. And, while I love NASA, the most important issue of our time is going to be the climate and water. President Kennedy challenged us to get to the moon. How about if Smith challenges us to reduce our carbon footprint? We have some of the greatest scientists in the world–let’s create a Manhattan Project focused on the climate and water availability.

I also need Rep. Smith, as chair of the Science committee, to invest in educational initiatives targeting farmers and ranchers in these changing climates. Extension agencies across the West are exploring ways to feed and grow with less water, but they need money for research and, more importantly, money to take that information to the land owners. Land owners need funds to make changes. If we want them to water crops differently, help them replace old equipment. Likewise, let’s focus on cities and urban areas reducing the water they use and increasing the number of LEED-certified buildings. Let’s build smart buildings and let’s build smartly.

I would also encourage Smith to actually visit Texas and look at the ranch lands trying to recover from over-grazing, drought, and falling water tables. If that doesn’t help, perhaps he can head up to HEB, visit the meat counter, and ask the butcher why steak has become increasingly expensive. If that doesn’t help, have him show up to my house and I’ll take him shopping. What he will see is the direct impact of a changing climate on our bottom line. Milk, butter, meat–these are staples of the dinner table and every month of drought increases their cost and impacts college funds, purchasing power, donations, and other tools that drive the economic engine of the country.

If he shows up, though, he better bring a pair of shorts, because the last time I saw Old Man Winter he had on a swimsuit.
