Netflix’s Daredevil Adaptation: Miller Lite

Superhero comic books originated in the 1930s and 1940s, but their creators remained in relative obscurity, often with little or no financial reward. The 1980s and 1990s, however, ushered in an era of comic book creators as superstars.

One of the most iconic and influential superstars from that period was Frank Miller, who built his comic book capital on a staple of the industry—the reboot.

Miller reimagined the canon for and resurrected Daredevil (Marvel) as well as Batman (DC). Some argue that his work on Batman: Year One (with David Mazzucchelli) and The Dark Knight Returns (with Klaus Janson and Lynn Varley) is among the best in the history of superhero comic books.

Batman: Year One (cover)

Miller’s artwork also proved to be a visually impressive source for film—notably his Sin City and 300.

Superstardom for Miller hasn’t avoided stumbles (his script for RoboCop 2) or controversy, as Sean Howe detailed in 2014:

But, as if Miller were one of his own antiheroes, his stark individualist philosophy has also led him down some lonely corridors. He’s written graphic novels that many of his fans recoil from—including one that WIRED called “one of the most appalling, offensive, and vindictive comics of all time.” And he followed that up with ferocious online musings that provoked an outcry, even from some of his most stalwart supporters. In recent years, he’s withdrawn from the public eye.

One of the newest renditions of Miller’s work has itself been mostly hidden from the public eye: Miller’s The Man without Fear and his “Born Again” arc as source material for Netflix’s now-cancelled Daredevil series.

Charlie Cox in Daredevil (2015)

The Many Universes of Superheroes: Netflix’s Miller Lite Adaptation

While rebooting characters and entire universes became a standard convention of comic books at Marvel and DC, the adaptation of superheroes from print to film sputtered throughout the 1970s, 1980s, and into the 1990s.

Marvel has mastered the film adaptation, and many in the public are far more familiar with the film Marvel Universe than with the many universes of the comic books. Concurrent with Marvel’s feature-film success and the struggles of DC-based films other than Batman, Netflix launched serialized superhero adaptations in conjunction with Marvel: Jessica Jones, Daredevil, Luke Cage, Iron Fist, The Defenders, and The Punisher.

These adaptations, I thought, held much greater potential than feature films; they matched the current generation’s lust for binge-watching, and they also maintained one of the most compelling features of comic books: extended serialization.

The Netflix approach was well-suited to Jessica Jones since the adaptations downplay some of the main conventions of superhero comic books, such as elaborate and identifiable superhero costumes.

To their credit, Netflix adaptations have been character driven, often as much about the everyday person as the superhero alter-ego.

Season 1 of Daredevil traveled that muted approach to superheroes and found the perfect source in Miller’s arc, later published as the graphic novel The Man without Fear, written by Miller with dynamic artwork from John Romita Jr. (pencils) and Al Williamson (inks).

This first season follows a softened and tweaked Miller narrative and draws significantly from Romita Jr.’s art, notably the black non-costume Matt Murdock dons in most of the season:

Daredevil: The Man without Fear 5 (John Romita Jr. and Al Williamson)

While I have examined The Man without Fear and its relationship with the Netflix series [1], below I want to look at Season 3 and its use of the “Born Again” arc as more Miller Lite.

Daredevil Born Again, and Again

The “Born Again” arc (Daredevil vol. 1, issues 227-231, and often including 232-233) features Miller and Mazzucchelli, who also paired on Batman: Year One. This storyline builds on the rebooted Daredevil fashioned by Miller and includes some powerful religious imagery and themes.

Daredevil: Born Again (cover)

Daredevil as a mythology and narrative has survived, I think, like other major superheroes because in its essence that mythology has compelling elements—structural justice versus vigilante justice, tensions surrounding the idea that “justice is blind,” etc. However, the serial rebooting of the character and the adaptations of the comic book medium into feature films and serialized formats suggest that these essential elements have never been fully realized.

This is where the differences between the source material and the adaptation come into play. Netflix’s S3 of Daredevil uses “Born Again” as the primary frame, as S1 used The Man without Fear. But S3 also pulls, sometimes directly and sometimes loosely, from other sources in the comic book universes.

Jesse Schedeen offers 9 changes made in S3 to the comic book sources:

  1. Schedeen focuses on Karen Page’s role in Wilson Fisk/Kingpin discovering Matt Murdock is Daredevil; Karen is manipulated into revealing Murdock’s secret in “Born Again” because Miller has reimagined her as a drug addict and failed-actress-turned-porn-performer. I want to add and emphasize here that the Netflix version of Karen is an important shift from Miller’s trite and reductive Karen. Netflix’s adaptation has clearly sought ways to keep Karen flawed (her backstory revealed in S3 is brutal and dark) but maintain a far more complex and fully human character than Miller has allowed. Like Matt, Karen feels a great deal of guilt and self-loathing in S3, but this adaptation resists a common flaw in comic book narratives to reduce women to one dimension.
  2. Another change involves pulling from a different source, “Guardian Devil” from 1998, as Schedeen notes. This change fits into my point above, I think, in that S3 character Benjamin “Dex” Poindexter (an adaptation of the Marvel character Bullseye) kills Father Lantom instead of Karen. Again, I see these changes allowing a richer and more complex version of Miller’s Karen Page and the wider Daredevil contemporary canon (in this case crafted by Kevin Smith and Joe Quesada).
  3. S3 maintains the “Born Again” reveal of Matt Murdock discovering Sister Maggie is his mother, as Schedeen details, developing more tension in the adaptation version.
  4. The teasing out of Wilson Fisk/Dex (Bullseye) and another assassin, Nuke, between “Born Again” and S3 demonstrates how the Netflix series often streamlines source narratives and characters while also in many ways blunting superhero elements.
  5. One of the most distinct differences is the use of Dex, and dropping the name “Bullseye” as well as the superhero uniform, in S3. Netflix’s adaptation has chosen to emphasize Dex as mentally unstable, paralleling, I think, in many ways the motif throughout the series concerning childhood trauma (shared by Dex, Fisk, and Murdock) and authority conflicts—the parent/child pattern seen also with Karen.
  6. The paralysis of Bullseye is shared between S3 and the comic book source, and as the Netflix S3 ends, Dex’s surgery clearly was designed to propel the series into another season.
  7. One of the key characters in the Daredevil myth is Foggy, and the Netflix version also develops from the foundational source character into a more complex and even realistic person, a necessary change, I think, in terms of how Foggy parallels Karen as they interact with Matt.
  8. Fisk’s love interest, Vanessa, proves to be another interesting adaptation in S3, much like the changes made with Karen. As Schedeen explains, “In the comics, though, Vanessa has a much more complicated relationship with her husband and his criminal empire.” Here, I think, the viewer of S3 is forced to consider Vanessa as a more fully human and independent character, again in similar ways to how we view Karen. In comic books, as in literature, women are often reduced to being merely symbolic or muses for men as heroes, or villains.
  9. Similar to Dex (Bullseye), Fisk (Kingpin) is essentially drawn from the comic book Marvel universe, and “Born Again,” but the superhero/villain elements are greatly muted. The “Born Again” Kingpin projects the sort of large ego we see in S3, but the fights and outcome for Fisk vary substantially in the adaptation. Schedeen adds, “Fisk doesn’t suffer quite so resounding a defeat in ‘Born Again.’ He does overplay his hand in his attempts to destroy Matt Murdock, eventually causing the deaths of dozens of Hell’s Kitchen residents when he unleashes the out-of-control Nuke.”

With the Netflix run of Daredevil finished in midstream, we can see how Miller’s version has provided a powerful and compelling frame for the adaptation. But we should also recognize the potential and purpose of adaptation from one medium to another.

The Netflix series as Miller Lite presents an important argument for the urge in the comic book universe to reboot and retell. Daredevil as a foundational superhero myth has extremely important characters, motifs, and themes, but too often the array of creators positioned to soar with those elements has tended to flutter, falter, and even fail.

S1 of Daredevil was exciting in its potential, even as I found the filming too dark (although the dark tendency of the comic book with some artists, such as Alex Maleev, has been among my favorite qualities). By S3 and the abrupt end, I was increasingly hopeful that this adaptation was working its way in the right direction.

While episode 13 of S3 charged viewers with Matt’s “man without fear” speech at Father Lantom’s funeral, we are left once again with less than we had hoped for.


[1] Thomas, P.L. (2019). From Marvel’s Daredevil to Netflix’s Defenders: Is justice blind? In S. Eckard (ed.), Comic connections: Building character and theme (pp. 81-98). New York, NY: Rowman and Littlefield; Thomas, P.L. (2012). Daredevil: The man without fear; Elektra lives again; science fiction [entries]. In Critical Survey of Graphic Novels: Heroes and Superheroes. Pasadena, CA: Salem Press.

Domestic Tuesday

My life as a voracious reader began in childhood, but matured at some point in early adolescence as obsessive. That early obsession was grounded in collecting and reading Marvel comic books as well as science fiction novels—early Michael Crichton, Larry Niven and Jerry Pournelle, and Arthur C. Clarke.

I have steadily plowed through my reading life discovering and then devouring new writers. In my last couple years as an undergraduate English education major, I was in my John Irving phase, spurred by falling madly in love with his The World According to Garp.

Naive and often clueless, I was a twenty-something who hoped to be a writer and desired more than anything a deep and unique love. My idealized vision of falling in love and marrying was compounded by idealizing Garp’s life as a stay-at-home husband/father.

While I have read most of Irving’s novels, and loved quite a few, it has been years since I read Garp, and I realize I may now find much of the novel, and Garp’s domestic self, far more problematic. However, while I have never become the novelist and fiction writer I had planned, my life as an academic and writer has included domestic elements that I genuinely enjoy.

Since I teach most often on a Monday, Wednesday, Friday schedule, I have for many years remained home to write and work on Tuesdays and Thursdays. Also, starting more than four years ago, I have been a caregiver on Tuesdays, first for my granddaughter and now for my grandson.

Whether I have been home to write and work or to watch my grandchildren, I spend part of my time washing dishes and washing, drying, and folding laundry. Some days I also make a trip to the grocery store.

Laundry, while being a chore, also provides a bit of zen for me. I find a certain peace in folding and hanging up clothing the way I prefer.

As a man, I recognize the absurdity of finding peace in the sort of domestic chores society has imposed on women, chores that many marginalize as “women’s work.” It is a sort of absurdity that can easily ignore how women, historically and currently, must often navigate a professional life alongside their domestic obligations in a way that men can drift into and out of—or even avoid—without much consequence.

One of my favorite, although heavy, units I taught while a high school English teacher included using the film Pleasantville as an entry point (focusing on the TV mother character) into exploring women poets—Adrienne Rich, Sylvia Plath, and Anne Sexton—in terms of how their status as women impeded their work as poets.

As I have shuffled back and forth between writing and doing the laundry, I have more than once paused against the awareness that Plath’s life overwhelmed her as wife, mother, poet. An awareness of the millions of women who have suffered and now suffer the same fate without the spotlight we shine on the celebrity-tragedy of Plath.

There is a convergence here since my mother was the most important influence on the reader I became, the writer I would become because of that reader life steeped in science fiction and comic books, and since my mother imprinted on me an indelible image of the domestic life of women.

I will always associate my mother with clothes pins, the bucketful in the laundry room where she hid hundreds of dollars at the bottom. (Photo by Caspar Camille Rubin on Unsplash)

My mother, Rose, was a child of the 1950s, and she spent much of her life caring for her siblings, and then her own children, before later running a daycare. Even when she worked outside the home, my mother did the laundry, cooked, and provided the bulk of the childcare; she also handled the bills—and, quite frankly, it seems she did everything.

And as Caralena Peterson explores about women academics, my mother appeared to do everything extremely well and nearly effortlessly.

Today, as my iPhone reminds me, is my father’s birthday and my parents’ anniversary. They died about six months apart less than two years ago.

My parents were very 1950s, very Southern and white. They were also uncritical embodiments of gender stereotypes and obligations.

Hard work matters, I believe because of them, for the sake of making the effort, and I do find some tranquility and sense of accomplishment in doing things the right way, or at least a purposeful way.

Like carefully folding each piece of clothing because each piece of clothing—whether yours or someone else’s—deserves that moment of purpose.

Part of the celebration around Irving’s Garp, which eventually led to a film starring Robin Williams, revolved around his provocative topics, but the novel also spurred a conversation about Garp as domestic husband.

In no small part, the public discussion equated “domesticated” with “emasculated.” A man without a job was no man.

This was a long time ago, when I was far less aware, but I don’t think that conversation ever interrogated the fact that Garp as a man still had a decision. A decision that women are often not allowed.

I often find the sink filled with dirty dishes, and the dishwasher storing clean dishes—from when I started the cycle. Whether late at night before bed or first thing in the morning, I often make that right.

Putting away clean dishes. Filling the dishwasher and starting another cycle.

This seems simple; some would be compelled to compliment my helping out.

But this is not some other person’s chore. This is something I choose to do, in part because it brings me a calm to set things right.

It is, however, a decision I can make. It is my remaining privilege as a man.

Today, as my grandson plays and as I write and do some work, I cycle through washing and drying all the dirty clothes, folding them warm and clean-smelling on the day my father was born, the day my parents were married 59 years ago.


Recommended

Stop Assuming That I’m Just Writing About Myself by Kathryn Vandervalk

Cat Person by Kristen Roupenian

Pledge by Anton DiSclafani

Mything Truth

A confrontation in Washington DC this past weekend has introduced yet another image certain to join a disturbing history of white America.

About a year and a half ago, we were asked to confront rage.

As the Covington Catholic student controversy was unfolding, I felt compelled to add this warning on social media:

Don’t be distracted. Regardless of the exact details of the controversy around the private school students and the Native American veteran, stay focused on how this PROVES the ugly power of white/wealth privilege. Every. Time. Being white/wealthy ensures a cavalry will rush to protect you, to lie if necessary, but in the end, you will be insulated from your behavior regardless of your innocence or guilt.

And then the predictable occurred: white men inserted themselves into the comments, attacking me and offering the standard “white male privilege is a myth” response.

As a white man with a great deal of social capital and economic slack, I am a first-hand witness to what people say and do when only whites are around, when only men are around. I am not speaking about hypotheticals, but about how this world absolutely functions.

Few lies are more flimsy or toxic than denying white male privilege (skim the evidence). But we must note how it is always white men who rush to deny, and attack.

Simultaneous with the shifting and expanding narrative around the Covington Catholic student confrontation, I watched the newest Coen brothers film, The Ballad of Buster Scruggs.

The Ballad of Buster Scruggs (poster)

I have long been an admirer of the craft of the Coen films, but their work also presents a disturbing problem: While there is a high level of art and craft in the work, we must also acknowledge that the work is very white, and possibly too often uncritically white.

Ballad has received the usual critical praise the Coens enjoy. Richard Brody is effusive:

The Western is the most inherently political genre, and, with “The Ballad of Buster Scruggs,” their two-hour-plus anthology of six short Westerns, Joel and Ethan Coen have made an exemplary political film. (It’s already in limited theatrical release and will launch on Netflix this Friday.) It is a movie put together from bits and pieces of cinematic tropes, conventions, and clichés, including ones borrowed from a range of genres, from ingenious physical comedy to romantic lyricism to Gothic horror. But all are united by a giddy Western revisionism centering upon a common theme: the relentless cruelty, wanton violence, deadly recklessness, and cavalier abuses of unchecked power that prevailed in the thinly and casually governed Wild West. Whether with outrageous antics or metaphysical mysteries, the Coen brothers fill the film with a subtle nose-thumbing; they’re laughing up their sleeve at the long-standing exaltation of the West as a primordial realm of titanic heroes, and at a society that even now consumes Western legends and spits them back in the form of historical verities and political pieties.

Brody gives the Coens one of the usual passes, in fact: “The movie sets up its action with the droll framework of an old, illustrated book of Western stories, further emphasizing that this movie is—like the Western stories that it parodies—a batch of back-constructed tall tales.”

Parody is among the safest refuges for artists with privilege. “We are just depicting the world as it is, in all its flaws,” they shrug, “so that the audience can draw their own conclusions.”

And Matt Goldberg offers the ultimate stamp of approval by suggesting that this film rises to the level of universal in the final story, “The Mortal Remains,” a tour de force of Coenesque dark humor: excellent acting driven by loquacious characters, each representing a different philosophy:

What these kinds of conversations point out is that we struggle to put everything together, but we can only do the best through our own point of view. We’re extremely confident in that point of view, and as the Englishman notes, “We love hearing about ourselves. As long as the people in the stories are us, but not us. Not us in the end, especially.” We’re all guilty of confirmation bias, and yet that will not save us. The Lady is just as dead as the Frenchman who is just as dead as the Trapper.

It is another story, however, “The Gal Who Got Rattled,” that fits best into the real-world controversy involving private school boys, Black Israelites, and a Native American veteran.

“The Gal” involves some of the most enduring, and problematic, elements of the Western film: a wagon train and the ever-present threat of attacking Indians.

The climax of this story finds one of the wagon train leaders, Mr. Arthur, and the central woman, Alice Longabaugh, trapped in a gun fight with the standard image of Indians appearing menacingly atop a ridge.

Aided by prairie dog holes, the rugged (white male) individual, Mr. Arthur, poises himself against several waves of Indians on horseback; in the balance is Alice’s womanhood-as-assumed-virginity.

Among those of us with at least a modicum of experience with Western films (John Wayne, spaghetti Westerns/Clint Eastwood), this story triggers thoughts of “circling the wagons” (used uncritically, this is packed with racist stereotypes and erases imbalances of power and Westward expansion as genocide)* and the cavalry.

This brings me back to the confrontation in Washington DC, a confrontation among two marginalized groups—blacks and Native Americans—and the privileged white male students from an expensive private school.

Of those three groups, only one has hired a PR firm. And despite the apologists for the Covington Catholic students shouting that the whole picture somehow absolves these young men and the culture that spawned them, that whole story now includes video of the boys in their red MAGA hats taunting a young woman and truly incriminating evidence that the school has at least allowed a deeply antagonistic culture of white arrogance.

I suspect the apologists do not really want the whole story, any more than America wants the whole and ugly story of our so-called Wild West.

Regardless of all the details, as I cautioned, we must be willing to see that the Covington Catholic students are examples of the power of privilege, the guarantee that privilege ensures a white cavalry will come.

For those interested in the truth and not mything truth, I offer here at the end some of the best full pictures of the confrontation:


* This post and blog title have been revised to address racially insensitive use of language.

“Despite the Data”: Higher Education Fails Equity, Inclusion with SETs

Entering higher education in my early 40s after 18 successful years as a high school English teacher, I remain, 17 years later, baffled and even disappointed at the mess of contradictions that characterizes an institution populated by the most educated people possible.

Immediately I had to hold my tongue against the pervasive culture of college professors constantly bemoaning how busy they are. When my high school teaching career ended, I was wearing a wrist brace because I was hand-marking about 4,000 essays and 6,000 journals per year while teaching five courses and about 100 students (many colleagues taught 20+ more students per year).

I also coached many of those years, with work days from about 7:30 AM until 10 or 11 PM.

By contrast, I teach two first-year writing courses each fall (as part of my full load, a minimum of five courses per academic year), a total of 24 students, and my teaching schedule tends to be three days a week, often including a Monday evening class.

The Ivory Tower effect is a bit more accurate than I would prefer.

More disturbing, however, is the power of tradition among academics, a dynamic that works against practices and policies being based on evidence (and thus in a state of flux when that evidence changes).

While the U.S. has a long history of characterizing and even demonizing higher education as some sort of liberal cult, the truth is that the very worst qualities of higher education are from its conservative urges as institutions.

Of course, you can find a disproportionate number of professors who have left-leaning social and philosophical ideologies, but the most powerful departments/colleges in higher education are often the most conservative—political science, economics—or the most apt to take non-political poses—the hard sciences.

This disconnect between how higher education is perceived and how higher education exists stems from, in part, I think, higher education presenting itself rhetorically as progressive—mission statements, social justice initiatives, etc.

However, with a little unpacking, we can expose that practices and policies often contradict and even work against that rhetoric and those initiatives.

One example that I have addressed again and again is the use of student evaluations of teaching (SET) to drive significantly the promotion, tenure, and reward process.

Consider a few points raised in Colleges Are Getting Smarter About Student Evaluations. Here’s How by Kristen Doerer:

“Having a female instructor is correlated with higher student achievement,” Wu said, but female instructors received systematically lower course evaluations. In looking at prerequisite courses, the two researchers found a negative correlation between students’ evaluations and learning. “If you took the prerequisite class from a professor with high student teaching evaluations,” Harbaugh said, “you were likely, everything else equal, to do worse in the second class.”

The team found numerous studies with similar findings. “It replicates what many, many other people found,” said Harbaugh. “But to see it at my own university, I sort of felt like I had to do something about it.”…

Studies since the 1980s have found gender bias in student evaluations and, since the early 2000s, have found racial bias as well. A 2016 study of data from the United States and France found that students’ teaching evaluations “measure students’ gender biases better than they measure the instructor’s teaching effectiveness,” and that more-effective instructors got lower ratings than others did….

Despite the data, at many colleges, particularly research-based institutions, student evaluations are still the main measure, if not the only one, of teaching effectiveness in promotion-and-tenure decisions.

Across universities and colleges in the U.S., failures of diversity and inclusion are pervasive. Poor students and students of color are underrepresented at many colleges, especially the so-called elite institutions; women and people of color are equally underrepresented on faculties.

Nothing rings more true or frustrating than Doerer’s use of “despite the data.”

I have rejected SETs directly in my biannual self-evaluation for merit raises, and I have consistently urged the administration and our faculty status committee to end or greatly reduce the influence of SETs.

In all of these situations, I have repeatedly shared the research, the data.

And without fail, those with power, who tend to be white men, offer a tepid acknowledgement of the research followed by a quick “But we have to do something.” Doerer includes a response (from a white man) that sounds all too familiar:

Ken Ryalls, president of the IDEA Center, a nonprofit higher-education consulting organization, recognizes the bias but thinks doing away with evaluations isn’t the answer. He opposes efforts to eliminate the voice of students. “It seems ludicrous,” he said, “to have the hubris to think that students sitting in the classroom have nothing to tell us.”

“The argument that you should get rid of student evaluations because there is bias inherently is a bit silly,” he said. “Because basically every human endeavor has bias.”

The “yes, but” dynamic works to maintain the inequitable status quo. And as Ryalls’s comment shows, the “yes, but” response is often a distraction.

No one is arguing to remove the voice of students, but as Doerer’s reporting confronts and as the research base shows, student evaluations of teaching are fraught with student biases that corrupt the teacher evaluation process, effectively discouraging women, people of color, and international faculty from remaining in a hostile environment with very real negative career consequences.

For example, calls to end SETs as primary or major instruments for promotion, tenure, and merit pay are often part of a larger examination of how to make student feedback more effective for teaching and learning.

Doerer notes:

That’s in large part why Oregon decided to try a midterm student-experience survey that only the applicable faculty member can view. An instructor can make changes in the middle of a semester, when students can still benefit, encouraging them to give constructive feedback.

For many years, I have asked students for feedback at midterm, and explained that I would like the opportunity to address their concerns, and also to identify what is working well, because receiving complaints after a course really benefits no one.

Further, when student feedback is for the professor only, it becomes a conversation about improving teaching and learning, and as a professor myself, I am best equipped to interpret student comments. I consistently receive feedback intended as negative by students, but I will never change those practices because the comments misunderstand my role and the students’ roles in the classroom.

Yes, student feedback is valuable, but it likely cannot be simply or easily reduced to numbers, formulas, or even verbatim interpretations of their direct words.

It has taken nearly four decades of high-stakes accountability in K-12 education for people to begin to acknowledge that high-stakes accountability causes far more harm than good.

In higher education, if equity and inclusion are real goals, we can and must seek ways for students to have safe and open spaces to provide their professors feedback, and we can and must better support faculty in interpreting that feedback in ways that improve their teaching and student learning; but to reach those goals, we must end the practice of using SETs in significant ways to evaluate faculty.

Higher education must end the tradition of “despite the data” and recognize that rhetoric means less than nothing if contradicted by practices, policies, and a culture of “yes, but.”

The Man in the High Castle and Cat’s Cradle in Trumplandia

At the very naive age of 21, I fell in love with Blade Runner (1982), unaware at the time that it was a film adaptation of Philip K. Dick’s Do Androids Dream of Electric Sheep? My formative years had been spent on science fiction B-movies my mom adored and Marvel comic books, but I remained then still only engaged with genre as a fan.

Many years later, I read Electric Sheep and was mostly underwhelmed by Dick as a novelist while recognizing his gift for ideas*, much of which was mined for what would become a Ridley Scott modern classic and cult hit.

I just finished my second Dick novel, having begun several of them over the years but finding it difficult to stay connected. The Man in the High Castle has gained a new life with the Amazon serial adaptation, and I decided to give his work another shot.

Similar to Margaret Atwood’s The Handmaid’s Tale being resurrected through serialization, Castle seems perfect for our time in Trumplandia. Many in the U.S. fear the rise of totalitarianism, but there also is an important new recognition of the fragility of truth and facts.

I must admit that once again I was underwhelmed with Castle as a novel; the central idea—an alternate history in which Germany and Japan win WWII—however, is incredibly compelling as a thought experiment.

The characters, I feel, aren’t themselves very compelling, and the main woman, Juliana Frink, especially felt superficial, even trite at times. Yet, about a third of the way into the novel, when Germany is suffering a crisis of leadership, an exchange between Juliana and her mysterious lover, Joe Cinnadella, essentially solidifies why this novel speaks so powerfully now:

high castle

It is here that I read Castle as a much more political and economic narrative version of Albert Camus’s The Stranger captured in Meursault’s musing in prison:

Afterwards my only thoughts were those of a prisoner….At the time, I often thought that if I had had to live in the trunk of a dead tree, with nothing to do but look up at the sky flowering overhead, little by little I would have gotten used to it. I would have waited for birds to fly by or clouds to mingle, just as here I waited to see my lawyer’s ties, and just as, in another world, I used to wait patiently until Saturday to hold Marie’s body in my arms. Now, as I think back on it, I wasn’t in a hollow tree trunk. There were others worse off than me. Anyway, it was one of Maman’s ideas, and she often repeated it, that after a while you could get used to anything. (p. 77)

Dick forces the reader to see that any of us can easily see our side as always in the right and the other side as always in the wrong; this Nazi/communist duality framed in the novel ultimately is revealed as a false dichotomy in the sense that no option had any real moral superiority.

When is war, or even politics, not a gruesome real-world version of the ends justify the means?

And that thematic element also brought Kurt Vonnegut to mind.

“‘When Bokonon and McCabe took over this miserable country years ago,’ said Julian Castle, ‘they threw out the priests. And then Bokonon, cynically and playfully, invented a new religion’” (p. 172)—opens Chapter 78 of Vonnegut’s Cat’s Cradle.

Bokonon has created a religion “‘to provide the people with better and better lies’” (p. 172), foma, and a central aspect of that strategy involves the orchestrated war between the government of San Lorenzo and the religion, Bokononism:

“But people didn’t have to pay as much attention to the awful truth. As the living legend of the cruel tyrant in the city and the gentle holy man in the jungle grew, so, too, did the happiness of the people grow. They were all employed full time as actors in a play they understood, that any human being anywhere could understand and applaud.” (pp. 174-175)

The false choice between McCabe and Bokonon in this other world created by Vonnegut represents well the delusion of choice in the U.S.: McCabe/Bokonon mirrors the current Republican/Democrat divide, a fake fight offering no real choice.

However, I must qualify that it has been a fake fight and false choice until the era of Trumplandia.

The policy and ideological differences among Bill Clinton, Hillary Clinton, George W. Bush, and Barack Obama are quite small—even as some of those policies have profound consequences for individuals in the U.S. and abroad.

The partisan political arena, like McCabe and Bokonon, has been compelled for political reasons to make those small differences seem dramatic, often resorting to the sort of hyperbolic language that stretches credulity.

Obama, for example, is no socialist, no communist. Obama is a centrist, a bit moderate and even liberal in his rhetoric, but he is not so far away from George W. Bush that they couldn’t reach out and dap.

This false chasm between Democrats and Republicans has perpetuated a standard cultural and political ideology for decades, a state of perpetual war and an economic system that feeds the wealthy on the backs of workers and the demonized poor.

The norm of hyperbolic partisan rhetoric now has dire consequences as some seek to confront a new norm in Trumplandia, a more insidious assault on truth with even more far-reaching negative consequences for much of the U.S. and even many beyond our borders.

Words such as “Nazi” and “fascism” are no longer vapid hyperbole, but those markers fail to resonate among many who have been numbed by partisan hyperbole and hate-mongering along party lines.

George W. Bush was mostly mainstream U.S. politics and ideology, despite the histrionics from the Left. Obama was mostly mainstream U.S. politics and ideology, despite the histrionics from the Right.

There is almost nothing mainstream or normal under Trump, although we are hesitant to admit that this new extreme has most of its roots in mainstream Republican politics that has depended on racism and misogyny for decades.

As a former high school English teacher, I am now deeply concerned that it will not be fake news that sinks this ship, but our inability to distinguish between hyperbole and honest but blunt language.


* I can draw a parallel, with a difference, here: like Dick, Milan Kundera is a powerful philosophical author, but I find Kundera a much more compelling storyteller.

How to Avoid the Tyranny of the Lesson Plan: Planning Less to Teach Better

woman holding marker
Teaching is a daily intimidating adventure, one that requires we find the confidence to enter each lesson with the board empty. Photo by rawpixel on Unsplash.

My journey to becoming a certified high school English teacher occurred during the early 1980s. My methods course work was solidly grounded in an era obsessed with behavioral objectives and highly detailed lesson plans.

This approach to preparing to teach centered content acquisition and the authority of the teacher. In many respects, I was trained to teach as if students didn’t even exist in the process.

I immediately entered an M.Ed. program since I graduated in December and would not find a full-time teaching position until the coming fall. Those courses further entrenched mastery learning, although I also had my first glimpse into a much broader array of educational philosophies, including reading John Dewey, Maxine Greene, and others I would eventually recognize as the foundation of my own critical perspective.

Many years later, after those nearly overwhelming first years of teaching when all that philosophy and theory has to be put into some sort of practice, I was well on my way to being a student-centered and critical teacher when I had a student teacher. She was a very short black woman who taught every single lesson from a script.

I immediately thought of my initial training to teach as well as this student teacher as I was reading Christine Tulley’s How to Avoid Overprepping for Your Classes.

First, after my 18 years teaching high school, I have been working in teacher education for 17 years while also helping university professors attain and improve their writing pedagogy. In both cases, I have witnessed what Tulley confronts:

I recently consulted with a Ph.D. student who was logging long nights and weekends in her office. I knew she was trying to revise her dissertation into a book and complete a book proposal, but I soon learned that she was also using the late nights to get ready for class and “keep up” with course planning. With classes and committee work scheduled during the day, she never had time to write.

When I do classroom observations, for example, my teacher candidates feel compelled to perform, believing that “teaching” is about the lesson plan and teacher behavior (again, as if students are not present).

But it is Tulley’s next point that really sparked my memory of my student teacher from many years ago:

I often see this pattern of overpreparing among the early-career faculty members whom I mentor. Many have unwittingly fallen into what Armando Bengochea terms “the teaching trap.” Bengochea notes that such overprepping is a real problem for faculty members who suffer from impostor syndrome or use course preparation as a procrastination strategy because it sounds legitimate. They often engage in extensive lecture preparation, working to fill all available class time as a protection mechanism. The result is they have to do a time-consuming deep dive into content each week to develop lengthy lecture slides or handouts. Perhaps not surprisingly, a disproportionate number of faculty of color, non-native speakers, women and other marginalized populations prepare too much for the classes they teach.

Even though I now work at a selective university with students often benefitting from a great deal of privilege, my teacher candidates are often young women, several of whom struggle against being small in stature or “looking young.”

Tulley has prodded me to understand better why I have struggled for years to help my teacher candidates understand, and practice, a key distinction I make about teaching: Teaching is not about meticulous and detailed lesson plans but about being prepared every day you enter a classroom.

In some significant ways, I am here once again addressing that teachers need both pedagogy and expertise. The urge to hyper-plan—my student teacher who scripted every lesson—is often a self-defense mechanism, but it is one that is counter to our goals as educators.

I want here to examine briefly how to avoid the tyranny of lesson plans while also building on and pushing against Tulley’s alternative to “overprepping”:

Pattern teaching is a solution I regularly offer to faculty members who seek parameters on preparing for courses efficiently and effectively. The premise is simple and not revolutionary: develop a regular pattern or structure to the class. Often instructors create such a pattern (the first 15 minutes are used to review homework, group work is always done on Wednesdays and so on) for their students’ benefit. But pattern teaching can also influence how content is delivered, making it a useful strategy for streamlining course preparation.

One nuance I would offer to Tulley’s ideas is that teachers should distinguish between planning (what we should decrease) and being prepared (an ongoing state of gaining both more effective pedagogy and greater expertise).

While I am not opposed to “pattern teaching,” I have adopted a different language cultivated in my years teaching for the Spartanburg Writing Project. We used the metaphor of writing teachers building and expanding their “teaching toolbox.”

That toolbox would be available so that daily teaching did not need to be scripted or meticulously planned. Teaching in a frame structure (for example, the writing or reading workshop guided by elements similar to Tulley’s patterns) allowed the teacher to pick and choose among the tools to apply as needed, spontaneously, in the flow of teaching.

Finally, here let me offer a few different ways of thinking about being prepared to teach daily instead of planning:

  • Create a syllabus/daily schedule and each lesson plan as tentative frames, not “that which you must execute.” The key here is that when any teacher spends an inordinate amount of time planning schedules and lesson plans, they feel compelled to follow through on that plan regardless of how it works, or doesn’t, in practice. Syllabi, daily schedules, and daily lesson plans should provide some organization and structure, but they are not exhaustive or fixed.
  • Rethink what counts as preparing to teach. Preparing to teach includes a teacher’s time spent being a student themselves, reading, researching, thinking, discussing with other teachers, etc. While Tulley recognizes many young professors lament so much time planning as a distraction from doing scholarship, I would argue all teachers at every level are preparing to teach by being scholarly; the two must not be in conflict, in other words.
  • Consider first and foremost what students will be doing in daily lesson plans. As I have noted above, too often teaching and planning to teach remain focused on teacher behaviors. The key, I think, to avoiding the tyranny of the lesson plan is to recognize that the essence of learning is student behavior, students being actively engaged in behaviors the teacher fosters and negotiates, but does not orchestrate.
  • Seek ways to build self-confidence by always being a student of how to teach and the content of courses being taught. Teaching is a state of constant learning and growing. That process occurs outside the classroom, but also in the classroom every day. Our teaching goal is to become adept at improv, not playing a role.
  • Resist the allure of being a martyr. Teaching has an unhealthy culture that includes who can make the best case about their martyrdom—lamenting in the teachers’ lounge or posting on Facebook about hours and hours spent planning and grading. There is clearly something compelling about this, but I believe it is ultimately not personally or professionally healthy.

I certainly understand why beginning teachers at all levels are drawn to over-planning, even scripting daily lessons. But I also recognize that this urge has more to do with matters other than teaching and learning.

The lesson plan outlined down to the exact minute and governed by the teacher may leave no space for problems or look effective and efficient to anyone watching the play work out. What is sacrificed, I am certain, is student engagement and that teacher’s emotional reserves. This is not sustainable.

Teaching is a daily intimidating adventure, one that requires we find the confidence to enter each lesson with the board empty.

The King’s English, Social Media, and the Digital Era

Jeff Somers poses this about Ray Bradbury’s Fahrenheit 451:

Collective cultural memory suggests Fahrenheit 451 is about censoring books…. But dig deeper into Bradbury’s own discussions about his novel (and carefully reread the text) and you’ll see the author was really obsessed with the encroachment of technology, especially television, on the tradition of the written word. Bradbury positions the burning of books as a symptom of what’s happened to society, not the cause—he’s much more interested in the erosion of critical thought and imagination caused by society’s consumption of media.

This argument frames the dystopian novel as a powerful and prescient commentary on the nature and status of language in our current era of social media (Twitter, etc.) and digital text (from Kindle to the Internet).

Bradbury explained that his novel is about “being turned into morons by TV.”

Even as some wring their hands about the death of print, we mostly in 2019 take that print for granted, rarely, I think, considering the importance of the printing press to the development of humanity, and even thought itself.

The importance of fixed language, or the possibility of fixed language, began with the printing press, and Bradbury imagined a logical conclusion well past his lifetime, one in which other forms of technology would dwarf print communication just as print had dwarfed what came before.

At the end of the novel, readers discover that people have memorized books, becoming organic, living Kindles, of sorts, to preserve the fixed nature of language. Before print, narratives flourished in oral forms, the tellings and retellings perpetuating and changing those narratives along the way.

I suspect the sky is not falling in terms of print text now because I recall while teaching high school English that the same sort of doomsday warnings sprang up in the era of MTV and music videos. Videos, some warned, would not just kill the radio star, but were going to kill print.

English teachers were urged to pivot away from so much focus on print text, writing, and toward video communication; watching was the new literacy. Unlike Bradbury, these fear merchants failed to anticipate messaging over computers, the growth of email, and the advent of text messaging on smart phones and social media—all of which reshaped and propelled the importance of keyboarding and text (even as much of that is virtual).

The world shifted rather quickly away from music videos (MTV morphed into reality TV), toward cell phones with miniature keyboards (think BlackBerry), and then touchscreen cell phones with integrated keyboards (even the iPad has bowed to the market popularity of having a keyboard).

Print—fixed language—is an enduring aspect of human communication, and humanity itself, it seems. But the printing press and making language somewhat permanent resulted in another often ignored development—the rise of prescriptive rules for language (grammar, mechanics, spelling, and even style).

The rise of what many call simply “grammar books” because of their use in formal schooling reveals more about power than language itself. Proper use of language in English once carried the term “the King’s English.” It is there we should pause for a moment.

Linguistics professor John McWhorter has leveled a critique of Donald Trump, not so much for his presidential politics as for his language, notably on Twitter.

“The president of the United States has many faults, but let’s not ignore this one: He cannot write sentences,” McWhorter begins before cataloguing a pretty hefty list of Trump’s unusual uses of language on social media—odd capitalization, garbled spelling (apparently not copyedited by anyone), and typos.

From that evidence, McWhorter proclaims: “Trump’s serial misuse of public language is one of many shortcomings that betray his lack of fitness for the presidency.”

While some may find—as I do—McWhorter’s critique linguistically prudish, the stale prescriptivist rant, he makes two important, although complex, points: “Trump’s writing suggests not just inadequate manners or polish—not all of us need be dainty—but inadequate thought” and “One must not automatically equate sloppy spelling with sloppy thinking.”

I fear many people will not read McWhorter’s analysis as carefully as he intended, so I want to emphasize his use of “suggests” and “not automatically.”

Emily Dickinson and e.e. cummings played thoughtfully with capitalization and lower case letters. William Shakespeare manufactured quite a few words.

While there certainly is a case to be made for standardizing language to aid communication, the automatic and abrupt association of so-called nonstandard language in print form with “inadequate thought” is very dangerous.

If we return to the rise of “the King’s English,” we must be reminded that prescribing rules was far more often about power than the linguistic integrity of any language. Early grammar texts for English imposed (without any real linguistic justification) mathematical concepts onto language (no double negatives!) and wrestled English into Latin constructs (do not split infinitives!) because English was viewed as inferior as a language.

But even more important in that process is that “the King’s English” was mostly an effort to fix, make permanent, the ruling class’s language, one honed through formal education and in the privileged context of access to print text (which was incredibly expensive). Literacy was a wedge among the so-called classes, notably a mechanism used to keep power in the hands of those already in power.

There is more to the politics of “the King’s English” as well: the direct connection between the so-called proper use of English and moral character. The earliest arguments for correct use of language claimed that proper language reflected a person of high moral character, and the inverse as well. Of course, this was gross propaganda to portray the ruling class as deserving their privilege and the poor as deserving their poverty.

So I am left with a predicament in terms of McWhorter’s analysis of Trump’s use of language, especially as Trump represents the state of language in an era of social media and digital text.

I am not buying McWhorter’s prescriptivist bent even as I recognize we must critique and then reject “Trump’s serial misuse of public language” as an issue of dishonesty and “inadequate thought.”

If Trump himself or someone on his staff suddenly found the impetus to copyedit Trump’s public rants on Twitter and elsewhere, that would in no way absolve Trump of his lies and abuse of status and power.

To nitpick about Trump’s so-called correctness in matters of mechanics, grammar, and style is too much like those concerned with Trump’s ill-fitting suits and his god-awful hair and orange skin-glow.

Trump ascended to the highest office in a free country mainly as a careless businessman and reality TV star, more bravado than anything else.

There’s too much of substance we must be confronting instead of the surface where he has flourished.

Playing grammar Nazi with Trump’s Tweets is a simplistic distraction from the very real threat of Nazis in 2019 America.

Nero fiddled, Trump (more reality TV star than business man) Tweets (badly). But, you know, the fires.

Charter Schools Fail SC: A Reader

Nationally, momentum has been building toward political and public recognition that the education reform movement begun in the early 1980s has fallen well short of promises. This failure was identified throughout the accountability era by educators and scholars, of course, but political leaders and the public chose to ignore those with experience and expertise in their own field.

The problem with the reform movement included a refusal to acknowledge the primary problems in our public schools—overwhelming poverty and inequity of opportunity along social class and racial lines—and ideological commitments to the accountability paradigm (standards and high-stakes testing as well as a focus on so-called teacher quality), despite those solutions in no way matching the ignored problems.

A subset of that movement has been the rise of charter schools, which served to bridge a political divide between school choice advocates on the right and public school advocates on the left. Charter schools are touted as public schools, but they also are driven by many elements (the worst kinds) of market forces.

Even with their popularity, charter schools constitute a very small percentage of schooling in the U.S. (data from Education Week):

  • Traditional public schools: 91,422 (2015-16, Source)
  • Public charter schools: 6,855 (2015-16, Source)
  • Private schools: 34,576 (2015-16, Source)

And thus: “According to data from three years earlier, 2.8 million public school students, or 5.7 percent, are in charter schools.”
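As a quick arithmetic sketch (my own back-of-the-envelope check, not part of the Education Week reporting), the figures above imply both the charter share of public schools and the total public school enrollment behind that 5.7 percent:

```python
# Figures quoted above (Education Week, 2015-16)
traditional_schools = 91_422
charter_schools = 6_855
charter_students = 2_800_000   # "2.8 million"
charter_student_share = 0.057  # "5.7 percent"

# Charters as a share of all public schools (traditional + charter)
school_share = charter_schools / (traditional_schools + charter_schools)
print(f"{school_share:.1%} of public schools are charters")  # about 7.0%

# Total public school enrollment implied by the student figures
implied_total = charter_students / charter_student_share
print(f"implied total enrollment: about {implied_total / 1e6:.1f} million")
```

In other words, charters amount to roughly 7 percent of public schools while enrolling under 6 percent of public school students.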

Here is what we know about charter schools, then, messages repeated by educators and scholars for many years. Charter schools do not outperform public schools because they are charter schools (just as private schools do not outperform public schools).

When charter schools claim to outperform public schools, the reasons often lie in serving different populations (notably concerning ELL and special needs students), having the ability to select or counsel out students, and other policies and practices that public schools often cannot or do not implement (longer school days and years, for example).

Charter schools, like all school choice, contribute heavily to segregation—one of the serious problems lingering in public schools today.

Recent reporting at the Post and Courier (Charleston, SC) may suggest the tide is also turning against charter school advocacy trumping evidence.

This media recognition matches messages I have been sending for many years, including damning analysis that charter schools in SC mostly perform the same as or worse than comparable public schools.

And my analysis of two years of data on SC charter schools has shown:

  • Using 2011 SC state report cards and the metric “Schools with Students Like Ours,” charter schools performed as follows: 3/53 ABOVE Typical, 17/53 Typical, and 33/53 BELOW Typical.
  • Using 2013 SC state report cards and the metric “Schools with Students Like Ours,” charter schools performed as follows: 2/52 ABOVE Typical, 20/52 Typical, and 22/52 BELOW Typical.
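Converting those tallies to percentages makes the pattern easier to see. A minimal sketch (my addition; the counts and denominators are as quoted above, and note that the 2013 categories as quoted account for 44 of the stated 52 schools):

```python
# Report-card tallies quoted above; denominators ("n") are as stated.
tallies = {
    2011: {"n": 53, "above": 3, "typical": 17, "below": 33},
    2013: {"n": 52, "above": 2, "typical": 20, "below": 22},
}

for year, t in tallies.items():
    for cat in ("above", "typical", "below"):
        pct = 100 * t[cat] / t["n"]
        print(f"{year} {cat.upper()}: {t[cat]}/{t['n']} = {pct:.1f}%")
```

In both years, far more SC charter schools rated BELOW Typical than ABOVE (roughly 62 percent versus 6 percent in 2011).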

Here, then, is a reader to further reinforce how charter schools fail SC, particularly in terms of re-segregating a system long-plagued by race and class inequity:


See Also

Challenging the market logic of school choice: A spatial analysis of charter school expansion in Chicago, Stephanie Farmer, Chris D. Poulos, and Ashley Baber

ABSTRACT

Corporate education reformers take for granted that market competition in the public schools system will improve education conditions. We conducted a spatial analysis of Chicago Public Schools, examining the spatial features of charter school expansion in relation to under-18 population decline, school utilization, and school closure locations. Our findings indicate that 69% of new charter schools were opened in areas with significantly declining under-18 population and approximately 80% of charter schools were opened within walking distance of closed school locations. Our findings show, contrary to corporate education reform logic, that a competitive charter school market created spatial and financial inefficiencies resulting in school closures and systemwide budgetary cuts primarily impacting distressed neighborhoods. We explain the overproduction of charter schools through the lens of the firm-like behavior of charter school operators driven by a self-interested growth mandate that can undermine the stability of the public schools system as a whole.

Death Takes a Lifetime, and then a Year

& how the last
time I saw you

“Maps,” Yesenia Montilla

wareshoals
My nephew Steven found this yearbook picture of my mother, Rose (circled), from Ware Shoals High (South Carolina).

Mid-afternoon on 7 January 2019, my oldest nephew Steven (on my side of the family, we call him Tommy) texted that he needed to meet with my middle nephew, Kendall, and me. He had checks and forms for each of us to sign.

This was the final probate meeting for my mother’s and father’s estate—although having grown up working-class, I find that term more than misleading.

None of us anticipated what eventually transpired that afternoon: The probate court transferred all of my father’s matters (he died several months before my mother) to my mother, and then her probate was settled with their will dispersed as they planned.

Pressed for time, I met Steven (Tommy) in the parking lot of Best Buy just 10 minutes or so from my house. We hugged, and he handed me a check and the form I had to sign as well as find someone to witness the transaction.

Steven had medical power-of-attorney and was the executor of the will so he wasn’t allowed to sign the form, which in legalese confirmed that I was receiving my share of the will, all of my mom’s accounts and such having been fairly and fully disclosed.

My nephew offered to let me see anything if I was concerned, although he had meticulously shared every possible detail and artifact throughout the long, arduous process over the year-plus since my mother died of stage 4 lung cancer discovered a few months after she suffered a debilitating stroke.

I waved him off and said simply, “I trust you.”

And I do. He is a good and careful person, especially when it comes to my parents, his grandparents, and like my other two nephews, he loved my parents genuinely, more like parents than grandparents.

Since my parents raised those three grandchildren, my nephews split equally with me the remnants of my parents’ lives. There are some messy and uncomfortable details underneath that, but in the end, my parents made the consequences of their deaths about as simple and direct as possible. And anyone who could quibble chose not to do so.

On a Monday afternoon in January—the birth month of my father and me as well as the month my parents were married—those remnants were quartered after about 13 months of the state (in this case, South Carolina) prolonging the end of their lives by keeping their estate open to the public for anyone wishing to make a claim against it.

So I deposited the check and I signed the form, asking a staff person in my department to sign as a witness to the obvious fact that I am well aware of what now constitutes my parents’ lives.

|||

Over the Xmas holiday break, I sat with a few friends at a favorite taproom watching Hoarders. I am not a fan of reality TV, and this show in particular makes me very uncomfortable.

I am beyond skeptical about capitalism and consumerism; I also have an unhealthy (but functioning) dose of OCD, enough to understand hoarding (I am a collector, the socially acceptable form of hoarding), to empathize with being a victim of one’s own compulsions.

Several episodes ran as we talked, watched, and drank beer throughout the afternoon. Yes, I found myself mesmerized, equal parts fascinated and horrified at these lives swallowed in mountains of acquired stuff that both defined and paralyzed these people.

Episode after episode documented the inevitable: What hoarders had deemed essential—that which they could not part with—was ultimately tossed by volunteers wearing gloves, protective suits, and face masks into large waste dumpsters.

This past summer, it took some coaxing, but my nephews and I eventually rented a waste dumpster and dragged and tossed a huge portion of my parents’ lives into it as it sat ominously in their driveway. Their precious house had to be emptied so that we could sell it.

My parents’ lives reduced to trash for the landfill and then 4 checks as detailed by their will—the final material, financial, and legal remnants of two lives lived until they died followed by the state mandating another year before their deaths could be officially over.

Death takes a lifetime, and then a year.

|||

The final check I received was a bit more than I had expected. I now contemplate what to do with the money, wondering what final gestures would please my parents if they could witness the scattering of their lives, like my mom’s ashes we spread at Myrtle Beach.

Those dollars and her ashes, in fact, haunt me as I weigh them against two people’s lives and their living bodies. The balance is disturbingly out of kilter.

My mom just an oddly dense box of ashes. My parents’ entire lives just 4 checks spread among checking accounts as so much electronic data.

It all feels very heavy. It all numbs me with the unbearable lightness of being.

|||

Several years ago, when I came to my university, first-year students were assigned a common book to read over the summer before entering college. Once the selection was Blood Done Sign My Name by Timothy B. Tyson.

While several colleagues gushed over the memoir, I found myself mostly irritated, both at the gushing itself and at the book. My problem was grounded in not finding anything remarkable about Tyson’s experiences because it was a South I knew first-hand and lives I found familiar.

But it was also a collection of experiences I was still trying to move beyond—if not understand and reconcile with my current self in some way.

I have little patience with poor and working-class white-folk narratives. I am particularly critical of the Othering of rednecks from the South—like exotic zoo animals or museum displays.

It is not, I want to yell, that I used to be that redneck. I am that redneck.

I just have a doctorate. I am allowed to live my life in the mostly rarified air of academia. Unlike my father who could barely raise his arms because of his arthritic shoulders.

In fact, you could see my father’s life of manual labor in his giant gnarled hands and fingers, in the stooped, shuffling man sitting in a wheelchair the day he died beside my mother, simply needing to go to the bathroom.

Writing about the most recent poor-white-folk memoir, J.D. Vance’s Hillbilly Elegy, Stanley Greenberg argues:

The book’s cascading errors begin with its failure to appreciate how exceptional Appalachian white history and culture actually are, and how dangerous it is to equate Vance’s hillbillies with today’s white working class. Yet that is the equation Vance makes at the very beginning of his memoir.

I think I have loathed Vance’s thinly masked conservative screed far more than Tyson’s romanticizing because I am a few years older and I have weathered the actual demise of the embodiments of my struggling—my parents who I have loved deeply while also having to recognize them for all their very troubling flaws.

|||

Things pass, like all humans.

Sometimes we feel things deeply, too much, and we let ourselves cry, or laugh, or even shout.

But the human machine cannot maintain that level of response to this world. It’s just too much to care all the time.

Some of my friends, after watching Hoarders, wanted to rush home and purge. At least one did. But all of us, given a few days, simply went back to consuming, the sort of socially acceptable collecting that makes us fully human in the good ol’ U.S. of A.

Mom and Dad—because my nephews and I decided to reduce their home, our home, to money—left behind that which allows me to consume, buy more stuff. The allure is goddam powerful.

Turn a small portion of my parents’ house into a new bicycle or an iPhone upgrade.

I am lost in this and the realization we are merely human, doing the best we can even though that often falls quite short:

Death takes a lifetime, and then a year.

When Ideology Trumps Evidence, Expertise

How do humans know the world? That answer is very complex, of course, but each of us begins understanding the world through our senses.

At the most basic level, we can explain “knowing the world” as an on-going interaction between our genetics and the experiences we gather from that world through our senses. As we mature, particularly as our brain develops, and thus our ability to use cognition (thinking), we are more able to think through our sensory perceptions (slow down and even change our responses) than merely react.

This dynamic is incredibly important as we try to understand the distinction between correlation and cause. Humans, however, are hostages to ancient evolutionary impulses that often contributed to our survival; in other words, in the earliest years of human existence, making abrupt causal assumptions (which may often have been mere correlation) was preferable to making more deliberate decisions because of the primary need simply to survive.

Contemporary humans not currently in dire environments or under the stress of poverty, oppression, or disease (for example) have the privilege of cognitive deliberation: Many of us in relatively stable and safe lives can (and should) be more careful about drawing causal or correlational conclusions. We should be far more deliberate about “knowing the world,” basing that knowledge on more than our personal experiences, grounding it in robust evidence, and resisting the allure of knowing the world through mere ideology.

In many of my courses, I ask students to consider all that through one simple thought experiment grounded in our sense of smell, “closely linked with memory.” I ask students to recall a first visit to a friend’s home and the realization that other people’s houses smell different.

Photo by Annie Spratt on Unsplash

Many, if not most, students begin to nod and even smile, recalling the experience. I then ask them to interrogate how they reacted to the house smelling different, and we conclude that our urge is to think of the different smell as bad or wrong.

Here, I think, is a powerful example of how human experience, cognition, and ideology conspire to derail human potential.

Recently on Twitter, I joined a discussion about charter schools, specifically the contentious debates about the charter chain KIPP.

Stepping back from the topic of charter schools itself, the nature of the advocacy for charter schools is a microcosm of the problem I noted above. Charter schools (6,855) are a very small fraction of public schools (91,422) in the U.S., and only 5.7% of students attend charter schools (see data here).

At one level, then, the public and political debate and discourse about charter schools are both disproportionate and distorted by advocacy driven by ideology and not evidence and expertise.

That dynamic is driven by a belief that charter and private schools are outperforming public schools, which have suffered under a very long history of being characterized as failing. Yet, research has shown time and again that type of schooling has no real causal relationship with so-called school quality; in short, charter, private, and public schools all have about the same outcomes when the conditions of schooling are held constant.

When charter schools boast of superior outcomes, the truth lies in many factors—such as underserving significant populations of students or the ability to choose or “counsel out” students—that make a comparison with public schools misleading at best and false at worst.

The charter school phenomenon represents the problem with ideology driving public policy at the expense of evidence and expertise.

Now, as I noted, charter schools and students attending charter schools are relatively small populations, and thus in the grand scheme of funding and public policy, my discussion here may seem as disproportionate as the debate itself.

My concern is that the charter school dynamic is just one aspect of a much more insidious problem with the U.S. persisting as a belief culture, particularly in terms of the political and public faith in equity, equal opportunity, and our having reached some sort of post-racial (and post-racist) society.

If we dig deeper in the charter school debate and the persistent antagonism toward public schools, we see a powerful racial element. U.S. public schools now serve a majority-minority population of students (white students constitute 48.9%), and what we can say about charter, private, and public schools is that all types of schooling have witnessed an increase in segregation.

Beliefs about school quality cannot be disentangled from beliefs about race.

Let’s place the charter school debate in the context of how the public perceives racial equity. Blacks and whites alike grossly mischaracterize both historical racial inequity and current racial inequity.

As an interview with Michael Kraus details:

For instance, one question in the study asked: “For every $100 earned by an average white family, how much do you think was earned by an average black family in 2013?” The average respondent guessed $85.59, meaning they thought black families make $14.41 less than average white families. The real answer, based on the Current Population Survey, was $57.30, a gap of $42.70. Study participants were off by almost 30 points.

The gap between estimate and reality was largest for a question about household wealth. Participants guessed that the difference between white and black households would be about $100 to $85, when in reality it’s $100 to $5. In other words, study participants were off by almost 80 points. Participants were also overly optimistic about differences in wages and health coverage.

If we allow public policy to be driven by belief, we find no political motivation for that policy addressing the realities of racial inequity:

Michael Kraus argues that these misperceptions fit conveniently with the idea of the American dream—that every individual, regardless of background, can succeed with talent and hard work. “Those beliefs can lead us astray, can lead us to not see the world for what it is. There’s a lot of work that still needs doing if our economic reality is going to match up with our narratives of opportunity.”

The irony is that believing the American Dream already exists prevents the U.S. from attaining the American Dream of racial equity.

As an educator for almost four decades now, I must share a final thought on evidence. Despite my best efforts—for example, when we examine evolution and how the U.S. compares internationally on acceptance of evolution—students themselves remain resistant to setting aside their beliefs and embracing a more accurate understanding of the world based on evidence and expertise.

From corporal punishment to school safety to grade retention, when I engage students or the public, most people remain committed to their beliefs, refusing to engage with evidence and often discounting expertise.

So the really sobering reality about how we know the world is that too many of us are failing the evolutionary curve toward knowing the world based on evidence and expertise instead of imposing our ideologies onto that world.

The consequences of this are dire, especially to the most vulnerable among us.


See Related

Unlearning the Lessons of Hillbilly Elegy, Stanley Greenberg