On Reading and Comic Books: A Journey from 1975 to 2021 (and Beyond)

She was born in November 1963/The day Aldous Huxley died/And her mama believed/That everyone could be free

“Run, Baby, Run,” Sheryl Crow

The summer of 1975, I was diagnosed with scoliosis and fitted with a form-fitting plastic body brace anchored with aluminum rods and spanning from my pelvic bone to my chin. This was a hell of a way to start my ninth grade at Woodruff Junior High.

I would wear that brace 23 hours a day, gradually weaning myself off the support as my vertebrae both (mostly) repaired their disfigurement and eventually stopped growing; this meant I wore the brace for much of my high school experience as well.

My childhood and teen years were a contradiction of Southern racism, ignorance, and bigotry warmly wrapped in the blanket of my loving and doting working-class parents. My scoliosis was a significant financial burden on my parents (who never flinched at the medical care it required), but it also in some ways broke their hearts.

I was a skinny and very anxious human, deeply self-conscious and introverted before the years of the brace came upon me in the roiling shit-storm of adolescence.

It was at this juncture of my life that I discovered comic books, what now seems like a logical extension of the fascination I inherited from my mom for science fiction (she loved classic black-and-white B-movies, always claiming The Day the Earth Stood Still as her favorite film).

Once again, my parents never wavered when I began collecting and drawing from Marvel comics in the mid-1970s. They drove me to the local pharmacies to buy new comics and even bought a pretty large and important collection from a guy selling hundreds of comics in the local newspaper.

By high school graduation, I had amassed essentially every comic book Marvel published in the 1970s.

It would take me many years to recognize that my comic book collecting and science fiction reading were the foundation upon which I eventually chose to be a high school English teacher and came to recognize that I am a writer (although I initially clung to being a comic book artist since I spent hours and hours standing at our kitchen bar drawing from the comics I collected). (See my original artwork from the mid-/late 1970s below.)

Just thirteen days away from turning 60, I am baffled at not being able to specifically identify when I stopped collecting comics, sometime around graduating high school and attending college. I assume it seemed childish at some point, even though I kept my 7,000-book collection well into marriage.

I do know that when we bought our first townhouse, I sold that collection for way less than it was worth in both dollars and for my soul. I held onto the full run of Howard the Duck, but let everything else fund my misguided pursuit of the corrupted American Dream—home ownership.

At some point in the late 1980s and early 1990s, I briefly returned to collecting, prompted by several of my high school students and the Frank Miller rebooting of Batman as well as the Tim Burton/Michael Keaton films. This coincided with the 1990s boom/bust of mainstream comics by Marvel and DC, and once again, adult life kept me from really fully engaging in something I love.

When I moved to higher education in 2002 after 18 years teaching high school English, I found a way to merge my adolescent love for comic books and my adult life—comic book scholarship and blogging. I also published one book on comic books, which allowed me a justification for buying comics and graphic novels once again (and a way to move beyond super hero comics). I learned a great deal (and made several embarrassing mistakes) when I merged my fandom with my scholarship, but that work about a decade ago, once again, didn’t really stick—although it certainly didn’t fade away either.

Recently, I allowed myself to re-commit to collecting, focusing on Daredevil and then adding the newest Wolverine run. I am back engaged with a local comic book store just minutes from where I live, and I also collected the recent X of Swords run from Marvel. (See part of my Xmas haul below.)

And yesterday, something very interesting happened for me, again just two weeks from turning 60.

Concurrent with my reconnecting with comic book collecting, I have been embroiled in the newest reading war around the “science of reading” and also making a very feeble attempt at learning to play video games (initially Minecraft).

I never became a gamer because I have always struggled with the controls, and in my advancing age, that hurdle has become even more pronounced. But I also experienced a significant amount of disorientation as well as feeling extremely (for lack of a better word) dumb.

Starting a game left me paralyzed, repeatedly asking what I was supposed to do. I often was coached with this advice: Just explore and watch for what the game shows you to do.

That meant nothing to me, even less than nothing. In fact, I soon realized that I was simply unable to read the video games while experienced gamers have internalized hundreds of signals and cues to the point that “what you are supposed to do” seems obvious (see this on gaming, for example).

One of my foundational complaints about the “science of reading” movement has been its embracing a simple view of reading, and here I was, at 60, experiencing how incredibly complex reading is—that reading is far more than decoding print (and is even often apart from print).

Gaming, like reading comic books, is a holistic experience with text as well as images, all guided by prior knowledge and experiences, and the blending of many different kinds of codes that are both unique to a single environment and common across the medium/genre/form.

The subgenres of gaming have commonalities, just as subgenres of comic books (super hero comics, for example) do.

Although I have recognized myself as a writer for forty years now—and never lift a pencil to draw any more—I was pulled back into comic book collecting because of the artwork, first Daredevil (a series that has always had distinct and powerful artists working on the character, in my opinion), then the rebooted Wolverine series, and now the incredible artists working on X-Men.

X-Men vol. 5, issues 5 and 6 (cover art: Leinil Francis Yu and Sunny Gho)

In several of my college courses, I have integrated comic books and graphic novels, often to students who have never read comics. They almost always admit that reading comics is much harder and takes much longer than they expected. It wasn’t, they discovered, like reading a text-only essay or book.

As I have been diving back into the X of Swords series and the rebooted X-Men series spearheaded by Jonathan Hickman, I have noticed my haphazard reading style of comics, very art-based and not very sequential (I glance around the entire spread and often dart back and forth among the text and panels).

And so here is the very interesting thing from yesterday.

In issue 4 of X-Men (vol. 5), Magneto quotes Aldous Huxley:

A sucker for literary references, I paused to search the quote, and then returned to reread the pages leading up to and after the use of the quote. Then, I realized something unusual that I had not noticed when first reading:

X-Men vol. 5, issue 4 (Hickman/Yu)

The omission of “care.”

Every time I read this, I still insert “care” automatically and have to force myself to see that it isn’t there (as if Professor X is doing it for me each time).

There are dozens of cues in those three panels, some of them text (and one of them the absence of assumed text).

As I count down the days until I turn 60, I am living some of the fantastical elements we associate with children’s stories, comic books, and science fiction—a pandemic, a Capitol siege, and the many eras of my own life overlapping with each other as if I am both living my current life and going back in time.

Life is no comic book or video game, but I am tasked with making sure, as I explore the things around me, that I pay attention to all the cues of what I am supposed to do—and it remains as complicated a task in 2021 as it was in 1975.

Thinking Critically about Critical Thinking in the Era of Trump and TikTok

While it now seems like generations ago, in the spring of 2008, I joined other faculty at Furman University in an organized protest labeled “We Object.” Through the university’s connections with FU graduate and former governor of South Carolina Mark Sanford, George W. Bush was invited to speak at commencement.

Recent university tradition had been to have two students speak, with no outside speakers. Nonetheless, students and the community (overwhelmingly conservative) seemed to welcome the opportunity to have a two-term Republican president speak to graduates.

The protest took many forms, including reaching out to the media, posting an official “We Object” statement, and wearing “We Object” shirts, revealed from beneath professors’ gowns during the speech.

I did not yet have tenure as an assistant professor, but I was active in the organized resistance that included a wide range of reasons why professors were objecting. I attended meetings, helped with the statement, and provided interviews to the media (I did not stand and protest during graduation, however).

One aspect of that spring that now looks like a harbinger of the world in which we live today was an Op-Ed published by two conservative professors in political science. In that piece, they discounted the professors protesting as postmodernists.

Two problems stand out from that commentary. First, as is typical of conservative thinkers, they either did not understand postmodernism or willfully misrepresented postmodernism in order to have a straw man to attack. Second, when those of us protesting gathered after the piece was published, we uniformly confirmed that not a single one of us considered ourselves postmodernists (an intellectual movement now well in the past, supplanted by the ever-inane, in fact, post-postmodernism).

Conservatives have long posed postmodernism as a full rejection of truth/Truth (which it isn’t), but the great irony of being slandered as postmodernists is that those of us objecting were all doing so on very clear ethical grounds.

A logical and dangerous extension of postmodernism’s challenge to the nature of truth/Truth is, of course, that there is no truth; many academics quickly rejected that path. In its purest form, however, postmodernism attempted to emphasize that truth/Truth is never objective but always a pawn of those in power.

In other words, postmodernism posed that truth/Truth is almost always what people in power say is truth regardless of empirical evidence (truth couched in power versus truth gleaned from evidence).

While scholars in philosophy, literature, and the arts had moved through and past postmodernism in many ways, this moment in 2008 certainly was a harbinger for the conservative and popular bastardization of postmodernism by Trump and the youngest generations in the U.S.—fake news and the power of social media to create (distort) truth/Truth.

The paradox of Trump is that he has become the embodiment of “there is no truth except what I declare is true” (even when those claims are baseless and repeatedly self-contradictory). Yes, Trump’s appropriating “fake news” to prop up his pathological lies and power-mania is exactly the worst of the problems with truth/Truth that postmodernism was confronting.

Even Trump’s use of the term “fake news” is itself false (an ignorant or willfully planned use similar to the one used by the two conservative professors), but Trump’s mendacity and megalomania have both spoken to and emboldened a much wider and more insidious faction of the U.S. who function with the same sort of misguided approach to truth/Truth as Trump.

Not so long ago, Fox News and Rush Limbaugh seemed like mostly harmless sideshows, things of a very small minority of people in the U.S.

In 2021, Parler and Breitbart have far surpassed what was once rightwing media—and then there is QAnon.

Just as there was a logical and dangerous natural conclusion to postmodernism, there is now a very real and dangerous outcome of simplistic approaches to critical thinking as well as honoring the democracy of ideas.

The Right in the U.S. has leveraged challenging any and every idea, fact, and authority into a chaos that allows even a greater concentration of power among very few (mostly white and male) Americans.

Republicans have aligned themselves with both Evangelical Christian conservatism and authoritarianism; Democrats have increasingly become the party of ethical challenges to the status quo (a party that at least pays lip service to gender, race, and sexuality equity).

Trump’s “fake news” ploy is a scorched-Earth policy for political and financial gain.

What has happened, however, in the wider society is much more disturbing in the sense that we can see some possible end to Trump as president.

Here is just one odd and troubling example: Young people in the U.S. (often on TikTok) do not “believe” in Helen Keller.

Writing on Medium, Isabella Lahoue concludes:

Maybe we [Gen Z] don’t believe in her [Helen Keller] because we’re growing up in a world of fake news. We know the power of manipulation and lies in the media, and we’re losing faith in the sources everyone once trusted. There’s too much data and too many lies circulating for us to process and believe it all….

We don’t have to believe in Helen Keller, and it shouldn’t be surprising if we don’t. The world we were born into makes us profoundly different than other generations, and hopefully, it will also make us into change agents.

The Generation that Doesn’t Believe Helen Keller Existed by Isabella Lahoue

In 2021, then, there are now at least three Helen Kellers: the historical Keller (the radical socialist and activist), the myth of Keller as rugged individual [1] (the distorted version often taught in school through The Miracle Worker), and the “fake news” Keller who did not (could not?) exist.

At the root of this is critical thinking: how formal education fails to teach it by mis-teaching it (see here and here).

Questioning authority and hearing all sides have long been a part of American culture.

Like postmodernism, “critical” is too often misunderstood and almost entirely absent from formal education.

Traditional schooling has reduced “critical thinking” to skills (such as HOTS, higher-order thinking skills). This approach reduces being critical to a checklist of skills and a mechanical approach to interrogating texts and ideas.

But while education has been lazy and superficial in its approach to being critical, popular culture has gone off the rails, specifically because of the power of social media to allow and foster insular communities in which that community establishes truth/Truth and controls what counts as evidence (Facebook, Twitter, reddit, etc.).

To be blunt, the anti-vaccination movement has gone mainstream—and widespread [2].

Since the insurrection at the Capitol, I have circled back to 2008, when I was mislabeled a postmodernist.

Not a postmodernist, I am a critical educator, my work grounded in Paulo Freire’s critical pedagogy.

Unlike those who suggest I believe there is no truth/Truth, my critical teaching and writing are a pursuit of both truth/Truth and that which is ethical and moral.

Critical thinking, then, is not about rejecting truth/Truth, but acknowledging that truth/Truth is always couched in power. Nor is critical thinking about hearing all sides; it is about recognizing that determining what is credible and what is not, when interrogating a text or idea, is a complicated but necessary task.

Critical thinking allows anyone to realize that Helen Keller was a real person, a complicated human made exceptional due to challenges beyond her control. But critical thinking also allows anyone to know that rugged-individual Keller is in many ways a lie, part theater and part ideological myth-making—and that Keller denial is a dangerously frivolous thing (several magnitudes less so but overlapping with Holocaust denial).

Critical thinking allows anyone to realize there is a wide and complicated gray area between “Believe no one” and “Listen to everyone.”

Those two extremes, in fact, have joined hands and are poised to destroy democracy and the sort of slow and painful arc of history reaching for justice on a darkening horizon.

If and when Trump leaves office, and if and when he fades from public spaces, we will still have TikTok (or something like it) and Parler (or something like it) and tens of millions of people who don’t believe in Keller but do believe Trump (or someone like him).

It is again a critical time for truth/Truth.


[1] See also how Pat Tillman suffered a similar fate, being misrepresented for ideological/political purposes.

[2] I recommend A Game Designer’s Analysis Of QAnon by Rabbit Rabbit as one interesting look at how this happened with QAnon.

The Politics of Calling for No Politics: 2021

As a part of the education community, I noticed two immediate responses to the insurrection at the U.S. Capitol by domestic terrorists seeking to disrupt the confirmation of the next President of the U.S.

One response anticipated that (once again) teachers would be on the front line of addressing trauma by suggesting ways that examining the riot in DC could be (should be) incorporated into the classroom—notably for those teachers dealing with history.

Another response, however, was the both-sides warning calling for no politics in the classroom.

Some educators received an email identical to one shared after the November elections, essentially telling teachers not to take political sides in the classroom.

We stand in the first weeks of 2021 once again needing to clarify language and confronting just what being “political” means.

First, to remain neutral or to use the “both sides” approach (or to remain silent) is a form of politics—often imposed by those with power onto those who fear for their jobs (notably teachers in non-union states).

“Politics” is simply the negotiation of power between and among humans; in other words, all human behavior is political.

Many demanding “no politics” are in fact confusing “political” with “partisan.”

2021 is providing a vivid and disturbing example of “partisan” through the behavior of Republicans who have for four years yielded all ethical ground to Trump in order to protect their partisan power in the White House and Senate.

With the insurrection at the Capitol, we have witnessed cowardly backpedaling (Lindsey Graham) and the most disturbing doubling-down on partisanship (Ted Cruz et al.).

The politics of calling for no politics is both a paradox (since the ones in power demanding “no politics” are themselves being political) and the worst sort of ethical abdication.

The horrific four years of Trump have been fueled in part by calls for civility and by a simplistic belief that people can just get along if they have a difference of “politics.”

The last weeks of Trump have proven beyond a shadow of a doubt that rejecting Trump is not about his being a Republican. We are rejecting Trump on moral and ethical grounds; there is no compromise between white supremacy and human equity, no compromise between fascism and democracy, no compromise between lies and truth.

Trump’s minions storming the Capitol must not be read along partisan lines even though it is an easy thing to do.

For example, in 2016, Hillary Clinton gained 3+ million more popular votes than Trump, but lost in the Electoral College. Certainly we can all agree that most of Clinton’s supporters were at least as angry about Trump’s win as Trump supporters are about Biden’s win.

However, Clinton conceded, and the transition to Trump occurred without any disruption from Clinton’s majority of voters. Notable is that Trump won key states in 2016 by even fewer votes than Biden’s margins in 2020, yet there were no weeks and weeks of false claims of voter fraud.

The boundaries of democracy, however, have now been crossed in 2020 and 2021, and Republicans have made that decision for their political party.

Regardless of our professions or stations in life, we cannot take the “both sides” or neutral approach to that line crossing without also being complicit in the insurrection.

Neutral is a political stance that endorses the status quo through silence and inaction.

Calling for no politics is always a political move of the powerful, who worship few things more than the status quo that allows their power.

Calling for no politics is always a political move of the powerful who depend on individual compliance and fear collective ethical resolve.

The worst examples of power in the U.S. are Trump and Mitch McConnell, both embodying the very worst of partisan and personal dishonesty and blatant loyalty only to their own fortunes; in other words, they have clung desperately to the status quo, and their behavior has no ethical underpinning except to keep their own butts on their respective thrones.

If the COVID-19 pandemic has taught us anything, we should now recognize that we are all in very tenuous circumstances; life has no guarantees.

But in our daily lives, we must eat, clothe ourselves, and sleep somewhere, while we also have families and loved ones who need us so that they can live; calling for no politics where we work and because of our professions is the most insidious way to keep the status quo of inequity in place.

As adults, if we genuinely seek a country that honors life, liberty, and the pursuit of freedom for all people, we must take clear stands, especially when our work puts us in front of children and young people.

There is nothing partisan about calling fascism “fascism,” calling racism “racism,” and calling lies “lies.”

To name those wrongs is the very best of being political; remaining neutral and silent, then, is the very worst of being political.

We must not allow the latter.

See Also

The Politics of Calling for No Politics

Thinking Beyond Bean Dad: A Reader

First, Bean Dad (as he would become known) posted a Twitter thread about teaching his daughter a lesson. The thread was flippant, snarky—and about a child not knowing how to use a can opener.

I was, frankly, surprised that Bean Dad took a beating on this because his approach to his child is essentially the foundational belief system in the U.S. about child rearing: The world is dangerous so I better pound on my kid before the world does so she/he is prepared for the Real World.

In far too much of the U.S., that pounding is literal—corporal punishment—but the pounding takes many forms such as grade retention and “no excuses” policies and practices in K-12 schooling.

Gradually, the clever thing to do about the Bean Dad trending on social media was to interrogate the phenomenon as an example of everything-that-is-wrong-with-Twitter. While a valid take, I think, it is also careless to set aside how this thread (whether it was hyperbole, as he claims, or not) is one small but ugly picture of how we mistreat children in the U.S., both in our families and in our institutions such as formal schools.

Let me offer an analogy.

One of the most important moments in the U.S. for the safety of children was recognizing the dangers of lead paint. This moment also is a powerful illustration of the need to target the external danger and not the child.

Instead of teaching children a lesson about lead paint—somehow toughening up those kids so that when they did consume lead paint, they would survive the experience—we used the power of public policy to remove lead from paint—to eradicate the danger, instead of pounding on the children.

Bean Dad quipped about his own compulsion to prepare his daughter for the apocalypse—some sort of version of The Road where the child is always alone?—but there seems never to be any consideration, as Maggie Smith concludes, for a better world: “This place could be beautiful,/right? You could make this place beautiful.”

A child is not an inherently flawed human that must be “fixed,” corrected, or improved. A child is a developing human that must be nurtured, and nurturing requires love, patience, and safe spaces.

If nothing else, we must all check our impulses to be Bean Dad, so I offer here some reading to reconsider the many ways we fail that calling:

On Children and Childhood

Rethinking grade retention

Rethinking corporal punishment

Rethinking “grit”

Rethinking growth mindset

Resisting deficit ideologies

Capitalism Is Your Daddy: “what sort of person to die as”

From “you are what you eat” to “when you have sex with someone, you are having sex with all of their sexual partners,” we seem obsessed with fear-mongering in order to shape how people behave, and thus, who people become.

About fifty pages into Keiichiro Hirano’s A Man, the reader experiences the first hints of what becomes one motif of the novel:

In the end, although Kido would only drift to law school in the grips of the fuzzy thinking that plagues many students of the humanities, his father’s words would contribute to his firm decision, while enrolled there, to try to see things through and become a lawyer.

Keiichiro Hirano, A Man (p. 56)

A work written by a Japanese author about a Korean naturalized as Japanese (Akira Kido) resonates in several ways with growing up and making career choices in the U.S.

In less than fifty more pages, Kido is gripped in a panic attack identified as “existential anxiety”: “Kido’s fear of the same thing [death] happening to him made him painfully sensitive to the minutiae of life” (p. 99).

Kido recognizes as he approaches middle age that he is revisiting similar questions he faced as a teenager:

As was typical of someone that age, in the process of trying to decide what he wanted to be, he had thought long and hard about what kind of person he was. In the end he had drifted along and become a lawyer in accordance with his father’s advice. His doubts about whether this was truly the right path had never completely left him, but he went on looking to the future, telling himself that the person he was meant to be would be realized through the profession he had chosen.

Keiichiro Hirano, A Man (p. 99-100)

Kido had lived “[f]or fifteen years now” complacent in those choices, who he had become—a lawyer with a wife and child. Yet in his new surge of existential anxiety, he recognized “steady work was no longer as common as it once had been, and many in his generation were denied the opportunity” he had received (p. 100):

He understood the struggles such people faced all too well because he dealt with many of them as clients. Forced to accept a life in which their social position and income were always unstable, they could never hope to self-actualize their profession as he had.

Keiichiro Hirano, A Man (p. 100)

Despite feeling fortunate, Kido must confront renewed angst prompting a new question: “Did I make the right choice?” (p. 100). And Kido becomes starkly aware “there might have been other paths he could have taken and therefore other people he might have been”:

The problem now was not who he was in the present but who he’d been in the past, and the solution he sought was no longer supposed to help him live but to help him figure out what sort of person to die as.

Keiichiro Hirano, A Man (p. 100)

For Kido, “the minutiae of life”—career as a lawyer, his family, and his Korean race beneath his naturalized Japanese citizenship—resulted in his “reignited existential anxiety”:

The judicial order that Kido worked hard as a lawyer to preserve propped up his quotidian life. It protected his family’s human rights and maintained their status as sovereign citizens.

Keiichiro Hirano, A Man (p. 101)

I am now in the late fall/early winter of my life and my so-called career; I turn 60 in about three weeks.

Regrets linger around me, but they are not primarily my existential anxiety (that has been my companion since birth), although I do recognize in Kido’s dilemmas my own mixed feelings about who I have become and what sort of person I will die as, since I can see on the horizon many endings, such as my career and my ability to do some of the things associated with younger humans.

My body’s deteriorations seem exponential, not by the year but by the month and even the day.

What strikes me about this aspect of the novel is that we rarely utter what Kido is confronting: you are your job (especially for those of us living in capitalism). I suspect this is one fear-inspiring slogan we don’t launch at children because it is the most frightening of all.

In capitalism, we must work in order to be fully human, especially in the U.S. where working is the only access to even marginal health insurance and care as well as livable retirement savings.

During the COVID-19 pandemic, the U.S. government has been among the least responsive internationally in providing financial support, but one of the most aggressive about maintaining the economy (which, in fact, really means keeping workers on the job).

Millionaire U.S. senators held forth about the dangers of sending citizens money and expanding unemployment payments because of the cancerous dangers of “handouts.”

And there has been, during the election cycle, the ever-present mantra about “socialism” (a red herring, but common from the Right nonetheless).

I have been talking with a family member and a close friend, one in their 30s and one in their 20s, and hearing a common theme: they are both disillusioned not just about their professions/jobs, but about working itself.

Why, they are asking, should someone work five days every week for the brief two days of the weekend? And why work forty or fifty years just so you can retire in your dotage?

And like Kido, I think of my father, a young man in the 1950s and young husband/parent in the 1960s who bought the work-yourself-into-the-grave mentality hook, line, and sinker.

Then, of course, he passed it on to me; he imprinted it in me.

To these questions by young people, I can honestly say that the answer is capitalism is your Daddy.

You are your job, and if you are not careful, if you fail to ask these questions and then act, that’s the person you will die as.

Just an EdD from a State University

I’m Nobody! Who are you?

Emily Dickinson

Several years ago I was on a panel for a public forum held on my university’s campus. At the Q&A ending the panel talk, a colleague from another discipline asked a detailed question grounded in their discipline.

I watched their face and eyes as I navigated not only the arcane and somewhat navel-gazing elements of the question (we academics love to hold forth with questions that are thinly veiled opportunities to hear ourselves talk) but also that this conversation between the two of us was almost entirely alienating for 75% of the audience, which included several of my students.

Referencing key scholars from my colleague’s field, I did a bit better than hold my own—although I just have an EdD from a state university.

Because of the lingering Jill Biden controversy—using “Dr.” with people holding doctorates and working as professors—the public has been exposed to the ugliness surrounding and within the academy that includes classism (one detractor of Jill Biden clearly also disrespects community college students), sexism (the original swipe at Jill Biden that isn’t even thinly veiled misogyny), and degree stigmas (even in this excellent rebuttal of all the nonsense tossed at Jill Biden, the EdD is framed as a lesser degree).

My journey to academia and an advanced, terminal degree (EdD in Curriculum and Instruction) began in junior college after I left high school an avid math/science student set on majoring in physics (one of the most prestigious disciplines in academia).

However, while in junior college, where I spent an inordinate amount of time playing pick-up basketball and drinking, I was approached by a dean who taught my British lit intro course. Dean Carter asked me to tutor English in the college’s academic assistance office.

A bit disoriented, I asked why, and he said I was the best student in the class. At that point, as a first-year student who had made almost all his As in high school in math and science (although my favorite teacher was Mr. Harrill, my English teacher), I had never considered myself a literary person—and certainly had never entertained any proclivity for teaching (in fact, I laughed in high school when Mr. Harrill one day suggested I consider teaching).

That moment with Dean Carter changed my life.

I soon fell in love with tutoring, and by the spring of my first year of college, I had fallen in love with poetry (thanks to my speech class taught by Mr. Steve Brannon) and discovered that I am a writer (having written my first “real” poems that spring after immersing myself in the poetry of e.e. cummings).

From 1983 until 1998, I completed three (shitty, it seems) degrees in education—a BA in secondary English education, an MEd in secondary English education, and an EdD in Curriculum and Instruction—all from (shitty, it seems) the state system where I live.

I am well aware that K-12 teaching isn’t very highly regarded, that many people see teachers as academically weak themselves (the Urban Legend about education majors having the lowest SATs, GPAs, etc.). I am also well aware that my education degrees are viewed as pre-professional and not academically rigorous.

As I noted above, even an impassioned and detailed defense of Jill Biden using “Dr.” included a swipe at the EdD degree:

Jill Biden does not have a PhD. She has an Ed.D. in Educational Leadership. It’s an applied doctorate, designed to certify rising administrators in the field of education….

At the outset I mentioned that Biden has “an Ed.D. in Educational Leadership” and not a PhD. The Department’s website provides a handy summary of the difference at Delaware.

Dan Nexon

That “handy summary,” however, is not a definition of EdDs generally but of that particular degree and program; even with its emphasis on preparing practitioners, the summary ends with this: “The Doctor of Education represents the highest level of scholarly attainment in the professional field of education.”

Now honestly, there is a lot of coded language here that links “scholarly” with “professional field” and speaks into a cultural and disciplinary marginalizing of education as a pre-professional and not academic field.

Here is a significant distinction that many do not acknowledge about education as a discipline. Much of education in the academy is grounded in teacher certification (an area in which I work and that I strongly criticize), but education as a discipline is a social science, a cousin to psychology.

My graduate degrees (MEd and EdD) included many advanced courses in statistics and qualitative/quantitative research, educational philosophy, and educational psychology. I am willing to concede that education as a field is a hybrid of other disciplines, but I can hold my own among researchers regardless of the field, among complex discussions of philosophy and psychology (not just education), and among debates about the challenges of realizing theory/philosophy in day-to-day practice.

But many have also criticized Jill Biden for her dissertation, again the implication being that EdDs are less academically rigorous:

Some critics have honed in on the fact that Biden’s dissertation is not a PhD thesis but an “Executive Position Paper.”…

For that matter, we can argue, as Volokh does, that a PhD thesis “is generally a dissertation that constitutes a substantial original work of scholarship,” but it should be pretty clear that “generally” is doing a lot of work here. This is why writing a crappy thesis doesn’t mean that Gorka can’t call himself “Dr. Gorka.” It means his PhD doesn’t certify him as an expert on terrorism or in political science. Biden isn’t trying to pass herself off as a leading expert on educational reform or whatever….

Dan Nexon

This defense, you see, of Jill Biden is grounded in the argument that she did in fact meet the requirements of her EdD, which is a doctoral degree, and therefore, assigning “Dr.” to her name while she is a professor is entirely reasonable.

This defense glides right past making any concession that EdD programs may in fact have rigorous dissertation requirements that result in “a substantial original work of scholarship.”

My own experience with graduate school was not like my colleagues’ programs since I completed my MEd at a satellite campus (the degree was based at the main campus, however) and then completed the EdD with very lenient residency requirements (I did not quit my teaching job or live on campus, meeting residency by taking a certain number of main-campus listed courses over several consecutive semesters).

And my dissertation does meet the threshold of being “a substantial original work of scholarship,” but it is an educational biography—a qualitative research paradigm and a sub-genre of history, both of which are stigmatized in academia (once again, shitty and shitty).

Tracing the life and career of Lou LaBrant through much of the twentieth century required my completing a literature review of biography/educational biography grounded in feminism and critical pedagogy (that grounding, you guessed it, shitty and shitty), reading dozens of works by LaBrant and about LaBrant that form the skeleton of my field of literacy and English education, and then writing a book-length biography (which has since been published).

I was well equipped during my years in graduate education to have written a traditional dissertation driven by a quantitative study (I found none of that compelling and chose my program specifically because it included a key figure in education biography, Craig Kridel, and because I could write a biography).

My work on LaBrant, as Kridel declared at my dissertation defense, is a unique contribution to the field of education (as a social science), rich in history as well as robust debates about philosophy, theory, and practice.

My challenge is that I wonder how many economists, political scientists, psychologists, and scholars in almost all the other disciplines awash in PhDs could have done the type of work I did—academically advanced writing, not bound to a dissertation template, that grounded a prominent figure over almost a century of thought in that field.

I suspect few of the academic snobs pontificating on Jill Biden could have done the work I did, and part of their condescension is a way to avoid that fact.

It’s not, then, that my terminal degree is just an EdD from a state university; it’s all the layers of shitty I have trafficked in along the way—just the field of education, just a biography.

The Jill Biden debate is mostly about sexism and misogyny; it is unwarranted and petty.

I know from firsthand experience that there are plenty of charlatans in all the disciplines—small-minded and weak thinkers about even mundane topics. Yet I have to stand in proximity to their PhDs as if I don’t count because of the simple difference in letters.

We call them “Dr.” and don’t bat an eye.

These hierarchies and professional/personal pettiness are embarrassing among people who are supposed to be well educated.

But there is no place for any of that in our public debates either. I know that despite my shitty degree.

Writing as an Academic and Scholar

My transition to being an academic and scholar occurred in the mid-1990s after I had been a public school English teacher for more than a decade and a writer for almost two decades.

Looking back, one of the most pivotal moments of that transition was when Craig Kridel, a leading scholar of educational biography who would become the anchor on my doctoral committee, stood at the first organizing meeting of new doctoral students and announced the importance of being able to write well. He introduced me to Joseph Williams’s Style and forever changed me as a writer and a teacher.

I was in my mid- to late 30s when I completed that doctoral program, and I chose the degree primarily because I had the opportunity to write an educational biography (a non-traditional dissertation form and an under-appreciated research type, qualitative) and work with Kridel. You see, I wanted to write a real book and not a formulaic doctoral dissertation.

Of course, entering the doctoral program, I had yet to experience or fully understand just what writing a dissertation entailed—or what it would mean to be an academic and scholar.

At this writing, I am 19 years into a career in higher education and have been teaching 37 years as well as publishing for over 30 years.

Setting aside the recent public criticisms of academics—and specifically those of us with EdDs and not PhDs—I have witnessed some problems within the academy that often go unaddressed, notably the assumption that someone with a terminal degree can teach and write well with little or no preparation in either.

Writing as an academic and scholar, I think, receives even less attention than teaching in higher education; people with doctorates have almost all completed book-length studies and then continue to write and publish as a key component of their careers as professors.

As I have noted often, there seems to be a flawed assumption, in fact, that professors can not only all write well but can teach writing.

At the end of this fall semester, I completed an editorial review of a policy brief and was immediately struck with how my comments in many ways matched much of what I emphasized for my first-year writing students.

The brief was written by a very bright scholar, and the content was excellent. However, this experience pushed me to confront the fact that writing by academics and scholars often exhibits an intense focus on being careful and meticulous with the content and ideas of the text while falling quite short on the art and craft of writing at the sentence and paragraph levels as well as not fully keeping the audience in mind.

Academics and scholars can find themselves writing in a wide variety of contexts—to a specialized audience, often their peers in their discipline; to an informed and educated audience outside their field of specialization; or to a public audience, possibly not well informed or highly educated.

Yet, academic and scholarly writing tends to remain in the first register, to a specialized audience, and includes highly structured (and stilted) organizational features, specialized terminology and academic language, long and complex sentences and paragraphs, and somewhat traditional expectations to depersonalize the text (as if the academic and the audience do not exist).

When I am working with first-year students, I spend a great deal of time and energy helping them unlearn practices that tarnish their writing, and their credibility.

A lesson those beginning writers (and potential scholars) and seasoned academics and scholars need to learn is that how we write and how we engage our audiences are essential elements of our authority and credibility.

One of the paradoxes of writing by academics and scholars is that the focus on fidelity to the content and ideas at the exclusion of accessible and engaging expression serves to discredit and devalue that content and those ideas.

Here, then, are some entry points for academics and scholars to re-imagine themselves as writers:

  • Rethink the structures of writing, particularly the essay as a form. Traditional approaches to introductions, direct thesis sentences, and conclusions are not only weak writing but also harmful to the goal of any piece of writing—to engage and persuade or inform the reader. Openings and closings often have profound influence over whether or not the reader actually reads as well as what that audience takes from the text. Academics and scholars need to add being engaging and vivid to their goals as writers instead of simply complying with templates.
  • Reject de-personalizing writing by foregrounding real people (including “I,” the academic/scholar) doing real things. Traditional scholarly writing still avoids first person as well as anecdotes and narrative. These expectations come out of a (simplistic) concession to objectivity and a valid concern for representing research and evidence accurately (acknowledging, for example, that anecdotes may be outliers or cherry-picking). Pursuing objectivity is a trap; instead, academics and scholars can improve their writing by seeking to be transparent and adding context (both of which make writing richer and more engaging). Also, simply because anecdote and narrative can be distorting doesn’t mean that they must be distorting; writing as an academic and scholar includes an ethical expectation that the anecdote is representative, not that the writing cannot be vivid and engaging.
  • Start any text with the audience, not the content or ideas. Keep in mind that the final text need not look the same as the early drafts; in other words, many academics and scholars likely should start their drafting with their content and ideas, shaping them and wrestling with them in ways that allow a later draft to foreground the audience in ways that are engaging and vivid.
  • Begin to interrogate your writing at the word (diction), sentence, and paragraph levels. One of the greatest challenges of writing as an academic and scholar is that the content of specialized fields often includes arcane terms and sophisticated ideas that are so complex they resist simple writing. Nonetheless, making academic and scholarly writing accessible as well as credibly accurate is part of the authority in that writing. When academic and scholarly writing is too dense, inaccessible, and overwhelming, it is likely either misunderstood or outright ignored. One writing strategy that can improve academic and scholarly writing is revising sentences and paragraphs for length (shorter is better than longer) and variety.
  • Cultivate a peer group of readers who can provide feedback during drafting—a group that includes people who are themselves writers as well as people both inside and outside the intended audience of the text being drafted. Here is a simple tip: Don’t write in isolation. Meaning derived from reading a text is a communal experience (reader, writer, and text interact to create meaning); therefore, creating meaning through text should also be communal.

In several of my classes, I have students prepare two different but related assignments—one is scholarly (an informed audience and using academic citation) and the other is a public text (general audience and using hyperlinks for citation). Students often are required to address the same topic in these pieces, and thus, must begin to investigate how to express themselves well while adapting their diction, tone, and style to different audiences and in different writing formats.

This is the domain of being a writer.

I have always thought that we ask far too little of students as writers, doing most of the writer’s work for them when we provide detailed writing prompts, intricate rubrics, and essay templates.

Many if not most academics and scholars carry that baggage into their professional writing. I also think we ask far too little of academics and scholars as writers.

Writing well, being engaging and vivid, should not be an afterthought, or no thought at all. Academics and scholars can only serve their content and research well if they invite in and engage an audience.

If an academic/scholar writes an essay and no one reads it, does it make a sound?

Normality in Sayaka Murata

What is normal? Are you normal? Am I normal?

“Normality was contagious, and exposure to the infection was necessary to keep up with it,” explains Natsuki in Sayaka Murata’s Earthlings.

If we accept that “normal” describes what is typical, and thus, what we may expect in any circumstance, then the novels of Murata are themselves not normal.

And the central characters in both Earthlings and Convenience Store Woman are certainly not normal either.

Having focused for the last couple of years on fiction in translation (see links below)—prompted in part by my scholarly and personal interest in Haruki Murakami—I think part of the appeal of fiction from other cultures, crafted originally in languages other than my native English, is that the works confront and challenge my perceptions of normal, even though my critical ideology always calls on me to question, to step back, and to reconsider the assumptions of being human.

However, Murata’s work has shaken me to the core, although in a different way than my recent journey through three novels by Ryu Murakami (below); both authors leave me confused about my responses to their graphic violence and matter-of-fact explorations of the decidedly taboo (child sexual assault, incest, and cannibalism, for example, in Murata).

But while Ryu Murakami crafts tension around both horrific violence by serial murderers and the ever-present threat of violence (readers can likely never again ignore the possibility of severed feet), Murata’s tensions are existential, and while far more dramatic than day-to-day human anxiety, any reader who lives with the existential dread of simply being alive must interrogate their empathy for Keiko (Convenience Store Woman) and Natsuki.

I read Earthlings first, mesmerized by the first third of the novel focusing on Natsuki at 11 years old and in the early stages of puberty. However, this opening is no coming-of-age narrative seeking to reach some sort of universal appeal.

Yes, parts of the first two chapters are somewhat quirky explorations of what almost everyone understands about being an adolescent—especially Natsuki’s feeling alienated from her family, particularly her antagonistic mother—but Natsuki being the victim of sexual assault (far too common for young women throughout the world) turns even more disturbing because her confession of the abuse is callously dismissed by her mother and ultimately because Natsuki at 11 enacts a surrealistic revenge that leaves the reader, again, conflicted.

The rest of the novel is Natsuki as an adult, in her 30s, and here we see many of the same powerful motifs found in Convenience Store Woman, where Keiko is also a woman in her 30s.

Murata offers readers characters explicitly aware that they are not normal, but who are along a spectrum of navigating their world-views against either the urge to become normal or finding a way to exist in the so-called normal world as an alien (with sufficient ambiguity about whether that is literal, delusional, or metaphorical).

From casual interest in incest and gleeful cannibalism to choosing a single life as a career part-time convenience store worker, the plot elements of Murata’s novels shatter expectations about tone as well as anyone’s confidence in their own sense of normality.

It isn’t enough to say that Murata seems to show that there really is no such thing as “normal”—except for the power of normalization to seem real.

Murata pushes even further, toward the implication of normal as entirely arbitrary; “normal,” if we dare to be critical, becomes most harmful in human experiences when it becomes “right.”

Normal people marry and have children. Normal people seek out careers and center the focus of their lives on those careers.

And since these are the right things to do, this is how anyone can be fully human.

The harm, of course, is that those who choose not to marry, have children, and center their lives on their careers are choosing the wrong path—and are in effect not fully human.

This brings me to the ultimate overwhelming weight of Murata’s novels—the burden of normal on children and women as well as the role of normal in the sexual and physical violence pervading the lives of children and women.

Yes, there are cartoonishly surreal moments in Murata that prod a smile, but everything in her worlds is tinted by the inevitability of the disease of normality and the futility of a single human’s desire simply to be herself, her true and full self.

See Also

Found in Translation

The Diving Pool: Three Novellas, Yoko Ogawa

The Housekeeper and the Professor, Yoko Ogawa

Revenge: Eleven Dark Tales, Yoko Ogawa

Hotel Iris: A Novel, Yoko Ogawa

The Memory Police: A Novel, Yoko Ogawa

Breasts and Eggs, Mieko Kawakami

Piercing, Ryu Murakami

Audition, Ryu Murakami

In the Miso Soup, Ryu Murakami

Convenience Store Woman, Sayaka Murata

Earthlings, Sayaka Murata

A Man, Keiichiro Hirano

The Naked Eye, Yoko Tawada

The Lower Realities of Higher Education

I posted a fairly tame Tweet about the Wall Street Journal‘s recent Op-Ed attacking Jill Biden’s use of “Dr.” and the editorial doubling-down on negative responses to the Op-Ed (none of which I will link here).

The Tweet attracted conservatives with tens of followers, most of them misreading the Tweet and many of them attacking me for being an academic/professor (the typical snarky references to Marx, etc.) as well as being in the field of education (my university affiliation and doctorate, an EdD, are part of my Twitter bio and handle—although several Twits thought they were outing me in some way for these public facts).

While I am enormously privileged, I share with Jill Biden the paradox of holding a doctorate in an often marginalized field, education; when I attained my EdD in the mid-1990s, it was still a much lesser degree than a PhD—and remains well down the hierarchy of academic credentials since education is often discounted as a pre-professional field.

Over 37 years as an educator, I spent the first 18 as a public high school English teacher. K-12 teachers are disproportionately women, and being a K-12 teacher is a profession rarely recognized as such—mostly, I contend, because it is perceived as mere women’s work.

Like babysitting.

Now in the middle of my nineteenth year as a professor, having moved through the ranks to full professor and received tenure, I am part of a male-dominated field (especially at the higher ranks) that is accorded far more prestige than K-12 teaching but also receives a fair amount of public shaming and ridicule (notably from conservatives, as my Twitter experience illustrates).

That ridicule is based in large part on cartoonish stereotypes of the Ivory Tower (academic knowledge not being realistic or practical) and a mischaracterization of professors as radical Leftists.

What popular and conservative attacks of higher education often miss is that academia is incredibly traditional, especially in terms of policies and practices that are sexist, racist, classist, and (often) petty.

Higher education, like K-12 education, more often reflects society—the good, the bad, and the ugly—than not.

The Jill Biden debate prompted by the conservative WSJ is an opportunity to confront the gendered inequity of academia that is replicated in the racism, classism, and other inequities that permeate disciplinary hierarchies, the tenure and promotion process (along with faculty evaluation such as student evaluations of teaching [SET]), and numerous unspoken norms.

That higher education fails to be the Ivory Tower of equity is not the only paradox of academia. Many would assume, for example, that academics practice research-based policies and procedures, but one of the greatest inequities of being a professor is the use of SETs for annual evaluations and the tenure/promotion process (see here).

From 2019, Kristen Doerer reported:

“Having a female instructor is correlated with higher student achievement,” Wu said, but female instructors received systematically lower course evaluations. In looking at prerequisite courses, the two researchers found a negative correlation between students’ evaluations and learning. “If you took the prerequisite class from a professor with high student teaching evaluations,” Harbaugh said, “you were likely, everything else equal, to do worse in the second class.”…

Studies since the 1980s have found gender bias in student evaluations and, since the early 2000s, have found racial bias as well. A 2016 study of data from the United States and France found that students’ teaching evaluations “measure students’ gender biases better than they measure the instructor’s teaching effectiveness,” and that more-effective instructors got lower ratings than others did….

Despite the data, at many colleges, particularly research-based institutions, student evaluations are still the main measure, if not the only one, of teaching effectiveness in promotion-and-tenure decisions.

Just as the WSJ editorial staff doubled down on a grossly incompetent and even laughably weak Op-Ed by a classic mediocre white man, academia repeatedly doubles down on SETs, arguing that colleges must have something to evaluate teaching and casually flouting the research base.

But even the college classroom remains inequitable for women; Lee and McCabe have found that gender inequity in the college classroom hasn’t improved over the past 40 years, as they observed:

Men students are more likely to take the floor to talk while women students are more likely to wait for their turns. Across all nine courses observed, men students talk 1.6 times as often as women. In addition, men are also more likely to speak out without raising their hands, interrupt other speakers in the classroom, and engage in prolonged conversations with the professor during class….

Despite great gains in women’s access to and achievements in higher education, contemporary college classrooms seem to have remained “chilly.” Our observations suggest that men students continue to occupy advantaged positions while women students are largely hesitant to take up space in classrooms. These differences occur regardless of students’ or professors’ awareness of these inequalities. 

A key point here is that women for many years have surpassed men in attending and achieving success in higher education. And the nonsensical WSJ Op-Ed seems to reflect another disturbing finding about gender and higher education by Levanon, England, and Allison:

Occupations with a greater share of females pay less than those with a lower share, controlling for education and skill. This association is explained by two dominant views: devaluation and queuing. The former views the pay offered in an occupation to affect its female proportion, due to employers’ preference for men—a gendered labor queue. The latter argues that the proportion of females in an occupation affects pay, owing to devaluation of work done by women. Only a few past studies used longitudinal data, which is needed to test the theories. We use fixed-effects models, thus controlling for stable characteristics of occupations, and U.S. Census data from 1950 through 2000. We find substantial evidence for the devaluation view, but only scant evidence for the queuing view.

As women surpass men in doctorates, the prestige of that credential has diminished.

Once again, however, we need only to listen to women themselves, of course, to recognize the lower realities of higher education that have nothing to do with cancel culture, Marxism/socialism, or diversity/equity/inclusion initiatives.

Those lower realities are mostly good old American sexism.

“Contrary to what one might have expected,” Allison Miller explains while unpacking the Jill Biden controversy, “I have found that the further away from higher education I’ve gotten, the more respect for my degree colleagues have shown.”

Miller continues:

Where I have encountered most disrespect for my doctorate is actually from academics. It’s not just that all Ph.D.s are not created equal — some schools still dominate hiring and will continue to do so as the academic-job market shrinks….

[T]he fetishization of hazing hasn’t disappeared from inside academe….

Once you have a Ph.D. … you learn the lessons of academic hierarchy all over again. What’s called “collegiality” is actually deference, a willingness to get along by going along, to put up with corridor microaggressions, to smile through Professor X’s department-meeting BS — but like a whack-a-mole, there’s always another Professor X. The rules of deference are unwritten because most of them would probably be illegal. “Wait until you get tenure” is not in the faculty handbook….

The demands for deference speak to gatekeeping and a general clubbiness that is hard to penetrate without a background that includes close proximity to upper-middle-class white people. 

Three key points must be acknowledged here in order to recognize the lower realities of higher education: “hazing,” “gatekeeping,” and “clubbiness” all reveal that higher education is a highly insular and sexist system that, like most formal organizations, is more concerned with conserving its structure than changing for the good of all.

Higher education is often a good ol’ boys club with more credentialing and a more arcane vocabulary.

Attaining a doctorate—PhD or EdD (JD or MD)—is a relatively rare achievement, but those credentials do not guarantee that people are better humans after they earn the opportunity to be called “Dr.”

Dr. X and Dr. Y are no less likely to be selfish and arrogant, and we have no guarantee that anyone in any field, academic or medical, wasn’t last in their class—or isn’t a charlatan, a hack.

But when medical doctors gained the label of “Dr.” (after academics) and when academic doctors were mostly men, society rarely balked at the possibility that “Dr.” didn’t make any of those guarantees.

If anyone is ready for a reckoning in the U.S. (and I doubt many are), we would be better served to question the outsized role of mediocre white men, like the recent scribe of a WSJ Op-Ed, both inside and outside the academy.

In the meantime, it’s Dr. Jill Biden who will be the next FLOTUS, and along with Kamala Harris being the vice president, there is much to celebrate about women and simply no room for adolescent Op-Eds in the WSJ that can’t rise above Ayn Rand basement-level pseudo-thinking.

Grades Tarnish Teaching as well as Learning

Recently on social media, a professor asked if others used rubrics with graduate students. Since rejecting rubrics has been a central component of my career-long efforts to de-grade and de-test teaching and learning, I chimed in.

My posts in the comments explaining why I don’t use rubrics were significant outliers because the thread overwhelmingly endorsed rubrics, almost entirely in terms of making grading easier or more transparent as well as providing teachers/professors protection against (hypothetical) students challenging their grades.

One immediate response to my comments is also worth highlighting since a person who doesn’t know me made fairly nasty assumptions about me being like the professors they had in grad school, the “gotcha” professors who use grades to ambush and punish students.

While most of my public (see here and here, for example) and scholarly work rejecting the use of rubrics—especially when teaching writing—has focused on their negative impact, along with grades, on students and learning (see this example), the recent social media thread highlights that grades also tarnish teaching.

Early in my first 18 years as a high school English teacher, I stopped giving tests; a bit later in that position, I also stopped grading assignments (although I had to assign students quarter and course grades). Over my ongoing 19 years as a college professor, I have always delayed grades (providing feedback, but not grades, on assignments; only course grades are assigned) and never given traditional tests (midterms are often class discussions, projects, or reflections, and final exams are always portfolios of the work over the entire course).

My syllabi have no grade scales or policies, no weights for calculating grades, and no late policy even; I do have an explanation of my no grades/no tests approach to teaching, and I do share with students some broad patterns often correlated with course grades. [1]

While reading the thread on social media, I recognized a pattern of fear and a need among teachers/professors to justify grades but also to guard against a hypothetical complaining student.

This pattern struck me as a non-grader because over the 19 years I have been teaching in higher education full time, I have had zero official complaints from students about grades. And only one student has ever confronted me about a course grade, a student who failed their FYW seminar for not meeting the minimum requirements (the student submitted all four essays at once at the end of the course without submitting them throughout the semester and fulfilling the drafting and conferencing requirements).

That student left our meeting with the understanding that they in fact earned the F by not meeting the minimum requirements and expectations listed on our syllabus, and never pursued any official complaint.

While I remain deeply concerned about the negative consequences of grades, tests, and prescriptive structures such as rubrics on students and learning, I am also convinced more than ever that grades, tests, and rubrics detract significantly from effective teaching and actually create the very problems many teachers/professors seem inordinately worried might occur in the hypothetical.

Rubrics as a subset of the traditional grading culture are often justified in terms of transparency as well—a very compelling argument.

As I have examined before in terms of the backwards design movement associated with Wiggins and McTighe, I have taught for almost 40 years as the focus of teachers and students has shifted from learning objectives to student assessment. I do recognize that the shift to backwards design was in part an acknowledgement that students deserve transparency in expectations and goals for learning and for student behaviors (artifacts of learning such as essays, projects, or performances).

Grade policies, rubrics, and templates are one type of transparency, prescriptive and authoritarian, but they all prove to be teacher/authoritarian-centered and to be mechanisms that reduce student autonomy and engagement in their own learning. Codified transparency demands compliance over student agency.

Despite the assumptions of at least one person commenting on social media, I am not a “gotcha” professor, and I am transparent about learning goals and student behaviors. However, I see transparency as a conversation in a learning community and an evolving, not static, state of any course bound by the limits of the academic calendar. That transparency must support my authoritative role as a teacher (as opposed to authoritarian).

I have posted many times that my transparency is in the form of minimum requirements (see below) and providing for students a wealth of resources that include detailed models of their assignments with instructional comments and checklists for preparing and revising their work.

By not grading assignments, I provide students low-risk environments that remove the “gotcha” element entirely since students are required and allowed to revise their work as well as engage with me in an ongoing conversation (conferences, feedback provided on the assignments) that helps them construct their own learning (individualized rubrics, in other words).

And since course grades are linked to a final portfolio of their work, assigning a grade occurs after students have had the entire course to learn, and considering the amount of feedback and conferences students have experienced along with class sessions grounded in their artifacts of learning (I teach based on the strengths and needs their assignments reveal), neither students nor I are surprised by the final course grade assigned.

I must emphasize again that I have been de-grading and de-testing my teaching since 1984 (the first year) and that these practices have been implemented in a rural public high school as well as a selective university. I developed and practiced not grading assignments and not giving traditional tests while teaching public school in a right-to-work (non-union) state and during my non-tenure years as I began my career in higher education.

I fully acknowledge, and have worked in, the so-called “real world” of traditional schooling that requires grades. Therefore, I have conceded that at best I am delaying grades, but I must emphasize that I also foreground student learning and my teaching, treating assessment, evaluation, and grades as last, as a mandate that must not negatively impede student learning or my teaching.

Many justifications of rubrics place grades first, sacrificing learning and teaching.

Once we prioritize student learning/agency and teacher professionalism as well as teaching, structures such as rubrics can be recognized as traps that center the authority for a course in those structures (rubrics, templates, grading policies) instead of in the teacher/professor.

A syllabus is a legal contract, and once we codify how grades are determined, we as teachers/professors are bound to those codes regardless of how valid they prove to be for each student.

Well-designed rubrics must be highly prescriptive (see Popham, Chapter 7), and thus they do much of the work for students, preempting choices and experiments that would better serve students as learners; poorly designed rubrics (open-ended, vague, etc.) neither fulfill the goals of using a rubric nor satisfy the standard justifications for using rubrics.

In rejecting rubrics, I am not rejecting transparency or fairness.

I am advocating for teachers and professors to step outside those traps and to make commitments to transparency and fairness grounded in student learning and teaching, not assessment, evaluation, and grades.


Notes

[1] [First-year writing seminar example; details vary by course]

Student Participation in a Course without Grades or Tests

While you will receive a grade for this course per university policy, I do not grade individual assignments, and I do not administer traditional tests in any course I teach. We will comply with university expectations for midterm and final exams (see the assignments in the course overview), and I will submit either an S (satisfactory) or I (incomplete) for the midterm grade to designate whether or not you have fulfilled assignments as required through midterm.

Instead of traditional grades, I expect students to meet minimum requirements; in this course minimum requirements include completing all assignments (see the final portfolio sheet) fully and on time, and submitting, conferencing, and resubmitting all four required essays (a first full submission and a revision after receiving feedback and/or conferencing).

Assignments in my courses are not designed primarily for assessment (grading), but are designed as learning experiences. By completing and revising assignments, you are learning, and thus, you should expect to receive challenging feedback, and should also embrace the opportunity to revise work when allowed.

If you could complete an assignment perfectly the first time you submitted it, there would be no reason for me to assign the work. All academic work can (and should) be improved through multiple efforts and feedback.

Since I require that all work be completed, and even though the expectation is that students meet due dates, I must accept late work if and when students are unable to turn in work when due (see More Thoughts on Feedback, Grades, and Late Work). However, students should strive to be punctual with work unless circumstances beyond their control interfere (there are reasonable excuses for work being late, and I appreciate honest and upfront communication when students are unable to meet deadlines, even if the reason isn’t urgent).

All four required essays must be revised at least once, but you are allowed and encouraged to revise as often as you wish to produce a high-quality essay.

At the end of the course, once you have been given ample opportunities to learn and can do so while taking risks and not worrying about your grade, I evaluate the entire portfolio of course work to assign a grade for the course.

Completing all work and submitting that work in the portfolio are mandatory (incomplete portfolios will be assigned an “F” for the course), and your course grade will be impacted by completing work fully and on time as well as by the quality of the assignments (notably the four required essays). Proper citation (APA), quality of references, diligence in revising, and the sophistication of the writing and thinking in your assignments ultimately inform that final grade.

I recommend you read some or all of the following to understand my approach to grades and tests:

Minus 5: How a Culture of Grades Degrades Learning

Delaying Grades, Increasing Feedback: Adventures from the Real-World Classroom

More Thoughts on Feedback, Grades, and Late Work

Grades Fail Student Engagement with Learning

Note:

When I think about final grades, here are some guiding principles:

  • A work: Participating by choice in multiple drafts and conferences beyond the minimum requirements; essay form and content that is nuanced, sophisticated, and well developed (typically more narrow than broad); a high level demonstrated for selecting and incorporating source material in a wide variety of citation formats; submitting work as assigned and meeting due dates (except for illness, etc.); attending and participating in class-based discussion, lessons, and workshops; completing assigned and choice reading of course texts and mentor texts in ways that contribute to class discussions and original writing.
  • B work: Submitting drafts and attending conferences as detailed by the minimum requirements; essay form and content that is solid and distinct from high school writing (typically more narrow than broad); a basic college level demonstrated for selecting and incorporating source material in a wide variety of citation formats; submitting work as assigned and meeting most due dates; attending and participating in class-based discussion, lessons, and workshops; completing assigned and choice reading of texts and mentor texts in ways that contribute to class discussions and original writing.

educator, public scholar, poet&writer – academic freedom isn't free