Saturday, May 25, 2019


The whole point of school reform is to have students learn more. If this doesn’t happen, the experiment is a failure, no matter how happy the children, the parents and teachers—and the reformers—are.

Albert Shanker, The New York Times, March 13, 1994

        For many years, when I talked to principals and superintendents in large urban school districts where youngsters were not doing very well, I’d hear the same thing. They’d tell me that they had a new plan or policy or initiative that was making a tremendous difference. You couldn’t see any difference in the test scores, they’d say, but that was not the whole story. If I would only go into the schools, I’d see kids smiling, and I’d feel the warmth in the classrooms. That would tell me a whole lot more than test scores.

        There were always some hard-nosed reformers who didn’t buy that idea. They insisted that you had to judge the success of schools by the results—how well students performed. Some of these people were key players in creating the big Chicago school reform in 1988. This removed much of the authority from the central school board and brought it down to the school level, creating councils of parents and teachers and community members with the authority to hire and fire principals, spend money and establish curriculum—in other words, it introduced the kind of school-level empowerment that many believed was needed if we hoped to improve student learning.

        What’s happened? Several months ago, I was in Chicago with a group that was looking at the reform. With us were the key players who had always pushed the idea that the important thing was test scores—if you didn’t have good test scores, the kids weren’t learning.

        I asked them, “Now that the law’s been changed so that parents and teachers and the principal are in charge of what goes on in school instead of a bunch of central office bureaucrats, have the test scores gone up?” What I got from these people—those who were leaders in this revolution—was a variation of what I used to hear from principals and superintendents: “No,” they said, “the test scores haven’t gone up, but you should see the children smiling—how active and happy they are—and how the school councils are working together to make some very exciting changes.”

        That’s basically what a big report these people did on the Chicago reform also said, except it used a lot of data and scholarly apparatus. The report told us that reform is fragile but it is coming along. It classified four different types of school politics and five different types of school improvement initiatives. It found that small schools were more likely to be undertaking ambitious reforms than larger ones and said how good it is when people work together. But the report ducked the question of whether or not any of the changes were beginning to achieve what the reform intended—raise the achievement of Chicago’s students out of the cellar.

        Is student learning better in schools where there are democratic school councils or in schools where the principal runs the show? Have what the report calls “practices associated with ‘authentic learning’” produced any “authentic learning” in schools where they have been introduced? The report does say that the reform has not harmed the achievement in schools that were already doing well (a good thing). And it offers enthusiastic quotes from students and teachers in reforming schools, but no data or discussion about student performance. It talks all about process when what we want to hear about is substance.

        Am I saying that Chicago’s schools should show an across-the-board jump in student achievement after five years of reform? Of course not. Reforming any large institution is extraordinarily difficult. But it’s ridiculous and dangerous for a report on the status of the reform to omit what should be its centerpiece—an attempt to measure progress toward the goal of improved student achievement and an analysis of what seems to be working and what seems to be failing.

Proposals for radical school decentralization are very popular now. New York City is considering a division into five borough school systems, and a number of states have passed or are considering charter school proposals, which would allow people to establish free-standing public schools that are independent of local school boards and can do pretty much what they want. The jury is still out on whether decentralization in Chicago will significantly improve student achievement. But the five years of reform do illustrate one thing. When school decentralization—or any other reform—is put in place, we need to be careful that the new “owners” of the school system don’t change the rules.

The whole point of school reform is to have students learn more. If this doesn’t happen, the experiment is a failure, no matter how happy the children, the parents and teachers—and the reformers—are.

Monday, May 20, 2019


Bookless Wonders
Will Fitzhugh

The Concord Review
The Ides of August, 2013

Someone told me that Muslims honor “People of the Book.” They have the Quran, the Jews have the Old Testament, the Christians have the New Testament, the Hindus have the Bhagavad-Gita, and so on. Perhaps the Analects and the Tao Te Ching belong in that group as well. But those are sacred books (not scared, sacred—just google it, if the word is unfamiliar...)

American students, in contrast, seem on their way to becoming what might be called Bookless Wonders. Not only do they not read the great religious texts, but they are more likely to read Catcher in the Rye (written at the fifth-grade level) than to read a single complete history book while they are in high school. Perhaps many educators now feel history is really passé in the 21st Century—after all, it is just about the Past, no?

In fact, Renaissance Learning, in a 2013 report on the books most commonly read by American high school students, found that the top forty books they read are, on average, at the fifth-grade reading level (that is, four to seven years below their current grade level).

Since the 2003 National Endowment for the Arts study of the reading of fiction among young people and others, I have wanted to do a study of the assignment of complete nonfiction books (e.g., history books) to U.S. public high school students, but no one else wants to know about that, and no one will fund our study, so no one knows, Q.E.D.

On the bookshelves of one of our history classrooms in the high school in Concord, Massachusetts, was a set of Profiles in Courage by John F. Kennedy (or perhaps Ted Sorensen). I asked my colleagues about the books sitting there, and they said they used to hand them out to classes, but no one read them, so they had stopped handing them out.

As a high school history teacher back in the day (1980s) I quickly learned that my students had no need to read the pages I assigned, because in class I would ask a few questions and then, of course, go over the material in the reading assigned anyway.

As Chester Finn reminded us many years ago, students are not stupid, and if they don’t have to do certain school work (like reading and writing), they won’t do it.

Perhaps Korean students, and American students who want to go to Caltech or Stanford, might do the work anyway. But most high school students, who, according to the Kaiser Foundation, are spending 53 hours a week with electronic entertainment media, and of course another big chunk of time with their friends, do not want to do any reading or writing they don’t absolutely have to do.

The Summer college reading lists, studied again this year by the National Association of Scholars, seem to favor books written since 1990. They report that 97% of the 309 colleges and universities they studied chose books published in 1990 or later for their students to read in the Summer. The most popular book by far was The Immortal Life of Henrietta Lacks (2010).

“There were no classics of history; nor biographies, speeches or writings by American political leaders (1620-2013); no works by ancient philosophers; no works of the Enlightenment; no classical works of Christian, Jewish, Muslim, Hindu, Buddhist, or Confucian thought, and no scientific classics.”

Other than that, there were lots of books on hot political topics: sustainability, race, diversity, class, and gender.

In the 17th century, one of the reasons for the high literacy of Americans was that just about every family read the Bible.

Perhaps one of the reasons for the almost universally condemned inability (illiteracy) of our current college students (and many employees) to read very well and to write clearly is that we have given up on having students read good books (and write serious term papers) at every level of our education system. You think?

As Mark Twain said: “The man who does not read good books has no advantage over the man who cannot read them.”

Monday, May 13, 2019


Diane Ravitch, in Where Did Social Studies Go Wrong? Fordham 2003 [excerpt 2-4]

Over the past century, the teaching of chronological history was steadily displaced by social studies. And for most of the century, the social studies establishment eagerly sought to reduce the status of chronological history, in the belief that its own variegated field was somehow superior to old-fashioned history. Given the plasticity of its definition, the social studies field has readily redefined its aims to meet whatever the sociopolitical demands of the age were. As a consequence, it is now the case that all history teachers are also social studies teachers, but there are many social studies teachers who do not teach history and who have never studied history.

History, once a core subject of study in every grade beginning in elementary school, lost its pride of place over the years. When social studies was first introduced in the early years of the 20th century, history was recognized as the central study of social studies. By the 1930s, it was considered primus inter pares, the first among equals. In the latter decades of the 20th century, many social studies professionals disparaged history with open disdain, suggesting that the study of the past was a useless exercise in obsolescence that attracted antiquarians and hopeless conservatives. (In the late 1980s, a president of the National Council for the Social Studies referred derisively to history as “pastology.”)

A century ago, the study of history was considered a modern subject. In the early decades of the 20th century, most high schools in the United States offered a four-year sequence in history that included ancient history, European history, English history, and American history. Most also offered or required a course in civics. Even as the study of history appeared to be firmly anchored in the schools, history textbooks began to improve over the static models of the 19th century, which tended to plod through dull recitations of political events. Historians like Charles Beard, Edward Eggleston, and David Saville Muzzey sought to incorporate political, social, and economic events into their telling of history.

Even the elementary grades offered a rich mix of historical materials, such as biographies of famous men (and sometimes women), history tales, hero stories, myths, legends, and sagas. Teachers for the early grades often took courses to learn about myths, legends, and storytelling, knowing that this was an important feature of their work. Consequently, many—perhaps most—children arrived at the study of Greece and Rome in high school with a well-stocked vocabulary of important figures and classical myths.

Until 1913, history was history and “social studies” was virtually unknown. In that year, a committee of educationists issued a report on the reorganization of the secondary curriculum that placed history into the new field of social studies. This report, eventually published as part of the Cardinal Principles of Secondary Education, was written under the chairmanship of Thomas Jesse Jones, a prominent reformer and social worker who had taught social studies at the Hampton Institute in Virginia to African Americans and American Indians. Jones was one of the first to use the term “social studies.” He was a strong believer in useful studies, such as industrial and trade education. He was very much part of the progressive avant garde that believed that academic studies were necessary for college preparation but inappropriate for children who were not college-bound, that is, children of workers, immigrants, and nonwhites.
Leading educational theorists, like Jones and David Snedden of Teachers College, viewed education as a form of social work and thought that children should study only those subjects that would provide immediacy and utility in their future lives.

The Jones report on social studies, incorporated into the famous Cardinal Principles report of the National Education Association in 1918, suggested that the goal of social studies was good citizenship and that historical studies that did not contribute to social change had no value. This report, when it appeared and for many years afterwards, was considered the very height of modern, progressive thought. It had a devastating impact on the teaching of history and gave a strong boost to its replacement by social studies.

Since it was hard to argue that the study of ancient history, European history, or English history contributed to social change or to improving students’ readiness for a vocation, these subjects began to drop out of the curriculum. They were considered too “academic,” too removed from students’ immediate needs. They made no contribution to social efficiency. The committee in charge of reorganizing the secondary curriculum saw no value in such abstruse goals as stimulating students’ imaginations, awakening their curiosity, or developing their intellects.

It was in the spirit of social efficiency that the field of social studies was born. Some educational activists (like Thomas Jesse Jones) thought that the purpose of social studies was to teach youngsters to adapt to (and accept) their proper station in life.

Some thought that the goal of social studies was to teach them the facts that were immediately relevant to the institutions of their own society. Some preferred to teach them useful skills that would prepare them for the real world of family life, jobs, health problems, and other issues that they would confront when they left school.

This utilitarian emphasis undercut the teaching of history in the high schools.

Wednesday, May 1, 2019


Just as the words vulgar, unseemly, and dishonorable are not ordinarily used in conversation today, neither is virtue. The disuse of virtue is part of today’s non-judgmentalism. It’s acceptable for people to have values, which will differ across people (and who is to say that one set of values is better than another?), but the word virtue carries with it connotations of invariance and objectivity. And rightly so. Let me make a brief case for the objective, universal applicability of the cardinal virtues in our quest to become not just nice, but good. Nice and good are different. Being nice involves immediate actions and immediate consequences—you give water to the thirsty and comfort to the afflicted right here, right now. Being good involves living in the world so that you contribute to the welfare of your fellow human beings. Sometimes the immediate and long-term consequences are consistent with being nice; sometimes they are in conflict.

That’s where the importance of the cardinal virtues comes in. The four cardinal virtues were originated by the Greeks. They subsequently got their label from the Latin cardo, meaning “hinge,” because they are pivotal: All the other virtues, and the living of a virtuous life, depend on them. If you took an introductory philosophy course in college, they were probably translated from the Greek as courage, justice, temperance, and prudence.

Courage, meaning not just physical but also moral courage, is pivotal because no virtue is sustained in the face of adversity without it.

Justice—as defined by Aristotle, giving everyone his rightful due—is pivotal because it is a precondition for behaving in other virtuous ways (for example, the virtue of compassion rightly takes different forms for people in different circumstances).

Temperance is, to modern ears, an unfortunate label. It sounds insipid. Aren’t we supposed to live life to the fullest? Haven’t we decided, along with Mae West and Liberace, that too much of a good thing can be wonderful? But when you stop to think about it, too much of a good thing isn’t wonderful. It cloys. Satiates. The pleasure ends. If you still are unhappy with the idea of being temperate, think in terms of self-restraint and knowing oneself, both of which are part of the meaning of sophrosyne, the word Plato used for this virtue. Temperance is pivotal because, without it, any subsidiary virtue will be ignored when it competes with natural appetites.

That leaves prudence, the cardinal virtue that requires the most work and time for you to acquire. It is also the virtue with the most unappealing label of all, with its connotation of timidity. The idea of other people saying of me, “Charles is very prudent,” is mortifying. But this is a function of evolving language.

Prudence has acquired negative connotations that it did not formerly possess. Let’s go back to the original Greek word for this cardinal virtue, phronesis. Aristotle talks about two kinds of wisdom. One is the ability to apprehend reality and make the pieces fit together—roughly, the kind of wisdom that underlies science. Phronesis is the word Aristotle used for the other kind of wisdom, better translated in the twenty-first century as practical wisdom. Phronesis is harder to come by than scientific knowledge. Studying reality is not enough. Practical wisdom means the ability to rightly assess the consequences of a course of action. Knowledge is necessary, but so is experience. You might want to be compassionate, for example, but without practical wisdom you might behave in ways that cause suffering rather than relieve it. Balancing all the considerations that go into rightly assessing long-term consequences is difficult, and it requires both thoughtfulness and a deep understanding of human life. Thus the cardinal virtue of practical wisdom is pivotal because it is the precondition for behaving in other virtuous ways.

Hence my proposition: The cardinal virtues are indispensable to being good. I don’t mean that theoretically, but in the course of going about your daily life. You really, truly, must be courageous, just, temperate, and possess practical wisdom if you also wish to be dependably kind, merciful, compassionate, tolerant, patient, or to practice any of the other virtues. Lacking the cardinal virtues, you can act in those other virtuous ways haphazardly, and occasionally have the effect you wish, but you cannot consistently have the effect you wish, nor will you be able to bring yourself to behave in those other virtuous ways when the going gets tough. You will still mean well. You will still be nice. You won’t be good. 

You don’t need to be an Aristotelian to be good. For two millennia, the world’s other most influential ethical system was Confucianism. The central virtue in Confucianism is ren, the summation of all subsidiary virtues. Ren translates as humaneness or benevolence, but the Confucian conception of ren is richer than either word conveys. Ren incorporates the idea of reciprocity (a form of the Golden Rule), which overlaps with Aristotle’s concept of justice. Ren incorporates courage. Confucianism is emphatic about the need for temperance and self-control. And one of the chief components of ren is the considered, accurate appraisal of consequences that Aristotle described as practical wisdom. If you are a good Confucian, you will be practicing the cardinal virtues.

Whether you find inspiration in the Western or the Eastern tradition is a minor issue. What is unacceptable is to go through life thinking that being nice is enough. You must come to grips with the requirements for being good.

Charles Murray, The Curmudgeon's Guide to Getting Ahead (114-118). [2014] The Crown Publishing Group. Kindle Edition.

Monday, April 29, 2019


 Rafe Esquith [2003]

When you work as hard as so many teachers do, having a brilliant child in class makes you feel good. Teachers, me included, often suffer the delusion that much of a child’s success is because of the teacher. That would be nice if it were true, but it isn’t. A teacher can help guide a brilliant child, and certainly expose him to new ideas and experiences, but you cannot teach intelligence. Bright children make teachers feel that they’re doing a fine job in the classroom, and it’s understandable why we cherish that feeling, because so often we feel as if we’re failing. But we have to remind ourselves that we must always try to do what’s in the best interest of the student, and if that means sending him to another classroom where he has better opportunities to fly high, let him go.

My experience working with youngsters has taught me many things. It’s common to have gifted students do fifty multiplication problems while other students do twenty-five, but piling on busywork is not the way to help gifted children develop their abilities. The key is to keep giving the gifted ones twenty-five problems—the twenty-five right ones; these children need to be challenged and worked hard, but wasting their time is no answer. If other children in class need extra practice with a math skill, why make children who have mastered the skill wait? We don’t want to leave children behind, but we don’t want to slow down those ready to move ahead.

To complicate things further, GATE students are often used to tutor their peers who need help. There is merit to this, as it helps bright youngsters develop compassion for others. However, teachers must be careful in finding a balance here: it’s nice for Johnny to take some time to help someone in need, but Johnny has needs, too. That’s why it’s best for Johnny to have the same hour of math but with more difficult problems, the same time for reading but with more difficult literature, and the same time for language arts but with more advanced vocabulary to study. GATE children form their own branch of special education, and just as children with learning disabilities need individualized lesson plans, GATE children need them, too.

Exposure is crucial. Gifted students need opportunities in every subject to give them the chance to develop a love of some activity in which they can then thrive. It is becoming common these days [2003] to visit classrooms in which art, music, science, history, geography, and physical education are barely taught, because the teachers are under pressure to prepare and assess their students in reading and arithmetic. At the Jungle, the school district has ordered its teachers to spend a minimum of three and a half hours per day teaching these two subjects. The entire school day is only seven hours long, and with recess and lunch taking one and a quarter hours, that leaves teachers only two and a quarter hours to teach all the other subjects we’re supposed to cover.

More often than not, these subjects have disappeared in elementary schools. Students, regular as well as GATE, will never discover they have a passion for mapmaking, painting, singing, biology, writing, or retracing the steps of Chief Crazy Horse. How can these children develop such interests if they don’t know these things exist? I’ve solved the problem by lengthening my school day to cover each of these forgotten subjects. It’s not rocket science. A child will more likely find something she likes to eat if there are more items on the menu.

Of these subjects, I’ve found music and drama to be crucial in reaching gifted students. The arts bridge the gap when kids of vastly different abilities are in the same room. With drama, a good teacher can find the right role providing the proper challenge for each individual. Recently, my fifth-graders performed an unabridged production of King Lear. A brilliant young girl played Goneril. It was challenging but possible for her to learn the part. Another child with less advanced language skills played a smaller role, but the experience of learning lines and being in the production was equally rewarding for her. In this way, each child can face a challenge and be part of a happy and successful fellowship of learning.

If offered music, children of different abilities can be singers, dancers, and musicians. There is something for everyone. Visitors are surprised to see that children in my class receive little homework. Because of our extended day, I don’t pile it on when they go home. They work less than one hour per night. When youngsters can perform Shakespeare and solve algebra problems, it’s easy to forget they’re still children. Good teachers make sure the kids have time to play baseball, listen to their favorite pop star, and just look at the clouds and relax. In addition, passionate students often create homework for themselves. Students who have become fascinated with history will go home and research things that interest them. Students with a love of music practice their instruments constantly. Good readers always have homework. An exciting day of class leads to children pursuing things at home for all the right reasons.

Rafe Esquith, There Are No Shortcuts (152-155).
[2003] Knopf Doubleday Publishing Group. Kindle Edition.

Wednesday, April 17, 2019


Students Voting With Their Feet

April 15, 2019

By Mark Bauerlein

Thirty years ago, when Allan Bloom, Bill Bennett, Dinesh D’Souza, Roger Kimball, Camille Paglia, Lynne Cheney, and many others decried the rise of identity politics in the humanities, the professors had a ready response.  “The humanities have never been more vibrant and relevant and rigorous,” they proclaimed.  The steady put-down of Dead White Male authors and artists that the conservative critics bemoaned was not a suppression that forsook the Western tradition and turned people off.  It was, instead, an exciting opening that brought unjustly overlooked individuals and cultures into the curriculum.  The humanities had never been so healthy and attractive! 

That was in 1994.  It was hard then for critics of the humanities to prove them wrong.  After all, many of those critics weren’t even practicing professors, and the ones who were practicing professors were mightily outnumbered on campus.  While the critics were writing best-selling books such as Tenured Radicals and appearing on talk shows, the professors were changing the syllabus and devising new theories and creating new job descriptions.  The critics got all the publicity—Bloom became a national celebrity—but they had little impact on the course of the disciplines.  Why should the professors care what the critics said?  They had the jobs, they controlled the hiring. 

And so they drove ever farther into Queer Theory, Postcolonialism, Ecocriticism, Cultural Studies, Gender Theory, Intersectionality, and various exotic sub-formations.  Let the conservatives howl all they want.  The rising generation of humanities scholars and teachers would transform the study of the past, and nobody could stop them. 

That was how bold and ambitious the professors were in the Nineties.  They were on a roll, riding high, convinced that the institution was in their hands and would last forever. 

All that confidence is gone now, and it wasn’t the critics who took it away.  Go to conferences, mingle in departments, talk to editors in the fields, and you will find a general pessimism and quiet desperation.  The cause isn’t Donald Trump.  It’s the undergraduates, who over the last several years have been turning away from humanities courses and going to other majors.  According to various reports, the number of degrees granted in history, English, foreign languages, and philosophy has fallen disastrously.  That’s the cold reality humanities professors face, and it has squelched their revolutionary dreams.  

Disciplines that used to stand at the center of higher education are now at the margins.  

Let’s get specific.  Right now, English, history, foreign languages, and philosophy together collect a mere one in 20 of all the bachelor’s degrees awarded each year.  According to the American Historical Association, from the years 2011 to 2017, the number of history degrees earned fell more than 30 percent, while degrees in philosophy, English, and foreign languages went down more than 20 percent.  We have reached the point that the humanities are a negligible part of most Americans’ undergraduate experience. 

What can the professors say about this?  It leaves them at a total loss.  It is easy for them to denounce conservative critics as reactionary, uptight, racist, and sexist, but they can’t attack 19-year-olds in the same way.  Indeed, to criticize the young is to start sounding conservative!  But how are they to respond to the loss of their formerly captive audience?  The professors who believed fervently in the rightness and goodness of their actions must admit that they have presided over an institutional collapse of their domains that is astonishing, not to mention caused by an unexpected source.   

The professors believe just as strongly as ever that the turn of their fields away from traditionalist orientations (Great Books, Western Civilization, the Great American Novel) and toward identity themes of race and sex was wholly just and timely.  But those convictions haven’t carried over to undergraduates.  They’re not interested in what today’s humanities professoriate has to offer.  The professors still think intersectionality is a captivating idea, but it doesn’t seem to have inspired very many 20-year-olds.  What is a professor to do when the small revolution he helped bring about proves to be a bust? 

There is, however, another small revolution taking place right now.  Or rather, not a revolution but a restoration.  At several campuses around the country, professors have created special programs devoted to teaching the very works, using the very approaches, that the humanities theorists derided starting several decades ago.  These programs are frank about wanting to maintain the Western tradition.  They refer unabashedly to “the classics.”  They boast of their commitment to the canon.  They avoid the trendy language of cultural theory. 

Everything about them is unfashionable.  They sound oh-so-19th century.  All the hot social issues (white privilege, toxic masculinity . . .) they ignore.  

The people operating those programs sound as if they missed the moment, long ago, when the insertion of political awareness into the humanities was cast as a liberation that academics would never abandon.  Indeed, that political awareness was understood as disciplinary competence.  If you can’t do race theory, you don’t qualify for the field. 

What is the result of these out-of-touch, backward-looking, old-fashioned programs?  It is the opposite of what is happening in standard humanities departments.  Applications and enrollments are climbing, sometimes astoundingly. 

At Clemson University, an initiative called the Lyceum Program offers a minor in Political Science that emphasizes classics of political theory: the Greeks, Machiavelli, Marx, Mill.  Students have to enroll in eight courses taught as Socratic seminars, and they have to meet with professors each week for one-on-one tutorials.  The program offers ten scholarships per year ($2,500).  When it started four years ago, the program got 200 applications.  This year, it got more than 650.  Demand is so high that the program has opened up to non-scholarship students, kids who get no money but can enroll in the courses.  In two years, those enrollments have climbed from zero to 102. 

The University of Texas-Austin has the Jefferson Center for the Study of Core Texts & Ideas, which announces up front that it involves “the study of the great books.”  It began in 2014.  Currently, the program gets 500 applications for only 130 available posts in its Scholars Program. 

One of the worst consequences of the politicization of the academic humanities is its drift down into the secondary level.  High school humanities teachers have picked up the identity focus and changed their syllabi accordingly–more contemporary, multicultural material, less traditional, canonical, dead White Male stuff.  Students are stuck in those courses; they can’t go down the hall and pursue other subjects.  

But when alternative schools open, such as classical education charter schools, parents leap at them.  Great Hearts Academies is a charter network that began in the mid-00s as a single middle school in Arizona.  The curriculum proudly imparts Western Civilization, praising the moral imagination, not social justice, teaching Latin and Greek, not climate change.  Here is how the network has changed: it now operates in Arizona and Texas in 28 schools with 17,000 students and some 14,000 kids on the waiting list.  

It is hard not to draw the conclusion from these examples (one could list many more) that these programs have prospered because the humanities departments have abandoned the traditional turf.  

Demand for the old-fashioned humanities remains, and regular departments aren’t meeting it.  The numbers make the argument all by themselves.   

They also chart the way toward the revival of the humanities in higher education.  No matter what happens in the humanities world of research, pedagogy, conferences, publishing, and all the other things the professors do, if undergraduates continue to slide away from humanities courses, the fields can’t thrive.  We should look closely at these success stories and give no more credence to the academic left and its promises of social progress. 

Last month, I gave a lecture on the humanities at a small Christian college in the middle of the country.  In the question & answer period, a young man stated that he had gotten a degree in English at the school, which gave him traditional instruction in the canon.  But now, he continued, he was enrolled in a graduate literature program at a large state university and was receiving a more critical instruction in issues of identity and “exclusion.”  He was quite gratified by that.  What did I think? 

I answered: “Right now, debates over the canon and who gets included and excluded don’t really matter.  We are in a survival situation.  All theoretical questions begin with how the answers will affect enrollments.”  

We have to consider what 19-year-olds want.  Do they find the identity politics classroom compelling?  Only a few of them do.   

Do they find Hamlet’s dilemma interesting?  Is Jane Eyre’s situation meaningful?  Do Beethoven’s symphonies excite them?  Do they like Impressionist paintings?  Yes, they do, lots of them.  Undergraduate tastes spell the end of the political hijacking of the humanities.  Smart professors will realize this and get back to the core of their mission: to pass along the Great Books and High Culture of the past. 

Friday, April 12, 2019


       Let me emphasize the term story. Professional historical writing has, for a great many years now, been resistant to the idea of history as narrative. Some historians have even hoped that history could be made into a science. But this approach seems unlikely ever to succeed, if for no other reason than that it fails to take into account the ways we need stories to speak to the fullness of our humanity and help us orient ourselves in the world. The impulse to write history and organize our world around stories is intrinsic to us as human beings. We are, at our core, remembering and story-making creatures, and stories are one of the chief ways we find meaning in the flow of events. What we call “history” and “literature” are merely the refinement and intensification of that basic human impulse, that need.

        The word need is not an exaggeration. For the human animal, meaning is not a luxury; it is a necessity. Without it, we perish. Historical consciousness is to civilized society what memory is to individual identity. Without memory, and without the stories by which our memories are carried forward, we cannot say who, or what, we are. Without them, our life and thought dissolve into a meaningless, unrelated rush of events. Without them, we cannot do the most human of things: we cannot learn, use language, pass on knowledge, raise children, establish rules of conduct, engage in science, or dwell harmoniously in society. Without them, we cannot govern ourselves.

        Nor can we have a sense of the future as a time we know will come, because we remember that other tomorrows have also come and gone. A culture without memory will necessarily be barbarous and easily tyrannized, even if it is technologically advanced. The incessant waves of daily events will occupy all our attention and defeat all our efforts to connect past, present, and future, thereby diverting us from an understanding of the human things that unfold in time, including the paths of our own lives.

        The stakes were beautifully expressed in the words of the great Jewish writer Isaac Bashevis Singer: “When a day passes it is no longer there. What remains of it? Nothing more than a story. If stories weren’t told or books weren’t written, man would live like the beasts, only for the day. The whole world, all human life, is one long story.”

Wilfred McClay, Land of Hope: An Invitation to the Great American Story (New York: Encounter Books, 2019)