Wednesday, May 29, 2019


Process vs. Content, Houston, Texas

Will Fitzhugh

The Concord Review
19 June 2006

When teaching our students to write, not only do most high schools set standards very low, limiting students to the five-paragraph essay, responses to a document-based question, or the personal (or college) essay about matters which are often no one else’s business, but we also load students up with so many formulae and guidelines that the importance of writing when the author has something to say gets lost in the maze of processes.

On the one hand, writing is difficult enough to do, and academic writing is especially difficult if the student hasn’t read anything; on the other hand, teachers feel the need to have students “produce” writing, however short or superficial that writing may be. So writing consultants and writing teachers feel they must come up with guidelines, parameters, checklists, and the like, as props to substitute for students’ absent motivation to describe or express in writing something they have learned.

Samuel Johnson once said, “an author will turn over half a library to produce one book,” the point being, as I understand it, that good writing must be based on extensive reading. But reading is just the step that is left out of the “Writing Process” in too many instances. The result is that students in fact do not have much to say, so of course they don’t have much they want to communicate in writing.

Enter the guidelines. Students are told to write a topic sentence, to express one idea per paragraph, to follow the structure of Introduction, Body, Conclusion, to follow the Twelve Steps to Effective Writing, and the like. This the students can be made to do, but the result is too often empty, formulaic writing which students come to despise, and which does not prepare them for the serious academic papers they may be asked to do in college.

I fear that the history book report, at least at the high school level in too many places, has died in the United States. Perhaps people will contact me with welcome evidence to the contrary, but where it is no longer done, students have not only been discouraged from reading nonfiction, but also have been led to believe that they can and must write to formula without knowing something—for instance about the contents of a good book—before they write.

A nationally famous teacher of teachers of writing once told me: “I teach writing, I don’t get into content that much...” This is a splendid example of the divorce between content and process in common writing instruction. 

Reading and writing are inseparable partners, in my view. In letters from authors of essays published in The Concord Review over the years, they often say that they read so much about something in history that they reached a point where they felt a strong need to tell people what they had found out. The knowledge they had acquired had given them the desire to write well so that others could share and appreciate it as they did.

This is where good academic writing should start. When the motivation is there, born from knowledge gained, then the writing process follows a much more natural and straightforward path. Then the student can write, read what they have written, and see what they have left out, what they need to learn more about, and what they have failed to express as clearly as they wanted to. Then they read more, re-write, and do all the natural things that have always led to good academic writing, whether in history or in any other subject.

At that point the guidelines are no longer needed, because the student has become immersed in the real work of expressing the meaning and value of something they know is worth writing about. This writing helps them discover the limits of their own understanding of the subject and allows them to see more clearly what they themselves think about the subject. The process of critiquing their own writing becomes natural and automatic. This is not to deny, of course, the value of reading what they have written to a friend or of giving it to a teacher for criticism and advice. But the writing techniques and processes no longer stop up the natural springs for the motivation to write.

As students are encouraged to learn more before they write, their writing will gradually extend past the five-paragraph size that so often constrains the craft of writing in our schools. Our Page Per Year Plan© suggests that all public high school Seniors could be expected to write a twelve-page history research paper, if they had written an eleven-page paper their Junior year, a ten-page paper their Sophomore year, and a nine-page paper their Freshman year, and so on all the way back through the five-page paper in Fifth Grade and even to a one-page paper, on a topic other than themselves, in their first year in school. With the Page Per Year Plan©, every Senior in high school will have learned, for that twelve-page paper, probably more about some topic than anyone else in their class knows, perhaps even more than any of their teachers knows about that subject. They will have had, in the course of writing longer papers each year, that first taste of being a scholar which will serve them so well in higher education and beyond.

Writing is always much harder when the student has nothing to communicate, and the proliferating paraphernalia of structural aids from writing consultants and teachers often simply encumber students and alienate them from the essential benefits of writing. John Adams urged his fellow citizens to “Dare to read, think, speak and write” so that they could contribute to the civilization we have been given to enjoy and preserve. Let us endeavor to allow students to discover, through their own academic reading and writing, both the discipline and the satisfactions of writing carefully and well.

In 1625, Francis Bacon wrote, “Reading maketh a Full man, Conference a Ready man, and Writing an Exact man.” These benefits are surely among those we should not withhold from our K-12 students.

The Concord Review, 730 Boston Post Road, Suite 24, Sudbury, Massachusetts 01776    978-443-0022

Saturday, May 25, 2019


The whole point of school reform is to have students learn more. If this doesn’t happen, the experiment is a failure, no matter how happy the children, the parents and teachers—and the reformers—are.

Albert Shanker, The New York Times, March 13, 1994

        For many years, when I talked to principals and superintendents in large urban school districts where youngsters were not doing very well, I’d hear the same thing. They’d tell me that they had a new plan or policy or initiative that was making a tremendous difference. You couldn’t see any difference in the test scores, they’d say, but that was not the whole story. If I would only go into the schools, I’d see kids smiling, and I’d feel the warmth in the classrooms. That would tell me a whole lot more than test scores.

        There were always some hard-nosed reformers who didn’t buy that idea. They insisted that you had to judge the success of schools by the results—how well students performed. Some of these people were key players in creating the big Chicago school reform in 1988. This removed much of the authority from the central school board and brought it down to the school level, creating councils of parents and teachers and community members with the authority to hire and fire principals, spend money and establish curriculum—in other words, it introduced the kind of school-level empowerment that many believed was needed if we hoped to improve student learning.

        What’s happened? Several months ago, I was in Chicago with a group that was looking at the reform. With us were the key players who had always pushed the idea that the important thing was test scores—if you didn’t have good test scores, the kids weren’t learning.

        I asked them, “Now that the law’s been changed so that parents and teachers and the principal are in charge of what goes on in school instead of a bunch of central office bureaucrats, have the test scores gone up?” What I got from these people—those who were leaders in this revolution—was a variation of what I used to hear from principals and superintendents: “No,” they said, “the test scores haven’t gone up, but you should see the children smiling—how active and happy they are—and how the school councils are working together to make some very exciting changes.”

        That’s basically what a big report these people did on the Chicago reform also said, except it used a lot of data and scholarly apparatus. The report told us that reform is fragile but it is coming along. It classified four different types of school politics and five different types of school improvement initiatives. It found that small schools were more likely to be undertaking ambitious reforms than larger ones and said how good it is when people work together. But the report ducked the question of whether or not any of the changes were beginning to achieve what the reform intended—raise the achievement of Chicago’s students out of the cellar.

        Is student learning better in schools where there are democratic school councils or in schools where the principal runs the show? Have what the report calls “practices associated with ‘authentic learning’” produced any “authentic learning” in schools where they have been introduced? The report does say that the reform has not harmed the achievement in schools that were already doing well (a good thing). And it offers enthusiastic quotes from students and teachers in reforming schools, but no data or discussion about student performance. It talks all about process when what we want to hear about is substance.

        Am I saying that Chicago’s schools should show an across-the-board jump in student achievement after five years of reform? Of course not. Reforming any large institution is extraordinarily difficult. But it’s ridiculous and dangerous for a report on the status of the reform to omit what should be its centerpiece—an attempt to measure progress toward the goal of improved student achievement and an analysis of what seems to be working and what seems to be failing.

        Proposals for radical school decentralization are very popular now. New York City is considering a division into five borough school systems, and a number of states have or are looking at charter school proposals, which would allow people to establish free-standing public schools that are independent of local school boards and can do pretty much what they want. The jury is still out on whether decentralization in Chicago will significantly improve student achievement. But the five years of reform do illustrate one thing. When school decentralization—or any other reform—is put in place, we need to be careful that the new “owners” of the school system don’t change the rules. 

Monday, May 20, 2019


Bookless Wonders
Will Fitzhugh

The Concord Review
The Ides of August, 2013

Someone told me that Muslims honor “People of the Book.” They have the Quran, the Jews have the Old Testament, the Christians have the New Testament, the Hindus have the Bhagavad-Gita, and so on. Perhaps the Analects and the Tao Te Ching belong in that group as well. But those are sacred books (not scared, sacred—just google it, if the word is unfamiliar...)

American students, in contrast, seem on their way to becoming what might be called Bookless Wonders. Not only do they not read the great religious texts, but they are more likely to read Catcher in the Rye (written at the fifth-grade level) than to read a single complete history book while they are in high school. Perhaps many educators now feel history is really passé in the 21st Century—after all, it is just about the Past, no?

In fact, Renaissance Learning, in a 2013 report on the books most commonly read by American high school students, found that the top forty books they read are, on average, at the fifth-grade reading level (that is, four to seven years below their current grade level).

Since the 2003 National Endowment for the Arts study of the reading of fiction among young people and others, I have wanted to do a study of the assignment of complete nonfiction books (e.g. history books) to U.S. public high school students, but no one else wants to know about that, and no one will fund our study, so no one knows, Q.E.D.

On the bookshelves of one of our history classrooms in the high school in Concord, Massachusetts, was a set of Profiles in Courage by John F. Kennedy (or perhaps Ted Sorensen). I asked my colleagues about them sitting there, and they said they used to hand them out to classes, but no one read them, so they had stopped handing them out.

As a high school history teacher back in the day (1980s) I quickly learned that my students had no need to read the pages I assigned, because in class I would ask a few questions and then, of course, go over the material in the reading assigned anyway.

As Chester Finn reminded us many years ago, students are not stupid, and if they don’t have to do certain school work (like reading and writing), they won’t do it.

Perhaps Korean students, and American students who want to go to Caltech or Stanford, might do the work anyway, but most high school students, who, according to the Kaiser Foundation, are spending 53 hours a week with electronic entertainment media, and of course another big chunk of time with their friends, do not want to do any reading or writing they don’t absolutely have to do.

The Summer college reading lists, studied again this year by the National Association of Scholars, seem to favor books written since 1990…. They report that 97% of the 309 colleges and universities they studied chose books published in 1990 or later for their students to read in the Summer. The most popular book by far was The Immortal Life of Henrietta Lacks (2010).

“There were no classics of history; nor biographies, speeches or writings by American political leaders (1620-2013); no works by ancient philosophers; no works of the Enlightenment; no classical works of Christian, Jewish, Muslim, Hindu, Buddhist, or Confucian thought, and no scientific classics.”

Other than that, there were lots of books on the hot political topics of sustainability, race, diversity, class, and gender.

In the 17th century, one of the reasons for the high literacy of Americans was that just about every family read the Bible.

Perhaps one reason for the almost universally condemned inability (illiteracy) of our current college students (and many employees) to read very well and to write clearly is that we have given up on having students read good books (and write serious term papers) at every level of our education system. You think?

As Mark Twain said: “The man who does not read good books has no advantage over the man who cannot read them.”

Monday, May 13, 2019


Diane Ravitch, in Where Did Social Studies Go Wrong? Fordham 2003 [excerpt 2-4]

Over the past century, the teaching of chronological history was steadily displaced by social studies. And for most of the century, the social studies establishment eagerly sought to reduce the status of chronological history, in the belief that its own variegated field was somehow superior to old-fashioned history. Given the plasticity of its definition, the social studies field has readily redefined its aims to meet whatever the sociopolitical demands of the age were. As a consequence, it is now the case that all history teachers are also social studies teachers, but there are many social studies teachers who do not teach history and who have never studied history.

History, once a core subject of study in every grade beginning in elementary school, lost its pride of place over the years. When social studies was first introduced in the early years of the 20th century, history was recognized as the central study of social studies. By the 1930s, it was considered primus inter pares, the first among equals. In the latter decades of the 20th century, many social studies professionals disparaged history with open disdain, suggesting that the study of the past was a useless exercise in obsolescence that attracted antiquarians and hopeless conservatives. (In the late 1980s, a president of the National Council for the Social Studies referred derisively to history as “pastology.”)

A century ago, the study of history was considered a modern subject. In the early decades of the 20th century, most high schools in the United States offered a four-year sequence in history that included ancient history, European history, English history, and American history. Most also offered or required a course in civics. Even as the study of history appeared to be firmly anchored in the schools, history textbooks began to improve over the static models of the 19th century, which tended to plod through dull recitations of political events. Historians like Charles Beard, Edward Eggleston, and David Saville Muzzey sought to incorporate political, social, and economic events into their telling of history.

Even the elementary grades offered a rich mix of historical materials, such as biographies of famous men (and sometimes women), history tales, hero stories, myths, legends, and sagas. Teachers for the early grades often took courses to learn about myths, legends, and storytelling, knowing that this was an important feature of their work. Consequently, many—perhaps most—children arrived at the study of Greece and Rome in high school with a well-stocked vocabulary of important figures and classical myths.

Until 1913, history was history and “social studies” was virtually unknown. In that year, a committee of educationists issued a report on the reorganization of the secondary curriculum that placed history into the new field of social studies. This report, eventually published as part of the Cardinal Principles of Secondary Education, was written under the chairmanship of Thomas Jesse Jones, a prominent reformer and social worker who had taught social studies at the Hampton Institute in Virginia to African Americans and American Indians. Jones was one of the first to use the term “social studies.” He was a strong believer in useful studies, such as industrial and trade education. He was very much part of the progressive avant garde that believed that academic studies were necessary for college preparation but inappropriate for children who were not college-bound, that is, children of workers, immigrants, and nonwhites.
Leading educational theorists, like Jones and David Snedden of Teachers College, viewed education as a form of social work and thought that children should study only those subjects that would provide immediacy and utility in their future lives.

The Jones report on social studies, incorporated into the famous Cardinal Principles report of the National Education Association in 1918, suggested that the goal of social studies was good citizenship and that historical studies that did not contribute to social change had no value. This report, when it appeared and for many years afterwards, was considered the very height of modern, progressive thought. It had a devastating impact on the teaching of history and gave a strong boost to its replacement by social studies.

Since it was hard to argue that the study of ancient history, European history, or English history contributed to social change or to improving students’ readiness for a vocation, these subjects began to drop out of the curriculum. They were considered too “academic,” too removed from students’ immediate needs. They made no contribution to social efficiency. The committee in charge of reorganizing the secondary curriculum saw no value in such abstruse goals as stimulating students’ imaginations, awakening their curiosity, or developing their intellects.

It was in the spirit of social efficiency that the field of social studies was born. Some educational activists (like Thomas Jesse Jones) thought that the purpose of social studies was to teach youngsters to adapt to (and accept) their proper station in life.

Some thought that the goal of social studies was to teach them the facts that were immediately relevant to the institutions of their own society. Some preferred to teach them useful skills that would prepare them for the real world of family life, jobs, health problems, and other issues that they would confront when they left school.

This utilitarian emphasis undercut the teaching of history in the high schools.

Wednesday, May 1, 2019


Just as the words vulgar, unseemly, and dishonorable are not ordinarily used in conversation today, neither is virtue. The disuse of virtue is part of today’s non-judgmentalism. It’s acceptable for people to have values, which will differ across people (and who is to say that one set of values is better than another?), but the word virtue carries with it connotations of invariance and objectivity. And rightly so. Let me make a brief case for the objective, universal applicability of the cardinal virtues in our quest to become not just nice, but good. Nice and good are different. Being nice involves immediate actions and immediate consequences—you give water to the thirsty and comfort to the afflicted right here, right now. Being good involves living in the world so that you contribute to the welfare of your fellow human beings. Sometimes the immediate and long-term consequences are consistent with being nice; sometimes they are in conflict.

That’s where the importance of the cardinal virtues comes in. The four cardinal virtues were originated by the Greeks. They subsequently got their label from the Latin cardo, meaning “hinge,” because they are pivotal: All the other virtues, and the living of a virtuous life, depend on them. If you took an introductory philosophy course in college, they were probably translated from the Greek as courage, justice, temperance, and prudence.

Courage, meaning not just physical but also moral courage, is pivotal because no virtue is sustained in the face of adversity without it.

Justice—as defined by Aristotle, giving everyone his rightful due—is pivotal because it is a precondition for behaving in other virtuous ways (for example, the virtue of compassion rightly takes different forms for people in different circumstances).

Temperance is, to modern ears, an unfortunate label. It sounds insipid. Aren’t we supposed to live life to the fullest? Haven’t we decided, along with Mae West and Liberace, that too much of a good thing can be wonderful? But when you stop to think about it, too much of a good thing isn’t wonderful. It cloys. Satiates. The pleasure ends. If you still are unhappy with the idea of being temperate, think in terms of self-restraint and knowing oneself, both of which are part of the meaning of sophrosyne, the word Plato used for this virtue. Temperance is pivotal because, without it, any subsidiary virtue will be ignored when it competes with natural appetites.

That leaves prudence, the cardinal virtue that requires the most work and time for you to acquire. It is also the virtue with the most unappealing label of all, with its connotation of timidity. The idea of other people saying of me, “Charles is very prudent,” is mortifying. But this is a function of evolving language.

Prudence has acquired negative connotations that it did not formerly possess. Let’s go back to the original Greek word for this cardinal virtue, phronesis. Aristotle talks about two kinds of wisdom. One is the ability to apprehend reality and make the pieces fit together—roughly, the kind of wisdom that underlies science. Phronesis is the word Aristotle used for the other kind of wisdom, better translated in the twenty-first century as practical wisdom. Phronesis is harder to come by than scientific knowledge. Studying reality is not enough. Practical wisdom means the ability to rightly assess the consequences of a course of action. Knowledge is necessary, but so is experience. You might want to be compassionate, for example, but without practical wisdom you might behave in ways that cause suffering rather than relieve it. Balancing all the considerations that go into rightly assessing long-term consequences is difficult, and it requires both thoughtfulness and a deep understanding of human life. Thus the cardinal virtue of practical wisdom is pivotal because it is the precondition for behaving in other virtuous ways.

Hence my proposition: The cardinal virtues are indispensable to being good. I don’t mean that theoretically, but in the course of going about your daily life. You really, truly, must be courageous, just, temperate, and possess practical wisdom if you also wish to be dependably kind, merciful, compassionate, tolerant, patient, or to practice any of the other virtues. Lacking the cardinal virtues, you can act in those other virtuous ways haphazardly, and occasionally have the effect you wish, but you cannot consistently have the effect you wish, nor will you be able to bring yourself to behave in those other virtuous ways when the going gets tough. You will still mean well. You will still be nice. You won’t be good. 

You don’t need to be an Aristotelian to be good. For two millennia, the world’s other most influential ethical system was Confucianism. The central virtue in Confucianism is ren, the summation of all subsidiary virtues. Ren translates as humaneness or benevolence, but the Confucian conception of ren is richer than either word conveys. Ren incorporates the idea of reciprocity (a form of the Golden Rule), which overlaps with Aristotle’s concept of justice. Ren incorporates courage. Confucianism is emphatic about the need for temperance and self-control. And one of the chief components of ren is the considered, accurate appraisal of consequences that Aristotle described as practical wisdom. If you are a good Confucian, you will be practicing the cardinal virtues.

Whether you find inspiration in the Western or the Eastern tradition is a minor issue. What is unacceptable is to go through life thinking that being nice is enough. You must come to grips with the requirements for being good.

Charles Murray, The Curmudgeon's Guide to Getting Ahead (114-118). [2014] The Crown Publishing Group. Kindle Edition.