Friday, November 8, 2024

CHURCHILL

 The New Criterion
“One Hundred Fifty Years of Churchill” (excerpt)
by Larry P. Arnn

 There has been no better eulogy for Churchill than that given by Leo Strauss, a German Jew who left Germany in time to escape the death camps. One of his teachers was Martin Heidegger, a philosopher of note and a Nazi, who provided some of the impulse for Strauss to return to the classics and begin the recovery of political philosophy as a quest for the truth. Churchill died on January 24, 1965. When Strauss came into class and was informed of Churchill’s death, he said:


“The death of Churchill is a healthy reminder to students of political science of their limitations, the limitations of their craft.


“The tyrant stood at the pinnacle of his power. The contrast between the indomitable and magnanimous statesman and the insane tyrant—this spectacle in its clear simplicity was one of the greatest lessons which men can learn, at any time.


“No less enlightening is the lesson conveyed by Churchill’s failure, which is too great to be called tragedy. I mean the fact that Churchill’s heroic action on behalf of human freedom against Hitler only contributed, through no fault of Churchill’s, to increase the threat to freedom which is posed by Stalin or his successors. Churchill did the utmost that a man could do to counter that threat—publicly and most visibly in Greece and in Fulton, Missouri.


“Not a whit less important than his deeds and speeches are his writings, above all his Marlborough—the greatest historical work written in our century, an inexhaustible mine of political wisdom and understanding, which should be required reading for every student of political science.


“The death of Churchill reminds us of the limitations of our craft, and therewith of our duty. We have no higher duty, and no more pressing duty, than to remind ourselves and our students, of political greatness, human greatness, of the peaks of human excellence. For we are supposed to train ourselves and others in seeing things as they are, and this means above all in seeing their greatness and their misery, their excellence and their vileness, their nobility and their triumphs, and therefore never to mistake mediocrity, however brilliant, for true greatness.”

Thursday, October 10, 2024

Franklin to Washington, 1787

“I confess that I do not entirely approve of this Constitution at present, but Sir, I am not sure I shall never approve it: For having lived long, I have experienced many Instances of being oblig'd, by better Information or fuller Consideration, to change Opinions even on important Subjects, which I once thought right, but found to be otherwise.”


“When you assemble a Number of Men to have the Advantage of their joint Wisdom, you inevitably assemble with those Men all their Prejudices, their Passions, their Errors of Opinion, their local Interests, and their selfish Views. From such an Assembly can a perfect Production be expected?”


“It therefore astonishes me, Sir, to find this System approaching so near to Perfection as it does; and I think it will astonish our Enemies, who are waiting with Confidence to hear that our Councils are confounded, like those of the Builders of Babel, and that our States are on the Point of Separation, only to meet hereafter for the Purpose of cutting one another’s Throats. Thus I consent, Sir, to this Constitution because I expect no better, and because I am not sure that it is not the best.


“The Opinions I have had of its Errors, I sacrifice to the Public Good. I have never whisper’d a Syllable of them abroad. Within these Walls they were born, & here they shall die. If every one of us in returning to our Constituents were to report the Objections he has had to it, and endeavour to gain Partizans in support of them, we might prevent its being generally received. . .


“Much of the Strength and Efficiency of any Government, in procuring & securing Happiness to the People depends on Opinion, on the general Opinion of the Goodness of that Government as well as of the Wisdom & Integrity of its Governors. I hope therefore that for our own Sakes, as a Part of the People, and for the Sake of our Posterity, we shall act heartily & unanimously in recommending this Constitution, wherever our Influence may extend, and turn our future Thoughts and Endeavours to the Means of having it well administered.


“On the whole, Sir, I cannot help expressing a Wish, that every Member of the Convention, who may still have Objections to it, would with me on this Occasion doubt a little of his own Infallibility, and to make manifest our Unanimity, put his Name to this Instrument.”

Thursday, September 26, 2024

COOLIDGE

During most of my course George Sherman was the principal and Miss M. Belle Chellis was the first assistant. I owe much to the inspiration and scholarly direction which they gave to my undergraduate days. They both lived to see me President and sent me letters at the time, though they left the school long ago. It was under their teaching that I first learned of the glory and grandeur of the ancient civilization that grew up around the Mediterranean and in Mesopotamia.

Under their guidance I beheld the marvels of old Babylon, I marched with the Ten Thousand of Xenophon, I witnessed the conflict around beleaguered Troy which doomed that proud city to pillage and to flames, I heard the tramp of the invincible legions of Rome, I saw the victorious galleys of the Eternal City carrying destruction to the Carthaginian shore, and I listened to the lofty eloquence of Cicero and the matchless imagery of Homer.

They gave me a vision of the world when it was young and showed me how it grew. It seems to me that it is almost impossible for those who have not traveled that road to reach a very clear conception of what the world now means. It was in this period that I learned something of the thread of events that ran from the Euphrates and the Nile through Athens to the Tiber and thence stretched on to the Seine and the Thames to be carried overseas to the James, the Charles and the Hudson. I found that the English language was generously compounded with Greek and Latin, which it was necessary to know if I was to understand my native tongue. I discovered that our ideas of democracy came from the agora of Greece, and our ideas of liberty came from the forum of Rome. Something of the sequence of history was revealed to me, so that I began to understand the significance of our own times and our own country.

Calvin Coolidge, [1929] The Autobiography of Calvin Coolidge: Authorized, Expanded, and Annotated Edition (51-52). Intercollegiate Studies Institute. Kindle Edition.


Monday, September 23, 2024

NEW GI

But the greatest lesson was one learned from the enemy: to hate. Word of the Malmédy Massacre, of the massacres of civilians at Stavelot, Trois-Ponts, and Bande, passed from man to man, and from unit to unit. Until the Ardennes the GI had fought a civilian’s war. Now he was learning to kill without remorse or pity.

 

John Toland, Battle: The Story of the Bulge
New York: Random House, 1959, 329-331


    Reports from men drifting back from the front were alarming. Entire units, claimed many wild-eyed refugees, had been cut off and were being wiped out. Back at Division, General Grow had no clear idea of how great his casualties were. But he did know that it was the worst day in the history of his 6th Armored Division.

    The retreat of the 6th Armored wasn’t the only reverse on George Patton’s front. For the savage meeting engagement was at its climax. Violent German attacks had struck all along the Bastogne front. In particular the 17th Airborne Division, in their first real day of action a few miles west of town, had been dealt shocking casualties, some battalions losing 40 per cent of their men.

    Ordinarily the most optimistic of American generals, Patton was now in a despondent mood. Each man lost that day weighed heavily on him. He sat at a desk and wrote in his diary, “We can still lose this war.”

    But at the front, later that night, a strange thing began to take place where the day’s disaster had been greatest. Men stopped running, and were digging in. Terror was being replaced by anger.
 
    Not far behind the 6th Armored Division breakthrough area, Colonel John Hines was in a stone house where refugees from the front were thawing out their frozen rifles and frozen bodies.

    One man, his face covered with blood and dirt, his eyes two bitter holes, was saying, “I used to wonder what I was doing in the army. I didn’t have anything personal against the Krauts, even if they were making me live in a freezing, frigging foxhole. But I learned something today. Now I want to kill every goddam Kraut in the world. You know why? To save my own ass.”

    There was a new GI in the Ardennes.

    The good-natured, rather careless, supremely confident GI who had known one victory after another since landing in Normandy; who had assumed he would be well clothed, well fed, and well led; who accepted it as his heritage to outgun and outmachine the enemy, was gone. Since December 16 he’d had few days of the overpowering air support and air cover he’d taken for granted; his clothing didn’t keep out the cold; his boots were traps for trench foot, his tanks were outnumbered; often his machines were immobilized by cold, snow, and terrain.

    He was cold and hungry. He had just fought a humiliating series of retreats where terror roamed far behind the lines. He had tasted defeat.

    But he had learned bitter lessons that were beginning to pay off. In this first major winter battle ever fought by Americans he had learned that the wounded die fast in the zero cold. He had learned in a few weeks that cold is a living enemy and must be fought.


Friday, September 13, 2024

FLOGGING

 John Prebble, Culloden [1746]
New York: Atheneum, 1962, pp. 20-21

The Age of Reason may have wished its armies would behave like Hectors, and every man may indeed, as Johnson claimed, have thought meanly of himself for not having been a soldier, but the reality of life was not that imagined by the Patriot Muses of The Gentleman’s Magazine. It was dirty, depraved and despised. All men preyed on the soldier, and in his turn he robbed and bullied them. To his colonel he was frequently a toy, to be dressed in bizarre and fanciful uniforms that must have given battle an added horror. He stood on a no-man’s-land outside the law, its victim and its guardian. When called to support it during civil riots he risked death by shooting if he refused, and trial for murder by the civil authority if he obeyed. The whip, the nine-tail cat with knots of precise size, kept him in order, and his wife or his woman could be disciplined by the whirligig. In this chair she was strapped and spun through the air until she suffered the vomiting sensations of sea-sickness. A soldier who asked permission to marry a doxy who had loyally followed him through a campaign, risked a hundred lashes for impertinence.

Flogging was notoriously commonplace. Almost every day’s entry in the Order Books contains the names of one, two, or three men sentenced to the lash, receiving anything from the minimum of twenty-five strokes to the maximum of three thousand. Men boasted their endurance of the cat. A drummer bragged that he had received twenty-six thousand lashes in fourteen years, and his officers agreed, with admiration, that four thousand of them had been given between the February of one year and the February of the next. Life for the foot-soldier was punctuated by the lash and the pox. Battle came almost as a relief. It was often his only discharge in a war.

For his sixpence a day he was expected to march from a town where innkeepers had either refused to serve him, or had robbed him when drunk, to eat a breakfast of dry bread and water, to watch his officers indulge in chivalrous courtesies with enemy officers while the lines closed, and then to endure a murderous exchange of musketry or grape at one hundred paces. “We ought to returne thanks to God,” wrote a sergeant of Foot from Flanders, “for preserving us in ye many dangers we haue from time to time been exposed unto...” But thanking God was not always easy when His mercy was hard to find...

Friday, August 23, 2024

JEFFERSON

Let us, then, with courage and confidence pursue our own Federal and Republican principles, our attachment to union and representative government. 

Kindly separated by nature and a wide ocean from the exterminating havoc of one quarter of the globe; too high-minded to endure the degradations of the others; possessing a chosen country, with room enough for our descendants to the thousandth and thousandth generation; entertaining a due sense of our equal right to the use of our own faculties, to the acquisitions of our own industry, to honor and confidence from our fellow-citizens, resulting not from birth, but from our actions and their sense of them; enlightened by a benign religion, professed, indeed, and practiced in various forms, yet all of them inculcating honesty, truth, temperance, gratitude, and the love of man; acknowledging and adoring an overruling Providence, which by all its dispensations proves that it delights in the happiness of man here and his greater happiness hereafter—with all these blessings, what more is necessary to make us a happy and a prosperous people? 

Still one thing more, fellow-citizens—a wise and frugal Government, which shall restrain men from injuring one another, shall leave them otherwise free to regulate their own pursuits of industry and improvement, and shall not take from the mouth of labor the bread it has earned. This is the sum of good government, and this is necessary to close the circle of our felicities.

from Thomas Jefferson’s First Inaugural Address, March 4, 1801

Friday, August 16, 2024

KATYN FOREST

With only three Polish divisions covering the 800-mile-long eastern border, it came as a complete surprise when at dawn on 17 September [1939] the USSR invaded Poland, in accordance with secret clauses of the Nazi–Soviet Pact that had been agreed on 24 August. The Russians wanted revenge for their defeats at Poland’s hands in 1920, access to the Baltic States and a buffer zone against Germany, and they opportunistically grasped all three, without any significant resistance. Their total losses amounted to only 734 killed. Stalin used Polish ‘colonialism’ in the Ukraine and Belorussia as his (gossamer-thin) casus belli, arguing that the Red Army had invaded Poland ‘in order to restore peace and order.’ The Poles were thus doubly martyred, smashed between the Nazi hammer and the Soviet anvil, and were not to regain their independence and freedom until November 1989, half a century later.

In one of the most despicable acts of naked viciousness of the war, in the spring of 1940 the Red Army transported 4,100 Polish officers, who had surrendered to them under the terms of the Geneva Convention, to a forest near Smolensk called Katyń, where they were each shot in the back of the head. Vasily Blokhin, chief executioner of the Russian secret service, the NKVD, led the squad responsible, wearing leather overalls and an apron and long leather gloves to protect his uniform from the blood and brains, and using a German Walther pistol because it did not jam when it got hot from repeated use. (Nonetheless he complained he got blisters on his trigger finger by the end of the third day of continuous executions.) 


In all, 21,857 Polish soldiers were executed by the Soviets at Katyń and elsewhere—an operation which, after the Germans had invaded Russia, Stalin’s police chief Lavrenti Beria admitted had been ‘a mistake’. When the Germans uncovered the mass graves on 17 April 1943, Goebbels broadcast the Katyń Massacre to the world, but Soviet propaganda made out that it had been undertaken by the Nazis themselves, a lie that was knowingly colluded in by the British Foreign Office until as late as 1972, even though charges against the Germans over Katyń were dropped at the Nuremberg Trials.


Andrew Roberts, The Storm of War: A New History of the Second World War. HarperCollins. Kindle Edition.

Thursday, August 8, 2024

ALFRED THAYER MAHAN

Nathan Miller
The U.S. Navy: A History (Third Edition)
Annapolis: Naval Institute Press, 1997, 152-153

        
    Tall, balding, and ascetic-looking, [Alfred Thayer] Mahan was not a typical naval officer. Born at West Point, he was the son of Dennis Hart Mahan, a member of the faculty at the Military Academy and a pioneer in the teaching of strategy. Mahan obtained an appointment to the Naval Academy and because he had previously attended Columbia College for two years was allowed to enter the third class, the last man in the school’s history to be permitted to skip plebe year. In the twenty-five years of service that followed his graduation in 1859, Mahan drifted along with the tide, accomplishing little except for writing a small book on naval operations in the Civil War, The Gulf and Inland Waters. Impressed with this work, Luce [Commodore Stephen B. Luce, president, Naval War College] offered its author the post of lecturer in naval history. Mahan accepted with alacrity, but because an unsympathetic Navy Department ruled that he would have to complete his tour on the Pacific Station before reporting to Newport, he missed the school’s first term.


        
    Mahan’s duties on the Wachusett were not onerous, and as she lay in the dreary Peruvian port of Callao, he spent most of his time at the local English Club devouring every history book he could find in order to prepare himself for his new assignment. Trying to find a way to “make the experience of wooden sailing vessels, with their pop-guns, useful in the naval present,” he perceived in the long sweep of history a pattern that indicated that command of the sea had been a decisive factor in the rise and fall of empires. The idea came to him while he was reading Theodore Mommsen’s History of Rome. “It suddenly struck me,” Mahan related, “how different things might have been could Hannibal have invaded Italy by sea . . . instead of by the long land route, or could he, after arrival, have been in free communication with Carthage by water.” With every faculty “alive and jumping,” he saw that “control of the seas was an historic factor which had never been systematically appreciated and expounded.”


        
    When Luce was ordered to sea duty in 1886, Mahan was appointed president of the college. This was something of an empty honor. Government financing for the institution was pitifully inadequate, and he had to lobby steadily for funds to keep its doors open. The president’s quarters were in such deplorable condition that Mahan had to attach rubber tubing to a radiator to obtain bath water. He also had to fight off officers and civilians who wanted the course at Newport to devote less time to the study of strategy and more to the use of evolving technologies. Now a captain, he persevered in his efforts to keep the school open and found time to turn his lectures into a book that, after being rejected by several publishers, was published by Little, Brown and Company in 1890 as The Influence of Sea Power Upon History, 1660-1783...

Monday, August 5, 2024

MORAL CHOICE

I am a libertarian, and as such I believe that people should have the legal freedom to do almost anything that doesn’t involve force or fraud. I am also an admirer of both Edmund Burke and Adam Smith, and as such believe there are many things that people can do but may not do—that is, do not have the freedom to do without reproach.

An eminent English judge a century ago, John Fletcher Moulton, put it nicely: “Between ‘can do’ and ‘may do’ ought to exist the whole realm which recognizes the sway of duty, fairness, sympathy, taste, and all the other things that make life beautiful and society possible.” He called this realm “obedience to the unenforceable,” and it is the passing of that realm that has led to the disuse of vulgar, unseemly, and dishonorable.

Charles Murray, The Curmudgeon's Guide to Getting Ahead: Dos and Don’ts of Right Behavior, Tough Thinking, Clear Writing, and Living a Good Life (113). Crown. Kindle Edition.

Friday, August 2, 2024

RELEARNING

What we need instead is a rediscovery of fundamentals, an acknowledgment that the old ways work, and a realization that if we sweep away everything old and try to reimagine something better, we will have swept away everything of value.


Law & Liberty
Daniel Buck
August 1, 2024


The Great Relearning of American Schools


At the start of the pandemic, many elite institutions went test optional, only to reverse course and again require the submission of SAT or ACT scores. It’s an eye-catching list: Brown, Dartmouth, Georgetown, Harvard, and many more. In a fit of anti-testing sentiments, they toppled a statue, but one by one they recognized their error and re-erected it.


My colleague Adam Tyner has detailed at length the importance of college entrance exams. They may be biased toward affluent students, but holistic portfolios, personal essays, involvement in after-school activities, letters of recommendation, and other alternatives to test scores are even more so. Affluent kids can afford coaching and editing for personal essays, and their well-connected parents can ensure Johnny receives a letter of recommendation from some impressive individual.


But for that kid on the other side of town, the SAT may be his only chance to prove himself. And sure enough, research finds that standardized tests are the least biased measure.


No one contests that these tests are imperfect, but elite universities have rediscovered their very real utility. They are an important data point to help admissions officers sift and sort through inflated GPAs, AI-generated personal essays, and thick academic portfolios that take time to review.


Tom Wolfe discusses a similar story—the rediscovery of old truths, this time in the twentieth century—in his famous essay “The Great Relearning.” Hippies rediscover basic hygiene after deconstructing old bourgeois norms like regular washing, and finding themselves afflicted with diseases not seen in centuries—the itch, the twitch, the rot. Architects renounce brutalism and glass monstrosities and instead embrace Art Deco and classical architecture. It turns out the glass provides no insulation while beige, cubical, utilitarian office buildings make everyone into a nihilist.


Education is undergoing a similar relearning. How anyone decided it was oppressive to measure whether students had learned course material is beyond me. Thankfully, even elite universities have recognized that standardized tests are not racist or classist. K-12 institutions would be wise to relearn likewise.


This great relearning touches on other aspects of schooling too. In the summer of 2020, in a manner resembling a Maoist show trial, school system after school system denounced traditional discipline and consequence structures, instead committing themselves to anti-racist education and restorative justice. A few years on, as misbehavior worsens, more schools are rediscovering the basic principles of human nature: if you abstain from consequences or even reward misbehavior with a heart-to-heart and a bag of chips, you get more of it.


Accordingly, districts from Las Vegas to Washington, DC are implementing stricter discipline codes to empower teachers to maintain control in their classrooms, while no-excuse discipline policies are once again receiving plaudits in the media.


The same story is soon to be told about grading. At the onset of the pandemic, perhaps understandably, many schools went pass/fail, dropped penalties for late work, and adopted more lenient grading scales. After they reopened, “equity and mastery” grading fads cemented these changes long-term, suggesting that it’s inequitable or unfair to fail students or otherwise hold them accountable for their work.


These same schools quickly discovered that low expectations only allowed students to slack off, complete the bare minimum, and still pass the class; that grades hold students accountable, incentivize work and therefore learning, and communicate important information to parents, universities, and other stakeholders.


Once again, many districts are now questioning these equity grading initiatives and re-instituting more formal gradebook procedures.


Perhaps most holistically, classical education is experiencing a renaissance. Enterprising school founders have established 264 new classical schools since 2019 alone. The media has taken notice with a host of think pieces analyzing its growth. What’s the appeal?


Since at least the 1960s, American education has fundamentally changed its telos. Student activists implored their professors to ask not “what is true?” but rather, “If we wanted to change society, how would we do it?” This theory of education that sees schools and universities as societal change agents, not institutions of academic learning, is the philosophy du jour of university education departments. Teachers are to be conspirators in cultural upheaval, not stewards of our shared knowledge and cultural inheritance.


But devoid of a great canon of literature, a body of scientific knowledge worth knowing, a compelling mythos of the American founding, or a robust understanding of Western history—even more broadly, without a commitment to objective truth or virtue—education itself becomes tinny, insubstantial, impotent. Students are left to pursue their own insular interests, reflect on their identities and navels, or pursue the latest political happening.


Students crave an education that challenges the mind, enlivens the heart, and stirs the soul. Accordingly, parents are fleeing insipid public schools to instead seek out something more robust in a classical, liberal arts tradition stretching back to Aristotle.


I can only hope the same relearning will begin with instruction too. American education is enamored with the latest fad or innovative instructional design. Invariably, every progressive recommendation to “reimagine” instruction amounts to cutting the teacher-role from the classroom. Denigrated is the idea that the teacher should be the “sage on the stage”—transmitting knowledge through direct explanation, demonstrating skills, or guiding practice. Teachers are often advised to limit instruction to less than 10 minutes.


Such advice is deeply misguided. Decades of research confirm that direct explanation, however easily cast off, is the best method for learning new material. Seeking innovation in instruction is a category error. Technology may have changed, but human nature remains the same. We learn best when others explain new concepts to us. In trying to innovate past explicit teaching and direct instruction, American schools lost something valuable.


Across the pond, the researchEd movement in the UK has popularized traditional pedagogy, and on this side of the Atlantic, we’re experiencing something similar (albeit on a smaller scale) with the rediscovery of phonics. It doesn’t matter if teachers don’t vibe with the explicit, sequenced nature of phonics instruction. It works. Will this realization expand to other content areas as well?


Throughout his essay, Tom Wolfe over and again references the concept of “year zero.” He observes that societies cannot deconstruct their social mores, institutions, habits, traditions, and structures, to then reconstruct something utopian on purely rational grounds. In reality, there is a great amount of wisdom in these very social mores, institutions, habits, traditions, and structures. Even something as simple as Grandma’s pie recipe has generations of trial and error built into its basic directions, far more wisdom than any one rational actor could ever contain or discover in the present.


In education, the ideologies of deconstruction and rationalism manifest in the idea that we can “reimagine” education entirely, or as one popular book puts it, reconsider every “institutional norm.” But we cannot shatter every institutional norm without repercussion, any more than we can stop washing our hands without repercussion.


Grades encourage hard work and learning. Tests both incentivize study and facilitate the retention of information in long-term memory. Discipline structures create orderly environments where students can learn, and consequences are themselves pedagogical tools that help students become responsible, polite, and self-controlled. Desks in rows, discrete subjects, handwritten notes, homework, reading great books out loud together—these norms all serve a purpose even if it’s not immediately obvious what that purpose is.


For decades, there was a general push and pull in the education world between progressives and traditionalists. The math wars stretch back at least to debates over California’s state curriculum in the 1990s. The phonics versus whole language debate began in the 1950s, and traditionalists have many times declared that phonics won. John Dewey first theorized a progressive education built on rationalist grounds in the early twentieth century, building on Jean Jacques Rousseau before him. As with old truths rediscovered, these are old debates as well.


Most recently, school systems embraced deconstructionism under pressure from anti-racist activists during Covid, who implored them to tear down old structures of discipline, instruction, testing, and curriculum. Traditionalists retreated while progressives advanced. Alas, an education system that forwent the basic truths of human nature was bound to fail, and schools are relearning old lessons.


In their renunciation of admissions tests, universities stumbled on the wisdom of the thought experiment “Chesterton’s Fence.” The purchaser of a new property, the idea goes, shouldn’t tear down a fence simply because they are unaware of its use. If they do, they may find snow drifts blocking their windows or wolves among the sheep. It is a call to respect the wisdom in existing institutions, but also a plea for intellectual humility. We may not know what’s best, so it is wise to respect those who came before.


What we need instead is a rediscovery of fundamentals, an acknowledgment that the old ways work, and a realization that if we sweep away everything old and try to reimagine something better, we will have swept away everything of value.

Friday, July 19, 2024

WAVELL

Encouraged by his success in the north, Wavell then moved to cover his southern flank. When Italy had declared war the Duke of Aosta, Viceroy of Ethiopia (Abyssinia), had crossed into the Sudan with 110,000 troops and taken Kassala, then into Kenya to capture Moyale, and also into British Somaliland, seizing Berbera. 

Wavell had bided his time before responding, but in late January 1941 he sent two British Commonwealth forces totalling 70,000 men—mainly South Africans—to exercise a massive pincer movement utterly to rout Aosta. 

Lieutenant-General Sir Alan Cunningham occupied Addis Ababa on 4 April, having averaged 35 miles a day for over a thousand miles, taking 50,000 prisoners and gaining 360,000 square miles of territory at the cost of 135 men killed and four captured. The Emperor Haile Selassie of Ethiopia returned to his capital on 5 May, five years to the day since it had fallen to the Italians. Aosta and his enormous but demoralized army surrendered on 17 May, leaving the Red Sea and Gulf of Aden open to Allied shipping once more. 

Meanwhile, in the north, very great victories greeted O’Connor, who saved the Suez Canal and drove the Italians back along the coast road to Benghazi. As the 6th Division forced Graziani into headlong retreat, O’Connor sent the 7th Division through the desert via Mechili to slice through the Cyrenaican bulge and cut off the Italians. At the battle of Beda Fomm on the Gulf of Sirte between 5 and 7 February 1941 the British Empire and Commonwealth won its first really significant land victory of the Second World War. 

In two months from 7 December 1940, the Western Desert Force had achieved successes that utterly belied Churchill’s statement quoted above; they had destroyed nine Italian divisions and part of a tenth, advanced 500 miles and captured 130,000 prisoners, 380 tanks and 1,290 guns, all at the cost of only 500 killed and 1,373 wounded. In the whole course of the campaign, Wavell never enjoyed a force larger than two divisions, only one of them armoured. It was the Austerlitz of Africa, and prompted his prep school to note in the Old Boys’ section of the Summer Fields magazine: ‘Wavell has done well in Africa.’

Andrew Roberts, The Storm of War: A New History of the Second World War. HarperCollins. Kindle Edition.

Monday, July 8, 2024

BOOKS FOR BOYS


What If Boys Like the “Wrong” Kind of History?
Great Battles for Boys: a delightfully countercultural book series

Frederick Hess
July 8, 2024


An Amazon box was on the porch the other day. (I get sent a lot of books. It’s a cool perk.) I pulled out five colorful, oversized paperbacks. Great Battles for Boys: The Korean War. Great Battles for Boys: The American Revolution. Great Battles for Boys: WW2 in Europe. And two more. I found the titles delightfully countercultural. I mean, who writes about military strategy today? Who unabashedly markets stuff to boys? The books, all published between 2014 and 2022, are authored by history teacher Joe Giorello. They all run about 150 to 250 pages with straightforward text, anecdotes, pictures, maps, and suggestions for further reading.


Having never heard of the series, I was curious how these books were faring in the larger world. The answer? Very well. On Amazon, at the time of this writing, Giorello’s volume on WWII in Europe ranked #2 in “Children’s American History of 1900s.” His volume on the Civil War was #1 in “Children’s American Civil War Era History Books.” His book on the Revolutionary War was #1 in “Children’s American Revolution History.” There are thousands of enthusiastic, five-star reviews.


And yet, like I said, I’d never heard of Giorello. I couldn’t find a single mention of him when I searched School Library Journal, Education Week, the National Council for the Social Studies, or the National Council for History Education. As best I can tell, he’s self-published. The stories are interesting, but the narrative is pretty rote, with no gimmickry or multimedia pizzazz. It’s just workmanlike, accessible history. For instance, the chapter on “The Battle of Britain” in WW2 in Europe begins:


By June 1940, Germany had achieved a victory in Norway, but the win came at a steep cost. The battle had damaged or sunk over half of Germany’s warships.


This loss was crucial. Hitler desperately wanted to conquer Great Britain, but with half his fleet out of commission, the German navy was no match for the powerful British Royal Navy.


Hitler decided he would conquer Britain by air.


So, what’s going on? Why have these books been such a silent success? The most salient explanation may be the frank, unapologetic decision to offer books about “great battles for boys” in an era when that’s largely absent from classrooms. This may simply be the kind of history that a lot of boys are eager to read about. Of course, even penning that sentence can feel remarkably risqué nowadays, which may be a big part of the problem.


It got me thinking. My elementary-age kids have brought home or been assigned a number of children’s books on history. Most are focused intently on social and cultural history. I’ll be honest. Even as someone who’s always been an avid reader, I find a lot of that stuff pretty tedious. As a kid, I found books about the Battle of Midway or D-Day vastly more interesting than grim tales of teen angst, and I don’t think that makes me unusual. Moreover, it surprises no one (except the occasional ideologue) to learn that girls generally appear more interested in fiction than boys—or that boys tend to prefer reading about sports, war, comedy, and science fiction, while girls favor narratives about friendship, animals, and romance.


Today, when I peruse classroom libraries, recommended book lists, or stuff like the summer reading suggestions from the American Library Association, I don’t see much that seems calculated to appeal to boys.


One reason that boys read less than girls may be that we’re not introducing them to the kinds of books they may like. There was a time when schools really did devote too much time to generals and famous battles, but we’ve massively overcorrected. Indeed, I find that too many “diverse, inclusive” reading lists feature authors who may vary by race and gender but overwhelmingly tend to write introspective, therapeutic tales that read like an adaptation of an especially heavy-handed afterschool special.


Now, my point is not that kids should read this rather than that. Schools should be exposing all students to more fiction and nonfiction, with varied topics and themes. If that requires assigning more reading, well, good.


Then there are the well-meaning educators and advocates who approach book selection as an extension of social and emotional learning. Heck, while writing this column, I got an email promoting the nonprofit I Would Rather Be Reading, which uses “trauma responsive literacy support and social-emotional learning to help children.” I’m sure it’s a lovely organization, but I’d be shocked if any of the books in question feature stoic virtues or manly courage. After all, the therapy/SEL set has worked assiduously to define traditional masculinity as “toxic.” And all this can alienate kids who find the therapy-talk unduly precious or rife with adult pathologies.


I hear from plenty of educators who say they’re reluctant to talk about the needs of boys for fear of being labeled reactionary. But more boys might develop a taste for reading if they encountered more of the kinds of books they’d like to read. I’d take more seriously those who talk about inclusive reading lists if their passion extended to the well-being of those students bored by social justice-themed tracts and if they truly seemed more invested in turning every kid into an avid reader, which requires a diverse mix of books available—including those about “great battles for boys.”


Frederick Hess is an executive editor of Education Next and the author of the blog “Old School with Rick Hess.”


Friday, July 5, 2024

MICHAELA

 A Deeper Desire

Perhaps one of the most important things a leader does is help his or her people find courage. One extraordinary story of a leader doing that in a single moment of genius can be found in an incident that occurred during the French Revolution, in the critical Battle of Toulon. Toulon was a vitally important naval city which the French couldn’t afford to lose, but it had recently fallen to the allied forces determined to quash the Revolution. The revolutionaries were stretched thin, lacking experience and short of leadership. Their hopes of regaining the crucial city were looking slim, and the success of the whole Revolution was hanging in the balance.

Just when defeat looked unavoidable, a twenty-four-year-old artillery officer named Napoleon Bonaparte arrived on the scene. He could see that the only hope of regaining the port lay in establishing a point from which they could effectively bombard the allied ships. In order to do this, one particularly effective but dangerously exposed gun battery needed to be constantly manned. The problem, however, was that the battery’s exposed position meant that those who manned the post were all but certain to die doing so, and eventually it reached the stage where men were simply refusing to take the post, recognising it as the suicide mission that it was. Napoleon knew that the battle had reached a decisive moment, and everything temporarily hung on this crucial point.

He walked through the camp, considering what to do, when an idea struck him. He made a sign with a few words on it. He then attached it to the lethal battery position. After this, it never lacked a man, day or night; indeed, men were fighting over the chance to hold the post. The battle was won, and Napoleon’s name was established. The words he’d written stated simply, ‘The battery for men without fear.’ 

 Something greater than the possibility of an immediate victory had been placed before the men. Something more than the threat of a sanction for cowardice, or an immediate reward for compliance. Something deeper had been appealed to and awoken; the prospect of engaging in something requiring wholehearted, full-blooded courage. The men were ultimately thirsty for the opportunity to harness everything they had, and give it their all. People today are still looking for that kind of invitation: an invitation to give everything you have to something worth fighting for.

Being part of Michaela feels a bit like that.

Katherine Birbalsingh, Michaela: The Power of Culture (33-34). Hodder Education. Kindle Edition.

Tuesday, June 18, 2024

World War One

In early 1914 Bethmann Hollweg’s secretary, Kurt Riezler, published (pseudonymously) a book entitled Characteristics of Contemporary World Politics. In it he argued that the unprecedented levels of armament in Europe were ‘perhaps the most controversial, urgent and difficult problem of the present time.’ Sir Edward Grey, always fond of explanations of the war which minimized human agency, would later agree. ‘The enormous growth of armaments in Europe,’ he wrote in his post-war memoirs, ‘the sense of insecurity and fear caused by them—it was these that made war inevitable. This, it seems to me, is the truest reading of history . . . the real and final account of the origins of the Great War.’

Historians seeking great causes for great events are naturally drawn to the pre-war arms race as a possible explanation for the First World War. As David Stevenson has put it: ‘A self-reinforcing cycle of heightened military preparedness . . . was an essential element in the conjuncture that led to disaster . . . The armaments race . . . was a necessary precondition for the outbreak of hostilities.’ David Herrmann goes further: by creating a sense that ‘windows of opportunity for victorious wars’ were closing, ‘the arms race did precipitate the First World War.’ If the Archduke Franz Ferdinand had been assassinated in 1904 or even in 1911, Herrmann speculates, there might have been no war; it was ‘the armaments race . . . and the speculation about imminent or preventive wars’ which made his death in 1914 the trigger for war. Yet, as both Stevenson and Herrmann acknowledge, there is no law of history stating that all arms races end in wars.

The experience of the Cold War shows that an arms race can deter two power blocs from going to war and can ultimately end in the collapse of one side without the need for a full-scale conflagration. Conversely, the 1930s illustrates the danger of not racing: if Britain and France had kept pace with German rearmament after 1933, Hitler would have had far greater difficulty persuading his generals to remilitarize the Rhineland or to risk war over Czechoslovakia. The key to the arms race before 1914 is that one side lost it, or believed that it was losing it. It was this belief which persuaded its leaders to gamble on war before they fell too far behind. Riezler erred when he argued that ‘the more the nations arm, the greater must be the superiority of one over the other if the calculation is to fall out in favour of war.’ On the contrary: the margin of disadvantage had to be exceedingly small—perhaps, indeed, only a projected margin of disadvantage—for the side losing the arms race to risk a war. The paradox is that the power which found itself in this position of incipient defeat in the arms race was the power with the greatest reputation for excessive militarism—Germany.

Niall Ferguson, The Pity of War: Explaining World War I (82-83). Basic Books. Kindle Edition.

Monday, June 17, 2024

ENIAC

 ENIAC, one of the first “modern” computers, debuted in 1946. It weighed 27 tons, required 240 square feet (22.3 square meters) of floor space, and needed 174,000 watts (174 kilowatts) of power, enough to (allegedly) dim all the lights in Philadelphia when turned on. In 1949, Popular Mechanics predicted that one day a computer might weigh less than 1.5 tons. In the early 1970s, Seymour Cray, known as the “father of the supercomputer,” revolutionized the computer industry. His Cray-1 system supercomputer shocked the industry with a world-record speed of 160 million floating-point operations per second, an 8-megabyte main memory, no wires longer than four feet, and its ability to fit into a small room. The Los Alamos National Laboratory purchased it in 1976 for $8.8 million, or $36.9 million in today’s inflation-adjusted dollars.

But as predicted by Moore’s Law (the number of transistors that can fit on a microchip will double roughly every twenty-four months), modern computing has improved and spread beyond even the wildest speculations of people living at the time of any of these computers (certain science fiction excepted). A team of University of Pennsylvania students in 1996 put ENIAC’s capabilities onto a single 64-square-millimeter microchip that required 0.5 watts, making it about 1/350,000th the size of the original ENIAC. And that was twenty years ago. Popular Mechanics’ prediction proved correct, though a bit of an (understandable) understatement.

Still, try telling its 1949 editorial staff that today we hold computers in our hands, place them in our pockets, and rest them on our laps. Laptop computers with 750 times the memory, 1,000 times the calculating power, and essentially an infinitely greater amount of general capabilities as the Cray-1 are now available at Walmart for less than $500. Bearing out the Iron Sky comparison, a smartphone with 16 gigabytes of memory has 250,000 times the capacity of the Apollo 11 guidance computer that enabled the first moon landing. A life’s wages in 1975 could have bought you the computing power of a pocket calculator in 2000. In 1997, $450 could have bought you 5 gigabytes of hard-drive storage that is free today. A MacBook Pro with 8 gigabytes of RAM has 1.6 million times more RAM than MANIAC, a 1951 “supercomputer.” Forget angels on the head of a pin: Intel can fit more than six million transistors onto the period at the end of this sentence.

What this all means for the consumer is an unprecedented spread of technology. Today’s cell phones exceed the computing power of machines that required rooms mere decades ago. Nearly half the world uses the Internet, up from essentially zero in 1990. Today, then, computers are better, faster, smarter, more prevalent, and more connective than ever before. In the 1990s, progressive policy makers fretted over something called “the digital divide.” They convinced themselves that, absent government intervention, the Internet would be a plaything for the wealthy. They raised new taxes, transferred wealth, paid off some constituents, and claimed victory. But the truth is that the Internet was always going to be for everyone, because that is what the market does. It introduces luxuries for the wealthy, and the wealthy subsidize innovations that turn luxuries—cell phones, cars, medicine, computers, nutritious food, comfortable homes, etc.—into necessities. It is the greatest triumph of alchemy in all of human experience, and the response from many in every generation is ingratitude and entitlement.

Jonah Goldberg, Suicide of the West: How the Rebirth of Tribalism, Populism, Nationalism, and Identity Politics Is Destroying American Democracy (352). Random House Publishing Group. Kindle Edition.
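Goldberg's parenthetical statement of Moore's Law (transistor counts doubling roughly every twenty-four months) reduces to a one-line projection. The starting count and time span below are illustrative assumptions, not figures from the book:

```python
def projected_transistors(initial_count: int, years: float,
                          doubling_period_years: float = 2.0) -> float:
    """Project a transistor count forward under a fixed doubling period."""
    return initial_count * 2 ** (years / doubling_period_years)

# Illustrative only: a chip starting at 10,000 transistors, doubled every
# two years for twenty years, grows by a factor of 2**10 = 1,024.
print(projected_transistors(10_000, 20))  # 10240000.0
```

Twenty years of doublings multiplying a chip a thousandfold is the mechanism behind every comparison in the passage, from ENIAC-on-a-chip to the smartphone outrunning the Apollo guidance computer.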

MERITOCRACY

 The Free Press

17 June 2024

Meritocracy now! DEI is on the way out, and not a day too soon. But what should replace it? When it comes to recruitment, the short answer is surely: hire the best person for the job. One person sticking to this once uncontroversial, now edgy proposition is Alexandr Wang, the 27-year-old who became the world’s youngest self-made billionaire after he dropped out of MIT to co-found AI firm Scale in 2016. In a memo announcing the company’s new hiring policy, Wang writes, “Scale is a meritocracy, and we must always remain one.” The guiding principle, he notes, is “MEI: merit, excellence, and intelligence.” He continues:

That means we hire only the best person for the job, we seek out and demand excellence, and we unapologetically prefer people who are very smart.

We treat everyone as an individual. We do not unfairly stereotype, tokenize, or otherwise treat anyone as a member of a demographic group rather than as an individual. 

We believe that people should be judged by the content of their character—and, as colleagues, be additionally judged by their talent, skills, and work ethic.

There is a mistaken belief that meritocracy somehow conflicts with diversity. I strongly disagree. No group has a monopoly on excellence. A hiring process based on merit will naturally yield a variety of backgrounds, perspectives, and ideas. Achieving this requires casting a wide net for talent and then objectively selecting the best, without bias in any direction. We will not pick winners and losers based on someone being the “right” or “wrong” race, gender, and so on. It should be needless to say, and yet it needs saying: doing so would be racist and sexist, not to mention illegal.

Upholding meritocracy is good for business and is the right thing to do.

Friday, May 24, 2024

MEN WANTED

Sir Ernest Shackleton, when he was about to set out on one of his expeditions, printed a statement in the papers to this effect:

“Men wanted for hazardous journey to the South Pole. Small wages, bitter cold, long months of complete darkness, constant danger. Safe return doubtful. Honor and recognition in case of success. Ernest Shackleton, 4 Burlington Street.” [May 15, 1913]

In response to his posted ad, Shackleton was supposedly flooded with 5,000 responses, men clamoring to take their chances on the icy southern continent. [September 10, 1913]

 

(When the survivors returned, WWI was on, so they joined the military.)

Monday, May 13, 2024

USE OF HISTORY

 “The things that are now before us,” said the Princess, “require attention, and deserve it. What have I to do with the heroes or the monuments of ancient times—with times which can never return, and heroes whose form of life was different from all that the present condition of mankind requires or allows?”

“To know anything,” returned the poet, “we must know its effects; to see men, we must see their works, that we may learn what reason has dictated or passion has excited, and find what are the most powerful motives of action. To judge rightly of the present, we must oppose it to the past; for all judgment is comparative, and of the future nothing can be known. The truth is that no mind is much employed upon the present; recollection and anticipation fill up almost all our moments. Our passions are joy and grief, love and hatred, hope and fear. Of joy and grief, the past is the object, and the future of hope and fear; even love and hatred respect the past, for the cause must have been before the effect.

“The present state of things is the consequence of the former; and it is natural to inquire what were the sources of the good that we enjoy, or the evils that we suffer. If we act only for ourselves, to neglect the study of history is not prudent. If we are entrusted with the care of others, it is not just. Ignorance, when it is voluntary, is criminal; and he may properly be charged with evil who refused to learn how he might prevent it.

“There is no part of history so generally useful as that which relates to the progress of the human mind, the gradual improvement of reason, the successive advances of science, the vicissitudes of learning and ignorance (which are the light and darkness of thinking beings), the extinction and resuscitation of arts, and the revolutions of the intellectual world. If accounts of battles and invasions are peculiarly the business of princes, the useful or elegant arts are not to be neglected; those who have kingdoms to govern have understandings to cultivate.

“Example is always more efficacious than precept. A soldier is formed in war, and a painter must copy pictures. In this, contemplative life has the advantage. Great actions are seldom seen, but the labours of art are always at hand for those who desire to know what art has been able to perform.

“When the eye or the imagination is struck with any uncommon work, the next transition of an active mind is to the means by which it was performed. Here begins the true use of such contemplation. We enlarge our comprehension by new ideas, and perhaps recover some art lost to mankind, or learn what is less perfectly known in our own country. At least we compare our own with former times, and either rejoice at our improvements, or, what is the first motion towards good, discover our defects.”

Samuel Johnson, The History of Rasselas, Prince of Abissinia (42-43) [1759]. Kindle Edition.

Thursday, May 9, 2024

DEKULAKIZATION

How many “kulaks” died in the course of “de-kulakization”? On 1 January 1932, the GPU carried out a general census of all deportees: it listed 1,317,022 people. We know, by the same police sources, that nearly 1.8 million “kulaks” were deported during the two main deportation waves in 1930 and 1931. Losses accordingly numbered close to half a million people, or nearly 30% of all deportees. Undoubtedly, a not insignificant proportion of those had escaped. In 1932, the GPU komandatury, which actually managed to keep accurate records of the deportees they were supposed to keep watch over, counted no less than 207,000 escapes (38,000 runaways were recaptured); in 1933, the number of escapes was 216,000 (54,000 recaptured). Considering a number of local GPU reports (for different periods in 1930 and 1931) on the flights of deported “kulaks”, we can extrapolate that around 200,000-250,000 deportees managed to escape in 1930-1931. This still leaves us with approximately 250,000-300,000 deaths. 

A number of local reports confirm the very high mortality rates among the deportees, especially among children and elderly people. In 1931, the mortality rate was 1.3% per month (16% per annum) among the deportees to Kazakhstan, and 0.8% per month (10% per annum) for those to western Siberia. Infant mortality oscillated between 8% and 12% per month, and peaked at 15% per month in Magnitogorsk. From June 1931 to June 1932, the mortality rate among deportees in the region of Narym, in Western Siberia, reached 11.7%. In 1932, the overall number of deaths among deportees was over 90,000 (annual death rate: 6.8%); in 1933, it was 151,600 (annual death rate: 13.3%). Altogether, more than half a million deportees died in 1930-1933, or 22% of the 2.3 million people deported during those years. Most of them died untimely deaths, of general exhaustion and hunger (Zemskov, 2003; Danilov & Krasilnikov, 1993, 1994, Viola, 2007).

Nicolas Werth, SciencesPo, 23 September 2011
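The monthly-to-annual conversions in the passage above can be checked with a short calculation. The annual figures quoted track simple multiplication by twelve (1.3% × 12 ≈ 16%); compounding monthly survival instead, as the sketch below does (my illustration, not Werth's method), gives slightly lower annual rates:

```python
def annual_mortality(monthly_rate: float) -> float:
    """Annual mortality implied by a constant monthly rate, compounding survival."""
    return 1 - (1 - monthly_rate) ** 12

# 1.3% per month (deportees to Kazakhstan, 1931): simple multiplication
# gives 15.6% (~16% as quoted); compounding gives about 14.5%.
print(round(annual_mortality(0.013), 3))  # 0.145
# 0.8% per month (western Siberia): ~10% quoted, ~9.2% compounded.
print(round(annual_mortality(0.008), 3))  # 0.092
```

The gap between the two methods is small at these rates, which is presumably why the source rounds to the simple figure.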

Monday, May 6, 2024

BATTLE OF THE BULGE

But the greatest lesson was one learned from the enemy: to hate. Word of the Malmédy Massacre, of the massacres of civilians at Stavelot, Trois Points, and Bande, passed from man to man, and from unit to unit. Until the Ardennes the GI had fought a civilian’s war. Now he was learning to kill without remorse or pity.

John Toland, Battle: The Story of the Bulge
New York: Random House, 1959, pp. 329-331

    Reports from men drifting back from the front were alarming. Entire units, claimed many wild-eyed refugees, had been cut off and were being wiped out. Back at Division, General Grow had no clear idea of how great his casualties were. But he did know that it was the worst day in the history of his 6th Armored Division.

    The retreat of the 6th Armored wasn’t the only reverse on George Patton’s front. For the savage meeting engagement was at its climax. Violent German attacks had struck all along the Bastogne front. In particular the 17th Airborne Division, in their first real day of action a few miles west of town, had been dealt shocking casualties, some battalions losing 40 per cent of their men.

    Ordinarily the most optimistic of American generals, Patton was now in a despondent mood. Each man lost that day weighed heavily on him. He sat at a desk and wrote in his diary, “We can still lose this war.”

    But at the front, later that night, a strange thing began to take place where the day’s disaster had been greatest. Men stopped running, and were digging in. Terror was being replaced by anger.
 
    Not far behind the 6th Armored Division breakthrough area, Colonel John Hines was in a stone house where refugees from the front were thawing out their frozen rifles and frozen bodies.

    One man, his face covered with blood and dirt, his eyes two bitter holes, was saying, “I used to wonder what I was doing in the army. I didn’t have anything personal against the Krauts, even if they were making me live in a freezing, frigging foxhole. But I learned something today. Now I want to kill every goddam Kraut in the world. You know why? To save my own ass.”

    There was a new GI in the Ardennes.

    The good-natured, rather careless, supremely confident GI who had known one victory after another since landing in Normandy; who had assumed he would be well clothed, well fed, and well led; who accepted it as his heritage to outgun and outmachine the enemy, was gone. Since December 16 he’d had few days of the overpowering air support and air cover he’d taken for granted; his clothing didn’t keep out the cold; his boots were traps for trench foot, his tanks were outnumbered; often his machines were immobilized by cold, snow, and terrain.

    He was cold and hungry. He had just fought a humiliating series of retreats where terror roamed far behind the lines. He had tasted defeat.

    But he had learned bitter lessons that were beginning to pay off. In this first major winter battle ever fought by Americans he had learned that the wounded die fast in the zero cold. He had learned in a few weeks that cold is a living enemy and must be fought.

    Medics had learned to tuck frozen morphine Syrettes under their armpits; to put plasma under the hoods of trucks and jeeps. Infantrymen were saving their hands from frostbite by cutting four oversize mitten patterns from blankets and sewing them together. Trench foot, which was cutting down more Americans than bullets, was beaten with muffs made from blankets. At night the men learned to take off their soggy combat boots and socks, massage their feet, and then pull these blanket “Tootsie warmers” on, topped by overshoes. 

    They learned how to dry socks and shoes: heat pebbles in a can; dump the hot pebbles into the wet socks and the socks into the shoes.

    They learned that ordinary field jackets were little protection against the biting winds of the Ardennes. Inner linings were made of blankets sewed to the inside.

    They learned that two wool shirts were equal in warmth—and less bulky to fight in—than a shirt and overcoat. But the shirts had to be switched every night, the one next to the body, wet from perspiration even in the coldest day, taken off and hung up to dry.
 
    They learned what tramps and people of depression days had long known, that paper was a good insulator. A few sheets of newspaper wrapped around the chest between shirts was a buffer against the rawest wind.

    They also learned to wear shoes and overshoes that were a little too big, for tightness, cutting off circulation, brought on almost instant trench foot. They would stuff paper between shoes and arctics—a trick long used by hunters. The paper not only anchored their misfit footgear but retained body heat.

    They learned to heat food over a “flambeau”—a wine bottle filled with gas with a wick of twisted rags.

    The facts of cold, long known to men of Minnesota and Maine, were passed on to men from Alabama and Texas. Frozen toes, ears, noses were rubbed gently to start circulation. The old wives’ remedy of rubbing snow on these frozen parts often brought on gangrene. Hands stiff from cold, unable to trigger a gun, were placed under armpits. To survive a night in a freezing foxhole many a man lived by covering his head with a blanket and trapping his own warm breath.

    The GIs learned not to eat snow except in very small doses or their stomachs would become chilled. Tankers learned that their great friend, Calvados, was their great enemy in the cold. For alcohol brought body heat to the surface, causing radiation and deadly chilling.

    They learned that cold metal would sweat when brought indoors, and then quickly freeze when taken outside. All weapons and ammunition were left outdoors, protected only from the falling snow.

    They learned the big lessons too. That their tanks should be whitewashed to blend with the snow; that they should wear sheets like Halloween ghosts.

    Mechanics, with Yankee ingenuity, soon learned how to make their machines work under sub-zero conditions. Where rubber tracks were unavailable for tanks, great metal cleats were welded on steel tracks to conquer ice and snow.
 
    But the greatest lesson was one learned from the enemy: to hate. Word of the Malmédy Massacre, of the massacres of civilians at Stavelot, Trois Points, and Bande, passed from man to man, and from unit to unit. Until the Ardennes the GI had fought a civilian’s war. Now he was learning to kill without remorse or pity.

Friday, April 19, 2024

BOLSHEVIK INHUMANITY

A number of historians have rightly emphasised the point that the February revolution in 1917 did not provoke a counter-revolution. The overthrow of the Tsarist regime prompted a wide variety of reactions among the former ruling class: a resignation to events, a bitterness at the incompetence and obstinacy of the imperial court, yet also an initial optimism among its more liberal and idealistic members. Most of the nobility and bourgeoisie supported the Provisional Government in the hope that it would at least restrain the worst excesses and keep the country together. The initial absence of any attempt to fight back illustrated not so much apathy, as the feeling that there was little of the ancien regime left that was worth defending.

A determination to resist only began to develop during the summer, when the Bolshevik programme polarised opinion. The question is important when it comes to the origins of the civil war itself, which led to the deaths of up to 12 million people, the utter impoverishment of the whole country and suffering on an unimaginable scale. Konstantin Paustovsky lamented the lost opportunity for democratic change. ‘The idyllic aspect of the first days of the Revolution was disappearing. Whole worlds were shaking and falling to the ground. Most of the intelligentsia lost its head, that great humanist Russian intelligentsia which had been the child of Pushkin and Herzen, of Tolstoy and Chekhov. It had known how to create high spiritual values, but with only a few exceptions it proved helpless at creating the organisation of a state.’

Spiritual values never stood a chance against a fanatical determination to destroy all those of the past, both good and bad. No country can escape the ghosts of its past, least of all Russia. The writer and critic Viktor Shklovsky compared the Bolsheviks to the devil’s apprentice who, in an old Russian folk tale, boasted that he knew how to rejuvenate an old man. To restore his youth, he first needed to burn him up. So, the apprentice set him on fire, but then found that he could not revive him.

Fratricidal wars are bound to be cruel because of their lack of definable front lines, because of their instant extension into civilian life, and because of the terrible hatreds and suspicions which they engender. The fighting right across the Eurasian land-mass was violent beyond belief, especially the unspeakable cruelty of Cossack atamans in Siberia. Even that arch-conservative politician V.V. Shulgin believed that one of the major reasons for the failure of the Whites was a ‘moral collapse’—that they behaved as badly as their Bolshevik enemy. There was, nevertheless, one subtle yet important difference. All too often Whites represented the worst examples of humanity. For ruthless inhumanity, however, the Bolsheviks were unbeatable.

Antony Beevor, Russia (501-502). Penguin Publishing Group. Kindle Edition.

Friday, April 12, 2024

THE MEMORY HOLE

As soon as Winston had dealt with each of the messages, he clipped his speakwritten corrections to the appropriate copy of the Times and pushed them into the pneumatic tube. Then, with a movement which was as nearly as possible unconscious, he crumpled up the original message and any notes that he himself had made, and dropped them into the memory hole to be devoured by the flames. What happened in the unseen labyrinth to which the pneumatic tubes led, he did not know in detail, but he did know in general terms. As soon as all the corrections which happened to be necessary in any particular number of the Times had been assembled and collated, that number would be reprinted, the original copy destroyed, and the corrected copy placed on the files in its stead. 

This process of continuous alteration was applied not only to newspapers, but to books, periodicals, pamphlets, posters, leaflets, films, sound tracks, cartoons, photographs—to every kind of literature or documentation which might conceivably hold any political or ideological significance. Day by day and almost minute by minute the past was brought up to date. In this way every prediction made by the Party could be shown by documentary evidence to have been correct; nor was any item of news, or any expression of opinion, which conflicted with the needs of the moment, ever allowed to remain on record. 

All history was a palimpsest, scraped clean and reinscribed exactly as often as was necessary. In no case would it have been possible, once the deed was done, to prove that any falsification had taken place. The largest section of the Records Department, far larger than the one on which Winston worked, consisted simply of persons whose duty it was to track down and collect all copies of books, newspapers, and other documents which had been superseded and were due for destruction. A number of the Times which might, because of changes in political alignment, or mistaken prophecies uttered by Big Brother, have been rewritten a dozen times still stood on the files bearing its original date, and no other copy existed to contradict it. 

Books, also, were recalled and rewritten again and again, and were invariably reissued without any admission that any alteration had been made. Even the written instructions which Winston received, and which he invariably got rid of as soon as he had dealt with them, never stated or implied that an act of forgery was to be committed; always the reference was to slips, errors, misprints, or misquotations which it was necessary to put right in the interests of accuracy. But actually, he thought as he readjusted the Ministry of Plenty's figures, it was not even forgery. It was merely the substitution of one piece of nonsense for another. 

Most of the material that you were dealing with had no connection with anything in the real world, not even the kind of connection that is contained in a direct lie. Statistics were just as much a fantasy in their original version as in their rectified version. A great deal of the time you were expected to make them up out of your head.

George Orwell, 1984. Houghton Mifflin Harcourt. Kindle Edition.


Thursday, March 28, 2024

SOFT BIGOTRY

You might expect policymakers to be scrambling to shore up academic standards. In fact, they are doing the opposite.


The Economist
March 18, 2024
Soft bigotry
New numbers show falling standards in American high schools
Low-achieving pupils may suffer the most


SPRINGFIELD, in MASSACHUSETTS, might seem an improbable setting for an education miracle. The city of 155,000 along the Connecticut river has a median household income half the state average; violent crime is common. Yet graduation rates at the city’s high schools are surging. Between 2007 and 2022 the share of pupils at the Springfield High School of Science and Technology who earned a diploma in four years jumped from 50% to 94%; at neighbouring Roger Putnam Vocational Technical Academy it nearly doubled to 96%.


Alas, such gains are not showing up in other academic indicators. At Springfield High scores on the SAT, a college-admissions test, have tumbled by 15% over the same period. Measures of English and maths proficiency are down, too. The pass rate on advanced-placement exams has fallen to just 12% compared with a national average of 60%.


The trend at Springfield High is all too common. Between 2007 and 2020 the average graduation rate at public high schools in America leapt from 74% to 87%. During this period pupils notched up gains in course credits and grade-point averages. Yet SAT scores fell (see chart 1). Results from the latest Programme for International Student Assessment (PISA), an international test of 15-year-olds, show that maths and reading literacy are flat or down. An analysis by The Economist suggests that schools are lowering academic standards in order to enable more pupils to graduate. And the trend is hurting low-performing pupils the most.


America has fretted about academic standards at its public schools for decades. In 1983 the Department of Education released a landmark report, “A Nation At Risk,” which warned of a “rising tide of mediocrity” in the country’s schools. The response was swift. Within five years 45 states had raised graduation requirements; and more than two dozen had introduced other reforms, including more comprehensive curriculums and higher salaries for teachers. Some states also started requiring graduates to pass “minimum-competency” exams, standardised tests introduced in the 1970s that evaluated pupils’ ability to do eighth- or ninth-grade level English and maths.


But as graduation requirements were toughened up, coursework was watered down. A survey conducted in 1996 by Public Agenda, a policy research group, found that just half of public high-school students felt that they were being challenged academically. Another survey in 2001 found that only a quarter of pupils thought that their teachers had high expectations of them. Even the federal government acknowledged again that academic standards were falling short. A report by the Department of Education found that more than a tenth of maths coursework taken by the class of 2005 consisted of primary- and middle-school-level material. Only a third of algebra 1 students and a fifth of geometry students received “rigorous” instruction.


Grading got easier, too. The best evidence for this comes from comparisons of classroom grades with performance on state exams taken at the end of the school year. A study by Seth Gershenson of American University found that between 2005 and 2016, 36% of North Carolina public-school students who received Bs in their algebra 1 courses failed their end-of-course exams. Pupils with Cs failed 71% of the time. Another study, by Chris Clark of Georgia College & State University, analysed maths courses at Georgia public high schools in 2007 and yielded similar results. “Some schools and school systems appear to be inflating course grades,” Mr Clark concluded, “while others appear to hold their students to higher standards.”


Such evidence suggests that academic standards at American high schools are too low. But are they getting worse? To answer this, The Economist assembled data on graduation rates and standardised test scores at 3,000 high schools across six states—Colorado, Georgia, Illinois, Massachusetts, Michigan and North Carolina—for school years from 2007 to 2022.


Doing the maths


We found that four-year graduation rates in our sample increased during this period, even as test scores fell. Gains were greatest in high schools with the lowest test scores. In 2007 schools with scores on the SAT or ACT, another college-admissions exam, in the bottom tenth of our sample graduated half of their pupils; in 2022 they graduated two-thirds. As low-performing schools have passed more pupils, the relationship between test scores and graduation rates has weakened (see chart 2).


Just how far has the academic bar been lowered? To quantify this, we conducted a regression analysis of graduation rates between 2007 and 2022 that controlled for average ACT or SAT scores, dropout rates and school year. If academic standards were consistent over time, we would expect no underlying trend in graduation rates from year to year. Instead, we found that graduation rates drifted upward, even after controlling for changes in test scores and dropout rates.
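The regression described above can be illustrated with a small sketch on synthetic data. Everything here is hypothetical (made-up numbers, invented variable names, a deliberately planted upward drift), not The Economist's actual dataset or model; it only shows the shape of the test: regress graduation rates on performance measures plus a year term, and read any residual trend off the year coefficient.

```python
import numpy as np

# Synthetic school-level observations (all hypothetical): average SAT
# score, dropout rate, a year index (0 = 2007, 15 = 2022) and the
# graduation rate we want to explain.
rng = np.random.default_rng(0)
n = 500
year = rng.integers(0, 16, n)
sat = rng.normal(1000, 100, n) - 3 * year      # scores drift down over time
dropout = rng.uniform(0.02, 0.15, n)

# Plant a 0.25-point-per-year upward drift in graduation rates that is
# NOT explained by scores or dropouts -- the "inflation" to be detected.
grad = 70 + 0.02 * (sat - 1000) - 50 * dropout + 0.25 * year + rng.normal(0, 1, n)

# Regress graduation rate on an intercept, SAT, dropout rate and year.
# If standards were consistent, the year coefficient would be near zero;
# a positive coefficient is the residual upward trend the article reports.
X = np.column_stack([np.ones(n), sat, dropout, year])
beta, *_ = np.linalg.lstsq(X, grad, rcond=None)
print(f"estimated yearly drift: {beta[3]:.2f} points per year")
```

In this fake data the recovered drift sits near the planted 0.25 points per year, which over 15 years accumulates to roughly the four-percentage-point inflation the article estimates for its real sample.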


Our analysis suggests that high schools are graduating thousands of students who, not long ago, might not have made the grade. Some states have lowered the bar more than others. In Illinois graduation rates are about one percentage point higher than we would expect based on academic performance alone; in North Carolina they are nearly eight points higher. Overall, we found that public high schools in our sample are inflating graduation rates by roughly four percentage points compared with 15 years earlier.


Sceptics will point out that the test-taking population is significantly different today than it was 15 years ago, and that this may be making test scores look worse than they actually are. “If more and more students are sitting for these tests,” says Thomas Dee of Stanford University, “the composition changes over time in ways that probably bias scores downward.” Such “compositional effects” do not appear to explain our results, however. The share of students taking the ACT or SAT in our sample actually fell from 78% in 2007 to 68% in 2022. This would suggest that, if anything, our estimates of graduation-rate inflation may be too low, rather than too high.


You might expect policymakers to be scrambling to shore up academic standards. In fact, they are doing the opposite. In May last year New Jersey’s board of education voted to lower the passing score on the state’s high-school graduation test, saying the current standards had “adverse impacts” on students. In November Oregon’s education officials scrapped the state’s “essential skills” graduation exams in maths, reading and writing. At least four more states—Florida, Massachusetts, New Jersey and New York—are considering doing away with their own exit exams. In January Alaska’s board of education voted to lower proficiency standards for the state’s reading and maths exams.


The trend towards weakening standards can be blamed in part on No Child Left Behind, an education-reform law passed in 2002. It required states to track the share of students graduating in four years and set annual targets for improvement. Schools that failed to hit their targets faced sanctions, including possible closure. Although such policies were well-intentioned, they had perverse outcomes. To keep graduation rates up, teachers devised creative ways of raising grades: allowing students to retake exams, removing penalties for late assignments, adjusting grading scales. “We’re doing what I call ‘grading gymnastics’,” says Eric Welch, a social-studies teacher in Fairfax County, Virginia. “There’s a lot of pressure to hit the metric, regardless of how you do it,” explains Peter VanWylen, a data consultant and former teacher in Memphis, Tennessee. “Nobody wants to lose their job and so there’s this pressure to get the number where it needs to be.”


Other concerns are also at work. “The push for educational equity, and in particular racial equity, has been used in a lot of places to push against higher standards for high-school graduation,” says Morgan Polikoff of the University of Southern California. When New Jersey debated new testing benchmarks last year, one board-of-education member argued that a higher standard would be “unfair” to black and Latino students in urban districts. Oregon’s decision to drop its graduation exam in November was based in part on a report by the education department which concluded that the test produced “inequitable outcomes” for “historically marginalised” groups.


Must try harder


Lowering standards, it is thought, can help narrow such achievement gaps. Yet it may have the opposite effect. A recent working paper by Brooks Bowden, Viviana Rodriguez and Zach Weingarten of the Universities of Pennsylvania and Texas at San Antonio analyses how a more lenient grading policy introduced by North Carolina public high schools in 2014 affected effort and academic performance. The authors found that after schools implemented the new grading scale, which led to more As and fewer Fs, students with low test scores showed up to class less often and put in less effort. The attendance of high-scoring students did not change. Although the policy led to slightly higher graduation rates, it also contributed to wider gaps in GPAs and standardised test scores between high- and low-achieving students.


This suggests that policies that lower the bar may harm the very students they are meant to help. “I don’t think we’re helping anybody by handing out higher grades or giving out graduation certificates,” says Dr Bowden, one of the authors of the study. Better instead to set expectations high, reckons Dr Polikoff. “People rise to the expectations you set.” ■


Friday, March 8, 2024

ROUSSEAU

The antinomian temptation  

Translated into the political sphere, Rousseau’s ideas about freedom and virtue are a recipe for totalitarianism. “Those who dare to undertake the institution of a people,” Rousseau wrote in the Social Contract, “must feel themselves capable, as it were, of changing human nature, ... of altering the constitution of man for the purpose of strengthening it.” As the philosopher Roger Scruton observed in an essay on the French Revolution, “the revolutionary consciousness lives by abstract ideas, and regards people as the material upon which to conduct its intellectual experiments.” Man is “born free,” Rousseau famously wrote, but is “everywhere in chains.” Alas, most men did not, according to him, truly understand the nature or extent of their servitude. It was his job to enlighten them—to force them, as he put it in one chilling epithet, to be free. Such “freedom” is accomplished, Rousseau thought, by bringing individual wills into conformity with what he called the “general will”—surely one of the most tyrannical political principles ever enunciated. “If you would have the general will accomplished,” he wrote, “bring all the particular wills into conformity with it; in other words, as virtue is nothing more than this conformity of the particular wills, establish the reign of virtue.”

Establishing the reign of virtue is no easy task, as Rousseau’s avid disciple Maximilien Robespierre discovered to his chagrin. All those “particular wills”—i.e., individual men and women with their diverse aims and desires—are so recalcitrant and so ungrateful for one’s efforts to make them virtuous. Still, one does what one can to convince them to conform. And the guillotine, of course, is a great expedient.

Robespierre was no political philosopher. But he understood the nature of Rousseau’s idea of virtue with startling clarity, as he showed when he spoke of “virtue and its emanation, terror.” It is a remark worthy of Lenin, and a grim foreshadowing of the Marxist-Leninist rhetoric that informed a great deal of Sixties radicalism. I mention Rousseau here because, acknowledged or not, he is an important intellectual and moral grandfather of so much that happened in the cultural revolution of the 1960s. (Important “fathers” include Nietzsche, Marx, and Freud.) Rousseau’s narcissism and megalomania, his paranoia, his fantastic political ideas and sense of absolute entitlement, his sentimentalizing nature-worship, even his twisted, hypertrophied eroticism: all reappeared updated in the tumult of the 1960s. And so did the underlying totalitarian impulse that informs Rousseau’s notion of freedom.


Roger Kimball, The Long March: How the Cultural Revolution of the 1960s Changed America (17-18). Encounter Books, 2001. Kindle Edition.