Friday, July 19, 2024

WAVELL

Encouraged by his success in the north, Wavell then moved to cover his southern flank. When Italy had declared war the Duke of Aosta, Viceroy of Ethiopia (Abyssinia), had crossed into the Sudan with 110,000 troops and taken Kassala, then into Kenya to capture Moyale, and also into British Somaliland, seizing Berbera. 

Wavell had bided his time before responding, but in late January 1941 he sent two British Commonwealth forces totalling 70,000 men—mainly South Africans—to execute a massive pincer movement that utterly routed Aosta. 

Lieutenant-General Sir Alan Cunningham occupied Addis Ababa on 4 April, having averaged 35 miles a day for over a thousand miles, taking 50,000 prisoners and gaining 360,000 square miles of territory at the cost of 135 men killed and four captured. The Emperor Haile Selassie of Ethiopia returned to his capital on 5 May, five years to the day since it had fallen to the Italians. Aosta and his enormous but demoralized army surrendered on 17 May, leaving the Red Sea and Gulf of Aden open to Allied shipping once more. 

Meanwhile, in the north, very great victories greeted O’Connor, who saved the Suez Canal and drove the Italians back along the coast road to Benghazi. As the 6th Division forced Graziani into headlong retreat, O’Connor sent the 7th Division through the desert via Mechili to slice through the Cyrenaican bulge and cut off the Italians. At the battle of Beda Fomm on the Gulf of Sirte between 5 and 7 February 1941 the British Empire and Commonwealth won its first really significant land victory of the Second World War. 

In two months from 7 December 1940, the Western Desert Force had achieved successes that utterly belied Churchill’s statement quoted above; they had destroyed nine Italian divisions and part of a tenth, advanced 500 miles and captured 130,000 prisoners, 380 tanks and 1,290 guns, all at the cost of only 500 killed and 1,373 wounded. In the whole course of the campaign, Wavell never enjoyed a force larger than two divisions, only one of them armoured. It was the Austerlitz of Africa, and prompted his prep school to note in the Old Boys’ section of the Summer Fields magazine: ‘Wavell has done well in Africa.’

Andrew Roberts, The Storm of War: A New History of the Second World War. HarperCollins. Kindle Edition.

Monday, July 8, 2024

BOOKS FOR BOYS

I hear from plenty of educators who say they’re reluctant to talk about the needs of boys for fear of being labeled reactionary. But more boys might develop a taste for reading if they encountered more of the kinds of books they’d like to read.


What If Boys Like the “Wrong” Kind of History?
Great Battles for Boys: a delightfully countercultural book series

Frederick Hess
July 8, 2024


An Amazon box was on the porch the other day. (I get sent a lot of books. It’s a cool perk.) I pulled out five colorful, oversized paperbacks. Great Battles for Boys: The Korean War. Great Battles for Boys: The American Revolution. Great Battles for Boys: WW2 in Europe. And two more. I found the titles delightfully countercultural. I mean, who writes about military strategy today? Who unabashedly markets stuff to boys? The books, all published between 2014 and 2022, are authored by history teacher Joe Giorello. They all run about 150 to 250 pages with straightforward text, anecdotes, pictures, maps, and suggestions for further reading.


Having never heard of the series, I was curious how these books were faring in the larger world. The answer? Very well. On Amazon, at the time of this writing, Giorello’s volume on WWII in Europe ranked #2 in “Children’s American History of 1900s.” His volume on the Civil War was #1 in “Children’s American Civil War Era History Books.” His book on the Revolutionary War was #1 in “Children’s American Revolution History.” There are thousands of enthusiastic, five-star reviews.


And yet, like I said, I’d never heard of Giorello. I couldn’t find a single mention of him when I searched School Library Journal, Education Week, the National Council for the Social Studies, or the National Council for History Education. As best I can tell, he’s self-published. The stories are interesting, but the narrative is pretty rote, with no gimmickry or multimedia pizzazz. It’s just workmanlike, accessible history. For instance, the chapter on “The Battle of Britain” in WW2 in Europe begins:


By June 1940, Germany had achieved a victory in Norway, but the win came at a steep cost. The battle had damaged or sunk over half of Germany’s warships.


This loss was crucial. Hitler desperately wanted to conquer Great Britain, but with half his fleet out of commission, the German navy was no match for the powerful British Royal Navy.


Hitler decided he would conquer Britain by air.


So, what’s going on? Why have these books been such a silent success? The most salient explanation may be the frank, unapologetic decision to offer books about “great battles for boys” in an era when that’s largely absent from classrooms. This may simply be the kind of history that a lot of boys are eager to read about. Of course, even penning that sentence can feel remarkably risqué nowadays, which may be a big part of the problem.


It got me thinking. My elementary-age kids have brought home or been assigned a number of children’s books on history. Most are focused intently on social and cultural history. I’ll be honest. Even as someone who’s always been an avid reader, I find a lot of that stuff pretty tedious. As a kid, I found books about the Battle of Midway or D-Day vastly more interesting than grim tales of teen angst, and I don’t think that makes me unusual. Moreover, it surprises no one (except the occasional ideologue) to learn that girls generally appear more interested in fiction than boys—or that boys tend to prefer reading about sports, war, comedy, and science fiction, while girls favor narratives about friendship, animals, and romance.


Today, when I peruse classroom libraries, recommended book lists, or stuff like the summer reading suggestions from the American Library Association, I don’t see much that seems calculated to appeal to boys.


One reason that boys read less than girls may be that we’re not introducing them to the kinds of books they may like. There was a time when schools really did devote too much time to generals and famous battles, but we’ve massively overcorrected. Indeed, I find that too many “diverse, inclusive” reading lists feature authors who may vary by race and gender but overwhelmingly tend to write introspective, therapeutic tales that read like an adaptation of an especially heavy-handed afterschool special.


Now, my point is not that kids should read this rather than that. Schools should be exposing all students to more fiction and nonfiction, with varied topics and themes. If that requires assigning more reading, well, good.


Then there are the well-meaning educators and advocates who approach book selection as an extension of social and emotional learning. Heck, while writing this column, I got an email promoting the nonprofit I Would Rather Be Reading, which uses “trauma responsive literacy support and social-emotional learning to help children.” I’m sure it’s a lovely organization, but I’d be shocked if any of the books in question feature stoic virtues or manly courage. After all, the therapy/SEL set has worked assiduously to define traditional masculinity as “toxic.” And all this can alienate kids who find the therapy-talk unduly precious or rife with adult pathologies.


I hear from plenty of educators who say they’re reluctant to talk about the needs of boys for fear of being labeled reactionary. But more boys might develop a taste for reading if they encountered more of the kinds of books they’d like to read. I’d take more seriously those who talk about inclusive reading lists if their passion extended to the well-being of those students bored by social justice-themed tracts and if they truly seemed more invested in turning every kid into an avid reader, which requires a diverse mix of books available—including those about “great battles for boys.”


Frederick Hess is an executive editor of Education Next and the author of the blog “Old School with Rick Hess.”


Friday, July 5, 2024

MICHAELA

 A Deeper Desire

Perhaps one of the most important things a leader does is help his or her people find courage. One extraordinary story of a leader doing that in a single moment of genius can be found in an incident that occurred during the French Revolution, in the critical Battle of Toulon. Toulon was a vitally important naval city which the French couldn’t afford to lose, but it had recently fallen to the allied forces determined to quash the Revolution. The revolutionaries were stretched thin, lacking experience and short of leadership. Their hopes of regaining the crucial city were looking slim, and the success of the whole Revolution was hanging in the balance.

Just when defeat looked unavoidable, a twenty-four-year-old artillery officer named Napoleon Bonaparte arrived on the scene. He could see that the only hope of regaining the port lay in establishing a point from which they could effectively bombard the allied ships. In order to do this, one particularly effective but dangerously exposed gun battery needed to be constantly manned. The problem, however, was that the battery’s exposed position meant that those who manned the post were all but certain to die doing so, and eventually it reached the stage where men were simply refusing to take the post, recognising it as the suicide mission that it was. Napoleon knew that the battle had reached a decisive moment, and everything temporarily hung on this crucial point.

He walked through the camp, considering what to do, when an idea struck him. He made a sign with a few words on it. He then attached it to the lethal battery position. After this, it never lacked a man, day or night; indeed, men were fighting over the chance to hold the post. The battle was won, and Napoleon’s name was established. The words he’d written stated simply, ‘The battery for men without fear.’ 

 Something greater than the possibility of an immediate victory had been placed before the men. Something more than the threat of a sanction for cowardice, or an immediate reward for compliance. Something deeper had been appealed to and awoken; the prospect of engaging in something requiring wholehearted, full-blooded courage. The men were ultimately thirsty for the opportunity to harness everything they had, and give it their all. People today are still looking for that kind of invitation: an invitation to give everything you have to something worth fighting for.

Being part of Michaela feels a bit like that.

Katherine Birbalsingh, Michaela: The Power of Culture (33-34). Hodder Education. Kindle Edition.

Tuesday, June 18, 2024

World War One

In early 1914 Bethmann Hollweg’s secretary, Kurt Riezler, published (pseudonymously) a book entitled Characteristics of Contemporary World Politics. In it he argued that the unprecedented levels of armament in Europe were ‘perhaps the most controversial, urgent and difficult problem of the present time.’ Sir Edward Grey, always fond of explanations of the war which minimized human agency, would later agree. ‘The enormous growth of armaments in Europe,’ he wrote in his post-war memoirs, ‘the sense of insecurity and fear caused by them—it was these that made war inevitable. This, it seems to me, is the truest reading of history . . . the real and final account of the origins of the Great War.’

Historians seeking great causes for great events are naturally drawn to the pre-war arms race as a possible explanation for the First World War. As David Stevenson has put it: ‘A self-reinforcing cycle of heightened military preparedness . . . was an essential element in the conjuncture that led to disaster . . . The armaments race . . . was a necessary precondition for the outbreak of hostilities.’ David Herrmann goes further: by creating a sense that ‘windows of opportunity for victorious wars’ were closing, ‘the arms race did precipitate the First World War.’ If the Archduke Franz Ferdinand had been assassinated in 1904 or even in 1911, Herrmann speculates, there might have been no war; it was ‘the armaments race . . . and the speculation about imminent or preventive wars’ which made his death in 1914 the trigger for war. Yet, as both Stevenson and Herrmann acknowledge, there is no law of history stating that all arms races end in wars.

The experience of the Cold War shows that an arms race can deter two power blocs from going to war and can ultimately end in the collapse of one side without the need for a full-scale conflagration. Conversely, the 1930s illustrates the danger of not racing: if Britain and France had kept pace with German rearmament after 1933, Hitler would have had far greater difficulty persuading his generals to remilitarize the Rhineland or to risk war over Czechoslovakia. The key to the arms race before 1914 is that one side lost it, or believed that it was losing it. It was this belief which persuaded its leaders to gamble on war before they fell too far behind. Riezler erred when he argued that ‘the more the nations arm, the greater must be the superiority of one over the other if the calculation is to fall out in favour of war.’ On the contrary: the margin of disadvantage had to be exceedingly small—perhaps, indeed, only a projected margin of disadvantage—for the side losing the arms race to risk a war. The paradox is that the power which found itself in this position of incipient defeat in the arms race was the power with the greatest reputation for excessive militarism—Germany.

Niall Ferguson, The Pity of War: Explaining World War I (82-83). Basic Books. Kindle Edition.

Monday, June 17, 2024

ENIAC

 ENIAC, one of the first “modern” computers, debuted in 1946. It weighed 27 tons, required 240 square feet (22.3 square meters) of floor space, and needed 174,000 watts (174 kilowatts) of power, enough to (allegedly) dim all the lights in Philadelphia when turned on. In 1949, Popular Mechanics predicted that one day a computer might weigh less than 1.5 tons. In the early 1970s, Seymour Cray, known as the “father of the supercomputer,” revolutionized the computer industry. His Cray-1 system supercomputer shocked the industry with a world-record speed of 160 million floating-point operations per second, an 8-megabyte main memory, no wires longer than four feet, and its ability to fit into a small room. The Los Alamos National Laboratory purchased it in 1976 for $8.8 million, or $36.9 million in today’s inflation-adjusted dollars.

But as predicted by Moore’s Law (the number of transistors that can fit on a microchip will double roughly every twenty-four months), modern computing has improved and spread beyond even the wildest speculations of people living at the time of any of these computers (certain science fiction excepted). A team of University of Pennsylvania students in 1996 put ENIAC’s capabilities onto a single 64-square-millimeter microchip that required 0.5 watts, making it about 1/350,000th the size of the original ENIAC. And that was twenty years ago. Popular Mechanics’ prediction proved correct, though a bit of an (understandable) understatement.
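A minimal worked illustration of the doubling rule in the parenthesis above (the twenty-four-month period is the one stated there; the twenty-year horizon is an arbitrary example, not a figure from the book): if a chip holds N₀ transistors now, the rule predicts roughly

N(t) = N₀ × 2^(t/2) transistors after t years,

so over twenty years the count grows by a factor of 2^10 ≈ 1,024—about a thousand-fold—which is the kind of compounding that shrinks a room-sized machine onto a single chip within a working lifetime.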

Still, try telling its 1949 editorial staff that today we hold computers in our hands, place them in our pockets, and rest them on our laps. Laptop computers with 750 times the memory, 1,000 times the calculating power, and essentially an infinitely greater amount of general capabilities as the Cray-1 are now available at Walmart for less than $500. Bearing out the Iron Sky comparison, a smartphone with 16 gigabytes of memory has 250,000 times the capacity of the Apollo 11 guidance computer that enabled the first moon landing. A life’s wages in 1975 could have bought you the computing power of a pocket calculator in 2000. In 1997, $450 could have bought you 5 gigabytes of hard-drive storage that is free today. A MacBook Pro with 8 gigabytes of RAM has 1.6 million times more RAM than MANIAC, a 1951 “supercomputer.” Forget angels on the head of a pin: Intel can fit more than six million transistors onto the period at the end of this sentence.

What this all means for the consumer is an unprecedented spread of technology. Today’s cell phones exceed the computing power of machines that required rooms mere decades ago. Nearly half the world uses the Internet, up from essentially zero in 1990. Today, then, computers are better, faster, smarter, more prevalent, and more connective than ever before. In the 1990s, progressive policy makers fretted over something called “the digital divide.” They convinced themselves that, absent government intervention, the Internet would be a plaything for the wealthy. They raised new taxes, transferred wealth, paid off some constituents, and claimed victory. But the truth is that the Internet was always going to be for everyone, because that is what the market does. It introduces luxuries for the wealthy, and the wealthy subsidize innovations that turn luxuries—cell phones, cars, medicine, computers, nutritious food, comfortable homes, etc.—into necessities. It is the greatest triumph of alchemy in all of human experience, and the response from many in every generation is ingratitude and entitlement.

Jonah Goldberg, Suicide of the West: How the Rebirth of Tribalism, Populism, Nationalism, and Identity Politics Is Destroying American Democracy (352). Random House Publishing Group. Kindle Edition.

MERITOCRACY

 The Free Press

17 June 2024

 Meritocracy now! DEI is on the way out, and not a day too soon. But what should replace it? When it comes to recruitment, the short answer is surely: hire the best person for the job. One person sticking to this once uncontroversial, now edgy proposition is Alexandr Wang, the 27-year-old who became the world’s youngest self-made billionaire after he dropped out of MIT to co-found AI firm Scale in 2016. In a memo announcing the company’s new hiring policy, Wang writes “Scale is a meritocracy, and we must always remain one.” The guiding principle, he notes, is “MEI: merit, excellence, and intelligence.” He continues:  

That means we hire only the best person for the job, we seek out and demand excellence, and we unapologetically prefer people who are very smart.

We treat everyone as an individual. We do not unfairly stereotype, tokenize, or otherwise treat anyone as a member of a demographic group rather than as an individual. 

We believe that people should be judged by the content of their character—and, as colleagues, be additionally judged by their talent, skills, and work ethic.

There is a mistaken belief that meritocracy somehow conflicts with diversity. I strongly disagree. No group has a monopoly on excellence. A hiring process based on merit will naturally yield a variety of backgrounds, perspectives, and ideas. Achieving this requires casting a wide net for talent and then objectively selecting the best, without bias in any direction. We will not pick winners and losers based on someone being the “right” or “wrong” race, gender, and so on. It should be needless to say, and yet it needs saying: doing so would be racist and sexist, not to mention illegal.

Upholding meritocracy is good for business and is the right thing to do.

Friday, May 24, 2024

MEN WANTED

Sir Ernest Shackleton, when he was about to set out on one of his expeditions, printed a statement in the papers to this effect:

“Men wanted for hazardous journey to the South Pole. Small wages, bitter cold, long months of complete darkness, constant danger. Safe return doubtful. Honor and recognition in case of success. Ernest Shackleton, 4 Burlington Street.” [May 15, 1913]

In response to his posted ad, Shackleton was supposedly flooded with 5,000 responses, men clamoring to take their chances on the icy southern continent. [September 10, 1913]

 

(When the survivors returned, WWI was on, so they joined the military.)