Tuesday, June 18, 2024

World War One

In early 1914 Bethmann Hollweg’s secretary, Kurt Riezler, published (pseudonymously) a book entitled Characteristics of Contemporary World Politics. In it he argued that the unprecedented levels of armament in Europe were ‘perhaps the most controversial, urgent and difficult problem of the present time.’ Sir Edward Grey, always fond of explanations of the war which minimized human agency, would later agree. ‘The enormous growth of armaments in Europe,’ he wrote in his post-war memoirs, ‘the sense of insecurity and fear caused by them—it was these that made war inevitable. This, it seems to me, is the truest reading of history . . . the real and final account of the origins of the Great War.’

Historians seeking great causes for great events are naturally drawn to the pre-war arms race as a possible explanation for the First World War. As David Stevenson has put it: ‘A self-reinforcing cycle of heightened military preparedness . . . was an essential element in the conjuncture that led to disaster . . . The armaments race . . . was a necessary precondition for the outbreak of hostilities.’ David Herrmann goes further: by creating a sense that ‘windows of opportunity for victorious wars’ were closing, ‘the arms race did precipitate the First World War.’ If the Archduke Franz Ferdinand had been assassinated in 1904 or even in 1911, Herrmann speculates, there might have been no war; it was ‘the armaments race . . . and the speculation about imminent or preventive wars’ which made his death in 1914 the trigger for war. Yet, as both Stevenson and Herrmann acknowledge, there is no law of history stating that all arms races end in wars.

The experience of the Cold War shows that an arms race can deter two power blocs from going to war and can ultimately end in the collapse of one side without the need for a full-scale conflagration. Conversely, the 1930s illustrates the danger of not racing: if Britain and France had kept pace with German rearmament after 1933, Hitler would have had far greater difficulty persuading his generals to remilitarize the Rhineland or to risk war over Czechoslovakia. The key to the arms race before 1914 is that one side lost it, or believed that it was losing it. It was this belief which persuaded its leaders to gamble on war before they fell too far behind. Riezler erred when he argued that ‘the more the nations arm, the greater must be the superiority of one over the other if the calculation is to fall out in favour of war.’ On the contrary: the margin of disadvantage had to be exceedingly small—perhaps, indeed, only a projected margin of disadvantage—for the side losing the arms race to risk a war. The paradox is that the power which found itself in this position of incipient defeat in the arms race was the power with the greatest reputation for excessive militarism—Germany.

Niall Ferguson, The Pity of War: Explaining World War I (82-83). Basic Books. Kindle Edition.

Monday, June 17, 2024

ENIAC

ENIAC, one of the first “modern” computers, debuted in 1946. It weighed 27 tons, required 240 square feet (22.3 square meters) of floor space, and needed 174,000 watts (174 kilowatts) of power, enough to (allegedly) dim all the lights in Philadelphia when turned on. In 1949, Popular Mechanics predicted that one day a computer might weigh less than 1.5 tons. In the early 1970s, Seymour Cray, known as the “father of the supercomputer,” revolutionized the computer industry. His Cray-1 system supercomputer shocked the industry with a world-record speed of 160 million floating-point operations per second, an 8-megabyte main memory, no wires longer than four feet, and its ability to fit into a small room. The Los Alamos National Laboratory purchased it in 1976 for $8.8 million, or $36.9 million in today’s inflation-adjusted dollars.

But as predicted by Moore’s Law (the number of transistors that can fit on a microchip will double roughly every twenty-four months), modern computing has improved and spread beyond even the wildest speculations of people living at the time of any of these computers (certain science fiction excepted). A team of University of Pennsylvania students in 1996 put ENIAC’s capabilities onto a single 64-square-millimeter microchip that required 0.5 watts, making it about 1/350,000th the size of the original ENIAC. And that was twenty years ago. Popular Mechanics’ prediction proved correct, though a bit of an (understandable) understatement.

Still, try telling its 1949 editorial staff that today we hold computers in our hands, place them in our pockets, and rest them on our laps. Laptop computers with 750 times the memory, 1,000 times the calculating power, and an essentially infinitely greater range of general capabilities than the Cray-1 are now available at Walmart for less than $500. Bearing out the Iron Sky comparison, a smartphone with 16 gigabytes of memory has 250,000 times the capacity of the Apollo 11 guidance computer that enabled the first moon landing. A life’s wages in 1975 could have bought you the computing power of a pocket calculator in 2000. In 1997, $450 could have bought you 5 gigabytes of hard-drive storage that is free today. A MacBook Pro with 8 gigabytes of RAM has 1.6 million times more RAM than MANIAC, a 1951 “supercomputer.” Forget angels on the head of a pin: Intel can fit more than six million transistors onto the period at the end of this sentence.

What this all means for the consumer is an unprecedented spread of technology. Today’s cell phones exceed the computing power of machines that required rooms mere decades ago. Nearly half the world uses the Internet, up from essentially zero in 1990. Today, then, computers are better, faster, smarter, more prevalent, and more connective than ever before. In the 1990s, progressive policy makers fretted over something called “the digital divide.” They convinced themselves that, absent government intervention, the Internet would be a plaything for the wealthy. They raised new taxes, transferred wealth, paid off some constituents, and claimed victory. But the truth is that the Internet was always going to be for everyone, because that is what the market does. It introduces luxuries for the wealthy, and the wealthy subsidize innovations that turn luxuries—cell phones, cars, medicine, computers, nutritious food, comfortable homes, etc.—into necessities. It is the greatest triumph of alchemy in all of human experience, and the response from many in every generation is ingratitude and entitlement.

Jonah Goldberg, Suicide of the West: How the Rebirth of Tribalism, Populism, Nationalism, and Identity Politics Is Destroying American Democracy (352). Random House Publishing Group. Kindle Edition.
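
A rough way to sanity-check the doubling rule Goldberg invokes (transistor counts doubling about every twenty-four months) is to project it forward from a fixed baseline. The short Python sketch below uses Intel’s 1971 4004, commonly cited at roughly 2,300 transistors, as that baseline; the baseline figure and the 2016 target year are illustrative assumptions, not numbers taken from the excerpt.

# Minimal sketch of the doubling arithmetic behind Moore's Law as stated above.
# Assumption: one doubling every two years, starting from ~2,300 transistors
# (the commonly cited count for the 1971 Intel 4004), used here for illustration only.

def moores_law_projection(base_count, base_year, target_year, doubling_period_years=2.0):
    """Project a transistor count forward, assuming one doubling per period."""
    doublings = (target_year - base_year) / doubling_period_years
    return base_count * 2 ** doublings

if __name__ == "__main__":
    projected = moores_law_projection(2_300, 1971, 2016)
    print(f"Projected 2016 transistor count: {projected:,.0f}")  # roughly 10^10

Run as written, the projection lands in the low tens of billions, the same order of magnitude as the largest chips actually shipping around 2016.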

MERITOCRACY

The Free Press

17 June 2024

Meritocracy now! DEI is on the way out, and not a day too soon. But what should replace it? When it comes to recruitment, the short answer is surely: hire the best person for the job. One person sticking to this once uncontroversial, now edgy proposition is Alexandr Wang, the 27-year-old who became the world’s youngest self-made billionaire after he dropped out of MIT to co-found AI firm Scale in 2016. In a memo announcing the company’s new hiring policy, Wang writes, “Scale is a meritocracy, and we must always remain one.” The guiding principle, he notes, is “MEI: merit, excellence, and intelligence.” He continues:

That means we hire only the best person for the job, we seek out and demand excellence, and we unapologetically prefer people who are very smart.

We treat everyone as an individual. We do not unfairly stereotype, tokenize, or otherwise treat anyone as a member of a demographic group rather than as an individual. 

We believe that people should be judged by the content of their character—and, as colleagues, be additionally judged by their talent, skills, and work ethic.

There is a mistaken belief that meritocracy somehow conflicts with diversity. I strongly disagree. No group has a monopoly on excellence. A hiring process based on merit will naturally yield a variety of backgrounds, perspectives, and ideas. Achieving this requires casting a wide net for talent and then objectively selecting the best, without bias in any direction. We will not pick winners and losers based on someone being the “right” or “wrong” race, gender, and so on. It should be needless to say, and yet it needs saying: doing so would be racist and sexist, not to mention illegal.

Upholding meritocracy is good for business and is the right thing to do.