Monday, June 17, 2024

ENIAC

ENIAC, one of the first “modern” computers, debuted in 1946. It weighed 27 tons, required 240 square feet (22.3 square meters) of floor space, and needed 174,000 watts (174 kilowatts) of power, enough to (allegedly) dim all the lights in Philadelphia when turned on. In 1949, Popular Mechanics predicted that one day a computer might weigh less than 1.5 tons. In the early 1970s, Seymour Cray, known as the “father of the supercomputer,” revolutionized the computer industry. His Cray-1 system supercomputer shocked the industry with a world-record speed of 160 million floating-point operations per second, an 8-megabyte main memory, no wires longer than four feet, and its ability to fit into a small room. The Los Alamos National Laboratory purchased it in 1976 for $8.8 million, or $36.9 million in today’s inflation-adjusted dollars.

But as predicted by Moore’s Law (the number of transistors that can fit on a microchip will double roughly every twenty-four months), modern computing has improved and spread beyond even the wildest speculations of people living at the time of any of these computers (certain science fiction excepted). A team of University of Pennsylvania students in 1996 put ENIAC’s capabilities onto a single 64-square-millimeter microchip that required 0.5 watts, making it about 1/350,000th the size of the original ENIAC. And that was twenty years ago. Popular Mechanics’ prediction proved correct, though a bit of an (understandable) understatement.
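[Aside, not part of the quoted text: the doubling claim compounds faster than intuition suggests. Below is a minimal back-of-the-envelope sketch in Python, assuming the Intel 4004's roughly 2,300 transistors in 1971 as a baseline and a strict 24-month doubling period; both figures are my illustrative assumptions, not Goldberg's.]

    # Illustrative Moore's Law compounding; figures are assumptions, not from the quoted text.
    # Assumed baseline: ~2,300 transistors on the Intel 4004 in 1971, doubling every 24 months.
    BASELINE_YEAR = 1971
    BASELINE_TRANSISTORS = 2_300
    DOUBLING_PERIOD_YEARS = 2

    def projected_transistors(year: int) -> float:
        """Project transistors per chip under pure 24-month doubling."""
        doublings = (year - BASELINE_YEAR) / DOUBLING_PERIOD_YEARS
        return BASELINE_TRANSISTORS * 2 ** doublings

    for year in (1971, 1981, 1996, 2016):
        print(f"{year}: ~{projected_transistors(year):,.0f} transistors per chip")
    # 1996 (the year of the ENIAC-on-a-chip project) projects to roughly 13 million;
    # 2016 projects to roughly 14 billion, the right order of magnitude for chips of that era.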

Still, try telling its 1949 editorial staff that today we hold computers in our hands, place them in our pockets, and rest them on our laps. Laptop computers with 750 times the memory, 1,000 times the calculating power, and essentially infinitely greater general capabilities than the Cray-1 are now available at Walmart for less than $500. Bearing out the Iron Sky comparison, a smartphone with 16 gigabytes of memory has 250,000 times the capacity of the Apollo 11 guidance computer that enabled the first moon landing. A life’s wages in 1975 could have bought you the computing power of a pocket calculator in 2000. In 1997, $450 could have bought you 5 gigabytes of hard-drive storage that is free today. A MacBook Pro with 8 gigabytes of RAM has 1.6 million times more RAM than MANIAC, a 1951 “supercomputer.” Forget angels on the head of a pin: Intel can fit more than six million transistors onto the period at the end of this sentence.
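[Another aside: the MANIAC ratio is easy to sanity-check. A quick sketch, assuming MANIAC's commonly cited store of 1,024 forty-bit words, about 5 kilobytes; that baseline figure is mine, not Goldberg's.]

    # Sanity check of the "1.6 million times more RAM than MANIAC" comparison.
    # Assumption (not from the quoted text): MANIAC held 1,024 words of 40 bits each.
    maniac_bytes = 1_024 * 40 // 8   # 5,120 bytes
    macbook_bytes = 8 * 1024 ** 3    # 8 GiB of laptop RAM
    print(f"ratio = {macbook_bytes / maniac_bytes:,.0f}x")  # ~1,677,722x, i.e. about 1.6 million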

What this all means for the consumer is an unprecedented spread of technology. Today’s cell phones exceed the computing power of machines that required rooms mere decades ago. Nearly half the world uses the Internet, up from essentially zero in 1990. Today, then, computers are better, faster, smarter, more prevalent, and more connective than ever before. In the 1990s, progressive policy makers fretted over something called “the digital divide.” They convinced themselves that, absent government intervention, the Internet would be a plaything for the wealthy. They raised new taxes, transferred wealth, paid off some constituents, and claimed victory. But the truth is that the Internet was always going to be for everyone, because that is what the market does. It introduces luxuries for the wealthy, and the wealthy subsidize innovations that turn luxuries—cell phones, cars, medicine, computers, nutritious food, comfortable homes, etc.—into necessities. It is the greatest triumph of alchemy in all of human experience, and the response from many in every generation is ingratitude and entitlement.

Jonah Goldberg, Suicide of the West: How the Rebirth of Tribalism, Populism, Nationalism, and Identity Politics Is Destroying American Democracy (Random House Publishing Group, Kindle Edition), p. 352.
