Monday, January 12, 2026

KNOWLEDGE FIRST

 AI Changes NOTHING About What Students Need to Learn

The disruptive technology of our age will change many things. What schools teach kids should not be one of them.


Frederick Hess
January 2026


What does AI mean for what schools should teach? The mantra of our age is “AI changes everything.”
I think that’s wrong. Profoundly wrong. In fact, when it comes to what schools should teach, it’s fairer to say, “AI changes nothing.”


Don’t misunderstand me. Obviously, AI is reshaping the economy, the workforce, and the production of 21st century staples like TikTok videos and country songs.


But AI can do its thing without necessarily changing what students should learn.


We’re constantly told by a parade of tech bros, education impresarios, executives, and politicos that “the age of AI” demands less in the way of traditional academics and more focus on “soft skills” like “communication, problem-solving, and collaboration.”


The World Economic Forum’s explainer, “Why AI makes traditional education models obsolete – and what to do about it,” urges educators to ditch “specialized knowledge” and “embrace the ‘how to think’ model.” Harvard’s Howard Gardner predicts that, by 2050, children will need just a few years of “reading, ’riting, ’rithmetic, and a little bit of coding” because “most cognitive aspects of mind . . . will be done so well by large language machines and mechanisms that whether we do them as humans will be optional.” Instead of academics, OpenAI CEO Sam Altman advises students to pursue a “deep familiarity with the tools” and “sort of evolve yourself with technology.” Economist Tyler Cowen says “the curriculum itself is now radically obsolete” and filled with “wasteful instruction.”


TL;DR: Academic content is out; skills and learning “how to think” are in. If this all feels familiar, it should. While the tool may be new, the advice isn’t.


The purveyors of 21st Century Skills have spent decades insisting that it’s foolish for students to spend so much time learning academic content when there are so many more valuable things for students to learn. As Ken Kay, former president of the Partnership for 21st Century Skills, explained in “The Seven Steps to Becoming a 21st Century School or District,” what really matters are “the 4C’s”—“Critical thinking, Communication, Collaboration, [and] Creativity.”


Two decades ago, in a 2006 TIME Magazine cover story on “How to Build a Student for the 21st Century,” Deborah Stipek, the then-dean of Stanford University’s esteemed Graduate School of Education, mocked the idea that students should still learn South American geography, Civil War battles, or the periodic table of elements. Why? As she put it, “You can look it up on Google.”
Harvard’s Tony Wagner has gone even further, declaring, “Knowledge has become a commodity. Free like air, like water . . . available on every internet connected device. There is no longer a competitive advantage in knowing more than the person next to you because they’re going to Google it and figure it out just in time.”


In a new book on reviving the liberal arts, Angela Bauer, the provost at Texas Woman’s University, urges a “seismic shift” in which colleges “transition from a content-based curriculum” and “knowledge-based tests” to “experiential learning theory.” Though she offers all manner of “life skill” alternatives to content-based learning (ranging from ethics to teamwork to cultural competence), it’s never quite clear what—if anything—Bauer thinks today’s graduates should actually know.


That’s a common malady. Indeed, those intent on demoting academic content never offer much more than vacuous, hand-waving incantations as to what students should learn instead. I can’t help but think of the century-old Cardinal Principles of Secondary Education, issued back in 1918 by the National Education Association, which tagged academics as just one of seven priorities for schools—while elevating more “modern” pursuits like “health,” “worthy home-membership,” and the “worthy use of leisure.” That exercise was prompted by early 20th-century industrialization: The commission of experts wrestled with what students needed to know in an era of factory production and world-changing technologies like cars, planes, and radios. They concluded it was less literature and more life skills.


While each subsequent technological era has yielded similar exhortations, the calls took on newfound fervor in the digital age. Back in 1989, former Assistant Secretary of Labor Arnold Packer, a Johns Hopkins scholar and co-author of the hugely influential “Workforce 2000” report, took to the Washington Post to argue it was silly for students to still “cut up frogs” in biology when “workers need to know digital technology.”


Instead of anatomy, what did Packer want students to learn? He raved about a program whose students “use computers and videodiscs to learn about photocopiers and fax machines and about telephones that are used in complex conference calls.” Of course, he wrote, the goal wasn’t simply “learning how to operate a fax machine” but to master “the higher skills that will enable [students] to operate tomorrow’s office equipment; in other words, they are learning to learn.” It’s never quite clear why he thought using a fax machine helps students “learn how to learn” but dissection does not.


Look, I have no longitudinal data here, but I strongly suspect that graduates who were literate, numerate, and modestly knowledgeable about science fared better over the past 35 years than those with even a dazzling mastery of photocopiers and faxes.


I don’t mean to pick on Packer. He’s had lots of company over the years. In 2000, the federal 21st Century Workforce Commission published “A Nation of Opportunity: Strategies for Building America’s 21st Century Workforce,” which identified the three “hottest” jobs in IT: Technical Support, Database Administration, and Computer Programming. Umm, whoops. Those “hot” 21st century jobs weren’t ultimately so hot. In computer programming, the number of positions plunged 60 percent between 2001 and 2019—and that’s before AI started to wreck house. And, in a bit of unfortunate timing, the Commission’s report was immediately followed by a wave of offshoring that gutted U.S.-based technical support. Today, tech support pays a bit under $30 an hour, and not many would consider it a growing field. Turns out it’s hard to predict the shape of the future workforce or the skills graduates will need a decade or two hence.


But workforce projections are cool and, since it takes decades to see if the prognosticators are right, no one’s ever held accountable for being wrong. Meanwhile, talk of a “new workforce” is an excuse to dream up fun lists of nifty “new skills,” which is more appealing than struggling to do better at teaching the old, boring ones. “New skills” make for exciting grant applications, buzzy headlines, and inspiring keynotes. Trying to help kids master history, geometry, geography, or biology? Not so much.


A quarter-century ago, Marc Prensky, the man who coined the term “digital native,” explained, “The single biggest problem facing education today is that our Digital Immigrant instructors . . . are struggling to teach a population that speaks an entirely new language.” You see, wrote Prensky:
[A]fter the digital “singularity” there are now two kinds of content: “Legacy” content (to borrow the computer term for old systems) and “Future” content. “Legacy” content includes reading, writing, arithmetic, logical thinking, understanding the writings and ideas of the past, etc – all of our “traditional” curriculum . . . Some of it (such as logical thinking) will continue to be important, but some (perhaps like Euclidean geometry) will become less so, as did Latin and Greek. “Future” content is to a large extent, not surprisingly, digital and technological.


In practice, making room for “future” content means “legacy” academics get demoted. Expectations decline, rigor gets dismissed as outdated, and the focus drifts from knowledge-rich instruction. This is why the proponents of “new skills” have consistently short-changed students, and why they’re at risk of doing so once again in the age of AI.


First, keep in mind that none of the new skills are especially new. Critical thinking? Collaboration? Communication? If you think these weren’t important for personal and professional success before the digital age, you’re nuts. I mean, some of the most wildly successful books of the past century (like How to Win Friends and Influence People, published in 1936, or The Power of Positive Thinking, published in 1952) covered precisely these skills and how to practice them. The so-called 21st century skills aren’t actually all that new.


Second, all the paeans to photocopiers and Google elide a simple truth: Students can’t think deeply about nothing. Skills are not a replacement for knowledge; they should be complementary. It’s tough to think critically or communicate incisively if you’re just “thinking about thinking” or “communicating about communicating” (or “learning how to learn” about conference calls). These skills are all worth developing, but only if there’s an objective. I mean, there’s nothing about studying “legacy” content—literature, history, math, science—that should get in the way of students learning empathy, collaboration, and problem-solving. Hell, these subjects are rife with opportunities to practice and master those skills.


So, then, how should we prepare students for the “age of AI”?


Here’s a hot take: Give students a robust, content-rich education. Make it rigorous and engaging. Teach reading, writing, math, literature, history, geography, science, world languages, and the arts. Teach Civil War battles, Euclidean geometry, dissection, the periodic table, and much else. Sure, cultivate useful skills. But job one for schools should be teaching a broad base of knowledge that will prepare students to be autonomous, thoughtful adults, no matter what the workforce actually looks like in 2046 (when today’s 4th graders turn 30).


Ultimately, the assertion that AI makes knowledge less valuable is more talking point than truism. As Ohio State’s Michael Clune aptly observed recently in The Atlantic, AI requires students to “analyze its written responses,” identify “inaccuracies,” “integrate new information with existing knowledge,” “envision new solutions,” “make unexpected connections,” and “judge when a novel concept is likely to be fruitful.”


Guess what? All those tasks depend on knowledge. You can’t identify inaccuracies, integrate new information, envision new solutions, make connections, or judge concepts absent baseline understanding. Clune quotes sociologist Gabriel Rossman, who notes, “Careful use of AI helps me at work, but that is because I completed my education decades ago and have been actively studying ever since.”


Leveraging AI’s vaunted capabilities requires deep, fluid knowledge. You want AI to help plan a manned mission to Mars? Great. You better know enough about orbital dynamics, mass-thrust ratios, material strength, atmospherics, and nutrition to ask the right questions. You want AI to help pen a country song? You’re well-served by being versed in lyrics, melody, editing, and cultural touchpoints.
Students have studied literature, history, languages, geography, geometry, and chemistry for centuries through all manner of innovations (including the steam engine, factory, airplane, transistor, and personal computer). Why? Because this is the corpus of knowledge that, when taught responsibly and well, helps students understand their humanity and their world. This is how schools prepare responsible citizens, productive adults, and autonomous human beings. Advances in technology, even one as staggering as AI, don’t change that. This is a timeless lesson—one we’re apparently obligated to learn time and again.


[This post originally appeared on the blog Old School with Rick Hess at Education Next.]

Friday, January 9, 2026

Harvard Drops Western Civilization

 The Daily Signal—Commentary
Harvard Says Yes to Discrimination, No to Western Civ
Daniel McCarthy | December 30, 2025

Daniel McCarthy is the editor of Modern Age: A Conservative Review and a columnist for The Spectator and Creators Syndicate.


At Harvard University today, professors who teach Western history are history.
James Hankins, a specialist in Renaissance thought, was one of the last holdouts.


Now Hankins, who has just published a hefty book that teaches what Harvard doesn’t—The Golden Thread: A History of the Western Tradition, Vol. 1—has decamped for the University of Florida’s Hamilton School of Classical and Civic Education.


It’s not the warmer weather that’s drawn him away from Cambridge, Massachusetts.
It’s the contrast in intellectual climates: frozen and dead, where Western history is concerned, at Harvard; full of green shoots at the University of Florida.


“We have not hired with tenure a historian in a Western field—ancient, medieval, early modern, or modern—in a decade,” Hankins says about his Harvard department, which in that time “lost eight senior historians in Western fields—all major figures—through death, retirement or departure for other universities. I will be the ninth, and I am not expecting to be replaced.”


The loss isn’t just Harvard’s: “The replacement of Western history by global history” has done “serious harm … to the socialization of young Americans,” the historian warns in Compact Magazine.


“When you don’t teach the young what civilization is, it turns out, people become uncivilized.”
In the 40 years Hankins taught at Harvard, he saw his profession shift its focus from European civilization to cultures once considered barbarian:


“In this absurdist rendering of world history, Central Asian peoples,” for example, “are presented as the drivers of cultural innovation, spreading their benign influence east and west via the Silk Road.”


This is marketed as “‘de-centering the West,’ where Western countries are literally put in their place as an ugly growth on the back side of Eurasia.”


Mongol hordes aren’t the only protagonists of the new history, however: anyone who isn’t white qualifies.


A similar standard applies to elite graduate-school admissions, in Hankins’ experience: “I was told informally by a member of the admissions committee” in 2021, he recalls, that “admitting a white male … was ‘not happening this year.’”


As for Harvard’s white male undergraduates, their skin color and sex are obstacles to academic advancement.


“[T]he best student at Harvard—he won the prize for the graduating senior with the best overall academic record—was rejected from all the graduate programs to which he applied. He too was a white male,” the historian recounts.


“I called around to friends at several universities to find out why on earth he had been rejected. Everywhere it was the same story: Graduate admissions committees around the country had been following the same unspoken protocol as ours.”
Could anyone get past this illegal discrimination?


“The one exception I found to the general exclusion of white males had begun life as a female,” Hankins writes.


Hankins was one of the mere 3% of Harvard faculty who identify as conservative.
Yet even progressive scholars admit there’s a problem.


Theda Skocpol, Harvard’s Victor S. Thomas Professor of Government and Sociology, is nobody’s idea of a conservative.


In a recent interview published in the journal Sociologica, however, Skocpol acknowledges, “We in the universities … went overboard with trying to boost some groups over others.”


That came in the midst of lowering standards—the average GPA at Harvard is now, notoriously, around 3.8: an “A-.”


“We went too far in academia for a decade before Trump, in giving the impression that students’ feelings matter more than what they learn. … We’re at the point at Harvard where everybody thinks they should get an ‘A,’ and they don’t think they should do much to get it.”


But how seriously should taxpayers, who foot the bill for much of higher ed’s mischief, take a professor like Skocpol when she says, “I believe in actual equality, applying the same standards to everyone, not using special quotas or excuses for particular groups”?


Not very seriously, when in the same interview she says Trump administration officials cracking down on the academy’s follies “all resent some woman—particularly some Black woman—who told them what to do or who got what they think they deserved.”


Harvard’s last president, Claudine Gay, was a black woman whose scholarship was infected with plagiarism. She’s still employed by Harvard, with a compensation package said to be in the neighborhood of $900,000.


The contrast between Gay’s treatment as a black woman and the white male students with exemplary records who have been discriminated against the way Hankins describes should make even a Theda Skocpol question her premises about why conservatives criticize higher ed.


Harvard and its peers have replaced Western history with global history—and merit with identity politics.


If scholars of Hankins’ caliber keep migrating to the freer and fairer setting of places like Florida, Harvard will at last find itself replaced, too.


COPYRIGHT 2025 CREATORS.COM

Friday, August 22, 2025

INDIAN STUDENT

 Date: August 20, 2025 at 11:11:34 PM EDT

—From an Indian student who came (with his parents) to the in-person TCR History Camp at the Commonwealth School in Boston this Summer:


        When I got accepted into the camp in early February, I was almost ecstatic. I had been looking for a space where history was taken seriously, not treated like a subject to memorize, as it is in India. I wasn’t sure how my work on colonial education policy in India would fit in. Most people were writing on classical history or the American cultural revolution, while I came with notes on Macaulay’s Minute, missionary schools, and the rewriting of Indian history through textbooks. Instead of being out of place, it turned into an advantage. People pushed me with questions I hadn’t faced before: Was colonial schooling in India different from British models in Africa? How did textbooks play a role in the epistemic conquest?


        What stood out at camp was how much attention went to the process of writing history. Back home, no one asks how you frame a thesis or whether you’re using primary sources well. At the TCR History Camp, those were the first things people wanted to know. It forced me to explain my ideas on epistemes, how colonial schools created new ways of thinking, and to back them up clearly. That kind of feedback was different from anything I had experienced in India. I got one-on-one sessions with each of the instructors and I have to specially thank Mr. Gray for working closely with me and pushing me to deliver to a very high standard of writing.


        The History Camp was also my first time in Boston. I hate the cliché, but it was love at first sight. I spent hours at the Boston Public Library, walked along the Charles River, and wandered down Newbury Street. The mix of historic architecture, the energy of the college culture, and the sense of history running through the city made it hard not to be drawn in.


        The camp left me with a clearer sense of what I was doing. I arrived thinking my topic was narrowly Indian. I left knowing it was part of a larger story about how empires use schools to shape knowledge, and how those structures stay in place long after the empire is gone. 


        Thank you TCR for this experience. I will treasure it.

Friday, August 15, 2025

PEARL HARBOR

The Axis propagandists had caricatured America as helplessly splintered by race, ethnicity, class, and creed; as a pampered, luxury-loving society in which the only cause that aroused the people was the pursuit of the almighty dollar; as a nation of loafers and malingerers, overpaid, overfed, and over-enfranchised, in which politicians went with hats in hand to receive the benediction of union bosses. To their eyes, the United States was a sprawling, individualistic, leisure-loving nation, strung out on jazz, movies, baseball, comic strips, horse racing, and radio comedies—anything but work, and certainly not the work of marching off to war. 

It was a nation enfeebled by divided government, with power impotently shared by the president and Congress and law courts, all constantly put upon by an insolent, unbridled press. Americans were a parochial, self-absorbed, inward-looking people, who could not care less about the rest of the world, who would never consent to spill a drop of blood to defend England, the villainous oppressor of their revolutionary heritage; or Russia, the nerve center of global communism; or any part of Asia, a place so distant and alien it could have been in another galaxy. 

The critique was shallow and disingenuous, a collage of crude stereotypes and half-truths. But even in America one heard self-criticism along similar lines, and the nation was clearly unprepared to confront the Axis in 1941. The American people did not like the naked aggression of Germany and Japan, and a majority favored Roosevelt’s policy of providing munitions and material support to their victims. But entering the war was a very unpopular prospect. In a Gallup poll taken less than two months before Pearl Harbor, only 17 percent of Americans had favored war with Germany. There was open talk of mass desertions from the army, encapsulated in the mutinous acronym “OHIO,” or “Over the hill in October.” 

In August, the House had voted to extend the peacetime draft by the 1-vote margin of 203 to 202. The isolationist movement actually grew stronger in the weeks leading up to Pearl Harbor, with its leaders shouting to packed public halls that Roosevelt was conspiring to foment war with the Axis. By 1941, the president saw that war was coming but could do nothing more than he had already done to change the temper of the American people. He could only wait for some inciting incident or provocation. “The turning point,” Hitler had called the Japanese attack on Pearl Harbor. “A complete shift in the general world picture,” Goebbels had concluded. 

They were right, but not in the sense they intended. Before December 7, 1941, the American industrial economy, lying completely beyond the reach of Axis bombers or armies, had been the single best hope of the embattled Allies. Only by militarizing that economy, harnessing it entirely to war production, could the power of the Axis be destroyed. But the sprawling republic would never be mobilized or militarized without the consent of the American people. “There was just one thing that they [the Japanese] could do to get Roosevelt completely off the horns of the dilemma,” wrote the presidential speechwriter Bob Sherwood, “and that is precisely what they did, at one stroke, in a manner so challenging, so insulting and enraging, that the divided and confused American people were instantly rendered unanimous and certain.” If the Second World War could be said to have pivoted on a single point, it was not the Battle of Britain, or El Alamein, or Stalingrad, or the fall of Italy. Pearl Harbor, by giving Roosevelt the license to do what needed to be done, sealed the fate of both Germany and Japan.

Ian W. Toll, Pacific Crucible: War at Sea in the Pacific, 1941–1942 (The Pacific War Trilogy, Vol. 1), pp. 61–62. Kindle Edition.

Monday, August 4, 2025

ARSENAL

The numbers are still staggering, every time we look at them. From 1940 to 1945, the U.S. produced 141 aircraft carriers; eight battleships; 807 cruisers, destroyers, and destroyer escorts; and 208 submarines. It produced 324,000 aircraft, 88,410 tanks, 2.4 million trucks, 2.6 million machine guns, and 41 billion rounds of ammunition.


Magazine, September 2025 Issue

How We Built the Arsenal of Democracy
By Arthur Herman
July 24, 2025 


And how we can do it again

The bad news is, 30 years after the end of the Cold War, our nation’s defense-industrial base is in serious crisis.


According to a recent report from the Commission on the National Defense Strategy for the United States (an independent, bipartisan group established by Congress in 2022), the factories, facilities, plants, and shipyards of our current defense-industrial base are “grossly inadequate” for confronting the dual threats of Russia and China. The Defense Department agrees. Its first-ever Defense Industrial Strategy document highlighted “serious shortfalls” in the existing base, including manufacturing, supply chains, workforce, and production, and it concluded that “this call to action may seem a great cost, but the consequences of inaction or failure are far greater.”


That was under the Biden administration. The “big, beautiful bill” passed by Congress and signed by Trump at least tries to undo the damage of the past 30 years. It sets aside $29 billion for shipbuilding and other spending tied to our naval and maritime industrial base; our officials are belatedly realizing the two are intertwined and inseparable. It spends another $25 billion for munitions spread across various programs — the war in Ukraine demonstrated that our industrial base is not making enough conventional artillery shells. Another $5 billion will be invested in the critical minerals needed for building today’s weaponry, and $16 billion will go toward innovative technologies such as drones, AI, and low-cost weapons.


All this, however, will take time, which is in shorter supply even than money. All in all, it’s a grim situation we’re only beginning to address.


We’ve been here before, on the eve of World War II. The result five years later was the creation of the greatest military-industrial complex in history. But the answer then, as now, didn’t spring from Congress or the Oval Office. It came from corporate boardrooms around the country.


In 1940, the United States had the 18th-largest army in the world, right behind tiny Holland. While equipped with modern carriers and battleships, the Navy faced too many global commitments with meager resources; it was not prepared to face a potential invader like Hitler’s Germany. General George Marshall, Army chief of staff, warned Roosevelt that if Hitler landed five divisions on American soil, there would be nothing he could do to stop them. Meanwhile, within a year and a half, the Navy’s vaunted battleships sat at the bottom of Pearl Harbor.


America found itself systematically unable to meet the demands of modern mechanized warfare on land, sea, or air. Neither the War nor Navy Departments had plans for how to revive a defense-industrial base that had been largely dismantled after World War I.


That critical summer of 1940, Roosevelt found a corporate leader willing to undertake the task: mass-production wizard and General Motors President William “Big Bill” Knudsen. Knudsen told FDR that, given 18 months’ head start, he and his colleagues could mobilize enough of American industry to trigger the single greatest outpouring of modern weaponry the world had ever seen, from planes, tanks, and machine guns to ships, submarines, and aircraft carriers.


Roosevelt decided to give Knudsen and his colleagues a free hand, based on three key principles.


The first was mobilizing America’s biggest and most productive companies to make what was needed, even if they had never made war matériel before. Knudsen turned to the automotive, steel, chemical, and electronics industries because they had the largest engineering departments — men (and sometimes women) who could figure out how to produce the decisive weapons the military needed in record numbers, from bazookas (GE) and torpedoes (Westinghouse) to entire B-24 bombers (Ford) — eventually even the plutonium for an atomic bomb (DuPont).


Second, Knudsen insisted that FDR clear away antibusiness tax rules and regulations, including suspending antitrust laws, so industry could focus on producing what the armed forces needed, not dodging government lawsuits. That included pushing aside the Navy and War Departments’ antiquated rules for procurement, which reflected leisurely peacetime conditions, not wartime emergency.


Third, Knudsen insisted on keeping the entire process voluntary, so corporate leaders would be free to decide on their own which war matériel they were best suited to contract for, and how to produce it. The point was to reduce Washington’s interference in the production process and make sure that federal dollars followed the trail of productivity and innovation, not the other way around.


The plan worked. By the time Japanese bombs fell on Pearl Harbor in December 1941, the scale of American war production was already approaching that of Nazi Germany. America was on the way to becoming what Roosevelt famously dubbed the “arsenal of democracy” (a phrase that was coined by Knudsen). By the end of 1942, America was producing more tanks, ships, planes, and guns than the entire Axis.


The numbers are still staggering, every time we look at them. From 1940 to 1945, the U.S. produced 141 aircraft carriers; eight battleships; 807 cruisers, destroyers, and destroyer escorts; and 208 submarines. It produced 324,000 aircraft, 88,410 tanks, 2.4 million trucks, 2.6 million machine guns, and 41 billion rounds of ammunition.


By 1944, American industry was producing eight aircraft carriers a month, 50 merchant ships a day, and a warplane every five minutes. Two-thirds of all the war matériel used by all the Allies in World War II came from America — as did the most powerful innovative weapon in history, the atomic bomb.


In one sense, the challenge of turning around our current defense-industrial base will be easier this time. In terms of talent, innovation, and basic physical facilities, we already have the best military-industrial complex on the planet. It’s true that we are no longer the manufacturing center of the world we once were. In 1945, the United States hosted one-half of the world’s industrial capacity; today it’s less than 16 percent, while China enjoys the edge at just over 30 percent. The Chinese have unleashed their manufacturing base to build up their forces on land, sea, air, and space in ways that suggest they’ve learned — or at least think they’ve learned — the lessons of World War II better than we have.


One thing is clear. The rebuilding of our defense-industrial base can’t, and won’t, rest on the big defense contractors alone, the Boeings, the Lockheed Martins, and the Northrop Grummans. They bring a lot to the process: talent, experience, and unsurpassed skill in integrating many supply chains and subcontractors into complex workable wholes. But they are not the drivers of today’s high-tech economy the same way that Ford and GM and General Electric were the drivers of our industrial economy when they armed America for World War II. They have become too tethered to the bureaucratic workings of today’s Pentagon to be the main architects of tomorrow’s.


Fortunately, a new generation of patriotic business leaders — from Elon Musk and Vivek Ramaswamy to the CEOs of high-tech defense companies like Palantir and Anduril and General Atomics — is waiting for the opportunity to transform their companies into powerhouses of a new arsenal of democracy. The same is true of the leaders of the “Magnificent Seven” tech companies, including Jeff Bezos of Amazon, Sundar Pichai of Google, and Jensen Huang of Nvidia. Give them the right kind of call, as Roosevelt did with Bill Knudsen and Henry Kaiser in 1940, and they will answer.


This raises a broader point about rebuilding our defense industries. If we are going to leap ahead of China in critical areas such as shipbuilding, space, and hypersonics, we’ll have to deploy an extremely innovative series of technologies and policies that allows us to reinvent our manufacturing base as a whole. A great place to start will be the defense-industrial base.


In fact, those who bemoan the shrunken state of our manufacturing economy and complain that we don’t have enough time or resources or workers to rebuild that base are looking at the problem from the wrong end. By reinventing the economic sector most vital to our national security — the defense-industrial sector — we can achieve ripple effects throughout the rest of the economy. But only if we unleash the energy and dynamism of the private sector to solve our most pressing issues in the public sector.


Overall, there are six steps we can take to reinvent a defense-industrial base for the 21st century.


First, the new administration has to sweep aside the regulations and obstacles that slow our productive defense sector. One of those obstacles is the Pentagon budget system itself, which is encrusted with rules and red tape more suited to the industrial age than the space age. Congress needs to adopt the reforms recommended by its commission on Pentagon budget reform. (Full disclosure: I helped to write that commission’s interim and final reports.)


Second, engage the best advanced technologies to accelerate production and innovation. Most big defense-manufacturing facilities are set up for very limited types of production, such as the F-35 or Virginia-class submarine. The future of defense production, however, lies in diversifying the manufacturing process itself, through the use of AI, robotics, and 3-D printing. The ultimate goal should be to produce multiple products at once: for example, advanced sensors with one line geared for defense and national security, the other for commercial purposes.


Third, make use of a host of smaller, leaner, more specialized defense companies that provide vital supply chain support and subcomponents for larger defense contractors such as Lockheed Martin, RTX, and Anduril — all here in the United States. They will also be important laboratories for developing new approaches to manufacturing and the technologies that will support that effort — not only for our defense base but for a revived commercial base as well.


Fourth, incentivize a new generation of workers for defense and defense-related industries. A study by the think tank Third Way showed that in 2022 some 600,000 Americans were in registered apprenticeship programs. That’s barely 0.3 percent of the working-age population in the country. That rate is five times higher in Canada, seven times higher in Germany, and twelve times higher in Switzerland.
Instead of focusing on apprenticeship programs for the big defense contractors, bring the programs to the smaller, more innovative players. In Germany, for example, so-called Mittelstand (small and medium-size businesses) actively engage their younger workers in fashioning the business itself: More than 80 percent offer incentives for workers to contribute new ideas. The American equivalent can be seedbeds for building a new industrial workforce that is engaged, creative, and productive.


Fifth, enlist the universities in developing defense-related technologies, including AI and quantum computing. University-based research and development were crucial to the military-industrial complex during the Cold War. One of its historic offshoots was Silicon Valley. Bringing university research to small- to middle-scale defense firms, not just the big contractors, can save not only our defense-industrial base but also our universities in the post-woke era.


Sixth, incentivize venture capitalists to fund our national security. Venture capitalists are expert at finding opportunities in commercial markets but not so good at understanding defense applications. If we rethink defense production as a step in successful commercial manufacturing, rather than the other way around, we could open the floodgates for the $1.3 trillion venture capital market to flow directly into the defense and defense-related realm. That would be a key advantage over China, as well, where new venture capital investment in 2025 will barely hit $70 billion.


In that regard, it’s time for the Pentagon to encourage defense producers to think about how their products can open a niche in commercial markets as well as meet military requirements. That won’t just draw in private capital investment. It’s how defense producers can make their products more innovative and cost-effective, in order to compete in the commercial marketplace.


The arsenal of democracy in World War II was built by companies large and small who had first made their mark as commercial companies. Defense specialists — firms with little or no commercial business — accounted for only 6 percent of the Defense Department’s major programs at the end of the Cold War. In 2024, it was 61 percent. It’s time to turn those numbers back around. By doing so, by unleashing the energy, creativity, and drive of the private sector to rebuild our defense-industrial base, we can trigger a tech-industrial revival of the American economy — one that makes us more secure and more prosperous far into the future.

Friday, July 11, 2025

FATHERHOOD

If we follow the evidence where it leads, we must conclude that the biggest intervention in education is not another literacy coach or SEL program. It’s dad.


AEI

Family Structure Matters to Student Achievement. What Should We Do with That?


By Robert Pondiscio
The Next 30 Years
July 10, 2025

A recent report from the University of Virginia—Good Fathers, Flourishing Kids—confirms what many of us know instinctively but rarely acknowledge, or avoid altogether, in education debates: The presence and engagement of a child’s father have a powerful effect on their academic and emotional well-being. It’s the kind of data that should stop us in our tracks—and redirect our attention away from educational fads and toward the foundational structures that shape student success long before a child ever sets foot in a classroom.


The research, led by my AEI colleague Brad Wilcox and co-authored by a diverse team that includes another AEI colleague, Ian Rowe, finds that children in Virginia with actively involved fathers are more likely to earn good grades, less likely to have behavior problems in school, and dramatically less likely to suffer from depression. Specifically, children with disengaged fathers are 68% less likely to get mostly good grades and nearly four times more likely to be diagnosed with depression. These are not trivial effects. They are seismic.


Most striking is the report’s finding that there is no meaningful difference in school grades among demographically diverse children raised in intact families. Black and white students living with their fathers get mostly A’s at roughly equal rates—more than 85%—and are equally unlikely to experience school behavior problems. The achievement gap, in other words, appears to be less about race and more about the structure and stability of the family.


This may be a surprising finding to some, but not to William Jeynes, a professor of education at California State University, Long Beach, whose meta-analyses have previously demonstrated the outsized academic impact of family structure and religious faith. (The new UVA report does not study the role of church-going.) As I wrote in How the Other Half Learns, Jeynes’ work highlights how two-parent households and religious engagement produce measurable benefits in educational achievement. “When two parents are present, this maximizes the frequency and quality of parental involvement. There are many dedicated single parents,” Jeynes has noted. “However, the reality is that when one parent must take on the roles and functions of two, it is simply more difficult than when two parents are present.” Jeynes’ most stunning finding, and his most consistent, is that if a Black or Hispanic student is raised in a religious home with two biological parents the achievement gap totally disappears—even when adjusting for socioeconomic status.


My colleague Ian Rowe has been a tireless advocate for recognizing and responding to these patterns. He has long argued that NAEP—the Nation’s Report Card—should disaggregate student achievement data by family composition, not just by race and income. That simple step would yield a more honest accounting of the challenges schools are facing—and help avoid both unfair blame and unearned credit.


Yet this conversation remains a third rail in education. Many teachers and administrators are understandably wary of saying too much about family structure for fear of stigmatizing children from single-parent households—particularly in settings where single-parent households are dominant. Rowe has also faced resistance to his efforts to valorize the “Success Sequence,” the empirical finding that graduating high school, getting a full-time job, and marrying before having children dramatically increases one’s odds of avoiding poverty. But being cautious is not the same as being silent, and it’s not compassionate to pretend these dynamics don’t matter when the data so clearly shows that they do.


None of this absolves educators of their duty to reach and teach every child. But it does suggest we should be clear-eyed in how we interpret data and set expectations. Teachers, particularly those in low-income communities, often shoulder the full weight of student outcomes while lacking the ability to influence some of the most powerful predictors of those outcomes. That’s frustrating—and understandably so.


Citing compelling evidence on fatherhood and family formation is not a call for resignation or excuse-making. It’s a call for awareness and intelligent action. While schools can’t influence or re-engineer family structure, teachers can respond in ways that affirm the role of fathers and strengthen the school-home connection. They can make fathers feel welcome and expected in school life—not merely tolerated. They can design family engagement activities that include dads as co-participants, not afterthoughts. They can build classroom cultures that offer structure and mentoring, especially to students who may lack it at home. And maybe—just maybe—the field can overcome its reluctance to share with students what research so clearly shows will benefit them and the children they will have in the future. Rowe takes pains to note his initiative to teach the Success Sequence is intended to help students make decisions about the families they will form, not the ones they’re from. “It’s not about telling them what to do,” he says, “it’s about giving them the data and letting them decide for themselves.”


This leads to a final point, and for some an uncomfortable one: If we truly care about student outcomes, perhaps we should be willing to support the institutions that reliably foster them. And that includes religious schools.


Religious schools—particularly those rooted in faith traditions that emphasize marriage, family life, and moral formation—often create environments where the presence of fathers and the reinforcement of shared values are not incidental but central. A recent analysis by Patrick J. Wolf of the University of Arkansas, published in the Journal of Catholic Education, found that adults who attended religious schools are significantly more likely to marry, stay married, and avoid non‑marital births compared to public‑school peers. The effects are most pronounced among individuals from lower‑income backgrounds.


In states with Education Savings Accounts (ESAs) and other school choice mechanisms, we have an opportunity—perhaps an obligation—to expand access to these institutions. That’s not merely a question of parental rights or religious liberty. It’s a matter of public interest. If these schools produce better education and social outcomes by encouraging family formation and reinforcing the value of fatherhood, the public benefits—even if instruction is delivered in a faith-based context. Said simply: The goal of educational policy and practice is not to save the system. It’s to help students flourish.


So yes, let’s fund fatherhood initiatives. Let’s run PSAs about the importance of dads. But let’s also get serious about expanding access to the kinds of schools—whether secular or religious in nature—that support the kind of family culture where children are most likely to thrive. Because if we follow the evidence where it leads, we must conclude that the biggest intervention in education is not another literacy coach or SEL program. It’s dad.

Thursday, July 3, 2025

BOOK BANNING

 What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one.


Foreword to Amusing Ourselves to Death
Neil Postman, 1985

We were keeping our eye on 1984. When the year came and the prophecy didn’t, thoughtful Americans sang softly in praise of themselves. The roots of liberal democracy had held. Wherever else the terror had happened, we, at least, had not been visited by Orwellian nightmares. 

But we had forgotten that alongside Orwell’s dark vision, there was another—slightly older, slightly less well known, equally chilling: Aldous Huxley’s Brave New World. Contrary to common belief even among the educated, Huxley and Orwell did not prophesy the same thing. Orwell warns that we will be overcome by an externally imposed oppression. But in Huxley’s vision, no Big Brother is required to deprive people of their autonomy, maturity and history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think. What Orwell feared were those who would ban books. 

What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture, preoccupied with some equivalent of the feelies, the orgy porgy, and the centrifugal bumblepuppy. As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny “failed to take into account man’s almost infinite appetite for distractions.” In 1984, Huxley added, people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. In short, Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us. 

This book is about the possibility that Huxley, not Orwell, was right.

Postman, Neil. Amusing Ourselves to Death: Public Discourse in the Age of Show Business. Penguin Group. Kindle Edition.