
Historical Studies in Contemporary America

Nuclear Weapons and the Making of Modern America, 1940-1963

Leslie Groves, military boss of the Manhattan Project, and his two administrative superiors, James Conant (president of Harvard University) and Vannevar Bush (President Roosevelt's science advisor and chief of new weapons development for the U.S. during World War II).

Revolutionary technologies are often placed in the service of national power and objectives. In this seminar on Hiroshima and the making of modern America, we study the creation and use of the first atomic bomb and the ensuing nuclear-arms race between the U.S. and the Soviet Union. We concentrate on the period from 1940 to 1963, and also examine contemporary issues arising from nuclear weapons and efforts to limit their spread to Iran, North Korea, and other nations outside the “club” of accepted nuclear powers. Through diverse perspectives, students will gain an appreciation for the technological and scientific forces that drove the U.S. to global dominance in the years following World War II.

Hiroshima profoundly altered American attitudes toward science and technology, war and international security, education and invention. The bomb also raised questions about faith in technological progress, forcing some Americans to re-evaluate their relationship to the complex systems on which their lives depended. The movement to limit the testing of nuclear weapons, and to control their spread to other nations, spawned the first genuinely global movement to control science and technology. At home, anxieties about nuclear weapons energized an American public more accustomed to trusting experts and government decrees into a more activist, critically engaged civil society, and contributed to the rise of feminism, environmentalism, and movements for “appropriate” and “responsible” innovation. During the Cuban Missile Crisis of October 1962, these anxieties reached a crescendo, bringing the U.S. and the Soviet Union to the brink of nuclear war. At the same time, the Manhattan Project came to embody the view that, given enough resources, technocrats could achieve virtually any end. In short, Hiroshima reveals much not only about the making of modern America but also about the contemporary world.

Computers, Software and the Foundation of Digital Experience, 1945-1995

Failures in the story of computing offer important lessons about how innovation works and why certain technically sound approaches failed to gain acceptance. For instance, Isaacson makes much of Gates’s decision to provide the operating system for the first personal computer offered by IBM in 1981. Then a hegemonic giant in the computer world, IBM anointed Gates as king of PC software. That story is supported by Gates’s retrospective recollection that, at the time, he realized that one operating system, most likely the one chosen by IBM, would become the standard. Yet even if this account is correct, many observers fail to mention that IBM made the decision not to assist Gates in his ascent, but because the company wanted to protect the robust market and future of its more powerful mini- and mainframe computers, whose software at the time was powerful and proprietary to the company. Rather than depict IBM’s decision as inevitable, arising from a set of constrained circumstances (the company needed a PC operating system and didn’t want to take the time to create one for what its engineers considered a toy computer that might quickly be superseded), Isaacson might have widened the historical frame by describing the outlook and mentality of IBM’s computer experts at the time. He would then have learned that IBM had sound engineering reasons for segregating the emerging PC from its then-hegemonic computing machines, which were essential to running large organizations. By missing the contingency at play in IBM’s world circa 1980, Isaacson presents as neat and logical an inflection point in the historical road that was actually messy, provisional, and highly random.
The same kind of teleological thinking infects Isaacson’s repeated attempts to establish that the shape of the digital revolution, and especially the links between the PC as a tool for individual creativity and the rise of the Internet as a means for creative individuals to join forces in interesting ways, owes much to the California counterculture of the 1960s and 1970s. That proposition is largely correct, as John Markoff (author of What the Dormouse Said: How the Sixties Counterculture Shaped the Personal Computer Industry, 2005) and Fred Turner (author of From Counterculture to Cyberculture, 2006) showed a full decade before Isaacson turned up. But the strong links between the counterculture and PC pioneers do not justify Isaacson’s further leap of describing the resulting digital revolution as chiefly motivated and propelled by the goal of shifting power over information from elites to the masses. The populist desire to spread the power of computing as widely as possible indeed reflected democratic impulses long at play in American society. But Isaacson ignores the countervailing force unleashed by the digital revolution: information became radically re-centralized by the Internet, enabling governments to more easily surveil and track dissenters and loyal citizens alike. He is therefore unable to explain how what he views as digital tools of democratic empowerment have become, in the hands of authoritarian agents such as the National Security Agency and the government of China, modalities of surveillance and repression. Once more, Isaacson distorts the history of computing in disturbing ways, focusing on a narrow set of business winners when the effects of the spread of computing on society are far more complicated. While the specter of an all-knowing, all-powerful digital Panopticon overlooks the nuances and complexities of a networked world, the idea that corporate-mediated and corporate-managed computing platforms are sure pathways to individual and collective liberation is no less dangerous an illusion.