Though most Americans don’t realize it, for the past three decades real wages in the United States have not risen. In 2007 the hourly wage of the average American worker, adjusted for inflation, was lower than in 1972.
What caused this prolonged wage stagnation? Most observers acknowledge the role played by economic globalization, in the form of outsourcing and intense competition from abroad. But two other powerful factors figure in the mix: the erosion of equalizing institutions and a glut of available workers. We know a lot about the former, from the decline of unions to the stubbornly low minimum wage. The worker surplus is less familiar. We hear about specific or seasonal labor shortages in the economy (nurses, farm laborers), but general labor shortages are extremely rare. Federal data indicate that we have achieved full employment only three times since World War II: in the early 1950s (Korea), the late ’60s (Vietnam), and the late ’90s (a miracle).
But even in the best times, our high employment rates have been misleading. Take last April—before the full impact of the housing collapse was felt—when the government announced an unemployment rate of 5 percent. Scholars and activists have long questioned the Bureau of Labor Statistics’ unemployment criteria, which count as unemployed only those who have recently looked for work and not found it. Thirty years ago the BLS inaugurated...
About the Author
Frank Stricker, professor of history, labor studies, and interdisciplinary studies at California State University, Dominguez Hills, is the author of Why America Lost the War on Poverty—and How to Win It (University of North Carolina Press, 2007).