The election of Donald Trump, and the daily infliction of another huckster, ideologue, paranoid, or unrepentant one-percenter cabinet appointment, has upended the politics of inequality. The defining issue of our time, not an insignificant source of Trump’s victory, is disappearing from the national political radar. So it is dismally appropriate, in the days between the appointments of Ben Carson at Housing and Urban Development and Andrew Puzder at Labor, that Thomas Piketty and colleagues have released updated and revised estimates of the share of national income going to top earners.
Making novel use of tax returns, this research first highlighted the now-iconic “suspension bridge” of top income shares—high at the tail end of the Gilded Age and through the laissez-faire 1920s, descending sharply through the shocks of the Great Depression and the Second World War (and the policies that accompanied both), and then rising again as the postwar political and economic compact was dismantled. The new paper adds to this story in three important ways. First, drawing on a wider range of income data (combining tax, survey, and national accounts), it offers a more complete picture of income distribution—capturing not just the top 10 percent but (for the last half-century) the bottom 50 percent and the middle 40 percent as well. Second, it traces those shares before and after taxes and transfers, offering a clear view of the distributional impact of government programs. And third, it teases apart household or tax-unit income measures, and then uses individual incomes to suggest the ways in which gender inequality has shaped these larger trends.
Here are a couple of glimpses at the new data. The first graphic shows the share of national income claimed by the top 10 percent and top 1 percent of earners, with a toggle for the pre-tax and post-tax estimates.
Thanks to Piketty’s landmark 2013 book, the basic trajectory of top incomes is by now well known. But it is even starker if we express those incomes in real (inflation-adjusted) dollars. The second graphic traces average incomes—pre- and post-tax—for the top 10, 5, 1, 0.5, 0.1, and 0.01 percent of earners. Across most of this history, the impact of the tax system on those incomes is slight. I have calculated a crude effective tax rate (the difference between pre- and post-tax incomes as a percentage of pre-tax incomes) for each top income group. While rates spike for those at the very top of the income distribution during the World War II era, they fall off quickly—and since the 1970s have settled in at less than 20 percent for the top 10 percent of earners, and less than 30 percent for the top 0.01 percent.
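For readers who want the arithmetic spelled out, here is a minimal sketch of that crude effective-tax-rate calculation. The income figures are hypothetical placeholders, not values drawn from the Piketty series.

```python
# A minimal sketch of the crude effective-tax-rate calculation described above:
# (pre-tax income - post-tax income) as a share of pre-tax income.
# The income figures below are hypothetical placeholders, not values from the series.

def effective_tax_rate(pre_tax: float, post_tax: float) -> float:
    """Return the difference between pre- and post-tax income as a share of pre-tax income."""
    return (pre_tax - post_tax) / pre_tax

# Hypothetical average incomes (in dollars) for two top income groups.
groups = {
    "top 10 percent": {"pre_tax": 300_000, "post_tax": 245_000},
    "top 0.01 percent": {"pre_tax": 30_000_000, "post_tax": 21_500_000},
}

for name, incomes in groups.items():
    rate = effective_tax_rate(incomes["pre_tax"], incomes["post_tax"])
    print(f"{name}: effective tax rate = {rate:.1%}")
```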
Beginning in the 1960s, the survey and tax data is robust enough to generate an income share for the bottom 50 percent of the distribution as well. The final graphic traces the real average income of the bottom 50 percent in the lower panel, and the real average income of top income groups in the upper panel. The grey bars show the ratio between the two. The real average income of the bottom 50 percent is essentially flat, peaking at $16,632 in 1979 and stagnating thereafter ($16,197 in 2014). In 1979, the average one-percenter earned 28 times the income of the average earner from the bottom 50 percent; by 2014, this ratio had ballooned to 81 times. In 1979, the average ten-percenter earned 9 times the income of the average earner from the bottom 50 percent; by 2014, this ratio had ballooned to 19 times.
These trends are dismal, and show no sign of abating. As the economy has slowly recovered from the Great Recession, wages have scarcely budged. Income gains since 2007 have flowed almost exclusively to the richest 1 percent. While this new data brings us through those dismal years (to 2014), it is also clearly a harbinger of worse to come. By all indications, the incoming administration is not just indifferent to the root causes—growing wage inequality, financialization, the collapse of progressive taxation—but eager to double down on all of them.
When Ta-Nehisi Coates made his influential “Case for Reparations” in the pages of the Atlantic in 2014, his focus, perhaps surprisingly, was not on slavery. It was on housing discrimination—all of the measures that, from the 1870s on, prevented former slaves and their descendants from getting their “forty acres and a mule,” from owning land and building wealth. While rooted in Jim Crow and the lost promise of Reconstruction, this disadvantage fully flowered in the middle years of the twentieth century—an era in which federal programs opened new opportunities for white families, but allowed the southern Congressional veto and local discrimination in both North and South to shut out most black families. It was an era, as Ira Katznelson and others have shown, “when affirmative action was white.”
A new resource hosted by the Digital Scholarship Lab at the University of Richmond reveals the extent of the government’s role in fueling and enforcing midcentury housing discrimination. In the 1930s, the New Deal’s Home Owners’ Loan Corporation (HOLC) financed or refinanced nearly one in every five residential mortgages in the United States. The HOLC, aimed at relieving distressed borrowers and lenders alike, revolutionized home finance and home ownership by offering long-term, low-down-payment options at a time when the standard commercial loan expected 50 percent down at sale and a payoff in three to five years. This intervention (and infusion) steadied banking and construction sectors rocked by the Depression. And, by dramatically changing the terms of home finance, the HOLC put homeownership within reach of much of the working class. Between 1930 and 1960, the homeownership rate grew from 48 to 62 percent.
But the benefits, notoriously, were far from even. HOLC lending was guided by a hastily assembled library of “residential security” maps which rated the creditworthiness of virtually every urban neighborhood in the country. Lacking the capacity to survey cities themselves, the New Deal turned to local real-estate and finance interests whose first principle and guiding motive was the maintenance of racial segregation. The textual area descriptions which accompanied the HOLC maps echoed private real-estate appraisals of the era, which used the threat of racial “incursion” or “invasion” as the primary marker of property value and leaned on the strength (or weakness) of local restrictions such as race-restrictive deed covenants. In other words, the maps became the basis for three decades’ worth of relentless, federally endorsed redlining.
Following the lead of Kenneth Jackson’s Crabgrass Frontier (1985), a generation of scholarship has documented the motives and the impact of the HOLC’s redlining practices in particular settings. Now, a team of scholars working with the Digital Scholarship Lab at the University of Richmond has made the entire corpus of HOLC maps—and the accompanying area descriptions—accessible. The resulting database, Mapping Inequality: Redlining in New Deal America, is a remarkable accomplishment of publicly engaged scholarship. Geo-referenced on a national map, the 150-odd city maps offer a damning overview of the relentless scope of federal support for local segregation. Each city map in turn allows users to browse the area descriptions, which catalogued detrimental influences such as “colored population” or “absence of restrictions.” This was the fruit, as Robert Weaver (who would later serve as the first Secretary of Housing and Urban Development under Lyndon Johnson) noted bitterly in 1948, of turning a federal program “over to the real estate and home finance boys.”

Full HOLC map of Cleveland, 1939 (courtesy of Mapping Inequality)
In some respects, the maps presented here overstate the impact of the HOLC. Perhaps best understood as an inventory of existing practice in real estate and home finance, the HOLC did not invent redlining—indeed, as Jim Greer and others have suggested, it is unclear just how widely used or distributed the maps were. What’s more, as Todd Michney and Amy Hillier have shown for a number of cities, the HOLC “red zones” that so often accompanied African-American occupancy did not preclude federal mortgage underwriting in those neighborhoods—although they did worsen its terms. In this sense, the HOLC maps succored subprime lending as much as direct disinvestment.
But in other respects, this presentation of the HOLC maps probably understates their impact. Alongside the easy equation of race and credit risk in established neighborhoods, the HOLC (and its parent agency the Federal Housing Administration) flooded the suburbs with subsidies for developers who—building cul-de-sacs in the cornfields outside St. Louis or Chicago or Des Moines—could easily meet federal guidelines for home size, lot size, setback and the like. These developments, many of which were “protected” by race-restrictive deed covenants before the soil was turned, were closed to African Americans. In this sense, the Federal Housing Administration discouraged investment in older city neighborhoods not by prohibiting it, but simply by ensuring that it was easier and more lucrative elsewhere.
Even if the HOLC just inventoried and codified what realtors and banks and developers were already doing, it had a profound impact nevertheless. Our expectation of public (especially federal) programs is that they will bring with them a commitment to equal protection—an argument made consistently, if not always successfully, with the growing scope of federal contracts, subsidies, social policies, and labor standards after the 1930s. But in housing, the federal government was not just timid in confronting racial segregation; it actively encouraged, subsidized, and sanctified it. It was as if the federal government decided to monitor voting registration in Mississippi in 1963, and outsourced the job to the local Klan.
By laundering housing policy through the systematic and candid racism of local real-estate and finance interests, the HOLC also opened a gap between black and white ownership that has never closed. In the new world of home finance, white families bought homes at higher rates, they bought them earlier in life, they bought them on better terms, and they bought them in neighborhoods where housing value appreciated reliably. This yielded, of course, a widening of the racial wealth gap even as other disparities (wages, income) closed slowly in the civil rights era and after. Today, median African-American family wealth is less than one-tenth that of white families—a gap largely attributable to disparate access to housing subsidies such as the HOLC, and their impact across generations.
In turn, this segregation and stratification of opportunity have hardened the inequalities that—especially in the American context—flow from housing. Declining or stagnant home values sap the funding of local services, especially schools, that rely on local property taxes. Our public policies in the last half century made it hard for African Americans to escape central cities but easy or natural for jobs to do so—resulting in an enduring mismatch between residential and economic opportunities. And the “neighborhood effects” of segregation and concentrated poverty are pernicious and persistent across a range of social and individual outcomes—including public health, cognitive ability, social mobility, and civic engagement.
The HOLC maps collected at Mapping Inequality provide much more than an accessible archive of a sordid moment in the history of American public policy—although that alone would be worth the price of admission. They document the motives and scale of discriminatory practices that have always animated the American real-estate industry (“just one of those things” that everyone was doing, as Donald Trump lamely offered in defense of his own history of racial bias), and which persist to the present. And they shine a light into the long shadow of systematic housing segregation and discrimination whose consequences—in neighborhoods, and in the opportunities that one generation can offer to the next—are unabated.
The 1935 National Labor Relations Act transformed the landscape of American labor relations, establishing basic collective bargaining rights and outlawing an array of unfair labor practices. The law emboldened activists to organize across the core of the industrial economy, and the tight labor markets of the war years cemented those gains. When the Depression hit in 1929, only one in ten workers were covered by union contracts; by 1945, fully a third of the workforce—skilled and unskilled alike—enjoyed that security.
It wasn’t long before business interests pushed back, winning “antidiscrimination” (read: anti-union) amendments to state constitutions in Florida (1944), Arkansas (1944), Nebraska (1946), and Arizona (1946) and then scoring a broader victory with the passage of the Taft-Hartley Act in 1947. Taft-Hartley allowed states—through the passage of so-called “right to work” laws—to evade the union security clause (Section 8.A.3) of the NLRA. Under “right to work” (RTW), workers covered by union contracts could not be required to join the union and pay dues. In short order, most of the deep South—where starkly racialized labor markets and low-wage, “low-road” economic development were the norm—had embraced RTW. Over the following decade, several states in the midwest and mountain West followed suit (see map below); by 1960, nineteen states had passed RTW laws, and another four joined the list over the next half-century (while Indiana repealed its law in 1965).
In recent years, the right has rediscovered RTW, taking the battle to states that were once union strongholds in an effort to deal a final blow to the labor movement and to signal a “business-friendly” legislative and regulatory climate. Indiana and Michigan succumbed in 2012. Wisconsin in 2015. West Virginia in February of this year.
It is difficult to plumb the full impact of RTW. The timing and location of the laws’ passage make it almost impossible to untangle the driving forces behind rates of economic growth or job creation across states. The economic trajectories of Minnesota and Mississippi since 1954 (when the latter passed its RTW law) are profoundly different, but it is quite an explanatory leap to pin that difference—good or bad—on one detail of state labor law. And before-and-after analyses of states that have more recently passed RTW are inconclusive at best.
While there is no credible evidence that RTW laws boost investment or job growth, their impact on a state’s workers is a little clearer. RTW makes both new organizing and, as we have begun to see in Scott Walker’s Wisconsin, holding onto past gains more difficult. Even if union coverage (the reach of the contract) remains the same, dues-paying membership slips. And by undermining bargaining power, it dampens wages. The work of Elise Gould and Heidi Shierholz (updated by Gould and Will Kimball in 2015) controls for an array of individual (education and employment status, for example) and state variables (cost of living, unemployment rate), and finds that wages in RTW states are a little over 3 percent lower than in non-RTW states (a $1,500 deficit for a typical full-time worker). The compensation penalty—taking into account lower rates of employer-provided health and pension coverage—is even wider.
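To give a rough sense of how that kind of estimate works, here is a hedged sketch of a log-wage regression with an RTW indicator and a handful of individual and state controls. It runs on synthetic data with invented column names; it is not the Gould/Shierholz specification.

```python
# A hedged sketch (not the Gould/Shierholz specification) of a log-wage regression
# with a right-to-work indicator plus individual and state-level controls.
# All data and column names here are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "rtw_state": rng.integers(0, 2, n),              # 1 if the worker is in an RTW state
    "years_education": rng.integers(10, 21, n),      # individual control
    "cost_of_living": rng.normal(100, 10, n),        # state control (index)
    "state_unemployment": rng.normal(5.5, 1.5, n),   # state control (percent)
})
# Synthetic log wages with a built-in penalty of about 3 percent, for illustration only.
df["log_wage"] = (
    2.5 + 0.08 * df["years_education"] - 0.03 * df["rtw_state"]
    + 0.002 * df["cost_of_living"] - 0.01 * df["state_unemployment"]
    + rng.normal(0, 0.3, n)
)

model = smf.ols(
    "log_wage ~ rtw_state + years_education + cost_of_living + state_unemployment",
    data=df,
).fit()
# The coefficient on rtw_state approximates the RTW wage penalty (about -0.03 here,
# corresponding to the wage gap of a little over 3 percent described above).
print(model.params["rtw_state"])
```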
This cascade of disadvantage for workers in RTW states, from lower unionization rates to lesser bargaining power to lower wages, is illustrated in the two graphs below. In these visualizations of the data, the states are strung like pearls along each annual measure. The RTW states are red, the others blue. The “box-and-whisker” for each year traces the variability across the states: the centerpoint of each box is the median state on that measure; the top and bottom of the box mark off the seventy-fifth and twenty-fifth percentiles (the “interquartile range”); the top and bottom whiskers reach out to values that are no more than 1.5 times the interquartile range; outliers in the data fall beyond the whiskers.
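For the statistically curious, here is a minimal sketch of how those box-and-whisker elements are computed for a single year of state-level values. The numbers are synthetic placeholders, not the actual union-density or wage data.

```python
# A minimal sketch of the box-and-whisker statistics described above:
# median, interquartile range (the box), whiskers within 1.5x the IQR, and outliers.
# The state-level values here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(1)
state_values = rng.normal(loc=12.0, scale=4.0, size=50)  # e.g., union density (%) by state

q1, median, q3 = np.percentile(state_values, [25, 50, 75])
iqr = q3 - q1
# Whiskers reach to the most extreme observed values within 1.5 x IQR of the box.
lower_whisker = state_values[state_values >= q1 - 1.5 * iqr].min()
upper_whisker = state_values[state_values <= q3 + 1.5 * iqr].max()
# Anything beyond the whiskers is plotted as an outlier.
outliers = state_values[(state_values < lower_whisker) | (state_values > upper_whisker)]

print(f"median={median:.1f}, box=({q1:.1f}, {q3:.1f}), "
      f"whiskers=({lower_whisker:.1f}, {upper_whisker:.1f}), outliers={len(outliers)}")
```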
In the graph of union density, unsurprisingly, RTW states cluster below the median. In many settings, the passage of RTW in the 1940s or ‘50s locked in low unionization rates in states (especially in the South) that had evaded the first wave of post-NLRA organizing. In other settings, RTW set the conditions for economic development in states (especially in the southwest) where postwar economic growth was most intense. As telling as the general pattern are some of the exceptions, including pockets of organizational solidarity (hotel workers in Nevada, manufacturing workers in Iowa) despite a RTW climate.
On wages, we see much the same pattern: a wide variation across states, the RTW states dripping off the bottom of the scale. The distinction between RTW states and the rest is perhaps most pronounced for men and women at the median, sixtieth and seventieth wage percentiles (a wage range, in 2015 dollars, that runs from about $15/hour to $30/hour). I make no claim here that RTW alone is dampening wages (this is a crude measure that does not control for other economic and policy differences across states). But it is clear, I think, that RTW is a potent marker of the broader climate that workers face in a state. Anti-labor legislation, in most of these settings, is accompanied by regressive and austere fiscal regimes, by woeful underinvestment in education, and by social policies that combine meager cash assistance with generous subsidies or incentives for participation in the low-wage labor market. This, in the end, is not about “rights” at all; it’s about power—the diminished power of unions to represent their members and bargain for living wages, and the naked power of business interests to turn states into laboratories of austerity and neoliberalism.
The American system of retirement famously rests on three legs: personal savings and assets, employment-based pensions, and Social Security. Like any tripod, the system is efficiently stable when all three legs are strong, but also vulnerable to weakness in any one of them. Over the last generation, a combination of wage stagnation, declining job quality, and recessionary damage has chiseled away at family resources and job-based benefits. The malevolent misdirection of market fundamentalism, of course, is to sharpen its saws for Social Security—the only leg of this stool still bearing any weight.
Family savings—including not just retirement savings but also other asset cushions like home ownership—are increasingly shaky. The personal savings rate, which hovered at or above 10 percent from the 1960s to the mid-1980s, is now a meager 4.8 percent. According to a recent survey by the National Institute on Retirement Security, 45 percent of working-age Americans have no retirement account assets, 62 percent of near-retirement households (ages 55-64) have retirement savings that amount to less than a year of annual income, and the median retirement savings for those households is a paltry $14,500. These numbers, not surprisingly, are also sharply skewed by race, income, and educational attainment.
So what about job-based pensions? Our retirement system (like our health care system) is premised on the notion that public policy need only mop up around the edges, or supplement, employment-based plans. But private pension coverage grew modestly through the middle years of the last century and plateaued at only about half of the workforce. Coverage at work was a sort of lottery, largely dependent on job characteristics like industry, firm size, and union coverage. And such coverage widened labor-market inequalities, as pension plans followed high wages and job tenure.
But even for those workers claiming coverage, retirement security has been whittled away. The private pension of the last generation was usually a defined-benefit plan, financed largely (often wholly) by the employer, and promising a lifetime annuity based on years of service and earnings. Most coverage today takes the form of a defined-contribution plan, such as a 401(k), to which the worker is often the only contributor, and whose retirement benefits depend (often disastrously) on the performance of private equity or company stock. The graphic below captures the slow displacement of defined-benefit pensions, by number of plans, number of participants, contributions, assets, and benefits.
A retirement system should both facilitate savings, managing the risk of retirement across a lifespan, and even out some of the inequality generated by the market, managing the risk of retirement across a diverse population. Our public-private hybrid accomplishes neither. Personal retirement savings and job-based coverage masquerade as private achievements, but depend heavily on the tax advantages that accrue to both; indeed the exclusion of pension contributions and earnings represents a tax expenditure of over 1 percent of GDP, in the range of $1.7 trillion. By the same token, Social Security benefits are shaped largely by private employment and earnings history; the benefit structure softens market inequities but also sustains them. Our public-private social insurance system is an artifact of patterns of employment and labor force participation that are no longer with us. What was once a source of security (at least for many) is now just another eddy of inequality, eroding retirement savings at one bank of the income stream and depositing them at the other.
Although the retirement security crisis is dire, the solutions are well within reach. One tack would be to further encourage, subsidize, or mandate job-based pension savings. This could be accomplished by making it easier for small businesses (very few of which offer pensions) to establish plans; or by mandating modest retirement plans (jointly financed, conservatively invested) for all workers—with a refundable tax credit to make the plans accessible to part-time and low-wage workers. Another tack would be to decouple retirement security from work. The Obama administration’s recently launched myRA program (which features no minimum deposit, no fees, and a modest return backed by Treasury bonds) takes a step in this direction—although low-income workers are unlikely to claim the tax advantages, and there is no mechanism for employer contributions. And, of course, any of this should be accompanied by recommitting to Social Security, ensuring its future by raising (or removing) the cap on taxed earnings, and nudging the payroll tax rate up 2 or 3 percent. But try telling that to Paul Ryan.
The first study from the newly established JPMorgan Chase Institute, released yesterday, offers an intriguing “big data” glimpse at economic insecurity in today’s U.S. economy. “Weathering Volatility: Big Data on the Financial Ups and Downs of U.S. Individuals” uses a stratified sample from the account and transaction data of almost 30 million JPMorgan Chase customers. Alongside credit information (monthly payments, outstanding balances, delinquencies) for the same individuals, the JPMorgan data measures variation in income and consumption, over a short and recent time span, with rare precision—and a notable tone of angst.
At a sample size of 100,000, the JPMorgan data set boasts nearly twice the sample size of the Current Population Survey (60,000 household respondents) and far exceeds the reach of the Bureau of Labor Statistics’ Consumer Expenditure Survey (7,000 households) or the various panel surveys (which have between 5,000 and 15,000 respondents). The resulting sample, broken into income quintiles (equal fifths of the underlying population), follows the structure of the Census numbers: the upper limit for its first quintile is $23,000, just over the Census/CPS threshold of $21,433 for 2013, and so on up to $104,500, the upper limit for the fourth (compared to the Census/CPS’s $106,101). But most importantly, the JPMorgan data consists of actual transactions—a much more precise and reliable measure of earning and spending than we can expect from survey respondents.
The findings, unsurprisingly, point to large swings in both incomes and consumption, both from year to year and from month to month. Seventy percent of those sampled see a swing in income of more than 5 percent from one year to the next; 84 percent see that volatility from month to month. Some of this is an artifact of the data collection (those paid weekly, for example, see their incomes bounce around just because some months have five Friday paydays and some don’t), but much of it signals the economic damage and insecurity that comes with stagnant wages, a tattered safety net, persistently high unemployment and underemployment, and raggedly unreliable work schedules.
Of interest to JPMorgan, of course, are the behavioral responses to income volatility, and the financial products (“innovative insurance or credit products”) that might be offered in response. The report devotes much of its energy to parsing the difference between “responders” (those whose changes in consumption tend to track changes in their income), “sticky optimists” (mostly higher earners), for whom increases in consumption outpace increases in income by more than 10 percent, and “sticky pessimists,” for whom consumption lags behind increases in income by more than 10 percent.
Of more lasting interest is the way in which this report and dataset confirm and extend our understanding of the insecurity gap pried open by the simultaneous retreat of private labor standards and public policies—what Jared Bernstein has described as the “YOYO” (you’re on your own) economy (not altogether unrelated to that more common twenty-first-century motto, YOLO) and Jacob Hacker has dubbed the “great risk shift.” In the JPMorgan data, this emerges in the volatility of both income and consumption, which—especially for modest earners—capture the heavy reliance on EITC-padded tax refunds (big jumps in income in March and April) and the exposure to unexpected expenses.
The report finds not only that incomes and consumption are both volatile, but that they are volatile independently from one another: households face the risk of lost income, unanticipated expenses, or both. And that risk—for all but the highest earners—is accompanied by insufficient savings or assets to weather such a shock. By JPMorgan estimates, earners in the lowest quintile would need about $1,600 in liquid assets to withstand income volatility—but actual account balances average less than half that ($600). This gap is simply magnified for higher earners: in the second quintile, $2,800 is needed but only $1,400 is available; in the third, $4,800 is needed but only $3,000 is available; in the fourth, $8,200 is needed but only $6,200 is available.
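Put in simple terms, and using only the needed-versus-available figures reported above, the shortfall by quintile works out as follows.

```python
# The liquid-asset buffer gap described above, using the needed vs. available
# figures reported in the text (JPMorgan Chase Institute estimates, by income quintile).
buffers = {
    "lowest quintile": {"needed": 1_600, "available": 600},
    "second quintile": {"needed": 2_800, "available": 1_400},
    "third quintile":  {"needed": 4_800, "available": 3_000},
    "fourth quintile": {"needed": 8_200, "available": 6_200},
}

for quintile, b in buffers.items():
    gap = b["needed"] - b["available"]
    share = b["available"] / b["needed"]
    print(f"{quintile}: ${gap:,} shortfall; {share:.0%} of the needed buffer on hand")
```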
These findings are important—especially considering the richness of the underlying data and their source. The data does have its limits; in the scholarship on this problem, some of the most interesting (and open) questions revolve around the historical timing and the distribution of income volatility. Covering only 2012-14, and with no greater demographic detail than gender and age, the JPMorgan data offers few insights here. But as a contemporary snapshot, drawn on a large and precise sample of income and spending, it is hard to beat.
Social or labor market policies are measured by their reach, their adequacy, and their costs. By these metrics, a minimum wage increase is a slam dunk. A generation of research now demonstrates pretty decisively that markets can accommodate a reasonably higher minimum at no significant threat to job creation—especially when ancillary gains (productivity gains, less turnover, increase in aggregate demand) are taken into account. Raising the minimum wage makes almost no demands on the public purse, and could in fact recoup much of the current public subsidy (through working families’ reliance on means-tested tax credits, cash assistance, health care, and food security programs) of low-wage employment. Even a small increase promises to make a big difference: in 2013, Arin Dube estimated that an increase to $10.10 would raise the incomes of poor families (those at the 10th percentile) by 12 percent and lift five to seven million out of poverty. An increase to $12 would likely have even larger poverty-fighting effects.
While much of our social and tax policy is either poorly targeted (it reaches the poor unevenly) or aimed in the wrong direction (it benefits those who don’t need it), a minimum wage increase hits the bull’s-eye. As EPI’s new estimates of the impact of the “Raise the Wage Act” (bringing the minimum to $12.00/hour by 2020) underscore, the benefits of an increase would flow overwhelmingly to those—young workers, single parents, workers of color—who need it the most. The interactive graphic below summarizes this important new work: the first menu sorts workers by race, the second by income, age, family status, labor force participation, and educational attainment.
There is an evocative episode in Wayne Johnston’s The Colony of Unrequited Dreams in which a young Joey Smallwood (the novel’s real-life protagonist, who would go on to be the first premier of Newfoundland) sets out to organize railroad sectionmen in the colony’s rugged interior. The sectionmen live in shacks at one-mile intervals along the 700 miles of rail lines between the Atlantic coast and the island’s southwestern tip. Recruiting them proves daunting for Smallwood, who sets a goal of twenty miles and twenty signatures a day. “I fancied I was walking the lone street in a company town called Sectionville,” he observes. The sectionmen and their families communicate little with one another, and the railroad—their employer—is their only source of contact with the wider world. Little wonder that at each milepost, Johnston finds families “driven to eccentricity by isolation.”
The material, social, and democratic poverty of Sectionville illustrates the importance of cities and labor unions both to each other and to the goal of shared prosperity. Cities offer the natural solidarities of work and neighborhood that make sustained organizing possible. Union density (built on residential density) discourages competition on wages and encourages competition on efficiency and quality. This benefits both workers and their employers, for whom the benefits of a well-trained workforce, easy access to suppliers and consumers, and decent public goods far outweigh the costs. Cities drive the economy: the top 100 U.S. metropolitan areas, on merely 12 percent of our land area, account for at least three-quarters of GDP. They are home to the best jobs and opportunities. They claim virtually all of our population growth. They house our best schools and our leading cultural institutions. And they are, by any measure, greener than sparser forms of economic or residential development.
In turn, cities—by virtue of their density and diversity—sustain progressive politics. Perhaps the starkest determinant of the presidential vote in 2012 was population density: across red and blue states, 98 percent of the most densely populated counties went to Obama, while 98 percent of the least densely populated counties went to Romney. The takeaway would seem to be that people who live close to one another are more tolerant and empathetic; they are more likely to know someone of a different color, a different income group, or a different sexual orientation. They rely on and appreciate the provision of public goods and public services (transit, parks, garbage collection)—even as they consume fewer public dollars than their red-state counterparts. And they have a deeper appreciation of the regulatory standards (guns, labor conditions, food, public health) that promise us a modicum of safety and security.
In an urban (and still urbanizing) nation, all of this would seem to be good news. So why, when it comes to the hard work of building a just and sustainable future, does it feel like we still live in Sectionville?
The answer lies in the parallel decline of American cities and the American labor movement in the second half of the twentieth century. The pre–New Deal labor movement was built largely around strong urban unions—in skilled trades, in unskilled services, and in local manufacturing—whose bargaining power was rooted as much in the urban form as it was in the workplace. In many settings, such unions—of printers, carpenters, janitors, waitresses, teamsters—shaped local politics, sustained local solidarity (through secondary boycotts), and pooled resources in the provision of basic services such as health centers.
In the 1930s, the emergent Congress of Industrial Organizations took a different tack, building silos of solidarity around particular industries and pattern bargaining across their constituent firms. The CIO was often indifferent—and sometimes hostile—to the older local unions, many of which, especially in construction, clung to the rival American Federation of Labor. Labor maintained its metropolitan presence in settings where strong local and industrial organization overlapped (autoworkers in Detroit, steelworkers in Pittsburgh, packinghouse workers in Chicago, dockworkers in San Francisco), but local solidarity—in all these settings—depended on the continued health and growth of both the big industrial unions and the central cities in which they were rooted.
Instead of growth, however, the years after the Second World War saw both cities and unions undergo dramatic decline. In the 1950s, about a third of American workers belonged to unions. By 1990, this number had fallen by half, to about 16 percent, and it now sits at just over 11 percent. And these numbers are cushioned by the growth (pre-Reagan) and relative stability (post-Reagan) of public-sector unions over that span: in the private sector, union density was under 25 percent by the early 1970s, half that (under 13 percent) by the late 1980s, and half that again (6.7 percent) by 2013.
American cities declined at the same rate, and for some of the same reasons. Many metropolitan regions grew more slowly than the nation as a whole, and most central cities shrank even if the metropolitan areas of which they were a part continued to grow. Of our fifteen largest central cities, eleven saw their peak population in 1950—the only exceptions being New York, Chicago, Los Angeles, and Houston. And the others didn’t just stop growing: nine of those cities lost more than a quarter of their population over the next fifty years, and five (St. Louis, Detroit, Pittsburgh, Buffalo, and Cleveland) lost more than half.
The demographic shift was in part regional (Rust Belt to Sun Belt), but it was primarily local—that is, urban to suburban. Residentially and commercially, cities simply got thinner. Consider St. Louis. In 1930, the metropolitan area encompassed four counties (two in Missouri, two on the Illinois side) and a population of about 1.3 million—of which over 60 percent lived in the City of St. Louis. In 1970, the metropolitan area included three more counties on the Missouri side, and growth was confined to the suburbs: the City claimed barely a quarter of the metropolitan population (620,000 of 2.36 million). By 2010, the metro area sprawled across seventeen counties and the City claimed just 10 percent of the metro population (now nudging 3 million). Population density, about 1,000 persons per square mile in 1960, was less than a third of that (322 persons per square mile) by 2010.
The overlapping trajectories of urban and union decline are underscored by the graph below, which plots the decline in union density against the decline in the central city populations of the six largest Rust Belt cities (St. Louis, Detroit, Pittsburgh, Buffalo, Cincinnati, and Cleveland).
The sources of union and urban decline are complex but familiar. Even at its peak, the American labor movement was regionally, sectorally, and jurisdictionally fragmented. Even where it was strongest, it depended upon fragile industry-specific deals with leading employers. As a result, its political presence never matched its numbers. Hemmed in by broader political constraints and its own organizing and political strategies, the mid-century labor movement is commonly characterized as timidly contractual at the bargaining table and barrenly married to the Democrats at the ballot box.
At the same time, the stakes at the bargaining table, which in the American setting determined not only wages but social benefits like health care and pensions, were unusually high. This meant that the political backlash—which began in many states with the passage of the Taft-Hartley Act in 1947 and spread across the economy with the business offensive of the 1970s—was also unusually severe. Labor organization peaked earlier and at a lower rate in the United States than among most of its peers. And the political, legal, and administrative attacks on labor rights since the 1970s led to much steeper losses in the US than in any of its peers.
In American cities, decline reflected some of the same pressures facing labor—notably, deindustrialization and globalization. But, in most Rust Belt settings, urban decline began decades before any hint of trouble in the larger economy. Racial transition in American cities in the 1930s and 1940s yielded a nasty pattern of segregation and discrimination enforced and sustained by restrictive deed instruments, private realty, federal housing and mortgage policies, local zoning, and an enthusiasm for urban renewal that equated black occupancy with blight. Fragmented local government, flush with federal redevelopment and highway funds, became a sort of centrifuge that flung people, employment, and tax capacity to the suburban fringes.
Again, consider St. Louis. The city saw major plant closings and disinvestment in its core (see map below) and waves of new retail, commercial, and industrial investment on the suburban fringe. Employment simply moved from St. Louis City to St. Louis County, and then further out. Commuting times—especially for suburb-to-suburb commutes—rose steadily (in St. Louis as in most other metro areas), and all of the natural solidarities of local employment began to dissipate.
Major Plant Closings (1970-2000) and Vacant Land (2003) in St. Louis
These losses are commonly viewed as a consequence of deindustrialization and globalization, as a story of factories moving to Ciudad Juárez or China and of working Americans—now cut loose from the smokestacks of the central city—moving to the suburbs. But this misreads the timing, and the spatial pattern, of economic and demographic flight from the midcentury city. As Jefferson Cowie has shown in his masterful account of RCA’s departure from Camden, New Jersey, industry’s first strategy was to leave the city for cheaper and less union-friendly settings—including the suburban fringe, struggling rural outposts, or the right-to-work South. For RCA, this meant targeting female workers (the wives and daughters of workers in Indiana’s declining stone industry) in Bloomington and then African Americans in Memphis before crossing the Rio Grande.
Two Cities, Two Industries
We can see this pattern—of the labor force and its urban base thinning out together—across a variety of settings and industries. The motives, and the consequences, of this migration were explicitly anti-union; they were a means of cutting costs by destroying the bargaining power of workers. This was accomplished both by moving production from urban centers of union strength to suburban or rural hinterlands and by seeking out the cheapest and most malleable labor force. In some sectors, such as the boot and shoe industry, job quality and union density were whittled away in the United States long before production moved overseas. Others, such as meatpacking, were largely immune from globalization, but still saw production scattered—from an older urban base to the cornfields of Kansas and Iowa, from Metroland to Sectionville. Let’s compare the two a little more closely.
The big packinghouses left Chicago and Omaha and Kansas City for low-wage outposts of Ottumwa and Columbus Junction and Storm Lake, a move motivated by the desire to slash labor costs and facilitated by the industry’s move from rail to refrigerated trucking as its primary means of transport. Small and struggling Midwestern towns competed fiercely for this new investment, offering expansive tax and infrastructure (especially water and sewage) incentives. This migration pointedly undermined the political and community alliances that had sustained the United Packinghouse Workers and eroded the ability of the union to organize across plants or secure master agreements. The map below tells the story.
In 1947, employment in meatpacking across the Midwest was concentrated in the region’s metro areas: Chicago, Peoria, St. Louis (including the counties on the Illinois side), Milwaukee, Madison, Dubuque, Cedar Rapids, Waterloo, Omaha, and Kansas City. The only non-metropolitan outposts were an Armour plant in Mason City (Cerro Gordo County), Iowa, and a Hormel plant in Fort Dodge (Webster County), Iowa. Over the following decades, the industry thinned out dramatically. Employment in the metro settings fell, while production moved to new or expanded plants in places like Ottumwa (John Morrell), Denison (Farmland), and Storm Lake (Iowa Packers), Iowa; Austin (Hormel), Albert Lea (Wilson), and Worthington (Swift), Minnesota; Schuyler (Cargill) and Fremont (Hormel), Nebraska; and Arkansas City (Rodeo/Morrell), Emporia (Tyson), and Liberal (National Beef), Kansas.
Missouri’s boot and shoe industry began to abandon the commercial core of St. Louis for the Ozarks early in the twentieth century. The state’s largest leather-goods firm, the Brown Shoe Company, built a plant in Moberly in rural Randolph County in 1905. Others followed Brown’s lead. Small towns in Missouri and Illinois competed for this investment, offering generous incentives (construction assistance, tax abatements) and a “company town” hostility to union organizing. Employers made no secret of their motives. When the arrival of the Wagner Act and CIO in the 1930s spawned a renewed union drive, employers explicitly responded with threats including plant closings and migration. In 1941, Brown Shoe opened a new plant in Dyer, Tennessee.
The map above tells the story. In 1947, Missouri’s leather-goods industry employed about 19,000, over 90 percent of whom worked (and lived) in the city of St. Louis. A decade later, employment in leather goods had almost doubled statewide (to 37,000) but fallen by half in the city—which now claimed less than a quarter of those jobs. After that, employment began to drop off—to 29,000 in 1967 (18 percent in St. Louis), 14,600 in 1987 (4.5 percent in St. Louis), and only 1,645 in 2007 (just over 100 of which, 6 percent, were in St. Louis).
Although the economic fortunes of the two industries were quite dissimilar, their labor-relations strategies had much in common. Both sought escape from organized (and more easily organizable) urban labor markets. Both initially targeted rural workers and then moved on to foreign workers—the packers by recruiting immigrants, the shoemakers by moving production overseas entirely.
In each case, the spatial dispersion of production brought with it a collapse in union density and a collapse in real wages. At the end of the 1970s, just under half of meatpacking workers belonged to a union. This fell to a third by the early 1980s, and to less than one in five by the end of the decade (see graph below). In meatpacking, wages fell alongside those of other production workers, but more dramatically. Between 1947 and 1979, the average real hourly wage of all production workers nearly doubled; after 1979 it flatlined, rising 1.5 percent over 33 years. Between 1947 and 1979, the average real hourly wage of meatpacking workers rose 80 percent; after 1979 it fell nearly 30 percent. In 1970, packinghouse wages were about 20 percent higher than the average manufacturing wage; by 2002 they were 20 percent lower. Wage and union losses were closely related to plant size and location. At the end of the 1970s, large urban packing plants (those with 1,000 or more employees) boasted wages 23 percent higher than the industry average, and 30 to 45 percent higher than the wages at small plants (those with fewer than 500 employees). As outmigration of production continued, and larger rural facilities became the industry norm, the wage premium at larger plants gradually dissipated.
In the boot and shoe industry, production wages rose steadily (in real dollars)—from under $7 an hour in the late 1930s to almost $13 an hour in the early 1970s—but then ground to a halt, falling back below $12 by the early 1990s. Sectoral wages have risen since then, but this is largely an artifact of collapsing employment: as most production moved overseas (national employment in the broader leather goods sector fell from 135,000 to under 30,000 between 1990 and 2012), the few jobs that remained were in non-production jobs or in “boutique” artisanal lines.
We can see a glimpse of these patterns across our more recent history. Since 1983 (when good industry-level data on union membership becomes available), density in both meatpacking and footwear has fallen steadily. While the overall employment trajectories are quite distinct—meatpacking consistently employs about half-a-million across these thirty years; footwear employment almost disappears—the net effect is similar. The jobs leaving the urban core are mostly union jobs. The new jobs—whether they are in Ottumwa or Malaysia—are not.
Union leaders—at least outside the building trades—now have a deep appreciation of the impact of sprawl on public- and private-sector unionism. Big-box suburban commercial development displaces union jobs, especially in grocery retail and warehousing. In sectors such as hospitality or building services, union density declines almost in direct proportion to the distance from the urban core. And sprawl undermines public-sector unions either by reducing demand for their services (as with transit) or by putting unrelenting pressure on public budgets—and thereby feeding the backlash against teachers and other public servants.
What union leaders haven’t fully recognized is how far back this pattern goes—and what it will take to challenge it, or to organize despite it. Meatpacking and shoes are outlying examples, but this sketch of the relationship between urban decline and union decline could be replicated for almost any city and its major sector of employment. We need to devote closer attention to the geography of organizing and sustaining labor power, and to the ways in which urban decline and sprawl erode the natural solidarities of city life.
The state-level research series of the Occupational Employment Statistics provides some revealing comparisons, across states, of occupational wages. The map shows the regional patterns: unsurprisingly, wages are generally lower in southern states and in less urbanized settings. The “box-and-whisker” graph at right shows the distribution, and the range, around the median state for each occupation.
This month’s jobs report was widely celebrated for showing that—after adding 217,000 jobs in May 2014—the United States had finally returned to the December 2007 (pre-recession) level of employment. This is a useful comparative benchmark, underscoring the unusual depth and duration of this downturn: measured against other postwar business cycles (see graphic below), the 2007–9 downturn cost us a bigger share of jobs, and the subsequent recovery took much longer to gain them back.*
Meanwhile, the December 2007 jobs threshold has been rendered meaningless by over six years of economic and demographic change. Since the labor force has grown substantially over the last six-and-a-half years, the current rate of unemployment is still far above the pre-recession benchmark. The first goal, then, should be to return to the December 2007 rate of unemployment—and by this measure [see graphic below], we’re still 2 million jobs in the hole.
This measure of the jobs deficit is incomplete, however, because the unemployment rate does not include those who have given up looking for work. If we set our goal at a return to pre-recession unemployment and labor-force participation rates, the jobs deficit swells to over 9.5 million.
But this still sets a low bar. Rates of employment and labor-force participation in December 2007 were not nearly as strong as they had been a decade earlier. If we aim for those “full employment” targets (the unemployment and labor-force participation rates of the late 1990s), we are running a deficit of a couple of million jobs before the recession even starts. From there, the lost jobs and missing workers pile up quickly, reaching—and plateauing at—a jobs deficit of over 11 million. Some recovery.
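The jobs-deficit arithmetic behind these benchmarks can be sketched simply: figure out how many jobs it would take to restore a benchmark unemployment rate (and, optionally, a benchmark labor-force participation rate) given today's working-age population, and subtract actual employment. The figures below are illustrative placeholders, not actual BLS series.

```python
# A hedged sketch of the jobs-deficit arithmetic described above.
# All inputs are illustrative placeholders (in millions of people), not BLS data.

def jobs_deficit(working_age_pop: float,
                 benchmark_participation: float,
                 benchmark_unemployment: float,
                 actual_employment: float) -> float:
    """Jobs needed (same units as the inputs) to restore the benchmark rates."""
    benchmark_labor_force = working_age_pop * benchmark_participation
    target_employment = benchmark_labor_force * (1 - benchmark_unemployment)
    return target_employment - actual_employment

# Illustrative only: a benchmark participation rate and unemployment rate
# (e.g., pre-recession levels) applied to a larger present-day population.
print(jobs_deficit(working_age_pop=248.0,
                   benchmark_participation=0.66,
                   benchmark_unemployment=0.05,
                   actual_employment=146.0))
```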
Needless to say, the book has incurred its share of criticism—some substantive, some silly. But of all the theoretical and methodological issues with Piketty’s sweeping account that reviewers have raised, I think the singular lingering weakness is this: the political and institutional sources of American inequality (and of American exceptionalism) are given short shrift. Piketty devotes surprisingly little attention to the policy shifts that unshackled incomes at the top and destroyed bargaining power at the bottom. “It’s like saying slavery is an inequality of assets between slaves and slaveholders,” as Suresh Naidu put it in Jacobin, “without describing the plantation.”
The graphic below takes a longer view, tracing real (inflation-adjusted) wages in key productive sectors since the 1930s and 1940s. The sectors covered here, like meatpacking or automobiles, are those for which decent wage data can be assembled for this full sweep of over seventy years. Importantly, they are also those sectors that we have historically relied upon for living-wage employment. Into the 1970s, as the uniform growth in real wages suggests, jobs in these industries were a ticket to the middle class. Since the late 1970s, however, wages in these industries have flattened at best—and in some cases fallen off substantially.