The first study from the newly established JPMorgan Chase Institute, released yesterday, offers an intriguing “big data” glimpse at economic insecurity in today’s U.S. economy. “Weathering Volatility: Big Data on the Financial Ups and Downs of U.S. Individuals” uses a stratified sample from the account and transaction data of almost 30 million JPMorgan Chase customers. Alongside credit information (monthly payments, outstanding balances, delinquencies) for the same individuals, the JPMorgan data measures variation in income and consumption, over a short and recent time span, with rare precision—and a notable tone of angst.
At a sample size of 100,000, the JPMorgan data set boasts nearly twice the sample size of the Current Population Survey (60,000 household respondents) and far exceeds the reach of the Bureau of Labor Statistics’ Consumer Expenditure Survey (7,000 households) or the various panel surveys (which have between 5,000 and 15,000 respondents). The resulting sample, broken into income quintiles (equal fifths of the underlying population), follows the structure of the Census numbers: the upper limit for its first quintile is $23,000, just over the Census/CPS threshold of $21,433 for 2013, and so on up to $104,500, the upper limit for the fourth (compared to the Census/CPS’s $106,101). But most importantly, the JPMorgan data consists of actual transactions—a much more precise and reliable measure of earning and spending than we can expect from survey respondents.
The findings, unsurprisingly, point to large swings in both incomes and consumption, both from year to year and from month to month. Seventy percent of those sampled see a swing in income of more than 5 percent from one year to the next; 84 percent see that volatility from month to month. Some of this is an artifact of the data collection (those paid weekly, for example, see their incomes bounce around just because some months have five Friday paydays and some don’t), but much of it signals the economic damage and insecurity that comes with stagnant wages, a tattered safety net, persistently high unemployment and underemployment, and raggedly unreliable work schedules.
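That payday artifact is mechanical and easy to see with a quick calculation. The sketch below uses an illustrative wage (not the JPMorgan data) to show how a worker paid a fixed amount every Friday sees monthly income jump in months that happen to contain a fifth Friday:

```python
from calendar import Calendar

def fridays_in_month(year, month):
    """Count the Fridays in a month (weekday() == 4 is Friday)."""
    cal = Calendar()
    return sum(1 for d in cal.itermonthdates(year, month)
               if d.month == month and d.weekday() == 4)

# Illustrative: a worker paid $500 every Friday, with no change in pay.
weekly_pay = 500
monthly_income = {m: weekly_pay * fridays_in_month(2014, m)
                  for m in range(1, 13)}

# Monthly "income" swings between $2,000 and $2,500 purely because
# four months of 2014 contain a fifth Friday payday.
five_friday_months = sorted(m for m, v in monthly_income.items() if v == 2500)
```

That mechanical swing alone is a 25 percent month-to-month jump—well past the report’s 5 percent volatility threshold—before any real change in earnings enters the picture.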
Of interest to JPMorgan, of course, are the behavioral responses to income volatility, and the financial products (“innovative insurance or credit products”) that might be offered in response. The report devotes much of its energy to parsing the difference between “responders” (those whose changes in consumption tend to track changes in their income), “sticky optimists” (mostly higher earners), whose consumption growth outpaces their income growth by more than 10 percent, and “sticky pessimists,” whose consumption growth lags their income growth by more than 10 percent.
Of more lasting interest is the way in which this report and dataset confirm and extend our understanding of the insecurity gap pried open by the simultaneous retreat of private labor standards and public policies—what Jared Bernstein has described as the “YOYO” (you’re on your own) economy (not altogether unrelated to that more common twenty-first-century motto, YOLO) and Jacob Hacker has dubbed the “great risk shift.” In the JPMorgan data, this emerges in the volatility of both income and consumption, which—especially for modest earners—captures the heavy reliance on EITC-padded tax refunds (big jumps in income in March and April) and the exposure to unexpected expenses.
The report finds not only that incomes and consumption are both volatile, but that they are volatile independently from one another: households face the risk of lost income, unanticipated expenses, or both. And that risk—for all but the highest earners—is accompanied by insufficient savings or assets to weather such a shock. By JPMorgan estimates, earners in the lowest quintile would need about $1,600 in liquid assets to withstand income volatility—but actual account balances average less than half that ($600). This gap is simply magnified for higher earners: in the second quintile, $2,800 is needed but only $1,400 is available; in the third, $4,800 is needed but only $3,000 is available; in the fourth, $8,200 is needed but only $6,200 is available.
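The quintile figures quoted above can be restated as a simple shortfall calculation. The numbers below are just the report’s figures from the preceding paragraph, re-keyed by quintile:

```python
# Liquid assets needed to weather a typical income/consumption shock,
# versus typical balances on hand, by income quintile (dollar figures
# as quoted above from the JPMorgan report).
needed    = {1: 1600, 2: 2800, 3: 4800, 4: 8200}
available = {1:  600, 2: 1400, 3: 3000, 4: 6200}

# Absolute dollar gap, and the share of the needed buffer on hand.
shortfall = {q: needed[q] - available[q] for q in needed}
coverage  = {q: available[q] / needed[q] for q in needed}
```

The absolute gap widens up the distribution—from $1,000 in the first quintile to $2,000 in the fourth—even as the share of the needed buffer actually on hand improves from under 40 percent to about 75 percent.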
These findings are important—especially considering the richness of the underlying data and their source. The data does have its limits; in the scholarship on this problem, some of the most interesting (and open) questions revolve around the historical timing and the distribution of income volatility. Covering only 2012-14, and with no greater demographic detail than gender and age, the JPMorgan data offers few insights here. But as a contemporary snapshot, drawn on a large and precise sample of income and spending, it is hard to beat.
Social or labor market policies are measured by their reach, their adequacy, and their costs. By these metrics, a minimum wage increase is a slam dunk. A generation of research now demonstrates pretty decisively that markets can accommodate a reasonably higher minimum at no significant threat to job creation—especially when ancillary gains (higher productivity, lower turnover, increased aggregate demand) are taken into account. Raising the minimum wage makes almost no demands on the public purse, and could in fact recoup much of the current public subsidy (through working families’ reliance on means-tested tax credits, cash assistance, health care, and food security programs) of low-wage employment. Even a small increase promises to make a big difference: in 2013, Arin Dube estimated that an increase to $10.10 would raise the incomes of poor families (those at the 10th percentile) by 12 percent and lift five to seven million out of poverty. An increase to $12 would likely have even larger poverty-fighting effects.
While much of our social and tax policy is either poorly targeted (it reaches the poor unevenly) or aimed in the wrong direction (it benefits those who don’t need it), a minimum wage increase hits the bull’s-eye. As EPI’s new estimates of the impact of the “Raise the Wage Act” (bringing the minimum to $12.00/hour by 2020) underscore, the benefits of an increase would flow overwhelmingly to those—young workers, single parents, workers of color—who need it the most. The interactive graphic below summarizes this important new work: the first menu sorts workers by race, the second by income, age, family status, labor force participation, and educational attainment.
There is an evocative episode in Wayne Johnston’s The Colony of Unrequited Dreams in which a young Joey Smallwood (the novel’s real-life protagonist, who would go on to be the first premier of Newfoundland) sets out to organize railroad sectionmen in the colony’s rugged interior. The sectionmen live in shacks at one-mile intervals along the 700 miles of rail lines between the Atlantic coast and the island’s southwestern tip. Recruiting them proves daunting for Smallwood, who sets a goal of twenty miles and twenty signatures a day. “I fancied I was walking the lone street in a company town called Sectionville,” he observes. The sectionmen and their families communicate little with one another, and the railroad—their employer—is their only source of contact with the wider world. Little wonder that at each milepost, Johnston finds families “driven to eccentricity by isolation.”
The material, social, and democratic poverty of Sectionville illustrates the importance of cities and labor unions both to each other and to the goal of shared prosperity. Cities offer the natural solidarities of work and neighborhood that make sustained organizing possible. Union density (built on residential density) discourages competition on wages and encourages competition on efficiency and quality. This benefits both workers and their employers, for whom the benefits of a well-trained workforce, easy access to suppliers and consumers, and decent public goods far outweigh the costs. Cities drive the economy: the top 100 U.S. metropolitan areas, on merely 12 percent of our land area, account for at least three-quarters of GDP. They are home to the best jobs and opportunities. They claim virtually all of the nation’s population growth. They house our best schools and our leading cultural institutions. And they are, by any measure, greener than sparser forms of economic or residential development.
In turn, cities—by virtue of their density and diversity—sustain progressive politics. Perhaps the starkest determinant of the presidential vote in 2012 was population density: across red and blue states, 98 percent of the most densely populated counties went to Obama, while 98 percent of the least densely populated counties went to Romney. The takeaway would seem to be that people who live close to one another are more tolerant and empathetic; they are more likely to know someone of a different color, a different income group, or a different sexual orientation. They rely on and appreciate the provision of public goods and public services (transit, parks, garbage collection)—even as they consume fewer public dollars than their red-state counterparts. And they have a deeper appreciation of the regulatory standards (guns, labor conditions, food, public health) that promise us a modicum of safety and security.
In an urban (and still urbanizing) nation, all of this would seem to be good news. So why, when it comes to the hard work of building a just and sustainable future, does it feel like we still live in Sectionville?
The answer lies in the parallel decline of American cities and the American labor movement in the second half of the twentieth century. The pre–New Deal labor movement was built largely around strong urban unions—in skilled trades, in unskilled services, and in local manufacturing—whose bargaining power was rooted as much in the urban form as it was in the workplace. In many settings, such unions—of printers, carpenters, janitors, waitresses, teamsters—shaped local politics, sustained local solidarity (through secondary boycotts), and pooled resources in the provision of basic services such as health centers.
In the 1930s, the emergent Congress of Industrial Organizations took a different tack, building silos of solidarity around particular industries and pattern bargaining across their constituent firms. The CIO was often indifferent—and sometimes hostile—to the older local unions, many of which, especially in construction, clung to the rival American Federation of Labor. Labor maintained its metropolitan presence in settings where strong local and industrial organization overlapped (autoworkers in Detroit, steelworkers in Pittsburgh, packinghouse workers in Chicago, dockworkers in San Francisco), but local solidarity—in all these settings—depended on the continued health and growth of both the big industrial unions and the central cities in which they were rooted.
Instead of growth, however, the years after the Second World War saw both cities and unions undergo dramatic decline. In the 1950s, about a third of American workers belonged to unions. By 1990, this number had fallen by half, to about 16 percent, and it now sits at just over 11 percent. And these numbers are cushioned by the growth (pre-Reagan) and relative stability (post-Reagan) of public-sector unions over that span: in the private sector, union density was under 25 percent by the early 1970s, half that (under 13 percent) by the late 1980s, and half that again (6.7 percent) by 2013.
American cities declined at the same rate, and for some of the same reasons. Many metropolitan regions grew more slowly than the nation as a whole, and most central cities shrank even if the metropolitan areas of which they were a part continued to grow. Of our fifteen largest central cities, eleven saw their peak population in 1950—the only exceptions being New York, Chicago, Los Angeles, and Houston. And the others didn’t just stop growing: nine of those cities lost more than a quarter of their population over the next fifty years, and five (St. Louis, Detroit, Pittsburgh, Buffalo, and Cleveland) lost more than half.
The demographic shift was in part regional (Rust Belt to Sun Belt), but it was primarily local—that is, urban to suburban. Residentially and commercially, cities simply got thinner. Consider St. Louis. In 1930, the metropolitan area encompassed four counties (two in Missouri, two on the Illinois side) and a population of about 1.3 million—of which over 60 percent lived in the City of St. Louis. In 1970, the metropolitan area included three more counties on the Missouri side, and growth was confined to the suburbs: the City claimed barely a quarter of the metropolitan population (620,000 of 2.36 million). By 2010, the metro area sprawled across seventeen counties and the City claimed just 10 percent of the metro population (now nudging 3 million). Population density, about 1,000 persons per square mile in 1960, was less than a third of that (322 persons per square mile) by 2010.
The overlapping trajectories of urban and union decline are underscored by the graph below, which plots the decline in union density against the decline in the central city populations of the six largest Rust Belt cities (St. Louis, Detroit, Pittsburgh, Buffalo, Cincinnati, and Cleveland).
The sources of union and urban decline are complex but familiar. Even at its peak, the American labor movement was regionally, sectorally, and jurisdictionally fragmented. Even where it was strongest, it depended upon fragile industry-specific deals with leading employers. As a result, its political presence never matched its numbers. Hemmed in by broader political constraints and its own organizing and political strategies, the mid-century labor movement is commonly characterized as timidly contractual at the bargaining table and barrenly married to the Democrats at the ballot box.
At the same time, the stakes at the bargaining table, which in the American setting determined not only wages but social benefits like health care and pensions, were unusually high. This meant that the political backlash—which began in many states with the passage of the Taft-Hartley Act in 1947 and spread across the economy with the business offensive of the 1970s—was also unusually severe. Labor organization peaked earlier, and at a lower rate, in the United States than among most of its peers. And the political, legal, and administrative attacks on labor rights since the 1970s led to much steeper losses in the US than in any of its peers.
In American cities, decline reflected some of the same pressures facing labor—notably, deindustrialization and globalization. But, in most Rust Belt settings, urban decline began decades before any hint of trouble in the larger economy. Racial transition in American cities in the 1930s and 1940s yielded a nasty pattern of segregation and discrimination enforced and sustained by restrictive deed instruments, private realty, federal housing and mortgage policies, local zoning, and an enthusiasm for urban renewal that equated black occupancy with blight. Fragmented local government, flush with federal redevelopment and highway funds, became a sort of centrifuge that flung people, employment, and tax capacity to the suburban fringes.
Again, consider St. Louis. The city saw major plant closings and disinvestment in its core (see map below) and waves of new retail, commercial, and industrial investment on the suburban fringe. Employment simply moved from St. Louis City to St. Louis County, and then further out. Commuting times—especially for suburb-to-suburb commutes—rose steadily (in St. Louis as in most other metro areas), and all of the natural solidarities of local employment began to dissipate.
Major Plant Closings (1970-2000) and Vacant Land (2003) in St. Louis
These losses are commonly viewed as a consequence of deindustrialization and globalization, as a story of factories moving to Ciudad Juárez or China and of working Americans—now cut loose from the smokestacks of the central city—moving to the suburbs. But this misreads the timing, and the spatial pattern, of economic and demographic flight from the midcentury city. As Jefferson Cowie has shown in his masterful account of RCA’s departure from Camden, New Jersey, industry’s first strategy was to leave the city for cheaper and less union-friendly settings—including the suburban fringe, struggling rural outposts, or the right-to-work South. For RCA, this meant targeting female workers (the wives and daughters of workers in Indiana’s declining stone industry) in Bloomington and then African Americans in Memphis before crossing the Rio Grande.
Two Cities, Two Industries
We can see this pattern—of the labor force and its urban base thinning out together—across a variety of settings and industries. The motives, and the consequences, of this migration were explicitly anti-union; they were a means of cutting costs by destroying the bargaining power of workers. This was accomplished both by moving production from urban centers of union strength to suburban or rural hinterlands and by seeking out the cheapest and most malleable labor force. In some sectors, such as the boot and shoe industry, job quality and union density were whittled away in the United States long before production moved overseas. Others, such as meatpacking, were largely immune from globalization, but still saw production scattered—from an older urban base to the cornfields of Kansas and Iowa, from Metroland to Sectionville. Let’s compare the two a little more closely.
The big packinghouses left Chicago and Omaha and Kansas City for low-wage outposts of Ottumwa and Columbus Junction and Storm Lake, a move motivated by the desire to slash labor costs and facilitated by the industry’s move from rail to refrigerated trucking as its primary means of transport. Small and struggling Midwestern towns competed fiercely for this new investment, offering expansive tax and infrastructure (especially water and sewage) incentives. This migration pointedly undermined the political and community alliances that had sustained the United Packinghouse Workers and eroded the ability of the union to organize across plants or secure master agreements. The map below tells the story.
In 1947, employment in meatpacking across the Midwest was concentrated in the region’s metro areas: Chicago, Peoria, St. Louis (including the counties on the Illinois side), Milwaukee, Madison, Dubuque, Cedar Rapids, Waterloo, Omaha, and Kansas City. The only non-metropolitan outposts were an Armour plant in Mason City (Cerro Gordo County), Iowa, and a Hormel plant in Fort Dodge (Webster County), Iowa. Over the following decades, the industry thinned out dramatically. Employment in the metro settings fell, while production moved to new or expanded plants in places like Ottumwa (John Morrell), Denison (Farmland), and Storm Lake (Iowa Packers), Iowa; Austin (Hormel), Albert Lea (Wilson), and Worthington (Swift), Minnesota; Schuyler (Cargill) and Fremont (Hormel), Nebraska; and Arkansas City (Rodeo/Morrell), Emporia (Tyson), and Liberal (National Beef), Kansas.
Missouri’s boot and shoe industry began to abandon the commercial core of St. Louis for the Ozarks early in the twentieth century. The state’s largest leather-goods firm, the Brown Shoe Company, built a plant in Moberly in rural Randolph County in 1905. Others followed Brown’s lead. Small towns in Missouri and Illinois competed for this investment, offering generous incentives (construction assistance, tax abatements) and a “company town” hostility to union organizing. Employers made no secret of their motives. When the arrival of the Wagner Act and CIO in the 1930s spawned a renewed union drive, employers explicitly responded with threats including plant closings and migration. In 1941, Brown Shoe opened a new plant in Dyer, Tennessee.
The map above tells the story. In 1947, Missouri’s leather-goods industry employed about 19,000, over 90 percent of whom worked (and lived) in the city of St. Louis. A decade later, employment in leather goods had almost doubled statewide (to 37,000) but fallen by half in the city—which now claimed less than a quarter of those jobs. After that, employment began to drop off—to 29,000 in 1967 (18 percent in St. Louis), 14,600 in 1987 (4.5 percent in St. Louis), and only 1,645 in 2007 (just over 100 of which, 6 percent, were in St. Louis).
Although the economic fortunes of the two industries were quite dissimilar, their labor-relations strategies had much in common. Both sought escape from organized (and more easily organizable) urban labor markets. Both initially targeted rural workers and then moved on to foreign workers—the packers by recruiting immigrants, the shoemakers by moving production overseas entirely.
In each case, the spatial dispersion of production brought with it a collapse in union density and a collapse in real wages. At the end of the 1970s, just under half of meatpacking workers belonged to a union. This fell to a third by the early 1980s, and to less than one in five by the end of the decade (see graph below). In meatpacking, wages fell alongside those of other production workers, but more dramatically. Between 1947 and 1979, the average real hourly wage of all production workers nearly doubled; after 1979 it flatlined, rising 1.5 percent over 33 years. Between 1947 and 1979, the average real hourly wage of meatpacking workers rose 80 percent; after 1979 it fell nearly 30 percent. In 1970, packinghouse wages were about 20 percent higher than the average manufacturing wage; by 2002 they were 20 percent lower. Wage and union losses were closely related to plant size and location. At the end of the 1970s, large urban packing plants (those with 1,000 or more employees) boasted wages 23 percent higher than the industry average, and 30 to 45 percent higher than the wages at small plants (those with fewer than 500 employees). As outmigration of production continued, and larger rural facilities became the industry norm, the wage premium at larger plants gradually dissipated.
In the boot and shoe industry, production wages rose steadily (in real dollars)—from under $7 an hour in the late 1930s to almost $13 an hour in the early 1970s—but then ground to a halt, falling back below $12 by the early 1990s. Sectoral wages have risen since then, but this is largely an artifact of collapsing employment: as most production moved overseas (national employment in the broader leather goods sector fell from 135,000 to under 30,000 between 1990 and 2012), the few jobs that remained were in non-production jobs or in “boutique” artisanal lines.
We can see a glimpse of these patterns across our more recent history. Since 1983 (when good industry-level data on union membership became available), density in both meatpacking and footwear has fallen steadily. While the overall employment trajectories are quite distinct—meatpacking consistently employs about half a million workers across these thirty years; footwear employment almost disappears—the net effect is similar. The jobs leaving the urban core are mostly union jobs. The new jobs—whether they are in Ottumwa or Malaysia—are not.
Union leaders—at least outside the building trades—now have a deep appreciation of the impact of sprawl on public- and private-sector unionism. Big-box suburban commercial development displaces union jobs, especially in grocery retail and warehousing. In sectors such as hospitality or building services, union density declines almost in direct proportion to the distance from the urban core. And sprawl undermines public-sector unions either by reducing demand for their services (as with transit) or by putting unrelenting pressure on public budgets—and thereby feeding the backlash against teachers and other public servants.
What union leaders haven’t fully recognized is how far back this pattern goes—and what it will take to challenge it, or to organize despite it. Meatpacking and shoes are outlying examples, but this sketch of the relationship between urban decline and union decline could be replicated for almost any city and its major sector of employment. We need to devote closer attention to the geography of organizing and sustaining labor power, and to the ways in which urban decline and sprawl erode the natural solidarities of city life.
The state-level research series of the Occupational Employment Statistics provides some revealing comparisons, across states, of occupational wages. The map shows the regional pattern: unsurprisingly, wages are generally lower in southern states and in less urbanized settings. The “bar and whisker” graph at right shows the distribution—and range—around the median state for each occupation.
This month’s jobs report was widely celebrated for showing that—after adding 217,000 jobs in May 2014—the United States had finally returned to the December 2007 (pre-recession) level of employment. This is a useful comparative benchmark, underscoring the unusual depth and duration of this downturn: measured against other postwar business cycles (see graphic below), the 2007–9 downturn cost us a bigger share of jobs, and the subsequent recovery took much longer to gain them back.*
Meanwhile, the December 2007 jobs threshold has been rendered meaningless by over six years of economic and demographic change. Because the labor force has grown substantially over those six-and-a-half years, matching the old employment level still leaves the unemployment rate far above its pre-recession benchmark. The first goal, then, should be to return to the December 2007 rate of unemployment—and by this measure (see graphic below), we’re still 2 million jobs in the hole.
This measure of the jobs deficit is incomplete, however, because the unemployment rate does not include those who have given up looking for work. If we set our goal at a return to pre-recession unemployment and labor-force participation rates, the jobs deficit swells to over 9.5 million.
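The benchmark arithmetic behind these deficit estimates is straightforward. The sketch below uses round, illustrative inputs (not actual BLS series) to show how the employment target moves once labor-force participation is accounted for:

```python
def jobs_deficit(population, target_lfpr, target_unemp, employed):
    """Shortfall of actual employment against a benchmark defined by a
    labor-force-participation rate and an unemployment rate.
    Quantities are in millions of persons; rates are fractions."""
    target_employment = population * target_lfpr * (1 - target_unemp)
    return target_employment - employed

# Illustrative round numbers, NOT actual BLS figures: a working-age
# population of 247 million, a pre-recession-style participation rate
# of 66% and unemployment rate of 5%, against 145.8 million employed.
deficit = jobs_deficit(population=247.0, target_lfpr=0.66,
                       target_unemp=0.05, employed=145.8)
```

Holding the unemployment-rate target fixed, every point of recovered labor-force participation raises the employment benchmark—which is why the measured deficit swells once discouraged workers are counted back in.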
But this still sets a low bar. Rates of employment and labor-force participation in December 2007 were not nearly as strong as they had been a decade earlier. If we aim for those “full employment” targets (the unemployment and labor-force participation rates of the late 1990s), we are running a deficit of a couple of million jobs before the recession even starts. From there, the lost jobs and missing workers pile up quickly, reaching—and plateauing at—a jobs deficit of over 11 million. Some recovery.
Needless to say, the book has incurred its share of criticism—some substantive, some silly. But of all the theoretical and methodological issues with Piketty’s sweeping account that reviewers have raised, I think the singular lingering weakness is this: the political and institutional sources of American inequality (and of American exceptionalism) are given short shrift. Piketty devotes surprisingly little attention to the policy shifts that unshackled incomes at the top and destroyed bargaining power at the bottom. “It’s like saying slavery is an inequality of assets between slaves and slaveholders,” as Suresh Naidu put it in Jacobin, “without describing the plantation.”
The graphic below takes a longer view, tracing real (inflation-adjusted) wages in key productive sectors since the 1930s and 1940s. The sectors covered here, like meatpacking or automobiles, are those for which decent wage data can be assembled for this full sweep of over seventy years. Importantly, they are also those sectors that we have historically relied upon for living-wage employment. Into the 1970s, as the uniform growth in real wages suggests, jobs in these industries were a ticket to the middle class. Since the late 1970s, however, wages in these industries have flattened at best—and in some cases fallen off substantially.
This graphic summarizes the key inequality and policy trends (for the U.S.) traced in Thomas Piketty’s Capital in the Twenty-First Century. Scrolling through the inequality metrics suggests the key themes in Piketty’s examination of the U.S. case: the now-familiar “suspension bridge” of income inequality, dampened only by the exceptional economic and political circumstances of the decades surrounding World War II; the growing share of recent income gains going to the very high earners (the 1% or .01%); the stark inequality within labor income (see the top 1% and top 10% wage shares) generated by the emergence of lavishly compensated “supermanagers”; and a concentration of wealth that fell little over the first half of the twentieth century and has grown steadily since then.
Scrolling through the policy metrics suggests some of the causal forces at work: a precipitous decline in the top inheritance and income tax rates (lifting the ceiling on high incomes); and the collapse of labor standards and bargaining power (lowering the floor for everyone else). I have added here one data series—the trajectory of union density—on which Piketty is curiously silent (his chapter on income inequality uses the minimum wage as a surrogate for bargaining power more generally).
Before Social Security, almost 80 percent of American seniors lived in poverty. As Social Security contributions and payments became established early in the postwar era, poverty among seniors began to fall—and continued to do so under the Great Society, abetted by the passage of Medicare in 1965. As the program’s growth slowed in the 1980s and 1990s, poverty among seniors leveled out at about 10 percent.
Cuts to social programs have only widened the gap between poor Americans and everyone else—especially the retreat from AFDC since 1996. By any measure, the TANF program is a weak substitute for the program it displaced. By the mid to late 1970s, AFDC reached about a third of all poor families, and over 80 percent of poor families with children. With the implementation of TANF, this coverage shrank almost immediately (1996-1997) to about half of all poor families and about two-thirds of those with children. By 2010-2011, only 20 percent of all poor families, and just over 27 percent of those with children, were receiving TANF assistance. Between 1992 and 2010 alone, one million more American children slipped below the poverty line. The share of Americans living in severe poverty (below 50 percent of the poverty line) has almost doubled since 1972.