date: 11 March 2018
American Labor and Working-Class History, 1900–1945
Summary and Keywords
Early 20th century American labor and working-class history is a subfield of American social history that focuses attention on the complex lives of working people in a rapidly changing global political and economic system. Once focused closely on institutional dynamics in the workplace and electoral politics, labor history has expanded and refined its approach to include questions about the families, communities, identities, and cultures workers have developed over time. With a critical eye on the limits of liberal capitalism and democracy for workers’ welfare, labor historians explore individual and collective struggles against exclusion from opportunity, as well as accommodation to political and economic contexts defined by rapid and volatile growth and deep inequality.
Particularly important are the ways that workers both defined and were defined by differences of race, gender, ethnicity, class, and place. Individual workers and organized groups of working Americans both transformed and were transformed by the main struggles of the industrial era, including conflicts over the place of former slaves and their descendants in the United States, mass immigration and migrations, technological change, new management and business models, the development of a consumer economy, the rise of a more active federal government, and the evolution of popular culture.
The period between 1896 and 1945 saw a crucial transition in the labor and working-class history of the United States. At its outset, Americans were working many more hours a day than the eight for which they had fought hard in the late 19th century. On average, Americans labored fifty-four to sixty-three hours per week in dangerous working conditions (approximately 35,000 workers died in accidents annually at the turn of the century). By 1920, half of all Americans lived in growing urban neighborhoods, and for many of them chronic unemployment, poverty, and deep social divides had become a regular part of life. Workers had little power in either the Democratic or Republican party. They faced a legal system that gave them no rights at work but the right to quit, judges who took the side of employers in the labor market by issuing thousands of injunctions against even nonviolent workers’ organizing, and vigilantes and police forces that did not hesitate to repress dissent violently. The ranks of organized labor were shrinking in the years before the economy began to recover in 1897. Dreams of a more democratic alternative to wage labor and corporate-dominated capitalism had been all but destroyed. Workers struggled to find their place in an emerging consumer-oriented culture that assumed everyone ought to strive for the often unattainable, and not necessarily desirable, marks of middle-class respectability.
Yet American labor emerged from World War II with the main sectors of the industrial economy organized, with greater earning potential than any previous generation of American workers, and with unprecedented power as an organized interest group that could appeal to the federal government to promote its welfare. Though American workers as a whole had made no grand challenge to the nation’s basic corporate-centered political economy in the preceding four and one-half decades, they entered the postwar world with a greater level of power, and a bigger share in the proceeds of a booming economy, than anyone could have imagined in 1896. The labor and working-class history of the United States between 1900 and 1945, then, is the story of how working-class individuals, families, and communities—members of an extremely diverse American working class—managed to carve out positions of political, economic, and cultural influence, even as they remained divided among themselves, dependent upon corporate power, and increasingly invested in an individualistic, competitive, acquisitive culture.
Keywords: labor and working-class history, labor movement, trade unions, class, Progressive Era, World War I, World War II, radicalism, capitalism, race and ethnic relations, legal history, political history, New Deal
Workers and the Rise of Corporate America
American trade unionists entered the 20th century battered by a series of savage defeats which, by 1896, brought the end of an era when millions of Americans had joined mass movements seeking alternatives to corporate-dominated, wage-labor capitalism. Labor reformers’ post-Civil War dream of emancipating American laborers from the wage system and their hopes for the creation of a producers’ republic based on principles of cooperation and commonwealth had been shattered in Chicago’s Haymarket Square on May 4, 1886. The wind had been stolen from the spirit of unionism in the all-important steel industry at Andrew Carnegie’s Homestead mill in 1892, and from industrial unionism on the nation’s rail lines in the defeat of the 1894 Pullman strike and boycott. Finally, the Republican Party’s defeat of the Populist-Democrat fusion in the presidential campaign in 1896 ensured that the vast majority of wage workers and farmers would not have the support of their own national political party.
Ascendant corporate leaders had been emboldened and empowered by much of the public’s revulsion against the labor-related violence of the late 19th century. The forces of “law and order” at the local, state, and federal levels came to the aid of business in strikes and lockouts during the “Age of Industrial Conflict.” Business also won the crucial legal conflict over the definition of “freedom” in the workplace and in employment markets. Court injunctions against labor activity were ubiquitous in the wake of the 1894 Pullman boycott, and case law privileged employers’ prerogatives at all turns. In the eyes of the law, Americans generally—with the exception of married white women—had a responsibility to work, but their sole right at work was the right to quit. Furthermore, legislators paid less attention to workers’ welfare than they did to subsidizing the growth of American industry or sustaining their own political power, all too often lining their pockets with the graft that ran rampant in that period of fantastic growth. Lawmakers had taken the first steps toward regulating trusts and moderating the worst forms of corruption, but those efforts were generally weak, and the nation’s courts ensured that employers’ power in the workplace would be virtually unchecked.
Great changes were taking place, yet Americans generally believed that even more change was needed if the republic were to survive and thrive in the industrial era. In the workplace as much as in surrounding communities, Americans feared the implications of this new era of global economic expansion. Political and ideological violence may have been rare, but when violence broke out, it both stigmatized and divided labor groups, even as it brought swift reactions from local police, private detective firms, and state and federal officials.1 More broadly, a general fear of the revolutionary changes taking shape in everyday life inspired both a broad-based progressive reform impulse, shared by many American workers, and a renewed American radicalism, as well as the forces of reactionary repression and business conservatism that sought to stamp out what many saw as the real possibility of mob action and socialist insurgency.
The labor violence and economic upheavals of the late 19th century had been horrific enough to convince many powerful Americans that reform was necessary. In 1898, Republican president William McKinley, who would be assassinated in 1901 by the anarchist Leon Czolgosz, appointed the United States Industrial Commission to study the causes of labor violence. At the same time, a broad group of largely middle-class and elite Americans, soon to be known as Progressives, set out to document and then ameliorate the worst forms of corruption in the economy and politics, and to soften the edges of the new industrial system by making workplaces, consumer products, and neighborhoods safer and healthier. There was no single Progressive Era social movement; rather, reformers sought everything from antitrust legislation, shorter working hours, and safer workplaces to bans on child labor, protective legislation for female workers, and reforms that would clean up manufacturing and the political process.
These top-down reform efforts—efforts that emphasized the need for greater efficiency and order in the economy and at the workplace—would be deeply ambiguous for workers. But they reflected an important move away from the commitments to Social Darwinism and laissez-faire principles that had defined the Gilded Age. Progressive reform itself could become a form of social control. Workers were subjected to intense moral campaigns, the Americanization efforts of both well-intentioned settlement house workers and less salutary anti-immigrant vigilantes, and the institution of “scientific management” regimes fostered by Frederick Winslow Taylor, Elton Mayo, and Frank and Lillian Gilbreth. One reformer’s vision of order and efficiency often became a reality of social control for workers.
For most workers, the greatest fears derived from the accelerating changes at the workplace that were well underway by the turn of the century. The mechanization of industry and employers’ drive for efficiency had long been forcing workers to do more specialized task work and robbing them of the control over their work many had enjoyed in systems of craft production. There were benefits as production skyrocketed across the economy. Whereas the pick miner in a coal shaft produced 2.5 tons per day on average, the fully mechanized open pit mines of the 1930s produced 16.2 tons per worker per day. In 1919, Henry Ford’s assembly line produced four times as much output per worker per hour as the industry had produced in 1910. Simultaneously, the kinds of occupations Americans held and their experiences at work changed dramatically, not always for the worse. Gangs of day laborers were transformed into legions of semiskilled workers running transportation and equipment-handling machines. Skilled, independent workers in iron and steel production became semiskilled machinists and repair technicians. These mechanized factories also required the development of a whole new set of tool-and-die makers. Overall, there was an upward leveling effect of mechanization. Between 1910 and 1930, the proportion of unskilled workers in industrial work fell from 36 to 30.5 percent, the semiskilled rose from 36 to 39 percent, and the skilled increased from 28 to 30.5 percent.2 Not everyone benefited, of course. Black men, when they were not stuck in sharecropping or tenant farming, were generally relegated to the hot, heavy, hard jobs, and most black women were forced to accept the long hours and lack of independence in domestic service.
The 20th century also saw what one historian has described as the “degradation of work.”3 The dream of the United States as an independent producers’ republic, which had inspired Americans from Thomas Jefferson to the Knights of Labor in the 1870s and 1880s, had long been dead. As early as 1877, two-thirds of American workers were wage laborers, with little hope of opening their own shops or owning their own farms. By 1940, no more than one-fifth of the population of the United States were self-employed.4 Wage labor—underpaid, demanding long hours, and subjecting workers to dangerous conditions (approximately 35,000 workers died in accidents annually at the turn of the century)—had become a permanent condition.5 Not only were the benefits of the wage economy unequally distributed, but the very nature of work became both more demanding and less satisfying. A profound contradiction emerged that arguably continues to shape workers’ lives in the 21st century: “The scientific-technical revolution and ‘automation’ requires ever higher levels of education, training, the greater exercise of intelligence and mental effort in general,” which is accompanied by “a mounting dissatisfaction with the conditions of industrial and office labor.”6
Despite their shared circumstances and some success in building a diverse labor movement in the early part of the century, American workers entered World War I perhaps more divided among themselves than at any other point in the nation’s history. Nativism was on the rise, and workers were divided by skill, craft, race, gender, and region. Industrial employers took advantage of workers’ fears and their internal divisions. On one hand, some corporate leaders developed systems of “welfare capitalism,” voluntarily providing marginal benefits to workers in order to stifle their dissatisfaction at work. On the other hand, business leaders and their allies in politics and the press played workers of different backgrounds against one another in order to undercut the possibility of shared militancy. It would be difficult, even for the most privileged workers, to fight for a place in the system.
Fighting for a Place in the System
With a significant economic recovery underway in 1897, American labor leaders began a new organizing push, primarily through the American Federation of Labor (AFL), railroad brotherhoods, and various unaffiliated unions. These organizations largely excluded racial minorities and women, and this model of organizing sought to come to terms with, rather than to transform, corporate dominance of the industrial economy. Nonetheless, the leaders of these unions and their largely white, male rank and file won critical victories and increased the AFL’s membership from 264,000 in 1897 to 1.6 million by 1904. Moreover, as the historian Julie Greene has shown, it is easy to overstate the apolitical character of the AFL’s “pure and simple unionism.” In addition to “bread-and-butter” contractual issues, the Federation actively pursued political influence in the late 19th and early 20th centuries. It is true, however, that the AFL assumed that trade unionists would speak for all American workers in the political sphere.7
The AFL sustained the power of craft workers in the construction and transportation trades, while also beginning to win benefits for some more skilled industrial workers. The railroad brotherhoods exerted significant, if informal, political influence through allies like Theodore Roosevelt in the Republican Party.8 Even mineworkers—who had a reputation as the most violent and militant of unionists, and who had, indeed, fought many labor wars—had gained enough leverage to cause President Theodore Roosevelt to mediate between the workers and the mine owners in a bitter 1902 anthracite coal strike.
Many, though hardly all, employers had initially accepted the rise of the AFL, even going as far as voluntarily recognizing unions and forming the National Civic Federation, a coalition of labor and business leaders seeking cooperation in the economy. By 1904, however, employers had grown frustrated with the demands of union contracts and workers’ increased militancy, and they began to hit back. They increased the use of “yellow dog contracts” to force workers to sign agreements that promised they would not join a union. Employers divided workers by national origin and regularly employed strikebreaking replacement workers. The National Association of Manufacturers embarked on a concerted “open shop” drive; the forerunner of today’s “right-to-work” laws, these were campaigns by employers and their political allies to ensure that workers in a unionized shop did not have to belong to the union. This protection of workers’ right to contract as individuals amounted to a thinly veiled attempt to undermine all organized labor, as unions could not afford to represent workers who were “free riders” on the backs of their union member coworkers. In 1913–1914, the open-shop drive climaxed in an actual labor war in the Colorado coal fields, as the Rockefeller-owned Colorado Fuel and Iron Company pushed for ever greater production; the conflict culminated in April 1914, when a workers’ tent colony at Ludlow, Colorado, was destroyed and eleven children and two women were killed in the attack (see Figure 1).9
As a result of such attacks on organized labor, membership in unions actually dropped in 1905 and remained stagnant for the next five years. Yet the booming economy before and during World War I increased labor’s power: the AFL’s membership increased by approximately 800,000 between 1910 and 1917, and organized labor as a whole grew to 4 million by 1920.10 The membership also became increasingly diverse in terms of skill level and occupations. These were important gains for workers, but they remained limited in no small part by the failure of the AFL to imagine an alliance with the vast majority of unorganized workers.
Figure 1. “Slain Miner and One of His Fighting Comrades.”
Photo by Bain News Service, Forbes Camp, Ludlow, Colorado, May 3, 1914. Prints and Photographs Division, Library of Congress (LC-B2-3034-14), digital ID: LC-DIG-ggbain-15854.
Radical Alternatives in the Progressive Era
Workers frustrated with the exclusionary practices and political moderation of the AFL could turn to an embattled world of labor radicalism which was going through something of a renaissance after the defeats of the 1880s and 1890s. American radicals—led by the socialist Eugene V. Debs and an eclectic band of militants that included Mother Jones (Figure 2), Elizabeth Gurley Flynn, “Big Bill” Haywood, and Lucy Parsons, among others—pushed for more radical and immediate change through the Socialist Party, insurgent industrial unions in mining and textiles, and through the Industrial Workers of the World.
Founded in 1901, the Socialist Party of America (SP) quickly emerged as a powerful political force. Within a decade the SP had built more than three thousand local branches and forty-two state organizations. Dozens of candidates affiliated with the new party won municipal and county elections on town squares stretching from Texas through Illinois to Milwaukee, Wisconsin. Meanwhile, the party’s leader, Eugene V. Debs, won 897,000 votes in his run for the presidency in 1912 and more than a million votes for president in 1920, while he was in prison after being convicted of sedition during World War I.
In the 1910s, garment workers in New York City and Chicago organized unions in the industry for which the term “sweatshop” was coined. Workers suffered oppressive conditions in sweatshops, but they were isolated from the rest of the workforce and could not take action directly against the manufacturers. As manufacturers moved production to larger factories in order to produce standardized clothing and to distance themselves from the increasingly negative reputation of sweatshops—spread by Progressive reformers—the larger shops also brought unskilled workers out of their relative isolation. Working conditions did not necessarily improve in larger shops, but opportunities to build worker solidarity presented themselves. Employers attempted to maintain divisions among workers, separating them by ethnicity and gender, and by offering “bonus pay” to the most productive workers.
After years of suffering, garment workers’ organizing came in quick surges: the “Uprising” of 20,000 in New York City in 1909, another strike of 60,000 workers in New York City in 1910, a 1910–1911 strike of 40,000 workers in Chicago, and the movement for unionization and reform after the infamous Triangle Shirtwaist factory fire in New York in March 1911 (Figure 3). Together these actions reinvigorated the International Ladies’ Garment Workers’ Union and created the Amalgamated Clothing Workers of America. In one of the most dramatic moments in U.S. labor history, the young immigrant garment worker Clara Lemlich took the stage from AFL leader Samuel Gompers, who had refused to call a strike. Speaking in Yiddish, she called her fellow garment workers to action. Within two days, approximately 20,000 workers from 500 factories were on strike. By the 1920s, the tens of thousands of members of the ACWA and the ILGWU had won the closed shop, higher wages, shorter working hours, and better working conditions. These events also revealed the politicization of immigrant women in the industry and showed that immigrant workers could be organized, contrary to much AFL commentary. Along with the United Mineworkers, the garment workers forged a new model of unionism, demonstrating that a pragmatic industrial unionism could succeed as well as the more hidebound craft unionism of the AFL. In this, the new unions were important exceptions to the rule of non-socialist craft organizing of the era.11
The Industrial Workers of the World (IWW) created another key, if short-lived, bastion of American labor radicalism. Founded in Chicago in 1905, the IWW took inspiration from a group from the Western Federation of Miners who had been radicalized during a series of violent strikes in Idaho, Montana, and Colorado. Rallying around their shared distaste for the AFL’s conservatism and exclusionary practices, the IWW sought to create “One Big Union” of all workers regardless of skill level, race, ethnicity, or gender. Emphasizing the necessity of direct action and workers’ control of the workplace, they called for an end of the wage system and workers’ ownership of the means of production. The “Wobblies,” as the members came to be known, tapped into and inflamed the radical spirit of many of the most marginalized workers. The IWW thus backed its demands for the fulfillment of workers’ needs, the bread of daily life, with the threat of a radical sensibility at least rhetorically committed to revolution (see Figure 4).
The preamble to the IWW’s 1908 constitution declared, “A struggle must go on until the workers of the world organize as a class, take possession of the earth and the machinery of production, and abolish the wage system.”12 The IWW’s revolutionary vision inspired many miners, loggers, and migrant agricultural workers in the West, as well as unorganized industrial workers in the East. Together, they built a lively workers’ culture with hundreds of songs collected in the Little Red Songbook. IWW membership peaked at 600,000 in 1916, riding a wave of important victories and broader socialist sentiment. Most famously, in the 1912 “Bread and Roses Strike” in Lawrence, Massachusetts, IWW leaders joined with local workers to strike against wage cuts and many years of low wages, long hours, dangerous working conditions, and terrible living conditions in the communities surrounding the factory. The IWW sustained a thread of American radicalism that otherwise might have been lost. The Wobblies’ radical critique of capitalism, their at least rhetorical support for direct action tactics such as sabotage, and their unswerving commitment to interracial organizing among all men and women carried these principles on through the relatively conservative first three decades of the century. The IWW also sustained the idea of industrial unionism, which remained a minority strain in the AFL’s organizing efforts, emphasizing that workers ought to be organized across all skill levels in a given industry.
Figure 2. “‘Mother’ Jones and Her Army of Striking Textile Workers.” Peirce & Jones for the New York World-Telegram & Sun, Philadelphia, PA, 1903.
Prints and Photographs Division, Library of Congress, digital ID: LC-DIG-ds-07713.
Figure 3. “Photograph of Police Officers, Civilians and Victims on the Sidewalk during the Triangle Shirtwaist Factory Fire.” March 25, 1911.
Franklin D. Roosevelt Library Photographs, 1870–2004, Franklin D. Roosevelt Library (#6040083).
Figure 4. “I.W.W. Hat Card.” Bain News Service, New York, NY, April 11, 1914.
Prints and Photographs Division, Library of Congress (LC-B2-3017-30), digital ID: LC-DIG-ggbain-15713.
Obstacles to Organizing in the Progressive Era
During the Progressive Era, the American Federation of Labor claimed to speak for all American workers. Still, with few exceptions, the AFL consisted largely of skilled, white, male workers, and focused its strikes, lawsuits, and limited political activity on maintaining those workers’ craft privileges.13 Its leaders also discouraged any organizing efforts not under the banner of the AFL, treating them as “dual unions,” or as enemies seeking to undermine the AFL. Furthermore, the federation’s leaders refused to engage in the broad political work that would have allowed them to challenge the anti-labor decisions of the courts or the narrowness of Progressive Era reforms.14 Such a closed, jealous, and litigious world of labor was hardly a beacon for the growing ranks of new immigrant and American migrant workers entering the deskilled factories of the North.
The limits of the Socialist Party’s gains also became clear soon enough. In the electoral arena, the SP never managed to reach the status of a viable third national party. The SP may have maintained a significant base of voters—as shown in Debs’s 1 million votes in the 1920 presidential election—but its efforts ran headlong into the anti-radical repression during and after World War I and the deeply conservative Republican ascendancy of the 1920s. Moreover, to the extent that Socialist politicians, such as Victor Berger and his allies in Milwaukee, made gains toward practical reform, they also distanced themselves from the more radical class politics of much of the American left. The Socialist leader Morris Hillquit denounced Berger and his allies as “sewer socialists”—mocking them for constantly bragging about how good Milwaukee’s sewer system was, even as they had failed to push forward the larger class struggle. Similarly, when socialist trade unionists rose to the leadership ranks in AFL unions, their pragmatism emerged. “Time and time again,” concludes the historian David Brody, “once they had acceded to office, Socialists began to act—if they did not always talk—like any other trade unionists.”15 Accommodation to established centers of power, however justifiable it may have been for Socialist activists in particular political contexts, added to the effects of internal divisions and repression of the left in limiting the SP’s radical challenge to American political and economic systems.
The IWW—in part because the Wobblies had some success, and in part because they sustained an unflagging rhetorical radicalism—also became the target of government and vigilante repression. Wobbly activists leading “free speech campaigns” faced club-wielding police officers and were whipped and even tarred and feathered by vigilantes throughout the West. During World War I, 1,200 miners suspected of being aligned with the IWW in Bisbee, Arizona, were rounded up, forced onto a freight train at gunpoint, and abandoned in the New Mexico desert without food or water for a day and a half before a nearby military commander arranged for them to be sheltered. At the same time, the federal government raided IWW offices across the country and convicted hundreds of Wobblies for antiwar speech. In the end, the IWW became one of the driving forces behind the rise of the American Civil Liberties Union and the push for protections of free speech during and after World War I, but the Wobblies could not save themselves from this repression. By the end of the war, with many of its leaders imprisoned, deported, or having fled the country, the IWW was unable to sustain itself as an institution.
Still more obstacles stood in the way of mass labor organizing in the first decades of the 20th century. Chief among them were the racial and ethnic divisions that ran through the shop floors of American industry. Historians have examined in great detail the intraclass racism that blocked white workers from acting in ways that would have been truly class-conscious. Between the late 19th century and World War I, tens of thousands of black workers gained access to unions, some all-black but some biracial in organization. Yet unions often acted as agents of division; some included racial exclusion clauses in their constitutions, while others gave lip service to solidarity while declaring that, in practice, black workers would undercut the wages and opportunities of white workers. For their part, recent black migrants from the South, the majority of black workers in the factories, alternately feared or despised the “white man’s union.”16
White workers and union leaders used episodes of black strikebreaking as evidence that black workers were inevitably the opponents of labor progress. Whites’ descriptions of black workers represented a powerful, if contradictory, mix of racist notions of black inferiority and fear of black physical superiority. Black workers, they feared, could outwork white workers, and black workers would do it on the cheap. In 1901, the AFL defended itself against accusations of racism, arguing that “the antipathy … some union workers have against the colored man is not because of his color, but because of the fact that generally he is a ‘cheap man.’”17 But by 1905, the division between white and black workers had become so pronounced that AFL chief Samuel Gompers (Figure 5) declared, “If the colored man continues to lend himself to the work of tearing down what the white man has built up, a race hatred worse than any ever known before will result. Caucasian civilization will serve notice that its uplifting process is not to be interfered with in any such way.”18 Not surprisingly, black leaders felt differently. The black political leader Ida B. Wells praised strikebreakers as “men who proved their value by risking their lives to obtain work,” and she endorsed “the constitutional right of all men to earn a living and to protect themselves in the exercise of that right.”19
Workers and labor reformers also struggled to organize during one of the most conservative eras in United States judicial history. In its 1905 decision in Lochner v. New York (198 U.S. 45), the United States Supreme Court struck down a New York law limiting hours for bakery employees. Far from being necessary to protect the welfare of the workers, the Court found, such hours legislation amounted to an unconstitutional attempt to regulate business, and an “unreasonable, unnecessary and arbitrary interference with the right and liberty of the individual to contract.” With this reading of the Fourteenth Amendment’s due process clause, the Court would go on in subsequent years to constrain workers’ rights and legislative efforts to reform the industrial system. In 1908, for instance, the Court upheld what were known as “ironclad” or “yellow dog” contracts, which forced individual workers to sign an agreement not to join a union in order to secure a job. Also in 1908, the Court found that labor boycotts of employers had been banned by the 1890 Sherman Anti-Trust Act. In fact, there were more antitrust actions brought against union activities than against business combinations until the Clayton Act of 1914 attempted to exclude union activity from the regulation of commerce, declaring that “the labor of human beings is not a commodity.” In 1911, the Court banned consumer boycotts, and in this period it also upheld blacklisting of union organizers, the constitutionality of company towns, and employers’ use of civil lawsuits to resist interference in their businesses. Even when the Court did support the constitutionality of reform measures, as in the 1908 Muller v. Oregon (208 U.S. 412) decision upholding limits on the number of hours women could work, the judges did so by appealing to the notion that women were the weaker sex and had special responsibilities in the home.
The justices found support in the “widespread belief that woman’s physical structure, and the functions she performs in consequence thereof, justify special legislation restricting or qualifying the conditions under which she should be permitted to toil.”
The Supreme Court’s antagonism to any limits on the individual’s “liberty of contract” ran counter to legislators’ gradual rewriting of state and federal law. The U.S. Congress instituted a system of workers’ compensation for federal employees in 1916 and regulated child labor in 1919, while twenty-five states passed workers’ compensation laws between 1911 and 1921. State and federal officials also formally began investigating workers’ safety, especially after the Triangle Shirtwaist fire in New York City in 1911 created widespread outrage against the factory owners’ willful refusal to protect their workers from dangerous conditions. The 1926 Railway Labor Act required railway industry employers to engage in collective bargaining and banned discrimination against unions in the railway industry (this protection was extended to airlines in 1936). The 1931 Davis-Bacon Act required construction contracts with the federal government to specify a minimum or “prevailing” wage for workers under that contract. The 1932 Norris-LaGuardia Act for the first time provided protection for workers’ rights to organize, banned yellow dog contracts, and outlawed the use of court injunctions in nonviolent labor disputes. By 1932, then, in the face of much judicial resistance, legislators had responded to growing public alarm by initiating a revolution in labor law that would come to fruition when the Supreme Court upheld the 1935 National Labor Relations Act.
Figure 5. “Samuel Gompers—Federal Commission on Industrial Relations, New York, New York,” 1915.
Prints and Photographs Division, Library of Congress (LC-B2-3361-1).
World War I and the Hope for Industrial Democracy
World War I provided an unprecedented opening for unions to make gains and for workers who had traditionally been excluded from industrial work to enter the nation’s factories. The federal government spurred a national mobilization of the workforce and economic resources, while coordinating industrial planning. Although the government went so far as to take over the railroads, the federal intervention in the economy hardly represented wartime socialism. Instead, the government relied on industry leaders who acted as “dollar-a-year” men, voluntarily aiding in the planning of the wartime economy, and it ensured profits for industry with cost-plus contracts. In essence, the federal government forged a larger role in managing the economy with the primary goal of efficient war-related production. This managed economy also facilitated the private accumulation of capital for employers and benefited masses of workers.
Why was this a boon for unions and workers? In the first place, the wartime economy required labor peace. Therefore, the federal government facilitated the formation and growth of unions. At the same time, the wartime economic boom required many new workers. With the end of European immigration and the draft of white men into the military, women and African Americans found new opportunities. The long-term consequences of the war differed sharply for women and men. Women’s industrial experiences proved to be a largely temporary phenomenon. The war did help to provide the necessary impetus to pass the Nineteenth Amendment to the U.S. Constitution, giving women the right to vote. But the war did not lead to major changes in gender roles; gender lines in the workforce reemerged after the war, and the popular image of the liberated “flapper” in the Roaring Twenties remained a decidedly minority experience.
For African Americans, the war sparked a major demographic, economic, and political transition. Between 1915 and 1918, nearly 500,000 African Americans migrated from the South to northern cities, with another 700,000 following in their wake during the 1920s. The Great Migration, as this movement of black southerners to industrial cities has been called, began a process that not only transformed the lives of the migrants but also fundamentally changed the populations and politics of major American cities.20 World War I-era migrants built modern black urban communities in places like New York’s Harlem, Chicago’s South Side, and Detroit’s Black Bottom. Out of these communities would grow civil rights organizations like the National Association for the Advancement of Colored People, black nationalist organizations like Marcus Garvey’s Universal Negro Improvement Association, and the first major black labor radicals and trade unions. In the 1920s, Harlem was especially fertile ground for black working-class politics. As African American artists and writers created the Harlem Renaissance, black socialists and communists spoke on soapboxes on New York City’s street corners and helped popularize a black class politics. Building on the longstanding activism of Hubert Harrison and others, people like A. Philip Randolph who got their start in the 1910s would help build a nationally powerful, labor-based civil rights movement in the 1930s and 1940s.
The Business Decade
World War I seemed to offer an opportunity for workers to improve their position in the economy. Workers, in fact, gained a great deal in real wages and political power during the brief period of nearly full employment during the war. Yet unions’ efforts to institutionalize their place in an “industrial democracy” were roundly defeated in a series of strikes between 1919 and 1921. In 1919 alone, more than 4 million workers—approximately one-fifth of the workforce—went on strike. A general strike of 60,000 in Seattle, Washington, a strike by nearly the entire police force in Boston, Massachusetts, and a national steel strike of 350,000 workers in Pittsburgh and beyond (Figure 6) are representative of the broad scope of the strikes by workers fearful that they would lose what they had won during the war and facing the prospect of a severe postwar recession. In each case the workers lost, ending up more divided than before and more desperate for jobs at virtually any wage. Moreover, the entrenched economic conservatism of the federal government and popular culture not only marginalized labor unions but also celebrated the spirit of innovation, speculation, and acquisitive individualism of the “business decade.”
The benefits of the business decade were deeply unequal. To many Americans, the 1920s seemed to promise the unending expansion of the American economy. Consumer goods proliferated. The number of telephones doubled, by 1930 about half of Americans had indoor toilets, and Henry Ford refined assembly line production, allowing many working families to own a car. Yet the expansion of the consumer economy depended on an equal expansion of the consumer credit economy; Americans bought their radios and other modern wonders on installment plans. Moreover, even with the greater availability of credit, full participation in the consumer economy remained a dream for most. As the economic historian W. Elliot Brownlee notes, “Only one family in six owned an automobile, only one family in five owned a fixed bathtub or had electricity in its home, and only one family in ten had a telephone.”21 As importantly, while the automobile and other manufacturing industries boomed, core American economic sectors lagged far behind. Workers in these “sick industries,” including agriculture, mining, and New England textiles, were facing depression conditions well before the stock market crash in 1929.
Unions declined sharply in the 1920s under pressure from a conservative attack. Employers promoted an “American Plan” that celebrated the democracy of the open shop and that associated organized labor with un-American economic systems. Companies also promoted “welfare capitalism,” providing workers with benefits such as home loans, group insurance policies, stock options, and regular sponsorship of sports teams all in the name of reducing costly labor turnover and improving industrial harmony. Perhaps most importantly, some four hundred firms created Employee Representation Plans, or company unions, which sought to promote worker allegiance to the company and to provide a kind of pressure release for workers thinking about organizing in their own interests. Welfare capitalists sought to prevent unions from ever rising again, and for a time they succeeded. The number of strikes receded dramatically, and union membership declined. The success of unregulated markets and welfare capitalism, however, was short-lived, and the mass unemployment, poverty, and insecurity of the 1930s would help spark the greatest surge in union members in U.S. history.
Figure 6. “Pittsburgh Strike [1919 Strikers Demonstrating in Car].”
Photo by Bain News Service, 1919. Prints and Photographs Division, Library of Congress (LC-B2-5005-13), digital ID: LC-DIG-ggbain-29279.
The Crash and Its Immediate Aftermath
On October 29, 1929, “Black Tuesday,” traders on the New York Stock Exchange shed 16.4 million shares of stock, causing a drastic decline in the overall value of stocks. From a high of 381 on September 3, 1929, the Dow Jones Industrial Average ultimately fell to a low of 41.22 on July 8, 1932. Approximately five thousand banks failed between 1929 and 1933. Industrial production declined by over half between the crash and the middle of 1932. By that year, unemployment soared to between one-quarter and one-third of the total labor force. Things were not much better for those who managed to hold onto employment: wages fell 50 to 75 percent in the early years of the Great Depression. Economic sectors that had been struggling in the 1920s saw conditions only worsen; farm income declined by 60 percent, and one-third of farmers lost their land in the 1930s. The industries that had driven the prosperity of the 1920s were now failing; by 1932, the automobile industry was producing at only 20 percent of its capacity. The stock market crash laid bare the underlying weaknesses in the U.S. economy and created mass unemployment, poverty, and insecurity.
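The peak and trough figures quoted above imply a fall of roughly nine-tenths of the market's value. A quick back-of-the-envelope check, using the rounded numbers in the text:

```python
# Peak-to-trough decline implied by the figures above
# (Dow high of 381 on September 3, 1929; low of 41.22 on July 8, 1932).
peak, trough = 381.0, 41.22
decline = (peak - trough) / peak
print(f"{decline:.1%}")  # prints "89.2%"
```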
President Herbert Hoover responded to the crash much more energetically than previous presidents had in similar crises, but his efforts were too limited to meet the depth of this one, in part because he remained steadfastly committed to voluntaristic, optimistic, Progressive-style interventions. Hoover moved to shore up public confidence while also supporting business leaders’ efforts to protect their financial interests. As Secretary of the Treasury Andrew Mellon advised his fellow capitalists to “liquidate labor, liquidate stocks, liquidate the farmers, liquidate real estate,” Hoover assured the nation that the “fundamental business of the country was sound,” and asked for voluntary cooperation from corporate managers to maintain employment and wages. As realization of the deepening crisis dawned on him, Hoover also increased federal funds for public works, moved to cut taxes, and requested private agencies, as well as state and local governments, to provide relief to the approximately 7 million unemployed by 1931. Arguing that direct unemployment relief was a “dangerous” suggestion, Hoover instead created the Reconstruction Finance Corporation, which provided loans to businesses and banks in the hope that greater corporate stability would strengthen the economy.
President Hoover’s limited, top-down response to the crisis aggravated widespread anxieties and led to a new level of popular unrest. Destitute Americans living in shantytowns (Figure 7), popularly known as “Hoovervilles,” clearly blamed the president for their condition. Thousands of Americans joined in organizing for relief from the federal government. In unemployed organizations, spearheaded by socialist and communist organizers, Americans demanded monetary relief and reinstalled tenants in their apartments when they were evicted. The most important protests and strikes of the 1930s were still years away, but the unemployed organizing of the early 1930s played an important role in increasing popular militancy.
In 1932, a group of 22,000 World War I veterans marched on Washington, D.C., to demand that the U.S. Congress pay them the bonuses they had been promised for their service in the war. For weeks thousands of veterans camped on Anacostia Flats, within sight of the Capitol, while President Hoover and Congress refused to pay the bonuses. Finally, the president sent the U.S. Army to break up the “Bonus Army” camps. Army Chief of Staff Douglas MacArthur led the operation, with George Patton and Dwight Eisenhower among the officers involved. Photographs and newsreels showed tanks rolling through the streets of the nation’s capital and current U.S. soldiers setting fire to tents occupied by the heroes of World War I; these images contributed to Hoover’s loss of public support as the 1932 election neared.
Figure 7. “William A. Swift, Once a Farmer, Now a Resident of Circleville’s ‘Hooverville.’ When He Returned from the War He Went West. ‘Made awful good money jobbin’ around.’”
Photo by Ben Shahn for the Farm Security Administration, 1938. Prints and Photographs Division, Library of Congress (LC-USF3301-006408-M5), digital ID: LC-DIG-fsa-15603.
Workers and the Changing State during the New Deal
By 1932, Herbert Hoover had become by all accounts the most unpopular person in the United States. In contrast, New York’s governor, Franklin Delano Roosevelt, brought his optimistic paternalism to the national public, projecting confidence and campaigning to the tune of “Happy Days Are Here Again.” As governor, Roosevelt had experimented with unemployment relief and public works programs that became popular among New Yorkers. Yet he came to the presidency with no immediate or comprehensive solution to the nation’s economic troubles. Instead, the New Deal represented a series of experiments which, though they did not pull the nation out of the depression (only economic mobilization for World War II would do that), still dramatically transformed the American economy by creating a new welfare state, strengthening unions, and affirming the economic importance of government action as a source of both spending and business regulation.
On June 8, a Scottish banker named Alexander Fordyce shorted the collapsing Company’s shares in the London markets. But a momentary bounce-back in the stock ruined his plans, and he skipped town leaving £550,000 in debt. Much of this was owed to the Ayr Bank, which imploded. In less than three weeks, another 30 banks collapsed across Europe, bringing trade to a standstill. On July 15, the directors of the Company applied to the Bank of England for a £400,000 loan. Two weeks later, they wanted another £300,000. By August, the directors wanted a £1 million bailout. The news began leaking out, and seemingly contrite executives, running from angry shareholders, faced furious Parliament members. By January, the terms of a comprehensive bailout were worked out, and the British government inserted its czars into the Company’s management to ensure compliance with its terms.
If this sounds eerily familiar, it shouldn’t. The year was 1772, exactly 239 years ago today, the apogee of power for the corporation as a business construct. The company was the British East India Company (EIC). The bubble that burst was the East India Bubble. Between the founding of the EIC in 1600 and the post-subprime world of 2011, the idea of the corporation was born, matured, over-extended, reined-in, refined, patched, updated, over-extended again, propped-up and finally widely declared to be obsolete. Between 2011 and 2100, it will decline — hopefully gracefully — into a well-behaved retiree on the economic scene.
In its 400+ year history, the corporation has achieved extraordinary things, cutting around-the-world travel time from years to less than a day, putting a computer on every desk, a toilet in every home (nearly) and a cellphone within reach of every human. It even put a man on the Moon and kinda-sorta cured AIDS.
So it is a sort of grim privilege for the generations living today to watch the slow demise of such a spectacularly effective intellectual construct. The Age of Corporations is coming to an end. The traditional corporation won’t vanish, but it will cease to be the center of gravity of economic life in another generation or two. Corporations will live on as religious institutions do today, as weakened ghosts of more vital institutions from centuries ago.
It is not yet time for the obituary (and that time may never come), but the sun is certainly setting on the Golden Age of corporations. It is time to review the memoirs of the corporation as an idea, and contemplate a post-corporate future framed by its gradual withdrawal from the center stage of the world’s economic affairs.
Framing Modernity and Globalization
For quite a while now, I have been looking for the right set of frames to get me started on understanding geopolitics and globalization. For a long time, I was misled by the fact that 90% of the available books frame globalization and the emergence of modernity in terms of the nation-state as the fundamental unit of analysis, with politics as the fundamental area of human activity that shapes things. On the face of it, this seems reasonable. Nominally, nation-states subsume economic activity, with even the most powerful multi-national corporations being merely secondary organizing schemes for the world.
But the more I’ve thought about it, the more I’ve been pulled towards a business-first perspective on modernity and globalization. As a result, this post is mostly woven around ideas drawn from five books that provide appropriate fuel for this business-first frame. I will be citing, quoting and otherwise indirectly using these books over several future posts, but I won’t be reviewing them. So if you want to follow the arguments more closely, you may want to read some or all of these. The investment is definitely worthwhile.
- The Corporation that Changed the World by Nick Robins, a history of the East India Company, a rather unique original prototype of the idea
- Monsoon by Robert Kaplan, an examination of the re-emergence of the Indian Ocean as the primary theater of global geopolitics in the 21st century
- The Influence of Sea Power Upon History: 1660-1783 by Alfred Thayer Mahan, a classic examination of how naval power is the most critical link between political, cultural, military and business forces
- The Post-American World by Fareed Zakaria, an examination of the structure of the world being created, not by the decline of America, but by the “rise of the rest”
- The Lever of Riches by Joel Mokyr, probably the most compelling model and account of how technological change drives the evolution of civilizations, through monotonic, path-dependent accumulation of changes
I didn’t settle on these five lightly. I must have browsed or partly-read-and-abandoned dozens of books about modernity and globalization before settling on these as the ones that collectively provided the best framing of the themes that intrigued me. If I were to teach a 101 course on the subject, I’d start with these as required reading in the first 8 weeks.
The human world, like physics, can be reduced to four fundamental forces: culture, politics, war and business. That is also roughly the order of decreasing strength, increasing legibility and partial subsumption of the four forces. Here is a visualization of my mental model:
Culture is the most mysterious, illegible and powerful force. It includes such tricky things as race, language and religion. Business, like gravity in physics, is the weakest and most legible: it can be reduced to a few basic rules and principles (comprehensible to high-school students) that govern the structure of the corporate form, and descriptive artifacts like macroeconomic indicators, microeconomic balance sheets, annual reports and stock market numbers.
But one quality makes gravity dominate at large space-time scales: gravity affects all masses and is always attractive, never repulsive. So despite its weakness, it dominates things at sufficiently large scales. I don’t want to stretch the metaphor too far, but something similar holds true of business.
On the scale of days or weeks, culture, politics and war matter a lot more in shaping our daily lives. But those forces fundamentally cancel out over longer periods. They are mostly noise, historically speaking. They don’t cause creative-destructive, unidirectional change (whether or not you think of that change as “progress” is a different matter).
Business though, as an expression of the force of unidirectional technological evolution, has a destabilizing unidirectional effect. It is technology, acting through business and Schumpeterian creative-destruction, that drives monotonic, historicist change, for good or bad. Business is the locus where the non-human force of technological change sneaks into the human sphere.
Of course, there is arguably some progress on all four fronts. You could say that Shakespeare represents progress with respect to Aeschylus, and Tom Stoppard with respect to Shakespeare. You could say Obama understands politics in ways that say, Hammurabi did not. You could say that General Petraeus thinks of the problems of military strategy in ways that Genghis Khan did not. But all these are decidedly weak claims.
On the other hand the proposition that Facebook (the corporation) is in some ways a beast entirely beyond the comprehension of an ancient Silk Road trader seems vastly more solid. And this is entirely a function of the intimate relationship between business and technology. Culture is suspicious of technology. Politics is mostly indifferent to and above it. War-making uses it, but maintains an arms-length separation. Business? It gets into bed with it. It is sort of vaguely plausible that you could switch artists, politicians and generals around with their peers from another age and still expect them to function. But there is no meaningful way for a businessman from (say) 2000 BC to comprehend what Mark Zuckerberg does, let alone take over for him. Too much magical technological water has flowed under the bridge.
Arthur C. Clarke once said that any sufficiently advanced technology is indistinguishable from magic, but technology (and science) aren’t what create the visible magic. Most of the magic never leaves journal papers or discarded engineering prototypes. It is business that creates the world of magic, not technology itself. And the story of business in the last 400 years is the story of the corporate form.
There are some who treat corporate forms as yet another technology (in this case a technology of people-management), but despite the trappings of scientific foundations (usually in psychology) and engineering synthesis (we speak of organizational “design”), the corporate form is not a technology. It is the consequence of a social contract like the one that anchors nationhood. It is a codified bundle of quasi-religious beliefs externalized into an animate form that seeks to preserve itself like any other living creature.
The Corporate View of history: 1600 – 2100
We are not used to viewing world history through the perspective of the corporation for the very good reason that corporations are a recent invention, and instances that had the ability to transform the world in magical ways did not really exist till the EIC was born. Businesses, of course, have been around for a while. The oldest continuously surviving business, until recently, was Kongo Gumi, a Japanese temple construction business founded in 584 AD, which finally closed its doors in 2009. Guilds and banks have existed since the 16th century. Trading merchants, who raised capital to fund individual ships or voyages, often with some royal patronage, were also not a new phenomenon. What was new was the idea of a publicly traded joint-stock corporation, an entity with rights similar to those of states and individuals, with limited liability and significant autonomy (even in its earliest days, when corporations were formed for defined periods of time by royal charter).
This idea morphed a lot as it evolved (most significantly in the aftermath of the East India Bubble), but it retained a recognizable DNA throughout. Many authors such as Gary Hamel (The Future of Management), Tom Malone (The Future of Work) and Don Tapscott (Wikinomics) have talked about how the traditional corporate form is getting obsolete. But in digging around, I found to my surprise that nobody has actually attempted to meaningfully represent the birth-to-obsolescence evolution of the idea of the corporation.
Here is my first stab at it (I am working on a much more detailed, data-driven timeline as a side project):
To understand history — world history in the fullest sense, not just economic history — from this perspective, you need to understand two important points about this evolution of corporations.
The Smithian/Schumpeterian Divide
The first point is that the corporate form was born in the era of Mercantilism, the economic ideology that (zero-sum) control of land is the foundation of all economic power.
In politics, Mercantilism led to balance-of-power models. In business, once the Age of Exploration (the 16th century) opened up the world, it led to mercantilist corporations focused on trade (if land is the source of all economic power, the only way to grow value faster than your land holdings permit, is to trade on advantageous terms).
The forces of radical technological change — the Industrial Revolution — did not seriously kick in until after nearly 200 years of corporate evolution (1600-1800) in a mercantilist mold. Mercantilist models of economic growth map to what Joel Mokyr calls Smithian Growth, after Adam Smith. It is worth noting here that Adam Smith published The Wealth of Nations in 1776, strongly influenced by his reading of the events surrounding the bursting of the East India Bubble in 1772 and debates in Parliament about its mismanagement. Smith was both the prophet of doom for the Mercantilist corporation, and the herald of what came to replace it: the Schumpeterian corporation. Mokyr characterizes the growth created by the latter as Schumpeterian growth.
The corporate form therefore spent almost 200 years — nearly half of its life to date — being shaped by Mercantilist thinking, a fundamentally zero-sum way of viewing the world. It is easy to underestimate the impact of this early life since the physical form of modern corporations looks so different. But to the extent that organizational forms represent externalized mental models, codified concepts and structure-following-strategy (as Alfred Chandler eloquently put it), the corporate form contains the inertia of that early formative stage.
In fact, in terms of the two functions that Drucker considered the only essential ones in business, marketing and innovation, the Mercantilist corporation lacked one. The archetypal Mercantilist corporation, the EIC, understood marketing intimately and managed demand and supply with extraordinary accuracy. But it did not innovate.
Innovation was the function grafted onto the corporate form by the possibility of Schumpeterian growth, but it would take nearly an entire additional century for the function to be properly absorbed into corporations. It was not until after the American Civil War and the Gilded Age that businesses fundamentally reorganized around time instead of space, which led, as we will see, to a central role for ideas and therefore the innovation function.
The Black Hills Gold Rush of the 1870s, the focus of the Deadwood saga, was in a way the last hurrah of Mercantilist thinking. William Randolph Hearst, the son of George Hearst, the gold mining mogul who took over Deadwood in the 1870s, made his name with newspapers. The baton had formally been passed from Mercantilists to Schumpeterians.
This divide between the two models can be placed at around 1800, the nominal start date of the Industrial Revolution, as the ideas of Renaissance Science met the energy of coal to create a cocktail that would allow corporations to colonize time.
Reach versus Power
The second thing to understand about the evolution of the corporation is that the apogee of power did not coincide with the apogee of reach. In the 1780s, only a small fraction of humanity was employed by corporations, but corporations were shaping the destinies of empires. In the centuries that followed the crash of 1772, the power of the corporation was curtailed significantly, but in terms of sheer reach, they continued to grow, until by around 1980, a significant fraction of humanity was effectively being governed by corporations.
I don’t have numbers for the whole world, but for America, less than 20% of the population had paycheck incomes in 1780, and over 80% in 1980, and the percentage has been declining since (I have cited these figures before; they are from Gareth Morgan’s Images of Organization and Dan Pink’s Free Agent Nation). Employment fraction is of course only one of the many dimensions of corporate power (which include economic, material, cultural, human and political forms of power), but this graph provides some sense of the numbers behind the rise and fall of the corporation as an idea.
It is tempting to analyze corporations in terms of a single measure of overall importance, but it pays to keep power distinct from what I call “reach.” Certainly corporations today seem far more powerful than those of the 1700s, but the point is that the form is much weaker today, even though it has organized more of our lives. This is roughly the same as the distinction between the fertility of women and population growth: the peak in fertility (a per-capita number) and the peak in population growth rates (an aggregate) behave differently.
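The fertility analogy can be made concrete with a toy calculation. The numbers below are invented purely for illustration, not historical data; the point is only that a declining per-capita rate can coexist with a still-rising aggregate:

```python
# Toy illustration of the fertility analogy: a per-capita rate (like
# fertility, or corporate "power") can peak earlier than the aggregate
# quantity it drives (like total population growth, or corporate "reach").
# The rates below are made up solely to illustrate the distinction.

rates = [0.050, 0.048, 0.045, 0.030, 0.010]  # per-capita growth rate; peaks in period 0
pop = 100.0
absolute_growth = []  # aggregate additions per period
for r in rates:
    absolute_growth.append(pop * r)
    pop += pop * r

print(rates.index(max(rates)))                      # prints 0: the rate peaks immediately
print(absolute_growth.index(max(absolute_growth)))  # prints 1: the aggregate peaks a period later
```

Because the base keeps growing, the aggregate keeps rising for a while even after the per-capita rate has begun to fall, which is exactly the power-versus-reach pattern described above.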
To make sense of the form, the divide between the Smithian and Schumpeterian growth epochs is much more useful than the dynamics of reach. This gives us a useful 3-phase model of the history of the corporation: the Mercantilist/Smithian era from 1600–1800, the Industrial/Schumpeterian era from 1800–2000 and finally, the era we are entering, which I will dub the Information/Coasean era. By a happy accident, there is a major economist whose ideas help fingerprint the economic contours of our world: Ronald Coase.
This post is mainly about the two historical phases, and is in a sense a macro-prequel to the ideas I normally write about, which are more individual-focused and future-oriented.
I: Smithian Growth and the Mercantilist Economy (1600 – 1800)
The story of the old corporation and the sea
It is difficult for us in 2011, with Walmart and Facebook as examples of corporations that significantly control our lives, to understand the sheer power the East India Company exercised during its heyday, power that makes even the most out-of-control of today’s corporations seem tame by comparison. To a large extent, the history of the first 200 years of corporate evolution is the history of the East India Company. And despite its name and nation of origin, to think of it as a corporation that helped Britain rule India is to entirely misunderstand the nature of the beast.
Two images hint at its actual globe-straddling, 10x-Walmart influence: the image of the Boston Tea Partiers dumping crates of tea into the sea during the American struggle for independence, and the image of smoky opium dens in China. One image symbolizes the rise of a new empire. The other marks the decline of an old one.
The East India Company supplied both the tea and the opium.
At a broader level, the EIC managed to balance an unbalanced trade equation between Europe and Asia whose solution had eluded even the Roman empire. Massive flows of gold and silver from Europe to Asia via the Silk and Spice routes had been a given in world trade for several thousand years. Asia simply had far more to sell than it wanted to buy. Until the EIC came along.
A very rough sketch of how the EIC solved the equation reveals the structure of value-addition in the mercantilist world economy.
The EIC started out by buying textiles from Bengal and tea from China in exchange for gold and silver.
Then it realized it was playing the same sucker game that had trapped and helped bankrupt Rome.
Next, it figured out that it could take control of the opium industry in Bengal, trade opium for tea in China with a significant surplus, and use the money to buy the textiles it needed in Bengal. Guns would be needed.
As a bonus, along with its partners, it participated in yet another clever trade: textiles for slaves along the coast of Africa, who could be sold in America for gold and silver.
For this scheme to work, three foreground things and one background thing had to happen: the corporation had to effectively take over Bengal (and eventually all of India), Hong Kong (and eventually, all of China, indirectly) and England. Robert Clive achieved the first goal by 1757. An employee of the EIC, William Jardine, founded what is today Jardine Matheson, the spinoff corporation most associated with Hong Kong and the historic opium trade. It was, in its early history, what we would call today a narco-terrorist corporation; the Taliban today are kindergarteners in that game by comparison. And while the corporation never actually took control of the British Crown, it came close several times, by financing the government during its many troubles.
The background development was simpler. England had to take over the oceans and ensure the safe operations of the EIC.
Just how comprehensively did the EIC control the affairs of states? Bengal is an excellent example. In the 1600s and the first half of the 1700s, before the Industrial Revolution, Bengali textiles were the dominant note in the giant sucking sound drawing away European wealth (which was flowing from the mines and farms of the Americas). The European market, once the EIC had shoved the Dutch VOC aside, constantly demanded more and more of an increasing variety of textiles, ignoring the complaining of its own weavers. Initially, the company did no more than battle the Dutch and Portuguese on water, and negotiate agreements to set up trading posts on land. For a while, it played by the rules of the Mughal empire and its intricate system of economic control based on various imperial decrees and permissions. The Mughal system kept the business world firmly subservient to the political class, and ensured a level playing field for all traders. Bengal in the 17th and 18th centuries was a cheerful drama of Turks, Arabs, Armenians, Indians, Chinese and Europeans. Trade in the key commodities, textiles, opium, saltpeter and betel nuts, was carefully managed to keep the empire on top.
But eventually, as the threat from the Dutch was tamed, it became clear that the company actually had more firepower at its disposal than most of the nation-states it was dealing with. The realization led to the first big domino falling, in the corporate colonization of India, at the Battle of Plassey. Robert Clive along with Indian co-conspirators managed to take over Bengal, appoint a puppet Nawab, and get himself appointed as the Mughal diwan (finance minister/treasurer) of the province of Bengal, charged with tax collection and economic administration on behalf of the weakened Mughals, who were busy destroying their empire. Even people who are familiar enough with world history to recognize the name Robert Clive rarely understand the extent to which this was the act of a single sociopath within a dangerously unregulated corporation, rather than of the country it was nominally subservient to (England).
This history doesn’t really stand out in sharp relief until you contrast it with the behavior of modern corporations. Today, we listen with shock to rumors about the backroom influence of corporations like Halliburton or BP, and politicians being in bed with the business leaders in the Too-Big-to-Fail companies they are supposed to regulate.
The EIC was the original too-big-to-fail corporation. The EIC was the beneficiary of the original Big Bailout. Before there was TARP, there was the Tea Act of 1773 and Pitt’s India Act of 1784. The former was a failed attempt to rein in the EIC, which cost Britain the American Colonies. The latter created the British Raj as Britain doubled down in the east to recover from its losses in the west. An invisible thread connects the histories of India and America at this point. Lord Cornwallis, the loser at the Siege of Yorktown in 1781 during the Revolutionary War, became the second Governor General of India in 1786.
But these events were set in motion over 30 years earlier, in the 1750s. There was no need for backroom subterfuge. It was all out in the open: the corporation was such a new beast that nobody really understood the dangers it represented. The EIC maintained an army. Its merchant ships often carried vastly more firepower than the naval ships of lesser nations. Its officers were not only not prevented from making money on the side; private trade was actually a perk of employment (it was exactly this perk that allowed William Jardine to start a rival business that took over the China trade in the EIC’s old age). And finally — the cherry on the sundae — there was nothing preventing officers like Clive from simultaneously holding political appointments that legitimized conflicts of interest. If you thought it was bad enough that Dick Cheney used to work for Halliburton before he took office, imagine if he’d worked there while in office, with legitimate authority to use his government power to favor his corporate employer, make as much money on the side as he wanted, and call in the Army and Navy to enforce his will. That picture gives you an idea of the position Robert Clive found himself in, in 1757.
He made out like a bandit. A full 150 years before American corporate barons earned the appellation “robber.”
In the aftermath of Plassey, in his dual position of Mughal diwan of Bengal and representative of the EIC with permission to make money for himself and the company, and the armed power to enforce his will, Clive did exactly what you’d expect an unprincipled and enterprising adventurer to do. He killed the golden goose. He squeezed the Bengal textile industry dry for profits, destroying its sustainability. A bubble in London and a famine in Bengal later, the industry collapsed under the pressure (Bengali economist Amartya Sen would make his bones and win the Nobel two centuries later, studying such famines). With industrialization and machine-made textiles taking over in a few decades, the economy had been destroyed. But by that time the EIC had already moved on to the next opportunities for predatory trade: opium and tea.
The East India bubble was a turning point. Thanks to a rare moment of the Crown being more powerful than the company during the bust, the bailout and regulation that came in the aftermath of the bubble fundamentally altered the structure of the EIC and the power relations between it and the state. Over the next 70 years, political, military and economic power were gradually separated and modern checks and balances against corporate excess came into being.
The whole intricate story of the corporate takeover of Bengal is told in detail in Robins’ book. The Battle of Plassey is actually almost irrelevant; most of the action was in the intrigue that led up to it, and followed. Even if you have some familiarity with Indian and British history during that period, chances are you’ve never drilled down into the intricate details. It has all the elements of a great movie: there is deceit, forgery of contracts, licensing frauds, murder, double-crossing, arm-twisting and everything else you could hope for in a juicy business story.
As an enabling mechanism, Britain had to rule the seas, comprehensively shut out the Dutch, keep France, the Habsburgs, the Ottomans (and later Russia) occupied on land, and have enough firepower left over to protect the EIC’s operations when the EIC’s own guns did not suffice. It is not too much of a stretch to say that for at least a century and a half, England’s foreign policy was a dance in Europe in service of the EIC’s needs on the oceans. That story, with much of the action in Europe, but most of the important consequences in America and Asia, is told in Mahan’s book. (Though boats were likely invented before the wheel, the huge influence of sea power upon history was, surprisingly, not generally recognized until Mahan wrote his classic. The book is deep and dense. It’s worth reading just for the story of how Rome defeated Carthage through invisible negative-space non-action on the seas by the Roman Navy. I won’t dive into the details here, except to note that Mahan’s book is the essential lens you need to understand the peculiar military conditions in the 17th and 18th centuries that made the birth of the corporation possible.)
To read both books is to experience a process of enlightenment. An illegible period of world history suddenly becomes legible. The broad sweep of world history between 1500 and 1800 (roughly between the decline of Islam and the rise of the British Empire) makes no real sense except through the story of the EIC and corporate mercantilism in general.
The short version is as follows.
Constantinople fell to the Ottomans in 1453 and the last Muslim ruler was thrown out of Spain in 1492, the year Columbus sailed the ocean blue. Vasco da Gama found a sea route to India in 1498. The three events together caused a defensive consolidation of Islam under the later Ottomans, and an economic undermining of the Islamic world (a process that would directly lead to the radicalization of Islam under the influence of religious leaders like Abd al-Wahhab (1703-1792)).
The 16th century makes a vague sort of sense as the “Age of Exploration,” but it really makes a lot more sense as the startup/first-mover/early-adopter phase of corporate mercantilism. The period was dominated by the daring pioneer spirit of Spain and Portugal, which together served as the Silicon Valley of Mercantilism. But the maritime business operations of Spain and Portugal turned out to be the MySpace and Friendster of Mercantilism: pioneers who could not capitalize on their early lead.
Conventionally, it is understood that the British and the Dutch were the ones who truly took over. But in reality, it was two corporations that took over: the EIC and the VOC (the Dutch East India Company, Vereenigde Oost-Indische Compagnie, founded two years after the EIC), the Facebook and LinkedIn of Mercantile economics respectively. Both were fundamentally more independent of the nation states that had given birth to them than any business entities in history. The EIC more so than the VOC. Both eventually became complex multi-national beasts.
A lot of other stuff happened between 1600 and 1800. The names from world history are familiar ones: Elizabeth I, Louis XIV, Akbar, the Qing emperors (the dynasty is better known than individual emperors) and the American Founding Fathers. The events that come to mind are political ones: the founding of America, the English Civil War, the rise of the Ottomans and Mughals.
The important names in the history of the EIC are less well-known: Josiah Child, Robert Clive, Warren Hastings. The events, like Plassey, seem like sideshows on the margins of land-based empires.
The British Empire lives on in memories, museums and grand monuments in two countries. Company Raj is largely forgotten. The Leadenhall docks in London, the heart of the action, have disappeared today under new construction.
But arguably, the doings of the EIC and VOC on the water were more important than the pageantry on land. Today the invisible web of container shipping serves as the bloodstream of the world. Its foundations were laid by the EIC.
For nearly two centuries they ruled unchallenged, until finally the nations woke up to their corporate enemies on the water. With the reining in and gradual decline of the EIC between 1780 and 1857, the war between the next generation of corporations and nations moved to a new domain: the world of time.
The last phase of Mercantilism eventually came to an end by the 1850s, as events ranging from the First War of Independence in India (known in Britain as the Sepoy Mutiny) to the first Opium War and Commodore Perry prying Japan open signaled the end of the Mercantilist corporation worldwide. The EIC wound up its operations in 1874. But the Mercantilist corporation had died many decades before that as an idea. A new idea began to take its place in the early 19th century: the Schumpeterian corporation that controlled, not trade routes, but time. It added the second of the two essential Druckerian functions to the corporation: innovation.
II. Schumpeterian Growth and the Industrial Economy (1800 – 2000)
The colonization of time and the apparently endless frontier
To understand what changed in 1800, consider this extremely misleading table about GDP shares of different countries between 1600 and 1870. There are many roughly similar versions floating around in globalization debates, and the numbers are usually used gleefully to shock people who have no sense of history. I call this the “most misleading table in the world.”
Chinese and Indian jingoists, in particular, are prone to misreading this table as evidence that colonization “stole” wealth from Asia (the collapse of GDP share for China and India actually went much further, into the low single digits, in the 20th century). The claim of GDP theft is true if you use a zero-sum Mercantilist frame of reference (and it is true in a different sense of “steal” that this table does not show).
But the Mercantilist model was already sharply declining by 1800.
Something else was happening, and Fareed Zakaria, as far as I know, is the only major commentator to read this sort of table correctly, in The Post-American World. He notes that what matters is not absolute totals, but per-capita productivity.
We get a much clearer picture of the real standing of countries if we consider economic growth and GDP per capita. Western Europe GDP per capita was higher than that of both China and India by 1500; by 1600 it was 50% higher than China’s. From there, the gap kept growing. Between 1350 and 1950 — six hundred years — GDP per capita remained roughly constant in India and China (hovering around $600 for China and $550 for India). In the same period, Western European GDP per capita went from $662 to $4,594, a 594 percent increase.
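Zakaria’s numbers are easy to sanity-check with back-of-the-envelope arithmetic (a sketch only; the dollar figures and the 600-year window are his):

```python
# Zakaria's figures: Western European GDP per capita, 1350 vs. 1950.
start, end, years = 662.0, 4594.0, 600

pct_increase = (end - start) / start * 100       # total increase, ~594%
annual_rate = (end / start) ** (1 / years) - 1   # implied compound annual rate

# The striking part is not the 594% total but the implied compound
# rate: roughly 0.32% per year, glacial by modern standards.
```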
Sure, corporations and nations may have been running on Mercantilist logic, but the undercurrent of Schumpeterian growth was taking off in Europe as early as 1500 in the less organized sectors like agriculture. It was only formally recognized and tamed in the early 1800s, but the technology genie had escaped.
The action shifted to two huge wildcards in world affairs of the 1800s: the newly-born nation of America and the awakening giant in the east, Russia. Per capita productivity is about efficient use of human time. But time, unlike space, is not a collective and objective dimension of human experience. It is a private and subjective one. Two people cannot own the same piece of land, but they can own the same piece of time. To own space, you control it by force of arms. To own time is to own attention. To own attention, it must first be freed up, one individual stream of consciousness at a time.
The Schumpeterian corporation was about colonizing individual minds. Ideas powered by essentially limitless fossil-fuel energy allowed it to actually pull it off.
By the mid 1800s, as the EIC and its peers declined, the battle seemingly shifted back to land, especially in the run-up to and aftermath of, the American Civil War. I haven’t made complete sense of the Russian half of the story, but that peaked later and ultimately proved less important than the American half, so it is probably reaosonably safe to treat the story of Schumpeterian growth as an essentially American story.
If the EIC was the archetype of the Mercantilist era, the Pennsylvania Railroad company was probably the best archetype for the Schumpeterian corporation. Modern corporate management as well Soviet forms of statist governance can be traced back to it. In many ways the railroads solved a vastly speeded up version of the problem solved by the EIC: complex coordination across a large area. Unlike the EIC though, the railroads were built around the telegraph, rather than postal mail, as the communication system. The difference was like the difference between the nervous systems of invertebrates and vertebrates.
If the ship sailing the Indian Ocean ferrying tea, textiles, opium and spices was the star of the mercantilist era, the steam engine and steamboat opening up America were the stars of the Schumpeterian era. Almost everybody misunderstood what was happening. Traveling up and down the Mississippi, the steamboat seemed to be opening up the American interior. Traveling across the breadth of America, the railroad seemed to be opening up the wealth of the West, and the great possibilities of the Pacific Ocean.
Those were side effects. The primary effect of steam was not that it helped colonize a new land, but that it started the colonization of time. First, social time was colonized. The anarchy of time zones across the vast expanse of America was first tamed by the railroads for the narrow purpose of maintaining train schedules, but ultimately, the tools that served to coordinate train schedules: the mechanical clock and time zones, served to colonize human minds. An exhibit I saw recently at the Union Pacific Railroad Museum in Omaha clearly illustrates this crucial fragment of history:
The steam engine was a fundamentally different beast than the sailing ship. For all its sophistication, the technology of sail was mostly a very-refined craft, not an engineering discipline based on science. You can trace a relatively continuous line of development, with relatively few new scientific or mathematical ideas, from early Roman galleys, Arab dhows and Chinese junks, all the way to the amazing Tea Clippers of the mid 19th century (Mokyr sketches out the story well, as does Mahan, in more detail).
Steam power though was a scientific and engineering invention. Sailing ships were the crowning achievements of the age of craft guilds. Steam engines created, and were created by engineers, marketers and business owners working together with (significantly disempowered) craftsmen in genuinely industrial modes of production. Scientific principles about gases, heat, thermodynamics and energy applied to practical ends, resulting in new artifacts. The disempowerment of craftsmen would continue through the Schumpeterian age, until Fredrick Taylor found ways to completely strip mine all craft out of the minds of craftsmen, and put it into machines and the minds of managers. It sounds awful when I put it that way, and it was, in human terms, but there is no denying that the process was mostly inevitable and that the result was vastly better products.
The Schumpeterian corporation did to business what the doctrine of Blitzkrieg would do to warfare in 1939: move humans at the speed of technology instead of moving technology at the speed of humans. Steam power used the coal trust fund (and later, oil) to fundamentally speed up human events and decouple them from the constraints of limited forms of energy such as the wind or human muscles. Blitzkrieg allowed armies to roar ahead at 30-40 miles per hour instead of marching at 5 miles per hour. Blitzeconomics allowed the global economy to roar ahead at 8% annual growth rates instead of the theoretical 0% average across the world for Mercantilist zero-sum economics. “Progress” had begun.
The equation was simple: energy and ideas turned into products and services could be used to buy time. Specifically, energy and ideas could be used to shrink autonomously-owned individual time and grow a space of corporate-owned time, to be divided between production and consumption. Two phrases were invented to name the phenomenon: productivity meant shrinking autonomously-owned time. Increased standard of living through time-saving devices became code for the fact that the “freed up” time through “labor saving” devices was actually the de facto property of corporations. It was a Faustian bargain.
Many people misunderstood the fundamental nature of Schumpeterian growth as being fueled by ideas rather than time. Ideas fueled by energy can free up time which can then partly be used to create more ideas to free up more time. It is a positive feedback cycle, but with a limit. The fundamental scarce resource is time. There is only one Earth worth of space to colonize. Only one fossil-fuel store of energy to dig out. Only 24 hours per person per day to turn into captive attention.
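This “positive feedback with a limit” is just logistic growth against a hard ceiling. A toy simulation (all parameters invented purely for illustration) shows both phases: early compounding, then saturation as the 24-hour cap bites:

```python
CAP = 24.0  # hours of capturable attention per person per day (the hard limit)
R = 0.5     # invented reinvestment rate: freed time -> new ideas -> more freed time

def step(captured):
    # Logistic update: growth is proportional to attention already captured,
    # damped by how close we are to the ceiling.
    return captured + R * captured * (1 - captured / CAP)

hours = 0.5  # start with a small pocket of captured attention
trajectory = []
for _ in range(30):
    hours = step(hours)
    trajectory.append(hours)
# Early rounds compound at nearly 50% per round; late rounds stall
# just under CAP, no matter how many more rounds you run.
```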
Among the people who got it wrong was my favorite visionary, Vannevar Bush, who talked of science as “the endless frontier.” To believe that there is an arguably limitless supply of valuable ideas waiting to be discovered is one thing. To argue that they constitute a limitless reserve of value for Schumpeterian growth to deliver is to misunderstand how ideas work: they are only valuable if attention is efficiently directed to the right places to discover them, and energy is used to turn them into businesses and Arthur C. Clarke magic.
It is fairly obvious that Schumpeterian growth has been fueled so far by reserves of fossil fuels. It is less obvious that it is also fueled by reserves of collectively-managed attention.
For two centuries, we burned coal and oil without a thought. Then suddenly, around 1980, Peak Oil seemed to loom menacingly closer.
For the same two centuries it seemed like time/attention reserves could be endlessly mined. New pockets of attention could always be discovered, colonized and turned into wealth.
Then the Internet happened, and we discovered the ability to mine time as fast as it could be discovered in hidden pockets of attention. And we discovered limits.
And suddenly a new peak started to loom: Peak Attention.
III. Coasean Growth and the Perspective Economy
Peak Attention and Alternative Attention Sources
I am not sure who first came up with the term Peak Attention, but the analogy to Peak Oil is surprisingly precise. It has its critics, but I think the model is basically correct.
Peak Oil refers to a graph of oil production with a maximum, Hubbert’s peak, that represents the moment of peak oil production. The theory behind it is that new oil reserves become harder to find over time, are smaller in size, and are harder to tap. You have to look harder and work harder for every new gallon, new wells run dry faster than old ones, and the frequency of discovery goes down. You have to drill more.
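Hubbert’s model is the derivative of a logistic curve: cumulative extraction follows an S-curve, so the production rate traces a symmetric bell that peaks once and then declines. A minimal sketch (units and parameter values invented for illustration):

```python
import math

def hubbert_production(t, q_max=2000.0, k=0.05, t_peak=1970.0):
    """Production rate under Hubbert's logistic model.

    q_max  : ultimately recoverable resource (invented units)
    k      : logistic steepness
    t_peak : year production peaks
    """
    e = math.exp(-k * (t - t_peak))
    return k * q_max * e / (1.0 + e) ** 2

# Production climbs to a single maximum at t_peak, then declines
# symmetrically: every barrel after the peak is harder-won.
```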
There is certainly plenty of energy all around (the Sun and the wind, to name two sources), but oil represents a particularly high-value kind.
Attention behaves the same way. Take an average housewife, the target of much time mining early in the 20th century. It was clear where her attention was directed. Laundry, cooking, walking to the well for water, cleaning, were all obvious attention sinks. Washing machines, kitchen appliances, plumbing and vacuum cleaners helped free up a lot of that attention, which was then immediately directed (as corporate-captive attention) to magazines and television.
But as you find and capture most of the wild attention, new pockets of attention become harder to find. Worse, you now have to cannibalize your own previous uses of captive attention. Time for TV must be stolen from magazines and newspapers. Time for specialized entertainment must be stolen from time devoted to generalized entertainment.
Sure, there is an equivalent to the Sun in the picture. Just ask anyone who has tried mindfulness meditation, and you’ll understand why the limits to attention (and therefore the value of time) are far further out than we think.
The point isn’t that we are running out of attention. We are running out of the equivalent of oil: high-energy-concentration pockets of easily mined fuel.
The result is a spectacular kind of bubble-and-bust.
Each new pocket of attention is harder to find: maybe your product needs to steal attention from that one obscure TV show watched by just 3% of the population between 11:30 PM and 12:30 AM. The next displacement will fragment the attention even more. When found, each new pocket is less valuable. There is a lot more money to be made in replacing hand-washing time with washing-machine-plus-magazine time than there is to be found in replacing one hour of TV with a different hour of TV.
What’s more, due to the increasingly frantic zero-sum competition over attention, each new “well” of attention runs out sooner. We know this idea as shorter product lifespans.
So one effect of Peak Attention is that every human mind has been mined to capacity using attention-oil drilling technologies. To get to Clay Shirky’s hypothetical notion of cognitive surplus, we need Alternative Attention sources.
To put it in terms of per-capita productivity gains, we hit a plateau.
We can now connect the dots to Zakaria’s reading of global GDP trends, and explain why the action is shifting back to Asia, after being dominated by Europe for 600 years.
Europe may have increased per capita productivity 594% in 600 years, while China and India stayed where they were, but Europe has been slowing down and Asia has been catching up. When Asia hits Peak Attention (America is already past it, I believe), absolute size, rather than big productivity differentials, will again define the game, and the center of gravity of economic activity will shift to Asia.
If you think that’s a long way off, you are probably thinking in terms of living standards rather than attention and energy. In those terms, sure, China and India have a long way to go before catching up with even Southeast Asia. But standard of living is the wrong variable. It is a derived variable, a function of available energy and attention supply. China and India will never catch up (though Western standards of living will decline), but Peak Attention will hit both countries nevertheless. Within the next 10 years or so.
What happens as the action shifts? Kaplan’s Monsoon frames the future in possibly the most effective way. Once again, it is the oceans, rather than land, that will become the theater for the next act of the human drama. While American lifestyle designers are fleeing to Bali, much bigger things are afoot in the region.
And when that shift happens, the Schumpeterian corporation, the oil rig of human attention, will start to decline at an accelerating rate. Lifestyle businesses and other oddball contraptions — the solar panels and wind farms of attention economics — will start to take over.
It will be the dawn of the age of Coasean growth.
Adam Smith’s fundamental ideas helped explain the mechanics of Mercantile economics and the colonization of space.
Joseph Schumpeter’s ideas helped extend Smith’s ideas to cover Industrial economics and the colonization of time.
Ronald Coase turned 100 in 2010. He is best known for his work on transaction costs, social costs and the nature of the firm. Where most classical economists have had little to say about the corporate form, for Coase it has been the main focus of his life.
Without realizing it, the hundreds of entrepreneurs, startup-studios and incubators, 4-hour-work-weekers and lifestyle designers around the world, experimenting with novel business structures and the attention mining technologies of social media, are collectively triggering the age of Coasean growth.
Coasean growth is not measured in terms of national GDP growth. That’s a Smithian/Mercantilist measure of growth.
It is also not measured in terms of 8% returns on the global stock market. That is a Schumpeterian growth measure. For that model of growth to continue would be a case of civilizational cancer (“growth for the sake of growth is the ideology of the cancer cell” as Edward Abbey put it).
Coasean growth is fundamentally not measured in aggregate terms at all. It is measured in individual terms. An individual’s income and productivity may both actually decline, with net growth in a Coasean sense.
How do we measure Coasean growth? I have no idea. I am open to suggestions. All I know is that the metric will need to be hyper-personalized and relative to individuals rather than countries, corporations or the global economy. There will be a meaningful notion of Venkat’s rate of Coasean growth, but no equivalent for larger entities.
The fundamental scarce resource that Coasean growth discovers and colonizes is neither space, nor time. It is perspective.
The bad news: it too is a scarce resource that can be mined to a Peak Perspective situation.
The good news: you will likely need to colonize your own unclaimed perspective territory. No collectivist business machinery will really be able to mine it out of you.
Those are stories for another day. Stay tuned.
Note #1: This post weighs in at over 7000 words and is a new record for me.
Note #2: I hope those of you who have read Tempo got about 34.2% more value out of this post.
Note #3: Yeah, I am opening up a new blogging battlefront, after nearly two years of pussyfooting around geopolitics and globalization via things like container shipping and garbage. Frankly, I’ve been meaning to for a while, but simply wasn’t ready.