Abstract
Understanding why explicit, managerial-centred corporate social responsibility (CSR), which first developed in the United States, has recently spread globally requires an examination of the circumstances under which it first emerged. CSR arose out of the success of large American corporations of the early 20th century in preventing unionization or significant regulation of their workplaces, a success that required firms to assume responsibility for employee welfare, not only to promote efficiency and prevent a resurgence of labour activism, but also to reassure traditionalists concerned about new white collar employment relations. Corporate executives discovered that positive publicity with regard to personnel policies helped them manage other controversies, so after World War I, some began to extend claims of being responsible employers into arguments for a more comprehensive set of social responsibilities. The onset of the Depression and its aftermath reduced the significance of managerial voluntarism for almost half a century, but the rise of neo-liberalism has revived managerial CSR, although once again it may not survive a global crisis.
In discussing the contemporary rhetorical embrace of corporate social responsibility (CSR) on the part of major corporations, Hanlon and Fleming (2009) point to the emergence of a new regime of global capitalism that has largely supplanted an older post-war ‘Fordist’ regime characterized by relatively high wages, social welfare, Keynesian stimulation and government regulation. In Hanlon and Fleming’s view, CSR has won managerial legitimacy in recent years because ‘the state has freed up the firm to be more at the heart of institutional life and the social structure. The firm now has more power to reshape society but in order to do so, it needs legitimacy in the eyes of the population.’ One potential source of such legitimacy would be for corporations to establish themselves ‘as the main source of security in the post-Fordist society even though many would see it as a source of instability’ (Hanlon and Fleming, 2009: 944). Kinderman’s (2008, 2010) research supports their argument, demonstrating that corporate leaders, first in the UK and more recently in Germany, embraced CSR as a direct response to, and argument for, reduced regulatory restraints and mandatory obligations.
Hanlon and Fleming’s argument can be generalized even further, and applies equally well to that seminal event in the history of corporate capitalism, the rise of the large and highly autonomous American industrial corporation. The professed assumption of social responsibilities on the part of corporate executives emerged in the US in the early decades of the 20th century because the new large corporations, facing few legal or institutional constraints, required legitimacy to continue to maintain and extend their size and autonomy. One can find, between the Great Merger Movement of the 1890s and the arrival of the New Deal, American counterparts to virtually all of the items on Hanlon’s list of contemporary neoliberal trends: ‘the increased dominance of finance, the rise of the institutional investor, the decline of collective bargaining, the increasing growth of the non-standard working “career”, the retrenchment and commoditization of large parts of the welfare state, increased polarization of income inequality, etc.’ (2008: 164). A century ago, American society featured powerful institutional investors, a new form of careerism, a dearth of collective bargaining and social welfare, and increasing inequality. If it is only recently that the American-style ‘explicit’ managerial version of CSR has won increasing acceptance at the expense of the more ‘implicit’ version traditionally favoured in Europe and elsewhere (Matten and Moon, 2008), it is largely because only in the last two decades or so have managers of multinational corporations operated within an environment that shares important similarities with what their American counterparts experienced during the first three decades of the 20th century (Kinderman, 2008, 2010).
And while American managerial autonomy was eventually reduced by the changes in the economic and political environments that began with the New Deal, these changes never went as far as their European or East Asian equivalents in constraining management, and certainly not far enough to extinguish the legitimating role in the US of voluntarily embracing CSR (Marens, 2010).
If one factor does distinguish the seminal American version of CSR from its global counterpart a century later, it is the centrality of the ‘labour question’ to the former. This is not to say that contemporary discussions of CSR entirely ignore the treatment of individual employees, relations with unions or the practice of offshoring. These discussions, however, tend to be more abstract and less crucial to modern formulations of CSR compared to their American counterparts of a century ago because firms today are generally less dependent on regional or even national labour markets. While employees of both eras share a relative political and legal weakness in contrast to the intervening decades, the original American corporate giants, without access to such contemporary expediencies as runaway shops, global supply chains or outsourced services, remained largely dependent on geographically constrained labour markets. Local workers, even newly-arrived immigrants, had to be willing to show up and perform adequately. Thus, for business in the early 20th century United States, ‘responsibility’ meant, first and foremost, being viewed as a responsible employer. Having outmaneuvered organized labour and blocked or defanged virtually all governmental attempts at workplace regulation, the leaders of the new corporate giants then bore the responsibility of managing their workplace power in the interest of productivity, labour peace and, ultimately, social legitimacy. This employer responsibility extended to the treatment of the emerging armies of white collar workers, who, while not likely to unionize or strike, still needed to be managed efficiently, and whose fates were of special concern to traditionalist critics of the new corporate order.
Furthermore, the employment relationship was regarded at the time as so central to the social role of business that perceptions of how well firms treated their employees helped shape consumer and political judgements with regard to other areas of corporate responsibility, ranging from product safety to market dominance.
This article will review the process by which American employment relations gave birth to explicit CSR in four parts. First, I discuss how, unlike its industrial rivals elsewhere, American corporate management was able to preserve its autonomy, blocking virtually all efforts to unionize its workplaces or regulate its relations with employees. Second, I investigate the responsibilities that accrued to corporate management once it had achieved formal control over the employment relationship. Third, I show how voluntary initiatives on the part of management to assume, or claim to assume, responsibility for the welfare of employees formed the basis of the more generalized constructions of corporate social responsibility that emerged between the world wars and survived in somewhat modified form through a post-war generation of American pluralism. The article concludes by suggesting that a contemporary era of global capitalism that in many important respects matches the American corporate environment of a century ago has quite logically promoted a similar formulation of CSR, with the important exception of its relative neglect of the employment relationship, a neglect that may not prove sustainable.
Winning the war
To ravage, to slaughter, to usurp under false titles, they call empire; and where they make a desert, they call it peace. – Chief Calgacus of Britain, on the eve of battle against the Romans.
If one read only the literature related to CSR published over the last 60 years, it would appear to be a topic of largely academic interest, with only the occasional contribution from one or another business leader or political figure (Carroll, 1999; Heald, 1970; Marens, 2010). Corporate executives, however, did not need to wait for college professors to begin taking serious notice of their responsibilities, and before World War II, the situation was reversed: executives and other non-academics conducted most of the discussion (Heald, 1970). Even Adolf Berle, the academic most closely associated with this discussion before the New Deal, had been a corporate lawyer for a dozen years, and only joined the Columbia University faculty for the opportunity it gave him to conduct research into the modern American corporation (Schwarz, 1987). By the time Berle and Means’s The Modern Corporation and Private Property (1932) appeared at the close of an era of maximum corporate autonomy, executives, as well as their allies and critics, had already spent decades discussing how voluntarily assuming social responsibilities might win the new corporate giants less suspicion and more appreciation.
In general, the corporate leadership of the United States found itself in an enviable position by the beginning of World War I. The US far exceeded the rest of the world in the number of large, geographically dispersed firms (Schmitz, 1995), heirs to the technological, managerial, infrastructural and investment legacies of the American railroad system, and beneficiaries of the enormous market and demand for inputs that the railroad had generated (Chandler, 1977; Roy, 1995; Standiford, 2005). While giant firms existed in Britain, Germany and elsewhere, nowhere had their management enjoyed American-level autonomy. The US, a decentralized polity that had only recently depopulated and repopulated its land mass, did not evolve out of the absolutist traditions of more stable industrializing societies, in which long-standing corporatist bodies and patronage networks constrained entrepreneurship (Gerstenberger, 2005; Haldon, 1993), a reality captured by one American diplomat, who dismissed German industrial cartels as merely a new form of guild (Keller, 1980). Even Britain, an intermediate case in terms of the institutional constraints upon business (Vogel, 1996), in which royal absolutist pretensions were ended in 1688 by invasion by the most advanced capitalist state of that time, still managed a Polanyist ‘double-movement’ against the excesses of industrialization, led by (still) influential aristocrats such as the Seventh Lord Shaftesbury.
Moreover, the new large, autonomous, and typically publicly-traded corporate giants were proving their efficacy on the world stage. By 1900, shortly after the great merger movement had consolidated so many corporate giants, the American manufacturing sector was exporting as much as Britain’s while also supplying a much larger domestic market. In a dramatic turnaround for an economy that a few decades previously had primarily exported agricultural commodities, the US became the leader in manufacturing the most ‘high-tech’ products of the time: locomotives, industrial machinery, electrical equipment, typewriters and farm machinery (Kirkland, 1961). The new large American industrial corporations also provided models for the rise of new giants in retail and other sectors (Galbraith, 1952). Internationally, many of the new industrial giants, rather than simply selling their wares to wholesalers abroad, set up subsidiaries for marketing and even educating potential customers. Europeans in the early decades of the century complained of the American menace (Kirkland, 1961), whose commercial penetration extended to licensing, establishing subsidiaries and consulting services, a trend accelerated by the demand generated by the First World War (Forbes, 1917), and by the end of that war, the ‘Chandlerian’ corporate giant had become the leading institution in the emerging economic dominance of the US (Arrighi, 1994).
While protectionist measures and colony-based autarky may have blunted some of the impact of American industrial achievement upon the rest of the world, envy and awe of American manufacturing were an ever-present theme in foreign reactions to the growth of American industry. In 1901, a Hamburg newspaper complained that ‘it may be remarked that the typewriting machine with which this article is written, as well as the thousands, nay, hundreds of thousands of others that are in use throughout the world, was made in America, that it stands on an American table, in an office furnished with American desks, bookcases and chairs, which cannot be made in Europe of equal quality, so practical and convenient, for a similar price’ (Flint, 1901: 384–385). A quarter of a century later, an Australian delegation arrived in the US to study the roots of this success in American factories ‘as did delegations and observers from Europe, to see at first hand the productive base of America’s economic prosperity and international power, to learn the secret of a nation where it was claimed every worker would soon own an automobile, a radio, and a telephone’ (Gillespie, 1991: 7).
In achieving this hegemony, the large American corporations had managed to thwart innumerable efforts on the part of the American labour movement to challenge managerial autonomy in both workplaces and the political arena. The literature covering the countless, continuous and typically quixotic efforts to either organize the workers or regulate the workplaces of industrial America before the Great Depression is vast and rich (e.g. Bernstein, 1970; Kolko, 1963; Lambert, 2005; Lichtenstein, 2002; Montgomery, 1987; Nelson, 1974; Schatz, 1983; Weinstein, 1981). A full-scale summary of the findings is entirely beyond the scope of this article, but what stands out in all of these narratives is exactly how little was accomplished in terms of constraining the new corporate management before the New Deal. Organized labour was not without successes elsewhere within the American economy during the four decades preceding the market crash of 1929. Thousands of strikes per year, many not actually led by formal labour organizations, won concessions for workers in settings such as machine shops, construction sites, single-site factories, mines and even some railroad lines (Edwards, 1981; Lambert, 2005). Many of the craft-based unions that formed the core of the American Federation of Labor (AFL) in 1886 survived the era of single-digit unionization rates of the early 20th century. The American labour movement, however, was generally stopped at the gates of the new corporate giants (e.g. Brody, 1980; Montgomery, 1987), and as manufacturing became dependent on newer technologies, craft unionists who had held on to niches within manufacturing were often expelled through the same gate, most notoriously at Homestead (Standiford, 2005).
Long-time AFL President Gompers has been continuously criticized for his lack of aggressiveness in confronting these corporate giants, especially when he began hobnobbing with corporate executives at the National Civic Federation after the turn of the century (Livesay, 1978). Yet his cautiousness might have had some justification. After various union leaders rejected his advice and squandered rare opportunities at Pullman, NCR and American Sheet Steel, he may not have been entirely wrong to see no practical alternative to limiting himself to occasional calls for union-management cooperation (Jacoby, 1983; Mandel, 1963; Nelson, 1974). The AFL was so impotent in its dealings with the large-scale corporate world that in 1924, when Gerard Swope offered William Green, Gompers’s successor, the opportunity to organize General Electric, Green proved unable even to fashion a response (Bernstein, 1970).
Nor can the failure of American labour to confront the new economic reality be laid entirely at the door of the exclusivity and conservatism of craft unionism. If anything, Knights of Labor producerism, Wobbly syndicalism and American Railway Union inclusiveness achieved even less. Moreover, this nearly total failure to organize large companies or pass workplace regulation stands in sharp contrast to the experience of other industrial nations. In Germany, which also had a significant population of large firms, the Metallarbeiter-Verband, predecessor of IG Metall, organized 1.5 million members by 1928, while the Weimar Republic simultaneously established a parallel system of representation through works councils, which granted legal rights to employees unknown in the United States (Markovits, 1986).
A number of social and political factors contributed to the relative impotence of American labour. Socially, the United States possessed an exceptionally diverse working class even within the same regions, and their mutual distrust could be exploited by management. Even the American Railway Union, which precociously included a variety of skills and ethnic groups, discriminated on the basis of race (Lindsey, 1964). Nor were ethnic divisions the only ones more severe in America than elsewhere. Nativist workers viewed successive waves of immigrants as threats to their social position, while Catholic and Jewish trade unionists were wary of forming alliances with evangelical, nativist-tinged populists, despite a set of common grievances against corporations (Livesay, 1978; Sanders, 1999). If American craft unionists were more hostile toward the new industrial work (and the workers who performed it) than their counterparts elsewhere, this might have been because their own social position was less secure than in the ‘old world’, in which long-standing guilds were sometimes sufficiently entrenched that they could play a role in the training and regulation of the new industrial workforce (Thelen, 2004).
American corporations also possessed a unique political advantage in that they grew to enormous size in the highly decentralized political system of a constitutionally-limited democracy, which lacked a strong parliamentary centre that could be influenced or potentially even captured by labour-oriented parties (Pagano and Volpin, 2005; Vogel, 1996). There was no American equivalent during the early 20th century of a new and stunningly successful British Labour party whose threatening electoral successes protected the interests of industrialized workers in the political arena (Holt, 1977). Moreover, when political mobilization did succeed in generating American legislation—typically at the state level—that either encouraged unionization or regulated work, the American judiciary was likely to exercise its unusually high level of independence to overturn these efforts (Forbath, 1991; Hattam, 1993; Sanders, 1999). This is not to suggest that the American establishment was inherently more conservative than that in other nations, but, lacking an ‘old world’ aristocratic, military or ecclesiastic tradition that might institutionally support organized resistance to the new aggressive corporate leadership (Polanyi, 1957), American conservatism was more closely aligned with commercial interests even before the dominance of the large industrial corporation. As Veblen (1923: 429) observed, ‘It lies in the nature of the substantial citizen-official to let business interests coalesce with the national integrity in such a way as to make the safekeeping of business-as-usual the first and constant care of the official establishment’.
This is not to argue that American governments actually practiced laissez-faire, but to the extent they intervened, it was far more likely to be on the side of influential business leaders. American governments actually offered a variety of services to business enterprises from the very beginnings of the Republic through the Progressive Era (Kolko, 1963; Nock, 1937; Roy, 1995), and government procurement subsidized the development of a host of new products and stimulated the growth of such giants as Dupont, RCA and Procter and Gamble (Marens, 2008). The mild regulatory interventions of the Progressive Era often worked either to restore public confidence in a manner that favoured larger businesses, as with food inspection (Barkan, 1985), or to reduce legal liability, as with workers’ compensation (Weinstein, 1981). Libertarian essayist Albert Jay Nock argued, in response to business complaints over New Deal interference with economic freedom, ‘American business never followed a policy of laissez-faire, never wished to follow it, never wished the state to let it alone … When the factory system came in, those hordes of miserable beings were already there in full force; they were there because state intervention had expropriated them from the land’ (1937: 294, 299).
When it came to ensuring the autonomy of the large corporation, however, American government’s most direct contribution was to consistently take management’s side in labour disputes, threatening or even employing violence against workers. This favouritism sometimes surprised labour activists in the early years of the large corporation, partly because such violence over labour disputes had been rare before the railroad strikes of the 1870s (Forbath, 1991; Lambert, 2005). Yet, as early as 1893, the journalist Oren Taft (1893: 67) summarized the lesson of Homestead for the benefit of the mill workers bewildered by the willingness of the National Guard to take management’s side: ‘[A]ll the machinery of State stands ready to protect and further the interests of capital, while labour is left absolutely without law, a law unto itself, save when it commits some act, to be dealt with as a criminal’. When it came to protecting workers from management, American authorities provided a great deal less assistance, doing virtually nothing to protect workers from either firm-sponsored violence or the actions of such anti-union vigilantes as the Black Legion, which terrorized auto workers (Colby, 1984).
The increase in violence against labour during the last years of the 19th century was itself a product of the rise of large corporations, beginning with the railroads. The volatile swings in supply and demand that characterized the late 19th and early 20th centuries, themselves partly the by-product of the new corporations, were often handled ham-handedly by management with drastic pay-cuts and other provocations (Lindsey, 1964; Livingston, 1986). Furthermore, the managers of these larger firms exercised more political influence than the proprietors of the smaller shops that had previously dominated American manufacturing, and these same smaller shops were generally more dependent on skilled labour that could not be so readily alienated during a dispute. Finally, the general public itself might have become fearful of the new larger-scale employee actions, often connected to immigrant groups, who, according to one survey conducted in 1911, formed a majority of employees in many key industries, and were regarded as vaguely unsavory by a significant share of the population, often including some of their fellow immigrants (Kraut, 1994). John Rockefeller II (1916), shamed by the Ludlow massacre, suggested that the large size of the new enterprises was itself to blame for the decline in personalized labour relations that had presumably ameliorated the intensity of labour disputes in the past.
Whatever the precise combination of factors that encouraged the use of violence, or its threat through injunctions, it became commonplace during railroad, mining and manufacturing disputes, and was, according to Veblen, an inevitable result of the structure of the society: ‘[I]n any eventual resort to force, the workmen are under a handicap as against the owners, a handicap due to law and precedent as well as to the businesslike predilections that are habitual among the personnel of the constituted authorities … So that any conjunction of circumstances that may threaten to encroach on the accomplished facts of absentee ownership … will forthwith be rated as a menace to the national integrity and a call for official measures of repression to guard the public’s safety’ (1923: 411, 429).
Although Federal troops intervened in some of the more famous mining and railroad disputes, more commonly it was either city police or state militia (National Guard) units that ‘restored order’, or even employer-paid, but legally deputized, private guards (Lambert, 2005; Morn, 1982). Police and National Guard units grew in size during the last two decades of the 19th century as a response to labour troubles, and were often even subsidized by the businesses that benefited from their protection (Goldstein, 1978). Between 1880 and 1930, half of National Guard deployments were responses to labour disputes, typically in railroads, mining or large manufacturing (Cooper, 1980), and during the strike wave of 1920–1924, that number reached 90% (Goldstein, 1978). Moreover, the federated system of American government allowed management to shop for compliant officials. If local government proved too close to strikers (Gutman, 1976), then state governors might prove more sympathetic to employers, and if the governor disappointed, as Illinois Governor Altgeld did during the Pullman strike, then Federal troops, sent by either local commanders or presidential orders, often intervened (Cooper, 1980). Even President Roosevelt, who once publicly declared that ‘[i]f I were a wage worker, I would certainly join the union’ (Baker, 1904: 50), and who pressured mine owners into negotiating during the 1902 coal strike, could only get management to bargain through intermediaries, thus preventing any legitimization of the miners’ union.
The result of all of this management-favouring firepower was ‘the bloodiest and most violent labour history of any industrial nation in the world’, whose victims were overwhelmingly the strikers (Taft and Ross, 1969: 221). The Little Steel Strike of 1937, a few years beyond the era covered in this article, offers an unusually well-documented tabulation. Sixteen workers were killed, ten of whom were shot by police during one unarmed march, and another 267 were injured. By contrast, 40 police were injured, none fatally, and only three suffered gunshot wounds (Taft and Ross, 1969). Moreover, the impact of state violence went beyond its actual utilization. The injunction was a far more efficient tactic, requiring only the implicit threat of violence. While American jurisdictions generally recognized some abstract right to strike, company attorneys and sympathetic judges discovered a host of other reasons to threaten striking workers, charging strikers with intimidation, restraint of trade, trespass, breach of contract, inciting violence or even tortious infliction of economic harm. Injunctions were issued because strikers talked to workers, publicized their grievances, picketed businesses to encourage them to hire from the union, violated yellow dog contracts, interfered with mail delivery or supported another union’s effort to win a dispute (Forbath, 1991; Hattam, 1993). Even Gompers was sentenced to jail (overturned on a technicality) for advocating a consumer boycott in the AFL magazine (Forbath, 1991).
With the legal and armed support of various state organs, the new corporate giants experienced nearly unbroken triumphs over organized labour before the Depression. Even a brief period of government-imposed cooperation during the First World War was readily repudiated once the war ended and the ‘carrot’ of war procurement disappeared. Moreover, as corporate bureaucracy solidified, employers were able to exercise the same kind of authority over their white collar workers. The evolution of American employment law from that of ‘master-servant’, which imposed certain obligations upon the employer, to an extreme form of employment-at-will was itself a product of this new corporate order. The full doctrine had only been adopted in New York, then the most commercially important state, in 1895, and it evolved through adjudication involving the new white collar corporate personnel, leaving these workers at the legal mercy of their employers to hire, fire, promote, and generally order about as they wished. One influential New York case, for example, stripped managers of the protection of the traditional presumption of an implicit annual contract, with its implied right of severance pay (Feinman, 1976). With legal and organizational victory total, if not assuredly permanent, the question remained as to how management was going to both defend and use this triumph.
Winning the peace
If your train’s on time, you can get to work by nine, and start your slaving job to get your pay. If you ever get annoyed, look at me, I’m self-employed; I love to work at nothing all day. – Randy Bachman, ‘Takin’ Care of Business’.
Victory can become a source of new headaches for the victor, and the triumph of one institutional arrangement over others eventually generates its own contradictions (Friedland and Alford, 1991). Labour may have been defeated, but it did not go quietly, and as long as business required employees, not only would the possibility of union resurgence remain, but business leaders could expect to be held to account by other elements of society for how they treated their dependent employees. This was clearly understood by the most influential figures of the time, commenting from various points on the mainstream political spectrum. For John Rockefeller II (1916), ‘the problem of promoting the cooperation between Labour and Capital, may well be regarded, therefore, as the most vital problem of modern civilization’. Woodrow Wilson, who built his political career on opposing the efforts of robber barons such as the elder Rockefeller (Roy, 1995), sent a message to Congress while still at Versailles that the ‘right of labor to live in peace and comfort must be recognized by governments and America should be the first to lay the foundation stones upon which industrial peace shall be built’ (Wilson, 1919).
Berle and Dodd, debating in print during the tail-end of this era of maximal managerial autonomy, disagreed on many things, but they could reach a consensus regarding the responsibilities that corporations owed their employees. Dodd asserted that ‘[t]here is a widespread and growing feeling that industry owes to its employees, not merely the negative duties of refraining from overworking or injuring them, but the affirmative duty of providing them so far as possible with economic security’ (1932: 1151) and Berle, writing with Means, could tacitly agree with Dodd, arguing that fair wages and job security could legitimately ‘divert a portion of the profits from the owners of passive property (i.e. shareholders)’ (1932: 312).
In effect, large and influential segments of the American establishment were urging top management to be responsible victors. Having largely defused any immediate threat from organized labour, the new large corporations of the early decades of the 20th century had to ‘win the hearts and minds of the public as well’ (Gillespie, 1991: 17). Victory in the present did not ensure the absence of serious challenges in the future, especially after the Bolshevik Revolution suggested the potential impermanence of triumph. Furthermore, poor treatment of employees, if publicized, would not sit well among consumers and voters, and major labour conflicts that disrupted such vital activities as coal delivery or railroad service caused resentment even among the proprietors of smaller businesses, who might not otherwise feel any particular sympathy for strikers, leaving simmering the ever-present possibility of the kind of cross-class alliances that had already led to anti-trust action (Edwards, 1979). Mackenzie King, future Prime Minister of Canada, advised Rockefeller II that while during Rockefeller’s father’s generation it was possible to ‘keep business pretty much to those who were engaged in it. Today, there [is] a social spirit abroad, and it was absolutely necessary to take the public into one’s confidence … and especially to stand out for certain principles very broadly’ (Gitelman, 1988: 64).
From the beginnings of general incorporation in the 1830s to the Great Merger Movement at the end of that century, corporate leaders had promised that endowing the corporation with various legal rights would lead to economic growth that would benefit the entire society (Dodd, 1954; Roy, 1995), yet the educated ‘establishment’ often regarded the business corporation with a degree of caution and even skepticism, decades before the corporate giants emerged. As one legal journal that originated from the industrializing State of Massachusetts expressed it: The enjoyment of a corporate franchise is not of common right. It is the grant of the whole people of certain powers to a few individuals, to enable them to affect some specific benefit, or promote the general good. When the corporation fails to produce the expected benefit, and far more when its charter is perverted to injurious purposes, the whole people ought to have the power to control the operations, and even to revoke the charter. (American Jurist, 1830: 307)
Ultimately, the corporate record of sharing economic success with its workers was a mixed one, depending upon which particular group of workers and which period one examines (Licht, 1995; Ramirez, 1978). Whatever the average payoff to workers may have been, corporate abuse through unnecessary wage cuts (Lambert, 2005; Montgomery, 1987), a record on safety issues that was weak by world standards (Asher, 1986; Livesay, 1978) and the employment of state or private violence against strikers hardly contributed to forging a positive public image for the new corporations, and instead inspired various efforts to restrain them. The most egregious behaviour on the part of some corporate leaders often led to widespread condemnation from even conservative voices. Carnegie was denounced for his cowardice at Homestead by many conservative newspapers and Mark Hanna, the most politically influential businessman of his time, publicly labelled George Pullman ‘a damned idiot’ for triggering a disastrous strike by his notoriously unreasonable treatment of his workers (Lindsey, 1964: 319).
Moreover, it was not just periodic abusive treatment of industrial workers that generated public concern. While corporate employment had become boringly conventional by the time William Whyte (1957) wrote Organization Man, the armies of clerks that corporations hired and the pyramidal managerial hierarchies they constructed were still novel developments a half century earlier. As such, they were often met with suspicion, not just by labour leaders and socialists, but by respectable establishment figures as well, many of whom viewed them as violating the ideal of personalized employment relations, while also preempting the traditional American ideal of independent proprietorship (Davis, 2000). As America’s most influential newspaper columnist expressed it at this time: In the last thirty years or so American business has been passing through a reorganization so radical that we are just beginning to grasp its meaning. At any rate for those of us who are young to-day the business world of our grandfathers is a piece of history that we can reconstruct only with the greatest difficulty. We know that the huge corporation, the integrated industry, production for a world market, the network of combinations, pools and agreements have played havoc with the older political economy. The scope of human endeavor is enormously larger, and with it has come, as Graham Wallas says, a general change of social scale. (Lippmann, 1914: 35–36)
One response to these concerns was welfare capitalism, and as William Tolman, a leading advocate, suggested, pre-revolutionary France provided a warning for the recalcitrant: In the old times master and man lived and worked together; there was a daily point of contact, a continuous personal touch. To-day all is changed. The employer in many cases is as much an absentee as were the nobles of France in the latter part of the 18th century and the landlords of some of the worst tenements in slumdom to-day. It is an industrial condition that naturally followed the organization of great capital into syndicates and trusts … the day has passed when the employer is able to individualize those who work for him; not knowing them by name or even by sight, the personal touch, the point of contact has been lost. (1909: 48)
Some of the strongest cautionary notes were actually struck by individuals who would have been regarded as unsympathetic or even hostile to the labour movement. Judge Peter Grosscup (1905), for example, who had dealt harshly with Pullman strikers, warned the readers of a national magazine of the threat that corporate employment posed to the entrepreneurial and egalitarian spirits that had supposedly built the country. Critiques, conservative as well as liberal, often shifted blame to new, poorly understood, industrial technology for destroying traditional labour relations. Yale President Arthur Hadley, a conservative economist, argued that ‘the substitution of mechanical for intelligent labour is often a very serious evil in modern manufacturing’ (1896: 350). The Progressive Unitarian Minister John Graham Brooks (1903: 6) agreed, warning ‘that the storm centre of conflict between employers and employees was the application of science and invention’. John Rockefeller II agreed, blaming the labour unrest that bedeviled him on this technologically-driven new scale of business: ‘[T]he use of steam and electricity, resulting in the development of large-scale industry … has by necessity erected large-scale barriers between employers and men, thus making it more difficult to understand each other’ (1916: 113).
Moreover, the treatment of employees was often regarded as linked to other potential social ills. It is difficult to appreciate how closely labour relations were tied to health issues a century ago, not only because of the possibility of injury or poisoning on the job, but also because of the real possibility of poor working conditions, unhygienic manufacturing processes or careless adulteration spreading contagious diseases to consumers. Compounding the concern was the high level of immigrant labour employed in factories at a time when both immigrants and their squalid and ‘uncivilized’ living conditions were often blamed for spreading such dreaded diseases as typhoid, tuberculosis and even plague (Kraut, 1994). Nor could consumers possibly know with confidence the standard of hygiene or care of the products they bought, since, increasingly, consumer products were manufactured, transported and even sold by strangers whose attention to the wholesomeness of their ingredients, processes and employees could hardly be taken for granted (Barkan, 1985). The tremendous commercial success of Upton Sinclair’s The Jungle (1906) exemplifies this generalized anxiety.
Moreover, to the chagrin of the business community, unions were attempting to exploit this fear of disease and poisoning to argue for patronizing only responsible unionized butchers, bakers, tailors and textile workers (Kraut, 1994). By the 1920s, with mass production in heavy industry creating all kinds of new hazards for workers, the health and safety activists of the Workers’ Health Bureau attempted to assist unions in making these an organizing and bargaining issue, albeit with very limited success (Rosner and Markowitz, 1989). Florence Kelley (1899), appealing more directly to middle-class consumers, had had more success promoting her National Consumers’ League (NCL) and its concern with the safety of manufactured items, especially foodstuffs and clothing, although, Kelley, a socialist and child of famous abolitionists, did try to forge ties between these consumers and working class employees by relating working conditions and workers’ living conditions to the wholesomeness of products. As a result, her organization included wage levels, hours and labour-management relations as criteria for its seal of approval. As she herself wrote at the beginning of the era of corporate hegemony: The National Consumers’ League asks that purchasers, by insisting upon buying goods bearing its label, will discriminate in favor of those manufacturers who treat their employees humanely, so far as that is possible under the conditions of the competitive system; and that they will do so both for the sake of the employees and also for the sake of promoting that form of manufacture which is most wholesome for the whole community, in preference to conditions in which danger of spreading infection is constant and considerable. The appeal is still, as before, on behalf of the employee; but it is, also, on behalf of a far larger constituency: the whole purchasing public. (Kelley, 1899: 302)
In what might be regarded as a preliminary effort to turn the voluntary assumption of social responsibility into a business advantage, some of the new corporations attempted to win customers by publicizing their embrace of at least the ‘middle-class safety’ portion of Kelley’s advice. More enlightened standards of hygiene, safety and wholesomeness could prove good for business in a variety of ways. While in the late 19th century, many companies went to such lengths to hide industrial accidents as refusing to use ambulances to remove injured employees (Asher, 1986), during the early years of the 20th century, International Harvester, United States Steel and a few other giant manufacturing concerns generated favourable publicity by voluntarily establishing ‘cutting-edge’ workers’ compensation plans that had the added benefits of both delaying passage of more expensive mandatory programs and making participation contingent on signing away the right to sue or join a union (Asher, 1989). Food companies such as Campbell, Quaker Oats, Hershey, Carnation and Aunt Jemima built successful advertising campaigns around the wholesomeness of their workplaces and products (Barkan, 1985).
Even some forms of state interventions proved acceptable, even profitable, to large corporations. After Upton Sinclair’s The Jungle (1906) shook public confidence, large meat packers appeared quite willing to submit to inspections, paid for through taxation, which helped them squeeze out smaller competitors, while also denying other nations an excuse to bar their products (Barkan, 1985; Kolko, 1963). Moreover, there is no reason to assume all corporate claims of concern for wholesomeness were necessarily disingenuous, let alone empty shams. A number of industrial founders sincerely accepted their responsibility for promoting industrial ‘uplift’ by raising the life-style and behaviour of their employees to middle-class levels of respectability (Barley and Kunda, 1992). Ford is famous for the ‘social workers’ he hired to monitor the home life of his largely immigrant workforce, and Kohler, who built his fixture business on the promotion of bathroom hygiene, saw his mission as starting with his own workers, offering them an array of medical services that went well beyond what was expected of employers at that time (Hoy, 1995; Nevins, 1954).
Corporate management, then, collectively faced a degree of cross-class trepidation and even suspicion as to how it would wield its recently established power. If it failed to do so in a manner that was regarded by non-corporate opinion leaders as responsible, firms risked losing the confidence of potential customers, provoking a political reaction of unwanted regulation and even a resurgent labour movement, a danger eventually made concrete by the Russian Revolution and the labour strife that followed World War I (Slichter, 1929). Beyond wanting to avoid a possible social backlash, businesses also competed with one another, if not always directly, then at least for the consumer’s or the investor’s dollar. As a result, raising the productivity of labour remained an unavoidable issue. Corporate executives may have been free to manage as they chose, but they ran real risks if they did not ‘choose’ to manage well. William Leiserson, the institutional economist who would later serve on the National Labor Relations Board, lectured to one Harvard audience that corporate success in solving the ‘Labour Problem’ and largely abolishing ‘unrest, strikes, boycotts and other forms of conflict’, still left companies a multiplicity of ‘small case’ labour problems in managing workers on a day-to-day basis that were collectively no less crucial (Leiserson, 1929: 127).
Ironically, managing a disorganized mass of workers presented organizational difficulties that would not have existed to the same degree had a union been in place to enforce agreements and assuage employee concerns, although a union would certainly have set a price for its cooperation (Jacoby, 1983). Instead, management faced the prospect of endless individual acts of defiance in the form of loafing, rule-breaking and even acts of sabotage (Forbes Magazine, 1917). Furthermore, employees lacking in voice could simply ‘exit’, and, in an ironic twist, the more that management exercised the power to establish skills and control training, the more costly it became to replace workers (Schatz, 1983). The turn-over problem became so significant that two of the most important American economists of the mid-20th century began their careers studying it (Douglas, 1918; Slichter, 1921).
To keep employees reasonably content and productive required at least some degree of responsible behaviour on the part of management to avoid injuring, alienating or cheating employees. Surprisingly perhaps, given the hostility toward unions on the part of contemporary executives and neo-classical economists, a number of economists and conservative figures at the time argued that responsible labour organizations were the appropriate institution for protecting employees from autocratic management and for ensuring employees a degree of fairness in pay, safety at work and freedom from arbitrary decision-making. Hanna, for one, developed a reputation for fair dealing with moderate unions in his late 19th century business ventures (he never actually ran a giant corporation), and the economist John Bates Clark, the marginalist pioneer who believed wages should be set by market forces, anticipated Galbraith’s (1952) argument for the value of countervailing power, conceding that under conditions of mass immigration, ‘collective bargaining tends to equalize the strategic position of men and employers’ since ‘[w]ithout organization and with individual bargaining, wages will be drawn down to what idle men will accept, which may be less than what they will produce’ (Clark, 1907: 451). Richard Olney, who, as Attorney General, suppressed the Pullman strike, actually endorsed the right of employees to form labour unions as a counterweight to combinations of capital (Eggert, 1970).
Important legal figures hardly known for their hostility to business expressed similar sentiments. Supreme Court Justices Louis Brandeis and Oliver Wendell Holmes could maintain good relations with business leaders while arguing in favour of collective bargaining in court cases. And not only did the ‘progressive’ Woodrow Wilson (1919), who had pushed anti-combination legislation when governor of New Jersey, urge governments to officially ‘recognize the right of men collectively to bargain’, but so did his conservative presidential opponent William Taft, who, when serving as Chief Justice of the Supreme Court, could simultaneously ignore Congressional intent to prevent the Clayton Anti-Trust Act from being applied to unions, while arguing in dicta for the necessity for a right to strike against the new large employers: [Unions] were organized out of the necessities of the situation. A single employee was helpless in dealing with an employer. He was dependent ordinarily on his daily wage for the maintenance of himself and family. If the employer refused to pay him the wages that he thought fair, he was nevertheless unable to leave the employ and to resist arbitrary and unfair treatment. Union was essential to give laborers opportunity to deal on equality with their employer. They united to exert influence upon him and to leave him in a body in order, by this inconvenience, to induce him to make better terms with them. They were withholding their labor of economic value to make him pay what they thought it was worth. The right to combine for such a lawful purpose has in many years not been denied by any court. The strike became a lawful instrument in a lawful economic struggle or competition between employer and employees as to the share or division between them of the joint product of labor and capital. (American Steel Foundries v. Tri-City Trades Council, 1921: 209)
Despite these general endorsements by business-sympathetic figures of some form of unionization and collective bargaining in order for workers to defend their own interests, corporate executives had no intention of actually sharing power with independent labour organizations. There were a few exceptions, most famously G.E.’s Gerard Swope, but Swope was unique, a former volunteer at Chicago’s Hull House (even marrying a life-long friend of Florence Kelley) who had the luxury of running a major company with few serious rivals, and was even denounced for his views by some of his more reactionary colleagues (Schatz, 1983). More typical was Charles Schwab of Bethlehem Steel, who, while less confrontational in his personnel policies than the man he replaced, Henry Clay Frick, insisted that he would never permit himself ‘to be in a position of having labour dictate to management’ (Brody, 1980: 58). The general attitude of self-perceived enlightened managers was that they were fully capable of assuming the responsibility of providing employees the understanding, fair treatment, assistance, protection and voice they deserved.
Not surprisingly, they were often wrong in assuming that top-down policy initiatives that reflected executive perspectives, or even prejudices, would satisfy employees. Judge Elbert Gary of US Steel, influenced by the writings of employee ownership advocate Abram Hewitt, believed sufficiently in his company’s relatively broad stock plan that he even argued against his own board in order to preserve it (Tarbell, 1925), yet he would never accede to the far more expensive reduction in the 12-hour day, an issue of such importance to workers that it led to major strikes (Brody, 1980). The employee representation plan that Rockefeller established at Colorado Fuel and Iron in the aftermath of the Ludlow massacre proved more useful in ameliorating working conditions than in actually raising wages (Selekman and Van Kleeck, 1924). Furthermore, to the great surprise of both Rockefeller and the managers of the firm, the plan left employees so frustrated by the limits of its power-sharing and inability to end a number of abuses that it failed to prevent workers from joining unsuccessful national coal and steel strikes in 1919 (Gitelman, 1988; Selekman, 1924).
Whatever the practical limitations of top-down reform, these did not prevent corporate executives from preferring their own judgement as to how to ensure labour peace over being forced to share decision-making power with independent unions. This attitude was reflected in the experience of the National Civic Federation, where the organization’s efforts to mediate labour disputes petered out within the first few years of the 20th century, while the Welfare and Safety Departments of the NCF continued to attract large-scale corporate participation until American entry into the First World War (Montgomery, 1987). Even Rockefeller II, the sincere and tireless advocate of a formal role for employee input, quite openly hoped that his representation plan would make independent unions unattractive (Gitelman, 1988).
Whether or not they convinced employees, welfare programs and rationalized personnel policies had their desired effect on at least some outside observers. Leiserson, hardly a sheltered intellectual, was convinced shortly before the onset of the Depression that a significant segment of corporate management must be accomplishing exactly what they claimed to pursue, since [t]hese personnel workers whom industry has put in the field know as well as the union leaders that injustice, exploitation, low wages, unfair discharges, speedups, and overwork cause resentment, discontent, strikes, and unionization, … leading the great masses of unskilled, semiskilled, and clerical workers away from the official labour movement and attaching them with various devices more or less loyally to the management of the corporations which employ them. (Leiserson, 1929: 141)
Having publicly assumed responsibility for employee welfare, and having convinced many influential observers that they were fulfilling this responsibility, American executives began to use this assumption as the basis for redefining their social roles more broadly.
Generalizing social responsibilities
Almost in the manner of an occupying army seeking to pacify newly conquered territories for its own benefit, corporate management, having established control over their workplaces, set out to find ways to turn this control to their ongoing advantage without inciting resistance or a negative reaction from voters or consumers. Looking back on the eve of the stock market crash, Owen Young (1929: 161), Swope’s superior at G.E., analogized: ‘In 1905 cars were new and drivers reckless, so were executives, not yet adjusted, [but] now trained for the job, [they are] more considerate [of employees] than smaller units of business’. One result of this new level of ‘consideration’ was the rationalization of personnel policy by ending practices that ultimately cost companies in reduced efficiency and bad publicity, such as periodic pay-cuts, autocratic foremen and blatant skimping on provisions for safety and personal comfort.
The new enlightened approach to personnel was expressed by the President of Studebaker Motors as a ‘duty of capital and management to compensate labour liberally, paying at least the current wage and probably a little bit more, and give workers healthful surroundings and treat them with the utmost consideration’ (Forbes, 1924: 113). Forbes Magazine had made essentially the same point negatively seven years earlier: ‘Given [employees’] power to help or hinder a firm, the employer who does not do everything in his power to satisfy his men is not only short-sighted from his point of view but is an enemy of national peace and harmony’ (Forbes Magazine, 1917: 112). As labour economist Sumner Slichter (1929) pointed out, ‘good’ personnel policies that kept peace while improving efficiency had become especially desirable for multiple reasons during the 1920s: to prevent another upsurge of labour militancy after the threatening episodes of the post-war years, to recover sunk costs in raw materials and equipment and to compensate for new restrictions on the flow of relatively compliant immigrant labour.
One area in which management could find common interest with employees was to reform the system of first-line management, taking autonomy away from foremen and supervisors, whose interests were not necessarily those of the company—even to the extent of holding on to their best subordinates—and whose favouritism and arbitrariness were often the immediate source of both costly inefficiency and employee dissatisfaction (Edwards, 1979; Nelson, 1974). By the 1920s companies were standardizing personnel policy and the training of supervisors and managers to reduce arbitrary and ill-informed supervision (Slichter, 1929). During the same decade, a few firms began experimenting with putting such reforms on a ‘scientific basis’, most famously at Western Electric, home of the Hawthorne experiments, where top management sought to develop ‘friendly and fair’ managerial techniques for motivating and encouraging employees to embrace ‘a loyal and enthusiastic interest in the company’s work’ that presumably would raise productivity (Gillespie, 1991: 25). With these beginnings of what would later become the human relations movement, top management aimed to convince both employees and the public that good management not only delivered material, health and moral benefits, but would even improve the psychological well-being of employees (Barley and Kunda, 1992).
Beyond implementing and publicizing personnel policy reforms, corporations sought ways, direct or through opinion leaders, to make the public view the necessities of corporate organization as virtues and, in the process, created the modern public relations industry (Raucher, 1968). George Perkins, whom J. P. Morgan often dispatched to manage firms he built through mergers, viewed ‘enlightened’ personnel policies as a way of reversing the ‘big is bad’ presumption that had animated much of American politics at least since Andrew Jackson, allowing proponents to argue that large oligopolistic firms were necessary for bringing high wages and job security to the population at large (Ozanne, 1967). This was not just a matter of heading off labour strife, although ending periodic waves of costly and disruptive strikes was certainly one major goal. The degree of dependency that the new corporate order generated among its employees, which had alarmed some conservative observers nostalgic for a country supposedly dominated by independent proprietors, was reframed in a positive light as a form of economic security that did not require government intervention (Tone, 1997). Similarly, the supposedly stifling and unmanly features of corporate bureaucratic careers were redefined by the public relations specialists as an outlet for ambition and entrepreneurship available even to those who lacked the capital to start their own businesses (Davis, 2000).
Moreover, enlightened personnel policies could be presented as not only good for employees and a boon to communities, but also as an alternative to socialism and communism, an issue of some urgency after 1917. Fear of the spread of class conflict was not exclusive to the American upper class. The predecessor of the National Council of Churches, in a report highly sympathetic to the plight of workers, also noted that, ‘[w]hat we have in our basic industries today is, in the main, not a competition between individuals performing the same tasks, but between classes, which is obviously unserviceable and disastrous’ (Committee on the War and Religious Outlook, 1921: 102). The fear of potentially violent class conflict allowed corporate leaders to frame their voluntary initiatives as an effort to follow President Wilson’s (1919) recommendation to defuse conflict through ‘[s]ound thinking and an honest desire to serve the interests of the whole nation, as distinguished from the interests of a class’. The Chairman of US Rubber suggested that the ‘building up of mutual confidence rather than the building up of factions and classes’ would be the result if both labour and management ‘will make the proper effort along the line’ (Seger, 1922: 8). Similar claims of finding a cure for class antagonism were made by the early promoters of personnel research in the 1920s (Gillespie, 1991).
Forbes Magazine ran a regular feature, ‘Who is Our Best Employer’, which lionized this kind of collaborative attitude. In one case, the winning telephone company was reportedly run by managers who ‘realize that the spirit of service is best cultivated in employees who work for men … who regard the men and women working for them as men and women, and not as labour units or so much manpower’ (Martin, 1918: 179). The article quotes one of the company’s appreciative employees as remarking, with anxiety-fueled sarcasm, that ‘we do not feel that we are “the poor, downtrodden working classes”’. Still, in 1928, after years of publishing such positive examples in his magazine, Forbes himself expressed frustration that, despite years of advising readers, conditions in a large number of businesses still ‘tend to breed socialists, communists, and other unwholesome agitators’ (quoted in Heald, 1970: 107).
Not everyone within the mainstream intelligentsia bought into these claims of a new era of enlightened capitalism or acted as an accessory to the new corporate order. Even the Dean of Harvard Business School, an institution dedicated to training the new corporate managers, could admonish corporate leaders that ‘[w]ords about service are too often a smug cover for a desire to be left alone’ (1927: 404). Executives, however, went beyond words, and corporate welfare programs gave some substance to these efforts to reframe the social impact of the corporation, often at very little actual cost (Brody, 1980). Most workers might have preferred simply receiving higher wages to the smorgasbord of programs and benefits that were initiated by corporations (Tone, 1997), but higher wages were costly. When wages were raised significantly above market rates, as at Ford Motors, other employers perceived a violation of solidarity (Nevins, 1954), while the employees themselves paid with a higher intensity of work (Toller, 1930). Furthermore, diverting the relatively modest cost of welfare programs into the pay packets of employees would have generated considerably less positive impact, not only because there was little productivity or public relations value in a small generalized raise, but also, for those managers sincerely expecting that an employee gym or a subsidized hot lunch would produce more efficient employees, putting a bit more cash in employees’ pockets might even prove counterproductive, if the money was spent on beer or sweets (Tone, 1997).
The first step away from defending the corporate record as employer toward a more general formulation of corporate social responsibility came from the success companies experienced from applying their reputation as employers to deflect criticism on other grounds. In an era in which the wholesomeness of prepared food had become a political issue, Heinz and Hershey leveraged the caring way they supposedly treated their employees to bolster public trust in the content of the foods they sold (Tone, 1997). National Cash Register had such a strong reputation based on its extensive social welfare programs that, when under legal attack for its blatant violation of the Sherman Act, it found no shortage of independent defenders of the corporation, including newspapers such as the widely read Brooklyn Eagle (Sealander, 1988). When International Harvester was also investigated for its anti-trust violations, and was furthermore the subject of a magazine exposé titled ‘Making Cripples and Dodging Taxes’, a glowing article on the firm’s welfare capitalist program written by public relations pioneer Ivy Lee turned the tide of elite public opinion, including that of the relatively pro-union ex-President Roosevelt (Raucher, 1968; Tone, 1997).
As Heald’s (1970) classic study demonstrates, by the post-World War I years, corporate leaders generalized from the positive feedback they received by presenting themselves as good employers to begin defining their firms as ‘good citizens’ in a more general sense. These claims, perhaps, were intended to create some independence from the demands of shareholders (Berle and Means, 1932; Dodd, 1932; Friedman, 1970), but such a stance was not necessarily disingenuous. With the precise social role and nature of the corporation an important political issue during the 19th century, even the earliest advocates for general and accessible incorporation would never have been so politically naïve as to assert that the sole duty that attached to corporations was to investors (Dodd, 1954; Roy, 1995).
Those corporate executives who spoke of general social responsibilities were, to a very large degree, experienced with establishing and publicizing various forms of corporate welfare and enlightened personnel policies, and it was not a great stretch to universalize the same kinds of arguments in making claims for a generalized form of responsible corporate citizenship. In these formulations, the role of the executive was either cast as that of broker, balancing the various interests of the different classes of people associated with their firms, or as a trustee of the organization itself (Heald, 1970). The two formulations were not mutually exclusive, and Owen Young (1927: 392) combined these two perspectives in a speech at Harvard, in which he declared: we have come to see [executives] as trustees of the whole undertaking, whose responsibility is to see to it on the one side that the invested capital is safe and that its return is adequate and continuous, and on the other side that competent and conscientious men are found to do the work and that their job is safe and their earnings are adequate and continuous.
Others echoed these sentiments, including a number of equally famous names in the business world. Perhaps not surprisingly, given his relatively left-leaning views, Gerard Swope argued that both the public and employees should be considered ahead of stockholders. Endorsing some version of CSR was not merely a liberal affectation, however. Liberty Leaguer Alfred Sloan, a harsh critic of the New Deal, acknowledged that ‘industrial management must expand its horizon of responsibility … It must consider the impact of its operations on the economy as a whole in relation to the social and economic welfare of the entire community’ (1941: 145). George Eastman, whose company, Kodak, was a leader in corporate welfare, claimed that anything for the betterment of humanity was good business, while Heinz, famous for the care he gave to both his employees and the products they made, saw his company as being responsible to grocers, employees and customers as well as stockholders (Heald, 1970). Gary described his role running US Steel as ‘occupying a position of balance among … investors, employees, customers, competitors, and all others who may be interested in, or affected by, the actions or attitudes of the managers’ (Tarbell, 1925: 100). Self-defined ‘responsible’ employers began to see themselves in the more generalized role of managers of socially responsible citizen-corporations.
With the coming of depression and then war, however, and the failure of the largely voluntary efforts under the National Recovery Act to end the economic crisis, the self-assessment of corporate leaders took a back seat to a series of largely unprecedented interventions on the part of American government: the legitimization of a resurgent industrial labour movement, closer monitoring of the financial and investment world, and the growth of government spending and regulatory interventions that accompanied preparations for fighting a world war. Nonetheless, when peace returned and government intervention was drastically reduced but not ended, discussion of CSR was revived, primarily by academics, in a discourse that revisited some of the themes of the pre-New Deal era but also reflected fundamental changes that had occurred within American political economy (Marens, 2010).
By the 1950s, with much of the New Deal and war-time command economy dismantled and fears of recurring depression abated, discussion resumed regarding the appropriate responsibilities of the still dominant large corporations. This new 1950s CSR discourse diverged from its predecessor in two important ways. First, while some executives continued to contribute (e.g. Abrams, 1951; Bullis, 1953), the participants were largely members of a rapidly expanding professorate, a product of the enormous post-war growth of higher education, especially with regard to business schools (Khurana, 2007; Whyte, 1957). Second, the focus of the discussion changed somewhat in response to a different post-war political economy. While American society had not, after demobilization, maintained the level of institutional corporatism or direct government intervention in the economy that still typified other industrial nations, the recent resurgence of industrial unionism, broad acceptance of the value of fiscal stimulation and a ratcheting-up of government regulation had curbed the autonomy of corporate management relative to the pre-depression years, often with the acquiescence of portions of the executive stratum whose firms had benefitted enormously from the resulting increase in demand from both consumers and government (Collins, 1982).
As a result, the new academic CSR literature, while still presuming ‘explicit’ managerial initiatives with regard to their firm’s social responsibilities, moved a little closer to the ‘implicit’ or imposed end of the CSR spectrum than had been true during the pre-New Deal years (Matten and Moon, 2008). The academic contributors, reflecting their own backgrounds in either macroeconomics (e.g. Bowen, Dale, Galbraith, Kaysen, Melman) or industrial relations (Chamberlain, Dunlop, Kuhn, Selekman, Slichter), conceded the right of other institutions—labour unions, but also churches, schools, governments and agriculture—to help define and even enforce what corporate responsibilities ought to be (Marens, 2010). These steps in the direction of a more implicit version of CSR, however, proved to be something of a generation-long interlude of mild corporatism that separated eras of managerial-centred CSR, as American pluralism gave way to the rise of neo-liberalism by the late 20th century.
Conclusion: the present as the past, almost
The 1960s and 1970s saw efforts to create a new academic field for studying the relationship between business and society, including CSR, within American business schools, but ‘business and society’, as it was usually called, never entirely institutionalized itself as a distinct discipline (Marens, 2010), and was regarded with some suspicion as either closet social democracy or an excuse for executives to neglect their duty to shareholders (Cheit, 1991; Friedman, 1970). This failure to find a settled niche actually reflected the unsettled relationship between corporations, organized labour and government in the post-war United States, a settlement whose very success generated its own set of destabilizing trends: inflation, shareholder rebellion, the financing of new competitors and the global search for cheaper labour. These new circumstances led, by the late 1970s, to calls on the part of business and financial leaders to ‘unshackle’ economic growth by clearing away such obstacles to renewed corporate competitiveness as obstructionist unionism, excessive regulation, incentive-robbing welfare and burdensome taxes (Arrighi, 1994; Brenner, 2003; Pollin, 2003).
The subsequent release of many of these competitive ‘shackles’ did initiate a new era of executive autonomy, and this was reflected within American business schools beginning in the 1980s, as the discourse on CSR was taken over by a new generation of self-described business ethicists, often trained in philosophy, who, in the words of one disgruntled ‘business and society’ professor, shifted the discourse toward ‘noncontextualist abstractions found in the lore of conventional philosophy … [based upon] a nearly studied ignorance of what has actually taken place within the American business world’ (Frederick, 1998: 44). While these ethicists utilized (then) unfamiliar vocabulary, much of the substance of their writing was actually a return to a pre-war managerial-centred version of CSR, in which a high level of managerial power to define and implement CSR was tacitly accepted as legitimate. What these business ethicists offered were various principles to guide this restored autonomy that, while more formally presented than in the past, were not substantively new. The executive-as-broker model reappeared as Freeman’s (1984) Kantian stakeholder management, and the trustee-of-the-firm approach was recast as Donaldson’s (1982) Lockean social contracting. Leaders of corporations responded by embracing these ethicists (if not necessarily their advice) to a much greater degree than their pluralist predecessors had. For example, the Business Roundtable of CEOs, formed in the 1970s to resist unions politically, established an Institute for Corporate Ethics, in which Freeman and Donaldson played leading roles (Marens, 2010).
This neo-explicit version of CSR differs from its pre-depression predecessor in one important respect: the contemporary version puts a great deal less emphasis on the employment relationship, reflecting a fundamental difference between the corporate eras. The discussions around CSR in the early 20th century focused more specifically on American employees because business leaders of the time, for all of their autonomy and power, required the cooperation of domestic workers to a much greater degree than is true today, and an ‘irresponsible’ employer might even lose the tolerance of consumers and voters who were neighbours of these or similarly situated employees. This dependency was drastically reduced once technological change made outsourcing, global supply chains and runaway shops practical, a shift stimulated and protected by US military spending, itself a central feature of post-depression pluralism. Moreover, the emergence of finance as an alternative source of profit for even industrial corporations further reduced the pressure to win a degree of acquiescence from American workers (Arrighi, 1994; Pollin, 2003), a shift captured by the heir to G.E.’s Young and Swope, who told a television interviewer that ‘ideally you’d have every plant you own on a barge to move with currencies and changes in the economy’ (Welch, 1998). Consistent with the decline in importance of most categories of domestic workers, when American business ethicists discuss the problems of specific groups of identifiable employees, these are typically foreign industrial workers (e.g. Arnold and Bowie, 2007), while specific American controversies around union avoidance, declining pay rates or the training of replacement workers by the laid-off are virtually never raised in the contemporary academic CSR literature.
Moreover, as neo-liberal, neo-managerial autonomy, fealty to finance and the search for ever cheaper labour have spread from their American core, so has the modern American version of CSR that emphasizes managerial voluntarism while de-emphasizing employment relations in the core. However, the world’s supply of labour that is, on one hand, very cheap and pliant and, on the other, skilled, productive and adequately supported by infrastructure is proving to be neither a panacea nor a bottomless pit, as wages for certain skills have risen internationally and workers have demonstrated renewed individual and collective resistance to exploitation (Jacob, 2012; Markoff, 2012). In recent years, finance has demonstrated its limitations as a way of generating wealth independent of production, and a new wave of austerity has placed limits on the ability of governments to make up corporate shortfalls. Professed fealty to voluntary CSR may have granted a degree of legitimacy to the growing hegemony of corporate decision-making, as it had in the American 1920s, but as the United States learned in the following decade, legitimacy does not help a great deal in the absence of paying customers, a stable and trustworthy financial system and a government willing and able to spend to support its economy.
Author biography
Richard Marens is a Professor of Management at California State University, Sacramento. He has published numerous articles and book chapters on such topics as American labour’s shareholder activism, the history of middle management, the role of business ethics in American society, Catholic social teaching in historical perspective, employee ownership and the application of neo-Marxian theories of capitalist evolution to organization theory. Address: College of Business Administration, Sacramento State University, 6000 J Street, Sacramento, CA 95819, USA. Email:
