Peons Online: How History Suggests the Illegality of User Contracts

April 8, 2021
Workers stand outside Google's London offices on November 1, 2018, after walking out as part of a global protest over workplace issues. (Reuters/Toby Melville)

No one in Silicon Valley has taken a historical approach to examining the problems of big tech. In The Code: Silicon Valley and the Remaking of America, University of Washington history professor Margaret O’Mara tells us that Silicon Valley’s view from the beginning has been “why care about history when you were building the future?” And in If Then: How the Simulmatics Corporation Invented the Future, Harvard history professor Jill Lepore explains that in Silicon Valley “the meaninglessness of the past and the uselessness of history became articles of faith, gleefully performed arrogance.” Lepore goes on to quote self-driving car engineer Anthony Levandowski, who said in 2018 that “the only thing that matters is the future. … In technology, all that matters is tomorrow.”

But companies like Google and Facebook may be ignoring history at their peril. That’s because “what’s past is prologue,” in the words of William Shakespeare (who both respected time’s lessons and was himself a disruptor). Saul Levmore and Martha C. Nussbaum, editors of The Offensive Internet, say that a way ahead is often revealed by looking back: “Old solutions are sometimes appropriate for new problems.” And Sydney J. Harris, a long-time columnist for the Chicago Sun-Times, wrote that “history repeats itself, but in such cunning disguise that we never detect the resemblance until the damage is done.” History, properly interpreted, can provide us with a perspective from which to see not only the problems of Silicon Valley but also the solutions to them.

People Machines, Then and Now

In If Then, Lepore writes that in the United States in the late 1950s, a team of social scientists worked on a project to build a top-secret “People Machine” that would collect personal data in order to direct human behaviour. She describes the process: “Collect data. Write code. Detect patterns. Target ads. Predict behavior. Direct action. Encourage consumption. Influence elections.”

As Lepore relates in her book, Newton H. Minow, later the chair of the Federal Communications Commission, and historian Arthur M. Schlesinger, Jr., were both working on the 1960 Democratic presidential campaign. In March 1959, when Minow received a copy of a plan to use election returns and public information surveys to predict the future, he wrote to Schlesinger: “My own opinion is that such a thing (a) cannot work, (b) is immoral, (c) should be declared illegal. Please advise.” Although the People Machine ultimately failed and the project ended after 11 years, its application of behavioural science — what Lepore describes as “the science of psychological warfare” — lives on, and seems to be working, in the behavioural advertising model of micro-targeted ads. And the ascendancy of market thinking, with its emphasis on efficiency and convenience, has over the last half century blinded us to the moral issues, and their legal consequences, that Minow so astutely perceived.


As Minow understood, it is morally wrong to manipulate other people so that they lose their autonomy. But this is essentially the behavioural advertising business model of surveillance capitalism that Google and Facebook have adopted. It starts with the collection of personal data, a process so surreptitious, effortless, silent and automatic that most users are unaware of the extent of this vast vacuuming operation. The companies use the latest social science research on human weaknesses to hook users on their services, the latest technology to surveil them, and advertisements targeted to individual interests to manipulate them. The result is that users lose not only their privacy but their autonomy, with serious consequences for their beliefs and behaviour. This is immoral, as Minow said. And, as he also urged, it should be illegal.

Peons in the Internet Age

History gives us a precedent for how the law should deal with the deprivation of autonomy. Indentured servitude was common in the United States in the eighteenth century but disappeared in the nineteenth. However, a new form of bondage, peonage, came into practice later in the nineteenth century: essentially, a contract of compulsory service (such as fieldwork or housekeeping) for a master in payment of a debt (often for transportation, rent and food). As I’ve written elsewhere, the main wrong in peonage, as in slavery, was the denial of the person’s autonomy. But peonage had another important feature, one not shared with slavery, that also made it wrong and justified holding such contracts null and void: its threat to democratic government, as Jacob Trieber, the presiding judge in the 1903 Peonage Cases, identified.

In his opinion,1 Trieber wrote that the peon “was a freeman, with political as well as civil rights.” Yet a peon’s vote could be controlled by the master. This loss of autonomy constituted the threat to democracy, because the peons’ votes could have “sufficient voice … to determine the result of an election.” It was against this background that courts in the late nineteenth and early twentieth centuries found contracts of peonage null and void and without legal effect.

In the case of peonage, the full, free and informed consent of the peon to the contract of service was irrelevant. Society had decided that the relationship established by the peonage contract was too evil to merit support by the legal system. The peon’s loss of autonomy was the loss of liberty, one of the inalienable rights mentioned in the Declaration of Independence. An individual could not consent to the loss of this right by signing a peonage contract.

Many commentators have described the user’s relationship with digital technology or social media in terms of a loss of autonomy similar to that of peonage. These characterizations have no legal effect, but they highlight the resemblance between the internet service user and the peon. Columbia law professor Tim Wu, in The Attention Merchants, has called Facebook “a virtual attention plantation”; writer Nicholas Carr has called the advertising-based business model “a modern kind of sharecropping system”; security specialist Bruce Schneier, in Data and Goliath, has called the relationship between internet service providers and their users “more feudal than commercial”; and law professor Frank Pasquale, in The Black Box Society, has called the relationship “self-incurred tutelage.” Other scholars, such as Jonathan L. Zittrain, Nathaniel Persily and Zeynep Tufekci, have noted the threats to democratic elections posed by user contracts — specifically, by the relationship these contracts establish, which enables micro-targeted behavioural advertising. This relationship, like that of peonage, deprives people of their autonomy.

In their contracts with users, both Google and Facebook stipulate that California law applies to the interpretation of these contracts, regardless of where the users are located. California law holds that contracts with certain characteristics are illegal and without effect. (And if the law determines Google’s and Facebook’s user contracts to be illegal, the users’ consent to the collection and monetization of their personal data, because it is part of these contracts, would also be without effect.) Specifically, three categories of contracts are illegal: first, contracts that are unconscionable; second, contracts against public policy; and third, contracts contrary to good morals. As Minow suggested, such contracts can be “declared illegal” by a court in a lawsuit.

The California Civil Code and related case law require that a contract be both procedurally and substantively unconscionable before it will be held “unconscionable.” The Google and Facebook terms of service that form part of the contract are procedurally unconscionable because they are contracts of adhesion and exhibit both “oppression” and “surprise” — legal terms meaning, essentially, that they give the user no opportunity to negotiate the terms or opt out of them. Whether they are also substantively unconscionable will depend on whether a court finds that they (in the language of California case law) “shock the conscience” or are “overly harsh” or “one-sided.” A plausible argument can be made that California courts could today find the Google and Facebook contracts substantively unconscionable.

California law on “public policy” has evolved and narrowed over the years, but it can still apply to many different situations. In principle, anything that tends to injure the public welfare is against public policy; in practice, determining which contracts fall into this vague category is very difficult. Precisely because the category is so open-ended, however, applying “public policy” to deny enforcement of the Google and Facebook contracts is certainly plausible.

The discussion above, which declared the user contracts of Google and Facebook immoral, might seem by itself to satisfy the criterion “contrary to good morals” under California law, but the legal category is narrower, shaped by California legislation and court precedents. Under California law, the category “contrary to good morals” has covered various types of contracts, often those relating to gambling, marriage, marijuana, prostitution, pornography, hush money and fiduciary duties. The cases on gambling provide the best insight into the user contracts of Google and Facebook. California courts, recognizing behavioural as well as substance addiction, have seen that the compulsive gambler’s behavioural addiction contributes to a loss of self-control and autonomy, and they have therefore ruled that enforcing the gambling contract of an addicted gambler is contrary to good morals. The behavioural addiction of Google’s and Facebook’s users supports a plausible argument that the two platforms’ contracts are likewise contrary to good morals under California law.

Restoring Autonomy

If a class action lawsuit alleging that the user contracts of Google and Facebook are illegal were to succeed, the consequences for the two companies would be catastrophic. A court that ruled the contracts illegal should then enjoin the companies from monetizing users’ personal data. Deprived of behavioural advertising revenue, the companies would face bankruptcy unless they converted to a subscription business model that allowed only general, not personalized, advertising.

Far from being useless, history can help us resolve the problem of the loss of autonomy suffered by users of Google and Facebook, and Minow’s wise observation points us to the right legal mechanism: a court declaration that the user contracts are illegal. Of course, this mechanism does not address the many other problems of social media and the internet, such as ownership of personal data, restrictions on the collection of personal data and the fiduciary duties of holders of personal data. These problems will require creative thinking and a look at what other insights history can provide.

1. Peonage Cases, 123 F 671 (MD Ala 1903).

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Preston M. Torbert is a lecturer at the University of Chicago Law School, visiting professor at the Peking University School of Transnational Law and the author of “Because It Is Wrong: An Essay on the Immorality and Illegality of the Online Service Contracts of Google and Facebook” in the Case Western Reserve Journal of Law, Technology & the Internet (January 2021).