Ironically, the recent, politically grandiose calls to break up big technology platform companies come just as the companies are already busily unbundling. Whether through independent advisory boards, outsourcing core trust and safety functions, or actual restructuring, technology companies are atomizing their core operations and offshoring liability. More important than how a platform company structures itself is how that structure defines user rights and corporate accountability. Relying on legislation or regulation to define platform governance standards through punishment avoids the problem: companies can just restructure. Recent headlines offer numerous examples of mergers, investments and bankruptcies that have been used to manipulate corporate liability (see, for example, Witt and Pasternack 2019). The defining policy question for the digital era isn’t how we regulate company size, it’s how we ensure that digital and platform governance standards persist across these companies’ supply chains. How do we build data rights we can trust, and ensure that companies can’t use corporate restructuring to avoid accountability?
While antitrust investigations and pressure gain momentum, there’s already significant expert criticism of antitrust’s ability to cope with fluid investment interests and complex data and digital sharing. The problem isn’t just company size, it’s that companies weren’t designed to keep promises to the public, but to create, distribute and dispose of value and liability. And because that’s their purpose, companies are exceptionally good at using incorporation and contracting to make meaningful accountability almost impossible.
One alternative to prevalent practice is the common law trust — a legal instrument that creates purpose-built governance over a set of assets and, importantly, creates fiduciary and contractual duties. Digital trusts — trusts that manage digital assets, such as data, code or the right to represent a data subject — may offer a more reliable way to ensure that platform companies are practically and legally accountable for their impact.
Instead of focusing our public investments in platform governance on breaking up the platform companies, we should focus on improving the ways that companies make important promises and increasing the prevalence of legal tools designed to uphold public interests, values and loyalties, such as trusts.
Unbundling Platform Governance
The term platform governance often hides a significant amount of complexity around fundamental questions — starting with what it means, but also including “by whom,” “to what end” and “enforced how,” among others. By framing policy discussions around platform governance, we risk focusing on an instrumental debate and missing the much larger — and more concerning — political economy of global regulatory enforcement, in addition to the specific complications posed by technology platform companies.
Platform governance can refer to a range of decisions and structures, but this analysis focuses on governmental and corporate governance of the companies that own technology platforms, and the business decisions that ultimately determine user rights. It’s important to be specific about the frame because no matter how commendable, ethical or well-designed a governance mechanism may be, the fundamental challenge remains that our institutions struggle to effectively regulate global companies. If any type of platform governance is to be effective, it will have to grapple with unbundling — substantively and structurally.
Globalization, Platform Regulation and Arbitrage
At the most basic level, technology markets are global and the laws that protect our fundamental rights are national. Both sovereign governments and multinational companies understand the impacts of that disconnect and are racing to leverage the resulting political economy to their own benefit. While that’s predictable, it’s also predictably raising questions about the effectiveness of strictly sovereign approaches to setting platform and data governance standards.
Historically, companies started local and grew, gradually, into international markets through careful negotiations with a range of related stakeholders at each step. Technology platform companies, however, launch globally overnight — leapfrogging market-access negotiations, often only responding to public and user concerns when they become scandals.
In addition to obvious scale issues, the shift in dynamic also decouples market access and government authority — whereas companies once had to proactively earn a government’s approval to reach its citizens, governments now have to proactively take steps to limit access to their markets, or find other ways to punish companies for abuse. That’s a relatively weak stance for governmental regulators — and is the current footing for most data rights and governance protections.
The “global first” nature of technology platforms also means that companies no longer need to have offices everywhere they offer services, which enables them to focus on other drivers, such as access to skilled labour, favourable market conditions or suppliers. That flexibility also enables platform companies to manage their structure to minimize their regulatory burden, a practice called arbitrage. For nearly every type of government regulation, there are jurisdictions that market themselves to large companies through beneficial regulation.
The most obvious and highest-profile example of the tension between governmental authority and corporate arbitrage is taxation. For a sense of scale, a joint study between the University of Copenhagen and the International Monetary Fund found that 40 percent of all foreign direct investment is, in fact, multinational companies avoiding taxes through shell corporations (Damgaard, Elkjaer and Johannesen 2019, 13). Even more damning, according to research from the Council on Foreign Relations, US companies report nearly seven times more profit from known international tax havens than from large, commercial markets (cited by Wolf 2019). Tax justice is a systemic issue, and a particularly acute problem among technology platforms, which hold at least US$500 billion offshore to avoid US tax.
Ireland is one of the most important tax havens in the world; US-based multinationals represent 50 percent of the largest companies in Ireland, and 80 percent of its domestic corporate tax revenue (Quell 2019). Ireland markets itself to multinational technology companies based on favourable tax rates and government-sanctioned access to European markets. Ireland’s US$14 billion in tax breaks to Apple prompted a European Commission order demanding that Apple pay the bill, over Irish objection (Cao 2019).
The example of Ireland also demonstrates the shifting power of companies and countries in setting regulatory standards through arbitrage. If platform governance standards are to be effective, they’ll need to grapple with the ways that sovereign competition may create a “race to the bottom” in corporate accountability.
What’s missing is accountability to the people and groups that taxes and regulations are meant to protect — exactly the people whose trust these platforms need.
Antitrust, Unbundling and Supply Chain Governance
Most approaches to platform regulation focus on punishing companies for business practices that exploit users or cause large, negative social outcomes. The challenge with this focus is that the nature of company formation has changed (OpenCorporates 2018). Globalization and automation make it easier for companies to evolve from single entities into supply chains or “service-oriented incorporation” (McDonald 2019a). While holding companies and supply chains aren’t new, technology enables platform companies to manage corporate structure with unprecedented speed, geographic spread and operational granularity. As a result, companies are unbundling into supply chains, both increasing the potential for arbitrage and limiting the effectiveness of regulation.
Framing platform governance regulation around corporate accountability only enables authorities to focus on the behaviour of individual companies — individual links in the supply chain — whereas our digital rights are defined by the standards upheld across entire supply chains. As the saying goes, a chain is only as strong as its weakest link — and in the digital economy, the links are competing with each other. In order for platform governance to be effective, it will need to incorporate approaches to accountability that extend to entire supply chains. So, while antitrust dominates the political discourse because it cathartically promises to punish platform companies, building legitimate platform governance requires a constructive approach, ensuring that digital supply chains can credibly, accountably uphold standards and duties of care.
There’s a significant amount of public and international pressure for the companies that own the world’s largest platforms to unbundle: Alphabet, Facebook, Amazon and Apple are all under antitrust investigation in the United States, as well as by various authorities across Europe. Google, specifically, is under antitrust investigation by all 50 US attorneys general (McKinnon and Kendall 2019), after receiving US$9.4 billion in antitrust fines from the European Union (Lomas 2019). There are countless op-eds, political speeches and academic theories for how to break up big tech companies.
Yet, whether it’s to minimize national and regional tax burdens, offshore liability for risky new ventures or comply with data sovereignty and localization requirements, big tech companies are unbundling as fast as they can. As companies grow and mature, they often manipulate the way they’re incorporated to minimize the cost and burden of regulatory compliance. The problem is, “regulatory compliance” is anodyne, executive-speak for “avoiding public protections of workers, the environment and governmental authority.” Mature, global companies use company structure differently than most people intuitively expect, using individual companies as shells that contain unbundled parts of their operation, to manage obligations to governments and the public.
Alphabet, the primarily Google-funded holding company run by Google’s founders, is the highest-profile example of this approach. Google created Alphabet in 2015, amid investor and European antitrust pressures (Sharma 2018), and spun out a number of companies dedicated to individual lines of business, such as Nest (home thermostats), Google Capital (investment) and Sidewalk Labs (urban technology). While these units are technically separate, Alphabet has a well-documented history of merging elements of those separate companies in and out of Google, often in ways that fundamentally alter the company’s previous statements about data privacy or use. Many of these companies share data, leverage the same advertising products and co-invest in joint ventures, intertwining them financially. Google’s unbundling into Alphabet has done something far more important than temporarily provide antitrust cover — it perfectly, publicly illustrated how corporate structures can be used to manipulate accountability to the public, customers and governments.
There are, of course, a range of trends and pressures compelling the transition from single companies to supply chains. At the operational level, most platform companies weren’t designed for the kind of politically and socially complex governance that their businesses require. As a result, platform companies build, outsource and partner with a growing range of corporate structures, ostensibly independent oversight bodies and third-party vendors to rebuild public trust. Whether it’s recognizing the social influence of the interdependent technologies involved in internet infrastructure, resolving disputes between users or convincing governments to allow them to operate in public spaces, platform companies are acting in recognition that their long-term sustainability is contingent on their ability to build functional supply chains — and that trust is a system requirement.
Supply chain governance has become an increasingly prevalent vector for improving the social impact of industrial practice, with several high-profile successes in improving labour conditions and environmental impact. And social activists are increasingly using supply chain advocacy to achieve social impact ends, such as Google’s banning of advertising for predatory loans (Sydell 2016) and Cloudflare’s deplatforming of online hate sites the Daily Stormer and 8chan (Wong 2017; Elfrink 2019).
While these approaches have been novel and effective, they’ve also been anecdotal and opportunistic, rather than clear or systemic. The public reaction to platform governance has been to push for transparency, consistency and accountability. Structural approaches to building trust across variably aligned companies, linked by a supply chain of services and customers, aren’t necessarily new, but they are fundamentally different to antitrust enforcement. And they start, not at the supply chain level, but where all chains fail: with the weakest link.
The weakest link in digital governance supply chains isn’t any specific company, but the way we design the links themselves. In order for us to build trustworthy platform governance, we’ll need to build corporate forms and contracting structures that are designed explicitly to accountably uphold common standards and duties.
Fiduciary Supply Chain Governance
Sometimes governance systems aren’t defined by how well they achieve their core goals, but by how effectively they prevent bad actors from exploiting their core goals. One of the most important considerations for designing credible platform governance is creating operational, accessible and legally enforceable approaches to accountability and governance processes over time. Those questions are contributing to a growing field of digital political science (McDonald 2019b), which focuses on designing governance standards and systems that blend public and private infrastructure to build equity.
In common law, the oldest and most established approach to creating shared standards of duty — especially during times of rapid legal transition — is fiduciary duties. Fiduciary duties are legally enforceable promises to act on someone else’s behalf based on a type of law called equity, which relies on fairness to resolve disputes when there isn’t binding or applicable law. Fiduciary duties typically include duties of loyalty (representing a person or group’s best interests, to the best of a fiduciary’s ability) and care (upholding an appropriate standard in that representation). There’s already quite a bit of exploration of the idea of fiduciaries in digital and platform economies (for example, Balkin 2016), largely because they offer a key, unique benefit: they enable parties to negotiate broadly defined, legal accountability in ways that regulation and more traditional contracts can’t.
The trajectory of fiduciary law scholarship has moved from its base in individual representation toward complex and multi-party governance, creating ample foundations to apply to digital economies. The recent focus on data trusts by the Canadian and UK governments, as well as by large corporate actors such as Alphabet, Microsoft and Mastercard, suggests that we’ve moved past asking “if” there is a role for digital and platform fiduciaries to the “how” of adapting those structures to platform governance.
Proposals about how to ensure that platform companies work toward, and take responsibility for, their social impact range from broadly defined duties of care, to setting industrial and engineering professional standards, to imposing fiduciary duties on platform companies, to using trusts to govern aspects of the data economy. The defining difference between these approaches isn’t “what” they might accomplish — it’s “who” has the power to mandate an approach, and “how” it might work in practice. Each differs in its approach to accountability, mitigating power asymmetry through sovereign regulation, industrial self-regulation, specialist professional services and collective self-governance, respectively.
While there are a lot of small differences between the proposed approaches to platform governance, they break down according to “who” gets to hold platforms accountable. The current and traditional approach in many places is for government regulators to receive complaints, investigate abuse and then issue punitive or compensatory fines. Standards-based approaches rely on industry to self-regulate, with the potential for the public to hold individual companies accountable in civil court, for breach of industry standard. And, fiduciary and trust-based models create a dedicated steward, with a defined mandate, that can also be held liable in court, in cases where they ignore or underperform their duties. The key difference between these approaches is whether they focus on empowering public rights of action (government action) or private rights of action (public accountability).
The primary strength of public rights of action and statutory approaches to fiduciary duties is that governments typically have an established set of mature tools and infrastructure to regulate markets and companies. One central criticism of public rights of action is that they’re inherently political, based on the influence and jurisdiction of individual governments over a company or industry. The most obvious example of this is the way that technology companies, and their stock prices, react to news about investigations by different authorities. The International Grand Committee on Big Data, Privacy and Democracy, whose most recent meetings in May 2019 were attended by representatives from 12 governments, has had three consecutive requests to testify ignored by Facebook CEO Mark Zuckerberg (Kates 2019). By contrast, technology stocks all lost significant (temporary) value when the US Department of Justice announced its antitrust investigations into the big five American tech platforms (Savitz 2019). So, if a single government defines or imposes a model of fiduciary duties on tech platforms, those duties could become global standards, effectively imposing that government’s cultural norms on platform users everywhere. Beyond that, and more concerningly, statutory fiduciary duties could also restrict the accessibility of enforcement mechanisms for people outside the imposing country. In other words, the process of deciding “who” gets to define standards around platform governance could prevent, or at least undermine, the effectiveness of that governance.
By contrast, centring fiduciary platform governance on individual rights of action relies on private law, focusing on enabling members of the public to resolve disputes with platforms without government intervention. The strongest argument against private rights of action is the large power asymmetry involved in access to justice, digital literacy and capacity. That being said, technology platforms have also been responsible for building governance mechanisms that scale quite effectively — eBay famously pioneered “online dispute resolution” to settle 60 million disputes per year (Rule 2008). That’s not to suggest more technology is always the solution, but rather that there are lots of ways for technology platforms to build credible, scalable and trustworthy governance processes. More importantly, focusing on individual agency, which can be assigned or allocated to fiduciaries, advocates and other collective action models, centres the conversation around public equity, instead of focusing on the political economy of large institutions. Unlike statutory approaches to fiduciary law, data trusts enable people to design and negotiate for their own priorities and values in the way that they’re represented in digital systems.
Thankfully, statutory (public) and data trust (private) approaches to creating fiduciary duties aren’t mutually exclusive — and are likely to complement each other as the field of practice develops professional standards. These trends point to the public’s desire for smaller, more accountable technology platform companies, especially as they relate to user-facing data governance. They also point to the practical complexity of competing public and private authorities, designing digital rights across cultures and legal jurisdictions, and balancing competing, valid interests. Those challenges aren’t specific to technology platforms, but they are significantly complicated by their global reach and domestically incorporated supply chains. Ultimately, these questions are not novel technology issues, but foundational political science questions. Given the prominence of digitization, their solutions are as likely to be engineered in state houses as in Silicon Valley.
Ultimately, platform governance is an almost infinitely complex challenge because of the scale of negotiation involved. The simple existence of the term “trust” doesn’t inherently earn public trust, and the use of the legal instrument doesn’t inherently ensure good governance. That said, trusts are a clear, established, tested legal vehicle for articulating, consolidating and stewarding the public interest — especially in contexts without well-established laws or rights.
The opportunity that data trusts offer to platform governance is a credible legal container to use as we start experimenting with new approaches, without risking that a failed experiment will make it easier to exploit the underlying data or its subjects.
Whether platform companies continue to unbundle to avoid liability, or because governments figure out how to make them, their component pieces will need what their aggregate lacked: a clearly articulated and operationalized duty to protect the public. It’s possible that some sovereign, or group of sovereigns, will be able to compel that articulation and operationalization — but it’s far more likely that those involved will figure it out first, in practice. Rather than try to drive deterministic approaches to platform governance, which are framed by a government’s legitimacy and leverage, policy authorities should focus on building an enabling environment for principled, accountable experimentation around data governance that clearly articulates standards for accountability and redress.
Policy makers looking for practical approaches to advancing platform governance should prioritize de-risking the enabling environment for data trusts, including by harmonizing international fiduciary laws, in parallel to their investments in antitrust. While governments and companies continue to wrestle over who has the authority to take companies apart, data trusts are a critical element of laying the foundation for the future — not because they inherently solve our trust problems but because, unlike most other legal tools, they clearly establish duties and accountabilities, often to specific groups and social causes. No matter how we decide to fix the platform economy we have, we’ll need to build the foundations of the future with different legal tools than the ones that got us here.
Data trusts are a new version of an old tool, and one that provides continuity during big transitions. Importantly, they also do the one thing that our current institutions do not: clearly and directly create actionable accountability. Rather than search for a perfect, silver bullet tool — the authorities pushing for platform governance should start with the powerful legal tools that we have and focus on ways to maximize their utility. No matter how we approach fixing platform governance, antitrust can only be part of the solution. In order to build the future of platform governance, policy makers will also need to maximize the value of the legal and governance tools we have to build trust. Data trusts are one place to start.
Balkin, Jack M. 2016. “Information Fiduciaries and the First Amendment.” Faculty Scholarship Series 5154. https://digitalcommons.law.yale.edu/fss_papers/5154.
Cao, Sissi. 2019. “Apple Refuses to Pay Ireland $14 Billion in Back Taxes — And the Irish Don’t Want It.” Observer, September 17. https://observer.com/2019/09/apple-ireland-tax-lawsuit-european-union-corporate-tax-dodging/.
Damgaard, Jannick, Thomas Elkjaer and Niels Johannesen. 2019. “The Rise of Phantom Investments.” Finance & Development 56 (3): 11–13. www.imf.org/external/pubs/ft/fandd/2019/09/pdf/the-rise-of-phantom-FDI-in-tax-havens-damgaard.pdf.
Elfrink, Tim. 2019. “‘A cesspool of hate’: U.S. web firm drops 8chan after El Paso shooting.” The Washington Post, August 5. www.washingtonpost.com/nation/2019/08/05/chan-dropped-cloudflare-el-paso-shooting-manifesto/.
Kates, Graham. 2019. “Facebook’s Mark Zuckerberg declines latest invite to appear before international lawmakers.” CBS News, September 9. www.cbsnews.com/news/facebooks-mark-zuckerberg-declines-latest-invite-to-questioning-by-international-lawmakers/.
Lomas, Natasha. 2019. “Google fined €1.49BN in Europe for antitrust violations in search ad brokering.” TechCrunch, March 20. https://techcrunch.com/2019/03/20/google-fined-1-49bn-in-europe-for-antitrust-violations-in-search-ad-brokering/.
McDonald, Sean. 2019a. “How Regulations Are Reshaping Digital Companies.” Centre for International Governance Innovation, April 15. www.cigionline.org/articles/how-regulations-are-reshaping-digital-companies.
———. 2019b. “What Is Stalling Better Data Governance?” Centre for International Governance Innovation, June 7. www.cigionline.org/articles/what-stalling-better-data-governance.
McKinnon, John D. and Brent Kendall. 2019. “States to Move Forward With Antitrust Probe of Big Tech Firms.” The Wall Street Journal, August 19. www.wsj.com/articles/attorneys-general-to-move-forward-with-antitrust-probe-of-big-tech-11566247753.
OpenCorporates. 2018. “Fireflies and algorithms — the coming explosion of companies.” https://medium.com/@opencorporates/fireflies-and-algorithms-the-coming-explosion-of-companies-9d53cdb8738f.
Quell, Molly. 2019. “Apple and Ireland Fight Against EU War on Corporate Tax Deals.” Courthouse News Service, September 17. www.courthousenews.com/apple-and-ireland-lead-charge-in-eu-war-on-corporate-tax-deals/.
Rule, Colin. 2008. “Resolving Disputes in the World’s Largest Marketplace.” ACResolution: The Quarterly Magazine of the Association for Conflict Resolution, Fall. http://colinrule.com/writing/acr2008.pdf.
Savitz, Eric J. 2019. “Facebook and Other Big Tech Stocks Are Barely Moving on the DoJ’s New Probe. Here’s Why.” Barron’s, July 24. www.barrons.com/articles/doj-investigation-tech-stocks-51563989279.
Sharma, Rakesh. 2018. “Why Google Became Alphabet.” Investopedia, January 2. www.investopedia.com/articles/investing/081115/why-google-became-alphabet.asp.
Sydell, Laura. 2016. “Google To Ban Payday Loan Ads.” National Public Radio, May 11. www.npr.org/2016/05/11/477693475/google-to-ban-payday-loan-ads.
Witt, Jesse and Alex Pasternack. 2019. “The strange afterlife of Cambridge Analytica and the mysterious fate of its data.” Fast Company, July 26. www.fastcompany.com/90381366/the-mysterious-afterlife-of-cambridge-analytica-and-its-trove-of-data.
Wolf, Martin. 2019. “Martin Wolf: why rigged capitalism is damaging liberal democracy.” Financial Times, September 18. www.ft.com/content/5a8ab27e-d470-11e9-8367-807ebd53ab77.
Wong, Julia Carrie. 2017. “The far right is losing its ability to speak freely online. Should the left defend it?” The Guardian, August 28. www.theguardian.com/technology/2017/aug/28/daily-stormer-alt-right-cloudflare-breitbart.