What Does Digital Policy Have to Do with Climate? A Great Deal

Even relatively small changes in corporate practices could meaningfully contribute to reducing emissions.

July 10, 2023
A tree stands in the middle of a dry grain field (aerial drone view). Farmers are struggling particularly with the consequences of prolonged periods of drought. (REUTERS)

Digital policy and climate have been identified as the two “meta-policies” defining the current European Commission’s agenda. Perhaps surprisingly, then, the Digital Services package — the Digital Services Act (DSA) and the Digital Markets Act (DMA), both passed by the European Union in 2022 as a new regulatory framework for internet services — draws few connections between the two. The words sustainable, sustainability and climate don’t appear in either the DSA or the DMA. Environment and ecosystem appear only in phrases like platform ecosystem and online environment. There’s no acknowledgement that digital platforms aren’t some kind of parallel universe or intangible cloud, but are instead energy-intensive material systems that directly affect the “offline environment” now rapidly spiralling into climate breakdown.

In an essential recent paper on the proposed Artificial Intelligence (AI) Act (which does explicitly address sustainability, albeit in flawed ways), Philipp Hacker argues that “every legal field — just like every industrial, administrative or consumption sector — will have to chart paths across its own territory to map possible contributions to the collective effort of mitigating climate change.” He suggests this is especially important for tech law, because it helps shape a rapidly evolving economic sector with substantial climate impacts. Hacker points out that information and communication technologies as a whole produce up to 3.9 percent of global greenhouse gas emissions, compared to aviation’s 2.5 percent, and that the impacts of digital technologies will likely increase alongside the growth in highly energy-intensive machine-learning tools.

In response to Hacker’s call for “every legal field” to consider sustainability, I offer here some reflections on the relevance of sustainability to my own primary research topic, the DSA (and I hope others will do so for the DMA and the rest of the European Union’s Digital Single Market program). I argue that emissions, water consumption, and other direct and indirect environmental impacts are, in principle, within the scope of DSA articles 34 and 35, which require the largest platforms to take reasonable measures to assess and mitigate broadly defined systemic risks. I highlight some areas where these obligations could be leveraged to demand more sustainable business practices, and where even relatively minor changes by dominant platforms could have significant ripple effects across the industry. If regulators and civil society prioritize sustainability in implementing the DSA, it could make a small but significant contribution to the urgent society-wide emissions cuts that the climate emergency demands.

Why the DSA?

Beyond the argument that all fields of law should prioritize sustainability, I think there are three reasons it’s particularly worth thinking about in the context of the DSA. First, unlike other relevant legislation, such as the AI Act and the proposed corporate sustainability due diligence directive, the DSA is already in force and will be fully applicable from February 2024. In the context of both the escalating environmental crisis and rapid technological developments, it offers legal tools that are available immediately.

Second, its scope is wider than the AI Act’s, whose key reporting and risk mitigation obligations currently cover only AI tools in specific high-risk sectors. Articles 34 and 35 apply at the level of services, not specific technologies. Services with more than 45 million monthly users in the European Union (roughly 10 percent of the EU population) must assess and mitigate “systemic risks” associated with their design, functioning and use. Importantly, the obligations extend not only to the risks of individual AI systems, but to any impacts of platforms’ design and operations within the broader business and social context in which they are used.

Finally, DSA enforcement could target some of the companies with the most impact and influence. In contemporary tech industries, essential infrastructural services like cloud computing, operating systems and many consumer-facing services (such as search and maps) are dominated by a few massive companies. Emerging AI technologies are expected to follow a similar platformized model. These dominant companies offer access to broader ecosystems of smaller actors, and consequently wield significant power to regulate other companies’ business practices. Any regulatory lever that could push them toward more sustainable practices is worth thinking about, because their design and governance choices have ripple effects throughout the sector.

So far, the Commission has designated the following services as very large online platforms (VLOPs) or very large online search engines (VLOSEs), meaning they are subject to the DSA’s risk mitigation obligations: Alibaba Group’s AliExpress; Amazon Store; Apple App Store; Bing; Booking.com; Facebook; Google’s Maps, Play, Search and Shopping; Instagram; LinkedIn; Pinterest; Snapchat; TikTok; Twitter; Wikipedia; YouTube; and German online retailer Zalando. This list doesn’t cover all the largest infrastructural services (since it focuses on consumer-facing platforms and excludes, for example, cloud providers) but overlaps with them significantly. That makes it highly relevant to ask what the DSA demands of these companies as a response to environmental crises.

Climate as a Systemic Risk

Notwithstanding suggestions by some tech industry figures that hypothetical superhuman AI poses an existential threat to humanity, it should be clear, judging from the latest reports of the Intergovernmental Panel on Climate Change, that no risk is more real, immediate, severe and systemic than the climate emergency. And while many may see AI and digitalization as a separate issue from climate, the material infrastructure and the energy and resources that digital technologies require mean the two are inherently intertwined. So does article 35, which deals with mitigation, require VLOPs/VLOSEs to mitigate climate impacts?

The first paragraph of article 34 (which deals with risk assessment) lists relevant areas of risk: illegal content, fundamental rights, civic discourse, elections, public security, gender-based violence, public health, safety of minors, and people’s physical and mental well-being. Climate and sustainability are not expressly mentioned. However, climate impacts that are already affecting the European Union — from the 2022 heat waves that caused over 20,000 deaths, and other extreme weather events like floods, to drought and its long-term impacts — directly impact several of these: most obviously public health, physical well-being and security, and, arguably, fundamental rights.

Surprisingly, then, there has been virtually no discussion so far of potential DSA obligations to assess and mitigate environmental risks. One US non-governmental organization, Climate Action Against Disinformation, has highlighted the DSA’s relevance to climate-related misinformation, yet misinformation is hardly the most consequential way dominant tech platforms affect the climate. As policy delay and half measures replace outright denial, climate action is blocked not primarily by misinformation, but by political obstacles to major economic reform.

The DSA is fundamentally geared toward managing risks and strengthening oversight within existing market structures, and as such, hardly pursues the kind of ambitious structural reforms that the climate emergency demands. However, two areas can be highlighted where VLOPs/VLOSEs’ business practices have significant environmental impacts, and which fall squarely within the scope of articles 34 and 35. The DSA empowers regulators to demand changes in these business practices, which could meaningfully — albeit incrementally — contribute to reducing the overall impacts of the tech sector.

The first area is the direct environmental costs of the technologies built and implemented by VLOPs/VLOSEs, or by other companies within their platform ecosystems. Virtual environments are built on very material infrastructures of servers and data centres, involving not only substantial emissions but also significant water consumption and mining. Current trends suggest that without regulatory intervention, these impacts will sharply increase — notably, through the push by leading companies, including Meta and Apple, to commercialize virtual reality (VR) products, and through advances in generative AI, which is increasingly being integrated into search engines, social media and other consumer-facing applications. Both VR and AI technologies are very energy- and resource-intensive.

The second area involves indirect impacts of platforms’ business models. Of the 19 current VLOPs/VLOSEs, 10 are funded wholly or primarily by advertising and four are e-commerce sites (some advertising-funded platforms like Facebook and TikTok are also diversifying into e-commerce). Accordingly, most of them benefit from encouraging people to consume ever more products and services, regardless of the resulting environmental impacts.

On a straightforward reading of article 34(1), these direct and indirect environmental impacts pose systemic risks to several of the public interests mentioned. In principle, this means platforms must “diligently identify, analyse and assess” at least yearly, and before deploying new functionalities, how “the design or functioning of their service and its related systems, including algorithmic systems, or…the use made of their services” could contribute to those risks. Article 35(1) says they must “put in place reasonable, proportionate and effective mitigation measures.” Article 37 further requires regular independent audits of their risk assessments and mitigation measures.

What could this mean in practice? “Reasonable, proportionate and effective mitigation measures” should require VLOPs/VLOSEs to take steps to minimize their own energy and resource use. Plausibly, it could also require them to implement measures to discourage unsustainable practices by other businesses in their wider platform ecosystems — in much the same way that they are already expected by consumers and regulators to implement safeguards against fraud, criminal activity or privacy violations by third parties. The following sections highlight some significant practical steps that could be taken in each area.

Mitigating Direct Impacts

Without overstating the capacity of corporate due diligence to address structural economic problems, we can say, in law and energy policy professor Shelley Welton’s words, that the climate crisis requires us to pursue “every feasible emission cut that can be achieved anywhere.” And importantly, given these companies’ scale and industry-wide influence, even relatively small changes in their business practices could meaningfully contribute to reducing emissions. By definition, each of the VLOPs/VLOSEs reaches at least 45 million consumers in the European Union (and probably many more outside it) each month. Research suggests that relatively minor changes to the large AI models on which many services are built can make significant differences to downstream energy consumption. Any efficiency improvements these companies are induced to make under the DSA could have an outsized impact.

One obvious mitigation measure could be building energy-intensive data centres in locations using more renewable power — something leading companies already claim to do. However, cutting overall energy use is still essential. Renewable energy still has environmental impacts, and given the urgency of rapid economy-wide decarbonization, any unnecessary use of energy takes away renewable capacity that is needed elsewhere.

This implies that risk mitigation measures should include choosing technologies that require less computing power (and thus less energy in data centres) and making other efficiency improvements wherever reasonably possible. An expansive interpretation of DSA risk assessments could impose “sustainability by design” obligations like those Hacker advocates including in the AI Act, requiring companies to assess different technologies’ environmental costs and pursue more sustainable options. The AI Act’s current provisions on climate reporting and assessment cover only high-risk AI systems and foundation models, and their final scope remains to be determined. Meanwhile, the DSA provides broader risk mitigation obligations, effective immediately. The European Commission could already issue interpretative guidance under article 35(3), clarifying that risk mitigation measures should include sustainability by design.

More radically, articles 34 and 35 could offer regulators a basis on which to discourage or block the rollout of products whose environmental costs far outweigh their social benefits. Not only emerging technologies, like generative AI and immersive 3D video, but also more established data-processing technologies, used for purposes like targeted advertising and content recommendations, are very energy- and resource-intensive. It’s hard to argue this is a better use of the world’s remaining “carbon budget” and other scarce resources than, for example, AI applications in other fields such as health care — or, taking a broader perspective, than basic necessities for the 13 percent of people in the world who don’t have access to electricity. Indeed, these technologies arguably don’t even serve users in wealthy countries. Many analysts understand major companies’ rush toward VR and generative AI more as “arms races” to control the infrastructure for future lucrative platform ecosystems than as responses to consumer demand.

As scholars like Amy Kapczynski and the late David Graeber have suggested, the contemporary tech industry’s focus on creating ever more sophisticated and energy-hungry technologies for media production and consumption is completely out of line with society’s actual urgent need for things like green energy, infrastructure and buildings. In Hacker’s words, this “is a debate that, in terms of climate emergency, our societies must increasingly be prepared to have.”

Article 34(1) of the DSA requires specific risk assessments before deploying new functionalities that could affect systemic risks. These assessments could provide a basis not only for open public debate about what technologies are worth building, but also for regulatory intervention. If the European Commission really wants to prioritize environmental issues in DSA implementation, it could issue guidance stating that where a risk assessment finds substantial environmental impacts that aren’t justified by clear social benefits, it will regard deploying such a product as a violation of the article 35 duty to reasonably and proportionately mitigate risks.

Mitigating Downstream Impacts

Beyond the direct impacts of technologies they build themselves, many VLOPs/VLOSEs exercise significant power over broader tech ecosystems. We already expect gatekeeper platforms that give third-party businesses access to consumers — like e-commerce marketplaces and app stores — to actively regulate areas ranging from privacy and cybersecurity to consumer protection (which is emphasized in the DSA’s provisions on online marketplaces). It wouldn’t be a stretch for “reasonable, proportionate and effective” mitigation of climate risks to encompass governance of platform ecosystems.

E-commerce is an obvious area where this is relevant: marketplace platforms like Amazon and Zalando govern supply chains for swaths of consumer purchases, with serious environmental implications. These platforms could be required to favour more sustainable products in recommendations, and penalize wasteful production practices such as planned obsolescence. Similarly, advertising platforms could be required to stop hosting advertisements — or at least penalize them in recommendation and pricing systems — for the most environmentally damaging products.

E-commerce platforms also provide logistics services for third-party sellers, representing another important pressure point for more sustainable production and distribution practices. Risk mitigation could include limiting or ending support for practices such as free returns, which not only encourage unnecessary purchases but also create a logistical nightmare that often sends good-as-new products to landfill. It could also mean preventing platforms from destroying perfectly functional goods to save warehouse storage costs.

Another key area is software development. App stores and other companies, such as Meta, that provide resources to third-party developers could require them to preferentially use less energy-intensive technologies — for example, more efficient machine-learning techniques, or simpler technologies where these would suffice. If the dominant smartphone app stores, Google Play and Apple’s App Store, implemented such policies, they could have meaningful impacts across consumer software markets. In future, with Meta and Apple building VR systems as app-store-like platforms for third-party developers, such obligations could make a major difference to these ecosystems’ environmental impacts.

The Future Outlook

As I’ve argued elsewhere, the breadth and vagueness of articles 34 and 35 in the DSA are a blessing and a curse. The interpretation I’ve just outlined is perfectly plausible, but since the range of potentially relevant risks and mitigation strategies is huge, there’s no guarantee companies and regulators will prioritize environmental issues. Indeed, since risks and mitigation measures will primarily be defined by VLOPs/VLOSEs themselves, they may well be interpreted in corporate-friendly ways that — conveniently — require only cosmetic changes to current business practices.

In a thought-provoking recent article on financial ESG (environmental, social and governance) initiatives, Claire Parfitt and Gareth Bryant argue that industry-led ESG risk mitigation inevitably prioritizes risks to investors over broader social harms. Leading tech law scholar Julie E. Cohen identifies similar patterns in other regulatory fields. Risk-based regulation may be presented as objective and technocratic, but the definition and evaluation of risks are determined by political interests. If “systemic risks” in the DSA are defined primarily by platform companies, we should not be optimistic that this framework will achieve significant reforms.

Yet this malleability also offers opportunities. Parfitt and Bryant argue that “financial risk is now a field of politics in its own right,” with environmental activists, like fossil fuel divestment campaigners, fighting to redefine risk in ways that serve progressive goals. DSA systemic risks, likewise, should be a field of political struggle. Civil society and academics will significantly influence DSA interpretation and enforcement — from conducting independent research that informs regulatory enforcement, to participating in developing codes and best practices. Regulators are currently gearing up for DSA implementation by hiring staff, designing internal processes, and issuing delegated acts, codes and guidance. There is a window of opportunity now to push for a focus on sustainability.

It may seem unlikely that regulators will pursue radical interpretations such as entirely banning unnecessarily environmentally destructive products, but the primary barriers are political, not legal — and they aren’t set in stone. Moreover, even less-radical interpretations, like requiring sustainability impact assessments and encouraging gatekeeper platforms to favour less environmentally damaging products, could non-negligibly reduce emissions. Every fraction of a degree counts. The DSA offers important levers that could meaningfully impact the emissions trajectory of the tech industry — if regulators and civil society actors make use of this potential.

The opinions expressed in this article are those of the author and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Rachel Griffin is a Ph.D. candidate and lecturer in law at Sciences Po Paris.