The UK is seeking to coordinate the work of its privacy and competition regulators through bodies such as the Digital Regulation Cooperation Forum. This is to be welcomed. There is much that the Information Commissioner’s Office (‘ICO’) and Competition and Markets Authority (‘CMA’) can usefully discuss, such as which regulator should act (or act first) when boundary cases arise, and how each regulator can best accommodate the other’s objectives when setting enforcement priorities.
But coordination has its limits. The ultimate vision - at least for many with a competition policy mindset - is of the privacy and competition regimes working together to drive competition that delivers better privacy outcomes. It is not clear that this vision is realistic. If privacy standards are thought to be too low, the more productive approach is likely to be to raise those standards directly by enforcing the GDPR.
The limits to coordination
The privacy and competition regimes can work in mutually reinforcing ways. For instance, appropriate enforcement of the GDPR’s transparency rules can ensure that consumers have easy access to trustworthy information about how firms process their data, in turn empowering consumers to make informed choices about their privacy when navigating digital or other markets.
But the vision of privacy and competition working together to improve privacy standards faces three challenges.
The first is that the GDPR may dampen competition, at least in some contexts. For instance, the GDPR can make it harder for personal data to be shared between multiple competitors, not least because of the difficulty of identifying a valid GDPR justification for sharing and of ensuring sufficient transparency for the consumers concerned (see the ICO’s 2019 ‘Update report into adtech and real time bidding’ for a practical example). If the data in question is needed for rival firms to be able to compete, GDPR impacts of this kind can undermine competition.
The second challenge is that, even if there is healthy competition, aspects of the GDPR can arguably make it harder for firms to compete on privacy grounds. Online ad-funded products and services are a good illustration. Most are funded by targeted advertising, and a privacy advocate might want to see firms transition over time to gathering and using ever less data in their advertising operations. But, as currently understood[1], consent under the GDPR already requires firms that use targeted advertising to also offer consumers a fully functional ‘privacy lite’ version of their product or service that is free of targeted advertising. So, for example, online publications generally provide consumers with a choice to browse without targeted advertising (and associated cookies). But this feature of the GDPR makes it hard to see why a firm could ever compete by bringing to market a less privacy-intrusive version of targeted advertising. A consumer who is already using the privacy lite version of an offering would have no privacy-based reason to switch to the new firm. And if a consumer has not selected the privacy lite version of an existing offering, it is hard to see why the new firm’s privacy stance would entice them.
The third challenge is not specific to the GDPR. Even assuming the privacy regulator has ensured that accurate privacy information is easily available to guide consumer choices, competition is only likely to raise privacy standards if, in practice, (i) consumers inform themselves of the relevant privacy information before making their choices and (ii) those choices are materially influenced by such information.
As for the first condition, the evidence suggests that consumers spend very little time reading privacy policies.[2] And this is broadly consistent with experience in other contexts, such as consumer contracts.[3]
It is also unclear whether the second condition is generally satisfied. Consumers may say they care about privacy, but what matters in terms of market outcomes is what they do - their ‘revealed preference’. And consumers’ actions suggest privacy considerations play little role. After the much-publicised Cambridge Analytica scandal broke in March 2018, Facebook’s ‘Daily Active Users’ held steady in North America and dropped by only 1% in Europe.[4] TikTok achieved meteoric growth despite repeated public concerns about its data practices.
There are a variety of reasons why consumers may not give much weight to privacy when they make their choices. Some may not properly weigh future harms (a phenomenon known to economists as ‘time inconsistency’). Or it may be that consumers do not ‘price in’ potential negative effects on other people: tracking the purchasing behaviour of one consumer may enable a firm to charge a higher price to another, and allowing consumer profiles and preferences to be easily shared between firms may enable the construction of a voter dataset that could be used to manipulate the democratic process. Or it could simply be that the revealed preference of many consumers does not accord with the worldview of privacy regulators. To give one example: going by the content of EU law, the EU seemingly thinks it important that consumers are presented with a cookie banner whenever they first visit a website. It seems doubtful that consumers agree.
Direct enforcement
There is an elegance in trying to use competition to raise privacy standards. Healthy competition can drive continual progress on price, quality, range and service to the ongoing benefit of consumers. Why not do the same for privacy?
There are some technical and GDPR-specific issues with trying to use competition in this way. In principle, a post-Brexit UK - like any country outside the EU - could make different trade-offs in its privacy regime to avoid, or at least minimise, issues of this type. But a more fundamental challenge remains: in practice, consumers do not seem to care enough about privacy for their choices to tilt markets towards better privacy outcomes.
Stepping back, this is perhaps not a particularly anomalous result. We do not leave to consumers the job of raising environmental standards or product safety standards. Instead, we have environmental laws and product safety laws, and we try to ensure that these are properly enforced.
Why should the same not be true for privacy? If privacy standards are not high enough, the better answer is likely to be to enforce privacy laws directly (or update those laws, and then enforce them) rather than hope that the market can be steered into delivering higher standards by itself. With firms in general complying with the privacy standards that society has chosen and set down in law, competition can be left to do its work - improving offerings in the ways that matter most in the eyes of consumers.
1. See, for instance, the European Data Protection Board’s 4 May 2020 Guidelines on consent.
2. See, for instance, the discussion in Chapter 4 of the CMA’s 1 July 2020 ‘Online platforms and digital advertising’ market study final report.
3. See Chapter 2 of the OFT’s 24 February 2011 ‘Consumer contracts’ market study report.
4. See Facebook’s 2020 SEC Filing.