The European Commission has finally issued the proposed Digital Markets Act (DMA), its bid to complement antitrust intervention in digital markets with ex-ante regulation in the form of a set of obligations that platforms identified as “gatekeepers” must abide by. The UK – having severed its links with Europe – simultaneously laid out its own distinct approach to regulating digital markets, now taking real shape after the statement of intentions in the 2019 Furman report. All of this is happening, extraordinarily, in the very same weeks that have seen five major complaints filed in the US against Google and Facebook by the federal agencies and the state attorneys general. And China has opened a major investigation of e-commerce giant Alibaba.
While the final form of the EU DMA rules may change, possibly substantially, on its journey through the European Parliament and European capitals before final approval, there is already a lot to consider.

First, let’s say what this isn’t. Americans in particular, looking at it from afar, may expect something akin to common carrier or public utility-style regulation. Not so – the regime is not designed to regulate infrastructure monopolies, but rather to create competition as well as to redistribute some rents.

Second, the current definition of “gatekeeper” is not nuanced, and so we expect it will be updated and improved in the review process.

Third, in our reading, the list of Obligations reads as a catalogue derived from past and current antitrust cases involving the usual set of big tech platforms, but it lacks the translation tools to map a rule from the setting that inspired it to other businesses that are deemed gatekeepers. Translating these dicta into actionable rules that people and companies can understand will likely require clearer organising principles around business models. The UK is doing just this – the CMA’s proposed regulation identifies the equivalent of a gatekeeper platform while at the same time creating a set of rules designed for that specific business model.

Fourth, while “data” is mentioned multiple times in the Obligations, it is unclear that the rules do enough to recognise the direct consumer harm that flows from the exploitation of data and the extraction and appropriation of consumer value, amplified by privacy concerns.

Lastly, while we understand there are legal reasons why the DMA could not include merger reform, the effective regulation of digital platforms requires powering up this essential tool.
As the UK is folding its merger control into its digital markets regime, and the US is making undoing bad mergers a cornerstone of its antitrust cases against Facebook and Google, there appears to be a significant lacuna in the EC digital regime that needs to be addressed.
For Americans: What this isn't
The US’ “big awakening” on the use of antitrust to deal with digital markets (Google and Facebook in particular) is most welcome and overdue. To Europeans, the recent federal and state complaints have looked like an extraordinary giant iceberg breaking free and finally on the move – with a much broader scope and bolder agenda than anything Europe had set out to do. While Europe has done good cases, zooming in on a particular market and conduct (Google Shopping, Android), nothing has been quite as far-reaching in ambition. “You cannot buy your way out of competition” is the big underlying theme of the US complaints – a theme with broad reach, encompassing exclusivity agreements, special deals with rivals to keep them out of a market, and multiple acquisitions to buy out threats. It will take some time for the US policy community to evaluate what these cases can be expected to achieve, and on what timeline. In future we expect to see digital regulatory initiatives advance in the US as well.
And because experimentation with different approaches will matter, industry participants and policymakers in the US will benefit from watching the European regulatory experiment unfold. It is important for Americans to appreciate that the European DMA (European Commission 2020) is not a step towards breakups (in classic European fashion, these are mentioned only briefly, as a last resort for repeat offenders) or common carrier/public utility-style regulation. Its animating principle is not so much to control the power of a monopoly infrastructure (e.g. setting access terms), but much more to prohibit or discourage conduct that has either the intent or the effect of preventing the entry of a rival (or raising its costs) where entry would otherwise be possible. A second purpose is to enforce fairness, a strong pillar of the European ordoliberal tradition, by prohibiting conduct that exploits and weakens counterparties that depend on the platform. Removing obstacles to entry, and fairness in the relationship with dependants, are the two goals of the law. Its method is “pro-competitive regulation” that seeks to tame market power by enabling new competitors, rather than by choosing price or quality levels.1
Note that this is quite different from a sector-specific regulator who might approve particular prices or approve certain product characteristics. US observers tend to associate the word “regulation” with this type of market intervention. The EC law is designed to operate much more strongly on the dimension of barriers to entry and to competition in the expectation that, if entry barriers are lowered, more competition can create a competitive price or quality (though consumer protection is also needed, which the parallel Digital Services Act – issued simultaneously to the DMA – is intended to take up).
The European Commission approach: Needs a translation key, and some organising principles
The DMA envisages a two-step process in which the “provider of a core platform service”2 first self-designates as a “gatekeeper”, and then adheres to a list of obligations that apply to all gatekeepers.
The criteria for the designation of a gatekeeper are quantitative (annual EEA turnover above €6.5 billion in the last three years, average market capitalisation or equivalent fair market value above €65 billion in the last year, activity in at least three Member States, over 45 million monthly active end users in the Union, and over 10,000 yearly active business users in the last year). Back-of-the-envelope calculations suggest that these criteria will capture not only (obviously) the core businesses of the largest players (GAFAM), but perhaps also a few others. Oracle and SAP, for instance, would appear to meet the thresholds, as would AWS and Microsoft Azure. Conversely, Twitter, Airbnb, Bing, LinkedIn, Xbox, Netflix, Zoom and Expedia do not appear to meet the thresholds at present, while Booking.com, Spotify, Uber, Bytedance/TikTok, Salesforce, Google Cloud and IBM Cloud appear to meet some but not others at this point.3
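The cumulative logic of these quantitative criteria can be sketched in a few lines of code. This is a minimal illustration of our reading of the draft’s Art. 3 test, not an official implementation; the `Platform` class, the helper name and the example figures are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Platform:
    """Illustrative financial and usage figures for one core platform service."""
    eea_turnover_bn_eur: float     # annual EEA turnover, last three years (EUR bn)
    market_cap_bn_eur: float       # average market capitalisation, last year (EUR bn)
    member_states: int             # Member States in which the service is active
    monthly_end_users_m: float     # monthly active end users in the Union (millions)
    yearly_business_users: int     # yearly active business users

def meets_quantitative_thresholds(p: Platform) -> bool:
    """Rough reading of the draft thresholds: a financial-size test
    (turnover OR market cap), plus a geographic footprint test,
    plus a user-base test on both sides of the platform."""
    financial_size = p.eea_turnover_bn_eur >= 6.5 or p.market_cap_bn_eur >= 65.0
    footprint = p.member_states >= 3
    user_base = p.monthly_end_users_m >= 45.0 and p.yearly_business_users >= 10_000
    return financial_size and footprint and user_base

# Hypothetical figures, not real company data:
big = Platform(20.0, 900.0, 27, 300.0, 2_000_000)
niche = Platform(2.0, 30.0, 5, 12.0, 8_000)
print(meets_quantitative_thresholds(big))    # True
print(meets_quantitative_thresholds(niche))  # False
```

Note that on this reading the financial criteria are alternatives (turnover or market capitalisation), while the footprint and user-base criteria must all be met – which is what makes borderline firms meet “some but not others”.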
For those that do not meet the quantitative criteria, there is a long-winded alternative method of designation via a “market investigation” – a new tool which, however, will require time to get going and to run, and may not survive the review process in its current form.4 The designation of gatekeepers mainly through quantitative rules is clearly intended to leave no room for the imagination – it will curb shenanigans and flannelling by companies trying to argue against all common sense, and speed up the process of designation. On the other hand, a more principled approach will be needed for platforms that fall below the hard thresholds but may still be capable of conduct the law wishes to proscribe.
There are then two sets of “obligations” laid out for gatekeepers: a shorter list of obligations that apply without qualification, and a longer list of obligations “susceptible of being further specified” – the latter more tentative and “for discussion”, the former a definitive list of proscribed conduct (i.e. “thou shalt not”).
Identifying conduct that is not acceptable in general is important and right, but these lists are a curious game of charades. With experience and familiarity with past, current and pipeline EC antitrust cases, one can just about assign each entry to a particular company and its issue. We attempt to do this in the table below. But this mapping is not obvious, because the writers have generalised each case away from its specific setting in order to apply a rule across the board. And once the mapping is finished, it is clear that some rules really are specific to one – or perhaps two – platforms, but it is unclear how they might or should apply to others, both within and outside the traditional GAFA list.

So how can these lists be made operational? Some organising principles around business models would have been more useful, even if one does not want to get too “close and personal” and name individual companies. A fixed set of rules – covering all kinds of business models – applying to any platform that is designated a gatekeeper is the opposite of “flexible”. What is more, the separation between the designation of a gatekeeper first and the application of the obligations second is artificial, because it is through the evaluation of conduct and its impact that an agency would identify a gatekeeper and understand which particular rules would ameliorate the problems that have been identified. As discussed further below, the UK seems to be taking this combined approach.
The gatekeeper role cannot be independent of business models
Intuitively, we think of a gatekeeper as an intermediary who essentially controls access to critical constituencies on either side of a platform that cannot be reached otherwise, and as a result can engage in conduct and impose rules that counterparties cannot avoid. Susan Athey proposes a similar definition: “A platform acts as a gatekeeper when it aggregates a meaningfully large group of participants that are not reachable elsewhere” (Athey 2020). The key is that the way in which gatekeeping power can materialise is distinct across business models (and platforms are often conglomerates operating several related business models; for example, Amazon Marketplace is distinct from AWS, Google’s various individual businesses – operating systems, search, placing of display ads – are all different, and so on). The designation of gatekeeper applies not to the whole firm, but to one business within the conglomerate.
The need to recognise business models explicitly in designing rules for tech is now well established (Caffarra 2019, Athey 2020, Caffarra et al. 2020). The DMA makes only fleeting reference to business models (four times in the whole document, and to no particular purpose), but in practice there are big differences in economic properties and incentives across business models. Compare three rough groups: ad-funded digital platforms (Google, Facebook, Bing, Pinterest, Twitter, Snapchat); transaction or matchmaking platforms that are marketplaces and exchanges (Uber, Airbnb, Amazon, DoubleClick); and OS ecosystem platforms (i.e. operating systems and app stores such as iOS, the App Store, Android, the Google Play Store, Microsoft Windows, AWS, Microsoft Azure, etc.). These business models differ in systematic ways in terms of (a) the type of economies of scale they rely on (data scale, R&D costs); (b) the type and direction of network effects (direct/indirect, one/both directions); (c) the potential for multihoming (on one or both sides); and (d), as emphasised again by Athey, the potential for disintermediation, either by someone else “introducing a different layer” intermediating two sides of the platform (e.g. end users and business users) or by finding a way for the two sides to connect to each other directly.
These distinctions matter because they mean the entry strategies of competitors will differ, and therefore defensive strategies will also differ. They also matter for the definition of a gatekeeper. Because a gatekeeper must be a business that controls access to a large enough group of users to affect entry and competition, key to the designation of a gatekeeper is whether there are obstacles to multihoming, and whether users cannot directly bypass the platform. Obstacles to multihoming and disintermediation could be in part inherent to the service (transaction costs, technical barriers), but could also be induced by the conduct of the platform. At the stage of designating a gatekeeper, this distinction does not matter. If there is a large enough user base that entry depends on, including upstream and downstream, and there is limited ability to multihome and no real possibility for bypassing the platform, then the platform business will be deemed to have “gatekeeper power”. However, the analysis of disintermediation and multihoming possibilities differs between three main categories of business models: ad-funded businesses, transaction/match-making businesses, and operating systems/app stores.
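To make the taxonomy concrete, the dimensions (a)-(d) above and the gatekeeper screen just described can be sketched as a small data structure. The boolean characterisations are deliberately crude illustrations (and, as noted, obstacles to multihoming can also be induced by conduct rather than inherent to the service); all names here are ours, not the DMA’s.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class BusinessModel:
    """Stylised properties of a platform business along the four dimensions."""
    name: str
    scale_source: str             # (a) main economies of scale
    network_effects: str          # (b) type and direction of network effects
    easy_multihoming: bool        # (c) can users realistically multihome?
    easy_disintermediation: bool  # (d) can the two sides bypass the platform?

def gatekeeper_candidate(m: BusinessModel) -> bool:
    """Per the discussion above: gatekeeping power requires that users can
    neither multihome easily nor connect to the other side directly."""
    return not m.easy_multihoming and not m.easy_disintermediation

# Illustrative characterisations of the three rough groups:
ad_funded = BusinessModel("ad-funded search/social", "data scale",
                          "indirect: users attract advertisers", False, False)
marketplace = BusinessModel("transaction/matchmaking", "liquidity and data",
                            "indirect, both directions", True, True)
os_ecosystem = BusinessModel("OS ecosystem/app store", "R&D fixed costs",
                             "indirect: users <-> developers", False, False)
```

On these stylised values a marketplace, where multihoming and direct contracting are often feasible, does not screen in as a gatekeeper candidate unless the platform’s own conduct flips those booleans by blocking multihoming or bypass.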
What the “business models” approach makes clear is that it is also hard to formulate rules that are model-independent and work across the piece. It seems optimistic to us to imagine that despite the different incentives created by the different functions of these platforms, a list of rules that are fairly specific to one setting will work across all of them. More flexibility will need to be built in to make sure each rule fits and is effective in each setting; but by articulating a goal of protecting the competitive process and consumers, that flexibility can make the rules stronger, not weaker.
Translating the Obligations
In the table below, we reproduce the list and try to annotate it (not without some ambiguity) to map how we think the Obligations may have arisen (with a couple of exceptions) from particular platform issues based on publicly known cases and complaints. Some rules appear to have an “Apple” label on them, others a “Google” label, others an “Amazon” label; only a few appear relevant to more than one platform.
Obligations for gatekeepers, DMA Art. 5
(a) refrain from combining personal data sourced from these core platform services with personal data from any other services offered by the gatekeeper or with personal data from third-party services, and from signing in end users to other services of the gatekeeper in order to combine personal data;
(b) allow business users to offer the same products or services to end users through third party online intermediation services at prices or conditions that are different from those offered through the online intermediation services of the gatekeeper;
(c) allow business users to promote offers to end users acquired via the core platform service, and to conclude contracts with these end users regardless of whether for that purpose they use the core platform services of the gatekeeper or not, and allow end users to access and use, through the core platform services of the gatekeeper, content, subscriptions, features or other items by using the software application of a business user, where these items have been acquired by the end users from the relevant business user without using the core platform services of the gatekeeper;
(d) refrain from preventing or restricting business users from raising issues with any relevant public authority relating to any practice of gatekeepers;
(e) refrain from requiring business users to use, offer or interoperate with an identification service of the gatekeeper in the context of services offered by the business users using the core platform services of that gatekeeper;
(f) refrain from requiring business users or end users to subscribe to or register with any other core platform services identified pursuant to Article 3 or which meets the thresholds in Article 3(2)(b) as a condition to access, sign up or register to any of their core platform services identified pursuant to that Article;
(g) provide advertisers and publishers to which it supplies advertising services, upon their request, with information concerning the price paid by the advertiser and publisher, as well as the amount or remuneration paid to the publisher, for the publishing of a given ad and for each of the relevant advertising services provided by the gatekeeper.
Obligations for gatekeepers susceptible of being further specified, DMA Art. 6
(a) refrain from using, in competition with business users, any data not publicly available, which is generated through activities by those business users, including by the end users of these business users, of its core platform services or provided by those business users of its core platform services or by the end users of these business users;
(b) allow end users to un-install any pre-installed software applications on its core platform service without prejudice to the possibility for a gatekeeper to restrict such un-installation in relation to software applications that are essential for the functioning of the operating system or of the device and which cannot technically be offered on a standalone basis by third parties;
Our mapping: Google, Apple, Microsoft?12
(c) allow the installation and effective use of third-party software applications or software application stores using, or interoperating with, operating systems of that gatekeeper and allow these software applications or software application stores to be accessed by means other than the core platform services of that gatekeeper. The gatekeeper shall not be prevented from taking proportionate measures to ensure that third party software applications or software application stores do not endanger the integrity of the hardware or operating system provided by the gatekeeper;
(d) refrain from treating more favourably in ranking services and products offered by the gatekeeper itself or by any third party belonging to the same undertaking compared to similar services or products of third party and apply fair and non-discriminatory conditions to such ranking;
Our mapping: Google, Amazon, Apple14
(e) refrain from technically restricting the ability of end users to switch between and subscribe to different software applications and services to be accessed using the operating system of the gatekeeper;
(f) allow business users and providers of ancillary services access to and interoperability with the same operating system, hardware or software features that are available or used in the provision by the gatekeeper of any ancillary services;
Our mapping: Google, Facebook, Apple16
(g) provide advertisers and publishers, upon their request and free of charge, with access to the performance measuring tools of the gatekeeper and the information necessary for advertisers and publishers to carry out their own independent verification of the ad inventory;
(h) provide effective portability of data generated through the activity of a business user or end user and shall, in particular, provide tools for end users to facilitate the exercise of data portability, in line with Regulation EU 2016/679, including by the provision of continuous and real-time access;
Our mapping: General – data portability is by now a non-specific policy objective
(i) provide business users, or third parties authorised by a business user, free of charge, with effective, high-quality, continuous and real-time access and use of aggregated or non-aggregated data, that is provided for or generated in the context of the use of the relevant core platform services by those business users and the end users engaging with the products or services provided by those business users;
Our mapping: General – data access/interoperability is a broad policy objective
(j) provide to any third party providers of online search engines, upon their request, with access on fair, reasonable and non-discriminatory terms to ranking, query, click and view data in relation to free and paid search generated by end users on online search engines of the gatekeeper, subject to anonymisation for the query, click and view data that constitutes personal data;
(k) apply fair and non-discriminatory general conditions of access for business users to its software application store.
Conditioning on business models would be clearer and more useful
So we can map these rules into cases, just about. But what are the generalisable principles? The narrative explanation in paragraphs 32-57 of the draft law devotes a paragraph to each obligation, but each is just a slightly expanded version of the list we show above. The text says – in more formal terms – that it is typically bad for a gatekeeper to mingle data; to restrict business users from offering cheaper services through other channels, or from promoting and distributing their services through other channels; to restrict end users from switching between different software applications and services (e.g. through pre-installation); to deny business users sufficient transparency on advertising prices; or to use data generated from transactions by its business users on the core platform for the purpose of its own services that are similar to those of its business users. This is just a repetition of what the Obligations say.20
Some companies will be able to recognise themselves, but what about others, who will need to second-guess how the rule might translate to their case? And how future-proof are rules enunciated in such a backward-looking way? What will happen when technology and business models change?
A more useful approach would condition rules on business models. This would allow for a recognition that (a) business models have different economic properties, (b) the way entry may occur differs across them, (c) therefore defensive strategies to undermine entry will differ, and (d) therefore pro-competitive rules to lower entry barriers will also differ and need to be specified with that in mind.
With this approach one can then be principled and truly more specific about the conduct that should be proscribed in each case to achieve both fairness and more competition through entry and multihoming. The way a rival will seek to enter against a social network is different from how it will compete with a search engine, or an operating system, or an e-commerce business. So one needs to think first about the business strategy a nascent competitor might deploy, and then look for exclusionary conduct, entry barriers, or acquisitions that could limit the new competition.
For instance, in the case of ad-funded services, the “flywheel” – the virtuous cycle that generates user engagement – relies on building up a user base for an interesting service that then attracts advertisers. The entry path for an entrant needs to involve ways of scaling up quickly on the user side in order to then bring on advertisers: for example, striking a distribution deal with someone who accounts for a large block of users and becoming the default there. Conduct that affects the ability of an entrant to gain some sort of scale can thus be problematic – for example, the gatekeeper establishing defaults to keep users with the platform, entering into exclusivity deals with distributors that are then unavailable to potential challengers and deprive them of scale, making/buying a vertical service and then advantaging it to take away customers from competing verticals, or integrating into adjacent areas and then bundling/tying to make entrant scale more difficult. We see these issues raised in both the EC Android case and the US Department of Justice/state attorneys general complaints against Google search.
The case of platforms like operating systems or app stores raises a different set of concerns – for example, whether the platforms place obstacles in the way of developers operating across other platforms, whether they make it difficult for developers to distribute through other channels, and whether they make it difficult for users to port their content across platforms. And different again is the case of marketplaces and transaction platforms, where multihoming is often prevalent. Here we may worry about scale generating data that creates a competitive advantage versus both rivals and complements on the platform, and about that data being used in ways that may harm incentives to innovate; or there could be concerns about an algorithm for surfacing recommendations to consumers being designed in a way that favours the platform over the business complements that operate on it.
In our view, the list of “Obligations” set out in the DMA is too much a reproduction of past issues and too little a clear statement of organising principles. Much clarity would be gained from some organisation around business models, which would also clarify which platforms/businesses are “in scope” for which behaviour.
Secondly, the criteria/process for designating gatekeepers and the identification of problematic conduct seem hard to separate into two sequential steps because it is the nature of the gatekeeper’s business that determines both the harm and the best regulatory choice. It seems to us that a single unified analysis would be more successful at identifying the conduct that could be improved with regulation.
Lastly, we worry that the method of applying rules derived from all platforms to any one of them will not actually work. In practice, some of these prohibitions either do not make sense or may well be counterproductive when stretched across different environments. Will there be unintended consequences to applying all the extra rules? Will a fixed set of rules, specified up front, be able to prevent the harms of a specific case when it arises?
The UK approach: Business models in action
The UK regulation is expected to work somewhat differently. The CMA published its proposal to government a week before the DMA, on 8 December 2020 (CMA 2020), with a recommendation to establish the long-awaited Digital Markets Unit (DMU) and for this to implement a new regulatory regime for “the most powerful digital firms” – the “Strategic Market Status (SMS) regime”.
The entry point to the SMS regime is an assessment of whether a firm has strategic market status. Unlike under the DMA, however, there are no explicit quantitative thresholds and criteria to be met (although some may come later). The essence is market power, but not just any market power – in “certain circumstances,21 the effects of a firm’s market power in an activity can be particularly widespread or significant” (para 4.17). The process of designation is described as an “evidence-based economic assessment as to whether a firm has a substantial entrenched market power in at least one digital activity, providing the firm with a strategic position (meaning the effects of its market power are likely to be particularly widespread and/or significant)” (para 12).
The proposal then outlines the shape of a “coherent regulatory landscape”, whereby each firm that meets the SMS test would be subject to a specific code of conduct that applies to the firm in question and sets clear upfront rules. The code of conduct is supposed to reflect three general proposed objectives – fair trading (exploitation), open choices (exclusion), and trust and transparency (consumer protection) – which are then to be tailored to the activity, the conduct, and the harms it is intended to address. Notice the critical difference from the European DMA: there is no fixed, pre-established list of rules. The DMU will evaluate whether a particular platform has this important level of market power and, at the same time, develop the set of rules needed to protect consumers and prevent exclusion of rivals or exploitation of trading partners. As the CMA puts it, the goal is “(a)n enforceable code of conduct which sets out clearly how the firm is expected to behave in relation to the activity motivating its Strategic Market Status designation” (emphasis added). The designation of a specific platform and the formulation of its specific code of conduct will thus go hand in hand. This seems very apt: it will generate rules targeted to the problematic conduct, rules that directly take the business model into account and that can be adjusted and updated, one by one, as technology and business models evolve.
What about concerns about direct data exploitation?
While data issues are mentioned multiple times in the Obligations, we worry about whether there is enough leeway here to really develop and pursue concerns that are economically well-founded but not traditional, and about whether the law will embrace harms created by the exploitative use of data. Obligation (a) under Art. 5 does proscribe the mingling of user data from different services. And Obligation (a) under Art. 6 appears to have been formulated directly with the Amazon Marketplace investigation, and concerns about the use of seller data, in mind. But how do these generalise? And how do we account for privacy concerns, which are intimately connected with market power issues and amplify them?
We know that changes in the way data is shared, paired with other data, and used can become a quality-adjusted price increase to consumers for the use of “free” services. And unknown privacy characteristics (like not knowing how data given five years ago may be used today) are analogous to “hidden prices” in behavioural economics. Data based on a consumer’s browsing history and app use can be used to predict personal characteristics that many users would strongly prefer to remain private, and yet can be monetised very attractively in applications like medical services, insurance services, financial services and employment decisions. And the ability to leverage the “data firehose” is a concern if it allows the gatekeeper to behave as a discriminating (data) monopolist; this can extract consumers’ surplus and leave consumers worse off.22
We hope more weight will be given to these concerns in future, though it is not clear to us that the current draft of the DMA recognises these important dimensions of direct consumer harm in a general enough way.
Merger control as the orphan
The third pillar of the UK regime is the establishment of specific “SMS merger rules” to tighten merger control for this group. The motivation is a clear call to action to address “historic underenforcement against digital mergers in the UK and around the world” and the fact that strategic acquisitions have been part of the business model and have contributed to creating market power that has since become entrenched (paras 4.121-4.124). In making merger control an explicit part of its new digital regime, the CMA recognises that all acquisitions by SMS firms need to be scrutinised with care – not under the usual standard applied to any merger, but under “a lower and more cautious standard of proof”. That is, the substantive test does not change (it is still a “substantial lessening of competition”), but the level of certainty the CMA will be required to have around it is lowered from a “balance of probabilities” test to a “realistic prospect” test. No agency can have all the facts at the time, and there is a big band of uncertainty. But “uncertainty should not be an excuse for inaction”.23
And in the US, the recent complaints at the federal and state level have essentially underscored that enforcers must either be much stricter in the mergers they block, or be clear with industry participants that they face a risk that a few years down the road there may be a need to review and undo those mergers that turned out to be harmful.
In contrast, there is nothing in the DMA on merger control. We understand this is because there is no legal basis for the DMA to alter the EC Merger Regulation. But this leaves a big lacuna in the rules. Art. 31 of the DMA draft merely mentions an obligation on gatekeepers to “inform” the EC of any planned deals, but nothing flows from there. Without changes to the merger regime, the EC digital regulation package will remain incomplete (and risk a repeat of decisions like Google/Fitbit). While Member States (and the UK) will be able to enforce vigorously in this space, the EC will be hobbled in its ability to protect dynamic competition and innovation through this critical tool, and digital mergers will continue to be allowed based on a standard of proof that is simply unfit for purpose. By comparison with other jurisdictions, legal caution about having to demonstrate loss of competition to the usual standard “in Luxembourg” is likely to cripple the initiative that should flow from the impetus behind the DMA. The EC may state publicly that potential competition concerns are nothing new, but the reality is that it has not enforced against killer acquisitions or acquisitions of nascent competitors at anything like the rate of the CMA. The adoption of the DMA (and the DSA) responds to a call for regulators to serve citizens and consumers better. Without explicit changes to merger rules, history is likely to repeat itself and hold back competition in this sector.
Authors’ note: The authors have been involved to different degrees in advisory work both for and against tech platforms, including Apple, Amazon, Microsoft, Uber and others.
References and further reading
ACCC (2019), Digital Platforms Inquiry Final Report.
Amelio, A and B Jullien (2012), “Tying and freebies in two-sided markets”, International Journal of Industrial Organization 30(5): 436-446.
Athey, S (2020), “Platform Markets – Business Models & Gatekeepers”, presentation, November.
Athey, S, E Calvano and J Gans (2016), “The impact of consumer multi-homing on advertising markets and media competition”, Management Science.
Belleflamme, P and M Peitz (2019), “Managing competition on a two-sided platform”, Journal of Economics & Management Strategy 28(1): 5-22.
Bourreau, M et al. (2020) “Google/Fitbit will monetize data and harm consumers”, VoxEU.org, 30 September.
Caffarra, C (2019), “Follow the Money: Mapping issues with digital platforms into actionable theories of harm”, e-Competitions Special Issue, August.
Caffarra, C, F Etro, O Latham and F Scott Morton (2020), “Designing regulation for digital platforms: Why economists need to work on business models”, VoxEU.org, 4 June.
Caffarra, C, G Crawford and T Valletti (2020), “‘How tech rolls’: Potential competition and ‘reverse’ killer acquisitions”, VoxEU.org, 11 May.
Caffarra, C and T Valletti (2020), “Google/Fitbit review: Privacy IS a competition issue”, VoxEU.org, 4 March.
Caffarra, C (2020), “The UK’s “other” big experiment: Regulating online platforms?”, VoxEU.org, 6 January.
Casadesus-Masanell, R and F Zhu (2010), “Strategies to fight ad-sponsored rivals”, Management Science 56(9): 1484-1499.
Choi, J P and D-S Jeon (2021), “A leverage theory of tying in two-sided markets with non-negative price constraints”, American Economic Journal: Microeconomics, in press.
CMA (2020), A New Pro-Competition Regime for Digital Markets, Advice of the Digital Markets Taskforce, December.
CMA (2020), Online Advertising and Digital Markets Study, July.
Etro, F and C Caffarra (2017), “On the Economics of the Android Case”, European Competition Journal 13(2-3).
Etro, F (2020), “Product selection in online marketplaces”, DISEI WP20, University of Florence.
Etro, F (2020), “Device-funded vs Ad-funded platforms”, DISEI WP19, University of Florence.
European Commission (2020), “Proposal for a Regulation of the European Parliament and of the Council on contestable and fair markets in the digital sector (Digital Markets Act)”, Brussels, 15 December.
Furman, J et al. (2019), Unlocking Digital Competition, Report of the Digital Competition Expert Panel.
Geradin, D and D Katsifis (2019), “An EU competition law analysis of online display advertising in the programmatic age”, European Competition Journal 15(1).
Hagiu, A, T-H Teh and J Wright (2020), “Should Amazon be allowed to sell on its own marketplace?”, SSRN 3606055.
Prat, A and T Valletti (2019), “Attention oligopolies”, mimeo, New York University.
Rietveld, J, M Schilling and C Bellavitis (2019), “Platform strategy: Managing ecosystem value through selective promotion of complements”, Organization Science 30(6): 1232-1251.
Scott Morton, F and D Dinelli (2020), Roadmap for a Digital Advertising Monopolisation Case Against Google, Omidyar Network Report.
Scott Morton, F and D Dinelli (2020), Roadmap for a Monopolization Case Against Google Regarding the Search Market, June.
Scott Morton, F and D Dinelli (2020), Roadmap for a Monopolization case against Facebook, June.
Stigler Committee on Digital Platforms (2019), Final Report, September.
Tremblay, M (2020), “The Limits of Marketplace Fee Discrimination”, Net Institute Working Paper #20-10, September.
Wen, W and F Zhu (2019), “Threat of platform-owner entry and complementor responses: Evidence from the mobile app market”, Strategic Management Journal 40(9): 1336-1367.
1 The Stigler Report recommended just this approach (Stigler Committee on Digital Platforms 2019).
2 These are (a) online intermediation services; (b) online search engines; (c) online social networking services; (d) video-sharing platform services; (e) number-independent interpersonal communication services; (f) operating systems; (g) cloud computing services.
3 This is based on desktop research and public information, and should be seen as a first approximation only.
4 The “market investigation” is a tool introduced in the DMA as the pale remnant of what was expected to be a much more powerful New Competition Tool. This was, however, shot down by the internal Regulatory Scrutiny Board in November as legally impossible to achieve under the banner of Art 114, where the DMA sits (source: MLex, 17 December 2020, “EU ‘gatekeeper’ law faced internal criticism over choice of targets and negative impact”).
5 The EC fined Facebook in 2017 for providing misleading information at the time of the WhatsApp acquisition about its ability to “establish reliable automated matching between Facebook users' accounts and WhatsApp users' accounts” (see here). Germany’s Bundeskartellamt issued a decision in 2019 (under appeal) prohibiting Facebook from combining user data from different sources (see here). A known EC investigation into Facebook’s use of data, opened in December 2019, is understood to also cover how data is collected, combined and used from different sources (see here). A simultaneous investigation was opened into Google’s use of data (see here).
6 The issue of MFNs or parity clauses was at the core of the e-books case, which the EC settled with Amazon in 2017 (see here). Amazon is also reported to have voluntarily abandoned in 2019 any residual parity clauses in contracts with sellers on its marketplace (see here). Parity clauses have also been the focus of long-standing disputes between online travel agents such as Booking.com and Expedia and multiple European national regulators (France, Italy, Germany, Sweden and others), with the EC acting as a “coordinator” (for a summary of events, see here).
7 The EC opened formal investigations in July 2020 into Apple’s App Store rules “to assess whether Apple's rules for app developers on the distribution of apps via the App Store violate EU competition rules. The investigations concern in particular the mandatory use of Apple's own proprietary in-app purchase system and restrictions on the ability of developers to inform iPhone and iPad users of alternative cheaper purchasing possibilities outside of apps” (see here, Apple Cases AT.40437 and 40716).
8 This is about businesses such as advertisers or publishers being required to use the platforms’ own ID solution when offering their services. It is about data collection by the gatekeeper and the refusal to use alternative ID services (e.g. publishers’ own IDs). This is thought to be in scope in the new EC investigation of Google adtech and data practices, as reported by MLex on 23 December 2020 (Cases AT.40660 – Google Adtech and AT.40670 – Google Data-related practices), and, according to press reports, also in the Facebook data investigation.
9 This could refer to various known ties forced by Google in the ad tech stack, e.g. between AdX – Google Ads or YouTube – Google Ads (see again here, and the EC investigation of the digital adtech stack as mentioned in previous footnotes; also in scope in the investigation by the French Adlc of the digital adtech stack).
10 The issue has emerged in multiple Adtech investigations, and it is thought to be in scope in the current EC Google Ad Tech investigation also (ACCC 2019).
11 The EC sent Amazon a Statement of Objections “for use of non-public seller data” in November 2020 (see here). It could also refer to Google in adtech, where Google used data collected via DFP to develop its Open Bidding solution and help AdX/GAM compete against header bidding – see the CMA report and the Texas complaint.
12 The obvious reference here is the classic 2018 EC Android decision, that was about pre-installation and default restrictions, see https://ec.europa.eu/competition/antitrust/cases/dec_docs/40099/40099_99.... More broadly refers to software platforms sold with pre-installed apps such as Apple and Microsoft.
13 Apple’s EC investigation of rules for the application of the Appstore, see footnote 10, responding to complaints from third party apps around the terms of their operations on the App Store, and complainants’ requests that they should be allowed to bypass the App Store in-app payment systems, and that alternative app stores should be allowed to operate on the App Store (see here; see also the Epic complaint here). Google’s app store has been subject of similar complaints.
14 This is generally about “self-preferencing” though the underlying practices are very different. The Google Search (Shopping) decision of 2017 is the classic reference in the context of ad-funded models, where “self-preferencing” took the form of Google favouring its own price comparison services and undermining third parties’ (see here). As to Amazon, the EC issued a Statement of Objection and simultaneously announced the opening of a second investigation around concerns that Amazon may use third party seller data to favour its own products on the Marketplace (e.g. through entry and pricing decisions), and favour itself “through its processes”. In the case of Apple, complainants such as Spotify have been making a strong public case that Apple favours Apple’s own apps (e.g Apple Music) (see here).
15 This may refer to complainants in the Apple case and their complaint on the ability of users subscribing to services outside the App Store to consume the service on their Apple devices (see inter alia https://timetoplayfair.com/).
16 Allowing third-party businesses to interoperate without discrimination with the platform, in the same way as the platform’s own services, is an established aspiration from past cases going back to Microsoft. This is known to be in scope in the current Facebook investigation, which looks inter alia at the “application programming interface (API) that allows app developers to access data or functionalities on its platform and its photo-sharing site Instagram and software components to interact” (see here). Google and Apple are likely to face similar issues in their relationships with developers.
17 This is of direct relevance to Google and Facebook’s advertising businesses (see for instance here). It may become relevant to Amazon as its advertising business develops.
18 This is specific to Google and intended to favour potential entry in search (see EC Google Search (Shopping) case).
19 Specific to Apple and Google and their respective app stores.
20 Thus, for example, para 36 explains the prohibition of “combining end user data from different sources or signing in users to different services of gatekeepers”, under Obligations Art 5 (a), just on the basis that this “gives them potential advantages in terms of accumulation of data, thereby raising barriers to entry”. Para 37 then moves to the next Obligation under Art 5 (b), that gatekeepers should allow “business users of their online intermediation services to offer their goods or services to end users under more favourable conditions, including price, through other online intermediation services” – that is, should not apply MFNs – and this is justified based on the obvious observation that “such restrictions have a significant deterrent effect on business users (…) in terms of their use of alternative online intermediation services, limiting inter-platform contestability”. Para 38 moves on to the next (Art 5 (c)), which concerns the obligation on gatekeepers to allow “business users (to be) free in promoting and choosing the distribution channel they consider most appropriate to interact with any end users that these business users have already acquired through core platform services provided by the gatekeeper” – but says nothing more than that this is “to prevent further reinforcing their dependence on the core platform services of gatekeepers”.
21 For example, when a firm “has achieved very significant size or scale”, “is an important access point to customers”, “can use the activity to extend market power from one activity into a range of other activities”, “can use the activity to determine the rules of the game” or “may have broader social or cultural importance” (para. 4.20).
22 See the discussion in Bourreau et al. (2020).
23 As stated by Mike Walker, CMA Chief Economist, at the CRA Roundtable event of 17 December 2020 on “The European Digital Regulation Experiment”.