Chapter 6

Platform Power


Over the past few decades, online platforms have become ever more powerful across many areas of our lives. From Twitter to Deliveroo, these platforms have disrupted entire societal institutions and processes, replacing them with new kinds of companies whose corporate decisions can circumvent or even supersede public governance. They now shape our social, economic and political worlds, structuring how we relate to each other, how we work and how we engage in politics. As the digital world develops and more of our interactions happen online, their power increases. 

Platforms are key enablers of new power, in both good and bad forms. The #MeToo movement wouldn’t have been possible without the social media platform Twitter. The hashtag, as a collector of common experience and a rallying point for change, could not have existed without the platform and the user mechanics on which it is built. As Timms and Heimans describe:

“New power operates differently, like a current. It is made by many. It is open, participatory, and peer-driven. It uploads and it distributes. Like water or electricity, it’s most forceful when it surges. The goal with new power is not to hoard it but to channel it.”84

Platforms can grow very fast, are constantly changing and wield very real power. As of July 2021, 51.15 million people in the UK used Facebook.85 Facebook enables neighbours to organise litter picks and fundraise for local charities, but it also allows far-right conspiracy theories to spread. Many commentators point to a link between polarisation in our society and social media algorithms. From speeches in the Senate about filter bubbles86 to articles about Facebook’s role in genocide87, the structure of our political discourse is increasingly shaped by how social media platforms are designed.

Yet it’s not just social media; platforms also affect our economic structures. According to the TUC, 14.7% of working adults now work via gig economy platforms, compared to 5.8% in 2016.88 Airbnb is disrupting local housing markets as more landlords move away from residential lettings to short-term holiday lets.89 Tinder has 66 million active users each month and is the medium through which modern intimacy is navigated and millions of relationships are started.90 The scale and pervasiveness of online platforms across so many dimensions of our common life makes it essential to interrogate their power and how it operates.

The growth of digital interaction has led to the creation of huge market monopolies enjoyed by technology companies on an unprecedented scale. Alphabet (Google, Android), Meta (Facebook, WhatsApp, Instagram etc.) and Amazon (Alexa, Kindle) are each worth over a trillion dollars.91 They all operate on a global scale, across national boundaries, and are increasingly buying up related services, creating international monopolies. A “first mover” advantage means that the company that fills a gap first tends to dominate it: Facebook dominates social networks, Google is synonymous with searching, Amazon monopolises retail marketplaces. VCs are willing to invest huge amounts of money betting on the next platform to achieve monopoly status.92

In response, regulation of platforms all too often takes place retrospectively, meaning that the harms caused by these new forms can go unchecked for some time. Platforms are a space in which the boundaries between old and new power need to be negotiated and balanced. How can we ensure fairness and accountability on platforms without losing the freedom needed for new power to thrive? What would that kind of regulation look like? With the launch of the “Metaverse”, and as the lines between virtual and real life blur, these questions are becoming more urgent.

What is a platform?

Online platforms underpin much of our everyday lives. They now co-exist with our physical spaces such as pubs, community centres, newspapers, shops or high streets. Some platforms, such as Uber and Airbnb, are online versions of a marketplace, connecting suppliers to consumers. Others structure and mediate our online relationships through dating apps or social networks. This is about more than just Facebook; different forms of platform now control large parts of our lives and our social interactions.

You can also segment platforms based on how they generate revenue. Facebook and Instagram are part of the “attention economy”, generating revenue by monetising attention and focusing on data extraction. Other platforms are “commission platforms” linking consumers to suppliers, such as Deliveroo and Uber, while a further category is funded through member subscriptions, such as Spotify or Netflix.93 What they have in common is that they are fundamentally changing how we behave.

How does a platform have power?

One way a platform exerts power is through what it allows users to do and the behaviours it encourages. While we can all set up groups on Facebook, it is Facebook that controls the ways we engage with the platform, and we can only set up these groups in particular ways within a certain menu of options. And while companies are making decisions to improve user experience, they can also shape our interactions in ways that maximise revenue or harness attention. For example, on Facebook you can react to statuses using one of seven emotion reactions: angry, sad, love, shocked, laugh, like, care. Why choose these reactions?94 Because by creating a measuring scale of human emotions it’s easier to produce valuable data which companies can use to improve their advertising, while also arguably narrowing our own emotional ranges.

Some companies are beginning to think more deeply about how their platforms are designed. For example, Twitter has experimented with ways to improve discourse on the platform, which tends to incentivise outrage and oversimplification, perhaps due to the inherent difficulty of communicating a nuanced argument within a character limit. From limiting replies, to allowing downvotes, to prioritising responses in a thread, the platform has run a number of experiments to try to improve user experience. Yet these are still features introduced without consulting users; as individuals we cannot choose to accept or reject them.

Platforms also exert power by controlling access to their services. For example, to set up a Twitter account all you need is an email address, but to become a driver on Uber you need extensive paperwork. These rules are sometimes governed by law, but sometimes by decisions made by the platform itself. This can lead to arbitrary decisions which are difficult to appeal or change. Uber has tried to verify drivers’ identities and DBS checks using facial recognition software. Some driver unions are campaigning against this, because many drivers were misidentified by the software and then had their licence to operate revoked by Transport for London. This particularly affects certain groups: research has shown that facial recognition systems can have an especially high error rate when used to identify people from BAME communities.95

Platforms also make decisions about what users are allowed to say and do online and how they can interact with their products. Twitter must often make these kinds of decisions when it comes to banning accounts on the basis of abuse or hate speech, often in the heat of a crisis. On 8th January 2021, Twitter banned former US President Donald Trump’s account, which was followed by nearly 89 million people96, after his posts in relation to the Capitol riots. Platforms are making decisions about content moderation, sometimes issuing rulings about content and free speech which differ from the law. These rulings can be difficult to challenge, both for technical reasons and due to the lack of democratic governance or formal scrutiny. Who decides?

Platform owners decide who is allowed to offer services on them and the terms on which they do so. They control how open or closed their software is and how other entities can relate to them. While many of the technology giants have built closed systems, open source or openly licensed systems have appeared in response; these enable the code which controls the platform to be copied and adapted, allowing users to interact with the system as they choose. This is significant because the closed nature of platforms often means monopolies are more likely to develop due to network effects. Closed platforms have a captive audience, which means their attempts to replicate existing products or platforms are more successful because they can build in full integration for their existing users. Decisions about the level of openness of a platform to other builders or developers are a key source of power.

Regulating Platforms

So how might we respond to the ‘new’ power of platforms? Different approaches to regulation are emerging. Some include applying existing regulatory frameworks while others involve creating new mechanisms which operate at the technical level and are in some cases yet to be thought of! What follows is a rough framework for the kinds of interventions that could be made to regulate the markets around platforms, the companies that run the platforms and the product itself. 

How to regulate the market?

One route would be to break up existing platforms using traditional anti-monopoly laws. This approach is useful for dealing with platforms buying other platforms, such as Facebook’s acquisition of Instagram or WhatsApp, but it is of limited use for breaking up platforms that already exist. This is due to network effects, as previously discussed, which mean natural monopolies emerge around particular platforms.

One way around this, suggested by commentators such as Benedict Evans97, is to use regulation to introduce competition for the services built on top of monopoly or near-monopoly platforms.

A more robust approach is put forward by Elizabeth Warren, who has proposed separating out the “platform utility” component from the provision of services on the platform.98 This would explicitly break up the existing platforms, making them smaller and therefore easier for governments to regulate. There are also strong arguments from academics such as Tommaso Valletti that within existing platforms there are some natural structural “break points” that would make this process easier.116

Another approach to opening up platforms is interoperability. The Internet, email and the World Wide Web are concrete examples of interoperability. For example, you can email someone with a Gmail account from a Hotmail account without you both needing to be on the same service: the different email providers can interact with each other through open standards. If social media became interoperable, I could message you on Messenger from WhatsApp, or you could use your Facebook account to join a Discord server. This kind of digital world requires a slightly different online infrastructure from the one we currently have, with open standards and better data portability. Organisations such as Redecentralize99 are beginning to create the platforms, tools and infrastructure this kind of ecosystem might need.
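
To make the idea concrete, here is a minimal sketch (in Python, with hypothetical addresses and credentials) of the open SMTP standard in action: a message sent from one provider’s account to an address on a completely different provider, with neither needing the other’s permission.

```python
# Minimal sketch of interoperability via the open SMTP standard.
# The addresses and the GMAIL_APP_PASSWORD environment variable are
# hypothetical placeholders; smtp.gmail.com on port 587 is Gmail's
# standard submission endpoint.
import os
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@gmail.com"      # hypothetical sender on one provider
msg["To"] = "bob@hotmail.com"        # hypothetical recipient on another provider
msg["Subject"] = "Open standards in action"
msg.set_content("This message crosses providers because SMTP is an open standard.")

with smtplib.SMTP("smtp.gmail.com", 587) as server:
    server.starttls()                # upgrade the connection to TLS
    server.login("alice@gmail.com", os.environ["GMAIL_APP_PASSWORD"])
    server.send_message(msg)         # Gmail relays it on to the other provider
```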

Another approach to platform market regulation, devised by Rufus Pollock, author of The Open Revolution, is a system of “remuneration rights”. Users of the internet would all pay a subscription fee (or the government would fund this) to use and maintain a social media network, while social media providers would gain revenue based on the number of users they attract. This would provide a free-market-like but open-compatible way to fund innovators and prevent monopolies.100

One way to break down platforms’ monopoly on data ownership would be to create new forms of data storage, either through individual systems that allow users to hold their own data history in their own account (like Tim Berners-Lee’s Solid system)101 or through the more collective operation of data trusts: “a structure whereby data is placed under the control of a board of trustees with a fiduciary responsibility to look after the interests of the beneficiaries — you, me, society.” Data trusts are a new kind of institution which “offers all of us the chance of a greater say in how our data is collected, accessed and used by others. This goes further than limiting data collecting and access to protect our privacy; it promotes the beneficial use of data, and ensures these benefits are widely felt across society. In a sense, data trusts are to the data economy what trade unions are to the labour economy.”102

A final way to break down monopolies and stop platforms dominating markets financially is to make them completely open source. This means the code for the platform is open, so users themselves can redesign and adapt it for their own purposes. For example, Wikipedia allows its moderators and editors to create their own plug-ins and digital tools, designed and built by community members themselves, to work alongside the platform; an interesting ecosystem has emerged as the platform has changed and developed through open contribution to meet the needs of its users and editors. It’s an unusual example of a completely open-source platform, run for free by volunteers, which attracts billions of users from around the world.

Case Study: Mastodon

Mastodon is open-source software for running self-hosted social networks similar to Twitter.103 Communities can set up their own versions of the platform, called “instances”, creating their own rules, governance, features, codes of conduct and privacy and moderation policies.104 Users can add content warnings to posts to flag sensitive content, change the length of their updates and choose whether their posts are public or shared only on the timelines of their followers.105
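
Because every instance exposes the same open API, the same few lines of code work whichever community-run server you join. The sketch below assumes a hypothetical instance address and an access token issued by that instance; the endpoint and fields shown (status, spoiler_text, visibility) are part of Mastodon’s public REST API.

```python
# Posting to a Mastodon instance through its public REST API.
# INSTANCE and TOKEN are placeholders: any instance exposes the same endpoint.
import requests

INSTANCE = "https://mastodon.example.social"   # hypothetical community-run instance
TOKEN = "YOUR_ACCESS_TOKEN"                    # issued by that instance's settings page

response = requests.post(
    f"{INSTANCE}/api/v1/statuses",
    headers={"Authorization": f"Bearer {TOKEN}"},
    data={
        "status": "Hello from a self-hosted social network.",
        "spoiler_text": "Example content warning",  # shown as a content warning
        "visibility": "private",                    # followers-only rather than public
    },
)
response.raise_for_status()
print(response.json()["url"])                  # link to the newly created post
```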

People can communicate between and across “instances”, giving them access to a wider network than just their own.106 There has been some controversy around the platform: an instance of Mastodon was created by GAB, a social media organisation catering to people banned from Twitter for hate speech. This raised significant governance questions about the decentralised nature of Mastodon and how questions of access should be settled:

“Mastodon's founder, Eugen Rochko, refused to create a blanket ban on GAB, leaving it up to individual "instances" to decide whether or not to interact with the interlopers. As he explained to The Verge, a blanket ban would be almost impossible, given the decentralized nature of the service….On the other hand, most "fediverse" members would be unlikely to have to deal with GAB or its users, considering the content contained in GAB's "instance" routinely violates the Mastodon "covenant." Violating these rules prevents instances from being listed by Mastodon itself, lowering the chances of other "instance" owners inadvertently adding toxic content and users to their server nodes. And Rochko himself encouraged users to pre-emptively block GAB's "instance," resulting in ever fewer users being affected by GAB's attempted invasion of the Mastodon fediverse.”107

How to regulate the company

Another way to approach platform regulation is to regulate the company itself. 

For example, Uber’s power was limited through the application of employment law when the GMB union backed a successful lawsuit to ensure drivers were deemed workers rather than self-employed contractors, and therefore entitled to a living wage, holiday pay and a pension.108

Similarly, governments can pass new laws that add responsibilities which the company running the platform must fulfil; for example, the UK’s Online Safety Bill introduces a new “duty of care” for online platforms.

Case Study: Online Safety Bill

The UK Online Safety Bill is proposed legislation to improve internet safety in the UK. It covers any tech firm that allows users to post their own content or to interact with one another, which includes platforms such as Facebook, Twitter, Instagram, YouTube and Snapchat, but also commercial pornography sites like OnlyFans and search engines such as Google.

It creates a new “duty of care” for online platforms towards their users, requiring them to take action against harmful content. This duty has three parts: preventing the proliferation of illegal content; ensuring children are not exposed to harmful or inappropriate content; and ensuring that adults are protected from legal but harmful content. This last part is significant as it covers a new category of “legal but harmful” content, the contours of which will be decided by the Culture Secretary in consultation.

The bill tries to ensure that platforms preserve access to “democratically important” content, such as journalism and comment on political parties or issues, and empowers Ofcom to block access to particular websites within the UK. Though the bill seeks to limit the power of large platforms, there are concerns that the increased regulatory burden will be absorbed easily by the larger platforms while making it harder for smaller, newer platforms to compete.109

Another approach to regulating the companies could be to treat platforms as “public utilities” and public infrastructure. As Josh Simons and Dipayan Ghosh argue:

Facebook and Google should be treated as a new kind of public utility — utilities for democracy. These are public utilities in a far more fundamental, political sense than the narrow, economic concept of corporations that exercise a monopoly over public goods. Their unilateral control over the algorithmic infrastructure of the public sphere concentrates forms of social and political as well as economic power, shaping how we understand and interpret the world around us, discuss matters of fundamental public interest, organize social and political groups, and make choices about matters of collective self-government.110

They propose a framework for regulation that ensures these utilities operate in accordance with public values, provides targeted transparency around how algorithms operate, imposes firewalls between commercial functions (such as advertising) and public functions (such as the governance of public debate), and introduces some elements of democratic governance.

As more platforms blur the lines between public social infrastructure and private service, this approach certainly has some merits. However, there are difficulties in defining what is a “public utility” and what isn’t. Are dating platforms a “public utility”? Social media platforms blur the lines here because they are new spaces which do many different things: they are simultaneously newspapers, public discussion spaces, private conversations, broadcasters and publishers. Because they are many things at once, it is harder to use existing frameworks to understand them.

Co-owning platforms?

Governments could sponsor different models of ownership and systems of governance. Many tech platforms begin as start-ups and become more traditionally corporate as they grow. As they continue to grow and become more like public utilities, facing larger questions of regulation, governance and ethics, their corporate form may need to change once more. Some are suggesting that the best longer-term structure for tech platforms could be a switch to co-operative worker ownership, or an “exit to community” strategy towards user ownership. According to Wired, in 2018 Uber and Airbnb wrote letters to the Securities and Exchange Commission proposing to be allowed to grant company equity to their users—their drivers and hosts, respectively:

“Somehow, what seemed impossibly utopian in 2017 was now the corporate strategy of the biggest gig platforms. Without much fanfare, user ownership was quietly emerging as an industry trend. Airbnb’s letter made the reasoning plain: “The increased alignment of incentives between sharing economy companies and participants would benefit both.” Platforms could get more loyalty from users who might otherwise come and go on a whim. Equity awards, meanwhile, could cut users into the benefits of company ownership, which are usually reserved for elite employees or people who already have wealth to invest.”111

In the longer term, it may be worth making these transitions from start-up to corporate to cooperative easier, so that they become a more standard governance lifecycle for technology platforms.

Some platforms are already owned by users or workers; within the gig economy, for example, we are seeing the setting up of worker-owned cooperatives such as cooperative delivery and private hire services. Many of these structures have only started to emerge and will take time to scale up, but they could be a good future template for platforms.

Case Study: Wings

Wings is one such example of a new cooperative delivery service operating in Finsbury Park, North London. It was founded by a collective of delivery riders and organisers who came together during the first Covid-19 lockdown to deliver free food parcels to over 800 households. As they emerged from the pandemic, the riders drew on their experiences working for delivery companies in London to create a plan for a worker-owned, zero-emission alternative to corporate delivery platforms.

Wings riders are guaranteed a London Living Wage, sick pay and employee benefits, while collaborating on all aspects of the service, learning new skills and collectively pushing the business forward. The platform is embedded within the community, prioritising local independent restaurants and working alongside local charities and community organisations to deliver free food to people in need.

As co-founder Ben Jacob explains, “as riders, we want to live decent lives from the hard work we put in, in communities that value and care for each of their members. Being a co-operative is about more than just guaranteeing riders a decent wage. Through collective ownership, we can reclaim the technologies of the platform economy to empower workers, build community wealth, and provide democratically-owned local infrastructure in our city.”

The cooperative has been supported by the local council, Islington, with seed funding of £20,000 to help it grow during its first six months, with additional funding and support from the Unfound cooperative accelerator.112

How to regulate the product

One way to start to regulate platforms at the product level is to regulate the algorithms that structure them. There have been some interesting proposals around “algorithmic accountability”, especially in relation to access to public services, as concerns about racial biases within algorithms become more prevalent. Opening up the current black box of the algorithms that run many platforms could give those who use them more control over how their online environments are constructed: for example, deciding to make social media “timelines” less polarising, or shaping the algorithm that controls their Tinder dating suggestions.
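
As a purely hypothetical sketch of what that control might look like, imagine a timeline whose ranking weights are exposed as user-editable settings rather than fixed by the platform; the signals and weights below are illustrative only and do not describe any real platform’s algorithm.

```python
# Hypothetical sketch: a timeline ranker whose weights the user controls.
# All field names, signals and weights are illustrative, not a real system.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float    # e.g. likelihood of a reaction or reply
    predicted_outrage: float       # e.g. score from a toxicity/anger classifier
    from_followed_account: bool

def rank(posts, weights):
    """Order posts using weights the user has chosen, highest score first."""
    def score(p):
        return (
            weights["engagement"] * p.predicted_engagement
            - weights["outrage"] * p.predicted_outrage
            + weights["followed"] * p.from_followed_account
        )
    return sorted(posts, key=score, reverse=True)

# A user who wants a calmer, less polarising feed simply raises the outrage penalty:
calm_weights = {"engagement": 0.5, "outrage": 2.0, "followed": 1.0}
```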

Another approach might be to enable people to use platforms in new ways, for example by helping communities to build their own governance systems through ready-made governance processes and templates.

Case Study: Metagov

One project taking this on is the Metagovernance platform (Metagov), which is creating a new software toolkit that helps online communities build their own digital governance systems. These systems can be anything from voting systems to juries to economies. The idea is to create a “governance layer” for “connecting, interoperating, and orchestrating a whole bunch of other governance apps and services”.

The founders of this project recognise that platforms have entered the realms of the social and the political, creating new forms of organisation, new institutions, groups and practices. 

“Competition, ideology, and technological advances have created the conditions for a new generation of games (e.g. Minecraft, Seed), social networks (e.g. Mastodon, Vingle), and collaborative platforms (e.g. Aragon, Colony). This new generation of online communities is changing the rules of online governance. In these worlds, users have the right to self-governance—the right to come together and organize their own social and political institutions. Our goal is to describe, support, and expand this right.”113
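
As a rough, hypothetical illustration of the kind of community-defined rule such a governance layer might host (this does not reflect Metagov’s actual toolkit), a simple quorum-and-threshold vote can be expressed in a few lines of code; the thresholds themselves are what the community would choose.

```python
# Hypothetical community-defined governance rule: a proposal passes only if
# turnout and approval both clear thresholds the community has set itself.
# Names and numbers are illustrative; this does not reflect Metagov's API.
from dataclasses import dataclass

@dataclass
class Vote:
    member: str
    approve: bool

def proposal_passes(votes, community_size, quorum=0.25, threshold=0.6):
    """Apply the community's own quorum and approval rules to a ballot."""
    turnout = len(votes) / community_size
    if turnout < quorum:
        return False                        # not enough members took part
    approvals = sum(v.approve for v in votes)
    return approvals / len(votes) >= threshold

# Example: a 40-member forum votes on a new moderation policy.
ballots = [Vote("a", True), Vote("b", True), Vote("c", False)] + [
    Vote(str(i), True) for i in range(9)
]
print(proposal_passes(ballots, community_size=40))   # True: quorum met, 11/12 approve
```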

One other potential upstream intervention might be to try to ensure that people working on platform design are more conscious of the impact of their choices. Influencing the people designing platforms is a goal of the Time Well Spent movement spearheaded by Tristan Harris, a former Google employee. Harris started out by giving a presentation on “a call to minimise distraction and respect users’ attention” while working at Google, which encouraged his peers to think about how notifications were compelling people to look at their phones.114 This began a conversation about the ethics of design which culminated in the creation of the Center for Humane Technology, which lobbies platforms to design their services differently.115

Propositions:

  1. Update the regulatory framework for platforms; investigate how to break up existing platforms and explore how interoperability across platforms and stronger anti-monopoly legislation could prevent further consolidation of monopoly power.
  2. Legislate to make the design and workings of algorithms transparent and accessible to workers and users. 
  3. Make it easier for start-ups and companies to transfer to cooperative structures within the UK. Provide templates and pathways for companies to pursue this change.
  4. Create tax breaks and seed funding for the growth of community platforms.
  • Henry Timms and Jeremy Heimans, New Power (1st edn, Macmillan 2018)

  • S. Dickson, ‘Number of Facebook users in the United Kingdom from September 2018 to March 2022’, www.statista.com, 2022, https://www.statista.com/statistics/1012080/uk-monthly-numbers-facebook-users/#:~:text=Facebook%20users%20in%20the%20United%20Kingdom%20(UK)%202018%2D2021&text=As%20of%20July%202021%2C%20there,year%20prior%2C%20in%20July%202020 (accessed on 19.06.2022)

  • Adi Robertson, ‘The Senate’s secret algorithms bill doesn’t actually fight secret algorithms’, www.theverge.com, 2019, https://www.theverge.com/2019/11/5/20943634/senate-filter-bubble-transparency-act-algorithm-personalization-targeting-bill (accessed on 19.06.2022)

  • Dan Milmo, ‘Rohingya sue Facebook for £150bn over Myanmar genocide’, www.theguardian.com, The Guardian, 2021, https://www.theguardian.com/technology/2021/dec/06/rohingya-sue-facebook-myanmar-genocide-us-uk-legal-action-social-media-violence (accessed on 19.06.2022)

  • TUC, ‘Gig economy workforce in England and Wales has almost tripled in last five years – new TUC research’, www.tuc.org.uk, 2021, https://www.tuc.org.uk/news/gig-economy-workforce-england-and-wales-has-almost-tripled-last-five-years-new-tuc-research (accessed on 19.06.2022)

  • Gary Barker, ‘The Airbnb Effect On Housing And Rent’, www.forbes.com, Forbes, 2020 https://www.forbes.com/sites/garybarker/2020/02/21/the-airbnb-effect-on-housing-and-rent/ (accessed on 19.06.2022)

  • Dating Zest, https://datingzest.com/tinder-statistics/ (accessed on 19.06.2022)

  • Michael Adams, ‘The 10 Biggest Tech Companies in the World’, www.money.usnews.com, 2022, https://money.usnews.com/investing/stock-market-news/slideshows/most-valuable-tech-companies-in-the-world?slide=12 (accessed on 19.06.2022)

  • Thomas Hanna, Mathew Lawrence, Nils Peters, A Common Platform (Common Wealth, 2020)

  • Thomas Hanna, Mathew Lawrence, Nils Peters, A Common Platform (Common Wealth, 2020)

  • Hannah O’Rourke, Making the future: a new software for the Left (Compass, 2018)

  • Natasha Lomas, ‘Uber Under Pressure over Facial recognition checks for drivers’, www.techcrunch.com, 2021, https://techcrunch.com/2021/03/19/uber-under-pressure-over-facial-recognition-checks-for-drivers/ (accessed on 19.06.2022)

  • TweetBinder, ‘Donald Trump and Twitter – 2009 / 2021 analysis’, https://www.tweetbinder.com/blog/trump-twitter/ (accessed on 19.06.2022)

  • Benedict Evans in a personal contribution to a Power Now Project seminar

  • Elizabeth Warren, ‘Here’s how we can break up Big Tech’, www.medium.com, 2019, https://medium.com/@teamwarren/heres-how-we-can-break-up-big-tech-9ad9e0da324c (accessed on 19.06.2022)

  • Redecentralize, https://redecentralize.org/about/ (accessed on 19.06.2022)

  • ‘Remuneration Rights’, https://openrevolution.net/remuneration-rights, (accessed on 19.06.2022)

  • Solid Project, https://solidproject.org/about (accessed on 19.06.2022)

  • Anouk Ruhaak, ‘Data Trusts: Why, what and how’, www.medium.com, 2019, https://medium.com/@anoukruhaak/data-trusts-why-what-and-how-a8b53b53d34 (accessed on 19.06.2022)

  • ‘What is Mastodon?’, https://upload.wikimedia.org/wikipedia/commons/transcoded/7/70/What_is_Mastodon.webm/What_is_Mastodon.webm.480p.vp9.webm (accessed on 19.06.2022)

  • Mastodon, https://joinmastodon.org/ (accessed on 19.06.2022)

  • Mastodon, https://en.wikipedia.org/wiki/Mastodon_(software) (accessed on 19.06.2022)

  • Alexander Traykov, ‘Mastodon: A Federated Answer to Social Media Centralization’, www.sitepoint.com, 2020, https://www.sitepoint.com/mastodon-a-federated-answer-to-social-media-centralization/ (accessed on 19.06.2022)

  • Trust and Safety Foundation, ‘Decentralized social media platform Mastodon deals with an influx of Gab users’, 2019, https://www.tsf.foundation/blog/decentralized-social-media-platform-mastodon-deals-with-an-influx-of-gab (accessed on 19.06.2022)

  • Sarah Butler, ‘Uber agrees union recognition deal with GMB’, www.theguardian.com, The Guardian, 2021, https://www.theguardian.com/business/2021/may/26/uber-agrees-historic-deal-allowing-drivers-to-join-gmb-union (accessed on 19.06.2022)

  • John Woodhouse, Analysis of the Online Safety Bill (House of Commons Library, 2022)

  • Josh Simons and Dipayan Ghosh, Utilities for Democracy (Brookings, 2020)

  • Nathan Schneider and Morshed Mannan, ‘Let Users Own the Tech Companies They Help Build’, www.wired.com, Wired, 2021, https://www.wired.com/story/opinion-let-users-own-the-tech-companies-they-help-build/?fbclid=IwAR0RJlsWoLa6fwG0P1ivCz-fLsX9dNd8qDw4l84yRenHXa6tWPJB_p7j4DY (accessed on 19.06.2022)

  • Ben Jacob, in discussion for the New Power Project, 2022

  • Metagov, https://metagov.org/ (accessed on 19.06.2022)

  • Casey Newton, ‘Google’s new focus on well-being started five years ago with this presentation’, www.theverge.com, The Verge, 2018, https://www.theverge.com/2018/5/10/17333574/google-android-p-update-tristan-harris-design-ethics (accessed on 19.06.2022)

  • Casey Newton, ‘The leader of the Time Well Spent movement has a new crusade’ www.theverge.com, The Verge, 2019, https://www.theverge.com/interface/2019/4/24/18513450/tristan-harris-downgrading-center-humane-tech (accessed on 19.06.2022)

  • John E. Kwoka and Tommaso M. Valletti, Scrambled Eggs and Paralyzed Policy: Breaking Up Consummated Mergers and Dominant Firms (November 24, 2020), Industrial and Corporate Change. Available at SSRN: https://ssrn.com/abstract=3736613 or http://dx.doi.org/10.2139/ssrn.3736613