
Growing Digital Platforms and the Need for Regulatory Framework

Ashmita Gautam

May 22, 2020



Throughout the initial stages of the COVID-19 pandemic, racist memes of people eating bats engulfed social media. Posts inflating patient statistics and unprofessional content on topics such as vaccination, health care, and nutrition also gained massive popularity, posing a severe threat to public health.

This problem has been worsened further by an influx of internet users with limited digital literacy. Users treating social media content as sacrosanct has not only exacerbated public stress but also called into question the capacity and integrity of digital intermediaries.

This situation calls for a functional reassessment of digital businesses. Major e-service providers like Google, Airbnb, Facebook, Amazon, and Uber have prioritized the digital market, which they recognize as one of the most crucial markets in the world.

Within this marketplace, digital platforms operate as intermediaries to facilitate interaction between producers and end-users (including buyers).


Thus, a multi-sided market is created with distinct user groups: Uber enables transactions between drivers and passengers, while YouTube gives a platform to content providers, viewers, and advertisers. Similarly, a video game console maker charges consumers for playing games on the console and levies a separate royalty on game developers.

To survive the competition, these platforms reap benefits by distributing digital content and services at low marginal costs and high unit margins.

In no time, growth begins to follow an exponential trajectory as network effects kick in. More sellers attract more buyers and vice versa, propelling a self-reinforcing cycle of network externalities with economic and legal consequences, from the Facebook–Cambridge Analytica data scandal to Uber’s labor rights controversies.

Such legal disputes are ubiquitous across platforms, indicating the need for a regulatory mechanism with traceability and proactive monitoring.

This article therefore focuses on the selection of an appropriate method of regulation by highlighting the broad contours of the escalation mechanism, its domestic and geopolitical repercussions, and the existing regulatory frameworks for digital intermediaries.

Domestic and Geopolitical Implications

For domestic implications, there are innumerable cases of fake news and hate speech. In the paper “Fanning the Flames of Hate: Social Media and Hate Crime”, Müller and Schwarz (2018) investigated the link between social media and hate crime using data from Facebook.


The study showed a strong correlation between exposure to hate speech on social media and real-life violence, indicating how hateful social media activity (such as anti-refugee remarks) translates into violent anti-refugee hate crimes.

The recent rise of mass shootings, xenophobia, and Islamophobia in numerous countries cannot be overlooked either.

From the geopolitical viewpoint, cross-border data flows have made country borders far more porous than globalization alone ever could.

With international regulatory regimes developing slowly, some countries have adopted regional guidelines, but these rules do not guarantee the safety of data extracted or stored in other jurisdictions. Moreover, the number of cases registered under mobile phishing and other malicious campaigns has put sensitive information at risk, ranging from personal data (such as name, address, and phone number) to personally identifiable information (such as credit card details).

Existing Regulations

As a result, various countries are debating the need to regulate these digital platforms and the authenticity of their content. While some regulations already exist (Amazon’s video-on-demand service, for instance, is regulated by Ofcom), lawmakers have recommended a single regulatory body that can protect users from deleterious content and from the monopolistic behavior of platform owners.

The EU Digital Single Market Strategy aims to introduce sectoral legislation that would narrow liability exemptions for online intermediaries.

Nonetheless, if a regulator attaches strict penalties to these obligations, digital platforms might simply delete content for every complaint received. Moreover, given the volume of content generated, a single regulatory body would not have the capacity to investigate such a large number of complaints.

With platforms opposed to sharing large sets of operational data with public authorities, information asymmetry poses another concern in this framework.

Having concluded that a single regulatory body is an inappropriate method of regulating platforms, we can turn to the alternatives, starting with self-regulation.

Under self-regulation, platform owners come together and devise a common framework of guidelines to govern themselves.

Self-regulation has already gained wide industry support, as it accommodates the heterogeneity of digital business models by allowing adequate time for consensus building.

One such framework has been launched by the Internet and Mobile Association of India (IAMAI) for the self-regulation of Online Curated Content Providers, with Hotstar, Voot, Jio, and SonyLIV as signatories.

In certain circumstances, self-regulation can also be mandated by public authorities. One such example comes from California in the US.

The state created the category of Transportation Network Companies (TNCs) as a self-regulation option for ride-sharing platforms, defining standards that drivers must adhere to.

Even though the guidelines are pre-defined, the enforcement mechanism is left to the platform itself: Uber, for instance, delists drivers or riders if they damage property, hurt someone, or engage in a romantic relationship with drivers or fellow riders.


As described by Michèle Finck in her working paper “Digital Regulation: Designing a Supranational Legal Framework for the Platform Economy”, isolated self-regulation not only lacks transparency but also fails to account for the interests of actors other than the platform itself.

Encouraging self-regulating oligopolies in the digital space offers no guarantee that companies will respect individuals’ freedom of speech and expression.

Further, self-regulation relies largely on automated systems, which can be risky because the technology has not developed to the point where it can understand the context behind the content. With no human in the loop, it is difficult to trust an algorithm.

This scenario points to the need for a co-regulation model that combines elements of both state regulation and self-regulation.

Under this set-up, companies can come together to meet certain standards of transparency in content adjudication.

The negotiation process would require companies to develop internal methods of regulation that account for the differences in their services.

An intermediary body would then adopt an appropriate ‘content standards code’, and the adequacy of this type of regulation would be checked by a statutory body.

Whenever companies are dissatisfied with the internal regulation, the statutory body can screen the internal procedures.

This method is likely to achieve better compliance than the self-regulation model, as an external adjudicator can make unbiased decisions after a reliable evaluation of each complaint.

Way ahead

Going forward, the need to regulate digital intermediaries is not in question, but the selection of an appropriate escalation mechanism still needs to be reassessed.

The co-regulation model is a more dependable system of regulation, as it allows public and private players to work as collaborators, fostering cooperation between the state, civil society, and market players.

However, the recommendations of human rights experts must also be incorporated into this model through adequate stakeholder consultation.

In this way, a robust regulatory approach can be created, one with a substantial commitment both to promoting business efficiency and to protecting the public interest.

(The writer is a Visiting Fellow at NIICE)

(Nepal Institute for International Cooperation and Engagement (NIICE), Nepal’s independent think tank, and Khabarhub — Nepal’s popular news portal — have joined hands to disseminate NIICE research articles from Nepal)
