Can intermediary regulation fix broken social media?

By Kamesh Shekar

The issues of hate speech, disinformation, misinformation and polarisation on one side, and the arbitrary exercise of power over public discourse and information on the other, are what I call the broken social media problem. The urge to fix social media has been well recognised by various policy actors, from policy cognoscenti and the State to the general public, ever since the Cambridge Analytica fallout. While this remained a simmering problem for a long time, some recent developments have escalated its visibility into an unanticipated crisis.

Social media is a deliberative medium. The nation's mood toward a couple of unrelated but significant problems, which directly affected the standing of the incumbent Government among the public and opinion-makers, opened up the window for regulating social media. Since 9 August 2020, Indian farmers have been protesting against the three farm bills, which were passed in September 2020 (having first been promulgated as ordinances in June 2020). Recently, the Government ordered Twitter to remove about 1,178 accounts (including one belonging to the eminent publication The Caravan) that were accused of spreading disinformation about the farmers' protest. While Twitter complied with the order for a brief period, it restored the accounts after a few hours. This event and others (such as the Tandav fallout) pushed the State to develop the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules, 2021).

Intermediary Guidelines and Digital Media Ethics Code

The window for social media policy opened up in 2018, when the Ministry of Electronics & Information Technology (MeitY) published draft intermediary guidelines for public comments. After that, the policy window remained closed until 2021, when political will (a reaction to political crises and public outcry) reopened it. MeitY recently notified the IT Rules, 2021. The parts on digital media and online curated content were newly incorporated into the IT Rules, 2021; since they were not part of the 2018 draft, policy actors had no chance to deliberate on them.

As the IT Rules, 2021 were passed as an executive order, there is still scope for legislation. It is currently difficult to say when the window for legislation will open. Still, policy actors such as policy commentators, legal scholars and advocacy groups have already started pointing out the many flaws in the IT Rules, 2021, seeking a more democratic route (through legislation) and pushing for the policy window to open again.

When the window for legislation opens up, policy actors must deliberate on some nuanced aspects of social media content and its moderation in order to form robust legislation. Social media platforms perform two kinds of moderation: hard moderation, in the form of content takedowns and flagging, and soft moderation, in the form of algorithmic recommendation of content. The legislation must recognise this difference and consider the suggestions below.

Principle-Based Approach for Hard Moderation

Social media platforms follow their own community guidelines as the basis for hard moderation of third-party content. Under the IT Rules, 2021, the Government can also order platforms to take down content it determines to be unlawful, within 36 hours of notification. While performing hard moderation, both social media platforms and the Government adopt a content-based approach, deciding which content to take down at their own discretion, backed by neither principles nor supervision. This paves the way for unaccounted censorship of our information appetite and public discourse, where it becomes challenging to determine the rationale behind a takedown: whether it serves collective user welfare, targets genuine wrongdoing, or serves political ends. For instance, the Government of India recently requested the takedown of 52 tweets that criticised its handling of the second wave of COVID-19. Twitter complied with the request, taking down the tweets and providing notice to the respective users. This act drew global attention and criticism from policy actors, as it violates freedom of expression and amounts to unaccounted censorship. It also shows how challenging it is to hold the Government accountable for such takedown requests.

Therefore, it is critical to move towards a principle-based approach, where we define principles (in line with fiduciary responsibility) to guide the hard moderation process. The Government's role should be limited to defining these principles in deliberation with various policy actors, and to monitoring due diligence by mandating disclosure mechanisms for social media platforms. Hard moderation by social media platforms must then conform to the defined principles on a case-by-case basis, accounting for the context, significance and veracity of the information.

Besides, it is also essential to enumerate charter rights for digital platform users as part of the legislation, to counterbalance and check the power wielded by both the Government and social media platforms over our information appetite.

Outcome Analysis for Enhancing Soft Moderation

When we see and read posts on social media, we rarely realise that the content appearing on our screens is not coincidental. Social media platforms collect information (input data) on our preferences, behaviour, relationships and so on to build an algorithmic system that recommends content (output) designed to keep us on the platform for as long as possible. What is missing from these recommendation systems is feedback from outcome analysis. While platforms adjust their recommendation algorithms to account for changes in user behaviour, there is no guarantee that they do so to enhance user welfare. Therefore, it is necessary to mandate audits of algorithmic recommendation systems. An independent auditing agency should perform the audit against a set of defined normative principles, and the results and inferences derived from it should be incorporated back into the algorithm (input data and code) to improve the recommendation system's outcomes for users. Besides, as a stimulative policy measure, a market for algorithmic rating systems should be forged to push platforms toward performing better on user outcomes.
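To make this audit-and-feedback loop concrete, here is a minimal sketch in Python. It assumes a toy feed, a single hypothetical normative principle (source diversity) as the audit metric, and a hypothetical audit_weight parameter for folding the auditor's findings back into the ranking; none of this reflects any platform's actual system.

```python
from dataclasses import dataclass


@dataclass
class Item:
    item_id: str
    engagement_score: float  # platform's predicted engagement (the usual objective)
    source: str              # publisher/source, used by the diversity principle


def diversity_score(ranking: list[Item]) -> float:
    """Audit metric: share of distinct sources in a slice of the feed.

    One hypothetical normative principle an independent auditor might
    measure; a real audit would span many more dimensions."""
    if not ranking:
        return 0.0
    return len({item.source for item in ranking}) / len(ranking)


def rerank(items: list[Item], audit_weight: float) -> list[Item]:
    """Feedback step: blend engagement with the audited principle.

    Greedily picks the next item, penalising sources already shown.
    audit_weight stands in for 'incorporating audit results into the
    algorithm'; an auditor's findings would tune it."""
    ranked: list[Item] = []
    pool = list(items)
    seen_sources: set[str] = set()
    while pool:
        best = max(
            pool,
            key=lambda i: i.engagement_score
            - (audit_weight if i.source in seen_sources else 0.0),
        )
        ranked.append(best)
        pool.remove(best)
        seen_sources.add(best.source)
    return ranked


feed = [
    Item("t1", 0.90, "A"),
    Item("t2", 0.85, "A"),
    Item("t3", 0.60, "B"),
    Item("t4", 0.50, "C"),
]
engagement_only = sorted(feed, key=lambda i: i.engagement_score, reverse=True)
with_feedback = rerank(feed, audit_weight=0.4)
print(f"top-3 diversity, engagement only: {diversity_score(engagement_only[:3]):.2f}")
print(f"top-3 diversity, with feedback:   {diversity_score(with_feedback[:3]):.2f}")
```

In this toy run, the engagement-only ranking fills the top of the feed from a single source, while the audit-informed re-ranking lifts top-3 source diversity from 0.67 to 1.00 at a small engagement cost: exactly the kind of input-output-feedback correction that outcome analysis is meant to supply.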

Therefore, to fix the broken social media problem, legislation is needed, framed in consultation with various policy actors, on these nuanced aspects of social media. The roles and responsibilities of the Government and of social media platforms in hard and soft moderation have to be defined clearly. There is also an urgent need for a framework that holds the Government and social media platforms accountable for their actions by vesting charter rights in digital users.

The author is a tech policy enthusiast. He is currently pursuing the PGP in Public Policy at the Takshashila Institution. Views are personal and do not represent Takshashila's policy recommendations.
