Columns
Regulating social media platforms
It should balance empowering users, holding platforms accountable and promoting transparency.
Ujjwal Acharya
Social media sites have become ubiquitous in our daily lives. Although they are omnipresent, unavoidable and beneficial, we do not know how to tame them, ensuring their proper use and protecting them from malefactors. Regulating these platforms is complicated, primarily because of the internet's global reach and people's right to opinion and expression. Some international, regional and national efforts to regulate them have been effective, insofar as the platforms have complied to protect their markets and profits.
Regulatory efforts
In consultation with various stakeholders and experts, the Ministry of Communication and Information Technology (MoCIT) has been drafting a bill on the regulation (operation and management) of social media use since the Council of Ministers approved the proposal on November 29, 2023. This came following the government's controversial decision to ban TikTok in Nepal, which drew massive criticism from activists on the internet and is under review at the Supreme Court. However, people and critics alike agree on the urgency of strictly regulating social media platforms.
Two days before the ban, the Directives to Manage Use of Social Media 2023 was published in the gazette and came into effect. The directives state that they were necessary to "manage the use of social media and promote self-regulation of social media operators and users". Issued under the Electronic Transaction Act 2008, they have provisions requiring social media platforms to register, appoint a point of contact and follow the rules and directives issued by a unit at the MoCIT. They also list 19 "do not" rules for social media users.
I believe the directives are heavily restrictive and go beyond the mandate of the Constitution of Nepal and the directives' parent Act. A comprehensive regulatory framework that addresses the multifaceted aspects of social media platforms should instead strike a balance between empowering users, holding platforms accountable and fostering a responsible and transparent digital environment.
Intertwined and confusing aspects
Regulating social media platforms necessitates an approach that addresses two intertwined and often conflated aspects: User-generated content and platform responsibilities. Users should bear responsibility for the content they produce, emphasising accountability for the potential impact of their posts. Concurrently, platforms should remove illegal content through user reporting mechanisms and proactive monitoring.
The regulatory framework encompasses various dimensions: Local registration, compliance with local and national laws, taxation, accountability and transparency. Social media platforms must go through a registration process, ensuring they operate within the legal frameworks of the jurisdictions where they operate. Registration also provides a point of contact for communication when necessary. To make the framework practical and acceptable, the laws and regulations must stay within the acceptable parameters of human rights and the protection of citizens.
Social media platforms must be transparent in their operations and procedures, especially regarding content removal, giving users clear insights into the processes employed. There should also be transparency in advertising practices, ensuring users' awareness of the nature and origin of advertisements on the platform. Facebook's Ad Library is an example of how this could be achieved. Accountability outlines the platform's commitment to actively contribute to the well-being of the online and offline community in the respective country. This could be achieved through corporate social responsibility and immediate response during disasters or violence. What Viber did by creating channels to support life-saving content during the 2015 earthquakes and the Covid-19 pandemic can be taken as an example.
Taxation of social media platforms is a must as their advertising revenue grows. They should be mandated to use local agents so citizens can trade in local currency and such expenditures remain within the purview of the state. If these platforms accept advertisements microtargeted only at audiences in Nepal, those transactions should take place in Nepal or be brought within the country's regulatory advertising framework.
UNESCO recently published "Guidelines for the Governance of Digital Platforms: Safeguarding Freedom of Expression and Access to Information through a Multistakeholder Approach", which outlines five platform principles: Conducting human rights due diligence; adhering to international human rights standards, including in platform design, content moderation and content curation; transparency; making information and tools available for users; and accountability to relevant stakeholders.
Responsibility for content
While the policy for social media regulation should address the abovementioned matters, existing laws must deal with user-generated content issues. The platforms should be held accountable for illegal content—child pornography, hate speech, incitement to violence and so on—that they fail to remove despite being aware of its presence through their monitoring and user reports.
The platforms should remove other content deemed illegal under the existing laws and provide the state with data on the users who post such content. This process should be based not on the arbitrary power of a ministry or its unit but on a court order. Security agencies and/or a designated government unit should be given a communication channel to request content removal in an emergency. A mechanism should exist whereby agencies issuing such requests are held accountable for their actions, and the courts regularly review such requests to ensure their necessity.
Ironically, the government decided to ban TikTok over illegal content but took no action against the users who produced such content. The problematic or "bad content" could be controlled by implementing existing laws. The National Penal (Code) Act 2017, the Electronic Transaction Act 2008 and other laws have provisions that could be applied to restrict and punish the perpetrators.
Nepal's attempts to regulate social media show that policymakers have not clearly drawn the line between content regulation and platform regulation. The documents they have created appear messy and restrictive and do not adhere to established international standards. Countries like Nepal, which have a tiny social media market, lack the evidence, data, expertise and technology to draft proper regulations. We need appropriate legal instruments to ensure that social media platforms are regulated correctly and that citizens enjoy their benefits without hampering democracy.