Furthermore, there is growing apprehension that such regulatory measures could pose a significant threat to the fundamental democratic values that underpin the nation's governance, as well as to the principle enshrined in Section 3(3) of the CMA, which states that “nothing in the CMA shall be construed as permitting the censorship of the internet”. Civil society organisations (CSOs), including ARTICLE 19 and CIJ, which have previously been engaged for consultation, have expressed concerns to the government regarding the potential imposition of licensing on social media platforms to moderate harmful content. We have advised the government against hasty decision-making and emphasised the need for thorough consideration of the implications and the stakeholders involved.
Additionally, on June 27, 2024, the CSOs issued a letter to the Prime Minister urging the government to prioritize increased collaboration and consultation with civil society organisations and other relevant stakeholders. Their message highlighted the importance of inclusive and transparent processes in shaping policies related to social media regulation.
Overreach of licensing framework
The licensing system for network and application services faces significant challenges: it is difficult to anticipate future needs and developments, and there is a notable lack of independent oversight, which undermines the fairness and transparency of the licensing process. This absence of clear guidelines and oversight has created uncertainty for social media platforms.
Consequently, these platforms may be required to meet specific regulatory requirements and adhere to standards set by regulatory authorities as part of the licence renewal process. This would entail a closer working relationship between the platforms and the regulatory bodies, ensuring that they operate in accordance with the requirements outlined in the licensing framework. As a result, platforms could become more compliant and consent to more removal requests from the government, instead of focusing on effective and timely content moderation.
It is important to note that a lack of transparency in the compliance process gives large platforms even more power to police what we see, say, and share online, with disastrous consequences for public debate, the free flow of information, and democracy. Social media networks are a vital space for us to connect, share, and access information.
ARTICLE 19, in a legal analysis of the CMA, repeatedly warned that some of its provisions are problematic and not in line with international human rights standards. Each new regulation gives the Malaysian Communications and Multimedia Commission (MCMC) more power to regulate content and social media companies. We have repeatedly raised concerns about the use of Sections 211 and 233 of the CMA to define harmful content, provisions that have been abused over the years to restrict freedom of expression. In principle, we reiterate that Sections 211 and 233 of the CMA should be repealed, as they are expansive in scope and vague in interpretation. They also fail to meet international freedom of expression standards, in particular the three-part test: restrictions must be provided by law, pursue a legitimate aim, and be necessary and proportionate.
Platform accountability
ARTICLE 19 and CIJ understand the government’s intent to hold social media platforms and messaging applications accountable as a means of tackling online abuse, hate speech, and other problematic content, including scams and fraud, that targets children and other online users.
The important first step is to get social media platforms to (i) enhance their community standards and guidelines to meet international human rights standards, including on data protection, privacy, and transparency in the use of artificial intelligence (AI); and (ii) ensure that their content moderation and removal policies and actions are effective and timely, and carried out in transparent and systematic ways, without personal, political, or business biases. Social media platforms will have to invest in adequate human and language-detection resources to go beyond automated flagging or the use of AI to detect harmful content.
Thus, the government will have to adopt innovative and alternative means of holding these platforms accountable, as any attempt to incorporate them into a more traditional regulatory regime is unlikely to be effective and may have unforeseen implications, given the rapidly evolving nature of technology and the global reach of these platforms. Any attempt to hold the platforms accountable must ensure meaningful protection of the rights of the public, including by not infringing on users’ freedom of expression.
The Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression noted in 2021 that, while Malaysia is not a party to the International Covenant on Civil and Political Rights (ICCPR), Article 19 of the ICCPR is based on Article 19 of the Universal Declaration of Human Rights (UDHR) and should therefore inform Malaysia’s obligations under international law. Under Article 19(3) of the ICCPR, restrictions on freedom of expression are permissible only when “provided by law” and necessary for “respect of the rights or reputations of others” or “for the protection of national security or of public order (ordre public), or of public health or morals.”
Way forward
It is essential to address the lack of transparency regarding the specific requests made by the MCMC or other government entities to the platforms and their responses to these requests. The government should avoid unnecessarily regulating online content moderation and licensing social media platforms. Any regulatory framework for social media platforms must be based on principles of transparency, accountability, and the protection of human rights.
This should include requirements to enhance transparency in content moderation decisions and to improve systems for resolving disputes arising from these decisions.
It is recommended that the government adopt the following:
1) Establish a social media council which would promote a multi-stakeholder independent regulatory framework;
2) Set up an independent committee to review the root causes of hate speech and cyberbullying, and relatedly develop a comprehensive plan of action, using the Rabat Plan of Action as the framework; and
3) Enhance its education and awareness programmes aimed at building a resilient society guided by ethical and responsible content creation standards, and equipped with adequate digital literacy to combat the dangers of harmful content.
In conclusion, to achieve better results in countering harmful social media content and protecting users, the government must reconsider its current plan and consult more comprehensively with CSOs. This is necessary because effectively addressing harmful content goes beyond content moderation; it also entails addressing the root causes of issues such as hate speech, cyberbullying, and gender-based violence. Engaging with CSOs can provide insights into the broader societal and systemic problems that contribute to harmful content and help develop more holistic and effective strategies for mitigating these challenges.
This press statement is issued by the Centre for Independent Journalism (CIJ).