By Ashwini Natesan
The author is a Legal Consultant / Research Fellow specializing in Technology, Media and Communications Law
The Online Safety Bill (Bill) published in the gazette issued on 18 September 2023 is a crucial development and has attracted widespread attention, and rightly so. Prior to this Bill, existing legislation in Sri Lanka did not regulate internet intermediaries. In simple terms, an internet intermediary may be a platform provider that hosts content posted by third parties for other users to access, a messaging platform, or a search engine. While existing laws have already been used to regulate disinformation, intermediary liability has been a grey area in Sri Lanka. The Bill seeks to fill this lacuna. The Bill, however, impacts not only internet intermediaries but all users of online platforms, including the media.
Amongst others, the preamble to the Bill suggests it is TO PROHIBIT ONLINE COMMUNICATION OF CERTAIN STATEMENTS OF FACT IN SRI LANKA; SUPPRESS THE FINANCING AND OTHER SUPPORT OF COMMUNICATION OF FALSE STATEMENTS OF FACT AND FOR MATTERS CONNECTED THEREWITH OR INCIDENTAL THERETO.
The Bill defines “fact” as that which “includes anything or state of things which are seen, heard or otherwise perceived by the users of internet-based communication services;”
The first question that arises is: what is a “statement of fact”, and who decides what constitutes one?
A reading of the provisions of the Online Safety Bill reveals that the Online Safety Commission (“Commission”) would be the authority empowered, amongst others, to “issue notices to persons who communicate false statements that constitute offences under this Act, to stop the communication of such statement”. The Commission, whose members are appointed by the President, has wide powers, including the power to issue codes of practice to internet intermediaries.
The offences under the Bill also raise concerns. Clause 12 of the Bill reads that “Any person…. promotes feelings of ill-will and hostility between different classes of people, by communicating a false statement, commits an offence and shall on conviction be liable to imprisonment for a term not exceeding five years….” While the “truth” or “falsity” of a statement is itself not easy to discern, determining what “promotes ill-will and hostility” is even more uncertain.
There are many offences under the Bill, ranging from false statements amounting to contempt to intentional insult by false statement with intent to provoke a breach of the peace. These widely worded offences could limit freedom of expression. For instance, Clause 14 of the Bill states that any person who “maliciously or wantonly, by communicating a false statement gives provocation to any person intending or knowing it to be likely that such provocation, will cause the offence of rioting to be committed” would be liable to imprisonment, whether rioting has been caused (imprisonment not exceeding 5 years) or not (imprisonment not exceeding 3 years). The Bill defines “false statement” to mean “a statement that is known or believed by its maker to be incorrect or untrue and is made especially with intent to deceive or mislead but does not include a caution, an opinion or imputation made in good faith.” Concerns arise over how this would be interpreted.

Another example is Clause 22 of the Bill, where even the communication of a statement of fact that could be “harassing” is an offence, punishable with imprisonment not exceeding 5 years or a fine not exceeding five hundred thousand rupees. Clause 22 of the Bill reads as follows: “Any person, whether in or outside Sri Lanka who wilfully makes or communicates a statement of fact, with intention to cause harassment to another person (in this section referred to as the “target person”), by publishing any “private information” of the target person or a related person of the target person, and as a result causes the target person or any other person harassment, commits an offence”. Concerningly, the term “harassment” is defined to mean “an act or behaviour which has the effect of threatening, alarming or distressing a person or violating a person’s dignity or creating an intimidating, degrading, hostile, humiliating or offensive environment or, which has all such effects”.
This could particularly affect media freedoms in the reporting of even ordinary news items, where such news could have the effect of “alarming or distressing” another person. It is pertinent to note that this Bill has overriding effect over other laws in the event of inconsistency.
Intermediary Liability and Safe Harbour
Of particular relevance are the provisions on intermediary liability. Where a complaint has been made relating to a false statement, the Commission can issue a notice to the internet intermediaries. Specifically, internet intermediaries are required to disable access to and remove prohibited content within 24 hours. Beyond the practicality of such a requirement, it raises serious concerns as to the extent of regulation of online expression.
Clauses 28 and 29 of the Bill call for disclosure of the identity of the person making the prohibited statement; where such information is not available, the internet service intermediary may be called upon, through an order of the Magistrate Court, to disclose such information. How will this provision be implemented where messaging platforms are concerned? In terms of the Bill, “internet service intermediary” includes “a service of transmitting such materials to end users on or through the internet;”, clearly bringing messaging platforms within its ambit. Could this mean that messaging platforms with end-to-end encryption may need to alter their operations? A reference can be made to the provisions of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 in India. Rule 4(2) of the said Rules requires messaging platforms to disclose details of the originator of messages. These Rules, including the said provision, have been challenged in the courts of law in India.
Where internet service intermediaries fail to adhere to the Codes of Practice issued by the Commission and such failure has caused wrongful loss, liability is imposed on intermediaries under Clause 30 of the Bill. The exemption / safe harbour provision included under Clause 31 of the Bill is subject to compliance with rules and regulations made under the Bill. Could these provisions lead to platforms policing content more rigorously?
Part VI of the Bill deals with inauthentic online accounts (including bot accounts) and coordinated inauthentic behaviour. Here again, the Commission may issue a notice to the internet intermediary to prohibit its platform from being used to communicate any prohibited content or prohibit persons from using such specified accounts.
While there is a definite necessity to regulate disinformation, hate speech and other concerns arising from content posted online, such a law should be drafted in more precise and definite terms.