
Industry approaches in handling online exploitation of children: A comparative study of the policy, guidelines and best practices in Malaysia, Singapore and Australia

Article: 2241713 | Received 21 Feb 2023, Accepted 24 Jul 2023, Published online: 15 Aug 2023

Abstract

Child exploitation on the Internet is an issue of international importance. Despite notable efforts taken by nations to prevent the online exploitative use of children, the problem remains serious. Although no single entity can take responsibility for regulating the material, an added responsibility appears to be placed on Internet Service Providers, as gatekeepers of information on the Internet, to take all possible measures to regulate it. This study focuses on industry regulatory initiatives, including the policy, guidelines, and best practices enforceable by the communications and multimedia industry, with a focus on Content Service Providers, in handling online exploitation of children in Malaysia. The article begins by examining the current legislative framework provided in the Communications and Multimedia Act 1998 and the Content Code in Malaysia. It then examines the Online Safety Act 2021 of Australia and the Online Safety (Miscellaneous Amendments) Act 2022 of Singapore, which was passed in Parliament on 9 November 2022, with a focus on the responsibilities of industry service providers in content regulation. Finally, the article concludes by proposing recommendations from the Australian and Singaporean legislative frameworks as points of reference for best practices in regulating content in Malaysia.

1. Introduction

Online exploitation of children is an ever-growing threat to the safety and wellbeing of children. As the Internet becomes more easily accessible, more children will be accessing it, which inevitably raises issues pertaining to their safety. Although no single entity can take responsibility for regulating the material, an added responsibility appears to be placed on Internet Service Providers, as gatekeepers of information on the Internet, to take all possible measures to regulate it. This study therefore focuses on the industry regulatory initiatives, including policy, guidelines, and best practices, that are enforceable by the communications and multimedia industry, with a focus on Content Service Providers in handling online exploitation of children in Malaysia. More specifically, the research focuses on the standard operating procedures, incident handling, and responses that Communications and Multimedia Industry players have implemented in relation to online exploitation of children in Malaysia.

This article begins by examining the current legislative framework provided by the Communications and Multimedia Act 1998 ("CMA '98") and the Content Code in Malaysia. The article then examines the Online Safety Act 2021 of Australia and the Online Safety (Miscellaneous Amendments) Act 2022 of Singapore, with a focus on the responsibilities of industry service providers in content regulation. As this study focuses on the liability of Content Service Providers, Australia and Singapore have been selected as the basis for a comparative study because both legal frameworks contain comprehensive provisions and a set of standards placed on service providers for determining material deemed unsuitable, especially in relation to online child exploitation. For example, the framework in Australia includes a set of Basic Online Safety Expectations, which articulate the Government's minimum safety expectations for online services, such as social media, gaming and online dating sites, websites, and online messaging services. Mandatory reporting requirements in the Act require online service providers to provide specific information about online harms that occur on their platforms. In Singapore, the Online Safety (Miscellaneous Amendments) Act 2022 defines the term "Online Communication Service," which was previously not defined in the Broadcasting Act 1994. These two jurisdictions were therefore chosen for their relevance in addressing the specific issue of placing liability on content service providers, and to assess the effectiveness of such an approach in regulating online exploitative material in Malaysia. The article concludes by proposing recommendations from the Australian and Singaporean legislative frameworks as points of reference for best practices to be implemented when regulating child exploitative content in Malaysia.

1.1. Understanding online child exploitation against the international framework

Child abuse, whether physical or in online media, is not new. Literature and art from ancient Greek and Roman times depict how children, mostly young boys, were abused for sexual pleasure. Though ancient documents regularly portrayed children as sexual objects, it was only around the 18th century that the idea that young children should be protected from sexual abuse was promulgated. Child abuse took a different turn with the advent of the Internet. The many facets of Internet technology, which allow users to communicate across jurisdictions, remain anonymous, and reach out to many other users, have contributed to the alarming number of online abuse cases. Children are now known as "digital natives," as described by Marc Prensky in "Digital Natives, Digital Immigrants": a generation of young people who are all "native speakers" of the digital language of computers, video games and the Internet, with at least five years of active Internet use.Footnote1 Prensky also observed that the Internet has become children and young people's primary source of entertainment and information and is a vital part of their social life. There is clearly an important difference between children, particularly younger children, inadvertently stumbling across pornographic and obscene content online, and young people who deliberately seek it out, intensifying the debate over sexualised content online and its consumption by children and youth. Adam Tomison proposed a very general definition of the term child sexual abuse as "the use of a child for sexual gratification by an adult or significantly older child/adolescent."Footnote2

In Malaysia, Juriah Abdul Jalil opined that the sexual victimisation of children has evolved since the advent of the Internet, which has created opportunities for individuals with bad intentions to abuse, harm and exploit children. In "Combating Child Pornography in Digital Era: Is Malaysian Law Adequate to Meet the Digital Challenge?"Footnote3 she also raised the concern that some might even collaborate to organise crimes that exploit and abuse children, particularly in child pornography. Juriah's concern is justifiable, as pornographic materials depicting children are so easily available on the Internet. In Malaysia, prohibited content under the Malaysian Communications and Multimedia Content Code includes indecent content, described as "material, which is offensive, morally improper and against current standards of accepted behaviour." This suggests that such harmful materials are not condoned by the Malaysian communications regulators. Nevertheless, sexual materials are still easily available online, and children are still easily exposed to such content. As mentioned earlier, Juriah Abdul Jalil noted that indecent, obscene and offensive online content is governed by the CMA '98 and the Content Code. In "Choosing not to be a Target: Protecting Children from harmful and illegal material on the Internet,"Footnote4 Manique Cooray analyses the issue of children becoming victims through unsupervised Internet surfing and provides a detailed account of the legislative provisions in Malaysia regulating content on the Internet. It should be highlighted that the guidelines revised in the 2022 Content Code now contain specific provisions on "Child Pornography": the content is kept separate from obscene and indecent material and is to have the meaning defined under Part 1 of the Code, with specific reference to Section 4 of the Sexual Offences Against Children Act 2017.Footnote5

The liability of Internet Service Providers for cyber defamation under the Communications and Multimedia Act 1998 in Malaysia was examined by Zahidah Zakari et al. in "Liability of Internet Service Provider in Cyber Defamation: An Analysis on Communication and Multimedia Act 1998."Footnote6 Although that article focuses on online defamation, the writers examined the law regarding the liability of online intermediaries. The study found that the CMA '98 should be revised and amended, specifically in the provisions on the liability of ISPs in Malaysia. In "Internet Service Providers Liability for Third Party Content: Freedom to Operate?" Ida Madieha Abdul Ghani Azmi et al. contend that, in order to regulate the material and address ISP liability, a procedure arrived at via negotiations and multilateral agreements involving a number of countries should be implemented.

It is observed that, aside from the above writers, there is limited research specifically addressing the problem of child online exploitation in Malaysia, apart from various reports providing statistics on the subject matter by the Department of Social Welfare and the Royal Malaysian Police. There is a dearth of research on the various standard operating procedures followed in Malaysia for the removal of online sexual content involving children by the various application and platform providers on the Internet. This research therefore addresses the research gap on child online exploitation in Malaysia, the best practices and guidelines followed by Content Service Providers, and the extent to which the current provisions in the CMA '98 and the Content Code address the issue of liability of content service providers.

1.2. Theoretical framework and methodology

By introducing a legal framework that places certain responsibilities on service providers, it is believed that the amount of sexual content online could be significantly reduced, thus preventing minors from being easily exposed to all types of sexual content on websites and social networking platforms and to other forms of abuse online. Hence, the theoretical framework of this study is built upon the principle of preventing harm to children. The methodology employed in this research is twofold. First, doctrinal research with a comparative element was carried out on the existing literature and reports. The analysis is presented as a descriptive analysis of the current literature on child online exploitation. Qualitative data were collected from secondary sources, including public documents, journals, conference articles and newspapers. Second, a non-probability sampling method was used, whereby respondents were selected on the criterion that they provide services as content service providers able to remove child online sexual material in Malaysia. To this end, a survey questionnaire was sent to platform operators such as TikTok, Instagram, WhatsApp, YouTube, Telegram, Google Play Store, Twitter, Telekom Malaysia, Celcom, Maxis and App Store. Given the confidentiality of the standard operating procedures followed by each platform, the study also examines the various standard operating procedures that are available and made known to the public. This also compensated for the lack of responses from the above platform operators to the survey link shared by the researchers.

2. Online content regulation in Malaysia

The Internet was introduced in Malaysia in June 1987, when the Malaysian Institute of Microelectronic Systems established a Malaysian computer network called "Rangkom" with the co-operation of several local institutions of higher learning. Rangkom was subsequently renamed Joint Advanced Integrated Networking ("JARING") and became the main Internet Service Provider (ISP).Footnote7 In 1992, a satellite link was installed between Malaysia and the United States of America, and JARING was fully connected to the Internet, providing Malaysian users with access to Internet resources in more than 140 countries. Figure 1 indicates the growth of Internet users in Malaysia; to date, there are 13 ISPs in the country.Footnote8 During this period, "Vision 2020" was launched with the objectives, inter alia, of establishing a scientific and progressive society, leading to the establishment of the Multimedia Super Corridor in 1996.

Figure 1. Kemp S, "Digital 2022: Malaysia – DataReportal – Global Digital Insights" (DataReportal, February 15, 2022) https://datareportal.com/reports/digital-2022-malaysia#:~:text=Internet%20use%20in%20Malaysia%20in%202022&text=Malaysia’s%20internet%20penetration%20rate%20stood,percent.


The birth of a new convergent communications and multimedia industry in Malaysia required a new paradigm for media policies and regulations. In line with this, Malaysia adopted a convergence regulatory model for the communications and multimedia industry in November 1998. A series of laws was enacted to govern the industry, including the CMA '98.Footnote9

In Malaysia, the CMA '98 establishes a framework premised first on the view that the communications and multimedia industry is in the best position to regulate itself. In line with this, the Act provides for the introduction of four self-regulatory codes prepared by the appropriate industry forums. By virtue of Section 212 of the CMA '98, the matters which the Content Code may address include, but are not limited to, restrictions on the provision of unsuitable content, the methods of classifying content, and the procedures for handling public complaints and for reporting information about complaints to the Commission, among others. The regulatory model followed in Malaysia is therefore largely a self-regulatory system. Various recently reported cases demonstrate that many children in the country are exposed to pornographic and obscene material online. Cases of cyberbullying and breaches of privacy are increasing. Although the existing legislative provisions in Malaysia make it an offence to distribute and make available such content, harmful sexual material can easily be accessed from websites originating from external servers. Generally, therefore, the limitation of the CMA '98 and the Content Code in relation to self-regulation is that the system is based on notice and removal of content. Such a regulatory system may not be suitable for content such as online child sexual material, given the manner in which such abuse occurs.

2.1. Online exploitation of children in Malaysia

As a signatory to the United Nations Convention on the Rights of the Child 1989 and the Optional Protocol to the Convention on the Rights of the Child on the Sale of Children, Child Prostitution and Child Pornography, Malaysia defines a "child" as a person under the age of 18 years, as provided in section 2(1) of the Child Act 2001. Although child pornography is an offence in the country, Malaysia had the distinction of having the highest number of IP addresses for uploading and downloading photographs of child pornography in Southeast Asia in 2018.Footnote10

The discovery of the two well-known cases of Richard Huckle and Blake Johnston,Footnote11 with their countless Malaysian child victims abused online for sexual exploitation, brought about changes in the law through the introduction of the Sexual Offences Against Children Act 2017.Footnote12 "Child online exploitation," capturing the broad and different facets of exploitation of children, now includes offences relating to child pornography and related offences as well. In a more recent report in 2022, the Women, Family, and Community Development Minister stated that the sexual exploitation of children online has been on the rise, with more than 100,000 Malaysian Internet Protocol (IP) addresses involved with child pornography over six years.Footnote13 The police revealed that 106,764 IP addresses were recorded between 2017 and August 2022.Footnote14

Another recent survey indicates that, as of January 2022,Footnote15 an estimated 89% of people in Malaysia use social media. This has risen by 43% since 2016, when social media users accounted for only 62% of Malaysia's entire population. WhatsApp had the highest number of users, with a penetration rate of 93.2%, followed by Facebook at 88.7% and Instagram at 79.3%. In addition, according to the Internet user statistics for 2020 provided by the Malaysian Communications and Multimedia Commission, there has been an increase in the number of children between the ages of 5 and 17 using the Internet, a stark increase over the data captured for Internet usage in the country in 2016. The statistics indicate that as the Internet becomes more easily accessible, more children will access it, which undoubtedly leads to issues pertaining to their safety and well-being online. At this stage, therefore, it is sufficient to note that although Malaysia has a framework in place for the online regulation of content, the exploitation of children online appears to be increasing. Hence, it is important to implement best practices and guidelines for regulating content.

2.2. Communications and multimedia act 1998 (Malaysia)

The Communications and Multimedia Act 1998 was brought into force in three phases to “regulate the converging communications and multimedia industries, and for incidental matters.” The CMA ‘98 covers communications over networks, including electronic media, but not print media. The term “content” found under the CMA ‘98 is interpreted as:

Any sound, text, still picture, moving picture or other audio-visual representation, tactile representation, or any combination of the preceding that is capable of being created, manipulated, stored, retrieved, or communicated electronically.Footnote16

Based on the above definition, content regulated under the CMA '98 is confined to electronic, that is, online or digital, platforms.

In terms of the type of content that is permissible, the following is provided:

  1. No content application service provider or other person using a content application service shall provide content that is indecent, obscene, false, menacing, or offensive in character with intent to annoy, abuse, threaten, or harass any person.

  2. A person who contravenes subsection (1) commits an offence and shall, on conviction, be liable to a fine not exceeding 50,000 ringgit or to imprisonment for a term not exceeding one year or to both and shall also be liable to a further fine of 1,000 ringgit for every day or part of a day during which the offence is continued after conviction.Footnote17

Another general provision related to the prohibition of sexual content on the Internet is:

It is an offence if a person knowingly: (1) makes, creates, or solicits communication via network facilities, network service, or applications service, which is obscene, indecent, false, menacing, or offensive in character with intent to annoy, abuse, threaten, or harass another person; or (2) initiates communication using any applications service, whether continuously, repeatedly, or otherwise, during which communication may or may not ensue, with or without disclosing his identity and with intent to annoy, abuse, threaten, or harass any person at any number or electronic address.Footnote18

Sections 211 and 233 are general provisions and extend to a description of offensive content on the Internet; however, they are subject to the court's assessment of whether the content falls under Section 211 or 233. The CMA '98 does not encroach on the scope of other existing laws that regulate traditional media and content services.

2.3. Powers and procedures of the Malaysian communications and multimedia commission

The CMA '98 sets out a new regulatory licensing framework for the communications and multimedia industry and created a new regulatory body, the Malaysian Communications and Multimedia Commission ("MCMC"). It provides a generic set of regulatory provisions and for industry self-regulation, while granting the Minister of Information absolute powers to determine who is allowed to broadcast and the nature of the content to be broadcast. The CMA '98 emphasises self-regulation by the communications and multimedia industry by providing for the creation of industry forums by the industry body designated or appointed by the MCMC.Footnote19 The Communications and Multimedia Content Forum ("CMCF") is an example of an industry forum created under the Act. The management of the CMCF rests with the Chairman and 18 council members elected from the six "ordinary" member categories of advertising, broadcasters, civic groups, content creators/distributors, audiotext hosting service providers, and Internet access service providers.Footnote20 Its role is to govern content and address content-related issues disseminated by electronic media through industry self-regulation in line with a Content Code.Footnote21 The main function of this forum is to formulate and implement voluntary industry codes that serve as guidelines for the industry. The Content Code, revised in 2022, is an example of a code formulated by the CMCF. Since such codes are voluntary in nature, they are not legally binding; non-compliance with their provisions therefore does not in itself amount to a contravention of the CMA '98.

2.4. Content code and guidelines on content regulation

The Content Code was first introduced on 1 September 2004 and was revised on 30 May 2022 as the principal source of industry self-regulation, containing governing standards and best practices for content dissemination within the communications and multimedia industry in Malaysia.Footnote22 Section 95 of the CMA '98 provides for an industry forum to prepare a voluntary industry code dealing with any matter provided for in the Act, on its own initiative or upon request by the MCMC.Footnote23 The Code outlines procedures of self-regulation that provide a platform for creativity, innovation, and the healthy growth of a fast-evolving industry. The Content Code demonstrates a commitment toward self-regulation by the industry in compliance with the CMA '98. It seeks to identify offensive and objectionable content while spelling out the obligations of content providers within the context of the social values of the country. It also provides a guide for content consumers to practise self-regulation.Footnote24

Section 211 of the CMA '98 sets out five categories of Internet content the provision of which is an offence: indecent content, obscene content, false content, menacing content, and offensive content. Section 213 provides as follows:

The Content Code shall detail matters such as restrictions on the provision of unsuitable content; methods of classifying content; procedures for handling public complaints and for reporting information about complaints to the Commission; representation of Malaysian culture and national identity; public information and education regarding content and regulation; technologies for the end-user control of content; and other matters of concern to the community.

Based on these provisions, the Content Code further classifies offensive content into nine categories: Indecent Content, Obscene Content, Violence, Menacing Content, Bad Language, False Content, Children's Content, Family Values, and People with Disabilities. Part 2 of the Content Code defines "Indecent Content" as "material, which is offensive, morally improper, and against current standards of accepted behavior." The depiction of nudity is not allowed, except for non-sexual nudity based on art, information, and/or science; such depictions shall not be excessive or explicit in nature (i.e., not too prolonged, close-up, or gratuitous).Footnote25 "Obscene Content," on the other hand, is understood as "material which gives rise to a feeling of disgust by reason of its lewd portrayal and is essentially offensive to one's prevailing notion of decency and modesty." Such content may have a negative influence and corrupt the minds of those easily influenced. The test of obscenity is whether the content has the tendency to deprave and corrupt those whose minds are open to such communication. However, specific regard must be had to the following:Footnote26

  1. Explicit Sex Acts/Pornography: Any portrayal of sexual activity that a reasonable adult considers explicit and pornographic is prohibited. The portrayal of sex crimes, including rape, attempted rape, or statutory rape, as well as bestiality, is not permitted, including the portrayal of such sexual acts, through animation, and whether consensual or otherwise.

  2. Child Pornography shall have meaning as defined under Part 1 of this Code with specific reference to Section 4 of the Sexual Offences Against Children Act 2017.

  3. Sexual Degradation: The portrayal of any individual as a mere sexual object, or content that demeans, exploits, or discriminates against a person in such a manner, is prohibited.

Part 5 of the Content Code provides Specific Online Guidelines that must be followed by the industry, including the following.

2.4.1. Internet Access Service Providers (“IASP”)

An IASP, which provides access to content, needs to incorporate terms and conditions, contracts, and legal notices as terms of use with its subscribers.Footnote27 These terms and conditions are to be displayed on the IASP's website and made accessible to subscribers via links or other methods. Subscribers must comply with the requirements of Malaysian law, including the Content Code, and are prohibited from providing content that contravenes Malaysian law.

2.4.2. Online Content Hosting Provider (“OCH”)

An Online Content Hosting Provider merely provides access to content that is neither created nor aggregated by itself but is hosted on its facilities. Because the OCH hosts the space where content is stored, it is also incumbent on the OCH to ensure that prohibited content is not stored in that space. The same requirements as for the IASP apply: the OCH must ensure that users and subscribers comply with the requirements of Malaysian law.

2.4.3. Online Content Developers (“OCD”)

An Online Content Developer refers to a Code Subject who develops content files, for itself or on behalf of others, to be made accessible online. The Content Code is silent on specific guidelines for OCDs.

2.4.4. Online Content Aggregators (“OCA”)

An Online Content Aggregator refers to a person who aggregates and/or purchases content. The OCA needs to incorporate terms and conditions in contracts and legal notices binding users, subscribers, and content providers, requiring compliance with Malaysian law and providing for the removal of content should it contravene the guidelines set out above.

2.4.5. Link Providers

A person who provides links to other sites containing prohibited content shall remove the link to such sites within 24 hours of being notified by the Complaints Bureau of the content.

2.4.6. Online Service Providers (“OSP”)

An OSP is a provider of online services or network access, or the operator of facilities therefor, and includes, but is not limited to, Internet service providers, news providers, entertainment providers, and e-government service providers.

2.5. An assessment of the existing guidelines on online communications industry in Malaysia

2.5.1. Content code

As is evident from the above discussion, the CMA '98 and the Content Code do not place any responsibility on content providers to remove offensive content. They merely provide guidelines to be adopted by the industry on notification of unacceptable content and impose no further responsibility on service providers. For instance, if a subscriber or user is providing prohibited content, the Complaints Bureau will notify the IASP, and the IASP is then to identify the user or subscriber for the removal of the content. In the event of non-compliance by a subscriber, the IASP has the right to withdraw access and is entitled to block or remove prohibited content in accordance with the complaint procedure contained in the Code.Footnote28 Similarly, when an OCH is notified by the Complaints Bureau that a user or subscriber is providing prohibited content, the OCH takes the same measures as the IASP. Likewise, for OCDs, since the Content Code is silent on specifics, it is presumed that the guidelines empowering the user to control the nature of online content apply.Footnote29 The responsibility lies with the user, not the OCD. A similar practice applies to OCAs, OSPs, and Link Providers.

Another factor is that although section 8 of Part 5 of the Content Code lays down general and specific measures for the industry, IASPs, OCAs, Link Providers, and OCHs are not required to undertake these measures. They are:

  1. Provide rating systems for Online Content;

  2. Block access by users or subscribers to any material unless directed to do so by the Complaints Bureau acting in accordance with the complaint procedure set out in the Code;

  3. Monitor the activities of users and subscribers; or

  4. Retain data for investigation unless such data retention is rightfully requested by the relevant authorities in accordance with Malaysian law.

The fact that Part 5 section 8 does not require the industry to take any of the actions mentioned therein, for example in (b), to block access by their users unless directed, is detrimental to the regulation of online child exploitation, as it is the IASPs, OCHs, OCDs, OCAs, and OSPs, the content providers in the communications and multimedia industry, that are directly or indirectly in a position to remove content involving child exploitation. This, to a certain extent, goes against the concept of self-regulation, as industry players need to remove content only upon direction from the authority rather than taking proactive measures to remove unsuitable content within their control.

2.5.2. Notice and take down procedures

The Content Code provides for a complaint to be made to the service provider or content provider regarding unsuitable content.Footnote30 If the complaint cannot be resolved between the user and the content provider, it may then be referred to the Complaints Bureau, which is responsible for investigating complaints regarding content in breach of the Code.

The Complaints Bureau shall:

  1. Consider and deal with complaints related to content, as provided in the Code;

  2. Investigate any content that is considered to be in breach of the Code, without a complaint necessarily having been made;

  3. Rule on any dispute arising between members of the Content Forum or between a member and a non-member;

  4. Interpret provisions of the Code when a need arises or a request is made.Footnote31

The complaint must be addressed within two months after it has been deemed valid, not frivolous, and reasonable.Footnote32 A complaint to the Complaints Bureau can also be made by an industry player with regard to non-compliance by another industry player.Footnote33 On receiving a complaint, and prior to adjudication, the Complaints Bureau shall provide the necessary assistance and guidance to the parties involved with the intention of mediating an amicable resolution through mutual consultation. If mediation fails, the Complaints Bureau proceeds to deal with the complaint, convening an inquiry as and when the need arises; it may combine the hearing of two or more complaints into a single inquiry.
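The Bureau's complaint-handling workflow described above lends itself to a simple state model. The following Python sketch is our own illustration, not anything prescribed by the Content Code: the class and function names are invented, the two-month deadline is approximated as 60 days, and the validity assessment is collapsed into a single flag.

```python
from dataclasses import dataclass
from datetime import date, timedelta
from enum import Enum, auto

class Status(Enum):
    RECEIVED = auto()
    DISMISSED = auto()      # frivolous or otherwise invalid
    IN_MEDIATION = auto()   # Bureau mediating an amicable resolution
    IN_INQUIRY = auto()     # mediation failed; Bureau deals with the complaint
    RESOLVED = auto()

@dataclass
class Complaint:
    subject: str
    received: date
    deemed_valid_on: date | None = None
    status: Status = Status.RECEIVED

# The Code requires a valid complaint to be addressed within two months
# of being deemed valid; approximated here as 60 days.
ADDRESS_WITHIN = timedelta(days=60)

def process(complaint: Complaint, valid: bool, mediation_succeeds: bool) -> Complaint:
    """Walk a complaint through the stages described in the Content Code."""
    if not valid:
        complaint.status = Status.DISMISSED
        return complaint
    complaint.deemed_valid_on = complaint.received
    complaint.status = Status.IN_MEDIATION
    if mediation_succeeds:
        complaint.status = Status.RESOLVED
    else:
        # Failing mediation, the Bureau convenes an inquiry as needed
        # (possibly combining several complaints into one inquiry).
        complaint.status = Status.IN_INQUIRY
    return complaint

def is_overdue(complaint: Complaint, today: date) -> bool:
    """True once the two-month window to address a valid complaint lapses."""
    return (complaint.deemed_valid_on is not None
            and complaint.status not in (Status.RESOLVED, Status.DISMISSED)
            and today > complaint.deemed_valid_on + ADDRESS_WITHIN)
```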

An IASP that is notified by the Complaints Bureau that its user or subscriber is providing prohibited content, and that is able to identify that subscriber, will take the following steps, known as the notice and take down procedure (illustrated in the sketch after this list):

  1. Within a period of two working days from the time of notification, inform the subscriber to remove the prohibited content.

  2. Prescribe a period within which its subscriber removes the prohibited content, ranging from one to twenty-four hours from the time of notification.

  3. If the subscriber does not remove the prohibited content within the prescribed period, the IASP is entitled to suspend or terminate the subscriber's access account. Non-compliance allows the Complaints Bureau to impose sanctions, as provided in Part 8 and Section 9 of the Content Code 2022.
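A minimal Python sketch of the take-down timeline above, under our own assumptions: the function name and return strings are illustrative, and "two working days" is simplified to two calendar days.

```python
from datetime import datetime, timedelta

INFORM_WITHIN = timedelta(days=2)    # "two working days", simplified
MIN_WINDOW = timedelta(hours=1)      # prescribed removal period: 1 to 24 hours
MAX_WINDOW = timedelta(hours=24)

def apply_notice_and_take_down(notified_at: datetime,
                               informed_at: datetime,
                               removal_window: timedelta,
                               removed_at: datetime | None) -> str:
    """Trace the steps an IASP follows after the Complaints Bureau
    identifies prohibited content provided by one of its subscribers."""
    # Step 1: inform the subscriber within two working days of notification.
    if informed_at - notified_at > INFORM_WITHIN:
        return "IASP informed subscriber late"
    # Step 2: prescribe a removal period of between 1 and 24 hours.
    if not MIN_WINDOW <= removal_window <= MAX_WINDOW:
        return "invalid prescribed removal period"
    # Step 3: if the content is not removed in time, the IASP may suspend
    # or terminate the account; continued non-compliance exposes the
    # parties to sanctions under Part 8 of the Content Code 2022.
    deadline = informed_at + removal_window
    if removed_at is not None and removed_at <= deadline:
        return "content removed within the prescribed period"
    return "suspend or terminate account; escalate to Complaints Bureau"
```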

Based on the above discussion, the Specific Guidelines demonstrate a clear and concise method of dealing with any type of offensive or unsuitable content on different platforms. The specific definition of the different types of content helps both the industry and subscribers to comply with the requirements. However, the guidelines themselves rely too much on the user to be proactive by complaining to the service or content provider before the content is taken down. The complaint may be referred to the Complaints Bureau if it is not resolved. The fact that the complaint “may be” referred and not “must be” referred to the Complaints Bureau does not guarantee that action pertaining to it will be taken.

Furthermore, the voluntary nature of the guidelines does not impose any proactive action on the industry. Section 98(1) provides that, subject to Section 99, compliance with a registered voluntary industry code shall not be mandatory, and by virtue of Section 98(2), compliance with a registered voluntary industry code shall be a defence against any prosecution, action, or proceeding of any nature, whether in court or otherwise, taken against a person (who is subject to the voluntary industry code) regarding a matter dealt with in that code. As such, the approach to curbing child exploitative material is not holistic. The responsibility of the industry is further eroded by Part 5 section 8.1. This goes against the concept of self-regulation, which is at the core of the Content Code. As gatekeepers of information on their services, the industry must be given the responsibility of removing content, especially child exploitative material. The industries listed above are exactly those content providers who should be held accountable; at present, by omission of the law in place, service providers are not in breach of the provisions of the Content Code.

3. Online content regulation in Australia

In 2015, the Australian Government established the Office of the Children's eSafety Commissioner under the Enhancing Online Safety for Children Act 2015 to help protect Australian children from cyberbullying harm and to assume a national leadership role in online safety for children. In 2017, the government expanded the Commissioner's remit to cover online safety for all Australians, and the office was renamed the Office of the eSafety Commissioner.Footnote34 The aim of the Enhancing Online Safety for Children Act 2015 was to identify and remove online cyberbullying material and to enforce measures to make social media a safer space for children. Australia's new Online Safety Act 2021 builds on these principles by extending them to all citizens and, crucially, demands a high level of private-sector compliance and cooperation to ensure online safety. It also gives the Australian eSafety Commissioner (or eSafety) new powers to require ISPs to block access to child abuse material and to regulate illegal and restricted content irrespective of where it is hosted.Footnote35 In addition, at the state and territory levels, laws such as the Children and Young People Act (2008), Crimes (Child Sex Offenders) Act (2005), Care and Protection of Children Act (2007), and Child Protection Act (1999) all address offences that are now interpreted as applicable to cyberspace.Footnote36

3.1. Online safety act 2021 (Australia)

The Online Safety Act 2021 of Australia came into force on 23 January 2022, setting out a more integrated, expansive, and distinct framework for regulating online safety, especially for children, compared with the previous Enhancing Online Safety for Children Act 2015.Footnote37 The role and scope of the powers of the eSafety Commissioner have also grown since its inception in 2015, when the Enhancing Online Safety for Children Act 2015 was first introduced. The following discussion focuses on the enhanced powers of the eSafety Commissioner and the development of voluntary codes.

3.2. The powers of the eSafety commissioner

The powers of the eSafety Commissioner originate from the OSA 2021, which enhances the Commissioner's ability to act quickly to protect victims of online abuse across its reporting schemes.Footnote38 It gives the Commissioner the authority to compel online service providers to remove seriously harmful content within 24 hours of receiving a formal notice, halving the time previously allowed (although eSafety may extend this period in certain circumstances).Footnote39 The eSafety Commissioner's powers within the purview of the Expectations are as follows,Footnote40 as highlighted in Part 2, section 27 of the OSA 2021Footnote41:

  1. The power to require online service providers to report how they meet any or all of the expectations, either on a non-periodic or periodic basis. The obligation to respond to a reporting notice is enforceable and is backed by civil penalties.

  2. The power to require reporting can apply either to specific providers or, via a legislative instrument (a determination), to a specified class of providers; and

  3. The power to issue statements to provider(s) about compliance and non-compliance with expectations, and publish such statements.

3.3. Industry codes and industry standards

The OSA 2021 also stipulates what the Australian Government now expects of technology companies that operate online services.Footnote42 These expectations are set out in what the OSA 2021 defines as the Basic Online Safety Expectations. The Basic Online Safety Expectations, known as "the Expectations," are a key element of the OSA 2021.Footnote43 They aim to ensure that social media, messaging, gaming, app services, and website providers take reasonable steps to keep Australians safe online.

Part 2 focuses on basic online safety expectations, with Division 2 focusing on expectations regarding safe use. This includes Core and Additional Expectations.

For core expectations, Division 2(1) states: "The provider of the service will take reasonable steps to ensure that end-users are able to use the service in a safe manner." In terms of additional expectations, the section provides: "(2) The provider of the service will take reasonable steps to proactively minimise the extent to which material or activity on the service is unlawful or harmful."

Examples of reasonable steps are given in subsection (3): "Without limiting Subsections (1) or (2), reasonable steps for the purposes of this section could include the following" (the second step, on children's services, is illustrated in a sketch after this list):

  1. Developing and implementing processes to detect, moderate, report, and remove (as applicable) material or activity on a service that is unlawful or harmful.

  2. If a service or a component of a service (such as an online app or game) is targeted at, or being used by, children (the children’s service), ensuring that the default privacy and safety settings of the children’s service are robust and set to the most restrictive level.

  3. Ensuring that persons who are engaged in providing the service, such as the provider's employees or contractors, are trained and are expected to implement and promote online safety.

  4. Continually improving technology and practices relating to the safety of end-users.

  5. Ensuring that assessments of safety risks and impacts are undertaken, and safety review processes are implemented throughout the design, development, deployment, and post-deployment stages for the service.
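As a concrete reading of step 2 above, the sketch below shows what "most restrictive by default" might look like for a children's service. The setting names and levels are our own assumptions; the Act does not prescribe any particular configuration.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PrivacySafetySettings:
    profile_visibility: str      # "public" | "contacts" | "private"
    messages_from: str           # "anyone" | "contacts" | "nobody"
    location_sharing: bool
    discoverable_in_search: bool

# Hypothetical "most restrictive level" for a service targeted at or
# used by children, per expectation (2) above.
MOST_RESTRICTIVE = PrivacySafetySettings(
    profile_visibility="private",
    messages_from="nobody",
    location_sharing=False,
    discoverable_in_search=False,
)

# Looser defaults a general-audience service might choose (illustrative).
GENERAL_DEFAULTS = PrivacySafetySettings(
    profile_visibility="contacts",
    messages_from="contacts",
    location_sharing=False,
    discoverable_in_search=True,
)

def default_settings(is_childrens_service: bool) -> PrivacySafetySettings:
    """Children's services start at the most restrictive level; users
    (or their guardians) may later relax individual settings."""
    return MOST_RESTRICTIVE if is_childrens_service else GENERAL_DEFAULTS
```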

The next expectation is for the provider to consult the Commissioner and refer to the Commissioner's guidance in determining reasonable steps to ensure safe use. Additional expectations require providers to take reasonable steps regarding encrypted services and anonymous accounts, and to consult and cooperate with other service providers to promote safe use. Division 3 focuses on expectations regarding certain materials and activities, and Division 4 on expectations regarding reports and complaints.

OSA 2021 equips the eSafety Commissioner with powers to require reporting from providers to improve transparency and accountability. The eSafety Commissioner can now require online service providers to report how they meet any or all of the expectations. The obligation to respond to a reporting requirement is enforceable and supported by civil penalties and other mechanisms. The eSafety Commissioner can also publish statements regarding the extent to which services meet expectations. The requirements are designed to improve providers’ safety standards, transparency, and accountability.Footnote44

This also extends to industry: Part 9, Division 7 of the OSA 2021, on industry codes and industry standards, delegates the responsibility of ensuring the safety of online users by requiring industries to develop their own codes of practice to regulate illegal and restricted content, such as videos of child sexual abuse or videos promoting terrorism.Footnote45 Part 9, Division 7 of the OSA 2021 is divided into six subdivisions (Sections 132–150) covering all aspects of the industry codes and industry standards with which industries are expected to comply. Among the important provisions are Sections 137 to 160 of the OSA 2021. Section 137 of the OSA 2021, under Subdivision B, sets out the general principle for industry codes and industry standards, stipulating that the Parliament intends the eSafety Commissioner to make reasonable efforts to ensure that, for each section of the online industry, the industry code and the industry standard are registered within the stipulated time.Footnote46 Section 137(2) provides:

The Parliament intends that the Commissioner should make reasonable efforts to ensure that, for each section of the online industry:

  (a) an industry code is registered under this Division within six months of its commencement; and

  (b) an industry standard is registered under this Division within 12 months after the commencement of this Division.

Section 138 provides examples of matters that may be dealt with by industry codes and industry standards,Footnote47 while Section 139Footnote48 addresses the escalation of complaints, setting out the procedures to be followed in dealing with complaints about class 1 or class 2 material provided on social media services, on relevant electronic services, and on designated internet services.

The meaning of Class 1 material is stipulated in Section 106 of the OSA 2021.Footnote49 The classification is made by reference to Australia's National Classification Scheme, a cooperative arrangement between the Australian Government and state and territory governments for the classification of films, computer games, and certain publications.Footnote50 Class 1 material is material that is, or would likely be, refused classification under the National Classification Scheme. It includes material that: "depicts, expresses or otherwise deals with matters of sex, drug misuse or addiction, crime, cruelty, violence or revolting or abhorrent phenomena in such a way that they offend against the standards of morality, decency and propriety generally accepted by reasonable adults to the extent that they should not be classified; describes or depicts in a way that is likely to cause offence to a reasonable adult, a person who is, or appears to be, a child under 18 (whether the person is engaged in sexual activity or not), or promotes, incites or instructs in matters of crime or violence."Footnote51

Class 2 materials, on the other hand, are stipulated in Section 107 of the OSA 2021.Footnote52 These can be classified as either X18+ (or, in the case of publications, category 2 restricted) or R18+ (or, in the case of publications, category 1 restricted) under the National Classification Scheme, being considered inappropriate for general public access and/or for children and young people under 18 years of age. When classifying material into Class 1 or Class 2, the determining factor is the context in which the material is used. The nature and purpose of the material must be considered, including its literary, artistic, or educational merit and whether it serves a medical, legal, social, or scientific purpose.Footnote53 This means that sexual health education content, information about sexuality and gender, or health and safety information about drug use and sex is unlikely to be considered illegal or restricted by the eSafety Commissioner.
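The decision order implied by Sections 106 and 107 can be caricatured as a small rule-based check. The boolean attributes below drastically simplify the statutory tests and the contextual merit assessment; this is an illustration, not a restatement of the law.

```python
from dataclasses import dataclass
from enum import Enum, auto

class ContentClass(Enum):
    CLASS_1 = auto()   # is, or would likely be, refused classification
    CLASS_2 = auto()   # X18+/R18+ (or category 1/2 restricted publications)
    NEITHER = auto()

@dataclass
class Material:
    refused_classification_grounds: bool  # s 106 matters, e.g. depictions of a
                                          # person who is or appears to be under 18
    adult_only: bool                      # suitable only for persons 18+
    bona_fide_purpose: bool               # medical, legal, social, scientific,
                                          # artistic or educational merit

def classify(m: Material) -> ContentClass:
    """Rough ordering only: context and purpose are weighed first, since
    e.g. sexual-health education is unlikely to be treated as restricted."""
    if m.bona_fide_purpose:
        return ContentClass.NEITHER
    if m.refused_classification_grounds:
        return ContentClass.CLASS_1
    if m.adult_only:
        return ContentClass.CLASS_2
    return ContentClass.NEITHER
```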

Section 143 of the OSA 2021, under Subdivision C, deals with compliance with industry codes.Footnote54 In essence, it stipulates that if the eSafety Commissioner is satisfied that an industry code has been contravened, the Commissioner may, by written notice, direct the person to comply with the industry code, with non-compliance attracting a civil penalty of 500 penalty units. Section 144 provides for formal warnings for breaches of industry codes: Section 144(2) states that "the Commissioner may issue a formal warning if the person contravenes an industry code registered under this division."

3.4. An assessment of the existing guidelines on online communications industry in Australia

3.4.1. Enhanced powers of the eSafety commissioner

The main function of the eSafety Commissioner is to ensure that Australians have safer and more positive online experiences. It focuses on three main aspects: protection, prevention, and proactive systemic change.Footnote55 It protects online users through its power to formally direct Australian Internet Service Providers to block violent material.Footnote56 Hence, problems such as illegal online content relating to Child Sexual Abuse Material and Abhorrent Violent Material, which includes content that promotes, incites, or instructs in terrorist acts or other violent crimes; serious cyberbullying directed at Australian children under 18 years old; and image-based abuse, that is, sharing or threatening to share intimate images or videos of a person without their consent, can be resolved effectively. In addition to protection, eSafety works on prevention by informing Australians about online safety risks, how to be safe online, and where to go for help. Beyond raising awareness, its work ranges from education and training to research, early intervention, and harm minimisation. eSafety takes a strategic approach, providing leadership, guidance, and evidence-based resources, and plays a key role in educating Australians to develop critical digital skills and stay safe online.

In this regard, eSafety has introduced its Safety by Design initiative to encourage technology companies to anticipate, detect, and eliminate online risks so that the digital environment is safer and more inclusive. This challenges technology companies to alter their design ethos. For many, it means switching from a "profit at all costs" approach to "moving thoughtfully," investing in risk mitigation at the front end, and embedding user protections from the outset. It also means putting user safety and rights at the centre of the design and development of online products and services, rather than retrofitting safeguards after harm has occurred.Footnote57

3.4.2. Notice and take down procedures

Part 3 of the OSA 2021 focuses on complaints, objections, and investigations (Sections 29–43). Australian children may lodge a complaint about cyberbullying material, and the eSafety Commissioner can act on it by initiating an investigation. Objection notices can also be submitted to the eSafety Commissioner, who will investigate further and may consider issuing a removal notice.Footnote58 In relation to the online content scheme, complaints about Class 1 and Class 2 materials can be made to the eSafety Commissioner pursuant to Section 38 of the OSA 2021. The Commissioner's investigative powers relating to the matters set out in Section 38 are based on Sections 42 and 43 of the Act.Footnote59

Removal and remedial notices for Class 1 and Class 2 materials come under Part 9 of the OSA 2021 (Divisions 2 to 6). Under Division 2, the eSafety Commissioner may give the provider of a service a removal notice if the conditions of Section 109 of the OSA 2021 are met: among others, the material was posted on a social media service, relevant electronic service, or designated internet service; the Commissioner must be satisfied that the material is or was Class 1 material; and the material can be accessed by users in Australia and does not fall within the categories of exemption for Parliamentary, court/tribunal, or inquiry material. If a removal notice is issued, the provider must take reasonable steps to remove the material within 24 hours or such longer period as the Commissioner allows.Footnote60 The same mechanism applies to a removal notice given to a hosting service provider.Footnote61 Non-compliance with a removal notice attracts a civil penalty of 500 penalty units,Footnote62 and the eSafety Commissioner has the power to revoke a removal notice under Section 113 of the OSA 2021.Footnote63 If a breach under Section 109 occurs more than twice within 12 months, the eSafety Commissioner may prepare a statement, publish it on the Commissioner's website, and provide a copy of the statement to the provider of the service.Footnote64
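The removal-notice mechanics just described reduce to a few checks. The following is a sketch under our own naming and a calendar-time simplification; the Act itself governs what counts as compliance.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

CIVIL_PENALTY_UNITS = 500            # penalty for non-compliance (s 109)
DEFAULT_WINDOW = timedelta(hours=24) # Commissioner may allow a longer period

@dataclass
class RemovalNotice:
    issued_at: datetime
    window: timedelta = DEFAULT_WINDOW
    complied_at: datetime | None = None
    revoked: bool = False            # s 113 revocation power

    def breached(self, now: datetime) -> bool:
        """A notice is breached if, past its deadline and not revoked,
        no timely removal has occurred."""
        if self.revoked:
            return False
        deadline = self.issued_at + self.window
        missed = self.complied_at is None or self.complied_at > deadline
        return now > deadline and missed

def may_publish_statement(breaches_in_last_12_months: int) -> bool:
    """More than two s 109 breaches within 12 months lets the Commissioner
    prepare a statement, publish it on the Commissioner's website, and
    give a copy to the provider."""
    return breaches_in_last_12_months > 2
```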

Division 3 focuses on the removal of Class 2 materials; the mechanism works in the same manner as in Division 2 (Sections 114 to 118A). Division 4 concerns remedial notices relating to Class 2 materials, with the same mechanism as in Division 2 (Sections 119 to 123A). Division 5 relates to the Link Deletion Notice, which essentially allows the eSafety Commissioner to issue such a notice to a person who provides an Internet search engine service through whose links end-users in Australia can access Class 1 material.Footnote65 Once a Link Deletion Notice is issued, the provider must cease providing a link to the material within 24 hours, or such longer period as the eSafety Commissioner allows, and must notify the eSafety Commissioner. However, the power of the eSafety Commissioner to issue a Link Deletion Notice is conditional under Section 124(4) of the OSA 2021, which states that the notice may be issued only if the eSafety Commissioner is satisfied that there were two or more occasions during the previous 12-month period on which end-users in Australia were able to access Class 1 material via a link provided by the service.

It must also be noted that such a notice can be given only if, during the previous 12 months, the Commissioner gave one or more removal notices under Sections 109 and 110 in relation to Class 1 material and those notices were not complied with. This shows that the eSafety Commissioner cannot exercise this power arbitrarily; it is conditional to some extent, ensuring that the law does not prejudice providers of Internet search engine services or first offenders. Sections 125 to 127 of the OSA 2021 set out compliance with the Link Deletion Notice, formal warnings, and revocation of the Link Deletion Notice, in terms similar to Division 2 (Sections 111 to 113 of the OSA 2021).
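The preconditions for a Link Deletion Notice, as summarised above, can be expressed as a single predicate. This sketch uses our own parameter names; the statutory wording in Section 124(4) controls.

```python
def may_issue_link_deletion_notice(access_occasions_last_12m: int,
                                   removal_notices_last_12m: int,
                                   any_removal_notice_unremedied: bool) -> bool:
    """Two cumulative conditions distilled from the discussion above:
    (i) on two or more occasions in the previous 12 months, end-users in
        Australia could access Class 1 material via the search service's
        links (s 124(4)); and
    (ii) one or more removal notices under ss 109-110 were given in the
         previous 12 months and were not complied with."""
    repeated_access = access_occasions_last_12m >= 2
    failed_removal = removal_notices_last_12m >= 1 and any_removal_notice_unremedied
    return repeated_access and failed_removal
```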

4. Online content regulation in Singapore

The Broadcasting Act 1994 of Singapore was enacted to regulate the operation and ownership of broadcasting services and apparatus, and for matters connected therewith. The authority responsible for overseeing the duties within the Act is identified in Section 2 as the Info-communications Media Development Authority ("IMDA"), established under Section 3 of the Info-communications Media Development Authority Act 2016. Section 6 of the Broadcasting Act 1994 empowers the IMDA to issue and review Codes of Practice relating to the standards of programmes and advertisements broadcast by licensees and the standards required to be maintained by licensees. On 1 October 2016, the IMDA issued the Internet Code of Practice, which is mandatory: under Code 1(2), IASPs and Internet Content Providers under the Broadcasting (Class Licence) Notification (N1) are required to comply with the Code of Practice.

4.1. Guidelines on content regulation

Under the Internet Code of Practice, Code 4 states that prohibited material is material that is "objectionable" on the grounds of public interest, public morality, public order, public security, or national harmony, or is prohibited by applicable Singapore laws.Footnote66 To determine whether material falls within the prohibited category, seven factors must be considered:

  1. Whether the material depicts nudity or genitalia in a manner calculated to titillate;

  2. Whether the material promotes sexual violence or sexual activity involving coercion or non-consent of any kind;

  3. Whether the material depicts a person or persons clearly engaged in explicit sexual activity;

  4. Whether the material depicts a person who is, or appears to be, under 16 years of age in a sexually provocative manner or in any other offensive manner;

  5. Whether the material advocates homosexuality or lesbianism or depicts or promotes incest, pedophilia, bestiality, and necrophilia;

  6. Whether the material depicts detailed or relished acts of extreme violence or cruelty; and

  7. Whether the material glorifies, incites, or endorses ethnic, racial, or religious hatred, strife, or intolerance.Footnote67

Additional consideration is given to material with intrinsic medical, scientific, artistic, or educational value. The Code of Practice allows licensees to seek assistance from the IMDA when in doubt about the status of content.
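The seven-factor test, together with the merit consideration, maps naturally onto a checklist. The following Python sketch is purely illustrative: the attribute names compress the Clause 4 wording, and a real determination is contextual and may be referred to the IMDA.

```python
from dataclasses import dataclass

@dataclass
class ClauseFourAssessment:
    titillating_nudity: bool            # factor 1
    sexual_violence_or_coercion: bool   # factor 2
    explicit_sexual_activity: bool      # factor 3
    minor_depicted_offensively: bool    # factor 4: person under 16
    prohibited_sexual_themes: bool      # factor 5
    extreme_violence_or_cruelty: bool   # factor 6
    hatred_or_intolerance: bool         # factor 7
    intrinsic_merit: bool               # medical, scientific, artistic
                                        # or educational value

def likely_prohibited(a: ClauseFourAssessment) -> bool:
    """Flag material that engages any Clause 4 factor and lacks intrinsic
    merit; a licensee in doubt may seek the IMDA's assistance."""
    factors = (a.titillating_nudity, a.sexual_violence_or_coercion,
               a.explicit_sexual_activity, a.minor_depicted_offensively,
               a.prohibited_sexual_themes, a.extreme_violence_or_cruelty,
               a.hatred_or_intolerance)
    return any(factors) and not a.intrinsic_merit
```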

The current provisions in the Code of Practice regarding prohibited materials are sufficiently wide to cover child exploitation. The inclusion of the seven factors further enables the IASP, Reseller, and Internet Content Provider to determine the nature of the content and comply with the requirements of the Code.

4.1.1. Specific online guidelines

Clause 3 of the Code of Practice provides specific online guidelines for the IASP, Reseller, and Internet Content Provider, as follows:

4.1.1.1. IASP/Reseller

Clause 3 of the Code provides that an IASP or Reseller discharges its obligations under the Code in relation to programmes on the World Wide Web when it denies access to sites notified by the IMDA as containing prohibited material as defined in Clause 4 of the Code.

4.1.1.2. IASP/Reseller in relation to Newsgroup

Clause 3(2) of the Code provides that an IASP or Reseller discharges its obligations in relation to newsgroups when it (a) refrains from subscribing to a newsgroup if it is of the view that the newsgroup contains prohibited material, and (b) unsubscribes from any newsgroup that the IMDA may direct.

4.1.1.3. Internet Content Provider

Clause 3(3) states that an Internet Content Provider discharges its obligations: (a) in relation to private discussions hosted on its service (chat groups), when the licensee chooses discussion themes that are not prohibited under the guidelines in Clause 4; (b) in relation to programs placed on the licensee’s service by other persons invited to do so for public display (bulletin boards), when the licensee denies access to any contributions containing prohibited material that it discovers in the normal course of exercising its editorial duties or is informed about; and (c) in relation to all other programs on its service, when the licensee ensures that such programs do not include material that would be considered prohibited under the guidelines in Clause 4. Clause 3(4) provides that an Internet Content Provider shall deny access to material considered by the IMDA to be prohibited, if directed to do so by the IMDA.

4.1.1.4. Web Publisher or Web Server

Under Clause 3(5), Paragraph (3) does not apply to any web publisher or web server administrator in respect of programs on his service over which he has no editorial control.Footnote68

Based on the above discussion, the emphasis of the Code of Practice appears to be on how the IASP, Reseller, or Internet Content Provider can discharge its obligations in relation to prohibited content in various ways, such as denying access to content, not subscribing to content, exercising editorial duties, or denying access upon the request of the IMDA. Thus, even though the application of the Code of Practice is mandatory, the absence of any proactive measures that the industry must observe to ensure that posted content complies with the Code weakens its enforcement. The question, then, is what can be done in the event that material posted by the industry does not comply.

4.1.2. Notice and take-down guidelines

The Code of Practice does not provide any specific notice and take-down guidelines where content falls within the prohibited category. The Code requires the IASP, Reseller, or Internet Content Provider to deny access to the content if requested by the IMDA. The basis for such a request is Section 16 of the Broadcasting Act 1994, which allows the IMDA to act against content by prohibiting the broadcasting of the whole or part of it; non-compliance with such a directive is an offence under the Act. Unfortunately, the guidelines on how this action is initiated are not clear, especially as to who may lodge a complaint pertaining to prohibited content.

Since the Broadcasting Act 1994 and the Code of Practice are silent on notice and take-down guidelines, reference can perhaps be made to the existing notice and take-down procedures in the Copyright Act 2021 (Chapter 63).Footnote69 An example of these procedures in operation can be seen at Singtel, one of Asia’s leading communications technology groups, which operates in one of the world’s fastest-growing and most dynamic regions and offers an extensive range of digital and telecommunication services.Footnote70 As an ISP in Singapore, Singtel provides a copyright take-down notice procedure for copyright holders whose work has been infringed. Upon receiving a valid take-down notice, which must be in the form prescribed by the Copyright Act, Singtel will take steps to remove or disable accessFootnote71 to the material in accordance with the Act. Singtel will then inform the person who posted the material, who becomes the “Respondent” if they choose to send a valid counter-notice to Singtel to restore the material. Singtel then assesses whether it is practical to restore the material, unless the copyright owner decides to take the matter to court.

4.2. Online safety (miscellaneous amendments) act 2022

On 3 October 2022,Footnote72 the Government of Singapore introduced the Online Safety (Miscellaneous Amendments) Bill for its first reading. On 9 November 2022, the Bill was passed by the Parliament of Singapore. The new Act amends the Broadcasting Act 1994 and the Electronic Transactions Act 2010 to regulate providers of online communication services.

The Minister for Communications and Information stated that the new Act will address content that did not fall within the ambit of existing legislation, for example, content encouraging suicide and self-harm, videos of impossible physical stunts that viewers may attempt to mimic, and content that exposes children to inappropriate material and unwanted social interaction online.Footnote73 Based on a survey conducted by the Ministry, the top three types of online content from which respondents felt children most needed protection were sexual content, cyberbullying, and violent content.Footnote74 The Act covers content on social media services, which was not covered previously.Footnote75 The new Section 45L, “Online Code of Practice for regulated online communication service”, allows the IMDA to issue Codes in addition to the existing ones, since the Act now includes social media platforms, to which the existing Code of Practice may not be suitable to apply.

4.2.1. Guidelines on content regulation under the 2022 act

The Act introduces five additional types of content: egregious content, heinous sex crimes, public health measures, public health risks, and terrorism. For this discussion, the focus is on egregious content and heinous sex crimes. Under the new Section 45D, “egregious content” refers to content that: (a) advocates or provides instruction on suicide or self-harm; (b) depicts violence or cruelty, or physical abuse of or harm to humans; (c) provides instruction on sexual violence or coercion in association with sexual conduct, regardless of whether it is used to commit a sex crime; (d) depicts, for a sexual purpose, the exploitation or nudity of a child in a manner that a reasonable person would regard as offensive, regardless of whether sexual activity is present; (e) obstructs public health measures or creates a public health risk; (f) stirs hostility or ridicule on matters of race or religion; or (g) relates to terrorism.Footnote76

The Act also introduces, in Section 45D(3), the definition of a heinous sex crime, which includes: (a) sexual assault, rape, sexual assault by penetration, or the grooming of a child; (b) abetment of or a conspiracy to commit an offence in paragraph (a); (c) an attempt to commit an offence specified in paragraph (a) or (b); and (d) conduct outside Singapore that, had it been engaged in within Singapore, would be an offence specified in paragraphs (a) to (c) and punishable by a Singapore court.Footnote77

The Act amends Section 2 of the Broadcasting Act 1994Footnote78 by adding a definition of “content” to mean information or material capable of being communicated by means of a broadcasting service or an electronic service, whether in the form of: (a) text; (b) speech, music or other sounds; (c) visual images (animated or otherwise) or pictorial or graphic form (for example, an anthropomorphic or humanlike depiction); (d) any other form; or (e) any combination of forms. Section 45D(2) states that it is immaterial whether the conduct in question occurs within or outside Singapore.

Therefore, based on the discussion above, the Act adds more types of content falling within the meaning of prohibited content, with the objective of safeguarding children from exposure to inappropriate sexual content and unwanted social interaction online. The content may take various forms, such as text, music, pictures, and any other forms that may exist in the future. Furthermore, the Act also addresses the methods of conveying content through the digital landscape of Singapore, including social media services. The inclusion of social media services as a newly regulated platform is to be applauded, because these platforms are very popular with children owing to the flexibility and room for creativity they afford.

4.2.2. Specific guidelines on content under the 2022 act

Section 45K allows the IMDA to regulate online communication services after considering matters such as the range of communication services available and the extent and nature of the effect that types of online communication services have on Singapore.Footnote79 This information must be adequately conveyed to the public. To regulate online communication services, Section 45L states that the IMDA may issue one or more Online Codes of Practice.Footnote80 Within the ambit of Section 45L, an Online Code of Practice, whether issued or amended, may require providers of regulated online communication services to do all or any of the following: prevent Singapore end-users from accessing content that presents a risk of harm; mitigate and manage the risks of danger from content provided on the service; provide practical guidance or certainty as to what content presents harm to Singapore end-users or to certain types of Singapore end-users; follow the procedures required of a provider of a regulated online communication service to satisfy the duty under Section 45M; conduct audits at their own cost; report to the IMDA, or whenever requested by the authority, on the information and measures taken by the provider; conduct risk assessments and mitigate the risks identified; collaborate and cooperate with research on the service, whether conducted by the provider itself through suitable experts or by experts approved by the authority; and assist in developing the authority’s understanding of the nature, evolution, and severity of systemic online risks related to the functions and use of the service.

The Act further requires a regulated provider to take reasonable and practical steps to comply with the Online Code of Practice, in accordance with Section 45M.Footnote81 In the event of non-compliance, the provider must show that it was not reasonably practicable to comply owing to a lack of means to satisfy the duty. This duty applies regardless of confidentiality, privacy, or any duty imposed by contract or any rule of law, and no civil or criminal liability is incurred for an act or omission done with care and in good faith in compliance with the Code. If the provider fails to comply without any reasonable excuse, the authority may impose a financial penalty not exceeding $1 million or direct the provider, by written notice, to take steps to remedy the failure. Failure to comply with such a direction renders the provider guilty of an offence and liable on conviction to a fine not exceeding $1 million, and to a further fine not exceeding $100,000 for every day or part of a day during which the offence continues.

4.2.3. Notice and take-down procedure under the 2022 act

Section 45H of the Act provides that the IMDA has the power to direct a provider either to disable access to content on its service or to stop the delivery or communication of content to Singapore end-users, if the IMDA is satisfied that the content is of an “egregious” nature.Footnote82 This must be performed within the period specified in the written direction. The direction may require the terminating, altering, or suspending of the services provided to the Singapore end-users concerned, without affecting the service in part or the whole of any area in Singapore. The provision does not apply to user-generated content of a private or domestic nature, and the “egregious” content must be identified by the IMDA to enable the provider to comply.

Section 45IFootnote83 applies to providers of an internet access service: where the IMDA is satisfied that an online communication service that has been given a Section 45HFootnote84 direction has not complied with it, the IMDA may direct the internet access service provider to stop access to that service by Singapore end-users.

The Act clearly identifies the measures that the IMDA can take against providers in the event of non-compliance with the content guidelines. However, the Act does not state which parties may make a report to the IMDA pertaining to prohibited content. Nevertheless, the fact that the Act provides modes of notice and take-down with which providers must comply may indicate that anyone can lodge a report regarding content with the IMDA. This approach ensures that prohibited content will not continue to be conveyed.

5. Guidance from Australia and Singapore as points of reference for best practices in regulating the material in Malaysia

The table below indicates the differences and strengths of the legal frameworks in Malaysia, Australia, and Singapore.

Table 1. Differences and the strengths of the legal frameworks in Malaysia, Australia and Singapore

5.1. Industry self-regulation

The Content Code in Malaysia aims for industry self-regulation, whereas in Australia the eSafety Commissioner reviews the codes developed by a particular industry association or body to determine whether those Industry Codes are sufficient. Under Section 141 of the Online Safety Act (Australia), the Commissioner may request that the association or body further develop the codes. In addition, under Section 145 of the Online Safety Act (Australia), the eSafety Commissioner may determine the codes and industry standards by legislative instrument should there be issues with the codes submitted or non-compliance with a request to develop a code. In terms of applicability, reference can be made to Division 7 of the Act, which essentially covers the sections of the online industry that cater to end-users in Australia, from the installation and use of services to social media.Footnote86

The same holds true in Singapore, where the adoption of the codes is mandatory. The new provisions impose duties on service providers to ensure that prohibited content is inaccessible to the public. Providers must explain any failure to comply with these duties, and the Act makes non-compliance without reasonable excuse an offence. The introduction of an offence demonstrates the seriousness of the authority in curbing exploitative material in Singapore. The Act also encourages providers to self-regulate by conducting their own audits of the measures taken in compliance with the Act, and the new provisions encourage cooperation between the IMDA and providers in analysing future or potential risks. Thus, in terms of regulating content in Malaysia, it is proposed that the better option is for the communications industry to be held responsible for the regulation of content by enabling the specific service providers to develop the codes and by making compliance with the codes mandatory. As discussed above, guidance can be drawn, as a point of reference, from the Basic Online Safety Expectations provided in Australia’s Online Safety Act 2021 together with the creation of its Industry Codes.

5.2. Classification of content

Australia has implemented a comprehensive classification of Class 1 and Class 2 content; the classification system discussed above assists in determining material that is, or is likely to be, refused classification under the National Classification Scheme. In Singapore, the Online Safety (Miscellaneous Amendments) Act 2022 explains the meaning of an “Online Communication Service”, which was previously not defined in the Broadcasting Act 1994. The definition addresses services ranging from SMS, MMS, internet access services, and one-to-one live aural communication to communication between two or more end-users and user-generated content, together with any electronic service prescribed by the Minister, by order in the Gazette, to be an excluded electronic service, provided that the functionalities of the service or user-generated content enable the service. This suggests that such a classification system is better suited to the removal of online child exploitation material, for the purposes of clarity and uniformity in classifying content. Thus, Malaysia should use it as a reference point for the classification of content.

5.3. Notice and take-down

The industry codes in Australia, once registered, require online platforms and service providers to detect and remove illegal content, such as material depicting child sexual abuse or acts of terrorism. They can also place a greater onus on industry to shield children from age-inappropriate content such as pornography. The scheme also allows eSafety to impose industry-wide standards if online service providers cannot reach agreement on the codes or if they develop codes that lack appropriate safeguards. These codes and standards are enforceable by civil penalties and injunctions to ensure that online service providers comply.

In Singapore, the new 2022 Act has also set out more specific guidelines on notice and take-down procedures, with the onus placed on service providers. Hence, it is recommended that Malaysia focus more on such a mechanism and hold all content service providers responsible for removing any child exploitative material from their servers.

5.4. eSafety commissioner

In terms of enforcement, the eSafety Commissioner has greater power in that it may apply for an order from the Federal Court of Australia under Division 9 of the Act: where the Court is satisfied that, within a 12-month period, there have been occasions giving rise to a significant online risk, it may order persons in categories such as providers of social media services, relevant electronic services, designated internet services, and internet carriage services to cease providing the service. It is recommended that the Commission in Malaysia be similarly empowered to direct service providers to remove content, so that the removal of content from servers becomes more effective.

6. Recommendations and concluding remarks

To understand current practices in content regulation, various content application providers were selected on the criterion that they provide services as providers able to remove online child sexual material in Malaysia. To this end, a study of the standard operating procedures followed by platform operators such as TikTok, Instagram, WhatsApp, YouTube, Telegram, Google Play Store, Twitter, Telekom Malaysia, Celcom, Maxis, and the App Store indicates the following:

  1. No consistent age limit for opening accounts.

  2. No common interpretation of what amounts to online exploitation of children or minors, with the exception of Twitter, YouTube, and TikTok.

  3. Almost all platforms do not expressly prohibit users from creating, uploading, or distributing content that facilitates the exploitation or abuse of children; rather, such content is subject to immediate removal under the community guidelines of the platform concerned.

  4. No consistent take-down procedures among the providers; some apps, such as Telegram, have their own mechanisms for reporting illegal material.

  5. Reporting of illegal material is to the parent company.

From the above, it is to be noted that the various platform providers follow different standard operating practices, which are linked to the policies and best practices of their parent companies.

As stated earlier, the Content Code was not enacted as enabling legislation, but merely as a guideline. Industry compliance with the Content Code is therefore not mandatory, as provided in Section 92 of the CMA ‘98. Had the Content Code been enacted as legislation, compliance would undoubtedly be mandatory. Nevertheless, industry compliance with the Content Code can serve as a legal defence against any prosecution, action, or proceeding of any nature, whether in a court or otherwise, as stated in Section 98(2) of the CMA ‘98. According to the CMCF, this “legal defence” could act as a “safeguard” for industry players in secondary liability claims. ISPs and other Code subjects are exposed to legal action owing to the likelihood of their hosting illegal third-party content; most of the time, such content is stored inside their domains without their knowledge. However, this “safeguard” does not grant complete immunity against secondary liability. Nor does the CMA ‘98, which was enacted as legislation, contain any provision granting immunity; hence, it cannot act as a “safety net” for Code subjects in secondary liability claims.

In this regard, it is argued that the extent of the “protection” accorded by Section 98(2) is vague. Section 98 does not provide Code subjects with immunity against civil or criminal liability even where they have adhered to the rules of the Content Code. No Code subject in Malaysia has come forward to challenge this provision in any court of law; hence, no light has been shed on this issue.

In this regard, it is submitted that the Content Code has been slow to “reward” its subjects owing to its non-mandatory status. More protection for Code subjects is thus required to ensure higher confidence among Internet industry members. It is submitted that incentives in the form of exclusion from liability should also be in place to ensure that the industry continues to self-regulate voluntarily; this would also enhance corporate social responsibility among industry members. Despite the growing content risks, the self-regulation framework does nothing significant to reduce children’s exposure to them. This is primarily because of the non-censorship guarantee adopted by the scheme, based on Section 3(3) of the CMA ‘98 and Part 7 of the MSC Bill of Guarantee. One may argue that non-censorship has made self-regulation stronger, in that self-regulation requires self-restraint and self-discipline. However, in the context of reducing children’s exposure to content risks, non-censorship does nothing proactive to assist.

For example, Part 5 of the Content Code does not require ISPs in Malaysia to monitor users. The Code provides: “in acknowledging that in the fast-changing Online environment, it is very often impractical, costly, difficult and ineffective to monitor Content, Code Subjects shall nonetheless fulfil, to the best of their ability the requirements of the Code. (d) Users are responsible for their choice and utilisation of Online Content.” This shows that, despite the promotion of online freedom, the protection of children against exposure to content risks remains crucial. There are, however, exceptions to the above general rule. Section 263 of the CMA ‘98 provides that “a licensee shall use his best endeavour to prevent the network facilities that he owns or provides or the network service, applications service or content applications service that he provides from being used in, or in relation to, the commission of any offence under any law of Malaysia.”

Consequently, Content Providers are only required to filter and monitor online content upon receiving directions from the MCMC, usually in connection with providing assistance in investigations and prosecutions. Unless required by law, providers have no active duty to monitor any prohibited content that passes through their domains. This is confirmed by the “innocent carrier” provisions in Part 5 of the Content Code, which do not place liability on ISPs for hosting prohibited content, since they are merely content conduits.

From the above analysis, the provisions of the Online Safety Act 2021 and the Online Safety (Miscellaneous Amendments) Act 2022 are clear indicators of the need for the law to be kept abreast of the ever-changing threat of online child exploitation in Malaysia. It is therefore timely for the Communications and Multimedia Act 1998 and the Content Code to be amended to reflect the changing times and to place more responsibility on the communications and multimedia industry in Malaysia.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Additional information

Funding

This Research is funded by the Malaysian Communications and Multimedia Commission through the Digital Society Research Grant.

Notes on contributors

Manique A.E. Cooray

Dr. Manique A.E. Cooray is an Associate Professor at the Faculty of Law, Multimedia University. Her main area of research is on cybercrimes, particularly in relation to exploitation of children online.

Iman Syamil Bin Ahmad Rajuhan

Iman Syamil is currently a chambering student at the law firm of Wan Ahmad Ridzuan & Co, Kuala Lumpur.

Wan Nur Addibah Binti Adnan

Wan Nur Addibah Binti Adnan is a Lecturer at the Faculty of Law, Multimedia University. She has experience in litigation, arbitration, and mediation.

Notes

1. * Dr. Manique A.E. Cooray, Associate Professor, Faculty of Law, Multimedia University, Malacca, Malaysia; * Iman Syamil Bin Ahmad Rajuhan, Graduate Student, Faculty of Law, Multimedia University; * Wan Nur Addibah Binti Adnan, Lecturer, Faculty of Law, Multimedia University, Malacca, Malaysia. This Research is funded by the Malaysian Communications and Multimedia Commission (“MCMC”) through the Digital Society Research Grant. Prensky (Citation2001), “Digital Natives, Digital Immigrants Part 2: Do They Really Think Differently?”, On the Horizon, Vol. 9 No. 6, pp. 1–6. https://doi.org/10.1108/10748120110424843, accessed September 27, 2022.

2. Adam Tomison, “Linking Child Abuse and Other Family Violence: Findings From A Case Tracking Study” (Adam, Citation1995) 41 Family Matters 33.

3. Juriah Abdul Jalil, “Combating Child Pornography in Digital Era: Is Malaysian Law Adequate to Meet the Digital Challenge?” (Citation2015) 23 Pertanika Journal of Social Science & Humanities 137–152.

4. Manique Cooray, “Choosing not to be a Target: Protecting Children from harmful and illegal material on the Internet” Impact of Law on Family Institutions, Ed., Norchaya Talib et al, (Kuala Lumpur: University Malaya Press, 2009) pp.117–130.

5. Manique Cooray, “Child Pornography on the Internet: Then and Now”, Children in Malaysia: Selected Contemporary Issues, Ed., Dr. Jal Zabadi Mohd Yusoff (Kuala Lumpur: University of Malaya Press, 2018) 115–125; Dr. Manique Cooray & Dr. Farah Nini Dusuki, “Ensuring Justice for Victims of Child Abuse via Statutory and Administrative Reforms in Malaysia” Mimi Makariah Law Series: The Child Act 2001- Past, Present and Future”. Ed., Jal Zabdi Mohd Yusoff et al, (Kuala Lumpur: University Malaya Press, 2022) 117–130; Manique Cooray, “Choosing not to be a Target: Protecting Children from harmful and illegal material on the Internet” Impact of Law on Family Institutions, Ed., Norchaya Talib et al, (Kuala Lumpur: University Malaya Press, 2009) pp.117–130.

6. Zahidah Zakaria, Fadhilah Abdul Ghani, Nurul Huda Ahmad Razali, Noor Suhani Sulaiman, Nurul Hidayat Mat Nawi, “Liability of Internet Service Provider in Cyber Defamation: An Analysis on Communication and Multimedia Act 1998”.

7. ISP Quick List “List of Internet Service Providers in Malaysia” http://www.ispquicklist.com/Internet-Service-Providers-List-in-Malaysia.aspx accessed 3rd December 2022.

8. Ibid.

9. Computer Crimes Act 1997(Act 563), Digital Signature Act 1997(Act 562), Communication and Multimedia Act 1998 (Act 588), Payment systems Act 2003 (Act 627), Electronic Commerce Act 2006 (Act 658) and the Personal Data Protection Act 2010 (Act 709).

10. “Malaysia Tops in South-East Asia for Online Child Pornography” (January 30, 2018) https://www.straitstimes.com/asia/se-asia/malaysia-tops-in-south-east-asia-for-online-child-pornography last accessed on 24 September 2022.

11. R v Richard Huckle (London Central Criminal Court Old Bailey Unreported 2016, US v. Blake Robert Johnston, 2017, Northern District of California (Unreported) as referred in Nabilah Hani Ahmad Zubaidi, “Monitoring Internet Child Pornography (ICP) in Malaysia”, (2021) 29 (S2) Pertanika J. Soc. Sci. & Hum. 185–203, 186.

12. Ibid.

13. Shah A, “Rina: 106,764 Malaysian IP Addresses Involved in Child Porn” The Star (September 4, 2022) https://www.thestar.com.my/news/nation/2022/09/04/rina−106,000-malaysian-ip-addresses-involved-in-child-porn accessed 12 September 2022

14. Ibid.

15. Digital Business Lab (July 27, 2022) https://digital-business-lab.com/2022/07/%E2%91%A1-social-media-penetration-in-malaysia-research/ accessed September 27, 2022

16. Section 6, CMA ’98.

17. Section 211 of the CMA ‘98.

18. Section 233 of the CMA ‘98.

19. Section 94 of the Communications and Multimedia Act 1998.

20. Dr. Siti Zabedah Mohd Shariff, Rohayu Kosmin, Regulating Content in Broadcasting, Media and the Internet: A Case Study on Public Understanding of their Role on Self-Regulation, International Journal of Humanities and Social Science Vol. 2 No. 23; December (Siti Zabedah & Kosmin, Citation2012), 140–150.

21. Ammar Abdullah Saeed Mohammed, Nazli Ismail Nawang, Offensive Content on The Internet: The Malaysian Legal Approach, International Journal of Innovation, Creativity and Change, Volume 5, Issue 2, Special Edition (Ammar Abdullah & Nazli, Citation2019), 367–377, www.ijicc.net.

22. Dr. Siti Zabedah Mohd Shariff, Rohayu Kosmin, Regulating Content in Broadcasting, Media and the Internet: A Case Study on Public Understanding of their Role on Self-Regulation, International Journal of Humanities and Social Science Vol. 2 No. 23; December (Siti Zabedah & Kosmin, Citation2012), 140–150.

23. Section 95 CMA ’98.

24. Communications and multimedia content forum “What is the Content Code” https://contentforum.my/contentcode22/#:~:text=What%20is%20the%20Content%20Code,and%20multimedia%20industry%20in%20Malaysia. accessed on 21 November 2022.

25. Content Code 2022 Part 2 Guidelines on Content.

26. Content Code 2022 Part 2, Section 3.

27. Content Code 2022 Part 1 Section 5.

28. Content Code 2022 Part 5: Specific Online Guidelines Section 7.1.

29. Content Code Part 5 Section 6 “General Measures”

30. Content Code 2022 Part 1 Section 5.

31. Content Code 2022 Part 8, Section 3 “Complaints Bureau.”

32. Content Code 2022 Part 8, Section 3.3.

33. Content Code 2022 Part 8, Section 5 “Procedure for Industry Complaints.”

34. Australian Government “Reviews of the Enhancing Online Safety Act 2015 and the Online Content Scheme- discussion paper” (Department of Communications and The Art, June 2018).https://www.infrastructure.gov.au/sites/default/files/consultation/pdf/reviews-enhancing-online-safety-act-2015-and-online-content-scheme-discussion-paper-mk4.pdf accessed 7 December 2022

35. Observer Research Foundation “Promoting Child Safety Online in the Time of COVID-19: The Indian and Australian Responses” (Issue Brief, Issue No. 557, June 2022).https://www.orfonline.org/wp-content/uploads/2022/06/ORF_IssueBrief_557_ChildSafetyOnline.pdf accessed 7 December 2022.

36. Anirban Sarma “Australia: Demonstrating leadership in the space of online child safety” (Raisina Debates, 20 September 2022) < https://www.orfonline.org/expert-speak/australia-demonstrating-leadership-in-the-space-of-online-child-safety/> accessed 7 December 2022.

37. Julie Chandra “Online Safety Act 2021” [2022] Communications Law and Policy in Australia, accessed 17th November 2022.

38. eSafety Commissioner “Our Legislative Functions” https://www.esafety.gov.au/about-us/who-we-are/our-legislative-functions accessed on 3rd December 2022.

39. Online Safety Act 2021 Sections 77 “Removal notice given to the provider of a social media service, relevant electronic service or designated internet service”

40. Australian Government eSafety Commissioner “Regulatory Schemes”.

41. Online Safety Act 2021 Part 2 Section 27 “Functions of the Commissioner”.

42. Australian Government eSafety Commissioner “Online Safety Act 2021 Fact Sheet” https://www.esafety.gov.au/sites/default/files/2021–07/Online%20Safety%20Act%20-%20Fact%20sheet.pdf accessed 17th November 2022.

43. Australian Government eSafety Commissioner “Basic Online Safety Expectations” https://www.esafety.gov.au/industry/basic-online-safety-expectations accessed 17th November 2022

44. Australian Government eSafety Commissioner “Basic Online Safety Expectations” https://www.esafety.gov.au/industry/basic-online-safety-expectations accessed 17th November 2022

45. Online Safety Act 2021 Part 9, Division 7 Industry codes and industry standards.

46. Online Safety Act 2021 Section 137.

47. Online Safety Act 2021 Section 138 Examples of matters that may be dealt with by industry codes and industry standards.

48. Online Safety Act 2021 Section 139.

49. Online Safety Act 2021 Section 106 Class 1 material.

50. eSafety Commissioner “Online Content Scheme Regulatory Guidance December 2021 eSC RG 4” https://www.esafety.gov.au/sites/default/files/2021–12/eSafety-Online-Content-Scheme.pdf accessed 3rd December 2022.

51. Online Safety Act Section 107 Class 2 Material: “if the material were to be classified by the Classification Board in a corresponding way to the way in which a film would be classified under the Classification (Publications, Films and Computer Games) Act 1995, the material would be likely to be classified as X 18+”.

52. Online Safety Act 2021, Section 107, p. 93–95 “Class 2 material”

53. Australian Government eSafety Commissioner “Illegal and Restricted Online Content”.https://www.esafety.gov.au/key-issues/Illegal-restricted-content accessed 27th September 2022.

54. Online Safety Act 2021 Section 143.

55. Australian Government “Overview of eSafety’s role and functions” (eSafety Commissioner, July 2021).< https://www.esafety.gov.au/sites/default/files/2021–07/Overview%20of%20role%20and%20functions_0.pdf> accessed 7 December 2022

56. Ibid.

57. Ibid.

58. Section 35 of the Online Safety Act 2021, p 36.

59. Section 38 of the Online Safety Act 2021, p 40.

60. Section 109 of the Online Safety Act 2021, p 98.

61. Section 110 of the Online Safety Act 2021, p 98–99.

62. Victoria Legal Aid. (n.d.). retrieved November 3, 2022, from https://www.legalaid.vic.gov.au/penalty-units

63. Section 113 of Online Safety Act 2021, p 99–100.

64. Section 113A of Online Safety Act 2021, p 100.

65. Section 124 of Online Safety Act 2021, p 108.

66. Internet Code of Practice code 4 Prohibited Material (1) Prohibited material is material that is objectionable on the grounds of public interest, public morality, public order, public security, national harmony, or is otherwise prohibited by applicable Singapore laws.

67. Internet Code of Practice Code 4 Prohibited Material.

68. Internet Code of Practice Clause 3(5): Paragraph (3) does not apply to any web publisher or web server administrator in respect of programmes on his service for which he has no editorial control.

69. Copyright Act 2021, Section 193B “Transmission, routing and provision of connections”,193C “System caching”, 193 DA “Storage and information location” and 193DE “Regulations”.

70. Singtel “Our Company” https://www.singtel.com/about-us/company accessed 21 November 2022

71. Singtel “Copyright Act Notification” https://www.singtel.com/copyright accessed 19 November 2022

72. Online Safety (Miscellaneous Amendments) Bill No.28/2022.

73. Speech by Mrs Josephine Teo, Minister for Communications and Information at the Second Reading of the Online Safety (Miscellaneous Amendments) Bill, on 8 November 2022, at https://www.mci.gov.sg/pressroom/news-and-stories/pressroom/2022/11/speech-by-minister-for-communications-and-information-mrs-josephine-teo-at-the-second-reading-of-the-online-safety-(miscellaneous-amendments)-bill?page=2, accessed on 19 November, 2022.

74. Ibid.

75. Ibid.

76. Online Safety (Miscellaneous Amendments) Bill No. 28/2022. Clause 45D “Egregious Content”

77. Online Safety (Miscellaneous Amendments) Bill No. 28/2022. Clause 45D(3).

78. Online Safety (Miscellaneous Amendments) Bill No.28/2022 New Sections 2A,2B,2C and 2D.

79. Online Safety (Miscellaneous Amendments) Bill No.28/2022 Clause 45K Regulated online communication service.

80. Online Safety (Miscellaneous Amendments) Bill No.28/2022 Clause 45 L Online Code of Practice for regulated online communication service.

81. Online Safety (Miscellaneous Amendments) Bill No.28/2022 Clause 45 M Duty of regulated online communication service provider.

82. Online Safety (Miscellaneous Amendments) Bill No.28/2022 Clause 45 H direction.

83. Online Safety (Miscellaneous Amendments) Bill No.28/2022 Section 45I.

84. Online Safety (Miscellaneous Amendments) Bill No.28/2022 Clause 45 H.

85. A fund established by section 81 of the Constitution, consisting of all revenues and moneys raised or received by the executive government of the Commonwealth. The CRF is self-executing in nature, which means that all money received by the Commonwealth automatically forms part of the CRF. Retrieved from Australian Government Department of Finance “Consolidated Revenue Fund” (PGPA Glossary, 7 August 2019) <https://www.finance.gov.au/about-us/glossary/pgpa/term-consolidated-revenue-fund-crf> accessed 18th April 2023.

86. Section 134 of Online Safety Act 2021, p. 113.

References

  • Adam, T. (1995). Linking child abuse and other family violence: Findings from a case tracking study. Family Matters, 41, 33.
  • Ammar Abdullah, S. M., & Nazli, I. N. (2019). Offensive content on the internet: The Malaysian legal approach. International Journal of Innovation, Creativity and Change, 5(2), Special Edition, 367–377. www.ijicc.net
  • Juriah, A. J. (2015). Combating child pornography in the digital era: Is Malaysian law adequate to meet the digital challenge? Pertanika Journal of Social Science and Humanities, 23, 137–152.
  • Prensky, M. (2001). Digital natives, digital immigrants part 2: Do they really think differently? On the Horizon, 9(6), 1–6. Retrieved September 27, 2022. https://doi.org/10.1108/10748120110424843.
  • Siti Zabedah, M. S., & Kosmin, R. (2012, December). Regulating content in broadcasting, media and the internet: A case study on public understanding of their role on self-regulation. International Journal of Humanities and Social Science, 2(23), 140–150.