Closing Speech by Minister Josephine at the 2nd Reading of the Online Safety (Misc) Bill
09 NOV 2022
Introduction
1. Mr Deputy Speaker, I thank Members for their interest in the Bill. All 16 Members who spoke have given their support, reflecting the broad consensus on the need for, and timeliness of, these proposals.
2. Members raised many useful points which I will address.
Who the Bill will/will not cover
3. Let me start with clarifications on the types of services that the Bill will cover.
4. Ms Tin Pei Ling, Mr Louis Ng, and Mr Saktiandi Supaat asked what other types of services, besides Social Media Services, may be specified in the Schedule of Online Communication Services, or OCS.
5. Dr Shahira Abdullah wanted to know how IMDA will decide which service providers to designate. She and Ms Tin also asked about updating our regulations to keep in step with new technologies.
a. Like many Singaporeans we engaged, Members acknowledge the fast pace of change in the online landscape.
b. We are therefore committed to updating our laws and regulations as frequently as necessary to keep them relevant and effective.
6. In terms of types of services, we will prioritise those that are more widely used in Singapore, and where the safety risks have become, or are becoming, apparent.
a. IMDA will use various data sources on user trends in Singapore to aid these assessments.
b. The Government is actively studying several areas, but I seek Members’ understanding that it can be counter-productive to discuss them prematurely.
c. Let us instead better understand and characterise the issues, taking reference from regulatory attempts elsewhere, before moving to design a suitable set of interventions for Singapore.
d. For example, Mr Melvin Yong, Mr Alex Yam, Mr Gerald Giam and Mr Mark Chay asked about online gaming, though the Bill currently covers only social media services. We share their concerns about online gaming. We have been thinking about it and will share more details when ready.
7. Within each specified OCS, which entities to designate will depend on how much reach or impact they have with Singapore users.
8. IMDA will consult services before designating them under the Bill, to ensure that designated services are clear on the requirements and are given the opportunity to provide input on the proposals laid out by IMDA.
a. Mr Zhulkarnain Abdul Rahim, Mr Saktiandi Supaat and Mr Leon Perera asked about the consultation process.
b. Details of how IMDA will work closely with the designated services will be set out. Having built constructive relationships with many of these services over the years, we are confident the processes will be robust.
c. The list of services to be designated eventually will be published by IMDA.
9. Ms Nadia Samdin, Mr Alex Yam, and Ms Tin Pei Ling asked why private communications have been excluded.
a. The short answer is that there are legitimate privacy concerns, which Mr Gerald Giam also shares. But users are not without recourse.
b. IMDA’s draft Code of Practice for Online Safety will require designated Social Media Services to provide easily accessible user reporting mechanisms throughout their services.
c. If individuals encounter harmful messages or unwanted interactions in private messages when using these social media services, they could block the sender or report the sender to the service.
10. While we do not intend to police private communications, we are aware that there are groups with very large memberships which could be used to propagate egregious content, making them no different from non-private communications. In such instances, IMDA will be empowered to take the same actions against them.
a. Mr Louis Ng and Mr Zhulkarnain Rahim asked about the specific factors for determining whether communications are private, which could shield such services from having to comply with IMDA’s protective measures. Mr Mark Chay asked about messaging platforms.
b. Labelling a group or communications as private does not make it so.
c. The Bill sets out a list of factors that must be considered collectively. For example, it may be possible to conclude that a social media group is public even if it has been set to “private” and requires the owner’s permission before one can access the content, if the owner is indiscriminate in granting that access. An illustrative sketch follows after this list.
d. We will continue to study this issue closely with other agencies, industry, and international partners.
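To make concrete how several factors might be weighed collectively, rather than relying on a label alone, here is a minimal sketch in Python. The factor names, the member threshold, and the decision logic are hypothetical illustrations for discussion, not the statutory factors in the Bill or IMDA’s actual test.

```python
# Hypothetical sketch: weighing factors collectively to decide whether a
# group is effectively public. Factor names and the member threshold are
# illustrative assumptions, not the Bill's statutory factors.
from dataclasses import dataclass

@dataclass
class GroupProfile:
    labelled_private: bool        # the group's own label
    member_count: int             # very large groups resemble broadcasts
    requires_owner_approval: bool
    approval_is_selective: bool   # does the owner actually vet joiners?

def is_effectively_public(g: GroupProfile) -> bool:
    # A very large membership makes the group function like open media.
    if g.member_count >= 10_000:  # illustrative threshold
        return True
    # Labelled "private" with approval required, but approval granted
    # indiscriminately, is treated as public in substance.
    if g.requires_owner_approval and not g.approval_is_selective:
        return True
    # Otherwise, an unlabelled, open-access group is also public.
    return not g.labelled_private and not g.requires_owner_approval

# Example: a group set to "private" whose owner admits anyone who asks.
print(is_effectively_public(GroupProfile(True, 500, True, False)))  # True
```

The point of the sketch is simply that no single factor, least of all the label, is decisive; the conclusion comes from the combination.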
Types of content the Bill will/will not cover
11. Next, on what type of content the Bill will, or will not, address.
12. Mr Zhulkarnain Rahim asked whether drug abuse and other illegal activities will be covered. Mr Saktiandi Supaat highlighted a particular area in the online domain that is of growing concern to many users – scams.
a. Under “egregious content” as defined in the Bill, content that may pose a risk to public health will be covered. Depending on the facts of the case, this may include drug-related content.
b. IMDA’s Code of Practice for Online Safety also requires services to apply content moderation systems to vice and organised crime, including fraud and scam content.
13. Mr Gerald Giam and Mr Leon Perera asked whether the Bill would cover non-consensual sharing of intimate images, and Mr Louis Ng asked why content “likely to cause feelings of enmity, hatred, ill will or hostility” is applied only to racial and religious groups, and not other demographic segments such as gender.
14. To a large extent, the kinds of problematic content the Members have in mind will already be covered within the Bill.
a. Content that advocates or provides instruction on violence, including sexual violence against individuals, will be covered.
b. IMDA’s draft Code of Practice for Online Safety will require services to assess and act on cyberbullying, including content that is likely to cause harassment, alarm or distress to the user – a need that Mr Leon Perera, Mr Darryl David and Mr Melvin Yong also emphasised.
c. For cases of harassment, there may also be recourse under laws such as the Protection from Harassment Act (POHA). I will say more about this later.
15. To Mr Louis Ng’s question on providing more details of “harmful content” under the Code – IMDA has issued a set of draft Guidelines giving examples of the content covered, which will be finalised together with the Code.
16. Mr Leon Perera spoke at length about the problem of loot boxes in online gaming. Mr Mark Chay also raised this issue.
a. This matter falls under the Gambling Control Act, but since Members raised it, I will briefly address it.
b. The Government recognises the potential harms of loot boxes.
c. This is why we made significant updates to the Gambling Control Act earlier this year to ensure that our laws are able to address emerging trends and products such as in-game loot boxes, which are monitored by the Gambling Regulatory Authority.
d. I invite Members to file Parliamentary Questions if they wish to discuss this issue in greater detail.
17. Mr Saktiandi Supaat also raised queries on content such as lifestyles that go against traditional norms of society, participation in foreign armed conflicts, animal cruelty, and commercialised nudity.
18. Should we go beyond concerns over the safety of individuals and communities, to cover other types of content at this juncture?
a. This would pose the same problem as attempting to cover other types of services prematurely.
b. The Bill would become unwieldy, our proposals would lack focus, and the results would likely be ineffective.
19. Ms Nadia Samdin asked if we had considered consolidating all online-related harms into the Online Safety (Miscellaneous Amendments) Bill.
20. Our approach has been to identify and address specific areas of harm in a targeted manner.
a. As to whether the laws will be consolidated later, that remains to be seen. At this time, it is more important that we put in place legislation that effectively addresses and combats the respective harms.
b. For example, at the Committee of Supply debates this year, the Ministry of Home Affairs (MHA) announced that it was studying potential levers to deal with criminal offences committed online. Work is in progress. These levers are envisioned to complement the provisions under the Online Safety (Miscellaneous Amendments) Bill.
Who decides what content is covered
21. This leads me to questions raised by quite a few Members on how the types or thresholds of harmful or egregious content are determined, and whether a committee or deliberative body could be set up to formulate or review these thresholds.
22. The Government consulted various stakeholders, including parents, community groups and industry representatives, in arriving at the proposals in the Bill.
23. Egregious content can take many forms and exist in grey areas which can be difficult to define clearly. A case in point is Ms Nadia Samdin’s example of online forums for users to share their experiences with each other to deal with depression and anxiety, and to provide mutual support.
24. When assessing whether a piece of content is harmful or egregious, IMDA will take an objective approach, considering the context in which it is presented.
a. If such content is educational in nature, or helps users to overcome these harms, naturally it will not be considered harmful or egregious.
b. On the other hand, social media trends or challenges may sometimes appear innocuous – such as the “Milk Crate” challenge Mr Darryl David had mentioned.
c. But if they result in harm to users, such as by advocating or providing instructions on self-harm or suicide, they would be considered harmful.
25. If the concern is whether individual Social Media Services have done enough to curb exposure to harmful content, the Government will continue to consult widely across society and share the feedback with the companies. In other words, we want to hold the mirror up to them, so that they know what our society’s expectations are and can make adjustments accordingly.
26. When urgent action is needed, such as to remove offensive content that advocates violence towards certain communities or could cause serious injuries to them, IMDA must be able to act fast. In such situations, consultations with stakeholders are better done as part of after-action review.
27. To Mr Gerald Giam’s question, if services are aggrieved by IMDA’s regulatory decisions, they can appeal to the Minister. And the Minister’s decision can also be challenged on judicial review.
a. Mr Gerald Giam and Mr Leon Perera sought assurances that the Bill will not be used to curtail democratic rights or freedom of expression.
b. I stated in my opening address that IMDA does not have unfettered ability to issue new Codes. The Bill clearly sets out the purposes for which IMDA can issue these Codes, which is recorded in Hansard.
c. I would also like to remind Members of the overarching purpose of the Bill – to provide a safe environment and conditions that protect online users, while respecting freedom of speech and expression as enshrined in Article 14 of the Constitution.
28. Let me also address a specific area that Mr Gerald Giam raised – the provisions on journalistic content in the UK’s draft Online Safety Bill.
a. I thank Mr Giam for his support of the Bill – I mean the Singapore Bill, not the UK Bill – and for his suggestion that Singapore mirror the UK proposal. We are always watching developments internationally and considering what would be useful in our context. I will make three brief points on Mr Giam’s suggestion.
b. First, the draft Bill in the UK has not been passed into law. The draft provisions have been through several revisions and are far from final. So, whether this part goes in eventually, that remains to be seen.
c. Second, without going into detail, there have been criticisms that the provisions on journalistic content may be exploited by bad actors. It could inadvertently allow anyone, under the guise of being a “citizen journalist”, to communicate egregious content and expose users to harm.
d. Third, this Bill is about online safety. It has no interest in curbing legitimate journalistic content.
How will the provisions under the Bill be enforced
29. This brings me to my next point on enforcement, which several Members have raised.
30. I will explain the enforcement measures that the Bill provides for at each stage, and how these relate to the online service providers.
a. IMDA will first assess if there are instances of non-compliance, either with the Code’s requirements, or with directions issued by IMDA. It does not matter whether there are management changes within the companies. Accountability resides with the legal entities.
b. Where there is non-compliance, in general, IMDA will engage the Services to understand their reasons. This includes Services that do not have a corporate presence in Singapore.
31. Thereafter, if there is no meaningful response or mutually acceptable solution, and IMDA finds the Services to still be in breach of their obligations, measures such as financial penalties will be considered.
a. Mr Desmond Choo asked if the penalties for non-compliance are too low to have sufficient impact or deterrence.
b. The financial penalty quantum is comparable with that under other local legislation covering Social Media Services, such as the Foreign Interference (Countermeasures) Act and the Protection from Online Falsehoods and Manipulation Act.
c. Services will also face reputational damage. Imagine a service that is consistently found to be in breach and is regularly issued penalties by IMDA over a period of time. This will be known to the public, and users can then decide for themselves whether to continue using the service. So, I think the reputational damage also has to be considered.
32. In the event that these still fail to address our serious concerns, IMDA may then issue a blocking direction to Internet Access Service Providers, to stop Singapore users from accessing these services.
a. But to Mr Zhulkarnain Rahim’s question, the purpose of section 45H(2)(b) is to ensure that this happens only if the platform has refused to comply with IMDA’s direction.
b. This reflects our proportionate approach towards regulating content.
33. To Dr Shahira Abdullah’s question regarding the details of a blocking direction, such as its duration, this will depend on the individual case.
a. Suffice it to say that it is a measure IMDA will not take lightly.
b. But IMDA’s resolve in protecting Singaporeans’ interests should not be tested.
34. Let me also address various technical questions from Members.
35. Mr Gan Thiam Poh asked how “Singapore users” will be determined. The OCS providers will typically have geolocation data indicating whether a user accesses the service from Singapore. This is common practice; a simple sketch follows after this point.
a. Mr Gan Thiam Poh and Mr Melvin Yong also asked how the Government would ensure that Singapore users are not exposed to harmful content given the use of VPNs.
b. Just like fire codes cannot prevent people from playing with fire, neither can we shield people completely if they intentionally seek out harmful content online.
c. Parents have a role to play, as do individuals themselves and our wider society, in being aware and vigilant.
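Here is the sketch mentioned above: a minimal illustration, in Python, of how a provider might classify a “Singapore user” from request metadata. The in-memory table stands in for a commercial IP-geolocation database; the function names and sample addresses (drawn from documentation ranges) are hypothetical assumptions, not any provider’s actual implementation.

```python
# Hypothetical sketch: classifying a "Singapore user" via IP geolocation.
# A real service would query a commercial geolocation database; the
# in-memory table below is a stand-in, using documentation-range IPs.

GEO_DB = {
    "203.0.113.7": "SG",    # sample entries only
    "198.51.100.9": "US",
}

def lookup_country(ip_address: str) -> str:
    """Stand-in for an IP-geolocation database lookup."""
    return GEO_DB.get(ip_address, "UNKNOWN")

def is_singapore_user(ip_address: str, declared_country: str = "") -> bool:
    # Geolocation is the primary signal; a self-declared country can
    # corroborate it. As noted above, VPN use can defeat this check.
    return lookup_country(ip_address) == "SG" or declared_country == "SG"

print(is_singapore_user("203.0.113.7"))    # True: IP geolocates to Singapore
print(is_singapore_user("198.51.100.9"))   # False
```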
36. Mr Zhulkarnain Rahim asked what we mean by “reasonably practicable” steps taken by the OCS to comply with IMDA’s direction. This requires the balancing of various considerations, such as the technology that is available to implement that direction. So, we will have to look into the details.
37. Mr Zhulkarnain Rahim also asked about the proposed section 45J(2).
a. This provision ensures that compliance with IMDA’s directions does not cause a service provider to incur liability in Singapore, if, for example, the content creator takes issue with it.
b. IMDA’s concern is to protect users in Singapore and this Bill only requires action against content accessible in Singapore. Thus, this provision naturally only insulates against liability under Singapore law.
c. Since our measures are also proportionate to the harm and consistent with leading jurisdictions, it is unlikely that the service providers will attract liability elsewhere for complying with IMDA’s directions in Singapore.
d. But we will monitor international developments and keep in mind his suggestions on reciprocal immunity.
38. Mr Zhulkarnain Rahim, Mr Alex Yam, Mr Saktiandi Supaat, as well as Mr Gerald Giam asked who will enforce the Bill, whether a dedicated new body such as an eSafety Commissioner will be set up, and whether the respective Government teams are sufficiently resourced.
39. I thank them for looking out for the teams working behind the scenes on online safety, including their mental well-being, as highlighted by Dr Shahira Abdullah.
a. As I mentioned above, compliance assessments will be undertaken by IMDA, which has both the experience and expertise in performing this role.
b. If egregious content is flagged to IMDA, and IMDA assesses there is a need to act, action will be taken.
c. All this will be a lot of work, but we will periodically review our resourcing to ensure that the team is able to carry out its responsibilities fully and effectively. And here, I notice that my colleague from the Ministry of Finance is right behind me. I am sure we have the Ministry’s support if more resources are needed.
Empowerment of individual users
40. Members have asked how individual users can provide feedback about problematic content or non-compliance.
41. I agree with Mr Saktiandi Supaat that users are effectively a wider pool of eyes who can help to identify and flag problematic content. Mr Alex Yam is also right to remind us that users must play a role in policing harms they may come across.
42. Users are indeed our first line of defence. This is why we expect social media services to take user reports seriously, and to ensure that their systems and processes are sufficiently robust.
43. Under IMDA’s draft Code of Practice for Online Safety, designated services will be required to provide effective, transparent, easy-to-access, and easy-to-use reporting mechanisms to all individuals.
44. This is a more effective way to tackle voluminous online content at source.
a. In turn, users expect that social media services assess their reports and take appropriate action in a timely and diligent manner.
b. Services will be required to include information on these actions in their annual reports.
c. With this information, IMDA will be able to assess the adequacy of the Service’s measures.
d. Audits may also be undertaken to ensure compliance.
45. Mr Gerald Giam and, I believe, also Mr Saktiandi Supaat asked for social media services to submit reports more frequently than annually, to establish the services’ effectiveness in acting on user reports. As a start, IMDA intends for the reports to be submitted annually, but this can be reviewed later on.
Timelines to act on directions and user reports
46. Given the speed at which harmful or egregious content can be amplified and spread online, the speed of action must be proportionate to the potential harm of the content identified.
47. Members asked about the timelines for services to act on directions issued by IMDA or to respond to user reports.
48. IMDA’s directions will stipulate a specific timeline for disabling access. For egregious content that could cause serious harm, the timeline would generally be within hours.
49. IMDA will also require social media services to act on user reports in a timely and diligent manner that is proportionate to the severity of the potential harm. In particular, timelines must be expedited for content and activity related to terrorism.
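As a simple illustration of this proportionality principle, the sketch below maps an assessed severity to an action deadline. The category names and hour values are hypothetical placeholders; beyond egregious content generally being dealt with “within hours”, neither the Bill nor the Code prescribes these specific figures.

```python
# Hypothetical sketch: response deadlines proportionate to assessed harm.
# Category names and hour values are illustrative placeholders only.
from datetime import datetime, timedelta

ACTION_WINDOWS = {
    "terrorism": timedelta(hours=1),   # expedited, per the Code's intent
    "egregious": timedelta(hours=6),   # "within hours" for serious harm
    "harmful":   timedelta(hours=24),
    "other":     timedelta(days=7),
}

def action_deadline(severity: str, reported_at: datetime) -> datetime:
    """Time by which the service should have acted on a report or direction."""
    window = ACTION_WINDOWS.get(severity, ACTION_WINDOWS["other"])
    return reported_at + window

print(action_deadline("terrorism", datetime(2022, 11, 9, 12, 0)))
# 2022-11-09 13:00:00
```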
Protection for young users
50. Members have expressed concerns about the impact of harmful online content on young users. Ms Janet Ang and Mr Gerald Giam raised the need to leverage technology to combat harmful online content, including through setting default content restriction settings for young users. We understand and share these concerns.
51. Therefore, IMDA’s draft Codes will put in place additional safeguards to protect young users, including minimising their exposure to inappropriate content and providing tools for children or their parents to manage their safety online.
a. The Code also requires that services provide differentiated accounts to children, whereby safety settings are robust and set to more restrictive levels that are age-appropriate by default.
b. Children and their parents or guardians must be given clear warnings of the implications if they opt out of the default settings.
c. We will continue working with industry players to see how such measures can be strengthened.
52. We recognise that there are gaps; in practice, users might try to circumvent these measures.
a. Mr Desmond Choo, Mr Melvin Yong, Mr Gerald Giam, as well as some respondents to MCI’s public consultation in July, have asked about the possibility of requiring age verification systems.
b. Mr Saktiandi Supaat, Mr Alex Yam, Mr Mark Chay and Mr Melvin Yong asked about measures to better protect the young, including age-specific provisions or mandating screen-time restrictions.
53. Most social media services with significant reach or impact already require users to be at least 13 years old to register for an account.
a. Users have to declare their date of birth at the point of registration.
b. This way, services will be able to apply age-appropriate policies to their respective users, including content moderation.
c. In line with this, PDPC will be clarifying that personal data may be used to implement such age-appropriate policies on social media services. It is permitted, and we will make it clear. A simple sketch of such age-based defaults follows below.
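Here is the sketch referred to above: a minimal illustration, in Python, of deriving an age from a declared date of birth and applying restrictive defaults for child accounts. The minimum age of 13 reflects the common industry practice cited; the setting names and the under-18 boundary are hypothetical assumptions, not requirements of the Code.

```python
# Hypothetical sketch: age-appropriate defaults from a declared date of
# birth. Setting names and the under-18 boundary are illustrative only;
# the 13-year minimum reflects the industry practice cited above.
from datetime import date

MIN_AGE = 13

def age_on(dob: date, today: date) -> int:
    """Completed years between dob and today."""
    return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

def default_settings(dob: date, today: date) -> dict:
    age = age_on(dob, today)
    if age < MIN_AGE:
        raise ValueError("below minimum age: registration refused")
    if age < 18:
        # Child account: restrictive defaults; opting out should trigger
        # a clear warning to the child and their parent or guardian.
        return {"content_filter": "strict", "dms_from_strangers": False}
    return {"content_filter": "standard", "dms_from_strangers": True}

print(default_settings(date(2008, 5, 1), date(2022, 11, 9)))
# {'content_filter': 'strict', 'dms_from_strangers': False}
```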
54. To mitigate false age declarations – a problem I think we all recognise:
a. Some social media services use a combination of Artificial Intelligence, machine learning technology, and facial recognition algorithms to proactively detect and remove underage accounts.
b. Some also allow users to report accounts suspected to be underage, which will be investigated and suspended if the reports are accurate.
55. However, there is currently no international consensus on standards for effective and reliable age verification by social media services that Singapore can reliably reference.
a. Instead, we will continue to closely monitor and extensively consult on the latest developments in age verification technology, taking into account data protection safeguards, and consider viable regulatory options.
b. In addition, we will continue to work with Social Media Services, educators, and other stakeholders, to help parents guide young users navigating online spaces, and make young users better aware of the safety tools that are available to them.
Protection for individual users and victims
56. Members have also raised the importance of providing support to victims or users affected by online harms.
57. We recognise that while laws provide the necessary legal tools for victims, legal processes can often be daunting and difficult to approach.
a. Members would be glad to know that organisations such as SG Her Empowerment, or SHE, have stepped up to augment Defence Guild’s efforts in providing legal support to victims of online abuse. May I register the MCI family’s sincere appreciation to Mr Zhulkarnain Rahim and his fellow volunteers for stepping up to perform this very important function.
b. Continuing the work of the Sunlight Alliance for Action, which concluded its tenure in July this year, SHE is working with the Singapore Council of Women’s Organisations to launch a support centre for victims of online harm. We see this as an important gap to plug. We are committed to making it happen, and I believe it will be available soon. Those in need will then be able to seek support and legal advice from counsellors and pro bono lawyers at this centre.
58. As I mentioned in my opening speech, online harassment, cyberbullying and doxxing are dealt with under the Protection from Harassment Act 2014 (POHA). Victims of gender-based online harms, of which a commonly known example is image-based sexual abuse, will be able to seek recourse under POHA where the online harm amounts to harassment.
a. The Protection from Harassment Court has served many victims since it was established last year. One reason more victims have been able to obtain redress is wider awareness of its existence.
b. The Ministry of Law (MinLaw) is also looking into how victims can be better empowered to put a stop to such online harms generally, and to seek redress against, and hold accountable those who are responsible.
c. This includes cyberbullying, and more novel forms of online hurt, such as cancel campaigns, which Minister Shanmugam has spoken about before.
i. MinLaw’s efforts will complement MCI’s efforts to enhance the Government’s regulatory tool kit, as well as MHA’s efforts to address criminal offences committed online.
ii. More details will be announced at an appropriate juncture.
59. But I think Members see that we are not stopping with this Bill. There are other proposals that are being considered and we probably will not have to wait very much longer for these to be known publicly.
Public Education/Equipping parents and educators
60. This leads me to my final point – that public education must go hand in hand with legislation. Ms Nadia Samdin, Mr Alex Yam, Mr Mark Chay, Mr Leon Perera, and Mr Zhulkarnain Rahim spoke about this.
61. Members also called for more collaboration with service providers in this area. For example, Mr Melvin Yong asked whether the Government would consider setting up a self-regulatory taskforce with key OCS providers. We can explore this suggestion when we engage further with the industry.
62. Mr Deputy Speaker, I seek your permission to distribute a handout to the Members, which contains a list of safety measures on social media services, and public education programmes organised in collaboration with various technology companies and community partners.
63. To highlight a few examples:
a. Google held its Online Safety Park at the Digital for Life Festival earlier this year, and it is partnering the Media Literacy Council (MLC) to bring its “Be Internet Awesome” programme to primary schools, to train 50,000 parents and children on online safety measures. When I last met them, they said the 50,000 target has been met, and they are now aiming to double it to 100,000.
b. Meta collaborated with the National Crime Prevention Council and MLC on a campaign to educate users on top scam typologies and tips to keep safe. This campaign reached over 2 million users, and a second campaign has been launched on e-commerce scams.
c. There are many others, and we will continue to build on these efforts.
64. Mr Deputy Speaker, may I make a few comments in Mandarin please.
65. In today’s world, a seemingly minor online harm can spread like wildfire and cause immense damage.
66. We all know that putting out fires takes firefighters; the same is true of the online world.
67. The purpose of the Online Safety Bill is precisely to enable Singapore to take timely and effective “firefighting” action.
68. While laws can play their part, they cannot comprehensively combat online harms. Singapore has therefore adopted a flexible, step-by-step approach to deal with a fast-changing online world.
69. More importantly, the Government is clear that we must bring together partners from all quarters to find solutions together.
70. One important partner is parents. But many parents only came into contact with social media as adults.
71. The ever-changing online world has caught many parents off guard.
72. We are therefore working with various partners to strengthen parents’ online safety awareness and ability to respond, for example by making sure parents know the safety options available on social media services, so that they can better guide their children.
73. The challenges are considerable, but so long as we stand united and work together, I believe we can still build a safer and brighter online space for Singaporeans.
Conclusion
74. Mr Deputy Speaker, I have tried to respond to as many of the questions and suggestions as I can.
75. The Bill before us today seeks to create a safer online environment for Singapore users.
a. Users will be empowered with the tools to manage their own safety and equipped with the information needed to make informed decisions about how they wish to use online services.
b. In turn, online services will be held accountable for their systems, processes, and actions.
c. And where there is egregious content, such as content that undermines racial and religious harmony, the Government will step in to protect users.
76. Ultimately, we must recognise that there is no single measure that will assure us of online safety. We will need laws, codes, education, user reporting and a whole range of interventions. We will also need to keep updating our measures to deal with new risks. I am heartened that Members are united on this, and I thank the House for its unanimous support.
77. Shared responsibility, parental guidance and active individual involvement will play a key role in ensuring that even in the face of harmful online content, users, including children, can stay safe online.