Speech by Minister Josephine Teo at the 2nd Reading of the Online Safety Bill
Introduction
1. Mr Speaker, I beg to move, “That the Bill be now read a second time”.
2. Sir, a study conducted by MCI in June this year found that almost 80%
of Singapore residents are concerned about online harms.1
In stark contrast, when we ask people how they feel about walking alone
on the streets of Singapore at night, 97% said that they would be
comfortable doing so.2 There is obviously a sizeable gap
between how safe Singaporeans feel online, and offline.
3. Today, most of us remain connected online throughout the day. Online
services have become the key conduits through which we communicate and
consume content.
4. Because of this, the prevalence of harmful online content on these
services can have serious negative consequences for the physical, emotional,
and mental well-being of our society.
Rationale for the Bill
5. The Bill we are debating today is not the first law introduced to secure
our online space. The Government has, over the years, introduced targeted
laws to deal with specific types of harmful online content and behaviour,
including:
a. Falsehoods, which are dealt with under the Protection from Online Falsehoods
and Manipulation Act, or POFMA;
b. Foreign interference, which is addressed under the Foreign Interference
(Countermeasures) Act, or FICA;
c. Online harassment, such as cyberbullying, which is dealt with under
the Protection from Harassment Act, or POHA. POHA was also recently updated
in 2019 to cover doxxing.
6. Our laws have served to protect many Singaporeans.
a. POFMA was integral to Singapore’s response to COVID-19, allowing the
Government to address the deluge of misinformation which made COVID-19
not just a pandemic, but an “info-demic”.
b. Recently, the Straits Times reported a record high number of protection
orders filed and granted in 2021 under POHA,3 more than double
the number in previous years. Lawyers attributed the spike in applications
to media attention on the issue of harassment. The application process
has also been enhanced with the opening of the Protection from Harassment
Court.
7. However, there are still gaps that need to be addressed.
a. One growing concern is content encouraging suicide and self-harm.
i. Just two months ago, an investigation in the UK concluded that 14-year-old
Molly Russell took her own life, after being exposed to thousands of self-harm
and suicide-related posts in the months leading up to her death. Many of
these posts portrayed suicide as an inevitable consequence of depression.
b. There have also been reports of users dying accidentally while attempting
to mimic videos of impossible physical stunts. Unknown to some victims,
the videos of these reckless acts and dangerous challenges had been heavily edited.
8. Our children, who may lack the capacity or maturity to deal with certain
types of content, are particularly vulnerable when exposed to inappropriate
content and unwanted social interaction online.
a. In June this year, MCI conducted a study which asked respondents what
online content they felt children needed to be most protected from.
The top three were sexual content, cyberbullying, and violent content.
b. In a dialogue with youth held by MCI and the National Youth Council
last year, participants indicated that the top three online harms they
and their peers faced were being insulted online, being impersonated by someone
else, and receiving unwanted contact from another person.4
9. If such harmful content existed only on websites, IMDA would be able
to deal with it under the existing Broadcasting Act.
a. But today, users are much more likely to consume content from the feeds
of Social Media Services, where such harmful content can be pushed via
algorithms, and spread quickly through our social connections.
i. Just two weeks ago, Meta announced that Facebook recorded nearly 2
billion daily users,5 while Instagram recorded 2 billion monthly
active users.6
ii. TikTok has been downloaded over 3.5 billion times worldwide since
its launch,7 while YouTube recorded 30 billion daily views
on “YouTube Shorts”.8
10. As the Internet evolves, so must our laws.
a. In the book “Tools and Weapons” co-authored by Microsoft’s President
Brad Smith and Carol Ann Browne, the backwardness of some cybersecurity
measures was likened to “digging trenches to defend against missiles”.
b. In the same way, we must recognise that online content can inflict
serious damage on our people and communities, if our laws fall short.
11. We must have the ability to deal with harmful online content accessible
to Singapore users, regardless of where the content is hosted or initiated.
12. The entities controlling the biggest and most popular Online Communication
Services or platforms accessible in Singapore all operate outside of Singapore,
and fall outside the legal remit of the Broadcasting Act today.
13. To ensure that Singapore users of these services and platforms can
be kept safe, we must be able to take appropriate action on these entities,
as long as they provide content accessible by Singapore users.
Global consensus on the need to protect users online
14. We are not alone in thinking this way.
15. There is a growing consensus that rules must be put in place to prevent
harms in the online world, just as in the physical world.
16. Calls for online services to take greater responsibility in ensuring
safety on their platforms have also led jurisdictions such as the UK,
the EU, Germany, and Australia to introduce or propose new online safety
laws.
a. Mr Speaker, I seek your permission to distribute handouts to the Members,
which summarise online safety laws enacted or under consideration in these
jurisdictions.
Responding to growing public concerns
17. Mr Speaker, the Singapore public, like the public in many other societies,
is concerned about the potential damage caused by harmful online content, and expects
Social Media Services to take greater responsibility to protect their users.
18. In July and August this year, MCI conducted a public consultation
and a series of engagements on our proposals to combat harmful online content,
and received more than 600 responses.
a. Respondents expressed the desire for safety features to manage their
exposure to certain types of content.
b. Similarly, MCI’s June 2022 study found that 9 in 10 respondents felt
that such measures would protect users from harmful online content to at
least a moderate extent.
c. Parents, in particular, were concerned over viral social media content
which featured dangerous pranks and challenges, harmful advertising, cyberbullying,
and explicit sexual content.
d. Some suggested keeping younger users in mind when developing safety
features, including tailoring content moderation thresholds, and ensuring
young users can easily report inappropriate content.
19. In a separate poll conducted earlier this year by the Sunlight Alliance
for Action, Singaporeans ranked reporting systems, and laws to tackle online
harms, as the top two measures that would facilitate help-seeking.
Enhancing online safety in Singapore
20. I will now explain our approach to enhancing online safety for Singapore
users. Members will find that there are similarities to practices
elsewhere, examples of which I have circulated.
a. The first is to tackle the problems in an accretive manner. Rather
than take a “Big Bang” approach, which some countries are attempting with
an all-encompassing law, let us design our laws in a considered and
calibrated manner.
b. Second, as far as possible, we will be outcome-driven instead of
overly prescriptive. In today’s context, we are dealing with a vast volume of
user-generated content.
i. Rather than chasing individual pieces of content, we must ensure that
systems and processes to regulate the content are put in place and maintained
by platforms.
ii. Instead of prescribing how these systems and processes are set up,
we should specify the outcomes they ought to achieve.
c. The third, and perhaps most important of all, is to recognise that laws
are not a silver bullet. The Government will need to work with partners,
including our citizens, to tackle harmful content and enhance the safety
of users online.
Taking a targeted, accretive approach
21. Today’s online content service providers are different from traditional
local broadcasters, and require a different regulatory approach.
22. In fact, each type of service is different. “Social Media Services”
are not the same as “Over-The-Top Media Services”, which also operate differently
from “Game Distribution Services".
23. The Bill allows us to adopt this accretive approach by building on
existing laws to introduce new ones; so that over time, our foundations
for digital safety become stronger.
a. If passed by Parliament, this Bill will create a new part in the Broadcasting
Act to regulate “Online Communication Services”, which are electronic services
that enable users to access or communicate content via the Internet.
b. The regulations will only apply to specified types of “Online Communication
Services”, which are listed in a Schedule under the Broadcasting Act.
24. For now, we will only specify one type of Online Communication Service
in the Schedule – and that is “Social Media Services”.
a. Under the Bill, a Social Media Service is defined as an electronic
service:
i. Whose sole or primary purpose is to enable online interaction or linking
between two or more users, including enabling users to share content for
social purposes; and
ii. Which allows users to communicate content on the service.
25. Why have we chosen to regulate Social Media Services as a matter of
priority?
a. Well, because about 3 in 5 respondents to MCI’s June 2022 survey
had experienced harmful online content on social media platforms.
b. This is the highest proportion compared to other platforms such as
e-commerce sites, search engines, and news sites.
Adopting outcome-driven laws
Codes of Practice for Designated Online Communication Services
26. Given the vast volume of user-generated content in today’s evolving online
space, it is not efficient to regulate individual pieces of content.
27. IMDA will instead focus on system-wide measures which are more effective
at scale:
a. Under sections 45K and 45L of the proposed Bill, IMDA will be
able to designate Online Communication Services with significant reach
or impact in Singapore, and require them, via Codes of Practice, to put
in place measures to keep Singapore users safe.
b. This approach is similar to how we go about regulating fire safety.
Building owners, occupiers, and qualified persons must adhere to the SCDF’s
Fire Code, which requires them to put in place systems and processes to
maintain high fire safety standards, to keep their occupants safe.
i. Likewise, Online Communication Services must have in place systems
and processes to minimise Singapore users’ exposure to, and mitigate the
impact of, harmful content on their platforms.
28. IMDA will impose these requirements on designated Online Communication
Services via Codes of Practice.
29. By stating in the Codes the outcomes which regulated services must
meet, IMDA aims to provide sufficient clarity on what the services must
do to protect users, whilst allowing some flexibility for them to adjust
their approaches.
30. We can also expect IMDA to update the Codes from time to time.
a. This will allow us to be agile and responsive to technologies as they
evolve.
b. But before introducing new requirements, IMDA will consult and work
collaboratively with service providers to assess the most suitable approaches
to strengthening safety on their platforms.
31. Under the Bill, IMDA does not have an unfettered ability to issue new
Codes. The new section 45L sets out that IMDA can issue Codes for the following
purposes:
a. First, to ensure services have systems or processes in place to address
harmful content.
b. Second, to provide practical guidance or certainty in respect of what
content should be covered.
c. Third, to set out the procedures that service providers must follow
when audits are carried out.
d. Fourth, to require services to collaborate with approved researchers
to understand systemic risks relating to the service.
Code of Practice for Online Safety
32. Earlier, I explained that we will apply our new laws to Social Media
Services as the first type of Online Communication Service. Let us now
turn to the Code that designated Social Media Services with significant
reach or impact in Singapore must comply with.
33. In October, IMDA issued a draft copy of the “Code of Practice for
Online Safety”.
34. This draft Code comes after an extensive study of international online
safety legislation and proposals, as well as engagements with major Social
Media Services in Singapore, including Facebook, YouTube, Instagram, TikTok,
Twitter, and HardwareZone.
a. The Social Media Services consulted were receptive to the proposals
laid out in the draft Code, and the Bill.
b. They support the Government’s commitment to find innovative and effective
solutions to combat harmful online content, and recognise the need to improve
online safety.
35. The designated Social Media Services will be expected to meet the
key outcomes as follows:
a. First, minimise Singapore users’ exposure to harmful content and empower
users with tools to manage their own safety. The Social Media Services
must also take additional steps to minimise children’s exposure to inappropriate
content and provide tools allowing children or their parents to manage
their safety.
b. Second, make available an easy-to-use mechanism for Singapore users
to report harmful content and unwanted interactions.
c. Third, provide transparency on the effectiveness of their measures
in protecting Singapore users from harmful content. Designated Social Media
Services must provide information that reflects Singapore users’ experience
on their services. This will allow users to make informed decisions about
how they use the service.
36. If the Bill is passed, IMDA will further consult relevant Social Media
Services, before finalising the Code for issuance.
Interventions to stop access to egregious content
37. We believe that the Code of Practice for Online Safety will reduce
users’ exposure to harmful online content, but it will not eliminate such
content completely.
a. Part of the reason is that these Social Media Services tend to operate
globally, drawing in users and content from around the world.
b. Their safety measures are not tuned to reflect an in-depth understanding
of Singapore’s local context or our racial and religious sensitivities.
38. Members may recall that in the early days of the COVID-19 pandemic,
supermarkets were purportedly running out of toilet paper.
a. A social media post surfaced, suggesting that people use the Bible
and the Quran as toilet paper.
b. This post was very offensive, and denigrated two religions in Singapore.
However, it was not moderated nor removed by the platform concerned.
c. IMDA had to step in to engage the platform, and the platform eventually
disabled access to the post.
39. There may also be egregious content on non-designated Social Media
Services, which are not subject to the Code of Practice for Online Safety.
a. In May last year, a poll published on a Social Media Service sexualised
local female Islamic teachers, asked users to rank them, and further promoted
sexual violence against them.
b. The post went viral, and the otherwise modest reach of this particular
service received a sudden, significant boost.
c. It not only caused great distress to the individuals involved, but
also unsettled many others in our community.
40. These incidents are like fires that still break out, even though the
Fire Code has prevented most fires. In such instances, we must have firefighters
who are properly equipped to act quickly, so as to minimise, if not prevent,
serious injury and damage.
41. If Parliament agrees, the new section 45H proposed by this Bill will
allow IMDA to act as an “online firefighter” and direct any Social Media Service
to:
a. Disable Singapore users’ access to egregious content; and
b. Stop the egregious content from being transmitted to Singapore users
via other channels or accounts.
42. IMDA has in fact performed this role for some time now, working with
Social Media Services behind the scenes to deal with egregious content.
a. As Singapore’s media regulator, IMDA also has significant experience
in assessing content across the different media platforms and making decisions
to protect the community.
b. Under this Bill, IMDA will be better equipped to ensure Singapore users
are protected from egregious content online.
43. But IMDA will not have carte blanche to issue directions. Its powers
will be limited in scope.
a. First, IMDA will not be able to issue directions in respect of private
communications. Those will remain private.
b. Second, directions can only be issued for certain categories of egregious
content relating to user safety.
c. The new section 45D proposed by the Bill defines “egregious content”
to include content advocating terrorism, suicide and self-harm, violence
(including sexual violence), child sexual exploitation, content posing
a public health risk, and content likely to undermine racial and religious
harmony. These categories will be set out in law.
d. When dealing with content that requires the expertise of other agencies,
IMDA will consult them accordingly.
e. As an example, when assessing content pertaining to public health measures
and risk, IMDA will consult MOH and its experts.
Enforcement and penalties
44. The new section 45M proposed in the Bill requires designated services
to take all reasonably practicable steps to comply with an applicable Code
of Practice.
a. If they do not, IMDA can take regulatory action under the proposed
section 45N to issue:
i. A financial penalty; or
ii. A rectification direction requiring the service to remedy the failure
to comply with the Code of Practice. Non-compliance with a rectification
direction will be a criminal offence, punishable with a fine.
45. For egregious content, non-compliance with a direction by IMDA will
also be a criminal offence, punishable by a fine.
46. Mdm Deputy Speaker, I said right at the beginning that laws are necessary,
but success in ensuring our citizens’ safety cannot depend on the laws
alone.
47. Respondents to MCI’s public consultation and engagements agreed with
this view.
a. They wanted the Government to mandate stronger measures, and Social Media
Services to do more to reduce harmful online content.
b. At the same time, they emphasised that all of us, as users of Social
Media Services, have an individual responsibility to protect ourselves.
c. During one of our engagements, Mr Mark Joel Premraj, a parent, shared
his perspective on how parents also play a key role in educating their
children about inappropriate content online, including encouraging them to
flag the inappropriate content they come across.
48. Besides establishing a robust regulatory toolkit, the Government has
taken active steps to nurture a well-informed and discerning public.
49. Efforts to educate the public include the National Library Board’s
S.U.R.E. programme.
a. It equips the public to think critically, to be responsible producers
and consumers of information, and to stay safe and well online.
b. Since its launch in 2013, S.U.R.E. has conducted over 6 million physical
and digital engagements.
50. In addition, MOE’s refreshed Character and Citizenship Education curriculum
has a stronger focus on Cyber Wellness education, where students learn
to be safe, respectful and responsible users of cyberspace, and to be
a positive peer influence.
51. In support of the Digital for Life movement, launched in February
last year, community partners have also spearheaded initiatives which helped
over 270,000 Singaporeans enrich their lives through digital technologies.
a. For example, TOUCH Community Services has partnered Meta to conduct
the Digitally Ready Families programme, where low-income families learn
digital life skills and cyber wellness tips.
b. Another Digital for Life partner is “Kids PlaySafer”. Created and run
by Ms Sandra Low, a mother of an 11-year-old and 9-year-old, “Kids PlaySafer”
has conducted talks on digital literacy and cyber safety to help parents
manage their children’s digital needs.
52. Mdm Deputy Speaker, may I continue in Mandarin, please.
53. To many of us, technology is like magic; it has greatly improved our lives and brought us much convenience.
54. But to the parents of Molly, a 14-year-old girl in the UK, social media was instead the curse that took their daughter’s life.
55. Molly, who was in the prime of her youth, was deeply affected by the thousands of self-harm and suicide-related posts she was exposed to online, and was ultimately driven to end her short life.
56. Beyond self-harm content, there is no shortage of harmful material online, including sexual violence, terrorism, and racial hatred, to name but a few.
57. If we allow such harmful content to flood our online space, more people, especially young children, will be harmed. The social cohesion that we have worked so hard to build may also be destroyed.
58. However, no law can completely block all harmful content online. Therefore, even as the Government strengthens our laws, we hope that, as parents, we will encourage our children to come to us when they encounter problems online, so that we have the chance to help them.
59. We also hope that community groups and members of the public, even as they deepen their own understanding of harmful content, will help other vulnerable groups guard against such content.
60. Only then can we remain of one heart and stand united!
Conclusion
61. Mdm Deputy Speaker, let me conclude.
62. The author Arthur C. Clarke famously wrote that “Any sufficiently
advanced technology is indistinguishable from magic”, a line that has been
quoted many times over.
63. Mdm Deputy Speaker, magic happens as we speak.
a. Without disturbing Parliamentary proceedings, Members can compare notes
instantaneously, and in fact I saw some of you do so. Members can also
conduct research on the fly, whether by instructing colleagues outside the
Chamber or simply by using the QR code that I distributed.
b. Gone are the days when we might rush home to catch a favourite television
programme.
c. So much content is available online anytime, anywhere.
64. But not all of this content is good.
a. To ensure that safety is upheld for Singapore users, we need Online
Communication Services to be held accountable.
b. Equally, we need the support of everyone in the community to keep each
other safe online.
65. I appeal to Members to support this Bill so that we can together improve
online safety. I beg to move.
[1] Conducted online from 15 to 22 June 2022 with 1,053 Singapore residents aged 15 and above.
[2] From the 2020 Gallup Global Law and Order Report.
[3] The Straits Times, 30 Oct 2022, “Record number of protection orders against harassment filed and granted in 2021” (https://www.straitstimes.com/singapore/record-number-of-protection-orders-against-harassment-filed-and-granted-in-2021)
[4] “Conversation on Online Harms” held on 9 Jul 2021 involving 72 youths from various IHLs, hosted by SMS Sim Ann and SPS Rahayu Mahzam. (https://youthopia.sg/converse/onlineharms/)
[5] Transcript of Meta’s 3Q 2022 Results Conference Call, 26 Oct 2022. (https://investor.fb.com/investor-events/event-details/2022/Q3-2022-Earnings/default.aspx)
[7] TechCrunch, “TikTok was the top app by worldwide downloads in Q1 2022”, 26 Apr 2022. (https://techcrunch.com/2022/04/26/tiktok-was-the-top-app-by-worldwide-downloads-in-q1-2022/amp/)
[8] Transcript of Alphabet’s 3Q 2022 Results Conference Call, 25 Oct 2022. (https://abc.xyz/investor/static/pdf/2022_Q3_Earnings_Transcript.pdf?cache=a250e61)