By Simran Toor, Chief Executive Officer and Cheryl Tan, Director, Programmes & Partnerships, SHE
The Current State of Play
Technologies are in a constant state of evolution, and while this promises great potential for positive change, there is also a darker side. Bad actors persist in exploiting emerging technologies, exposing us to new and often severe risks. For instance, Artificial Intelligence (AI) can be misused to generate inappropriate content, as seen in the case of a South Korean man imprisoned for using an AI image generator to produce child sexual abuse material. In Spain, AI was used to manipulate photos of young girls taken from social media. These developments are a significant cause for concern among Internet users, parents, educators, and policymakers.
In Singapore, the prevalence of online harms has become increasingly apparent. Recent research by SG Her Empowerment (SHE), which surveyed 1,056 Singaporeans, revealed that 58% of respondents had either personally experienced online harms or knew others who had. Among them, 38% had personally encountered online harms, and 47% were aware of others who had been targeted. The Sunlight Alliance for Action (AfA) conducted a 2022 online poll in Singapore, which also found that 47% of respondents had personally experienced online harms. Moreover, a recent Ministry of Communications and Information (MCI) Online Safety Poll showed that two-thirds of Singapore’s Internet users had encountered harmful content online in the previous six months.
SHE’s research has also shown that youths are particularly vulnerable to online harms. A higher proportion of individuals aged 15 to 44 reported experiencing or knowing of others who had experienced online harms. Additionally, female youths were twice as likely to have experienced sexual harassment compared to their male counterparts.
Why is Concern Warranted?
The potential negative impact of online harms is significant at both individual and societal levels.
SHE’s research found that 40% of online harm survivors experienced severe physical and mental health issues, including suicidal thoughts, fear for their safety, and depression. Additionally, 31% felt a range of negative emotions, such as anger, sadness, anxiety, embarrassment, shame, or helplessness. Alarmingly, many individuals do not know how to seek help when facing such situations, and this lack of knowledge is more pronounced among women, with 60.9% of female respondents in the Sunlight AfA poll unaware of where to seek assistance after experiencing online harms.
The research also highlighted a growing lack of trust in the online space, with 76% of respondents feeling uncomfortable expressing their views on controversial topics. About 66% indicated they would self-censor, and 68% said they would disengage from online activities to avoid online harm. These behaviours, rooted in the need for self-protection, can lead to imbalanced online discussions, where only the most extreme or polarising views are expressed.
Individuals who disengage from the online world may miss out on digital opportunities. If certain segments of society, like female youths who are more targeted for online sexual harassment, disengage disproportionately, it may lead to broader societal inequities.
Disturbingly, 21% of respondents considered online harm a “normal part of life,” indicating a worrying acceptance of bad online behaviour. Furthermore, MCI’s Online Safety Poll found that nearly half of those encountering harmful content online did not block or report it, either because it did not occur to them or because they were unconcerned. This trend is concerning, and there is a risk that these online behavioural norms may spill over into real life, potentially leading to a more toxic society over time. We must question whether this is the direction we want our society to take.
The Current Regulatory Framework
In Singapore, policymakers have been proactive in addressing the challenges posed by technology and online harms. Recent legislative measures include:
- The Online Criminal Harms Act (May 2023): This act expands the regulatory tools available to combat online criminal activities, broadens the range of entities authorities can act against, and introduces more effective measures to address online criminal harms.
- The Online Safety (Miscellaneous Amendments) Act (February 2023): This amendment to the Broadcasting Act empowers the Infocomm Media Development Authority (IMDA) to order social media platforms to remove egregious content, including content that promotes self-harm, child exploitation, terrorism, racial or religious tensions, or threats to public health. Non-compliant platforms may face liability or service blockage.
- The Code of Practice for Online Safety (July 2023): Introduced by the IMDA, this legally binding code obliges designated Social Media Services to enhance online safety and reduce users’ exposure to harmful content, such as self-harm, cyberbullying, and sexual content.
- The Protection from Harassment Act (POHA) (2014): This law protects individuals from harassment both in real life and online. Victims can seek protection orders and have harmful publications removed through court applications. The Protection from Harassment Court, established in June 2021, handles online and offline harassment cases.
However, despite these legislative efforts, the data indicates that online harms are on the rise. The main challenge lies in the rapid pace of technological advancements, which can render legislation obsolete almost as soon as it is enacted. Moreover, for those targeted by online harms, the existing recourse may not be sufficient. SHE’s research shows that over 80% of respondents desire faster and more permanent solutions, with more than 70% preferring immediate removal or cessation of online harms without resorting to legal action (although many agreed that survivors should have the option to obtain a legally enforceable order if the harmful conduct or content persists). This suggests that current legal remedies may be considered too costly, time-consuming, or inadequately responsive to the fast-spreading and impactful nature of online harms.
Next Steps: Building a Healthier Online Ecosystem
Creating a healthier online environment for future generations involves addressing multiple pressing needs through a well-rounded policy, regulatory, and enforcement model:
- Public Education: Educating the public is essential to combat the normalisation of online harms, raise awareness of available support, and provide avenues for help.
- Swift and Permanent Recourse: Those targeted by online harms require quick and lasting solutions to address their concerns.
- Support for Survivors: Supporting survivors in coping with the mental and emotional impact of online harms is crucial.
- Staying Ahead of Technology: To keep pace with evolving technologies, continuous research and engagement with stakeholders are essential to anticipate emerging issues.
Australia’s eSafety Commissioner, established in 2015, offers valuable insights. The office combines regulatory powers for rapid harmful content removal, conducts research on online safety trends, provides education and training to enhance digital skills, and regularly updates policies and legislation. Furthermore, it collaborates with investors and industry players to improve user safety standards.
Introducing a similar e-Safety Commission in Singapore could not only serve these functions but also rebuild trust in online spaces and promote healthy online discourse. This could be achieved through ongoing dialogues between internet platforms, users, and regulators, and by defining ground-up ‘rules of engagement’ for internet users, such as a Code of Conduct or best practices for online behaviour.
At the community level, organisations like SHE, in collaboration with the Singapore Council of Women’s Organisations (SCWO), have already taken proactive steps, launching SHECARES@SCWO in early 2023 as Singapore’s first comprehensive support centre for targets of online harms. The centre offers a helpline, pro bono counselling, legal clinics, and assistance with reporting online harms. SHE also partners with leading internet platforms like Google/YouTube, Meta, TikTok, and LinkedIn, participating in priority flagger/trusted flagger programmes to expedite user recourse through in-app reporting.
Urgent action is required to curb the escalating trends of harm and distrust, or else we risk a future where the internet becomes a breeding ground for wrongdoing, and only the most courageous individuals dare to navigate its depths. Such an outcome would be a genuine loss, as the digital realm offers immense potential for growth, experience, and opportunity.
The time has arrived for us to seek fresh, innovative, and multifaceted solutions to address the current challenge. An e-Safety Commission may well be the solution Singapore requires.
SG Her Empowerment (SHE) is an independent non-profit organisation, with Institution of Public Character status, that strives to empower girls and women through community engagements and partnerships. SHE engages and listens to women and men across the community, and from all age groups and walks of life. SHE facilitates research and gathers data to clearly frame the issues and identify the needs, in order to shape strategies that will make a positive impact. SHE also collaborates with community stakeholders from different interest groups, civil society organisations, corporates and the Government. Through these efforts, SHE advocates for change and champions a more equal society. For more info, please visit she.org.sg.
The views and recommendations expressed in this article are solely of the author/s and do not necessarily reflect the views and position of the Tech for Good Institute.