Singapore has done well in women’s advancement: from the enactment of the Women’s Charter in 1961 to the White Paper on Women’s Development in 2022, we have maintained consistent efforts to ensure that girls and women are treated fairly and equally in Singapore.
The emergence of the digital realm poses fresh challenges. There is growing evidence that girls and women experience the online sphere differently from their male counterparts in several areas, including safety, bias, and sexual harassment and objectification.
As a society invested in gender equality, we must ask ourselves: will these challenges result in widening gender-based divides that negatively affect the fair and equal treatment of girls and women? If so, what can be done to prevent this?
Safety
Females are less likely to feel safe online.
A poll by the Sunlight Alliance for Action found that 92% of Singaporeans felt safe walking alone at night in Singapore. Yet the same study revealed that only 61% of females in Singapore felt safe from online harms, compared to 72% of males.
Female youth are also more prone to experiencing sexual harms and are twice as likely as male youth to report experiencing sexual harassment.
Perhaps as a testament to the serious negative impact of such harms, female youth are also most concerned about Image-Based Sexual Abuse (e.g. sextortion and revenge porn) and sexual harassment.
Bias
Online spaces are rife with gender-biased attitudes.
In SHE’s 2024 Safeguarding Online Spaces Study, 27% of youth polled agreed that if a woman uploads an image of herself, she should accept all forms of comments directed towards her, including disrespectful ones. A higher proportion of male youth (37%) harboured this misconception, suggesting that unhealthy gender mindsets are becoming normalised within this age group.
This is concerning. If left unchecked, such attitudes will proliferate, resulting in harmful behaviours towards females both online and offline, such as verbal abuse or victim blaming.
Artificial Intelligence (AI) & Deepfakes
AI can perpetuate gender-bias, encourage the sexual objectification of women, and facilitate gender-based online harms. This is of concern to many Singaporeans: SHE’s 2024 Safeguarding Online Spaces Study found that 55% of youths polled identified the sexualisation/objectification of women as a negative effect of generative AI.
As it stands, females are disproportionately and negatively impacted by AI. More than 95% of deepfake pornography created using AI targets women. AI has also been misused to generate child pornography and doctored images of young girls. Such content can have a serious and long-lasting impact on the girls and women involved, particularly if used as a cyber-smear weapon.
In addition, AI tools trained on data that contains embedded gender biases can create products that further entrench harmful gender stereotypes and result in discrimination against girls and women. For example, the hyper-sexualisation of women is a common flaw of AI trained on Internet data: the avatar-generating app Lensa, which was trained on such data, tended to create female avatars that were nude or skimpily dressed, whereas the male avatars it created were predominantly fully dressed.
AI tools can also lower the barriers to entry for bad actors, allowing them to amplify their actions without needing high levels of programming skill. Harassers can easily use these tools to create multiple personas or voices and impersonate others to attack their targets.
Gaming
In the online gaming world, female characters are also often objectified and sexualised to attract male gamers, and female gamers are subject to harassment and abuse.
A 2021 survey of 900 female gamers in the US, China and Germany revealed that 77% had experienced gender-specific harassment when gaming, including inappropriate sexual messages and name calling. The same survey found that to avoid conflict and harassment when playing online games, 59% had adopted non-gendered or male-gendered identities.
Likewise, a 2023 experiment by Maybelline showed that 83% of Australian gamers who identified as female had experienced offensive behaviour online. As a result, most turned off their microphones to conceal their identity.
Help-Seeking Behaviours
Unfortunately, many targets of online harms are unaware of how to seek help, with the lack of awareness more pronounced in females and youth.
SHE’s 2024 Safeguarding Online Spaces Study revealed that 38% of respondents had low awareness of self-help tools (e.g., in-app reporting) and 49% had low awareness of legal recourse options, with females more likely to be unaware of the legal options.
The study’s findings also suggest a more worrying trend and potential gender divide: to deal with online harms, 66% of Singaporeans polled self-censor their views online, while 68% disengage digitally.
Insofar as females are more prone to online harassment and abuse, they may be more likely to self-censor or disengage from the online space. This could result in them losing out on digital experiences and opportunities, leading to inequities at a broader, societal level.
Why Does This Matter?
We must acknowledge the gender-based harms, biases, and unhealthy behaviours that women experience online.
We must also take proactive steps to address these harms, lest they become so commonplace and accepted that they begin to shape new norms of behaviour between women and men. Such steps include:
- Developing a code of conduct to encourage civility online and discourage negative and gender-biased online behaviours.
- Increasing public education on online harms, including gender-based online harms, and on how to call them out and seek help.
- Introducing new forms of legal recourse for targets of online harms, including innovative levers such as automatic damages or a legal presumption that consent was not given when compromising images are shared online.
- Criminalising emerging online harms, such as harmful deepfakes.
- Developing strong partnerships between law enforcement, the community, and Internet platforms, to ensure that perpetrators of online harms are traceable and held accountable.
- Implementing clear frameworks to regulate AI training data to avoid perpetuating negative gender bias.
Much work has been done to keep our world safe, fair and equal for women and men alike.
Let’s now ensure we do the same for the online world.
About SHE
SG Her Empowerment (SHE) is an independent non-profit organisation with Institution of a Public Character status that strives to empower girls and women through community engagements and partnerships. SHE believes that gender equality must be achieved both online and in the real world. As part of its efforts to support targets of online harms, SHE set up SHECARES@SCWO in collaboration with the Singapore Council of Women’s Organisations. The centre is the region’s first holistic support centre for targets of online harms, and offers a helpline, a text-line, pro bono counselling support and legal clinics, and help with the reporting of online harms. SHE facilitates research to clearly frame the issues and identify needs, in order to shape strategies that will make a positive impact. SHE also collaborates with community stakeholders from different interest groups, civil society organisations, corporates and the Government. Through these efforts, SHE advocates for change and champions a more equal society. For more info, please visit she.org.sg.
The views and recommendations expressed in this article are solely of the author/s and do not necessarily reflect the views and position of the Tech for Good Institute.