SINGAPORE: A public consultation was launched on Wednesday (Jul 13) to seek views on the Government’s proposed measures to improve online safety for users of social media services, including young people.
The two sets of proposed measures were introduced by the Ministry of Communications and Information (MCI) in June.
The first, a Code of Practice for Online Safety, will require social media services with significant reach or impact to have system-wide processes to reduce exposure to harmful online content for Singapore-based users, including those below the age of 18.
The second, a Content Code for Social Media Services, will allow the Infocomm Media Development Authority (IMDA) to direct any social media service to disable local access to content deemed harmful to Singapore’s society, such as online material that incites racial or religious disharmony.
“We recognise that some social media services have put in place measures to protect their users. However, such measures vary from service to service,” said MCI in its public consultation paper.
“Additionally, when assessing harmful content on social media services, Singapore’s unique socio-cultural context needs to be considered. Given the evolving nature of harmful online content, more can be done, especially to protect young users.”
SAFEGUARDS FOR YOUNG USERS
Under the Code of Practice for Online Safety, the authorities are considering requiring designated social media services to have community standards for six categories of content – sexual content, violence, self-harm, cyberbullying, content endangering public health and content facilitating vice and organised crime.
“These designated services will also be expected to moderate content to reduce users’ exposure to such harmful content, for example, to disable access to such content when reported by users,” said MCI.
“For child sexual exploitation and abuse material, and terrorism content, these services will be required to proactively detect and remove such content.”
For young users, MCI proposed additional safeguards such as stricter community standards and tools that enable young people or their parents to manage their exposure to harmful content.
The ministry said these tools could include those that limit the visibility of young users’ accounts to others.
“The tools could be activated by default for services that allow users below 18 to sign up for an account,” it added.