CSAM rise drives tighter online ID, youth access measures


The government is preparing to implement a minimum age of 16 for social media access, backed by age and identity verification mechanisms. (AI-generated image)

KUALA LUMPUR (Dec 10): Malaysia is raising the alarm over the growing spread of child sexual abuse material (CSAM) online, a digital threat that experts warn could inflict long-term harm on the nation’s younger generation if not tackled swiftly.

Recent nationwide operations under Op Pedo, carried out from Sept 22 to 30, 2025, by the Royal Malaysia Police (PDRM) and the Malaysian Communications and Multimedia Commission (MCMC), uncovered more than 880,000 CSAM-related files and led to the arrest of 31 individuals at 37 locations.

Authorities say the surge reflects how anonymous accounts, closed networks and cashless transactions are increasingly being used to produce and circulate harmful content.

Amid this worrying trend, the government is preparing to implement a minimum age of 16 for social media access, backed by age and identity verification mechanisms. The move aims to reduce young users’ exposure to high-risk digital spaces and limit opportunities for predators to target minors.


Siraj Jalil, President of the Malaysian Cyber Consumer Association (MCCA), stresses that early detection is vital in combating online exploitation.

“CSAM spreads extremely fast once it enters the Internet. If detection is late, the material is already copied, shared and embedded across multiple platforms. Early action helps stop distribution, identify victims and perpetrators quickly, and prevent the content from slipping into the dark web,” he said.

But Siraj cautions that age limits alone are not enough.

“Age on the internet is easy to fake. Predators use hidden channels that bypass age policies,” he explained.

“What matters more is platform safety, agency collaboration, automated detection technology and better digital literacy among parents and students.”


Mediha Mahmood, Chief Executive Officer of the Communications and Multimedia Content Forum of Malaysia (CMCF), shares the view that authorities must act swiftly when CSAM reports arise.

She says complaints must be reviewed within hours, with harmful material removed immediately and accounts taken down.

“We need a transparent process: at minimum, a clear notification that a report is being investigated and that action has been taken,” she said.

She also believes AI technology can help detect repeated images and block access to CSAM in real time.
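To illustrate the general principle behind such detection, the Python sketch below flags an uploaded file whose hash matches previously confirmed material. The KNOWN_HASHES set and file-checking functions are hypothetical placeholders; production systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, matched against vetted databases of known material rather than a hard-coded list.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes of previously confirmed abusive images.
# Shown only to illustrate matching uploads against known material;
# real deployments use perceptual hashing and curated hash databases.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def file_sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_material(path: Path) -> bool:
    """Flag an upload whose hash matches the known-material list."""
    return file_sha256(path) in KNOWN_HASHES
```

In practice, a match would trigger immediate blocking of the upload and a report to the relevant authorities rather than any automated judgement by the platform alone.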

Children under 16 are especially vulnerable, said Associate Professor Dr. Fauziah Mohd Saad, a counselling psychology expert at Universiti Pendidikan Sultan Idris (UPSI) in Perak.

Their cognitive abilities are still developing, which makes it difficult for them to assess risks, notice subtle manipulation, or recognise harmful intentions from strangers online.

“They are easily influenced emotionally, eager for attention and approval, and predators exploit exactly these psychological weaknesses,” she said.

Many children rely heavily on praise and positive reinforcement, which offenders exploit to groom them through emotional support, digital gifts or excessive compliments.

Compounding the issue, Dr. Fauziah explains that predators often hide behind fake accounts, VPNs and temporary profiles, making identification difficult without strong verification systems.

Fear, shame and self-blame often prevent victims from reporting what happened, enabling the exploitation to persist without detection by parents, educators or platform operators.

She believes social media platforms must take greater responsibility by enforcing age and identity verification, such as electronic Know Your Customer (e-KYC) checks, which can reduce fake accounts and strengthen accountability.

She also recommends proactive monitoring of private messages and group chats, with automated alerts for high-risk behaviour such as an adult sending images to a child or using sensitive phrases linked to exploitation.
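As a rough illustration of the kind of automated alerting she describes, the Python sketch below applies simple rules to a message between two users. The Message structure, the age thresholds and the SENSITIVE_PHRASES list are hypothetical placeholders; a real platform would combine vetted phrase lists and trained classifiers with human review before any action is taken.

```python
from dataclasses import dataclass

# Hypothetical grooming-related phrases; real systems rely on vetted,
# regularly updated lists and classifiers, not a hard-coded set.
SENSITIVE_PHRASES = {"keep this our secret", "don't tell your parents"}


@dataclass
class Message:
    sender_age: int
    recipient_age: int
    text: str
    has_image: bool


def risk_flags(msg: Message) -> list[str]:
    """Return the high-risk conditions a message triggers, if any."""
    flags = []
    adult_to_minor = msg.sender_age >= 18 and msg.recipient_age < 16
    if adult_to_minor and msg.has_image:
        flags.append("adult sending an image to a child")
    lowered = msg.text.lower()
    if adult_to_minor and any(p in lowered for p in SENSITIVE_PHRASES):
        flags.append("sensitive phrase linked to exploitation")
    return flags


# Example: this message would raise both flags for human review.
alert = risk_flags(Message(34, 13, "Keep this our secret", True))
```

Any such monitoring would also have to be balanced against users' privacy, which is why experts pair it with verification and digital-literacy measures rather than treating it as a standalone fix.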

As Malaysia prepares to tighten access to digital platforms, experts agree: strengthening identity verification, boosting platform responsibility and empowering families with digital literacy form the core of a safer online environment for young users.

The government’s upcoming measures underscore a strong commitment to safeguarding children, ensuring the online environment becomes not a haven for predators, but a safer and more trustworthy space for the next generation.
