Curchorem isn’t a scandal. It’s a warning

At the centre of all of this lies a troubling reality: children are not reporting. Not because harm does not exist, but because fear does.

Peter F. Borges | 26th March, 07:20 pm

The recent incident emerging from Curchorem, Goa, has quickly drawn public attention. A young man was arrested, crowds gathered outside the police station, pressure mounted, and statements followed, including political reactions. Almost immediately, the case was labelled a “sex scandal” across print, electronic media, and even hashtags. While the phrase may appear routine in media vocabulary, its use in this context is deeply problematic. Beneath the noise and the framing, one fact stands out: there is no clear indication so far of any victim coming forward to file a complaint. That silence should concern us far more than the label itself.

Reports indicate the circulation of explicit videos allegedly involving minor girls. If that is indeed the case, then this is not a “scandal” in the conventional sense. It is a matter of Child Sexual Abuse Material (CSAM), which refers to any sexually explicit content involving a minor, whether created, shared, stored, or even viewed. The use of the word “scandal” shifts the narrative from protection to spectacle. It invites curiosity, fuels discussion, and risks turning a serious child protection issue into something consumed rather than confronted. For a child who may already be afraid, ashamed, or unsure about coming forward, such framing can act as a barrier rather than support.

Goa is not unfamiliar with the problem of non-reporting in cases involving children. Fear of stigma, social judgment, and lack of trust in systems often prevent disclosure, even when harm is ongoing. In such a context, sensational language does not merely misrepresent the issue; it can deepen silence. The question, therefore, is not just what happened in Curchorem, but why, despite the scale of attention, no child has come forward. Are we creating an environment that encourages reporting, or one that makes children retreat further?

It is also important to recognise that cases involving CSAM do not begin at the point of arrest. They often begin much earlier in the digital ecosystem. Across the world, technology platforms are required to detect and report suspected CSAM. These reports are routed through the CyberTipline operated by the National Center for Missing and Exploited Children (NCMEC) in the United States and are shared with Indian authorities such as the National Crime Records Bureau and the Indian Cyber Crime Coordination Centre. From there, they are forwarded to State Police for action. This means that, in many instances, systems receive signals before cases become public. Globally, over 32 million such reports were generated in a single year, with around 5.6 million linked to India. This is not a marginal issue; it reflects a large and ongoing digital reality.

Yet, despite this framework, there is no publicly available data on how many such reports have reached Goa, how many have been acted upon, or how many have resulted in FIRs and investigations. This lack of transparency raises important questions about preparedness and response. Are these alerts being systematically tracked at the State level? Is there adequate capacity to analyse and act on them? Or does action depend largely on public outrage, media visibility, and political pressure? If the latter is true, then the system is not detecting early; it is reacting late.

At the same time, the nature of such cases is evolving. Not all instances of CSAM today originate from organised networks. A growing proportion involves adolescents themselves, particularly in the context of relationships. Young people may share intimate images out of trust, curiosity, or emotional connection, often without fully understanding the risks. When relationships break down, such content can be shared without consent, sometimes as a form of retaliation or control. This phenomenon, often described as revenge sharing, transforms what was once private into something harmful and widely circulated. Many adolescents do not realise that forwarding, storing, or even viewing such content can constitute a criminal act, thereby becoming part of a chain of harm.

At the centre of all of this lies a troubling reality: children are not reporting. Not because harm does not exist, but because fear does. Fear of blame, fear of exposure, fear of not being believed, and fear of consequences continue to shape their choices. A child protection system cannot function effectively if it does not address this fear. Beyond enforcement, there is a need for safe and accessible reporting mechanisms, school-based awareness and disclosure systems, trained and trusted adults, and confidential support services. Children must be assured that they will be protected, not judged.

The role of the media, in this context, is critical. Language is not a minor detail; it shapes public understanding and response. Sensational labels may attract attention, but they can also distort the issue and discourage disclosure. Responsible reporting in cases involving children requires restraint, accuracy, and a clear focus on protection rather than spectacle. This is not about limiting media freedom; it is about strengthening ethical responsibility.

The Curchorem case will continue to be discussed and debated. It may also be politicised. However, if the conversation remains confined to reaction and rhetoric, the deeper issue will remain unaddressed. This is not merely about one incident. It is about whether systems are equipped to detect early, respond effectively, and support children in a manner that builds trust. It is about whether language protects or harms, whether systems reassure or intimidate, and whether children feel safe enough to speak.

In the end, the measure of our response is not the intensity of our debate, but the presence or absence of children’s voices. And right now, that silence speaks louder than anything else.
