CBI now part of Interpol’s anti-child abuse database
The Central Bureau of Investigation (CBI) has joined Interpol's international child sexual exploitation database, a move that will bolster its fight against the sexual abuse of minors and help identify victims and abusers using sophisticated analytics software, officials said.
Sources in the agency said the CBI recently joined a select group of agencies from 67 countries that are connected to the database.
The Interpol database on paedophiles and disseminators of child sexual abuse material (CSAM) on the internet draws on multimedia from various countries, which is analysed using specialised “image comparison software” to identify victims and accused persons from photos and videos, they said.
Once the national central bureau of a country, the CBI in the case of India, gets that information, it proceeds against the offenders according to local laws.
The Interpol mechanism enlists the help of financial institutions, internet service providers and software developers, besides cross-sector partners, in tracking CSAM and shutting down illegal distribution channels.
The database, which holds over 2.7 million images and videos, helps identify an average of seven abuse victims across the globe every day; over 27,000 victims and over 12,000 abusers have been identified so far, the Interpol website states.
The tool not only helps various investigating officers share information but also avoids “duplication of effort and saves precious time” by telling investigators whether images have already been identified in another country.
The CBI had last year cracked a major network of paedophiles and distributors of CSAM who were selling the illicit content on social media for as little as Rs 10 for 60 videos, with payments received through Paytm.
Code-named Operation Carbon, the agency’s coordinated operation had unmasked 51 social media groups comprising 5,700 offenders, with five lakh social media messages and 10 lakh suspected CSAM videos.
Using artificial intelligence, the agency sifted through the material to identify victims and assess whether a message fell into the category of CSAM or adult pornography, sources said.