A New Approach Helps Eliminate Child Sexual Abuse Images


Every day, a team of analysts in the UK faces a seemingly never-ending mountain of material. The team of 21, who work at the Internet Watch Foundation’s office in Cambridgeshire, spend hours reviewing images and videos of child sexual abuse. Each time they find a photo or video, it needs to be assessed and documented. Last year alone the team identified 153,383 web pages containing child sexual abuse imagery. This work creates a vast database that can be shared internationally to help stem the flow of abuse. The problem? Different countries have different ways of categorizing images and videos.

Until now, analysts at the UK-based child protection charity have sorted the material they find into three categories: A, B, or C. These categories follow the UK’s laws and sentencing guidelines for child sexual abuse and broadly describe the types of abuse depicted. Images in category A, for example, the most severe classification, cover the worst crimes against children. The classifications are then used to determine how long someone convicted of an offense should be sentenced for. But other countries use different classifications.

Now the IWF believes a change in its data could remove some of these differences. The group has rebuilt its hashing software, Intelligrade, to automatically match images and videos to the laws and classification rules of Australia, Canada, New Zealand, the US, and the UK, together known as the Five Eyes countries. The change should mean less duplication of analytical work and make it easier for tech companies to prioritize the most serious images and videos first.

“We think we are able to share data so it can be used in more meaningful ways by more people, rather than all of us just working in our own silos,” says Chris Hughes, director of the IWF’s hotline. “At the moment, when we share data it is difficult to make any meaningful comparisons against our findings, because the categories simply don’t line up.”

Countries weight images differently based on what happens in them and the age of the children involved. Some countries classify material according to whether children are prepubescent or pubescent, as well as the crime taking place. The UK’s most serious category, A, covers offenses such as penetrative sexual activity, bestiality, and sadism. Some content it excludes, Hughes says, falls into a higher category in the US. “At the moment, the US requesting IWF category A images would be missing out on that level of content,” Hughes says.

All the images and videos the IWF reviews are given a unique hash, essentially a numeric fingerprint, which is shared with tech companies and law enforcement agencies around the world. These hashes are used to detect and block known abuse content when it is uploaded again. The hashing system has significantly curbed the spread of this material online, and the IWF’s latest tool adds new information to each hash.
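As a minimal sketch of the principle, hash matching means computing a fingerprint of an uploaded file and checking it against a list of known hashes. Real deployments typically use perceptual hashes (such as Microsoft’s PhotoDNA) that survive resizing and re-encoding; the cryptographic SHA-256 used here only matches byte-identical files, and the function names and block-list entry are illustrative, not the IWF’s actual system.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Compute a SHA-256 fingerprint of a file's bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical block list of known hashes. In practice such lists are
# distributed by organizations like the IWF and loaded into a fast
# lookup structure on the platform's upload path.
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_known_content(path: str) -> bool:
    """Return True if the file's hash appears on the block list."""
    return sha256_of_file(path) in known_hashes
```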

The IWF’s secret weapon is metadata. This is data about data: the what, who, how, and when of what an image contains. Metadata is a powerful tool for investigators because it allows them to spot patterns in people’s behavior and analyze them for trends. Among the biggest proponents of metadata are intelligence agencies, which have argued it can reveal more than the content of messages themselves.

The IWF has increased the amount of metadata it creates for each image and video it adds to its hash list, Hughes says. Each new photo or video it reviews is assessed in more detail than ever before. As well as determining whether abuse content falls into one of the UK’s three categories, its analysts now add up to 20 more fields to their reports. These fields match what is needed to determine an image’s classification in the other Five Eyes countries; the charity compared each country’s laws and worked out what metadata is required. “We decided to offer more granularity in describing age, more granularity in describing what is taking place in the image, and also to confirm gender,” Hughes says.
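To illustrate the idea, here is a minimal sketch of what a metadata-enriched hash record and per-country classification rules might look like. Every field name, category label, and rule below is a hypothetical simplification for illustration, not the IWF’s actual schema or any country’s real legal test.

```python
from dataclasses import dataclass, field

@dataclass
class HashRecord:
    """A hash plus the extra metadata fields analysts record.

    All field names are illustrative, not the IWF's real schema.
    """
    sha256: str
    uk_category: str                      # "A", "B", or "C" under UK guidelines
    age_band: str                         # e.g. "0-6", "7-13", "14-17"
    acts_depicted: set[str] = field(default_factory=set)
    gender: str = "unknown"

# Hypothetical per-country rules: each maps a record's metadata to that
# country's own classification label, so one shared record can be
# interpreted locally instead of being re-assessed from scratch.
def classify_us(record: HashRecord) -> str:
    # Placeholder rule: the US weighs some acts more severely than the UK.
    if "act_x" in record.acts_depicted:
        return "US-severe"
    return "US-standard"

def classify_ca(record: HashRecord) -> str:
    # Placeholder rule keyed on age band alone.
    return "CA-severe" if record.age_band == "0-6" else "CA-standard"

RULES = {"US": classify_us, "CA": classify_ca}

def classifications(record: HashRecord) -> dict[str, str]:
    """Compute every jurisdiction's label for one shared record."""
    return {country: rule(record) for country, rule in RULES.items()}
```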
