Issue: Europe II 2015
Article no.: 6
Topic: Nudity and child sexual abuse: why social networks are not to be blamed
Author: Christian Berg
Title: CEO & founder
Organisation: NetClean Technologies
PDF size: 1024KB
About author
Christian Berg, CEO, founded NetClean Technologies ten years ago with the purpose of developing technical solutions to stop the spread of child sexual abuse content. He has grown the company to its current global market position, partnering with law enforcement agencies all over the world and with IT companies such as Microsoft and Intel Security.
With a strong technology background in software development, security and cloud services, Christian has an extensive global network of technical experts. He is an experienced speaker and is often invited to speak about technical solutions against child abuse content at events around the world.
Christian Berg holds a Master’s degree in Automation Engineering from Chalmers University of Technology and a Master’s degree in Innovation and Entrepreneurship from Chalmers School of Entrepreneurship.
The products and services from NetClean Technologies are used all over the world by law enforcement, ISPs and Fortune 500 companies.
Article abstract
Christian Berg, CEO and founder of NetClean explains how the use of social media has desensitised our behaviour and attitude towards child sexual abuse, and how digital investigators are using imaging technology to identify victims and remove illicit content.
Full Article
Our lives are much more connected than ever before. We have become obsessively attached to our smartphones and tablets, watching and following what our friends and peers are doing on social media networks and mobile chat apps. Our interactions and awareness of the actions of others are much more frequent, more instant, and largely, less meaningful.
Today, Facebook has 1.4 billion active users and Twitter attracts over 236 million. The power of all of these people coming together is immense. We have already seen plenty of success stories, from crowdfunding innovative Kickstarter projects to raising awareness and funds for charitable causes. Everything is documented online – the journey, the people involved, with plenty of photo evidence to show ‘I’ve done it!’ People and businesses expend immense effort and vast sums of money to capitalise on this power and appear in the best light.
As filtered and finessed as our online world may be, social media isn’t just a flattering mirror of the world, with all the prettiest, wittiest elements hand-picked and presented. It also mirrors the darker elements of our society: criminals, trolls, bullies, groomers and abusers. The anonymity and pervasiveness of social media have given criminal communities a way to come together and find like-minded individuals, fuelling a booming industry in online crime. Whilst criminals are largely seen as operating in unseen forums, dark web listings, private chat groups or peer-to-peer downloads, social media has become another tool in their arsenal.
Flickr, Facebook, Twitter and numerous other social sites are used by those who create, distribute and collect child sexual abuse (CSA) images and videos. Most CSA content is created illegally by organised gangs or by individual groomers and abusers, and then traded. However, an increasing amount of CSA content is being generated by children themselves, encouraged by peers or groomers, or simply falling into line with an overly sexualised sharing culture.
The dark side of a sharing culture
Learning to share is a hard lesson for kids, but it seems social media has accelerated that learning curve. Openness has become the norm and privacy a concern for another time.
In 2014, an IWF report assessed 3,808 ‘self-generated’ sexual images of children and young people online, 17.5 per cent of which featured children aged 15 and under – something not found in the IWF’s previous report, in 2012. Almost half of the content depicting children aged 15 years or younger was Category A or B, compared with 27.6 per cent of content in the 16–20 years age range. Category A refers to images involving penetrative sexual activity, or sexual activity with an animal or sadism. Category B refers to images involving non-penetrative sexual activity.
Every day, thousands of selfies, food snaps and family photos are shared on Facebook, Twitter, Instagram, Snapchat, Tumblr and other forums and websites. These spaces give the illusion of being your own, your plot of land in the teeming mass of the internet. Yet once that material is released into the wild, it is not only archived by you and the social network you share it on; it is also open season for the rest of the world. In fact, 90 per cent of the images and videos assessed as part of the IWF study had been harvested from their original upload location and were being redistributed on third-party websites.
Phenomena like ‘revenge porn’ and ‘doxing’ have shown us the devastation that sharing too much, even with people you trust, can cause. Unfortunately once the content is out there, there is very little that individuals can do to bring it back under control.
Normalising criminal communities
Anonymity, and the ease with which obscene content can be accessed have helped reduce not only the physical, but also the mental barriers to engaging in illegal activity. A click, an open browser, a saved file, seems such a small action for such a devastating crime.
Twenty years ago, criminals and voyeurs were by definition operating alone. The communication was linear. Once the police picked up the trail, they could quickly shut down the operation and press charges. Case closed. However, with the rise of social media, the Internet has provided the means to bring people together, normalise their behaviour and change their attitudes towards traditional societal values. As a result, people are drawn towards collective thinking and behaviour – sometimes for good, other times in illegal practices like child sexual exploitation.
On the whole, social networks are doing a much better job of taking down explicit and offensive photos and videos than they were just a few years ago. Facebook, for example, has recently revamped the community standards behind its takedown policy. These now include a separate section on “dangerous organisations” and give more detail about what types of nudity it allows to be posted. It actively encourages its members to report posts that they believe violate its rules.
Yet the reality is quite different. As soon as one case is closed, another opens up. More child sexual abuse cases emerge every day, and each case could contain new or unidentified victims. The challenges facing investigators today are far more widespread than anyone anticipated. The only way to tackle the problem is at its root cause.
Calling for citizen police
Social media has brought about a change in our roles in society. Today the public actively collaborates with social networks and law enforcement to monitor and report suspicious criminal activity and child abuse cases. Police officers are also able to embrace the anonymity or pseudonyms that social networks offer and blend into the public to catch criminals in the act.
Child sexual abuse happens across the world. South Africa, for example, has one of the highest rates of sexual violence against children in the world. A recent study by the Bureau of Marketing Research (BMR) at UNISA highlighted just how vulnerable children are online. The study found that 31 per cent of children in Gauteng had encountered paedophiles online, while 60 per cent had taken and shared naked or semi-naked pictures of themselves. Similar incidents have also been reported in the UK, as seen in recent high-profile cases involving celebrities. In both countries, it is clear that society lacks a truly proactive approach to prevention.
While the subject is still very much taboo, awareness is growing. Legislation is being put in place to protect children, and organisations around the world are calling for the public to be more vigilant about what is happening around them – in the workplace, in their online communities and in their neighbourhoods.
The European Financial Coalition against Commercial Sexual Exploitation of Children Online (EFC) is a coalition of key actors from law enforcement, the private sector and civil society in Europe. Chaired by Europol, the coalition was created to fight the commercial sexual exploitation of children online. In tackling child sexual abuse online, it recommends closer cooperation between law enforcement, public hotlines and the private sector to monitor and halt the circulation of child abuse material.
The EFC also aims to raise awareness of new forms of criminal behaviour involving commercial sexual extortion. It is engaging directly with representatives of alternative payment systems (such as Bitcoin and other digital payment methods) to clamp down on illegal transactions involving child abuse material. By monitoring the Deep Web and Darknet, the coalition aims to prevent commercial child sexual exploitation from migrating further from traditional payment systems into a new, largely unregulated digital economy.
Of the long list of recommendations, in our opinion the most effective way to tackle child sexual abuse is to track the abuse content itself. This includes the proactive identification and removal of child abuse material based on hash-matching and PhotoDNA techniques.
Tackling the root cause
Advanced image recognition technology has proven to be the most useful tool to aid investigators in finding relations between digital files and building visual maps when solving cases.
With the proliferation of digital media, investigators and social networks face a huge challenge in trying to prevent illegal content from being shared on these services. Facebook alone sees over 350 million photos uploaded by users each day. With millions of images and videos circulating online, manually reviewing and analysing these digital files is an impossible task. Although more digital files mean more potential evidence, making sense of such large quantities of information is time-consuming, frustrating and nearly impossible.
However, technology can extend our visual capabilities, using image hashing to separate pertinent (illegal) from non-pertinent material. By tackling the root of the problem, the image itself, we can ensure that those responsible for creating and sharing this material are found. More importantly, by finding and assessing the material, we can find and help the victims of abuse.
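A cryptographic hash only catches exact copies: changing a single pixel produces a completely different digest. Robust matching techniques such as Microsoft's PhotoDNA therefore use perceptual hashing, in which visually similar images yield similar fingerprints. The toy sketch below illustrates the idea with a simple "average hash" over a small grayscale grid; it is a conceptual illustration only, far simpler than any production algorithm, and operates on raw pixel values rather than a decoded image file.

```python
# Toy "average hash" to illustrate perceptual hashing: each pixel becomes
# a 1 if it is brighter than the image's mean brightness, else a 0.
# Visually similar images produce hashes with a small Hamming distance.

def average_hash(pixels):
    """pixels: 2D list of grayscale values (0-255). Returns a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests similar images."""
    return sum(a != b for a, b in zip(h1, h2))
```

In this scheme a slightly re-compressed or brightness-shifted copy of a known image still lands within a small Hamming distance of the original hash, which is why perceptual hashes survive the re-encoding that happens every time material is re-uploaded.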