07 Feb Safeguarding podcast – interview with Susie Hargreaves CEO IWF
In this edition of the SafeToNet Foundation Safeguarding Podcast, we interview Susie Hargreaves, CEO of the Internet Watch Foundation (IWF) about the work they do in tracking down, cataloging and deleting online child pornography. It’s not an explicit episode but it’s nonetheless a tough topic.
If you’d rather read than listen, then the transcript below has been lightly edited for reading clarity.
Welcome to the SafeToNet Foundation’s safeguarding podcast where we talk about all things to do with safeguarding children’s online experience.
I’m in Cambridge today because I’m in the home of a great organization that should be a household name. It’s one of those hidden brands that we all benefit from. I’m with their CEO, Susie Hargreaves.
Thank you for inviting me into your fantastic premises, Susie. Perhaps you could explain who you are. What is it you do? What is the IWF?
Okay, so the IWF is an international hotline for reporting and removing online child sexual abuse. We’re based in the UK and we’ve been going for 22 years. We were set up by the Internet industry.
When we started, the Internet was a very fledgling operation. And the Internet companies that existed at the time very much felt that they weren’t responsible for the content that they carried. The Government and the police disagreed with that, so they came to an agreement whereby the Internet industry would take responsibility for cleaning up its own house, and we were created as a result. At that time, we had seven or eight members.
We’re a self-regulatory body with an independent board and we’re funded by the internet industry. We currently have around 141 members who range from small Internet companies through to the global internet giants. And we all work together, to do everything we can to remove online child sexual abuse wherever it’s hosted in the world.
Okay. So, you are based in Cambridge, UK organisation, you’re in fact a charity.
Yes, we are a charity and that’s a good point. We’re a not for profit organization, so yes, funded by the internet industry, but we exist for public benefit.
While we are a membership organization and the Internet industry pays for us, we do have an independent board and we do work on behalf of the public to ensure that we can do everything we can to get rid of online child sexual abuse.
And the great thing about our model is that because we have the support of the Internet industry, we also have access to their technology, and their great minds in relation to technology and the Internet, and we can work together to do whatever we can to get rid of this heinous problem.
And when people ask why Internet companies do that, they do it for two reasons. One reason is that they don’t want to be associated with child sexual abuse. They are people, they have children, they have families, and they don’t want anything that they work with associated with this horrible problem.
And the second thing is it’s obviously very bad for their brand, so it’s in their interest to work very closely with us.
It’s really important to say that whilst we are a charity and we’re totally independent of government and law enforcement, we do work incredibly closely with both. So that relationship we have, where it isn’t the government on one side and industry on the other, is really important, because it means we can work with everybody and help them to achieve their [safeguarding] objectives as well.
The IWF is a UK-based charity, independent, funded by your members, the ISPs, the Internet service providers, and other network operators. But your reach and influence go beyond the shores of the UK, you do have an international impact as well.
Yes, absolutely. When our membership originally started, it was just UK-based companies, and our members [now] range from SafeToNet to Google, Facebook and Amazon, a really wide range of organizations, from household names through to companies that are incredibly important because of the services they provide, but that people wouldn’t necessarily have heard of.
But they are operating and based all around the world. So initially, it was very much about being in the UK, but of course the Internet is global, so whatever we do, if we’re going to achieve our mission of eliminating online child sexual abuse, we have to work internationally.
The majority of our companies are now international members and they work with us at an international level. Google is our member in the UK, but because they’re a member across the world, they are a member in every single country in which they operate, and they will take and deploy our services in every one of those countries. So it’s very much an international approach that we have to take.
Most people, fortunately, will never come across any of this kind of content and if they do, it’s almost certainly going to be by accident. Which begs the question, how big a problem is it? Because you’ve got this wonderful, wonderful premises in Cambridgeshire and you’ve got a sizable staff. You’ve got a lot of people doing a lot of different things, which we’ll go into in more detail later. So this problem must be of a certain size; it’s not half a dozen grubby photos floating around somewhere, is it? It’s quite a large problem.
Yes. You’re absolutely correct, Neil. When you look at the size of the problem nobody’s able to totally quantify exactly how big the problem is, and that’s for a number of reasons. New people are coming online all the time across the world, an awful lot of the images and videos we see are duplicates, they’re shared again and again.
New content comes online, so nobody really knows the full scale of the problem. But we do know some indicators that help us realize that the volume is growing and growing.
It’s not a new thing. I think one of the things we should say is that people did look at child sexual abuse before the Internet, but the Internet has caused an explosion in terms of people’s ability to access this material, it is now not very hard if you’re really determined to access this material.
To give you a sense of the size and scale of the issue, the police say that in the UK about a hundred thousand people at any one time are looking at child sexual abuse, so that gives you an idea of the scale just in the UK. I don’t think the UK is particularly worse than other countries in terms of people accessing this material; that just gives you an indication of the numbers of people looking at the content.
But for the IWF we’re about going after and removing that content. In very simplistic terms we’re about removing the content, and the police are about going after the bad guys. But we do work closely together on a number of issues.
In fact, in 2018 we took down over 105,000 web pages of child sexual abuse, and that’s millions of images and videos. Compare that with my first year at the IWF, seven years ago, when we took down 9,000 web pages.
And there are two reasons why we’ve been able to take down so much more content. Since 2014, the whole landscape has changed completely.
It was just after I started at the IWF that the Jimmy Savile revelations came out. I would say that since then, the issue of child sexual abuse has not been off the front pages of the media. Up until then, we were very much a kind of industry secret that was kept very, very quiet. We took the problem away for the industry, but it was clearly in the public domain once the Jimmy Savile revelations came out.
[Since then] the industry has been much more proactive and public about the problem. And David Cameron, when he was prime minister, got personally involved. He made it a mission of his to say, I want to sort this out, I want to see more being done to fight child sexual abuse.
And David Cameron also then said, look, we want you to start proactively searching for this content. Up until that point we’d simply been working off public reports. His [Cameron’s] Government said, you know where it is, go out and find more.
And [the second reason] was really on the back of two very high-profile cases of two little girls who were murdered, Tia Sharp and April Jones. They were separate cases, but the trials of their murderers happened in court at the same time. And for the first time in court, a link was made between people who had been looking at child sexual abuse, who did not have a history of crime, going out and then committing these horrific murders straight afterwards.
This was very much the push: the tragic murders of April and Tia, the Jimmy Savile revelations and David Cameron’s personal involvement. We went back to the industry and said, you know, you’ve all got to step up. You’ve got to do more.
And then we saw a 400% increase in the first year we were proactively searching. We’ve been able to proactively search for this content, which means we can find more, plus the technology has developed hugely in the last few years. We now have crawler software, which goes out and crawls the Internet for potential child sexual abuse, and we’re able to hash images, which means we can apply a digital fingerprint to known images of child sexual abuse. That gives us the opportunity to use that hash, that digital fingerprint, to go out and search for all the duplicates.
It’s been a combination of having the ability to go and proactively search and the technology behind it to help us do more.
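For readers unfamiliar with hash matching, here is a minimal, purely illustrative Python sketch of the idea Susie describes: compute a digital fingerprint of a file and check it against a set of fingerprints of known material. Note the caveats: real systems in this space use perceptual hashes (such as Microsoft’s PhotoDNA), which can also match re-encoded or resized copies, whereas a plain cryptographic hash like SHA-256 only matches byte-for-byte duplicates. The `KNOWN_HASHES` set and file paths below are hypothetical placeholders, not anything from the IWF’s actual systems.

```python
import hashlib
from pathlib import Path

# Hypothetical set of digests of known material. In a real deployment this
# would be a curated hash list, and the hashes would typically be perceptual
# (e.g. PhotoDNA) rather than cryptographic.
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"test", used here as a stand-in entry.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def sha256_of_file(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, reading it in chunks
    so that large files never need to fit in memory at once."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_duplicate(path: Path) -> bool:
    """Check whether a file's fingerprint appears in the known-hash set."""
    return sha256_of_file(path) in KNOWN_HASHES
```

A crawler built on this idea would walk candidate pages, fingerprint each image it finds, and flag any set membership hit for human review; the hash lookup itself is a constant-time set operation, which is what makes matching at Internet scale feasible.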
The problem seems to be of a remarkable scale and you’ve got certain technologies to help, but you also have a fantastic team of people. For the average person it is illegal to look at this material; just to be clear, viewing it, as well as its production and distribution, is illegal. But you’ve got a team of folk whose prime task is to search for this material, so they must be allowed to do that by law. They have an exception.
Yes. We have a Memorandum of Understanding with the Crown Prosecution Service and with the National Police Chiefs’ Council, which in effect protects our analysts and our hotline team; it means that if we’re looking at the content within the hotline, we can’t be prosecuted.
That’s a unique arrangement that we have, given that we’re not law enforcement. You’re absolutely right, what you’re viewing is criminal content, and we say criminal because it actually carries a prison sentence. There are lots of illegal things that are on the level of a speeding ticket, but this is not a speeding ticket; it is potentially an imprisonable offence, so it is a very serious matter.
It is anybody who’s under the age of 18, but [within] child sexual abuse there’s a massive range. So you’ve got babies through to 17-year-olds who have taken a self-generated image of themselves. What we see is shifting slightly, and we’ve seen more self-generated content, but we still see an awful lot of children under 10.
Those ages are really quite shocking, but you mentioned self-generated images. I read your annual report, and first of all, congratulations on a fantastic job; the results you’re achieving are really quite astonishing. One of the anecdotes that stuck in my mind was of a young girl in the family bathroom, doing whatever she was doing in front of the video camera, who then turned and engaged in a conversation through the locked door with someone who was probably a parent, dealt with that conversation and then resumed her activity. Is that what you mean by self-generated?
We would define it as content that has been produced by the child with no adult present in the room. But inevitably it’s nearly all done on live streaming and then recorded, which means that we don’t see the other side of it [the predator’s side]. We see the child and the end result. But clearly these are children who are coerced, deceived, they’re groomed. They are doing something at the behest of someone who we can’t see. We’ve seen so much of this content, we record it and then we will categorize it.
And unfortunately, in the six months of 2018 during which we recorded it, about one in four of the reports that we took action on involved self-generated [images]. And about 78% of the children in those images were girls we assessed as being 11 to 13 years old, in bedrooms or other household settings.
There is a model of safeguarding mentioned in Her Majesty’s Government’s guidelines for safeguarding children, called Contextual Safeguarding.
Contextual Safeguarding is a model that came out of, I think, the University of Bedfordshire. Dr Carlene Firmin has done fantastic work, and contextual safeguarding is all about the different physical spaces through which children pass during their day, and the different people surrounding the child who safeguard, or could be safeguarding, those children.
Starting with the day at home, they then get the school bus or make the journey to school, then obviously they’re at school, then after school there’s some kind of after-school activity, sports club or shopping in a shopping mall with their friends, then home…
But it seems to me that there are now contexts within contexts – when I was a young child the home was the home and anything that happened outside the home stopped at the garden gate.
Now you’re saying that the home is not safe. The bedroom and the bathroom are just as open as the shopping malls, through the use of social media. This provides people with access directly into the most private spaces, the [private] context that children are in.
I totally agree that children aren’t necessarily safe at home when they’re unsupervised. I think they can be safe, it’s all about ensuring that they have proper supervision and get proper education and they’re able to build their own resilience and look after themselves online.
But just because your 11-year old is in their bedroom, if they’ve got a camera enabled device and Internet access, they’re potentially at risk if they don’t know what they’re doing, so don’t think they’re just safe because they’re in their bedroom.
I don’t want to sound like the harbinger of doom or anything, but certain children are obviously more vulnerable, and there need to be parental controls on lots of levels. I don’t just mean technology controls, but ensuring that parents, carers and guardians are aware of the risks, and that children themselves are aware of the risks.
You know, it’s terribly worrying when you see these young girls who are really vulnerable; they’re at a vulnerable age and they’re easy to manipulate. It’s terribly heart-breaking.
What do you think are the causes of this? Because it seems, I don’t want to use the word “unnatural”, but if you look at the animal kingdom, animals don’t seem to abuse their children, their offspring, their young, in the same way.
Maybe we’re straying off the topic, but do you have a view as to the causes of this? The work you do is fantastic, but it’s after the effect. It’s after the event has taken place. Is there any way that it could be stopped before?
Do I know why men, and it is predominantly men, I have to say, do it? I don’t know. We’re not behaviourists, so I don’t really know exactly. But what I do know is that if you see an image or a video of a child who’s under 10 being raped, you know it’s wrong. You don’t need me to tell you that.
And actually, there are lots of ways in which people can check their own behaviour, and we just need to keep pushing a zero-tolerance approach to this, because people have got to take responsibility. I think people believe it is a victimless crime. It’s not a victimless crime. We don’t make up these pictures. They’re real children and they’re really sexually abused.
I would like to tell you a story about a survivor, we call her Tara, whom I met in the States. She was sexually abused by her stepfather from birth right up to the age of about 15, when she was rescued.
She saw her dad at weekends, so it was really hard for him to pick up on what was happening with her. Her mother was totally unaware of it, and eventually she was rescued and her stepfather got 60 years in prison, which gives you a sense of how bad it was.
She now speaks about her experiences. She had an assigned police officer, and in the United States you can opt in to be notified any time someone is caught with your images on their computer, because it’s linked to damages; you have the right to sue them for damages. She opted in, and by the time I saw her she had had over 1,500 notifications. The police officer who worked with her told me that one of her images had been shared 70,000 times.
She told me that she was once in a shopping mall, this was after she’d been rescued, when she was about 18 or 19, and a man came up to her in the shopping mall and started talking to her about looking at her images online.
She felt physically unsafe all the time. And I think people believe it’s a victimless crime, that it doesn’t matter because [they think] “I’m not hurting these children, I’m just looking at these images”.
But the fact is that the survivors I’ve met don’t just spend their time thinking “Has someone in this room seen my picture?”. They actually feel physically scared, all the time, that someone is going to hurt them. I just think it’s such a terrible crime, and people need to realize that, and realize that it’s not okay to look at these images. They’re real people.
We mentioned earlier the word “sexting”, and what we’ve talked about so far is really horrendous, but sexting is a little bit different. The reason I say that is that most people don’t realize that it’s illegal for anyone under 18 to have an intimate photo of themselves.
Today it’s not unusual to read a two-page spread in a mainstream publication like Cosmopolitan, a perfectly legitimate and worthy publication, nothing wrong with it at all, on the joys of sexting, and that’s fine for adults. It’s a grown-up’s magazine, not a publication intended for children. But inevitably children will flick through this kind of thing, and they will read a two-page spread in a legitimate magazine that says “The joys of sexting and how to get the best out of it”.
And of course, teenagers being teenagers, they’re going to try this stuff, because that’s what teenagers do. They experiment and they try and they push limits, but they are technically breaking the law. And if they take a photo of themselves, they’ve broken the law, if they then send it, they’ve broken the law and they are in fact distributing child pornography.
Yes, you’re technically correct, but we’ve done a lot to try not to criminalize young people in that situation. I know you can probably cite some exceptions, some ridiculous situation where somebody who was 17 ended up on the sex offenders register or something. But if you talk to Simon Bailey, Chief Constable of Norfolk and the National Police Chiefs’ Council’s lead on child protection, he is absolutely clear: there is no desire to criminalize young people who may have shared images of themselves, particularly if they’re over the age of 16, where there’s the issue of consent and all that kind of stuff.
We’re in the middle of developing a project with the NSPCC where [older] children self-refer images of themselves and we then work with them to get those images taken down. So I think a kind of common sense is prevailing. Technically you’re right, they are illegal images, and you’re right that people don’t realize that even if you’re over the age of consent, if you’re under 18 it’s still an illegal image.
But I think people are really trying to be sensible.
I’ve had young people say to me, what’s your problem, sexting is fun, it’s exciting, all that kind of stuff. And I think, yes, you probably think it is, but you probably don’t really want some paedophile looking at images of you, which is how it comes back to us. I don’t think they realize how those images are actually shared.
And of course, it might be fun and exciting right now, but maybe it won’t be when you’re 26 or 27 or whatever when it comes back to bite.
There are so many cases of celebrities who’ve had images shared, or who’ve been hacked and had images shared. It’s been appalling, but it hasn’t actually done them harm in the sense that it’s not quite the career-ruining moment it might have been.
But actually, their privacy has been invaded and that’s totally unacceptable. I just think we are really moving towards being sensible, and to be honest, the police have got so much to deal with that they haven’t got time to deal with 17-year-olds who … they’re trying to deal with major things; their resources are so thin now.
So yes, there will be some extreme cases where somebody reacted ridiculously and criminalized the young person, but I don’t think anybody wants to do that.
Having said that, we’re partners in the UK Safer Internet Centre, and one of our partners, South West Grid for Learning, publishes a booklet called “So You Got Naked Online”, which helps you navigate what to do if you want to try to get rid of these images [of yourself].
And I do think young people should have the right to have those [images] removed. The IWF can only remove stuff if it’s criminal and if we know that they’re under 18, so there are different ways to get the content removed. And that’s not just down to us, but you’re right that it’s potentially a criminalising act.
You look after your staff really well. I’ve had a look around, and I love the space hopper, I love the table tennis table. You’ve got all these lovely things, but they’re there to serve a real purpose, because your analysts do a hell of a job; they have to look at this material and classify it. So tell us a little bit about that.
OK, you met one of our analysts downstairs, and our analysts are the heart of what we do here at the IWF. Nothing is more important than the staff in the hotline, so we do everything we can to protect them. It is a strange and unique job, and everybody who applies for it wants to do something, wants to make a difference.
And one thing I feel very proud of is that when people go into the hotline, they say there’s a really good atmosphere in there. It’s not stressed, and that’s partly because we really go to great lengths to look after them.
So, when we advertise for people, we have lots of people apply, but we have a very intense, strenuous recruitment process.
We have the normal kind of application form and an interview, but we also do a psychological profile. We’re looking for people who are very resilient. We don’t want people coming with a kind of personal mission. We don’t assess content in relation to whether we approve of it or not; we assess it on whether it’s illegal or not. If people have particularly strong views about certain things, they might not be appropriate here.
So they have a psychological profile, then an image-viewing session, and then they have the weekend to think about it, and some pull out at that point. When they start, there’s six months of training for an analyst. They have mandatory monthly counselling, they do group work and team building as well, and they have an annual psychological assessment. We also run four away days for all the staff.
And we try to create an environment and atmosphere that makes them feel looked after. Our analysts never work a minute’s overtime. They have mandatory breaks, and they can also take breaks whenever they like. They can request extra counselling, and they do a lot of peer support for each other, so they keep an eye on one another. And then we have, as you say, a breakout room with table tennis, and Sky very kindly gives us a full licence for all their packages, so we’ve got two big TVs!
And we really encourage people to mingle at lunchtime because, as you know from being taken into the hotline, the hotline is a protected environment behind those two doors. When visitors go into the hotline we have to shut down so that nobody can see what’s on the screens, and half our staff don’t go into the hotline at all.
So we work very hard to make sure that they mingle a lot so that they actually spend time with each other otherwise they can get quite segregated.
But we really do look after them and we’re very, very proud of them.
This is a hotline, and just like other hotlines, the general public can report things. If a member of the public stumbled across something, how would they report it?
Okay. So, if you’ve accidentally stumbled on online child sexual abuse, you need to send us the URL, the web page address at the top of your browser. That’s what we need; we can’t do anything without it. We don’t take phone calls, because we need the actual web page so that we can look at it. Go onto our website, iwf.org.uk, and it’s right there on the first page where you can report.
You can either leave us your personal details, if you want to know what happened to your report, or you can report anonymously; about 80% of our reports are anonymous, and we don’t share any information with the police.
In fact, the only information we share with the police is anything related to the victims. So if we spot a new victim, we immediately escalate that, and we work with the police to see if we can provide information – the images and pictures might give some clue as to the whereabouts of that child. So, please report to us if you see anything that you think might be child sexual abuse. We want people to report to us and we’ll do everything we can to get it removed as soon as possible.
There is one other area that I need to explore with you. It’s January 2019, with something like 60 days before the B word kicks off; we cannot avoid the topic of Brexit. We can’t avoid it because you are a charity: you get funding from your members, the people who pay for your services, but you also receive substantial grants from the EU. I think you’ve successfully secured funding from the EU until 2020, but then all bets are off.
And to my mind it would be an awful shame if, for lack of some money, this service stopped. I don’t know if this is an existential crisis for you, but clearly there must be an impact.
Absolutely. We get £400,000 a year from the EU, and that pays directly for some salaries in the hotline; it pays 50% of the hotline salaries. It’s really important to recognize that unless we replace that money, we’ll have to cut down our staff.
So we’re working really hard to look at how we can raise money from other sources. The [EU] money is guaranteed to the end of 2020. We are in discussion with the UK government, but we’re also looking at other possibilities.
One thing we’ve never really explored before, which we’re looking at now, is donations from the public. We need to raise our profile a bit because we want people to know that we’re here, we exist and are providing really important services for the UK. And we’re looking for ways to generate [money], and meet that gap.
But not just meet that gap because we could always do more. Actually, we want to grow what we’re doing. We could have another 20 analysts in there and we could be doing work every day and we could bring down loads more content.
We can bring down as much content as we have analysts to find it. Bear in mind that yesterday our analysts took action on a thousand web pages; they are a pretty effective team. Think how much we could do if we had more people!
Coming out of the EU is also problematic for us in terms of legislation, and we’re part of INHOPE, the international network of hotlines. But the main challenge for us is that funding; you’re absolutely right to mention it, Neil.
So if anyone can look down the back of their sofa…
Oh yes! They can donate on our website. That would be great. We are a charity, and any money people donate to us, we will put towards fighting the fight.
And what a worthy fight to fight.
Listen, we’re going to have to wrap it up, so thank you so much for your time, it’s been fascinating. And if anyone has been affected by any of this rather traumatic content, then talk to the NSPCC, or perhaps the Samaritans, or seek some external advice, because it’s a tough subject.
And I think your team do a fantastic job, Susie, so thank you.