The F%cking Takedown of Pornhub with Laila Mickelwait
FML Talk · December 18, 2024 · 54 · 00:32:48


Trigger warning: This episode deals with sexual assault, sexual abuse, and attempted suicide.

 

This week, Gabrielle sits down with non-profit organizer turned investigative journalist Laila Mickelwait to discuss Laila's role in taking down the now-infamous adult content site Pornhub. Laila was one of the first to raise the alarm about the site's complete lack of content and user verification, which allowed anyone with just an email address to upload content, no matter what it was. This effectively meant the site was set up to enable perpetrators to upload abusive and illegal content with no repercussions for their actions whatsoever. Until f%cking now. This episode highlights the urgent need for more accountability in our digital age, as well as how one person with a mission for justice can make a huge difference.

 

You can follow Laila @lailamickelwait to learn more about her book “Takedown” as well as the Justice Defense Fund, an organization she set up to empower victims to hold their perpetrators accountable.


Thank you so much to our incredible sponsors!

Lumen - Visit lumen.me/FMLTALK to get 15% off your Lumen


Beam - Get up to 35% off for a limited time when you go to shopbeam.com/fmltalk and use code fmltalk at checkout.


Follow your host!

TikTok: @gabrielle_stone

Instagram: @gabriellestone & @fmltalkpodcast

YouTube: FML TALK

Website: www.eatprayfml.com

Plus, if you want to submit an FML story, email it to info@eatprayfml.com 

Learn more about your ad choices. Visit megaphone.fm/adchoices

[00:00:00] Lumen is the world's first handheld metabolic coach. It's a device that measures your metabolism through your breath. The app then lets you know if you're burning fat or carbs, gives you guidance to improve your nutrition, advises you on workouts and sleep and even stress management.

[00:00:16] Having all this information at your fingertips is remarkable. The way you can tailor your health needs on the day-to-day to really learn about your metabolic health is so incredibly valuable. Your metabolism is your body's engine. It's at the core of so many answers your body is trying to give you. Lumen gives you all the recommendations you need to improve your metabolic health.

[00:00:40] So if you want to stay on track with your health this holiday season, go to lumen.me slash FML Talk to get 15% off your Lumen. That's L-U-M-E-N dot M-E slash FML Talk for 15% off your purchase. Lumen makes a great gift too. Thank you, Lumen, for sponsoring this episode.

[00:01:05] What is up, all of my beautiful freaking people? Welcome back to another episode of FML Talk. Today's episode is serious. It is shocking and it is a little hard to listen to, but it's also captivating, powerful and important. So sit back, get ready and welcome to FML Talk.

[00:01:28] Oh my God. Wait, how old was the other girl? 19. Can you believe that? Hey, this is Gabrielle Stone.

[00:01:32] Good book? He did what? 48 hours? What a dick. Yeah, but have you seen all the roses on our Instagram? And this is FML Talk. Oh no, she didn't.

[00:01:44] Okay, everyone. Normally we have lots of trigger warnings on various episodes. This one I really need you to listen to. There is a trigger warning for sexual assault, sexual abuse, and attempted suicide.

[00:01:57] It's kind of like the umbrella of all the triggers rolled into one.

[00:02:02] We are going to be talking about the downfall and the takedown of Pornhub today. And I was not aware of this story, although it has been widely in the news over the past couple of years.

[00:02:15] It is shocking. The details of it are disturbing. The stories that will be told throughout this episode are jarring, but it is really an important episode to listen to for protecting your children, being aware of the sexual crimes that take place, and how we are fighting against those that are happening.

[00:02:45] So, buckle up. Today's episode feels like a true crime show that is real life. Here we go.

[00:02:59] Laila Mickelwait, welcome to FML Talk. I am so excited to have you here. And I feel like excited is a weird word because of the subject matter we're going to be talking about.

[00:03:10] But I am excited to have you here nonetheless to share your story and dig in today.

[00:03:15] Thank you so much for having me. I am excited to be able to talk with you and share with your listeners as well.

[00:03:20] Can you, I guess, just take us to the beginning of how all of this began, how all of this started, and just give us kind of the cliff notes of what you went through?

[00:03:30] A hundred percent. So, just to kind of give context for what we're talking about, the story that we're discussing is about what the Financial Times has called one of the biggest takedowns of content in internet history, which happened because Pornhub, the world's largest and most popular porn site, we discovered was infested with videos of real sexual crimes.

[00:03:52] So, you know, content warning for those who, you know, could be sensitive to this kind of content about sexual violence. This is what it is about.

[00:04:01] This all started because I had been investigating and combating the injustice of sex trafficking for, at the time this began in 2020, it had been over a decade.

[00:04:12] You know, I had been fighting this fight, but was noticing a trend of sex trafficking and this kind of violence against women and girls and boys, not only happening offline, but in a digital world, it was being monetized and distributed online as well.

[00:04:27] And so, I was looking at that intersection between what I called the big porn industry, just like there's big tobacco, there's big pharma, there's actually an industry that I called big porn.

[00:04:37] And, you know, I had been noticing some headlines at the end of 2019.

[00:04:41] There were some very disturbing headlines that were showing up in the news that were grabbing my attention.

[00:04:48] One of them was about a woman named Nicole Addimando. She was a mother of two young children, like I was at the time, and she had been raped and sexually tortured by her partner.

[00:05:01] And he was filming those images and videos of abuse and uploading them to Pornhub.

[00:05:07] And she ended up killing him.

[00:05:09] And she was sentenced to life in prison.

[00:05:12] You're kidding.

[00:05:13] For what she did.

[00:05:14] I'm not kidding.

[00:05:15] It's one of the most horrific stories I've ever heard.

[00:05:18] And I mean, just the image of her, like holding her two young children before she had to go to prison was like, just heart shattering.

[00:05:27] But this is what happened to her.

[00:05:29] Thankfully, her sentence was reduced to seven years after much fighting by her advocates.

[00:05:36] Today, just so you know, she's free.

[00:05:38] So she was actually released just recently.

[00:05:41] So that was in the headlines.

[00:05:43] And I'm like, okay, so the torture and rape of this woman was uploaded to Pornhub.

[00:05:48] Okay, noted.

[00:05:49] There was also an investigation by the London Sunday Times, which is a renowned paper in the UK, like the New York Times.

[00:05:55] And they had found dozens of illegal videos on the site within minutes, even children as young as three years old.

[00:06:03] Wow.

[00:06:04] Noted.

[00:06:05] Really, you know, horrible.

[00:06:06] Then there was a story about a 15-year-old girl from Florida who was from Broward County, Florida.

[00:06:12] She had been missing for an entire year.

[00:06:14] And then she was finally found when her mother was tipped off by a Pornhub user who had recognized her daughter on the site.

[00:06:22] And so her mother found the child being raped and abused in 58 videos that were not only being monetized with ads all over the videos, but also sold as paid-to-download content, where Pornhub was taking 35% of the rape video sales and the abuser was taking 65%.

[00:06:40] And then police actually matched his face from the video to surveillance footage from a 7-Eleven, rescued her from his apartment.

[00:06:48] He had impregnated her as well.

[00:06:50] And it was under the title, like the account name was called Daddy Slut.

[00:06:55] I cannot.

[00:06:56] I mean, the details are just so bad.

[00:06:59] And okay, so there's more than that.

[00:07:01] But let's just take those few examples.

[00:07:03] So those are, you know, I'm thinking about how in the world did this happen?

[00:07:07] Because at the time, Pornhub, you know, was the world's brand name for porn.

[00:07:12] And they had people wearing their apparel proudly in public.

[00:07:16] You know, there's jokes on Saturday Night Live.

[00:07:18] They're walking New York Fashion Week.

[00:07:20] They've got pop-up shops, you know, in New York.

[00:07:23] And it's just like this cheeky brand name.

[00:07:26] They have this whole arm of Pornhub called Pornhub Cares, where they do these philanthropic, you know, PR stunts, like save the oceans and the whales and the bees, and donate for breast cancer research.

[00:07:38] And all of this is like highly publicized.

[00:07:40] So there's this image of like they care about health and safety.

[00:07:44] And besides the fact that they're literally, you know, the 10th most trafficked website in the world, not just porn site, but website at the time.

[00:07:52] And they had 130 million visits per day, 47 billion visits per year, enough content uploaded every year that it would take 169 years to watch.

[00:08:02] And this was the world's YouTube of porn.

[00:08:05] So it's like anybody with a camera with an iPhone could film a sex act, become a pornographer, essentially literally anybody in the world, and then upload it online.

[00:08:13] So as these news stories were kind of churning in my mind, they really arrested my attention.

[00:08:19] I was up late one night, it was February 1st, 2020.

[00:08:22] And I was thinking about these news stories as I was trying to console my inconsolable baby who had had a birth complication and was just screaming all hours of the day and night.

[00:08:33] And as I'm sitting there walking him, I have an idea.

[00:08:36] And I say, I am going to test the upload system for myself and see what it takes to upload content to Pornhub.

[00:08:43] So finally get him to bed, I just take some videos of the rug and whatever in the dark room and go through the process.

[00:08:51] And I find out what millions of people already knew.

[00:08:53] And that was that all it took to upload content to Pornhub was an email address.

[00:08:59] So anonymously with a VPN, anybody in the world could upload content, no verification of ID, consent, to make sure they're not children, to make sure they're not rape victims.

[00:09:09] And that's exactly how all this content got on the site.

[00:09:12] And I realized at that moment that this is potentially the largest repository of sexual crime in North America, the servers, because they had at that time 42 million images and almost 11 million videos on the site that night that had all been uploaded without verifying that they're not children or it's not non-consensual content or image-based sexual abuse.

[00:09:33] And I just felt like, okay, I have to sound the alarm on this and began, just started the hashtag on Twitter called Trafficking Hub.

[00:09:41] And it started to catch on.

[00:09:43] I started a petition that now has 2.3 million signatures from every country in the world.

[00:09:48] And thousands of media articles have since been written.

[00:09:52] Whistleblowers from the company came forward.

[00:09:54] Even the former owner of Pornhub came forward to help.

[00:09:57] Victims were coming forward left and right.

[00:09:59] Victims were speaking out, saying this happened.

[00:10:01] And it all spiraled and ended up where we're at today.

[00:10:05] They had to take down 91% of the entire website because it was infested.

[00:10:12] Yes.

[00:10:13] Wow.

[00:10:13] Yeah.

[00:10:13] So they went from 56 million pieces of content to 5 million, 5.2 million.

[00:10:18] And we got Visa, MasterCard, and Discover to all cut ties with them.

[00:10:22] PayPal.

[00:10:23] They lost all major advertisers.

[00:10:25] The CEO and the COO were forced to resign.

[00:10:27] They sold the company.

[00:10:29] The secret shareholder was exposed.

[00:10:31] Like, all of this happened.

[00:10:32] But we're not done yet.

[00:10:34] Because justice still has not been fully served on behalf of so many victims.

[00:10:38] But that's kind of, in a nutshell, how it started and, you know, what has happened so far.

[00:10:43] Okay.

[00:10:44] What the fuck?

[00:10:45] There's so much to unpack here.

[00:10:47] So what was your job at the time?

[00:10:49] Yeah.

[00:10:50] Yeah.

[00:10:50] No, I was working at a nonprofit organization on an hourly basis at the time, because I had just gotten off an informal maternity leave. And so I was kind of just doing what I could do at the time, which was, you know, researching and posting on social media, because I was...

[00:11:07] But you weren't like an attorney or like, I mean, that's crazy.

[00:11:10] Well, I mean, I got my education.

[00:11:12] Like, I did my master's in public diplomacy.

[00:11:15] I had been involved with anti-trafficking work for over a decade.

[00:11:18] Okay.

[00:11:19] Okay.

[00:11:19] And so I was working on policy and things like that, trying to make an impact.

[00:11:24] At the time, to be honest, I was very discouraged because I had spent over 10 years trying to make a difference and felt like I was making very little progress.

[00:11:32] So when this all began, I was in a very disillusioned and discouraged place.

[00:11:37] That's different now because we've had so much happen over the last four years.

[00:11:41] But to be honest, that's where I was at when this began.

[00:11:48] You all know I've had a hell of a time with my hormones being all over the place since having my son.

[00:11:53] And it's hugely affected my health, especially during that time of the month.

[00:11:58] Hormone fluctuations are a common experience for many women.

[00:12:02] And it's a topic I've become increasingly aware of and focused on.

[00:12:07] These hormonal shifts can influence our health in numerous ways, affecting everything from our menstrual cycles and our skin to our energy levels and metabolism.

[00:12:16] That's why I am thrilled to introduce you all to GLOW by Beam.

[00:12:21] GLOW's ingredients support a multitude of different issues: metabolism, energy levels, immune function, collagen production, and inflammatory responses.

[00:12:30] Something so many of us are needing support with.

[00:12:34] GLOW by Beam is a hormone balancing super powder for women.

[00:12:38] This delicious drink mix is tailor-made to help balance women's hormones.

[00:12:42] It contains 18 comprehensive active ingredients: vitamins, minerals, and natural antioxidants, plus added electrolytes for a hydration boost.

[00:12:52] The benefit list is long, but for me it was the relief from PMS symptoms, which had become so awful after pregnancy.

[00:12:59] So on behalf of women everywhere, thank you.

[00:13:03] There has never been a better time to prioritize your health and try the best-selling GLOW super powder.

[00:13:10] Get up to 35% off for a limited time when you go to shopbeam.com slash FMLtalk and use code FMLtalk at checkout.

[00:13:20] That's shopbeam.com slash FMLtalk and use code FMLtalk for 35% off.

[00:13:34] So today, Pornhub is still accessible and exists.

[00:13:39] It's just clean of inappropriate content?

[00:13:42] It's not clean.

[00:13:44] That's the shocking part about it is that it's just hardly what it was before, right?

[00:13:49] It was cut down to a stump.

[00:13:51] But even the remaining 9% that's on the website today is still awash with unverified content.

[00:13:58] So they started to verify.

[00:13:59] So they had to take 91% of the site and flush it down the toilet because it was infested with videos of real sexual crime.

[00:14:07] But then they changed their process.

[00:14:10] Now they require uploaders to be verified.

[00:14:12] But that's just not enough because often the uploader is the abuser.

[00:14:16] Like there's a case I'll tell you.

[00:14:17] So there was a 12-year-old boy in Alabama.

[00:14:19] He was drugged, overpowered, and raped in 23 videos by a man named Rocky Shea Franklin, who's in prison for 40 years for what he did.

[00:14:28] And he was a verified uploader on Pornhub.

[00:14:31] So he gave his ID.

[00:14:33] He registered.

[00:14:34] And then he uploaded his rape videos of this 12-year-old boy.

[00:14:37] And again, paid-to-download.

[00:14:38] He was sharing the profits from this horrific abuse of this boy.

[00:14:43] And he was verified.

[00:14:44] So the problem is that the site is still infested with videos that have never had the individuals in the videos verified for age and consent.

[00:14:54] And that means that the site today, the rest of it is still awash with illegal content.

[00:15:00] So they have to actually take down the rest that has not been verified.

[00:15:05] So that's the issue today that, you know, what's going on.

[00:15:10] I'm interested in, we were talking earlier, I'm a new mom.

[00:15:14] My son just turned a year old.

[00:15:17] How do you, as a mom, go into all of these real-life horrific scenarios and not just throw up your hands?

[00:15:26] I mean, I would have moved to a little island and never spoken to anyone again and just, like, kept my kids there.

[00:15:32] Like, I don't know how you can separate the horrificness of the real world.

[00:15:38] You know what I mean?

[00:15:38] Yeah.

[00:15:39] I've become incredibly protective of my own kids as I've learned more and more and more about how children are being exploited.

[00:15:47] And they, you know, they're still young.

[00:15:49] So I, you know, they're four and seven.

[00:15:50] So they're, you know, not really at the age where they even would be interested in social media or trying to get online, besides watching whatever kid show that I would let them watch on my own device.

[00:16:04] But I am very aware.

[00:16:07] Like, sometimes I do fantasize about this idea of like, can we just go somewhere where there's no internet and just, you know, live in a protected bubble?

[00:16:17] Yeah.

[00:16:17] Yeah.

[00:16:18] Because it is such a dangerous world that we live in that we didn't grow up in, you know, with the way that the internet is ubiquitous and every device is a portal. We would be afraid of, like, children walking alone in the back alley, right? Because there's predators. But now every device is that.

[00:16:36] Yeah.

[00:16:37] And it's overwhelming.

[00:16:39] At the same time.

[00:16:42] Yeah.

[00:16:42] What's happened, right? Within a digital world.

[00:16:45] So it's both things.

[00:16:46] But, you know, I mean, the reality that I realized, and I think all mothers and anyone who cares about children would realize, is that the victims who are being abused on sites like this are anybody's daughter, anybody's son. Because there is, you know, the Taken scenario that you see in the movies, where somebody is kidnapped and they're taken and they're exploited.

[00:17:08] And that does happen.

[00:17:10] Right.

[00:17:10] But a lot of these cases were, you know, the case of Serena.

[00:17:14] Serena was a 14-year-old girl from Bakersfield, California. She was a straight-A student.

[00:17:21] She'd never kissed a boy before.

[00:17:22] She had a crush on a boy older than her who convinced her to send him some nude images and videos.

[00:17:28] And she wanted to impress him.

[00:17:30] Who could blame her?

[00:17:31] She, you know, this is very normal today to do sexting and all this.

[00:17:35] So she did.

[00:17:36] And then he shared them with friends.

[00:17:38] They got out of control.

[00:17:40] They ended up uploaded to Pornhub.

[00:17:42] Millions of views upload again and again.

[00:17:44] So she would plead with them to take the videos down, and they would hassle her to prove that she was a victim: prove that you're underage, prove that this is you.

[00:17:53] And then eventually, if she did get it down, it would just go right back up again.

[00:17:57] And it sent her on this spiral of despair and depression.

[00:18:01] And she ended up dropping out of school because she was being bullied.

[00:18:04] She ended up getting addicted to drugs, to self-medicate for the pain.

[00:18:07] She ended up trying to kill herself multiple times.

[00:18:10] She ended up living homeless out of a car.

[00:18:13] And the terrifying thing about that is, like, it literally could be anybody's child that this could happen to in this day and age.

[00:18:19] And so just that reality, I think it inspires me to keep going despite the fact that it's hard, right?

[00:18:26] And despite the fact that it's horrific because it's not just a fight for Serena.

[00:18:32] It's for our children, right?

[00:18:34] It's for the future generations who will grow up in this digital world.

[00:18:38] And so I think that's, you know, I try to keep that, like, in the forefront of all of this is, you know, it's for all our kids.

[00:18:45] Oh, my God.

[00:18:46] That, like, hurts my soul to hear stories like that.

[00:18:49] But I think it's really important for that information to be out so people can be aware.

[00:18:55] What is your take on posting kids on social media?

[00:18:58] Because we've talked about that a bit on this show before my son was born.

[00:19:02] I was like, ah, do I, don't I?

[00:19:03] And there was, like, a lot of back and forth about it.

[00:19:06] And we decided against it.

[00:19:08] But I'm interested to hear your take on it.

[00:19:10] I think you're smart to do that.

[00:19:12] I think you're wise.

[00:19:13] I don't do it.

[00:19:14] Now, you know, with AI and the capabilities, what's happening is predators are able to take the faces of children off of social media and superimpose them in very realistic ways onto AI-generated child sexual abuse material that gets distributed.

[00:19:32] And the thing to know about it, it's not like you could just get it down and it's down.

[00:19:37] Now, anything that's posted on the Internet, you have to consider it's forever.

[00:19:41] And a video like that, you know, victims call it the immortalization of their trauma because even when it's fake, it's still traumatizing to victims of this.

[00:19:50] And when it's real, it's probably even worse.

[00:19:53] Right.

[00:19:53] But to just know that the consequences of something like that happening could be forever.

[00:19:59] And so I think that it's hard because it's fun to share your life with your friends on social media.

[00:20:07] And it's validating.

[00:20:09] And it's like, you know, it's just there's this whole aspect of it that's really positive.

[00:20:13] But there's this very dark side of it as well.

[00:20:16] And I guess at a minimum, make sure your account is private and make sure that every single person who's following you, you know them in the real world.

[00:20:26] And so I guess that's at a minimum.

[00:20:27] Yeah, there was a really fucked up story that my friend told me that was in the news a while back that was kind of like the deciding factor for me.

[00:20:37] And it was this woman.

[00:20:38] I think it was Alabama.

[00:20:39] She only posted her daughter on her page.

[00:20:42] It was private.

[00:20:43] She only accepted people that she knew.

[00:20:46] And her daughter's information ended up on the dark web, like with like, this is like the hours she's at school.

[00:20:54] This is, like, what her front door looks like.

[00:20:56] And it was basically like selling the information to go abduct this child.

[00:21:00] And the FBI found it, showed up at her door and traced it back to this guy who she accepted because it was so and so from church.

[00:21:08] She didn't really know him.

[00:21:09] But that's what's his face from church.

[00:21:12] And I was like, that's fucking terrifying.

[00:21:15] And it's so like at first for me, it was like, well, I understand if it's like a big account, like if it's an account like mine, where there's hundreds and thousands of people following me and, like, everybody, you know, that's one thing.

[00:21:30] And then it was the argument of like, well, what if it's an account that only like 500 or 600 people are following?

[00:21:36] And I'm like, no, like it's on the Internet.

[00:21:38] That's, in my opinion, what creepy predators who are looking for that type of content go for.

[00:21:45] Like they're looking for the kids that are just, like, singing to the camera, and the mom's like, this is so cute, I'm going to upload it.

[00:21:52] Yeah.

[00:21:52] And also, even if they're people, you know, so even if you're accepting people that you know and trust, they could get hacked.

[00:22:00] Right.

[00:22:00] And then there's somebody behind there that you don't know.

[00:22:03] Right.

[00:22:03] So I guess the only way to be 100% safe with your own kids is to really be strict about it, which is hard.

[00:22:10] But yeah, very.

[00:22:12] I mean, it's better to be safe than sorry.

[00:22:14] Yeah.

[00:22:15] At the end of the day, I think.

[00:22:17] And then just do like private texts with your friends and share the old fashioned way, I guess, with, you know, because you want to keep up to date.

[00:22:25] I know.

[00:22:25] It's wild that like, yeah, that it has to be that.

[00:22:37] All this stuff happens with Pornhub.

[00:22:39] What did that look like when they were actually taken down?

[00:22:42] Like, was it a court case?

[00:22:44] Was it the FBI just investigated?

[00:22:47] Like, what happened?

[00:22:48] It was a bombshell story in the New York Times.

[00:22:50] So two-time Pulitzer Prize winning journalist Nicholas Kristof did a six-month investigation.

[00:22:56] And I talk about the inside baseball of how that all evolved in my book, Takedown.

[00:23:02] But he connected with many victims who were coming forward to me, and I was connecting them to him.

[00:23:07] And he was speaking with insiders.

[00:23:09] And he developed this just incredibly powerful article, "The Children of Pornhub," that sent shockwaves around the world.

[00:23:16] And within days, the pressure was on.

[00:23:19] And what really happened was we had been fighting with the credit card companies for 10 months at the point when this article was released.

[00:23:26] But they were showing their concern.

[00:23:28] We were showing them the evidence and saying, like, you're profiting from this, because every swipe, every download, every advertisement that's on this trafficking, rape, and abuse, you're profiting.

[00:23:39] It's illegal to profit knowingly from this content.

[00:23:42] But they were resistant.

[00:23:43] They were resistant.

[00:23:44] They didn't want to disengage.

[00:23:45] Finally, this article comes out, and the pressure is on.

[00:23:49] There's literally thousands of follow-on articles after this.

[00:23:52] People are up in arms in Parliament.

[00:23:54] Justin Trudeau, the prime minister, is making statements on TV.

[00:23:58] And the credit card companies finally say, we're investigating Pornhub.

[00:24:03] And at that moment, they confirmed what everybody already knew, that the site was infested with illegal crime.

[00:24:10] And then they cut ties.

[00:24:12] And then Pornhub was so scared about losing the credit card companies and so desperate to be able to get them back.

[00:24:20] They did the unthinkable.

[00:24:21] And they deleted 91% of their entire website overnight.

[00:24:25] And it just was gone.

[00:24:27] Because it was unverified content.

[00:24:29] They had no clue which videos were actually rough sex and what was rape and who was 16 and who was 18.

[00:24:36] And that's how that happened.

[00:24:38] However, we do know that two weeks later, the credit cards quietly snuck back to the advertising arm of Pornhub and were continuing to monetize.

[00:24:48] Because they sell 4.6 billion ad impressions on Pornhub every day.

[00:24:52] And it was another two-year battle to finally get them to cut off Pornhub once and for all.

[00:24:57] But it did happen.

[00:24:58] Oh, God. Unreal.

[00:25:00] That's what happened.

[00:25:01] Yeah.

[00:25:02] And victims, like, I know the story of Serena was so horrific and sad.

[00:25:06] But Serena is a hero.

[00:25:08] Because not only was she the face of that New York Times piece, but she is suing the literal hell out of Pornhub, its owners, its executives by name.

[00:25:16] She's suing Visa.

[00:25:17] And she's suing the hedge funds that even funded Pornhub and its parent company.

[00:25:22] This is the 12-year-old boy.

[00:25:25] Wait, wait.

[00:25:25] Which one is Serena?

[00:25:27] Is that the one from Bakersfield?

[00:25:28] She's a 14-year-old girl.

[00:25:29] Good.

[00:25:30] I hope she fucking gets everything.

[00:25:34] I know.

[00:25:35] And I mean, when these cases, and the 12-year-old boy I told you about, he's suing Pornhub.

[00:25:39] He just filed an amended complaint a couple of weeks ago.

[00:25:42] I mean, the details of what happened to him, like, the police were reaching out to Pornhub, and they ignored the police for seven months while that boy's rape was online.

[00:25:53] Oh, my gosh.

[00:25:53] So when these cases go to trial, I mean, it could be the end of Pornhub and its parent company.

[00:25:59] I fucking hope so.

[00:26:00] Because of the hurricane.

[00:26:00] I hope so.

[00:26:01] Yeah.

[00:26:02] Oh, my God.

[00:26:03] I'm like, I have, like, chills.

[00:26:06] I feel like I need to take a shower.

[00:26:07] There must be so many victims that don't have the means to go after big companies legally, like, to get the legal representation they need.

[00:26:18] Like, what do you think the answer is for, like, the vast majority of those victims who aren't going to be the center of some crazy legal battle?

[00:26:28] Yeah.

[00:26:28] Well, that's why the Justice Defense Fund exists.

[00:26:32] My organization is called the Justice Defense Fund.

[00:26:34] And what we are aiming to do is not only shine a light on bad actors like Pornhub and hold them accountable in the court of public opinion, but also in the court of law and empower victims to do exactly that, to access the historically unequal justice system.

[00:26:50] And today, you know, not just thanks to our work, but thanks to other organizations and other attorneys, there are nearly 300 victims that are suing in 25 lawsuits.

[00:26:59] And that includes class actions on behalf of tens of thousands of child victims.

[00:27:04] So there is a substantial number of victims.

[00:27:06] There's so many more than that in the world.

[00:27:09] But we're trying to fix that problem of these victims because these corporations are so big and they hire the most powerful attorneys to try to squash any accountability.

[00:27:20] And we just want to empower victims to be able to get justice.

[00:27:25] Not that any amount of money could ever compensate for what happened to them.

[00:27:29] But in order to get closure, in order to get healing, they need the recognition of harm.

[00:27:33] And it's actually a deterrent to future abusers, because when they see Pornhub's executives sued, right, they'll be like, okay, maybe that's not the best idea.

[00:27:44] Right.

[00:27:45] Yeah.

[00:27:45] Oh, my God.

[00:27:46] You are quite literally an angel that was put on this earth.

[00:27:50] I'm really thankful for the work that you've done to bring this to the forefront and this massive takedown.

[00:27:57] Can you tell everybody where they can find you and what the name of the book is, if they want to dive more into it? And just, like, 100%, thank you a million times.

[00:28:06] Well, you know, I have to just say that, you know, I have had the honor of spearheading this effort.

[00:28:12] But the true heroes in all of this are 100% the courageous victims; without their voices,

[00:28:19] none of this would have happened.

[00:28:20] But also, you know, there's been like 600 organizations who participated.

[00:28:24] And even those in the porn industry have been some of the greatest allies in helping to hold this porn site accountable.

[00:28:30] And so I'm so grateful for the opportunity I've had.

[00:28:33] But, you know, I just say, look, this has been an amazing group effort.

[00:28:37] And even people like you who, you know, could make a decision and say, I don't want to hear about that.

[00:28:42] That's horrible.

[00:28:43] And just not do this interview.

[00:28:45] But people like you were like, no, we're going to be courageous about this and dive into something so disgusting and horrible so that we can shine a light on it.

[00:28:54] And everything, every moment like that really matters.

[00:28:57] And 100% of the proceeds from the sale of this book go to the Justice Defense Fund.

[00:29:04] And you can read the book.

[00:29:05] It's Takedown: Inside the Fight to Shut Down Pornhub for Child Abuse, Rape, and Sex Trafficking.

[00:29:10] And you go on a journey of discovery with me.

[00:29:13] It's written like a true crime in a first-person, fun narrative format.

[00:29:18] And you can meet the victims, you meet the whistleblowers, you go through this journey with me.

[00:29:22] And people can find me on social media; @lailamickelwait is my handle.

[00:29:26] And I'm posting about this all the time.

[00:29:28] I mean, this is my full-time focus.

[00:29:30] So you can do that.

[00:29:32] You can join Team Takedown.

[00:29:33] We created Team Takedown not only to help fully take down Pornhub and hold its executives accountable, but to take down illegal content across the internet.

[00:29:41] And our goal is to prevent it from happening in the future.

[00:29:44] We need policies,

[00:29:46] age and consent verification policies for all user-generated porn, so this doesn't happen again.

[00:29:50] So that's our goal: to make the internet a safer place for our children for generations to come.

[00:29:57] Well, thank you.

[00:29:58] Like very truly.

[00:30:00] Thank you.

[00:30:01] It's overwhelmingly scary as a parent to open your brain to those types of realities and possibilities.

[00:30:10] And I appreciate all the work that you're doing and coming on and sharing it with my listeners so that they can be a little more aware and protect their kids and join in the Takedown as well.

[00:30:22] Thank you so much for having me.

[00:30:23] It was just an honor to speak with you.

[00:30:25] Thank you.

[00:30:26] Thank you for being here.

[00:30:26] Thank you.

[00:31:00] Thank you.

[00:31:01] Please go check out the information in the show notes, which she also shared on this episode.

[00:31:06] And thank you guys for listening and coming on this journey with us.

[00:31:10] It's a tough topic that unfortunately does not get publicized and spoken about enough.

[00:31:15] And I am thankful that we have a platform to include stuff like this for the victims and help spread the word.

[00:31:24] I love you guys.

[00:31:25] I will see you next week.

[00:31:29] All right, FMLers.

[00:31:31] If you don't want to miss an episode,

[00:31:33] make sure to follow on your favorite podcast app.

[00:31:36] And if you're loving the show,

[00:31:38] drop us a five-star rating and leave a review.

[00:31:40] You can keep up with me on Instagram at Gabrielle Stone

[00:31:44] or the podcast page at FML Talk Podcast.

[00:31:47] For all the merch and books signed personally by me,

[00:31:50] you can shop the FML line on eatprayfml.com.

[00:31:54] And as always, have a fucking self-love cocktail on me.

[00:31:59] Cheers.

[00:32:06] This podcast has been brought to you by Podcast Nation.