House Hansard - 296

44th Parl. 1st Sess.
April 9, 2024 10:00AM
He said: Madam Speaker, imagine being the parent of a teenage daughter who has been missing for months and somebody discovers 50 explicit videos of that daughter being sexually abused on Pornhub, the most popular porn site in the world. Imagine how one would feel if intimate images of one's sibling were uploaded and Pornhub refused one's request to remove that content. Now, imagine if those videos of one's exploited loved ones were being monetized and published for profit by Pornhub and were made available to Pornhub's over 130 million daily visitors. How would someone feel if Pornhub's only response was an auto-reply email? Understandably, one would be outraged. One would be furious, yet this happens over and over.

Survivors, including a 12-year-old from Ontario, have had to seek justice through their own lawsuits because in Canada, the onus is on survivors and on law enforcement to prove, after the material has been uploaded, that the individuals depicted in those videos are either underage or have not consented to their distribution. This is a serious problem that Bill C-270, the stopping Internet sexual exploitation act, seeks to fix.

It is important to note that for years, survivors, child protection agencies and the police have spoken out about this exploitation. They have not been silent. Survivors have shared how pornographic companies like Pornhub have been profiting from content depicting minors, sex trafficking victims, sexual assault, intimate images and gender-based violence for years. As early as 2019, companies like PayPal cut ties with MindGeek due to the availability of exploitive and abusive content. In March 2020, a few parliamentarians and I wrote a public letter to the Prime Minister to alert him about the exploitation that was happening on MindGeek. We followed up in November 2020 with a letter to the then Minister of Justice, urging him to ensure that our laws were adequate to prevent women and girls from being exploited by Pornhub.

It was The New York Times exposé on December 4, 2020, in a piece written by Nicholas Kristof, that finally got the public's and the government's attention. It was entitled "The Children of Pornhub: Why does Canada allow this company to profit off videos of exploitation and assault?" That article kicked off a firestorm of international attention on Pornhub, which is one of many pornographic websites owned by MindGeek, a Canadian company based in Montreal. About a year ago, it was bought and rebranded as Aylo by a company called Ethical Capital Partners, based in Ottawa.

A few days after that article, the House of Commons ethics committee initiated an investigation into Pornhub. I joined the ethics committee for its study on Pornhub and listened to the harrowing stories of young women who had videos of sexual assaults or intimate content shared without their consent. Many of these women were minors when the videos were created and uploaded to pornography sites like Pornhub. I want to take a moment to share some of their testimony.

Serena Fleites, whose story was covered by The New York Times exposé, had videos of her at age 13 uploaded by her ex-boyfriend. After that, her whole life came crumbling down. She experienced depression and drug use. She was harassed by people at her school who found her video and sent it to family members. She was blackmailed. She had to pretend to be her mother to have the videos taken down from Pornhub. This was all while she was 13 years old. In the end, she stopped going to school.
She told us: I thought that once I stopped being in the public so much, once I stopped going to school, people would stop re-uploading it. But that didn't happen, because it had already been basically downloaded by [all the] people...[in] the world. It would always be uploaded, over and over and over again. No matter how many times I got it taken down, it would be right back up again. It basically became a full-time job for her to just chase down those images and to get them removed from Pornhub.

Some witnesses appeared anonymously to protect their identities. One witness stated, "I was 17 when videos of me on Pornhub came to my knowledge, and I was only 15 in the videos they [were] profiting from." She went on to say, "Every time they took it down, they also allowed more and more videos of me to be reuploaded." That witness also said, "Videos of me being on Pornhub has affected my life so much to the point that I don't leave my house anymore. I stopped being able to work because I [am]...scared to be out in public around other people."

Another survivor who spoke to us at committee is Victoria Galy. As a result of discovering non-consensual images and videos of herself on Pornhub, she completely lost her sense of self-worth, and at times, she was suicidal. She told us at committee, "There were over eight million views just on Pornhub alone. To think of the amount of money that Pornhub has made off my trauma, date rape and sexual exploitation makes me sick to my stomach." She added, "I have been forced to stand up alone and fight Pornhub." It is a serious failure of our justice system when survivors have to launch their own lawsuits to get justice for the harms caused by companies like MindGeek. This Canadian company has not faced a single charge or consequence in Canada for publishing these videos of exploitation and for profiting from them. This is truly shameful.

Last year, a survivor named Uldouz Wallace reached out to me. Uldouz is a survivor of the 2014 iCloud hack. She is also an award-winning actress, executive producer, activist and director of Foundation RA. Uldouz had photos and videos taken in the 2014 iCloud hack and uploaded onto porn sites like Pornhub, and she fought for years to get them taken down. As a result of this, she told us, "I lost followers, I lost everything that you could think of. It was just such a hard time for me. I ended up spending over a million dollars over a three-year span just to get the content taken down on me with no success.... They're making so much money off of the non-consensual uploading of images and videos. The re-uploading is also a billion dollar industry." She added, "There's still no federal laws. There's barely any laws at all to hold anyone online accountable. There's currently foreign revenge laws but for people like me there's nothing."

Rachel, a survivor from Alberta, said that it was devastating and that it is going to haunt her for the rest of her life. She said that she will always be someone's porn.

I want to point out the incredible courage of Victoria, Serena, Uldouz, Rachel and many other survivors who have spoken out. In the midst of one of the most difficult moments of their lives, they are fighting back against a billion-dollar industry that seeks to profit from their pain and exploitation. I thank Victoria, Serena, Uldouz and Rachel for refusing to back down. I thank them for their courage. I thank them for their relentless pursuit of justice.
I would encourage members to listen to their full testimonies, and they can do so at www.siseact.ca. Throughout the ethics committee hearings and from the interactions I have had with survivors since, it is clear that this is a common problem. Pornographic companies are publishing and monetizing content without verifying the age and the consent of the people depicted in it. This is particularly a problem for Canada, as many of those websites are hosted here. Bill C-270, the stopping Internet sexual exploitation act, would stop this.

I am going to quote right from the summary of my bill. It states that the SISE act would: ...prohibit a person [including companies] from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, each person whose image is depicted in the material was 18 years of age or older and gave their express consent to their image being depicted. The SISE act would also allow individuals to revoke their consent. This is an important part of ensuring ongoing consent. Finally, the SISE act would provide for aggravating factors when the material created or published actually depicts minors or non-consensual activity.

I am also pleased to share that I consulted on the bill with a variety of child protection agencies, law enforcement groups and the Canadian Centre for Child Protection to ensure that there are no gaps and that police have the tools they need to seek justice.

The heart of the bill is consent. No one should be publishing sexually explicit material without the express consent of everyone depicted in that material. Children cannot consent to exploitation. Victims of sex trafficking and sexual assault cannot consent. Those filmed without their knowledge cannot consent, yet pornography companies freely publish this content and profit from it because there is no onus on them to verify the age or the consent of those depicted.

That is why the second recommendation of the 2021 ethics committee report is: That the Government of Canada mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution, and that it consult with the Privacy Commissioner of Canada with respect to the implementation of such obligation.

We have heard from survivors who testified that their images of abuse would not be online if companies like Pornhub had bothered to check for age and consent. Bill C-270 would fulfill this important recommendation from the ethics committee report and, importantly, I should add that this report was unanimously supported by all parties at the ethics committee.

The recommendation also suggests consulting with the Privacy Commissioner. I am happy to share with my colleagues that on February 29, 2024, the Privacy Commissioner released his investigation into Pornhub's operator Aylo, formerly MindGeek. The report was initially scheduled to be released on May 23, but it was delayed for over nine months when MindGeek, or Aylo, and its owners, Ethical Capital Partners, took the Privacy Commissioner to court to block the release of that report. The Privacy Commissioner's investigation into Aylo, or MindGeek, was in response to a woman whose ex-boyfriend had uploaded intimate images of her to MindGeek's website without her consent.
The young woman had to use a professional service to get the content taken down and to remove her images from approximately 80 websites, where they had been re-posted more than 700 times. The report shared how the publishing of the woman's intimate images led to a permanent loss of control of the images, which had a devastating effect on her. It caused her to withdraw from her social life and to live in a state of fear and anxiety.

The Commissioner stated: This untenable situation could have been avoided in many cases had MindGeek obtained direct consent from each individual depicted in content prior to or at the time of upload. Pornhub's own Monthly Non-Consensual Content reports suggest that non-consensual content is still regularly uploaded and viewed by thousands of users before it is removed. We find that by continuing to rely solely on the uploader to verify consent, MindGeek fails to ensure that it has obtained valid and meaningful consent from all individuals depicted in content uploaded to its websites.

Ultimately, the Privacy Commissioner recommended that Pornhub and its owners adopt measures that would verify age and consent before any content is uploaded. I would urge all members to read the Privacy Commissioner's report on Pornhub.

While Pornhub and its owners are the biggest pornography company in the world, this bill would ensure that age verification and consent apply to all pornography companies. Whether it is videos of child exploitation, sex trafficking, AI deepfakes, sexual assault or an intimate encounter filmed by a partner, once a video or image has been uploaded, it is virtually impossible to eliminate. Each video can be viewed and downloaded millions of times within a 24-hour period, starting an endless nightmare for victims who must fight to get those videos removed, only for them to be uploaded again within minutes or hours. Canada must do more to prevent this exploitive content from ever reaching the Internet in the first place.

I hope I have the support of my colleagues in ending this nightmare for so many and in preventing it for so many more. To the survivors, some of whom are watching today: we thank them. Their voices are being heard.

I want to thank the organizations that have supported me along the way in getting this bill to this point: National Centre on Sexual Exploitation, National Council of Women of Canada, Ottawa Coalition to End Human Trafficking, London Abused Women's Centre, Defend Dignity, Vancouver Collective Against Sexual Exploitation, The Salvation Army, Survivor Safety Matters, Foundation RA, Montreal Council of Women, CEASE UK, Parents Aware, Joy Smith Foundation, Hope Resource Centre Association, Evangelical Fellowship of Canada, Colchester Sexual Assault Centre, Sexual Assault and Violence Intervention Services of Halton, and Ally Global Foundation.
  • Apr/9/24 6:19:50 p.m.
  • Re: Bill C-63 
Madam Speaker, I have a lot to say about the bill. I will just start with a brief personal anecdote. I want to be very clear when I say this: I do not do this as victim porn or to look for sympathy. It is an example of this: if somebody like me, in a position of privilege, has a hard time accessing the justice system, what about others?

When I was a minister of the Crown, over 10 years ago, I received very explicit sexualized online threats, very graphic descriptions of how somebody was going to rape me, with what instruments, and how they were going to kill me. I was alone in a hotel room. My schedule had been published the day before, and I was terrified. The response at that time from law enforcement, and the process I had to go through as a minister of the Crown to attempt to get justice in a situation that did not involve intimate images, sticks with me to this day. If I had to go through that at that time, what hope is there for somebody who does not have my position of privilege?

What the bill would do is recognize that the forms of discrimination and harassment that, as my colleague from Esquimalt—Saanich—Sooke says, disproportionately impact women, sexual minorities and other persons have outpaced Parliament's ability to change the law. Here we are today.

Briefly, I want to respond to some of the points of debate. First of all, my colleague from the Liberals suggested that we expedite Bill C-63. That bill has been so widely panned by such a variety of disparate stakeholders that the government has not even scheduled it for debate in the House yet. Second, and this is particularly for my colleagues who are looking to support this and to send the bill through to second reading, Bill C-63 would not provide criminal provisions either for any of the activities that are in the bill or for some of the other instances that have been brought up in the House for debate tonight, particularly the non-consensual distribution of deepnudes and deepfake pornography.

I raised the issue in the House over seven months ago. The intimate image distribution laws that are currently in the Criminal Code were only put in place in 2014, about a decade after social media came into play, and after Rehtaeh Parsons and Amanda Todd tragically died due to an absence in the law. Seven months have passed, and the government could have dealt with updating the Criminal Code with a very narrow provision that the Canadian Bar Association and multiple victims' rights groups have asked for, yet it has chosen not to.

There are so many articles that have been written about what is wrong with what is in Bill C-63 that we now need to start paying attention to what is wrong with it because of what is not in there. There is no update to Canada's Criminal Code provisions on the distribution of intimate images produced by artificial intelligence, known as deepnudes. I want to be very clear about this. There are websites right now where anyone in this place can download an app to their phone, upload any image of any person, including any person in here (imagine what that looks like during an election campaign), erase people's clothes, and make it look like legitimate pornography. Imagine, then, that being distributed on social media without consent. The Canadian Bar Association, as well as law professors, and I could read case after case, say that our Criminal Code does not cover that.
At the beginning of February, there was a Canadian Press article that said the government would update the law in Bill C-63, but it did not. Instead, what it chose to do was put in place a three-headed bureaucracy, an entirely extrajudicial process that amounts to a victim of these crimes being told to go to a bureaucratic complaints department instead of being able to get restitution under the law. Do we know what that says to a perpetrator? It says, "Go ahead; do it. There is no justice for you." It boggles my mind that the government has spent all of this time while countless women and vulnerable Canadians are being harassed right now.

I also want to highlight something my colleague from Esquimalt—Saanich—Sooke said, which is that there is a lack of resources for law enforcement across the country. While everybody had a nice couple of years talking about defunding the police, how many thousands of women across this country, tens of thousands or maybe even millions, experienced online harassment and were told, when they finally got the courage to go to the police, that it was in their head? One of those women was killed in Calgary recently. Another of those women is Mercedes Stephenson, who has spoken about trying to get justice for online harassment. If women like Mercedes Stephenson and I have a hard time getting justice, how is a teenager in a high school in Winnipeg supposed to get any sort of justice without clarity in the Criminal Code if there are deepnudes spread about her?

I will tell members how it goes, because it happened in a high school in Winnipeg after I raised this in the House of Commons. I said it was going to happen, and it happened. Kids were posting artificial intelligence-generated deepnudes and deepfakes. They were harassing peers, harassing young women. Do members know what happened? No charges were laid. Why were no charges laid? According to the article, it was because of ambiguity in the Criminal Code around artificial intelligence-created deepnudes. Imagine that. Seven months have passed. It is not in Bill C-63.

At least the bill before us is looking at both sides of the coin on the Criminal Code provisions that we need to address. I want to ensure that the government immediately updates the Criminal Code to say that if it is illegal to distribute intimate images of a person that have been taken with a camera, it should be exactly the same if the images have been generated by deepnude artificial intelligence. This should have been done a long time ago.

Before Bill C-63 came out, Peter Menzies, the former head of the CRTC, talked about the need to have non-partisan consensus and narrowly scoped bills so they could pass the House, but what the government has chosen to do with Bill C-63 is put in place a broad regulatory system with even more nebulousness on Criminal Code provisions. A lot of people have raised concerns about what the regulatory system would do and whether or not it would actually be able to address these things, and the government has not even allowed the House to debate that yet.

What we have in front of us, from my perspective, is a clear call to action to update the Criminal Code where we can, in narrow provisions, so law enforcement has the tools it needs to ensure that victims of these types of crimes can receive justice. What is happening is that technology is rapidly outpacing our ability to keep up with the law, and women are dying.
I am very pleased by the multipartisan nature of the debate on these types of issues, and that there is at least a willingness to bring forward these types of initiatives to committee to have the discussions, but it does concern me that the government has eschewed any sort of update of the Criminal Code in favour of regulators. Essentially, what I am worried about is that it is telling victims to go to the complaints department, an extrajudicial process, as opposed to giving law enforcement the tools it needs. I am sure there will be much more debate on this, but at the end of the day, seven months have passed since I asked the government to update the Criminal Code to ensure that deepnudes and deepfakes fall under the non-consensual intimate image distribution laws. Certainly, what we are talking about here is ensuring that law enforcement has every tool it needs so that women and, as some of my colleagues have raised here, other sexual minorities are not victimized online through these types of technologies.