SoVote

Decentralized Democracy

House Hansard - 310

44th Parl. 1st Sess.
May 7, 2024 10:00AM
Mr. Speaker, I am very pleased to speak to Bill C-270, an act to amend the Criminal Code (pornographic material), at second reading. I would like to begin my remarks by stressing the bill's important objective. It is to ensure that those who make, distribute or advertise pornographic material verify that those depicted in that material are at least 18 years of age and have consented to its production and distribution. As the sponsor has explained, the bill's objective is to implement recommendation number two of the 2021 report of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. Specifically, that report recommends that the government “mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution”. This recommendation responds to ongoing concerns that corporations like Pornhub have made available pornographic images of persons who did not consent or were underage. I want to recognize and acknowledge that this conduct has caused those depicted in that material extreme suffering. I agree that we must do everything we can to protect those who have been subjected to this trauma and to prevent it from occurring in the first place. I fully support the objective of the committee's recommendation. I want to say at the outset that the government will be supporting this bill, Bill C-270, at second reading, but with some serious reservations. I have some concerns about the bill's ability to achieve the objective of the committee's recommendation. I look forward to committee, where we can hear from experts on whether this bill would be useful in combatting child pornography. 
The bill proposes Criminal Code offences that would prohibit making, distributing or advertising pornographic material, without first verifying the age and consent of those depicted by examining legal documentation and securing formal written consent. These offences would not just apply to corporations. They would also apply to individuals who make or distribute pornographic material of themselves and others to generate income, a practice that is legal and that we know has increased in recent years due to financial hardship, including that caused by the pandemic. Individuals who informally make or distribute pornographic material of themselves and of people they know are unlikely to verify age by examining legal documentation, especially if they already know the age of those participating in the creation of the material. They are also unlikely to secure formal written consent. It concerns me that such people would be criminalized by the bill's proposed offences, where they knew that everyone implicated was consenting and of age, merely because they did not comply with the bill's proposed regulatory regime governing how age and consent must be verified. Who is most likely to engage in this conduct? The marginalized people who have been most impacted by the pandemic, in particular sex workers, who are disproportionately women and members of the 2SLGBTQI+ communities. Notably, the privacy and ethics committee clearly stated that its goal was “in no way to challenge the legality of pornography involving consenting adults or to negatively impact sex workers.” However, I fear that the bill's proposed reforms could very well have this effect. I am also concerned that this approach is not consistent with the basic principles of criminal law. Such principles require criminal offences to have a fault or a mental element, for example, that the accused knew or was reckless as to whether those depicted in the pornographic material did not consent or were not of age. 
This concern is exacerbated by the fact that the bill would place the burden on the accused to establish that they took the necessary steps to verify age and consent to avoid criminal liability. However, basic principles of criminal law specify that persons accused of criminal offences need only raise a reasonable doubt as to whether they committed the offence to avoid criminal liability. I would also note that the committee did not specifically contemplate a criminal law response to its concerns. In fact, a regulatory response that applies to corporations that make, distribute or advertise pornographic material may be better positioned to achieve the objectives of the bill. For example, our government's bill, Bill C-63, which would enact the online harms act, would achieve many of Bill C-270's objectives. In particular, the online harms act would target seven types of harmful content, including content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent. Social media services would be subjected to three duties: to act responsibly, to protect children and to make content inaccessible that sexually victimizes a child or revictimizes a survivor, as well as intimate images posted without consent. These duties would apply to social media services, including livestreaming and user-uploaded adult content services. They would require social media services to actively reduce the risk of exposure to harmful content on their services; provide clear and accessible ways to flag harmful content and block users; put in place special protections for children; take action to address child sexual exploitation and the non-consensual posting of intimate content, including deepfake sexual images; and publish transparency reports. 
Bill C-63 would also create a new digital safety commission to administer this regulatory framework and to improve the investigation of child pornography cases through amendments to the Mandatory Reporting Act. That act requires Internet service providers to report to police when they have reasonable grounds to believe their service is being used to commit a child pornography offence. Failure to comply with this obligation can result in severe penalties. As I know we are all aware, the Criminal Code also covers a range of offences that address aspects of the concerns animating the proposed bill. Of course, making and distributing child pornography are both already offences under the Criminal Code. As well, making pornography without the depicted person's knowledge can constitute voyeurism, and filming or distributing a recording of a sexual assault constitutes obscenity. Also, distributing intimate images without the consent of the person depicted in those images constitutes non-consensual distribution of intimate images, and the Criminal Code authorizes courts to order the takedown or removal of non-consensual intimate images and child pornography. All these offences apply to both individuals and organizations, including corporations, as set out in section 2 of the Criminal Code. Should parliamentarians choose to pursue a criminal response to the concerns the proposed bill seeks to address, we may want to reflect upon whether the bill's objectives should be construed differently and its provisions amended accordingly. I look forward to further studying such an important bill at committee.
Mr. Speaker, the subject that we are dealing with this evening is a sensitive one. My colleagues have clearly demonstrated that in the last couple of minutes. We all have access to the Internet and we basically use it for three reasons: for personal reasons, for professional reasons and for leisure, which can sometimes overlap with personal reasons. Pornography is one of those uses that is both for leisure and for personal reasons. To each their own. The use of pornography is a personal choice that is not illegal. Some people might question that. We might agree or disagree, but it is a personal decision. However, the choice that one person makes for their own pleasure may be the cause of another person's or many other people's nightmare. Basically, that is what Bill C-270 seeks to prevent, what it seeks to sanction. The purpose of the bill is to ensure that people do not have to go through hell because of pornography. This bill seeks to criminalize the fact that, under the guise of legality, some of the images that are being viewed were taken or are being used illegally. I want to talk briefly about the problem this bill addresses and the solutions that it proposes. Then, to wrap up, I will share some of my own thoughts about it. For context for this bill and two others that are being studied, Bill S‑210 and C‑63, it was a newspaper article that sounded the alarm. After the article came out, a House of Commons committee that my esteemed colleague from Laurentides—Labelle sits on looked at the issue. At that time, the media informed the public that videos of women and children were available on websites even though these women and, naturally, these children never gave their consent to be filmed or for their video to be shared. We also learned that this included youths under 18. As I said, a committee looked at the issue. 
The images and testimonies received by the committee members were so shocking that several bills that I mentioned earlier were introduced to try to tackle the issue in whole or in part. I want to be clear: watching pornography is not the problem—to each their own. If someone likes watching others have sex, that is none of my concern or anyone else's. However, the problem is the lack of consent of the people involved in the video and the use of children, as I have already said. I am sure that the vast majority of consumers of pornography were horrified to find out that some of the videos they watched may have involved young people under the age of 18. These children sometimes wear makeup to look older. Women could be filmed without their knowledge by a partner or former partner, who then released the video. These are intimate interactions. People have forgotten what intimacy means. If a person agrees to be filmed in an intimate situation because it is kind of exciting or whatever, that is fine, but intimacy, as the word itself implies, does not mean public. When a young person or an adult decides to show the video to friends to prove how cool it is that they got someone else to do something, that is degrading. It is beyond the pale. It gets to me because I saw that kind of thing in schools. Kids were so pleased with themselves. I am sorry, but it is rarely the girls who are so pleased with themselves. They are the ones who suffer the negative consequences. At the end of the day, they are the ones who get dragged through the mud. Porn sites were no better. They tried to absolve themselves by saying that they just broadcast the stuff and it is not up to them to find out if the person consented or was at least 18. Broadcasting is just as bad as producing without consent. It encourages these illegal, degrading, utterly dehumanizing acts. I am going back to my notes now. The problem is that everyone is blaming everyone else. The producer says it is fine. 
The platform says it is fine. Ultimately, governments say the same thing. This is 2024. The Internet is not new. Man being man—and I am talking about humankind, humans in general—we were bound to find ourselves in degrading situations. The government waited far too long to legislate on this issue. In fact, the committee that looked into the matter could only observe the failure of content moderation practices, as well as the failure to protect people's privacy. Even if the video was taken down, it would resurface because a consumer had downloaded it and thought it was a good idea to upload it again and watch it again. This is unspeakable. It seems to me that people need to use some brain cells. If a video can no longer be found, perhaps there is a reason for that, and the video should not be uploaded again. Thinking and using one's head is not something governments can control, but we have to do everything we can. What is the purpose of this bill and the other two bills? We want to fight against all forms of sexual exploitation and violence online, end the streaming and marketing of all pornographic material involving minors, prevent and prohibit the streaming of non-consensual explicit content, force adult content companies and streaming services to control the streaming of this content and make them accountable and criminally responsible for the presence of this content on their online sites. Enough with shirking responsibility. Enough with saying: it is not my fault if she feels degraded, if her reputation is ruined and if, at the end of the day, she feels like throwing herself off a bridge. Yes, the person who distributes pornographic material and the person who makes it are equally responsible. Bill C‑270 defines the word “consent” and the expression “pornographic material”, which is good. It adds two new penalties. 
Essentially, a person who makes or distributes the material must ensure that the person involved in the video is 18 and has given their express consent. If the distributor does not ask for that consent and does not require it, they are at fault. We must also think about certain terms, such as “privacy” and “education”, as well as the definition of “distributor”, because Bill C-270 focuses primarily on distributors operating for commercial purposes. However, there are other distributors who are not in this for commercial purposes, and that is a much uglier picture. I believe we need to think about that aspect. Perhaps legal consumers of pornography would like to see their rights protected as well. I will end with just one sentence: A real statesperson protects the dignity of the weak. That is our role.
Mr. Speaker, I appreciate the opportunity to say a few words in support of Bill C-270, which is an excellent bill from my colleague from Peace River—Westlock, who has been working so hard over his nine years in Parliament to defend the interests of his constituents on important issues like firearms, forestry and fracking, but also to stand up for justice and the recognition of the universal human dignity of all people, including and especially the most vulnerable. Bill C-270 seeks to create mechanisms for the effective enforcement of substantively already existing legal provisions that prohibit non-consensual distribution of intimate images and child pornography. Right now, as the law stands, it is a criminal offence to produce this type of horrific material, but there are not the appropriate legal mechanisms to prevent the distribution of this material by, for instance, large pornography websites. It has come to light that Pornhub, which is headquartered in Canada, has completely failed to prevent the presence on its platform of non-consensual and child-depicting pornographic images. This matter has been studied in great detail at parliamentary committees. My colleague for Peace River—Westlock has played a central role, as have members from other parties, in identifying the fact that Pornhub and other websites have not only failed but have shown no interest in meaningfully protecting potential victims of non-consensual and child pornographic images. It is already illegal to produce these images. Why, therefore, should it not also be clearly illegal to distribute those images without having the necessary proof of consent? This bill would require that there be verification of age and consent associated with images that are distributed. It is a common-sense legal change that would require and effect greater compliance with existing criminal prohibitions on the creation of these images. 
It is based on the evidence heard at committee and based on the reality that major pornography websites, many of which are headquartered in Canada, are continuing to allow this material to exist. To clarify, the fact that those images are on those websites means that we desperately need stronger legal tools to protect children and stronger legal tools to protect people who are victims of the non-consensual sharing of their images. Further, in response to the recognition of the potential harms to children associated with exposure to pornography or associated with having images taken of them and published online, there has been discussion in Parliament, and a number of different bills have been put forward to protect children in vulnerable situations. These bills are, most notably, Bill C-270 and Bill S-210. Bill S-210 would protect children by requiring meaningful age verification for those who are viewing pornography. It is recognized that exposing children to sexual images is a form of child abuse. If an adult were to show videos or pictures to a child of a sexual nature, that would be considered child abuse. However, when websites fail to have meaningful age verification and, therefore, very young children are accessing pornography, there are not currently the legal tools to hold them accountable for that. We need to recognize that exposing young children to sexual images is a form of child abuse, and therefore it is an urgent matter that we pass legislation requiring meaningful age verification. That is Bill S-210. Then we have Bill C-270, which would protect children in a different context. It would protect children from having their images depicted as part of child pornography. Bill C-270 takes those existing prohibitions further by requiring that those distributing images also have proof of age and consent. This is common sense; the use of criminal law is appropriate here because we are talking about instances of child sexual abuse. 
Both Bill S-210 and Bill C-270 deal with child sexual abuse. It should be clear that the criminal law, not some complicated, nebulous regulatory regime, is the appropriate mechanism for dealing with child abuse. In that context, we also have a government bill that has been put forward, Bill C-63, which it calls the online harms act. The proposed bill is kind of a bizarre combination of issues of radically different natures; there are some issues around speech, changes to human rights law and, potentially, attempts to protect children, as we have talked about. The freedom of speech issues raised by the bill have been well discussed. The government has been denounced from a broad range of quarters, including some of its traditional supporters, for the failures of Bill C-63 on speech. However, Bill C-63 also profoundly fails to be effective when it comes to child protection and the removal of non-consensual images. It would create a new bureaucratic structure, and it is based on a 24-hour takedown model; it says that if something is identified, it should be taken down within 24 hours. Anybody involved in this area will tell us that 24-hour takedown is totally ineffective, because once something is on the Internet, it is likely to be downloaded and reshared over and over again. The traumatization, the revictimization that happens, continues to happen in the face of a 24-hour takedown model. This is why we need strong Criminal Code measures to protect children. The Conservative bills, Bill S-210 and Bill C-270, would provide the strong criminal tools to protect children without all the additional problems associated with Bill C-63. I encourage the House to pass these proposed strong child protection Criminal Code-amending bills, Bill S-210 and Bill C-270. 
They would protect children from child abuse, and given the legal vacuums that exist in this area, there can be no greater, more important objective than protecting children from the kind of violence and sexualization they are currently exposed to.