House Hansard - 229

44th Parl. 1st Sess.
October 4, 2023 02:00PM
  • Oct/4/23 5:11:33 p.m.
  • Re: Bill S-12 
There has been a lot of debate on this topic. I would like to present something to all political parties that has not been discussed in this House and that I really feel needs to be considered at committee. This topic has not been addressed whatsoever, and I fear that we are creating a loophole that could victimize a lot more women and a lot more public officials. I really hope that the government and the justice committee give consideration to this issue.

It was touched on by my colleague from Esquimalt—Saanich—Sooke when he spoke about additional offences that would result in somebody being mandatorily added to the national sex offender registry. He said that two more offences would be added to the list for automatic registration: the first is sextortion offences, where so-called revenge porn is used by an ex against a partner who has left them and, in their anger, they post intimate images without consent; the second is that any posting of intimate images without consent would result in automatic registration. I am happy to be corrected, but I do not think that type of offence leads to automatic registration in this bill. I believe it is discretionary enrolment. If that is true, that might be something in and of itself that the justice committee needs to correct. However, there is a bigger problem here.

The definition that the Criminal Code would use to define “intimate image”, I believe, is stated as follows:

(2) In this section, intimate image means a visual recording of a person made by any means including a photographic, film or video recording,
(a) in which the person is nude, is exposing his or her genital organs or anal region or her breasts or is engaged in explicit sexual activity;
(b) in respect of which, at the time of the recording, there were circumstances that gave rise to a reasonable expectation of privacy; and
(c) in respect of which the person depicted retains a reasonable expectation of privacy at the time the offence is committed.

The definition of “intimate image” would not change in this act, but the circumstances under which intimate images are produced have changed dramatically in the last year. I would like to draw the attention of all of my colleagues of all political parties to a brief written by the University of Western Ontario's violence against women and children unit. Brief 39, written in April 2021, discusses policy options for something called “non-consensual deepnudes and sexual deepfakes”. If members are not familiar with these terms, every person in this House needs to be.

In lay terms, what this means is that if they or their children post a picture of themselves to social media, there is now technology that is essentially like X-ray vision. If they google something called “deepnude”, they will see that it is a technology that actually scrubs the clothing off a person and posts the result. That is problem number one. There is also software that superimposes an image, like someone's face, on top of somebody else's body. These images are super convincing, incredibly real and hugely problematic.

In the U.S. in August, several articles were published, including one called “Revenge Porn and Deep Fake Technology: The Latest Iteration of Online Abuse”. Some jurisdictions in the United States have enacted some form of revenge porn legislation. However, when that legislation went through their respective legislatures, it did not consider deepnudes or deepfakes because of the definition of an intimate image.
Going back to the definition in the Criminal Code of what an intimate image is, it requires that “there were circumstances that gave rise to a reasonable expectation of privacy” and that “the person depicted retains a reasonable expectation of privacy at the time the offence is committed.” I can just see legions of lawyers, working on behalf of deepnude apps and the people who generate these images for profit, arguing that somebody who posts their image online abandons their right to privacy and that, because the definition of intimate image in the Criminal Code does not specifically address images generated using this new technology, the person depicted did not have a reasonable expectation of privacy. I can guarantee that this is what is going to happen. Sometimes I feel as though I am standing in the House like Cassandra, doomed to know the future while nobody believes me. However, this is an instance where Parliament should not be rushing through legislation that has such an incredibly profound impact on women. This is how women are being abused now, this is how children are being abused, and our laws have not caught up.

Going back to the brief that I mentioned, I draw colleagues' attention to some of the policy options that Western University outlines. I will read the entire section:

1. Criminalize the production and distribution of non-consensual deepnudes and sexual deepfakes. Currently, Canada has no law criminalizing non-consensual deepnudes and sexual deepfakes. There are other legal responses that individuals may be able to utilize like defamation...depending on the context. However, it is not certain. In fact, my analysis shows that the tort of public disclosure of embarrassing private facts would not cover this situation.

If we take this lack of law in Canada, which is hugely negligent and hugely behind the rest of the world, and add to it the Supreme Court ruling that has basically eliminated the mandatory listing of somebody on the registry, how are we disincentivizing people from creating deepfakes and deepnudes of their exes and putting them online? There is virtually no guarantee of criminal repercussion and no guarantee that they will be on the sex offender registry. In fact, somebody might even look at building a business out of this for those who are not smart enough to figure out how to do it themselves, and it is shockingly easy.

Just to drive this point home, I want members to picture themselves for a moment. We are in the middle of the next election campaign, and a member who is out door knocking looks at their phone and sees that their image has been scrubbed by this X-ray vision technology. It is all over the Internet for the next week. The member will not have any recourse, because we have a legislative gap and there is no guarantee that the person responsible will be put on the national sex offender registry afterwards. Someone could have cost that member their career because of this material, and there would be no repercussion for them afterwards. I am relating this to try to pique members' interest through self-interest, but we all understand the bigger implication here, which is the exploitation of children and women. This is a powerful tool for abusive men to victimize women, including their spouses.
Women and spouses will very quickly, if they are not already, be under threat of this: “I am just going to scrub your clothes off”, or “It doesn't matter if you don't send me your nudes; I'll just make them anyway.” We know that is happening right now, and we know that it is happening to our kids on Snapchat and all of these other platforms. Half the time we do not even know what app our kids are on anymore. It is tough.

The other thing this lack of law does is make it harder to teach consent properly. We have to be able to educate our children and ourselves on what consent means. If the law has a giant gap around images created by artificial intelligence, then we have a problem.

This legislation and the review at the justice committee present our Parliament with an opportunity to address this issue in a meaningful way for the first time. Colleagues, I implore you, particularly members of the justice committee: when the bill goes to committee, invite people who have expertise in this area so that we understand the prevalence of this situation and what some other jurisdictions are doing. Also, think about amending the definition of “intimate image” so that it specifically deals with deepfakes and deepnudes. We should be saying that this is illegal and immoral and that someone who does it should end up on the sex offender registry just like any other offender. To be frank, I almost think it is worse.

I am putting this on the record for future court challenges that might look at this parliamentary debate: The intent of this legislation should be to ensure that people who use artificial intelligence deepfake and deepnude technology to victimize women and children are on the sex offender registry. We should make that absolutely clear.

I will close by saying that this is why we need to review legislation. No member has talked about this. I hope the justice committee spends adequate time looking at all of these perils before this legislation is rammed through.
  • Oct/4/23 5:22:31 p.m.
  • Re: Bill S-12 
Mr. Speaker, Western University's brief has a pretty good outline of the definitions of these terms, and I would go back to it, as it has been carefully thought through. I believe there are some two dozen references to other literature in there that I would draw my colleague's attention to. I would also ask colleagues on the justice committee to connect with some of the work being done at the industry committee on Bill C-27, the artificial intelligence and data act, to ensure that our laws are harmonized as we move forward, and that this is done in a way that does not leave women, people in public life and children open to being victimized.
  • Oct/4/23 5:24:04 p.m.
  • Re: Bill S-12 
Mr. Speaker, it is not a comfortable topic for me to discuss, but I have been victimized online. I try not to make debates in the House about me, because I represent 120,000 other people. However, if I saw something like this of me spreading and going viral online, through Telegram channels, WhatsApp or whatever, I think it would victimize me. It would devastate anyone in this place. Certainly, there are people and agents who would like to undermine our democracy. This is war, and our legislature, our Parliament, has a chance to close the door on these actors. I encourage a rational amendment to ensure that we are closing this loophole and that people who use technology to do this are not, as you say, Mr. Speaker, able to do indirectly what they cannot do directly. It is up to Parliament to ensure that the spirit of this act captures that with regard to deepfakes and deepnudes.
  • Oct/4/23 5:26:21 p.m.
  • Re: Bill S-12 
Mr. Speaker, I agree with the member's sentiment. Many colleagues here have said that the bill's list of convictions, or areas of conviction, that trigger mandatory inclusion on the national sex offender registry is not adequate or comprehensive enough. This is why it is so imperative for the justice committee to conduct a fulsome study. I think the area she mentioned is deeply important, and I would apply what she said to the concept I brought forward. It is so easy to make these images. Somebody could do it thousands of times and never be put on a sex offender registry. It is not even a loophole; we could drive a bus through it. Let us patch that up at the justice committee to keep our kids and women safe.