SoVote

Decentralized Democracy

Bill C-63

44th Parl. 1st Sess.
February 26, 2024
  • This is a bill called the Online Harms Act that was introduced in the Canadian Parliament. The purpose of the bill is to promote online safety, reduce harms caused by harmful content online, and ensure that social media services are transparent and accountable. The bill establishes the Digital Safety Commission of Canada, the Digital Safety Ombudsperson of Canada, and the Digital Safety Office of Canada to enforce the act and support users of social media services. The bill also imposes duties on social media service operators to act responsibly, protect children, make certain content inaccessible, and keep records. It also amends the Criminal Code and the Canadian Human Rights Act to address online harms.
Mr. Speaker, I am grateful for the opportunity to wrap up the debate on the SISE act at second reading. I have appreciated listening to the members give their speeches. At the outset, I want to briefly urge members to use the term “child sexual abuse material”, or CSAM, rather than “child pornography”. As we heard from the member for Kamloops—Thompson—Cariboo, the latter term is being replaced with CSAM because “pornography” allows for the idea that the material could be consensual. That is why the member for Kamloops—Thompson—Cariboo has put forward a bill that would make this change in the Criminal Code as well.

During the first hour of debate, we heard from the member for Laurentides—Labelle, who gave a passionate speech outlining the many serious impacts of the pornography industry on women and youth. I simply do not have the time to include all of that in my speech, but we both sat on the ethics committee during the Pornhub study and heard directly from the survivors who testified.

It was the speech, however, from the Parliamentary Secretary to the Leader of the Government in the House of Commons that left me scratching my head. I do not think he actually read Bill C-270 or even the Liberals' own bill, Bill C-63. The parliamentary secretary fixated on the 24-hour takedown requirement in Bill C-63 as the solution to this issue. However, I do not think anyone is opposed to a 24-hour takedown for exploitative intimate content shared without consent or for child sexual abuse material. In fact, a bill solely focused on the 24-hour takedown would pass very quickly through this House with the support of everyone, but that does not take into account what Bill C-270 is trying to do. It completely misses the point. The 24-hour takedown takes effect only after harmful content has been put up, such as CSAM, deepfakes and intimate images that have been shared. Bill C-270 is a preventative, upstream approach. 
While the takedown mechanism should be available to victims, the goal of Bill C-270 is to go upstream and stop this abusive content from ever ending up on the Internet in the first place. As I shared at the beginning of the debate, many survivors do not know that their images are online for years. They do not know that this exploitative content has been uploaded. What good would a 24-hour takedown be if they do not even know the content is there? I will repeat the words of one survivor that I shared during the first hour of debate: “I was 17 when videos of me on Pornhub came to my knowledge, and I was only 15 in the videos they've been profiting from.” She did not know for two years that exploitative content of her was being circulated online and sold. That is why Bill C-270 requires age verification and consent of individuals in pornographic material before it is posted. I would also point out that the primary focus of the government's bill is not to reduce harm to victims. The government's bill requires services “to mitigate the risk that users of the regulated service will be exposed to harmful content”. It talks about users of the platform, not the folks depicted in it. The focus of Bill C-270 is the other side of the screen. Bill C-270 seeks to protect survivors and vulnerable populations from being the harmful content. The two goals could not be more different, and I hope the government is supportive of preventing victims of exploitation from further exploitation online. My colleague from Esquimalt—Saanich—Sooke also noted that the narrow focus of the SISE act is targeted at people and companies that profit from sexual exploitative content. This is, indeed, one of the primary aims of this bill. I hope, as with many things, that the spread of this exploitative content online will be diminished, as it is driven by profit. 
The Privacy Commissioner's investigation into Canada's MindGeek found that “MindGeek surely benefits commercially from these non-compliant privacy practices, which result in a larger content volume/stream and library of intimate content on its websites.” For years, pornography companies have been just turning a blind eye, and it is time to end that. Bill C-270 is a fulfillment of a key recommendation made by the ethics committee three years ago and supported by all parties, including the government. I hope to have the support from all of my colleagues in this place for Bill C-270, and I hope to see it at committee, where we can hear from survivors and experts.
Mr. Speaker, I appreciate the opportunity to say a few words in support of Bill C-270, an excellent bill from my colleague from Peace River—Westlock, who has worked hard over his nine years in Parliament to defend the interests of his constituents on important issues like firearms, forestry and fracking, but also to stand up for justice and the recognition of the universal human dignity of all people, including and especially the most vulnerable.

Bill C-270 seeks to create mechanisms for the effective enforcement of substantively already existing legal provisions that prohibit the non-consensual distribution of intimate images and child pornography. Right now, as the law stands, it is a criminal offence to produce this type of horrific material, but the appropriate legal mechanisms do not exist to prevent the distribution of this material by, for instance, large pornography websites. It has come to light that Pornhub, which is headquartered in Canada, has completely failed to keep non-consensual and child-depicting pornographic images off its platform. This matter has been studied in great detail at parliamentary committees. My colleague from Peace River—Westlock has played a central role, as have members from other parties, in identifying the fact that Pornhub and other websites have not only failed but have shown no interest in meaningfully protecting potential victims of non-consensual and child pornographic images.

It is already illegal to produce these images. Why, therefore, should it not also be clearly illegal to distribute those images without having the necessary proof of consent? This bill would require verification of age and consent associated with images that are distributed. It is a common-sense legal change that would require and effect greater compliance with existing criminal prohibitions on the creation of these images. 
It is based on the evidence heard at committee and based on the reality that major pornography websites, many of which are headquartered in Canada, are continuing to allow this material to exist. To clarify, the fact that those images are on those websites means that we desperately need stronger legal tools to protect children and stronger legal tools to protect people who are victims of the non-consensual sharing of their images. Further, in response to the recognition of the potential harms on children associated with exposure to pornography or associated with having images taken of them and published online, there has been discussion in Parliament and a number of different bills put forward designed to protect children in vulnerable situations. These bills are, most notably, Bill C-270 and Bill S-210. Bill S-210 would protect children by requiring meaningful age verification for those who are viewing pornography. It is recognized that exposing children to sexual images is a form of child abuse. If an adult were to show videos or pictures to a child of a sexual nature, that would be considered child abuse. However, when websites fail to have meaningful age verification and, therefore, very young children are accessing pornography, there are not currently the legal tools to hold them accountable for that. We need to recognize that exposing young children to sexual images is a form of child abuse, and therefore it is an urgent matter that we pass legislation requiring meaningful age verification. That is Bill S-210. Then we have Bill C-270, which would protect children in a different context. It would protect children from having their images depicted as part of child pornography. Bill C-270 takes those existing prohibitions further by requiring that those distributing images also have proof of age and consent. This is common sense; the use of criminal law is appropriate here because we are talking about instances of child sexual abuse. 
Both Bill S-210 and Bill C-270 deal with child sexual abuse. It should be clear that the criminal law, not some complicated, nebulous regulatory regime, is the appropriate mechanism for dealing with child abuse. In that context, we also have a government bill that has been put forward, Bill C-63, which the government calls the online harms act. The proposed bill is a bizarre combination of issues of radically different natures; there are some issues around speech, changes to human rights law and, potentially, attempts to protect children, as we have talked about. The freedom of speech issues raised by the bill have been well discussed. The government has been denounced from a broad range of quarters, including some of its traditional supporters, for the failures of Bill C-63 on speech.

However, Bill C-63 also profoundly fails to be effective when it comes to child protection and the removal of non-consensual images. It would create a new bureaucratic structure, and it is based on a 24-hour takedown model; it says that if something is identified, it should be taken down within 24 hours. Anybody involved in this area will tell us that a 24-hour takedown is totally ineffective, because once something is on the Internet, it is likely to be downloaded and reshared over and over again. The traumatization, the revictimization that happens, continues to happen in the face of a 24-hour takedown model. This is why we need strong Criminal Code measures to protect children. The Conservative bills, Bill S-210 and Bill C-270, would provide the strong criminal tools to protect children without all the additional problems associated with Bill C-63. I encourage the House to pass these proposed strong child protection Criminal Code-amending bills, Bill S-210 and Bill C-270. 
They would protect children from child abuse, and given the legal vacuums that exist in this area, there can be no greater, more important objective than protecting children from the kind of violence and sexualization they are currently exposed to.
Mr. Speaker, the subject that we are dealing with this evening is a sensitive one. My colleagues have clearly demonstrated that in the last couple of minutes. We all have access to the Internet and we basically use it for three reasons: for personal reasons, for professional reasons and for leisure, which can sometimes overlap with personal reasons. Pornography is one of those uses that is both for leisure and for personal reasons. To each their own. The use of pornography is a personal choice that is not illegal. Some people might question that. We might agree or disagree, but it is a personal decision. However, the choice that one person makes for their own pleasure may be the cause of another person's or many other people's nightmare. Basically, that is what Bill C-270 seeks to prevent, what it seeks to sanction. The purpose of the bill is to ensure that people do not have to go through hell because of pornography. This bill seeks to criminalize the fact that, under the guise of legality, some of the images that are being viewed were taken or are being used illegally. I want to talk briefly about the problem this bill addresses and the solutions that it proposes. Then, to wrap up, I will share some of my own thoughts about it. For context for this bill and two others that are being studied, Bill S‑210 and C‑63, it was a newspaper article that sounded the alarm. After the article came out, a House of Commons committee that my esteemed colleague from Laurentides—Labelle sits on looked at the issue. At that time, the media informed the public that videos of women and children were available on websites even though these women and, naturally, these children never gave their consent to be filmed or for their video to be shared. We also learned that this included youths under 18. As I said, a committee looked at the issue. 
The images and testimonies received by the committee members were so shocking that several bills that I mentioned earlier were introduced to try to tackle the issue in whole or in part. I want to be clear: watching pornography is not the problem—to each their own. If someone likes watching others have sex, that is none of my concern or anyone else's. However, the problem is the lack of consent of the people involved in the video and the use of children, as I have already said. I am sure that the vast majority of consumers of pornography were horrified to find out that some of the videos they watched may have involved young people under the age of 18. These children sometimes wear makeup to look older. Women could be filmed without their knowledge by a partner or former partner, who then released the video. These are intimate interactions. People have forgotten what intimacy means. If a person agrees to be filmed in an intimate situation because it is kind of exciting or whatever, that is fine, but intimacy, as the word itself implies, does not mean public. When a young person or an adult decides to show the video to friends to prove how cool it is that they got someone else to do something, that is degrading. It is beyond the pale. It gets to me because I saw that kind of thing in schools. Kids were so pleased with themselves. I am sorry, but it is rarely the girls who are so pleased with themselves. They are the ones who suffer the negative consequences. At the end of the day, they are the ones who get dragged through the mud. Porn sites were no better. They tried to absolve themselves by saying that they just broadcast the stuff and it is not up to them to find out if the person consented or was at least 18. Broadcasting is just as bad as producing without consent. It encourages these illegal, degrading, utterly dehumanizing acts. I am going back to my notes now. The problem is that everyone is blaming everyone else. The producer says it is fine. 
The platform says it is fine. Ultimately, governments say the same thing. This is 2024. The Internet is not new. Man being man—and I am talking about humankind, humans in general—we were bound to find ourselves in degrading situations. The government waited far too long to legislate on this issue. In fact, the committee that looked into the matter could only observe the failure of content moderation practices, as well as the failure to protect people's privacy. Even if the video was taken down, it would resurface because a consumer had downloaded it and thought it was a good idea to upload it again and watch it again. This is unspeakable. It seems to me that people need to use some brain cells. If a video can no longer be found, perhaps there is a reason for that, and the video should not be uploaded again. Thinking and using one's head is not something governments can control, but we have to do everything we can. What is the purpose of this bill and the other two bills? We want to fight against all forms of sexual exploitation and violence online, end the streaming and marketing of all pornographic material involving minors, prevent and prohibit the streaming of non-consensual explicit content, force adult content companies and streaming services to control the streaming of this content and make them accountable and criminally responsible for the presence of this content on their online sites. Enough with shirking responsibility. Enough with saying: it is not my fault if she feels degraded, if her reputation is ruined and if, at the end of the day, she feels like throwing herself off a bridge. Yes, the person who distributes pornographic material and the person who makes it are equally responsible. Bill C‑270 defines the word “consent” and the expression “pornographic material”, which is good. It adds two new penalties. 
Essentially, a person who makes or distributes the material must ensure that the person involved in the video is 18 and has given their express consent. If the distributor does not ask for that consent and does not require it, they are at fault. We must also think about certain terms, such as “privacy” and “education”, but also about the definition of “distributor”, because Bill C-270 focuses primarily on distributors acting for commercial purposes. However, there are other distributors who are not in this for commercial purposes. That is not nearly as pretty. I believe we need to think about that aspect. Perhaps legal consumers of pornography would like to see their rights protected. I will end with just one sentence: A real statesperson protects the dignity of the weak. That is our role.
Mr. Speaker, I am very pleased to speak to Bill C-270, an act to amend the Criminal Code (pornographic material), at second reading. I would like to begin my remarks by stressing the bill's important objective. It is to ensure that those who make, distribute or advertise pornographic material verify that those depicted in that material are at least 18 years of age and have consented to its production and distribution. As the sponsor has explained, the bill's objective is to implement recommendation number two of the 2021 report of the House of Commons Standing Committee on Access to Information, Privacy and Ethics. Specifically, that report recommends that the government “mandate that content-hosting platforms operating in Canada require affirmation from all persons depicted in pornographic content, before it can be uploaded, that they are 18 years old or older and that they consent to its distribution”. This recommendation responds to ongoing concerns that corporations like Pornhub have made available pornographic images of persons who did not consent or were underage. I want to recognize and acknowledge that this conduct has caused those depicted in that material extreme suffering. I agree that we must do everything we can to protect those who have been subjected to this trauma and to prevent it from occurring in the first place. I fully support the objective of the committee's recommendation. I want to say at the outset that the government will be supporting this bill, Bill C-270, at second reading, but with some serious reservations. I have some concerns about the bill's ability to achieve the objective of the committee's recommendation. I look forward to the committee stage, where we can hear from experts on whether this bill would be useful in combatting child pornography. 
The bill proposes Criminal Code offences that would prohibit making, distributing or advertising pornographic material, without first verifying the age and consent of those depicted by examining legal documentation and securing formal written consent. These offences would not just apply to corporations. They would also apply to individuals who make or distribute pornographic material of themselves and others to generate income, a practice that is legal and that we know has increased in recent years due to financial hardship, including that caused by the pandemic. Individuals who informally make or distribute pornographic material of themselves and of people they know are unlikely to verify age by examining legal documentation, especially if they already know the age of those participating in the creation of the material. They are also unlikely to secure formal written consent. It concerns me that such people would be criminalized by the bill's proposed offences, where they knew that everyone implicated was consenting and of age, merely because they did not comply with the bill's proposed regulatory regime governing how age and consent must be verified. Who is most likely to engage in this conduct? The marginalized people who have been most impacted by the pandemic, in particular sex workers, who are disproportionately women and members of the 2SLGBTQI+ communities. Notably, the privacy and ethics committee clearly stated that its goal was “in no way to challenge the legality of pornography involving consenting adults or to negatively impact sex workers.” However, I fear that the bill's proposed reforms could very well have this effect. I am also concerned that this approach is not consistent with the basic principles of criminal law. Such principles require criminal offences to have a fault or a mental element, for example, that the accused knew or was reckless as to whether those depicted in the pornographic material did not consent or were not of age. 
This concern is exacerbated by the fact that the bill would place the burden on the accused to establish that they took the necessary steps to verify age and consent to avoid criminal liability. However, basic principles of criminal law specify that persons accused of criminal offences need only raise a reasonable doubt as to whether they committed the offence to avoid criminal liability. I would also note that the committee did not specifically contemplate a criminal law response to its concerns. In fact, a regulatory response that applies to corporations that make, distribute or advertise pornographic material may be better positioned to achieve the objectives of the bill. For example, our government's bill, Bill C-63, which would enact the online harms act, would achieve many of Bill C-270's objectives. In particular, the online harms act would target seven types of harmful content, including content that sexually victimizes a child or revictimizes a survivor, and intimate content communicated without consent. Social media services would be subjected to three duties: to act responsibly, to protect children and to make content inaccessible that sexually victimizes a child or revictimizes a survivor, as well as intimate images posted without consent. These duties would apply to social media services, including livestreaming and user-uploaded adult content services. They would require social media services to actively reduce the risk of exposure to harmful content on their services; provide clear and accessible ways to flag harmful content and block users; put in place special protections for children; take action to address child sexual exploitation and the non-consensual posting of intimate content, including deepfake sexual images; and publish transparency reports. 
Bill C-63 would also create a new digital safety commission to administer this regulatory framework and to improve the investigation of child pornography cases through amendments to the Mandatory Reporting Act. That act requires Internet service providers to report to police when they have reasonable grounds to believe their service is being used to commit a child pornography offence. Failure to comply with this obligation can result in severe penalties. As I know we are all aware, the Criminal Code also covers a range of offences that address aspects of the concerns animating the proposed bill. Of course, making and distributing child pornography are both already offences under the Criminal Code. As well, making pornography without the depicted person's knowledge can constitute voyeurism, and filming or distributing a recording of a sexual assault constitutes obscenity. Also, distributing intimate images without the consent of the person depicted in those images constitutes non-consensual distribution of intimate images, and the Criminal Code authorizes courts to order the takedown or removal of non-consensual intimate images and child pornography. All these offences apply to both individuals and organizations, including corporations, as set out in section 2 of the Criminal Code. Should parliamentarians choose to pursue a criminal response to the concerns the proposed bill seeks to address, we may want to reflect upon whether the bill's objectives should be construed differently and its provisions amended accordingly. I look forward to further studying such an important bill at committee.
Mr. Speaker, as the member for Shefford and the Bloc Québécois critic for the status of women, I want to say that we support Bill C-270 in principle. We would like to examine this bill in committee. The Bloc Québécois fully supports the bill's stated objective, which is to combat child pornography and the distribution and commercialization of non-consensual pornography. Since the first warning about the tragedy of women and girls whose sexual exploitation is the source of profits for major online porn companies, the Bloc Québécois has been involved at every stage and at all times in the public process to expose the extent of this public problem, which goes to our core values, including the right to dignity, safety and equality. On this subject of online sexual exploitation, as on all facets and forms of the sexual exploitation of women, we want to stand as allies not only of the victims, but also of all the women who are taking action to combat violence and exploitation. I will begin by giving a little background on the topic, then I will explain the bill and, in closing, I will expand on some of the other problems that exist in Canada. First, let us not forget that the public was alerted to the presence of non-consensual child pornography by an article that was published in the New York Times on December 4, 2020. The article reported the poignant story of 14-year old Serena K. Fleites. Explicit videos of her were posted on the website Pornhub without her consent. This Parliament has already heard the devastating, distressing and appalling testimony of young Serena, which helped us understand the sensitive nature and gravity of the issue, but also the perverse mechanisms that porn streaming platforms use to get rich by exploiting the flaws of a technological system that, far from successfully controlling the content that is broadcast, is built and designed to promote and yet conceal the criminal practices of sexual exploitation. 
Reports regarding the presence of child sexual abuse material and other non-consensual content on the adult platform Pornhub led the Standing Committee on Access to Information, Privacy and Ethics to undertake a study on the protection of privacy and reputation on online platforms such as Pornhub. My colleague from Laurentides—Labelle has followed this issue closely. The committee noted that these platforms' content moderation practices had failed to protect privacy and reputation and had failed to prevent child sexual abuse material from being uploaded, despite statements by representatives of MindGeek and Pornhub who testified before the committee. That same committee looked at regulating adult sites and online pornography, without challenging the legality. The committee heard testimony from survivors, critics of MindGeek's practices, child protection organizations, members of law enforcement, the federal government, academics, experts and support organizations, and it received many briefs. The Standing Committee on Access to Information, Privacy and Ethics made 14 recommendations regarding the problems it had studied. The committee's 2021 report was clear and it recommended that the government introduce a bill to create a new regulator to ensure that online platforms remove harmful content, including depictions of child sexual exploitation and non-consensual images. We know that sexually explicit content is being uploaded to Pornhub without the consent of the individuals involved, including minors, and that these individuals have tried and failed to get Pornhub to remove that content. We know that these survivors have been traumatized and harassed and that most of them have thought about suicide. That is the type of testimony that we heard at the Standing Committee on the Status of Women with regard to cases of sexual exploitation. We know that even if content is finally removed, users just re-upload it shortly afterward. 
We know that the corporate structure of MindGeek, which was renamed Aylo last August, is the quintessential model for avoiding accountability, transparency and liability. We know that investigations are under way and that there has been a surge in online child sexual exploitation reports. We must now legislate to respond to these crimes and deal with these problems. We also need to keep in mind the magnitude of the criminal allegations and the misconduct of which these companies are accused. Just recently, a new class action lawsuit was filed in the United States against MindGeek and many of the sites it owns, including Pornhub, over allegations of sex trafficking involving tens of thousands of children. Let us not forget that these companies are headquartered right in Montreal. The fact that our country is home to mafia-style companies that profit from sexual exploitation is nothing to be proud of. The international community is well aware of this, and it reflects poorly on us. For these reasons, we have an additional obligation to take action, to find solutions that will put an end to sexual exploitation, and to implement those solutions through legislation. With that in mind, we must use the following questions to guide our thinking. Are legislative proposals on this subject putting forward the right solutions? Will they be effective at controlling online sexual exploitation and, specifically, preventing the distribution of non-consensual content and pornographic content involving minors? Second, let us talk a little more about Bill C‑270. This bill forces producers of pornographic material to obtain the consent of individuals and to ensure that they are of age. In addition, distributors will have to obtain written confirmation from producers that the individuals' consent has been obtained and that they are of age before the material is distributed. 
These new Criminal Code provisions will require large platforms and producers to have a process for verifying individuals' age and consent, without which they will be subject to fines or imprisonment. The House will be considering two bills simultaneously. The first is Bill C-270, from the member for Peace River—Westlock, with whom I co-chair the All-Party Parliamentary Group to End Modern Slavery and Human Trafficking. The second is Bill C-63, introduced by the Minister of Justice, which also enacts new online harms legislation and aims to combat the sexual victimization of children and to make intimate content communicated without consent inaccessible. We will need to achieve our goals, which are to combat all forms of online sexual exploitation and violence, stop the distribution and marketing of all pornographic material involving minors, prevent and prohibit the distribution of explicit non-consensual content, force adult content companies and platforms to control the distribution of such content, and make them accountable and criminally responsible for the presence of such content on their online platforms. There is a debate about the law's ability to make platforms accountable for hosted content. It also raises questions about the relevance of self-regulation in the pornography industry. Third, let us talk about what we can do here. Due to the high volume of complaints it receives, the RCMP often reacts to matters relating to child sexual abuse material, or CSAM, rather than acting proactively to prevent them. Canada's criminal legislation prohibits child pornography, but also other behaviours aimed at facilitating the commission of a sexual offence against a minor. It prohibits voyeurism and the non-consensual distribution of intimate images. Other offences of general application such as criminal harassment and human trafficking may also apply depending on the circumstances. In closing, I will provide a few figures to illustrate the scope of this problem. 
Between 2014 and 2022, there were 15,630 incidents of police-reported online sexual offences against children and 45,816 incidents of online child pornography. The overall rate of police-reported online child sexual exploitation incidents has also risen since 2014. The rate of online child pornography increased 290% between 2014 and 2022. Girls were overrepresented as victims for all offence types over that nine-year period. The majority of victims of police-reported online sexual offences against children were girls, particularly girls between the ages of 12 and 17, who accounted for 71% of victims. Incidents of non-consensual distribution of intimate images most often involved a youth victim and a youth accused. Nearly all child and youth victims between 2015 and 2022, 97% to be exact, were aged 12 to 17 years, with a median age of 15 years for girls and 14 years for boys. Overall, nine in 10 accused persons, or 90%, were youth aged 12 to 17. For one-third of youth victims, or 33%, a casual acquaintance had shared the victim's intimate images with others.

Here is a quote from the Montreal Council of Women: “On behalf of the members of the Montreal Council of Women, I wish to confirm our profound concern for those whose lives have been turned upside down by the involuntary and/or non-consensual sharing of their images on websites and other platforms such as the Montreal-based Pornhub. The ‘stopping Internet sexual exploitation act’ will make much-needed amendments to the Criminal Code to protect children and those who have not given consent for their images and other content to be shared and commercialized.” We must act. It is a question of safety for our women and girls. Young women and girls are depending on it.
  • Apr/18/24 6:34:31 p.m.
  • Re: Bill C-63 
Madam Speaker, the member certainly could consider supporting the government's online harms bill, which I think is a major piece of legislation that certainly will help to protect minors and children when they are interacting online.

I appreciate this opportunity to speak about the ongoing threat of extortion in Canada. The Government of Canada is deeply concerned about Canadians who are victimized by acts of extortion and related violence. The Government of Canada is aware of growing concerns related to extortion across the country and, indeed, the government has heard directly from the mayors of Surrey, British Columbia; Edmonton, Alberta; and Brampton, Ontario, about how this is impacting their communities. The recent increase in the number and severity of extortion attempts, particularly those targeting members of Canada's South Asian community, is alarming. The Government of Canada and the RCMP encourage anyone experiencing or witnessing extortion to report it to their local police of jurisdiction and discourage anyone from complying with demands for money. Rest assured, the Government of Canada is committed to protecting the safety of Canadians and Canadian interests against these threats. We are taking concrete action to protect all affected communities across Canada. As Canada's national police force, the Royal Canadian Mounted Police is mandated to prevent, detect and investigate serious organized crime, in order to protect Canadians and Canadian interests. In doing so, the RCMP works closely with domestic and international law enforcement partners to share information and target shared threats. The RCMP and its law enforcement partners across the country have observed an increase in the number of extortion crimes taking place and are working collaboratively to investigate these incidents.
While the RCMP cannot comment on specific investigations, I can confirm that significant coordination is under way across the country to address similar types of extortion attempts directed at the South Asian communities in British Columbia, Alberta and Ontario. While many investigations remain ongoing, a number of arrests have been made, and information sharing across agencies, I would say, is imperative, as coordinated efforts are under way to identify cases that may be related to one another. To this end, the RCMP is actively sharing information with local law enforcement to support their ongoing efforts. Rest assured, law enforcement agencies across the country are utilizing the required tools and resources to combat these serious incidents in order to keep Canadians safe.
  • Apr/18/24 3:33:05 p.m.
  • Re: Bill C-63 
Mr. Speaker, I rise today to address budget 2024. I propose to deliver my remarks in two contexts: first, to address how this budget resonates with the residents whom I am privileged to represent in Parkdale—High Park in Toronto; second, to look more broadly at some of the very important components that relate to the administration of justice in this country and are touched on in this budget document. I am proud to have represented, for almost nine years now, the constituents in Parkdale—High Park. What those constituents have talked to me repeatedly about is the need to address housing. In budget 2024, we find some very key provisions that relate to housing. I cannot list them all, but some deal with the pressing issue of building more housing, increasing housing supply. That is fundamental in terms of what we are trying to do as a government, and it is empowered and advanced by this important budget document. What I am speaking of here is, for example, $15 billion in additional contributions to Canada's apartment construction loan program, which will help to build more than 30,000 additional new homes. What I also take a lot of pride in is the fact that we are addressing the acute needs of renters. I say that in two respects. This budget document outlines, for example, how renters can be empowered to get to the point of home ownership by virtue of having a proper rental payment history. This can contribute to building up one's credit worthiness with credit ratings agencies; when the time comes to actually apply for a mortgage, one will have built up that credit worthiness by demonstrating that one has made regular rent payments over a period of years. This is truly empowering for the renters in my community and communities right around the country. I have already heard that feedback from the renters whom I represent. Lastly, I would simply point out what we are doing with respect to the tenants' bill of rights.
This is a really important document that talks about ensuring that tenants have rights they can vindicate, including in front of tribunals and, potentially, courts of law. We are coupling that with a $15-million investment that would empower and unlock advocates who assist those renters. That is fundamental. In that respect, it actually relates to the two hats that I wear in this chamber, in both my roles as a representative of individual renters and as Minister of Justice. Another component that my constituents have been speaking to me about regularly since 2015 is our commitment to advancing meaningful reconciliation with indigenous peoples. Again, this document has a number of components that relate to indigenous peoples in budget 2024. There are two that I would highlight for the purpose of these remarks. First, there is the idea about what we are doing to settle litigation with indigenous peoples and ensure that we are proceeding on a better and more conciliatory path forward. We talk about a $23-billion settlement with respect to indigenous groups who are litigating discriminatory underfunding of child and family services and the fact that this historic settlement was ratified by the Federal Court. That is critical. Second, in this document we also talk about funding a project that is near and dear to my heart. Why do I say that? It is because, in 2017, I had the privilege of serving as the parliamentary secretary to the Minister of Heritage. At that time, I helped to co-develop, along with Métis, first nations and Inuit leaders, the legislation that has now become the Indigenous Languages Act. That is coupled with an indigenous languages commission. In this very budget document, we talk about $225 million to ensure the continued success of that commission and the important work it is doing to promote, enhance and revitalize indigenous languages in this country. Those are fundamental investments.
I think it is really important to highlight them in the context of this discussion. I would also highlight that my riding, I am proud to say, is full of a lot of people who care about women. They care about feminism; they care about social and economic policies that empower women. I would highlight just two. First of all, we talk about pharmacare in this budget. The first volley of pharmaceutical products that will be covered includes contraceptive devices that would assist, as I understand it, as many as nine million Canadians through access to contraception. This would allow women, particularly young women and older women, to ensure that they have control over their reproductive function. That is fundamental to me as a representative, and it is fundamental to our government and what our government prioritizes in this country. I would also say that, with $10-a-day child care, there are affordable and robust means of ensuring that people's children are looked after in this country; that empowers women to do such things as participate in the workforce. What I am speaking about here is that we are hitting levels of women's participation in the workforce that have never been seen before, with women's labour force participation of 85.4%. That is an incredible social policy that is translating into a terrific economic policy. We can also talk about the $6.1-billion Canada disability benefit. I am proud to say that the constituents of Parkdale—High Park care meaningfully about inclusive policies, policies that alleviate poverty and are addressed to those who are vulnerable and those who are in need. People have been asking me about the disability benefit, including when we will see it and when it will come to the fore. We are seeing it right now with this document. The very document that we will be voting on in this chamber includes a $6.1-billion funding model to empower Canadians who are disabled and to ensure that we are addressing their needs. 
This budget also represents a bit of a catch-up, meaning that we are catching up to the rest of the G7. Until this budget was delivered, we remained the only G7 country in the world not to have a national school food program. It goes without saying that not a single one of the 338 members privileged to serve in this House would think it is good for a child to arrive at school hungry, in any of their communities or in this country as a whole. I do not think this is a partisan statement whatsoever. We would acutely address child hunger. Through a national school food program, we would ensure that children do not arrive at school hungry, which would impede their productivity and certainly limit their education. Through a $1-billion investment, we would cure school poverty and school hunger. We are also introducing legislation to reduce cellphone and banking fees, which is fundamental. With respect to the hat I wear as Minister of Justice, which I have done for about eight months, I firmly believe that one of my pivotal roles is ensuring access to justice. I would say that this document really rings true to the commitment that I have personally and that our government and the Prime Minister have to this. Here, I am speaking about the notion of our commitment to legal aid. Legal aid has multiple components, but it is fundamental to ensuring that people can have their rights vindicated with the assistance of counsel. This helps address things such as court backlogs and court delays; it is also fundamental for the individual litigants before the courts. There is a criminal legal aid package in this budget that includes $440 million over five years. There is also immigration and refugee legal aid. Unfortunately, since the provinces have wholesale resiled from their involvement in this portfolio, since 2019, we have been stepping in with annual funding. 
We are making that funding no longer simply annual; we are projecting it over a five-year term, which gives certainty and predictability to the people who rely on immigration and refugee legal aid, to the tune of $273 million. That is fundamental. Members heard in question period about efforts we are making to address workplace sexual harassment. I will pivot again here to the fact that this dovetails with both my ministerial role and my role of devoted constituency representative as the MP for Parkdale—High Park. I hear a great deal from my constituents about speaking to women's needs in terms of addressing harassment and sexual harassment. With this budget, we would provide $30 million over three years to address workplace sexual harassment. That is also fundamental. Likewise, what we are doing on hatred is fundamental. Three full pages of the budget document are dedicated to addressing hatred. Some points dovetail with legislation that I have tabled in this House, including Bill C-63, regarding what we would do to curb online hatred and its propensity to spread. However, there are also concrete investments here that talk about Canada's action plan on combatting hate and empowering such bodies as the Canadian Race Relations Foundation, with the important work it is doing in terms of promoting better understanding and the knowledge base of hate crimes units. Also, fundamentally, there is money dedicated in this very budget to ensuring that both law enforcement agencies and Crown prosecutors are better trained and provided better information about how to identify hate and potentially prosecute it. With where we are as a country right now, this is a pressing need; I am very proud to see budget 2024 addressing it directly. 
For the reasons I outlined earlier, in terms of how this budget addresses the particular needs of my constituents, and for the many justice investments it makes toward ensuring access to justice and tackling pernicious issues such as sexual harassment and hatred, I believe this is a budget that all 338 of us should get behind and support.
Madam Speaker, the member is so sensitive to us calling out what the Conservative Party is doing. I just finished saying that the most important reality of our Canadian Forces is the families, and he is standing up on a point of order. Does he not realize that the families of the Canadian Forces members are, in fact, what this report is all about? As someone who was in the Canadian Forces and who was posted in Edmonton, I understand the issue of housing. I understand the pros and cons, the dips and so forth that take place, the waiting list for PMQs, for barracks and the whole process in which housing has evolved in the Canadian Forces, and I understand how important the issue is. I did not learn this just today, and it did not necessarily take the report coming to the floor to be debated. This is not new. There have always been waiting lists to get into PMQs since the days when I was in the forces. I had to wait, and I actually lived in a PMQ. There have always been waiting lists. Why did the Conservative Party wait until today to introduce this motion? If, in fact, Conservatives were genuine and really cared about the families and the Canadian Forces, they could have introduced some form of a motion on an opposition day. They should have done that if they genuinely cared about families and those in the forces representing our country and doing a phenomenal job, whether in Canada or abroad. The Government of Canada has the backs of those members in the Canadian Forces and their families a lot more than Stephen Harper ever did. When I was first elected to the House of Commons in 2010, Stephen Harper literally closed down veterans offices, not two or three, but nine all over the country. Members can imagine the veterans, who had already served in the forces in many different capacities, having to go to private homes and facilities, some even in the non-profit area, when Stephen Harper shut down those access offices. In Manitoba, it was in Brandon.
I was glad that when we took over the reins of power, we actually reopened those offices to continue to support our veterans. There are two issues here that really need to be talked about. First and foremost is the motivating factor of the Conservative Party today and why the Conservatives are moving this motion. When the NDP House leader attempted to get this motion passed, the Conservatives said no. It was not out of concern for members of the forces but rather to prevent legislation from being debated. Just yesterday, I was in the House and had the opportunity to speak to a private member's bill, Bill C-270, which dealt with the issues of child porn and non-consensual porn. I stood in my place and provided commentary on how serious and important that issue is, not only to the government but also to every member inside this chamber. Throughout the debate, we found out that the Conservative Party was actually going to be voting against Bill C-63, which is the online harms act. That was important to mention because the Conservatives were criticizing the government for not calling the legislation. They were heckling from their seats and were asking why we did not call the legislation if it was so important. The Conservatives realize that when they bring in motions, as they have done today, they are preventing the government from bringing in legislation and from having debates on legislation. Then, they cry to anyone who will listen. They will tell lies and will do all sorts of things on social media. They spread misinformation to Canadians to try to give the impression that the House and Canada are broken. There is no entity in the country that causes more dysfunction in the House of Commons, or even outside of the Ottawa bubble, than the Conservative Party of Canada under the leadership of the far-right MAGA leader today. That is the core of the problem.
They have a leader who genuinely believes and who wants to demonstrate that this chamber is dysfunctional. The only thing that is dysfunctional in this chamber is the Conservative Party. It does not understand what Canadians want to see. If we look at some of the commitments we are making to the Canadian Armed Forces, we are talking about billions of dollars in the coming years. We have a target, and a lot depends on economic factors, but we are looking at 1.7% by 2030. Let us contrast that to the Conservative government of Stephen Harper, who was the prime minister when the current Conservative leader was a parliamentary secretary and was a part of that government in a couple of roles. We saw a substantial decrease in funding. I made reference to the veterans offices being shut down. What about the lack of general funding toward the Canadian Forces? We hit an all-time low under the Conservative Party and Stephen Harper. It was 1% of GDP. That would be awfully embarrassing to go abroad and to start talking to people in the United States or to any of our allied countries in NATO. They were laughing at the Harper regime. The Liberal government had to straighten out the problems of the Conservatives' inability to get a jet fighter. For years, they tried and failed. The Liberal government is now delivering on getting the jet fighters. The Liberal government continues to look at ways we can enhance our Canadian Forces, not only for today but also into the future. We will have new search and rescue aircraft that will be operating out of places like the city of Winnipeg.

An hon. member: They cannot fly.

Mr. Kevin Lamoureux: I do not know if the member knows what he is talking about across the way. Yes, they can fly. Planes do fly.
Madam Speaker, I can suggest to the members opposite that we are being challenged by the official opposition to get legislation passed, but the problem is that when it comes time to allow for that debate to occur, the Conservatives put in blockades of sorts. They will filibuster endlessly. They will bring in things like concurrence reports. What totally amazes me is that one Conservative member will stand up, and then another Conservative member will stand up to say, “I move for another Conservative member to be able to speak”. Then, they cause the bells to ring for 30 minutes. How productive is that? How productive is it to debate when the Conservative Party says that it is done for the day and that it is going to adjourn debate for the day, again, causing the bells to ring? That is one of my favourites. We all know the Conservative Party does not like to work late. It is more nine-to-five work, and if one goes a little beyond that, its numbers go down. In the end, we wanted to have more debate. To facilitate that debate, we are prepared to sit late into the evening. We will even sit until midnight to have debates. I am happy to hang around the floor of the House of Commons and to contribute to debates. I do not have a problem going until midnight. The Conservatives, on the other hand, need their sleep time and need their relaxation. After 6:30, they do not want to have debate, yet they will tell Canadians, “they are trying to ram things through, not allowing debate and cannot get legislation off”. It is like how a little kid wants to get a chocolate bar, and here is a Tory kicking him under his feet so that he constantly falls down and cannot reach the chocolate bar—
  • Apr/9/24 6:19:50 p.m.
  • Re: Bill C-63 
Madam Speaker, I have a lot to say about the bill. I will just start with a brief personal anecdote. I want to be very clear when I say this: I do not do this as victim porn or looking for sympathy. If somebody like me, in a position of privilege, has a hard time accessing the justice system, what about others? When I was a minister of the Crown, over 10 years ago, I received very explicit sexualized online threats, very graphic descriptions of how somebody was going to rape me, with what instruments, and how they were going to kill me. I was alone in a hotel room. My schedule had been published the day before, and I was terrified. The response at that time from law enforcement, and the process I had to go through as a minister of the Crown, to attempt to get justice in a situation that did not involve intimate images, sticks with me to this day. If I had to go through that at that time, what hope is there for somebody who does not have my position of privilege? What the bill would do is recognize that the forms of discrimination and harassment that, as my colleague from Esquimalt—Saanich—Sooke says, disproportionately impact women, sexual minorities and other persons, have outpaced Parliament's ability to change the law. Here we are today. Briefly, I want to respond to some of the points of debate. First of all, my colleague from the Liberals suggested that we expedite Bill C-63. That bill has been so widely panned by such a variety of disparate stakeholders that the government has not even scheduled it for debate in the House yet. Second, and this is particularly for my colleagues who are looking to support this, to send the bill through to second reading, Bill C-63 would not provide criminal provisions either for any of the activities that are in the bill or for some of the other instances that have been brought up in the House for debate tonight, particularly the non-consensual distribution of deepnudes and deepfake pornography.
I raised the issue in the House over seven months ago. The intimate image distribution laws that are currently in the Criminal Code were only put in place in 2014, about a decade after social media came into play, and after Rehtaeh Parsons and Amanda Todd tragically died due to an absence in the law. Seven months have passed, and the government could have dealt with updating the Criminal Code with a very narrow provision that the Canadian Bar Association and multiple victims' rights groups have asked for, yet it has chosen not to. There are so many articles that have been written about what is wrong with what is in Bill C-63 that we now need to start paying attention to what is wrong with it because of what is not in there. There is no update to Canada's Criminal Code provisions on the distribution of intimate images produced by artificial intelligence that are known as deepnudes. I want to be very clear about this. There are websites right now where anyone in this place can download an app to their phone, upload any image of any person, including any person in here, and imagine what that looks like during an election campaign, erase people's clothes, and make it look like legitimate pornography. Imagine, then, that being distributed on social media without consent. Our Criminal Code, the Canadian Bar Association, as well as law professors, and I could read case after case, say that our laws do not cover this. At the beginning of February, there was a Canadian Press article that said that the government would update the law in Bill C-63, but it did not. Instead, what it chose to do was put in place a three-headed bureaucracy, an entirely extrajudicial process that amounts to a victim of these crimes being told to go to a bureaucratic complaints department instead of being able to get restitution under the law. Do we know what that says to a perpetrator? It says, “Go ahead; do it.
There is no justice for you.” It boggles my mind that the government has spent all of this time while countless women and vulnerable Canadians are being harassed right now. I also want to highlight something my colleague from Esquimalt—Saanich—Sooke said, which is that there is a lack of resources for law enforcement across the country. While everybody had a nice couple of years talking about defunding the police, how many thousands of women across this country, tens of thousands or maybe even millions, experienced online harassment and were told, when they finally got the courage to go to the police, that it was in their head? One of those women was killed in Calgary recently. Another of those women is Mercedes Stephenson, who talked about her story about trying to get justice for online harassment. If women like Mercedes Stephenson and I have a hard time getting justice, how is a teenager in Winnipeg in a high school supposed to get any sort of justice without clarity in the Criminal Code if there are deepnudes spread about her? I will tell members how it goes, because it happened in a high school in Winnipeg after I raised this in the House of Commons. I said it was going to happen and it happened. Kids were posting artificial intelligence-generated deepnudes and deepfakes. They were harassing peers, harassing young women. Do members know what happened? No charges were laid. Why were no charges laid? According to the article, it was because of ambiguity in the Criminal Code around artificial intelligence-created deepnudes. Imagine that. Seven months have passed. It is not in Bill C-63. At least the bill before us is looking at both sides of the coin on the Criminal Code provisions that we need to start looking at. 
I want to ensure that the government is immediately updating the Criminal Code to say that if it is illegal to distribute intimate images of a person that have been taken with a camera, it should be the exact same thing if it has been generated by a deepnude artificial intelligence. This should have been done a long time ago. Before Bill C-63 came out, Peter Menzies, the former head of the CRTC, talked about the need to have non-partisan consensus and narrowly scoped bills so it could pass the House, but what the government has chosen to do with Bill C-63 is put in place a broad regulatory system with even more nebulousness on Criminal Code provisions. A lot of people have raised concerns about what the regulatory system would do and whether or not it would actually be able to address these things, and the government has not even allowed the House to debate that yet. What we have in front of us, from my perspective, is a clear call to action to update the Criminal Code where we can, in narrow provisions, so law enforcement has the tools it needs to ensure that victims of these types of crimes can receive justice. What is happening is that technology is rapidly outpacing our ability to keep up with the law, and women are dying. I am very pleased to hear the multipartisan nature of debate on these types of issues, and that there is at least a willingness to bring forward these types of initiatives to committee to have the discussions, but it does concern me that the government has eschewed any sort of update of the Criminal Code, leaving these matters instead to regulators. Essentially what I am worried about is that it is telling victims to go to the complaints department, an extrajudicial process, as opposed to giving law enforcement the tools it needs.
I am sure there will be much more debate on this, but at the end of the day, seven months have passed since I asked the government to update the Criminal Code to ensure that deepnudes and deepfakes are in the Criminal Code under the non-consensual intimate image distribution laws. Certainly what we are talking about here is ensuring that law enforcement has every tool it needs to ensure that women and, as some of my colleagues have raised here, other sexual minorities are not victimized online through these types of technologies.
Madam Speaker, New Democrats support, as all parties do, tackling the important issues that the bill before us seeks to tackle. We also know that there has been an explosion of sexual exploitation of individuals online without their consent and an explosion of child pornography. What we have to do is find those measures that would be effective in bringing an end to these heinous practices. Like the member for Peace River—Westlock, I would like to support and salute the survivors who have told their tales, at much personal sacrifice and much personal anguish, publicly acknowledging what has happened to them and the impact it has had on their lives. We would not be making progress on these issues without that work by those survivors, so I think we all want to salute them for their bravery in taking up this problem. However, the challenge with these issues is to find what will actually work to end sexual exploitation. We know that a lack of resources for enforcement is almost always at least as important as any gaps in legislation. What we need to see is dedicated funding to specific and skilled police units to tackle these questions, because bringing these cases to prosecution can become highly complex and convoluted, and we know that is one of the problems with the existing legislation. It is difficult to prosecute for these offences under the Criminal Code as it now stands. We look forward, as New Democrats, to hearing from expert witnesses in committee on what measures will actually be the most effective in bringing an end to these practices, and whether and how the measures proposed in Bill C-270 would contribute to bringing an end to online sexual exploitation. The bill, in some senses, is very simple. It would require checking ID and keeping records of consent. Some would argue that the existing law already implicitly requires that, so is this a step that would make it easier to prosecute?
I do not know the answer to that, but I am looking forward to hearing expert testimony on it. While this legislation is not specific to women, it is important to acknowledge the disproportionate representation of women as victims of both child pornography and of sexual exploitation online without consent. However, I would also note that we have had a recent rash of cases of sexploitation or sextortion of young men who thought they had been speaking to other partners their own age online. They later find out that they were being threatened with the images they had shared being posted online and being asked for money or sexual favours to avoid that. Yes, it is primarily women, but we have seen this other phenomenon occurring where men pose as young women to get young boys to share those images. Obviously, we need more education for young people on the dangers of sharing intimate images, although I am under no illusion that we can change the way young people relate to each other online and through their phones. Education would be important, but some measures to deal with these things when they happen are also important. If we look at the Criminal Code, subsection 162.1(1) already makes it illegal to distribute an intimate image without consent. Of course, child pornography, under a succeeding section, is also already illegal. This was first brought forward and added to the Criminal Code 11 years ago. I was a member of Parliament at that time, and the member for Peace River—Westlock joined us shortly after. It came in an omnibus bill brought forward by the Conservatives. In that bill, there were a number of things, to be honest, that New Democrats objected to, but when the bill, which was Bill C-13 at the time, was brought forward, our spokesperson Françoise Boivin offered to the government to split the bill, take out the section on online exploitation without consent and pass it through all stages in a single day.
The Conservatives refused, at that point, to do that, and it took another year and a half to get that passed into law. New Democrats have been supportive of taking these actions and have recognized the urgency of this issue for more than a decade. We are on board with getting the bill before us to committee and making sure that we find what is most effective in tackling these problems. What are the problems? I see that there are principally two. One, as I have mentioned before, is the difficulty of prosecution and the difficulty of making those who profit from this pay a price. All the prosecutors I have talked to have said that it is difficult to make these cases. It is difficult to investigate, and it is difficult to get convictions. Are there things we can do that would help make prosecution easier, and are the things suggested in the bill going to do that? I look forward to finding that out in committee. The second problem is the problem of takedown, and we all know that once the images are uploaded, they are there forever. They are hard to get rid of. As members on the government's side have pointed out, there are measures in government Bill C-63 that would help with warrants of seizure, forfeiture, restitution and peace bonds in trying to get more effective action to take down the images once they have been posted. I am not an optimist about the ability to do that, but we seem to lack the tools we need now to make a stab at taking the images offline. It is also important to remember that whatever we do here has to make our law more effective at getting those who are profiting from the images. That is really what the bill is aimed at, and I salute the member for Peace River—Westlock for that singular focus, because I think that is really key. We also have to be aware of unintended consequences. 
When subsection 162.1(1) became law, we ran into a problem in court fairly early on with minors who share private images with each other, because technically, under the law as it is written, that is illegal; it is child pornography. It certainly was not the intention to capture 15-year-olds who share intimate images with each other. Whenever we make these kinds of changes, we have to make sure they do not have unintended consequences. Whether we like the practices that young people engage in online or not is not the question. We just have to make sure we do not capture innocent people when we are trying to capture those who profit from exploitation. The second unintended consequence we have to keep in mind is that there are those who are engaged in lawful forms of sex work online, and we have to make sure they are not captured under the broad strokes of the bill. Again, I am looking forward to hearing the testimony about what will work to tackle these problems. We know the images are already illegal, but we know we lack effective tools in the legal system both to prosecute and to get the images taken down. New Democrats are broadly supportive of the principles in the bill. We are looking forward to the expert testimony I am certain we will hear at committee about what will actually work in tackling the problem. I look forward to the early passage of the bill through to committee.
1276 words
Madam Speaker, to be very clear, with regard to the issues of non-consensual pornography and child pornography, I like to believe that every member in the House would be deeply offended by any activity that would ultimately lead to, encourage or promote, in any fashion whatsoever, those two harms. It angers a great number of us and stirs strong emotions. We all want to do what we can to play an important role in making the online world a safer place. I must say that I was a little surprised when the member for Peace River—Westlock responded to the issue of Bill C-63. I did have some concerns. Non-consensual pornography and child pornography are already illegal today in Canada. We know that. I appreciate what is being suggested in the private member's legislation, but he was asked a question in regard to Bill C-63, the government legislation dealing with the online harms act. It is something that is very specific and will actually have a very tangible impact. I do not know for certain, because this is the first time I have heard that members of the Conservative Party might be voting against that legislation. That would go against everything, I would suggest, in principle, that the member opposite talked about in his speech. The greatest threat today is once that information gets uploaded. How can we possibly contain it? That is, in part, what we should be attempting to deal with as quickly as possible. There was a great deal of consultation and work with stakeholders in all forms to try to deal with that. That is why we have the online harms act before us today. The question I was going to ask the member is this: Given the very nature of his comments, would he not agree that the House should look at a way in which we could expedite the passage of Bill C-63? 
By doing that, we would be directly helping some of the individuals the member addressed in his opening comments. The essence of what Bill C-63 does is that it provides an obligation, a legal obligation, for online platforms to take child pornography and non-consensual pornography off their platforms. For example, victims of these horrific actions could make contact and see the content removed, because these platforms would have 24 hours to take it down. That brings some justice to the victims. Based on how sincere and genuine the member was when he presented his bill, I do not understand that position. I have a basic understanding of what the member is trying to accomplish in the legislation, and I think that there are some questions in regard to getting some clarification. As I indicated, child pornography is illegal today. We need to make that statement very clear. Non-consensual pornography is as well. Both are illegal. There is a consequence for perpetrators today if they are found out. What is missing is how we get those platforms to get rid of those images once those perpetrators start uploading the information and platforms start using the material. That is what the government legislation would provide. Hopefully, before we end the two hours of debate, the member can, in his concluding remarks, because he will be afforded that opportunity, provide some thoughts in regard to making sure people understand that this is illegal today and the importance of getting at those platforms. If we do not get at those platforms, the problem is not going to go away. There was a question posed, by a New Democratic member I believe, about countries around the world. People would be surprised at the motivations behind getting child pornography onto the net and livestreamed. 
I have seen some eye-opening presentations showing that, in some countries in the world, the person who is putting the child on the Internet is a parent or a guardian. They do it as a way to source revenue. They do it for income for the family. How sad is that? How angering is it to see the criminal element in North America that exploits these individuals, and children in particular? That is not to mention, of course, non-consensual pornography, but think of the trauma created as a direct result of a child going through things a child should never, ever have to experience. This will have a lifetime effect on that child. We know that. We see generational issues as a direct result of it. That is the reason I like to think that every member of the House of Commons would look at the issue at hand and the principles of what we are talking about and want to take some initiative to minimize it. Members need to talk to the stakeholders. I have had the opportunity in different ways over the last number of years to do so. It is one of the reasons I was very glad to see the government legislation come forward. I was hoping to get clarification from the member on Bill C-270. He may be thrown off a little because of Bill C-63, which I believe will be of greater benefit than Bill C-270. After listening to the member speak, though, I found out that the Conservative Party is apparently looking at voting against Bill C-63. We come up with things collectively as a House to recognize important issues and put forward legislation that would have a positive impact, and I would suggest that Bill C-63 is one of those things. I would hope the member who introduced this private member's bill will not only be an advocate for his bill but be the strongest voice and advocate within his own caucus for the online harms act, Bill C-63, so we can get the support for that bill. It would literally save lives and take ungodly things off the Internet. It would save the lives of children.
1039 words
  • Apr/9/24 5:50:33 p.m.
  • Re: Bill C-63 
Madam Speaker, I thank the member for bringing forward this private member's bill, which directs our attention to some really important problems. Is the member familiar with the report from the Department of Justice on cyber-bullying and non-consensual distribution of images from just a year ago, which takes quite a different approach from his bill and says we need to rewrite the existing offence so it is easier to prosecute and include measures, which are now in Bill C-63, to allow forfeiture, seizure, restitution and peace bonds in connection with these kinds of things?
98 words
  • Apr/9/24 5:47:55 p.m.
  • Re: Bill C-63 
Madam Speaker, Bill C-63 has no criminal offences around the uploading of this kind of content. Under this bill, it would be a criminal offence to upload it. We want to make sure this content never hits the Internet. A 24-hour takedown period is not good enough. We want to ensure that companies are doing their due diligence to ensure that the people in their content are of age and have consented to it. An important piece of this bill is also that, if somebody has made a written request saying they revoke their consent, that content must immediately come down.
104 words
  • Apr/9/24 5:47:05 p.m.
  • Re: Bill C-63 
Madam Speaker, the topic that the member is dealing with is particularly important. One of the arguments that he is making is with respect to taking down this heinous material online. I agree with him. However, the bill does not make any provisions for it. Bill C-63, which is government legislation, does make provisions for taking down these types of heinous materials. The member's leader has said that he would vote against it. I wonder if the hon. member will be supporting Bill C-63 or if he is going to stick with what is here that would not accomplish the objectives that he is seeking, which I hope we would all be in favour of.
118 words
Mr. Speaker, I support this question of privilege in light of the violation of the government's obligation to answer an Order Paper question, but I also add to it, considering how the government has taken steps to take control of the Internet in Canada. It has done this through legislation like Bill C-11, which centralizes regulatory control of what Canadians can see, hear and post online based on what the government deems “Canadian”. In addition, I highlight Bill C-18, which has resulted in the government being one of the biggest gatekeepers of news in Canada. This is a major conflict of interest and a direct attack on journalistic integrity in this country. Now, most recently, through Bill C-63, the government proposes to establish an entire commission, yet another arm of the government, that would regulate online harm. How can Canadians trust the government to police various aspects of the Internet if it cannot even be honest and tell the truth about the content requested to be taken down? Trust is paramount, and frankly, the government has not earned any of it. The truth must prevail. Mr. Speaker, you have the opportunity to look into this and get to the bottom of it, or you can keep us in the dark and allow secrecy and injustice to reign. I understand that you are the one to make this decision, and we are putting our trust in you to make sure that this place is upheld and democracy is kept strong.
255 words
  • Mar/21/24 3:17:07 p.m.
  • Re: Bill C-63 
Mr. Speaker, I wanted to make a very brief intervention in response to the government House leader's parliamentary secretary's response to my question of privilege on Bill C-63 and the leak that occurred. The parliamentary secretary's 25-minute submission extensively quoted the Internet. What it did not do, however, was explain exactly how the sources whom Travis Dhanraj and Rachel Aiello spoke to were lucky enough to state precisely which of the options the government consulted on would make it into the bill. Had the reporting been based on the published consultation documents, the media reports would have said so, but they did not. They quoted “sources” who were “not authorized to speak publicly on the matter before the bill is tabled in Parliament.” The parliamentary secretary's implication that the sources were all stakeholders uninformed about the ways of Parliament is demonstrably untrue. CTV's source was “a senior government source”. The CBC attributed its article to “two sources, including one with the federal government”. Besides, had these sources actually all been stakeholders speaking about previous consultations, why would they have sought anonymity to begin with, let alone specify the need for anonymity, because the bill had not yet been introduced? As I said back on February 26, the leakers knew what they were doing. They knew it was wrong, and they knew why it was wrong. We are not talking about general aspects of the bill that might have been shared with stakeholders during consultation processes. We are talking about very detailed information that was in the legislation and was leaked to the media before it was tabled in the House. That is the issue we are asking you to rule on, Mr. Speaker.
298 words
  • Mar/19/24 5:15:50 p.m.
  • Watch
  • Re: Bill C-63 
Mr. Speaker, I am rising to respond to a question of privilege raised by the member for Regina—Qu'Appelle on February 26 regarding the alleged premature disclosure of the content of Bill C-63, the online harms act. I would like to begin by stating that the member is incorrect in asserting that there has been a leak of the legislation, and I will outline the comprehensive consultation process and the information that was in the public domain on this issue long before the bill was placed on notice. Online harms legislation is something that the government has been talking about for years. In 2015, the government promised to make ministerial mandate letters public, a significant departure from the secrecy around those key policy commitment documents under previous governments. As a result of the publication of the mandate letters, reporters are able to use the language from these letters to try to telegraph what a government bill on notice may contain. In the 2021 Liberal election platform, entitled “Forward. For Everyone.”, the party committed to the following: Introduce legislation within its first 100 days to combat serious forms of harmful online content, specifically hate speech, terrorist content, content that incites violence, child sexual abuse material and the non-consensual distribution of intimate images. This would make sure that social media platforms and other online services are held accountable for the content that they host. Our legislation will recognize the importance of freedom of expression for all Canadians and will take a balanced and targeted approach to tackle extreme and harmful speech. Strengthen the Canada Human Rights Act and the Criminal Code to more effectively combat online hate. 
The December 16, 2021, mandate letter from the Prime Minister to the Minister of Justice and Attorney General of Canada asked the minister to achieve results for Canadians by delivering on the following commitment: Continue efforts with the Minister of Canadian Heritage to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host, including by strengthening the Canadian Human Rights Act and the Criminal Code to more effectively combat online hate and reintroduce measures to strengthen hate speech provisions, including the re-enactment of the former Section 13 provision. This legislation should be reflective of the feedback received during the recent consultations. Furthermore, the December 16, 2021, mandate letter from the Prime Minister to the Minister of Canadian Heritage also asked the minister to achieve results for Canadians by delivering on the following commitment: Continue efforts with the Minister of Justice and Attorney General of Canada to develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host. This legislation should be reflective of the feedback received during the recent consultations. As we can see, the government publicly stated its intention to move ahead with online harms legislation, provided information on its plan and consulted widely on the proposal long before any bill was placed on the Notice Paper. I will now draw to the attention of the House just how broadly the government has consulted on proposed online harms legislation. Firstly, with regard to online consultations, from July 29 to September 25, 2021, the government published a proposed approach to address harmful content online for consultation and feedback. 
Two documents were presented for consultation: a discussion guide that summarized and outlined an overall approach, and a technical paper that summarized drafting instructions that could inform legislation. I think it is worth repeating here that the government published a technical paper with the proposed framework for this legislation back in July 2021. This technical paper outlined the categories of proposed regulated harmful content; it addressed the establishment of a digital safety commissioner, a digital safety commission, regulatory powers and enforcement, etc. Second is the round table on online safety. From July to November 2022, the Minister of Canadian Heritage conducted 19 virtual and in-person round tables across the country on the key elements of a legislative and regulatory framework on online safety. Virtual sessions were also held on the following topics: anti-Semitism, Islamophobia, anti-Black racism, anti-Asian racism, women and gender-based violence, and the tech industry. Participants received an information document in advance of each session to prepare for the discussion. This document sought comments on the advice from the expert advisory group on online safety, which concluded its meetings on June 10. The feedback gathered from participants touched upon several key areas related to online safety. Third is the citizens' assembly on democratic expression. The Department of Canadian Heritage, through the digital citizen initiative, is providing financial support to the Public Policy Forum's digital democracy project, which brings together academics, civil society and policy professionals to support research and policy development on disinformation and online harms. One component of this multi-year project is an annual citizens' assembly on democratic expression, which considers the impacts of digital technologies on Canadian society. The assembly took place between June 15 and 19, 2023, in Ottawa, and focused on online safety. 
Participants heard views from a representative group of citizens on the core elements of a successful legislative and regulatory framework for online safety. Furthermore, in March 2022, the government established an expert advisory group on online safety, mandated to provide advice to the Minister of Canadian Heritage on how to design the legislative and regulatory framework to address harmful content online and how to best incorporate the feedback received during the national consultation held from July to September 2021. The expert advisory group, composed of 12 individuals, participated in 10 weekly workshops on the components of a legislative and regulatory framework for online safety. These included an introductory workshop and a summary concluding workshop. The government undertook its work with the expert advisory group in an open and transparent manner. A Government of Canada web page, entitled “The Government's commitment to address online safety”, has been online for more than a year. It outlines all of this in great detail. I now want to address the specific areas that the opposition House leader raised in his intervention. The member pointed to a quote from a CBC report referencing the intention to create a new regulator that would hold online platforms accountable for harmful content they host. The same website that I just referenced states the following: “The Government of Canada is committed to putting in place a transparent and accountable regulatory framework for online safety in Canada. Now, more than ever, online services must be held responsible for addressing harmful content on their platforms and creating a safe online space that protects all Canadians.” Again, this website has been online for more than a year, long before the bill was actually placed on notice. The creation of a regulator to hold online services to account is something the government has been talking about, consulting on and committing to for a long period of time. 
The member further cites a CBC article that talks about a new regulatory body to oversee a digital safety office. I would draw to the attention of the House the “Summary of Session Four: Regulatory Powers” of the expert advisory group on online safety, which states: There was consensus on the need for a regulatory body, which could be in the form of a Digital Safety Commissioner. Experts agreed that the Commissioner should have audit powers, powers to inspect, have the powers to administer financial penalties and the powers to launch investigations to seek compliance if a systems-based approach is taken—but views differed on the extent of these powers. A few mentioned that it would be important to think about what would be practical and achievable for the role of the Commissioner. Some indicated they were reluctant to give too much power to the Commissioner, but others noted that the regulator would need to have “teeth” to force compliance. This web page has been online for months. I also reject the premise of what the member for Regina—Qu'Appelle stated when quoting the CBC story in question as it relates to the claim that the bill will be modelled on the European Union's Digital Services Act. This legislation is a made-in-Canada approach. The European Union model regulates more than social media and targets the marketplace and sellers. It also covers election disinformation and certain targeted ads, which our online harms legislation does not. The member also referenced a CTV story regarding the types of online harms that the legislation would target. I would refer to the 2021 Liberal election platform, which contained the following areas as targets for the proposed legislation: “hate speech, terrorist content, content that incites violence, child sexual abuse material and the non-consensual distribution of intimate images.” These five items were the subject of the broad-based and extensive consultations I referenced earlier in my intervention. 
Based on these consultations, a further two were added to the list to be considered. I would draw the attention of the House to an excerpt from the consultation entitled, “What We Heard: The Government’s proposed approach to address harmful content online”, which states, “Participants also suggested the inclusion of deep fake technology in online safety legislation”. It continues, “Many noted how child pornography and cyber blackmailing can originate from outside of Canada. Participants expressed frustration over the lack of recourse and tools available to victims to handle such instances and mentioned the need for a collaborative international effort to address online safety.” It goes on to state: Some respondents appreciated the proposal going beyond the Criminal Code definitions for certain types of content. They supported the decision to include material relating to child sexual exploitation in the definition that might not constitute a criminal offence, but which would nevertheless significantly harm children. A few stakeholders said that the proposal did not go far enough and that legislation could be broader by capturing content such as images of labour exploitation and domestic servitude of children. Support was also voiced for a concept of non-consensual sharing of intimate images. It also notes: A few respondents stated that additional types of content, such as doxing (i.e., the non-consensual disclosure of an individual’s private information), disinformation, bullying, harassment, defamation, conspiracy theories and illicit online opioid sales should also be captured by the legislative and regulatory framework. This document has been online for more than a year. I would also point to the expert advisory group's “Concluding Workshop Summary” web page, which states: They emphasized the importance of preventing the same copies of some videos, like live-streamed atrocities, and child sexual abuse, from being shared again. 
Experts stressed that many file sharing services allow content to spread very quickly. It goes on to say: Experts emphasized that particularly egregious content like child sexual exploitation content would require its own solution. They explained that the equities associated with the removal of child pornography are different than other kinds of content, in that context simply does not matter with such material. In comparison, other types of content like hate speech may enjoy Charter protection in certain contexts. Some experts explained that a takedown obligation with a specific timeframe would make the most sense for child sexual exploitation content. It also notes: Experts disagreed on the usefulness of the five categories of harmful content previously identified in the Government’s 2021 proposal. These five categories include hate speech, terrorist content, incitement to violence, child sexual exploitation, and the non-consensual sharing of intimate images. Another point is as follows: A few participants pointed out how the anonymous nature of social media gives users more freedom to spread online harm such as bullying, death threats and online hate. A few participants noted that this can cause greater strain on the mental health of youth and could contribute to a feeling of loneliness, which, if unchecked, could lead to self-harm. Again, this web page has been online for more than a year. The member further cites the CTV article's reference to a new digital safety ombudsperson. I would point to the web page of the expert advisory group for the “Summary of Session Four: Regulatory Powers”, which states: The Expert Group discussed the idea of an Ombudsperson and how it could relate to a Digital Safety Commissioner. Experts proposed that an Ombudsperson could be more focused on individual complaints ex post, should users not be satisfied with how a given service was responding to their concerns, flags and/or complaints. 
In this scheme, the Commissioner would assume the role of the regulator ex ante, with a mandate devoted to oversight and enforcement powers. Many argued that an Ombudsperson role should be embedded in the Commissioner’s office, and that information sharing between these functions would be useful. A few experts noted that the term “Ombudsperson” would be recognizable across the country as it is a common term and [has] meaning across other regimes in Canada. It was mentioned that the Ombudsperson could play more of an adjudicative role, as distinguished from...the Commissioner’s oversight role, and would have some authority to have certain content removed off of platforms. Some experts noted that this would provide a level of comfort to victims. A few experts raised questions about where the line would be drawn between a private complaint and resolution versus the need for public authorities to be involved. That web page has been online for months. Additionally, during the round table on online safety and anti-Black racism, as the following summary states: Participants were supportive of establishing a digital safety ombudsperson to hold social media platforms accountable and to be a venue for victims to report online harms. It was suggested the ombudsperson could act as a body that takes in victim complaints and works with the corresponding platform or governmental body to resolve the complaint. Some participants expressed concern over the ombudsperson's ability to process and respond to user complaints in a timely manner. To ensure the effectiveness of the ombudsperson, participants believe the body needs to have enough resources to keep pace with the complaints it receives. A few participants also noted the importance for the ombudsperson to be trained in cultural nuances to understand the cultural contexts behind content that is reported to them. That web page has been online for more than a year. 
Finally, I would draw the attention of the House to a Canadian Press article of February 21, 2024, which states, “The upcoming legislation is now expected to pave the way for a new ombudsperson to field public concerns about online content, as well as a new regulatory role that would oversee the conduct of internet platforms.” This appeared online before the bill was placed on notice. Mr. Speaker, as your predecessor reiterated in his ruling on March 9, 2021, “it is a recognized principle that the House must be the first to learn the details of new legislative measures.” He went on to say, “...when the Chair is called on to determine whether there is a prima facie case of privilege, it must take into consideration the extent to which a member was hampered in performing their parliamentary functions and whether the alleged facts are an offence against the dignity of Parliament.” The Chair also indicated: When it is determined that there is a prima facie case of privilege, the usual work of the House is immediately set aside in order to debate the question of privilege and decide on the response. Given the serious consequences for proceedings, it is not enough to say that the breach of privilege or contempt may have occurred, nor to cite precedence in the matter while implying that the government is presumably in the habit of acting in this way. The allegations must be clear and convincing for the Chair. The government understands and respects the well-established practice that members have a right of first access to the legislation. It is clear that the government has been talking about and consulting widely on its plan to introduce online harms legislation for the past two years. As I have demonstrated, the public consultations have been wide-ranging and in-depth with documents and technical papers provided. All of this occurred prior to the bill's being placed on notice. 
Some of the information provided by the member for Regina—Qu'Appelle is not even in the bill, most notably the reference to its being modelled on the European Union's Digital Services Act, which is simply false, as I have clearly demonstrated. The member also hangs his arguments on the usage of the vernacular “not authorized to speak publicly” in the media reports he cites. It is certainly not proof of a leak, especially when the government consulted widely and publicly released details on the content of the legislative proposal for years before any bill was actually placed on notice. The development of the legislation has been characterized by open, public and wide-ranging consultations with specific proposals consulted on. This is how the Leader of the Opposition was able to proclaim, on February 21, before the bill was even placed on notice, that he and his party were vehemently opposed to the bill. He was able to make this statement because of the public consultation and the information that the government has shared about its plan over the last two years. I want to be clear that the government did not share the bill before it was introduced in the House, and the evidence demonstrates that there was no premature disclosure of the bill. I would submit to the House that consulting Canadians this widely is a healthy way to produce legislation and that the evidence I have presented clearly demonstrates that there is no prima facie question of privilege. It is our view that this does not give way for the Chair to conclude that there was a breach of privilege of the House nor to give the matter precedence over all other business of the House.
Mr. Speaker, I am rising this afternoon on a question of privilege concerning the leak of key details of Bill C-63, the so-called online harms bill, which was tabled in the House earlier today. While a lot will be said in the days, weeks and months ahead about the bill in the House, its parliamentary journey is not off to a good start.

Yesterday afternoon, the CBC published on its website an article entitled “Ottawa to create regulator to hold online platforms accountable for harmful content: sources”. The article, written by Naama Weingarten and Travis Dhanraj, outlined several aspects of the bill, with the information attributed to two sources “with knowledge of Monday's legislation”. I will read brief excerpts of the CBC's report revealing details of the bill before it was tabled in Parliament:

“The Online Harms Act, expected to be introduced by the federal government on Monday, will include the creation of a new regulator that would hold online platforms accountable for harmful content they host, CBC News has confirmed.”

“The new regulatory body is expected to oversee a digital safety office with the mandate of reducing online harm and will be separate from the Canadian Radio-television and Telecommunications Commission (CRTC), sources say.”

“Sources say some components of the new bill will be modelled on the European Union's Digital Services Act. According to the European Commission, its act ‘regulates online intermediaries and platforms such as marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms.’”

Then, today, CTV News published a second report entitled “Justice Minister to Introduce New Bill to Tackle Harmful Online Content”.
In Rachel Aiello's article, she says, “According to a senior government source [Bill C-63] would be expected to put an emphasis on harms to youth including specific child protection obligations for social media and other online platforms, including enhanced preservation requirements. It targets seven types of online harms: hate speech, terrorist content, incitement to violence, the sharing of non-consensual intimate images, child exploitation, cyberbullying, and inciting self-harm, and includes measures to crack down on non-consensual artificial intelligence pornography, deepfakes and require takedown provisions for what's become known as 'revenge porn'. Further, while the sources suggested there will be no new powers for law enforcement, multiple reports have indicated the bill will propose creating a new digital safety ombudsperson to field Canadians' concerns about platform decisions around content moderation.”

As explained in footnote 125 on page 84 of the House of Commons Procedure and Practice, third edition, on March 19, 2001, “Speaker Milliken ruled that the provision of information concerning legislation to the media without any effective measures to secure the rights of the House constituted a prima facie case of contempt.” The subsequent report of the Standing Committee on Procedure and House Affairs concluded: “This case should serve as a warning that our House will insist on the full recognition of its constitutional function and historic privileges across the full spectrum of government.”

Sadly, Mr. Speaker, the warning has had to be sounded multiple times since.
Your predecessors found similar prima facie contempts on October 15, 2001, April 19, 2016 and March 10, 2020, not to mention several other close-call rulings that fell short of the necessary threshold yet saw the Chair sound cautionary notes for future reference. A number of those close-call rulings occurred under the present government, which would often answer questions of privilege with claims that no one could be certain who had leaked the bill, or even when it had been leaked, citing advance policy consultations with stakeholders.

Mr. Speaker, your immediate predecessor explained, on March 10, 2020, on page 1,892 of the Debates, the balancing act that must be observed. He said:

The rule on the confidentiality of bills on notice exists to ensure that members, in their role as legislators, are the first to know their content when they are introduced. Although it is completely legitimate to carry out consultations when developing a bill or to announce one’s intention to introduce a bill by referring to its public title available on the Notice Paper and Order Paper, it is forbidden to reveal specific measures contained in a bill at the time it is put on notice.

In the present circumstances, no such defence about stakeholders talking about their consultations can be offered. The two sources the CBC relied upon for its reporting were, according to the CBC itself, granted anonymity “because they were not authorized to speak publicly on the matter before the bill is tabled in Parliament.” As for the CTV report, its senior government source “was not authorized to speak publicly about details yet to be made public.”

When similar comments were made by the Canadian Press in its report on the leak of the former Bill C-7 respecting medical assistance in dying, Mr. Speaker, your immediate predecessor had this to say when finding a prima facie contempt in his March 10, 2020 ruling: Everything indicates that the act was deliberate.
It is difficult to posit a misunderstanding or ignorance of the rules in this case.

Just as in 2020, the leakers knew what they were doing. They knew it was wrong, and they knew why it was wrong. The House must stand up for its rights, especially against a government that appears happy to trample over them in pursuit of legislation curtailing Canadians' rights. Mr. Speaker, if you agree with me that there is a prima facie contempt, I am prepared to move the appropriate motion.
  • Feb/26/24 3:29:20 p.m.
  • Re: Bill C-63 
moved for leave to introduce Bill C-63, An Act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, and to make consequential and related amendments to other acts.