C-63: Frequently Asked Questions
-
C-63 regulates large social media services. A social media service is defined as a website or application that is accessible in Canada and whose primary purpose is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content. It must enable a user to communicate content to the public. Adult content sites and live streaming services are also included. A regulated service is one that: is of a size determined by the regulator; or is smaller but falls under special regulations. Operators of regulated services are also subject to regulation and potential liability.
The Act excludes any private messaging feature of the regulated service.
-
The bill is concerned with the risk of seven forms of harmful content: (1) Intimate content communicated without consent; (2) Content that sexually victimizes a child or revictimizes a survivor; (3) Content that induces a child to harm themselves; (4) Content used to bully a child; (5) Content that foments hatred; (6) Content that incites violence; and, (7) Content that incites violent extremism or terrorism.
-
Regulated services must abide by four duties and one data transparency requirement: 1) the duty to act responsibly; 2) the duty to protect children; 3) the duty to make certain content inaccessible; 4) the duty to keep records; and 5) the transparency provision requiring that inventories of electronic data be made accessible.
-
The Duty to Act Responsibly requires regulated services to implement “measures that are adequate to mitigate the risk that users of the service will be exposed to harmful content on the service while protecting their freedom of expression.” In order to minimize the risk of harmful content, regulated services must:
Submit a Digital Safety Plan that includes: 1) a risk assessment of how their users may be exposed to harmful content; 2) a description of their risk mitigation plan; 3) their assessment of the effectiveness of these measures; and, 4) data on how they came to this determination.
A version of the plan, excluding user data, financial data, and trade secrets, must be made public.
This Digital Safety Plan provides a substantial transparency mechanism. Operators are required to share data on: 1) how they complied with the regulations; 2) design features implemented to minimize risk; 3) measures taken to protect children; 4) human resources deployed to comply; 5) the volume of harmful content moderated; 6) content that was flagged by users; 7) non-harmful content moderation; 8) the complaints heard by the internal resource person; 9) any internal research conducted on harmful content; 10) how they complied with child pornography responsibilities; 11) all data used to comply; and, 12) any other information provided by regulation.
Regulated social media services must also:
Provide blocking and harmful content flagging tools — operators do not have to act on a flag, but must inform the user who flagged the content of any action taken, and notify the user whose content was flagged.
Label automated content — this bot provision covers content that is likely to have been repeatedly posted by an automated program, or amplified by multiple instances of automated communication.
Make available a harmful content resource person — this role serves as an internal point of contact who responds to user flags and directs users to internal and external resources.
Provide clear guidelines — operators must make user guidelines publicly available, including a standard of conduct with respect to harmful content and the measures for addressing it.
Not notify users of law enforcement reporting — the operator must not notify a user that it has made a report to a law enforcement agency in relation to the content.
-
The Duty to Protect Children will be determined in regulation. An operator must integrate into a regulated service that it operates any design features respecting the protection of children, such as age-appropriate design, that are provided for by regulations.
-
The Duty to Make Certain Content Inaccessible mandates that two types of content be made inaccessible by operators: 1) content that sexually victimizes a child or revictimizes a survivor; and, 2) intimate content communicated without consent.
If an operator reasonably suspects either of these types of content, they must, within 24 hours: 1) make that content inaccessible to all persons in Canada and continue to make it inaccessible until the operator makes a decision; and, 2) give notice to the user who communicated the content on the service that the content has been made inaccessible.
This does not automatically include content flagged by users: if a user flags content as one of these two types, the operator must first conduct an initial assessment and determine whether the flag is trivial or has already been reviewed.
The operator must reconsider content decisions appealed by users who posted these two types of content.
-
The operator of a regulated service must keep all records, including information and data, that are necessary to determine whether the operator is complying with the operator’s duties under this Act.
-
The Transparency Provision mandates that regulated services share data for research purposes with persons accredited by the regulator, if: 1) the person’s primary purpose is to conduct research or engage in education, advocacy or awareness activities; and 2) the person conducts research that is, or engages in education, advocacy or awareness activities that are, related to the purposes of this Act.
A person is defined to include a corporation, a trust, a partnership, a fund, a joint venture and any other unincorporated association or organization.
At the request of an accredited person, the regulator may make an order requiring the operator of a regulated service to give access to any electronic data that is referred to in an inventory of electronic data that is included in a Digital Safety Plan.
-
Amendments to the Criminal Code to:
Create a hate crime offence for committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
Create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
Define “hatred” for the purposes of the new offence and the hate propaganda offences; and
Increase the maximum sentences for the hate propaganda offences.
Amendments to the Canadian Human Rights Act to:
Provide that it is a discriminatory practice to communicate, or cause to be communicated, hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination.
This authorizes the Canadian Human Rights Commission to accept complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Amendments to an Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to:
Clarify the types of Internet services covered by that Act;
Simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
Require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
Extend the period of preservation of data related to an offence;
Extend the limitation period for the prosecution of an offence under that Act; and
Add certain regulation-making powers.
-
The Act creates three new entities to enforce the duties and responsibilities on regulated social media services: 1) The Digital Safety Commission of Canada; 2) The Digital Safety Ombudsperson of Canada; 3) The Digital Safety Office of Canada.
Digital Safety Commission of Canada
The Commission’s mandate is to promote online safety in Canada and contribute to the reduction of harms caused to persons in Canada as a result of harmful content online by, among other things:
Ensuring the administration and enforcement of this Act;
Ensuring that operators are transparent and accountable with respect to their duties under this Act;
Investigating complaints relating to content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent;
Contributing to the development of standards with respect to online safety through research and educational activities;
Facilitating the participation of Indigenous peoples of Canada and interested persons in the Commission’s activities; and
Collaborating with interested persons, including operators, the Commission’s international counterparts and other persons having professional, technical or specialized knowledge.
The Commission will consist of three to five full-time members appointed by the Governor in Council to hold office during good behaviour.
Digital Safety Ombudsperson of Canada
The Ombudsperson’s mandate is to provide support to users of regulated services and advocate for the public interest with respect to systemic issues related to online safety.
The Ombudsperson may:
Gather information with respect to issues related to online safety, including with respect to harmful content, such as by obtaining the perspective of users of regulated services and victims of harmful content;
Highlight issues related to online safety, including by making publicly available any information gathered under paragraph (a), other than personal information; and
Direct users to resources, including those provided for under this Act, that may address their concerns regarding harmful content.
Digital Safety Office of Canada
The Office’s mandate is to support the Commission and the Ombudsperson in the fulfillment of their mandates, the exercise of their powers and the performance of their duties and functions.
On the recommendation of the Minister, the Governor in Council is to appoint a Chief Executive Officer of the Office. The Chief Executive Officer will:
Have the rank and status of a deputy head of a department;
Be responsible for the management of the Office’s day-to-day business and affairs; and
Supervise the Office’s employees and their work.
The head office of the Office is to be at any place in Canada that may be designated by the Governor in Council.
-
The regulator has investigative powers to:
Summon and enforce the appearance of persons before the Commission and compel them to give oral or written evidence on oath and to produce any documents or other things that the Commission considers necessary, in the same manner and to the same extent as a superior court of record.
Hold public or private hearings.
Designate inspectors it considers qualified for the purposes of verifying compliance or preventing non-compliance with this Act.
Inspectors may, for a purpose related to verifying compliance or preventing non-compliance with this Act, enter any place in which they have reasonable grounds to believe that there is any document, information or other thing relevant to that purpose.
An inspector is considered to have entered a place if they access it remotely by a means of telecommunication with the knowledge of the owner.
Power to issue orders:
If the Commission has reasonable grounds to believe that an operator is contravening or has contravened this Act, it may make an order requiring the operator to take, or refrain from taking, any measure to ensure compliance with this Act.
An order of the Commission may be made an order of the Federal Court and is enforceable in the same manner as an order of that court.
An order may be made an order of the Federal Court by following the usual practice and procedure of that court or by filing a certified copy of the order with the registrar of that court.
-
A person in Canada may make submissions to the Commission respecting harmful content that is accessible on a regulated service or the measures taken by the operator of a regulated service to comply with the operator’s duties under this Act.
This provision protects the identity of a complainant who works for the operator that is the subject of the complaint. It is a whistleblower protection measure.
They may also make a complaint to the Commission that content on a regulated service is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent.
For these two types of content only, the Commission may investigate and, if warranted, order that the content be made inaccessible.
The Commission will conduct an initial review of the complaint and dismiss it if it is found to be trivial or is already the subject of another complaint. If the Commission does not dismiss the complaint, it must:
Give notice of the complaint to the operator of the regulated service to which the complaint relates and the user who communicated the content on the service; and
Make an order requiring the operator to, without delay, make the content inaccessible to all persons in Canada and to continue to make it inaccessible until the Commission gives notice to the operator of its decision under subsection 82(4) or (5), as the case may be.
The Commission must then decide whether there are reasonable grounds to believe that the content is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent.
If there are, the Commission gives notice of its decision and orders the operator to make the content permanently inaccessible.
-
There are two types of punishments for non-compliance: violations and offences.
Violations are punishable by administrative monetary penalties intended to promote compliance with the Act, which may be imposed if an operator:
Contravenes a provision of this Act or the regulations;
Contravenes an order of the Commission;
Contravenes a requirement imposed by an inspector under section 93;
Contravenes an undertaking that it entered into with the Commission or a person authorized to enter into undertakings;
Contravenes a requirement imposed by the Commission under section 117 or subsection 119(2);
Obstructs or hinders the Commission, an inspector or a person authorized to issue a notice of violation, in the exercise of their powers or the performance of their duties and functions; or
Makes a false or misleading statement orally or in writing to the Commission, an inspector or a person authorized to issue a notice of violation, in the exercise of their powers or the performance of their duties and functions.
A violation that is continued on more than one day constitutes a separate violation in respect of each day on which it is continued.
The maximum penalty for a violation is 6% of the gross global revenue of the person that is believed to have committed the violation or $10 million, whichever is greater. The amount of the penalty is to be determined by taking into account the following factors:
The nature and scope of the violation;
The history of compliance with this Act by the person that is believed to have committed the violation;
Any benefit that the person obtained by committing the violation;
The ability of the person to pay the penalty and the likely effect of paying it on their ability to carry on their business;
The purpose of the penalty;
Any factor prescribed by regulation; and
Any other relevant factor.
An operator commits an offence, and may be subject to penalties, if it:
Contravenes an order of the Commission;
Contravenes an undertaking that it entered into with the Commission or a person authorized to enter into undertakings;
Contravenes a requirement imposed by the Commission under section 117 or subsection 119(2);
Obstructs or hinders the Commission, an inspector or a person authorized to issue a notice of violation, in the exercise of their powers or the performance of their duties and functions; or
Makes a false or misleading statement orally or in writing to the Commission, an inspector or a person authorized to issue a notice of violation, in the exercise of their powers or the performance of their duties and functions.
Penalties for offences include:
On conviction on indictment, a fine of not more than 8% of the operator’s gross global revenue or $25 million, whichever is greater; or, on summary conviction, a fine of not more than 7% of the operator’s gross global revenue or $20 million, whichever is greater.
There is also personal liability for persons that commit an offence. Penalties include:
On conviction on indictment, a fine of not more than 3% of the person’s gross global revenue or $10 million, whichever is greater, in the case of a person that is not an individual, and a fine at the discretion of the court, in the case of an individual; or
On summary conviction, a fine of not more than 2% of the person’s gross global revenue or $5 million, whichever is greater, in the case of a person that is not an individual, and a fine of not more than $50,000, in the case of an individual.
A person is not to be found guilty of an offence if they establish that they exercised due diligence to prevent the commission of the offence.
-
No later than the fifth anniversary of the day on which this section comes into force, and every five years after that, the Minister must cause a review of the Act and its operation to be undertaken. The Minister must cause a report on the review to be laid before each House of Parliament within one year after the review is completed.