The Liberal government is planning to create a "digital safety commission of Canada" to regulate social media companies and require them to take steps to reduce the risk of online harms to their users.

Justice Minister Arif Virani tabled the Online Harms Act today, and creating a new regulator is just one of the measures being proposed.

Prime Minister Justin Trudeau has long promised to better protect against online harms, but his ministers have repeatedly said developing such legislation was complicated.

Bill C-63 seeks to create a new regulator for social-media services and establish a new ombudsperson to advocate for users who have concerns about online safety.

The proposed law would require companies to make two types of content "inaccessible": intimate images shared without consent and content that "sexually victimizes a child."

The government is also proposing to amend the Criminal Code to introduce stiffer punishments for existing hate propaganda offences, and to amend the Canadian Human Rights Act to classify online hate speech as discrimination.

Here are six things Bill C-63 proposes to do.

1. Target specific types of harmful content

The government is looking to target the non-consensual sharing of intimate images — including deepfakes generated by artificial intelligence and content that "sexually victimizes a child or revictimizes a survivor."

The bill would also cover anything online that is used to bully a child or urge them to commit self-harm. 

Content that incites violent extremism or terrorism, along with material that incites violence or stirs hatred, is also included. 

There is overlap with five categories of content the government proposed tackling in a 2021 consultation document. One key difference: the earlier plan included provisions around hate speech writ large, whereas the new bill does not. 

2. Add fresh responsibilities for online platforms 

The bill seeks to usher in new rules for online platforms, including a broadly defined "duty to act responsibly."

Under that banner, companies would be expected to reduce their users' exposure to harmful content by "continuously" assessing such risks, developing mitigation strategies and providing tools for users to flag harmful content. 

The legislation would also require platforms to publish "digital safety plans" outlining the measures they are taking to reduce the risk of exposing users to harmful content and how they track the effectiveness of those measures. Companies would also have to share data with researchers.

The government says the new rules will apply to social-media sites, "user-uploaded adult content" and "live streaming services." 

It says companies must have a certain number of users on their platforms in order to be covered by the new law, a threshold the government says will be decided in later regulations. 

While the government says the goal is to target the platforms Canadians use the most, Trudeau's cabinet can decide to add services with fewer users "when they pose a significant risk of harm."

3. Create a new regulator and a new ombudsperson 

The government seeks to create a new "digital safety commission," which would be made up of five individuals appointed by cabinet.

It would be separate from the Canadian Radio-television and Telecommunications Commission, which regulates traditional broadcasters.

The new body would be in charge of ensuring online platforms follow the rules outlined in the proposed law. 

A new "independent" ombudsperson, which the government says would also be appointed by cabinet, would advocate on behalf of users. 

The ombudsperson would provide users with information about complaints they wish to file and make recommendations not only to social media services but also to the regulator and the government. 

4. Give companies 24 hours to remove certain content

The law seeks to give the new digital safety commission the power to "order removal of content that sexually victimizes a child or revictimizes a survivor," as well as of intimate images shared without an individual's consent. 

It says companies must remove this material within 24 hours.

The government says users can file a complaint with the platform itself or to the new regulator. 

It promises "frivolous" complaints would be screened out. 

5. Strengthen reporting around child pornography

The government is looking to amend an existing law that makes it mandatory for internet services to report instances of child sexual abuse images online. 

It says it wants to ensure these rules apply to social-media platforms and proposes to "create authority to centralize mandatory reporting" of such offences "through a designated law enforcement body."

The amendment also seeks to extend how long such data can be preserved to assist in police investigations, as well as add three more years to the current two-year limitation period for prosecution. 

6. Change the Canadian Human Rights Act and add stiffer sentences for hate crimes 

The government is seeking to add online hate speech as a form of discrimination under the law and allow people to file complaints with the Canadian Human Rights Commission against individuals who post such content. 

It also seeks to make changes to the Criminal Code, including raising the maximum punishment for four hate propaganda offences. 

For example, someone found guilty of advocating genocide could face life imprisonment, up from five years in prison. 

The government is also looking to create a new standalone hate crime offence that could be applied to any other offence, instead of treating hate motivation only as an aggravating factor at sentencing.