UK considers tougher Online Safety Act after UK riots, Musk comments
As the riots raged in the U.K., Elon Musk began making incendiary comments about the situation, including the statement: “Civil war is inevitable.” Musk is the owner of X, the social media platform formerly known as Twitter.
Aytug Can Sencar | Anadolu | Getty Images
LONDON — Prime Minister Keir Starmer’s Labour government is considering ways to toughen up internet safety regulations in the U.K. after misinformation sparked a spate of anti-immigration protests and X owner Elon Musk made incendiary comments in posts that were viewed by millions of people.
Two industry sources with knowledge of the matter told CNBC that following the events of the past two weeks, Labour is considering a review of the Online Safety Act — legislation that requires tech giants to prevent the spread of illegal and harmful content on their platforms.
These sources were not authorized to speak publicly about the proposed changes, as the conversations surrounding revamped online safety laws are ongoing.
In recent days, top officials have said that the government may review the Online Safety Act to make it tougher on disinformation, hate speech and incitement to violence.
“There are obviously aspects of the Online Safety Act that haven’t come into effect yet. We stand ready to make changes if necessary,” Nick Thomas-Symonds, minister for the Cabinet Office, told CNBC sister network Sky News.
Ofcom, the media and telecommunications regulator, has been unable to act against social media platforms for allowing hate speech and other content that would violate the law, because the legislation has not yet fully come into force.
What is the Online Safety Act, exactly? And what could it mean for tech firms like Elon Musk’s X? CNBC runs through all you need to know.
What is the Online Safety Act?
The Online Safety Act is a landmark piece of legislation in the U.K. that seeks to force social networks and video streaming media companies to rid their platforms of illegal content.
The regulation contains new duties that require tech companies to actively identify, mitigate and manage the risks of harm from such material appearing on their platforms.
There are several examples of content that, if reported, could make a company liable for criminal sanctions. These include child sexual abuse, fraud, racially or religiously aggravated offenses, incitement to violence, and terrorism.
Once the rules take effect, Ofcom will have the power to levy fines of as much as 10% of companies’ global annual revenues for breaches. In cases of repeat breaches, individual senior managers could even face jail time.
Ofcom has said the new duties on tech firms won’t fully come into force until 2025, once it’s finished consulting on codes of conduct for the companies.
Why are there calls for the law to change?
Two weeks ago, a 17-year-old knifeman attacked several children attending a Taylor Swift-themed dance class in the English town of Southport in…