
Websites to be fined over 'online harms' under new proposals



Image caption: Molly Russell took her own life after viewing disturbing material on Instagram

Websites could be fined or blocked if they fail to tackle "online harms" such as terrorist propaganda and child abuse, under government plans.

The Department for Digital, Culture, Media and Sport has proposed an independent watchdog and a code of practice that technology companies would be required to follow.

Senior executives could be held personally liable for breaches, with a possible levy on the industry to fund the regulator.

But one think tank called the plans a "historic attack" on freedom of speech.

The Online Harms White Paper covers a range of issues, including the spread of terrorist content, child abuse, so-called revenge pornography, hate crimes, harassment and "fake news".

Ministers also say social networks must tackle material that encourages self-harm and suicide, which became a prominent issue after 14-year-old Molly Russell took her own life in 2017.

After her death, her family found disturbing material about depression and suicide on her Instagram account. Molly's father holds the social media giant partly responsible for her death.

Unveiling the proposals, Digital, Culture, Media and Sport Secretary Jeremy Wright said: "The era of self-regulation for online companies is over.

"The industry's voluntary online networking actions have not been consistently applied or gone far enough."

What are the proposals?

The white paper calls for an independent regulator to hold internet companies to account.

The regulator would be funded by the technology industry. The government has not yet decided whether to establish a new body or to hand new powers to an existing regulator.

It would define a "code of best practice" that social networks and internet companies would have to adhere to.

As well as Facebook, Twitter and Google, the rules would apply to messaging services such as Snapchat and to cloud storage services.

The regulator would have the power to fine companies and to publish notices naming and shaming those that break the rules.

The government says it is also considering fines for individual company executives, and forcing search engines to remove links to offending websites.

Ministers "predict" that fines and company warnings will be included in the final bill. They are further consulting on blocking harmful sites or stopping them from being listed by search engines.

Code of best practice

The white paper offers some suggestions of what could be included in the code of best practice.

It suggests the spread of fake news could be tackled by forcing social networks to employ fact-checkers and to promote legitimate news sources.

But the final code would be drawn up by the regulator itself.

The white paper also says social media companies should produce annual reports revealing how much harmful content has been found on their platforms.

The children's charity the NSPCC has been urging new regulation since 2017 and has repeatedly called for a legal duty of care to be imposed on social networks.

The spokesperson said, "It's time for social networks. The cops did not succeed, and our children paid the price."

Concern about censorship

However, TechUK, the umbrella group representing the UK technology industry, said the government must be "clear about how trade-offs are balanced between preventing harm and protecting fundamental rights".

Matthew Lesh, head of research at the free-market think tank the Adam Smith Institute, went further.

He said: "The government should be surprised at what the Western world is doing in the Internet censorship.

"Proposals are a historic attack on freedom of speech and freedom of the press.

"At a time when Britain criticizes the violations of freedom of expression in countries such as Iran, China and Russia, we should not undermine our freedom at home."

Freedom of speech campaign group Article 19 warned that the government "must not create an environment that encourages the censorship of legitimate expression".

The spokesman said: "Article 19 strongly opposes any duty of concern imposed on Internet platforms.

"We believe that duty care would inevitably require them to proactively monitor their networks and take a restrictive approach to removing content.

"Such actions could violate the rights of individuals to freedom of expression and privacy."

At first glance, this is a tough new regime, and ministers have acted on the demands of charities such as the NSPCC which want to tame what they see as the "Wild West web".

But a closer look reveals all kinds of problems that still need to be worked out.

Will a whole new organisation be given the huge job of regulating the internet? Or will the task be handed to the media regulator Ofcom?

What sanctions will the regulator have at its disposal? Will the rules apply equally to the big social networks and to small organisations such as parents' message boards?

Most crucially, the regulator will have to make judgements about material that is not illegal but may still be considered harmful.

Take this example: disinformation is listed as a potential harm, and Health Secretary Matt Hancock has spoken of the damaging effect of anti-vaccination campaigns.

So, will the regulator tell the companies that their duty of care means they have to remove such material?

The government will now consult on its proposals. It may find that its twin ambitions, to make Britain both the safest place in the world to be online and the best place to start a digital business, are mutually incompatible.

The BBC has a digital guide to online life for parents and teenagers: BBC Own It

