
    The measures have been brought into sharper focus since the live streaming on one of Facebook’s platforms of a lone gunman killing 50 people at two New Zealand mosques in March.

    The parliament voted 308 to 204, with 70 abstentions, to back the proposal to tackle the misuse of internet hosting services for “terrorist purposes”.

    “Companies that systematically and persistently fail to abide by the law may be sanctioned with up to 4 percent of their global turnover,” it said.

    A new European Parliament, to be elected on May 23-26, will finalize the text of the law in negotiations with the European Commission and representatives of EU governments, a process likely to take many months.

    “There is clearly a problem with terrorist material circulating unchecked on the internet for too long,” said Daniel Dalton, the parliament’s rapporteur for the proposal.

    “This propaganda can be linked to actual terrorist incidents and national authorities must be able to act decisively. Any new legislation must be practical and proportionate if we are to safeguard free speech,” he said.

    “It … absolutely cannot lead to a general monitoring of content by the back door.”

    EU officials moved to regulate because they believe internet companies are not doing enough under voluntary measures, arguing that the first hour is the most vital for stemming the viral spread of online content.

    Facebook said it removed 1.5 million videos containing footage of the New Zealand attack in the first 24 hours after the shootings.

    Three U.N. special rapporteurs for human rights and the EU’s own rights watchdog have voiced worries that the new rules are inadequate and could be misused.


    Companies rely on a mix of automated tools and human moderators to spot and delete extremist content. However, when illegal content is taken down from one platform, it often crops up on another, straining authorities’ ability to police the web.

    In response to industry concerns that smaller platforms do not have the same resources to comply as speedily with tougher EU rules, lawmakers said authorities should take into account the size and revenue of companies concerned.

    Draft measures call on the bloc’s national governments to put in place the tools to identify extremist content and an appeals procedure. The one-hour rule would apply from the point of notification by national authorities.

    Brussels has been at the forefront of a push by regulators worldwide to force tech companies to take greater responsibility for content on their sites.

    Story first appeared at reuters.com
