Do any jurisdictions seriously consider reclassifying social media websites as publishers?


The US Communications Decency Act of 1996 was the first attempt to regulate pornography on the internet.

Moreover, Section 230 of the act has been used by internet companies to avoid being characterised as publishers and thus avoid the responsibilities and obligations of publishers.

Still, it's very difficult to look at a company like Facebook and not see it as a massive publishing company. That such firms are called 'new media' or 'social media' companies suggests this is how they are seen de facto, if not de jure.

Is it perhaps time for legislators to take another look at this legal provision, given that questions have now been raised about the effectiveness of social media companies' self-regulation of content?

Q. Are any jurisdictions seriously considering relaxing this kind of provision?

































  • @Obie2.0 I think the difference is that if someone libels you in a book, you can sue both the author and the publisher, so publishers don't publish authors who do that. If someone posts libel on Facebook, you can't sue Facebook for letting them do that. – Jeff Lambert, Apr 18 at 12:55

  • @Obie2.0: Less dramatically, if Facebook is considered a publisher, it can also be sued for any instance of copyright infringement, which may include anything from linked news articles to pictures of memes. – hszmv, Apr 18 at 13:32

  • Facebook in Germany supports the Impressum, the mandatory data for anyone publishing; I'm not sure whether this means they're counted as a publisher. Facebook and Twitter certainly follow German law on not displaying Nazi symbols. – pjc50, Apr 18 at 14:31

  • This isn't an answer to the question, but regarding some assumptions made in the question: there is a very large difference between a platform like Facebook or Twitter, or, say, the comments section on a blog or news article, and a publisher such as the blog itself or the news articles themselves. Namely, the difference is user-generated content versus editorial content curated by the owner of the site/platform. Social media of any sort (as well as sites like Stack Exchange) would be almost completely impossible if they had editorial liability for user-generated content. – reirab, Apr 18 at 20:56

  • A site would not be able to hire enough editors to review all user-generated content for accuracy or copyright infringement. – reirab, Apr 18 at 20:57

















regulation social-media fake-news






edited Apr 18 at 13:12

asked Apr 18 at 12:27

Mozibur Ullah


2 Answers




































Attorneys General of 47 states sent a letter to Congress in July 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.

The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later:

    If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.

Matt Zimmerman of the Electronic Frontier Foundation had this to say about the AGs' proposal:

    Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.

    [...]

    Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).

It's not clear whether the ACLU or EFF are against modifying the law in any way, or whether they were just against the AGs' specific recommended course of action.

Congress did vote in March 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who "knowingly facilitate" sex trafficking, with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11, 2018. From the debate prior to the vote:

    Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--"knowingly facilitate."

    The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.

It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."



























































Yes, the Lords in the UK debated this in January 2018. From parliament.uk:

    Lords debates online news and content publishers

    Members of the Lords, including a former government digital champion and the shadow spokesperson for digital, culture, media and sport, debated the role played by social media and online platforms as news and content publishers, in the House of Lords on Thursday 11 January.

    This was a balloted debate. They normally take place on a Thursday in the chamber. During debates, members are able to put their experience to good use, discussing current issues and drawing the government's attention to concerns.

    The debate was proposed by Baroness Kidron (Crossbench), member, Royal Foundation Taskforce on the Prevention of Cyberbullying.

In May 2018, the government stated the following in reply to a green paper (page 14 of the linked document):

    Whilst the case for change is clear, we also recognise that applying publisher standards of liability to all online platforms could risk real damage to the digital economy, which would be to the detriment of the public who benefit from them. That is why we are working with our European and international partners, as well as the businesses themselves, to understand how we can make the existing frameworks and definitions work better, and what a liability regime of the future should look like. This will play an important role in helping to protect users from illegal content online and will supplement our Strategy.






























































      2 Answers
      2






      active

      oldest

      votes








      2 Answers
      2






      active

      oldest

      votes









      active

      oldest

      votes






      active

      oldest

      votes









      11


















      Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.



      The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.




      If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.




      Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:




      Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.

      [...]

      Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).




      It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.



      Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:




      Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''


      The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.




      It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."






      share|improve this answer
































        11


















        Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.



        The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.




        If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.




        Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:




        Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.

        [...]

        Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).




        It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.



        Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:




        Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''


        The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.




        It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."






        share|improve this answer






























          11














          11










          11









          Attorneys General of 47 states sent a letter to Congress in July of 2013 recommending that the civil and criminal immunity in Section 230 be removed. So there is broad support for doing something to address internet companies' responsibilities, but it is hard to find agreement on what should be done.



          The ACLU came out in opposition to weakening the law's protections, and went so far as to submit a rebuttal letter to Congress one week later.




          If their proposal were to pass, it would mean that every website on the Internet could be subject to legal liability for violations of an unfathomable number of state laws.




          Matt Zimmerman from the Electronic Frontier Foundation had this to say about The AGs proposal:




          Their approach is wrong, and dangerously so, but even if the AGs disagree and want a debate about how state criminal laws fit into the regulation of the Internet, they owe the public a more honest discussion.

          [...]

          Instead, the AGs are really proposing to do something far more revolutionary in scope: make service providers criminally responsible for what their users do, even if they don't intend for any illegal activity to take place on or through their services (or even have specific knowledge about it).




          It's not clear if the ACLU or EFF are against modifying the law in any way, or if they were just against the AGs specific recommended course of action.



          Congress did vote in March of 2018 to narrowly limit the scope of Section 230, but the limitation was mostly concerned with service providers who 'knowingly facilitate' sex trafficking with the apparent target of the restriction being the online service Backpage.com. The bill was signed into law by President Trump on April 11th, 2018. From the debate prior to the vote:




          Sen. Blumenthal: This bill would clarify section 230 of the Communications Decency Act, which was never intended to give websites a free pass to aid and abet sex trafficking. It was never intended to immunize completely those websites so they could knowingly facilitate sex trafficking. Those words are in the bill--``knowingly facilitate.''


          The purpose of our measure, very simply, is to give survivors their day in court. Right now, the courtroom doors are barred to them, as a recent court of appeals opinion remarked, outrageously so. It would also open avenues of prosecution to law enforcement where they are currently roadblocked.




          It should be noted that Section 230 is not ironclad. Some internet publishers have been found liable in court for "encourag[ing] the development of what is offensive."






          answered Apr 18 at 13:18 by Jeff Lambert (last edited Apr 18 at 13:27)

              Yes, the Lords in the UK did debate this in January of 2018. From parliament.uk:




              Lords debates online news and content publishers



              Members of the Lords, including a former government digital champion and the shadow spokesperson for digital, culture, media and sport, debated the role played by social media and online platforms as news and content publishers, in the House of Lords on Thursday 11 January.



              This was a balloted debate. They normally take place on a Thursday in the chamber. During debates, members are able to put their experience to good use, discussing current issues and drawing the government's attention to concerns.



              The debate was proposed by Baroness Kidron (Crossbench), member, Royal Foundation Taskforce on the Prevention of Cyberbullying.




              In May 2018, the government stated the following in reply to a green paper (on page 14 of the linked document):

              Whilst the case for change is clear, we also recognise that applying publisher standards of liability to all online platforms could risk real damage to the digital economy, which would be to the detriment of the public who benefit from them. That is why we are working with our European and international partners, as well as the businesses themselves, to understand how we can make the existing frameworks and definitions work better, and what a liability regime of the future should look like. This will play an important role in helping to protect users from illegal content online and will supplement our Strategy






                  answered Apr 18 at 13:12 by JJJ (last edited Apr 18 at 13:23)
