• tocano@piefed.social · 23 hours ago

    The article seems to focus on the fact that it’s hard to do “content moderation well at scale”. Wouldn’t such legislation discourage the creation and growth of giant social media “platforms”, giving more space to smaller, more local communities?

      • killingspark@feddit.org · 17 hours ago

        Such legislation normally only applies above a certain threshold of size or profit. That encourages creating small communities that can grow organically up to the point where serious oversight needs to happen.

    • FreedomAdvocate · 22 hours ago

      But if you’re legally responsible for the content that is posted on your platform, not many new companies are going to want to take on that risk.

      What would likely happen is everyone would still only use the big players, but you’d upload your post/photo/video/whatever and it would be like “your post is saved, pending approval. Expected wait time: 18 hours”. It would be a terrible user experience, but a necessary one, because all it would take is one malicious loser uploading CP in the hope of getting it through so they could instantly report the platform to the police and/or file a lawsuit against it.

      What do you mean by “more local, smaller communities”? Many small social media sites, each with so few users that moderation is easy? No one would use them, because there is no one on them.