• Steve@communick.news
    2 days ago

    On one hand the Judge is right. On the other hand the lawyer is right. Then on two more hands, they’re both wrong.

    Yes, it’s bad to legislate by moral panic. Yes, kids are addicted to social media. Those are both facts.

    The reason age gating is a bad idea isn’t moral panic, or “the children”. It’s because we’re ALL addicted to social media. It isn’t just the kids; it’s adults as well. The problem is the intentionally addictive algorithms, meticulously engineered to keep us scrolling. I’m telling you, in 50 years we’ll know how all the social media companies were hiding and lying about the addictive, harmful nature of their business, just like we know about tobacco and oil companies today.

    The best solution I can think of is to revisit Section 230. You can’t hold these companies responsible for what people post to their sites, but we can and must hold them accountable for what they recommend! If you have a simple, easily definable sorting or ranking system over what people choose to follow? You’re fine; no accountability for something bad showing up. If you have some black-box algorithm of infinite scrolling, based on criteria so complex that nobody can really break down and explain exactly why a specific post was shown to a specific individual? Now you’re on the hook for what they see.
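    To make the distinction concrete, here’s a minimal sketch of the kind of “simple, easily definable” feed I mean (Python, with made-up names; not any real platform’s API): only accounts you follow, newest first, so every item has a one-sentence explanation for why it appeared.

    ```python
    from dataclasses import dataclass
    from datetime import datetime

    # Hypothetical post record; field names are illustrative only.
    @dataclass
    class Post:
        author: str
        timestamp: datetime
        text: str

    def explainable_feed(posts, followed):
        """Reverse-chronological feed of followed accounts only.

        Every item is explainable in one sentence:
        'shown because you follow the author, newest first'.
        """
        return sorted(
            (p for p in posts if p.author in followed),
            key=lambda p: p.timestamp,
            reverse=True,
        )

    posts = [
        Post("alice", datetime(2024, 1, 2), "hi"),
        Post("bob", datetime(2024, 1, 3), "hello"),
        Post("mallory", datetime(2024, 1, 1), "ad"),
    ]
    feed = explainable_feed(posts, followed={"alice", "bob"})
    print([p.author for p in feed])  # followed authors only, newest first
    ```

    Contrast that with an engagement-optimized recommender, where no such one-sentence explanation exists for any given item.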

    • Flax@feddit.uk
      2 days ago

      I think it would depend on what they recommend. I think some algorithms are fine, like hashtags in common with content you liked or posts from the same person, posts that are overall well liked that day, obviously stuff you follow, etc. But specifically engineering stuff that annoys you to appear, or starting to recommend the same political agenda to everyone regardless of how they interact with the platform, etc, shouldn’t be okay.
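      For instance, the “hashtags in common with content you liked” heuristic is trivially explainable; a hypothetical sketch (illustrative names only, not any real platform’s code):

      ```python
      def hashtag_overlap_score(post_tags, liked_tags):
          # Transparent score: how many hashtags this post shares with
          # hashtags on content the user previously liked.
          return len(set(post_tags) & set(liked_tags))

      # A user who liked posts tagged #rust and #cycling:
      score = hashtag_overlap_score({"rust", "webdev"}, {"rust", "cycling"})
      print(score)  # one shared tag, so it's easy to say why it was shown
      ```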

      • Steve@communick.news
        2 days ago

        Yes, the idea isn’t that they aren’t allowed to recommend anything. It’s that they can be held accountable (i.e. sued) if what they recommend leads to people being radicalized by a hate group, or attempting suicide from cyberbullying. Or even just needing extra therapy from doom-scrolling ourselves to sleep. Right now Section 230 says they can’t be held liable for anything on their sites. Which is obviously stupid.