Discord announced on Monday that it’s rolling out age verification on its platform globally starting next month, when it will automatically set all users’ accounts to a “teen-appropriate” experience unless they demonstrate that they’re adults.

Users who aren’t verified as adults will not be able to access age-restricted servers and channels, won’t be able to speak in Discord’s livestream-like “stage” channels, and will see filters applied to any content Discord detects as graphic or sensitive. They will also get warning prompts for friend requests from potentially unfamiliar users, and DMs from unfamiliar users will be automatically filtered into a separate inbox.

Direct messages and servers that are not age-restricted will continue to function normally, but users won’t be able to send messages or view content in an age-restricted server until they complete the age check process, even if it’s a server they were part of before age verification rolled out. Savannah Badalich, Discord’s global head of product policy, said in an interview with The Verge that those servers will be “obfuscated” with a black screen until the user verifies they’re an adult. Users also won’t be able to join any new age-restricted servers without verifying their age.

  • paraphrand@lemmy.world · 1 day ago

    Ok, let’s say it is. They just want to invade our privacy. Now set that aside.

    On a separate topic: What’s an alternative solution for the “there’s porn on the playground” problem that discord has? They are participants in it, they are facilitators. They shouldn’t be immune. Giving platforms a pass on things like this is pernicious. Giving platforms a pass is why Elon Musk thinks he can get away with CSAM generators.

    • iceonfire1@lemmy.world · 12 hours ago

      Are platforms responsible for the content their users post? No. Rules for unusual situations (CSAM, terrorism, copyright, etc.) have been established for decades at this point.

      An online chat/message board is not comparable to an image generator because the company is not creating potentially harmful content.

      • paraphrand@lemmy.world · 8 hours ago (edited)

        I think it has become increasingly clear that letting platforms off the hook has been really bad for society across the world, across many dimensions. We need something between zero responsibility and full responsibility.

        • iceonfire1@lemmy.world · 10 hours ago

          These are actually separate issues. TPB (The Pirate Bay) isn’t legally responsible for theft if someone leaks an unreleased song. But, the publishing company can sue them for damages related to lost income based on TPB’s distribution of content.

          • Trainguyrom@reddthat.com · 7 hours ago

            > But, the publishing company can sue them for damages related to lost income based on TPB’s distribution of content.

            Wasn’t the argument supposed to be that because they only link to torrents, but don’t host the content the torrents themselves point to, they’re not a distributor, just a link aggregator?

    • TheNamlessGuy@lemmy.world · 14 hours ago

      The issue isn’t age verification, but rather how it’s handled. It SHOULD be handled by Discord sending a request to my local government asking them “is this person an adult?”, with me confirming who I am to the government (and to them only). None of my personal info should be handled by Discord, and it should frankly be illegal for them to demand it like this.

      • Trainguyrom@reddthat.com · 7 hours ago

        Any kind of required age verification has significant privacy and security implications. Honestly I think the best approach is the pinky promise we’ve generally had until now, where by default the platform will not display explicit content until the user actively consents and asserts that they are of legal age.

        Why exactly do we need to be verifying age? Legal and government documents and agreements are already covered by perjury laws, and physical deliveries and purchases are already handled by photo ID checks. Porn is of course harmful to teens and preteens, but they’ve always found ways to access it, even before the home computer era (and honestly this would be better handled through education by schools and parents than through forceful legislation).

      • paraphrand@lemmy.world · 7 hours ago

        I don’t disagree. We need better tools for this.

        The problem is making sure these tools are not used as tools of the state for tracking.

    • spring_cedar_dust@reddthat.com · 1 day ago

      I’m sorry, are we putting aside biometric information being aggregated across multiple platforms and tied to browsing habits, so that governments know exactly where you’re browsing at any given moment?

      Are we just putting that aside???

      If discord can’t handle it don’t blame me. I mean what even is this point???

      • paraphrand@lemmy.world · 6 hours ago (edited)

        I’m not putting it aside to dismiss the idea that it’s bad. I was setting it aside just for the sake of conversation. You’re really overreacting to a conversation.

        This age verification stuff is an invasion of privacy, and we should have a better answer to the claims that it is the only way to put in protection on the platform side. “Parents” is not the answer to the platform’s responsibility here as a facilitator who profits from facilitating.

        Age verification systems are invasive and harm privacy AND it’s weird that we think platforms should be all ages with no serious/effective barriers. It’s weird that we just act like there is no solution, and bemoan parents.

        I was quite clear about switching the topic focus.

        • WhyJiffie@sh.itjust.works · 18 hours ago

          the better solution is to ban porn from the platform. that’s it. there are some people who won’t like it, but they will find another porn platform.

    • Bakkoda@lemmy.world · 1 day ago

      Boycott the platform. No one’s gonna rescue the end user anymore. There’s no more oversight. Remove yourself from the ecosystem and encourage healthier alternatives.