Instagram users have told the BBC of the “extreme stress” of having their accounts banned after being wrongly accused by the platform of breaching its rules on child sexual exploitation.

The BBC has been in touch with three people who were told by parent company Meta that their accounts were being permanently disabled, only to have them reinstated shortly after their cases were highlighted to journalists.

  • some_guy@lemmy.sdf.org · 1 point · 28 minutes ago

    That’s an OK mistake to make that has zero chance of ruining someone’s life and reputation. /s

  • Lime Buzz (fae/she)@beehaw.org · 12 points · 8 hours ago

    Whilst this is terrible, it’s a good reminder that companies that use automated bots, algorithms, and underpaid, exploited workers for moderation will never be good for social media platforms.

    “Commercial” and “social media” should never go together in the same sentence if you want a platform that is good and truly for the people who use it.

      • Lime Buzz (fae/she)@beehaw.org · 5 points · edited · 2 hours ago

        I’ve been a moderator on many, many things, and this is what I’ve learned:

        • Firstly, make sure the team has tonnes of support from the get-go, both from each other and from mental health services etc., as a bare minimum.

        • Secondly, understand if people need to take breaks, as the work can take a real toll.

        • Thirdly, nobody on the mod team should ever talk down to or criticise others’ work, especially not in a group or public setting. Ask them how they’re doing, ask them what support or information they need, and then give it to them.

        • Fourthly, explain what is important and why it’s important in a way each person can understand. Have detailed documents explaining this as well.

        • Fifthly, give the users of the service easy-to-understand, non-legalese rules, CoCs, and guidelines.

        • Sixthly, if someone violates the rules, talk to them personally; if it’s harmful, delete it; and if it’s really harmful (such as CSAM), remove them and don’t allow them back on, no matter what (see the sketch after this list).

        Edit: Oh, and make sure the mod team is diverse, as the things caught by only one subset of the population won’t be the same as those caught by others. Having people of colour, queer, trans and nonbinary people, cis women, disabled people, and neurodivergent people on the mod team is essential for good moderation.

        That’s not an exhaustive list, but it’s what I’ve learned both by being a moderator on many, many things and by watching others moderate. Hope this helps!
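
        A minimal sketch of that sixth point’s escalation ladder, in Python; the severity levels and action strings are illustrative assumptions, not any real platform’s moderation API.

        ```python
        from enum import Enum

        class Severity(Enum):
            MINOR = 1    # rule breach with no real harm done
            HARMFUL = 2  # content that harms others
            SEVERE = 3   # e.g. CSAM: zero tolerance

        def moderate(severity: Severity) -> list[str]:
            """Return the mod actions for a violation, per the ladder above."""
            if severity is Severity.MINOR:
                return ["talk to the user personally"]
            if severity is Severity.HARMFUL:
                return ["delete the content", "talk to the user personally"]
            # SEVERE: remove them and don't allow them back, no matter what
            return ["delete the content", "permanently ban the user"]

        print(moderate(Severity.SEVERE))
        # ['delete the content', 'permanently ban the user']
        ```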

          • Novaling@lemmy.zip · 4 points · 2 hours ago

            Lol, depends on the group and mod level. Yeah, the overall TikTok and Insta moderators get paid (poorly) to do their job, but do subreddit and individual Discord server mods get paid?

            If we’re talking about the Fediverse, maybe the mods of an instance would get paid, but I can bet most Fedi mods right now aren’t being paid; they do it for love.

            The biggest point is this:

            the team has tonnes of support from the get-go, both from each other and from mental health services etc., as a bare minimum

            There was a post the other day about protesting TikTok mods who talked about the horrifying amounts of gore and NSFL material they had to remove from the service, while getting no support or concern for their well-being from TikTok. So you can absolutely pay someone (although probably not well, even at a billion-dollar company), but all the money in the world won’t erase looking at 300 beheading videos.

            I think the biggest thing helping the Fediverse is the fact that it’s fragmented and not meant to get too big per instance. No one will ever have to moderate millions of accounts like big tech does; mods can suspend sign-ups and set their own limits. Imagine a scenario where each instance is only 1K people: that’s a far easier job than what YT and Insta have to do, as the rough sketch below shows.
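
            As a back-of-envelope illustration of that scaling argument (the report rate and the account counts are assumptions picked purely for illustration):

            ```python
            # Assumed, purely illustrative rate: 5 user reports per 1,000 accounts per day.
            REPORTS_PER_1K_ACCOUNTS_PER_DAY = 5

            def daily_reports(accounts: int) -> int:
                """Rough size of the daily moderation queue for a given user base."""
                return accounts * REPORTS_PER_1K_ACCOUNTS_PER_DAY // 1_000

            print(daily_reports(1_000))          # a 1K-person instance: 5 reports/day
            print(daily_reports(2_000_000_000))  # roughly Instagram-scale: 10,000,000 reports/day
            ```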