It’s only good because of all the hard work being put in by the moderators. Unfortunately, behind the scenes, Lemmy sucks and is severely lacking in moderation tools to deal with spammers, trolls and sick people who post illegal content.
See this post, for instance; I feel pretty bad for the mods who have to deal with such stuff: https://beehaw.org/post/7943139
It’s not just the mods but also the admins going to great lengths to keep their instances clean. The awfulness outlined in that post makes me unsure whether I should keep hosting my own instance.
Indeed, it’s not really a good idea to run your own instance if you’re not prepared to deal with such content. Many small instance admins have shut down their instances for this very reason.
There was a patch merged recently which disabled caching of federated images, but I believe it still needs some work. There’s some discussion around that over here: https://sh.itjust.works/post/3962112, including an interesting comment suggesting rerouting the /pictrs/ path to 404 so nginx won’t serve any images.
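For anyone who wants to try that in the meantime, here’s a minimal sketch of what such a rule might look like. This is purely illustrative: it assumes the default /pictrs/ route and would sit inside your instance’s existing nginx server block.

```nginx
# Answer every pict-rs request with 404 so the instance never serves
# federated (or locally cached) images to visitors.
location /pictrs/ {
    return 404;
}
```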
Don’t. The software is shit, and you’re not prepared for what you’ll see or for how much time you’ll need to sink into that project.
deleted by creator
I am biased saying this, but I really don’t think Lemmy is bad behind the scenes. On the contrary, I think it’s revolutionary from a technological perspective, not only because of the Fediverse but because of the way it’s implemented and all the great new technologies used.
Keep in mind that this is a FOSS project, and there is obviously no budget to hire moderation teams for CSAM the way the software giants do.
CSAM was an obvious problem from the start, but when it comes down to it, it’s a moderator’s job and not a job for the actual software to do.
Thankfully there are new tools now to help moderators deal with CSAM, which are possibly going to be incorporated into Lemmy, AFAIK.
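As an aside, tooling like that generally works by screening uploads automatically so no human has to look at them first. Here’s a heavily simplified, hypothetical sketch of the idea (real tools use perceptual hashing or ML classifiers rather than exact file hashes, so they also catch re-encoded copies):

```python
import hashlib
from pathlib import Path

# Hypothetical block list: hex digests of known-bad files from a trusted source.
KNOWN_BAD_HASHES: set[str] = set()

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_uploads(upload_dir: Path) -> list[Path]:
    """Return uploaded files that match the block list, so they can be
    removed automatically before a moderator ever sees them."""
    return [
        p for p in upload_dir.rglob("*")
        if p.is_file() and sha256_of(p) in KNOWN_BAD_HASHES
    ]
```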
TLDR: Don’t blame the software for people being shit
Sorry, but I disagree. Note that I don’t disagree with the idea or the technology itself (or with the concept of the Fediverse); the problem is the current state of development. Saying that it’s the moderators’ job doesn’t absolve the software of responsibility when the software, in its current state, doesn’t really provide any decent tools for moderation and user access controls.
CSAM was never a problem on well-configured traditional forums, which were based on forum software such as Invision, vBulletin etc. To elaborate, in traditional forums you’d get a LOT of controls for filtering out the kind of users who post such content. For instance, most forums won’t even let you post until you complete an interactive tutorial first (reading the rules and replying to a bot indicating you’ve understood them, etc.). On top of that, you can have various levels of restrictions: e.g. someone with fewer than 100 posts, or an account less than a month old, may not be able to post any links or images. Also, some forums have a trust system, where a mod can mark your account as trusted or verified, granting you further rights. You can even require manual moderator approval before image-posting rights are granted; in that case, a mod would review your posting history and make sure your posts genuinely contributed to the community and that you’re unlikely to be a troll or karma-farming account.
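Just to make the idea concrete, here’s a rough sketch of that kind of gating logic. It’s not taken from any real forum package; the thresholds and field names are made up, and real forums expose them as admin settings.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class Account:
    created_at: datetime
    post_count: int
    trusted: bool = False  # granted manually by a moderator

# Made-up thresholds for illustration only.
MIN_POSTS_FOR_MEDIA = 100
MIN_ACCOUNT_AGE = timedelta(days=30)

def can_post_images_or_links(account: Account, now: datetime) -> bool:
    """New or unproven accounts get text-only posting; images and links
    require either moderator-granted trust or a decent track record."""
    if account.trusted:
        return True
    old_enough = (now - account.created_at) >= MIN_ACCOUNT_AGE
    return old_enough and account.post_count >= MIN_POSTS_FOR_MEDIA

# Example: a week-old account with 10 posts cannot post images yet.
now = datetime.now(timezone.utc)
newbie = Account(created_at=now - timedelta(days=7), post_count=10)
print(can_post_images_or_links(newbie, now))  # False
```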
So, short of accounts getting compromised/hacked, it’s very difficult to have this sort of stuff happen on a well-configured traditional forum.
I used to be a mod on a couple of popular forums back in the day, and I even ran my own server for a few years (using Invision Power Board), and never once have I had to deal with such content.
The fact is that Lemmy, in its present state, is woefully inadequate for dealing with such content. Dealing with CSAM should never be a volunteer mod’s job - that stuff can scar you for life, or even trigger PTSD or bad memories in those who might’ve suffered abuse in their forgotten past. If people have to be involved, it should be a job for professionals who’re trained to deal with this stuff.
Once again, I don’t disagree with the general idea or the concept of Lemmy; it’s just unfortunate timing that the Reddit exodus happened when the software was still essentially an alpha.
I agree that CSAM protection is lacking, but the software is not an alpha. As a platform I use it just as much as I used to use Reddit and usually it’s much faster, more enjoyable and not profit driven.
CSAM protection is essential, but it’s a very hard problem to solve and naturally it takes time. Feature-wise, though, it’s constantly improving and showing how powerful FOSS can be when enough people are interested in it.
I agree that CSAM needs to be fixed as soon as possible, I’m just pointing out that despite this huge problem, the software is otherwise doing very well and improving faster and faster as more people join in.
That’s because you’re only seeing it from the eyes of a user. Talk to any admin of a big instance and you’ll see how inadequate it is. Or just head over to Beehaw, they have made some very detailed statements on how much of a nightmare Lemmy is, and on the current bleak state of development.
Yeah, I agree. Lemmy obviously isn’t at the level Reddit is, but Reddit has had nearly two decades of development and a larger userbase.
I certainly would like to see Lemmy development happen a bit more quickly, and in particular better 3rd party/mod tools (I REALLY want a RES for Lemmy), but I don’t think we’re in a bad place on the Lemmy timeline.
I’m just missing a solid Eyeblech here. I miss that the most from Reddit.
Be the change you want to see!
It’s quick and easy to spin up a community/sub!
Oh dear, I certainly hope I’m not involved in any eyeblech… stuff.
Also I think something like that would be defederated from most places.
I just think it’s neat.
I think that’s a bit of a dangerous take that we need to address. There seem to be very real legal and practical risks and obstacles to safely and effectively running an instance. I don’t want to see people getting hurt or disillusioned prematurely because they had rose-colored glasses about this subject.
I hear that sometimes, but what legal trouble has actually ever happened?
I’m in the EU, so of course I might get a notice of some sort, I guess, and that’s all fine. But as long as I don’t make money from it or do illegal things all day long, what is this “hypothetical” danger?
I call BS, at least for the EU.
I don’t know that anything has technically happened with Lemmy yet, but I do know about a guy who got charged and convicted (his sentence was suspended or something, I believe) of facilitating the transmission of CSAM because he hosted a Tor endpoint/exit node.
I might be conflating issues, and Tor and Lemmy aren’t the same thing, but the fundamental dangers are shared, I believe. Different countries have different rules about Safe Harbour stuff or whatever, so you’ll have to research it yourself a bit more. I simply don’t have the time.
Where was that? In the US?
Goddam, I really hope I never come into contact with anything like that. I think it would turn me into a fanatical vigilante.
deleted by creator
I answered that here: https://lemmy.ml/comment/3916556
deleted by creator
deleted by creator
Besides contributing actual code… not really. You can donate money which will help with the upkeep of servers, but that’s not really an issue with most instances. No amount of money can compensate someone (normal) for dealing with the trauma associated with such content. So yeah, the only thing that can really help right now is tools for moderation and user access controls.
deleted by creator