lift the ban on self-harm talks
I don't support lifting the ban on NSFW content outside of NSFW chats; unsolicited NSFW content is really intrusive and should stay banned.
I must also preface this by saying that I don't support the unbanning of lolicon, shotacon, and guro; that content normalizes pedophilia and could seriously affect people with histories of abuse.
However, the ban on self-harm talk will inevitably lead to people avoiding the issues that affect them. When people are contemplating suicide, they need a space where they can talk about their problems without facing legal action or account suspension, and it's often preferable to talk to a group of friends you know very well rather than some impersonal therapist. I'm obviously not going to advocate _never_ talking to professionals, that would be ridiculous, but for some people getting professionals involved can be a huge stressor, and it inevitably leads to things like parents and others finding out about their issues, which could be unsafe or extremely stressful.
I should also note that I am okay with banning the advocacy of self-harm, such as the recent trend of "kys" jokes, which can be particularly damaging psychologically.
Ash Davis commented
AFAIK the guidelines only talked about promoting it. Right below it, it said that you should be uplifting to others.
I don't think people who seriously mention it (because it's a cry for help, or they are helping someone) are getting banned.
I do want to ban all the people (I'd like to say idiots) who use "kys" and "hang yourself" as casual insults.
Led Skull1337 commented
I didn't know there was a ban on self-harm conversations, but as a suicide prevention counselor I can definitely say that banning people from talking about self-harm is an absolutely terrible and immoral idea. Also, contrary to popular belief, jokingly saying "kys" __does not make people kill themselves, nor does talking about suicide make people kill themselves.__ That's just a myth. If people can't talk about their self-harm problems, it will go unnoticed and they won't get help, and ***that*** makes people kill themselves.
If you take a closer look at the guidelines, they specifically forbid "Sharing content glorifying or promoting, in general, self-harm or suicide." They don't forbid people from seeking help or talking about it.
Hold your tatas people, guro isn't banned. (Here: tumblr.com tagged guro) Most guro isn't pedophilia.
There are way too many people excusing pedophilia in the comments so here is my two cents:
Self harm discussions are ok but graphic pictures of it are not.
Helmic and Stranger have great input that exemplifies exactly what I was going to say.
NSFW content outside of NSFW channels should be a per-server option. Admins and moderators, along with individual server settings and rules, should be the deciding factor, not the devs. Discord is meant for interaction between people and is, in general, pretty private. Private servers, private DMs, invite-only server access, the admins and moderators who enforce each server's guidelines, and even the members themselves: all of this already filters out people who don't want to be on an NSFW server or see NSFW content.
For public servers, then, disabling NSFW content outside of NSFW channels could be an option rather than the default.
As for talks of self-harm and suicide... it's a grey area. On one hand, people should be allowed to ask for help, but I can't help but point out that Discord is not a suicide hotline or chat service, and it's not random people's responsibility to remain capable of helping someone with an issue that is very difficult to talk about. Let alone what Helmic mentioned about trolls and other users abusing the "I'm going to kill myself" line to cause a ruckus, or just for attention without any actual ideation behind it.
IMO, there are DMs for talking to people about really personal stuff like this, and it shouldn't be approached in a public server as it is. Even if some people there are helpful, plenty won't be, plenty of others might be triggered by it, etc. There's really no winning in this scenario.
As for the bans related to it, I do believe they're aimed more at people encouraging it, demanding it, or harassing others with it, not at people who are asking for help. Some clarity is perhaps in order there.
As for lolicon/shotacon etc., connecting them to pedophilia is really assumptive of you and many people. They are fake; they are drawn/illustrated/animated MEDIA, and no more representative of real people than that anime schoolgirl who's realistically 15 is. The very reason this content exists is that it doesn't fall under child porn law, and therefore it shouldn't really be banned through Discord either, IMO. Again, I think it's something that should be moderated on a server-to-server basis, not by the devs.
I do agree with the post on lifting the ban on talks of suicide and self-harm; however, I think banning people for even jokingly saying "kys" is really excessive. Not that I advocate self-harm or suicide in any way, shape, or form, but what if it's said jokingly and understood by everyone as such, especially if the speaker says so to be sure? That person, who might actually be a genuinely good friend, gets banned for a joke, even one made in the context of discussing this kind of serious issue. What if it's just an utterance of annoyance rather than a serious suggestion? It's context-dependent: if someone genuinely said "Kill yourself." to someone who's suicidal, then I would understand reporting and penalizing them. But doing it for a joke, especially one the individual explicitly marks as such? Again, context.
Okay, I read the post and looked at A BUNCH of different comments, and this is my thought: I think they should lift the ban on NSFW content outside of NSFW chats and leave it up to the individual servers to decide how they want to handle it. Discord gives server admins enough power to take care of it, so it has nothing to do with Discord, and Discord should not get in the way. And connecting lolicon to pedophiles is wrong; that is completely rude and inconsiderate. Loli is just a cartoon, something that is fake. On your second topic, yeah, they should lift the ban, because it's none of their business; Discord isn't a psychologist, they wouldn't know if someone was joking or not, so banning is just pointless. P.S. Write a better post; the first thing you typed is just the title of the post.
To be frank, Discord is used by people in private for private conversations, whether that privacy extends to a group of 10 friends or 2000 people interested in a singular topic. Unless someone is causing trouble on a **meta** level, where kicks and bans issued by the admin in charge isn't enough, it shouldn't be dealt with at all by the Discord people.
For stuff like self-harm, though, it's a bit mushy even if we're not talking about people doing obviously shitty things like telling others to "kys." I've seen people come into channels full of strangers and then just start threatening to kill themselves, or popping into a Twitch stream chat and demanding the streamer's immediate attention.
And while it'd be great if we were all mental health professionals and could handle that sort of situation and not take someone that's trolling seriously or dismiss someone that needs help as simply seeking attention, most of us aren't and we need a way to defer these sorts of folk to someone who can actually handle it.
There shouldn't be bans handed out, and Discord should definitely be trying to get people who need help that help, but I've seen this come up multiple times, and the mods just don't have a good way to handle it. It feels dangerous to ban some stranger who's making a big old scene in what's supposed to be a viewing party for esports; no one wants to end up causing someone to kill themselves, and trolls take advantage of that.
Iunno, maybe there's a more elegant solution, but as long as Discord isn't fucking with people who are coming out about this to friends on their own servers, and isn't banning someone because a lone asshole on one of those tight-knit servers decides to abuse reports, I think that's at least workable until someone comes up with a better way to handle it.
I don't think there should be a ban. As someone who has dealt with this issue in the past (don't worry, I got mental health support), I think users seen discussing it often should instead be sent notifications in a gentle, non-intrusive manner directing them to mental health resources. TL;DR: don't punish people for being depressed; rather, use your unique set of information to find the people who need help and get it to them!
I think the ban is worded very poorly. I don't think they actually want to prevent people from finding support or asking for help; I think they want to prevent people from telling others to go kill themselves or encouraging self-harm. That said, I'm not a fan of censorship, and I find the ban reactionary anyway.
All in all, I think the ban should be on encouraging self-harm.
Chief Awesome commented
Didn't know this was banned, and it's really pathetic that it is.
We're not children that need to be protected at all costs from the idea that people can be sad.
If someone wants to inform the discord group they created that they're suicidal, that's their prerogative.
I agree with this. It makes it harder for people to discuss their experiences of self-harm with other people. Who knows, being able to share those experiences and whatnot could actually help someone!
A message to discord:
Unless we're doing anything illegal or harmful, leave our private servers be. It's not like we're publicly posting things like gore for example.
As for public servers: Admins can ban users who break any rules for that particular server.
As for DMs: People can block and report users.
There is no need for you to babysit us. I really hope you don't decide to go a step further; next thing you know, you'll be banned for talking politics in your own server with friends.
I really have to wonder how you can make the slippery-slope argument that "lolicon and shotacon normalize pedophilia" but then ignore the reasoning for guro. That would imply you don't have any reasoning beyond "I don't like it," and even the pedophilia argument is extremely weak, seeing as places where "underage drawings" were introduced gave actual pedophiles an outlet instead of them going out and raping children.
While this specific website brings up actual child porn, its effects are clearly recorded as reducing the crime rate, and I think it's a safe deduction that drawings are just as capable of fulfilling that role. And to add: pedophilia is a mental illness; people don't just contract it because they look at vaguely anatomically similar drawings. Even if they did, they would still have morals.
On another note, it seems your post is mostly about the title; I think you should separate the two issues so people can get a clear understanding of what's actually being supported. I agree with everything except maybe the last line. Moderating edgy jokes would send Discord into a steep dive of over-moderation, and that can easily be resolved at the server level among the owners.
ѕмσℓ ∂σggσѕ (llama) commented
I helped a friend through a really tough time on Discord that I don't really want to discuss here, but I definitely feel like any ban similar to this should be lifted. It's definitely better to be able to talk about it than to ignore it.
Personally, I support the unbanning of cartoon underage NSFW content. Why? It's a cartoon and it's fake. It should not be banned, and this is an idiotic decision. This doesn't support pedophilia; what are you smoking?
Where does it say that talking about self-harm is against the rules?
"Sharing content glorifying or promoting, in general, self-harm or suicide. Don't post content that promotes or glorifies self-harm or suicide. This includes any content that encourages others to: cut or injure themselves; embrace anorexia, bulimia, or other eating disorders; or commit suicide rather than, for example, seeking counseling or treatment. Online communities have the power to lift people up from dark places - be the kind of person who lifts others."
They're saying that encouraging people to harm/kill themselves and sharing self-harm/suicide content will not be tolerated, and that you should encourage them to seek counseling or treatment and be the kind of person that lifts others. Where does it say that you're not allowed to talk about self-harm at all?