Comics
This is a community for everything comics related! A place for all comics fans.
Rules:
1- Do not violate lemmy.ml site-wide rules
2- Be civil.
3- If you are going to post NSFW content that doesn't violate the lemmy.ml site-wide rules, please mark it as NSFW and add a content warning (CW). This includes content that shows the killing of people and/or animals, gore, content that talks about or shows suicide, content that talks about sexual assault, etc. Please use your best judgement. We want to keep this space safe for all our comic lovers.
4- No Zionism or Hasbara apologia of any kind. We stand with Palestine 🇵🇸 . Zionists will be banned on sight.
5- The moderation team reserves the right to remove any posts or comments that it deems necessary for the well-being and safety of the members of this community, and the same goes for temporarily or permanently banning any user.
Guidelines:
- If possible, give us your sources.
- If possible, credit the creator of each comic in the title or body of your post. If you are the creator, please credit yourself. A simple “- Me” would suffice.
- In general terms, include in the body of your post as much information as possible (dates, creators, editors, links).
- If you found the image on the web, it is encouraged to put the direct link to the image in the ‘Link’ field when creating a post, instead of uploading the image to Lemmy. Direct links usually end in .jpg, .png, etc.
- One post per topic.
i've given several examples where that isn't as clear-cut, but whatever. speech is a behavior, and can modulate how we act. if you tell people that a group of people is evil, and never say what to do about it, you still increase the likelihood that somebody will act on the belief that that group is evil. speech has material consequences that fall somewhere between causing violence and not causing it.
the barrier of lawfulness, violence, and all that are socially defined, yes, but if you concede that much, then there will be communities that define racism, bigotry, and other forms of inflammatory speech as violent, and decide that those things ought not to be in their social spaces. unless you're appealing to the group consensus of the largest possible group, there will be subcultures that disagree with each other on what does and doesn't constitute violent speech. if you're appealing to the legality of speech, you aren't appealing to group consensus, you're appealing to the government. so either we as autonomous communities ought to draw our own lines for what is and isn't violent speech (what i believe), or there is a precise legal definition we have to adhere to, given to us by the government. in reality, it's both. there are firm lines of conduct that the government prohibits in theory (though i would dispute their efficacy), and there are communities that disagree on what the limit should be. i don't think that having codes of conduct in this way is necessarily authoritarian.
to be clear, i am here talking to you because i prefer the model that federated services use for moderating their communities, and believe that having tech companies be the sole arbiter of what is and isn't proper speech is a fundamentally flawed approach. that being said, the problem i have with your solution is one that's shared with a lot of community moderation on platforms. it relies on people being willing and able to confront and defuse bigotry on an individual level. i'm jewish. i don't want to hear what Redneck Russell has to say. i doubt that i could say anything to him to change his mind, and i don't want my internet experience to be saturated in Russells, for the basic reason that i want my time online to be relatively relaxing. people who are less attached to jewish identity are even less likely to engage with him, because it doesn't affect them personally, internet arguments are often unpleasant, and they also want their time online to be relatively relaxing. so how do things pan out if a community is only loosely engaged? well, if we aren't relying on moderators to curate our platforms, the hate motivated Russells of the world are empowered to say their bullshit, they receive relatively little resistance, and the relative permissiveness attracts more Russells. the people who want a nice place to hang out online go elsewhere, the concentration of Russells rises, and we're left with a platform that is actively hostile towards jewish people. oops!
if you are part of a focused, highly engaged community, maybe your solution works, but most online spaces are not focused and highly engaged. i agree generally that echo chambers are problematic, but i think on the whole that federation does more to mitigate that than large, algorithmically segregated platforms. i don't really agree that banning or blocking don't or won't play a role in ensuring that social spaces are friendly and enjoyable to be in, especially for marginalized groups of people. if you let people say the n word on your platform, and don't do anything about the people who do, don't expect many people of color to want to be where you are. it's just not fun to hang out with bigots if you're the one they're targeting, and that will affect the culture of your platform.
i think it really isn't so simple. some people are more invested in a community than others, lots of people are just... not interested in auditing their moderators. generally i think it's a good idea to have moderation be transparent, certainly better than what any major social media platform does, but at a certain point it does just come down to trust. for example, i agree broadly with the code of conduct for Beehaw, that's why i have an account there. i'm generally uninterested in trying to verbally spar with bigots, i don't want to engage deeply with the moderation of the platform, i have no interest in litigating what is and isn't proper conduct on the site, that's not what i use the internet for. lots of people who are the target of bigotry and hatred just... don't really want to constantly be on guard for that shit. they want a space where they can exist without being confronted with cruelty. i wouldn't want to be on the kind of platform you're describing, sorry.