That link misses the point, as it openly says that it's not concerned with the ethics involved in silencing people, which is what we're talking about. Even then, while limiting what people are allowed to say might slow misinformation, it doesn't work to limit or control access to ideas, which always find another vector through which to spread.
No, you're trying to ignore that deplatforming works, and the information goes directly against the idea that they will "always find another vector". You talk about the ethics of deplatforming, but there are plenty of ethical conundrums about platforming bad actors. I prefer to keep my platform relatively safe and welcoming, which means more moderation so that the most vulnerable feel welcome. You may differ, and that's your choice. But the ethical questions cut both ways, and I feel pretty comfortable about where I stand on it.
You say that "we" know "that stuff" is wrong, but this gives the impression that people just somehow come into that knowledge and never question it. In fact, asking questions is a good thing, because even if we've already come to a conclusion, there are always going to be people who haven't. Whether it's young people who are naturally prone to doing what their authority figures don't want them to, disaffected people taking a second look at something they never considered before, or simply people who are tired of being told what the "right" answers are, you can't assume that any debate is ever truly settled.
What do you think we're talking about? Because I'm talking about bigotry and such. When you're talking about engaging Nazis, I don't see a reason to platform their ideas and views. It's counterproductive, and giving them big outlets to espouse those views is generally way more harmful because you give them a bullhorn to do so. Again, imagine if every time we had to talk about racial justice we had to debate a Nazi on-air. That's not constructive; that's actively destructive to the discussion, because we are ceding easily drawn boundaries as to what is acceptable and what isn't.
There are always going to be people looking for answers, and then searching for the "why" behind those answers. You can't treat odious ideas like a disease, where you can simply quarantine them away from everyone else. All it does is give those ideas an air of mystique and create an atmosphere that entices people who aren't satisfied with their current life. Far better to provide answers than to insist that there's no need to go looking.
I mean, you can. We don't need to have a debate about why slurs are unacceptable every time someone decides to use them, just as we don't need to engage them in a good-faith discussion to convince them they are wrong. In fact, we know this doesn't work, which is why we have something called the backfire effect.
On the contrary, it does work as a discussion tactic. It just doesn't get immediate results; changing hearts and minds isn't something you can fit on a bumper sticker (or a tweet, for that matter). Being afraid that this increases the reach of people with odious ideas overlooks the fact that it also increases the reach of people with virtuous ideas, and likewise underestimates people's ability to understand and accept that virtue. If we take it to be true that the arguments for tolerance, acceptance, diversity, etc. are stronger than the arguments against them, then there shouldn't be an issue with proclaiming those reasons far and wide, because we know they'll win out against ideas to the contrary (which has been the course of human history).
Again, this is not a "discussion" tactic. It's a "deprogramming" tactic. These are very different and require very different approaches. The latter is not a workable approach to content and platform moderation, and really needs to come from outside those discussions first.
Also, it doesn't increase the reach of odious people; we have dozens of cases showing otherwise: removing people from platforms decreases their reach almost every time. It's only when you replatform them that they can regain their reach. Alex Jones and the other examples in the post I linked go directly against what you say, and you can't provide any evidence to counter them.
In other words, engagement has a proven track record of working. Deplatforming... not so much. You can take down individual people, sure, but it has a strong failure rate at suppressing the ideas they spout.
I mean, that's not what "engagement" is. Again, deprogramming is very different from "engagement" and "discussion". Deplatforming works, and I've provided data showing that it cuts down on toxicity.
I can continue to show data in this regard, but at this point I think you're not really engaging with the premise.
On the contrary, giving them a platform where you engage with them almost always works. It's just that it doesn't work immediately, and quite often it works more on the people watching than on the one(s) you're debating. The difference between "deprogramming" and "engagement" that you're drawing there has no real substance to it, as both involve putting them in contact with a wider group of people and letting them see that the ideas they've latched onto don't work. The podcast I linked to makes it clear that Daryl Davis did engage in discussion rather than deprogramming; he openly says as much, since he was talking to people in bars, in their cars, etc. Deplatforming simply pushes them back into their enclaves, isolating them further and allowing odious beliefs to be reinforced.
It does. It really, really does. Like right now we are in a discussion, and I'm engaging you. I'm not trying to deprogram you. That is what you think engagement is, but it is not: deprogramming someone is not just talking about a subject, but a long process of pulling someone away from the edge. Your view of how people change their minds does not really reflect what we know about how people react to their beliefs being challenged, and it even ignores the article you posted yourself, where the person did not debate them on topics immediately but found inroads to form a relationship and bring down their belief system. That is not something you do discussing a topic on a messageboard; in fact, the lack of personal investment makes it almost completely alien in that regard.
Bad ideas aren't diseases which can be pushed to the fringes where they'll fade away. They need to be showcased for why they're bad, and for what "bad" means. As Abraham Lincoln said, you destroy your enemies when you make them your friends.
Sure, but Lincoln didn't win the war with just words, and similarly Lincoln did not compromise on the 13th Amendment. Hell, Lincoln suspended habeas corpus in some areas. The ideal and the reality are very different things.
And we also don't need to brook bad ideas in our discussions. We don't need to debate Nazis, racists, xenophobes, climate denialists, etc., every time they show their faces. It would grind every discussion we have to a halt. Having a common, established ground for discourse allows us to have fruitful discussions instead of endlessly litigating meaningless tangents.
Right, I'm just going to say it: whatever happens in an RPG is in no way comparable to the Nazis, the KKK, Communists, Calvinists, or anything like that. Being upset about slavery in a GAME is just looking to be offended. Please stop being so offended about fiction you yourself can ignore.
I dunno, I think people who are offended that people are offended about something tend to be the people looking to get angry at something.
It's fine to get angry that something is in a game, especially when it's a touchy real-world subject. No one would bat an eye if people didn't like sexual assault in an RPG. That the same applies to things that carry cultural weight for people (particularly minorities) shouldn't really be surprising.