I think deplatforming is exactly what we need here: neo-Nazis and white supremacists are reliant on easy-to-use, discourse-amplifying platforms.
The more you make it difficult for them to gather virtually, the more you’re breaking the spell that keeps them together.
Before social media, a physical neo-Nazi or white-supremacist chapter had A LOT of work to do to radicalize people, and was very easy to police and keep under control. With online organizational tools, radicalization became easier and harder to patrol.
Do you think a random Arkansas soccer mom would ever have joined anything as crazy as the QAnon conspiracy if she had to attend a QAnon chapter in person instead of participating in an online forum while doing the laundry?
There's a contradiction in your argument. On one hand, you claim these platforms "amplify" hate speech, and on the other hand, you want to push them to other platforms.
...but it is exactly that isolation that creates the bubble-induced amplification of hate speech.
If everyone were forced to exist on the same platform (as a thought experiment), then hate speech would have to coexist with rational thinkers, and those hateful ideas might not propagate much.
...but the moment you ban/deplatform/censor/etc... then they begin to form isolated groups with no counter-balance of rationality.
Have you been discussing this online with people that fervently believe the election was fraudulent? It is eerily similar to talking with someone in a cult. Any 'evidence' you present to them will be taken and 'debunked' by their fellow cult members.
There is already no counterbalance.
Breaking up their platforms into smaller pieces is actually helpful and will keep them from readily finding new members on the internet.
I think this is something that largely gets ignored in all of these discussions about deplatforming, free speech, etc. There is a baked-in assumption that all people are rational, will respond to well-reasoned arguments and evidence, and will eventually arrive at a reasonable, evidence-based conclusion if you just bear with them and try harder. But my experience so far has not shown this to be the case.
I’ve tried to have a calm and reasonable discourse with people who believe in things like QAnon, anti-vaccination theories, the “deep state”, GMO conspiracies, among others, and I’m a single voice among however many hundreds or thousands of others they are listening to. No amount of evidence or rational argument will sway these people because they are swimming in a sea of voices reinforcing their views.
I’m not sure if deplatforming is the right solution here but I also don’t know what the alternatives are. Should we let them keep espousing and spreading patently false information that is actually causing harm to society to a large audience and hope that people just ignore it? That doesn’t seem like a good solution, either.
To be honest, rational argument and evidence doesn't exactly sway people of the opposite belief because they, too, are swimming in a sea of like voices.
Every time big tech bans or suspends a user, another supporter of Trump is born.
Violent thinking exists whether or not we "accept it". By putting all the violent thinking people on their own isolated platform, we are facilitating the escalation of their thinking into coordinating actions.
The best school for becoming a thief is jail. Concentrating violently minded people together will accelerate the self-radicalization that occurs. How do you avoid this?
I think that's a US thing. Other countries seem to be genuinely more rehabilitative. Looks like 55% of the state prison population is there for violent offenses: https://www.prisonpolicy.org/reports/pie2020.html
Handling non-violent offenders away from the violent ones might be a good start.
I completely agree with this. You will find a ton of stories about nonviolent people who had to join violent groups in prison just to stay alive and avoid being abused incessantly.
I hear what you're saying, and I don't disagree that both "sides" are susceptible to groupthink and a mob mentality. I consider myself fairly moderate, which is why I include examples of extreme views from the left (anti-vaccination theories, GMO conspiracies, 9/11 "truthers") in my analysis as well. I'm not a fan of misinformation no matter what the source is, and I do my best to try to "lead" people away from it.
That being said, what do we do in this situation (which is a common one I see in these kinds of debates)?:
- Group A claims X.
- Person N asks Group A to provide evidence for their claim X.
- Group A cannot provide said evidence, or the evidence provided is unreliable/cannot be validated.
- Person N asks for additional evidence that the claim X made by Group A is true.
- Group A insists they are right and refuses to provide any additional evidence for claim X.
- Person N chooses to not believe Group A’s claim X due to a lack of evidence.
What’s the next step? Assuming good faith on the sides of both parties, how do you reach some sort of agreement or conclusion?
And what do we do when claim X made by Group A is one that leads to violence or other societal harms (such as an increase in death and suffering due to the spread of disease in the case of anti-vaccination theories)?
I don’t have a good answer, here. I desperately want to have rational and productive conversations with people with whom I disagree. But how do you avoid a deadlock situation like the above?
Agreement and conclusion should be reached by applying procedures to which all sides* agree: be it a coin toss, consulting an oracle, or voting. The opposing side won't magically disappear when your side achieves power over the media. What will likely happen instead is that they will gain a more compelling motivation to wrest that power from your hands by any means possible.
*Btw, usually there are more than two sides: it's just the stale majoritarian system in the US that leaves people with only two unsatisfactory choices.
The example describes the inability to reach a conclusion in debates, but it explicitly assumes good faith; you probably skipped that final part. And in real life as well, not a lot of people would be inclined to choose a civil war over a political/cultural one.
Given how gullible the general public is, I think this will be a great way to stop the flow of garbage into their minds. Sure, the zealots and mentally unstable will find other means to communicate with each other, but deplatforming Trump right now is critical to preventing a repeat of the Capitol attack. He needs to be stalled until he has to show up in New York state for tax/real-estate fraud charges. Then he will be more focused on that than on trying to stir up more mayhem.
You are assuming that they will not ultimately form censorship-resistant platforms that will be far more dangerous than having their conversations visible to the majority.
Not only is that game of whack-a-mole unsustainable, it will lead to the creation of un-silenceable platforms. ...and if those are scalable, they might become the new standard.
It will be pretty sad if social media platforms are ultimately displaced by the platforms created due to their censorship.
And it's a fair assumption. If they do, those platforms won't have state-like power and leverage like Facebook or Twitter, and will be much easier to patrol. The real solution would be to break up Facebook and regulate the gigantic internet leviathans so they don't become more powerful than nation-states. While we work on that, this might be the best solution we have to stop easy radicalization.
Modern online communities fracture into as many sub-groups as deemed necessary by both algorithms and user preference. Moving everyone to one platform won't limit the spread of violent hate speech, or keep organized sub-groups from coordinating violent activity.
From a platform perspective, there isn't much difference between companies curtailing the use of their platforms to carry out violent activity and those platforms curtailing other illegal activities such as spamming or malware distribution.
We don't consider sending 1 million unsolicited emails free speech, and we wouldn't consider a gang using Reddit to coordinate crimes free speech. Is there any difference for groups coordinating the violent overthrow of an elected democracy?
> the moment you ban/deplatform/censor/etc... then they begin to form isolated groups with no counter-balance of rationality.
(1) Rationality is not a counterbalance to implacable, violent division when there is a fundamental conflict of values. Rationality is just the ruthlessly optimized maximization of one's own utility function.
(2) For those things where rationality in principle could be a counterbalance, the problem is that people (all of them) are not rational (rationality is an abstract, unattainable ideal), and the ways in which certain people are irrational can undermine whatever partial rationality the rest of the group has.
When a group tries to exclude certain ideas from general conversation, it tends to be ideas whose expression is seen as indicating one or both of (a) an irreconcilable conflict with values that the rest of the group sees as table stakes, and/or (b) an incorrigible, toxic-to-the-group irrationality in assessing reality and pursuing values, whether or not those values are compatible with the group's.
So, in either case, the decision is not made in ignorance of the fact that isolation removes, in the abstract, rationality as a counterbalance in the conflict between the core group and the excised group. It is made instead out of the belief that rationality does not function as such a counterbalance with the concrete group targeted for excision, and indeed that the presence of that excised group undermines the function rationality would otherwise serve as a counterbalance to conflicts within the core group.
> ...but the moment you ban/deplatform/censor/etc... then they begin to form isolated groups with no counter-balance of rationality.
The first such group that comes to mind is Heaven's Gate, which I think illustrates the opposite:
A brief reading of Wikipedia suggests to me that key members arrived in place via media attention, even though they spent significant other effort trying to recruit members.
I see your point, but unfortunately this is really not how platforms work these days... Even if people are on the "same platform", they get trapped in their own echo chambers, because that is how the platforms make the most money.
This argument only makes sense if you act like history began in the 20th century. Racism and neo-Nazism were clearly far more widespread long before the mainstream internet. And yes, I do think that soccer mom would go to an in-person event if she saw some politician who appealed to her fears and biases; we've seen all that before. It doesn't mean she'll storm the US Capitol building, but it does mean she can exist without Facebook.
Deplatforming is isolation, and in a sense this already happened on Reddit, because they could isolate their own topic (e.g. TheDonald) from other opinions (e.g. Snowflake).
I'm not sure where the line sits, but the ones radicalizing right now don't get a counter-opinion.