I called this stream “Irreconcilable Differences” because I wanted to talk about X, Elon Musk, and the broader issue of online Nazis and deplatforming. Everyone is sick of talking about Musk, but he’s clearly going through some kind of breakdown. His erratic behaviour, whether it’s caused by drugs, a manic episode, or something else, is not normal. As someone who’s dealt with addiction in the family, I understand the impulse to defend that kind of behaviour, but we don’t have to tolerate it, and we shouldn’t have to endure the abuse on X, a platform overrun with Nazis.
This ties into what’s happening with Substack. You can hold two views simultaneously: recognizing the Nazi problem on X and understanding the complexities of deplatforming. I used to defend Musk during his Twitter takeover, acknowledging the existing problems like media manipulation and the power of blue checks. However, the Nazi issue on X is undeniable and measurable. These are complex problems without easy solutions, but good processes can mitigate the downsides.
Substack is facing similar issues. The manipulation of platforms by bad actors, the abuse of power, and the corruption within these systems are well-documented. Musk’s claims about being targeted hold some truth, but they don’t negate the observable Nazi problem on X. The old Twitter team did their best, but a more sanitized, corporate-friendly platform came at a real cost.
Substack and other platforms share the same vulnerability: they can be manipulated. The Atlantic, for example, has a history of publishing articles that lump disparate groups together to build pressure for mass deplatforming. This happened to Reddit in 2020, when subreddits like r/gendercritical and r/The_Donald were banned in the same sweep. The same tactics are now being aimed at Substack.
The problem with deplatforming is the lack of clear criteria and due process. Algorithms and machine learning are not sophisticated enough to avoid false positives, which means innocent people get caught up in the net. The solution isn’t simple. It’s not just about banning Nazis; it’s about how you do it without sweeping up innocent users who, once banned, have no legal recourse.
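To make the false-positive point concrete, here is a minimal back-of-the-envelope sketch in Python. All of the numbers (post volume, violation rate, classifier error rates) are invented for illustration, not real platform data, but the base-rate arithmetic is the point: even a classifier that looks accurate on paper buries the real violations under innocent users.

```python
# Hypothetical numbers to illustrate the base-rate problem in
# automated moderation; none of these figures are real platform data.
daily_posts = 500_000_000     # posts scanned per day (assumed)
bad_rate = 0.001              # fraction of posts that actually violate policy
true_positive_rate = 0.95     # classifier catches 95% of real violations
false_positive_rate = 0.01    # classifier wrongly flags 1% of innocent posts

bad_posts = daily_posts * bad_rate
innocent_posts = daily_posts - bad_posts

caught = bad_posts * true_positive_rate
wrongly_flagged = innocent_posts * false_positive_rate

precision = caught / (caught + wrongly_flagged)
print(f"real violations flagged: {caught:,.0f}")
print(f"innocent posts flagged:  {wrongly_flagged:,.0f}")
print(f"precision of the flag:   {precision:.1%}")
# With these assumptions, ~4,995,000 innocent posts are flagged every
# day, and fewer than 1 in 10 flagged posts is an actual violation.
```

Under these assumptions, roughly nine out of every ten flagged posts belong to innocent users, which is exactly why criteria and due process matter before anyone gets banned.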
Substack’s stance on free speech is admirable but problematic. If they don’t handle this correctly, they risk becoming dependent on far-right money. The same goes for X: as normies leave, the platform’s value diminishes, creating a death spiral. Bluesky and Threads face the opposite problem, becoming havens for cancel culture.
The real issue is the suppression of dissent and the control of information. The Atlantic and other publications use their power to manipulate public perception and deplatform inconvenient voices. This isn’t about defending Nazis; it’s about ensuring a fair and transparent process. Without it, we’re all at the mercy of corporate whims and bad actors.
We need to think clearly about the steps involved in deplatforming. It’s easy to say “ban Nazis,” but how do you do it in a way that scales and doesn’t harm innocent people? The current systems are flawed, and the lack of legal recourse makes it worse. Platforms like Substack need to find a balance, but it’s a complex issue without easy answers.
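One pattern that tries to square scale with due process is tiered review: act automatically only at very high classifier confidence, route the ambiguous middle band to human reviewers, and make every removal appealable. The sketch below is a toy illustration under assumed thresholds, not any platform’s actual policy.

```python
from dataclasses import dataclass

# A toy tiered-review policy. The thresholds and action names are
# illustrative assumptions, not any real platform's published rules.
AUTO_REMOVE_THRESHOLD = 0.99   # act automatically only when very confident
HUMAN_REVIEW_THRESHOLD = 0.80  # ambiguous scores go to a human queue

@dataclass
class Decision:
    action: str       # "remove", "review", or "allow"
    appealable: bool  # due process built into the policy itself

def moderate(score: float) -> Decision:
    """Map a classifier confidence score to a moderation action."""
    if score >= AUTO_REMOVE_THRESHOLD:
        return Decision("remove", appealable=True)
    if score >= HUMAN_REVIEW_THRESHOLD:
        return Decision("review", appealable=True)
    return Decision("allow", appealable=False)

print(moderate(0.995))  # Decision(action='remove', appealable=True)
print(moderate(0.85))   # Decision(action='review', appealable=True)
print(moderate(0.10))   # Decision(action='allow', appealable=False)
```

The thresholds here are policy choices, not technical constants: where you set them decides how many innocent users get swept up versus how many actual Nazis slip through, which is why the process needs to be transparent rather than left to corporate whim.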
In conclusion, the problems with X, Substack, and other platforms are deep-rooted and complex. They require thoughtful solutions that consider the broader implications. It’s not just about banning bad actors; it’s about creating a fair and transparent process that protects innocent users while addressing the real issues. Until then, we remain stuck in a cycle of manipulation and suppression.