Wrong for the Right Reasons, and Right for the Wrong Reasons

  1. I've padlocked my X account to step back from the platform, clean up my followers, and avoid engaging with far-right rhetoric and bots.
  2. My disillusionment with X under Elon Musk stems from its failure to resolve issues and its transformation into an unpleasant space.
  3. I've reassessed my views on large language models and AI, recognizing the ethical concerns and over-promising in the technology.
  4. We need stringent regulations in UX and tech, similar to those in psychology, to ensure ethical practices and accountability.
  5. I'm shifting to LinkedIn and other platforms for meaningful discussions and focusing on projects addressing tech regulation and real-world problems.

I’ve decided to padlock my account on X, formerly known as Twitter, to step back from the platform and focus on other avenues. This decision stems from my realisation that I was wrong about many aspects of the platform and its management under Elon Musk. I had hoped that the issues with the original Twitter would be resolved, but the reality has been disappointing. The platform has become a hub for far-right accounts, bots, and mass reporting, making it an unpleasant space.

As someone deeply involved in technology and analysis, I understand that being wrong is part of the process. What frustrates me is seeing people outside the field gloat about being right without having done any of the analysis. They got lucky; their correctness doesn’t come from understanding or expertise.

My primary reason for locking my account is to clean up my followers, purge bots, and curate my feed. I want to salvage my account in case the platform improves in the future. The current state of X is such that I no longer want to engage with the fascist rhetoric and fake accounts that populate it.

I’ve also been wrong about other things, such as the potential benefits of large language models (LLMs) and AI. Initially I saw promise; now I see the harms and the over-promising endemic to the technology. The lack of regulation and the ethical concerns surrounding AI are alarming, and we need better oversight and accountability in this space.

In terms of user experience (UX) and technology, there should be more stringent regulations. For instance, if psychologists need to be registered and go through ethical reviews, why shouldn’t UX researchers and tech professionals who manipulate user behaviour? There should be a regulatory framework to ensure ethical practices in technology.

The broader issue is that people often oversimplify complex problems. They think in binary terms and fail to see the multiple avenues available to address issues. For example, regulating unethical nudges in tech isn’t just about policing; it involves setting legal standards, providing oversight, and creating avenues for complaints and corrections.

Moreover, the justice system can handle many of these issues. Disinformation campaigns, mass reporting, and harassment can be addressed through fines and criminal charges; making these actions explicitly illegal and enforcing penalties would deter bad actors.

The current state of X and the broader tech landscape highlights the need for better policies and regulations. We must ensure that technology serves the public good and that those who misuse it face consequences. This isn’t about being authoritarian; it’s about creating a fair and just society where people are held accountable for their actions.

As I pivot away from X, I’m focusing more on LinkedIn and other platforms where meaningful discussions can happen. I’m also working on various projects and writing about the critical aspects of technology and its regulation. It’s time to move beyond the culture wars and focus on solving real problems in the tech world.
