The recent announcement by X (formerly known as Twitter) that blocked accounts will soon be able to view public posts—albeit without the ability to engage—has stirred up an important debate. For many, the block function on social media has long been a necessary tool for safeguarding privacy, ensuring peace of mind, and creating boundaries in an online world that can sometimes feel unmanageable. So, why change it now, and what does this shift really mean for user safety, trust, and the future of digital interactions?
"High time this happened. The block function will block that account from engaging with, but not block seeing, public post."
— Elon Musk (@elonmusk) September 23, 2024
Understanding the Changes
Elon Musk confirmed this upcoming modification, explaining that while blocked accounts will still not be able to interact with users, they will soon regain the ability to view the content of public posts. This means that where previously someone you blocked would see a “You’re blocked” message, they will now be able to read your public updates like any other user.
At face value, this seems like a minor adjustment—after all, blocked users could always simply log out and view a public account’s tweets anonymously. However, the significance of this change goes deeper. It reflects a shift in how X manages user boundaries and personal control in the ever-evolving landscape of social media.
Pros of the New System
1. A More Open Public Space
One of the core principles Musk often espouses is the idea of free speech and open communication. In that light, these changes could be viewed as an effort to ensure that public discourse on X remains accessible to all. If a user posts publicly, that information should, in theory, be available to anyone, including those they may have blocked. For instance, users engaging in public debates or business-related interactions may appreciate that their content remains visible, even to those they’ve blocked for personal reasons.
For creators, public figures, or businesses, this update could be beneficial. These accounts often block malicious users or bots while still hoping to reach the broadest possible audience. In that respect, allowing blocked accounts to view content without engagement ensures that those seeking to gather information still can, while preserving the poster’s control over direct interaction.
2. Eliminating the Workaround
The truth is, blocking someone on a public platform has always been limited in effectiveness. Blocked users could always log out and view your tweets, so the block function never really equated to invisibility. What X is doing here is making that process explicit. By removing the illusion of a “full” block, X could be encouraging users to view public posts as truly public. In this sense, the change promotes transparency in how the platform functions.
The Cons and the Safety Concerns
1. Weakening Boundaries for Victims of Harassment
While the intent may be transparency, the reality for many users is that blocking is about more than just engagement; it is often about feeling secure. The ability to block a stalker, an abusive ex-partner, or a persistent troll and prevent them from seeing your posts was, for many, a vital part of establishing digital boundaries. This update, by letting blocked users view public content, risks softening those boundaries, which could leave some users feeling vulnerable.
For individuals experiencing online harassment, this change could blur the line between safety and exposure. Though the platform will still prevent blocked accounts from engaging—replying, retweeting, or quoting—simply knowing that a harasser can see your posts may reduce the feeling of security.
2. Trust Erosion
Social media platforms build their user base on trust. Users trust that they can shape their online experiences in a way that aligns with their comfort and safety levels. The block function has long been part of that contract. Weakening it, especially without substantial dialogue with the community, risks eroding that trust. While the change may seem pragmatic from a platform governance perspective, it underestimates the emotional and psychological weight that users place on the tools available to them for self-protection.
Safety vs. Openness: The Balancing Act
The shift reflects a wider tension in digital culture between creating open platforms and respecting personal boundaries. It raises an essential question for social media: can we have both? Is it possible to maintain a platform that encourages open exchange while also protecting vulnerable users from those they have deliberately sought to avoid?
From a business perspective, X’s move could be seen as an attempt to increase transparency in how public information is treated. Musk has pointed out that the block function will still restrict direct engagement, so users will remain insulated from harassment in the form of unwanted replies or retweets. Yet, the ability of blocked accounts to continue viewing posts may still make users—particularly women, LGBTQ+ individuals, or activists—feel more exposed. This is especially true for those dealing with stalkers, who may not need to engage to inflict emotional damage.
3. The Precedent of Reversal
It’s worth noting that this isn’t the first time such a change has been considered. In 2013, Twitter introduced a similar update, allowing blocked users to view and even engage with public posts. The backlash was swift, and Twitter reversed the change due to concerns from users about their safety and the erosion of the control they had come to expect. Musk’s changes don’t go quite that far, but the echoes are there. Will X experience the same backlash, or have times changed?
The Path Forward
The digital world is fluid, and policies that govern interactions on social media platforms must balance openness with user safety. As someone who advocates for mental health and the importance of safe digital spaces, I see this change as part of a broader dialogue about the future of social media. How do platforms like X evolve while maintaining a sense of security for all users, especially those most at risk of online abuse?
X’s decision reflects an ongoing effort to streamline the platform’s approach to public discourse. Still, it’s essential that in pursuit of openness, we don’t lose sight of the emotional and psychological needs of users who rely on the block feature to feel safe online.
There is no easy answer. But one thing is certain: as social media platforms continue to redefine their rules of engagement, they must tread carefully, keeping user trust and safety at the core of their decisions. At the end of the day, it’s not just about tweets—it’s about people.
Follow Scott Dylan for more insights on X’s block feature changes, which raise concerns about user safety and trust. Scott Dylan explores the pros and cons of this controversial decision.