If there’s one thing I’ve learned about the internet, it’s that our comments create permission structures for other commenters to say the same things.
If we’re vulnerable, other people can be vulnerable. If we’re overcritical, other people can become overcritical. If we’re nuanced, other people can find nuance.
And when one person casually misinterprets the meaning or message of a post, it can lead to multiple people reading the comments and misunderstanding the post the same way.
We are undeniably social creatures and we take cues from the crowd just as much as we do from the person on stage.
That’s why I appreciate the block & hide comment feature on Bluesky & Substack. Because the earliest commenters often set the tone for future commenters. If it goes south in the beginning, it likely won’t turn around.
Every time I read a comment that doesn’t jibe with the intended message, it’s not that I’m bothered that someone disagrees with me or misunderstands me. And to be frank, if it were only one random comment, I’d let it slide.
But I’ve learned over the years that bad commenters are invited by bad commenters and the tone that they set almost always multiplies.
Once a single person has been led astray, it derails many future commenters. And it disrupts the pathway I was trying to set, leading people on tangents that have nothing to do with the original post.
MOST arguments are deflections and derailments that distract from the original post, keeping people spinning away from the original message. And it’s painful to watch new commenters come in and get swept up in this unhealthy spiral.
But it’s also a tactic that trolls use, reliably, precisely because it has that power. They can sweep people up in tangents very quickly and pull them far away from the power and impact of the original message.
When forums like X don’t allow the original posters to hide or delete comments, it takes all of 3 seconds for bots to hijack any post you make––and force you to become their carrier.
In effect, all of the positive messengers become carriers for the worst viruses. And there’s nothing they can do about it.
That’s why, throughout the election, all of the legitimate and hardworking independent journalists had thousands of bot comments attached to each and every one of their political information posts on X. Which meant ALL good information came attached with the infection.
When the “good guys” send around the infected “email chain” and you’re foolish enough to open it because it’s from someone you trust… you’re screwed.
And it’s only after the election that we can now look at this kind of simple manipulation and see how it was weaponized on a grand scale to derail healthy information and keep Americans spiraling into wild, conspiracy tangents––even on the threads of the most legitimate and thoughtful journalists among us.
As I’ve organized both small teams of 5 people and large projects with thousands of people, I’ve found that the only way to protect healthy conversations is to actually nurture them.
We can’t arbitrarily tell people how to feel––but we can create a space for people who already feel that way to express themselves.
Our platforms build spaces for other people. And if we foster anger, then we build spaces for people to be angry.
If we foster judgment, then we build spaces for people to criticize and blame others.
And if we’re vulnerable, we build spaces for people to open up.
All three of those are valid responses after a devastating election result. I mean, we’re fucked and we need to openly be angry, criticize others, and express our vulnerabilities.
But a space tailored for one may not so easily hold the others. Because once a certain type of mentality takes hold, it becomes harder and harder for other people to go against the grain.
Throughout 2024, I mostly withdrew from online spaces because I wasn’t comfortable with the overlords who had either forbidden me from expressing the things I wanted to say (Zuck throttling keywords & links)… or fostered an environment where you could only say negative things (Musk engineering & boosting radical content).
I’m not sure yet what our new spaces look like in 2025. But I’m not going back to the way things were.
I’m not going back to the centralization that allowed Zuckerberg to forbid us from accessing healthy political information, or to Musk allowing Russia to attach viruses and infections to every healthy informational post we read.
On Substack and Bluesky, it seems that we’ve been given a fresh opportunity to take responsibility for the platforms we create. To practice good faith Collective Moderation. When we create posts, it’s our job to moderate the comments and make sure bad actors haven’t hijacked it to spread disinformation.
It’s also, perhaps, our job, when we post in other people’s spaces, to ask ourselves whether the tone we contribute in our comments is in line with the original message. Because remember, sometimes your comment isn’t untrue. It’s just distracting and deflecting from something important that someone else had to say.
If this post made sense to you, please join me on Substack and help me figure out what kind of organic space we can create to get us through these chaotic next four years.
On Substack, we have a private chat for subscribers to build community.
And on Bluesky, I will do my best to moderate public comments and make sure my corner of the platform stays healthy for the next four years.
Thanks for reading!
Resist Rebel Revolt is a community as well as a newsletter. Subscribe to be part of the chatroom and join a small but growing hub of like-minded people!
All Rights Reserved © 2024-2025 Elephant Grass Press, LLC
You can always unsubscribe in a single click. Thanks again, and please share with friends if you’re enjoying your subscription!