Hi. I’m Michael Salamone. I’m nobody important. Feel free to forget my name. I’m only introducing myself to be polite. I’ve been approached a lot in 2018 about writing a column again, and I’ve been reluctant because I feel there are many more important voices we should be paying attention to; specifically gay, lesbian, bi, queer, non-binary, transgender and transitioning people, women, migrants, homeless people, people of color, people with disabilities, poor people and indigenous peoples, in no particular order.
Mediocre white men like myself don’t deserve extra space in the space age. We should be listening more than speaking. However, I am human, and have thoughts, feelings and perspectives getting bottled up. Writing is good for me and my development as an okay human. I’ve been working on transitioning from a typical human to an okay human, with hopes of one day being a good human. I think writing more might help get me there, if practiced alongside a whole lot of other work.
I’ve decided to start blogging on Medium.com instead of other publishing outfits, including ones I help manage, because it’s monetized and because the platform is designed in such a way that, by using it, I felt I wouldn’t be taking space or attention away from anyone else. Also, I don’t want to be pigeonholed into only writing about media, politics or religion, which are generally what people invite me to speak or write about. I also just wanted a morning writing project to do over coffee that I wasn’t investing a whole lot of time in. My friends at Progressive Army have asked to run this series since it started, so they’ll have access to it too, a couple of days after it’s been behind the Medium paywall. Though PA might skip some of the non-political chapters. I’m cool with that. Hopefully you are too. If not, follow me on Twitter or MediaRevolt (@michaelsalamone) and you’ll catch anything you might have missed.
This series will look at any topic I feel like and propose solutions to make it better or fix what is broken. Sometimes it might be serious. Sometimes it might be silly. However, for lack of confidence, I suppose I’ll begin with one of those topics people generally ask me to speak on.
Salamone Wants to Help Fix Stuff: Social Media
MAGA bomber Cesar Sayoc threatened at least 50 people on Twitter before he mailed more than a dozen pipe bombs in hopes of becoming one of America’s deadliest domestic terrorists. He reportedly had a list of over 100 potential targets he hoped to bomb. Despite Sayoc clearly breaking Twitter’s Terms of Service, and being reported for doing so, his threats went unchecked on the platform.
I love Twitter. So much so that I probably should seek a 12-step meeting for it. I try to manage my screen time with the site like I’m self-parenting an inner toddler. However, over the last several years I have noticed that the policing of the site is in as much need of reform as policing in America. Like law enforcement, white nationalists, fascists and racists in general seem to get a lot of rope and a stop for a burger on the way to jail. Meanwhile, activists and journalists get beat down and locked away without review, even when the charges make no sense.
Still, Twitter seems somehow better than its peers at Facebook or Google-owned products like YouTube. Facebook just purged (even respected) independent media outlets, yet racist uncles everywhere still get to post lies about public figures like Michelle Obama wherever they like on the site. There’s no bigger trash fire on the planet than a YouTube comments section, and YouTube has been demonetizing activists and activist journalists for years, all the while making it harder for viewers seeking out that content to find those accounts on the platform.
While I have no data to back up this next claim, I fear that a huge draw of these social platforms is exactly this trash-talk and hate speech. The freedom to harass, amplified by how easily these platforms are manipulated through sock-puppet accounts and troll attacks, has turned social networking into social warfare and socially engineered a vocal minority of hate-filled humans into a seat at the table, from which they bully the adults in the room.
Trolling, and trolls rallying each other into group harassment, is basically as old as the internet itself, and like hackers, I believe there are white-hat trolls, black-hat trolls and grey-hat trolls. Black-hat trolls seek to alter facts, do harm to others and derail important conversations. White-hat trolls organize to tell truth against lies and to keep fake news from going unchecked. Grey-hats can go either way, and because of that, are fairly dangerous to interact with, in my view.
Sock-puppet farming is a relatively new venture, in which large numbers of fake accounts are used for social manipulation, whether to pad the friend/follower counts of a personality, to attack a personality in numbers or to inflate the apparent participation in a fringe movement. In 2016, both David Brock on behalf of Hillary Clinton and Steve Bannon on behalf of Donald Trump weaponized sock-puppet farming to new levels. And while the press pinned the worst of this behavior on Russian election interference, the practice is still going on unchecked among domestic political operatives in both parties. It is particularly still being practiced by far-right extremists such as those who influenced the MAGA bomber and the even-further-right Pittsburgh synagogue shooter, Robert Bowers. Both of those loony-tunes re-posted far-right conspiracy theories on their social media accounts that were born out of political operatives using sock-puppet farms to push those narratives. For example, the fear of a caravan of migrants attacking our borders, which any real journalism into the issue has debunked as fear-mongering, was something both domestic terrorists promoted.
Those who argue that all of the above flaws in social media are protected by free speech miss the basic fact that websites are not carte blanche free-speech zones. Any site with a Terms of Service page has the ability to limit speech in its little section of the internet. I would argue that to protect its users from violence, every website needs to take on that responsibility. Dig deep into your values, debate what speech belongs in your webspace and prioritize protecting your users from harm.
Terrorism and physical violence are not the only weapons extremists use online. We have endless discussions about cyberbullying hurting our children, but it does psychological harm to adults just as much. A responsible online community protects its members from psychological abuse as well as from threats of physical abuse.
Again, I love Twitter, but until I earned a blue check-mark, none of the death threats I received on the site were dealt with. Back then I was making public appearances both as a musician and an activist, and I was in a state of constant fear, especially after an MMA fighter (with a blue check-mark) threatened to find me at one of these events and smash my skull in. I used the website correctly. I reported tweets and tweeters threatening violence. It was only after I received a blue check of my own that I started getting emails confirming my reports were looked at and dealt with. Do I feel any safer? Only because I don’t gig as a musician and don’t announce my public activism so much anymore.
I founded, and am lead developer on, a social media project called MediaRevolt.org. The concept started from the need for a big-brother-free space, free of the tracking and marketing surveillance of users, where activists and artists could organize. The experiment is based on the community having a say in the management of the community, and funding the project itself instead of selling its data to advertisers or government agencies. It’s become a lot more than that, and though a year old now, it is still in development and severely underfunded; but one of the things I believe we’ve tackled in a much better way than the big social outfits is how we deal with all of the previously mentioned problems. I think every website could learn from us in these areas.
Firstly, every abuse report on MediaRevolt is dealt with by humans from start to finish. This is true for reported posts of any nature: harassment, fake news, breaking the Terms of Service in any fashion. We have volunteer ombudsmen who look into every issue and refer a ruling to site administrators, who then take action and report that action in our forums, minus any personally identifying information, for all site users to review and comment on. This might take longer than letting an algorithm take the lead, but the results make for consistent policy, engagement with site members about site policy, and a safer space overall.
We also took great care in drafting a Terms of Service that stands against both trolling and sock-puppets. That includes having members, during sign-up, agree to the following pledge: “I pledge not to troll, doxx, practice hate speech or harassment and to treat other users with kindness, dignity and respect.”
While such a policy makes MediaRevolt unappealing to some users, it makes others feel like it is a safer space. Besides, MediaRevolt isn’t trying to replace traditional social media for anyone. It’s meant to offer a safe space for people to network, organize, learn from and educate others. There are plenty of places for discourse on the internet. I would say ours is a place for solidarity. Still, I think we’re handling user safety and mutual respect in a way that many could learn from.
Having clear expectations for speech and behavior on a site is not just responsible but necessary in an age where social media influence is fueling real-life violence. Inviting site users to review the site’s choices in censoring posts or banning users is not only transparent but creates a clear vision of the site’s policies and enforcement. Instead of allowing sock-puppet farms to socially engineer opinions and behavior, use your site’s Terms of Service to socially engineer a community where people treat each other with respect, even during times of disagreement.
That’s how I’d fix social media, if given the opportunity to do so.