As background: "Substack faces criticism for giving Nazis a platform", Dominick Mastrangelo, The Hill, December 15, 2023
Here's Substack's decision yesterday. In essence:
"We don’t like Nazis," but some people are Nazis. If we stop paying them, they will become more Nazi. Therefore, we will continue to pay Nazis for Nazi words, so that they'll merely hold steady in their Nazi views or maybe even stop, precisely because we allowed them to continue publishing and paid them for it.
The part beginning "I just want to make it clear" is "the single worst paragraph ever written by a tech founder," Ryan Broderick said on January 10.
Justin Hendrix wrote:
"McKenzie’s argument is a straw man. No one thinks banishing Nazis from Substack will make fascism 'go away.' But some do think that it is ethically wrong for Substack to do business with Nazis, and for it to distribute and promote Nazi content.
* * *
'Substack's position on Nazis only makes sense if you ignore the last eight years of world events and tech policy debates,' Melissa Ryan, the CEO of Card Strategies, an expert on far-right extremism, and one of the protest letter's signatories told me. 'We know from experience that these choices have dangerous, sometimes deadly consequences.'" (Substack Cofounder Defends Commercial Relationships with Nazis, Justin Hendrix, Tech Policy, Dec 21, 2023)
Substack's position went on (ever more bizarrely) to argue that Nazis are best understood as aligned with the powerless among us, who most need help making their voices heard.
Yes, that's contradictory.
On the one hand (per Substack), Nazis are sort of holding us hostage, because they'll increase their hate and violence if we stop paying them.
On the other hand (per Substack), the Nazis are powerless and need to be helped.
If you aren't really up for this, then follow Marisa Kabas on Bluesky. She led the push for Substack to answer this question. She hints (unsurprisingly) that she may be moving her publication, the Handbasket, elsewhere.
Substack's position on this "has been common knowledge since trans people protested Substack in 2021," says Jude Ellison Doyle, and
"the question of what trans people were supposed to do about Substack — leave and lose the income, stay and lose the moral high ground, take the money and pretend Substack wasn’t transphobic, take the money and “make change from inside” (though the people who promised to do this all wound up leaving) — was bitterly fought. That fight created some fairly deep enmities. Disagreements about Substack account for maybe 90% of the conflicts I’ve had with other trans people in my time."
In the second week of January, Substack talked to Platformer and explained a new strategy (read this on Platformer).
As CNN explained: "After Casey Newton, founder of Substack tech news publication Platformer, flagged a list of publications violating content guidelines to the company, Substack says it is removing five. None of the nixed newsletters have paid subscribers and, in total, account for about 100 active readers," and "a November article in The Atlantic pointed out at least 16 different newsletters with Nazi symbols, as well as many more supporting far-right extremism."
Later, Casey Newton published: "Why Platformer is leaving Substack. We’ve seen this movie before — and we won’t stick around to watch it play out." Jan 11, 2024. Newton explained: "I’m not aware of any major US consumer internet platform that does not explicitly ban praise for Nazi hate speech, much less one that welcomes them to set up shop and start selling subscriptions." Only Substack. Newton and some colleagues identified seven Substack publications that "conveyed explicit support for 1930s German Nazis and called for violence against Jews, among other groups. Substack removed one before we sent it to them. The others we sent to the company in a spirit of inquiry: will you remove these clear-cut examples of pro-Nazi speech?" Substack leaked the info to another outlet on Substack, "along with the information that these publications collectively had few subscribers and were not making money. (It later apologized to me for doing this.)"
Newton believes Substack did this to make the discourse about Nazis on Substack "appear to be laughably small," when in fact the journalists had not attempted "a comprehensive review of hate speech on the platform" and had only sent several examples.
Thomas Fuchs on Bluesky: The fact that "Substack owners deliberately shared confidential private discussions they had with Platformer with authors they liked" shows that they're untrustworthy.
"Substack did the basic thing we asked it to," Newton says, by removing five of the six publications. But Newton has "larger concerns" that Substack hasn't addressed. Substack's "defense boils down to the fact that nothing that bad has happened yet. But we have seen this movie before, from Alex Jones to anti-vaxxers to QAnon, and will not remain to watch it play out again."
Regarding his withdrawal, Newton said on January 18, “a big part of it was: Can we sleep at night?...Do we feel good about where we are spending our time? Do we feel good about who we are building value for? And in the cases of X and Substack, the answers were no."
Ryan Broderick summed it up: "essentially," Substack will, going forward, expect "users and writers to flag objectionable content." The problem is that Substack, which writers have used to make email newsletters, has lately been putting effort into turning itself into a social media network more like Twitter, where people are more likely to encounter strangers' work. If it's a social platform, it needs to moderate itself like one, Broderick suggests; "you can’t protect your social network on a case-by-case basis when you 'become aware' of it."
In January, Hamish McKenzie said he “never imagined” their six-year-old platform “would one day grow this big. Now that we’re here, we feel a greater responsibility to writers than ever before.” What responsibility does he feel? To grow bigger, with less moderation? But that's not what writers are saying they want. Less moderation isn't an unavoidable error rate; not getting rid of identifiable Nazis is a choice.
A.R. Moxon writes (January 26):
"Then it came out that Substack founder McKenzie had astroturfed a "grass roots" creator response to get ahead of an actual grass-roots movement by creators who were asking him to explain himself about the Nazis. And then it came out that Substack leaked information from a publication investigating the Nazi story—Platformer, one of the flagship publications on Substack—to try to reframe the nature of the complaint as being about a few sites rather than Substack’s posture toward those sites, and position the whole thing as much ado about nothing rather than a deep foundational rot."
A parting thought:
"Flies don’t stop coming into the house because you want them to; they stop because you get off the couch and close the screen door. Any social media or blogging platform faces this. Substack may attract more Nazis than average because Substack has a “okay you don’t agree with me now but what if I wrote another 8,000 words about it” vibe. 2023 Nazis have a very “I didn’t have this insight until I read The Fountainhead for the sixth time, let me elaborate” thing going. Say what you want about the 1939 Nazis, but at least they were occasionally terse." (Ken White, "Substack Has A Nazi Opportunity," Dec 21, 2023)