For the first newsletter of 2024, let’s go back a week to a scene from A Christmas Carol by Charles Dickens.
“You will be haunted,” resumed the Ghost, “by Three Spirits.”
Scrooge’s countenance fell almost as low as the Ghost’s had done.
“Is that the chance and hope you mentioned, Jacob?” he demanded, in a faltering voice.
“It is.”
“I—I think I’d rather not,” said Scrooge.
“Without their visits,” said the Ghost, “you cannot hope to shun the path I tread.”
From an ESG perspective, this scene sets the stage for Scrooge’s coming pivot. By this point, of course, the reader knows that Scrooge has been failing miserably on Social and Governance issues, putting shareholder value (his own) ahead of his stakeholders (borrowers) and poor Bob Cratchit. Yet Jacob Marley arrives to offer Scrooge hope.
Three ghosts, representing the past, present, and future, appear to Scrooge that night to warn him, and of course, we know how the story ends. These mysterious stakeholders convince him to change his ways.
So, what has me thinking about A Christmas Carol and ghosts this week? Well, I’ve been doing some soul-searching myself as multiple news outlets have reported that Substack is platforming and monetizing content from Nazis because open discourse is preferable to censorship.
Oh boy. Honestly, I am tired of making moral judgment calls on companies.
There are a lot of takes on this issue, but I’m a bit partial to Margaret Atwood’s read. Atwood, perhaps best known for authoring The Handmaid’s Tale, correctly illustrates how Substack can’t have it both ways. Funnily enough, she is also proudly touted as a writer on Substack’s About page. Still, most articles about this issue are from the values point of view because, well…Nazis.
If there are two things ESG is familiar with, it is values and value.
While Scrooge dealt with tangible assets and financials along with his values, modern companies deal with a complex web of stakeholder values, intangibles, globalization, and interconnected risks. At stake here are the values surrounding free speech and what it means to be a company operating in a civil society. ESG, specifically the Social and Governance pillars, plays a role in the examination.
What better way to look at the complexities of values and value of this issue than through the Three Ghosts of Christmas?
The Ghost of Christmas Past: History and Regulation
The first ghost appears against the horrific backdrop of war.
The first stop is World War II and the result of Nazism: the murder of millions of Jews. When I was a teenager, I went on a senior class trip to Europe, and we visited the concentration camp at Dachau. I will never forget that experience, and world governments are keen to ensure the rise of Nazism never happens again.
For any online platform, the movement of global regulations is something to monitor, and Germany deserves particular attention. Germany has several laws prohibiting Holocaust denialism and Nazi imagery, including in online forms, and these laws have been enforced in the past. For example, Wolfenstein 3D, a video game, was banned for its Nazi imagery for over 30 years. Nazi content on Substack could easily become a regulatory risk for the company, limiting subscribers for its content creators by proxy.
Substack, a US-based company, argues against censorship and in favor of free speech. In the US, free speech is often treated as absolute, and companies invoke it clumsily as a result. The First Amendment of the US Constitution protects against government censorship; it does not bind private companies. Private companies, including social media platforms, are free to restrict and moderate speech, and they have no obligation to protect citizens’ speech. Still, in this case the lack of moderation is a conscious alignment with Substack’s purpose: the company argues that discourse is better than pushing out those you disagree with.
But again, we’re talking about Nazis.
Next, the ghost takes us to the United Nations. In 2022, a UN resolution was passed condemning Holocaust denial or distortion in this new age of technology. The terminology of the resolution recognizes:
…that States, regional organizations, national human rights institutions, civil society, non-governmental organizations, religious communities and the media play a crucial role in promoting tolerance and understanding, as well as fighting racism, negative stereotypes, hate speech and the deliberate spread of disinformation that may incite to discrimination, hostility or violence, and in the universal promotion and protection of human rights…
While the US has enshrined free speech from government censorship, this non-binding resolution can translate into real-world regulations for companies to adhere to. In November 2022, the EU enacted the Digital Services Act to protect its citizens from disinformation and digital harm. Under this act, the EU is now pursuing an investigation into Twitter/X for its control of hate speech and disinformation.
Twitter/X and Substack pass responsibility to the content creator, but these regulations target the platforms anyway. Behind the paper-thin shield of a principle, in this case open discourse, Substack has exposed both itself and its content creators to this regulatory risk. The opaque platitudes in its terms of service will not be enough to quell regulators if scrutiny arrives, and it may. If the EU decides to investigate Substack in the same way and limits its use or distribution across member states, where does that leave content creators?
From the horrors of World War II to Germany’s reconciliation and regulation through to the UN’s resolution and now the Digital Services Act lies a whisper from the ghost of “never again.”
The Ghost of Christmas Present: The Anti-Woke Business Model
The second ghost appears in Substack’s offices asking a question:
In a civil society during the digital age, what is the role of a company?
Companies are not government entities working to protect citizens and their rights, but Substack, like any company, can choose to do whatever it likes within the law. The company aligns with absolute free speech yet practices some Governance and moderation by prohibiting certain types of content in its Content Guidelines.
Ironically, when Twitter/X chose absolute free speech and began getting flooded with hate speech, Substack started Notes as an alternative for fleeing users, capitalizing on similar Governance missteps. Now, it finds itself in a similar position. The ghost flips Threads open to find users looking at other platforms like Beehiiv and Ghost (which brings a smile to the ghost’s face).
Many social media platforms and now Generative AI companies enact moderation controls to prevent hateful or dangerous content. Companies do this to protect themselves from material risks ranging from regulatory to stakeholder pressures. Yet, here we are in 2024, and platforming and monetizing Nazi content is not seen as a Governance risk unless physical violence is incited. I can’t even believe that is a sentence I’ve written.
The ghost flips open a newsfeed where we see that during 2023, Twitter/X shed much of the $44B Musk paid for it as its owner directly endorsed antisemitic posts. Meanwhile, advertisers’ content appeared next to Nazi and other hateful tweets. As advertisers fled, Musk swore at them in an embarrassing interview.
By allowing Nazi content on its platform, Substack has placed itself in the same position as Twitter/X. Even Notes may head down the same path if Nazis feel empowered to appear under the guise of discourse.
Like any modern company, Substack doesn’t operate in a vacuum. To monetize its platform, it relies on Stripe, a payments platform. Stripe has a Prohibited and Restricted Businesses policy, which Substack appears to align with; notably, Stripe’s policy prohibits racist content only when it is tied to violence. Will this become a reputational risk to Stripe by association and force it to place a stake in the ground on the issue? For Stripe, Substack’s management of this issue is Stripe’s Governance issue.
Another place to look is Substack’s investors. One of them is Wisdom Ventures, which states two of these bullets on its Vision page:
Nurture human connection, not division.
Support equality and social justice, not economic and social division.
With those points lingering in our minds for interpretation, the ghost clicks off its mobile device and fades away, leaving a piece of paper behind.
The Ghost of Christmas Future: The Falsehood of Self-Regulation
As the Ghost of Christmas Future appears, we’re left with a printed Note from one of Substack’s co-founders, Hamish McKenzie, defending the company’s position.
…history shows that censorship is most potently used by the powerful to silence the powerless.
This leads us to the argument of self-regulation. The idea is that if people don’t want to read Nazi-related material, these writers will give up. One might argue that censorship violates the civil liberties of Nazis, but again, a company is not responsible for enforcing civil rights. This is a platform choice Substack has made.
Companies take stands through action and inaction all the time. For example, back in 2015, online retailers stopped selling Confederate flags. Selling a Confederate flag is a material consideration for a retailer as societal tolerances shift. There are all kinds of risks and opportunities in making a move like this, but the opportunities usually outweigh the risks. This is stakeholder capitalism and reputational risk at work, not censorship.
In Substack’s case, the company apparently believes that this brand of open discourse satisfies more content creators than it alienates, whether those creators mean to take a side or not. They have (hopefully) weighed the risks carefully and landed here.
Unlike Twitter/X, which is largely ad-based, Substack relies on content generated by its writers, like me. So when my content about ESG and Corporate Governance is published on a platform that allows Nazis to publish their message, that’s a problem: it makes it look like I have no idea what I’m writing about. It also represents a reputational risk for anyone who isn’t a free speech absolutist, and an opportunity for those who are.
The question is whether the content creators with 2M paid subscribers will jump ship. This is Substack’s open future bet on self-regulation.
I’m staying here because there’s too much risk for me to move until the book is out, but it pains me to wait because I see this Governance risk brewing. I have seen no evidence that the management team at Substack is in control of this issue now that it is in the open.
Returning to what McKenzie writes in his Note:
I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don't think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.
I’d opine that normalizing Nazism as a ‘view’ is far worse than anything else the platform does.
One of Twitter/X’s issues that Substack does not have is that bots can quickly take over within the simplicity of Twitter/X’s character limits. However, Generative AI represents a new risk. If Nazis feel safe on this platform and leverage Generative AI to churn out endless variations of non-violent hate speech at scale, I believe there will be no turning back, and Substack will realize too late the mistake it has made.
As the Ghost takes leave, a foggy future remains.
Meanwhile, a return to purpose
And so, as Scrooge returns to his bedpost for his hopeful ending, Substack’s future is less certain, but it has woken up with the same purpose as the day before.
I believe a huge risk exists when a company goes against its purpose. Substack’s About page is as close to a stated purpose as we’ll get, but unlike many companies, Substack offers a glimpse of the future it wants to see.
We think the internet's powers, married to the right business model, can be harnessed to build the most valuable media economy the world has ever known—an economy where value is measured not only in dollars but also in quality, in good-faith discourse, and in creating an internet that celebrates and supports humanity.
The biggest question for me resides in those last few words. How can anyone tolerate Nazis in good-faith discourse and believe that they would be celebrating or supporting humanity in any way?
Returning to Atwood’s article: Substack can’t have it both ways here. There is a line, and the company has now marked where it stands. From a Governance perspective, that line must align with its purpose, which means moderation is a requirement, because an internet that celebrates humanity cannot accommodate the Nazi agenda.
Now that the line has been made public, the inflow of hate may lead to the outflow of content creators, leaving little room for any discourse, only another echo chamber. This post stands as my ESG perspective on the issue, so I suppose I’ve contributed to the discourse that Substack promotes. Still, discourse without action is just words on a screen. Substack usually serves as the platform that facilitates discourse, but here it has become the subject of it. What good is discourse without movement?