Were it not for the rest of last week's startling events in Washington DC, the actions taken by a few technology companies to muzzle the US President would have seemed even more dramatic.
The day after Donald Trump's supporters stormed the US Capitol, Facebook said it would suspend his account until he leaves office, and potentially for longer. Twitter, the pulpit that the President has used to great effect throughout his term, went one better on Friday, banning him completely. A host of other platforms, from Snapchat to YouTube, took some form of action.
In a few days, the online megaphone that Trump so prizes has vanished.
Given the potential for Trump to incite further violence through his smartphone, there is a strong argument for the restrictions. No law prevents Facebook and Twitter from enacting them. But the significance of Twitter and Facebook's decisions merits discussion.
A democratically elected head of state has been blocked or restricted from communicating with tens of millions of followers through the world's most popular online services.
Reaction to the move would perhaps have been stronger were it not for years of pressure on the platforms to confront Trump. Since the 2016 election, tech companies have fretted over their supposed role in the fake news and advertising that allegedly contributed to his victory. Activists, as well as employees, have demanded they rein in Trump.
Last summer, Twitter began hiding some of Trump's tweets and restricting how they were shared. When Mark Zuckerberg demurred about Facebook taking similar action, it sparked a walkout among staff. By the time of the election, Trump's repeated claims of voter fraud were accompanied by disclaimers.
Banning Trump completely felt like closing a door that had gradually been sliding shut, rather than slamming it in his face. And yet the companies' actions will have far-reaching consequences.
For one thing, banning Trump will do little to appease a Democratic Party that will soon have control of the White House and Senate, and has promised tech regulation. If anything, it may serve as evidence of the power wielded by internet companies, bolstering calls to break them up or limit their growth.
It also sets a precedent for more forceful content moderation: the primary message from Democrats towards Silicon Valley last week was not so much a welcome for Trump's suspensions, as a demand for further action.
Mark Warner, a senator and prominent critic of tech companies, said Facebook and others had become "core organising infrastructure" for real-world violence. Zuckerberg is nothing if not political, and has seen the way the wind is blowing. Last week's ban would probably have happened even if the run-off votes in Georgia had not handed Democrats effective control of the US Senate, but it certainly gave Facebook more cover.
So it seems inevitable that Facebook, at least, will move further towards more forceful moderation, a trend that is unlikely to stop at Trump. And because tech firms have tended to move in lockstep when it comes to moderation, others will follow.
This may go some way to avoiding a regulatory crackdown. But in taking a more active approach to moderation, they face a potentially greater threat: a mass exodus of users who feel they no longer belong there.
It is hard to oppose conspiracy theories, dangerous propaganda and incitements to violence being thrown off major social networks. But migration is more likely than extinction: those who feel the incumbent internet companies are becoming censors will simply find homes elsewhere.
Last week, Parler, which refers to itself as "free speech social media", saw downloads jump above those of Twitter in Apple's App Store. Gab, another alternative, said it was seeing 10,000 new users an hour.
Parler's future already appears in jeopardy: Google, Apple and Amazon have said they will stop hosting it, but other avenues will arise. Shortly after his Twitter ban, Trump said his team had been looking at setting up a platform of his own.
Facebook and Twitter should fear this. Their services have been built on the idea that they are a public square, where almost everyone has a place. The clock has been ticking on that optimistic vision for some time; after last week, it is probably over.
Some 74 million people voted for a man who no longer has a place on Facebook or Twitter, and a sizeable proportion of them will now look elsewhere.
The rest of us should have concerns too. The more the online world fragments, the easier it is for radical and dangerous parts of it to spring up, and the harder it is for law enforcement to monitor.
Students of the web have for years observed the gradual rise of the "splinternet", in which the online world fractures along geographical lines, with China, Europe and the US all going in separate directions.
After last week, the same thing may well happen along political boundaries.