Hacker News

> Luckily, as moral agents, we humans are capable of differentiating "general political party" from "white nationalists", and can target the latter and not the former.

This is highly debatable. There are plenty of examples of benign movements and opinions that have been stifled violently by society and the state.

With little effort, you can find plenty of people today calling for censorship of, or even violence toward, harmless liberal or conservative people because they are "communists" or "fascists". People are not known for good judgement and measured response.



Luckily, in civilized societies we have a system of checks and balances in place to ensure that doesn't happen.


Do we anymore?

Facebook and Twitter have become a digital public commons for discourse today. The check and balance is "what Facebook decides". That doesn't exactly seem like an adversarial check and balance system to me.


> Facebook and Twitter have become a digital public commons for discourse today

Absolutely correct; however, they aren't the only places for public discourse. People have never been able to demand that a newspaper print their article or that a magazine include their story. People have always had the choice to start their own newspaper or magazine and build their own audience, and this is still true today; in fact, it's much easier than it's ever been.

People whose business is access to human audiences, whether they are newspapers, music venues, theaters, magazines, etc., have almost always had the freedom to set their own standards, and it isn't clear to me why business owners shouldn't have this freedom anymore.


>Absolutely correct, however, they aren’t the only places for public discourse

Small comfort if they are the main places for public discourse: cutting people and ideas off there essentially relegates them to far less reach.

Strange how when some foreign state censors FB or Twitter it's an outrage, but when FB or Twitter censor people directly, "there are other places".

Not to mention monetary deplatforming (e.g. Mastercard, PayPal, Patreon, and co. not allowing funding), in which case there are no "other places" (few alternatives exist in any case, and none reputable enough for someone to pay through).

>People have never been able to demand a newspaper print their article or that a magazine must include their story

Which is irrelevant, since newspapers and magazines were always top-down affairs, written and curated by a specific team. Social media platforms were supposed to be open to society (hence "social"), not reserved for a select team of journalists.


> Strange how when some foreign state censors FB or Twitter it's an outrage, but when FB or Twitter censor people directly, "there are other places".

Because a government has a monopoly on violence, while a private company has freedom of association. You're conflating two different situations that are only superficially similar.

> Social media and platforms were supposed to be open to society (hence "social")

Yep, and that didn't work out so well. Hence, the bans.


The litmus test I think we should use is "We should honor the intent of the user".

If a group of users explicitly wants access to white nationalist content, they should be able to get it. So I would oppose blogs, web hosts, and Cloudflare deplatforming anyone for any reason besides the outright illegal.

Facebook and Twitter are not just about serving content to those who have the intent to view it; in fact, the whole point of these social networks is that they expose content to NEW people who didn't initially have any intent to view it. This is promotion, not access, and I have no problem with private entities choosing what they want to promote.

I would apply this same test to payments. Users who have explicit intent to financially contribute to objectionable content creators such as Alex Jones should still have a way of doing so. When Patreon, Mastercard, etc. deplatform him, it closes the door on those who already have that intent. Of course, I'm all for FB and Twitter shutting down the campaign so the word wouldn't spread nearly as far.

As a moderate liberal who finds sexual content over-censored, yet is disgusted by right-wing and anti-vax (anti-vax is often leftist!) conspiracy theorists, I think this "honor their intent" test is a great way to keep the internet relatively sex-positive and extremist content relatively niche.


This seems like a relatively moderate view, and I like how it breaks down the individual freedom of intentioned users vs. the freedom of users who have no intention of seeing said content.

But at the end of the day, aren't the companies who are providing payment processing or website hosting profiting off of extremism and hate? If you'll recall, the reason why they started deplatforming individuals to begin with was that large swaths of people boycotted their services until they chose to no longer do business with said extremists.

Isn't that voting with our dollars? Isn't that the Free Market of Ideas in action?


Oh, was that a thing? For example was there a lot of outrage and pressure on payment processors to deplatform FetLife, or for Patreon to remove cam girls? I don't recall anything along those lines.


An organization choosing not to publish someone is not censorship. People choosing not to listen is not censorship.

A government choosing what information you have access to IS censorship.

There are many organizations, anyone can start one. There is only one government and you can't escape it.


That wouldn't be a problem if these organizations hadn't captured 95% of the discourse. There is nothing in the definition of censorship that requires it to be done by the government.

The same sort of power brokers that would drive censorship in a place like China are the ones who fund political campaigns, found think tanks, control media empires, and direct advertising spend, and they use this leverage to drive censorship on social media.

In the end, if the rich and powerful have effectively squelched dissent, does it matter whether it was done through government mandate or through some more complex mechanism of private means?


While true by the dictionary definition, the commonly-understood colloquial definition of 'censorship' is government censorship.


I don't believe that to be true, as evidenced by this debate itself and the proliferation of this same debate across the internet.


'a problem' and 'censorship' are two different things.


The censorship by private organizations would not be a problem if...


Luckily, "civilized societies" are some of the most deluded about this point.

From McCarthyism, to J. Edgar Hoover, to MLK, to Gary Webb, to WMD, to the Patriot Act, to Snowden, to the "collusion" BS, to today's deplatforming, the establishment and the media easily stomp on whoever they don't like, with impunity.


So, can we as humans learn from history? Can we establish better, more thoughtful, and more balanced societies as time and our understanding progresses?

Or have we already built the pinnacle of society at some past point, and everything we ever do in the future is doomed to be as bad or worse than what we already have?

I'd like to choose optimism here, personally.


Well, speech platforms that are free for everybody, where people can make up their own minds, would be a good start.

Banning ads would also be another good start, but I don't see the idea getting very popular.


No, it would not be a good start. How do we know this? Because that's what the actual start was. And it led to Facebook becoming the carrier of all the hate people could convince each other to accept. And people targeted each other, conditioned each other, to normalize this hate and acceptance of violence against 'others'. And so we have the situation we have today, where Facebook was forced to acknowledge that they became a platform for hate.


How about instead of a one-stop-shop social network, we go back to the random topic-specific forums of yesteryear? It decentralizes discussion, allows individuals to freely associate among themselves, and doesn't result in a "one size fits all" mentality when it comes to moderation.


Indeed, this is a decentralised problem in need of a decentralised solution. Also relevant are https://hypothes.is/ and IPFS.



