Hacker News | new | past | comments | ask | show | jobs | submit | login

This is how the internet is supposed to work: it encounters damage and routes around it. Censorship is damage, and people are routing around it.

As someone who never engaged with FB/Twitter from the beginning because I thought they were terrible ideas (all the rules, risks, and challenges of elite-level competition with none of the benefits), I was very glad to read this because this kind of divergence is precisely what creates growth. The platforms were becoming a ceiling on growth, and their policies have created the critical mass to start new ones. This will be a great time to be in tech.



I see it as the opposite: this may be how a technical network is supposed to work, but it's not how a society is supposed to work.

The internet has created the ability for any given subculture to discover instant validation and reinforce their beliefs into a tribe, which further solidifies and extremifies those beliefs.

It's a psychological power almost equivalent to a nuclear reaction: beliefs and ideas that might have naturally died out pre-internet are instead amplified, spread, and dug deeper into people's minds.

This is a critical time to be in tech, but not for the growth: for the need to design solutions to that problem. Censorship is a primitive attempt at stopping this psychological nuclear reaction; people are finding ways around it. We need a bigger, better solution to the reaction, lest society destroy itself through unfettered tribalism.


> The internet has created the ability for any given subculture to discover instant validation and reinforce their beliefs into a tribe, which further solidifies and extremifies those beliefs.

I don't believe this is true. I remember "Ye Olde Days" (TM) of the Internet. You went to an independent forum or mailing list that supported your interest (video games, music, politics) and then you argued with everyone in your "tribe" about video games and music and politics regardless of the purpose of the forum (which is hilarious and endearing to me). You had friends and you had frenemies but you didn't have strangers. It was nice. It was a community. Some interest brought you all together but you didn't stay there solely for that interest. You stayed for the people.

Now what do we have? Complete corporate domination and centralization. All interests are segmented into hashtags and subreddits but exist on the same super-massive platform. And there, conversation is moderated by two entities: moderators and the mob. There is no community anymore because the community is too fluid and too large (sometimes it's everyone on the internet). You don't know the people you argue with. And so arguments can never be taken in good faith.

You know, I had more stuff written but can't quite complete the thought in words. I think I'll just make an alternative.


You're right. The way we design our social platforms has a gigantic impact on how people see each other, and talk to each other, and the impact that has on us.

Perhaps the problem isn't the self-reinforcement, but rather that the platforms are designed around instantaneous engagement and addiction rather than talking and human-level discussion.

We've devolved internet discourse into very simplistic, unintelligent, instant gratification that's friendly to advertising and monetization.

That makes a lot more sense, and it's why I think it's so important to treat this problem as a much bigger systemic ecosystem. The design of a platform or a system impacts how people behave; the ones we have now have just the right mix of characteristics to cause this sort of insular tribalism.

We can choose to design different ones.


We have, and the users, for the most part, chose the screamy, thirsty, censored, algorithmic ones.


I think this is both true and untrue.

Screamy and thirsty seem to be innate to most (important) human discussions. So we can throw those out. They're irrelevant to the platform problem if the platform didn't create them (amplification is not relevant either in my mind).

So then, did users "choose" censored, algorithmic ones? Technically... but I think the nature of the censoring and the algorithms changed. It went from pulling spam and porn to pulling "harmful" content. The algorithm went from "WHERE user_id IN ..." to "curation".
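To make the contrast concrete, here's a toy sketch of the two feed models. All field names, data, and scoring weights here are made up for illustration; no platform's actual ranking logic is this simple.

```python
# Toy contrast: chronological feed vs. engagement-"curated" feed.
# All names and weights are illustrative, not any real platform's logic.
posts = [
    {"author": "alice", "text": "hello", "likes": 2, "ts": 100},
    {"author": "bob", "text": "hot take", "likes": 950, "ts": 90},
    {"author": "carol", "text": "update", "likes": 5, "ts": 95},
]
following = {"alice", "carol"}

def chronological_feed(posts, following):
    """The old model: posts from people you follow, newest first
    (the SQL "WHERE user_id IN ..." approach)."""
    mine = [p for p in posts if p["author"] in following]
    return sorted(mine, key=lambda p: p["ts"], reverse=True)

def curated_feed(posts, following):
    """The new model: an opaque engagement score decides what you see,
    whether or not you follow the author."""
    def score(p):
        follow_bonus = 10 if p["author"] in following else 0
        return p["likes"] + follow_bonus + 0.1 * p["ts"]
    return sorted(posts, key=score, reverse=True)

print([p["author"] for p in chronological_feed(posts, following)])  # ['alice', 'carol']
print([p["author"] for p in curated_feed(posts, following)])        # ['bob', 'carol', 'alice']
```

Note how bob, whom the user never followed, jumps to the top of the curated feed purely on engagement; that promotion of high-engagement strangers is what "choosing the algorithmic platform" amounts to in practice.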

I don't think those things are necessarily bad or evil. But these things started as open platforms to connect everyone. Now they're reverting to "communities" and people are adapting. People are forming "communities" on different "platforms".

The whole social media marketplace is so muddled and ripe for disruption. Network effects are bogus. It's just an excuse incumbents and academics use to rationalize their dominance post hoc. Twitter, Facebook, Parler, and the rest will fail, and fail fast, with the right alternative.


The worst argument for designing an experience is that users want it.

We can do far better.

But I get your point. Realistically this has to be a societal change that demands better social media platforms, not something just thrown into the market with no demand.


> The worst argument for designing an experience is that users want it.

This statement may be the crux of disagreement in the culture wars as well. To me this is an alien and hostile idea against all that can be good. I'm sure you have some reasoning based on an experience behind it, but it's very likely one of those irreconcilable differences of interests that we have conventions and civilization to navigate and negotiate around. I don't think we persuade each other, but rather, negotiate boundaries. Those boundaries are what we understand as tolerance.


I doubt we think as differently as you believe.

I am just a user experience designer. There is a wide, wide rift between what users desire and what will actually solve their problems. I'm paid to reconcile that difference.

I do not intend to design society. But then, people are, as we speak. I don't know what's better--letting them design for what people want regardless of what it does to society, or designing for a society that's better regardless of what individuals desire.

We're primitive creatures. All of us, myself included. We're mostly run by our lizard brains, going after what spurts the happy chemicals into our brains.

I don't think society should be shaped to battle against human nature; but I certainly don't think uncontrolled human nature should shape society. As in all things, a balance.

That is what design is, always.


The only community you have is your friends, and your friends usually have similar beliefs to you. So the space where you could have people with significantly different political views meet on the same terms is gone.


The point is that your friends don't always have similar beliefs to you. You don't think rednecks in the south play video games or soup up their cars or listen to rock and roll, the same as a Biden voter from Los Angeles?

So if you set up a forum for fans of the Atari, you get all kinds. They get to know each other. And then they talk about whatever.

But several things have smashed that all to pieces.

The first and main problem is that everything is a single site with millions of people now. You can go over to /r/atari and talk about Atari, but if you try to talk to any of those people about climate change they'll direct you to a different sub, which is full of entirely different people who you don't know or trust.

Then the sites that are independent are often operated by the company that makes the product. Sony might host a PlayStation forum, but they're going to boot you out for talking about politics or religion.

And then there's the fact that everything has become disposable. What do you do with your broken out of warranty Atari? Get a soldering iron. What do you do with a broken out of warranty iMac? Get a rubbish bin. But then there's nothing to build a community around, because everything is an appliance that you can't improve or repair.


Well said! Makes me think about the scope of tribalism before the internet. Without a doubt there was some, perhaps to a lesser degree though? I think certain views/opinions were more privately kept, perhaps shared with family members and close friends but very rarely shared with people outside their inner circle -- everything you said had your name attached to it and thus carried risks. I suppose you could write newspaper columns under a pseudonym, maybe mask your voice/face on radio or TV, but that was likely the most anonymous you could be. The internet is more or less anonymous by default. Should users be allowed to hide behind anonymity? I have to say yes because otherwise it just feels like 1984.

I think that it's possible to minimize tribalism by enabling civil discussion that shares many perspectives, with all hostility removed (and perhaps minimal emotions), and most importantly, let the audience form their own opinions. I really think that technology can solve this. Like all ideas, success is entirely dependent on implementation.


> I think that it's possible to minimize tribalism by enabling civil discussion that shares many perspectives, with all hostility removed (and perhaps minimal emotions), and most importantly, let the audience form their own opinions.

It is unsurprising that HN commentators think that the best discussion forum would be one without any emotion or hostility.

I think it is important, however, to guard against the hubris of porting your own model (whether it is American-style democracy or HN-style discussion forums) to every corner of the globe.

It might be apparent to us why this model is better, but it might be equally apparent to other, more combative internet commentators, why their model is better. When very smart people on both sides think that a different approach is best, it might be worth treading carefully.


Agree. It will be difficult to design such a system.

This is one of the reasons I strongly believe that design—not engineering technology—will be the limiting factor and defining practice of the 2020s.


I wonder if instead of it being a technical solution it will be a social one. That maybe in the future we will look at people constantly hooked on their phones and posting on social media the way we now look at smoking.

E.g., just as it was once totally normal to light up a cigarette in the elevator, the subway, or the theater, it is now acceptable to spend every moment of downtime staring at your phone, scrolling.

If there is wider recognition of the negative (mental) health effects of that it will be less acceptable to do it in public, and less acceptable to espouse views or information learned from such a source.


Censorship is the last-resort solution. Ideally, you'd want to have control over people's education and social structures to prevent deviation before any censorship is even needed.

It's a real shame, because it seems we had reached a peak with the advent of mandatory state school and television. Now things are going downhill with more and more diverse subcultures allowed to build upon themselves and explore the limits and avenues for improvement in their ideas.

There has to be a way to keep people in line with our society's values while still giving them an impression of freedom.


Recognizing the satire, I agree with the implicit critique that you're making.

That said, I think we also have to recognize that there is no true vacuum of discourse, i.e.:

> There has to be a way to keep people in line with our society's values while still giving them an impression of freedom.

Even if this is not what we're moving to with public schooling, it is essentially what we're moving to in the private sphere post-Citizens United, etc., only the values are dictated by those with wealth, rather than by procedural, governmental power.


That last point is the kicker to me.

We need to start understanding that there is no such thing as a natural state of freedom; there is only freedom from and freedom to, within specific environmental constraints and power structures. We are always influenced--the question is just what influences we want to prevent, design, or control and which we want to leave undesigned or "free."


Philip Pettit's writings on the difference between freedom as non-interference and freedom as non-domination are very interesting on this issue.


> Ideally, you'd want to have control over people's education and social structures to prevent deviation before any censorship is even needed.

Is this satire?


It's trying to make the grandparent's ideas explicit and bring them to their logical conclusion. I dislike slippery-slope-type arguments, but I fail to see how one can coherently agree with their comment and not mine.


I appreciate the sarcasm. Some responses in this thread are giving me a bad vibe. This site is full of users thinking they can somehow engineer society towards what they think is its proper state, and that aligns very well with the current behavior of big social media corporations. I wonder when we tech people deviated so far towards being aspiring tyrants...


It makes more sense when you think about the fact that society is already engineered to be in some state, and the power of social media companies appears to be influencing that in the wrong direction.

If you're okay with leaving society to the whims of its current incentives and the corporations' addictive, advertising-optimized technology, you're free to your opinion, but that seems even more dystopian to me than attempting to do better.

At the very least, I think arguing for sustaining the current, deliberately incentivized structure of society is morally equal to arguing for some different state.


It's fanfic of that scene in 1984 where the person (whose name I forgot) responsible for designing NewSpeak talked about how the goal was to simplify language until dissent was impossible to reason about, much less discuss.


Very funny; not arguing for censorship here, but for the design of social media technology that reinforces humanity and not primitive dopamine hit engagement driven advertising. What people say is not the problem.

Anyway, we are already in a balanced system that keeps people in line with society's values while still giving them an impression of freedom. The question is just which direction you go from there.


The problem with your arguments is that they are full of assumptions about what is good for society and humanity, while we don't actually have a scientific understanding of what humanity is for, which we would need in order to decide what is good. That, of course, makes all the arguments imply a power in someone's hands to decide what is "good" for humanity and society, so long as it's not "primitive dopamine hit engagement driven advertising" but something more aligned with your ideals. I hope you can see how this is not much different from somebody you don't align with having that power.


First, in no world am I arguing that I personally get to decide what is good or bad. Lord no.

Second, your argument devolves to everyone should be able to decide what to do for themselves or their company without discussion about what is good or bad for society or the environment at large.

So no, I can’t agree with that.

At some point we need to be able to discuss how to make society better. If that’s not allowed, then that’s not a society I want to live in.


Censoring and filtering ideas will just lead to evasion of the filters. It will never work; we need to embrace free speech and let people believe as they choose.


The long term solution might lie in educational technology. Can we find scalable ways to teach children critical thinking, epistemology, and metacognition?


So what you are recommending is that we prevent people from escaping censorship - is that right?


The last two sentences say that censorship is not a functional solution to the problem and that we need to find something better, so I’d say very definitely no.

I’m not the person you responded to, of course, but I don’t see how you could’ve read their post and thought they were advocating for inescapable censorship.


"people are finding ways around it" was his main gripe with censorship.


It's not a gripe, it's a reality. I have no horse in this race other than the systemic outcome.

Censorship (or official annotation, let's call it) was an attempt to curb the tribalism or put some controls on the spread of it; I'm certainly saying that it will not work.

I can't imagine alternative solutions at this time, but we need to. It will likely be some sort of system we can't yet imagine, and it will likely need to be at the societal level of mutual agreement. Think less of controlling speech, more "wow cars can kill people, we should probably agree to some rules around driving them around."

I expect things will get much worse before those types of systems are put in place.


It is concerning that you have "no horse" in a race over whether there should be American censorship.

It's as if the centrists have thrown away liberalism, which makes no sense to me given that liberal values are immensely popular among the public.


By saying I have no horse in the race, I'm saying that I'm trying to take an objective viewpoint on what is happening and what impact that has, not that I have no opinions or moral positions on the matter. Keep those separate.

Keep in mind also that censorship is a government matter: the government should not censor people; that's enshrined in the 1st amendment. But private platforms and companies have every right to design their communications systems the way they see fit, and I expect them to do so ethically with societal impact in mind.

I expect soon the government will need to enter into this race and take some wider action, but I don't know what that will be, nor how it will play with our constitution. It's going to be a wild ride as the psychological nuclear weapons we've created duke it out with the individual-rights principles we laid in stone hundreds of years ago. I can only hope we design some systemic solution that does not require that fight to take place.

And just to give an idea of the solution type I have in mind: right now the social network systems we have are optimized for addiction and engagement of content, and quick viral spread of content with minimal thought. Could we instead design systems that are more about human communication and understanding? Could we alter our existing platforms to tune down the addictive tribal dopamine hit in their nature? I bet that would have a large impact.

In other words, simply censoring speech without considering the design of the technology would be foolish. The platforms, not the speech, are the problem.


No. I'm saying that censorship isn't the long-term solution here, because it's untenable from a human rights perspective as well as simply a functional perspective (people don't like it).

We'll need to find something better, but it's difficult to imagine what shape that'll take at this point.


Then I would say your vision is incoherent, since you obviously aren't for small pockets of "absolute free speech", but you also aren't willing to commit yourself to controlling speech ubiquitously, which would be necessary to prevent the former.


I would say you're not looking wide enough here. Absolute free speech isn't a problem in isolation by any means; but there is an impact when it can grow into self-reinforcing belief that doesn't match reality. One appears harmless, like a single alpha particle; the other is akin to a chain reaction, or a nuclear explosion. Similar impact on society, I would argue.

Could we conceive of a way not to control speech, but to inform or educate or provide information in context--or do something, anything so that it doesn't have the power to self-reinforce and create psychological weapons of mass destruction? What if it's the design of our current social networks around instant engagement and addictive content that's the root of the problem? Could we change the nature and design of that platform without restricting the speech on it whatsoever?

I have to believe it's possible. I'm under no illusion that I know the answer, or that the answer is even something that we have a name for yet. This is not contradictory or incoherent, it's just yet unknown.


While I disagree with the GP, I think you are drawing a false dichotomy and there are multiple conceptions of free speech besides the negative, non-interferential, liberal one that would permit limits on some speech without "ubiquitous" control.

Take, for instance, limits on total expenditures on political speech over $1,000,000.


No. The post explicitly said that censorship isn't solving the problem and we need a better approach.

The problem is the asymmetry between bad-faith and good-faith action. In tightly-connected local societies that asymmetry is generally countered by reputation effects and limited scope of bad-faith action. In the worst cases, it's countered by societies being small enough that even when bad-faith ideas take over their spread is limited.

Technology has broken down the limitations. Information is spread by entities with no history or reputation at all (Twitter bots, for instance). The spread of bad-faith ideas is no longer limited spatially. The result is that obvious scams like QAnon (created to sell merchandise) thrive because there are now mechanisms to bypass all the natural limitations that used to constrain them throughout centuries of history.

Censorship is kind of like slapping a tourniquet on it. It may stop some of the worst symptoms, but it has a lot of terrible side effects and doesn't work that well anyway.

The biggest advances that need to be made are in recognizing that there is a problem in the first place, and that a "marketplace of ideas" is not equipped to deal with asymmetric bad actors. I don't know what the solution is, but "marketplace of ideas" has proven insufficient to deal with the real world. We need people to be looking for something better instead of claiming there is no problem.


While I agree with that, it all depends on which direction you look at it.

For example, how much of this reaction is caused by the earlier amplification of other ideas that would have just as easily died out?

I’ve been watching the political machine for a long time and one thing I can say for conservatives is that they are consistent in what they “say” they believe. It hasn’t changed much in years and I can at least respect that.

On the other side, there seems to be a new cause every week. Maybe it’s simply better use of technology, but most of what I see on Twitter seems to be messages designed to benefit the left. Whether it’s drawing attention to a subculture that feels marginalized or convincing that same group that everybody on the right hates them, it’s a pattern that’s really hard to deny. IMO Reddit is far worse than Twitter or FB in this respect, too.

Nobody has the energy, time or capacity for the amount of things the right is supposed to hate.

I don’t think a different platform is going to change any of that. I think it’s just going to create a new channel.


It's disturbing to think the socio-technological phenomenon that let my fun, happy community grow to multiple 10k+ person conventions and hundreds of smaller ones worldwide also gave actual Nazis and various schools of white supremacy a new life.


The argument that we need to stop tribalism to preserve society is both an un-self-conscious conservatism and a more pernicious appeal to cultural homogenization, which is what the dominance of the platforms caused, and which these innovations are a reaction to.

There is no diversity in 140 characters of closely monitored claptrap, and there is no risk and opportunity in trust and safety. There is no culture in a homogenized overton window, and there is no innovation in walled gardens.

The very idea that we need to suppress ideas for fear that the ignorant masses may be exposed to them is a fatuous, aspirational elitism that is the very reason the platforms have become stagnant.


> The very idea that we need to suppress ideas for fear that the ignorant masses may be exposed to them is a fatuous, aspirational elitism that is the very reason the platforms have become stagnant.

There are ideas and then there are lies. I think the problem is the lies, not the ideas. If an idea, like that the Earth is flat, can only find support in lies, the idea will go away if people aren't free to lie.

To be clear, a lie is a proposition insincerely asserted with the intention of causing others to sincerely believe it.


Yes, but people who get annoyed by someone else's ideas just call them "lies."

I, for instance, watched in astonishment as a relatively straightforward story about Hunter Biden's emails somehow became a Russian disinformation campaign.


I still vote Democrat, but it is the growth of exactly this attitude that you've identified that is causing me to move increasingly away from considering myself a "Democrat".

I can't see how people fail to notice the snobbish elitism underlying "we need to manage the discourse so that people's behavior is under control."

It's amazing how cyclical this sort of stuff is. Plato's Republic was enmeshed with a similar logic: that there is a natural way for society to progress, that human society is interfering with this natural way and that is a problem, and that we ought to have philosopher-kings (read: techies & politicians) to shape beliefs and discourse into a more natural (and thus "good") direction.

G.A. Cohen's writings on the history of philosophy have a scathing critique of this sort of thinking that I recommend.


I do not think we need to manage speech or control discourse whatsoever. If you think that's the idea here, you're not thinking deep enough about the nature of the systems that are problematic right now.

We've designed communication systems that are optimized for instant gratification: engagement, getting the dopamine hit of seeing things you agree with or are angry at, and rapidly moving to the next. That's not speech; that's a designed system of attention seeking, with the primary objective of getting eyeballs on advertisements and content.

We've designed communication platforms that cause us to cease seeing each other as human beings, and instead condensed thoughts and memes that are simply repeated to conform and get reinforced for a quick natural drug.

I do not think we should control the discourse or have structural control over what's right or wrong to say or think; that would be absurdly dystopian.

I do think we need to change our systems to incentivize human context around that speech, and reinforce our own humanity in how we read and respond to it.

I think it's possible to design better platforms that don't bring out the worst in society, and that we should do so.


Anyone who has studied the history of philosophy would know that the arguments you are making are classics among defenders of censorship: limiting speech for the "sake of speech" (over attention-seeking), "incentivize the context", "reinforce our humanity." None of these exempt you from managing speech, you're just trying to launder your values to a higher level - like what "good speech" is or whether a given speech-act appropriately "reinforces our own humanity." Tune the knobs of the social media platform until it starts producing speech I am more comfortable with.

Indeed much of JS Mill's On Liberty (1859) was dedicated to responding to these and others.

> we need to change our systems to incentivize human context around that speech

Who is the "we" who "should" do so? How does that "we" coordinate so that everyone building these systems builds them in the same way?

And why should these systems be optimized for whatever extrinsic goal you like better than "attention seeking"? That is managing speech.


The key difference here is that the speech is already highly managed and influenced to a degree never before seen in history. To think otherwise would be foolish.

Twitter isn't some natural state of the world that is pristine and unassailable; Facebook isn't the square in the park where people can speak their voice, and other choose to listen, participate, or walk along. Neither are anything like a book or newspaper.

If you're saying we shouldn't optimize or manage or turn the dials of these systems, then that's accepting that the current management and optimization and dials are somehow, inexplicably, acceptable or natural.

Someone has designed these systems and is influencing speech with their decisions. I don't have any power over them, but someone does, and the dials and designs of the system and the type of speech and communication they reinforce or encourage will change over time.

What gives them, the designers of the systems, any more right to manage their own system and the speech on it by their design choices? Are we to simply accept the corporations' design of their systems without critique or argument?

Not taking any action is to accept the current action, which still influences society. Not managing systems is to accept the current state of those systems, which still manages speech.

I do not see any difference whatsoever in those paths, so I will argue for trying to improve the systems.


Out of curiosity, are you a Parler user, or have you visited the site? If this is a victory for free speech, it's certainly a Pyrrhic one. Twitter and Facebook comments seem like the pinnacle of civil discourse next to the stuff I found on Parler just by creating an account and following all the recommended content in the onboarding process.


No, it’s awful and will ultimately justify creating the same government power over cyberspace that it exerts over the airwaves. It’s a painful example of how the miracle of technology doesn’t mean tech companies can govern.

It’s hard stuff. On the one hand, if 1999-era tech was all we had, I’d likely be sick or dead, as remote work wouldn’t have been possible for me. On the other, with 1999-era tech, we wouldn’t have a reality-show-host president fueling a pandemic to get more attention in the media.

Right wing lunatics (and other fringe types) weren’t powerful in 1990 because they couldn’t broadcast their filth. They got traction because the rules were loosened for radio and angry white man radio became a thing.

You have today’s shitshow because any lunatic can broadcast anything and gather a following.


I mostly agree. However, it'll be interesting to see how well new competitors can fare in an industry like social media, where the product (i.e., us) is already nearly monopolized by a few large platforms.

Personally, I like Parler's interface, and don't mind that it has kind of a conservative bent, but I've heard from other conservatives that the interface isn't as good as Twitter's. And its smaller audience is already keeping some big names away from it, even people who would theoretically really like the platform, like Ben Shapiro or Scott Adams. They've both created accounts there, but have both signaled that it's just too small of a platform and a waste of their time to post there, so they stick with Twitter.


Those personalities you mention (Shapiro, Adams, etc.) are artifacts of the prior platforms. It's meaningless, if not actually better, that they don't move. What makes this exciting is that new characters and economics will emerge.

Former British PM Tony Blair once related an anecdote about why he decided to run for the leadership, when Labour had just lost hard in an election and he heard their campaign lead rant, "I can't believe it, the people voted against us...what's wrong with them?" And that is how he saw his opportunity.

Between this cultural shift, tons of idle cash sitting on balance-sheet sidelines looking for productive assets and growth, and a winter of semi-lockdowns ahead of us with nothing to do but code, next year is going to be the most epic time for startups since '96-'97. That is, assuming civilization isn't wiped out, but I'm pretty well hedged.



