I've known several people who would no longer work for Facebook, but Cambridge Analytica isn't the biggest concern. It's the fact that they are censoring people, even within private groups.
I have a friend that jokingly said (in a private group) that men are vile pigs. We knew she was joking - it was good natured. Yet, Facebook issued her a warning and removed her post and threatened her with a ban. First they came for Alex Jones and I said nothing because I don't like Alex Jones (and think he's insane), but now that the precedent is set that Facebook is the speech police, it will expand to us all (especially with their machine learning advancements that are here and yet to come).
The EFF has a really important article about this that I implore everyone to read[1].
FB has its problems, but I generally find the negative press overstated, and I wonder if Zuck's approach of engaging with the press and Congress actually backfires (compared to the other companies, which largely ignore them). I do appreciate how often he talks to the press to explain what they're trying to do, though.
I also see the Cambridge Analytica scandal for what it is - permissive APIs that were abused and then locked down. Cambridge Analytica is to blame for abusing the TOS and behaving badly; FB is arguably negligent - but I think the reaction is extreme.
Plus from people I know inside FB there really is a huge funded effort to stop abuse and manipulation via 'integrity' teams. It'll be interesting to see how they modify things given Zuck's recent pivot towards focusing on privacy as a core feature.
I don't see any nuance, I see invented complexity masquerading as something sophisticated.
This article is written as if they're trying to suss out the perfect boundaries along which to apply censorship. That frame of thinking is a con. They will never have a perfect censorship implementation because there is no win state.
Let me back up: as you read this, Facebook has a censorship policy. What would it take for Facebook to know they're done arguing over what does or doesn't get deleted? How do they know when the censorship policy isn't good enough? Advertiser pressure, press pressure, and political pressure. i.e. fashion.
It should have been obvious to everyone present at (or who read about) the meeting in this article that Facebook was attempting to invent a principled stance that allows casual misandry while banning similarly-tempered casual misogyny. But I'm perfectly willing to believe Facebook can't see it. I can explain.
Facebook's stance towards nudity has a very clever property (that is almost certainly unintentional). It allows users to lie to themselves. Facebook could automatically hide nudity from everyone who isn't an opted-in adult, and still throw it behind a twitter-style click gate so there is no accidental NSFW at work. But they don't. They'd have to add an "I'm not a prude" checkbox. And THERE'S the rub. The lack of such an option lets people uncomfortable with nudity tell themselves that they're not the kind of person who's uncomfortable with nudity. They want to think they're sex-positive enough to allow nudity, and are just reasonable people who don't mind if it happens to be banned. Even more importantly, it lets them avoid thinking about the question "am I so uncomfortable with nudity that the idea of other people - the WRONG PEOPLE - seeing it makes me uncomfortable?"
They're not a government and it's reasonable for them to have some editorial control over what's posted to their platform (in the interest of keeping it a place their users want to be).
A lot of this is detailed in the article; it's helpful to read it.
>it's reasonable for them to have some editorial control over what's posted to their platform
I'm not saying it's not.
I'm saying that their editorial policy is (in part) driven by fashion instead of principle. And complexity is used to obscure that rather than reveal it. My attempt to use Occam's razor to explain the obfuscation leads me to conclude there must be some utility that caused the system to evolve in such a way that it's possible to avoid seeing/acknowledging that.
> if Zuck's approach to interact with the press and congress actually backfires
Since, when Zuckerberg speaks, he always seems to be equivocating if not outright lying, and since his historical responses when Facebook has been called out for being abusive have always been lots of promises with no real changes, I suspect so.
> It'll be interesting to see how they modify things given Zuck's recent pivot towards focusing on privacy as a core feature.
That entire effort is Facebook's attempt to change the topic to something that is less threatening to them. The real issue is the privacy invasions from Facebook itself. Everything about their "pivot towards privacy" is premised on the privacy threat being actors external to Facebook.
It’s really easy to win the crowd by calling everyone else biased. When FB is one of the biggest lobbyists in Washington, I highly doubt the press has ever been critical enough.
They got by on being the new darling child startup and built up a gargantuan pile of moral debt which they are now fairly paying for.
[1] https://www.eff.org/deeplinks/2018/01/private-censorship-not...