Spoken from the pretty obvious position of never having worked a low-wage, people-facing job.
Here's the real situation: the people who pick up the phone when you call aren't paid much above minimum wage. They have zero institutional power to fix anything. You're yelling at people who, themselves, are almost certainly only barely making enough money to get by.
It is worthless to yell at these people because they can't fix shit; they don't set policies and they have no power to change things. All your yelling will achieve is, at best, being counterproductive to what you want done (since now the front-facing employee dislikes you personally and is less inclined to try and help you out), and at worst getting you into further trouble when you do need something routine done (since now you're on the list of "people the employees don't want to put any extra effort into because they're jerks").
There are people who get paid to be the complaint-facing entity of the organization, who are paid to withstand whatever shit you can throw at them and who have the ability to fix the specific thing you needed. They're not the people who pick up the phone.
What you need to do is channel your inner Karen and ask to speak to the manager. The manager can help you with this sort of thing; they're the ones who can actually do something instead of just sustaining the machine, because they have a career they want to grow and they risk actual consequences for pissing people off.
Be polite (but firm; you don't need to be walked over) to the first-tier support employees, even if they can't help you. Save the complaints for the manager (whom you shouldn't be afraid to ask for, either). The manager's job is to deal with the real complaints, not the routine stuff that just happens to need a human involved. They took the job of being the face of the machine for reasons other than "I literally need a minimum wage job to survive".
The employee didn't mistreat anyone. She simply stated the procedure (which sucks!).
It was OOP who chose to escalate this to malicious compliance and who ascribed a lot more to her attitude than what was actually said. OOP assumed that she was out to get him specifically, when nothing in the described call even suggests as much.
The correct response would've been to ask for the manager and if the manager chooses to stonewall in an obnoxious way (which is possible!), then you pull the frustrating fax from hell on them. At that point, you're not just speaking to someone who has no power to fix shit, you're talking to someone who does have the power to fix shit and chooses to be a stick in the mud about it. That's when being a jerk back is deserved.
Being a jerk to low paid employees in this manner is unacceptable, rude and makes me think a lot less of the person writing it.
lol. stopped reading after your first line because I've worked every low-wage, customer facing job you can imagine. shoe salesman, phone rep for verizon and then t mobile and then at&t, fast food, local diner waitstaff, office receptionist, contract installer, HVAC repair, cable service tech. that's a truncated list. I have the opinions I have because I've had those jobs, not in spite of them. I know how I carried myself, and it was a very low bar to reach. it's only when people don't reach that bar that I raise issue. because the bar is "know where you are, know what you can do, know what you can't do, and be as accommodating and responsive to the client/customer as you possibly can be, given your constraints." doesn't feel onerous to me. and in this specific case, I don't even have a problem with karen, per se. at least not from the content of the story. my reply was in response to other people insisting that karen needs to be coddled because all she did was answer a phone and this horrible man sent her a fax! (the horror)
I don't dislike Codeberg inherently, but it's not a "true" GitHub replacement. It can handle a good chunk of GitHub repositories (namely those for well-established FOSS projects looking to have everything a proper capital-P Project has), but if you're just looking for a generic place to put code projects that aren't necessarily intended for public release and support (i.e. random automation scripts, scraps of concepts that never really got off the ground, things that aren't super cleaned up), they're not really for that - private repositories are discouraged according to their FAQ and are very limited (up to 100 MB).
They also don't want to host your homepage, so if GitHub Pages is why you used GitHub, they are not a replacement.
Unfortunately I don't think there's really an answer to that conundrum that doesn't involve just spinning up your own git server and accepting all the operational overhead that comes with it. At least Forgejo (the software behind Codeberg) is FOSS, so you can do that and it should cover most of what you need (and while you're in the realm of having a server, a Pages-esque replacement is trivial since you're configuring a webserver anyway). Maybe GitLab.com, although I am admittedly unfamiliar with how GitLab's "main" instance has changed over the years wrt features.
> If you do not contribute to free/libre software (or if it is limited to your personal homepage), and we feel like you only abuse Codeberg for storing your commercial projects or media backups, we might get unhappy about that.
Emphasis mine. This isn't about if it's technically possible (it certainly is), it's whether or not it's allowed by their platform policies.
Their page publishing feature seems more like it's meant for projects and organizations rather than individual people. The way it's described here indicates that using them to host your own blog/portfolio/what have you is considered to be abusing their services.
Seems fair to me; they're a nonprofit that exists in our lived reality, not an abusive monopolist that can literally throw a billion dollars at subsidizing loss leaders.
All it shows the world is why there needs to be a VAT-like tax on US digital services to help fund a public option for developers.
There's no reason why people can't make their own solutions rather than be confined to abusive private US tech platforms.
Disagree; the only alternative is to let the people decide. I don't trust a dozen men who already hold deeply undemocratic beliefs to dictate the direction of tech for society.
You are against democracy, I am not. Democracy has led to some of the best advances of civilization, all oligarchies have done is introduce mass poverty, mass misery, and mass death.
At least with democracy we went to the moon for mankind, not shareholders.
Reading what you quoted: no, it is not, as long as you contribute to free software or have projects that are open source - not just a personal homepage. If you only have a personal homepage and nothing else that's open source, then they have a problem.
Which makes it not really a suitable replacement for GitHub, which is my entire point.
Keep in mind, I'm not saying Codeberg is bad, but its terms of use are pretty clear in the sense that they only really want FOSS, and anyone who has something other than FOSS had better look elsewhere. GitHub let you put up basically anything that's "yours", and the license wasn't really their concern - that isn't the case with Codeberg. It's not about price either; it'd be fine if the offer was "either give us $5 for the privilege of private repositories or only publish and contribute public FOSS code" - I'm fine paying cash for that if need be.
One of the big draws of GitHub (and what got me to properly learn git) back in the day, with GitHub Pages in particular, was "I can write an HTML page, do a git push, and anyone can see it". Then you throw an SSG on top (GitHub had out-of-the-box support for Jekyll, but back then you could rig Travis CI up for other page generators if you knew what you were doing), and with a bit of technical knowledge anyone could host a blog without the full-on server stack. Codeberg cannot provide that sort of experience under their current terms of service.
Even sourcehut has, from what I can tell, a more lenient approach to what they provide (and the only reason why I wouldn't recommend sourcehut as a GitHub replacement is because git-by-email isn't really workable for most people anymore). They encourage FOSS licensing, but from what I can tell don't force it in their platform policies. (The only thing they openly ban is cryptocurrency related projects, which seems fair because cryptocurrency is pretty much always associated with platform abuse.)
I mean, it is arguably much easier to just write the HTML page and upload it with FTP and everyone can see it. I never understood why github became a popular place to host your site in the first place.
> I never understood why github became a popular place to host your site in the first place.
Easy: it was free, it was accessible to people who couldn't spend money on a hosting provider (read: high schoolers), and it didn't impose arbitrary restrictions on what you were hosting.
Back then, your options as a high school student were basically to either try to reskin a closed-off platform as much as you could (Tumblr could do that, but GitHub Pages also released in the period when platforms were cracking down on any user customization larger than "what is my avatar"), or to accept that the site you published your stuff on could disappear the moment the sketchy hosting provider that gave you a small amount of storage decided your bandwidth costs justified upselling you on the premium plan.
GitHub didn't impose those restrictions, in exchange for being a bit less interactive when it came to publishing things (so no such thing as a comment section without using Disqus or the like - and chances are you didn't need the comments anyway, so win-win). That's why it got a lot more popular than just using an FTP server.
There are multiple reasons why FTP by itself became obsolete. Some of them I can think of off the top of my head:
1) Passive mode. What is it and why do I need it? Well, you see, back in the old days, .... It took way too long for this critical "option" to become well supported and used by default.
2) Text mode. No, I don't want you to corrupt some of my files based on half-baked heuristics about what is and isn't a text file, and it doesn't make any sense to rewrite line endings anymore anyway.
3) Transport security. FTPS should have become the standard decades ago, but it still isn't to this day. If you want to actually transfer files using an FTP-like interface today, you use SFTP, which is a totally different protocol built on SSH.
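Point 2 is easy to demonstrate with a toy sketch of what an ASCII-mode ("text mode") transfer effectively does to binary data (the PNG header bytes are real; the transfer function is a simplification of the line-ending rewrite):

```python
# What FTP TYPE A effectively does when relaying a file toward a CRLF
# platform: every bare LF byte becomes CRLF. Fine for text, fatal for
# binary formats - the PNG header deliberately contains both endings
# precisely to catch this kind of corruption.
PNG_HEADER = bytes([0x89, 0x50, 0x4E, 0x47, 0x0D, 0x0A, 0x1A, 0x0A])

def ascii_mode_transfer(data: bytes) -> bytes:
    return data.replace(b"\n", b"\r\n")

corrupted = ascii_mode_transfer(PNG_HEADER)
print(len(PNG_HEADER), len(corrupted))  # the header grows, so the file no longer parses
```

Two extra bytes are enough to make every PNG decoder reject the file outright.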
Chrome and Firefox dropped support for it five or so years ago; it has had a lot of security issues over the years, was annoying over NAT, and there are better options for secure bulk transfers (SFTP, rsync, etc.).
Depending on your hardware (SBC), FTP can also be several times faster than SFTP for transferring files over a LAN. Though I'll admit to having used other protocols like torrents for large files that had bad transfers or other issues (low-quality connection issues causing dropped connections, etc).
Finding an HTTP+FTP host was easier than finding GitHub. Your OS probably had an FTP client installed already, and even if not, finding one is easier than finding - and most definitely easier than learning - git.
And if you already knew how to write/make HTML you'd for sure already know all of that too.
This is definitely a matter of perspective. I have had a Github account since 2010, and git comes installed on Linux and macOS.
I don't always have a server available to host an HTTP+FTP server on. Or want to pay for one, or spend time setting one up. I can trust that Github Pages will have reasonable uptime, and I won't have to monitor it at all.
> And if you already knew how to write/make HTML you'd for sure already know all of that too.
This seems unnecessarily aggressive, and I don't really understand where it's coming from.
BTW, you can absolutely host plain HTML with Github Pages. No SSG required.
> And if you already knew how to write/make HTML you'd for sure already know all of that too.
That's a completely false statement. My kid took very basic programming classes in school which covered HTML so he could build webpages - a fantastic instant-results teaching method. Hooray, now the class is finished and he wants to put it on the web. Just like millions of other kids who couldn't even spell FTP.
I just checked, I’m not using the feature but my current ISP still offers it: https://assistance.free.fr/articles/631 (10 GB FTP storage tied to the ISP-specific e-mail address).
There were a lot of sites that provided some cPanel-like option as long as you were OK with yourcoolname.weirdhostingname.com. I believe they all came with a file browser and the ever-present public_html folder.
There was geocities (now gone) and a couple of *.tk domains that would inject their ads all over your page. Neither makes a great substitute for GitHub pages these days.
I touched on the issues with FTP itself in another comment, but who can forget the issues with HTTP+FTP, like: modes (644 or 755? wait, what is a umask?), .htaccess, MIME mappings, "why isn't index.html working?", etc. Every place had a different httpd config and a different person you had to email to (hopefully) get it fixed.
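For anyone who never suffered through it, the permissions half of that dance looked something like this (illustrative; the exact paths and the httpd's expectations varied from host to host):

```shell
# The web server ran as a different user than you, so your files had to
# be world-readable and your directories world-traversable.
mkdir -p ~/public_html
echo '<h1>hello</h1>' > ~/public_html/index.html
chmod 755 ~/public_html              # rwxr-xr-x: others can enter the dir
chmod 644 ~/public_html/index.html   # rw-r--r--: others can read the file
stat -c '%a' ~/public_html/index.html
```

Get either chmod wrong and you got a 403 with no explanation, which is exactly the "email a different person at every host" experience described above.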
That FAQ snippet is insane to me. Maybe it's a cultural thing but I'd never do business with a company that has implicit threats in their ToS based on something so completely arbitrary.
The worst part is really the unclear procedure. If they set out terms that say they'll give me 4 weeks to migrate projects they don't like off the platform, with n email reminders in between, then that's not ideal but fine. As it is, I'd be worried I'll wake up to data loss if they get 'unhappy'. I have the same problem with sourcehut, actually, with their content policy.
Well, it's kind of describing the reality that exists at other companies today. Most ToSes have clauses saying they can kick you off for not using the service as intended, solely at their discretion. At least these guys are honest and upfront about it. I do agree, though, that some more guidelines around their policy would be nice.
Hey, I’m building Monohub, a GitHub alternative, and private repositories are very much a key feature - it started as a place for me to host my own random stuff. Monohub [dot] dev is the URL. It’s quite early in development, so it’s rough around the edges. It has PR support, though.
Hosted in EU, incorporated in EU.
Would be happy if you tried it out — maybe it’s something for you.
I started developing it as a slim wrapper around Git to support my own needs. At the same time, it is essential to have rich features like pull requests/code review, so I started focusing on designing a tool that strikes an appropriate balance between being minimalistic and functional. One thing that I focus on is allowing users to disable any feature they don't need.
And the site also uses Cloudflare (for domain registrar, DNS and CDN):
ipinfo monohub.dev
Core
- IP 188.114.96.1
- Anycast true
- Hostname
- City San Francisco
- Region California
- Country United States (US)
- Currency USD ($)
- Location 37.7621,-122.3971
- Organization AS13335 Cloudflare, Inc.
- Postal 94107
- Timezone America/Los_Angeles
Auth is hosted by Kinde (an Australian company, uses AWS)
> Unfortunately I don't think there's really an answer to that conundrum that doesn't involve just spinning up your own git server and accepting all the operational overhead that comes with it.
Hmm, all that operational overhead... of an SSH server? If you literally just want a place to push some code, that really isn't hard.
Lots and lots of programmers have very little understanding of - and especially little operational knowledge of - how to host a public service. You can be an extreme graphics programmer and not know the web stack at all.
And no, it's not that hard once you learn. Except now it's a never-ending chore where before it was an appliance. Instead of a car, you have a project car.
> Lots and lots of programmers have very little understanding and especially operation knowledge of how to host a public service. You can be an extreme graphics programmer and not know the web stack at all.
Can confirm.
Also, not everyone who wants to share content publicly has a domain name with which to do so, or the kind of Internet connection that allows running a server. If you include "hosting" by using a hosting provider... it's perfectly possible (raises hand) to not even have any experience with that after decades of writing code and being on the Internet. (Unless you count things like, well, GitHub and its services, anyway.)
I think both of you are misunderstanding what I proposed. You just need a single VM with an ssh server. Literally no web service needed, if all you want to do is host some code remotely.
I didn't misunderstand. Sshd is still a publicly exposed service. Most folks don't already know how - and don't want - to set up a machine that is always on, that restarts on power loss, that has a static IP or dynDNS, a domain name, proper routing, open ports, certs, and enough bandwidth - and that's before you even worry about actual security rather than just what's needed to make it work... It's actually a big annoyance if you don't do it all the time.
The rest of the owl: go to a provider, set up a VM (20 questions), log into root. SSH for login. Set up firewalls. Create a non-root user. useradd or adduser? Depends if you want a home dir, I guess. Debug why you can't ssh in. Finally get in. sudo apt update. sudo apt install git (or is it named something else?). Install fail2ban. Install a firewall.
If it's your ssh server and it's single user you don't need to use the "git@" part at all.
Just store the repo and access it with your account.
The whole git@ thing is because most "forge" software is built around a single dedicated user doing everything, rather than taking advantage of the OS users, permissions and acl system.
For a single user it's pointless. For anyone who knows how to setup filesystem permissions it's not necessary.
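Concretely, the single-user setup is just a bare repository under your own account. A sketch using a local path (over SSH you'd swap the path for something like you@yourserver:repos/demo-project.git - a hypothetical host; git's scp-style URLs make the two interchangeable):

```shell
# A bare repo is just a directory; no dedicated "git" user, no forge software.
git init --bare /tmp/demo-project.git

# Clone it and push to it with your normal account.
git clone /tmp/demo-project.git /tmp/demo-work
cd /tmp/demo-work
git -c user.name=me -c user.email=me@example.com commit --allow-empty -m "first"
git push origin HEAD

# The "server" side now has the ref.
git ls-remote /tmp/demo-project.git
```

Filesystem permissions on the bare repo directory are the whole access-control story in this model, which is exactly the point being made above.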
There isn't much advantage that can be taken from O/S users and perms anyway, at least as far as git is concerned. When using a shared-filesystem repository over SSH (or NFS etc.), the actually usable access levels are: full, including the abilities to rewrite history, forge commits from other users, and corrupt/erase the repo; read-only; and none.
Git was built to be decentralized, with everyone having their own copy. If it's an organization, someone trusted holds the keys to the canonical version. If you need to discuss and review patches, you use a communication medium (email, forums, IRC, a shared folder, ...).
> but if you're just looking for a generic place to put your code projects that aren't necessarily intended for public release and support (ie. random automation scripts, scraps of concepts that never really got off the ground, things not super cleaned up), they're not really for that - private repositories are discouraged according to their FAQ and are very limited (up to 100mb).
Until the AI scrapers[1] come for you at 5k requests per second and you're doing operations in hard-mode.
1. Most forges have HTTP pages for discoverability. I suppose one could hypothetically set up an SSH-only forge and statically generate an HTML site periodically, but that's already advanced ops for the average GitHub user.
This isn't a real thing, and if it ever becomes a thing you can sue them for the DDoS and send Sam Altman to jail. AI scraping is in the realm of 1-5 requests per second, not 5000.
FWIW, Pierre's "Code Storage" project [1] seems like it simplifies a lot of the operational overhead of running git servers, if what you want is "an API for git push". Not affiliated with the company (and I haven't tried it myself, so I can't vouch for how well it works), I just think it's a neat idea.
I think "Code Storage" (definitely needs a unique name), is less an API for git push (surely git push is that API?), and more an API for "git init"? It seems to be Git as infrastructure, rather than Git as a product. i.e. if you're using it for a single repo it's probably not a good fit, it's for products that themselves provide git repos.
Yeah, but oh boy is a private GitLab server complicated. The Omnibus installation helps manage that, but if you outgrow it you're in for a complicated time.
Also, GitLab has CVEs like every other week... you're going to be on that upgrade train, unless you keep access locked down (no internet access!) and accept the admittedly lower risk of a vulnerable GitLab server on your LAN/VPN.
Even if GitLab is fully updated, you're fighting bot crawlers 24/7.
I think the internet has "GitHub Derangement Syndrome" right now. It's an outlet for people's frustration.
The current trend reminds me a lot of the couple of years when game developers were that outlet. They needed to "wake up" and not "go woke, go broke". An incredible amount of online discourse around gaming was hijacked by toxic negativity.
I'm sure every individual has their really good logical reasons, but zooming out I think there is definitely a similar social pathology at play.
> I think the internet has "GitHub Derangement Syndrome" right now. It's an outlet for people's frustration.
I would argue that the open source people aren't the only ones paying attention right now.
If you are hosting proprietary code on Github, it has become clear that Microsoft is going to feed that into their AI training set. If you don't want that, you don't have a choice but to leave Github.
While the donation banner doesn't seem like an issue to me, the WMF comparison is extremely inappropriate if they want to talk about appropriate means of fundraising.
The WMF is notorious for donation banners that make wildly exaggerated claims about the state of the Foundation. It needs some money to stay operational, but it is not, by any real stretch of the imagination, in financial trouble or losing its independence for lack of donations: it has a massive endowment that could run Wikipedia for the next 50 years or so, and major corporations already give the WMF money to keep it in the air. That makes the statements those donation messages give to regular readers very deceptive, scaring people in third-world countries into parting with their meager savings out of fear that the WMF will vanish. And in general, their donation drives are extremely intrusive on the respective Wikipedias.
I understand that the Document Foundation just wants to bring donations to the attention of their users, but the WMF is the worst point to compare it to.
If anything I think the WMF approach is why people are upset with the LibreOffice banner.
They have been breeding bad will, and it is overflowing onto others.
That said, the failure of this post to recognise the problem with the WMF approach does not build confidence in their ability to recognise when users might have a legitimate complaint. That leads users to wonder where LibreOffice is headed.
I think that may be the first time I've seen licensing drama over something as minor as adding another author to the copyright list.
Pretty sure those are completely standard for major changes in maintainers/hostile forks/acknowledging major contributors. I've seen a lot of abandoned MIT/BSD projects add a new line for forks/maintainers being active again in order to acknowledge that the project is currently being headed by someone else.
From my "I am not a lawyer" view, Kludex is basically correct, although I suppose to do it "properly" he might need to duplicate the license text to make it clear that both contributors licensed under BSD 3-clause. Probably unnecessary though, given it's not a license switch (you see that style more for e.g. switching from MIT to BSD or from MIT/BSD to GPL, since that's a more substantial change); the intent of the license remains the same regardless, and it's hard to imagine anyone getting confused.
I suspect (given the hammering on it in responses), that Kludex asking ChatGPT if it was correct is what actually pissed off the original developer, rather than the addition of Kludex to the list in and of itself.
The original author said they were “the license holder”, specifically with a “the”, in discussions around both Starlette and MkDocs, which yes, just isn’t true even after rounding the phrase to the nearest meaningful, “the copyright holder”. This appears to be an honest misconception of theirs, so, not the end of the world, except they seem to be failing at communication hard enough to not realize they might be wrong to begin with.
Note though that with respect to Starlette this ended up being essentially a (successful and by all appearances not intentionally hostile?) project takeover, so the emotional weight of the drama should be measured with respect to that, not just an additional copyright line.
Inherently, not really. An expired, self-signed or even incorrect (as in, the wrong domain is listed) certificate can be used to secure a connection just as well as a perfectly valid certificate.
Rather, the purpose of all of these systems (in theory) is to verify that the certificate belongs to the correct entity, and not some third party that happens to impersonate the original. It's not just security, but also verification: how do I know that the server that responds to example.com controls the domain name example.com (and that someone else isn't just responding to it because they hijacked my DNS.)
The expiration date mainly exists to protect against two kinds of attack. The first: if it didn't exist, any valid certificate you somehow obtained for example.com would be valid forever. All I'd need to do is get a certificate for example.com at some point, sell the domain to another party, and then I could impersonate the party that now owns example.com forever. An expiration date limits the scope of that attack to however long the issued certificate was valid (since I wouldn't be able to re-verify ownership to renew it).
The second is to reduce the value of a leaked certificate. If you assume that any certificate issued will leak at some point, regardless of how it's secured (because you don't know how it's stored), then the best thing you can do is make it so that the certificate has a limited lifespan. It's not a problem if a certificate from say, a month ago, leaks if the lifespan of the certificate was only 3 days.
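The freshness check a client performs for both cases boils down to a timestamp comparison. A minimal sketch, assuming the notAfter field has already been extracted from the certificate in OpenSSL's text form (real clients read it from the parsed certificate, not a string; the dates here are made up):

```python
from datetime import datetime, timezone

def cert_still_valid(not_after: str, now: datetime) -> bool:
    # not_after in OpenSSL's text form, e.g. "Jan 04 00:00:00 2026 GMT".
    expiry = datetime.strptime(not_after, "%b %d %H:%M:%S %Y %Z")
    return now < expiry.replace(tzinfo=timezone.utc)

# A short-lived cert that leaked "a month ago" is already worthless:
leaked_not_after = "Jan 04 00:00:00 2026 GMT"
print(cert_still_valid(leaked_not_after, datetime(2026, 2, 1, tzinfo=timezone.utc)))  # False
```

The shorter the issued lifespan, the smaller the window in which either attack (stale ownership or a leaked key) is useful.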
Those are the on-paper reasons to distrust expired certificates, but in practice the discussion is a bit more nuanced in ways you can't cleanly express in technical terms. In the case of a .mil domain (where the ways it can resolve are inherently limited, because the entire TLD is owned by a single entity - the US military), an expired cert is mostly just lazy and unprofessional. The US military has a budget of "yes"; they should be able to keep enough tech support around to renew their certificates on time and to ensure all their devices can handle cert rotations.
Similarly, within a network you fully control, the issues with a broken certificate setup mostly just come down to really annoying warnings rather than any actual insecurity; it's hard to argue that the device is being impersonated when it's literally sitting right across from you and you see the lights on it blink when you connect to it.
Most of the issues with bad certificate handling come into play only when you're dealing with an insecure network, where there's a ton of different parties that could plausibly resolve your request... like most of the internet. (The exception being specialty domains like .gov/.mil and other such TLDs that are owned by singular entities and as a result have secondary, non-certificate ways in which you can check if the right entity owns them, such as checking which entity the responding IP belongs to, since the US government literally owns IP ranges.)
It's probably a bit of both from what I've seen of how Americans tend to react to their government doing things (online anyways).
The US's quagmire of incoherent laws and many jurisdictions seems to be a bad combination of:
* Apathetic voters raised on a media diet of "big government bad", which impedes any regulation at the federal level. (Note that this is irrespective of whether the voters actually want a small government; it's what they're led to believe.)
* Politicians who don't like to give up power: there's an unusual desire among local/state US officials to claim responsibility and get very pissy when the federal government steps in with a standardized solution. This is unusual compared to other countries, where punting responsibilities to local officials is generally seen as a way for politicians to abdicate responsibility - letting things die in micromanagement and overworked administrative staff - and isn't popular to do anymore these days. (It's also a two-way street: federal US lawmakers can abdicate making any legislation that isn't extremely popular by just punting it down to the states, even when they have legal majorities.)
* The US court system overly favors case law over statute. Laws in the US are permitted to be painfully underdefined, on the assumption that the courts will work out the finer details. It's an old system, designed for the days of bad infrastructure across large distances (like the British Empire it was copied from), meant to empower the judiciary to make snap decisions even when there's no law on the books (yet) or a law hasn't reached the court in question. The result is a bunch of different jurisdictions, each with slightly different rules. It also encourages other bad behavior, like jurisdiction shopping, where people seek out the most favorable court, crafting "the perfect case" to get case law on the books the way they want it and to get judges to override similar cases, and so on and so forth. In other countries, a supreme court ruling doesn't have nearly the same lasting impact as a decision does in the US.
* And finally, the entire system is effectively kept stuck in place because lobbyists like it this way; if they want to kill regulation, they just get some states to pass on it and then hem and haw at the notion of federal regulation. Politicians keep the system in place on their own; lobbyists provide the grease/excuse to keep doing it. (And those lobbyists these days also have increasing ownership of US media, so the rhetoric about voters disliking big-government regulation is reinforced by them as well.)
It didn't end up this way on purpose; the historical reasons are mostly untied from lobby interests (they're mostly "the US is the width of a continent", "the states didn't actually work together much at first" and "the US copied shit from the British Empire"), but it's kept this way by lobby interests.
That does sound like there's an exploitable element there, doesn't it?
Statistically speaking, most people use one of the biggest email providers, which use their own models to detect spam (or even quietly drop messages). If you're doing an unpopular ToS change, why not set the mail up so it's still RFC-compliant, but in such a way that it won't be allowed through by any of the big providers? Then you can just claim the problem is on the user's side.
For example, the Message-ID header is technically not required (SHOULD rather than MUST), but as a spam detection measure, Gmail just drops the message entirely for workspace domains: https://news.ycombinator.com/item?id=46989217
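Conversely, if you're sending legitimate notices and want them to survive that particular filter, the fix is trivial. A minimal sketch with Python's stdlib (the addresses and subject are made up):

```python
from email.message import EmailMessage
from email.utils import make_msgid

msg = EmailMessage()
msg["From"] = "notices@example.com"   # hypothetical addresses
msg["To"] = "user@example.com"
msg["Subject"] = "Terms of Service update"
# Message-ID is only a SHOULD per RFC 5322, but some receivers
# treat its absence as a spam signal and drop the message outright.
msg["Message-ID"] = make_msgid()
msg.set_content("Our terms are changing.")
```

Which is exactly why "RFC-compliant but undeliverable" is such a plausible dark pattern: compliance and deliverability are checked by different parties.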
The exploitability goes both ways, I think. Users can also mark similar emails as spam to keep such emails out of their inboxes. Not sure how one could deal with that.
Oh that's awesome. Finally the contradiction of buying Google to avoid Google has been resolved for GOS.
I am curious how Motorola intends to deal with Google's policies surrounding Android forks, but I'm sure that's a hurdle they know how to cross.
Nobody, because no company is actually pro-customer. Which is fine, the customer and the company's goals don't align beyond "want product" and "supplies product".
The problem is that Amazon abuses its market position as the search engine for consumer products to unfairly prevent anyone from competing with them. Being "better than Amazon" on margins is completely impossible for another storefront, because Amazon demands sellers price match them.
Let's say you're a seller who wants to net $7 in revenue from each sale (your actual cost of making the product isn't relevant to this estimate). If you list this product on the Amazon store, Amazon takes its own cut on top of your price (though it's usually framed the other way around: you set the final sale price and Amazon then says how much they take). For simplicity's sake, we'll go with a 30% cut, so they list it for $10. Now let's say there's a second storefront you want to sell on; we'll call it Bamazon. Bamazon takes a lower cut than Amazon does, let's say 10%. So the product would be listed there for $8 (taking into account customer psychology on price listings), making Bamazon the better deal, right? The smart customer saves money, Amazon is incentivized to improve its margins if it doesn't want to lose market share, and everybody's happy.
Wrong. What happens instead is that Bamazon also lists the product for $10, because if it's listed lower, Amazon screws the seller by delisting them from Amazon, which is unacceptable for the seller: Amazon holds the monopoly position, so without it the seller sells absolutely nothing. The product ends up equally expensive for the customer everywhere, and Bamazon's lower cut only benefits the seller, who now pockets higher profits per sale while the customer gets screwed. Meanwhile Bamazon is rendered unable to compete with Amazon on its better margins, since Amazon is the assumed default. Any benefit of a different store having better margins is fully masked by this approach, and only Amazon wins.
It's a most favored nation clause, and their use on online platforms is ubiquitous and scummy: it makes things more expensive for the customer while also entrenching Amazon's monopoly position. This crap is usually couched in pro-customer rhetoric, but it really isn't pro-customer. It mostly serves to entrench monopolies not on their quality, but through their existing market share. (Valve also famously does this, by the way.)
Just a heads up, since no company is pro-consumer, and I assume you know what it is to be pro-consumer, if you started a truly pro-consumer business, you would put all the others out of business.
Just think about that.
Ironically, a large part of Amazon's rise was on the back of their very pro-consumer policies. Not many companies would tolerate large scale GPU return fraud (among other items) for those many years for example.
That's a very simplistic take because it assumes full transparency for all consumers - all while advertising, one of the biggest industries in our society, explicitly allows companies to turn the money they make from consumer-hostile behavior into additional reach, and even worse: all while large companies and VCs keep buying up pro-consumer businesses and enshittifying them.
Some companies have good intent. Public benefit corporations are a thing. They aren't really relevant, because unscrupulous companies outcompete them.
Your assertion that pro-consumer companies would outcompete unscrupulous ones depends on consumers and regulators holding them accountable. So why are you arguing against being suspicious of companies?
Obviously the best strategy for companies is to appear to be pro-consumer, but "cheat" (meaning price fixing but also things like advertising and buying up competitors) as much as possible. In that context, "all companies are anti-consumer" is a decent shorthand for "you should assume every company is anti-consumer because the regulatory environment favors it, even if there are exceptions."
Well, for one, Servo isn't just a JavaScript engine; it's an entire browser engine, closer to Blink and Gecko.
Secondly, Ladybird wants to be a fourth implementor in the web browsers we have today. Right now there's pretty much three browser engines: Blink, Gecko and WebKit (or alternatively, every browser is either Chrome, Firefox or Safari). Ladybird wants to be the fourth engine and browser in that list.
Servo also wants to be the fourth engine on that list, although the original goal was to replace Gecko with Servo (which effectively wouldn't have changed the fact that there are only three browsers/three engines). Then Mozilla lost track of what it was doing[0] and laid off the entire Servo team. Nowadays Servo isn't part of Mozilla anymore, but the project is clearly much more strapped for resources and doesn't seem too interested in taking on all the work of building a Servo-based browser.
The question of "why not use Servo" kinda has the same tone as "why are people contributing to BSD, can't they just use Linux?". It's a different tool that happens to be in the same category.
Here's the real situation: the people who pick up the phone when you call aren't paid much above minimum wage. They have zero institutional power to fix anything. You're yelling at people who themselves are almost certainly barely making enough money to get by.
It is worthless to yell at these people because they can't fix shit; they don't set policy and they have no power to change anything. All your yelling achieves is, at best, being counterproductive to what you want done (since now the front-facing employee dislikes you personally and is less inclined to try and help you out), and at worst getting you into further trouble when you do need something routine done (since now you're on the list of "people the employees won't put any extra effort into, because they're jerks").
There are people who get paid to be the complaints-facing entity of the organization, who are paid to withstand whatever shit you throw at them and who actually have the ability to fix your specific problem. They're not the people who pick up the phone.
What you need to do is channel your inner Karen and ask to speak to the manager. The manager can help you with this sort of thing; they're the ones who can actually do something, because they have a career they want to grow into and risk real consequences for pissing people off.
Be polite (but firm; you don't need to be walked over) to the first-tier support employees, even if they can't help you. Save the complaints for the manager (whom you shouldn't be afraid to ask for, either). The manager's job is to deal with the real complaints, not the routine stuff that just happens to need a human involved. They took the job of being the face of the machine for reasons other than "I literally need a minimum wage job to survive".