You need to be very specific and also question the output if it does something insane

This decade’s version of “works on my box”

MacOS handles memory pressure better than Linux imo (at least for interactive use cases)

I have seen MacOS overcommit up to 50% of memory and still have the system be responsive.

Yesterday I accidentally filled up my RAM on Fedora, and even earlyoom took several minutes to trigger; in the meantime the system was essentially non-responsive


The plural of 'anecdote' is not 'data'.

That's exactly what it is

How do you think data is created? It's lots of anecdotes, normalised.


macOS uses solid-state drives to do swap to help increase virtual memory. I can run multiple browsers and IDEs smoothly on my 8GB MacBook.

This is with earlyoom/systemd-oomd enabled?

From my experience it does not help much, and I still get occasional freezes when a program misbehaves on Linux. It’s not a huge problem, but it is a problem and it exists; I have been dealing with it for about 15 years with no significant improvement.

The earlyoom/oomd changes are quite recent... I've had a 'better' experience, but I guess it's not really fixed yet.

Yeah, Fedora ships systemd-oomd
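
It can also be made to trigger earlier by tightening the thresholds in /etc/systemd/oomd.conf, something like this (the values here are illustrative; the shipped defaults are considerably laxer):

    # /etc/systemd/oomd.conf -- act sooner under memory pressure
    # (illustrative values; the defaults are 90% / 60% / 30s)
    [OOM]
    SwapUsedLimit=50%
    DefaultMemoryPressureLimit=40%
    DefaultMemoryPressureDurationSec=10s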

It did eventually work, but it took a while. Somehow it did not kill the culprit runaway processes, but it did kill enough stuff for me to regain control of the system.


>Windows 95 worked around this by keeping a backup copy of commonly-overwritten files in a hidden C:\Windows\SYSBCKUP directory. Whenever an installer finished, Windows went and checked whether any of these commonly-overwritten files had indeed been overwritten.

This is truly unhinged. I wonder if running an installer under wine in win95 mode will do this.


This is truly unhinged

Granted, but at the same time it's also resolutely pragmatic.

Apparently there was already lots of software out there which expected to be able to write new versions of system components. As well as buggy software that incidentally expected to be able to write old versions, because its developers ignored Microsoft's published best practices (not to mention common sense) and didn't bother to do a version comparison first.

The choice was to break the old software, or let it think it succeeded then clean up after the mess it made. I'd bet they considered other alternatives (e.g. sandbox each piece of software with its own set of system libraries, or intercept and override DLL calls thus ignoring written files altogether) but those introduce more complexity and redirection with arguably little benefit. (I do wonder if the cleanup still happens if something like an unexpected reboot or power loss happens at exactly the wrong time).

Could the OS have been architected in a more robust fashion from the get-go? Of course.

Could they have simply forbidden software from downgrading system components? Sure, but it'd break installers and degrade the user experience.

Since the OS historically tolerated the broken behavior, they were kind of stuck continuing to tolerate it. One thing I learned leading groups of people is if you make a rule but don't enforce it, then it isn't much of a rule (at least not one you can rely on).

I would argue the deeper mistake was not providing more suitable tooling for developers to ensure the presence of compatible versions of shared libraries. This requires a bit of game theory up front; you want to always make the incorrect path frictiony and the correct one seamless.


There was (and still is) VerInstallFile; however, it was introduced in Windows 3.1, and it is possible installers wanted to also support Windows 3.0 (there wasn't much of a time gap between the two, so many programs tried to support both), so they didn't use it.
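
From memory, a call looked roughly like this (a sketch; the file names and paths are made up, and you link against version.lib):

    /* Sketch: install a DLL only if it isn't older than what's on disk.
     * Without VIFF_FORCEINSTALL, VerInstallFile refuses downgrades. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        char tmp[MAX_PATH];
        UINT tmpLen = sizeof tmp;

        DWORD rc = VerInstallFileA(
            0,                     /* no VIFF_FORCEINSTALL */
            "CTL3D.DLL",           /* source file name (illustrative) */
            "CTL3D.DLL",           /* destination file name */
            "A:\\",                /* directory holding the new copy */
            "C:\\WINDOWS\\SYSTEM", /* where it should end up */
            "C:\\WINDOWS\\SYSTEM", /* where the current copy lives */
            tmp, &tmpLen);

        if (rc & VIF_SRCOLD)
            printf("refused: installed file is newer than ours\n");
        else if (rc != 0)
            printf("install failed, flags = 0x%lx\n", rc);
        return 0;
    }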

It is important to remember that Microsoft created some of this chaos to begin with. Other aspects can be attributed to "the industry didn't understand the value of $x or the right way to do $y at the time". And some of this is "nonsense you deal with when the internet and automatic updates is not yet a thing".

Why did programs overwrite system components? Because Microsoft regularly pushed updates with VC++ or Visual Studio and if you built your program with Microsoft's tools you often had to distribute the updated components for your program to work - especially the Visual C runtime and the Common Controls. This even started in the Win3.11 days when you had to update common controls to get the fancy new "3d" look. And sometimes a newer update broke older programs so installers would try to force the "correct" version to be installed... but there's no better option here. Don't do that and the program the user just installed is busted. Do it and you break something else. There was no auto-update or internet access so you had to make a guess at what the best option was and hope. Mix in general lack of knowledge, no forums or Stack Overflow to ask for help, and general incompetence and you end up with a lot of badly made installers doing absolute nonsense.

Why force everyone to share everything? Early on, primarily for disk space and memory reasons. Early PCs could barely run a GUI, so a few hundred kilobytes to let each program have its own copy of common controls was a non-starter. There was no such thing as "just wait for everyone to upgrade" or "wait for WindowsUpdate to roll this feature out to everyone". By the early 2000s the biggest reason was that we hadn't realized that sharing is great in theory but often terrible in practice, and that a system to manage who gets what version of each library is critical. And we also later had the disk space and RAM to allow it.

But the biggest issue was probably Microsoft's refusal to provide a system installer. Later I assume antitrust concerns prevented them from doing more in this area. Installers did whatever because there were a bunch of little companies making installers and every developer just picked one and built all their packages with it. Often not updating their installer for years (possibly because it cost a lot of money).

Note: When I say "we" here that's doing a lot of heavy lifting. I think the Unix world understood the need for package managers and control of library versions earlier but even then the list of problems and the solutions to them in these areas varied a lot. Dependency management was far from a solved problem.


> This is truly unhinged.

This is bog-standard boring stuff (when presented with a similar problem, Linux invented containers lol) - read some of Raymond Chen's other posts to realize the lengths Microsoft went to in order to maintain backwards compatibility - some are insane, some no doubt led to security issues, but you have to respect the drive.


It’s not bog-standard. Containers are not equivalent to doing what is described in the article.

Containers are in fact redirecting writes so an installer script could not replace system libraries.

The equivalent would be a Linux distro assuming that installer scripts will overwrite /usr/lib/libopenssl.so.1 with their own version, keeping a backup somewhere, and copying it back after the script executes.
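
Something like this, hypothetically (a sketch; the paths and the vendor script are made up):

    /* Hypothetical: stash a known-good copy of a library, let the vendor's
     * installer scribble over /usr/lib, then silently put the original back. */
    #include <stdio.h>
    #include <stdlib.h>

    static void copy_file(const char *src, const char *dst)
    {
        FILE *in = fopen(src, "rb"), *out = fopen(dst, "wb");
        char buf[4096];
        size_t n;
        if (!in || !out) { perror("fopen"); exit(1); }
        while ((n = fread(buf, 1, sizeof buf, in)) > 0)
            fwrite(buf, 1, n, out);
        fclose(in);
        fclose(out);
    }

    int main(void)
    {
        const char *lib = "/usr/lib/libopenssl.so.1";
        const char *bak = "/var/backups/libopenssl.so.1.bak";

        copy_file(lib, bak);               /* keep a backup somewhere */
        system("/tmp/vendor-install.sh");  /* script overwrites the library */
        copy_file(bak, lib);               /* quietly copy it back afterwards */
        return 0;
    }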

No OS that I know of does that because it's unhinged, and, well, on Linux it would probably break the system due to ABI incompatibilities.

If they had taken essentially the same approach as wine and functionally created a WINEPREFIX per application then it would not be unhinged.

edit: also to be clear, I respect their commitment to backwards compatibility which is what leads to these unhinged decisions. I thoroughly enjoy Raymond Chen’s dev blog because of how unhinged early windows was.


Man, after looking at the veritable pile of stinking matter that is Claude Code, compare it with the NT 4 source leak.

Windows may have suffered its share of bad architectural decisions, but unhinged is a word that I wouldn't apply to their work on Windows.


I think you guys read “unhinged” as way more negative than I meant.

Just because I am saying it’s unhinged doesn’t mean I don’t think it’s cool

I’ve never read any Windows source so I can still contribute to wine, but I’ve read the NT kernel is really high quality


It's easy to forget in these discussions that Microsoft didn't have infinite resources available when writing Windows, and often the dodgy things apps were doing only became clear quite late in the project as app compatibility testing ramped up. Additionally, they had to work with the apps and history they had; they couldn't make apps work differently.

You say, oh, obviously you just should redirect writes to a shadow layer or something (and later Windows can do that), but at the time they faced the rather large problem that there is no formal concept of an installer or package in Windows. An installer is just an ordinary program and the OS has no app identity available. So, how do you know when to activate this redirection, and what is the key identifying the layer to which redirects happen, and how do you handle the case where some writes are upgrades and others are downgrades, etc, and how do you do all that in a short amount of time when shipping (meant literally in those days) will start in just a few months?


I mean it looks like they did try to redirect writes somehow. They probably tried more sane options until they arrived here.

>there is no formal concept of an installer or package in Windows.

This one is on them, I think; package managers already existed - it doesn't seem like there was ever a blocker for Windows to have a package manager, but Microsoft never bothered until very recently


With hindsight sure, but I don't think any desktop operating systems had package managers in that era. macOS certainly didn't. NeXTStep had their .app bundle concept, but no legacy. And UNIX package managers were of no use - few of them properly supported third party packages distributed independently of the OS vendor, especially not ones that could upgrade the OS itself.

Windows 95 was not Windows NT and it still used the FAT32 file system, where it was not really possible to enforce access rights.

As TFA says:

You even had installers that took even more extreme measures and said, “Okay, fine, I can’t overwrite the file, so I’m going to reboot the system and then overwrite the file from a batch file, see if you can stop me.”


Well, and the earliest versions of Windows 95 used FAT16 (specifically VFAT, which added support for LFNs, or long file names). So enjoy those ridiculous cluster sizes if your hard disk even approached a gig or so.

You are right that it’s not equivalent, but the article explains why redirecting the writes wasn’t a viable option.

> If they had taken essentially the same approach as wine and functionally created a WINEPREFIX per application then it would not be unhinged.

Man, wouldn't it have been nice if everyone had enough hard drive space in those days in order to do something like that...


Two words: proprietary installers.

If an installer expects to be able to overwrite a file and fails to do so, it might crash, leaving the user with a borked installation.

Of course you can blame the installer, but resolution of the problem might take a long time, or might never happen, depending on the willingness of the vendor to fix it.


> If . . . the replacement has a higher version number than the one in the SYSBCKUP directory, then the replacement was copied into the SYSBCKUP directory for safekeeping.

This as well. I know there are a million ways for a malicious installer to brick Win95, but a particularly funny one is hijacking the OS to perpetually rewrite its own system components back to compromised version number ∞ whenever another installer tries to clean things up.
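
In pseudo-C, the refresh rule, and why version ∞ poisons it, looks something like this (versions collapsed to a single integer for illustration; the real check read the file's version resource):

    #include <stdio.h>

    /* Run after each installer exits, for each watched system file. */
    static void check_watched(unsigned long installed, unsigned long *sysbckup)
    {
        if (installed > *sysbckup) {
            *sysbckup = installed;  /* looks like an upgrade: refresh the backup */
            printf("copied %#lx into SYSBCKUP\n", installed);
        } else if (installed < *sysbckup) {
            printf("downgrade: restored %#lx from SYSBCKUP\n", *sysbckup);
        }
    }

    int main(void)
    {
        unsigned long backup = 0x00040000UL;   /* known-good version 4.0 */
        check_watched(0xFFFFFFFFUL, &backup);  /* malware claims "version infinity"
                                                  and becomes the backup copy */
        check_watched(0x00040001UL, &backup);  /* a genuine 4.0.1 update now looks
                                                  like a downgrade and is reverted */
        return 0;
    }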


What's unhinged about a periodic integrity check? Doesn't seem much different from a startup/boot check. If you're talking about security, you've come to the wrong OS.

Then blindly overwriting the shared libraries despite the guidance the OS vendor provides is actually hinged, yes?

I agree, it's unhinged for applications to overwrite newer versions of system files with older ones.

You'd have to track down some 16bit Win3.x software to install. Probably on floppy disks since CD-ROMs weren't common.

Ironically this is one of my main use cases for LLMs

“Can you give me an example of how to read a video file using the Win32 API like it’s 2004?” - me trying to diagnose a windows game crashing under wine
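
For the curious, the kind of answer I'm fishing for is the old Video for Windows / AVIFile route (a from-memory sketch against vfw32.lib; the file name is made up):

    #include <windows.h>
    #include <vfw.h>
    #include <stdio.h>

    int main(void)
    {
        PAVIFILE   file;
        PAVISTREAM stream;
        PGETFRAME  frames;
        LONG       i;

        AVIFileInit();
        if (AVIFileOpenA(&file, "movie.avi", OF_READ, NULL) != 0) {
            printf("could not open file\n");
            return 1;
        }
        AVIFileGetStream(file, &stream, streamtypeVIDEO, 0);

        frames = AVIStreamGetFrameOpen(stream, NULL); /* decoded DIB frames */
        for (i = 0; i < 10; i++) {
            /* returns a packed DIB: BITMAPINFOHEADER followed by pixels */
            LPBITMAPINFOHEADER dib = AVIStreamGetFrame(frames, i);
            if (dib)
                printf("frame %ld: %ldx%ld\n", i, dib->biWidth, dib->biHeight);
        }
        AVIStreamGetFrameClose(frames);
        AVIStreamRelease(stream);
        AVIFileRelease(file);
        AVIFileExit();
        return 0;
    }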


Exactly. I feel this is the strongest use case. I can get personalized digests of documentation for exactly what I'm building.

On the other hand, there's people that generate tokens to feed into a token generator that generates tokens which feeds its tokens to two other token generators which both use the tokens to generate two different categories of tokens for different tasks so that their tokens can be used by a "manager" token generator which generates tokens to...

And so on. It's all so absurd.


>Wi-Fi which is a lot higher quality than a consumer router

I am not really sure about that. My ISP-provided AP can do a gigabit over wifi.

I need to change it because the ISP hardcodes the DNS for spying reasons.

But sadly, to match that I'd need to spend like $180 on an AP with equivalent performance


My APs are “only” 802.11ac, but on the other hand they were only $8/ea. And all of the speed critical devices on my network are wired anyway. It’s good enough to stream 1080p/120hz from my gaming rig to my iPad with imperceptible jitter and sub 10ms latency so I’m happy. If they ever get flaky down the road I’ll just upgrade to the “latest” 10 year old sub $20 used enterprise gear I can get my hands on. And that’s not the oldest part of my setup, the router itself was made circa 2013 and my managed gigabit PoE switch is of indeterminate age but probably at least 20 years old if I had to guess. Networking tech changes a lot more slowly than some other areas.

I think people are more complaining about Windows crashing on updates, or Microsoft putting ads everywhere, or forcing OneDrive

That’s way more than just the “position of the start menu”


It does weird things in multi-monitor setups: dragging a window on top of the newly “maximized” window somehow does not work

Yes, MacOS breaks down the user until they give up on window management

The window management style of Mac OS is complete chaos imo

I have been using it for years and I just gave up entirely on managing anything; if I zoom out to see all my windows, it looks like the freaking Milky Way from all the windows I forgot I had


>just a slow decline into incompetence.

Give them some credit, it’s been quite rapid.


When were they anything other than incompetent?
