
You mean something that won't allow two packages to own the same file? Something like rpm or apt?

No, not really.

For one thing, package managers are only useful for packages supplied by the distro (or otherwise bundled using that convention), and we need something that allows for installing (and uninstalling, and backing up configurations for, and...) software safely and systematically in the general case.

For another thing, even packages installed with a distro's own package manager can typically dump whatever files they want wherever they want, rather than having the OS restrict them to a controlled environment.



> For one thing, package managers are only useful for packages supplied by the distro (or otherwise bundled using that convention), and we need something that allows for installing (and uninstalling, and backing up configurations for, and...) software safely and systematically in the general case.

There's nothing that limits rpm/deb to the distro. Anyone who publishes software as a tarball can publish an rpm/deb as well. Many do.

> For another thing, even packages installed with a distro's own package manager can typically dump whatever files they want wherever they want, rather than having the OS restrict them to a controlled environment.

The list of files in the manifest is checked beforehand, and if there's a conflict with an existing package, the installation is aborted.
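That pre-install conflict check can be sketched as a toy Python model (the package names and file lists here are invented for illustration; real managers like rpm and dpkg keep this data in their package databases):

```python
# Toy model of a package manager's pre-install file-conflict check.
# All manifests here are invented for illustration only.

installed = {
    "coreutils": {"/usr/bin/ls", "/usr/bin/cp"},
    "nodejs": {"/usr/bin/node", "/usr/lib/node_modules/npm/bin/npm"},
}

def check_conflicts(manifest):
    """Return files in `manifest` already owned by an installed package."""
    conflicts = {}
    for pkg, files in installed.items():
        overlap = files & manifest
        if overlap:
            conflicts[pkg] = overlap
    return conflicts

# A hypothetical package that tries to ship its own /usr/bin/node:
bad = check_conflicts({"/usr/bin/node", "/opt/mynode/README"})
print(bad)  # {'nodejs': {'/usr/bin/node'}} -> installation aborted

ok = check_conflicts({"/usr/bin/hello"})
print(ok)   # {} -> no conflict, installation proceeds
```

This is only the ownership check, of course; it says nothing about what a package's install scripts may do.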


There's nothing that limits rpm/deb to the distro. Anyone who publishes software as a tarball can publish an rpm/deb as well. Many do.

Hence my "or otherwise bundled..." note.

But you're still only thinking in terms of packages that are bundled and installed via the system tool. Anything not installed via that tool can typically do whatever it wants if its scripts run as root, and anything that is installed via that tool typically won't be aware of anything that wasn't and will happily write all over it with no mechanism for backing up what was there before or reverting a breaking change.

The point is that relying on some voluntary convention like this isn't good enough. A modern OS should enforce mandatory restrictions on all installed software. We should be able to do things like checking exactly what is installed, or uninstalling something unwanted with or without also uninstalling any now-unused dependencies or any configuration data, and we should be able to do these things reliably, safely, and without any requirement for the software itself to be "well behaved" in any particular way.


No, they do not have to be bundled. The vendor of a given piece of software has to support it.

Vendor A, supporting system B with its packaging system .xyz, makes its deliverables available as .xyz packages. Everything is fine; stuff works as it should.

Vendor C makes its deliverable as a self-extracting installer that happens to run on system B, and it needs your permission/credentials to install on your system. If you do that without any auditing, it's your problem if it overwrites something. You did give permission (you had to type in that password) and didn't insist on proper packaging.

Because the system provides the facility to achieve what you want; you just chose to override it. You own all the consequences of that.

If you want a modern OS to enforce mandatory restrictions on all installed software, modify your sudoers file to only allow running rpm/yum or dpkg/apt. Packages installed via these means fulfil the conditions that you describe.
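For what it's worth, a minimal sketch of that sudoers restriction might look like the following (the user name and tool paths are assumptions for a Debian-style system; always edit sudoers via visudo):

```
# /etc/sudoers.d/pkg-only -- hypothetical example, edit via visudo.
# Allow user "alice" to run only the package-management tools as root;
# every other sudo command is denied.
alice ALL=(root) /usr/bin/apt, /usr/bin/dpkg
```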


If you do that without any auditing, it's your problem if it overwrites something.

I don't know whether you're genuinely missing my point or just trolling, but this doesn't seem to be a very productive discussion so this will be my last comment here.

Your argument seems akin to saying that you could choose to install only open source software, and to personally audit every line of code in that software including all its dependencies, so if you don't do that then it's your own fault if something bad happens. If you're both a world class programmer and a security expert, and yet bizarrely you have ample free time available and nothing better to do with it, that might work. In the real world, it's totally impractical, and a much better solution is to operate according to the principle of least privilege, enforced at the level of the OS, without having to rely on conventions and/or good will.

If you want a modern OS to enforce mandatory restrictions on all installed software, modify your sudoers file to only allow running rpm/yum or dpkg/apt. Packages installed via these means fulfil the conditions that you describe.

No, they don't, as I've repeatedly tried to explain. At best, even if packages are available and properly constructed, your method keeps track of where files go and can remove them again afterwards. It doesn't enforce any systematic use of the filesystem to contain packages within specific areas; it doesn't manage related issues like configuration files that you might want to back up or preserve across software changes; it doesn't restrict access to files, networking or other system resources that the software has no business touching; it doesn't scale to the many-small-dependencies model prevalent with tools like NPM; and at this point there are already so many fundamental problems with basic robustness and security that anything else is probably moot anyway.
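To be concrete about the one capability conceded here, the "keeps track of where files go and can remove them again" behaviour amounts to something like this toy Python model (everything in it is invented for illustration):

```python
# Toy model of the narrow thing package-manager file tracking gives you:
# install records a manifest, uninstall removes exactly those files.
# It deliberately does nothing about config backup, sandboxing, or
# restricting what the software can touch -- which is the point above.

db = {}                 # package name -> set of installed file paths
filesystem = set()      # stand-in for files on disk

def install(pkg, manifest):
    db[pkg] = set(manifest)
    filesystem.update(manifest)

def uninstall(pkg):
    filesystem.difference_update(db.pop(pkg))

install("hello", ["/usr/bin/hello", "/usr/share/doc/hello/README"])
uninstall("hello")
print(filesystem)   # set() -- every recorded file is gone again
```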

I leave you with a question, which brings us back to where we came in. Given that this broken version of npm exists and that it was made available via at least one production channel that should not have included it as a result of presumed human error by the maintainers, how would anything material have changed today if people had been installing it via an official package and their package manager as you suggest, rather than via npm update?


> I don't know whether you're genuinely missing my point or just trolling, but this doesn't seem to be a very productive discussion so this will be my last comment here.

I'm afraid it is you who is still missing the point.

No matter what the system does, if you use your root privileges, all bets are off. You are the god of the system, you can do whatever you want, the system has no way to stop you. That includes destroying the system, whether directly, or by scripts run on your behalf.

The only way for the system to enforce anything is to take root away from you. There is and will be no system that can both provide you with unlimited power AND hold your hand. That's the law of the objective reality we live in. To quote: "Ils doivent envisager qu'une grande responsabilité est la suite inséparable d'un grand pouvoir." (They must consider that great responsibility follows inseparably from great power.)

> It doesn't enforce any systematic use of the filesystem to contain packages within specific areas

That's right, because it has no knowledge of what your specific areas are, or of what they are allowed to contain.

> it doesn't manage related issues like configuration files that you might want to back up or preserve across software changes;

Configuration files are app-specific; "the system" cannot have knowledge of their internal structure or of your intent. What it can do (and does) is show you the old and new versions, optionally the diff between them, and leave the final decision to you. It will never overwrite your configuration without your consent (see the first part of the answer).

If you want full SCM power over your config, put your config into SCM. Not everyone wants that, but those who do have the option available. Others may prefer other ways of managing it, anywhere in the gamut from "none" to "full-blown provisioning system".

> it doesn't restrict access to files, networking or other system resources that the software has no business touching;

To the software, or to its installer? It pretty much does restrict the software when it is being run. To the installer? See the first part of the answer.

> Given that this broken version of npm exists and that it was made available via at least one production channel that should not have included it as a result of presumed human error by the maintainers, how would anything material have changed today if people had been installing it via an official package and their package manager as you suggest, rather than via npm update?

It boggles my mind why anybody would run npm as root. The only thing they achieve is writing files where they otherwise couldn't, and risking exactly what happened now.

They _could_ run npm as a normal user that owns the target directory, without any risk of damaging the system.
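For example, one common arrangement (the directory name here is conventional, not mandated by npm) is to point npm's prefix at a user-owned directory so that `npm install -g` never needs root:

```
# ~/.npmrc -- put "global" installs under the user's home directory
prefix=${HOME}/.npm-global

# ~/.profile -- make the user-owned bin directory visible on PATH
export PATH="$HOME/.npm-global/bin:$PATH"
```

With this in place, a broken package can at worst damage files under the user's own prefix, not the system.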

So the problem is not npm bugs; the problem is people not realizing what they are doing and refusing to take responsibility when it goes wrong.


If you assume that conventions don't work because people will just run whatever crap as root, I don't think you can solve the problem without taking that right away from the user (as is customary on mobile devices).

At that point, solving the problem comes at too high a cost. A few messed up npm installs seem to be the lesser evil here.


If you assume that conventions don't work because people will just run whatever crap as root...

That's not really the issue, I think. Literally everything you install, however legitimate the source and however well-intentioned the people providing it, is "whatever crap" for the purposes of this exercise. What happened here could also have happened using just about anything else you installed on a typical Linux system today, whether from an official distro package repository, or some other source of packaged files, or side-loaded with one of those horrendous "Sure, I'll download your arbitrary script from the Internet and pipe it through sh as root to install your software without even checking it, as you recommend on your web site" things.

There is no reason that our systems should trust arbitrary installation scripts to do arbitrary things, whether they're running as root or not, but especially if they are. I'm stunned at the opposition I'm seeing from so many people on HN to the idea of making a system more secure, even while we're discussing a demonstrated, system-destroying bug in widely used software that was apparently unintentionally rolled out through at least one official channel when it wasn't ready.


This is a governance issue.

Build all your software into packages appropriate for the OS you use and then put them in a company repo. Install from there.

If you're just dumping whatever "stuff" you want on a machine in whatever location with no control, you're gonna have a bad time.


Unless you are going to systematically and reliably audit literally everything that any installer in any of those packages does as root, this is not a solution to the real problem; it's just an attempt to reduce the risk a bit.



