It's primarily just a statement of widely agreed principles that everyone has always claimed to follow, even when UNIX was new, and that others arguably followed with more success. For example:
> The Unix philosophy emphasizes building simple, compact, clear, modular, and extensible code that can be easily maintained and repurposed by developers other than its creators
Nobody says "we want to build complicated, sprawling, unclear, unmodular and rigid code", so this isn't a statement that sets UNIX apart from any other design. And if we look at the competing space of non-UNIX platforms, we see that others arguably had more success implementing these principles in practice. Microsoft did COM, which is a much more aggressive approach to modularity and composable componentization than UNIX. Apple/NeXT did Objective-C and XPC, which is somewhat similar. Java did portable, documented libraries with extensibility points better than almost any other platform.
Many of the most famous principles written down in 1978 didn't work and UNIX practitioners now do the exact opposite, like:
• "Make each program do one thing well. To do a new job, build afresh rather than complicate old programs by adding new features", yet the --help output of GNU grep is 74 lines long and documents 49 different flags.
• "Don't hesitate to throw away the clumsy parts and rebuild them", yet the last time a clumsy part of UNIX was thrown away and rebuilt was systemd, which provoked outrage from people claiming to defend the UNIX philosophy.
About the only part of the UNIX philosophy that's actually unique and influential is "Write programs to handle text streams, because that is a universal interface". Yet even this principle is a shadow of its former self. Data is exchanged as JSON but immediately converted to/from objects, not processed as text in and of itself. Google, one of the world's most successful tech companies, bases its entire infrastructure around programs exchanging and storing binary protocol buffers. HTTP abandoned text streams in favor of binary.
Overall the UNIX philosophy has little to set it apart other than a principled rejection of typed interfaces between programs, an idea that has few defenders today.
The philosophy gets worshipped almost like a cult, and yet most UNIX systems hardly follow it, as the endless man pages full of options for most tools make clear.
And "Worse is Better" follows some good design principles, but in a twisted way: the program is designed to minimize the effort its programmer needs to write it.
Implementation simplicity meant one important thing: Unix could be developed and iterated on quickly. When Unix was still new this was a boon and Unix grew rapidly, but at some point backward compatibility had to be maintained, and we were left with a lot of cruft.
Unfortunately, since implementation simplicity and development speed nearly always took precedence over everything else, this cruft can be quite painful. If you look at the C standard library and the traditional Unix tools, they are generally quite user-hostile. Simple tools like "cat" and "wc" are simple enough to stay useful, but most of the tools have severe shortcomings, either in their interface, in missing critical features, or in their entire design. For example:
1. ls was never really designed to output directory data in a way that can be parsed by other programs. It is so bad that "Don't parse ls" became a famous warning for shell script writers[1].
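To make the "Don't parse ls" warning concrete, here is a minimal sketch (my own example, not from the linked reference) showing how word-splitting ls output miscounts a filename containing spaces, while a plain shell glob gets it right:

```shell
# Scratch directory with one awkward filename.
dir=$(mktemp -d)
touch "$dir/plain.txt" "$dir/name with spaces.txt"

# Broken: word-splitting the output of ls treats each *word* as a file,
# so "name with spaces.txt" is counted three times.
count_broken=0
for f in $(ls "$dir"); do
    count_broken=$((count_broken + 1))
done
echo "ls-based loop saw $count_broken 'files'"   # 4, not 2

# Correct: let the shell glob expand to the real filenames.
count_glob=0
for f in "$dir"/*; do
    count_glob=$((count_glob + 1))
done
echo "glob-based loop saw $count_glob files"     # 2
```

Filenames may even contain newlines, so no amount of post-processing makes ls output reliably parseable; globs (or find -print0) avoid the problem entirely.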
2. find has a very weird expression language that is hard to use or remember. It also never really heard about the "do one thing well" part of the Unix philosophy and decided that "be mediocre at multiple things" is a better approach. Of course, finding files with complex queries and executing complex actions as a result is not an easy task. But find makes even the simplest things harder than they should be.
A good counterexample is "fd"[2]. You want to find a file that has "foo" somewhere in its name in the current directory and display the path in a friendly manner? fd foo vs find . -name '*foo*' -printf '%P\n'. Want to find all .py files and run "wc -l" on each of them? fd --extension py --exec wc -l (or "fd -e py -x wc -l" if you like it short), while find requires you to write find . -name '*.py' -exec wc -l {} \;. I keep forgetting that and have to search the manual every time.
Oh, and as a bonus, if you forget to quote your wildcards for find they may (or may not!) be expanded by the shell, and end up giving you completely unexpected results. Great foolproof design.
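The find invocations above can be tried directly; here is a runnable sketch (my own setup, with the fd spellings left as comments since fd may not be installed):

```shell
# Scratch directory with a couple of .py files and a decoy.
dir=$(mktemp -d); cd "$dir"
touch a.py b.py notes.txt

# fd -e py -x wc -l            # the fd spelling
# The find spelling: note the quoted glob and the escaped semicolon.
find . -name '*.py' -exec wc -l {} \;

# The quoting trap: an *unquoted* wildcard is expanded by the shell
# before find ever runs, but only if it happens to match files in the
# current directory. Here it would become "find . -name a.py b.py",
# which is a find syntax error:
# find . -name *.py
```

Whether the unquoted form works thus depends on what happens to be in your current directory, which is exactly the "may (or may not!)" failure mode described above.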
3. sed is yet another utility that is just too hard to learn. Most people use it mostly as a regex find-and-replace tool in pipes nowadays, but its regex syntax is quite lacking. This is not entirely sed's fault, since it predates Perl and PCRE, which set the modern standard for regular expressions that we expect to work more or less the same everywhere. But it is another example of a tool that badly violates the principles of good design.
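A quick illustration of the dated dialect (my own example): sed's default "basic" regular expressions have no PCRE shorthand classes like \d, and repetition must be spelled out, unless you remember to ask for extended syntax with -E:

```shell
# Portable BRE spelling: "+" has to be written as "[0-9][0-9]*".
echo "order 1234" | sed 's/[0-9][0-9]*/N/'   # -> order N

# ERE via -E is closer to what modern regex users expect.
echo "order 1234" | sed -E 's/[0-9]+/N/'     # -> order N
```

Perl-style classes such as \d simply aren't part of either dialect, which routinely trips up people coming from any modern regex engine.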
The Unix Haters Handbook is full of many more examples, but the reality is that Unix won because other OSes could not deliver what their users needed fast enough. Unix even brought some good ideas to the mass market (like pipes) even if the implementation was messy. We now live under the shadow of its legacy, for better or worse.
But I don't think we should idolize the Unix philosophy. It is mostly a set of principles ("everything is a file", "everything is text", "each tool should do one job", "write programs to work together") that were never strictly followed (many things in UNIX are not files, many commands do multiple jobs, and most commands don't interact nicely with each other unless all you care about is opaque lines). But most importantly, the Unix philosophy doesn't extend much beyond designing composable command line tools that handle line-oriented text for power users.
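That narrow sweet spot is real, though. The classic word-frequency pipeline (a textbook example, not from this comment) works precisely because every stage agrees on the one convention "one line = one opaque record":

```shell
# Count word frequencies: split into one word per line, sort so
# duplicates are adjacent, count runs, then sort by count descending.
printf 'to be or not to be\n' \
  | tr ' ' '\n' \
  | sort \
  | uniq -c \
  | sort -rn
```

The moment the records need any structure beyond a line of text (fields with spaces, nesting, types), the composition story falls apart, which is the limitation being pointed out above.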
https://web.stanford.edu/class/archive/cs/cs240/cs240.1236/o...