Hacker News | thefre's comments

For your first point: because in most jurisdictions CP is absolutely forbidden, and typically you are not protected if you don't proactively act to eradicate CP on your service with reasonable diligence, whereas an upload of a film might or might not be problematic, given that it can be for private use or public distribution.

For the third point: who are you kidding? Technically this is problematic. In the real world, even representatives and grand jurors exchange ripped films; most judges probably do too. So this is technically a reason, but a highly hypocritical one.


> For your first point: because in most jurisdictions CP is absolutely forbidden, and typically you are not protected if you don't proactively act to eradicate CP on your service with reasonable diligence, whereas an upload of a film might or might not be problematic, given that it can be for private use or public distribution.

That's not at all the point, though. It would be one thing if MegaConspiracy had a policy of only "full banning" videos that were targets of DMCA complaints when it was clear to them that the content was actually infringing. Then they'd have a plethora of excuses (e.g., no time to sort through all the DMCA requests, "investigations" revealed it was unclear whether the material was infringing, etc.)

Instead what the judge and jury will see is that MegaConspiracy had a "permaban video" tool which they didn't even attempt to use.

> For the third point [MegaConspiracy distributing infringing material internally]: who are you kidding? Technically this is problematic.

Well, this is the best kind of "technically", given that it's exactly what they're being accused of! If the grand jury was as sympathetic to that as you think, then why did they hand down the indictment?

Either way it's not very hypocritical, given that some naïve users might be able to claim ignorance of the provenance of videos from the Internet, but MegaConspiracy went out of their way to establish a DMCA protocol, so they have no excuse for ignorance of the "safe harbor" provisions at all.


Change their government.


The problem is the combination of this evil clause and Apple's general attitude and evilness.

First, on the EULA: even if you used a program to do the formatting, said formatting is still your work, and the EULA recognizes that by using the term "work". So Apple is basically saying: if you use our "free" tool, you have to give us a part of your work. How exactly is that free? I don't know.

Then what happens if you use something else to do the formatting and then just convert using Apple's program (if that is possible)? Or, if you can't convert, redo exactly the same formatting, or something really close? Apple may argue that you breached the agreement even if you technically might not have.

And then what happens if they notice that you used another tool to publish on another platform, but the two versions are still too close for their taste? That's when Apple's general attitude and evilness will enter the game. They will say: we won't publish you, and we don't have to give any reason for that. You might say: yeah, that's the game, it's their right. Except for two things. First, the kind of people who make that objection typically also fake a taste for liberalism and free markets and so on (they are more probably thinking about something anarcho-capitalist in style, and how great it could be to make a lot of money screwing people all around you just because you can). I recognize that, given current laws, it is not obvious that Apple doesn't have the right to do that; they might even have it. But I find this way of doing business highly disgusting.

The second thing is that a digital publisher in the Apple style has nothing in common with a real publisher. Their marginal cost for publishing you is absurdly low, their margins are guaranteed, and they don't help you with your work (well, yeah, they kind of help you by "giving" you a program; in that case, I suppose buying a fancy pen gives its maker some right to the story you wrote with it? In any case, this "help" has nothing in common with the help of a real publisher).

It can all be summarized as: Apple is constructing a giant monopolistic market, because, well, it is obvious that monopolies work and bring in an absurd amount of money. And now every other giant multinational has found out how brilliant this is and is trying to do the same, even for PCs. They might even say these are not monopolies because there are multiple of them, and if they phrase it well, people might even believe them.

If this is the kind of world you want and like to live in, good for you. On my side, it makes me vomit.


> There's an essay somewhere I remember reading where an ITS hacker looks at Unix for how it handles the interrupted system call problem and comes away horrified at the discovery that it makes the user do it via EINTR.

Not really the main point of the essay, but yeah, it kind of is there. Now come up with a design (with similar premises; you can't just argue that the whole system call semantics has to change...) which does not involve the user, and we will talk.
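To make concrete what "involving the user" means here: every caller of a slow system call has to wrap it in a manual retry loop. A minimal sketch in C (the wrapper name `read_retry` is mine, not from the essay):

```c
#include <errno.h>
#include <unistd.h>

/* The manual-restart idiom EINTR forces on callers: if a signal
 * interrupts the blocking read(), it fails with errno == EINTR and
 * the caller has to reissue it. Every blocking call in the program
 * needs a wrapper like this (or an equivalent inline loop). */
static ssize_t read_retry(int fd, void *buf, size_t len)
{
    ssize_t n;
    do {
        n = read(fd, buf, len);
    } while (n == -1 && errno == EINTR);
    return n;
}
```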


Given that SA_RESTART already manages to perform that restart without the user's intervention, it is obviously pretty simple to come up with such a design: you just make SA_RESTART the default behavior. ;P


Have you ever tried to write portable code for multiple unixes?


Yes. Afterwards, I learned more about standards and specifications (including taking on a weird fetish for lurking on the mailing lists where people are actively working on them), and realized that my attempts were flawed by premise. ;P

Seriously, POSIX is /not/ able to provide for you the ability to have a single program always work on every system: they tried really hard, but the world isn't perfect. They were (and are) attempting to unify and control something really complex, and they made remarkable progress given how many people they were trying to bring together who already had existing incompatible implementations.

In this case, they carefully made clear that this behavior was unspecified. That does not mean that they failed or that their specification was broken. In fact, I'd argue the opposite: if they had required implementations to do something specific, the specification would be broken, as it would not have described reality. You can't just claim that the implementations of your standard that people are actively trying to code against don't exist or are incorrect.

Honestly, though, to take a more direct appraisal of your question, the real epiphany for me came when someone clubbed me over the head with the difference between "portable" and "ported", and then demonstrated that all of the people that had come before me whose work I most admired had concentrated on making "portable" code as opposed to "ported" code: the most amazing code I've ever seen is the code that has managed to easily be adapted to changing environments as it had the simplest design and most powerful abstractions from the underlying systems, often as a direct result of attempting to embrace so many unrelated platforms.

Which then leads to a "better" question: have you ever tried to write code that could easily be ported between multiple operating systems, whether they be any of the numerous implementations of Unix (old or new, BSD or System V, largely compliant or downright buggy), Windows (using native APIs, not compatibility wrappers), or Mac OS (9, not X)?

If not, I recommend trying it, as that is what "portability" really is: once you experience it for your own code, it is difficult to take projects that insist on only working on a single homogeneous set of environments seriously anymore.


So the next hot article will be "OMG a GNU/Linux distribution has a messy code base"?

Once you are talking about a multi-gigabyte system, of course it's messy. Except "messy" is actually a very poor choice of word to describe the situation. E.g., mixing multiple programming languages in a big system is nothing messy; it would even be a HUGE warning sign not to encounter multiple programming languages in such a huge system.


"""Once you are talking about a multiple GB system, of course it's messy. """

Only there's no need for a typesetting engine + layout templates + graphing, math, bibliography, etc. add-ons to ever be a "multi-gigabyte system".

It just carries too much legacy garbage.


> This dream lasted a few days until we discovered that TeX, the typesetting engine underlying LaTeX, isn’t written in C. TeX is written in WEB, Donald Knuth’s “literate” programming language.

Seriously, they discovered that after starting the project???


are you high?


> In your average company, a programmer is considered a glorified mechanic or janitor, a code-monkey if you will, well below the guys that "bring in the money" like sales and marketing. It is an expense, something that the company has to live with because someone has to implement the ideas that the guys in charge come up with to help the guys that "bring in the money" bring in more money.

It's worse than that. Even in the tech sector, the situation you describe sometimes happens. I have yet to fully analyze that kind of situation, but I've already come up with some criteria that might help identify such environments.

What is sad is that it's merely a reflection of the mainstream governance models at wide scale (and only worse: democracy is even less frequent in companies than in countries...), which have proven to be poor, and will prove to be catastrophic in the near future.

Of course, in the real world things happen on a continuous scale, but if by bad luck the environment you're in exactly fits the exaggerated description below, run!

1. A strictly tree-like, military-style hierarchy, ruling everything in strict tree order regardless of whether it's a tech, HR, or other issue. This is one of the most effective ways to waste talent and to make suboptimal decisions (not even close to optimal) -- add ignoring bottom-up proposals if you want the perfect mix to achieve a high level of ineffectiveness.

2. The hierarchical tree is strict enough that programmers typically can't even act spontaneously on new ideas, which often is not enough to kill a business, but just to impede it, so sadly the situation might persist.

3. Each programmer is considered "just another programmer" in the company, regardless of achievement, knowledge, or skill. The paradoxical part is that this does not mean individual requests aren't directed to the right people when needed; it just seems that few would acknowledge that expertise when they have no question to ask. Improvements in achievement, knowledge, and skill are neither fully exploited nor, often, rewarded at all, directly leading to high turnover.

4. Little recognition of technical accomplishments for those who actually do them, at any level (that can go as far as considering programming mainly as a cost center that needs to be beaten into submission to behave, and preventing it from getting helpful resources).

Lesser versions of these situations exist, leading to more effective tech companies that are more programmer-friendly. The two often somehow go together. At the end of this better path, you have some companies like MS and Google. (Not exempt from problems, but they arguably are not that bad.)


I think this comic comparing the org charts of various tech companies is relevant to your analysis. :)

http://www.bonkersworld.net/2011/06/27/organizational-charts...

