> I think in general adults should be able to make up their minds about ingesting chemicals that might hurt them. However, it makes sense to make an exception -- if the chemical has addictive properties and empirically is likely to have network effects (i.e. in practice it tends to spread through the community), it seems reasonable for the government to step in and regulate the substance in more substantial ways than enforcing disclosure.
Does it? What result are you trying to produce by this legislation?
If you're trying to produce the result that fewer people use X harmful substance, then it absolutely does not make sense to regulate the substance itself. Evidence suggests that regulation merely makes selling the substance more profitable and buying it more "edgy", making the substance more dangerous. Instead, it makes sense to provide better programs for treatment and education around the substance, which results in lower usage, higher recovery rates when someone does use it, and fewer incentives for people to traffic it.
But if you're trying to produce profits for prison corporations and further militarize law enforcement, by all means, let's make all substances illegal.
> However, by far the most infuriating (and one I run into frequently in my line of work, hence the anger) is when you are trying to get the language running on a platform for which binary bootstraps do not yet exist.
> Portability matters. If you want your language to be useful and available to as many people as possible, why would you seek to artificially limit the number of platforms it can be built on, just so you can avoid writing the bootstrap in C? I'm sure there is some amount of pride on the part of the language author when their language can bootstrap itself, but it certainly isn't a pragmatic decision.
This problem is easily solved by having a rule that each new version of the compiler must be buildable by the previous version. The first few versions are written in C, and once the compiler is self-hosting, each new version is compiled by an older one. This gives you a path from C all the way to the current version of the language.
In practice, this happens very naturally, because it's how compilers are usually written. Assuming you have version control and the first versions of the compiler are written in C, you usually have the ability to bootstrap up from C. The only thing missing in many projects is documentation and tooling for that process.
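The chain described above can be sketched in a few lines. This is purely illustrative: the version numbers and function names are hypothetical, and the "build" steps are stand-ins for invoking a real compiler.

```python
# Hypothetical sketch of a bootstrap chain: version 0.1 is written in C
# and built by the system C compiler; every later version is built by
# the version immediately before it. All names are illustrative.

def build_stage0():
    """Build the first compiler, written in C, using the system cc."""
    return "compiler-0.1"

def build(version, host_compiler):
    """Build `version` of the self-hosted compiler with `host_compiler`."""
    return f"compiler-{version}"

def bootstrap(versions):
    """Walk the chain from C all the way up to the newest compiler."""
    compiler = build_stage0()
    for version in versions:
        compiler = build(version, compiler)
    return compiler

# Each release only needs to compile with the release before it, so any
# platform with a C compiler can replay the whole chain from history.
print(bootstrap(["0.2", "0.5", "1.0"]))  # compiler-1.0
```

With version control, replaying this chain is just checking out each tagged release in order and building it with the previous stage's output.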
I'm fine with cracking down on ISIS propaganda as long as we also crack down on the propaganda of hawks in the West. ISIS is bad, no question there. But the US invasion of Iraq has caused more deaths than the fighting with ISIS. We need to stop pretending that these actions are different. Killing people is killing people even if it's done by militaries at the orders of men in suits.
> It's debatable. Do you stand aside and look the other way and say, that's their lot in the world, or do you do something.
> It's not an easy call. If you do nothing, people die. If you try to do something, people will die. The idea is to pave a better path to the future.
This was the propaganda used to excuse invading Iraq, but it's incredibly naive to believe it.
> Or even today, do we say, forget Syria. Seal it up, let them sort things out and condemn a whole country's population to great suffering?
That's exactly what we'd be doing if they weren't sitting in the way of fossil fuel transportation routes.
It's ridiculous to claim that we're doing this to reduce the suffering of people in Syria. The US has supplied Israel with bombs that have been used to kill Syrian civilians for more than half a century, yet suddenly we care about the suffering of Syrians now? Yeah, right.
So are you saying countries should retreat from Syria and let things fall where they may, or are you saying we should get more involved?
Of course countries act selfishly. It's crazy to think otherwise. Everyone acts selfishly. Europe, the Middle East, etc. want stability in Syria for selfish reasons, but coincidentally it's a good thing.
> So are you saying countries should retreat from Syria and let things fall where they may, or are you saying we should get more involved?
I'm saying that our current involvement in Syria is worse than not being involved, and talking about Syria as if "helping" is even an option our leaders are considering is naive, and buying into the propaganda.
I can see a number of ways we could be involved positively in Syria, but we're not, and we're not going to be.
I dunno, but I've read some of the books from this year's list and I liked them. Especially Mindset: The New Psychology of Success, although I think it was a bit longer than it needed to be. It's a high value book, but I got most of the value from the first chapter.
Graph databases are exciting, but I'm far more interested in the potential of append-only stores. Rather than storing the current state of the data, you store the events that produced it (item added, item deleted, etc.).
This gives you auditability and lets you look back at the state of the data at any point in time, but the largest benefit, in my opinion, is that it decouples the data from its data structure. You can treat data structures as "caches" that are efficiently structured for how they will be used. You don't have to choose between relational databases, graph databases, or anything else: you can replay the same set of events into different structures and route each query to whichever structure suits it. It also lets you implement security at the storage level in a simple, granular way: you can reject events based on predicates that update themselves as permissions change, and distribute filtered streams of events to users based on what they are allowed to see. Overall, this approach is very powerful.
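A minimal sketch of the idea, assuming a toy in-memory log (all names and event shapes here are made up for illustration): the event log is the source of truth, projections are disposable "caches" built by replaying it, and a predicate guards what may be appended.

```python
# Toy append-only event store. Events are tuples: (kind, key, value).
# The log is the source of truth; every other structure is a projection.

log = []  # append-only event log

def append(event, allowed=lambda e: True):
    """Security at the storage level: a predicate can reject an event."""
    if not allowed(event):
        raise PermissionError(f"event rejected: {event}")
    log.append(event)

def as_key_value(events):
    """Replay the same events into a dict, tuned for point lookups."""
    state = {}
    for kind, key, value in events:
        if kind == "added":
            state[key] = value
        elif kind == "deleted":
            state.pop(key, None)
    return state

def as_history(events):
    """Replay into a full audit trail, tuned for 'state at time T'."""
    return list(events)

append(("added", "item1", 10))
append(("added", "item2", 20))
append(("deleted", "item1", None))

print(as_key_value(log))    # {'item2': 20}
print(len(as_history(log))) # 3
```

The same three events feed both projections; adding a graph-shaped or relational projection later needs no migration, only another replay function.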
> Not being able to find private space in your multi-million dollar open floor plan modern home is one of those good kind of problems to have. If you want to live in a home with private space it isn't exactly hard to do. Since when did twitter prevent you from keeping a diary if you so choose?
I think the article isn't saying we've lost the ability to choose privacy. It's saying that we've created a world where there are incentives to give up privacy, and very little consideration of what that choice means: sometimes we don't even comprehend that we're making the choice. Your own comment is an example:
> I do agree with the author that private space is important though. It's nice to have time to reflect without the worry of interruption or judgement. In these "open offices" it can be an actual problem; I seem to be the only person I know who would be fine with just a cubicle.
I don't see a cubicle as private at all. It doesn't prevent people from interrupting or judging me; it only allows me a little respite from distraction, and even that it does poorly. Contrast this with the concept of a study: a room that houses rarely have anymore, but in which much of the work of the world has been done.
Your very example of a "private" space demonstrates the ease with which one can give up privacy without even considering the choice.
> As far as I can tell, this prevents countries from demanding source code to a closed source product as a condition for selling into a country. Seems reasonable?
How could this possibly be reasonable?
Being able to look at the source code is a huge part of regulating code in life-and-death applications. Medical devices, cars, planes, train systems, etc. all need to be held to safety standards, and not allowing regulators access to the source code significantly inhibits their ability to do their job.
> If you want to audit a closed-source program for bugs and backdoors, people have been able to do this successfully hundreds (of thousands?) of times with standard tools like debuggers, disassemblers, and automated program analysis tools. The fact that a program is closed source has not stopped anyone from researching how it works. Ask ANY security researcher if it suddenly becomes impossible.
First of all, there are cases where you're just flat wrong. The Gauss virus carries an encrypted payload whose decryption key is derived from strings specific to the target system, so the payload can only be decrypted and executed on that system. It's impossible to audit the encrypted code without knowing the properties of the system it's intended to run on. Admittedly this is currently an unusual case, but given the incentives, it's unlikely to stay unusual. Hiding code like the defeat device in the Volkswagen emissions scandal[2] this way would present a large barrier to regulation, and it's not hard to imagine cases where it would make proving wrongdoing outright impossible.
Even in cases where an encrypted payload is not used, you cannot in good faith argue that the capability of auditing a binary is equivalent to the capability of auditing the source code. Yes, it's possible in most cases, but it's obviously far harder to audit compiler-generated, optimized, disassembled code than code written in, say, C. The skill and time required to audit a binary make it cost-prohibitive in most cases: in the Volkswagen case, the fraudulent behavior was discovered by a $50,000 study that black-box tested the cars by driving them.[3] My guess is that an audit of the C code could have revealed something like:
if(car_is_being_emissions_tested()) {
...
}
...and such an audit could have been done by a college junior working a $15/hour internship. As these ways of cheating regulation become more sophisticated, under-funded regulatory agencies will be unable to keep up. It may be mathematically possible to audit binaries effectively, but it is not a monetarily or organizationally effective way to do regulation.
In short, it's absolutely not reasonable to tie the hands of regulators by denying them access to the source code. Doing so is equivalent to giving up on regulating a large fraction of possible anti-regulation behaviors.
And this is only talking about regulation, where the binary being audited is produced by a company trying to skirt the rules. Even more concerns arise when the binary being audited might be partially created by a government. The NSA has well-known capabilities in this area, and we have every reason to believe that China (where many of the devices sold in the US are manufactured) has similar capabilities, or will in the near future.
As I said, pretty sure that disassemblers and debuggers and emulators and binary code analysis tools are a thing. I've made a living off deconstructing binary code in the past.
The Bill of Rights is a set of meta-laws, separated from the problems that laws solve by a level of indirection, so it has to be more abstract and general to cover all the laws that could be made. I don't think it makes sense to talk about the Bill of Rights and ordinary laws as if they were the same kind of thing.
Bob Black's essay "The Abolition of Work" offers an alternative to the "work versus idleness" dichotomy: work versus play. You can be paid for playing if your play produces value (and many forms of play do).
Pursue a career you like and that feels like play (DJing, skating, liberal arts, photography, music) and you won't have to work a day in your life (because nobody's hiring).
I realize you're joking, but in the context of this discussion, this shouldn't be mistaken for a valid point. We're talking about how to structure a society, and I think it makes sense to structure it in a way that allows people to play: maybe not all the time, but more and more as technology reduces the need to work. Continuing to push the idea that work is a value in itself, even as work provides less and less value, is going to become a bigger and bigger problem as the need for work decreases.
> Today's youth think the suburbs was about race now? It was about a lot of people don't actually like living in a concrete jungle, that was what it was about.
Ugh. You realize that people can do things for more than one reason?
For that matter, "people" isn't monolithic. One person can have a mix of reasons, and a different person can have an entirely different mix of reasons.