And anyone implementing numerical algorithms is thankful for the tremendous amount of thought put into the fp spec. The complexity is worth it and makes the code much safer.
imo they were wrong almost as much as they were right. -0.0, the plethora of NaNs, and having separate Inf and NaN all make life a lot more annoying for people writing algorithms, for very little benefit.
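For anyone who hasn't hit these quirks directly, here is a minimal Python sketch of the behaviors being complained about (Python floats are IEEE 754 doubles, so they show through unchanged):

```python
import math
import struct

# Signed zero: -0.0 compares equal to 0.0, yet the sign is observable.
assert -0.0 == 0.0
assert math.copysign(1.0, -0.0) == -1.0   # copysign sees the hidden sign bit

# The plethora of NaNs: any double with all-ones exponent bits and a
# nonzero mantissa is a NaN, so there are trillions of distinct NaN
# bit patterns. Two of them, built by hand:
nan1 = struct.unpack("<d", struct.pack("<Q", 0x7FF8000000000000))[0]
nan2 = struct.unpack("<d", struct.pack("<Q", 0x7FF8000000000001))[0]
assert math.isnan(nan1) and math.isnan(nan2)
assert nan1 != nan1                       # NaN is unequal even to itself

# Inf and NaN are separate concepts: overflow yields Inf,
# invalid operations (like Inf - Inf) yield NaN.
assert math.inf > 1e308
assert math.isnan(math.inf - math.inf)
```

The self-inequality of NaN in particular is what breaks naive sorting and equality checks in numerical code.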
There was actually no "thought" put into the IEEE spec as such. It was largely a codification of the design of the Intel FPU (only one of many, very different implementations of FP units pre-standardisation). Thought went into that implementation, but the "standard" mostly just codifies that design.
It has many many warts, and many design choices were made given the constraints of hardware of that time, not by considerations in terms of a standard.
Perhaps the solution is to rethink the role of the fridge in the kitchen. It could be designed to be part of a kitchen island, or have cabinets placed above it. In conventional kitchens, a chest does not make sense. But it could be well integrated if we start with the assumption that the fridge will be a chest.
My house is plastered, and it is substantially more soundproofed than the drywalled houses in the neighborhood. It is not a function of the construction method, since my house is stick framed just like my neighbors'.
Tan isn't exactly non-violent either, so I'm even more confused by the tangent:
>He once tweeted that seven of the city’s supervisors — all progressives — should “die slow, motherfuckers” in a late-night polemic. The tweet, which Tan said was a joke, prompted hateful mail and police reports.
I have had a similar experience; my preferred material to work with is wood. However, as I got more into tinkering with electronics and vintage computing, I'm finding more instances where wood does not achieve a sufficient strength-to-weight ratio, especially for small parts where wood grain and anisotropy become significant factors to consider.
I'm a bit awestruck. Was there any discussion about it among your peers? We might be a generation or two apart; I saw that video when I was not yet an adult, and it might have been literally part of my introduction to the person that is Richard Stallman. It definitely wasn't a good first impression.
When sorting eigenpairs of a dense matrix, usually you end up with a Schur decomposition. The basic operation you can do is swap two adjacent eigenvalues on the diagonal, so bubble sort is a natural candidate.
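A hedged sketch of that idea: when the only primitive available is "swap the eigenvalues at positions i and i+1" (the constraint the Schur form imposes; LAPACK's reordering routines are built on such adjacent swaps), sorting reduces to bubble sort on the diagonal. The `swap_adjacent` helper below is a stand-in that just exchanges two list entries; a real implementation would apply the corresponding orthogonal similarity to the Schur factors T and Q.

```python
def swap_adjacent(diag, i):
    # Stand-in for the real primitive: in an actual Schur reordering this
    # would apply a small orthogonal similarity transform to T and Q;
    # here it simply exchanges the two adjacent "eigenvalues".
    diag[i], diag[i + 1] = diag[i + 1], diag[i]

def sort_eigenvalues(diag, key=abs):
    # Bubble sort: only adjacent swaps are used, which is exactly the
    # operation available when reordering a Schur decomposition.
    n = len(diag)
    for end in range(n - 1, 0, -1):
        for i in range(end):
            if key(diag[i]) > key(diag[i + 1]):
                swap_adjacent(diag, i)
    return diag

# Example: sort eigenvalues by modulus.
print(sort_eigenvalues([3+0j, -1+0j, 2+0j]))  # → [(-1+0j), (2+0j), (3+0j)]
```

Bubble sort's O(n^2) swap count is usually acceptable here because each swap is cheap relative to computing the decomposition itself, and it moves eigenvalues the minimum total adjacent-swap distance.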
We are constantly losing technology as the treadmill of technological progress continues. Cassette tapes, CRT displays, and perhaps photographic film are some examples. One can argue that there are "strictly better" technologies available now, but there are always niche cases where the new and the obsolete technology are not quite fungible. What if for some reason a modern industry gets wiped out? Then we'd have to revisit the lost art.
As an immediate example, my wife's business needs p-channel small signal JFETs. These apparently are no longer fabricated, and with the way the semiconductor industry moves, they are likely never coming back in any appreciable quantity. So once the world's supply of obsoleted semiconductors dries up, the technology will basically be lost.
I don't understand why you believe Banach-Tarski to be obviously false. All that BT tells me is that matter is not modeled by a continuum since matter is composed of discrete atoms. This says nothing of the falsity of BT or the continuum.
All that BT tells me is that when I break up a set (a sphere) into multiple sets with no defined measure (which is how the construction works), I shouldn't expect the reassembled sets to have the same measure as the starting set.
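That resolution can be stated precisely (nothing here beyond the usual textbook formulation of the theorem):

$$
B = \bigsqcup_{i=1}^{n} A_i, \qquad
\bigsqcup_{i=1}^{m} g_i A_i = B, \qquad
\bigsqcup_{i=m+1}^{n} g_i A_i = B',
$$

where $B \subset \mathbb{R}^3$ is a solid ball, $B'$ is a congruent copy of it, and the $g_i$ are rigid motions. The "paradox" dissolves because the pieces $A_i$ are non-measurable (their construction requires the axiom of choice), so the additivity $\lambda(B) = \sum_i \lambda(A_i)$ simply never applies to them.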