
But energy efficiency has been increasing at a fairly steady rate for much longer than the last 10 years.

Jevons paradox(1) has previously ensured that increased efficiency resulted in increased demand. What has changed now?

1: https://en.wikipedia.org/wiki/Jevons_paradox



I'm not sure that's entirely true, especially for domestic energy consumption. 10 years ago, I had a CRT TV, incandescent bulbs, and a washing machine that was probably about an F on the EC's ratings. At work, I used a computer with a big CRT screen (actually, maybe just an LCD by then) and a processor that used about 40W just sitting there. And 20 years before that, my parents also had a CRT TV, incandescent bulbs, and a similarly inefficient washing machine. These were likely a LITTLE more inefficient, but nothing much.

Now I have an LCD TV that uses well under half the power of a CRT, LED bulbs which use about 5% of the power of incandescents, and an A-rated washing machine that uses about half the power of the old-fashioned ones. At work I use a computer with an LED-backlit LCD monitor and a modern processor that idles at practically nothing.

Most people who had "all mod cons" 20 years ago are using less electricity today than they were then.

Electric cars are the one wrinkle, and I think they will drive electricity usage upwards. Other than that, though, domestic use is declining for the middle class, and commercial and industrial use are too in many cases. An awful lot of electricity usage 20 years ago was essentially waste, and we're getting quite good at eliminating that.


Electric cars create synthetic electrical demand, since they displace oil demand. It's still energy demand. (The switch from oil to coal vs solar vs hydro vs wind is interesting, but the top-line "electricity demand" number means very little.)


You're right that it is still energy demand, and you're right that it's substituting electricity for oil.

But I don't know what you mean by "synthetic" since charging an EV is very much a real load, which is exactly why utilities should be the biggest fans of EV's. It's a no-brainer... "hey utilities, you know how demand for your product has been stagnant or falling? Well here's a product that can steal gasoline's market share, makes up about a third of your consumers' consumption, AND is cheaper to operate for your consumers."

Utilities should be the biggest champions of EV's.


Home appliances have made tremendous advances in the last 50 years in terms of efficiency. A lot of this drive was simply to cut manufacturing costs. Refrigerators in particular have made considerable advances almost every decade.

In the last 30 years the regulations around water consumption have driven rapid development in dish and clothes washer technology which has also improved their electrical efficiency.

The efficiencies we see today in desktop and server CPUs were less intentional and more a by-product of performance goals. Modern Intel CPUs have their origins in the Pentium M of the early 2000s (the CPU at the heart of the Centrino mobile platform), a descendant of the P3. The NetBurst CPUs of the time eschewed efficiency for speed, so P4 chips ran very hot and inefficient; when Intel hit a ceiling with TDP and clock speeds, they had to re-evaluate the situation. With the Pentium M, Intel was trying to make the most power-efficient CPU it could, and concluded that the faster a CPU could complete its workload and power down, the more battery could be saved. The end result was a very powerful and power-efficient CPU. Intel found that with a few tweaks it could outperform NetBurst at a lower clock speed and with far less power consumption.


Just the switch to smartphones alone is incredibly impressive on efficiency. For average users + average US electricity rates, you're looking at $0.05 to $0.10 per month in cost.

Nationally, every hour people are using their phones, instead of watching TV, playing a gaming console, or using a traditional computer, is a big savings.


Simple explanation: Jevons paradox is an observed correlation that has held true for certain notable industries at certain periods of history, but is not a law of nature. Demand for anything is neither infinite nor infinitely elastic.

But more specifically in the case of electricity: there is no demand for electricity, and never has been. The actual demand is for what electricity can do. If the slope of the energy efficiency gains of electrical goods is steeper than the slope of the induced demand for those goods, there's no reason why more goods demand (in accordance with Jevons) shouldn't result in less electrical demand (seemingly contrary to Jevons).


But (manifestations of easily usable) energy should be the most vulnerable to Jevons effects, because it has so many potential uses that any increase in efficiency will flip some marginal case into profitability. Ditto for lower prices.


The problem is that as energy gets cheaper, it becomes an increasingly marginal part of the cost of whatever uses energy.

E.g., let's say you've got some good/service, of which energy is 10% of its lifetime operating cost. If you drop the cost of energy by 50%, then that's only a 5% cost decrease for the good. In order to generate a Jevons rebound from the perspective of the energy supplier, that 5% decrease in price would have to increase demand by more than 100%. Seems unlikely.
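A quick sketch of that arithmetic in Python, using the comment's own 10%/50% figures (the elasticity line at the end is my extrapolation, not the comment's):

```python
# Back-of-envelope Jevons rebound check (figures from the comment above).
energy_share = 0.10          # energy = 10% of the good's lifetime cost
efficiency_gain = 0.50       # energy use per unit of the good halves

# Halving the energy bill lowers the good's total cost by only 5%.
price_drop = energy_share * efficiency_gain

# Total energy demand = units sold * energy per unit. With energy per
# unit halved, units sold must more than double just to break even.
breakeven_demand_growth = 1 / (1 - efficiency_gain) - 1   # +100%

# The price elasticity of demand needed to get +100% sales from a
# 5% price cut: |elasticity| = demand growth / price drop.
implied_elasticity = breakeven_demand_growth / price_drop
print(price_drop, implied_elasticity)   # 0.05 20.0
```

An implied elasticity of 20 is extreme for any mature good, which is the "seems unlikely" in numbers.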

It was a different story back in Jevons' day, when energy was a genuinely significant component of the overall cost structure of a LOT of things. But in how many goods and services is that still the case? As the cost of energy decreases relative to other cost components (like labour), you'd expect the Jevons effect to produce diminishing and eventually negative returns, no?


You might want to reflect a bit more closely on what Jevons paradox actually is. It is not related to the cost of the final product.

Take air; it is essentially infinite and free. We use vast quantities of air in every aspect of our lives. We use it in cars, in households, in computing, we breathe it, we rely on there being a few km of the stuff above us to maintain pressure, etc. It is so abundant it gets a bit silly to talk about how essential it is to our way of life. Use of the stuff is almost certainly growing exponentially with population.

Believe it or not, but raw electricity is more useful than air. Humans would happily use vast amounts of electricity 24x7x365 our entire lives if it is available. As price goes down, usage should be spiking. Rather than Jevons paradox being broken, it is much more likely that some combination of

* Supply is tightening

* Real prices rising

* Something is badly wrong with the economy

is afoot.


> [Jevons Paradox] It is not related to the cost of the final product.

I don't believe that's correct. From the Wikipedia article:

"Goods and services generally use more than one type of input (e.g. fuel, labour, machinery), and other factors besides input cost may also affect price. These factors tend to reduce the rebound effect, making the Jevons paradox less likely to occur."

So the rebound effect is explicitly related to price elasticity, with the critical term being "price". If the cost of an input contributes less and less to the price of the final output, why would one expect it to have any effect?

> Humans would happily use vast amounts of electricity 24x7x365 our entire lives if it is available.

I really don't see any evidence for that statement (and indeed the parent article is providing clear evidence to the contrary). Humans want their needs attended to, and to the degree that electricity correlates to that, they'll have an appetite for electricity. To the degree that it doesn't, they won't. Nobody actually wants to use vast amounts of electricity simply for its own sake. Moreover, there are human desires that are inversely correlated with electricity usage, such as quiet, dark skies, etc. One would expect those desires to exert a downward pressure on electricity demand even if it were completely free.

It seems obvious to me that what is happening here is that in essentially any good or service you can name, non-energy costs (labour, materials, land, etc.) are becoming a much more significant component of price, both due to decreasing energy costs and increasing energy efficiency, as well as intrinsic rises in the cost of labour and materials. This in turn limits the ability for the cost of electricity to significantly stimulate or repress demand. Seems pretty straightforward.


Lack of breakthru innovation, IMHO.

The last invented device that is used by almost every household and consumes a non-negligible amount of electricity is the home PC, which will soon turn 40.

In the meantime, even older devices, invented earlier, such as fridges, washing machines, etc. became much more efficient.

Naturally, the average amount of consumed electricity goes down.

If someone invents a robot which will clean my apartment, do dishes, and iron my shirts, and double my electricity bill, I will buy this device tomorrow.

Unfortunately, for the last 30 years we have made no progress in inventing something that can consume a non-trivial amount of electricity and bring non-trivial value to an average household.


This is a good point.

There are however a number of innovations that are poised, or could potentially be poised, to increase electrical demand, from electric vehicles to interstellar laser propulsion. But in every case I can think of -- with one exception -- the cost of electricity isn't a significant barrier to their mass adoption. Which is why in addition to falling electricity prices not stimulating the consumption of existing goods, they also aren't stimulating the production of new goods.

The one exceptional use-case is crypto mining, which obviously does elastically respond to falling energy prices. But it's an unusual kind of demand in that in places where the electricity price is higher than some threshold value, demand for this will be zero. Also, fucking hell, what a stupid use-case. Can we think of nothing better to do with our resources than come up with new ways to put a price on scarcity?


I think cloud computing deserves a mention. An increasingly large part of each person's demand for computation is satisfied by servers in AWS, Azure, Google, etc. due to the increase in popularity and complexity of online services. Depending on whether the technology sticks, Blockchain (PoW) could also drive a significant increase in energy.


Good point. For decades, every few years there has been something new that requires a lot more power. Offhand I can think of e.g. heating, fridges, televisions, microwaves, cooling, PC desktops and consoles (over many generations).


If you could drive the unit costs of electricity low enough for it to be cheaper than natural gas for heating, you could see major growth from that.

Obviously, not likely to happen anytime soon, but if the cost-reduction progression of renewables continues, it could become viable in the medium-term.

Currently it's far more expensive to heat with electric than natural gas, so most people with access to it don't use electric.


> Humans would happily use vast amounts of electricity 24x7x365 our entire lives if it is available.

That assumption could be false, though.

I only want to fill my house with a certain amount of lumens before it starts hurting my eyes, and it only takes a certain amount of electricity to produce that many lumens. Maybe I could get a bigger house, but that ends up being limited by housing costs instead of electricity costs, so the cost of houses limits how much electricity I burn on lights.

Or maybe TVs? Sure, a bigger TV would eat more electricity, but I can't afford an infinitely large TV, so the price of TVs limits how much electricity I use. Computers are the same way TVs are.

I guess your second bullet point is true? Electricity costs are falling, but the total cost of doing stuff with electricity is going up because the gadgets are getting more expensive relative to how much power they pull.


Good point; perhaps it is not efficiency then, but the outsourcing of light/heavy industry (simply meaning that while TVA et al. demand is falling, Chinese power demand is growing even more rapidly), plus the fact that there is more supply in the form of unmetered local solar/wind generation, which looks like a fall in demand on the grid but actually isn't. Interesting.


Intuitively, that only applies when there's actually some pent up demand. At some point, availability of teacups really won't drive more demand.


Energy is the principal or a major secondary input into food, transportation, housing, consumer and industrial goods, et cetera. In other words, just about everything we spend money on. So unless we start seeing a major drop in the demand for currency, I don't buy this argument.


Everything we spend money on is being manufactured more power efficiently, is more power efficient on its own, and grid demand drops as adoption of new efficient gadgets and individual solar expands?

At home we went all LED lights. My office followed suit. Even my city swapped the night time street lights

Sadly we had almost every appliance in the house die in the first 4 yrs we owned it. But they’re all new efficient models now

Fewer desktops, more tablets and phones. Our big TV is only on 3-4 hrs a week (but it too is a more power efficient model)

A lot of effort has gone into battery storage and efficiency this last decade.

Little numbers add up to big ones real fast


A lot of energy is spent on things like transportation and heating though.

At some point, the rooms you occupy and the time you occupy them hits a ceiling. I'm not going to heat my room above room temperature, I'm not going to be in my room more than 24 hours in a day, even if energy efficiency made it as cheap as my current pattern of heating.

Similarly, there is a ceiling as to the amount of time I spend traveling in a certain day, I wouldn't drive to another country on a 5 hour trip on a daily basis for a better job, even if energy efficiency made it as cheap as my current travel pattern (1 hour a day).

That probably wasn't always the case, but there are indeed diminishing returns to spending money. Just look at Bill Gates' energy expenditure. It's waaaay higher than any of us in absolute terms, but relatively speaking a fraction of ours.

So I think the intuition you responded to is probably pretty solid. The other intuition I have, which is very much related, is that our gdp growth is increasingly service-oriented. i.e. it's not just that industries get more efficient, thereby seeing gdp growth and energy consumption decouple. It's also that the relative share of our gdp is possibly moving away from energy-intensive industries, which aren't really growing much at all, to low-energy intensive industries. With agriculture and manufacturing being quite energy-intensive compared to services industries like education, legal services, finance (ignoring proof of work) etc, a deindustrializing economy should also fuel a decoupling of energy consumption and gdp growth.


Finance consuming ever greater proportions of our GDP probably helped keep that in check https://www.bis.org/img/speeches/sp081119_g3.gif (if your rent doubles and home insurance goes up, your electricity consumption does not go up).

Similarly, offshoring of manufacturing likely muted the demand for electricity too.


I intuitively agree with you. However, the decoupling that's observed can partly be explained by lag. For example, a farm might have diesel-powered equipment, tractors, etc. that last 10 years. Eventually they might switch to electric equipment. However, innovation isn't stagnant, so it's holding consumption constant. What the electricity is used for is actually changing as new uses become economically viable.


Economic inequality means that the average person can't afford to keep consuming as much energy as in the past, meanwhile the wealthiest people have saturated demand, they have everything they can imagine wishing for.


> What has changed now?

Jevons Paradox is not an immutable natural law. It does not even have any predictive power. It's just an observation of an interesting thing that can happen, even though the naivest interpretation of microeconomics assumes it's impossible.


Just like Moore's law, which is also hitting a wall.


Moore's Law has been hitting - and going through - walls for the last 30 years.


True, but you can only handwave away the size of atoms for so long.


We are somewhat far away from being restricted by the size of single atoms.

A silicon atom is 0.2 nanometers across, and the smallest single transistors used in recent chips are on the order of 5-10 nanometers.

Also, using photons instead of electrons does allow you, almost literally, to hand-wave away the size of atoms. How exactly is unknown today.


Silicon lithography has to hit a wall way before single-atom sizes. For a start, silicon's semiconducting behaviour is a probabilistic effect and only happens in populations of atoms.

Photons at usable frequencies are much larger than atoms. Just notice that lithography is currently moving into what they call "extreme UV" (and most people would call X-rays) because the UV photons are much larger than the features of current top-of-the-line chips.


>For a start, silicon semiconducting is a probabilistic effect and only happen on populations

Is that a fundamental theoretical limit, like the uncertainty principle, or is it a matter of engineering advancements to figure out how these forces work? I can imagine carefully positioning two atoms (maybe moving them?) to obtain similar effects.


That leaves about 13 years left until Moore’s Law hits that 0.2 nm feature. Will be interesting to see how it pans out.


I would guess that Jevon's paradox doesn't exist for many domestic applications. Most people have all the lights, washing machines, etc that they need. Making lights more efficient won't make people buy more of them.


I would argue LED lighting does increase the demand for all sorts of lights rather a lot. It is due to increased efficiency, but that is not directly what creates the demand. The increased efficiency means light bulbs are no longer a fire hazard, allowing lights to be strewn just about anywhere and left on indefinitely.


Just because it’s LED doesn’t mean it doesn’t use any power at all, so it’s still a waste of energy.

Every light in our house is LED and I still teach my kids about not wasting energy and turning things off when not in use.

Maybe it’s because I grew up with incandescent bulbs, but I can’t stand the thought of random lights being left on for no good reason, even if it costs 40 cents a year to keep lit.

EDIT- ADDITIONAL THOUGHT

I was going to add that it’s not like I went around my house adding lights that didn’t exist, but as I think about it... The house I grew up in was built in the early 70’s and every room had a single ceiling light that had two bulbs in it. The hallways had 1 or two ceiling lights.

My current house, built two years ago has 8 can lights in the living room, 6 cans in the kitchen and 3 pendants over the island, 8 in the dining room, etc... So maybe you’re on to something with the thought of putting in a lot more lights than we used to.


> Maybe it’s because I grew up with incandescent bulbs

It is soooo that. When I bought my house 8 years ago I immediately had to replace some incandescent bulbs that shared a circuit with the microwave because the power consumption was tripping the breaker. And then I had to do the math to figure out when it was cost-effective to replace everything with LEDs. So I've become extremely cognizant of what things consume, and exactly what that costs...

Now I've got everything automated with LEDs and presence detection and schedules to turn things off when people aren't around or shouldn't be awake. Yet, lights needlessly being left on still bothers me.


> In the early 70's.

There were fewer lights in poor homes. The more affluent homes had suspended lights with multiple bulbs (I am not sure what they are called in English). I recall that a lot of homes got more light bulbs and ornaments through the century, as they became more affordable and more fashionable.

The increase in count came with a decrease in power. The typical 100-150W bulb got replaced by multiple 30-50W bulbs; they do not consume significantly more in aggregate.


> The more affluent homes had suspended lights with multiple bulbs.

I think these are called chandeliers. The etymology of chandelier is French.


> Every light in our house is LED and I still teach my kids about not wasting energy and turning things off when not in use.

Are you certain that it's wasting energy, though? You have to account for the time spent actually turning the light off, as well as for the psychic cost of having to think about whether to turn it off or on.

I'm reminded of one of my old offices, where some office busybody turned off the lights in the bathroom. They were fluorescent and took awhile to brighten up, so of course this meant that one would enter a darkened room, turn them on, and then slowly get some light. Was it a huge deal? No. Did it make our lives worse than just leaving the lights on? Yes.

> Maybe it’s because I grew up with incandescent bulbs, but I can’t stand the thought of random lights being left on for no good reason, even if it costs 40 cents a year to keep lit.

At 40¢/year, leaving it lit costs .11¢/day, or .0046¢/hour. At any reasonable rate for your time & mental energy, it makes sense to just leave it lit.
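The arithmetic behind that comparison, as a sketch (the 2-second flip and $10/hour valuation are my illustrative assumptions, not the comment's):

```python
# Breaking down the comment's 40¢/year figure.
cents_per_year = 40
per_day = cents_per_year / 365        # ≈ 0.11¢/day
per_hour = per_day / 24               # ≈ 0.0046¢/hour

# If flipping the switch takes 2 seconds and your time is worth even
# $10/hour (an assumption), the act itself costs about half a cent --
# roughly a hundred hours' worth of leaving the LED lit.
cost_of_flipping_cents = 2 / 3600 * 10 * 100
print(round(per_hour, 4), round(cost_of_flipping_cents, 2))
```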


> They were fluorescent and took awhile to brighten up, so of course this meant that one would enter a darkened room, turn them on, and then slowly get some light. Was it a huge deal? No. Did it make our lives worse than just leaving the lights on? Yes.

LEDs don't do this. At absolute worst, you'll have a fraction of a second before they light up at all, but once they light up, they light up immediately.


LEDs also work much better outdoors at low temperature.


That's true to an extent, but if you had 16000 lumens previously, using about 1000W, you probably don't _want_ ~100,000 lumens now, just because it uses the same amount of power. That's far too much light.


Agreed. Also, seeing as LED bulbs are ~10x more efficient per lumen than incandescents, people are not installing 10x as much brightness in their homes.


Perception of brightness is not linear; the human eye's response is not linear. So 10x the luminous flux (power output) will be perceived as brighter, but far from 10x brighter.
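One common way to put a number on this is Stevens' power law, with an exponent of roughly 1/3 for extended light sources (my assumption for illustration; the comment doesn't name a model):

```python
# Perceived brightness under Stevens' power law with exponent ~1/3
# (a common approximation for extended sources -- an assumption here).
flux_ratio = 10.0
perceived_ratio = flux_ratio ** (1 / 3)
print(round(perceived_ratio, 2))   # ≈ 2.15: 10x the flux looks only ~2x as bright
```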


k.

People are still not installing 10x as much light.


Not in their homes, but I can see many more historical and institutional buildings being illuminated, and more commercial use too. It may still not amount to 10x though.


Streets, offices and shops are now better lit.


Offices and shops used neon lights. They light as well as LEDs or better, but consume a lot of power.


Do you mean fluorescent bulbs? They do not use neon, but rather use mercury vapor. The UV light from the mercury excites a fluorescent layer on the inside of the tube, causing broadband emission of visible light.


In support of your argument, just about every damn device these days is covered in LEDs that run 24/7. Look around your house at night. Microwave, electric kettle, stove, phone chargers, smoke detectors, routers, modems, electric toothbrushes, and that's just what I remember off the top of my head.

Yes, not exactly the same as LED light bulbs, but illustrates the point. LED's are cheap and the power is a rounding error, so they go in everywhere.

(This is infuriating to me because I would like my house to be dark at night)


Electricity is not the only cost -- LED light bulbs are quite expensive, which puts a natural limit on how many I want to fit in my house.


I'm gonna have to call you on that one. Even high-end brand LED bulbs are so much more cost efficient than incandescent bulbs, it's the rational act to immediately replace nearly every single incandescent.

Whether you're interested in saving energy or saving money, you should remove and smash your incandescents.

Here's a blog entry I wrote with some of the simple math: https://blog.sense.com/articles/smash-incandescent-bulb-swit...


A bit less than two decades ago, when I was still living with other (post-graduate) students we got a flyer which announced a low price for Compact Fluorescent Lighting compatible with our rented house. We went into the living room where there was a whiteboard, and worked out the TCO of these more efficient lamps over their lifetime compared to just the ongoing cost of our incandescents (since we had these already we didn't need a total cost).

We then immediately set out to purchase an entire house full of CFLs. They were still going years later when we split up and each took our share, I still have one somewhere, maybe in my living room.
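That whiteboard comparison is easy to reproduce. A sketch with illustrative numbers (the comment gives none; the 60 W/14 W bulbs, $3 CFL price, $0.15/kWh rate, and 8,000-hour life are all my assumptions), ignoring the incandescents' sunk purchase cost as the comment notes:

```python
# TCO of a CFL vs. the ongoing cost of an already-owned incandescent,
# over the CFL's rated lifetime. All figures are illustrative.
hours = 8000           # assumed rated CFL lifetime
rate = 0.15            # assumed $/kWh

incandescent_kwh = 60 / 1000 * hours     # 60 W incandescent: 480 kWh
cfl_kwh = 14 / 1000 * hours              # ~14 W equivalent CFL: 112 kWh
cfl_price = 3.00                         # assumed purchase price

running_cost_inc = incandescent_kwh * rate       # $72.00
tco_cfl = cfl_price + cfl_kwh * rate             # $3.00 + $16.80 = $19.80
print(running_cost_inc, round(tco_cfl, 2))
```

Under these assumptions the CFL wins by roughly a factor of 3.5 even with its purchase price counted, which is why the house-wide swap was an easy call.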


Personally, the reason I don't run out and replace all my existing incandescents is because I don't want to waste perfectly good bulbs; I would rather let them reach EOL and then replace them. The energy used to create a good is usually significantly more than the energy the good itself uses. I'm OK paying a little more for the energy used by the incandescents when it's overall less wasteful.


More than 99% of the total lifecycle energy of a incandescent is in its use.

Let's just repeat that.

More than 99 percent of the total lifecycle energy of a typical incandescent bulb is in its use. That includes raw materials, manufacturing, and transport.

Source: https://energy.gov/sites/prod/files/2015/10/f27/lca_factshee...


> The energy used to create a good is usually significantly more than the energy the good itself uses.

Not when you're talking about a cheap consumable product which mostly converts power into heat. A 75W bulb is about $2.50/yr worth of energy consumption per hour of daily use at $0.09/kWh. An LED equivalent is +/- $0.35/yr at a cost of $3-$6.

Maybe 5%-10% of my home's bulbs weren't worth replacing out of cycle, but LEDs are so cheap and the overall savings so high that it's not even worth thinking about.
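Running the payback math on those figures (the ~10 W LED wattage is my assumption for the "$0.35/yr" equivalent; the rest are the numbers above):

```python
# Payback time for replacing a 75 W bulb, per hour of daily use,
# using the comment's figures: $0.09/kWh, LED priced at $3-$6.
rate = 0.09
inc_per_year = 75 / 1000 * 365 * rate      # ≈ $2.46/yr
led_per_year = 10 / 1000 * 365 * rate      # ≈ $0.33/yr (assuming ~10 W LED)
savings = inc_per_year - led_per_year      # ≈ $2.14/yr per hour of daily use

for led_price in (3.00, 6.00):
    print(round(led_price / savings, 1), "years")   # ≈ 1.4 and 2.8 years
```

So even a bulb used only one hour a day pays for its LED replacement in one to three years, which is why replacing out of cycle is usually still worth it.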


But I'm not concerned about the money, I'm concerned about the wastefulness. The existing bulbs are completely functional until they reach EOL, at which point they will be discarded into landfill. The less frequently things are replaced, the less waste ends up in landfill (and is consumed in initial production).


We can readily quantify how 'perfectly good' incandescent bulbs are no longer perfectly good in terms of energy costs, to which you might also add the environmental and health impacts of consuming more energy than you need to.

But quantifying the wastefulness of prematurely sending them to the landfill is much more difficult. We can suggest that LEDs' longer expected lifetime is another benefit to the environment, but that is equally challenging to quantify for the premature-replacement scenario.


I think you're looking at this wrong. Using an incandescent lamp until it burns out consumes far more resources than making that lamp did.


I think the grandparent is saying there's a limit to the number of additional efficient light fixtures one would add with promise of low runtime costs because of higher initial fixture, bulb, and dimmer costs for LED.


I haven't found an LED yet that works in my dimmable ceiling fixture, where I can dim it to the lowest possible level and get a pleasing light. The incandescent gives a very dim, warm orange glow which is great for movie watching or other mood-setting purposes. The LED still emits a rather harsh and not-as-dim light even if it's rated in the warm spectrum. Have you seen a solution for that?


For our hallway, rather than buying an LED lamp that's compatible with a dimmer, we bought one with built-in dimming. You flip the wall switch twice and it goes into a dim warm orange "nightlight" mode. Flip it twice again and it goes into full brightness daytime mode.


I believe some fixtures solve this by having two sets of LEDs; as you dim down the standard LEDs, the low-level LEDs kick in.

I've seen that in a few theatrical/architectural lights, no idea if it's available in the consumer market yet.


LEDs are terrible with dimmers. They flicker, and they have bad response curves to the dim-level. Yes, there exist properly dimmable LEDs, but buying 5 different bulbs to find one that works negates the savings.


This isn't the LED's fault. It's the fault of the fact that dimmers are awful designs. They feed a nasty chopped-up waveform to the fixture that has RMS voltage proportional to the desired power output. Aside from its bad power factor, that's more or less okay for plain old line-voltage incandescent lights. For LEDs, several things go wrong.

First, LEDs want DC or high-frequency PWM, but the driver is given a nasty waveform at 60 Hz. So the driver needs to rectify that waveform to DC, filter it, and then decode the waveform to dim the LED as the user requested. This is a big hack and doesn't work all that well.

Second, modern power supplies aren't resistors. Supplying them with input power with rapid voltage swings is not so great.

Third, the dimmers themselves are usually wired in series with the bulb without access to a neutral wire, and incandescent dimmers expect to draw their own power using current through the bulb. Given that LED drivers are highly nonlinear, this makes it very awkward for the dimmer to power itself. "Dimmable" LEDs will intentionally leak some current to power their dimmers. High-end dimmers in newer homes will have a neutral connection.
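The "chopped-up waveform with RMS voltage proportional to desired power" can be made concrete. A numeric sketch (mine, not the commenter's) of how a leading-edge (triac) dimmer's firing angle sets the RMS voltage the fixture sees, using 170 V as the approximate US line peak:

```python
import math

# RMS voltage of a leading-edge dimmer's phase-cut sine wave, computed
# by numeric integration over one half-cycle. firing_angle = 0 means
# no dimming; larger angles cut more of the waveform.
def phase_cut_rms(v_peak, firing_angle, steps=100_000):
    total = 0.0
    for i in range(steps):
        theta = math.pi * (i + 0.5) / steps          # midpoint rule
        v = v_peak * math.sin(theta) if theta >= firing_angle else 0.0
        total += v * v
    return math.sqrt(total / steps)

# Undimmed: RMS = peak / sqrt(2), as for any full sine.
print(round(phase_cut_rms(170, 0.0), 1))             # ≈ 120.2
# Fired at 90 degrees: half the waveform's energy, RMS = peak / 2.
print(round(phase_cut_rms(170, math.pi / 2), 1))     # ≈ 85.0
```

An incandescent filament just averages this out; an LED driver has to rectify it and infer the intended dim level, which is the hack described above.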

If you can find a "ELV" or "reverse phase" dimmer and you have the wiring for it, you can at least avoid the inrush current problem.

An anecdote: I have a fancy computer power supply that buzzes rather loudly when the lights are on. This is presumably because the lights are drawing a big inrush current spike 120 times per second due to dimming.

To top it off, LEDs have a much higher frequency response than incandescent bulbs. This means that, unless you drive them with clean DC or a very high-frequency PWM input, they'll flicker. (Not the kind of flicker you're talking about -- that kind is the system malfunctioning. I'm talking about the kind where it flickers as designed, and someone waved their fingers and decided it was hard to notice.) There's a new standard, IEEE 1789, that describes what levels of flicker are likely to be harmful (causing low productivity, headaches, general crappy feelings, etc.) and what levels are very likely to be safe. Very few LED drivers meet this standard so far. California, at least, has imposed a less stringent but still helpful requirement for Title 24 compliance [1] for the last couple of years.

(This is extra nasty given that incandescent bulbs don't actually produce light proportional to the power with which they're driven.)

[1] It's Title 24 JA8.


Ok, but the dimmers worked fine with the incandescents. Your post gives a lot of reasons why the high-efficiency of LEDs comes at a cost of lower flexibility to adapt to important use cases.


Where do you find 100 watt equivalent LED bulbs in 2700K for $1/bulb?


You won't find LED's for $1/bulb, but the total cost of ownership over X years is massively in favor of the LED.

Also I simply find it convenient that LED's just. don't. die. I haven't had to change a single bulb since I upgraded ~4 years ago.


I had one out of around 50 die over ~3 years, which I'm sure is much better than I experienced with incandescents and CFLs.

Smart bulbs are so cheap now that I've obsoleted nearly all of my "dumb" LEDs.


Note that the lifetime of a bulb is quadratic, if not exponential, in its wattage.

A 100W bulb that's always lit in the living room will last six months, but a 30W bulb will do a decade.


I just saw some 2700K bulbs in Poundland (£1~=$1.38), at a selection of power levels. Not sure what the brightest was, might have been 60W equivalent?

I deliberately got a dimmer model because I wanted something to encourage my mum to sleep more; due to Alzheimer’s she has no idea what the time of day is any more, and has developed a fear of the dark and only sleeps with the light on.


You don't, not yet anyway. But if that 100W bulb is on 3 hours per day and you want a 1-year ROI, your replacement budget is probably in the $8-12 range.
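Sanity-checking that back-of-envelope: swap a 100W bulb for a ~15W LED equivalent running 3 hours/day. The LED wattage and electricity rates are assumptions, not figures from the thread:

```python
# First-year savings from replacing a 100W incandescent with an assumed
# ~15W LED equivalent, at 3 hours/day. Rates are assumed $/kWh values.
watts_saved = 100 - 15                        # assumed LED draw: 15W
kwh_per_year = watts_saved * 3 * 365 / 1000   # kWh saved per year

for rate in (0.10, 0.13):
    print(f"at ${rate:.2f}/kWh: ${kwh_per_year * rate:.2f} saved per year")
```

That works out to roughly $9-12 per year at typical US residential rates, consistent with the $8-12 budget above.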


I usually go to Ocean State Job Lot or Home Depot. There's a wide selection for $1-2/bulb. I like Cree or Philips as brands.


They certainly _were_ expensive, but these days reasonable quality LEDs seem to be a similar price to reasonable quality incandescents. You can still get very cheap incandescents for less than any LED, but you probably don't want them; they won't last.


My local utility has an online store for customers with amazing deals on LED lightbulbs (not as cheap as normal bulbs, but not as expensive as what you see at Home Depot). I've replaced every bulb in my house.


I've seen utility subsidized bulbs at Costco.


Prices are plummeting, though. I can get a software-dimmable, app-controlled LED bulb (Philips Hue) for cheaper ($10) than the serially replaced incandescents it replaces. It's to the point that I'm bummed I can't find any more standard-size bulbs to replace, and they're not making candelabra and other specialty Hue bulb types yet.


Hue and IKEA have E12s. The Hue E12s are available as white or color, but they're terribly expensive and never seem to go on sale. The IKEA E12 is only $7, dimmable but not color temp, and will pair directly to a Hue hub.


Philips does make Hue E12 bulbs in both white ambiance and color versions.


As I commented in another thread, your argument about people having all they need could be rephrased as a "lack of innovation".

The last thing we invented that consumes electricity is the home PC, and that was 40 years ago. Without that invention we probably would have observed declining demand much sooner, but hey, a large monitor and a good graphics card and the latest 3D shooter, this stuff rocks. Before we invented the home PC, people "had everything they needed" too; they were just unaware of 3D shooters, and your argument was as valid then as it is today.

For the last 20-30 years we have had no other inventions that convert electricity into value. And older inventions, such as fridges, have gotten more efficient.


> Making lights more efficient won't make people buy more of them.

Yes it does. I personally added a bunch of lights I would never have added if they were inefficient. Because lights are now so cheap to run (it's barely worth my time to turn them off, although I still do), I added a bunch of them.

I'm still spending less on lighting power than before though.


>> Most people have all the lights, washing machines, etc that they need.

If you think that, you need to visit an American suburb two weeks before Christmas.


> Jevons paradox(1) has previously ensured that increased efficiency resulted in increased demand.

I've always been confused by labelling as a "paradox" the observation that, given the common shape of demand curves, a lower price (all else being equal) leads to a greater market-clearing quantity; what else would you expect?

That being said, all else has not been equal over time. Note that what has dropped is consumption of utility-supplied electricity. Part of this may be a drop in demand for electricity (efficiency would seem to spur demand by delivering more utility per unit consumed, but it may suppress demand if it means the point of steeply diminishing returns is reached faster). But an important part is the deployment of customer-owned generation capacity, a direct substitute for utility-supplied electricity, which can be expected to hit consumption of utility-supplied electricity hard. And no one has ever labelled the effect "new perfect substitutes reduce consumption of the substituted good" a paradox.


Jevons paradox isn't really a paradox; it's just a description of differing results for differently shaped demand curves. If you have a relatively flat demand curve, then a small shift in price yields a large change in quantity demanded.

I think the price has gone down far enough that the demand curve has steepened. As consumers, there just isn't much we can do with a lot of extra electricity, so even if the price drops a lot, we're not going to consume that much more.
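Whether Jevons-style rebound happens depends exactly on that curve shape. With a constant-elasticity demand curve, total energy use rises with efficiency only when demand is elastic (elasticity above 1); a toy sketch, with made-up elasticities:

```python
# Toy model: demand for an energy *service* (light, toast) with constant
# price elasticity e. Higher efficiency lowers the effective price of the
# service; total energy used = service demanded / efficiency.
def energy_used(efficiency, elasticity, k=1.0, energy_price=1.0):
    service_price = energy_price / efficiency
    service_demanded = k * service_price ** -elasticity
    return service_demanded / efficiency  # = k * efficiency ** (elasticity - 1)

# Elastic demand (e > 1): doubling efficiency raises total energy use (Jevons).
assert energy_used(2.0, elasticity=1.5) > energy_used(1.0, elasticity=1.5)
# Inelastic demand (e < 1): doubling efficiency lowers total energy use.
assert energy_used(2.0, elasticity=0.5) < energy_used(1.0, elasticity=0.5)
```

A flat (elastic) curve gives the historical Jevons outcome; a steep (inelastic) one gives the flat-demand world the article describes.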


The article only refers to electricity demand in the US. World demand is still growing, and I think the energy system is most properly viewed as a global one.

I would be interested to know whether the embedded energy consumed by the US has been growing or shrinking. (To calculate this you would add in the energy used to make American imports.)


The Great Recession.

The current demand flatness is only temporary, especially as we usher in the world of electric cars and 3D printers.

There is a hungry beast lurking called "the data center". As data center counts increase in the future, we will also see a rise in demand.

Overall, my bet is Jevons paradox will stand the test of time.


- Data centers are ridiculously efficient. The latest generation of data centers from the big cloud providers have PUE in 1.07 territory, which would've seemed like pie-in-the-sky ludicrousness even 10 years ago. Every CPU cycle that's done in a data center instead of on someone's client machine or on-prem server is a decrease in power consumption.

- 3D printers don't use very much power, and in any case, while 3D printing is going to be an ever-bigger deal in manufacturing, the 3D printing at home fad has already crested and is fading away, like 3D televisions. There's not going to be one in every home.

- Electric cars will indeed increase power consumption, but mainly when they are plugged in for the night, where there is already a lot of wasted power from e.g. nuclear plants that can't scale down at night. The total increase in necessary generating capacity and infrastructure upgrades won't be as big as you think.
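For context on the first point: PUE (power usage effectiveness) is total facility power divided by IT equipment power, so a PUE of 1.07 means only 7% overhead for cooling, power conversion, lighting, etc. A quick comparison against a legacy facility at a PUE of 2.0 (the 1MW IT load is a made-up figure):

```python
# PUE = total facility power / IT equipment power.
def overhead_watts(it_watts, pue):
    """Power spent on everything except the IT load itself."""
    return it_watts * (pue - 1)

it_load = 1_000_000  # assumed 1 MW of servers, for illustration
print(round(overhead_watts(it_load, pue=1.07)))  # modern cloud data center
print(round(overhead_watts(it_load, pue=2.0)))   # legacy facility
```

At the same IT load, the legacy facility burns roughly 14x more power on non-compute overhead.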


> the 3D printing at home fad has already crested and is fading away, like 3D televisions

I too am skeptical of 3d printing but I don’t think you can write it off just yet.

It may be true that we’re seeing the crest of the 3d dot matrix thing. But if you include all “arbitrarily programmable assembly robots” then I think we’re just in the prelude.

The data structure for 3d printing is the “thingiverse” and I don’t think we’ve seen the end of that yet. I can imagine for example a small kitchen robot with reservoirs for some staple foods, and an app where you browse dishes that can be cooked for you. Or write your own.

I would consider that 3d printing.

Similarly 3d TV is alive and well in mobile VR. And it will see huge growth this year with the advent of $200 high quality stand-alone headsets.

You can argue these things are “not the same” but the vision is the same, only the form factor is different.


> Data centers are ridiculously efficient. The latest generation of data centers from the big cloud providers have PUE in 1.07 territory, which would've seemed like pie-in-the-sky ludicrousness even 10 years ago. Every CPU cycle that's done in a data center instead of on someone's client machine or on-prem server is a decrease in power consumption.

Do you know what the global energy consumption of data centers was in 2016? It was the same as that of global aviation. Think about it: the energy consumed by data centers equals that of all commercial planes flying worldwide. (source: the RealClearEnergy podcast)

They may very well be efficient, but that is a shitload of energy, and that is the point.


Energy consumption of data centers as of 2016 shows they are a major component of total energy consumption and therefore worth watching. It doesn't tell you how they will affect _growth_ in energy usage over time. You have to look at change over time, taking demand, supply, and efficiency into account together, in an analysis like this: http://www.datacenterknowledge.com/archives/2016/06/27/heres...


> Do you know what the global energy consumption of data centers was in 2016? It was the same as that of global aviation. Think about it: the energy consumed by data centers equals that of all commercial planes flying worldwide.

Does the global aviation cost comprise just the cost of flying planes, or does it include the cost of maintaining mainframes for flight bookings?

Also, how does the consumption of power by data centers compare to the consumption of power by all consumer PCs/workstations? That's kind of the comparison we're looking at here.


> "Data centers are ridiculously efficient. The latest generation of data centers from the big cloud providers have PUE in 1.07 territory, which would've seemed like pie-in-the-sky ludicrousness even 10 years ago. Every CPU cycle that's done in a data center instead of on someone's client machine or on-prem server is a decrease in power consumption."

Those PUE statistics are self-reported and aren't very accurate. I've seen them compiled, and they are more marketing/PR fluff than accurate measures of energy-conversion efficiency. Nobody in the actual field treats them as accurate.


Maybe the new things people would spend electricity on now that it's cheaper haven't reached the mass market yet. I'm thinking mostly of electric cars, because I live near Tesla, but I am sure there are other opportunities for greater electricity usage around the corner.


Are we doing less, though? Because of efficiency gains we may be doing more work with less electricity, so while the headline demand numbers are lower, the amount of work being done has increased.

I do know on a personal scale that LED lighting gives me far more light-hours for an incredibly lower bill. If anything, I leave lights on more now. I even ran over a thousand feet of Christmas lights because their cost was so much less than before (like 12 or 15 to 1).


The efficiency of the devices people use has been increasing, but the price of electricity has been rising over the same period, so efficiency from the consumer's perspective is not increasing.

The mechanism by which increasing efficiency would increase consumption is lower prices, and if that is missing, then consumption won't increase.


Jevons paradox probably only applies to some types of power usage. Industrial consumers will certainly consume more if it's cheap, but a more efficient toaster doesn't mean I'll buy two toasters, or toast more often. I already have more toaster capacity than I'd ever need.


People are getting older, and they use fewer resources overall as they age. They drive less as well.


This is interesting, especially in the areas TVA covers. Would be cool to correlate this usage by average age of the customer base.


>Jevons paradox(1) has previously ensured that increased efficiency resulted in increased demand.

You can only have so many devices and appliances.



