Electron apps tend to use a lot of memory because the framework favors developer productivity and portability over runtime efficiency.
- Every Electron app ships with its own copy of Chromium (for rendering the UI) and Node.js (for system APIs), so even a simple app starts with a fairly large memory footprint. It also means Electron essentially ships two instances of the V8 engine (the JavaScript engine, with its JIT compiler, used by both Chromium and Node.js), which just goes to show how bloated it is.
- Electron renders the UI using HTML, CSS, and JavaScript. That means the app needs a DOM tree, CSS layout engine, and the browser rendering pipeline. Native frameworks use OS widgets, which are usually lighter and use less memory.
- Lastly, the problem is the modern web dev ecosystem itself; it is not just Electron that prioritises developer experience over everything else. UI frameworks like React or Vue use things like a virtual DOM to track UI changes. This helps developers build complex UIs faster, but it adds extra memory and runtime overhead compared to simpler approaches. And obviously, don't get me started on npm and node_modules.
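To make the virtual DOM point concrete, here's a toy sketch of the idea (a simplified illustration, not React's or Vue's actual algorithm): the framework keeps a full JavaScript object tree mirroring the real DOM, builds a second one on every update, and diffs the two.

```typescript
// Toy virtual DOM. Two complete trees live in memory at once
// (old and new), plus the patch list -- that's the memory cost
// traded for developer convenience.
type VNode = { tag: string; text?: string; children: VNode[] };

function h(tag: string, text?: string, children: VNode[] = []): VNode {
  return { tag, text, children };
}

// Diff two trees and return a list of patch descriptions.
// A real framework would apply these to the live DOM.
function diff(oldNode: VNode | undefined, newNode: VNode, path = "root"): string[] {
  if (!oldNode) return [`create ${path} <${newNode.tag}>`];
  if (oldNode.tag !== newNode.tag) {
    return [`replace ${path} <${oldNode.tag}> with <${newNode.tag}>`];
  }
  const patches: string[] = [];
  if (oldNode.text !== newNode.text) {
    patches.push(`set text at ${path} to "${newNode.text}"`);
  }
  const len = Math.max(oldNode.children.length, newNode.children.length);
  for (let i = 0; i < len; i++) {
    const newChild = newNode.children[i];
    if (newChild) patches.push(...diff(oldNode.children[i], newChild, `${path}/${i}`));
    else patches.push(`remove ${path}/${i}`);
  }
  return patches;
}

const before = h("ul", undefined, [h("li", "a"), h("li", "b")]);
const after_ = h("ul", undefined, [h("li", "a"), h("li", "c")]);
const patches = diff(before, after_);
// patches: ['set text at root/1 to "c"']
```

A native toolkit updating a widget in place does none of this bookkeeping, which is part of why it tends to be lighter.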
Your comment raises the question of whether you have ever been to India. Most of those 20% are old people. K-12 education needs to be improved, but literacy is not a major problem. Also, India has the cheapest internet in the world.
In about 1930, Keynes wrote "Economic Possibilities for our Grandchildren" [1] wherein he wrote:
"I believe that this is a wildly mistaken interpretation of what is happening to us.
We are suffering, not from the rheumatics of old age, but from the
growing-pains of over-rapid changes, from the painfulness of readjustment
between one economic period and another. The increase of technical efficiency
has been taking place faster than we can deal with the problem of labour
absorption; the improvement in the standard of life has been a little too quick ...
We are being afflicted with a new disease of which some readers may not yet have heard the name, but of which they will hear a great deal in the years to come--namely, technological unemployment. This means unemployment due to our discovery of means of economising the use of labour outrunning the pace at which we can find new uses for labour."
While there's no guarantee that what Smith got wrong then maps onto today, it's a plausible outcome that "the jobs" won't simply disappear.
----
Keynes also speculated on what to do with newfound time as a result of investment returns on the back of productivity [1]:
"Let us, for the sake of argument, suppose that a hundred years hence we are all
of us, on the average, eight times better off in the economic sense than we are to-day. Assuredly there need be nothing here to surprise us ... Thus for the first time since his creation man will be faced with his real, his permanent problem -- how to use his freedom from pressing economic cares, how to occupy the leisure, which science and compound interest will have won for him, to live wisely and agreeably and well."
The modern FIRE movement shows that living at a dated "standard of living" for 10-15 years can free one from work forever. Yet that's not what most people do today. I would suggest that there are deeper aspects of human drive, psychology, and varying concepts of "morality" that are actually bigger factors in what happens to "jobs".
B would require a fairly large shift in approach, since currently the primary way we interact with the cloud is via browsers, which are probably the biggest single consumers of client memory.
Part of the reason programs use so much memory is optimization, but of a different kind. Memory is fast-ish, so if you know (or think) you'll need X, Y, and Z anyway, just load them into RAM. And if you think you might need them later, don't bother unloading them. Just keep them around.
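That load-it-and-keep-it strategy is basically process-lifetime memoization. A minimal sketch (`loadAsset` here is a hypothetical stand-in for any expensive disk/network/decode step):

```typescript
// Trade memory for speed: load expensive assets once, never unload.
const assetCache = new Map<string, Uint8Array>();
let loadCount = 0;

// Hypothetical expensive load: disk read, decompression, parsing...
function loadAsset(_name: string): Uint8Array {
  loadCount++;
  return new Uint8Array(1024);
}

function getAsset(name: string): Uint8Array {
  let asset = assetCache.get(name);
  if (!asset) {
    asset = loadAsset(name);
    assetCache.set(name, asset); // keep it around "just in case"
  }
  return asset;
}

getAsset("sprites.png");
getAsset("sprites.png"); // second call is a pure Map lookup
// loadCount stays at 1; the cost is that the buffer stays
// resident for the life of the process.
```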
Garbage collectors also use similar strategies. Collecting garbage is expensive, so just don't until you need to. The extra memory usage in this case isn't a downside, it's an upside. Your code runs faster.
That's how Java and .NET achieve impressive results in some benchmarks, like within 50% of native: they're not collecting garbage, and their bump-pointer allocators are actually faster than malloc.
If you've ever run a Java program at a consistent 90% heap usage, you'll notice it absolutely grinds to a halt: orders of magnitude slower, because the GC has to run constantly to reclaim what little headroom is left. Naturally, this isn't highlighted in benchmarks, but it illustrates the power of allocating more memory.
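The same memory-for-GC-time bargain shows up in JavaScript land too. When allocation churn starts triggering GC pauses, a common hand-rolled fix is an object pool: deliberately hold on to memory so the collector has nothing to reclaim (a sketch of the pattern, not a benchmark):

```typescript
// Pre-allocate and reuse objects instead of letting the GC
// reclaim them -- steady memory use in exchange for fewer
// allocations and less collection work.
class Vec2 {
  x = 0;
  y = 0;
}

class Vec2Pool {
  private free: Vec2[] = [];

  acquire(): Vec2 {
    // Reuse a pooled object if one is available.
    return this.free.pop() ?? new Vec2();
  }

  release(v: Vec2): void {
    v.x = 0; // reset state before reuse
    v.y = 0;
    this.free.push(v); // keep it around instead of freeing it
  }
}

const pool = new Vec2Pool();
const a = pool.acquire(); // freshly allocated
pool.release(a);
const b = pool.acquire(); // same object comes back, no new allocation
const reused = a === b;
```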
They did that. But when the AI craze hit, it turned out that all the 8GB base-model Macs didn't have enough memory for even basic models (in addition to the 1-2 Electron apps you can run simultaneously).
Of course, it seems like local AI is more or less a flop, at least in the consumer market?
But still, IMHO, even for general use, macOS with 8GB is almost unusable unless you use it like an iPad.
8GB is unusable, but are macOS and Safari optimizable? The point is that they control the stack, so they could reduce memory usage. It would be a big selling point; it could make a Mac, experience for experience, price-competitive with PCs.
My Macbook Pro 13" Early 2015 w/ 8 GB RAM and 128 GB SSD is still very usable for what most people commonly use a laptop for - browsing the web, e-mail and streaming.
We may require high-sugar food to be labeled like cigarettes, maximum portion sizes (the largest drink could be 500 ml), higher taxes on it, advertising campaigns against it, bans in schools, and bans on advertisements in children's programs/movies.