Unfortunately I have to disagree. The JavaScript engine still has to parse and interpret the code. That can take a noticeable amount of time, especially on mobile devices.
Parse time shouldn't be very long at all, especially compared to network latency. How long does it take for a large bundle (~100-150 KB minified) to be parsed and interpreted? I'd be surprised if it's more than 100 ms.
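If you want to ballpark this yourself, here's a rough sketch that times parse plus evaluation of a synthetic script body about the size of a mid-sized minified bundle. It assumes a modern browser console or Node.js, where `performance` is a global; the generated `source` string is a stand-in, not a real bundle.

```javascript
// Build ~200 KB of plausible JS: 5000 small function declarations.
const lines = [];
for (let i = 0; i < 5000; i++) {
  lines.push(`function f${i}(x) { return x * ${i} + 1; }`);
}
const source = lines.join("\n");

const t0 = performance.now();
new Function(source)(); // forces a full parse plus one top-level execution
const t1 = performance.now();
console.log(`parse+execute: ${(t1 - t0).toFixed(1)} ms`);
```

Numbers will vary a lot by device; a low-end phone can be an order of magnitude slower than a desktop on the same source.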
It's barely above the threshold for detectable live-feedback latency from a musical instrument or sight/sound correlation. For a user viewing a page with plenty of other latency, it's inconsequential, particularly since you can execute your JS while image assets are still incoming.
Maybe, but in terms of understanding the code and having total control over it, it means a lot.
(And I don't buy the claim that large teams can't adopt home-grown code in the first place. If they're any good, they can. If they're not, then even their React and Angular skills will be bad.)
Besides, you know all your business-logic code? That will be custom too anyway, and they'll have to adopt it...
It's not so much the time over the wire to download, but the time to parse and run all that JS. If you open up your web dev tools, I think you'll see the impact of a 50 KB gzipped JS file is probably greater than that of a 500 KB image. For one thing, the site can probably function without the image loaded, but your page will not be responsive to the user until that JS has finished executing.
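One way to see the download side of this in dev tools is the standard Resource Timing API; a minimal sketch, meant for a browser console. Note it only shows network time: the parse/execute cost on the main thread happens after `responseEnd` and is what actually blocks interactivity.

```javascript
// List every script resource the page fetched and its download duration.
const scripts = performance.getEntriesByType("resource")
  .filter((e) => e.initiatorType === "script");
for (const e of scripts) {
  console.log(e.name, `download: ${(e.responseEnd - e.startTime).toFixed(0)} ms`);
}
```

Comparing these durations against the Performance panel's scripting time for the same files is usually what makes the "parse cost > transfer cost" point obvious.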
My total image payload is 30 KB. Obviously getting there is more challenging for other sites, but at the end of the day, if your page is over 1 MB I'm probably not going to wait for it to load.
What's more relevant at Facebook's scale is dollars saved per kilobyte shaved off, because the majority of its users will tolerate almost anything at this point, including apps that crash randomly to measure loyalty.