Hacker News

There's excitement because, as it says in the post, this is an 800% improvement. The Jetson TX2 (assuming that's what you meant) is a development board and not really comparable. That said, you're comparing raw performance with WebGPU, which has a big performance hit; raw vs. raw, the M1 is faster than the TX2.


So, it's an 800% improvement over an older Mac? Have you seen the other posts around here? Judging by the comments, people are saying things like "the m1 is unreal" and "might disrupt Google/Amazon in the compute space". That last one wasn't even said as a joke.

When you think an 800% improvement is relevant, you have to consider how good the baseline 100% was to begin with. Otherwise, it really looks like a bunch of people in an echo chamber.

I honestly have no gripes against people liking the M1. But it annoys me that its marketing was so unbelievably misleading, and that it led to this notion, completely disconnected from reality, that this is anywhere close to the performance you can get from a top-end laptop. And a top-end desktop? Yeah... It's OK hardware that is pretty good at being power efficient. Raw performance? Do you need it? Like, honestly: are you running heavy compute calculations, somehow not bound by 8 GiB of memory, and you need to do this while commuting? Well, I suppose the M1 hardware is OK. Nothing to write home about. But OK.


I have nothing but circumstantial evidence for this, but I think Apple recognized that M1 needed to be viewed as exceptional hardware to get a smooth product transition with strong uptake, and as a result ensured that the marketing campaign would focus on benchmarks and word of mouth. I'm not accusing them of astroturfing but at the very least they got demo devices in the hands of the benchmarking blogs. I saw a lot of people on HN talking about how amazing M1 was at the end of last year, but all the stores in my area were closed to walk-in customers and there were very few shopping appointments available, so unless most of those people bought an M1 but didn't mention it in their comments, it seemed pretty obvious they were just going by what they saw in the blogs without having checked out the device in person. My nearest Apple store was basically empty save for Apple employees every time I went there for Genius Bar appointments. I don't believe every person hyping M1 actually used M1 before doing so.

People who buy them do seem happy. But I've seen people claim that 8GB RAM on M1 is functionally equivalent to much greater RAM on x86 because of faster whatever and more efficient something something. At a certain point the adulation exceeds credibility.

I'm excited to see what happens when this whole ecosystem matures, especially in a few years when TSMC 3nm is involved and devices get even more efficient. M1 seems to be a very fine product. But clearly there's a reason the high end MacBook devices are still running Intel...maybe that reason is software. Maybe it's display support. Maybe it's a lot of things.


> But I've seen people claim that 8GB RAM on M1 is functionally equivalent to much greater RAM on x86...

Look no further. On this very topic, /u/alwayssmh parrots this nonsense [1]:

"because of the fundamental difference in architecture, one should not compare ram size of an intel based mac against the ram of a M1 based mac. it's like comparing fuel tank sizes between two cars when one of them has a much more fuel efficient engine."

[1] https://news.ycombinator.com/item?id=26338178


Presumably Apple will build new data centers using Apple Silicon. This would be most disruptive to Intel but would also deeply affect Microsoft and Amazon.

Given what we know about compute to power consumption in the M1 and Apple’s track record of performance gains, how could Amazon or Microsoft compute not be under serious threat?

Apple wouldn’t sell server hardware to these companies.

While Amazon has been working on Graviton2, from what I understand the cores seriously underperform what is shipping in the M1, and there is no evidence of serious work in this area from Microsoft.


The way I see it, compute is a commodity. On such things, you compete on price. Apple, in its current incarnation simply does not compete on price and refuses to do so. So how would they be a threat?


Because Apple would be selling to themselves. There is no markup, and the chips could be sold into consumer machines as well. This would let costs scale, because production can serve both data centers and consumer products simultaneously.

x86 not only suffers from poor performance, power draw, and thermals; each chip sold also has to return a margin to Intel or AMD.

The facilities that run these machines must be bigger, provide more power and work harder to keep cool while resulting in less actual compute.

It’s hard to compete on price when the architecture is unexpectedly antiquated and vertical integration through chip design either doesn’t exist or pales in comparison.


800% improvement. A 100% improvement is times 2, so an 800% improvement is times 9.
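The arithmetic above can be sketched as a one-line formula (a quick illustrative snippet; the function name is made up):

```python
def improvement_to_multiplier(percent: float) -> float:
    """An N% improvement means the new value is (1 + N/100) times the old one."""
    return 1 + percent / 100

print(improvement_to_multiplier(100))  # 2.0 -- twice as fast
print(improvement_to_multiplier(800))  # 9.0 -- nine times as fast
```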


I'm on your side, but, like "bimonthly", this is a lost cause.


The USA is a hard country to get presents for, because you seem to have everything.

But as an Australian, I'd like to give a gift of the word "fortnight". Fortnight is a word that means "two weeks". It's a little old-fashioned but awfully useful. It's a common word in Australia (and the UK, I think) and you can find it in every English language dictionary.

Like "week", fortnight can be used as an adjective. For example, "Fortnightly meeting" - which means "A meeting that takes place every 2 weeks." Unlike "bimonthly meeting", a "fortnightly meeting" is clear and unambiguous. It is also well understood by the rest of the English speaking world.

I hope this gift finds you well. You're welcome.



