Hacker News

I don't. Maybe we're thinking about different kinds of caches, but if these are transparent, no-performance-impact caches, then why wouldn't you prove the system works well with caches off (guarantee deadlines are met), then enable caches for opportunistic power gains?


> if these are transparent, no-performance-impact caches

If there were no performance impact, there would be no point. I'm not just being snarky; there's an important point here. Caches exist to have a performance impact. In many domains it's OK to think about caching as a normal case, and to consider cache hit ratio during designs. When you say "no performance impact" you mean no negative performance impact, and that might be technically true (or it might not), but...

But that's not how a hard real-time system is designed. In that world, uncached has to be treated as the normal case. Zero cache hit ratio, all the time. That's what you have to design against, even counting cycles and so on if you need to. If you're designing and testing a system to do X work in Y time every time under worst-case assumptions, then any positive impact of caches doesn't buy you anything. Does completing a task before deadline because of caching allow you to do more work? No, because it's generally considered preferable to keep those idle cycles as a buffer against unforeseen conditions (or future system changes) than try to use them. Anything optional should have been passed off to the non-realtime parts of the larger system anyway. There should be nothing to fill that space. If that means the system was overbuilt, so be it.
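The budgeting described above can be sketched in a few lines. This is only an illustration of the idea, not any real system: the task names, cycle counts, and the 100 MHz clock are made-up numbers, and every task is costed at its worst-case execution time under a zero cache hit ratio.

```python
# Hypothetical worst-case budget check: all numbers are invented.
CYCLES_PER_US = 100  # assumed 100 MHz core

TASKS = [
    # (name, worst-case cycles assuming zero cache hits, deadline in microseconds)
    ("read_sensors", 3_000,  50),
    ("control_loop", 6_000, 100),
]

def within_budget(wcet_cycles, deadline_us):
    """True iff the task fits its deadline under worst-case assumptions."""
    return wcet_cycles <= deadline_us * CYCLES_PER_US

# Every task must pass this check every time; any cache hits at runtime
# just leave slack, which is kept as a buffer rather than filled.
assert all(within_budget(wcet, deadline) for _, wcet, deadline in TASKS)
```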

The only thing caches can do in such a system is mask the rare cases where a code path really is over budget, so it slips through testing and then misses a deadline in a live system where the data-dependent cache behavior is less favorable. Oops. That's a good way for your product and your company to lose trust in that market. Once you're designing for predictability in the worst case, it's actually safer for the worst case to be the every-time case.

It's really different than how most people in other domains think about performance, I know, but within context it actually makes a lot of sense. I for one am glad that computing's not all the same, that there are little pockets of exotic or arcane practice like this, and kudos to all those brave enough to work in them.


While you might test your hard real-time requirements with caches disabled, there's still reason to run the code with caches afterwards.

E.g. an error path that wasn't exercised by any branch or input scenario during testing and would go over budget without the cache; with the cache enabled it might still meet its deadline and prevent a crash.

Another could be power consumption, latency, or accuracy. E.g. some signal analysis doesn't work at all if the real-time code can't keep up with the required Nyquist rate, and faster execution raises the achievable sample rate, and with it the maximum frequency that can be handled, improving accuracy.
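The Nyquist point above is just arithmetic, sketched here with invented per-sample costs: the processing time per sample caps the achievable sample rate, and the Nyquist criterion caps the analyzable bandwidth at half that rate.

```python
# Illustrative only; the microsecond costs are assumptions.
def max_analyzable_hz(per_sample_us):
    """Highest frequency the analysis can handle for a given per-sample cost."""
    sample_rate_hz = 1e6 / per_sample_us  # one sample per processing slot
    return sample_rate_hz / 2             # Nyquist: bandwidth = rate / 2
```

Halving the per-sample cost doubles the usable bandwidth, which is how a cache's speedup translates into accuracy rather than idle time here.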


You could be right on some of those. That didn't seem to be the prevailing attitude when I worked in that area, but as I said that was a long time ago - and it was in only a few specific sub-domains as well.


Forgot to mention: cache misses can be more expensive than uncached accesses, so testing with caches off and then turning them on in production can be a disaster if you hit a cache-busting access pattern. Always run what you tested.
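This failure mode can be shown with a toy direct-mapped cache model. All the numbers here are illustrative assumptions, not any particular processor: a miss is costed higher than an uncached access because of the line fill, so an access pattern that always misses runs slower with the cache on than with it off.

```python
# Toy direct-mapped cache; all cycle counts are assumed for illustration.
UNCACHED_CYCLES = 10   # flat cost of one uncached access
HIT_CYCLES = 1         # cost of a cache hit
MISS_CYCLES = 14       # cost of a miss (tag check + line fill)
NUM_SETS = 64
LINE_WORDS = 8

def total_cycles(addresses, cached=True):
    """Total cycles to perform a sequence of word accesses."""
    if not cached:
        return len(addresses) * UNCACHED_CYCLES
    resident = {}  # set index -> tag of the line currently in that set
    cycles = 0
    for addr in addresses:
        line = addr // LINE_WORDS
        index, tag = line % NUM_SETS, line // NUM_SETS
        if resident.get(index) == tag:
            cycles += HIT_CYCLES
        else:
            cycles += MISS_CYCLES
            resident[index] = tag
    return cycles

sequential = list(range(512))  # good locality: cache wins easily
# Stride of one full cache's worth of words: every access maps to the
# same set with a new tag, so every access misses.
busting = [i * NUM_SETS * LINE_WORDS for i in range(512)]
```

With the sequential pattern the cached run is far faster than uncached; with the cache-busting stride, the cached run is slower than uncached, which is exactly the "tested with caches off, shipped with caches on" disaster.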


Because you lose deterministic behavior, and there are cases where that is non-negotiable, regardless of performance cost.


Who will provide a guarantee that the caches are truly transparent and will not trigger any new bugs?

Essentially, you would need to prove the statement "if a system works well with caches off, then it works well with caches on" to the satisfaction of whatever authority is giving you such stringent requirements.


You'd have to very solidly prove that, in the worst case, a cache only ever makes execution time equal to or better than a processor not using one, and never causes anything to be slower.


Even that is not enough. The cache may make everything faster, but it could lead to higher contention on a different physical resource slowing things down there. The cache cannot be guaranteed to prevent that.




