To me, this rule seems fairly natural, so I'm part of the "surprised at the backlash" camp here for sure. At a high level, "if" is a way of skipping unneeded code, and "for" is a way of repeating code, so why wouldn't you want to skip as much unneeded code as possible and repeat as little code as necessary? This isn't meant to diminish the concerns raised in the comments here. But if the tools we're using to express what we want the computer to do work better when we avoid simple heuristics like "doing less to achieve the same result is better than doing more", that seems like a fundamental problem with the tools themselves.
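As a hypothetical illustration of that heuristic (the example and names are mine, not from the thread): hoisting a loop-invariant "if" out of a "for" skips the unneeded test entirely, so the loop only repeats the work the result actually needs.

```python
def total_discounted(prices, apply_discount):
    # Re-testing a flag that never changes inside the loop: the "if"
    # runs once per iteration even though its answer is fixed.
    total = 0.0
    for p in prices:
        if apply_discount:
            total += p * 0.9
        else:
            total += p
    return total

def total_discounted_hoisted(prices, apply_discount):
    # Deciding once up front skips the repeated branch; the loop body
    # now contains only the code that genuinely varies per item.
    rate = 0.9 if apply_discount else 1.0
    return sum(p * rate for p in prices)
```

Both versions compute the same totals; the second just skips more and repeats less, which is the whole point of the heuristic.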