True, but you need to consider which you care about more: having the problem solved, or solving it yourself. Put differently: do you care about increasing utility, or about your own fuzzy feelings?
I'll admit I didn't read the link you shared. Still, I'd assert that fuzzies yield utilons to the individual much as utilons yield fuzzies, and that each requires some refinement to acquire optimally. It seems like a greyer area than your comment or the article's title implies.
Please read it; it's not long. In the article, utilons refer to the utility your help brings, and fuzzies to the warm feelings it generates in you (which you can count as utility for yourself, but which you could also count separately).
The point of the article is that if you optimize for utilons, fuzzies, and status points separately - e.g. by doing one thing for maximum utility and another for maximum fuzzies - you'll get more of each than if you try to find a single thing that maximizes all three at once.
Both of those goals are good, but they're better pursued separately. See also: http://lesswrong.com/lw/6z/purchase_fuzzies_and_utilons_sepa....