I was watching the 7.30 report on dog meat being eaten in Bali (a relatively recent cultural innovation, apparently), and I was annoyed to see one old Balinese guy say he eats it because it keeps you healthy, especially in winter.
What is the name for the belief that eating particular animals is especially good for you, in certain ways? Most notoriously, it pervades Chinese medicine and other Asian cultures, but I suppose it hangs around in the native cultures of plenty of other continents too: the idea, in a generic sense, that eating a particularly strong or fierce animal (or a particular organ of one) will pass on some of its character to the eater.
It kind of drives me nuts: a quasi-spiritual idea that has been responsible for the endangerment of so many species for completely spurious reasons. (Or is it a case of a placebo effect that means it actually does help people? But even if it is, can't they move on to using sugar pills instead of God knows what animal's penis, or heart, or whatever?)
I know people aren't evil for eating dogs, although my personal fondness for them means, of course, that I wish they wouldn't. And, I know, dogs aren't endangered and never will be. But if the motivation is simply that dog meat is supposed to be especially good for your health, that annoys me in particular.