Not a very artistic picture, but I'll be damned if that cookbook doesn't bring up a loaded issue. Why do so many people insist on making their country part of their religion? I grew up with a pretty moderate/liberal religious education, so here's my take:
A big theme in a lot of churches is creating the Kingdom of God on Earth. Different people and denominations may use different words or ascribe different meanings to them, but it's a fairly prevalent sentiment. Unfortunately, a lot of people take these teachings as divine license to force their utopia on everyone around them. I guess my view - treat people right, try to help them, and things will get better - is too hard.
