So far, the first answerer was the only one to use profanity, and guess what? He's a liberal. The answer to your question is, of course, that the left simply doesn't care. America isn't important to them. It's easier for them to concentrate on saving wildlife at all costs, even when it hurts people. And the reason is that it makes them feel good about THEMSELVES. It's all about them.