While I’m gratified that so many Americans are at last waking up to the sorry state of our Republic, I have to take issue with the oft-stated notion that America is becoming an empire. On the contrary: America has been an empire for some time now. What is happening to us, the loss of our liberties and our government’s increasing aggression both at home and abroad, is not the onset of some new thing. It is, rather, the final stage of an illness that has proven fatal to every people that has ever contracted it: a disease of the mind. For before an empire can be birthed on the world stage it must first be conceived in the minds of men, and the imperial mindset was present in our United States of America from the very beginning.