As the news keeps saying, there's something wrong with our planet.
If you don't believe any of it, just pretend you do for a minute.
If we were coming to an "end of the world" sort of thing, which country do you think would fall first?
I think we would, the United States of America. We don't exactly have the best "leaders," and they're making the whole situation worse.