02-13-2005, 09:41 PM
I have been basically everywhere in this country but the west coast, which I said I would like to check out before leaving here. It's not like I hate the U.S. and everything it stands for or anything, I just dislike living here. What's the matter with wanting to live somewhere else? It's no different from disliking a certain state. Take Wyoming for example. I don't hate the people who live there or their policies, I just wouldn't want to live there. And when have I ever made the U.S. out to be the bane of my existence or my personal scapegoat?
^^Made by Blight
"If teachers were paid more, we'd be able to afford better drugs." -Eikichi Onizuka