The fact that "god" still plays such a prominent role in humanity only shows how primitive we still are. Man created gods--when did it get to be the other way around?
We're all f*cking PEOPLE. Being born in America doesn't make you any better, or entitle you to any more basic human rights than anyone else. Money is not the most important thing in the world. Jesus f*cking Christ, I'm so angry. Also, cocks.