Ever since the terrorist attacks of September 11, 2001, many Americans have believed that the events of that horrible day changed the United States forever. With each passing year, more of them have come to believe those changes have not been good for the nation.