A new survey from the Pew Research Center finds that one-third of Americans believe being Christian is essential to being truly American. Republicans are more likely than Democrats to say Christianity is key to being American, and 57 percent of white evangelicals say it is important for Americans to be Christian. In addition, a third of the public says that to be American one needs to have been born in the United States, and 70 percent say being American means speaking English.