This is very interesting, but something does not seem quite right. (Again, I need to read the book.)
I think it goes without saying that “white” Christian America is in decline. The demographics bear this out. But are the things that have long defined “Christian America” (at least in the last half-century) fading away? I don’t know. It seems that in order to answer “yes” to this question, we would need to make a case that non-white Christians do not care about core “Christian America” tenets such as the place of Christianity in public life, traditional marriage and families, opposition to abortion, a critique of the coarseness of popular culture, etc. Since evangelicalism is booming in Latin America, Africa, and elsewhere in the global South, can we really say that immigrants arriving on America’s shores from these places are going to be any less “Christian” on these social issues?
This reminds me of a conversation I had recently with an immigration reporter for the Houston Chronicle, who told me that many of the Latino immigrants she interviewed for a story were very conservative on social issues. She was surprised by how many of them were supporting Trump because they believed The Donald would deliver the Supreme Court.