"Christian" is an extremely generic term with little meaning. I've met devout Christians who take it seriously and try to live "love thy neighbor," but a lot of Christians where I live (the South) just go to church to conform, don't actually practice it, and use the label as an in-group code. I go to a Unitarian church, where Christianity is one stripe among many. Frankly, religion itself is declining because people are better educated and more broad-minded, and because the traditional values that come with most religions promote sexism, homophobia, provincialism, and a worldview drawn from one book written centuries ago by white men. I mean, seriously?