There is a lot to draw from Christian ideals, many of which are foundational to Western culture: human dignity, moral equality, conscience, limits on power, and care for the weak and less fortunate. Much of what is happening in the world today feels like a stark reversal of those ideals: selfishness and divisiveness manufactured to benefit a narrow segment of society.
Recent news articles have reported an increase in church attendance. This makes sense: we have lost our moral compass, particularly in the USA, and people are searching for a new direction.
I think what actually happened is that the Enlightenment comprehensively developed the concept of natural rights, and the Christians were like "well, we're not beating that with the divine right of kings; better to adopt it as the thing God did all along".
What have the Romans ever done for us?
From what I’ve seen, religions derive from basic human virtues, not the other way around.