Looking for the meaning or definition of the term Christian Left? Here's what it means.
Proper noun
The body of religious and political movements and organizations that hold a strong Christian faith and share left-wing or liberal views derived directly from that faith.