What does Christian Left mean?

Looking for the meaning or definition of the term Christian Left? Here's what it means.

Proper noun
  1. The body of religious and political movements and organizations that hold a strong Christian faith and share left-wing or liberal views derived directly from that faith.