What does American Dream mean?

Looking for the meaning or definition of the word American Dream? Here's what it means.

Noun
  1. (idiomatic) A widespread determination by Americans to provide their children with a better upbringing than their parents were able to provide for them.
  2. (idiomatic) A philosophy that with hard work, courage and determination, anyone can prosper and achieve success.
Examples
Nobody believes more fervently in the American Dream than he does, yet the dream has somehow eluded him.
To belabor the comparison a bit, the same could be said for the American Dream.
Read a cautionary tale about the seductive and dangerous power of a charlatan sociopath, featuring goats and the American Dream.
Several groups found themselves excluded from the American Dream of the white, periwigged, slaveholding intelligentsia of Enlightenment Virginia.
As beneficiaries of government largesse, these individuals have somehow hijacked the American Dream.
Let us all strive to ensure that all of our children are given the opportunity to achieve the American Dream.

Copyright WordHippo © 2024