What does Haiti mean?

Looking for the meaning or definition of the word Haiti? Here's what it means.

Proper noun
  1. A country in the Caribbean, occupying the western third of the island of Hispaniola. Official name: Republic of Haiti.
Examples
Haitian human rights organizations have repeatedly requested that Constant be forcibly returned to Haiti.
Creative ingenuity gone, the arts and industries would decay, sky-scrapers would crumble, plantations would be weedgrown, as they are in Haiti.
Of course, the people of Haiti claim that they see zombies very often, but no one has been able to prove it.
The US sent troops to occupy Haiti in 1915 after a mob dragged President Guillaume Sam from his palace and tore him limb from limb.
Other countries, such as the Dominican Republic, Haiti, and several African states, have begun to sow jatropha for future use in biodiesel.
Christopher Columbus reputedly chanced upon hammocks in Haiti and sailors were soon slumbering in them on board ship.
