Definition of West Germany

1. Noun. A republic in north central Europe on the North Sea; established in 1949 from the American, British, and French occupation zones after Germany's defeat in the Second World War; reunified with East Germany in 1990.


2. Proper noun. A former country in Europe, now part of Germany. Officially called the Federal Republic of Germany (FRG). ¹

¹ Source: wiktionary.com

Lexicographical Neighbors of West Germany

West Berliner
West Brit
West Briton
West Chadic
West Coast
West Country
West End
West Flanders
West Flemish
West Frisian
West Frisian Islands
West German
West Germanic
West Germanic language
West Germans
West Germany (current term)
West Greece
West Highland white terrier
West Hollywood
West Indian
West Indian cherry
West Indian jasmine
West Indian satinwood
West Indian smallpox
West Indian snowberry
West Indians
West Indies
West Java
West Kalimantan
West Lothian question

Other Resources:

Search for West Germany on Dictionary.com!
Search for West Germany on Thesaurus.com!
Search for West Germany on Google!
Search for West Germany on Wikipedia!