Definition of West Germany
1. Noun. A former republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990.
2. Proper noun. A former country in Europe, now part of Germany. Officially called the Federal Republic of Germany (FRG). ¹
¹ Source: wiktionary.com