Definition of West coast
1. Noun. The western seaboard of the United States from Washington to southern California.
Generic synonyms: Geographic Area, Geographic Region, Geographical Area, Geographical Region
Group relationships: West, Western United States
2. Adjective. Of or relating to the western seaboard of the United States. ¹
¹ Source: wiktionary.com