Definition of Wild West
1. Noun. The western United States during its frontier period.
2. Proper noun. The western United States during the 19th-century era of settlement, commonly believed to be lawless and unruly. ¹
3. Proper noun. (by extension) A place or situation in which disorderly behavior prevails, especially due to a lack of regulatory oversight or an inadequate legal system. ¹
¹ Source: wiktionary.com