Western

Definition

[n] a film about life in the western US during the period of exploration and development
[n] a sandwich made from a western omelet
[adj] lying toward or situated in the west; "our company's western office"
[adj] of wind: blowing from the west
[adj] lying in or toward the west
[adj] relating to or characteristic of regions of western parts of the world; "the Western Hemisphere"; "Western Europe"; "the Western Roman Empire"
[adj] of or characteristic of regions of the United States west of the Mississippi River; "a Western ranch"
Antonyms

eastern