West (American West)

DEFINITION

A term blending history and geography, whose definition has changed many times as settlement boundaries shifted. Historically, to the first settlers, the West was unsettled frontier land or land inhabited by American Indians. After Daniel Boone opened the Cumberland Gap through the Appalachians to white settlers, the West was the frontier land beyond it. When white settlers began crossing the Mississippi River onto the plains and prairies, they were said to be settling the West. This expansion was followed by westward travel across the Rocky Mountains to the ultimate West, California. Today the term describes "non-urban land west of the Mississippi." Geographically it encompasses plains, forests, and mountains, and is a place with abundant wildlife. "The geographical West is the present-day land of the hunter, the cowboy, and the Indian."

Source: Patricia Broder, Bronzes of the American West.