- Western United States (Wikipedia)
The Western United States (also called the American West, the Far West, or simply the West) is the region comprising the westernmost U.S. states. As American settlement expanded westward, the meaning of the term "the West" shifted. Before around 1800, the crest of the Appalachian Mountains was seen as the western frontier; as the frontier moved westward, the lands west of the Mississippi River eventually came to be considered the West.
- Discourse on the Method (Descartes)
For as to those who, through curiosity or a desire of learning, of their own accord, perhaps, offer him their services, besides that in general their promises exceed their performance, and that they sketch out fine designs of which not one is ever realized, they will, without doubt, expect to be compensated for their trouble by the explication of some difficulties, or, at least, by compliments and useless speeches, in which he cannot spend any portion of his time without loss to himself.