The United States is often portrayed as the “promised land” where people can seek a higher standard of living. This encompasses many aspects of life, several of which are explored in further detail below. In short, it implies better opportunities in education and employment, allowing individuals to become contributing members of society. It can also mean access to health care or to everyday necessities not available elsewhere.