
The Importance Of Health Care In The United States


Unfortunately, in the United States (U.S.), health care is not a basic right available to all. While the U.S. is considered one of the richest nations in the world and spends more on health care than any other country, access to and affordability of health care continue to be problems (American Nurses Association [ANA], 2017). Vulnerable and marginalized populations remain less likely to obtain health care in the U.S. because of cost or lack of insurance coverage. Generally speaking, health care in the U.S. is a privilege, not a right. Individuals must either purchase health insurance or pay for the services they receive when they use the health care system. The Center for Economic and Social
