Racism in the United States

Racism is a way of thinking that attaches great importance to the notion of separate human races and to the supposed superiority of some races, usually associated with inherited physical characteristics or cultural traits. Racism is not a scientific theory but a set of preconceived opinions that exaggerate the biological differences between humans, attributing superiority to some people according to their racial roots. Even in a country as ethnically diverse as the United States, racism remains evident against people of different ethnic traits and skin color. According to Steinberg (1995), racial discrimination has been the most important cause of inequality between whites and blacks in the U.S. Because of this, minorities in American society have fought for equal rights and respect over the years, beginning with the civil rights movement of the 1960s. Public policies implemented in the United States since 1964 have also been instrumental in reducing economic inequality between blacks and whites, such as affirmative action, a federal program that seeks to include minority groups by providing jobs and educational opportunities (Taylor, 1994). From this perspective, does racism still play a dominant role in American values and American society? If so, what are its remaining consequences? And what impact has the Barack Obama presidency had on the ongoing fight against racism in this country?