The History of American Christians

Throughout the years, Christians have striven to do the will of God, from converting people to Christianity to building a society pleasing to God. Christians have been present in America since colonial times, and in the late 19th century they were still thriving in the United States. In the early 20th century they remained involved in the broader American culture, committed to shaping public policy and welcome in political life. But as time went on, evangelicals began to create their own subculture, no longer involving themselves in politics and the rest of American culture. By the mid to late 20th century, evangelicals saw that the nation was drifting further away from God and that this was affecting them. They sought to partly reinsert themselves into American culture and politics and found they were not as welcome as before. Even though they are not welcome, Christians must try to do the will of God by turning people's eyes back to Him in everyday life and politics.

From the late 19th century to the early 20th century, evangelical Christians were involved in American culture. Evangelical Protestants shaped public policy by trying to reform the nation according to their convictions. That was the case with Prohibition. Protestants believed that alcohol destroyed self-discipline and self-control, not merely that it caused drunkenness. Evangelicals sought to rid the nation of the wickedness of alcohol. William Riley, a Baptist pastor in Minneapolis, preached about the immorality of liquor.
