The death of Christianity in the U.S.
November 13, 2017 by Miguel De La Torre
Christianity has died at the hands of Evangelicals. Evangelicalism ceased being a religious faith tradition following Jesus’ teachings concerning justice for the betterment of humanity when it made a Faustian bargain for the sake of political influence. The beauty of the gospel message — of…