Christianity is Not a White Western Religion

Either we will have a Christianity that is Western, or we will have a Christianity grounded in the truth of the Bible, one that can transcend culture. When Christianity is separated from its roots, the whitewashed Western, and often American, version that results hurts everyone, including white people.