Did Christianity end the Roman Empire?
As the Western Roman Empire became more entangled with Christianity, some historians argue that imperial authority began to erode. The ideal of a divine emperor was supplanted by the notion of the Kingdom of Heaven, redirecting subjects' loyalty away from the emperor and toward God.