One of the most transparent lies white supremacists tell themselves is that Christians forced their religion on Europe. No. No, they didn’t.
Across northern Europe, people looked at Rome, so rich and sophisticated, and decided they wanted some of that for themselves. Back then, if anyone was forcing you to become Christian, it was your own leaders, who had converted first and brought their people along with them.