What I've learned about American politics is that white people, above all else, want to be told that they are the true victims of society. They genuinely think that Black people, Latinos, Asians, or LGBTQ people somehow have privilege over them, despite America having been a white supremacist patriarchal country for hundreds of years.