They say that the British Empire is a declining power and is no longer an empire, but if that is true, do you think America likewise no longer sees Britain as the enemy? When we fought together during WW2, we were obviously two that became one. Maybe their battle for independence is now just a fading fever dream, and they've come to understand what is really important: that the sun that shines on the British Empire is eternal.
You are a US vassal state, and back then you were a banker vassal state who waged war against your ethnic Germanic kin because you were told to.