End of the British Empire

After the Second World War, the United States rose to global power and the British Empire went into decline; however, it did not collapse entirely, sustained for a time by its remaining colonies.