World War I was a global cataclysm that toppled centuries-old dynasties and launched “the American century.” Yet at the outset few Americans saw any reason to get involved in yet another conflict among the crowned heads of Europe. Despite its declared neutrality, the U.S. government grew increasingly sympathetic to the Allies, until President Woodrow Wilson asked Congress to declare war on Germany in order to “make the world safe for democracy.”
Key to this shift in policy and public opinion was the belief that the English-speaking peoples were inherently superior and fit for world leadership. In the years just before the war, British and American elites set aside old disputes and recognized their shared potential to dominate the international stage. By casting Germans as “barbarians” and circulating stories of atrocities, the Wilson administration persuaded the public—including millions of German Americans—that siding with the Allies was a just cause.