The End of the U.S. Women’s Soccer Dominance
The rest of the world has caught up—and that’s a good thing.
by Franklin Foer
Aug 01, 2023
The U.S. Women’s National Team suffers by comparison to its old glories. At the previous World Cup, in 2019, it channeled the best of the American character: magnetic self-confidence that verged on arrogance, individualism that flamboyantly flouted archaic norms. In the press, players jawboned about the president of the United States as they waged war against their own employer in the name of equal pay. On the pitch, they were a hegemonic power: adventurous, righteous, justifiably certain of their destiny.
What the world has witnessed in the early stage of this year’s World Cup, where the team has tied Portugal and the Netherlands, is a display of American decline.