It was before your time.
In WWII, we saved the world from what is now seen as some really evil stuff. Not alone, of course; Europe and Russia made huge sacrifices, and that's where much of the war was fought. But US arms and blood were the decisive factor: Germany was winning, and so was Japan.
After WWII, the US decided to rebuild the world. We turned our enemies (Germany, Japan) into our close allies.
And the people who did it were genuinely, seriously committed to doing what they thought was right. It was about building a country and working together, not the insane politics of today.
Look, it wasn't all rose-tinted glasses. Bad stuff happened: McCarthyism was worse than what we currently have, there was everything the civil rights movement had to fight, and there were the stupid wars, Korea, Vietnam, all the smaller police actions. Bad shit was done.
But on balance, the US was seen as a force for good, and the guarantor of world peace and the prosperity that peace allows.