World War II had profound and lasting effects on American society. Wartime labor shortages opened new opportunities for women and minorities, who took jobs in factories and defense industries while millions of men served overseas. Discrimination persisted nonetheless, and wartime fears led to the unjust internment of Japanese Americans. The United States emerged from the war as a global superpower and helped establish international organizations such as the United Nations to promote peace and prevent future conflicts.