During World War I, many American women took on traditionally female roles such as nursing and support work while men went off to fight. World War II, by contrast, saw women enter the workforce in far greater numbers, working in factories and even serving in the military. Because they were actively encouraged to fill jobs previously held by men, women gained greater independence during the Second World War. Overall, the Second World War marked a far more significant shift than World War I in how women's capabilities were perceived.
