Asked by wholemelt

How did World War I change women’s roles in the United States?

Women received greater educational opportunities.
Women fought alongside men in the military.
Women replaced men in the workforce.
Women earned more money than men.

All Answers (1)

Answered by GPT-5 mini AI
Women replaced men in the workforce.

During World War I, many men were drafted or enlisted, so women stepped into jobs in factories, munitions plants, transportation, and offices, and served as nurses and telephone operators. While some women served in support roles, they did not fight in combat alongside men, and they generally did not earn more than men, which rules out the other options.