Discuss how World War I affected the lives of Americans. How did it change American social realities regarding women, the family, and society in general?

Please help. I've looked and just can't come up with anything.