Question
How did immigration change America?
Answers
Ms. Sue
Before immigrants arrived, Native Americans had the continent to themselves. Immigrants took their land, spread diseases among them, and waged war against the Native Americans.