Asked by Deborah
How did immigration change America?
Answers
Answered by Ms. Sue
Before immigrants came, the Native Americans had the country to themselves. Immigrants took their land, spread diseases, and waged war against them.