The people we now describe as "the West" only learned of the Americas' existence about 600 years ago. And three thousand years ago, Europeans were considered uncivilized and backward compared to the peoples of the Mediterranean and the Fertile Crescent.
What do you think about the other questions?
The boundaries of the "West" have fluctuated over the past 3000 years. Why do you think that is the case? What accounts for those changes?
Is the idea of the West objective or subjective? How is it decided if one country or another is part of the West?
What are the most important shared values and cultural practices that make up Western Civilization? What unites people in the West?
I may be wasting my time here, hours after you posted your question, Jimmy, but it depends entirely on how YOU define "the West". During the Cold War, "the West" meant the nations allied with the United States and Western Europe, as opposed to those aligned with the Soviet Union and communist China. In the United States, "the west" was anything beyond the last settled frontier, and that line moved from the Appalachian Mountains to the Mississippi River to California. At one time, what is now Iowa was considered the "wild west". More broadly, "the West" often refers to societies with roots in Europe, as opposed to Asian-rooted societies, which are often called "Eastern". So it depends on how you define the term, and at what point in history.
And, I would point out that if you live in Vladivostok and head due east, you'll end up in North America - so it's all relative. If you live in Los Angeles and head due west, you'll find the next continent to be Asia (with islands in between). Go figure.