Can anyone explain how expatriate Americans and native Europeans viewed America after WWI?
1 answer
Expatriate Americans viewed the U.S. as disparaging the talents and accomplishments of women and Black Americans.