What are some facts that could be used to prove that "the Americans won the War of 1812"?
I have looked and looked and looked and I can't really find anything.
Could someone please help by giving me some facts or places that I can look?
really confused,
thanks
2 answers
Go back and look again at the facts you've found. The Americans won the War of 1812 because the British left them alone afterward. The war established the U.S. as an independent nation that had repelled a foreign invader.
It is hard to say that either side won or lost. By the end of the fighting, the original reasons for the war had largely fallen away, neither side wanted to continue, and both signed the Treaty of Ghent, agreeing to stop fighting.
http://en.wikipedia.org/wiki/War_of_1812