What changed after World War 1
Germany was forced to sign the Treaty of Versailles in 1919. Under the treaty, Germany accepted blame for the war and had to pay reparations to the Allies.