The stab-in-the-back myth was a false belief widely held in Germany after World War I. It claimed that Germany had not really been defeated on the battlefield. Instead, the theory went, Germany was betrayed by its own people on the home front.