In my experience (despite what you might imagine), the biggest factor has been the developers' lack of experience. You can get some pretty weird ideas when you first start out. Those weird ideas can gain traction in a group and then become "the way" to do it. Over time, the system gains cruft and there is no practical way of addressing the problems. It takes really experienced people who specialise in legacy code to systematically improve large systems -- and it takes a donkey's age. It's just not a sustainable enterprise.
We tend to think that harsh deadlines and unthinking managers are to blame, but writing good code over the long term is incredibly difficult. Group dynamics are hard to deal with, and as you add (or replace) people on the team, you are bound to go off the rails eventually, even if you started off well. Which is not to say that you shouldn't try, but our industry is really immature. Most developers are quite young, and even when you have a couple of older people on the team, they may or may not have the skills for long-term design evolution. Maybe 50 years from now it will be the norm to write good code, but I think we'll still be writing legacy messes for a while.
I should point out that even the worst code I've seen written this decade is at least an order of magnitude better than the average code I saw when I started my career. As an industry, we are improving!
The core tech lead on our team switched projects; new engineers were assigned to our project, given tight deadlines, and made responsible for reviewing each other's CRs. Our once well-maintained code base began to fall apart, and I realized keeping it clean wasn't worth burning myself out. This is also my first real job working on/with software, so I suppose this is normal?