For over twenty years I've been teaching people about the figure shown below. It shows the cost of fixing a bug growing exponentially as you move from one phase of development to the next. Put another way, delaying work on an issue dramatically increases the cost of fixing it.
Turns out, it ain't so. Menzies et al. (2016) looked at data from 171 software projects over an eight-year period and found that no, the effort to resolve issues doesn't increase this way. The authors are very careful about their definition of project phases, the way they attribute defects to particular phases, how they gathered and cleaned their data, and why they chose the statistical tests they used. It all culminates in a table showing how the median time to resolve issues in each phase compares with the median times of earlier and later phases, and no, there isn't the kind of steep growth in later phases that most developers believe there is. (And yes, they checked with a survey that developers really do believe that.)
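To make the comparison concrete, here is a minimal sketch of that kind of analysis: group issue records by the phase in which they were introduced and compare each phase's median resolution time. The phase names and hours below are made-up illustrative data, not the paper's, and `median_by_phase` is a hypothetical helper, not the authors' code.

```python
from statistics import median

# Hypothetical issue records: (phase introduced, hours to resolve).
# Under the "exponential increase" theory, later phases should show
# far larger medians; the paper found they don't.
issues = [
    ("requirements", 4), ("requirements", 6), ("requirements", 5),
    ("design", 5), ("design", 7), ("design", 6),
    ("coding", 6), ("coding", 8), ("coding", 7),
    ("testing", 7), ("testing", 9), ("testing", 8),
]

def median_by_phase(records):
    """Group resolution times by phase and return each phase's median."""
    by_phase = {}
    for phase, hours in records:
        by_phase.setdefault(phase, []).append(hours)
    return {phase: median(times) for phase, times in by_phase.items()}

medians = median_by_phase(issues)
for phase in ("requirements", "design", "coding", "testing"):
    print(f"{phase:>12}: {medians[phase]} hours")
```

With data like this, the medians creep up roughly linearly rather than exploding, which is the shape the study actually observed.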
I'm still digesting the implications of this finding and wondering how widely it generalizes, but a quick check of the four undergraduate software engineering textbooks I still own revealed that all of them teach the "exponential increase" theory. That theory was based on careful analysis of the data available at the time, but as is so often the case in science, more and better data require us to change our minds.