I disagree with this. The debugger is a very precise tool, and it can give you very precise insights into specific code, but very often it is simply too precise. It is much like trying to understand a large chip by watching individual gates flip.
Of course, if the code is horrible enough you may need to drop down to tracing it line by line and opcode by opcode, perhaps even in a debugger, but for most sane code I think it is possible, and faster, to understand larger blocks of code at once.
In general, if you're having trouble tracing through a particular function over a narrow band of inputs, the debugger can be useful. If you're trying to understand a larger system, or how a system behaves across a wide range of inputs, then stepping through with a debugger is useless.
Put another way, the debugger is to programming what the microscope is to medicine: incredibly useful for some things, but not a very good general diagnostic tool. Metrics data and checking assumptions (via unittest and/or asserting expectations based on reading the code) are much better for getting an overall idea of what's going on. Once you've localized a problem, the debugger can help you get a precise picture of the issue, or help you test a hypothesis (to continue the medical example, you can use the microscope to confirm a theory that there's a bacterial infection).
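To make the "checking assumptions" approach concrete, here's a minimal sketch in Python (the function and its invariants are hypothetical, invented for illustration): instead of stepping through a stage in the debugger, you assert the properties you believe hold, and a failing assertion localizes the problem for you.

```python
# Hypothetical pipeline stage -- the names and logic here are illustrative,
# not from any real codebase.
def normalize(records):
    """Scale each value into [0, 1]; assumes a non-empty list of numbers."""
    hi, lo = max(records), min(records)
    span = hi - lo
    return [(r - lo) / span for r in records] if span else [0.0 for _ in records]

def check_assumptions(records):
    """Assert the invariants we *believe* hold, instead of single-stepping."""
    out = normalize(records)
    assert len(out) == len(records), "normalize must preserve length"
    assert all(0.0 <= v <= 1.0 for v in out), "values must land in [0, 1]"
    return out

# A failing assertion points straight at the broken stage -- no stepping needed.
print(check_assumptions([3, 7, 5]))  # -> [0.0, 1.0, 0.5]
```

The same checks drop naturally into a unittest.TestCase later, so the assumptions you verified once keep getting verified on every run.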