
OK. And I've also cringed watching ninjas step through code slowly, reading everything, spending 20 minutes catching something two print statements would have caught.

Debuggers aren't bad. But neither is printing. Knowing when to reach for each is what really matters.



> And I've also cringed watching ninjas step through code slowly, reading everything, spending 20 minutes catching something two print statements would have caught.

The problem is that the two print statements will only catch the bug if they are the right two, i.e. based on a correct hypothesis of what the bug is. And with that same hypothesis, a debugger won't require stepping either: set two breakpoints, run to them, and inspect values.

Stepping is required when you are exploring behavior because you don’t have an easily testable hypothesis about the source of the bug.
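A minimal gdb sketch of that loop (file names and line numbers made up for illustration):

    (gdb) break parse.c:120
    (gdb) break parse.c:188
    (gdb) run
    Breakpoint 1, parse_header () at parse.c:120
    (gdb) print token
    (gdb) continue

No stepping involved: you jump straight between the two suspect points and inspect state at each.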


But with print statements, if the first place I put them doesn't work, I can start bisecting and quickly find the right place to print.
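(By bisecting I mean binary search through the code path: put a print at the midpoint of the suspect region, e.g.

    fprintf(stderr, "midpoint: total=%ld\n", total);

and if the value is still correct there, the bug is later; if it's wrong, it's earlier. Repeat. Variable names hypothetical, obviously.)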

As you note, debugger breakpoints aren't magically better than print statements when I'm investigating a hypothesis: I'm no more likely to put them in the right place than I would have put print statements.

And then there's a class of problems where neither a debugger nor print statements will help: many years ago a very junior co-worker was wondering why his C code was giving the wrong answer for some math. It took me pointing out that one of the numeric types he was using was different from the rest (I think it was #defined elsewhere, in some library, as an integer type). When the compiler did the math, it had to do some implicit type coercion.
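Roughly the shape of that bug, as a hypothetical reconstruction (not the actual code):

    #include <stdio.h>

    /* imagine this hiding in a library header: */
    #define sample_t int   /* the author assumed a floating-point type */

    int main(void) {
        sample_t num = 3, den = 4;
        double ratio = num / den;  /* integer division happens first */
        printf("%f\n", ratio);     /* prints 0.000000, not 0.750000 */
        return 0;
    }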


> And then there's a class of problems where neither a debugger nor print statements will help: many years ago a very junior co-worker was wondering why his C code was giving the wrong answer for some math. It took me pointing out that one of the numeric types he was using was different from the rest

A debugger and watches on the values of concern absolutely will help with that (so will properly placed print statements), so it's a really bad example. (Of course, strong typing helps even more with that particular case.)


No, my co-worker was so junior he didn't understand why that was happening. It took me a moment to glance at the types in the source and point out the problem, no debugger needed.


> But with print statements, if the first place I put them doesn't work, I can start bisecting and quickly find the right place to print.

Can't I also bisect with breakpoints?


You can, but the comment was talking specifically about stepping through the code line-by-line.


I can bisect faster with breakpoints. Plus, with a time-traveling debugger like rr, the time is further reduced.
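A sketch of the rr workflow (program and location hypothetical):

    $ rr record ./myprog     # record one failing run
    $ rr replay              # replay it deterministically under gdb
    (rr) break compute.c:88
    (rr) continue
    (rr) reverse-continue

reverse-continue runs the recording backwards to the previous point of interest, so the bisect becomes a walk backwards from the failure instead of repeated re-runs.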


Tools in a toolbox.

The worst developers I've ever known always have their "this is the best way to do everything" hill they die on.


They weren't ninjas, otherwise they would have used breakpoint actions to get those print statements without modifying the source code.


What's a breakpoint action? Is it like inserting a printf before the breakpoint?


Breaking is just the default behavior when a breakpoint is hit; you can generally attach whatever behavior/conditions you want using the debugger's scripting language.
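In gdb, for example, that means attaching commands to a breakpoint (names made up):

    (gdb) break process.c:42
    (gdb) commands
    > silent
    > printf "x = %d, state = %d\n", x, state
    > continue
    > end

which gets you the effect of print statements without recompiling or touching the source.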


Reading through the majority of this comment section, I get the impression that those who like print statements find them valuable because they aren't proficient with modern debuggers, rather than finding print statements valuable even though they're proficient with debuggers.


I once saw an interview with Visual Studio team members saying that one reason they started giving talks about how to use the debugger was the continuous stream of requests for features that Visual Studio has already had almost since it existed.

Same applies to other debuggers.

It is not only the debuggers, but also OS and language runtime tracing facilities like DTrace, eBPF, ETW, JFR, ....
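For example, a one-liner eBPF trace via bpftrace (sketch; assumes a Linux box with bpftrace installed):

    # print every file opened on the system, and by which process
    bpftrace -e 'tracepoint:syscalls:sys_enter_openat { printf("%s -> %s\n", comm, str(args->filename)); }'

Live observation of a running system, with no recompilation and no print statements.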

Many devs aren't 10x because of IQ, but rather because they learn and make use of the tools available to increase their knowledge of the platform.


I agree, but my feeling is: if one person is bad at using debuggers, it is their fault. If (as it seems to me) most developers are bad at using debuggers, then it's probably the debugger's (and associated tooling's) fault.


For me it is the teachers' fault, given that a large majority never teach anything related to debuggers.

So we get generations that use vim and Emacs like Notepad, create Makefiles by copy-paste, and barely know gdb beyond setting a breakpoint, run, step, and continue.

Using C as example, but feel free to extrapolate to another language.

And no, I am not exaggerating: this was the kind of student I would get in my lab when I spent a year as a TA in 1999/2000, and naturally I would have to get them up to speed on good programming practices.


Reminds me of this diagram http://i.imgur.com/ZOuf9hg.png (which I don't necessarily agree with)


It maps breakpoints to debugger actions that are triggered instead of actually stopping execution, like formatted output of whatever variables are in scope.


Maybe this is sometimes what assert() is used for? Or, in some debuggers, you set a watch on a var and the BP triggers only on the watch condition, so the BP doesn't trigger on each loop iteration, only when x == 7.
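In gdb terms those are a conditional breakpoint and a watchpoint (location hypothetical):

    (gdb) break loop.c:15 if x == 7
    (gdb) watch x

The first stops only when the condition holds; the second stops whenever x changes. assert() is related but different: it lives in the source and aborts the program, whereas these live in the debugger and merely pause it.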



