gdb can "list" and "where", which gives you the lexical and dynamic contexts respectively.
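For anyone who hasn't used them, a minimal sketch of what those two commands give you (the file names, line numbers, and stack frames here are invented for illustration):

```text
(gdb) list                  # lexical context: the source around the current line
40        len = end - start;
41        strlcpy(dst, src, len);
(gdb) where                 # dynamic context: the call stack that got you here
#0  copy_title (src=..., len=2938473873) at foo.c:41
#1  parse_entry (e=...) at parse.c:108
#2  main () at main.c:22
```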
Here come the downvotes... but if you don't know the code running in the debugger well enough to remember the details, well, you've found your bug. Read and understand your code very carefully before you waste your time single-stepping through it.
(This is the biggest problem I see with developers that use IDEs -- they forget that they have a brain, and that your brain is always going to be faster than clicking shit in Eclipse.)
Oh great, another internet toughman boasting about how all those IDE-using pussies are just stupid and how Real Programming should be left to Real Programmers (cue story of Mel here somewhere). *rolls eyes*
When you have to type commands to see what you would otherwise see immediately, how is that not a usability deficiency? GDB fans just have Stockholm syndrome.
And if you think there is anyone in this world who has a complete mental model of each line of code in a 10k-line project, let alone a 100k or 1m one, you're insane. I'm not talking about debugging a 15-line function. But even then, single-stepping through a function of that size doesn't even cost extra time when you use proper tools (and not that POS gdb), so 'waste your time' doesn't even apply there.
Debuggers have a bad rep with some crowds because of the damage gdb did - people think that that's how debuggers work and can only work. Navel-gazing at its worst.
> Oh great, another internet toughman boasting about how all those IDE-using pussies are just stupid and how Real Programming should be left to Real Programmers (cue story of Mel here somewhere). *rolls eyes*
If you want to talk like this, go back to Reddit. I'm not an Internet toughman, I'm describing the motivations behind the interface of gdb. Turns out that just because someone disagrees with you doesn't mean they're dumber than you.
In this case, I think it's worth listening to me because I've used pretty much every development tool imaginable to write hundreds of thousands of lines of code. I may know what I'm talking about! Shocking!
> When you have to type commands to see what you would otherwise see immediately, how is that not a usability deficiency? GDB fans just have Stockholm syndrome.
Information overload is just as bad as too little information. GDB lets you see what you need to see when you need to see it. It doesn't guess what you want it to do; you tell it. You're the brain and it's the computer.
And, it hooks into your text editor if you want that little => next to the line of code GDB is on. I've personally never found this to be useful. (As I mention in a comment above, it's rare for modern compilers to produce code that runs top-to-bottom exactly as you've typed it anyway. The source code is an explanation of the sort of steps you want the computer to take, but it's not an exact model of what the computer's going to do. Saying "we're now running this line of code" can be very misleading.)
> And if you think there is anyone in this world who has a complete mental model of each line of code in a 10k-line project, let alone a 100k or 1m one, you're insane. I'm not talking about debugging a 15-line function.
This is just plain wrong. The debugger is the wrong tool for analyzing a bug that persists over 10k lines of code. The key to debugging is to reduce the scope of the problem, and the way to do that is by not tightly coupling large chunks of code. If you have a bug that requires a mental model of 10k-1m lines of code, unit tests and refactoring are the tools you need to be using.
But, with that in mind, 10k lines is not that much to keep in one's memory. I maintain cperl-mode.el, which is 10k lines of elisp that I did not write. I haven't touched it for months, but I still know where the important parts are and how they interact. Reading is one of the most important skills a programmer can have. If you can't remember what you've read, you need to slow down. Lack of understanding causes bugs.
> single-stepping through a function of that size doesn't even cost extra time when you use proper tools (and not that POS gdb), so 'waste your time' doesn't even apply there
Yes it does. Single-stepping requires a round-trip between the computer and your brain for every line of code. When your IDE jumps you to a new file because you stepped into a function, your brain has to context-switch and read the surrounding code. All in all, it's a slow and complicated operation compared to the normal process of running your unit tests and scanning the logs if there is a failure. That case is fast because most parts are done by the computer; it just gives you a few pieces of information that you can use to analyze, tweak, and re-run. And you can drink some coffee while the computer does most of the work!
A debugger is for a very special case where unit tests cannot find bugs, or the effort to write these test would be too high. A good example is examining a core dump from a production process. You load it up in your debugger and can inspect the program state in great detail. You see that the current call frame is 0x626f6f66. That looks like the text "foob", which seems like it might have come from some text your program was working with, "the foobar chronicles". But you're using bounds-checking for your string operations, so why the bug? Use the debugger to inspect some local in-memory state, like the "len" parameter to strlcpy. It's 2938473873! Must be an integer overflow somewhere.
The key here is that everything we did was interactive and required human thought at each step. We didn't know what the problem was, so we asked the debugger for some data. Then we thought about it for a while, came up with a theory to test, and asked the debugger for more information to prove or disprove our theory. After repeating this a few times, we came to the conclusion that we are doing an addition wrong in the foo function, and that's making the program segfault.
With that in mind, we switch back to our text editor, write a test for the length-calculation math, and then fix the math. We shouldn't need a trip to the debugger for a while, because we use the debugger to collect information in odd cases, not to watch our program run normally.
So anyway, I guess my hangup with the graphical IDE-based debuggers is "why". Why are you writing code that needs to be single-stepped regularly? Why is your codebase so messy that you can't understand it? Why is your time so worthless to you that you are doing the computer's work for it, instead of the other way around?
A debugger is a tool to use for investigating weird things, not something you should bust out everyday.
> Debuggers have a bad rep with some crowds because of the damage gdb did - people think that that's how debuggers work and can only work. Navel-gazing at its worst.
A lot of insults, but not a lot of information. If you're so sure that Visual Studio's debugger has solved all of programming's problems, why not share a concrete example of how you use it? Your post adds nothing to the conversation and makes people wish your computer would punch you in the face.
I'm not going to argue further since Mithaldu has already made the points I'd make and in a much more coherent way than I would anyway, but I have to give it to you that
"Your post adds nothing to the conversation and makes people wish your computer would punch you in the face."
made me chuckle, and I will be adding this to my list of favorite internet insults.
> I'm not an Internet toughman, I'm describing the motivations behind the interface of gdb.
Actually, yes, you did that, but you still are one.
For the plain and simple reason that you repeatedly stooped to insults while making your points instead of letting them stand on their own. You should not be surprised people think little of you if you cannot teach without belittling people.
It's funny how both sides in any argument or disagreement think the other is belittling them by doing nothing more than assuming they don't understand something or other (this includes you just now, btw). Certainly, pointing out ad hominem counts as ad hominem every single time too.
Most certainly not. The factual divide is one issue and his politeness another. The posts of jrockway are laced with flat out insults that are unnecessary and have nothing to do with any assumptions about understanding.
Let me point one out:
> Your post [...] makes people wish your computer would punch you in the face.
Surely there was insult and ad hominem in his post (not arguing the fact), but I believe it was in reply to the insult and ad hominem from the post he was replying to. Calling someone an internet toughman counts as ad hominem just as much as calling someone computerfacepunch.
Although, the more I think about it, your point was probably: no need to stoop to the same level.
> they forget that they have a brain
> where -O3'd code magically executes as though the compiler didn't do any optimizations on it.
> And finally, if you use correct grammar in your comments
These have varying degrees of subtlety, but they are definitely aimed digs.
There was a definite precedent that warranted the toughman comment and made it seem like an accurate description to me, even when I was actively trying to be nice.
Edit: I understand your point and think about it fairly often, e.g. when programmers in one language attack another and the attacked group goes on to point out how they refrain from attacks. I just do not think I have any ash to sprinkle on my shoulder in this case.
It's not a dig. I do think you're dumb if you don't like gdb. Naturally, my opinion comes out in my writing, which I think is my Creative License. You should realize that anything you read is biased. You should read the post for content, mix it with its credibility, and then reach a decision.
In the end, I think that my experience programming adds more credibility than my love for snide comments subtracts. That doesn't make me a toughguy, that makes my writing interesting to read.
And for a more factual reply: You're making a lot of wrong assumptions.
Fair warning: I will be talking about a Perl debugger.
> Information overload is just as bad as too little information.
Here you're making the assumption that such a GUI debugger would show ALL OF THE THINGS ALL THE TIME. In reality, however, they can be pretty damn intelligent. My weapon of choice inspects the lexical scope around the current position and displays the variables in it, so there is a window of things that the user is very likely to care about whenever the debugger stops. Then, to top it off: the debugger does not actually show all the contents of complex variables at once, and instead lets you show/hide/page through their contents and remembers what state they were in when you last looked at them.
Information overload is a real problem, but technology has advanced, and as long as people don't try to create actual intelligence, agents that take the work out of certain oft-repeated tasks are viable and useful.
> Single-stepping requires a round-trip between the computer and your brain for every line of code.
Here you're making the assumption that one has to step through every single line of code to get to the place one wants to look at. In reality, breakpoints are employed instead, sometimes even conditional breakpoints, in order to avoid having to s-s-s-s through a point that's executed repeatedly with data one doesn't care about. Additionally, they have nifty tools like "step out" or "step over" that make it easier to get to interesting points without going through entire irrelevant functions. And lastly, breakpoints are not static; they can easily be set or disabled at runtime, making exploration easy.
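For concreteness, here's roughly what that looks like in the stock perl -d debugger (the line number and variable are invented for illustration):

```text
DB<1> b 214 $len > 1_000_000   # conditional breakpoint: stop only when it matters
DB<2> c                        # continue: run until a breakpoint fires
DB<3> n                        # "step over": run the next statement without descending
DB<4> r                        # "step out": run until the current sub returns
DB<5> B 214                    # breakpoints aren't static: remove one at runtime
```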
> And, it hooks into your text editor if you want that little => next to the line of code GDB is on. I've personally never found this to be useful.
Here you're making the assumption that all the code is in one file most of the time. I do not think I need to expand on that. Also, keep in mind that debuggers are often more than clever enough to realize that per-line debugging is stupid and per-statement debugging much more useful. As such, the little arrow can, for example, be really useful in showing you that your code keeps hitting a certain line where you expected it to move on.
> The key to debugging is to reduce the scope of the problem
When one starts using third-party libraries, the debugger is stupidly useful in showing exactly where inside that library the input data is operated on, so it can actually be the best tool for reducing the scope.
> unit tests unit tests unit tests
You like them a lot. To the point where you seem to ignore that they're entirely synthetic and can be very misleading about what actually happens when code collides with real data. Unit tests are not a debugging tool. They are self-checking documentation of your code.
OK, I can't be arsed to write more.
Please try to understand that there are many perspectives and that insulting people will only lead to them hardening their perspectives.