Question

I work on a lot of projects for fun, so I have the freedom to choose when I want to finish each project and am not constrained by deadlines. Therefore, if I wanted to, I could follow this philosophy:

Whenever you hit a bug, don't debug. Instead, spend time learning about topics related to the bug from textbooks and work on other projects until one day you can come back to the bug and solve it instantly thanks to your piled-up knowledge.

However, maybe the 'philosophy' is bad because

  1. debugging is a skill that should be practiced so that when I need to finish projects by a deadline I will have acquired the skills necessary to debug quickly.
  2. staring at lines of code and struggling to understand them when debugging makes you a much better programmer than writing new lines of code does.
  3. debugging isn't a waste of time for some reason which I hope you'll tell me.

Solution

Whenever you hit a bug, don't debug. Instead, spend time learning about topics related to the bug from textbooks and work on other projects until one day you can come back to the bug and solve it instantly thanks to your piled-up knowledge.

You make the false assumption that people "write" bugs because of a lack of knowledge. A lot of bugs come from simple mistakes and false assumptions. And debugging is in itself a learning experience that will develop you as a programmer.

Yes, debugging is useful, mainly for your reason 3:

you would never deliver a working product if you didn't.

OTHER TIPS

It depends on what you mean by debugging. In your approach, debugging looks like just trying things until the bug is fixed.

From my point of view, debugging is a little broader: reproducing the bug in an isolated scenario, working out the cause of the bug, fixing it, testing in a common scenario, etc. Iterations are possible ;)

To answer the question: debugging (as defined in the previous paragraph) is not a waste of time.

I would approach the topic from the other side: how to avoid creating bugs in the first place? Quality has to be built in, not tacked on afterwards.

Anyway, I feel there's a very simple reason not to allow bugs to fester if you find them: they tend to get worse when left alone.
It even happens quite frequently that the behaviour of a bug is something future development relies on for other functionality. Then, when you do get around to fixing that bug, you wind up breaking other things.

In layman's words:

It beats me how on earth someone can separate the process of programming from the process of debugging.

If you write perfect programs the very first time you try, then you are Neo, the chosen one from the Matrix trilogy; otherwise, writing a program is a process of writing, testing and debugging.

So no, debugging is not a waste of time; it's a necessity arising from the fact that you can't simply write a perfect program on the first try.

Debugging and researching a bug are not very interchangeable.

Debugging is going through your code (using a debugger, running it while printing internal state, reviewing the code, etc.), trying to find the part of your code that doesn't do what you meant it to do.
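
For what it's worth, here is a minimal sketch of that "printing internal state" style of debugging (the function and values are my own hypothetical example, not from this answer): a routine that is supposed to average the positive numbers in a list, instrumented with temporary prints so you can see where the actual values drift away from what you meant.

```python
def average_positive(values):
    total = 0
    count = 0
    for v in values:
        if v > 0:
            total += v
        count += 1  # bug: counts every element, not just the positive ones
        print(f"v={v!r} total={total} count={count}")  # temporary debug output
    return total / count


print(average_positive([2, -1, 4]))  # prints 2.0, but the mean of the positives is 3.0
```

The trace shows count reaching 3 even though only two values were added, which points straight at the misplaced increment.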

Researching is needed when you find a problem in a "black box" - a part of the code you can't probe, either because it's part of the language, or you don't have access to its source code, or you simply lack the experience/knowledge/time to probe the source code yourself. When you give that black box input you think is correct and it gives you incorrect output or side effects, that means that either you don't use it properly or there is a bug in the component itself. At any rate, you need to research it in order to solve it.

Debugging cannot be replaced with researching. Problems that need debugging are usually problems specific to your own code. If your function returns 2 instead of 3, you can't google the bug, because even if you find someone else whose function returns 2 instead of 3, it will probably be for different reasons. Even if your bug falls into a broader pattern that you can research, you'll often need to do some debugging first to find out which pattern it is, and by doing that you have probably already solved the bug.

Researching cannot be replaced with debugging. If you find such a black box, you're not going to just throw input at it randomly until you happen to make it work. Since the problem is with something external, you can actually look it up. If a function from an external library returns 2 instead of 3, and you find someone else on the internet who has the same problem, it's probably for the same reason, since you both use the same function.

Debugging and researching a bug are not very interchangeable. Which approach you should take (sometimes you'll need both!) depends on the bug at hand, not on some general rule decided beforehand.

Your customers will greatly appreciate it if they report a critical bug and you just let it sit there until, several months later, a light suddenly goes on in your head and you realise exactly what caused it, without ever having put any effort into actually analysing the problem...
Debugging isn't just mindlessly reading source code and hoping to spot the error. If the error were that obvious, it should never have made it into production in the first place.

All good answers here. I would only add that bugs fall into two categories.

When you write a program, you have in mind, if not on paper, a notion of what you want it to do - a set of requirements.

When a program fails to meet its requirements, let's call that a type 1 bug. Example: printing the wrong output.

When a program fails to meet any possible requirement, let's call that a type 2 bug. Example: a crash or hang.
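
As a rough illustration of the two categories (my own toy code, not the answerer's), they might look like this:

```python
def area(width, height):
    # Type 1 bug: the program runs, but violates its requirement
    # (the requirement being "return width * height").
    return width + height


def first_item(items):
    # Type 2 bug: for some inputs it meets no possible requirement - it crashes.
    return items[0]  # IndexError on an empty list


print(area(2, 3))      # 5 instead of 6: wrong output
print(first_item([]))  # raises IndexError: a crash
```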

Either way, it's as important to know how to take bugs out as it is to put them in.

Whenever you hit a bug, don't debug.

I'm not the original author, so I don't know for sure, but I don't believe they are arguing that you should never debug. I believe they are arguing against the tendency to immediately start debugging as soon as you find a bug. I am sometimes guilty of this, and it can be very tempting, because sometimes it is a quick way to find the problem.

But debugging cannot replace a good understanding of the code, and indeed, having that good understanding can help you debug the issue more quickly.

The skill, which only comes with the experience of trying both, is knowing how much time to spend understanding the code without a debugger, and when to just leap in with the debugger and "see what is actually going on".

For me, a bug is when something doesn't work the way I expect it to. That usually means I don't understand how something works, and in most cases I've seen, it is just a sign of something more serious going wrong somewhere deeper. Which means I have to find out why I see what I see. In some cases the result is "I don't even know how this worked before" - which is good: I just found a serious error by noticing that something was out of place.

So no, it is not a waste of time.

Bugs generally happen because of a wrong assumption about how something works. Most of the time, the wrong assumption is about how your own code works. The algorithm you came up with for calculating something doesn't quite do what you think it does, especially in edge cases. You make an assumption about the input that doesn't actually hold. And so on.

In a few cases, the wrong assumption will be about something external. You misunderstood how initialization order during construction works (or maybe you got confused because it works differently in the language you're used to - just compare initialization of objects in Java, C# and C++). You thought the modulo operator never returns negative results.
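
To make the modulo example concrete (a small sketch of my own, not part of the original answer): even within Python, the two remainder operations disagree about the sign, and C, Java and C# follow the second convention for integer %.

```python
import math

print(-7 % 3)            # 2    -> Python's % takes the sign of the divisor
print(math.fmod(-7, 3))  # -1.0 -> C-style remainder keeps the sign of the dividend

# Code that uses `x % n` as an array index and assumes a non-negative result
# works in Python, but the same assumption breaks in C, Java or C#, where
# -7 % 3 evaluates to -1.
```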

In the latter case, learning additional things may help you, although stumbling across the right bit of knowledge is a big stroke of luck.

But most of the time, the additional knowledge you need is precisely what the code you wrote is doing. And learning that means studying the code, possibly watching it execute - and that's exactly what debugging is.

It's an interesting concept once you get past the emotional response many of these answers give.

Imagine the question said "never debug; instead, write a test to reproduce the issue and fix the offending code from that". That's closer to my understanding of where the questioner was coming from.
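
A minimal sketch of that reading (the off-by-one function and the test below are hypothetical, not something the questioner wrote), using the standard unittest module: first write a failing test that reproduces the bug, then fix the code until the test passes.

```python
import unittest


def last_n(items, n):
    # Intended: return the last n elements. The slice is wrong and
    # silently drops the final element.
    return items[-n:-1]


class TestLastN(unittest.TestCase):
    def test_returns_the_last_two_elements(self):
        # Reproduces the bug before any fix is attempted; once the slice is
        # corrected to items[-n:], the same test guards against regressions.
        self.assertEqual(last_n([1, 2, 3, 4], 2), [3, 4])


if __name__ == "__main__":
    unittest.main()
```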

It's a nice idea, but in the practical world it's flawed, because bugs often come not from poor code but from poor data (yes, OK, it still counts as poor code because it doesn't handle the data it was given). It's the input that matters, though. Code that works because it received expected data during testing is still working code, but once you deliver it, the sheer complexity of a system means you'll get very unexpected data - especially in circumstances where the data may be correct, but some other data elsewhere causes a conflict. This is why you debug: you try to figure out why your 'working' code fails to work as expected.

A valid alternative is extensive logging; if you put enough in, you usually end up narrowing the cause down, which leaves you with a reduced debugging task - one that still requires some debugging.
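
As a rough sketch of what that might look like (the function names and the discount rule are made up for illustration): leave logging in place so that, when unexpected data turns up in production, the log has already narrowed the search down.

```python
import logging

logging.basicConfig(level=logging.DEBUG,
                    format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("orders")


def apply_discount(price, discount_percent):
    log.debug("apply_discount price=%r discount_percent=%r", price, discount_percent)
    if not 0 <= discount_percent <= 100:
        # Unexpected data from somewhere else in the system: record it loudly
        # instead of silently producing a nonsense price.
        log.warning("discount out of range: %r", discount_percent)
        return price
    result = price * (1 - discount_percent / 100)
    log.debug("apply_discount result=%r", result)
    return result


apply_discount(100.0, 250)  # the warning line points straight at the bad input
```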

Licensed under: CC-BY-SA with attribution