Question

When looking through some code written by another employee, I see a lot of code like this:

do{
    ...
}while(false);

What advantage (if any) does this provide?

Here is more of a skeleton that is happening in the code:

try{
    do{
        // Set some variables

        for(...) {
            if(...) break;
            // Do some more stuff
            if(...) break;
            // Do some more stuff
        }
    }while(false);
}catch(Exception e) { 
    // Exception handling 
}

Update:

C++ Version:
Are do-while-false loops common?

Solution

Maybe it was done to be able to jump out of the "loop" at any time, e.g.:

do
{
    ...
    if (somethingIsWrong) break;
    //more code
    ...
}
while(false);

But as others have said, I wouldn't do it like this.

OTHER TIPS

In Java, there is no reason to do this.
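To expand on that: Java has labeled blocks, and a labeled break jumps straight past the block's closing brace, which is exactly what break inside do { ... } while (false) achieves. A minimal sketch (the class, method, and flag names are illustrative, not from the original code):

```java
public class LabeledBlockDemo {
    static String process(boolean somethingIsWrong) {
        String result = "started";
        // A labeled block: "break work;" jumps past the closing brace,
        // just like "break" inside do { ... } while (false).
        work: {
            if (somethingIsWrong) break work;
            result = "finished";
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(process(true));   // started
        System.out.println(process(false));  // finished
    }
}
```

The label makes the intent explicit, so a reader never has to check whether the "loop" actually loops.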

In C, this is a common idiom when defining macros:

Consider:

#define macro1 doStuff(); doOtherStuff()
#define macro2 do{ doStuff(); doOtherStuff(); } while( false )

if( something ) macro1; // only the first statement in macro1 is executed conditionally
if( something ) macro2; // does what it looks like it does

...but macros in C are evil and should be avoided if at all possible.

Does your coworker come from a C background?

No advantage. Don't do it.

It can be used to jump to the end of a block, which avoids a bunch of nested if/then blocks.

do {
    if (done()) continue;
    // do a bunch of stuff
    if (done()) continue;
    // do a bunch more stuff
    if (done()) continue;
    // do even more stuff
} while (false);
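This works because continue in a do-while jumps to the condition check; since the condition is false, the block simply exits. A runnable sketch of the idea (the method and counter are made up for illustration):

```java
public class ContinueSkipDemo {
    static int stepsRun(boolean doneEarly) {
        int steps = 0;
        do {
            steps++;                  // first bunch of stuff
            if (doneEarly) continue;  // jumps to while (false), so the block exits
            steps++;                  // second bunch of stuff
        } while (false);
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(stepsRun(true));   // 1
        System.out.println(stepsRun(false));  // 2
    }
}
```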

This is used in C to define a block inside a macro. See this for an example.

For example, the following is invalid, because the semicolon in f(x); leaves a stray empty statement after the closing brace, so the else no longer attaches to the if:

#define f(x) { g(x); h(x); }

if (x >= 0) f(x); else f(-x);

but with this definition it works, because do { ... } while(false) is a single statement that consumes the trailing semicolon:

#define f(x) do { g(x); h(x); } while(false)

It is useless here, but the coder could have used multiple break statements as a crude form of error handling:

do {
    if (<something>) { ... } else { break; }
    if (<something else>) { ... } else { break; }
    ...
} while (false);

Granted, it's ugly, but I did find something like this in an old C program once.

Your intuition is right. That is a completely useless thing to do.

It is possible that whoever coded it originally had something other than false as the condition, and simply changed it to false rather than removing the entire block, so as not to lose this "history" of the code. That is just clutching at straws, however. To be frank, it's a plain example of a tautology, which has no place in code.

In pretty much every language other than C/C++ this provides no practical advantage.

In C/C++ there is a case with macros where do/while(false) makes it easier to expand a macro safely into multiple statements. This is advantageous when the macro otherwise looks like a normal function call.

Have a look at this question

The OP asks about this exact construct and explains some reasons for using it. The consensus seems to be that it's a bad thing to do.

As a placeholder for a future when some other condition is put in place of false.

It could be used to skip execution of some code, like a goto. But looking at it again, the if(...) break; statements sit inside a for loop within the do-while, so they break out of the for, not the outer block. Otherwise, I would say it is a Java version of a goto...

I've used this convention for years! It's most useful when you have a "pipeline" of processing and/or conditional checks. It simply avoids multiple levels of nested if() statements and thus makes the code (much) easier to read. Yes there are alternatives like try/catch and I only use this style in certain thin/lower-level situations.

I start the "loop" (never actually loops) with a comment like...

// Error loop (never loops)

The error-loop is...

errorCode = fail;
do {
  if (this)
    break;

  if (that)
    break;

  // Success
  errorCode = ok;

  // Alternative success
  errorCode = doWhatever();
} while (false);

Consider this style if you have a bunch of nested if() statements and your indents go more than 3 levels deep.

I've used this convention in Perl, C/C++, Java and Delphi/Pascal.
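For comparison, the same pipeline can usually be refactored into a method with early returns, which avoids both the nesting and the fake loop. A hedged Java sketch, with invented check names standing in for the "this" and "that" conditions above:

```java
public class PipelineDemo {
    // Hypothetical checks standing in for the pipeline's conditions.
    static boolean checkFailed(boolean flag) { return flag; }

    static String runPipeline(boolean thisFails, boolean thatFails) {
        if (checkFailed(thisFails)) return "fail";
        if (checkFailed(thatFails)) return "fail";
        return "ok"; // success path, no nesting required
    }

    public static void main(String[] args) {
        System.out.println(runPipeline(false, false)); // ok
        System.out.println(runPipeline(true, false));  // fail
    }
}
```

Whether the error-loop or early returns read better is a style call; the early-return form makes the success path the last line of the method, which many Java style guides prefer.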

I'm going to guess that the author didn't trust his code, so he originally ran it a couple of times to see if that made it work better, and this is the archaeological remains of that.

The only reason I can think of is to create a block to make variables declared within the {...} more tightly scoped, but there are better ways of doing this (like creating functions, or just creating blocks - hat tip to Pete Kirkham).
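On that scoping point: Java already allows a bare block for exactly this, with no loop machinery. A minimal sketch (the variable names are illustrative):

```java
public class ScopeBlockDemo {
    static int compute() {
        int total;
        {   // a plain block: temp is visible only inside these braces
            int temp = 40 + 2;
            total = temp;
        }
        // temp is out of scope here; referencing it would not compile
        return total;
    }

    public static void main(String[] args) {
        System.out.println(compute()); // 42
    }
}
```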

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow