Question

This is for a class on Compilers.

I'm running this code:

struct attribute attr = {.token = token, .buffer = *buffer, .length = length, .format = format};
fprintf(stdout, "after struct: %d\n", data.flags);
data.attributes[i] = attr;
fprintf(stdout, "after data.attributes[i] = attr: %d\n", data.flags);

inside a function with this prototype:

int attribute( int token, char *buffer, unsigned int length, int format);

and it writes into an object of this struct type:

#define DATA    struct data
DATA
{
//define the scanner attribute table
#define MAX_ATTRIBUTES  128
    ATTRIBUTE attributes[MAX_ATTRIBUTES];
    unsigned int index;
    int column;
    int flags;
#define FLAGS_ECHO      0x0001
#define FLAGS_DEBUG     0x0002
#define FLAGS_PARSE     0x0004
#define FLAGS_SYMBOL    0x0008
#define IS_FLAGS_ECHO(a)    (a & FLAGS_ECHO)    
#define SET_FLAGS_ECHO(a)   (a |= FLAGS_ECHO)
#define CLR_FLAGS_ECHO(a)   (a &= ~FLAGS_ECHO)
#define IS_FLAGS_DEBUG(a)   (a & FLAGS_DEBUG)   
#define SET_FLAGS_DEBUG(a)  (a |= FLAGS_DEBUG)
#define CLR_FLAGS_DEBUG(a)  (a &= ~FLAGS_DEBUG)
#define IS_FLAGS_PARSE(a)   (a & FLAGS_PARSE)   
#define SET_FLAGS_PARSE(a)  (a |= FLAGS_PARSE)
#define CLR_FLAGS_PARSE(a)  (a &= ~FLAGS_PARSE)
#define IS_FLAGS_SYMBOL(a)  (a & FLAGS_SYMBOL)
#define SET_FLAGS_SYMBOL(a) (a |= FLAGS_SYMBOL)
#define CLR_FLAGS_SYMBOL(a) (a &= ~FLAGS_SYMBOL)
};

The ATTRIBUTE struct is here:

#define ATTRIBUTE       struct attribute
ATTRIBUTE{
    int token;
#define MAX_BUFFER_SIZE     128
    char buffer[MAX_BUFFER_SIZE];
    int length;
#define FORMAT_NONE         0
#define FORMAT_CHAR         1
#define FORMAT_DECIMAL      2
#define FORMAT_HEXIDECIMAL  3
#define FORMAT_OCTAL        4
#define FORMAT_FLOAT        5
    int format;
};

The problem is that the first time that bit of code runs, data.flags gets reset from 8 to 0 across the assignment data.attributes[i] = attr (that's what the two fprintf calls show).

The formatting and the (at least to me) seemingly excessive use of macros are per the professor, so I didn't change them.

Does anyone know why this would be happening? Please let me know if you need more information than I've given.

EDIT:

It seems that the problem is coming from the loop where I determine i:

for(i = sizeof(data.attributes)/sizeof(data.attributes[0]) - 1; i >= 0; i--){
    if(&(data.attributes[i]) != NULL) break;
}
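
(In hindsight, this check can never fail: &(data.attributes[i]) is the address of an element inside the array, which is never NULL, so the loop breaks on its very first iteration. An annotated sketch of what the loop actually does:)

for(i = sizeof(data.attributes)/sizeof(data.attributes[0]) - 1; i >= 0; i--){
    if(&(data.attributes[i]) != NULL)   /* always true */
        break;                          /* so this runs immediately */
}
/* i is always MAX_ATTRIBUTES - 1 (127) here, no matter which slots are in use */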

Question Answered Edit:

Thanks to the information in Zan Lynx's answer, I figured out that, rather than the way I had been doing it, I needed to initialize the elements of the attributes array up front, like this:

int i;
struct attribute null_attr = {.token = -1, .format = -1, .length = -1};
for(i = 0; i < sizeof(data.attributes)/sizeof(data.attributes[0]); i++){
    data.attributes[i] = null_attr;
}

and then find the last initialized attribute (the slot after it is the earliest uninitialized one) like this:

for(i = sizeof(data.attributes)/sizeof(data.attributes[0]) - 1; i >= 0; i--){
    if(data.attributes[i].token != -1
      && data.attributes[i].length != -1
      && data.attributes[i].format != -1)
        break;
}
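
(For completeness, here is a small self-contained sketch of the same sentinel idea, except that it scans forward for the first free slot instead of backward; the add_attribute helper and the tiny table size are just for illustration and aren't part of my actual scanner:)

#include <stdio.h>
#include <string.h>

#define MAX_ATTRIBUTES  4       /* tiny for the demo; 128 in the real table */
#define MAX_BUFFER_SIZE 128

struct attribute {
    int token;
    char buffer[MAX_BUFFER_SIZE];
    int length;
    int format;
};

static struct attribute attributes[MAX_ATTRIBUTES];

/* mark every slot with the -1 sentinels, as in the fix above */
static void init_attributes(void)
{
    struct attribute null_attr = {.token = -1, .format = -1, .length = -1};
    size_t i;
    for(i = 0; i < sizeof(attributes)/sizeof(attributes[0]); i++)
        attributes[i] = null_attr;
}

/* store attr in the first slot still holding the sentinels;
   returns the index used, or -1 if the table is full */
static int add_attribute(struct attribute attr)
{
    size_t i;
    for(i = 0; i < sizeof(attributes)/sizeof(attributes[0]); i++){
        if(attributes[i].token == -1
          && attributes[i].length == -1
          && attributes[i].format == -1){
            attributes[i] = attr;
            return (int)i;
        }
    }
    return -1;
}

int main(void)
{
    struct attribute a = {.token = 1, .length = 3, .format = 2};
    strcpy(a.buffer, "foo");

    init_attributes();
    printf("stored at %d\n", add_attribute(a));  /* prints 0 */
    printf("stored at %d\n", add_attribute(a));  /* prints 1 */
    return 0;
}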

Solution

We really need a reproducible example in a single code file if at all possible. Perhaps post a link to some code that compiles at ideone.com?

If this is really happening when i is 0 (please verify that), then I would guess your struct attribute is somehow getting defined with different sizes in different compilation units. Is that possible?

I've seen that in other code before, and it caused no end of confusion. In that case it was caused by a #define symbol that was picking up a different value in different Makefiles.
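
To make that concrete, here is a hypothetical single-file sketch of the failure mode (the names and sizes are invented). In the real bug both definitions would be spelled struct attribute in two different .c files, each seeing a different value of the buffer-size macro, so the two files would disagree about the array stride and about where data.flags lives:

#include <stdio.h>

#define BUFFER_SIZE_A 128   /* value one translation unit happens to see */
#define BUFFER_SIZE_B 64    /* value another one gets, e.g. from a different Makefile */

struct attribute_a { int token; char buffer[BUFFER_SIZE_A]; int length; int format; };
struct attribute_b { int token; char buffer[BUFFER_SIZE_B]; int length; int format; };

int main(void)
{
    /* the same "logical" struct, two different layouts: typically 140 vs 76 bytes */
    printf("size as seen by unit A: %zu\n", sizeof(struct attribute_a));
    printf("size as seen by unit B: %zu\n", sizeof(struct attribute_b));
    /* an assignment like data.attributes[i] = attr compiled against the wrong
       layout writes over whatever follows the array -- e.g. data.flags */
    return 0;
}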

I decided to put a reproducible example together myself.
