Question

I'm writing a buffer to a binary file. The code is as follows:

FILE *outwav = fopen(outwav_path, "wb");
if(!outwav)
{
    fprintf(stderr, "Can't open file %s for writing.\n", outwav_path);
    exit(1);
}

[...]

//Create sample buffer
short *samples = malloc((loopcount*(blockamount-looppos)+looppos) << 5);
if(!samples)
{
    fprintf(stderr, "Error : Can't allocate memory.\n");
    exit(1);
}

[...]

fwrite(samples, 2, 16*k, outwav);   // write sample buffer to file

fflush(outwav);
fclose(outwav);
free(samples);

The last free() call causes random segfaults. After several headaches I thought it was probably because the fwrite call would execute only after a delay and would then read the freed memory, so I added the fflush call, yet the problem still occurs.

The only way to get rid of it is to not free the memory and let the OS do it for me. That is supposed to be bad practice, though, so I'd rather ask whether there is a better solution.

Before anyone asks: yes, I check that the file is opened correctly; yes, I test that the memory is allocated properly; and no, I don't touch the returned pointers in any way.


Solution

Once fwrite returns, you are free to do whatever you want with the buffer. You can remove the fflush call.
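To illustrate, here is a minimal, self-contained sketch (the file name and buffer size are made up for the example). fwrite has already copied the bytes out of your buffer by the time it returns, so freeing the buffer immediately afterwards, even before fclose, is safe:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    FILE *out = fopen("demo.bin", "wb");
    if (!out)
    {
        perror("fopen");
        return 1;
    }

    short *samples = malloc(1024 * sizeof *samples);
    if (!samples)
    {
        fclose(out);
        return 1;
    }
    memset(samples, 0, 1024 * sizeof *samples);   // pretend these are real samples

    fwrite(samples, sizeof *samples, 1024, out);  // the data is copied here
    free(samples);                                // safe: fwrite no longer needs the buffer
    fclose(out);                                  // flushes and closes the stream
    return 0;
}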

It sounds like a buffer overflow in a totally unrelated part of the program is writing over the bookkeeping information that free needs to do its work. Run your program under a tool like valgrind to find out whether this is the problem and to locate the part of the program that has the buffer overflow.
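For illustration only (this is not your code), the pattern usually looks like the snippet below: an out-of-bounds write on one heap block can silently damage the allocator's bookkeeping, and the program only crashes later, inside free. Running the program under valgrind (e.g. valgrind ./a.out, which uses the default memcheck tool) reports the invalid write at the line that performs it:

#include <stdlib.h>

int main(void)
{
    short *buf = malloc(16 * sizeof *buf);
    if (!buf)
        return 1;

    // Bug: i == 16 writes one element past the end of the block,
    // which can overwrite the heap bookkeeping stored next to it.
    for (int i = 0; i <= 16; i++)
        buf[i] = 0;

    free(buf);   // the corruption tends to surface here, not at the write above
    return 0;
}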

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow