Assuming you don't care about the casting style of the compiler and the machine, is there any appreciable difference between:
#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

static int64_t tosigned(void *p)
{
    return *(int64_t *)p;
}

int main(void)
{
    int64_t  i = 0xfabf00d0badf00d;
    uint64_t u = 0xabad1deacafebabe;
    printf("%" PRId64 "\n" "%" PRId64 "\n", tosigned(&i), tosigned(&u));
    return 0;
}
And this?
#include <stdint.h>
#include <inttypes.h>
#include <stdio.h>

static int64_t tosigned(void *p, int isunsigned)
{
    return isunsigned ? (int64_t)*(uint64_t *)p : *(int64_t *)p;
}

int main(void)
{
    int64_t  i = 0xfabf00d0badf00d;
    uint64_t u = 0xabad1deacafebabe;
    printf("%" PRId64 "\n" "%" PRId64 "\n", tosigned(&i, 0), tosigned(&u, 1));
    return 0;
}
Assembly dumps using clang -S -std=c99 -Wall -Wextra at -O2 and higher show no difference in output. However, I wanted to know whether there is a more "correct" way to do it, or whether the first method runs into undefined or implementation-defined behavior.
I assume that casting to a pointer to the unsigned version of the same type would behave the same as casting the result of the pointer dereference.
Background:
I was writing an int64 library for Lua and came across this academic question when doing typecasts for arithmetic.
I store int64_t and uint64_t userdata (opaque C blobs managed by the Lua runtime) on the Lua stack. Metamethods such as __add are assigned to perform arithmetic for x + y instead of raising a type error, since you can't (normally) perform arithmetic on non-numbers. However, the signed and unsigned values are stored as different types and are thus not directly compatible, so I convert the second argument, casting it to the same signedness as the first. The isunsigned parameter roughly answers "does this value have the unsigned type's metatable?". The runtime returns a void pointer to the stored value, and how you convert it is up to you.