Question

This builds on a previously asked question. Here, however, I'm given a hexadecimal input with a maximum value of 0xFFFF, and I need it converted to binary, so I'd end up with at most 16 bits.

I was wondering whether this would be quite simple using 'bitset'. Any ideas?

EDIT :

After getting the answers, I put together a revised piece of code here: http://pastebin.com/f7a6f0a69


Solution

Supposing by "hexadecimal input" you mean a string containing a hexadecimal number, then this would work:

#include <iostream>
#include <sstream>
#include <stdexcept>

const char* const str = "0xFFFF";
std::istringstream iss(str);
int i = 0;
iss >> std::hex >> i;  // std::hex accepts an optional "0x" prefix
if(!iss && !iss.eof()) throw std::runtime_error("failed to parse hex input");
std::cout << '"' << str << "\": " << std::dec << i << " (0x" << std::hex << i << ")\n";
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow