Question

I'm getting a signed payload from an authentication source that arrives base64-encoded and URL-encoded. I'm getting confused somewhere while decoding, and ending up with similar data in different 'formats'.

Here's my code:

//raw_message is the "payload.signature" token; hmacsha256 is an initialized HMACSHA256 instance
//Split the message into payload and signature
string[] split = raw_message.Split('.');

//Payload
string base64_payload = WebUtility.UrlDecode(split[0]);
byte[] payload = Convert.FromBase64String(base64_payload);

//Expected signature
string base64_expected_sig = WebUtility.UrlDecode(split[1]);
byte[] expected_sig = Convert.FromBase64String(base64_expected_sig);

//Signature
byte[] signature = hmacsha256.ComputeHash(payload);

//Output as a string
var foo = System.Text.Encoding.UTF8.GetString(expected_sig);
var bar = BitConverter.ToString(signature);

The expected signature (foo) comes out like so: 76eba09fcb54877299dcbd1e1e35717e3bd42e066e7ecdb131c7d0161dec3418

The computed signature (bar) is as follows:

76-EB-A0-9F-CB-54-87-72-99-DC-BD-1E-1E-35-71-7E-3B-D4-2E-06-6E-7E-CD-B1-31-C7-D0-16-1D-EC-34-18

Obviously, when comparing byte for byte, this doesn't work.

I see that I'm having to convert the expected_sig and the signature in different ways to get them to display as a string, but I can't figure out how I need to change the expected signature to get to where I can compare byte for byte.

I can obviously work around the issue by simply converting the string bar, but that's dirty and I just don't like it.

Where am I going wrong here? What am I not understanding?


Solution

The good news is that the hash computation appears to be working.

The bad news is that you're receiving the hash in a brain-dead fashion. For some reason it seems that the authors decided it was a good idea to:

  • Compute the hash (fine)
  • Convert this binary data to text as hex (fine)
  • Convert the hex back into binary data by applying ASCII/UTF-8/anything-ASCII-compatible encoding (why?)
  • Convert the result back into text using base64 (what?)
  • URL-encode the result (which wouldn't even be necessary with hex...)

Using either base64 or hex on the original binary makes sense, but applying both is crazy.
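To make the layering concrete, here's that pipeline sketched in Python (purely for illustration — the key and message below are made up, and the point is the chain of encodings, not the specific values):

```python
import base64
import hashlib
import hmac
import urllib.parse

key = b"secret-key"           # made-up key
payload = b"example payload"  # made-up message

digest = hmac.new(key, payload, hashlib.sha256).digest()  # raw 32-byte hash (fine)
hex_text = digest.hex()                                   # binary -> hex text (fine)
hex_bytes = hex_text.encode("ascii")                      # hex text -> bytes again (why?)
b64_text = base64.b64encode(hex_bytes).decode("ascii")    # those bytes -> base64 text (what?)
wire = urllib.parse.quote(b64_text)                       # URL-encode (unnecessary with hex)
```

Decoding `wire` with URL-decode then base64-decode gets you back to `hex_text` — a 64-character hex string, not the 32 raw hash bytes, which is exactly the mismatch in the question.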

Anyway, it's fairly easy for you to do the same thing. For example:

//Requires: using System.Linq; (for Select)
string hexSignature = string.Join("", signature.Select(b => b.ToString("x2")));
byte[] hexSignatureUtf8 = Encoding.UTF8.GetBytes(hexSignature);
string finalSignature = Convert.ToBase64String(hexSignatureUtf8);

That should now match WebUtility.UrlDecode(split[1]).

Alternatively, you can work backwards from what's in the result, but I wouldn't go as far as parsing the hex back to bytes - it would be simpler to keep the first line of the above, but use:

string expectedHexBase64 = WebUtility.UrlDecode(split[1]);
byte[] expectedHexUtf8 = Convert.FromBase64String(expectedHexBase64);
string expectedHex = Encoding.UTF8.GetString(expectedHexUtf8);

Then compare it with hexSignature.
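Worked end to end, the verify side is just the mirror image: peel off the URL-encoding and base64 layers, and compare hex string to hex string. Sketched in Python for illustration (the key and payload are made up; the wire value is simulated the way the sender apparently builds it):

```python
import base64
import hashlib
import hmac
import urllib.parse

key = b"secret-key"           # made-up key
payload = b"example payload"  # made-up message

# Simulate what the sender would transmit for this payload.
expected_hex = hmac.new(key, payload, hashlib.sha256).hexdigest()
wire = urllib.parse.quote(base64.b64encode(expected_hex.encode("ascii")).decode("ascii"))

# Receiver: reverse the layers, then compare hex-to-hex.
received_hex = base64.b64decode(urllib.parse.unquote(wire)).decode("ascii")
computed_hex = hmac.new(key, payload, hashlib.sha256).hexdigest()
match = hmac.compare_digest(received_hex, computed_hex)  # constant-time comparison
```

(`hmac.compare_digest` avoids leaking timing information; in .NET you'd compare the two hex strings with an equivalent fixed-time check rather than early-exit equality.)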

Ideally, you should talk to whoever's providing you with the crazy format and hit them with a cluestick though...

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow