I wouldn't worry much about whether the JSON message itself is UTF-8 encoded. The encoding of the message does not affect the encoding of its contents.
Let's assume we want to transmit a 256-character string that contains each byte value (0x00–0xFF) exactly once:
my $string = join '', map chr, 0x00 .. 0xFF;
When we encode and decode the message, we end up with an equivalent string again:
use JSON;
my $message = encode_json { str => $string };
my $new_string = (decode_json $message)->{str};
$new_string eq $string or die "The strings aren't equal";
While the strings are equal, they may not have the same internal representation: the decoded string usually carries Perl's internal UTF8 flag. We can normalize that by “downgrading” the new string:
# utf8::downgrade() is a core function and needs no "use utf8;"
# (that pragma only marks the source file itself as UTF-8).
# This may die if $new_string contains characters above 0xFF,
# i.e. outside the native single-byte encoding.
utf8::downgrade($new_string);
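To make the difference visible, we can inspect Perl's internal UTF8 flag with `utf8::is_utf8` (a minimal sketch; the flag is an implementation detail, not part of the string's value, so whether it is set after decoding may vary between JSON backends):

```perl
use strict;
use warnings;
use JSON;

my $string = join '', map chr, 0x00 .. 0xFF;

my $message    = encode_json { str => $string };
my $new_string = (decode_json $message)->{str};

# Equal as character strings...
die "The strings aren't equal" unless $new_string eq $string;

# ...but the decoded copy typically carries the internal UTF8 flag:
printf "decoded flag: %s\n", utf8::is_utf8($new_string) ? "on" : "off";

# After downgrading, the string uses the single-byte representation
# and still compares equal to the original.
utf8::downgrade($new_string);
printf "after downgrade: %s\n", utf8::is_utf8($new_string) ? "on" : "off";
die "The strings aren't equal after downgrade" unless $new_string eq $string;
```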
How and why this works is perfectly well defined, but armoring the payload with an ASCII-only encoding such as Base64 is admittedly preferable:
use MIME::Base64;
use JSON;
my $string = join '', map chr, 0x00 .. 0xFF;
my $message = encode_json { str => encode_base64 $string };
my $new_string = decode_base64 decode_json($message)->{str};
$string eq $new_string or die "The strings aren't equal";
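As a sanity check, the armored message really is plain ASCII, so no transport re-encoding can change its meaning (a quick sketch):

```perl
use strict;
use warnings;
use JSON;
use MIME::Base64;

my $string  = join '', map chr, 0x00 .. 0xFF;
my $message = encode_json { str => encode_base64 $string };

# Base64 output is ASCII, so every byte of the JSON message
# falls in the ASCII range.
die "non-ASCII byte in message" if $message =~ /[^\x00-\x7F]/;
print "message is ASCII-only\n";
```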