If you use 7 bits per number, you get 0-127. Whenever you get a number of 100 or greater, you have to discard it. You lose the use of that data point, but the values you keep are still uniformly random. You lose 28 of every 128, or about 22% of the random information.
If you use 20 bits at a whack, you get a number between 0 and 1,048,575. This can be broken into three random values between 0 and 99 (or 1 to 100 if you add 1 to each). You have to use integer arithmetic, throwing away any fractional part when dividing.
if (number >= 1000000) discard it;
a = number % 100;
b = (number / 100) % 100;
c = (number / 10000) % 100;
You only waste 48,576 values out of 1,048,576, or about 5% of the random information.
You can think of this process this way. Take the number you get by converting 20 bits to a decimal integer. Break out the 10's and 1's digits, the 1000's and 100's digits, and the 100,000's and 10,000's digits, and use those as three random numbers. They are truly random, since those digits could be any value at all in the original number. Further, we discarded any values that would bias particular values of the three.
So there's a way to make more efficient use of the random bits. But you have to do some computing.
Note: The next interesting bit combination is 27 bits, which yields four two-digit values (keep numbers below 100,000,000 out of 134,217,728) and wastes about 25%. 14 bits, yielding two values, would waste about 39%.