Question

If I run the following code:

import java.util.BitSet;

BitSet test = new BitSet(20);
System.out.println("Bitset size is " + test.size());
System.out.println("Bitset length is " + test.length());

I get the output:

Bitset size is 64
Bitset length is 0

Which makes sense now that I look closer at the documentation (size() returns the storage actually in use, including implementation overhead, while length() returns one past the index of the highest set bit, and all bits default to false), but is not what I want.

Since the BitSet I actually use can have varying lengths, I want to be able to get back the number of bits represented (ignoring whether they are set). Is there a way to do that (hopefully with a built-in method)?

I realize I could flip all the bits and then call length(), or track the variable I used to instantiate the BitSet, but I feel like both would require several lines of comments explaining my reasoning, and I was hoping for something a bit more self-documenting.
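For illustration, the tracking approach would amount to a small wrapper like this (a minimal sketch; the name SizedBitSet and its methods are hypothetical, not part of the JDK):

import java.util.BitSet;

// Hypothetical wrapper that remembers the logical number of bits,
// independent of which bits are actually set.
class SizedBitSet {
    private final BitSet bits;
    private final int nbits;

    SizedBitSet(int nbits) {
        this.bits = new BitSet(nbits);
        this.nbits = nbits;
    }

    // The number of bits this set logically represents.
    int logicalLength() { return nbits; }

    void set(int index) { bits.set(index); }
    boolean get(int index) { return bits.get(index); }
}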

Was it helpful?

Solution

A BitSet will automatically expand to a size sufficient to represent the highest bit set in it; the initial size is simply a hint as to how many bits you intend to use. So your question as posed makes no sense.
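A short demonstration of that expansion (the printed values are what stock OpenJDK produces, where size() reflects whole 64-bit words):

import java.util.BitSet;

public class BitSetGrowth {
    public static void main(String[] args) {
        BitSet test = new BitSet(20);
        test.set(100);                      // far beyond the initial hint of 20
        System.out.println(test.length());  // 101: one past the highest set bit
        System.out.println(test.size());    // 128 on stock OpenJDK: two 64-bit words
    }
}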

Other tips

I did some testing on my Mac and it appears that the BitSet size() wants to be a multiple of 64 (note it is size(), not length(), that behaves this way; length() only tracks the highest set bit). So, for your requested 20 bits, it defaulted to a size of 64. If you construct it with 65 bits, you should see a size of 128, and 129 should give you 192. This gives you a way to calculate the 'true' allocated capacity of your BitSet.
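A quick check of that pattern (results from stock OpenJDK, where the backing storage is an array of 64-bit longs):

import java.util.BitSet;

public class SizeCheck {
    public static void main(String[] args) {
        System.out.println(new BitSet(20).size());   // 64
        System.out.println(new BitSet(65).size());   // 128
        System.out.println(new BitSet(129).size());  // 192
    }
}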

BitSet internally uses long values (64 bits each), so the size() method reports the number of bits allocated, rounded up to a multiple of 64.
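If you only need that rounded-up capacity without constructing a BitSet, the same arithmetic can be done directly (a sketch; allocatedBits is a hypothetical helper, not a library method):

// Capacity a BitSet would allocate for nbits, rounded up to whole 64-bit words.
static int allocatedBits(int nbits) {
    return ((nbits + 63) / 64) * 64;
}

// allocatedBits(20)  -> 64
// allocatedBits(65)  -> 128
// allocatedBits(129) -> 192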

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow