Description
I am unhappy with the fact that we have bit operations and casts to 32 bit (and similar) on number types.
In the presence of arbitrary-length integers, these operations are problematic.
On the one hand, they reveal the internal representation (two's complement).
But perhaps more importantly, their semantics are problematic with respect to arbitrary length.
For most algorithms that use bit operations, we want fixed-length bit representations.
Examples of bit vectors and bit sets can be found, for instance, in Common Lisp, Java, and Scala.
I am wondering whether 32-bit-wide bit arrays are sufficient for many of the things we want to do (i.e., the benchmarks we care about; there was a new one for F/J which I could not yet implement). If that's the case, we could map them directly to Java's `int`, which we currently don't use for anything.
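If 32 bits suffice, the mapping to Java's `int` is direct. A minimal sketch of what such a fixed-width bit array could look like (class and method names are hypothetical, not from this issue):

```java
// Hypothetical sketch: a fixed-size, 32-bit BitArray backed by a plain Java int.
// All operations stay within the fixed width, so no arbitrary-length semantics leak in.
public final class BitArray32 {
    private final int bits;  // the 32 bits, stored directly in a Java int

    public BitArray32(int bits) { this.bits = bits; }

    public boolean get(int index)      { return (bits >>> index & 1) != 0; }
    public BitArray32 set(int index)   { return new BitArray32(bits | (1 << index)); }
    public BitArray32 clear(int index) { return new BitArray32(bits & ~(1 << index)); }
    public BitArray32 and(BitArray32 other) { return new BitArray32(bits & other.bits); }
    public BitArray32 or(BitArray32 other)  { return new BitArray32(bits | other.bits); }
    public int toInt() { return bits; }
}
```

The sketch is written immutably here purely for illustration; the mutable variant is the same code with `bits` made non-final.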
Another question about the design is whether we should treat them as objects or values.
Do we need/want identity? Is it useful to have a reference to the changing `BitArray`?
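To illustrate what is at stake in the object-vs-value question: with reference semantics, a mutation is visible through every alias, so identity matters; with value semantics, every operation yields a fresh value and there is nothing changing to hold a reference to. A small Java sketch of the contrast (using Java's mutable `BitSet` and a plain `int` as stand-ins):

```java
import java.util.BitSet;

public class IdentityDemo {
    public static void main(String[] args) {
        // Reference semantics: two variables alias the same mutable object,
        // so the change made through one is visible through the other.
        BitSet shared = new BitSet();
        BitSet alias  = shared;             // same object, same identity
        shared.set(5);
        System.out.println(alias.get(5));   // true: mutation visible through the alias

        // Value semantics: "mutation" produces a new value; the original
        // is unchanged, and identity plays no role.
        int value   = 0;
        int updated = value | (1 << 5);     // fresh value; 'value' stays 0
        System.out.println(value == updated);  // false
    }
}
```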
And finally, the name: `BitArray` is my current choice; it is an array of bits (fixed size). Or do we need a `BitVector`? `BitSet` as in Java or Scala doesn't make sense to me. It's not a set of bits, is it? The Java doc even says it's a vector of bits. `BitMap` would be better than `BitSet`.
Tension between `BitArray` and `BitVector`:

- `<<` operation semantics: wrapping at the current size, or extending? And for `>>`: what do we shift in?
- we could of course have both extending and non-extending (i.e., wrapping) operations
- however, the main use case is porting algorithms that do word-sized operations
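The difference between the two shift semantics shows up directly in Java: `int` shifts are wrapping (bits shifted past bit 31 are dropped), while `BigInteger` shifts are extending (the representation simply grows). The `>>` question also appears there as the choice between sign-extending `>>` and zero-filling `>>>`. A small illustration, not code from this issue:

```java
import java.math.BigInteger;

public class ShiftSemantics {
    public static void main(String[] args) {
        // Fixed-width (wrapping): shifting past the word size drops bits.
        int word = 1 << 31;                    // only bit 31 set
        System.out.println(word << 1);         // 0 -- the bit fell off the 32-bit word

        // Arbitrary-length (extending): the representation just grows.
        BigInteger big = BigInteger.ONE.shiftLeft(31);
        System.out.println(big.shiftLeft(1));  // 4294967296 -- no bits are lost

        // And for the right shift, the "what do we shift in?" question:
        System.out.println(-8 >> 1);           // -4          (sign bits shifted in)
        System.out.println(-8 >>> 1);          // 2147483644  (zeros shifted in)
    }
}
```

This is exactly why the word-sized-algorithm use case favors wrapping semantics: ported code typically assumes bits silently disappear at the word boundary.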