#other-languages
2022-01-09
quoll 20:01:52

Just to vent for a moment… 53-bit integers that can only (easily) be bit-manipulated as 32-bit integers are a really tough paradigm to work around sometimes. Consider this Java:

static final long LONG_MASK = 0xFFFFFFFFL;  // zero-extension mask for 32-bit values

int x, y;
long carry, ylong = y & LONG_MASK;
...
long product = ylong * (x & LONG_MASK) + carry;
This takes a pair of 32-bit ints (`x` and `y`), multiplies them, and adds in a carry value, producing a 64-bit result. Doing the same in JavaScript (and hence, ClojureScript):
function multiplyCarryInt(a, b, carry) {
    // split each 32-bit operand into 16-bit halves
    const al = 0xFFFF & a;
    const ah = a >>> 16;
    const bl = 0xFFFF & b;
    const bh = b >>> 16;

    // the four 16x16 -> 32-bit partial products
    const blal = bl * al;
    const blah = bl * ah;
    const bhal = bh * al;
    const bhah = bh * ah;

    // accumulate; the partial sums can exceed 32 bits, so carry with
    // Math.floor division rather than >>>, which would truncate to 32 bits
    const p0 = blal + carry;
    const p1 = blah + bhal + Math.floor(p0 / 0x10000);
    return [bhah + Math.floor(p1 / 0x10000), (p1 << 16) | (p0 & 0xFFFF)];
}
var [productHigh, productLow] = multiplyCarryInt(y, x, carry);
This is grade-school math. Apparently I can do it with one fewer multiply, but I’m still learning Karatsuba.
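
For reference, the three-multiply version would look something like the sketch below (illustrative only, mirroring multiplyCarryInt above; not code from the original conversation):

// Karatsuba on the 16-bit halves: three multiplies instead of four
function multiplyCarryIntKaratsuba(a, b, carry) {
    const al = 0xFFFF & a, ah = a >>> 16;
    const bl = 0xFFFF & b, bh = b >>> 16;

    const lo = al * bl;                          // low partial product
    const hi = ah * bh;                          // high partial product
    const mid = (al + ah) * (bl + bh) - lo - hi; // equals al*bh + ah*bl, using one multiply

    const p0 = lo + carry;
    const p1 = mid + Math.floor(p0 / 0x10000);
    return [hi + Math.floor(p1 / 0x10000), (p1 << 16) | (p0 & 0xFFFF)];
}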

quoll 20:01:29

Anyway, I’m slowly making progress. But it feels like JavaScript really doesn’t like numbers

andy.fingerhut 20:01:06

I am guessing this is because its integers are just 64-bit IEEE floats with a 0 mantissa?

quoll 20:01:39

Nearly 🙂 Integers are any value where the exponent (not the mantissa) is 0 or more. (Exponents are actually stored as a “biased” value, so exponent=0 is actually stored with the bit pattern corresponding to 1023 = 0x3FF.) Integers can be stored exactly up to the maximum “safe” integer value: 9007199254740991, or 0x1FFFFFFFFFFFFF — i.e. 53 bits.
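
A quick way to poke at those exponent bits (a sketch, not from the conversation; float64Bits is just an illustrative helper built on a DataView):

// peek at the sign/exponent word of a Number's IEEE-754 representation
function float64Bits(n) {
    const view = new DataView(new ArrayBuffer(8));
    view.setFloat64(0, n);
    const hi = view.getUint32(0);                // sign + 11-bit exponent + top of mantissa
    const biasedExponent = (hi >>> 20) & 0x7FF;
    return { biasedExponent, exponent: biasedExponent - 1023 };
}

float64Bits(1)                  // { biasedExponent: 1023, exponent: 0 }
float64Bits(9007199254740991)   // { biasedExponent: 1075, exponent: 52 }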

quoll 20:01:36

But (and here’s where it gets weird), JavaScript also manages integers greater than this. It does this by increasing the exponent. As a result, it has to “skip” values.

quoll 20:01:03

for (var i = 0; i < 16; i++) console.log( 9007199254740991 + i);
9007199254740991
9007199254740992
9007199254740992
9007199254740994
9007199254740996
9007199254740996
9007199254740996
9007199254740998
9007199254741000
9007199254741000
9007199254741000
9007199254741002
9007199254741004
9007199254741004
9007199254741004
9007199254741006

quoll 20:01:17

Basically, if you’re looking at numbers that large, then you shouldn’t consider them as integers. They’re better thought of as floating point, with a precision that doesn’t extend down to the units

quoll 20:01:35

But JavaScript’s Number.isInteger(…) returns true for all of those numbers.
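
For what it’s worth, Number.isSafeInteger does distinguish them (a quick illustration, not from the original conversation):

Number.isInteger(9007199254740993)      // true  (it is stored as ...992, which is an integer)
Number.isSafeInteger(9007199254740993)  // false
Number.isSafeInteger(9007199254740991)  // true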

quoll 20:01:10

But now that I have all of that off my chest… it DOES handle 32-bit values just fine (sort of). Which means that we can do anything, including 64-bit multiplications. It just takes work.
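
As a rough sanity check of that claim (illustrative only — this reuses the multiplyCarryInt sketch above and assumes an engine with BigInt support):

const [hi, lo] = multiplyCarryInt(0xFFFFFFFF | 0, 0xFFFFFFFF | 0, 0);
// reassemble the two 32-bit words and compare against a BigInt multiply
BigInt(hi >>> 0) * 0x100000000n + BigInt(lo >>> 0) === 0xFFFFFFFFn * 0xFFFFFFFFn   // true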

quoll 20:01:23

I was just venting about the work 🙂

andy.fingerhut 21:01:51

Oh, vent away. The language not having 64-bit integers has led you to work around it by building them out of 32-bit integer ops on half of a 64-bit IEEE float.

andy.fingerhut 21:01:14

I'm sure N-thousand developers have looked at this situation and wondered how much work it would be to add true integer types to JavaScript, and probably gave up after seeing how deeply the existing number implementation is embedded in the libraries out there.

quoll 21:01:54

That “half of a 64-bit IEEE float” is actually where I’m working. As soon as you apply a bit operation, the number is instantly truncated to a signed 32-bit integer.

quoll 21:01:28

For instance, here is a 33-bit integer:

> 0x1ffffffff
8589934591
And here is that same number after applying a no-op bit operation to it:
> 0x1ffffffff | 0
-1

quoll 21:01:38

i.e. it got truncated to the 32-bit value 0xffffffff, and then converted to a signed 32-bit value

quoll 21:01:45

So JavaScript actually has another numerical type: the 32-bit signed int. The problem is that any arithmetic (as opposed to logical) operation on it will promote it to a 53-bit signed int
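
A small illustration of both behaviours (not from the conversation, same REPL style as above):

0x1ffffffff | 0            // -1          (ToInt32: truncated to signed 32-bit)
(0x1ffffffff | 0) >>> 0    // 4294967295  (same 32 bits, read back as unsigned)
(0x7fffffff | 0) + 1       // 2147483648  (arithmetic escapes 32-bit land: the result is an ordinary double)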

quoll 21:01:56

actually… I don’t know that it’s stored that way, or if it’s just treated as 32 bit during the operation. The end effect is the same though

andy.fingerhut 20:01:51

A programming language with no true native integer type is just odd to me.