Have you performed simple arithmetic operations like 0.1 + 0.2? You might have gotten something strange: 0.1 + 0.2 = 0.30000000000000004.
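A quick console session shows the effect; the epsilon comparison at the end is one common workaround (not the only one), using the built-in `Number.EPSILON`:

```javascript
// 0.1 and 0.2 have no exact binary representation, so their sum
// picks up rounding error visible in the 17th significant digit.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false

// Common workaround: compare within a small tolerance instead of exactly.
console.log(Math.abs((0.1 + 0.2) - 0.3) < Number.EPSILON);  // true
```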

  • aubeynarf@lemmynsfw.com · 3 months ago

    JavaScript is truly a bizarre language - we don’t need to go as far as arbitrary-precision decimal, it does not even feature integers.

    I have to wonder why it ever makes the cut as a backend language.

    • luciole (he/him)@beehaw.org · 3 months ago

      The JavaScript Number type is implemented as an IEEE 754 double, and as such any integer between -2^53 and 2^53 is represented without loss of precision. I can’t say I’ve ever missed explicitly declaring a value as an integer in JS. It’s dynamically typed anyways. There’s the languages people complain about and the ones nobody uses.
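A quick sketch of that 2^53 boundary, using the standard `Number.MAX_SAFE_INTEGER` and `Number.isSafeInteger` helpers:

```javascript
// Every integer with magnitude up to 2**53 fits exactly in a double.
console.log(Number.MAX_SAFE_INTEGER);  // 9007199254740991, i.e. 2**53 - 1

// Past that, consecutive integers are no longer distinguishable:
console.log(2 ** 53 === 2 ** 53 + 1);        // true
console.log(Number.isSafeInteger(2 ** 53));  // false
```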

      • aubeynarf@lemmynsfw.com · 3 months ago

        And then JSON doesn’t restrict numbers to any range or precision, so when I deal with JSON values I feel the need to represent them as a BigDecimal or similar arbitrary-precision type to ensure I am not losing information.
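A sketch of that silent loss, assuming an environment with BigInt (Node or a modern browser); the regex extraction here is just a toy illustration, not a robust parser:

```javascript
// The JSON format allows any precision, but JSON.parse maps every
// number to a double, silently rounding large integers.
const raw = '{"id": 9007199254740993}';   // 2**53 + 1
console.log(JSON.parse(raw).id);          // 9007199254740992 -- off by one

// Toy workaround: pull the digits out of the raw text and parse as BigInt.
const id = BigInt(raw.match(/"id":\s*(\d+)/)[1]);
console.log(id);                          // 9007199254740993n
```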

        • luciole (he/him)@beehaw.org · 3 months ago

          I hope you work in a field where worrying about your integers hitting values larger than 9 quadrillion is justified.

          • aubeynarf@lemmynsfw.com · 3 months ago

            Could be a crypto key, or a randomly distributed 64-bit database row ID, or a memory offset in a stack dump of a 64-bit program.
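For example, something like the maximum signed 64-bit value can’t survive a round trip through Number, while BigInt carries it exactly (sketch assumes BigInt support; carrying the value as a string in JSON is one common convention, not the only one):

```javascript
// A randomly distributed 64-bit row ID routinely exceeds 2**53.
const rowId = 2n ** 63n - 1n;  // 9223372036854775807n, max signed 64-bit

// Converting to Number rounds to the nearest double, losing the low bits:
console.log(BigInt(Number(rowId)) === rowId);     // false

// BigInt round-trips exactly through a string (e.g. a JSON string field):
console.log(BigInt(rowId.toString()) === rowId);  // true
```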