• ARk@lemm.ee · 1 year ago

      reserve me tickets for the inevitable shit show that follows 🍿

  • call_me_xale@lemmy.zip · 1 year ago

    Bold of you to assume no one will come up with a replacement date library rather than just getting rid of JS.

  • Alien Nathan Edward@lemm.ee · 1 year ago

    I’ve got a bunch of freeze dried food from my backpacking days. Who wants to jump in on a business selling Y275.76K Survival Kits?

      • 14th_cylon@lemm.ee · 1 year ago

        it may or may not be a monday - probably won’t. it would be a monday under the proposed (4000 | year) => !(leap year) rule (i.e. years divisible by 4000 stop being leap years), but by the year 275000 the accumulated drift will be so big that i’m pretty sure people will have added more rules to fix it by then.
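
        a quick sketch of the difference in plain JavaScript (the isLeapAmended name and the 4000-year rule itself are hypothetical here - it’s a proposal, not an adopted standard):

        ```js
        // Gregorian leap year rule as standardized today:
        // divisible by 4, except centuries, except every 400 years.
        function isLeapGregorian(year) {
          return (year % 4 === 0 && year % 100 !== 0) || year % 400 === 0;
        }

        // Hypothetical amendment: years divisible by 4000
        // would additionally NOT be leap years.
        function isLeapAmended(year) {
          return isLeapGregorian(year) && year % 4000 !== 0;
        }

        console.log(isLeapGregorian(4000)); // true under today's rules
        console.log(isLeapAmended(4000));   // false under the proposed rule
        ```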

    • SuperJetShoes@lemmy.world · 1 year ago

      This will be a tough one to fix. There must be millions upon millions of embedded systems out there with a 32-bit epoch counter burned in.

      They’ll all be much tougher to find than “YEAR PIC 99” in COBOL was.

      Y2K wasn’t a problem because thousands upon thousands of programmers worked on it well in advance (myself included); we had source code and plenty of static analysis tools, often homegrown.

      The 2038 bugs are already out there…in the wild…their source code nothing but a distant dream.
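
      For reference, the 2038 limit falls out of storing Unix time in a signed 32-bit integer; a minimal sketch (in JavaScript just to match the thread) of where the counter tops out:

      ```js
      // A signed 32-bit time_t tops out at 2^31 - 1 seconds past the epoch.
      const MAX_INT32 = 2 ** 31 - 1; // 2147483647

      console.log(new Date(MAX_INT32 * 1000).toISOString());
      // 2038-01-19T03:14:07.000Z -- the last representable second

      // One tick later, a wrapped counter reads -2^31, which maps to 1901.
      console.log(new Date(-(2 ** 31) * 1000).toISOString());
      // 1901-12-13T20:45:52.000Z
      ```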

  • asudox@lemmy.world · 1 year ago

    Well, I am comfortable leaving the upcoming disaster this will cause to the next generations.

  • interolivary@beehaw.org · edited · 1 year ago

    I honestly don’t quite get why it’s so common to hate Javascript.

    I mean, it’s not my favorite language, to put it mildly (I prefer type systems that beat me into submission), but as far as popular dynamically typed languages go, it’s not nearly the worst offender out there. Yes, lol, weird things equal weird things when you use ==, but that’s not exactly unique among dynamic languages. Some people couldn’t come to terms with it not being like Java despite the name, so they never bothered learning how prototypal inheritance works. And sure, who the fuck needed both null and undefined, when either of those by itself is already a mistake and introducing them to a language should be grounds for a nice, solid kick to the groin.
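
    For anyone who hasn’t had the pleasure, these are the sorts of quirks I mean (nothing here beyond stock JavaScript):

    ```js
    // Loose equality coerces its operands, with famously odd results.
    console.log(0 == '');           // true
    console.log(0 == '0');          // true
    console.log('' == '0');         // false, so == isn't even transitive
    console.log([] == false);       // true

    // null and undefined are distinct values that are == to each other
    // and to nothing else.
    console.log(null == undefined); // true
    console.log(null == 0);         // false
    console.log(typeof null);       // "object" (a long-standing wart)

    // Strict equality skips coercion entirely.
    console.log(null === undefined); // false
    ```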

    But, warts and all, the implementations are generally reasonably performant as far as these things go. The syntax is recognizable because eg. braces are common whether we like them or not, and it notably survives copy-pasting from eg. the internet or anything that doesn’t use the same whitespace you do. And it’ll happily let you write code in a quite multiparadigm way, leading some people to insist Javascript is kind of like Scheme and other people to insist Javascript is nothing like Scheme.

    So, shit could be worse. And by “shit” and “worse” I mean eg. Python, notable for achievements such as: being one of the first languages, if not the first, whose designer huffed enough solvents to think that semantically significant whitespace is a great idea, especially combined with no real standardization on tabs versus spaces - whitespace which often doesn’t survive being copy-pasted from the web and is a nightmare to format; being unable to actually run anything in parallel until very recently, because lol, why bother with granular locking in the runtime when you can just have one global interpreter lock and be done with it; and being popular in part because its FFI makes it easy to write modules for it in languages that aren’t a crime against common sense and can run faster and more parallel than an ’80s BASIC interpreter. And let’s not even go into the whole “virtual environment” thing.

    So while Python’s not quite INTERCAL-bad, at least INTERCAL doesn’t have significant whitespace and its manuals are really damn funny.

    And then there’s eg. Ruby, with 9999 ways to do everything, all of them so slow that it aspires to one day be as fast as INTERCAL, and PHP, a practical joke that went too far and somehow managed to convince people it’s actually a real language.

    edit: oh and if you don’t know about INTERCAL, I can highly recommend checking out the C-INTERCAL revision’s manual, which includes eg. a very helpful circuitous diagram and a logical table to explain one of its odder operators. There’s also a resource page maintained by one of the perpetrators of the C-INTERCAL revision.

  • Turun@feddit.de · 1 year ago

    That’s because JavaScript stores every number as a double precision float, which can only represent integers exactly up to 2^53 - 1; the spec’s ±8,640,000,000,000,000 ms limit on Date sits comfortably inside that range, lol
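
    Roughly, assuming nothing beyond stock JavaScript:

    ```js
    // Every JS number is an IEEE 754 double; integers are exact
    // only up to 2^53 - 1.
    console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991

    // The spec caps Date at +/-100,000,000 days from the epoch,
    // i.e. 8.64e15 ms, comfortably inside that exact-integer range.
    console.log(new Date(8.64e15).toISOString());
    // +275760-09-13T00:00:00.000Z
    ```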

    • interolivary@beehaw.org · 1 year ago

      That’s one thing that really bugs me about Javascript (weirdly enough I’m okay with eg. prototypal inheritance and how this works, or at least how it worked before classes were bolted on; apparently I’m one of the dozen or so people who had no problems with those concepts). The fact that all numbers are floats can lead to a lot of fun and exciting bugs that people might not even realize are there until they suddenly get a weird decimal where they expected an integer.
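
      The classic examples, in case anyone hasn’t been bitten yet (plain JavaScript, nothing assumed):

      ```js
      // 0.1 and 0.2 have no exact binary representation.
      console.log(0.1 + 0.2);          // 0.30000000000000004
      console.log(0.1 + 0.2 === 0.3);  // false

      // Past 2^53, integer arithmetic silently loses precision.
      console.log(9007199254740992 + 1);                  // 9007199254740992
      console.log(9007199254740993 === 9007199254740992); // true

      // Division happily hands back a decimal where other languages
      // would give you an integer.
      console.log(10 / 4); // 2.5
      ```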

  • viking@infosec.pub · 1 year ago

    What people fail to see is that this is the largest date the API can store, not a magical cutoff date in the distant future.

    You could construct a date at that limit today and send it to an API, and it could potentially crash it or trigger a buffer overrun.

    • Redkey@programming.dev · 1 year ago

        The ECMAScript definition of the Date object explicitly states that any attempt to set the internal time value to something outside the valid range must result in it being set to NaN. If there’s an implementation out there that doesn’t do that, then the issue is with that implementation, not the standard.
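
        Which is exactly what you can observe on a spec-conforming engine; a minimal check (plain JavaScript, the MAX_TIME name is just for illustration):

        ```js
        const MAX_TIME = 8.64e15; // spec limit: 100,000,000 days in ms

        console.log(new Date(MAX_TIME).toISOString());
        // +275760-09-13T00:00:00.000Z -- the last valid instant

        // One millisecond past the limit, the internal time value is NaN.
        const past = new Date(MAX_TIME + 1);
        console.log(past.getTime());  // NaN
        console.log(`${past}`);       // "Invalid Date"
        // past.toISOString() would throw a RangeError instead.
        ```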