• expr@programming.dev
    link
    fedilink
    arrow-up
    1
    ·
    9 months ago

    …you don’t accept them. Basically every compiled language’s compiler accepts some kind of -Werror flag to turn warnings into errors. Warnings for development builds, errors for production builds. This has been a solved problem for a very long time. Not only is it asinine to force them to be errors always, it’s semantically incorrect. Errors should be things that prevent the code from functioning in some capacity.

    • dbx12@programming.dev
      link
      fedilink
      arrow-up
      1
      ·
      9 months ago

      Oh, so that turns warnings into errors and doesn’t mean “ignore errors”. I’m not too familiar with compiler flags. You could do some mental gymnastics to argue that the unused variable causes the compiler to exit, and thus the code is not functioning, and thus the unused variable is not a warning but an error :^)

      • expr@programming.dev
        link
        fedilink
        arrow-up
        1
        ·
        9 months ago

        It’s a pretty standard flag in basically all compiled languages; it just goes by different names. -Werror for C with GCC/Clang, -Werror for Java with javac, TreatWarningsAsErrors in C#, etc.
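
        To illustrate (a minimal sketch; the file name demo.c is just for illustration): an unused variable is only a warning by default, but with GCC or Clang the same code is rejected outright once -Werror is added.

        ```c
        /* Minimal sketch of warnings-as-errors:
         *
         *   gcc -Wall demo.c          -> warns about 'unused', still builds
         *   gcc -Wall -Werror demo.c  -> refuses to build
         */
        #include <stdio.h>

        int main(void) {
            int unused = 42; /* triggers -Wunused-variable under -Wall */
            printf("built without -Werror\n");
            return 0;
        }
        ```

        So the code itself is perfectly functional; only the stricter compiler policy turns the diagnostic into a build failure.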