Due to the potential new direction of D, I’m looking for an escape route just in case. I’m primarily a gamedev, so no functional programming languages like Rust or Haskell. One of the features I dislike most in C/C++ is the super slow and super obsolete precompiler with its header files, so no Zig either; I don’t want to open two files to edit the same class/struct. Memory safety is nice to have but not a requirement; in the worst case I’ll just create a struct SafeArray<Type>. I also need optional OOP features instead of reinventing OOP with all kinds of hacks, as many people do, for when I need them.

Yes, I know about OpenD, and it could be a candidate for this; I’m just looking into alternatives.
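
To make the SafeArray<Type> idea concrete, here’s roughly what I mean — a minimal bounds-checked wrapper, sketched in C++ only because that’s the common reference point here; the name and details are just illustrative:

```cpp
#include <cstddef>
#include <cstdio>
#include <cstdlib>

// Minimal bounds-checked dynamic array; purely illustrative, not production code.
template <typename Type>
struct SafeArray {
    Type*       items = nullptr;
    std::size_t count = 0;

    explicit SafeArray(std::size_t n) : items(new Type[n]()), count(n) {}
    ~SafeArray() { delete[] items; }
    SafeArray(const SafeArray&) = delete;            // keep the sketch simple:
    SafeArray& operator=(const SafeArray&) = delete; // no copying

    Type& operator[](std::size_t i) {
        if (i >= count) {                            // bounds check on every access
            std::fprintf(stderr, "SafeArray: index %zu out of range (size %zu)\n", i, count);
            std::abort();
        }
        return items[i];
    }

    std::size_t size() const { return count; }
};

int main() {
    SafeArray<int> a(4);
    a[3] = 42;   // fine
    // a[4] = 7; // would abort instead of silently corrupting memory
    return 0;
}
```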

  • Dark Arc@social.packetloss.gg · 14 days ago

    There’s no precompiler in C++. There’s a preprocessor, but that’s something entirely different. It’s also typically not a slow part of the compile process.

    C++ is getting to the point where modules might work well enough to do something useful with them, and they remove the need for #include preprocessor directives to share code.
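
    As a rough sketch of what that looks like (file names are just examples; the .cppm extension follows Clang’s convention, MSVC uses .ixx), a module replaces the header/source pair and the #include:

    ```cpp
    // math.cppm -- a named module interface unit; no header, no include guards
    export module math;

    export int add(int a, int b) {
        return a + b;
    }
    ```

    ```cpp
    // main.cpp -- consumes the module instead of #include-ing a header
    import math;

    int main() {
        return add(2, 3) == 5 ? 0 : 1;
    }
    ```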

    • BatmanAoD@programming.dev · 14 days ago

      OP clearly means “preprocessor”, not “precompiler”. You’re right that preprocessing itself isn’t slow, but the header/impl split can actually cause some slowness at build time.

      • Dark Arc@social.packetloss.gg · 13 days ago

        Slow compared to what exactly…?

        The worst part about headers is needing to reprocess the whole header from scratch in every translation unit that includes it … but precompiled headers largely solve that (as does just using smaller, more targeted header files).
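
        For what it’s worth, the basic shape of that with GCC is just precompiling one shared, rarely-changing header once (the file name and flags here are only illustrative):

        ```cpp
        // pch.hpp -- one heavy, rarely-changing header shared by many translation units
        #pragma once
        #include <string>
        #include <vector>
        #include <map>
        #include <algorithm>

        // Precompile it once (GCC example; other compilers have their own equivalents):
        //   g++ -std=c++17 -x c++-header pch.hpp -o pch.hpp.gch
        // Any TU that does `#include "pch.hpp"` with matching flags then picks up
        // pch.hpp.gch automatically instead of reparsing all of those headers.
        ```

        ```cpp
        // widget.cpp -- an ordinary TU; the include is satisfied by the .gch
        #include "pch.hpp"

        int widget_count() { return 3; }
        ```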

        Even in those cases there’s something to be said for the extreme parallelism of a C++ build. You give some of that up with modules in exchange for better code organization; in some cases that helps build times, but I’ve heard that in others it hurts them (a fair bit of that might just be inexperience with the feature and its best practices, plus immature implementations, but alas).

        • BatmanAoD@programming.dev · 13 days ago

          Slow compared to just chucking everything into a single source file, actually: https://github.com/j-jorge/unity-build

          That’s only true for clean builds, and even then it isn’t universally true, and of course there are other reasons not to do unity builds. But the existence of the technique, and the fact that it has historically sped up build times enough for various projects to adopt it, does show that the C++ model of headers and separate compilation units has some inherent inefficiency.
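
          For anyone who hasn’t seen one, the technique is literally just a single TU that textually includes the other source files, something like this (file names made up):

          ```cpp
          // unity.cpp -- the only file handed to the compiler; every other .cpp is
          // pulled in textually, so shared headers are parsed once rather than once
          // per translation unit.
          #include "audio.cpp"
          #include "physics.cpp"
          #include "renderer.cpp"
          #include "main.cpp"
          ```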

          • Dark Arc@social.packetloss.gg · 13 days ago

            Sure, there’s a cost to breaking things up; all multiprocessing and multithreading comes at a cost. That said, in my evaluation, single-file “unity builds” are garbage; sometimes a few files are used to get some multiprocessing back (… as the GitHub repo you mentioned references).

            They’re mostly a way to minimize the number of translation units, so that you don’t have the “I changed a central header that all my files include and now I need to rebuild the world” problem, where the world consists of many, many small translation units (this is arguably worse on Windows because process spawning is more expensive).

            Unity builds as a whole are very, very niche, and you’re almost always better off doing a more targeted analysis of where your build (or, often more importantly, your incremental build) is expensive and making appropriate changes. Note that large C++ projects like LLVM, Chromium, etc. do NOT use unity builds (almost certainly because they are not more efficient in any sense).
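
            As one example of the kind of targeted change I mean (the type names here are made up): replace a heavy #include in a widely-included header with a forward declaration, so edits to the heavy header stop rippling into every TU:

            ```cpp
            // enemy.hpp -- included by many files
            #pragma once

            // Before: #include "physics_world.hpp"  // heavy header; editing it rebuilt
            //                                        // every TU that includes enemy.hpp
            class PhysicsWorld;                       // After: a forward declaration is enough,
                                                      // because we only hold a pointer here.

            class Enemy {
            public:
                explicit Enemy(PhysicsWorld* world) : world_(world) {}
                void update(float dt);                // defined in enemy.cpp, now the only TU
                                                      // that needs physics_world.hpp
            private:
                PhysicsWorld* world_;
            };
            ```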

            I’m not even sure how unity builds got started; presumably they were mostly a way to get the benefits of LTO without LTO. They’re absolutely awful for incremental builds.

            • BatmanAoD@programming.dev · 13 days ago

              Yeah, I mean, I tried to be explicit that I wasn’t recommending unity builds. I’m just pointing out that OP, while misinformed and misguided in various ways, isn’t actually wrong about header files being one source of slowness for C++.