From their manual:
>Some explicit non-goals:
>* convenient syntax for writing build files by hand. You should generate your ninja files using another program. This is how we can sidestep many policy decisions.
[emphasis in original]
In general I agree, though I can think of one counterexample: ninja. The language is intentionally minimal, to encourage users to generate build files in higher-level languages. So far I've tried this approach with one small project, and found it refreshing.
Other than speed, the killer feature that distinguishes it from its predecessors (e.g. make) is generator rules: ninja knows about the build file generator, so it knows how to reinvoke it when something changes.
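For illustration, a generator rule in a build.ninja looks roughly like this (the generator script name here is made up):

```ninja
# A rule marked "generator = 1" re-invokes the build-file generator.
# If build.ninja is out of date, ninja rebuilds it (and reloads it)
# before doing anything else; generator outputs are also skipped by
# "ninja -t clean" by default.
rule configure
  command = python configure.py
  generator = 1

build build.ninja: configure configure.py
```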
I disagree (but thank you for a technical comment).
Header dependency extraction and change/rebuild detection have to be handled by the underlying build system; there is just no other way, not for any real project. You are saying that CMake could have handled all this itself, that is, extracted all the dependency information from all the source files and encoded it in the generated .ninja file. But that would mean CMake would have to regenerate it every time you change any of your #includes. I don't think that would be acceptable for most projects.
EDIT:
> The only relevant special feature is the ability to load dependencies list produced as a byproduct of the rule execution.
It is actually more specialized: Ninja knows about different dependency styles, etc., as described here.
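For example, the gcc-style depfile handling the manual describes looks roughly like this (rule name and flags are illustrative):

```ninja
# deps = gcc: ninja reads the Makefile-style depfile the compiler writes,
# stores the dependencies in its own internal log, and deletes the depfile.
rule cxx
  depfile = $out.d
  deps = gcc
  command = g++ -MD -MF $out.d -c $in -o $out
```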
EDIT 2:
Just for the record, I am not the one downvoting you.
I think no mention of make can be considered complete without also mentioning "Recursive Make Considered Harmful".
Though, granted, once you're deep enough in to consider recursive make at all, you should be looking for alternatives; make just doesn't scale well at all. IMNSHO: Shake if Haskell is OK, something that generates ninja files otherwise. (Shake can eat ninja files too, but its actual strength is the Haskellesque "hey, look, it's not a program but a library" thing.)
Yeah, it kind of surprised everyone. I follow /r/ocaml and /r/reasonml, so I caught the initial rebranding and subsequent confusion, and it was a mess. At first it sounded like Reason and BuckleScript had decided to merge and drop OCaml support, especially since a lot of us thought the reasonml.org site was the official entry point to Reason, since it presented itself as such. Then it became clear that Reason wasn't going anywhere and this was a move by BuckleScript and a subset of the Reason community that only cared about the JS side to abandon Reason and OCaml; the Reason side was just as surprised by the move.
Now the real Reason site primarily mentions using js_of_ocaml for JS compilation instead of endorsing bucklescript, though maybe when Melange's changes and updates* to bucklescript stabilise it'll switch over to suggesting that. jsoo is good at what it does but it uses some OCaml black magic that creates some awful error messages; plus it converts the intermediate AST representation to JS and creates a giant unreadable blob. BuckleScript (and now Melange) take over compilation a bit earlier, giving you more readable output and allowing some things to be done more smoothly.
* BuckleScript/ReScript works by forking the OCaml compiler and has been stuck on years-out-of-date versions of it as a result. Works well but you miss out on some nice stuff that's been done the past few years. Melange is trying to fix some of that by splitting out the compiler changes in a way that lets it keep up with recent compiler versions, plus eventually replace its odd custom Ninja-based build system with OCaml ecosystem stuff (dune for build, esy for packages).
Why? Is it some sort of a hazing ritual?
Anyway, the Ninja manual explains how it is better than Make:
https://ninja-build.org/manual.html#_introduction
> It is born from my work on the Chromium browser project, which has over 30,000 source files and whose other build systems (including one built from custom non-recursive Makefiles) would take ten seconds to start building after changing one file. Ninja is under a second.
There's also a separate section on feature comparison: https://ninja-build.org/manual.html#_comparison_to_make
Starting from:
> Ninja has special support for discovering extra dependencies at build time, making it easy to get header dependencies correct for C/C++ code.
Make was made 44 years ago; don't you think we can do much better by now?
Make is definitely not expressive enough to be used directly; pretty much all open-source projects use autotools to generate makefiles, and that's a huge mess.
I'm using an in-house engine written in C++, and like almost everyone else, I use cmake to generate project files. (Which are typically just Makefiles, or Ninja files when I'm feeling fancy.)
External dependencies are pre-compiled into libraries, and are uploaded into Amazon S3 buckets.
I'm using Concourse as my build server; it does builds inside docker images (so I don't have to worry about computer upgrades breaking things), and uploads the results to s3/dropbox/itch/steam/wherever. Also manages the game's version number, which is nice. :)
(exception: I haven't figured out how to cross-compile OS X builds yet, so those are handled by a Concourse worker process on a physical OS X box.)
Scale: on my machine (a Core i5-5200U in a Thinkpad T450), the gmake-based build system needs around five minutes just to find out that nothing needs to be done. Ninja is much faster than gmake, which is why it is used under the hood. The control files for ninja are optimized for machine consumption, which is why they are not intended to be written manually but rather generated. There was, or rather is, another project called kati, which creates build.ninja files from the old Android.mk files, so I'm rather surprised to learn that there is another ninja-based build system for the AOSP.
Edit: According to the readme of kati in the AOSP tree it seems to be deprecated: https://android.googlesource.com/platform/build/kati/
The build can be faster if you use Ninja instead of make. Ninja was designed for speed, and to have its input files generated by programs instead of hand-written. If a project is already just using make, simply changing the generator to Ninja (cmake -G Ninja) will probably result in faster builds.
Also, just like you use xcode in some cases, other people might prefer other IDE's like Visual Studio for example and so they would use the appropriate generators for that.
We use ninja - a very small and fast build system. Ninja generally needs something to generate its build files (CMake, Meson, etc.), but we just use a little Python script to generate ours.
It took a little learning to do, but it ended up being a lot faster and more flexible than plain-o-makefiles (not that there is anything wrong with Make if it suits your case).
I use VSCode as my editor and it's pretty easy to configure it to run "ninja" as the build command.
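As a sketch of what such a generator script can look like (file names, flags, and target names here are made up, not the commenter's actual script):

```python
# Hypothetical minimal build.ninja generator for a small C++ project.
sources = ["main.cpp", "util.cpp"]

out = []
# A compile rule with gcc-style depfile handling, and a link rule.
out.append("rule cxx")
out.append("  depfile = $out.d")
out.append("  deps = gcc")
out.append("  command = g++ -MMD -MF $out.d -c $in -o $out")
out.append("rule link")
out.append("  command = g++ $in -o $out")

# One build statement per translation unit.
objects = []
for src in sources:
    obj = src[:-len(".cpp")] + ".o"
    objects.append(obj)
    out.append(f"build {obj}: cxx {src}")

out.append("build app: link " + " ".join(objects))
out.append("default app")

build_ninja = "\n".join(out) + "\n"
print(build_ninja)
```

A real script would typically also glob the source tree and emit a generator rule so ninja reruns the script when it changes.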
Use it with cmake -GNinja.
For larger projects, Ninja is much faster for incremental builds: on my side, ninja takes about 2 seconds to build a project with 1000 cpp files where only one cpp file needs recompilation, versus 45 seconds with make.
Further advice: combine this with ccache.
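One way to wire that up with CMake and Ninja (the launcher variable is standard CMake; the out-of-source layout is assumed):

```shell
# CMAKE_CXX_COMPILER_LAUNCHER prefixes every C++ compile with ccache,
# so unchanged translation units come back from the cache.
cmake -G Ninja -DCMAKE_CXX_COMPILER_LAUNCHER=ccache ..
ninja
```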
Regardless of platform or IDE, I really like ninja as my actual build tool. As others have said, get rid of those platform-specific .vcxproj files and maintain your build settings in a single place, the CMakeLists.txt files.
Politics are real, so you don't need to do this on day one, but make it your future goal. Once you get the CMakeLists.txt files up to parity with the .vcxproj files, it should be a relatively easy sell to others. "Maintain <x> in a single place" is a strong argument. :-)
Good luck!!!
I would use the built in build command.
This way you hardly need to do any work at all in your bat script AND you get the ability to easily change your build system to something like ninja (or anything else) for no extra cost.
For the default generator (visual studio) the invocation would look something like:
cmake .
cmake --build . --config Debug --target INSTALL
Note that other CMake best practices like doing an out of source build are being ignored here for the sake of brevity.
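For comparison, a sketch of the same invocation as an out-of-source build with the Ninja generator (directory name is arbitrary; note that Ninja is a single-configuration generator, so the configuration is chosen at configure time, and the install target is lower-case there):

```shell
cmake -S . -B build -G Ninja -DCMAKE_BUILD_TYPE=Debug
cmake --build build --target install
```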
First, let me say that I agree with many of your points, and I will leave those out in this reply. I am only going to address points where I don't (fully) agree or can offer some good workarounds:
> It's slow to code, it's slow to compile and it's slow to iterate on it.
But fast to run your game.
> it's slow to compile
> If you miss the deadline because compilation time takes 3min each time
For a change in a single file? Either you are (ab)using templates, or there's something wrong with the build system.
Are you using ninja?
I find that with ninja, compile times are quite reasonable, even for bigger projects. Especially if you turn off optimizations until release with /Od. It's usually a couple of seconds for a single file change for me.
> Compilation errors are strange for newcomers...the time for a newcomer to understand the codebase because we have so many custom structure is so long.
I agree that the learning curve is the problem. Not a big problem for people who already use C++, though. Switching would mean they have to spend time learning a new language, new tools, etc.
> compiler specific implementation for part of the langage that is not defined
So, never rely on undefined behavior and you will be fine.
I agree with the rest you wrote, and might offer a good candidate for replacement: It's C. Yeah, I know it sounds crazy, but... Fast compile times, no templates so no crazy error messages, simple and easy to learn and you can handle memory efficiently. The only problem is having to invent a lot of underlying libraries and conventions to be able to be productive. But once you do... look at Linux kernel - in development for 30 years, 27 million lines of code, and still isn't falling apart and still everyone knows how to get in and contribute.
I use Linux and Mac to develop games, but making a Windows release is really the easiest one. For Linux I need to set up a steam-runtime environment, and on macOS I will soon have to notarize applications with Apple (it's required for all new games on Steam). I have also ported my games to Nintendo Switch - it's pretty straightforward, but you have to use specific tools and hardware for that and you need to go through approval process for every update.
Windows is the easiest because you can simply compile, throw in the .dll files and you're done. No special setup needed. The only drawback is that MSVC compiler is somewhat slower than clang, even with ninja. Because of this I prefer to develop on macOS and Linux as I can go through change-compile-run cycles much faster. I only do the final builds on Windows.
To sum it up: for development macOS or Linux; for deployment: Windows.
You got it.
Just like WASM, which is the portable assembly to target from now on as it allows you to interoperate with any code written in any language. Ninja is the portable assembly of build systems. Write your build code in any language you like, then compile it to Ninja, and now your build code will be able to interoperate with literally everyone and everything.
Ninja (which can be used as a backend for CMake) already writes start and finish times for all the steps to its log file. There is also a tool to convert it to Chrome's format.
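The log format itself is simple: ninja's .ninja_log (version 5) is a header line followed by tab-separated records of start time (ms), end time (ms), restat mtime, output path, and a command hash. A rough sketch of reading it (the sample log content here is made up):

```python
# Parse a .ninja_log v5 body and report per-step durations; trace
# converters build Chrome's trace-event JSON from the same fields.
def parse_ninja_log(text):
    steps = []
    for line in text.splitlines():
        # Skip the "# ninja log v5" header and blank lines.
        if line.startswith("#") or not line.strip():
            continue
        start, end, _mtime, output, _hash = line.split("\t")
        steps.append((output, int(end) - int(start)))
    return steps

sample = (
    "# ninja log v5\n"
    "0\t120\t0\tmain.o\tdeadbeef\n"
    "10\t450\t0\tutil.o\tcafef00d\n"
)
durations = dict(parse_ninja_log(sample))
print(durations)  # {'main.o': 120, 'util.o': 440}
```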
As I said in your other post: ninja is a standalone binary.
Either get it from your package manager (if it's in there), download the binary, or compile it yourself from source.
Install it (i.e. move it) into some place in your PATH.
https://mesonbuild.com/Running-Meson.html#building-the-source
states that the Meson build system uses ninja by default; thus, so long as it is in your PATH, it will be able to find and use it.
No "setup" needed.
> There's so, so many things we do just because we have to keep compatibility.
Sure, but "keeping compatibility" does not preclude making new, better programs in parallel with the compatibility shim, up to the day when the old thing is finally deprecated and removed.
WinRT apps don't have the hidden top-left close action, for instance, and while make certainly has a lot of inertia, it is also possible to use newer systems such as ninja, which improve on it in many ways without any of the make heritage.
That's an interesting point. I tried to get generated headers working in Ninja, and it can be done, but you have to tell Ninja explicitly which object files depend on which generated header files. Otherwise, when you first build the object file, it doesn't know which header files need to be generated. For bigger projects that use a lot of generated headers, those dependencies are probably unmaintainable.
Here is my working build.ninja file. Note that the build statement for test.cpp.o explicitly lists yello_world.h as a dependency:

rule CXX_COMPILER__test
  depfile = $out.d
  deps = gcc
  command = g++ $DEFINES $INCLUDES $FLAGS -MD -MT $out -MF $out.d -o $out -c $in
  description = Building CXX object $out

rule CXX_EXECUTABLE_LINKER__test
  command = g++ $FLAGS $LINK_FLAGS $in -o $TARGET_FILE $LINK_PATH $LINK_LIBRARIES
  description = Linking CXX executable $TARGET_FILE

rule build_header
  command = ruby $in $out

build test.cpp.o: CXX_COMPILER__test test.cpp | yello_world.h
  FLAGS = -Wall

build test: CXX_EXECUTABLE_LINKER__test test.cpp.o
  FLAGS = -Wall
  TARGET_FILE = test

build yello_world.h: build_header hello_world.rb

default test
So what should you do if you want generated headers in a Ninja project?
I'd recommend just generating the header files ahead of time during the configuration stage, so they are all present before any C++ objects are built. And maybe you just need to run your configuration stage again if any of the code for generating headers changes.
Apparently order-only dependencies can be used to help with generating header files in Ninja. I'm not sure exactly how yet.
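Based on the Ninja manual, a sketch of the pattern (file names hypothetical): everything after a `||` separator is an order-only dependency, which only guarantees the header exists before the object is first compiled; after that, the depfile records the real dependency, so rebuilds stay precise.

```ninja
rule gen_header
  command = python gen.py $out

rule cxx
  depfile = $out.d
  deps = gcc
  command = g++ -MD -MF $out.d -c $in -o $out

build generated.h: gen_header gen.py

# "|| generated.h" forces generated.h to exist before foo.o's first
# compile, without making foo.o rebuild whenever generated.h changes;
# the depfile then tracks the true header dependency.
build foo.o: cxx foo.cpp || generated.h
```

This avoids having to hand-maintain the object-to-generated-header mapping; only the order-only edge is needed.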
LLVM uses Ninja and generated headers. How does that work?
Also remember that whatever method you choose, it should also be usable without MSBuild. Many projects build with the MSVC compiler but use some other builder (usually Ninja) because MSBuild is slow. So slow, in fact, that an article published on Microsoft's web site about how to build your code faster had "use a different build system than MSBuild" as one of its entries.
> the reference could be automatically added for the next build
Speaking of which, could you please make VS output "make style" dependency files? Ninja has to hack around this by making VC emit its includes with /showIncludes and then parsing that output. Which is awful because, among other things, the format of that output is locale-dependent. More information here.
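For reference, the msvc dependency style from the Ninja manual looks roughly like this; the prefix ninja strips must match the compiler's localized output, which is exactly the fragility described above:

```ninja
rule cxx_msvc
  deps = msvc
  # Must match cl.exe's locale-dependent /showIncludes banner.
  msvc_deps_prefix = Note: including file:
  command = cl /nologo /showIncludes -c $in /Fo$out
```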
Nothing will please everyone :)
I liked that it has both testing and packaging built right in.
Have you checked out the Ninja build tool?
https://ninja-build.org/manual.html
I remember compile times being faster but I think that it has something like ccache built in.