If you think cmake isn't very good, the solution isn't to add more layers of crap around cmake, but to replace it. Cmake itself exists because a lot of humans haven't bothered to read the gnu make manual, and added more cruft to manage this. Please don't add to this problem. It's a disease
As much of a dog as cmake is, "just use make!" does not solve many of the problems that cmake makes a go at. It's like saying go write assembler instead of C because C has so many footguns.
GNU Make has a debugger. This alone makes it far superior to every other build tool I've ever seen. The cmake debugging experience is "run a google search, and try random stuff recommended by other people that also have no idea how the thing works". This shouldn't be acceptable.
This is very true. My thought process was that since the majority of projects already run on CMake, I would simply build on top of that and take advantage of what CMake is good at, while making the more difficult operations easier. Thank you for your feedback!
CMake is a combination of a warthog of a specification language and mechanisms for handling a zillion idiosyncrasies and corner cases of everything.
I doubt that <10,000 lines of C code can cover much of that.
I am also doubtful that developers are able to express the exact relations and semantic nuances they want to, as opposed to some default that may make sense for many projects, but not all.
Still - if it helps people get started on simpler or more straightforward projects - that's neat :-)
FWIW: there is something fundamentally wrong with a meta-meta build system. I don't think you should bother generating or wrapping CMake, you should be replacing it.
Cmake is doing a lot of underappreciated work under the hood that would be very hard to replicate in another tool, tons of accumulated workarounds for all the different host operating systems, compiler toolchains and IDEs, it's also one of few build tools which properly support Windows and Visual Studio.
Just alone reverse engineering the Xcode and Visual Studio project file formats for each IDE version isn't fun, but this "boring" grunt work is what makes cmake so valuable.
The core ideas of cmake are sound, it's only the scripting language that sucks.
Do your Makefiles work across Linux, macOS and Windows (without WSL or MingW), GCC, Clang and MSVC, or allow loading the project into an IDE like Xcode or Visual Studio though? That's why meta-build-systems like cmake were created, not to be a better GNU Make.
Ok, then just cl.exe instead of gcc or clang. Completely different set of command line options from gcc and clang, but that's fine. C/C++ build tooling needs to be able to deal with different toolchains. The diversity of C/C++ toolchains is a strength, not a weakness :)
One nice feature of MSVC is that you can describe the linker dependencies in the source files (via #pragma comment(lib, ...)), this enables building fairly complex single-file tools trivially without a build system like this:
cl mytool.c
...without having to specify system dependencies like kernel32 etc... on the cmdline.
In the age of AI tools like this are pointless. Especially new ones, given existence of make, cmake, premake and a bunch of others.
A C++ build system, at the core, boils down to calling gcc foo.c -o foo.obj / link foo.obj foo.exe (please forgive me if I got the syntax wrong).
Sure, you have more .c files, and you pass some flags but that's the core.
I've recently started a new C++ program from scratch.
What build system did I write?
I didn't. I told Claude:
"Write a bun typescript script build.ts that compiles the .cpp files with cl and creates foo.exe. Create release and debug builds, trigger release build with -release cmd-line flag".
And it did it in minutes and it worked. And I can expand it with similar instructions. I can ask for release build with all the sanitize flags and claude will add it.
The particulars don't matter. I could have asked for a makefile, or cmake file or ninja or a script written in python or in ruby or in Go or in rust. I just like using bun for scripting.
The point is that in the past I tried to learn cmake and, good lord, it's days spent learning something that I'll spend 1 hr using.
It just doesn't make sense to learn any of those tools given that claude can give me any working build system in minutes.
It makes even less sense to create new build tools. Even if you create the most amazing tool, I would still choose spending a minute asking claude than spending days learning arbitrary syntax of a new tool.
This is a fair and valid point. However, why leave your workflow to write a prompt to an AI when you can run simple commands in your workspace? Also, you are most likely paying to use the AI, while Craft is free and open source and will only continue to improve. I respect your feedback though, thank you!
You're missing finding library/include paths, build configuration (`-D` flags for conditional compilation), fetching these from remote repositories, and versioning.
Feedback from someone who is used to managing a large (>1500) software stack in C / C++ / Fortran / Python / Rust / etc.:
- (1) Provide a way to compile without internet access and to specify the associated dependency paths manually. This is absolutely critical.
Most 'serious' multi-language package managers and integration systems build in a sandbox without internet access, for security and reproducibility reasons.
If your build system does not allow building offline with manually specified dependencies, you will make the lives of integrators and package managers miserable, and they will avoid your project.
- (2) Never ever build with '-O3 -march=native' by default. This is always a red flag and a sign of immaturity. People expect code to be portable and shippable.
Good default options should be the CMake equivalent of "RelWithDebInfo" (meaning: -O2 -g -DNDEBUG).
-O3 can be argued. -march=native is always always a mistake.
- (3) Allow your build tool to be built by another build tool (e.g. CMake).
Anybody caring about reproducibility will want to start from sources, not from a pre-compiled binary. This also matters for cross-compilation.
- (4) Please offer compatibility with pkg-config (https://en.wikipedia.org/wiki/Pkg-config) and, if possible, CPS (https://cps-org.github.io/cps/overview.html), for both consumption and generation.
They are what will allow interoperability between your system and other build systems.
- (5) Last but not least: seriously consider the cross-compilation use case.
It is common in the world of embedded systems to cross compile. Any build system that does not support cross-compilation will be de facto banned from the embedded domain.
> -march=native is always always a mistake
Gentoo user: hold my beer.
It's also an option on NixOS but I haven't managed to get it working unlike Gentoo.
Gentoo binaries aren't shipped that way
>15000
15000 what?
1500 C/C++ individual software components.
The 15000 was a typo on my side. Fixed.
I see, thanks. I didn't mind the number, it just wasn't clear what it was about.
Besides Cargo, you might want to take a look at Python's pyproject.toml standard. https://packaging.python.org/en/latest/guides/writing-pyproj...
It's similar, but designed for an existing ecosystem. Cargo is designed for `cargo`, obviously.
But `pyproject.toml` is designed for the existing tools to all eventually adopt. (As well as new tools, of course.)
The least painful C/C++ build tool I've used is xmake
https://github.com/xmake-io/xmake
The reason why I like it (beyond ease-of-use) is that it can spit out CMakeLists.txt and compile_commands.json for IDE/LSP integration and also supports installing Conan/vcpkg libraries or even Git repos.
I would happily switch to it in a heartbeat if it was much better documented and if it supported even half of what CMake does.
As an example of what I mean, say I want to link to the FMOD library (or any library I legally can't redistribute as an SDK). Or I want to enable automatic detection on Windows where I know the library/SDK is an installer package. My solution, in CMake, is to just ask the registry. In XMake I still can't figure out how to pull this off. I know that's pretty niche, but still.
The documentation gap is the biggest hurdle. A lot of the functions/ways of doing things are poorly documented, if they are documented at all. Including a CMake library that isn't in any of the package managers, for example. It also has some weird quirks: automatic/magic scoping (which is NOT a bonus) along with a hacky "import" function instead of using native require.
All of this said, it does work well when it does work. Especially with modules.
Agreed, xmake seems very well-thought-out, and supports the most modern use-cases (C++20 named modules, header unit modules, and `import std`, which CMake still has a lot of ceremony around). I should switch to it.
I've had some experience with this but it seems to be rather slow, very niche and tbh I can't see a reason to use it over CMake.
Nice. I have been thinking of making something similar. Now hopefully I don't have to!
Not sure how big your plans are.
My thoughts would be to start as a cmake generator but to eventually replace it. Maybe optionally.
And to integrate support for existing package managers like vcpkg.
At the same time, I'd want to remain modular enough that it's not all or nothing. I also don't like lock-in.
But right now package management and build system are decoupled completely. And they are not like that in other ecosystems.
For example, Cmake can use vcpkg to install a package but then I still have to write more cmake to actually find and use it.
> For example, Cmake can use vcpkg to install a package but then I still have to write more cmake to actually find and use it.
I have this solved at our company. We have a tool built on top of vcpkg to manage internal + external dependencies. Our cmake linker logic leverages the port names, so all you really do is declare your manifest file (vcpkg.json), then declare which of them you will export publicly.
Everything after that is automatic including the exported cmake config for your library.
Thank you everyone for the feedback so far! I just wanted to say that I understand this is not a fully cohesive and functional project for every edge case. This is the first day of releasing it to the public and it is only the beginning of the journey. I do not expect to fully solve a problem of this scale on my own, Craft is open source and open to the community for development. I hope that as a community this can grow into a more advanced and widely adopted tool.
The installation instructions being a `curl | sh` writing to the user's bashrc does not inspire confidence.
They did say it was inspired by cargo, which is often installed using rustup as such:
I don’t love this approach either (what a security nightmare…) - but it is easy to do for users and developers alike. Having to juggle a bunch of apt-like repositories for different distros is a huge time sink and adds a bunch of build complexity. Brew is annoying with its formulae vs tap vs cask vs cellar - and the associated ruby scripting… And then there’s windows - ugh.
I wish there was a dead simple installer TUI that had a common API specification so that you could host your installer spec on your.domain.com/install.json - point this TUI at it and it would understand the fine grained permissions required, handle required binary signature validation, manifest/sbom validation, give the user freedom to customize where/how things were installed, etc.
Given you're about to run a binary, it's no worse than that.
It is definitely worse. At least a binary is constant, on your system, and can be analyzed. curl|sh can give you different responses than just curling. Far, far worse.
This is fitting for something simulating cargo, which is a huge supply chain risk itself.
Having to work around a massive C++ software project daily, I wish you luck. We use conan2, and while it can be very challenging to use, I've yet to find something better that can handle incorporating as dependencies ancient projects that still use autoconf or even custom build tooling. It's also very good at detecting and enforcing ABI compatibility, although there are still some gaps. This problem space is incredibly hard, and improving it is a prime driver for the creation of many of the languages that came after C/C++.
I find that conan2 is mostly painful with ABI. Binaries from GCC are all backwards compatible, as are C++ standard versions. The exception is the C++11 ABI break.
And yet it will insist on only giving you binaries that match exactly. Thankfully there are experimental extensions that allow it to automatically fall back.
Uses CMake. Sorry, not for me. Call me old, but I prefer good old make or batch. Maybe it's because I can understand those tools. Debugging CMake build problems made me hate it. Also, I code for embedded CPUs, and most of the time CMake is just overkill and does not play well with the provided compiler/binutils. The platform independence is just not happening in those environments.
When you need a configuration step, cmake will actually save you a lot of time, especially if you work cross platform or even cross compile. I love to hate cmake as much as the next guy, and it would be hard to design a worse scripting language, but I'll take it any time over autoconf. Some of the newer tools may well be more convenient - I tried Bazel, and it sure wasn't (for me).
If you're happy to bake one config in a makefile, then cmake will do very little for you.
> most of the time CMake is just overkill and does not play well with the provided compiler/binutils
You need to define a CMake toolchain[1] and pass it to CMake with --toolchain /path/to/file on the command line, or in a preset file with the key `toolchainFile`. I've compiled for QNX and ARM32 boards with CMake, no issues, but this needs to be done.
[1]: https://cmake.org/cmake/help/latest/manual/cmake-toolchains....
For toy projects good old Make is fine...but at some point a project gets large enough that you need something more powerful. If you need something that can deal with multiple layers of nested sub-repositories, third-party and first-party dependencies, remote and local projects, multiple build configurations, dealing with non-code assets like documentation, etc, etc, etc - Make just isn't enough.
For simple projects, Make is easier, I will grant. However, when your project gets complex at all, make becomes a real pain and cmake becomes much easier.
Cmake has a lot of warts, but they have also put a lot of effort into finding and fixing all those weird special cases. If your project uses CMake, odds are high it will build anywhere.
Odds are high the distro maintainer will lose hair trying to package it
“Show HN” has really become a Claude code showcase in the last 6 months, maybe it's time to sunset the format at this point …
Yup, I read "— think Cargo, but for C/C++." and closed the tab.
What about cmkr?
https://cmkr.build/
Seems to solve a problem very similar to Conan or vcpkg, but without its own package archive or build scripts. In general, unlike Cargo/Rust, many C/C++ projects dynamically link libraries and often require complex Makefile/shell-script magic to discover and optionally build their dependencies.
How does craft handle the 'diamond' patterns, where two dependencies may depend on different versions of the same library as transitive dependencies (whether for static or dynamic linking, or as header-only includes), without custom build scripts like the Conan approach?
Heh, looks like cmake-code-generators are all the rage these days ;)
Here's my feeble attempt using Deno as base (it's extremely opinionated though and mostly for personal use in my hobby projects):
https://github.com/floooh/fibs
One interesting chicken-and-egg problem I couldn't solve is how to figure out the C/C++ toolchain that's going to be used without running cmake on a 'dummy project file' first. For some toolchain/IDE combos (most notably Xcode and VStudio), cmake's toolchain detection unfortunately takes a lot of time.
I'm intrigued by the idea of writing one's own custom build system in the same language as the target app/game; it's probably not super portable or general but cool and easy to maintain for smaller projects: https://mastodon.gamedev.place/@pjako/115782569754684469
This certainly seems less awful than the typical C building process.
What I've been doing to manage dependencies in a way that doesn't depress me much has been Nix flakes, which allows me a pretty straightforward `nix build` with the correct dependencies built in.
I'm just a bit curious though: a lot of C libraries are system-wide and usually require the system package manager (e.g. libsdl2-dev). Does this have an elegant way to handle those?
Yes, many libraries are system-wide, that is true. This is something I had on the list of features to add: system dependencies. Thank you for the feedback!
As long as it's for C/C++ and not C or C++, I'm skeptical.
Why do you say this? I respect it, I'm just curious.
Project description is AI generated, even the HN post is AI generated, why should I spend any energy looking into your project when all you're doing is just slinging AI slop around and couldn't be bothered to put any effort in yourself?
Anyone can make a tool that solves a tiny part of the problem. However, the reason no such tool has caught on is all the weird special cases you need to handle before it can be useful. Even if you limit your support to desktop OS X and Windows, that problem will be hard; adding various Linux flavors is even more difficult, not to mention BSD. Those are the common/mainstream choices; Haiku is going to be very different, and I've seen dozens of others over the years, some of which have a following in their niche. Then there are people building for embedded: QNX, VxWorks, or even no OS, just bare metal. Each adds weirdness (and implies cross-compiling, which makes everything harder because your assumptions are always wrong).
I'm sorry I have to be a downer, but the fact is if you can use the word "I" your package manager is obviously not powerful enough for the real world.
I will categorize this as a pattern I've seen which leads to stagnation, or is at least aiming for it. Usually these are built on one or more assumptions which don't hold. The flow of this pattern:
- Problem exists
- Proposals of solutions (varying quality), or not
- "You can't just solve this. It's complicated! This problem must exist." (The post I'm replying to)
- Problem gets solved, hopefully.
Anecdotes I'm choosing based on proximity to this particular problem: uv and cargo. uv because people said the same thing about Python packaging, and cargo because it's adjacent to C and C++ in terms of being a low-level compiled language used for systems programming, embedded/bare-metal, etc.
The world is rich in complexity, subtlety, and exceptions to categorization. I don't think this should block us from solving problems.
I didn't say the problem couldn't be solved. I said the problem can't be solved by one person. There is a difference. (maybe it can be solved by one person over a few decades)
This is true. There is no way I could solve a problem of this scale by myself. That is why this is an open source project, open for anyone to contribute to. There is still much more to improve; this is only day 1 of release to the public.
I mean -- if I'm going to join a team to solve the hard 20%, I'd like to see the idea validated against the easy 80% first.
If it's really bad, at least the easy 20%.
Yesterday I had to wrestle with CMake.
But how does this tool figure out where the header files and build instructions for the included libraries are? Is there an expected layout or industry-wide consensus?
I believe it supports only projects that have a working cmake setup; no extra magic.
I suspect it depends on a specific directory structure, e.g. look at this generated cmake file:
https://github.com/randerson112/craft/blob/main/CMakeLists.t...
...and for custom requirements a manually created CMakeLists.extras.txt as escape hatch.
Unclear to me how more interesting scenarios like compiler- and platform-specific build options (enable/disable warnings, defines, etc...), cross-compilation via cmake toolchain files (e.g. via Emscripten SDK, WASI SDK or Android SDK/NDK) would be handled. E.g. just trivial things like "when compiling for Emscripten, include these source files, but not those others".
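In cmake terms, the trivial case mentioned above (per-platform source lists) can be sketched like this; the target name `app` and the backend file names are made up for illustration, and `EMSCRIPTEN` is the variable the Emscripten toolchain file defines:

```cmake
add_executable(app main.cpp)

if(EMSCRIPTEN)
  # set when configuring with the Emscripten toolchain file
  target_sources(app PRIVATE web_backend.cpp)
else()
  target_sources(app PRIVATE native_backend.cpp)
endif()
```

It is unclear whether a cmake-generating wrapper exposes a hook for conditions like this, or whether they would have to go into the escape-hatch file.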
CMake piles up various generations of idioms, so there are multiple ways of doing it, but personally I've learned to steer away from find_package() and other magical functions. Get all your dependencies as subdirectories (whichever way you prefer) and use add_subdirectory(). Use find_package() only in so-called "config" mode, where you explicitly instruct cmake where to find the config, and only for large precompiled dependencies.
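A minimal sketch of that approach; the dependency names and the Qt install path are hypothetical examples, not part of the original comment:

```cmake
# Vendored dependency built from source: just a subdirectory
add_subdirectory(third_party/fmt)

# Large precompiled dependency: config mode only, with an explicit location hint
find_package(Qt6 CONFIG REQUIRED PATHS /opt/qt6/lib/cmake)

add_executable(app main.cpp)
target_link_libraries(app PRIVATE fmt::fmt Qt6::Core)
```

The explicit `PATHS` hint is what keeps the "magic" out: cmake looks only where you tell it to, instead of probing the system.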
Cmake is infamously not a build system. It is a build system generator.
This is now a build system generator generator. This is the wrong solution imho. The right solution is to just build a build system that doesn’t suck. Cmake sucks. Generating suck is the wrong angle imho.
Compared to Conan, what are the advantages?
If you think cmake isn't very good, the solution isn't to add more layers of crap around cmake, but to replace it. Cmake itself exists because a lot of humans haven't bothered to read the gnu make manual, and added more cruft to manage this. Please don't add to this problem. It's a disease
As much of a dog as cmake is, "just use make!" does not solve many of the problems that cmake makes a go at. It's like saying go write assembler instead of C because C has so many footguns.
GNU Make has a debugger. This alone makes it far superior to every other build tool I've ever seen. The cmake debugging experience is "run a google search, and try random stuff recommended by other people that also have no idea how the thing works". This shouldn't be acceptable.
This is very true. My thought process was that since majority of projects already run on CMake, I would simply build off of that and take advantage of what CMake is good at while making the more difficult operations easier. Thank you for your feedback!
I'm all for shitting on CMake, but Jesus, to suggest Make as a replacement/improvement is an unhinged take.
I'm suggesting that people creating build systems read the make manual. Surely this isn't controversial?
Please consider adding `cargo watch` - that would be a killer feature!
Yes! This is definitely on the list of features to add. Thank you for the feedback!
Impression before actually trying this:
CMake is a combination of a warthog of a specification language and mechanisms for handling a zillion idiosyncrasies and corner cases of everything.
I doubt that < 10,000 lines of C code can cover much of that.
I am also doubtful that developers are able to express the exact relations and semantic nuances they want to, as opposed to some default that may make sense for many projects, but not all.
Still - if it helps people get started on simpler or more straightforward projects - that's neat :-)
FWIW: there is something fundamentally wrong with a meta-meta build system. I don't think you should bother generating or wrapping CMake, you should be replacing it.
Cmake does a lot of underappreciated work under the hood that would be very hard to replicate in another tool: tons of accumulated workarounds for all the different host operating systems, compiler toolchains and IDEs. It's also one of the few build tools that properly support Windows and Visual Studio.
Reverse engineering the Xcode and Visual Studio project file formats for each IDE version alone isn't fun, but this "boring" grunt work is what makes cmake so valuable.
The core ideas of cmake are sound, it's only the scripting language that sucks.
Another fresh example of what you don't like: https://www.youtube.com/watch?v=ExSlx0vBMXo Building C++: It Doesn't Have to be Painful! - Nicole Mazzuca - Meeting C++ 2025
Build systems don't plan to converge in the future =)
My thoughts exactly. I thought this was going to be some new thing, but it's just yet another reason that I'll stick with Makefiles.
Do your Makefiles work across Linux, macOS and Windows (without WSL or MingW), GCC, Clang and MSVC, or allow loading the project into an IDE like Xcode or Visual Studio though? That's why meta-build-systems like cmake were created, not to be a better GNU Make.
There is something fundamentally wrong with Windows or Visual Studio that it requires ugly solutions.
Ok, then just use cl.exe instead of gcc or clang. Completely different set of command line options from gcc and clang, but that's fine. C/C++ build tooling needs to be able to deal with different toolchains. The diversity of C/C++ toolchains is a strength, not a weakness :)
One nice feature of MSVC is that you can describe the linker dependencies in the source files (via #pragma comment(lib, ...)), this enables building fairly complex single-file tools trivially without a build system like this:
...without having to specify system dependencies like kernel32 etc... on the cmdline.

Windows and Visual Studio solutions are perfectly fine. MSBuild is a declarative build syntax in XML; it's not very different from a makefile.
XML is already terrible. But the main problem seems to be that they created something similar but incompatible to make.
Just switch to bazel, copy my hermetic build config and just use it... yes, you can hate me now.
Will take C only 51 years to adopt.
In the age of AI, tools like this are pointless. Especially new ones, given the existence of make, cmake, premake and a bunch of others.
A C++ build system, at the core, boils down to calling gcc foo.c -o foo.obj / link foo.obj foo.exe (please forgive me if I got the syntax wrong).
Sure, you have more .c files, and you pass some flags but that's the core.
I've recently started a new C++ program from scratch.
What build system did I write?
I didn't. I told Claude:
"Write a bun typescript script build.ts that compiles the .cpp files with cl and creates foo.exe. Create release and debug builds, trigger release build with -release cmd-line flag".
And it did it in minutes and it worked. And I can expand it with similar instructions. I can ask for release build with all the sanitize flags and claude will add it.
The particulars don't matter. I could have asked for a makefile, or cmake file or ninja or a script written in python or in ruby or in Go or in rust. I just like using bun for scripting.
The point is that in the past I tried to learn cmake and good lord, it's days spent learning something that I'll spend 1 hr using.
It just doesn't make sense to learn any of those tools given that claude can give me working any build system in minutes.
It makes even less sense to create new build tools. Even if you create the most amazing tool, I would still choose spending a minute asking claude than spending days learning arbitrary syntax of a new tool.
This is a fair and valid point. However, why leave your workflow to write a prompt to an AI when you can run simple commands in your workspace? Also, you are most likely paying to use the AI, while Craft is free and open source and will only continue to improve. I respect your feedback though, thank you!
You're missing finding library/include paths, build configuration (`-D` flags for conditional compilation), fetching these from remote repositories, and versioning.