I don't disagree with anything you said, but I think it's important to point out that your experience is unique in that you're talking about build infrastructure for a massive codebase: most of Google's internal stuff, right?
In my experience, plain GNU Make works great until you start working with massive projects. Same with the Ninja speed improvement: you just don't see it unless you have a truly massive project, or you're using low-quality Makefiles like the ones CMake produces. I've measured, and I can't see it. I have many medium-sized projects, all with simple Makefiles, and it works really well. I'm pretty convinced at this point that this should be the way to go for most projects out there.
The 3 Makefiles I wrote were actually for a medium-size open source project, https://www.oilshell.org/ (medium as measured by lines of code).
I could probably write a blog post about this, but for two of the Makefiles I needed to enumerate the rules dynamically (the inputs weren't fixed; they were "globbed"). And my memory is that this interacted very poorly with other GNU Make features, like build variants.
That pattern is extremely simple with Python/Ninja: just write a loop, or a nested loop, and generate a build statement on each iteration. It's done in 5 minutes, whereas GNU Make was a constant struggle.
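For concreteness, here's roughly what that looks like. This is a minimal sketch using the ninja_syntax.py helper that ships in the Ninja repo (under misc/); the rule name, variants, flags, and glob pattern are made up for illustration, not taken from the actual Oil build:

    #!/usr/bin/env python3
    # Minimal sketch: generate build.ninja with a nested loop over
    # variants and globbed sources. Assumes ninja_syntax.py (from the
    # Ninja repo's misc/ directory) is importable; the file layout and
    # flags here are illustrative.
    import glob
    import ninja_syntax

    with open('build.ninja', 'w') as f:
        n = ninja_syntax.Writer(f)

        # One parameterized compile rule; per-variant flags are passed below.
        n.rule('cc', command='cc $flags -c $in -o $out', description='CC $out')

        # Inputs aren't fixed: glob them at generation time, then emit one
        # build statement per (variant, source) pair.
        for variant, flags in [('dbg', '-O0 -g'), ('opt', '-O2')]:
            for src in glob.glob('src/*.c'):
                name = src[len('src/'):-len('.c')]
                obj = '_build/%s/%s.o' % (variant, name)
                n.build(obj, 'cc', inputs=src, variables={'flags': flags})

Then running ninja does the incremental build, and you just rerun the generator script whenever the set of source files changes.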