The idea of replacing the auto tools with GNU make code could be a good thing. I tend to just write simple Makefiles, and avoid using autofoo in new projects at all. That's not good for portability, but it's excellent for sanity.
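For what it's worth, "simple Makefile" here means something on the order of this sketch (the program and file names are made up for illustration):

```make
# A minimal hand-written Makefile: no generated code, nothing to regenerate.
CC      ?= cc
CFLAGS  ?= -O2 -Wall
PREFIX  ?= /usr/local

prog: prog.o util.o
	$(CC) $(CFLAGS) -o $@ $^

install: prog
	install -D prog $(DESTDIR)$(PREFIX)/bin/prog

clean:
	rm -f prog *.o

.PHONY: install clean
```

Everything a packager needs to override (CC, CFLAGS, PREFIX, DESTDIR) comes in via conventional variables, so there is nothing to configure.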
But what caused the auto tools to become such a mess in the first place? Part of it was the design, which from 30,000 feet is: write a code generator that targets a not ideal, but lowest common denominator language (portable shell), and ship a copy of the generated code with every project.
The new design is: write an include file in a not ideal, but almost lowest common denominator language (GNU make), and ship a copy of that with every project. This avoids many of the pitfalls of the original design, such as having to deal with, and avoid editing, thousands of lines of nasty generated code when a build fails.
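As I understand it, a project using the new design would look roughly like this (the include file name and variables here are hypothetical, just to show the shape of the interface):

```make
# Hypothetical project Makefile under the new design. The project ships
# its own copy of common.mk, and talks to it through a set of variables.
PROG := frobnicator
SRCS := main.c util.c
LIBS := -lm

# The shipped include file does the real work; every project carries a
# copy, and each copy needs to be kept up to date.
include common.mk
```

Note that the whole variable namespace shared between the Makefile and the include file is, in effect, the interface.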
I'm afraid it doesn't avoid other pitfalls we've seen with the auto tools. One is that all these copies of code need to be kept up to date. Another is that the interface between such an include file and the Makefile that uses it tends to be very broad, and, if not very well defined, upgrading the include file can break things. Just as happens when you try to use the wrong version of the auto tools on a project that was written for an earlier version.
Maybe smarter people than I can get around these problems. And there's probably, sadly, truth to the idea that to get a system like this really widely adopted, it needs to distribute build code that uses only lowest common denominator tools.
(I was going to talk about how I started off using a similar include file but redesigned and sidestepped these problems while still getting wide acceptance for a build tool in the Debian community, but I see I already blogged about that earlier, in debhelper autoconf grudge match.)