Interesting. I'm just starting to learn about C++ but the sense I get is that modern C++ is moving away from the preprocessor. One example is preferring the use of enums and inline functions over the use of macros.
Now, that doesn't change the fact that this project is cool and could lead to interesting developments, but are preprocessor advancements a good thing for C++ in general, or a step backwards?
I have found (and let me qualify: I am by no means an old hand at this) that macros allow a degree of metaprogramming beyond just being "dumb enums," which makes them powerful in their own right. I'm half expecting to be struck down where I stand for this next statement on account of some fine point I'm not aware of, but features of modern languages, such as decorators in Python, serve much the same purpose I used to use macros in C for. I think there's certainly room for the preprocessor going forward, although it may be valuable to ask, as Python has, whether we can make it more elegant/idiomatic/coherent with the language as a whole. C macros, once you get "involved," are... arcane.
> macros allow you to do a degree of metaprogramming
I think this comment would be more believable if you provided some examples. Since you didn't, let me provide one and then hopefully you (or someone else) can provide more.
I would provide a summary, but I think you can get a hint just from the URL. (Actually, the URL is misleading. The title of the blog post is "Higher Order Macros in C++".)
I'll admit, I didn't write the comment with the expectation that it would be attacked on believability, but I'll try to give a satisfactory response:
The definition of metaprogramming is "the writing of computer programs with the ability to treat programs as their data", which #defines can certainly do, and which decorators, in a slightly different sense, also do. I hate to just drop Wikipedia links, but it says it far better than I could, and contains an example of using C-style macros to do templated metaprogramming. (http://en.wikipedia.org/wiki/Metaprogramming)
OP here. You are right that C++ is moving away from the preprocessor (personally I can't wait for modules to save me from #include hell). CPIP was not intended to advocate advancements in the preprocessor but merely to make preprocessing in either C or C++ more traceable/comprehensible.
'Modern' C++ does not depend so strongly on a preprocessor; C, however, does, and most big C projects use all features of the C preprocessor, however opaque. This is why I chose to illustrate CPIP by preprocessing a single file from the Linux kernel: http://cpip.sourceforge.net/copy_to_html/cpu/index_cpu.c_942...
We're moving away from it, it's just not done yet. I'd be surprised if C++17 doesn't have a module system, especially after the clang guys implemented one as a proof-of-concept.
C++ has more powerful tools than the preprocessor and is used in different scenarios.
On the other hand, C is the predominant language in embedded systems (C++ is available, but usually in a heavily stripped-down form), where performance and memory footprint (both binary and runtime) are critical. Therefore eliminating or swapping code for different hardware and/or slightly different feature sets is best done at compile time using the preprocessor. And we need the best tools available to debug that.
Wouldn't this be the main reason C is predominant, combined with historical reasons a la 'the predecessor used C as well'? I find it hard to believe one couldn't write C++ that is as efficient as C. Afaik the preprocessor features are exactly the same. Also, I've seen the virtual-function-overhead card played too often, and it doesn't make much sense: one can benefit from goodies like templates/lambdas/STL without using virtual inheritance. And all those goodies mostly matter while writing the code, i.e. they make for faster, cleaner development: after the compiler has gone over them they are all optimized away and turned into something that looks a whole lot like assembly generated from plain C.
I certainly wouldn't want to encourage more use of the preprocessor. But I can understand having a preprocessor that does something more than whatever's stock on your system (JPL, for instance, has their own C preprocessor: http://lars-lab.jpl.nasa.gov/ , scroll down to "released code"), even if it's standard conforming.
With macros you can easily implement small DSL-like constructs. I'm not at my desktop so I can't show any examples, but things like what MFC uses for message maps make for very easy and powerful expressions.
I doubt it. Jinja2 is a templating language for generating web pages. CPIP is a means of understanding what the C/C++ preprocessor is up to. These are unrelated.