Why aren't macros included in most modern programming languages?

I know that they are implemented extremely unsafely in C/C++. Can't they be implemented in a safer way? Are the downsides of macros really bad enough to outweigh the enormous power they provide?

0
2019-05-04 17:46:44
Answers: 5

The biggest problem I have seen with macros is that, when heavily used, they can make code very hard to read and maintain, since they allow you to hide logic in the macro that may or may not be easy to find (and may or may not be trivial).

0
2019-05-08 04:43:59

Macros can, as Scott notes, allow you to hide logic. Of course, so do functions, classes, libraries, and many other common constructs.

But a powerful macro system can go further, letting you create and use syntax and structures not normally found in the language. This can be a wonderful tool indeed: domain-specific languages, code generators, and more, all within the comfort of a single language environment...

However, it can be abused. It can make code harder to read, understand, and debug, increase the time it takes new developers to become familiar with a codebase, and lead to costly mistakes and delays.

So for languages intended to simplify programming (like Java or Python), such a system is taboo.

0
2019-05-08 04:32:59

To answer your questions, think about what macros are mostly used for (warning: brain-compiled code).

  • Macros used to define symbolic constants: #define X 100

This can easily be replaced with: const int X = 100;

  • Macros used to define (essentially) inline type-agnostic functions: #define max(X,Y) (X>Y?X:Y)

In any language that supports function overloading, this can be mimicked in a much more type-safe manner by overloaded functions of the appropriate types, or, in a language that supports generics, by a generic function. The macro will happily attempt to compare anything, including pointers or strings, which might compile but is almost certainly not what you wanted. On the other hand, if you made macros type-safe, they would offer no advantage or convenience over overloaded functions.
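
As a rough sketch of that point in C++ (the macro is spelled MAX here to avoid clashing with std::max, and safe_max is just an illustrative name for the generic replacement):

    #include <iostream>

    // Textual macro: "works" for any operands, including pointers.
    #define MAX(X, Y) ((X) > (Y) ? (X) : (Y))

    // Type-checked generic replacement (the standard library already provides std::max).
    template <typename T>
    T safe_max(T a, T b) { return a > b ? a : b; }

    int main() {
        const char* a = "apple";
        const char* b = "banana";
        std::cout << MAX(a, b) << "\n";      // compiles, but compares pointer values, not the strings
        std::cout << safe_max(3, 7) << "\n"; // 7
        // safe_max(3, 2.5) would fail to compile instead of silently mixing types,
        // because T cannot be deduced consistently.
        return 0;
    }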

  • Macros used to define shortcuts for frequently used constructs: #define p printf

This is easily replaced by a function p() that does the same thing. This is somewhat involved in C (requiring you to use the va_arg() family of functions), but in many other languages that support variable numbers of function arguments, it is far simpler.
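
A minimal sketch of such a p() in C/C++ terms, assuming the shortcut is only meant to forward everything to printf (the name p is taken from the example above):

    #include <cstdarg>
    #include <cstdio>

    // Forwards its arguments to vprintf using the va_arg machinery mentioned above.
    void p(const char* fmt, ...) {
        va_list args;
        va_start(args, fmt);
        std::vprintf(fmt, args);
        va_end(args);
    }

    int main() {
        p("Hello %s, answer = %d\n", "world", 42);
        return 0;
    }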

Supporting these features within the language itself, rather than in a separate macro language, is simpler, less error-prone, and far less confusing to others reading the code. In fact, I can't think of a single use case for macros that can't easily be duplicated in another way. The only place where macros are genuinely valuable is when they are tied to conditional compilation constructs like #if (etc.).

On that point, I won't argue with you, since I believe that non-preprocessor solutions to conditional compilation in popular languages are extremely cumbersome (like bytecode injection in Java). But languages like D have come up with solutions that do not require a preprocessor and are no more cumbersome than preprocessor conditionals, while being far less error-prone.
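
For concreteness, this is the kind of conditional compilation being conceded to the preprocessor; the D alternative mentioned above appears only as a comment, since it is part of D's own syntax rather than a textual rewrite:

    #include <iostream>

    int main() {
    #if defined(_WIN32)
        std::cout << "Windows-specific code path\n";
    #else
        std::cout << "POSIX-specific code path\n";
    #endif
        // The rough D equivalent: version (Windows) { ... } else { ... }
        return 0;
    }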

0
2019-05-08 04:07:34

I think the main reason is that macros are lexical. This has several consequences:

  • The compiler has no way of checking that a macro is semantically closed, i.e. that it represents a "unit of meaning" the way a function does. (Consider #define TWO 1+1: what does TWO*TWO equal? 3, because it expands to 1+1*1+1. See the sketch below this list.)

  • Macros are not typed the way functions are. The compiler cannot check that the parameters and return type make sense. It can only check the expanded expression that uses the macro.

  • If the code does not compile, the compiler has no way of knowing whether the error is in the macro itself or in the place where the macro is used. It will either report the wrong place half the time, or it has to report both even though one of them is probably fine. (Consider #define min(x,y) (((x)<(y))?(x):(y)): what should the compiler do if the types of x and y don't match or don't implement operator<?)

  • Automated tools cannot work with them in semantically useful ways. In particular, you cannot have things like IntelliSense for macros that act like functions but expand to an expression. (Again, the min example.)

  • The side effects of a macro are not as explicit as they are with functions, causing potential confusion for the developer. (Consider the min example again: in a function call, you know the expression for x is evaluated exactly once, but here you cannot know without looking at the macro.)

Like I said, these are all consequences of the fact that macros are lexical. When you try to turn them into something more proper, you end up with functions and constants.
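
Both of the examples above fit in a few lines (a rough sketch; the macro is spelled MIN here simply to avoid colliding with other definitions):

    #include <iostream>

    #define TWO 1+1
    #define MIN(x, y) (((x) < (y)) ? (x) : (y))

    int main() {
        // TWO*TWO expands to 1+1*1+1, which is 3, not 4.
        std::cout << TWO * TWO << "\n";       // prints 3

        // MIN evaluates the chosen argument twice: i++ happens twice here.
        int i = 0, j = 5;
        int m = MIN(i++, j);                  // expands to (((i++) < (j)) ? (i++) : (j))
        std::cout << i << " " << m << "\n";   // prints "2 1"; a real function would give "1 0"
        return 0;
    }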

0
2019-05-08 03:59:10

But yes, macros can be designed and implemented far better than they are in C/C++.

The problem with macros is that they are effectively a language syntax extension mechanism that rewrites your code into something else.

  • In the C/C++ case, there is no basic sanity checking. If you are careful, things are OK. If you make a mistake, or if you overuse macros, you can get into big trouble.

    Add to this that many simple things you can do with (C/C++-style) macros can be done in other ways in other languages.

  • In other languages, such as the various Lisp dialects, macros are much better integrated with the core language syntax, but you can still run into problems with declarations in a macro "leaking". This is addressed by hygienic macros; a sketch of the underlying problem follows below.
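
A sketch of that "leaking" problem, written with a C-style macro since that is the syntax used elsewhere in this thread (SWAP and tmp are illustrative names): the macro's temporary captures the caller's variable of the same name, which is exactly what a hygienic macro system renames away automatically.

    #define SWAP(a, b) { int tmp = (a); (a) = (b); (b) = tmp; }

    int main() {
        int tmp = 1, x = 2;
        // Expands to { int tmp = (tmp); (tmp) = (x); (x) = tmp; }:
        // the macro's tmp shadows the caller's tmp, so the swap silently fails.
        SWAP(tmp, x);
        return 0;
    }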


Quick Historical Context

Macros (short for macro-instructions) first appeared in the context of assembly language. According to Wikipedia, macros were available in some IBM assemblers in the 1950s.

The original LISP did not have macros; they were first introduced into MacLisp in the mid-1960s (https://stackoverflow.com/questions/3065606/when-did-the-idea-of-macros-user-defined-code-transformation-appear, http://www.csee.umbc.edu/courses/331/resources/papers/Evolution-of-Lisp.pdf). Before that, "fexprs" provided macro-like functionality.

The earliest versions of C did not have macros (http://cm.bell-labs.com/cm/cs/who/dmr/chist.html). They were added circa 1972-73 via a preprocessor, which at first supported only #include and #define.

The M4 macro preprocessor originated circa 1977.

More recent languages apparently implement macros whose model of operation is syntactic rather than textual.

So when someone talks about the primacy of a particular meaning of the term "macro", it is worth bearing in mind that the meaning has evolved over time.

0
2019-05-08 02:07:26