Topic: The true meaning of "const"
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Sun, 4 Dec 1994 00:33:00 GMT
In article <CznxMx.Jto@alsys.com> kst@alsys.com (Keith Thompson) writes:
>In <CzLIEn.Au1@ucc.su.OZ.AU> maxtal@physics.su.OZ.AU (John Max Skaller) writes:
>> Language Standards make guarantees about semantics
>> of nice programs and attempt to avoid placing constraints on
>> implementors. This is to permit both optimisation and applicability
>> to a wide range of architectures.
>
>That's true of some language standards, but not less true of others.
>The Ada standard, for example, is far stricter than the C and C++
>standards in its requirements on the implementation in the presence
>of illegal programs. An illegal construct in an Ada program must be
>diagnosed at compilation time; an Ada compiler may not successfully
>compile an invalid program. The intent is to promote portability by
>disallowing extensions to the language standard (though an implementation
>may define new pragmas, attributes, and compilation units).
This is _possible_ in Ada in some areas
_because_ it has a module system (packages), and almost _impossible_
in C++ because it does not.
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd,
81A Glebe Point Rd, GLEBE Mem: SA IT/9/22,SC22/WG21
NSW 2037, AUSTRALIA Phone: 61-2-566-2189
Author: mvuille@procntrl.synapse.net (Martin Vuille)
Date: Mon, 21 Nov 1994 21:28:25 LOCAL
In article <CzLIEn.Au1@ucc.su.OZ.AU> maxtal@physics.su.OZ.AU (John Max Skaller) writes:
>In article <mvuille.15.001696F8@procntrl.synapse.net> mvuille@procntrl.synapse.net (Martin Vuille) writes:
>>It seems to me that one way in which #defines are superior is that
>>they are guaranteed not to allocate any storage. (This can be significant
>>in some resource-restricted environments, such as microcontrollers.)
> I disagree. Not only is there no such guarantee, but
>allocating such storage may, on some architectures, REDUCE
>total storage use, especially if the constant is used in
>more than one place. For example, short form addressing
>may be more efficient than embedding a constant in an
>immediate mode machine instruction.
>Specifying that a #define or const int "may not occupy any storage"
>is clearly a gratuitous overspecification. There is no
>way you can tell. Such a requirement would be meaningless.
>Summary: there is no rule on storage for const int because
>it would be meaningless and even if it stuck it might
>be suboptimal.
As you (and another astute individual who replied via e-mail) have
correctly pointed out, I should have given this a little more
thought before beaming out my question to the electronic universe.
Every useful "constant" will require storage, either in the data
space or in the program space (as an immediate operand).
At the time I formulated my original question, I was pondering the
consequence of using "const" vs. "#define" in an environment where
data space is extremely constrained.
Clearly, whether the choice of "const" over "#define" will result
in a larger or smaller program (code+data) is very dependent on
many factors that are outside the scope of a language standard.
Although I had blithely assumed that "const" would require more
data store than "#define" (and at the same time ignored the
possible code store savings), upon reflection I can think of
some pathological cases where the reverse would be true!
Thank you for helping me get my thoughts in order.
MV
Martin Vuille | "Your partner in | System Design Consulting
ProControl | successful product | Software and Firmware Development
(613) 258-0021 | development" | System Integration
Author: maxtal@physics.su.OZ.AU (John Max Skaller)
Date: Mon, 21 Nov 1994 02:18:22 GMT
In article <mvuille.15.001696F8@procntrl.synapse.net> mvuille@procntrl.synapse.net (Martin Vuille) writes:
>The use of "const" for defining "named constants" is promoted as
>superior to the use of preprocessor #define's.
>
>It seems to me that one way in which #defines are superior is that
>they are guaranteed not to allocate any storage. (This can be significant
>in some resource-restricted environments, such as microcontrollers.)
I disagree. Not only is there no such guarantee, but
allocating such storage may, on some architectures, REDUCE
total storage use, especially if the constant is used in
more than one place. For example, short form addressing
may be more efficient than embedding a constant in an
immediate mode machine instruction.
>
>What does the C standard or the C++ proto-standard say about this?
Absolutely nothing. Nor should they/it. Nor, at least
in C and C++, is there any way whatsoever such a guarantee
could even be sensibly _written_ since it could not possibly
mean anything -- only rules which are _testable_ can
be written sensibly as constraints on implementors. ***
Language Standards make guarantees about semantics
of nice programs and attempt to avoid placing constraints on
implementors. This is to permit both optimisation and applicability
to a wide range of architectures.
*** In principle there is only one kind of rule in the C++ WD:
constraints on implementations. You may think there are
constraints on programmers "you have to initialise a reference"
for example. That is wrong. "If a reference is not initialised,
the program is ill formed" is Standardese and it means
"The implementation _shall_ diagnose if a reference is not
initialised and all other constraints of this Standard are
hereby lifted".
How about "You can't dereference a null pointer?"
In Standardese "The behaviour of a program which dereferences
a null pointer is undefined". Which means the implementor
can do anything they want. Trash your disk. (Under DOS this
_really_ happens!) That is, it means there are NO constraints
on the implementor in these circumstances.
Of course, if you want well defined behaviour out of your
program, it is a good idea not to do stuff for which the
Standard releases the implementor from any required
interpretation. That is, you can choose, if you want,
to make those things constraints on you as a programmer.
(This is mandatory for conformance testers)
Now consider a rule of the ARM which in a sense cannot be part of the
C++ Standard "A non-local variable of a type with a constructor
may not be optimised away even if it appears to be unused".
This is not correct. Well, not really. If the constructor (or destructor)
has no behavioural side effects, it can be optimised away
_no matter what the Standard says_.
The committee has a name for this phenomenon: there is
an unwritten rule called the "as if" rule. It means
"When we say 'you must do X' we do not actually mean you must
do X, you can actually do Y as long as it _looks as if_ you
had done X."
This rule is not in the Standard. It is a _consequence_
of the nature of Standardisation: Standards are "behaviourist".
They only specify what can be measured. Most Standards are
overspecified (they specify more than what is measurable).
Such an overspecification is a model; it is done for
simplicity. [Gratuitous overspecification, however, is
not acceptable]
Specifying that a #define or const int "may not occupy any storage"
is clearly a gratuitous overspecification. There is no
way you can tell. Such a requirement would be meaningless.
Summary: there is no rule on storage for const int because
it would be meaningless and even if it stuck it might
be suboptimal.
--
JOHN (MAX) SKALLER, INTERNET:maxtal@suphys.physics.su.oz.au
Maxtal Pty Ltd,
81A Glebe Point Rd, GLEBE Mem: SA IT/9/22,SC22/WG21
NSW 2037, AUSTRALIA Phone: 61-2-566-2189
Author: kst@alsys.com (Keith Thompson)
Date: Tue, 22 Nov 1994 09:42:32 GMT
In <CzLIEn.Au1@ucc.su.OZ.AU> maxtal@physics.su.OZ.AU (John Max Skaller) writes:
> Language Standards make guarantees about semantics
> of nice programs and attempt to avoid placing constraints on
> implementors. This is to permit both optimisation and applicability
> to a wide range of architectures.
That's true of some language standards, but not less true of others.
The Ada standard, for example, is far stricter than the C and C++
standards in its requirements on the implementation in the presence
of illegal programs. An illegal construct in an Ada program must be
diagnosed at compilation time; an Ada compiler may not successfully
compile an invalid program. The intent is to promote portability by
disallowing extensions to the language standard (though an implementation
may define new pragmas, attributes, and compilation units).
*Please* don't start a language flame war about this. I haven't said
that one approach is better than the other; I'm just trying to clarify
the facts.
> The committee has a name for this phenomenon: there is
> an unwritten rule called the "as if" rule. It means
> "When we say 'you must do X' we do not actually mean you must
> do X, you can actually do Y as long as it _looks as if_ you
> had done X."
>
> This rule is not in the Standard. It is a _consequence_
> of the nature of Standardisation: Standards are "behaviourist".
> They only specify what can be measured. Most Standards are
> overspecified (they specify more than what is measurable).
> Such an overspecification is a model; it is done for
> simplicity. [Gratuitous overspecification, however, is
> not acceptable]
Actually, I think it is stated rather clearly in the C standard.
See section 5.1.2.3, "Program execution". It states that the standard
describes "an abstract machine in which issues of optimization are
irrelevant". The minimal requirements on a conforming implementation are
(paraphrasing):
Volatile objects must be stable at sequence points.
At program termination, all data written to files must be up to date.
I/O to interactive devices must take place as specified.
I'd guess that the C++ standard has (or will have) similar wording.
> Specifying that a #define or const int "may not occupy any storage"
> is clearly a gratuitous overspecification. There is no
> way you can tell. Such a requirement would be meaningless.
>
> Summary: there is no rule on storage for const int because
> it would be meaningless and even if it stuck it might
> be suboptimal.
Agreed.
--
Keith Thompson (The_Other_Keith) kst@alsys.com
TeleSoft^H^H^H^H^H^H^H^H Alsys, Inc.
10251 Vista Sorrento Parkway, Suite 300, San Diego, CA, USA, 92121-2718
/user/kst/.signature: I/O error (core dumped)
Author: kst@alsys.com (Keith Thompson)
Date: Wed, 23 Nov 1994 00:41:37 GMT
In <CznxMx.Jto@alsys.com> I wrote:
> That's true of some language standards, but not less true of others.
---
>>> Error: delete "not" in line above.
> The Ada standard, for example, is far stricter than the C and C++
> standards in its requirements on the implementation in the presence
> of illegal programs. An illegal construct in an Ada program must be
> diagnosed at compilation time; an Ada compiler may not successfully
> compile an invalid program.
Based on e-mail response, I may have generated some confusion here.
The Ada standard does not require all errors to be detected during
compilation. It does require all "illegal" constructs (e.g., syntax
errors, references to undeclared variables, type mismatches) to be
detected and rejected during compilation.
Another class of errors must be detected at run time (e.g., arithmetic
overflow, array index violation); for these, Ada uses an exception
mechanism similar to C++'s. Mechanisms are available to suppress
certain run-time checks, at the risk of unpredictable behavior if the
corresponding constraints are violated.
Yet another class of errors results in "erroneous execution"; such errors
are not required to be detected during either compilation or execution,
and can result in arbitrarily bad things happening. This corresponds
to "undefined behavior" in C and C++. The authors of the Ada standard
have tried very hard to minimize this last category, including only
constructs that cannot reasonably be detected in all cases given the
current state of the art.
Sorry for the off-topic post. My point is more about language standards
in general than about C/C++/Ada in particular.
--
Keith Thompson (The_Other_Keith) kst@alsys.com
TeleSoft^H^H^H^H^H^H^H^H Alsys, Inc.
10251 Vista Sorrento Parkway, Suite 300, San Diego, CA, USA, 92121-2718
/user/kst/.signature: I/O error (core dumped)
Author: mvuille@procntrl.synapse.net (Martin Vuille)
Date: Thu, 17 Nov 1994 22:35:13
The use of "const" for defining "named constants" is promoted as
superior to the use of preprocessor #define's.
It seems to me that one way in which #defines are superior is that
they are guaranteed not to allocate any storage. (This can be significant
in some resource-restricted environments, such as microcontrollers.)
What does the C standard or the C++ proto-standard say about this?
When I checked Stroustrup, 2nd ed., it seemed that this was entirely
left up to how much effort the implementor wanted to put in. K&R, 2nd
ed., didn't give me the impression that the compiler would even try
to avoid allocating storage.
Any comments?
(posted to both comp.std.c++ and comp.std.c)
MV
Martin Vuille | "Your partner in | System Design Consulting
ProControl | successful product | Software and Firmware Development
(613) 258-0021 | development" | System Integration