Topic: Related? question regarding enum
Author: davisonj@en.ecn.purdue.edu (John M Davison)
Date: Thu, 28 Apr 1994 02:26:26 GMT
In article <Co4Awo.Bs7@ucc.su.OZ.AU> maxtal@physics.su.OZ.AU (John Max
Skaller) writes:
>
>Unless I'm mistaken the compiler is required to choose
>the underlying type as an arithmetic type of appropriate
>signedness and sufficient size "not gratuitously larger than 'int'".
On the other hand, I believe that the current Hewlett-Packard C
compiler for their HP-UX systems has an extension that allows the programmer
to expressly specify the type to be associated with a given enum.
--
John Davison
davisonj@ecn.purdue.edu
"I cannot conceive that anybody will require multiplications at the rate of
40,000 or even 4,000 per hour..." -- F. H. Wales (1936)
Author: eru@tnso04.tele.nokia.fi (Erkki Ruohtula)
Date: 19 Apr 1994 17:49:02 GMT
In article <1994Apr13.183042.6437@xilinx.com> lou@xilinx.com (Lou Sanchez-Chopitea) writes:
>In article <ERU.94Apr11131205@tnso04.tele.nokia.fi> eru@tnso04.tele.nokia.fi (Erkki Ruohtula) writes:
>>Wouldn't the following rule solve the problem with no extra syntax:
>>The implementation must always choose the smallest available integer
>>type (the type with minimum sizeof) that is able to contain all of the
>>values of the enum constants. If there is both a signed and an unsigned
>>type that satisfies the above rule, then the unsigned type must be chosen.
>
> Why over-constrain the implementation? The only need I see is to
>have a mechanism to force the implementation to choose something
>suitable. The implementation might be able to optimize better on
>longs or shorts than on chars. Why force unsigned if signed is
>also OK? If the programmer can include valued constants to force
>a certain minimum size or signedness, it should suffice.
You probably did not see the context of the original discussion, which
was about forcing the enum to have some particular size for whatever
reason (for example, an externally imposed data layout or a desire to
save space). Or at least that is how I understood the problem being discussed.
Someone commented earlier that my proposed rule does not
guarantee the user that the representation will be equivalent to
"unsigned short", "long int", etc., since there are many ways an
implementation might bind the integer sizes to these type names. In my
opinion this is irrelevant here.
On any implementation with 1-, 2- and 4-byte integer sizes and 8-bit
bytes, applying the rule to the declaration "enum {a, b=0xffff}" would
always produce an unsigned 2-byte type, regardless of whether the
implementation calls that type "unsigned short" or "unsigned int".
Another poster commented that declaring dummy constants just to
define the type size produces bad name space pollution, and that an
"unnamed constant" declaration would be needed. I agree completely.
I was merely trying to show how to do the extension without
introducing extra syntax rules.
Actually I would also be relatively happy with the other proposal
(<representation type> enum { ...). The main thing is that there
should be some standard way to force the implementation to use an
enum representation chosen by the programmer for those situations
where the programmer knows best.
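For concreteness, the two proposals side by side (hypothetical syntax;
neither form is accepted by current compilers):

    // Proposal 1: a representation type prefixed to the enum.
    unsigned short enum Color { red, green, blue };

    // Proposal 2: no new syntax; a dummy enumerator forces the range,
    // so the smallest-type rule must pick an unsigned 16-bit type.
    enum Color2 { red2, green2, blue2, color2_range_ = 0xffff };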
--
Erkki Ruohtula / Nokia Telecommunications Oy
eru@tele.nokia.fi / P.O. Box 33 SF-02601 Espoo, Finland
(My private opinions, of course)
Author: lou@xilinx.com (Lou Sanchez-Chopitea)
Date: Wed, 13 Apr 1994 18:30:42 GMT
In article <ERU.94Apr11131205@tnso04.tele.nokia.fi> eru@tnso04.tele.nokia.fi (Erkki Ruohtula) writes:
>In article <rfgCo35JA.3D2@netcom.com> rfg@netcom.com (Ronald F. Guilmette) writes:
>>This gave me the idea that what would *really* be nice would be to give the
>>programmer control of *both* the signedness *and* the size of each enum type.
>>So, for example, one might be allowed to write:
>>
>> unsigned long enum { TABLE_SIZE = 70000 };
>>
>>or perhaps:
>>
>> signed char enum color { red, green, blue = 127 };
>>
>>Well... it's a thought. I dislike the fact that the implementation gets to
>>choose the size and signedness of enums, and there isn't any *standard*
>>thing that you as a programmer can do to influence it.
>
>Wouldn't the following rule solve the problem with no extra syntax:
>The implementation must always choose the smallest available integer
>type (the type with minimum sizeof) that is able to contain all of the
>values of the enum constants. If there is both a signed and an unsigned
>type that satisfies the above rule, then the unsigned type must be chosen.
Why over-constrain the implementation? The only need I see is to
have a mechanism to force the implementation to choose something
suitable. The implementation might be able to optimize better on
longs or shorts than on chars. Why force unsigned if signed is
also OK? If the programmer can include valued constants to force
a certain minimum size or signedness, it should suffice.
>
[some deleted]
>--
>Erkki Ruohtula / Nokia Telecommunications Oy
>eru@tele.nokia.fi / P.O. Box 33 SF-02601 Espoo, Finland
>(My private opinions, of course)
Cheers
Lou
--
Lou Sanchez-Chopitea EMail: lou@xilinx.com
Senior Software Engineer SnailMail: 2100 Logic Drive
SpeakMail: (408) 879-5059 San Jose, CA 95124
FaxMail: (408) 559-7114 #include <disclaimer.h>
Author: rfg@netcom.com (Ronald F. Guilmette)
Date: Mon, 11 Apr 1994 08:13:58 GMT
(Note that I have cross-posted this response to *both* comp.std.c *and* also
to comp.std.c++.)
In article <wmallory.54.2D9B0C1E@amelia.sp.trw.com> wmallory@amelia.sp.trw.com (Walter Mallory) writes:
>>> unsigned enum FLAVOR { vanilla, chocolate, strawberry } flavor;
>>>
>>>which I believe would allow what I have in mind without invalidating any
>>>existing code. Is this reasonable or am I out of my mind (or just woefully
>>>uninformed)?
>
>>I think that's a darn good idea. Maybe we can get it into the next version
>>of the C standard. If not, maybe x3j16 can be convinced to put it into
>>the C++ standard. (Lord knows they put everything else in!)
>
>It's nice to know I am not completely alone on this. Being able to have
>control over the sign of an enumerated type and having *standard* support for
>enumerated types with bit fields would be a definite plus for me. I have
>found this to be a powerful combination (I have been able to live with the
>lack of sign control for enums) when used with a lint that supports strong
>type checking.
Coincidentally enough, another thing came up with respect to the SIZE of enums
in a C++ class I was teaching this past week.
In the class (which consisted mostly of PC programmers) someone asked if
enumerators could always be used in place of non-parameterized macros (in
either C or C++) in those cases where the definition would be just an
integral constant, as in:
enum { TABLE_SIZE = 70000 };
Sadly, we quickly found out that the PC C++ compiler we were using refused
to accept this... because the value in question was bigger than a (16-bit)
`int'. (Its refusal was, of course, correct according to the ANSI/ISO C
standard.)
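The standard-conforming fallbacks are less satisfying (a sketch; the
const form is C++-specific, while C code is stuck with a macro carrying
an explicit L suffix):

    const long TABLE_SIZE = 70000L;   // a const integral object is usable in
                                      // constant expressions in C++ ...
    char table[TABLE_SIZE];           // ... such as array bounds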
This gave me the idea that what would *really* be nice would be to give the
programmer control of *both* the signedness *and* the size of each enum type.
So, for example, one might be allowed to write:
unsigned long enum { TABLE_SIZE = 70000 };
or perhaps:
signed char enum color { red, green, blue = 127 };
Well... it's a thought. I dislike the fact that the implementation gets to
choose the size and signedness of enums, and there isn't any *standard*
thing that you as a programmer can do to influence it.
>Since I understand that x3j16 is considering strong(er?) type
>checking for the C++ standard, maybe a case can be made.
In C++, enum types are already pretty strongly typed... but that's not even
related to the issue here. Here, we are talking about ``representation'',
not ``type''.
--
-- Ron Guilmette, Sunnyvale, CA ---------- RG Consulting -------------------
---- domain addr: rfg@netcom.com ----------- Purveyors of Compiler Test ----
---- uucp addr: ...!uunet!netcom!rfg ------- Suites and Bullet-Proof Shoes -
Author: lincmad@netcom.com (Linc Madison)
Date: Mon, 11 Apr 1994 09:23:42 GMT
Ronald F. Guilmette (rfg@netcom.com) wrote:
: In article <...> wmallory@amelia.sp.trw.com (Walter Mallory) writes:
: >>> unsigned enum FLAVOR { vanilla, chocolate, strawberry } flavor;
: >
: >It's nice to know I am not completely alone on this. Being able to have
: >control over the sign of an enumerated type and having *standard* support
: >for enumerated types with bit fields would be a definite plus for me.
: Coincidentally enough, another thing came up with respect to the SIZE of enums
: in a C++ class I was teaching this past week.
: enum { TABLE_SIZE = 70000 }; /* not allowed if 70000 > MAXINT */
: unsigned long enum { TABLE_SIZE = 70000 };
: signed char enum color { red, green, blue = 127 };
: Well... it's a thought. I dislike the fact that the implementation gets to
: choose the size and signedness of enums, and there isn't any *standard*
: thing that you as a programmer can do to influence it.
: >Since I understand that x3j16 is considering strong(er?) type
: >checking for the C++ standard, maybe a case can be made.
: In C++, enum types are already pretty strongly typed... but that's not even
: related to the issue here. Here, we are talking about ``representation'',
: not ``type''.
There are some portability issues on the other side, though. The
current standard ensures that any enum can always be cast to an int
(implicitly or explicitly) without loss of information. The issue also
arises in overloaded function resolution, since under current rules
enums are a best match to their own enum type, then to type 'int'. You
would have to keep a table of the size of each enum and which integer
type it can be cast to for these purposes.
You also get some interesting results if you do things like:
unsigned long enum foo { foo0, foo1, foo2, foo3 };
signed char enum color { red, green, blue = 127 };
...
unsigned long ul;
int n;
signed char sc;
color c;
ul = foo2 + foo3; // no problem, both are unsigned longs
ul = foo2 + red; // red gets promoted to unsigned long
n = green + 2; // green gets promoted to int
c = red + 1; // red is promoted to int, result is int -- ERROR
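Presumably an explicit conversion back to the enum type would be needed
to make that last line legal, along the lines of:

    c = color(red + 1);   // explicit conversion; safe here because the
                          // value 1 is within the range of color's enumerators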
-- Linc Madison * Oakland, California * LincMad@Netcom.com
Author: eru@tnso04.tele.nokia.fi (Erkki Ruohtula)
Date: 11 Apr 1994 10:12:04 GMT
In article <rfgCo35JA.3D2@netcom.com> rfg@netcom.com (Ronald F. Guilmette) writes:
>This gave me the idea that what would *really* be nice would be to give the
>programmer control of *both* the signedness *and* the size of each enum type.
>So, for example, one might be allowed to write:
>
> unsigned long enum { TABLE_SIZE = 70000 };
>
>or perhaps:
>
> signed char enum color { red, green, blue = 127 };
>
>Well... it's a thought. I dislike the fact that the implementation gets to
>choose the size and signedness of enums, and there isn't any *standard*
>thing that you as a programmer can do to influence it.
Wouldn't the following rule solve the problem with no extra syntax:
The implementation must always choose the smallest available integer
type (the type with minimum sizeof) that is able to contain all of the
values of the enum constants. If there is both a signed and an unsigned
type that satisfies the above rule, then the unsigned type must be chosen.
So on typical machines:
the representation of enum {a, b} is unsigned char,
the representation of enum {a = -200, b = 200} is signed short, etc.
With this rule, you would be able to specify any representation you
want by adding dummy constants at the boundaries. If you do not care
about the representation, the system will give you the representation
with the smallest memory requirement, which is often useful. Currently
I am sometimes forced to use #defined constants instead of enums (which
I would prefer) simply to avoid wasting memory in large data structures.
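To put rough numbers on the memory argument (a sketch assuming a
typical implementation that stores enums as 4-byte ints; actual sizes
are implementation-defined):

    #include <cstdio>

    enum Color { red, green, blue };        // commonly stored as an int

    struct WithEnum { Color c; };           // typically sizeof == 4
    struct WithChar { unsigned char c; };   // sizeof == 1; this is what the
                                            // #define workaround buys you

    int main()
    {
        printf("WithEnum: %lu, WithChar: %lu\n",
               (unsigned long) sizeof(WithEnum),
               (unsigned long) sizeof(WithChar));
        return 0;
    }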
Some current C compilers with certain options already choose the enum
representation with rules somewhat like this, but most do not.
Would the above-described behaviour be completely legal under
the current C standard?
--
Erkki Ruohtula / Nokia Telecommunications Oy
eru@tele.nokia.fi / P.O. Box 33 SF-02601 Espoo, Finland
(My private opinions, of course)
Author: g2devi@cdf.toronto.edu (Robert N. Deviasse)
Date: Mon, 11 Apr 1994 12:50:27 GMT
In article <ERU.94Apr11131205@tnso04.tele.nokia.fi> eru@tnso04.tele.nokia.fi (Erkki Ruohtula) writes:
>In article <rfgCo35JA.3D2@netcom.com> rfg@netcom.com (Ronald F. Guilmette) writes:
>>This gave me the idea that what would *really* be nice would be to give the
>>programmer control of *both* the signedness *and* the size of each enum type.
>>So, for example, one might be allowed to write:
>>
>> unsigned long enum { TABLE_SIZE = 70000 };
>>
>>or perhaps:
>>
>> signed char enum color { red, green, blue = 127 };
>>
>>Well... it's a thought. I dislike the fact that the implementation gets to
>>choose the size and signedness of enums, and there isn't any *standard*
>>thing that you as a programmer can do to influence it.
>
>Wouldn't the following rule solve the problem with no extra syntax:
>The implementation must always choose the smallest available integer
>type (the type with minimum sizeof) that is able to contain all of the
>values of the enum constants. If there is both a signed and an unsigned
>type that satisfies the above rule, then the unsigned type must be chosen.
>
>So on typical machines:
>the representation of enum {a, b} is unsigned char,
>the representation of enum {a = -200, b = 200} is signed short, etc.
>
>With this rule, you would be able to specify any representation you
>want by adding dummy constants at the boundaries. If you do not care
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Having dummy constants pollute the global namespace irks me. If this route is
taken, then we should at least be allowed to specify the dummy constant without
polluting the global namespace. Something like:
enum {a,b, ...=-1};
should work. Personally, I'd prefer the
unsigned char enum {a,b};
option. Do we really need another kludge solution in C++?
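To make the objection concrete (hypothetical names; the point is that
the dummy enumerator is an ordinary name in the enclosing scope):

    enum Status { ok, failed, status_widen_ = 0x7fff };  // status_widen_
                                                         // exists only to
                                                         // widen the type

    long n = status_widen_;   // compiles: nothing marks the dummy as
                              // internal, and its name is now taken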
>
>--
>Erkki Ruohtula / Nokia Telecommunications Oy
>eru@tele.nokia.fi / P.O. Box 33 SF-02601 Espoo, Finland
>(My private opinions, of course)
Take care
Robert
--
/----------------------------------+------------------------------------------\
| Robert N. Deviasse |"If we have to re-invent the wheel, |
| EMAIL: g2devi@cdf.utoronto.ca | can we at least make it round this time"|
+----------------------------------+------------------------------------------/
Author: kanze@us-es.sel.de (James Kanze)
Date: 12 Apr 1994 18:54:15 GMT
In article <rfgCo35JA.3D2@netcom.com> rfg@netcom.com (Ronald F.
Guilmette) writes:
|> This gave me the idea that what would *really* be nice would be to give the
|> programmer control of *both* the signedness *and* the size of each enum type.
|> So, for example, one might be allowed to write:
|> unsigned long enum { TABLE_SIZE = 70000 };
|> or perhaps:
|> signed char enum color { red, green, blue = 127 };
Although the programmer does not have control, the standards
committee has changed the definition of the "underlying type" of the
enum so that the enums you propose must be accepted. If it
doesn't fit in an int, then the compiler is obliged to make it a long.
I expect it is only a matter of time (lots of time?) until compilers
actually implement this.
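In other words, a declaration like the one that started this thread
must now be accepted (a sketch; which wider type gets chosen is still
up to the implementation, it merely has to be big enough):

    #include <cstdio>

    enum Big { TABLE_SIZE = 70000 };   // rejected by 16-bit-int compilers
                                       // under the old rules; the revised
                                       // wording widens the underlying type
                                       // (e.g. to long) instead

    int main()
    {
        printf("sizeof(Big) = %lu\n", (unsigned long) sizeof(Big));
        return 0;
    }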
--
James Kanze email: kanze@lts.sel.alcatel.de
GABI Software, Sarl., 8 rue du Faisan, F-67000 Strasbourg, France
Conseils en informatique industrielle --
-- Beratung in industrieller Datenverarbeitung