Topic: enums are always signed?
Author: Ross Smith <ross.s@ihug.co.nz>
Date: 1999/11/11
wmm@fastdial.net wrote:
>
> You are correct -- the underlying type of the enum must be unsigned
> (unless sizeof(int) on that implementation is >4). The sentence
> following the one you quoted is informative on this point:
>
> It is implementation-defined which integral type is used as
> the underlying type for an enumeration except that the
> underlying type shall not be larger than int unless the value
> of an enumerator cannot fit in an int or unsigned int.
>
> It's clear that your enumerator would require exactly an unsigned
> int (and not a signed long, for instance) on an implementation where
> sizeof(int) == 4.
Which leads to the interesting question of what happens when each of an
enum's values fits in an int or unsigned int, but its overall range doesn't
fit in either.
enum Foobar {
    Foo = -1,
    Bar = 0xFFFFFFFF
};
On a system with 32-bit int, by 7.2p5 the underlying type may not be
larger than int, since all of the enum values will fit into an int or
unsigned int. But there's no suitable type that holds both of the
values.
(Experimenting with some real compilers, I found that GCC 2.95 gives the
enumeration 64 bits, while MSVC 6 and Sun CC 5 both give it a signed
32-bit underlying type and set Bar to -1.)
--
Ross Smith <ross.s@ihug.co.nz> The Internet Group, Auckland, New Zealand
========================================================================
"There are many technical details that make Linux attractive to the
sort of people to whom technical details are attractive." -- Suck
Author: Francis Glassborow <francis@robinton.demon.co.uk>
Date: 1999/11/11
In article <80a2qm$ecp@news.or.intel.com>, Eric Guadalupe
<eguad@usa.net> writes
>To me, it seems obvious that int cannot represent the value
>0xffffffffu, so the underlying type of the enum should be
>unsigned, and the assignment should not generate a warning.
But it could also use a long. The Standard does not, IIRC, place a
requirement on which choice a compiler should make. In the case you
quoted this would result in a narrowing conversion - certainly worth a
warning.
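(If portability across those choices matters, one sketch of a workaround is
simply to make the conversion explicit; this assumes nothing beyond the code
from the original question:)

enum { Value = 0xffffffffu };

// The cast documents the conversion and typically silences the
// sign-change/narrowing warning, whatever underlying type was chosen.
unsigned n = static_cast<unsigned>(Value);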
Francis Glassborow Journal Editor, Association of C & C++ Users
64 Southfield Rd
Oxford OX4 1PA +44(0)1865 246490
All opinions are mine and do not represent those of any organisation
Author: Christopher Eltschka <celtschk@physik.tu-muenchen.de>
Date: 1999/11/12
"James Kuyper Jr." wrote:
>
> Eric Guadalupe wrote:
> >
> > Hello,
> >
> > I have a question about a compiler's behavior regarding
> > the following lines:
> >
> > enum { Value = 0xffffffffu };
> > unsigned n = Value;
> >
> > At the second statement, the compiler I am using generates
> > a warning about an integer conversion causing a change in
> > sign.
> >
> > Does this mean the compiler is using signed int for enum?
>
> If 0xffffffffu < (unsigned)INT_MAX, then it can use 'int' as the
> underlying type. If 0xffffffffu > UINT_MAX, then it can use 'long' as
> the underlying type.
>
> > To me, it seems obvious that int cannot represent the value
> > 0xffffffffu, so the underlying type of the enum should be
> > unsigned, and the assignment should not generate a warning.
>
> It only has to be unsigned if 0xffffffffu is between INT_MAX and
> UINT_MAX. There aren't many implementations with that characteristic.
> UINT_MAX has to be 2^N-1, for some integer N. INT_MAX is typically about
> half UINT_MAX, in which case N would have to be 36. 36 bit integer
> implementations exist, but they're not very common.
I count 8 f's; therefore 0xffffffffu is 16^8 - 1, which is 2^32 - 1.
Most 32-bit systems have INT_MAX=2^31-1 and UINT_MAX=2^32-1.
So on those systems, 0xffffffffu should be _exactly_ UINT_MAX.
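(That is easy to confirm on a given implementation with <climits>; a
minimal check:)

#include <climits>
#include <cstdio>

int main()
{
    // On a typical 32-bit-int implementation this prints
    //   INT_MAX  = 2147483647    (2^31 - 1)
    //   UINT_MAX = 4294967295    (2^32 - 1)
    std::printf("INT_MAX  = %d\n", INT_MAX);
    std::printf("UINT_MAX = %u\n", UINT_MAX);

    // On such systems 0xffffffffu is exactly UINT_MAX.
    std::printf("0xffffffffu == UINT_MAX: %d\n",
                0xffffffffu == UINT_MAX ? 1 : 0);
    return 0;
}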
Author: "James Kuyper Jr." <kuyper@wizard.net>
Date: 1999/11/12
"James Kuyper Jr." wrote:
>
> Eric Guadalupe wrote:
> >
> > Hello,
> >
> > I have a question about a compiler's behavior regarding
> > the following lines:
> >
> > enum { Value = 0xffffffffu };
> > unsigned n = Value;
> >
> > At the second statement, the compiler I am using generates
> > a warning about an integer conversion causing a change in
> > sign.
> >
> > Does this mean the compiler is using signed int for enum?
>
> If 0xffffffffu < (unsigned)INT_MAX, then it can use 'int' as the
> underlying type. If 0xffffffffu > UINT_MAX, then it can use 'long' as
> the underlying type.
>
> > To me, it seems obvious that int cannot represent the value
> > 0xffffffffu, so the underlying type of the enum should be
> > unsigned, and the assignment should not generate a warning.
>
> It only has to be unsigned if 0xffffffffu is between INT_MAX and
> UINT_MAX. There aren't many implementations with that characteristic.
> UINT_MAX has to be 2^N-1, for some integer N. INT_MAX is typically about
> half UINT_MAX, in which case N would have to be 36. 36 bit integer
> implementations exist, but they're not very common.
Sorry, I miscounted the hex digits (for some reason I was thinking of
the 'u' as another digit). Change 36 to 32, and remove my comments about
it being uncommon.
Author: "Eric Guadalupe" <eguad@usa.net>
Date: 1999/11/09
Hello,
I have a question about a compiler's behavior regarding
the following lines:
enum { Value = 0xffffffffu };
unsigned n = Value;
At the second statement, the compiler I am using generates
a warning about an integer conversion causing a change in
sign.
Does this mean the compiler is using signed int for enum?
In 7.2.5 the standard says:
"The underlying type of an enumeration is an integral
type that can represent all the enumerator values
defined in the enumeration".
To me, it seems obvious that int cannot represent the value
0xffffffffu, so the underlying type of the enum should be
unsigned, and the assignment should not generate a warning.
Is this legitimate compiler behavior, or is this a misuse
of the enum keyword? A member constant would be ideal; I
just want the value to appear in the class declaration.
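(For reference, a sketch of the class-scope form this describes; the class
name Widget is invented purely for illustration:)

class Widget {
public:
    // The "enum hack": Value is a compile-time constant that appears
    // directly in the class declaration.
    enum { Value = 0xffffffffu };
};

unsigned n = Widget::Value;   // the same conversion question arises here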
Eric
Author: "James Kuyper Jr." <kuyper@wizard.net>
Date: 1999/11/11
Eric Guadalupe wrote:
>
> Hello,
>
> I have a question about a compiler's behavior regarding
> the following lines:
>
> enum { Value = 0xffffffffu };
> unsigned n = Value;
>
> At the second statement, the compiler I am using generates
> a warning about an integer conversion causing a change in
> sign.
>
> Does this mean the compiler is using signed int for enum?
If 0xffffffffu < (unsigned)INT_MAX, then it can use 'int' as the
underlying type. If 0xffffffffu > UINT_MAX, then it can use 'long' as
the underlying type.
> To me, it seems obvious that int cannot represent the value
> 0xffffffffu, so the underlying type of the enum should be
> unsigned, and the assignment should not generate a warning.
It only has to be unsigned if 0xffffffffu is between INT_MAX and
UINT_MAX. There aren't many implementations with that characteristic.
UINT_MAX has to be 2^N-1, for some integer N. INT_MAX is typically about
half UINT_MAX, in which case N would have to be 36. 36 bit integer
implementations exist, but they're not very common.
Author: wmm@fastdial.net
Date: 1999/11/11
In article <80a2qm$ecp@news.or.intel.com>,
"Eric Guadalupe" <eguad@usa.net> wrote:
> I have a question about a compiler's behavior regarding
> the following lines:
>
> enum { Value = 0xffffffffu };
> unsigned n = Value;
>
> At the second statement, the compiler I am using generates
> a warning about an integer conversion causing a change in
> sign.
>
> Does this mean the compiler is using signed int for enum?
> In 7.2.5 the standard says:
>
> "The underlying type of an enumeration is an integral
> type that can represent all the enumerator values
> defined in the enumeration".
>
> To me, it seems obvious that int cannot represent the value
> 0xffffffffu, so the underlying type of the enum should be
> unsigned, and the assignment should not generate a warning.
>
> Is this legitimate compiler behavior? Or is this a misuse
> of the enum keyword? A member constant would be ideal, I
> just want the value to appear in the class declaration.
You are correct -- the underlying type of the enum must be unsigned
(unless sizeof(int) on that implementation is >4). The sentence
following the one you quoted is informative on this point:
    It is implementation-defined which integral type is used as
    the underlying type for an enumeration except that the
    underlying type shall not be larger than int unless the value
    of an enumerator cannot fit in an int or unsigned int.
It's clear that your enumerator would require exactly an unsigned
int (and not a signed long, for instance) on an implementation where
sizeof(int) == 4.
I don't think it's a misuse of enum, although an initialized static
const member would probably be better. There is a lot of code that
was written using enums this way before initialized static const
members were available, and there's no intention on the part of the
Committee to invalidate that code or that usage.
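(A sketch of the two styles side by side, assuming a compiler that
implements in-class initialization of static const integral members; the
class name is invented for illustration:)

class Widget {
public:
    // Pre-standard idiom: the enum hack.
    enum { EnumValue = 0xffffffffu };

    // Standard C++ alternative: an initialized static const member,
    // which keeps the intended type (unsigned) explicit.
    static const unsigned ConstValue = 0xffffffffu;
};

// A definition is still needed if ConstValue is used as an object
// (e.g. if its address is taken or it is bound to a reference).
const unsigned Widget::ConstValue;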
--
William M. Miller, wmm@fastdial.net
OnDisplay, Inc. (www.ondisplay.com)