Topic: conversion operator rules


Author: fjh@munta.cs.mu.OZ.AU (Fergus Henderson)
Date: Wed, 25 Jan 1995 14:10:23 GMT
> Paul J Lucas <pjl@graceland.att.com> writes:
>> jason@cygnus.com (Jason Merrill) writes:
>>> Paul J Lucas <pjl@graceland.att.com> writes:
>
>>>> struct B : private A {
>>>>     operator A() const;
>>>>     operator A&();
>>>>     operator A const&() const;
>>>> };
>
>>> None of these UDC's should ever be called.
>
>>  But, with private inheritance, the above is actually a useful
>>  technique.  Why was this forbidden?

I don't know the committee's reasoning at the time, but one design
principle frequently mentioned is that C++ is supposed to be extensible,
but not mutable.  The conversions you are trying to define already have
builtin meanings in C++.
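
For example, here is a minimal sketch of the built-in meaning such an
'operator A' would be redefining (shown with public inheritance so the
built-in conversion is accessible; the class names are reused only for
illustration):

struct A { };

struct B : public A { };

void f(const A&) { }

int main()
{
    B b;
    f(b);       // the derived-to-base conversion is already built in
    A a = b;    // likewise, copy-initializing the A subobject needs no UDC
    return 0;
}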

--
Fergus Henderson - fjh@munta.cs.mu.oz.au
all [L] (programming_language(L), L \= "Mercury") => better("Mercury", L) ;-)




Author: jason@cygnus.com (Jason Merrill)
Date: Mon, 16 Jan 1995 20:33:09 GMT
>>>>> Paul J Lucas <pjl@graceland.att.com> writes:

>  Ok, fine; but that still leaves the other part of my question
>  unanswered: What is the difference between:

>   operator T() const

>  and either

>   operator T&()
>   operator T const&() const

>  i.e., is either of the latter two, when present, preferred over
>  the former and, if so, why?

The second, operator T&(), would be the only alternative if you needed to
bind the return value to a non-const reference.  Any of the three would be
candidates if you needed to bind the return value to a const reference or
call a member function on the return value.
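
For example, a rough sketch (using an unrelated target type T, so that the
conversion functions are actually legal, with made-up names):

struct T { void touch() { } };

struct X {
    T t;
    operator T&() { return t; }     // the only form that can yield a plain T&
    // operator T() const;             either of these would also be a
    // operator T const&() const;      candidate for the const-reference case
};

void modify(T& ref) { ref.touch(); }

int main()
{
    X x;
    modify(x);  // needs a non-const T&, so only 'operator T&()' can be used
    return 0;
}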

The current rules for choosing between user-defined conversion sequences,
from the Valley Forge minutes, are:

          User-defined conversion sequence U1 is a better conversion
          sequence than another user-defined conversion sequence U2 [if
          they contain the same user-defined conversion operator or
          constructor and] if the second standard conversion sequence of U1
          is better than the second standard conversion sequence of U2.

Thus if more than one of the three is a candidate, the call is ambiguous,
and hence ill-formed.
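
For example (again with an unrelated target type and made-up names):

struct U { };

struct Y {
    U u;
    operator U() const { return u; }
    operator U const&() const { return u; }
};

void g(const U&) { }

int main()
{
    Y y;
    // g(y);   // ill-formed: both conversion functions are viable, and
                // neither user-defined conversion sequence is better
    return 0;
}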

>  The ARM also doesn't say whether the former actually constructs
>  an instance of T.  For trivial types like:

>   operator int() const

>  it doesn't matter; but what about for user-defined types where
>  it does?

'operator T' has a return type of 'T'; returning from it is identical to
returning from a plain function with a return type of 'T'.  So yes, it does
construct an instance of T.
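
A rough illustration (the class names here are made up):

#include <stdio.h>

struct T {
    T() { }
    T(const T&) { printf("T copy-constructed\n"); }
};

struct Wrapper {
    T t;
    operator T() const { return t; }    // returns by value, just like a
                                        // plain function returning T
};

int main()
{
    Wrapper w;
    T copy = w;     // the conversion function runs and constructs a new T
    return 0;
}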

Jason




Author: jason@cygnus.com (Jason Merrill)
Date: Mon, 16 Jan 1995 07:40:30 GMT
>>>>> Paul J Lucas <pjl@graceland.att.com> writes:

> In <JASON.95Jan15015820@phydeaux.cygnus.com> jason@cygnus.com (Jason Merrill) writes:
>>>>>>> Paul J Lucas <pjl@graceland.att.com> writes:

>>> struct A { };

>>> struct B : private A {
>>>     operator A() const;
>>>     operator A&();
>>>     operator A const&() const;
>>> };

>> None of these UDC's should ever be called.

>> 12.3.2  Conversion functions                          [class.conv.fct]

>> A conversion operator is never used to convert a (possibly
>> qualified) object (or reference to an object) to the (possibly  quali-
>> fied) same object type (or a reference to it), or to a (possibly qual-
>> ified) base class of that type (or a reference to it).

>  Where did you get this?  It's not in 12.3.2 of my ARM; is it an
>  ANSI addition?

Yes.  The above text is from the September Working Paper.
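
To see the effect of that rule, here is a rough sketch (with public
inheritance, so the built-in conversion is accessible and you can observe
that the declared conversion function is simply never called):

#include <stdio.h>

struct A { };

struct D : public A {
    operator A() const { printf("UDC called\n"); return A(); }
};

void h(A) { }

int main()
{
    D d;
    h(d);   // per the rule above, the UDC is never used; the built-in
            // derived-to-base conversion is used instead, so nothing prints
    return 0;
}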

>  But, with private inheritance, the above is actually a useful
>  technique.  Why was this forbidden?

I don't imagine that private inheritance was considered when drafting the
above rule.  However, I don't believe that private inheritance merits an
exception; for better or for worse, the position of the committee has
been that access control is always checked last, and does not affect
which candidates are eligible for overload resolution.
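
A sketch of that principle with ordinary member functions (names made up):

struct S {
public:
    void f(double) { }
private:
    void f(int) { }
};

int main()
{
    S s;
    s.f(1.0);   // fine: f(double) is the best match, and it is accessible
    // s.f(1); // ill-formed: overload resolution picks the private f(int)
                // first, and only afterwards does the access check fail
    return 0;
}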

Jason